Knowledge management for the 21st-century hospital system

Published: November 30, 2015

This paper discusses a systems approach to the delivery of healthcare, the importance of taking care of the caregiver, a process-based quality management system for healthcare, and tools for patient safety quality management.

Knowledge management is defined as the art and science of transforming data and information into useful knowledge.

Knowing what to fix is the first step in quality improvement. We must know what is wrong and have the ability and resources to fix it. An organization is a system within itself. All systems have inherent active failures, such as an act of omission or commission, and latent failures, such as a hazardous condition or flawed process designed into policies, protocols, and procedures, which, given the right sequence of events, will allow the system to fail. Healthcare is a complex system composed of multiple subsystems, each of which operates independently yet creates ripple effects throughout the entire organization.

Systems interact at the interfaces of their various processes. The more interfaces there are, the more complex the system, and the more opportunities there are for error. Complex processes are more prone to fail than simple ones, increasing the occurrence of unplanned and undesirable events. In a November 1998 presentation, Don Berwick, MD, proposed that if the probability of success in a one-step process is 99%, the likelihood of error is 1%; by the same arithmetic, a 25-step process carries a 22% likelihood of error, a 50-step process 39%, and a 100-step process 63%.
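
The arithmetic behind these figures assumes each step succeeds independently with probability 0.99, so the chance of at least one error in an n-step process is 1 - 0.99^n. A minimal sketch reproducing Berwick's numbers:

```python
# Cumulative probability of at least one error in an n-step process,
# assuming each step succeeds independently with probability 0.99.

def chance_of_error(steps: int, step_success: float = 0.99) -> float:
    """Probability that at least one step fails: 1 - p**n."""
    return 1.0 - step_success ** steps

for n in (1, 25, 50, 100):
    print(f"{n:>3} steps -> {chance_of_error(n):.1%} chance of at least one error")
# Prints roughly 1.0%, 22.2%, 39.5%, and 63.4%, matching the 1%, 22%,
# 39%, and 63% figures cited above.
```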

A Systems Approach to the Delivery of Healthcare

Health and Human Services, the Agency for Healthcare Research and Quality, CMS, JCAHO, NQF, and the Leapfrog Group all have requirements for the healthcare system and its caregivers. Sixty days after the January 24, 2003 publication of the final rule in the Federal Register, hospitals were required, as a Condition of Participation, to meet the standards described in the Quality Assessment and Performance Improvement (QAPI) program. There is a need to document knowledge of processes that require improvement, methods to achieve improvement, and ways to measure outcomes of the change. To comply, all hospitals will need an event reporting system. The need for e-Health IT, CPOE, EMR, and other health information systems continues to grow.

A systems approach integrates human resource solutions with organizational needs and priorities. Systems thinking enables us to see the whole picture and recognize that everything we do is interrelated. An event in one part of the system affects all other parts of the system. Every system has driving forces and restraining forces that upset the steady state, or equilibrium. Driving forces, like the QAPI, alter a steady state. Restraining forces hinder movement toward a desired goal, e.g., restrictions on the quality of care are greater for an HMO patient than for a PPO patient. Although a system in equilibrium is predictable and safe, every system is in a constant state of change.

The Importance of Taking Care of the Caregiver

Reducing vulnerability to the threat of medical errors is the most effective way to achieve patient safety. An effective patient safety event reporting system should demonstrate trust-based processes among caregivers themselves and between caregivers and the administration. It should be non-punitive and proactive in eliminating defects and minimizing variation. For caregivers to speak up, the identity of the reporting party must remain confidential, and the resulting remedial action must be disseminated throughout the system.

A patient safety event reporting system should be designed to demonstrate a measurable increase in patient safety, analyze events, and lower operational costs. It should mitigate the potential for harm, increase caregiver and patient satisfaction, and enhance process-based quality management and performance improvement. Variation in the care provided and failure to disseminate evidence-based best practices are factors leading to preventable medical errors. We must reduce variation in medical practice, provide evidence-based medicine, minimize the potential for medical errors, and provide better medical care and service to our patients. Health care providers and students in medicine, nursing, and pharmacy who make better use of systems and systems thinking can create change. Caregivers must be part of the solutions to patient safety problems.

A Process-based Quality Management System for Healthcare

Quality system goals are to develop, implement, maintain, and continually improve the healthcare quality management system to enhance patient safety and satisfaction, prevent errors, and increase workflow effectiveness and efficiency. Meeting industry healthcare requirements and standards will reduce variation and waste.

Tools for Patient Safety Quality Management

Artificial intelligence (AI), in this context, is the integration of knowledge-based systems (KBS) and expert systems (ES). A KBS provides computer-based automation of logical reasoning, using AI techniques to perform deductive and inductive reasoning.
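
As a hedged illustration of the deductive side of a KBS, the sketch below forward-chains over a tiny rule base until no new facts can be derived. The rules and facts are invented placeholders, not clinical guidance or any specific KBS product.

```python
# Minimal forward-chaining inference: a rule fires when all of its
# premises are known facts, adding its conclusion; repeat until no new
# fact can be derived. Rules and facts below are illustrative only.

facts = {"order entered", "allergy on record"}
rules = [
    ({"order entered", "allergy on record"}, "check drug-allergy interaction"),
    ({"check drug-allergy interaction"}, "alert pharmacist"),
]

derived = True
while derived:
    derived = False
    for premises, conclusion in rules:
        if premises <= facts and conclusion not in facts:
            facts.add(conclusion)
            derived = True

print(facts)  # now includes both derived conclusions
```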

Detection solutions involve data mining techniques that include neural networks, decision trees, decision tables, naive Bayes, and clustering methodologies. Data representations that maximize the power of discrimination between good and bad events (pattern-related enhancement) are used. The segmented forecasting model combines a model for mature, frequently occurring events with a model for new, sparsely occurring events that uses rules derived from specific domain knowledge. Complex indicators are implemented to define potential incidents, and specific techniques are used to build clusters characterized by risk density, i.e., potential incidents.
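
As one hedged illustration of the clustering idea, the sketch below groups synthetic event records and ranks each cluster by its share of harmful events, a stand-in for "risk density". The features, data, and library choice (scikit-learn) are assumptions for illustration, not the specific methodology described above.

```python
# Sketch: cluster reported events on simple numeric features, then rank
# clusters by the fraction of harmful events ("risk density").
# Features and labels are synthetic placeholders.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Hypothetical columns: process steps involved, handoffs, off-hours flag
X = rng.integers(1, 20, size=(200, 3)).astype(float)
harmful = (X[:, 0] + X[:, 1] + rng.normal(0, 2, 200)) > 22  # synthetic labels

labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
for k in range(4):
    mask = labels == k
    print(f"cluster {k}: n={mask.sum():3d}, risk density={harmful[mask].mean():.2f}")
```

Clusters with high risk density would be the first candidates for the "potential incident" review described above.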

Early recognition and detection of conditions, actions, and inactions that have the potential to cause medical errors is essential. Key activities are classification (analysis, segmentation, correlation, and clustering of data and information) and forecasting (deriving trends and behaviors from clustered data).

Fundamental components of a healthcare delivery system are analyzed and modeled, and the relationships among components are analyzed and classified. As events are reported over time, patterns with similar characteristics emerge.

The "detection real-time solution" classifies good action, uncertain action, and bad action. Qualitative results obtained are shortest delays in detection, efficient use of signals, ease of change, adaptive, and have robustness in time. These solutions have structural flexibility in merging between data learning and specific knowledge with a reasonable level of false positives and effective management of signals.

Most of the existing authentication schemes for mobile communication are static in nature and principally dependent on the strength of authenticating identifiers for a user's identity. Accepting all of a user's transactions under a single authentication level is highly vulnerable. We propose a novel transaction-based authentication scheme (TBAS) for mobile communication using cognitive agents. The proposed approach provides a range of authentication levels by dynamically deploying authentication challenges based on mobile transaction sensitivity and user behavior. The method has been simulated using the Agent Factory framework for cognitive-agent generation and communication. The performance analysis and simulation of the proposed system show that there is a considerable reduction in security cost compared to regular session-based authentication schemes. By combining transaction-based authentication with behavior analysis, authentication attacks can be effectively identified.

Keywords: mobile communication; security; authentication; mobile transactions; cognitive-agents

INTRODUCTION

Mobile communication and services over emerging wireless technologies provide anyone, anytime, anywhere access. The increased importance of mobile telecommunication and the dominance of data communication have prompted a large segment of users to accept mobile data communication as part of their day-to-day activities. However, the wireless medium has certain limitations relative to the wired medium, such as open access, bandwidth insufficiency, complex system functioning, power confinement, and relatively unreliable network connectivity. These limitations make it difficult to design efficient security schemes for authentication, integrity, and confidentiality. Wireless networks and the current generation of 3G networks have a packet-switched core connected to external networks such as the Internet, making them vulnerable to new types of attacks similar to those on the Internet, such as denial of service, viruses, worms, channel jamming, unauthorized access, eavesdropping, message forgery, message replay, man-in-the-middle attacks, and session hijacking [2]. Of the many security issues in mobile communication, the focus of this paper is designing an effective, dynamic, and intelligent decision-based authentication technique for mobile communications.

Mobile Authentication

Authentication is a process to identify a mobile user (MU) in order to authorize him or her to use system resources for specified purposes. Authentication involves negotiating secret credentials between prover and verifier to protect communications. The primary aim of any authentication protocol or scheme is "verifying the linkage between an identifier (usually claimed by the individual, but sometimes observed) and the individual" [3]. The introduction of many value-added services in the mobile world has triggered rapid growth of the mobile user population, and many of these services demand a stringent authentication mechanism to ensure that legitimate users are using the network and services.

Most existing authentication schemes may be broadly classified into three categories [4]:

1. Application-level authentication, where the user enters application-level data such as a user ID, password, PIN, OTP, or sometimes biometric information as the basis for authenticating communications between the endpoint device and the service provider's server.

2. Device-level authentication, in which the end systems, which may be servers or client devices, store some form of secret used by cryptographic algorithms running on those systems. These secrets are either shared or unshared, and may be bound to the hardware in use, e.g., a cryptographic key bound to the SIM of a mobile device.

3. Network-level authentication, which enables the exchange of session keys based on the public/private key pairs of the two mutual authenticators.

Device-based authentication is one of the common types of authentication practiced by mobile application service providers [2]. Here, the device must be registered in advance to use the service. Even though the authentication mechanism looks stringent, it cannot detect service misuse from compromised mobile devices. It also indirectly limits the user's freedom to change devices at will, which is very common in a mobile environment.

Authentication services affect QoS in several ways. Public/private-key based authentication mechanisms consume more time and power due to the computational complexity of encrypting and decrypting data [5]. To achieve efficiency in authentication, challenge/response authentication mechanisms based on secret keys are widely used in wireless networks.
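
As a hedged sketch of the secret-key challenge/response pattern (not any specific wireless protocol), the verifier below issues a random nonce and checks the prover's keyed-hash response. Key provisioning and transport are out of scope here.

```python
# Sketch of secret-key challenge/response: the verifier sends a random
# nonce, the prover returns HMAC(shared_key, nonce), and the verifier
# recomputes and compares. Key handling is illustrative only.
import hashlib
import hmac
import secrets

shared_key = secrets.token_bytes(32)  # assumed provisioned out of band

def prover_response(key: bytes, challenge: bytes) -> bytes:
    return hmac.new(key, challenge, hashlib.sha256).digest()

# Verifier side
challenge = secrets.token_bytes(16)
response = prover_response(shared_key, challenge)  # sent over the air
expected = hmac.new(shared_key, challenge, hashlib.sha256).digest()
assert hmac.compare_digest(response, expected)     # constant-time compare
print("authenticated")
```

A fresh nonce per exchange is what prevents simple replay of an earlier response.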

However, in these mechanisms, the credentials of the mobile user (MU) are encrypted and transmitted hop-by-hop for remote verification among authentication servers, which increases communication overhead. The extended waiting time influences QoS parameters such as authentication cost, delay, and call-dropping probability. To improve security and efficiency during authentication, several schemes have been proposed that focus on the design of lightweight and secure authentication protocols [6], [7], [8], [9], [10]. Most of the proposed systems are static in nature, providing a common authentication scheme irrespective of the sensitivity of the communication. As a result, these authentication protocols fail to establish the relationship between the correct identifier and the correct principal. This leads to a situation where a correct identifier submitted by an incorrect principal is validated and authenticated to obtain services.

Various attacks have been developed to defeat single-factor application-level authentication, to name a few: social engineering, passphrase guessing, phishing, pharming, Trojans, and malware [11], [12], [13]. To overcome this, two-factor authentication at the application level was introduced, which combines something a user knows with something he or she possesses. It is not foolproof, however, even though the failure modes for different authentication factors are largely independent [3]; for example, the proper working of a mobile device is independent of the user remembering a passphrase or PIN. Furthermore, session-level implementation of two-factor authentication keeps the user authenticated for the duration of the session, i.e., until they log off or close the browser. This is a catch-all approach: users remain authenticated regardless of the type of transactions they perform, and the same level of risk is attributed to all transactions available to the user.

Transaction-based authentication schemes address this limitation by deploying authentication challenges per transaction, scaled to the transaction's sensitivity and the user's behavior.
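
A minimal sketch of that idea follows, assuming hypothetical transaction categories, weights, and thresholds; the cognitive-agent machinery of the proposed TBAS is not reproduced here.

```python
# Sketch: pick an authentication challenge from transaction sensitivity
# and a user-behavior anomaly score. Categories, weights, and thresholds
# below are hypothetical placeholders, not the paper's actual algorithm.

SENSITIVITY = {"balance_check": 1, "bill_payment": 2, "funds_transfer": 3}

def required_challenge(transaction: str, anomaly_score: float) -> str:
    """anomaly_score in [0, 1]: 0 = typical behavior, 1 = highly atypical."""
    risk = SENSITIVITY[transaction] + 3 * anomaly_score
    if risk < 2:
        return "none (session credential suffices)"
    if risk < 4:
        return "PIN re-entry"
    return "one-time password"

print(required_challenge("balance_check", 0.1))   # low risk: no challenge
print(required_challenge("funds_transfer", 0.6))  # high risk: one-time password
```

The point of the design is that a low-value query costs the user nothing extra, while a sensitive transfer from an atypical behavior pattern triggers a stronger challenge.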

Related Works

Some of the research conducted on application-level security schemes that implement dynamic security solutions is mentioned here.

An adaptive encryption protocol based on a fractal framework is proposed in Reference [22]. This protocol dynamically chooses a proper encryption algorithm based on application-specific requirements and device configurations. Awareness of network variations at the application level helps in intelligent adaptation of the encryption protocol. A dynamic self-adaptive security model using agents is proposed in Reference [23]. The main idea of this model is to regard security management as a dynamic process: the security policy must adapt to the dynamics of network intrusion. It consists of a cycle of actions: security analysis and configuration, real-time inspection, alarm and response, and audit and evaluation.
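
As a hedged illustration of adaptive algorithm selection in the spirit of Reference [22] (the fractal framework itself is not reproduced), a toy selector might map data sensitivity and device constraints to a cipher:

```python
# Sketch: choose a cipher from data sensitivity and device capability.
# The mapping and cipher choices are hypothetical placeholders used
# only to illustrate application-aware adaptation.

def choose_cipher(sensitivity: str, battery_low: bool, cpu_weak: bool) -> str:
    if sensitivity == "high":
        return "AES-256-GCM"        # strongest option, regardless of device cost
    if battery_low or cpu_weak:
        return "ChaCha20-Poly1305"  # cheaper on devices without AES hardware
    return "AES-128-GCM"

print(choose_cipher("high", battery_low=True, cpu_weak=True))
print(choose_cipher("low", battery_low=True, cpu_weak=False))
```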

Selective encryption for multimedia content is proposed in Reference [24]. An optimal security level is chosen based on the cost of the multimedia information to be protected and the cost of the protection itself. If the multimedia to be protected is not very valuable in the first place, it is sufficient to choose a relatively light level of encryption. On the other hand, if the multimedia content is highly valuable or represents government or military secrets, the cryptographic security level must be the highest possible.

Location-based encryption, proposed in Reference [25], builds on established cryptographic algorithms and protocols in a way that provides an additional layer of security beyond that provided by conventional cryptography. It allows data to be encrypted for a specific place or broad geographic area, and supports constraints in time as well as space. It can be used with both fixed and mobile applications, and supports a range of data sharing and distribution policies. Another work on data encryption for mobile users based on location is proposed in Reference [26]: the location coordinates are incorporated during data encryption, and the receiver is able to decrypt the data only when its current location matches those coordinates.
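
A toy sketch of that location-keyed idea follows, assuming a hypothetical grid quantization; real geo-encryption such as that in References [25] and [26] must also handle GPS accuracy, movement, and location spoofing.

```python
# Sketch of location-keyed key derivation: quantize coordinates to a
# grid cell and mix them into the key, so a receiver outside the cell
# derives a different key. Toy illustration only; not a real scheme.
import hashlib

def location_key(shared_secret: bytes, lat: float, lon: float) -> bytes:
    cell = f"{round(lat, 2)}:{round(lon, 2)}".encode()  # ~1 km grid cell
    return hashlib.sha256(shared_secret + cell).digest()

k_sender   = location_key(b"secret", 12.9716, 77.5946)
k_receiver = location_key(b"secret", 12.9719, 77.5949)  # same grid cell
print(k_sender == k_receiver)  # True: keys match inside the cell
```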

Reference [27] proposes a conceptual framework for context-based security systems. Context-based security aims at adapting the security policy to a set of relevant information collected from the dynamic environment. As the environment evolves, the context changes; thus, security policies change dynamically in order to cope with new requirements. Reference [28] suggests negotiating security across multiple terms of a transaction, such as terminal type, service type, user preference, and the sensitivity level of the information. An adaptive security model is proposed in Reference [29], which dynamically adapts the security level according to a set of contextual information such as terminal type, service type, network type, user preference, information sensitivity, user role, location, and time, using MAUT (Multi-Attribute Utility Theory) and simple heuristics in order to support secure transactions in the heterogeneous network.
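
As a hedged sketch of a MAUT-style selection (the attributes, weights, and utility values below are invented for illustration, not taken from Reference [29]), each candidate security level is scored as a weighted sum of normalized attribute utilities and the best-scoring level is chosen:

```python
# Sketch of a MAUT-style choice: score each candidate security level as
# a weighted sum of per-attribute utilities (0..1) and pick the best.
# Weights and utilities are hypothetical placeholders.

weights = {"protection": 0.6, "speed": 0.25, "battery": 0.15}

levels = {
    "low":    {"protection": 0.3, "speed": 1.0, "battery": 1.0},
    "medium": {"protection": 0.7, "speed": 0.6, "battery": 0.7},
    "high":   {"protection": 1.0, "speed": 0.3, "battery": 0.4},
}

def utility(attrs: dict) -> float:
    return sum(weights[a] * u for a, u in attrs.items())

best = max(levels, key=lambda name: utility(levels[name]))
print(best, {n: round(utility(a), 3) for n, a in levels.items()})
```

Changing the weights (e.g., favoring battery life on a constrained handset) shifts which level wins, which is exactly the context-adaptive behavior the model aims for.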

The enterprise portal research area focuses on the products and infrastructures associated with the development and deployment of enterprise portals. Gartner defines an enterprise portal as a Web software infrastructure providing access to, and interaction with, relevant information assets (information/content, applications and business processes), knowledge assets and human assets, by select targeted audiences, delivered in a highly personalized manner. Enterprise portals may face different audiences, including employees (business-to-employee, B2E), customers (business-to-consumer, B2C) and trading partners (business-to-business, B2B). Intranets and extranets relate to the enterprise portal topic as well.

Gartner enterprise portal research covers trends, technologies, product functionality, market developments, vendor strategies, and best practices for deployment and management, product and vendor selection, and portal acquisition and deployment cost analysis. Our research also addresses the relationship of enterprise portals to other technology areas, including, but not limited to, enterprise mashups, CRM, social computing, business process management (BPM), ERP and supply chain management (SCM).

The portal market experienced strong growth during this decade, despite negative economic cycles during this time frame. Gartner clients continue to express high interest in enterprise portals, even though the enterprise portal concept and the primary technology enabling it have reached maturity. Enterprise portals reflect a type of enterprise Web site, as well as a type of technology used as the foundation for an enterprise Web presence. Portals are used by enterprises, but they are also relevant to the consumer Internet. Many companies that revisit their Web strategies implement portal technologies aimed at providing an externally facing Web presence. Internet megaportals, such as Yahoo, Microsoft Network (MSN) and Google, are high-profile examples of portals in a consumer Web context. Social networking, social tagging, wikis, blogs and personal home pages are examples of Web 2.0 functionality that appeared early in the consumer Web and that enterprise vendors providing horizontal portal offerings now claim to provide. Although the primary focus of this research is enterprise portals, it also will track technology developments among megaportals that will affect enterprise portal software and deployment patterns, such as advancements with end-user mashups.

Many, but not all, enterprise portals are built using a portal product that is based on a portal container, or a software suite that includes portal functionality. A portal product is a packaged software application that is used to create and maintain enterprise portals. These products can be used to design vertical or horizontal enterprise portals.

Although software-as-a-service (SaaS) vertical portals have been popular in several industries for several years, 2008 marked greater interest in SaaS and in cloud-based portals for horizontal portal deployments, as alternative approaches to the classic horizontal portal container approach.