[THIS TRANSCRIPT IS UNEDITED]

NATIONAL COMMITTEE ON VITAL AND HEALTH STATISTICS

SUBCOMMITTEE ON HEALTH DATA NEEDS, STANDARDS AND SECURITY

August 5, 1997
Morning Session

Capital Hilton
16th and K Streets, NW
Washington, D.C. 20036

Major Topic: Perspectives on Security Issues in Implementation of Administrative Simplification Provisions of PL 104-191


August 5, 1997

P R O C E E D I N G S (9:05 a.m.)

DR. COHN: I want to welcome everyone to this morning's meeting and set of hearings. My name is Simon Cohn. I am a physician and the clinical information system coordinator for Kaiser Permanente, and a member of the committee.

Now, John Lumpkin will be chairing this session, but he is a little bit delayed. So I am the acting chair until he shows up.

This is a meeting of the Subcommittee on Health Data Needs, Standards and Security of the National Committee on Vital and Health Statistics. For the next day and a half, we will be taking testimony related specifically to the issues of security and implementation in the Health Insurance Portability and Accountability Act of 1996.

In keeping with the general tenor of the committee, what we are going to do is initially start by having everyone introduce themselves, all the committee members. Then we are going to be asking everyone in the audience to also introduce yourself and identify your affiliation.

Bob, do you want to start?

DR. GELLMAN: I'm Bob Gellman. I'm a privacy and information privacy consultant here in Washington and a member of the committee.

DR. VAN AMBURG: I'm George van Amburg, Michigan Public Health Institute. I'm a health researcher and a member of the committee.

MS. BALL: I'm Judy Ball from the Office of the Assistant Secretary for Planning and Evaluation, HHS, and I am staff.

MR. BLAIR: I'm Jeff Blair. I'm with IBM Health Care Solutions, and I am also with the ANSI Health Care Informatics Standards Board.

DR. FRAWLEY: Kathleen Frawley, the American Health Information Management Association.

DR. BRAITHWAITE: Bill Braithwaite, with the Office of the Assistant Secretary for Planning and Evaluation at HHS, and I am staff to the subcommittee.

MR. SCANLON: I'm Jim Scanlon, U.S. Department of Health and Human Services. I'm the executive director of the National Committee.

MS. GREENBERG: I'm Marjorie Greenberg of the National Center for Health Statistics and executive secretary to the committee.

(The remainder of the introductions were delivered off mike.)

DR. COHN: If our panel would also introduce themselves.

MS. WALLER: Goldberg, Kohn, Bell, Black, Rosenbloom and Moritz, Chicago.

DR. DRAZEN: Erica Drazen, I'm chair of the Computer-Based Patient Record Institute, and run the research function for First Consulting Group.

MR. MILLER: Dale Miller, Irongate, Incorporated.

MS. BROWN: Laura Brown, Ernst and Young.

DR. LANDWEHR: Carl Landwehr, Naval Research Laboratory.

DR. COHN: Carl, as I understand, you are replacing Jerry Sheehan from the National Research Council?

DR. LANDWEHR: That's correct.

DR. COHN: As is typical with this committee, we obviously would ask all of our panelists to make a presentation. I believe we have asked them to limit it to eight minutes per presentation, with time then for the questions for the committee. After that if there is time for questions from the floor, we will also take those, but we will try to maintain our time frame with breaks as indicated.

I think all of you have copies of the tentative agenda. With the exceptions of some changes in speakers which we will identify prior to any of the panels, it is really unchanged at this point.

I should at this point also introduce Dr. John Lumpkin, who just arrived and who will be taking over as chair of the committee. John, we are at the point now where we have just introduced the panelists, and we're just going to have them start.

DR. LUMPKIN: Great, okay. With that, would our first panelist like to make a presentation?

DR. LANDWEHR: Good morning, and thank you for the opportunity to testify before the National Committee on Vital and Health Statistics Subcommittee on Health Data Needs, Standards and Security.

My name is Carl Landwehr. I am a supervisory computer scientist at the U.S. Naval Research Laboratory, where I head a section responsible for computer security research within the Center for High Assurance Systems. I served on the National Research Council study committee that produced the report, For the Record: Protecting Electronic Health Information. I believe that members of the NCVHS have received copies of the report, which was released in March of this year. I will submit another copy along with my oral testimony this morning.

The National Research Council committee on which I served was asked by the National Library of Medicine, Warren G. Magnuson Clinical Center and the Massachusetts Health Data Consortium to investigate ways of improving the privacy and security of health care applications of the national information infrastructure.

The charge to the committee was three fold: first, to observe and assess existing procedures and practices for protecting the privacy and security of electronic health information; second, to identify other mechanisms worthy of testing in a health care environment, and third, to outline promising areas for future research.

In order to carry out this charge, the committee conducted a series of site visits to six different health-related organizations, during which we discussed privacy and security concerns, as well as the mechanisms used to protect electronic health information.

The committee also met with a range of other experts in computer security, health informatics and patient privacy to better understand concerns about privacy and security and to identify security methods that work in health care and other industries.

Before I discuss the committee's findings and recommendations, let me first define the terminology the committee used in the report.

The committee used the term privacy to refer to an individual's ability to limit the disclosure of personal information. This definition implies that privacy is a goal that we try to achieve by protecting that information that has been revealed, and by limiting disclosure of information to others.

Confidentiality refers to a condition in which personal information is shared or released in a controlled manner. Hence, confidentiality policies outline the specific rules governing releases of information, and attempt to balance the need for access to information against the desire to protect the patient's privacy.

Security consists of technical and non-technical measures for protecting information and information systems against malicious attacks or accidents. As such, security measures support confidentiality policies by helping insure that information is available only to those with a legitimate need to know, and is released only in accordance with stated policies. But security also includes measures to insure the integrity of information and the availability of the information systems in which it is stored.

The committee found that a wide variety of practices, both technical and organizational, have been developed and implemented. Organizational practices are at least as important as technical measures in protecting electronic health information.

Our site visits revealed that those practices have not been consistently and uniformly deployed throughout the industry, however. To date, health care organizations have not had consistent economic and regulatory incentives to strengthen security. Sporadic violations of privacy and security have failed to rally broad interest, and few sanctions exist that compel broader attention to privacy and security.

Most organizations face strong pressures to expand the capabilities for access to their health information systems. Those that have put information on line are beginning to take protective steps they believe are reasonable and justifiable. However, no single organization the committee visited has adopted the full range of practices the committee believes is necessary to protect electronic health information.

Two other factors slow the adoption of security technology in health care. One is a lack of standards that define the type of security mechanisms that health care organizations should demand from vendors of medical information systems. The other is the lack of a mechanism for health care organizations to share information about security breaches that have occurred and practices that are effective in preventing them.

To remedy these problems, the committee made a series of recommendations. First, the committee called for all organizations that handle patient identifiable health care information, regardless of size, to adopt a wide-ranging set of technical and organizational policies, practices and procedures. The technical practices include: first, individual authentication of users, so they can be held to account for their actions. Second, access controls to insure that users can access and retrieve only information for which they have a legitimate need. Third, audit trails to allow organizations to track and review all accesses made to patient identifiable health information. Fourth, physical security and disaster recovery techniques to keep intruders and emergencies from affecting operations. Fifth, the protection of remote access points to prevent outside crackers from breaking into information systems via Internet or modem connections. Sixth, protection of external electronic communications through encryption or the use of dedicated lines. Seventh, software discipline to prevent the installation or downloading of malicious software that could copy or corrupt patient data. Eighth, system assessment to insure organizations are aware of the vulnerabilities of their information systems.
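
[Editor's note: the sketch below is not part of the testimony. It is a minimal illustration, in Python, of how the first three technical practices named above -- individual authentication, access controls and audit trails -- might fit together; the user names, record identifiers and in-memory stores are hypothetical and stand in for a real directory and database.]

    import hashlib
    import hmac
    import time

    # Hypothetical stores; a production system would use a directory and database.
    USERS = {"dr_lee": hashlib.sha256(b"salt:" + b"correct-horse").hexdigest()}
    PERMISSIONS = {"dr_lee": {"patient-1001"}}   # records each user may read
    AUDIT_LOG = []                               # append-only record of accesses

    def authenticate(user_id, password):
        # Individual authentication: each user has a personal credential.
        digest = hashlib.sha256(b"salt:" + password.encode()).hexdigest()
        return hmac.compare_digest(digest, USERS.get(user_id, ""))

    def read_record(user_id, password, record_id):
        granted = (authenticate(user_id, password)
                   and record_id in PERMISSIONS.get(user_id, set()))
        # Audit trail: every access attempt is logged, allowed or not.
        AUDIT_LOG.append({"time": time.time(), "user": user_id,
                          "record": record_id, "granted": granted})
        if not granted:
            raise PermissionError("access denied")
        return {"record_id": record_id}          # fetched from storage in practice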

In order to be effective, these technical practices must be complemented by a set of organizational practices that establish structures for creating, implementing and enforcing policies. The organizational practices recommended by the committee include: first, development of policies that specify the types of information that will be released to users inside and outside the organization, and that outline the security mechanisms that will be employed to protect information and systems. Second, security and confidentiality committees that regularly review and update those policies. Third, information security officers who are responsible for implementing and monitoring compliance with security policies and practices. Fourth, education and training programs to help all health care workers understand their responsibilities in protecting health information and the practices used. Fifth, sanctions that are employed consistently and uniformly to discipline those who violate confidentiality and security policies. Sixth, improved authorization forms that tell patients who inside and outside the organization will have access to their medical records. Seventh, patient access to audit logs to allow patients to review who has seen their records.

The committee believes that these practices can be adopted almost immediately with minimal difficulty and expense.

Incorporation of these practices into standards promulgated by the Secretary of Health and Human Services represents the first step toward improving the protection of electronic health information. Additional efforts will also be needed to facilitate the development and deployment of technical solutions and the revision of standards over time.

To do so, the committee also recommends that the federal government work with industry to develop the necessary infrastructure to help health care organizations better protect health information. It recommends the establishment of a standing health information security standards committee within NCVHS to develop and update privacy and security standards as necessary. It also recommends the establishment of an organization modelled after the computer emergency response team at Carnegie Mellon University, for sharing information about security threats, incidents and solutions in the health care industry.

Such measures, while significant, will not by themselves insure adequate privacy for patients. They do not address systemic concerns about patient privacy that stem from the widespread sharing of patient information throughout the health care system.

Health information routinely flows among health care providers, insurers, pharmacists, state public health organizations and perhaps even employers, life insurers and marketing firms. Such sharing is largely unregulated and represents a significant concern to patients and privacy advocates alike.

Privacy concerns posed by these data flows require national discussion to determine how best to balance the patient's privacy interests against other organizations' legitimate needs for health information.

Efforts under way to develop a universal identifier for patients heighten these systemic concerns. While a universal identifier can make it easier to link information for care, payment, administration and research, it could also facilitate the linkage of health records with records outside the health care system, whether employment, driving or financial. Such privacy concerns must be considered explicitly in any attempt to develop a universal identification system.

In summary, the committee strongly encourages the development and promulgation of security standards to help health care organizations determine how best to protect electronic health care information, and to help vendors determine which capabilities to provide in health information systems.

Such standards must include organizational as well as technical practices, and should outline the capabilities desired in security systems rather than particular implementations or solutions. At the same time, the National Committee on Vital and Health Statistics and the Secretary of Health and Human Services must take additional steps to foster the development and deployment of innovative security solutions through the creation of a standard setting body and the establishment of an information sharing body. Additional efforts will be needed to address systemic concerns stemming from the sharing of information throughout the health care industry.

The committee believes that adoption of its recommendations will foster continued progress in modernizing the health care industry, while insuring that patient privacy is maintained. I would encourage the NCVHS to review the full contents of the committee's report for additional perspective on these issues.

I'll now be pleased to answer any additional questions.

MS. BROWN: Thank you for the opportunity to participate in these hearings. It is an exciting day for health care information security to reach this level of attention.

My name is Laura Brown, and I am a senior manager with Ernst and Young. In this role, I have the opportunity to work with and advise health care organizations on information security issues.

Health care information security has been for the most part a neglected business requirement over the past few years. The passage of the Health Insurance Portability and Accountability Act with security standards included in the provisions is a vital step in bringing health care information security to a baseline level in the health care industry and to a minimum level that is present in many other industries.

Developing this baseline, however, will be a challenging effort. To achieve this goal, it is critical to understand the evolution of information security in health care and the barriers which obstruct greater levels of security in the industry.

As an information security practitioner for the past nine years, including the last five in the health care industry, I hope I can provide to you an overview of the evolution of health care information security, a discussion of the challenges facing implementation, and recommendations for successful integration of security practices.

As all of you know, health care was a very different industry just a few years ago. For the most part, any existing information security requirements were limited to the protection of financial information. A number of things occurred, however, which changed the industry and brought forth a mandate for far stronger information security practices.

Those included, organizationally, the industry changed. Organizations merged and new players were defined. The industry embraced technology, and an industry which traditionally lagged in information technology was now pursuing it. The industry began aggressively collecting clinical information. Consumer awareness for privacy issues heightened. The industry embraced the computer-based patient record vision, and the industry went online to the Internet.

All the while these changes took place, information security vulnerabilities emerged, increased and evolved. An insufficient effort was made to understand and mitigate these emerging risks. As a result, and as confirmed by the National Research Council's report, today's health care information security practices in the U.S. are extremely poor, and many organizations are operating at below due care standards.

Addressing the issues is not, however, an easy task. There are wide-reaching effects stemming from these industry changes, and these effects plagued the industry, creating a challenging and difficult environment in which to implement policies, procedures, standards and technologies.

A few of the barriers which face information security include: a lack of legislation. Currently, there is no federal health privacy legislation. In the absence of such, organizations are unclear as to the rules or guidance around which to define information security practices. State requirements provide only narrow guidance and differ by state.

The result is that organizations are faced with electing whether or not to implement information security practices. Organizations which do choose to implement security practices must many times rely on ethical tenets of what seems to be the right thing to do. On the other hand, some organizations have opted to pay the costs of a lack of an information security program. One in particular, after paying a settlement of $3.5 million for an information security breach, still elected not to implement a security program.

Despite well-publicized health care information security breaches, organizations still opt to implement little or no information security practices. More often than not, security is relegated to a low priority issue, and is not deemed a key performance indicator.

Further effects resulting from this lack of legislation can be seen with respect to the health care systems and vendors. Health care vendors have been unclear as to security requirements. Often, these vendors request security requirements definitions from customers. Customers who desire security options, on the other hand, demand the vendors know the security requirements, and build those requirements into their systems.

A second issue hindering implementation of security is the lack of policy or precedent from which to borrow. Generally accepted information security practices in the health care industry do not exist, and health care information security standards have not until recently been in existence. Further, existing standards do not yet define a complete information security infrastructure, but rather address pieces of a program. Standards from other industries such as the banking industry cannot necessarily be utilized, due to differing business objectives under which the standards were created. The result is that security policies, guidelines and standards must be for the most part defined from scratch.

The execution of this task however is hindered by another security issue, which is that the discipline of information security itself is not a well understood field within the health care industry. For one thing, the role of the health care chief security officer or information security manager is a unique discipline, charged with addressing organizational information protection issues. As such, the position and its responsibilities should not be assigned as an additional duty; the holder should instead be a key player in the organization.

Further, a realistic concept of an information security infrastructure is unusual in the industry. A week ago, I was told that a particular health care organization considered itself to have an information security program because it utilized audit trails. This was the organization's only security practice. Audit trails are of course one small component in a well-structured program.

Finally, the issue of culture should be mentioned. The integration of security measures or technologies in health care is often met with resistance, due to the level of transparency of the security practice demanded by health care practitioners. The technology must be transparent enough such that it in no way encumbers job function.

Further complicating integration is the perception that information security practices and technologies will hinder providers. All agree, accessing information quickly in an emergency circumstance is critical. The industry clearly has a delicate balance to maintain in order that such a dire scenario not occur.

It is this highly complex composite of issues facing organizations which often pressures them to elect not to address information security issues today. In order to implement information security practices within the industry, I would like to provide the following recommendations to assist in avoiding the pitfalls.

First and most obviously, organizations must implement a comprehensive information security program or infrastructure. When specific practices or measures are mentioned in the NRC report, the practices must be integrated into a larger comprehensive information security framework.

Numerous companies have learned the perils of a piecemeal approach to security. Implementing security practices in a piecemeal fashion will most likely result in money spent with no positive result. The health care industry must seize the opportunity to effectively address information security appropriately and at the program level. That framework must include policies, standards and guidelines, which govern a technical security architecture supported by security principles.

This framework must also include important disciplines such as local area network and wide area network security. Many organizations focus on remote dial-in and firewall security, but neglect standard WAN-LAN security practices. Misconfigured routers and outdated and unpatched operating systems are severe risks facing organizations, especially those connected to the Internet.

Despite the challenges, I do see reasons to be optimistic. There are organizations today which clearly recognize the criticality of security when utilizing the Internet for health care information related transactions. This type of understanding and commitment to implementing secure processes, despite a lack of guidance or precedent, is a solid step in the right direction.

Secondly, chief security officer and information security management talent from other industries is migrating into health care, because of the exciting challenges and needs of the industry. These individuals bring highly skilled management and technical backgrounds and expertise that will bring us to the level at which we need to be.

Three years ago, I was able to identify only four health care information security practitioners. Today, I am aware of more than 30 in the industry. Appropriately addressing information security in the health care industry is critical to its acceptance and trust by consumers. The near future state of the health care industry has been envisioned and security is an enabler to realizing that vision.

I can assure the committee that we recognize the criticality of the information we protect. If we are given the support to build the framework we need, there will be no shortage of information security practitioners who are willing to stand up to the challenge and bring order to health care information security.

Thank you.

MR. MILLER: Good morning. I am Dale Miller. I am director of consulting services for Irongate, Incorporated.

Thank you very much for the opportunity to provide input and recommendations for an information security standard. By way of a background, Irongate is an information security consulting company, providing services to health care organizations to help them strengthen their information security programs.

My comments are based on the experience of providing these services to health care, banking and other industries as a consultant for the past 11 years. During the last five years these services have been devoted almost exclusively to health care clients.

In general, I believe the health care industry has done a poor job of implementing effective information security programs. Some of the reasons for this, I believe, are a lack of properly defined and accepted responsibility, lack of adequate systems capabilities, and the failure of organizations to implement organization-wide comprehensive information security programs.

First, addressing the lack of properly defined and accepted responsibility: although senior management of most health care organizations understands that the organization must protect patient confidentiality, many organizations have not formally recognized and defined the responsibility for information security. Because responsibility for the information security program has not been assigned, there is no individual or team with the responsibility to insure that policies and procedures have been developed, information security training is provided to all staff members, appropriate controls have been implemented, and auditing and monitoring of the information security function is performed.

Basic coordination of security measures across departments and systems does not occur. Different systems within the organization have different levels of security, and security exposures that are not directly systems functions, such as the disposal of paper containing confidential information, are frequently not adequately addressed.

The second reason, lack of adequate system capabilities. In general, current health care information systems provide very poor tools for managing access control. Although good system security practice for access control has been defined for many years, and systems in use in other industries provide tools for implementing good security, many health care systems do not include these security features. Developers of health care systems are not using the security expertise and experience from other industries, and are not integrating good security features into their systems.

Many health care information systems today do not include features to encrypt passwords, enforce password length, enforce periodic password changes, automatically log off unattended terminals, permit distributed administration of access control, or encrypt sensitive patient data.
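
[Editor's note: the following sketch is not part of the testimony. It illustrates, in Python, three of the missing features Mr. Miller lists -- salted password hashing rather than cleartext storage, enforced password length, and enforced periodic change; the minimum length and expiry period shown are assumed policy values.]

    import hashlib
    import hmac
    import os
    import time

    MIN_LENGTH = 8                   # assumed policy: minimum password length
    MAX_AGE_SECONDS = 90 * 86400     # assumed policy: force a change every 90 days

    def set_password(store, user, password):
        if len(password) < MIN_LENGTH:
            raise ValueError("password too short")
        salt = os.urandom(16)
        # Only a salted hash is stored, never the cleartext password.
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        store[user] = {"salt": salt, "digest": digest, "set_at": time.time()}

    def check_password(store, user, password):
        entry = store.get(user)
        if entry is None:
            return False
        if time.time() - entry["set_at"] > MAX_AGE_SECONDS:
            raise RuntimeError("password expired; change required")
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(),
                                     entry["salt"], 100_000)
        return hmac.compare_digest(digest, entry["digest"])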

Major health care organizations are currently installing computer-based patient record systems that do not even include password encryption. System personnel in those organizations can gain access to any user's password, and if they are so inclined, impersonate any caregiver to gain access to confidential patient information and/or sign patient records. Those organizations will not be able to enforce individual accountability for the confidentiality, accuracy and authenticity of the record.

Most health care organizations utilize several different information systems provided by different vendors. The users of these systems are required to know multiple passwords, sometimes as many as six or eight. Users tend to create security exposures by writing the passwords down rather than memorizing them.

New health care systems are designed to support a single sign-on process that allows users to gain access to all the information to which they are authorized, using a single log-on I.D. and password.
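
[Editor's note: illustrative sketch, not part of the testimony. It shows one common way a single sign-on process can work -- the user authenticates once, receives a token signed with a key shared by the participating systems, and presents that token instead of separate passwords. The token format and the shared key are assumptions made for the example.]

    import base64
    import hashlib
    import hmac
    import json
    import time

    SHARED_KEY = b"key-provisioned-to-participating-systems"   # assumption

    def issue_token(user_id, ttl_seconds=3600):
        # Issued once by the sign-on service after the user authenticates.
        payload = json.dumps({"user": user_id, "exp": time.time() + ttl_seconds})
        sig = hmac.new(SHARED_KEY, payload.encode(), hashlib.sha256).hexdigest()
        return base64.b64encode(payload.encode()).decode() + "." + sig

    def verify_token(token):
        # Checked by each participating system instead of its own password prompt.
        encoded, sig = token.rsplit(".", 1)
        payload = base64.b64decode(encoded).decode()
        expected = hmac.new(SHARED_KEY, payload.encode(), hashlib.sha256).hexdigest()
        if not hmac.compare_digest(sig, expected):
            return None
        claims = json.loads(payload)
        return claims if claims["exp"] > time.time() else None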

Not all systems provide functionality to control access to information with sufficient granularity to limit a user's access to the information necessary for the individual's assigned responsibilities. For instance, in many organizations, all physicians and sometimes other caregivers are granted access to the information for all patients associated with the organization, whether or not the particular physician is caring for the patient. At a large facility, this may mean that a physician has unrestricted access to the information of thousands of patients.
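
[Editor's note: not part of the testimony. A minimal sketch of the granularity described above -- access limited to the patients a caregiver is actually treating, rather than every patient in the organization; the care-team table and identifiers are hypothetical.]

    # Hypothetical care-relationship table: caregiver -> patients under their care.
    CARE_TEAM = {"dr_patel": {"mrn-204", "mrn-981"}}

    def may_view(user_id, role, patient_mrn):
        # A role check alone would let any physician see any patient's record.
        if role != "physician":
            return False
        # The granular check: only patients this physician is actually treating.
        return patient_mrn in CARE_TEAM.get(user_id, set())

    assert may_view("dr_patel", "physician", "mrn-204")
    assert not may_view("dr_patel", "physician", "mrn-555")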

Since few organizations have defined their specific needs, they do not appear to demand that system vendors provide adequate security functionality when these organizations evaluate and purchase a system. All too frequently, information systems security seems to be addressed during the implementation phase, after it is too late to influence the security and functionality to be provided by the vendor.

The technology to provide greatly improved security of health care information exists, but that technology has not been included in many health information systems.

The third point, failure of organizations to implement information security programs, also contributes to the industry's problems with the lack of appropriate information security. The broad scope of the security function within an organization, and the need to implement a formal information security program tailored to the needs of the organization, do not appear to be well recognized by the senior management of health care organizations.

While the concepts and principles for protecting information are very similar from organization to organization, the information security program at each organization must be designed to meet the specific needs of the individual organization.

For example, the challenges of protecting confidentiality in a rural hospital that may also be the major employer in the community are somewhat different than in a large medical center in a major metropolitan area. In the smaller community, the hospital's management, caregivers, volunteers and clergy may all know the majority of the patients, and all may have access to that patient information via the hospital's information system.

Lack of a formal program in these organizations leads to inconsistent and incomplete implementation of security measures, both procedurally and in system functions. Many organizations do not appear to recognize the extent and the costs associated with the disparate security activities being undertaken by various project teams and committees. Formal coordination of these activities is far more cost effective and yields significantly improved security measures.

I have listed several recommendations and actions that this committee should take. This committee has advocated the passage of health privacy legislation. I believe such legislation would be most effective if it includes measures that require the protection of each person's privacy and provide for an individual to recover damages from health care organizations for a breach of confidentiality, even in the absence of provable economic damages.

I believe that such legislation should not specify detailed technical requirements for information security, because it would likely limit innovation and preclude using new security technology for the protection of health information.

In my opinion, legislation which creates incentives for organizations to improve security, rather than mandate specific measures, is likely to be most helpful. In addition to legislation, organizations such as HCFA could increase management's attention to security by requiring that as a condition of participation in Medicare programs, the board of directors or trustees of health care organizations be held accountable for establishing organization-wide information security programs. In very much the same manner, the banking circular 229 issued by the Comptroller of the Currency in 1988 held boards of banks responsible for establishing formal information security programs at national banks.

The work of standards developing organizations, particularly at ASTM, is extremely helpful in providing the basic foundation organizations can use to develop comprehensive information security programs. These standards contribute to consistent levels of protection, and I believe that their work should be encouraged and supported in lieu of legislation.

Accreditation organizations such as the joint commission can cause improvement in information security to occur by increasing the emphasis on information security during their accreditation surveys. In addition to the implementation of good information security practice, accreditation criteria should also require that health care organizations require their information security staff to acquire certification as certified information systems security professionals, and should require CISSP certification when hiring information security managers or engaging information security consultants.

I believe this committee can help to improve the state of information security in health care by just raising the awareness of the responsibility of management of health care organizations to address information security, by increasing the awareness of health care organizations to demand that vendors include more security functionality in their products, and by fostering the education of consumers of health care about how their health information is gathered, used and disseminated.

Thank you very much.

DR. DRAZEN: Good morning. I am Erica Drazen. I am here as chairperson of the Computer-Based Patient Records Institute, otherwise known as CPRI. I also am head of the Emergent Practices Institute, which is a joint venture between First Consulting Group and a group of 50 CEOs from integrated delivery systems. That group is called CCI.

I am here representing CPRI. We appreciate the opportunity to address the state of the industry concerning security standards and practices. However, I should mention that in my other role, I'm not in conflict, since we have actually adopted all the CPRI recommendations in our institute.

As the use of information technology becomes more widespread in health care, security is certainly one of the critical issues that must be addressed. Computer-based record systems that are properly designed and implemented have the potential to provide greater protection of the confidentiality of individually identifiable health information than any paper-based system.

The key factors that enable computer-based systems to provide increased security are the ability to positively identify the user, to verify authorization and determine the right access for that user, to restrict retrieval to only specific information, even down to the data element level, to encrypt information for transmission, to track all accesses to data, and to remove identifying information when identity is not required to perform a task, as is true of many of the uses of information in health care today.
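
[Editor's note: not part of the testimony. The last capability in the list above -- removing identifying information when identity is not required for a task -- can be illustrated in a few lines of Python; the set of identifying fields shown is an assumption made for the example.]

    # Fields assumed, for illustration, to identify the patient directly.
    IDENTIFYING_FIELDS = {"name", "mrn", "ssn", "address", "phone"}

    def deidentify(record):
        # Return a copy of the record with the direct identifiers removed.
        return {k: v for k, v in record.items() if k not in IDENTIFYING_FIELDS}

    record = {"name": "Jane Doe", "mrn": "204-88", "dx": "asthma", "age": 34}
    print(deidentify(record))   # {'dx': 'asthma', 'age': 34}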

We believe that currently available technology combined with known organizational policies and procedures can be implemented now, and would greatly increase the security of electronically stored patient information.

A little bit of background on the CPRI. The CPRI is a not-for-profit organization that is funded by and consists of diverse stakeholders in health care. We are essentially an organization of organizations that is committed to advancing the use of effective computer-based systems and hence improving health care quality, cost and access.

The CPRI was founded in 1992 as a direct result of the Institute of Medicine study. We provide a neutral forum for the industry to come together to discuss issues and to develop common solutions for implementing computer-based record systems. We also support and endorse the efforts of standards organizations to develop technical standards and implementation guidelines, and we promote the adoption of these standards in the industry.

Our assessment of the state of the industry is that up until recently, health care providers have relied on traditional ethical standards of the professions to protect access to records and protect health information. However, as the industry has changed and we have become a cross-continuum industry, and as health care becomes more team based, obviously the need to transmit and access information has increased greatly.

In response to this, much of the health information is now stored and transmitted in an electronic media. Therefore, there is a heightened need and also a heightened opportunity to protect information through technical and organizational security measures.

Several studies, some of which have already been mentioned, have increased awareness of the need to protect the confidentiality of information: the Institute of Medicine study, Health Data in the Information Age, published in 1994, and more recently the National Research Council study which has already been discussed.

CPRI supports the NRC recommendations that health care providers must adopt technical and organizational practices to protect health care information. CPRI also concurs with the National Committee on Vital and Health Statistics' recommendations to Secretary Shalala that the health care industry must work with government to create a legal framework and set the proper set of incentives for heightening interest in privacy and insuring industry-wide protection of health information.

We however also believe that there are measures in place that could be implemented now to greatly increase the confidentiality and security of information that is stored electronically. The first step is to recognize that recommendations for technical and organizational practices have been developed, and their implementation does not necessarily require a great expenditure of funds or revamping of entire systems. There are simple and straightforward measures that can be taken that will make a significant impact. They would require a conscious effort to implement and vigilance in monitoring, but they take considerably less effort and considerably less resources than any potential breach of confidentiality.

As providers move along a path toward integration and toward implementing more electronic data, obviously there is a need to continue to adopt new technologies and new practices as they are available.

Our recommendations. We believe that one barrier to improved security was the lack of expertise in designing security programs, in evaluating systems for security, and in implementing changes. Therefore, the CPRI responded by developing tools and models that could be adapted and then adopted by providers who want to improve their security programs.

Because we felt that it was essential that these guidelines be widely available, we have actually published them on our website, and they are available for any organization to access and download. These guidelines provide a comprehensive set of tools to implement a security program, and they are also continuously under development.

I will discuss a few of those. The guideline for developing security policies provides a comprehensive outline of what a security policy should include and how to adapt that policy for the organization's culture and environment. The guideline on security management provides a job description for a security manager or, for a smaller provider, outlines the responsibilities that should be covered to manage the security program. It outlines a workable committee structure for involvement of the provider community.

The guideline on education provides the tools needed to set up a training program for security. Sample confidentiality statements are provided to use for entering into agreements with medical staff, other employees, vendors and consultants. The security features document provides a checklist for a provider or their consultant to use in assessing the security features of a computer system. It also could be used by vendors to determine what is required. This month, we are adding a guideline for implementing electronic signatures and we are also working on a guideline for controlling access to health information. We provide a glossary of terms to increase the ability of folks to talk to each other about issues related to security and confidentiality.
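
[Editor's note: not part of the testimony. A minimal sketch of the electronic signature idea mentioned above -- the author signs a note with a private key, and anyone holding the matching public key can verify that the note has not been altered. The example assumes the third-party Python "cryptography" package is available.]

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    signing_key = Ed25519PrivateKey.generate()    # held only by the author
    public_key = signing_key.public_key()         # shared for verification

    note = b"Progress note, 1997-08-05: patient stable."
    signature = signing_key.sign(note)

    public_key.verify(signature, note)            # passes: note is unchanged
    try:
        public_key.verify(signature, note + b" (edited)")
    except InvalidSignature:
        print("signature check failed: the note was altered")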

CPRI documents undergo broad circulation for feedback prior to approval by the membership, and all guidelines go through a balloting process within CPRI, in which a diverse membership ballots based on the original document and the feedback provided. The security features document, which specifies the features that should be available in purchased systems, was sent to over 900 individuals, including all the chief information officers who belong to the College of Healthcare Information Management Executives.

This diverse input assures that the guidelines are realistic and usable. We urge the industry to adopt these guidelines and to participate in the development of further technical standards in the standards development organizations. Widespread adoption of security guidelines for organizational practices and of technical standards would provide individual organizations with a standard level of security, and the industry as a whole with a consistent set of expectations relative to security.

In conclusion, we appreciate the opportunity to come here and encourage the use of existing mechanisms and tools and to build on these to insure that health care will be able to reap the benefits of information technology while preserving confidentiality and data integrity that the American people expect and deserve.

MS. WALLER: Good morning. I'm Adele Waller. I am a principal in the Chicago law firm of Goldberg, Kohn, Bell, Black, Rosenbloom and Moritz. I currently serve as chair of the health information and technology substantive law committee of NHLAHHLA, the awkwardly named organization created by the recent merger of the American Academy of Health Care Attorneys and the National Health Lawyers Association. I authored the legal appendix to the IOM report on the computer-based patient record.

Although my firm's clients include health care providers, health plans and providers of health information services, I am appearing here this morning in my individual capacity and not on behalf of any client or organization. As an attorney, I need to say that.

I appreciate the opportunity to address the committee this morning on the state of the industry with respect to health information security. I will discuss the trends in the industry that the committee will need to take into consideration if it is to recommend to the Secretary health information security standards that will effectively protect the confidentiality, integrity and availability of health information.

It is crucial that the information security standards adopted by the Secretary under the Health Insurance Portability and Accountability Act reflect the integration of health information systems across multiple health care providers and other health care organizations, so that responsibility for health information security can be appropriately assigned. I will also discuss the need for greater awareness of the importance of health information security among the senior management of health care organizations.

There have been significant changes in the health industry and its use of information technology since publication of the Institute of Medicine report on the CPR. These changes are revolutionizing the medical record, and are creating complex challenges to health information security.

The IOM report was published in an era when most patient record systems were maintained by a single institutional health care provider such as a hospital. At that time, electronic recordkeeping still reflected traditional provider based medical recordkeeping with each health care provider delivering care to a patient maintaining a separate medical record on that patient. Since publication of the IOM report, hospitals, physicians, other health care providers and in many cases health plans have come together to form integrated delivery systems so they can provide seamless care to patients across the continuum of care and manage the health of populations.

As part of their movement toward integration, health care organizations are integrating their patient information systems. Increasingly, integrated delivery systems are capturing information concerning each encounter a patient has with a provider in the system in a single longitudinal electronic patient record. This change from the traditional provider-centered model for patient records to the new patient-centered model represents a revolution in medical recordkeeping.

Patient-centered records make possible improvements in the quality, continuity and cost effectiveness of health care that are simply not possible if each provider treating a patient maintains a separate record on that patient. Nevertheless, integration of multiple providers' patient and other health information systems creates new security challenges and demands new approaches to security, if health information maintained by integrated delivery systems is to be appropriately safeguarded.

When health information systems are integrated across multiple providers and health plans, health information security is no longer under the control of a single health care organization. Each organization participating in an integrated information system or network must have in place an appropriate health information security program, or information in the system or on the network will not be secure.

One organization's health information security deficits may make all of the health information in a system or on a network insecure. In addition, it is important that all health care organizations integrating their health information systems work cooperatively and adopt and enforce compliance with comparable and compatible security policies, procedures and practices, to assure the security of the health information they maintain.

The task of cooperation among health care organizations integrating their health information systems is complicated by the fact that the organizations participating in systems integration initiatives are often not under common corporate control, but may instead be loosely linked by contracts and by information and communications infrastructures.

Without a common parent corporation or single point of control, the authority to implement an appropriate health information security program, enforce policies and enforce sanctions will be decentralized. A recent NRC report emphasizes the importance of appropriate organizational policies, practices and procedures protecting health information.

Participants in integrated delivery systems and other decentralized organizational structures can address important health information security issues across these broader virtual organizations, in part by entering into appropriate contracts with each other as part of the systems integration initiative. Enforcement of such contractual obligations, however, is often not a practical way to compel a health care provider or health plan that is lax in the area of health information security to upgrade its security practices. If they are to be effective in an environment where the health information systems of multiple health care organizations are integrated, the health information security standards to be promulgated by the Secretary under the Health Insurance Portability and Accountability Act must require not only that each health care organization meet federal information security standards, but also that health care organizations forming health information networks or integrating their health information systems with those of other organizations establish effective structures and mechanisms for cooperation in implementing health information security programs across these networks and integrated systems.

Now I want to turn briefly to the role of information system vendors and consultants. The standards to be promulgated by the Secretary should address the responsibility that health information system vendors, consultants and others must assume for maintaining health information security when they have access to the health information systems of health care organizations.

It is common for vendors and consultants providing support services to health care organizations to be given access to the client organizations' information systems at all times, or at least on demand. More and more, health care organizations are out-sourcing operation of their information systems, making the security of their health information totally dependent on the out-sourced vendor.

Health information security depends on all these outside organizations. Yet, the Health Insurance Portability and Accountability Act does not make them directly subject to the health information security standards to be promulgated by the Secretary.

In the current environment, contracts between health care organizations and outside vendors and consultants do not always contain strong health information security requirements. In addition, contractual limitations on the liability of the vendor or consultant for breaches of security may mean that the vendor or consultant does not have real incentives to maintain appropriate security.

Information system acquisition contracts often limit the liability of the vendor to the amount of the price paid to the vendor or to a stated dollar amount, and often include the vendor's disclaimer of liability for any special or consequential damages arising from malfunctions of the system or a vendor's acts. That would include, by the way, damages for harm to patients.

The committee should consider recommending that the Secretary's health information security standards include mandatory provisions for each contract between a health care organization that is directly subject to the standards and an outside entity such as a system vendor or consultant given access to the information systems of the organization. These provisions should require that the outside entity comply with the Secretary's information security standards and that any contractual limitations on and disclaimers of the outside entity's liability not apply to liability arising from its failure to comply with these security standards.

Such mandated provisions would assist health care organizations in obtaining better cooperation and assistance from information systems vendors and consultants, and would give the vendors and consultants better incentives to maintain appropriate health information security.

Finally, I would like to address the role of senior management of health care organizations. The NRC report addresses the need for health care organizations to adopt technical and organizational policies, practices and procedures to protect health information. The report emphasizes the importance of management's leadership in developing strong information security programs, and notes that health care organizations appearing to have moved towards stronger cultural supports for confidentiality and security controls are those in which the values, policies and procedures have come from the very top of the organization.

I work with numerous health care organizations across the country on legal issues related to electronic medical records and health information systems' integration initiatives, health information networks, health information security and confidentiality and the ownership of health care data.

These organizations' health information management professionals and information systems professionals are often keenly aware of the importance of health information security, and make it a priority. However, in relatively few of these organizations does health information security appear to be a high priority for the organization's senior managers. Often, this is because senior managers are simply unaware of the critical importance of health information security in safeguarding the organization's valuable information assets, in protecting patients, and in preventing the adverse legal and public relations consequences that could result from a security breach.

I believe that the committee in its recommendations to the Secretary has the opportunity to highlight the need for government resources to be directed towards educating the leadership of health care organizations on the importance of health information security. Health information security will only become an organizational priority if it is the priority of leadership, and leadership must be educated if this is to occur.

I want to thank the committee for this opportunity to review some important features of the current health care environment that have an important impact on the ability of health care organizations to protect the valuable health information they hold. Americans must be assured that the confidentiality of their health information will be protected, and that their health information will be available and accurate when it is needed for their health care.

Thank you.

DR. LUMPKIN: Thank you very much. Do we have questions from the committee?

DR. GELLMAN: I have a number of questions. I'd like to begin -- Laura, you said the absence of privacy legislation was one of the reasons why people have not necessarily addressed the security issue. I wonder if that is much of an excuse. There seems to be very little in the privacy legislation that specifically addresses security, and virtually all of the security measures that have been discussed here, and some that of course we haven't gotten to yet, really don't have anything to do with the legislation. Is that a fair assessment?

MS. BROWN: Can you repeat the last again?

DR. GELLMAN: That most of the security measures that have been discussed really don't have anything to do with the legislation.

MS. BROWN: Security practices themselves stem from the guidance that legislation would provide. The standards and practices that are in development and that have been developed again follow upon what an organization or a group of organizations, such as those belonging to CPRI, feels should be the way that things are.

So the privacy legislation can only enhance those efforts that are ongoing.

DR. GELLMAN: I'm just trying to make the point that there is very little substance on security in the legislation. I think Dale, you made the point that there shouldn't be details in legislation. I think that is right.

DR. LUMPKIN: Excuse me, Bob. Can you get closer to the microphone? I'm having trouble hearing you.

DR. GELLMAN: That legislation shouldn't prescribe security practices. Basically, what you find in legislation is, it says there ought to be reasonable security, and leaves it up to somebody else to figure out what that means, possibly the Secretary, in some more flexible way.

Anyway, I'm just trying to make the point that lack of privacy legislation is a poor excuse for not addressing security today.

On to another issue, in talking to people who run electronic health information systems, I have often heard them say that they don't have any problems, they don't have any security breaches, that there has never been a problem with their system. The question I always ask them is, how do you know? I wonder if any of you could address that general subject. How do you know what is going on in a system unless you have done some kind of an audit? Anyone?

MS. BROWN: I think adding on to the audit aspect is -- the tools are not there to prevent it from happening in the first place, so the audit may not even record that a breach happened. So I think it is both ends.

MR. MILLER: I think that there is very little knowledge of the actual break-ins. The organizations that we work with that are dealing with a problem, where someone has come to them concerned that there has been a breach of confidentiality, usually are not sure how that happened, whether someone overheard someone in the hallway, or whether the information was obtained through the system.

I think that there are a lot of breaches that are occurring that organizations just don't know have happened, because basic system auditing of access is not occurring in a large number of organizations.

DR. LUMPKIN: Let me perhaps branch off of that. We know why we may not know these breaches have occurred, but given the attention that has been focused on that, do we have any stories? You don't necessarily have to bring them today, but we would appreciate any case studies that may indicate examples of where health care organizations or individual patients have been victimized by breaches. I think that would be helpful for the committee.

DR. DRAZEN: I think we all have stories. We can certainly document those stories.

I think that when people say they haven't had issues, that means that they haven't made a monetary settlement, and that they are unaware of breaches which involve disclosure of information for some personal gain. But I think that if we walked around almost any health care organization today, we would see breaches of confidentiality: visit lists of who is coming in for their initial O.B. visit posted on the wall for all to see; systems which have 10,000 users and passwords that consist of only five digits, so that even if you cannot break in on somebody's specific code, you can use any five-digit combination and get in. Since these don't necessarily result in external disclosure of information, they are not counted in the same category.
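
A rough back-of-the-envelope illustration of the five-digit password point above. The user and code counts come from the testimony; the sketch itself is an editorial illustration and assumes each of the 10,000 users holds a distinct, randomly chosen five-digit code.

    # Illustrative arithmetic only: why five-digit passwords fail at scale.
    users = 10_000           # active accounts on the system (from the testimony)
    code_space = 10 ** 5     # possible five-digit codes (00000-99999)

    # Probability that a single random guess matches *some* valid account.
    p_single = users / code_space
    print(f"Chance a single random code is valid: {p_single:.0%}")   # ~10%

    # Probability of at least one hit after n independent guesses.
    n = 20
    p_hit = 1 - (1 - p_single) ** n
    print(f"Chance of a hit within {n} guesses: {p_hit:.0%}")        # ~88%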

But I also would want to disagree with some of the things that have been said. I think that this is an issue that CEOs of health care organizations and boards recognize, that it is their responsibility. If they only knew what to do about it, they would implement measures very quickly. It is not an issue that is being ignored at the executive level. I think that because of some of the issues that Dale mentioned, they really aren't very well equipped to deal with it, since they don't necessarily have a person that is a security expert. They wouldn't know how to go out and qualify somebody for that position, so they really are lacking the expertise to be able to deal with it.

DR. LUMPKIN: Let me perhaps clarify my request. One of the things that we are dealing with as a committee, we recognize that in many ways, the system may have a lot of open doors, but there are a number of small communities in Illinois where people don't lock their doors. They don't lock their doors because no one comes in.

We suspect that there is a security breach, or certainly a security risk there. In order to better explain to people our recommendations, I think it will be useful for us to be at least able to demonstrate some vignettes of the fact that there is a hazard to leaving your door unlocked. I think it is very important for us. What is coming to this committee -- and it has been very clear in some of the hearings -- is that it is not adequate to say that the paper information processes of health care have all of these security breaches. But there is a higher standard that I think is expected, because the public believes that there is a much greater risk to an electronic system. So we need to look at not only the vignettes, but vignettes related to an electronic system, if possible.

MS. WALLER: I am actually aware of one story, though I am unable to disclose any details of who the patient is or who the client is. It is a major incident of a clinical information system sabotaged by a rogue employee of a vendor. The rogue employee came in through a back door and sabotaged the system so badly that a health care organization that had pretty good backup and recovery systems in place was unable to get any clinical data back up for 24 hours, and it took months to recover anything like normal operation of its system.

Distressingly, the response from the vendor was very casual. The vendor actually told the client how the incident happened and that sort of thing, and then proceeded really for months to take a very casual approach, even though we believe that the rogue employee had violated federal and state computer crime laws and other things.

So I am aware of such incidents. I think that the public is more focused on the confidentiality aspects of security. As someone who is paid to help health care organizations identify and manage their risks, I am actually much more concerned about integrity and availability issues.

DR. LUMPKIN: If you could just write that up, maybe a paragraph or so, leaving out the vendor and the place. We are not looking to crucify anybody; I think we're just looking to illustrate the problem.

MR. MILLER: One of the problems in trying to collect these incidents is that when patients are harmed by a disclosure, taking action or having something happen as a result of that disclosure usually results in further disclosure. You don't always hear about the fact that a disclosure occurred because someone lost their job; if it affects their family relationship, the people in risk management at the health care organization may hear about it, but no one else.

The kinds of things that we are hearing from some of our clients, things like someone calling a hospital and saying, I'm coming in for some procedure. I have a relative that works at the hospital, and I would like assurance that that relative will not have access to my records. There are things in there that I would like not to be public. We are hearing that more and more from organizations that are getting that kind of question from patients and prospective patients.

DR. GELLMAN: Let me just make one point about the issue of threats to the system; threats come from all over. I think the NRC report did a good job of laying out the range of alternatives. Too often when we are talking about computerized systems, we talk about hackers or other kinds of computer-based problems, whereas the only two studies -- investigations -- that I know of that have been done in really the last 20 or 30 years in North America showed an incredible pattern of stealing of medical records by private detective companies and insurance companies, in ways that patients had no idea of, that no one had any idea of. The patients are the last to know in these kinds of situations. The only way these things ever came to light was essentially by accident, or in one case by issuing compulsory process search warrants in order to get access to records and follow up the investigative leads.

No one in the whole health care industry had a clue about this. None of this was ever reported by any of the people who were involved, insurance companies and certainly of course not the investigative service companies that were stealing the records.

So there are widespread practices out there. There are plenty of people with incentive to get records in ways that patients and facilities have no idea what has happened, and no one can see a direct connection between a particular event and the acquisition of a record, so there is not even any suspicion.

So it seems to me there is a tremendous burden here on the record keepers to pay a lot more attention to the actual threats that are out there, and even some of the ones that aren't visible on the surface, or aren't ones that you would otherwise predict, because otherwise, there is never going to be any action taken to try and stop these activities.

DR. COHN: I could perhaps be slightly off the topic, but I feel like I need to ask a really naive question, and I will apologize if I am embarrassing the other panel members here.

DR. LUMPKIN: We'll let you know about it later.

DR. COHN: Okay, please. It just seems to me on a very high level that obviously, issues of security fall into two areas. One is the policies related to security and the other is the technology.

I am actually struck as I listen to this, and I can't quite decide as I listen to the panelists, whether the issue is that there really is the technology out there, and it is just that many people do not consider this to be valuable information. In other words, if this were a bank, would we have the same problem? That is the question. Or is it really that the technology -- that this is really much more sophisticated, and we just don't have all the technology that we need.

Perhaps the panelists can help clarify that for me, because I see the span as I have listened to all of you. Help me with this one.

DR. LANDWEHR: I'll take a crack at that. I think in this community, there is a lot of technology that is not being used and could be used in many cases. We are hearing discussions about passwords in systems. This is not new technology. This is in fact technology that I wish would be out of date. But it is nevertheless not even uniformly used in this area.

So technology is advancing in this area, but there is plenty of technology that could be used that isn't being used in most places.

MS. BROWN: I think part of where the issue stems from is that information security stems from business decisions that have been made, and technology is implemented to uphold those business decisions. Because health care has changed so much and so quickly and so drastically, those business decisions haven't been made.

Therefore, the technology can't be added to uphold those business decisions. Now, once the decisions are made, there are very basic technologies which the industry has not used.

Now, granted, we may find out that there are technologies which are not available yet, because we are such a unique industry. But I think it is the lack of business decision up front which hinders the implementation and the understanding of what technology is going to work.

MR. MILLER: Those are tough business decisions, and making them forces management into putting some things down on paper that are sometimes very difficult -- in terms of who has access, in some instances restricting access by physicians -- decisions that may be very politically unpopular at first.

So I think that leads to the problem of stepping up and making those policies initially. I think another aspect is that this problem is very complex. In other industries, the number of people with access to information tends to be far fewer. We hear the stories that for an in-patient hospital visit, 70, 80, 100 people -- pick a number -- have access to the record.

When you look at how you deal with that complexity across an organization -- direct caregivers, people trying to get the bill paid, people looking at quality of care, people doing research, people using the record for education -- from an information security perspective, it is a very complex problem.

That I think is a little bit different from other industries. If we were to use the technology that is available, we could make tremendous improvements in the current state of security.

MS. WALLER: As someone who has negotiated with numerous vendors for clinical data repository and master patient index systems, particularly the data repository systems, and who has had clients present lists of security requirements to the vendor, it is a very common experience to have a vendor say, if you choose X, Y and Z of these requirements, your system response time will slow down; we will double the response time we are willing to warrant.

So what I have found is that with the major clinical data repository systems that are now available and on the market, the client is forced into very difficult tradeoffs between functionality for the purposes they need the system for, and security features that they would like to have.

When you are doing these multi-provider data repositories, you actually may need more finely grained security features and better audit trails, because you have a much more complex system across which to administer security.

DR. DRAZEN: I just wanted to make one comment. I think on balance the answer is that the technology is available. The fact that the technology has not been implemented in some of these vendor solutions, the fact that it isn't on everyone's list, is a problem, but it is not a technical problem.

I do agree that this issue of the balance between access and security is one of the areas where we could use technology advances. But I would agree that what we have today if implemented could go a long way toward securing confidentiality of information.

DR. FRAWLEY: I just wanted to follow up on the issue regarding vendors. Many of AHIMA's members are in the process of trying to acquire clinical information systems. What they are actually finding is what we call disincentives, where the vendor is advising the client that providing adequate security, based on either what CPRI recommends or the standards development organizations, will require customization to the tune of millions of dollars. Yet they will find out that there are other hospitals around the country who have faced similar situations acquiring the same clinical information system. So there are disincentives that we are concerned about in terms of dollars.

The other concern, of course, is the fact that during implementation it often becomes obvious to the health care organization that the representations that were made by the vendor, in terms of the adequacy of the audit trail function or other security technologies, were not accurate -- the function is not present. Many of our members have called us to report organizational practices that almost border on fraud.

I would just like to get a comment particularly from Laura and Dale in terms of what your perspectives are, in terms of being out there, helping clients who are struggling with implementation.

MR. MILLER: I would agree that from what we see, during the implementation phase is when people discover the security issues. Often they don't address them until that point. They haven't included detailed security requirements in their evaluation process from the vendors.

The experiences you are describing are very similar to what we are seeing with our clients, that people are starting to ask for this. They ask for it too late. Then the vendor response is that it must be a customization.

DR. FRAWLEY: I can validate the first part, but even up front, some of my clients have requested that specific security controls be implemented in the system, and the vendors have come back and said, this will be millions of dollars extra.

DR. DRAZEN: But I think if we look back, we'll find the same was true, for instance, when HL 7 first became a consensus standard. People said yes, and they really didn't have it, or people required payments. But I think that the perseverance is important, because obviously the vendors do respond to customer demands.

So the fact that they are saying yes is a good sign. The fact is, if they are required to say yes enough times, they will develop those systems.

MR. BLAIR: I've heard from the testimony that I have had a chance to read and listen to -- and I am aware of the fact -- that we have begun to try to coordinate a number of the standards development activities at the feature and function level, and when it gets to privacy/confidentiality, of course there are activities to go forward on federal legislation.

We also have a pretty good set of policies and practices defined by the CPRI and ASTM with respect to data security within a given health care organization. However, in the testimony that you have given, many of you have raised concerns that as the health care environment has changed -- with integrated health systems, multiple providers, also linked with payors, outside vendors, consultants and carriers -- we seem to be losing control in trying to have consistent implementations.

My question to you is focused specifically on the area of the policies and practices. I think there is going to be other opportunities for us to focus on the commonality and integration of the encryption technologies and other technologies with some of the standard folks that will be speaking to us.

But specifically with respect to the policies and practices, do you feel that, in addition to the guidelines and standards that have been developed by the CPRI and ASTM, for example, there is a need to go a step further and have the Joint Commission and NCQA, for example, agree on common activities in their annual accreditation audits -- and I understand there are others for the intermediaries as well -- so that we can begin to have all of these elements of the health care delivery system apply the same standards for policies and practices for data security, as well as for the technology?

MR. MILLER: Many of the projects we are involved with were initially started by the organization in response to an upcoming Joint Commission survey. So there is a lot of incentive to be prepared for those accreditation surveys.

I think if the NCQA and the Joint Commission were to look at how organizations are utilizing CPRI's material and the ASTM standards, and do part of their accreditation survey on that basis -- I don't think they need to develop their own standards, but rather look at how the organizations are using those standards -- that would provide a significant incentive to health care management to formally address security.

MR. SCANLON: I wonder if I could get a sense from the panel of the general nature of federal standards that you would be recommending. You seem to be concluding that there more or less are generally accepted health information security practices or policies or software that could be brought to bear in the health area, if some other barriers could be overcome.

You also seem to be describing -- you focus most of your discussion on the actual clinical setting, more or less this morning, the hospital or the provider and the networks that they would employ.

First of all, what would be the level of standards that one might want to consider at the federal level? Are we dealing with policies and principles of a broad nature, or are you talking about fairly specific techniques and even software that should be part of the standard that would be envisioned?

Secondly, would the standards really have to differ, depending on whether you are talking about an insurance company, a clearinghouse, a hospital, a clinical data repository? Would the standards be such that they could apply across all of them, or would you really have to differentiate and have different standards, almost, for those different settings?

MS. BROWN: I'd like to address the first part. Because there are so many existing guidelines and standards out there, I think it is important that there be a survey to really look at how all of these come together, and to provide a framework for how they all fit into place.

I think that just telling organizations that these are the practices and policies which you will put in place will almost tend to confuse. I believe that they will just comply for compliance's sake and not really accept it, because they don't understand it.

I think that providing a logical, layered understanding -- this is where policy is, upon that you build principles, upon that we have developed these standards, and this is how all these standards come into place -- will mitigate more risk, and I think that there will be more acceptance and a true understanding of how information security fits into the whole health care industry.

Having said that, I think any of the practices and guidelines discussed this morning, if grounded upon principles, would be valid throughout the industry.

DR. DRAZEN: I would say that the issue that needs to be determined at the national level is the issue of rights of access to information -- what are the patient's rights to access information, what are the providers' rights and what are the payors' rights. So what is the appropriate need to know for those categories.

One of the issues we have today is that everyone wants to have access to everything, and it feels that they have a need to know for everything. Once those issues of the ownership and the privacy rights are established, then I think the same guidelines can be applied to implement that legislation throughout the industry.

MR. MILLER: I think that at the federal level, the most effective -- what would be most effective are policy general concept kinds of things rather than detailed how-to's. I think as far as across industry, the concepts and policies are very applicable across industries.

I think the implementation, based on how the organization does business, can vary greatly. The general policies and general concepts and tools need to be adapted to those organizations. So providing guidance at the policy level I think is most important, especially as this crosses health care providers and insurers in different organizations.

DR. LANDWEHR: We did try to deal with that in the report to some extent. I think I would agree with what has been said in terms of, you don't want to get extremely specific until you really have to, because the more specific you get, the sooner it will probably get out of date. But you want principles for accountability, for example, so you need a system that can provide you with accountability, whether it is in a rural clinic or a major metropolitan hospital. You want to have reasonable accountability within that system, and that is the kind of principle that you build on.

DR. BRAITHWAITE: I had a question for the panel that refers back to the general title of this panel: overview of issues and state of the industry.

I heard one person say that we couldn't really use what is done in other industries, specifically the banking industry, because health care is much more complex. I would like to hear all of the panelists address that. Is our industry in fact so different and so complex that we can't use what is going on in other industries, that we can't adapt it and make it useful in the health care industry, and that we have to re-invent everything for our own field?

MS. BROWN: I'll comment first, because it was my comment. I don't think I was saying you cannot use it. My fear -- and what I have been hearing as I attend different organizational meetings -- is that there is a belief out there that we can just take everything that the industry has ever done and just slap it into health care, and it will be an easy fix.

The point of caution I was trying to make is, there have obviously been some great standards development efforts in other industries. Let's look at those; let's be very careful about how we bring them in. Let's not just say, oh, these will apply to us.

There were different networks already in place. There were different organizational bodies governing the banking industry, and differing business objectives, focused more on integrity as opposed to our issues of integrity and confidentiality. So there are enough differences that I urge caution, based on other places where I have been, where people have said, let's just take all the standards out of banking and put them into health care.

MS. WALLER: Yes, I would agree you cannot import what banking has done and just apply it on a one-to-one basis to health care. I think Laura alluded to one of the big reasons. If you look at the primary goals of security in the banking industry, privacy is not really very high on their list. What they care about is that when you put your card and your PIN in the automatic teller machine, you get the amount of cash out that is recorded at the other end at your bank on a CIRRUS network or whatever.

They care a lot about accuracy and integrity issues, and they care about availability, but they may not care as much about privacy, and it is because they are not under the same legal restraints that the health care industry is.

DR. DRAZEN: I think another real difference between banking and health care is this issue of access to information. It literally is true that minutes might make a difference in accessing information, and health care information is not consolidated, so it really makes a difference that one provider knows what happened in another setting, whereas the banks can be pretty well segregated.

If you look at the number of accesses to a number of entries of information in health care, the multiples are orders of magnitude higher. So we have more access, and access is critically important. That is an area where there is a huge difference with other industries. Although I would agree with Adele that many of the technologies for authentication, for auditing, for creating audit trails, are directly transferable, so we need to make sure that the solutions actually meet our business requirements in health care.

MR. MILLER: I would strongly disagree with the statement that new methods and techniques need to be invented for health care. I believe that we can apply especially the tools that are being used in other industries than health care. I don't think we need to wait for a new technology to be invented.

There is however a saying amongst some information security practitioners that information security is not like socks; one size doesn't fit all. So the customization of the program to the organization or to the network is extremely important, making sure that the policies and the way these tools are implemented match the organization's culture, its systems environment and its management style.

That is where the customization needs to occur. We can take a lot of the tools and technology that are out there, and the basic general tools of accountability and apply those to health care.

MS. BROWN: One more thing. I think that is another --

DR. LUMPKIN: It is Charles' turn.

DR. LANDWEHR: I would just say that I think the technology developments are probably not going to come from this field. The basic technologies are going to be imported into this field.

What you have to look at is, medical records are different in many ways than banking records. They are longer term, they are more personal, and they are often more voluminous. So you're going to adapt solutions from other fields using the technology from other fields, but they will have to be adapted and imported appropriately to this field.

MS. BROWN: And you adapt those solutions based on an understanding of risk. I think that is a difference, too. What is the banking industry's risk? They adapted technology accordingly. What is health care's risk? You look at your risk, you perform a risk assessment, and you adopt the technology to meet the risk, as decided by management.

DR. LUMPKIN: The acceptable risk.

MS. BROWN: The acceptable risk.

MR. MILLER: If there is a breach of security in the financial environment, you can always make the customer whole by repaying the money. Usually the confidentiality disclosure that occurs is not as significant as a health care confidentiality disclosure. It is very difficult to make the patient whole after a health care disclosure.

DR. VAN AMBURG: I have two questions. All of you have commented at least briefly on the fact that technology to protect the data exists today but is not being used by the systems -- that is, where systems are being shared, timeouts aren't being used, et cetera.

To what degree do you think that is the result of resistance from the actual users of the system?

DR. LANDWEHR: I think there is significant resistance.

DR. VAN AMBURG: Why?

DR. LANDWEHR: It costs time, in some cases.

DR. DRAZEN: I think that there is significant resistance by users to systems that aren't appropriately designed to meet security requirements.

Generally, the systems that people leave logged on and that don't have timeouts are systems where it takes five minutes to log on, and you only get through one out of two times. However, systems that are designed appropriately, people use appropriately.

So I think that we have to look at the systems rather than the users for most of the solutions to these issues.

DR. VAN AMBURG: My second question. I am particularly interested in data integrity. The incident described earlier was a massive corruption of the system, which was fairly easily detectable.

What is the state of the art for subtle corruptions, where maybe one value was changed on 15 patient records uniquely? How would this be detected, and what are the risks to the system from this?

MR. MILLER: I think the risks of essentially a terrorist act are very high. As Bob mentioned earlier, such a change of random values would be very difficult to detect, especially if those values are within normally expected ranges. I think most organizations today would not know that it had occurred. They would have to have some good luck in running into some indication that would let them know that someone had changed the information in the system.

I think the current controls in place in many organizations would not provide protection against this happening, and would not provide a record of it occurring.

DR. LUMPKIN: I have a couple of questions. Let me put in your mind that we are charged with coming up with some recommendations somewhere in this process. Given that that is our charge, if we were to set in place a recommendation to move to some standard of security, what would be the cost to the industry to comply, and what would be a reasonable time frame in which to expect them to comply with those kinds of standards?

DR. DRAZEN: I think this is actually going to vary tremendously by the individual institution. So it is really hard to come up with those estimates.

But I think the cost of complying with policy standards and procedures should be very minimal, and a year to 18 month time frame is appropriate. The cost to comply with some of the technical standards will require a replacement of existing systems. That at a minimum is usually an 18-month to two-year process.

So I think that on the technical side, it may be a longer time frame, and it may be more costly. However, having said that, there is a tremendous amount of change going on in the industry today, and a tremendous number of systems are being bought, especially for the ambulatory care environment. So the cost of delaying these recommendations will be enormous for the industry.

So I think that the cost this year would be about 50 percent of the cost if the recommendations don't come up for a year, for instance.

MR. MILLER: I think it is very difficult to tell how much the incremental cost might be. In many organizations, there is concern about information security, and activity occurring at low levels in those organizations. Many committees or project teams implementing a system are dealing with security, and they spend a lot of hours and a lot of effort on it. If that is not coordinated across the organization, then the next team implementing another system somewhere else in the organization goes through the same things, so the money and the effort are being spent anyway. Coordinating that across the organization, with a senior management focus, would give us much better results. So the actual costs might not increase very much.

The primary cost, I think, to most organizations is recognizing that they must assign the responsibility for addressing security across the organization. That may mean additional people, or it may mean taking people away from other tasks to deal with it.

DR. DRAZEN: I think if we got a bunch of creative people together, we could actually come up with a cost savings for doing this.

DR. LUMPKIN: Let me follow up on that question. We have heard a lot of questions, and some of the resistance that it seems I am hearing is coming from the vendor community. To my mind, if a vendor tells me that it is going to cost more to add security, then what they are trying to tell me is that it was not planned very well in the beginning.

I went to a major university, and they were building a library. They got about halfway done, and they discovered that someone forgot to account for the weight of the books. So rather than tear it down, because that would look bad to the alums, they re-engineered it, which cost twice as much as it would have cost if they had torn it down and rebuilt it.

So if you build something in from the beginning, it obviously is much more functional than if you tack it on at the end.

So my question is, given your knowledge of the state of the art and the products that are out there, if in fact we implement security standards, in your perspective -- because we will ask this of the vendors a little bit later -- do you think that the vendor community would be able to put product on the street in a very short period of time that will meet the standards without being so kludgy that providers will refuse to use it?

DR. DRAZEN: Since we have many vendor representatives in CPRI, the consistent message we have gotten from those vendors is that their cost is associated with having no standard, because every customer wants something different. If a standard is declared, they are in a position to be able to meet it.

Also, many of the vendors, especially those that are developing applications to meet our new requirements, like clinical data repositories, patient identifiers, ambulatory care systems, have actually built their systems in an era where security was very important. Again, with the lack of consistent standards, they have all invented their own ways to deal with confidentiality and security.

So I think that some vendors will be penalized, but also, the vendors that have kept track of what is required will be rewarded. One of the problems right now is that they aren't being rewarded, because this is not a requirement, so the investment that vendors have made in implementing confidentiality and security features in their systems doesn't translate into a marketplace advantage.

So I think that it actually will reward some and penalize others. But this is not new information that this is a requirement in health care, so folks that were aware of it have been building around the requirements.

MR. MILLER: One problem we face today is that vendors have not used the security features that are already built into operating systems -- features that were designed into the operating systems because the operating system vendors wanted to sell them to the Department of Defense. The health care systems are instead implementing security in the individual applications.

So I think there are some incremental improvements that can be made, but the best security really is security that is integrated throughout the design of the product, and that would require a redesign. I think there are some short-term fixes that can be made in many products that will improve security. With the rapid turnover in the design of systems, I think there is an opportunity, as vendors start from scratch, to build in security, if it is a major objective for them.

MS. WALLER: I think there would be a big difference if you were going to insist that health care organizations conform with the standards with respect to their legacy systems. You have to understand that the law requires that you keep patient records around sometimes for a very long time. We can assume that we are going to have a lot of legacy systems, clinical information systems and others around for a long time. I think the cost of re-engineering those systems to make them secure could be really too onerous for the industry to bear.

Given the way the software licenses were written, it would be the health care organizations rather than the vendors that would for the most part bear those costs. So I think that if you are saying, going forward, new systems have to meet standards, I think we could do it a lot quicker than if you're going to impose those standards retroactively on legacy systems.

DR. DRAZEN: I actually would disagree with some of that. I think that first of all, a lot of those legacy systems have to be replaced because organizations are integrating and changing. And there are opportunities to put the security with a clinical data repository, and have that be the source of access on the network.

So just because you might have an old departmental system that doesn't meet security practices doesn't mean that you can't guarantee confidentiality of information throughout your integrated delivery network. There are other ways to do that. Frankly, there are also ways that that legacy system doesn't necessarily need to know patient identification information. We have traditionally supplied that to the laboratory, for instance, because it came down on the laboratory slip and you needed to know it to get the result back. But with a laboratory system nowadays, it is all electronic. You can assign a special number to that patient that is just an accession number, be able to compare with past data, and never know who the patient is within that system.
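
A minimal sketch of the accession-number approach Dr. Drazen describes, with hypothetical names (AccessionMapper, issue, resolve) that are not any vendor's actual design: the laboratory system receives only an opaque accession number, while the repository keeps the mapping back to the patient.

    import secrets

    class AccessionMapper:
        """Held by the repository; the departmental system never sees the MRN."""
        def __init__(self):
            self._by_accession = {}   # accession number -> medical record number

        def issue(self, mrn):
            accession = secrets.token_hex(8)   # opaque identifier for the lab
            self._by_accession[accession] = mrn
            return accession

        def resolve(self, accession):
            # Only authorized repository code paths should call this.
            return self._by_accession[accession]

    mapper = AccessionMapper()
    lab_order_id = mapper.issue("MRN-0012345")
    print("sent to lab system:", lab_order_id)            # carries no patient identity
    print("repository resolves:", mapper.resolve(lab_order_id))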

DR. LUMPKIN: One last question, and this is hopefully a quick short answer. I would like to get one from each of you.

If you were in our position, which is to make a recommendation to the Secretary, is there a standard that you would feel comfortable in recommending to the Secretary, versus just general principles? I think we are at the position where we would feel obligated to recommend a standard by some accepted group. What recommendation would you make to the Secretary?

DR. LANDWEHR: I guess my recommendation is not that you tell them a standard, but that you set up a standards organization. That is really what the committee was after. It is not something you do once in any case; it is something that has to be kept up to date.

MS. BROWN: I would just feel better if there were a survey done of all the standards that are out there, and then they were put into some logical larger framework, rather than being implemented as piecemeal, scattered practices or standards.

DR. LUMPKIN: So you see no likely candidate at this time?

MS. BROWN: I wouldn't pull any one of them. I would integrate a lot of them. A lot of them have a lot of strengths.

MR. MILLER: There has been a lot of work done with ANSI and ASTM and those existing standards organizations. I think we recommend that ASTM and the others under this umbrella be recognized as the standards organizations.

DR. LUMPKIN: But you see no candidate standard?

MR. MILLER: I think there is a series of information security standards in health care that exists today. I think that it would be unwieldy to have one standard; there is a series of standards applying to different aspects of the security problem. I think it is important to maintain that organization or set of organizations to keep those standards up to date and to keep applying new technology.

DR. DRAZEN: I would recommend that the committee take responsibility for outlining all the areas that need to be covered in the standards. I do think there are some standards-like bodies like ASTM and CPRI who can fill some of those holes. But there is no holistic standard, so I think the committee does have to take responsibility for making sure that all the areas are covered with a recommendation.

MS. WALLER: I would agree that there is no one standard out there. I would hate to see the committee adopt something that is frozen in time, because I think it will quickly become an out-of-date information security standard rather than an information security standard.

I have clients who exchange health information on the Internet, and literally, the threats and the solutions to respond to those threats occur on an hourly, daily and a weekly basis. I think that there is no one standard that the committee could promulgate that would address all of that, unless it is at such a high level that it has no detail.

DR. LUMPKIN: We are scheduled to take a break. I'll take one more question.

MR. BLAIR: I would like to piggyback on your question a little bit, and maybe direct it a little more specifically.

The CPRI and the ASTM have been developing guidelines and standards for some time. They do address a lot of areas within existing health care organizations. I would like to get reactions from each of you on two possible directions. Number one, do you feel as if a lot of the overlapping activities of those two groups can be readily converged? If the two of them work together on common definitions and a common set of practices, is that something where you feel that those inconsistencies can be readily worked out in a reasonable three, four, five, six month time frame? Is that an option for us?

The other is, do you envision that these basic guidelines and principles from CPRI and ASTM can be used as a foundation, within a reasonable period of time, to extend data security practices so that they can handle multiple institutions, integrated delivery systems, the Internet, intranets and extranets?

Two questions. Could I hear from each of you, please?

DR. LANDWEHR: I think speaking as a representative of the committee, I can't really respond to that, because I don't have enough knowledge of those two organizations.

I would be hesitant to recommend that this subcommittee rely however on two disparate groups converging. This sounds like a possibility, not a certainty in any case.

MS. BROWN: To your first question, I think Kathleen can answer. Yes, you can take what has been done and combine them, because we just did that last week.

But to your second question, I do believe that it is important to take what has been done and compare it to generally accepted information security principles and practices outside of health care, and make sure that the health care industry doesn't go off on its own path, creating security principles that really don't align with other industries as far as generally accepted security system practices.

So I think that they don't necessarily create a foundation, but they certainly contribute to it. I would again urge looking outside to get the common foundation and build those into that.

MR. MILLER: I think the work done by CPRI and ASTM can be used right now to very much improve security. I think that work can be brought together by the organizations. I think that there will always be multiple standards organizations, not just one that has the answer for the moment.

I think that Laura's point about using certified information systems security professionals brings a global perspective to information security. Those people can apply CPRI and ASTM standards and make a significant improvement today.

DR. DRAZEN: I can speak for half of that collaborative, namely, the CPRI. We would enthusiastically support that idea. However, with a few cautions. Being a volunteer organization, we always have trouble doing things, because we have no funding for that. So a bit of funding would help inspire it, as well as a charge from the committee. With a charge from the committee, you can inspire volunteers even without money.

So I think that that would be very productive. We have already had some initial discussions, and have this on our agenda of things that we would like to see accomplished, too, to certainly meld the glossaries.

I also think that bringing in the NRC report as a perspective on the areas to be covered, we could also quickly identify where neither organization had really adequately covered a topic and provide a focus for the committee for their search to fill in the gaps. I would include that as part of the charge.

DR. LUMPKIN: I can guarantee that this committee is willing to double our funding.

MS. WALLER: As a legal practitioner rather than a health information security practitioner, I am not qualified to speak specifically on that. However, given that I have a lot of clients who have relied on both of those organizations in developing their information security programs, I would enthusiastically endorse building on what they have already done. I think that there are a lot of people who have already paid attention to what those organizations have accomplished.

DR. FRAWLEY: I have to make my disclosure first, so that everybody knows my particular bias. I did chair the CPRI work group for five years with Adele and Dale, that published all of those guidelines. I am also a member of ASTM. I was also on the NRC study committee with Carl.

That is why I'm here, right?

We did have a joint meeting in April at the ASTM meeting in Nashville, and we did have representatives from HHS, ASTM, CPRI and HL 7 meet to talk about ways to bring our work together. One of the things that we did discover obviously was that the industry needs a standard glossary. When we talk about access and people talk about disclosure we get into a lot of debates about whether we are on the same page.

We did have follow-up meetings in June around the ANSI HISB meeting. We did have a team of representatives who volunteered to go to Seattle, Washington. Seven people trekked across the country, representing the various groups, to work on five new standards that will be balloted through ASTM. CPRI is presently working with ASTM to see if there is a way that we can take some of our guidelines through ASTM and then have them balloted and approved as ANSI standards.

So that process is in place. I think that all of the standards development organizations are to be commended for coming to the table and really trying to work together.

So what we would like to see is some commonality and not necessarily everybody going off and doing 50 different standards and having industry saying, which one do I select. So just so the committee knows that that work is in progress. It has been very positive. I think also, the HHS staff that is working on the security team side has probably helped them, in terms of not being hit with lots of different perspectives.

DR. GREENBERG: John, if I could just ask one question related to that, hopefully not keep people too long from their break.

In light of that work that is going on with ASTM and CPRI, which has been going on for some time and has obviously been accelerated, I was trying to understand what the recommendation of the NRC report concerning the creation of a standard-setting body was related to, and the recommendation that the national committee should encourage that. I didn't know if that was related to this work that is already going on, or if you were actually recommending that a new standard-setting body be developed.

DR. LANDWEHR: What was recommended in the testimony this morning I think was a new body within the NCVHS. I think that was from the report.

DR. FRAWLEY: You also have to remember, we wrote our recommendations last July before HIPAA ever passed. Then we had to reconvene our study committee in September, because we were panic stricken that perhaps we had not addressed some of the requirements of HIPAA. We looked at it again, but that was before the NCVHS met and set up the subcommittee process that we now have.

So you have to remember that that recommendation was drafted a year ago, just basically saying that there should be a process within the national committee. Obviously, the national committee went that way, and we now have this subcommittee dealing with health data needs standards and security.

So I think basically, we have already done what the NRC report --

DR. GREENBERG: So I shouldn't be reading this too literally about a standard setting body.

DR. FRAWLEY: No.

DR. GREENBERG: Because that is not what the committee has done, and it probably could not be done by the committee.

DR. FRAWLEY: Right.

DR. GREENBERG: You feel that what has been done meets the requirement?

DR. FRAWLEY: Yes.

DR. GREENBERG: Good.

DR. LUMPKIN: I'd like to thank the panel. I apologize for coming in late, but I did get up at 3:30 this morning in order to be here, and certainly it was well worth it. So thank you very much for coming. We'll take a 15-minute break.

(Brief recess.)

DR. LUMPKIN: Can we have our second panel for today, the panel of providers? I'd ask each one of you to introduce yourselves, and then we'll start off with Mr. Auton.

MR. AUTON: Hi. I'm Sean Auton. I'm the assistant medical center attorney at the University of Michigan.

MR. BEATTY: Hi. I'm Gary Beatty from the Mayo Foundation. I'm an electronic commerce specialist.

DR. COOPER: I'm Ted Cooper. I'm an ophthalmologist at Kaiser Permanente. Also, I'm the chairperson-elect for the CPRI.

DR. LUMPKIN: Okay. Sean?

MR. AUTON: First off, I'd like to apologize for my crackling voice, and I'd like to assure the committee that it isn't a relapse to puberty, rather just a severe cold.

Members of the National Committee on Vital and Health Statistics, my name is Sean Auton. I'm the assistant medical center attorney at the University of Michigan. Prior to my legal career, I was a teaching assistant at the University of Michigan College of Engineering, where I taught courses in computer network design, implementation and security.

Today, I am pleased to discuss security standards and issues surrounding health care. I commend this subcommittee for undertaking a project to gather input from such a large and diverse group of health care participants. The creation, gathering, organizing and promulgation of health data affects a wide variety of participants, each of which has its own set of issues.

It is important when attempting to set a standard for health care data that these various needs and uses are understood, to prevent the creation of a standard that interferes with the delivery of care. My comments today will focus on what the medical record is, who is responsible for it, the standards applicable to custodians of these records, and the challenges facing these custodians in the electronic age. I will also review the difference between internal and external uses of electronic information and how security standards need to allow for these differences.

The medical record and the data contained within it is owned exclusively by the hospital. Chapter 482, Section 482-B3 of the Code of Federal Regulations requires a hospital to maintain a medical record and release information only in accordance with federal and state law, court orders and subpoenas. While there are a few exceptions, such as for photography, this information is the business record of the institution. It is important to note that the providers of this data, patients, retain certain rights regarding the use and dissemination of the data, but they are not able to remove the data from an institution.

This is a longstanding principle in medical care, with the definitive case being decided in Michigan in 1935. The state Supreme Court ruled in McGarry v. J.A. Mercier Company that a patient is not buying a medical record; rather, the patient is purchasing the professional services of the physician to create and interpret a record. While patients are granted access to their records, courts have, with one notable exception, ruled that such access does not entitle a patient to actual possession of the record.

The role of the provider as custodian of the medical record is well defined under a myriad of rules and regulations. Providers are required to maintain a complete and accurate medical record under federal and state law. Such a record requires a combination of specific medical, personal and financial information. Furthermore, such a complete record is required for participation in federal, state and private reimbursement programs. Often, payors will require that this record be made available to them for billing review.

These requirements imposed on providers are a formal framework for the use and security of health data. Unfortunately, many of these requirements are imposed only on the provider and do not cover what happens to the information once it leaves the provider's domain, such as when it is shipped to an insurer.

Further complicating matters is the fact that almost all patient privacy requirements are state based. Federal privacy law in the area of health care is limited to information about drug and alcohol use, and there is no federal recognition of the physician-patient privilege.

The federal court for the Northern District of Indiana visited this issue in a 1995 case, United States v. MHC Surgical Centers, and concluded: "It strikes the Court as outrageous that individuals who have sought medical treatment for matters of a highly personal nature will have their privacy invaded because they had the misfortune to consult a physician who has become the subject of an IRS investigation.

"However, the physician-patient privilege is a creature of statute. Because the Seventh Circuit has not explicitly recognized the existence of any corresponding privilege under federal common law, this court reluctantly concludes that the IRS is entitled to the information that it seeks."

With the advent of electronic information systems enabling providers to ship information across state lines, and in the absence of any federal regulation, privacy rules that are purely state based quickly become unworkable or unenforceable.

The state of Michigan offers a perfect example of this problem. Currently, the medical processor for all claims in the state is Blue Cross/Blue Shield of Illinois, which resides in the Seventh Circuit. While Michigan and Illinois both allow access by payors to an individual's medical record, neither one of the states changed its policies or established new requirements for the handling of patient data. All hospitals in Michigan will be obliged to follow this standard.

The logical and troublesome conclusion from this case is that as electronic information systems make it possible for a provider to handle and ship information to every state, the provider is forced to comply with all 50 states' requirements. Federal action is required either to confirm the traditional view for the electronic age -- that a provider is bound by federal rules and its own state's rules regarding health data -- or to standardize medical record requirements, superseding current state law. While the latter solution levels the playing field for providers from all states, such action may weaken privacy protection and regulation of records in states that have the most advanced medical record legislation.

Furthermore, without standardized legislation in the area, medical providers will have to develop a different set of standards for the sharing of information internally versus providing information externally, even to another health care provider.

The University of Michigan has applied all the current rules for paper-based medical records to its electronic environment. One benefit of this policy has been to leave us in a position where we are already compliant with the recommendations that the National Research Council has presented in protecting electronic health information.

From this experience, Michigan has developed an awareness of the problems with the current rules, and has discovered problems that need to be addressed for electronic health systems to move forward. Using the NRC recommendations as a guideline, I shall address each of these issues.

The first issue is the individual authentication of users. Almost all computer systems today offer some method of having a unique identifier for a user. Michigan has taken the additional step of informing users not to share their unique identifier with anyone, and receives a signed statement from each user that they agree to comply with the standards of use for the system to which they have been granted access.

The issue of logging off the user after a period of non-use can be a problem. While it makes sense to log users off a system in highly public areas, researchers and laboratory workers who work in limited-access areas and often have to multi-task are sometimes harmed by a time limit. Certainly, a rule of reason that links the time before a log-out occurs to the setting in which the system is used, where such log-outs are used, is the best approach to the problem, rather than a capricious time element built into a system that is incapable of change.
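
A minimal sketch of the "rule of reason" suggested above, assuming hypothetical setting names and timeout values: the idle timeout is configured per setting rather than hard-coded, so a workstation in a public area times out quickly while a limited-access research area gets a longer window.

    import time

    TIMEOUTS_SECONDS = {
        "public_area": 2 * 60,        # aggressive timeout where anyone can walk up
        "nursing_station": 10 * 60,
        "research_lab": 60 * 60,      # limited-access, multi-tasking users
    }

    class Session:
        def __init__(self, user, setting):
            self.user = user
            self.timeout = TIMEOUTS_SECONDS[setting]
            self.last_activity = time.monotonic()

        def touch(self):
            """Call on every user interaction."""
            self.last_activity = time.monotonic()

        def expired(self):
            return time.monotonic() - self.last_activity > self.timeout

    session = Session("jdoe", "public_area")
    if session.expired():
        print("logging user off")     # force re-authentication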

The second issue with authentication is only now emerging. With the development and promulgation of more advanced authentication systems, it needs to be reviewed whether such systems make sense in the health care environment, and whether the gains outweigh the costs. Encrypted authentication protocols and token-based authentication systems such as Kerberos are possible to implement, but they come with a high overhead cost, are often incapable of being used with the legacy systems that still hold much of the health care market, and on a cost-benefit analysis may not offer a higher degree of protection.
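
For illustration, a much-simplified sketch of the ticket idea behind token-based systems such as Kerberos -- this is not the Kerberos protocol itself, and the shared-key handling is an assumption made for brevity: a trusted server issues a signed, expiring ticket after authentication, and applications verify the ticket rather than handling passwords.

    import base64, hashlib, hmac, time

    SERVER_KEY = b"shared-secret-known-only-to-the-auth-server"   # assumption

    def issue_ticket(user, lifetime_s=600):
        expires = int(time.time()) + lifetime_s
        payload = f"{user}|{expires}".encode()
        sig = hmac.new(SERVER_KEY, payload, hashlib.sha256).hexdigest()
        return base64.urlsafe_b64encode(payload).decode() + "." + sig

    def verify_ticket(ticket):
        payload_b64, sig = ticket.rsplit(".", 1)
        payload = base64.urlsafe_b64decode(payload_b64)
        expected = hmac.new(SERVER_KEY, payload, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(sig, expected):
            return None                      # forged or tampered ticket
        user, expires = payload.decode().split("|")
        if time.time() > int(expires):
            return None                      # expired ticket
        return user

    ticket = issue_ticket("jdoe")
    print(verify_ticket(ticket))             # "jdoe" while the ticket is valid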

The University of Michigan has had quite a bit of experience with Kerberos and its limitations. While encrypted authentication protocols sound impressive, one has to question whether the cost is justified for internal use. Smaller hospitals and clinics may have little or no communication with outside networks, and even large facilities will have limits placed on outside access.

If the facility is monitoring internally for illicit use of password capture mechanisms, and trains users and sanctions those who violate this policy, then Kerberos-based systems serve only as protection against outside attack.

As mentioned, such systems are also capable of being compromised, as documented by the Computer Emergency Response Team at Carnegie Mellon University.

Access controls and audit trails present a different set of problems as health care enters the new computer-based era. The lessons from the paper-based system can still prove invaluable. Audit trail mechanisms are a method of keeping track of access to a record, but even on legacy systems they are cumbersome and largely useless for preventing wide access.

An example is a mainframe-based medical record system that is capable of tracking user transactions. In a facility like the University of Michigan, this system would process tens of thousands of transactions per day. Sadly, the only way to discover an unauthorized access is to search a particular record and see who has accessed it, a process that usually occurs only after a complaint of a privacy violation has been received. Even a test sampling done daily for quality assurance will yield little if any practical benefit, for perhaps only ten of 10,000 transactions per day are unauthorized, reducing such an effort to looking for a needle in a haystack.
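
A back-of-the-envelope check of that needle-in-a-haystack point, sketched in Python under the assumption that ten unauthorized accesses are hidden among 10,000 daily transactions; the sample sizes are illustrative.

    # Probability that a random daily quality-assurance sample misses every
    # one of the unauthorized accesses, sampling without replacement.
    TOTAL = 10_000   # transactions per day (from the testimony)
    BAD = 10         # assumed number of unauthorized accesses among them

    def p_miss_all(sample_size):
        p = 1.0
        for i in range(sample_size):
            p *= (TOTAL - BAD - i) / (TOTAL - i)
        return p

    for n in (50, 100, 500, 1000):
        print(f"sample of {n:>4}: {1 - p_miss_all(n):5.1%} chance of catching even one")

Even reviewing 1,000 transactions a day, a tenth of the volume, catches at least one such access only about two times in three.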

As we progress to distributed computing systems, this problem increases exponentially, as these types of systems will handle perhaps millions of transactions per day. An important lesson from our paper-based systems applies to this situation.

Tracking every transaction is the digital equivalent of tracking every person who touches a medical record. What is important is not who carries the record and may leaf through it, but rather that protection mechanisms are established for who holds the original copy of the record, who can access the record, what changes have been made to the record, and who is the ultimate authority for the record.

External communications do bring up another set of problems that demands a more thorough review and audit, and this situation will be addressed below. For internal communications, however, the key elements are to recognize who has a copy of the record, who has maintained the authority of the record, and who has made changes to the record.

Physical access to and security of systems has been a relatively easy point to address. Thus far, however, it has been practical needs and concerns that have assured that these measures are taken. The centralization of systems makes it easier for a small team of individuals to handle backups and maintenance of machines, and the physical restriction of access to machines ensures that vandalism or theft does not occur.

A more difficult issue is assessing the security of internal systems in use. Often many of these systems are not truly interoperable, even if the vendor maintains that the system complies with existing technology standards such as HL-7. The university has had to spend a significant amount of time and money to redesign systems to make them truly interoperable.

Today, it is standard procedure for a hospital to require an implementation standard that all software must comply with prior to payment or funding. But problems arise from the fact that no current standard addresses all of the data needs in the health care environment. Even with just one piece of software, issues will arise concerning the physical installation, the communication standard and the application standard. Absent some type of joint working relationship among these various standards, providers will continue to be hampered by having to customize each piece of software implemented.

After a system is installed, issues arise concerning the confidentiality of information, proper use of the system and methods of sanctioning improper use of the system. The university requires all employees of the medical center to sign a blanket confidentiality statement. The rationale for having each employee sign a confidentiality paper is that potentially every employee may need access to, or be exposed to, health care data. Within a hospital, it is common practice for all employees to be available to assist a patient in an emergency.

An example is the disaster drill in which each department at our medical center participates, where non-licensed individuals help the physicians and the nurses in the movement of patients. The need for each employee to have a hand in patient care thus exists, as anticipated by most providers.

Often outsiders also have access to the internal environment of a medical center. Nowhere is this more apparent than in the software arena, where vendors must have access to real patient data to evaluate their systems and correct any problems that may occur. The university is still liable for any confidentiality violations that may occur while these vendor employees are reviewing information. Therefore, it requires the signing of a non-disclosure statement by each vendor who may have access. I have included a copy of a generic non-disclosure statement used by the university as Attachment A of my testimony.

Currently, sanctions against an employee who violates patient confidentiality can include any penalty the department sees fit to execute, including the termination of employment. Sanctions against vendors at the university as seen in the non-disclosure can include termination of the contract, a court ordered injunction preventing further activity, indemnification for any harm caused by the conduct, and liquidated damages up to the amount of $50,000 per occurrence.

Being an academic medical center, we often make data available to physicians for research. Medical research may be the most highly regulated area of activity for any hospital. Prior to the commencement of any study, a physician must go before an institutional review board to gain approval. The physician needs to disclose whether any patient-identifiable information is going to be used in the study and what measures are going to be used to ensure that patient consent is obtained and patient confidentiality is not compromised.

As medical centers continue to develop the electronic record, it makes sense that they take advantage of such systems to quickly share medical information with other health care providers to ensure the best care of a patient. Such sharing of information already occurs, but instead of a network, a fax machine is employed. If a remote hospital contacts the university requesting the records of a patient who has entered its emergency room, we use a fax-back service whereby we confirm that the facility is legitimate, and send the information requested. A method to verify a request should also be employed in the electronic environment, but the issue arises of how much authentication a provider has to do before granting access to the information.

Spoofing and false electronic mail are commonplace on the Internet and often hard to verify. Providers need to know the standard to which they will be held liable when they receive electronic requests. Possible solutions include an (word lost) to transfer the information, and a national directory that verifies providers for the use of verification tools and key servers.

Payors are another group to which a provider must grant access to the medical record. Today, such requests often require that a provider respond in an electronic format. While the Health Insurance Portability and Accountability Act has established some limits on what an insurer can do with the data, it falls far short of the responsibilities that a health care provider has for the data.

Such a discrepancy exists on the state level as well, as Michigan requires payors to have access to the data, but it does not place on them the requirements that providers have in maintaining confidentiality. Issues for further discussion are whether payors have a right to the entire medical record, whether they can archive an individual's medical record, who is the true custodian of the record, and who owns the data rights.

Other groups that access health data are regulatory bodies, courts and attorneys. Often such reviews are required to ensure compliance with the law, review suspect business practices and investigate potential medical malpractice claims.

The requirements for how long such information needs to be available vary from state to state, along with some federal requirements. In Michigan, for example, records may need to be maintained from three to 19 years. A troubling development is the fact that some electronic storage media can be highly volatile, or, even if the media are stable, the equipment to read the media becomes unavailable.

Such an occurrence has arisen in Colorado, where the Rockwell Corporation is being sued in conjunction with the United States Department of Energy. Electronic records that date back to the 1970s have been requested by the plaintiff in the action, and the court has held Rockwell in contempt for not being able to produce the information, because the media is unreadable by any existing machine.

Rockwell has been forced to retain Digital Equipment Corporation as a consultant to try to recover the data. The cost of reproducing the information has already run into the hundreds of thousands of dollars, and will reach into the millions by the time the case is over.

The court has ruled that all of these recovery costs must be borne by Rockwell. Such a case makes it apparent that a provider faces burdensome costs if it maintains records forever; otherwise it will try to pass these costs on to its vendors, to ensure backward compatibility or to convert historical data to the current system.

All states require some variation of access to medical records for patients, including the right to add their own information to the record. Some states do limit patient access to mental health records, permitting it only with the physician's approval, to ensure that it does not interfere with the patient's treatment. Some patient privacy groups have advocated a European type of system, where a patient can edit medical information or request that information be removed from the record. Such a system is unworkable under current United States regulations, which prohibit any alteration of the medical record.

Recommendations. As health care enters the electronic age, it will be important to remember the main reason for the creation of health data is to treat patients. While privacy is a concern in the electronic age, one should not be quick to adopt a Chicken Little mentality toward the new widespread availability of information.

Patient confidentiality has long been an important issue in the paper system that hospitals have addressed. There is no reason to assume that by converting to an electronic system, providers will become lax in this duty.

With this rationale in mind, there are some issues that do need to be addressed immediately. First is the availability of health care information to be disseminated across all boundaries. The time has come for the United States to standardize its regulation of what a health record is, who is the custodian of the record, and who can access the data.

Second, there is a need to ensure that interoperability can occur among a wide variety of systems. As discussed, with so many different elements involved in making up an electronic record, no one standards organization can provide a solution.

Last, while some regulation appears necessary to ensure compliance with the new standards, it is important to maintain a rule-of-reason approach that allows smaller providers and legacy systems to continue to participate in health care. Thank you.

MR. BEATTY: Again, my name is Gary Beatty from Mayo Foundation. On behalf of the Foundation, I would like to thank this committee for the opportunity to make our presentation on this very important issue of privacy, confidentiality and security.

My testimony is going to go in a little bit of a different direction, toward communication outside of an organization, and give a little bit of a case study of what we have done within the Foundation to implement security practices. Attached to my testimony I have also included documents that outline our internal policies and standards for confidentiality, our medical records access policies, the confidentiality agreement that we ask our employees as well as our business partners to sign, our data security policy and standards within our information systems, and an article that I have written in the past dealing with secure health care EDI across the Internet.

Maintaining and assuring confidentiality is an essential part of our environment within the Mayo Foundation. All members of the Foundation have an obligation to conduct themselves in accordance with the policy, and hold in confidence all information concerning patients, employees and business information. Confidential information includes all materials, both paper based and electronic, related to the operation of the Foundation, including but not limited to financial information, patient names and other identification information, patient personal and medical information, patient billing information, employee names, including salaries and employment information, proprietary products and product development, marketing and general business strategies, any discoveries, inventions, ideas, methods or programs that have not been publicly disclosed, and any information marked confidential.

Only physicians or other authorized individuals may access, use or release patient medical information. Such matters are confidential between the health care provider and the patient. Mayo Foundation's confidentiality policies are implemented through publication and educational programs, a signed confidentiality statement by employees and business partners, and the utilization of technical safeguards such as physical I.D. access cards to secure the data, user I.D.s and passwords, appropriate security mechanisms associated with those, data encryption, authentication, firewall technologies, and the list goes on.

The Mayo Foundation is committed to the ideals of the Health Insurance Portability and Accountability Act of 1996 related to administrative simplification, and we have taken an aggressive role in implementing its requirements for the use of electronic data interchange. For the transactions mentioned in the legislation, we evaluated the various communication technologies that have been employed in the past, including clearinghouses, value-added networks, direct connections and the Internet -- which includes intranets, extranets and whatever acronyms you want to put onto the various networks in today's world -- to determine the most cost-effective and efficient means of communication.

We have also reviewed these communication technologies against our privacy policies to determine any potential security risks. We have chosen to support all of these technologies, favoring the Internet and intranets to communicate with other trading partners. These trading partners also include clearinghouses and value added networks.

Many concerns, ranging from privacy and confidentiality to fraud, were raised as we reviewed these technologies. A lot of questions came up, including: Can anyone else read the transactions? How do I know who sent the transactions? Did the transactions arrive exactly as they were sent? Can the receiver deny having received the transactions? Does the submitter know the transactions were received? And how do I track this activity?

We found that security technologies and procedures do exist today which can answer these questions, allowing trading partners to be confident that they can implement secure health care EDI. These include encryption to guarantee confidentiality and privacy, authentication via digital signatures to guarantee the identity of the submitter, and digitally signed return receipts, similar to return-receipt mail, to guarantee the delivery of the transactions. The Mayo Foundation is using asymmetric key, or public/private key, encryption from a company called (word lost) Privacy, using 1,024-bit keys with trading partners that use insecure communication technologies such as the Internet and intranets. Currently, communications over dedicated lines are not encrypted, although this policy is being reviewed.

Earlier this year, Mayo conducted a pilot project using a private, health-care-only intranet called MedNet in the state of Minnesota, to submit health care claims to the Minnesota Health Information Network, or MHIN, for reporting purposes.

This pilot project involves the submission of approximately 2,500 health care claims on a weekly basis. Once the public keys were exchanged between the two organizations, we set up a process to encrypt the claims using MHIN's public key and then digitally sign the files using our private key. The encrypted file was then sent to MHIN using the file transfer protocol, FTP, over MedNet.

When MHIN receives the encrypted files, they authenticate the files using Mayo's public key, to guarantee the claims were indeed from us and that the integrity of the data is intact. They then decrypt the files using their private key for processing purposes. Once they complete processing the claims, they return an acknowledgement of receipt back to Mayo to close the communication loop.
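
A minimal sketch of that encrypt-and-sign exchange, written in Python with the open-source cryptography package rather than the commercial product used in the pilot; the hybrid RSA-plus-symmetric construction and all names below are illustrative assumptions, and the FTP transfer over MedNet is not reproduced.

    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    # Each party generates a key pair and exchanges public keys out of band.
    mayo_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    mhin_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    claims = b"X12 837 claim batch ..."   # placeholder payload

    # Sender (Mayo): sign the claims with Mayo's private key ...
    signature = mayo_key.sign(
        claims,
        padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                    salt_length=padding.PSS.MAX_LENGTH),
        hashes.SHA256(),
    )
    # ... then encrypt so only MHIN can read them: a fresh symmetric key
    # encrypts the payload, and MHIN's public key wraps that symmetric key.
    session_key = Fernet.generate_key()
    ciphertext = Fernet(session_key).encrypt(claims)
    wrapped_key = mhin_key.public_key().encrypt(
        session_key,
        padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None),
    )
    # ciphertext, wrapped_key and signature would then be transferred (e.g. by FTP).

    # Receiver (MHIN): unwrap the symmetric key, decrypt, and verify the signature.
    recovered_key = mhin_key.decrypt(
        wrapped_key,
        padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None),
    )
    plaintext = Fernet(recovered_key).decrypt(ciphertext)
    mayo_key.public_key().verify(   # raises InvalidSignature if the data was altered
        signature,
        plaintext,
        padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                    salt_length=padding.PSS.MAX_LENGTH),
        hashes.SHA256(),
    )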

Some of the benefits that we realized from this pilot project were lower costs for communications, simplified pricing as compared to that of clearinghouses and value-added networks, easier and less expensive communication versus direct connections, and higher communication speeds. We found that some of our communication sessions, where we were dialing up over dedicated lines, were taking in the range of 30 to 40 minutes. When we started doing this across MedNet, we found that we were able to do the same transaction in ten seconds. So, much higher communication speeds.

Other benefits are a single network for file transfer, EDI, e-mail, Internet access, et cetera, and the potential to distribute processing using Internet applications.

Some of the challenges that we found were the public key distribution problems. With a large number of potential trading partners within the health care industry, how do we distribute these keys, and how do we develop a level of trust about who is holding these public keys and whom they belong to? We are looking for organizations such as certificate authorities to be established to resolve that large logistical problem within the health care industry.

I'd like to thank the organization for the opportunity to make a presentation of a case study.

DR. LUMPKIN: Thank you. Dr. Cooper?

DR. COOPER: I appreciate this opportunity to provide a perspective from a health care delivery organization on the protection of health care information.

I'm Ted Cooper. I'm an ophthalmologist at the Kaiser Permanente Medical Center in Redwood City, California. I have practiced there since 1973, and I have served as department chief and assistant physician.

Since 1984, my primary responsibility has centered on the Permanente Medical Group's need for clinical information systems. I chair the committee that makes and implements policy on confidentiality, privacy, security and access for all data and information in Northern California Kaiser Permanente. I am an associate clinical professor of ophthalmology at the Stanford School of Medicine, and the chairperson-elect of the Computer-Based Patient Record Institute.

Kaiser Permanente is the pre-eminent HMO in the United States. We have been delivering prepaid health care to our members as a public non-profit health plan since 1946. The program is a group-model HMO, with the Permanente medical groups contracting for the delivery of health care services to Kaiser health plan members. Our national membership exceeds 8.69 million members in 19 states and the District of Columbia. We are the largest private health care delivery program in the United States, with 90,000 employees and 9,935 full-time-equivalent contracting physicians. In Northern California we care for over 2.68 million members. Northern California Kaiser Permanente owns 15 medical centers with hospitals and 30 medical office complexes, has over 3,700 full-time salaried physicians, and employs over 25,000 staff. We store health care information in both paper and electronic records. In addition to these remarks, I have provided written testimony.

I wish to acknowledge the assistance of Sue O'Neill, our information system technology security administrator in the preparation of this testimony.

Security and confidentiality of computer-based health records. Well designed, implemented and monitored computer-based health record systems provide protection for health information that is superior to paper-based systems. The major factors that give computer-based systems this superiority are their ability to positively identify each user, verify authorization, pre-define access rights, restrict retrieval based on the circumstances of the access request -- the need to know now -- encrypt transactions, record each user access in logs, and display personally identifying information only when necessary.

Health information protection at Kaiser Permanente in Northern California. We modeled our approach on the Computer-Based Patient Record Institute guidelines on security, and recommend them as a foundation for any organization. The primary methods we use to protect health care information are policy, sanctions for violation of policy, security training and awareness, physical security, unique user identification codes and passwords, regularly required changes of passwords, individual permission-based access control, log-in warnings, locking of keyboards and blanking of screens on demand, (word lost) use for a period of time, logging of all access and tracking of actions, ready access to audit log reports, and standard procedures for the release of health information.

Each of our 2.5 million members may choose to visit any of our many sites, or use a telephone to seek care from our nurses and physicians at any time. So far, we have found that it would be operationally impractical to limit access to health care information only to those clinicians who have seen a patient in the recent past or when an appointment is scheduled. We have not found breaches of confidentiality that would make this necessary.

Our patients see that we have information systems when they use our services. They know that we depend on their Kaiser Permanente medical number to schedule appointments, register for office visits, provide phone advice and fill prescriptions. Many of our physicians access online health care information in their exam rooms with the patient present.

We are making very limited use of the Internet to transfer identifiable health care information. We have policies and guidelines that permit the use of physician-patient e-mail over the Internet. We serve Silicon Valley, and many of our members are very conversant with Internet security issues, but they still request the use of unencrypted e-mail. Common sense seems to prevail in the content of these unencrypted e-mail messages, and we are still waiting for our first security situation to arise with this.

We have a World Wide Web pilot project. A password-protected site permits a member to request appointments, obtain advice on our services, and access Kaiser Permanente illness-based support groups and health care references. Authentication is based on certificates, and all transactions are encrypted under the Secure Sockets Layer.
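
The modern descendant of that arrangement -- a server certificate plus a required, CA-verified client certificate, with all traffic encrypted -- can be sketched with Python's standard ssl module (SSL has since evolved into TLS); the file names below are placeholders, not Kaiser Permanente's configuration.

    import ssl

    # Server side: present the site's certificate and require a client
    # certificate signed by a trusted authority before any data is exchanged.
    context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    context.load_cert_chain(certfile="site_cert.pem", keyfile="site_key.pem")
    context.verify_mode = ssl.CERT_REQUIRED
    context.load_verify_locations(cafile="member_ca.pem")
    # Sockets wrapped with this context carry only encrypted, authenticated traffic.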

A personal identification number system for our members was developed and implemented to support this effort. We use this PIN system for automated telephone interactive voice response systems for appointment verification, cancellation and prescription refills.

Impediments. A major impediment to good health care information security is the absence of industry standards for the policies, procedures and technology required to provide adequate protection. Other major impediments to good protection are complacency, overconfidence, competition with other priorities for attention and resources, the limits of technology in legacy systems, turnover of personnel, and corporate mergers and reorganizations.

Concerns. We have concerns that regulations might place a large and costly burden on administrative overhead, for example, by requiring written consent to be obtained to collect, store and use information on patients each time care is delivered, requiring each patient to give permission to each individual caregiver or user before access to their record is permitted, or requiring a patient's primary care physician to approve each clinician or user who requests access to a record.

The analysis of health care data is required to determine the best and most cost-effective ways to treat and manage illness and health. We have done this research, and an institutional review board is used to protect patient interests. However, essentially the same analyses are required as part of business or managerial reporting and decision support. I would like to see regulations that will protect the patient's confidentiality interest in this situation.

In addition, I am concerned about the potential violation of confidentiality through the sale of identifiable health information. As an ophthalmologist, organizations have offered to sell me lists of names and addresses of likely candidates for refractive surgery, cataract extraction and laser surgery for diabetic retinopathy. A regulation prohibiting and providing prosecution for the sale of such information seems to be required.

I am also concerned about the potential for health care information to be used to discriminate against individuals without their knowledge or consent when they apply for health care, life and other insurance, and in education and employment. Regulations preventing such use are essential.

Another concern is that access log requirements might be written that would make the delivery of health care by teams unworkable. In a situation where several individuals are looking at a single display, for example, the intensive care unit or the emergency room, presumably only one is logged on. How do we capture the access of other team members for the access audit log?

Similarly in primary care, teams of doctors and others, for example, physician assistants, nurses, medical assistants, pharmacists, health educators and clerks are all involved in the care of the patients as they flow through the office. Team members often look at the paper record together. How do we manage access control for online records when a team looks at the same display?

Recommendations. Having federal regulations that establish reasonable minimum standards for health care information protection would be an enormous aid to health care delivery organizations. If we had such regulations, we would not have to spend resources to determine what protection is required and then justify the resources necessary to develop and implement it.

We do not need to justify the resources that are required to implement federal standards. Many discussions with great amounts of passion could be avoided. We could just do it.

It is essential to have regulations with significant penalties and adequate prosecution for violations of the regulations. The regulations should cover patient and provider rights for health care information, informing patients and providers of the information practices of organizations, securing consent to store and use health care information for the delivery of health care, managing health care organizations' research and performance reports on providers, who has authorization to access online health care information, circumstances of appropriate access, transmitting health care information, disclosing and redisclosing of health care information, and the sale of individually identifiable health care information.

In summary, at Kaiser Permanente we have found it necessary to have a formal program for health care information protection. The development of our health care information security program took into consideration patterns of human behavior, in order to deploy solutions that are workable in the health care delivery setting. I think federal regulations that establish reasonable and appropriate standards for health care information protection would be an enormous aid to health care delivery organizations.

Thank you for inviting me to provide this testimony.

DR. LUMPKIN: Thank you. Questions?

MR. BLAIR: Could you help me understand a little bit better -- maybe you could clarify this, because in your testimonies you referred to the fact -- this applies to all of you, but in particular, Ted, you indicated that you are using the guidelines from the CPRI and SSL, and Gary, you wound up indicating that you are using encryption technologies and all.

So on the one hand, I hear that you are implementing policies and practices and functions and technologies that in one way or another would be considered standards or emerging standards, and on the other hand, I hear you also wind up saying specifically that one of the problems is the lack of standards. So could you be specific in indicating where a lack of standards, or inadequate standards, are making it difficult for you? This is for each of you.

DR. COOPER: Maybe I'll start. From my perspective, it is a matter of knowing what is enough. I have access to a huge amount of talent and expertise in many of the consultants who have already testified and many of the vendors who are yet to testify. When we talk to them, we may get differing opinions about what is enough. A set of accepted industry practices and standards on what is enough would be fine.

MR. BLAIR: Thresholds?

DR. COOPER: Thresholds similar to -- if we had, as part of the JCAHO or the NCQA, or separate organizations, something that certified practices of health care information protection. Presumably they could come in and inspect us, and tell us whether we were doing a good enough job.

Right now, I rely on our internal and external auditors to come in and audit both our systems approaches and how people handle information at our medical centers. We get a lot of our best direction from them. But I still don't know what is enough. The challenge is to relate to management how much money we should keep spending in this area, and when we are spending too much. It is a human judgment; it is not hard-fact science, and it is not evidence-based medicine as yet.

MR. BEATTY: When we looked at the various technologies and so forth that were available to us, we chose one of many. Even within the tools that we selected, there were various parameters that we could choose, for example, when you deal with asymmetric keys: what is the appropriate size, and what is the reasonable amount of risk to take?

If you look at, say, brute force attacks and what it would take to break keys, certainly keys the size of 112 or 128 bits are more than adequate. We chose to go with 1,024 just because it added that additional margin. When we looked at the shelf life of health care information, it goes well into the future. So we wanted to guarantee that that health care information is secure for a good long time.
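
As a rough illustration of that key-length reasoning, a short Python calculation of exhaustive search time at an assumed guessing rate follows; note that 1,024-bit keys are asymmetric keys attacked by factoring rather than brute force, so the figures are not directly comparable.

    SECONDS_PER_YEAR = 60 * 60 * 24 * 365
    GUESS_RATE = 10**12   # hypothetical attacker trying one trillion keys per second

    for bits in (56, 112, 128):
        keyspace = 2 ** bits
        years = keyspace / GUESS_RATE / SECONDS_PER_YEAR
        print(f"{bits:3d}-bit symmetric key: {keyspace:.2e} keys, "
              f"about {years:.2e} years to exhaust")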

What we are looking at as far as recommendations is setting up what we call a security perimeter -- various principles and practices that should be employed within an organization at a high level for security. For example, there are the practices within an organization to establish I.D.s and passwords and various token-based mechanisms, whether it is I.D. interfacing for the physicians to the clinical information, to establish certain base practices that should be used inside. Then there are external types of things: if you are using direct, point-to-point communication, certain minimum levels of security; if you are doing things across communication vehicles like the Internet or an intranet, a baseline for those, where maybe you incorporate data encryption, authentication, return receipts, and all of that. In this way you establish guidelines that organizations can implement and feel comfortable with -- that they are, as the legislation says, reasonable precautions to protect the security and confidentiality.

MR. BLAIR: Picking words out of what you are saying here, I am hearing minimal level and base. Is that the essence of what you're saying?

DR. COOPER: I think it is reasonable. We have to take the reasonable precautions. There are minimums and as Dr. Lumpkin said, there is the amount of risk that you're willing to take, looking at the risk exposure.

MR. AUTON: I think it is a key question you ask, and it is important to note the difference between policy and technology standardization, which is critical. I think it is critical for this committee to look at that.

What is missing right now is the policy, along with some technology standardization, which will occur. But I would take this case in point. Health care doesn't drive technology standardization in most arenas, with the exceptions perhaps of DICOM and HL-7. But certainly the development of a standard like SSL has nothing to do with health care and has everything to do with networking.

Health care will not drive the development of an asynchronous transfer mode communication standard. That will come from a number of different factors that are financially driven and market driven; health care can certainly have input into it, but it doesn't drive the standard.

What is missing in health care is the policy of what standards to use and how to use the standards. For instance, with encryption, should it be required that hospitals encrypt all information internally? There is no set standard. Is that even reasonable to require? I would suggest that a rule requiring hospitals to encrypt all their information on their internal networks is ridiculous and too burdensome. That would have nothing to do with the technology standard.

As we go down the road, the most important thing to look at is, yes, there may be certain technology standards that need to be developed peculiar to health care. Some of them already exist; an HL-7 standard for interoperability exists, for example. Should we enforce a technology standard?

I think the better question for the committee to consider is, rather than enforcing a technology standard, what policies should we enforce. Rather than saying use HL-7, say you have to have interoperability. Rather than saying use an encryption mechanism, say you have to ensure data integrity, so that data cannot be easily manipulated by outside people. That would be the route to go.

DR. COOPER: If I could just add a little bit, I really concur that a standard set of policies, somebody's Good Housekeeping Seal of Approval for a good health care organization, would be very helpful. I also think it would be helpful to have some categorization of where you would employ this technology, and a similar stamp of approval to say it is appropriate to use the secure sockets layer here, and it is appropriate to use this kind of encryption there.

MR. BLAIR: Could I ask a follow-up question?

DR. LUMPKIN: Certainly.

MR. BLAIR: If the area where you need more help and guidance is minimums, thresholds, baselines, I think I am also hearing that there may be different appropriate minimums and baselines, depending on the environment that you may be working in. I'm not exactly sure where to draw the lines or the boundaries for that environment.

For example, do we wind up looking for different criteria, standards, minimums, thresholds based on whether it is an intranet or a value-added network? Or are the boundaries we should look at whether it is a single organization versus multiple organizations -- a payor and a provider and a clearinghouse? I don't know if there is necessarily a specific answer yet, but I would like to hear whatever guidance or thoughts you may have on where the boundaries could or should be drawn.

MR. AUTON: Well, even with that type of definition -- for instance, size of the organization, or whether it is an Internet or intranet application -- it starts to border again on a technology issue, rather than addressing what it is that you want to achieve. Is the policy for a hospital that you maintain information, you set a confidentiality standard for that information, and then you hold the hospital liable to that standard? Then what the hospital chooses to do with the information is up to it.

So the hospital may say, internally I don't encrypt anything, I store it. But when I send things externally, I'm going to put a PGP encryption on it, and send it out there, because it is going out to the world. I'm still liable for it, because that is what I've been told, that I can't let anyone touch a medical record without me being liable to some extent or another for this record.

If the standard changes -- say, for instance, you change your policy and say insurers can get a record, and insurers are now just as liable as medical care providers, and once they get that record the health care provider is no longer liable whatsoever -- then once I get it to the insurer, I stop worrying about what they do, how they got it, or whether they use an intranet or the Internet. It has nothing to do with the standard I need to know as a provider.

DR. MOR: I heard the earlier panel and then the discussion just now. I'm trying to get some understanding as to each of you who are with organizations that appear to have very sophisticated systems in place, very different kinds of organizations with different kinds of trading partners and different levels of breadth.

Each of you have called for standards and policies. I wonder, is this in order to insure that there is a level playing field out there amongst all providers? Or is it to insure that there is a common standard of liability, so that you are not doing too much or doing too little, and people know what the rules of the game are with regard to what their liability is, if there is some kind of infraction?

DR. COOPER: I'll start. From my perspective, it is similar to standard accounting rules. You know what the rules are, you know what is generally accepted and approved. You don't have to spend money in every one of our hospitals, every one of our different divisions across the country, figuring it out all over again, or the next health care organization next door.

I'm not so much concerned about a level playing field, but knowing that I am playing by the rules.

MR. BEATTY: We are looking at the numbers of potential trading partners, and what logistics are going to be incorporated in trying to communicate with a vast number of people that we need to communicate with.

The whole impetus behind the administrative simplification portion is to reduce costs. So we are looking for a level playing field in setting standards to help us reduce costs. That is where we are looking for standards that allow us to communicate information with other parties, but in a very secure fashion, to protect the privacy and confidentiality of the health care information.

MR. AUTON: The easy answer is yes to both parts of your question. But I think it is important to realize why that is yes. This panel is a great example of that. The University of Michigan Medical Center certainly is no Kaiser or Mayo Foundation in size, but it fits us.

I would take it to even more of an extreme. Let's compare Kaiser and a small 20-bed hospital that is out in the middle of nowhere in Michigan. It is a 20-bed facility, they still have to process claims. In the state of Michigan, it means they have to cross state lines and go to the state of Illinois.

Now, what does this mean? It means this hospital is liable to every standard. If they violate a federal standard or don't meet JCAHO guidelines, they could be disqualified from federal reimbursement. It means if they violate confidentiality -- or those could be patients who show up on Dr. Cooper's mailing list -- then they would be liable in a lawsuit in the state of Michigan or perhaps in the state of Illinois, because of the way the current system is structured.

So the level playing field certainly has financial issues behind it, but it does for everyone, regardless of size of the facility. And it does for individual physicians as well as providers. Without the proper policy development, everyone is out there shooting in the dark, trying to figure out what is going on.

DR. LUMPKIN: Actually, though, it seems to me that most places would consider it to be an honor to be associated with the state of Illinois, not a problem.

MR. SCANLON: I wonder if I could ask you a little bit about the interrelationship between privacy and security. You each come from a state with probably a different medical record confidentiality law. Two of the states I think have patient consent laws. California has a new privacy law, and I think Michigan does as well.

To what extent are the security policies that you have implemented a function of the specific privacy laws or confidentiality laws in your state? Is a national privacy law a prerequisite for national standards and security?

DR. COOPER: It would be very helpful to have a national standard. In California, our interpretation of the California statute is that you cannot put out the HIV status of an individual, positive or negative, or whether they have had the test.

Federal regulations tell us what to do with substance abuse, and perhaps both California and federal tell us about something we should do with mental health records, or records originating in those departments. A quandary is, if you go to a G.P. and get a diagnosis of depression, that is not protected especially by California's mental health law. If you go to a psychiatrist and get the same diagnosis, then it receives additional protection.

MR. BEATTY: Consistency is definitely important. Our foundation is, for the most part, in five different states, and each state does things a little bit differently. Depending on who the patient is, which state they reside in and so forth, we have to try to interpret, understand and abide by those rules.

From the payor community side, it gets even more complex, where all of a sudden you have a patient who can reside in any one of 50 states, and you try to handle all the varieties of rules and so forth across all the state lines. So consistency is very important, and it would be beneficial to have federal rules and guidelines.

MR. AUTON: What we have used in Michigan has been, as I stated in my testimony, an attempt to adopt all the paper-based policies to our electronic environment. We have been fortunate that Michigan does have a fairly well developed set of rules for record privacy.

The problem is that it is still state based. I think there is an excellent example of this from the late '70s. The federal government did pass rules in the mid-'70s, and this is our national privacy protection. It applies to medical care for alcohol and drug abuse, and that's it.

However, the state of Michigan in the late '70s passed an important law that required physicians to report suspected child abuse to the department of social services. It included in the definition of child abuse the provision that drug or alcohol abuse was de facto child abuse requiring investigation, particularly for pregnant mothers.

Now physicians in the state of Michigan were in a tough situation: you had a state law requiring you to tell about drug and alcohol abuse, and you had a federal law that said, don't tell anyone. Hutzel Hospital in Detroit had the best solution that physicians have: well, I'm in trouble one way, I'm in trouble the other way, let me talk to a lawyer. The lawyer said, let's just take it to court and let those two laws fight it out.

That is what happened. The innovative solution the Sixth Circuit came up with in 1979 was what we call qualified service organization agreements. Really, what the Sixth Circuit imposed in this case was a meeting of the two laws, in essence creating a federal rule that said drug and alcohol abuse information is protected, but there is an exception for the department of social services, and the department of social services has to meet these requirements, which include maintaining the confidentiality of the information.

This was quite a successful solution. In 1984, the federal government did pass such qualified service organization agreements into law -- they are part of federal law now -- and you can disclose drug and alcohol abuse information to state social service agencies.

Had this been done originally, though, it would have eliminated seven or eight years of confusion. While the court truly was innovative in this case, they still had to go to court, and the case still had to go up to the appeals court before there was a decision that resolved it.

So conflict can still occur. There can still be good rules. Both were very good rules, excellent federal rule, excellent state rule. But only the federal government can level the playing field insofar as making sure there is one set of rules everyone complies with.

The trouble is, in the electronic era these problems are going to be exacerbated at a very fast rate, because of the growth of the different types of health maintenance organizations, different payor organizations that grow up and cross state lines now. So we probably can't afford the type of mistake that was made in the '70s with the drug and alcohol abuse privacy legislation.

DR. LUMPKIN: It seems to me that we heard in the earlier panel -- my committee members can help me -- that there are seven basic practices that were recommended by the NRC. I am trying to understand the issues that were raised, because it seemed that I was a lot clearer after the first panel than I am right now. That is good, because you brought me back down to base.

I remember the times in the emergency department when I really needed access to a medical record. Part of our problem I think is the difficulty defining the term, because a medical record isn't a single entity. Well, it used to be when it showed up in the emergency department and it was this big, which is about two feet thick, and you tried to find that one little piece of information you needed to know whether or not you should give the right treatment.

But of those seven items, -- and maybe we can just walk through those, and I can understand a little bit about whether or not you agree that those are workable items, or what the potential problems are.

The first is individual identification of users, so that they can be held accountable for their actions. I didn't hear anyone object to that.

The second was access control, to ensure that users can access and retrieve only information for which they have a legitimate need. I have heard some comment earlier about the fact that there are systems where, if you have access to the system, you have access to everything on it. That is one end of the spectrum.

The other is similar to a system that we have in our maternal and child health system, in which almost every little grouping of information has access control. There are certain pieces of information only certain providers have access to; in fact, some only your case manager has access to. That would be the other end of the spectrum.

So I just want to understand if you are all weighing in somewhere in between on that. Do you believe that there should be some access control so that not everyone within the health care facility can see every part of the medical record?

MR. AUTON: Yes, that would be fair, with a caveat: I think there is a danger of appearance winning out over function. For instance, let's take a billing system. Within a billing system now, my billing people don't have access to HIV status, but they may be processing a claim for my communicable diseases clinic, and they are processing a prescription for AZT.

Now, the trick to it is: have these people gained access to my communicable diseases clinic information, or are they truly just processing a billing transaction? Much of the information we have discussed across health care, as I'm sure you are quite familiar with, crosses different planes for legitimate reasons, including surgery billing summaries and regulations that have come down from the Health Care Financing Administration requiring that we include certain information from the physician record that the billing people have the ability to check.

So is this just a billing system now, or is it my communicable diseases or my surgery system? If you wanted to say, just for appearance's sake, that only my surgeons have access to my surgery system and only my billing people have access to my billing system, you could say that and easily agree to it. But the question is still out there, and you have to think about that within health care.

DR. LUMPKIN: Well, actually that wasn't my question. My question had to do with sexual history. You may have a section of the medical record which says sexual history, which may be very detailed and recorded by a person's obstetrician, but has no relationship to the billing department, as you mentioned. So they may have access to the diagnosis, but not necessarily that particular part of the record.

So what I am suggesting is -- and I think you have answered that -- that there is some agreement that there needs to be some segmentation of the medical record, and that access within an organization or institution be limited to those portions which are appropriate for that person's job.

MR. AUTON: I would agree, and I would suggest that there are some policy guidelines for this already. If this group were to address it, it would need to review those state-based guidelines, particularly for mental health and communicable diseases. Even with paper-based record systems, these records are kept separate from the general medical record files.

So as we go forward, it would be interesting to identify data that is deemed to merit that type of access control. Clearly from history, all states have enforced some type of control on psychiatric and communicable disease information. Will this change, won't this change? It has just been within the last ten years that HIV was added to the communicable diseases section, because of the growth of HIV. There will be changes like that in the future. But there could be good, sound policy for what is confidential information that needs access control and what isn't.
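
A minimal sketch of that kind of segmented, role-based access control in Python; the roles, record sections and rules below are illustrative assumptions rather than any organization's actual policy.

    # Map each job role to the record sections it may read. Specially protected
    # sections (mental health, communicable disease) are granted narrowly.
    ROLE_SECTIONS = {
        "treating_physician": {"demographics", "history", "medications",
                               "mental_health", "communicable_disease"},
        "nurse":              {"demographics", "history", "medications"},
        "billing_clerk":      {"demographics", "diagnosis_codes", "charges"},
        "researcher":         set(),   # access only through an IRB-approved protocol
    }

    def may_access(role, section):
        """Return True if this role is permitted to read this record section."""
        return section in ROLE_SECTIONS.get(role, set())

    assert may_access("treating_physician", "mental_health")
    assert not may_access("billing_clerk", "mental_health")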

DR. LUMPKIN: The third was audit trails, to allow organizations to track and review all accesses made to patient-identifiable health information.

I have heard some disagreement with that. On page six of Sean Auton's comments, he raises an issue about audit trails as being too burdensome.

DR. COOPER: We don't find them too burdensome.

DR. LUMPKIN: You don't find it too burdensome?

DR. COOPER: I find it very useful for security investigations, but also as a security awareness program, with all the employees knowing that the health information manager at each center can run a report, at any patient's request, of who has looked at their record -- or for any employee or manager, of who has looked at theirs. That is a key part of that aspect.

MR. AUTON: I probably should clarify my comment, since I was the one that made it. I will agree that there are audit mechanisms and audit trails that should be used. But again, I would hate to see the triumph of form over function.

If it is believed that an audit trail that tracks every transaction -- which we have in place on our mainframe -- is going to prevent security violations, it won't. If somebody wants to look up information, they do it, and the correction is made after the fact.

For privacy advocates, that is troublesome. There have been some issues raised with that.

Furthermore, what exactly do you want to audit? How extensive does that go when it goes to an audit trail? For instance, do you audit when somebody is pulling up an electronic record in a distributed database environment? I may have pieces of key information of a record across six or seven different systems.

Now, when I make a request to look at a record in my distributed database, I have a local machine that makes the request from these six or seven different machines. Do I now need to log on the local machine that called it up, using the local database, that Dr. Whomever called up this record? Or on each one of the machines -- say one of them is the pharmacy machine -- does it need to log that we just had a request from pediatrics to upload this patient's information?

Now that we are starting to track each transaction, this could become incredibly burdensome and not very useful if it is an internal machine to machine communication. Where does my audit log start and stop?

Yes, they are useful. But the mainframe example shows there is already a problem with a traditional mainframe. In distributed computing environments, audit logs on each machine probably aren't going to be very useful, and probably won't capture the intent of what you want to get. You would rather track what an individual is doing than do machine-to-machine logging.
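
A minimal sketch in Python of auditing at the level of the person and the record rather than the machine-to-machine traffic behind it; the field names and example values are illustrative.

    import json, time

    AUDIT_LOG = "access_audit.log"   # illustrative destination

    def log_access(user_id, role, patient_id, section, reason):
        """Record one user-level access event; internal machine-to-machine
        fetches performed on that user's behalf are not logged separately."""
        event = {
            "timestamp": time.time(),
            "user": user_id,
            "role": role,
            "patient": patient_id,
            "section": section,
            "reason": reason,   # e.g. "treatment", "billing", "release to payor"
        }
        with open(AUDIT_LOG, "a") as f:
            f.write(json.dumps(event) + "\n")

    # Example: a pediatrics clinician views a pharmacy section for treatment.
    log_access("dr_whomever", "treating_physician", "MRN0001234",
               "pharmacy", "treatment")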

DR. GELLMAN: Could I follow up on that? Dr. Cooper, you said you have audit trails. Do you have any systematic review of the audit trails?

DR. COOPER: I would say we have some, but not what we would like to have. Again, the value I see in the audit trail is really in the access of an individual to an audit log report of who has looked at their record, or some record that they are concerned about.

DR. GELLMAN: I understand that. I think from the perspective of most patients, where there might be 50 or 100 different people who looked at their record in the course of an admission, they won't have a clue who most of those people are, and so a patient review of those records probably wouldn't be meaningful. Although, on the other hand, I have heard examples of this where hospital employees, who would actually be in a position to know, will look at their trail, and they can make use of it. But if there is not some independent review -- I have heard of audit trails being used in other kinds of systems, not health systems, where the audit trails are maintained for years and no one ever looks at them. So they don't serve as a barrier to anything and they are not very effective.

DR. COOPER: Our systematic review of audit log is to take a sample of places where we feel the risk is likely to be that there will be some kind of curiosity, or somebody will have looked at things.

So, looking to see if managers have looked at their staffs' records, that sort of thing, we deal with. By no means is it extensive or terribly useful.

DR. GELLMAN: I understand that. It just seems to me that the requirement for audit trails by itself doesn't really accomplish anything. The question is, what do you do with them. That has to be part of any kind of a standard or policy or whatever, how these things are going to be used. Otherwise they are worthless.

DR. LUMPKIN: I think Sean's point is right. And I think our experience in what we do in Illinois with our maternal and child health system -- a lot depends on what you do and the algorithms that you use to determine those high risk transactions.

For instance, an enrollment in WIC on Thanksgiving Day is generally considered to be a high-risk transaction, and likely to be fraudulent. Similar filters can be used on audit trails within a health care facility. I would suspect that the first time someone looks at a record is probably at higher risk of being a non-authorized transaction than the third or fourth or fifth or sixth time they look.
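
A minimal sketch in Python of that kind of filtering -- flagging first-time user/patient pairs and off-hours accesses for human review -- with thresholds that are purely illustrative.

    from datetime import datetime

    def flag_for_review(events, business_hours=(7, 19)):
        """Yield audit events worth a human look: the first time a given user
        touches a given patient's record, or any access outside business hours.
        Each event is a dict with 'user', 'patient' and 'timestamp' keys."""
        seen = set()
        start, end = business_hours
        for e in sorted(events, key=lambda e: e["timestamp"]):
            pair = (e["user"], e["patient"])
            hour = datetime.fromtimestamp(e["timestamp"]).hour
            if pair not in seen or not (start <= hour < end):
                yield e
            seen.add(pair)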

So I think the issue is whether the burden -- and this is really what I wanted to get to -- is on the provider's side, or whether the audit trail is in fact transparent to the provider. My own view defaults to one of my mentors' rules, Rosen's Rule number three: when all else fails, do what is right for the patient. We don't want to overburden the system, but I did want to get a feeling from the providers here as to whether there is a concern about the functionality, or whether it does place a burden on the actual business of delivering services to individual patients.

MR. AUTON: I think a meaningful audit log would be useful, but you would have to develop, again, not just a technology answer but policy -- and this is a policy that has to keep up with technology a little bit. Just requiring an audit log of all transactions, as I have stated, is burdensome and probably useless.

I could probably generate a report right now that said this IP address requested this record and performed these 20 different transactions. I could show a patient a trail of everywhere their record went across six different sub-nets at the University of Michigan Hospital, and perhaps seven different high-end computers. Is that useful to them? Does that even mean anything to them, including what the transactions were? Probably not. But more importantly, if I were able to show them their pharmacy record and say that pediatrics and psychiatry requested this record, so there would be copies there, and, oh, by the way, Blue Cross/Blue Shield has requested a copy of the record, that is a meaningful audit trail to a patient.
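
That contrast between a machine-level trail and a patient-meaningful one can be sketched as a simple aggregation step. The addresses, departments, and payor below are placeholders, not actual University of Michigan systems or data.

```python
# Sketch: the same access events reported two ways -- raw network detail
# versus a disclosure list a patient can understand. Data are invented.
from collections import Counter

accesses = [
    {"ip": "10.12.8.14", "requester": "Pediatrics", "record": "pharmacy"},
    {"ip": "10.12.9.77", "requester": "Psychiatry", "record": "pharmacy"},
    {"ip": "192.0.2.30", "requester": "Blue Cross/Blue Shield", "record": "pharmacy"},
]

def raw_trail(events):
    """Machine-level view: one line per IP address, opaque to a patient."""
    return [f"{e['ip']} requested the {e['record']} record" for e in events]

def patient_view(events):
    """Meaningful view: which organizations obtained copies, and how often."""
    counts = Counter((e["requester"], e["record"]) for e in events)
    return [f"{who} received a copy of your {rec} record ({n} request(s))"
            for (who, rec), n in counts.items()]

print("\n".join(raw_trail(accesses)))
print("---")
print("\n".join(patient_view(accesses)))
```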

MR. BEATTY: And that goes to who is really accessing the information. It gets down to what Sean has talked about, the very granular audit trails versus who is actually accessing the information, and to Ted's comment about their value: if the people who are actually accessing the information know that their transactions are being logged and could potentially be audited, and that there is a potential of being caught, that helps protect the privacy of that information.

But at the same time, when you look at large organizations and the volumes of information, and try to develop policies and procedures for determining how you actually perform that audit, it gets down to the point where indeed, you don't really look at the log until after the fact, trying to address something that has already gone by, rather than being proactive up front.

So there are benefits, but there are also costs, and we just have to weigh it out.

DR. COOPER: If we had a computer misuse detection system that, every time you hit a key, would prevent the action if it was a misuse, perhaps we wouldn't need the audit log.
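
That hypothetical, prevention at the moment of access rather than after-the-fact review, might be sketched as a check that runs before a record is released. The policy table, roles, and record store below are assumptions for illustration only.

```python
# Sketch of access checking at request time instead of post-hoc auditing.
# Roles, record types, and the policy table are invented for illustration.

ALLOWED_ROLES = {"pharmacy_record": {"pharmacist", "treating_physician"}}

class AccessDenied(Exception):
    """Raised when a request violates the access policy."""

def fetch_record(user_role, record_type, store):
    """Release a record only if the role is permitted; otherwise block it."""
    if user_role not in ALLOWED_ROLES.get(record_type, set()):
        # Real-time prevention: the misuse never occurs, so there is
        # nothing left for an after-the-fact audit review to catch.
        raise AccessDenied(f"{user_role} may not open {record_type}")
    return store[record_type]

store = {"pharmacy_record": "...contents..."}
print(fetch_record("pharmacist", "pharmacy_record", store))
# fetch_record("billing_clerk", "pharmacy_record", store) would raise AccessDenied.
```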

DR. LUMPKIN: I get involved in this at the state level, because we run a number of enforcement programs. One of the principles of enforcement is that if an individual believes that if they do wrong they will get caught, that if they get caught they will be punished, and that the punishment will follow relatively closely after the act, that tends to be more of a deterrent than a system where they don't believe they are going to get caught.

So deterrence is a very important component of what we are talking about doing, because people either do right because it is the right thing to do, or because they are afraid of getting caught. And we want to encourage the best behavior.

MR. SCANLON: You have described different kinds of networks: some of them dedicated and very secure within your own provider settings, others where you basically reach out over the Internet and share information, sometimes encrypted, with other parties. Many of the security policy issues described relate to protection against insider threats and disclosures. Yet you obviously recognize the potential for external threats.

The federal standards we are discussing this morning would presumably include measures that apply to both insider and outsider threats. Overall, though, how would you rate the relative proportion of insider threats versus outsider threats in these sorts of networks, and to what extent would the standards call for somewhat different approaches to each?

MR. BEATTY: Just to restate the question, are you trying to figure out where the greater risk is?

MR. SCANLON: Yes, relative.

MR. BEATTY: We relate that back to exposure. When we look internally, we have the audit capabilities and the system capabilities to track what activity is going on. We can put systems in place to guarantee the privacy and confidentiality of information through security tactics.

When we look outside the organization, we lose that control. That is where we are looking to incorporate dedicated communication lines to our business partners and payors in that community. We are also looking at all the various communication channels and looking to secure the information as appropriate.

In a case like MedNet, where it is a private network within the state of Minnesota, and where the only way you can get onto that network is to be affiliated with the health care industry in the state of Minnesota, we have limited exposure across that network. It is not like being on the Internet, where you are basically exposing yourself to the world. Our trading partners find it much easier to work with that type of network than to transmit information across the Internet.

We tend to treat both the same in our security precautions. Anything that goes across that type of network, we secure with the same technologies.
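
One way to read that last point is that the payload protection is independent of the transport. The sketch below uses the third-party cryptography package's Fernet cipher purely as a stand-in; the key handling, cipher choice, and network names are illustrative assumptions, not a description of the testifier's actual systems.

```python
# Sketch: encrypt the payload the same way no matter which network carries it.
# Uses the third-party "cryptography" package's Fernet cipher as a stand-in.
from cryptography.fernet import Fernet

KEY = Fernet.generate_key()   # in practice, a managed shared secret
cipher = Fernet(KEY)

def prepare_for_transmission(payload_bytes, transport):
    """Encrypt identically regardless of the network that carries the message."""
    return {"transport": transport, "payload": cipher.encrypt(payload_bytes)}

for network in ("private state network", "dedicated line", "Internet"):
    message = prepare_for_transmission(b"claim data ...", network)
    print(network, "->", len(message["payload"]), "encrypted bytes")
```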

DR. COOPER: I would say that in Northern California, in Kaiser Permanente, we are more concerned with the internal threats right now. If we look out two to five years, I think we will be more concerned with the external threats. In a competitive environment, proprietary business information may be what is more at risk than specific patient confidentiality.

DR. LUMPKIN: I thought with the size of Kaiser that there was no outside.

DR. COOPER: That is part of our marketing plan.

MR. SCANLON: But for the standards that the committee would be making recommendations on, you feel that we have to cover this whole range of potential exposures and threats, with possibly different standards for each? Okay.

DR. LUMPKIN: I'd like to thank the panel for a very stimulating session. We are now scheduled to go to lunch, and we will return at 1:45.

(The meeting adjourned for lunch at 12:30 p.m., to be reconvened at 1:45 p.m.)