BIOETHICS

Bank On It

An Ethics and Policy Agenda for Biobanks and Electronic Health Records

A distribution coordinator prepares an outgoing shipment of processed tissue at Tissue Banks International's San Rafael, California, facility on May 12, 2006. SOURCE: AP/NOAH BERGER

In the early days of bioethics, the dominant paradigm was about finding ways to slow down the application and use of emerging technologies. While some still tenaciously cling to this paradigm, the ethics of information technologies applied to biobanks and electronic health records, or EHRs, is producing a major shift in thinking. There may in fact be a bioethical imperative to incorporate EHRs into medical practice: they improve the quality of care delivered and organize information in ways that help epidemiologists and other researchers understand critical patterns in public health.

Given that the president’s budget request, released earlier this month, includes investments in biomedical research that focus on genomics, translating science from bench to bedside, and applying science to enable health reform, now is an important moment to look closely at the ethical questions that will inform public policy for biobanks and EHRs.

These two technology platforms deserve special emphasis for two reasons. First and foremost, there is a close connection between them. Of the many definitions of biobanks, the German National Ethics Council (Deutscher Ethikrat) definition captures this connection well: biobanks are “collections of samples of human bodily substances (e.g., cells, tissues, blood or DNA as the physical medium of genetic information) that are or can be associated with personal data and information on their donors.”[1] This description reflects the great power these two separate platforms provide to probe more deeply the relationship between genotype—the specific combination of DNA that defines each of us—and phenotype—the way our bodies express that genomic information.

Second, biobanks and electronic health records implicate both clinical ethics issues—those arising at the bedside for health care providers and patients—and research ethics issues—those arising for scientists, research subjects, ethics review bodies, and regulatory authorities. Both of these sub-specialty areas confront similar and complementary ethical issues; for example, issues arising from the nature and adequacy of informed consent, the sufficiency of systems to protect personal privacy and confidentiality, or the need to balance concerns relating to data security and the need to know. A growing research base supports calls for more attention to these issues, and yet current professional ethics frameworks and policy consultation methods are poorly organized and ill-equipped to anticipate and fully address ethical issues in health information technology, or HIT, generally, or to provide adequate ethical assessment of the tools that elicit these issues.

Our aim is to provide an overview of issues raised by each technology, examine existing research on how willing patients are to share health information, delineate the gaps in existing regulatory policy, and recommend necessary avenues for empirical research to build systems that protect the security and privacy of health information while improving personal and public health outcomes.

Bioethics as an instrument of policy development

Bioethics is increasingly used to inform public policy in many areas of health and science.[2] A common approach occurs when bioethics scholars try to determine whether a drug, device, or procedure should be used at all and, if so, under which type of policy controls (e.g., legislation, regulation, guidance). For many years (especially in the latter part of the 20th century) bioethics was largely about finding arguments to support recommendations to stop, or at least to slow down, take care, and beware. Discussions about organ transplantation, the use of machines in end-of-life care, gene therapy, and stem cell research were about controversy and the need to determine the scope of appropriate use. But what if there were machines that were essential for high-quality care of all patients? Were that the case, it would be blameworthy not to use such machines, computers, and analytic tools.[3] It is only in epidemiology and public health that we see such strong imperatives to study and use certain tools for the benefit of all. It might be that the use of biobanks will constitute another such imperative.

More than two decades of research have demonstrated that the establishment, implementation, and dissemination of HIT raises profound ethical, legal, and social issues for patients, clinicians, researchers, and society.[4] With the passage of the American Recovery and Reinvestment Act of 2009 came $19 billion and the promise of a profound and comprehensive expansion of the use of health information technology in health care and society, and with it a commensurate set of ethical and policy issues. Developments in HIT are challenging enough on their own to occupy ethical and policy analysis, but when coupled with parallel and interconnected developments in the life sciences—mapping and sequencing the human genome, and the advent of real-time research data sharing and exchange—HIT generates issues that extend well beyond concerns about privacy protection and confidentiality of medical information to include:

  • Access to and control of personal health records by patients, health care providers, and community service organizations
  • Data identification and de-identification in biobanks
  • Dissemination of risk information for use in both patient safety and all-hazards preparation and response; emergency public health informatics
  • Bioinformatics
  • Computational decision support
  • Open source/intellectual property
  • Secondary use of information by government and industry
  • The growth of telemedicine and telehealth.

Our own empirical research, discussed in this article, resonates with other studies suggesting that given appropriate control over their health information, patients are willing to share relevant health information that supports public health research. With that in mind one can envision a research agenda that will help the public and policymakers understand the implications of these platforms going forward:

  • Detailed investigations are now needed that explore the specific challenges arising from the direct connection between biobanks and electronic health records.
  • Research on patient preferences should be expanded to include willingness to permit secondary and tertiary use of information in electronic records.
  • Exploration of the concepts of community privacy and confidentiality is necessary.
  • Increased attention should be devoted to ethical analyses of the consequences of digitizing biobank content.
  • Assessments should look at the impact of curricula for training health professionals and researchers.
  • There should be ongoing review of relevant state and federal regulations to assess harmonization challenges.
  • Research should examine the translation of findings into policy.

One might hypothesize that acceptance of these recommendations would lead to conceptual and empirical tools of no small utility to policymakers, and could inform strategies for ethically optimized governance structures and oversight. In so doing, we are proposing that a progressive bioethics agenda should concern itself with how to control and manage technology, not to dread or disdain it; that moral hand-wringing is of limited practical utility; and that caution in science and policy is hollow without critical ethical analysis. What we have elsewhere described as “progressive caution” seems a reasonable course to take in a century that will be shaped by intertwined revolutions in genetics and information technology.[5]

Here we outline a set of ethical and policy challenges raised by both repositories of human biological material and electronic health records. It is or should be uncontroversial that the future of health care will see genomic data and information become an integral part of the patient record.

Biobanks

From the very early history of clinical pathology, studies of archived human biological materials, including specimens of blood and DNA as well as bone, organs, and other tissues, have played a prominent role in the diagnosis and treatment of diseases as diverse as cancer, heart disease, diabetes, and stroke,[6] as well as other diseases of significant public health impact.[7] Biobanks exist on every continent of the globe, including Antarctica.

Figure 1: Biobanks around the world

Figure 1 provides a graphic illustration of many of these repositories, principally those limited to national or other institutional repositories.[8] While no global census of the number of samples and specimens has been undertaken, the National Bioethics Advisory Commission conducted one of the first domestic inventories in the United States. NBAC estimated that there were 282 million specimens stored in the nation's pathology laboratories, newborn screening collections, forensic DNA banks, blood banks, umbilical cord banks, organ procurement organizations, tissue banks, and research-related repositories maintained for longitudinal studies.[9] Elisa Eiseman and Susanne Haga later updated these data, revising the figure upward to more than 350 million.[10] Both numbers are likely substantial underestimates because they do not include proprietary databases, classified military banks, or privately maintained collections, let alone the thousands of "fridges" maintained in university and hospital laboratories. A conservative estimate of the samples stored in repositories around the world must now exceed one billion. So ubiquitous are these banks, and so great their potential, that Time magazine last year listed biobanks as one of the "Ten Ideas That Are Changing the World Right Now."[11]

Common to the establishment and maintenance of every bank—domestic or international, public or proprietary—are a set of ethical and policy issues that must be addressed from the moment the banks are designed through the collection and storage of materials, and which continue when materials are shared and disseminated.

Ethical issues in the collection, storage and use of human biological materials

We can portray the “standard” encounter between a patient and her physician as follows: the virtuous physician, respectful of individual patients, seeks permission to undertake interventions (treatment, surgery, etc.) that patient and physician jointly believe to be in the patient’s best interest. In so doing, the respectful clinician provides sufficient information to allow an informed choice by the patient to be treated, while at the same time protecting certain information from the gaze of those who have no need to know (or see) it.[12]

Similarly, we may describe the "standard" research paradigm governing the nature of the relationship between an investigator and prospective research subject as follows: the virtuous researcher is one who designs studies that answer valuable and valid questions; avoids conflicts of interest that compromise scientific objectivity or introduce bias; submits protocols, including clearly written consent forms and descriptions of how informed consent will be sought, for prior scientific and ethics review and approval by an Institutional Review Board; recruits participants while protecting vulnerable populations from exploitation; and conducts the study according to accepted scientific standards of rigor, analysis, and reporting.[13]

These two “paradigms” may only be ideals, but whatever the valence we give to them both are subject to challenges arising from genomic science. The esteemed Canadian physician William Osler wrote in 1892: “If it were not for the great variability among individuals, medicine might as well be a science and not an art.” This statement was prescient in many ways. Little did Osler know that a little more than a century later researchers armed with the complete sequence of the human genome would turn their attention to the minute but important differences between people at the level of the individual letters of the genetic alphabet—A,C,T,G. These differences, called single nucleotide polymorphisms, or SNPs, help to explain why some people respond to drugs and others do not, why some are at increased risk of succumbing to certain diseases while others are not. Although many of the issues arising from these developments were first outlined a decade ago, here we review and update some of these, with an eye toward gaps in existing research that we must address in order to craft new public policies.[14]

Identifiability

Figure 2: Altman’s Curve

A key consideration in determining the extent to which the collection of human biological material raises ethical concerns is the degree to which research involves a human subject, and particularly whether the biological material can be linked to the person from whom it was obtained.[15] The debate about research use of human biological materials has been at times complicated by the fact that the language of the field varies and is often at odds with the categories used in the applicable federal regulations. To the extent that individuals can be identified, they can be harmed either directly or indirectly. Stanford bioinformaticist Russ Altman and colleagues helpfully framed the dilemma facing genomic scientists and privacy advocates.[16] Put simply, the more SNPs that are identified, the more an individual person can be identified and, therefore, the less privacy protection that can be assured. The converse of this relationship holds as well: the fewer SNPs identified, the less one is able to make meaningful associations of genotype and phenotype. At the extremes, one can imagine two undesirable outcomes of this relationship: absolute privacy protection dramatically inhibits research whereas complete access to SNP information dramatically inhibits privacy protection. The challenge is in identifying the optimal balance between the two concerns—bearing in mind that the electronic health record of the near future will come to serve as a favored repository or source of genomic information for both clinical and research purposes. Still, “Altman’s Curve” (as we have chosen to call it, see Figure 2), is only a heuristic device to capture the real dilemma between the need to find genetic relationships of significance and the need to ensure adequate protection of private information. One of the approaches to resolving this dilemma has come from empirical research conducted on public attitudes about and willingness to participate in biobanking—something we discuss in detail below.
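
The shape of the curve can be conveyed with a rough, illustrative calculation of our own devising (it is not Altman's analysis, and the population size and allele frequency below are assumptions chosen only for arithmetic convenience): the expected number of people who coincidentally match a given genotype profile shrinks geometrically with each additional SNP examined.

```python
# Back-of-the-envelope sketch (not Altman and colleagues' published analysis):
# how the expected number of people matching a genotype profile falls as more
# SNPs are examined. Population size and allele frequency are assumed values.
population = 300_000_000          # roughly the U.S. population
p = 0.25                          # assumed minor allele frequency, every SNP
q = 1 - p

# Probability that two random people share a genotype at one SNP, assuming
# Hardy-Weinberg genotype frequencies (q^2, 2pq, p^2) and independent SNPs.
match_one_snp = (q * q) ** 2 + (2 * p * q) ** 2 + (p * p) ** 2

for n_snps in (10, 30, 50, 100):
    # Expected number of *other* people whose genotype matches at every SNP.
    expected_matches = (population - 1) * match_one_snp ** n_snps
    print(f"{n_snps:3d} SNPs -> expected coincidental matches: {expected_matches:.3g}")
```

With only a few dozen SNPs the expected number of coincidental matches falls below one; the very specificity that makes genotype-phenotype associations possible is what erodes anonymity.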

Public Attitudes

A growing body of evidence exists regarding the public’s willingness to donate tissue or other biological material to science in general, and to biobanks in particular. A review of the empirical literature conducted on PubMed in early 2009 found no fewer than 60 studies, with at least 20 surveys published between February 2008 and January 2009. At the risk of simplifying a very robust set of studies undertaken on different groups of people, in different countries, under different conditions, being asked different questions, it would appear that in recent years there has been a gradual increase in the public’s expression of willingness to participate in biobanks.[17]

Studies by one of us (Meslin) in Indiana are consistent with this general claim. In 2006 and 2007 we surveyed cancer patients who contributed leftover tissue to the Indiana University Cancer Center Tissue Bank and found that a clear majority of subjects would permit unlimited future research on stored human biological materials without re-contact and re-consent, and, further, that a significant minority appear to desire ongoing control over future research uses of their tissue.[18] In 2007-2008, when we surveyed women in community health clinics to estimate their willingness to donate specimens for DNA analysis by needle stick as compared with collection of saliva, most of the 279 women surveyed were willing to do either: 68.3 percent agreed to the needle stick and 75.7 percent to the saliva collection.[19] In both of these surveys, we learned that several factors modulated support for biobanking. For example, in our study of cancer patients, nearly two-thirds (62.6 percent) of respondents agreed or strongly agreed that it was "all right" for researchers to use their donated tissue to develop a new tool or treatment for profit, though support for "for-profit" biobanking varied somewhat within this population depending on age, education, and other demographic factors. In our study of women in the community health clinics, we found a number of reasons for unwillingness to participate, including worries about how specimens would be used, violations of privacy, the potential for future discrimination, and fear of unfavorable results.

We also undertook a more comprehensive telephone survey of more than 1,000 Indiana adults in 2007 and 2008, one of the aims of which was to assess public confidence in medical and genetic research.[20] Respondents were asked five questions relating to privacy, answering each using a scale of 1-10 with 1 being “not at all concerned” and 10 being “extremely concerned”:

  • How concerned are you that genetic research is carried out by pharmaceutical, biotechnology and other for-profit businesses?
  • How concerned are you that information collected in the course of genetic research might be used by people other than the researchers?
  • Specifically, how concerned are you that this information might be used by employers?
  • How concerned are you that this information might be used by health insurance companies?
  • How concerned are you that this information might be used by schools?
Table 1: Concerns About Privacy: 1 = “not at all concerned” and 10 = “extremely concerned”

Group | For-Profit Businesses | People Other Than Researchers | Employers | Insurance Companies | Schools
TOTAL | 6.47 | 6.78 | 6.47 | 7.70 | 5.76
Gender: Male | 6.13 | 6.73 | 6.56 | 7.56 | 5.55
Gender: Female | 6.80 | 6.83 | 6.39 | 7.83 | 5.95
Race: White | 6.44 | 6.72 | 6.43 | 7.71 | 5.67
Race: Minority | 6.55 | 7.24 | 6.71 | 7.61 | 6.24
Age: 18-24 | 6.36 | 6.38 | 6.05 | 6.93 | 5.25
Age: 25-44 | 6.34 | 6.77 | 6.31 | 7.67 | 5.59
Age: 45-64 | 6.57 | 7.04 | 6.85 | 8.10 | 6.00
Age: 65+ | 6.62 | 6.51 | 6.40 | 7.59 | 6.04
Education: HS or less | 6.65 | 6.87 | 6.58 | 7.59 | 5.89
Education: Some college | 6.48 | 6.91 | 6.55 | 7.76 | 5.93
Education: 4-year degree | 6.21 | 6.50 | 6.23 | 7.75 | 5.33

Table 1 provides the demographic data relating to each of these questions. In general, the public's highest level of concern relates to the use of genetic information by insurance companies. Those approaching retirement (45-64-year-olds) reported among the highest levels of concern across all five of the issues presented.

Finally, a national telephone survey in September 2009 sought the opinions of close to 400 people about genetic research and the use of personal information, including specific questions about identifiability.[21] For example, we asked respondents to consider the following question:

Q2: If I were asked to provide access to my medical records to obtain information that could be used for genetic research, I would be willing to give permission for use of my records.

On a scale of 1-5, where 1 signified that they "strongly agreed" and 5 that they "strongly disagreed," the responses from 397 respondents were as follows:

(1) 19.8% [strongly agreed]
(2) 8.10%
(3) 19.5%
(4) 16.3%
(5) 36.6% [strongly disagreed]

We also asked this question:

Q5: How confident are you that genetic research is generally carried out in ways that protect the privacy and confidentiality of the research subjects involved?

On a scale of 1-5 (1 = not at all, 5 = extremely concerned), the public sample (N = 397) responded as follows:

(1) 8.40% [not at all concerned]
(2) 14.10%
(3) 27.20%
(4) 24.60%
(5) 25.70% [extremely concerned]

We also asked a series of questions designed to elicit attitudes about the possibility that researchers might be able to identify individuals in published studies with increasing certainty, using attacks such as those proposed by Homer and colleagues[22] and, more recently, by researchers at Indiana University-Bloomington.[23] We first gave an introduction:

Now I would like for you to imagine that you are invited to participate in a genetic research study where you will be asked to give a blood sample that will be analyzed in a laboratory. When the study is completed, the results will be published. While you will not be personally identified by name, address, or any of the other usual ways, there are now sophisticated statistical techniques under development that might be able to identify you as a participant in the study. These techniques involve looking at DNA of all the people in the study, and then examining the blood samples. It is possible, therefore, to identify you, even though your name was not mentioned in the published article. Since the article will be read by other scientists and many other people, it is possible that they too might be able to identify you as a participant in the genetics study.
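
For readers curious about how such techniques work, the following toy simulation (our own simplification, not the published method of Homer and colleagues) conveys the basic idea: a participant's genome sits measurably closer to the aggregate allele frequencies of the study it contributed to than to those of an independent reference group, and summing that tiny per-SNP signal over thousands of SNPs can reveal membership.

```python
import numpy as np

# Toy illustration only (not Homer and colleagues' exact method): inferring
# study membership from published aggregate allele frequencies. All numbers
# below are simulated and assumed for the sake of the example.
rng = np.random.default_rng(0)
n_snps, n_study, n_ref = 5000, 100, 100

pop_freq = rng.uniform(0.05, 0.5, n_snps)   # assumed population allele frequencies

def genotypes(n):
    # 0, 1, or 2 copies of the minor allele per person per SNP (Hardy-Weinberg).
    return rng.binomial(2, pop_freq, size=(n, n_snps))

study = genotypes(n_study)            # cohort whose aggregate results are published
reference = genotypes(n_ref)          # independent reference sample
study_freq = study.mean(axis=0) / 2   # the "published" per-SNP allele frequencies
ref_freq = reference.mean(axis=0) / 2

def membership_score(person):
    # Positive scores mean this genome sits closer to the study aggregate than
    # to the reference aggregate; summed over thousands of SNPs, true study
    # members drift toward clearly positive values.
    y = person / 2
    return float(np.sum(np.abs(y - ref_freq) - np.abs(y - study_freq)))

print("mean score, study members:",
      np.mean([membership_score(p) for p in study]))
print("mean score, non-members:  ",
      np.mean([membership_score(p) for p in genotypes(50)]))
```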

We then asked the following question:

Q7: Knowing this, how concerned would you be in being identified in this way? Please select a number between 1 and 5, with 1 being not at all concerned and 5 being extremely concerned.

Of the 398 people who responded, answers were as follows:

(1) 22.40% [not at all concerned]
(2) 16.10%
(3) 23.60%
(4) 13.40%
(5) 24.50% [extremely concerned]

Given these responses, we then probed further to determine whether the likelihood of identifying individual persons affected their level of concern. Four questions were asked, providing respondents with different probabilities of being identified, ranging from < 5% to 95% or more. The table below lists the responses to the interviewer’s question when different probabilities of identifying the individual were given.

Table 2: Levels of concern based on different probabilities of identification for personal genetic information in published research

Probability of being identified | <5% | 5 to 20% | About 50% | 95% or more
Yes | 71.20 | 79.60 | 72.10 | 86.50
No | 28.80 | 20.40 | 27.90 | 13.50
N | 391 | 283 | 226 | 161

It is tempting to accept data of the kind presented above as dispositive and conclude that the public's opinions ought to guide public policy. We would, however, urge caution in drawing such conclusions prematurely. The first reason for caution is reflected in the data above: we cannot yet explain why a greater percentage of people would agree to participate in a study where there is a greater (rather than lesser) chance of being identified.[24] A second reason is illustrated by a counter-example from Australia.

The experience of the Western Australia Data Linkage Unit

For more than three decades, the state government of Western Australia has been collecting one of the world's largest administrative health datasets, including birth records, midwives' notifications, cancer registrations, inpatient hospital morbidity data, in-patient and public out-patient mental health services data, and death records.[25] Used in combination with medical record audits, the WA dataset provides a platform for comprehensive evaluation of health system performance. Moreover, investigators have developed a system for linkage that is aimed at meeting the dual goals of protecting privacy and enabling health systems research.[26] This "win-win" approach results from keeping identifiable information away from the researchers, who need only the linked data on exposures and outcomes for their analyses. Of note, since this program has been in place, general requests for access to identifiable data have declined markedly.[27] Indeed, when officials asked people in the general community whether they approved of their information being used in this way, they found that citizens were not only supportive but questioned why such linkage was not already being used for research.[28]
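
The mechanics behind this "win-win" can be sketched simply. In the illustration below (our own simplification, with invented field names and a keyed-hash pseudonym; it does not describe the Western Australian unit's actual software), a trusted linkage unit holds the identifiers and the secret needed to generate project-specific keys, while researchers receive only those keys and the clinical fields their protocol requires.

```python
import hashlib
import hmac

# Minimal sketch of the separation principle behind linked-data research.
# Field names, identifiers, and the keyed-hash pseudonym scheme are invented
# for illustration; this is not the Western Australian unit's actual design.
LINKAGE_SECRET = b"held-by-the-linkage-unit-only"   # never shared with researchers

def project_key(person_id, project):
    # Deterministic per-project pseudonym: the same person links across
    # datasets within one project, but keys cannot be matched across projects.
    msg = f"{project}:{person_id}".encode()
    return hmac.new(LINKAGE_SECRET, msg, hashlib.sha256).hexdigest()[:16]

hospital_record = {"person_id": "WA-000123", "name": "J. Citizen",
                   "admission": "2008-03-02", "diagnosis": "I21"}
cancer_registry = {"person_id": "WA-000123", "name": "J. Citizen",
                   "notified": "2009-07-15", "site": "C34"}

def research_extract(record, keep, project):
    # Identifiers are dropped here; only the pseudonymous key and the fields
    # the protocol needs are released to the research team.
    out = {"key": project_key(record["person_id"], project)}
    out.update({field: record[field] for field in keep})
    return out

# What the researcher actually receives: linkable across datasets, de-identified.
print(research_extract(hospital_record, ["admission", "diagnosis"], "study-42"))
print(research_extract(cancer_registry, ["notified", "site"], "study-42"))
```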

Our conclusion from this empirical data is that it is not enough to know that the public has concerns (as evinced by the public opinion data above). Instead, it is critical to appreciate how the context for these concerns informs the tradeoffs people are prepared to make between protecting privacy and permitting access to information to advance research on human health.

Governance and regulatory issues

With the domestic and international proliferation of biobanks and their associated connections to health information databases, scholarly attention has been turning from the ethical issues arising from the construction of biobanks to the ethical issues that emerge in their operation and management.[29] There has been no shortage of guidance documents on these issues. A search of the authoritative HumGen database listed 52 international, 38 regional and 204 national guidance documents on the topic of biobanks alone.[30] In the United States, a set of federal regulations governs the oversight of research involving human subjects, and this is the same regulatory structure for research on human biological materials.[31]

Many commentators have observed that there are significant ambiguities in the regulations for the protection of human subjects.[32] A pictorial representation of the Common Rule (see below) prepared by NBAC in 2001 offers a partial explanation for the situation. Sixteen federal agencies and offices have agreed to follow the same “common” set of rules (45 CFR 46, Subpart A), leaving more than 50 other agencies to fend for themselves:

Figure: Agencies that follow the federal Common Rule for human subjects research

But this does not explain why, for example, the FDA's regulations have differed in critical ways from those of the Common Rule.[33] Despite this challenge, some argue that sufficient clarity now exists in current regulation to permit Institutional Review Boards to make decisions about research protocols involving human biological materials.[34] In other words, U.S. regulations already provide IRBs with the authority they need to make determinations about whether consent forms could be constructed to permit blanket consent, and about the adequacy of privacy and confidentiality protections. We believe this is a good example of the point we raised at the outset: when biobanks first caught the attention of bioethics, the reaction was understandably cautious, particularly given uncertainty about how informed consent would occur, whether protections were adequate, and whether IRBs were sufficiently trained.

Although collections of human biological material have been stored and used for decades, the accelerant effect of the complete map and sequence of the human genome and other genomes elicited a predictable response: stop, or at least slow down, while we assess the ethical, legal, and social implications. After a decade of experience and research, it is appropriate to step back and recognize that many of the main problems have been addressed (or at least identified). It is now time to ask whether the pragmatic (that is, the progressive policy) stance may require that bioethics turn its attention to asking not whether people should be contributing to biobanks, but how to better further their interest in doing so.

Electronic Health Records

The practice of medicine and nursing necessarily (though not of course sufficiently) requires the keeping of records. Biobanks are, in many respects, records, albeit organic ones. More familiarly, physicians' notes about signs and symptoms, treatment decisions, and outcomes serve as reminders to guide the care of individual patients, and as a source of information for research to shape the care of all patients. Without stored accounts of what providers see, feel, hear, measure, and do, there can be no effective medical practice, no sharing, teaching, or learning of any substance or consequence.

The practice of making notes about patient encounters is ancient and has been attributed to Hippocrates, though what survive are case histories intended to be used for teaching. Fielding Garrison’s 1913 classic, An Introduction to the History of Medicine, notes that in 25 of the 42 cases in the Hippocratic corpus, the patients died—and were therefore especially instructive; he compares these to the records of the Roman physician Galen, which are boastful and limited to remarkable cures and the errors of other practitioners. Hippocrates wrote: “I have written this down deliberately believing it is valuable to learn of unsuccessful experiments and to know the causes of their failure.”[35]

More than a millennium later, the Syrian physician Ishaq bin Ali Al Rahwi (CE 854–931) suggested in Ethics of the Physician that clinicians had a duty to make two sets of notes, with one copy for a council of physicians to assess and determine whether the standard of care had been followed. It is apparently the first documented instance of peer review.[36] Even the most rudimentary data can be of use: John Snow's famous analyses of case reports and maps contributed to halting a cholera epidemic in London in 1854.[37]

The clearest early modern statement of the utter necessity of complete and easily accessible medical records comes, arguably, from Abraham Flexner in his 1910 analysis of U.S. and Canadian medical education. Flexner sees the medical record as essential for quality care and the education of those who would provide it—in ways not dissimilar to contemporary claims for the utility of biobanks for translational medicine and pharmacogenomics:

Pupils are more apt to disappoint than to astonish their teachers; they do not generally better their instruction. In consequence hospital records made by internes [sic] graduated by these schools are scant and unsystematic … whoever is responsible, poorly kept records are very apt to denote inferior bedside instruction. The situation is this: there lies the patient; teacher, interne, and students surround the bed. The case is up for discussion. A question arises that requires for its settlement now a detail of the patient’s previous history, now a point covered by the original physical examination, now something brought out by microscopic examination at some time in the course of the disease. If complete, accurate, and systematic records hang at the bedside, there is an inducement to ask questions; doubtful matters can be cleared up as fast as they are suggested. That, then, is the place for the records—full records, at that. In few instances are the records full; in still fewer are they, full or meager, in easy reach.[38]

The 1940s saw the invention of the first programmable electronic computing machines (developed in secret as tools of warcraft) and, in temporal coincidence, the policy decision that properly maintained medical records should be a requirement for hospital accreditation.[39] Within a generation, physicians were experimenting with, developing and, well, fooling around with computers as storage devices for those records. There are many reasons why it made sense to explore the utility of information technology. They include:

  • Human memory is fallible, variable and, for certain complex information, short. The clinical encounter generates too much information to recall accurately. This was, in one degree or another, always a challenge, but given the amount of clinical information generated by the modern clinician it became clear that storing this information on paper is feckless and perhaps even futile.
  • Even if one could easily and swiftly find the information needed for patient care, it was difficult to analyze. Computers make it easy to track and compare lab values, diagnoses and prescriptions, say, over time.
  • Information technology enables analyses that bear on change, quality, error and other phenomena. A computer lets one compare the patients on Ward A (or Hospital X) to those on Ward B (or Hospital Y), for instance. Simple reminder and alert systems run on quotidian clinical data.
  • Computers support research that would otherwise be impossible, or at least impossibly tedious.
  • Information technology supports the kinds of analyses and assessments that now go by the names of “comparative effectiveness research” and “meaningful use.”

Here is the case, made more than two decades ago in 1988, for “Fully operational computer-stored medical record systems”:

These systems have demonstrated three kinds of benefits: (1) Computer-stored medical records can solve many of the logistic problems of finding, organizing, and reporting patient information that occur with purely paper systems. (2) They can improve the efficiency and accuracy of physicians’ decisions by performing calculations and by identifying clinical events that need attention. (3) They can guide future policies and practices by analyzing past clinical experience within a hospital or a physician’s office.[40]
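
The second of these benefits, identifying clinical events that need attention, can be made concrete with a deliberately small sketch; the laboratory values, thresholds, and field names below are invented for illustration and are not clinical guidance.

```python
from datetime import date

# Deliberately tiny sketch of a clinical alert rule: scan routine laboratory
# results and flag events needing attention. Values, field names, and
# thresholds are invented for illustration, not clinical guidance.
results = [
    {"patient": "A", "test": "potassium", "value": 6.1, "date": date(2010, 2, 1)},
    {"patient": "B", "test": "potassium", "value": 4.2, "date": date(2010, 2, 1)},
    {"patient": "C", "test": "creatinine", "value": 2.8, "date": date(2010, 2, 2)},
]

ALERT_RULES = {
    "potassium": lambda v: v > 5.5 or v < 3.0,   # illustrative bounds only
    "creatinine": lambda v: v > 2.0,
}

def alerts(rows):
    # Yield a human-readable flag for every result that trips a rule.
    for r in rows:
        rule = ALERT_RULES.get(r["test"])
        if rule and rule(r["value"]):
            yield f'{r["date"]}: patient {r["patient"]}, {r["test"]} = {r["value"]} needs review'

for line in alerts(results):
    print(line)
```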

Ethical issues in the development and use of electronic health records

The paper-based medical record, which continues to predominate in U.S. practices, clinics and hospitals, raises ethical and security issues insofar as:

  • Someone not authorized or supposed to view them might do so at their points of use or storage. Consider a passerby, a family member, or an orderly deciding to have a peek at a patient’s chart.
  • Records might be improperly transported or discarded. Patient charts have been found in the street, in dumpsters and in other places not connected to patient care.
  • Paper charts might be handled inappropriately, as when they are removed from the clinic or hospital and taken to a clinician's home for review or research, where family members or others might see them.

In fact, one could argue, privacy and confidentiality are more at risk when people speak carelessly about a patient than when patient information is stored in paper records. At any rate, the evolution and spread of electronic health records and, more recently, personal health records (electronic tools with which patients view and manage their own health information)[41] have changed the way we need to think about information privacy and security—even as we agree that paper records are too inefficient, clumsy, and difficult to access and learn from.

The challenge posed by any system of record retention for medical information is simply stated: How do we make information about patients easily available to those who need it for patient care and other legitimate uses, and unavailable—difficult or impossible to access—for all others? Among the corollaries:

  • How will electronic records affect the risks of privacy and confidentiality breaches?
  • What happens when records are shared or distributed across databases? What security risks arise when digitized health data and information are stored, replicated, and transmitted?
  • How will personal health records alter the privacy landscape for groups, communities, and society more broadly?
  • What will be the effects on health care and information security when, in a pharmacogenomically-focused world where drug makers can develop medicines tailored to individuals, EHRs are linked to biobanks (and, for that matter, when some of the information contained in biological material becomes an integral part of the EHR)?
  • What is the relationship between information security practices developed to safeguard data from corruption and inadvertent and intentional alteration and practices developed to protect privacy and confidentiality?

Further, some experts have argued that the EHR is or can be more secure than paper records, in part because, unlike paper, an electronic record can be sculpted, structured or secured to impede or prevent inappropriate access.[42] Many of the mechanisms to achieve this security have already been put in place and, indeed, have become the standard for health care organizations: password/passphrase and other login requirements to access records; audit trails, which record the identity of all those who have viewed a record; encryption standards for data transmission; etc. Indeed, there is a growing body of professional and regulatory oversight addressing the security of records, including FDA requirements for audit trails (21 CFR Part 11).[43] In fact, evolving security standards have identified the “trusted insider” as among the most insidious sources of inappropriate access.[44] A trusted insider has a login and perhaps even some plausible, but not actual, need to access a record. Consider the hospital clinician who wants to find out why his sister’s partner is visiting the infectious disease clinic…[45] This means that one of the greatest sources of concern for EHRs is remote or offsite access.
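
To make the audit-trail idea concrete, consider the minimal sketch below (our own illustration; the schema and in-memory log are assumptions rather than a description of any particular EHR product), in which no record view occurs without a corresponding, reviewable log entry. That trace is precisely what makes inappropriate access by a "trusted insider" detectable after the fact.

```python
import csv
import datetime
import io

# Minimal sketch of an audit trail: every view of a record produces a log
# entry that compliance staff can review later. The schema and the in-memory
# log are assumptions for illustration, not any particular EHR's design.
audit_log = io.StringIO()                    # stands in for an append-only store
writer = csv.writer(audit_log)
writer.writerow(["timestamp", "user", "patient_id", "action", "stated_reason"])

def view_record(user, patient_id, stated_reason):
    # In a real system the lookup would hit the clinical database; the point
    # here is only that no access happens without a corresponding log entry.
    writer.writerow([datetime.datetime.utcnow().isoformat(), user,
                     patient_id, "VIEW", stated_reason])
    return {"patient_id": patient_id, "summary": "..."}

view_record("dr_smith", "MRN-1001", "attending physician of record")
view_record("dr_jones", "MRN-2002", "no documented care relationship")

print(audit_log.getvalue())   # what a compliance reviewer would examine
```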

Now, the adoption of various mechanisms of encryption and firewall protection can address these concerns in varying degrees, but there has long been broad agreement that security mechanisms alone are inadequate to the task of confidentiality protection. They are necessary but not sufficient, as the Privacy/Data Protection Project at the University of Miami School of Medicine explains:

There is a tendency to focus on technical measures, such as encryption, when discussing information security. Relatively simple physical protections, such as restricting access to areas with computers, fax machines, etc., can be just as important. … Most important are the “administrative” (policy and procedural) efforts, from the rules about “who may see what” to details such as how user ids and passwords are disseminated. Even the most sophisticated technical and physical measures will be defeated by bad practices.[46]

The Security Rule under the Health Insurance Portability and Accountability Act captures this insight, as the Privacy/Data Protection Project goes on to explain:

covered entities that ‘collect, maintain, use or transmit’ PHI [personal health information] in electronic form must construct ‘reasonable and appropriate administrative, physical and technical safeguards’ that ensure integrity, availability and confidentiality. Such measures—notably in the form of policies and procedures—must provide protection against ‘any reasonably anticipated threats or hazards.’[47]

Aspects of these requirements have been known for some time, and they point to what should be regarded as a suite of best practices for applied ethics in the domain of electronic health records and perhaps especially so when those records are merged with or linked to biobanks. Generally, experts recognize three intertwined approaches: public policy initiatives, including laws that penalize egregious abuses; technological standards, including the likes of audit trails and encryption; and education and training.[48] This last is too often overlooked and, in consequence, too infrequently embraced. Health professionals and others who are entrusted with patient information have ancient duties to safeguard that information.

The moral obligations to protect privacy and confidentiality are uncontroversial, but the foundations of privacy rights are obscure to some. This is a teaching moment. The easy cases—don’t sell patient data to businesses without patient consent—might require little exegesis, but more difficult cases—What if EHR information can be used to warn third parties of health risks? How should biobanks communicate data about an individual to a potentially affected family member?—require some grounding in the processes for balancing competing values. This, too, is fertile ground for educators.

The relationship between privacy and public willingness to participate (as evidenced through their informed consent), considered above, points to the importance of publishing and otherwise disseminating empirical data that bears on the question of secondary, tertiary and n-ary use of health information stored in EHRs. A growing body of research parallels the Western Australian experience[49] and “suggests that patients are in fact willing to share their information and, indeed, that privacy concerns do not necessarily pose the kinds of constraints and inhibitions customarily invoked to limit information sharing.”[50] In addition to being rich in potential applications to public policy, studies about patient preferences, a key component of most definitions of evidence-based practice, can inform curricula that provide guidance and standards for developing public policy when values are in (potential) conflict.

This is rarely as important as it is when considering the utility of EHRs and PHRs for public health and epidemiology, as patients, clinicians, and citizens alike have duties to support public health. Information technology supports this mission and need not disrupt the foundation of trust that underpins the clinician-patient relationship, especially given the accretion of research suggesting that patients are willing to share some of their personal health data to support the health of populations.[51]

About the Authors

Eric M. Meslin, Ph.D., is Founding Director of the Indiana University Center for Bioethics, Associate Dean for Bioethics and Professor of Medicine, Medical and Molecular Genetics, and Public Health in the Indiana University School of Medicine. He is also Professor of Philosophy in the School of Liberal Arts, an Affiliated Scientist at the Regenstrief Institute and Co-Director of the IUPUI Signature Center Consortium on Health Policy, Law, and Bioethics. He has more than two decades of bioethics research and policy expertise in universities and the federal government in four countries. He held academic positions at the University of Toronto and at the University of Oxford and is currently Visiting Professor-at-Large at the University of Western Australia. He was Executive Director of the White House’s National Bioethics Advisory Commission (NBAC) from 1998-2001, and prior to that was director for bioethics research at the Ethical, Legal, and Social Implications (ELSI) research program at the National Human Genome Research Institute. His health and science policy expertise includes more than 100 publications on topics ranging from international health research to science policy.

Kenneth W. Goodman, Ph.D., is Founding Director of the University of Miami’s Bioethics Program. He is Professor of Medicine in the University of Miami Miller School of Medicine, with joint appointments in the Department of Philosophy, the Department of Epidemiology and Public Health, the Department of Electrical and Computer Engineering and the School of Nursing and Health Studies. The UM Ethics Programs have recently been designated a World Health Organization Collaborating Center in Ethics and Global Health Policy, one of six such centers in the world and the only one in the United States; Dr. Goodman directs this Center. He is a Fellow of the American College of Medical Informatics and chairs the Ethics Committee for the American Medical Informatics Association, for which he founded the Ethical, Legal and Social Issues Working Group more than a decade ago. He has more than two decades of research and educational expertise in the area of health informatics, ethics, and computing.

Financial Disclosures

The IU Center for Bioethics Program in Predictive Health Ethics Research (PredictER) is supported by a grant from the Richard M. Fairbanks Foundation, Indianapolis; Grant #UL1RR025761-01; NCRR/NIH: Indiana Clinical and Translational Sciences Institute; Eric M. Meslin, Ph.D. is a consultant to Eli Lilly and Company. Some of Kenneth Goodman’s work was supported by funding from the Robert Wood Johnson Foundation’s Pioneer Portfolio.

An earlier version of this paper was prepared for the use of the Indiana University Center for Applied Cybersecurity Research as a white paper (work in progress) for the conference held October 26-27, 2009 in Indianapolis.

Notes

[1] Biobanks for Research: Opinion, German National Ethics Council (Nationaler Ethikrat, Berlin, 2004).

[2] Meslin, EM, “When Policy Analysis is Carried Out in Public: Some Lessons for Bioethics from NBAC’s Experience,” in: James Humber and Robert Almeder, eds. The Nature and Prospect of Bioethics: Interdisciplinary Perspectives (Humana Press: Totowa, NJ, 2003): pp. 87-111.

[3] Miller RA, Schaffner KF, Meisel A., “Ethical and Legal Issues Related to the Use of Computer Programs in Clinical Medicine,” Annals of Internal Medicine 1985;102:529-537. See also: Stanley FJ, Meslin EM, “Australia Needs a Better System for Health Care Evaluation,” Medical Journal of Australia (2007); 186: 220-221.

[4] Goodman KW., ed., Ethics, Computing and Medicine: Informatics and the Transformation of Health Care (New York: Cambridge University Press, 1998); Goodman KW, Miller R., “Ethics and Health Informatics: Users, Standards and Outcomes,” in EH Shortliffe et al., eds., Medical Informatics: Computer Applications in Health Care and Biomedicine, 3rd ed. (New York: Springer-Verlag, 2006): 379-402.

[5] Cf. Goodman KW., “Bioethics and health informatics: an introduction,” in Goodman, ed., op. cit., pp. 1-31.

[6] Ackerknecht E., Medicine at the Paris Hospital, 1794–1848 (Baltimore: Johns Hopkins University Press, 1967); Korn D., “Contribution of the Human Tissue Archive to the advancement of medical knowledge and public health,” in: National Bioethics Advisory Commission, “Research Involving Human Biological Materials: ethical issues and policy guidance, Vol. II: commissioned papers,” (Bethesda, MD: US Government Printing Office; 2000): E1–E30.

[7] Khoury MJ, Little J., “Human Genome Epidemiology Reviews: The beginning of something HuGE,” American Journal of Epidemiology 2000; 151(1): 2-3.

[8] This map was developed by Jere Odell, IU Center for Bioethics, from publicly available sources on the web. It is meant to be illustrative, but not exhaustive.

[9] National Bioethics Advisory Commission, “Research Involving Human Biological Materials: ethical issues and policy guidance, Vol II: Commissioned papers,” (Bethesda, MD: US Government Printing Office, 2000); Eiseman E., “The National Bioethics Advisory Commission: contributing to public policy” MR-1546-STPI (Santa Monica, CA: RAND Corporation, 2003).

[10] Eiseman, E., and Haga, S.B., “Handbook of Human Tissue Sources: A National Resource of Human Tissue Samples,” MR-954-OSTP (Santa Monica, CA: RAND, 1999); Eiseman, E., Bloom, G., Brower, J., Clancy, N., & Olmsted, S.S., “Case Studies of Existing Human Tissue Repositories: ‘Best Practices’ for a Biospecimen Resource for the Genomic and Proteomic Era,” MG-120-NDC/NCI (Santa Monica, CA: RAND, 2003a).

[11] Time, March 16, 2009. The other nine: Jobs Are the New Assets; Recycling the Suburbs; The New Calvinism; Reinstating The Interstate; Amortality; Africa: Open for Business; The Rent-a-Country; Survival Stores; and Ecological Intelligence.

[12] A voluminous literature exists on these topics. See, for example, Pellegrino ED and Thomasma DC, For the Patient’s Good (New York: Oxford University Press, 1988); Ramsey P, The Patient as Person (New Haven: Yale University Press, 1970); Veatch RM, A Theory of Medical Ethics (New York: Basic Books, 1981).

[13] An equally voluminous literature exists on this topic, but one paper in particular is highlighted because of its enduring impact. Beecher, HK., “Ethics in Clinical Research,” New England Journal of Medicine (1966) 274(24):1354-1360. In a memorable quotation, Beecher described the most reliable safeguard for ensuring ethical experimentation is: “…the presence of an intelligent, informed, conscientious, compassionate, responsible investigator.”

[14] National Bioethics Advisory Commission, “Research Involving Human Biological Materials: ethical issues and policy guidance, Vol II: Commissioned papers,” (Bethesda, MD: US Government Printing Office, 2000).

[15] The relevant regulatory provision is found at 45 CFR 46.102(f), referring to identifiable private information.

[16] Lin Z, Owen AB, Altman RB, “Genomic Research and Human Subject Privacy,” Science (2004);305(5681):183.

[17] Meslin, EM., “The Value of Using Top-Down and Bottom-Up Approaches for Building Trust and Transparency in Biobanking,” Public Health Genomics (in press).

[18] Helft PR, Champion VL, Eckles R, Johnson CS, Meslin EM, “Cancer patients’ attitudes toward future research uses of stored human biological materials,” J Empir Res Hum Res Ethics (2007);2:15-22.

[19] Haas DM, Renbarger JL, Meslin EM, Drabiak K, Flockhart D, “Patient attitudes toward genotyping in an urban women’s health clinic,” Obstet Gynecol (2008);112:1023-1028.

[20] IUPUI Survey Research Center, “Public attitudes regarding genetic research: Survey methods and findings,” (IU Center for Bioethics, 2009), available at www.bioethics.iu.edu

[21] We are still analyzing the survey results. Data presented in this paper are for illustrative purposes only.

[22] Homer N, Szelinger S, Redman M, Duggan D, Tembe W, Muehling J, Pearson JV, Stephan DA, Nelson SF, Craig DW, “Resolving individuals contributing trace amounts of DNA to highly complex mixtures using high-density SNP genotyping microarrays,” PLoS Genet (2008);4(8):e1000167.

[23] Wang R, Li Y, Wang XF, Tang H, Zhou X, “Learning Your Identity and Disease from Research Papers: Information Leaks in Genome Wide Association Study,” Technical Report TR680, http://ns2.lam-mpi.org/cgi-bin/techreports/TRNNN.cgi?trnum=TR680, accessed October 1, 2009.

[24] As noted above, these data have not been fully analyzed.

[25] Hobbs MS, McCall MG., “Health statistics and record linkage in Australia,” J Chronic Dis (1970);23(5):375-381; Stanley FJ, Croft ML, Gibbins J, et al., “A population database for maternal and child health research in Western Australia using record linkage,” Paediatr Perinat Epidemiol (1994);8:433-447; Holman CDJ, Bass AJ, Rouse IL, et al., “Population-based linkage of health records in Western Australia: development of a health services research linked database,” Aust N Z J Public Health (1999);23:453-459.

[26] Kelman CW, Bass AJ, Holman CD, “Research use of linked health data–a best practice protocol,” Aust N Z J Public Health (2002);26:251-255.

[27] Trutwein B, Holman CD, Rosman DL, “Health data linkage conserves privacy in a research-rich environment,” Ann Epidemiol (2006);16(4):279-280.

[28] Stanley FJ, Meslin EM, “Australia Needs a Better System for Health Care Evaluation,” Medical Journal of Australia (2007); 186: 220-221

[29] Kaye J, Stranger M (eds), Principles and Practice in Biobank Governance (Surrey, UK, Ashgate, 2009).

[30] http://www.humgen.org/int/GB2_p.cfm?mod=1, accessed October 1, 2009.

[31] This set of regulations includes but is not limited to the Common Rule (45 CFR 46, Subpart A); relevant FDA regulations at 21 CFR 50/56; the HIPAA Privacy Rule (45 CFR 160, 164); and the Genetic Information Nondiscrimination Act.

[32] National Bioethics Advisory Commission, “Research Involving Human Biological Materials: Ethical Issues and Policy Guidance, Volume I: Report and Recommendations of the National Bioethics Advisory Commission” (Bethesda, MD: National Bioethics Advisory Commission, 1999); Evans BJ, “Inconsistent regulatory protection under the U.S. Common Rule,” Camb Q Healthc Ethics (2004);13:366-379; Evans BJ, “Finding a liability-free space in which personalized medicine can bloom,” Clin Pharmacol Ther (2007);82:461-465; Evans BJ, “Seven pillars of a new evidentiary paradigm: The Food, Drug, and Cosmetic Act enters the genomic era,” Notre Dame Law Review (2009).

[33] Evans BJ, Meslin EM, “Encouraging translational research through harmonization of FDA and Common Rule informed consent requirements for research with banked specimens,” J Leg Med (2006);27:119-166

[34] Office for Protection from Research Risks, “Issues to consider in the research use of stored data or tissues” (1997) www.ohrp.gov; Drabiak-Syed K, “State codification of federal regulatory ambiguities in biobanking and genetic research,” J Leg Med (2009);30:299-327; Wolf LE, Lo B, “Untapped potential: IRB guidance for the ethical research use of stored biological materials,” IRB: Ethics & Human Research (2004);26:1.

[35] Garrison FH, An Introduction to the History of Medicine, 2nd Ed. (Philadelphia: W.B. Saunders Company, 1917), p. 88. Note that the Hippocratic corpus is likely a composite, drawn from several sources, and there is disagreement among some historians about the very existence of a man, Hippocrates, as the author of documents attributed to him.

[36] Spier R, “The history of the peer-review process,” Trends in Biotechnology (2002);20(8):357-358; Al Kawi MZ, “History of medical records and peer review,” Ann. Saudi. Med. (1997);17:277–278; Ajlouni KM, Al-Khalidi U, “Medical records, patient outcome, and peer review in eleventh-century Arab medicine,” Ann. Saudi Med. (1997);17:326–327.

[37] Koch T, Denike K, “Crediting his critics’ concerns: Remaking John Snow’s map of Broad Street cholera, 1854,” Social Science & Medicine (2009);69(8):1246-1251. It has been suggested that the epidemic was on the wane before Snow’s intervention.

[38] Flexner A, “Medical Education in the United States and Canada, Bulletin Number Four (The Flexner Report)” (New York: The Carnegie Foundation for the Advancement of Teaching, 1910).

[39] Goodman, KW, “Health Information Technology and Globalization: Managing Regional Morals and Universal Ethics,” in R. Chadwick, H ten Have, EM Meslin, (eds), Health Care Ethics in an Era of Globalisation. Sage Handbooks (in press).

[40] McDonald CJ, Tierney WM, “Computer-stored medical records: their future role in medical practice,” JAMA (1988);259(23):3433-40.

[41] The first study of ethical, legal and social issues raised by PHRs was Project HealthDesign, a Robert Wood Johnson Foundation-funded initiative begun in 2007. Among the findings by a University of Miami team is that, in an era of social networking and other on-line interactions, traditional conceptions of privacy are shifting, and that privacy itself is a somewhat vaguer concept than customarily thought. For instance, young people especially are far more inclined than expected to allow medical information to be shared by others who are not health professionals. See http://www.projecthealthdesign.org/overview-phr/ELSIgroupresources for a list of ethics reports from Project HealthDesign.

[42] Barrows R, Clayton P, “Privacy, confidentiality and electronic medical records,” J Am Med Inform Assoc (1996);3:139-48.

[43] The U.S. National Institute of Standards and Technology is a key source of guidance on a variety of information technology standards. See “NIST Special Publication 800-12: An Introduction to Computer Security – The NIST Handbook,” Chapter 8, for an analysis of audit trails, available at http://csrc.nist.gov/publications/nistpubs/800-12/800-12-html/.

[44] U.S. Congress, Office of Technology Assessment, Report Brief: Protecting Privacy in Computerized Medical Information (Washington, D.C.: U.S. Government Printing Office, 1993).

[45] For an overview of security and privacy ethics and standards, see Cushman R, Privacy / Data Protection Project, University of Miami, available at http://privacy.med.miami.edu/index.htm. The “Encyclopedia” entries under “security” give synopses of core requirements of HIPAA’s Security Standard and Rule.

[46] Cushman R, Privacy / Data Protection Project, University of Miami, Encyclopedia entry “Security and Data Protection,” available at http://privacy.med.miami.edu/glossary/xd_security_basicdef.htm. Emphasis added.

[47] Ibid., “Security Standard/Rule (HIPAA),” http://privacy.med.miami.edu/glossary/xd_security_stds.htm

[48] Alpert SA, “Health care information: access, confidentiality, and good practice,” in Goodman KW, ed., Ethics, Computing, and Medicine: Informatics and the Transformation of Health Care (Cambridge: Cambridge University Press, 1998), pp. 75-101.

[49] Stanley FJ, Meslin op. cit.

[50] Goodman KW, “Ethics, information technology and public health: New challenges for the clinician-patient relationship,” Journal of Law, Medicine and Ethics, in press; citing Marquard JL, Brennan PF, “Crying wolf: Consumers may be more willing to share medication information than policymakers think,” Journal of Health Information Management (2009);23: 26-32.

[51] Goodman, ibid.
