April 2013
Introduction
Pharmacogenetics is the study of the role that inheritance plays in individual variation in drug response. Understanding the influence of heritability on an individual’s drug metabolism offers the potential to identify which drug, and at what dose, is likely to be safest and most effective for a particular individual, helping medical practitioners improve patient outcomes and reduce adverse drug events (ADEs).
Currently, the majority of medicines are taken in dosages determined by patient age (pediatric versus adult), weight, and other clinical factors. In many instances, these criteria are proving inadequate to ensure that a medicine will be safe and effective for a particular individual. Pharmacogenetics promises to take the guesswork out of prescribing safe and effective drugs. However, the use of pharmacogenetics in both clinical research and medical practice poses various ethical concerns. While these concerns are common to the broader fields of genetics and genomics, they are also highly relevant to pharmacogenetics. Some of the major issues are discussed here.
Background
In the 1950s, scientists first identified drug-metabolizing enzymes and discovered that DNA sequence variation within genes is associated with individual variation in the activity of these enzymes. These enzyme variations were thought to explain adverse reactions to drugs, thus demonstrating that inheritance plays a role in an individual’s drug response. The first evidence for this came from research showing that 10% of African American men serving in the Korean War became anemic after ingesting an anti-malarial drug. The same drug rarely, if ever, caused these problems for Caucasian soldiers. The anemic reaction was found to be caused by a variant of the G6PD gene, a variant common among people of African descent but not among Caucasians. Subsequent research found that the normal form of the gene makes an enzyme that helps protect red blood cells against certain chemicals. Lacking that protective effect, those with the variant form are vulnerable to deleterious effects, such as anemia.
The concept of pharmacogenetics originated from the clinical observation that some patients had very high, and others very low, plasma or urinary drug concentrations following drug administration.1 While individual differences in drug response can result from non-genetic factors, such as age, sex, weight, disease, drug-drug interactions, and drug-food/nutrient interactions, genetic variations are now known to play a clinically significant role in drug absorption, distribution, metabolism, and excretion.
Figure 1. Genetic tests may determine a patient’s response to new medication. Image courtesy Wikimedia Commons.
Currently, the use of pharmacogenetics in clinical medicine is growing, particularly in oncology, where variations in multiple genes (e.g., DPD, UGT1A1, TPMT, CDA, and CYP2D6) influence the effectiveness and toxicity of certain chemotherapeutic agents in individual patients. Identifying patients’ genetic variants enables physicians to properly prescribe drugs for individuals who have a variant associated with an impaired ability either to obtain a therapeutic benefit or to safely detoxify drug metabolites. To date, the most clinically successful use of pharmacogenetics is the genotyping of HLA-B*5701 for hypersensitivity to antiretroviral drugs used to treat HIV, in particular, abacavir.2
Ethical Issues
What are the ethically appropriate conditions for using pharmacogenetics in clinical practice? This question is important to consider, particularly as science and technology continue to advance rapidly. Whole genome sequencing, for example, is poised to eliminate the need for individual genetic tests, raising ethical concerns about the creation of genetic information that individuals may or may not want to know and that may or may not remain privately secured. The following section presents and analyzes some of the major ethical considerations; while not comprehensive in scope, together they represent an overview of the predominant issues.3
Is Pharmacogenetics a “Good” or “Bad” Allocation of Scarce Resources?
One could argue that the cost of developing pharmacogenetics is a misallocation of increasingly scarce societal funds, particularly given other pressing global needs such as providing clean air and drinking water or preventing global famine and infectious diseases. Others contend that pharmacogenetics offers enormous potential for improving the treatment, prevention, and eventual eradication of disease. In the U.S. alone, over 770,000 people are injured or die each year in hospitals from ADEs, which can cost up to $5.6 million each year per hospital. This estimate does not include the costs associated with serious but non-fatal drug reactions, including hospital admissions, malpractice and litigation costs, or the costs of injuries to patients. The cost of treating patients requiring hospitalization due to ADEs is estimated at between $1.56 billion and $5.6 billion annually.4 It is believed that many of these reactions are due to genetic variants and that hospitalization, even death, can be avoided by testing people for the relevant genetic variants prior to prescribing certain drugs.
What Constitutes Convincing Evidence of Clinical Utility and Who Should Decide?
Determining whether a pharmacogenetic test is clinically beneficial (i.e., has proven clinical utility) is controversial, in part because most such tests are regulated less strictly than the in vitro diagnostics the FDA oversees as medical devices.5 The debate about what constitutes “enough evidence” to deem a test appropriate for clinical use is central to the adoption of pharmacogenetics. Many pharmacogenetic tests are widely available, but clinical adoption lags because some experts and professional societies believe significant knowledge gaps remain about clinical utility and, therefore, that testing should not be the standard of care at this time.6
In many cases, the evidentiary base underlying a test’s clinical utility is unproven or controversial; as a result, such tests are not widely used. Many believe the functional measure of clinical utility is reimbursement: if payors deem a test experimental (i.e., not yet proven clinically beneficial), it is not reimbursed by insurance providers.7 Reimbursement policy, however, can itself become a barrier to patient access, even when a test has proven clinical value.8,9
A prime example of this dilemma is the FDA’s labeling and recommended use of the drug warfarin (commonly known by the brand name Coumadin). Warfarin is an anticoagulant used to prevent the formation of blood clots, including thrombosis and embolism. However, warfarin has also been implicated as the second leading cause of ADEs requiring emergency room visits.10 Roughly one third of patients receiving warfarin carry variants in two genes—CYP2C9 and VKORC1—that put them at high risk of bleeding if given the standard dose. On August 16, 2007, the FDA cleared a genetic test for warfarin sensitivity and updated warfarin’s original labeling to say that variations in CYP2C9 and VKORC1 can affect optimal dosing. On February 4, 2010, the FDA updated the labeling again, this time indicating that testing of CYP2C9 and VKORC1 can assist in the selection of the optimum dose. Notably, the FDA did not require or recommend testing prior to dosing at either time. However, neither the Centers for Medicare and Medicaid Services (CMS) nor most private insurers pay for testing of these genes, because evidence indicates that many factors other than gene variants may affect warfarin metabolism and influence optimal dosing. Further, different labs offer tests for different variants in CYP2C9 and VKORC1, leaving providers uncertain as to which test is best.11,12,13
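To make the general logic of genotype-guided dosing concrete, the short Python sketch below shows a hypothetical dose-adjustment rule. The gene names come from the warfarin discussion above, but the starting dose and adjustment factors are invented purely for illustration; this is not a published or clinically validated dosing algorithm, and it is not clinical guidance.

# Illustrative sketch only: a hypothetical genotype-guided dose adjustment.
# The gene names come from the text; the dose and multipliers are invented
# for illustration and are NOT clinical guidance or a published algorithm.

STANDARD_START_DOSE_MG = 5.0  # hypothetical standard daily starting dose

# Hypothetical multipliers: carriers of sensitizing variants start lower.
ADJUSTMENT = {
    ("CYP2C9", "variant"): 0.7,
    ("VKORC1", "variant"): 0.6,
}

def suggested_start_dose(genotypes):
    """Return a starting dose reduced for carriers of sensitizing variants.

    genotypes maps gene name -> "normal" or "variant".
    """
    dose = STANDARD_START_DOSE_MG
    for gene, call in genotypes.items():
        dose *= ADJUSTMENT.get((gene, call), 1.0)
    return round(dose, 1)

# A carrier of variants in both genes starts well below the standard dose,
# with subsequent adjustment guided by clinical monitoring.
print(suggested_start_dose({"CYP2C9": "normal", "VKORC1": "normal"}))    # 5.0
print(suggested_start_dose({"CYP2C9": "variant", "VKORC1": "variant"}))  # 2.1

The point of the sketch is simply that a small amount of genotype information changes the prescribing decision; real dosing decisions weigh many additional clinical factors, which is precisely why payors and providers disagree about which test, if any, is adequate.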
That said, new evidence continually updates clinical knowledge. Medical recommendations not only change over time, but advice given at one time can be contradicted at a later time. For example, medical advice to avoid butter and eat margarine was based on solid evidence indicating the cardiac dangers associated with butter consumption. In recent years, however, evidence has indicated that margarine is unsafe because the oils are made from crops treated with pesticides and that production requires the use of hydrogenation (which creates trans-fats, now known to be unhealthy) and various chemicals to improve color and smell.
Figure 2. DNA profiling can help researchers and medical practitioners identify how individual patients respond differently to the same medication. Image courtesy National Cancer Institute.
Genomics is showing us that what is safe for one person may not be safe for another, raising questions about whether statistically significant results are applicable across the board. Additionally, what constitutes adequate evidence has been, and will likely continue to be, controversial. In light of their Hippocratic Oath, physicians are obligated to do no harm, but can this obligation be fulfilled when the information available to physicians about how particular medicines will affect their patients is so meager? At present, physicians generally have no way of knowing whether a drug they prescribe will cause an adverse effect in a particular patient.
Will All Individuals, Regardless of Their Socio-Economic Status, Have Equal Access to Pharmacogenetics and Its Benefits?
Questions remain about access to costly tests and expensive therapeutics, particularly given the structure of the U.S. health care system. Access to expensive treatments can be possible for the terminally ill who qualify for clinical trials. However, aside from the hope that an experimental treatment will reverse a disease process, financial incentive remains a significant motivation for many individuals to participate in studies. As a result, those in greater need of money are more likely to participate, which introduces a source of bias in the study design. At the political level, what constitutes a fair health care system continues to be a contentious issue. Unless—or until—a societal commitment to genomic benefit sharing (including the use of pharmacogenetics/pharmacogenomics) is actualized, market forces will continue to drive access.
Right to Know and Right Not to Know One’s Genetic Information
One of the basic bioethical principles is the moral right to self-determination, or the right to voluntarily, deliberately, and intentionally choose for oneself, also referred to as the principle of autonomy. Autonomy is generally considered to imply the right to know. For example, patients have the right to know what is being done to them and the right to act freely on that knowledge (e.g., consent or refusal). Implied in this line of reasoning is that our genes are a core component of who we are and, as such, contain deeply personal information about heritable disease risks and one’s ability to metabolize drugs and nutrients. Because an individual’s genotype is an integral part of who that person is, some argue that each of us has an absolute right to know our own genetic information.
Because of its ability to predict diseases and correctly diagnose a variety of health concerns, medical genetics is an area of medicine that highlights individuals’ rights to obtain—or not to obtain—their own genetic information. Proponents of individuals’ rights feel that it should be each individual patient’s choice—and not that of a physician, a clinical researcher, or government regulatory agency—whether to learn his or her genotype, in part or whole. Some argue this right doesn’t pertain to pharmacogenetic testing because the psychological stakes are relatively low (i.e., testing participants will not be learning information about their lifespan, potential long-term health issues, heritable diseases, etc.). On the other side of the debate, some bioethicists argue that individuals ought to have a right to know their genetic information even in the context of studies, such as whole genome sequencing (WGS) research, that require participants to sign away their right to know.14
The principle of autonomy also supports the right not to know, which is not as easily protected. James Watson, co-discoverer of the structure of DNA, decided to have his genome sequenced and the results released to public databases, with the exception of his ApoE (apolipoprotein E) status. The ApoE gene has three common variants, one of which is the largest known genetic risk factor for late-onset Alzheimer’s disease. Dr. Watson, knowing both that late-onset Alzheimer’s disease is incurable and that it caused the death of his grandmother at age 84, chose not to make this information public.
However, researchers discovered that—by using advanced computational tools to analyze patterns of linkage disequilibrium—they could predict Watson’s ApoE status, thereby demonstrating that omitting genetic information does not necessarily prevent inference of risk factors for disease.15 Several subsequent studies have documented how anonymized genetic data can be re-identified using computer analysis and publicly accessible databases.16 In 2008, the NIH shut down its open-access database of genetic information in response to research demonstrating the ability to identify a single individual’s genetic profile out of a pool of DNA.17 One’s right not to know is at risk unless, or until, solutions are developed to strengthen protection of this right.
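The kind of inference described above can be illustrated with a toy calculation in Python: when a publicly visible marker is in strong linkage disequilibrium with a withheld variant, observing the marker sharply shifts the probability that the masked allele is present. The haplotype frequencies below are made up for illustration and are not real ApoE data.

# Toy illustration of inference via linkage disequilibrium (made-up numbers).
# If a withheld risk allele usually travels on the same haplotype as a nearby,
# publicly visible marker allele, observing the marker raises the inferred
# probability that the risk allele is present.

# Hypothetical haplotype frequencies: (marker allele, masked allele) -> frequency
HAPLOTYPE_FREQ = {
    ("M", "risk"):    0.12,
    ("M", "no_risk"): 0.03,
    ("m", "risk"):    0.02,
    ("m", "no_risk"): 0.83,
}

def p_risk_given_marker(marker_allele):
    """P(masked allele = 'risk' | observed marker allele), from haplotype frequencies."""
    joint = HAPLOTYPE_FREQ[(marker_allele, "risk")]
    marginal = sum(f for (m, _), f in HAPLOTYPE_FREQ.items() if m == marker_allele)
    return joint / marginal

prior = sum(f for (_, a), f in HAPLOTYPE_FREQ.items() if a == "risk")
print(f"prior P(risk) = {prior:.2f}")                          # 0.14
print(f"P(risk | marker M) = {p_risk_given_marker('M'):.2f}")  # 0.80

In this invented example, withholding the variant of interest provides little protection once a correlated marker is public, which is the same reason Watson’s ApoE status could be predicted despite its omission.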
On the other hand, regulations forbidding the return of results to research participants are challenging the principle of beneficence. Genetic studies are increasingly finding information that can benefit study participants (if they choose to know), but consent rules prevent researchers from sharing so-called “incidental” findings. Ethical debate exists as to whether researchers have a moral obligation to inform study participants about their findings, particularly if they are actionable and risks can be mitigated.18,19 Recently, the American College of Medical Genetics recommended that incidental findings obtained in a clinical (not research) setting should be disclosed to the patient.20
Similar questions arise when weighing society’s right to know against the individual’s. How far ought the right to know extend? What genetic information is so vitally important that it might be appropriate to break doctor-patient confidentiality to protect a patient’s relative? In Pate v. Threlkel, the Supreme Court of Florida found that a physician has a duty to warn patients of the genetically transferable nature of the condition for which they are being treated. Though this duty extends to informing the patient’s children, the court held that—in this case—the duty is satisfied by warning the patient of the familial implications of genetic testing. The decision, therefore, did not specifically require the information to be transmitted to the patient’s children.21
The Drive for Open Consent: Is it Ethical?
Informed consent is functionally a contract between individuals (research participants, both patients and healthy people) and the entities conducting research, providing health care, or performing genetic testing on biological specimens (clinical research settings, clinics, or genetic testing facilities). This contract specifies what will be done to the individual and the potential harms and benefits associated with certain procedures. It also legally documents the individual’s voluntary agreement to the specified conditions.
However, because large volumes of data are needed to validate the clinical utility of new findings, pharmacogenetic research increasingly involves add-on studies to increase understanding of genetic contributions to disease risks. Given the sizeable amount of information required to launch the adoption of clinical pharmacogenetics and personalized medicine, researchers have created large-scale databases at both national and international levels. As a result, participants have no way of knowing what additional research may be conducted on their donated specimens, and researchers lack the administrative capacity to re-contact individuals for consent to each additional study. This creates a significant problem for studies that use traditional consent (opt in or opt out): once data are made anonymous, consent is no longer restricted to the original study, yet it can’t feasibly cover all possible future studies or secondary data uses. At the time of the original study and consent, researchers can’t explain all the ways a participant’s data might be used in the future, in part because they don’t yet know what the data mean or how they might be useful in advancing medical knowledge. This raises ethical concerns about how these add-on studies can, or ought to be, consented. It also highlights the question of how much autonomy individuals have over what their specimens are used for.
The notion of open, or broad, consent has been proposed to meet these challenges. Unlike traditional consent, open consent extends beyond the original study to future research and thus provides blanket consent that enables research to advance efficiently. The rationale is that researchers have neither the time nor the money to pay administrative staff to re-consent every participant for each future study, particularly if subjects have died, moved, become incompetent, or reached the age of consent, in which case they, not their guardians, must provide consent. Researchers clearly favor this approach because it gives them freedom to operate, expediting their ability to continue their work.
Advocates of open consent argue that research participants agree to provide specimens from which data are to be generated and used for the greater good of society. Because the need for specimens is great, some researchers even argue that each individual’s genome should belong to society. Dr. Lee Hood, a biotech pioneer credited with creating the technological foundation for genomics by developing the automated DNA sequencer, has publicly declared that individuals owe society their genomes—that is, their genetic information—so that researchers have access to a sufficient amount of data to enable the advancement of genomics (and its subfields, including pharmacogenetics) for the benefit of humankind. Implied in this statement is that research participants should allow their data to be widely used, including for purposes that arise in the course of ongoing research and could not have been specified at the time of the original consent. Some bioethicists argue that open consent is actually more ethical than traditional consent because of its candor about the fact that privacy cannot be guaranteed. Most open consent forms state that, while every effort will be made to maintain the privacy and security of one’s personal data, privacy cannot be guaranteed.
On the other side are those who want to develop strategies to give patients greater control over their data, such that individuals could decide exactly what additional studies—if any—their specimens are used in. The issue of control is complicated by numerous factors, including the fact that hospital pathology laboratories store and can utilize research specimens at their discretion. Further, while the labs technically own the specimens, research clinics control the genetic information in electronic medical records. The information is personal, but once it is placed in an electronic medical record, the care institution can share the contents, particularly if de-identified, with other researchers, insurance groups, and other parties. Given this web of sharing and access, research participants may not be fully aware of the subtleties to which they are agreeing when consenting, creating additional obstacles for those who want to preserve patient control.
Genomic research is moving more and more toward open data sharing. Most stored genetic data are de-identified and made anonymous (essentially assigned a bar-code identifier) so that personal information is not known to researchers. Increasingly, however, anonymization has proven unable to protect individual privacy. Using a computer, an internet connection, and publicly available online resources, scientists at the Whitehead Institute were able to correctly identify 50 individuals who had submitted personal genetic material as participants in genomic studies.22,23
The Future
In spite of our best efforts to anticipate and resolve ethical quandaries arising from the application of new genetic technologies, it is likely that unexpected conflicts will continue to arise. Those discussed in this article are not an exhaustive list.
Pharmacogenetics allows researchers to conduct gene profiling to answer questions about patient responses to medicines, which can result in the design of safer and more effective medicines. The science and its applications are real today and will be increasingly utilized in coming years. While it is extremely unlikely that individuals will be excluded from health insurance because of their response (or lack thereof) to a particular drug, or that they will face employment exclusions (in hiring, promotion, or job responsibilities), ethical issues are central to the policy debate about the appropriate use of pharmacogenetic testing and its related benefits.