Molecular screening tools and personalized healthcare: Determining success early in drug development
Traditionally, disease treatment options have been largely dependent on the outcome of epidemiological studies that use large cohorts of patients to provide evidence of the effectiveness and safety of drugs. However, it is widely recognized that not all patients with similar disease characteristics benefit to a similar extent from a particular drug. In fact, about 90 percent of drugs work effectively for only 30 to 50 percent of individuals.
According to a report by Spear, et al.1, a particular cancer drug class is ineffective for about 75 percent of patients in a treated population. Breakthroughs in the field of genomics, including the publication of the draft sequence of the human genome in 2001, have led to the development of many molecular techniques that allow the elucidation of differences in the genomes or transcriptomes of patients. Observed differences in drug effectiveness can increasingly be attributed to these molecular differences.
Today, these genomic tools are the basis of the field of personalized healthcare and companion diagnostics. Companion diagnostics are tests used to stratify patients according to their likely response to a particular drug treatment. By facilitating personalized healthcare, molecular techniques have the potential to significantly improve patient care, rescue effective drugs from development failure and provide cost benefits to both healthcare systems and pharma development programs.
The mutational testing of the KRAS gene in patients with metastatic colorectal cancer is one important example of successful patient stratification. Certain mutations in the gene result in non-responsiveness of a tumor to treatment with EGFR-inhibiting drugs. Implementation of mutation testing removes the risk of treating patients with an ineffective drug and the resulting side effects, while at the same time eliminating the costs of an ineffective treatment.
The Oncology Times reported an economic analysis estimating potential savings from personalized healthcare of more than $600 million in drug costs alone2. In light of this potential, public funding with a focus on personalized healthcare has significantly increased. Recently released budgets for the U.S. Food and Drug Administration (FDA) and National Institutes of Health (NIH) for fiscal year 2012 underline this trend: the NIH budget will increase to $31.38 billion, funding "Enhancing the Evidence Base for Health Care Decisions" as one of three major themes.
New technologies—particularly second-generation sequencing technologies—that enable sequencing of whole genomes at costs significantly below $10,000 per genome will further fuel the pipeline of potential molecular biomarkers that may be employed for patient stratification. Many diseases are multifactorial, and multiple markers may be needed to explain more complex disease phenotypes or to serve as marker sets to predict a certain disease outcome or guide towards a specific therapy. In the near future, these new sequencing technologies may enable the screening of hundreds of cancer genomes and thus the elucidation of underlying disease mechanisms.
However, new discoveries also substantially increase the complexity of biomarker validation. In addition to being encoded in a patient's genotype, biomarkers are also likely shaped by epigenetic mechanisms, for which data mining and interpretation are not yet well understood.
While molecular screening methods have supported the discovery of potential new biomarkers, these technologies have often failed to benefit drug development programs. This is frequently due either to biological variation in the marker itself or to variability in the methods and technologies employed to validate it.
Standardization of experimental workflows is of key importance to eliminate as many variables as possible from such complex, data-driven workflows. A biomarker that demonstrates clinical utility must rely on sample and assay technologies that are robust and ultimately compatible with diagnostic laboratory operations. Standardization starts with the collection of the biological sample and stabilization of the contained biomolecules.
In one example, Mueller, et al., demonstrated the importance of immediate RNA stabilization of blood samples in a research study monitoring leukemia therapy and evaluating the bcr-abl to abl transcript ratio3. Standardized collection, transport and storage of biological specimens until later processing—along with optimized nucleic acid isolation methods adapted to downstream assays—can contribute significantly to reducing data variability. Full automation of such processes avoids user errors and the operator-to-operator variation introduced by manual handling steps. The need for highly standardized experimental workflows is also leading to an increasing number of collaborative projects among public and private researchers.
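To make the bcr-abl to abl transcript ratio concrete, the sketch below shows how such a ratio is commonly derived from real-time PCR threshold-cycle (Ct) values. The Ct numbers and the assumption of perfect (twofold-per-cycle) amplification efficiency are illustrative, not taken from the cited study.

```python
# Illustrative sketch: efficiency-based relative quantification of a
# target transcript (e.g. bcr-abl) against a reference transcript
# (e.g. abl) from real-time PCR Ct values.
# Assumes an amplification efficiency of 2.0 (doubling each cycle);
# all values are hypothetical.

def transcript_ratio(ct_target: float, ct_reference: float,
                     efficiency: float = 2.0) -> float:
    """Relative abundance of target vs. reference transcript.

    A lower Ct means more starting template, so the ratio is
    efficiency raised to (Ct_reference - Ct_target).
    """
    return efficiency ** (ct_reference - ct_target)

# Hypothetical example: target detected at Ct 28.0, reference at Ct 24.0
ratio = transcript_ratio(ct_target=28.0, ct_reference=24.0)
print(f"target/reference transcript ratio: {ratio:.4f}")  # 0.0625
```

Because the ratio depends exponentially on the Ct difference, even small run-to-run shifts in Ct translate into large apparent changes in transcript level, which is one reason standardized sample stabilization and handling matter so much.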
Findings from high-throughput biomolecular analysis and related second-generation sequencing applications need to be further validated with independent assay technologies. State-of-the-art technologies comprise real-time PCR and other sequencing-based approaches such as Sanger sequencing or pyrosequencing. Potapova, et al.4, have recently described how pyrosequencing can be efficiently utilized both in high-throughput sequencing and for further validation in a lower-throughput format. In contrast to Sanger sequencing, pyrosequencing detects the release of pyrophosphate in real time during the sequencing reaction. This provides the means for quantitative measurements, combining sensitive mutation detection with accurate, high-resolution sequence information. Pyrosequencing also allows analysis of the epigenetic methylation of a potential biomarker.
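The quantitative character of pyrosequencing can be illustrated with a minimal sketch: because signal intensity at a variant position is proportional to the amount of each allele, a mutant-allele fraction can be estimated from relative peak heights. The function name and peak values below are hypothetical; real pyrogram analysis additionally corrects for dispensation order and homopolymer effects.

```python
# Minimal sketch: estimating a mutant-allele fraction from
# pyrosequencing peak intensities at a single variant position.
# Peak heights are hypothetical, in arbitrary light units.

def allele_fraction(peak_mutant: float, peak_wildtype: float) -> float:
    """Fraction of mutant allele, from relative peak intensities."""
    total = peak_mutant + peak_wildtype
    if total == 0:
        raise ValueError("no signal at this position")
    return peak_mutant / total

# Hypothetical example: mutant peak 12.5, wild-type peak 87.5
fraction = allele_fraction(12.5, 87.5)
print(f"estimated mutant allele fraction: {fraction:.1%}")  # 12.5%
```

This kind of quantitative readout is what distinguishes pyrosequencing from Sanger sequencing for low-level mutation detection, where a minor allele may contribute only a small fraction of the total signal.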
Real-time PCR, used in clinics on a daily basis, is another validation technology that has evolved steadily, leading to further standardization of assays and elimination of variability from data interpretation. Multiplex PCR also contributes to the elimination of variability, as multiple markers can be analyzed with high performance in a single reaction, as demonstrated by Ishii, et al., among many others5.
Other tools, such as microarrays or gene pathway PCR arrays, aid in the selection and validation of potential biomarker candidates by allowing scientists to investigate gene expression or somatic mutations of tens to hundreds of disease-related genes with one experiment. As an example, RT2 Profiler PCR Arrays provide important means of quality control to monitor the efficiency and performance of the underlying PCR reaction—essential information for the validation of biomarkers.
Second-generation sequencing has already proven successful in explaining previously uncharacterized genetic disorders6, and many new technologies are in development. Whether these new technologies for high-content biomolecular screening can deliver on their promise to elucidate complex disease mechanisms remains to be seen.
Undoubtedly, the final answer will rely strongly on the use of standardized pre-analytical sample processes and robust assay technologies for validation of newly discovered biomarker candidates. It is critical to identify biomarkers with clinical value early in drug development, and to move from the analysis of genetic differences to the implementation of personalized healthcare in clinical practice.
Dr. Dirk Löffert is vice president of sample and assay platform technologies at QIAGEN, where his responsibilities include technology and product development for sample preparation and assay solutions for the life sciences. Löffert received his Ph.D. in molecular biology and immunology from the Institute for Genetics at the University of Cologne in Germany.
1. Spear, B.B., et al.: Trends in Molecular Medicine, May 2001.
2. Tuma, R.: Oncology Times, March 2009.