Modern Drug Discovery: From Concept to Development
ACS Publications
December 2004, Volume 7, Issue 12

Cutting things down to size

 
 

Broad use of a standardized microfluidics technology platform

       
Kevin Hrusovsky and Mark Roskey

 

Over the past 20 years, the pharmaceutical industry has invested heavily in technologies that increase throughput, decrease costs, and provide access to new classes of scientific data. However, there is a well-documented disparity between levels of R&D investment and the development of innovative new drugs. To an extent, this disparity exists because technology innovation has been focused on optimizing individual processes, which resulted in highly specialized data and created functional silos within major pharmaceutical companies.

Information flow among these silos has slowed to a trickle because the tasks performed by scientists have become compartmentalized, and the data and processes of each area have been optimized and analyzed in isolation. In the push to increase throughput, reduce assay volumes, and cut unit costs, shortcuts were taken that ultimately produced simplified, “single pot” reactions that generate lower-quality data and higher numbers of false positives. As a direct consequence of these shortcuts, drug discovery databases are now fraught with errors.

Scientists today have come to realize that developing a thorough knowledge of complex disease states requires experimental techniques that are equally complex and data sources that are high in quality and fidelity. This shift to more complex research efforts has caused a change in core values from throughput and cost to higher-level ideals, such as quality and standardization. Quality and standardization are essential to support information sharing, shorter experimentation cycles, and increased organizational knowledge.

The “industrialization” of drug discovery has proven to be an elusive goal. The pharmaceutical industry has deployed several organizational models—centers of excellence focused on disease states, core laboratories focused on functional areas, centralized testing, and others. The attempt to industrialize drug discovery has involved massive investment, ranging from expensive information technology architecture and management consulting fees to global reorganizations. Through this process, much has been risked, and much learned. The industry has learned that centralizing core processes to increase efficiency results in additional bottlenecks and decreased productivity. It has learned that higher throughput should not be achieved at the expense of quality. Most importantly, the industry has learned that information that is not shared is lost.

The pharmaceutical industry has evolved, but have its suppliers shared in this learning process? Nate Cosper, drug discovery consulting manager at Frost & Sullivan, believes that suppliers have not kept pace with industry advances. “Drug discovery technology suppliers continue to build tools that are designed for a pharmaceutical model that is broken,” he warns.

Not seeing the broader vision, suppliers continue to focus on single techniques for improving throughput, cost, and ease of use. “Suppliers of drug discovery technologies have focused on optimizing tools for individual techniques, but in so doing have actually hindered the information revolution,” Cosper explains.

A holistic approach to optimizing drug discovery processes is needed to address the challenges that face the pharmaceutical industry. The biological questions being studied by scientists today are substantially more complex than the questions addressed 10 or 20 years ago, and research techniques must become equally complex to answer those questions. Additionally, research processes must be integrated so that information generated from different techniques and laboratories can be combined to yield a higher level of understanding about complex disease states.

Within major pharmaceutical companies, there are numerous functional silos generating different data from different technologies with different levels of error. The error is being generated from different assay formats, different supplier technologies, different researchers, and different experimental conditions. When all of these sources of error are added together, the resulting database becomes fraught with inconsistencies. Because of the lack of standardization, scientists become focused only on their silos; they understand and trust only the data that is generated by their teams. As a direct consequence, information is not shared within the organization and the innovation process is hindered.

“In the field of observation, chance favors only the prepared mind.”
—Louis Pasteur

One way to create an organization of prepared minds is to have standardized technologies so that scientists can move and share data across different therapeutic areas or functional groups and cross-pollinate by sharing ideas (1). Such standardization is achieved by having people use the same technology platform to conduct different types of experiments. However, standardization through centralization often results in additional bottlenecks. Decentralization is an essential component of innovation.

Microfluidics. New technology platforms have features that bring benefits to a wide range of applications.
Credit: Caliper Life Sciences

This is precisely what microfluidics can provide to the drug industry (2). Microfluidics technology is based on instruments that are capable of transferring small volumes of liquid, ranging from microliters to nanoliters. Microfluidic “lab-on-a-chip” technology requires an understanding of the forces that control fluid movement and reaction conditions, and brings the potential benefits of miniaturization, integration, and automation. Manufacturing such chips combines methods from the microchip industry with expertise in fluid dynamics, biochemistry, and software and hardware engineering to create miniature, integrated biochemical processing systems.

A microfluidics platform provides better-quality data in an electronic format, allows shorter assay development times, and, through Kinase Selectivity Screening (KISS), provides more clinically relevant information about side effects earlier in the discovery process. Microfluidics confers the benefits of a standardized platform, without the limitations of centralized operations. A common interface and standard operating system also make it easier to share information and adopt new techniques.

“Because of the direct measurement and the high-quality data generated by microfluidics, this technology is a platform we have adopted as a cornerstone of our discovery process,” says Bill Janzen, vice president of operations at Amphora Discovery Corp. And, because microfluidic systems can be applied broadly to genomics, proteomics, screening, and diagnostics, microfluidics is a standardized platform that is widely deployable today.

This broad applicability is a revolutionary development, not just in the way microfluidics is viewed but also for the entire discovery process. Companies that have realized this—such as Millennium Pharmaceuticals, Amphora, Eli Lilly, Pfizer, and Aventis—have been able to rethink their discovery engines and have experienced organizational and scientific advantages.

Kurt Stoeckli, vice president and global head of lead discovery technologies at Aventis Pharmaceuticals, describes how the company has benefited from microfluidics. “Aventis has evaluated several new drug discovery approaches, and we believe that the microfluidics technology platform meets the demanding standards of our chemical biology program,” he says. “It is a good fit for Aventis’s information-driven discovery efforts.”

“The key benefit is the unprecedented data quality we obtain using microfluidics technologies in a standardized platform,” Stoeckli adds.

Integrating the chip

Caliper Life Sciences has listened to the perceptions and concerns of scientists and has taken action. Recognizing that developing a “lab on a chip” was not sufficient for the rigors of the pharmaceutical industry, Caliper set about integrating the chip into the lab.

In July 2003, Caliper Technologies acquired Zymark Corp. This combination bridged the interface between micro- and macrofluidics. It combined Caliper’s detection platform with Zymark’s experience in nanoliter liquid handling to feed a microfluidics platform and interface with existing multiwell plate architecture.

Shifting values. More complex research efforts have caused a change in core values from throughput and cost to higher-level ideals.
Credit: Caliper Life Sciences

“The coupling of Caliper’s innovative technology with Zymark’s strong customer relationships and automation expertise has enabled the development of solutions that are ready to be deployed today,” Cosper notes. And, based on Zymark’s pharmaceutical industry experience, the combined company began working with customers to develop solutions optimized for entire organizations, not just for functional silos.

Caliper realized that the promise of microfluidics is much larger than any one company can deliver. Customers have made substantial investments in existing infrastructure, and for new technologies to be adopted they need to interface with that infrastructure. Previously, Caliper focused almost exclusively on developing microfluidics internally to maximize its intellectual property estate.

Today, Caliper is working with others—including Agilent Technologies, Bio-Rad, QIAGEN, and Affymetrix—to establish microfluidics products in a range of applications. These partnerships are part of a systematic, standardized approach to developing microfluidics-enabled solutions that can be integrated into existing laboratory workflows to accelerate drug discovery and enhance disease diagnosis.

By eliminating variations in sample preparation, reaction conditions, and detection methods, microfluidics provides consistently high-quality data. Additional benefits of the platform include shorter assay development times and versatility in the types of experiments that can be run. Because of the high-quality data and the broad applicability of the technology, one of the most compelling benefits conferred by microfluidics is the ability to standardize on one platform. This offers organizational advantages by ensuring that experiments performed in disparate geographical locations are all conducted in a uniform, directly comparable fashion.


A small paradox

Prior to the advent of high-throughput screening (HTS), pharmaceutical companies relied on complex experimentation, scientific knowledge, and a good deal of serendipity to lead them to the next blockbuster drug. Increasing competitive and economic pressures forced companies to explore more systematic approaches to drug discovery, and thus HTS was born.

Unfortunately, to achieve the levels of miniaturization and automation that the HTS approach required, the complexity of experimentation was reduced. The trade-off for increases in throughput and decreases in assay volume was a reduction in the nature and quality of data being generated, with a resulting drop-off in organizational learning from experimentation.

One example is the elimination of the step that separates reactants from products after an enzymatic reaction, because miniaturized assays conducted in multiwell plates are not amenable to separations. Eliminating this separation step increases the propensity for false positives and false negatives arising from background contamination. Consequently, the information generated through miniaturized, high-throughput experimentation is often limited and less reliable.

In many ways, the challenges faced by the pharmaceutical industry today parallel those experienced by the computer industry in the past. And, because of the similarities, lessons can be learned from how the computer industry addressed a need for higher throughput while trying to maintain a level of complexity in its experimentation.

Initially, computers could only perform calculations that were considered trivial, and mathematicians could easily perform more complex calculations by hand. However, in some cases simple calculations needed to be performed numerous times, and therefore were better suited for computers. In these cases, mathematicians spent a substantial amount of time simplifying the calculation and converting it to a card-based format. Punch cards allowed mathematicians to make calculations significantly faster, but with a concomitant loss in complexity. Of course, punch cards had their own limitations. For example, they were difficult to create, it took a long time to read the entire set, and they could get out of order.

Some manufacturers, not realizing that revolutionary change was necessary, set out to improve the process by developing card feeders; increasing the speed of card reading; further automating the creation, storage, and access of punch cards; and increasing the complexity of calculations that could be represented on the cards. These evolutionary steps were useful at the time; however, visionary companies pursued revolutionary innovations such as integrated circuits and random-access memory and ultimately transformed the industry. Today, computers easily perform complex calculations, as well as a multitude of additional tasks that would have been considered science fiction just 20 years ago.

Head to head. Microfluidics compares well with other assay approaches in providing better data quality, higher throughput, and greater assay complexity.

As with the computer industry, many scientific instrumentation manufacturers today continue to focus on optimizing single techniques by increasing the speed of the experiment, decreasing the cost, and increasing the level of automation. Although each of these innovations provides incremental value, revolutionary technology is required to enable researchers to perform assays with a level of complexity that matches the underlying biology.

Ironically, the solution lies in extending the very trend that initially reduced complexity and introduced false positives and poor data quality: miniaturization. By adopting microfluidics and achieving further miniaturization within the context of an overall laboratory automation solution, scientists can perform assays with greater experimental control and sophistication, yielding higher-quality data and a return to organizational learning.

 

 

References

 
 
  1. Thomke, S. Experimentation Matters: Unlocking the Potential of New Technologies for Innovation; Harvard Business School Press: Boston, 2003.
  2. Hrusovsky, K. Keynote address at MipTec 2004; www.caliperls.com.
 

About the Authors

 

Kevin Hrusovsky is president and CEO and Mark Roskey is vice president of marketing at Caliper Life Sciences.