Whether drug discovery has benefited fully from combinatorial chemistry and high-throughput screening methods is open to debate. But high-throughput experimentation and parallel approaches for exploring synthetic routes and optimizing safe and cost-effective reaction conditions are gaining favor among some process chemists. The benefits include higher productivity, wider coverage of experimental space, increased information, and--every so often--unanticipated insights into the chemistry taking place.
Parallel reactor equipment available to the process chemist ranges from the relatively simple, such as multivessel stir-plate carousels, to very sophisticated high-throughput systems. In turn, prices range from a few thousand dollars into the millions, depending on the complexity of the reactor setup and the associated hardware, software, and automation capabilities. The market also is competitive, users say, with at least a few manufacturers occupying each niche and many others offering a range of products.
"Automated experimentation can be as simple as a robot doing one experiment," says Scott G. Sheffer, vice president of marketing and operations for Symyx Technologies. "Parallel chemistry is a nice way of doing multiple experiments, but if you don't increase throughput on the analytical side, you are just doing more experiments, and it takes just as long to characterize those experiments.
"From our perspective, high-throughput experimentation is where you actually achieve the greatest productivity and operational efficiency gains," he continues. "It's a combination of parallel experimentation with a backbone of hardware and software that allows you to keep up with the experimental data being generated and to characterize materials on a very fast basis."
Symyx has collaborated with Merck since late 2000. Initial projects focused on developing workflows--a combination of specific Symyx Discovery Tools equipment and software to complete a desired task--for solubility testing and for identifying polymorphic forms of lead compounds. In 2003, the companies began work on process optimization workflows.
Merck has used a Symyx parallel pressure reactor (PPR) to screen and optimize catalytic high-pressure reactions, including homogeneous asymmetric hydrogenation, heterogeneous hydrogenation, and reductive cyclization reactions. The system has 48 reactor cells of 2 to 6 mL each operating at up to 500 psi and 200 °C under inert conditions. It also integrates experimental design, preparation, control, analysis, and results viewing.
In one instance, the goal was to come up with an economically viable process for the optimized synthesis of 2-substituted indoles for Merck drug compounds in development, says D. Richard Sidler, who is a senior research fellow in process research at Merck Research Laboratories. Known conditions reported in the literature required relatively high catalyst loadings, temperatures, and pressures.
THE LAB SCREENED different palladium sources, phosphine and phenanthroline ligands, catalyst loadings, catalyst-solvent combinations, and catalyst-ligand ratios in several PPR runs, taking about a day each. "We were able to go from 5 to 10 mol % palladium and 400 to 500 psi CO at 120 °C down to 0.1% catalyst loading, 15 psi, and 70 °C and get a better yield," Sidler says. The result is a reaction he calls "truly catalytic and economically viable." Without high-throughput methods, the optimization process might have taken one full-time chemist three to six months to complete, he estimates, whereas the PPR screening took about a week to 10 days.
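The combinatorics behind such a screen are easy to sketch. The short Python example below uses hypothetical factor levels (the article does not publish Merck's actual matrix) to show how quickly a modest set of variables fills several 48-cell PPR runs:

```python
from itertools import product

# Hypothetical screening factors for a Pd-catalyzed reaction;
# the actual Merck factor levels were not published in the article.
pd_sources = ["Pd(OAc)2", "PdCl2", "Pd2(dba)3"]
ligands    = ["PPh3", "dppf", "phenanthroline", "none"]
loadings   = [0.1, 1.0, 5.0]            # mol % Pd
solvents   = ["DMF", "MeCN", "toluene", "NMP"]

# Full Cartesian product of all factor levels
conditions = list(product(pd_sources, ligands, loadings, solvents))
print(len(conditions))                   # 3 * 4 * 3 * 4 = 144 combinations

# A 48-cell PPR covers the matrix in ceil(144 / 48) runs,
# i.e. about three days at one run per day.
runs = -(-len(conditions) // 48)         # ceiling division
print(runs)                              # 3
```

Covering the same 144 combinations serially, at a few flask reactions per chemist per day, is what stretches the manual version of this screen into months.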
"The ability to run anywhere from a few dozen to hundreds of experiments in parallel is giving us a huge advantage," Sidler says. "We apply it anywhere we can in process research." His group, for example, has unearthed important solvent effects that might otherwise have been missed. In the past, "you might look at 10 reactions, and when none of them seem to work, you'd just move on to a new approach," he explains. "Now, you can cover more ground, and you'll often find a particular combination of solvent, catalyst, and ligand that gives unique benefits."
The Merck lab has added a Symyx high-pressure reactor system to allow for more rapid reaction screening in finding initial leads. It has a 96-well plate format with reaction volumes of 100 to 1,000 µL using 1 to 5 mg of substrate per well and operates between -10 and 200 °C at up to 1,500 psi. According to Symyx, when the system is combined with other workflow components, chemists can run and analyze up to 384 reactions per day in an automated manner.
Material usage is an advantage of the 96-well format, Sidler says, because the amounts needed add up to a reasonable quantity of compound to obtain and consume early in process R&D. "We use an initial screen to hone down to a particular set of conditions that we want to explore more thoroughly," he explains. Once that's done, "we'll scale those up in a little bit larger reactors--maybe 2 to 20 mL, using anywhere from 10 mg to a few hundred milligrams per reaction--to look at fewer variables." If the results are consistent, conditions are then optimized by using the best leads in even fewer experiments at even larger volumes.
Catalyst screening and crystallization are two areas in which high-throughput experimentation is needed, says Tom van Aken, vice president for global marketing and sales at Avantium Technologies. The company provides high-throughput instrumentation and software technologies, as well as R&D services, to the drug and chemical industries. Although Avantium is now independently owned, GlaxoSmithKline and Pfizer were among the original strategic shareholders.
Pfizer has outsourced work to Avantium, which has conducted more than 80 projects for the drug firm, van Aken says. One example was the selective catalytic hydrogenation of three different nitro compounds containing other functionalities (Org. Process Res. Dev. 2004, 8, 469). Hydrogenation of nitro groups is important in the synthesis of drugs such as Viagra, Zyvox, and Agenerase.
MULTIDIMENSIONAL: Product yield from low (small yellow spheres) to high (big blue spheres) is shown as a function of catalyst, solvent, and additive for hydrogenation reactions of an aliphatic nitro compound. [Reprinted from Org. Process Res. Dev. 2004, 8, 469. © 2004 American Chemical Society.]
The Pfizer and Avantium scientists' optimization approach was to "fish in the right place and cast as broad a net as possible," they reported, and to "triangulate on the basis of internal experience, literature precedents, and, for catalysts, also on the recommendation of experts." The survey, which ran 264 experiments over a few weeks in each of three studies, explored dozens of different catalysts, several solvents, and a handful of additives.
In the end, promising catalyst and reaction conditions were identified for two of the reactions. The most promising of those--resulting in an increase in yield from 40% to 90% for the hydrogenation of an aliphatic nitro compound to an amino acid--was confirmed at the 1-L scale. Further optimization of the solvent composition resulted in a process that holds promise for production-scale synthesis, the researchers claim.
Avantium also offers services for screening PEGylated proteins, which are those coupled to polyethylene glycol (PEG) molecules to reduce immune reactions and prolong drug circulation. Both the protein drugs and PEG reagents are expensive, van Aken explains, and as much as 50% of the drug can be lost during the coupling reaction. Reaction times are also very long (100 hours). But with parallel experimentation, 100 to 200 PEGylation reactions using very small amounts of material can be screened and then optimized for the correct solvent-reagent combination, coupling chemistry, yield, and impurity profile in a matter of weeks.
High-throughput experimentation frequently is employed when statistical approaches to experimentation trump chemical intuition. Examples that researchers offer include repetitive screening for appropriate ion-exchange resins, adsorbents to remove impurities, or crystallization conditions. "Crystallization is one of the most poorly understood scientific phenomena, and there are so many things that impact it that it's difficult to predict," van Aken says. "Running many experiments in parallel is a huge advantage."
Identification and control of crystal polymorphs are extremely important from an intellectual property standpoint and for consistent drug production. Case in point: Johnson & Johnson's recent willingness to spend $230 million to acquire TransForm Pharmaceuticals and its high-throughput crystallization and formulation technology. The six-year-old company has worked with several drug firms, finding new crystal forms of major drugs such as Pfizer's Zoloft and J&J's topiramate.
"In general, pharmaceutical companies are quite interested in high-throughput experimentation technology," Symyx's Sheffer says. "There is always going to be an internal debate within customers about the return on investment, but those that have deployed high-throughput workflows have achieved returns and paybacks in a very short period of time." Symyx equipment costs from a few hundred thousand to a couple of million dollars.
Roche's investment in a Surveyor system from Argonaut Technologies has more than paid for itself, says Martin R. Guinn, distinguished engineer at Roche Colorado. The Surveyor, which cost about $180,000, has robotic reagent addition and reaction sampling for high-performance liquid chromatography (HPLC) analysis for up to 10 independently controlled reaction vessels. The Roche lab has used its system for peptide synthesis and for crystallization and adsorption studies.
Part of the reason for buying the system, Guinn says, was "to demonstrate that we could reduce the solvent usage in making Fuzeon." The HIV drug is a complex 36-amino acid polypeptide produced in three fragments on polystyrene resin supports (C&EN, March 14, page 17). "There was a huge amount of N-methyl-2-pyrrolidone used to wash the resins," he explains.
IT TAKES about a week or two to build a fragment, Guinn says, so parallel experimentation was critical for optimizing a complex synthesis of more than 100 steps in a reasonable time frame. "We were able to demonstrate about a two-thirds reduction in solvent usage compared to the standard process, which translated into millions of dollars in savings," he adds.
Merck, Roche, and Pfizer operate specialist labs containing a variety of high-throughput experimentation equipment. Suppliers and equipment used by the companies include Anachem's ReactArray, Systag's FlexLab, and Mettler Toledo's MultiMax systems, as well as other Argonaut units, such as the Endeavor for high-pressure reactions and, at Pfizer, a prototype Atlantis reactor for slightly larger volumes. Companies including HEL, Chemspeed, Asynt, and Radleys Discovery Technologies make other parallel reactors and vessels found in process labs.
"At Pfizer, we implement laboratory automation for process R&D in two ways: via specialist labs and by direct rollout to project chemists," says Joel M. Hawkins, senior research fellow at Pfizer. Scientists in the specialist labs typically evaluate or develop new technologies and then apply them to process chemistry problems in collaboration with the project chemists. Eventually, select technology is then disseminated to the project labs for personal use or is made available on a walk-up or service basis.
"We have found that instruments designed for specialist groups are not necessarily the best for dissemination to the project labs," Hawkins comments. "For the project labs, hardware and software must be simple and intuitive to operate, small sized in order to fit into current laboratory space, and robust so that it is up and running most of the time." The greatest return will come from the project labs, he believes, where increased productivity quickly offsets the cost of relatively simple equipment.
Although simple to use, lab equipment for dissemination can be very sophisticated and can include associated analytics, Hawkins explains. "We have found that technologies, such as heat-flow measurements that were once used only by chemical engineers, can be incorporated into simple-to-use equipment for broad use by chemists," he says. "Combining this with an appropriate number of reactions in parallel, temperature profiling, strong agitation, and good ergonomics can give a very powerful tool which is embraced by the project chemists."
Hawkins has been presenting recent developments from Pfizer at industry meetings. His examples include using a ReactArray to optimize catalyst usage for both metal removal and cost considerations in Heck reactions, as well as determining reaction robustness to levels of residual water, phase-transfer catalyst, and solvent. Simply tracking heat flow to understand reaction rates has offered considerable insight into potential scale-up problems for exothermic reactions.
Using an Endeavor unit and measuring hydrogen uptake has become a straightforward means of following the extent of hydrogenation reactions in Hawkins' lab. The lab also has been exploring the calorimetry capabilities of a MultiMax to quantitatively follow reaction rates and kinetics in Diels-Alder test cases. Although heat flow offers a useful, nearly continuous measure, Hawkins says they are working to integrate it with HPLC and infrared measurements to follow and understand more complex chemistries and kinetics.
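The idea behind using heat flow as a reaction monitor can be illustrated with a toy calculation. For a first-order reaction, the instantaneous heat flow is proportional to the rate, so it decays exponentially and the rate constant can be read back from two heat-flow readings. All numbers below are invented for illustration; they are not data from the Pfizer lab:

```python
import math

# Assumed first-order kinetics: rate = k * C(t), heat flow q(t) ∝ rate,
# so q(t) = q0 * exp(-k*t) and conversion(t) = 1 - exp(-k*t).
k = 0.02                      # min^-1, assumed rate constant
q0 = 50.0                     # W, initial heat flow (arbitrary)
times = [0, 10, 20, 30, 60]   # min

for t in times:
    q = q0 * math.exp(-k * t)
    conversion = 1 - math.exp(-k * t)
    print(f"t={t:3d} min  q={q:6.2f} W  conversion={conversion:5.1%}")

# Recovering k from two heat-flow readings (here t = 0 and t = 30 min):
q30 = q0 * math.exp(-k * 30)
k_est = (math.log(q0) - math.log(q30)) / 30
print(round(k_est, 3))        # 0.02, matching the assumed rate constant
```

Because the calorimetric signal is nearly continuous, it flags a runaway exotherm or an unexpectedly fast rate long before an offline HPLC sample would, which is the scale-up insight Hawkins describes.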
To address the needs of individual chemists, Pfizer scientists have worked with Argonaut to develop the Advantage 2410 unit. Called a "personal screening synthesizer," it sells for about $18,000 and includes four 2- to 10-mL reaction vessels in a benchtop unit roughly the size of a hot plate. It has independent temperature control and monitoring, real-time data collection, and parameter setup via a touch-screen panel.
In March, Biotage agreed to acquire Argonaut. Argonaut had already discontinued some legacy systems, and Biotage CEO Jeff Bork says Biotage will continue to offer the still-supported process instruments, which had sales of about $3.5 million in 2004. In addition to the 2410 unit, the Advantage series includes a 3400 process optimization workstation, developed with drug industry partners, and the 4100 scale-up multireactor.
Roche's lab managers have also looked at the Argonaut 2410 and other personal-use equipment, offering it to chemists to try out and give feedback on its utility. "We're trying to change the way individual chemists are running experiments," Guinn explains, "getting them away from round-bottom flasks to using a unit that can go into their hood and allows them a lot of flexibility and the ability to run multiple experiments, log data, and control dosing and reactions.
"It's actually more like what they would be able to do in the plant," Guinn comments. "We often have more sophisticated instrumentation and control capabilities in our pilot plant than we have in the laboratory." The new lab equipment that's emerged in the past four or five years, however, has improved and added those capabilities, process chemists say. It also lets them collect data around the clock, increasing productivity and more effectively using resources.
"The drive is really to generate more robust processes quicker," Guinn says. Lab instrumentation that more closely mirrors production capabilities and provides insight into reaction conditions is a major step in this direction. "Pharma has done a pretty good job at developing good processes. But everybody recognizes that we need to know more about our processes and we need very efficient processes to drive down the cost of goods."
The near-term challenge, though, remains propagating the new tools out of the specialist groups and into the project labs, where they sometimes meet resistance. "As these tools are adopted, they will become less expensive, more reliable, and easier to use, and value truly will be gained," says Owen W. Gooding, senior director for product development at Argonaut. At the same time, specialist labs can efficiently serve the broader organization and keep the higher-investment, less transportable, and more complex equipment busy and serviced.
Symyx and other manufacturers believe that most chemists--once they understand how to use the equipment and see meaningful results--warm up to it pretty quickly. "Chemists are used to running their reactions in a hood and doing them one at a time, like the chemistry they learned in graduate school and have spent their careers doing," says Eric Carlson, Symyx group leader for Discovery Tools. Using automated and high-throughput systems requires "a bit of a mind shift in how you approach the chemistry and how you plan your experiments."
THE INCREASE in productivity is a big driver for change, Carlson explains, as pharmaceutical companies strive to get more results with the same staff. "This technology is not meant to displace the scientist--it is a research tool and needs to be used effectively. It doesn't help any organization to run lots of poor experiments, but the technology does allow you to design the right experiments and run those in a consistent manner."
Instrument makers and users admit that the technology brings an increased burden in data handling and analysis. "Clearly, when you start to run many experiments, the analysis and trying to make sense of the results take a lot more time," Merck's Sidler says. "Probably one of the biggest drawbacks we find in high-throughput and parallel screening is the analysis time." Appropriate informatics tools are crucial, he and others agree, for collecting, processing, storing, sharing, and reporting the data.
"The bottleneck very quickly becomes analysis," adds Christopher J. Welch, who heads the analysis and preparative separations group within process research at Merck. For example, even using a relatively fast, 10-minute HPLC assay for analyzing a run of 100 experiments adds up to a significant amount of time. Welch's group has been working with Sidler's to develop more rapid analytical methods to ease the bottleneck.
Working in their favor is the fact that the analysis can be more qualitative when screening reactions, Welch explains. "You don't need to know whether an individual well gives 95% or 98%--you can find that out in a follow-up experiment. Initially, you just need to know whether the results are good or bad." In addition to faster assays, the Merck researchers are also looking at parallel analysis to complement parallel reaction screening.
"You have two routes in trying to go for high-throughput analysis," Welch says. "You can do very, very short assay times, or you can do massively parallel assays, keeping traditional assay times but doing eight or 24 at a time." The best of both worlds, he adds, will be doing assays that are both fast and in parallel.
Merck is evaluating parallel microfluidic chromatography equipment, including an eight-channel HPLC instrument that it has developed with Eksigent Technologies and a 24-channel system from Nanostream. Instead of taking a day or more, a 96-well reaction plate can be analyzed with an eight-channel system in under an hour.
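The throughput gain is simple arithmetic. The sketch below takes the 10-minute serial assay time quoted by Welch and pairs it with an assumed per-channel assay time for the microfluidic system (the article does not give one; the value here is chosen to be consistent with the reported "under an hour" for a 96-well plate):

```python
# Back-of-the-envelope analysis-time comparison for one 96-well plate.
wells = 96

# Serial case: conventional HPLC, 10-minute assay (figure from the article).
serial_assay_min = 10
serial_total_h = wells * serial_assay_min / 60
print(serial_total_h)                  # 16.0 h -- "a day or more"

# Parallel case: 8-channel microfluidic HPLC with an assumed
# 4-minute assay per injection of eight samples.
channels = 8
parallel_assay_min = 4                 # assumption, not a published spec
batches = -(-wells // channels)        # 12 injections, ceiling division
parallel_total_min = batches * parallel_assay_min
print(parallel_total_min)              # 48 min -- "under an hour"
```

Note that the speedup comes from both directions Welch names: fewer injections (parallel channels) and a shorter assay per injection.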
In the meantime, supporters of high-throughput experimentation say it allows them to design a range of experiments and explore conditions at a rate that otherwise would not have been possible. Statistical tools such as chemometrics and design of experiments (DOE) have emerged to go hand in hand with experimentation (C&EN, July 14, 2003, page 37). Some researchers, such as at Pfizer, are combining process optimization with microreactor technology to explore new reactions (see page 43).
Dominique M. Roberge, project leader at Lonza, advocates combining reaction calorimetry, automated reactors, and DOE as a central part of process development (Org. Process Res. Dev. 2004, 8, 1049). Lonza scientists have used this approach to study a two-step reaction: the ruthenium-catalyzed oxidative cleavage of a cyclic α,β-unsaturated ketone followed by cyclization to a lactam (Org. Process Res. Dev. 2004, 8, 1036). Understanding and optimizing the reaction parameters led to a technically feasible process, subsequently scaled up for production.
Scaling up to kilogram or ton quantities may be problematic after finding a result at a microliter scale, Avantium's van Aken warns. Thus, Avantium emphasizes rational experimental approaches based on statistical techniques, rather than simply automating hundreds or thousands of very small-scale experiments every time. For example, after finding a lead in high-throughput mode, the company has a system for doing 192 targeted DOE reactions, testing the important parameters on a milliliter reaction scale.
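A targeted DOE block of the kind van Aken describes can be sketched in a few lines. The example below generates a two-level full-factorial design around a screening hit; the factor names and levels are illustrative, not Avantium's or Lonza's actual parameters:

```python
from itertools import product

# Illustrative two-level factors around a hypothetical screening hit.
factors = {
    "temperature_C": (60, 90),
    "pressure_psi":  (15, 100),
    "catalyst_molpct": (0.1, 1.0),
    "reagent_equiv": (1.0, 1.5),
}

# Full factorial: every combination of low/high levels.
design = [dict(zip(factors, levels))
          for levels in product(*factors.values())]
print(len(design))    # 2^4 = 16 corner points

# Replicating the 16 corners across, say, 12 solvent systems fills a
# 192-reaction block like the one Avantium describes.
print(len(design) * 12)   # 192
```

A full factorial at two levels scales as 2^k in the number of factors, which is why statistical designs (fractional factorials, center points) rather than brute-force grids are favored once the factor count grows.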
Reaction screening is helping the process development lab meet workload demands originating from discovery research and rapidly zero in on areas most likely to produce scalable reactions. Development timelines are shrinking, Roche's Guinn says. "There is such a rush to get compounds into the clinic that the old paradigm of being able to develop each step through a one-experiment-per-day approach is really not going to fit with the realities of our current timeline.
"Management in industry recognizes that we need to be more innovative in our process development and use the tools that are available to get data we hadn't been getting in the past," he continues. "We have less flexibility to have failures. We need to be successful the first time."
MORE ON THIS STORY
Researchers find that processes run in microreactors open doors to more efficient and novel chemistry useful for fine chemicals and intermediates
KNOW THY PROCESS
Regulatory push for process analytics sets new goals for pharmaceutical manufacturing