FINE CHEMICALS
Process design and improvement tools help chemists and engineers quickly deliver cleaner, safer, and more cost-effective methods
A. MAUREEN ROUHI, C&EN WASHINGTON
WAVE OF THE FUTURE Phoenix Chemicals' new continuous high-pressure hydrogenation plant for asymmetric reductions operates at pressures greater than 200 bars and temperatures greater than 200 °C. It has an output of at least 100 metric tons per year.
PHOENIX CHEMICALS PHOTO
Translating chemistry in a flask to chemistry in a thousand-gallon reactor requires not only know-how, creativity, and skill but also time to develop and test ideas. As speed to market has become more and more critical, the time for process development has correspondingly shrunk. Over the past decade, chemists and engineers have increasingly embraced process design and improvement tools to make up with speed and efficiency what they lack in time. These tools--as well as innovations in catalysis, continuous processing, and alternative reactors or reaction systems--enable new reactions to reach the plant more quickly and fine chemicals manufacturing to be less hazardous, more environmentally friendly, and more cost-effective than ever.
Statistical methods and high-throughput experimentation are among such tools. Their acceptance has been spurred by the influx of software for experimental design and data analysis, the growth of lab automation, and the effective interfacing of statistics, robotics, and experimentation through personal computers. The number of scientific papers describing process developments and optimizations enabled by these tools is rising, says Trevor Laird, editor of Organic Process Research & Development (OPRD).
Companies providing statistical software--such as Accelrys, Stat-Ease, and Umetrics--also have noted the growing interest in statistical methods. Mark J. Anderson, a chemical engineer and a cofounder of Stat-Ease, attributes this to the power of the statistical method called design of experiments (DoE) to detect interactions among factors affecting process performance.
Anderson uses the example of microwave popcorn to explain the concept of factor interactions. Factors that can affect the taste include the type of microwave oven, the power setting, the brand of popcorn, and the cooking time. Experiments he performed with the help of fifth-grade students revealed a significant interaction between power setting and time in affecting taste, he says. At low power, increasing the time does not affect taste significantly. But at high power, longer cooking adversely affects the taste--the popcorn is likely to burn.
Such interactions cannot be detected by one-factor-at-a-time experimentation. "The interactions are hidden gold waiting to be mined by DoE techniques," Anderson says. "I was able to make a million dollars' worth of improvement in the yield of a vitamin product when I found interactions in an existing process that nobody could find through the one-factor-at-a-time approach."
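To see how a two-level factorial exposes an interaction that one-factor-at-a-time runs miss, consider a minimal sketch of the popcorn example in Python. The taste scores below are invented for illustration; only the contrast arithmetic is standard factorial practice.

```python
# A minimal two-level (2^2) factorial on Anderson's popcorn example.
# Taste ratings are hypothetical; the design logic is the real point.
import numpy as np

# Coded levels: -1 = low power / short time, +1 = high power / long time
power = np.array([-1, +1, -1, +1])
time  = np.array([-1, -1, +1, +1])
taste = np.array([6.0, 7.0, 6.5, 3.0])  # hypothetical ratings (10 = best)

main_power  = taste[power == +1].mean() - taste[power == -1].mean()
main_time   = taste[time  == +1].mean() - taste[time  == -1].mean()
interaction = (taste * power * time).sum() / 2  # power x time contrast

print(f"power effect: {main_power:+.2f}")        # -1.25
print(f"time effect:  {main_time:+.2f}")         # -1.75
# A large interaction term means the effect of time depends on the power
# setting: at low power longer cooking barely matters, at high power it burns.
print(f"power x time interaction: {interaction:+.2f}")  # -2.25
```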
Varying one factor at a time is classic experimentation, but it would be extremely inefficient for process development. In a recent study, chemists at DSM identified 17 parameters that can be critical for just one reaction. Multiply that by the levels to be tested per factor, and the number of experiments becomes unmanageable: at just three levels each, a full factorial over 17 factors would require 3^17, or roughly 129 million, runs.
The power of DoE lies in testing multiple variables simultaneously within one block of experiments. David Nicolaides, product manager for materials informatics at Accelrys, explains how it works: Typically, researchers hypothesize about which factors (such as pH, temperature, hydrogen pressure, catalyst loading, or solvent) will affect some reaction property (such as yield, selectivity, or product purity). Then, on the basis of their expertise, researchers will set limits on each factor--for example, the pH cannot go beyond a certain range. These define the window where the best process will be sought.
"Someone without statistical expertise might simply carpet-bomb that window with experiments," Nicolaides says. "The statistical method essentially says: 'I see what factors you're interested in and what you're trying to measure. If you're interested just in a first approximation, then these many experiments at these specific values are the ones you should run.' "
THIS SCREENING of critical factors is the first level of DoE. The statistical methods derive equations identifying the key factors and how they interact. In another level of DoE, called optimization, the researcher, focusing only on the critical factors within the boundaries previously set, asks all kinds of questions: What happens if I change this factor? How can I maximize yield? How can I minimize by-products?
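For readers who want to see the mechanics, here is a hedged sketch of the screening step: a two-level design over four hypothetical factors, with effects ranked by least-squares regression. The factor names and yields are placeholders, not data from any company mentioned in this article.

```python
# A sketch of DoE screening: run a two-level design over candidate
# factors, then rank the estimated effects. All responses are invented.
import itertools
import numpy as np

factors = ["pH", "temperature", "H2 pressure", "catalyst loading"]

# Full 2^4 factorial in coded units (-1 = low limit, +1 = high limit)
X = np.array(list(itertools.product([-1, 1], repeat=len(factors))), float)

# Hypothetical measured yields for the 16 runs, in design order
y = np.array([62, 64, 70, 71, 61, 66, 71, 75,
              55, 58, 64, 66, 57, 60, 66, 70], float)

# Model: intercept + main effects (interaction columns can be added
# the same way, as products of the coded factor columns)
A = np.column_stack([np.ones(len(y)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

for name, c in sorted(zip(factors, coef[1:]), key=lambda t: -abs(t[1])):
    print(f"{name:18s} effect: {2 * c:+.2f}")  # effect = 2 x coefficient
```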
"If you have many experimental parameters, you can make empirical mathematical models to include them all," says Hans-René Bjørsvik, a chemistry professor at the University of Bergen, in Norway, and formerly a process chemist at Borregaard Synthesis. These models can then be used to generate contour maps that indicate the direction one must take to optimize the desired response.
"It's a beautiful tool for the process chemist," Bjørsvik says. "The more complex the reaction is, the more value you can get. Even if the chemistry is very difficult and you don't know much about it, you can get a lot of help with just a few experiments."
James J. Carey, principal process development chemist at DSM Pharma Chemicals R&D in Greenville, N.C., is an avid practitioner. "DoE complements chemical know-how, intuition, and training by allowing us to approach chemistry more quantitatively," he says. Recently, he has been including cost in his statistical models. "The highest yield may not always be the most profitable," he says.
Carey illustrates this with a low-temperature aldol reaction that had been operated at -40 °C. Using DoE, his group found that operating at 0 °C was more profitable despite the lower yield because it cost more to chill the reactor to -40 °C. People emphasize yield, he says, but the most important factor really is cost.
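A toy calculation makes the point concrete. All of the prices below are invented; only the structure, yield times product value minus operating cost, mirrors the kind of term Carey adds to his models.

```python
# Hypothetical economics of the aldol example: deep chilling buys
# extra yield but costs far more than running at 0 deg C.
def profit(yield_frac, product_value=1000.0, cooling_cost=0.0):
    """Profit per batch: product value scaled by yield, minus utilities."""
    return yield_frac * product_value - cooling_cost

print("at -40 deg C:", profit(0.90, cooling_cost=250.0))  # 650.0
print("at   0 deg C:", profit(0.82, cooling_cost=40.0))   # 780.0
```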
DoE is also extremely useful in mixture design. Anderson's usual example is preparing homemade play putty from Elmer's glue, borax, and water. The question is, What proportions of these ingredients will give the best bounce and the best feel to the putty?
Guided by DoE, Anderson prepared 16 blends with a kitchen mixer. Members of his family rated the "feel" quality by squeezing. His youngest daughter helped to test the bounce by standing on a step stool and dropping a balled-up blend from a specified height. Anderson then scrambled to measure how high each ball bounced. The results showed that simultaneously maximizing the feel rating and the bounce is impossible. But they suggested a recipe for an acceptable putty in terms of both properties, he says.
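Mixture designs differ from factorial designs in one key constraint: the ingredient proportions must sum to one, so candidate blends live on a simplex rather than in a box. The sketch below generates a simplex-lattice for three ingredients; the specific fractions are illustrative, not Anderson's actual recipes.

```python
# Simplex-lattice mixture design for three components (glue, borax,
# water in the putty example). Proportions are multiples of 1/degree.
from itertools import product

def simplex_lattice(n_components=3, degree=4):
    """All blends whose proportions are multiples of 1/degree and sum to 1."""
    points = []
    for combo in product(range(degree + 1), repeat=n_components):
        if sum(combo) == degree:
            points.append(tuple(c / degree for c in combo))
    return points

for glue, borax, water in simplex_lattice():  # 15 candidate blends
    print(f"glue {glue:.2f}  borax {borax:.2f}  water {water:.2f}")
```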
Computational tools are also helping in predictive monitoring of production runs, according to Christopher P. Ambrozic, a senior consultant at Umetrics. For example, he says, a client from the specialty chemicals area presented a problem of a process failing unpredictably. When it does, production halts for about 36 hours. Umetrics was asked to help identify the cause of the problem and find a way to avoid it.
Umetrics took the plant's historical production data and separated the good batches from the bad ones, Ambrozic says. Using software, Umetrics converted data from the good batches into a batch fingerprint: what the process should look like at every single point in time, Ambrozic explains. From that fingerprint, they derived a multivariate signal that dynamically changes with time. That signal summarizes in one value the hundreds or even thousands of signals from the production line in real time. The signal augurs impending failure when it goes beyond certain limits, and in this case an alarm is set off two hours in advance. Operators then adjust the process to put it back on track.
In another case, involving high-speed processing, multivariate analysis helped predict imminent failure one minute in advance, Ambrozic adds. In this case, computerized control systems respond to the alarm and execute the corrections required to restore the process.
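Umetrics' software is proprietary, but a PCA-based control chart is one common way to implement the batch-fingerprint idea Ambrozic describes, and a minimal sketch, on simulated data, looks like this: compress the good-batch sensor signals into a few principal components, then flag any snapshot whose Hotelling's T-squared drifts past a control limit.

```python
# A hedged sketch of multivariate batch monitoring. Data are simulated;
# a real system would use historical plant data and richer statistics.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
good = rng.normal(size=(200, 50))   # 200 good-batch snapshots, 50 sensors

pca = PCA(n_components=3).fit(good)

def t_squared(x):
    """One summary value per snapshot from many raw signals."""
    scores = pca.transform(np.atleast_2d(x))
    return float((scores**2 / pca.explained_variance_).sum())

limit = np.percentile([t_squared(row) for row in good], 99)

drifting = good[0] + 2.5            # simulated upset on every sensor
if t_squared(drifting) > limit:
    print("alarm: process drifting out of its fingerprint")
```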
COMPELLING PRODUCTIVITY gains have spurred wider use of statistical methods in process R&D for pharmaceutical ingredients. Another driver is a Food & Drug Administration initiative strongly encouraging drug companies to apply data creation, collection, and analysis technology to existing production lines to improve efficiency, reduce failure rates, and ultimately lower the cost of pharmaceutical products.
In other areas, process improvement tools are just beginning to be recognized. Statistical methods are hardly known in academia, for example. Only a few chemical engineering departments teach DoE, Anderson says. Most of the education comes from training offered by vendor companies such as Stat-Ease. Some companies--Eastman Chemical and GlaxoSmithKline (GSK), for example--provide in-house training. The ACS Education Division regularly offers workshops.
That academia pays little attention to these essential tools for chemists working in the real world concerns companies like GSK. Earlier this month, it convened a two-day conference in Stevenage, England, titled "Understanding Chemical Processes." The target audience was academia.
According to conference organizer Martin R. Owen, manager of strategic technologies at GSK, "Experimental design tools are some of the most useful available to industrial chemists. They allow chemists to make rational, data-driven decisions about operating conditions for commercial processes. But they are not routinely taught as part of the curriculum." GSK hopes to form partnerships with academia to promote these tools within the chemistry community, he adds.
Other barriers to adoption of statistical methods are certain misconceptions. A common one is that DoE takes away creativity. In fact, Carey says, DoE "causes us to be creative in a different way."
About five years ago, Carey and his team were faced with a crystallization in ethyl acetate that was giving variable yields. They found through DoE that carryover of residual solvent--tetrahydrofuran and water--from the previous operation to the crystallization vessel was a significant factor. DoE also showed that carryover of only THF or only water had no effect, but even a small amount of THF and water together reduced yields by as much as 15%. The interaction between water and THF was unexpected. On the basis of the findings, the crystallization solvent was changed to ethanol.
"That's where creativity comes in," Carey says. "DoE did not tell me to use ethanol. But it showed that we were losing yield when both THF and water were present. With our knowledge and experience, we thought a hydrogen-bonding solvent would be helpful, and it was. By forming hydrogen bonds with water, ethanol [solved] the problem."
"DoE is an exciting tool to validate ideas," says Shawn Dougherty, a process chemist at Eastman Chemical. "Chemists can't sell ideas by just saying 'Believe me.' They need supporting facts. DoE is a way of speaking that is easy to understand."
Productivity gains from statistical methods have been amplified by high-throughput automated experimentation made possible by equipment from companies like Argonaut, Chemspeed, and Mettler Toledo. Carey estimates that automated high-throughput equipment can speed up research by five to 10 times.
These systems also encourage chemists to be more creative, Dougherty says. "Chemists can more quickly test their hunches and generate data to sell their ideas to management without taking a lot of resources from other projects."
"High-throughput experimentation is something we decided to do fairly early," comments Johannes (Hans) G. de Vries, a principal scientist for homogeneous catalysis at DSM Pharma Chemicals. Customers would like to see a process within 12 months or less. Without automation, 12 months is not enough time, if even just one step involves a catalyst, he says. With robotic screening of libraries, however, a catalyst can be found in three weeks.
IT'S PARTICULARLY TOUGH for catalytic asymmetric hydrogenations, de Vries says, because the solvent itself affects reaction rates and enantioselectivity. Screening of both ligands and solvents is required. Today, screening 10 different ligands against 10 different solvents is routine. "You would never have done this in the past," he says. "One or two solvents would be screened, and if neither worked, you'd give up. Now, we very often find solvents we never would have expected to give the best results."
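In code, the routine 10-by-10 ligand/solvent screen is just a grid search over the cross product of the two sets. The sketch below uses a placeholder measure_ee() function, standing in for a robotic hydrogenation plus chiral analysis, and invented ligand names.

```python
# A sketch of the ligand x solvent screen de Vries describes.
# measure_ee() is a placeholder; in practice it is an automated run.
import itertools
import random

ligands  = [f"ligand-{i}" for i in range(1, 11)]
solvents = ["MeOH", "EtOH", "iPrOH", "THF", "toluene",
            "DCM", "EtOAc", "MeCN", "DMF", "water"]

def measure_ee(ligand, solvent):
    """Placeholder: a parallel hydrogenation plus ee assay."""
    return random.uniform(0, 99)

random.seed(7)
results = {(l, s): measure_ee(l, s)
           for l, s in itertools.product(ligands, solvents)}
(best_ligand, best_solvent), best_ee = max(results.items(),
                                           key=lambda kv: kv[1])
print(f"best: {best_ligand} in {best_solvent}, {best_ee:.1f}% ee")
```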
The quality of data is also much more likely to be bias-free with automated systems. According to Ulf Tilstam, a senior research scientist at the Lilly Development Center in Belgium and a member of the editorial board of the journal OPRD, "Automated systems don't care whether a reaction is supposed to fail." As he explains: "When you use statistical design, you set up a framework of parameters, and you include parameter levels that you know will not work. That is solely to get data, to have the set points required by the statistical analysis. If you give the experiment to a human being, he or she knows that the experiment will go wrong and may think about it in a different way. That somehow changes the data you get."
Only in the past three or four years, as simpler and easy-to-operate equipment became commercially available, have organic chemists begun familiarizing themselves with automated parallel experimentation tools, says Roel Ferwerda, global marketing director for the reaction engineering group of Mettler Toledo.
Once chemists are comfortable with the equipment, they can easily realize the advantages. By being able to perform multiple experiments simultaneously, chemists gain time. With software directing and robots performing tasks previously executed manually, chemists simplify their lives. And with the ability to insert probes to monitor the reaction and to analyze data in real time, chemists gain more knowledge in a shorter time than has ever been possible.
WORKING SMARTLY Carey uses Argonaut's Surveyor for process optimization studies with statistical design methods.
DSM PHOTO
FULL INTEGRATION of parallelization, automation, and in situ analytics is not yet available, Ferwerda says, and that may not be a bad situation. A fully integrated system might just be too overwhelming now. Starting with simple units and then increasing the complexity in stages might be more effective in increasing acceptance of lab automation.
For example, the now widely used multiple reactor blocks and carousels fulfill the time-saving parallelization function. Mettler Toledo and others have added software control to multiple reactors to fulfill automation. But, Ferwerda says, some users take advantage of only the parallelization, while others may program entire recipes and come back for results the next day.
The final piece--analytics to gather information about the reaction in real time--is only now finding its place. The problem, Ferwerda says, is effective integration. "Multiple reactor systems with integrated infrared spectroscopy and liquid chromatography will be successful only if the parts can be integrated into one seamless system that creates results from the data being monitored and that presents those results to the scientist in a clear way," he explains. For this reason, Mettler Toledo has just established a new business unit, called Applied Informatics, to create software for integrated systems and user-friendly interfaces.
Another barrier to the use of automated tools is lack of knowledge among new chemistry graduates. "At this point, automation is really happening just in industry. Academia will have to follow because they have to deliver the students who know how to use these tools," Ferwerda says.
But many academic labs cannot afford those tools. Therefore, with the help of several drug companies, Mettler Toledo has started the MultiMax University Student Education Program to place automated tools in universities. (MultiMax is Mettler Toledo's brand name for its multiple-reactor product lines.)
Alan Smith, vice president for technology at Avantium Life Sciences, cautions that automation can trap people into a numbers game. Avantium's core business is high-throughput experimentation coupled to rational design of experiments and data management. "Because an 80-reactor tool gives the ability to try a lot of things, people are tempted to do more," he says. Many drug companies invested in expensive platforms that are now collecting dust because they could not efficiently turn masses of data into useful information and knowledge, he adds.
The goal of process R&D is to transfer the cleanest, safest, and most cost-effective process from the lab to the production plant. Statistical methods and automated experimentation are just two of the basic tools. Researchers also examine how other facets of a reaction may help achieve that goal. Some of the most intense activities are in catalysis and reaction systems.
A search of the databases of Chemical Abstracts Service (CAS) reveals that more than 1,800 patents addressing stereoselectivity in fine chemicals preparations have been issued worldwide since 2000. The outstanding trend, explains Matthew Toussant, CAS director of editorial operations, is use of selective chemical catalysts and processes.
Tilstam observes the same trend. Among his duties at OPRD is to contribute to a monthly feature of items of interest to process R&D chemists and engineers. Preparing for this feature requires him to pore over many papers. "People are looking more and more into catalytic reactions," he tells C&EN. "I also see that people are starting to use catalytic reactions more in scale-up because of the ability to screen a lot of parameters with automated equipment."
The fine chemicals industry traditionally scales up from stoichiometric laboratory procedures, observe Hans de Vries and his colleague André H. M. de Vries in a recent paper [Eur. J. Org. Chem., 2003, 799]. This approach is fast and reliable, but it generates up to 100 units of waste per unit of product, they say.
Homogeneous catalysis, they point out, can reduce waste, increase selectivity, produce the desired enantiomer in chiral syntheses, shorten synthetic routes, and require milder reaction conditions. Nevertheless, only about 10% of all fine chemicals production steps currently use homogeneous catalysis. That percentage will rise as homogeneous catalytic reactions become more robust and easier to execute than they are at present.
Ian C. Lennon, a Cambridge, England-based technology leader for Dowpharma, cites two recent examples of better chemistry through homogeneous catalysis. One is an asymmetric route to the drug Pregabalin; another, to (R)- or (S)-methylsuccinamic acid, both of which are important chiral building blocks.
Pregabalin [(S)-(+)-3-aminomethyl-5-methylhexanoic acid] is being developed by Pfizer to treat epilepsy, neuropathic pain, and anxiety. A route reported in 1997 yields racemic product, which is resolved with (S)-mandelic acid. An asymmetric route designed by Pfizer and Dowpharma has just been published [J. Org. Chem., 68, 5731 (2003)].
THE KEY STEP is an asymmetric hydrogenation of the tert-butylammonium salt of 3-cyano-5-methylhex-3-enoic acid to (S)-3-cyano-5-methylhexanoate with a rhodium DuPhos catalyst at a substrate-to-catalyst ratio of 2,700. Hydrogenation of the nitrile gives Pregabalin in 61% yield and 99.8% enantiomeric excess. The advantages are 20% reduction in cost of goods, 30% reduction in waste, and 39% improvement in throughput.
In the second example--preparation of methylsuccinamic acid enantiomers using rhodium DuPhos--the substrate-to-catalyst ratio is much higher: 100,000 [Org. Process Res. Dev., 7, 407 (2003)]. Products are obtained in greater than 99.5% enantiomeric excess.
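The substrate-to-catalyst ratios quoted here translate directly into catalyst loadings, since an S/C ratio is just the inverse of the mole-percent loading:

```python
# Back-of-the-envelope arithmetic from the article's own S/C figures.
for s_to_c in (2_700, 100_000):
    mol_pct = 100 / s_to_c
    print(f"S/C {s_to_c:>7,} -> {mol_pct:.4f} mol% catalyst")
# S/C   2,700 -> 0.0370 mol%  (the Pregabalin hydrogenation)
# S/C 100,000 -> 0.0010 mol%  (the methylsuccinamic acid case)
```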
At the catalyst loading used, the level of metal in the crude product is only 9 ppm, says Christian T. Goralski, a Dowpharma scientist and one of several chemists who worked on this problem. After only one crystallization, that level drops to 0.8 ppm. "You remove virtually all the metal in one simple process. This reaction is incredibly productive and very clean," he says.
In progress of a different sort, DSM chemists have simplified the catalyst for certain Heck reactions by eliminating the ligands. Heck and similar reactions are aromatic substitutions leading to carbon-carbon bond formation. They are usually catalyzed by palladium with phosphine ligands.
Most reactions work well with 3 mol% of palladium and 9 mol% of phosphine ligand, Hans de Vries says. But using that much of a very expensive metal in a commercial plant is not economically feasible, and the phosphine ligands contaminate the product. In addition, the reaction works well with aryl iodides but poorly with aryl bromides, which are cheaper. Understanding the catalytic cycle, the DSM chemists realized that eliminating the ligand would solve all the problems.
Hans de Vries explains the thinking: With aryl iodides, the reaction is fast enough to keep all the palladium in the catalytic cycle. Aryl bromides react more slowly, and so not all the palladium is in the catalytic cycle. Some of it starts aggregating and eventually precipitates. The problem is keeping palladium in solution for catalysis, which is the role of the phosphine ligands. But the phosphine ligands also make the catalyst less reactive toward aryl bromides.
Because aggregation is probably higher order in palladium, and catalysis is probably first order in palladium, the DSM chemists thought that lowering the palladium concentration might slow aggregation more than catalysis. And it works: At 0.1 to 0.01 mol% palladium, catalysis works just fine. "The solution is very counterintuitive because when a catalyst does not work, people tend to throw in more rather than cut back," Hans de Vries says.
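The scaling argument can be restated numerically. Assuming, as the DSM reasoning does, that catalysis is roughly first order in palladium and aggregation is higher order (second order is taken here for the sketch), the catalysis-to-aggregation ratio grows as 1/[Pd] on dilution. The rate constants below are arbitrary; only the scaling matters.

```python
# Dilution slows second-order aggregation faster than first-order catalysis.
for pd in (3e-2, 1e-3, 1e-4):            # mole fraction Pd: 3, 0.1, 0.01 mol%
    rate_catalysis   = 1.0 * pd          # ~ k1 [Pd]
    rate_aggregation = 1.0 * pd**2       # ~ k2 [Pd]^2
    print(f"[Pd] = {pd:.0e}: catalysis/aggregation ratio = "
          f"{rate_catalysis / rate_aggregation:.0f}")
# The ratio is 1/[Pd], so at 0.01 mol% Pd the catalytic cycle outcompetes
# aggregation roughly 300 times better than at 3 mol%.
```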
SAFER, CLEANER, and more cost-effective processes also can be realized through biocatalysis and biotransformations. In the early 1980s, DSM began exploring biocatalysis for synthesis of antibiotics such as cephalosporins. A key step is coupling a nucleus--7-amino-3-deacetoxycephalosporanic acid (7-ADCA)--to a side chain. According to Marijn Rijkers, a senior scientist at DSM Pharma Chemicals, the chemical coupling required very low temperatures (-50 to -60 °C) and chemical protection and activation of various groups to minimize side reactions. Rijkers says biocatalysis reduced waste by 67%, use of organic reagents by 80%, use of organic solvents by 82%, use of steam by 60%, and use of liquid nitrogen by 100%.
Since 2000, more than 400 patents on use of microorganisms, parts of microorganisms, or enzymes to produce higher purity specialty chemicals have been issued, according to Toussant of CAS. Biocatalysis and biotransformation for fine chemicals production is expected to grow as companies like Codexis and Diversa make available enzymes and microorganisms that enable breakthroughs in chemical manufacturing.
According to John H. Grate, chief technology officer of Codexis, evolving enzymes and fermentation strains through gene and genome shuffling has "fundamentally changed" the way enzymes or microorganisms are applied to chemical production. Instead of designing a process around an enzyme or a microorganism, an enzyme or a fermentation strain is evolved to fit a process. And because evolution is massively accelerated with gene or genome shuffling, evolution and process design can proceed on parallel development tracks.
Fermentation strains improved by gene shuffling are now in commercial processes, Grate says. One such process is Pfizer's production of the antibiotic doramectin; another is DSM's production of the antibiotic precursor 7-ADCA. In both cases, an enzyme in the metabolic pathway had been limiting production. Gene shuffling provided better genes that were then incorporated into the fermentation organism.
The power of genome shuffling was demonstrated last year by researchers who, in one year, obtained a fermentation strain for making the antibiotic tylosin that is as productive as the current strain being used by Eli Lilly, which had taken two decades to generate [Nature, 415, 644 (2002)]. Eli Lilly and Codexis now are collaborating to improve Eli Lilly's fermentations.
Opportunities abound outside the pharmaceutical arena. Last month, Codexis and Cargill, an international processor of agricultural and food products, agreed to develop collaboratively an organism that will efficiently make 3-hydroxypropionic acid from carbohydrates. 3-Hydroxypropionic acid can be converted to 1,3-propanediol, malonates, acrylic acid, and polyesters. The project will establish a new raw material based on renewable resources, Grate says.
Meanwhile, Diversa has developed a new route to statin side chains via DERA (2-deoxyribose-5-phosphate aldolase) enzymes. These aldolases break down the sugar in DNA.
Nine years ago, Chi-Huey Wong, a chemistry professor at Scripps Research Institute, showed that a DERA enzyme can catalyze two successive aldol transformations to form a cyclic product [J. Am. Chem. Soc., 116, 8422 (1994)]. "Through discussions with Chi-Huey, we recognized that the reaction can be used to make advanced intermediates for drugs like Lipitor and Crestor," says Mark J. Burk, Diversa's vice president for chemical and industrial R&D.
Diversa licensed Wong's DERA process patents and has since discovered through genomic DNA screening several enzymes that are much better than the one that Wong used originally. Diversa also has demonstrated the feasibility with a 100-g-scale synthesis. It is now engaged in discussions with potential partners to further exploit the reaction, Burk says.
In other developments, process R&D departments are experimenting with, and even implementing, new reactors and reaction systems. Although fine chemicals traditionally have been manufactured by batch production, continuous processing is attracting attention.
TOO LONG, TOO SHORT, JUST RIGHT The three cartoons illustrate the principle of Phoenix Chemicals' variable-residence-time reactor. In all cases, reactants are represented by the green and red feeds. The desired product should emerge as a green stream. When the reactants traverse every mixing element in the reactor--equivalent to a residence time of 31 units, as shown at top--the output stream is brown, indicating lots of by-products. When only the first three elements are used--equivalent to seven time units, middle--the output is blue, suggesting unreacted material. If the residence time is tweaked slightly to nine units--flow through the first and fourth elements, bottom--the reactor generates material of the required specification.
At Phoenix Chemicals, for example, "the goal is to do everything on a continuous basis," according to the general manager, Colin A. Leece. The company embraced continuous processes initially to minimize inventories of hazardous chemicals. But, Leece says, "we don't see why any chemical reaction that's done batchwise can't be done in a flow system."
At Phoenix, reactors that can make 120 tons of product per year can fit on top of an office desk. The equipment is small--some reactors aren't bigger than a tea mug, Leece says. It is inexpensive compared with batch vessels and is easy to install, operate, maintain, and adapt. "We find that the industry alters the chemistry to fit the equipment," Leece observes. "We build the equipment to fit the chemistry."
Last month, Phoenix launched a new continuous reactor called the Variable Residence Time (VRT) reactor. According to Lee Proctor, technology manager, setting up a reactor for a specific residence time is inefficient. Different reactors will be required for different residence times because adjusting residence time by changing flow rates affects the mixing and heat-transfer characteristics, Proctor says.
A binary progression of units of residence time--1, 2, 4, 8, 16, and 32--is the core of the VRT reactor. Any residence time can be expressed as a binary expansion. Proctor says Phoenix Chemicals currently uses a VRT reactor to produce a number of products, including a hydroxynitrile intermediate for Lipitor.
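The binary idea is easy to sketch: with elements sized 1, 2, 4, 8, 16, and 32 units, any integer residence time from 1 to 63 units is reached by routing the stream through the right subset of elements, just as an integer is written in binary.

```python
# A sketch of the VRT reactor's core idea: pick the mixing elements
# whose residence times sum to the target, via greedy binary expansion.
def elements_for(residence_time, sizes=(32, 16, 8, 4, 2, 1)):
    """Which elements the stream should traverse for a given time."""
    chosen = []
    for size in sizes:
        if residence_time >= size:
            chosen.append(size)
            residence_time -= size
    return chosen

print(elements_for(9))   # [8, 1] -- the "just right" case in the cartoon
print(elements_for(31))  # [16, 8, 4, 2, 1]
```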
Merck KGaA also uses miniature reactor technologies. Before it exited from the vitamin market, one step of its vitamin H production was run continuously in small reactors, according to Klaus Bofinger, vice president for production and development. That's because continuous processing in small reactors overcame the hurdle of preparing a bismetallated species, he says.
By using one side of the molecule in one reaction and then the other in a second reaction, this bismetallated species enabled a whole sequence of reactions to be condensed into one step. But the chemistry required tricky time and temperature regimes, concentrations had to be very tightly controlled, and the scale at which a reasonable yield could be achieved was too small for batch operation, Bofinger says. "The results [with continuous processing] were so convincing that in two years we were in commercial scale."
Merck KGaA continues to hone its expertise. "Sometimes reactions produce intermediates that could explode or that have to be handled at low temperature. Continuous processing allows production on demand and requires fewer resources for cooling," says Michael Grund, head of central process development engineering. But more development is required, he adds. Microreactors are still inefficient in handling solids and suspensions, and process control needs further improvement.
Bofinger believes it will be some time before microreaction technology is applied to regular production of active pharmaceutical ingredients (APIs). "But even now we are already using microreactions in process development of new APIs, which may be marketed in five years' time."
According to Lukas Utiger, head of exclusive synthesis at Lonza, the company uses continuous processing for specific reactions, like ozonolysis, and it is limited generally to starting materials and unregulated pharmaceutical intermediates. However, Lonza researchers are exploring heterogeneous reactions in microreactors. Reactions with solid/liquid or gas/solid/liquid systems are complicated, he says. "If you can master that in microreactors, the technology will be open for further exploitation."
MEANWHILE, Dowpharma has invented a semicontinuous Swern oxidation to make an advanced drug intermediate. According to Dowpharma process chemist J. Russell McConnell, the Swern oxidation usually is run under cryogenic conditions because the intermediates are unstable. However, such conditions are difficult to sustain at a large scale. If the residence time of the intermediates could be reduced, the reaction could work at higher temperatures. "The approach we took was to generate the intermediates inside an in-line reactor and give them very little time to do anything else," he says.
The process is only partly continuous. Two separate reagent streams come together in an in-line reactor--a tube outside of and connected to a conventional stirred-tank reactor. The intermediate forms as the reactants traverse the tube and immediately is dumped into the stirred tank, where it is converted to the aldehyde. Yields are only slightly higher than those realized under cryogenic conditions, but operability is vastly better, McConnell says.
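The article does not give the tube's dimensions, but the underlying plug-flow arithmetic is simple: residence time is tube volume divided by flow rate. The numbers below are invented purely to show the scale of "very little time."

```python
# Rough plug-flow residence time for a hypothetical in-line reactor tube.
import math

diameter_cm = 0.5     # invented
length_cm   = 100.0   # invented
flow_mL_min = 250.0   # invented

volume_mL   = math.pi * (diameter_cm / 2) ** 2 * length_cm  # ~19.6 mL
residence_s = volume_mL / flow_mL_min * 60                   # ~4.7 s

print(f"tube volume: {volume_mL:.1f} mL, residence time: {residence_s:.1f} s")
```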
At Thomas Swan & Co., a specialty chemicals manufacturer, continuous processing intersects another trend--novel reaction systems. In July 2002, the company launched a commercial-scale, multipurpose continuous facility for running reactions in supercritical carbon dioxide.
"People have been looking at supercritical CO2 for a long time, but most of them used it in batches," says Ian MacKinnon, innovations manager at Thomas Swan. "The products are not much different from what you get using something else as solvent. Because under those conditions, what you get is the product of thermodynamic equilibrium. And in general that is not much changed by changing the solvent."
The breakthrough was realizing that the technology needs to be applied to a continuous process, MacKinnon says. "By doing that, you gain one adjustable variable, which is the kinetics of the different reactions taking place. You can select a set of conditions under which the reaction you want is favored and the rest are suppressed."
Thomas Swan's new facility currently is producing trimethylcyclohexanone from isophorone, a reaction that displays the fine-tuning that continuous processing with supercritical CO2 allows. When carried out in liquid acetone, the reaction is very slow and conversion is only up to 70%, MacKinnon says. With continuous processing in supercritical CO2, quantitative conversion of just the carbon-carbon double bond is achieved at modest temperatures. The carbonyl double bond is untouched.
Initially focused on catalytic hydrogenations, Thomas Swan now is examining other chemistries, including Friedel-Crafts alkylations. Here the comparison with the conventional method is "quite startling," MacKinnon says. "Instead of using an alkyl halide, you use an alcohol. Instead of aluminum trichloride or hydrogen fluoride as catalyst--both are pretty filthy--it's done over an acid resin. Instead of multiple alkylation, you get selective monoalkylation. And the by-product is water."
Christopher M. Rayner, a reader in organic chemistry at the University of Leeds, in England, is skeptical that continuous supercritical fluid reactors will find wide use in advanced pharmaceutical synthesis. "Most of the reactions that have been studied involve relatively small molecules and relatively simple transformations," he says.
Still, Rayner, whose research includes complex chemistries in supercritical fluids, believes the technology will become more mainstream. Applications in dry cleaning and polymerization chemistry are demonstrating the technology's reliability, he says. "Eventually, it will filter to other areas, including fine chemicals and pharmaceuticals."
Overall, continuous processing for advanced intermediates and active ingredients is rare in fine chemicals manufacturing. One reason is that companies are heavily invested in batch vessels and are reluctant to discard them. Another is the natural aversion to idle assets.
"There's always a danger when one embraces new technology and makes a humongous investment," says Bill Heggie, vice president for process chemistry at pharmaceutical manufacturer Hovione. "You might have projects that can use the technology. But a few years down, as is common in this business, none comes to fruition. Then you have a big white elephant."
Siegfried, a Swiss supplier of pharmaceutical intermediates and active ingredients, knows all about white elephants. "A few years back, everyone was trying controlled particle size distribution by milling, so we purchased a special mill with a classifier," Hans-Rudolf Marti, head of development at Siegfried, tells C&EN. "But the concrete application on a large scale was difficult to sell." Now, he says, Siegfried acquires a new method only if a project requires it or if it offers a clear opportunity to improve chemical conversion.
Statistical methods and automation likely will be adopted more widely. "People in industry have no quibbles about improving processes, even though they may not completely understand why things happen," Anderson says. "By being smart with their experiments, they learn more about the system and they produce results that are much more effective."
CHOCK-FULL OF DATA Multidimensional contour plots show the predicted yield and selectivity of aerobic oxidation with bubbling oxygen of 2.4 mmol of acetophenone in 12 mL of tert-butyl alcohol at a reaction time of five hours. The responses were studied as a function of three variables: amount of the base potassium tert-butoxide, temperature, and amount of the catalyst 1,3-dinitrobenzene. Red contour lines show variation in yield; blue contour lines show variation in selectivity. Both yield and selectivity are poor under the conditions defined by plot a. When the amount of catalyst is 5% (plot b), selectivities improve but yields are no more than 75%. The best conditions are found at 10% catalyst (plot c). The reaction begins to achieve 100% selectivity when the base is around 5.0 mmol and the temperature is about 73 °C. Under those conditions, yield is only 75%. If the amount of base is increased to about 8.0 mmol and the temperature is raised to about 100 °C, yield rises to 90% [J. Org. Chem., 67, 7493 (2002)].