

The modern pharmaceutical industry traces its origins to two sources: apothecaries that moved into wholesale production of drugs such as morphine, quinine, and strychnine in the middle of the 19th century, and dye and chemical companies that established research labs and discovered medical applications for their products starting in the 1880s. Merck, for example, began as a small apothecary shop in Darmstadt, Germany, in 1668, only beginning wholesale production of drugs in the 1840s. Likewise, Schering in Germany; Hoffmann-La Roche in Switzerland; Burroughs Wellcome in England; Etienne Poulenc in France; and Abbott, Smith Kline, Parke-Davis, Eli Lilly, Squibb, and Upjohn in the U.S. all started as apothecaries and drug suppliers between the early 1830s and late 1890s. Other firms whose names carry recognition today began with the production of organic chemicals (especially dyestuffs) before moving into pharmaceuticals. These include Agfa, Bayer, and Hoechst in Germany; Ciba, Geigy, and Sandoz in Switzerland; Imperial Chemical Industries in England; and Pfizer in the U.S.

REMEDY Patent medicine recipes were not, in fact, patented, though the formulas were usually a secret. COURTESY OF THE NATIONAL LIBRARY OF MEDICINE

A merging of these two types of firms into an identifiable pharmaceutical industry took place in conjunction with the emergence of pharmaceutical chemistry and pharmacology as scientific fields at the end of the 19th century. Oriented to identifying and preparing synthetic drugs and studying their impacts on pathological conditions, both disciplines were intimately linked with the rise of the industry.

Pharmaceutical firms, first in Germany in the 1880s and later in the U.S. and England, established cooperative relationships with academic labs. The resulting exchange of research methods and findings drove a focus on dyes, immune antibodies, and other physiologically active agents that would react with disease-causing organisms. Postulated by Paul Ehrlich in 1906 following more than a decade of research, the concept that synthetic chemicals could selectively kill or immobilize parasites, bacteria, and other invasive disease-causing microbes would eventually drive a massive industrial research program that continues to the present.

Already in the early 19th century, chemists were able to extract and concentrate traditional plant-based remedies, giving rise to treatments such as morphine and quinine. By the start of the 20th century, applying similar methods to animal systems resulted in the isolation of epinephrine (adrenaline) as the first hormone that could be used as a medicine. Meanwhile, synthetic organic chemistry evolved as an industrial discipline, especially in the area of creating dyestuffs derived from coal tar. It was only a short step from staining cells to make them more visible under microscopes to dyeing cells to kill them. Chemists soon modified the raw dyestuffs and their by-products to make them more effective as medicines. Early products of this research continue to have applications today; for example, N-acetyl-p-aminophenol, the active ingredient in Tylenol and Panadol, is a fast-acting metabolite of the analgesics acetanilide and phenacetin, created in German laboratories in the 1880s. In 1897, a chemist at Bayer, Felix Hoffmann, first synthesized aspirin, another staple of our medicine cabinets. The end of the 19th century also saw the development of several important vaccines, including those for tetanus and diphtheria.

A theory relating chemical structure to pharmaceutical activity emerged from the interplay of experimental results from animal and human tests using vaccines, antitoxins, and antibodies with chemical knowledge about dyes and their molecular structures. This structure-activity theory inspired Ehrlich to pursue a long and systematic course of research that resulted in the antisyphilitic Salvarsan, often considered the first systematically invented therapy.

The progressively more important role of the chemist and chemical science in pharmaceuticals in the early 20th century is mirrored in the history of the American Chemical Society's Division of Medicinal Chemistry. It was founded in 1909 as the Division of Pharmaceutical Chemistry, one year after ACS instituted a divisional structure. Chemists in the U.S. had gained new stature and industrial employment due to the requirements for accurate analysis of medicines contained in the 1906 Food & Drugs Act. But U.S. chemists only rarely had the freedom to create new drugs, and relatively few companies manufactured complex therapies. Those activities were largely monopolized by German chemists working in conjunction with the major German chemical companies. World War I blockades forced U.S. chemists to replicate German processes for producing drugs such as aspirin; Salvarsan; and Veronal, a powerful hypnotic useful in easing the pain of battle wounds. In 1920, the ACS division renamed itself the Division of Medicinal Products to reflect the wartime change in focus from analysis to synthesis. Edging ever closer to research functions, in 1927 the division took on its present name.

DYESTUFFS The press room (photo at top) of the Althouse Azo Dye Manufacturing Plant, Reading, Pa., in 1946. Dyes were among the first substances investigated for pharmaceutical activity. COURTESY OF CHEMICAL HERITAGE FOUNDATION COLLECTIONS
A Merck delivery truck in front of company headquarters in New York City, circa 1908. COURTESY OF MERCK

While largely unregulated by government bodies prior to the 20th century, the pharmaceutical industry faced challenges in differentiating its products from those of patent drugmakers, whose secret recipes, in fact, were not patented. Professional bodies, including national physicians' associations, pharmacists' groups, and national formularies (which trace their origins to a 1498 pharmacopoeia for the apothecaries of Florence, Italy), set manufacturing standards and occasionally exposed false claims made about medicinal ingredients. The development of diphtheria antitoxin in the 1890s and subsequent cases of inactive or contaminated doses led the health ministries in Germany and France to test and oversee biologicals; likewise, the U.S. Hygienic Laboratory was authorized to license manufacturers under the 1902 Biologics Control Act.

Government regulators' authority to remove products from the market or constrain advertising claims, however, was limited in the U.S., Europe, and elsewhere. Larger companies supported additional legislative interventions, including the 1906 Food & Drugs Act in the U.S. and similar laws in several European countries that prohibited adulteration and forced manufacturers to reveal ingredients on product labels.

Nevertheless, at the start of the 1930s, most medicines were sold without a prescription, and nearly half were compounded locally by pharmacists. In many cases, physicians dispensed medicines directly to patients; companies often supplied physicians with their favorite formulations. While the medical profession was well-established in Europe and America, the pharmaceutical industry was only beginning to develop medicines to treat pain, infectious diseases, heart conditions, and other ailments. Direct application of chemical research to medicine appeared promising, but only a few substances (newly isolated vitamins and insulin among them) were more effective than treatments available at the turn of the century. The industry's position at the crossroads of science, medicine, and growing health care markets nevertheless set the stage for explosive growth.

NEXT PAGE: The pharmaceutical "golden era": 1930-60


The Top Pharmaceuticals
That Changed The World
Vol. 83, Issue 25 (6/20/05)