"I see modeling and simulation as a critical, enabling technology essential today to capture, test, integrate, transfer, and institutionalize knowledge acquired along the value-adding and information supply chains from our suppliers and their suppliers to our customers and their customers. Those companies that use these tools effectively will provide increasing value to the marketplace. Those that do not will be pushed aside."
- James A. Trainham, director of engineering research and development, DuPont.
"The model integrates the organization. It is the vehicle that conveys knowledge from research all the way up to the business team. And it becomes a tool for the business to explore different opportunities and to convey the resulting needs to manufacturing, engineering, and research."
- Irving G. Snyder Jr., director of process technology development, Dow Chemical.
These statements by Trainham and Snyder capture the sense of movement contained in the information undercurrents swirling below the surface of the chemical industry today. Companies in the industry are at the beginning of a major transformation in how they operate: how they focus their research and engineering design activities, how they run their plants, how they decide which products to market, and how they make tactical and strategic business decisions. Information technology is at the heart of the transformation.
As might be expected at such an early stage, approaches to the information future differ from company to company, sometimes from department to department within a company. But one common emerging feature is that communications among subdivisions in the organization - research, engineering, manufacturing, marketing, and business management - are likely to be increasingly centered on process modeling and simulation.
The information revolution is indeed poised to transform chemical business in a major way.
It is a theme running through a concept at Dow called "the plant of the year 2000," which the company has officially accepted as its vision of what it wants its facilities to look like in the next century. "This has been endorsed by all of the top manufacturing people, it has been reviewed by our business people, it's been reviewed with R&D, and it's actually been reviewed by key members of our board of directors," Snyder tells C&EN. "And there isn't anybody that doesn't say, 'That sounds fantastic, that is really exciting, I just can't wait.'"
How to get there from here and how each of those groups connects its actions to facilitate it is the real challenge, Snyder admits. Nevertheless, he says, "The thing is catching hold."
Trainham points out that chemical businesses and processes have for a long time been run on inferential information and run pretty well. However, he says, the issue today is that companies are pushing the envelope toward perfection and must wring every single quality pound out of their plants. "The only way you can do that," he says, "is through fundamental understanding and capturing of that knowledge in models."
Process modeling and simulation go hand in hand. A process model is a mathematical representation of a production process. Simulation is the use of the model to predict a plant's performance and its economics. Until fairly recently, their application has been a rather specialized design province of chemical engineering.
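The distinction can be made concrete with a toy sketch. The code below is an illustration only, not any vendor's product; the conversion correlation, feed rate, and price figures are all hypothetical. It defines a one-equation "model" of a reactor and then "simulates" by evaluating the model to predict both production rate and a simple economic margin.

```python
# Toy process model: fraction of feed converted to product as a
# function of reactor temperature. The correlation is invented for
# illustration; a real model is far more rigorous.
def conversion(temp_c):
    """Fraction of feed converted at a given temperature (0..0.95)."""
    return min(0.95, 0.40 + 0.005 * (temp_c - 100))

# Simulation: use the model to predict performance and economics.
# Cost and price figures ($/kg) are hypothetical.
def simulate(feed_kg_h, temp_c, feed_cost=0.50, product_price=1.20):
    product_kg_h = feed_kg_h * conversion(temp_c)
    margin_per_h = product_kg_h * product_price - feed_kg_h * feed_cost
    return product_kg_h, margin_per_h

product, margin = simulate(feed_kg_h=1000, temp_c=160)
```

Running the model at different temperatures or feed rates is what turns a static mathematical representation into a predictive simulation.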
Not surprisingly, the companies producing software for modeling and simulation have been responding to a new business environment for application of such software. For example, officials at Simulation Sciences Inc. (SimSci), Brea, Calif., think of modeling and simulation as an enterprise application. Those at Aspen Technology Inc. (AspenTech), Cambridge, Mass., speak of life-cycle modeling and integrated modeling technology.
"What we're really talking about instead of engineering tools and services," says Y. L. Wang, president and chairman of the board of SimSci, "is enterprisewide business tools spread over different types of business applications in an integrated way, as an integral part of the tools that the company needs to operate its business." SimSci has been trying to push in that direction for several years now, according to Wang.
Joseph F. Boston, president of AspenTech, says the concept of integrated modeling technology employed in life-cycle modeling has become the theme for the company's vision of the future. Life-cycle modeling - sometimes referred to as companywide modeling, Boston says - is considered the application of modeling technology to improve profits and engineering productivity throughout the life cycle of a process, from R&D through manufacturing.
Behind such thinking, changes are taking place in the user base for process simulation. Michael C. Rowland, SimSci's director of marketing, describes its characteristics.
The end user today, Rowland says, typically is much more computer literate. There generally are two types of users in today's plants, he notes. One is an older engineer, with much more experience with the plant, who may have had some experience using simulation software on a mainframe installation, but who generally hasn't been all that impressed with the kinds of software available on personal computers (PCs). The other type is an engineer who is younger, who has grown up with PCs, is very comfortable using computers to do technical calculations, and because of exposure to popular commercial packages, such as word processors and spreadsheets, has a very high expectation of what the software should look and feel like.
"That shift is still going on," Rowland says. "Obviously, the former type is slowly decreasing and the latter type is slowly increasing."
But there are other ways in which the user base is changing as well, particularly with larger companies. It's changing, Rowland explains, in the sense that as companies in general downsize their engineering staffs, the requirements for using simulation software are, if anything, getting larger. Not only is such software needed to keep the plants running well, he says, but also to comply with all of the government regulatory and reporting requirements that chemical process industry companies face today.
Because of the profitability picture for these companies, Rowland says, they generally just don't have the large central engineering staffs they used to have. Or if they do, the staffs have been cut back significantly. So, while the workload hasn't changed - if anything, it's greater - there are fewer people available to do the work. "So what we find," Rowland says, "is that a greater percentage of the engineering staff within a given company needs access to the results and the benefits that come from simulation software."
But more than that, Rowland adds, "What the large companies are telling us is, 'Look, the information that process simulation provides is useful to us not only as engineering information but as business information. We need that information so we can perform decision-support functions as a business unit. Our accounting department needs to get these results. Plant management needs to get these results. The people who are maintaining the plant need to get these results.'"
The challenge for the software company, Rowland tells C&EN, is that these groups are all used to seeing results in their own formats. Senior managers like to look at reports a certain way. Accountants like to look at them another way. So, Rowland says, "How can we take all of the results that come out of this calculation engine and make them available easily and quickly and in the format in which certain job types are used to seeing them?"
Such changes in the market have been experienced by AspenTech as well. Indeed, this changing industrial environment provided the theme for Aspenworld 94, the third in a triennial series of invitational technical conferences hosted by the company. Held in Boston in November, Aspenworld 94 took an in-depth look at the idea of process modeling as an enabling technology in reengineering of the process industries. Speaking in a lead-off plenary session, Trainham and Snyder helped to set the tone.
Addressing the current business and technological trends, Trainham captured their meaning for DuPont. Being world-class is no longer good enough, he said. There are many world-class companies battling for a piece of the pie. Only the best will win. Hence, according to Trainham, DuPont's goal is to become "best in the industry" in its chosen chemicals, materials, and energy businesses by possessing plants "that have the flexibility to make products our customers want, when they want them, with the highest quality, at a price they are happy to pay, all the time, every time, no excuses, no exceptions - period."
To achieve this goal, DuPont has adopted an eight-point approach to becoming "best in the industry." And each has a modeling or simulation component.
The first six points lead to construction of a plant: discovery of a superior product that can be manufactured with world-class chemistry; understanding of the process; translation of that understanding into a viable, dynamic process model; confirmation of the model in pilot facilities, as necessary; development of on-line analyzers to get compositional and physical property data in real time; and design of the most effective manufacturing facilities. The process model developed in step two or three, refined in steps four and five, and used for design in step six will then be used to run the plant - step seven - perhaps with model-predictive control.
The final point deals with operations optimization, the kind of modeling needed to make a business run exceptionally well. Manufacturing plants must get the raw materials they need when they need them. Products must be manufactured so that customers get what they want when they want it. Production facilities must be maintained in a way that ensures their reliability, safety, and operability.
"Outstanding supply-chain management cannot make a second-rate manufacturing process world-class," Trainham said, "but second-rate supply-chain management can make a best-in-the-industry manufacturing plant perform no better than a mediocre one."
And, Trainham emphasized, to do everything well takes an array of modeling tools. Tools are required to meet the different needs and levels of knowledge at each point along the chain - including molecular modeling, process synthesis and conceptualization, process simulation (both dynamic and steady-state), product and facilities design, operations optimization, and environmental, atmospheric, financial, and business modeling. Trainham views the modeling tools available today as far better than those available even a few years ago. They can be used to simulate chemical and business processes with enough accuracy for making business decisions and doing real-time process control.
At Aspenworld 94, Dow Chemical's Snyder had a provocative take on what the role of modeling in a company could or should become. With concepts more explicit but complementary to those of Trainham, he sketched in what he sees happening in terms of what he called plant operational paradigms. He views plant control strategies as having evolved over the years with projections into the future through 10 somewhat hierarchical levels.
Plant operations have progressed a long way from the paradigm of many decades ago: "Produce a product without blowing the plant up." Operating practice then advanced through the paradigm level of "produce a product with instrumentation," and in the 1970s moved to that of "produce a product effectively and efficiently."
Now plant operating personnel are facing the next paradigm level: "Produce a product optimizing multiple variables." What this translates into, Snyder explained, is "run the plant with less maintenance, higher yield, less energy, quality at least as good or better than was produced before, and at the same time increase productivity, which means fewer people and less capital."
Moving to this paradigm is a shock, Snyder admits, but he maintains it must be done if a company intends to remain world-class competitive. It's no longer adequate to optimize individual areas of interest. Rather it's now necessary to optimize a whole series of interrelated areas.
The next major level in Snyder's scheme is "control the plant rather than the unit operation." This is a big step, he pointed out, that starts with enhanced understanding of the chemistry taking place in the plant.
"We don't understand the chemistry in any of the processes we operate at Dow," Snyder said, making his point. "We don't understand them in the kind of details that I would like to understand them."
It's generally known, of course, that A plus B makes C, Snyder explained. But he presented a litany of what he would ideally like to know: the reaction mechanism; the transient intermediates that A and B go through in producing C; the reaction mechanisms of all the by-product reactions; which of all the steps in the reaction mechanism are kinetically controlled, which mass-transfer controlled, and which heat-transfer controlled; if the reaction is homogeneous, what takes place at every point in the reactor at every point in time; and if the reaction is heterogeneous, the diffusion characteristics of raw materials to the catalyst surface or into the catalyst, as well as the reaction, reaction mechanism, and by-product reactions within the catalyst, the diffusion characteristics of products away from the catalyst, and the nature of the heat transfer around the catalyst particle.
With something approaching this level of understanding, Snyder said, "I think it will be possible to increase the output of most of our reactor systems between 50 and 100 percent - and there'll be some 200 to 300 percents also."
This level also includes integrated control - integrating the control strategy across an entire unit operation or series of unit operations so that that section of the plant may be optimized as a total entity.
The next level raises the ante: "optimization of the plant with continuous technical supervision." This is a level where control will be based not solely upon the expertise of the operators and plant personnel, but upon the knowledge of all the people who have worked on the development of the chemistry and the process design. It would involve development of a model of the entire plant by using rigorous chemical engineering techniques. The chemistry of the primary reactions and by-product reactions would be modeled.
It would thus be possible to run the plant as a model on a computer and test out operating scenarios - higher rates, different feedstocks, modified operating conditions - before they are tried on the actual plant. The model could also be used for operator training and to test plant start-ups and shutdowns. Moreover, the model would run in real time, parallel to the plant, for model-predictive control.
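What "test before trying on the actual plant" might look like in code can be sketched as follows. The plant model here is a stand-in, a single hypothetical correlation with made-up numbers, where a real whole-plant model would embody rigorous chemistry and engineering.

```python
# Hypothetical whole-plant model: predicted product rate (kg/h) for a
# given feed rate and operating temperature. Invented for illustration.
def plant_model(feed_kg_h, temp_c):
    conv = min(0.95, 0.50 + 0.004 * (temp_c - 120))
    return feed_kg_h * conv

# Candidate operating scenarios, screened on the model first so that
# no change is tried on the actual plant until it looks promising.
scenarios = [
    {"name": "base case",      "feed_kg_h": 1000, "temp_c": 150},
    {"name": "higher rate",    "feed_kg_h": 1200, "temp_c": 150},
    {"name": "hotter reactor", "feed_kg_h": 1000, "temp_c": 170},
]

results = {s["name"]: plant_model(s["feed_kg_h"], s["temp_c"])
           for s in scenarios}
best = max(results, key=results.get)  # scenario with highest predicted rate
```

The same loop structure serves operator training and start-up rehearsal: operators change the inputs and watch the model respond, with no risk to the real plant.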
And that's not all. The next step at this level is to have an interactive expert system. "What I'd really like to have," Snyder said, "is all the chemists and engineers who ever worked on this process standing outside the control room door, available 24 hours a day, 365 days of the year, with instant recall." An alternative, he added, is to try to incorporate their knowledge into an expert system that could monitor the plant and communicate with the operators along with the model-based control.
Shutdown-free plants should also be possible, according to Snyder. The idea is to integrate the knowledge of the plant personnel - maintenance technicians, mechanical engineers, material scientists, design engineers, and others. "This is certainly a technical challenge," he admitted. "But the biggest challenge here is to step out of our present paradigm and accept the fact that 10 years of continuous operation is possible and desirable."
The next level - "optimization of the site" - would eliminate control rooms for individual plants. When there is a control system that permits hands-off operation of a plant, and there is an expert system to coach the operating personnel through unusual events, then a centralized control room serving many plants is certainly possible, Snyder explained.
Which leads to the next level: "economic optimization of the business." This level would employ business models to provide the business team an opportunity to explore different business strategies or scenarios.
On one day, for example, the business decision might be to optimize plant output because of a need for more product. Another day, or week, the decision might be to optimize the costs from the plant - for example, by selecting some alternative feedstock. Optimization of energy consumption or minimization of undesirable effluents or emissions from the plant, or some combination of such factors, might be other goals.
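Switching the business objective over a single plant model can be sketched as below. Everything numerical here is hypothetical (the plant correlations, the energy duty, the price and cost figures); the point is only that the same model is re-optimized as the business goal changes from day to day.

```python
# One hypothetical plant model, scored against different business
# objectives. All correlations and prices are invented for illustration.
def plant(feed_kg_h, temp_c):
    conv = min(0.95, 0.50 + 0.004 * (temp_c - 120))
    product = feed_kg_h * conv
    energy_mj = 2.0 * feed_kg_h + 15.0 * (temp_c - 120)  # made-up duty
    return product, energy_mj

def score(objective, feed_kg_h, temp_c):
    product, energy = plant(feed_kg_h, temp_c)
    if objective == "max_output":   # need more product today
        return product
    if objective == "min_energy":   # minimize energy consumption
        return -energy
    if objective == "max_margin":   # hypothetical price/cost figures
        return 1.2 * product - 0.5 * feed_kg_h - 0.01 * energy

# The same grid of operating points, re-ranked as the objective changes.
grid = [(f, t) for f in (800, 1000, 1200) for t in (140, 160, 180)]
best_for_output = max(grid, key=lambda p: score("max_output", *p))
best_for_energy = max(grid, key=lambda p: score("min_energy", *p))
```

The two "best" operating points differ, which is exactly the business team's lever: the plant model stays fixed while the objective function encodes the day's business decision.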
Also needed at this level, Snyder believes, is effective interaction between a business model and the plant model. Through their interaction, he said, the plant/business model could explore many possibilities, some of which may not even have been considered by the business team.
At the next level comes "direct customer interaction." There is a real need, according to Snyder, for more effective product models, residing on the supplier's computer or downloaded to a customer's computer, that allow customers to take physical properties of products and design or engineer their own end-use applications.
The ultimate goal in Snyder's vision: "The customer decides which product he wants. He interacts with the business model within the company. He then is given a price, he's given availability, he enters an order. The business model recognizes him as a customer that pays his bills on time, accepts the order, schedules the order in the scheduling model that interacts with the plant model. The product is produced. The product is shipped and invoiced. All without human interaction."
An integral part of most of the paradigms is the process model. To Snyder, the model really becomes the tool that integrates the organization. It is the way that data, information, and knowledge are conveyed from research to engineering to manufacturing and on to the business team.
For example, Snyder said what he would like to see at Dow is that whenever a research program begins, the staff develop a model of the phenomenon they want to research. Initially, the model needn't be more complicated than the chemistry: A + 2B → C. As the chemistry develops, however, the next experiments could start bringing in some of the chemical kinetics. The model would continue to grow to incorporate increased understanding of the chemistry and process. "The model," Snyder explained, "becomes a way of capturing and preserving the knowledge so that it may be passed on to the engineers who will design the plant."
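That progression, from bare stoichiometry to a model that includes kinetics, can be illustrated with a minimal sketch. The rate law and rate constant below are hypothetical (arbitrary units), chosen only to show how the model grows as understanding grows.

```python
# Stage 1: stoichiometry only. For A + 2B -> C, the product is limited
# by whichever reagent runs out first.
def yield_stoichiometric(mol_a, mol_b):
    return min(mol_a, mol_b / 2)

# Stage 2: the model grows to include kinetics. Assume an elementary
# rate law r = k*[A]*[B]^2 (hypothetical k, arbitrary units) and
# integrate a batch reactor with a simple explicit Euler step.
def yield_kinetic(conc_a, conc_b, k=0.05, t_end=10.0, dt=0.001):
    a, b, c = conc_a, conc_b, 0.0
    t = 0.0
    while t < t_end:
        r = k * a * b * b   # reaction rate at current concentrations
        a -= r * dt         # A consumed 1:1 with reaction extent
        b -= 2 * r * dt     # B consumed 2:1
        c += r * dt         # C produced
        t += dt
    return c
```

The kinetic model predicts how much C is made in a finite batch time, always less than the stoichiometric limit, and each refinement (by-product reactions, mass-transfer terms) would be added the same way as the research matures.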
The process engineers then would take the model and consider various design alternatives, arriving at one that gives the best economic return. In the course of its work, process engineering might discover gaps in the model, and these would then become the basis for additional research to fill them in.
The model would then pass on to the plant, where it would become part of the plant's control strategy. And finally, the model should be made available to the business group so the commercial people could explore the profitability of different business scenarios. Some scenarios the business people would be interested in might not be covered by the model. But, Snyder explained, rectifying this situation would become the basis for new research work in projects that would be readily understood and supported by the business management.
Other speakers at the Aspenworld 94 conference fleshed out the process modeling scene in the chemical industry today. In one manner or another and to one degree or another, engineers in companies are addressing process modeling as an expanding endeavor and one that integrates many activities across the corporate enterprise. At the same time, chemical engineers at universities and other institutions are busy addressing some of the more generic technical concerns involving information management and data exchange.
All of this activity amounts to an information revolution that is impacting how companies will operate in the future, according to Arthur W. Westerberg, a chemical engineering professor at Carnegie Mellon University, Pittsburgh. In a presentation prepared with colleagues that included consultant Jerry L. Robertson, Eswaran Subrahmanian of the research faculty in the university's Engineering Design Research Center (funded by the National Science Foundation), and chemical engineering Ph.D. student Mark E. Thomas, Westerberg discussed work on analyzing what might be termed the philosophical underpinnings of that revolution.
Almost all institutions are undergoing a reengineering and reshaping of their patterns for organizing work and the workplace, Westerberg noted in Boston. In the process, traditional folklore about management truths, new management ideas, and concepts about managing for and achieving quality products are all being examined or tested. Westerberg sees a parallel between the current search for improved work processes and organizational design and those efforts aimed at improving chemical and petroleum process design. Computer-based information technology will have a major impact on both, he ventured.
The current reengineering is the latest in a series of transformations in business thinking that began early in this century. At that time, the idea of scientific management came into fashion. It replaced individual judgment with a "scientific" approach to achieving the most efficient method, and it brought with it the explicit job descriptions, time and motion studies, and work method standards expected to ensure work efficiency. Management's functions in this scheme are to enforce work standards and provide the best tools and working conditions.
Changes in thinking led to a broader view of management in the early 1970s, one in which management was seen as establishing the mission, setting objectives, and providing resources for the institution. This view placed a particular emphasis on the importance of the customer.
The concept of total quality management emerged in the 1980s. Among its elements are an accurate and complete understanding and view of the customer, a need to understand the entire work process, systematic measurement of performance to ensure minimum variation, and the need to continuously improve the work performing system.
Westerberg likens the earlier methods for analysis of management to chemical engineering unit operations. The more recent quality approaches are analogous to overall process descriptions - the integration of parts and their context with the outside world. Analysis of the overall process as an entity and examination of the interactions of the components to determine how to improve the process, he said, appear to be characteristics of both.
Now the information revolution is beginning to have an impact. Business philosophers see a new transformation of organizations. Among the characteristics, Westerberg noted, are flatter organizations with less bureaucracy, networks where everyone learns from everyone, where intuition becomes valuable because there is so much data, and where much of the work is performed by contract rather than hired staff. Middle management is replaced by computers and people managers are replaced by independent, competent, and self-confident self-managers.
Today, the analysis of work emphasizes integration rather than decomposition. Hence, further echoing the observations of others, Westerberg noted that information technology becomes a key enabler for the reengineered corporation. It creates opportunities to change the work process dramatically rather than just performing the same operations faster.
Westerberg recognizes that although there are elements common to chemical engineering and business processes, one basic difference is the human element in business processes. Morale, participation with high-performance teams, innovation, motivation to learn, and just plain thinking hard, he pointed out, are characteristics for which information technology has little to offer at present and on which success in both business and process design relies.
Nevertheless, the parallels suggest to Westerberg that current organizational management analysis is not unlike the state of process design analysis when the idea of process synthesis - the assembly of individual unit operations and process elements into alternative process flowsheets - was embryonic. Hence, institutional management may benefit from the systematic techniques developed and proven for process synthesis.
In this regard, Westerberg pointed out, the technology of information management/consolidation - the discovering or creating, abstracting, interpreting, organizing, using, and sharing of information - is evolving rapidly. Although the design activity in any corporation represents only a small fraction of total resources involved or total information flow in the corporation, it is strongly influenced by changes in the information flow.
In envisioning a simple total corporate information flow loop for a typical chemical or petroleum company, Westerberg finds that the major subdivisions of the overall organization form four nodes in that loop. For example, a procuring node represents obtaining raw material or other inputs, manufacturing represents the process of transforming inputs to outputs, and marketing represents dialogue with the customer. Corporate management is the other of the major nodes. Further subnodes - such as operations, maintenance, design, construction, and research within manufacturing - depict major activities within each of the major subdivisions, and further subdivisions in most organizations are likely.
Westerberg cited three points that can be made for this scheme today: Almost everything done within the design subnode is influenced by some information from one or more of the other subnodes; organizational control, as usually practiced, requires that information based on current data be approved by the management of each of the subnodes before it is shared with other nodes; and all information is not shared among all nodes.
However, in looking to the future, Westerberg expects that as information becomes more broadly shared and as organizational culture changes, the information available for design activities will also probably change. It appears likely, he said, that decisions now made elsewhere in the organization and handed to designers as "functional or job specifications" will increasingly be made by the design team itself. For example, data about costs of equipment, catalysts, utilities, and the like, will be available in real time. Product quality requirements and projected sales data would also be current, as would knowledge about the availability of financial resources. The design activity would thus have broad corporate ramifications.
There is, Westerberg concluded, an "exciting future for designers."
Many companies that have engaged in process modeling since its early days have had operations employing the technique stretching back perhaps 25 years or more. During that time, modeling has progressed considerably and has assumed an ever greater influence on process design work. Even so, only in very recent times has process modeling burst beyond the boundaries of a rather specialized activity.
There are many reasons beyond simply the availability of better computer hardware and software - although these do play a role. No one experience is typical. But a few snapshots of recent process modeling activities at widely varied companies give an idea of some of the motivations involved and approaches being taken.
"A new era of engineering exists," is how Chris Boehringer sees it. Boehringer, technology manager at specialty chemicals manufacturer Lubrizol Corp., Cleveland, underlined for the Aspenworld 94 audience the changing world within corporations, where realignment, reengineering, and total restructuring have placed a big demand on companies to reduce technical staffs and any kind of costs. With resources more limited, he explained, it has become increasingly necessary for companies to prioritize work and address the most important items.
Lubrizol's engineering endeavor is one that has undergone a realignment, with activities of a formerly large central engineering group shifted to various facilities around the U.S. and the world. But, according to Boehringer, the engineering group believes it can meet its overall corporate goals by increasing its effectiveness through standardizing on and expanding use of such engineering tools as process modeling and simulation.
Traditional engineering and design projects are very data-intensive, Boehringer pointed out. And design iterations, although common within functions, were difficult to do within the overall operation because information wasn't being shared. Inflexible designs and capital overruns are two of the problems resulting from this situation.
To address such considerations, Lubrizol has been implementing a new approach to engineering that deals both with process engineering concerns, such as simulation, and with project engineering concerns, such as cost estimating. By standardizing on engineering productivity software tools in both areas and managing these tools by way of a common network, the new approach has in effect created two linked islands of information. Lubrizol expects this approach to lead to better design, enhanced operability, and shorter project cycles, among other benefits.
A very different type of company, British Nuclear Fuels (BNFL), Warrington, England, views process modeling as a key technology supporting its expansion through increased competitiveness and diversification. So BNFL is actively promoting the greater use of modeling in the company.
Founded in 1971, BNFL is a major supplier of nuclear fuel cycle services throughout the world. It operates two nuclear power plants in the U.K., produces commercial nuclear reactor fuel, supplies uranium hexafluoride to enrichment plants worldwide, and reprocesses irradiated nuclear fuel and manages the waste products.
The company, according to BNFL process engineer Stephen Baker, has been seeking to establish a culture in which process modeling is seen as a natural tool for engineers to use as part of their everyday work. "We must," he said, "move away from the thoughts that modeling is a specialist activity only to be undertaken by company experts."
To this end, the company has so far trained more than 50 engineers from all its sites and divisions in the use of the necessary software and has worked on ways to make the modeling process easier. As a result, Baker said, BNFL engineers are not only producing computer models in process development and design, but are also routinely using models to support their company's plant operations.
Use of process modeling as a way of storing and communicating the process knowledge that has been developed and applied in designing the plant is seen by Baker as particularly important. Development and maintenance of a model, he said, provides the thread linking all the activities that require process data at each stage of the project life cycle. BNFL has made some initial steps toward developing an integrated environment aimed at passing information routinely throughout the different stages of the life cycle.
Air Products & Chemicals, Allentown, Pa., has been a long-time user of process modeling and simulation in its two principal businesses, industrial gases and chemicals. Some 25 years ago, the company began development of its own in-house simulator and thermodynamics/physical properties database. But since the system was customized to meet the needs of cryogenic gas separation (the raw material, air, is free, for example), it was not as effective for chemicals applications as were other simulators being developed by software firms.
About five years ago, Air Products undertook a number of key improvements in its process simulation capability, according to John B. Pfeiffer, director of research and engineering systems at the company. Among the improvements sought were a user-friendly interface to replace the namelist input used with the in-house simulator; a simulator that could run on demand to meet ever-tighter project schedules; a means to eliminate rekeying of information and provide easy transfer of files among the different engineering disciplines working on a project; and a graphical/flowsheet capability to eliminate the need for separate storage of flowsheets and energy and material balance files.
Engineers at Air Products have been working for a couple of years now to link the proprietary design models to a new simulator and link the simulator to the proprietary properties database. These efforts are nearing completion. Pfeiffer described how the company is beginning to see new advantages that process simulation will provide for its engineering work processes.
In the chemicals area, for example, implementing new technology requires close and intense interaction between R&D personnel and process engineers, and between the process engineers and plant operating personnel. Facilitating communications across these interfaces, Pfeiffer pointed out, is of paramount importance for the success of a project or program. "Clear objectives and responsibilities certainly help here," he said, "but simulation tools also play an important role in the work process."
Looking toward closer integration of the company's worldwide engineering efforts in the future, Pfeiffer also noted that the simulation tools will provide a common basis for engineering communication across geographical boundaries. They are, he said, quoting a company official, "a first step toward eliminating geographic boundaries that will enable Air Products to engineer around-the-clock."
The history of process modeling and simulation at DuPont probably stretches back as far as, if not further than, anyone's - more than 30 years. The company's experience over that time encapsulates many of the business issues and technological developments in the evolution of process modeling to the key information role it is assuming today.
Indeed, as of last summer, users at DuPont were logging 300 hours per month for process simulation on the company's Cray C90 supercomputer, according to David L. Filkin, a consultant in the scientific computing division.
Process modeling is just one of four types of modeling at DuPont, along with properties, product, and ecospheric modeling, Filkin explained at the Boston meeting. Among their applications: Properties modeling has been used to predict thermodynamic properties of chlorofluorocarbon alternatives, so that many alternatives could be scouted without the need for making them. Product modeling applied to carpet fiber characterization has been used to predict the appearance of a fiber from its shape before being made. And ecospheric modeling has been used for investigation of ozone depletion and deep-well injection of hazardous waste.
DuPont's strategy, according to Filkin, is not so much one of building the right computing environment but one of building an environment for computing. The goal is to provide in a network everything required - so that, for example, any engineers or scientists needing a Cray supercomputer can pretend they have it on their desktop.
Process modeling began at DuPont in the 1950s, when company engineers developed an in-house software program called Chemical Process Evaluation System (CPES). Written in FORTRAN, it ran on a UNIVAC computer with punched cards for input. Jay R. Balder, manager of the process engineering group at DuPont, noted that until the early 1980s, CPES was the only simulator that could meet the company's modeling needs.
In 1982, vendors offering steady-state simulation software for license arrived on the engineering scene. Also at that time, DuPont introduced distributed computing capability within the company when it purchased Digital Equipment Corp. VAX computers.
Balder recalled that DuPont's first license for a vendor-supplied process simulation software package was acquired in 1983 with the licensing of Process from SimSci. DuPont's organizations are very autonomous, according to Balder, and one part of the engineering organization, critical of some aspects of CPES, was attracted to that software. Another part of engineering was at the same time developing a new version of CPES and promoting its use throughout the company. By exploiting the features of the VAX, the developers created CPES6, which Balder said offered the unique capability of doing interactive simulation, a capability ahead of its time.
With the addition of a third simulation program in 1986 - Aspen Plus from AspenTech - engineers could access any of three powerful steady-state simulation software products. Meanwhile, the company was leasing a Cray supercomputer that was used heavily by scientists for their computational needs, though not by engineers for process modeling and simulation. As a result of internal pressures, however, a joint effort was launched by engineering and the scientific computing division to develop and install a Cray version of CPES.
"It was wildly successful," according to Balder. The engineers loved it because of the rapid turnaround. Eventually, all three simulation programs became resident on the Cray.
In the earlier days, the UNIVAC computer system was expected to be financially self-supporting, with user fees based on time used on the central processing unit. The CPES group also had to be self-supporting and added a surcharge onto the computer charge to generate income for the group. When the engineering department licensed Process for all of DuPont, it did not receive any corporate funds to do so. The VAX computers added a further financial wrinkle: when the programs were installed on a VAX, engineering lost a revenue source.
Other funding complexities persisted until the Cray became available. The scientific computing division made a case that the computer was a corporate asset whose use should be increased. Also, corporate financial support was gained for operation of the Cray computing center, and run-time charges were removed completely.
"These two changes had a significant impact on modeling and simulation," Balder said. "Modeling was free. The number of users increased, the usage of the software increased, and the use of the Cray increased."
The latest chapter in the DuPont history came in the early 1990s, when the company began major efforts to improve the way it purchased goods and services. For the company's modeling activities, the result was convergence to a single software vendor. After engaging in studies and experimentation, and weighing many factors, a team considering the issues concluded that selecting a single process simulator was the right thing to do and by 1992 had settled on Aspen Plus.
Balder pointed out that the company recognized that taking software away from engineers was not something to be done casually. "If the only savings were license fees," he said, "I do not think it would have been worth the lost productivity that would result from convergence. The larger strategic issue involved the hidden costs of following a multiple simulator strategy."
Probably the most important of the costs, according to Balder, involved the physical property database and associated thermodynamics. For a successful simulation, setting up the correct thermodynamics and physical properties is critical, yet important physical property data and thermodynamics were not interchangeable among the different vendors' software programs. Hence, engineers using different simulators with different databases were getting different simulation results for the same process. With convergence on one simulator, only one version of the database would be needed.
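The effect Balder describes is easy to see in miniature. A minimal sketch: the same compound's vapor pressure computed from the Antoine equation with two different regressed coefficient sets, standing in for two vendors' property databases. The coefficients here are illustrative values for water from two different published regressions, not drawn from any actual simulator's database; even small differences like these compound through a flowsheet calculation.

```python
# Sketch of why non-interchangeable property databases give
# different simulation results for the same process.
# Antoine equation: log10(P) = A - B / (T + C), T in deg C, P in mmHg.

def antoine_vapor_pressure(temp_c, a, b, c):
    """Vapor pressure (mmHg) from Antoine coefficients."""
    return 10 ** (a - b / (temp_c + c))

# Two illustrative coefficient sets for water, regressed over
# different temperature ranges (assumed values for this sketch).
coeffs_db1 = (8.07131, 1730.63, 233.426)
coeffs_db2 = (8.14019, 1810.94, 244.485)

temp = 80.0  # deg C
p1 = antoine_vapor_pressure(temp, *coeffs_db1)
p2 = antoine_vapor_pressure(temp, *coeffs_db2)
print(f"database 1: {p1:.1f} mmHg")
print(f"database 2: {p2:.1f} mmHg")
print(f"disagreement: {100 * abs(p1 - p2) / p1:.1f}%")
```

A few percent disagreement in one property correlation is enough to shift predicted column temperatures, duties, and product purities, which is why two engineers running nominally identical models on different simulators could report different answers.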
Thus, after convergence, engineering saw great potential in moving all the physical property data developed for particular processes on particular simulators into one Aspen-based database. But, according to Balder, before this activity got very far, the scientific computing division proposed creating a corporate physical property database with capabilities to meet needs beyond those of simulation alone. This has now been done, mostly by using methods recommended by the American Institute of Chemical Engineers' Design Institute for Physical Property Data.
"In hindsight," Balder said, "we could have created such a database without converging our simulation software. But I believe it was the convergence effort, the decision of our major businesses to work together rather than independently in the simulation arena, that provided the foundation necessary to support our corporate database effort."
There is much left to do, according to Balder. For example, he noted that the company has added to its software to meet the needs of stand-alone PC users, though it still hasn't sorted out its strategy regarding PC-based simulation versus a client-server mode. Having modeling and simulation become valued by the business leadership is a challenge. There is also a need to find ways of increasing the use of modeling and simulation on a day-to-day basis by people in plants and research laboratories.
But, Balder said, "These business challenges coupled with the rapid evolution of computer hardware, telecommunication networks, and simulation software will keep us all busy into the foreseeable future."
In that, he undoubtedly echoes the sentiments of many in the chemical community within their own contexts.