Chemical & Engineering News,
March 27, 1995

Copyright © 1995 by the American Chemical Society.

Supercomputers: Bringing megabytes to the masses

A National Science Foundation panel recently defined supercomputing as a computational and communications capability that allows individuals and groups to extend their ability to solve research, design, and modeling problems substantially beyond what was previously available to them. This results-oriented definition marks a significant shift in how supercomputers are viewed. The world of supercomputers is changing in many ways, and with their power now reaching the desktop at an acceptable price, chemistry is among the scientific fields that stand to benefit.

Edward R. McCracken, chairman and chief executive officer of Silicon Graphics Inc., Mountain View, Calif., offered an insider's view of these changes this past November in Washington, D.C., at Supercomputing '94, the annual benchmark conference of the high-performance computing industry sponsored by the Institute of Electrical & Electronics Engineers.

McCracken noted the change from "big science" - the Superconducting Super Collider - to "networked science" - the Human Genome Project. "The shift from big science to networked science combined with the constraints of government funding for basic research," he said, "means that it's no longer viable to make the fastest supercomputer you can and hang the expense."

It becomes increasingly important instead to map the available computational cycles to the work being done, McCracken stressed. With today's technology, the fastest computer just might not be the most expensive. This is particularly important for supercomputing, he added, because the pool of scientists who need supercomputers is growing very rapidly.

Supercomputers, McCracken said, have been treated more like national landmarks - "something that you have to travel to get to and that tells you more about the past than the present and the future." It's necessary, he said, to reconnect supercomputing with the rest of the computing industry.

According to McCracken, the benefits of scalability and interoperability have been largely missing from the high-performance computing arena. But that has to change. Otherwise, he said, the industry is missing a huge opportunity to address real-world computational needs, which tend to scale up and down depending on the application and which involve different kinds of computers from different suppliers.

The microprocessor, McCracken said, is the means the industry has to bring scalability and interoperability to supercomputing. As an example of the growth in power, he noted that reduced-instruction-set computing (RISC) microprocessors delivered fewer than 10 million floating-point operations per second (MFLOPS) less than 10 years ago. Today, new microprocessors, such as those from IBM and Silicon Graphics, perform at rates of hundreds of MFLOPS. Speed has climbed as well: over the past 10 years, microprocessor clock rates have increased by a factor of 50, to more than 200 MHz.

But speed isn't everything, and neither is the MFLOPS count. One goal is to provide the capability to solve the vast majority of supercomputing problems at a price people can actually afford. "Price matters, too," he said, "and microprocessors can get even more powerful in the future, solve even more problems, and do it at a fraction of the cost."

Moreover, McCracken said, there is a need to think not of supercomputing centers but rather of supercomputing networks. "Soon," he said, "we'll have a world not of 10 to 20 supercomputing centers but a world of 10,000 to 20,000 supercomputers, all networked together. In this world, scalability includes a horizontal dimension - of more users, more sites, and more sources of information."

McCracken's evolutionary model for supercomputing thus has three steps. The first step is what he calls the analysis shift - brought about by computing in general and supercomputing in particular. These technologies made it possible to tackle and solve problems that had previously been beyond anyone's grasp.

The second step is what McCracken terms the insight shift, which was brought about when computers began to communicate visually. This capability meant the analytic power of supercomputing could be coupled with the enlightening power of three-dimensional graphics images.

The third step is the access shift. This is where the analytic and presentation powers that have been amassed are made available to a much broader community of people who want to learn and understand.
