CSC/ECE 506 Fall 2007/wiki1 2 cv
High Performance Computing Trends in Scientific and Engineering Applications
The demand for and development of High Performance Computing (HPC) systems have been driven primarily by the need to solve complex problems within a reasonable amount of time. Such problems arise frequently in a wide variety of scientific and engineering fields.
The field of High Performance Computing is changing and expanding rapidly as new processor designs emerge and new techniques for writing parallel applications are developed. As HPC systems become faster and more efficient, the demands placed on them increase as well.
Hardware used in HPC
Numerous architectures have been used throughout the history of High Performance Computing. Most modern HPC systems are based on either vector processors or commodity general-purpose processors.
Vector Processors
Vector processors are very efficient for certain types of problems because they can perform a single operation on many data elements simultaneously. However, vector systems generally cost more to design and develop because the demand for such processors is smaller than for other types. For some problems, the extra performance of a system built around vector processors justifies the extra cost; for others, the price difference makes alternative designs more attractive (http://www.csar.cfs.ac.uk/about/csarfocus/focus12/application_performance.pdf).
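To illustrate the kind of computation vector hardware excels at, the sketch below shows SAXPY (y = a*x + y), the classic vector operation. The example is illustrative and not from the original article; the point is that each loop iteration is independent, so a vector unit can apply the multiply-add to many elements at once.
<pre>
#include <stdio.h>

#define N 1024

/* SAXPY (y = a*x + y): each iteration is independent, so a vector
 * processor can apply the multiply-add to a whole block of elements
 * of x and y with a single vector instruction. */
void saxpy(float a, const float *x, float *y, int n)
{
    for (int i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];
}

int main(void)
{
    float x[N], y[N];
    for (int i = 0; i < N; i++) { x[i] = (float)i; y[i] = 1.0f; }
    saxpy(2.0f, x, y, N);
    printf("y[10] = %f\n", y[10]);  /* expect 21.0 */
    return 0;
}
</pre>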
Commodity Processors
Many modern HPC systems are now designed around large numbers of commodity general-purpose processors connected by a high-speed interconnect. In these designs, many commodity computer systems are linked over a high-speed network and treated as a single high-performance system known as a cluster.
Clusters built this way are generally inexpensive to design and build because they use processors and components that are already mass-produced for the general public. The popularity of these designs has increased steadily over the past several years (http://www10.informatik.uni-erlangen.de/~deserno/HLRB2003.pdf).
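As a rough sketch of how work is spread across the nodes of such a cluster, the example below uses MPI, the de facto standard message-passing library for cluster programming. The problem (summing the integers 1..N) and its decomposition are illustrative choices, not taken from the article.
<pre>
#include <stdio.h>
#include <mpi.h>

/* Each process sums its own interleaved slice of 1..N; a single
 * collective reduction then combines the partial sums on rank 0. */
int main(int argc, char **argv)
{
    const long N = 1000000;
    int rank, size;
    long local = 0, total = 0;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Process r takes elements r+1, r+1+size, r+1+2*size, ... */
    for (long i = rank + 1; i <= N; i += size)
        local += i;

    MPI_Reduce(&local, &total, 1, MPI_LONG, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("sum of 1..%ld = %ld\n", N, total);

    MPI_Finalize();
    return 0;
}
</pre>
The same program runs unchanged whether the cluster has two nodes or two thousand; the MPI runtime handles moving the partial sums across the network.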
Many commodity processors are now designed with two or more processor cores on a single chip, and the number of cores per chip is expected to grow: Intel demonstrated an 80-core research chip in the spring of 2007 (http://www.intel.com/pressroom/archive/releases/20070204comp.htm). Using such processors in a cluster makes it possible to build a system with hundreds of processor cores at relatively low cost.
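A multi-core chip exposes shared-memory parallelism, which is commonly programmed with OpenMP rather than message passing. The minimal sketch below (illustrative, not from the article) splits a loop's iterations across the available cores.
<pre>
#include <stdio.h>
#include <omp.h>

#define N 1000000

/* OpenMP divides the loop iterations among the cores of the chip;
 * each core fills a disjoint chunk of the array, and the partial
 * sums are combined by the reduction clause. */
int main(void)
{
    static double a[N];
    double sum = 0.0;

    #pragma omp parallel for reduction(+:sum)
    for (int i = 0; i < N; i++) {
        a[i] = 0.5 * i;
        sum += a[i];
    }

    printf("cores available: %d, sum = %f\n",
           omp_get_max_threads(), sum);
    return 0;
}
</pre>
Compiled with an OpenMP-aware compiler (for example, gcc -fopenmp), the same source runs on one core or many without modification.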
Software Applications
High Performance Computing systems are used to solve a wide variety of scientific and engineering problems. One common use is simulating plasma physics in three-dimensional space. Such applications commonly use the Lattice-Boltzmann method to model the flow of a fluid in a given environment.
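For readers unfamiliar with the method, the sketch below shows the collision step of a D2Q9 Lattice-Boltzmann scheme, the computational core of such a simulation. It is a minimal illustration only: a real code would add the streaming step, boundary conditions, and a full grid of cells, and all names here are illustrative rather than taken from any particular application.
<pre>
#include <stdio.h>

/* D2Q9 lattice: nine discrete velocities per cell. */
static const int    ex[9] = { 0, 1, 0,-1, 0, 1,-1,-1, 1 };
static const int    ey[9] = { 0, 0, 1, 0,-1, 1, 1,-1,-1 };
static const double w[9]  = { 4.0/9,
                              1.0/9, 1.0/9, 1.0/9, 1.0/9,
                              1.0/36, 1.0/36, 1.0/36, 1.0/36 };

/* BGK collision for one cell: relax the distribution functions f
 * toward the local equilibrium with relaxation time tau. */
void collide(double f[9], double tau)
{
    /* Recover macroscopic density and velocity from f. */
    double rho = 0.0, ux = 0.0, uy = 0.0;
    for (int i = 0; i < 9; i++) {
        rho += f[i];
        ux  += f[i] * ex[i];
        uy  += f[i] * ey[i];
    }
    ux /= rho;
    uy /= rho;

    double usq = ux * ux + uy * uy;
    for (int i = 0; i < 9; i++) {
        double eu  = ex[i] * ux + ey[i] * uy;
        double feq = w[i] * rho *
                     (1.0 + 3.0 * eu + 4.5 * eu * eu - 1.5 * usq);
        f[i] += (feq - f[i]) / tau;   /* relax toward equilibrium */
    }
}

int main(void)
{
    double f[9];
    for (int i = 0; i < 9; i++) f[i] = w[i];  /* fluid at rest */
    f[1] += 0.01;                             /* small perturbation */
    collide(f, 0.6);
    printf("f[1] after collision: %f\n", f[1]);
    return 0;
}
</pre>
Because the collision step is purely local to each cell, the method parallelizes naturally across the many processors of an HPC system.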
Another common application is simulating the physics of black holes and other astronomical bodies. Some of the most demanding problems involve numerically solving the equations of Einstein's theory of General Relativity to model strong gravitational fields.
To make it easier to develop applications for such problems, several development toolkits have been created. One such toolkit is the Cactus Computational Toolkit (http://www.cactuscode.org/). Cactus can be used to develop applications that solve a wide variety of problems and perform simulations. It is specifically designed so that an application can be developed on a standard commodity system and later run on an HPC system.
External Links
Application Performance of Modern Number Crunchers
Performance of Scientific Applications on Modern Supercomputers
Performance Evaluation of Scientific Applications on Modern Parallel Vector Systems
Intel Research Advances 'Era Of Tera'
Cactus Computational Toolkit
Wbosborn 13:48, 4 September 2007 (EDT)