Thursday 16 October 2014

Third Part: Supercomputer

Previously, we learned why we study "parallel computing". Now, let's study something different and meet our new "friend" - the Supercomputer.

A Dialog between Student X and Our New Friend, the Supercomputer

Student X     : Nice to meet you, Supercomputer. Few of us are familiar with "you". Can you introduce yourself?

Supercomputer: Well, here is all about me:

   Supercomputers are the world's LARGEST and FASTEST computers. A supercomputer is a computer that performs at or near the highest operational rate currently available, compared to any other computer such as a desktop computer. It is typically used for scientific and engineering applications that must handle very large databases or do a huge amount of computation (or both). At any given time there are usually a few well-publicized supercomputers that operate at extremely high speeds. Most supercomputers are really multiple computers that perform parallel processing.
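That last point, parallel processing, can be sketched in a few lines. Below is a minimal illustration using Python's standard multiprocessing module, not how real supercomputers are programmed (those typically use MPI across thousands of nodes): one big job is split into chunks, each worker computes its share in parallel, and the partial results are combined. The work function (summing squares) is just an illustrative stand-in.

```python
# Minimal sketch of parallel processing: split one big job across
# several workers, then combine their partial results.
from multiprocessing import Pool

def sum_of_squares(bounds):
    """Sum i*i over [start, stop) -- one worker's share of the job."""
    start, stop = bounds
    return sum(i * i for i in range(start, stop))

if __name__ == "__main__":
    n = 1_000_000
    step = n // 4
    chunks = [(i, i + step) for i in range(0, n, step)]
    with Pool(processes=4) as pool:           # 4 workers in parallel
        partials = pool.map(sum_of_squares, chunks)
    total = sum(partials)                     # combine partial results
    assert total == sum(i * i for i in range(n))
```

The key idea, identical in spirit on a real supercomputer, is that the chunks are independent, so they can run at the same time on different processors.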

Student X : What is the difference between "you" and other computers, such as a desktop computer?

Supercomputer: Hm.. There are some differences between "us":

   Actually, the parts of a supercomputer are comparable to those of a desktop computer: they both contain hard disks, memory, and processors. Although desktop computers and supercomputers are equipped with similar processors, their speed and memory sizes are totally different. For example, a desktop computer built in the year 2000 normally had a hard disk capacity of between 2 and 20 gigabytes and one processor with tens of megabytes of RAM - just enough to perform tasks such as word processing, web browsing, and some simple video games. Meanwhile, a supercomputer of the same period had thousands of processors, hundreds of gigabytes of RAM, and hard drives that allowed for hundreds or thousands of gigabytes of storage space.

  The large number of processors, enormous disk storage, and substantial memory greatly increase the power and speed of the machine. While desktop computers can perform millions of floating-point operations per second (megaflops), supercomputers can perform at speeds of billions (gigaflops) and even trillions (teraflops) of operations per second.
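To make that speed gap concrete, here is a back-of-envelope calculation of how long a fixed workload takes at different sustained rates. The rates below are illustrative orders of magnitude, not benchmarks of any particular machine:

```python
# How long does a fixed workload take at different sustained rates?
# The rates are illustrative orders of magnitude, not measurements.
WORKLOAD = 1e12  # one trillion floating-point operations

rates = {
    "desktop (~100 megaflops)": 100e6,
    "supercomputer (~1 teraflops)": 1e12,
}

for name, flops in rates.items():
    seconds = WORKLOAD / flops
    print(f"{name}: {seconds:,.0f} s")

# At 100 megaflops the trillion-operation job takes 10,000 s
# (nearly three hours); at 1 teraflops it takes about one second.
```

The gap only widens with problem size, which is why the large scientific simulations described below are impractical on desktop hardware.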

Student X    : Can you tell us a little about "your" past?

Supercomputer : This will be a long story. But, well, I will summarize it:

Source: Here

  The story of supercomputers goes back to the 1960s, with the Atlas at the University of Manchester and a series of computers at Control Data Corporation (CDC) designed by Seymour Cray. The Atlas was a joint venture between Ferranti and Manchester University and was designed to operate at processing speeds approaching one microsecond per instruction, about one million instructions per second. The first Atlas was officially commissioned on 7 December 1962 as one of the world's first supercomputers. It was considered the most powerful computer in the world at that time by a considerable margin, equivalent to four IBM 7094s.

Timeline of Supercomputer Evolution:

Year 1969 : The CDC 7600 was released. It surpassed the 6600 with a clock speed of 36.4 MHz and used a pipelined scalar architecture.
Year 1972 : Seymour Cray left CDC to form his own computing firm, Cray Research.

Year..... Want to know more? Click Here


Student X  : What are special applications of "you" compared to other computers?

Supercomputer : There are lots of super-cool uses:

  Supercomputers are perfect for tackling big scientific problems, from uncovering the origins of the universe to delving into the patterns of protein folding that make life possible.

(1) Recreating the Big Bang


The "Big Bang", or the initial expansion of all energy and matter in the universe, happened more than 13 billion years ago in trillion-degree Celsius temperatures, but supercomputer simulations make it possible to observe what ween on during the universe`birth.
For example, Researchers at the TACC at the University of Texas in Austin have used supercomputers to simulate the formation of the first galaxy, while scientists at NASA`s Ames Research Center in Mountain View, Calif have simulated the creation of stars from cosmic dust and gas.

(2) Testing nuclear weapons

Since 1992, the United States has banned the testing of nuclear weapons. But that doesn't mean the nuclear arsenal is out of date.
For example, the Stockpile Stewardship program uses non-nuclear lab tests and, yes, computer simulations to ensure that the country's cache of nuclear weapons is functional and safe.
In 2012, IBM unveiled a new supercomputer, Sequoia, at Lawrence Livermore National Laboratory in California. According to IBM, Sequoia is a 20-petaflop machine, meaning it is capable of performing twenty thousand trillion calculations each second - enough to create better simulations of nuclear explosions and to do away with real-world nuke testing for good.
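"Twenty thousand trillion" is just the 20-petaflop figure restated in smaller units; a quick sanity check of the arithmetic:

```python
# Sanity-check the Sequoia figure: 20 petaflops expressed in
# "trillions of calculations per second".
PETA = 10**15
TRILLION = 10**12

sequoia_flops = 20 * PETA
print(sequoia_flops // TRILLION, "trillion calculations per second")
# 20 * 10**15 ops/s = 20,000 trillion ops/s -- twenty thousand trillion
```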

(3) Predicting climate change

The challenge of predicting global climate is immense. There are hundreds of variables, from the reflectivity of the earth's surface to the vagaries of ocean currents. Dealing with these variables requires supercomputing capabilities.

For example, NASA supercomputers have been used to generate a closer look at future climate conditions in the U.S. Using previously published large-scale climate model projections, a team of scientists from NASA; the Climate Analytics Group of Palo Alto, Calif., a non-profit that provides climate data services; and California State University, Monterey Bay has released monthly climate projections for the coterminous United States at a scale of one half mile, or approximately the size of a neighborhood.

For more information, please follow the link Here.



Next section : Top 10 Supercomputers in the world
To be continued.........
