Friday, December 14, 2012

Engineers Hunt for Ways to Cool Computing

Increasingly, heat looms as the single largest obstacle to computing's continued advancement. Solutions could include liquid coolants or circuits arrayed on 3-D grids like the brain


A laptop computer can double as an effective portable knee-warmer, which is pleasant in a cold office. But a bigger desktop machine needs a fan. A data center as large as those used by Google needs a high-volume flow of cooling water. And with cutting-edge supercomputers, the trick is to keep them from melting. A world-class machine at the Leibniz Supercomputing Centre in Munich, for example, operates at 3 petaflops (3 × 10^15 operations per second), and the heat it produces warms some of the center's buildings. Current trends suggest that the next milestone in computing, an exaflop machine performing at 10^18 flops, would consume hundreds of megawatts of power (equivalent to the output of a small nuclear plant) and turn virtually all of that energy into heat.
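To see where a figure like that comes from, here is a rough back-of-envelope check in Python; the joules-per-operation efficiency is an assumed, illustrative value for machines of that era, not a measurement:

    # Back-of-envelope check of the exascale power estimate.
    # ASSUMPTION: ~0.2 nJ per operation, roughly the efficiency
    # of early-2010s petascale machines. Illustrative only.
    joules_per_flop = 2e-10

    exaflop = 1e18                          # operations per second
    power_watts = exaflop * joules_per_flop

    print(f"Estimated draw: {power_watts / 1e6:.0f} MW")  # -> 200 MW
    # Hundreds of megawatts, essentially all of it released as heat.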

Increasingly, heat looms as the single largest obstacle to computing's continued advancement. The problem is fundamental: the smaller and more densely packed circuits become, the hotter they get. "The heat flux generated by today's microprocessors is loosely comparable to that on the Sun's surface," says Suresh Garimella, a specialist in computer-energy management at Purdue University in West Lafayette, Indiana. "But unlike the Sun, the devices must be cooled to temperatures lower than 100 °C" to function properly, he says.
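Garimella's comparison can be sanity-checked with the Stefan-Boltzmann law; the chip power and die area below are assumed, illustrative figures:

    # Radiant flux at the Sun's surface versus heat flux through a
    # processor die. Chip numbers are assumptions for illustration.
    SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W m^-2 K^-4
    T_SUN = 5778.0       # effective solar surface temperature, K

    sun_flux = SIGMA * T_SUN**4        # ~6.3e7 W/m^2

    chip_power = 130.0   # W, assumed high-end CPU dissipation
    die_area = 2.6e-4    # m^2, assumed die of ~2.6 cm^2

    chip_flux = chip_power / die_area  # ~5e5 W/m^2

    print(f"Sun:  {sun_flux:.2e} W/m^2")
    print(f"Chip: {chip_flux:.2e} W/m^2")
    # The die-average flux sits about two orders of magnitude below
    # the solar value; localized hot spots narrow the gap, hence
    # "loosely comparable".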

To achieve that ever more difficult goal, engineers are exploring new ways of cooling: pumping liquid coolants directly onto chips, for example, rather than circulating air around them. In a more radical vein, researchers are also seeking to reduce heat flux by rethinking how the circuitry is packaged. Instead of being confined to two-dimensional (2D) slabs, for example, circuits might be arrayed in 3D grids and networks inspired by the architecture of the brain, which manages to carry out massive computations without any special cooling gear. Perhaps future supercomputers will not even be powered by electrical currents borne along metal wires, but driven electrochemically by ions in the coolant flow.

This is not the most glamorous work in computing, certainly not compared to much-publicized efforts to make electronic devices ever smaller and faster. But those high-profile innovations will count for little unless engineers crack the problem of heat.

Go with the flow
The problem is as old as computers. The first modern electronic computer, a 30-tonne machine called ENIAC that was built at the University of Pennsylvania in Philadelphia at the end of the Second World War, used 18,000 vacuum tubes, which had to be cooled by an array of fans. The transition to solid-state silicon devices in the 1960s offered some respite, but the need for cooling returned as device densities climbed. In the early 1990s, a shift from earlier 'bipolar' transistor technology to complementary metal oxide semiconductor (CMOS) devices offered another respite by greatly reducing the power dissipation per device. But chip-level computing power doubles roughly every 18 months, as famously described by Moore's Law, and this exponential growth has brought the problem to the fore yet again (see 'Rising temperatures'). Some of today's microprocessors pump out heat from more than one billion transistors. If a typical desktop machine let its chips simply radiate their heat into a vacuum, its interior would reach several thousand degrees Celsius.
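That last claim follows from treating a chip as an ideal radiator and inverting the Stefan-Boltzmann law, P = σAT^4; the power and radiating area here are assumed values for illustration:

    # Equilibrium temperature of a chip that sheds heat only by
    # radiation into vacuum: T = (P / (sigma * A)) ** 0.25.
    SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W m^-2 K^-4

    power = 100.0        # W, assumed chip dissipation
    area = 1e-4          # m^2, assumed ~1 cm^2 radiating surface

    t_kelvin = (power / (SIGMA * area)) ** 0.25
    print(f"{t_kelvin - 273.15:.0f} deg C")
    # -> roughly 1800 deg C for one chip under these assumptions;
    # several chips packed into a case would run hotter still.
    # For reference, silicon melts at 1414 deg C.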

That is why desktop computers (and some laptops) have fans. Air that has been warmed by the chips carries some heat away by convection, but not enough on its own: the fan circulates enough air to keep temperatures at a workable 75 °C or so.
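How much air must the fan move? A simple energy balance, P = ṁ·c_p·ΔT, gives the minimum flow; all inputs below are assumed, illustrative figures:

    # Minimum airflow needed to carry chip heat away by convection:
    # P = mdot * c_p * dT  =>  mdot = P / (c_p * dT).
    C_P_AIR = 1005.0     # J kg^-1 K^-1, specific heat of air
    RHO_AIR = 1.2        # kg m^-3, air density at room conditions

    power = 100.0        # W, assumed heat load
    delta_t = 20.0       # K, assumed air temperature rise in the case

    mdot = power / (C_P_AIR * delta_t)   # kg/s
    flow = mdot / RHO_AIR                # m^3/s
    cfm = flow * 2118.88                 # cubic feet per minute

    print(f"{mdot * 1000:.1f} g/s of air, about {cfm:.0f} CFM")
    # -> ~5 g/s, roughly 9 CFM: well within reach of a small case fan.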

Source: http://rss.sciam.com/click.phdo?i=04fc5bd0b06b4e839f20b58db7f564e8

