Control... Alt... Bang! The curse of the power-hungry computer

There's a new IT rule: the smaller it is, the hotter it gets - and techies have had to travel back in time to find a solution

Stephen Pritchard
Sunday 20 August 2006 00:00 BST

Temperatures are rising in the computer world. One of the most popular clips on video download sites right now is footage of a Dell laptop bursting into flames, after its battery overheated.

The US computer giant announced last Monday that it was recalling 4.1 million of the Sony-made batteries used in many of its machines, the largest-ever product recall in the consumer electronics industry. And it is unlikely to be the last. Manufacturers are trying to cram ever more power into smaller computers, which means they are much more likely to overheat.

The problem is not limited to laptops. Anyone who has held a long conversation on one of the latest 3G mobile phones will know that they quickly become hot to the touch. But the biggest problems of all lie in the large computer rooms, known as data centres, that power the IT systems for banks, telecoms firms and retailers. According to the IT company Sun Microsystems, the average data centre uses 2MW of power, or enough electricity for 2,000 homes.
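
Sun's figure is easier to grasp with a quick back-of-the-envelope calculation. The sketch below (written in Python; the one-kilowatt average household draw is our illustrative assumption, not Sun's) divides a 2MW data centre by a typical home's load:

    # Rough check on Sun's estimate; the per-home draw is an assumed figure.
    DATA_CENTRE_POWER_W = 2_000_000      # 2 MW, the average quoted by Sun
    HOME_AVERAGE_DRAW_W = 1_000          # ~1 kW average household load (assumption)

    homes_powered = DATA_CENTRE_POWER_W / HOME_AVERAGE_DRAW_W
    print(f"A 2 MW data centre draws as much power as about {homes_powered:,.0f} homes")
    # -> A 2 MW data centre draws as much power as about 2,000 homes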

A generation ago, the greatest constraint facing companies that needed high-power computing was cost: the big mainframe computers, or supercomputers, were out of reach of all but the largest firms, and even multinationals would have only one or two machines.

As the cost of computer hardware fell, space became the issue. Network computers, known as servers, used to be large machines housed in metal cabinets the size of wardrobes, limiting how many machines companies could fit in their data centres.

Miniaturisation, however, has changed the rules again.

In the past five years companies have bought ever larger numbers of small servers based on similar designs to a desktop PC. Such machines - "industry standard" servers built around chips from Intel and AMD - are cheap, widely available and relatively easy to set up and manage. But they draw far more power, and run far, far hotter, than older computers.

Steve Prentice, research vice-president at industry analysts Gartner, warns: "Every large data centre that was built for mainframe or mini-computers will suffer from problems with electrical power, because the power demands of an Intel-based server are far, far higher than an old mainframe, in terms of watts of electrical power per square foot of floor space." He says that a fully loaded rack of today's computers would use 25kW of electricity (1MW = 1,000kW) compared with 2kW to 3kW for the same space three or four years ago.
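
To turn Mr Prentice's rack figures into money, the short sketch below (a rough Python illustration; the electricity tariff is an assumption of ours, not a figure from Gartner) works out the annual energy consumed by a single rack running around the clock:

    # Annual energy and cost for one rack at the old and new power densities.
    # The tariff is an illustrative assumption, not a figure from the article.
    HOURS_PER_YEAR = 24 * 365
    PRICE_PER_KWH = 0.10                 # assumed tariff, in pounds

    for label, rack_kw in [("rack of three or four years ago", 3),
                           ("fully loaded rack today", 25)]:
        annual_kwh = rack_kw * HOURS_PER_YEAR
        print(f"{label}: {annual_kwh:,} kWh a year, roughly £{annual_kwh * PRICE_PER_KWH:,.0f}")
    # -> 26,280 kWh (about £2,628) versus 219,000 kWh (about £21,900)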

But there is simply not enough power to go around - especially in North America, where California and parts of New York have suffered power cuts. Even where the power supply does not fail, large-scale computer users cannot tap into enough power to expand their systems.

"Our data centre has no more power available," explains Peter Rinfret, chief executive of Iris Wireless, a company that runs text and multi-media messaging for mobile networks in the US, Asia and Europe. "We have already waited 95 days to hear from the electricity company whether we can have more power. If not, we will have to build another data centre."

Mr Rinfret has turned to the computer maker IBM to see whether Iris can avoid having to build a vastly expensive data centre by installing servers that use less power. IBM is not alone in this: chip-maker Intel and computer maker Sun are focusing their design efforts on less power-hungry systems.

IBM launched a new generation of power-efficient servers this month. Although the machines use energy-efficient chips - in this case from Intel's rival AMD - other, more mundane parts of a computer can make a big difference to power efficiency.

IBM, for example, fits its servers with transformers that it claims are 91 per cent efficient; a standard, off-the-shelf transformer is only about 70 per cent efficient. According to Bernie Meyerson, chief technology officer for IBM's systems and technology group, this is not just a waste of power. It contributes to the modern computer's arch-enemy: heat. Wasted electricity does not simply disappear: it is turned into heat, and if computers run too hot, performance suffers and systems might even shut down.
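
The efficiency gap matters because every watt the transformer loses ends up as heat inside the machine. A minimal sketch, assuming a server whose electronics need 500W of usable power (an illustrative figure, not one from IBM), compares the waste at 70 and 91 per cent efficiency:

    # Heat dumped by a server's transformer at two efficiencies.
    # The 500 W useful load is an illustrative assumption.
    USEFUL_LOAD_W = 500

    for efficiency in (0.70, 0.91):
        drawn_from_mains = USEFUL_LOAD_W / efficiency
        wasted_as_heat = drawn_from_mains - USEFUL_LOAD_W
        print(f"{efficiency:.0%} efficient: draws {drawn_from_mains:.0f} W, "
              f"wastes {wasted_as_heat:.0f} W as heat")
    # 70% efficient: draws 714 W, wastes 214 W; 91% efficient: draws 549 W, wastes 49 W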

To prevent this, excess heat has to be removed by air conditioning, which in turn demands more electricity. This leads to a vicious circle in which some companies spend more on powering and cooling data centres than on the computer equipment itself. With energy prices rising on both sides of the Atlantic, the situation will only get worse.
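
The scale of that vicious circle can be sketched with a single overhead figure: if the air conditioning and power distribution consume as much again as the computers themselves (an assumption on our part, not a figure from the article), the bill doubles. A minimal illustration, again with an assumed tariff:

    # How cooling overhead multiplies a data centre's electricity bill.
    # The overhead ratio and tariff are illustrative assumptions.
    IT_LOAD_KW = 1_000                   # a 1 MW load of servers
    COOLING_OVERHEAD = 1.0               # cooling and losses equal to the IT load
    PRICE_PER_KWH = 0.10                 # assumed tariff, in pounds

    total_kw = IT_LOAD_KW * (1 + COOLING_OVERHEAD)
    annual_cost = total_kw * 24 * 365 * PRICE_PER_KWH
    print(f"Total draw: {total_kw:,.0f} kW; annual bill roughly £{annual_cost:,.0f}")
    # -> Total draw: 2,000 kW; annual bill roughly £1,752,000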

Aaron Davis, vice-president of data-centre technology company APC, says that much of the problem lies in how data centres were designed: they are simply not equipped for the high-power, high-density computers that businesses are buying today. "Rising energy costs mean that energy bills and the energy consumed by IT are becoming bigger and bigger issues for finance directors," he says. "A third of that cost could be removed by better designs of data centres."

It is all too easy to underestimate the scale of the problem. When electricity was cheap and computers expensive, companies paid little attention to power bills. But higher electricity prices have changed those rules, just as businesses have committed to using more power-hungry computers.

Forrester Research, a firm of IT industry analysts, calculates that a mid-sized data centre with 2,500 computers will use enough electricity in just one month to power 420,000 homes, or a city the size of Amsterdam, for a whole year. And many data centres are far larger: Google, for example, uses half a million servers to run its search engine and other services.

Mr Davis at APC estimates that, globally, businesses and other large computer users could save $100bn (£53bn) in running costs by redesigning systems.

Some of the ways companies can do this might seem low-tech. APC, for example, advocates putting fans between racks of computers rather than trying to cool the whole computer room. And IBM is making water-cooled radiators that fit on the back of computer cabinets.

Water cooling was a standard feature of computers in the 1960s, but IBM engineers make no apologies for stepping back in time to solve today's problems. "The rear-door heat exchanger solves the problem of cooling in a small space," says Mr Meyerson. "Water is four times as efficient at cooling as air."
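
Mr Meyerson's "four times" figure matches the textbook specific heats: kilogram for kilogram, water soaks up roughly four times as much heat as air for the same temperature rise. A small worked example using Q = mcΔT (the 10C temperature rise is an illustrative assumption):

    # Heat absorbed per kilogram of coolant for a 10 C rise (Q = m * c * dT).
    # Specific heats are standard textbook values; the temperature rise is illustrative.
    SPECIFIC_HEAT_J_PER_KG_K = {"air": 1005, "water": 4186}
    DELTA_T_K = 10

    for coolant, c in SPECIFIC_HEAT_J_PER_KG_K.items():
        q_kilojoules = 1.0 * c * DELTA_T_K / 1000   # per kilogram of coolant
        print(f"1 kg of {coolant} warming by {DELTA_T_K} C absorbs {q_kilojoules:.1f} kJ")
    # water carries roughly four times as much heat per kilogram as air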

Just don't try to cool your overheated laptop by putting it in a bucket of water.
