Green Computing of the Future

Google’s massive computing facilities use 12V auto batteries as a back-up power source! Is it true? Yes, according to David Kanter, editor-in-chief of Real World Technologies.
Google, much like Amazon, Yahoo! or Facebook, relies on aggregate processing power. Cost remains an issue, despite more affordable hardware and the performance advantages of better systems architecture. This might be old news at Slashdot, as it was based on events from around mid-2009. It was new to me, though.

“Google shared some of its Borg hardware with the world… a 12V car battery was used for backup, rather than a UPS. Why? Modern UPS designs improve reliability in exactly the same way, but the conversion efficiency is not perfect. A good UPS might convert power at 90% efficiency. For a company with a dozen servers, a 10% power inefficiency is unfortunate, but may not be worth chasing after… [Google’s] car battery approach achieves 99.9% efficiency, and with hundreds of thousands of servers, that can make a big difference.”
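
To get a feel for the scale of that efficiency gap, here is a back-of-the-envelope sketch. The fleet size (200,000 servers) and per-server draw (250 W) are purely my own assumptions for illustration, not Google’s figures; only the 90% and 99.9% efficiency numbers come from the quote above.

```python
# Back-of-the-envelope comparison of backup-power conversion losses.
# Fleet size and per-server wattage are illustrative assumptions,
# not Google's actual numbers.

SERVERS = 200_000           # assumed fleet size
WATTS_PER_SERVER = 250      # assumed average draw per server (W)
HOURS_PER_YEAR = 24 * 365

it_load_w = SERVERS * WATTS_PER_SERVER   # power that must actually reach the servers

for label, efficiency in [("90%-efficient UPS", 0.90), ("99.9%-efficient battery path", 0.999)]:
    input_w = it_load_w / efficiency                          # power drawn from the grid
    wasted_mwh = (input_w - it_load_w) * HOURS_PER_YEAR / 1e6  # Wh -> MWh per year
    print(f"{label}: ~{wasted_mwh:,.0f} MWh/year lost in conversion")
```

With those assumed numbers, the 90% path burns roughly 48,700 MWh a year in conversion losses, while the 99.9% path loses only about 440 MWh, which is the kind of gap that makes the design worth the trouble at Google’s scale.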

I’d like to see the numbers. Reliability is based on expected failure rates. Without knowing more about the probability distribution of 12V car battery failures, and how multiple 12V batteries are configured for back-up power, I don’t have enough information to validate the claim. Other considerations: exactly how many 12V batteries does Google use? How much space do they require, and isn’t there a trade-off in the enormous footprint of a vast 12V battery farm? If feasible, it would be a wonderfully innovative and clever solution, using a more environmentally friendly and functionally superior design for back-up power.
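
A toy model shows why those unknown failure distributions matter. The probabilities below are invented placeholders (I have no idea what the real failure-on-demand rates are), and the model assumes independent battery failures, which may not hold in practice.

```python
# Toy reliability model contrasting per-server batteries with a shared UPS.
# Both failure probabilities are hypothetical placeholders; the real
# distributions are exactly the missing information noted above.

SERVERS = 10_000            # servers behind one power domain (assumed)
P_BATTERY_FAILS = 0.01      # assumed chance a single battery fails on demand
P_UPS_FAILS = 0.001         # assumed chance the shared UPS fails on demand

# Per-server batteries: failures are independent, so an outage costs
# roughly p * N servers, and losing *all* of them is vanishingly unlikely.
expected_lost_batteries = P_BATTERY_FAILS * SERVERS
p_all_lost_batteries = P_BATTERY_FAILS ** SERVERS   # underflows to 0.0 here

# Shared UPS: one failure takes down every server behind it at once.
expected_lost_ups = P_UPS_FAILS * SERVERS

print(f"Per-server batteries: expect ~{expected_lost_batteries:.0f} of {SERVERS} servers to drop")
print(f"Probability of losing every server at once: {p_all_lost_batteries:.1e}")
print(f"Shared UPS: expect ~{expected_lost_ups:.0f} servers lost per outage (all-or-nothing)")
```

With these invented numbers the shared UPS actually looks better on expected loss, while the distributed batteries look better on worst-case correlated loss; which design wins depends entirely on the real failure rates and topology, which is why I’d like to see the numbers.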

Here’s another example of energy efficiency:

“Google simplifies their designs to only use 12V power rails. A modern motherboard requires several voltage planes: 3.3V, 5V and 12V. To provide each voltage, the power supply needs to convert 115V AC to the appropriate level. Instead, Google outputs 12V and uses high-efficiency voltage conversion on the motherboard to go from 12V to other voltage levels.”
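
The arithmetic here is just a chain of conversion efficiencies: whatever survives each stage is multiplied together. The stage efficiencies below (80% for a conventional multi-rail supply, 92% and 93% for a 12V-only supply plus on-board DC-DC conversion) are rough assumptions of mine for the sake of illustration, not measurements of Google’s hardware.

```python
# Illustration of why a single 12V rail with on-board DC-DC conversion can win.
# Stage efficiencies are rough assumptions, not measured values.

def end_to_end(*stage_efficiencies):
    """Overall efficiency is the product of each conversion stage's efficiency."""
    eff = 1.0
    for stage in stage_efficiencies:
        eff *= stage
    return eff

# Conventional multi-rail supply: 115V AC converted to 3.3V/5V/12V in one
# relatively inefficient box (modeled as a single 80% stage).
conventional = end_to_end(0.80)

# 12V-only supply (easier to optimize) feeding high-efficiency
# DC-DC converters on the motherboard for the lower rails.
twelve_volt_only = end_to_end(0.92, 0.93)

print(f"multi-rail PSU (assumed):        {conventional:.0%} of wall power reaches the board")
print(f"12V + on-board DC-DC (assumed):  {twelve_volt_only:.0%} of wall power reaches the chips")
```

Even with made-up numbers, the point comes through: optimizing one conversion path is easier than optimizing three, and a few percentage points compound into real money across a warehouse of servers.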

The macro-benefits of data center optimization are very green-friendly, and not exclusive to Google. Any large corporate data center consumes a great deal of electricity and puts out waste heat. Locating a data center near cheap power sources is sensible, and is presumably why Amazon and Google, amongst others, have chosen the Pacific Northwest, with its plentiful hydroelectric power.

The most pleasing idea was recycling the heat and warm air generated by server rooms and data centers. The excess heat could be used for temperature control: heating buildings, swimming pools, or perhaps industrial processes. The article concludes with a real-world example: Telehouse, a U.K. colocation provider, specifically designed its London data center with an exhaust pipe to move heat to third parties in the Docklands.
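
Since essentially every watt a server draws ends up as heat, the amount available for reuse is roughly the IT load itself. The sketch below uses an assumed 10 MW facility and an assumed 70% capture fraction (neither figure comes from the article or from Telehouse) just to show the order of magnitude.

```python
# Rough estimate of how much low-grade heat a data center could export.
# The facility size and capture fraction are assumptions for illustration only.

IT_LOAD_MW = 10.0             # assumed IT load of a mid-sized facility
RECOVERABLE_FRACTION = 0.7    # assumed share of waste heat the exhaust system captures
KW_PER_HOME = 10.0            # assumed heating demand of one home on a cold day (kW)

# Virtually all electrical power consumed by servers is dissipated as heat,
# so exportable heat is roughly IT load times the capture fraction.
heat_exported_mw = IT_LOAD_MW * RECOVERABLE_FRACTION
homes_heated = heat_exported_mw * 1000 / KW_PER_HOME

print(f"~{heat_exported_mw:.1f} MW of exportable heat, roughly enough for {homes_heated:,.0f} homes")
```

Even at this crude level, a single mid-sized data center could plausibly heat several hundred nearby buildings, which makes the Docklands exhaust-pipe arrangement look less like a gimmick and more like good engineering.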

See Exectweets, “Unconventional Computing: The Future is Hot Air” by David Kanter, May 2010.
