Google has published a detailed post on its efforts to reduce datacenter power costs by targeting the building, not just the computers.
Every hardware vendor out there, from IBM (NYSE: IBM) on down, is
frantically looking for ways to cut power and heat in their products, but
the datacenter building has not gotten as much attention. IBM has tried with
its green datacenter efforts, but by and large, the building remains an
overlooked issue.
Big mistake, because in a typical datacenter, for every watt of power used
by the computers, another 0.96 watts goes to running the facility itself.
Google's datacenters, by contrast, spend only 21 percent extra on the
facility; in some cases, the overhead is as low as 15 percent.
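These overhead figures map directly onto the industry's standard Power Usage Effectiveness (PUE) metric: total facility power divided by the power delivered to the computing equipment. A quick sketch of the arithmetic (the function name is ours, not Google's):

```python
def pue(facility_watts_per_it_watt):
    """Power Usage Effectiveness: total facility power / IT equipment power.

    An overhead of 0.96 facility watts per IT watt means total power is
    1.96x the computing power, i.e. a PUE of 1.96.
    """
    return 1.0 + facility_watts_per_it_watt

typical_pue = pue(0.96)  # the industry figure cited above: 1.96
google_pue = pue(0.21)   # Google's reported overhead: 1.21
best_case = pue(0.15)    # Google's best facilities: 1.15
```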
If you only worry about computers, you’re fighting half the battle,
according to Erik Teetzel, energy program manager at Google (NASDAQ: GOOG).
“To do proper datacenter efficiency, it helps to have a holistic picture of
operations,” he told InternetNews.com. “You have to take into account
what you would spend on capital costs.”
But people have not. Power has been cheap in many places, so companies have
had little incentive to be more efficient. They have been more anxious to
get up and running, leaving power efficiency to think about later. And many
IT managers say they are responsible for the computers in the datacenter,
but not the building, said Teetzel.
That’s changing. “We’re seeing a lot of people saying ‘hey, you can do
both and you should do both.’ You should look at total cost of ownership and
make energy efficiency a priority. Being very efficient with the way you
operate your computer infrastructure can have significant benefits to the environment of course but also to the bottom line.”
Building management is overlooked
Google’s strategy encompasses five elements: efficient equipment, efficient
datacenters, water management, server retirement and clean energy.
The first part is rather straightforward. The company used highly efficient
power supplies and voltage regulator modules, the two worst offenders for
power loss in a computer, said Teetzel.
The company removed unused components, such as sound cards and discrete
graphics. As a result, Google servers lose only a little over 15 percent of
the electricity they pull from the wall during these power conversion steps,
less than half of what is lost in a typical server. Google estimates it
saves $30 per server, per year with this method.
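A back-of-the-envelope sketch shows how conversion losses translate into dollars. The 250 W wall draw, the 33 percent typical loss, and the $0.08/kWh electricity price below are illustrative assumptions, not figures from Google, but they land near the $30 figure the company cites:

```python
HOURS_PER_YEAR = 8760

def conversion_waste_kwh(wall_watts, loss_fraction):
    """kWh per year lost in the power-supply and VRM conversion steps."""
    return wall_watts * loss_fraction * HOURS_PER_YEAR / 1000.0

# Illustrative assumptions (not from the article): a 250 W server,
# a typical ~33% conversion loss vs. Google's ~15%, power at $0.08/kWh.
typical_waste = conversion_waste_kwh(250, 0.33)  # ~722.7 kWh/year wasted
google_waste = conversion_waste_kwh(250, 0.15)   # ~328.5 kWh/year wasted
annual_savings = (typical_waste - google_waste) * 0.08  # ~$31.50/year
```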
For the datacenters, things get a bit more complicated. Teetzel said it
is possible to retrofit an older building for maximum cooling efficiency,
although it’s easier to build from scratch.
From there, the challenge is efficient and effective airflow, where hot
spots or cold spots need to be avoided and both thermal management and
airflow analysis must be done. For actual cooling, Google used evaporative
cooling instead of mechanical chillers. Cold water is run through the
datacenter, absorbs the heat from the computers, and is then run through a
cooling tower.
The water runs down a tall, high-surface-area structure, similar to the decorative waterfalls seen in restaurants or homes, except here the purpose is to let the heat dissipate.
Google found that letting the water essentially cool off by letting it
flow was much more effective than using Freon-drinking chillers. “We saw
significant gains and it’s certainly something we would recommend in the
future,” said Teetzel.
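The physics behind water cooling is the standard heat-capacity relation Q = m·c·ΔT. A rough sketch of the water flow needed to carry away a hypothetical 1 MW heat load (the load and the 10 K temperature rise are our assumptions, not Google's numbers):

```python
SPECIFIC_HEAT_WATER = 4186.0  # J/(kg*K), specific heat capacity of water

def water_flow_kg_per_s(heat_watts, delta_t_kelvin):
    """Mass flow of water needed to absorb a heat load (Q = m*c*dT,
    solved for the mass flow rate m)."""
    return heat_watts / (SPECIFIC_HEAT_WATER * delta_t_kelvin)

# Illustrative: a 1 MW server-room heat load with the water warming by 10 K
flow = water_flow_kg_per_s(1_000_000, 10.0)  # roughly 24 kg of water/second
```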
The notion of exotic siting only partially works, he added. One of the most
famous examples is Sun's Mobile Datacenters being stored deep underground
in a mine in Japan, where the operators plan to take advantage of the cool
climate.
Teetzel said that’s a great idea to a point, the point being local
performance. “For the most part, performance, our ability to service
queries, is dependent on the location of the datacenter relative to the load
center. So putting these in Antarctica would be difficult from a load
standpoint,” he said.
But a lot of firms requiring a datacenter can’t do a lot about the
building, according to James Staten, senior analyst with Forrester Research.
“They don’t own it, or if they own it there’s a fair amount of high cost to
make it more efficient,” he told InternetNews.com.
Plus, most companies are faced with the priority of getting a datacenter
up and running first, dealing with energy efficiency second. “Usually when
people need to build datacenters, they are growing and need a new datacenter
right now,” said Staten. “In the world of building a new datacenter, that’s
50 weeks away. So you have to forecast this pretty far in advance to know
what your growth rates are like.”
The remaining issues
This leads to issue three: water management. On average, two gallons of
water are consumed for every kilowatt-hour of electricity produced in the
U.S., and it must be fresh water. Ocean water has too many particulates and
is highly corrosive.
The first step in the water savings comes from using less electricity.
Less juice, less water needed. The next step is recycling water, either
its own or water drawn from local sources that, while not suitable for
drinking, is clean enough to cool the systems. By the end of 2008, two
Google facilities will run on 100 percent recycled water, and by 2010 Google
expects recycled water to provide 80 percent of its total water consumption.
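Using the two-gallons-per-kilowatt-hour figure, the water stakes are easy to estimate. The 5 MW facility size below is a made-up example; the 80 percent recycled fraction is Google's stated 2010 target:

```python
GALLONS_PER_KWH = 2.0  # U.S. average cited above
HOURS_PER_YEAR = 8760

def annual_fresh_water_gal(facility_megawatts, recycled_fraction=0.0):
    """Fresh water consumed per year, offset by any recycled fraction."""
    kwh_per_year = facility_megawatts * 1000 * HOURS_PER_YEAR
    return kwh_per_year * GALLONS_PER_KWH * (1.0 - recycled_fraction)

# Illustrative 5 MW facility (size is an assumption, not from the article)
baseline = annual_fresh_water_gal(5)              # 87.6 million gallons/year
with_recycling = annual_fresh_water_gal(5, 0.80)  # ~17.5 million gallons/year
```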
Then there’s disposal. Servers constantly break down, and more
efficient hardware is constantly hitting the market. For Google, setting up
proper disposal of old computers was actually quite tough. For starters, it
doesn’t have a single vendor to turn to for recycling. HP and IBM have been
quite active in recycling programs but Google builds its own servers, so it
couldn’t turn to an OEM partner.
First, there weren’t a lot of e-waste disposal programs out there.
Second, the socially conscious company didn’t want its old gear sold to
third-world countries that would dispose of it in incinerators with no
pollution scrubbers.
Repurposing the servers
The main solution was repurposing. A full 70 percent of servers that
ended their days answering user queries were reused somewhere else, either
as servers within the company or as spare parts. Others were sold or
donated.
While Google has focused on the building, it still has the hardware in
mind, and a few other ideas for the datacenter building. “We believe that we
can definitely do better,” said Teetzel. “This is a starting point.
Part of what we want is to make sure the whole industry starts to look at
this.”
Staten said Google is “an excellent example” for building efficient
datacenters. “They are in a position where they can afford to take more
risks and try things other companies can’t, so they can prove this green IT
technology first,” he said.