Hosting Firm The Planet Embraces Global Cooling

Image: The Planet’s custom cabinets. Source: The Planet

Forget its search algorithms; Google guards its datacenter layout and cooling strategies like KFC protects its secret recipe of 11 herbs and spices. The steps it takes to maximize efficiency are among its most closely guarded secrets.

Contrast that with The Planet, a giant hosting provider based in Dallas, Texas, which has no problem sharing its techniques. The firm detailed them in a recent blog post and elaborated on them further with InternetNews.com.

Datacenter cooling is an increasing problem for all datacenter owners and operators, as cooling costs rapidly outpace the cost of the hardware itself. When a facility spends two or three times as much power on cooling as on running the machines, less of its fixed power budget is left for servers, which means fewer machines can be installed in the center.

The Planet has six datacenters – four in Dallas and two in Houston. All told, the company services 25,000 customers and more than 14 million websites. That translates into a lot of computers to manage.

The effort literally starts from the ground up. Jeff Lowenburg, vice president of facilities, notes that many companies are sloppy about what they put under the raised floor on which their computers reside.

“In a lot of the datacenters I’ve seen, you see stacks of network cabling under the floors,” he said. “They run chilled-water piping and refrigerant under the floor with the cabling and let it stack up over time. Many datacenters tend to leave old networking cables in place, then add new cables on top of them. After a few years you have air dams under the floor that keep air from flowing freely.”

The Planet puts all of its cabling above the racks, so the only things running under the raised floor are power cables and refrigerant piping.

Next is positioning. Most datacenters have what are called “cold aisles” and “hot aisles.” Standard datacenter design is to put perforated holes in the floor in front of a rack: cold air rises up through them and is drawn into the servers by intake fans, and the hot air is expelled out the back.

So the typical placement is two rows of racks with their fronts facing each other and the cold aisle in between. The backs of those rows, where the exhaust comes out, face the backs of the neighboring rows. That creates a no-man’s-land where two rows of servers blow their hot exhaust at each other, forming the hot aisle.


Sending heat up, not out the back

The Planet’s solution is to use special, custom-built rack-mount cabinets from Chatsworth Products in which a “chimney” sits at the top of the cabinet: hot air is pulled up into the ceiling plenum, where it is drawn back into the air conditioning system, cooled, and recirculated.

In effect, the rack enclosure exploits the basic physics of warm air rising and cold air sinking, pulling in cold air from the floor and venting it up instead of out the back. Because the exhaust vent is narrower than the cabinet, the air leaves at a higher velocity than it enters, and natural convection keeps the hot exhaust moving up and out.
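
As a rough illustration of that velocity effect, here is a minimal sketch using the continuity relation for incompressible flow; the areas and intake velocity are assumed, illustrative values, not figures from The Planet or Chatsworth.

```python
# Minimal sketch, assuming illustrative numbers: if the air entering through the
# cabinet face must leave through a narrower chimney opening, conservation of
# volume flow (A_in * v_in = A_out * v_out) means the exit velocity rises in
# proportion to the area ratio.

cabinet_face_area_m2 = 1.2   # assumed intake area across the cabinet front
chimney_area_m2 = 0.3        # assumed cross-section of the exhaust chimney
intake_velocity_m_s = 0.5    # assumed bulk velocity of cold air entering

exit_velocity_m_s = intake_velocity_m_s * cabinet_face_area_m2 / chimney_area_m2
print(f"Air enters at {intake_velocity_m_s} m/s and exits the chimney at "
      f"roughly {exit_velocity_m_s:.1f} m/s")
```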

The result is no hot aisles. “When you completely separate hot and cold air, you don’t need to have the hot aisles and cold aisles to maximize your datacenter. You can have every aisle be a cold aisle,” said Lowenburg.

The rack-mount computers are from fellow Texan Dell (NASDAQ: DELL). Dell doesn’t offer a similar rack design because Chatsworth holds a patent on it.

Lowenburg said that The Planet’s datacenters have an average power usage effectiveness (PUE) of 1.65 to 1.7, and that a new Dallas facility with high-efficiency chillers will get as low as 1.3. A typical datacenter, according to the Uptime Institute, which monitors datacenter performance, runs at about 2.4, with some over 3.0.

A PUE of 3.0 means you’re using twice as much power to cool the datacenter as you are to power your equipment because for every three watts in, two watts go to cooling and one watt goes to the equipment.
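
To make that arithmetic explicit, here is a minimal sketch of how a PUE number translates into cooling overhead per watt of IT load. It follows the article’s simplification that all non-IT power goes to cooling; the function name is purely illustrative, not part of any standard tool.

```python
# Minimal sketch: PUE = total facility power / IT equipment power,
# so the non-IT overhead is (PUE - 1) watts for every watt of IT load.

def overhead_per_it_watt(pue: float) -> float:
    """Watts of cooling/overhead power per watt delivered to the servers."""
    return pue - 1.0

for pue in (3.0, 2.4, 1.7, 1.3):
    print(f"PUE {pue}: {overhead_per_it_watt(pue):.1f} W of overhead per IT watt")

# PUE 3.0 -> 2.0 W of overhead per IT watt (twice as much on cooling as on gear)
# PUE 1.3 -> 0.3 W of overhead per IT watt
```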

Lowenburg estimates that the old hot aisle/cold aisle technique loses about 20 percent of the cold air, since air can be guided with fans but ultimately can’t be controlled. With this design, The Planet gets cold air utilization in the high 90 percent range.
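
As a hypothetical illustration of what that utilization gap means for chiller load (the 1,000 kW heat load and the 97 percent figure are assumptions, not numbers from the article):

```python
# Minimal sketch: if only a fraction of the chilled air reaches server intakes,
# the cooling plant has to produce proportionally more to remove the same heat.

it_heat_load_kw = 1000.0  # hypothetical heat load to be removed

for label, utilization in (("hot/cold aisle (~80% utilization)", 0.80),
                           ("chimney cabinets (~97% utilization)", 0.97)):
    cooling_required_kw = it_heat_load_kw / utilization
    print(f"{label}: produce {cooling_required_kw:.0f} kW of cooling "
          f"to deliver {it_heat_load_kw:.0f} kW to the servers")
```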

“The fact that I’m getting all the cold air where it needs to be allows me to fully utilize the power and space,” he said. “It’s hard to quantify, but we can probably put more computers in there than we otherwise would.”
