A decommissioned Air Force bunker in Des Moines built to withstand a nuclear attack, an abandoned limestone mine in a remote corner of Pennsylvania, and three soon-to-be-built 300,000-square-foot buildings near the Arctic Circle. Different as they may be, all are--or soon will be--state-of-the-art data centers.
Data centers are the central nervous system of the modern corporation. Their workload is enormous. Google, for example, handles about a billion searches daily. Processing those searches requires a massive amount of energy: the search giant recently disclosed that it continuously uses 260 million watts of electricity, enough to power 200,000 homes.
Electricity--not the physical plant or the hardware and wiring that go into it--is the biggest expense associated with a data center. Up to half of that electricity is used to power the large-scale air conditioning systems that keep the indoor temperature at a cool 61 to 75 degrees Fahrenheit, the optimal range to keep computer equipment humming. (Humidity must be kept between 40 and 55 percent.)
Smart companies are experimenting with a number of cooling techniques to cut down on energy costs. The savings are potentially huge, and there is the added incentive of the respect that comes with being considered environmentally responsible. Most of today's data centers use outdoor air for cooling, although the air usually must be chilled once inside. And there is a movement to nudge the acceptable upper temperature limit to 77 degrees Fahrenheit. A one-degree increase in temperature roughly equals a 4 percent decrease in energy costs.
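That rule of thumb can be turned into a quick back-of-the-envelope estimate. The sketch below is illustrative only: the $1,000,000 cooling bill and the `cooling_savings` helper are assumptions for the example, and it applies the article's rough 4-percent-per-degree figure, not a published engineering model.

```python
# Sketch: estimate annual cooling-cost savings from raising a data
# center's temperature setpoint, using the article's rough rule of
# thumb (each 1 degree F increase ~ 4 percent lower energy cost).
# All dollar figures here are hypothetical.

def cooling_savings(annual_cooling_cost, current_setpoint_f, new_setpoint_f,
                    savings_per_degree=0.04):
    """Return estimated annual savings, in the same units as the cost."""
    degrees_raised = new_setpoint_f - current_setpoint_f
    # Compound the per-degree reduction rather than adding it, so the
    # estimate can never exceed 100 percent of the cooling bill.
    remaining_fraction = (1 - savings_per_degree) ** degrees_raised
    return annual_cooling_cost * (1 - remaining_fraction)

# Hypothetical example: a $1,000,000 annual cooling bill, with the
# setpoint raised from 75 F to the proposed 77 F limit.
print(f"${cooling_savings(1_000_000, 75, 77):,.0f} saved per year")
```

Under these assumptions, raising the setpoint by two degrees trims roughly $78,000 a year from a million-dollar cooling bill.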
Radical designs grab much of the attention, like the data center in Frankfurt, Germany, that sports a roof of living plants to help keep the center cool. (It also uses an air conditioning system.) Or Google's new facility in Hamina, Finland, built in a former paper mill, which draws seawater through granite tunnels and pipes it through the data center before eventually returning it to the ocean. Facebook, eager to tout its "green bona fides," recently publicized its plans to build a server farm near the Arctic Circle, where the frigid temperatures are expected to keep its equipment cool.
Although less flashy than the radical designs of some modern data centers, computer and network technology can contribute significantly to reducing energy consumption. For example, Cisco's data center in Allen, Texas, uses a converged infrastructure, combining data and storage traffic on a single network and reducing the number of switches, adapters, and cables needed. This dramatically reduces power usage. Fewer cables also mean better air flow, lessening the need for electrically powered cooling. Blade servers, stripped-down versions of rack-mounted servers, likewise require less cooling. Server virtualization helps reduce power costs as well, because fewer physical servers are needed to process workloads. The data center cools outside air only when the ambient temperature rises above 75 degrees, and solar cells help power the office spaces. With these efforts, the company saves an estimated $600,000 a year in cooling costs.
The Inconvenient Truth
Greenhouse gas emissions from data centers are projected to double by 2020, reaching four percent of global carbon emissions, according to some estimates. Today, federal and international agencies are taking a carrot approach to reducing those emissions. Unless businesses show a strong commitment to reducing energy consumption, a stick may replace the carrot. Smart companies are taking on the problem with a combination of approaches, factoring in savings derived from green building design as well as those that come from smarter computer and networking technologies.
The contents or opinions in this feature are independent and do not necessarily represent the views of Cisco. They are offered in an effort to encourage continuing conversations on a broad range of innovative technology subjects. We welcome your comments and engagement.
We welcome the re-use, republication, and distribution of "The Network" content. Please credit us with the following information: Used with the permission of http://thenetwork.cisco.com/.