Green Data Centers

Green data centers cut back on electricity consumption by optimizing IT equipment and cooling systems.

What Is a Green Data Center?

Data centers consolidate the IT infrastructure of an organization under one roof, where data can then be maneuvered and stored. Design elements vary based on the size, density and nature of operations, but all data centers require substantial power and protection to be considered effective.

Although data centers should not be confused with cloud computing, which refers to the storage of data on the Internet, they can be used to run cloud services -- which explains why cloud service providers like Google, Facebook, Amazon and Microsoft are keen on expanding their data center capacity.

Too often, the pressure to avoid crashes or downtime pushes data managers to operate equipment at full throttle, creating an illusory disjunction between performance and energy efficiency. Over the years, environmental agencies and media muckrakers have proven that the overcompensation comes at a cost -- one borne primarily by the environment, but in no small part by host organizations’ pocketbooks as well. Gradually, the movement for “green” data centers was born.

In a green data center, one can find their standard data center components -- like servers, batteries, cooling units and so on -- integrated into a framework that synchronizes IT and energy systems, much to the benefit of all parties involved. Figuring out how and why data centers should go green, though, requires a fundamental understanding of how and why they have risen to prominence through unsustainable means.


The Problem with Data Centers

A yearlong watchdog investigation conducted by the New York Times revealed, in 2012, that data centers can potentially waste more than 90 percent of the energy they consume. Of the tens of thousands of servers probed, on average only six to 12 percent of electricity consumed was used for actual computing.

Where does the rest of the energy disappear to? Most of it goes toward simply keeping servers on standby, ready to accommodate activity surges. Multiply that differential by the more than 3 million data centers in the U.S. alone, and the waste begins to add up.

According to the U.S. Department of Energy (DOE), the amount of electricity used by data centers doubled between 2001 and 2006. In 2014, that figure reached a new high of 70 billion kilowatt-hours -- which, Data Center Knowledge reports, was the amount consumed by 6.4 million American homes that year. It’s also 90 percent higher than it was at the turn of the century, and projected to double once more by 2020.
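The household comparison can be sanity-checked with quick arithmetic; the result lines up with the roughly 10,900 kWh the U.S. Energy Information Administration estimates an average American home used per year around that time (an outside figure, not from the sources above):

```python
# Rough sanity check of the 70 billion kWh / 6.4 million homes comparison.
data_center_kwh = 70e9   # annual U.S. data center consumption, 2014
homes = 6.4e6            # number of homes cited as equivalent

kwh_per_home = data_center_kwh / homes
print(f"{kwh_per_home:,.0f} kWh per home per year")  # ~10,938
```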

The energy intensity of data centers has not gone unnoticed. A recent report from Berkeley Lab suggests that data center energy efficiency measures are being implemented, and that, as a result of those efforts, energy consumption by data centers has remained relatively flat at about 2 percent of total U.S. energy consumption.

Overall, the electricity-intensive performance of data centers comes at a price. For U.S. businesses, it’s $13 billion a year in electricity bills. For the environment, it’s the 100 million metric tons of carbon pollution emitted by data centers annually -- an impressive number, but one that’s hardly surprising, given their track record for avoiding municipal environmental regulations. In Silicon Valley, for example, data centers made a frequent appearance on the Toxic Air Contaminant Inventory, which itemizes local diesel polluters.

How Data Centers Work

Data center management requires the synchronization of multiple technological systems. Taking the time to detangle them can reveal the ruptures and redundancies from which inefficiencies arise -- just as leaving them perpetually entangled can lead to perpetual inefficiency.


Data Center Components

Facility: Space available to house IT equipment

Support infrastructure: Equipment that secures and sustains uptime, or the percentage of time that the data center’s IT system runs successfully. This usually includes:

Uninterruptible power sources (UPS): battery banks, generators

Environmental control: heating, ventilation and air conditioning (HVAC) systems, computer room air conditioning (CRAC) units

Physical security systems: biometrics, surveillance systems

IT equipment: Equipment needed for data operations and storage, including servers, storage hardware, cables and server racks

Operations staff: Monitors and maintains IT infrastructure.

Not all data centers are created equal, however, and those on the smaller side won’t necessarily require the human and material resources that large-scale facilities do.

Data Center Sizes

Much of a data center’s technological capacity, layout and general applications depends on its size -- which, in turn, reflects the scale of the managing organization’s data networks. The U.S. Environmental Protection Agency (EPA), in a congressional report compiled through its ENERGY STAR program, used the following metrics to create a baseline vocabulary for discussing data centers of ranging proportions:

These size categories, from smallest to largest, are the server closet, server room, localized data center, mid-tier data center and enterprise-class data center.

The report goes into greater detail on the infrastructural characteristics of each variant, specifying design elements, IT equipment and environmental pressures that are typical for a given size. Particular emphasis is placed on cooling mechanisms, which -- in addition to security -- are one of the biggest concerns of data center operators.

Server closets and server rooms are located on-site and usually share an HVAC system with the rest of the office. Server rooms may necessitate extra conditioning, but are neither power-intensive nor expansive enough to demand external storage. On-site server facilities were once a corporate norm, and for some organizations they remain a logical option. That said, the advent of off-site data centers has given companies and other agencies the opportunity to compartmentalize data management -- an option that holds certain appeal for those looking to minimize power outages and delegate maintenance procedures.

Localized data centers represent the first rung on the ladder of externally stored data. With dozens to hundreds of servers, these moderately sized facilities are usually equipped with a few computer room air conditioning (CRAC) units and few on-site operational staff. This lack of dedicated upkeep can often prevent equipment from operating at an optimal level.

Growing quickly in number, mid-tier and enterprise-class data centers are defined by their extensive external storage systems and expansive, capital-heavy facilities. While these large-scale data centers have a greater capacity for housing and maintaining energy efficient equipment, their tendency to rely instead on redundant power supply networks also makes them more prone to inefficiency.

Data Center Operations

To help the general public understand the general mechanics of data centers, SAP Software Solutions has created an interactive multimedia guide that explores their machinery in layman's terms. Like most data center literature, the guide's focus falls on a few key pillars of data center architecture -- among them power supply, cooling and security.

Although off-grid data centers do exist, most data centers maintain their power supply via connections to utility-owned grids, diesel generators and block batteries. Together, these sources create an uninterruptible power supply (UPS) to ensure continuous availability. Key to the UPS operability is the concept of redundancy. Redundant power networks layer and accumulate power sources in order to guarantee uptime, creating inbuilt allowances for repairs and scheduled maintenance without having to switch off IT equipment. Excessive or negligent redundancy, however, can result in overlooked increases in power consumption.
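Redundancy schemes are usually described as N (just enough capacity to carry the load), N+1 (one spare module) or 2N (full duplication). A minimal sizing sketch, with hypothetical module ratings chosen for illustration:

```python
import math

def ups_modules_needed(it_load_kw: float, module_kw: float,
                       redundancy: str = "N+1") -> int:
    """Number of UPS modules for a given IT load (simplified sizing sketch)."""
    n = math.ceil(it_load_kw / module_kw)   # modules needed to carry the load
    extras = {"N": 0, "N+1": 1, "2N": n}    # common redundancy schemes
    return n + extras[redundancy]

# A 450 kW IT load on 200 kW modules: three carry the load, plus one spare.
print(ups_modules_needed(450, 200))        # 4
print(ups_modules_needed(450, 200, "2N"))  # 6
```

Note how 2N doubles the module count: that guaranteed uptime is exactly the "excessive redundancy" that can quietly inflate power consumption.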

Extensive cooling systems -- which include HVAC systems and CRAC units -- keep IT equipment from overheating. Higher power densities generally translate into higher heat levels, a problem that antiquated and inefficiently designed data centers grapple with by implementing unorganized, energy-burning cooling schemes. Now that data centers are strapped with standards and regulations conceived to correct and mitigate inefficiencies, plenty of solutions have been devised to optimize cooling systems and reduce their energy consumption.

Of course, maintaining the security of data centers raises a different class of concerns for managers, though not one that is necessarily mutually exclusive. Security can refer to the physical safeguards and measures needed to protect the buildings themselves, or it can come up in the context of shielding data from hackers. It can also be used when discussing backup systems and certainty of uptime. Regardless of the methods an organization chooses to deploy, routine checks and consistent evaluations of IT systems and equipment are a must-have for securing any data center.


Measuring Data Center Performance

As per the standards set by the American National Standards Institute (ANSI), every data center is assigned one of four tier levels. The tier system acts as a nationally recognized performance indicator, measuring availability as well as technological capacity and resiliency. Resident tech guru nixCraft summarizes the meaning of each tier as follows:

  • Tier 1: Non-redundant capacity components (single uplink and servers)
  • Tier 2: Tier 1 + Redundant capacity components
  • Tier 3: Tier 1 + Tier 2 + Dual-powered equipment and multiple uplinks
  • Tier 4: Tier 1 + Tier 2 + Tier 3 + all components are fully fault-tolerant including uplinks, storage, chillers, HVAC systems, servers etc.
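Each tier is also commonly associated with an availability target -- the percentages below are the widely cited Uptime Institute figures, an assumption here rather than part of the summary above. A few lines of code translate them into allowable downtime per year:

```python
# Commonly cited availability targets per tier (Uptime Institute figures).
availability = {1: 99.671, 2: 99.741, 3: 99.982, 4: 99.995}

HOURS_PER_YEAR = 24 * 365
for tier, pct in availability.items():
    downtime_h = HOURS_PER_YEAR * (1 - pct / 100)
    print(f"Tier {tier}: {pct}% uptime -> {downtime_h:.1f} h downtime/yr")
```

Tier 1 works out to nearly 29 hours of downtime a year, while Tier 4 allows under half an hour -- which is why higher tiers lean so heavily on the redundant systems described earlier.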

Power usage effectiveness (PUE), or the amount of power used by a data center divided by the amount of power used to power its IT infrastructure, has become a common metric for gauging the energy efficient performance of data centers. Although a green data center’s PUE is by no means the be-all and end-all measurement of its success, energy efficiency strategies are generally put to the test with PUE calculations. For an in-depth exploration of PUE and its role in the data center industry, check out The Green Grid’s definitive white paper on the topic.
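As a sketch, PUE can be computed directly from two meter readings. The 1.8 value below is illustrative, chosen near often-cited industry averages rather than taken from any facility in this article:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy over IT energy.
    1.0 is the theoretical ideal (every watt reaches IT equipment)."""
    return total_facility_kwh / it_equipment_kwh

# A facility drawing 1,800 MWh/yr overall to deliver 1,000 MWh/yr to IT load:
print(pue(1_800, 1_000))  # 1.8
```

A falling PUE means a larger share of the facility's energy is doing computing rather than powering cooling and conversion losses.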

Benefits of Green Data Centers

Smaller Carbon Footprint

Although green data center strategies range widely in scope and magnitude, when integrated correctly and mindfully they fuse together to create a true architectural and technological ecosystem. Left to their own devices, data centers can consume reckless amounts of energy and leave behind carbon footprints the size of their ever-humming diesel generators.

Return on Investment

Inefficiently operated data centers lose an untold amount of money to negligent design, disordered operations and lacking maintenance strategies. Indiscriminate deployment of cooling measures, combined with overpowered server racks, burns through funds as quickly as it burns through electricity.

By contrast, green data centers -- in spite of overhead costs that do bear consideration -- reap impressive economic rewards by tightening up resources and optimizing systems at large. The Green Grid quantified these returns in a case study titled “The ROI of Cooling System Energy Efficiency Upgrades”, which evaluated the energy efficiency strategies of an undisclosed Green Grid company over the course of a year. Within that year, energy consumption nosedived by nearly 10 percent, leading to utility savings that amounted to about $300,000.

Sustainable Solution

The Natural Resources Defense Council (NRDC) cites “shortsighted procurement practices” as one of the key barriers data center operators must strike down if they hope to make sector-wide progress in energy efficiency. Too often, inefficient IT equipment and design strategies are implemented because they cost less upfront, even if the energy efficient options are more economical in the long run.

Green data centers build integrative operational systems that increase server capacity, productivity and longevity without compromising their security. Prioritizing optimization and long-term vision over bare outcomes and immediacy is crucial to normalizing sustainable IT infrastructure.

Green Data Center Design Strategies

Energy efficiency and data center operations were once thought to be irreconcilable. Today, growing awareness of the industry’s legacy of environmental negligence has spawned a proliferation of solutions that can help data centers truly go green.

The ENERGY STAR program has devised numerous energy-saving tactics that can be implemented by data center builders, operators and managers alike. ENERGY STAR classifies these strategies under three categories: IT Opportunities, Airflow Management Strategies and HVAC Adjustments.

IT Opportunities

In 2014, NRDC estimated that the total number of servers housed by data centers was approximately 12 million. Many data centers have followed unsustainable trajectories of growth that, if continued, will consume 140 billion kilowatt-hours of electricity each year. Although IT equipment isn’t the only direct consumer of electricity in data centers, deploying energy efficiency strategies that fully utilize the technological capacity at hand can do much to diminish energy bills and adverse environmental impacts.

Server Virtualization

When one server is used to run one application, the amount of computing resources used to actually compute can, as The New York Times discovered, turn out to be very little. By consolidating largely unproductive servers, server virtualization trades in power-intensive hardware for energy efficient software -- with the added benefits of expediting disaster recovery and diminishing downtime.
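The consolidation math can be sketched as follows -- a simplified estimate that assumes workloads divide freely across hosts, with utilization figures chosen for illustration (the 8 percent figure echoes the 6-to-12-percent range from the Times investigation):

```python
import math

def hosts_after_consolidation(servers: int, avg_util: float,
                              target_util: float) -> int:
    """Estimate physical hosts needed once workloads are virtualized.
    Assumes workloads can be packed freely -- a deliberate simplification."""
    total_work = servers * avg_util          # aggregate utilization demand
    return math.ceil(total_work / target_util)

# 100 one-application servers idling at 8% utilization, consolidated onto
# virtualization hosts targeted at 60% utilization:
print(hosts_after_consolidation(100, 0.08, 0.60))  # 14 hosts
```

Even this rough model shows an order-of-magnitude reduction in powered hardware, which is where the energy savings come from.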

Decommissioning of Unused Servers

Servers are considered “comatose” when they have not engaged in useful work in at least six months. Research conducted in 2015 found that 30 percent of servers worldwide are comatose -- which means, essentially, that they are actively consuming energy without doing much else. According to data collected by ENERGY STAR, decommissioning comatose servers lowers electricity consumption and cuts major costs. AOL, for example, saved nearly five million dollars during its inaugural Server Roundup in 2012.
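A back-of-the-envelope estimate shows why decommissioning pays off. The fleet size, idle power draw and utility rate below are assumptions chosen for illustration, not figures from the studies cited above:

```python
# Cost of keeping comatose servers powered (all inputs hypothetical).
servers = 500        # comatose servers in an example fleet
watts_each = 300     # assumed average draw per idle server, in watts
rate = 0.10          # assumed utility rate, $/kWh

kwh_per_year = servers * watts_each / 1000 * 24 * 365
print(f"{kwh_per_year:,.0f} kWh/yr, ${kwh_per_year * rate:,.0f}/yr wasted")
# 1,314,000 kWh/yr, $131,400/yr wasted
```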

Consolidation of Lightly Utilized Servers

Building off of the rationale behind server virtualization and decommission, the consolidation of inefficient servers piles multiple applications onto either one server or a single operating system. As a result, server productivity increases and power consumption goes down.

Better Management of Data Storage

Combining economized data storage strategies with energy-efficient data storage equipment can help data center managers lighten their electrical loads. Tactics like automated storage provisioning and even conscientious data compression go hand-in-hand with optimized drive architecture, which -- in the case of MAID (massive array of idle disks) -- has been proven to cut down power expenditure and increase drives’ life expectancy.

Purchasing More Energy-Efficient Servers, UPSs and PDUs

The latest models of servers, uninterruptible power supplies (UPSs) and power distribution units (PDUs) have been re-engineered to be more energy-efficient.

Airflow Management Strategies

In a data center that isn’t configured for energy efficiency, poorly managed or neglected airflows can cause the interior temperature of the facility to rise, which precipitates a need for power-intensive cooling measures. Architecting or retrofitting data centers with layouts and exhaust barriers that contain and redirect airflows puts less strain on cooling systems, lowering electricity consumption as a result.


Hot Aisle/Cold Aisle Layout

In 2006, a white paper compiled by Schneider Electric made a case against room-based cooling -- the disordered, power-dense cooling method that was, until recently, a de facto element of data center design. Row- and rack-based cooling, or even a hybrid of the two, better accounts for servers’ exhaust airflows by positioning server racks so their fronts face each other (as do their backs). As a result, hot and cold air gusts are contained to their respective rows, rather than mixing, fluctuating and demanding powerful overhead cooling units.

For new data center construction projects, the simple rearrangement can reconfigure air traffic to the effect of reducing energy consumption and increasing server longevity. Existing data centers, however, must consider the costs -- among them, guaranteed downtime and resources needed for layout revision -- that are sure to be involved.

Containment/Enclosures

Hot- and cold-aisle containment strategies build on the hot aisle/cold aisle layout to erect additional barriers against the mixing of inconsonant air streams. Cold-aisle containment involves an enclosure-based system that uses specialized materials to cordon off cold aisles from the surrounding hot air. Although this option is perhaps more practical for existing data centers, hot-aisle containment has emerged as the go-to solution for new installations. Essentially the inverse of its cold-aisle counterpart, hot-aisle containment encloses hot air and allows cold air to fill the data center -- lowering yearly cooling system energy costs by as much as 43 percent.

Variable Speed Fan Drives

CRAC systems, if not outfitted for energy efficiency, can use a disproportionate amount of electricity to power their fans. ENERGY STAR reports that fans contribute five to 10 percent of a data center’s energy consumption -- a share exceeded only by cooling compressors. With variable-speed fan drives (VFDs) in place, data center staff can adjust fan speed in response to changes in operational demand.
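The savings from slowing fans are disproportionate because, per the fan affinity laws, fan power varies with the cube of fan speed:

```python
def fan_power_fraction(speed_fraction: float) -> float:
    """Fan affinity law: power scales with the cube of fan speed."""
    return speed_fraction ** 3

# Slowing fans to 80% of full speed cuts fan power to ~51% of full draw:
print(f"{fan_power_fraction(0.8):.2f}")  # 0.51
```

A modest 20 percent speed reduction thus nearly halves fan energy, which is why VFDs pay off even when demand only dips slightly.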

Properly Deployed Airflow Management Devices

Without proper organizational and evaluation mechanisms in place, deploying multiple airflow management devices at once may produce less-than-optimal results. ENERGY STAR advises data centers to conduct or seek a professional airflow assessment for pinpointing and troubleshooting weak spots.

HVAC Adjustments

The challenges that non-optimal heating, ventilation and air conditioning (HVAC) systems present for conventional buildings are magnified tenfold in the average data center. Conscientiously operating HVAC equipment and integrating power-saving economizers into new or preexisting systems helps organizations avoid the heightened costs of HVAC inefficiency.

Server Inlet Temperature and Humidity Adjustments

Simply keeping the temperature of air that flows into IT equipment within the range standardized by the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) can prevent energy wastage. The same goes for humidity levels, and a number of strategies exist that address both. Trading in a distributed control system for a centralized one can help synchronize individual HVAC components, as can sensor-driven temperature and humidity monitoring systems. A widely quoted estimate puts the cooling-energy savings from every one-degree-Fahrenheit increase at four to five percent.
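Compounding that widely quoted per-degree estimate gives a rough sense of what a setpoint adjustment is worth; the sketch below uses the 4 percent low end of the range:

```python
def cooling_savings(degrees_raised: float,
                    pct_per_degree: float = 0.04) -> float:
    """Estimated cooling-energy savings from raising the inlet setpoint,
    compounding the quoted per-degree-Fahrenheit saving."""
    return 1 - (1 - pct_per_degree) ** degrees_raised

# Raising the inlet setpoint by 5 degrees F at 4% per degree:
print(f"{cooling_savings(5):.1%}")  # ~18.5%
```

The result stays within ASHRAE's allowable range in this example; savings estimates like this only hold while inlet temperatures remain inside the standardized envelope.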

Air-Side Economizer

Air-side economizers reduce a building’s HVAC activity, and thereby energy consumption, by using outside air instead of mechanically cooled air when possible. Programmed with modes that attune activity to exterior and interior environmental conditions, economizers integrate energy-efficient hardware with their surroundings to make for a more fluid, eco-friendly and cost effective cooling system.

According to 42U Data Center Solutions, mechanical cooling can constitute up to 40 percent of a facility’s electricity consumption. Data centers require around-the-clock cooling, which means that air-side economizers can yield big returns. Intel estimates that a data center with a load as high as 10 MW would save nearly three million dollars through air-side economization. To view a rough prediction of your project’s ROI, check out Data Aire’s Air Side Economizer Payback Calculator.
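A simple payback calculation, in the spirit of such calculators, might look like the sketch below. The install cost, energy savings and utility rate are hypothetical placeholders; real tools also weigh local climate data, humidity limits and added fan energy:

```python
def simple_payback_years(install_cost: float, annual_kwh_saved: float,
                         rate_per_kwh: float = 0.10) -> float:
    """Simple payback period for an economizer retrofit: upfront cost
    divided by the annual utility savings it produces."""
    return install_cost / (annual_kwh_saved * rate_per_kwh)

# A $250,000 retrofit saving 1.2 GWh/yr at $0.10/kWh:
print(round(simple_payback_years(250_000, 1_200_000), 1))  # 2.1 years
```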

Water-Side Economizer

While air-side economizers cool supply air directly, water-side economizers use a separate water source -- such as a cooling tower or dry cooler -- to reroute heat away from the data center and into the immediate atmosphere. Cold water is delivered from the source to the onsite economizer, where it is used, absent of any energy-powered mediatory processes, to produce cold air. Water-side economizers are generally more efficient for data center projects that cannot accommodate the low-humidity conditions that come with air-side economizers, though the ROI will ultimately depend on how well the HVAC system is calibrated to its exterior surroundings.

Green Data Center Design Standards

As more and more organizations commit themselves to retrofitting their data centers with green design and operation strategies -- in addition to the emerging data centers designed to be sustainable from the get-go -- standards have been created to establish common baselines for all stakeholders involved. The U.S. Green Building Council’s Leadership in Energy and Environmental Design (LEED) certification program included an individual data center matrix in the latest iteration of its rating system (v4), acknowledging the singularity of the demands that separate data center projects from other commercial buildings.

LEED’s requisites for data centers -- which fall across the program’s spectra of categorized credits -- emphasize sustainability without constricting projects to certain technologies or methods of achievement. Particular emphasis is placed on cooling systems, though specifications are also outlined to address the energy-related issues of UPS and IT infrastructure.

Many LEED credits work in tandem with ASHRAE criteria, and in September 2016 ASHRAE published a performance-based design standard for data centers (Standard 90.4). The new standard guides data center projects in conceiving and implementing a plan of action that utilizes renewable energy sources and maximizes energy efficient strategies in all phases of development. ASHRAE’s standard is more technical and quantitative than LEED’s concept-heavy guidelines.

Because data center performance can be measured on countless levels, many standards are needed to fully address every facet. For more information on data center standards, check out Search Data Center’s reference sheet.

LEED’s Green Data Centers: Apple

Located in Maiden, North Carolina, Apple’s iDataCenter has been LEED Platinum certified since 2009. In addition to using hot-aisle containment, the data center also integrates water-side economization into its cooling system, reusing water 35 times and reducing total water consumption by 20 percent. As a result of a partnership with Duke Energy, the facility -- one of the biggest of its kind at over 500,000 square feet -- now sources energy from three fully functional solar arrays and biogas fuel cells.

According to Apple’s 2016 Environmental Responsibility Report, their data centers managed to prevent 187,000 metric tons of CO2e from entering the atmosphere. They also consume 100 percent renewable energy, and have done so since 2013.

More on Green Data Centers

For regular updates on developments specific to green data centers, subscribe to Green Data Center News. Also be sure to check out the NRDC’s issue paper on data center efficiency for a comprehensive overview of its challenges and barriers.
