
Green IT and sustainable data centers: 8 myths vs realities backed by recent evidence

Debunking common misconceptions about green IT and sustainable data centers, from the belief that cloud migration automatically reduces emissions to assumptions about renewable energy credits and the true impact of liquid cooling.

Why It Matters

Global data-center electricity consumption is projected to exceed 1,000 TWh by 2026, roughly doubling from 2022 levels and representing nearly 4 percent of total global electricity demand (IEA, 2025). The explosive growth of generative AI workloads has accelerated this trajectory: a single ChatGPT query consumes approximately ten times the energy of a Google search (Goldman Sachs, 2024). As enterprises race to expand computing capacity, sustainability claims about data centers have proliferated, yet many rest on outdated assumptions or misleading metrics. Misunderstanding the real carbon footprint of digital infrastructure leads to greenwashing risk, misallocated capital, and missed opportunities for genuine emissions reductions. This article examines eight prevalent myths about green IT and sustainable data centers and contrasts them with the latest evidence from 2024 through early 2026.

Key Concepts

Several foundational metrics shape the sustainability conversation around data centers. Power Usage Effectiveness (PUE) is the ratio of total facility energy to IT equipment energy; a PUE of 1.0 would mean zero overhead, while the global average sits at approximately 1.58 (Uptime Institute, 2025). Water Usage Effectiveness (WUE) measures liters of water consumed per kilowatt-hour of IT load, an increasingly critical metric as evaporative and adiabatic cooling systems expand. Carbon-free energy (CFE) matching refers to the practice of matching electricity consumption with carbon-free sources on an hourly, location-specific basis, as opposed to annual renewable energy certificate (REC) procurement. Scope 2 emissions cover purchased electricity and heat, while Scope 3 includes embodied carbon: the emissions from manufacturing servers, networking equipment, and building materials. Embodied carbon alone can account for 20 to 30 percent of a data center's lifecycle carbon footprint (Borderstep Institute, 2024).
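The two efficiency ratios above reduce to simple divisions. This sketch computes PUE and WUE from hypothetical annual facility figures (all numbers are illustrative assumptions, not measured data):

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT equipment energy."""
    return total_facility_kwh / it_equipment_kwh

def wue(water_liters: float, it_equipment_kwh: float) -> float:
    """Water Usage Effectiveness: liters of water consumed per kWh of IT load."""
    return water_liters / it_equipment_kwh

# Hypothetical annual figures for a single facility.
total_kwh = 79_000_000   # total facility consumption, kWh
it_kwh = 50_000_000      # IT equipment consumption, kWh
water_l = 90_000_000     # cooling water consumed, liters

print(f"PUE: {pue(total_kwh, it_kwh):.2f}")        # 1.58, matching the global average
print(f"WUE: {wue(water_l, it_kwh):.2f} L/kWh")    # 1.80
```

Note that both ratios are silent on what powers the facility, which is exactly the limitation Myth 1 addresses.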

Myth 1: PUE is the definitive measure of data-center sustainability. PUE captures only energy efficiency of the facility's infrastructure relative to IT load. It says nothing about the carbon intensity of the electricity supply, water consumption, embodied carbon in hardware, or the utilization rate of servers. A facility with a PUE of 1.1 running on coal-fired power produces far more emissions than one with a PUE of 1.4 powered entirely by on-site renewables. The Uptime Institute's 2025 global survey found that 62 percent of operators still use PUE as their primary sustainability metric despite its well-documented limitations (Uptime Institute, 2025). A holistic approach should incorporate carbon intensity per compute unit, WUE, and lifecycle embodied carbon alongside PUE.
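The coal-versus-renewables comparison above can be made concrete. Operational emissions scale with both PUE and grid carbon intensity; the intensities and IT load below are illustrative assumptions:

```python
def annual_emissions_tonnes(it_kwh: float, pue: float, grid_gco2_per_kwh: float) -> float:
    """Operational CO2 from powering the facility, in metric tons per year."""
    return it_kwh * pue * grid_gco2_per_kwh / 1_000_000

IT_KWH = 10_000_000  # assumed annual IT load, kWh

# Efficient facility on a coal-heavy grid (~900 gCO2/kWh assumed).
coal_facility = annual_emissions_tonnes(IT_KWH, pue=1.1, grid_gco2_per_kwh=900)
# Less efficient facility on near-zero-carbon supply (~30 gCO2/kWh assumed).
green_facility = annual_emissions_tonnes(IT_KWH, pue=1.4, grid_gco2_per_kwh=30)

print(f"PUE 1.1 on coal-heavy grid: {coal_facility:,.0f} t CO2/yr")   # 9,900
print(f"PUE 1.4 on clean supply:    {green_facility:,.0f} t CO2/yr")  # 420
```

Under these assumptions the "inefficient" facility emits roughly 24 times less, which is why carbon intensity per compute unit belongs alongside PUE.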

Myth 2: Moving to the cloud automatically reduces your carbon footprint. Cloud providers achieve higher server utilization rates (typically 40 to 60 percent versus 10 to 20 percent for on-premise facilities) and benefit from economies of scale in cooling and power delivery (Lawrence Berkeley National Laboratory, 2024). However, migration does not guarantee lower emissions. The carbon savings depend on the specific cloud region's grid mix, the workload's compute intensity, and whether migration triggers a "rebound effect" where cheaper compute encourages more consumption. A 2025 analysis by the Cloud Carbon Footprint consortium found that 35 percent of enterprise cloud migrations resulted in higher total energy consumption within 18 months because of expanded workloads, unoptimized architectures, and over-provisioned instances (Cloud Carbon Footprint, 2025).
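The rebound effect described above is easy to check with a before/after audit. A minimal sketch, using assumed workload and per-unit energy figures: migration improves efficiency per unit of work, yet expanded demand still raises the total.

```python
def total_energy_kwh(workload_units: float, kwh_per_unit: float) -> float:
    """Total energy as workload volume times energy per unit of work."""
    return workload_units * kwh_per_unit

# Assumed figures: cloud cuts per-unit energy 40%, but workload more than doubles.
pre_migration = total_energy_kwh(workload_units=1_000_000, kwh_per_unit=0.50)   # on-premise
post_migration = total_energy_kwh(workload_units=2_200_000, kwh_per_unit=0.30)  # cloud, 18 months later

print(f"pre-migration:  {pre_migration:,.0f} kWh")   # 500,000
print(f"post-migration: {post_migration:,.0f} kWh")  # 660,000
print("rebound" if post_migration > pre_migration else "net saving")
```

This is why the Action Checklist recommends auditing post-migration consumption at fixed intervals rather than assuming the per-unit efficiency gain carries through to totals.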

Myth 3: Renewable energy certificates (RECs) make a data center carbon-neutral. Purchasing unbundled RECs allows an operator to claim renewable energy on paper without changing the actual electrons consumed at the facility. A data center in a coal-heavy grid region buying wind RECs from a distant market does not displace local fossil-fuel generation. Google's pioneering 24/7 Carbon-Free Energy (CFE) initiative demonstrated the gap: in 2024, Google's global fleet achieved 64 percent hourly CFE matching, but some individual facilities in Asia and parts of Europe remained below 40 percent despite the company purchasing enough RECs to cover 100 percent of annual consumption (Google, 2025). Hourly, location-specific CFE matching is the emerging gold standard, endorsed by the UN's 24/7 Carbon-Free Energy Compact, which had 170 signatories by January 2026.
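The gap between annual REC accounting and hourly CFE matching can be shown on made-up hourly data. In each hour, only consumption actually covered by carbon-free supply in that same hour counts as matched:

```python
def hourly_cfe_score(consumption_kwh: list, cfe_supply_kwh: list) -> float:
    """Share of consumption met by carbon-free energy in the same hour."""
    matched = sum(min(c, s) for c, s in zip(consumption_kwh, cfe_supply_kwh))
    return matched / sum(consumption_kwh)

# Four illustrative hours: steady load, variable carbon-free supply
# (e.g. abundant daytime solar, nothing overnight).
load = [100, 100, 100, 100]
cfe  = [250, 150, 0, 0]

annual_match = sum(cfe) / sum(load)          # 100% "renewable" on paper
hourly_match = hourly_cfe_score(load, cfe)   # but only half the hours were covered

print(f"annual matching: {annual_match:.0%}")   # 100%
print(f"hourly matching: {hourly_match:.0%}")   # 50%
```

The surplus solar in hour one cannot power the servers overnight, which is the physical reality that annual REC claims paper over.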

What's Working

Liquid cooling adoption is accelerating. As AI accelerator chips push rack densities beyond 40 kW, traditional air cooling reaches its physical limits. Direct-to-chip and immersion-cooling technologies reduce cooling energy by 30 to 50 percent compared with conventional air-cooled systems and virtually eliminate the need for water-intensive evaporative cooling (Dell'Oro Group, 2025). NVIDIA's GB200 NVL72 racks, shipping in volume since late 2025, are designed exclusively for liquid-cooled deployments. The global liquid-cooling market for data centers is projected to grow at a compound annual rate of 25 percent through 2028 (Markets and Markets, 2025).

Myth 4: Liquid cooling is only viable for hyperscalers. While hyperscale operators like Google, Microsoft, and Meta have led adoption, colocation providers including Equinix, Digital Realty, and QTS now offer liquid-cooled cabinets as standard configurations. Modular, rear-door heat-exchanger solutions from vendors such as CoolIT Systems and Motivair can retrofit existing enterprise data centers at a fraction of the cost of full immersion deployments. A 2025 Uptime Institute survey found that 22 percent of enterprise data-center operators had deployed some form of liquid cooling, up from 8 percent in 2023 (Uptime Institute, 2025).

Circular-economy practices for IT hardware are gaining momentum. Server refresh cycles have historically been three to five years, generating significant e-waste and embodied carbon. Extending server life by even one year reduces lifecycle emissions by approximately 15 to 20 percent (Borderstep Institute, 2024). Companies like Iron Mountain Data Centers and Sims Lifecycle Services have built profitable businesses around IT asset disposition (ITAD), refurbishment, and component harvesting. Microsoft's Circular Centers program, operational at 28 data-center campuses as of 2025, reuses or recycles over 90 percent of decommissioned servers, diverting thousands of metric tons of e-waste annually (Microsoft, 2025).
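The lifecycle benefit of longer refresh cycles comes from amortizing a fixed manufacturing footprint over more years of service. A sketch with assumed per-server figures (the exact saving depends on the embodied-to-operational split; higher embodied shares approach the 15 to 20 percent cited above):

```python
EMBODIED_KG_CO2 = 1300             # assumed manufacturing footprint per server
OPERATIONAL_KG_CO2_PER_YEAR = 300  # assumed annual operating emissions per server

def lifecycle_kg_per_year(service_years: float) -> float:
    """Total lifecycle emissions averaged over the server's service life."""
    return EMBODIED_KG_CO2 / service_years + OPERATIONAL_KG_CO2_PER_YEAR

four_year = lifecycle_kg_per_year(4)   # 625 kg CO2e/yr
five_year = lifecycle_kg_per_year(5)   # 560 kg CO2e/yr
reduction = 1 - five_year / four_year

print(f"4-year refresh: {four_year:.0f} kg CO2e/yr")
print(f"5-year refresh: {five_year:.0f} kg CO2e/yr")
print(f"reduction from one extra year: {reduction:.0%}")  # ~10% under these assumptions
```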

Myth 5: Renewable energy alone will decarbonize data centers. Even with 100 percent renewable electricity procurement, data centers carry substantial Scope 3 emissions from hardware manufacturing, construction materials, and diesel backup generators. A 2024 Borderstep Institute study estimated that embodied carbon accounts for 22 percent of a hyperscale data center's total lifecycle emissions (Borderstep Institute, 2024). Furthermore, rapidly growing AI demand threatens to outpace renewable-energy buildout in constrained grid regions. In Ireland, data centers consumed 21 percent of national electricity in 2024, prompting the government to impose a moratorium on new connections in the Dublin area (EirGrid, 2025). Supply-side renewables must be complemented by demand-side efficiency, workload optimization, and grid-aware scheduling.

What's Not Working

Myth 6: Water consumption is a secondary concern for data centers. Evaporative cooling remains the dominant approach for large facilities in temperate and arid climates, consuming staggering volumes of freshwater. Microsoft's 2024 Environmental Sustainability Report disclosed that its global data-center water consumption reached 7.8 billion liters in fiscal year 2024, a 23 percent increase year-on-year driven by AI capacity expansion (Microsoft, 2024). In water-stressed regions like the American Southwest, northern Chile, and parts of India, data-center water use increasingly competes with municipal and agricultural demand. WUE metrics remain inconsistently reported: the Uptime Institute found that only 37 percent of operators publicly disclose water consumption figures (Uptime Institute, 2025).

Myth 7: Edge computing will reduce data-center energy demand. Edge deployments bring compute closer to end users, reducing latency and network transmission energy. However, they do not replace centralized data centers; they supplement them. A 2025 analysis by the Lawrence Berkeley National Laboratory estimated that edge infrastructure could add 50 to 100 TWh of additional global electricity demand by 2030, on top of centralized data-center growth, because edge use cases such as autonomous vehicles, IoT analytics, and augmented reality create net-new compute demand rather than shifting existing workloads (LBNL, 2025). Moreover, edge facilities are typically smaller and less efficient than hyperscale campuses, with average PUEs of 1.6 to 2.0.

Myth 8: Carbon offsets can bridge the gap while infrastructure catches up. Some operators purchase carbon credits to compensate for emissions that remain after efficiency improvements and renewable procurement. However, offset quality is highly variable, and reliance on credits without a credible reduction trajectory risks reputational and regulatory exposure. The Science Based Targets initiative (SBTi) does not allow carbon credits to count toward Scope 1 or 2 reduction targets, and the EU Corporate Sustainability Reporting Directive (CSRD) requires separate disclosure of offsets from actual emission reductions (SBTi, 2024). Using offsets as a bridge strategy without a clear phase-down plan for fossil-fuel-dependent infrastructure is increasingly viewed as greenwashing by investors and regulators alike.

Transparency gaps persist across the industry. Despite growing pressure, many operators provide only headline PUE figures and aggregate renewable-energy procurement percentages. Granular, facility-level reporting of carbon intensity per unit of compute, hourly CFE matching rates, water consumption, and embodied carbon remains rare. The lack of standardized reporting frameworks specific to data centers makes it difficult for customers and investors to compare operators or verify sustainability claims.

Key Players

Established Leaders

  • Google — Pioneer of 24/7 carbon-free energy matching, achieving 64% global hourly CFE in 2024 and targeting 100% by 2030.
  • Microsoft — Operates Circular Centers at 28 campuses; committed to being carbon-negative by 2030 with water-positive targets.
  • Equinix — Largest colocation provider globally, with 96% renewable energy coverage and liquid-cooling rollout across 70+ metros.
  • Schneider Electric — Provides EcoStruxure data-center management software and modular power/cooling infrastructure.

Emerging Startups

  • Submer — Immersion-cooling technology using single-phase dielectric fluids; deployed across Europe and APAC.
  • LiquidCool Solutions — Liquid-cooled server chassis designed for retrofit into existing rack environments.
  • Evroc — European sovereign-cloud startup building hyperscale facilities designed for 100% renewable power from inception.
  • Lancium — Develops grid-responsive data centers co-located with renewable-energy generation assets in Texas.

Key Investors/Funders

  • Breakthrough Energy Ventures — Bill Gates-backed fund investing in data-center efficiency and clean-energy infrastructure.
  • Infrastructure Masons (iMasons) — Industry nonprofit coordinating sustainability standards and workforce development for digital infrastructure.
  • Digital Realty — Major REIT investing in sustainable data-center construction with green-bond financing exceeding $5 billion.

Examples

Google's 24/7 Carbon-Free Energy program. Google has published hourly CFE scores for every data-center region since 2022 and uses advanced electricity procurement, battery storage, and geothermal investments to push toward round-the-clock carbon-free operation. In 2024, its data centers in Denmark, Finland, and Oregon exceeded 90 percent hourly CFE, while facilities in Singapore and parts of Asia remained below 40 percent, illustrating the grid-dependency challenge. The company's open-source methodology has been adopted by over 30 other operators and informed the UN 24/7 CFE Compact (Google, 2025).

Microsoft's Circular Centers. Launched in 2020 and scaled to 28 campuses by 2025, Microsoft's Circular Centers use machine-learning-driven sorting to identify servers and components that can be reused, refurbished, or donated. In fiscal year 2025, the program diverted 83 percent of decommissioned assets from landfill by weight and channeled reusable components into secondary markets, reducing embodied-carbon emissions equivalent to approximately 180,000 metric tons of CO2 (Microsoft, 2025). The model has inspired similar programs at AWS and Meta.

Nautilus Data Technologies' water-cooled floating data center. Nautilus operates a floating data center in Stockton, California, that uses rear-door heat exchangers cooled by a closed-loop water system drawing from the adjacent waterway. The facility achieves a PUE of 1.15 and eliminates water evaporation entirely, using zero potable water for cooling. Nautilus reported a 30 percent reduction in total cost of ownership compared with traditional air-cooled facilities of equivalent capacity, demonstrating that unconventional designs can deliver both economic and environmental benefits (Nautilus Data Technologies, 2025).

EirGrid's data-center connection policy in Ireland. Faced with data centers consuming 21 percent of national electricity and threatening grid stability, Ireland's transmission system operator EirGrid introduced conditional connection policies requiring new data-center applicants to demonstrate dispatchable on-site generation, demand-flexibility capabilities, and direct renewable-energy contracts. The policy has slowed the pace of new connections in the Dublin region but has also prompted operators like Amazon Web Services and Microsoft to invest in dedicated wind and solar farms connected directly to their Irish facilities, accelerating the country's renewable buildout (EirGrid, 2025).

Action Checklist

  • Move beyond PUE as a standalone metric. Track carbon emissions per kilowatt-hour of IT load (gCO2/kWh) and, where practical, per unit of useful compute, plus WUE and embodied carbon alongside efficiency ratios.
  • Transition from annual REC procurement to hourly, location-specific carbon-free energy matching using tools like Google's CFE methodology or Electricity Maps' data feeds.
  • Assess cloud workloads for rebound effects: audit post-migration compute consumption at 6 and 18 months to ensure efficiency gains are not offset by demand growth.
  • Evaluate liquid-cooling readiness for current and planned GPU/accelerator deployments; request liquid-cooling options from colocation providers.
  • Implement IT asset lifecycle extension strategies: target server refresh cycles of five years or longer where performance permits, and partner with certified ITAD vendors for end-of-life management.
  • Require facility-level sustainability disclosures from cloud and colocation providers, including hourly CFE rates, water consumption, and Scope 3 embodied-carbon estimates.
  • Develop a water-risk assessment for every data-center location using tools such as the WRI Aqueduct Water Risk Atlas.
  • Avoid treating carbon offsets as a primary decarbonization strategy; use them only for genuinely residual emissions after exhausting efficiency, renewable-energy, and hardware-lifecycle measures.

FAQ

Is a low PUE a reliable indicator of a sustainable data center? Not by itself. PUE measures infrastructure energy efficiency but ignores the carbon intensity of electricity, water consumption, embodied carbon from hardware manufacturing, and server utilization rates. A data center can have an excellent PUE of 1.1 while running entirely on fossil-fuel power. Organizations should demand multi-metric reporting that includes carbon intensity per compute unit, WUE, and lifecycle emissions.

How much more energy does AI training consume compared with traditional workloads? Training a large language model can consume 10 to 100 times more energy than a comparably sized traditional computing task. The IEA (2025) estimates that AI-related electricity demand could account for over one-third of total data-center consumption by 2028. Inference workloads, which run continuously after training, collectively consume even more energy than the training phase itself over the lifetime of a model.

Do renewable energy certificates actually reduce emissions? Unbundled RECs purchased from distant markets have minimal impact on local grid emissions. They represent an accounting mechanism, not a physical change in electricity supply. Hourly, location-matched CFE procurement, direct power-purchase agreements with new renewable projects ("additionality"), and behind-the-meter generation deliver genuine emission reductions. The UN 24/7 CFE Compact and SBTi both emphasize the importance of additionality and temporal matching over annual REC-based claims.

What is embodied carbon and why does it matter for data centers? Embodied carbon refers to the greenhouse gas emissions from manufacturing, transporting, and installing data-center hardware (servers, switches, storage) and construction materials (steel, concrete, cabling). The Borderstep Institute (2024) estimates embodied carbon represents 20 to 30 percent of a hyperscale facility's lifecycle footprint. As operational emissions decline through renewable energy procurement, embodied carbon becomes a proportionally larger share, making hardware-lifecycle management and circular-economy practices essential.

Can edge computing reduce the overall environmental impact of digital infrastructure? Edge computing reduces latency and network energy for specific use cases but generally adds net-new energy demand rather than displacing centralized data-center load. Edge facilities are smaller and typically less energy-efficient than hyperscale campuses. Strategic edge deployment can reduce total system energy for latency-sensitive applications, but it should be evaluated on a case-by-case basis rather than assumed to be inherently greener.

Sources

  • IEA. (2025). Electricity 2025: Analysis and Forecast to 2027. International Energy Agency.
  • Goldman Sachs. (2024). AI, Data Centers, and the Coming US Power Demand Surge. Goldman Sachs Research.
  • Uptime Institute. (2025). Global Data Center Survey 2025: Efficiency, Sustainability, and Resilience Trends.
  • Borderstep Institute. (2024). Lifecycle Carbon Footprint of Hyperscale Data Centers: Embodied vs. Operational Emissions. Berlin.
  • Google. (2025). 2024 Environmental Report: 24/7 Carbon-Free Energy Progress. Alphabet Inc.
  • Microsoft. (2024). 2024 Environmental Sustainability Report. Microsoft Corporation.
  • Microsoft. (2025). Circular Centers Impact Report: Fiscal Year 2025. Microsoft Corporation.
  • Cloud Carbon Footprint. (2025). Enterprise Cloud Migration and Energy Rebound: A Multi-Sector Analysis.
  • Lawrence Berkeley National Laboratory. (2024). United States Data Center Energy Usage Report 2024. U.S. Department of Energy.
  • LBNL. (2025). Edge Computing Energy Projections: 2025-2030 Outlook. Lawrence Berkeley National Laboratory.
  • Dell'Oro Group. (2025). Data Center Physical Infrastructure Quarterly Report: Liquid Cooling Adoption Trends.
  • Markets and Markets. (2025). Data Center Liquid Cooling Market: Global Forecast to 2028.
  • EirGrid. (2025). Data Centres and the Irish Electricity System: 2024 Update and Connection Policy.
  • SBTi. (2024). SBTi Corporate Net-Zero Standard v2.0: Treatment of Carbon Credits and Offsets.
  • Nautilus Data Technologies. (2025). Floating Data Center Performance Report: PUE, Water, and Cost Metrics.
