As artificial intelligence moves rapidly into our lives, we face a variety of challenges in managing this technology. One in particular affects the very health of our planet: a huge new demand for energy, driven by the data centers that AI depends on.
A single data center in the United States uses as much energy as 80,000 households, according to a 2023 McKinsey & Company report. Energy demands to keep up with the expansion of AI, cloud services, and big-data-analytics computing continue to grow. A recent US Department of Energy report reveals that data centers consumed about 4.4% of total US electricity in 2023, and it projects that these centers will consume approximately 6.7% to 12% of our total by 2028. We’re talking about hundreds of terawatt hours of power.
Paradoxically, AI has the potential to unlock energy efficiencies, but for now, it uses a tremendous amount of computing power, particularly during the training phase, which requires more and better data centers. A 2025 World Economic Forum report, titled AI’s Energy Dilemma: Challenges, Opportunities, and a Path Forward, predicts that AI-related electricity consumption will grow by up to 50% annually from 2023 to 2030, challenging our power systems.
Another trend compounding this problem relates to the shrinking and stacking of electronics. Smaller chips allow for closer and taller stacking. High computing power, combined with ever tighter chip configurations, comes at a price: The stacked chips require more energy and produce more heat as they operate, creating hot spots that conventional air cooling cannot easily reach. Hotter chips mean more processing errors, protective slowdowns to bring temperatures back down, and more energy spent on cooling.
Cooling systems, indeed, account for as much as 40% of a data center’s total annual energy consumption. Traditionally, chips in data centers get cooled by chilled air that flows across metal plates attached to the computer chips to dissipate heat. Air, however, does not efficiently cool down these tightly stacked chips, and data centers struggle to control hot spots within the chip matrix. More efficient air-cooled systems use tiny fins to dissipate the heat to the moving air, but these fins take up valuable space. These air-cooled systems also use a great deal of water to chill the air in evaporative cooling towers. Warm water, a byproduct of this cooling process, can cause environmental problems when discharged downstream.
The good news is that engineers began researching an alternative, liquid cooling systems, decades ago. They have explored two different kinds of these systems: one that submerges the chip stack in fluid and another that runs fluid along the chip, called direct-to-chip or on-chip cooling. Pumping water uses much less energy than blowing air. Water also cools more efficiently than air because of its density and heat-carrying capacity. The trick has been to get the water as close as possible to the source of the heat, in other words, to the chip itself. This is where microfluidics comes into play.
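To see why water outperforms air, it helps to compare how much heat a given volume of each can carry per degree of warming. The sketch below uses standard room-temperature property values from physics textbooks (these figures are not from the article itself):

```python
# Rough comparison of the heat a cubic meter of water vs. air can absorb
# per kelvin of temperature rise. Property values are standard textbook
# figures at roughly 20 C (illustrative assumptions, not from the article).

WATER_DENSITY = 997.0   # kg/m^3
WATER_CP = 4186.0       # J/(kg*K), specific heat of liquid water
AIR_DENSITY = 1.204     # kg/m^3
AIR_CP = 1005.0         # J/(kg*K), specific heat of air

def volumetric_heat_capacity(density: float, cp: float) -> float:
    """Heat absorbed per cubic meter per kelvin, in J/(m^3*K)."""
    return density * cp

water = volumetric_heat_capacity(WATER_DENSITY, WATER_CP)
air = volumetric_heat_capacity(AIR_DENSITY, AIR_CP)

print(f"Water: {water:.3e} J/(m^3*K)")
print(f"Air:   {air:.3e} J/(m^3*K)")
# Volume for volume, water carries a few thousand times more heat than air.
print(f"Ratio: {water / air:.0f}x")
```

The ratio comes out in the thousands, which is why a modest trickle of water can remove heat that would require a large, energy-hungry flow of chilled air.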
Engineers discovered a creative solution: creating and embedding tiny channels—20 to 100 microns wide—in the chip and then pumping water, or other fluids, through the chip itself. Operators can treat the channels like mini-HVAC systems, directing the cooling to where the chips most need it at any time. These targeted, closed systems also use significantly less water. The industry uses a metric called Water Usage Effectiveness to quantify the sustainability of a data center with respect to its water usage.
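Water Usage Effectiveness (WUE), a metric from the industry consortium The Green Grid, is calculated as a facility's annual water use in liters divided by the energy its IT equipment consumes in kilowatt-hours; lower is better. A minimal sketch, using made-up illustrative figures:

```python
# Sketch of the Water Usage Effectiveness (WUE) metric mentioned above.
# WUE = annual site water use (liters) / annual IT equipment energy (kWh).
# Lower values mean less water consumed per unit of computing. All the
# facility numbers below are hypothetical, chosen only for illustration.

def wue(annual_water_liters: float, annual_it_energy_kwh: float) -> float:
    """Water Usage Effectiveness in liters per kWh of IT energy."""
    return annual_water_liters / annual_it_energy_kwh

# Hypothetical facility relying on evaporative cooling towers:
air_cooled = wue(annual_water_liters=180_000_000,
                 annual_it_energy_kwh=100_000_000)

# Hypothetical closed-loop, direct-to-chip facility:
liquid_cooled = wue(annual_water_liters=10_000_000,
                    annual_it_energy_kwh=100_000_000)

print(f"Air-cooled WUE:    {air_cooled:.2f} L/kWh")   # 1.80 L/kWh
print(f"Liquid-cooled WUE: {liquid_cooled:.2f} L/kWh")  # 0.10 L/kWh
```

Because a closed, on-chip loop recirculates its fluid rather than evaporating water in cooling towers, its WUE can be a small fraction of an air-cooled facility's.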
What does this on-chip cooling with microfluidics mean for sustainability? A team from École Polytechnique Fédérale de Lausanne (EPFL), Switzerland’s prestigious research institute and university, has created a microchip embedded with a cooling system that has the potential to drop power requirements for cooling from 30% to 0.01% of a data center’s power. Manufacturers, for their part, contend that such systems can reduce cooling energy by upward of 50% compared with traditional air cooling.
Although on-chip cooling technology has been around for some time, its adoption has been slow because of the cost of retrofitting data centers and because of the initial higher manufacturing costs of making the fluid-filled chips. Smaller research-oriented chip stacks may adopt the technology more quickly than large-scale data centers. As manufacturing becomes more cost effective and our energy challenges build, we will likely see more and more on-chip cooling come online.
Another significant potential mechanism to make data centers more sustainable involves using the waste heat from the chip cooling system to warm homes and buildings. Amazon uses recycled heat from a data center in Ireland to supply heat to a district in Dublin, for example, and Facebook says the heat from its Danish data center warms 6,900 homes. Google has similar plans in Finland.
As our hunger for AI, cloud services, and big data analytics computing grows, we must become smarter about our use of electricity to support these systems. On-chip cooling with microfluidics represents part of the solution to making our future more sustainable.
Meleah Ashford is a water resources engineer with a BS from Oregon State University and an MS from the University of California, Berkeley. She worked in industry for 30 years, mostly as an engineering consultant. Ashford now works as a certified Life Coach and owns Find Solid Ground Coaching, where she helps people meet goals related to financial well-being, starting a business, and living the life they desire. She has also owned an engineering firm. After growing up in rural eastern Oregon, she now lives in the Willamette Valley. She is currently writing a book about women in STEM.
This article was originally published in AWIS Magazine. Join AWIS to access the full issue of AWIS Magazine and more member benefits.
