What Does DeepSeek Mean for AI’s Environmental Impact?

As artificial intelligence development accelerates, concerns over its environmental footprint grow. With the U.S.-based Stargate project announcing a half-trillion-dollar investment in AI infrastructure and China's DeepSeek unveiling a new, energy-efficient chatbot, the future of AI's energy consumption remains uncertain. Could advances like DeepSeek help mitigate AI's rapidly expanding carbon footprint, or is the industry destined to drive global emissions even higher?

The Rising Energy Cost of AI

DeepSeek's latest AI model claims to be significantly cheaper and more efficient than existing systems from U.S. tech giants like Microsoft and Google. This could have profound environmental implications, as the training and operation of AI models currently require enormous amounts of energy.

The International Energy Agency (IEA) estimates that the world's 8,000-plus data centers already account for 1-2% of global electricity use. AI's hunger for energy is only expected to grow. Goldman Sachs forecasts a 160% increase in power demand from data centers by 2030, potentially pushing their share of global electricity consumption to around 4%.

Paul Deane, a senior lecturer in clean energy futures at University College Cork, describes AI as having "a ferocious appetite for energy." The emergence of large-scale AI applications has led companies to build vast data centers, often relying on fossil fuels for rapid deployment. While AI can integrate well with renewable energy sources, significant infrastructure investments are required to make this feasible.

Water, the Overlooked Resource

AI's environmental impact extends beyond electricity consumption. Producing the microchips required for AI models is water-intensive, with the manufacture of a single chip consuming more than 8,300 liters of water. Data centers also require vast amounts of water for cooling, making AI's water footprint another growing concern.

A 2023 study from the University of California, Riverside, found that training a large language model such as OpenAI's GPT-3 could consume millions of liters of water. Running as few as ten to fifty queries can consume roughly half a liter, meaning that routine AI tasks are far more resource-intensive than most people realize.
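To put the study's figure in perspective, a rough back-of-envelope calculation can translate it into per-query terms. The sketch below assumes the ~0.5 liter per 10-50 queries estimate cited above; the million-user scenario is purely an illustrative assumption, not data from the study.

```python
# Back-of-envelope water estimate based on the UC Riverside figure cited
# above: roughly 0.5 L of water per 10-50 GPT-3-class queries.
LITERS_PER_BATCH = 0.5
QUERIES_LOW, QUERIES_HIGH = 10, 50

per_query_high = LITERS_PER_BATCH / QUERIES_LOW    # worst case: 0.05 L/query
per_query_low = LITERS_PER_BATCH / QUERIES_HIGH    # best case:  0.01 L/query

# Hypothetical scenario (an assumption for illustration only):
# one million users, each running 20 queries per day.
daily_queries = 1_000_000 * 20
low_m3 = daily_queries * per_query_low / 1000      # liters -> cubic meters
high_m3 = daily_queries * per_query_high / 1000

print(f"Per query: {per_query_low:.2f}-{per_query_high:.2f} L")
print(f"Scenario total: {low_m3:.0f}-{high_m3:.0f} m^3 of water per day")
```

Even the low end of such a scenario amounts to hundreds of cubic meters of water daily, which is why researchers like Ren argue that seemingly trivial per-query costs add up quickly at scale.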

Water scarcity issues further complicate matters. "If we are using crazy amounts of water in Arizona, Spain, or Uruguay, that's not a good practice," warns Shaolei Ren, a co-author of the UC Riverside study. The projected AI-driven demand for water could reach 6.6 billion cubic meters annually by 2027—six times Denmark's total water consumption.

Can AI Become More Sustainable?

Despite these concerns, experts believe AI's environmental impact can be reduced. One approach is to power data centers with renewables and store excess electricity in large-scale batteries. Data centers could also be built in locations with abundant solar and wind power, or they could schedule AI training during daylight hours to capitalize on solar energy.

Additionally, some companies are exploring heat recycling from data centers to warm nearby communities, enhancing energy efficiency. "We can't put the genie back in the bottle, but we can certainly try and make the genie better, cleaner, and more efficient," says Deane.

Transparency is another key factor. Ren argues that AI firms should disclose their water and energy usage and consider climate conditions when selecting data center locations. Measures like rainwater harvesting, liquid cooling systems, and scheduling AI tasks during off-peak hours could also help limit environmental damage.

The DeepSeek Effect

DeepSeek's AI model could upend expectations about AI's energy demands. The Chinese firm claims its system uses far fewer resources while delivering performance on par with OpenAI's latest models. Because export restrictions limited DeepSeek's access to Nvidia's most advanced AI chips, it had to develop alternative training methods, which may result in more energy-efficient AI applications.

If DeepSeek's model lives up to its promises, it could reduce the need for some of the large-scale data centers currently under development. In the long run, AI tasks may even shift to smaller devices, such as smartphones, reducing reliance on centralized computing power.

However, increased efficiency often leads to greater overall consumption—a phenomenon known as Jevons Paradox. If AI becomes cheaper and more accessible, demand could skyrocket, ultimately negating efficiency gains.
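The rebound mechanism behind Jevons Paradox can be made concrete with a toy calculation. The numbers below are invented purely to illustrate the dynamic described above, not measurements of any real AI workload: if a tenfold efficiency gain makes AI cheap enough that usage grows twentyfold, total energy consumption still doubles.

```python
# Toy illustration of Jevons Paradox: efficiency improves per-task cost,
# but induced demand can grow faster, raising total consumption.
# All figures are invented for illustration.

def total_energy(tasks: int, energy_per_task: float) -> float:
    """Total energy consumed = number of tasks x energy per task."""
    return tasks * energy_per_task

baseline = total_energy(1_000, 1.0)          # 1,000 energy units

# A 10x efficiency gain cuts per-task energy to a tenth...
efficient_per_task = 1.0 / 10

# ...but suppose cheaper access multiplies demand 20x.
rebound = total_energy(1_000 * 20, efficient_per_task)

print(baseline, rebound)  # rebound exceeds baseline despite 10x efficiency
```

Whether real-world demand elasticity is high enough to produce this outcome for AI is exactly the uncertainty Ren describes below.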

"There's a lot more uncertainty now," says Ren. "Whether we'll see AI's continued explosive growth or a shift towards more sustainable computing remains to be seen."

The Road Ahead

As AI continues its rapid evolution, the challenge will be balancing innovation with sustainability. The industry is at a crossroads: it can either pursue unchecked expansion, exacerbating energy and water demands, or take proactive steps to integrate renewable energy, improve efficiency, and minimize environmental harm.

DeepSeek's breakthrough suggests that efficiency gains are possible, but whether they will translate into lower emissions remains an open question. With AI's influence expanding across industries, the coming years will determine whether it can be a tool for sustainability or a driver of escalating environmental costs.