If you want evidence of Microsoft's progress towards its environmental "moonshot" goal, then look closer to earth: at a building site on a west London industrial estate.
The company's Park Royal datacentre is part of its commitment to drive the expansion of artificial intelligence (AI), but that ambition sits uneasily with its target of being carbon negative by 2030.
Microsoft says the centre will run entirely on renewable energy. However, the construction of datacentres and the servers they are filled with means that the company's scope 3 emissions – such as CO2 related to the materials in its buildings and the electricity people consume when using products such as Xbox – are more than 30% above their 2020 level. As a result, the company is overshooting its overall emissions target by roughly the same proportion.
This week, Microsoft's co-founder, Bill Gates, claimed AI would help combat climate change because big tech is "seriously willing" to pay extra to use clean electricity sources in order "to say that they're using green energy".
In the short term, AI has been problematic for Microsoft's green goals. Brad Smith, Microsoft's outspoken president, once called its carbon ambitions a "moonshot". In May, stretching that metaphor to breaking point, he admitted that because of its AI strategy, "the moon has moved". The company plans to spend £2.5bn over the next three years on growing its AI datacentre infrastructure in the UK, and this year it has announced new datacentre projects around the world, including in the US, Japan, Spain and Germany.
Training and operating the AI models that underpin products such as OpenAI's ChatGPT and Google's Gemini uses a lot of electricity to power and cool the associated hardware, with additional carbon generated by making and transporting the related equipment.
"It is a technology that is driving up energy consumption," says Alex de Vries, the founder of Digiconomist, a website monitoring the environmental impact of new technologies.
The International Energy Agency estimates that datacentres' total electricity consumption could double from 2022 levels to 1,000 TWh (terawatt hours) in 2026, equivalent to the energy demand of Japan. AI will result in datacentres using 4.5% of global energy generation by 2030, according to calculations by the research firm SemiAnalysis.
It means that, amid the concerns about AI's impact on jobs and humanity's long-term future, the environment now features too. Last week, the International Monetary Fund said governments should consider carbon taxes to capture the environmental cost of AI – whether a general carbon levy whose reach includes emissions from servers, or a more targeted tax on the CO2 generated by that equipment.
All the big tech firms involved in AI – Meta, Google, Amazon, Microsoft – are seeking renewable energy resources to meet their climate targets. In January, Amazon, the world's largest corporate purchaser of renewable energy, announced it had bought more than half the output of an offshore windfarm in Scotland, while Microsoft said in May it was backing $10bn (£7.9bn) in renewable energy projects. Google aims to run its datacentres entirely on carbon-free energy by 2030.
A spokesperson for Microsoft said: "We remain resolute in our commitment to meet our climate goals."
Microsoft co-founder Bill Gates, who stepped down from the company's board in 2020 but retains a stake in the company via the Gates Foundation Trust, has argued that AI can directly help fight climate change. The extra electricity demand would be matched by new investments in green generation, he said on Thursday, which would more than compensate for the use.
A recent UK government-backed report agreed, stating that the "carbon intensity of the energy source is a key variable" in calculating AI-related emissions, although it added that "a significant portion of AI training globally still relies on high-carbon sources such as coal or natural gas". The water needed to cool servers is also an issue, with one study estimating that AI could account for up to 6.6bn cubic metres of water use by 2027 – nearly two-thirds of England's annual consumption.
De Vries argues that the chase for sustainable computing power strains the supply of renewable energy, meaning fossil fuels would have to pick up the slack in other sectors of the global economy.
"More energy consumption means we don't have enough renewables to feed that increase," he says.
NexGen Cloud, a UK firm that provides sustainable cloud computing – the datacentre-reliant industry that delivers IT services such as data storage and computing power over the internet – says renewable energy sources for AI-related computing are available to datacentres if they avoid cities and are sited next to sources of hydro or geothermal power.
Youlian Tzanev, NexGen Cloud's co-founder, says:
"The industry norm has been to build around economic hubs rather than sources of renewable energy."
This makes it more difficult for any AI-focused tech company to hit carbon goals. Amazon, the world's biggest cloud computing provider, aims to be net zero – removing as much carbon as it emits – by 2040 and to match its global electricity use with 100% renewable energy by 2025. Google and Meta are pursuing the same net zero goal by 2030. OpenAI, the developer of ChatGPT, uses Microsoft datacentres to train and operate its products.
There are two key ways in which large language models – the technology that underpins chatbots such as ChatGPT or Gemini – consume energy. The first is the training phase, where a model is fed reams of data culled from the internet and beyond, and builds a statistical understanding of language itself, which ultimately enables it to churn out convincing-looking answers to queries.
The upfront energy cost of training AI is astronomical. That keeps smaller companies (and even smaller governments) out of the sector unless they have a spare $100m to throw at a training run. But it is dwarfed by the cost of actually running the resulting models, a process known as "inference". According to the analyst Brent Thill, at the investment firm Jefferies, 90% of the energy cost of AI sits in that inference phase: the electricity used when people ask an AI system to respond to factual queries, summarise a chunk of text or write an academic essay.
The electricity used for training and inference is funnelled through an enormous and growing digital infrastructure. The datacentres are filled with servers built from the ground up for the specific part of the AI workload they handle. A single training server may have a central processing unit (CPU) barely more powerful than the one in your own computer, paired with tens of specialised graphics processing units (GPUs) or tensor processing units (TPUs) – microchips designed to rapidly plough through the vast quantities of simple calculations that AI models are made of.
If you use a chatbot, as you watch it spit out answers word by word, a powerful GPU is drawing about a quarter of the power needed to boil a kettle. All of this is hosted by a datacentre, whether owned by the AI provider itself or a third party – in which case it might be called "the cloud", a fancy name for someone else's computer.
SemiAnalysis estimates that if generative AI were integrated into every Google search, this could translate into annual energy consumption of 29.2 TWh, comparable with what Ireland consumes in a year, although the financial cost to the tech company would be prohibitive. That has led to speculation that the search company may start charging for some AI tools.
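To get a feel for the scale of such an estimate, a back-of-envelope calculation reproduces a figure of the same order. The per-query energy figure and daily search volume below are illustrative assumptions, not SemiAnalysis's published workings:

```python
# Back-of-envelope sketch of an AI-in-every-search energy estimate.
# ASSUMED inputs (illustrative only, not SemiAnalysis's actual figures):
WH_PER_AI_QUERY = 9.0    # assumed watt-hours of electricity per AI-assisted search
SEARCHES_PER_DAY = 9e9   # assumed daily Google searches worldwide

annual_wh = WH_PER_AI_QUERY * SEARCHES_PER_DAY * 365
annual_twh = annual_wh / 1e12  # 1 TWh = 1e12 Wh

print(f"{annual_twh:.1f} TWh per year")  # ≈ 29.6 TWh, the same order as the 29.2 TWh cited
```

The point of the sketch is that a few watt-hours per query, multiplied by billions of daily searches, lands in the tens of terawatt-hours per year – national-grid territory.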
But some argue that looking at the energy overhead for AI is the wrong lens. Instead, consider the energy the new tools can save. A provocative paper in Nature's peer-reviewed Scientific Reports journal earlier this year argued that the carbon emissions of writing and illustrating are lower for AI than for humans.
AI systems emit "between 130 and 1,500 times" less carbon dioxide per page of text generated than human writers, the researchers from the University of California, Irvine, estimated, and up to 2,900 times less per image.
Left unsaid, of course, is what those human writers and illustrators are doing instead. Redirecting and retraining their labour in another field – such as green jobs – could be another moonshot.