Microsoft will Achieve 100% Renewable Energy by Next Year – EQ

In Short : Microsoft will achieve 100% renewable energy by next year. This milestone underscores the company’s commitment to sustainability and reducing its carbon footprint, setting a benchmark for corporate environmental responsibility.

In Detail : At Microsoft Build 2024, CEO Satya Nadella reaffirmed the company's ambitious commitment to sustainability. "We're on track to meet our goal to have our data centres powered by 100% renewable energy by next year," he declared.

So how does it plan to get there?

Addressing the strategies behind this pledge, Nadella emphasised the company’s focus on sustainable cloud services.

“We’re making our best-in-class AI infrastructure available everywhere and we’re doing this with a focus on delivering on cloud services sustainability. In fact, we’re optimising power and efficiency across every layer of the stack from the data centre to the network,” he explained.

Nadella highlighted the innovative design of Microsoft’s latest data centres, tailored specifically for AI workloads. This design ensures responsible and efficient use of every megawatt of power, aiming to reduce both cost and energy consumption of AI operations.

Additionally, advanced cooling techniques are being employed to align the workloads’ thermal profiles with the environmental conditions of their respective locations.

Microsoft’s Sustainability Challenge

However, Microsoft’s journey toward sustainability is not without challenges. The company’s annual sustainability report revealed that since 2020, carbon emissions have, in fact, risen by 30% owing to the expansion of data centres.

This figure underscores the gap between Microsoft's 2020 climate goals and the current reality, in light of its ambitious target of becoming carbon-negative by the end of the decade. Notably, that goal was set before the AI explosion kicked in, forcing tech companies to race to build the compute needed to train AI models.

To address this challenge, Microsoft chief sustainability officer Melanie Nakagawa said, “Select scale, high-volume suppliers will be required to use 100% carbon-free electricity by 2030.”

What is Google Doing?

In 2020, Google announced its objective to operate on 24/7 carbon-free energy (CFE) across all its global operations by 2030. This goal involves procuring clean energy to meet their electricity needs every hour of every day, on every grid, wherever they operate.

Google noted, “Achieving 24/7 CFE is far more complex and technically challenging than annually matching our energy use with renewable energy purchases. No company of our size has achieved 24/7 CFE before, and there’s no playbook for making it happen.”

NVIDIA to the Rescue

Recently, NVIDIA announced the Blackwell platform. It allows organisations to develop and deploy real-time generative AI on trillion-parameter models at up to 25 times lower energy consumption and cost than previous methods.

If OpenAI uses Blackwell to train its large language models, the CO2 emissions associated with training GPT could potentially be around 12 tons. This is significantly less than GPT-4, which is estimated to produce around 300 tons of CO2.
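The ~12-ton estimate follows directly from applying NVIDIA's "up to 25x" reduction claim to the GPT-4 figure cited above; a quick sketch of that arithmetic:

```python
# Sanity-check of the emissions estimate, using the figures quoted in the article.
gpt4_emissions_tons = 300        # estimated CO2 from training GPT-4
reduction_factor = 25            # NVIDIA's "up to 25x" Blackwell claim
blackwell_emissions_tons = gpt4_emissions_tons / reduction_factor
print(blackwell_emissions_tons)  # prints 12.0
```

This is a best-case reading: the 25x figure is NVIDIA's upper bound, so real-world training emissions would likely fall somewhere above 12 tons.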

Reports indicate that since 2012 the computing power used for AI training has grown rapidly, doubling every 3.4 months on average. However, with major players like OpenAI, Google, Meta, and Microsoft adopting Blackwell, there is a collective effort to address the sustainability challenges of AI innovation.

At the recently concluded Microsoft Build, Nadella mentioned that Azure will be among the first cloud providers to offer NVIDIA's Blackwell GPUs, including GB200 configurations.

Earlier, in the GTC keynote in San Jose, NVIDIA CEO Jensen Huang stated, “Our aim is to continually reduce costs and energy consumption, as they are directly linked, to expand and scale up computation for training future models.”

Training a GPT model with 1.8 trillion parameters typically takes around three to five months on roughly 25,000 Ampere-generation (A100) GPUs. NVIDIA claims that training such a model would previously have required 8,000 Hopper GPUs and 15 megawatts of power, completing in about 90 days.

Training such a model is less costly than one might assume, but with 8,000 GPUs the expenses are still significant. Blackwell offers a more efficient alternative, needing only 2,000 GPUs and consuming just four megawatts over the same 90-day period.
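Putting NVIDIA's two scenarios side by side shows what the power figures imply in total energy over the 90-day run (a rough sketch; it ignores cooling overhead and assumes constant draw):

```python
# Total energy over a 90-day training run, from the figures quoted above.
HOURS = 90 * 24                # 90-day run, in hours

hopper_mwh = 15 * HOURS        # 8,000 Hopper GPUs drawing 15 MW
blackwell_mwh = 4 * HOURS      # 2,000 Blackwell GPUs drawing 4 MW

print(hopper_mwh)              # prints 32400 (MWh)
print(blackwell_mwh)           # prints 8640 (MWh)
print(hopper_mwh / blackwell_mwh)  # prints 3.75
```

So for this particular workload the stated numbers work out to 4x fewer GPUs and roughly 3.75x less energy, well short of the headline "up to 25x" but still a substantial saving.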

What’s Next?

Recent findings from Cornell University highlighted that training an LLM like GPT-3 produced carbon emissions of about 500 metric tons, or roughly 1.1 million pounds. For comparison, a typical coal-fuelled power plant running continuously burns about 2.7 million pounds of coal in 24 hours, so the training emissions are comparable in weight to the coal such a plant burns in roughly 10 hours.
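The coal comparison above can be checked with a few lines of arithmetic (a rough weight-for-weight comparison, since the article equates pounds of carbon emitted with pounds of coal burned):

```python
# Convert 500 metric tons of carbon to pounds, then compare against a coal
# plant burning 2.7 million pounds of coal per 24 hours.
LB_PER_METRIC_TON = 2204.62

carbon_lb = 500 * LB_PER_METRIC_TON     # ~1.1 million pounds
coal_lb_per_hour = 2_700_000 / 24       # 112,500 lb of coal per hour

hours_equivalent = carbon_lb / coal_lb_per_hour
print(round(hours_equivalent, 1))       # prints 9.8
```

That is where the "about 10 hours" figure comes from.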

Recognising the need for an energy breakthrough to support the future development of AI, OpenAI chief Sam Altman invested $375 million in Helion Energy, a private US nuclear fusion company.

At a Bloomberg event during the World Economic Forum’s annual meeting in Davos, Altman emphasised the potential of nuclear fusion and affordable solar energy as viable pathways to support sustainable AI development.

Anand Gupta, Editor - EQ Int'l Media Network