Grok AI has emerged as the most energy-efficient AI chatbot in a recent comparison, with each query producing just 0.17 grams of CO2, a fraction of what its competitors emit.
In contrast, OpenAI's GPT-4 generates roughly 25 times more emissions per query, raising questions about the environmental cost of advanced AI models.
A recent analysis by TRG Datacenters compared the carbon footprints of various AI models, measuring emissions per query based on standard energy grid assumptions. The results reveal a wide gap in efficiency, with some models demanding far more power than others.
The AI Carbon Footprint Breakdown
AI chatbots differ widely in how much energy they consume during inference. Here’s how they rank in terms of CO2 emissions per query:
- Grok AI – 0.17g
- Google Gemini – 1.6g
- LLaMA (Meta AI) – 3.2g
- Claude AI – 3.5g
- Perplexity AI – 4g
- ChatGPT (GPT-4) – 4.32g
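The ratios between these figures can be checked directly from the reported numbers. A minimal sketch (values in grams of CO2 per query, as listed above):

```python
# Per-query CO2 emissions (grams), as reported in the TRG Datacenters comparison.
emissions_g = {
    "Grok AI": 0.17,
    "Google Gemini": 1.6,
    "LLaMA (Meta AI)": 3.2,
    "Claude AI": 3.5,
    "Perplexity AI": 4.0,
    "ChatGPT (GPT-4)": 4.32,
}

# Express each model's footprint as a multiple of the most efficient one.
baseline = emissions_g["Grok AI"]
for model, grams in sorted(emissions_g.items(), key=lambda kv: kv[1]):
    print(f"{model:18s} {grams:5.2f} g  ({grams / baseline:.1f}x Grok)")
```

Running this confirms the headline figure: GPT-4's 4.32 g works out to about 25.4 times Grok's 0.17 g per query.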
Grok AI’s low emissions result from its simplified computational design, which reduces power usage while maintaining performance. In practical terms, a single query on Grok produces about the same emissions as a basic Google search, making it the greenest option on the market.
Meanwhile, Google Gemini ranks second, emitting 1.6g of CO2 per query. Google’s heavy investment in renewable energy and custom AI hardware helps curb its carbon footprint, though it still lags behind Grok.
Meta’s LLaMA model follows at 3.2g CO2 per query, benefiting from Meta’s commitment to renewable energy but still emitting twice as much as Gemini. Claude AI ranks slightly worse, producing 3.5g CO2 per query, with its emphasis on safety and reliability seemingly driving up computational costs.
At the higher end of the spectrum, Perplexity AI (4g CO2 per query) and ChatGPT (4.32g CO2 per query) stand out for their environmental impact. GPT-4, in particular, has the highest carbon footprint among the chatbots studied.
Its computational intensity, deep learning architecture, and search feature demand massive energy resources: each query generates emissions equivalent to sending 21 emails or nearly a full phone charge.
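Per-query differences of a few grams compound quickly at scale. A rough illustration, using a hypothetical workload of one million queries per day (the volume is an assumption for scale only, not a figure from the study):

```python
# Hypothetical illustration: daily CO2 output for an ASSUMED workload of
# 1,000,000 queries/day, using the study's per-query figures (grams).
QUERIES_PER_DAY = 1_000_000  # assumption, chosen only to show scale

per_query_g = {"Grok AI": 0.17, "ChatGPT (GPT-4)": 4.32}
for model, grams in per_query_g.items():
    daily_kg = grams * QUERIES_PER_DAY / 1000  # grams -> kilograms
    print(f"{model}: {daily_kg:,.0f} kg CO2/day")
```

Under that assumption, Grok's workload would emit around 170 kg of CO2 per day versus roughly 4,320 kg for GPT-4 — the same 25x gap, made tangible.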
A spokesperson from TRG Datacenters commented on the findings:
“As AI adoption continues to rise, finding ways to reduce its energy consumption will be key. Some models are already designed to be more efficient, but there is still room for improvement. Advances in hardware, more optimized AI models, and increased use of renewable energy in data centres could help lower emissions over time. AI is here to stay, but balancing innovation with sustainability will be essential in minimizing its environmental impact.”
With AI usage skyrocketing, energy efficiency is becoming a pressing concern. While Grok AI sets the benchmark for low-carbon performance, larger models like GPT-4 highlight the environmental trade-offs that come with high-powered AI capabilities.
The future of AI sustainability will likely depend on hardware improvements, algorithmic optimisations, and increased reliance on green energy. For now, the numbers show that not all AI models are created equal, and some come with a much heavier environmental cost than others.