As generative AI becomes embedded in everyday work, search, and decision-making, competition between OpenAI’s ChatGPT and Google Gemini has come to define the current AI landscape. These two systems dominate global chatbot usage, shaping how individuals and businesses interact with artificial intelligence. Alongside questions of performance and reach, their rapid growth has also brought environmental costs into sharper focus.
Market Leadership and Competitive Dynamics
By early 2026, ChatGPT remains the clear market leader, accounting for roughly 68 percent of global chatbot usage based on visits and interactions. While this represents a decline from its earlier dominance, it still reflects a substantial lead, supported by an estimated 800 to 900 million weekly active users and billions of monthly visits.
Gemini, meanwhile, has emerged as the fastest-growing challenger, capturing around 18.2 percent of market share. Its expansion has been driven largely by deep integration across Google’s ecosystem, including search, Android devices, and productivity tools. This embedded approach has accelerated adoption by making Gemini accessible by default to a vast user base.
Together, ChatGPT and Gemini increasingly resemble a duopoly, with other platforms such as Claude, Perplexity, Grok, and DeepSeek growing steadily but remaining niche by comparison. The contrast between OpenAI’s standalone platform strategy and Google’s ecosystem-driven model continues to shape user preferences and competitive positioning.
Electricity Demand and AI at Scale
The environmental impact of this competition begins with energy use. Large language models operate within energy-intensive data centers that require electricity for both computation and cooling. In 2024, global data centers consumed approximately 415 terawatt-hours of electricity, around 1.5 percent of total global demand.
According to projections from the International Energy Agency, data-center electricity consumption could approach 945 terawatt-hours by 2030 as AI workloads expand. Training a large model consumes significant energy, often measured in tens of gigawatt-hours, while inference uses far less per request but occurs at massive scale.
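As a rough sanity check, the 2024 figures above imply a total for global electricity demand, and the 2030 projection can be compared against it. The 2030 global demand used here is a hypothetical assumption for illustration, not a reported figure:

```python
# Sanity-checking the data-centre shares of global electricity.
# 2024 figures are from the article; 2030 global demand is an assumption.

DC_TWH_2024 = 415
GLOBAL_SHARE_2024 = 0.015                 # 1.5 percent (article)
implied_global_2024 = DC_TWH_2024 / GLOBAL_SHARE_2024
print(f"Implied 2024 global demand: ~{implied_global_2024:,.0f} TWh")

DC_TWH_2030 = 945                         # IEA projection cited in the article
ASSUMED_GLOBAL_2030 = 31_000              # hypothetical 2030 global demand, TWh
print(f"2030 data-centre share: ~{DC_TWH_2030 / ASSUMED_GLOBAL_2030:.1%}")
```

Under that assumed 2030 demand, the projected 945 TWh works out to roughly 3 percent of global consumption.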
On a per-query basis, Gemini currently averages about 0.24 watt-hours of electricity per text prompt, while ChatGPT uses roughly 0.34 watt-hours. Individually, these figures appear minimal, but multiplied across billions of daily interactions, they translate into substantial cumulative energy demand.
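To make the scaling effect concrete, the per-query figures above can be multiplied out to fleet level. The daily query volume used here is a hypothetical assumption for illustration, not a figure reported by either company:

```python
# Back-of-the-envelope scaling of per-query energy to fleet-level demand.
# Per-query watt-hours are from the article; query volume is assumed.

WH_PER_QUERY = {"ChatGPT": 0.34, "Gemini": 0.24}  # watt-hours per text prompt
ASSUMED_QUERIES_PER_DAY = 2_000_000_000           # hypothetical: 2 billion/day

for model, wh in WH_PER_QUERY.items():
    daily_kwh = wh * ASSUMED_QUERIES_PER_DAY / 1_000      # Wh -> kWh
    annual_gwh = daily_kwh * 365 / 1_000_000              # kWh -> GWh
    print(f"{model}: {daily_kwh:,.0f} kWh/day, ~{annual_gwh:,.0f} GWh/year")
```

At that assumed volume, a fraction of a watt-hour per prompt compounds into hundreds of gigawatt-hours per year, which is the cumulative demand the article describes.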
Carbon Emissions and Energy Sources
Carbon emissions depend not only on how much energy is used but also on how that energy is generated. Data centers powered by fossil fuels carry far higher emissions than those relying on renewable sources. Current estimates suggest that a typical ChatGPT query generates around 0.15 grams of carbon dioxide, while a Gemini query emits closer to 0.03 grams, a gap that reflects Google's cleaner energy mix and efficiency gains.
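The same scaling logic applies to emissions. Using the per-query grams above and a hypothetical query volume (an assumption for illustration only), annual totals look like this:

```python
# Illustrative conversion of per-query CO2 figures to annual totals.
# Per-query grams are from the article; query volume is an assumption.

G_CO2_PER_QUERY = {"ChatGPT": 0.15, "Gemini": 0.03}  # grams CO2 per query
ASSUMED_QUERIES_PER_DAY = 2_000_000_000              # hypothetical volume

for model, grams in G_CO2_PER_QUERY.items():
    tonnes_per_year = grams * ASSUMED_QUERIES_PER_DAY * 365 / 1_000_000
    print(f"{model}: ~{tonnes_per_year:,.0f} tonnes CO2/year")
```

Even at fractions of a gram per query, the assumed volume yields totals in the tens to hundreds of thousands of tonnes per platform each year.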
Looking ahead, analysts estimate that AI-related emissions could range between 300 and 500 million tonnes of CO₂ annually by the mid-2030s, depending on deployment patterns and energy sourcing. Both OpenAI and Google have committed to reducing carbon intensity through renewable energy procurement, hardware upgrades, and efficiency improvements, but the overall footprint continues to grow alongside demand.
Water Use and Cooling Requirements
Beyond electricity and emissions, water consumption has become a growing concern. Many data centers rely on water-based cooling systems, particularly in warmer regions. Global AI-related water use could reach between 4.2 and 6.6 billion cubic meters annually by 2027, comparable to the water consumption of mid-sized countries.
At the individual level, the impact remains small. A single ChatGPT query uses roughly 0.32 milliliters of water, while a Gemini prompt uses about 0.26 milliliters. Yet, as with energy, scale matters. Billions of daily queries translate into meaningful pressure on local water resources, especially in water-stressed areas.
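Scaling the per-query water figures the same way shows how milliliters become meaningful volumes. The query volume here is, again, a hypothetical assumption for illustration:

```python
# Scaling per-query water use to annual volumes under an assumed query rate.
ML_PER_QUERY = {"ChatGPT": 0.32, "Gemini": 0.26}  # millilitres per prompt (article)
ASSUMED_QUERIES_PER_DAY = 2_000_000_000           # hypothetical volume

for model, ml in ML_PER_QUERY.items():
    litres_per_day = ml * ASSUMED_QUERIES_PER_DAY / 1_000   # mL -> litres
    m3_per_year = litres_per_day * 365 / 1_000              # litres -> cubic metres
    print(f"{model}: {litres_per_day:,.0f} L/day, ~{m3_per_year:,.0f} m3/year")
```

At that rate, each platform would draw hundreds of thousands of cubic meters a year, the kind of cumulative pressure on local water resources the article points to.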
Who Is Winning, and at What Cost?
From a market perspective, ChatGPT continues to lead the AI chatbot race in 2026, while Gemini is closing the gap through rapid ecosystem-driven growth. From an environmental standpoint, both platforms illustrate the same underlying challenge: small per-query impacts that aggregate into significant energy, carbon, and water footprints at scale.
As data center electricity demand moves toward a projected 3 percent of global consumption by 2030, the sustainability of AI will depend less on individual efficiency gains and more on systemic shifts toward clean energy, transparent reporting, and standardized environmental metrics. The outcome of the AI race, therefore, will not be measured solely by user numbers or model capability, but also by how responsibly these technologies scale in a resource-constrained world.