As artificial intelligence (AI) continues to advance, balancing innovation with sustainability remains a critical challenge. OpenAI recently unveiled its most powerful AI model to date, o3, and alongside the cost of running such models, their environmental impact is drawing increasing attention. One study estimated that each o3 task consumes approximately 1,785 kWh of energy, roughly the electricity an average US household uses over two months. That translates to 684 kilograms of CO₂ equivalent (CO₂e) emissions, comparable to the carbon from burning more than five full tanks of petrol. The high-compute version of o3 was benchmarked on the ARC-AGI framework, with the estimate derived from standard GPU energy consumption and grid emissions factors. Because the calculation covers only the GPUs and excludes embodied carbon, the true figures may be underestimates.

Experts have also raised concerns that these energy costs will not scale down quickly. Solving complex math problems with o3, for example, involves generating multiple drafts, intermediate tests, and chains of reasoning, all of which consume significant energy. Earlier this year, a separate study estimated that a single ChatGPT conversation consumes about 10% of an average person’s daily drinking water. That may seem insignificant, but with millions of people using the chatbot daily, it adds up to a substantial water footprint. Experts have further warned that AI advances like o3 could trigger the Jevons paradox, in which efficiency gains lower the cost of each task but drive total resource and water usage higher.

These tradeoffs deserve closer attention as AI technology continues to scale and integrate into daily life. A comprehensive perspective is crucial when deciding how to deploy AI technologies, to minimize unintended consequences and maximize benefits.
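The per-task emissions figure follows from multiplying energy use by a grid emissions factor. A minimal sketch of that arithmetic, assuming a US-average grid intensity of roughly 0.383 kg CO₂e/kWh (an assumed value, chosen here because it reproduces the article's reported numbers):

```python
# Sketch of the emissions estimate described above.
# The grid intensity below is an assumption (approximate US grid average),
# not a figure taken from the underlying study.

ENERGY_PER_TASK_KWH = 1785          # reported energy per high-compute o3 task
GRID_INTENSITY_KG_PER_KWH = 0.383   # assumed grid emissions factor (kg CO2e/kWh)

emissions_kg = ENERGY_PER_TASK_KWH * GRID_INTENSITY_KG_PER_KWH
print(f"~{emissions_kg:.0f} kg CO2e per task")  # ~684 kg, matching the article
```

Note that this covers operational GPU emissions only; embodied carbon from manufacturing the hardware, and overhead such as cooling, would push the total higher.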
Companies such as Synaptics and embedUR are tackling these challenges through edge AI, which makes decisions in real time at the device level, reducing reliance on data centers and cutting both latency and energy use. As AI continues to evolve, addressing these critical challenges is essential to ensuring a sustainable future.