Follow “Electronic Engineering Magazine” and add the editor’s WeChat
The rapid rise of artificial intelligence has led to a surge in energy consumption. Training large AI models requires powerful computing capabilities and a significant amount of electricity and cooling water. The solution is not to abandon AI, but to prioritize energy efficiency during its development and deployment.
Discriminative AI, which analyzes existing data rather than generating new content, offers one path to lower energy consumption. Even greater promise lies in smaller, embedded AI models that can achieve similar results at a fraction of the power.
These embedded systems operate on edge networks, capable of processing data in real-time, making them ideal for industrial applications. In addition to energy-saving capabilities, embedded AI also offers advantages in training, cost, and data privacy.
Companies such as AITAD, specialists in embedded, semiconductor-based, and industrial AI, are taking this approach, making maximum use of embedded AI to balance the benefits of future AI against our environmental responsibilities.
The Growing Energy Demand of AI
AI is increasingly becoming the focus of discussions on energy consumption. Given the enormous power demands of modern AI models, this is not surprising. As early as 2019, a study in the United States warned that the carbon emissions produced by training a neural network are equivalent to the total carbon emissions of five conventional cars over their entire lifetimes. It is estimated that training the model behind ChatGPT consumed 1,287 MWh of electricity, equivalent to the power output of a medium-sized nuclear power plant for one hour.
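The nuclear-plant comparison above can be checked with simple arithmetic. This sketch assumes a plant output of about 1.3 GW, a typical large-reactor figure implied by the comparison rather than a number from the article:

```python
# Sanity check: 1,287 MWh of training energy vs. one hour of a
# nuclear plant's output. The ~1.3 GW output is an assumption.
training_mwh = 1287
plant_mw = 1300                       # 1.3 GW expressed in MW
plant_mwh_in_one_hour = plant_mw * 1  # energy produced in one hour

print(training_mwh, plant_mwh_in_one_hour)
```

At this assumed output, one hour of generation (1,300 MWh) indeed roughly matches the reported training energy.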
Information and communication technology currently accounts for 2% to 4% of global greenhouse gas emissions, comparable to the aviation industry. This demand is expected to grow, with estimates suggesting that AI's electricity consumption could reach 134 TWh by 2027. Additionally, AI training requires a significant amount of water for cooling: training the GPT model reportedly consumed approximately 700,000 liters of water.
Unprecedented Demand for Computing Power
Training large AI models requires immense computing power. For example, NVIDIA plans to sell over 2 million units of the H100 "Hopper" AI accelerator by the end of 2024. If all these chips operate at full capacity, they will require 1.6 GW of power, more than what a typical large nuclear power plant can provide.
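The 1.6 GW figure is easy to reproduce. This sketch assumes roughly 800 W drawn per deployed accelerator including system overhead, the per-unit figure the article's total implies (the H100 SXM module alone is rated at 700 W):

```python
# Back-of-the-envelope check of the 1.6 GW total for 2 million
# accelerators. The ~800 W per unit (with overhead) is an assumption.
units = 2_000_000
watts_per_unit = 800
total_gw = units * watts_per_unit / 1e9  # 1.6 GW

print(total_gw)
```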
In addition to training, running AI models also consumes substantial energy. A single ChatGPT query is estimated to require 3 to 9 Wh of electricity. If the roughly 9 billion search queries made each day were handled by AI rather than conventional methods, global search-engine power consumption would double. With AI search capabilities integrated into platforms like Microsoft Bing and Google's Gemini, energy consumption is expected to rise significantly.
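Combining the article's two figures (9 billion daily queries, 3 to 9 Wh each) gives a sense of the scale involved; the annualization step below is an illustrative extrapolation, not a number from the article:

```python
# Daily and annualized energy for AI-assisted search, using the
# article's per-query range of 3-9 Wh.
queries_per_day = 9_000_000_000
gwh_per_day_low = queries_per_day * 3 / 1e9    # 27 GWh/day
gwh_per_day_high = queries_per_day * 9 / 1e9   # 81 GWh/day

# Extrapolated over a year (assumption: constant daily volume).
twh_per_year_high = gwh_per_day_high * 365 / 1000

print(gwh_per_day_low, gwh_per_day_high, twh_per_year_high)
```

At the high end this works out to roughly 30 TWh per year for search alone, a substantial fraction of the 134 TWh projection cited earlier.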
AI in the Industrial Sector: Efficiency and Energy Consumption
"The application of AI in medical technology and the industrial sector is also growing, particularly in production," said Viacheslav Gromov, founder and CEO of AITAD. "Industrial AI is expected to improve process efficiency and prevent production downtime. However, the computing behind those gains will also drive a significant increase in energy demand."
Given these growing demands, AI must improve energy efficiency. In addition to cost savings, improving efficiency is crucial for addressing potential energy shortages and reducing environmental impact. AI and smart sensors can enhance energy efficiency, but optimization must start from the neural network development stage.
Discriminative AI has enormous potential in industrial applications, capable of answering key operational questions such as “What is happening now, and what will happen next?”
Big Is Not Always Better
The size of an AI model directly affects its energy consumption. Larger neural networks require more computational resources during both training and operation, but properly optimized smaller models can deliver comparable results. Recent technological advances have enabled AI to run even on small semiconductors, from microcontrollers (MCUs) to field-programmable gate arrays (FPGAs).
“This embedded AI operates autonomously at the network edge, with its main advantage being its ability to evaluate all data received from sensors in real-time, thus requiring minimal connectivity,” said Gromov. “The importance of embedded AI in various application areas is increasingly evident in the industrial sector, while also significantly reducing energy consumption.”
For example, a Cortex-M MCU equipped with a neural processing unit and direct memory access can be 20 to 60 times more energy-efficient than a typical Intel Core PC processor of equivalent computational power. Embedded AI operating autonomously at the network edge is changing the game: processing sensor data locally minimizes connectivity needs and significantly reduces power consumption.
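To see what the 20-60x efficiency range means in watts, one can work backward from a desktop processor's power draw. The ~65 W package power used here is an assumed typical desktop-CPU figure, not one from the article:

```python
# Power an NPU-equipped Cortex-M class device would need for the
# same workload, at both ends of the claimed 20-60x range.
# Assumption: ~65 W package power for a desktop-class processor.
pc_watts = 65.0

mcu_watts_worst = pc_watts / 20  # low end of the efficiency claim
mcu_watts_best = pc_watts / 60   # high end of the efficiency claim

print(mcu_watts_worst, mcu_watts_best)
```

Under these assumptions the same workload would run on roughly 1 to 3 W, i.e. well within battery-powered territory.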
Energy-Efficient AI Training Using Embedded Systems
Training AI models for embedded systems does not require massive server infrastructure. Unlike large-scale AI training, which may involve tens of thousands of GPUs costing tens of thousands of dollars apiece, embedded AI development typically requires only standard PCs. These systems are designed for highly specific tasks, such as monitoring machine health, enabling simple voice control, or assessing dental health from ultrasonic signals during brushing.
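The idea that a task-specific model can be trained on an ordinary PC can be sketched with a tiny reconstruction-based anomaly detector, here PCA fitted to synthetic "machine health" data. This is an illustrative stand-in chosen for brevity, not AITAD's actual method; all data and dimensions below are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic healthy-machine data: 8 spectral-band energies per sample.
normal = rng.normal(loc=1.0, scale=0.1, size=(500, 8))

# "Training" on a standard PC: keep the top-3 principal components
# of the healthy data as a compact reconstruction model.
mean = normal.mean(axis=0)
_, _, vt = np.linalg.svd(normal - mean, full_matrices=False)
components = vt[:3]  # 3 x 8 basis, small enough for an MCU

def recon_error(x):
    """Mean squared reconstruction error; high values flag anomalies."""
    z = (x - mean) @ components.T  # project to 3-D code
    x_hat = mean + z @ components  # reconstruct from the code
    return float(np.mean((x - x_hat) ** 2))

healthy = rng.normal(1.0, 0.1, size=8)
faulty = healthy.copy()
faulty[3] += 2.0  # energy spike in one band (simulated fault)

print(recon_error(healthy), recon_error(faulty))
```

The fitted model is just a mean vector and a 3x8 matrix, which is the kind of footprint that fits comfortably on a microcontroller.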
Since embedded AI systems are designed to operate with minimal resources, they consume very little energy—sometimes just a few milliamps. Many systems can be powered by batteries, and some can even be powered through energy harvesting, eliminating the need for external power sources.
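A draw of "a few milliamps" translates directly into long battery life. The 2000 mAh capacity and 5 mA average draw below are illustrative assumptions, not figures from the article:

```python
# Battery-life estimate for an embedded AI node.
# Assumptions: 2000 mAh battery, 5 mA average draw.
battery_mah = 2000
draw_ma = 5

hours = battery_mah / draw_ma  # continuous runtime in hours
days = hours / 24              # the same runtime in days

print(hours, days)
```

Under these assumptions the node runs for about 400 hours, i.e. over two weeks of continuous operation, which is why energy harvesting can close the gap entirely for intermittent workloads.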
The Versatility and Sustainability of Embedded AI
While embedded AI is not a one-size-fits-all solution, it is suitable for any scenario requiring the deployment of sensors. It not only enhances energy efficiency but also improves real-time data analysis capabilities and privacy protection. Embedded AI does not transmit raw sensor data but only sends processed insights, further reducing network load. Additionally, the development and production costs of these systems are often lower than those of cloud-based AI solutions.
“Embedded AI allows AI-driven operations to be decentralized, enabling large interconnected industrial systems to operate with significantly reduced energy consumption,” said Gromov. “Whenever possible, embedded AI should be utilized to optimize energy use and ensure long-term sustainability.”
AITAD: A Pioneer in Industrial Embedded AI
AITAD is a German embedded AI provider focused on the development, testing, and mass production of AI electronic systems. The company specializes in industrial machine learning applications, covering the entire process from data acquisition to product delivery. It customizes AI solutions for clients, ensuring optimal integration with minimal resource requirements.
The company’s areas of expertise include predictive maintenance, user interaction, and functional innovation. Unlike many AI providers offering one-size-fits-all solutions, AITAD develops customized AI systems to meet the needs of each client. Its in-house prototyping capabilities (Figure 1) support rapid development and deployment, enabling industrial partners to implement AI more easily.
AITAD has been recognized for its contributions to AI innovation, receiving multiple awards, including the Embedded Award in the AI category, the Top 100 Innovation Award for medium-sized enterprises, and the title of AI Champion of Baden-Württemberg in 2023. By prioritizing the development of embedded AI, the company is helping various industries transition to more sustainable and efficient AI operations.

Figure 1: AITAD develops electronics-based AI (embedded AI) capable of executing defined tasks locally and in real time inside devices and machines. (Source: AITAD)
As AI becomes more prevalent, its energy demands cannot be ignored. Promoting the development of miniaturized, specialized models driven by embedded AI provides a blueprint for balancing innovation with sustainability. By prioritizing efficiency, industries can fully leverage the transformative potential of AI while reducing its environmental costs. “The future of AI is not just about doing more, but doing better—for businesses, users, and the planet,” said Gromov.
(Editor: Franklin)