As global data center energy consumption continues to rise, reducing energy use while meeting the growing demand for computing power has become an urgent problem. Researchers at the University of Waterloo in Canada recently proposed a solution: by modifying approximately 30 lines of code in the Linux kernel’s network stack, they claim, data center energy consumption can be cut by up to 30%.
Research Findings: Optimizing Network Packet Processing Efficiency
According to the researchers, there is a significant inefficiency in how data center servers handle network traffic. When new packets arrive at a server’s network interface, they trigger interrupt requests that force a CPU core to pause its current task to process the data, which both slows down processing and wastes energy.
The Waterloo researchers developed a technique called “interrupt request (IRQ) suspension” that improves CPU power efficiency by reducing unnecessary interrupts under heavy traffic. With this technique, the system polls the network interface for new packets while traffic is high, rather than taking an interrupt for every arrival, sharply reducing the number of interrupt requests. Once traffic slows down, the system falls back to interrupt-driven handling, balancing performance and energy consumption.
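The sketch below is a minimal, simplified model of that adaptive behavior, not the kernel’s actual implementation: while a burst of traffic is being processed, interrupts stay masked and the core keeps polling the receive queue; when a polling pass finds the queue empty, interrupts are re-enabled. The helper names (`rx_queue_pending`, `BUSY_POLL_BUDGET`, and so on) are hypothetical stand-ins for driver and hardware hooks.

```c
/*
 * Illustrative sketch of adaptive interrupt suspension (not kernel code).
 * While traffic is heavy, interrupts stay off and the core polls the
 * queue; when a poll finds no work, interrupts are re-enabled.
 */
#include <stdbool.h>
#include <stdio.h>

#define BUSY_POLL_BUDGET 64   /* max packets handled per polling pass */

static int pending = 0;       /* packets waiting in the simulated RX queue */
static bool irq_enabled = true;

/* Hypothetical stand-ins for driver/hardware hooks. */
static int  rx_queue_pending(void)      { return pending; }
static void process_packet(void)        { pending--; }
static void enable_rx_interrupts(void)  { irq_enabled = true;  puts("IRQs on: back to interrupt mode"); }
static void disable_rx_interrupts(void) { irq_enabled = false; puts("IRQs off: polling mode"); }

/* Called once per RX interrupt; afterwards the core polls until the queue drains. */
static void rx_interrupt_handler(void)
{
    disable_rx_interrupts();

    for (;;) {
        int budget = BUSY_POLL_BUDGET;

        while (budget-- > 0 && rx_queue_pending())
            process_packet();

        if (!rx_queue_pending()) {
            /* Traffic has slowed down: hand control back to interrupts. */
            enable_rx_interrupts();
            return;
        }
        /* Still busy: keep polling without taking further interrupts. */
    }
}

int main(void)
{
    pending = 200;            /* simulate a burst of 200 packets */
    rx_interrupt_handler();   /* one interrupt absorbs the whole burst */
    printf("packets left: %d\n", pending);
    return 0;
}
```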

This change to how the Linux kernel handles interrupt requests has been merged into kernel version 6.13. Martin Karsten, a computer science professor at the University of Waterloo, said the work adds no new functionality; it simply reorders when processing happens, which significantly improves CPU cache utilization in data centers.
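As upstream kernel documentation describes it, the mechanism builds on epoll-based busy polling: an application sets busy-poll parameters on its epoll instance, and the per-queue suspension timeout is configured separately through the kernel’s netdev netlink interface. The sketch below shows only the application-side opt-in under those assumptions; the `EPIOCSPARAMS` ioctl and `struct epoll_params` layout are taken from recent kernel UAPI headers and should be verified against the kernel version you actually run.

```c
/*
 * Sketch of the application-side opt-in to epoll busy polling, assuming
 * the EPIOCSPARAMS ioctl and struct epoll_params from recent kernel UAPI
 * headers; verify against your kernel and libc versions.
 */
#include <stdint.h>
#include <stdio.h>
#include <sys/epoll.h>
#include <sys/ioctl.h>
#include <linux/ioctl.h>

#ifndef EPIOCSPARAMS
/* Fallback definitions mirroring include/uapi/linux/eventpoll.h (assumption). */
struct epoll_params {
    uint32_t busy_poll_usecs;   /* how long each busy-poll pass may spin */
    uint16_t busy_poll_budget;  /* max packets handled per pass */
    uint8_t  prefer_busy_poll;  /* prefer polling over interrupts when busy */
    uint8_t  __pad;
};
#define EPIOCSPARAMS _IOW(0x8A, 0x01, struct epoll_params)
#endif

int main(void)
{
    int epfd = epoll_create1(0);
    if (epfd < 0) { perror("epoll_create1"); return 1; }

    struct epoll_params params = {
        .busy_poll_usecs  = 64,
        .busy_poll_budget = 64,
        .prefer_busy_poll = 1,
    };

    /* Requires a kernel with epoll busy-poll support; the NIC queue's
     * interrupt-suspension timeout is configured separately via netlink. */
    if (ioctl(epfd, EPIOCSPARAMS, &params) < 0)
        perror("ioctl(EPIOCSPARAMS)");
    else
        puts("epoll busy polling enabled for this epoll instance");

    return 0;
}
```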
Data Center Energy Consumption Is Rising, Making Efficiency Improvements Urgent
Projections suggest that by 2030, data centers could account for 4% of global electricity demand, with the rapid growth of artificial intelligence a major driver. Training OpenAI’s GPT-4 model, for example, reportedly consumed as much electricity as 5,000 American households use in a year, not counting the power required for AI inference.

However, data center operators have so far been slow to act on energy savings and emission reductions. A report from the Uptime Institute shows that fewer than half of data center owners and operators track key sustainability metrics, such as renewable energy use and water consumption.
Potential Impact and Application Prospects of the New Code
If widely adopted, the change could save a significant amount of energy worldwide. Karsten emphasized that almost every service request on the internet could benefit from it. Tech giants such as Amazon, Google, and Meta are highly selective about the Linux builds they deploy, but if they enable the new mechanism in their data centers, the impact on global data center energy consumption would be profound.

In addition, the University of Waterloo is building a green computing server room in its new mathematics building. Karsten believes sustainability research should be a priority for computer scientists. The Linux Foundation, which oversees development of the Linux operating system, is also a founding member of the Green Software Foundation, an organization exploring the development of “green software” that reduces energy consumption.
References
- https://www.techrepublic.com/article/data-centres-energy-reduction-code/
- https://www.datacenterdynamics.com/en/news/changing-linux-code-could-cut-data-center-energy-use-by-30-researchers-claim/