Nokia CEO: Making AI greener starts with smarter data center design
The data center at Nokia’s headquarters in Finland helps heat the homes of 14,000 residents. It’s now our third site to repurpose excess heat to lower the energy consumption of the local community.
While Nokia’s own data centers are small compared with the hyperscale facilities we help connect for customers such as Microsoft, every cloud provider, and every user, is grappling with the energy conundrum the AI era poses.
On the one hand, AI-enabled digitalization can help carbon-intensive industries—such as manufacturing and logistics—cut their carbon footprints through greater efficiency and less waste. AI could help mitigate 5% to 10% of global greenhouse gas emissions by 2030, according to a report by Boston Consulting Group.
On the other hand, data centers, which provide the cloud computing that powers artificial intelligence, consume vast amounts of electricity and water. American data center power demand may triple in the next three years, according to a study backed by the U.S. Energy Department.
Making AI more energy efficient
There’s been a lot of talk about the off-grid energy investments of hyperscalers. But the energy efficiency of AI infrastructure also has a big role to play.
Nokia provides networking connectivity inside and between data centers, as well as between end users and data center applications. Understanding this intricate web matters because efficiency is not only about making the processes inside a data center faster. It is about making the entire journey, from someone making an AI request to getting a response back, quick, secure, and more energy efficient.
For example, recent advances in optical networking technology have made it possible to cut network power consumption per bit by 60%. But with global network traffic expected to grow at a compound annual rate of 18% to 27% through 2033, these improvements need to keep coming.
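A back-of-the-envelope calculation shows why. The sketch below takes the 60% per-bit reduction and the traffic growth rates from the paragraph above; the ten-year horizon to 2033 is an assumption for illustration.

```python
# Back-of-the-envelope: does a 60% per-bit power reduction offset
# 18-27% annual traffic growth? The 60% and CAGR figures come from
# the article; the 10-year horizon to 2033 is an assumption.

PER_BIT_REDUCTION = 0.60   # power per bit falls by 60%
YEARS = 10                 # assumed horizon, roughly to 2033

for cagr in (0.18, 0.27):
    traffic_multiplier = (1 + cagr) ** YEARS
    net_power_factor = (1 - PER_BIT_REDUCTION) * traffic_multiplier
    print(f"CAGR {cagr:.0%}: traffic x{traffic_multiplier:.1f}, "
          f"net network power x{net_power_factor:.1f}")
```

Even with the per-bit gain, total network power would still rise roughly two- to four-fold over the decade, which is why efficiency improvements have to keep compounding alongside traffic growth.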
As new AI-embedded consumer devices and industrial applications come to market, cloud providers will need to continue adding capacity, minimizing latency, and ensuring reliable services, all while trying to make their operations more sustainable.
Industrial digitalization is driving cloud evolution
Nvidia, an AI pioneer and a Nokia partner, has suggested solutions, including building data centers in locations with access to 100% renewable energy and at extreme northern or southern latitudes for ambient cooling.
Energy, performance, and cost considerations may prompt some cloud providers to build their data centers in remote locations with access to clean energy, passive cooling, and cheaper and more plentiful real estate.
However, data sovereignty laws, security concerns, and the ultra-low latency requirements of industrial applications may see a move toward more distributed cloud computing, with AI workloads moving closer to the end user. This would likely lead to more regional, metropolitan, and edge data centers, with some businesses and organizations opting for on-site data centers for mission-critical functions.
We may, in fact, see both trends at the same time. The anticipated surging demand for cloud compute from new AI applications and industrial digitalization is why cloud providers everywhere are investing huge sums in scaling up their infrastructure.
But when it comes to making AI more sustainable, bigger isn’t always better.
Smarter AI usage reduces energy consumption
Our researchers at Nokia Bell Labs have built small language models (SLMs), in contrast to the large language model (LLM) approach exemplified by ChatGPT. These specialist AI tools are easier to train, deliver higher accuracy within their domain, and use less power per computation. For example, a telecom engineer may need guidance on how to install a particular baseband unit. A domain-specific telecom SLM would provide more useful information, and consume less energy, than sending the same request to a general-purpose LLM.
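As a rough illustration of the scale of the difference, consider a toy energy model in which inference energy grows with active parameter count and tokens generated. The parameter counts, token count, and energy coefficient below are illustrative assumptions, not measured Nokia Bell Labs figures.

```python
# Toy comparison of per-query inference energy for a small specialist
# model versus a large general-purpose one. All numbers here are
# illustrative assumptions chosen only to show the relative scale.

def inference_energy_joules(params_billions: float, tokens: int,
                            joules_per_b_params_per_token: float = 0.002) -> float:
    """Rough proxy: energy grows with model size and tokens generated."""
    return params_billions * tokens * joules_per_b_params_per_token

QUERY_TOKENS = 500  # e.g. a baseband-unit installation walkthrough

llm_energy = inference_energy_joules(params_billions=175, tokens=QUERY_TOKENS)
slm_energy = inference_energy_joules(params_billions=3, tokens=QUERY_TOKENS)

print(f"General LLM : ~{llm_energy:.0f} J per query")
print(f"Telecom SLM : ~{slm_energy:.0f} J per query")
print(f"SLM uses ~{slm_energy / llm_energy:.1%} of the LLM's energy")
```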
Techniques to prune and distill large models into smaller ones for learning and inference are among the strategies we’re exploring in the area of neural networks, where structures inspired by the human brain are applied to AI and machine learning tools. For instance, AI pruning, which mimics the synaptic pruning seen in human brain development, has been shown to deliver 30% to 50% energy savings with only a minimal loss in accuracy.
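As one concrete illustration of the general technique, here is a minimal sketch using PyTorch’s built-in pruning utilities. The tiny model and the 40% pruning ratio are arbitrary choices for demonstration, not Nokia Bell Labs’ actual method.

```python
# Magnitude pruning sketch with PyTorch's built-in utilities: zero out
# the smallest-magnitude weights in each layer, analogous to synaptic
# pruning. The model and 40% ratio are arbitrary demonstration choices.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(nn.Linear(512, 256), nn.ReLU(), nn.Linear(256, 10))

for module in model:
    if isinstance(module, nn.Linear):
        # Remove the 40% of weights with the smallest L1 magnitude.
        prune.l1_unstructured(module, name="weight", amount=0.4)
        prune.remove(module, "weight")  # make the sparsity permanent

total = sum(p.numel() for p in model.parameters())
zeros = sum((p == 0).sum().item() for p in model.parameters())
print(f"Sparsity: {zeros / total:.1%} of parameters pruned")
```

Note that unstructured pruning only zeroes weights; translating that sparsity into real energy savings also depends on hardware and runtimes that can skip the zeroed computations.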
Data centers first emerged in the 1940s as mainframe computer rooms, but they’re now inextricably interwoven with the most talked-about technology of the 21st century.
For the AI era to be both successful and sustainable, continued innovation in infrastructure and more intelligent use of AI tools will be critical.