In the age of artificial intelligence, data centers are no longer just warehouses for servers—they have become the nervous system of the digital economy. Across Europe, the explosive growth of data and AI-driven services raises urgent questions: How can Europe maintain digital sovereignty when the core AI technologies are dominated by U.S. and Chinese companies? How can it balance the demand for massive computing power with energy efficiency and environmental sustainability?
Drawing on insights from several experts, this article dives into the key challenges and strategic choices shaping Europe’s data center landscape in the AI era, including: the rise of AI Giga Factories, the debate between centralized mega-hubs and regionalized infrastructures, and the urgent quest for technological and economic sovereignty.
In a hurry? Here are the key notes to know:
- AI reshapes the role of data centers: Exploding data volumes, heavy training workloads, and regionalized AI services turn data centers into strategic infrastructure at the core of Europe’s digital economy.
- Centralization vs. decentralization: Mega-hubs remain essential for large-scale AI, but Europe also needs regional hubs and edge deployments to cut latency, protect sensitive data, and strengthen resilience.
- AI Giga Factories and sovereignty challenges: Ultra-powerful GPU factories are emerging worldwide, yet none of the key players are European—creating a critical gap in technological and economic autonomy.
- Rising energy impact: AI could triple data-center electricity use by 2035, putting pressure on grids and emissions. Efficiency gains, smart cooling, and frugal AI approaches are now essential.
Investment in Data Center Infrastructure
AI investment has become a daily headline, with tech giants such as OpenAI, Google, Nvidia, and Mistral making continuous announcements. Among these massive investments, data centers occupy a strategic position as the backbone of this industrial revolution.
At the AI Summit in Paris in February 2025, French President Macron unveiled a €109 billion AI investment plan for France, including €5 billion specifically earmarked for new AI infrastructure. In the United States, the scale is even larger, with $1.5 trillion announced for AI infrastructure, including $50 billion from Anthropic for new data centers in the coming years.
Data centers are now a visible presence across Europe, in Marseille and other major hubs, reflecting their critical role as the backbone of the digital economy. But why exactly are they so central to AI?
Why AI Requires More Infrastructure
1. Explosive Data Growth
AI demands more and larger data centers due to the sheer volume of data generated and processed. Fabrice Coquio, CEO of Digital Realty France, explains:
“At Digital Realty, we worked with UC Berkeley and found that data creation grows by 130% per year—tenfold every six years.”
This explosive growth drives the need for physically larger and more energy-dense equipment, with some racks now consuming up to 100 kW—compared with just 5 kW two decades ago.
“Many data centers will therefore become obsolete. Furthermore, European directives impose strict energy efficiency standards on data centers,” Coquio adds.
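Compound growth at these rates is easy to misjudge, so a small sketch may help; the rates below are illustrative assumptions, not figures from the study:

```python
import math

def multiple_after(annual_growth_rate: float, years: int) -> float:
    """Total multiplication factor after `years` of compound growth."""
    return (1 + annual_growth_rate) ** years

def years_to_multiple(annual_growth_rate: float, target_multiple: float) -> float:
    """Years of compound growth needed to reach `target_multiple`."""
    return math.log(target_multiple) / math.log(1 + annual_growth_rate)

# For example, roughly tenfold growth over six years corresponds
# to an annual growth rate of about 47%:
print(round(years_to_multiple(0.468, 10), 1))  # ~6.0 years
print(round(multiple_after(0.468, 6), 1))      # ~10x
```

Small differences in the annual rate compound into very different infrastructure needs a few years out, which is why capacity planning is so sensitive to these growth estimates.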
2. Regionalizing Services
Generative AI is built on top of multiple layers of infrastructure, including cloud and software platforms. Ophélie Coelho, a researcher in digital geopolitics, explains:
“The reason we need infrastructure is because we’re regionalizing part of the services that will perform AI computing. In other words, we need to ‘heat up silicon’ in edge data centers to run services that are now being regionalized.”
For example, when you use Google Gemini, the core of the technology is in the United States, but the regionalized Gemini services will run in Europe. And of course, they need infrastructure to operate.
And because the core technology remains U.S.-based, this model requires additional servers, data centers, greater computing capacity and undersea cables to ensure transcontinental interconnection.
Centralization vs. Decentralization: Strategic Choices for Europe
What do centralized mega-hubs and smaller, regionalized deployments actually look like? Let’s see in detail. Historically, data centers have developed as massive clusters—in Frankfurt, Paris, London, Amsterdam—optimized for computing power, international connectivity, and direct access to major public clouds. These “mega clusters” are indispensable for large-scale AI training and foundational models.
Yet the rise of generative AI, increased sensitivity of local data, and regulatory constraints (GDPR, sovereign cloud initiatives, EU energy directives) are pushing for smaller, regionalized hubs closer to users and data sources.
Regional hubs help to reduce latency by bringing AI closer to end-users. They keep sensitive data within national or European boundaries. They also enhance resilience against network saturation and limit concentrated energy demands in massive sites. Decentralization is also key for technological sovereignty.
Coelho questions the current logic of large hubs:
“Instead of relying on a single centralized core technology, we could depend on agents—smaller models that run locally across a territory. This more decentralized approach would reduce the need for large infrastructure hubs and the extensive use of submarine cables. It would be less resource-intensive and would allow for greater control over technological sovereignty and energy consumption, as well as better management of local resources.”
For Giuseppe Zindato, Director of Data Center Hardware and Software Solutions at Dell, who supports companies in their AI projects:
“The true technology of companies is the data. Today, much of the European data sent to LLMs goes to the U.S. or China and then returns. Keeping AI close to where data is generated is crucial.”
So, the European challenge is clear: balance centralized computing power for global-scale AI with decentralized infrastructure capable of enhancing sovereignty, data protection and energy efficiency.
AI Giga Factories: A New Industrial Paradigm
Another new concept that has emerged with generative AI, and which Europe must examine, is AI Giga Factories. These are ultra-powerful data centers designed for massive model training and GPU aggregation, drawing at least 1 gigawatt of power—comparable to the output of a modern nuclear reactor.
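To get a feel for that scale, a quick back-of-the-envelope conversion from power to annual energy, assuming (purely for illustration) a continuous 1 GW draw:

```python
# Back-of-the-envelope: a facility drawing 1 GW continuously, all year.
power_gw = 1.0
hours_per_year = 24 * 365  # 8,760 hours

# Energy = power x time; divide by 1,000 to convert GWh to TWh.
annual_energy_twh = power_gw * hours_per_year / 1000
print(f"{annual_energy_twh:.2f} TWh per year")  # 8.76 TWh
```

Nearly 9 TWh per year for a single site puts these facilities in the consumption range of a small country’s industrial sector, which is why siting and energy mix become strategic questions.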
These factories are not for all AI use cases; they are specialized for raw computing power, foundational model training and hyperscale cloud operations.
“This concerns only certain types of AI. Cloud applications, enterprise workloads, or hybrid projects are outside this scope,” Coquio explains.
Indeed, experts agree that giga factories do not replace regional hubs or edge deployments. They serve complementary roles:
| Concept | Focus |
|---|---|
| 📍 Regional Hubs & Edge Deployments | Proximity, low latency, industrial applications, and data sovereignty |
| 🏭 Giga Factories | Raw computing power, foundational model research, hyperscale cloud operations |
However, Fabrice Coquio notes:
“There are about twenty players mastering these technology cores—and none are European.”
If Europe wants to maintain technological independence, the continent must therefore invest in such infrastructure, including hardware and GPUs.
“It’s a strategic choice for Europe: we need to provide the means to work, including infrastructure, data centers, and hardware,” he emphasizes.
Location Matters: Energy and Strategy
Choosing the right location for a Giga Factory also matters. It is as much about energy as it is about strategy. France has several advantages: abundant electricity, one of the most decarbonized energy mixes in Europe, and export capability.
“We have available and low-carbon electricity, making France a better choice than Poland for Giga Factories,” Coquio notes.
Giga Factories also require dense ecosystems: connectivity, cloud providers, integrators, cybersecurity, and data exchange platforms.
“In AI and cloud, you can’t do anything alone,” says Coquio.
Hubs like Marseille exemplify this, with over 10,000 interconnections and 400 networks, serving as a Mediterranean gateway for undersea cables.
European Sovereignty in AI
AI raises pressing questions about European sovereignty, particularly as digital infrastructures and AI models expand rapidly. Local infrastructure alone does not guarantee control over AI models or services. Ophélie Coelho warns:
“Regionalization does not mean independence, because what runs there are not models we control.”
For her, the current approach is marked by overinvestment and acceleration without clear strategic reasoning. Building more data centers in Europe—whether in France or Germany—does not automatically increase autonomy. Instead, it may create greater dependency, because the services and models running in these centers are dominated by global tech companies, not European actors. Many cloud offerings come as vast catalogs of services, of which European companies use only a fraction, yet the infrastructure continues to lock Europe into a dependent ecosystem.
“We are creating more data centers to gain control, but the value does not remain here, since what runs in them is not ours. Even if some hubs host European products, the models themselves are not European, so the value is captured elsewhere.”
True sovereignty, Coelho emphasizes, is technological and economic. It means understanding and controlling what operates on the infrastructure, being able to sell and direct the technology, and avoiding reliance on systems that do not comply with European legal and regulatory standards.
Economic sovereignty is also at stake: U.S.-based companies could adjust service prices or impose conditions that directly affect the European economic and geopolitical landscape. Coelho frames this as a form of digital imperialism, reflecting aggressive market strategies documented in U.S. AI policy reports.
She also highlights the issue of overcapacity and oversizing: just as one does not take the same suitcase for a one-night trip versus six months abroad, Europe is building oversized digital infrastructure—undersea cables, massive data centers, and other assets—often exceeding current needs, leading to inefficiency and waste.
Giuseppe Zindato adds that sovereignty can also be supported through private cloud environments:
“Private clouds are different because my data is mine.”
Pragmatic steps, such as Europe’s financial participation in the recent Marseille undersea cable, illustrate a more practical, usage-driven approach to sovereignty, focusing on collaboration, access, and control rather than simply scaling up infrastructure.
Data Centers and Environmental Challenges
The AI boom brings environmental challenges. In a previous article, we discussed the Shift Project’s recent report, which warns that data centers could emit 920 MtCO₂e per year by 2030, twice France’s annual emissions, with generative AI alone contributing 35% of this.
For Anne-Sophie Marquet, CFO at Metroscope:
“Data centers currently consume 450 TWh per year, and this could triple by 2035—about 10% of the global electricity demand growth over the next fifteen years.”
Most new installations—85%—will be in already stressed regions, intensifying pressure on the grid. Power Usage Effectiveness (PUE), a measure of energy efficiency, shows room for optimization:
“The global average PUE is 1.7; the ideal is 1, and the best data centers achieve 1.2,” Marquet explains.
Through sensor-based instrumentation and AI, data centers could better manage cooling and server energy consumption, reducing both carbon footprint and energy stress.
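PUE is simply total facility energy divided by the energy actually reaching the IT equipment, so the gap between the 1.7 average and a best-in-class 1.2 translates directly into facility-level savings. A quick sketch with illustrative numbers:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total energy / IT energy (1.0 is ideal)."""
    return total_facility_kwh / it_equipment_kwh

# Illustrative site: 1,000 MWh of IT load (a made-up figure).
it_load_mwh = 1_000.0
total_at_avg = it_load_mwh * 1.7   # at the global-average PUE
total_at_best = it_load_mwh * 1.2  # at best-in-class PUE

savings_pct = 100 * (total_at_avg - total_at_best) / total_at_avg
print(f"Improving PUE from 1.7 to 1.2 cuts total energy by {savings_pct:.0f}%")
```

For the same IT workload, moving from a PUE of 1.7 to 1.2 trims roughly 29% of the facility’s total consumption, all of it from the cooling and power-delivery overhead rather than the servers themselves.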
Frugal AI
Frugal AI could also offer a way to break the link between “more AI” and “more energy.” Rather than scaling up hardware and computational resources indefinitely, frugal AI works by optimizing the core: using more efficient architectures, smaller or more dynamic models, and smarter resource allocation.
A compelling example comes from Dragon LLM, a European startup that recently unveiled a new generative-AI architecture designed specifically for efficiency. Their model can deliver performance comparable to large transformer-based models while using far fewer active computational parameters, reducing both energy consumption and hardware needs.
Europe’s data center development is more than building new sites. It requires a delicate balance between computing power, sovereignty, energy efficiency, and circularity. The coming decade will hinge on Europe’s ability to integrate these technological, geopolitical, and environmental considerations into a coherent strategy for AI.