Datacenters: "opacity around this topic is maintained by many stakeholders; it must be addressed"

Data centers, strategic components of the digital ecosystem, are expanding rapidly across the globe, driven by the global rise of AI. The construction and operation of this infrastructure come with many challenges that are not always clearly explained. An interview with Benoit Petit, co-founder of Hubblo.

By Louise Pastouret

January 9, 2026


Interview

Benoit Petit

Co-founder of Hubblo, a company specializing in the socio-ecological transformation of digital organizations.

My colleagues and I conduct studies on the systemic impacts (environmental, social, societal) of digital technologies and support our clients in aligning their activities with planetary boundaries. I focus in particular on topics related to digital infrastructures, notably datacenters and artificial intelligence. I am also a developer of open-source tools to support the assessment of environmental impacts, and a volunteer with the Boavizta association.

Data centers are currently drawing a great deal of attention, as these key infrastructures power AI systems. What is the situation in France today?

We collected data on French datacenters as part of a forward-looking study, commissioned by Ademe, of their energy, water, and land consumption out to 2060 (carried out alongside Studio Dégel, Lorraine de Montenay, and Sinok).

352: the number of active datacenters in mainland France at the end of 2024 (source: Ademe).

The majority of them are colocation facilities: they operate the core datacenter functions and allow their clients to install equipment in the spaces they rent. More specifically, we identified 232 colocation datacenters with less than 10 MW of capacity, and 55 colocation datacenters with more than 10 MW. The remainder consists of private datacenters (reserved exclusively for a company or group), public-sector datacenters, or specialized HPC datacenters (High Performance Computing, generally for research purposes). The scope of the study excludes IT rooms located in office buildings.

These active datacenters represent a cumulative installed capacity of around 2.5 GW (covering the entire building, IT equipment, and cooling). For reference, a nuclear reactor typically provides close to 1 GW of production capacity (excluding EPR reactors). In terms of land use, we identified just over 2,445,000 m² of gross floor area (floor area summed across all levels of each building) and just over 1,125,000 m² of IT floor space.

For 2024, we estimate that these datacenters consumed 8.16 TWh of electricity, which corresponds to nearly 30% of the annual energy consumption of the city of Paris (around 27 TWh). To this must be added 1.8 TWh, which is the estimated energy consumption of server rooms located in office buildings (server racks in company offices). All of these data are presented in the study report, with much greater detail.

Among the identified datacenters, how many are dedicated to artificial intelligence use cases?

For now, it appears that AI deployments in France have mainly taken place in colocation or private datacenters, at lower density, that is, with fewer servers per rack, since air cooling is incompatible with electrical loads above 40 kW per rack.

At present, there are very few datacenters dedicated to artificial intelligence operating in France. Several projects are underway, some already under construction, across around thirty potential sites capable of hosting high energy-density datacenters, as announced by the government(1).

Among these projects, several are high-density datacenters primarily designed for generative AI. These datacenters are characterized in particular by the use of direct liquid cooling (DLC). Above all, they feature installed capacities far greater than anything that currently exists. Examples include the Fouju site in Seine-et-Marne, developed by the Mistral AI – Nvidia – MGX consortium, which could ultimately host a datacenter with a capacity of 1.4 GW(2), the Cambrai project by Brookfield, which could reach 1 GW(3), or the Montereau-Fault-Yonne site by OpCore on land previously owned by EDF, which could ultimately reach 700 MW(4).

The size and electrical power required for this type of datacenter completely reshape the environmental and infrastructural footprint of French datacenters (it should be recalled that the total capacity of the current fleet does not exceed 2.5 GW).

In your investigation, what methods were used to identify, analyze, and calculate the resources required to operate these datacenters?

The study consisted of two major phases. First, a data collection phase aimed at establishing the current situation, which involved:

  1. identifying sites likely to correspond to datacenters,
  2. confirming their presence using satellite imagery (with well-known tools such as Google Maps and OpenStreetMap, as well as lesser-known ones like World Imagery Wayback),
  3. identifying the organization operating each datacenter using cadastral data (notably via the real estate website immobilier.pappers.fr),
  4. extracting dimensions, ground footprint, and building height from these sources,
  5. extracting other available data from documents published by the operator or the press,
  6. cross-referencing available information with inferences based on the datacenter type, using data already present in the database.
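
To give a concrete sense of step 1, here is a minimal sketch of how candidate sites could be pulled from OpenStreetMap. This is an illustration, not the study's actual pipeline; it assumes the OSM tag telecom=data_center and the public Overpass API, and any results would still need verification against satellite imagery and cadastral data, as described above.

```python
import requests

OVERPASS_URL = "https://overpass-api.de/api/interpreter"

# Overpass QL: every node/way/relation tagged telecom=data_center
# inside France (ISO3166-1=FR at admin_level=2).
QUERY = """
[out:json][timeout:120];
area["ISO3166-1"="FR"][admin_level=2]->.fr;
nwr["telecom"="data_center"](area.fr);
out center;
"""

def fetch_candidate_sites() -> list[dict]:
    """Return rough datacenter candidates to verify by other means."""
    resp = requests.post(OVERPASS_URL, data={"data": QUERY}, timeout=180)
    resp.raise_for_status()
    sites = []
    for el in resp.json()["elements"]:
        tags = el.get("tags", {})
        # Ways and relations carry a computed "center"; nodes have lat/lon.
        center = el.get("center", el)
        sites.append({
            "osm_id": el["id"],
            "name": tags.get("name", "<unnamed>"),
            "operator": tags.get("operator", "<unknown>"),
            "lat": center.get("lat"),
            "lon": center.get("lon"),
        })
    return sites

if __name__ == "__main__":
    candidates = fetch_candidate_sites()
    print(f"{len(candidates)} candidate sites to confirm against imagery")
```

Community tagging is incomplete, which is precisely why the study cross-checks such sources against imagery and cadastral records rather than trusting any one of them.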

The second phase involved modeling future trends in order to project consumption levels up to 2060.

To do this, we developed a forward-looking model inspired by well-established models used for this type of projection (from the research team of Eric Masanet at UC Santa Barbara / Berkeley(5)). The model takes 2024 data as input and projects consumption trajectories based on different scenarios, which themselves are derived from Ademe’s prospective scenarios:

  • Business as usual: continuation of current trends.
  • Technological fix: technologies and innovation are the only levers activated to try to meet climate objectives.
  • Green technologies: technologies and innovation remain central, with a stronger focus on renewable energy and electrification.
  • Territorial cooperation: stronger cooperation between datacenter operators and local authorities, with a degree of sobriety.
  • Frugal generation: emphasis on infrastructure and usage sobriety.

Depending on the scenario, the projections differ significantly and are explained in detail in the report.
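
Purely to illustrate the shape of such a projection (this is not the Ademe-derived model, whose structure and calibration are detailed in the report), a compound-growth sketch with hypothetical scenario parameters might look like this:

```python
from dataclasses import dataclass

@dataclass
class Scenario:
    name: str
    growth_rate: float       # hypothetical annual growth in consumption drivers
    efficiency_gain: float   # hypothetical annual efficiency improvement

# Purely illustrative parameter values; the real scenarios are calibrated
# against Ademe's prospective work in the study report.
SCENARIOS = [
    Scenario("Business as usual", growth_rate=0.10, efficiency_gain=0.01),
    Scenario("Technological fix", growth_rate=0.10, efficiency_gain=0.03),
    Scenario("Frugal generation", growth_rate=0.02, efficiency_gain=0.02),
]

BASE_YEAR, END_YEAR = 2024, 2060
BASE_CONSUMPTION_TWH = 8.16  # 2024 estimate from the study

def project(scenario: Scenario) -> dict[int, float]:
    """Project annual datacenter electricity consumption under one scenario."""
    trajectory = {}
    consumption = BASE_CONSUMPTION_TWH
    for year in range(BASE_YEAR, END_YEAR + 1):
        trajectory[year] = round(consumption, 2)
        # Demand growth compounds, partly offset by efficiency gains.
        consumption *= (1 + scenario.growth_rate) * (1 - scenario.efficiency_gain)
    return trajectory

for s in SCENARIOS:
    print(f"{s.name}: {project(s)[END_YEAR]} TWh in {END_YEAR}")
```

A single growth rate is of course a caricature; the point of the exercise is only to show how modest annual differences compound into widely diverging 2060 trajectories.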

Can you explain what building a datacenter involves in terms of infrastructure?

A datacenter is made up of several components. IT rooms house computing equipment and the racks in which it is installed. These racks are generally placed on raised floors capable of supporting their weight (often also used to circulate cold air).

The IT room also contains equipment that serves as the endpoint of the cooling system (air cooling, liquid cooling, or, more rarely, immersion cooling in oil baths).

Other rooms are essential to datacenter operations, for example those hosting uninterruptible power supply equipment, which relies on batteries (lead-acid or lithium-ion) when needed. Other rooms (meet-me rooms) are dedicated to network interconnection between datacenter clients.

Electrical redundancy does not stop at batteries—which at best provide only a few tens of minutes of autonomy—but is also ensured by backup generators (most often diesel-powered, although other fuels exist). The required fuel must be stored, often in underground tanks near the datacenter.

The cooling system extends beyond the IT room, with endpoints located on or near the building, where equipment expels hot air or cools heated water from the cooling loop. Various technologies exist; in France, outdoor evaporative cooling systems are still a minority (unlike in the United States).

Then there are all the systems related to lighting, security (fingerprint or retinal scanners, internal information systems, fire alarms and associated equipment, etc.), logistics (equipment delivery, reception, package handling), and so on.

Finally, a datacenter requires transport access (roads, parking), connectivity (trenches for fiber optics), and electrical connections (transformers and a dedicated substation may be required).

We are therefore talking about potentially massive infrastructure on several levels. The occupied surface areas vary widely: there are very small datacenters (less than 500 m² of IT space), but recent projects push toward gigantism (more than 10,000 m² of IT space, or even several tens of thousands in France, and several hundreds of thousands in recent US projects, such as OpenAI’s Stargate site(6)).

Do datacenter projects raise specific issues in terms of siting and deployment?

It depends on which issue we are talking about. From the perspective of the electrical power required, this is clearly an issue for large projects of several hundred megawatts or more than one gigawatt.

However, such projects will not reach these capacities overnight. Grid connections are gradual and carried out in stages (for example, a few tens of megawatts at a time). Announcements are sometimes disconnected from reality and often made well in advance. Energy consumption is a key issue because densities are higher in these new projects, but also because consumption profiles differ: power demand fluctuates more rapidly and intensely with generative AI than with traditional IT(7)(8), which can raise concerns about grid stability.

From a land-use perspective, one of the major questions concerns the nature of the land occupied. Is it a brownfield site—meaning no additional land artificialization—or forest or agricultural land? Quantifying these changes based on prior land use is one of the follow-ups we aim to pursue, for both existing and future sites.

Water consumption is also a local issue. Its severity depends on the water stress of the area, meaning situations where water demand exceeds available supply (or where water quality does not allow the intended use) within a given area and time frame. For now, large-scale evaporative water use is absent from French datacenters, and national consumption appears to be very low according to Arcep(9), unlike in the United States, where most large LLM-based services are hosted. Politically, however, the trend is toward simplifying and accelerating the deployment of such projects, so nothing is guaranteed for the future.

Social acceptability is also a central issue. We have seen movements such as “Le nuage était sous nos pieds” in Marseille, led by La Quadrature du Net(10), or the Amazon datacenter project in Brétigny-sur-Orge, which was delayed by the prefect following local opposition(11). In the United States, problems are beginning to emerge following the deployment of large datacenters: water quality with Meta(12), air quality with xAI(13), electricity prices in the eastern states(14), etc.

My personal view is that these projects should only be authorized if public concerns are taken into account, but political direction does not seem to be moving that way. Between Article 15 of the bill on simplifying economic life(15) and the European Commission’s proposal(16), which aims to deregulate large datacenter projects under the Omnibus package, constraints related to the protection of endangered species and rules governing the use of fossil-fuel-based generators could disappear if a project is declared of major national importance or falls under the category defined by the European proposal.

What difficulties did you encounter during the investigation?

Access to useful data was a challenge, though not a surprise. For example, we obtained very little data on water consumption.

The data collection phase also required sorting through the available information. Some data are extremely uncertain. This is particularly true of PUE, or Power Usage Effectiveness (which indicates how many kWh a building consumes overall for every 1 kWh used by IT equipment)(17), whose calculation and interpretation vary widely among stakeholders and are rarely externally audited.
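
The PUE formula itself is simple; what varies between operators is the measurement boundary and period. A minimal sketch with illustrative figures:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy per kWh of IT energy.

    A PUE of 1.0 would mean every kWh goes to IT equipment; in practice
    cooling, power-distribution losses, and lighting push it above 1.
    """
    if it_equipment_kwh <= 0:
        raise ValueError("IT energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Illustrative figures: a site drawing 14 GWh overall for 10 GWh of IT load.
print(pue(14_000_000, 10_000_000))  # -> 1.4
```

The reported value shifts markedly depending on, for example, whether the measurement covers a full year or only favorable months, which is exactly the comparability problem raised above.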

The same applies to greenhouse gas emissions reported in annual disclosures by major tech companies when calculated using a “market-based” approach—that is, taking into account guarantees of origin certificates rather than only the emissions of the national electricity mix. Installed power figures can also be problematic, due to biased announcements driven by investment considerations, speculation, opacity around redundancy, and so on.

Prioritizing surface-area data when deriving other indicators was a choice made for reliability: these data can be cross-checked across multiple sources (satellite imagery, cadastral records, manufacturer documentation, engineering studies, etc.).
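
As an illustration of that approach (the density values below are hypothetical placeholders, not the ones used in the study), a floor-area-based estimate of IT capacity could look like this:

```python
# Hypothetical power densities per datacenter category, in watts of IT load
# per m² of IT floor space. The actual per-category values used in the study
# are documented in the Ademe report.
ASSUMED_IT_DENSITY_W_PER_M2 = {
    "colocation_small": 800,
    "colocation_large": 1200,
    "hpc": 2000,
}

def estimate_it_capacity_mw(it_floor_area_m2: float, category: str) -> float:
    """Rough IT capacity estimate from verified IT floor area."""
    return it_floor_area_m2 * ASSUMED_IT_DENSITY_W_PER_M2[category] / 1e6

# A 5,000 m² IT room in a large colocation facility -> about 6 MW of IT load.
print(estimate_it_capacity_mw(5_000, "colocation_large"))
```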

Are there reasons to be concerned about the deployment of these datacenters in France?

I fully understand environmental concerns, even if it must be remembered that the orders of magnitude in France are very different from those in a country like Ireland, where datacenters are expected to account for 30% of electricity consumption by 2030 according to the International Energy Agency(18), or the United States, where 56% of electricity consumed comes from fossil fuels(19).

The key issue, in my view, is the number and size of these new projects. The environmental impacts of existing datacenters were already widely underestimated. The deployment of new datacenters could represent a multiplication of total installed surface areas and capacities—along with a multiplication of environmental impacts.

Current debates often revolve around “digital sovereignty.” However, it is important to recall that most of these projects rely on foreign investment, are operated by foreign companies, contain hardware from US or Chinese firms, and host software and models that are, in most cases, neither French nor even European. The issue of strong dependency on companies whose interests are not necessarily aligned with those of the country is clearly explained by Ophélie Coelho in her book Géopolitique du Numérique, l'impérialisme à pas de géant.

And globally?

Microsoft’s greenhouse gas emissions increased by around 30% between 2020 and 2023(20), and Google’s by 50% over five years(21), mainly due to the concrete used in new construction. In fact, the construction sector is experiencing an economic rebound driven by the datacenter construction boom.

Beyond this study—which only partially addresses the direct environmental impacts of datacenters—the indirect impacts of artificial intelligence are also in question (see the Enabled Emissions campaign by former Microsoft employees Will and Holly Alpine(22), or Gauthier Roussilhe’s webinar(23)), especially as computing capacity used for shale gas extraction could generate at least three times more emissions than Microsoft’s total carbon footprint(24).

What will happen to the data collected during the investigation?

We want to create a digital commons on this topic, in line with the open-source projects we value, including those maintained by the Boavizta association. Discussions are underway with Data For Good and other stakeholders to establish this commons, its governance, and a clearly documented contribution process so that anyone can propose new data based exclusively on open sources.

One of our inspirations is the MapYourGrid project, which allows anyone interested to contribute to mapping the electricity grid. Data produced through such initiatives are already used by industry professionals and enable progress that would be impossible without a commons.

A final word?

I will conclude by saying that one of the main takeaways of this study is that opacity on this topic is maintained—whether for good or bad reasons—by many actors, and that it needs to be addressed.

A comprehensive understanding of the challenges surrounding generative AI must go beyond analyses limited to per-prompt impacts, which often suffer from narrow scopes and biased framing. Focusing solely on a single processing unit diverts the debate away from a systemic view of the infrastructures and systems involved—and therefore from their real impacts. The issue must be approached from a systemic perspective.

Satellite-image-based approaches are a promising way to begin better mapping these critical infrastructures and the sector’s expansion or transformation trends. Whether the goal of analyses on datacenters is to reduce energy consumption, land use, environmental impacts, or to strengthen infrastructure sovereignty, it is hard to imagine any real progress without genuine understanding and open documentation of this topic.

References: