
DCR Predicts: Can data centres become ‘good neighbours’ in 2026?

2 February 2026 at 08:18

Gareth Williams, Director, UK, India, Middle East and Africa Data Centres and Technology Leader at Arup, argues that 2026 should be the turning point for designing facilities that stabilise grids, steward water, and deliver visible community benefits.

2026 marks a pivotal opportunity to transform how data centres are perceived by the public. Much has been done to shift their image from anonymous ‘black boxes’ to strategic assets. Now we must ensure they are seen as positive partners for local energy, water and communities.

That means designing for reciprocity: centres that not only consume, but also stabilise grids, steward scarce water, create jobs, share heat, and leave biodiversity richer than before. This is what I see in briefs for clients, planners and operators alike: putting community benefit at the heart of developments, not as an afterthought.

Energy: from load to flexible, clean, locally useful power

AI-centric workloads are driving volatile, high-density demand, making efficiency gains harder. This is forcing smarter energy strategies, from chip-level liquid cooling and rack-level heat recovery to intelligent workload management.

We will increasingly see data centres act as energy hubs, with co-located renewables, multi-hour batteries, combined heat and power systems, and grid-service participation (frequency response, demand shifting) from day one. Pilot policies already treat facilities as grid allies, including heat-reuse quotas and flexible-access contracts. Operating models will increasingly shift compute to areas with surplus wind and sun — an approach that could also route non-time-critical training to regions with surplus energy.
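
As a minimal sketch of what such carbon-aware routing could look like, the snippet below picks a region for a deferrable training job based on forecast renewable surplus; the region names, surplus figures and job size are illustrative assumptions, not any operator's actual scheduler.

# Minimal sketch of carbon-aware routing for deferrable training jobs.
# Region names, surplus forecasts and job sizes are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Region:
    name: str
    surplus_mw: float         # forecast renewable surplus in the next window
    spare_capacity_mw: float  # spare data centre capacity in that region

def route_job(job_mw: float, regions: list) -> Optional[Region]:
    # Only consider regions that can absorb the job, then pick the largest surplus.
    candidates = [r for r in regions
                  if r.surplus_mw >= job_mw and r.spare_capacity_mw >= job_mw]
    return max(candidates, key=lambda r: r.surplus_mw, default=None)

regions = [Region("north", surplus_mw=120, spare_capacity_mw=40),
           Region("coastal", surplus_mw=35, spare_capacity_mw=60)]
chosen = route_job(job_mw=30, regions=regions)
print(chosen.name if chosen else "defer until surplus is available")  # -> north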

Baseload energy supply options will mature unevenly. Some operators are testing power purchase agreements linked to small modular reactors to accelerate capacity. Others will combine hydrogen fuel cells for peak resilience with smart microgrids and local renewables. Regardless, the key is to offer two-way benefits: better uptime for operators and measurable support for national grid stability.

Water: design for scarcity, stewardship and circularity

Cooling demand will keep rising with denser compute. In some cases this shifts cooling from air to liquid solutions, but the next step is water stewardship by design: closed-loop systems, immersion cooling where appropriate, and zero-freshwater ambitions in stressed catchments.

The Climate Neutral Data Centre Pact points to a water usage effectiveness (WUE) trajectory from ~1.8 L/kWh to 0.4 L/kWh at water-stressed sites by 2040. This is ambitious, but achievable if we switch to non-potable sources and track upstream and downstream impacts.
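
To give a rough sense of what that trajectory means in absolute terms, here is a back-of-the-envelope calculation; the 10 MW IT load and year-round utilisation are illustrative assumptions, not figures from the Pact.

# Back-of-the-envelope: annual water use at the old and target WUE levels.
# The 10 MW IT load and full-year utilisation are assumptions for illustration.
it_load_kw = 10_000                   # assumed IT load (10 MW)
it_energy_kwh = it_load_kw * 8_760    # kWh of IT energy over a full year

for wue_l_per_kwh in (1.8, 0.4):
    litres = wue_l_per_kwh * it_energy_kwh
    print(f"WUE {wue_l_per_kwh} L/kWh -> about {litres / 1e6:.0f} million litres a year")
# WUE 1.8 L/kWh -> about 158 million litres a year
# WUE 0.4 L/kWh -> about 35 million litres a year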

Practical levers for 2026 include site-level greywater reuse, recycled/industrial ‘brackish’ water sources, rainwater harvesting with sponge landscapes, and seawater cooling at coastal hubs — where environmental permissions and biodiversity management are designed from the outset. Singapore’s Green Data Centre Roadmap shows how regulation can drive cooling tower efficiency upgrades, blowdown recycling and cycles-of-concentration improvements that cut freshwater withdrawals at scale.

Community engagement: early, transparent, beneficial

Engagement still starts too late on many projects. Flip the sequence: begin with benefits, then shape the scheme around agreed outcomes. Practical packages include renewable partnerships that share surplus power; district heat reuse; biodiversity corridors and accessible green space; fibre upgrades that lift local connectivity; and STEM education funding alongside jobs for technicians and landscapers.

Community-first design de-risks approvals and earns trust. These aren’t gestures; they increase value over the life of the campus. This ‘good neighbour’ lens is the fastest way to retire the ‘black box’ image and demonstrate tangible contributions to people’s lives.

Technology: intelligent management, edge resilience, advanced cooling

AI already plays a crucial role in enhancing operations, and it’s only getting smarter. One example is Digital Realty’s collaboration with Ecolab, which identifies real-time operational inefficiencies in cooling systems and recommends improvements to conserve water.

AI-powered management will become the operating system of next-generation facilities, actively orchestrating workloads, power and cooling to maximise efficiency. Intelligent monitoring will drive automation for predictive maintenance, spotting deteriorating components early and scheduling interventions without disrupting SLAs.
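
As a minimal sketch of the kind of check such monitoring might automate, the snippet below flags a component whose readings drift away from baseline before it fails; the sensor, baseline and thresholds are assumptions for illustration.

# Minimal sketch of a predictive-maintenance check: flag a component whose recent
# readings drift above baseline before it fails. Baseline and margin are assumed.
from statistics import mean

def needs_inspection(recent_temps_c, baseline_c=65.0, margin_c=5.0, window=6):
    # Flag when the rolling average of the last few readings exceeds baseline + margin.
    return mean(recent_temps_c[-window:]) > baseline_c + margin_c

coolant_pump_temps = [64.8, 65.1, 66.0, 67.5, 69.3, 71.2, 72.8, 73.5]
if needs_inspection(coolant_pump_temps):
    print("Schedule a coolant pump inspection at the next maintenance window")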

At campus scale, hyperscale modular architecture (standardised power and cooling blocks with repeatable controls) will enable capacity expansion and help manage AI surges. And at rack level, advanced liquid cooling systems (direct-to-chip and rear-door heat exchangers) will integrate with smart controls to maximise performance while minimising power and water use.

Materials: low-carbon, modular, designed for circular recovery

Measuring whole-life carbon is vital to managing the sustainability of buildings and critical infrastructure, including data centres. The materials brief should be explicit: certified low-carbon or recycled steel, geopolymer concrete where feasible, and engineered timber for appropriate architectural elements and shading. Envelope design, daylighting and thoughtful material selection can cut operational and embodied impacts while improving working environments.

2026 will see increasing design for disassembly and recovery: standardised rack aisles, traceable components, and procurement that favours reclaimed metals and remanufactured cooling equipment. We should expect to link digital asset plans with physical asset lifecycle strategies, ensuring that refresh cycles trigger material recovery instead of waste.

Acceleration: scale fast, standardise what matters, customise what counts

Large, out-of-town campuses with repeatable, prefabricated/containerised solutions are the only way to match AI demand responsibly. To make this happen, owners and operators will need to standardise the backbone (power blocks, cooling modules, monitoring stacks), then customise for local energy and water contexts.

Reduced bespoke engineering means faster approvals, lower risk, and clearer community commitments (heat and water reuse, biodiversity) baked into template designs. Energy policies that treat campuses as anchor tenants and reward flexibility services will further cut delivery timelines while raising public value.

Conclusion: a systems brief

This is the year to design data centres as reciprocal systems: energy hubs that stabilise grids and disclose 24/7 clean sourcing; water stewards that minimise freshwater draw and close loops; and neighbours that fund skills, share heat, and leave landscapes better than before.

With multidisciplinary teams and a place-first brief, owners and operators can move from compliance to contribution — engineering facilities that are engines of local resilience and global compute. If we build them this way, the sector will be remembered not for what it consumed, but for what it enabled.

This article is part of our DCR Predicts 2026 series. The series has now officially concluded; you can catch up on all the articles at the link below.

DCR Predicts 2026

DCR Predicts: The new bottleneck for AI data centres isn’t technology – it’s permission

29 January 2026 at 08:23

As gigawatt-scale sites move from abstract infrastructure to highly visible ‘AI factories’, Tate Cantrell, Verne CTO, argues that grid capacity, water myths, and local sentiment will decide what actually gets built.

The industry in 2026 will need to get ready for hyper-dense, gigawatt-scale data centres, but preparation will involve more than infrastructure design alone. AI’s exploding computational demand is pushing designers to deliver ever-denser facilities that consume a growing volume of power and challenge conventional cooling.

The growth of hyperscale campuses risks colliding with a public increasingly aware of power and water consumption. If that happens, a gap may open between what designers can achieve with the latest technology and what communities are willing to accept.

A growing public awareness of data centres

The sector has entered an era of scale that would have seemed implausible a few years ago. Internet giants are investing billions of dollars in facilities that redefine large-scale and are reshaping the market. Gigawatt-class sites are being built to train and deploy AI models for the next generation of online services.

But their impact extends beyond the data centre industry: the communities hosting these ‘AI factories’ are being transformed, too.

This is leading to engineered landscapes: industrial campuses spanning hundreds of acres, integrating data halls with power distribution systems and cooling infrastructure. As these sites become more visible, public awareness of the resources they consume is growing. The data centre has become a local landmark – and it’s under scrutiny.

Power versus perception

Power is one area receiving attention. Data centre growth is coinciding with the perception that hyperscale operators are competing for grid capacity or diverting renewable power that might otherwise support local decarbonisation. There is no shortage of coverage suggesting data centres are pushing up energy prices, too.

These perceptions have already had consequences. In the UK, a proposed 90 MW facility near London was challenged in 2025 by campaigners warning that residents and businesses would be forced to compete for electricity with what one campaign group leader called a “power-guzzling behemoth”. In Belgium, grid operator Elia may limit the power allocated to operators to protect other industrial users.

It would not be surprising to see this reaction continue in 2026, despite the steps taken by all data centre operators to maximise power efficiency and sustainability.

Cool misunderstandings 

Water has become another focal point. Training and inference workloads rely on concentrated clusters of GPUs with rack densities that exceed 100 kW. The amount of heat produced in such a dense space exceeds the capabilities of air-based cooling, driving the move to more efficient liquid systems.

Yet ‘liquid cooling’ is often interpreted by the public as ‘water cooling’, feeding a perception that data centres are draining natural water sources to cool servers.

In practice, this is rarely the case. While data centres of the past have relied heavily on evaporative cooling towers to deliver lower Power Usage Effectiveness, today we see a strong and consistent trend towards lower Water Usage Effectiveness through smarter cooling and sustainable design. Developments in technology are making water-free cooling possible, too, with half of England’s data centres using waterless cooling. Many operators use non-water coolants and closed-loop systems that conserve resources.
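
For readers less familiar with the two metrics, a minimal worked example of how PUE and WUE are typically calculated follows; the input figures are purely illustrative.

# PUE and WUE as commonly defined; all figures below are illustrative.
total_facility_energy_kwh = 60_000_000   # assumed annual energy for the whole site
it_energy_kwh = 50_000_000               # assumed annual energy delivered to IT equipment
annual_water_litres = 20_000_000         # assumed annual site water consumption

pue = total_facility_energy_kwh / it_energy_kwh   # lower is better; 1.0 is the floor
wue = annual_water_litres / it_energy_kwh         # litres per kWh of IT energy

print(f"PUE = {pue:.2f}, WUE = {wue:.2f} L/kWh")  # PUE = 1.20, WUE = 0.40 L/kWh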

Data centres as part of the community 

Addressing public concerns will require a change in how operators think about their place in communities. Once built, a data centre becomes part of the local fabric, and the company behind it becomes a neighbour. Developers need to view that relationship as more than transactional. They must demonstrate that growth is supported by resilient grids capable of meeting new demand without destabilising supply or driving up costs.

Water and power are essential resources, so public concern is understandable. It’s therefore important that operators show that density and efficiency can be achieved without disproportionate environmental impact. The continued rollout of AI-ready data centres will depend as much on social alignment as on advances in chip performance.

That alignment will be tested in 2026 and beyond as another wave of high-density deployments arrives. Based on NVIDIA’s product roadmap, we already have a sense of what’s coming: each generation of hardware delivers more power and heat, requiring more advanced infrastructure.

NVIDIA’s Chief Executive Jensen Huang introduced the DSX data centre architecture at GTC 2025 in Washington DC, a framework designed to make it easier for developers with limited experience to deploy large-scale, AI-ready facilities. In effect, it offers a global blueprint for gigawatt-scale ‘AI factories’.

A positive outcome of this will be a stronger push towards supply chain standardisation. Companies such as Vertiv, Schneider Electric and Eaton are aligning around modular power and cooling systems that are easily integrated into these architectures. NVIDIA, AMD and Qualcomm, meanwhile, have every incentive to encourage that standardisation. The faster infrastructure can be deployed, the faster their chips can deliver the required compute capacity.

Standardisation, then, becomes a commercial and operational imperative, but it also reinforces the need for transparency and shared responsibility.

Efficiency and expansion 

Behind all of this lies the computational driver: the transformer model. These AI architectures process and generate language, code or other complex data at scale — the foundation of today’s generative AI. They are, however, enormously power-hungry, and even though it’s reasonable to expect a few DeepSeek-type breakthroughs in 2026 – discoveries that achieve similar performance with far less energy thanks to advances in algorithms, hardware and networking – we shouldn’t expect demand for power to drop.

The technical roadmap during 2026 is clear. We are heading towards greater density, wider uptake of liquid cooling and further standardisation. With data centres running as efficiently and sustainably as possible, developers and operators will need to establish trust with local stakeholders for the resources required to develop and power the AI factories that will drive a new era of industrial innovation.

This article is part of our DCR Predicts 2026 series. Check back every day this week for a new prediction, as we count down the final days of January.

DCR Predicts 2026

DCR Predicts: UK data centres are booming – but is the power running out?

By: DCR
27 January 2026 at 08:00

A panel of experts explore why grid capacity, connection queues, and rising AI power density are starting to dictate what can be built in 2026 – and where.

The UK’s data centre boom is accelerating, fuelled by the AI gold rush. Hyperscalers are expanding campuses and investment continues to flow, but the practical limits of growth are becoming harder to ignore.

Data centres already account for around 2.5% of the UK’s electricity consumption, and with AI workloads accelerating, that could rise sharply. Power availability, grid connection delays, planning constraints and sustainability pressures are no longer background considerations. As 2026 approaches, they are actively shaping what can be built, where, and how.

Power limits are no longer theoretical

For years, efficiency improvements helped offset rising demand, but that buffer is wearing thin as AI pushes power density beyond what many facilities were designed to support.

Skip Levens, Product Leader and AI Strategist for the LTO Program at Quantum, sees a clear roadblock ahead. “In 2026, AI and HPC data centre buildouts will hit a non-negotiable limit: they cannot get more power into their data centres. Build-outs and expansions are on hold and power-hungry GPU-dense servers are forcing organisations to make hard choices.”

He suggests that modern tape libraries could be the solution to two pressing problems: “First by returning as much as 75% of power to the power budget to ‘spend’ on GPUs and servers, while also keeping massive data sets nearby on highly efficient and reliable tape technology.”
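
As a rough illustration of that power-budget argument, the arithmetic below reallocates the power an always-on storage tier might otherwise draw; the baseline figures are assumptions, not Quantum's numbers.

# Rough power-budget arithmetic for the cold-storage argument above.
# The baseline storage draw and per-server figure are assumptions for illustration.
storage_power_kw = 400                        # assumed draw of an always-spinning disk tier
power_returned_kw = storage_power_kw * 0.75   # the ~75% figure cited above
gpu_server_kw = 10                            # assumed draw of one GPU-dense server

extra_servers = int(power_returned_kw // gpu_server_kw)
print(f"{power_returned_kw:.0f} kW freed -> room for about {extra_servers} more GPU servers")
# 300 kW freed -> room for about 30 more GPU servers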

Whether or not operators adopt that specific approach, the wider point holds. Growth is no longer just about adding capacity – it’s about how power is allocated and conserved within fixed limits.

Sustainability under pressure

Sustainability remains a defining theme for the sector, but the pace of AI-driven expansion is testing how deeply those commitments are embedded.

Terry Storrar, Managing Director at Leaseweb UK, describes the balancing act many operators are facing: “Sustainability is still the number one topic in the data centre industry. This has to work for the planet, but also from an economic perspective.

“We can’t keep running huge workloads and adding these to the grid,” he warns. “It’s simply not sustainable for the long term. So, there is huge investment into how we make technology do more for less. In the data centre industry, this translates into achieving significant power efficiencies.”

Mark Skelton, Chief Technology Officer at Node4, agrees, warning, “Data centres already consume around 2% of national power, while unchecked growth could push that to 10-15%, at a time when the grid is already strained and struggling to keep pace with soaring demand. In some areas, new developments are being delayed simply because the grid cannot deliver the required capacity quickly enough.”

To put this into perspective, Google’s new Essex facility alone is estimated to emit the same amount of carbon as 500 short-haul flights every year.

Grid delays, planning and skills gaps

There’s also a broader question of how well prepared the UK actually is for such a rapid scale-up in data centre infrastructure.

“Currently, the rush to build is overshadowing the need for a comprehensive approach that considers how facilities draw power and utilise water, as well as how their waste heat could be repurposed for nearby housing or industry,” Node4’s Skelton continues. “The technology to do this already exists, but adoption remains limited because there is little incentive or regulation to encourage it.”

In the UK, high-capacity grid connections can take over a year to secure, while planning delays and local opposition add further friction. Another roadblock is that “communities will increasingly challenge data centre expansion over water and energy use,” warns Curt Geeting, Acoustic Imaging Product Manager at Fluke. This is “pushing operators toward self-contained microgrids, hydrogen fuel cells, and other alternative power sources. Meanwhile, a growing shortage of skilled technicians and electricians will become a defining constraint.”

Geeting believes automation and AI will be key to tackling some of these infrastructure roadblocks. “The data centre test and measurement market will enter 2026 on the brink of a major transformation driven by speed, density, and intelligence. Multi-fibre connectivity will expand rapidly to meet the bandwidth demands of AI-driven workloads, edge computing, and cloud-scale growth.

“Very small form factor connectors, multi-core fibre, and even air-core fibre technologies will begin reshaping how data moves through high-density environments – enabling faster transmission with lower latency. At the same time, automation and AI will take centre stage in testing and diagnostics, as intelligent tools and software platforms automate calibration tracking, compliance verification, and predictive maintenance across vast, complex facilities.”

Edge, sovereignty and a rethink of scale

Data centres remain the backbone of the digital economy, underpinning everything from cloud services to AI and edge computing. With the rapid rise in AI, there are concerns that the UK will struggle to keep pace.

“The AWS outage reminded everyone how risky it is to depend too heavily on centralised cloud infrastructure,” urges Bruce Kornfeld, Chief Product Officer at StorMagic. “When a single technical issue can disrupt entire operations at a massive scale, CIOs are realising that stability requires balance.

“In 2026, more organisations will move toward proven on-premises hyperconverged infrastructure for mission-critical applications at the edge. This approach integrates cloud connectivity to simplify operations, strengthen uptime and deliver consistent performance across all environments. AI will continue to accelerate this shift.”

“The year ahead will favour a shift toward simplicity, uptime and management,” he adds. “The organisations that succeed will be those that figure out how to avoid downtime with simple and reliable on-prem infrastructure to run local applications. These winners understand that chasing scale for its own sake does nothing but put them in a vulnerable position.” This redistribution may ease pressure on hyperscale campuses.

Looking to 2026

Looking ahead to 2026, the pressures facing UK data centres are unlikely to ease. Power constraints, grid delays and sustainability expectations are becoming long-term issues, not just temporary obstacles. While technologies like quantum computing may eventually reshape infrastructure design, they won’t resolve the immediate challenges operators face today. The UK still has an opportunity to lead in AI and digital infrastructure, but only if growth is planned with constraint in mind. Without clearer coordination, incentives and accountability, the rush to build risks locking inefficiencies into the system for years to come. 

This article is part of our DCR Predicts 2026 series. Check back every day this week for a new prediction, as we count down the final days of January.

DCR Predicts 2026

DCR Predicts: The ‘gig economy’ is coming to data centres in 2026

21 January 2026 at 08:00

Claire Keelan, Managing Director UK at Onnec, explains why project-based delivery models will become the backbone of new builds and upgrades in 2026, as traditional staffing struggles to match the pace and complexity of AI-led demand.

The data centre industry is constantly evolving. As AI workloads accelerate, operators are under mounting pressure to scale capacity while navigating skills shortages, infrastructure constraints and rising expectations around resilience. What worked a few years ago is no longer enough. Delivery models, workforce strategies and site design assumptions are all being tested.

In 2026, success will depend less on expansion and more on adaptability. Operators will need to rethink how projects are staffed, where capacity is built, and how existing assets are upgraded to meet AI demand. Flexible labour, broader talent inclusion, regional diversification and retrofitting will move from tactical considerations to strategic priorities.

The data centre ‘gig economy’ becomes the backbone of delivery

Flexible labour models will underpin almost every new data centre project. Traditional staffing can’t scale at the speed AI demands. By 2026, flexible, crowdsourced, project-based teams will fill critical gaps across design, building, and operations. This shift isn’t about replacing expertise; it’s about redeploying it. Clear standards, accreditation, and safety frameworks will make flexibility viable at scale, turning part-time professionals and returning workers into a reliable, high-quality talent engine.

Women become central to meeting capacity targets 

With women making up less than 8% of the current workforce, the imbalance is holding the sector back. In 2026, diversity will shift from talking point to operational priority. This means targeted recruitment, retraining programmes, and mentorship networks designed to bring more women into engineering, safety, and leadership roles. Diversity will be treated as a business resilience issue, not just a social goal. This is because the industry can’t meet AI’s demands while sidelining a sizable portion of its potential workforce.

AI growth zones redraw the map

Regional ‘AI growth zones’ will emerge as the new engines of capacity. In 2026, Manchester, South Wales, and Scotland will continue to gain momentum thanks to lower land costs, renewable energy access, and close ties to academic institutions. This regional diversification will help balance power use and strengthen resilience against local constraints. The days of London and the M4 corridor as the single dominant hub are fading; the future of data centres is distributed, collaborative, and regionally connected.

Retrofitting becomes a reality check

With the UK home to one of the world’s largest portfolios of legacy data centres, over the next year operators must prove how fast they can innovate to stay ahead in the new AI landscape. 

In 2026, we’ll see a surge in retrofitted data centres as operators rush to upgrade legacy sites to meet soaring AI demand. Power and cooling will be complex, but cabling and network capacity will be the real bottlenecks. Poor-quality or overcrowded cabling limits density, throttles performance, and makes future upgrades almost impossible. 

Smart operators will invest early in high-grade structured systems that support modular expansion and long-term flexibility. ‘Retrofit-ready’ will become the new benchmark for responsible, future-proof design.

Looking into 2026

By 2026, the data centre sector will be defined less by how much capacity it builds and more by how intelligently it evolves. AI is compressing timelines, exposing fragility, and forcing long-term decisions into the present. Operators that treat this moment as a simple scaling challenge will struggle. Those that recognise it as a structural reset will set the pace.

Data centres are becoming critical national infrastructure for an AI-driven economy, and resilience will matter as much as raw performance. Leadership will belong to operators that move early, design for uncertainty, and embed adaptability. The question in 2026 is not who can grow fastest, but who can keep up when the rules keep changing.

This article is part of our DCR Predicts 2026 series. Come back every week in January for more.

DCR Predicts 2026