
Now and Going Nuclear: Powering the Next Generation of Data Centers

Insights from ASG, Oklo Inc., Switch, and Equinix

Why Nuclear Energy is Back in the Data Center Conversation

At the infra/STRUCTURE Summit 2025, held October 15–16 at the Wynn Las Vegas, one of the most talked-about sessions was “Now and Going Nuclear.” The discussion explored how nuclear energy, long viewed as complex and controversial, is rapidly emerging as a viable solution for powering the data center industry’s next phase of growth.

Moderated by Daniel Golding, CTO of ASG, the panel featured Brian Gitt, Senior Vice President of Business Development at Oklo Inc.; Jason Hoffman, Chief Strategy Officer at Switch; and Philip Read, Senior Director of Product Management at Equinix. Together, they examined how technology, regulation, and market forces are aligning to make small modular reactors (SMRs) and nuclear-derived power a credible and necessary part of the digital infrastructure ecosystem.

A Generational Shift in Nuclear Perception

Daniel Golding opened the discussion by highlighting how dramatically attitudes toward nuclear energy have changed in recent years. “The political opposition has evaporated entirely in the past three to four years,” Golding observed. “What’s happened is a generational change. For younger generations who’ve grown up in a world shaped by climate change, nuclear risk seems modest compared to the risk of inaction.”

This generational shift, Golding noted, is paving the way for new conversations around nuclear deployment, not just as an energy option, but as an environmental imperative. The narrative has moved from “if” to “when,” setting the stage for nuclear integration into the world’s largest digital infrastructure operations.

Policy Momentum and Market Acceleration

Brian Gitt of Oklo described how a wave of regulatory and policy reforms has transformed the U.S. nuclear landscape in just the last year. “Since May, the federal government has released a series of executive orders removing barriers, unlocking fuel supply, and streamlining licensing,” Gitt said. “The NRC is now required to approve reactor applications within 18 months, and the DOE is opening federal lands for AI factories and power infrastructure.”

Gitt also announced that Oklo is leading construction on a $1.68 billion fuel recycling facility in Oak Ridge, Tennessee, the first of its kind in the U.S., designed to convert spent fuel into usable energy. “We’re taking what used to be seen as waste and turning it into 24/7 baseload power,” he explained. “We’ve moved from vision to execution, and the timeline from now to nuclear is about three years.”

Designing for a Nuclear-Powered Future

Jason Hoffman of Switch spoke to how data center design must evolve to integrate nuclear energy at the gigawatt scale. “When we talk about AI factories, we’re talking about facilities that are five times larger than what we’ve traditionally built,” Hoffman said. “These are sites measured in hundreds of acres, with power demand comparable to naval-scale energy systems. Nuclear makes that scale possible.”

He added that Switch and other major operators are actively exploring how to integrate self-generated nuclear power into future campuses. “It’s not just about access to power,” Hoffman said. “It’s about reliability, control, and sustainability. Nuclear enables all three.”

Philip Read of Equinix echoed this point from a customer perspective, emphasizing that clients want certainty. “Our customers want confidence in their power supply, growth strategy, and sustainability goals,” Read said. “They’re asking, ‘Do we need a different strategy for locations and energy sources?’ Nuclear provides that line of sight.”

Security, Scale, and Sustainability

The conversation also touched on key challenges. When asked what keeps him up at night, Hoffman was quick to answer: “Security posture.” He noted that as nuclear and data centers intersect, ensuring robust cybersecurity and operational safety will be critical.

Gitt added that misconceptions about nuclear waste remain one of the industry’s biggest hurdles. “We have enough stored fuel in the U.S. to power the country for generations,” Gitt said. “It’s not dangerous, it’s energy waiting to be unlocked. We’re sitting on the equivalent of five Saudi Arabias of energy, and we’re burying it instead of using it. That needs to change.”

Golding agreed, noting that for decades, the U.S. has stored waste in temporary pools, a model that is no longer scalable. The consensus: recycling and reusing fuel through modern SMRs is not only possible but essential.

Economic and Community Impact

Beyond technical feasibility, the panel highlighted the broader economic upside of nuclear development. Gitt shared that Oklo’s projects are already generating significant local economic benefits. “We just broke ground in Iowa, and the job creation has been incredible,” Gitt said. “This isn’t just energy innovation, it’s economic revitalization. Communities are competing to host these facilities because they bring skilled jobs, tax revenue, and long-term prosperity.”

Hoffman and Read both agreed that pairing nuclear generation with data center campuses could redefine industrial development in the U.S. “These are long-term, high-value assets,” Hoffman said. “They’re not speculative, they’re the backbone of America’s digital and economic future.”

From Renewable to Reliable: The Role of Baseload Power

Golding raised the question of whether hyperscalers are ready to embrace nuclear as part of their sustainability strategies. Gitt’s answer was unequivocal: “Every major hyperscaler now includes nuclear in their long-term power roadmap. It’s part of the equation for net-zero.”

Gitt noted that nuclear has the smallest materials footprint of any energy source, smaller even than wind or solar, making it one of the most resource-efficient options available. “If we want to keep the lights on and cut emissions, there’s really no alternative,” Gitt said. “The data center industry has realized that nuclear isn’t optional, it’s inevitable.”

From Vision to Reality

The panel made clear that the intersection of nuclear energy and data center infrastructure is no longer theoretical. Regulatory pathways are opening, commercial projects are underway, and the industry’s largest power consumers are preparing to integrate nuclear into their long-term sustainability and capacity strategies.

As Golding concluded, “This isn’t a thought experiment anymore. It’s happening. By the end of the decade, nuclear will be powering data centers, and helping our industry lead the global energy transition.”

Infra/STRUCTURE 2026: Save the Date

Want to tune in live, receive all presentations, and gain access to C-level executives, investors, and industry-leading research? Then save the date for infra/STRUCTURE 2026, set for October 7–8, 2026, at the Wynn Las Vegas. Pre-registration for the 2026 event is now open, and you can visit www.infrastructuresummit.io to learn more.

The post Now and Going Nuclear: Powering the Next Generation of Data Centers appeared first on Data Center POST.


Hyperscale Data Center Procurement: Scaling Smarter in the Age of AI

Insights from Structure Research, Cloudflare, Decillion, Groq, and Lambda

Why Hyperscale Procurement Matters Now

At the infra/STRUCTURE Summit 2025, held October 15–16 at the Wynn Las Vegas, the session on Hyperscale Data Center Procurement explored how hyperscalers, cloud platforms, and AI companies are redefining site selection, capacity planning, and power procurement.

With the explosion of artificial intelligence (AI) and high-performance workloads, the panel examined how data center operators are adapting to meet new demands for speed, density, and collaboration. The discussion brought together leading experts who sit at the intersection of technology, infrastructure, and strategy: Jabez Tan, Head of Research at Structure Research; Sarah Kurtz, Data Center Selection Manager at Cloudflare; Whitney Switzer, CEO of Decillion; Anik Nagpal, Principal Strategic Advisor for Global Data Center Development at Groq; and Ken Patchett, VP of Data Center Infrastructure at Lambda.

Together, they offered a grounded yet forward-looking view of how hyperscale infrastructure is evolving and why collective problem-solving across the ecosystem has never been more urgent.

Understanding the New Procurement Reality – From Megawatts to Gigawatts

Moderator Jabez Tan opened by noting how quickly the scale of data center procurement has transformed. Just a few years ago, hyperscale planning revolved around megawatts. Today, as Ken Patchett of Lambda explained, “We used to talk in megawatts; now we’re talking in hundreds of megawatts or even gigawatts. The world has changed.”

Patchett emphasized that this growth is not theoretical; vacancy rates are at record lows, and facilities are being leased before construction even begins. “Seventy-three percent of buildings being built in the U.S. today are pre-leased before completion,” he said. “In some cases, they’re 100 percent committed before a shovel hits the ground.”

This surge underscores both the opportunity and the strain on today’s hyperscale procurement models. The traditional development timelines, often five years from land acquisition to delivery, are being tested by the speed at which AI and GPU-driven workloads are scaling.

Site Selection and Power – Seeing Through the Noise

Whitney Switzer, CEO of Decillion, offered insights into the increasingly complex process of site selection, especially in an environment filled with speculation and limited power capacity. “There’s a lot of land and a lot of promises,” Switzer said, “but not all sites can actually deliver what hyperscalers need. The challenge is cutting through the noise to identify real, deliverable power and real infrastructure.”

Anik Nagpal from Groq added that power availability has become the defining factor in any site’s viability. “We’re facing long waiting lists with utilities,” Nagpal explained. “It’s not enough to have a site, you need documented substation agreements, confirmed transformer orders, and clear delivery dates.” Without that level of verification, even well-positioned properties can fall short of hyperscale timelines.

Switzer reinforced that the industry must move toward deeper collaboration between developers, power providers, and end users to accelerate readiness. “You have to build trust,” she said. “That’s what ensures creativity and alignment between the business and technical sides of a deal.”

Market Challenges and Evolving Partner Strategies

Sarah Kurtz of Cloudflare described a rapidly tightening capacity market, where competition for space and power is fierce. “Prices have moved dramatically,” Kurtz said. “We might go out for one megawatt and come back to find that the same capacity now costs four times as much.” Despite those pressures, Kurtz highlighted that the key is adaptability, knowing when to secure smaller, strategic sites that can deliver sooner rather than waiting years for larger campuses.

Ken Patchett echoed this sentiment, pointing out that the demand wave is forcing new forms of partnership. “We’re all asking, ‘Do you have space? Do you have power?’ Conversations that didn’t happen ten years ago are now everyday,” Patchett said. “We have to work together, utilities, operators, AI companies, to actually build the infrastructure that matches the pace of technology.”

Nagpal added that power immediacy and transparency are now central to deal-making. “People want to believe the power’s there,” he said, “but you only know it when you see the agreements in writing. That’s the new due diligence.”

Designing for Density and Agility – Building for the Next Cycle

A recurring theme throughout the session was that data center design itself must evolve as hardware cycles shorten. Patchett underscored that density and adaptability are now fundamental requirements. “The buildings we designed 20 years ago won’t support what we’re running today,” Patchett said. “We’re moving from 50-kilowatt racks to 600-kilowatt racks, and we have to build in a way that can pivot every six to nine months as hardware changes.”

Patchett added that despite fears of overbuilding, the industry isn’t facing a bubble. “We’re still using what we built ten or twenty years ago,” he said. “This is about addition, not replacement. Our challenge is to keep up with demand, not question it.”

The panelists agreed that modular design, flexible financing, and shared innovation will define the next phase of data center evolution. As Switzer summarized, “It’s all about partnership, aligning resources and expertise to deliver creative solutions at scale.”

Collaboration as the New Competitive Edge

The session made clear that hyperscale procurement is no longer about simply buying power and land. It’s about integrating supply chains, synchronizing with utilities, and designing for continuous evolution. Across every perspective, developer, operator, and end user, the message was the same: collaboration is the only way to scale sustainably.

The leaders on stage shared a unified view that as AI reshapes data center demand, the industry’s success will depend not on who builds fastest, but on who builds smartest—with transparency, trust, and long-term partnership at the core.




Alternative Cloud Providers Redefine Scale, Sovereignty, and AI Performance

At this year’s infra/STRUCTURE Summit 2025, held at the Wynn Las Vegas, one of the most forward-looking conversations came from the session “From Cloud to Edge to AI Inferencing.” Moderated by Philbert Shih, Managing Director at Structure Research, the discussion brought together a diverse panel of innovators shaping the future of cloud and AI infrastructure: Kevin Cochrane, Chief Marketing Officer at Vultr; Jeffrey Gregor, General Manager at OVHcloud; and Darrick Horton, CEO at TensorWave.

Together, they explored the emergence of new platforms bridging the gap between hyperscale cloud providers and the next wave of AI-driven, distributed workloads.

The Rise of Alternatives: Choice Beyond the Hyperscalers

Philbert Shih opened the session by emphasizing the growing diversity in the cloud ecosystem, from legacy hyperscalers to specialized, regionally focused providers. The conversation quickly turned to how these companies are filling critical gaps in the market as enterprises look for more flexible, sovereign, and performance-tuned infrastructure for AI workloads.

Cochrane shared insights from a recent survey of over 2,000 CIOs, revealing a striking shift: while just a few years ago nearly all enterprises defaulted to hyperscalers for AI development, only 18% plan to rely on them exclusively today. “We’re witnessing a dramatic change,” Cochrane said. “Organizations are seeking new partners who can deliver performance and expertise without the lock-in or limitations of traditional cloud models.”

Data Sovereignty and Global Reach

Data sovereignty remains a key differentiator, particularly in Europe. “Being European-born gives us a unique advantage,” Gregor noted. “Our customers care deeply about where their data resides, and we’ve built our infrastructure to reflect those values.”

He also highlighted OVHcloud’s focus on sustainability and self-sufficiency, from designing and operating its own servers to pioneering water-cooling technologies across its data centers. “Our mission is to bring the power of the cloud to everyone,” Gregor said. “From startups to the largest public institutions, we’re enabling a wider range of customers to build, train, and deploy AI workloads responsibly.”

AI Infrastructure at Scale

Horton described how next-generation cloud providers are building infrastructure purpose-built for AI, especially large-scale training and inferencing workloads. “We design for the most demanding use cases, foundational model training, and that requires reliability, flexibility, and power optimization at the cluster scale.”

Horton noted that customers are increasingly choosing data center locations based on power availability and sustainability, underscoring how energy strategy is becoming as critical as network performance. TensorWave’s approach, Horton added, is to make that scale accessible without the hyperscale overhead.

Democratizing Access to AI Compute

Across the panel, a common theme emerged: accessibility. Whether through Vultr’s push to simplify AI infrastructure deployment via API-based services, OVHcloud’s distributed “local zone” strategy, or TensorWave’s focus on purpose-built GPU clusters, each company is working to make advanced compute resources more open and flexible for developers, enterprises, and AI innovators.

These alternative cloud providers are not just filling gaps — they’re redefining what cloud infrastructure can look like in an AI-driven era. From sovereign data control to decentralized AI processing, the cloud is evolving into a more diverse, resilient, and performance-oriented ecosystem.

Looking Ahead

As AI reshapes industries, the demand for specialized infrastructure continues to accelerate. Sessions like this one underscored how innovation is no longer confined to the hyperscalers. It’s emerging from agile providers who combine scale with locality, sustainability, and purpose-built design.




Managed Infrastructure at a Crossroads: The Power-First Revolution

How Data Center Developers Are Navigating the Clash Between Rapid AI Demand and Decades-Old Regulatory Frameworks

The data center industry has undergone a seismic shift that would have seemed unthinkable just a few years ago. Historically, operators selected prime real estate locations and then brought power to their facilities. Today, that equation has completely reversed: developers are taking their data centers to wherever power exists, fundamentally reshaping the geographic and strategic landscape of digital infrastructure.

At the infra/STRUCTURE Summit 2025, held October 15-16 at The Wynn Las Vegas, a distinguished panel gathered to explore this transformation and its profound implications. The session “Managed Infrastructure at a Crossroads” brought together experts from across the infrastructure ecosystem to discuss the challenges, opportunities, and regulatory complexities of the power-first era.

Moderated by Hadassa Lutz, Senior Consulting Analyst at Structure Research, the panel featured Gene Alessandrini, SVP of Energy & Location Strategy at CyrusOne; Allison Clements, Partner at ASG (Aspen Strategy Group) and former FERC Commissioner; and David Dorman, Director of Commercial Operations at APR Energy. Together, they examined how the industry is navigating unprecedented demand while working within, and sometimes around, regulatory frameworks that were never designed for this moment.

The 12-Month Transformation: From “Tier One Markets” to “Just Finding Power”

Alessandrini opened the discussion with a striking timeline that captured the industry’s rapid evolution. When he joined CyrusOne just 12 months prior, the focus was squarely on tier-one markets: Northern Virginia, Columbus, and Chicago, the traditional data center hubs with established infrastructure and connectivity.

“Six months into the job, a lot of new secondary markets started popping up,” Alessandrini explained. “Twelve months into the job, it’s like ‘just find power, and then we’ll figure it out.’”

This shift represents more than a change in strategy; it’s a fundamental reimagining of the industry’s priorities. “We really understand that the industry is constrained in the traditional markets,” Alessandrini continued. “Even the new secondary markets are getting tapped out way too quickly. We’ve gone from an industry that focused really on land acquisition with utility interconnection to now, 12 months later, where any source of power is on the table and location requirements are wide open.”

The traditional site selection checklist put land first and power second, followed by water, workforce, and tax incentives. That order has now been completely inverted: power is number one, and land becomes secondary. “Geography is now wide open,” Alessandrini said. “Your decisions on where you’re going to build that next data center really diverge from all the patterns you followed before.”

The trade-off, he noted, is that scale becomes critical. “If you can get scale at that location, you somewhat offset the ecosystem of challenges when you start building data centers in farther-away locations.”

Bridging Solutions: APR Energy’s Rapid Deployment Model

Dorman of APR Energy provided insight into how power generation providers are responding to this urgent demand with innovative, fast-deployment solutions.

“We focus right now on bridging solutions, either a component solution that we can potentially build for a utility or standalone generation,” Dorman explained. “Our lead time from contract to start-up is about 30 to 90 days after receiving permits and everything else. We’re very quick, and what we bring is kind of a fast-to-market approach.”

APR Energy, with over 20 years in the industry, has developed a proven playbook for rapid deployment. Their equipment is scalable, delivered in blocks of 50, 150, or 300 megawatts depending on site requirements. “The larger the deployment, the more equipment you have, potentially the cheaper the price point,” Dorman noted.

However, even with this rapid deployment capability, there are critical prerequisites. “The assumptions are: you have permits, you have a site that’s a viable site, and you have fuel supply, typically natural gas, connected close to the site,” Dorman said. “If not, there are providers out there that can bridge to a potential connection if there is some development needed.”

The role APR Energy plays is essential in the current environment: providing a bridging solution that fills the gap between immediate demand and the 24-60 months it typically takes for utility-scale generation or permanent solutions to come online.

The Regulatory Reality: A Clash of Cultures and Timelines

Allison Clements, bringing her perspective as a former Federal Energy Regulatory Commission (FERC) commissioner, offered a sobering assessment of the regulatory challenges facing the industry.

“What is fascinating to me is the clash of cultures between the regulated utility industry and the data center development industries,” Clements said. “Data center developers have a real estate background, they’re tech players coming in, they’re used to operating in actual markets with real supply and consumer choice. In the energy world, it just doesn’t work that way.”

Clements described the fundamental mismatch: “You’ve got a regulatory machine and incumbent incentives, and nothing takes less than 30 or 60 or 90 days. There’s this real lack of appreciation and understanding: Why do you have to move so quickly? Why are you moving so slowly?”

Clements emphasized that when data center developers enter the utility space, they’re stepping into one of the most heavily regulated industries in existence. “You might have one, two, or three regulators who have a hand in decisions when it comes to your interconnection, permitting, and water use. The market is trying to move so quickly, and these regulatory frameworks are just trying to catch up.”

Clements’ message was clear but empathetic: “These utility sectors aren’t dumb. They’re just built into a giant bureaucracy that wasn’t designed to enable the goals that we now have today. That’s true for state regulatory commissions and the Federal Energy Regulatory Commission. We just weren’t set up for this.”

Alessandrini echoed this disconnect from the developer perspective: “Twelve months before I started, I never thought utilities were as separate from our world as you just described. But after living it for 12 months, I realized that our industry is moving much faster than the regulatory framework: utility markets, interconnection, everything is at a slower pace.”

The challenge, Alessandrini explained, is finding ways to bridge this reality gap. “We’re having lots of conversations, trying to bridge the understanding of what our facilities are, what their businesses are, and why we’re unfortunately not moving at the same pace. We’re working together to find ways that relieve pressure on their framework so they can be more comfortable making decisions that allow us to move faster.”

The Utility Incentive Problem: Capital Investment vs. Operational Efficiency

A critical issue the panel addressed was the fundamental structure of utility incentives, a system that may be working against the rapid expansion the industry needs.

“What’s often missed is the perception that utilities are incentivized incorrectly,” Lutz noted, asking the panelists to expand on whether utilities are rewarded more for capital spending than for optimization and efficiency.

Clements confirmed this concern is rooted in reality. “You have a regulatory system where the incumbent utilities have been given the franchise right to be a monopoly, and they make money by one: volumetric sales, so the more electrons they sell, the more money they make, and two: by capital investment, steel in the ground, generation or grid.”

This creates a structural problem: “A lot of times, efficient operations and opportunities like buying behind-the-meter generation don’t align with the utility business model.”

Dorman, drawing on his 13 years as a utility executive before moving to generation, offered a particularly insightful observation: “It’s funny we’re sitting here saying utilities want to invest in their rate base because that becomes their revenue stream. Yet now the behavior, I know it’s not their intention, but the way it appears, is that you don’t want the load anymore, so your rate base doesn’t grow. The new tariff structures appear to disincentivize load growth rather than incentivize it.”

This paradox sits at the heart of the industry’s current challenges: utilities structured to profit from capital investment and volume sales are implementing tariffs that may discourage the very load growth that should benefit them.

Cost Allocation: The Political Third Rail

Clements addressed one of the most politically sensitive issues facing the industry: who pays for what when data centers connect to the grid.

“Data center markets have come onto the system at rapid-fire pace in a moment where electricity prices were already rising,” Clements explained. “The grid has been underinvested in for a long time.”

She outlined three types of costs:

  1. Direct interconnection costs: the physical connection to the grid
  2. Indirect grid impact costs: taking up space on the grid that might impact economic constraints elsewhere
  3. The cost of the electron itself: the actual generation cost

“There’s a lot of discussion around cost allocation,” Clements said. “Data centers come in saying ‘we want to pay our fair share,’ and they do pay for direct costs like substations or switching circuits. But what they don’t pay for is residential customer increases in electricity prices or broader transmission development.”

Clements was careful to note this isn’t necessarily unfair, it’s how supply and demand markets are supposed to work. “You have new customers, new supply should come in, and it should all work out. The opportunity for data centers to lower costs is tremendous.”

The problem, she explained, is timing. “The underlying regulatory frameworks haven’t kept up. As a result, you see demand increasing and supply tightening because we haven’t had time for new supply to come in. These rising costs have been in some part because of data centers, but in large part were coming regardless.”

The political pressure is mounting. “Data center opposition is growing up in communities around the country,” Clements warned. “These are real people with cost concerns, and we need to take it seriously.”

The Scale of the Challenge: 128 Gigawatts by 2029

To put the industry’s challenge in perspective, Clements shared a sobering statistic: “The lowest forecast suggests we’ll have 128 new gigawatts of power demand for data centers. That’s the amount of power it takes to power the entire mid-Atlantic region, which includes Philadelphia, Washington D.C., and Chicago.”

Clements was blunt about the timeline constraints: “If you want power by Q4 2029 and you start today, you can’t build a combined-cycle gas plant in that time. Maybe, if you’re lucky enough to secure modular equipment and you start procurement today, in 18 months you can have a solution.”

This reality is driving the search for alternatives and interim solutions, everything from bridging generation to demand flexibility to grid optimization technologies.

Being a “Good Citizen”: Beyond Just Big Power Solutions

When asked what it means to be a “good citizen” in this environment, the panelists emphasized the need for data center operators to look beyond simple power procurement.

Clements urged the industry to embrace innovation: “The hyperscalers want these innovative solutions. Think about opportunities beyond just the big power solutions. There’s hardware and software that helps run the grid more dynamically. We still run our grid like it’s in the 1980s era, no joke in the US.”

Clements also highlighted demand flexibility as a critical tool: “Data centers actually committing to some sort of proactive curtailment throughout the year. That’s hard, it might involve going offline for periods. There are trade-offs in each of these approaches, and each one introduces different risks into your transaction that may or may not be desirable.”

The panel also addressed community impact. “There’s a lot of opportunity for smart developers to give back to the community,” Clements said. “Fire stations, education, public services, these investments matter. We need to help communities understand which part of their electricity bills data centers are responsible for and what they’re doing to mitigate that impact.”

The New Geographic Reality: Data Centers in Unexpected Places

Alessandrini painted a picture of the industry’s evolving geography. “We’re possibly creating new markets for data centers because we’re taking data centers to places you’ve never seen before, the outskirts of Texas, Alabama, Wyoming, and all these other areas.”

This geographic expansion isn’t without challenges. These regions often lack the established ecosystems, workforce, connectivity, and supply chains that traditional markets offer. But when balanced against the availability of power at scale, the trade-offs become acceptable.

“The reality is the industry will continue to broaden,” Alessandrini said. “Power solutions will come from locations with access to gas supply that historically weren’t considered data center markets.”

The 24-to-60-Month Gap: A Bridge Too Far?

Perhaps the session’s most critical tension was captured in Alessandrini’s assessment of timeline misalignment.

“What I realized 12 months in is that instead of being more comfortable that the gap was closing, the gap is actually widening,” Alessandrini said. “We have a 24-month timeline to get facilities built and operational. The regulatory and utility side operates on a 36-to-60-month timeline.”

He was emphatic about the industry’s position: “We’re going to be there in 24 months, and you just tell us when you can join the party, whether that’s 60 months or 72 months. But guess what? We’re going to be there in 24 months.”

The question facing the industry is how to bridge this gap. “We’re going to build power plants, we’re going to build bridging solutions, we’re going to build all these things to allow data centers to get built based on the velocity of our industry,” Alessandrini said.

Dorman agreed: “If we can solve the problems as an industry and bring that 60 months down to 36 months, we still have this 24-month target that we just can’t let go of. We’re going to keep building.”

Looking Ahead: Nuclear, SMRs, and Long-Term Solutions

While the session focused heavily on near-term challenges and natural gas solutions, the panelists acknowledged that longer-term answers may include nuclear power and Small Modular Reactors (SMRs).

“The new SMRs are something which can come to market too,” Alessandrini noted. “We’re trying to wrap our heads around the new reality of data centers today and possibly creating new markets, including with emerging nuclear technologies.”

However, the timeline for commercialized SMR technology remains uncertain, making bridging solutions and interim approaches all the more critical.

Key Takeaways: Navigating the Power-First Era

The panel’s discussion revealed several critical insights for the data center industry:

  1. Power Has Become the Primary Site Selection Criterion: The traditional real estate-first approach is dead. Geography is now determined by power availability at scale, fundamentally reshaping the data center map.
  2. The Regulatory-Developer Timeline Gap Is Widening: Developers operate on 24-month cycles; regulators and utilities on 36-to-60-month cycles. This gap isn’t closing; it’s growing, forcing creative bridging solutions.
  3. Utility Incentive Structures Are Misaligned: Current regulatory frameworks reward utilities for capital investment and volumetric sales, which may not align with the rapid, efficient expansion the industry needs.
  4. Cost Allocation Is a Political Powder Keg: As residential electricity bills rise and data center development accelerates, community opposition is growing. The industry must proactively address cost concerns and community impact.
  5. Bridging Solutions Are Essential: With demand far outpacing utility-scale generation timelines, fast-deployment bridging solutions from providers like APR Energy are critical to keeping projects on track.
  6. Being a Good Citizen Requires More Than Paying Bills: Data center operators must embrace demand flexibility, support community initiatives, and invest in grid optimization technologies, not merely consume power.
  7. The Scale Is Unprecedented: Meeting 128 gigawatts of new demand by 2029 will require every tool in the toolbox: traditional generation, bridging solutions, demand management, grid optimization, and potentially nuclear.
  8. Secondary and Tertiary Markets Are the New Frontier: Texas, Alabama, Wyoming, and other historically non-traditional data center locations are becoming viable, even preferred, due to power availability.

For operators, investors, policymakers, and community stakeholders, the message is clear: the data center industry is at an inflection point. The power-first revolution isn’t a temporary adjustment; it’s the new normal. Success will require unprecedented collaboration between developers, utilities, regulators, and communities to bridge the gap between digital infrastructure’s breakneck pace and the energy sector’s deliberate timelines.

As Alessandrini concluded: “This is a dynamic industry to be in. We’re going to keep building, we’re going to keep finding solutions, because the demand isn’t going away, it’s only accelerating.”

Infra/STRUCTURE 2026: Save the Date

Want to tune in live, receive all presentations, and gain access to C-level executives, investors, and industry-leading research? Then save the date for infra/STRUCTURE 2026, set for October 7–8, 2026, at the Wynn Las Vegas. Pre-registration for the 2026 event is now open; visit www.infrastructuresummit.io to learn more.

The post Managed Infrastructure at a Crossroads: The Power-First Revolution appeared first on Data Center POST.


Measured Optimism: Balancing Growth and Realism in the Global Data Center Market

Understanding What’s Next for Digital Infrastructure

At this year’s infra/STRUCTURE Summit 2025, held at the Wynn Las Vegas, industry leaders came together to unpack the state of digital infrastructure in an era defined by AI-driven demand and hyperscale expansion.

One of the standout sessions of the event was “Measured Optimism,” led by Philbert Shih, Managing Director of Structure Research. Known for his data-first insights and global perspective, Shih provided an in-depth look at where the data center market stands today, and where it’s heading next.

His central question set the tone: Are we in a period of sustainable growth, or are we overbuilding?

A Balanced View: Bullish but Realistic

Shih began by acknowledging the debate between the “bulls” and “bears” in digital infrastructure: those who see unbounded opportunity and those who warn of a potential correction.

While recognizing some speculative trends, such as “fake data centers” and build pauses in certain regions, Shih urged attendees to take a longer view.

“There’s a lot of interest in the space, a lot of people with assets to develop,” Shih said. “But the fundamentals remain strong. We’ve seen time and again that this sector has the ability to absorb and grow through cycles.”

Shih drew comparisons to previous market phases. From the dot-com era to the rise of cloud computing, he suggested that what we’re seeing today is a natural evolution, not a bubble.

What the Data Shows

Shih supported his analysis with Structure Research’s latest findings:

  • Demand Continues to Outpace Supply. Hyperscalers and AI workloads are driving record demand. “We consistently see management teams reporting more demand than they can support,” Shih shared.
  • AI is an Accelerant, Not a Disruption. Shih explained that Meta’s recent build pause was less about demand softening and more about re-architecting for AI infrastructure. “AI is reshaping how capacity is planned and deployed,” he said.
  • Global Growth Momentum. While North America remains the largest market, growth across Europe and Asia is accelerating. Chinese and regional cloud providers are increasingly driving new development around the world.
  • Healthy Cycles, Not Cracks. Shih described the current slowdown in some areas as part of the natural “build–pause–absorb” cycle that defines infrastructure development. “Infrastructure doesn’t grow in a straight line,” he noted.

Collaboration Over Competition

A recurring theme throughout Shih’s presentation was partnership. The once-common expectation that hyperscalers would replace colocation providers with self-built facilities has largely given way to collaboration.

“There’s more cooperation between hyperscalers and colocation providers than ever before,” Shih said. “These partnerships are becoming essential to meeting global demand efficiently and sustainably.”

Shih also highlighted opportunities in pre-development and edge-scale projects, where new entrants and established providers alike are finding innovative ways to meet demand closer to users.

A Measured but Positive Outlook

Despite capital market challenges, supply chain constraints, and growing power demands, Shih’s conclusion was optimistic, grounded in data and real-world momentum.

“I’m more confident today than I was two years ago,” Shih said. “We’re not overbuilding, we’re building smarter, globally, and with a clearer sense of what’s next.”

The session ended with a strong message: while the sector must navigate its cycles carefully, the long-term trajectory remains firmly upward.


