Now and Going Nuclear: Powering the Next Generation of Data Centers

17 December 2025 at 16:00

Insights from ASG, Oklo Inc., Switch, and Equinix

Why Nuclear Energy is Back in the Data Center Conversation

At the infra/STRUCTURE Summit 2025, held October 15–16 at the Wynn Las Vegas, one of the most talked-about sessions was “Now and Going Nuclear.” The discussion explored how nuclear energy, long viewed as complex and controversial, is rapidly emerging as a viable solution for powering the data center industry’s next phase of growth.

Moderated by Daniel Golding, CTO of ASG, the panel featured Brian Gitt, Senior Vice President of Business Development at Oklo Inc.; Jason Hoffman, Chief Strategy Officer at Switch; and Philip Read, Senior Director of Product Management at Equinix. Together, they examined how technology, regulation, and market forces are aligning to make small modular reactors (SMRs) and nuclear-derived power a credible and necessary part of the digital infrastructure ecosystem.

A Generational Shift in Nuclear Perception

Daniel Golding opened the discussion by highlighting how dramatically attitudes toward nuclear energy have changed in recent years. “The political opposition has evaporated entirely in the past three to four years,” Golding observed. “What’s happened is a generational change. For younger generations who’ve grown up in a world shaped by climate change, nuclear risk seems modest compared to the risk of inaction.”

This generational shift, Golding noted, is paving the way for new conversations around nuclear deployment, not just as an energy option, but as an environmental imperative. The narrative has moved from “if” to “when,” setting the stage for nuclear integration into the world’s largest digital infrastructure operations.

Policy Momentum and Market Acceleration

Brian Gitt of Oklo described how a wave of regulatory and policy reforms has transformed the U.S. nuclear landscape in just the last year. “Since May, the federal government has released a series of executive orders removing barriers, unlocking fuel supply, and streamlining licensing,” Gitt said. “The NRC is now required to approve reactor applications within 18 months, and the DOE is opening federal lands for AI factories and power infrastructure.”

Gitt also announced that Oklo is leading construction on a $1.68 billion fuel recycling facility in Oak Ridge, Tennessee, the first of its kind in the U.S., designed to convert spent fuel into usable energy. “We’re taking what used to be seen as waste and turning it into 24/7 baseload power,” he explained. “We’ve moved from vision to execution, and the timeline from now to nuclear is about three years.”

Designing for a Nuclear-Powered Future

Jason Hoffman of Switch spoke to how data center design must evolve to integrate nuclear energy at the gigawatt scale. “When we talk about AI factories, we’re talking about facilities that are five times larger than what we’ve traditionally built,” Hoffman said. “These are sites measured in hundreds of acres, with power demand comparable to naval-scale energy systems. Nuclear makes that scale possible.”

He added that Switch and other major operators are actively exploring how to integrate self-generated nuclear power into future campuses. “It’s not just about access to power,” Hoffman said. “It’s about reliability, control, and sustainability. Nuclear enables all three.”

Philip Read of Equinix echoed this point from a customer perspective, emphasizing that clients want certainty. “Our customers want confidence in their power supply, growth strategy, and sustainability goals,” Read said. “They’re asking, ‘Do we need a different strategy for locations and energy sources?’ Nuclear provides that line of sight.”

Security, Scale, and Sustainability

The conversation also touched on key challenges. When asked what keeps him up at night, Hoffman was quick to answer: “Security posture.” He noted that as nuclear and data centers intersect, ensuring robust cybersecurity and operational safety will be critical.

Gitt added that misconceptions about nuclear waste remain one of the industry’s biggest hurdles. “We have enough stored fuel in the U.S. to power the country for generations,” Gitt said. “It’s not dangerous, it’s energy waiting to be unlocked. We’re sitting on the equivalent of five Saudi Arabias of energy, and we’re burying it instead of using it. That needs to change.”

Golding agreed, noting that for decades, the U.S. has stored waste in temporary pools, a model that is no longer scalable. The consensus: recycling and reusing fuel through modern SMRs is not only possible but essential.

Economic and Community Impact

Beyond technical feasibility, the panel highlighted the broader economic upside of nuclear development. Gitt shared that Oklo’s projects are already generating significant local economic benefits. “We just broke ground in Iowa, and the job creation has been incredible,” Gitt said. “This isn’t just energy innovation, it’s economic revitalization. Communities are competing to host these facilities because they bring skilled jobs, tax revenue, and long-term prosperity.”

Hoffman and Read both agreed that pairing nuclear generation with data center campuses could redefine industrial development in the U.S. “These are long-term, high-value assets,” Hoffman said. “They’re not speculative, they’re the backbone of America’s digital and economic future.”

From Renewable to Reliable: The Role of Baseload Power

Golding raised the question of whether hyperscalers are ready to embrace nuclear as part of their sustainability strategies. Gitt’s answer was unequivocal: “Every major hyperscaler now includes nuclear in their long-term power roadmap. It’s part of the equation for net-zero.”

Gitt noted that nuclear has the smallest materials footprint of any energy source, smaller even than wind or solar, making it one of the most resource-efficient options available. “If we want to keep the lights on and cut emissions, there’s really no alternative,” Gitt said. “The data center industry has realized that nuclear isn’t optional, it’s inevitable.”

From Vision to Reality

The panel made clear that the intersection of nuclear energy and data center infrastructure is no longer theoretical. Regulatory pathways are opening, commercial projects are underway, and the industry’s largest power consumers are preparing to integrate nuclear into their long-term sustainability and capacity strategies.

As Golding concluded, “This isn’t a thought experiment anymore. It’s happening. By the end of the decade, nuclear will be powering data centers, and helping our industry lead the global energy transition.”

Infra/STRUCTURE 2026: Save the Date

Want to tune in live, receive all presentations, and gain access to C-level executives, investors, and industry-leading research? Then save the date for infra/STRUCTURE 2026, set for October 7–8, 2026, at The Wynn Las Vegas. Pre-registration for the 2026 event is now open, and you can visit www.infrastructuresummit.io to learn more.

The post Now and Going Nuclear: Powering the Next Generation of Data Centers appeared first on Data Center POST.

Investment Perspectives: Navigating the Future of Digital Infrastructure

4 December 2025 at 16:00

Insights from RBC Capital Markets, Compass Datacenters, and TD Securities

Understanding the Investment Landscape in a New Era of AI

The infra/STRUCTURE Summit 2025, held October 15–16 at the Wynn Las Vegas, brought together the world’s leading voices in digital infrastructure to explore the industry’s rapid transformation. Among the standout sessions was Investment Perspectives, where experts discussed how artificial intelligence (AI), energy constraints, and capital strategy are reshaping investment decisions and the future of data center development.

Moderated by Jonathan Atkin, Managing Director at RBC Capital Markets, the panel featured Jonathan Schildkraut, Chief Investment Officer at Compass Datacenters, and Colby Synesael, Managing Director at TD Securities. Together, they provided clear insights into the trends influencing where, why, and how capital is being deployed in the infrastructure sector today.

The Shifting Demand Curve: How AI is Driving Data Center Growth

Jonathan Schildkraut opened the discussion by outlining the four primary workloads fueling infrastructure demand: AI training, AI inference, cloud, and social media. He described these workloads as the engines of growth for the sector, emphasizing that most are revenue-generating. “Three of those four buckets are cash registers,” Schildkraut said. “We’re really seeing those revenue-generating workloads accelerating.”

Colby Synesael added that the balance between AI training and inference is shifting quickly. “A year ago, roughly 75% of AI activity was training and 25% inference,” Synesael explained. “In five years, that ratio could reverse. A lot of inferencing will occur near where applications are used, which changes how we think about data center deployment.” Their remarks highlighted a clear message: AI continues to be the dominant force shaping infrastructure demand, but its evolution is redefining both scale and location.

Market Expansion and Power Constraints 

As Tier 1 data center markets face mounting limitations in available land and energy, both Schildkraut and Atkin noted the increasing strategic importance of Tier 2 and Tier 3 regions. Schildkraut cited examples such as Alabama, Georgia, and Texas, which are emerging as viable alternatives due to improved fiber connectivity and more favorable power economics.

Capital Strategy and Facility Adaptability: Investing for the Long Term

The conversation also delved into how investors are evaluating opportunities in an environment of high demand and rapid technological change. Schildkraut explained that access to capital today depends on two critical factors: tenant quality and facility adaptability. “Investors want to know that the tenant and the workload will be there for the long term,” Schildkraut said. “They also care deeply about whether the facility can evolve with future technologies.”

To illustrate this, Schildkraut described Compass Datacenters’ initiative to upgrade power densities, increasing capacity from 6–7 kilowatts per rack to hybrid systems capable of supporting up to 30 kilowatts. This investment is designed to ensure readiness for the next generation of high-performance computing and AI workloads. These types of forward-looking strategies are helping operators and investors manage both risk and opportunity in an increasingly complex market.

Globalization and Policy Influence 

When the conversation turned to global trends, Schildkraut predicted that AI infrastructure deployment will expand worldwide but at uneven rates. “Availability of power and land isn’t uniform,” he said. “Government incentives will play a critical role in determining which markets can scale.”

Synesael agreed, adding that regions lacking modern AI infrastructure could face growing disadvantages. “Over the next several years, not having this infrastructure in your country or region will become a major constraint on innovation,” Synesael said. Their perspectives reinforced that infrastructure development is no longer just a commercial priority; it is also a matter of national competitiveness.

A Market Redefined by Technology and Energy

The discussion revealed that the digital infrastructure market is entering a new phase defined by the convergence of AI-driven workloads, energy constraints, and strategic capital deployment. As inference workloads expand, Tier 2 and Tier 3 markets rise in importance, and investors prioritize long-term flexibility, the industry’s success will depend on adaptability and foresight. The session made it clear that data centers are no longer just real estate; they are foundational assets powering the next wave of global innovation.

How Artificial Intelligence Is Redefining the Future of Global Infrastructure

3 December 2025 at 16:00

At infra/STRUCTURE Summit 2025, industry leaders from Inflect, NTT and NEXTDC explored how AI is accelerating development timelines, reshaping deal structures, and redrawing the global data center map.

The infra/STRUCTURE Summit 2025, held at The Wynn Las Vegas from October 15–16, 2025, convened the brightest minds in digital infrastructure to explore the seismic shifts underway in the age of artificial intelligence. Among the most forward-looking sessions was “AI Impact on Global Market Expansion Patterns,” a discussion that unpacked how AI is transforming where and how data centers are developed, financed, and operated worldwide.

Moderated by Swapna Subramani, Research Director, IMEA, for Structure Research, the panel featured leading executives including Mike Nguyen, CEO, Inflect; Steve Lim, SVP, Marketing & GTM, NTT Global Data Centers; Craig Scroggie, CEO and Managing Director, NEXTDC. Together, they examined how the explosive demand for AI compute power is pushing developers to rethink long-held assumptions about geography, energy, and risk.

AI Is Rewriting the Rules of Global Expansion

For decades, site selection decisions revolved around a handful of core variables: power cost, connectivity, and proximity to major user populations. But in 2025, those rules are being rewritten by the unprecedented scale of AI workloads.

Regions once considered secondary are suddenly front-runners. Scroggie noted how saturation in markets like Singapore and Hong Kong has forced expansion across Thailand, Indonesia, Malaysia, and India, each now racing to deliver power, land, and permitting capacity fast enough to attract global hyperscalers.

“You can’t build large campuses in Singapore anymore,” Scroggie said. “But throughout Southeast Asia, we’re seeing rapid acceleration as operators balance scale, sustainability, and access to emerging population centers.”

The panelists agreed that energy constraints, not capital, are now the primary limiting factor. “The short term is about finding locations where power exists at scale,” explained Scroggie. “The longer-term challenge is developing new storage and generation models to make that power sustainable.”

Geopolitics and Sovereignty Are Shaping Investment

AI’s global reach has also brought geopolitics and national sovereignty to the forefront of infrastructure strategy.

“We’re living in more challenging times than ever before,” said Nguyen, referencing chip export restrictions and international trade interventions. “AI is no longer just a technological conversation, it’s a matter of national defense and economic competitiveness.”

He noted that ongoing trade restrictions with China are reshaping who gets access to advanced chips and where they can be deployed. “The combination of geopolitical and local legislative pressures determines the future of global trade management,” Nguyen said.

As countries strengthen data sovereignty and privacy laws, regional differentiation is intensifying. “Every geography has a different view,” Nguyen continued. “Some nations are creating frameworks to enable AI and cross-border data sharing, others are locking down their ecosystems entirely.”

Scroggie echoed this, adding that sovereignty-driven strategies are driving a surge in localized buildouts. “We’re seeing more countries push to ensure domestic control of digital assets,” he said. “That’s changing the structure of global supply chains and creating ripple effects that extend well beyond national borders.”

The Industry’s Race Against Time

The conversation turned toward construction velocity, a challenge every developer feels acutely.

“Are we building fast enough?” asked Subramani.

“Simply put, no,” said Scroggie. “We can’t keep up with demand. Traditional 12-to-24-month build cycles no longer align with AI’s acceleration curve. We have to find a way to build differently.”

The group discussed the need for new modular construction methods, accelerated permitting, and AI-assisted project management to meet scale and speed requirements.

Nguyen framed it within the broader context of industrial history. “We are standing at the dawn of the next industrial revolution,” he said. “Just as steam, electricity, and the internet reshaped economies, AI will redefine global competitiveness. The countries that can deliver sustainable, affordable power will lead.”

He pointed to the “Jevons Paradox” of AI infrastructure: the more intelligence we produce, the cheaper it becomes, and the more of it the world demands. “The hallmark of global competitiveness will be the unit cost of producing intelligence,” Nguyen explained. “That requires deep collaboration between developers, energy providers, and governments.”

Evolving Deal Structures Reflect a More Complex Market

The financial framework of data center development is also changing dramatically. Traditional “build-to-suit” models are giving way to more creative, multi-tiered partnerships as both hyperscalers and institutional investors seek flexibility and risk mitigation.

“There’s a diversity of players now entering the market, some with deep operational experience, others completely new to the space,” said Scroggie. “Everyone’s chasing the same megawatts, but their risk tolerance and credit profiles vary widely.”

Scroggie also described how education and transparency have become critical. “We’re constantly advising clients on what’s feasible and what’s not. Many are coming in with unrealistic expectations about speed, power, or pricing. It’s part of our job to bridge that gap.”

The consensus was clear: AI-driven demand has transformed data centers from real estate assets into strategic infrastructure platforms, with financial, political, and environmental implications far beyond the industry itself.

Looking Ahead: The Next Decade of AI-Driven Infrastructure

As the discussion drew to a close, the panelists reflected on the extraordinary pace of change. “AI is not replacing, it’s additive,” said Scroggie. “Every new workload, every new inference model adds demand. The scale we’re dealing with is unprecedented.”

In this new era, speed, sustainability, and sovereignty are the defining dimensions of competitiveness. The industry’s success will hinge on its ability to innovate faster than the challenges it faces, whether those are regulatory, environmental, or geopolitical.

“We’re building the highways of the digital era,” said Nguyen in closing. “And like every industrial revolution before it, those who solve the energy equation will lead the world.”

Hyperscale Data Center Procurement: Scaling Smarter in the Age of AI

1 December 2025 at 16:00

Insights from Structure Research, Cloudflare, Decillion, Groq, and Lambda

Why Hyperscale Procurement Matters Now

At the infra/STRUCTURE Summit 2025, held October 15–16 at the Wynn Las Vegas, the session on Hyperscale Data Center Procurement explored how hyperscalers, cloud platforms, and AI companies are redefining site selection, capacity planning, and power procurement.

With the explosion of artificial intelligence (AI) and high-performance workloads, the panel examined how data center operators are adapting to meet new demands for speed, density, and collaboration. The discussion brought together leading experts who sit at the intersection of technology, infrastructure, and strategy: Jabez Tan, Head of Research at Structure Research; Sarah Kurtz, Data Center Selection Manager at Cloudflare; Whitney Switzer, CEO of Decillion; Anik Nagpal, Principal Strategic Advisor for Global Data Center Development at Groq; and Ken Patchett, VP of Data Center Infrastructure at Lambda.

Together, they offered a grounded yet forward-looking view of how hyperscale infrastructure is evolving and why collective problem-solving across the ecosystem has never been more urgent.

Understanding the New Procurement Reality – From Megawatts to Gigawatts

Moderator Jabez Tan opened by noting how quickly the scale of data center procurement has transformed. Just a few years ago, hyperscale planning revolved around megawatts. Today, as Ken Patchett of Lambda explained, “We used to talk in megawatts; now we’re talking in hundreds of megawatts or even gigawatts. The world has changed.”

Patchett emphasized that this growth is not theoretical; vacancy rates are at record lows, and facilities are being leased before construction even begins. “Seventy-three percent of buildings being built in the U.S. today are pre-leased before completion,” he said. “In some cases, they’re 100 percent committed before a shovel hits the ground.”

This surge underscores both the opportunity and the strain on today’s hyperscale procurement models. The traditional development timelines, often five years from land acquisition to delivery, are being tested by the speed at which AI and GPU-driven workloads are scaling.

Site Selection and Power – Seeing Through the Noise

Whitney Switzer, CEO of Decillion, offered insights into the increasingly complex process of site selection, especially in an environment filled with speculation and limited power capacity. “There’s a lot of land and a lot of promises,” Switzer said, “but not all sites can actually deliver what hyperscalers need. The challenge is cutting through the noise to identify real, deliverable power and real infrastructure.”

Anik Nagpal from Groq added that power availability has become the defining factor in any site’s viability. “We’re facing long waiting lists with utilities,” Nagpal explained. “It’s not enough to have a site, you need documented substation agreements, confirmed transformer orders, and clear delivery dates.” Without that level of verification, even well-positioned properties can fall short of hyperscale timelines.

Switzer reinforced that the industry must move toward deeper collaboration between developers, power providers, and end users to accelerate readiness. “You have to build trust,” she said. “That’s what ensures creativity and alignment between the business and technical sides of a deal.”

Market Challenges and Evolving Partner Strategies

Sarah Kurtz of Cloudflare described a rapidly tightening capacity market, where competition for space and power is fierce. “Prices have moved dramatically,” Kurtz said. “We might go out for one megawatt and come back to find that the same capacity now costs four times as much.” Despite those pressures, Kurtz highlighted that the key is adaptability, knowing when to secure smaller, strategic sites that can deliver sooner rather than waiting years for larger campuses.

Ken Patchett echoed this sentiment, pointing out that the demand wave is forcing new forms of partnership. “We’re all asking, ‘Do you have space? Do you have power?’ Conversations that didn’t happen ten years ago now happen every day,” Patchett said. “We have to work together, utilities, operators, and AI companies, to actually build the infrastructure that matches the pace of technology.”

Nagpal added that power immediacy and transparency are now central to deal-making. “People want to believe the power’s there,” he said, “but you only know it when you see the agreements in writing. That’s the new due diligence.”

Designing for Density and Agility – Building for the Next Cycle

A recurring theme throughout the session was that data center design itself must evolve as hardware cycles shorten. Patchett underscored that density and adaptability are now fundamental requirements. “The buildings we designed 20 years ago won’t support what we’re running today,” Patchett said. “We’re moving from 50-kilowatt racks to 600-kilowatt racks, and we have to build in a way that can pivot every six to nine months as hardware changes.”

Patchett added that despite fears of overbuilding, the industry isn’t facing a bubble. “We’re still using what we built ten or twenty years ago,” he said. “This is about addition, not replacement. Our challenge is to keep up with demand, not question it.”

The panelists agreed that modular design, flexible financing, and shared innovation will define the next phase of data center evolution. As Switzer summarized, “It’s all about partnership, aligning resources and expertise to deliver creative solutions at scale.”

Collaboration as the New Competitive Edge

The session made clear that hyperscale procurement is no longer about simply buying power and land. It’s about integrating supply chains, synchronizing with utilities, and designing for continuous evolution. Across every perspective, developer, operator, and end user, the message was the same: collaboration is the only way to scale sustainably.

The leaders on stage shared a unified view that as AI reshapes data center demand, the industry’s success will depend not on who builds fastest, but on who builds smartest—with transparency, trust, and long-term partnership at the core.

Alternative Cloud Providers Redefine Scale, Sovereignty, and AI Performance

26 November 2025 at 16:00

At this year’s infra/STRUCTURE Summit 2025, held at the Wynn Las Vegas, one of the most forward-looking conversations came from the session “From Cloud to Edge to AI Inferencing.” Moderated by Philbert Shih, Managing Director at Structure Research, the discussion brought together a diverse panel of innovators shaping the future of cloud and AI infrastructure: Kevin Cochrane, Chief Marketing Officer at Vultr; Jeffrey Gregor, General Manager at OVHcloud; and Darrick Horton, CEO at TensorWave.

Together, they explored the emergence of new platforms bridging the gap between hyperscale cloud providers and the next wave of AI-driven, distributed workloads.

The Rise of Alternatives: Choice Beyond the Hyperscalers

Philbert Shih opened the session by emphasizing the growing diversity in the cloud ecosystem, from legacy hyperscalers to specialized, regionally focused providers. The conversation quickly turned to how these companies are filling critical gaps in the market as enterprises look for more flexible, sovereign, and performance-tuned infrastructure for AI workloads.

Cochrane shared insights from a recent survey of over 2,000 CIOs, revealing a striking shift: while just a few years ago nearly all enterprises defaulted to hyperscalers for AI development, only 18% plan to rely on them exclusively today. “We’re witnessing a dramatic change,” Cochrane said. “Organizations are seeking new partners who can deliver performance and expertise without the lock-in or limitations of traditional cloud models.”

Data Sovereignty and Global Reach

Data sovereignty remains a key differentiator, particularly in Europe. “Being European-born gives us a unique advantage,” Gregor noted. “Our customers care deeply about where their data resides, and we’ve built our infrastructure to reflect those values.”

He also highlighted OVHcloud’s focus on sustainability and self-sufficiency, from designing and operating its own servers to pioneering water-cooling technologies across its data centers. “Our mission is to bring the power of the cloud to everyone,” Gregor said. “From startups to the largest public institutions, we’re enabling a wider range of customers to build, train, and deploy AI workloads responsibly.”

AI Infrastructure at Scale

Horton described how next-generation cloud providers are building infrastructure purpose-built for AI, especially large-scale training and inferencing workloads. “We design for the most demanding use cases, foundational model training, and that requires reliability, flexibility, and power optimization at the cluster scale.”

Horton noted that customers are increasingly choosing data center locations based on power availability and sustainability, underscoring how energy strategy is becoming as critical as network performance. TensorWave’s approach, Horton added, is to make that scale accessible without the hyperscale overhead.

Democratizing Access to AI Compute

Across the panel, a common theme emerged: accessibility. Whether through Vultr’s push to simplify AI infrastructure deployment via API-based services, OVHcloud’s distributed “local zone” strategy, or TensorWave’s focus on purpose-built GPU clusters, each company is working to make advanced compute resources more open and flexible for developers, enterprises, and AI innovators.

These alternative cloud providers are not just filling gaps — they’re redefining what cloud infrastructure can look like in an AI-driven era. From sovereign data control to decentralized AI processing, the cloud is evolving into a more diverse, resilient, and performance-oriented ecosystem.

Looking Ahead

As AI reshapes industries, the demand for specialized infrastructure continues to accelerate. Sessions like this one underscored how innovation is no longer confined to the hyperscalers. It’s emerging from agile providers who combine scale with locality, sustainability, and purpose-built design.

Managed Infrastructure at a Crossroads: The Power-First Revolution

25 November 2025 at 19:30

How Data Center Developers Are Navigating the Clash Between Rapid AI Demand and Decades-Old Regulatory Frameworks

The data center industry has undergone a seismic shift that would have seemed unthinkable just a few years ago. Historically, operators selected prime real estate locations and then brought power to their facilities. Today, that equation has completely reversed: developers are taking their data centers to wherever power exists, fundamentally reshaping the geographic and strategic landscape of digital infrastructure.

At the infra/STRUCTURE Summit 2025, held October 15–16 at The Wynn Las Vegas, a distinguished panel gathered to explore this transformation and its profound implications. The session “Managed Infrastructure at a Crossroads” brought together experts from across the infrastructure ecosystem to discuss the challenges, opportunities, and regulatory complexities of the power-first era.

Moderated by Hadassa Lutz, Senior Consulting Analyst at Structure Research, the panel featured Gene Alessandrini, SVP of Energy & Location Strategy at CyrusOne; Allison Clements, Partner at ASG (Aspen Strategy Group) and former FERC Commissioner; and David Dorman, Director of Commercial Operations at APR Energy. Together, they examined how the industry is navigating unprecedented demand while working within, and sometimes around, regulatory frameworks that were never designed for this moment.

The 12-Month Transformation: From “Tier One Markets” to “Just Finding Power”

Alessandrini opened the discussion with a striking timeline that captured the industry’s rapid evolution. When he joined CyrusOne just 12 months prior, the focus was squarely on tier-one markets: Northern Virginia, Columbus, and Chicago, traditional data center hubs with established infrastructure and connectivity.

“Six months into the job, a lot of new secondary markets started popping up,” Alessandrini explained. “Twelve months into the job, it’s like ‘just find power, and then we’ll figure it out.’”

This shift represents more than a change in strategy; it’s a fundamental reimagining of the industry’s priorities. “We really understand that the industry is constrained in the traditional markets,” Alessandrini continued. “Even the new secondary markets are getting tapped out way too quickly. We’ve gone from an industry that focused really on land acquisition with utility interconnection to now, 12 months later, where any source of power is on the table and location requirements are wide open.”

The traditional site selection checklist put land first and power second, followed by water, workforce, and tax incentives. That checklist has been completely inverted: power is now number one, and land is secondary. “Geography is now wide open,” Alessandrini said. “Your decisions on where you’re going to build that next data center really diverge from all the patterns you followed before.”

The trade-off, he noted, is that scale becomes critical. “If you can get scale at that location, you somewhat offset the ecosystem of challenges when you start building data centers in farther-away locations.”

Bridging Solutions: APR Energy’s Rapid Deployment Model

Dorman of APR Energy provided insight into how power generation providers are responding to this urgent demand with innovative, fast-deployment solutions.

“We focus right now on bridging solutions, either a component solution that we can potentially build for a utility or standalone generation,” Dorman explained. “Our lead time from contract to start-up is about 30 to 90 days after receiving permits and everything else. We’re very quick, and what we bring is kind of a fast-to-market approach.”

APR Energy, with over 20 years in the industry, has developed a proven playbook for rapid deployment. Its equipment is scalable, delivered in blocks of 50 megawatts, 150 megawatts, or 300 megawatts depending on site requirements. “The larger the deployment, the more equipment you have, potentially the cheaper the price point,” Dorman noted.

However, even with this rapid deployment capability, there are critical prerequisites. “The assumptions are: you have permits, you have a site that’s a viable site, and you have fuel supply, typically natural gas, connected close to the site,” Dorman said. “If not, there are providers out there that can bridge to a potential connection if there is some development needed.”

The role APR Energy plays is essential in the current environment: providing a bridging solution that fills the gap between immediate demand and the 24-60 months it typically takes for utility-scale generation or permanent solutions to come online.

The Regulatory Reality: A Clash of Cultures and Timelines

Allison Clements, bringing her perspective as a former Federal Energy Regulatory Commission (FERC) commissioner, offered a sobering assessment of the regulatory challenges facing the industry.

“What is fascinating to me is the clash of cultures between the regulated utility industry and the data center development industries,” Clements said. “Data center developers have a real estate background, they’re tech players coming in, they’re used to operating in actual markets with real supply and consumer choice. In the energy world, it just doesn’t work that way.”

Clements described the fundamental mismatch: “You’ve got a regulatory machine and incumbent incentives, and nothing takes less than 30 or 60 or 90 days. There’s this real lack of appreciation and understanding: Why do you have to move so quickly? Why are you moving so slowly?”

Clements emphasized that when data center developers enter the utility space, they’re stepping into one of the most heavily regulated industries in existence. “You might have one, two, or three regulators who have a hand in decisions when it comes to your interconnection, permitting, and water use. The market is trying to move so quickly, and these regulatory frameworks are just trying to catch up.”

Clements’ message was clear but empathetic: “These utility sectors aren’t dumb. They’re just built into a giant bureaucracy that wasn’t designed to enable the goals that we now have today. That’s true for state regulatory commissions and the Federal Energy Regulatory Commission. We just weren’t set up for this.”

Alessandrini echoed this disconnect from the developer perspective: “Twelve months before I started, I never thought utilities were as separate from our world as you just described. But after living it for 12 months, I realized that our industry is moving much faster than the regulatory framework: utility markets, interconnection, everything is at a slower pace.”

The challenge, Alessandrini explained, is finding ways to bridge this reality gap. “We’re having lots of conversations, trying to bridge the understanding of what our facilities are, what their businesses are, and why we’re unfortunately not moving at the same pace. We’re working together to find ways that relieve pressure on their framework so they can be more comfortable making decisions that allow us to move faster.”

The Utility Incentive Problem: Capital Investment vs. Operational Efficiency

A critical issue the panel addressed was the fundamental structure of utility incentives, a system that may be working against the rapid expansion the industry needs.

“What’s often missed is the perception that utilities are incentivized incorrectly,” Lutz noted, asking the panelists to expand on whether utilities are rewarded more for capital spending than for optimization and efficiency.

Clements confirmed this concern is rooted in reality. “You have a regulatory system where the incumbent utilities have been given the franchise right to be a monopoly, and they make money by one: volumetric sales, so the more electrons they sell, the more money they make, and two: by capital investment, steel in the ground, generation or grid.”

This creates a structural problem: “A lot of times, efficient operations and opportunities like buying behind-the-meter generation don’t align with the utility business model.”

Dorman, drawing on his 13 years as a utility executive before moving to generation, offered a particularly insightful observation: “It’s funny we’re sitting here saying utilities want to invest in their rate base because that becomes their revenue stream. Yet now the behavior (I know it’s not their intention, but this is the way it appears) is that you don’t want the load anymore, so your rate base doesn’t grow. The new tariff structures appear to disincentivize load growth rather than incentivize it.”

This paradox sits at the heart of the industry’s current challenges: utilities structured to profit from capital investment and volume sales are implementing tariffs that may discourage the very load growth that should benefit them.

Cost Allocation: The Political Third Rail

Clements addressed one of the most politically sensitive issues facing the industry: who pays for what when data centers connect to the grid.

“Data center markets have come onto the system at rapid-fire pace in a moment where electricity prices were already rising,” Clements explained. “The grid has been underinvested in for a long time.”

She outlined three types of costs:

  1. Direct interconnection costs: the physical connection to the grid
  2. Indirect grid impact costs: taking up space on the grid that might impact economic constraints elsewhere
  3. The cost of the electron itself: the actual generation cost

“There’s a lot of discussion around cost allocation,” Clements said. “Data centers come in saying ‘we want to pay our fair share,’ and they do pay for direct costs like substations or switching circuits. But what they don’t pay for is residential customer increases in electricity prices or broader transmission development.”

Clements was careful to note this isn’t necessarily unfair; it’s how supply-and-demand markets are supposed to work. “You have new customers, new supply should come in, and it should all work out. The opportunity for data centers to lower costs is tremendous.”

The problem, she explained, is timing. “The underlying regulatory frameworks haven’t kept up. As a result, you see demand increasing and supply tightening because we haven’t had time for new supply to come in. These rising costs have been in some part because of data centers, but in large part were coming regardless.”

The political pressure is mounting. “Data center opposition is growing up in communities around the country,” Clements warned. “These are real people with cost concerns, and we need to take it seriously.”

The Scale of the Challenge: 128 Gigawatts by 2029

To put the industry’s challenge in perspective, Clements shared a sobering statistic: “The lowest forecast suggests we’ll have 128 new gigawatts of power demand for data centers. That’s the amount of power it takes to power the entire mid-Atlantic region, that includes Philadelphia, Washington D.C., and Chicago.”

Clements was blunt about the timeline constraints: “If you want power by Q4 2029 and you start today, you can’t build a combined-cycle gas plant in that time. Maybe, if you’re lucky enough to secure modular equipment and you start procurement today, in 18 months you can have a solution.”

This reality is driving the search for alternatives and interim solutions, everything from bridging generation to demand flexibility to grid optimization technologies.
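The scale involved can be made concrete with quick arithmetic against the bridging block sizes Dorman cited; the sketch below is a back-of-envelope illustration only, not a deployment plan.

```python
# Back-of-envelope only: how many fast-deploy bridging blocks would it
# take to cover the forecast 128 GW of new data center demand?
# The 50/150/300 MW block sizes are the figures Dorman cited; everything
# else here is an illustrative assumption.

DEMAND_GW = 128
DEMAND_MW = DEMAND_GW * 1_000  # 128,000 MW

for block_mw in (50, 150, 300):
    blocks_needed = DEMAND_MW / block_mw
    print(f"{block_mw:>3} MW blocks: {blocks_needed:,.0f} deployments")

# Even at the largest block size, covering the full forecast would take
# hundreds of separate deployments: bridging generation fills gaps at
# individual sites, but cannot substitute for utility-scale supply.
```

The point of the exercise is proportion: bridging solutions buy time at specific sites while the 24-to-60-month utility pipeline catches up.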

Being a “Good Citizen”: Beyond Just Big Power Solutions

When asked what it means to be a “good citizen” in this environment, the panelists emphasized the need for data center operators to look beyond simple power procurement.

Clements urged the industry to embrace innovation: “The hyperscalers want these innovative solutions. Think about opportunities beyond just the big power solutions. There’s hardware and software that helps run the grid more dynamically. We still run our grid like it’s in the 1980s era, no joke in the US.”

Clements also highlighted demand flexibility as a critical tool: “Data centers actually committing to some sort of proactive curtailment throughout the year. That’s hard, it might involve going offline for periods. There are trade-offs in each of these approaches, and each one introduces different risks into your transaction that may or may not be desirable.”

The panel also addressed community impact. “There’s a lot of opportunity for smart developers to give back to the community,” Clements said. “Fire stations, education, public services, these investments matter. We need to help communities understand which part of their electricity bills data centers are responsible for and what they’re doing to mitigate that impact.”

The New Geographic Reality: Data Centers in Unexpected Places

Alessandrini painted a picture of the industry’s evolving geography. “We’re possibly creating new markets for data centers because we’re taking data centers to places you’ve never seen before, the outskirts of Texas, Alabama, Wyoming, and all these other areas.”

This geographic expansion isn’t without challenges. These regions often lack the established ecosystems, workforce, connectivity, and supply chains that traditional markets offer. But when balanced against the availability of power at scale, the trade-offs become acceptable.

“The reality is the industry will continue to broaden,” Alessandrini said. “Power solutions will come from locations with access to gas supply that historically weren’t considered data center markets.”

The 24-to-60-Month Gap: A Bridge Too Far?

Perhaps the session’s most critical tension was captured in Alessandrini’s assessment of timeline misalignment.

“What I realized 12 months in is that instead of being more comfortable that the gap was closing, the gap is actually widening,” Alessandrini said. “We have a 24-month timeline to get facilities built and operational. The regulatory and utility side operates on a 36-to-60-month timeline.”

He was emphatic about the industry’s position: “We’re going to be there in 24 months, and you just tell us when you can join the party, whether that’s 60 months or 72 months. But guess what? We’re going to be there in 24 months.”

The question facing the industry is how to bridge this gap. “We’re going to build power plants, we’re going to build bridging solutions, we’re going to build all these things to allow data centers to get built based on the velocity of our industry,” Alessandrini said.

Dorman agreed: “If we can solve the problems as an industry and bring that 60 months down to 36 months, we still have this 24-month target that we just can’t let go of. We’re going to keep building.”

Looking Ahead: Nuclear, SMRs, and Long-Term Solutions

While the session focused heavily on near-term challenges and natural gas solutions, the panelists acknowledged that longer-term answers may include nuclear power and Small Modular Reactors (SMRs).

“The new SMRs are something which can come to market too,” Alessandrini noted. “We’re trying to wrap our heads around the new reality of data centers today and possibly creating new markets, including with emerging nuclear technologies.”

However, the timeline for commercialized SMR technology remains uncertain, making bridging solutions and interim approaches all the more critical.

Key Takeaways: Navigating the Power-First Era

The panel’s discussion revealed several critical insights for the data center industry:

  1. Power Has Become the Primary Site Selection Criterion: The traditional real estate-first approach is dead. Geography is now determined by power availability at scale, fundamentally reshaping the data center map.
  2. The Regulatory-Developer Timeline Gap Is Widening: Developers operate on 24-month cycles; regulators and utilities on 36-60-month cycles. This gap isn’t closing, it’s growing, forcing creative bridging solutions.
  3. Utility Incentive Structures Are Misaligned: Current regulatory frameworks reward utilities for capital investment and volumetric sales, which may not align with the rapid, efficient expansion the industry needs.
  4. Cost Allocation Is a Political Powder Keg: As residential electricity bills rise and data center development accelerates, community opposition is growing. The industry must proactively address cost concerns and community impact.
  5. Bridging Solutions Are Essential: With demand far outpacing utility-scale generation timelines, fast-deployment bridging solutions from providers like APR Energy are critical to keeping projects on track.
  6. Being a Good Citizen Requires More Than Paying Bills: Data center operators must embrace demand flexibility, support community initiatives, and invest in grid optimization technologies, not just consume power.
  7. The Scale Is Unprecedented: Meeting 128 gigawatts of new demand by 2029 will require every tool in the toolbox, traditional generation, bridging solutions, demand management, grid optimization, and potentially nuclear.
  8. Secondary and Tertiary Markets Are the New Frontier: Texas, Alabama, Wyoming, and other historically non-traditional data center locations are becoming viable, even preferred, due to power availability.

For operators, investors, policymakers, and community stakeholders, the message is clear: the data center industry is at an inflection point. The power-first revolution isn’t a temporary adjustment, it’s the new normal. Success will require unprecedented collaboration between developers, utilities, regulators, and communities to bridge the gap between digital infrastructure’s breakneck pace and the energy sector’s deliberate timelines.

As Alessandrini concluded: “This is a dynamic industry to be in. We’re going to keep building, we’re going to keep finding solutions, because the demand isn’t going away, it’s only accelerating.”


The post Managed Infrastructure at a Crossroads: The Power-First Revolution appeared first on Data Center POST.

The Economics of AI Infrastructure: Separating Hype from Reality

25 November 2025 at 16:00

Industry Analysts Examine Revenue Models, Investment Rationale, and Growth Projections Shaping the Future of Cloud Infrastructure

At the infra/STRUCTURE Summit 2025, held October 15-16 at The Wynn Las Vegas, a distinguished panel of industry analysts gathered to tackle one of the most pressing questions facing the digital infrastructure sector: Is the explosive growth in AI-driven cloud infrastructure sustainable, or are we witnessing an investment bubble destined to burst?

The analysts’ concluding session brought together the Structure Research team to examine the economic fundamentals underpinning today’s unprecedented capital expenditure in AI infrastructure. Moderated by Philbert Shih, Managing Director of Structure Research, the panel featured Jabez Tan, Head of Research; Sacha Kavanagh, Research Director, EMEA; Swapna Subramani, Research Director, IMEA; and Ainsley Woods, Research Director. Together, these experts have been tracking hyperscale investments, cloud infrastructure evolution, and AI revenue generation patterns across global markets.

The Double-Counting Concern: Are AI Revenues Real?

One of the session’s most critical discussions centered on a concern that has been circulating throughout the industry: whether hyperscalers are truly generating new revenue from AI, or simply recycling existing workloads through new pricing models.

“There’s this concern that if off-takers are recycling it or reselling it, and not making money off of that compute, then that’s when I started looking around at the structures,” one analyst explained. “For now, it seems like there is a lot of healthy revenue from very basic off-takers.”

The panel pointed to concrete examples of AI-driven revenue growth. They highlighted Cursor, a development tool that now derives roughly 30% of its revenue through AI capabilities. “If LLMs get to the level of a senior software engineer at Google, just by using all these papers and other resources, there’s real value being created,” the analyst noted.

The consensus: as long as overall net new revenue tied to AI can be measured, and given that we’re still in the very initial stages, the growth is legitimate. “It’s all about recognizing that new revenue,” the analyst emphasized.

Existential Investment: Why Hyperscalers Are Spending “Stupid Amounts of Capital”

The discussion took a fascinating turn when panelists examined the psychology driving hyperscale investment decisions. One analyst posed a revealing question to frame the issue: “If I asked the audience today, how many of you can say with 100% certainty that your jobs will not be displaced by AI, most of us would not be able to say that with 100% certainty.”

This uncertainty, the panel argued, is exactly what’s driving hyperscaler behavior.

“That’s exactly how the hyperscalers feel as well, and that’s why they’re investing stupid amounts of capital because that’s an existential threat to their leadership,” the analyst explained. “They’re not really investing their capex in a purely rational economic framework. They’re investing in it because they don’t want to be the next Cisco.”

This perspective reframes the massive capital expenditure not as irrational exuberance, but as strategic survival. The hyperscalers remember the cautionary tales of technology giants that failed to adapt to paradigm shifts, and they’re determined not to repeat those mistakes.

Cloud as a Delivery Vehicle: Learning from Historical Precedent

To assess the sustainability of current AI infrastructure growth, the panel drew parallels to previous technology transitions. Specifically, Microsoft’s shift from licensing Windows Server to delivering it as a cloud infrastructure service on Azure.

“I view the real clouds that are taking GPUs from Nvidia as a delivery vehicle, a service provider for cloud infrastructure not unlike what Microsoft did with Windows Server,” one analyst explained. “Instead of selling licenses and having customers install software in back offices, they simply delivered it as a cloud infrastructure service off the Azure platform.”

This historical comparison provided the panel with confidence in the current trajectory. However, they acknowledged a critical difference: velocity.

“The Windows Server transition happened over the course of five to ten years and it was slow-moving,” the analyst noted. “What’s happening now moves so fast. That basic velocity at which it happens gives us optimism, but it also makes it harder to predict.”

Projecting Five Years Out: Methodology and Data Points

When asked about revenue projections five years into the future and the data points supporting such tremendous growth, the panel outlined their analytical approach.

“First of all, I’ll say it once again: it’s very difficult to see what’s going to happen,” one analyst acknowledged candidly. “But the methodology is grounded in how we view cloud infrastructure growth historically.”

The team’s approach involves:

  1. Historical Evidence Analysis: Examining how first-generation cloud and early hyperscale infrastructure evolved, then comparing that to current hyperscale growth patterns.
  2. Phase-Based Growth Modeling: Dividing growth into distinct phases to understand acceleration patterns and inflection points.
  3. Fundamental Technology Comparison: Recognizing that “GPU clouds are the same thing, right? Servers with chips and storage,” and building projections on these technological fundamentals.

“When something has no historical precedent, the best way to understand it is to look at the closest analog,” the analyst explained. “That’s how we did it, we built on historical patterns and then tried to say, ‘Okay, this is going to be bigger and faster,’ but it’s based on actual precedent.”
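The phase-based modeling the panel describes can be sketched as a toy projection: apply a different growth rate in each phase of the adoption curve. The starting base and all rates below are invented for illustration and are not Structure Research figures.

```python
# Toy phase-based growth projection. Each phase is (years, annual rate);
# the values here are hypothetical illustration inputs, not real data.

def project(base, phases):
    """Compound `base` forward through a list of (years, rate) phases,
    returning the year-by-year trajectory."""
    value = base
    trajectory = [value]
    for years, rate in phases:
        for _ in range(years):
            value *= (1 + rate)
            trajectory.append(value)
    return trajectory

# Hypothetical curve: 2 years of hyper-growth, 2 of acceleration,
# then 1 year of maturing growth.
traj = project(base=100.0, phases=[(2, 0.60), (2, 0.35), (1, 0.20)])
for year, value in enumerate(traj):
    print(f"year {year}: {value:,.1f}")
```

Swapping in different phase lengths and rates shows how sensitive a five-year projection is to assumptions about when each phase ends, which is exactly the uncertainty the analysts flag.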

Key Takeaways: Why This Matters for the Industry

The analyst panel’s conclusions carry significant implications for stakeholders across the digital infrastructure ecosystem:

  1. AI Revenue is Real: Despite concerns about double-counting, evidence suggests genuine net new revenue generation from AI workloads, with companies like Cursor demonstrating meaningful AI-driven revenue streams.
  2. Investment is Strategic, Not Irrational: Hyperscaler capital expenditure, while massive, reflects existential competitive dynamics rather than speculative excess. Companies are investing to avoid obsolescence.
  3. Historical Models Provide Guidance: While the current AI infrastructure buildout is unprecedented in scale and speed, previous cloud transitions offer methodological frameworks for understanding and projecting growth.
  4. Velocity Creates Uncertainty: The rapid pace of change makes prediction challenging, but it also creates opportunities for those who can move quickly and adapt.
  5. Fundamentals Still Matter: Despite the transformative nature of AI, the underlying infrastructure still consists of servers, chips, and storage, grounding analysis in tangible technological realities.

For infrastructure operators, investors, and technology providers, these insights suggest that while caution is warranted given the pace of change, the fundamental economics of AI infrastructure appear sound. The key will be distinguishing between companies delivering genuine value and those merely riding the hype cycle.


The post The Economics of AI Infrastructure: Separating Hype from Reality appeared first on Data Center POST.

Pre-Development and Evolving Models: The Next Frontier in Digital Infrastructure

20 November 2025 at 16:00

How master planning, power strategy, and early-stage development are reshaping the data center industry

At the infra/STRUCTURE Summit 2025, held October 15–16 at The Wynn Las Vegas, one of the event’s most forward-looking discussions tackled the changing dynamics of the pre-development phase in digital infrastructure.

Moderated by Philbert Shih, Managing Director of Structure Research, the session, “Pre-Development and Evolving Models”, explored how early-stage development has evolved into a strategic discipline of its own. Joining Shih on stage were Nat Sahlstrom, Chief Energy Officer at Tract; Brandon Amber, Chief Strategy Officer and Co-Founder of Doma Infrastructure Group; and Mark McComiskey, CEO of AVAIO Digital Partners.

Together, these leaders shared how they are navigating challenges around power access, land readiness, and entitlement complexity while shaping new models designed to meet the unprecedented pace of global demand.

Creating a Market Segment Out of Pre-Development

Shih opened the session by framing the conversation around an emerging market reality: pre-development is no longer a supporting activity, it’s becoming its own business model.

“From a research perspective, pre-development has grown into a market segment in and of itself,” Shih noted. “It’s where timelines, capital, and innovation intersect, and where future capacity is truly created.”

McComiskey agreed, explaining that his firm identified early on that the long lead times required to secure land, entitlements, and power were creating bottlenecks that could stall digital growth.

“We saw that power and transmission constraints were going to slow down the digital revolution,” McComiskey said. “Our solution was to think like a utility, secure the land and power first, so when customers are ready, they can move immediately.”

By acquiring and master-planning large tracts of land with utilities and infrastructure pre-engineered, AVAIO Digital Partners offers “ready-to-build” campuses at scale, helping hyperscalers and operators bypass years of red tape.

The Master Plan Advantage

Sahlstrom emphasized that master planning goes beyond speed to market. He believes it’s about resilience, predictability, and community impact.

“Even with strong utility partnerships, unexpected constraints can always emerge,” Sahlstrom said. “By approaching development as a 10-year master plan, you reduce risk and create predictability, not just for investors, but for the communities where we build.”

Sahlstrom pointed out that large-scale digital campuses are now being designed for hundreds of megawatts of capacity. These campuses integrate everything from power distribution and water systems to long-term sustainability planning, positioning them as both critical infrastructure and engines of economic development.

Finding Value in Underserved Markets

Amber offered a global perspective on how pre-development strategies are unlocking value in underserved regions.

“We’re seeing similar challenges around power and land across the Asia-Pacific and U.S. markets,” Amber explained. “Whether it’s Australia, Malaysia, or emerging economies like Thailand, success starts with solving those pre-development problems early.”

Amber noted that the industry’s traditional focus on Tier 1 markets is expanding outward. Companies like Doma are targeting secondary cities and edge regions, where power can be secured faster and closer to end users, a strategy that’s increasingly critical for AI workloads and latency-sensitive applications.

Filtering the Noise: Quality Over Quantity

As the discussion turned toward the market landscape, the panelists agreed that one of the biggest challenges today is information overload. With speculative projects and site proposals flooding the industry, separating viable opportunities from noise has become a strategic necessity.

“Teams are getting thousands of site proposals every month,” Sahlstrom said. “The real challenge is filtering for quality and identifying the projects that are actually entitled, powered, and deliverable.”

This growing “signal-to-noise” problem underscores the need for deeper collaboration between developers, utilities, and hyperscalers.

The Human Factor: Talent and Expertise

All panelists pointed to talent as another defining challenge in pre-development. With global expansion accelerating, competition for specialized skills in engineering, construction, and power management is intensifying.

“We’re seeing rotation across all levels. Hyperscalers hiring from operators, operators hiring from utilities,” Amber said. “Everyone’s competing for the same expertise, and that’s shaping how partnerships are formed.”

Rather than viewing the talent gap as a limitation, the panel saw it as an opportunity for specialization and partnership, allowing expert teams to focus on the most critical aspects of the pre-development lifecycle.

Building the Foundation for What’s Next

In closing, Shih summarized the session’s central message: pre-development has become the defining stage of digital infrastructure creation.

“This part of the industry used to be invisible,” he said. “Now, it’s where the most strategic value is being created, and where innovation will determine who leads the next wave of digital growth.”

By anticipating constraints, planning for resilience, and aligning with utilities and municipalities early, these leaders are laying the foundation for the next decade of cloud, AI, and edge infrastructure.


The post Pre-Development and Evolving Models: The Next Frontier in Digital Infrastructure appeared first on Data Center POST.

Measured Optimism: Balancing Growth and Realism in the Global Data Center Market

18 November 2025 at 20:00

Understanding What’s Next for Digital Infrastructure

At this year’s infra/STRUCTURE Summit 2025, held at the Wynn Las Vegas, industry leaders came together to unpack the state of digital infrastructure in an era defined by AI-driven demand and hyperscale expansion.

One of the standout sessions of the event was “Measured Optimism,” led by Philbert Shih, Managing Director of Structure Research. Known for his data-first insights and global perspective, Shih provided an in-depth look at where the data center market stands today, and where it’s heading next.

His central question set the tone: Are we in a period of sustainable growth, or are we overbuilding?

A Balanced View: Bullish but Realistic

Shih began by acknowledging the debate between the “bulls” and “bears” in digital infrastructure, those who see unbounded opportunity and those who warn of a potential correction.

While recognizing some speculative trends, such as “fake data centers” and build pauses in certain regions, Shih urged attendees to take a longer view.

“There’s a lot of interest in the space, a lot of people with assets to develop,” Shih said. “But the fundamentals remain strong. We’ve seen time and again that this sector has the ability to absorb and grow through cycles.”

Shih drew comparisons to previous market phases, from the dot-com era to the rise of cloud computing, suggesting that what we’re seeing today is a natural evolution, not a bubble.

What the Data Shows

Shih supported his analysis with Structure Research’s latest findings:

  • Demand Continues to Outpace Supply. Hyperscalers and AI workloads are driving record demand. “We consistently see management teams reporting more demand than they can support,” Shih shared.
  • AI is an Accelerant, Not a Disruption. Shih explained that Meta’s recent build pause was less about demand softening and more about re-architecting for AI infrastructure. “AI is reshaping how capacity is planned and deployed,” he said.
  • Global Growth Momentum. While North America remains the largest market, growth across Europe and Asia is accelerating. Chinese and regional cloud providers are increasingly driving new development around the world.
  • Healthy Cycles, Not Cracks. Shih described the current slowdown in some areas as part of the natural “build–pause–absorb” cycle that defines infrastructure development. “Infrastructure doesn’t grow in a straight line,” he noted.

Collaboration Over Competition

A recurring theme throughout Shih’s presentation was partnership. The idea that hyperscalers might replace colocation providers with self-built facilities has largely given way to collaboration.

“There’s more cooperation between hyperscalers and colocation providers than ever before,” Shih said. “These partnerships are becoming essential to meeting global demand efficiently and sustainably.”

Shih also highlighted opportunities in pre-development and edge-scale projects, where new entrants and established providers alike are finding innovative ways to meet demand closer to users.

A Measured but Positive Outlook

Despite capital market challenges, supply chain constraints, and growing power demands, Shih’s conclusion was optimistic, grounded in data and real-world momentum.

“I’m more confident today than I was two years ago,” Shih said. “We’re not overbuilding, we’re building smarter, globally, and with a clearer sense of what’s next.”

The session ended with a strong message: while the sector must navigate its cycles carefully, the long-term trajectory remains firmly upward.

The post Measured Optimism: Balancing Growth and Realism in the Global Data Center Market appeared first on Data Center POST.
