Why Sovereign Hybrid Multi-Cloud is the Future of Cloud in Europe

When people talk about cloud computing, the conversation almost always drifts toward the hyperscalers. AWS, Azure, and Google Cloud have shaped what we consider a “cloud” today. They offer seemingly endless catalogs of services, APIs for everything, and a global footprint. So why does Nutanix call its Nutanix Cloud Platform (NCP) a private cloud, even though its catalog of IaaS and PaaS services is far more limited?

To answer that, it makes sense to go back to the roots. NIST’s SP 800-145 definition of cloud computing is still the most relevant one. According to it, five essential characteristics make something a cloud: on-demand self-service, broad network access, resource pooling, rapid elasticity, and measured service. NIST then defines four deployment models: public, private, community, and hybrid.

If you look at NCP through that lens, it ticks the boxes. It delivers on-demand infrastructure through Prism and APIs, it abstracts and pools compute and storage across nodes, it scales out quickly, and it gives you metering and reporting on consumption. A private cloud is about the deployment model and the operating characteristics, not about the length of the service catalog. And that’s why NCP rightfully positions itself as a private cloud platform.
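
To make that checklist tangible, here is a minimal Python sketch that scores a platform against the five NIST characteristics. The characteristic names come from SP 800-145; the evaluation of NCP is simply an illustration of the argument above, not an official assessment.

```python
# Minimal sketch: checking a platform against NIST SP 800-145's
# five essential cloud characteristics. The scoring below is an
# illustrative assumption based on this article, not an official test.

NIST_ESSENTIAL_CHARACTERISTICS = [
    "on-demand self-service",
    "broad network access",
    "resource pooling",
    "rapid elasticity",
    "measured service",
]

def is_cloud(platform: dict[str, bool]) -> bool:
    """A deployment qualifies as a cloud if it exhibits all five characteristics."""
    return all(platform.get(c, False) for c in NIST_ESSENTIAL_CHARACTERISTICS)

# Hypothetical evaluation of a private cloud platform such as NCP.
ncp = {c: True for c in NIST_ESSENTIAL_CHARACTERISTICS}

print(is_cloud(ncp))  # True -> the deployment model, not catalog size, decides
```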

Figure: Nutanix Cloud Platform Hybrid Multi-Cloud

At the same time, it would be wrong to assume that private clouds stop at virtual machines and storage. Modern platforms are extending their scope with built-in capabilities for container orchestration, making Kubernetes a first-class citizen for enterprises that want to modernize their applications without stitching together multiple toolchains. On top of that, AI workloads are no longer confined to the public cloud. Private clouds can now deliver integrated solutions for deploying, managing, and scaling AI and machine learning models, combining GPUs, data services, and lifecycle management in one place. This means organizations are not locked out of next-generation workloads simply because they run on private infrastructure.

A good example is what many European governments are facing right now. Imagine a national healthcare system wanting to explore generative AI to improve medical research or diagnostics. Regulatory pressure dictates that sensitive patient data must never leave national borders, let alone be processed in a global public cloud where data residency and sovereignty are unclear. By running AI services directly on top of their private cloud, with Kubernetes as the orchestration layer, they can experiment with new models, train them on local GPU resources, and still keep complete operational control. This setup allows them to comply with AI regulations, maintain full sovereignty, and at the same time benefit from the elasticity and speed of a modern cloud environment. It’s a model that not only protects sovereignty but also accelerates innovation. Innovation at a different pace, but it’s still innovation.
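
To give a rough idea of what "training on local GPU resources with Kubernetes as the orchestration layer" looks like in practice, here is a minimal sketch of a Kubernetes Pod manifest built as a plain Python dictionary. The namespace, node label, and image are hypothetical placeholders; only the GPU resource request (nvidia.com/gpu) follows the standard NVIDIA device-plugin convention.

```python
import json

# Sketch of a Kubernetes Pod manifest requesting one GPU on sovereign,
# on-premises infrastructure. Namespace, label, and image are
# hypothetical placeholders, not real resources.
gpu_training_pod = {
    "apiVersion": "v1",
    "kind": "Pod",
    "metadata": {
        "name": "llm-finetune-job",
        "namespace": "sovereign-ai",  # hypothetical namespace
        "labels": {"data-classification": "patient-data"},
    },
    "spec": {
        "nodeSelector": {"topology.kubernetes.io/region": "ch-onprem-1"},  # assumed node label
        "containers": [{
            "name": "trainer",
            "image": "registry.local/ai/finetune:latest",  # hypothetical image
            "resources": {"limits": {"nvidia.com/gpu": "1"}},
        }],
        "restartPolicy": "Never",
    },
}

print(json.dumps(gpu_training_pod, indent=2))  # ready to serialize and apply with kubectl
```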

Now, here’s where my personal perspective comes in. I no longer believe that the hyperscalers’ stretch into the private domain – think AWS Outposts, Azure Local, or even dedicated models like Oracle’s Dedicated Region – represents the future of cloud. In continental Europe especially, I see these as exceptions rather than the rule. The reality now is that most organizations here are far more concerned with sovereignty, control, and independence than with consuming a hyperscaler’s entire catalog in a smaller, local flavor.

What I believe will be far more relevant is the rise of private clouds as the foundation of enterprise IT. Combined with a hybrid multi-cloud strategy, this opens the door to what I would call a sovereign hybrid multi-cloud architecture. The idea is simple: sovereign and sensitive workloads live in a private cloud that is under your control, built to allow quick migration and even a cloud-exit if needed. At the same time, non-critical workloads can live comfortably in a public cloud where an intentional lock-in may even make sense, because you benefit from the deep integration and services that hyperscalers do best.

And this is where the “exit” part becomes critical. Picture a regulator suddenly deciding that certain workloads containing citizen data cannot legally remain in a U.S.-owned public cloud. For an organization without a sovereign hybrid strategy, this could mean months of firefighting, emergency projects, and unplanned costs to migrate or even rebuild applications. But for those who have invested in a sovereign private cloud foundation, with portable workloads across virtual machines and containers, this becomes a controlled process. Data and apps can be moved back under national jurisdiction quickly (or to any other destination), without breaking services or putting compliance at risk. It turns a crisis into a manageable transition.
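
What such a "controlled process" could look like in code: a minimal sketch that filters a workload inventory for anything a new ruling forces back under national jurisdiction. The inventory entries, classifications, and jurisdiction labels are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    data_class: str      # e.g. "citizen-data", "public"
    provider: str        # e.g. "aws", "azure", "private-cloud"
    jurisdiction: str    # e.g. "US", "EU", "CH"

# Hypothetical inventory for illustration only.
inventory = [
    Workload("tax-portal", "citizen-data", "aws", "US"),
    Workload("marketing-site", "public", "azure", "EU"),
    Workload("health-records", "citizen-data", "private-cloud", "CH"),
]

def repatriation_candidates(workloads, restricted_class, banned_jurisdictions):
    """Workloads that a new ruling forces back under national control."""
    return [w for w in workloads
            if w.data_class == restricted_class
            and w.jurisdiction in banned_jurisdictions]

for w in repatriation_candidates(inventory, "citizen-data", {"US"}):
    print(f"move {w.name} from {w.provider} ({w.jurisdiction}) to the sovereign private cloud")
```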

Figure: VMware Sovereign Cloud Borders

This two-speed model gives you the best of both worlds. Sovereignty where it matters, and scale where it helps. And it puts private cloud platforms like Nutanix NCP in a much more strategic light. They are not just a “mini AWS” or a simplified on-prem extension, but are the anchor that allows enterprises and governments to build an IT architecture with both freedom of choice and long-term resilience.

While public clouds are often seen as environments where control and sovereignty are limited, organizations can now introduce an abstraction and governance layer on top of hyperscaler infrastructure. By running workloads through this layer, whether virtual machines or containers, enterprises gain consistent security controls independent of the underlying public cloud provider, unified operations and management across private and public deployments, and workload portability that avoids deep dependency on hyperscaler-native tools. Most importantly, sovereignty is enhanced, since governance, compliance, and security frameworks remain under the organization’s control.

This architecture essentially transforms the public cloud into an extension of the sovereign environment, rather than a separate silo. It means that even when workloads reside on hyperscaler infrastructure, they can still benefit from enhanced security, governance, and operational consistency, forming the cornerstone of a true sovereign hybrid multi-cloud.
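
As a rough illustration of such a governance layer, the sketch below evaluates one and the same policy for every workload, regardless of the provider underneath. The policy rules and workload attributes are assumptions for illustration.

```python
# Sketch: one governance policy, evaluated identically for every provider.
# Rules and attributes are illustrative assumptions.

POLICY = {
    "allowed_regions": {"eu-central-1", "eu-west-1", "onprem-zurich"},
    "require_encryption": True,
    "require_customer_managed_keys": True,
}

def evaluate(workload: dict) -> list[str]:
    """Return the list of policy violations for a workload."""
    violations = []
    if workload["region"] not in POLICY["allowed_regions"]:
        violations.append("region outside approved jurisdictions")
    if POLICY["require_encryption"] and not workload.get("encrypted", False):
        violations.append("encryption at rest not enabled")
    if POLICY["require_customer_managed_keys"] and not workload.get("cmk", False):
        violations.append("keys not customer-managed")
    return violations

# The same check applies whether the workload runs on a hyperscaler or on-prem.
print(evaluate({"name": "erp", "region": "onprem-zurich", "encrypted": True, "cmk": True}))     # []
print(evaluate({"name": "analytics", "region": "us-east-1", "encrypted": True, "cmk": False}))  # two violations
```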

In short, the question is not whether someone like Nutanix can compete with hyperscalers on the number of services. The real question is whether organizations in Europe want to remain fully dependent on global public clouds or if they want the ability to run sovereign, portable workloads under their own control. From what I see, the latter is becoming the priority.

The Principle of Sovereignty – From the Gods to the Digital Age

Disclaimer: This article reflects my personal interpretation and understanding of the ideas presented by Jordan B. Peterson in his lecture Introduction to the Idea of God. It is based on my own assumptions, reflections, and independent research, and does not claim to represent any theological or academic authority. My intention is not to discuss religion, but to explore an intellectual and philosophical angle on the concept of sovereignty and how it connects to questions of responsibility, order, and autonomy in the digital age.

The views expressed here are entirely my own and written in the spirit of curiosity, personal development, and open dialogue.

For most of modern history, sovereignty was a word reserved for kings, nations, and divine powers. Today, it has migrated into the digital realm. Governments speak of data sovereignty, organisations demand cloud sovereignty, and regulators debate digital autonomy as if it were a new invention. Yet the word itself carries a much deeper heritage.

I have always been drawn to content that challenges me to think deeper. To develop myself, to learn, and to become a more decent human being. That is why I often watch lectures and interviews from thinkers like Jordan B. Peterson. His lecture Introduction to the Idea of God recently caught my attention, not because of its theological angle, but because of how he describes sovereignty. Not as external power, but as an inner principle – the ability to rule responsibly within a domain while remaining subordinate to higher laws and moral order.

That insight provides an unexpected bridge into the digital age. It suggests that sovereignty, whether human or technological, is never just about control. It is about responsible dominion, the ability to govern a domain while remaining answerable to the principles that make such governance legitimate.

When we speak about digital sovereignty, we are therefore not talking about ownership alone. We are talking about whether our digital systems reflect that same hierarchy of responsibility and whether power remains accountable to the principles it claims to serve.

The ancient logic of rule

Long before the language of “data localization” and “cloud independence”, ancient myths wrestled with the same question: who rules, and under what law?

Peterson revisits the Mesopotamian story of Marduk, the god who slays the chaos-monster Tiamat and creates the ordered world from her remains. Marduk becomes king of the gods not because he seizes power, but because he sees clearly (his eyes encircle his head) and speaks truth (his voice creates order). Sovereignty, in this reading, is vision plus articulation, which means that perception and speech transform chaos into structure.

Figure: How Marduk Became King of all the Ancient Babylonian Gods

Source: https://mesopotamia.mrdonn.org/marduk.html 

That principle later reappears in the Hebrew conception of God as a lawgiver above kings. Even the ruler is subordinate to the law, and even the sovereign must bow to the principle of sovereignty itself. This inversion, that the highest power is bound by what is highest, became the cornerstone of Western political thought and the invisible moral logic behind constitutional democracies.

When power forgets that it is subordinate, it turns into tyranny. When it forgets its responsibility, chaos returns.

From political sovereignty to digital power

Fast-forward to the 21st century. We have entered a world where data centers replace palaces and algorithms rival parliaments. Our digital infrastructures have become the new seat of sovereignty. Whoever controls data, compute, and AI, controls the rhythm of economies and the tempo of societies.

Yet, as Peterson would put it, the psychological pattern remains the same. The temptation for absolute rule, for technical sovereignty without moral subordination, is as old as humanity. When a cloud provider holds the keys to a nation’s data but owes its allegiance to no higher principle than profit, sovereignty decays into dominance. When a government seeks total control over data flows without respecting personal freedom, sovereignty mutates into surveillance.

Therefore, digital sovereignty cannot be achieved by isolation alone. It requires a principle above control and a framework of responsibility, transparency, and trust. Without that, even the most sovereign infrastructure becomes a gilded cage.

Sovereignty as responsibility, not control

Peterson defines sovereignty as rule under law, not rule above it. In the biblical imagination, even God’s power is bound by his word – the principle that creates and limits at the same time. That is the paradox of legitimate sovereignty: its strength comes from restraint.

Applied to technology, this means that a sovereign cloud is sovereign because it binds itself to principle. It protects data privacy, enforces jurisdictional control, and allows auditability not because regulation demands it, but because integrity does.

The European vision of digital sovereignty, from the EU’s Data Act to Gaia-X, is, at its core, an attempt to recreate that ancient alignment between power and rule. It is a collective recognition that technical autonomy without ethical structure leads nowhere. Sovereign infrastructure is not an end in itself but a container for trust.

The danger of absolute sovereignty

Peterson warns that when sovereignty becomes absolute, when a ruler or system recognises no law above itself, catastrophe follows. He points to 20th-century totalitarian regimes where ideology replaced principle and individuals were crushed under the weight of unbounded power.

Our digital landscape carries similar risks. The hyperscaler model, while enabling global innovation, also consolidates an unprecedented concentration of authority. When three or four companies decide how data is stored, how AI models are trained, and which APIs define interoperability, sovereignty becomes fragile. We exchange the chaos of fragmentation for the order of dependence.

Likewise, when nations pursue “data nationalism” without transparency or interoperability, they may protect sovereignty in name but destroy it in spirit. The closed sovereign cloud is the modern equivalent of the paranoid king. Safe behind walls, but blind to the world beyond them.

True sovereignty, Peterson reminds us, requires both vision and speech. You must see beyond your borders and articulate the principles that guide your rule.

Seeing and speaking truth

In Peterson’s cosmology, the sovereign creates order by confronting chaos directly: by seeing clearly and speaking truthfully. That process translates elegantly into the digital domain.

To build responsible digital systems, we must first see where our dependencies lie: in APIs, in hardware supply chains, in foreign jurisdictions, in proprietary formats. Sovereignty begins with visibility, with mapping the landscape of control.

Then comes the speech, the articulation of principle. What do we value? Data integrity? Interoperability? Legal transparency? Without speaking those truths, sovereignty becomes mute. The process of defining digital sovereignty is therefore not only political or technical but also linguistic and moral. It requires the courage to define what the higher principle actually is.

In Europe, this articulation is still evolving. The Gaia-X framework, the Swiss Government Cloud initiative, or France’s “Cloud de Confiance” are all attempts to speak sovereignty into existence by declaring that our infrastructure must reflect our societal values, not merely technical efficiency.

The individual at the root of sovereignty

One of Peterson’s most subtle points is that sovereignty ultimately begins in the individual. You cannot have a sovereign nation composed of individuals who refuse responsibility. Likewise, you cannot have a sovereign digital ecosystem built by actors who treat compliance as theatre and trust as a marketing term.

Every architect, policymaker, and operator holds a fragment of sovereignty. The question is whether they act as tyrants within their domain or as stewards under a higher rule. A cloud platform becomes sovereign when its people behave sovereignly and when they balance autonomy with discipline, ownership with humility.

In that sense, digital sovereignty is a psychological project as much as a technical one. It demands integrity, courage, and self-limitation at every level of the stack.

From divine order to digital order

The story of sovereignty, from the earliest myths to the latest regulations, is the same story told in different languages. It is the human struggle to balance power and principle, freedom and responsibility, chaos and order.

Peterson’s interpretation of sovereignty – the ruler who sees and speaks, the individual who stands upright before the highest good – offers a mirror for our technological age. The digital world is the new frontier of chaos. Our infrastructures are its architecture of order. Whether that order remains free depends on whether we remember the ancient rule: even the sovereign must be subordinate to sovereignty itself.

A sovereign cloud, therefore, is not the digital expression of nationalism, but the continuation of an older and nobler tradition. To govern power with conscience, to build systems that serve rather than dominate, to make technology reflect the values of the societies it empowers.

The true measure of sovereignty, digital or divine, lies not in the scope of control but in the depth of responsibility.

It’s Time to Rethink Your Private Cloud Strategy

For over a decade, VMware has been the foundation of enterprise IT. Virtualization was almost synonymous with VMware, and entire operating models were built around it. But every era of technology eventually reaches a turning point. With vSphere 8 approaching its End of General Support in October 2027, followed by the End of Technical Guidance in 2029, customers will most likely be asked to commit to VMware Cloud Foundation (VCF) 9.x and beyond.

On paper, this may look like just another upgrade cycle, but in reality, it forces every CIO and IT leader to pause and ask the harder questions: How much control do we still have? How much flexibility remains? And do we have the freedom to define our own future?

Why This Moment Feels Different

Enterprises are not new to change. Platforms evolve, vendors shift focus, and pricing structures come and go. Normally, these transitions are gradual, with plenty of time to adapt.

What feels different today is the depth of dependency on VMware. Many organizations built their entire data center strategy on one assumption: VMware is the safe choice. VMware became the backbone of operations, the standard on which teams, processes, and certifications were built.

CIOs realize the “safe choice” is no longer guaranteed to be the most secure or sustainable. Instead of incremental adjustments, they face fundamental questions: Do we want to double down, or do we want to rebalance our dependencies?

Time Is Shorter Than It Looks

2027 may sound far away, but IT leaders know that large infrastructure decisions take years. A realistic migration journey involves:

  • Evaluation & Strategy (6 to 12 months) – Assessing alternatives, validating requirements, building a business case.

  • Proof of Concept & Pilots (6 to 12 months) – Testing technology, ensuring integration, training staff.

  • Procurement & Budgeting (3 to 9 months) – Aligning financial approvals, negotiating contracts, securing resources.

  • Migration & Adoption (12 to 24 months) – Moving workloads, stabilizing operations, decommissioning legacy systems.

Put these together, and the timeline shrinks quickly. The real risk is not the change itself, but running out of time to make that change on your terms.
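
Summing the ranges from the list above makes the point concrete; a minimal sketch that uses only the month estimates already given (phases can of course overlap in practice):

```python
# Sketch: summing the phase estimates above to see how little slack
# remains before October 2027. Month ranges are taken from the list.

phases = {
    "Evaluation & Strategy": (6, 12),
    "Proof of Concept & Pilots": (6, 12),
    "Procurement & Budgeting": (3, 9),
    "Migration & Adoption": (12, 24),
}

best_case = sum(lo for lo, _ in phases.values())    # 27 months
worst_case = sum(hi for _, hi in phases.values())   # 57 months

print(f"Best case:  {best_case} months (~{best_case / 12:.1f} years)")
print(f"Worst case: {worst_case} months (~{worst_case / 12:.1f} years)")
```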

The Pricing Question You Can’t Ignore

Now imagine this scenario:

The list price for VMware Cloud Foundation today sits around $350 per core per year. Let’s say Broadcom adjusts it by +20%, raising it to $420 this year. Then, two years later, just before your next renewal, it increases again to $500 per core per year.

Would your situation and thoughts change?

For many enterprises, this is not a theoretical question. Cost predictability is part of operational stability. If your infrastructure platform becomes a recurring cost variable, every budgeting cycle turns into a crisis of confidence.
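
To put that scenario into absolute numbers, here is a small sketch using the hypothetical per-core prices above and an assumed fleet of 2,000 cores; the fleet size is an invented example, not a benchmark.

```python
# Sketch: annual licensing cost under the hypothetical price steps
# described above ($350 -> $420 -> $500 per core per year).
# The 2,000-core fleet is an assumed example.

cores = 2_000
price_per_core = {"today": 350, "after +20%": 420, "at next renewal": 500}

baseline = cores * price_per_core["today"]
for label, price in price_per_core.items():
    annual = cores * price
    delta = annual - baseline
    print(f"{label:>16}: ${annual:,.0f}/year (+${delta:,.0f} vs. today)")
```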

When platforms evolve faster than budgets, even loyal customers start re-evaluating their dependency. The total cost of ownership is no longer just about what you pay for software, but about what it costs to stay locked in.

And this is where strategic foresight matters most: Do you plan your next three years assuming stability, or do you prepare for volatility?

The Crossroads – vSphere 9 or VCF 9

In the short term, many customers will take the most pragmatic route. They upgrade to vSphere 9 to buy time. It’s the logical next step, preserving compatibility and delaying a bigger architectural decision.

But this path comes with an expiration date. Broadcom’s strategic focus is clear: the future of VMware is VCF 9. Over time, standalone vSphere environments will likely receive less development focus and fewer feature innovations. Eventually, organizations will be encouraged, if not forced, to adopt the integrated VCF model, because standalone vSphere or VMware vSphere Foundation (VVF) is going to be more expensive than VMware Cloud Foundation.

For some, this convergence will simplify operations. For others, it will mean even higher costs, reduced flexibility, and tighter coupling with VMware’s lifecycle.

This is the true decision point. Staying on vSphere 9 buys time, but it doesn’t buy independence (think about sovereignty too!). It’s a pause, not a pivot. Sooner or later, every organization will have to decide:

  • Commit fully to VMware Cloud Foundation and accept the new model, or

  • Diversify and build flexibility with platforms that maintain open integration and operational control

Preparing for the Next Decade

The next decade will reshape enterprise IT. Whether through AI adoption, sovereign cloud requirements, or sustainability mandates, infrastructure decisions will have long-lasting impact.

The question is not whether VMware remains relevant – it will. The question is whether your organization wants to let VMware’s roadmap dictate your future.

This moment should be viewed not as a threat but as an opportunity. It’s a chance (again) to reassess dependencies, diversify, and secure true autonomy. For CIOs, the VMware shift is less about technology and more about leadership.

Yes, it’s about ensuring that your infrastructure strategy aligns with your long-term vision, not just with a vendor’s plan.

The New Rationality of Digital Sovereignty

For decades, digital transformation followed a clear logic. Efficiency, scale, and innovation were the ultimate measures of progress. Cloud computing, global supply chains, and open ecosystems seemed to be the rational path forward. The idea of “digital sovereignty”, meaning the control over one’s digital infrastructure, data, and technologies, was often viewed as a political or bureaucratic concern, something secondary to innovation and growth.

But rationality itself is not static. What we perceive as “reasonable” depends on how we define utility and risk. Expected Utility Theory, the classic model of decision-making under uncertainty, tells us that every decision under uncertainty is guided not by objective outcomes, but by how we value those outcomes and how we fear their alternatives.

If that’s true, then digital sovereignty is not merely a political mood swing, but a recalibration of what societies see as valuable in an age where uncertainty has become the dominant variable.

Expected Utility Theory – Rationality Under Uncertainty

When we speak about rationality, we often assume that it means making the “best” choice. But what defines best depends on how we perceive outcomes and risks. The Expected Utility Theory (EUT), first introduced by Daniel Bernoulli and later refined by John von Neumann and Oskar Morgenstern, offers a framework to understand this.

EUT assumes that rational actors make decisions by comparing the expected utility of different options, not their absolute outcomes. Each possible outcome is weighted by the likelihood of its occurrence and the subjective value, or utility, assigned to it.
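
In the standard textbook formulation (not something introduced in the article itself), the expected utility of an option A with possible outcomes x_i, probabilities p_i, and a subjective utility function u is:

```latex
EU(A) = \sum_{i} p_i \, u(x_i), \qquad \text{choose } A \text{ over } B \iff EU(A) > EU(B)
```

A rational actor, in this model, simply prefers whichever option has the higher expected utility.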

In essence, people don’t choose what’s objectively optimal. They choose what feels most secure and beneficial, given how they perceive risk.

What makes this theory profound is that utility is subjective. Humans and institutions value losses and gains asymmetrically. Losing control feels more painful than gaining efficiency feels rewarding. As a result, decisions that appear “irrational” in economic terms can be deeply rational when viewed through the lens of stability and trust.

In a world of increasing uncertainty – geopolitical, technological, and environmental – the weighting in this equation changes. The expected cost of dependency grows, even if the actual probability of disruption remains small. Rational actors adjust by rebalancing their notion of utility. They begin to value autonomy, resilience, and continuity more than convenience.

In this sense, EUT doesn’t just describe individual behavior. It reflects the collective psychology of societies adapting to systemic uncertainty by explaining why the pursuit of digital sovereignty is not driven by emotion, but by a rational redefinition of what it means to be safe, independent, and future-ready.

Redefining Utility in a Fragmented World

The Expected Utility Theory states that rational actors make choices based on expected outcomes, the weighted balance of benefits and risks. Yet these weights are subjective. They shift with context, history, and collective experience.

In the early cloud era, societies assigned high utility to efficiency and innovation, while the perceived probability of geopolitical or technological disruption was low. The rational choice, under that model, was integration and interdependence.

Today, the probabilities have changed, or rather, our perception of them. The likelihood of systemic disruption, cascading failures across tightly connected digital and geopolitical systems, no longer feels distant. From chip shortages to data jurisdiction disputes and the limited transparency of AI models, the cost of dependency has become part of the rational calculus.

Sovereignty as Expected Utility

What we call “digital sovereignty” is, at its core, a decision under uncertainty. It asks: Who controls the systems that control us? And what happens if that control shifts?

When the expected utility of autonomy exceeds the expected utility of convenience, societies change course. This is precisely what we are witnessing now. A collective behavioral shift in which resilience, trust, and control carry more weight than frictionless innovation.
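
A hedged numeric illustration of that tipping point: with invented probabilities and utilities, the sketch below shows how a rising perceived likelihood of disruption flips the “rational” choice from the convenient, dependent setup to the sovereign one. All numbers are assumptions, not measurements.

```python
# Sketch: expected-utility comparison between staying on a dependent
# setup (convenient, but exposed to disruption) and a sovereign setup
# (less convenient, but resilient). All values are invented.

def expected_utility(outcomes):
    """outcomes: list of (probability, utility) pairs."""
    return sum(p * u for p, u in outcomes)

def compare(p_disruption):
    dependent = expected_utility([
        (1 - p_disruption, 100),   # frictionless operations
        (p_disruption,     -400),  # forced migration, compliance breach
    ])
    sovereign = expected_utility([
        (1.0, 70),                 # steady, slightly less convenient
    ])
    return dependent, sovereign

for p in (0.02, 0.05, 0.10, 0.20):
    dep, sov = compare(p)
    choice = "dependent" if dep > sov else "sovereign"
    print(f"P(disruption)={p:.2f}: dependent={dep:6.1f}  sovereign={sov:5.1f}  -> {choice}")
```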

Consider the examples:

  • Geopolitics has made technological dependence visible as a strategic vulnerability

  • Cybersecurity incidents have revealed that data sovereignty equals national security

  • AI and cloud governance have shown that control over infrastructure is also control over narrative and the flow of knowledge

Under these pressures, sovereignty is no longer a romantic ideal. It becomes a rational equilibrium point. The point at which perceived risks of dependency outweigh the marginal gains of openness.

From Emotional to Rational Sovereignty

In the past, sovereignty was often treated as nostalgia. A desire for control in a world that had already moved beyond borders. But what if this new form of sovereignty is not regression, but evolution?

Expected Utility Theory reminds us that rational behavior adapts to new distributions of risk. The same actor who once outsourced infrastructure to gain efficiency might now internalize it to regain predictability. The logic hasn’t changed, but the world around it has.

This means that digital sovereignty is not anti-globalization. It is post-globalization. It recognizes that autonomy is not isolation, but a condition for meaningful participation.

Autonomy as the Foundation of Participation

To participate freely, one must first possess the ability to choose. That is the essence of autonomy. It ensures that participation remains voluntary, ethical, and sustainable. Autonomy allows a nation to cooperate globally while retaining control over its data and infrastructure. It allows a company to use shared platforms without surrendering governance. It allows citizens to engage digitally without becoming subjects of invisible systems.

In this sense, autonomy creates the conditions for trust. Without it, participation turns into a relationship defined by dependence rather than exchange.

Post-globalization, therefore, represents not the end of the global era, but its maturation. It transforms globalization from a process of unchecked integration into one of mutual sovereignty: A world still connected, but built on the principle that every participant retains the right and capability to stand independently.

Sovereignty, in this light, becomes the structural balance between openness and ownership. It is what allows a society, an enterprise, or even an individual to remain part of the global system without being consumed by it.

The Return of Control as Rational Choice

Post-globalization has taught us that the strength of a system no longer lies in how open it is, but in how consciously it manages its connections. The same logic now applies to the digital realm.

Societies are learning to calibrate openness with control. They are rediscovering that stability and agency are not obstacles to innovation, but its foundation.

Expected Utility Theory helps make this visible. As uncertainty rises, the perceived utility of autonomy grows. What once appeared as overcaution now reads as wisdom. What once seemed like costly redundancy now feels like strategic resilience.

Control, long misunderstood as inefficiency, has become a new form of intelligence. It is a way to remain adaptive without becoming dependent. Sovereignty, once framed as resistance, now stands for continuity in an unstable world.

The digital age no longer rewards blind efficiency. It rewards balance, which is the ability to stay independent when the systems around us begin to drift.

Sovereignty in the Cloud Is a Matter of Perspective

Sovereignty in the cloud is often treated as a cost: something that slows innovation, complicates operations, and makes infrastructure more expensive. But for governments, critical industries, and regulated enterprises, sovereignty is the basis of resilience, compliance, and long-term autonomy, not a burden. The way a provider positions sovereignty reveals a lot about how it sees the balance between global scale and local control.

Platforms like Oracle’s EU Sovereign Cloud show that sovereignty doesn’t have to come at the expense of capability: it delivers the same services at the same pricing and is operated entirely by EU-based staff. Nutanix pushes the idea even further with its distributed cloud operating model, showing that sovereignty and value can reinforce each other rather than clash.

Microsoft’s Framing

In Microsoft’s chart, the hyperscale cloud sits on the far left of the spectrum. Standard Azure and Microsoft 365 are presented as delivering only minimal sovereignty, little residency choice, and almost no operational control. The upside, in their telling, is that this model maximizes “cloud value” through global scale, innovation, and efficiency.

Figure: Microsoft Sovereignty Trade-Offs

Move further to the right and you encounter Microsoft’s sovereign variants. Here, they place offerings such as Azure Local with M365 Local and national partner clouds like Delos in Germany or Bleu in France. These are designed to deliver more sovereignty and operational control by layering in local staff, isolated infrastructure, and stricter national compliance. Yet the framing is still one of compromise. As you gain sovereignty, you are told that some of the value of the hyperscale model inevitably falls away.

Microsoft’s Sovereign Cloud Portfolio

To reinforce this point, Microsoft presents a portfolio of three models. The first is the Sovereign Public Cloud, which is owned and operated directly by Microsoft. Data remains in Europe, and customers get software-based sovereignty controls such as “Customer Lockbox” or “Confidential Computing”. It runs in Microsoft’s existing datacenters and doesn’t require migration, but it is still, at its core, a hyperscale cloud with policy guardrails added on top.

The second model is the Sovereign Private Cloud. This is customer-owned or partner-operated, running on Azure Local and Microsoft 365 Local inside local data centers. It can be hybrid or even disconnected, and is validated through Microsoft’s traditional on-premises server stack such as Hyper-V, Exchange, or SharePoint. Here, sovereignty increases because customers hold the operational keys, but it is clearly a departure from the hyperscale simplicity.

Figure: Microsoft Sovereign Cloud Portfolio

Finally, there are the National Partner Clouds, built in cooperation with approved local entities such as SAP for Delos in Germany or Orange and Capgemini for Bleu in France. These clouds are fully isolated, meet the most stringent government standards like SecNumCloud in France, and are aimed at governments and critical infrastructure providers. In Microsoft’s portfolio, this is the most sovereign option, but also the furthest away from the original promise of the hyperscale cloud.

On paper, this portfolio looks broad. But the pattern remains: Microsoft treats sovereignty as something that adds control at the expense of cloud value.

What If We Reframe the Axes From “Cloud Value” to “Business Value”?

That framing makes sense if you are a hyperscaler whose advantage lies in global scale. But it doesn’t reflect how governments, critical infrastructure providers, or regulated enterprises measure success. If we shift the Y-axis away from “cloud value” and instead call it “business value”, the story changes completely. Business value is about resilience, compliance, cost predictability, reliable performance in local contexts, and the flexibility to choose infrastructure and partners that meet strategic needs.

The X-axis also takes on a different character. Instead of seeing sovereignty, residency, and operations as a cost or a burden, they become assets. The more sovereignty an organization can exercise, the more it can align its IT operations with national policies, regulatory mandates, and its own resilience strategies. In this reframing, sovereignty is not a trade-off, but a multiplier.

What the New Landscape Shows

Once you adopt this perspective, the map of cloud providers looks very different.

Figure: Sovereign Cloud Analysis Chart

Please note: Exact positions on such a chart are always debatable, depending on whether you weigh ecosystem, scale, or sovereignty highest. 🙂

Microsoft Azure sits in the lower left, offering little in terms of sovereignty or control and, as a result, little real business value for sectors that depend on compliance and resilience. Adding Microsoft’s so-called sovereign controls moves the position slightly upward and to the right, but it still remains closer to enhanced compliance than genuine sovereignty. AWS’s European Sovereign Cloud lands in the middle, reflecting its cautious promises, which are a step toward sovereignty but not yet backed by deep operational independence.

Oracle’s EU Sovereign Cloud moves higher because it combines full service parity with the regular Oracle Cloud, identical pricing, and EU-based operations, making it a credible sovereign choice without hidden compromises. OCI Dedicated Region provides strong business value in a customer’s location, but since operations remain largely in Oracle’s hands, it offers less direct control than something like VMware. VMware by Broadcom sits further to the right thanks to the control it gives customers who run the stack themselves, but its business value is dragged down by complexity, licensing issues, and legacy cost.

The clear outlier is Nutanix, rising toward the top-right corner. Its distributed cloud model spanning on-prem, edge, and multi-cloud maximizes control and business value compared to most peers. Yes, Nutanix is not flawless, and yes, Nutanix lacks the massive partner ecosystem and developer gravity of hyperscalers, but for organizations prioritizing sovereignty, it comes closest to the “ideal zone”.
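
For readers who want to draw such a map themselves, here is a small matplotlib sketch. The coordinates are illustrative guesses that loosely follow the narrative above and are just as debatable as any other placement.

```python
import matplotlib.pyplot as plt

# Sketch: the reframed landscape with sovereignty & control on the x-axis
# and business value on the y-axis. Positions are illustrative guesses
# following the discussion above, not measured data.

providers = {
    "Microsoft Azure": (1.5, 2.0),
    "Azure + sovereign controls": (2.5, 3.0),
    "AWS European Sovereign Cloud": (5.0, 5.0),
    "Oracle EU Sovereign Cloud": (6.5, 7.0),
    "OCI Dedicated Region": (6.0, 7.5),
    "VMware by Broadcom": (7.5, 5.5),
    "Nutanix": (8.5, 8.5),
}

fig, ax = plt.subplots(figsize=(7, 5))
for name, (x, y) in providers.items():
    ax.scatter(x, y)
    ax.annotate(name, (x, y), textcoords="offset points", xytext=(5, 5), fontsize=8)

ax.set_xlabel("Sovereignty, residency & operational control")
ax.set_ylabel("Business value (resilience, compliance, predictability)")
ax.set_xlim(0, 10)
ax.set_ylim(0, 10)
ax.set_title("Sovereign cloud landscape (illustrative positions)")
plt.tight_layout()
plt.show()
```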

Conclusion

The lesson here is simple. Sovereignty is always a matter of perspective. For a hyperscaler, it looks like a tax on efficiency. For governments, banks, hospitals, or critical industries, it is the very foundation of value. For enterprises trying to reconcile global ambitions with local obligations, sovereignty is not a drag on innovation but the way to ensure autonomy, resilience, and compliance.

Microsoft’s chart is not technically wrong, but it is incomplete. Once you redefine the axes around real-world business priorities, you see that sovereignty does not reduce value. For many organizations, it is the only way to maximize it – though the exact balance point will differ depending on whether your priority is scale, compliance, or operational autonomy.

What If Cloud Was Never the Destination But Just One Chapter in a Longer Journey

For more than a decade, IT strategies were shaped by a powerful promise that the public cloud was the final destination. Enterprises were told that everything would eventually run there, that the data center would become obsolete, and that the only rational strategy was “cloud-first”. For a time, this narrative worked. It created clarity in a complex world and provided decision-makers with a guiding principle.

Hyperscalers accelerated digital transformation in ways no one else could have. Without their scale and speed, the last decade of IT modernization would have looked very different. But what worked as a catalyst does not automatically define the long-term architecture.

But what if that narrative was never entirely true? What if the cloud was not the destination at all, but only a chapter? A critical accelerator in the broader evolution of enterprise infrastructure? The growing evidence suggests exactly that. Today, we are seeing the limits of mono-cloud thinking and the emergence of something new. A shift towards adaptive platforms that prioritize autonomy over location.

The Rise and Fall of Mono-Cloud Thinking

The first wave of cloud adoption was almost euphoric. Moving everything into a single public cloud seemed not just efficient but inevitable. Infrastructure management became simpler, procurement cycles shorter, and time-to-market faster. For CIOs under pressure to modernize, the benefits were immediate and tangible.

Yet over time, the cost savings that once justified the shift started to erode. What initially looked like operational efficiency transformed into long-term operating expenses that grew relentlessly with scale. Data gravity added another layer of friction. While applications were easy to deploy, the vast datasets they relied on were not as mobile. And then came the growing emphasis on sovereignty and compliance. Governments and regulators, as well as citizens and journalists, started asking difficult questions about who ultimately controlled the data and under what jurisdiction.

These realities did not erase the value of the public cloud, but they reframed it. Mono-cloud strategies, while powerful in their early days, increasingly appeared too rigid, too costly, and too dependent on external factors beyond the control of the enterprise.

Multi-Cloud as a Halfway Step

In response, many organizations turned to multi-cloud. If one provider created lock-in, why not distribute workloads across two or three? The reasoning was logical. Diversify risk, improve resilience, and gain leverage in vendor negotiations.

But as the theory met reality, the complexity of multi-cloud began to outweigh its promises. Each cloud provider came with its own set of tools, APIs, and management layers, creating operational fragmentation rather than simplification. Policies around security and compliance became harder to enforce consistently. And the cost of expertise rose dramatically, as teams were suddenly required to master multiple environments instead of one.

Multi-cloud, in practice, became less of a strategy and more of a compromise. It revealed the desire for autonomy, but without providing the mechanisms to truly achieve it. What emerged was not freedom, but another form of dependency. This time, on the ability of teams to stitch together disparate environments at great cost and complexity.

The Adaptive Platform Hypothesis

If mono-cloud was too rigid and multi-cloud too fragmented, then what comes next? The hypothesis that is now emerging is that the future will be defined not by a place – cloud, on-premises, or edge – but by the adaptability of the platform that connects them.

Adaptive platforms are designed to eliminate friction, allowing workloads to move freely when circumstances change. They bring compute to the data rather than forcing data to move to compute, which becomes especially critical in the age of AI. They make sovereignty and compliance part of the design rather than an afterthought, ensuring that regulatory shifts do not force expensive architectural overhauls. And most importantly, they allow enterprises to retain operational autonomy even as vendors merge, licensing models change, or new technologies emerge.

This idea reframes the conversation entirely. Instead of asking where workloads should run, the more relevant question becomes how quickly and easily they can be moved, scaled, and adapted. Autonomy, not location, becomes the decisive metric of success.
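
As a rough sketch of what “autonomy, not location” can mean operationally, the example below scores candidate locations for a workload by data gravity, sovereignty constraints, and cost. Locations, attributes, and weights are invented for illustration.

```python
# Sketch: choosing where to run a workload based on data gravity,
# sovereignty constraints, and cost instead of a fixed "cloud-first"
# rule. Locations, attributes, and weights are illustrative assumptions.

LOCATIONS = {
    "public-cloud":  {"near_data": False, "sovereign": False, "unit_cost": 1.0},
    "private-cloud": {"near_data": True,  "sovereign": True,  "unit_cost": 1.2},
    "edge-site":     {"near_data": True,  "sovereign": True,  "unit_cost": 1.5},
}

def place(workload):
    """Return the best location for a workload, honoring hard constraints first."""
    candidates = {
        name: attrs for name, attrs in LOCATIONS.items()
        if not (workload["requires_sovereignty"] and not attrs["sovereign"])
    }
    def score(attrs):
        gravity_bonus = 2.0 if (workload["data_heavy"] and attrs["near_data"]) else 0.0
        return gravity_bonus - attrs["unit_cost"]
    return max(candidates, key=lambda name: score(candidates[name]))

print(place({"name": "ai-training", "requires_sovereignty": True, "data_heavy": True}))    # private-cloud
print(place({"name": "web-frontend", "requires_sovereignty": False, "data_heavy": False}))  # public-cloud
```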

Autonomy as the New Metric?

The story of the cloud is not over, but the chapter of cloud as a final destination is closing. The public cloud was never the endpoint, but it was a powerful catalyst that changed how we think about IT consumption. But the next stage is already being written, and it is less about destinations than about options.

Certain workloads will always thrive in a hyperscale cloud – think collaboration tools, globally distributed apps, or burst capacity. Others, especially those tied to sovereignty, compliance, or AI data proximity, demand a different approach. Adaptive platforms are emerging to fill that gap.

Enterprises that build for autonomy will be better positioned to navigate an unpredictable future. They will be able to shift workloads without fear of vendor lock-in, place AI infrastructure close to where data resides, and comply with sovereignty requirements without slowing down innovation.

The emerging truth is simple: Cloud was never the destination. It was only one chapter in a much longer journey. The next chapter belongs to adaptive platforms and to organizations bold enough to design for freedom rather than dependency.