Every technology leader knows this moment: the procurement team sits across the table and asks the question you’ve heard a hundred times before. “Why is this solution more expensive than what we thought?”
When it comes to Nutanix, the honest answer is simple: it’s not cheap. And it shouldn’t be. Because what you’re paying for is not just software – you’re paying for enterprise-readiness, operational simplicity, support quality, and long-term resilience. And don’t forget freedom and sovereignty.
But let’s put that into perspective.
The Myth of Cheap IT
Many IT strategies start with the illusion of saving money. The public cloud is often positioned as the easy, cost-effective way forward. The first few months even look promising: minimal upfront investments, quick provisioning, instant access to services.
But costs in the public cloud scale differently. What starts as an attractive proof of concept soon becomes a recurring nightmare in the CFO’s inbox. Networking, egress charges, storage tiers, backup, and compliance layers all stack on top of the base infrastructure. Before long, that “cheap” platform becomes one of the most expensive commitments in the entire IT budget.
We don’t have to talk in hypotheticals here. Just look at 37signals, the company behind Basecamp. In 2022, they began migrating away from Amazon Web Services (AWS) because of escalating costs. Their AWS bill had ballooned to $3.2 million annually, with $1.5 million of that just for storage. By investing $700,000 in Dell servers and $1.5 million in Pure Storage arrays, they migrated 18 petabytes of data out of AWS and completely shut down their cloud account by summer 2025. The result? Annual savings of more than $2 million, alongside full ownership and visibility into their infrastructure. For 37signals, the math was simple: public cloud had become the expensive choice.
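The figures above make for a simple back-of-the-envelope calculation. Here is a minimal sketch of the implied payback math; the hardware and savings numbers come from the reported case, while the derived on-premises run cost is an inference, not a published figure:

```python
# Back-of-the-envelope payback calculation for the cloud-exit scenario
# described above. Hardware investment and savings are the reported
# figures; the implied on-prem run cost is derived, not published.
aws_annual_cost = 3_200_000        # former AWS bill per year
capex = 700_000 + 1_500_000        # Dell servers + Pure Storage arrays
annual_savings = 2_000_000         # reported annual savings

# If you save $2M against a $3.2M bill, the new run cost is implied:
onprem_annual_cost = aws_annual_cost - annual_savings

payback_years = capex / annual_savings
print(f"Implied on-prem annual cost: ${onprem_annual_cost:,}")
print(f"Capex payback period: {payback_years:.1f} years")
```

In other words, the one-time hardware investment pays for itself in roughly a year of savings, and every year after that is pure gain.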
VMware customers are experiencing something similar, but in a different flavor. Broadcom’s new licensing model has transformed familiar cost structures into something far less predictable and much higher. Organizations that relied on VMware for decades now face steep renewals, mandatory bundles, and less flexibility to optimize spend.
So yes, let’s talk about “expensive”. But let’s be honest about what expensive really looks like.
Paying for Readiness
Let’s talk about Nutanix. At first glance, it may not be the cheapest option on the table. But Nutanix is built from the ground up to deliver enterprise capabilities that reduce hidden costs and avoid painful surprises.
What others solve with layers of tools, Nutanix delivers in a single, integrated platform. That means fewer licenses, fewer integration projects, and fewer teams chasing issues across silos.
The architecture distributes risk instead of concentrating it. Failures don’t cascade, operations don’t grind to a halt, and recovery doesn’t require a small army.
You decide the hardware, the software, and how you extend into the public cloud. Any lock-in is intentional, and there are no forced upgrades just because a vendor decided to change the rules.
Value is the Real Differentiator
Price is always visible. It’s the line item that everyone sees. But value is often hidden in what doesn’t happen. The outages that rarely occur. The security incidents avoided. The integration projects you don’t need.
When you compare Nutanix against VMware’s new pricing or against runaway public cloud bills, the story shifts. What once looked “expensive” now feels reasonable. Because with Nutanix, you are not paying for legacy baggage or unpredictable consumption models. You are paying for a platform that runs mission-critical workloads in your sovereign environment.
The Real Cost of Cheap
There’s an old truth in enterprise IT: cheap usually ends up being the most expensive choice. Cutting costs upfront often means sacrificing reliability, adding complexity, or creating lock-in that limits your future options. And every one of those decisions comes back later as a much bigger invoice. Sometimes in dollars, sometimes in lost trust.
Nutanix is not cheap. But it is predictable. It is proven. And it is built for organizations that cannot afford to compromise on the workloads that matter most.
Final Thought
The question is not whether Nutanix costs money; of course it does. The real question is what you get in return, and how it compares to the alternatives. Against public cloud bills spiraling out of control and VMware contracts that now feel more like ransom notes, Nutanix delivers clarity, control, sovereignty, and enterprise-grade quality.
For more than two decades, VMware has been the backbone of enterprise IT. It virtualized the data center, transformed the way infrastructure was consumed, and defined the operating model of an entire generation of CIOs and IT architects. That era mattered, and it brought incredible efficiency gains. But as much as VMware shaped the last chapter, the story of enterprise infrastructure is now moving on. And the real question for organizations is not “VMware or Nutanix?”, but rather: how much control are you willing to keep over your own future?
The Wrong Question
The way the conversation is often framed, Nutanix against VMware, misses the point entirely. Customers are not trying to settle a sports rivalry. They are not interested in cheering for one logo over another. What they are really trying to figure out is whether their infrastructure strategy gives them freedom or creates dependency. It is less about choosing between two vendors and more about choosing how much autonomy they retain.
VMware is still seen as the incumbent, the technology that defined stability and became the default. Nutanix is often described as the challenger. But in reality, the battleground has shifted. It is no longer about virtualization versus hyperconvergence, but about which platform offers true adaptability in a multi-cloud world.
The VMware Era – A Breakthrough That Belongs to the Past
There is no denying VMware’s historical importance. Virtualization was a revolution. It allowed enterprises to consolidate, to scale, and to rethink how applications were deployed. For a long time, VMware was synonymous with progress.
But revolutions have life cycles. Virtualization solved yesterday’s problems, and the challenges today look very different. Enterprises now face hybrid and multi-cloud realities, sovereignty concerns, and the rise of AI workloads that stretch far beyond the boundaries of a hypervisor. VMware’s empire was built for an era where the primary challenge was infrastructure efficiency. That chapter is now closing.
The Nutanix Trajectory – From HCI to a Distributed Cloud OS
Nutanix started with hyperconverged infrastructure. That much is true, but it never stopped there. Over the years, Nutanix has steadily moved towards building a distributed cloud operating system that spans on-premises data centers, public clouds, and the edge.
For many customers, staying with VMware feels like the path of least resistance. There are sunk costs, existing skill sets, and the comfort of familiarity, but inertia comes at a price. The longer enterprises delay modernization, the more difficult and expensive it becomes to catch up later.
The Broadcom acquisition has accelerated this reality. Pricing changes, bundled contracts, and ecosystem lock-in are daily conversations in boardrooms. Dependency has become a strategic liability. What once felt like stability now feels like fragility.
Leverage Instead of Lock-In
This is where Nutanix changes the narrative. It is not simply offering an alternative hypervisor or another management tool. It is offering leverage – the ability to simplify operations while keeping doors open.
With Nutanix, customers can run workloads on-premises, in AWS, in Azure, in GCP, or across them all. They can adopt cloud-native services without abandoning existing investments. They can prepare for sovereignty requirements or AI infrastructure needs without being tied to a single roadmap dictated by a vendor’s financial strategy.
That is what leverage looks like. It gives CIOs and IT leaders negotiation power. It ensures that the infrastructure strategy is not dictated by one supplier’s pricing model, but by the customer’s own business needs.
The Next Chapter
VMware defined the last era of enterprise IT. It built the virtualization chapter that will always remain a cornerstone in IT history. But the next chapter is being written by Nutanix. Not because it “beat” VMware, but because it aligned itself with the challenges enterprises are facing today: autonomy, adaptability, and resilience.
This chapter is about who controls the terms of the game. And for organizations that want to stay in charge of their own destiny, Nutanix represents the next chapter.
When people talk about cloud computing, the conversation almost always drifts toward the hyperscalers. AWS, Azure, and Google Cloud have shaped what we consider a “cloud” today. They offer seemingly endless catalogs of services, APIs for everything, and a global footprint. So why does Nutanix call its Nutanix Cloud Platform (NCP) a private cloud, even though its catalog of IaaS and PaaS services is far more limited?
To answer that, it makes sense to go back to the roots. NIST’s SP 800-145 definition of cloud computing is still the most relevant one. According to it, five essential characteristics make something a cloud: on-demand self-service, broad network access, resource pooling, rapid elasticity, and measured service. NIST then defines four deployment models: public, private, community, and hybrid.
If you look at NCP through that lens, it ticks the boxes. It delivers on-demand infrastructure through Prism and APIs, it abstracts and pools compute and storage across nodes, it scales out quickly, and it gives you metering and reporting on consumption. A private cloud is about the deployment model and the operating characteristics, not about the length of the service catalog. And that’s why NCP rightfully positions itself as a private cloud platform.
At the same time, it would be wrong to assume that private clouds stop at virtual machines and storage. Modern platforms are extending their scope with built-in capabilities for container orchestration, making Kubernetes a first-class citizen for enterprises that want to modernize their applications without stitching together multiple toolchains. On top of that, AI workloads are no longer confined to the public cloud. Private clouds can now deliver integrated solutions for deploying, managing, and scaling AI and machine learning models, combining GPUs, data services, and lifecycle management in one place. This means organizations are not locked out of next-generation workloads simply because they run on private infrastructure.
A good example is what many European governments are facing right now. Imagine a national healthcare system wanting to explore generative AI to improve medical research or diagnostics. Regulatory pressure dictates that sensitive patient data must never leave national borders, let alone be processed in a global public cloud where data residency and sovereignty are unclear. By running AI services directly on top of their private cloud, with Kubernetes as the orchestration layer, they can experiment with new models, train them on local GPU resources, and still keep complete operational control. This setup allows them to comply with AI regulations, maintain full sovereignty, and at the same time benefit from the elasticity and speed of a modern cloud environment. It’s a model that not only protects sovereignty but also accelerates innovation. Innovation at a different pace, but it’s still innovation.
Now, here’s where my personal perspective comes in. I no longer believe that the hyperscalers’ stretch into the private domain – think AWS Outposts, Azure Local, or even dedicated models like Oracle’s Dedicated Region – represents the future of cloud. In continental Europe especially, I see these as exceptions rather than the rule. The reality now is that most organizations here are far more concerned with sovereignty, control, and independence than with consuming a hyperscaler’s entire catalog in a smaller, local flavor.
What I believe will be far more relevant is the rise of private clouds as the foundation of enterprise IT. Combined with a hybrid multi-cloud strategy, this opens the door to what I would call a sovereign hybrid multi-cloud architecture. The idea is simple: sovereign and sensitive workloads live in a private cloud that is under your control, built to allow quick migration and even a cloud-exit if needed. At the same time, non-critical workloads can live comfortably in a public cloud where an intentional lock-in may even make sense, because you benefit from the deep integration and services that hyperscalers do best.
And this is where the “exit” part becomes critical. Picture a regulator suddenly deciding that certain workloads containing citizen data cannot legally remain in a U.S.-owned public cloud. For an organization without a sovereign hybrid strategy, this could mean months of firefighting, emergency projects, and unplanned costs to migrate or even rebuild applications. But for those who have invested in a sovereign private cloud foundation, with portable workloads across virtual machines and containers, this becomes a controlled process. Data and apps can be moved back under national jurisdiction quickly (or to any other destination), without breaking services or putting compliance at risk. It turns a crisis into a manageable transition.
This two-speed model gives you the best of both worlds. Sovereignty where it matters, and scale where it helps. And it puts private cloud platforms like Nutanix NCP in a much more strategic light. They are not just a “mini AWS” or a simplified on-prem extension, but are the anchor that allows enterprises and governments to build an IT architecture with both freedom of choice and long-term resilience.
While public clouds are often seen as environments where control and sovereignty are limited, organizations can now introduce an abstraction and governance layer on top of hyperscaler infrastructure. By running workloads through this layer, whether virtual machines or containers, enterprises gain consistent security controls independent of the underlying public cloud provider, unified operations and management across private and public deployments, and workload portability that avoids deep dependency on hyperscaler-native tools. Most importantly, sovereignty is enhanced, since governance, compliance, and security frameworks remain under the organization’s control.
This architecture essentially transforms the public cloud into an extension of the sovereign environment, rather than a separate silo. It means that even when workloads reside on hyperscaler infrastructure, they can still benefit from enhanced security, governance, and operational consistency, forming the cornerstone of a true sovereign hybrid multi-cloud.
In short, the question is not whether someone like Nutanix can compete with hyperscalers on the number of services. The real question is whether organizations in Europe want to remain fully dependent on global public clouds or if they want the ability to run sovereign, portable workloads under their own control. From what I see, the latter is becoming the priority.
Disclaimer: This article reflects my personal interpretation and understanding of the ideas presented by Jordan B. Peterson in his lecture Introduction to the Idea of God. It is based on my own assumptions, reflections, and independent research, and does not claim to represent any theological or academic authority. My intention is not to discuss religion, but to explore an intellectual and philosophical angle on the concept of sovereignty and how it connects to questions of responsibility, order, and autonomy in the digital age.
The views expressed here are entirely my own and written in the spirit of curiosity, personal development, and open dialogue.
—
For most of modern history, sovereignty was a word reserved for kings, nations, and divine powers. Today, it has migrated into the digital realm. Governments speak of data sovereignty, organisations demand cloud sovereignty, and regulators debate digital autonomy as if it were a new invention. Yet the word itself carries a much deeper heritage.
I have always been drawn to content that challenges me to think deeper. To develop myself, to learn, and to become a more decent human being. That is why I often watch lectures and interviews from thinkers like Jordan B. Peterson. His lecture Introduction to the Idea of God recently caught my attention, not because of its theological angle, but because of how he describes sovereignty. Not as external power, but as an inner principle – the ability to rule responsibly within a domain while remaining subordinate to higher laws and moral order.
That insight provides an unexpected bridge into the digital age. It suggests that sovereignty, whether human or technological, is never just about control. It is about responsible dominion, the ability to govern a domain while remaining answerable to the principles that make such governance legitimate.
When we speak about digital sovereignty, we are therefore not talking about ownership alone. We are talking about whether our digital systems reflect that same hierarchy of responsibility and whether power remains accountable to the principles it claims to serve.
The ancient logic of rule
Long before the language of “data localization” and “cloud independence”, ancient myths wrestled with the same question: who rules, and under what law?
Peterson revisits the Mesopotamian story of Marduk, the god who slays the chaos-monster Tiamat and creates the ordered world from her remains. Marduk becomes king of the gods not because he seizes power, but because he sees clearly (his eyes encircle his head) and speaks truth (his voice creates order). Sovereignty, in this reading, is vision plus articulation, which means that perception and speech transform chaos into structure.
That principle later reappears in the Hebrew conception of God as a lawgiver above kings. Even the ruler is subordinate to the law, and even the sovereign must bow to the principle of sovereignty itself. This inversion, that the highest power is bound by what is highest, became the cornerstone of Western political thought and the invisible moral logic behind constitutional democracies.
When power forgets that it is subordinate, it turns into tyranny. When it forgets its responsibility, chaos returns.
From political sovereignty to digital power
Fast-forward to the 21st century. We have entered a world where data centers replace palaces and algorithms rival parliaments. Our digital infrastructures have become the new seat of sovereignty. Whoever controls data, compute, and AI, controls the rhythm of economies and the tempo of societies.
Yet, as Peterson would put it, the psychological pattern remains the same. The temptation for absolute rule, for technical sovereignty without moral subordination, is as old as humanity. When a cloud provider holds the keys to a nation’s data but owes its allegiance to no higher principle than profit, sovereignty decays into dominance. When a government seeks total control over data flows without respecting personal freedom, sovereignty mutates into surveillance.
Therefore, digital sovereignty cannot be achieved by isolation alone. It requires a principle above control and a framework of responsibility, transparency, and trust. Without that, even the most sovereign infrastructure becomes a gilded cage.
Sovereignty as responsibility, not control
Peterson defines sovereignty as rule under law, not rule above it. In the biblical imagination, even God’s power is bound by his word – the principle that creates and limits at the same time. That is the paradox of legitimate sovereignty: its strength comes from restraint.
Applied to technology, this means that a sovereign cloud is sovereign because it binds itself to principle. It protects data privacy, enforces jurisdictional control, and allows auditability not because regulation demands it, but because integrity does.
The European vision of digital sovereignty, from the EU’s Data Act to Gaia-X, is, at its core, an attempt to recreate that ancient alignment between power and rule. It is a collective recognition that technical autonomy without ethical structure leads nowhere. Sovereign infrastructure is not an end in itself but a container for trust.
The danger of absolute sovereignty
Peterson warns that when sovereignty becomes absolute, when a ruler or system recognises no law above itself, catastrophe follows. He points to 20th-century totalitarian regimes where ideology replaced principle and individuals were crushed under the weight of unbounded power.
Our digital landscape carries similar risks. The hyperscaler model, while enabling global innovation, also consolidates an unprecedented concentration of authority. When three or four companies decide how data is stored, how AI models are trained, and which APIs define interoperability, sovereignty becomes fragile. We exchange the chaos of fragmentation for the order of dependence.
Likewise, when nations pursue “data nationalism” without transparency or interoperability, they may protect sovereignty in name but destroy it in spirit. The closed sovereign cloud is the modern equivalent of the paranoid king. Safe behind walls, but blind to the world beyond them.
True sovereignty, Peterson reminds us, requires both vision and speech. You must see beyond your borders and articulate the principles that guide your rule.
Seeing and speaking truth
In Peterson’s cosmology, the sovereign creates order by confronting chaos directly by seeing clearly and speaking truthfully. That process translates elegantly into the digital domain.
To build responsible digital systems, we must first see where our dependencies lie: in APIs, in hardware supply chains, in foreign jurisdictions, in proprietary formats. Sovereignty begins with visibility, with mapping the landscape of control.
Then comes the speech, the articulation of principle. What do we value? Data integrity? Interoperability? Legal transparency? Without speaking those truths, sovereignty becomes mute. The process of defining digital sovereignty is therefore not only political or technical but also linguistic and moral. It requires the courage to define what the higher principle actually is.
In Europe, this articulation is still evolving. The Gaia-X framework, the Swiss Government Cloud initiative, or France’s “Cloud de Confiance” are all attempts to speak sovereignty into existence by declaring that our infrastructure must reflect our societal values, not merely technical efficiency.
The individual at the root of sovereignty
One of Peterson’s most subtle points is that sovereignty ultimately begins in the individual. You cannot have a sovereign nation composed of individuals who refuse responsibility. Likewise, you cannot have a sovereign digital ecosystem built by actors who treat compliance as theatre and trust as a marketing term.
Every architect, policymaker, and operator holds a fragment of sovereignty. The question is whether they act as tyrants within their domain or as stewards under a higher rule. A cloud platform becomes sovereign when its people behave sovereignly and when they balance autonomy with discipline, ownership with humility.
In that sense, digital sovereignty is a psychological project as much as a technical one. It demands integrity, courage, and self-limitation at every level of the stack.
From divine order to digital order
The story of sovereignty, from the earliest myths to the latest regulations, is the same story told in different languages. It is the human struggle to balance power and principle, freedom and responsibility, chaos and order.
Peterson’s interpretation of sovereignty – the ruler who sees and speaks, the individual who stands upright before the highest good – offers a mirror for our technological age. The digital world is the new frontier of chaos. Our infrastructures are its architecture of order. Whether that order remains free depends on whether we remember the ancient rule: even the sovereign must be subordinate to sovereignty itself.
A sovereign cloud, therefore, is not the digital expression of nationalism, but the continuation of an older and nobler tradition. To govern power with conscience, to build systems that serve rather than dominate, to make technology reflect the values of the societies it empowers.
The true measure of sovereignty, digital or divine, lies not in the scope of control but in the depth of responsibility.
For over a decade, VMware has been the foundation of enterprise IT. Virtualization was almost synonymous with VMware, and entire operating models were built around it. But every era of technology eventually reaches a turning point. With vSphere 8 approaching its End of General Support in October 2027, followed by the End of Technical Guidance in 2029, customers will most likely be asked to commit to VMware Cloud Foundation (VCF) 9.x and beyond.
On paper, this may look like just another upgrade cycle, but in reality, it forces every CIO and IT leader to pause and ask the harder questions: How much control do we still have? How much flexibility remains? And do we have the freedom to define our own future?
Why This Moment Feels Different
Enterprises are not new to change. Platforms evolve, vendors shift focus, and pricing structures come and go. Normally, these transitions are gradual, with plenty of time to adapt.
What feels different today is the depth of dependency on VMware. Many organizations built their entire data center strategy on one assumption: VMware is the safe choice. VMware became the backbone of operations, the standard on which teams, processes, and certifications were built.
CIOs realize the “safe choice” is no longer guaranteed to be the most secure or sustainable. Instead of incremental adjustments, they face fundamental questions: Do we want to double down, or do we want to rebalance our dependencies?
Time Is Shorter Than It Looks
2027 may sound far away, but IT leaders know that large infrastructure decisions take years. A realistic migration journey involves:
Evaluation & Strategy (6 to 12 months) – Assessing alternatives, validating requirements, building a business case.
Proof of Concept & Pilots (6 to 12 months) – Testing technology, ensuring integration, training staff.
Put these together, and the timeline shrinks quickly. The real risk is not the change itself, but running out of time to make that change on your terms.
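A quick runway calculation illustrates how tight the window already is. The sketch below assumes a planning start in October 2025 purely for illustration, and uses only the two phases listed above, leaving whatever remains for the migration itself:

```python
from datetime import date

# Rough runway check for the phases listed above, measured against
# vSphere 8's End of General Support in October 2027. The planning
# start date is an assumption for illustration.
today = date(2025, 10, 1)                      # assumed planning start
end_of_support = date(2027, 10, 1)
months_available = (end_of_support.year - today.year) * 12 \
                   + (end_of_support.month - today.month)

phases = {
    "Evaluation & Strategy": (6, 12),          # (best case, worst case) in months
    "Proof of Concept & Pilots": (6, 12),
}
best_case = sum(lo for lo, _ in phases.values())
worst_case = sum(hi for _, hi in phases.values())

print(f"Runway until End of General Support: {months_available} months")
print(f"Planning phases alone: {best_case} to {worst_case} months")
print(f"Left for the migration itself (worst case): {months_available - worst_case} months")
```

In the worst case, evaluation and piloting alone consume the entire runway before a single workload has moved.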
The Pricing Question You Can’t Ignore
Now imagine this scenario:
The list price for VMware Cloud Foundation today sits around $350 per core per year. Let’s say Broadcom adjusts it by +20%, raising it to $420 this year. Then, two years later, just before your next renewal, it increases again to $500 per core per year.
Would your situation and thoughts change?
For many enterprises, this is not a theoretical question. Cost predictability is part of operational stability. If your infrastructure platform becomes a recurring cost variable, every budgeting cycle turns into a crisis of confidence.
When platforms evolve faster than budgets, even loyal customers start re-evaluating their dependency. The total cost of ownership is no longer just about what you pay for software, but about what it costs to stay locked in.
And this is where strategic foresight matters most: Do you plan your next three years assuming stability, or do you prepare for volatility?
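To make the scenario above concrete, here is a small sketch of how those per-core price steps compound across an estate. The 2,000-core fleet size is an assumption; the per-core prices follow the hypothetical renewal path described in the text:

```python
# Sketch of how the per-core price escalation described above compounds
# for a mid-sized estate. The 2,000-core fleet size is an assumption;
# the per-core list prices follow the scenario in the text.
cores = 2_000                                       # assumed estate size
price_per_core = {2025: 420, 2026: 420, 2027: 500}  # hypothetical renewal path

annual_bills = {year: cores * p for year, p in price_per_core.items()}
for year, bill in annual_bills.items():
    print(f"{year}: ${bill:,}")

baseline = cores * 350                              # today's list price
increase = annual_bills[2027] - baseline
print(f"Annual increase vs. today's list price: ${increase:,} ({increase / baseline:.0%})")
```

Even in this modest scenario, the annual bill grows by over 40 percent within two renewal cycles, with no change in the workloads being run.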
The Crossroads – vSphere 9 or VCF 9
In the short term, many customers will take the most pragmatic route. They upgrade to vSphere 9 to buy time. It’s the logical next step, preserving compatibility and delaying a bigger architectural decision.
But this path comes with an expiration date. Broadcom’s strategic focus is clear: the future of VMware is VCF 9. Over time, standalone vSphere environments will likely receive less development focus and fewer feature innovations. Eventually, organizations will be encouraged, if not forced, to adopt the integrated VCF model, because standalone vSphere and VMware vSphere Foundation (VVF) are going to be more expensive than VMware Cloud Foundation.
For some, this convergence will simplify operations. For others, it will mean even higher costs, reduced flexibility, and tighter coupling with VMware’s lifecycle.
This is the true decision point. Staying on vSphere 9 buys time, but it doesn’t buy independence (think about sovereignty too!). It’s a pause, not a pivot. Sooner or later, every organization will have to decide:
Commit fully to VMware Cloud Foundation and accept the new model, or
Diversify and build flexibility with platforms that maintain open integration and operational control
Preparing for the Next Decade
The next decade will reshape enterprise IT. Whether through AI adoption, sovereign cloud requirements, or sustainability mandates, infrastructure decisions will have long-lasting impact.
The question is not whether VMware remains relevant – it will. The question is whether your organization wants to let VMware’s roadmap dictate your future.
This moment should be viewed not as a threat but as an opportunity. It’s a chance (again) to reassess dependencies, diversify, and secure true autonomy. For CIOs, the VMware shift is less about technology and more about leadership.
Yes, it’s about ensuring that your infrastructure strategy aligns with your long-term vision, not just with a vendor’s plan.
For decades, digital transformation followed a clear logic. Efficiency, scale, and innovation were the ultimate measures of progress. Cloud computing, global supply chains, and open ecosystems seemed to be the rational path forward. The idea of “digital sovereignty”, meaning the control over one’s digital infrastructure, data, and technologies, was often viewed as a political or bureaucratic concern, something secondary to innovation and growth.
But rationality itself is not static. What we perceive as “reasonable” depends on how we define utility and risk. Expected Utility Theory, a cornerstone of classical decision theory, tells us that every decision under uncertainty is guided not by objective outcomes, but by how we value those outcomes and how we fear their alternatives.
If that’s true, then digital sovereignty is not only a political mood swing, but also a recalibration of what societies see as valuable in an age where uncertainty has become the dominant variable.
Expected Utility Theory – Rationality Under Uncertainty
When we speak about rationality, we often assume that it means making the “best” choice. But what defines best depends on how we perceive outcomes and risks. The Expected Utility Theory (EUT), first introduced by Daniel Bernoulli and later refined by John von Neumann and Oskar Morgenstern, offers a framework to understand this.
EUT assumes that rational actors make decisions by comparing the expected utility of different options, not their absolute outcomes. Each possible outcome is weighted by the likelihood of its occurrence and the subjective value, or utility, assigned to it.
In essence, people don’t choose what’s objectively optimal. They choose what feels most secure and beneficial, given how they perceive risk.
What makes this theory profound is that utility is subjective. Humans and institutions value losses and gains asymmetrically. Losing control feels more painful than gaining efficiency feels rewarding. As a result, decisions that appear “irrational” in economic terms can be deeply rational when viewed through the lens of stability and trust.
In a world of increasing uncertainty – geopolitical, technological, and environmental – the weighting in this equation changes. The expected cost of dependency grows, even if the actual probability of disruption remains small. Rational actors adjust by rebalancing their notion of utility. They begin to value autonomy, resilience, and continuity more than convenience.
In this sense, EUT doesn’t just describe individual behavior. It also captures the collective psychology of societies adapting to systemic uncertainty, explaining why the pursuit of digital sovereignty is driven not by emotion, but by a rational redefinition of what it means to be safe, independent, and future-ready.
Redefining Utility in a Fragmented World
Expected Utility Theory states that rational actors make choices based on expected outcomes: the weighted balance of benefits and risks. Yet these weights are subjective. They shift with context, history, and collective experience.
In the early cloud era, societies assigned high utility to efficiency and innovation, while the perceived probability of geopolitical or technological disruption was low. The rational choice, under that model, was integration and interdependence.
Today, the probabilities have changed, or rather, our perception of them. The likelihood of systemic disruption, cascading failures across tightly connected digital and geopolitical systems, no longer feels distant. From chip shortages to data jurisdiction disputes and the limited transparency of AI models, the cost of dependency has become part of the rational calculus.
Sovereignty as Expected Utility
What we call “digital sovereignty” is, at its core, a decision under uncertainty. It asks: Who controls the systems that control us? And what happens if that control shifts?
When the expected utility of autonomy exceeds the expected utility of convenience, societies change course. This is precisely what we are witnessing now. A collective behavioral shift in which resilience, trust, and control carry more weight than frictionless innovation.
Consider the examples:
Geopolitics has made technological dependence visible as a strategic vulnerability.
Cybersecurity incidents have revealed that data sovereignty equals national security.
AI and cloud governance have shown that control over infrastructure is also control over narrative and the flow of knowledge.
Under these pressures, sovereignty is no longer a romantic ideal. It becomes a rational equilibrium point. The point at which perceived risks of dependency outweigh the marginal gains of openness.
From Emotional to Rational Sovereignty
In the past, sovereignty was often treated as nostalgia. A desire for control in a world that had already moved beyond borders. But what if this new form of sovereignty is not regression, but evolution?
Expected Utility Theory reminds us that rational behavior adapts to new distributions of risk. The same actor who once outsourced infrastructure to gain efficiency might now internalize it to regain predictability. The logic hasn’t changed, but the world around it has.
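That flip, the same decision rule producing the opposite choice once the perceived distribution of risk shifts, can be made concrete with a toy calculation. All probabilities and utilities here are hypothetical:

```python
def expected_utility(p_disrupt, gain, loss):
    """EU of a dependent option: gain with prob 1 - p, loss with prob p."""
    return (1 - p_disrupt) * gain + p_disrupt * loss

autonomy_utility = 6.0  # hypothetical certain payoff of internalizing

# Early cloud era: disruption felt unlikely and its cost modest.
early = expected_utility(0.02, 10.0, -30.0)   # 9.2 -> outsourcing wins

# Today: the same actor perceives disruption as likelier and costlier.
today = expected_utility(0.10, 10.0, -60.0)   # 3.0 -> autonomy wins
```

Nothing in the decision rule changed between the two lines; only the perceived probabilities and loss values did. The logic is constant, the world around it is not.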
This means that digital sovereignty is not anti-globalization. It is post-globalization. It recognizes that autonomy is not isolation, but a condition for meaningful participation.
Autonomy as the Foundation of Participation
To participate freely, one must first possess the ability to choose. That is the essence of autonomy. It ensures that participation remains voluntary, ethical, and sustainable. Autonomy allows a nation to cooperate globally while retaining control over its data and infrastructure. It allows a company to use shared platforms without surrendering governance. It allows citizens to engage digitally without becoming subjects of invisible systems.
In this sense, autonomy creates the conditions for trust. Without it, participation turns into a relationship defined by dependence rather than exchange.
Post-globalization, therefore, represents not the end of the global era, but its maturation. It transforms globalization from a process of unchecked integration into one of mutual sovereignty: a world still connected, but built on the principle that every participant retains the right and capability to stand independently.
Sovereignty, in this light, becomes the structural balance between openness and ownership. It is what allows a society, an enterprise, or even an individual to remain part of the global system without being consumed by it.
The Return of Control as Rational Choice
Post-globalization has taught us that the strength of a system no longer lies in how open it is, but in how consciously it manages its connections. The same logic now applies to the digital realm.
Societies are learning to calibrate openness with control. They are rediscovering that stability and agency are not obstacles to innovation, but its foundation.
Expected Utility Theory helps make this visible. As uncertainty rises, the perceived utility of autonomy grows. What once appeared as overcaution now reads as wisdom. What once seemed like costly redundancy now feels like strategic resilience.
Control, long misunderstood as inefficiency, has become a new form of intelligence. It is a way to remain adaptive without becoming dependent. Sovereignty, once framed as resistance, now stands for continuity in an unstable world.
The digital age no longer rewards blind efficiency. It rewards balance, which is the ability to stay independent when the systems around us begin to drift.
My name is Michael Rebmann. I am an Enterprise Account Manager at Nutanix, helping public sector organizations design sovereign and compliant cloud infrastructures. I focus on sovereign cloud, hybrid multi-cloud architectures, and data privacy in regulated industries.
The views and opinions expressed here are entirely my own, reflecting my journey and insights.