Enterprise private cloud and the need for data sovereignty
Insights on how data and cloud can combine to leverage generative AI
As the fourth mover into cloud provision, Oracle has been careful about how it presents its point of differentiation.
[Chart: TAM forecast for public cloud services through 2027]
By focusing on cloud sovereignty, at a time when data privacy and trust are top of mind among countries and enterprises, Oracle is hoping to bridge the gap between public and private cloud.
Jae Evans, Oracle’s global CIO, told attendees at this year’s ‘Shifting Techtonics’ Software Leadership Gathering that she sees this as a big growth driver for the organisation, which has built a general-purpose, distributed cloud in Oracle Cloud Infrastructure.
Specifically, Oracle took the decision to start with a small footprint, cognisant of the growing demand for private cloud solutions and data sovereignty – especially in regions like the EU, where countries do not want customer data held in the public cloud and there is genuine concern over who has access to it.
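To make the sovereignty requirement concrete, a simple guard can refuse any write that would land outside an approved boundary. The sketch below is illustrative only: the client methods, region names and exception type are assumptions, not a real provider SDK.

```python
# A residency guard: refuse to persist data outside approved sovereign
# regions. All client methods here are hypothetical, for illustration only.

ALLOWED_REGIONS = {"eu-frankfurt-1", "eu-paris-1"}  # approved EU boundary


class ResidencyViolation(Exception):
    """Raised when a write would place data outside the approved boundary."""


def write_with_residency_guard(client, bucket: str, key: str, payload: bytes) -> None:
    """Check where the bucket lives before any data is written to it."""
    region = client.get_bucket_region(bucket)  # hypothetical SDK call
    if region not in ALLOWED_REGIONS:
        raise ResidencyViolation(
            f"bucket {bucket!r} resides in {region!r}, outside the EU boundary"
        )
    client.put_object(bucket, key, payload)  # hypothetical SDK call
```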
Overall, the TAM for public cloud services is forecast at $1.2 trillion through 2027, with Evans noting that public cloud spend is set to exceed non-cloud spend this year. In addition, the global GPU-as-a-service (GPUaaS) market is forecast to grow from $4 billion to almost $50 billion over the next four years as private AI solutions start to scale. Data privacy concerns, driven by regulators and government agencies, are part of the reason enterprises have been slower to adopt the public cloud and its associated AI services.
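For context, growing from $4 billion to roughly $50 billion in four years implies a compound annual growth rate of around 88%, as a quick calculation shows:

```python
# Implied compound annual growth rate for GPUaaS, from $4bn to ~$50bn in 4 years.
start, end, years = 4.0, 50.0, 4
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.0%}")  # roughly 88% per year
```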
From the consumer’s lens, Evans explained that there are three key considerations for CTOs and CIOs when looking to put their business operations in the hands of cloud providers: 1) table-stakes requirements such as data security and robustness; 2) a broad set of technology-enabled services to leverage; and 3) who has access to my data, and how is it shared?
Mission-critical data
Much of the growing concern among enterprises centres on data sovereignty. Where does my data reside? Who has access to it? And who can actually access it from an operator standpoint? When it comes to trust, the bar is rising ever higher.
As Evans put it: “What are the things that CIOs and CTOs expect when they’re putting their businesses in the hands of cloud providers or AI service providers? Some of those things are table stakes…it’s got to be secure, it’s got to be highly available and robust, and provide disaster recovery and resiliency.”
Ultimately, this means knowing that cloud providers are safeguarding data by having the right policies in place, so that enterprises can take comfort in leveraging emergent generative AI tools and applications to create value for their end customers.
This carries even greater weight when one considers that, while enterprises want to use the cloud, many are struggling with workforce capabilities. A recent Gartner study found that 45% of executives don’t know how to provide the right level of estimates for, or demonstrate the value created by, generative AI technology.
That creates an immediate opportunity for cloud providers to help enterprises incorporate and implement AI into their operational environments.
As Evans explained:
“This is a big theme and a big reason why we see that adoption of the public cloud isn’t happening fast enough.
“We are now seeing tremendous interest in private AI. The reasons for that are two or threefold. One, in these large regulated industries there is a significant amount of data on prem that people don’t want to take over to the public clouds as yet; or public language model providers, to be more specific. Secondly, a lot of the high-value applications are still internal, employee-facing. Thirdly, there is no way to formally verify correctness of an application of a language model.”
The data centre power market is expected to reach $33 billion in the next few years.
More data centres, more power consumption, more chips and more hardware will be needed, to say nothing of the software applications.
Oracle has built a data centre that could house five Boeing 777s nose to tail. Essentially one giant supercomputer, the facility puts a heavy focus on using energy efficiently and sustainably, with Evans noting: “we are constantly leveraging software and hardware to compress and compact as much as possible to support something of this size.”
From a security perspective, cloud providers have to make their proposition easy to use and easy to onboard for the customer, yet at the same time prescriptive enough that customers get the right level of data protection for their specific needs.
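One way to square ease of onboarding with prescriptiveness is to make protections opt-out rather than opt-in: the tenant supplies almost nothing and strong defaults do the rest. The sketch below illustrates the idea; the class and its field names are assumptions for illustration, not any provider’s actual configuration schema.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class TenantDataProtection:
    """Secure-by-default protection profile; every field name is illustrative."""
    home_region: str                                # the only required input
    encryption_at_rest: bool = True                 # prescriptive default
    encryption_in_transit: bool = True              # prescriptive default
    customer_managed_keys: bool = True              # tenant, not operator, holds keys
    cross_region_replication: bool = False          # data stays in the home region
    operator_access_requires_approval: bool = True  # no silent operator reads


# Onboarding is one line; weakening any protection is a deliberate override.
profile = TenantDataProtection(home_region="eu-frankfurt-1")
```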
Part of this evolution will therefore require multi-layer security as more generative AI middleware gets built and integrated into the cloud architecture. Doing so will give enterprises confidence that the right controls are in place to keep their data sovereign. Moreover, as cloud providers like Oracle build proprietary LLMs, companies will be able to subscribe to them, rather than build them internally, and in turn build their own customised generative AI applications. This is where cloud providers will ultimately bring scale to enterprises as the generative AI paradigm becomes more dominant.
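As a rough illustration of what one such middleware layer might look like, the sketch below redacts obvious identifiers before a prompt leaves the tenant boundary for a subscribed, provider-hosted LLM. The redaction patterns and the call_subscribed_llm hook are assumptions for illustration, not Oracle’s middleware.

```python
import re

# One middleware layer: strip obvious identifiers before the prompt leaves
# the tenant boundary. Patterns are deliberately simple and illustrative.
REDACTIONS = [
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"), "<EMAIL>"),
    (re.compile(r"\b\d{4}[ -]?\d{4}[ -]?\d{4}[ -]?\d{4}\b"), "<CARD>"),
]


def redact(prompt: str) -> str:
    """Replace personally identifying tokens with placeholders."""
    for pattern, placeholder in REDACTIONS:
        prompt = pattern.sub(placeholder, prompt)
    return prompt


def ask(prompt: str, call_subscribed_llm) -> str:
    """Route a sanitised prompt to the provider-hosted model (hypothetical hook)."""
    return call_subscribed_llm(redact(prompt))
```

In a real deployment, a layer like this would sit alongside encryption, key management and access controls rather than replacing them.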