The Oracle Thesis

The Compounding Stack

Oracle’s Database and Applications, and How One Customer Compounds Into Three Revenue Lines

I. The Bridge: Ten Gigawatts Is Only the First Layer

The prior thesis established that Oracle’s AI infrastructure business has secured ten gigawatts of contracted capacity and is growing 243% year over year, on a forecast that had already moved up $22B inside thirty-seven days last fall. That story, by itself, is the position.

10 GW
Contracted OCI Capacity
243%
OCI Year-over-Year Growth
$22B
Forecast Revision in 37 Days

But it is incomplete. Underneath the infrastructure buildout, two more Oracle businesses — the database and the application suite — are being transformed by the same forces. They run on the same stack. They serve the same customers. And every dollar Oracle earns at one layer creates additional consumption at the layers underneath.

II. Bringing the Model to the Data

Start with a question. If you wanted to apply a frontier AI model to your company’s most valuable data — the customer ledger, the contracts, the claim book, the clinical history — where would you run it?

The frontier model is trained on the public internet; it has never seen any of that data. And the data itself sits inside a system you have spent decades hardening against unauthorized access. Sending the data to the model is a non-starter. Most enterprises will not do it. Many cannot.

The data does not move. The model does.

That is the problem Oracle solved. The Oracle Database has been the enterprise default for forty years — the system of record underneath core banking ledgers, clinical records, insurance claims, airline reservations, telecom billing, manufacturing ERP. The data that runs the modern enterprise lives there. Oracle rebuilt that database to vectorize its contents — to translate them into the mathematical form AI models reason over — and to keep them inside the same security perimeter that has protected them for decades. The model brings the reasoning. The database brings the data and the boundary.

Then Oracle extended it with the AI Data Platform: the same vectorization layer, applied to the customer’s data wherever it lives — Oracle databases, non-Oracle databases, object storage, custom apps, legacy systems — and reachable by every top-tier model on the market.

Exhibit 1. The AI Data Platform — reasoning inside the customer perimeter
Enterprise data is vectorized in place; frontier models reach in to reason without the data leaving the boundary.

The architecture in the diagram is straightforward, and the value proposition follows directly from it. The customer’s data flows in from every system it already has. It is vectorized inside the perimeter the customer already trusts. Frontier models reach in and reason against it. Nothing about the customer’s data leaves. Nothing about the customer’s existing security posture changes. What the customer gets is reasoning over their own private operational truth — exactly the place AI generates value no public model can replicate.
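Mechanically, the pattern described here is vector retrieval run inside the data boundary: records are embedded where they live, and only the handful of relevant snippets ever reach the model. The following is a minimal, self-contained sketch of that retrieval step, using a toy hash-based embedding as a stand-in for a real embedding model; no Oracle APIs are assumed, and all record contents are invented for illustration.

```python
import hashlib
import math

def embed(text: str, dim: int = 64) -> list[float]:
    """Toy stand-in for a real embedding model: hash each word into a
    fixed-size vector. Production systems use learned embeddings."""
    vec = [0.0] * dim
    for word in text.lower().split():
        h = int(hashlib.md5(word.encode()).hexdigest(), 16)
        vec[h % dim] += 1.0
    norm = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]

def cosine(a: list[float], b: list[float]) -> float:
    # Vectors are already unit-normalized, so the dot product is the cosine.
    return sum(x * y for x, y in zip(a, b))

# Operational records stay inside the perimeter; vectors are indexed alongside.
records = [
    "invoice 1042 approved by treasury, net-30 terms",
    "claim 7781 denied pending clinical review",
    "supplier Acme flagged for repeated late shipment",
]
index = [(embed(r), r) for r in records]

def retrieve(question: str, k: int = 1) -> list[str]:
    """Return the k most relevant records. In the pattern described above,
    only these retrieved snippets reach the model; the corpus never leaves."""
    q = embed(question)
    ranked = sorted(index, key=lambda iv: cosine(q, iv[0]), reverse=True)
    return [r for _, r in ranked[:k]]

print(retrieve("which supplier has shipment problems?"))
```

The design point the sketch makes is the one in the text: the model sees a few retrieved sentences, never the underlying system of record, so the existing security posture is unchanged.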

III. Meeting the Customer Where Their Cloud Already Lives

The next question is where the customer’s data actually sits. For most large enterprises running Oracle Database — which is most large enterprises — the database sits on-premise. Those customers had long since committed to a major public cloud for the rest of their workloads, most often AWS. But the database itself stayed on-prem, because the only cloud path for Oracle Database was OCI, and migrating an entire database estate to a cloud they had not otherwise standardized on was not a trade most CIOs were willing to make. The result was a sub-optimal but stable equilibrium: applications in the public cloud, database on-prem.

The arrival of enterprise AI breaks that equilibrium. The AI workloads that matter most reason against the operational data inside the database, and they want to do so at cloud speed and cloud scale — conditions an on-prem footprint cannot deliver. We treat the specifics later in this thesis. The point here is structural: the configuration most large Oracle customers were sitting on at the start of 2023 was about to become a barrier to the most strategically important workloads they would run in the next decade.

Oracle saw this earlier than most. Rather than wait for customers to migrate their entire cloud footprints to OCI in pursuit of an AI-ready cloud database — which most would never do — Oracle removed the friction. Starting with Microsoft, then Google, then AWS, Oracle embedded Oracle Database directly inside each partner cloud as a first-class service. The customer keeps their existing cloud. The Oracle Database now runs there, in the cloud, with full AI capabilities, alongside the rest of the customer’s workloads.

It would be easy to read this as Oracle capitulating — surrendering OCI’s status as the destination for Oracle Database. That reading misses the strategic logic. Oracle is not giving up a position; it is repositioning for a much larger one. The on-prem-to-OCI migration was a sizeable opportunity. The on-prem-to-AI-cloud-database transition, with Oracle Database available wherever the customer already operates, is a substantially larger one — because the AI-era workload is far bigger than the workload that lived on-prem, and because the friction that kept customers from putting Oracle Database in the cloud has been removed. Oracle traded an exclusivity it could not enforce for a footprint that captures the AI workload wherever it lands.

Exhibit 2. Multicloud Database regions live — three rival clouds, one database
Microsoft (33 regions) is the most mature partnership; Google (14) is about where Microsoft sat a year ago; AWS moved from 2 to 8 regions in a single quarter.

The execution data tracks the rollout directly. Microsoft is the most mature partnership and runs the largest footprint at 33 regions live. Google, the second partner, is at 14 — roughly where Microsoft sat one year ago. AWS, the most recent and previously most resistant partner, expanded from 2 regions to 8 in a single quarter and is on track for 22 by fiscal year end. Three rival clouds now host the Oracle Database as a first-class service. None of this configuration existed three years ago.

1,529%
Multicloud DB Revenue Growth — Q1 FY26
817%
Multicloud DB Revenue Growth — Q2 FY26
531%
Multicloud DB Revenue Growth — Q3 FY26

IV. Where the Agents Belong — and How Fast They Are Arriving

If the AI Database is the layer where models reason on enterprise data, the application suite is the layer where agents act on it. The two are built as one. Applications generate the data. The database vectorizes it. Agents do the work.

The question every enterprise is now asking is the same. Where should our AI agents live?

The market has not fully worked through the answer. Agents need to act on operational data — orders being placed, invoices being approved, candidates being interviewed, parts being shipped, patients being seen. They need to act with the same security, audit trail, and reliability as the human workflows they are replacing. And they need to act inside the business processes the data already lives in, not bolted on alongside them.

That points to one place at enterprise scale: the operational application suite. Fusion ERP holds the financial truth. Fusion HCM holds the workforce. Fusion SCM holds the supply chain. Fusion CX holds the customer relationship. The industry suites — banking, healthcare, retail, hospitality, telecom — hold the operational core of entire verticals.

This is the first source of what we will call agent gravity, and it is the most visible one. There are two others, and they compound on top.

The second is proprietary knowledge. An agent embedded inside a company’s system of record does not stay generic. It learns the exceptions, the approval patterns, the supplier idiosyncrasies, the customer quirks — institutional memory no foundation model can ship with. That memory accrues to the agent in place. It does not port cleanly to a competing runtime, and the longer the agent runs inside Fusion, the more specific to that enterprise it becomes.

The third is cross-customer pattern recognition. A vendor running the operational backbone of thousands of enterprises sees something no individual enterprise can: what actually works. Which payment terms reduce DSO. Which staffing patterns lower turnover. Which inventory policies minimize stockouts in which industries. Agents deployed inside the suite can ship with that distilled operating knowledge encoded in their defaults — not as benchmarks in a deck, but as the way the agent behaves out of the box. A standalone agent platform cannot offer this. It does not see the data.

Stacked together, the three sources of agent gravity point to the same conclusion. Agents need to live where the operational data lives, where institutional memory accrues, and where cross-customer patterns are visible. Not on a standalone agent platform. Not bolted onto a chatbot. Inside the system of record.

“Data gravity matters here. Mission-critical data gravity matters even more.” Mike Sicilia, Co-CEO of Oracle — Q3 FY26 earnings call

For an enterprise building agents — or commissioning a system integrator to build them — the practical question is where to start. Sicilia’s answer is to start inside the system of record, because that is where the operational data with the highest relevance and specificity for any AI workflow already lives. The reasoning layer is only as useful as the operational data it can reach. Very few vendors in the world combine a database holding the mission-critical data of the world’s largest enterprises with an application suite running their core operations at scale. Of those that do, none are moving with Oracle’s urgency.

The pace of execution is the second piece of the picture. In October 2024, at Oracle CloudWorld, the company committed to delivering 100 AI agents inside Fusion. By the October 2025 Financial Analyst Meeting — twelve months later — the count was over 600. By the Q3 FY26 earnings call in March 2026, Fusion alone had crossed 1,000 agents, and the banking suite contained hundreds more on top of that.

Exhibit 3. Fusion AI agent count — from 100 to 1,000-plus in eighteen months
October 2024 commitment: 100 agents. October 2025: 600+. March 2026: 1,000+ in Fusion alone, with hundreds more in the banking suite.

The chart highlights three observations. The first is that Oracle did not merely hit the commitment — it overshot the original 100-agent target by roughly 6× in the same twelve-month window, and then nearly doubled the count again in the five months that followed. The cadence is accelerating, not decaying.

The second is that the 1,000-plus figure is Fusion alone. The banking suite — one of dozens of industry suites — contains hundreds more, and so do healthcare, retail, hospitality, and telecom. The total agent count across Oracle’s full application portfolio is materially larger than any single figure Oracle has yet published.

The third — and the one with the longest implications — is what is not on the chart. AI Agent Studio, shipped inside Fusion, lets customers build their own agents on top of Oracle’s. The Agent Marketplace, launched at the October 2025 Financial Analyst Meeting, lets partners contribute additional agents to the ecosystem; twenty-four are already signed on. Each customer-built agent and each partner-built agent runs on Oracle’s stack and draws on the same operational data and the same AI Database alongside Oracle’s shipped agents.

That is where the entrenchment dynamic begins. Each new agent on the platform makes the platform more attractive for the next agent. A customer that builds an agent inside Fusion has a strong reason to bring adjacent business processes onto Oracle, because every adjacent process expands the data and context available to the agent it has already invested in. A partner that ships an agent into the Marketplace brings its own customer relationships and use cases with it, which in turn pull more of those customers into Oracle’s orbit. More agents create more gravity. More gravity attracts more agents. And the position Oracle is building inside the system of record gets harder to dislodge with each turn of that loop.

The number on the chart is what Oracle has shipped. The installed count of agents running on Oracle infrastructure is materially higher, and growing every quarter.

Exhibit 4. The agent ecosystem — what the shipped count does not capture
AI Agent Studio and the Agent Marketplace extend the installed count with customer-built and partner-built agents running on the same stack.

V. What the Customer Is Actually Buying

The agent count is only meaningful insofar as it translates into measurable value for the customers running them. The disclosed evidence indicates that it does, and the value shows up in the units operators care about: patients seen, books closed, tickets resolved, dollars saved. Four examples Oracle has shared make the point concretely.

274
Clinical AI Agent Customers Live, Q2 FY26
50%
Time-Back ROI Within 3 Weeks of Go-Live
2,000+
Fusion/Industry Go-Lives in Q3 FY26 Alone

In healthcare, the AI-powered ambulatory EHR — Oracle’s electronic health record system, rebuilt from the ground up with AI inside — went live in market in fiscal 2026. The clinical AI agent has 274 customers live in production as of Q2 FY26, and the count is rising daily. The reported outcomes from one health system: 50% return on time spent with the system, within three weeks of go-live. Translated: clinicians saw materially more patients per hour, with less administrative friction, while reducing the documentation burden that drives physician burnout. The customer’s payment to Oracle bought the customer back time — the scarcest resource in healthcare delivery.

In finance, Oracle’s own internal team uses Fusion’s ledger agent and payment agent to close the books faster than any other company in the S&P 500. That same capability now ships to every Fusion ERP customer. A finance organization that previously took two weeks to close the quarter can now do it in days, with the same headcount and lower error rates. Faster close means faster reporting, which means faster decisions.

In support, agents handle Tier 1 ticket deflection on the front end and assist human agents with complex tickets on the back end. Oracle’s own deployment shows the result: faster time to resolution, fewer human interventions per ticket, higher first-contact resolution rates, higher customer satisfaction scores. The customer pays for support that scales without scaling headcount.

In banking, the embedded suite spans commercial banking, retail banking, anti-money laundering, financial crimes and compliance, payments, supply chain financing — hundreds of agents covering the operational core of a bank. The agents do not replace the bankers. They take over the high-volume, repetitive cognitive work that previously consumed analyst hours, and redirect that capacity to the work that requires judgment.

The pattern is the same across every example. Agents take the high-volume cognitive work — the work that scales linearly with transaction volume — and convert it from a labor cost into a software cost. The customer’s labor is freed for the work that actually requires a human. The agents do not eliminate workforce. They eliminate the bottleneck that prevented the workforce from doing higher-leverage work.

The revenue lines carry the early signal of that ripple effect. Cloud applications grew 11% in Q3 FY26, taking the segment to a $16.1B annualized run rate, with Fusion ERP up 14%, SCM up 15%, HCM up 15%, and industry SaaS up 19%. Cloud applications deferred revenue grew 14% — outpacing in-quarter revenue, which is the structural signal that the trajectory is accelerating rather than mean-reverting. Two thousand customers went live on Fusion or one of the industry suites in Q3 alone, and median time-to-live continues to decrease.

Read together, these are not the numbers of a software business under threat from AI. They are the numbers of a software business at the early stage of an AI-driven inflection — the ripple effect of embedding agents into a forty-year application franchise showing up first in deferred revenue, then in in-quarter revenue, and then across the time-to-live curve.

Exhibit 5. Cloud applications — revenue and go-live acceleration
Cloud applications at a $16.1B run rate; Fusion ERP +14%, SCM +15%, HCM +15%, industry SaaS +19%; 2,000+ go-lives in Q3 FY26 alone.

VI. Free at the Surface, Paid at Depth

Oracle’s pricing decision on the agents looks, on first read, like a giveaway. The 1,000-plus agents inside Fusion ship at no additional cost. They arrive bundled into the standard application subscription, on the standard quarterly release cadence, with the standard security patching. AI Agent Studio — the tool customers use to build their own agents on top of Oracle’s — ships on the same terms. Any customer already paying for Fusion receives the entire set as part of the existing subscription.

Other enterprise software vendors are charging premium prices for their AI features. Oracle is doing the opposite.

The strategic logic becomes visible only on the obvious follow-up: if Oracle is not charging for the AI features, where does Oracle make the money?

The answer is one layer below the surface. Every agent the customer runs generates inference; every inference query hits the AI Database; every AI Database query consumes compute on OCI. Oracle is paid on consumption — at the database layer and at the infrastructure layer — for every agent the customer adopts and every workflow the customer routes through it.

The structure of the business changes accordingly. The customer no longer faces a friction point at the agent level — no price tag, no procurement cycle, no license negotiation. The agents are simply turned on. The more agents the customer uses, the more workflows they automate, the more inference they consume, and the more Oracle earns at the layers underneath.

The model is free at the surface and paid on the volume. Adoption is maximized; consumption is maximized along with it.
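The difference between the two structures can be made concrete with a toy revenue model. Every figure below — the SKU price, query volumes, and per-query rates — is a hypothetical assumption chosen for illustration, not Oracle pricing.

```python
# Illustrative-only comparison of the two pricing structures described above.
# All dollar figures and rates are invented assumptions, not Oracle's terms.

def premium_sku_revenue(agents_adopted: int, sku_price: float = 50_000.0) -> float:
    """Traditional model: each AI feature is a paid SKU. Revenue scales with
    SKUs sold, but every SKU is a procurement event that slows adoption."""
    return agents_adopted * sku_price

def consumption_revenue(agents_adopted: int,
                        queries_per_agent: int = 10_000_000,
                        db_rate: float = 0.002,      # $ per query, database layer
                        compute_rate: float = 0.003  # $ per query, compute layer
                        ) -> float:
    """The model described in the text: agents are free at the surface, and
    revenue accrues per query at the database and infrastructure layers."""
    queries = agents_adopted * queries_per_agent
    return queries * (db_rate + compute_rate)

# Free agents remove the procurement friction, so assume adoption runs
# far higher under the consumption model.
print(premium_sku_revenue(agents_adopted=5))
print(consumption_revenue(agents_adopted=40))
```

Under these assumed rates, a handful of paid SKUs yields $250,000, while forty freely adopted agents yield roughly $2,000,000 in metered consumption underneath — the shape of the trade the text describes, though the magnitudes are entirely illustrative.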

Exhibit 6. Traditional pricing vs. Oracle’s consumption model
Premium-SKU pricing creates a procurement event at every feature; consumption pricing removes the decision and captures the volume underneath.

The contrast across the two columns makes the strategic logic explicit. The traditional vendor charges a premium at the surface and accepts the consequence — slow adoption, narrow workflows, capped revenue per customer. Oracle declines the premium at the surface and captures the consequence in the layers below — faster adoption, broader workflows, and consumption that compounds as the customer scales.

That distinction is what makes the pricing decision strategic rather than generous. Oracle is not giving away revenue; it is removing every reason the customer might hesitate to adopt — because the more agents the customer adopts, the more inference Oracle sells underneath. The customer experiences the trade favorably because the surface they touch is free. Oracle is paid on the volume that surface generates.

Free at the top of the stack and paid at the bottom. What appears to be a giveaway is the channel that allows the bottom of the stack to scale.

VII. One Stack, Three P&Ls

Stack the three layers together and the structural picture resolves. The infrastructure work established that Oracle is being paid to build the AI substrate for the frontier labs. The database work brought the AI Database to the customer’s data, wherever that data lives. And the application work has begun shipping the agents that consume the database, embedded inside the applications the enterprise already runs.

What is true at each of those layers is also true between them: Oracle owns the layer above and the layer below. Fusion runs on the AI Database. The AI Database runs on OCI. OCI is the same infrastructure Oracle sells to OpenAI, Meta, and xAI. There is no third party between the application the customer touches and the GPU the inference runs on. Oracle owns the stack end to end.

This is not synergy in the marketing sense. It is vertical integration in the financial sense — and the financial consequence is what makes Oracle’s position different from any other enterprise software vendor.

Take a single Fortune 500 enterprise renewing its Fusion ERP subscription. The customer turns on the embedded AI agents — free, included in the standard subscription. Those agents pull operational data through the AI Database to reason on it. The AI Database runs queries against the customer’s data, vectorized inside the perimeter. The vectorization, retrieval, and inference all hit OCI compute underneath.

The single customer engagement now generates revenue at three layers: the Fusion subscription itself, the database service consumed by the agents, and the OCI compute consumed by the database. One customer at the surface; three revenue streams underneath.
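As a hypothetical worked example — every dollar figure here is an illustrative assumption, not disclosed Oracle data — the three-layer capture looks like this:

```python
# Hypothetical worked example of the three-layer capture described above.
# All figures are invented for illustration, not disclosed Oracle data.

def customer_revenue(agents_on: int,
                     fusion_subscription: float = 2_000_000.0,  # annual, fixed
                     db_spend_per_agent: float = 60_000.0,      # AI Database consumption
                     oci_spend_per_agent: float = 90_000.0):    # compute underneath
    """One customer engagement, three revenue lines: the subscription is the
    only layer the customer explicitly buys; the other two scale with agents."""
    return {
        "application": fusion_subscription,
        "database": agents_on * db_spend_per_agent,
        "infrastructure": agents_on * oci_spend_per_agent,
    }

before = customer_revenue(agents_on=10)
after = customer_revenue(agents_on=30)  # customer turns on more free agents
print(sum(before.values()), sum(after.values()))
```

The structural point the sketch illustrates: the application line is flat, but the database and infrastructure lines scale with every additional agent the customer turns on, so total revenue per customer grows even though the bill at the surface does not.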

Exhibit 7. One customer dollar, three Oracle revenue lines
The customer sees one product at the top of the stack. Oracle captures revenue at the application, database, and infrastructure layers simultaneously.

The diagram makes the asymmetry visible. The customer sees one product and pays one bill at the top. Oracle captures revenue at three layers. And every increment of customer adoption — every new agent turned on, every new workflow automated, every additional piece of data vectorized — multiplies the consumption at the layers underneath.

VIII. A Forced Function: The Migration Becomes Non-Optional

Set Oracle aside for a moment and consider what is happening to the customer. For the entire pre-AI cloud era, enterprises migrated workloads from on-premise to the cloud at a measured pace — set by IT modernization budgets, application refresh cycles, and the slow politics of platform decisions. Oracle benefited from that pace; moving a customer to the cloud delivers a several-fold annual revenue lift over the equivalent on-premise support contract. But the migration was steady, not accelerated.

AI has changed the customer’s incentive structure entirely, and the scale of the change is now measurable.

Exhibit 8. NTT DATA enterprise survey — AI demand is outrunning cloud investment
99% say AI has increased their need for cloud; 88% say current investment puts AI initiatives at risk; 84% report flat cloud spend; only 14% describe themselves as “cloud-evolved.”

The implication is the part that should focus the reader. The survey provides an unusually direct view into how the enterprise itself reads the situation: its current cloud footprint cannot support the AI strategy it has already committed to. That insight sets the next several years of enterprise IT on a forced trajectory — a large-scale re-platforming of operational data and core workloads onto cloud infrastructure capable of running AI at production scale. What we have been observing — Oracle’s transition from license to subscription, the multicloud database printing 531% growth in Q3 FY26 on a meaningfully larger base, 2,000-plus Fusion go-lives in a single quarter — is the early response. It is the segment of the market that has already started to move. The 84% of enterprises with flat cloud spend have not started yet, but a survey of their peers now tells them that the cost of waiting is the cost of the AI strategy itself.

Oracle is positioned for what comes next. The destination exists in OCI. The Multicloud Database removes the friction of getting Oracle’s data layer into whichever public cloud the customer already runs. The AI Database is the reason the migration is worth doing. The Fusion application suite, with its thousand-plus embedded agents, is where AI does the work inside the customer’s existing operations. Each piece was built before the wave; each is now positioned to absorb it as the rest of the enterprise base begins to move.

IX. The Forecast Is the Floor

At the October 2025 Financial Analyst Meeting, Oracle published a long-range plan for the AI Database and AI Data Platform. The figures are stated explicitly, attached to specific fiscal years, and now part of the company’s formal disclosure record.

Exhibit 9. AI Database & AI Data Platform forecast — the $20B floor
Oracle’s published FY30 target explicitly excludes inference revenue. The ramp from $4.3B in FY27 to $20B in FY30 assumes adoption the installed base has not yet begun.

Oracle published a $20B AI Database forecast — and explicitly excluded the largest derivative revenue stream the platform is designed to generate. The customer pays for the database service. The customer turns on the AI features (free). The customer’s agents and queries hit OCI compute. Every reasoning operation against the customer’s data generates inference revenue.

“Everyone is going to want to do reasoning on top of their data. I don’t know who’s not going to do that.” Larry Ellison

That inference revenue sits on top of the $20B. Not inside it.

The point worth flagging about the $20B endpoint is what is not in it. The AI Database is the channel through which an enterprise applies any frontier model to its operational data — the ledger, the contract archive, the claim book, the clinical history — and the inference generated against that private data is plausibly larger than what the same enterprise generates from public-facing AI use cases. Multiply that by Oracle’s database installed base, which Ellison frames in the millions of customers, and the inference number plausibly dominates the database service number it sits under. The published plan does not include any of that. By exclusion, management is signaling it does not yet feel comfortable putting a figure on inference revenue.