- Thesis IV made the case for the infrastructure layer on its own merits: 10 GW of contracted OCI capacity, 243% year-over-year growth, $553B in remaining performance obligations, and a $22B upward revision to the long-range plan inside thirty-seven days.
- Two adjacent businesses sit directly on top of that infrastructure. The AI Database and AI Data Platform run as services on OCI; the Fusion application suite and industry suites run on the AI Database. Both layers serve the same customer base as the infrastructure layer.
- The cross-layer linkage is architectural, not aspirational: Fusion runs on the AI Database, the AI Database runs on OCI, and OCI is the same infrastructure Oracle sells to OpenAI, Anthropic, Meta, and xAI. There is no third party between the application the customer touches and the GPU the inference runs on.
- Oracle’s October 2025 long-range plan projects all three lines growing in parallel: infrastructure FY26 ~$18B → FY30 $166B, AI Database and AI Data Platform ~$3B → a $20B floor, applications growing mid-teens on a $16.1B base.
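A back-of-envelope check on what those endpoints imply. This is a sketch, not Oracle's disclosure: the dollar figures are the plan's published endpoints, and the four-fiscal-year compounding window between FY26 and FY30 is an assumption of the arithmetic.

```python
# Compound annual growth rate implied by two revenue endpoints.
# Endpoints are from Oracle's October 2025 long-range plan; the
# four-year compounding window is an assumption of this sketch.

def implied_cagr(start: float, end: float, years: int) -> float:
    """CAGR implied by growing from `start` to `end` over `years` years."""
    return (end / start) ** (1 / years) - 1

# Infrastructure (OCI): ~$18B -> $166B over four fiscal years.
infra = implied_cagr(18, 166, 4)
# AI Database and AI Data Platform: ~$3B -> $20B floor, same window.
db = implied_cagr(3, 20, 4)

print(f"implied infrastructure CAGR ~{infra:.0%}")  # roughly 74%
print(f"implied database CAGR ~{db:.0%}")           # roughly 61%
```

Both implied rates sit far above any organic enterprise-software growth rate, which is the point of reading the plan as one integrated stack rather than three peer-benchmarked segments.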
The natural framing treats Oracle as three separate businesses, each examined against its own peer set. That framing misses the linkage. Because Fusion runs on the AI Database and the AI Database runs on OCI, a single customer engagement generates revenue for Oracle at three layers simultaneously — a property of one integrated business, not three independent ones. Examining the segments in isolation systematically understates the whole.
- Regulated industries — banking, healthcare, insurance, telecom, defense — cannot, by compliance obligation, export operational data outside the existing security boundary.
- Oracle Database is the system of record underneath a meaningful share of those industries’ operational data: core banking ledgers, clinical records, insurance claims, airline reservations, telecom billing, manufacturing ERP.
- Oracle rebuilt the database to vectorize its contents in place — translating them into the mathematical form frontier models reason over — without the data leaving the perimeter.
- Oracle moved early: vector search shipped inside the production database in 2024, with the AI Database and AI Data Platform following in 2025 — ahead of most of the market on a problem the market had not yet defined.
- Larry Ellison: “Everyone is going to want to do reasoning on top of their data. I don’t know who’s not going to do that.”
Two facts produce a structural consequence. The highest-value AI reasoning is reasoning against an enterprise’s own private operational data — context no public model can reproduce. Yet compliance and security make exporting that data infeasible at enterprise scale. The architectural answer is therefore inescapable: the reasoning has to happen where the data already sits, and Oracle already owns that substrate.
- Multicloud Database revenue growth: 1,529% (Q1 FY26), 817% (Q2 FY26), 531% (Q3 FY26). Decelerating growth rates reflect a scaling base; absolute dollars are still accelerating.
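The deceleration claim is worth making concrete. The sketch below uses hypothetical year-ago base figures (Oracle does not disclose the multicloud database revenue base); only the three reported growth rates are from the source. It shows how a year-over-year rate can fall quarter to quarter while the absolute dollar gain keeps rising, because each successive rate applies to a larger year-ago base.

```python
# Reported YoY growth rates, expressed as multiples of the year-ago base:
# 1,529% -> 15.29x added, 817% -> 8.17x, 531% -> 5.31x.
yoy_rates = [15.29, 8.17, 5.31]

# Hypothetical year-ago quarterly bases in $M (illustrative only; the
# actual base is undisclosed, but it grows as the business scales).
year_ago_base = [10, 25, 60]

# Absolute dollar gain each quarter = year-ago base x growth rate.
gains = [base * rate for base, rate in zip(year_ago_base, yoy_rates)]
print(gains)

# The percentage decelerates while the dollar gain accelerates.
assert yoy_rates[0] > yoy_rates[1] > yoy_rates[2]
assert gains[0] < gains[1] < gains[2]
```

Under these assumed bases the quarterly dollar gains rise even as the headline rate falls by two thirds, which is the pattern the bullet describes.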
- Live multicloud regions by partner: Microsoft 33, Google 14, AWS 2 → 8 in a single quarter and on track for 22 by fiscal year end.
- Oracle Database customers historically kept the database on-prem while running other workloads in a major public cloud. Before multicloud, the only cloud path for the database was OCI, a trade most CIOs declined.
- The on-prem Oracle Database installed base supports more than a decade of migration spend; only a small fraction has reached the cloud line so far.
- Enterprise AI breaks the prior on-prem equilibrium: AI workloads need to reason on operational data at cloud speed and cloud scale — conditions an on-prem footprint cannot deliver.
The conventional read is that Oracle gave up on OCI as the destination for the database. The strategic logic is the opposite: Oracle traded a sizeable on-prem-to-OCI migration for a substantially larger on-prem-to-AI-cloud-database one, available wherever the customer already operates. The friction is gone, the AI-era workload is bigger than what lived on-prem, and the binding constraint on revenue is now operational — how fast regions can be energized — rather than commercial.
- Fusion agent count: 100 committed (Oct 2024 CloudWorld) → 600+ (Oct 2025 Financial Analyst Meeting) → 1,000+ in Fusion alone (Q3 FY26 earnings call), with hundreds more in the banking suite. Oracle overshot its own commitment by roughly 6× in twelve months, and the count nearly doubled again in the five months that followed.
- AI Agent Studio (shipped inside Fusion) lets customers build agents on Oracle’s stack. Agent Marketplace (October 2025 Financial Analyst Meeting) has twenty-four partners signed on. Each customer-built and partner-built agent runs on Oracle’s stack alongside Oracle’s shipped agents.
- An organization of roughly 30,000 engineers, larger than the engineering organization at most major technology companies and equipped with the AI coding tools covered in Thesis II, produces this agent-shipping cadence.
- Three sources of gravity compound on top of the agent install base: operational-data gravity (native, secure, audited access inside the system of record); institutional-memory gravity (an embedded agent learns exceptions, approval patterns, and customer quirks no foundation model ships with); and cross-customer-pattern gravity (Oracle’s vantage across thousands of enterprises lets agents ship with distilled operating knowledge encoded into their defaults).
- Mike Sicilia, Co-CEO of Oracle: “Data gravity matters here. Mission-critical data gravity matters even more.”
An agent is only as useful as the operational data it can reach. The three sources of gravity compound: operational-data access makes the agent useful at deployment, institutional memory makes it harder to substitute over time, and cross-customer pattern recognition sharpens Oracle’s defaults. Customer- and partner-built agents add to that gravity rather than diluting it. The 6× overshoot is the early-stage outcome of a structural advantage Oracle owns and competitors do not.
- Fusion runs on the AI Database; the AI Database runs on OCI; OCI is the same infrastructure Oracle sells to OpenAI, Anthropic, Meta, and xAI. Application, database, and infrastructure revenue all flow into Oracle’s income statement.
- No competing enterprise application vendor owns the layers underneath. Salesforce, Workday, and SAP sell at the application layer, but the database queries and the underlying compute their AI features generate accrue to third parties — AWS, Azure, and whichever database vendor sits behind the application.
- Oracle’s AI Data Platform vectorizes the customer’s data inside Oracle’s own infrastructure; the frontier models the platform reaches in most cases also run on OCI under contract with the labs.
- Oracle’s Fusion suite ships 1,000-plus AI agents at no additional charge, a commercial choice that maximizes adoption and converts every adoption event into consumption Oracle captures at the database and infrastructure layers.
The economics of an AI feature embedded in an enterprise application accrue to whoever owns the compute underneath it. Oracle owns its stack; competitors do not. The asymmetry is sharpened by Oracle’s pricing choice — agents ship in the standard subscription, which pushes the revenue into consumption rather than one-time SKU sales. Same customer engagement, same adoption, materially different vendor outcome — and the difference compounds for as long as Oracle owns the stack underneath.
- NTT DATA early-2026 survey, N = 2,300+ senior decision-makers: 99% say AI has increased their need for cloud, 88% say current investment puts AI at risk, 84% report flat cloud spend, only 14% describe themselves as “cloud-evolved.”
- The survey is third-party, customer-side, and covers a representative cross-section of global enterprises — not an Oracle-commissioned data point.
- Oracle’s current trajectory — 531% multicloud database growth in Q3 FY26 and 2,000-plus Fusion / industry go-lives in a single quarter — is the early response from the segment of the market that has already begun to act.
- For Oracle customers specifically, cloud migration delivers a several-fold annual revenue lift versus on-prem support contracts.
The last decade of cloud migration was pulled by IT-modernization economics on budget-cycle timelines. The next phase runs on a different force: the enterprise has committed externally to an AI strategy and now has to close a cloud-readiness gap to execute it. When the question shifts from “should we modernize?” to “is our AI strategy at risk?”, the tempo changes — and Oracle is positioned to receive that accelerated migration at three layers simultaneously.
- Infrastructure (OCI): FY26 ~$18B → FY30 $166B per Oracle’s October 2025 long-range plan — roughly 75% of FY30 total revenue, on a forecast Oracle has already revised upward once.
- AI Database and AI Data Platform: FY26 ~$3B → FY30 $20B floor. Inference revenue is excluded by Oracle’s own disclosure and sits on top of the $20B.
- Applications: $16.1B annualized run rate growing mid-teens, with 2,000-plus Fusion / industry go-lives in Q3 FY26 alone and deferred revenue running ahead of in-quarter revenue.
- Each Fusion go-live is an entry point that pulls database consumption (AI Database queries) and infrastructure consumption (OCI inference) behind it — indefinitely.
- Each additional AI Database customer opens a channel through which inference revenue flows to OCI, expanding the infrastructure line.
- The linkage is the architectural property established in earlier arguments — vertical integration in the financial sense.
Standard segment-level valuation asks what each business is worth on its own trajectory. That question treats the three lines as independent when each generates demand for the others — applications adoption pulls database queries, database queries pull OCI inference, and the published database forecast itself excludes the inference revenue the platform is designed to generate. The correct frame is composite, not additive: what the whole is worth when every engagement at the top generates consumption underneath.