AI Quote of the Day: ‘All AI Models Are Basically the Same,’ Says Oracle Co-Founder Larry Ellison
Ellison argues public internet data has commoditised large language models, positioning enterprise data as the next AI battleground

Oracle co-founder and chief technology officer Larry Ellison has delivered a blunt assessment of today’s artificial intelligence boom, arguing that most leading large language models are rapidly becoming indistinguishable from one another.
Speaking during Oracle’s fiscal Q2 2026 earnings call in December 2025, Ellison said the world’s most popular AI systems, including OpenAI’s ChatGPT, Google’s Gemini, xAI’s Grok, Meta’s Llama and Anthropic’s Claude, are trained on essentially the same data.
“All the large language models—OpenAI, Anthropic, Meta, Google, xAI—they’re all trained on the same data,” Ellison said. “It’s all public data from the internet. So they’re all basically the same. And that’s why they’re becoming commoditized so quickly.”
Commoditisation Risk in Generative AI
Ellison’s core argument is that reliance on open, web-sourced training data leaves little room for durable differentiation. As a result, generative AI risks devolving into a competition driven largely by pricing, infrastructure scale and incremental features, rather than defensible technological advantage.
While the current generation of models has delivered striking gains in productivity and automation, Ellison suggested that barriers to entry are eroding fast, increasing pressure on margins across the AI ecosystem.
Private Data as the Next Growth Engine
Rather than framing this as a dead end, Ellison described it as a transition point. He argued that the next and far more valuable phase of AI will be built on private, proprietary enterprise data, not publicly available internet content.
“The future lies in leveraging private enterprise data,” Ellison said, predicting that this second wave of AI would ultimately eclipse the current surge in GPUs, data centres and public-model infrastructure.
Ellison contended that Oracle is structurally well positioned for this shift, noting that a significant share of the world’s high-value corporate and institutional data already resides in Oracle databases.
Oracle’s Enterprise AI Push
Oracle is advancing this strategy through its AI Data Platform, which uses techniques such as Retrieval-Augmented Generation (RAG) to allow AI systems to query private enterprise data securely, without retraining models on sensitive information.
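The retrieval step at the heart of RAG can be illustrated with a minimal sketch: private documents are ranked by similarity to a query, and the top matches are injected into the model's prompt as context, so the model itself is never retrained on the sensitive data. The function names and the toy bag-of-words similarity below are illustrative assumptions, not Oracle's actual API; production systems use learned vector embeddings and a dedicated vector store.

```python
# Minimal RAG sketch: retrieve private records relevant to a query and
# place them in the prompt, rather than training the model on them.
# All names and the toy similarity metric are illustrative only.
from collections import Counter
import math

def embed(text):
    """Toy bag-of-words 'embedding' (real systems use learned vectors)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, documents, k=2):
    """Return the k private documents most similar to the query."""
    q = embed(query)
    return sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def build_prompt(query, documents, k=2):
    """Inject retrieved context into the prompt; the model sees only
    the retrieved snippets, not the full private corpus."""
    context = "\n".join(retrieve(query, documents, k))
    return f"Context:\n{context}\n\nQuestion: {query}"

# Toy 'private enterprise data' that never leaves the retrieval layer.
docs = [
    "Invoice 1042: supplier Acme, amount due 12,000 USD",
    "Q3 revenue grew 8 percent year over year",
    "Employee handbook: remote work policy updated",
]
prompt = build_prompt("What was Q3 revenue growth?", docs, k=1)
```

The prompt built here would then be sent to any general-purpose model; the security boundary lies in the retrieval layer, which controls exactly which private records are exposed per query.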
To support growing demand, Oracle has sharply increased its investment plans. The company now expects around $50 billion in capital expenditure for the full fiscal year, up from $35 billion projected just months earlier.
Recent infrastructure announcements include a 50,000-GPU supercluster powered by AMD MI450 chips, scheduled for launch in Q3 2026, and the OCI Zettascale10 supercomputer connecting hundreds of thousands of NVIDIA GPUs. By late 2025, Oracle’s cloud backlog had crossed $500 billion, driven largely by enterprise AI workloads.
A Crowded Enterprise AI Race
Ellison’s vision, however, faces intense competition. Cloud rivals such as Amazon Web Services, Microsoft Azure and Google Cloud are expanding their own enterprise AI offerings, while progress in synthetic data generation could reduce reliance on exclusive proprietary datasets.
Even so, Ellison’s remarks underline a growing consensus in the industry: the long-term winners in AI may not be those with the biggest public models, but those who can securely unlock intelligence from the world’s most valuable private data.