Prior Labs launches first enterprise-scale AI foundation model
Prior Labs announces the world’s first foundation model capable of handling millions of rows of data, giving enterprises a powerful and scalable way to understand and use their most complex, business-critical information.
The breakthrough marks a 1,000x leap in dataset size in less than a year, and positions Prior Labs as the definitive leader in enterprise-scale tabular AI. It also comes as Fortune 500 companies such as Hitachi and major global financial institutions trust its enterprise offering for mission-critical operations, and as its open-source model surpasses 2.3M downloads from engineering teams across Microsoft, Amazon, and Walmart.
Purpose-built for scale
Tabular data – the structured rows and columns that run the majority of enterprise systems – forms the backbone of financial records, supply chains, customer databases and more. Yet progress in tabular AI has lagged behind fields like vision and language because the datasets share few consistent patterns. Unlike images or text, each tabular dataset has its own structure and behaviour, meaning models can’t rely on the same underlying signals when learning from healthcare records as they do from financial transactions, for instance.
First published in the journal Nature in January, Prior Labs’ TabPFN model is purpose-built for tabular data. Trained on hundreds of millions of synthetic datasets, TabPFN achieves state-of-the-art accuracy without task-specific training, instantly learning patterns from any dataset. At launch, it outperformed all competing models in both speed and accuracy.
Prior Labs’ TabPFN models have scaled from 10,000 rows in January 2025 to 100,000 rows in early November 2025, ranking #1 on the industry’s leading benchmark, TabArena. Within the same month, the team scaled up further still, to 10 million data points. This 1,000x scale-up in under a year cements Prior Labs as the frontrunner across the full range of tabular dataset sizes.
Plus, as a foundation model, TabPFN can be fine-tuned on a company’s own data. This creates a self-reinforcing cycle of scale where each enterprise that adopts TabPFN for its most complex data challenges sharpens the model’s accuracy and performance.
Built by the pioneers in machine learning
Prior Labs’ competitive positioning stems from its rare combination of research excellence, enterprise deployment, and rapid innovation. The founding team has more than three decades of combined experience in machine learning and operations, and has contributed to over 20 research papers at leading AI conferences including NeurIPS, ICML and ICLR in 2025 alone.
The company's research leadership is anchored by co-founder Professor Frank Hutter and founding advisor Professor Bernhard Schölkopf, among the world's most prominent researchers in AutoML and causal machine learning, respectively. They are joined by co-founders Noah Hollmann, a computer scientist with experience at Google and BCG and an X-Prize finalist, and Sauraj Gambhir, a former venture capital, M&A and growth expert.
The shift to production systems
The announcement comes as the tabular AI market enters a critical inflection point. As enterprises shift from experiments to production AI, so too do their expectations. LLMs can only go so far in the move from demos to reliable, production-grade systems, and Prior Labs’ rapid enterprise adoption shows that its models meet this higher bar.
Hitachi, for instance, uses TabPFN for predictive maintenance across its rail network, identifying track issues earlier and reducing manual inspections, while UK-based biotech Oxford Cancer Analytics is using it to detect complex lung diseases and improve patient outcomes, demonstrating the model’s versatility. A major global financial institution is adopting TabPFN across dozens of applications to help it better manage liquidity. More broadly, the open-source version of TabPFN has been adopted in production across thousands of use cases in trading, finance, healthcare, industrials and energy, in several cases replacing 100+ pre-existing models with a single foundation model.
Prior Labs is also advancing its technology to add new forms of reasoning. The first, interventional reasoning, will help organisations to answer ‘what if’ queries, allowing them to move from mere predictions about the status quo to proper decision support. The addition of agentic reasoning will enable models to use domain knowledge more intelligently and support decisions in more complex, real-world settings. Details of this research will be featured at NeurIPS in December.
Frank Hutter, CEO and Co-Founder of Prior Labs said: “Our heavy focus on world-leading research has really paid off. Scaling to millions of data points less than a year after our Nature paper is a testament to the speed at which our exceptional team can execute research and put it into production. We’re super excited to continue driving the foundation model revolution of tabular data science to allow better decisions across all industries.”
Sauraj Gambhir, COO and Co-Founder of Prior Labs said: “Enterprises are already running mission-critical operations on TabPFN in finance, healthcare and industrial systems. What changes now is the scale. By extending TabPFN's in-context learning to millions of rows, we're expanding that same proven capability into a new tier of enterprise data volumes. This allows organisations to tackle their largest, most complex datasets with the same foundation model approach, without changing how they work.”
Isabel Ferrando, Innovation Manager at Hitachi said: “Hitachi is committed to advancing predictive maintenance across our rail networks. Working with Prior Labs and TabPFN helps us accelerate our use of data for reliability and safety while reducing operational overhead.”