Our investment in Fundamental - Unlocking the GPT Moment for Structured Data

We are thrilled to announce our participation in Fundamental’s $255M Series A, led by Oak HC/FT, Valor Equity Partners, Battery Ventures, and Salesforce Ventures.
While the world has been captivated by Large Language Models (LLMs) transforming unstructured text, a massive gap has remained in the “silent backbone” of the global economy: structured, tabular data.
Fundamental is building the market’s missing piece: a Large Tabular Model (LTM). They are essentially delivering a “superhuman data scientist” in a single line of code, democratizing PhD-level insights directly on raw enterprise data.
The Problem: The High Cost of “Artisanal” ML
Most enterprise value doesn’t live in PDFs or chat logs; it lives in spreadsheets, ERPs, CRMs, and SQL databases. Today, turning these tables into predictions is a manual, “artisanal” process:
- The XGBoost Bottleneck: Current state-of-the-art tools (like XGBoost) require massive manual effort for data cleaning, feature engineering, and retraining every time a schema changes.
- The LLM Square Peg: Forcing LLMs to read CSVs fails because LLMs process text as a one-dimensional sequence of tokens. They lack native numerical literacy and struggle with the two-dimensional, relational structure of rows and columns.
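The "artisanal" workflow above can be made concrete. The sketch below walks through the manual steps a data scientist performs today; the dataset and column names are invented for illustration, and scikit-learn's gradient boosting stands in for XGBoost so the example is self-contained.

```python
# A sketch of the manual, schema-specific workflow described above.
# GradientBoostingClassifier is a stand-in for XGBoost; data is synthetic.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "amount": rng.lognormal(3, 1, n),
    "country": rng.choice(["US", "DE", None], n),  # messy categorical
    "age_days": rng.integers(0, 3650, n),
})
df["label"] = (df["amount"] > df["amount"].median()).astype(int)

# Step 1: manual cleaning (imputation) -- redone whenever the schema changes.
df["country"] = df["country"].fillna("UNKNOWN")

# Step 2: manual feature engineering (encoding, derived columns).
X = pd.get_dummies(df[["amount", "country", "age_days"]], columns=["country"])
X["log_amount"] = np.log1p(X["amount"])

# Step 3: train a bespoke, one-off model tied to this exact schema.
X_tr, X_te, y_tr, y_te = train_test_split(X, df["label"], random_state=0)
model = GradientBoostingClassifier().fit(X_tr, y_tr)
print(f"held-out accuracy: {model.score(X_te, y_te):.2f}")
```

Every step before the final `fit` call is the engineering "grunt work" a general-purpose tabular foundation model promises to absorb.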
Fundamental’s LTM changes the game by moving from bespoke, one-off models to a general-purpose foundation model for tables.
What is a Large Tabular Model (LTM)?
An LTM is a foundation model purpose-built to understand the native language of the enterprise: rows, columns, joins, and schemas.
Unlike traditional models that see a table as a static grid, Fundamental’s LTM treats data as a dynamic network of relationships. It is characterized by three core breakthroughs:
- Column Order Invariance: Unlike an LLM, which gets confused if you swap columns A and B, an LTM understands the underlying structure regardless of how the table is formatted.
- Native Numerical Literacy: It doesn’t treat numbers as “text tokens.” It understands magnitude, scale, and distribution.
- Zero Preprocessing: It handles raw data, including messy strings and missing values, out of the box, eliminating months of engineering "grunt work."
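The first breakthrough, column order invariance, can be illustrated with a toy sketch. This is not Fundamental's actual architecture; it simply shows the principle that if each cell is encoded by its column name rather than its position, and the cells are aggregated with an order-insensitive operation, permuting the columns cannot change the representation.

```python
# Toy illustration of column order invariance (not Fundamental's design):
# cells are keyed by column NAME, and rows are aggregated with a
# commutative sum, so the order of columns is irrelevant by construction.
import hashlib

def cell_embedding(column: str, value: float) -> float:
    # Deterministic pseudo-embedding derived from the column name.
    h = int(hashlib.sha256(column.encode()).hexdigest(), 16) % 1000
    return h * value

def encode_row(row: dict) -> float:
    # Summation is commutative, so iteration order does not matter.
    return sum(cell_embedding(col, val) for col, val in row.items())

row_ab = {"amount": 120.0, "age_days": 30.0}
row_ba = {"age_days": 30.0, "amount": 120.0}  # same row, columns swapped
print(encode_row(row_ab) == encode_row(row_ba))  # True
```

An LLM reading the same table as a token stream would see two different sequences for these two orderings; a position-independent encoding sees one and the same row.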
Why We Invested: The “Generalization” Alpha
As investors, we look for compounding advantages. Traditionally, ML models don’t “transfer” well; a fraud model for one bank doesn’t help a second bank.
Fundamental is different. Because it is a foundation model, it captures and transfers learnings between datasets. It can learn the “physics” of financial transactions at one institution and infer patterns at another.
The ROI is immediate:
- Financial Services: Moving beyond simple credit scoring to real-time, multi-factor risk underwriting.
- Supply Chain: Predicting anomalies in demand before they hit the bottom line.
- Healthcare: Patient risk stratification that handles messy, inconsistent clinical records.
This is the team to back
We have spent more than a year interacting with Jeremy, Gabriel, and Wojciech since the earliest days of Fundamental. During that time, we've watched them move with a clarity of purpose and a pace of execution that is rare even in the fast-moving AI world. They've executed flawlessly, securing key partnerships (Cloudera, AWS…) and landmark customers, and hiring some of the best talent in the industry.