
Fractile is building chips to run large language models two orders of magnitude faster. Existing hardware is well suited to training LLMs but poorly suited to the subsequent inference of the trained model, which is increasingly the dominant workload. A model’s weights must be moved onto the chip once per token generated, and this data movement takes a few hundred times longer than the computations themselves. Fractile’s approach of fusing computation with memory eliminates this bottleneck and can scale to run the world’s largest models globally.
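The bottleneck described above can be made concrete with a rough back-of-envelope calculation. The figures below (model size, memory bandwidth, compute throughput) are illustrative assumptions, not Fractile's or any vendor's published numbers; the point is only the ratio between weight-transfer time and compute time per generated token.

```python
# Illustrative estimate of why autoregressive LLM inference is memory-bound:
# each generated token requires streaming the full set of model weights from
# memory, and that transfer dominates the arithmetic. All numbers hypothetical.

def inference_times_per_token(n_params, bytes_per_param,
                              mem_bw_bytes_s, compute_flops_s):
    """Return (weight-transfer seconds, compute seconds) per generated token."""
    weight_bytes = n_params * bytes_per_param
    transfer_s = weight_bytes / mem_bw_bytes_s
    # A decode step costs roughly 2 FLOPs per parameter (one multiply-add).
    compute_s = (2 * n_params) / compute_flops_s
    return transfer_s, compute_s

# Hypothetical 70B-parameter model in 16-bit weights on an accelerator with
# 2 TB/s memory bandwidth and 300 TFLOP/s of usable compute.
transfer_s, compute_s = inference_times_per_token(
    n_params=70e9, bytes_per_param=2,
    mem_bw_bytes_s=2e12, compute_flops_s=300e12,
)
print(f"weight transfer: {transfer_s * 1e3:.1f} ms/token")   # → 70.0 ms/token
print(f"raw compute:     {compute_s * 1e3:.2f} ms/token")    # → 0.47 ms/token
print(f"transfer/compute ratio: {transfer_s / compute_s:.0f}x")  # → 150x
```

Under these assumed numbers, moving the weights takes on the order of a hundred times longer than the computation itself, which matches the "few hundred times" gap the profile describes and is what compute-in-memory designs aim to eliminate.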

Product: Processors that physically interleave memory and compute to accelerate LLM inference
Headquarters: London, United Kingdom (team presence in London and Bristol)
Founded: 2022
Founder & CEO: Walter Goodwin
Funding: $17.5M total; currently Seed-stage
Employees: 71
Industry: Large language model (LLM) inference / AI hardware
Sector: Computer Hardware Manufacturing
Seed round: $15M
The company emerged from stealth with a $15M seed round; reported participating investors include Kindred Capital, NATO Innovation Fund, Cocoa, and Inovia Capital.
“Backed by technology- and science-focused investors including Oxford Science Enterprises and Kindred Capital, with participation from the NATO Innovation Fund and later involvement from Patrick Gelsinger and ARIA.”