
Exa Labs builds energy-efficient, reconfigurable chips that lower AI compute energy and increase throughput for model training and inference. Its chips implement polymorphic XPUs and a Learnable Function Unit (LFU) that dynamically reconfigures hardware, optimizes dataflow, reduces memory movement, and applies instruction fusion to improve efficiency compared with fixed accelerators. The company operates as a B2B hardware provider and integrates with common AI software stacks and workflows, including FP32 compute and languages and frameworks such as Python, Julia, JAX, PyTorch, and TinyGrad. Exa Labs targets AI companies and researchers from its Silicon Valley base and positions its chips as a substrate for large-scale AI and scientific workloads.
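The profile mentions instruction fusion and reduced memory movement as efficiency levers. A minimal, generic sketch of the idea in NumPy (illustrative only; the function names are hypothetical and this is not Exa Labs' implementation):

```python
import numpy as np

def scale_add_unfused(x, w, b):
    """Two separate passes: the intermediate result t is
    materialized in memory, then read back for the add."""
    t = x * w          # pass 1: writes t
    return t + b       # pass 2: reads t, writes output

def scale_add_fused(x, w, b):
    """One fused pass: multiply and add are applied into a single
    output buffer, avoiding a separate intermediate array."""
    out = np.empty_like(x)
    np.multiply(x, w, out=out)  # write result directly into out
    np.add(out, b, out=out)     # update out in place, no new buffer
    return out

x = np.arange(4.0)
w = np.full(4, 2.0)
b = np.ones(4)
assert np.allclose(scale_add_unfused(x, w, b), scale_add_fused(x, w, b))
```

Both functions compute `x * w + b`; the fused variant simply avoids allocating and re-reading an intermediate array, which is the general motivation for fusion on any accelerator.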

Founded: 2024
Headquarters: San Francisco, United States
Product: Energy-efficient reconfigurable AI chips (XPUs) with polymorphic LFU
Stage / funding: Pre-Seed (Sep 25, 2024)
Notable investors: Y Combinator; Outbound Capital; Saturnin Pugnet
Focus: AI training and inference compute efficiency for large-scale and scientific workloads
Sector: DeepTech
Funding amount: 500,000
Funding note: Reported pre-seed round with participation from Outbound Capital and Saturnin Pugnet
Quote: "Includes participation by Y Combinator and angel/backer investors"