
By burning the transformer architecture into our chips, we’re creating the world’s most powerful servers for transformer inference.
What they do: Design AI chips (Sohu) optimized for inference on transformer models
Stage / funding: Series A (announced Jun 25, 2024), reported ~$120M round; total funding reported $625.4M
Founded: Around 2022
Team size: Approximately 317 employees
| Field | Detail |
|---|---|
| Focus | AI inference performance and efficiency for transformer models |
| Founded | 2022 |
| Industry | Computer Hardware Manufacturing |
| Last round | $120,000,000 |
| Investors | Institutional VCs and prominent angel/backer participation (e.g., Two Sigma Ventures, Peter Thiel, Thomas Dohmke) |