Semiconductor company develops AI inference chips and systems for data centers, focused on digital in-memory compute for LLMs.
Based in Santa Clara, California, Di-Matrix develops custom semiconductor chips and digital in-memory compute systems specifically designed to optimize artificial intelligence inference workloads within enterprise data centers. The company provides scalable hardware solutions that reduce the operational cost, power consumption, and processing latency associated with running large language models and transformer-based generative AI applications. Operating across five global sites, the enterprise has expanded its workforce to over 200 employees while actively generating commercial revenue by supplying its specialized inference infrastructure to major technology hyperscalers. To support its continued expansion in the semiconductor market, Di-Matrix has recently secured $275 million in Series C financing from a syndicate of institutional investors, including Microsoft's M12, Playground Global, Triatomic Capital, and Temasek. The organization was officially founded in 2019 by semiconductor industry executives Sid Sheth and Sudeep Bhoja.
Di-Matrix has raised $13.8M across 1 funding round.
Di-Matrix's investors include Yonghua Capital.
Most recently, Di-Matrix raised a $13.8M Series B round in November 2023.
| Date | Round | Lead Investors | Other Investors |
|---|---|---|---|
| Nov 15, 2023 | $13.8M Series B | Yonghua Capital | — |