
EnCharge AI is pioneering a new era of AI computation with its efficient, sustainable in-memory computing technology. Addressing the limitations of traditional GPUs and digital AI accelerators, EnCharge AI claims up to 20x higher efficiency, 9x higher compute density, and 10x lower total cost of ownership, resulting in 100x lower CO2 emissions compared to cloud alternatives. Its technology integrates into various form factors, including chiplets, ASICs, and PCIe cards, enabling seamless edge-to-cloud AI deployments. EnCharge AI's mission is to democratize advanced AI, making it accessible to businesses of all sizes by enabling on-device processing for enhanced data privacy, security, and affordability. The company was founded in 2022 and is led by industry veterans with deep expertise in semiconductor design and AI systems.

Founded: 2022
Headquarters: Santa Clara, CA
Product: Analog in-memory AI accelerators (chiplets/ASICs/PCIe) and software
Key claim: Up to 20x efficiency, 9x compute density vs conventional digital accelerators
Total funding (reported): ≈$162.9M
Notable investors: Tiger Global, Anzu Partners, Samsung Ventures
Problem addressed: High-cost, energy-intensive AI computation on conventional digital accelerators; need for efficient edge-to-cloud AI compute.
Founded: 2022
Industry: Data and Analytics

| Round | Amount | Notes |
|---|---|---|
| Series A | $21.7M | Launch from stealth with Series A to commercialize in-memory computing hardware and software. |
| — | $22.6M | Reported round to commercialize AI-accelerating chips; brought total raised to ~$45M at the time. |
| Series B | >$100M | Led by Tiger Global with participation from multiple financial and strategic investors. |

Investor note: "Participation from strategic and large financial investors (e.g., Tiger Global, Samsung Ventures) alongside returning venture investors"
Title: DFT Architect
Location: Bangalore / Remote (anywhere in India)
Company: enchargeai.com
DFT ARCHITECTURE:
- MBIST/LBIST logic design using SystemVerilog/Verilog.
- Use Siemens/Mentor DFT tools to implement and verify DFT architectures/structures (EDT, LBIST, SSN, MBIST, IJTAG, IEEE 1149.1, IEEE 1149.6) in ASICs.
- Run ATPG, analyze coverage, and use VCS and Questa to simulate at the unit and SDF levels.
- Perform test insertion for embedded blocks in the SoC.
- Perform general DFT work on the SoC.
- Use Cadence tools (Modus) to insert and verify DFT logic.
Typical tasks include: running ATPG to verify that the DFT implementation works at block and chip level; performing fault-coverage analysis and root-causing low-coverage issues; simulating ATPG and MBIST patterns to verify DFT structures; working with physical design (PD) to close timing on post-layout netlists; and creating the silicon bring-up plan/strategy.

Desired experience:
- Used Siemens DFT tools to perform test insertion for embedded blocks in an SoC.
- Used Makefile, TCL, and Python to automate DFT insertion.
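The Makefile/TCL/Python automation mentioned above often takes the form of a script that generates a per-block insertion dofile which the DFT tool then executes. A minimal sketch in Python is shown below; the TCL command names and block names inside the template are hypothetical placeholders, not actual Tessent or Modus syntax.

```python
import os

# Hypothetical per-block dofile template. The commands inside are
# placeholders standing in for the real tool's insertion/reporting flow.
DOFILE_TEMPLATE = """\
# Auto-generated DFT insertion dofile for {block}
read_design {block}.v
insert_dft -edt -mbist          ;# placeholder command
report_coverage > {block}_cov.rpt
write_design {block}_dft.v
"""

def make_dofile(block: str) -> str:
    """Render the insertion dofile text for a single block."""
    return DOFILE_TEMPLATE.format(block=block)

def write_dofiles(blocks, out_dir="."):
    """Write one <block>_insert.do file per block; return the paths."""
    os.makedirs(out_dir, exist_ok=True)
    paths = []
    for block in blocks:
        path = os.path.join(out_dir, f"{block}_insert.do")
        with open(path, "w") as f:
            f.write(make_dofile(block))
        paths.append(path)
    return paths

if __name__ == "__main__":
    # Hypothetical block names; in practice a Makefile rule would invoke
    # this script, then run the DFT tool on each generated dofile.
    for p in write_dofiles(["cpu_core", "mem_ctrl"], out_dir="dofiles"):
        print(p)
```

A Makefile wrapper then only needs one pattern rule per generated dofile, keeping the block list in a single place in the Python script.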
Contact:
Supriya Sharma
Supriya@mulyatech.com