Nvidia-backed startup invents Ethernet memory pool to help power AI — claims it can add up to 18TB of DDR5 capacity for large-scale inference workloads and reduce per-token generation costs by up to 50%

Enfabrica is piloting Emfasys, an Ethernet-connected pool of CXL-attached DDR5 memory that adds up to 18TB of capacity to any AI server, easing memory bottlenecks in large-scale inference workloads.