Yesterday, we learned that Nvidia found Samsung’s HBM4 chip to be the best HBM4 chip on the market. Now, Broadcom and Google appear to have reached the same conclusion.

Google is developing its next Tensor Processing Unit (TPU), which will handle the company’s AI workloads, in collaboration with Broadcom. The chip uses HBM4. To that end, Broadcom has been testing HBM4 chips from Micron, Samsung, and SK Hynix, the only three major brands ready to produce HBM4 chips.

According to a new report from Seoul Economic Daily, Samsung’s HBM4 chip achieved an operating speed of 11 Gbps in Broadcom’s tests, the fastest among all the chips tested. The South Korean tech giant’s HBM4 module also delivered the best thermal performance, a crucial metric for HBM chips.

Reportedly, Broadcom tested the HBM4 chip in a system-in-package (SiP) environment, in which logic chips and high-bandwidth memory are combined into a single package. The SiP test is the final step before HBM is installed in AI chips. With Nvidia, Google, and Broadcom all finding Samsung’s HBM4 chip the best in the segment, major companies around the world could line up at Samsung’s door for HBM4 chips, driving up prices and bringing in more revenue and profit.

The post Samsung’s memory turned out to be the best for Google’s AI chips appeared first on SamMobile.