AMD this week unveiled its plans for a series of MI300 processors designed for artificial intelligence and next-generation high-performance computing. This continued advance in processor power density reinforces the need for increased chip-level cooling to deliver high-demand services.
Joining the original MI300, now dubbed the MI300A, is the MI300X, an artificial intelligence-focused GPU built on a similar design.
A CPU handles the main functions of a computer, whereas a GPU is a specialized component that runs many smaller tasks in parallel. GPUs are the chips that firms such as OpenAI and Google use to build cutting-edge AI programs like ChatGPT and Bard.
AMD CEO Lisa Su told investors this week that the AI chip market will reach $150 billion by 2027, citing it as a long-term growth market. According to CNBC, “The MI300X can use up to 192GB of memory, which means it can fit even bigger AI models than other chips. Nvidia’s rival H100 only supports 120GB of memory.”
Motivair’s Dynamic Cold Plates are available in a variety of form factors, including a design for AMD Instinct MI300A Processors.
Earlier this year, AMD announced the MI300, the world’s first integrated data center APU. In an industry-first 3D stacked design, AMD combines Zen 4 EPYC CPU cores, CDNA 3 GPU cores, and on-package HBM3 memory, all within an expected TDP of 600W, according to published reports.