SK Hynix investing $1.3 billion to widen lead in AI memory chips


SEOUL – SK Hynix is ramping up its spending on advanced chip packaging in the hope of capturing more of the burgeoning demand for a crucial component in artificial intelligence (AI) development: high-bandwidth memory (HBM).

The firm is investing more than US$1 billion (S$1.3 billion) in South Korea in 2024 to expand and improve the final steps of its chip manufacturing, said Dr Lee Kang-Wook, a former Samsung Electronics engineer who now heads up packaging development at SK Hynix.

Innovation with that process is at the heart of HBM’s advantage as the most sought-after AI memory, and further advances will be key to reducing power consumption, driving performance and cementing the company’s lead in the HBM market.

Dr Lee specialises in advanced ways of combining and connecting semiconductors, a discipline that has grown in importance with the advent of modern AI and its digestion of vast troves of data via parallel processing chains.

The first 50 years of the semiconductor industry “has been about the front end”, or the design and fabrication of the chips themselves, Dr Lee said in an interview. “But the next 50 years is going to be all about the back end (or packaging).”

Being first to achieve the next milestone in this race can now catapult companies into industry-leading positions.

SK Hynix was chosen by Nvidia to provide the HBM for its standard-setting AI accelerators, pushing the South Korean firm’s value up to 119 trillion won (S$119.6 billion). Its stock has gained nearly 120 per cent since the start of 2023, making it the Asian country’s second-most valuable company and outperforming Samsung and United States rival Micron Technology.

Dr Lee helped pioneer a novel method of packaging the third generation of the technology, HBM2E, an approach the other two major memory makers quickly followed. This innovation was central to SK Hynix winning Nvidia as a customer in late 2019.

Stacking chips to derive greater performance has long been Dr Lee’s passion. In 2002, he joined Samsung’s memory division as a principal engineer, where he led the development of Through-Silicon Via (TSV)-based 3D packaging technologies.

That work would later become the foundation for developing HBM. HBM is a type of high-performance memory that stacks chips on top of one another and connects them with TSVs for faster and more energy-efficient data processing.

ChatGPT’s release in November 2022 was the moment Dr Lee had been waiting for. By that time, his team had developed a new packaging method called mass reflow-moulded underfill (MR-MUF), aided by his contacts in Japan. The process, which involves injecting and then hardening liquid material between layers of silicon, improved heat dissipation and production yields. SK Hynix teamed up with Namics Corp in Japan for the material and a related patent, according to a person familiar with the matter.

SK Hynix is pouring the bulk of its new investment into advancing MR-MUF and TSV technologies, Dr Lee said.

Samsung, which has for years been distracted by a succession saga at its very top, is now fighting back. Nvidia in 2023 gave the nod to Samsung’s HBM chips, and the company said on Feb 26 that it has developed the fifth generation of the technology, HBM3E, with 12 layers of DRAM chips and the industry’s largest capacity of 36GB.

On the same day, Micron surprised industry watchers by saying it has begun volume production of 24GB, eight-layer HBM3E, which will be part of Nvidia’s H200 Tensor Core units shipping in the second quarter.

With SK Hynix’s big commitment to expanding and enhancing technology at home and a multibillion-dollar advanced packaging facility planned for the US, Dr Lee remains bullish about the company’s prospects in the face of intensifying competition. He sees the present investment as laying the groundwork to meet growing demand for future generations of HBM. BLOOMBERG
