Innotron, Parent of China's CXMT, to Build $2.4 Billion HBM Packaging Plant in Shanghai
Innotron, the parent company of ChangXin Memory Technologies (CXMT), plans to invest $2.4 billion in a new advanced packaging facility in Shanghai. According to Bloomberg, the plant will focus on packaging high-bandwidth memory (HBM) chips and is set to begin production by mid-2026. Innotron will fund the facility with money from various investors, including GigaDevice Semiconductor.
The new facility will concentrate on various advanced packaging technologies, such as interconnecting stacked memory devices using through-silicon vias (TSV), which is crucial for producing HBM. According to the Bloomberg report, the facility is anticipated to have a “packaging capacity of 30,000 units per month.”
If the information about the packaging facility is accurate, CXMT will produce HBM DRAM dies (something it has been planning for a while), while Innotron will assemble them into HBM stacks.
Given its $2.4 billion cost, the packaging facility will not just produce HBM memory for AI and HPC processors but will also provide other advanced packaging services. We do not know whether this includes HBM integration with compute GPUs or ASICs, but it could become a possibility if Innotron, CXMT, or GigaDevice manages to secure the logic process technology (e.g., 65 nm-class) required to build the silicon interposers that connect HBM stacks to processors.
Leading Chinese OSATs, such as JCET, Tongfu Microelectronics, and SJ Semiconductor, already have HBM integration technology, so Innotron does not have to develop its own method. Earlier this year, JCET reportedly showcased its XDFOI high-density fan-out packaging solution, which is specifically designed for HBM. Tongfu Microelectronics is also reportedly working with a leading China-based DRAM maker, likely CXMT, on HBM projects.
China needs its own HBM. Chinese companies are developing AI GPUs but are currently limited to using HBM2 technology, according to DigiTimes. For instance, Iluvatar CoreX's Tiangai 100 GPU and MetaX's C-series GPUs are equipped with 32 GB and 64 GB of HBM2, respectively, but HBM2 is not produced in China.
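As a rough illustration of what those capacities imply, the quoted memory sizes map to small numbers of TSV-stacked HBM2 packages. The stack configuration below is an assumption for the sake of the arithmetic (the article does not specify it): a common HBM2 stack is eight 8 Gb dies, i.e. 8 GB per stack.

```python
# Illustrative arithmetic only; the 8-high, 8 Gb-per-die stack
# configuration is an assumption, not stated in the article.
GB_PER_DIE = 1        # one 8 Gb DRAM die = 1 GB
DIES_PER_STACK = 8    # an 8-high TSV stack
gb_per_stack = GB_PER_DIE * DIES_PER_STACK  # 8 GB per HBM2 stack

# Capacities quoted for the Tiangai 100 (32 GB) and MetaX C-series (64 GB)
for total_gb in (32, 64):
    stacks = total_gb // gb_per_stack
    print(f"{total_gb} GB of HBM2 -> {stacks} stacks of {gb_per_stack} GB")
```

Under that assumption, the two GPUs would carry four and eight HBM2 stacks, respectively; taller stacks or denser dies would reduce the count.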
This $2.4 billion investment is part of China's broader strategy to strengthen its semiconductor capabilities in general and its advanced packaging technologies in particular. Whether the plant will be a financial success remains to be seen. But given that the U.S. government does not allow the export of advanced components made using American technology to China without a license, the country has little choice but to build its own HBM supply chain.
- Author: Richard
- Created at : 2025-01-23 21:39:06
- Updated at : 2025-01-24 21:24:49
- Link: https://hardware-updates.techidaily.com/nvidia-accelerates-ai-with-microns-latest-24gb-hbm3e-memory-chips-in-upcoming-h200-series-gpus/
- License: This work is licensed under CC BY-NC-SA 4.0.