#skhynix

2 posts · Last used 7d

TheBadPlace
@TheBadPlace@mastodon.ozioso.online · May 10, 2026
US Top News and Analysis | Memory chip makers are looking at a 'supercycle' and 'windfall gains.' The stocks jumped 30% in one week

Memory chip makers are riding a "supercycle" of surging AI demand, boosting pricing power and profit forecasts across the volatile sector. Analysts cite rapid AI accelerator adoption as a catalyst for windfall gains, driving shares like Micron Technology up nearly 38% for its best weekly performance since 2008 and lifting the Roundhill Memory ETF (DRAM) over 30% in a week.

Samsung plans to start construction of its new P5 Fab 2 plant six months early to cement market dominance through 2027, while SK Hynix is fielding investment offers to ramp production, and Micron's recent Taiwan plant acquisition promises greater DRAM and HBM output by the end of 2028. The memory crunch is pushing DRAM and NAND prices higher (analysts project a 180% rise by mid-2026), causing downstream cost pressures for companies like Apple and Microsoft, while upstream manufacturers expect gross margins of 70-77% this year, with Micron and SanDisk targeting margins above 80% next year.

Read more: https://www.cnbc.com/2026/05/10/memory-chip-makers-are-looking-at-a-supercycle-and-windfall-gains-the-stocks-jumped-30percent-in-one-week.html

#MicronTechnology #SKHynix #SamsungElectronics #AI #TimCook
TheBadPlace
@TheBadPlace@mastodon.ozioso.online · Apr 08, 2026

yahoo news | Dell’s CEO reckons that AI’s hunger for memory will grow by as much as 625 times

Dell's chief executive has warned that total memory demand from the AI market will explode over the next few years. Citing a Bank of America event, Michael Dell claimed that, as both memory per accelerator and system scale grow together, the overall memory requirement could be about 625 times larger in 2028 than it was in 2022. The per-chip figure is anchored to the Nvidia H100 accelerator, which shipped with 80 GB of HBM3 in 2022, against an expectation of roughly 2 TB of memory per accelerator by 2028, a more than 25-fold increase. Multiplying that by an assumed 25-fold rise in the number of AI accelerators deployed in data centres yields the staggering 625x multiplier.
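The arithmetic behind the headline number is simple to check; a minimal sketch, using the figures quoted in the post (the 2 TB-per-chip and 25x fleet-growth values are the article's projections, not established data):

```python
# Figures quoted in the post; both 2028 values are projections.
hbm_per_accelerator_2022_gb = 80      # Nvidia H100 with HBM3, 2022
hbm_per_accelerator_2028_gb = 2048   # ~2 TB of memory per accelerator, projected

# Growth in memory per chip: a more-than-25-fold increase.
per_chip_growth = hbm_per_accelerator_2028_gb / hbm_per_accelerator_2022_gb

# Assumed 25-fold rise in the number of deployed AI accelerators.
fleet_growth = 25

# Total demand scales with the product of the two factors, not their sum.
total_demand_multiplier = per_chip_growth * fleet_growth

print(per_chip_growth)          # 25.6
print(total_demand_multiplier)  # 640.0 (~625x with the rounded 25x figure)
```

Note that the two factors multiply: 25 × 25 gives the quoted ~625x, and using the exact 2048/80 ratio lands slightly above it.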

The implication of such growth is a looming shortage of high‑bandwidth memory (HBM). Only three manufacturers—SK hynix, Samsung, and Micron—currently produce HBM4, and even their combined capacity is unlikely to meet the projected demand. In addition to HBM, other memory categories such as LPDDR5x for laptops and NAND flash for storage are also expected to be stretched thin. A single Nvidia‑based AI server rack can already require hundreds of gigabytes of LPDDR5x and multiple terabytes of SSDs; a fully‑kitted tower might house up to 17 TB of DRAM and 547 TB of flash, and large AI data centres deploy hundreds or thousands of such units.
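To get a feel for the aggregate numbers, here is a rough sketch using the per-rack figures quoted above; the rack count is an illustrative assumption (the article says only "hundreds or thousands of such units"), not a figure from the source:

```python
# Per-unit capacities quoted in the post for a fully-kitted tower.
dram_per_unit_tb = 17
flash_per_unit_tb = 547

# Assumed fleet size for illustration: 1,000 units in one large AI data centre.
units = 1000

# Aggregate demand for a single such facility, in petabytes (1 PB = 1000 TB).
total_dram_pb = dram_per_unit_tb * units / 1000
total_flash_pb = flash_per_unit_tb * units / 1000

print(total_dram_pb, total_flash_pb)  # 17.0 PB of DRAM, 547.0 PB of flash
```

Even at this assumed scale, one facility consumes petabytes of DRAM and over half an exabyte of flash, which is why the post argues that LPDDR5x and NAND supply will be stretched alongside HBM.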

If manufacturing can keep pace, the cost of memory may remain “affordable” relative to high‑end graphics cards, but the supply‑demand gap threatens to make memory a bottleneck for AI development. The industry will need significant expansion of production facilities and possibly new memory technologies to avoid a crisis. Until then, the predicted 625‑fold surge serves as a stark reminder that the future of AI hinges not just on processing power, but on the availability of massive amounts of fast, high‑capacity memory.

Read more: https://www.pcgamer.com/hardware/memory/dells-ceo-reckons-that-the-total-memory-demand-from-the-entire-ai-market-in-2028-will-be-625x-bigger-than-it-was-in-2022/

#dell #skhynix

