Samsung Electronics plans to spend more than 110 trillion won in 2026 to strengthen its position in AI chips. The budget, equal to about $73.3 billion, marks a 22% increase from a year earlier. The company said it will direct the money toward chip capacity, research, and advanced production as competition in AI semiconductors intensifies.
The spending plan shows how sharply the AI chip race has escalated. Samsung wants to regain momentum against SK Hynix, which built a strong position in high-bandwidth memory, or HBM, for Nvidia systems. At the same time, Samsung is expanding both memory and foundry capabilities to capture more AI demand across the supply chain.
Samsung tied the larger budget to rising demand for AI infrastructure. At Nvidia GTC 2026, the company presented its HBM4E roadmap and highlighted its broader memory portfolio for next-generation AI systems. Samsung also said its HBM4 solutions target Nvidia’s Vera Rubin platform, a tie that keeps the company closely linked to future accelerator demand.
That product focus matters because HBM has become one of the most important components in AI servers. These stacked memory chips sit close to the accelerator and feed it data fast enough to handle larger workloads. Samsung is now using fresh investment to scale output, improve performance, and support customers who need more advanced memory and packaging.
The company also aims to strengthen its foundry business, an effort that could widen Samsung’s role beyond memory supply and into chip manufacturing for AI customers. Recent industry reports point to closer work with Nvidia and AMD, which would support Samsung’s broader strategy in AI hardware.
Samsung’s investment increase comes as SK Hynix remains a major force in premium AI memory. SK Hynix built its lead through close ties with Nvidia and strong execution in HBM production. Samsung is trying to narrow that gap through new product launches, faster commercialization, and higher capital spending.
The timing is important. AI demand has shifted industry priorities toward high-end memory, especially HBM. That shift has improved profits in the premium segment, but it has also tightened supply in more traditional memory categories. As more capacity moves toward AI products, other industries face higher costs and longer wait times.
Samsung’s larger budget could help on both fronts. It may support the company’s push in AI memory while also adding broader chip capacity over time. That matters for device makers, data center operators, and industrial customers who still depend on conventional memory.
Samsung’s plan also stands out against other large chipmakers. TSMC said it expects 2026 capital spending of $52 billion to $56 billion, with most of that set aside for advanced process technologies. Samsung’s planned outlay is therefore larger and reflects how memory and foundry competition now sit at the center of AI infrastructure growth.
Taken together, the plan shows the company is treating AI semiconductors as its top priority in 2026. By raising spending on HBM, advanced manufacturing, and research, Samsung is trying to secure a bigger role in the next phase of global AI chip demand.