SK hynix is surfing the AI hype wave by setting up what it nebulously describes as a solutions biz to further exploit the hysteria. The chipmaker is already flying high on the back of booming demand for datacenters. Its share price has quadrupled in the past year as it supplies a vital component, one now in tight supply. Judging by its financials published today, the AI gravy train isn't slowing.
On January 22, Intel released its 2025 fourth-quarter earnings and revealed that profits from its AI and datacenter products had increased by about nine percent for the quarter and five percent for the year. Meanwhile, its consumer PC parts division was down seven percent for the quarter and around three percent for the year. The takeaway is clear: Intel now makes more money selling AI datacenters the chips and parts they need to operate than it does from its traditional PC business.
Really, though, the show just confirmed what has been apparent since RAM prices skyrocketed over the past few months, driven by demand from AI datacenters. As Samsung's marketing leader, Wonjin Lee, told Bloomberg at CES: "There's going to be issues around semiconductor supplies, and it's going to affect everyone. Prices are going up even as we speak."
TAE, backed by Alphabet's Google and oil giant Chevron, aims to develop and sell next-generation neutral beam systems for fusion and related applications more cost-effectively. Nuclear fusion is a nascent technology that aims to generate electricity by harnessing the process that powers the sun, offering the vision of abundant energy without pollution, radioactive waste, or greenhouse gases.
The memes about RAM prices are funny and all, but the consequences of the memory supply crunch are anything but. Word on the street is that 16GB RAM phones will become extinct next year and we will see the return of budget phones with 4GB RAM. We've already seen some flagships with hiked prices compared to their predecessors, and we are now hearing that Samsung plans to bump up the prices of its mid-range lineup in India.
Qualcomm is launching a pair of new AI chips in an attempt to challenge Nvidia's dominance in the market. On Monday, Qualcomm announced plans to release its new AI200 chip next year, followed by the AI250 in 2027 - both built on the company's mobile neural processing technology. The chips are designed for deploying AI models, rather than training them.
Rather than a custom Arm CPU, like the ones that Microsoft, AWS, and Google designed, Meta tells us the partnership will focus on optimizing the Arm-based silicon that it's already deploying. Like most hyperscalers and cloud providers, Meta is rolling out large quantities of Arm Neoverse cores across its AI datacenters; they just happen to be part of Nvidia's GB200 or GB300 NVL72 rack systems. Each of these racks is equipped with 72 Blackwell GPUs and 36 of Nvidia's Neoverse-V2-based Grace CPUs.