Mobile UX
from GSMArena.com, 4 days ago
Apple will bump up the iPhone 18 to 12GB of LPDDR5X RAM
Apple will upgrade the base iPhone to 12GB RAM, enabling on-device AI and matching Pro models, while shifting to a staggered 2026–2027 release schedule.
The browser would have more value if it included an on-device AI model that could run without requiring access to the internet, O'Donnell said. "This provides a channel through which they can get hundreds of millions of people to download their model," he said. In that scenario, the browser could access heavyweight AI models in the cloud to handle more demanding tasks.
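The split O'Donnell describes, a bundled on-device model for routine queries with cloud models reserved for heavier tasks, is a common local-first routing pattern. The sketch below illustrates that pattern only; the function names, the token-length heuristic, and the threshold are hypothetical placeholders, not any vendor's actual API.

```python
# Minimal sketch of a local-first AI routing pattern (hypothetical names).
# Routine prompts run against a bundled on-device model; requests judged
# too demanding are forwarded to a heavyweight cloud model instead.

def is_heavyweight(prompt: str, max_local_tokens: int = 512) -> bool:
    """Crude heuristic: very long prompts go to the cloud."""
    return len(prompt.split()) > max_local_tokens

def local_generate(prompt: str) -> str:
    """Placeholder for inference against the on-device model (works offline)."""
    return f"[on-device answer to: {prompt[:40]}]"

def cloud_generate(prompt: str) -> str:
    """Placeholder for a call to a larger hosted model (requires internet)."""
    return f"[cloud answer to: {prompt[:40]}]"

def answer(prompt: str, online: bool) -> str:
    # Prefer the local model; only escalate when the task is heavy
    # and a network connection is actually available.
    if is_heavyweight(prompt) and online:
        return cloud_generate(prompt)
    return local_generate(prompt)

if __name__ == "__main__":
    print(answer("Summarise this page in two sentences.", online=False))
```

The key design point is that the on-device model is the default path, so the browser stays useful offline, while the cloud remains an optional escalation rather than a dependency.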
At the core of the Lumex CSS platform is the Armv9.3 C1 CPU cluster, featuring Scalable Matrix Extension 2 (SME2) units. These units provide up to five times faster AI processing and a threefold improvement in efficiency compared with previous generations. The CPU cluster includes configurations such as C1-Ultra and C1-Pro cores, tailored to the demands of flagship devices. Micro-architectural improvements across the cores contribute to an average 30% performance uplift and a 12% reduction in power consumption for everyday mobile workloads.
Intel's upcoming "Panther Lake" processor for AI PCs could be a key building block in the transition of Windows 11 into an AI agent-driven OS, analysts said this week. Panther Lake will be available in PCs starting early in 2026. The chip meets the qualifications set by Microsoft for Windows Copilot+ PCs and provides the performance needed to run a new generation of AI applications, analysts said.
During the Snapdragon Summit on Maui, Qualcomm CEO Cristiano Amon gave a glimpse into where the mobile ecosystem his company supplies with chips is heading. Qualcomm envisions a future in which AI moves from the cloud onto your devices, taking care of everything for you in every possible way. Qualcomm invited us to attend the Snapdragon Summit, where two new chips were presented: a new smartphone chip and a new compute chip. The latter is primarily intended for laptops and mini PCs.
"The way this works is silicon partners build and maintain execution providers that Windows ML distributes, manages, and registers to run AI workloads performantly on-device, serving as a hardware abstraction layer for developers and a way to get optimal performance for each specific silicon," Microsoft says in the announcement. In simple terms, the platform enables AI-infused apps to tap into PC hardware that's best suited for their specific workload, such as GPUs for power-intensive tasks, NPUs for power efficiency, and CPUs for flexibility.
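The execution-provider concept Microsoft describes is also exposed in ONNX Runtime's public Python API, the runtime that Windows ML builds on: an app lists providers in priority order and the runtime assigns work to the first ones it can actually use. The sketch below is illustrative only; the "model.onnx" path and the input shape are hypothetical, and which providers exist on a given machine depends on the installed packages and drivers.

```python
# Rough illustration of execution-provider selection with ONNX Runtime.
# "model.onnx" and the dummy input shape are assumptions for illustration.
import numpy as np
import onnxruntime as ort

# Providers preferred in priority order: NPU (QNN) for power efficiency,
# then GPU (DirectML) for heavy work, then CPU as the universal fallback.
preferred = ["QNNExecutionProvider", "DmlExecutionProvider", "CPUExecutionProvider"]

# Keep only the providers this machine's onnxruntime build actually offers.
available = ort.get_available_providers()
providers = [p for p in preferred if p in available]

session = ort.InferenceSession("model.onnx", providers=providers)

# Which providers the runtime ended up assigning, in order.
print(session.get_providers())

# Run inference with a dummy input; the shape is assumed for this example.
input_name = session.get_inputs()[0].name
dummy = np.zeros((1, 3, 224, 224), dtype=np.float32)
outputs = session.run(None, {input_name: dummy})
```

The point of the abstraction is that the app expresses a preference order once, and the same code runs on machines with an NPU, a discrete GPU, or only a CPU.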
With the Galaxy Watch8 Series, Samsung is doing just that. For the first time ever, a smartwatch features Google's Gemini AI built in - not as a voice assistant you have to dig for, not as a cloud service that lags behind - but as a native, on-device intelligence layer that powers everything from workouts to grocery lists to urgent work messages.
Google's latest Gemini AI upgrades attempt to anticipate what useful information from your life you may need to address a potential issue, make you a better photographer, or become your personalised health and sleep coach. Shipping on the just-announced Pixel 10 Android phones, the new Magic Cue feature enables the chatbot to comb through your digital life and pull up relevant information on your phone just when you need it.