By: Krish Sankar, Eddy Orabi, Steven Chin, Joshua Buchalter, John Blackledge, Shaul Eyal
Jun. 27, 2025
Overview:
- Following the app store (2008), 4G/video (2010), and the pandemic refresh (2021), we're now on the cusp of a fourth wave of smartphone adoption driven by on-device Artificial Intelligence (AI).
- Unlike prior waves, this cycle could see wider contention for the U.S. market, with more than one dominant player.
- We believe current low-parameter models are just early proof of concept. The real inflection point will come when a fully capable model is deployed on-device.
- Expect on-device AI to trigger a step-change in memory bandwidth, packaging technology and processor capabilities.
The TD Cowen Insight
Every major shift in compute has given rise to a new hardware experience — we believe generative artificial intelligence (GenAI) will do the same for smartphones. We see it as a multistage shift, with large language models (LLMs) eventually becoming the operating system itself.
Stages of AI Adoption in Smartphones
Our Thesis
In this report, we frame the adoption of LLMs in consumer hardware as a multistage evolution. The first stage involved simply downloading a powerful artificial intelligence (AI) chatbot app. The second stage brought partial LLM integration into apps through features like voice, search and deep research. The third and fourth stages — now just beginning — involve tighter integration between LLMs and the on-device operating system, bringing LLMs closer to on-device data: texts, emails, health information, group chats and more.
Hardware shifts come with a cost: We estimate a smartphone's power consumption could rise by 30% or more, primarily from higher memory bandwidth. That will place added pressure on other parts of the system stack, driving demand for lower-power radio frequency (RF), more efficient central processing units (CPUs) and improved OLED display performance.
What is Proprietary
To answer the core questions in this report, we built our own understanding of LLM system requirements — specifically, how model size and architecture impact memory bandwidth and compute utilization, and what thresholds are required for these models to be useful and efficient. We also examined the ambitions of the tech leaders shaping this space, how tools developed in the cloud today are influencing future consumer hardware, and how the progress of the cloud ecosystem compares to the challenges on the consumer hardware side.
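The link between model size and memory bandwidth can be sketched with back-of-envelope arithmetic: in autoregressive decoding, each output token requires reading roughly all model weights from memory, so decode speed on a phone is approximately memory-bandwidth-bound. The figures below (7B parameters, 4-bit quantization, ~50 GB/s LPDDR5 bandwidth) are illustrative assumptions, not measured device specs or estimates from the report.

```python
# Rough memory-bound decoding model: each generated token reads all model
# weights once, ignoring KV-cache traffic, batching, and compute limits.

def decode_tokens_per_sec(params_billions: float,
                          bytes_per_param: float,
                          mem_bandwidth_gb_s: float) -> float:
    """Approximate tokens/sec for memory-bandwidth-bound LLM decoding."""
    model_gb = params_billions * bytes_per_param  # weight bytes read per token
    return mem_bandwidth_gb_s / model_gb

# Hypothetical example: a 7B-parameter model quantized to 4 bits
# (0.5 bytes/param) on a phone with ~50 GB/s of memory bandwidth.
print(round(decode_tokens_per_sec(7, 0.5, 50), 1))  # ~14.3 tokens/sec
```

This simple model shows why on-device AI pressures memory bandwidth specifically: halving precision or doubling bandwidth roughly doubles decode speed, while a larger, more capable model slows it proportionally.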
What to Watch
- A multinational Silicon Valley technology company could release a GenAI software development kit (SDK).
- We expect the company to significantly upgrade the underlying model powering its in-house AI platform.
- One likely scenario, in our view, is that this multinational technology company acquires a third-party LLM company.
Subscribing clients can read the full report, Rethinking the Everyday Device: AI As The New Operating Layer - Ahead of Curve, on the TD One Portal.