The Next Netflix Moment:
How LLMs are about to transform the ISP network

When streaming services like Netflix went mainstream, they didn’t just change how households consumed content. They redefined what “good broadband” actually meant. Networks that delivered Netflix flawlessly became the networks subscribers chose, recommended, and stayed loyal to. Netflix’s own ISP Speed Index became a public scoreboard, and ISP product teams optimized, quite literally, to climb it. Streaming wasn’t just a service riding on top of the network; it became a yardstick for the network itself.

Today, a new yardstick is emerging.

LLMs have been adopted faster than any consumer technology in modern history. ChatGPT became the fastest-growing consumer application ever, reaching 100 million users in two months, a pace UBS analysts called the steepest consumer-internet ramp they had seen in twenty years. And the data we see across Plume-enabled homes confirms the trend hasn’t peaked. It has accelerated. Subscribers aren’t just trying these tools; they are embedding them into how they work, how they think, and increasingly, how they protect their livelihood in a job market being reshaped by AI itself.

At Plume, we are studying how tools like ChatGPT, Claude and Perplexity, and the next wave of agentic assistants like Claude Code and Cowork, are showing up in real homes. What we see is striking.

AI is now a household app

As of April 2026, more than 1 in 5 active Plume-enabled locations, approximately 22%, show regular AI app traffic on their home Wi-Fi, up from 19% a year ago. That number is intentionally conservative: it captures only the AI traffic we can observe crossing the home network itself. Independent research suggests the broader picture is far larger: Parks Associates reports that 58% of US internet households self-report using generative AI tools across all their devices. In other words, AI in the home has plenty of runway left, and what is happening inside the homes already using it is anything but steady.

Usage isn’t just growing. It’s exploding

Among AI-using households, total data volume tied to AI apps has grown by roughly 2,000% year-on-year. Time spent connected to AI apps, what we measure as “online seconds”, has climbed 364% over the same period, and a further 80% since the start of 2026 alone. Median session length has lengthened by around 30% year-on-year.

The leader is unsurprising: ChatGPT now accounts for roughly 2x the time-on-app of any other AI tool in our footprint, with median sessions approaching 5 minutes. But the fastest movers may be more telling: Perplexity and Claude lead growth in both data volume and usage, while Microsoft Copilot is currently the slowest-growing of the major players.

These are not quick lookups. Multi-minute, interactive, real-time sessions imply something networks rarely had to optimize for in the chatbot era: sensitivity to latency, jitter and micro-outages, not just peak bandwidth.
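To make the distinction concrete, here is a minimal, illustrative sketch (not Plume code) of why two links with the same average latency can feel very different during an interactive AI session. It computes jitter as the mean absolute difference between consecutive round-trip-time samples, in the spirit of the RFC 3550 interarrival-jitter idea; the sample values are hypothetical.

```python
def jitter_ms(rtt_samples):
    """Mean absolute difference between consecutive RTT samples, in ms."""
    if len(rtt_samples) < 2:
        return 0.0
    diffs = [abs(b - a) for a, b in zip(rtt_samples, rtt_samples[1:])]
    return sum(diffs) / len(diffs)

# Two hypothetical links with the same ~40 ms average latency:
steady = [40, 41, 39, 40, 40, 41]  # low jitter: smooth token streaming
spiky  = [10, 80, 12, 75, 11, 52]  # high jitter: stalls mid-response

print(jitter_ms(steady))  # → 1.0
print(jitter_ms(spiky))   # → 61.2
```

A speed test would rate both links identically; only the jitter number reveals why one of them makes a streamed AI response stutter.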

From chatbots to autonomous agents

The most important signal in our data isn’t how much AI is being used. It is how it is being used. The first wave of consumer LLM adoption was conversational and human-prompted. The second wave, already visible, is agentic: AI tools that run multi-step tasks, call APIs, sync with mailboxes and storage, schedule, monitor, and act semi-autonomously on behalf of the subscriber.

For ISP networks, this is a different beast. Agentic AI introduces persistent background traffic as agents synchronize and monitor outside of active sessions; machine-to-machine and API-heavy patterns as multi-step workflows ripple across dozens of SaaS services; and bursty, recurring workloads triggered by a single user command. Reliability stops being a nice-to-have. When one micro-outage breaks a multi-step automation, the user doesn't see a buffering icon. They see a failed task. And increasingly, that task is part of how they earn a living.
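The failure mode above can be sketched in a few lines. This is a hypothetical toy runner, not any real agent framework, and the step names are invented for illustration: each step simulates a network call that can hit a micro-outage, and without retries a single outage fails the entire multi-step task.

```python
import random
import time

random.seed(0)  # deterministic outages for the illustration

def flaky_step(name, outage_prob=0.3):
    """Simulate one network-dependent step of an agent workflow."""
    if random.random() < outage_prob:
        raise ConnectionError(f"micro-outage during '{name}'")
    return f"{name}: ok"

def run_task(steps, retries=3, backoff_s=0.01):
    """Run steps in order, retrying each with exponential backoff."""
    results = []
    for step in steps:
        for attempt in range(retries):
            try:
                results.append(flaky_step(step))
                break
            except ConnectionError:
                if attempt == retries - 1:
                    raise  # the whole task fails, not just one request
                time.sleep(backoff_s * 2 ** attempt)
    return results

steps = ["fetch_mailbox", "summarize", "update_calendar", "notify_user"]
print(run_task(steps))
```

The point of the sketch: a 30% per-request outage rate still completes the task when the runner retries, but with `retries=1` the same workflow would fail roughly two times in three. For the network, that translates to demand for consistency across the whole task window, not just fast individual requests.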

A platform built to adapt

This is exactly the kind of shift Plume's AI platform was built to absorb. Our Application Intelligence identifies the most popular LLMs across our global footprint and tracks how subscribers actually use them. Our Behavior Intelligence monitors how the nature of that usage is evolving, from chatbot to autonomous agent, so the network can adapt in real time, not in hindsight. We also make these insights available to ISPs through our APIs and Operator Suite, so they can make better-informed network-planning decisions.

The AI landscape is moving faster than any consumer category before it. Plume will continue to study LLMs and their evolution into autonomous agents, and to ensure operators on our platform are ready to deliver Quality of Outcome, not just Quality of Experience, for the workflows their subscribers increasingly depend on.

The next Netflix moment isn’t coming. It’s already here.