
The 49MB web page

Why This Matters

The article highlights the alarming growth of web page sizes, exemplified by a 49 MB page from The New York Times, which significantly hampers user experience and raises concerns about excessive tracking and ad bloat. The trend underscores the need for better web optimization and stronger privacy protections in the tech industry, affecting consumers and publishers alike.


Published March 12th, 2026

If actively distracting the readers of your own website were an Olympic sport, news publications would top the podium every time.

I went to The New York Times to glance at four headlines and was greeted with 422 network requests and 49 megabytes of data. It took two minutes for the page to settle. And then people wonder why every sane tech person installs an adblocker on the systems of all their loved ones.

It is the same story across top publishers today.

To truly wrap your head around a 49 MB web page, let's travel back a few decades. This single page load is larger than Windows 95 (28 floppy disks). The OS that ran the world fits comfortably inside one modern page load.

In 2006, the iPod reigned supreme and digital music was precious. A high-quality MP3 at 192 kbps took up around 4 to 5 MB, so this one page represents roughly 10 to 12 full-length songs. I essentially downloaded an entire album's worth of data just to read a few paragraphs of text.

According to the International Telecommunication Union, the global average broadband speed back then was about 1.5 Mbps. At that rate, your browser would keep loading this monstrosity for over four minutes, enough time for you to walk away and make a cup of coffee.
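The back-of-the-envelope numbers above are easy to check in a few lines. The 1.44 MB floppy capacity and the 4.5 MB average song size are assumptions taken from the standard 3.5" disk and the article's 4-to-5 MB range:

```python
# Back-of-the-envelope math for the 49 MB page load.
PAGE_MB = 49
FLOPPY_MB = 1.44       # standard 3.5" floppy capacity (assumption)
SONG_MB = 4.5          # ~192 kbps MP3, midpoint of the article's 4-5 MB range
SPEED_MBPS = 1.5       # ITU global average broadband speed, ca. 2006

floppies = PAGE_MB / FLOPPY_MB           # how many disks this page fills
songs = PAGE_MB / SONG_MB                # ~an album's worth of MP3s
download_s = PAGE_MB * 8 / SPEED_MBPS    # megabits / (megabits per second)

print(f"{floppies:.0f} floppies, {songs:.1f} songs, {download_s / 60:.1f} minutes")
# -> 34 floppies, 10.9 songs, 4.4 minutes
```

The 2006 download time works out to about 261 seconds, which is where "enough time to make a cup of coffee" comes from.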

If hardware has improved so much over the last 20 years, has the modern framework/ad-tech stack completely negated that progress with abstraction and poorly architected bloat?

CPU throttles, tracking and privacy nightmares

News websites really, really like to track.

For the example above, a cursory look at the network waterfall for a single article load reveals a sprawling, unregulated programmatic ad auction happening entirely in the client's browser. Before the user finishes reading the headline, the browser is forced to process dozens of concurrent bidding requests to exchanges like Rubicon Project (fastlane.json) and Amazon Ad Systems. The requests themselves are asynchronous over the network, but their payloads are incredibly hostile to the browser's main thread: to facilitate the auction, the browser must download, parse and compile megabytes of JavaScript. As a publisher, you shouldn't be burning compute cycles calculating ad yields before rendering the actual journalism.
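You can reproduce this kind of waterfall audit yourself: save the page load as a HAR file from your browser's devtools and total the transferred bytes per host. A minimal sketch, assuming a standard devtools HAR export (the `page.har` filename is illustrative, and `_transferSize` is a Chrome-specific field, so the code falls back to the decoded body size):

```python
import json
from collections import Counter
from urllib.parse import urlparse

def bytes_by_host(har_path: str, top: int = 10):
    """Total transferred bytes per hostname in a devtools HAR export."""
    with open(har_path) as f:
        entries = json.load(f)["log"]["entries"]
    totals = Counter()
    for e in entries:
        host = urlparse(e["request"]["url"]).hostname or "unknown"
        # Prefer the on-the-wire size; fall back to the decoded body size.
        size = e["response"].get("_transferSize", -1)
        if size < 0:
            size = e["response"]["content"].get("size", 0)
        totals[host] += max(size, 0)
    return totals.most_common(top)

# Usage (after exporting a HAR from the Network panel):
#   for host, nbytes in bytes_by_host("page.har"):
#       print(f"{nbytes / 1e6:6.2f} MB  {host}")
```

Sorting by host makes the split between first-party journalism and third-party ad-tech immediately visible.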

The user requests text. The browser downloads 5MB of tracking JS. A silent auction happens in the background, taxing the mobile CPU. The winning bidder injects a carefully selected interstitial ad you didn't ask for.
