Battle Web Page Bloat

"We despise having to download megabytes of crapware"

In my opinion, showing empathy for web readers means not burdening them with obese article pages that are loaded with unnecessarily huge and maybe irrelevant images, JavaScript, ads, trackers, and unknown objects. Simplifying leads to faster performance and a better reading experience.

A single web page can be several megabytes in size once everything is downloaded to the user's device. Pounds of cruft can bog down older machines and create a clunky UI/UX.

JavaScript is not the problem. The misuse or overuse of JavaScript is the problem.

Here are the March 7, 2016 webpagetest.org results for a Toledo Blade op-ed.

First View Fully Loaded:

A single web page makes 952 requests and ends up being 5 megabytes in size. The op-ed contained a few hundred words of text with no images.

In June 2016, the Toledo Blade introduced a new version of its website, and as of late July 2016, it appears that the Blade has trimmed its pages by more than 50 percent.

July 2016 webpagetest.org results for the same March 2016 op-ed.

From: Dulles, VA - Chrome - Cable 7/21/2016, 4:50:42 PM

First View Fully Loaded:

The page is still too large, but at least the Blade is trending in a positive direction. Just because our computers of all sizes have more memory and faster CPUs doesn't mean publishers should create sprawling web pages.

I host some of my websites at Digital Ocean, and I have total access to my own virtual server. For these sites, I use the Nginx web server.

Here's a simple HTML page of the same op-ed, stored on my Digital Ocean server (obvious copyright violation to make a point).

The actual file size of the HTML page is around 8 KB. Nginx compression shrinks the download size of the page to 5 KB.
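
For anyone curious, enabling that kind of compression takes only a few lines of Nginx configuration. The following is a minimal sketch with assumed values, not my exact settings:

    # In the http or server block. text/html is compressed by default once gzip is on.
    gzip on;
    gzip_comp_level 5;      # moderate CPU cost for most of the size reduction
    gzip_min_length 256;    # skip responses too small to benefit
    gzip_types text/css application/javascript application/json image/svg+xml;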

webpagetest.org results

From: Dulles, VA - Chrome - Cable - 7/21/2016, 5:34:39 PM

First View Fully Loaded:

Some tests show the download time to be around 0.3 seconds. Even a heavily-trafficked, properly-configured website should be able to serve the text portions of simple, static HTML files in about a second.
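
Anyone can sanity-check numbers like these from the command line with curl. Here is a minimal sketch; the URL is a placeholder, not the actual location of my test page:

    # Report the compressed download size and the total transfer time for one page.
    curl --compressed -s -o /dev/null \
      -w "downloaded: %{size_download} bytes in %{time_total}s\n" \
      https://example.com/op-ed.html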

Obviously, for-profit publishers want to monetize their websites, which means bombarding users with ads and trackers. It's a conundrum. How can publishers make money from their web content while providing a reader-friendly experience?

My answer is a paywall. Restaurants and many other businesses do not give away their creations for free. At Etsy, I could pay $15 to $30 for a crocheted beanie, but the Toledo Blade gives away its craft for free.

I feel bad for the writers, editors, and everyone else at newspaper orgs. Their service is needed at the local level. But bloated web design is indefensible.

I suppose I might be the only person willing to pay a hefty annual subscription fee for content displayed as simply as the HTML example that I created above. Photos and illustrations are still welcome. In fact, more images should be posted. But publishers should simplify the delivery container.

A fast, simple delivery mechanism does not improve bad writing. But good writing and important writing can be lost or ignored when the delivery mechanism is an abomination.

Even digital-only media sites that have formed in recent years and never created a print version are designing massively bloated websites.

Here's how to speed up an article page for browsing-only readers:

- Serve the article as a simple, mostly static HTML page.
- Include the viewport meta tag and a small amount of CSS so the page reads well on phones.
- Skip client-side JavaScript for readers who are only reading; add it later, sparingly, where it genuinely helps.
- Compress responses at the web server, and cache pages so repeat requests are cheap.
- Keep images relevant and appropriately sized, and drop the third-party ad and tracker scripts.
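
As a rough sketch of that approach (the title, wording, and styling below are made up for illustration, not an actual Blade page), the entire delivery container for an article can be this small:

    <!DOCTYPE html>
    <html lang="en">
    <head>
    <meta charset="utf-8">
    <meta name="viewport" content="width=device-width, initial-scale=1">
    <title>Op-ed title</title>
    <style>
      /* enough styling to be readable on phones and desktops */
      body { max-width: 40em; margin: 1em auto; padding: 0 1em;
             font-family: Georgia, serif; line-height: 1.5; }
    </style>
    </head>
    <body>
    <h1>Op-ed title</h1>
    <p>A few hundred words of article text go here...</p>
    </body>
    </html>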

But I don't expect publishers to improve their reader-hostile web designs in the future. The only change will be that the websites get worse.

Thanks to poorly-designed, slow websites, we will have more services like Facebook's Instant Articles and Google's Accelerated Mobile Pages because most people read on their phones.

Some day, it may be pointless for news orgs to have their own websites because they will publish their content on other platforms. It's not Facebook's fault. It's the fault of the publishers. And we can't blame the content management systems. Humans make the decisions.

Other people have plenty to say on the subject.


Do you make grilled cheese with a flamethrower? So why are you using javascript to render a page of text?


An analysis of download times highlights how poorly designed news sites are. That’s more evidence of poor implementation of ads… and a strong case for ad blockers.

Websites designers live in a bubble, they’re increasingly disconnected from users.

Today, a news site web page consists of a pile of scripts and of requests to multiple hosts, in which relevant content makes up only an insignificant proportion of the freight.

Consider the following observations: When I click on a New York Times article page, it takes about 4 minutes to download 2 megabytes of data through… 192 requests, some to Times’ hosts, most to a flurry of other servers hosting scores of scripts. Granted: the most useful part — 1700 words / 10,300 characters article + pictures — will load in less than five seconds.

But when I go to Wikipedia, a 1900 words story will load in 983 milliseconds, requiring only 168 kilobytes of data through 28 requests.

Unfortunately, "only" 2 megabytes of crud and "only" 192 requests is now considered good for a single web page by today's standards.


I’ve said previously that not wasting readers’ time is one of the primary respect metrics for a site, along with quality and legibility of content. You want each page to be a few hundred kilobytes at most, in total, and it should load in half a second. That’s the goal. I’m constantly tweaking things to make this site as fast as possible. People notice, and they appreciate it.

You can’t always make it small, but the size should come from content, not cruft. You might not have the freedom or the expertise to make it fast, but the delay should be from latency, not the transfer and rendering of resources. What you can do is get rid of all the stuff you don’t need, and put the words first.


"Ads on news sites gobble up as much as 79% of users' mobile data"

One of the reasons consumers download mobile ad blockers is the impact ads have on their data plans. A report released Wednesday from Enders Analysis appears to back up that claim — at least when it comes to a sample of news websites.

I submitted the above BI article to WebPageTest.org.

results for First View, Fully Loaded:


Google’s ongoing project to speed up the web with its Accelerated Mobile Pages has focused on how it can speed up page loads for publishers.

The Catch-22 with digital ads is that as they’ve become richer over the years, they’ve also got heavier and, therefore, slowed page loads — a factor that’s had a fundamental role in the rise of ad blocking. Google cites research showing the average mobile site takes 19 seconds to load.

Web page bloat won't be reduced until people understand the cause of the problem. Note what the writer said about Google's AMP project, "speed up the web." That's incorrect.

The horribly slow page load time is not the fault of the mobile web infrastructure, mobile web browsers, or WiFi and cell connections. It's 100 percent the fault of publishers.

The web is fine. Google is trying to speed up the page load time of web pages, hosted by bloated websites. Google is trying to do the work that publishers should be doing: simplifying a web page. Google is not speeding up the web. Google is trying to improve a reader's user experience because publishers cannot or will not do this.


Designer News comment:

Main reason it is faster than most news sites, beyond all this excellent work: they're not beholden to 3rd-party advertising tech.


If there were no bloated ads, some top websites would load up to 90% faster.

Proper terminology was used: websites would load faster, not that the web would be faster. Websites are the problem, not the web.


The following admission came from a guest writer who works for an ad company. Will media people comprehend the article?

Many of these tech “solutions” don’t add real value and instead clutter up the user experience, slow down pages, and drive people away. So we – the ad tech community – need to create products that put the user experience first.

I’m not the first to observe that there are way too many pieces of executable code that run on publishers’ pages. Unsurprisingly, these scripts periodically crashed the sites on which they appeared.

... we commonly see more than 500 servers called from a single publisher page. Most [page scripts] do nothing that benefits the publisher. On the contrary, they just leak data at best, and at worst they reduce the consumer’s privacy and experience.

Furthermore, publishers may unwittingly be violating their visitors’ privacy by allowing scripts to grab or use data without consent.

Page scripts that cause latencies lead to delays in delivering content, turning impatient users away. The longer the load times, the steeper the “user decay” curve, a term we use to describe the pattern of page abandonment caused by latency.

Most publishers aren’t intentionally overloading a page with scripts to create a horrid user experience, goading their precious audiences into leaving or spurring them to install ad blockers.

When publishers can learn what’s slipping onto their pages and keep those pages working smoothly, they’ll have the control to give users the experience they deserve.


I’m in complete agreement with Heydon here:

But it turns out the only surefire way to make performant Web Stuff is also to just write less. Minify? Okay. Compress? Well, yeah. Cache? Sounds technical. Flat out refuse to code something or include someone else’s code in the first place? Now you’re talking.

... if you demand that everything must justify its existence, you end up with a better experience for everyone:

My favorite thing about aiming to have less stuff is this: you finish up with only the stuff you really need — only the stuff your user actually wants. Massive hero image of some dude drinking a latte? Lose it. Social media buttons which pull in a bunch of third-party code while simultaneously wrecking your page design? Give them the boot. That JavaScript thingy that hijacks the user’s right mouse button to reveal a custom modal? Ice moon prison.


The botching of the mobile web experience isn’t the phone browser, it’s the web developer. Developers need to stop being lazy.


HN comment:

Am I in the minority that I don't want an app for every. bloody. webpage. I visit?

Another HN comment:

I also don't want every mobile webpage I visit to use some slow janky JavaScript framework to emulate native app behavior either because in my experience the user experience for those are universally worse than just trying to be a relatively normal web page (perhaps with some media queries for image sizes, etc) and letting the mobile web browser do its thing.


Let me start by saying that beautiful websites come in all sizes and page weights. I love big websites packed with images. I love high-resolution video.

I love sprawling Javascript experiments or well-designed web apps. This talk isn't about any of those. It's about mostly-text sites that, for unfathomable reasons, are growing bigger with every passing year.

Here’s an article on GigaOm from 2012 titled "The Growing Epidemic of Page Bloat". It warns that the average web page is over a megabyte in size. The article itself is 1.8 megabytes long.

Here's an almost identical article from the same website two years later, called “The Overweight Web". This article warns that average page size is approaching 2 megabytes. That article is 3 megabytes long.

... consider this 400-word-long Medium article on bloat, which includes the sentence: "Teams that don’t understand who they’re building for, and why, are prone to make bloated products." The Medium team has somehow made this nugget of thought require 1.2 megabytes.

That's longer than Crime and Punishment, Dostoyevsky’s psychological thriller about an impoverished student who fills his head with thoughts of Napoleon and talks himself into murdering an elderly money lender.


“Sluggish” is too tame a word for what we endure now, due to an accumulation of terrible news-industry design and business choices in recent years.

Before getting into details about what’s happening here, let’s be clear on something. AMP wouldn’t be necessary — assuming it is in the first place — if the news industry hadn’t so thoroughly poisoned its own nest.

Looking for money in a business that grows more financially troubled by the month, media companies have infested articles with garbage code, much of it on behalf of advertising/surveillance companies, to the extent that readers have quite reasonably rebelled.

We don’t like slow-responding sites, period. On our mobile devices, which are taking over as the way we “consume” information, we despise having to download megabytes of crapware just to read something, because the carriers charge us for the privilege.

That’s one reason why we use ad blockers. (The other, at least for me, is that we despise being spied on so relentlessly.)

The news business could have solved this problem without racing into the arms of giant, centralized tech companies. But it didn’t, and here we are.

What if news sites had just done the right thing in the first place? Or, since they didn’t, what if they just resolved to build faster pages — using standard HTML markup and loading components in a non-annoying way — now?


The web doesn’t suck. Your websites suck. All of your websites suck.

You destroy basic usability by hijacking the scrollbar. You take native functionality (scrolling, selection, links, loading) that is fast and efficient and you rewrite it with ‘cutting edge’ javascript toolkits and frameworks so that it is slow and buggy and broken.

You balloon your websites with megabytes of cruft. You ignore best practices. You take something that works and is complementary to your business and turn it into a liability.

The lousy performance of your websites becomes a defensive moat around Facebook. The web is capable of impressive performance and offers a dramatically vaster reach and accessibility than anything an app can offer.


JavaScript has brought the web to the brink of ruin, but there’s no JavaScript in podcasting. Just an RSS feed and MP3 files.

I understand what Gruber was saying, but I would qualify it more.

"The misuse of JavaScript has brought the web to the brink of ruin."

It's not the usage of JavaScript that's the problem. It's the excessive, unnecessary usage.

Who gets to define the misuse? Does the article page or site function fine without JavaScript? What does a thousand pounds of JavaScript files do for the single article page that's accessed by a browsing-only reader?

For a site that requires the user to log in, I expect the dashboard or admin interface to employ JavaScript in an elegant manner, designed and developed by extremely talented people.

In my opinion, that happens when I log into my Digital Ocean account. Their usage of JavaScript is elegant. It helps make the experience smooth and easy, and it is not used to show off. The JavaScript stays in the background, maybe even unnoticed, which is even better. I log in, perform a task or two, and then exit. I'm not looking to be wowed by fancy tech.

Ditto for my FastMail account, which is my favorite email service. In my opinion, FastMail's JavaScript is also used elegantly.

For content sites where I don't log in, I don't understand the overuse of client-side JavaScript. That's why I surf the web with JavaScript and other things disabled by default, thanks to the NoScript plugin for Firefox and the Quick Javascript Switcher extension for Chrome.

Using JavaScript for useless extravagance is breaking the web.


Top-rated HN comment:

On a more serious note - RSS is the Great Web Leveller. It spites your fancy CSS hacks, it's disgusted by your insane javascript, and it will piss all over your "mobile-optimized" crap. No semantic markup == no party; because markup is for robots, and RSS parsers are very stubborn robots that can see through web-hipster bullshit like Superman through walls.

Another HN comment well down in the thread:

Regarding reading news on the Kindle: I've noticed that the browser becomes much more responsive (read: usable) if one disables JavaScript altogether.

Sure, some websites will break, but most will work well enough to be able to read articles.

Disabling JavaScript is also my trick for actually being able to browse the internet on a phone these days. In Firefox for Android, you can install a plugin to toggle it on/off for when you need it.

A bit sad that you have to do this to get a decent experience, but what can you do...

I read many websites with JavaScript disabled within the browser on my desktop and laptop computers. At times, it feels like the latest, greatest CPU is required to read web articles, especially those produced by media orgs. Pages load incredibly fast when JavaScript is disabled.

Web pages also load fast when viewing sites with the text-based browser called Links.


Speaking of text-based access to websites ...

Here are a few HN comments with my emphasis bolded and additions contained within brackets:

"If it doesn't load through curl, it's broken." --someone So, so true. Thanks, curl.

I wasn't saying not to do the fancy stuff but rather to start with something which degrades well and then have your JavaScript enhance that basic experience.

I've been using websites since the early 90s and this pro-single-page sentiment [SPA] is getting really tiresome. You are breaking the web. You are destroying users' security. Sure, there are plenty of reasons to use JavaScript, and plenty of places where it's appropriate. It probably is a good idea for games and so forth. But requiring users to load and execute constantly-changing code from across the web in order to read a page or submit a form is in-friggin-sane.

Someone else pointed out that it'd be nice if browsers offered more support for things that certain types of developers clearly want to do. I completely agree; it'd definitely be nice to take advantage of many of the technologies which currently exist to do more, in a more structured way. But requiring code execution in order to read data [plain text] is madness.


Because on the desktop we’re all used to seeing the absolute worst of the web. That is, ridiculous widgets, awful JavaScript load times, and, of course, ads galore. AMP stripped all of the crud away and just gave me unadulterated content. And gave it to me fast.

It was such a revelation. I wanted to view all web-based content this way. Not just on mobile, everywhere.

Welcome to the dark side of wanting faster, simpler, browsing-only websites. As mentioned above, a similar experience can be had by disabling JavaScript within the web browser. Google AMP is only needed when the website is poorly designed.


From the Wired.com article:

THERE’S ANOTHER WEB out there, a better web hiding just below the surface of the one we surf from our phones and tablets and laptops every day. A web with no ads, no endlessly scrolling pages, and no annoying modal windows begging you to share the site on social media or sign up for a newsletter.

The best part is that you don’t need a special browser extension or an invite-only app to access this alternate reality. All you need to do is change one little setting in your browser of choice. Just un-tick the checkbox that enables “JavaScript” and away you go, to a simpler, cleaner web.

Pages loaded nearly instantly, my laptop battery lasted longer, and I could browse the web with fewer distractions—all without the sense of guilt that comes with using an ad blocker.


In case your experience with computers started after 2010, this: [image] … is a 3.5 inch floppy disk. Standard capacity: 1.44MB.

Today, I tried to read this (really quite good) article from The Atlantic on my iPhone 5S. First time around I tried in a webview. Next attempt was in Safari. It crashed both of them.

Digging for Dinosaurs in My Twenties is a little over 6,200 words and has four images. Saved in rich text format, the words come to 37KB of data, the four images total 346KB (exactly the same images are served to my desktop browser as are to my phone).

Given all the information so far, guess how many floppy disks it would take to fit an article from The Atlantic?

Answer: at least 15

“No, no,” I hear you grumble, “37 plus 346 equals 383KB which is a little over one quarter of the 1.44MB capacity of a floppy disk.”

Well, of course. Unless you include all the other crap that comes with the article.

By the time Safari had crashed, I’d logged 21MB of ads, pixels, and associated scripts that had been downloaded onto my phone. If the main idea was to heat my phone so my hand could stay warm against the San Francisco fall, nice job everybody!

If, on the other hand, the idea was that I could read the article, without scrolling being deathly sluggish, and maybe actually make it to the end before it crashed the browser. Yeah, you failed.

... seriously… does anybody at The Atlantic read their own content on their own site? On a phone… just the same as an ever-increasing number of users consume content? Or were they too busy checking it in Instant Articles and Apple News to take a look at their own mobile view?


HN comment:

Just showed some of these to a generally non-tech-savvy friend who said he didn't like them because they looked "too 90s." Personally I love them because they load fast, are easy to read, and don't require a knowledge of a bunch of different frameworks to write.

Another HN comment:

I have been fighting for years to get people used to "90s aesthetics."

It's even more important for web design. Give me simple HTML with a touch of css, and javascript only if it's absolutely necessary. I can think of hardly any websites that I would consider "beautiful" these days for exactly this reason.

PS: I'm not sure I would classify these sites as brutalist; perhaps 'utilitarian' or 'functional' would be better descriptors.

HN comment:

You can make something that doesn't require tons of frameworks and loads fast while NOT looking like a relic of the days of Kazaa. The fact that so many developers are too lazy to do so does not mean we should throw the baby out with the bathwater and go back to times new roman black-on-white.

Worthwhile CSS and JavaScript are fine.

HN comment:

As an overt visual design paradigm, meh. But hallelujah to the idea of a page that just has content, without the trendily de rigueur fucktons of overblown css and pointless javascript that adds 0 and only serves to crash my crappy mobile browser.

http://thin.npr.org

thin.npr.org is a better web design than most slick-looking media websites today. Okay, it lacks the viewport meta tag, so it doesn't display as nicely on a phone, but at least a reader can view the site in landscape mode and zoom in to enlarge the text. And the browser's back button works properly with the site. Links are underlined.

The NPR thin site supports the open web better than most websites. It uses minimal HTML. The pages are lightweight and fast-loading. A tiny bit of inline JavaScript exists: four lines, two of which are curly braces. No external JavaScript is loaded. No inline or external CSS is used.

webpagetest.org results for the thin.npr.org article "FBI Investigates Possible Russian Connection To Leaked DNC Emails":

From: Dulles, VA - Chrome - Cable - 7/26/2016, 4:06:41 PM

First View Fully Loaded:

Obviously, the web works well. This news story would load fast over a slow internet connection. If the article included a viewport meta tag at the top of the HTML page, along with a smidgen of CSS, then the page would display better on a phone. But at least a phone user can zoom into the article and/or read the article in landscape mode.


From the mobiforge.com article:

In July 2015, inspired by something Mat Marquis said at TXJS 2015, I suggested that the average web page weight would equal that of the Doom install image in about 7 months time.

Well, we’ve made it, albeit a bit later than expected.

Recall that Doom is a multi-level first person shooter that ships with an advanced 3D rendering engine and multiple levels, each comprised of maps, sprites and sound effects. By comparison, 2016’s web struggles to deliver a page of web content in the same size. If that doesn’t give you pause you’re missing something.

From the Wired.com article:

Today the average webpage is about the same size, data-wise, as the classic computer game Doom, according to software engineer Ronan Cremin.

A compressed copy of the installer for the shareware version of Doom takes up about 2.39MB of space. Today’s average webpage, meanwhile, requires users to download about 2.3MB worth of data, according to HTTP Archive, a site that tracks website performance and the technologies they use.

That’s not a totally analogous comparison, but it does illustrate the web’s growing obesity problem. Our story on giving up JavaScript for a week weighs in at 3MB.

So how did we get here? As internet connections have gotten faster, publishers and developers worry less about efficiency.


I can almost hear the Hacker News comments now, about what a luddite I am for not thinking five paragraphs of static text need to be infested with a thousand lines of script. Well, let me say proactively: fuck all y’all.

I think the Web is great, I think interactive dynamic stuff is great, and I think the progress we’ve made in the last decade is great. I also think it’s great that the Web is and always has been inherently customizable by users, and that I can use an extension that lets me decide ahead of time what an arbitrary site can run on my computer.

I’m not saying that genuine web apps like Google Maps shouldn’t exist. I’m saying that something has gone very wrong when basic features that already work in plain HTML suddenly no longer work without JavaScript.

40MB of JavaScript, in fact, according to about:memory — that’s live data, not download size. That might not sound like a lot (for a page dedicated to showing a 140-character message?), but it’s not uncommon for me to accumulate a dozen open Twitter tabs, and now I have half a gig dedicated solely to, at worst, 6KB of text.

Maybe one day a year, get your whole dev team to disable JavaScript and try using your site. Commence weeping.

The web is not a video game console; act accordingly. Keep your stuff modular. Design proactively around likely or common customizations. Maybe scale it down a bit once you hit 40MB of loaded script per page.

Sometimes, it's necessary for a web site to function like a native app, but I don't think that includes media websites where the user is only reading the content.

For text-heavy websites, publishers should build websites, instead of trying to build native app sites. If publishers want native app functionality, then they should build a native app. Publishers should quit trying to make websites act like native apps.

When I log into sites that provide functions such as:

- webmail (FastMail, in my case)
- hosting and server management (Digital Ocean)
- other account dashboards and admin interfaces

... then I expect a JavaScript-enhanced interface. And when it is done elegantly, the experience is pleasant and satisfactory. I complete the desired tasks, and I move on.

Could those web sites/apps/services function with JavaScript disabled? Possibly, but the experience could be awkward, slower, and error-prone.

When only the necessary JavaScript is used, it's fine. Elegance over extravagance. The focus should be on utility.


The author expressed strong opinions on many web development areas, including Single Page Applications and React.js.

"Really all I’m saying is don’t build a SPA. A SPA will lock you into a framework that has the shelf life of a hamster dump. When you think you need a SPA, just stop thinking."

From what I can tell the background of React is: Facebook couldn’t create a notification indicator on Facebook.com, so they create this over engineered dump pile to solve that problem. Now they can tell you how many unread posts you have while downloading 15 terabytes of Javascript.

I have the Chrome plugin that shows when a site uses React. I swear every other website I visit uses React, for the stupidest stuff. So many content sites use React. It’s pitiful.

Good job Yahoo, you rewrote your shitty mail client in React. Your customers didn’t give a shit. They just want it to work. My poor wife with her shitty Chromebook. She can play Crysis at 60fps, but fuck if she can read her email on Yahoo.

I rarely use my Yahoo email account, and I agree, it's a garbage web site. I started using Yahoo email in 1999, and I would bet that their '99 version would be more useful than their 2016 version. But I don't blame React.js for Yahoo's poor email app design.


The world wants Single Page Apps (SPAs), meaning we have to move huge amounts of logic from the server to the browser. We’ve been doing this for years, but in 2015, we’ve found better ways to build these large sprawling front end apps.

Eewww. Maybe the world wants native apps. Why not simply build native apps?

Are these SPAs used for internal web apps at companies to perform tasks by logged-in users? If so, then okey-dokey.

... we’ve found better ways to build these large sprawling front end apps.

Great. Saddle users' devices with large, sprawling front-end apps. If these piles of steaming poop are used to display text-based content to non-logged-in users, then why build them?

If the user experience is improved, then the SPA is a success. If the user experience is diminished by a bloated, sluggish, clunky website, then the SPA is a massive failure. Return to 1995 web development and then progressively enhance with a light touch.


The web doesn’t suck. Your websites suck. All of your websites suck. You destroy basic usability by hijacking the scrollbar. You take native functionality (scrolling, selection, links, loading) that is fast and efficient and you rewrite it with ‘cutting edge’ javascript toolkits and frameworks so that it is slow and buggy and broken.

You balloon your websites with megabytes of cruft. You ignore best practices. You take something that works and is complementary to your business and turn it into a liability. The lousy performance of your websites becomes a defensive moat around Facebook.

If your web developers are telling you that a website delivering hypertext and images can’t be just as fast as a native app (albeit behaving in different ways) then you should fire them.

Peter-Paul Koch, web browser tester extraordinaire, picks up on the phrase that I highlighted in the John Gruber quote and runs with it.

The web definitely has a speed problem due to over-design and the junkyard of tools people feel they have to include on every single web page. However, I don’t agree that the web has an inherent slowness. The articles for the new Facebook feature will be sent over exactly the same connection as web pages. However, the web versions of the articles have an extra layer of cruft attached to them, and that’s what makes the web slow to load. The speed problem is not inherent to the web; it’s a consequence of what passes for modern web development. Remove the cruft and we can compete again.

From "Tools don’t solve the web’s problems, they ARE the problem" by Peter-Paul Koch

We continue to have this problem because your web developers are treating the web like an app platform when your very business hinges on it being a quick, lightweight media platform with a worldwide reach.

The web is capable of impressive performance and offers a dramatically vaster reach and accessibility than anything an app can offer.


HN comment:

If you have an engineering mind and care about such things - you care about complexity. Even if you don't - user experience matters to everyone.

Have you ever seen something completely insane and everyone around doesn't seem to recognize how awful it really is. That is the web of today. 60-80 requests? 1MB+ single pages?

Man, the olden days of only 1 MB web pages. More from the HN commenter:

Your functionality, I don't care if its Facebook - does not need that much. It is not necessary. When broadband came on the scene, everyone started to ignore it, just like GBs of memory made people forget about conservation.

The fact that there isn't a daily drumbeat about how bloated, how needlessly complex, how ridiculous most of the world's web applications of today really are - baffles me.

But I disagree with this HN comment:

The real problem of web development is JavaScript. It’s a relic of the past that hasn’t caught up with times and we end up with half-assed hacks that don’t address the real problem. We need something faster and way more elegant.

I'm writing this page with a JavaScript editor whose open source code I borrowed in the summer of 2013 and then hacked to meet my requirements. I have installed versions of this editor in my Junco, Grebe, Scaup, Veery, and Wren web publishing apps. I can use it easily on my phone. I write fast with it. It works for me and my web writing purposes. For this, I LOVE JavaScript.

And "editor" may be an incorrect term to use for "my" writing app. I disabled features that existed in the original code, and I added several new features. The code uses Ajax to send and receive JSON. The JavaScript editor sends markup to the server where the server code converts it into HTML and returns it to the editor. Simple. It's my preferred writing environment for the web.

My Grebe, Scaup, and Veery web publishing apps cache article pages and homepages in Memcached. If a page is not cached, it's pulled from the MySQL or CouchDB database, and then it's cached. Most of the time, browsing-only users receive the cached page.
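
The caching pattern is plain cache-aside. Here is a minimal sketch in Python using pymemcache; the function names and the 300-second lifetime are assumptions for illustration, and my apps are not necessarily written this way:

    # Cache-aside: look in Memcached first, fall back to the database, then cache.
    from pymemcache.client.base import Client

    cache = Client(("localhost", 11211))
    CACHE_SECONDS = 300  # hypothetical lifetime for a rendered page

    def fetch_page_from_db(slug):
        # Placeholder for the real lookup (MySQL or CouchDB); returns rendered HTML.
        return "<html><body><p>Article: %s</p></body></html>" % slug

    def get_page(slug):
        key = "page:" + slug
        cached = cache.get(key)            # hit: return the stored HTML
        if cached is not None:
            return cached.decode("utf-8")
        html = fetch_page_from_db(slug)    # miss: build the page from the database
        cache.set(key, html, expire=CACHE_SECONDS)
        return html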

My Wren publishing app does not use a database. It creates static HTML pages. I'm using Wren to create this and the other related pages.


“We want to make it really easy for publishers of all shapes and sizes to publish AMP-formatted pages, from the New York Post all the way down to people running their own personal blogs,” said Paul Maiorana, vice president of platform services at WordPress.com parent company Automattic.

Why not create simple, fast-loading, reader-friendly pages by default?

Under the hood, AMP works by simplifying and streamlining the HTML code that powers Web pages to prioritize speed. Google also “caches” pages, or saves copies of them on its own systems, in order to deliver them quicker when users access them. It’s an open-source initiative, meaning anyone is free to use it.

Again, content producers on their own can create simple HTML pages, and they can cache their own content for faster access, thus creating a reader-friendly experience.
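
Serving and caching simple pages is not exotic work, either. Here is a hedged sketch of the sort of Nginx configuration involved, with assumed paths and times:

    # Serve prebuilt static article pages and let browsers and CDNs reuse them briefly.
    location /articles/ {
        root /var/www/site;                 # hypothetical document root
        expires 10m;                        # short-lived client-side caching
        add_header Cache-Control "public";
        gzip_static on;                     # optional: serve precompressed .gz files
    }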


... page weight does matter. Access can be slow, expensive and prohibitive.

People have come to learn that web fonts take a long time to load. Boom! Just disable them. And along with custom fonts, there go the icon fonts too, so we need to be thinking about fallback images and text for font icons. Or better yet, use SVG instead.

Most importantly, if disabling enhancements like JavaScript or ads ends up breaking your site, will people just assume that your site is down? I suspect they will. I would.

And really who would blame people for reacting to the web this way? Our own practices have set the stage for blockers to become enormously popular.

And I think reach is the greatest advantage of web technology. If we do our jobs well, our sites can reach folks who access the web under very different circumstances than many of us web designers do day to day.

Tim Kadlec recently built What Does My Site Cost .com. It’s a website that calculates the real cost of accessing any site on the web using the costs of the cheapest data plans around the world.

For example, an article on the Wired site weighs over 11.27 mb. For some people it costs almost $4 US dollars to visit that page! For many it's at least a dollar.

Lastly, the tool I’d recommend most is webpagetest.org. It's a website. You can enter a URL, choose a browser/device combination to test, and a region of the world to run the test from, and webpagetest will load your page from there and give you all sorts of information about how it loaded. It’s my favorite tool for development and testing.


“Mobile web performance is bad — I challenge you to find someone who disagrees with that,” Mic’s chief strategy officer Cory Haik told me.

I would vehemently disagree. Mobile web performance is not bad. Your obnoxiously bloated and clunky website is bad. You create a reader-hostile experience and then blame something else.

“When our pages load too slowly on mobile, as a publisher, we’re losing an audience, and that is painful. So we’ve been excited to build on AMP.”

Cuckoo time. The publisher creates a self-inflicted wound, blames something else, and then looks to another business that will create simple HTML pages, which the publisher could have done initially.


John Gruber had strong words about Apple news site iMore:

I love iMore. I think they’re the best staff covering Apple today, and their content is great. But count me in with Nick Heer — their website is shit-ass. Rene Ritchie’s response acknowledges the problem, but a web page like that — Rene’s 537-word all-text response — should not weigh 14 MB.

It’s not just the download size, long initial page load time, and the ads that cover valuable screen real estate as fixed elements. The fact that these JavaScript trackers hit the network for a full minute after the page has completely loaded is downright criminal.

Advertising should have minimal effect on page load times and device battery life. Advertising should be respectful of the user’s time, attention, and battery life. The industry has gluttonously gone the other way. iMore is not the exception — they’re the norm.

10+ MB page sizes, minute-long network access, third-party networks tracking you across unrelated websites — those things are all par for the course today, even when serving pages to mobile devices. Even on a site like iMore, staffed by good people who truly have deep respect for their readers.


I hate browsing the web on my phone. I do it all the time, of course — we all do. But man, the web browsers on phones are terrible. They are an abomination of bad user experience, poor performance, and overall disdain for the open web that kicked off the modern tech revolution.

Once again, the problem is not with the mobile web browsers. The problem is with the WEB SITES, which are an "abomination of bad user experience, poor performance, and overall disdain for the open web."

And TheVerge.com is one example of a web-abusive site. Its home page is horribly slow-loading thanks to way too many useless images and probably JavaScript. With JavaScript disabled, the site's home page loads significantly faster.

These bloated websites require users to have brand new computers with the latest, fastest CPUs.

More from this senseless article:

Mobile Safari on my iPhone 6 Plus is a slow, buggy, crashy affair, starved for the phone's paltry 1GB of memory and unable to rotate from portrait to landscape without suffering an emotional crisis.

I've never had problems remotely close to those in the two-plus years that I've been using my iPhone 5C.

The overall state of the mobile web is so bad that tech companies have convinced media companies to publish on alternative platforms designed for better performance on phones.

It's not because of poor mobile browsers and poor phone hardware. It's because of horribly designed websites by media orgs.

Near the top of the article, the author expressed a fleeting moment of common sense.

And yes, most commercial web pages are overstuffed with extremely complex ad tech, but it's a two-sided argument: we should expect browser vendors to look at the state of the web and push their browsers to perform better, just as we should expect web developers to look at browser performance and trim the fat. But right now, the conversation appears to be going in just one direction.

I infer that the author believes that it's the fault of web browsers for not loading horribly-bloated web pages faster.

Way down in that lengthy article, the writer finally states something obvious.

Now, I happen to work at a media company, and I happen to run a website that can be bloated and slow. Some of this is our fault: The Verge is ultra-complicated, we have huge images, and we serve ads from our own direct sales and a variety of programmatic networks. Our video player is annoying.

We could do a lot of things to make our site load faster, and we're doing them.

Finally, admitting, in a round-about, back-handed way, that it's the media company's fault. And I would say it's 100 percent the media company's fault.

Yet ...

But we can't fix the performance of Mobile Safari.

The writer or theverge.com should design that article page with bare-minimum HTML, and then load it as a static page and test the load speed on mobile Safari.

Add a meta tag with the viewport attribute to make the page read better on the phone. And then add a tiny CSS file with a little formatting and maybe a font-family load and a media query. But keep it focused on something useful.

And test that page load time.

And no JavaScript. Don't need it for a user who is only reading the page.
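
A tiny external stylesheet along those lines might be nothing more than this; the file name and values are made up for illustration:

    /* article.css: a little formatting, a font-family, and one media query */
    body { font-family: Georgia, serif; line-height: 1.5;
           max-width: 40em; margin: 1em auto; padding: 0 1em; }
    img  { max-width: 100%; height: auto; }
    @media (min-width: 50em) {
      body { font-size: 1.125em; }   /* slightly larger text on wide screens */
    }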

A commenter to that Verge article said:

Turn off Java Script, suddenly TheVerge is less crappier and loads faster. Go figure.

Another commenter correctly observed:

So problem is not the browser, but website itself. Web browsers are fine, the web itself is overbloated.

Yet another comment:

Here’s a fun solution: turn off JavaScript. Boom! Suddenly the Verge loads in less than a second.

Excerpts from another comment:

The browsers aren’t perfect, but the real problem is shitty web development. Huge bloated and slow javascript frameworks. CDNs that deliver content hanging for long periods of time. 13 trackers. Video ads. But sure, you go ahead and blame the browsers. Good grief!!! Web pages like TheVerge are huge and bloated and take many seconds to load. That is sad beyond belief.

Jul 21, 2015 tweet about TheVerge.com article:

The Verge article blaming browsers for a shitty mobile web is 6MB and has more than 1,000 javascript errors.

6 megabytes for a text article??

Let's view some stats for that 2015 TheVerge.com article.

What mattered most, HTML and CSS, equaled 6.5% of the bytes downloaded.

If the article contained images that helped the article, then that's fine, and obviously, helpful images will add to the download amount.

Nearly 70% of the download bytes were JavaScript and Flash. Flash in 2016?

But according to the author of that article, the main problems are the web in general and mobile web browsers.

What in the heck is going on with that TheVerge.com article or website in October 2016?

Another test:

The second view, fully loaded, was nearly as bad. Bizarre.


At least Vox Media, which owns TheVerge.com, understands the issue. Whether anything significant changes is a different matter. This was a May 2015 post, and based upon the October 2016 webpagetest.org results above for an article at TheVerge.com, things are actually getting worse.

Look, we know our sites aren’t as performant as they could be… I mean, let’s cut to the chase here... our sites are friggin’ slow, okay!


Facebook just announced a new feature they’re calling “Instant Articles”. It keeps the content within Facebook’s environment, which is one less reason for Facebook’s users to ever leave the app or site.

What I find interesting is the emphasis on speed. There are a few interesting interactive features, but speed is the selling point here.

I’m all for fast as a feature. It makes absolute sense. What concerns me, and I think many others based on reactions I’ve seen, is the fact that Facebook very clearly sees the web as too slow and feels that circumventing it is the best route forward.

Here’s the thing: they’re not entirely wrong. The web is too slow.

WRONG! Websites are slow, as the writer states in the next sentence.

The median SpeedIndex of the top 1000 websites (as tested on mobile devices) is now 8220 according to HTTP Archive data from the end of April. That’s an embarrassingly far cry from the golden standard of 1000.

And that’s happening in spite of all the improvements we’ve seen in the last few years. Better tooling. Better browsers. Better standards. Better awareness.

So why is this a problem? Is the web just inherently slow and destined to never be able to compete with the performance offered by a native platform? (Spoiler: No. No it is not.)

Circumventing the web is not a viable solution for most companies—it’s merely punting on the problem. The web continues to be the medium with the highest capacity for reach—it’s the medium that can get into all the little nooks and crannies of the world better than any other.


This seems like a slimy way to promote fast page-load speeds.

The result: The Post has reduced its “perceived completeness time” — which it defines as the time it takes a page to appear complete to readers — to 1.7 seconds — an 85 percent performance increase compared to the previous iteration of the page.

Unlike “load time,” which details how long it takes for every element on a page to load, perceived load time measures what a reader actually sees, making it a more useful metric, according to Franczyk.

I'm interested in fully-loaded time. I don't want to see crap still loading and adjusting on the page as I scroll down the article. I'm curious as to how many megabytes of cruft get downloaded. Perceived completeness?? Lame.


http://berkshirehathaway.com - This website appears to have been created in 1995, and its homepage look has not changed since. It's only the website for "an American multinational conglomerate holding company" that is currently controlled by Warren Buffett. Berkshire Hathaway started in 1839. 2015 financial data, according to its Wikipedia page:

The site uses no CSS and no JavaScript. The homepage is built with tables. The viewport meta tag is not included, which means that on a phone, a reader would have to zoom and scroll sideways to navigate the homepage. Even so, that is preferable to navigating bloated, clunky websites that use a tiny font size and prevent zooming to enlarge the text.

The people who read the info at the website are probably not reading it on their phones.

To me, one major drawback about the Berkshire Hathaway website is its heavy use of displaying content in PDF files. But the site must work well enough for interested parties that no changes are required.

Speed test results for its homepage: https://www.webpagetest.org/result/170222_W4_N9TD

berkshirehathaway.com

From: Dulles, VA - Chrome - Cable - 2/22/2017, 9:04:57 AM

First View Fully Loaded: Time = 0.611 seconds, Requests = 3, Bytes In = 7 KB, Cost = $

80% of the downloaded bytes were HTML, and 20% were for images.

The website's speed, light weight, and, by some accounts, its usefulness to interested people imply that the site is designed well. It won't win awards for aesthetics, but that's not the only definition of good design.

Quora - Why does the Berkshire Hathaway (company) website look so antiquated and bad?

Buffett almost certainly feels that spending money on a web designer would be a waste of shareholder dollars. After all, Berkshire is just a holding company. Berkshire's subsidiaries, such as GEICO, all have professional websites because they actually sell products to consumers. Buffett is notorious for running the Berkshire operation on a shoe string budget, out of a desire to protect shareholder interests, with only about 20 staff in Berkshire's corporate offices.

Another answer:

This question presumes that aesthetic appeal is important to every web site. Just like craigslist, the Berkshire Hathaway web site is simple, clear, and quite useful (but not pretty). A shorter answer might be found in Buffett's former license plate on his Town Car - "THRIFTY"

Good info about the site's useful design can be found in this post: http://stanfairbank.com/berkshire-hathaways-brilliant-website.

2012 - theatlantic.com - Berkshire Hathaway's Website Basically Hasn't Changed Since the Year 2000

... the company's website was built in the 1990s, and hasn't really entertained a redesign since. The biggest change to its interface came in 1999, when the design switched from a single bulleted list of 11 links to a two-column bulleted list with a teensy bit more white space around its 14 hotlinks.

As someone who built websites in the mid 1990s for a variety of realtors in southwest Washington State, this WEB page nearly brought me to tears. I can practically see the Geocities template it knocked off, and it made me wish life could be as simple as its structure.

Another fixture on the BH homepage is its footer, which I reproduce in full: "If you have any comments about our WEB page, you can either write us at the address shown above or e-mail us at berkshire@berkshirehathaway.com. However, due to the limited number of personnel in our corporate office, we are unable to provide a direct response." That was put into place in the year 2000 and hasn't changed by a single word.

It still says that in February 2017.

2015 - qz.com - Learn to code like it's the '90s with Berkshire Hathaway's normcore website

2013 tweet

Berkshire Hathaway website tells us the investment and management philosophy of the company, fundamentals.

2014 tweet

I visit the Berkshire Hathaway website every day to look up stuff and I'm grateful for how basic it is

Same person, another tweet

just appreciating simplicity given how many letters and reports I have to weed through to find something Buffett said.


#manifesto