A single web page can be several megabytes in size once everything is downloaded to the user's device. Pounds of cruft can bog down older machines and create a clunky UI/UX.
Here are the March 7, 2016 webpagetest.org results for a Toledo Blade op-ed.
First View Fully Loaded:
- Time: 44.201s
- Requests: 952
- Bytes In: 5,018 KB
- Cost: $$$$$
A single web page makes 952 requests and ends up being 5 megabytes in size. The op-ed contained a few hundred words of text with no images.
In June 2016, the Toledo Blade introduced a new version of its website, and as of late July 2016, it appears that the Blade has trimmed its pages by more than 50 percent.
July 2016 webpagetest.org results for the same March 2016 op-ed.
From: Dulles, VA - Chrome - Cable 7/21/2016, 4:50:42 PM
First View Fully Loaded:
- Time: 17.279s
- Requests: 329
- Bytes In: 2,020 KB
- Cost: $$$$$
The page is still too large, but at least the Blade is trending in a positive direction. Just because our computers of all sizes have more memory and faster CPUs doesn't mean publishers should create sprawling web pages.
I host some of my websites at Digital Ocean, and I have total access to my own virtual server. For these sites, I use the Nginx web server.
Here's a simple HTML page of the same op-ed, stored on my Digital Ocean server (obvious copyright violation to make a point).
The actual file size of the HTML page is around 8 KB. Nginx compression shrinks the download size of the page to 5 KB.
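For reference, enabling that compression in Nginx takes only a few lines. This is a minimal sketch of the relevant directives, not my server's exact configuration:

```nginx
# Compress text responses before sending them to the browser.
# text/html is always compressed when gzip is on, so it isn't listed.
gzip on;
gzip_types text/css application/javascript text/plain;
gzip_min_length 256;   # skip tiny responses where compression gains nothing
gzip_comp_level 5;     # moderate CPU cost for most of the size savings
```

With something like this in place, the 8 KB op-ed page goes over the wire as roughly 5 KB.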
From: Dulles, VA - Chrome - Cable - 7/21/2016, 5:34:39 PM
First View Fully Loaded:
- Time: 0.557s
- Requests: 2
- Bytes In: 5 KB
- Cost: $
Some tests show the download time to be around 0.3 seconds. Even a heavily trafficked, properly configured website should be able to serve the text portions of simple, static HTML files in about a second.
Obviously, for-profit publishers want to monetize their websites, which means bombarding users with ads and trackers. It's a conundrum. How can publishers make money from their web content while providing a reader-friendly experience?
My answer is a paywall. Restaurants and many other businesses do not give away their creations for free. At Etsy, I could pay $15 to $30 for a crocheted beanie, but the Toledo Blade gives away its craft for free.
I feel bad for the writers, editors, and everyone else at newspaper orgs. Their service is needed at the local level. But bloated web design is indefensible.
I suppose I might be the only person willing to pay a hefty annual subscription fee for content displayed as simply as the HTML example that I created above. Photos and illustrations are still welcome. In fact, more images should be posted. But publishers should simplify the delivery container.
A fast, simple delivery mechanism does not improve bad writing. But good writing and important writing can be lost or ignored when the delivery mechanism is an abomination.
Even digital-only media sites that have formed in recent years and never created a print version are designing massively bloated websites.
Here's how to speed up an article page for browsing-only readers:
- cache the page
- create static HTML
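The second approach is what my Wren app does: render the article to a flat HTML file once, at publish time, so the web server only has to stream bytes. Here's a minimal sketch in Python; the template, function, and file names are illustrative, not Wren's actual code:

```python
import html
from pathlib import Path

# Illustrative page shell: a viewport meta tag and a smidgen of CSS.
TEMPLATE = """<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1">
<title>{title}</title>
<style>body {{ max-width: 40em; margin: 1em auto; padding: 0 1em; }}</style>
</head>
<body>
<h1>{title}</h1>
{body}
</body>
</html>
"""

def publish(title: str, paragraphs: list[str], out_dir: Path) -> Path:
    """Render an article to a static HTML file, once, at publish time."""
    body = "\n".join(f"<p>{html.escape(p)}</p>" for p in paragraphs)
    page = TEMPLATE.format(title=html.escape(title), body=body)
    out = out_dir / "article.html"
    out.write_text(page, encoding="utf-8")
    return out
```

After that, serving the article is just the web server reading a file; no database, no scripts, no per-request work.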
But I don't expect publishers to improve their reader-hostile web designs. The only change will be that the websites get worse.
Thanks to poorly designed, slow websites, we will have more services like Facebook's Instant Articles and Google's Accelerated Mobile Pages, especially since most people now read on their phones.
Some day, it may be pointless for news orgs to have their own websites because they will publish their content on other platforms. It's not Facebook's fault. It's the fault of the publishers. And we can't blame the content management systems. Humans make the decisions.
Other people have plenty to say on the subject.
- twitter.com/Pinboard - tweet
- mondaynote.com - News Sites Are Fatter and Slower Than Ever
An analysis of download times highlights how poorly designed news sites are. That’s more evidence of poor implementation of ads… and a strong case for ad blockers.
Website designers live in a bubble; they're increasingly disconnected from users.
Today, a news site web page consists of a pile of scripts and requests to multiple hosts, in which relevant content makes up only an insignificant proportion of the freight.
Consider the following observations: When I click on a New York Times article page, it takes about 4 minutes to download 2 megabytes of data through… 192 requests, some to Times’ hosts, most to a flurry of other servers hosting scores of scripts. Granted: the most useful part — the 1,700-word / 10,300-character article plus pictures — will load in less than five seconds.
But when I go to Wikipedia, a 1,900-word story will load in 983 milliseconds, requiring only 168 kilobytes of data through 28 requests.
Unfortunately, "only" 2 megabytes of crud and "only" 192 requests is now considered good for a single web page by today's standards.
- mattgemmell.com - The reader-hostile web
I’ve said previously that not wasting readers’ time is one of the primary respect metrics for a site, along with quality and legibility of content. You want each page to be a few hundred kilobytes at most, in total, and it should load in half a second. That’s the goal. I’m constantly tweaking things to make this site as fast as possible. People notice, and they appreciate it.
You can’t always make it small, but the size should come from content, not cruft. You might not have the freedom or the expertise to make it fast, but the delay should be from latency, not the transfer and rendering of resources. What you can do is get rid of all the stuff you don’t need, and put the words first.
- businessinsider.com - Ads on news sites gobble up as much as 79% of users' mobile data
One of the reasons consumers download mobile ad blockers is the impact ads have on their data plans. A report released Wednesday from Enders Analysis appears to back up that claim — at least when it comes to a sample of news websites.
I submitted the above BI article to WebPageTest.org.
Results for First View, Fully Loaded:
- From: Dulles, VA - Chrome - Cable
- 7/21/2016, 2:58:49 PM
- Time = 17.922 seconds
- Requests = 298
- Bytes In = 3,463 KB
Google’s ongoing project to speed up the web with its Accelerated Mobile Pages has focused on how it can speed up page loads for publishers.
The Catch-22 with digital ads is that as they’ve become richer over the years, they’ve also got heavier and, therefore, slowed page loads — a factor that’s had a fundamental role in the rise of ad blocking. Google cites research showing the average mobile site takes 19 seconds to load.
Web page bloat won't be reduced until people understand the cause of the problem. Note what the writer said about Google's AMP project, "speed up the web." That's incorrect.
The horribly slow page load time is not the fault of the mobile web infrastructure, mobile web browsers, nor WiFi and cell connections. It's 100 percent the fault of publishers.
The web is fine. Google is trying to speed up the page load time of web pages, hosted by bloated websites. Google is trying to do the work that publishers should be doing: simplifying a web page. Google is not speeding up the web. Google is trying to improve a reader's user experience because publishers cannot or will not do this.
Designer News comment:
Main reason it is faster than most news sites, beyond all this excellent work: they're not beholden to 3rd-party advertising tech.
- opera.com - New ad blocker - Built into the Opera browser
If there were no bloated ads, some top websites would load up to 90% faster.
Proper terminology was used: websites would load faster, not the web. Websites are the problem, not the web.
- digiday.com - 5 ways publishers' tech choices come back to haunt them
The following admission came from a guest writer who works for an ad company. Will media people comprehend the article?
Many of these tech “solutions” don’t add real value and instead clutter up the user experience, slow down pages, and drive people away. So we – the ad tech community – need to create products that put the user experience first.
I’m not the first to observe that there are way too many pieces of executable code that run on publishers’ pages. Unsurprisingly, these scripts periodically crashed the sites on which they appeared.
... we commonly see more than 500 servers called from a single publisher page. Most [page scripts] do nothing that benefits the publisher. On the contrary, they just leak data at best, and at worst they reduce the consumer’s privacy and experience.
Furthermore, publishers may unwittingly be violating their visitors’ privacy by allowing scripts to grab or use data without consent.
Page scripts that cause latencies lead to delays in delivering content, turning impatient users away. The longer the load times, the steeper the “user decay” curve, a term we use to describe the pattern of page abandonment caused by latency.
Most publishers aren’t intentionally overloading a page with scripts to create a horrid user experience, goading their precious audiences into leaving or spurring them to install ad blockers.
When publishers can learn what’s slipping onto their pages and keep those pages working smoothly, they’ll have the control to give users the experience they deserve.
- adactio.com - Writing Less Damn Code
I’m in complete agreement with Heydon here:
But it turns out the only surefire way to make performant Web Stuff is also to just write less. Minify? Okay. Compress? Well, yeah. Cache? Sounds technical. Flat out refuse to code something or include someone else’s code in the first place? Now you’re talking.
... if you demand that everything must justify its existence, you end up with a better experience for everyone.
- medium.com/@carsol - Why you suck, not the mobile web
The botching of the mobile web experience isn’t the phone browser, it’s the web developer. Developers need to stop being lazy.
Am I in the minority that I don't want an app for every. bloody. webpage. I visit?
- idlewords.com - October 2015 - Website Obesity
Let me start by saying that beautiful websites come in all sizes and page weights. I love big websites packed with images. I love high-resolution video.
Here’s an article on GigaOm from 2012 titled "The Growing Epidemic of Page Bloat". It warns that the average web page is over a megabyte in size. The article itself is 1.8 megabytes long.
Here's an almost identical article from the same website two years later, called “The Overweight Web". This article warns that average page size is approaching 2 megabytes. That article is 3 megabytes long.
... consider this 400-word-long Medium article on bloat, which includes the sentence: "Teams that don’t understand who they’re building for, and why, are prone to make bloated products." The Medium team has somehow made this nugget of thought require 1.2 megabytes.
That's longer than Crime and Punishment, Dostoyevsky’s psychological thriller about an impoverished student who fills his head with thoughts of Napoleon and talks himself into murdering an elderly money lender.
- backchannel.com - Google Is Going to Speed Up the Web. Is This Good?
“Sluggish” is too tame a word for what we endure now, due to an accumulation of terrible news-industry design and business choices in recent years.
Before getting into details about what’s happening here, let’s be clear on something. AMP wouldn’t be necessary — assuming it is in the first place — if the news industry hadn’t so thoroughly poisoned its own nest.
Looking for money in a business that grows more financially troubled by the month, media companies have infested articles with garbage code, much of it on behalf of advertising/surveillance companies, to the extent that readers have quite reasonably rebelled.
We don’t like slow-responding sites, period. On our mobile devices, which are taking over as the way we “consume” information, we despise having to download megabytes of crapware just to read something, because the carriers charge us for the privilege.
That’s one reason why we use ad blockers. (The other, at least for me, is that we despise being spied on so relentlessly.)
The news business could have solved this problem without racing into the arms of giant, centralized tech companies. But it didn’t, and here we are.
What if news sites had just done the right thing in the first place? Or, since they didn’t, what if they just resolved to build faster pages — using standard HTML markup and loading components in a non-annoying way — now?
- baldurbjarnason.com - Facebook and the media: united, they attack the web
The web doesn’t suck. Your websites suck. All of your websites suck.
You balloon your websites with megabytes of cruft. You ignore best practices. You take something that works and is complementary to your business and turn it into a liability.
The lousy performance of your websites becomes a defensive moat around Facebook. The web is capable of impressive performance and offers a dramatically vaster reach and accessibility than anything an app can offer.
- daringfireball.net - The New York Times on Apple's Dominant Position in Podcasting
I understand what Gruber was saying, but I would qualify it more.
Another HN comment well down in the thread:
Sure, some websites will break, but most will work well enough to be able to read articles.
A bit sad that you have to do this to get at decent experience, but what can you do...
Web pages also load fast in the text-based browser Links.
Speaking of text-based access to websites ...
Here are a few HN comments with my emphasis bolded and additions contained within brackets:
"If it doesn't load through curl, it's broken." --someone

So, so true. Thanks, curl.
Someone else pointed out that it'd be nice if browsers offered more support for things that certain types of developers clearly want to do. I completely agree; it'd definitely be nice to take advantage of many of the technologies which currently exist to do more, in a more structured way. But requiring code execution in order to read data [plain text] is madness.
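The curl test is easy to run yourself. A small shell sketch (the function name is mine, and the example URL is a placeholder):

```shell
# "Does it load through curl?" -- measure what a no-script reader gets.
# -s: no progress meter; -L: follow redirects; -o: discard the body.
page_weight() {
  curl -sL -o /dev/null -w '%{size_download}' "$1"
}

# Example usage (any article URL works here):
# page_weight https://example.com/
```

If the number that comes back is megabytes for a few hundred words of text, the page is broken in the sense the commenter means.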
- 500ish.com - Drinking the Web Through A Straw
It was such a revelation. I wanted to view all web-based content this way. Not just on mobile, everywhere.
From the Wired.com article:
THERE’S ANOTHER WEB out there, a better web hiding just below the surface of the one we surf from our phones and tablets and laptops every day. A web with no ads, no endlessly scrolling pages, and no annoying modal windows begging you to share the site on social media or sign up for a newsletter.
Pages loaded nearly instantly, my laptop battery lasted longer, and I could browse the web with fewer distractions—all without the sense of guilt that comes with using an ad blocker.
- medium.com/@pete - How many floppy disks do you need to fit an article from The Atlantic?
In case your experience with computers started after 2010, this: [image] … is a 3.5 inch floppy disk. Standard capacity: 1.44MB.
Today, I tried to read this (really quite good) article from The Atlantic on my iPhone 5S. First time around I tried in a webview. Next attempt was in Safari. It crashed both of them.
Digging for Dinosaurs in My Twenties is a little over 6,200 words and has four images. Saved in rich text format, the words come to 37KB of data, the four images total 346KB (exactly the same images are served to my desktop browser as are to my phone).
Given all the information so far, guess how many floppy disks it would take to fit an article from The Atlantic?
Answer: at least 15
“No, no,” I hear you grumble, “37 plus 346 equals 383KB which is a little over one quarter of the 1.44MB capacity of a floppy disk.”
Well, of course. Unless you include all the other crap that comes with the article.
By the time Safari had crashed, I’d logged 21MB of ads, pixels, and associated scripts that had been downloaded onto my phone. If the main idea was to heat my phone so my hand could stay warm against the San Francisco fall, nice job everybody!
If, on the other hand, the idea was that I could read the article, without scrolling being deathly sluggish, and maybe actually make it to the end before it crashed the browser. Yeah, you failed.
... seriously… does anybody at The Atlantic read their own content on their own site? On a phone… just the same as an ever-increasing number of users consume content? Or were they too busy checking it in Instant Articles and Apple News to take a look at their own mobile view?
Just showed some of these to a generally non-tech-savvy friend who said he didn't like them because they looked "too 90s." Personally I love them because they load fast, are easy to read, and don't require knowledge of a bunch of different frameworks to write.
Another HN comment:
I have been fighting for years to get people used to "90s aesthetics."
PS: I'm not sure I would classify these sites as brutalist; perhaps 'utilitarian' or 'functional' would be better descriptors.
You can make something that doesn't require tons of frameworks and loads fast while NOT looking like a relic of the days of Kazaa. The fact that so many developers are too lazy to do so does not mean we should throw the baby out with the bathwater and go back to times new roman black-on-white.
thin.npr.org is a better web design than most slick-looking media websites today. Okay, it lacks the viewport meta tag needed to display nicely on a phone, but at least a reader can read the site in landscape mode and zoom in to enlarge the text. The browser back button works properly with the site, and links are underlined.
webpagetest.org results for thin.npr.org article FBI Investigates Possible Russian Connection To Leaked DNC Emails - From: Dulles, VA - Chrome - Cable - 7/26/2016, 4:06:41 PM - First View Fully Loaded:
- 2 requests
- Bytes In = 4 KB
- Cost = $
Obviously, the web works well. This news story would load fast over a slow internet connection. If the article included a viewport meta tag near the top of the HTML page, along with a smidgen of CSS, the page would display better on a phone. But at least a phone user can zoom into the article and/or read it in landscape mode.
- mobiforge.com - April 2016 - The web is Doom
- Hacker News - discussion
- wired.com - The Average Webpage Is Now the Size of the Original Doom
From the mobiforge.com article:
In July 2015, inspired by something Mat Marquis said at TXJS 2015, I suggested that the average web page weight would equal that of the Doom install image in about 7 months time.
Well, we’ve made it, albeit a bit later than expected.
Recall that Doom is a multi-level first person shooter that ships with an advanced 3D rendering engine and multiple levels, each comprised of maps, sprites and sound effects. By comparison, 2016’s web struggles to deliver a page of web content in the same size. If that doesn’t give you pause you’re missing something.
From the Wired.com article:
Today the average webpage is about the same size, data-wise, as the classic computer game Doom, according to software engineer Ronan Cremin.
A compressed copy of the installer for the shareware version of Doom takes up about 2.39MB of space. Today’s average webpage, meanwhile, requires users to download about 2.3MB worth of data, according to HTTP Archive, a site that tracks website performance and the technologies they use.
So how did we get here? As internet connections have gotten faster, publishers and developers worry less about efficiency.
I can almost hear the Hacker News comments now, about what a luddite I am for not thinking five paragraphs of static text need to be infested with a thousand lines of script. Well, let me say proactively: fuck all y’all.
I think the Web is great, I think interactive dynamic stuff is great, and I think the progress we’ve made in the last decade is great. I also think it’s great that the Web is and always has been inherently customizable by users, and that I can use an extension that lets me decide ahead of time what an arbitrary site can run on my computer.
The web is not a video game console; act accordingly. Keep your stuff modular. Design proactively around likely or common customizations. Maybe scale it down a bit once you hit 40MB of loaded script per page.
Sometimes, it's necessary for a web site to function like a native app, but I don't think that includes media websites where the user is only reading the content.
For text-heavy websites, publishers should build websites instead of website-based imitations of native apps. If publishers want native-app functionality, then they should build a native app, and quit trying to make websites act like one.
When I log into sites that provide functions such as:
- tax preparation
- project management
- online survey building
then app-like behavior is understandable, because I'm using the site as a tool and not simply reading it.
- medium.com/@wob - The sad state of web development
The author expressed strong opinions on many web development areas, including Single Page Applications and React.js.
"Really all I’m saying is don’t build a SPA. A SPA will lock you into a framework that has the shelf life of a hamster dump. When you think you need a SPA, just stop thinking."
I have the Chrome plugin that shows when a site uses React. I swear every other website I visit uses React, for the stupidest stuff. So many content sites use React. It’s pitiful.
Good job Yahoo, you rewrote your shitty mail client in React. Your customers didn’t give a shit. They just want it to work. My poor wife with her shitty Chromebook. She can play Crysis at 60fps, but fuck if she can read her email on Yahoo.
I rarely use my Yahoo email account, and I agree, it's a garbage web site. I started using Yahoo email in 1999, and I would bet that their '99 version would be more useful than their 2016 version. But I don't blame React.js for Yahoo's poor email app design.
- joneisen.me - Great state of web development
The world wants Single Page Apps (SPAs), meaning we have to move huge amounts of logic from the server to the browser. We’ve been doing this for years, but in 2015, we’ve found better ways to build these large sprawling front end apps.
Eewww. Maybe the world wants native apps. Why not simply build native apps?
Are these SPAs used for internal web apps at companies to perform tasks by logged-in users? If so, then okey-dokey.
... we’ve found better ways to build these large sprawling front end apps.
Great. Saddle users' devices with large, sprawling front-end apps. If these piles of steaming poop are used to display text-based content to non-logged-in users, then why build them?
If the user-experience is improved, then the SPA is a success. If the user-experience is diminished by a bloated, sluggish, clunky web site, then the SPA is a massive failure. Return to 1995 web development and then progressively-enhance with a light touch.
- baldurbjarnason.com - Facebook and the media: united, they attack the web
You balloon your websites with megabytes of cruft. You ignore best practices. You take something that works and is complementary to your business and turn it into a liability. The lousy performance of your websites becomes a defensive moat around Facebook.
If your web developers are telling you that a website delivering hypertext and images can’t be just as fast as a native app (albeit behaving in different ways) then you should fire them.
Peter-Paul Koch, web browser tester extraordinaire, picks up on the phrase that I highlighted in the John Gruber quote and runs with it.
The web definitely has a speed problem due to over-design and the junkyard of tools people feel they have to include on every single web page. However, I don’t agree that the web has an inherent slowness. The articles for the new Facebook feature will be sent over exactly the same connection as web pages. However, the web versions of the articles have an extra layer of cruft attached to them, and that’s what makes the web slow to load. The speed problem is not inherent to the web; it’s a consequence of what passes for modern web development. Remove the cruft and we can compete again.

— Peter-Paul Koch, "Tools don’t solve the web’s problems, they ARE the problem"
We continue to have this problem because your web developers are treating the web like an app platform when your very business hinges on it being a quick, lightweight media platform with a worldwide reach.
The web is capable of impressive performance and offers a dramatically vaster reach and accessibility than anything an app can offer.
If you have an engineering mind and care about such things - you care about complexity. Even if you don't - user experience matters to everyone.
Have you ever seen something completely insane while everyone around you doesn't seem to recognize how awful it really is? That is the web of today. 60-80 requests? 1MB+ single pages?
Man, the olden days of only 1 MB web pages.

More from the HN commenter:
Your functionality, I don't care if its Facebook - does not need that much. It is not necessary. When broadband came on the scene, everyone started to ignore it, just like GBs of memory made people forget about conservation.
The fact that there isn't a daily drumbeat about how bloated, how needlessly complex, how ridiculous most of the world's web applications of today really are - baffles me.
My Grebe, Scaup, and Veery web publishing apps cache pages and homepages in Memcached. If a page is not cached, it's pulled from the MySQL or CouchDB database and then cached. Most of the time, browsing-only users receive the cached page.
My Wren publishing app does not use a database. It creates static HTML pages. I'm using Wren to create this and the other related pages.
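The read path those apps use is the standard cache-aside pattern. A minimal sketch in Python, with a plain dict standing in for Memcached and a stub function standing in for the MySQL/CouchDB query (both are illustrative, not Grebe's actual code):

```python
import time

CACHE: dict[str, tuple[str, float]] = {}   # stand-in for Memcached
CACHE_TTL = 300                            # seconds a cached page stays fresh

def render_from_database(page_id: str) -> str:
    """Stand-in for the database query plus the markup-to-HTML step."""
    return f"<html><body><p>article {page_id}</p></body></html>"

def get_page(page_id: str) -> str:
    """Cache-aside read: serve from cache; on a miss, build, cache, return."""
    entry = CACHE.get(page_id)
    if entry is not None:
        page, expires = entry
        if time.monotonic() < expires:
            return page                    # cache hit: no database touched
    page = render_from_database(page_id)   # cache miss: hit the database once
    CACHE[page_id] = (page, time.monotonic() + CACHE_TTL)
    return page
```

Browsing-only readers almost always land on the first branch, so the database is touched only once per page per TTL window.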
“We want to make it really easy for publishers of all shapes and sizes to publish AMP-formatted pages, from the New York Post all the way down to people running their own personal blogs,” said Paul Maiorana, vice president of platform services at WordPress.com parent company Automattic.
Why not create simple, fast-loading, reader-friendly pages by default?
Under the hood, AMP works by simplifying and streamlining the HTML code that powers Web pages to prioritize speed. Google also “caches” pages, or saves copies of them on its own systems, in order to deliver them quicker when users access them. It’s an open-source initiative, meaning anyone is free to use it.
Again, content producers on their own can create simple HTML pages, and they can cache their own content for faster access, thus creating a reader-friendly experience.
- filamentgroup.com - Delivering Responsibly
... page weight does matter. Access can be slow, expensive and prohibitive.
People have come to learn that web fonts take a long time to load. Boom! Just disable them. And along with custom fonts, there go the icon fonts too, so we need to be thinking about fallback images and text for font icons. Or better yet, use SVG instead.
And really who would blame people for reacting to the web this way? Our own practices have set the stage for blockers to become enormously popular.
And I think reach is the greatest advantage of web technology. If we do our jobs well, our sites can reach folks who access the web under very different circumstances than many of us web designers do day to day.
Tim Kadlec recently built What Does My Site Cost .com. It’s a website that calculates the real cost of accessing any site on the web using the costs of the cheapest data plans around the world.
For example, an article on the Wired site weighs over 11.27 mb. For some people it costs almost $4 US dollars to visit that page! For many it's at least a dollar.
Lastly, the tool I’d recommend most is webpagetest.org. It's a website. You can enter a URL, choose a browser/device combination to test, and a region of the world to run the test from, and webpagetest will load your page from there and give you all sorts of information about how it loaded. It’s my favorite tool for development and testing.
“Mobile web performance is bad — I challenge you find someone who disagrees with that,” Mic’s chief strategy officer Cory Haik told me.
I would vehemently disagree. The mobile web performance is not bad. Your obnoxiously bloated and clunky website is bad. You create a reader-hostile experience and then blame something else.
“When our pages load too slowly on mobile, as a publisher, we’re losing an audience, and that is painful. So we’ve been excited to build on AMP.”
Cuckoo time. The publisher creates a self-inflicted wound, blames something else, and then looks to another business that will create simple HTML pages, which the publisher could have done initially.
- stratechery.com - Why web pages suck, which referenced a John Gruber post and the corresponding Hacker News discussion
John Gruber had strong words about Apple news site iMore:
I love iMore. I think they’re the best staff covering Apple today, and their content is great. But count me in with Nick Heer — their website is shit-ass. Rene Ritchie’s response acknowledges the problem, but a web page like that — Rene’s 537-word all-text response — should not weigh 14 MB.
Advertising should have minimal effect on page load times and device battery life. Advertising should be respectful of the user’s time, attention, and battery life. The industry has gluttonously gone the other way. iMore is not the exception — they’re the norm.
10+ MB page sizes, minute-long network access, third-party networks tracking you across unrelated websites — those things are all par for the course today, even when serving pages to mobile devices. Even on a site like iMore, staffed by good people who truly have deep respect for their readers.
- theverge.com - The mobile web sucks - This was the dumbest thing that I read in 2015. It's incredibly ignorant.
I hate browsing the web on my phone. I do it all the time, of course — we all do. But man, the web browsers on phones are terrible. They are an abomination of bad user experience, poor performance, and overall disdain for the open web that kicked off the modern tech revolution.
Once again, the problem is not with the mobile web browsers. The problem is with the WEB SITES, which are an "abomination of bad user experience, poor performance, and overall disdain for the open web."
These bloated websites require users to have brand new computers with the latest, fastest CPUs.
More from this senseless article:
Mobile Safari on my iPhone 6 Plus is a slow, buggy, crashy affair, starved for the phone's paltry 1GB of memory and unable to rotate from portrait to landscape without suffering an emotional crisis.
I've never had remotely close to those problems in the two-plus years that I've been using my iPhone 5C.
The overall state of the mobile web is so bad that tech companies have convinced media companies to publish on alternative platforms designed for better performance on phones.
It's not because of poor mobile browsers and poor phone hardware. It's because of horribly designed websites by media orgs.
Near the top of the article, the author expressed a fleeting moment of common sense.
And yes, most commercial web pages are overstuffed with extremely complex ad tech, but it's a two-sided argument: we should expect browser vendors to look at the state of the web and push their browsers to perform better, just as we should expect web developers to look at browser performance and trim the fat. But right now, the conversation appears to be going in just one direction.
I infer that the author believes that it's the fault of web browsers for not loading horribly-bloated web pages faster.
Way down in that lengthy article, the writer finally states something obvious.
Now, I happen to work at a media company, and I happen to run a website that can be bloated and slow. Some of this is our fault: The Verge is ultra-complicated, we have huge images, and we serve ads from our own direct sales and a variety of programmatic networks. Our video player is annoying.
We could do a lot of things to make our site load faster, and we're doing them.
Finally, an admission, in a roundabout, backhanded way, that it's the media company's fault. And I would say it's 100 percent the media company's fault.
But we can't fix the performance of Mobile Safari.
The writer or theverge.com should build that article page with bare-minimum HTML, serve it as a static page, and test its load speed in mobile Safari.
Add a meta tag with the viewport attribute to make the page read better on the phone. Then add a tiny CSS file with a little formatting, maybe a font-family declaration, and a media query. But keep it focused on something useful.
And test that page load time.
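The steps above can be sketched as a single bare-bones page. This is only an illustration, assuming nothing about theverge.com's actual markup; the title, text, and styling choices are placeholders:

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <!-- The viewport meta tag makes the text readable on phones. -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <title>Article title</title>
  <style>
    /* A little formatting and a font-family declaration. */
    body { font-family: Georgia, serif; margin: 1em auto; max-width: 40em; padding: 0 1em; }
    /* One media query: bump the font size on small screens. */
    @media (max-width: 600px) { body { font-size: 1.1em; } }
  </style>
</head>
<body>
  <h1>Article title</h1>
  <p>Article text goes here...</p>
</body>
</html>
```

A page like that is two requests at most (HTML plus favicon) and a handful of kilobytes, so any modern phone browser should load it in well under a second.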
A commenter to that Verge article said:
Turn off Java Script, suddenly TheVerge is less crappier and loads faster. Go figure.
Another commenter correctly observed:
So problem is not the browser, but website itself. Web browsers are fine, the web itself is overbloated.
Jul 21, 2015 tweet about TheVerge.com article:
6 megabytes for a text article??
Let's view some stats for that 2015 TheVerge.com article.
- webpagetest.org results
- theverge.com : The mobile web sucks
- From: Dulles, VA - Chrome - Cable - 9/13/2016, 10:43:55 AM
- First View - Fully Loaded:
- 119.658 seconds (two minutes!!!)
- 382 requests (what is being downloaded?)
- 5,061 KB (5 megs!!!)
- 15.5% of the bytes downloaded were image related
- 32.5% of the bytes downloaded were Flash (1.52 megabytes!?)
- 3.4% of the bytes downloaded were HTML (159 KB)
- 3.1% of the bytes downloaded were CSS (147 KB)
What mattered most, HTML and CSS, equaled 6.5% of the bytes downloaded.
If the article contained images that supported the text, that's fine; helpful images will obviously add to the download size.
But according to the author of that article, the main problems are the web in general and mobile web browsers.
What in the heck is going on with that TheVerge.com article or website in October 2016?
- webpagetest.org results
- The mobile web sucks
- From: Dulles, VA - Chrome - Cable - 10/27/2016, 12:00:33 PM
- First View - Fully Loaded:
- 48.287 seconds
- 381 requests
- 16,883 KB (nearly 17 megabytes!!!)
- Second View - Fully Loaded is usually significantly smaller and faster:
- 44.152 seconds
- 295 requests
- 14,578 KB (!!!)
- webpagetest.org results
- The mobile web sucks
- From: Lincoln, Nebraska USA - Chrome - Cable - 10/27/2016, 11:25:00 AM
- First View - Fully Loaded:
- 46.461 seconds
- 382 requests
- 16,899 KB
The second view, fully loaded, was nearly as bad. Bizarre.
- product.voxmedia.com - Declaring performance bankruptcy
At least Vox Media, which owns TheVerge.com, understands the issue. Whether anything significant changes is a different matter. That was a May 2015 post, and based upon the October 2016 webpagetest.org results above for an article at TheVerge.com, things actually got worse.
Look, we know our sites aren’t as performant as they could be… I mean, let’s cut to the chase here... our sites are friggin’ slow, okay!
- timkadlec.com - Choosing Performance
Facebook just announced a new feature they’re calling “Instant Articles”. It keeps the content within Facebook’s environment, which is one less reason for Facebook’s users to ever leave the app or site.
What I find interesting is the emphasis on speed. There are a few interesting interactive features, but speed is the selling point here.
I’m all for fast as a feature. It makes absolute sense. What concerns me, and I think many others based on reactions I’ve seen, is the fact that Facebook very clearly sees the web as too slow and feels that circumventing it is the best route forward.
Here’s the thing: they’re not entirely wrong. The web is too slow.
WRONG! The web isn't too slow; websites are too slow, as the writer's own next sentence shows.
The median SpeedIndex of the top 1000 websites (as tested on mobile devices) is now 8220 according to HTTP Archive data from the end of April. That’s an embarrassingly far cry from the golden standard of 1000.
And that’s happening in spite of all the improvements we’ve seen in the last few years. Better tooling. Better browsers. Better standards. Better awareness
So why is this a problem? Is the web just inherently slow and destined to never be able to compete with the performance offered by a native platform? (Spoiler: No. No it is not.)
Circumventing the web is not a viable solution for most companies—it’s merely punting on the problem. The web continues to be the medium with the highest capacity for reach—it’s the medium that can get into all the little nooks and crannies of the world better than any other.
This seems like a slimy way to promote fast page-load speeds.
The result: The Post has reduced its “perceived completeness time” — which it defines as the time it takes a page to appear complete to readers — to 1.7 seconds — an 85 percent performance increase compared to the previous iteration of the page.
Unlike “load time,” which details how long it takes for every element on a page to load, perceived load time measures what a reader actually sees, making it a more useful metric, according to Franczyk.
I'm interested in fully-loaded time. I don't want to see crap still loading and adjusting on the page as I scroll down the article. I'm curious as to how many megabytes of cruft get downloaded. Perceived completeness?? Lame.
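For anyone who shares that interest in fully-loaded time, a browser can report it directly via the Navigation Timing API. A minimal sketch, browser-only, pasted at the end of any page (the exact number depends entirely on the page being measured):

```html
<script>
// Log the fully-loaded time: navigation start to the end of the load event.
window.addEventListener("load", function () {
  // loadEventEnd is only filled in after the load handler returns,
  // so read it one tick later.
  setTimeout(function () {
    var t = performance.timing;
    console.log("Fully loaded in " + (t.loadEventEnd - t.navigationStart) + " ms");
  }, 0);
});
</script>
```

That number covers every element on the page, not just what a reader "perceives" as complete.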
http://berkshirehathaway.com - This website appears to have been created in 1995, and its homepage look has not changed since. It's only the website for "an American multinational conglomerate holding company" that is currently controlled by Warren Buffett. Berkshire Hathaway started in 1839. 2015 financial data, according to its Wikipedia page:
- Revenue: $210.82 billion
- Net Income: $24.08 billion
- Total Assets: $552.25 billion
The people who read the info at the website are probably not reading it on their phones.
To me, one major drawback about the Berkshire Hathaway website is its heavy use of displaying content in PDF files. But the site must work well enough for interested parties that no changes are required.
Speed test results on its homepage:
- webpagetest.org results: https://www.webpagetest.org/result/170222_W4_N9TD
- berkshirehathaway.com
- From: Dulles, VA - Chrome - Cable - 2/22/2017, 9:04:57 AM
- First View - Fully Loaded:
- 0.611 seconds
- 3 requests
- 7 KB
- Cost: $
80% of the downloaded bytes were HTML, and 20% were for images.
The website's speed, light weight, and apparent usefulness to interested people imply that the site is designed well. It won't win awards for aesthetics, but that's not the only definition of good design.
Buffett almost certainly feels that spending money on a web designer would be a waste of shareholder dollars. After all, Berkshire is just a holding company. Berkshire's subsidiaries, such as GEICO, all have professional websites because they actually sell products to consumers. Buffett is notorious for running the Berkshire operation on a shoe string budget, out of a desire to protect shareholder interests, with only about 20 staff in Berkshire's corporate offices.
This question presumes that aesthetic appeal is important to every web site. Just like craigslist, the Berkshire Hathaway web site is simple, clear, and quite useful (but not pretty). A shorter answer might be found in Buffett's former license plate on his Town Car - "THRIFTY"
Good info about the site's useful design can be found in this post: http://stanfairbank.com/berkshire-hathaways-brilliant-website.
2012 - theatlantic.com - Berkshire Hathaway's Website Basically Hasn't Changed Since the Year 2000
... the company's website was built in the 1990s, and hasn't really entertained a redesign since. The biggest change to its interface came in 1999, when the design switched from a single bulleted list of 11 links to a two-column bulleted list with a teensy bit more white space around its 14 hotlinks.
As someone who built websites in the mid 1990s for a variety of realtors in southwest Washington State, this WEB page nearly brought me to tears. I can practically see the Geocities template it knocked off, and it made me wish life could be as simple as its ...
Another fixture on the BH homepage is its footer, which I reproduce in full: "If you have any comments about our WEB page, you can either write us at the address shown above or e-mail us at email@example.com. However, due to the limited number of personnel in our corporate office, we are unable to provide a direct response." That was put into place in the year 2000 and hasn't changed by a single word.
It still says that in February 2017.
Berkshire Hathaway website tells us the investment and management philosophy of the company, fundamentals.
I visit the Berkshire Hathaway website every day to look up stuff and I'm grateful for how basic it is
Same person, another tweet:
just appreciating simplicity given how many letters and reports I have to weed through to find something Buffett said.
- webpagetest.org results for this page at sawv
- From: Dulles, VA - Chrome - Cable - 10/27/2016, 3:13:37 PM
- First View - Fully Loaded:
- 0.513 seconds
- 2 requests
- 25 KB
- Cost: $
- 100% of the downloaded bytes were HTML