A Hacker News thread from today pointed to a familiar post, created in 2015, that needs updating.
I hate browsing the modern web. It's like wading through a sewer looking for treasure.
A good observation from that HN thread:
What I've noticed is many new sites are using huge high-res images as some kind of maximum-common-denominator to ensure things look great on high dpi screens, while everyone else has to transfer all this junk only to downscale it anyways at render time.
For document-style websites, such as sites maintained by media orgs or even government agencies, most people now access (read) those sites on mobile devices. Actually, maybe most people today "access" info from those entities by following their social media silo presences, but that's a different rant.
If most people are reading websites on small screens, then why do web publishers pollute their homepages and article pages with giant images that contribute to the slowness of a website?
Embedding videos and social media silo crap also makes pages download more slowly. Embedding, embedding, embedding. What about using the good old web link?
Instead of embedding bloated image-related or social media crap on the homepage or in an article page, provide links to the bloat. Keep the main pages lightweight. Small images could be used with links to the larger versions if readers desire. Instead of embedding videos, provide links to the videos.
During natural disasters, internet connections can be disrupted, especially over cellular. Media and government websites may be too bloated for residents to access. These entities fail at informing the public. Bloated websites also drain mobile device batteries faster.
Today, most people in natural disaster areas probably rely on Facebook or Twitter to get info from local media and local government agencies, provided that those silo sites respond over spotty internet connectivity.
This was the top comment in that HN thread:
We had some storms and network outages yesterday. I was down to 3G, it seemed like dialup speeds. Hacker news faithfully loaded after 30 seconds (I think most of that time there was zero connection, so when a few bytes could fly through it loaded up).
I couldn't get any other site to load. Not even google.com.
It made me want to create a website called "lowbandwidthsites.com" or similar that is itself minimal and just lists the sites that you can load when there is low bandwidth for news etc.
A low bandwitdth read-only repeater for sites like reddit.com would be handy. Or maybe a general purpose low bandwidth repeater that can take any site - a bit like outline.com/mysite.com but even lower fidelity - just returns the markdown!
Anyway, that person's comment reminded me of this excellent post from February 2017.
From 2017 into 2019, I collected some of my live blog thoughts, related to the above HN comment about storms, in this post.
If local media and government orgs prefer to post all of their updates on silos, then maybe they should shut down their own websites. But if they prefer to maintain their own web presences, then they should lighten their websites to ensure that important info can be accessed at critical times when internet connection speeds and electrical power are subpar.