assembled in 2016-2017
- Manifesto for Lightweight Web Pages
- Design By Writing
- Create a Comfortable Reading Experience
- Battle Web Page Bloat
Most weeks, I find new and old web posts related to this subject. These pages are mainly a dumping ground for links, excerpts, and quick thoughts.
- Web Reading UX Etcetera
- Web Reading UX Etcetera Part 2
- Web Reading UX Etcetera Part 3
- Web Reading UX Etcetera Part 4
- Web Reading UX Etcetera Part 5
- Web Reading UX Etcetera Part 6
storing links for page number 7 ...
Aug 16, 2017 https://news.ycombinator.com/item?id=15027715
Most content websites have become such a massive crapfest of ad-bloat, bad UX, huge page sizes and general usability hell that it's nigh impossible that I'd be able to reach the actual content of a non AMP site in the first 5-10 seconds of clicking on its link. (On my phone that's an additional 1-2 seconds for registering the tap, and 1-2 seconds for navigating to the browser)
So say what you may, AMP (or FB Instant or its ilk) will prosper until the mobile web experience stops being so crappy.
First, no such thing as the "mobile web" exists. It's the web. Many times, people view the web with mobile devices, but it's the same web that can be viewed with desktop computers. Responsive web design permits sites to adjust their displays for different devices, but it's still the same web, delivered over HTTP/HTTPS.
Second, the web experience, which includes viewing the web on mobile devices, is fine. The problem is not with the web. The problem is with how websites are designed.
This is a non-interactive text-only website. It shouldn't need anything besides HTML and CSS.
These text-only sites — which used to be more popular in the early days of the Internet, when networks were slower and bandwidth was at a premium – are incredibly useful, and not just during natural disasters. They load much faster, don’t contain any pop-ups or ads or autoplay videos, and help people with low bandwidth or limited Internet access. They’re also beneficial for people with visual impairments who use screen readers to navigate the Internet.
Proving, once again, that the web is not slow. Websites are slow because of how they are built. But these slow web design choices are probably governed by the media orgs' business models and by user experience people who design without empathy for the users.
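To illustrate how little a fast, text-only page needs, here's a minimal sketch. The title, text, and the optional readability styling are placeholders, not any actual site:

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <title>Example article</title>
  <style>
    /* Optional: a few lines of CSS for comfortable reading. */
    body { max-width: 38em; margin: 1em auto; padding: 0 1em; }
  </style>
</head>
<body>
  <h1>Example article</h1>
  <p>Article text goes here. No scripts, no trackers, no web fonts.</p>
</body>
</html>
```

A complete, valid page like this weighs a few hundred bytes, loads in one request, and works in every browser, including text-mode browsers and screen readers.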
More from the story:
There are many ways that news organizations can improve the ways they serve both low-bandwidth users and people with visual impairments by stripping out unnecessary elements and optimizing different parts of a website. To learn more, I reached out to front-end website designer J. Albert Bowden, who frequently tweets about accessibility and web design standards, to ask a few questions about how we might approach building text-only sites to help end users.
Kramer: I’m curious. What kinds of things can be stripped from sites for low-bandwidth users and people with visual impairments?
Bowden: Those are two very distinct user groups but some of the approaches bleed over and can be applied together. For low-bandwidth users: Cut the fluff. No pictures, no video, no ads or tracking. Text files are good enough here. Anything else is just fluff.
That's good web design advice for any-bandwidth users.
Typical user agents for those with visual impairments are screen readers, which rely on the foundation (literally HTML) of a website to interpret its content and regurgitate it back to the user.
Kramer: Is text-only the way to go? Are there ways to think about preloading images and/or other methods that might help these users?
Bowden: Text in HTML is the way to go here; you cover accessibility issues and SEO bots, while simultaneously also being usable on the maximum number of devices possible. HTML and CSS are forgiving in the sense that you can make mistakes in them, and something will still be rendered to the user. Browsers are built with backwards compatibility, so combining them all grants you the extended coverage. Meaning that basic sites will work on nearly any phone. Any computer. Any browser.
Hands down the best news site design of 2017. lite.cnn.io
That October 2017 Poynter story contained a link to a 2015 Poynter story.
Consider creating a text-only site. At a recent accessibility hackathon, I sat with visually impaired people who said that this text-only version of the NPR website was the best news website because their screen readers easily parsed the material.
You can also check your website to make sure that it’s usable for people with color impairments, like color blindness. Color Safe helps you choose colors that meet WCAG contrast thresholds, while the Color Contrast Analyzer simulates different forms of color impairment so you can learn which colors may work.
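Tools like Color Safe and the Color Contrast Analyzer implement the WCAG 2.x contrast-ratio formula, which is simple enough to compute directly. A sketch in Python (4.5:1 is the AA threshold for normal-size text):

```python
# WCAG 2.x contrast ratio: relative luminance of each color,
# then (lighter + 0.05) / (darker + 0.05).

def _linear(channel):
    """sRGB channel (0-255) -> linear-light value."""
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def luminance(rgb):
    """WCAG relative luminance of an (r, g, b) color."""
    r, g, b = (_linear(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast(fg, bg):
    """Contrast ratio, from 1:1 (identical colors) to 21:1 (black on white)."""
    hi, lo = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (hi + 0.05) / (lo + 0.05)

BLACK, WHITE, GRAY = (0, 0, 0), (255, 255, 255), (119, 119, 119)

print(contrast(BLACK, WHITE))        # 21.0 -- maximum contrast
print(contrast(GRAY, WHITE))         # ~4.48 -- #777 on white
print(contrast(GRAY, WHITE) >= 4.5)  # narrowly fails WCAG AA
```

Note that the fashionable #777-gray-on-white narrowly fails the AA check, which is exactly the kind of thing these tools catch.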
Oct 14, 2017
<form> is all that is needed to conduct business. The web was created as documents plus forms, following the IBM 3270 model that businesses and other organizations had been using since the 1970s.
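A sketch of what that documents-plus-forms model looks like in plain HTML; the action URL and field names are made up for illustration:

```html
<!-- A plain HTML form: the browser submits it with zero JavaScript. -->
<form action="/orders" method="post">
  <label>Quantity <input type="number" name="quantity" value="1" min="1"></label>
  <label>Ship to <input type="text" name="address" required></label>
  <button type="submit">Place order</button>
</form>
```

The server receives the fields, responds with a new document, and the transaction is done: the same request/response cycle a 3270 terminal used.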
Another HN comment:
Plus, the web has become completely unusable without a script blocker.
A person responded with:
When you exaggerate like that, it diminishes your point. I use the web all day, every day and I have never installed a script blocker.
- "No one seems to be up in arms about slow websites. If AMP and others exist, there's a reason, and it might be more useful to respond to that than to respond to AMP." [yep. blame goes to the publishers for creating horribly bloated websites]
- yeah, as a user AMP is awesome. Dear website owners who wrote this letter: yes, AMP has its downsides but the reason we have AMP is that the internet as a whole failed to make fast sites. [no, it's not the internet's fault. it's the publishers' fault.] you had your chance for the previous ~30 years, and it's clear you couldn't be bothered to make speed a priority. I'm glad you're angry that AMP is eating your lunch, because maybe you'll actually start trying to compete now. But you aren't getting my sympathy.
- Everyone is interested in getting customers with shitty 2G connections and tiny data caps, because there are literally billions of these people and nobody wants to bet that they're never going to have money to spend on your services. If you pretend that all your users have net connections like the US, Europe, or parts of east Asia then you're going to end up as a small player in a balkanized internet. That's why all these companies are making aggressive investments and acquisitions in other parts of the world.
Testing and ranking website response time, using 3G testing at webpagetest.org.
https://www.webpagetest.org/result/180202_3D_c86d014ace02663a6ab00ddc688b82eb/
sawv.org - homepage
From: Dulles, VA - Chrome - 3G
2/1/2018, 10:00:43 PM
First View Fully Loaded:
Time: 1.390 seconds
Requests: 2
Bytes In: 9 KB
100% of the download was HTML, which was 6,413 bytes.
My specifications were quite simple.
- A simple site to share texts and images
- A cheap site to consult on mobile
- Readable source code for people learning HTML and CSS
- A fun site to edit (as in 1997) where you can quickly add interesting features.
Mobile networks and smart phones have incredible performance.
AMP is about making up for the race-to-the-bottom behaviour of sites that can’t resist using 6mb and a boatful of JS just to render an article(+analytics+ads)
Mobile networks and phones are powerful enough for video, so they're powerful enough to download and render a web page. If they're slow at web pages, it's very likely the fault of the page creator, not the network or device.
AMP is an attempt to force the hands of the page creators out there, since they apparently can't self-regulate.
I think it's very important to address the reason why AMP is possible in the first place: Websites are so extremely slow these days.
From a user's perspective, when I see the lightning icon on my search results I feel happy because it means that the page will show me its contents as soon as I click it.
It means that the website is not going to show me white page for 15 seconds then start jumping around, changing shape and position for another 30 seconds until everything is downloaded.
I hear all the ethical/economical/strategic concerns but the tech community resembles the taxi industry a bit, that is, claiming that a tech that improves users experience significantly is bad for the user and must be stopped politically instead of addressing the UX issue that allows this tech to exist in first place.
AMP strikes me as a clever technical solution to a problem that doesn't need a technical solution. It just needs restraint and better web development with existing standard technologies, and ideally a strong taboo on bloated web-sites.
See also two other technologies, the existences of which damn the web: Opera Mini (cloud rendering! and it's useful!), which can only exist for as long as the web is laughably inefficient, and Reader Mode, which improves modern web-design by removing it entirely.
A comment further down:
I work for a publisher that has zero ads. We have fast pages with minimal JS. We rolled out AMP purely for the SEO win and saw a huge uptick in traffic.
If Google really cared about performance they’d reward publishers doing the right thing on their regular pages (which would benefit the web as a whole), not just those using AMP.
Google decides, and they may be prioritizing websites that use Google's tech.
Another HN comment that echoes my past thoughts about AMP and Facebook's Instant Articles.
I don't blame Google for AMP. The industry could have come together to offer a better experience and speedier page load, but of course they didn't and preferred having countless scripts and poorly optimized ads and that translated into a poor experience for users. This created an opportunity for Google to come in and offer this solution and now we're stuck.
Another HN comment:
I have JS disabled and websites are not that slow for me.
"Make Medium Readable Again" https://news.ycombinator.com/item?id=16516126
Another idiotic article from The Verge that blames the mobile web for horribly bloated websites, like theverge.com, loading slowly. The Google person involved with AMP believes that the mobile web sucked prior to AMP.
First, no such thing as the mobile web exists. It's the web. It's HTTP or HTTPS. The web can be accessed on many devices.
With fast internet connections, simple web pages load fast on any device. Bloated websites load slowly on any device, especially over slow internet connections.
It's 2018 and you're criticizing a site for not running well with JS disabled. You're the one actively deciding to make websites harder to use, why should they design the site for you?
Excellent response to the above small-minded thinking:
Bad engineering in 2018 is still bad engineering. JS can be used to enhance the experience, and even build things otherwise impossible in the browser, and it's totally justified. But taking the newest, shinest web application framework and turning what's a tree of regular web documents into a web application, with data unaccessible if you don't run the entirety of its JS? That's just wrong.
Another disturbing viewpoint:
Nobody that pays for developer time in 2018 is going to optimize their site for people that go out of their way to willfully turn off part of the stack. It's not gonna happen. Effort for no reward.
If those people are designers, then they design with no empathy. They may be unaware of accessibility. They have a narrow view of the world, regarding the tech owned by users and the physical capabilities of users. They obviously lack the ability to pick the right tools for the right times.
Another good response to the small-minded people:
I turn off a part of the stack (or rather, run it on a whitelist), because people are using wrong parts of the stack for wrong things.
If you use the right parts of the stack for the right things, the result is a lean, accessible and interoperable piece of software. That's what the Internet was designed for. Alas, few people care, and in particular, interoperability is actively being opposed.
RE the toilet example, current webdev is more like refusing to build toilets in apartments and instead building them into car seats, because it's 2018, everyone has a car and a driver's license (or, rather, everyone in the population we want to monetize).
Another HN comment:
Some things should be simple and just work. They should work in a predictable manner and in a way that minimises risk for the users.
And the lame response from someone not paying attention:
Most websites are not just simple, static text.
I'd debate the "most" part, unless the commenter is including all internal websites/web apps at companies.
Another HN comment:
I disagree, most of them are simple static sites. Gmail, Google Docs, etc. are the outliers, not the norm. Reddit is a static text site with images. Hacker News is a static text site. Blogs are static text sites (with a few embedded images or non-text objects).
Another HN response:
Most websites should be just simple, static text.
someone who has created static and dynamic webpages since late last millennium
This benefits more than just one tiny group of users: it might also aid disabled users and accessibility software (not to mention the developers of that software), security nuts, people who turn JS off to improve performance on low-spec machines (it's 2018 here, but more than a few countries have a four (or even three!) figure GDP/capita, so their machines aren't going to be 2018 machines). This is just off the top of my head; how many other groups might there be that would benefit?
It's 2018 and web developers are still not competent enough to build simple websites (blogs, news, reddit) without forcing visitors to use JS.
JS is the technology responsible for most of the malware infections and spyware ad-tracking. It's not like people disable it just to piss off developers, there are very good reasons to turn it off.
"JS disabled" is just the consequent, tech-y angle to not requiring JS to display every damn text box. I get JS to load things like simple menu-pop-ups or expanding an image, but it's so infused in the new reddit layout, you need it to basically display simple, static text content. It's just bloat.
I just get annoyed, one website shouldn't require enough computing power [that] my [CPU] fan has to kick in
Not one website. A single web page can cause older computers to scream and glow red.
Oh wow that is a 'slow' website. After initial site loading it used up 100% of one CPU core for all the animations. When scrolling up and down the hiding / changing side menus and ads make it even slower. That is an incredible piece of work on how not to do frontend.
another hn comment:
I dont know whats included on the website but it is loading really slow and the scrolling is laggy, I gave up after the first person highlighted.
Free karma to whoever posts a plain-text list.
creating a new design because it's cool and not because it solves any problems.
"Reddit's redesign increases power usage of user's devices" https://www.reddit.com/r/redesign/comments/8jzddx/reddits_redesign_increases_power_usage_of_our/
top HN comment:
Anyone from reddit's dev team reading this? Or youtube's dev team, because my question applies to both.
The previous design was great. The new version of both sites is slow, sluggish and provides me with no benefit. Why was the change implemented? The previous design wasn't broken!
How's the user feedback? A/B testing really indicated to you this was a good choice?!
If you have any insight -- I'm sure this was a decision made much higher up than dev -- please do share.
I have to use this version of reddit: https://i.reddit.com
another hn comment:
This is the dumbest publishing platform on the web.
Write something, hit publish, and it's live.
Long live the independent web!
The web is about linking, therefore a 100-percent plain text option does not make sense. But with the way some websites are poisoned by horrible, modern web design, a 100-percent plain text version would be preferable.
But even Gopher contains linking capabilities.
TBL created the web with its own markup language, derived from SGML, which he called HTML. The purpose of the markup language was to display content with some simple formatting options to make reading long docs easier in the new web browser app.
The web = linking and formatting with HTML. A 100-percent plain text option is not the web, but it's possible to do.
Some web browsers today permit readers to unstyle web pages, which means the browser feature or extension disables CSS. I don't know if it also disables HTML formatting.
related from the knee-jerker who happens to be concerned about privacy and tracking.
From the IndieWeb chat log, this disappointing observation from an IndieWeb user:
Again, from the knee-jerker who posted in that Twitter thread:
That person displayed his ignorance.
Good observation by another IndieWeb user in that thread.
sknebel blocking third-party JS with a small whitelist is in my experience the best web experience, so I understand where the JS "hate" is coming from
READING a PUBLIC WEBPAGE as a BROWSING-ONLY USER is NOT the same as PERFORMING WORK in a PRIVATE WORKSPACE as a LOGGED-IN USER within an internal or external WEB APP.
The original web was created to share documents for READING. That simple capability still exists today.
But today, my hardware, operating systems, and web browsers are being made obsolete by WEB PAGES that require many megabytes of crapware to be downloaded, which causes older and slower CPUs to glow red and scream. And all that I wanted to do was to READ those bloated web pages.
I'm not trying to install a brand new hardware widget that performs magnitudes more functions, and requires a newer version of the operating system or whatever. I'm not trying to install a new, mission control-like software system that requires more RAM and new CPUs, and bigger monitors.
Web pages and websites are obsoleting systems, which is insane when the site simply displays text and images. It's not offering video game play.
Design Tip: Never Use Black (2012) https://news.ycombinator.com/item?id=17334627 https://ianstormtaylor.com/design-tip-never-use-black/
I'll take high contrast black on white over the annoyingly popular grey on grey that also uses a microscopic font size when the websites are viewed on a phone's web browser. It's almost as if the design was intentionally made to prevent people from reading the website.
July 2018 HN thread about a new text-based browser. Discussion from one part of the thread.
Can it be made not to show WebGL, embedded video, and so forth? I enjoy a very serene internet using w3m set to monochrome, with mouse and images turned off. Every now and then it's necessary to use a graphical browser, and it's the sensory equivalent of being woken up by a toddler at 5:30 on Christmas morning.
What you call "serene," I would call "austere." That's not meant as a denigration, mind you: I'm very curious as to your viewpoint here. What do you enjoy about such an experience?
It's unfortunate for USABLE web design that such a question needs to be asked. Forget about "serene" and "austere". The proper term is usable.
The person gave a great response to the sad question.
I find the ordinary internet terribly overstimulating. It's a constant din of people screaming for my attention. Even relatively sober sites often have distracting designs that make it hard to focus on the content. Text mode is calming, it cuts straight through the clutter. In text mode, authors have to distinguish themselves by saying something interesting, and it's a lot easier to decide whether that's the case when they have nothing but words with which to make their arguments.
Another user provided an excellent response to the question.
The major benefit is that I don't enjoy an "experience", I just read the content and leave. So much of the modern web is built specifically to prevent that.
Indeed. The massively bloated, clunky, tracker-filled, and ad-filled websites that dominate the media industry make it harder to stay informed. Their web reading experience is too awful to tolerate.
A friend gave me design advice once. He said to start with left-aligned black text on a white background, and to apply styling only to solve a specific problem. This is good advice. Embrace this, and you embrace Brutalist Web Design. Focus on your content and your visitors will enjoy you and your website. Focus on decoration or tricking your visitors into clicking ads, and your content will suffer, along with your visitors.
That's useful advice, but I don't understand the repeated attempts in the past couple years to apply brutalist architectural design to web design.
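The friend's advice quoted above, left-aligned black text on a white background with styling applied only to solve a specific problem, amounts to only a few lines of CSS. A minimal sketch; the specific values are one reasonable set of choices, not a standard:

```css
/* Start near the browser defaults: black text, white background, left-aligned. */
body {
  color: #000;
  background: #fff;
  /* Each rule below solves one specific reading problem. */
  max-width: 40em;    /* very long lines are hard to read */
  margin: 0 auto;
  padding: 0 1em;     /* breathing room on narrow screens */
  line-height: 1.5;   /* dense lines are hard to scan */
}
img { max-width: 100%; }  /* images shouldn't overflow on phones */
```

Everything else, headings, emphasis, lists, blockquotes, already works with no styling at all.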
Here's the July 2018 HN thread related to the above website.
That thread contained over 300 comments. Here's a humorous one.
Please don't encourage people taking marketing classes. The internet needs less advertising not more.
Naturally met by ignorant criticism.
When it's a text-based article, why does the content need to be lost to wretched, bloated, obnoxious web design? What's wrong with old, simple HTML with a smattering of simple CSS when displaying an article that contains mainly text?
I stumbled upon Peter's post and the HN thread while using the Links web browser on my Linux desktop computer. That's the Links browser and not Lynx. I occasionally use Lynx, which is a text-based web browser. I first started using Lynx in 1994 or 1995.
The Links web browser came later; its development began in the late 1990s. I have rarely used Links, since I preferred Lynx. But I learned in July 2018 that Links had a graphics mode that supports images and the mouse. For some reason, I never knew that.
On July 25, 2018, I started using Links or Links2 in graphics mode by starting it from the command prompt with links2 -g. I like it. It does not style pages as well as NetSurf, which is still a favorite alternative web browser of mine. But I like the Links text-mode styling with some graphics capabilities.
Jul 25, 2018
"Senator Asks US Agencies to Remove Flash from Government Websites"
Two notes, though I doubt anyone working on government pages will actually read them:
1) PDF -> HTML improves accessibility. HTML -> content rendered by JS via an SPA framework is worse than PDF, and approaches Flash. Please don't do that.
2) Think of the robots :). One of the problem of government data is that while you can usually find the scanned PDF or an XLS file with the data you're looking for, it's completely useless for automated processing. Making public data easier for machines to read enables citizens to build interesting tools on top of them.
Yes. Trying to text-process government PDF files with homemade programs can be frustrating. I had that trouble back in 2005. Our local governments still produce too many PDF files today. Web access to one of the Lucas County services requires Silverlight!!!???
July 27, 2018
"Twitter shares drop 14 percent after reporting declining monthly active users"
What I don't understand is the discrepancy between the size of their engineering workforce and the lousy quality of their web client on mobile.
This may be an intentional dark pattern / effort to drive users away from the web client and to the official app. Facebook and reddit do the same thing. More permissions = more juicy data to harvest.
I think that a lot of websites do that. According to my tin-foil-hat theories, websites function so poorly on all devices, due to disgusting web designs, that the bad designs seem intentional: a way to encourage people to download yet another native app for the phone.
Jul 30-31, 2018
Great top comment:
I've said this before, but it bears repeating: Moby Dick is 1.2mb uncompressed in plain-text. That's lower than the "average" news website by quite a bit--I just loaded the New York Times front page. It was 6.6mb. That's more than 5 copies of Moby Dick, solely for a gateway to the actual content that I want. A secondary reload was only 5mb.
I then opened a random article. The article itself was about 1,400 words long, but the page was 5.9mb. That's about 4kb per word without including the gateway (which is required if you're not using social media). Including the gateway, that's about 8kb per word, which is actually about the size of the actual content of the article itself.
So all told, to read just one article from the New York Times, I had to download the equivalent of ten copies of Moby Dick. That's about 4,600 pages. That's approaching the entirety of George R.R. Martin's A Song of Ice and Fire, without appendices.
If I check the NY Times just 4 times a day and read three articles each time, I'm downloading 100mb worth of stuff (83 Moby-Dicks) to read 72kb worth of plaintext.
Even ignoring first-principles ecological conservatism, that's just insanely inefficient and wasteful, regardless of how inexpensive bandwidth and computing power are in the west.
EDIT: I wrote a longer write-up on this a while ago on a personal blog, but don't want it to be hugged to death:
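The commenter's arithmetic is easy to check. A quick sketch using the sizes quoted above; the ~6 bytes per word for plain English text is my own rough assumption:

```python
MB = 1024 * 1024  # treating the comment's loose "mb" as mebibytes

moby_dick = 1.2 * MB   # plain-text novel, uncompressed
front_page = 6.6 * MB  # NYT front page, as measured in the comment
article = 5.9 * MB     # one ~1,400-word article
words = 1400

print(front_page / moby_dick)              # ~5.5 Moby Dicks for the gateway alone
print(article / words / 1024)              # ~4.3 KB transferred per word
print((front_page + article) / moby_dick)  # ~10 Moby Dicks to read one article

# Four visits a day, three articles per visit:
daily = 4 * (front_page + 3 * article)
print(daily / MB)                    # ~97 MB downloaded per day
print(4 * 3 * words * 6 / 1024)      # vs. ~98 KB of plain text, at ~6 bytes/word
```

The numbers line up with the comment: roughly 100 MB of transfer per day for well under 100 KB of actual words.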
another comment, related to the ny times or media orgs:
It would be nice if the things I pay for didn't start stuffing their content with bullshit. What and who do I have to pay to get single second page loads?
Man, it's like I wrote that comment, but I didn't, at least not at HN. I certainly have made similar comments here.
That commenter continued ...
It's not a given that advertising has to be so bloated and privacy-invasive. Various podcasts and blogs (like Daring Fireball) plug the same ad to their entire audience each post/episode for set periods of time. If you're going to cry about needing advertising then take your geographic and demographic based targeting. But no war of attrition will get me to concede you need user-by-user tracking.
You want me to pay for your content? Fine, I like it well enough. You want to present ads as well? Okay sure, the writing and perspectives are worth that too I suppose. But in addition to all of this you want to track my behavior and correlate it to my online activity that has nothing to do with your content? No, that's ridiculous.
The top commenter replied to someone else's misguided comment.
A random archive of the New York Times frontpage in 2005 is 300kb. Articles were probably comparable in size.
Are you honestly saying that the landscape of the internet and/or the staffing needs of the NY Times has changed so drastically that they actually needed a 22x increase in size to deliver fundamentally text-based reporting?
Here's the misguided comment by the person who replied to the top comment.
I don't think thats a meaningful comparison. Moby Dick is a book, written by 1 guy and maybe an editor or two. NYT employs 1,300 people.
Yes. The NY Times, like most media orgs, is a for-profit business, and most of these orgs rely on subscriptions and/or digital advertising, which obviously needs page views.
When you read a book all you get is the text. NYT has text, images, related articles, analytics, etc.
Here's a Toledo Blade editorial that contains around 370 words and one small image.
From: Dulles, VA - Thinkpad T430 - Chrome - Cable
7/31/2018, 3:12:23 PM
First View Fully Loaded:
Time: 10.610 seconds
Requests: 286
Bytes In: 2,532 KB
Why would a 370-word editorial require a web reader to download 2.5 megabytes of data? And why would such an article require the reader's web browser to make 286 web requests?
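The same back-of-the-envelope arithmetic, applied to the Blade editorial; the ~6 bytes-per-word figure for plain text is my assumption:

```python
downloaded = 2532 * 1024  # bytes, from the webpagetest run above
words = 370
text_bytes = words * 6    # rough plain-text size, assuming ~6 bytes/word

print(downloaded / words)       # ~7,000 bytes transferred per word
print(downloaded / text_bytes)  # payload is over 1,000x the size of the text itself
```

Seven kilobytes of transfer per word of editorial, and the actual text is a rounding error within the payload.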
Again from above:
Moby Dick is 1.2mb uncompressed in plain-text
If Moby Dick was being rewritten and optimized every single day it would be a few mb. Its not, so you can't compare the two.
Yes NYT should be lighter, no your comparison is not meaningful. A better comparison would be Moby Dick to the physical NYT newspaper.
Here's another commenter who made a misguided post as a reply to the author of the top comment.
This is both an appeal to people's universal appreciation of efficiency, and a weak denunciation of the modern web. Your argument is
1) that a website's value is the number of words on the page, and
Huh? The other person did not discuss the value or the merits of word counts. It's the content that matters, whether it's 20,000 words or 200 words. And it doesn't matter if the content is a mix of audio, video, images, and words, as long as every piece of content has value. I despise huge, useless images that have nothing to do with the content of the page. And since most people read web pages on their phones, why use massive images? A 640- or 800-pixel-wide image is probably enough for most articles on all screen sizes. Maybe instead of embedding giant images, video, and every social media widget within the article, content creators should provide links to the bigger content. If readers want to view the image or the video, they can click a link. That keeps the article page lighter.
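The link-instead-of-embed approach described above, sketched in plain HTML; all filenames and sizes are placeholders:

```html
<!-- Inline: a small image sized for article reading, linked to the full version. -->
<p>
  <a href="/media/chart-full.png">
    <img src="/media/chart-640.png" width="640" alt="Monthly traffic chart">
  </a>
</p>

<!-- Heavy media stays behind a link until the reader asks for it. -->
<p><a href="/media/interview.mp4">Watch the interview (video, 48 MB)</a></p>
```

Readers who want the big image or the video pay for those bytes; everyone else gets a light article page.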
Excerpts from another comment posted by the author of the top comment, at least at the time that I read this HN thread.
Concerning the relative value of HTML and CSS, yes, you could argue that UX matters in that department, but even the most bloated static HTML/CSS page is going to pale dramatically in comparison to the size of what's considered acceptable throughput today.
Hopefully, commenters or "thinkers" like this HN user do not design websites.
Comparing the raw text of a fiction novel to the code of a website is a pretty asinine comparison, honestly.
Clearly, the person lacks the ability to comprehend the comparison, which is to highlight how absurdly bloated single web pages have become, especially at media orgs. It's why Facebook created Instant Articles and Google created Accelerated Mobile Pages.
Another HN commenter added:
Amen to every word except this sentence. "Better choices should be made by web developers to not ship this bullshit in the first place."
No developer I know, web or otherwise, wants to do any of this, and all of them are religious in their use of ad blockers and autoplay stoppers.
This is the kind of stuff developers are forced to do with guns to their heads by the PMs and marketing teams that actually determine the user experience.
We don't hear enough from designers and developers about how they are forced to make bloated websites because of the business models used by the publishers. I wish that we would hear more tell-alls in this area.
In July 2018, I began using and enjoying the Links web browser on my Linux desktop computer. That would be Links in graphics mode. For some reason, I needed to download Links2. I start up the browser from the command line with links2 -g.
With a broadband connection at home, I'm stunned at how fast our internet access is, and we do not have the fastest option offered by our local ISP, toast.net. Our version costs around $37 per month.
Links renders web pages literally in a blink of an eye, maybe faster than I can blink. The limited graphical browser NetSurf can render through at least HTML4 and CSS2 and maybe a smattering of HTML5 and CSS3.
links -g renders only basic HTML, such as headings, paragraphs, emphasis, bullet points, and blockquotes, which is about all that I need.
links -g can also display images. But the browser does not support CSS. The browser provides global display options for the background color, font size, margin sizes, etc. Simple but useful.
When individual webpages are stripped down to their basic HTML, like how the web was meant to be, web access and web page loading is blazingly fast over our home broadband internet connection.
In July 2018, I've been using the uMatrix Chrome browser extension with everything disabled by default for all websites, and then I enable features on a per-site basis. With everything disabled, webpages in Chrome display similarly to how they look in the links -g web browser.
Since it's unlikely that the links -g web browser will work on a smartphone, I may need to try Chrome again on the iPhone with uMatrix, assuming that extension exists for the mobile version of Chrome.
The web (http/https) is not slow. The mobile web is not slow. No such thing as the mobile web exists. It's the same damn web, displayed on smaller screens. Even on a 2G connection, simple web pages load faster than bloated web pages on fast internet connections.
The web browser makers are not the problem for a slow web as defined by The Verge. The blame belongs to bloated website publishers like The Verge.
If the Toledo Blade offered a web product that functioned like https://text.npr.org except with the addition of useful and judiciously placed images, video, and audio, then I would happily subscribe.
My fictional Toledo Gazette test website with real Blade content.
One Year Without AMP (alexkras.com)
From the medium.com post:
Let’s talk about Google AMP. AMP stands for Accelerated Mobile Pages. It’s a technology Google originally introduced to get web developers to speed up their webpages for mobile devices and mobile networks. But in many ways it seems like great technology for any device or network. Who doesn’t want fast websites?
Uh, newsflash: Website owners don't need Google AMP to create fast websites. Website owners can CHOOSE to create fast websites on their own, but they don't. It's a choice.
But most web publishers, especially media orgs, CHOOSE to create horrendously bloated web designs.
Top comment from the HN thread that contains over 340 comments:
Question: If what makes AMP fast is the restrictions on size, JS, and CSS, and you know this and want to conform to this, why do you need to use AMP? Why not just develop your site like this anyways?
It's possible that the tech people working at media orgs would prefer to create lightweight, fast-loading, non-tracking websites, but those tech people are employees beholden to "superiors" who promote a failing business model, built around digital ads, clickbait, and pageviews.
Another HN comment:
AMP's innovation isn't a way to make pages fast. AMP is a way to sell other stakeholders on implementing technologies that make your website fast. All the stuff AMP does is stuff you could do yourself without the extra request to amp.js and the extra work to amp-ify your pages.
But imagine you've got an advertising department that wants three different ad networks, a couple different managers that want to see stats from a couple different analytics platforms, and the designer wants to load a font from fontsquirrel and another one from typekit and another one from google web fonts, and as a developer who wants to keep the site fast you have to fight them every single time they want to add something else that slows your site down. Having the same fight every time, with everybody else saying "oh, it's just one request, and this one is really critical," it's hard to keep fighting that fight.
It's a lot easier to say "I can't do that, it doesn't work in AMP." If you can find a better way to convince large organizations that page load speed is a valuable metric, and more important than whatever other resource they want to load today, I'd love to hear it. But from what I've seen, AMP is the only thing that's had any success in tackling this problem.
Another HN comment:
AMP was a blessing for me honestly. I can now maintain a version of our new site that isn't bogged down with tracking and flavor-of-the-month JS feature garbage.
I've been fighting against adding additional tracking forever, but constantly get railroaded by marketing because "they're the ones that know how to make us profitable."
Fundamentally I hate what it means for the internet, but I finally have a little power to say "no we can't do that."
Another HN comment:
It is astonishing how hard it can be to internally sell any kind of web quality features to management in both for profit and non-profit organizations.
There is also a real herd effect. Many people will do whatever Matt Cutts tells them because they think it will be good for their SEO. Yeah right. Some of the people who are good at SEO either went to work for huge brands or quasi-competitors of Google (like about.com) that might have some ability to bring Google to anti-trust court; most of the others switched to paid advertising once they figured out that Google won't let you win at SEO.
Depending upon the business, designers and developers should NOT be blamed for poorly designed websites. We don't know the stories behind the bloated, nefarious designs.