Tag - Manifesto

assembled in 2016-2017

Most weeks, I find new and old web posts related to this subject. These pages are mainly a dumping ground for links, excerpts, and quick thoughts.

storing links for page number 7 ...

aug 16, 2017 https://news.ycombinator.com/item?id=15027715

Most content websites have become such a massive crapfest of ad-bloat, bad UX, huge page sizes and general usability hell that it's nigh impossible that I'd be able to reach the actual content of a non AMP site in the first 5-10 seconds of clicking on its link. (On my phone that's an additional 1-2 seconds for registering the tap, and 1-2 seconds for navigating to the browser)

So say what you may, AMP (or FB Instant or its ilk) will prosper until the mobile web experience stops being so crappy.

First, no such thing as the "mobile web" exists. It's the web. People often view the web on mobile devices, but it's the same web that can be viewed on desktop computers. Responsive web design lets sites adjust their display for different devices, but it's still the same web, delivered over HTTP/HTTPS.

Second, the web experience, which includes viewing the web on mobile devices, is fine. The problem is not with the web. The problem is with how websites are designed.


This is a non-interactive text-only website. It shouldn't need anything besides HTML and CSS.


These text-only sites — which used to be more popular in the early days of the Internet, when networks were slower and bandwidth was at a premium – are incredibly useful, and not just during natural disasters. They load much faster, don’t contain any pop-ups or ads or autoplay videos, and help people with low bandwidth or limited Internet access. They’re also beneficial for people with visual impairments who use screen readers to navigate the Internet.

Proving, once again, that the web is not slow. Websites are slow because of how they are built. But these slow web design choices are probably governed by the media orgs' business models and by user experience people who design without empathy for the users.

More from the story:

There are many ways that news organizations can improve the ways they serve both low-bandwidth users and people with visual impairments by stripping out unnecessary elements and optimizing different parts of a website. To learn more, I reached out to front-end website designer J. Albert Bowden, who frequently tweets about accessibility and web design standards, to ask a few questions about how we might approach building text-only sites to help end users.

Kramer: I’m curious. What kinds of things can be stripped from sites for low-bandwidth users and people with visual impairments?

Bowden: Those are two very distinct user groups but some of the approaches bleed over and can be applied together. For low-bandwidth users: Cut the fluff. No pictures, no video, no ads or tracking. Text files are good enough here. Anything else is just fluff.

That's good web design advice for any-bandwidth users.

[Bowden:] For visually impaired users: I’m going to just talk about a11y [which is a shorthand way of referring to computer accessibility] here. A11y is best addressed in the foundation of a website, in the CSS, HTML, and JavaScript. There are other ways to go about doing it, but they are much more resource intensive and therefore are never going to be default for mainstream.

Typical user agents for those with visual impairments are screen readers, which rely on the foundation (literally HTML) of a website to interpret its content and regurgitate it back to the user.

Kramer: Is text-only the way to go? Are there ways to think about preloading images and/or other methods that might help these users?

Bowden: Text in HTML is the way to go here; you cover accessibility issues and SEO bots, while simultaneously also being usable on the maximum number of devices possible. HTML and CSS are forgiving in the sense that you can make mistakes in them, and something will still be rendered to the user. Browsers are built with backwards compatibility, so combining them all grants you the extended coverage. Meaning that basic sites will work on nearly any phone. Any computer. Any browser.

Once you deviate from this path, all bets are off. Some are solid, no doubt, but most are shaky at least, and downright broken at worst. JavaScript is the bee’s knees and you can do anything in the world with it ... but if you make certain mistakes, your web page will not render in the browser; the browser will choke to death.
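Bowden's point about HTML and CSS being forgiving can be made concrete. The sketch below is a hypothetical minimal page (none of it comes from the interview) that renders on essentially any browser, stays readable if the CSS never loads, and gives screen readers a clean document to parse:

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <title>Plain article</title>
  <style>
    /* A smattering of CSS; if it fails to load, the page is still readable. */
    body { max-width: 40em; margin: 0 auto; padding: 1em; }
  </style>
</head>
<body>
  <h1>Article headline</h1>
  <p>Body text that any browser, including Lynx and a screen reader,
     can render without executing a single line of JavaScript.</p>
</body>
</html>
```

If the browser doesn't understand a tag, it ignores the tag and still shows the text; that is the backwards compatibility Bowden is describing.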


Hands down the best news site design of 2017. lite.cnn.io

That October 2017 Poynter story contained a link to a 2015 Poynter story.


Consider creating a text-only site. At a recent accessibility hackathon, I sat with visually impaired people who said that this text-only version of the NPR website was the best news website because their screen readers easily parsed the material.

You can also check your website to make sure that it’s usable for people with color impairments, like color blindness. Color Safe helps you choose colors that meet WCAG contrast thresholds, while the Color Contrast Analyzer simulates different forms of color impairment so you can learn which colors may work.
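The contrast threshold those tools check against is a published WCAG formula, so you can also compute it yourself. A sketch in JavaScript (the function names are mine, not from either tool):

```javascript
// Relative luminance per WCAG 2.x, from 8-bit sRGB channels.
function luminance([r, g, b]) {
  const chan = (c) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * chan(r) + 0.7152 * chan(g) + 0.0722 * chan(b);
}

// Contrast ratio between two colors: (lighter + 0.05) / (darker + 0.05).
// Ranges from 1:1 (identical colors) to 21:1 (black on white).
function contrastRatio(fg, bg) {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

contrastRatio([0, 0, 0], [255, 255, 255]); // ≈ 21, the maximum
// WCAG AA requires at least 4.5:1 for normal body text.
```

Popular light-gray-on-white schemes often land just under the 4.5:1 AA line, which is exactly the "grey on grey" problem described elsewhere on this page.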

Oct 14, 2017

https://www.codementor.io/kwasiamantin/undefined-cmglhfnfv https://news.ycombinator.com/item?id=15472627

https://www.neustadt.fr/essays/against-a-user-hostile-web https://news.ycombinator.com/item?id=15611122

"How the BBC News website has changed over the past 20 years" http://www.bbc.co.uk/news/uk-41890165 https://news.ycombinator.com/item?id=15730218

"Hundreds of web firms record 'every keystroke'" http://www.bbc.com/news/technology-42065650 https://news.ycombinator.com/item?id=15786887

Businesses existed before the creation of JavaScript. Conducting most business does not require a Turing-complete language executing arbitrary code on every document. The HTML <form> is all that is needed to conduct business. The web was created as documents plus forms, in the IBM 3270 model that businesses and other organizations had been using since the 1970s.
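For example, a hypothetical order form (the endpoint and field names are invented for illustration) conducts a transaction with zero JavaScript; the browser serializes the fields and plain HTTP does the rest:

```html
<!-- Hypothetical example: /orders and the field names are made up. -->
<form action="/orders" method="post">
  <label>Quantity: <input type="number" name="quantity" value="1"></label>
  <label>Ship to: <input type="text" name="address"></label>
  <button type="submit">Place order</button>
</form>
<!-- No script tag anywhere: submission, the validation round-trip, and
     the confirmation page are all plain HTTP requests and HTML documents. -->
```

This is the documents+forms model: every step is a page the server renders, which is why it works in any browser ever shipped.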

Another HN comment:

Whenever someone complains about a website not working without javascript enabled, someone inevitably responds "it's 2017, you can expect javascript to be enabled". I think that piece of knowledge is outdated:

  • Late 1990s: static HTML documents + forms
  • Early 2000s: shitty DHTML scripts that added nothing
  • Early 2010s: JavaScript + gracefully downgrading sites
  • 2015/16: required useful JavaScript everywhere
  • Early 2017: trackers everywhere, HTML5 popups, trackers, spyware, trackers, bitcoin miners, trackers, etc, etc.

2017 is the year where you NEED a javascript blocker. What's the use of having any security at all if you're going to leave the biggest attack vector in modern times completely unprotected?

Plus, the web has become completely unusable without a script blocker.

A person responded with:

When you exaggerate like that, it diminishes your point. I use the web all day, every day and I have never installed a script blocker.

It's not an exaggeration. It's nearly impossible to read media websites with JavaScript enabled, especially over an LTE connection, and definitely over a 3G connection. Even with a fast connection, older computers get bogged down when accessing media websites with JavaScript enabled.

January 2018

A letter about Google AMP http://ampletter.org https://news.ycombinator.com/item?id=16108553 - over 300 comments

HN comments:

Testing and ranking website response time, using 3G testing at webpagetest.org.



https://www.webpagetest.org/result/180202_3D_c86d014ace02663a6ab00ddc688b82eb/
sawv.org - homepage
From: Dulles, VA - Chrome - 3G - 2/1/2018, 10:00:43 PM
First View, Fully Loaded:
  Time: 1.390 seconds
  Requests: 2
  Bytes In: 9 KB
100% of the download was HTML, which was 6,413 bytes.


My specifications were quite simple.

  • A simple site to share texts and images
  • A cheap site to consult on mobile
  • Readable source code for people learning HTML and CSS
  • A fun site to edit (as in 1997) where you can quickly add interesting features.

Feb 2018:


Mobile networks and smart phones have incredible performance.

AMP is about making up for the race-to-the-bottom behaviour of sites that can’t resist using 6mb and a boatful of JS just to render an article(+analytics+ads)

Mobile networks and phones are powerful enough for video, so they're powerful enough to download and render a web page. If they're slow at web pages, it's very likely the fault of the page creator, not the network or device.

AMP is an attempt to force the hands of the page creators out there, since they apparently can't self-regulate.

I can't prove it because it's only my anecdotal experience, but it seems that the battery on my old iPhone drains faster when I have multiple Safari tabs open to web pages that use a lot of JavaScript, and I have JavaScript enabled. It seems that my battery life lasts longer when I have JavaScript disabled globally for Safari.

Feb 2018



Top comment:

I think it's very important to address the reason why AMP is possible in the first place: Websites are so extremely slow these days.

From a user's perspective, when I see the lightning icon on my search results I feel happy because it means that the page will show me its contents as soon as I click it.

It means that the website is not going to show me white page for 15 seconds then start jumping around, changing shape and position for another 30 seconds until everything is downloaded.

I hear all the ethical/economical/strategic concerns but the tech community resembles the taxi industry a bit, that is, claiming that a tech that improves users experience significantly is bad for the user and must be stopped politically instead of addressing the UX issue that allows this tech to exist in first place.

Next comment:

The tragedy of it is that web browsers have never been faster - it's just that websites insist on bloating, and bloating, and bloating. It's not unusual for modern websites to have literally megabytes of pointless JavaScript. (Reminder: Super Mario 64 weighs in at 8MB. The whole game.)

AMP strikes me as a clever technical solution to a problem that doesn't need a technical solution. It just needs restraint and better web development with existing standard technologies, and ideally a strong taboo on bloated web-sites.

See also two other technologies, the existences of which damn the web: Opera Mini (cloud rendering! and it's useful!), which can only exist for as long as the web is laughably inefficient, and Reader Mode, which improves modern web-design by removing it entirely.

A comment further down:

I work for a publisher that has zero ads. We have fast pages with minimal JS. We rolled out AMP purely for the SEO win and saw a huge uptick in traffic.

If Google really cared about performance they’d reward publishers doing the right thing on their regular pages (which would benefit the web as a whole), not just those using AMP.

Google decides, and they may be prioritizing websites that use Google's tech.

Another HN comment that echoes my past thoughts about AMP and Facebook's Instant Articles.

I don't blame Google for AMP. The industry could have come together to offer a better experience and speedier page load, but of course they didn't and preferred having countless scripts and poorly optimized ads and that translated into a poor experience for users. This created an opportunity for Google to come in and offer this solution and now we're stuck.

Another HN comment:

I have JS disabled and websites are not that slow for me.



"Make Medium Readable Again" https://news.ycombinator.com/item?id=16516126

Another idiotic article from The Verge that blames the mobile web for horribly bloated websites, like theverge.com, loading slowly. The Google person involved with AMP believes that the mobile web sucked prior to AMP.

First, no such thing as the mobile web exists. It's the web. It's HTTP or HTTPS. The web can be accessed on many devices.

With fast internet connections, simple web pages load fast on any device. Bloated websites load slowly on any device, especially over slow internet connections.


March 2018



April 2018

https://changemyview.net/2018/04/11/reddit-redesigned-its-site-and-it-could-kill-discussion-subreddits/ https://news.ycombinator.com/item?id=16819397

It's 2018 and you're criticizing a site for not running well with JS disabled. You're the one actively deciding to make websites harder to use, why should they design the site for you?

Excellent response to the above small-minded thinking:

Bad engineering in 2018 is still bad engineering. JS can be used to enhance the experience, and even build things otherwise impossible in the browser, and it's totally justified. But taking the newest, shinest web application framework and turning what's a tree of regular web documents into a web application, with data unaccessible if you don't run the entirety of its JS? That's just wrong.

Another disturbing viewpoint:

Nobody that pays for developer time in 2018 is going to optimize their site for people that go out of their way to willfully turn off part of the stack. It's not gonna happen. Effort for no reward.

If those people are designers, then they design with no empathy. They may be unaware of accessibility. They have a narrow view of the world, regarding the tech owned by users and the physical capabilities of users. They obviously lack the ability to pick the right tools for the right times.

JavaScript should never be required to read text on a public website when the reader is not logged into the website.

Another good response to the small-minded people:

I turn off a part of the stack (or rather, run it on a whitelist), because people are using wrong parts of the stack for wrong things.

If you use the right parts of the stack for the right things, the result is a lean, accessible and interoperable piece of software. That's what the Internet was designed for. Alas, few people care, and in particular, interoperability is actively being opposed.

RE the toilet example, current webdev is more like refusing to build toilets in apartments and instead building them into car seats, because it's 2018, everyone has a car and a driver's license (or, rather, everyone in the population we want to monetize).

Another HN comment:

Some things should be simple and just work. They should work in a predictable manner and in a way that minimises risk for the users.

The narrow-minded JavaScript all-the-time everywhere freaks are contributing to the demise of a useful, open web. They don't understand time-and-place tech.

Another great response:

Good engineering is robust and efficient. Requiring Javascript to display simple, static text is not good engineering.

And the lame response from someone not paying attention:

Most websites are not just simple, static text.

I'd debate the "most" part, unless the commenter is including all internal websites/web apps at companies.

The discussion is about Reddit, which is mainly text. Why does Reddit's primary mobile web display require non-logged-in users to enable JavaScript to read text?

Web apps that require users to login and work from their console or dashboard in a private manner can use JavaScript without most of us whining about it. Banking, tax preparing, etc. I expect JavaScript, used well, to improve the user experience. But that info, hopefully, is not available for anyone to read on the public web.

Another HN comment:

I disagree; most of them are simple static sites. Gmail, Google Docs, etc. are the outliers, not the norm. Reddit is a static text site with images. Hacker News is a static text site. Blogs are static text sites (with a few embedded images or non-text objects).

A JavaScript-free, basic-HTML version of Gmail is available on desktop/laptop screens. I don't understand why Google does not make this version of Gmail available on mobile web browsers. But Gmail can be used without JavaScript. That's how I use Gmail on my desktop and laptop computers.

Facebook can function without JavaScript at https://mbasic.facebook.com. Twitter functions without JavaScript. A minute ago, I logged into my test Twitter account and created a test tweet, all with JavaScript disabled for Twitter.

Large companies can afford to expend resources to create systems that function without JavaScript. Facebook, Google, and Twitter know that they have users in parts of the world that have limited internet or slow internet access, or the users have limited data cellphone plans, and JavaScript-free systems help the user experience in these situations.

And security-wise, it's a good idea to disable JavaScript in the browser.

Another HN response:

Most websites should be just simple, static text.

Another comment:

FWIW, it takes less developer time to make a decent website without JavaScript. JavaScript also introduces risks that are nonexistent in static pages. Now, if these JS sites were super snappy and worked well offline, etc., I'd say probably worth it. But I guess they are often worse than a plain old web page.

someone who has created static and dynamic webpages since late last millennium

HN comment:

It doesn't take a lot of genius, time, or money to set a link w/ url parameters or use a standard html form for e.g. upvoting or comment submission, and then override that with javascript for fancier UI / AJAX / avoid redirects, etc -- i.e., extra polish, rather than it being the only option and the site just becomes nonfunctional instead of degrading gracefully.

This benefits more than just one tiny group of users: it might also aid disabled users and accessibility software (not to mention the developers of that software), security nuts, and people who turn JS off to improve performance on low-spec machines. It's 2018 here, but more than a few countries have a four (or even three!) figure GDP per capita, so their machines aren't going to be 2018 machines. This is just off the top of my head; how many other groups might there be that would benefit?
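The pattern the commenter describes, a working HTML form first with JavaScript layered on as polish, might look like this sketch (the endpoint, query parameters, and class name are hypothetical):

```html
<!-- Works with JavaScript disabled: the form does a normal POST
     and the server redirects back to the page. -->
<form class="upvote" action="/vote?item=123&amp;dir=up" method="post">
  <button type="submit">Upvote</button>
</form>

<script>
  // Enhancement only: if this script never runs, nothing is lost.
  document.querySelectorAll('form.upvote').forEach((form) => {
    form.addEventListener('submit', (event) => {
      event.preventDefault();                  // skip the full-page redirect
      fetch(form.action, { method: 'POST' })   // same endpoint as the form
        .then(() => form.classList.add('voted'))
        .catch(() => form.submit());           // fall back to the plain POST
    });
  });
</script>
```

The site degrades gracefully because the script and the form hit the same endpoint; the JavaScript changes how the request is sent, never whether it can be sent.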

New businesses in the U.S. have to conform to accessibility guidelines for people with disabilities. No such regulation exists for websites. If the JavaScript all-the-time-everywhere crowd opened a coffee shop, they would gladly turn away customers with disabilities because such people represent only a small customer base.

HN comment:

It's 2018 and web developers are still not competent enough to build simple websites (blogs, news, reddit) without forcing visitors to use JS.

JS is the technology responsible for most of the malware infections and spyware ad-tracking. It's not like people disable it just to piss off developers, there are very good reasons to turn it off.

HN comment:

"JS disabled" is just the consequent, tech-y angle to not requiring JS to display every damn text box. I get JS to load things like simple menu-pop-ups or expanding an image, but it's so infused in the new reddit layout, you need it to basically display simple, static text content. It's just bloat.

What's the utility reason for requiring so much JavaScript to perform simple functions that were solved with basic HTML more than 20 years ago?


I just get annoyed, one website shouldn't require enough computing power [that] my [CPU] fan has to kick in

Not one website. A single web page can cause older computers to scream and glow red.

https://www.inc.com/30-under-30 https://news.ycombinator.com/item?id=16876061

Oh wow that is a 'slow' website. After initial site loading it used up 100% of one CPU core for all the animations. When scrolling up and down the hiding / changing side menus and ads make it even slower. That is an incredible piece of work on how not to do frontend.

another hn comment:

I dont know whats included on the website but it is loading really slow and the scrolling is laggy, I gave up after the first person highlighted.

hn comment:

Free karma to whoever posts a plain-text list.



Zegnat I block everything in uMatrix. images, css, javascript. To save on data-usage. Though the saving from CSS is basically 0, it does sometimes stop multiple third-party CDNs from loading


https://www.discoverdev.io https://news.ycombinator.com/item?id=17054460

creating a new design because it's cool and not because it solves any problems.

"Reddit's redesign increases power usage of user's devices" https://www.reddit.com/r/redesign/comments/8jzddx/reddits_redesign_increases_power_usage_of_our/


top HN comment:

Anyone from reddit's dev team reading this? Or youtube's dev team, because my question applies to both.

The previous design was great. The new version of both sites is slow, sluggish and provides me with no benefit. Why was the change implemented? The previous design wasn't broken!

How's the user feedback? A/B testing really indicated to you this was a good choice?!

If you have any insight -- I'm sure this was a decision made much higher up than dev -- please do share.

the new reddit design displays nothing when viewed on a mobile device with javascript disabled. brilliant.

have to use this version of reddit: https://i.reddit.com

another hn comment:

This is just a symptom of the webdev world's over-reliance on JavaScript to draw interfaces nowadays. The web stack has HTML, CSS, and JavaScript for reason. Quit doing everything in the third layer. Make your stuff with progressive enhancement. Knock off the do-it-all-in-JS stuff and your web apps will perform better.



This is the dumbest publishing platform on the web.

Write something, hit publish, and it's live.

There's no tracking, ad-tech, webfonts, analytics, javascript, cookies, databases, user accounts, comments, friending, likes, follower counts or other quantifiers of social capital. The only practical way for anyone to find out about a posting is if the author links to it elsewhere.

Long live the independent web!



Jun 1, 2018 discussion that appeared in the IndieWeb chat log about websites that maliciously use JavaScript, which is a good reason to disable JavaScript when READING. Naturally, a knee-jerker suggested disabling HTML, which proved that the user has no knowledge of the web's history.

The web is about linking, therefore a 100-percent plain text option does not make sense. But with the way some websites are poisoned by horrible, modern web design, a 100-percent plain text version would be preferable.

But even Gopher contains linking capabilities.

TBL created the web with its own markup language, based on SGML, which he called HTML. The purpose of the markup language was to display content with some simple formatting options, to make long docs easier to read in the new web browser app.

The web = linking and formatting with HTML. A 100-percent plain text option is not the web, but it's possible to do.

Some web browsers today permit readers to unstyle web pages, which means the browser feature or extension disables CSS. I don't know if it also disables HTML formatting.

People disable JavaScript when READING the web over the internet for multiple reasons: security, privacy, faster page loads, simpler pages, better reading experience, less data consumed.

When people need to conduct work within a company's network by using internal web apps, then a JavaScript experience will probably occur, and users might expect it because the users demand a pleasant user experience.

Same thing when users LOG INTO their web apps that execute over the internet and CONDUCT WORK from their DASHBOARDS or ADMIN CONSOLES. These are not public actions. These pages are not for public consumption. These web apps need to work smoothly, and using JavaScript can help.

Related, from the knee-jerker, who happens to be concerned about privacy and tracking.


Disable JavaScript when READING web sites that exist on the internet, and the annoying pop-over screens do not exist. At least the ones created by JavaScript.

JavaScript, not HTML, is responsible for the tracking and the anti-privacy crap.

From the IndieWeb chat log, this disappointing observation from an IndieWeb user:


2018-06-01 UTC 14:18 [jgmac1106] I have such disdain in the all javascript is evil position. Like blaming existence of words for hate speech

I have disdain for such binary observations too. Who said that all JavaScript is evil 100 percent of the time in all conditions, including web apps that execute in internal networks?

Again, from the knee-jerker who posted in that Twitter thread:

Heck, let’s get rid of HTML too. Ain’t nothing that needs saying that we can’t say in plain ol’ ASCII amirite? ;) (There’s nothing wrong with JavaScript, it’s the business models that are wrong. A few Stallmanesque folks browsing the web with Lynx isn’t going to solve this.)

That person displayed his ignorance.

Many websites that are NOT single page applications still require JavaScript to display text.

I run the latest, greatest version of Chrome. I use multiple browser extensions, including Quick JavaScript Switcher. And when I have JavaScript disabled for a website that requires JavaScript to display its content even though the site is not an SPA, I can sometimes read the text for that same site with Lynx. WTF?

And Stallman is NOT against JavaScript. The knee-jerker needs to stop creating a new reality. Maybe try reading RMS's website.


Stallman opposes non-free JavaScript code. From his homepage.

write a recipe for how to connect to the WiFi in a New York City subway station without running its nonfree Javascript code. The recipe could include a free Javascript program I could run, or it could consist of instructions for what I would type into IceCat (our variant of Firefox).

Stallman supports JavaScript code that is free.

The problem with geeks is that too often they view an issue in binary. The position is either support JavaScript 100 percent of the time or ban JavaScript 100 percent of the time. No leeway. Both viewpoints deserve to be ignored.

My favorite way of writing on the web is by using my own JavaScript-based editor. Logging into web apps to perform work means the users should experience a pleasant UI/UX. Get in, do work, exit, and move on. JavaScript can make banking, doing taxes, managing projects, etc. easier for the logged-in users.

Sure, those web apps could be made to work without JavaScript, but if a competitor creates a better UI/UX, then the non-JavaScript options could lose customers.

Fastmail only works with JavaScript, and I'm okay with that because I enjoy using Fastmail.

When I visit websites as a browsing-only user, and my only function is to read the website, then I see zero reason for the websites to require JavaScript to display text. I see no reason to require JavaScript to display images.

Without JavaScript, those sites appear broken or blank to me, which is fine. It's a big web. A World Wide Web, even. And I can find other websites to waste my time reading.

Good observation by another IndieWeb user in that thread.


sknebel blocking third-party JS with a small whitelist is in my experience the best web experience, so I understand where the JS "hate" is coming from


The original web was created to share documents for READING. That simple capability still exists today.

But today, my hardware, operating systems, and web browsers are being made obsolete by WEB PAGES that require many megabytes of crapware to be downloaded, which causes older and slower CPUs to glow red and scream. And all that I wanted to do was to READ those bloated web pages.

WEB PAGES and newfangled JavaScript are obsoleting my older computers.

I'm not trying to install a brand new hardware widget that performs magnitudes more functions, and requires a newer version of the operating system or whatever. I'm not trying to install a new, mission control-like software system that requires more RAM and new CPUs, and bigger monitors.

Web pages and websites are obsoleting systems, which is insane when a site simply displays text and images. It's not offering video game play.

Design Tip: Never Use Black (2012) https://news.ycombinator.com/item?id=17334627 https://ianstormtaylor.com/design-tip-never-use-black/

I'll take high contrast black on white over the annoyingly popular grey on grey that also uses a microscopic font size when the websites are viewed on a phone's web browser. It's almost as if the design was intentionally made to prevent people from reading the website.

July 2018 HN thread about a new text-based browser. Discussion from one part of the thread.


Can it be made not to show WebGL, embedded video, and so forth? I enjoy a very serene internet using w3m set to monochrome, with mouse and images turned off. Every now and then it's necessary to use a graphical browser, and it's the sensory equivalent of being woken up by a toddler at 5:30 on Christmas morning.



What you call "serene," I would call "austere." That's not meant as a denigration, mind you: I'm very curious as to your viewpoint here. What do you enjoy about such an experience?

It's unfortunate for USABLE web design that such a question needs to be asked. Forget about "serene" and "austere". The proper term is usable.

The person gave a great response to the sad question.


I find the ordinary internet terribly overstimulating. It's a constant din of people screaming for my attention. Even relatively sober sites often have distracting designs that make it hard to focus on the content. Text mode is calming, it cuts straight through the clutter. In text mode, authors have to distinguish themselves by saying something interesting, and it's a lot easier to decide whether that's the case when they have nothing but words with which to make their arguments.

Another user provided an excellent response to the question.

The major benefit is that I don't enjoy an "experience", I just read the content and leave. So much of the modern web is built specifically to prevent that.

Indeed. The massively bloated, clunky, tracker-filled, and ad-filled websites that dominate the media industry make it harder to stay informed. Their web reading experience is too awful to tolerate.

When I simply want to read information as a browsing-only user and not as a logged-in user, performing work at my dashboard or admin console, then plain text works best, especially if the plain text is delivered to the web browser with minimal HTML, little to no CSS, and definitely no JavaScript. I read, I learn, and I move on.

I'm not looking to be wowed by the latest JavaScript client-side fad framework. The heavy JavaScript experience is great when I'm hoping for a brilliantly designed UI/UX that stays out of my way when I'm logged into a web app to perform tasks.


A friend gave me design advice once. He said to start with left-aligned black text on a white background, and to apply styling only to solve a specific problem. This is good advice. Embrace this, and you embrace Brutalist Web Design. Focus on your content and your visitors will enjoy you and your website. Focus on decoration or tricking your visitors into clicking ads, and your content will suffer, along with your visitors.

That's useful advice, but I don't understand the repeated attempts in the past couple of years to apply brutalist architectural design to web design.

More from brutalist-web.design:

A website's materials aren't HTML tags, CSS, or JavaScript code. Rather, they are its content and the context in which it's consumed. A website is for a visitor, using a browser, running on a computer to read, watch, listen, or perhaps to interact. A website that embraces Brutalist Web Design is raw in its focus on content, and prioritization of the website visitor.

Here's the July 2018 HN thread related to the above website.


That thread contained over 300 comments. Here's a humorous one.

Please don't encourage people taking marketing classes. The internet needs less advertising not more.

Good stuff. https://petermolnar.net/do-websites-want-us-to-use-reader-mode/

Naturally met by ignorant criticism.


It's always binary with the rabid pro-JavaScript AND rabid anti-JavaScript crowds.

When it's a text-based article, why does the content need to be lost to wretched, bloated, obnoxious web design? What's wrong with old, simple HTML with a smattering of simple CSS when displaying an article that contains mainly text?

When it's a web app that requires the user to log into the website, then JavaScript up a storm, provided that the megatons of JavaScript are useful. When the JavaScript provides a comfortable and useful UI/UX, then that's a successful web app.

I stumbled upon Peter's post and the HN thread while using the Links web browser on my Linux desktop computer. That's the Links browser and not Lynx. I occasionally use Lynx, which is a text-based web browser. I first started using Lynx in 1994 or 1995.

The Links web browser came "later". Its development began in the late 1990s. I have rarely used Links, since I preferred Lynx. But I learned in July 2018 that Links has a graphics mode that supports images and the mouse. For some reason, I never knew that.

On July 25, 2018, I started using Links or Links2 in graphics mode by starting it up from the command prompt with links2 -g. I like it. It does not style pages as well as NetSurf, which is still a favorite alternative web browser of mine. But I like the Links text-mode styling with some graphics capabilities.

I found this post of mine from December 2017. It's a quote from a March 2017 Hacker News thread.

There is a special circle of hell for people who write documentation on pages so javascript heavy that a lynx browser on a computer with broken graphics drivers can't load it.

Jul 25, 2018

"Senator Asks US Agencies to Remove Flash from Government Websites"



HN comment:

Two notes, though I doubt anyone working on government pages will actually read them:

1) PDF -> HTML improves accessibility. HTML -> content rendered by JS via an SPA framework is worse than PDF, and approaches Flash. Please don't do that.

2) Think of the robots :). One of the problem of government data is that while you can usually find the scanned PDF or an XLS file with the data you're looking for, it's completely useless for automated processing. Making public data easier for machines to read enables citizens to build interesting tools on top of them.

Yes. Trying to text-process government PDF files with homemade programs can be frustrating. I had that trouble back in 2005. Our local governments still produce too many PDF files today. Web access to one of the Lucas County services requires Silverlight!!!???

July 27, 2018

"Twitter shares drop 14 percent after reporting declining monthly active users"


What I don't understand is the discrepancy between the size of their engineering workforce and the lousy quality of their web client on mobile.


This may be an intentional dark pattern / effort to drive users away from the web client and to the official app. Facebook and reddit do the same thing. More permissions = more juicy data to harvest.

I think that a lot of websites do that. According to my tin-foil hat theories, websites on all devices function so poorly, due to disgusting web designs, that the bad designs seem intentional, meant to encourage people to download yet another native app for the phone.

Jul 30-31, 2018



Great top comment:

I've said this before, but it bears repeating: Moby Dick is 1.2mb uncompressed in plain-text. That's lower than the "average" news website by quite a bit--I just loaded the New York Times front page. It was 6.6mb. that's more than 5 copies of Moby Dick, solely for a gateway to the actual content that I want. A secondary reload was only 5mb.

I then opened a random article. The article itself was about 1,400 words long, but the page was 5.9mb. That's about 4kb per word without including the gateway (which is required if you're not using social media). Including the gateway, that's about 8kb per word, which is actually about the size of the actual content of the article itself.

So all told, to read just one article from the New York Times, I had to download the equivalent of ten copies of Moby Dick. That's about 4,600 pages. That's approaching the entirety of George R.R. Martin's A Song of Ice and Fire, without appendices.

If I check the NY Times just 4 times a day and read three articles each time, I'm downloading 100mb worth of stuff (83 Moby-Dicks) to read 72kb worth of plaintext.

Even ignoring first-principles ecological conservatism, that's just insanely inefficient and wasteful, regardless of how inexpensive bandwidth and computing power are in the west.

EDIT: I wrote a longer write-up on this a while ago on a personal blog, but don't want it to be hugged to death:
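The commenter's arithmetic is easy to reproduce. Here's a rough back-of-the-envelope sketch in Python; all the size figures are the commenter's own measurements, not mine, and the ~6 KB of plain text per article is an estimate based on the quoted per-article word count:

```python
# Back-of-the-envelope check of the HN commenter's NY Times numbers.
# All sizes are the commenter's measurements, in megabytes.
MOBY_DICK_MB = 1.2        # plain-text Moby Dick, uncompressed
FRONT_PAGE_MB = 6.6       # NYT front page ("gateway")
ARTICLE_MB = 5.9          # one ~1,400-word article

one_article_total = FRONT_PAGE_MB + ARTICLE_MB
print(f"one article, via the front page: {one_article_total:.1f} MB "
      f"(~{round(one_article_total / MOBY_DICK_MB)} copies of Moby Dick)")

# Four visits a day, three articles per visit, as in the comment:
visits, articles_per_visit = 4, 3
daily_mb = visits * (FRONT_PAGE_MB + articles_per_visit * ARTICLE_MB)
daily_text_kb = visits * articles_per_visit * 6  # ~6 KB of plain text per article
print(f"daily download: ~{daily_mb:.0f} MB to read ~{daily_text_kb} KB of text")
```

The numbers check out: roughly ten Moby Dicks per article read, and close to 100 MB a day for about 72 KB of actual text.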


another comment, related to the ny times or media orgs:

It would be nice if the things I pay for didn't start stuffing their content with bullshit. What and who do I have to pay to get single second page loads?

Man, it's like I wrote that comment, but I didn't, at least not at HN. I certainly have made similar comments here.

That commenter continued ...

It's not a given that advertising has to be so bloated and privacy-invasive. Various podcasts and blogs (like Daring Fireball) plug the same ad to their entire audience each post/episode for set periods of time. If you're going to cry about needing advertising then take your geographic and demographic based targeting. But no war of attrition will get me to concede you need user-by-user tracking.

You want me to pay for your content? Fine, I like it well enough. You want to present ads as well? Okay sure, the writing and perspectives are worth that too I suppose. But in addition to all of this you want to track my behavior and correlate it to my online activity that has nothing to do with your content? No, that's ridiculous.

And hence why I disable JavaScript for most of my web reading. That's reading. R - E - A - D - I - N - G

The top commenter replied to someone else's misguided comment.

A random archive of the New York Times frontpage in 2005 is 300kb. Articles were probably comparable in size.

Are you honestly saying that the landscape of the internet and/or the staffing needs of the NY Times has changed so drastically that they actually needed a 22x increase in size to deliver fundamentally text-based reporting?

Here's the misguided comment by the person who replied to the top comment.

I don't think thats a meaningful comparison. Moby Dick is a book, written by 1 guy and maybe an editor or two. NYT employs 1,300 people.

Yes. The NY Times, like most media orgs, is a for-profit business, and most of these orgs rely on subscriptions and/or digital advertising that obviously needs page views.

When you read a book all you get is the text. NYT has text, images, related articles, analytics, etc.

But why do an increasing number of media orgs require JavaScript to display images when the HTML image tag has existed for more than 20 years?

Here's a Toledo Blade editorial that contains around 370 words and one small image.



From: Dulles, VA - Thinkpad T430 - Chrome - Cable
7/31/2018, 3:12:23 PM
First View Fully Loaded:
Time: 10.610 seconds
Requests: 286
Bytes in: 2,532 KB

Why would a 370-word editorial require a web reader to download 2.5 megabytes of data? And why would such an article require the reader's web browser to make 286 web requests?

43 of those requests were for JavaScript, which totaled 1.1 megabytes of the total download. The Blade's bloated web design, however, is not as bad as it was two to three years ago.
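The per-word overhead is simple to compute. A quick sketch using the WebPageTest figures above (the ~6 bytes per word of plain text is my own rough estimate):

```python
# Bytes-per-word overhead for the 370-word Toledo Blade editorial measured above.
words = 370          # word count of the editorial
page_kb = 2532       # total download, in KB (WebPageTest "Bytes in")
js_kb = 1100         # ~1.1 MB of that was JavaScript, across 43 requests

print(f"~{page_kb / words:.1f} KB of page weight per word of content")
print(f"~{js_kb / words:.1f} KB of JavaScript per word of content")
# The 370 words themselves, at roughly 6 bytes per word, are about 2 KB --
# under a tenth of one percent of what the browser had to download.
print(f"actual text: ~{words * 6 / 1024:.1f} KB")
```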

Again from above:

Moby Dick is 1.2mb uncompressed in plain-text

And the Blade's website required a reader to download 1.1 megabytes of JavaScript to read a small article that was mainly text.

Moby Dick doesn't have to know what pages you read. NYT needs to know how long you spent, on which articles, etc. They need data to produce the product and you can only achieve that with javascript tracking pixels (Server logs aren't good enough).

If Moby Dick was being rewritten and optimized every single day it would be a few mb. Its not, so you can't compare the two.

Yes NYT should be lighter, no your comparison is not meaningful. A better comparison would by Moby Dick to the physical NYT newspaper.

Here's another commenter who made a misguided post as a reply to the author of the top comment.

This is both an appeal to people's universal appreciation of efficiency, and a weak denunciation of the modern web. Your argument is

1) that a website's value is the number of words on the page, and

Huh? The other person did not discuss the value or the merits of word counts. It's the content that matters, regardless of whether it's 20,000 words or 200 words. And it doesn't matter if the content is a mix of audio, video, images, and words, as long as every piece of content has value.

I despise huge, useless images that have nothing to do with the content of the page. And since most people read web pages on their phones, why use massive images? A 640- or 800-pixel-wide image is probably enough for most articles on all screen sizes. Maybe instead of embedding giant images, video, and every social media widget within the article, the content creators should provide links to the bigger content. If a reader wants to view the image or the video, then the reader can click a link to view it. That keeps the article page lighter.

Excerpts from another comment posted by the author of the top comment, at least at the time that I read this HN thread.

Concerning the relative value of HTML and CSS, yes, you could argue that UX matters in that department, but even the most bloated static HTML/CSS page is going to pale dramatically in comparison to the size of what's considered acceptable throughput today.

Hopefully, commenters or "thinkers" like this HN user do not design websites.

Comparing the raw text of a fiction novel to the code of a website is a pretty asinine comparison, honestly.

Clearly, the person lacks the ability to comprehend the comparison, which is to highlight how absurdly bloated single web pages have become, especially at media orgs. It's why Facebook created Instant Articles and Google created Accelerated Mobile Pages.

Another HN commenter added:

Amen to every word except this sentence. "Better choices should be made by web developers to not ship this bullshit in the first place."

No developer I know, web or otherwise, wants to do any of this, and all of them are religious in their use of ad blockers and autoplay stoppers.

This is the kind of stuff developers are forced to do with guns to their heads by the PMs and marketing teams that actually determine the user experience.

We rarely hear about how designers and developers are forced to make bloated websites because of the business models used by the publishers. I wish that we would hear more tell-alls in this area.

In July 2018, I began using and enjoying the Links web browser on my Linux desktop computer. That would be Links in graphics mode. For some reason, I needed to download Links2. I start up the browser from the command line with links2 -g.

With a broadband connection at home, I'm stunned at how fast our internet access is, and we do not have the fastest option offered by our local ISP, toast.net. Our version costs around $37 per month.

Links renders web pages literally in a blink of an eye, maybe faster than I can blink. The limited graphical browser NetSurf can render through at least HTML4 and CSS2 and maybe a smattering of HTML5 and CSS3.

But links -g renders only basic HTML, such as headings, paragraphs, emphasis, bullet points, and blockquotes, about all that I need. links -g can also display images. But the browser does not support CSS. The browser provides global display options for the background color, font size, margin sizes, etc. Simple but useful.

When individual webpages are stripped down to their basic HTML, like how the web was meant to be, web access and web page loading is blazingly fast over our home broadband internet connection.

Since July 2018, I've been using the uMatrix Chrome browser extension with everything disabled by default for all websites; I then enable features on a per-site basis. With everything disabled, webpages in Chrome display similarly to how they look in the links -g web browser.

Since it's unlikely that the links -g web browser will work on a smartphone, I may need to try Chrome again on the iPhone with uMatrix, assuming that uMatrix exists for the mobile version of Chrome.

The web (http/https) is not slow. The mobile web is not slow. No such thing as the mobile web exists. It's the same damn web, displayed on smaller screens. Even on a 2G connection, simple web pages load faster than bloated web pages on fast internet connections.

The web browser makers are not the problem for a slow web as defined by The Verge. The blame belongs to bloated website publishers like The Verge.

If the Toledo Blade offered a web product that functioned like https://text.npr.org except with the addition of useful and judiciously placed images, video, and audio, then I would happily subscribe.

My fictional Toledo Gazette test website with real Blade content.


One Year Without AMP (alexkras.com)



August 2018



From the medium.com post:

Let’s talk about Google AMP. AMP stands for Accelerated-Mobile-Pages. It’s a technology Google originally introduced to get web developers to speed up their webpages for mobile devices and mobile networks. But in many ways it seems like great technology for any device or network. Who doesn’t want fast websites?

Uh, newsflash: Website owners don't need Google AMP to create fast websites. Website owners can CHOOSE to create fast websites on their own, but they don't. It's a choice.

There’s nothing that magical about it. A big part of its performance boost is simply its standards: no javascript, all inline CSS, and CSS can only measure 50KB or less. You’re going to make any page load faster with those requirements.

And website owners can do that themselves. But most web publishers, especially media orgs, CHOOSE to create horrendously bloated web designs.
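Those AMP constraints can be self-imposed without AMP. A crude sketch of a build-time budget check, using the thresholds from the rules quoted above; this regex-based approach and the function name are my own invention for illustration, not part of AMP's actual tooling:

```python
# Self-imposed AMP-style budget check: no <script> tags, inline CSS under 50 KB.
# A crude regex-based sketch for illustration, not a real HTML parser.
import re

CSS_BUDGET = 50 * 1024  # AMP's inline-CSS cap, in bytes

def check_page(html: str) -> list[str]:
    problems = []
    if re.search(r"<script\b", html, re.IGNORECASE):
        problems.append("page includes <script> tags")
    inline_css = "".join(re.findall(r"<style[^>]*>(.*?)</style>", html,
                                    re.IGNORECASE | re.DOTALL))
    if len(inline_css.encode()) > CSS_BUDGET:
        problems.append(f"inline CSS is {len(inline_css.encode())} bytes "
                        f"(budget: {CSS_BUDGET})")
    return problems
```

Run over every page in a build, a check like this makes the budget a rule rather than a fight, which is essentially the leverage AMP gives developers.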

Top comment from the HN thread that contains over 340 comments:

Question: If what makes AMP fast is the restrictions on size, JS, and CSS, and you know this and want to conform to this, why do you need to use AMP? Why not just develop your site like this anyways?

It's possible that the tech people working at media orgs would prefer to create lightweight, fast-loading, non-tracking websites, but those tech people are employees beholden to "superiors" who promote a failing business model, built around digital ads, clickbait, and pageviews.

Another HN comment:

AMP's innovation isn't a way to make pages fast. AMP is a way to sell other stakeholders on implementing technologies that make your website fast. All the stuff AMP does is stuff you could do yourself without the extra request to amp.js and the extra work to amp-ify your pages.

But imagine you've got an advertising department that wants three different ad networks, a couple different managers that want to see stats from a couple different analytics platforms, and and the designer wants to load a font from fontsquirrel and another one from typekit and another one from google web fonts, and as a developer who wants to keep the site fast you have to fight them every single time they want to add something else that slows your site down. Having the same fight every time, with everybody else saying "oh, it's just one request. and this one is really critical" it's hard to keep fighting that fight.

It's a lot easier to say "i can't do that, it doesn't work in AMP". If you can find a better way to convince large organizations that page load speed is a valuable metric, and more important that whatever other resource they want to load today, I'd love to hear it. But from what i've seen, AMP is the only thing that's had any success in tackling this problem.

Another HN comment:

AMP was a blessing for me honestly. I can now maintain a version of our new site that isn't bogged down with tracking and flavor-of-the-month JS feature garbage.

I've been fighting against adding additional tracking forever, but constantly get railroaded by marketing because "they're the ones that know how to make us profitable."

Fundamentally I hate what it means for the internet, but I finally have a little power to say "no we can't do that."

Another HN comment:

It is astonishing how hard it can be to internally sell any kind of web quality features to management in both for profit and non-profit organizations.

There is also a real herd effect. Many people will do whatever Matt Cutts tells them because they think it will be good for their SEO. Yeah right. Some of the people who are good at SEO either went to work for huge brands or quasi-competitors of Google (like about.com) that might have some ability to bring Google to anti-trust court; most of the others switched to paid advertising once they figured out that Google won't let you win at SEO.

Depending upon the business, designers and developers should NOT be blamed for poorly designed websites. We don't know the stories behind the bloated, nefarious designs.




My kingdom for restaurant websites which have a text-only one page version of the menu without so much javascript that google translate can't even parse it and nothing is copypastable.


06:43 @romanzolotarev Oh my,... won't name sites. Dear web developers, if your webapp weights more than 500KB (on page load) you are doing it wrong. If it's a static page (text and pictures) it should be in 1KB-100KB range. #IndieWeb (twtr.io/1hhT9hMyM9L)

"How to Design for the Modern Web (medium.com)" https://medium.com/commitlog/how-to-design-for-the-modern-web-52eaa926bae2 https://news.ycombinator.com/item?id=17897404

Since it's a medium.com article in 2018, that means that I access the post with styling and JavaScript disabled. I see a simple, plain, HTML text view. No CSS. No images.

Over the past couple years, medium.com's web design has plummeted into the sewer on all screen sizes. It's so bad, that I mostly avoid reading medium.com articles on all screen sizes.

At one time, medium.com had a comfortable reading environment. That HN thread contained over 80 comments. The top comment is wonderfully sarcastic and poignant.

Medium is doing a great job on the web design front already! Reading on my laptop, there was a fixed header taking up about 10-15% of my screen height (in fullscreen), a fixed social networking sidebar on the left and a lovely dialog box attached to the bottom of the screen letting me know I can subscribe to Medium for $5/month. Best part about that bottom bar is that just in case I haven't subscribed when I closed it out, it is more than happy to show up after a few minutes when I've switched back to the tab, as it did while I was composing this commentary.

Really helps the readability of the website, which I especially appreciate on a website devoted entirely to reading. Really makes me wonder why Firefox even bothers with a Reader mode when we have high quality web design like this!

Let's never go back to hosting our own writing on our own websites. Medium has made that obsolete.

It makes me wonder why supposedly talented designers and engineers display content so horribly. Is it by choice, or are the developers forced to create a hideous reading environment? Medium.com started with a focus on text in an era when so many silos were focusing on images and videos. But medium.com's reading environment on the web is too rugged. No wonder they hawk their native mobile app so much.

Another HN comment:

It’s impossible to read any article regarding web design on Medium without having to seriously question whether it’s intended to be satire or not.

Another HN comment:

In a mobile browser, they also obscure the article text with an "Open in app" button that stays on screen as you scroll.

It's infuriating. Disabling JavaScript does not help because much of medium.com's irritating issues are done with CSS. Hence the reason for disabling basically everything via uMatrix.

Unrelated to this topic, but it applies to my thoughts about media orgs, especially legacy newspapers that have websites.

Even as a subscriber news sites are bloated. I wish the subscription got me an optimized mobile first view

BTW, the medium.com article is satire too. It's hilarious. Create an article about intentionally making bad web designs and post the joke to a website that uses bad web design.

From the medium.com jokey article:

Always Bet on JavaScript

These modals obviously require JavaScript, and of course it’s important to have endless scrolling but make sure you future proof yourself by using the latest framework. You may think “oh it’s only a couple of modals” today. But in the future, it may be many many more modals and oh boy! When that happens you’ll regret you did not make an isomorphic application with React and code splitting!

Don’t know anything about web development at all? Don’t worry you can just take a week long bootcamp!

Brilliant. The writer used medium.com to rail against websites like medium.com.

Conservative web development (drewdevault.com) https://drewdevault.com/2018/09/04/Conservative-web-development.html https://news.ycombinator.com/item?id=17908691

Sep 27, 2018



HN comment:

This feels like the trillionth time I read a rant like this, but that doesn't make it less true or important.

The more railing against useless, obnoxious, bloated web design, the better.

Another HN comment:

I for one absolutely agree with you that the web has gotten too bloated, especially news websites, and I salute your initiative for "debloating" your blog. This is a path that more of us should follow.

I think Google's AMP project is a step in the right direction, but it seems to me that it's not adopted enough.

Hell no to AMP, which is a reaction to bloated web design. AMP helps Google. If publishers like the results of AMP, then publishers should create their own lightweight websites.

Another HN comment:

As long as advertising corporations keep control of your web browser, the experience will always be optimized for ad tech and maybe some other megacorp interests, but definitely not for you. That's the problem.

I imagine a web browser actually made for the user will try hard to extract content from web pages by default, render them with decent consistent UX and not blindly let third parties display and execute any crap they want. There is no reason to render say articles with comments in thousand different ways for thousand websites. And no reason it can't be blazing fast on RaspberryPi.

Here's the web browser that I execute from the command line in a terminal shell on my old Linux desktop computer: links2 -g. It's stunning how fast the web is over a broadband connection.

The web is fast even over a slow connection. Websites are slow. Newer web browsers are running slower on older CPUs because modern web browsers are doing too much.

links2 -g shows how fast the web can be over a modern internet connection.

Another HN comment:

I stopped doing casual web browsing years ago when many web owners started to bloat their websites with an extravagant amount of ads, excessive JavaScript code, and unnecessary "enhancements" (like custom scrolling experiences and the like). Nowadays, websites like Hacker News, the old Reddit interface, Craiglist, and similar are the only ones that I find myself visiting, just because they are among the few hundreds that still load with server-side generated HTML and the UI is fairly minimal.

From the fabiensanglard.net article:

It used to happen sporadically but now it is a daily experience. As I am browsing the net I click on a link (usually a newspaper website). The page starts to load. Then I wait. And I wait. And I wait. It takes several seconds.

I don't remember the sporadic days.

Once loaded, my patience is not rewarded since my MacBook Air mid-2011 seems to barely be able to keep up. Videos start playing left and right. Sound is not even turned off by default anymore. This shitshow festival of lights and sounds is discouraging but I am committed to learn about world news. I continue.

I have the silly idea to scroll down (searching for the meaty citations located between double quotes) and the framerate drops to 15 frames per second. Later, for no apparent reason, all fans will start running at full speed. The air exhaust will expel burning hot air. MacOS X's ActivityMonitor.app reveals countless "Helpers" processes which are not helping at all. I wonder if the machine is going to die on my lap, or take off like a jet and fly away.

What a sad sight. My laptop is seven years old but its specs are far from ridiculous.

And most of those websites are probably read-only websites. But the publishers need to make money, and their business models are based around serving ads.

Browsing-only websites are obsoleting my hardware, operating systems, and web browsers. That's hideous. I'm not playing brand new video games.

More from the article:

A computer this powerful should be able to render any web-page gracefully. It used to do the job correctly a few years ago. It can't anymore even though it doesn't look like the new content is doing much more than it used to.

There is hope. More and more of us are becoming aware of the problem. I am confident things will end up going in the right direction. On a personal note, it was time for this website to change.

This is the new layout of fabiensanglard.net. I have removed the comment sections, most of the Javascript, and I will try to stick to text, svgs and webp images. I may have gone over the top by removing almost everything but this will be an interesting challenge to attempt to achieve more with less.







"Please Google, let us revert to the classic Gmail look"



Product managers at Google (and everywhere else) don't get promoted for leaving good products alone.

Another HN comment:

"If it ain't broke, redesign it."

From the Sep 28, 2018 Google forum thread starter post:

Yesterday when I looked in my gmail, everything was fine, how I liked it. A while back I used the "Go back to classic Gmail" option and thought it would stay in the classic 2011 look. Now today, the option is gone, and my inbox was forcibly changed to the new look that is absolutely horrid!

The classic design of 2011 that we've been using until now was perfect. None of this "cleaner" and "simpler" none-sense that google has just shoved into our faces.

I still get the old, basic webmail-like UI for Gmail by disabling JavaScript, but I have only been able to get this to work in the past by using desktop/laptop web browsers. Even with JS disabled in my phone's web browser, I have never been able to get the old Gmail look.

An answer in the Google forum thread:

Here's how to get rid of it now:

1.) Reload gmail page
2.) In the bottom right side of the screen click "Load Basic HTML (for slow connections)"
3.) You will be on a version of gmail that's ugly, but not nearly as ugly as their new UI.
4.) At the top of the page click to set default view to HTML.

I absolutely refuse to EVER use this new gmail. I'm going to use HTML until they either get rid of it or change to something actually good. If a fair portion of the 1.4 Billion users do this it would send a stronger message than any petition ever could.

BTW, the Google forum UI/UX design is wretched. With JS disabled, this is all that I see:

To use Google Groups Discussions, please enable JavaScript in your browser settings, and then refresh this page.

The Hurricane Web https://mxb.at/blog/hurricane-web/

As Hurricane Florence makes its way across the US southeast coast, many people are stuck in areas with severe flooding. These people rely on outside information, yet have limited bandwidth and power. To help them, news platforms like CNN and NPR provide text-only versions of their sites.

Text-only sites like these are usually treated as a MVP of sorts. A slimmed-down version of the real site, specifically for emergencies.

I’d argue though that in some aspects, they are actually better than the original. Think about it- that simple NPR site gets a lot of points right:

  • It’s pure content, without any distractions
  • It’s almost completely fail-safe
  • It’s responsive by default and will work on any device
  • It’s accessible
  • It’s search engine friendly
  • It’s machine readable and translatable
  • It has close to perfect performance scores:

Most importantly, it’s user friendly. People get what they came for (the news) and are able to accomplish their tasks.

This is the web as it was originally designed. Pure information, with zero overhead. Beautiful in a way.

The “full” NPR site in comparison takes ~114 requests and weighs close to 3MB on average. Time to first paint is around 20 seconds on slow connections. It includes ads, analytics, tracking scripts and social media widgets.

Meanwhile, the actual news content is roughly the same. The articles are identical - apart from some complementary images, they convey exactly the same information.

I recently read this great article by Alex Russell, in which he compares Javascript to CO2 - in the sense that too much of it can be harmful to the ecosystem.

Javascript enables us to do amazing things and it can really enhance the user experience, if done right. But it always has a cost. It’s the most expensive way to accomplish a task, and it’s also the most fragile. It’s easy to forget that fact when we develop things on a highspeed broadband connection, on our state-of-the-art devices.

That’s why websites built for a storm do not rely on Javascript. The benefit simply does not outweigh the cost. They rely on resilient HTML, because that’s all that is really necessary here.

That NPR [text] site is a very useful thing that serves a purpose, and it does so in the simplest, most efficient way possible. Personally, I’d love to see more distilled experiences like this on the web.

Figure out what the main thing is people want from your site and deliver it - using the simplest, least powerful technology available. Make it withstand hurricanes.


JavaScript is the web’s CO2. We need some of it, but too much puts the entire ecosystem at risk. Those who emit the most are furthest from suffering the consequences — until the ecosystem collapses. The web will not succeed in the markets and form-factors where computing is headed unless we get JS emissions under control.



Oct 23, 2018

"Show HN: Websites in 2018 (bloomca.me)"

https://news.ycombinator.com/item?id=18284910 - 20 comments

Top comment:

I found a few bugs: It only has one JS file, and that isn't even 3KB. Needs to be at least 3MB. uBlock only blocks 2 items, not 30. It doesn't have infinite scroll, a sticky header, a fake chat window that pops up and says, "Shana from support is here to answer any questions"... Oh, and the back button still works.

The bloomca.me website is also humorous. After making it through the intentionally annoying dialog boxes, the final page states:

Of course, it is not that bad, and there are plently of good, fast and responsive websites, which are not bloated with unnecessary information and poorly working widgets.

Maybe somebody will think why is it so and what can we do to avoid it in the future to improve web experience.


Jeremy read through all the blog on 2003-2006 research... Good stuff... None of the word docs open so you have to find gems through broken special characters

How could 12- to 15-year-old MS Word docs be unreadable in 2018?

This is less web design-related and more about formats.

graydon2.dreamwidth.org - always bet on text

Nov 28, 2018

"The Baseline Costs of JavaScript Frameworks (uncommon.is)"


https://news.ycombinator.com/item?id=18552375 - 208 comments

Top HN comment:

As a US based user of an A53 based phone, where I use firefox. I'm here to tell you more than 50% of the sites I visit with it (HN being one that actually works well) are completely unusable. There was a saying back in the 1990's that developers should be forced to use obsolete hardware to assure that their software was usable for the regular user, and its even more true today. Developers running around with the latest iphones does nothing to actually represent what the average user actually sees.

Picking on amazon, I generally don't even bother going to their site unless i'm on my desktop, where I can physically watch them frequently spike my 4+ Ghz CPU for a few seconds when I type in the search bar, or click a page. Doing the same on my phone or cheap tablet frequently results in a 10+ second waits. The other day I was typing on my tablet and the amazon search completion was literally taking 20+ seconds to display each character and a completion list for it.


You don't need the latest iphone for sites to work. Sites poorly impliment mobile sites if there's an app equivalent. Amazon is one of those. Their site is littered with tracking and mountains of compatability code like any other site trying to make a profit.

Over the years, my jokey, tin-foil hat conspiracy theory was that websites intentionally created horrible user experiences for mobile web users to encourage users to download the sites' native app alternatives.

HN commenter's claim:

You can disable javascript and still order stuff on Amazon - that's inspired me to take a step back and learn how to build applications around HTML forms and CGI. If it's good enough for amazon...

Impressive if true. I visited Amazon.com with uMatrix cranked up to block everything, and I could not determine how to log into Amazon. I found no login link.

November 2018

"Medium is a poor choice for blogging (medium.com)"



HN comment:

Dan Luu [1] gets this absolutely right. His blog would probably look similar on Lynx, or any text-only browser, yet is better than 99.99% of stuff "out there".

[1] https://danluu.com/

danluu.com has a great web design, like text.npr.org. Do those sites use good typography? No, or probably not. That depends upon the readers, who have a wide variety of tastes.

danluu.com and text.npr.org load fast. They are lightweight. They are not hostile web reading experiences. On my phone, I read their sites in landscape mode, since that increases the font size.

From a typography viewpoint, I would prefer a larger font size, more line height, more paragraph spacing, a narrower column of text, and maybe a couple other things.

Both sites use good contrast between text color and background color. Both sites underline their links within the body of articles.

If most websites are read on mobile devices, then the design of these websites is fine. On iOS devices, readers can choose reader mode within the Safari web browser, which allows them to customize the text and background colors, font sizes, and font types. Perfect. Reader modes exist, or probably exist, in other mobile web browsers; a browser extension might be required.

Here are comments from the HN thread about Medium being a poor choice for blogging.

This is a reply comment to the person who mentioned danluu.com.

He took it way too far. Long lines are unreadable and 50 more bytes of CSS would fix it.

Another reply about danluu.com's brilliant web design:

On my regular browser size this looks something like this [1], which is borderline unreadable due to the width alone. Sure I can resize the browser at least, but even then I am finding the posts hard to read due to the font-size, letter spacing and line-height. The browser default styles did work perfectly in the 90's, but they didn't evolve when the devices use to browse the pages did. And yes, you can always use Lynx, adjust the styles myself or use a CRT display, but I most likely won't bother.

When viewing danluu.com on my desktop or laptop, I increase the font size of the site within the web browser. Problem solved. The long lines bother me much less than the massive bloat that accompanies so many websites.
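For what it's worth, the "50 more bytes of CSS" that commenters keep mentioning would look something like the following hypothetical rules. This is only a sketch of the commenters' suggestion, not Dan Luu's actual stylesheet.

```css
/* Hypothetical rules -- a sketch of the commenters' suggestion,
   not danluu.com's actual stylesheet. */
body {
  max-width: 38em;  /* caps lines at roughly 60-80 characters */
  margin: 0 auto;   /* centers the column of text */
}
```

Even this tiny fix is subjective: 38em is one common guess at a comfortable measure, and some readers would still complain that the column is too narrow or too wide.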

Another reply that I agree with:

The right place to fix the long line issue is on the browser side, with user in control of what makes the line "long" - since it's inherently a subjective metric.

If most websites were like this one, I could write a simple one-page CSS that formatted them the way I want - including fonts, colors, lines etc. As originally intended by this whole HTML thing.

Even links2 -g gives me, the reader, the ability to make minor adjustments to how web pages get displayed.
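The "simple one-page CSS" that the commenter imagines is still possible in Firefox through a user stylesheet, though it only wins reliably on lightly styled sites. A sketch, assuming a Firefox profile with the toolkit.legacyUserProfileCustomizations.stylesheets preference enabled:

```css
/* <profile>/chrome/userContent.css -- a reader-controlled stylesheet.
   Recent Firefox versions only load this file when
   toolkit.legacyUserProfileCustomizations.stylesheets is set to true. */
body {
  font-size: 18px !important;   /* the reader picks the size */
  line-height: 1.6 !important;  /* and the leading */
}
a {
  text-decoration: underline !important;  /* always underline links */
}
```

On heavily styled sites these rules lose out to more specific site selectors, which is exactly the complaint: the reader's preferences were "originally intended" to win, and in practice they rarely do.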

Another reply:

Just resize your browser window: it's not the site's job to guess how you want content displayed.

I loved browsing the web in the 90s: I had a half-width browser window on one side of my screen, an editor window in the upper quadrant of the other side and a console window the the lower quadrant. It was great! Then suddenly every website decided that I must have a full-width browser window, and was ludicrously narrow with a half-width browser. And they all stopped using and started abusing CSS. And then they all started using & abusing (but I repeat myself) JavaScript.

All I really want from the Web in 2018 is the experience I had in the 90s, and I can't get it any longer.

Modern web browsers are huge programs, compared to web browsers of the late 1990s. Yet today's bloated and complex web browsers do not provide users with the ability to easily modify how web pages get displayed. Some users prefer to read light text on a dark background; that requires an extension. Some users want more or less contrast between text color and background color. Some users want a larger font size while others prefer a smaller one. And on and on.

The content is what matters. Good typography helps readers, but what I consider to be good typography could be considered mediocre or bad typography by someone else.

While I like the web designs and typographical choices for my websites, such as ...

... other users might dislike my choices.

Reader modes with some typographical options should be default in web browsers and easily usable.

Reader modes help make obnoxiously designed websites easier to digest, assuming that the reader mode option can function on such a website. Reader modes, however, do not stop the downloading of massive web bloat.

HN reply:

Here's the thing: your browser has a handy tool to increase/decrease font size. Usually it will also scale the images in sensible ways. And it's available via keyboard shortcuts, so it's easy to zoom in/out. Also this format works great with the browser's "Reader" function.

I don't understand this response when viewing danluu.com on a laptop or desktop web browser.

I know people keep saying this, but resizing the browser is not really a viable solution with today's messed up web. Sure, I can resize the browser for this blog using full width lines but then I have to resize my browser again for another website because the other website chooses to waste half of the horizontal real estate with meaningless stuff. As a user, I can't keep resizing the browser for every site I visit.

HN reply that I agree with:

I have a 40in monitor, a 13in monitor, and a 7in monitor.

It's impossible for a site developer to know how much space to allocate to their page.

On the 40in screen it would be asinine for me to maximise it, for example.

It's completely unsurprising to me that I can resize the window into a state where the website becomes silly. I just don't do that.

Resizing a window is trivial, it's not that you can't, it's that you won't.

edit: Obviously I know what responsive design is. I'm on Hacker News. The author hasn't done it (probably because they have better things to do), you have a trivial option to work around it.

The contrast to be drawn is with a site like Medium that deliberately goes out of its' way to be annoying and harder to use for some vague profitability goal.

And a bizarre reply:

Or the website could just set a width on the text column. Why are we pretending this website is some paragon of design? Simple is good but there are still basic readability standards that are important and can be achieved with one or two lines of css.

Clearly that commenter has not played around with webpagetest.org.

Compared to most websites that I encounter, danluu.com's web design is brilliant. Dan's typographical choices are lacking, but that's different than web design, in my opinion.

Medium.com uses good typography, but medium.com's design is obnoxious with its bloat and fixed headers and banners and other hostile annoyances.

Content is the key. Dan wrote about obnoxious web designs that pollute the landscape.


More about Dan's design and typography choices:

The total lack of styling makes it almost unreadable. Even if you resize the window to get a sane page width the lack of margins makes it uncomfortable to read.


The question you should be asking is, why is the default CSS in your browser making unstyled websites uncomfortable to read? The browser could provide sane defaults for margins etc. It just chooses not to - probably because everything it renders is going to be styled anyway, so why bother?

Ironically, the "reading mode" thing that is becoming popular is essentially that - stripping the page of most of its custom styling, and then applying some sensible default style. It's like we went full circle, all the way back to pre-CSS times.

If ad blockers, disabling JavaScript, and reader modes continue to increase in popularity, then most web reading done on mobile devices will make most websites look the same: like paperback books of text with some images. I see nothing wrong with that if the main reason to visit information-based websites is the content.

What's the point of creating a sophisticated CSS file or a series of CSS files with complex media queries when users reading the articles select reader mode?

Piggybacking on the above ...

December 2018:

"The State of Web Browsers – 2019 edition (ferdychristant.com)"



Top HN comment:

This article celebrates the proposition that "Almost every user will experience websites the way they were intended to be experienced." I doubt this is a good thing.

I enable Safari or Firefox Reader View. I no longer "experience websites the way they were intended to be experienced." Because it's way, way better! When will Chrome implement a reader mode? Dumb question: they barely snuck it in on Android and drag their feet everywhere else.

Browsers may be the user's ally or adversary. We know which one Chrome is. Microsoft has one hell of an opportunity here - not in a financial sense, but in a way that could greatly benefit Windows users.

Within the Chrome web browser on my desktop and laptop computers, I use the extension Mercury Reader to get a reader mode-like functionality.

Another HN comment:

Browsers need to be on the side of users, if only because authors are demonstrably terrible at user experience.

This isn't just because there's intrusive, bright/colorful/motion ads everywhere (though that's part of it). It's because authors suck at font choice, text size, line length, line height, text/background color & contrast, reflow across device sizes, and all the other basics of written content.

Browsers need to also do things like:

  • make content more accessible to people with visual impairments and other disabilities
  • filter out content that doesn't need to be loaded
  • translate content into your language
  • make content appear consistent across sites and according to your preference (e.g. most recently, dark mode)
  • protect users from malicious content
  • bake in standard functionality like sharing & bookmarks

Another HN comment:

I agree that authors suck at font choice, text size, line length, line height, text/background color & contrast, etc. I generally do not specify any in documents I write, instead the client will render the document according to the user settings; this also reduces the file size. (In fact, I would even want to ensure most of them will work on Lynx, too.) (Also, many thing I write I use plain ASCII text files anyways; that is far more portable than HTML anyways)

It need on the side of users not only because authors are demonstrably terrible at user experience, but also the user may wish to override anything.

Great HN comment:

I want a browser that defaults to “reader view” ... I don’t even want to see the web as it was “intended to be viewed” if it means endless tracking, wasting my battery life & optimization for addictive ness.

Remember when stylesheets were just a suggestion & you were supposed to be able to swap it out for your own preferences?

I'm convinced that on my old iPhone 5C, the battery drains faster when I have JavaScript enabled. The main app that I use on my phone is the Safari web browser. Sometimes, I have several tabs opened. If JavaScript is enabled, not only does my phone get slow when trying to load so many bloated web pages, but the battery also seems to drain faster.

Another HN comment:

  • 63% of browsing is done on mobile. Having a dominant position on the desktop doesn’t mean much.


If that 63 percent number is true, then why did so many HN commenters whine about the long line length at danluu.com? I'm guessing that many of the commenters were developers and designers, reading HN from their desktop or laptop computers.

Obviously, I don't have a long-line problem when reading danluu.com in landscape mode on my phone. And if I choose reader mode, I can view the site comfortably on my phone held in portrait mode.

The whining by many HN commenters was misplaced, and it shows how some tech people may be out of touch with how many other people read the web.

Dec 22, 2018

"Blueprint for a more accessible internet (qz.com)"



From the qz.com article:

Using his platform as Google’s “chief internet evangelist,” Cerf has been proselytizing about how bad design results in injustice. ”It’s almost criminal that programmers have not had their feet held to the fire to build interfaces that are accommodating for people with vision problems or hearing problems or motor problems,” Cerf says.

The HN thread contained only five comments. Is that telling? Does that mean that the web developers and designers who frequent HN don't care about good web design, especially concerning accessibility? Designing websites for accessibility might interfere with using shitpiles of JavaScript.

Disturbing HN comment:

Serious question: who is paying me as a freelancer or small business owner to work on mobile accessibility? It's not the client.

The amount of work and hours put into developing and maintaining (nevermind learning from scratch) is hugely exceeded by the 10-15% potential increase in revenue, which I'm dubious about to begin with. Is the 200 hours/year it would take to maintain at $100/hr worth $20,000 in incremental revenue? I just don't seethat happening.

I don't disagree with most of the article, but the expectation that "you should do this without considering opportunity cost" makes the entire argument moot.

What if new building designers took a similar approach? Why design a new building to be accessible for a small population of wheelchair users? Well, it's a law in the U.S.

That HN comment suggests that designers and developers MIGHT be responsible for the large number of horribly bloated and clunky web experiences.

Designing web page layouts for screen readers (benrobertson.io)




I never understood why these webpages can't have like, 4 lines of CSS to make them much more readable. Preserve the older aesthetic I guess?


As pointed out elsewhere, the post is from 1999. You're effectively asking the author to, apart from creating the content, to also maintain it over the years to someone else's arbitrary satisfaction. I'm not sure if that's fair. The author likely has other project's they're working on, and maintaining the look of something they wrote 18 years ago isn't a priority.

another reply:

Maybe because the author's expertise is something other than writing HTML, so he picked up an old book on HTML, marked it up, and that's it. That the browser can render HTML written 20 years ago is quite a virtue. If it only takes 4 lines of CSS to make it more readable, then the page's lack of readability is more an indictment of the browser (which could do this tidying itself) rather than of the author, who should not have to continually update HTML so it renders well on recently-invented devices.

2019 HTML/CSS/JavaScript may be outdated in 2039. But it's possible that in 2039, web browsers, assuming they still exist, will still render simple HTML, maybe even the <center> tag, which was removed from the HTML standard several years ago but is still supported by browsers.

If 1990s HTML is not supported by so-called modern web browsers in 2039, old browsers, such as links2 -g, will still exist and can be used to read sturdy HTML pages.

Even the links2 -g web browser from 15 to 20-plus years ago supports basic typography mods, which dictate how all web pages display.

Another HN reply that's, well ..., forget it. I'll be nice.

then the page's lack of readability is more an indictment of the browser (which could do this tidying itself)

I strongly disagree with this. The browser should do nothing it is not explicitly made to do, which is one of the reasons it can still render HTML from 20 years ago. We used to have browsers that tried to do that kind of thing and we're only now extracting ourselves from that mess.

It is 100% on the website author to make their page more readable.

Define "more readable"? This commenter's outrage at an informative, web-based book is severely misplaced, considering the tons of unreadable modern websites that exist NOW. The commenter would be in a perpetual state of anger after paying attention to most modern web design, which includes massive bloat.

The book is READABLE on all devices TO ME.


As of Jan 3, 2019, that large web page displays in the Safari web browser on my old iPhone 5C, which I acquired in June 2014; the 5C itself was released in the fall of 2013. I upgraded iOS once, from 7.x to 8.x.

I can read that page better by holding the phone in landscape mode and zooming in. It's better than many modern websites that use an uncomfortable, microscopic font size and prevent zooming in.

The Safari reader mode is unavailable on my phone for finseth.com/craft, but I can read it on my phone per the above paragraph.

BUT ... it's a book. A web-based book about text editing. If I truly wanted to read it, I would use a desktop, laptop, or tablet computer. But I can read it on my phone with JavaScript disabled. Again, many so-called modern websites fail to display TEXT when JavaScript is disabled.

Here's a good response to the person's flawed viewpoint about how an author should update a web page from 1999 to look good in 2019.

The whole point of vanilla HTML is that it has few presentation details. It has some headings, bold, italic, and such. If the page does not specify margins or font size, the browser absolutely should set these so it is most readable on the device. If I write plain HTML today I am not optimizing for some VR headset that will be used twenty years hence. The headset should render the plain HTML in a manner faithful to the semantic markup, not so it looks the same way it looked on Netscape with a VGA screen.

Headings, bullet points, bolding, italicizing, and paragraph size are presentation choices made by the author. In my opinion, a large web page that uses headings, bullet points, and small paragraphs is easier to digest than if the same page used no headings, no bullet points, and large blobs of text per paragraph. The author CHOOSES how to display the content.

Margin spacing, font size, font type, text spacing, letter spacing, paragraph spacing, text color, and background color are typography related, in my opinion. Some web browsers, or browser extensions, that support some kind of "reader mode" functionality will also offer a few typographical choices.

Web page authors can make typographical choices, but these are subjective. I like large font sizes, but others with perfect vision might prefer small font sizes. My typographical choices used at sawv.org may please some readers while displeasing others.

Typographical choices should be available to readers via the web browsers. Safari's reader mode on my old iPhone permits me to choose a font size, a font type, a text color, and a background color. It's about the only way I can comfortably read many modern websites, provided that the reader mode option is available for pages. I forget why Safari's reader mode option is unavailable for some web pages.

Jan 2019

"Add limits to amount of JavaScript that can be loaded by a website (webkit.org)"



JavaScript vs no or less JavaScript is always a highly intellectual conversation on Hacker News. [eyeroll].

Thankfully, the web READING experience is still CONTROLLED by the READER.

And thankfully, the web is big. Some say it's a world wide web. And the amount of content that exists that works fine without JavaScript is still too much info for me to consume.

As long as text displays without JavaScript, then I'm good.

At least one person in that HN thread understood the difference between web applications that require users to log into the app to perform work and a website that is meant for browsing-only readers.

What about web apps?

I think it's important to make this distinction. A lot of websites don't need the JavaScript they ship with to present the material the users want. I agree those websites should work without JavaScript.

However, you can't expect actual web applications to do this.

And I don't expect web apps to function without JavaScript.

Newsflash: my JavaScript editor fails to function when JavaScript is disabled.

Banking, tax preparing, project management, and even shopping sites, generally are private functions that require user accounts and require users to log into the services before performing tasks.

But a website like toledoblade.com should not require JavaScript to read a 500-word editorial. That's not web design. That's shit. That's an abomination.


Feb 2019 - good stuff in the HN thread. Lightweight versions of websites without all the bloat (github.com) https://github.com/mdibaiee/awesome-lite-websites https://news.ycombinator.com/item?id=19239615

We Must Revive Gopherspace (2017) (matto.nl) https://box.matto.nl/revivegopher.html https://news.ycombinator.com/item?id=19178885

March 2019 "Web Design 3.0: When Your Web Design Matters (nicepage.com)" https://news.ycombinator.com/item?id=19309821

The nicepage.com article is not worth reading. I prefer the HN comments, like this one.

My requirements for a website are: I get the information I need then I leave ASAP. This is held up if the site requires cookies, or blocked outright if it requires scripts eg. to present text (q.v. the washington post). That is about all.

I want usable websites (= useful info concisely and safely presented, with respect for others' disabilities), but creative types often don't seem to care (edit: or know; they're often clueless about usability).

That's why I consider this personal website to be one of the best designed websites.

HN geeks whine that Dan Luu could improve the readability of his site by adding only a few lines of CSS, but that's subjective. Improve it for whom? Dan, some HN readers, me, who?

Maybe I prefer large-ish font sizes, but some HN geeks prefer a microscopic font size. If Dan included a few lines of CSS, then people would still complain: The text area is still too wide, now it's too narrow, line spacing is too big, font size is too big, I prefer serif fonts, paragraph spacing is too narrow, the contrast between text color and background color is too great or too small, etc.

This is why typography should be controlled by each READER. Modern web browsers are huge computer programs, but the browsers offer few or no typographical customizations to readers. Firefox's reader mode/view is a good starting point.

This would flatten or homogenize websites. It would remove the so-called creativity of web designers and make all web ARTICLES meant to be READ look the same.

But the articles are not the same because the CONTENT will be different.

Reading about the Cleveland Browns is different than reading about meteorology, which is different than reading about shepherding, which is different than reading about programming in Lua.

I visit many websites for the CONTENT and not for the web designers' artistic creativity in how web pages look. If I cannot read the text, then the creativity is a useless, massive failure. This is why web browser reader modes/views exist.

Kindle users are accustomed to books looking similar on their devices.

I skimmed and searched through the nicepage.com article, and I never saw craigslist.org mentioned. Craigslist has been useful for over 20 years. That's damn good design.

Even the HN thread failed to mention Craigslist. How does a discussion about web design fail to mention Craigslist?

An HN commenter mentioned this:


Users spend most of their time on other sites. This means that users prefer your site to work the same way as all the other sites they already know.

For some reason, that nicepage.com article mentioned Pinterest many times. I consider Pinterest to be the worst designed website of all time. Its UX is atrocious at best. It breaks normal, expected functionality. I detest Pinterest. I refuse to click on Pinterest links when the site pollutes search results.

The nicepage.com article did not mention the problem of too much creativity, such as using useless images. How does a discussion about web design not mention bloat?


The HN thread mentioned bloat.

how to ensure you can stay employed to shit out JavaScript and bloated styles.

Another commenter:

Right now it seems that the website builder services such as Squarespace are of great appeal to people. A few years ago a freelance designer could build a website for a local business with Wordpress but now they use Squarespace. The bloat does not matter to them. It looks good and who cares if it is not using CSS grid?

I can't see website builder services such as Squarespace being able to keep people happy with bloated web pages forever and, when it becomes possible to get results with native HTML5/CSS without having a behemoth of a dev team, there should be design progress.

March 13, 2019

This is about accessibility.

"The Web We Broke (ethanmarcotte.com)"



HN comment:

This is sad, but in no way surprising. It has the same root cause as many websites misusing javascript to a point where they become unusable, and that is people not really knowing how to use the tools they are given.

March 19, 2019

"Firefox 66.0 (mozilla.org)" https://news.ycombinator.com/item?id=19430684

With this update, Firefox is introducing scroll anchoring, which ensures that you’re not going to bounce around on the page as these slow-loading ads load.

While I love the idea of videos not automatically playing, I'm almost more excited for the scroll anchoring feature.

Another comment:

Same. Have you ever tried to click a link only to accidentally click something else because the page won't stop loading? It's infuriating.

That's thanks to ad-bloated, image-bloated, or who-knows-what-bloated websites, whose business models either depend upon the bloat or whose owners designed without empathy. They don't care about the user experience.

"Text-only news sites are slowly making a comeback. Here’s why (poynter.org)" - 2017 story https://www.poynter.org/tech-tools/2017/text-only-news-sites-are-slowly-making-a-comeback-heres-why/ But it appeared again at HN on March 19, 2019. https://news.ycombinator.com/item?id=19429792

HN comment:

Advertisement in modern world is so insanely rotten to the core, it is disgusting. These people have made the web unbearable.

Yep. It's why I abhor ads on websites, and I block crapware as much as possible, by either disabling JavaScript or using uMatrix cranked up to block everything but HTML text. I enable/disable on a site-by-site basis, but by default, uMatrix blocks everything when I visit a new domain.

"I Used the Web for a Day on Internet Explorer 8 (smashingmagazine.com)" https://news.ycombinator.com/item?id=19430291

HN comment:

Google uses another approach to backwards compatibility that devs may want to consider

The commenter gave a good explanation and concluded with this:

Old browser users generally don't expect the latest & greatest features; if they did, they'd be on newer browsers. And old browser users don't pay the performance cost for new features, which can matter a lot when they're also on old computers and old connections.

It might be a slow internet connection and not an old internet connection.


This article is part of a series in which I attempt to use the web under various constraints, representing a given demographic of user. I hope to raise the profile of difficulties faced by real people, which are avoidable if we design and develop in a way that is sympathetic to their needs.

Holy crap. Someone who cares.

Last time, I navigated the web for a day using a screen reader. This time, I spent the day using Internet Explorer 8, which was released ten years ago today, on March 19th, 2009.


Apr 26, 2019 - about Gmail's bad design.


I'm going to admit I wasn't even aware of the current design, probably because other apps function well for my phone, and I've been using the basic HTML alternative on my desktop for many years.

That fix actually resembles something I do to websites, ad-blocking away all the unnecessary elements, siderails, wrappers, navbars, footers, etc. It's very cathartic spending a few seconds resculpting a website to reveal only the relevant content. And the end result always convinces me that less is more when it comes to web design, that regular text and simple formatting should be prioritized higher.

HN reply that sounds like me:

I also use the basic HTML view on desktop because 1) I don't want to leave the tab open with the modern UI since it inevitably uses up over a GB of ram by itself and 2) I don't want to close and reopen the tab with the modern UI since it takes O(10)s of seconds to be responsive.

I remember getting access to gmail when it was in the invite-only phase and thinking that it was absolute magic, that this was the future, and that I'd never use anything else. It only took a huge pile of JS to undo that. I use basic HTML now for gmail and I've switched a lot of what I do to fastmail.

Fastmail is my primary email management system. Gmail is a backup that I rarely access. When I access Gmail on a desktop or laptop computer, I access it with JavaScript disabled, which permits me to use Gmail's basic HTML interface.

Fastmail requires JavaScript, unfortunately. But I'll give credit to Google for maintaining a JavaScript-free version of Gmail. For some reason, however, the HTML version of Gmail cannot be accessed from mobile web browsers.

Great HN reply comment:

And the end result always convinces me that less is more when it comes to web design

Careful with that phrase, as apparently a large number of designers think it means less content too!

Hilarious. It does seem that web design today means using tons of needless JavaScript in the web browser to display text, because without that JavaScript bloat, the text cannot be displayed. That's how the new Toledo Blade website functions: a media org's website that displays no text in browsers that do not support JavaScript. Great job on informing the public. The Blade, like far too many media orgs, creates hostile web reading experiences.

Apr 26, 2019 tweet:


Unpopular Opinion: Outside of interactives and ads there aren’t too many reasons to have more than a basic level of JavaScript on news sites

Why is even a basic level of JavaScript needed? https://text.npr.org uses this tiny amount of JavaScript, which is unnecessary.

<script type="text/javascript">
    // Frame-busting: if this page is loaded inside a frame,
    // navigate the top-level window to this page's URL.
    if (window != top) {
        top.location.href = location.href;
    }
</script>

How healthy is the internet? (internethealthreport.org) https://internethealthreport.org/2019/ https://news.ycombinator.com/item?id=19745901

Top HN comment:

There's some absurd irony here in that the report about the health of the Internet is an omnishambles of the absolute worst practices on the modern internet.

I could not grok the web version. I got headers and those took me to tiles and those took me to blog posts? I was lost. In the end, I downloaded the PDF to read this.

And that's a lie. To start with I had to tell NoScript to allow their third party domain scripts because the content did not work without javascript.

They make some interesting points, but the standard scientific paper format vastly outperforms Web 4.0 routers and widgets for conveying it.

Modern web design has become so unnecessarily complex and bloated that now the PDF versions of web pages are LIGHTER and EASIER to download and read.

JavaScript Is the CO2 of the Web [audio] (changelog.com)



JavaScript should be used like a condiment instead of a whole meal. It's led to the bastardization of the web in my opinion.


This website isn’t trying to say that all websites should seek to load with just 7KB of data transfer. The point it’s trying to make is that websites have become hugely obese in the past decade, and that as well as all of the other negatives that go with that, it’s also bad for the environment.

We already have Facebook and Google implementing Facebook Instant Articles and AMP respectively. But if we just built better, leaner websites, projects like those wouldn’t be necessary.

Is common sense permitted on the web in 2019?

So what Susty WP is really trying to do is encourage those who work on the web to think twice when it comes to, for example:

  • Implementing a carousel. The evidence suggests users neither like them nor use them.
  • Embedding a YouTube video vs just linking to it. YouTube embeds typically cost at least 1MB before the user even clicks on them.
  • Filling pages with cruft like endless sidebar content, ads, fancy flipping click-to-reveal navigation and distractions that are unlikely to add value and might ultimately turn users off.

And embedding social media posts that can bloat web pages significantly.
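One common compromise between embedding a video and merely linking to it is the "click-to-load" facade: serve a plain link first, and inject the heavy iframe only if the reader clicks. A minimal sketch in plain JavaScript; the function names and markup are my assumptions, not taken from Susty WP or any quoted page:

```javascript
// Build the ordinary watch URL and the embed URL for a YouTube video id.
function watchUrl(videoId) {
  return 'https://www.youtube.com/watch?v=' + videoId;
}
function embedUrl(videoId) {
  return 'https://www.youtube.com/embed/' + videoId;
}

// Render a plain link; swap it for the ~1MB iframe only on click.
function makeLazyEmbed(doc, videoId) {
  const link = doc.createElement('a');
  link.href = watchUrl(videoId);
  link.textContent = 'Play video (loads YouTube on click)';
  link.addEventListener('click', (event) => {
    event.preventDefault();
    const iframe = doc.createElement('iframe');
    iframe.src = embedUrl(videoId);
    iframe.width = '560';
    iframe.height = '315';
    link.replaceWith(iframe);
  });
  return link;
}
```

Until the click, the page carries only an anchor tag; readers who never play the video never pay the embed's megabyte.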

We don’t all live on the Wealthy Western Web. And even in the “wealthy” west, a lot of people pay by the MB for their data. Loading two web pages shouldn’t blast through 10MB of data, but for some websites that’s the reality.


Writing Less Damned Code – Heydon Pickering – btconfBER2016

Concatenating, minifying, compressing, caching: all serviceable ways to improve the performance of web interfaces. But none are as effective as not coding something in the first place. Code that doesn't exist is infinitely performant and extremely easy to maintain and document. This talk will identify some examples of front-end code that are either not needed at all, make the interface worse just by being there, or can be replaced by something much, much simpler. Say hello to unprogressive non-enhancement.

unsure if this is useful for anything.