Tag - Manifesto

assembled in 2016-2017

Most weeks, I find new and old web posts related to this subject. These pages are mainly a dumping ground for links, excerpts, and quick thoughts.


storing links for page number 7 ...

aug 16, 2017 https://news.ycombinator.com/item?id=15027715

Most content websites have become such a massive crapfest of ad-bloat, bad UX, huge page sizes and general usability hell that it's nigh impossible that I'd be able to reach the actual content of a non AMP site in the first 5-10 seconds of clicking on its link. (On my phone that's an additional 1-2 seconds for registering the tap, and 1-2 seconds for navigating to the browser)

So say what you may, AMP (or FB Instant or its ilk) will prosper until the mobile web experience stops being so crappy.

First, no such thing as the "mobile web" exists. It's the web. Often, people view the web with mobile devices, but it's the same web that can be viewed with desktop computers. Responsive web design permits sites to adjust their displays for different devices, but it's still the same web, delivered over HTTP/HTTPS.

Second, the web experience, which includes viewing the web on mobile devices, is fine. The problem is not with the web. The problem is with how websites are designed.


https://news.ycombinator.com/item?id=15210022

This is a non-interactive text-only website. It shouldn't need anything beside HTML and CSS.


https://www.poynter.org/news/text-only-news-sites-are-slowly-making-comeback-heres-why

These text-only sites — which used to be more popular in the early days of the Internet, when networks were slower and bandwidth was at a premium – are incredibly useful, and not just during natural disasters. They load much faster, don’t contain any pop-ups or ads or autoplay videos, and help people with low bandwidth or limited Internet access. They’re also beneficial for people with visual impairments who use screen readers to navigate the Internet.

Proving, once again, that the web is not slow. Websites are slow because of how they are built. But these slow web design choices are probably governed by the media orgs' business models and by user experience people who design without empathy for the users.

More from the story:

There are many ways that news organizations can improve the ways they serve both low-bandwidth users and people with visual impairments by stripping out unnecessary elements and optimizing different parts of a website. To learn more, I reached out to front-end website designer J. Albert Bowden, who frequently tweets about accessibility and web design standards, to ask a few questions about how we might approach building text-only sites to help end users.

Kramer: I’m curious. What kinds of things can be stripped from sites for low-bandwidth users and people with visual impairments?

Bowden: Those are two very distinct user groups but some of the approaches bleed over and can be applied together. For low-bandwidth users: Cut the fluff. No pictures, no video, no ads or tracking. Text files are good enough here. Anything else is just fluff.

That's good web design advice for any-bandwidth users.

[Bowden:] For visually impaired users: I’m going to just talk about a11y [which is a shorthand way of referring to computer accessibility] here. A11y is best addressed in the foundation of a website, in the CSS, HTML, and JavaScript. There are other ways to go about doing it, but they are much more resource intensive and therefore are never going to be default for mainstream.

Typical user agents for those with visual impairments are screen readers, which rely on the foundation (literally HTML) of a website to interpret its content and regurgitate it back to the user.

Kramer: Is text-only the way to go? Are there ways to think about preloading images and/or other methods that might help these users?

Bowden: Text in HTML is the way to go here; you cover accessibility issues and SEO bots, while simultaneously also being usable on the maximum number of devices possible. HTML and CSS are forgiving in the sense that you can make mistakes in them, and something will still be rendered to the user. Browsers are built with backwards compatibility, so combining them all grants you the extended coverage. Meaning that basic sites will work on nearly any phone. Any computer. Any browser.

Once you deviate from this path, all bets are off. Some are solid, no doubt, but most are shaky at least, and downright broken at worst. JavaScript is the bee’s knees and you can do anything in the world with it ... but if you make certain mistakes, your web page will not render in the browser; the browser will choke to death.
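
Bowden's last point is easy to picture. Here's a contrived sketch of my own (not from the Poynter piece): sloppy HTML still renders, while one typo in a script that draws the page leaves the reader staring at nothing.

  <!-- forgiving.html: malformed markup, but the browser renders it anyway -->
  <p>This paragraph has an unclosed <b>bold tag.
  <p>No matter; the browser guesses and shows both paragraphs.

  <!-- brittle.html: the same content, but drawn entirely by JavaScript -->
  <div id="article"></div>
  <script>
    var article = { headline: "Hello", body: "World" };
    // one typo below (artcle) throws a ReferenceError, the script dies,
    // and the reader gets a blank page instead of the text
    document.getElementById("article").innerHTML =
      "<h1>" + artcle.headline + "</h1><p>" + article.body + "</p>";
  </script>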

https://mobile.twitter.com/dansinker/status/906691093919621120

Hands down the best news site design of 2017. lite.cnn.io

That October 2017 Poynter story contained a link to a 2015 Poynter story.

https://www.poynter.org/news/designing-journalism-products-accessibility

Consider creating a text-only site. At a recent accessibility hackathon, I sat with visually impaired people who said that this text-only version of the NPR website was the best news website because their screen readers easily parsed the material.

You can also check your website to make sure that it’s usable for people with color impairments, like color blindness. Color Safe helps you choose colors that meet WCAG contrast thresholds, while the Color Contrast Analyzer simulates different forms of color impairment so you can learn which colors may work.
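
Those WCAG thresholds come from a defined contrast-ratio formula based on relative luminance. Here's a rough sketch of the math that tools like Color Safe apply (my own illustration; the sample colors are not from the article):

  // WCAG 2.x contrast ratio, sketched in JavaScript (illustrative only)
  function linearize(channel) {               // channel: 0-255 sRGB value
    var c = channel / 255;
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  }
  function luminance(rgb) {                   // rgb: [r, g, b]
    return 0.2126 * linearize(rgb[0]) +
           0.7152 * linearize(rgb[1]) +
           0.0722 * linearize(rgb[2]);
  }
  function contrast(a, b) {                   // result ranges from 1 to 21
    var hi = Math.max(luminance(a), luminance(b));
    var lo = Math.min(luminance(a), luminance(b));
    return (hi + 0.05) / (lo + 0.05);
  }
  contrast([0, 0, 0], [255, 255, 255]);       // 21:1 -- black on white
  contrast([119, 119, 119], [255, 255, 255]); // ~4.5:1 -- #777 on white, right at the AA line for body text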


Oct 14, 2017

https://www.codementor.io/kwasiamantin/undefined-cmglhfnfv https://news.ycombinator.com/item?id=15472627


https://www.neustadt.fr/essays/against-a-user-hostile-web https://news.ycombinator.com/item?id=15611122


"How the BBC News website has changed over the past 20 years" http://www.bbc.co.uk/news/uk-41890165 https://news.ycombinator.com/item?id=15730218


"Hundreds of web firms record 'every keystroke'" http://www.bbc.com/news/technology-42065650 https://news.ycombinator.com/item?id=15786887

Businesses existed before the creation of JavaScript. Conducting most business does not require a Turing-complete language and executing arbitrary code on every document. The HTML <form> is all that is needed to conduct business. The web was created as documents+forms in the IBM 3270 model, which businesses and other organizations had been using since the 1970s.
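
A sketch of what I mean, with a made-up endpoint and field names: a plain HTML form, no script required, still completes a transaction in any browser.

  <!-- hypothetical order form; the browser handles submission on its own -->
  <form action="/orders" method="post">
    <label>Quantity: <input type="text" name="quantity" value="1"></label>
    <label>Ship to: <input type="text" name="address"></label>
    <button type="submit">Place order</button>
  </form>
  <!-- the browser POSTs the fields to /orders, and the server replies
       with an ordinary confirmation document: documents + forms -->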

Another HN comment:

Whenever someone complains about a website not working without javascript enabled, someone inevitably responds "it's 2017, you can expect javascript to be enabled". I think that piece of knowledge is outdated:

  • Late 1990's: static html documents + forms
  • Early 2000's: shitty DHTML scripts that added nothing
  • Early 2010's: javascript + gracefully downgrading sites
  • 2015/16: required useful javascript everywhere
  • Early 2017: trackers everywhere, html5 popups, trackers, spyware, trackers, bitcoin miners, trackers, etc, etc.

2017 is the year where you NEED a javascript blocker. What's the use of having any security at all if you're going to leave the biggest attack vector in modern times completely unprotected?

Plus, the web has become completely unusable without a script blocker.

A person responded with:

When you exaggerate like that, it diminishes your point. I use the web all day, every day and I have never installed a script blocker.

It's not an exaggeration. It's nearly impossible to read media websites without disabling JavaScript, especially over an LTE connection, and definitely over a 3G connection. But even with a fast connection, older computers get bogged down when accessing media websites with JavaScript enabled.


January 2018

A letter about Google AMP http://ampletter.org https://news.ycombinator.com/item?id=16108553 - over 300 comments

HN comments:


Testing and ranking website response time, using 3G testing at webpagetest.org.

https://projects.hearstnp.com/performance

https://css-tricks.com/use-webpagetest-api

https://www.webpagetest.org/result/180202_3D_c86d014ace02663a6ab00ddc688b82eb/

sawv.org - homepage
From: Dulles, VA - Chrome - 3G - 2/1/2018, 10:00:43 PM
First View, Fully Loaded: Time: 1.390 seconds / Requests: 2 / Bytes In: 9 KB
100% of the download was HTML, which was 6,413 bytes.


http://evrenkiefer.com/posts/2018-01-14-migration-de-wordpress-vers-jekyll.html

My specifications were quite simple.

  • A simple site to share texts and images
  • A cheap site to consult on mobile
  • Readable source code for people learning HTML and CSS
  • A fun site to edit (as in 1997) where you can quickly add interesting features.

Feb 2018:

https://news.ycombinator.com/item?id=16401630

Mobile networks and smart phones have incredible performance.

AMP is about making up for the race-to-the-bottom behaviour of sites that can’t resist using 6mb and a boatful of JS just to render an article(+analytics+ads)

Mobile networks and phones are powerful enough for video, so they're powerful enough to download and render a web page. If they're slow at web pages, it's very likely the fault of the page creator, not the network or device.

AMP is an attempt to force the hands of the page creators out there, since they apparently can't self-regulate.

I can't prove it because it's only my anecdotal experience, but the battery on my old iPhone seems to drain faster when I have multiple Safari tabs open to web pages that use a lot of JavaScript and JavaScript is enabled. My battery seems to last longer when JavaScript is disabled globally for Safari.


Feb 2018

https://www.socpub.com/articles/chris-graham-why-google-amp-threat-open-web-15847

https://news.ycombinator.com/item?id=16427367

Top comment:

I think it's very important to address the reason why AMP is possible in the first place: Websites are so extremely slow these days.

From users perspective, when I see the lightning icon on my search results I feel happy because it means that the page will show me it's contents as soon as I click it.

It means that the website is not going to show me white page for 15 seconds then start jumping around, changing shape and position for another 30 seconds until everything is downloaded.

I hear all the ethical/economical/strategic concerns but the tech community resembles the taxi industry a bit, that is, claiming that a tech that improves users experience significantly is bad for the user and must be stopped politically instead of addressing the UX issue that allows this tech to exist in first place.

Next comment:

The tragedy of it is that web browsers have never been faster - it's just that websites insist on bloating, and bloating, and bloating. It's not unusual for modern websites to have literally megabytes of pointless JavaScript. (Reminder: Super Mario 64 weighs in at 8MB. The whole game.)

AMP strikes me as a clever technical solution to a problem that doesn't need a technical solution. It just needs restraint and better web development with existing standard technologies, and ideally a strong taboo on bloated web-sites.

See also two other technologies, the existences of which damn the web: Opera Mini (cloud rendering! and it's useful!), which can only exist for as long as the web is laughably inefficient, and Reader Mode, which improves modern web-design by removing it entirely.

A comment further down:

I work for a publisher that has zero ads. We have fast pages with minimal JS. We rolled out AMP purely for the SEO win and saw a huge uptick in traffic.

If Google really cared about performance they’d reward publishers doing the right thing on their regular pages (which would benefit the web as a whole), not just those using AMP.

Google decides, and they may be prioritizing websites that use Google's tech.

Another HN comment that echoes my past thoughts about AMP and Facebook's Instant Articles.

I don't blame Google for AMP. The industry could have come together to offer a better experience and speedier page load, but of course they didn't and preferred having countless scripts and poorly optimized ads and that translated into a poor experience for users. This created an opportunity for Google to come in and offer this solution and now we're stuck.

Another HN comment:

I have JS disabled and websites are not that slow for me.

Indeed.


https://medium.com/confrere/its-illegal-to-have-an-inaccessible-website-in-norway-and-that-s-good-news-for-all-of-us-b59a9e929d54


"Make Medium Readable Again" https://news.ycombinator.com/item?id=16516126


Another idiotic article from The Verge that blames the mobile web when horribly bloated websites, like theverge.com, load slowly. The Google person involved with AMP believes that the mobile web sucked prior to AMP.

First, no such thing as the mobile web exists. It's the web. It's HTTP or HTTPS. The web can be accessed on many devices.

With fast internet connections, simple web pages load fast on any device. Bloated websites load slowly on any device, especially over slow internet connections.

https://www.theverge.com/2018/3/8/17095078/google-amp-accelerated-mobile-page-announcement-standard-web-packaging-urls


March 2018

https://sonniesedge.co.uk/talks/dear-developer

https://news.ycombinator.com/item?id=16773398


April 2018

https://changemyview.net/2018/04/11/reddit-redesigned-its-site-and-it-could-kill-discussion-subreddits/ https://news.ycombinator.com/item?id=16819397

It's 2018 and you're criticizing a site for not running well with JS disabled. You're the one actively deciding to make websites harder to use, why should they design the site for you?

Excellent response to the above small-minded thinking:

Bad engineering in 2018 is still bad engineering. JS can be used to enhance the experience, and even build things otherwise impossible in the browser, and it's totally justified. But taking the newest, shinest web application framework and turning what's a tree of regular web documents into a web application, with data unaccessible if you don't run the entirety of its JS? That's just wrong.

Another disturbing viewpoint:

Nobody that pays for developer time in 2018 is going to optimize their site for people that go out of their way to willfully turn off part of the stack. It's not gonna happen. Effort for no reward.

If those people are designers, then they design with no empathy. They may be unaware of accessibility. They have a narrow view of the world, regarding the tech owned by users and the physical capabilities of users. They obviously lack the ability to pick the right tools for the right times.

JavaScript should never be required to read text on a public website when the reader is not logged into the website.

Another good response to the small-minded people:

I turn off a part of the stack (or rather, run it on a whitelist), because people are using wrong parts of the stack for wrong things.

If you use the right parts of the stack for the right things, the result is a lean, accessible and interoperable piece of software. That's what the Internet was designed for. Alas, few people care, and in particular, interoperability is actively being opposed.

RE the toilet example, current webdev is more like refusing to build toilets in apartments and instead building them into car seats, because it's 2018, everyone has a car and a driver's license (or, rather, everyone in the population we want to monetize).

Another HN comment:

Some things should be simple and just work. They should work in a predictable manner and in a way that minimises risk for the users.

The narrow-minded JavaScript all-the-time everywhere freaks are contributing to the demise of a useful, open web. They don't understand time-and-place tech.

Another great response:

Good engineering is robust and efficient. Requiring Javascript to display simple, static text is not good engineering.

And the lame response from someone not paying attention:

Most websites are not just simple, static text.

I'd debate the "most" part, unless the commenter is including all internal websites/web apps at companies.

The discussion is about Reddit, which is mainly text. Why does Reddit's primary mobile web display require non-logged-in users to enable JavaScript to read text?

Web apps that require users to login and work from their console or dashboard in a private manner can use JavaScript without most of us whining about it. Banking, tax preparing, etc. I expect JavaScript, used well, to improve the user experience. But that info, hopefully, is not available for anyone to read on the public web.

Another HN comment:

I disagree, most of them are simple static sites. Gmail, Google Docs, etc. are the outliers, not the norm. Reddit is a static text site with images. Hacker News is a static text site. Blogs are static text sites (with a few embedded images or non-text objects).

A JavaScript-free, basic-HTML version of Gmail is available on desktop/laptop screens. I don't understand why Google does not make this version of Gmail available to mobile web browsers. But Gmail can be used without JavaScript. That's how I use Gmail on my desktop and laptop computers.

Facebook can function without JavaScript at https://mbasic.facebook.com. Twitter functions without JavaScript. A minute ago, I logged into my test Twitter account and created a test tweet, all with JavaScript disabled for Twitter.

Large companies can afford to expend resources to create systems that function without JavaScript. Facebook, Google, and Twitter know that they have users in parts of the world that have limited internet or slow internet access, or the users have limited data cellphone plans, and JavaScript-free systems help the user experience in these situations.

And security-wise, it's a good idea to disable JavaScript in the browser.

Another HN response:

Most websites should be just simple, static text.

Another comment:

FWIW it takes less developer time to make a decent website without Javascript. Javascript also introduces risks that are non existent in static pages. Now if theese JS sites were super snappy and worked well offline etc I'd say probably wort it. But I guess they are often worse that a plain old web page.

someone who has created static and dynamic webpages since late last millenium

HN comment:

It doesn't take a lot of genius, time, or money to set a link w/ url parameters or use a standard html form for e.g. upvoting or comment submission, and then override that with javascript for fancier UI / AJAX / avoid redirects, etc -- i.e., extra polish, rather than it being the only option and the site just becomes nonfunctional instead of degrading gracefully.

This benefits more than just one tiny group of users: it might also aid disabled users and accessibility software (not to mention the developers of that software), security nuts, people who turn js off to improve performance on low-spec machines (it's 2018 here, but more than a few countries have a four (or even three!) figure GDP/capita, so their machines aren't going to be 2018 machines. This is just off the top of my head, how many other groups might there be that would benefit?
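
That commenter's approach is worth sketching out. Roughly (my own example, with made-up URLs): the upvote is an ordinary form that works with JavaScript off, and a small script, when it runs, merely intercepts the submission to avoid the full-page redirect.

  <!-- baseline: works in any browser, JavaScript or not -->
  <form class="upvote" action="/vote" method="post">
    <input type="hidden" name="item" value="12345">
    <button type="submit">upvote</button>
  </form>

  <script>
    // enhancement: if the script runs, submit in the background instead
    document.querySelectorAll("form.upvote").forEach(function (form) {
      form.addEventListener("submit", function (event) {
        event.preventDefault();
        fetch(form.action, { method: "POST", body: new FormData(form) });
        form.querySelector("button").textContent = "upvoted";
      });
    });
  </script>

If the script never loads or throws an error, the form still posts and the page simply redirects. The site degrades instead of breaking.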

New businesses in the U.S. have to conform to accessibility guidelines for people with disabilities. No such regulation exists for websites. If the JavaScript all-the-time-everywhere crowd opened a coffee shop, they would gladly turn away customers with disabilities because such people represent only a small customer base.

HN comment:

It's 2018 and web developers are still not competent enough to build simple websites (blogs, news, reddit) without forcing visitors to use JS.

JS is the technology responsible for most of the malware infections and spyware ad-tracking. It's not like people disable it just to piss off developers, there are very good reasons to turn it off.

HN comment:

"JS disabled" is just the consequent, tech-y angle to not requiring JS to display every damn text box. I get JS to load things like simple menu-pop-ups or expanding an image, but it's so infused in the new reddit layout, you need it to basically display simple, static text content. It's just bloat.

What's the utility reason for requiring so much JavaScript to perform simple functions that were solved with basic HTML more than 20 years ago?


https://chat.indieweb.org/2018-04-13/1523620457122900

I just get annoyed, one website shouldn't require enough computing power [that] my [CPU] fan has to kick in

Not one website. A single web page can cause older computers to scream and glow red.


https://www.inc.com/30-under-30 https://news.ycombinator.com/item?id=16876061

Oh wow that is a 'slow' website. After initial site loading it used up 100% of one CPU core for all the animations. When scrolling up and down the hiding / changing side menus and ads make it even slower. That is an incredible piece of work on how not to do frontend.

another hn comment:

I dont know whats included on the website but it is loading really slow and the scrolling is laggy, I gave up after the first person highlighted.

hn comment:

Free karma to whoever posts a plain-text list.


http://epeus.blogspot.com/2011/12/facebook-twitter-and-google-plus-shun.html


https://chat.indieweb.org/dev/2018-04-25/1524642976411000

Zegnat I block everything in uMatrix. images, css, javascript. To save on data-usage. Though the saving from CSS is basically 0, it does sometimes stop multiple third-party CDNs from loading


https://www.smashingmagazine.com/2018/05/using-the-web-with-javascript-turned-off/


https://www.discoverdev.io https://news.ycombinator.com/item?id=17054460


Creating a new design because it's cool and not because it solves any problems.

"Reddit's redesign increases power usage of user's devices" https://www.reddit.com/r/redesign/comments/8jzddx/reddits_redesign_increases_power_usage_of_our/

https://news.ycombinator.com/item?id=17087176

top HN comment:

Anyone from reddit's dev team reading this? Or youtube's dev team, because my question applies to both.

The previous design was great. The new version of both sites is slow, sluggish and provides me with no benefit. Why was the change implemented? The previous design wasn't broken!

How's the user feedback? A/B testing really indicated to you this was a good choice?!

If you have any insight -- I'm sure this was a decision made much higher up than dev -- please do share.

The new Reddit design displays nothing when viewed on a mobile device with JavaScript disabled. Brilliant.

I have to use this version of Reddit instead: https://i.reddit.com

another hn comment:

This is just a symptom of the webdev world's over-reliance on JavaScript to draw interfaces nowadays. The web stack has HTML, CSS, and JavaScript for reason. Quit doing everything in the third layer. Make your stuff with progressive enhancement. Knock off the do-it-all-in-JS stuff and your web apps will perform better.


http://bradfrost.com/blog/link/txt-fyi/

https://txt.fyi/about/

This is the dumbest publishing platform on the web.

Write something, hit publish, and it's live.

There's no tracking, ad-tech, webfonts, analytics, javascript, cookies, databases, user accounts, comments, friending, likes, follower counts or other quantifiers of social capital. The only practical way for anyone to find out about a posting is if the author links to it elsewhere.

Long live the independent web!


https://ben.balter.com/2013/10/30/content-is-king/


https://mobile.twitter.com/aral/status/1002498136479236097

Jun 1, 2018: a discussion appeared in the IndieWeb chat log about websites that maliciously use JavaScript, which is a good reason to disable JavaScript when READING. Naturally, a knee-jerker suggested disabling HTML, which proved that the user has no knowledge of the web's history.

The web is about linking, therefore a 100-percent plain text option does not make sense. But with the way some websites are poisoned by horrible, modern web design, a 100-percent plain text version would be preferable.

But even Gopher contains linking capabilities.

TBL created the web with its own markup language, derived from SGML, which he called HTML. The purpose of the markup language was to display content with some simple formatting options to make reading long docs easier in the new web browser app.

The web = linking and formatting with HTML. A 100-percent plain text option is not the web, but it's possible to do.

Some web browsers today permit readers to unstyle web pages, which means the browser feature or extension disables CSS. I don't know if it also disables HTML formatting.

People disable JavaScript when READING the web over the internet for multiple reasons: security, privacy, faster page loads, simpler pages, better reading experience, less data consumed.

When people need to conduct work within a company's network by using internal web apps, then a JavaScript experience will probably occur, and users might expect it because the users demand a pleasant user experience.

Same thing when users LOG INTO their web apps that execute over the internet and CONDUCT WORK from their DASHBOARDS or ADMIN CONSOLES. These are not public actions. These pages are not for public consumption. These web apps need to work smoothly, and using JavaScript can help.

A related tweet from the knee-jerker, who happens to be concerned about privacy and tracking:

https://mobile.twitter.com/aral/status/1002538437235355649?p=p

Disable JavaScript when READING web sites that exist on the internet, and the annoying pop-over screens do not exist. At least the ones created by JavaScript.

JavaScript, not HTML, is responsible for the tracking and the anti-privacy crap.

From the IndieWeb chat log, this disappointing observation from an IndieWeb user:

https://chat.indieweb.org/2018-06-01/1527862734800400

2018-06-01 UTC 14:18 [jgmac1106] I have such disdain in the all javascript is evil position. Like blaming existence of words for hate speech

I have disdain for such binary observations too. Who said that all JavaScript is evil 100 percent of the time in all conditions, including web apps that execute in internal networks?

Again, from the knee-jerker who posted in that Twitter thread:

Heck, let’s get rid of HTML too. Ain’t nothing that needs saying that we can’t say in plain ol’ ASCII amirite? ;) (There’s nothing wrong with JavaScript, it’s the business models that are wrong. A few Stallmanesque folks browsing the web with Lynx isn’t going to solve this.)

That person displayed his ignorance.

Many websites that are NOT single page applications still require JavaScript to display text.

I run the latest, greatest version of Chrome. I use multiple browser extensions, including Quick JavaScript Switcher. And when I have JavaScript disabled for a website that requires JavaScript to display its content even though the site is not an SPA, I can sometimes read the text for that same site with Lynx. WTF?

And Stallman is NOT against JavaScript. The knee-jerker needs to stop creating a new reality. Maybe try reading RMS's website.

https://stallman.org

Stallman opposes non-free JavaScript code. From his homepage.

write a recipe for how to connect to the WiFi in a New York City subway station without running its nonfree Javascript code. The recipe could include a free Javascript program I could run, or it could consist of instructions for what I would type into IceCat (our variant of Firefox).

Stallman supports JavaScript code that is free.

The problem with geeks is that too often they view an issue in binary. The position is either support JavaScript 100 percent of the time or ban JavaScript 100 percent of the time. No leeway. Both viewpoints deserve to be ignored.

My favorite way of writing on the web is by using my own JavaScript-based editor. Logging into web apps to perform work means the users should experience a pleasant UI/UX. Get in, do work, exit, and move on. JavaScript can make banking, doing taxes, managing projects, etc. easier for the logged-in users.

Sure, those web apps could be made to work without JavaScript, but if a competitor creates a better UI/UX, then the non-JavaScript options could lose customers.

Fastmail only works with JavaScript, and I'm okay with that because I enjoy using Fastmail.

When I visit websites as a browsing-only user, and my only function is to read the website, then I see zero reason for the websites to require JavaScript to display text. I see no reason to require JavaScript to display images.

Without JavaScript, those sites appear broken or blank to me, which is fine. It's a big web. A World Wide Web, even. And I can find other websites to waste my time reading.

Good observation by another IndieWeb user in that thread.

https://chat.indieweb.org/2018-06-01/1527863075370500

sknebel blocking third-party JS with a small whitelist is in my experience the best web experience, so I understand where the JS "hate" is coming from

READING a PUBLIC WEBPAGE as a BROWSING-ONLY USER is NOT the same as PERFORMING WORK in a PRIVATE WORKSPACE as a LOGGED-IN USER within an internal or external WEB APP.

The original web was created to share documents for READING. That simple capability still exists today.

But today, my hardware, operating systems, and web browsers are being made obsolete by WEB PAGES that require many megabytes of crapware to be downloaded, which causes older and slower CPUs to glow red and scream. And all that I wanted to do was to READ those bloated web pages.

WEB PAGES and newfangled JavaScript are obsoleting my older computers.

I'm not trying to install a brand new hardware widget that performs magnitudes more functions, and requires a newer version of the operating system or whatever. I'm not trying to install a new, mission control-like software system that requires more RAM and new CPUs, and bigger monitors.

Web pages and websites are obsoleting systems, which is insane when a site simply displays text and images. It's not offering video-game play.


Design Tip: Never Use Black (2012) https://news.ycombinator.com/item?id=17334627 https://ianstormtaylor.com/design-tip-never-use-black/

I'll take high contrast black on white over the annoyingly popular grey on grey that also uses a microscopic font size when the websites are viewed on a phone's web browser. It's almost as if the design was intentionally made to prevent people from reading the website.


July 2018 HN thread about a new text-based browser. Discussion from one part of the thread.

sevensor

Can it be made not to show WebGL, embedded video, and so forth? I enjoy a very serene internet using w3m set to monochrome, with mouse and images turned off. Every now and then it's necessary to use a graphical browser, and it's the sensory equivalent of being woken up by a toddler at 5:30 on Christmas morning.

Reply:

bovermyer

What you call "serene," I would call "austere." That's not meant as a denigration, mind you: I'm very curious as to your viewpoint here. What do you enjoy about such an experience?

It's unfortunate for USABLE web design that such a question needs to be asked. Forget about "serene" and "austere". The proper term is usable.

The person gave a great response to the sad question.

sevensor

I find the ordinary internet terribly overstimulating. It's a constant din of people screaming for my attention. Even relatively sober sites often have distracting designs that make it hard to focus on the content. Text mode is calming, it cuts straight through the clutter. In text mode, authors have to distinguish themselves by saying something interesting, and it's a lot easier to decide whether that's the case when they have nothing but words with which to make their arguments.

Another user provided an excellent response to the question.

The major benefit is that I don't enjoy an "experience", I just read the content and leave. So much of the modern web is built specifically to prevent that.

Indeed. The massively bloated, clunky, tracker-filled, and ad-filled websites that dominate the media industry make it harder to stay informed. Their web reading experience is too awful to tolerate.

When I simply want to read information as a browsing-only user and not as a logged-in user, performing work at my dashboard or admin console, then plain text works best, especially if the plain text is delivered to the web browser with minimal HTML, little to no CSS, and definitely no JavaScript. I read, I learn, and I move on.

I'm not looking to be wowed by the latest JavaScript client-side fad framework. The heavy JavaScript experience is great when I'm hoping for a brilliantly designed UI/UX that stays out of my way when I'm logged into a web app to perform tasks.


https://brutalist-web.design/

A friend gave me design advice once. He said to start with left-aligned black text on a white background, and to apply styling only to solve a specific problem. This is good advice. Embrace this, and you embrace Brutalist Web Design. Focus on your content and your visitors will enjoy you and your website. Focus on decoration or tricking your visitors into clicking ads, and your content will suffer, along with your visitors.

That's useful advice, but I don't understand the repeated attempts in the past couple years to apply brutalist architectural design to web design.

More from brutalist-web.design:

A website's materials aren't HTML tags, CSS, or JavaScript code. Rather, they are its content and the context in which it's consumed. A website is for a visitor, using a browser, running on a computer to read, watch, listen, or perhaps to interact. A website that embraces Brutalist Web Design is raw in its focus on content, and prioritization of the website visitor.

Here's the July 2018 HN thread related to the above website.

https://news.ycombinator.com/item?id=17478133

That thread contained over 300 comments. Here's a humorous one.

Please don't encourage people taking marketing classes. The internet needs less advertising not more.


Good stuff. https://petermolnar.net/do-websites-want-us-to-use-reader-mode/

Naturally met by ignorant criticism.

https://news.ycombinator.com/item?id=17592600

It's always binary with the rabid pro-JavaScript AND rabid anti-JavaScript crowds.

When it's a text-based article, why does the content need to be lost to wretched, bloated, obnoxious web design? What's wrong with old, simple HTML with a smattering of simple CSS when displaying an article that contains mainly text?

When it's a web app that requires the user to log into the website, then JavaScript up a storm, provided that the megatons of JavaScript are useful. When the JavaScript provides a comfortable and useful UI/UX, that's a successful web app.

I stumbled upon Peter's post and the HN thread while using the Links web browser on my Linux desktop computer. That's the Links browser and not Lynx. I occasionally use Lynx, which is a text-based web browser. I first started using Lynx in 1994 or 1995.

The Links web browser came "later"; its development began in the late 1990s. I have rarely used Links, since I preferred Lynx. But I learned in July 2018 that Links has a graphics mode that supports images and the mouse. For some reason, I never knew that.

On July 25, 2018, I started using Links or Links2 in graphics mode by starting it up from the command prompt with links2 -g. I like it. It does not style pages as well as NetSurf, which is still a favorite alternative web browser of mine. But I like the Links text-mode styling with some graphics capabilities.


I found this post of mine from December 2017. It's a quote from a March 2017 Hacker News thread.

There is a special circle of hell for people who write documentation on pages so javascript heavy that a lynx browser on a computer with broken graphics drivers can't load it.


Jul 25, 2018

"Senator Asks US Agencies to Remove Flash from Government Websites"

https://www.bleepingcomputer.com/news/government/senator-asks-us-agencies-to-remove-flash-from-government-websites/

https://news.ycombinator.com/item?id=17611772

HN comment:

Two notes, though I doubt anyone working on government pages will actually read them:

1) PDF -> HTML improves accessibility. HTML -> content rendered by JS via an SPA framework is worse than PDF, and approaches Flash. Please don't do that.

2) Think of the robots :). One of the problem of government data is that while you can usually find the scanned PDF or an XLS file with the data you're looking for, it's completely useless for automated processing. Making public data easier for machines to read enables citizens to build interesting tools on top of them.

Yes. Trying to text-process government PDF files with homemade programs can be frustrating. I had that trouble back in 2005. Our local governments still produce too many PDF files today. Web access to one of the Lucas County services requires Silverlight!!!???


July 27, 2018

"Twitter shares drop 14 percent after reporting declining monthly active users"

https://news.ycombinator.com/item?id=17625456

What I don't understand is the discrepancy between the size of their engineering workforce and the lousy quality of their web client on mobile.

reply:

This may be an intentional dark pattern / effort to drive users away from the web client and to the official app. Facebook and reddit do the same thing. More permissions = more juicy data to harvest.

I think that a lot of websites do that. According to my tin-foil-hat theories, websites on all devices function so poorly, due to disgusting web designs, that the bad designs seem intentional: a way to encourage people to download yet another native app for the phone.


Jul 30-31, 2018

https://pxlnv.com/blog/bullshit-web/

https://news.ycombinator.com/item?id=17655089

Great top comment:

I've said this before, but it bears repeating: Moby Dick is 1.2mb uncompressed in plain-text. That's lower than the "average" news website by quite a bit--I just loaded the New York Times front page. It was 6.6mb. that's more than 5 copies of Moby Dick, solely for a gateway to the actual content that I want. A secondary reload was only 5mb.

I then opened a random article. The article itself was about 1,400 words long, but the page was 5.9mb. That's about 4kb per word without including the gateway (which is required if you're not using social media). Including the gateway, that's about 8kb per word, which is actually about the size of the actual content of the article itself.

So all told, to read just one article from the New York Times, I had to download the equivalent of ten copies of Moby Dick. That's about 4,600 pages. That's approaching the entirety of George R.R. Martin's A Song of Ice and Fire, without appendices.

If I check the NY Times just 4 times a day and read three articles each time, I'm downloading 100mb worth of stuff (83 Moby-Dicks) to read 72kb worth of plaintext.

Even ignoring first-principles ecological conservatism, that's just insanely inefficient and wasteful, regardless of how inexpensive bandwidth and computing power are in the west.

EDIT: I wrote a longer write-up on this a while ago on a personal blog, but don't want it to be hugged to death:

http://txti.es/theneedforplaintext

another comment, related to the ny times or media orgs:

It would be nice if the things I pay for didn't start stuffing their content with bullshit. What and who do I have to pay to get single second page loads?

Man, it's like I wrote that comment, but I didn't, at least not at HN. I certainly have made similar comments here.

That commenter continued ...

It's not a given that advertising has to be so bloated and privacy-invasive. Various podcasts and blogs (like Daring Fireball) plug the same ad to their entire audience each post/episode for set periods of time. If you're going to cry about needing advertising then take your geographic and demographic based targeting. But no war of attrition will get me to concede you need user-by-user tracking.

You want me to pay for your content? Fine, I like it well enough. You want to present ads as well? Okay sure, the writing and perspectives are worth that too I suppose. But in addition to all of this you want to track my behavior and correlate it to my online activity that has nothing to do with your content? No, that's ridiculous.

And hence why I disable JavaScript for most of my web reading. That's reading. R - E - A - D - I - N - G

The top commenter replied to someone else's misguided comment.

A random archive of the New York Times frontpage in 2005 is 300kb. Articles were probably comparable in size.

Are you honestly saying that the landscape of the internet and/or the staffing needs of the NY Times has changed so drastically that they actually needed a 22x increase in size to deliver fundamentally text-based reporting?

Here's the misguided comment by the person who replied to the top comment.

I don't think thats a meaningful comparison. Moby Dick is a book, written by 1 guy and maybe an editor or two. NYT employs 1,300 people.

Yes. The NY Times, like most media orgs, is a for-profit business, and most of these orgs rely on subscriptions and/or digital advertising, which obviously needs page views.

When you read a book all you get is the text. NYT has text, images, related articles, analytics, etc.

But why does an increasing number of media orgs require JavaScript to display images when the HTML image tag has existed for more than 20 years?

Here's a Toledo Blade editorial that contains around 370 words and one small image.

http://www.toledoblade.com/Editorials/2018/07/30/City-must-help-residents-fight/stories/20180730145

https://www.webpagetest.org/result/180731_ZR_8fe8b402a15da8a8932ba6602dca1494/

From: Dulles, VA - Thinkpad T430 - Chrome - Cable - 7/31/2018, 3:12:23 PM
First View, Fully Loaded: Time: 10.610 seconds / Requests: 286 / Bytes In: 2,532 KB

Why would a 370-word editorial require a web reader to download 2.5 megabytes of data? And why would such an article require the reader's web browser to make 286 web requests?

43 of those requests were for JavaScript, which totaled 1.1 megabytes of the total download. The Blade's bloated web design, however, is not as bad as it was two to three years ago.

Again from above:

Moby Dick is 1.2mb uncompressed in plain-text

And the Blade's website required a reader to download 1.1 megabytes of JavaScript to read a small article that was mainly text.

Moby Dick doesn't have to know what pages you read. NYT needs to know how long you spent, on which articles, etc. They need data to produce the product and you can only achieve that with javascript tracking pixels (Server logs aren't good enough).

If Moby Dick was being rewritten and optimized every single day it would be a few mb. Its not, so you can't compare the two.

Yes NYT should be lighter, no your comparison is not meaningful. A better comparison would by Moby Dick to the physical NYT newspaper.

Here's another commenter who made a misguided post as a reply to the author of the top comment.

This is both an appeal to people's universal appreciation of efficiency, and a weak denunciation of the modern web. Your argument is

1) that a website's value is the number of words on the page, and

Huh? The other person did not discuss the value or the merits of word counts. It's the content that matters, regardless of whether it's 20,000 words or 200 words. And it doesn't matter if the content is a mix of audio, video, images, and words, as long as every piece of content has value.

I despise huge, useless images that have nothing to do with the content of the page. And since most people read web pages on their phones, why use massive images? A 640- or 800-pixel-wide image is probably enough for most articles on any screen size. Maybe instead of embedding giant images, video, and every social media widget within the article, the content creators should provide links to the bigger content. If a reader wants to view the image or the video, then the reader can click a link. That keeps the article page lighter.

Excerpts from another comment posted by the author of the top comment, at least at the time that I read this HN thread.

Concerning the relative value of HTML and CSS, yes, you could argue that UX matters in that department, but even the most bloated static HTML/CSS page is going to pale dramatically in comparison to the size of what's considered acceptable throughput today.

Hopefully, commenters or "thinkers" like this HN user do not design websites.

Comparing the raw text of a fiction novel to the code of a website is a pretty asinine comparison, honestly.

Clearly, the person lacks the ability to comprehend the comparison, which is to highlight how absurdly bloated single web pages have become, especially at media orgs. It's why Facebook created Instant Articles and Google created Accelerated Mobile Pages.

Another HN commenter added:

Amen to every word except this sentence. "Better choices should be made by web developers to not ship this bullshit in the first place."

No developer I know, web or otherwise, wants to do any of this, and all of them are religious in their use of ad blockers and autoplay stoppers.

This is the kind of stuff developers are forced to do with guns to their heads by the PMs and marketing teams that actually determine the user experience.

We rarely hear about how designers and developers are forced to make bloated websites because of the business models used by publishers. I wish we would hear more tell-alls in this area.

In July 2018, I began using and enjoying the Links web browser on my Linux desktop computer. That would be Links in graphics mode. For some reason, I needed to download Links2. I start up the browser from the command line with links2 -g.

With a broadband connection at home, I'm stunned at how fast our internet access is, and we do not have the fastest option offered by our local ISP, toast.net. Our plan costs around $37 per month.

Links renders web pages literally in a blink of an eye, maybe faster than I can blink. The limited graphical browser NetSurf can render through at least HTML4 and CSS2 and maybe a smattering of HTML5 and CSS3.

But links -g renders only basic HTML, such as headings, paragraphs, emphasis, bullet points, and blockquotes, which is about all that I need. links -g can also display images, but the browser does not support CSS. The browser provides global display options for the background color, font size, margin sizes, etc. Simple but useful.
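
For reference, this is roughly the entire vocabulary a page needs for links -g to render it well. A minimal sketch of my own (the file names are only examples):

  <h1>A heading</h1>
  <p>A paragraph with <em>emphasis</em> and a <a href="next-page.html">link</a>.</p>
  <ul>
    <li>a bullet point</li>
    <li>another bullet point</li>
  </ul>
  <blockquote><p>A quoted passage.</p></blockquote>
  <img src="photo.jpg" alt="a modestly sized photo" width="640">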

When individual webpages are stripped down to their basic HTML, like how the web was meant to be, web access and web page loading is blazingly fast over our home broadband internet connection.

Since July 2018, I've been using the uMatrix Chrome browser extension with everything disabled by default for all websites, and then I enable resources on a per-site basis. With everything disabled, webpages in Chrome display similarly to how they look in the links -g web browser.

Since it's unlikely that the links -g web browser will work on a smartphone, I may need to try Chrome again on the iPhone with uMatrix, assuming that the extension exists for the mobile version of Chrome.

The web (http/https) is not slow. The mobile web is not slow. No such thing as the mobile web exists. It's the same damn web, displayed on smaller screens. Even on a 2G connection, simple web pages load faster than bloated web pages on fast internet connections.

The web browser makers are not the problem for a slow web as defined by The Verge. The blame belongs to bloated website publishers like The Verge.

If the Toledo Blade offered a web product that functioned like https://text.npr.org except with the addition of useful and judiciously placed images, video, and audio, then I would happily subscribe.

My fictional Toledo Gazette test website with real Blade content.

http://wren.soupmode.com/tg


One Year Without AMP (alexkras.com)

https://www.alexkras.com/one-year-without-amp

https://news.ycombinator.com/item?id=17678705


August 2018

https://medium.com/the-set-list/google-amp-a-70-drop-in-our-conversion-rate-35fe3cb69c59

https://news.ycombinator.com/item?id=17717241

From the medium.com post:

Let’s talk about Google AMP. AMP stands for Accelerated-Mobile-Pages. It’s a technology Google originally introduced to get web developers to speed up their webpages for mobile devices and mobile networks. But in many ways it seems like great technology for any device or network. Who doesn’t want fast websites?

Uh, newsflash: Website owners don't need Google AMP to create fast websites. Website owners can CHOOSE to create fast websites on their own, but they don't. It's a choice.

There’s nothing that magical about it. A big part of its performance boost is simply its standards: no javascript, all inline CSS, and CSS can only measure 50KB or less. You’re going to make any page load faster with those requirements.

And website owners can do that themselves. But most web publishers, especially media orgs, CHOOSE to create horrendously bloated web designs.

Top comment from the HN thread that contains over 340 comments:

Question: If what makes AMP fast is the restrictions on size, JS, and CSS, and you know this and want to conform to this, why do you need to use AMP? Why not just develop your site like this anyways?

It's possible that the tech people working at media orgs would prefer to create lightweight, fast-loading, non-tracking websites, but those tech people are employees beholden to "superiors" who promote a failing business model, built around digital ads, clickbait, and pageviews.

Another HN comment:

AMP's innovation isn't a way to make pages fast. AMP is a way to sell other stakeholders on implementing technologies that make your website fast. All the stuff AMP does is stuff you could do yourself without the extra request to amp.js and the extra work to amp-ify your pages.

But imagine you've got an advertising department that wants three different ad networks, a couple different managers that want to see stats from a couple different analytics platforms, and and the designer wants to load a font from fontsquirrel and another one from typekit and another one from google web fonts, and as a developer who wants to keep the site fast you have to fight them every single time they want to add something else that slows your site down. Having the same fight every time, with everybody else saying "oh, it's just one request. and this one is really critical" it's hard to keep fighting that fight.

It's a lot easier to say "i can't do that, it doesn't work in AMP". If you can find a better way to convince large organizations that page load speed is a valuable metric, and more important that whatever other resource they want to load today, I'd love to hear it. But from what i've seen, AMP is the only thing that's had any success in tackling this problem.

Another HN comment:

AMP was a blessing for me honestly. I can now maintain a version of our new site that isn't bogged down with tracking and flavor-of-the-month JS feature garbage.

I've been fighting against adding additional tracking forever, but constantly get railroaded by marketing because "they're the ones that know how to make us profitable."

Fundamentally I hate what it means for the internet, but I finally have a little power to say "no we can't do that."

Another HN comment:

It is astonishing how hard it can be to internally sell any kind of web quality features to management in both for profit and non-profit organizations.

There is also a real herd effect. Many people will do whatever Matt Cutts tells them because they think it will be good for their SEO. Yeah right. Some of the people who are good at SEO either went to work for huge brands or quasi-competitors of Google (like about.com) that might have some ability to bring Google to anti-trust court; most of the others switched to paid advertising once they figured out that Google won't let you win at SEO.

Depending upon the business, designers and developers should NOT be blamed for poorly designed websites. We don't know the stories behind the bloated, nefarious designs.


https://chat.indieweb.org/2018-08-10/1533941657434800

https://mobile.twitter.com/rhiaro/status/1027857852730429440

https://rhiaro.co.uk/2018/08/websites

My kingdom for restaurant websites which have a text-only one page version of the menu without so much javascript that google translate can't even parse it and nothing is copypastable.