Tag - Manifesto

assembled in 2016-2017

Most weeks, I find new and old web posts related to this subject. These pages are mainly a dumping ground for links, excerpts, and quick thoughts.


storing links for page number 7 ...

Aug 16, 2017 https://news.ycombinator.com/item?id=15027715

Most content websites have become such a massive crapfest of ad-bloat, bad UX, huge page sizes and general usability hell that it's nigh impossible that I'd be able to reach the actual content of a non AMP site in the first 5-10 seconds of clicking on its link. (On my phone that's an additional 1-2 seconds for registering the tap, and 1-2 seconds for navigating to the browser)

So say what you may, AMP (or FB Instant or its ilk) will prosper until the mobile web experience stops being so crappy.

First, no such thing as the "mobile web" exists. It's the web. People often view the web with mobile devices, but it's the same web that can be viewed with desktop computers. Responsive web design lets sites adjust their presentation for different devices, but it's still the same web, delivered over HTTP/HTTPS.

Second, the web experience, which includes viewing the web on mobile devices, is fine. The problem is not with the web. The problem is with how websites are designed.
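Responsive design, by the way, is nothing exotic. It's the same HTML document with CSS that adapts to the viewport. A rough, illustrative fragment (my own sketch, not from any of the linked pages):

  <meta name="viewport" content="width=device-width, initial-scale=1">

  <style>
    /* one document; only the presentation changes with the viewport */
    article { max-width: 40em; margin: 0 auto; padding: 1em; }

    @media (max-width: 600px) {
      article { padding: 0.5em; font-size: 1.1em; }
    }
  </style>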


https://news.ycombinator.com/item?id=15210022

This is a non-interactive text-only website. It shouldn't need anything besides HTML and CSS.


https://www.poynter.org/news/text-only-news-sites-are-slowly-making-comeback-heres-why

These text-only sites — which used to be more popular in the early days of the Internet, when networks were slower and bandwidth was at a premium – are incredibly useful, and not just during natural disasters. They load much faster, don’t contain any pop-ups or ads or autoplay videos, and help people with low bandwidth or limited Internet access. They’re also beneficial for people with visual impairments who use screen readers to navigate the Internet.

Proving, once again, that the web is not slow. Websites are slow because of how they are built. But these slow web design choices are probably governed by the media orgs' business models and by user experience people who design without empathy for the users.

More from the story:

There are many ways that news organizations can improve the ways they serve both low-bandwidth users and people with visual impairments by stripping out unnecessary elements and optimizing different parts of a website. To learn more, I reached out to front-end website designer J. Albert Bowden, who frequently tweets about accessibility and web design standards, to ask a few questions about how we might approach building text-only sites to help end users.

Kramer: I’m curious. What kinds of things can be stripped from sites for low-bandwidth users and people with visual impairments?

Bowden: Those are two very distinct user groups but some of the approaches bleed over and can be applied together. For low-bandwidth users: Cut the fluff. No pictures, no video, no ads or tracking. Text files are good enough here. Anything else is just fluff.

That's good web design advice for any-bandwidth users.

[Bowden:] For visually impaired users: I’m going to just talk about a11y [which is a shorthand way of referring to computer accessibility] here. A11y is best addressed in the foundation of a website, in the CSS, HTML, and JavaScript. There are other ways to go about doing it, but they are much more resource intensive and therefore are never going to be default for mainstream.

Typical user agents for those with visual impairments are screen readers, which rely on the foundation (literally HTML) of a website to interpret its content and regurgitate it back to the user.

Kramer: Is text-only the way to go? Are there ways to think about preloading images and/or other methods that might help these users?

Bowden: Text in HTML is the way to go here; you cover accessibility issues and SEO bots, while simultaneously also being usable on the maximum number of devices possible. HTML and CSS are forgiving in the sense that you can make mistakes in them, and something will still be rendered to the user. Browsers are built with backwards compatibility, so combining them all grants you the extended coverage. Meaning that basic sites will work on nearly any phone. Any computer. Any browser.

Once you deviate from this path, all bets are off. Some are solid, no doubt, but most are shaky at least, and downright broken at worst. JavaScript is the bee’s knees and you can do anything in the world with it ... but if you make certain mistakes, your web page will not render in the browser; the browser will choke to death.
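Bowden's point about the foundation is concrete. A screen reader can only work with the structure that the markup itself provides, and a page like the sketch below (my own rough example, not from the article) will render on nearly any phone, computer, or browser:

  <!DOCTYPE html>
  <html lang="en">
  <head>
    <meta charset="utf-8">
    <title>Storm updates</title>
  </head>
  <body>
    <header>
      <h1>Storm updates</h1>
    </header>
    <nav aria-label="Sections">
      <a href="/local">Local</a>
      <a href="/weather">Weather</a>
    </nav>
    <main>
      <article>
        <h2>Shelters open tonight</h2>
        <p>Plain paragraphs of text, readable on an old phone, a new laptop, or a screen reader.</p>
      </article>
    </main>
  </body>
  </html>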

https://mobile.twitter.com/dansinker/status/906691093919621120

Hands down the best news site design of 2017. lite.cnn.io

That October 2017 Poynter story contained a link to a 2015 Poynter story.

https://www.poynter.org/news/designing-journalism-products-accessibility

Consider creating a text-only site. At a recent accessibility hackathon, I sat with visually impaired people who said that this text-only version of the NPR website was the best news website because their screen readers easily parsed the material.

You can also check your website to make sure that it’s usable for people with color impairments, like color blindness. Color Safe helps you choose colors that meet WCAG contrast thresholds, while the Color Contrast Analyzer simulates different forms of color impairment so you can learn which colors may work.
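For reference, the WCAG threshold those tools check is a contrast ratio, (L1 + 0.05) / (L2 + 0.05), where L1 and L2 are the relative luminances of the lighter and darker colors; level AA calls for at least 4.5:1 for normal-size text and 3:1 for large text.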


Oct 14, 2017

https://www.codementor.io/kwasiamantin/undefined-cmglhfnfv https://news.ycombinator.com/item?id=15472627


https://www.neustadt.fr/essays/against-a-user-hostile-web https://news.ycombinator.com/item?id=15611122


"How the BBC News website has changed over the past 20 years" http://www.bbc.co.uk/news/uk-41890165 https://news.ycombinator.com/item?id=15730218


"Hundreds of web firms record 'every keystroke'" http://www.bbc.com/news/technology-42065650 https://news.ycombinator.com/item?id=15786887

Businesses existed before the creation of JavaScript. Conducting most business does not require a Turing-complete language and executing arbitrary code on every document. HTML is all that is needed to conduct business. The web was created as documents+forms in the IBM 3270 model, which businesses and other organizations had been using since the 1970s.
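A bare HTML form still covers that documents+forms model. A made-up example (the action URL and field names are hypothetical):

  <form action="/orders" method="post">
    <label>Quantity <input type="number" name="quantity" value="1" min="1"></label>
    <label>Ship to <input type="text" name="address" required></label>
    <button type="submit">Place order</button>
  </form>

The browser sends an ordinary POST request, the server replies with another document, and no arbitrary code runs on the customer's machine.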

Another HN comment:

Whenever someone complains about a website not working without javascript enabled, someone inevitably responds "it's 2017, you can expect javascript to be enabled". I think that piece of knowledge is outdated:

  • Late 1990's: static html documents + forms
  • Early 2000's: shitty DHTML scripts that added nothing
  • Early 2010's: javascript + gracefully downgrading sites
  • 2015/16: required useful javascript everywhere
  • Early 2017: trackers everywhere, html5 popups, trackers, spyware, trackers, bitcoin miners, trackers, etc, etc.

2017 is the year where you NEED a javascript blocker. What's the use of having any security at all if you're going to leave the biggest attack vector in modern times completely unprotected?

Plus, the web has become completely unusable without a script blocker.

A person responded with:

When you exaggerate like that, it diminishes your point. I use the web all day, every day and I have never installed a script blocker.

It's not an exaggeration. With JavaScript enabled, it's nearly impossible to read media websites over an LTE connection, and definitely over a 3G connection. And even with a fast connection, older computers get bogged down when accessing JavaScript-heavy media websites.


January 2018

A letter about Google AMP http://ampletter.org https://news.ycombinator.com/item?id=16108553 - over 300 comments



Testing and ranking website response time, using 3G testing at webpagetest.org.

https://projects.hearstnp.com/performance

https://css-tricks.com/use-webpagetest-api

https://www.webpagetest.org/result/180202_3D_c86d014ace02663a6ab00ddc688b82eb/

sawv.org - homepage
From: Dulles, VA - Chrome - 3G
2/1/2018, 10:00:43 PM
First View Fully Loaded:
  Time: 1.390 seconds
  Requests: 2
  Bytes In: 9 KB
100% of the download was HTML, which was 6,413 bytes.
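The css-tricks link above covers driving these tests from WebPageTest's API. From memory, submitting a run is a single request along these lines (the parameter names and location string are what I recall and should be checked against the current API docs):

  https://www.webpagetest.org/runtest.php?url=https%3A%2F%2Fsawv.org&location=Dulles%3AChrome.3G&f=json&k=YOUR_API_KEY

The JSON response includes links for polling the test status and fetching results like the ones above.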


http://evrenkiefer.com/posts/2018-01-14-migration-de-wordpress-vers-jekyll.html

My specifications were quite simple.

  • A simple site to share texts and images
  • A cheap site to consult on mobile
  • Readable source code for people learning HTML and CSS
  • A fun site to edit (as in 1997) where you can quickly add interesting features.

Feb 2018:

https://news.ycombinator.com/item?id=16401630

Mobile networks and smart phones have incredible performance.

AMP is about making up for the race-to-the-bottom behaviour of sites that can’t resist using 6mb and a boatful of JS just to render an article(+analytics+ads)

Mobile networks and phones are powerful enough for video, so they're powerful enough to download and render a web page. If they're slow at web pages, it's very likely the fault of the page creator, not the network or device.

AMP is an attempt to force the hands of the page creators out there, since they apparently can't self-regulate.

I can't prove it because it's only my anecdotal experience, but the battery on my old iPhone seems to drain faster when I have multiple Safari tabs open to JavaScript-heavy web pages with JavaScript enabled. My battery seems to last longer when JavaScript is disabled globally in Safari.


Feb 2018

https://www.socpub.com/articles/chris-graham-why-google-amp-threat-open-web-15847

https://news.ycombinator.com/item?id=16427367

Top comment:

I think it's very important to address the reason why AMP is possible in the first place: Websites are so extremely slow these days.

From a user's perspective, when I see the lightning icon on my search results I feel happy because it means that the page will show me its contents as soon as I click it.

It means that the website is not going to show me a white page for 15 seconds then start jumping around, changing shape and position for another 30 seconds until everything is downloaded.

I hear all the ethical/economical/strategic concerns but the tech community resembles the taxi industry a bit, that is, claiming that a tech that improves user experience significantly is bad for the user and must be stopped politically instead of addressing the UX issue that allows this tech to exist in the first place.

Next comment:

The tragedy of it is that web browsers have never been faster - it's just that websites insist on bloating, and bloating, and bloating. It's not unusual for modern websites to have literally megabytes of pointless JavaScript. (Reminder: Super Mario 64 weighs in at 8MB. The whole game.)

AMP strikes me as a clever technical solution to a problem that doesn't need a technical solution. It just needs restraint and better web development with existing standard technologies, and ideally a strong taboo on bloated web-sites.

See also two other technologies, the existences of which damn the web: Opera Mini (cloud rendering! and it's useful!), which can only exist for as long as the web is laughably inefficient, and Reader Mode, which improves modern web-design by removing it entirely.

A comment further down:

I work for a publisher that has zero ads. We have fast pages with minimal JS. We rolled out AMP purely for the SEO win and saw a huge uptick in traffic.

If Google really cared about performance they’d reward publishers doing the right thing on their regular pages (which would benefit the web as a whole), not just those using AMP.

Google decides, and they may be prioritizing websites that use Google's tech.

Another HN comment that echoes my past thoughts about AMP and Facebook's Instant Articles.

I don't blame Google for AMP. The industry could have come together to offer a better experience and speedier page load, but of course they didn't and preferred having countless scripts and poorly optimized ads and that translated into a poor experience for users. This created an opportunity for Google to come in and offer this solution and now we're stuck.

Another HN comment:

I have JS disabled and websites are not that slow for me.

Indeed.


https://medium.com/confrere/its-illegal-to-have-an-inaccessible-website-in-norway-and-that-s-good-news-for-all-of-us-b59a9e929d54


"Make Medium Readable Again" https://news.ycombinator.com/item?id=16516126


Another idiotic article from The Verge that blames the mobile web for horribly bloated websites, like theverge.com, loading slowly. The Google person involved with AMP believes that the mobile web sucked prior to AMP.

First, no such thing as the "mobile web" exists. It's the web. It's HTTP or HTTPS. The web can be accessed on many devices.

With fast internet connections, simple web pages load fast on any device. Bloated websites load slowly on any device, especially over slow internet connections.

https://www.theverge.com/2018/3/8/17095078/google-amp-accelerated-mobile-page-announcement-standard-web-packaging-urls


March 2018

https://sonniesedge.co.uk/talks/dear-developer

https://news.ycombinator.com/item?id=16773398


April 2018

https://changemyview.net/2018/04/11/reddit-redesigned-its-site-and-it-could-kill-discussion-subreddits/ https://news.ycombinator.com/item?id=16819397

It's 2018 and you're criticizing a site for not running well with JS disabled. You're the one actively deciding to make websites harder to use, why should they design the site for you?

Excellent response to the above small-minded thinking:

Bad engineering in 2018 is still bad engineering. JS can be used to enhance the experience, and even build things otherwise impossible in the browser, and it's totally justified. But taking the newest, shinest web application framework and turning what's a tree of regular web documents into a web application, with data unaccessible if you don't run the entirety of its JS? That's just wrong.

Another disturbing viewpoint:

Nobody that pays for developer time in 2018 is going to optimize their site for people that go out of their way to willfully turn off part of the stack. It's not gonna happen. Effort for no reward.

If those people are designers, then they design with no empathy. They may be unaware of accessibility. They have a narrow view of the world regarding the tech owned by users and the physical capabilities of users. They obviously lack the ability to pick the right tools at the right times.

JavaScript should never be required to read text on a public website when the reader is not logged into the website.

Another good response to the small-minded people:

I turn off a part of the stack (or rather, run it on a whitelist), because people are using wrong parts of the stack for wrong things.

If you use the right parts of the stack for the right things, the result is a lean, accessible and interoperable piece of software. That's what the Internet was designed for. Alas, few people care, and in particular, interoperability is actively being opposed.

RE the toilet example, current webdev is more like refusing to build toilets in apartments and instead building them into car seats, because it's 2018, everyone has a car and a driver's license (or, rather, everyone in the population we want to monetize).

Another HN comment:

Some things should be simple and just work. They should work in a predictable manner and in a way that minimises risk for the users.

The narrow-minded JavaScript all-the-time everywhere freaks are contributing to the demise of a useful, open web. They don't understand time-and-place tech.

Another great response:

Good engineering is robust and efficient. Requiring Javascript to display simple, static text is not good engineering.

And the lame response from someone not paying attention:

Most websites are not just simple, static text.

I'd debate the "most" part, unless the commenter is including all internal websites/web apps at companies.

The discussion is about Reddit, which is mainly text. Why does Reddit's primary mobile web display require non-logged-in users to enable JavaScript to read text?

Web apps that require users to log in and work from their console or dashboard in a private manner can use JavaScript without most of us whining about it. Banking, tax preparation, etc. I expect JavaScript, used well, to improve the user experience. But that info, hopefully, is not available for anyone to read on the public web.

Another HN comment:

I disagree, most of them are simple static sites. Gmail, Google Docs, etc. are the outliers, not the norm. Reddit is a static text site with images. Hacker News is a static text site. Blogs are static text sites (with a few embedded images or non-text objects).

A JavaScript-free version of Gmail is available in desktop/laptop browsers. I don't understand why Google does not make this version of Gmail available to mobile web browsers. Gmail can be used without JavaScript; that's how I use Gmail on my desktop and laptop computers.

Facebook can function without JavaScript at https://mbasic.facebook.com. Twitter functions without JavaScript. A minute ago, I logged into my test Twitter account and created a test tweet, all with JavaScript disabled for Twitter.

Large companies can afford to expend resources to create systems that function without JavaScript. Facebook, Google, and Twitter know that they have users in parts of the world with limited or slow internet access, or with limited-data cellphone plans, and JavaScript-free systems improve the user experience in those situations.

And security-wise, it's a good idea to disable JavaScript in the browser.

Another HN response:

Most websites should be just simple, static text.

Another comment:

FWIW it takes less developer time to make a decent website without Javascript. Javascript also introduces risks that are non-existent in static pages. Now if these JS sites were super snappy and worked well offline etc I'd say probably worth it. But I guess they are often worse than a plain old web page.

someone who has created static and dynamic webpages since late last millennium

HN comment:

It doesn't take a lot of genius, time, or money to set a link w/ url parameters or use a standard html form for e.g. upvoting or comment submission, and then override that with javascript for fancier UI / AJAX / avoid redirects, etc -- i.e., extra polish, rather than it being the only option and the site just becomes nonfunctional instead of degrading gracefully.

This benefits more than just one tiny group of users: it might also aid disabled users and accessibility software (not to mention the developers of that software), security nuts, people who turn js off to improve performance on low-spec machines (it's 2018 here, but more than a few countries have a four (or even three!) figure GDP/capita, so their machines aren't going to be 2018 machines). This is just off the top of my head; how many other groups might there be that would benefit?
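That's worth spelling out with an example. A hypothetical upvote control, roughly sketched (the /vote endpoint and field names are invented): the form works with JavaScript off, and the script, when it does run, only upgrades the experience.

  <form action="/vote" method="post" class="upvote">
    <input type="hidden" name="item" value="12345">
    <button type="submit">Upvote</button>
  </form>

  <script>
    // Enhancement only: if this script never runs, the form above still submits normally.
    document.querySelectorAll('form.upvote').forEach(function (form) {
      form.addEventListener('submit', function (event) {
        event.preventDefault();
        // Same endpoint, same fields; just no redirect and no page reload.
        fetch(form.action, { method: 'POST', body: new FormData(form) })
          .then(function () {
            form.querySelector('button').textContent = 'Upvoted';
          });
      });
    });
  </script>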

New businesses in the U.S. have to conform to accessibility guidelines for people with disabilities. No such regulation exists for websites. If the JavaScript all-the-time-everywhere crowd opened a coffee shop, they would gladly turn away customers with disabilities because such people represent only a small customer base.

HN comment:

It's 2018 and web developers are still not competent enough to build simple websites (blogs, news, reddit) without forcing visitors to use JS.

JS is the technology responsible for most of the malware infections and spyware ad-tracking. It's not like people disable it just to piss off developers, there are very good reasons to turn it off.

HN comment:

"JS disabled" is just the consequent, tech-y angle to not requiring JS to display every damn text box. I get JS to load things like simple menu-pop-ups or expanding an image, but it's so infused in the new reddit layout, you need it to basically display simple, static text content. It's just bloat.

What's the practical reason for requiring so much JavaScript to perform simple functions that were solved with basic HTML more than 20 years ago?


https://chat.indieweb.org/2018-04-13/1523620457122900

I just get annoyed, one website shouldn't require enough computing power [that] my [CPU] fan has to kick in

Not one website. A single web page can cause older computers to scream and glow red.


https://www.inc.com/30-under-30 https://news.ycombinator.com/item?id=16876061

Oh wow that is a 'slow' website. After initial site loading it used up 100% of one CPU core for all the animations. When scrolling up and down the hiding / changing side menus and ads make it even slower. That is an incredible piece of work on how not to do frontend.

Another HN comment:

I don't know what's included on the website but it is loading really slow and the scrolling is laggy, I gave up after the first person highlighted.

HN comment:

Free karma to whoever posts a plain-text list.


http://epeus.blogspot.com/2011/12/facebook-twitter-and-google-plus-shun.html


https://chat.indieweb.org/dev/2018-04-25/1524642976411000

Zegnat: I block everything in uMatrix. images, css, javascript. To save on data-usage. Though the saving from CSS is basically 0, it does sometimes stop multiple third-party CDNs from loading


https://www.smashingmagazine.com/2018/05/using-the-web-with-javascript-turned-off/


https://www.discoverdev.io https://news.ycombinator.com/item?id=17054460


Creating a new design because it's cool and not because it solves any problems:

"Reddit's redesign increases power usage of user's devices" https://www.reddit.com/r/redesign/comments/8jzddx/reddits_redesign_increases_power_usage_of_our/

https://news.ycombinator.com/item?id=17087176

Top HN comment:

Anyone from reddit's dev team reading this? Or youtube's dev team, because my question applies to both.

The previous design was great. The new version of both sites is slow, sluggish and provides me with no benefit. Why was the change implemented? The previous design wasn't broken!

How's the user feedback? A/B testing really indicated to you this was a good choice?!

If you have any insight -- I'm sure this was a decision made much higher up than dev -- please do share.

The new Reddit design displays nothing when viewed on a mobile device with JavaScript disabled. Brilliant.

I have to use this version of Reddit: https://i.reddit.com

Another HN comment:

This is just a symptom of the webdev world's over-reliance on JavaScript to draw interfaces nowadays. The web stack has HTML, CSS, and JavaScript for reason. Quit doing everything in the third layer. Make your stuff with progressive enhancement. Knock off the do-it-all-in-JS stuff and your web apps will perform better.


http://bradfrost.com/blog/link/txt-fyi/

https://txt.fyi/about/

This is the dumbest publishing platform on the web.

Write something, hit publish, and it's live.

There's no tracking, ad-tech, webfonts, analytics, javascript, cookies, databases, user accounts, comments, friending, likes, follower counts or other quantifiers of social capital. The only practical way for anyone to find out about a posting is if the author links to it elsewhere.

Long live the independent web!