Web Reading UX Etcetera Part 6

https://news.ycombinator.com/item?id=14586325

Every time I use a web browser without an ad blocker I'm surprised at how shitty the web experience is.


https://idontwantyourfuckingapp.tumblr.com

https://alisdair.mcdiarmid.org/kill-sticky-headers


https://adactio.com/links/12441 https://www.theverge.com/2017/5/24/15681958/what-is-web-definition

Mind you, I think this is a bit rich in an article published on The Verge:

The HTML web may be slow and annoying and processor intensive, but before we rush too fast into replacing it, let’s not lose what’s good about it.

Excuse me? Slow, annoying, processor-intensive web pages have nothing to do with the technology, and everything to do with publishers like The Verge shoving bucketloads of intrusive JavaScript trackers into every page view.

The Verge is supposed to be a media org that covers technology. Two years ago, in the summer of 2015, The Verge published one of the all-time dumbest articles about the web. The author blamed the slowness and other issues on the mobile web and mobile web browsers. But the mobile web is no different from any other web. It's all the same web. The Verge creates massively bloated websites because its business model depends upon advertising and page views.

And now, in June 2017, the author of the Verge article quoted above continues the absurdity, claiming that the web is "slow and annoying and processor intensive." WRONG. The web is none of those things.

Websites like theverge.com are slow and annoying and processor intensive. Remove the JavaScript, and theverge.com loads completely in around one second.

Fetching a page with cURL from the command line is using the web too:

curl https://www.theverge.com/2017/5/24/15681958/what-is-web-definition > x.x

And that finished in a fraction of a second. The web is fast. Check that. The speed of the web depends upon the reader's internet connection.
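
cURL can report its own timing too, which removes the guesswork. A minimal sketch; the exact numbers will obviously vary with the connection:

# Discard the body; print total transfer time and size of the raw HTML.
curl -s -o /dev/null -w 'time: %{time_total}s  size: %{size_download} bytes\n' https://www.theverge.com/2017/5/24/15681958/what-is-web-definition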

Currently, I'm using fast WiFi. The web is fast for me right now.

Last week, while visiting my wife's parents in central Michigan, my phone had a 3G connection much of the time. I accessed websites with JavaScript disabled to make the pages load in a reasonable time. If I permitted JavaScript, web pages of the type created by The Verge would never finish downloading in a tolerable timeframe.

With my 3G connection last week, I accessed this site and my other websites fine because I'm not forcing the browser to download mountains of useless JavaScript, trackers, and ads. My sites are not trying to generate revenue.

With slow internet connections, the web can still be fast, or at least usable, provided the websites are NOT designed like The Verge's.

After I curled The Verge's article, I ran the file through a Perl script called html2text.pl. Then I edited the saved text file and removed all the text unrelated to the article. Here's that page:

http://testcode.soupmode.com/thevergeweb.txt
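
If you don't have an html2text script handy, the lynx text browser can produce a similar plain-text dump. This is just a sketch, assuming lynx is installed; the file names are placeholders, and the manual cleanup step is still needed:

# Save the raw HTML, then dump a plain-text rendering without the link list.
curl -s https://www.theverge.com/2017/5/24/15681958/what-is-web-definition > verge.html
lynx -dump -nolist verge.html > verge.txt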

I used a web browser to access that text file. That's still the web, even if the returned content is plain text. An HTML file contains markup that tells the web browser how to display the content. But plain text looks the same in the web browser as it does when viewed on the file system with the cat command or the vim editor.

Obviously, an HTML file looks different in a text editor than it does in a web browser. But the web is the transport mechanism. Web clients can download many file types over the web. It's not only about HTML.

HTML makes it comfortable for the reader to read a website. But some websites are so bloated and clunky that a plain text file would be preferable, even if it required pinching and zooming to read on a phone.

The number of horribly bloated websites makes the Gopher protocol seem like a viable option. Gopher and the web both run over the internet, but Gopher deals mainly with raw text. The files look about the same in a Gopher browser as they do in a text editor.


Jun 23, 2017

Nothing will get solved when people fail to understand the difference between the internet and the web, and when people fail to see the difference between a slow internet connection and a massively-bloated website that forces users to download megabytes of crapware to read a simple text article.

https://blog.apnic.net/2017/06/19/why-is-the-internet-so-slow/

Our measurements reveal that the Internet is much, much slower than it could be: fetching just the HTML of the landing pages of popular websites is (in the median) ~37 times worse than c-latency. Where does this huge slowdown come from?

A single web page may require a user's browser to make hundreds of requests for a text article.
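
A crude way to get a feel for that from the command line is to count the external resources referenced directly in the page's HTML. This undercounts, because scripts trigger further requests once they run, but it gives a floor:

# Count src/href references to external URLs in the page's HTML.
curl -s https://blog.apnic.net/2017/06/19/why-is-the-internet-so-slow/ \
  | grep -oE '(src|href)="https?://[^"]*"' \
  | wc -l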

Related HN thread: https://news.ycombinator.com/item?id=14617713

HN comments:

The page this article is published on makes 87 separate network requests to 19 different domains. Even with a great deal of the content cached, it takes 7 seconds to reload over my WiFi today. This is because I'm using an ad blocker; without one, the page makes more requests to more domains and transfers more stuff, slower.

It seems likely that cutting bloat would have a much bigger impact on responsiveness than infrastructure upgrades. Cutting bloat would produce benefits even (especially) in times and places where signals or infrastructure are marginal. Cutting bloat could mean your old phone or tablet computer could browse the web tolerably instead of ending up in a landfill. Cutting bloat from a website today produces a benefit, for all visitors, worldwide, today. Improving infrastructure may be a good investment, but cutting bloat is a force multiplier.

The responsiveness of the web has been adjusted to optimize the number of eyeballs that see ads. If a site is too sluggish, people leave, but a site that's too fast is leaving money on the table. How many advertising, tracking, and affiliate domains would a web page like this connect to, if latency were cut by two thirds?

Another HN comment:

It's not hard to make something fast on the web. It's just usually way easier to throw the kitchen sink at the problem.

Another HN comment:

This took me a total of 5 minutes: http://static.haldean.org/why-is-the-internet-slow.html

The page is 2.3kB on the wire, and the single other request it makes (for the image) adds another 70kB. Taken together, this is a two-order-of-magnitude decrease in page size, with exactly the same content. Every byte is sacred! This webpage infuriates me.

Original article test: https://www.webpagetest.org/result/170624_HC_6XK/
https://blog.apnic.net/2017/06/19/why-is-the-internet-so-slow/
From: Dulles, VA - Chrome - Cable - 6/23/2017, 10:15:42 PM
First View, Fully Loaded: Time = 8.372s, Requests = 87, Bytes In = 1,254 KB, Cost = $$$ (out of a maximum of five)

That's pretty good, relatively speaking, compared to most media websites.

But here are the results for a simpler web page with the same content. Apparently, someone else also likes to create stripped-down versions of existing pages.

https://www.webpagetest.org/result/170624_NC_76M/
http://static.haldean.org/why-is-the-internet-slow.html
From: Dulles, VA - Chrome - Cable - 6/23/2017, 10:26:42 PM
First View, Fully Loaded: Time = 0.591s, Requests = 3, Bytes In = 15 KB, Cost = $

Yep. That's the web. And it's fast if websites are designed simply.

But websites want to make money. They need to make money to exist.


fun and helpful - https://betterwebtype.com/triangle


Jul 5, 2017: https://www.quirksmode.org/blog/archives/2017/07/the_elephant_in.html

In my opinion, so-called "modern" web design that lacks progressive enhancement is hostile. Such poor web design opposes diversity, inclusiveness, and accessibility, especially for websites that have browsing-only users (readers). If it's a web APP that requires users to log in and work from a dashboard or admin console, then that might be a different story that does not apply here.

Although there’s a lot of heated discussion around diversity, I feel many of us ignore the elephant in the web development diversity room. We tend to forget about users of older or non-standard devices and browsers, instead focusing on people with modern browsers, which nowadays means the latest versions of Chrome and Safari.

Older devices with slower CPUs glow red when more than two media websites are open at the same time.

PPK's post is brilliant.

Ignoring users of older browsers springs from the same causes as ignoring women, or non-whites, or any other disadvantaged group. Average web developer does not know any non-whites, so he ignores them. Average web developer doesn’t know any people with older devices, so he ignores them. Not ignoring them would be more work, and we’re on a tight deadline with a tight budget, the boss didn’t say we have to pay attention to them, etc. etc. The usual excuses.

It seems that too many web developers test their designs under perfect working conditions, with the latest technology and perfect eyesight.

How well does the browsing-only website load over a 3G connection?

Besides, let’s be realistic, shall we? The next billion, most of whom are on cheap Android devices without the latest and greatest browsing software, are mostly poor — and mostly black or brown anyway — so they don’t fit in our business model. We only cater to whites — not because we’re racist (of COURSE we aren’t!) but because of ... well, the Others are not our market.

How fast are their internet connections now, and how fast will they be? Will they have download data limits?

It's hostile for browsing-only websites to force readers to download 5 to 10 megabytes of bilge to view a simple, mostly text-based article. Not the entire website, but one article.

So far, this diversity problem plays out the same as the others. However, there’s one important difference: while other diversity problems in web development could conceivably be solved by non-web developers (by managers, for instance), the old-devices problem can only be solved by us because we’re the only ones who know how to do it.

Besides, taking care of all users is our job. So let’s do our job, shall we?

And let’s start at the start. Let’s admit we have a prejudice against users of old or non-standard devices and browsers.

https://mobile.twitter.com/ppk/status/882589599998119936?p=p

A counterargument tweet:

I can't even begin to explain how absolutely wrong this article is in many ways. Bizarre clickbait designed to make you look conscientious.

Apparently, he can't explain his point because he didn't.

Tweet responses to the above counterargument:

A solid, well-researched, well-cited counterargument. Five stars.

Good one. Funny.

Yup. This is the sort of carefully-argued, nuanced discussion that we need.

Man, Twitter's web-based UI/UX sucks stunningly badly on all devices. I have to disable JavaScript to get something close to normal web behavior. Twitter fights the web browser. Twitter's site behaves abnormally, at least based upon my expectations of how the web should work. It's difficult, sometimes impossible, to copy text. Clicking some links opens new tabs that I didn't want opened. Things collapse and expand when I don't expect those actions after clicking something. Bizarre.


Jul 2017

https://news.ycombinator.com/item?id=14698545

Today, for some reason, everyone thinks that writing software for a platform on a platform on a platform with no interconnecting standards or protocols is a great idea. Instead of trying to improve people's lives, we're just making things needlessly complex, buggy, and bloaty. You need 8 gigs of ram minimum just to browse the web.


https://accessibility.blog.gov.uk/2016/09/02/dos-and-donts-on-designing-for-accessibility


https://popmotion.io/blog/20170710-mobile-web-is-awful-and-were-all-to-blame https://news.ycombinator.com/item?id=14736313


http://mediagazer.com/170713/p5#a170713p5 https://www.wsj.com/articles/online-publishers-try-reducing-ads-to-boost-revenue-1499940061 http://jothut.com/cgi-bin/junco.pl/replies/92793


Jul 25, 2017 https://www.axios.com/exclusive-flipboard-is-crushing-it-on-mobile-2464930844.html

Flipboard's investments in a simple mobile user experience and editorially curated, quality content are paying off.

"Creating great user experiences is similar to producing great writing — it's not always about what you add, but often about what you leave out," says Parse.ly CTO Andrew Montalenti.

"Parse.ly data indicates that platforms that simplify the UX around content, like Flipboard and Reddit, have seen sustained growth, especially on mobile, and will likely be a larger portion of traffic moving forward."


Aug 1, 2017 - I found another one. I think that this is number five.

https://thebestmotherfucking.website


https://daringfireball.net/2017/07/df_display_ads

For readers, these are ads that, again, are visually unobjectionable, and which offer the most privacy you could hope for. Not only is there no tracking involved, there is no JavaScript involved. They’re just images, text, and HTML links.

Sounds like ethical advertising. That won't become mainstream because web publishers would need to design with empathy for the readers.

https://daringfireball.net/linked/2017/06/22/navistone-form-data

This might sound hyperbolic, but I mean it: I think we’d be better off if JavaScript had never been added to web browsers.

#manifesto