Critical Times Require Lightweight Web Designs

Government agencies and media orgs use inhumane web designs

created Mar 26, 2020 - updated Apr 23, 2020

TL;DR

Obviously, I enjoy bitching and whining about the demise of the Web of Documents type of websites.

Too many orgs that are allegedly responsible for informing the public are deploying websites that are thoughtless and uncaring because of their unnecessarily bloated web designs. Their web design choices lack empathy.

These website owners do not consider that some people may have poor eyesight, low-resolution screens, slow CPUs, old versions of browsers and operating systems, and slow or spotty internet connections for various reasons.

The website owners and the designers and developers probably access their websites from their workplaces, using relatively new monitors and CPUs, perfect indoor lighting, perfect or near-perfect eyesight, and high-speed internet access. And since their garbage websites respond "quickly" for them, and since they can read their websites well, they assume that everyone has the same experience.

Do I have evidence to support the above paragraph? No. But it must be true; otherwise, why would their garbage websites exist? It does not appear that they test their sites on older computers and slow internet connections.

Web READERS have grown accustomed to "modern" websites performing poorly and being inaccessible at times. Web users don't complain. They won't send a snail-mail letter. They won't send an email. They won't make a pull request. They will move on and use Facebook.

Website owners blame spikes in traffic. Right. When websites force users to download megabytes of JavaScript to read text, then 100 percent of the blame belongs to the website owners. That JavaScript could be stored at a CDN, but how much of that JavaScript is making API requests for data that is stored on the site owners' servers?

And if the web READERS only have a 2G cellular internet connection for some reason, then the megabytes of JavaScript won't be downloaded and executed, and the important text content may never display.

But nothing will change. We won't see a trend toward developing information-based websites with only simple HTML, along with a small amount of CSS and definitely no JavaScript. A year from now, these disgusting web designs will be even more bloated. That's the trend. More and more useless bloat.

These are not web apps. How in the hell can information-based websites throw 500-type errors in 2020? Media orgs and government agencies that are supposed to inform the public are doing a disservice by using bloated modern web designs.

Sep 11, 2001

This might be the first example of a media org switching its website to a 'lite' mode to handle traffic.

http://sawv.org/2001/09/11/terrorism-hits-nyc-and-washington.html

The CNN web site has been altered to be a simple page with one photo and some headlines. The site is obviously receiving heavy traffic.

Sep 2016 update: Here are Sep 11, 2001 news pages from CNN.com.

On that day, CNN and maybe some other news sites converted their websites to a "lite" mode to meet the traffic demand. Even then, the default sites for news orgs were too slow. CNN's lite mode from 2001 with its static HTML files would be useful and welcomed today in 2016.

https://webpagetest.org/result/200326_CC_f2ec4fe4b7c0869b5915268f90777258/
https://www.cnn.com/2001/US/09/11/worldtrade.crash/index.html
From: Dulles, VA - Chrome - Cable
3/26/2020, 12:20:55 PM
First View Fully Loaded
Download time: 0.582 seconds
Web requests: 4
Bytes downloaded: 31 KB
No JavaScript
22 KB of the download were for images.

https://webpagetest.org/result/200326_RN_f889ff45d3672d41a8226e73d36b10d8/
https://www.cnn.com/2001/US/09/11/chronology.attack/index.html
From: Dulles, VA - Chrome - Cable
3/26/2020, 12:21:09 PM
First View Fully Loaded:
Download time: 0.736 seconds
Web requests: 26
Bytes downloaded: 82 KB
6 KB were for JavaScript


Web browsers

Maybe these informational, Web of Documents type of websites should function 100 percent in these browsers.

Links2:
http://sawv.org/2020/03/10/links2-web-browser.html

NetSurf:
https://www.netsurf-browser.org/

Lynx:
https://lynx.invisible-island.net/

http://sawv.org/2018/05/25/opera-mini-on-blackberry-classic-phone.html

http://sawv.org/2017/10/17/my-favorite-app-is-the-web-browser.html

The link below points to a recreation of the original CERN web browser, the world's first web browser, rebuilt in JavaScript in early 2019, 30 years later. This is actually a good use for JavaScript.

https://worldwideweb.cern.ch/browser/

How many of today's websites display and function within the first web browser? That browser did not and does not support the img tag. sawv.org works. danluu.com works. But for some reason, the links for text.npr.org don't work.


Ohio's Coronavirus site

https://coronavirus.ohio.gov/

http://sawv.org/2020/03/13/the-antithesis-of-modern-web-design-is-usefulness.html

Sun, Mar 22, 2020

So-called "modern" web design is failing at the wrong times. At 3:22 p.m., Ohio governor Mike DeWine concluded a press conference. The conference focused on a new directive handed down by the governor that will go into affect tomorrow night. I have no idea what it's about. The governor and the others speaking at the press conference know that their website is unresponsive, but the officials were not repeating whatever it is that we are suppose to read on a website that does not work because of traffic.

Mon, Mar 23, 2020

https://coronavirus.ohio.gov/ redirects to the URL shown below in the test.

https://webpagetest.org/result/200323_ZH_9bb597373c92fb82be29922174fe0f32/
https://coronavirus.ohio.gov/wps/portal/gov/covid-19/
From: Dulles, VA - Chrome - Cable
3/23/2020, 6:20:49 AM
First View Fully Loaded:
Download time: 7.774 seconds
Web requests: 82
Bytes downloaded: 3,560 KB

1.89 megabytes of the download were due to JavaScript. This is obscene for an important informative website that SHOULD be designed to be a Web of Documents type of website.

Would hosting the informative content in simple HTML pages with minimal inline CSS on a $5 a month Digital Ocean Droplet hold up under the strain of the Sunday afternoon surge in traffic?

Would hosting those same simple pages at an AWS S3 bucket work?
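I have not tested this under a traffic spike, but for what it's worth, publishing a folder of simple HTML pages to an S3 bucket is only a couple of AWS CLI commands. A sketch; the bucket name and folder below are placeholders:

# Create a bucket and turn on static website hosting (bucket name is a placeholder).
aws s3 mb s3://example-covid19-info
aws s3 website s3://example-covid19-info --index-document index.html --error-document 404.html

# Upload the folder of HTML pages and remove anything that no longer exists locally.
aws s3 sync ./site s3://example-covid19-info --delete --acl public-read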

Would using GitHub Pages work?

If I had to use a CMS-hosted solution to host my content, I would choose GitHub Pages, but I would not use git from the command line on a local computer. I would create and update pages through web browsers by using GitHub's default web editing interface, or I would use Development Seed's prose.io editor.

October 2013 - https://medium.com/devseed/its-called-jekyll-and-it-works-1e6d70d27f5f

Good morning tech pundits! The main HealthCare.gov landing page and the thousands of subpages that educate the public on Affordable Care Act insurance are powered by Jekyll. This portion of the website has experienced 100% uptime and has functioned perfectly since we launched it in June. The site and our approach is all part of how we build CMS free websites.

Jekyll is an awesome open source project started at GitHub five years ago that generates static websites. We also built Prose.io, a web-based content editor specifically designed to make it simple for content creators to publish to Jekyll on GitHub (where we store all of our code). Together Prose.io and Jekyll enable building simple, flexible, and reliable sites without the overhead of dynamic CMSs, and the approach has worked great so far on HealthCare.gov and many other sites.

https://medium.com/devseed/how-we-build-cms-free-websites-d7e19d94a0ff

Mon, Mar 30, 2020 update

https://webpagetest.org/result/200330_67_1e3905b0e8dc5fc20f2954cf8e98f55a/

https://coronavirus.ohio.gov/
From: Dulles, VA - Chrome - Cable
3/30/2020, 6:16:00 PM
First View Fully Loaded:
Download time: 6.822 seconds
Web requests: 98
Bytes downloaded: 3,347 KB

1.0 megabytes of the download were for JavaScript. That's slimmer. A week ago, 1.89 megabytes of the download were for JavaScript. Did the tech team locate areas that could be trimmed? Why in the hell is even 1 megabyte of JavaScript needed? That's still idiotic. 1 megabyte of the download was for images, which is also ridiculous. People visit the site mainly for TEXT.

Here's my boring, simple simon ass version of the Ohio government's coronavirus homepage. It's Craigslisty, but it contains the same info, which is stats, links, and link descriptions. There's not much info on the Ohio government's homepage, which makes it whacked that it requires one megabyte of JavaScript to be downloaded to unsuspecting readers.

http://sora.soupmode.com/coronavirusohiogov.html

For my version, I got tired of connecting the links to actual web pages. I need to finish that part, but adding the rest of the URLs will have little to no impact on the webpagetest.org results.

https://webpagetest.org/result/200330_R1_b1cf29ac876989295785c575faef758e/
sora.soupmode.com/coronavirusohiogov.html
From: Dulles, VA - Chrome - Cable
3/30/2020, 7:24:16 PM
Download time: 0.270 seconds
Web requests: 2
Bytes downloaded: 4 KB
100 percent of the download was for HTML because the CSS is contained within the web page.
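A page like that boils down to roughly the shape below. This is only a sketch, not the exact markup of my page, and the stats and links are placeholders: one self-contained HTML file with a few lines of CSS inlined in the head, written out here by a trivial shell script.

#!/bin/sh
# Write a single self-contained HTML page: inline CSS, no JavaScript, no images.
cat > coronavirusohiogov.html <<'EOF'
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1">
<title>Ohio Coronavirus (COVID-19) Information</title>
<style>
body { max-width: 40em; margin: 1em auto; padding: 0 1em; font-family: sans-serif; line-height: 1.5; }
</style>
</head>
<body>
<h1>Ohio Coronavirus (COVID-19) Information</h1>
<p>Confirmed cases: NNN | Deaths: NN (placeholder numbers)</p>
<ul>
<li><a href="https://coronavirus.ohio.gov/">Official state site</a> - short link description</li>
<!-- more stats, links, and link descriptions -->
</ul>
</body>
</html>
EOF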

On the Ohio Coronavirus homepage, a link points to an article page, titled Resources for Economic Support.

https://webpagetest.org/result/200331_FF_934aac82cbf656958b1ca8887ebc871d/
https://coronavirus.ohio.gov/wps/portal/gov/covid-19/home/resources-for-economic-support/econ
From: Dulles, VA - Chrome - Cable
3/31/2020, 12:21:02 AM
First View Fully Loaded:
Download time: 4.480 seconds
Web requests: 44
Bytes downloaded: 2,020 KB
Cost: $$$$$ (the max is 5 dollar signs)
15 web requests and 600 KB were for JavaScript. 740 KB were for CSS. Jeesh.

Mmm. New discovery with the webpagetest.org results. Normally, I don't click the "Content Breakdown" link, but this time, I did.

https://webpagetest.org/result/200331_FF_934aac82cbf656958b1ca8887ebc871d/2/breakdown/

The 600 KB JavaScript number mentioned above is the compressed size. The content breakdown shows how big the JavaScript is when uncompressed: 2 megabytes. Compression helps the transfer, but the browser still has to decompress, parse, and execute the full 2 megabytes.

Anyway, it appears that Ohio Coronavirus article pages contain less bloatware than the homepage, but why is ANY JavaScript needed? That article page contains bullet points, text, and links. It's a simple, informative page. Why would such a simplistic but useful webpage require an unsuspecting user to download 2 megabytes? What attributes in modern web design are users clamoring for that require JavaScript for a page with bullet points, text, and links?

Here's my version.

https://webpagetest.org/result/200331_7K_4f991d482cd1233567499675668a2a28/
http://sora.soupmode.com/resources-for-economic-support.html
From: Dulles, VA - Chrome - Cable
3/31/2020, 12:24:56 AM
First View Fully Loaded:
Download time: 0.297 seconds
Web requests: 2
Bytes downloaded: 8 KB
Cost: $

Apr 6, 2020 update

Toledo's county is Lucas County. This is the county health department's coronavirus page.

https://lucascountyhealth.com/coronavirusupdates

It appears fairly benign. The focus is on content. This is another page that consists mostly of bullet points, text, and links.

Some of the text colors seem odd and confusing. Some blue underlined text is not a link while other blue underlined text and blue text that's not underlined are links.

I like the simple JPEG map.

JavaScript is used to expand info at the bottom of the page. Some of these sections are large. In my opinion, each one of these sections should exist on its own web page.

Here are the webpagetest.org results for the Lucas County page.

https://webpagetest.org/result/200406_T0_171428fd5b5c6e40a1c64a60ffd2fbd9/
https://lucascountyhealth.com/coronavirusupdates/
From: Dulles, VA - Chrome - Cable
4/6/2020, 10:46:08 AM
First View Fully Loaded:
Download time: 9.416 seconds
Web requests: 149
Bytes downloaded: 4,899 KB

Nearly 5 megabytes???

69 web requests and 2 megabytes of the download were for JavaScript. A local government agency's web page forces unsuspecting readers or web browsers to download 2 megabytes of JavaScript for a simple web page.

1.8 megabytes of the download were for images.

Here's my version.

http://sora.soupmode.com/lucas-county-coronavirus-information.html

https://webpagetest.org/result/200406_QN_a09d3adac761c8d953c758ea78cb64e6/
sora.soupmode.com/lucas-county-coronavirus-information.html
From: Dulles, VA - Chrome - Cable
4/6/2020, 5:47:32 PM
First View Fully Loaded:
Download time: 0.854 seconds
Web requests: 3
Bytes downloaded: 265 KB

267 KB of the download were for one image, the map of the county that showed the number of coronavirus cases by zip code.

Instead of embedding the map, which displays too small anyway due to the width that I use for the page, I should simply link to the map.

I updated my version by removing the embedded image. I provided a link to the JPEG version, underneath the link to the PDF version of the map.

https://webpagetest.org/result/200406_8P_08de688939b38794d707f76388d307a8/
sora.soupmode.com/lucas-county-coronavirus-information.html
From: Dulles, VA - Chrome - Cable
4/6/2020, 6:04:49 PM
Download time: 0.272 seconds
Web requests: 2
Bytes downloaded: 4 KB


Mar 23, 2020 HN threads

Get Static (meyerweb.com)
https://meyerweb.com/eric/thoughts/2020/03/22/get-static/
https://news.ycombinator.com/item?id=22659324

A Starter Kit for Emergency Websites (mxb.dev)
https://mxb.dev/blog/emergency-website-kit/
https://news.ycombinator.com/item?id=22657571

First of all, a so-called static website does not imply lightweight and fast-loading for READERS. A static HTML page can contain links to megabytes of JavaScript that can bloat the web reading experience.

If a website has every informational page dynamically generated by server-side code, then that site can still be fast for READERS under normal server load. But if traffic spikes, then bottlenecks can occur on the server side, in the programming code that needs to be executed to create the pages and/or in the server-side code making database connections.

This is the reason for publishing information websites as static HTML pages. No other programming code needs to be executed. No databases need to be accessed. The web server returns the HTML page from the file system. More sophisticated setups can go further and cache the static HTML pages on edge servers.
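One crude way to check the "holds up under load" claim is to point a load-testing tool at a page and compare failure counts. A sketch using ApacheBench; the URLs below are placeholders:

# 2,000 requests, 100 concurrent, against a static HTML page (placeholder URL).
ab -n 2000 -c 100 https://example.org/index.html

# Run the same test against the dynamically generated equivalent and compare
# the "Failed requests" and "Requests per second" lines in the two reports.
ab -n 2000 -c 100 https://example.org/dynamic-page/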

Excerpts from the meyerweb.com post:

If you are in charge of a web site that provides even slightly important information, or important services, it’s time to get static. I’m thinking here of sites for places like health departments (and pretty much all government services), hospitals and clinics, utility services, food delivery and ordering, and I’m sure there are more that haven’t occurred to me. As much as you possibly can, get it down to static HTML and CSS and maybe a tiny bit of enhancing JS, and pare away every byte you can.

Doing that now may be too late, unless the org can afford to employ a team of people to work around the clock to make the above happen. The time to do this is with the initial design or redesign of the website under normal circumstances. Waiting until it's an emergency is probably too late for most orgs, especially local newspapers that may not have an IT team.

As much as you possibly can, get it down to static HTML and CSS and maybe a tiny bit of enhancing JS, and pare away every byte you can.

That should be the DEFAULT web design all of the time for Web of Documents type of websites.

Because too many sites are already crashing because their CMSes can’t keep up with the traffic surges. And too many sites are using dynamic frameworks that drain mobile batteries and shut out people with older browsers. That’s annoying and counter-productive in the best of times, but right now, it’s unacceptable. This is not the time for “well, this is as performant as our stack gets, so I guess users will have to live with it”. Performance isn’t just something to aspire to any more. Right now, in some situations, performance could literally be life-saving to a user, or their family.

We’re in this to serve our users. The best service you can render at this moment is to make sure they can use your site or service, not get 502/Bad Gateway or a two-minute 20%-battery-drain page render. Everything should take several back seats to this effort.

Maybe for you, getting static means using very aggressive server caching and a cache-buster approach to updating info. Maybe it means using some kind of static-render plugin for your CMS. Maybe it means accelerating a planned migration to a static-site CMS like Jekyll or Eleventy or Grav. Maybe it means saving pages as HTML from your browser and hand-assembling a static copy of your site for now. There are a lot of ways to do this, but whatever way you choose, do it now.

The related HN thread contained 86 comments. Here's the top comment:

Static site generator for non-technical people:

  1. Write your article in Microsoft Word.
  2. Save as 'Web Page, Filtered'
  3. Upload/Place in html folder on Webserver.

Almost everyone has access to a copy of Word, if not I think LibreOffice has a html save function also.

This solution is 100% wrong for us technical people, but for non-techies who just want to get the information out there it's a really good option imho.

Over the past couple of years, Netlify has been a popular service mentioned by HN users for creating and managing websites, and it's mentioned a lot in this thread too.

Anyway, another HN comment:

I am sort of surprised at how talk about static websites is only becoming top of mind as of recent. When I think of static websites used with success, I always think of Obama's campaign, it was built with Jekyll. There was quite a bit of talk around static websites around that time, and it seemed like quite a few people were on-board with the idea.

Numerous articles were created that described the team and the tech used by the 2008 and 2012 Obama election campaigns.

Another HN comment that describes a process that I don't fully understand:

I've been working on an restaurant order system for a client lately.

Instead of doing the obvious thing and build a database connected web application hosted in the cloud; I opted for a local application that generates static files for customer ordering, uploads them to a web server and polls for orders. On the server there's a simple PHP script that writes a chunk of JSON to a file when the order is created.

The main motivation was to allow them to continue taking orders in the restaurant and perform maintenance even if the internet connection isn't working. Generating files also simplifies the implementation of customer specific customizations.

And since the database isn't customer/web facing any more and therefore doesn't need to scale beyond a couple of simultaneous users, I opted for a simple file based solution.

The fact that the customer order part runs blazing fast is a nice bonus.

That same user added:

The application is used for everything that happens locally, including taking orders in the restaurant and organizing deliveries.

Polls for orders means it reads the json-file that is appended to via the php-script when a customer makes an order from the web interface and imports new orders into the local database.

The application generates json-files, which are then read by static html files. This makes it easier to test with stub data and allows shipping the html/css to a designer if/when they want something nicer looking.

The html files and generated json data is never used locally.

Of course, JAMstack is hot now. https://jamstack.org

Fast and secure sites and apps delivered by pre-rendering files and serving them directly from a CDN, removing the requirement to manage or run web servers.

JAMstack Course - Build websites that are simpler, faster, and more secure
https://www.youtube.com/watch?v=A_l0qrPUJds

When it's a Lucas County or Toledo government agency website, how sophisticated can the process flow be? Who maintains the infrastructure setup? And what new stack will become popular two or three years from now? Will boring, sturdy tech be used that can hold up and be maintained 5 to 10 years from now?

Most Web of Documents type of websites do not need to be built to the scale of Reddit.

HN comment:

It amazes me why every site seems to need the latest and greatest framework just to tell me a paragraph of text.

HN comment about something I know nothing about:

Static is not the way to go. There are technologies today which can scale fast without crashing. You don’t need static sites to handle the load. Checkout a serverless CMS like Webiny - https://www.webiny.com

Good observation from an HN commenter:

A lot of the time, you use a CMS so that non-technical people can manage content. I'm also not aware of any static site generator that doesn't completely fail in this regard.

The process to create and update content probably needs to be done through the web browser. Requiring non-technical users to work from the command line and use git commands will probably be a big hurdle.

HN comment:

A government website, or anything of importance during an emergency, really has no excuse when it comes to being excessively heavy, both on the front end and the back end. At times of panic you want to convey information as fast as possible and as efficiently as possible. Five nines availability.

HN comment:

Specifying content, styling, and layout are what HTML and CSS were designed to do, and they do it quite well. Throwing the bigger hammer of Javascript at this problem means anybody who likes to browse without Javascript (which is quite a few of us in this era of abusive ads and malware) is not going to see your site properly. It also means you've opened the door for some future maintainer to introduce a multi-megabyte Javascript library 98% of which isn't actually needed, and which will very much consume a lot more power on the client's platform than HTML and CSS.

Never use a Turing-complete language to solve a problem that's already well-solved with a non-Turing-complete language. Especially when that non-Turing-complete language is already present on the client's machine. Most especially when you're talking about static sites for essential public information.


Excerpts from the mxb.dev post:

In cases of emergency, many organizations need a quick way to publish critical information. But existing (CMS) websites are often unable to handle sudden spikes in traffic.

Ahhh. Why not build lightweight websites by default? How often will such emergencies occur at local, regional, and state levels that require websites to "flip the switch" to lite mode? And what happens when flipping the switch fails? It would have to be tested periodically.

To make things worse, natural disasters can also damage local network infrastructure, sometimes leaving people with very poor mobile connections.

I’ve written about the practice of publishing minimal “text-only” versions of critical news websites before and I think it makes a lot of sense to rely on the rule of least power for these things. When it comes to resilience, you just can’t beat static HTML.

Like so many others, I’m currently in voluntary quarantine at home - and I used some time this weekend to put a small boilerplate together for this exact usecase.

Here’s the main idea:

  • generate a static site with Eleventy
  • minimal markup, inlined CSS
  • aim to transmit everything in the first connection roundtrip (~14KB)
  • progressively enable offline-support w/ Service Worker
  • set up Netlify CMS for easy content editing
  • one-click deployment via Netlify to get off the ground quickly

The offline-support is interesting, but probably not needed for all events nor by all types of websites.

The site contains only the bare minimum - no webfonts, no tracking, no unnecessary images. The entire thing should fit in a single HTTP request. It’s basically just a small, ultra-lean blog focused on maximum resilience and accessibility. The Service Worker takes it a step further from there so if you’ve visited the site once, the information is still accessible even if you lose network coverage.

The end result is just a set of static files that can be easily hosted on cloud infrastructure and put on a CDN. Netlify does this out of the box, but other providers or privately owned servers are possible as well.

His demo site: https://emergency-site.dev

The related HN thread contained 108 comments. Here's the top comment:

What we did for tacticalvote.co.uk:

  1. A static site generator, with markdown as the source input in Github
  2. Data from Google Sheets
  3. A bash job on a cron that would poll both for changes... if changes exist, re-publish the site or data and purge Cloudflare cache using their API
  4. Configure Cloudflare via Page Rule to Cache Everything

Even with a very high change rate and hundreds of thousands of visitors a day and severe traffic spikes... the site was instantaneous to load, simple to maintain and update, and the cache purge stampede never overwhelmed the cheapest Linode serving the static files.

The content editors used Github as the CMS and edited Markdown, or just updated data in Google Sheets. Changes were live within 5 minutes.
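A hedged sketch of what that kind of cron job might look like. The repo, build command, web root, zone ID, and API token below are all placeholders, not the actual tacticalvote.co.uk setup:

#!/bin/sh
# Run from cron every few minutes: rebuild the static site if the source changed,
# then purge the Cloudflare cache so the new pages go live.
cd /srv/site-source || exit 1

git fetch origin
if [ "$(git rev-parse HEAD)" = "$(git rev-parse origin/master)" ]; then
    exit 0   # nothing changed, nothing to rebuild
fi
git merge --ff-only origin/master

# Rebuild the site (placeholder build command) into the web root.
jekyll build --destination /var/www/html

# Purge the Cloudflare cache through its API (zone ID and token are placeholders).
curl -s -X POST "https://api.cloudflare.com/client/v4/zones/ZONE_ID/purge_cache" \
    -H "Authorization: Bearer API_TOKEN" \
    -H "Content-Type: application/json" \
    --data '{"purge_everything":true}'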

I like it, but it sounds too technical for most users.

HN reply:

You lost most municipals with this. Yes, even this is too complicated.

And again, Netlify is mentioned often in that HN thread.

I have used a few Amazon Web Services, such as Route53, EC2, S3, and CloudFront, and I like using them, but the usage leans toward the technical side.

Setup is one thing. Maintenance is another. And creating and updating content is a third function. When it's the same person or the same group of people doing all three things, then it's easy.

A client, such as a government agency or a media org, may hire a web development firm to create the setup. That company may train the client's IT staff to do the maintenance. The client's IT staff will train its non-technical users to create and update content.

I like this HN comment:

Not to be "that guy", as I mostly like modern Web design, but:

  • and after the emergency is over, keep it that way

Many other ideas were suggested in that HN thread.

The simplest solution is to make a static website of your dynamic one using wget, and then publish that. I did this e.g. for Wordpress sites, works really well and is very reliable. The process can be triggered via a cron script or manually (I wrote a small Wordpress plugin for it). No special hardware, infrastructure or cloud services required. Just make sure all resources are reachable via a link (so wget can find them) or manually point it at a list of otherwise unreachable files (e.g. using a sitemap or .txt file).

The advantage is that you can still use your existing CMS, so your staff won't need to learn a new system, and you also don't need any third-party cloud services.

Actually, if your CMS is properly configured (e.g. correct cache headers) you can also simply put it behind a CDN like Cloudflare, which will handle the caching and scaling for you.

Normally you can just use

wget --mirror --convert-links --adjust-extension --page-requisites --no-parent http://example.org

https://www.guyrutenberg.com/2014/05/02/make-offline-mirror-of-a-site-using-wget/
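A rough sketch of how that approach could be automated from cron; the hostname and paths below are placeholders:

#!/bin/sh
# Mirror the live CMS into a folder of plain HTML files, then publish that folder.
cd /srv/mirror || exit 1

wget --mirror --convert-links --adjust-extension --page-requisites --no-parent \
    https://example.org/

# wget writes the mirror into a directory named after the host.
# Publish that copy to the document root that readers are served from.
rsync -a --delete example.org/ /var/www/static/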

Another good reason to use Cloudflare in front of a website is for protection against denial-of-service attacks.

Back to the HN thread, this commenter used the boilerplate and Netlify to launch a simple website. This is a good working example.

The one-click deploy to Netlify worked for me. Have used this to spin up a site listing crisis communications advice around Coronavirus: https://coronaviruscomms.netlify.com/


March 25-26, 2020 Blade stories

http://www.toledoblade.com/local/Coronavirus/2020/03/25/port-clinton-mayor-father-on-voluntary-quarantine-after-taiwan-trip-coronavirus/stories/20200325110

As the Trump Administration began to crack down on international travel, it took Mrs. Snider about two days to maneuver the logistics of crashing websites and unanswered calls before she could get them new flights with the fewest layovers as possible, totaling $1,300.

This week, state unemployment websites have been unresponsive at times. Again, the blame is on a traffic spike. Okay. Could a simpler web design and better server-side choices have helped?

In the previous story and in this story below, if users are required to create accounts and log in, then I can understand the strain on server code and databases. But if the sites are primarily informational sites, then they should ALWAYS respond, or at least respond 99.99 percent of the time.

https://www.toledoblade.com/news/nation/2020/03/26/us-jobless-claims-soar-to-record-3-3-million-as-layoffs-jump/stories/20200326089

Many people who have lost jobs in recent days have been unable to file for unemployment aid because state websites and phone systems have been overwhelmed by a crush of applicants and have frozen up.

In New York, the state’s website repeatedly crashed when she was halfway through filling out her request. When she finally managed to press submit, she received a pop-up saying she had to file over the phone. That hasn’t worked well, either.


My past related thoughts

Over the past few years, I have also thought about lightweight web designs in critical times for homepages and live blogs. Holy hell, the media today maintain atrocious live blog pages.

http://sawv.org/2017/11/30/the-usa-todays-war-on-the-web.html

Pondering live blog design

From that post:

October 2018

My Oct 11, 2018 comments about the Panama City newspaper website The News Herald and the bloated web design it used at a time when people who still had internet or cell phone service may have had only limited access. Media orgs and government agencies NEED a way to convert their slow, bloated websites to a lightweight mode. People living in a crisis area may be actually interested in being informed.

http://sawv.org/2018/10/11/links-and-notes-thu-oct-11-2018.html#links

September 2018 post titled The Hurricane Web that I read in mid-October 2018.
https://mxb.at/blog/hurricane-web

As Hurricane Florence makes its way across the US southeast coast, many people are stuck in areas with severe flooding. These people rely on outside information, yet have limited bandwidth and power. To help them, news platforms like CNN and NPR provide text-only versions of their sites.

Text-only sites like these are usually treated as a MVP of sorts. A slimmed-down version of the real site, specifically for emergencies.

Most importantly, it’s user friendly. People get what they came for (the news) and are able to accomplish their tasks.

This is the web as it was originally designed. Pure information, with zero overhead. Beautiful in a way.

Javascript enables us to do amazing things and it can really enhance the user experience, if done right. But it always has a cost. It’s the most expensive way to accomplish a task, and it’s also the most fragile. It’s easy to forget that fact when we develop things on a highspeed broadband connection, on our state-of-the-art devices.

That’s why websites built for a storm do not rely on Javascript. The benefit simply does not outweigh the cost. They rely on resilient HTML, because that’s all that is really necessary here.

That NPR [text] site is a very useful thing that serves a purpose, and it does so in the simplest, most efficient way possible. Personally, I’d love to see more distilled experiences like this on the web.

Figure out what the main thing is people want from your site and deliver it - using the simplest, least powerful technology available. Make it withstand hurricanes.

http://lite.cnn.com was created in 2017 or 2018 to provide info to people impacted by an Atlantic hurricane. That site continues to exist today.

https://text.npr.org was created several years ago, maybe in the aught years. I'm unsure. I first learned about it approximately four years ago. In my opinion, text.npr.org is the best designed media website.

https://infrequently.org/2018/09/the-developer-experience-bait-and-switch/

JavaScript is the web’s CO2. We need some of it, but too much puts the entire ecosystem at risk. Those who emit the most are furthest from suffering the consequences — until the ecosystem collapses. The web will not succeed in the markets and form-factors where computing is headed unless we get JS emissions under control.


More from that same post http://sawv.org/2017/11/30/the-usa-todays-war-on-the-web.html

Aug 4, 2019

Two mass-shooting incidents occurred in the U.S. over a 12-hour period. The first shooting occurred in El Paso, TX around 1:00 p.m. EDT yesterday, and the second occurred in Dayton, OH around 1:00 a.m. today.

I would think that a newspaper like the Dayton Daily News would be a source of updated information, but it fails at live blog style reporting.

This is a disservice to the public.

https://www.daytondailynews.com/news/crime--law/police-responding-active-shooting-oregon-district/dHOvgFCs726CylnDLdZQxM/

https://www.webpagetest.org/result/190804_DD_ed7d7dd4b916911a5561930d64d03e6e/

From: Dulles, VA - Chrome - Cable
8/4/2019, 2:19:52 PM
First View Fully Loaded:
Time: 47.704 seconds
Requests: 650
Bytes in: 23,575 KB

650 web requests???

10.7 megabytes were for video. Why not separate that video out onto individual pages by using only links to those pages on the main live blog page? This would greatly reduce the size of the page, which makes the page loadable and usable for people over cell connections or slower connections.

https://www.nytimes.com/2019/08/04/us/dayton-ohio-shooting.html
https://www.webpagetest.org/result/190804_E4_82fada5c9989874a441cea73863e83e3/
From: Dulles, VA - Chrome - Cable
8/4/2019, 2:33:12 PM
First View Fully Loaded:
Time: 20.814 seconds
Requests: 365
Bytes in: 7,548 KB

That's better than the DDN page, but that's not saying much. The great NYT with all of its technical know-how still fails at the live blog concept.

365 requests???

122 requests were for JavaScript. Unacceptable.

4 megabytes of the download were for JavaScript. Definitely criminal, regarding web design.

I don't get it. I'll never understand how newspapers fail at providing useful web design.


http://sawv.org/2018/03/12/march-1213-2018-new-england-winter-storm.html

Live blog fails to function with JavaScript disabled.
https://www.boston.com/news/weather/2018/03/11/live-updates-boston-snow-storm-noreaster-march-12-13-2018

This live update page also fails to function with JavaScript disabled.
http://www.bostonglobe.com/metro/2018/03/12/minute-minute-updates-tuesday-nor-easter/Xcq3HpJhDYC0hRDwGhgB8H/story.html

These pages display with JavaScript disabled. Both sites are bloated.

http://www.bostonglobe.com/metro/2018/03/12/nor-easter-nears-forecast-remains-same/bU3WogXrYHrYzmYt0aiQ0O/story.html

https://www.boston.com/weather/weather/2018/03/12/areas-in-massachusetts-under-winter-storm-blizzard-warning


From http://sawv.org/2017/08/27/hurricane-harvey-sun-aug-27-2017.html

https://www.hcfcd.org

Http/1.1 Service Unavailable

Weak.

Press conference occurring with person from Office of Emergency Management. The person gave out the URL to view the maps of areas that may flood due to a controlled release of water from reservoirs/whatever.

Too many people pounding on the website at the same time. Man. Simpler websites are needed. Static HTML. Maybe S3 hosted.

Prob here is that the two maps are PDF files, and that would bog down their server, unless stored elsewhere, such as an S3 bucket.

Back to today, over 2.5 years later ...

https://webpagetest.org/result/200326_ND_51a990ad4c89a0aa3369ca7bd214c034/
https://www.hcfcd.org/
From: Dulles, VA - Chrome - Cable
3/26/2020, 12:20:53 PM
First View Fully Loaded:
Download time: 13.068 seconds
Web requests: 164
Bytes downloaded: 5,949 KB
2.6 megabytes of the download were for JavaScript.

Come on. WTF?

hcfcd.org is the Harris County Flood Control District. Okay, that might not be the most exciting website to visit. I doubt that Houston area residents visit the site daily or even weekly. The homepage, however, is flashy looking, as if the website were hawking something exciting or pop culture-related.

When viewing that site in Firefox with JavaScript disabled, the entire page is blank. When I use the uMatrix browser extension to disable CSS, then content displays in a horribly clunky manner.

How is this type of web design legal, especially for government websites?

politico.com and cjr.org are also blank when viewed in Firefox with JavaScript disabled. To see content at those sites, CSS needs to be disabled too. Both sites will display content in Links2, since Links2 does not support JavaScript or CSS, but these types of shitball-designed websites don't work in NetSurf.


My Sample pages, test pages, and templates for live blogs and homepages:

http://sawv.org/2014/11/26/live-blog-wed-nov-26-2014.html

home page template for major news/info event sites:
http://wren.soupmode.com/test-home-page-format.html
http://wren.soupmode.com/test-home-page-format-2.html

http://sawv.org/2018/02/11/news-event-template-page.html - homepage
http://sawv.org/2018/02/11/live-blog-template.html

http://sawv.org/2017/03/14/live-blog-tue-mar-14-2017.html
http://wren.soupmode.com/2017/03/14/live-blog-tue-mar-14-2017-v2.html - same as above but with fewer images and videos embedded into the main page, which contains links to the heavy content.

http://sawv.org/2017/09/16/test-bbc-live-blog.html

http://sawv.org/2016/05/12/thu-may-12-2016-notes.html

One of my live blog pages:
http://sawv.org/2018-nfl-draft.html


Old toledotalk.com posts

By user JustaSooner:
http://toledotalk.com/cgi-bin/tt.pl/article/32601/Weather_Test_Run

From 2007 by me:
http://toledotalk.com/cgi-bin/tt.pl/article/6032/Reporting_on_the_San_Diego_County_fires
http://toledotalk.com/cgi-bin/tt.pl/article/6032#7382

Also from 2007 by me:
http://toledotalk.com/cgi-bin/tt.pl/article/142543/Network_News_in_a_Box_-_2007

http://toledotalk.com/cgi-bin/tt.pl/article/6032#7381 - Nov 15, 2007 Toledo Blade story titled Lucas County to establish small-scale crisis team


Local media websites

We have no obligation to fund local newspapers when their digital products are poorly designed.

I wish that the Toledo Blade website operated this way for subscribers like me.
http://wren.soupmode.com/tg/

Does my fictional Toledo Gazette site (using real Blade content) need a video to explain how to use the site?

http://www.toledoblade.com/local/city/2020/03/19/video-tutorials-for-using-the-eBlade/stories/20200319116

The Blade has created these video tutorials to help guide readers through the most commonly asked questions about how to navigate and print a story in the eBlade.

Print a story? Wow. I have no use for a printer except to print our tax returns because we cannot file electronically for bizarre reasons, related to a data breach at my wife's company four years ago. Every spring, I buy new ink to print the tax returns for snail-mailing. Then we don't use the printer again, or rarely use it again, until next spring when the ink has dried up and I have to buy new ink again. But I might start printing the crochet patterns that I buy online.

Print a news story. The Blade only prints its newspaper three days a week, down from seven days a week 13 months ago.

A surprisingly large number of people STILL prefer a print newspaper, and I assume that's why people want to print the content in the eBlade app, which I think is a PDF file of what a printed newspaper would look like. I don't know. I don't care about the eBlade app, nor its NewsSlide app.

Did you want to go back to read a story in a previous edition? View this video to find out how:

Previous edition? I don't understand "edition" on the web or in the digital media landscape. It's web posts (articles) with a created and/or updated date and time stamp.

I don't understand the "edition" concept in the RSS feeds that I use to access Blade content.

Did you want to go to a specific section in the eBlade instead of tapping through all the pages? View this video to find out how:

Do you want to look at the ads? View this video to find out how to get to the advertising in the eBlade:

Do you want to print a story from the eBlade to save it or read it on paper? View this video to find out how:

My Toledo Gazette example assumes that READERS know how to use a web browser, including the back button, and they understand that the web is about URLs and linking to other pages, hosted on the same server or on other servers.

Users' web browsers may have some kind of readability mode if the default typography for the website is not good enough.

http://sawv.org/2019/08/17/how-i-read-the-toledo-blade.html


http://sawv.org/2019/05/16/with-newspaper-website-design-this-atrocious-its-hard-to-respect-the-news.html

Below is some additional info about the web version of the book War and Peace. In printed book form, it's over 1000 pages long.

Comment from a May 2017 Hacker News thread:

I can read "War and Peace" as an HTML document on my 7 year old cheapo Android phone. The browser even "streams" the data, displaying each chunk as it loads over a slow connection.

HTML version of WAR AND PEACE By Leo Tolstoy/Tolstoi:

The Kindle version is 5.2 MB.

One web article hosted at mercurynews.com is larger than the Kindle version of War and Peace when unsuspecting web readers download several megabytes of crapware.


https://webpagetest.org/result/200324_Z6_8a64c93727992b164f4f2ef31f1678c5/
https://www.pghcitypaper.com/pittsburgh/no-news-is-bad-news/Content?oid=16966532
From: Dulles, VA - Chrome - Cable
3/24/2020, 7:30:31 PM
First View Fully Loaded:
Download time: 8.204 seconds
Web requests: 140
Bytes downloaded: 2,982 KB
44 web requests were for JavaScript.
1.3 megabytes of the download were for JavaScript.
1.1 megabytes of the download were for images.

Sadly, 1.3 megabytes of JavaScript is considered "small" for media orgs today, but that's still unacceptable, in my opinion. Why not zero?

How about this crap.

https://webpagetest.org/result/200324_VR_59cd20417f97322ea8d49d218a76bf7f/
https://www.theadvocate.com/baton_rouge/news/coronavirus/article_0226976e-6d37-11ea-abb2-ab9213b0edbd.html
From: Dulles, VA - Chrome - Cable
3/24/2020, 7:30:07 PM
First View Fully Loaded:
Download time: 37.518 seconds
Web requests: 989
Bytes downloaded: 8,263 KB
130 requests were for JavaScript.
3.2 megabytes of the download were for JavaScript.
301 web requests were for images.
187 web requests were for "other."
2.7 megabytes of the download were for video.

This is a weekly or bi-weekly alternative newspaper. When I think of such media orgs, I think of print and not their digital products. But the coronavirus has shut down businesses that advertise in alternative newspapers, which has caused these newspapers to lay off staff and stop printing. This might be considered piling on, but this is so bad that it's indescribable.

https://webpagetest.org/result/200324_WV_7cf1c819089c7819ed0854a78a8c2035/
https://www.riverfronttimes.com/newsblog/2020/03/18/coronavirus-and-the-rft
From: Dulles, VA - Chrome - Cable
3/24/2020, 7:31:57 PM
Download time: 30.258 seconds
Web requests: 781
Bytes downloaded: 6,776 KB
146 web requests were for JavaScript.
4.2 megabytes of the download were for JavaScript.

The HTML version of War and Peace is 3.9 megabytes.

Google's alleged coronavirus info page is also too bloated with JavaScript, but Google is an ad surveillance company.

https://webpagetest.org/result/200326_QA_fe568eb29ba61a7e9b558b9bd8914627/
https://www.google.com/covid19/
From: Dulles, VA - Chrome - Cable
3/26/2020, 11:40:06 AM
Download time: 7.004 seconds
Web requests: 79
Bytes downloaded: 3,356 KB
1.5 megabytes of the download were for JavaScript.

Doing what the media won't

This week's news ...

https://scroll.blog/2020/03/24/scroll-partners-with-firefox-to-build-a-better-internet/

Within an updated Firefox web browser on Linux, the above blog post is missing content when JavaScript is disabled. In order to view the content with JavaScript disabled, CSS has to be disabled too. Wow. Great web design from a company claiming to "build a better internet."

Uh, Scroll is not building a better internet. The company is attempting to display media websites more humanely. Since media orgs fail to produce lightweight, useful websites, the media rely on Google's Accelerated Mobile Pages and now Scroll.

This is not a better internet. It's better media web articles.

https://firstlook.firefox.com/betterweb/

https://blog.mozilla.org/blog/2020/03/24/try-our-latest-test-pilot-firefox-for-a-better-web-offering-privacy-and-faster-access-to-great-content/


Apr 1, 2020

Mar 26, 2020 - themarkup.org - How Many Americans Lack High-Speed Internet?

As COVID-19 continues to spread around the world, offices, schools, and even entire cities have been shut down, forcing people to alter the way they live and work.

But not everyone in the U.S. has access to these online services. There are more than 14 million people without any internet access and 25 million without the faster and more reliable broadband access, according to a 2018 Federal Communications Commission study.

Microsoft has said its research shows that if broadband access was counted more precisely, the number of Americans without it would be closer to 163 million people.

Meaning what? Are these people only accessing the internet over cellular, or are they also using phone-based dial-up modems for home internet access?

The U.S. ranked 10th among 28 countries in broadband download speeds in 2016, behind Luxembourg, Japan, and Iceland, according to the FCC. And that could also be an overstatement: The FCC data is based on the speeds that internet service providers advertise, not what is actually in place.

True.

And U.S. consumers consistently pay much more for access to broadband than users in other countries. Jonathan Mayer, a professor at Princeton University who served as chief technologist in the FCC during the Obama administration, said internet is so expensive in the U.S. because of a lack of competition.

Massively true. The number one provider in the Toledo area is Buckeye Internet-something. It's a local business, owned by Block Communications, which also provides the main cable TV access. Block Comm also owns the Toledo Blade newspaper.

We use another local business for home internet access: toast.net. Deb started using toast.net in the mid to late 1990s. Deb and I got married in 2001. We switched from dial-up modem to broadband around 2003, maybe later than that, like 2005. But this household has used toast.net since at least 1996.

Many areas in the U.S., however, only have one home internet access option.

When I first connected to the internet at the home that I rented, back in 1993 or so, I used a company called Primenet. More home internet access options seemed to exist back in the early to mid 1990s.

“It can be very difficult to start an ISP [in the U.S.],” said Mayer. “In some other countries there are a large number of local or semi-local ISPs that fill in the gaps where national ISPs don’t have coverage or have competed with the national ISP market. That phenomenon hasn’t really played out in the United States.”

The lack of reliable and affordable internet access is especially severe in rural communities, where only two out of every three people say they have broadband access, according to a Pew Research Center survey conducted in early 2019. The survey also shows that people in rural communities are less likely to have a home computer, smartphone, or other device that provides access to the internet.

School-aged students are also affected. One study of students in 2018 found that low-income students, first-generation college students, and students of color were more likely to have access to only a single device, such as a home computer, tablet, or smartphone. For another 10 percent, that device was only a smartphone.

Poorer neighborhoods often don’t have the same access to broadband as affluent ones. According to a 2016 investigation by the Center for Public Integrity, poor families were five times less likely to have access to high-speed internet than their affluent counterparts.

And doesn't this highlight how inhumane bloated modern web design can be?

When modern web design is unnecessarily bloated, then in my opinion, these web designs are discriminatory because users with slow internet connections and/or people with older and slower computers may not be able to use many websites.

But those same slow internet connections and old computers could probably access and USE the lighter web designs of the late 1990s and early aughts when most of us were still using dial-up modems at home.

In my opinion, web users want to access and use websites. They are not clamoring for slow, clunky, bloated web designs. Stakeholders, designers, and programmers are ruining the web by using bloated tech, mainly because the tech exists. They are not using the bloated tech to solve problems. In some cases, yes, the new bloated web tech is helping users.

But if modern web design was so great, why does https://craigslist.org still look similar to how it looked 20-plus years ago?

In 2017, neighborhoods in Dallas and Cleveland with high poverty rates were found to have slower download speeds because AT&T didn’t provide those areas with up-to-date broadband improvements, according to studies done by NDIA.

https://webpagetest.org/result/200401_SE_8b7f29c209dad30d4cb07afc01be1137/
https://toledo.craigslist.org/
From: Dulles, VA - Chrome - Cable
4/1/2020, 11:40:34 AM
First View Fully Loaded:
Download time: 0.877 seconds
Web requests: 9
Bytes downloaded: 154 KB
Cost: $
3 web requests and 124 KB were for JavaScript. I don't know why it uses JavaScript, but compared to most websites, that's a tiny amount of JavaScript.

Craigslist uses a humane, useful web design. That's REAL modern web design, because usefulness never goes out of style. Craigslist's web design was modern 20 years ago because the service was useful, and it's still modern today for the same reason. Craigslist does not implement fad web designs. Unfortunately, fad web design is now called modern web design. Craigslist, however, has made enough web design changes over the years to keep its service usable across all devices. That's real modern web design.

Craigslist's web design is how the Ohio coronavirus homepage should function. It's how the Toledo Blade homepage should function. But hoping that governments and media orgs create useful web designs is misplaced hope.

Mar 31, 2020 - themarkup.org - Online Unemployment Benefits Systems Are Buckling Under a Wave of Applications

Newly laid-off workers face crashes, long load times, and messages offering phone callbacks as states struggle to adjust

In several states—including Pennsylvania, Massachusetts, New York, Michigan, and Hawaii—online benefits systems have buckled under the crush of new applicants.

The online application system in the state, meanwhile, can be detailed and cumbersome, depending on your circumstances, according to Simon-Mishel. In other words, it can be difficult to put together on a mobile device—which is all some have access to while the virus puts public spaces on lockdown. “A lot of people who would normally go find a computer at a library or at a workforce center, they’re currently unable to go out and go to those places, or those places are closed,” Simon-Mishel said. The web interface, she said, is “not very mobile-responsive.”

Hawaii was forced to replace its online application portal with a new form after repeated crashes, according to the Honolulu Star-Advertiser.

State governments won't have the resources and IT capital of Craigslist, but the concepts used by Craigslist might work for state governments, which could mean websites that actually work.

A 2015 report on Florida’s unemployment insurance system from the National Employment Law Project found that thousands of unemployed workers in the state were denied benefits because of technical issues with an online platform the state mandated be used.

He said the state should have learned from Hurricane Irma in 2017, when new users caused similar problems with overload. “It appears that they were not really prepared for this, which is shocking to me, given our experience with Irma,” Rowinsky said.

Why would it be shocking to learn that government has not learned from past experiences?

The move to online only exacerbated the problem, the study found. In many states, the difficult-to-complete online application systems were responsible in large part for the decline.

Even humane web design is unlikely to help users who face "difficult-to-complete" forms.

The above story linked to this story by a Michigan media org. Excerpts:

According to the UIA, workers are encouraged to apply at Michigan.gov/UIA or by phone at 866-500-0017, but due to tremendous call volume, certain callers may receive a busy signal.

https://webpagetest.org/result/200401_DV_2d636d7ad33d31f47bdeb939ea8fcc25/
https://Michigan.gov/UIA
From: Dulles, VA - Chrome - Cable
4/1/2020, 11:59:16 AM
First View Fully Loaded:
Download time: 5.259 seconds
Web requests: 68
Bytes downloaded: 2,096 KB
1.4 megabytes of the download were for images. "Only" 320 KB of the download were for JavaScript. It's sad that this is now considered a lightweight web page among today's government and media websites.

When I disabled JavaScript, the following message was displayed at the top of the webpage:

Browsers that can not handle javascript will not be able to access some features of this site.

That page contains text and links, which is the primary content. The page, however, contains useless stock images that bloat the download. And what's the point of the JavaScript?

It's unfortunate that not enough voices exist in media, especially tech media, that shame the hell out of these bad web designs.

Apr 7, 2020 update

https://lucascountyhealth.com/coronavirusupdates

Lucas County Community COVID-19 Survey

Designed to help health officials track suspected cases, estimate COVID-19 presence in our community, and follow-up with individuals expressing mild to moderate symptoms as time and resources permit.

https://toledolucascountyhealthdepartment.formstack.com/forms/covid19_tracking_form

Updated this week. The form now states:

This form is for Lucas County, Wood County, and Fulton County residents to report current suspected symptoms of coronavirus (COVID-19) or to report past symptoms.

Yesterday, I learned about a survey offered by Stanford Medicine.

https://med.stanford.edu/covid19/covid-counter.html

National Daily Health Survey

For Novel Coronavirus (COVID-19)

Our goal is to learn and predict which geographical areas will be most impacted by coronavirus based on how you are feeling. This information will be used to inform local and national responses, such as redirecting medical resources or improving policies and public guidance. Given the 9-10 day delay between onset of symptoms and hospitalization, and the 20% hospitalization rate of patients, tools like this will be necessary to truly track and fight the spread.

Your involvement will hopefully help save lives. As a country, we are all in this together!

On that Stanford Medicine web page, beneath the above text is a button or something that appears to be clickable. The text for the button states: "Take the Survey."

When I mouse over it, my cursor changes from an arrow to a hand, which implies that I can click on it. But when I click the Take the Survey button, nothing happens.

I disabled uMatrix. I disabled Privacy Badger. I enabled JavaScript. Then I clicked the button again, and still nothing.

Within Firefox, I have security and privacy settings set to my desire. Maybe something in my Firefox settings is blocking the ability to click a button.

Obvious question: Why isn't the "Take the Survey" text a normal web link, the kind that has existed on the web for about 30 years?

What in the hell are the designers of this shitball website doing? How does it not work in Firefox with all of my additional security and privacy add-ons disabled?

It's like how the Toledo Blade website fails to function in Firefox with my add-ons disabled. I had to use the Brave web browser to log into the Blade website to subscribe to the Blade's coronavirus email newsletter.

I loaded the above Stanford Medicine webpage into the Brave web browser. I clicked the Take the Survey button, and some kind of window appeared at the bottom of the browser that contained a frowny face. Obviously, the browser disliked whatever I was trying to do.

I lowered the shields of protection in the Brave browser, reloaded the page, tried again, and still nothing worked.

It's upsetting that this type of horrendous web design is so rampant. I don't need to take the survey. I was curious to see how it compared to our local version. But for some reason, I cannot access the survey by using the Firefox and Brave web browsers on Linux.

Apparently, insecure web browsers are needed to click a link!


More links

Related and somewhat related ...

Mine:

http://sawv.org/2020/03/13/the-antithesis-of-modern-web-design-is-usefulness.html

http://sawv.org/2020/03/19/quote-mar-19-2020.html

http://sora.soupmode.com/more-article-format-testing-2.html

http://sora.soupmode.com/2019/01/03/web-article-design-testing-03jan2019.html

http://sawv.org/2018/06/05/suggestions-for-local-news-orgs.html

http://sawv.org/2017/04/11/favorite-articles-about-web-design.html

http://sawv.org/2018/08/09/reasons-for-disabling-javascript-when-reading-the-web.html

http://sawv.org/2018/07/12/ecofriendly-web-design.html

http://sawv.org/manifesto-for-lightweight-web-pages.html

http://sawv.org/2017/10/10/toledo-blades-free-adbased-mobile-news-app-called-newsslide.html

http://sawv.org/2017/08/13/the-future-of-local-newspapers.html

http://sawv.org/2019/05/16/with-newspaper-website-design-this-atrocious-its-hard-to-respect-the-news.html

http://sawv.org/2017/05/29/disable-javascript-for-a-faster-web-reading-experience.html

http://sawv.org/2017/08/11/jh-tt-post-aug-11-2017-b.html

http://sawv.org/2020/01/31/quote-jan-31-2020.html

The problem is not JavaScript. The problem is the unnecessary usage of client-side JavaScript that is murdering the open web of documents.

http://sawv.org/2019/04/22/the-media-spreads-fear-rage-and-nefarious-ad-trackers.html

http://sawv.org/2017/07/12/a-simpler-web.html

http://sawv.org/2019/12/19/newspaper-industrys-hideous-modern-web-designs.html

http://sawv.org/2018/03/11/medias-war-against-the-open-web.html


https://www.washingtonpost.com/technology/2020/03/16/schools-internet-inequality-coronavirus/

https://danluu.com/web-bloat/ - February 2017

https://text.npr.org

http://motherfuckingwebsite.com/

https://bestmotherfucking.website/

https://exclusive-design.vasilis.nl/design-like-its-1999

https://exclusive-design.vasilis.nl/principles/

https://justinjackson.ca/words.html

http://deathtobullshit.com/

http://vault.simplebits.com/notebook/2013/02/16/food-for-thought/

https://mattgemmell.com/designing-blogs-for-readers/

https://www.wired.com/2016/10/how-the-web-became-unreadable/

https://ia.net/topics/the-web-is-all-about-typography-period

http://www.wired.com/2015/11/i-turned-off-javascript-for-a-whole-week-and-it-was-glorious

https://eev.ee/blog/2016/03/06/maybe-we-could-tone-down-the-javascript

http://www.zeldman.com/2015/07/29/publishing-versus-performance-our-struggle-for-the-soul-of-the-web/

http://tantek.com/2015/069/t1/js-dr-javascript-required-dead

Apr 10, 2020

https://www.cnbc.com/2020/04/09/google-creates-online-unemployment-application-with-state-of-new-york.html

https://labor.ny.gov/pressreleases/2020/april-09-2020.shtm

https://webpagetest.org/result/200410_JK_8771f0dd66cf1b613431d553e1f76917/

https://www.labor.ny.gov/home/

Apr 23, 2020

Lightweight web designs and ACCESSIBLE web designs are important.

A Markup survey finds that many state webpages have serious accessibility issues

Every state in the U.S. has launched at least one website with updates about the novel coronavirus outbreak. Unfortunately, the majority are difficult or unusable for visually impaired users, according to a survey conducted for The Markup by the web accessibility group WebAIM.

Forty-one of the 50 state pages we surveyed contained low-contrast text, which can be challenging for users with low vision, including seniors, who are at higher risk in the outbreak.

Navigation was another challenge for users with visual disabilities. Thirty-one of the 50 state pages contained empty links or buttons, which means a screen reader will not be able to tell the user what the button does or where the link is supposed to go. A screen reader is accessibility software that enables translation of text and images on a screen to speech or a Braille display.

The Markup’s analysis was based on a list of 50 websites that appear in a Google information module when a user enters a search for the state name and “covid.” We then sent the list, on April 15, 2020, to the web accessibility group WebAIM, which ran an analysis of the homepages using WAVE, a web accessibility tool that flags common problems for blind and low-vision users. We also interviewed two accessibility experts and two screen reader users for insight into issues that could not be detected by WAVE.

WAVE flagged an average of 28.5 errors per coronavirus homepage—which is lower than typical websites, which had an average of 60.9 errors per homepage in WebAIM’s February 2020 analysis of the top million websites. WAVE analysis, which for the Markup survey included only the first page of each site, catches fewer than 40 percent of possible issues, according to WebAIM.

Although the state coronavirus websites were better than most websites, they have a special burden to be accessible, said Jared Smith, associate director at WebAIM.

The Americans with Disabilities Act prohibits discrimination against people with disabilities by governments and businesses that are open to the public, in the physical world and online. However, the Department of Justice, which is responsible for enforcing the ADA, has not issued standards for website compliance, despite declaring its intention to do so in 2010.

Disability advocates have been pushing for the Justice Department to adopt the Web Content Accessibility Guidelines developed by the World Wide Web Consortium, which would require specific color contrast ratios, text alternatives to nontext information, keyboard-compatibility, and other features that make a site accessible to users with different physical and cognitive disabilities. But the department formally abandoned the proposed rule under a Trump administration initiative to reduce new government regulation.

WAVE’s automated analysis can also miss some major obstacles to users. For example, a large infographic on Washington’s site did not produce errors in WAVE, but Michael Forzano, a blind software engineer who works at Amazon, was unable to parse it using NVDA, a popular screen reader. “I don’t even know that it is a graphic,” he said.

The same information was also available in an accessible table, but the website did not make that explicit—so Forzano wasn’t sure what he was missing.

The site does have an accessibility disclaimer at the bottom—“For people with disabilities, Web documents in other formats are available on request”—but, Forzano said, “we shouldn’t have to contact somebody to get the information that is readily available to everyone else.”

Maine and New Mexico were the only states that did not register any errors in the tool.

Jackie Farwell, the communications director for Maine’s department of health, pointed to the state’s accessibility policy and noted the department also uses the WAVE tool internally. “We’ve also worked with stakeholders in the disability community to ensure they have the information they need in the face of the pandemic,” she said in an email.
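
As an aside, the low-contrast problem that The Markup flagged on 41 of the 50 state pages is one of the easier things to test for. WCAG 2.x defines a contrast ratio based on relative luminance, and normal-size text is supposed to hit at least 4.5:1. Here is a small Python sketch of that math; the hex colors in the example are made up, not taken from any state site.

# WCAG 2.x contrast ratio between two colors given as hex strings.
def _linear(channel):
    c = channel / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def _luminance(hex_color):
    hex_color = hex_color.lstrip("#")
    r, g, b = (int(hex_color[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * _linear(r) + 0.7152 * _linear(g) + 0.0722 * _linear(b)

def contrast_ratio(foreground, background):
    lighter, darker = sorted((_luminance(foreground), _luminance(background)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# Light gray text on a white background, a common "modern" design choice.
ratio = contrast_ratio("#777777", "#ffffff")
print(f"{ratio:.2f}:1 -- meets the 4.5:1 minimum? {ratio >= 4.5}")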

I wonder if this is due to social media's popularity and the silo users' desire to quickly view an image instead of reading and concentrating on text.

Much of the information circulating about the pandemic is image-based, including charts, graphs, and maps about the spread, as well as diagrams about proper hand-washing and mask-wearing, in which the text is part of the image, making it unreadable to screen readers.

“I honestly just stopped reading news stories and started listening to CSPAN,” Littlefield said.

That's a failure by media orgs and government agencies. They failed to create humane web designs.

His frustration inspired him to build an accessible website for tracking COVID-19 stats, which he said prompted a flood of messages from users saying thank you.

“I’ve been doing software development since I was 12,” he said. “A lot of people who are low vision, seniors who have recently gone blind, people who don’t have tech experience—if I’m struggling, I can’t imagine where they’re at.”

His website requires JavaScript to display the stats. His site won't work in limited web browsers.

Another good exercise would be to submit each state's COVID-19 website homepage to webpagetest.org to expose unnecessary bloat.
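
webpagetest.org offers an API for that kind of batch run, but even without it, a quick first pass is possible. The Python sketch below fetches only the initial HTML document for a few example state pages. It misses the images, JavaScript, fonts, and API calls that a full webpagetest run would catch, and the URLs are examples rather than the vetted list of 50 pages that The Markup used.

# Rough first pass at page bloat: fetch only the initial HTML document for a
# few state COVID-19 pages and report its size. Images, scripts, fonts, and
# API calls, usually the bulk of the bloat, are not counted here.
import requests

# Example URLs; not a vetted list of the 50 state pages.
STATE_PAGES = {
    "Ohio": "https://coronavirus.ohio.gov/",
    "Michigan": "https://www.michigan.gov/coronavirus",
    "New York": "https://coronavirus.health.ny.gov/",
}

for state, url in STATE_PAGES.items():
    response = requests.get(url, timeout=30)
    print(f"{state}: {len(response.content) / 1024:.0f} KB of HTML ({url})")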

Links of interest from that article.

The cvstats.net website gets data via this API.

https://corona.lmao.ninja/v2/states/ohio?yesterday=true

{
  "state": "Ohio",
  "cases": 14117,
  "todayCases": 392,
  "deaths": 610,
  "todayDeaths": 53,
  "active": 13387,
  "tests": 97998,
  "testsPerOneMillion": 8418
}

curl -X GET "https://corona.lmao.ninja/v2/historical/usacounties/ohio?lastdays=all" -H  "accept: application/json"

https://corona.lmao.ninja/v2/historical/usacounties/ohio?lastdays=all
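
Which shows that the stats themselves do not require a browser or JavaScript at all. Here is a small Python sketch, using the requests package, that pulls the same Ohio numbers shown in the JSON above (the /v2/states/ohio endpoint) and prints them as plain text.

# Fetch the Ohio stats from the API above and print them as plain text.
import requests

response = requests.get(
    "https://corona.lmao.ninja/v2/states/ohio",
    params={"yesterday": "true"},
    timeout=30,
)
data = response.json()
for field in ("cases", "todayCases", "deaths", "todayDeaths", "active", "tests"):
    print(f"{field}: {data[field]:,}")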

-30-