JavaScript Required; Didn't Read

created Feb 18, 2020 - updated Feb 22, 2020

It's sad that Dave Winer, an alleged open web proponent, built his personal website to require JavaScript just to READ TEXT on his homepage.

I get around this by using DW's URL structure, which contains the current year and month.

Instead of accessing scripting.com, I access http://scripting.com/2020/02 to READ his website.

Even with JavaScript ENABLED on my old iPhone, DW's homepage is blank. Apparently, his JavaScript relies on features newer than what my old version of Safari supports. I can read his content on my old iPhone by using his /YYYY/MM URL.
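
Since the archive path is predictable, a short script can build the current month's URL and fetch it. Here's a minimal sketch in Python, assuming the /YYYY/MM pattern continues to hold:

    #!/usr/bin/env python3
    # Fetch the current month's scripting.com archive page, which is
    # readable without executing any JavaScript.
    from datetime import date
    from urllib.request import urlopen

    today = date.today()
    url = f"http://scripting.com/{today.year}/{today.month:02d}"

    with urlopen(url) as response:
        print(response.read().decode("utf-8", errors="replace"))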

DW's website is a Web of Documents type of website. My computer should not have to EXECUTE programming code (JavaScript) to display text.

But it's DW's personal website, and I support personal web publishers designing their websites however they desire. This is the advantage of leasing domain names and building websites on leased servers: nearly no limitations exist in design choices.

If personal web publishers design their sites to work only with WebAssembly or as JavaScript SPAs, goody gumdrops for them. They don't need me as a reader. I can spend my reading time elsewhere. Win-win. It's all good.

To READ DW's website, I have the above URL option, and I can read DW's content within a feed reader. Lightweight options do exist.
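
A lightweight feed reader can be a few lines of code. Here's a minimal sketch that prints the titles and links from an RSS 2.0 feed; it assumes the feed lives at /rss.xml, which should be verified per site:

    #!/usr/bin/env python3
    # Print the items from an RSS 2.0 feed: no browser, no JavaScript.
    import xml.etree.ElementTree as ET
    from urllib.request import urlopen

    FEED = "http://scripting.com/rss.xml"  # assumption: verify the feed URL

    with urlopen(FEED) as response:
        root = ET.fromstring(response.read())

    for item in root.iter("item"):
        title = item.findtext("title") or "(untitled)"
        link = item.findtext("link") or "(no link)"
        print(f"{title}\n  {link}")

Some linkblog-style feeds omit item titles, hence the "(untitled)" fallback.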

Anyway, DW mentioned a new media-like service with a lame, generic name: Scroll. DW linked to Scroll's Crunchbase page.

TechCrunch maintains a terrible web design. Lately, the uMatrix extension in Firefox pukes when I click a link that points to a TC article, because something bizarre is being used at TC's website. uMatrix alerts me and stops loading the site. I ignore TC.

I wanted to see whether something similar occurred when I clicked the Crunchbase link within the Links2 web browser. In fact, I'm typing this post within Links2, and I was using Links2 to read DW's website at the above February 2020 URL.

When I clicked the link for Crunchbase, I received the following message within Links2.

Please verify you are a human

Access to this page has been denied because we believe you are using automation tools to browse the website.

This may happen as a result of the following:

  • Javascript is disabled or blocked by an extension (ad blockers for example)
  • Your browser does not support cookies

Please make sure that Javascript and cookies are enabled on your browser and that you are not blocking them from loading.

Reference ID: #a623d9b0-529a-11ea-88c9-038dbfae47d8

Powered by PerimeterX, Inc.

That's all that displayed within Links2 when accessing what should be a web of documents type of website. This is hellish modern web design.

Why does Crunchbase need to set cookies and EXECUTE code on MY computer? To hell with Crunchbase and all websites designed like this.

This is why security and privacy controls should be cranked up on users' web browsers. It's why, most of the time, I READ web of documents-type websites with JavaScript disabled on my laptop and on my phone.

Despite my disdain for Facebook, web designs like Crunchbase's make me root for Facebook to crush the media. When media orgs, even alleged tech media orgs, ignore the open web, we may as well root for Facebook to win it all.

When I tried to view the Crunchbase post about Scroll within the Lynx web browser, I received the same error, but when I used ELinks, the page loaded. When I tried to cURL the Crunchbase article, I received the same error page that appears within Links2. If a website is not cURL-able, then that site does not exist on the open web.
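
The cURL test is easy to reproduce with any plain HTTP client. Here's a minimal sketch, assuming a PerimeterX-style block answers a cookie-less, script-less request with an error page instead of the article:

    #!/usr/bin/env python3
    # A rough "is it cURL-able?" check: fetch a URL with a plain HTTP
    # request (no JavaScript, no cookies) and report what comes back.
    import sys
    from urllib.request import Request, urlopen
    from urllib.error import HTTPError

    url = sys.argv[1] if len(sys.argv) > 1 else "https://www.crunchbase.com"

    try:
        request = Request(url, headers={"User-Agent": "curl/7.68.0"})
        with urlopen(request) as response:
            body = response.read().decode("utf-8", errors="replace")
            print(f"HTTP {response.status}: {len(body)} characters of markup")
    except HTTPError as error:
        # Bot-protection services typically answer with a 403 and a
        # "verify you are a human" page instead of the content.
        print(f"Blocked or refused: HTTP {error.code} {error.reason}")

If the output is a block page instead of the article's markup, the site fails the open web test.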

About perimeterx.com:

proactively detects and manages risks to your web applications

Your web and mobile applications are the online consumer experience for your company.

Ahh, Bach! That explains the issue. Apparently, Crunchbase.com is a web application and not a website.

And perimeterx.com offers some kind of bot protection service too.

Use capabilities like behavioral fingerprints, predictive analytics and advanced machine learning models to differentiate real users from automated attacks and accurately identify and block sophisticated threats.

Some irony exists in DW mentioning Scroll, given my observations about Crunchbase's hostile web design.

https://scroll.com

Welcome to an internet that feels magical.

Twice as fast, no ads, 80% fewer trackers. It’s the sites you love. But on a better internet.

Someone should inform Scroll that it's still the same internet.

Anyway, Scroll has partnered with media orgs to host their content. These are the same orgs that maintain horribly bloated websites polluted with ads, trackers, megabytes of JavaScript, and other crapware.

In my opinion, the media's horrendous web designs inspired Google to introduce its anti-open web product called Accelerated Mobile Pages back in 2015.

Media orgs could CHOOSE to create their own lightweight, fast-loading, ad-free, tracker-free, JavaScript-free, and crapware-free websites, but the media chooses differently.

If the Toledo Blade and other local media orgs in the Ohio-Michigan region partner with Scroll, then I might consider it. Otherwise, I'll continue to read the Blade my way.

Feb 22, 2020 update

Sigh

Today, while using the Links2 web browser, I read Dave Winer's homepage by using the URL scripting.com/2020/02, and I saw this post:

Doc posts a monster thread from LO2 to twitter. Here's the thread, and the outline.

I left out the links from Winer's post. The "thread" link pointed to Doc's Twitter something, probably the god-awful tweetstorm, which ... never mind.

Earlier this month, I modified the /etc/hosts file on my Lenovo laptop that runs Ubuntu Linux. I used ALL of the suggestions listed at Steven Black's GitHub repo, so my /etc/hosts file contains over 80,000 entries. I cannot access any social media site directly, and that means the crapware that sites embed from social media is humanely blocked too.
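
The mechanics of a hosts-file block are simple: each line maps a hostname to an address, and blocklists like Steven Black's map unwanted hostnames to an unroutable one. Here's a minimal sketch for checking whether a hostname is null-routed; the default hostname is just an illustration:

    #!/usr/bin/env python3
    # Report whether /etc/hosts null-routes a given hostname.
    import sys

    BLOCK_ADDRESSES = {"0.0.0.0", "127.0.0.1"}
    hostname = sys.argv[1] if len(sys.argv) > 1 else "www.facebook.com"

    blocked = False
    with open("/etc/hosts") as hosts_file:
        for line in hosts_file:
            fields = line.split("#")[0].split()  # strip comments, tokenize
            if len(fields) >= 2 and fields[0] in BLOCK_ADDRESSES \
                    and hostname in fields[1:]:
                blocked = True
                break

    print(f"{hostname}: {'blocked' if blocked else 'not blocked'}")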

The "outline" link pointed to output produced by LO2, which is Dave's outliner program or writing environment or something like that.

Doc is Doc Searls, who edited Linux Journal and who has been writing about tech and publishing on the web for a long time.

This is Doc's elegantly simple personal homepage, which loads quickly, in under half a second. This is brilliant web design. This is humane web design. This is useful web design.

http://searls.com

This is Doc's blog. It's at this URL because, I think, Dave did some early blogging work for or at Harvard. The RSS 2.0 spec is hosted at a Harvard domain name.

http://blogs.harvard.edu/doc

Doc's Feb 10, 2020 post shown below is long and fascinating, and it's an example of the many kinds of content that can exist on personal websites, which have neither character-count nor content-type limitations. Making this into a tweetstorm would be a web crime against design.

The above blog post consists mainly of TEXT, and I can read that text with JavaScript disabled.

I can understand DW's LO2 writing app requiring JavaScript to create and update content, but why in the hell is JavaScript required to READ the text output? This is absurd and hostile. I should not need to execute someone's JavaScript code on MY computer to read text.

Doc's LO2 thread or outline that DW mentioned is found at this URL: http://instantoutliner.com/gx.

To read that outline, I had to disable uMatrix, disable Privacy Badger, and enable JavaScript, and yet all of Doc's content was simple text.

This is going backwards. This is devolving. It's sad for the open web that people with the experience of DW and Doc fail at text in 2020, especially since Dave started his personal web publishing in the mid-1990s, 25 years ago.

DW created a better web design for readers in 1995 than he does in 2020. That's devolving.

I don't understand this anti-open web stance. Is it intentional or accidental? "JavaScript Text" cross-posted to Twitter, a silo. What happened to the personal website or blogging? What happened to supporting the open web?

I'm guessing that Doc has never heard of indieweb.org and its concepts.

https://indieweb.org/IndieWeb

The IndieWeb is a community of individual personal websites, connected by simple standards, based on the principles of owning your domain, using it as your primary identity, to publish on your own site (optionally syndicate elsewhere), and own your data.

I tried to cURL Doc's outline post, and, of course, I saw no content. As usual, if it's not cURL-able, then it does not exist on the open web.

After lowering all of my security and privacy shields to read Doc's post of text, I saw these gems of content ...

... oh, wait. I cannot copy and paste in the normal manner that has been available within web browsers for over 20 years, because DW's reader-hostile web design has hijacked normal keyboard and mouse usage. Shameful.

I'm quickly losing respect for Doc and Dave regarding anything that they do for the web. The users and creators of this type of violent web design need to be ignored.

If it's a web application that requires users to log in to perform private tasks, then using JavaScript to make the UI/UX useful is okay.

But if it's a Web of Documents-type of website that requires non-logged-in READERS to use JavaScript to read text, then that's obscene. How is requiring JavaScript to read text solving a problem? What problem existed?

The only way to excerpt Doc's content is to copy all of it and paste it into an editor or into a textarea box and then excerpt from that blob of text. Holy hell.

Anyway, from Doc's post:

This tweet outline /thread will be notes on @ColumbiaSIPA's @johnbattelle talk

Why the emphasis on using the cesspool of the internet? Why are Doc and Dave supporting a platform of hate and misinformation?

History will not judge the 2016 election kindly. @johnbattelle compares it to the Black Socks Scandal. Facebook is "the most amazing machine" for obtaining personal information and using it to target advertising.

And Twitter is the greatest tool for spreading misinformation quickly. Twitter is a machine that produces anger and hate.

The entire adtech advertising system also got in to the game, using 3rd party cookies..
We have taken a very laissez faire approach to regulating the use of personal data

Then why is Doc using Twitter, a silo, and why is blogging pioneer DW such a fan of Twitter? Why does DW insist on making his Twitter account his online identity for authentication? None of this makes sense. It's definitely anti-open web. It's anti-independence.

My online identity is the sawv.org domain name that I lease. The IndieWeb.org created an authentication concept called IndieAuth that is based upon using one's own domain name, along with one other service, such as GitHub, Twitter, an email account, etc. I use email to complete the login when using IndieAuth. It's an elegant solution. I already maintain a strong password for my email account.
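
The domain-is-identity idea is mechanical, not magical. Here's a minimal sketch of the discovery step that starts an IndieAuth login: fetch the person's homepage and look for the advertised authorization endpoint. It assumes the endpoint is declared in the HTML (the spec also allows an HTTP Link header, which this sketch skips), and sawv.org appears only as an example:

    #!/usr/bin/env python3
    # Discover a site's IndieAuth authorization endpoint from its HTML.
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class EndpointFinder(HTMLParser):
        def __init__(self):
            super().__init__()
            self.endpoint = None

        def handle_starttag(self, tag, attrs):
            attributes = dict(attrs)
            rel_values = (attributes.get("rel") or "").split()
            if tag in ("link", "a") and "authorization_endpoint" in rel_values:
                # Keep the first endpoint found.
                self.endpoint = self.endpoint or attributes.get("href")

    with urlopen("https://sawv.org") as response:  # the domain IS the identity
        finder = EndpointFinder()
        finder.feed(response.read().decode("utf-8", errors="replace"))

    print(finder.endpoint or "no authorization endpoint advertised")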

IndieAuth plus one's own domain name is the advanced way, the evolving way, the progressing way to maintain an online identity at other services. Relying on silos for login identities is disgusting, and it probably means surrendering more data to the silos as Doc mentioned above.

More from Doc's post:

We continue to outsource our thinking to the captains of tech industry.

Ahh, yeah. Doc and Dave are shining examples of outsourcing their thinking to the tech industry. Per Doc's opening sentence for his outline post:

This tweet outline /thread will be notes on @ColumbiaSIPA's @johnbattelle talk

That's outsourcing to the tech industry.

More from Doc about outsourcing our thinking to Big Tech:

Alas, these are proprietary systems, silos. What if we were to be able to have this data shared across silos?

Huh? Silos sharing data with other silos???

Does Doc not realize that "his" thoughts would still be locked inside silos, away from the open web?

I quoted the word "his" because once users post their thoughts onto silos, then ownership of those thoughts is transferred to the silos. The silo users willingly cede ownership of their thoughts.

The silos have no incentive to share data easily with other silos. Hence the reason these services are called silos.

But even if a Twitter user could easily cross-post to Facebook or Instagram, that's still using silos. It's not the open web.

The IndieWeb.org solved Doc's dilemma years ago. IndieWeb users post their content on their own websites first and then optionally syndicate it to the silos, with interactions on those silos getting backfed to the personal websites. This works especially well with Twitter, which seems to be Doc's preferred silo.

The IndieWeb.org is 10 years ahead of Doc in thinking about this.

---

webpagetest.org results for this post, above the horizontal rule.

From: Dulles, VA - Chrome - Cable
2/22/2020, 3:19:02 PM
First View Fully Loaded:
Download time: 0.281 seconds
Web requests: 2
Bytes downloaded: 9 KB

100 percent of the downloaded content was HTML. The CSS is included within this HTML page. It's not coming from an external file, which is probably why CSS is not counted separately.

The Content Breakdown section states that only one web request was made, and it was for 100 percent of the HTML. I'm unsure why the web requests count under First View Fully Loaded always contains one more request.

The point is that the web is fast. The web is fast even over a sluggish internet connection.

The "mobile web" is not slow because the "mobile web" does not exist. It's the same damn web. Mobile devices access the web via http/https, exactly like desktop and laptop computers.

Web browsers are not slow. Web browsers on mobile devices are not slow.

Horrendously designed websites are slow. The blame belongs to the poor choices made by web publishers.

-30-