The Media's War on the Open Web

created Apr 17, 2019 - updated Jul 15, 2019

Journalists and media orgs could promote the open web more than anyone, but unfortunately, journalists love to use Twitter, and media orgs embrace nearly every new product offered by Big Tech.

Journalists use Twitter, the cesspool of the internet, to bitch about Facebook. Media orgs use funding models that rely on referral traffic from the social media silos because the media pollute their websites with advertising.

Media orgs publish stories bashing Big Tech silos while those same articles contain sharing links to those silos. The media cry about the demise of local newspapers while promoting their stories on the silos.

Sadly, many media orgs support Google's Accelerated Mobile Pages technology because media orgs maintain slow, bloated, obnoxiously designed websites, and the media hope that by supporting AMP, their sites will appear higher in Google search results for mobile users.

Then the media whine about Google and Facebook dominating the digital ad market.

Journalists could help promote the open web by:

  • maintaining personal websites at their own domain names
  • publishing RSS/Atom feeds of their work
  • supporting Webmentions for replies and discussion

This is not about journalists ending their social media presences. This is about journalists using their personal websites in addition to their Twitter accounts. It might require adjusting how they spend their time, but they could make it work if they desired.

But it seems that journalists are addicted to Twitter. They enjoy wallowing in their own filter bubble, echo chamber, and vacuum.

In my opinion, the journalists' heavy usage of Twitter, their quick reactions to anything, and their thinking that everything is breaking news have contributed to the spread of misinformation.

Apparently, it's better to be fast than correct, and it's better to share an unverified breaking news story on Twitter than to exercise patience and wait for more information.

After all, facts and corrections can occur later when few people are paying attention because the story has aged, and everyone has moved on to the next outrage.

If journalists embraced the open web, such as using personal websites, feeds, and Webmentions, it could be viewed as adopting the Slow Web Movement and the Slow News Movement, which could help develop trust between journalists and the rest of the public. The Slow Web and Slow News Movements could reduce the amount of misinformation that spreads wildly.

It seems that patience is no longer a virtue. I eschew breaking news. I wait 12, 24, or 48 hours to consume the information, when the media have more facts. Breaking news is usually synonymous with incorrect news and being uninformed.

Embracing the open web might encourage information consumers to leave the silos' native apps for news. It could reunite users with the concepts of web browsers and URLs.

If a user manually copies and pastes a media org's story URL into a silo, then that's okay because nothing can be done to stop that activity. But media orgs should remove the social media sharing links from their story pages. Why do media orgs encourage the behavior that they criticize?

Instead of following media orgs within the social media silos, users would need to visit the media sites directly and/or follow the media orgs and the journalists by using feed readers.

Media orgs and journalists should promote their own websites first, and only their own websites.

Here's some "wonderful" news from Google, which desires to control the web and email with its own so-called standards. It would be like wolves developing standards for how sheep should act.

AMP is an abomination for the open web. It's pathetic and sad that media orgs support AMP. It shows that media orgs are weird. They trash Big Tech while relying on it heavily.

HN comment:

I think that most people are worried about Google using a controversial, draft web "standard", that introduces a major change in how the web works, in mass production, without trying to first resolve the problems raised with the proposal.

HN comment:

If you're going to break the way the web works, you might as well break it hard. At least that seems to be Google's philosophy with this unasked-for inserting itself between people and publishers.

Whoa, hold on now. The websites' CMS apps have to create content that supports Google's idea for a web standard. Without supporting AMP, the sites will not appear in Google's dubious AMP-related search results, which are displayed in a carousel format.

Unfortunately for the open web, media orgs want their stories to appear in Google's carousel for mobile web users.

In my opinion, media orgs embrace every nefarious web tech out of fear of being left behind.

Wired.com article from this week. Here's the Mediagazer headline:

Interviews with 65 current and former employees on the meltdown inside Facebook over the last 15 months as the Cambridge Analytica scandal unfolded

I don't care about the Wired article nor the soap opera drama at Facebook, which employs about 20,000 people.

As usual, the reactions connected to the Mediagazer link involved people using Twitter, mainly journalists. Maybe it's a Mediagazer issue with how it finds reactions to stories. Anyway, here's the discussion attached to that Mediagazer link.

Two of the dumbest methods for posting to the web are used on Twitter.

The worst method is when a Twitter user types a bunch of text, takes a screenshot of the text, and then posts the image of the text to Twitter.

In the world of web publishing, that's regressing, devolving.

That stupidity is done to swerve around Twitter's character limit for a single tweet. Apparently, these Twitter users cannot imagine any other means to bypass the character limit. (Hello, personal websites and syndication.)

The other hideous method is the dreaded tweetstorm. WHY????

Blogger launched in 1999, twenty years ago. The issue of typing a lot of text and posting it to the web was solved in the 1990s. It was solved on a wider scale in the early aughts with Greymatter, Movable Type, WordPress, etc. Those apps were created more than 15 years ago.

I don't read tweetstorms. Jay Rosen teaches journalism. I don't think that he practices or "does" journalism. But when journalists create tweetstorms, they are harming the open web. And when a journalism professor with his own personal website does it, that's a double whammy for the open web.

Here's Rosen's tweetstorm, which I think is related to the above Wired story.

Rosen maintains a personal website at http://pressthink.org, but whatever his Tweetstorm was about, he did not post it as a blog post to his personal website, since his last website post was dated Apr 7, 2019.

I don't understand that behavior.

Moving on.

Here's an NBC News story from this week with the following Mediagazer headline:

Both Twitter and YouTube struggled again to contain the spread of conspiracy theories and Islamophobic content amid a news event, as Notre Dame burned

Since I don't use social media, I was oblivious to the above nonsense, but apparently, whatever happened occupied a lot of journalists' time.

Tweet from a sports writer at a Toronto paper.

This thread is a useful list of brain poison. The Internet is broken

Another "broken" claim. Great. The sports writer added a follow-up tweet.

I say we shut down the Internet and see how we do just in case tho

Instead of shutting down the internet, maybe that journalist should lead by example and at least shut down his Twitter account. Nope. He has posted a lot to Twitter since making his moronic statements above.

According to the user's Twitter profile ...

Toronto Star sports columnist, father of four ...

He has nearly 140,000 Twitter followers. Yeah, he wants the internet shut down. Also, he has posted 269,407 tweets. !!!!

How does he have time to be a columnist and more importantly a father to four kids? That much Twitter activity would seem to imply detachment from reality.

Here's a tweet, related to that same NBC News story.

there’s a very good case that it’s irresponsible for us all to be using this website if the platform is this irresponsible. maybe we should all seriously think about what it means to post all our stuff alongside an increasing volume of garbage

Yep. Why do journalists love to use a silo that promotes hate, violence, and misinformation? They could post their thoughts at their own domain names. Depending upon whether or how they accept comments, the journalists could be the only ones posting content to their sites. Even if they permitted Webmention comments, they could remove the garbage ones. They would be in charge of what occurs on their websites.

Reply tweet to the previous tweet:

Was thinking that earlier. It's a profit-hungry monopoly with no corp responsibility or governance apparent. And by using it? We reward that approach and further enrich it.

And will any of those people embrace the open web? Unlikely.

The open web and the rest of the internet promote hate, violence, and misinformation too, but at least with the open web, we have more control.

The content displayed on our personal websites and within our feeds is not controlled by algorithms nor by any other human.

When we use feed readers to consume feeds, algorithms do not control how the feeds are displayed. When we view our feed readers, we don't see recommended feeds nor content from people we don't follow.
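As a sketch of how simple that is, here's a minimal feed reader in Python, using the third-party feedparser library; the two feed URLs are hypothetical placeholders. The only ordering is reverse-chronological:

    import time

    import feedparser  # third-party library: pip install feedparser

    # Hypothetical subscription list; swap in real feed URLs.
    FEEDS = [
        "https://example.com/atom.xml",
        "https://example.org/rss.xml",
    ]

    entries = []
    for url in FEEDS:
        parsed = feedparser.parse(url)
        source = parsed.feed.get("title", url)
        for entry in parsed.entries:
            # Keep every entry; nothing is filtered, ranked, or recommended.
            when = entry.get("published_parsed") or time.gmtime(0)
            entries.append((when, source, entry.get("title", ""), entry.get("link", "")))

    # Newest first. Chronology is the only "algorithm" involved.
    entries.sort(key=lambda item: item[0], reverse=True)
    for when, source, title, link in entries:
        print(time.strftime("%Y-%m-%d", when), "|", source, "|", title)
        print("    " + link)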

The fear-and-rage-producing silos want to anger people to keep users within their silos. On mobile devices, the silos prefer that people use the silo native apps. The silos don't want people using web browsers.

The open web has more barriers, making it harder to create, discover, and share content. These barriers pushed users to adopt the silos, such as Facebook and Twitter, which built better mousetraps. I understand.

Easier, however, does not always equal better. Facebook and Twitter create flooded rivers of information. These services become addictive fear-and-rage machines.

And yes, people can use the silos in ways that are beneficial by limiting who and what they follow and what content they create, etc.

Maybe the phrase should say "open internet" instead of "open web", since email is a separate application-layer protocol from the web. When you own a domain name, you can manage a custom email address that uses it. And the same domain name can be used to host services over other protocols, such as Gopher, FTP, and IRC.

Sometimes, I follow too many feeds within my feed reader. Sometimes, I subscribe to too many email newsletters. My response is to reduce the number of feeds and email newsletters that I subscribe to. Simple.

My feed-reading and email-reading activities are private. The publishers might know that I subscribe to their feeds and email newsletters, but that's about all they know about me. They are not connecting the content that I create here at sawv.org with my other activities on the open web and selling that profile to advertisers.

I will not join any service that maintains narcissistic counters that cause users to sacrifice themselves and lose their originality by creating content that generates likes and shares.

I will not join any service that uses algorithms to decide what I should see. If I follow someone, then I expect to see all of their content.

When I subscribe to a feed or an email newsletter, I expect to see everything contained within those products. My feed reader and email client do not use algorithms to display only parts of a feed or a newsletter.

In my opinion, the web, the media, and our knowledge would be improved if journalists and their respective media orgs embraced the open web. Journalists don't have to shun the social media silos completely. They could reduce their usage of and reliance on social media, or at least use their own domain names alongside social media.

It would be amazing for the open web/open internet if at least 5 to 10 percent of journalists were active users of one or more of the following ideas:

  • personal websites at their own domain names
  • RSS/Atom feeds and feed readers
  • email newsletters
  • Webmentions
  • blogrolls and webrings
  • lightweight, fast web design

Email uses a separate application-layer protocol from the web; hence the reason to include "open internet". Most of what's listed above, however, runs over the web.

It would be amazing for empathetic web design if media orgs designed simple, functional, useful, lightweight, fast websites.

The media orgs could start with https://text.npr.org and then slowly scale up the fanciness, provided that the additional design provides utility. Leave out the JavaScript for READING web pages.

Use simple HTML to display text and images. Use simple CSS to make the web pages more comfortable to view on all devices. Please, no grey text on a grey background. And leave out the trackers and crapware.
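To show how little is needed, here's a tiny Python sketch that writes such a page to disk; the title, body text, and file name are hypothetical placeholders. Readable line lengths, dark text on a white background, responsive images, and zero JavaScript:

    # Minimal sketch: generate a lightweight article page with simple
    # HTML, a few lines of CSS, and no JavaScript or trackers.
    PAGE = """<!DOCTYPE html>
    <html lang="en">
    <head>
    <meta charset="utf-8">
    <meta name="viewport" content="width=device-width, initial-scale=1">
    <title>{title}</title>
    <style>
      body {{ max-width: 40em; margin: 1em auto; padding: 0 1em;
              color: #222; background: #fff; }}
      img  {{ max-width: 100%; height: auto; }}
    </style>
    </head>
    <body>
    <h1>{title}</h1>
    {body}
    </body>
    </html>"""

    with open("article.html", "w", encoding="utf-8") as f:
        f.write(PAGE.format(title="A 700-word story",
                            body="<p>The story text goes here.</p>"))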

Unfortunately, the media create some of the worst websites on the planet. Horribly bloated. Loaded with trackers and ads. Blank or broken with JavaScript disabled.

4 megabytes downloaded to read a 500-word opinion piece??? It's hard to have sympathy for media orgs going through tough financial times when they offer hostile delivery mechanisms.

The web does not have to be this complex and bloated for the media to inform the citizenry. The media should create websites that display well within the Lynx and NetSurf web browsers because then those sites will load fast in modern web browsers and over slow internet connections.

The media would not need to support AMP if they maintained their own lightweight, fast websites. But they would continue to support AMP regardless because of the fear of not appearing high in Google search results.

It's almost as if Google is extorting media orgs to abide by Google's version of the web.

And the media cowers, which means that the open web loses more ground.

The media loses more, too, even when it believes it's winning small victories, like seeing a jump in referral traffic from Google because of supporting AMP.

Ultimately, the media enslaves more of its business to Big Tech. No victories exist, except for Google.

Many media orgs create obnoxiously complex websites that require users to run the latest version of Chrome or Firefox. Older computers moan and groan under the strain of displaying media web pages. Phone batteries drain faster because of bloated web pages.

Hey, what about the planet? Here's an excerpt from my July 2018 post titled Eco-friendly Web Design:

Web bloat shaming. Hilarious. Accuse owners and creators of bloated websites that they are destroying the planet.

Bloated websites consume more energy due to the crapware that readers are forced to download and execute on their computers, draining batteries and red-lining CPUs.

If the media created simple websites that worked fine within NetSurf, then the media would be leaders in promoting a simpler web. With a simpler web, we might have more independent web browser development that would lead to interesting ideas for consuming media content.

Independent web browsers could make it easy to see and subscribe to feeds on websites. The web browsers could contain simple feed readers, like the Opera Mini web browser. This could help spread the idea of feeds and feed readers to newbies who may later use more sophisticated feed readers.

In my opinion, the complex, bloated web is pushing modern web browser development toward one rendering engine, which is the one that Google maintains. The open web dies some more. It's easier for Google to push its version of the web when most users rely on Google's rendering engine.

For reading media websites, a simplified web, minus JavaScript and other complexity, could encourage more independent web browser development.

Since few orgs have the resources to create huge web browser applications that support the complex web, it makes sense that the world's most popular web browser is aligned with one of the biggest tech companies in the world.

In my opinion, the overuse and misuse of client-side JavaScript has created a world where most web users will rely on Google's rendering engine. Google loves the complex, bloated web. Ironically, Google pushes AMP to create simplified web experiences. What a vicious circle.

Create unnecessarily complex websites to display a 700-word article because new tech is always cool to use, regardless of its utility. In order for people to view your cool website, users will need the latest version of Chrome. But now your cool website is bloated and slow to load over mediocre internet connections. Don't worry; we have a solution for you: support AMP to create a lightweight, fast-loading version of your 700-word article. Never mind that the simple article under AMP does not contain the new, cool tech that we encouraged you to use on your own website.

What's the point?

If journalists and media orgs exploited the open web more, then maybe that would encourage users to start their own personal websites. And if people discussed feeds, feed readers, blogrolls, webrings, Webmentions, lightweight web design, etc., that could spread the open web to others.

The open web/open internet could see slow, organic, viral growth. It would begin with a few journalists embracing the open web. Then some users might try feed readers and email newsletters. Then more journalists would embrace the open web, which would encourage more users to try feed readers and email newsletters. Then some users would launch their own personal websites. Then some media orgs would embrace the open web, causing more journalists and users to follow.

That slow growth would continue over a long time. People would learn that they can use the open web in addition to or as a replacement for social media.

The open web and a simpler web would allow journalists, the media orgs, and the rest of us to break free of Big Tech's stranglehold on us.

We can still use the complex, bloated web browsers to access complex, bloated web apps when NECESSARY.

It's unnecessary, however, to require complexity and bloat to read text articles.

Maybe this utopia would encourage developers and designers to build simpler websites and web apps, outside of the media industry.

At the moment, the scorecard is bleak. Directly and indirectly through their actions, journalists and media orgs harm the open web, endorse complexity and bloat, and limit web browser independence.


Apr 18, 2019

While perusing Mediagazer today, I saw this story about some kind of spat that has occurred, mainly on Twitter, I think. It all sounds boring and nonsensical.

Within the attached discussion for the Mediagazer story thread, I saw this tweet.

NYT: “We will use Twitter instead of a public editor to gauge reader reactions and concerns”
NYT: runs gossipy hatchet job on their front page
Twitter: [angry emojis]
NYT: “You know what we should do? Double down!”

Uh, whatever. I was interested in this part of the tweet, which contained no links to support the person's claim.

NYT: “We will use Twitter instead of a public editor to gauge reader reactions and concerns”

I have no idea if that's true. I'm guessing that it's not. I thought that the NY Times had a public editor.

Why would the NY Times rely on the cesspool of the internet to gauge reader reactions and concerns?

The NY Times boasts a large and talented IT staff or technical R&D department. The NY Times could create its own digital feedback system that would not rely on the toxicity of Twitter.

This got me thinking about how incredible it would be if the NY Times implemented a digital feedback system, based upon Webmentions, similar to what I implemented with my concept message board site kleete.com.

Users would have to create their Webmentions on their own websites.

The NY Times would have to decide whether to allow posts that originated from username.blogspot.com, username.wordpress.com, username.tumblr.com, and similar CMS-hosted services.

Maybe the NY Times blocks those types of Webmentions, but it accepts Webmentions from those same services when they host users' own domain names, via domain-name mapping.

Webmentions from username.com would be accepted over username.blogspot.com.

Blocking websites that don't use their own domain names, such as mysite.hostingco.com, could reduce Webmention abuse or spam.

The NY Times feedback system should definitely block Webmention posts that originate from social media silos, such as Medium, Facebook, and Twitter.
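That policy is only a few lines of code. Here's a hedged Python sketch; the specific block lists are my assumptions, not anything the NY Times has proposed:

    from urllib.parse import urlparse

    # Hypothetical block lists for a Webmention feedback system.
    BLOCKED_HOSTS = {"twitter.com", "facebook.com", "medium.com"}
    BLOCKED_SUFFIXES = (".blogspot.com", ".wordpress.com", ".tumblr.com")

    def source_allowed(source_url):
        """Accept Webmentions only from sites using their own domain names."""
        host = (urlparse(source_url).hostname or "").lower()
        if not host or host in BLOCKED_HOSTS:
            return False
        # Reject subdomains of CMS-hosted services (username.blogspot.com);
        # username.com mapped onto those same services still passes.
        return not host.endswith(BLOCKED_SUFFIXES)

    print(source_allowed("https://username.blogspot.com/reply"))  # False
    print(source_allowed("https://username.com/reply"))           # True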

The NY Times might not receive much feedback through Webmentions, but it could help ignite discussions about the open web and IndieWeb concepts. Maybe over time, more people would set up their own websites that use their own domain names.

The NY Times feedback form would supply text input fields that would accept URLs for the users' source posts (the Webmention replies) and the NY Times target articles (the stories being replied to), similar to how I accept Webmentions here at sawv.org.

Or each story page would contain one text input field for users to copy and paste the URLs to their replies. That would be a commenting system, which the NY Times could easily implement.
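The receiving side could be equally small. Here's a minimal sketch using the third-party Flask library; the /webmention route, the URL prefix, and the queue_for_moderation helper are hypothetical, and per the Webmention spec, a real endpoint would verify asynchronously that the source page actually links to the target:

    from flask import Flask, request  # third-party: pip install flask

    app = Flask(__name__)

    # Hypothetical: only accept replies that target this publisher's pages.
    TARGET_PREFIX = "https://www.nytimes.com/"

    def queue_for_moderation(source, target):
        print("queued:", source, "->", target)  # stand-in for real storage

    @app.route("/webmention", methods=["POST"])
    def receive_webmention():
        # Webmention submissions are form-encoded source and target URLs.
        source = request.form.get("source", "")
        target = request.form.get("target", "")
        if not target.startswith(TARGET_PREFIX):
            return "target is not one of our pages", 400
        queue_for_moderation(source, target)
        return "", 202  # accepted for asynchronous verification

    if __name__ == "__main__":
        app.run()  # dev server for local testing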

Accepting Webmentions on every story page sounds different than providing feedback to the public editor.

Maybe the user feedback is posted only to the same page. If that was the case, then the NY Times would need only one text input field, located on one page. The target URL would always be the same: the NY Times feedback page.

If readers have gripes or praise, then they would create their responses on their own websites, with their posts containing the URL of the NY Times public editor feedback page. Then the readers would copy the URLs that point to their Webmention feedback posts and paste them into the text input field, located at the NY Times feedback page, which would be the target URL.

If the same readers have another gripe, then they would need to create a new post and repeat the above. This could be automated, which could lead to spamming or abuse. But the NY Times could add throttling to limit posts from the same domain name to one per week or one per month.
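A sketch of that throttle, assuming an in-memory table keyed by domain name (a real system would persist it); the one-week window matches the idea above:

    import time
    from urllib.parse import urlparse

    ONE_WEEK = 7 * 24 * 60 * 60  # seconds
    last_accepted = {}  # domain name -> timestamp of last accepted Webmention

    def within_rate_limit(source_url, now=None):
        """Allow at most one Webmention per source domain per week."""
        now = time.time() if now is None else now
        domain = (urlparse(source_url).hostname or "").lower()
        previous = last_accepted.get(domain)
        if previous is not None and now - previous < ONE_WEEK:
            return False
        last_accepted[domain] = now
        return True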

The NY Times could refuse Webmentions that are sent programmatically. The CMS app that I use for sawv.org can send Webmentions automatically when I create a post. I don't have to copy and paste my URL manually into a text input field at the website that I'm responding to.

sawv.org can also receive Webmentions programmatically because I display my Webmention API endpoint within the HTML of my homepage. But if a commenter uses a CMS, such as Blogger, that does not support sending Webmentions, then that person can use the manual approach of copying and pasting his or her Webmention URL into my Webmentions page.
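For reference, here's roughly what "sending a Webmention programmatically" looks like, sketched in Python with the third-party requests and beautifulsoup4 libraries; the two URLs are hypothetical. A spec-complete client would also check the HTTP Link header and resolve relative endpoint URLs:

    import requests                # third-party: pip install requests
    from bs4 import BeautifulSoup  # third-party: pip install beautifulsoup4

    def send_webmention(source, target):
        """Discover the target page's Webmention endpoint and notify it."""
        html = requests.get(target, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        endpoint = None
        for tag in soup.find_all(["link", "a"]):
            if "webmention" in (tag.get("rel") or []) and tag.get("href"):
                endpoint = tag["href"]  # may be relative on real sites
                break
        if endpoint is None:
            return None  # the target does not advertise an endpoint
        # Per the Webmention spec: a form-encoded POST of source and target.
        return requests.post(endpoint,
                             data={"source": source, "target": target},
                             timeout=10)

    send_webmention("https://example.com/my-reply",
                    "https://example.org/the-article")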

But blocking programs from sending Webmentions and forcing people to copy and paste the URLs into text input fields could slow down or prevent abuse. The NY Times might need to require some kind of captcha test too, during the URL copy-and-paste function.

The NY Times could require Webmention commenters to log into the feedback page/site via IndieAuth before being able to copy and paste the Webmention URLs.

Barriers are good, especially now. Barriers can reduce the noise and raise the signal quality.

Accepting Webmentions is a good way for readers to discover personal websites, although it's possible that the Webmention system could be used and abused by people who want to promote their websites.

Maybe the Webmentions get moderated by the NY Times, but that requires manual labor and is probably not a viable option.

I like these barriers best:

  • requiring commenters to create their replies on their own websites
  • blocking programmatic Webmentions and requiring a manual copy and paste of the reply URL
  • requiring an IndieAuth login before submitting
  • throttling submissions to one per domain name per week or month

The other obvious barrier that the NY Times could use is to permit the above only for subscribers. That's an excellent barrier. Why accept feedback from people who don't pay for the content?

I could log into the NY Times with my normal NY Times digital subscription account, which would remove the need for IndieAuth. After logging in, I could copy and paste my Webmention reply URL into their feedback page or at an article page if the Webmentions were used as comments.

That process sounds simple to me and better than making the process open to anyone. But if opened to anyone, then the IndieAuth step would probably be needed.

If the NY Times accepted feedback only from paying subscribers, via Webmentions that were created at personal websites, and the NY Times required commenters to log into the NY Times with their subscription accounts before submitting their Webmentions, Media Twitter might explode.

News consumers and journalists might be confused and maybe outraged at this anarchy. Webmentions? IndieWeb? Personal websites? URLs? Copy-and-paste? Logging in? Paid subscriptions? Barriers to entry?

What would baffle the confused even more is the possibility that intelligent, civil, and useful comments could be created because of those barriers.

On second thought, the gifted NY Times tech team should not implement a Webmention feedback system because it would break too many brains in the media who cannot grasp commenting outside of Twitter.


Also on Apr 18, 2019.

At Memeorandum.com, I saw this job posting for Slate.

Audience Engagement Editor

Multiple media orgs employ "audience engagement" people or something similar. In my opinion, it sounds bad. It sounds like publishing to satisfy metrics, instead of publishing to inform the public.

Let's see. Excerpts from the job posting:

Slate is hiring an editor to join its audience development and podcasts teams. Ideal candidates know how to ensure our best work—in both text and audio—finds a large and loyal audience. They have the instincts of a journalist and know how to write compelling headlines for our social media feeds, and they understand what inspires podcast listeners to subscribe.

Editors should want to experiment with new strategies and test their assumptions as the podcast and social media ecosystems change.

The audience engagement editor will report to the senior director for strategy and work with Slate’s audience team to:

  • Manage the strategy and scheduling of existing social media accounts and help define the voice of Slate’s feeds
  • Work with editors to write sharp and social-friendly headlines for the work we publish
  • Use data to make decisions about how and where to promote our work

Requirements:

  • At least 2 years of social media experience
  • An interest in working closely with the rest of the newsroom, whether it’s framing a cover story for Facebook or pitching ideas at our weekly editors meeting

Oh, man. The media's reliance on and/or heavy usage of social media will never wane. The media's outrage at Big Tech, especially the social media silos, is phony.

Jul 10, 2019

Via a story that appeared at Mediagazer, I landed on https://mississippitoday.org. I like Mississippi Today's description on its about page.

Mississippi Today is a nonprofit 501(c)(3) news and media company with a forward-facing mission of civic engagement and public dialog through service journalism, live events and digital outreach.

Our newsroom is dedicated to providing Mississippians with reporting that inspires active interest in their state and equips them to engage in community life.

Our Vision

We envision a Mississippi where engaging in the news is a way of life. We think good reporting and the accountability it inspires can change the trajectory of our state.

Our Mission

We commit to engage our readers by:

  • Providing top-notch service journalism via digital dissemination
  • Creating a forum for public dialog through social media
  • Hosting live events where people can interact with newsmakers and reporters

But holy hell. What is going on with this page?

https://mississippitoday.org/category/politics

That page crashes the Chrome web browser even with JavaScript disabled. It does not crash the entire browser, only the tab that attempts to load that page. It causes the browser to produce the "Aw, Snap!" blowup message, which states, "Something went wrong while displaying this webpage."

Repeated refreshing produces the same results. At least the page loads within the text-based web browser elinks. The problem seems to be image-related.

webpagetest.org results for that page.

From: Dulles, VA - Chrome - Cable
7/10/2019, 10:12:50 AM
First View Fully Loaded:
Time: 47.386 seconds
Requests: 210
Bytes in: 26,400 KB

1.6 megabytes of the download were for JavaScript, which is horrible, but nearly 25 megabytes of the download were for images. !!!

Clearly, the publishers of media orgs do no testing of their websites.

According to the waterfall section of the webpagetest.org results, this image was a 22 megabyte download.

https://31lz132jjnab134mc12u4xg5-wpengine.netdna-ssl.com/wp-content/uploads/2019/06/The-Other-Side_Podcast_2019-3000x3000.jpg

The homepage is the same way. !?!?!?!?!?

How can this org be respected as a legitimate media org with such incredibly bad web design?

webpagetest.org results.

https://mississippitoday.org/
From: Dulles, VA - Chrome - Cable
7/10/2019, 11:39:25 AM
First View Fully Loaded:
Time: 54.335 seconds
Requests: 214
Bytes in: 28,616 KB

1.6 megabytes of the download were for JavaScript.

27 megabytes of the download were for images. The homepage.

I have no sympathy for the Big Tech silo platforms crushing media orgs when the media produce websites like mississippitoday.org.

The same image comprises most of the download. WTF?

https://31lz132jjnab134mc12u4xg5-wpengine.netdna-ssl.com/wp-content/uploads/2019/06/The-Other-Side_Podcast_2019-3000x3000.jpg

I'm guessing that someone or something screwed up royally in creating or updating an image that is used as a thumbnail. It might have been an accident, but automated monitoring that tests the load time and page weight of key web pages would expose the error.
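Catching that kind of mistake doesn't require a full webpagetest run. Here's a hedged Python sketch of a page-weight monitor, using the third-party requests library; the 3 MB budget is an arbitrary example, and only the HTML and its images are counted, not scripts or stylesheets:

    from html.parser import HTMLParser
    from urllib.parse import urljoin

    import requests  # third-party: pip install requests

    BUDGET_BYTES = 3 * 1024 * 1024  # example budget: 3 MB per page

    class ImageCollector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.srcs = []

        def handle_starttag(self, tag, attrs):
            if tag == "img":
                src = dict(attrs).get("src")
                if src:
                    self.srcs.append(src)

    def check_page_weight(page_url):
        resp = requests.get(page_url, timeout=30)
        total = len(resp.content)
        collector = ImageCollector()
        collector.feed(resp.text)
        for src in collector.srcs:
            img_url = urljoin(page_url, src)
            head = requests.head(img_url, timeout=30, allow_redirects=True)
            size = int(head.headers.get("Content-Length", 0))
            total += size
            if size > BUDGET_BYTES:
                print("OVERSIZED IMAGE (%.1f MB): %s" % (size / 1e6, img_url))
        if total > BUDGET_BYTES:
            print("Page over budget: %.1f MB total for %s" % (total / 1e6, page_url))

    check_page_weight("https://mississippitoday.org/category/politics")

A 22-megabyte thumbnail would trip the oversized-image alert immediately.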

That image is related to this story:

https://mississippitoday.org/2019/07/01/ep-60-2019-candidates-enter-crunch-time/

The "thumbnail" image, however, is not used on that article page. It's only used on stream pages, such as the politics section and the homepage. After a day or more, that story will get pushed down and off the stream page, I'm guessing.

Here's an article page that does not contain the offending image. webpagetest.org results.

From: Dulles, VA - Chrome - Cable
7/10/2019, 11:51:40 AM
First View Fully Loaded:
Time: 9.354 seconds
Requests: 130
Bytes in: 2,158 KB

77% of the downloaded bytes were for 1.7 megabytes of JavaScript. Now the focus is back on the usual culprit for wretched media web design: JavaScript. 54 web requests were for JavaScript. Why are they using so much JavaScript to display an article that is text, except for two small stock photos? The page displays fine without JavaScript. Why is the JavaScript needed?

288 KB of the download were for images.

Jul 15, 2019

"Theverge.com scores 25/100 on Google's Pagespeed Insights tool (mobile) (developers.google.com)"

https://developers.google.com/speed/pagespeed/insights/?url=https%3A%2F%2Fwww.theverge.com%2F&tab=mobile

https://news.ycombinator.com/item?id=20436367

With an adblocker it's a very fast site and I like how it looks and reads. However I turned it off after I saw this and immediately the frontpage took ~4x as long to load and only showed a black screen for most of that time.

Another HN comment:

Verge loads an absolute crapton of 3rd party trackers and scripts. It's absurd. Ghostery has a field day on that site.

That describes just about every media website.

-30-