HN Links - Wed, May 13, 2020

Moths have 'secret role' as crucial pollinators (

Work-from-home boom leads to more surveillance (

After two weeks of working from her Brooklyn apartment, a 25-year-old e-commerce worker received a staffwide email from her company: Employees were to install software called Hubstaff immediately on their personal computers so it could track their mouse movements and keyboard strokes, and record the webpages they visited.

They also had to download an app called TSheets to their phones to keep tabs on their whereabouts during work hours.

"There are five of us. And we always came to work. We always came on time. There was no reason to start location-tracking us," the woman told NPR. She spoke on the condition of anonymity, fearing she could lose her job.

Don’t require a user to be interested twice: Lessons on reducing signup friction (

Know your target audience too. I'm not sure if I'm it, but reCAPTCHA gives me enough friction that I often abandon pages with it. Simply using Firefox's antifingerprinting feature plus some ad/tracker blocking is enough for it to be miserable every time.

Janet: a lightweight, expressive and modern Lisp ( - 209 comments

top comment:

This looks so awesome! It's got the best parts of a lot of languages. This is what sticks out to me:

  • Really simple lisp like scheme, but reminds me of lua (and not bloated like CL)

  • Has resumable fibers, no callcc like scheme

  • Not missing the lack of lists tbh

  • A module system that doesn't feel awkward like CL

  • A built-in package manager (unlike CL)

  • Good lua and C support

  • Threads have a shared-nothing approach with message passing (reminds me of Erlang actors)

  • Destructuring

  • Good PEG support, encourages it over regex

  • (Im)mutable versions of data structures (ie tuples vs arrays, structs vs tables) for maximum flexibility

  • Prototypal inheritance

  • Docs are clean and easy to read

I'll definitely have to try this out, it looks really cool.

Some stuff I'd like to see:

  • Pattern matching support (could be a library I guess)

  • Multimethods

  • Full numeric tower with arbitrary precision types

  • Javascript compilation -- I could see this language being really useful for web and game dev

Does it have good debugging support? I'm thinking of something like slime, swank, etc. Can I set up emacs and/or vim to work the same way I can use CL with slime + swank?

I'm also wondering about the stack traces -- one of the downsides to CL is sometimes the stack traces are nasty to read

Role of Vitamin D in the prevention of Coronavirus 2019 infection and mortality (

I haven't dug into this, and, for the record, I do think vitamin D is likely an important factor here. But I find the two charts to be very worrisome.

Their claim of a very low p-value and a significant effect is VERY strongly influenced by a single outlier. It's not even close. On this unlabeled graph, I think the x axis is vitamin D level and the y axis is deaths per million. They've got ~16 data points in the range of 0-10 deaths, one point at ~20, and one at 65. If you removed the 65 point, the line would get a hell of a lot less convincing.

The modern HTTPS world has no place for old web servers (

Top comment:

Prior to last year's release of macOS Catalina, OS X shipped with a Dashboard widget to display the weather forecast. In 2019, Apple broke this widget by turning off the servers it used.

Luckily, Dashboard widgets are just editable HTML and JavaScript files, so I rewrote a portion of Apple's weather widget to use the DarkSky API instead. Since the entire point of this project was to support a legacy feature (the Dashboard), I really wanted it to work on the full gamut of OS X 10.4 Tiger – 10.14 Mojave.

My modified version worked fine on 10.9 and above, but on 10.4 – 10.8, users reported being unable to retrieve any weather data. After some back and forth of looking at logs, I found the problem—old versions of OS X didn't support the modern iteration of HTTPS required by DarkSky. I couldn't fix this, because DarkSky doesn't offer their API via HTTP.

Was this really necessary? Weather forecasts are public information, so what level of safety is provided by HTTPS to the point where it should be not just a default, but the only option for developers?

Sensible reply:

I agree. I treat cyber security a lot like physical security.

"Always lock your front door" is good advice, just like "always use https" is good advice. But I'm not going to lock and deadbolt my door if I'm only walking out to grab something from my car and returning immediately.

Stupid reply:

The issue is that an MITM attacker can modify the traffic between client and server if it's not protected by HTTPS.

They could include a script tag and then run JS in the context of the widget which may have undesirable effects.

If the traffic is traveling over a connection you aren't 100% sure you control entirely, wrap it in HTTPS.

First, we are aware of the possibility of man-in-the-middle attacks on HTTP connections. But in the case mentioned by the original commenter, regarding access to weather APIs, this person explains it well:

That's not a concern for a REST API where you are just grabbing and parsing json data. Even if someone injects some code into the result, it just gets parsed as a string and the worst thing that happens is the widget displays the wrong thing.

Right. If a jackass man in the middle attacker changes my weather data to make the forecast for today be sunny and 72 when the actual forecast is for rain and snow and 35 degrees F, then I would like to buy that MITM attacker a gift.

From the web post:

When I ran into Firefox's interstitial warning for old TLS versions, it wasn't where I expected, and where it happened gave me some tangled feelings. I had expected to first run into this on some ancient appliance or IPMI web interface (both of which are famous for this sort of thing). Instead, it was on the website of an active person that had been mentioned in a recent comment here on Wandering Thoughts. On the one hand, this is a situation where they could have kept their web server up to date. On the other hand, this demonstrates (and brings home) that the modern HTTPS web actively requires you to keep your web server up to date in a way that the HTTP web didn't. In the era of HTTP, you could have set up a web server in 2000 and it could still be running today, working perfectly well (even if it didn't support the very latest shiny thing). This doesn't work for HTTPS, not today and not in the future.

Regardless of how difficult or easy it is to add and update TLS certificates on a web server (every three months with Let's Encrypt), it's one more bit of admin-tax bullshit that social media silo users never have to endure.

People who post on their own websites hosted at CMS providers may also escape this admin tax. Pointing a leased domain name at a blog hosting service, such as Blogger, Svbtle, or Ghost, means those personal website authors have fewer admin-tax duties, but they may also have less freedom to customize their personal web space.

In practice there are a lot of things that have to be maintained on a HTTPS server. First, you have to renew TLS certificates, or automate it (in practice you've probably had to change how you get TLS certificates several times). Even with automated renewals, Let's Encrypt has changed their protocol once already, deprecating old clients and thus old configurations, and will probably do that again someday.
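For what it's worth, the automation he alludes to usually amounts to a single scheduled job. A minimal sketch, assuming certbot manages the certificates and nginx serves them (paths and service names vary by distro):

```shell
# /etc/crontab entry: attempt renewal twice daily. certbot is a no-op
# unless a certificate is within ~30 days of expiry, then reloads nginx
# so the new certificate is actually served.
0 3,15 * * * root certbot renew --quiet --post-hook "systemctl reload nginx"
```

Which, of course, is exactly the kind of thing that quietly breaks when the protocol or the client gets deprecated out from under you.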

YEP. I'm aware, and it was frustrating.

And now you have to keep reasonably up to date with web server software, TLS libraries, and TLS configurations on an ongoing basis, because I doubt that the deprecation of everything before TLS 1.2 will be the last such deprecation.
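As a concrete example of the configuration churn he means, here is the sort of line (real nginx directive; the surrounding setup is assumed) that admins had to revisit when everything before TLS 1.2 was deprecated:

```nginx
# Before: many guides once shipped this, and it now triggers browser warnings.
# ssl_protocols TLSv1 TLSv1.1 TLSv1.2;

# After: the currently acceptable set. This line will need editing yet again
# whenever TLS 1.2 is eventually deprecated in its turn.
ssl_protocols TLSv1.2 TLSv1.3;
```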

I can't help but feel that there is something lost with this. The HTTPS web probably won't be a place where you can preserve old web servers, for example, the way the HTTP web is. Today, if you have operating hardware, you could run an HTTP web server from an old SGI IRIX workstation or even a DEC Ultrix machine, and every browser would probably be happy to speak HTTP 1.0 or the like to it, even though the server software probably hasn't been updated since the 1990s. That's not going to be possible on the HTTPS web, no matter how meticulously you maintain old environments.

Actually, it will still be possible to access old HTTP 1.0 servers in the future by using limited web browsers, such as Links2 and NetSurf.

It's up to the web clients to stop using HTTP. Google's Chrome, Apple's Safari, and Microsoft's Chromium-based Edge could decide to stop supporting HTTP, which would wipe out user access to many old websites unless users switch to a different web browser. Since hardly anyone uses Firefox, it doesn't matter what Mozilla does, but Firefox will also stop supporting HTTP some day.

But those big tech pricks cannot stop us from creating new limited web browsers and from using old limited web browsers. In order to prevent Links2 from accessing an HTTP 1.0 website, our internet service provider or DNS or some other layer of the internet would have to block plain text content.

I do not use any modern, mainstream web browsers on my laptop. In recent weeks, my main browser on my Linux laptop has been Pale Moon, which is based on an older version of Firefox but continues to be updated. It's superior to Firefox, in my opinion.

I also use the Links2 web browser occasionally.

Another, more relevant side of this is that it's not going to be possible for people with web servers to just let them sit.

That's for damn sure. And it's damn unfortunate. Thanks, geeks, for screwing things up.

The more the HTTPS world changes and requires you to change, the more your HTTPS web server requires ongoing work. If you ignore it and skip that work, what happens to your website is the interstitial warning that I experienced, and eventually it will stop being accepted by browsers at all. I expect that this is going to drive more people into the arms of large operations (like GitHub Pages or Cloudflare) that will look after all of that for them, and a little bit more of the indie 'anyone can do this' spirit of the old web will fade away.

Twitter Will Allow Employees to Work at Home Forever ( - nearly 1200 comments

An Illustrated Guide to Masked Wrestlers (

Sciter – Multiplatform HTML/CSS UI Engine for Desktop and Mobile Applications (

Farm in SE Australia is growing native grains for flour and bread (

A 2020 Vision of Linear Algebra (

Show HN: Writing for Software Developers (

The economic devastation of the pandemic could kill more people than the virus (

Facebook is helping to set up a new pro-tech advocacy group (

Update to L.A.'s stay-at-home orders (

I guess I will be the "Did you read the article?" guy because most of the people in the comments here are really overreacting to what was said in the article. This isn't a plan to maintain a full lockdown for the next 3 months. It was simply a comment that the stay-at-home orders are unlikely to be completely gone in three months. It is talking about the return to semi-normal life. It isn't saying all non-essential businesses need to be closed for the next 3 months. LA County has already moved to start opening up and this article says they will continue on that path.

Castor: A browser for the small internet (Gemini, Gopher, Finger) (

Gemini – A new, collaboratively designed internet protocol (