

I found the original blog post more educational.
Looks like these may be typosquats, or at least “namespace obfuscation”, imitating more popular packages. So hopefully not too widespread. It’s easy to just search for a package name and copy/paste the first git URL that comes up, but it’s important to look at forks/stars/issue counts too. Maybe I’m just paranoid, but I always creep on the owners of git repos a little before I include their stuff; I can’t say I do that for their includes, and those includes’ includes, etc. If this were included in Hugo or something huge I would just be fucked.
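The name-vetting instinct above can be partly automated. A minimal sketch, assuming a hand-picked list of popular package names and an arbitrary edit-distance threshold (both made up here for illustration, not from the post):

```python
# Hypothetical typosquat check: flag a dependency name that is close to,
# but not exactly, a well-known package name. The POPULAR list and the
# distance threshold are illustrative assumptions.

def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                  # deletion
                            curr[j - 1] + 1,              # insertion
                            prev[j - 1] + (ca != cb)))    # substitution
        prev = curr
    return prev[-1]

# Illustrative list; a real tool would pull download-ranked names.
POPULAR = ["requests", "urllib3", "numpy", "colorama"]

def likely_typosquat(name: str, max_dist: int = 2) -> bool:
    """True if `name` is near, but not equal to, a popular package name."""
    return any(0 < edit_distance(name, p) <= max_dist for p in POPULAR)
```

This only catches lookalike names; it says nothing about the owner, forks, or issue history, which still need a human eyeball.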
I see it as the continuation of a very old problem. Old-school engineering didn’t have any standards until a bunch of people died over and over and the public demanded change. The railroads, construction tycoons, factory owners, mine operators, etc. all bitterly fought, and still fight, engineering safety requirements. The computer industry has continued this tradition: it opposes public action, hides negative information, and tries to pin blame for conspicuous failures on individuals rather than systemic rot.
I think it’s also because of the relatively less visceral nature of software catastrophes that we don’t have a culture of safety. That’s not to say software errors can’t cause horrific accidents, but the power grid going down and causing a dozen people in the service area to die is less traumatic than a bridge collapsing and sending a dozen people into an icy river. That’s an extreme example, but my point is that humans undervalue harms that seem less acutely, physically brutal, and software just seems more abstract.
Most of us aren’t working on the power grid either, so when you start trying to quantify our software’s risks you have to speak of “harms” rather than crimes like negligence, and then you expose a huge contradiction in how responsibility is allocated socially: not only should engineers, pilots, and doctors have a higher responsibility to prevent harm, but so should cops, journalists, politicians, billionaires, etc.
So the risks are undervalued and minimized, both intentionally and unconsciously. The result is that most of us who’ve seen the inside are quietly horrified, and that’s the end of it.
I don’t know what the answer is except unignorable tragedies, because those seem to be the only thing powerful enough to build regulations, which are then constantly eroded.