

Personally, I have nothing against crawlers and bots.
If they’re implemented reasonably, web crawlers aren’t the issue. The problems with them mostly stem from laziness and cost cutting. Web crawlers run by AI companies frequently DDoS entire services, especially Git forges like GitLab or Forgejo. Not “intentionally”, but because these crawlers blindly request every URL on a service, no matter the actual content. That’s cheaper for the AI company: scrape everything now, sift through the data later. But it also forces the service to render and serve tens of thousands of times as much content as actually exists, because a Git forge generates a page for every commit, diff, and blame view of every file in every repository. On top of that, these crawlers are built to disguise themselves while doing so, which is the biggest reason we now see “modern” PoW CAPTCHAs everywhere, like Anubis or go-away.
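For context, the core idea behind these PoW CAPTCHAs is simple: the server hands the browser a random challenge and only serves the page once the browser has burned some CPU finding a matching nonce. Verifying a solution costs one hash; finding one costs thousands. That’s negligible for a human loading one page, but ruinous for a bot loading millions. Here’s a minimal Python sketch of the idea; the SHA-256 scheme and the difficulty value are illustrative assumptions, not the exact parameters Anubis or go-away use:

```python
import hashlib
import secrets

# Difficulty: required number of leading zero hex digits in the hash.
# Illustrative only; real tools tune this (and may scale it per client).
DIFFICULTY = 4

def make_challenge() -> str:
    """Server: issue a fresh random challenge with each page request."""
    return secrets.token_hex(16)

def solve(challenge: str) -> int:
    """Client (in real deployments this runs as JavaScript in the browser):
    brute-force nonces until the hash meets the difficulty target."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * DIFFICULTY):
            return nonce
        nonce += 1

def verify(challenge: str, nonce: int) -> bool:
    """Server: one hash to verify, ~16**DIFFICULTY hashes (on average) to solve."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * DIFFICULTY)

if __name__ == "__main__":
    challenge = make_challenge()
    nonce = solve(challenge)
    print(verify(challenge, nonce))  # True
```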
Robots.txt used to work because search engines needed there to be an “internet” to provide their services. Pre-AI web crawlers were written with the understanding that knocking a service offline meant one more website gone from the index, which lessened the overall quality of search results.
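For reference, robots.txt is nothing more than a plain-text file of voluntary rules that a polite crawler fetches before anything else. Something like this (GPTBot is OpenAI’s published crawler user agent; Crawl-delay is widely honored but non-standard, and the paths are made-up examples):

```
# Block OpenAI's crawler from the whole site
User-agent: GPTBot
Disallow: /

# Everyone else: stay out of the example /admin/ path, pause between requests
User-agent: *
Disallow: /admin/
Crawl-delay: 10
```

The catch is that this is purely an honor system: a crawler that ignores the file loses nothing, which is exactly what changed once the crawlers no longer cared whether the sites they scraped survived.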
I’ve had LLM web crawlers take down my whole server by DDoSing it several times. Pre-LLMs, a Git forge would take maybe a couple hundred MB of RAM and sit mostly idle when not in use. Nowadays, without a PoW CAPTCHA in front, a small, single-person Git forge often sees over 10,000 active concurrent connections. This pushes hosting costs through the roof for any smaller entity.

The smaller/newer distros have no track record yet, so it’s hard to judge whether they’ll still exist in another couple of years. Distros like Bazzite are definitely interesting, but you can’t reliably predict whether it’ll still be getting updates in 10 years. If longevity matters, there are stable community-led distros that have been around for a long time, like Debian.