Has anyone put any thought into how to protect your personal blog from generative AI scrapers? I've already blocked OpenAI in robots.txt, but more and more small #generativeAI providers keep popping up that don't honor these requests.
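For reference, here's roughly what my robots.txt approach looks like, extended to a few other crawlers that do identify themselves (GPTBot is OpenAI's crawler, CCBot is Common Crawl, and Google-Extended is Google's token for opting out of AI training without affecting Search indexing; the user-agent names are the documented ones, but compliance is entirely voluntary):

```
# Opt out of known AI training crawlers; regular search indexing is unaffected
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```

Of course this only helps against crawlers that actually honor robots.txt, which is exactly the problem.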
Maybe something like the noise filters artists are using, but with invisible characters? Then again, how do I make sure the Google bot can still see my posts? I don't mind humans using my work, but I take issue with machines.
But it's a struggle to come up with a solution. Everything I can think of hurts the accessibility of my content. Invisible characters could break screen readers. Dynamically decrypting my content with #javascript is bad for accessibility too, and isn't really that strong anyway, so I'm at a loss.
If you ever feel like your life has no purpose, just remember that there's someone at Microsoft whose job is to make sure Microsoft Edge works on #Linux.