Prevent AI from using content of my blog
-
Hi there,
How can I prevent AI from using the content of my blog without losing its visibility to search engines?
The blog I need help with is: (visible only to logged-in users)
-
Thanks, but since I write about contemporary classical music, there is no revenue whatsoever, so I can’t afford a paid blog, let alone a business arrangement.
-
All those plugins work through robots.txt. While robots.txt provides recommendations to crawlers, it’s not a strict mandate; crawlers can choose to ignore the instructions.
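For a concrete picture, here is a sketch of the kind of file such a plugin writes. The user-agent tokens below (GPTBot for OpenAI, CCBot for Common Crawl, Google-Extended for Google’s AI training) are ones those companies have published; treat the list as illustrative, not exhaustive, since new crawlers appear regularly.

    # Sketch of a robots.txt that opts out of known AI training crawlers.
    User-agent: GPTBot
    Disallow: /

    User-agent: CCBot
    Disallow: /

    User-agent: Google-Extended
    Disallow: /

    # Everything else, including ordinary search engines, stays allowed.
    User-agent: *
    Disallow:

Ordinary search crawlers fall under the final catch-all rule, so they remain free to index the site.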
Never forget the golden rule: everything you put on the internet can and will be misused one way or another.
-
All WordPress.com websites have robots.txt and what appears there depends on your site’s Privacy settings. https://wordpress.com/support/privacy-settings/
Also https://wordpress.com/blog/2024/02/27/more-control-over-the-content-you-share/
-
Thanks, found the setting. The box for preventing third-party viewing was already ticked!
-
But bear in mind this is only a request not to scrape your site; it’s not a strict mandate. Crawlers can choose to ignore the instructions, and they do.
-
Thanks, so there is no other way to prevent AI from using my content than setting it to private?
-
Hello again. Strictly speaking, that’s true. But going fully Private or enabling “Discourage Search Engines” also means legitimate search engines won’t crawl your site, so your posts won’t appear in search results and you’ll lose that traffic. A Catch-22.
Have you looked at your site’s current robots.txt file? Just add /robots.txt to your site URL. It’s instructive.
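If you want to see what that file actually permits, you can also check it programmatically. Here is a minimal sketch using Python’s standard-library robots.txt parser; the site address is a placeholder, so substitute your own blog’s URL:

    from urllib.robotparser import RobotFileParser

    SITE = "https://example.wordpress.com"  # placeholder; use your own blog URL

    rp = RobotFileParser()
    rp.set_url(SITE + "/robots.txt")
    rp.read()  # fetches and parses the live robots.txt

    # GPTBot (OpenAI) and CCBot (Common Crawl) are published AI-crawler tokens;
    # Googlebot is an ordinary search crawler, included for comparison.
    for agent in ("GPTBot", "CCBot", "Googlebot"):
        verdict = "allowed" if rp.can_fetch(agent, SITE + "/") else "disallowed"
        print(agent + ": " + verdict)

As noted above, this only reports what the file requests, not what any given crawler actually does.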
Aside: here’s an interesting article from The Verge about the robots.txt landscape. (I found it in search engine results after searching for “robots.txt”.)
-
Yes, that’s what I meant: it’s no use publishing in-depth articles if no one reads them. I’ll check out the robots.txt option. Thanks also for the link!
- The topic ‘Prevent AI from using content of my blog’ is closed to new replies.