Limited access to robots.txt file to exclude directories


    I would like limited access to the robots.txt file so that entire directories (file folders) can be excluded.

    I moved my site here over a year ago, bringing along a custom domain name I had been using for a few years. My old site ran on WordPress.ORG software but had a different directory structure: WordPress.COM can’t handle some of the redirects, some of the directories were outside the WordPress structure, and I also had a test blog running in its own directory. I am still stomping out crawl errors over at Google; my most recent victory was getting the crawl errors below 50. I am approaching 800 exclusions, but I have entered well over 1,000 exclusion requests.

    Still, two days ago I had to enter 30 more crawl errors, 28 of which had been submitted before, some at least five times.

    Google will not remove most content unless it is blocked by the robots.txt file, so every week or two I need to put in ALL the crawl errors yet again. In some cases Google has no clue where it found the page that produced the crawl error, but I still get dinged for it.

    So being able to exclude entire directories would speed up the job of cleaning up these crawl errors.



    YES YES YES!!!!!!!!!!!!!!!!

  • The topic ‘Limited access to robots.txt file to exclude directories’ is closed to new replies.