Googlebot Site Crawling Issue – (robots.txt)
-
Happy Holidays to all!
After a year of good crawling with no apparent issues, I decided I needed to get away from my blog for a while and turned it all off as best I could without deleting it entirely. I set my public visibility to private and blocked search engines from indexing my site, amongst other things.

I’ve spent the last day attempting to get it all back up online, with public visibility and bot crawling. But Google says: “Googlebot is blocked from http://thedirtylowdown.wordpress.com/”, and the current robots.txt file shows nothing but “Disallow” all the way down the page, through a dozen or so line items: disallow everything.

How can I fix this? Thanks.
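(For reference, a fully blocking robots.txt boils down to a blanket rule like the sketch below. This is illustrative, not my exact file, since WordPress.com generates the real one:)

User-agent: *
Disallow: /

A site that is open to crawling serves the opposite, an empty Disallow value, which blocks nothing:

User-agent: *
Disallow: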
-
Hey, this is what Google says:
_______________________________

Googlebot is blocked from http://thedirtylowdown.wordpress.com/
Blocked by line 3: Disallow: /
Detected as a directory; specific files may have different restrictions
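Incidentally, here is a quick way to check what Googlebot is allowed to fetch, as a minimal sketch using Python’s standard-library robots.txt parser (it assumes the file is served at the usual /robots.txt path; swap in your own URL):

import urllib.robotparser

# Download and parse the site's live robots.txt.
rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://thedirtylowdown.wordpress.com/robots.txt")
rp.read()

# Ask whether Googlebot may fetch the homepage. A blanket
# "Disallow: /" rule makes this print False.
print(rp.can_fetch("Googlebot", "http://thedirtylowdown.wordpress.com/"))

If switching the blog’s visibility back to public regenerates the robots.txt, this should start printing True once the change propagates.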