Robots.txt generator set for optimal seo benefits?
I had some struggle with the generated robots.txt and Google’s own testing tool, which checks Googlebot access against the allow/disallow rules in robots.txt.
Anyway there are two lines that make me wonder:
The first line, I thought, would allow access for all bots, which could be nice SEO-wise.
The last one, I thought, would be a general disallow. But since that line is at the end, I assumed it would be fine, because bots are already allowed at the top.
In summary, Google was not able to crawl my site optimally with these settings. I had to comment out Disallow: /* and add
User-agent: * Disallow:
(at the end of robots.txt) to give Google’s bots and crawler free access.
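To illustrate the change described above: the catch-all disallow is commented out, and an explicit empty Disallow for all user agents is appended. (This is only a sketch of the edit, not the exact generated file.)

```text
# Commented out, because it blocked crawling:
# Disallow: /*

# Added at the end of robots.txt:
User-agent: *
Disallow:
```

An empty Disallow: directive means "nothing is disallowed", so it grants all compliant bots full access.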
In a nutshell, since we are in ecommerce, I wonder what the optimal robots.txt is to boost SEO/Google.
How did that end up in your robots.txt?
I looked at your lines, but thirty bees doesn’t generate it like that. Instead, it generates a Disallow for every route that should not be accessible by a bot. Have you tried temporarily renaming your current robots.txt, regenerating a robots.txt, and comparing? There might be more optimizations possible.
Nice that you found time to check it. I did rename the file and generated a new one on the live host.
Differences are that I had to add, at the end:
Otherwise Google is not happy.
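Presumably (an assumption based on the earlier post in this thread, since the exact lines are not quoted here), the block that had to be appended is the permissive one:

```text
User-agent: *
Disallow:
```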
I also added an entry for my blog, which I am going to bury:
@lesley is that correct?
It is. Actually, to be honest, I would quit using them unless you are specifically trying to hide something. Robots crawl everything unless they are specifically told not to.
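Following that advice, a minimal robots.txt for a shop can rely on the crawl-everything default and only disallow routes that should stay out of the index. The paths and sitemap URL below are hypothetical examples, not thirty bees’ actual generated output:

```text
User-agent: *
# Only block what must stay out of the index;
# everything else is crawled by default.
Disallow: /cart
Disallow: /my-account
Disallow: /order

# Hypothetical sitemap location
Sitemap: https://example.com/sitemap.xml
```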