Robots.txt generator set for optimal SEO benefits?

  • Hi

I had some trouble with the generated robots.txt and Google’s own robots.txt testing tool, which checks Google’s access against the allow/disallow rules.

    Anyway there are two lines that make me wonder:

    • User-agent: *

    • Disallow: /*

    I thought the first line would allow access for all bots, which seemed SEO-wise nice :)
    I thought the last line was a general disallow, but since it comes at the end I assumed it would be fine, because bots are already allowed at the top.

    In summary, Google was not allowed to crawl my site optimally with these settings. I had to comment out the line (# Disallow: /*) and add

    User-agent: *

    (at the end of robots.txt) to give Google’s bots and crawlers free access.
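    To make the difference concrete, here is a simplified sketch (standard robots.txt semantics, not my full file): the original pair blocks every URL for every bot, because Disallow: /* matches everything, while commenting out the wildcard disallow leaves the group with no Disallow rules, which blocks nothing.

    Blocked everything:

    User-agent: *
    Disallow: /*

    Allowed everything:

    User-agent: *
    # Disallow: /*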

    In a nutshell, since we are in ecommerce, I wonder what the optimal robots.txt is to boost SEO/Google rankings.

  • administrators

    How did that end up in your robots.txt?

    I looked at your lines, but thirty bees doesn’t generate it like that. Instead it generates a Disallow for every route that should not be accessible to a bot. Have you tried temporarily renaming your current robots.txt, regenerating a fresh one and comparing the two? There might be more optimizations possible.
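    To give a rough idea of the style (the paths below are only illustrative examples, not the exact output of the generator), a generated file looks more like a list of specific rules under a single wildcard user-agent:

    User-agent: *
    # private directories (illustrative paths)
    Disallow: /config/
    Disallow: /cache/
    # front-office routes that should not be indexed (illustrative paths)
    Disallow: /*?orderby=
    Disallow: /*?search_query=

    rather than a blanket Disallow: /* that shuts out every crawler.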
