thirty bees forum

Pages blocked by robots.txt after updating to 1.5


Question

Posted

Hello, I updated to TB 1.5, but since then all pages are marked as "Blocked by robots.txt" in Google Search Console, and obviously traffic from Google search has decreased a lot.

The robots.txt file is correct: it is almost the same as before. I think something else is blocking indexing, but I am not able to understand what it is.

I checked Geolocation: it was disabled, but even after enabling it with MaxMind nothing changed.

Can someone help with this? My website is https://www.artigianodelcuo.io . Thanks

(Attached screenshot: Schermata 2024-09-12 alle 16.42.12.png)

5 answers to this question

Recommended Posts

Posted

My guess is that your server is blocking requests from Google's servers (maybe you have a firewall or something that blocks them).

This is not an application issue. You can verify this by checking your server access log.
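
For example, a minimal sketch of that check, assuming a combined-format access log (the log path is a placeholder and depends on your server setup):

```python
# Scan the access log for Googlebot requests. A 403/503 on robots.txt,
# or no Googlebot entries at all, would point at a firewall rather than
# the application. /var/log/apache2/access.log is a placeholder path.
with open("/var/log/apache2/access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        if "Googlebot" in line and "robots.txt" in line:
            print(line.rstrip())
```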

Posted

Hello, I have several websites on the same dedicated server. The others, on TB 1.2, don't have any problems. Only this one was updated to TB 1.5, and the problems started exactly after the update.

Posted

When I tested it yesterday, it indeed displayed the robots.txt error.

I assume that Google used, at that time, an old (cached) version of robots.txt that blocked access.
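
As a quick sanity check, something like the sketch below fetches the robots.txt the site currently serves. This only shows the live file, not what Google has cached, and a firewall may still treat real Googlebot IPs differently than this spoofed User-Agent:

```python
# Fetch the live robots.txt with Googlebot's documented User-Agent string
# and print the status code and body actually served.
import urllib.request

req = urllib.request.Request(
    "https://www.artigianodelcuo.io/robots.txt",
    headers={"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"},
)
with urllib.request.urlopen(req) as resp:
    print(resp.status)
    print(resp.read().decode("utf-8", errors="replace"))
```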

Posted

The domain is served through Cloudflare, and Cloudflare caches static resources for a long time.
You should purge Cloudflare's cache after making changes to the site.
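
Besides the dashboard's "Purge Cache" button, this can be done through Cloudflare's v4 API. A minimal sketch, assuming placeholder credentials (the zone ID and an API token with the Cache Purge permission come from your Cloudflare dashboard):

```python
# Purge robots.txt from Cloudflare's cache via the v4 purge_cache endpoint.
import json
import urllib.request

ZONE_ID = "your-zone-id"      # placeholder
API_TOKEN = "your-api-token"  # placeholder

req = urllib.request.Request(
    f"https://api.cloudflare.com/client/v4/zones/{ZONE_ID}/purge_cache",
    data=json.dumps({"files": ["https://www.artigianodelcuo.io/robots.txt"]}).encode(),
    headers={
        "Authorization": f"Bearer {API_TOKEN}",
        "Content-Type": "application/json",
    },
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print(resp.read().decode())
```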
