thirty bees forum

Pages blocked by robots.txt after updating to 1.5


Derbai

Question

Hello, I updated to TB 1.5, but since then all pages are marked as "Blocked by robots.txt" in Google Search Console, and traffic from Google Search has obviously dropped a lot.

The robots.txt file is correct: almost the same as before. I think something else is blocking indexing, but I can't figure out what it is.

I checked Geolocation: it was disabled, but enabling it with MaxMind changed nothing.

Can someone help with this? My website is https://www.artigianodelcuo.io . Thanks.
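As a first diagnostic step (not from the original post, just a sketch), you can verify locally whether the robots.txt rules actually block Googlebot, independent of what Search Console reports, using Python's standard-library `urllib.robotparser`. Paste your site's robots.txt content into `robots_txt` and check the URLs Google flags:

```python
from urllib.robotparser import RobotFileParser

def is_allowed(robots_txt: str, url: str, agent: str = "Googlebot") -> bool:
    """Return True if `agent` may fetch `url` under the given robots.txt rules.

    `robots_txt` is the raw text of the robots.txt file; this parses it
    locally, so no network access is needed.
    """
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(agent, url)

# Example rules (hypothetical, for illustration only):
rules = "User-agent: *\nDisallow: /admin/\n"
print(is_allowed(rules, "https://example.com/product"))   # allowed
print(is_allowed(rules, "https://example.com/admin/x"))   # blocked
```

If the rules pass this check but Search Console still reports "Blocked by robots.txt", the usual suspects are the server returning a non-200 status for /robots.txt itself, or a redirect serving a different file than the one you edited, so checking the HTTP response for `https://your-site/robots.txt` is a sensible next step.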

[Screenshot attached: Schermata 2024-09-12 alle 16.42.12.png]


5 answers to this question

