Blocked By Robots.txt: How To Resolve Blocked By Robots.txt Error?


The error message “Blocked by robots.txt” means that Googlebot cannot crawl a page on a website because a Disallow directive in the robots.txt file prevents it. Google won’t crawl content blocked by robots.txt, so it can’t read the page itself. However, a page that is disallowed in robots.txt can still end up indexed (without its content) if other sites link to it.
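For example, a robots.txt file like the following would trigger this error for any URL under /private/. This is a hypothetical sketch; the domain and paths are placeholders:

```txt
# robots.txt served at https://www.example.com/robots.txt
User-agent: *        # these rules apply to all crawlers, including Googlebot
Disallow: /private/  # any URL whose path starts with /private/ is blocked from crawling
Allow: /             # everything else may be crawled
```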

How to Resolve the “Blocked by robots.txt” Error?

To fix this issue, you can follow these steps:

  1. Identify the Problem:

Determine which parts of your website are being blocked by robots.txt and which search engines are reporting the error. You can use tools like Google Search Console to identify specific URLs that are blocked.
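This check can also be automated with Python’s standard-library urllib.robotparser, which applies the same matching rules a well-behaved crawler would. A minimal sketch; the domain, paths, and rules below are hypothetical placeholders:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that blocks /private/ for all crawlers.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# can_fetch() reports whether the named crawler may fetch a given URL.
for path in ("/blog/post-1", "/private/draft"):
    allowed = parser.can_fetch("Googlebot", f"https://www.example.com{path}")
    print(f"{path}: {'allowed' if allowed else 'BLOCKED'}")
```

To test against a live site instead of an inline string, you can call `parser.set_url("https://www.example.com/robots.txt")` followed by `parser.read()`.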

  2. Review Your Robots.txt File:
    • Access your website’s robots.txt file by visiting yourdomain.com/robots.txt in your browser (replacing yourdomain.com with your own domain).
    • Review the contents of the file to understand which directories or pages are disallowed for crawlers.
  3. Make Necessary Changes:
    • If you want to grant access to a previously blocked section of your site, modify your robots.txt file to remove the Disallow rules for that section.
    • If you want to continue blocking access to certain parts, make sure the rules in your robots.txt are correctly formatted.
  4. Test Your Robots.txt File:
    • Use the robots.txt report in Google Search Console (the successor to the older robots.txt Tester) to check that your updated robots.txt file is valid and doesn’t block essential content unintentionally.
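As a sketch of the change in step 3, suppose your blog section was blocked unintentionally; removing the offending Disallow line restores crawler access while keeping the rules you still want (the paths here are illustrative):

```txt
# Before: /blog/ is unintentionally blocked
User-agent: *
Disallow: /blog/
Disallow: /admin/

# After: only /admin/ remains blocked
User-agent: *
Disallow: /admin/
```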

  5. Check Your XML Sitemaps:
    • Blocking access to your XML sitemaps can prevent search engines from discovering and indexing new content efficiently.
    • If you’ve made changes to your robots.txt file, it’s good practice to update and resubmit your XML sitemap in Google Search Console or other search engine webmaster tools to ensure that the changes are recognized.
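One way to help crawlers find your sitemap is to reference it directly in robots.txt with a Sitemap directive, which takes a full URL and may appear anywhere in the file. A minimal sketch, assuming a sitemap at the site root:

```txt
User-agent: *
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
```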
  6. Wait for Re-crawling:
    • After making changes to your robots.txt file and submitting the updated sitemap, you’ll need to wait for search engines to recrawl your site. This may take some time.


Please note that fixing a “Blocked by robots.txt” error doesn’t involve code changes unless the issue is caused by incorrect robots.txt syntax, which can be resolved by editing the robots.txt file’s content. Also, sometimes pages are not indexed on Google because they are excluded by a noindex tag; to fix that issue, see “How to Fix Excluded By Noindex Tag in GSC“.

Frequently Asked Questions

Q: How often should I check my robots.txt file? 

A: It’s advisable to review your robots.txt file whenever you make significant changes to your website or on a regular basis to ensure it aligns with your SEO strategy.

Q: Can a misconfigured robots.txt negatively impact SEO? 

A: Yes, a misconfigured robots.txt file can block important content from search engines, leading to decreased visibility and rankings.

Q: What is the role of User-agent in robots.txt? 

A: User-agent specifies which search engines or bots the directives in the robots.txt file apply to.

Q: Is it possible to allow a specific bot while disallowing others?

 A: Yes, you can customize directives for specific bots using User-agent commands.
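For instance, the following sketch allows Googlebot everywhere while disallowing all other crawlers; the crawler name and rules are illustrative, and each User-agent group applies only to the bots it names:

```txt
# Googlebot may crawl everything
User-agent: Googlebot
Allow: /

# All other crawlers are blocked site-wide
User-agent: *
Disallow: /
```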

Q: Are there any SEO benefits to using robots.txt? 

A: Properly configuring robots.txt can improve SEO by directing search engines to relevant content and preventing them from crawling unimportant sections of your site.

Q: Can I use robots.txt to block spam bots? 

A: You can add robots.txt rules targeting known bots, but compliance is voluntary: reputable crawlers honor the file, while spam bots often ignore it. For reliable protection, combine robots.txt with server-level measures such as firewall rules or user-agent filtering.


Understanding what is blocked by robots.txt and how to fix it is crucial for maintaining a strong online presence. By following the step-by-step guide provided in this article and regularly reviewing and optimizing your robots.txt file, you can ensure that search engines crawl and index your website effectively. Don’t let misconfigurations hinder your SEO efforts; take control of your robots.txt and enhance your website’s visibility.


© 2024 3wBiz | All Rights Reserved | Privacy Policy