How to Add Custom Robots.txt file in Blogger

Allowing search engines to crawl your blog is essential for getting your posts indexed and ranked. If search engines can't crawl your blog, your posts won't be indexed or updated, your SERP rankings will suffer, and as a result you'll likely lose valuable search traffic.

In this tutorial, we'll walk you through how to edit or add a custom robots.txt file to your Blogger blog.

The robots.txt file determines which bots are allowed to crawl your blog. You can edit it to allow or deny specific bots access, according to your needs.

If your blog is hosted on Google's Blogger platform, you can easily customize your robots.txt file to suit your needs.

Note: If you're unsure about the robots.txt file and its use, it's better not to customize it, since any mistakes will affect how search engines crawl your blog. Proceed at your own risk.

How To Add a Custom Robots.txt File In Blogger

Below is the detailed tutorial on how to add a custom robots.txt file to a Blogger blog.

Also Read :- How to Add Favicon on Blogger Blog

  • First of all, go to your Blogger dashboard and log in with your credentials.
  • Select the blog whose robots.txt file you wish to edit.
  • Open the Settings panel for that blog.
  • From the Settings panel, click on the "Search Preferences" sub-menu.
  • Under "Search Preferences", look for the "Custom robots.txt" option.
  • Click the "Edit" button beside the "Custom robots.txt" option.
  • A text box will appear below the option.
  • Add the code below into the box:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://yourblog.blogspot.com/sitemap.xml

  • Replace "yourblog.blogspot.com" in the Sitemap line with your blog's homepage URL.
  • Hit the save button and you're done!

Understanding The Custom Robots.txt File Code

Each line in the above code has a specific purpose. Here is what each directive means:

  • User-agent: Mediapartners-Google : The default robots.txt file blocks bots other than search engine crawlers. If you use Google AdSense on your blog, the AdSense bot (Mediapartners-Google) needs to crawl your blog for ad optimization. This line allows the AdSense bot to crawl your blog.
  • User-agent: * : This line applies the rules that follow to all search engine bots.
  • Disallow: /search : This prevents search engine bots from crawling the search result and label pages on your blog.
  • Allow: / : This allows search engine bots to crawl the homepage of your blog.
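If you'd like to verify these rules behave as described before saving them, Python's standard urllib.robotparser module can evaluate a robots.txt file locally. A minimal sketch (the blog URL below is a placeholder, not your actual blog):

```python
from urllib.robotparser import RobotFileParser

# The custom robots.txt rules from the tutorial above
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Search/label pages are blocked for regular crawlers...
print(rp.can_fetch("*", "https://example.blogspot.com/search/label/News"))
# ...but ordinary posts remain crawlable
print(rp.can_fetch("*", "https://example.blogspot.com/2024/01/my-post.html"))
# The AdSense bot is allowed everywhere
print(rp.can_fetch("Mediapartners-Google", "https://example.blogspot.com/search/label/News"))
```

Running this prints False for the label page, then True for the post and for the AdSense bot, matching the explanation above.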

How To Prevent Search Engines From Crawling Certain Pages/Posts On Your Blog?

Also Read :- How To Reduce Blogger Blog Loading Time

Suppose you want to prevent certain posts or pages on your blog from being crawled and indexed by search engines. To do this, just add the appropriate line below to your blog's robots.txt file:

  • For posts: "Disallow: /yyyy/mm/post-url.html". Replace "yyyy/mm/post-url.html" with the permalink path of your blog post.
  • For pages: "Disallow: /p/page-url.html". Replace "p/page-url.html" with the path of the page you want to hide.
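The same urllib.robotparser check works here too, confirming that a blocked post or page really is disallowed while everything else stays crawlable. The post and page paths below are made-up examples:

```python
from urllib.robotparser import RobotFileParser

# robots.txt with extra Disallow lines for one specific post and one page
# (the paths here are hypothetical examples)
rules = """\
User-agent: *
Disallow: /search
Disallow: /2024/05/secret-post.html
Disallow: /p/private-page.html
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# The listed post and page are blocked; other posts remain crawlable
print(rp.can_fetch("*", "https://example.blogspot.com/2024/05/secret-post.html"))
print(rp.can_fetch("*", "https://example.blogspot.com/p/private-page.html"))
print(rp.can_fetch("*", "https://example.blogspot.com/2024/05/other-post.html"))
```

Note that robots.txt only asks well-behaved crawlers not to fetch a page; it does not make the page private.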


