Adding a custom robots.txt file in Blogger is another way to improve your blog's SEO. Robots.txt is a text file that tells search engine crawlers how to crawl your blog. The robots.txt file gives instructions to every search engine crawler (spider), telling them which parts of your blog may be accessed by robots and which parts should be blocked from indexing.

How to add custom Robots.txt file in blogger?

What is a Robots.txt file?

Including a custom robots.txt file in your Blogger blog can bring a remarkable change in your blog traffic. Adding a custom robots.txt file is one more step toward making your blog more SEO friendly. The custom robots.txt feature is now available to bloggers; using it, a website owner can write commands telling web crawlers what to crawl and what not to crawl. These commands are written in a simple syntax that is read by web crawlers. Pages that are restricted in the robots.txt file will not be crawled or indexed in search results, so you can stop bots from crawling unnecessary areas of your site. However, all those pages remain publicly viewable to normal human visitors. Every Blogger blog has a robots.txt file that comes by default and looks like the one below. You can check your own blog's robots.txt file by adding /robots.txt after your domain name; you will then see the codes below, which form the default robots.txt of most blog sites.
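To illustrate the point above, the robots.txt location can always be derived from any page URL on the blog, since it lives at the root of the domain. A minimal Python sketch (example.blogspot.com is just a placeholder, not a real blog):

```python
from urllib.parse import urljoin

def robots_txt_url(page_url: str) -> str:
    """robots.txt always lives at the root of the domain."""
    return urljoin(page_url, "/robots.txt")

print(robots_txt_url("https://example.blogspot.com/2020/03/some-post.html"))
# https://example.blogspot.com/robots.txt
```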
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap:
As you can see, the default robots.txt file has a few directives: User-agent, Mediapartners-Google, User-agent: *, Disallow, and Sitemap. If you are not yet familiar with these, here is the explanation. First, you need to know what a user agent is: a software agent, or client software, that acts on your behalf.

Features of robots.txt file in Blogger

Mediapartners-Google: This is the user agent for Google AdSense, which is used to serve more relevant ads on your site based on your blog content. If you disallow it, you will not be able to show ads on the blocked pages.

User-agent: *: You now know what a user agent is, so what is User-agent: *? A user agent marked with an asterisk (*) applies to all crawlers and robots, whether Bing robots, affiliate crawlers, or any other client software. Simply put, it covers all search engine crawlers.

Disallow: By adding a Disallow rule, you tell robots not to crawl and index the matching pages.

Disallow: /search: This disallows your blog's search results by default. You are blocking crawlers from the /search directory that comes right after your domain name, so search pages will not be crawled or indexed. In Blogger, the search option is related to labels; if you aren't using labels wisely per post, you should disallow crawling of search links. This requests that all search engine crawlers exclude search pages.

Allow: /: This specifically allows search engines to crawl the home page.

Sitemap: In robots.txt you can also declare the location of your sitemap file. A sitemap is a file located on the server that contains the permalinks of all your blog posts. The sitemap helps crawlers find and index all your accessible pages, so in the default robots.txt you can see that your blog specifically points crawlers to the sitemap. There is an issue with the default Blogger sitemap, so learn how to create a sitemap in Blogger and notify search engines. Blogger generates sitemap entries through the feed, and by this method only the most recent 25 posts are submitted to search engines. With the robots.txt code above, search engine bots only work with the most recent 25 posts on your blog.
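The effect of these default rules can be checked with Python's standard-library robots.txt parser. This is only a sketch of the default Blogger rules shown above (the sitemap line is omitted because its URL varies per blog, and example.blogspot.com is a placeholder):

```python
import urllib.robotparser

# Blogger's default rules, as described above
DEFAULT_RULES = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(DEFAULT_RULES.splitlines())

# Generic crawlers are blocked from /search pages, but posts are allowed
print(rp.can_fetch("*", "https://example.blogspot.com/search/label/seo"))   # False
print(rp.can_fetch("*", "https://example.blogspot.com/2020/03/post.html"))  # True

# The AdSense crawler (empty Disallow) is allowed everywhere
print(rp.can_fetch("Mediapartners-Google",
                   "https://example.blogspot.com/search/label/seo"))        # True
```

This is also a handy way to sanity-check any custom rules before saving them in Blogger.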

What pages should you disallow in Blogger?

This question is a little tricky, and we cannot predict which pages you should allow or disallow on your blog. You can disallow pages like the privacy policy, terms & conditions, cloaked affiliate links, labels, and search results; it all depends on you. Since you may get reasonable traffic from search results, it is not generally recommended to disallow the labels page, privacy policy page, or TOS page.

How to disallow pages in Blogger using robots.txt?

Using the robots.txt file, you can easily stop search engines from crawling and indexing particular pages or posts, or prevent bots from crawling parts of your Blogger site. There is usually no reason to block search engines from any particular post, but if you wish to, just add Disallow: /year/month/your-post-url.html to your robots.txt file; that is, copy the part of the post URL that comes after your domain name and add it to your robots.txt file. To disallow a particular page, copy the part of the page URL after your domain name and add it like this: Disallow: /p/your-page-name.html.

To allow a page: Allow: /p/contact.html
To disallow a page: Disallow: /p/contact.html
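For instance, assuming a hypothetical contact page and a hypothetical post published in March 2020, the corresponding rules would look like this:

```text
# Block one specific post (hypothetical URL)
Disallow: /2020/03/my-example-post.html

# Block one specific static page (hypothetical URL)
Disallow: /p/contact.html
```

Remember that these rules only hide the pages from crawlers; the pages themselves remain reachable by anyone who has the link.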

How to add a custom robots.txt file in Blogger? 

1. Log into your Blogger blog.
2. Go to the dashboard.
3. Select Settings.
4. Click on Search Preferences.
5. Look for the Custom robots.txt section at the bottom and click Edit.
6. A checkbox will appear; tick Yes, and a box will appear where you can write the robots.txt file. Enter the codes below:
User-agent: Mediapartners-Google
User-agent: *
Disallow: /search?q=*
Disallow: /*?updated-max=*
Allow: /
*Note: If you want search engine bots to crawl your most recent 500 posts, add the sitemap line below (replace yourblogname.blogspot.com with your blog address or custom domain):

Sitemap: https://yourblogname.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500

If you already have more than 500 posts, add one more sitemap line below the code above:

Sitemap: https://yourblogname.blogspot.com/atom.xml?redirect=false&start-index=501&max-results=500

If you have organized your post labels in a good format and have good SEO experience, you can remove the following line:
Disallow: /search
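Putting it all together, assuming a blog with more than 500 posts and with yourblogname.blogspot.com as a hypothetical placeholder for your own address, the complete custom robots.txt would look like this:

```text
User-agent: Mediapartners-Google

User-agent: *
Disallow: /search?q=*
Disallow: /*?updated-max=*
Allow: /

Sitemap: https://yourblogname.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
Sitemap: https://yourblogname.blogspot.com/atom.xml?redirect=false&start-index=501&max-results=500
```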
7. Finally, click Save changes.

To check your robots.txt, just add /robots.txt at the end of your blog URL and you will see your custom robots.txt file. After adding your custom robots.txt file, you can submit your blog to search engines. By now you should have a good understanding of the robots.txt feature in your Blogger blog.

*I recommend using the Google Search Console robots.txt tester to check your robots.txt for any errors or warnings. I hope the tips above help you add a custom robots.txt file in Blogger and improve your blog's SEO. If you have any problems, or know more about how to add a custom robots.txt file in Blogger, please share with us in the comments section below.

Keshan Lge provides guides and tips to entrepreneurs for enhancing and simplifying their online/offline business, blogging, and SEO, and for being your own boss in your own business. Bebizboss was founded in Feb. 2018 by Keshan Liyanagama. I believe that knowledge should be free, so please gain more knowledge through Bebizboss.


Anwar Ahmed · March 16, 2020 at 10:50

After you have gained blogging experience, you can network with other bloggers and writers or apply for lucrative blogging positions.

Wiki Guru · March 27, 2020 at 00:11

Thanks for providing such an informative article. I am also new to blogging, and your blog has really good articles to resolve my problems.

    Keshan Lge · March 28, 2020 at 07:33

    Thank you and good luck for your blogging journey.
