How To Add Correct Robots.txt in Blogger Website

Hello Everyone, today in this post I am going to tell you how to add the correct robots.txt in a Blogger website. If you want your website to rank on Google fairly quickly, then you must configure robots.txt correctly in Blogger.

Robots.txt is a special file that tells search engine crawlers which pages of your site should be crawled and which should not. For Blogger users, you can easily enable the custom robots.txt feature via the Settings page.
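If you are curious about what your blog is serving right now, the file always lives at the root of your domain, for example https://www.theforyou.online/robots.txt. Here is a small, optional Python sketch that fetches and prints it; the domain is just the example one used in this post, so swap in your own blog's address:

import urllib.request

# robots.txt is always served from the root of the domain.
# Replace this with your own blog's address.
ROBOTS_URL = "https://www.theforyou.online/robots.txt"

with urllib.request.urlopen(ROBOTS_URL, timeout=10) as response:
    print(response.read().decode("utf-8"))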

The question that comes up most often is: how should this robots.txt file actually be filled in? What is the correct format?

Common Format of Robots.txt :-

If you search on Google with the keyword "How to set Blogger's robots.txt", you will find that most of the suggested formats look like this:
User-agent: Mediapartners-Google
Disallow: /search
User-agent: *
Disallow: /search
Allow: /
Sitemap: https://www.theforyou.online/sitemap.xml

If you are using the robots.txt format above, that is fine, as long as you understand how crawlers read it. It can be described like this (see the Python sketch after this list for a concrete check):

  • The Mediapartners-Google crawler does not need to explore pages whose URL contains the /search path.
  • Other crawlers also do not need to explore pages whose URL contains the /search path, but every other page under / is crawlable.
  • Here is my sitemap URL if needed: https://www.theforyou.online/sitemap.xml
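If you want to see exactly how a crawler interprets those rules, here is a minimal sketch using Python's built-in urllib.robotparser. The post URL is only a placeholder for a normal Blogger post:

from urllib import robotparser

# The "common" format above, pasted as text so it can be tested locally.
COMMON_ROBOTS = """\
User-agent: Mediapartners-Google
Disallow: /search
User-agent: *
Disallow: /search
Allow: /
Sitemap: https://www.theforyou.online/sitemap.xml
"""

parser = robotparser.RobotFileParser()
parser.parse(COMMON_ROBOTS.splitlines())

# A Blogger label/search page: blocked for the AdSense crawler and for everyone else.
print(parser.can_fetch("Mediapartners-Google", "https://www.theforyou.online/search/label/SEO"))  # False
print(parser.can_fetch("Googlebot", "https://www.theforyou.online/search/label/SEO"))             # False

# A normal post URL (placeholder): allowed for every crawler.
print(parser.can_fetch("Googlebot", "https://www.theforyou.online/2024/01/sample-post.html"))     # True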

And many people claim that a robots.txt format like that is the most SEO-friendly one. I don't think so! There are a few things about it that are actually odd and, in my opinion, can be trimmed down. Some of the odd things include:

  • Why should Mediapartners-Google be treated separately from the other user-agents when the rules given to it are exactly the same?
  • Why is there an Allow: / line when it is not needed? Logically, once you have used a Disallow command, you are indirectly telling the user-agent that every page other than the disallowed ones may be visited.

Updated Version of Robots.txt :-

Because of that concern, I would like to recommend a robots.txt format that I think is simpler and more effective to install on Blogger blogs. The format looks like this:

User-agent: Mediapartners-Google
Disallow: /search
User-agent: *
Allow: /
Sitemap: https://www.theforyou.online/sitemap.xml
Sitemap: https://www.theforyou.online/sitemap-pages.xml

With this format, the behaviour can be described simply like this (see the sketch after this list):

  • The Mediapartners-Google (AdSense) crawler does not need to explore pages whose URL contains the /search path; every other crawler may crawl all pages.
  • Here is my sitemap URL for posts if needed: https://www.theforyou.online/sitemap.xml
  • Here is the sitemap URL for pages: https://www.theforyou.online/sitemap-pages.xml
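You can confirm that reading with the same urllib.robotparser approach as before; again, the post URL is just a placeholder:

from urllib import robotparser

# The updated, simpler format recommended above.
UPDATED_ROBOTS = """\
User-agent: Mediapartners-Google
Disallow: /search
User-agent: *
Allow: /
Sitemap: https://www.theforyou.online/sitemap.xml
Sitemap: https://www.theforyou.online/sitemap-pages.xml
"""

parser = robotparser.RobotFileParser()
parser.parse(UPDATED_ROBOTS.splitlines())

# The AdSense crawler is kept away from /search pages...
print(parser.can_fetch("Mediapartners-Google", "https://www.theforyou.online/search/label/SEO"))          # False
# ...but it can still fetch normal posts, and other crawlers can fetch everything.
print(parser.can_fetch("Mediapartners-Google", "https://www.theforyou.online/2024/01/sample-post.html"))  # True
print(parser.can_fetch("Googlebot", "https://www.theforyou.online/2024/01/sample-post.html"))             # True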
If in the future you have some special pages that you do not want crawlers to touch, let alone index, you can add a new Disallow command below the existing one, like this:
User-agent: Mediapartners-Google
Disallow: /search
Disallow: /p/error.html
User-agent: *
Allow: /
Sitemap: https://www.theforyou.online/sitemap.xml
Sitemap: https://www.theforyou.online/sitemap-pages.xml
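As a quick sanity check (the /p/error.html path is just the example page used above), you can verify that the new rule is picked up for the group it was added to:

from urllib import robotparser

# The final version of the file, with the extra Disallow line.
FINAL_ROBOTS = """\
User-agent: Mediapartners-Google
Disallow: /search
Disallow: /p/error.html
User-agent: *
Allow: /
Sitemap: https://www.theforyou.online/sitemap.xml
Sitemap: https://www.theforyou.online/sitemap-pages.xml
"""

parser = robotparser.RobotFileParser()
parser.parse(FINAL_ROBOTS.splitlines())

# The new rule blocks the special page for the Mediapartners-Google group.
print(parser.can_fetch("Mediapartners-Google", "https://www.theforyou.online/p/error.html"))  # False

Keep in mind that a Disallow line only applies to the user-agent group it sits under, so if you want every crawler blocked from that page, repeat the line under User-agent: * as well.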

One message that I want to convey, and that you may need to get used to in web development, is that more complex does not always mean better. Sometimes simple really is better.

Last Words :-

Maybe that's all I can say about the correct way of setting robots.txt in a Blogger website. If anyone does not agree with my opinion, we can discuss it in the comments section below. Hopefully my writing is useful.
