When I started my first blog on Blogger, I did not know much about blogging or SEO, so I turned to the internet to learn. If you are new to blogging, I suggest you do the same and learn as much as you can about it.
In this article, you will learn about an important setting for improving your blog's search ranking: the robots.txt file. We will also see how to add a custom robots.txt file to a Blogger (Blogspot) blog.
What is Robots.txt?
Robots.txt is a plain text file that contains a few instructions in the form of directives. This file is saved in the root directory of a website or blog.
A robots.txt file tells web crawlers (like Googlebot) which parts of your blog or website they may crawl and index in search results.
Note: Web crawlers (also called bots or spiders) are programs used by search engines to collect data from websites on the internet. Example: Googlebot.
This means you can allow or block individual pages, files, and posts from being crawled and indexed by search engines.
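For instance, a rule that blocks one specific page while leaving the rest of the site open might look like this (the path /p/private-page.html is just a hypothetical example, not a Blogger requirement):

```
# Applies to all crawlers
User-agent: *
# Block this one page from being crawled
Disallow: /p/private-page.html
# Everything else remains open
Allow: /
```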
When a web crawler arrives at your blog or website, it first reads the robots.txt file, and then it crawls the site according to the instructions given in that file.
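To see how a crawler interprets these instructions, here is a small sketch using Python's standard urllib.robotparser module. The rules and URLs below are hypothetical examples, not your blog's actual settings:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules, as a crawler would download them
rules = """\
User-agent: *
Disallow: /search
Allow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Blogger-style search/label pages fall under "Disallow: /search"
print(rp.can_fetch("*", "https://example.blogspot.com/search?q=test"))
# Ordinary posts fall under "Allow: /" and may be crawled
print(rp.can_fetch("*", "https://example.blogspot.com/2023/01/my-post.html"))
```

A real crawler follows the same logic: it matches the requested path against the rules for its user agent and skips anything disallowed.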
How to Add a Custom Robots.txt File in Blogger
Now you will learn how to add a robots.txt file to a Blogger blog. Follow the simple steps mentioned below.
1. Go to Your Blogger Blog.
2. Navigate to Settings >> Search Preferences >> Crawlers and indexing >> Custom robots.txt >> Edit >> Yes.
3. Now copy the robots.txt code given below and paste it into the box.
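A widely used custom robots.txt template for Blogger blogs is shown below. Note that example.blogspot.com is a placeholder: replace it with your own blog's address before saving.

```
# Allow Google's AdSense crawler to access everything
User-agent: Mediapartners-Google
Disallow:

# Rules for all other crawlers
User-agent: *
# Block search and label result pages (helps avoid duplicate content)
Disallow: /search
# Allow the homepage and all posts and pages
Allow: /

# Replace with your own blog's sitemap URL
Sitemap: https://example.blogspot.com/sitemap.xml
```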
So, this was our guide to adding robots.txt in Blogger. I hope this article helps you add a custom robots.txt file to your blog. If you have any doubts, feel free to write them down in the comments section.