SEO Tutorial

What is a Robots.txt File in SEO? How to Create a Robots.txt File?

What is a Robots.txt file in SEO?

The role of a robots.txt file is to tell crawlers and search engines which pages of a website they may crawl and which pages they should avoid.

To check the robots.txt file of any website, add /robots.txt after the domain name.

For example: https://www.example.com/robots.txt


How to create robots.txt file?

1. Open a plain text editor like Notepad.

2. In the first line, type User-agent: *

This tells crawlers which of them the rules that follow apply to. The asterisk (*) is a wildcard, so the rules apply to all crawlers.

3. In the second line, type Disallow:

If you don’t type anything after Disallow:, you are telling crawlers they may crawl the entire site. In that case, the line is optional and you can omit it.

4. If there is anything you want crawlers to ignore, add a Disallow: rule with the path to block, starting with /. For example, if you don’t want your images to be crawled, the rule would be: Disallow: /images/

5. You can also point crawlers to your sitemap by adding a Sitemap: line followed by its full URL, for example: Sitemap: https://www.example.com/sitemap.xml

In the end, the robots.txt file will look like this-

User-agent: *

Disallow: /images/
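If you want to verify that the rules above behave as intended, Python’s standard-library `urllib.robotparser` can parse a robots.txt file and answer "may this URL be crawled?" queries. A minimal sketch (the domain and page paths are placeholders for illustration):

```python
from urllib.robotparser import RobotFileParser

# The robots.txt rules from the steps above, parsed directly from
# their lines (no network request needed).
rules = [
    "User-agent: *",
    "Disallow: /images/",
]

rp = RobotFileParser()
rp.parse(rules)

# Pages outside /images/ may be crawled; anything under it may not.
print(rp.can_fetch("*", "https://www.example.com/about.html"))    # True
print(rp.can_fetch("*", "https://www.example.com/images/a.png"))  # False
```

This is a quick way to test a robots.txt file locally before uploading it to the root of your site.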
