What is a Robots.txt File? How Does It Work?

--

A robots.txt file is a way to tell search engine bots, such as Googlebot, which pages they may crawl and which they may not. A website is made up of a number of web pages, and sometimes the site owner does not want some of those pages to be crawled, so they disallow them. Let's see how to disallow or allow these pages.

robots.txt

How does it work?

First of all, we can create this file in any plain-text editor such as Notepad, and it must be saved as robots.txt.

1. In the first case, we will see how to disallow an entire website:

User-agent: *

Disallow: /

2. In this case, we will allow the bots to crawl the whole website (an empty Disallow means nothing is blocked):

User-agent: *

Disallow:

3. Now we can see how to disallow particular pages (a quick way to test these rules follows the example):

User-agent: *

Disallow: /page-admin/

Disallow: /page-database/
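
If you want to double-check how rules like these behave before uploading the file, Python's built-in urllib.robotparser can evaluate them locally. This is a minimal sketch; the paths and site name are simply the examples used in this post.

from urllib.robotparser import RobotFileParser

# The rules from case 3 above, supplied as a list of lines
rules = [
    "User-agent: *",
    "Disallow: /page-admin/",
    "Disallow: /page-database/",
]

rp = RobotFileParser()
rp.parse(rules)

# Disallowed pages return False, everything else returns True
print(rp.can_fetch("*", "https://www.sitename.com/page-admin/"))  # False
print(rp.can_fetch("*", "https://www.sitename.com/contact/"))     # True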

After creating the robots.txt file, we will upload it to the root directory of our site. We can then check the file by visiting:

www.sitename.com/robots.txt
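
Once the file is live at the root of the site, we can also confirm it programmatically. Here is a small sketch using the same urllib.robotparser module; www.sitename.com is just a placeholder for your own domain.

from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://www.sitename.com/robots.txt")  # replace with your own domain
rp.read()  # downloads and parses the live robots.txt

# Ask whether a given crawler may fetch a given URL
print(rp.can_fetch("Googlebot", "https://www.sitename.com/page-admin/"))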

I hope you are getting my points. I have tried my best to explain this topic.

Keep smiling :)

Keep learning :)
