Description

The robots.txt file tells crawlers which parts of a site they may or may not access. It lives at the root of the domain (e.g. /robots.txt).

Example

Block one folder

User-agent: *
Disallow: /folder/

Block one file

User-agent: *
Disallow: /file.html

Block all

User-agent: *
Disallow: /

Allow all

User-agent: *
Disallow:

An empty Disallow value means nothing is blocked.

Target a specific user agent (Google)

User-agent: Googlebot

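To check how crawlers interpret these rules, Python's standard urllib.robotparser module can parse a robots.txt body. A minimal sketch, using hypothetical example.com URLs and rules mirroring the snippets above:

```python
from urllib.robotparser import RobotFileParser

# Rules mirroring the snippets above: block /folder/ for all
# crawlers, and block everything for Googlebot specifically.
rules = """
User-agent: *
Disallow: /folder/

User-agent: Googlebot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# /folder/ is blocked for every crawler.
print(parser.can_fetch("*", "https://example.com/folder/page.html"))   # False
# Other paths stay accessible to generic crawlers.
print(parser.can_fetch("*", "https://example.com/page.html"))          # True
# Googlebot matches its own group, which blocks the whole site.
print(parser.can_fetch("Googlebot", "https://example.com/page.html"))  # False
```

Note that a crawler uses the most specific User-agent group that matches it, so Googlebot ignores the `*` group entirely here.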
Alex Bieth, Owner @ SEO Berlino and SEO Consultant