How to Create a Robots.txt File
I don’t like to get too techie in my posts, but a lot of people ask “what is a robots.txt file?” or want to know how to create one.
First, don’t worry if you don’t have a robots.txt file on your web server – it is not mandatory, and if you don’t want to prevent any pages from being indexed by the search engines, you don’t actually need one.
A robots.txt file is simply a text file created with Notepad or any other plain-text editor. You use this file to list pages or directories on your website that you do not want the search engine spiders to crawl (i.e. pages that you do not want to be indexed in the search engines).
The reason could be that you want to keep your download pages from showing up in the search engine results pages (SERPs), or that you don’t want duplicate content on your site to be indexed.
Even if you don’t have any web pages that you do not want to show up in the SERPS you can still upload a robots.txt file to your site to let the search engines know that you’re playing by the rules!
Here is how to create a robots.txt file. Despite the technical-sounding name, these files are actually very simple to make!
First, simply open Notepad or your favorite text editor. The lines you will enter in this text file are as follows:
User-Agent: [Spider or Bot name]
Disallow: [Directory or file name]
It’s pretty much self-explanatory; you tell a given spider (e.g. Google, MSN or Yahoo) not to crawl the files or directories listed after “Disallow”. That’s all there is to it. Easy, right?
Let me give you a few examples so you know exactly how this should look when you create your robots.txt file.
Let’s say you DO NOT want the Google bots (spiders) to crawl a particular directory on your site. This is how it would look:
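For instance, assuming the directory you want to block is called /downloads/ (that name is just a placeholder – use your own directory), the file would contain:

User-Agent: Googlebot
Disallow: /downloads/

“Googlebot” is the name Google’s main crawler identifies itself with, so this rule applies only to Google and leaves other search engines free to crawl the directory.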
If you want to allow the search engines to index everything on your site and just want to have a robots.txt file in place for good measure, this is how it would look:
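This “allow everything” version uses the asterisk, which matches any spider, and an empty Disallow line, which blocks nothing:

User-Agent: *
Disallow:

Just save this as robots.txt and upload it to the root of your site, and every search engine spider is welcome to crawl all of your pages.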
Hopefully this will make it clearer to you how to create a robots.txt file!