Create a custom or tailor-made Robots.txt for your Blog



Learn to create the perfect search engine friendly Robots.txt that’s just right for you.

Creating a robots.txt for your website or blog is probably one of the smartest things you can do to optimize your website's SEO, right next to creating and submitting your XML sitemap to Webmaster Tools, which is another task you should not put off doing.

Today, readers, I would like to help you create your very own robots.txt that you can set and forget, yet it will continue to serve you and your blog for years to come.

What is a Robots.txt?

A robots.txt is a simple text document that tells the search engine bots whether or not they can crawl your website and index its contents. It can also be used to exclude particular search engines or to block certain portions of your website from being indexed if you so choose. In extreme cases you can block these bots altogether, but that would mean no one would find your website in search.
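For example, a robots.txt that keeps one particular crawler out entirely while blocking a single section of the site for everyone else might look like this rough sketch (BadBot-Example and /private/ are made-up placeholders, not real names):

# Keep this one (hypothetical) crawler out of the entire site
User-agent: BadBot-Example
Disallow: /

# Every other crawler may index everything except the /private/ section
User-agent: *
Disallow: /private/

The first block shuts that single bot out completely, while the second lets all other crawlers in but asks them to stay away from the /private/ directory.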

Issues Often Encountered with Robots.txt

The problem is that most bloggers do not know what a robots.txt is, and they would like a hassle-free method of creating one. We have you covered, so read on.

Most only find out about their robots.txt when they are having website errors or receiving errors in the Adsense control panel or Webmaster Tools.

What does my robots.txt look like?

You may view your current or default robots.txt or that of any website by placing this after the domain in the address field:

/robots.txt

So, for example, this is what it would look like in the address bar:

YourWebSite.com/robots.txt

You may look at ours as well; go ahead and give it a try by simply adding /robots.txt to our domain name or clicking here.
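If you would rather check from a small script than from the browser, this minimal Python sketch fetches and prints a site's robots.txt; YourWebSite.com is just a placeholder for whichever site you want to inspect:

from urllib.request import urlopen

# Download and print a site's robots.txt (placeholder domain)
with urlopen("https://YourWebSite.com/robots.txt") as response:
    print(response.read().decode("utf-8"))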

Create Your Own Robots.txt

Each line of a robots.txt basically directs the bots, telling them what they should and should not do or index; to put it simply, the house rules. You can also lay out the welcome mat and tell them where your sitemap is. If you have yet to create an XML sitemap, make sure to do so here, as it is a very good idea for getting your content indexed and for people finding you in search engines. It is also a good idea to get rid of your default robots.txt and load a custom one, such as the examples below.
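Before you grab one of the templates below, here is a brief annotated sketch of what the most common directives mean; the sitemap address shown is only a placeholder:

# Lines starting with # are comments and are ignored by crawlers
User-agent: *          # which crawler the rules below apply to; * means all of them
Disallow: /cgi-bin/    # ask crawlers to stay out of the /cgi-bin/ directory
Disallow:              # an empty Disallow blocks nothing else
Sitemap: https://YourWebSite.com/sitemap.xml   # where crawlers can find your XML sitemap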

 

Example 1: For those seeking a simple, no-hassle robots.txt that requires no editing and allows all search engines to crawl your content for indexing, simply copy the code below and move to the instructions towards the end of the article to easily apply your robots.txt to your blog. (Also note that your content will only be indexed if it is original and provides value to readers.)

 

# robots.txt created at https://www.blogtechtips.com
User-agent: *
Disallow:
Disallow: /cgi-bin/

 

Example 2: A robots.txt that allows all search engines and includes a Sitemap directive to tell search engines where to find your website's XML sitemap. Be sure to place the address of your sitemap on the line that says Sitemap: in the sample below.

# robots.txt created at https://www.blogtechtips.com
User-agent: *
Disallow:
Disallow: /cgi-bin/
Sitemap:

You can learn how to create your XML sitemap and get its address in our tutorial here.
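As a quick illustration, with the sitemap address filled in, Example 2 would end up looking something like this; YourWebSite.com and the sitemap filename are placeholders, so use the address your own sitemap tool gives you:

# robots.txt created at https://www.blogtechtips.com
User-agent: *
Disallow:
Disallow: /cgi-bin/
Sitemap: https://YourWebSite.com/sitemap.xml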


Eliminate Adsense crawler errors that are preventing targeted ads from appearing on your blog.

Example 3: This robots.txt contains the lines that give the Google Adsense crawler full access, so it can display ads that are targeted to your content. It is identified by the Mediapartners-Google entry at the very top of the file. Also remember to edit the example below and include your sitemap URL on the line that says Sitemap:

User-agent: Mediapartners-Google
Disallow:

# robots.txt created at https://www.blogtechtips.com
User-agent: *
Disallow:
Disallow: /cgi-bin/
Sitemap:

Generate Your Own Robots.txt with Custom Exclusions or Settings

If none of the above is doing it for you, or you would like to generate your own and block specific search bots or specific directories from being indexed, then you may want to click below and use this highly recommended generator to easily create a robots.txt tailor-fitted for you. Simply fill out the info, make your selections and click create to make your own.

 

Click here: Robots.txt Generator

How to implement your Robots.txt hassle-free

You could take the long and complicated route of logging into the backend of your website through cPanel, where you have access to your website's files, editing your robots.txt file and placing it in the root directory of your website yourself, or you could use our so-simple-a-baby-could-do-it method:

1. Go to Plugins, search for the Yoast All in One SEO plugin, then install and activate it. If you have the plugin already, you are ahead of the game, as it is the best for SEO optimization.

2. After activation, look for the SEO option in the left panel and select Edit Files.

3. You should see a robots.txt option at the top; click the button to edit or modify the settings.

4. Copy one of our examples or generate your own, paste it into the robots.txt field, and save the changes to the robots.txt; you are finished.

5. Your new robots.txt is now in effect.

 

Testing Your Robots.txt

To test the robots.txt, simply go to Webmaster Tools, select Crawl and then Fetch as Google in the left panel, and attempt to fetch the default URL or a specific page using the settings on that page. If you get a success, that means crawlers are not blocked and are accessing your pages fine. You are now finished.
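If you would like to double-check from your own computer as well, this small Python sketch uses the built-in urllib.robotparser module to read your live robots.txt and report whether a crawler may fetch a page; YourWebSite.com and the sample page path are placeholders for your own addresses:

from urllib.robotparser import RobotFileParser

# Point the parser at the live robots.txt (placeholder domain)
parser = RobotFileParser()
parser.set_url("https://YourWebSite.com/robots.txt")
parser.read()  # downloads and parses the file

# Ask whether a generic crawler ("*") may fetch a given page (placeholder path)
page = "https://YourWebSite.com/sample-post/"
if parser.can_fetch("*", page):
    print("Crawlers are allowed to fetch", page)
else:
    print("Crawlers are blocked from", page)

An "allowed" result here lines up with a successful fetch in Webmaster Tools; if it reports blocked, re-check the Disallow lines you pasted in step 4.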

 

How easy did you find this exercise, or did you fail miserably in implementing and following the instructions? Let us know what your experience has been in the comments below; looking forward to hearing from you.


About Ricardo

I am a technology enthusiast and a blogger who loves nothing more than playing with exciting new gadgets and technology. In my spare time I repair and maintain computers, which is just one of my many hobbies. Please feel free to leave a comment below and subscribe to this blog. Thank you very much for your visit.



4 thoughts on “Create a custom or tailor-made Robots.txt for your Blog”

  • Wayne

    I have a couple of questions about robots.txt:
    1. Are you sure robots.txt is cool where Google Adsense is concerned, knowing that their rules are strict?
    2. Can robots be used in free Blogger?
    3. Is there a Blogger SEO tool like the WordPress SEO plugin you have?

    • Ricardo Gardener Post author

      There is a difference between robots.txt and robots. Every website has a robots.txt file; it tells the Google crawlers that index your content what to index, what you want left out, and where to find important files such as your sitemap. I do not condone the use of robots in any way, so to be absolutely clear, I am talking about robots.txt; the “.txt” makes all the difference. You can read up on it some more on Google and you will realize that, as part of good webmaster practice, Google recommends that you set one up.
      As for Blogger, the procedure is a little different. I can only guide you on the WordPress procedure that I have attempted myself, but the instructions can easily be found, Wayne.

      • Wayne

        Remember, I am new to the webmaster fraternity, so I really didn’t know that there was such a thing as robots.txt and robots. I never knew there were two of them, and I would definitely go the legal way.

        I will do my research on Blogger and see what it has to offer. I just need to know the basics of web traffic where Blogger is concerned.

        • Ricardo Gardener Post author

          I understand, we were all new at some point, so don’t sweat it, and remember there are no shortcuts in blogging. Don’t follow the hype, grow at your own pace, and never be daunted by those claiming big numbers; this is a marathon and not a sprint, as some might believe.