Google robots and your website sitemap


Behind the scenes, Google’s robots crawl your site on a regular basis. To decide what to crawl, Google looks at a specific file called robots.txt. This file is literally the gatekeeper to your site, as it tells search engines what to do and where to go. Google’s own web crawler is called Googlebot.
When Googlebot hits your website’s robots.txt file, it reads a couple of lines that dictate specific actions:

User-Agent: *
This line says that the rules below apply to every search engine crawler.

Allow: /
This says that the robots are allowed to crawl the site.

Disallow: /wp-admin and Disallow: /wp-includes
These lines say that robots are not allowed to crawl the wp-admin folder or the wp-includes folder, which typically contain core scripts, theme and plugin files.

Sitemap: https://www.example.com/sitemap.xml
This line specifies where your website’s sitemap is located (example.com here is a placeholder for your own domain).
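Putting the directives together, a complete robots.txt for a typical WordPress site might look like this (example.com is a placeholder for your own domain):

User-Agent: *
Allow: /
Disallow: /wp-admin
Disallow: /wp-includes

Sitemap: https://www.example.com/sitemap.xml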

In a nutshell, robots.txt tells search engines where they cannot go, and sitemap.xml tells them where your website’s pages are located. The sitemap itself is an XML file that lists the URLs of your pages.
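For reference, here is a minimal sitemap.xml sketch (the URLs and date are placeholders, not real pages):

&lt;?xml version="1.0" encoding="UTF-8"?&gt;
&lt;urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"&gt;
  &lt;url&gt;
    &lt;loc&gt;https://www.example.com/&lt;/loc&gt;
    &lt;lastmod&gt;2024-01-01&lt;/lastmod&gt;
  &lt;/url&gt;
  &lt;url&gt;
    &lt;loc&gt;https://www.example.com/about/&lt;/loc&gt;
  &lt;/url&gt;
&lt;/urlset&gt;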

The simplest way to create a sitemap on a WordPress website is to install the Yoast SEO plugin, which generates one automatically.

Now here are a couple of tips:

Make sure you do not use Disallow: / on a live site, as this will block EVERYTHING (it can be handy before a website is launched, but must be removed once the site goes live).

Set up your website on Google Search Console (formerly Google Webmaster Tools) to identify crawl errors and ensure your website is optimised and “SEO ready”!

… and of course, if you need help, contact us.
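You can sanity-check rules like these with Python’s standard library, which ships a robots.txt parser. A minimal sketch, using a placeholder domain (example.com); note that Python’s parser applies rules in first-match order, so the Disallow lines are listed before the catch-all Allow (Googlebot’s longest-match rule gives the same result for these paths):

```python
# Sketch: simulate how a crawler reads robots.txt rules.
# The domain and paths below are placeholders, not real URLs.
from urllib.robotparser import RobotFileParser

rules = """\
User-Agent: *
Disallow: /wp-admin
Disallow: /wp-includes
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# A normal page is crawlable...
print(rp.can_fetch("Googlebot", "https://www.example.com/blog/post"))           # True
# ...but anything under /wp-admin is blocked.
print(rp.can_fetch("Googlebot", "https://www.example.com/wp-admin/login.php"))  # False
```

Swapping the rules for a lone Disallow: / and re-running the check is a quick way to see how that one line blocks the entire site.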
