
Eight Tips to Make the Best Use of Google Webmaster Tools

Google Webmaster Tools is a set of tools that enables webmasters to administer and monitor their sites online.
These tools are totally free, and all they require is a Google account.
Once you log into the webmaster console, you can add all your sites and administer how the Google bot interacts with them.
This is one of the best resources made available by Google and if you are not using it then you are missing a great opportunity.
For those of you who are using this feature, here are a few tips to get more out of Google webmaster tools.
Submit Your Sitemap

A sitemap is a basic XML document that lists all the URLs on your site.
This is a very useful document, and Google will use it as a reference to crawl all your pages.
Therefore, this is the first thing you should do immediately after verifying your site.
If you are not comfortable writing XML, you can use one of the free online tools to generate a sitemap.
A sitemap is particularly useful if your site has a complex navigational structure.
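A minimal sitemap, following the standard sitemaps.org format, might look like this (example.com and the dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2012-01-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

Each `<url>` entry needs only a `<loc>`; `<lastmod>` and `<changefreq>` are optional hints to the crawler.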
Address Canonical Issues

In the webmaster console you can set your URL preference to either the www or non-www format.
This will prevent many problems arising from canonical issues and duplicate content. (More on addressing canonical problems in a later article.)

Check Crawling Stats

On the webmaster console, you can access your crawling stats.
Look for the amount of time the Google bot has spent on your site and the average amount of data downloaded (the lower, the better).
You can also get details about broken links and other http errors.
There is a feature called Fetch as Googlebot, which displays the web page as the Google bot sees it.
This will be helpful to identify if your content is clearly visible to the spider.
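As a very rough local approximation of this idea, you could fetch a page's raw HTML with a crawler-style User-Agent header. This is only a sketch and is not equivalent to Fetch as Googlebot, which runs from Google's own infrastructure; the URL below is a placeholder.

```python
import urllib.request

def fetch_as_bot(url, user_agent="Googlebot/2.1 (+http://www.google.com/bot.html)"):
    """Fetch a page's raw HTML while identifying as a crawler.

    A rough sketch only: it shows the HTML your server returns to a
    bot-like request, not how Google actually renders the page.
    """
    request = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(request, timeout=10) as response:
        return response.read().decode("utf-8", errors="replace")

if __name__ == "__main__":
    # Print the start of the HTML a crawler-style request receives.
    print(fetch_as_bot("https://www.example.com/")[:200])
```

Comparing this raw HTML against what you see in a browser can reveal content that is hidden from spiders, for example text injected only by JavaScript.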
Create a Robots.txt File

A robots.txt file is a simple text file that gives the spider directions on how to crawl your website.
If you do not want Google to crawl certain pages, you can block them using the robots.txt file.
However, if you want the bot to spider all pages, then there is no need for a robots.txt file, although I prefer to have one for the sake of completeness.
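For example, a robots.txt file that blocks one directory while leaving the rest of the site open might look like this (the paths and domain are placeholders):

```text
User-agent: *
Disallow: /private/

Sitemap: https://www.example.com/sitemap.xml
```

The `Sitemap:` line is optional but convenient, since it tells crawlers where to find your sitemap without a separate submission.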
Links to Your Website

The webmaster console also shows the total number of links pointing to your site.
Do not be surprised if you see a very low number: this is not real-time data and should only be treated as a ballpark figure.
Nevertheless, it is useful.
Analyse Keywords

Based on the data obtained by crawling your site, Google displays a set of keywords relevant to your pages.
This is a useful feature and will help you analyse and correct keyword density and placement.
Analyse Search Terms

The Google webmaster console provides another powerful feature called search queries.
This is a summary of all the search queries leading to your website.
It is very useful because it lets you focus on the keywords that are actually bringing traffic to your site.
Check Site Speed

The speed with which your site loads is an important factor when it comes to drawing visitors.
A site that loads slowly is likely to be abandoned by impatient visitors.
It is therefore prudent to test site speed using Google webmaster tools.
Based on the results, you can take remedial actions such as compressing pages or serving them from a cache.
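As a crude client-side sanity check, you can time how long a page's HTML takes to download. This sketch measures only the HTML transfer, not images, scripts, or rendering, so it understates what Google's speed reports capture; the URL is a placeholder.

```python
import time
import urllib.request

def measure_load_time(url):
    """Return the seconds taken to download a page's HTML.

    A rough client-side figure only; it ignores images, scripts,
    and rendering time, which also affect real page speed.
    """
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=10) as response:
        response.read()
    return time.monotonic() - start

if __name__ == "__main__":
    elapsed = measure_load_time("https://www.example.com/")
    print(f"HTML downloaded in {elapsed:.2f} seconds")
```

Running this from a few different locations gives a quick feel for whether slow loads are a server problem or a network one.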
