Re: Google Webmaster Tools

Google Webmaster Tools' parameter handling feature launched in 2009. It helped many site owners by letting them specify which URL parameters on their sites Google could ignore. A year later, Google added the option to set a default value for a parameter. After seeing the positive impact of these tools, Google improved the feature again by allowing site owners to specify how a parameter changes the content of a page.

Site owners can now specify whether or not a parameter changes the content of a page. Things get more complicated when a parameter does change the content, and several options are available for that case. This feature affects two things: which URLs on the site get crawled and how the parameters are handled. To access it, log into your Google Webmaster Tools account, click on the site you want to configure, and select Site configuration > URL parameters.
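
As a concrete illustration of what "a parameter changes the content" means in practice, here is a quick Python sketch. The URLs and parameter names below are invented for the example:

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical product-listing URLs; the parameter names are illustrative only.
urls = [
    "http://example.com/shoes",                   # canonical page
    "http://example.com/shoes?sort=price",        # 'sort' reorders the content
    "http://example.com/shoes?sessionid=abc123",  # 'sessionid' changes nothing visible
]

for url in urls:
    params = parse_qs(urlparse(url).query)
    print(url, "->", params or "no parameters")
```

In the URL parameters tool, a parameter like the hypothetical sessionid could be marked as not changing page content, while one like sort would need the more detailed options.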

When Google detects duplicate content, such as variations caused only by URL parameters, it groups the duplicate URLs into one cluster. It then chooses the URL it considers best to represent the cluster in search results, and consolidates the properties of the URLs in the cluster, such as link popularity, onto that representative URL. Consolidating these signals from duplicates into one representative URL helps return the most relevant results to searchers. To improve this process, use the parameter handling tool to tell Google how to handle URLs containing particular parameters.
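
The clustering itself happens inside Google, but the idea can be sketched in a few lines. The following is a simplified, hypothetical model, not Google's actual algorithm: duplicates are grouped by the URL that remains after stripping parameters declared irrelevant, and each cluster's signals are combined onto one representative.

```python
from urllib.parse import urlparse, parse_qsl, urlencode
from collections import defaultdict

# Parameters the site owner has declared as not changing page content
# (hypothetical names, for illustration only).
IGNORED_PARAMS = {"sessionid", "utm_source"}

def cluster_key(url):
    """Normalize a URL by dropping the ignored parameters."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS]
    return parts._replace(query=urlencode(kept)).geturl()

# Toy data: (url, link popularity score)
crawled = [
    ("http://example.com/shoes?sessionid=a1", 3),
    ("http://example.com/shoes?sessionid=b2", 5),
    ("http://example.com/shoes?sort=price", 2),
]

clusters = defaultdict(list)
for url, score in crawled:
    clusters[cluster_key(url)].append((url, score))

# One representative per cluster, carrying the cluster's combined signals.
for key, members in clusters.items():
    representative = min(url for url, _ in members)  # toy tie-breaker
    combined = sum(score for _, score in members)
    print(representative, "- combined popularity:", combined)
```

Here the two sessionid variants collapse into one cluster whose popularity scores are summed, while the sort URL stays separate because that parameter genuinely changes the page.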

To take control of this process, click the Edit option to specify whether a parameter changes the page content. You can also view a sample of recently crawled URLs containing that parameter, which helps in figuring out what it is used for. Only use this feature if you are confident about how parameters work on your site: telling Googlebot to exclude URLs with certain parameters can make large numbers of pages disappear from the Google index. Configured correctly, though, crawling becomes more efficient, so more pages of the site get crawled and indexed. Google will not instantly stop crawling the excluded pages, since it wants to confirm that the content really is similar before consolidating the URLs into a single version. And unless you are experienced with URL parameters, you may prefer to let Google sort this out rather than change these settings; it does a rather good job, especially when URLs use standard key-value pairs.
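
The "standard key-value pairs" mentioned above are query strings of the form ?key=value&key2=value2, which any URL library can parse mechanically. A small sketch (the query strings are invented for illustration):

```python
from urllib.parse import parse_qsl

# A standard key-value query string parses cleanly:
print(parse_qsl("color=red&size=10"))  # [('color', 'red'), ('size', '10')]

# A non-standard encoding (e.g. positional tokens) yields nothing usable:
print(parse_qsl("red;10"))             # [] -- no recognizable pairs
```

When a site encodes parameters in a non-standard way like the second example, Google has less to work with automatically, which is when manual configuration in the tool becomes more valuable.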

If you know the ins and outs of this newly introduced URL parameters feature and know exactly what you want Google to crawl, then go for it.
