Google Webmaster Tools / Search Console

What are the Google Webmaster Tools?

The Search Console, or "Webmaster Tools", is a free service provided by Google for monitoring and analyzing your own websites. In 2015, Google renamed the service from "Webmaster Tools" to "Search Console". However, "Webmaster Tools" is still used as a synonym and will be used as such in the following.

The Search Console makes it possible to check a website for errors in the code, the sitemap, internal links, or the URL structure. In addition, pages can be marked up with structured data, and the tool displays the links and search queries through which users reach the site, the crawl behavior of the Googlebot, the indexing status, and much more. The Search Console is also used to communicate with Google.

The Google Webmaster Tools are often referred to as "Google Analytics Light" because they offer some functions in rudimentary form that Google Analytics covers in much greater depth. In addition, the data in the Webmaster Tools is often a few days out of date. Nevertheless, the Webmaster Tools are of great value for search engine optimization.

This article discusses Google's Webmaster Tools. Other search engines, such as Bing or Yandex, offer comparable services.

Activate / set up Search Console

To use the Search Console, you must first register with any Google service (Google Plus, Gmail, Google My Business, etc.) and then confirm ownership of the website that is to be monitored. To do this, a simple HTML file is uploaded to the web server.

Alternative verification methods, such as an HTML tag, linking with Google Analytics, or the Google Tag Manager, are also possible.
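As an illustration, the HTML-tag method places a verification meta tag in the head of the homepage. The token below is a placeholder; Google generates the real value for each property:

```html
<!-- Verification via HTML tag, placed inside <head> of the homepage.
     The content value is a placeholder token. -->
<meta name="google-site-verification" content="YOUR-VERIFICATION-TOKEN" />
```

Once Google's crawler finds this tag (or the uploaded HTML file), the property is confirmed.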

Every website to be monitored is a "property". A total of 100 properties can be created in the Search Console. Properties do not have to be entire websites, but can also represent subdirectories of a website.


Dashboard

The dashboard gives an overview of four areas:

New and important: displays recent errors or messages from Google. A message can be a manual penalty (more on this under "Disavow Tool"), the confirmation of a URL move, or a change of address.

Crawling errors indicate whether a page could not be found under the URL known to Google (404 error).

The search analysis gives a good insight into the number of visitors over time and into which Google search terms led to a visit to the page.

The Sitemaps section shows how many URLs were submitted in the sitemap and how many of them were indexed, as well as any errors in the sitemap.

Clicking on the respective category in the dashboard opens the full view of that report.

Search Appearance

Under Search Appearance you can review structured data and the Data Highlighter, HTML improvements, and sitelinks.

The Structured Data section indicates whether structured data markup (schema.org) is correctly implemented on the website and can be read by Google. With the Data Highlighter, elements on the website can be tagged for Google according to schema.org types without touching the source code. Markings are made under "Start marking".
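A minimal sketch of such markup, here as JSON-LD with placeholder values (schema.org markup can also be written as microdata or RDFa):

```html
<!-- schema.org markup as JSON-LD; name and URL are placeholders -->
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Organization",
  "name": "Example Company",
  "url": "https://www.example.com/"
}
</script>
```

The Structured Data report then shows whether Google was able to parse this markup.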

After successfully activating the Search Console and waiting for about 24 hours, the HTML Improvements section shows errors in titles and meta descriptions as well as any non-indexable content.

Sitelinks are additional links shown in the SERPs below the actual result.

Google determines which links appear as sitelinks. If a link is not desired as a sitelink, it can be "demoted" under this section; Google will then no longer display it as a sitelink. However, there is no guarantee of this.

Search queries

Under Search Queries you will find analyses of user behavior, links to the page, and other useful reports.

The search analysis provides insight into how many Google search queries led to a visit to the website and which search terms were used. In addition, various filters can be set, e.g. by device (smartphones), country, or sub-page.

The "Search Analysis" function is currently (July 6, 2015) still in beta. As soon as there is a final version, it will be presented here. You can find more information on the search analysis in our blog: Webmaster Tools with a new function - search analysis.
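The same figures can also be pulled programmatically. A minimal sketch, assuming access to the Search Console (Webmasters) API: the hypothetical helper below only assembles the request body for a searchanalytics.query call; authentication and the API client itself are omitted.

```python
def build_search_analytics_query(start_date, end_date,
                                 dimensions=None, row_limit=1000):
    """Assemble a request body for a Search Analytics API query.

    Dates are strings in "YYYY-MM-DD" format; dimensions is an
    optional list such as ["query", "device", "country"].
    """
    body = {
        "startDate": start_date,
        "endDate": end_date,
        "rowLimit": row_limit,
    }
    if dimensions:
        body["dimensions"] = dimensions
    return body

# Example: queries and devices for June 2015
body = build_search_analytics_query("2015-06-01", "2015-06-30",
                                    dimensions=["query", "device"])
```

The resulting dictionary is what would be passed as the request body; the filters mentioned above (device, country, page) correspond to the dimensions list.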

Under Links to Your Website, as the name suggests, backlinks to the website are listed. The number of links is not always complete; Google reserves the right to display only some of the links, especially for sites with a large number of backlinks. Nevertheless, the view offers a good overview.

Under Internal Links, as the name suggests, all internal links on the website are displayed.

If the website has received a manual penalty, the corresponding message is displayed under Manual Actions, along with possible responses and solutions (see also: What is the disavow tool).

If a website uses the hreflang tag for an international orientation, errors or inconsistencies are displayed under this option.
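For illustration, hreflang annotations in the head of a page point Google to its language versions (the example.com URLs are placeholders); each version should also reference itself:

```html
<!-- hreflang annotations; the URLs are placeholders -->
<link rel="alternate" hreflang="de" href="https://www.example.com/de/" />
<link rel="alternate" hreflang="en" href="https://www.example.com/en/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```

Missing return links between language versions are a typical inconsistency reported here.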

Since every website should run smoothly on smartphones and similar mobile devices, the Mobile Usability section lets you examine the page for mobile problems.

Google Index

Under the heading Google Index, options relating to how the page is indexed by Google are displayed.

Under Index Status, the number of indexed pages and any URLs blocked by the robots.txt are displayed.

Content keywords shows the frequency of certain keywords within the domain.

Blocked Resources shows resources that cannot be read by the crawler. Usually these are login URLs. CSS or JavaScript files should not be blocked!

If you want to remove certain URLs of your website from the Google index, you can do so under Remove URLs.


Crawling

The Crawling category offers the webmaster the opportunity to examine the Google crawler more closely and, if necessary, to influence its behavior.

Under Crawl Errors, URLs that returned an error, usually a 404 status code, are listed.
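To get a feeling for how such errors are grouped, here is a small sketch (a hypothetical helper, not part of the Search Console itself) that sorts HTTP status codes into rough report categories:

```python
def classify_status(code: int) -> str:
    """Group an HTTP status code roughly the way a crawl report does."""
    if 200 <= code < 300:
        return "ok"
    if 300 <= code < 400:
        return "redirect"
    if code == 404:
        return "not found"
    if 400 <= code < 500:
        return "client error"
    if 500 <= code < 600:
        return "server error"
    return "unknown"

# Example: the most common crawl error
print(classify_status(404))  # not found
```

A 404 simply means the URL Google knows about no longer resolves to a page; server errors (5xx), by contrast, point to problems on the web server itself.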

The Crawl statistics show how often the Google crawler was on the website.

Fetch as Google offers the possibility to see the website through the eyes of the crawler, in order to detect possible discrepancies between what users see and what is indexed. If a URL is entered here, it can also be submitted to the index.

This should definitely be done when pages have been changed. This ensures that Google always has the latest version of the page in the index.

The robots.txt tester shows any errors in the robots.txt.
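A minimal robots.txt of the kind the tester checks, with a placeholder path; note that CSS and JavaScript are deliberately not blocked, as advised above:

```text
User-agent: *
Disallow: /login/

Sitemap: https://www.example.com/sitemap.xml
```

The tester flags syntax errors and lets you check whether a given URL is blocked by these rules.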

Under Sitemaps, there is the possibility to submit your own sitemap directly to Google. This is not always necessary, but it is particularly worthwhile for larger sites!
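A minimal XML sitemap, with a placeholder URL and date, as it could be submitted here:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2015-07-06</lastmod>
  </url>
</urlset>
```

The Sitemaps report then compares the number of submitted URLs against the number actually indexed.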

Under URL Parameters, you can control how the crawler should handle duplicate content caused by URL parameters. This option should only be used by experienced webmasters. Further information on URL parameters can be found at: https://www.search-one.de/parameter-gwt/

Under Security Issues, possible hacks, Trojans, or viruses on a website are identified. Other Resources contains links to Google training programs, the Google domain service, and Google Merchants.

Worth mentioning is the Structured Data Testing Tool, with which you can examine the schema.org markup on a website in more detail: https://developers.google.com/structured-data/testing-tool/