
Webmaster Tools and SEO


Google’s Webmaster Tools is one of the most powerful tools for site SEO and for improving the visibility of your website. It gives us a detailed understanding of how Google is crawling and indexing your site, as well as insight into any issues it has encountered. The Crawl Errors report is one of its most useful features. There are six main categories of errors: Server errors, Soft 404s, Access denied, Not found, Not followed and ‘Other’.

Server Errors

Server errors are typically 500 and 503 response codes caused by internal server problems, and they often occur on rogue URLs. A high number of server errors on your website could lead Google to believe that your site provides a poor user experience, and this may harm your search engine visibility.

Soft 404s

Soft 404s are pages which behave like 404 error pages but do not send the correct 404 response code. For example, a ‘Product Not Found’ page might send the normal 200 response code when it should send a 404. Google’s stance is that these pages “create a poor experience for searchers and search engines” – so it’s important that we look into these.

Access Denied

This report allows you to see which pages Google is being prevented from accessing – either by statements within your robots.txt file or through areas of the site where user log-in is required.
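For reference, rules like the following in a site’s robots.txt are what typically produce entries in the Access Denied report (the paths here are hypothetical examples):

```
# Hypothetical robots.txt: these Disallow rules block Googlebot
# and will show the affected URLs in the Access Denied report.
User-agent: *
Disallow: /admin/
Disallow: /account/
```

Entries in this report are only a problem if pages you *want* indexed appear there; blocked admin or log-in areas are working as intended.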

Not Found

These are effectively 404 errors which are returned when Google attempts to visit a page which no longer exists. While it was previously thought that a large number of 404 errors on your site would harm your rankings, Google has recently clarified its position on this, stating that “Generally, 404 errors don’t impact your site’s ranking in Google, and you can safely ignore them”. These error warnings should therefore be interpreted more as an ‘FYI’ than as something to be actioned.

Not Followed

This category lists URLs on your site which Googlebot was unable to completely follow. There are generally two reasons why this might happen: features such as Flash and JavaScript can make links difficult to crawl, and search engine spiders sometimes have trouble following unconventional redirects, which can result in anomalies such as redirect loops. More information can be found in Google’s official guidelines on not followed errors.
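To see how a redirect loop traps a crawler, here is a minimal sketch. The dictionary stands in for the Location headers a real crawler would receive from your server, and `find_redirect_loop` is a hypothetical helper, not a real Webmaster Tools feature:

```python
# Sketch: detecting a redirect loop by tracking every URL visited
# while following a chain of redirects.
def find_redirect_loop(start: str, redirects: dict[str, str]) -> bool:
    """Follow redirects from `start`; return True if a loop occurs."""
    seen = set()
    url = start
    while url in redirects:
        if url in seen:
            return True  # revisited a URL: redirect loop
        seen.add(url)
        url = redirects[url]
    return False

chain = {"/a": "/b", "/b": "/c", "/c": "/a"}  # /c loops back to /a
print(find_redirect_loop("/a", chain))  # True
```

Real crawlers do essentially this: they give up after revisiting a URL (or after a fixed number of hops) and report the chain as ‘not followed’.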

Other

These include miscellaneous errors such as 400 response codes (Bad Requests). This report is most useful for identifying areas of your site that are generating errors so that you can take action on those rogue URLs.
