Knowing how to use Google Webmaster Tools (a great set of tools from Google) is essential for analyzing and maintaining your website effectively.
Verification: Before you get started, you need to tell Google which sites you want included in your account. Enter the URL as prompted, then verify ownership of the site either by adding the meta tag Google provides to your home page or by uploading the provided HTML file to your server.
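For the meta-tag method, Google generates a unique token for your site; the tag goes inside the `<head>` of your home page and looks something like this (the content value below is only a placeholder):

```html
<head>
  <!-- Placeholder token: Google generates a unique value for each verified site -->
  <meta name="google-site-verification" content="your-unique-token-here" />
</head>
```

Once the tag (or the uploaded HTML file) is in place, click "Verify" in Webmaster Tools and Google will check for it.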
An overview of the stats available in Google Webmaster Tools:
Your Site on the Web: This section has the following data:
• Search queries: How people are getting to your site from a Google Search
• Links to your site
• Keywords
• Internal Links
• Subscriber Stats
Diagnostics: The diagnostic tools tell you about any errors Google has encountered while crawling your site. They report on the following error types:
• Malware
• Crawl Errors
• Crawl Stats
• HTML Suggestions
• URLs timed out
• Unreachable URLs
Links: The link reports in Webmaster Tools are limited, but do provide you with ways to measure internal and external link popularity.
Google Sitemaps: Sitemaps are the feature the entire Webmaster Tools suite was originally built around. Here you can submit and manage XML-based sitemap files that catalog all of the pages on your site.
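A minimal sitemap file following the sitemaps.org protocol looks like this (the URL and date are placeholders for your own pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2010-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Add one `<url>` entry per page, save the file (commonly as sitemap.xml in your site root), and submit its URL in the Sitemaps section.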
Analyze robots.txt: Robots.txt is where Googlebot and other spiders go when they land on your site to immediately find instructions on what they can and cannot have access to within your site. If you don’t want spiders indexing your images, just disallow them. If you’d prefer not to have certain areas of your site indexed and available for the searching public – go ahead and restrict access.
This is where you can check to make sure your robots.txt file is not only up to date, but also valid in terms of how it is written.
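As an illustration, a simple robots.txt that keeps all spiders out of a private directory and blocks Google's image crawler entirely might look like this (the directory names are placeholders):

```
# Applies to all crawlers
User-agent: *
Disallow: /private/

# Applies only to Google's image crawler
User-agent: Googlebot-Image
Disallow: /
```

The analysis tool will flag syntax problems and let you test specific URLs against these rules before you publish the file.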
Malware details: In order to help webmasters eliminate malware, Google is now sharing snippets of code from pages it considers malware.
Fetch as Googlebot: The Fetch as Googlebot feature lets you see a page exactly as Googlebot retrieves it, which can reveal whether pages have been hacked and help you understand why they aren't ranking for certain keywords.
Site performance: This page shows you performance statistics of your site. You can use this information to improve the speed of your site and create a faster experience for your users.
Set crawl rate: This area is very informative - it provides an overview of Googlebot's activity on your site. If you have recently updated your site or acquired new links, come back and check this section to see whether Googlebot activity has increased in response to your work.
Set preferred domain: Tired of seeing both www.domain.com and domain.com in your search results? Or worried about canonicalization and how it will impact your optimization and links? Just use the preferred domain tool to instruct Google to display URLs according to your preference - for example, so all listings appear under http://www.domain.com/.
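Alongside the tool itself, many webmasters reinforce the preferred domain with a server-side 301 redirect so the non-www version resolves to the www version for visitors and crawlers alike. A sketch for Apache's mod_rewrite, assuming an .htaccess file and using domain.com as the placeholder:

```apache
RewriteEngine On
# If the host is the bare domain (case-insensitive)...
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
# ...permanently redirect to the www version, preserving the path
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]
```

The 301 (permanent) status tells search engines to consolidate link value onto the preferred version.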
Remove URLs: This automated tool is available to help resolve issues with pages that no longer exist or pages that you just want removed from Google’s index.
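Note that for a removal request to succeed, the page generally needs to be blocked or gone already - returning a 404, being disallowed in robots.txt, or carrying a noindex directive such as:

```html
<meta name="robots" content="noindex" />
```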
In conclusion, the above is a general outline of the important features in Webmaster Tools. Keep analyzing the data available about your website over there and take actions as required.
Thanks for reading these posts on "The Complete SEO Process". Step 6 of the Complete SEO Process will be focused on the important link-building campaign: "Off-Page Optimization - Links".