When working with Google Webmaster Tools, take a look at the Diagnostics section: crawl errors, HTML suggestions and the rest all give good information about your site.
You can check how much of your site crawlers can see, deal with duplicate content issues, and understand how your site behaves. See if there are any errors and correct them.
But! They give that information from Google's perspective – from the pages Google has indexed. Most likely it will not include recent changes, and it will include old, obsolete pages: pages you don't have anymore and have removed all links to. Google will still show them to you!
So, if you see some problems in your account, don't rush headlong into hunting down the error. (As I did 🙂 )
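Before chasing a reported error, it's worth confirming the state of the page yourself: a page you have removed should answer with HTTP 404 (or 410), which is what eventually tells Google to drop it from the index. Here is a minimal Python sketch of that check; the `status_of` helper, the tiny local server standing in for "your site", and the page paths are all illustrative, not from Webmaster Tools itself:

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.error import HTTPError
from urllib.request import urlopen

def status_of(url):
    """Return the HTTP status code for a URL (hypothetical helper)."""
    try:
        return urlopen(url).getcode()
    except HTTPError as err:
        return err.code

# A tiny local server standing in for "your site" in this sketch.
class Site(BaseHTTPRequestHandler):
    def do_GET(self):
        # Only one page still exists; everything else was removed.
        self.send_response(200 if self.path == "/live-page.html" else 404)
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo output quiet

server = HTTPServer(("127.0.0.1", 0), Site)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = "http://127.0.0.1:%d" % server.server_address[1]

live = status_of(base + "/live-page.html")     # 200: page exists
gone = status_of(base + "/removed-page.html")  # 404: tells crawlers to drop it
server.shutdown()
print(live, gone)  # 200 404
```

If the "missing" page really does return 404, the error report is just Google's index lagging behind your site.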
Continue reading “Some general tips and tricks”
Last night I heard a complaint that Google wasn't indexing a web page. The guy even wondered whether he had been banned for some reason.
To understand the problem, we asked a few questions, got his website's name and had a look at it.
What we got from his answers:
- The website was very new: only a few days on the net.
- He had verified it using Google Webmaster Tools.
- Its URL had been added to Google: How to add a link to google
- His robots.txt file was:
user agent: *
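As an aside, the directive in a robots.txt file is spelled `User-agent` (with a hyphen), usually paired with a `Disallow` rule; an empty `Disallow` blocks nothing. A small Python sketch with the standard library's `urllib.robotparser` shows how crawlers read such a file (the file contents and URL here are illustrative):

```python
from urllib.robotparser import RobotFileParser

# A minimal, well-formed robots.txt that allows every crawler.
# Note the directive is "User-agent" (with a hyphen); an empty
# Disallow rule means nothing is blocked.
ROBOTS_TXT = """\
User-agent: *
Disallow:
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

allowed = parser.can_fetch("Googlebot", "http://example.com/any/page.html")
print(allowed)  # True: nothing is disallowed
```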
So, for starters: his website was new and had no inbound links to it, so it was untrustworthy from Google's point of view. The page will gain trust and get indexed too; it just takes more time. The second and third points tell us that Google is aware of this page and will visit it sooner or later.
Continue reading “Indexing pages and getting rank problems”
Continuing from my previous post:
When you start working with Google Webmaster Tools you will be asked to verify that you are the owner of the site. In reality this boils down to a check of whether you can add a meta tag or an HTML file to the website in question. FTP access is all that is needed.
If you choose meta tag, Google will ask you to add something like this:
<meta name="verify-v1" content="<some string here>" >
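After pasting the tag into your page's head, you can sanity-check that it actually made it into the served HTML before asking Google to verify. A small sketch with Python's standard `html.parser`; the page source and the token value `abc123=` are made up for the example:

```python
from html.parser import HTMLParser

class MetaFinder(HTMLParser):
    """Collect the content of any <meta name="verify-v1"> tags."""
    def __init__(self):
        super().__init__()
        self.tokens = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name") == "verify-v1":
            self.tokens.append(a.get("content"))

# Hypothetical page source with the verification tag pasted into <head>:
page = """<html><head>
<meta name="verify-v1" content="abc123=" >
<title>My site</title>
</head><body></body></html>"""

finder = MetaFinder()
finder.feed(page)
print(finder.tokens)  # ['abc123=']
```

If the list comes back empty, the tag never reached the live page (a caching layer or a template typo are the usual suspects).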
Continue reading “Google Webmaster Tools W3C”