Hello All…
I always noticed that when I search for my keyword "web design Hyderabad" on google.com, I find my website on the first page of the search results.
But one fine day I found a twist: when I search "web design hyderabad" on google.co.in, it's not on the first page. I was a bit shocked for a while, then found my website on the second page.
Note the fact that the domain is a .com and not a .co.in or a .in. I admit I haven't done much SEO on the domain; it's evident that what brings my domain to the first two pages of results is the backlinks I have.
Secondly, as per my understanding, the differences in the search results are caused by the datacenter. If my site were hosted in India, it would have shown up better on google.co.in than on google.com. Since it's hosted in the US, google.com reads it as a non-Indian, or rather global, company.
And don't worry, nothing can stop you from being on the first page if your website is well optimized for SEO, so optimize it well and be first on both domains.
This article was meant to make the difference clear, and I hope it is. Please ask me if you have any questions.
Cheers, Josh
Excellent post, Josh! Thank you for the info.
Hi Josh,
Could you please help me with this problem.
My site is hosted in the US. The geo-targeting is also set to US. However, strangely, when I do a search in Google for "site:mindfiresolutions.com", I get only 254 results, whereas my website has over 450 pages! I was struggling to identify why some pages were not getting indexed when I hit upon another oddity.
When I search on google.co.in, the same search string gives over 1400 results, which is good. But I want google.com to also show at least the same number of pages.
Can someone please explain and help?
Here is the link which i posted in Google forums – http://www.google.com/support/forum/p/Webmasters/…
Thank you
Subhendu
Hey Subhendu,
A few questions that would help get to the root of the issue:
1. How frequently are your site's pages updated? I mean mainly the pages that aren't indexed.
2. Do you have a robots.txt file in your website root? If yes, is it requesting anything from the bots?
3. Does your site have a good Google sitemap?
4. Are you giving enough exposure to your website's content?
Usually, if a website has 450 pages, not all 450 get indexed unless the site has a good frequency of content updates and Googlebot feels deep-link indexing is required. So if you keep your site regularly updated, Google will index it faster. Apart from that, use robots.txt effectively to suggest that bots crawl deep, and generate an XML sitemap with links to all the pages you want indexed, then submit it to Google through your Webmaster Tools…
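For reference, a minimal XML sitemap following the sitemaps.org protocol looks something like the sketch below. The domain and page paths here are just placeholders, not your actual pages; list one `<url>` entry per page you want indexed.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want Google to index -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2010-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <!-- Deep pages go in too; only <loc> is mandatory -->
    <loc>http://www.example.com/services/web-design.html</loc>
    <lastmod>2010-01-10</lastmod>
  </url>
</urlset>
```

Save it as sitemap.xml in your site root and submit it in Webmaster Tools under Site configuration > Sitemaps.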
Let me know if you need further help.
Cheers
Josh
Thanks Josh.
The clue to the missing indexed pages might lie in the fact that those pages are not new; they are around 4-5 months old and have not been updated since.
The robots.txt file is there, and it only disallows bots from the internal portals.
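(For context, a robots.txt that blocks bots only from an internal area, as described, could look like the sketch below; the /portal/ path and the domain are just placeholders, not the site's actual layout.)

```
# Block all bots from the internal portal area only;
# everything else stays crawlable.
User-agent: *
Disallow: /portal/

# Optional: point crawlers at the XML sitemap
Sitemap: http://www.example.com/sitemap.xml
```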
Yes, I have a sitemap which is submitted to Google every month, or sometimes every 20 days.
Exposure to content… not sure I understood this point. Could you explain a bit more?
Thank you once again for the amazing clarity of response.
Subhendu
Exposure to content: by this I meant introducing these pages to Google. Post in forums or blogs with links and anchor texts pointing to these unindexed pages (which you should preferably update first). That would give the bots a chance to visit these pages through an external source and re-index them.
cheers
Josh
Thanks so much, Josh!! It actually worked. The external exposure did help get the pages re-indexed! Thank you.
That's great, I'm really glad it helped.
Cheers
Josh