There is a bug in "Change of Address" in Google Webmaster Tools. I guess this is the worst part of Google Webmaster Tools I have ever seen in my experience. I redirected an old site to a new site using "Change of Address" in Google Webmaster Tools. See the image below, a screenshot of the change of address I did in Google Webmaster Tools.
All the old pages were redirected to the new pages on the new site using 301 redirects. But Google is not handling the request well. It is showing my new pages under the old domain name; the Google search engine is mixed up between the two sites, old and new.
But Google is showing a link, http://www.allfaithsfunerals.com.au/writing-a-eulogy, which never existed. So I had no other choice; I redirected my non-existing pages to the new pages as well.
I hope Google improves its "Change of Address" feature in Webmaster Tools. Another strange thing is that Google is showing each page twice on its SERPs for "site:selwynallenfunerals.com.au" (Selwyn Allen Funerals, http://www.selwynallenfunerals.com.au). I discussed it in my last post (Repeating Google Search Results - Google Bug).
Why is Google repeating the same search results twice? It is strange. Of course it falls under duplicate content. But why show the same page with the same title and description twice? I have never seen anything like this.
I redirected the old site to the new site using Google Webmaster Tools, where the old site has an entirely different set of URLs from the new one. The content was also rewritten.
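For reference, the server-side part of such a move can be done with 301 redirects in an Apache .htaccess file. This is only a minimal sketch; the domain name and page paths below are placeholders, not my actual URLs:
======================================================================
# Map each old URL to its new counterpart with a permanent (301) redirect.
# Placeholders: replace the domain and paths with real ones. Requires mod_alias.
Redirect 301 /old-about-us http://www.newsite.com.au/about
Redirect 301 /old-services http://www.newsite.com.au/services

# Send anything left over (including non-existing pages) to the new home page.
RedirectMatch 301 ^/(.*)$ http://www.newsite.com.au/
======================================================================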
This is an experiment on how class="robots-nocontent" works. All my previous three experiments, class="robots-noindex", rel="noindex", and <noindex>, failed to make a part of the content non-indexable by search engines. Let's finally try this one.
<div class="robots-nocontent">As Yahoo explains at http://help.yahoo.com/l/us/yahoo/search/webcrawler/slurp-14.html, the content in this div shouldn't be indexed by the Yahoo search engine. But I am not sure about Google. Google may not support this "robots-nocontent" class, because it is quite opposite to Google's SEO philosophy: "do not do anything specially for search engines".</div>
Here I am experimenting with nocache and noindex tags within paragraphs.
The aim of this experiment is to test how search engines handle noindex and nofollow when they are used in the middle of the content.
The following text is for testing.
<p content="noindex">All Melbourne SEOs belong to Australia</p>
<p name="robots" content="noindex">Not all Australian designers belong to Melbourne</p>
<noindex>All Melbourne web designers live in Australia</noindex>
<strong>Do not use the strong tag, but use the bold tag. This will not be treated as bold by search engines. Search in the Google cached content of this page.</strong>
<b>Do use the b tag instead of the strong tag. It is good for SEO.</b>
Please leave your comments if you like these kinds of experiments.
I have two sites verified in Google Webmaster Tools. Today I tried to apply for a change of address, from an old site to a newly designed site.
Google said that I need to verify the site I want to redirect from, even though it is already verified in Google Webmaster Tools:
We couldn't verify oldsite.com.au. To submit a change of address, oldsite.com.au must be verified using the same method as www.oldsite.com.au. Add oldsite.com.au to your account and verify ownership, then try again.
Here we need to understand one thing: www.sitename.com.au and sitename.com.au are treated differently by Google, so we need to verify both of them.
I did that, and then Google accepted my change of address (redirection request) from the old site to the new site.
DMOZ says that its data should be used with its badge only. But a great number of websites do not follow this rule, and are not even aware of it; they just take the data from DMOZ and use it.
But DMOZ's warning to those people is not strong enough.
They stated:
Sites that do not display the badge are in violation of the license agreement. Site owners may be contacted by DMOZ and asked to make the appropriate updates in order to comply.
Many people ask questions through his blog comments, but he can hardly read them all. So what is the way to ask Matt questions? I too got tired of searching for a way to ask my SEO questions, but finally I found a good way to ask him.
But be careful: do not ask routine things like how to optimize your title tag, meta description tags, or meta keywords tags; he may not respond. And not only Matt Cutts: you can ask questions of any Google engineer or a world leader, and you can suggest an Android application.
I would like to list some SEO buzzwords here. The order is random.
Keyword Density
Keyword Proximity
Keyword Prominence
Keyword Stemming
Keyword Stuffing
Linkbait
SMO
Doorway pages
Cloaking
Black hat SEO
White hat SEO
Gray hat SEO
Hallway page
Google Caffeine
Latent Semantic Indexing (LSI)
Google bombing
SERPs
Link Juice
Canonicalization
Google dance
Google cache dance
Video Optimization
News Optimization
Image optimization
URL optimization
Piggyback SEO
Twitter SEO
Link Spam
PageRank
Bing SEO
If you enter 'Help site:www.expertrating.com' in the Google search box, what will Google search for?
a. It will open up the Google help pages applicable to www.expertrating.com
b. It will find pages about help within www.expertrating.com
c. It will find only page titles about help within www.expertrating.com
d. It will direct you to the request page for re-indexing of www.expertrating.com
Which of the following statements regarding website content are correct?
a. If you have two versions of a document on your website, Google recommends that you only allow the indexing of the better version
b. Linking to a page inconsistently does not affect the way Google views the pages. Examples of inconsistent linking could be http://www.expertrating.com/ and http://www.expertrating.com and http://www.expertrating.com/page/index.htm
c. Syndicating your content could lead to Google viewing the material as duplicate
d. Placeholders for pages which do not have content are never viewed as duplicate content by Google
Which of the following conditions will Google treat favorably from the 'relevancy' perspective?
a. The website offering products and services to the visitor's country
b. A website that provides free content for the related keywords
c. Hidden keywords on the home page matching the search term
d. Image alt tags on the home page matching the search term
Which of the following factors have an impact on the Google PageRank?
a. The total number of inbound links to a page of a web site
b. The subject matter of the site providing the inbound link to a page of a web site
c. The text used to describe the inbound link to a page of the web site
d. The number of outbound links on the page that contains the inbound link to a page of a web site
Look at the "d" option. I love how confusing it is, but a test is not the place to confuse the test takers. They will get irritated, like me.
Which black hat technique is characterized by a method to deceive search engines by detecting the search engine bot and "feeding" it different HTML code than the HTML actually served to users?
a. Coating
b. Foisting
c. Slighting
d. Cloaking
Here the intention seems to be wasting the candidate's time by phrasing the question in a senselessly long way. Be careful with these kinds of time-wasting tests.
What is the most likely time period required for getting a Google PageRank?
a. 1 week
b. 3 weeks
c. 1 month
d. More than three months
Answer: if I start my website just a day before Google updates its PageRank database, then the answer is one day; I will get PageRank 0 in one day.
Which of the following statements about search engine optimization techniques are correct?
a. Making a keyword bold does not influence the way that the search engines look at the keyword
b. Websites with deep linking are looked at favorably by search engines
c. Search engine robots follow the first link they find to any particular page and they do not follow additional links to the same page
d. It is not a good idea to have the same anchor text for all inbound links as it could look automated to search engines.
A time-eating question, right? It's unfortunate, and poor English too. But the last point, d, catches my eye. D is not true, though many think it is. It would be good if the LimeExchange people kept gray hat SEO techniques out of the SEO test, because they are arguable and eat time.
Cloaking is a controversial SEO technique. What does it involve?
a. Increasing the keyword density on the web pages
b. Offering a different set of web pages to the search engines
c. Hiding the keywords within the webpage
d. Creating multiple pages and hiding them from the website visitors
We all know the answer is b. But the LimeExchange people might have written it in a more understandable way. We know the answers to many questions, but it takes a great deal of time to understand them within the limited time given to finish the test. I reported it to them :D
Which of the following factors contribute towards link popularity of a website?
a. Number of websites that link to it
b. The number of the pages of the website indexed by Google
c. The number of pages in website
d. The quality of websites that link to it
What is the name of the search engine technology due to which a query of the word 'actor' will also show search results for related words such as actress, acting or act?
a. Spreading
b. Dilating
c. RSD (real-time synonym detection)
d. Stemming
e. Branching
This question took a bit more time to understand.
Does anyone have an idea? Please post your answers below in the comments.
a. Attract visitors from the search engines straight onto the Hallway page
b. Organize the doorway pages
c. Help people navigate to different doorway pages
d. Enable search engine bots to index the doorway pages
Why do we love Google? They take care of each and every small detail.
Today I am submitting my site to the Yahoo and Google webmaster tools. Both gave me meta verification tags, and you can see the difference: Google's tag follows XHTML standards, while Yahoo, on the other hand, seems not to care about XHTML standards at all.
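To illustrate the difference (the tag names are as I remember them from the two tools, and the content values are made-up placeholders):
<!-- Google: the meta tag is self-closed, which is valid XHTML -->
<meta name="verify-v1" content="PLACEHOLDER-GOOGLE-CODE" />
<!-- Yahoo: the meta tag is not self-closed, so an XHTML page will not validate -->
<meta name="y_key" content="PLACEHOLDER-YAHOO-CODE">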
What should we call it?
And Yahoo asks for 24 hours to validate the site. It is not able to validate the site immediately after you insert the meta tag or upload the authentication HTML file.
Many people confuse these two. The confusion comes from misunderstanding, or from not understanding at all.
Cloaking is completely related to detecting who is making the request, for example by IP address.
Doorway pages are related to redirection using meta refresh, .htaccess, or JavaScript.
Cloaking
Cloaking is serving different content to different users. Cloaking has its uses, and it is basically not a black hat technique. Using cloaking the wrong way is what comes under black hat techniques.
So let's look at how cloaking is used.
Say you have a news website which serves news internationally. You would like to serve different news to different countries at the same time on your home page. This means you are serving different content to different users, and it is not a black hat technique.
Then why is cloaking called a black hat technique?
If you serve content specially to search engines, that comes under black hat cloaking. When you show different content to your users and to the search engines, that is black hat cloaking. You have to treat search engines as your normal visitors; do not treat them differently from your visitors.
You can serve different content to a browser than to a search engine depending on the request your server gets (the User-Agent HTTP header, among other things).
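For illustration only, this is roughly what user-agent detection looks like in an Apache .htaccess file; the page names are placeholders, and using this to feed search engine bots different content is exactly the black hat cloaking described above:
======================================================================
# Illustration of user-agent based cloaking -- do NOT do this for bots.
# Requires mod_rewrite; page names are placeholders.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
RewriteRule ^index\.html$ /version-for-search-engines.html [L]
======================================================================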
Doorway pages
Doorway pages are something different from cloaking.
Using JavaScript
Search engines ignore JavaScript. Someone can take advantage of this and redirect the visitors referred by search engines to a different page, as in the sketch below.
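A minimal sketch of that doorway pattern (the destination URL is a placeholder), shown only so you know what to avoid:
======================================================================
<script type="text/javascript">
<!--
// Doorway page pattern: search engines that ignore JavaScript index this
// page's content, while real visitors are silently sent elsewhere.
// This is a black hat technique -- do not use it.
window.location.replace("http://www.example.com/real-destination.html");
// -->
</script>
======================================================================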
Using meta refresh
I am not sure whether Google or any other search engine considers the delay time within the meta redirect tag.
What if someone sets the delay to 6000 seconds = 100 minutes, which is more than one and a half hours?
This means the visitor almost certainly finishes reading the content on that page and closes it before the browser redirects to the destination page. But the search engines would still consider the redirect target's content in the search results.
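The tag in question looks like this (the URL is a placeholder); the first number in the content attribute is the delay in seconds:
<meta http-equiv="refresh" content="6000;url=http://www.example.com/destination.html" />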
There is a good way, a bad way, and another one that is neither good nor bad.
Let's discuss the bad way first.
Cloaking
Yes, it is the bad way: showing different content to search engines and site visitors. Obviously Google hates it.
It is not good practice to show the Flash content to the site visitor and the text content of that Flash to the search engines. There is a smart way to do this, but don't go for cloaking or doorway page techniques; they come under black hat SEO.
Hiding text behind flash
This is not good and not bad, and not an SEO black-hat technique. But there is a rule: you must show the same content, in the same form, that exists in the Flash movie, and not any other text.
It is not a black-hat technique because you are not showing different content or URLs to the search engines and site visitors.
Well, there is a better technique below.
Using <noscript> tags
Interesting !!!
Call the Flash movie using JavaScript and write the HTML version of your Flash content inside <noscript> tags. Search engines ignore JavaScript, so they pick up the content between the <noscript> tags.
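A minimal sketch of the technique; the movie file name and the text are placeholders, and the rule from above still applies (the <noscript> text must match the Flash content):
======================================================================
<script type="text/javascript">
<!--
// Browsers with JavaScript get the Flash movie.
document.write('<object type="application/x-shockwave-flash" data="movie.swf" width="400" height="300">');
document.write('<param name="movie" value="movie.swf" />');
document.write('</object>');
// -->
</script>
<noscript>
<!-- Search engines and users without JavaScript get the same content as HTML. -->
<h2>Our Services</h2>
<p>The same text that appears in the Flash movie, and nothing more.</p>
</noscript>
======================================================================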
Canonicalization is the solution for duplicate content. But Blogger added a canonical tag to each and every page of my blog, with the corresponding page's link in the href.
In the case of Google Blogspot, the canonical tags should appear only on the label pages (e.g. http://web-search-techniques.blogspot.com/search/label/Canonicalization), because that is the only part of my blog where content duplication happens.
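For reference, the canonical tag itself looks like this; the href is whichever URL you want search engines to treat as the preferred version of the page:
<link rel="canonical" href="http://web-search-techniques.blogspot.com/" />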
It showed me the cache of 31st August 2009. Then I refreshed, and it showed the 4th September cache. Then I refreshed again: the 31st cache, then the 4th. It's so pathetic.
I think it is because of an update at its data centers.
Keyword stemming is a way of using a keyword in different forms, with prefixes, suffixes, and plurals. It is a practice in Search Engine Optimisation (SEO).
Any Example?
Yes, why not.
Let's start with the keyword "keyword stemming".
We can stem the keyword "keyword stemming" in the following ways.
keyword stemming
key word stemming
keywords stemming
stemming keyword
stemming keywords
stemming key words
stemming key word
All the above keywords are stemmed forms.
Why is it so important?
In optimizing a page, stemming keywords is good practice. While you are writing the content, take care that all your stemmed keywords are used in a proper manner and spread throughout the content. Content quality is also an important factor in SEO success, not only keyword stemming.
Don't try to fool Google using the stemming technique. Google is intelligent enough to recognize the difference between "stem cell", "rose stem", "plant stem" and "keyword stem".
I got a white screen. There was no announcement of downtime!!!
I last accessed this site on Friday.
We all hope it will come back soon.
BTW, if anyone needs a Demonoid membership invitation code, I have it.
I will give the Demonoid invitation code to you, but I need a small favor from you.
I need a link to my site http://donate-books.org from your site or blog, or at least help in spreading the word. It is not a condition; it is a request.
Those who need a Demonoid invitation code, leave a comment here.
Whenever Demonoid comes out of its downtime, I can generate one and send it to you. But I may only be able to help a few people, not all.
Don't forget to leave your email in the comments for the Demonoid invitation code.
But it is not possible to differentiate versions of the Safari browser that way to define different CSS for different Safari versions.
I worked on the QuirksMode browser-detection JavaScript today and made some changes so that it is easy to use, even for dummies.
Include the following JavaScript in your HTML document:
======================================================================
<script type="text/javascript">
<!--
var BrowserDetect = {
	init: function () {
		this.browser = this.searchString(this.dataBrowser) || "An unknown browser";
		this.version = this.searchVersion(navigator.userAgent)
			|| this.searchVersion(navigator.appVersion)
			|| "an unknown version";
		this.OS = this.searchString(this.dataOS) || "an unknown OS";
	},
	searchString: function (data) {
		for (var i = 0; i < data.length; i++) {
			var dataString = data[i].string;
			var dataProp = data[i].prop;
			this.versionSearchString = data[i].versionSearch || data[i].identity;
			if (dataString) {
				if (dataString.indexOf(data[i].subString) != -1)
					return data[i].identity;
			}
			else if (dataProp)
				return data[i].identity;
		}
	},
	searchVersion: function (dataString) {
		var index = dataString.indexOf(this.versionSearchString);
		if (index == -1) return;
		return parseFloat(dataString.substring(index + this.versionSearchString.length + 1));
	},
	// Minimal data tables; the original QuirksMode script has more entries.
	// Chrome is listed before Safari because Chrome's user agent also says "Safari".
	dataBrowser: [
		{
			string: navigator.userAgent,
			subString: "Chrome",
			identity: "Chrome"
		},
		{
			string: navigator.vendor,
			subString: "Apple",
			identity: "Safari",
			versionSearch: "Version"
		}
	],
	dataOS: [
		{
			string: navigator.platform,
			subString: "Win",
			identity: "Windows"
		},
		{
			string: navigator.platform,
			subString: "Mac",
			identity: "Mac"
		}
	]
};
BrowserDetect.init();
// -->
</script>
======================================================================
And then include the following script wherever you want to branch on the Safari version:
======================================================================
<script type="text/javascript">
<!--
if (BrowserDetect.browser == "Safari") {
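	// A minimal sketch of how the detected version can be used: load
	// version-specific CSS for Safari. The stylesheet file names below
	// are placeholders.
	if (BrowserDetect.version >= 4) {
		document.write('<link rel="stylesheet" type="text/css" href="safari4.css" />');
	}
	else {
		document.write('<link rel="stylesheet" type="text/css" href="safari3.css" />');
	}
}
// -->
</script>
======================================================================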