January 27, 2011

Search Engine Issues with Multiple Domain Names

In the heady days of the Dot-Com Boom, countless people registered domain names containing the names of celebrities, well-known companies, movie and television titles, and anything else they thought would bring in the big bucks when they sold the rights. Domain names are no longer the hot properties they once were, partly because the registrars have set up rules that protect trademarks and copyrights. But the biggest reason catchy domain names are no longer the Holy Grail is that Internet users are now accustomed to using search engines to find products and services, rather than blindly typing in things like "bestcarprices.com". However, the desire for higher rankings in the search engines has given new momentum to the practice of buying many domain names and having them all point at the same website - or more precisely, the same web pages.
Search engines like Google do give some weight to the contents of a page's URL in their ranking algorithms. It's just a tiny factor, but some hucksters are so anxious for any benefit that they'll register dozens of domain names that are variations of the primary keywords for their sites just to get that small boost. However, there's a fly the size of Brazil in this ointment: the duplicate content filter that all of the major search engines apply when they detect mirrored pages. The search engines have several concerns in this regard. First of all, allowing multiple URLs to point to the same content degrades their search results. Second, each webpage in a search engine's index consumes valuable resources in its network, and it's understandable that they don't want to waste those resources on spam or even innocent copies. When Google detects duplicate content, it tries to select the best version of the page (the "canonical version") and devalues the copies. The common phrase "duplicate content penalty" is a bit of a misnomer, since there is no overt penalty involved. The problem is that you don't get to pick which copy is selected as the canonical version and which gets ignored, so you can end up with half of your content indexed under one domain and the rest under another, which dilutes your internal linking benefits and damages your overall rankings on both domains. Using a different language to convey the same information is NOT duplicate content, so don't worry about that.
The best advice is to never have multiple domain names pointing to a single website unless you have set up 301 redirects to a single URL: that is, the URL for the primary domain name. Spreading your link popularity among multiple domains makes it doubly or triply difficult to achieve any positive effect, compared to simply promoting and enhancing a single website with a single domain name. Many companies and organizations buy extra domain names that are common misspellings of their preferred domain name, or that might otherwise be mistyped by users or misused by competitors. In those situations, of course, the best practice is still to install 301 redirects back to the primary domain name.
But if you have a compelling reason to use multiple domain names, such as using country-specific top-level domains (i.e., TLDs like ".uk" or ".au") for their search engine ranking advantage in terms of geo-location factors, you need to take steps to avoid problems by making sure that there is very little duplication among the sites you operate. Using a different page design can help, but it's the actual text within each website that needs to be as unique as practical considerations allow. Google has recently posted an article on Multi-Regional Sites that discusses this issue in great detail.
If vital information lives on the company's main website and can't be rewritten for the other sites for one reason or another, then you should use one of the following alternatives:
  • Don't duplicate the page on every site. Always link to the page on the main website from the secondary sites/domains.
  • Use the rel="canonical" tag if the page really must remain on the secondary website.
  • Block the duplicate pages on the secondary sites from the search engines with the robots.txt file or with a robots <meta> tag set to "noindex" (examples of both follow below).
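For example, to block a duplicate page at the hypothetical path "/duplicate-page.html" on a secondary domain, the robots.txt entry would look like this:

User-agent: *
Disallow: /duplicate-page.html

Alternatively, place the robots meta tag in the <head> section of the duplicate page itself:

<meta name="robots" content="noindex">

Either method keeps the secondary copy out of the index; just don't combine the two, since a crawler that is blocked by robots.txt will never see the meta tag on the page.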
If you already have multiple domain names in use, then you should merge them into a single, primary domain. The solution is to use server control methods to automatically redirect all requests for pages in the secondary domains to the corresponding URL in your primary domain. The server must return a "301 Moved Permanently" response code so that the search engines properly re-assign the link popularity and update their internal records of each page's true URL. Any other response code returned by your secondary domains will, at best, prevent the link popularity from passing to the primary domain and, at worst, allow the duplicate content issues to spread to the primary domain and impair your rankings.
Websites hosted on servers running the Apache software usually have it easiest in this regard, because the problem can be handled with the .htaccess control file. Just create a simple text file named ".htaccess" (with no filename extension) and insert the following commands:

# Redirect all requests for the duplicate domain to the primary domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?yourduplicatesite\.com$ [NC]
RewriteRule ^(.*)$ http://www.yourmainsite.com/$1 [R=301,L]

Simply replace "yourmainsite.com" in the above code with your primary website's domain name and "yourduplicatesite.com" with the name of your duplicate domain. Websites running on Microsoft's IIS server software will likely need help from a system administrator, though a sketch of one common approach follows below. Again, be sure the server returns the redirecting result code 301 or you're not really fixing anything; a code 302 redirect will not do the job properly or reliably. You can check the result code that your server sends using my Server Result Checker.
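For what it's worth, here is a rough sketch of an equivalent rule for IIS 7 or later, assuming the optional URL Rewrite module is installed. The domain names are placeholders as before, and your server's configuration may differ, so treat this as a starting point rather than a drop-in solution:

<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- Redirect all requests for the duplicate domain to the primary domain -->
        <rule name="Redirect duplicate domain" stopProcessing="true">
          <match url="(.*)" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="^(www\.)?yourduplicatesite\.com$" />
          </conditions>
          <action type="Redirect" url="http://www.yourmainsite.com/{R:1}" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>

The redirectType="Permanent" attribute is what produces the required 301 response code.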

Duplicate Pages on Sites You Own
Another instance where a webmaster might run into duplicate content issues is when they operate separate domains for different countries. Often, there will be information that is important to include on all such sites. For those pages where the content is identical, or nearly so, it is a good idea to use the rel="canonical" tag on all of them to point to a single, best-quality version, which tells the search engines to index only that "canonical" version. The syntax is:
<link rel="canonical" href="http://www.example.com/canonical-page.html">
The search engines treat this tag much like a 301 redirect. It prevents duplicate content problems while providing users with local copies of important pages. See my SEO Tips article on the rel="canonical" tag for more information on how this tag works and how to use it.
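As a quick illustration, suppose a hypothetical UK site at www.example.co.uk carries a copy of a shipping policy page from the main .com site. The copy would include this tag in its <head> section, pointing back at the primary version:

<link rel="canonical" href="http://www.example.com/shipping-policy.html">

The search engines will then treat the .com page as the version to index, even though the tag sits on a page in a different domain.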


The original post is published here:
Multiple Domain Names & Search Engine Ranking - Rainbo Design SEO Tips
