Blog Disclaimer

The posts published on this blog are collected from different blogs and websites written by various well-known bloggers and writers; I have only collected them and have not written them myself. All content is provided for informational purposes only and is not used for any commercial purpose. At the end of every post, visitors will find a link to the original source. These posts are kept here only for later reference and review, so please read the original article by clicking on the source link given with each post.

October 18, 2012

How to Use Trademarks in AdWords & Bing Ads: Tips for Trademark Owners and Resellers


Both Google and Bing allow advertisers to bid on trademarked terms, but they monitor and restrict their use in ad copy. A trademark cannot be used in the ad text or the display URL without permission, nor can it be dynamically inserted through keyword insertion parameters. Trademark owners can prevent the unauthorized usage of their terms by taking the steps below:
  • On Google, trademark owners are required to fill out the Trademark Complaint Form to stop the misuse of a trademark. The form also gives the option to list approved resellers which then will be allowed to use the trademarked term in ad copy.
Even if there are no advertisers on Google.com or Bing/Yahoo.com taking advantage of your trademarked term, it is a best practice to fill out the forms for both engines anyway. In some instances, competitors may be using your trademark on the search partner networks.
Protecting your trademarked term is fairly straightforward. However, for a reseller, it is more difficult to get disapproved ads re-approved. In many cases, ads that include a trademark get disapproved altogether or receive the “Approved Limited” status. On Google, “Approved Limited” ad status is granted for ads that may not show in all regions or on all the partner network sites. On Bing, it refers to ads that are undergoing additional review, or have been disapproved, in at least one market.
There are several ways you can attempt to avoid ad disapprovals on Google:
  • The easiest way is to have the trademark owner or the reseller listed on the trademark complaint fill out the 3rd Party Authorization Request. However, in some cases, it may be difficult to find out who actually owns the trademark. We have found that Google account managers are an excellent resource to help in this search.
  • If obtaining permission from the trademark owner is not an option, there are a couple of other changes that can make the editorial process smoother:
    • Make sure the ad copy landing page clearly displays the trademarked term in question.
    • The trademarked term needs to be written in text, not in an image or in a link.
    • The action to purchase the trademarked product must be available on the landing page or within one click of the landing page.
    • The landing page must be primarily focused on the trademarked manufacturer.
    • There should not be any competitor trademarks on the landing page.
    • Use the ™ and ® symbols in ad copy exactly as they appear on the landing page.
On Bing, the best way to solve trademark issues is to work with your Bing team or the Bing Support Center. Making changes to disapproved ads in the Bing Ads online interface has become more difficult in recent weeks. If a campaign has too many keywords or ads to be shown at once, you may have to go ad group by ad group looking for the disapproved keyword or ad in question. It may sometimes be possible to see ad disapprovals in the Bing Ads desktop tool, but this tool isn't always entirely reliable. From our experience, ads get disapproved due to trademark policies less frequently on Yahoo/Bing than on Google.
On Google, it’s important to note that trademark terms are generally not allowed in the display text of Sitelinks and oftentimes get disapproved. The trademark owner can submit the 3rd Party Authorization Request to request approval. However, the simplest way to ensure that your Sitelinks are active is to avoid using trademarked terms altogether in this extension.
In summary, the search engine trademark policies seem to favor the trademark owners and misuse is policed pretty thoroughly. Even for authorized resellers, the editorial process is a hassle, but navigating it effectively might just play an important role in our Q4 success.

Source : RKG Blog

A Study On Panda Propaganda • Why One Site Lost 56% of 7.4 Million Views


Today is the last day of PubCon in Las Vegas and yesterday I was fortunate enough to be on the expert panel discussing Google Panda/Penguin and the recovery.
I took a different approach from the other panelists and showed an actual case study including steps that my team has taken to pull our sites out from the clutches of these two Google Updates.
I was last to present and what was the crowd’s reaction?
Since the onslaught began in 2011, Panda and now Penguin have killed off many sites. There has been a lot of advice out there on how to fix things, but there isn’t a magic formula that works with all sites. Let me be clear on that: you have to figure out the problems and address each one in order for the recovery to work. There is no BIG RED BUTTON that will make your problems go away either. This takes real work.
And this is what separates me from many experts in the field. I don’t believe in bullshit. I don’t go by theory, I just look at test results. If it worked on one domain and you push the same process on dozens of domains and it works on all of them, that’s a pretty good indication that it is going to work. Even the crowd picked up on it.
Let’s get to the case study and Steve Metivier from InspirationalSpark.com. His site was cranking some pretty serious traffic. What is the definition of serious traffic? How about 7.4 million page views over the last year.
7.4 million page views is pretty impressive in and of itself. I know you will look at his bounce rate of over 70% and point to that as a main reason. No. Focus on what the purpose of the site is: to give quotes and inspirational ideas to a visitor who normally just comes in, gets the information and leaves. Usually their need is handled with just one page view. That is the key with bounce rate: make sure the site is performing properly based on its intent.
So, in Google Analytics we can see that Panda struck this site on June 9th. The real mystery is why Google didn't pound this site earlier, as most of it is in bad shape; you will see just how bad further down.
While that slide is horrific, it didn’t stop there. Check this out…
It is a gradual slide down and then it is ALL DOWNHILL in September. If this was your site, the stress would be through the roof.
Let’s peel back the site piece by piece and do some analysis of what could have caused this.

Skinny Content and Duplicates

The content is a major issue with this site and the worst problem area is within the blog subdomain. Below is a picture-perfect example of the ridiculously thin content: a one line quote from Roosevelt. It is so short that the excerpt itself covers the entirety of the content; so not only is it thin, it is internal duplicate content. Let’s not forget these are just quotes so duplicates are elsewhere on the web. This is not good and there were so many pages like this, Google should have punted this site a long time ago.
And the actual infringer itself:

How strong is the site in terms of content? Well, take a look:
384 pages of similar content. That's 45% of the site that Google thinks is essentially useless. That can be a site killer all by itself, but there's more. On to the next problem…

So Slowwwwwwwww

The second largest issue has to be site speed. The load times on the domain are horrendous and likely affect the site in Google. This is a bad user experience and should be fixed even if SEO wasn’t the goal.
Here is how the top pages fare in terms of load time. You can see just how painfully slow here:
Of course part of this speed drain is from hot linking (when a site fetches an image not on its own server, but from another domain). In the next image you can see some of the hot link offenses found on this site:

Another speed killer is running multiple analytics software. With Google Analytics, there is no need to slow down the site even more with StatCounter:

Missing Elements

Some of these are simple fixes, like a missing title tag. They still can't be overlooked. With these updates, you can't overlook ANYTHING.

URL Structure

You may have noticed in the previous image that the URL was a bit strange, /1528/1528/ – why does that exist? When Google crawls just /1528/, it finds a page with a canonical that points to /1528/1528/. The current standard is the /sample-post/ format. You have to be aware of how you are doing your URL structure as it can completely screw you with Google if you do it wrong.

Sitemap

The sitemap is one of the key components of a site to help it get indexed, stay indexed and then reindexed when changes occur. It is also a way for you to tell Google how important your pages are. The problem is, some webmasters label all their pages as important, and that is about as effective as labeling every box in your move "fragile"; it just gets ignored. Don't do that.
The weirdest thing with this sitemap is the filename. This is not a web standard sitemap.xml. The priorities within should be as follows:
1.0 – Home Page
0.7-0.8 – Very important pages/top category pages (less than 10 of your pages should carry this weight typically)
0.6 – The rest of your category pages
0.4-0.5 – Your content pages
0.3 – About Us
0.2 – Any other low-level pages.
0.1 – Terms of Use, Privacy Policy, DMCA, etc.
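To make the scheme concrete, here is a minimal Python sketch (using placeholder URLs, not the real pages from this site) that writes a sitemap.xml following the priority weights listed above:

```python
# Minimal sketch: build a sitemap.xml that follows the priority scheme above.
# The URLs below are placeholders, not real pages from the site discussed here.
import xml.etree.ElementTree as ET

PAGES = [
    ("http://www.example.com/",                  "1.0"),  # home page
    ("http://www.example.com/love-quotes/",      "0.7"),  # top category page
    ("http://www.example.com/funny-quotes/",     "0.6"),  # regular category page
    ("http://www.example.com/quote-of-the-day/", "0.5"),  # content page
    ("http://www.example.com/about/",            "0.3"),  # About Us
    ("http://www.example.com/privacy-policy/",   "0.1"),  # legal boilerplate
]

def build_sitemap(pages, filename="sitemap.xml"):
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, priority in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "priority").text = priority
    ET.ElementTree(urlset).write(filename, encoding="utf-8",
                                 xml_declaration=True)

if __name__ == "__main__":
    build_sitemap(PAGES)
```

The point is the shape of the output: every URL carries a deliberate priority instead of a blanket 1.0 that Google will simply ignore.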

Fixes

So, what is our prescription for Steven to fix his site? It won’t be easy, but he will have a much better site than before and should regain all the Google-love he lost.
  1. Time to “pony up” and get a real design. The current design is a failure and if he wants to really be a serious player in the quote space then he needs to look like one. My suggestion is to make the www root (actual home) a WordPress site and use the Genesis framework for the theme. Things need to be simple, including the code. This is why I recommend a platform that is easy to edit site-wide without hassles from Dreamweaver or FTP logins. We have tested nearly every framework and Genesis passed all of our tests.
  2. Shut down the subdomain blog.inspirationalspark.com. There is no real reason for him to have a subdomain, but the reason he does is he wanted a WordPress blog and the framework of his original design would not support one. This is taken care of with the new design. The content found on the subdomain was mostly the omitted results and thin pages anyway. It would be wise to take the pages with decent content or backlinks and 301 redirect them to a new location.
  3. Fix Content. Google would just say “add value” without giving specifics. Here are some real ideas that could be added:
    • Explanation of when and where the quote was said
    • Historical context, and/or interpretation.
    • Use a standard social share button closer to the quotes and encourage comments. This will engage users on the site and can generate links. All good things.
  4. Where are all the pictures? Infographics? Videos? The site is basically dead and needs life injected into it to really stand out.
  5. Change sitemap file name to sitemap.xml (this is a web standard for bots). Most WordPress sitemap plugins do this properly.
  6. Verify the redirect from /index.html to the homepage (http://www.inspirationalspark.com/). Currently it doesn't work; a quick check like the sketch after this list can confirm the fix. Too much internal duplicate content exists, which has to be causing issues. Most internal duplicate content issues will disappear with the WordPress redesign.
  7. Write custom excerpts on each post to lessen internal content issues and encourage traffic flow throughout the site.
  8. Site speed matters. Use these tools’ recommendations:
  9. Minimize DNS Lookups. If the server is drained and the images have to be elsewhere, consider using a CDN or reverse proxy like CloudFlare.
  10. Minimize Redirects. They add an unnecessary tenth of a second each time one is loaded. Use a tool like Scrutiny, Screaming Frog SEO, or the Broken Link Checker WordPress plugin.
  11. Remove StatCounter. Using Google Analytics is enough.
  12. Add blog titles to each blog post. This is a simple one. Again, using a tool like Scrutiny or Screaming Frog SEO will highlight where title tags and meta descriptions are missing.
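On point 6, the redirect from /index.html is easy to verify by hand, but a tiny script saves repeating the check after every change. The sketch below is an illustration using the third-party requests library and the URLs named in the fix list; it is not part of the original post.

```python
# Quick check: does /index.html 301-redirect to the homepage?
# Requires the third-party "requests" package (pip install requests).
import requests

HOMEPAGE = "http://www.inspirationalspark.com/"
DUPLICATE = HOMEPAGE + "index.html"

def check_redirect(url, expected_target):
    # Don't follow redirects automatically; we want the raw status code.
    resp = requests.get(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    if resp.status_code == 301 and location.rstrip("/") == expected_target.rstrip("/"):
        print("OK: %s -> 301 -> %s" % (url, location))
    else:
        print("PROBLEM: %s returned %s (Location: %r)"
              % (url, resp.status_code, location or "none"))

if __name__ == "__main__":
    check_redirect(DUPLICATE, HOMEPAGE)
```

A 301 with a Location header pointing at the homepage is what you want to see; anything else (a 200, a 302, or a redirect to another URL) means the duplicate is still live.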
SEO is different now than it was just a year or two ago. You can't be lazy anymore. You can't just post other people's content without actually doing something to it to make it worthwhile. You can't just sit on your hands and wait for the traffic to come to you; you have to go out and get it. Be aggressive. Programs like WP Robot, Scrapebox and other automated tools just don't work. This is getting back to grass-roots marketing. Focusing. Delivering what your visitor wants. Giving them a solid experience.
I am sure you are saying, "Shouldn't that have been happening since Day One when a site is launched?" Yes it should, but it wasn't happening. And the reason it wasn't happening is webmasters were blinded by the checks. First $10k a month, then $30k, soon it was $100k and some even surpassed $200k a month. That kind of money will tempt even the most respectable webmaster into doing shady things.
It is time to get to work and to be awesome again.
by JERRY WEST on OCTOBER 17, 2012

Source : Panda Propaganda • Why One Site Lost 56% of 7.4 Million Views

October 14, 2012

Solutions to Exact Match Domain (EMD) Problem


These are obviously troubling times for EMD owners. No one likes to be at the center of an SEO witch hunt. It's all fine and good to do spam reports, until it hits your site, or targets your niche or competitive advantage. One of the best competitive advantages has always been the ability to stay under the radar and keep your mouth shut (though I sometimes fail to practice what I preach).
The solutions are the same as for many of the problems with Panda and Penguin. It's a tough time to be a site owner, to admit that you were "over-optimized" and start backpedaling a bit, but it's G's world – we just play in it. How many times has Google said it? Focus on the user. You may have always scoffed at doing "what's good for the user," but with engagement metrics that suggestion has turned into a requirement. We'll continue to focus on how to make our websites better for both users and Google, with more actionable execution that takes advantage of how users interact with our sites via search engines.
There’s still plenty of advantages to EMD’s, and we should continue to see instances of their success, but it’s hard to build a generic commercial intent keyword brand. You gotta have the chops to back it up!
We all know ranking for generic commercial intent phrases is valuable, or we wouldn't be targeting them. In order to stand up to the scrutiny, you're going to have to choose your favorite EMD's, and let those other pipe-dream microsites die their slow painful death. It's important to know when to pull the plug on a losing web property. Any good web entrepreneur has plenty of failures on their resume.

A few things to consider for solving problems with EMD sites:

  1. Disavow all public knowledge of SEO
  2. De-optimize
  3. De-link
  4. Prioritize your SEO efforts – you can’t win the battle on all fronts anymore
  5. Focus on quality over quantity (with site indexation)
  6. Redesign and Rebrand (maybe it’s time to get a mascot for your .org)
  7. Innovate ways to improve user engagement metrics
  8. Develop a social presence and improve your social mentions
  9. Diversify your backlink profile
  10. Diversify your anchor text
  11. Okay – I’m (kind of) kidding on rule #1 - #3


You know who hates on good EMD's most? The people who don't own them. You know why? Because they've always carried an advantage with them. While this advantage is diminishing, there is still a tactical advantage in spending some money up front for a great exact match domain name that describes exactly what you do and acknowledges the generic commercial intent of your visitor.
EMD's will always receive lots of hatorade because the majority of people don't own them. Toolman at webmasterworld said it best: S.P.A.M = Sites Positioned Above Mine. There’s plenty of SEO’s who could make Silky Johnson look like Tony Robbins. Don't participate in the hate, and don't feed the trolls.

Very few people are going to come out of the woodwork, and “extol the virtues” of an exact match domain, and put their website under the ever scrutinous eyes of search engineers, and a community that often prefers to focus on failure instead of offering opinion for improvement. As usual, I enjoy being the exception to the rule, and figured I’d pitch in my two cents. 

How to find and buy an EMD (and avoid being a hater)

  1. Type in whois.sc/yourkeyword.com (or .net/.org) – this will redirect you to the DomainTools whois lookup for the targeted phrase
  2. Identify if the domain is owned by a domainer or owner and do some further research
  3. If there is no established website - Write an email and ask if the domain is for sale.
  4. If you get a response – offer approximately 40% of the asking price, or propose one high enough to not offend the seller.
  5. Meet in the middle if the .com is worth it. For a .net/.org, offer 2-10% of your .com price
  6. If there is an established site, check the other metrics, and be prepared to pay much more.
  7. After EMD “death” be prepared to pay more for domains in the aftermarket after their “rebirth”
For more on domaining, check out the domainer myths.
Take my opinion on EMD’s with a grain of salt. No, I didn't test my theories like Pete. This is just my experience. I have bought a fair share of them thinking they were a great buy for future projects, or just to invest in and sell in the aftermarket at a later date. We’ve been warned of the “death of EMD’s” for a long time. I just hope EMD's continue to suffer the same type of death that SEO constantly battles with: one that is curable with creativity, innovation, and execution.

Source : http://www.seomoz.org/blog/

EMD - The Exact Match Domain and Best Practices


EMD and domain best practices

  1. Always be willing to spend 10-15% of your overall budget on the BEST domain name you can get. It will make a big difference in both the short and long run. Dive into the aftermarket, and send some emails.
  2. Skip the second level TLD’s - .mobi / .travel / .info aren’t worth it.
  3. No more than one dash in your domain (better to just skip dash domains altogether)
  4. 3-4 words max for .com EMD’s
  5. 2-3 words max for .net/.org EMD’s
  6. Best to build a Brand site on a keyword domain so you get both brand mentions and generic intent keywords (see Toys.com owned by ToysRus.com and associates)
  7. Geo-local EMD’s are great to own, and offer lower barriers to entry
  8. You're going to have to focus some efforts on "de-optimization"
Marauder Sport Fishing
As the proud owner of MiamiFishing.com (no, I’m not a retired fisherman, but thanks for asking) and other exact match domains, I can say that there are both pros and cons to EMD's. I saw a few sites of my own pay the price for “over optimization” during Penguin. It's hard to always know how aggressive to be, and how far G is going to turn the "filter knobs." In a time when disavowing, delinking, and de-optimization seem to be the valid strategies, it's safe to say you should probably take a more conservative approach to your organic ranking strategy.
SEO factors aside, there's something valuable about having your domain name "say on the box" exactly what you do when you put it on a hat, t-shirt, or sign. There's a lot of implied credibility in a .com EMD (and even to some extent .net and .org).
After years of being an SEO, it’s sometimes difficult to maintain a TAGFEE mentality and put my own site up on the chopping block for public criticism, but it’s a site I’m also very proud of, and I think it really stands up to the other websites in the vertical in delivering value to our users. Please be gentle. I do believe the Moz has great tenets, but it can be very frightening to put your site up in the crosshairs for people to take aim and fire at, especially when you haven't yet accomplished everything you'd like to do with it. Being optimized or optimal means getting the most you can from the resources at your disposal, and sometimes this isn't enough to create the perfect website (I have others that aren't nearly as pretty).
EMD’s do have their advantages, but they have some disadvantages as well.
Pros of an EMD
  • Great for a startup to gather some relevant longtail traffic
  • Easier to get targeted anchor text
  • Easier to get social mentions with keywords
  • Can dominate a single niche (IE: “Category Killer”)
  • Good for targeting variations in the long tail keyword phrase set
  • Brand mentions and keyword mentions become one and the same
  • They can be very effective for generic commercial intent queries
  • They can be very effective in local search
  • Great way to build startup “bootstrapper” traction
  • Can be an effective strategy with a well built microsite to target a single niche.
  • Some businesses have very limited keyword sets – this is a decent approach in these areas.
Cons of an EMD
  • Limits future brand expansion
  • Can create “brand confusion”
  • You don’t get the same “credit” for brand mentions.
  • Your brand can come off as “generic”
  • It can be harder to claim social media profiles
  • It can be more difficult to associate mentions with your brand
  • Hatorade on your site quality if you outrank competitors
  • More chance of “over-optimization” (seriously, does anyone else hate this phrase as much as I do?)
  • There are a limited amount of them
  • They can be very expensive
  • The effectiveness of these advantages is slowly being neutralized

EMD's and Brand Confusion

One of the main problems facing EMD's is the brand confusion that can come with a keyword domain. It’s HARD to own a very sought after generic commercial intent keyword. Google really doesn't want someone to own a keyword, and for good reason.
Keywords are the new brand. Someone in every vertical is trying to own their generic commercial keywords. Think about the big brands Staples and Office Max; do they really DESERVE to rank better than a well built OfficeChairs.com or OfficeFurnitureOnline.com ?
Generic commercial intent keywords are hard to come by; there’s really not a ton of them around, and they are VERY sought after when you start looking at the search demand curve. It doesn't make sense for a search engine to allow only one advertiser to own a keyword when several can compete to drive prices to a point of maximum profit for G and diminishing returns for advertisers. There will always be competition to be the brand associated with the generic commercial intent keyword. Logic follows that the value of the associated domains should stay pretty strong as well.
This is probably beyond the scope of this post, so I may leave this discussion of "branding" keyword domains for another day, but it is at the crux of the EMD debate. I’ll leave the solutions to the commenters ;). We all know that G is expecting much more out of a website to allow it to remain on their first page these days.
Think there’s a lot of keywords with generic commercial intent? Consider the main ones in each of these categories where G makes the majority of their ad revenues.  The list might not be as long as you think. I'm willing to bet most consultants and agencies here in the Moz community have at least a client or two in each of these major verticals.


So what was the “solution” to the EMD relevance “problem?”

Google engineers have always attempted to “level the playing field” for webmasters. They do a great job in many cases, and provide lots of fantastic tools these days with Google Webmaster Tools. Unfortunately, I don’t really think EMD’s are inherently a bad thing. They were just too large of a competitive advantage for some competitive niches where it was difficult to get targeted keyword anchor text. It's still going to remain difficult to get targeted anchor text in these niches (though it's now much less valuable to do so). EMD’s became a goldrush landgrab for optimizers and domainers when they saw the advantages they provide, and the tactics got used and abused and started to create some relevance problems.
As with all landgrabs, people got greedy. Speculators started creating sites that gave EMD’s a pretty bad rap. Competitors started reporting these websites to Google as S.P.A.M (sites positioned above mine), and users started to complain that the SERPs suck. Speculators started putting up 1-page garbage microsites and ranking for large 2 and 3 word phrases with 3 crappy directory links and a page of outsourced content. The EMD's started to look like those old double-dashed sites, even though the barriers to entry for top search rankings were a bit higher. Those barriers continue to get raised.
You can’t cry about your rankings when you didn’t deserve them in the first place, and honestly you never deserve rankings. You earn rankings, and often lose them. It’s part of the love, joy, and pain that is SEO. As John Andrews says in “You’re Free to Go Home”: "That’s a lot like SEO. You win, you get traffic. You don’t win, you don’t get traffic. It doesn’t matter how you play."

The real issue with EMDs

The main issue behind the suffering relevance was not EMD’s, but the amount of influence that keyword anchor text wielded over the search relevance algorithm. EMD’s just benefitted disproportionately from advantages with targeted anchor text; anchor text carried too much influence even without that added benefit. It’s a whole lot easier to get a link that says “Real Estate” when you’re RealEstate.com than it is to get one when you’re Zillow.com. The same can be said right down to Buy-my-crappy-spyware-cleaner-software.com.
It was much more important to fix the overall issues associated with the anchor text relevancy problems than it was to fix the EMD “problem,” and that’s why we saw the anchor text issues being remedied first with Panda and Penguin (which fixed a slew of other issues as well), before directly fixing EMD issues. There is a lot of potential collateral damage that can occur when deciding whether a keyword domain has enough "brand signals" or "quality factors" to be near the top of the search results for a phrase, so I imagine it's a pretty difficult search relevance area to tackle. The simple fact is many EMD's ARE good, valuable sites that deliver a quality experience to their end users. Can you really take away all the advantage they were wise enough to gain by paying top dollar for a great domain?
As with most important signals, optimizers found a way to take full advantage of benefits that inbound keyword anchor text provided. As with the rest of the history of SEO, we’ve seen a major shift in the importance of anchor text that has sent a lot of SEO’s reeling. If you didn’t see the writing on the wall, you either didn’t pay attention, or didn’t care. Either way, SEO’s who ignored the impending anchor text over-optimization warning bells are now paying the price, and trying to fix mistakes.
Panda and Penguin cured most of the major EMD relevance issues by forcing EMD websites to earn their rankings through achieving acceptable engagement metrics. Think of Panda as a beast that eats sites that don’t give their users what they want. If you don’t hold up to the “relative engagement metrics” within your SERPs, your site gets eaten.
If I were to play “if I were a search relevance engineer” (one of my favorite games), I think I would just set the barriers to entry higher for EMD’s to rank in the short and medium tail keyphrases. I would also validate with user metrics the fact that they deserve to be there. Long ago (in 2005), Google introduced the “sandbox” (or trustbox). The “trustbox” made new websites “guilty until proven innocent” with regards to their page authority unless they demonstrated sufficient signals to be let into the index.
The principles and ideas associated with the trustbox are still very much in effect today. Value to your users creates the trust- and credibility-verifying engagement metrics: high time on site, multiple page views, low bounce rate, and repeat visits. New websites are let into the index more quickly, but the barriers to entry for commercial intent, high-dollar, short and medium tail queries are much higher. Essentially, your user engagement metrics must validate your rankings.
Yes, that was an “Eminememe”, and as Eminem says: “you get one shot, never miss your chance to blow.” When you get your “audition phase” at the top of the search results, your site needs to perform well against other sites in that keyphrase set. Make sure you pass your “audition” instead of puking on your visitor's sweater and telling them it’s value. Positive engagement metrics during your audition phase are as important as quality score is in your PPC campaigns; they can really have an effect on the outcome of your webpage's success.

Positive engagement metrics

  • High time on site
  • Multiple page views
  • Repeat visits
  • Low bounce rates
Not every industry requires 10 minute time on site, and 50% repeat visitors, but some do. These metrics reflect brands and brand signals, which is what G has repeatedly mentioned as their priority for providing quality and relevant sites to users in the search results.   

HOW TO PERFORM WEBSITE AUDITS


Onsite optimization of websites is the most important factor in ranking better in search engine results. You can build all the links you want, but if your website isn’t optimized, Googlebot won’t be able to crawl the site properly. Over the past several years, Google and Bing have released guidelines for webmasters to make websites more crawlable. This is a consolidation of all the guidelines released by Google and Bing over the years to improve onsite optimization. Audits should be performed before the launch of any new website design, and at least twice per year to keep up with changes in industry standards.

Canonicalization and URL Structure

www or non-www, that is the question. Rather than wasting your time rehashing this topic, I have written a handy guide that explains URL structure best practices in full. To summarize the article, there should only be one URL used to get to the homepage of your website. Whether or not you choose to use www is your choice. Just make sure there is only one version, and it’s not followed by an index.php, index.html, etc… URLs should contain words only and no numbers or symbols. The words should be separated by dashes instead of spaces or underscores. All words should contain lower case letters. Each URL should contain a keyword phrase, and should be no longer than 5 words in length overall.
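As a rough illustration of those rules (lowercase words only, dashes as separators, no numbers or symbols, five words or fewer), a small validator might look like this sketch; the limits are the ones stated above, not an official specification:

```python
# Rough sketch: validate a URL slug against the rules described above.
import re

SLUG_PATTERN = re.compile(r"^[a-z]+(?:-[a-z]+)*$")  # lowercase words joined by dashes
MAX_WORDS = 5

def check_slug(slug):
    """Return a list of problems found in a single URL path segment."""
    problems = []
    if not SLUG_PATTERN.match(slug):
        problems.append("should contain only lowercase words separated by dashes")
    if len(slug.split("-")) > MAX_WORDS:
        problems.append("should be no longer than %d words" % MAX_WORDS)
    return problems

if __name__ == "__main__":
    for slug in ("office-chairs", "Office_Chairs",
                 "best-cheap-office-chairs-for-sale-2012"):
        issues = check_slug(slug)
        print(slug, "->", issues or "looks fine")
```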

Content Optimization

Every page on the website should have unique, high-quality content using proper optimization techniques. Homepages should have a bare minimum of one paragraph of keyword-rich content. A good rule of thumb would be around three or more paragraphs. Without quality content, Googlebot will have a tough time determining the relevance a page has to the keyword you want to target. The title of the page should be contained within an h1 tag, and should include a keyword – preferably a keyword similar to the page’s title tag. Use plenty of keywords in the body, but avoid keyword-stuffing. There’s no real rule of thumb about keyword stuffing, but if it looks like keyword spam, it probably will be read as spam by Google.
Each page should employ a content hierarchy to establish the importance of key phrases. Titles should be in h1 tags, and there should be only one per page. Subhead titles should be placed in h2 and h3 tags with a keyword included in them. Bolding and italicizing text within paragraphs will show the importance of the phrase to search engines. Linking to other pages on the website using good, keyword-rich anchor text, as well as to other trusted websites, will also increase the value of the content. Every page on a website should have around one hundred links. If a page contains too many links, it will be penalized for being spammy. Non-navigation links on pages should look natural within the content. Follow this content optimization checklist before publishing content to ensure it is properly optimized.
Cloaking or duplicating content is a huge no-no for SEO. Cloaking content is a very risky practice which will get your website heavily penalized by search engines. Matt Cutts recently did a wonderful video explaining how cloaking works and why it’s bad. Duplicate content on pages is another way to get on Google’s black list. Internet leeches love scraping content from websites and reposting it on other websites to collect advertising money. Google grew smart to these practices, and changed the algorithm to filter out duplicate content. Even having two pages on the same site with duplicate content will get your site penalized.
If you absolutely must have duplicate content on your website, which is necessary in rare cases, use the rel=canonical attribute to tell the spiders to only pay attention to one of the two pages. Copyscape and Plagiarisma are two great tools for finding content-scraping leeches that stole your content. BluePay is a credit card processing company that has fallen victim to content scraping. The bottom half of this competitor’s page was completely scraped from BluePay’s homepage. Scary, isn’t it? To check for duplicate content within your own site, do a search for a snippet of your content within quotation marks. This will show you instances where your content appears twice on the website, and may point you to possible bugs on the site. If you do find another website that has scraped your content, you can fill out a request for Google to remove the content.
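If you do lean on rel=canonical, it is worth confirming the tag says what you think it says. Here is a hedged Python spot-check using the third-party requests library and a simple regex (it assumes rel appears before href in the tag, so treat it as a manual aid, not a crawler):

```python
# Spot check: what canonical URL does a page declare?
# Uses the third-party "requests" package and a simple regex --
# good enough for a manual check, not for parsing arbitrary HTML.
# Note: the regex assumes rel= appears before href= in the link tag.
import re
import requests

CANONICAL_RE = re.compile(
    r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)["\']',
    re.IGNORECASE)

def get_canonical(url):
    html = requests.get(url, timeout=10).text
    match = CANONICAL_RE.search(html)
    return match.group(1) if match else None

if __name__ == "__main__":
    page = "http://www.example.com/some-page/"   # placeholder URL
    canonical = get_canonical(page)
    if canonical is None:
        print("No canonical tag found on", page)
    elif canonical.rstrip("/") != page.rstrip("/"):
        print("Canonical points elsewhere:", canonical)
    else:
        print("Canonical is self-referencing:", canonical)
```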
One last key point to mention about content optimization is advertisements. If your website is oversaturated with advertisements, Google may penalize it. If your website is about GPS tracking for trucks and you have text link ads about online poker, you might get penalized. Avoid creating a mosaic of advertisements on your website. Use ALT text with keywords liberally for all advertisements, links, and images. This will help search engines better understand the content.

Geo Redirects

Geo redirects are a great way for international websites to forward visitors to the correct language version of the site. Keep in mind, though, that they can also be one of the worst SEO mistakes you make. If your site is accessed by going to www.mysite.com and all backlinks point there as well, why would you want it redirecting to www.mysite.com/en/index.html? By automatically redirecting people away from the root domain that all backlinks are pointing to, you will be cutting your link value in half. Create a language selection option on the homepage instead.

Image Optimization

Image optimization is a commonly overlooked practice, but it does carry a lot of SEO value if done properly. Read my article on image optimization best practices for a full guide on optimizing images. This practice is mainly important for getting images indexed in image searches. Add captions and ALT tags using keywords for each image. ALT tags should be no longer than 50 characters. Filenames should include keywords, and should be separated by dashes.
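For a quick automated pass over those rules, something like the following standard-library sketch will flag missing or overlong ALT text and filenames that are not dash-separated keywords; the 50-character limit is the one given above:

```python
# Sketch: audit <img> tags in a saved HTML file for the rules described above
# (ALT text present and <= 50 characters, dash-separated filenames).
import os
import re
from html.parser import HTMLParser

class ImageAuditor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.problems = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        src = attrs.get("src", "")
        alt = attrs.get("alt")
        name = os.path.splitext(os.path.basename(src))[0]
        if not alt:
            self.problems.append("%s: missing ALT text" % src)
        elif len(alt) > 50:
            self.problems.append("%s: ALT text longer than 50 characters" % src)
        if name and not re.match(r"^[a-z0-9]+(?:-[a-z0-9]+)*$", name):
            self.problems.append("%s: filename is not dash-separated keywords" % src)

if __name__ == "__main__":
    auditor = ImageAuditor()
    with open("page.html", encoding="utf-8") as fh:   # any saved page to audit
        auditor.feed(fh.read())
    for problem in auditor.problems or ["No image issues found."]:
        print(problem)
```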

Internal Linking

Each page should contain around one hundred links. You may think that number is outrageously high, but it’s really not hard to achieve. Before getting started, run Xenu Link Sleuth to check for broken links on the site, and fix them. Make sure all internal links are text links. This is especially important with navigational links. Any halfway decent designer will be able to put together some nice text-over-image links with CSS. All category and main pages should be linked throughout the global navigation. Include footer links at the bottom of the page as a secondary form of navigation. This area should link to subcategory, terms of use, social media, privacy statement, sitemap, and contact pages. Be sure to use good anchor text for the subcategory links in the footer. Also be sure to include good links with anchor text throughout the page content. Breadcrumbs are another great form of internal linking, and greatly improve a site’s usability.
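To see how a page measures up against the one-hundred-link figure, a rough counter like this sketch is enough (it counts <a href> tags only and says nothing about anchor text quality):

```python
# Rough count of links on a saved page, to compare against the
# "around one hundred links" guideline above.
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and dict(attrs).get("href"):
            self.count += 1

if __name__ == "__main__":
    counter = LinkCounter()
    with open("page.html", encoding="utf-8") as fh:   # any saved page
        counter.feed(fh.read())
    print("Links found:", counter.count)
```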

Microdata

Microdata is a little-covered topic in relation to SEO, but it does pack a powerful punch. Microdata creates a mode of communication between your content and search engine spiders. You can define the meaning behind sections, elements, and words on each page. This helps build trust with search engines – especially Bing. I have put together a couple of articles on this topic already. If you run an e-commerce site, check out my article on e-commerce microdata. If you run a B2B corporate site, check out my other article about corporate B2B microdata.

Noindex and Nofollow

Depending on the type of website you are building, there may be some pages you specifically do not want indexed or ranked. Any irrelevant pages like terms of use, PDF files, privacy policy, and sitemaps should be nofollowed. Use your own discretion as to other pages you may want to nofollow. Site search results should also be noindexed and nofollowed; if they are left indexable, they will create duplicate content issues.

Title and Meta Tags

Title tags should be unique for every page, and contain a primary keyword phrase that is relevant to the page’s content. If you have room for two keyword phrases, separate them with a comma. If the site is a well-known brand, you may want to include the brand name in the title tag. Title tags should be 50-65 characters in length. If the string is longer than 70 characters, it will be cut off in SERPs.
Never use meta keywords. Meta keywords are used by spammers and scrapers all the time, so Bing has decided to flag all pages using meta keywords as spam. Meta descriptions should be included on every page. They should include a keyword phrase similar to the title tag. Descriptions should not exceed 160 characters in length. Never reuse the same description for two pages.
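Those length rules are easy to script. The sketch below checks a saved page against the thresholds stated in this section (title 50-65 characters, description under 160, no meta keywords); the regexes assume conventionally ordered attributes, so treat it as a spot check rather than a full parser:

```python
# Sketch: check title and meta tags against the length rules described above.
import re

TITLE_RE = re.compile(r"<title[^>]*>(.*?)</title>", re.IGNORECASE | re.DOTALL)
# Assumes name= appears before content=, which is the common order.
DESC_RE = re.compile(
    r'<meta[^>]+name=["\']description["\'][^>]+content=["\']([^"\']*)["\']',
    re.IGNORECASE)
KEYWORDS_RE = re.compile(r'<meta[^>]+name=["\']keywords["\']', re.IGNORECASE)

def audit_head(html):
    issues = []
    title = TITLE_RE.search(html)
    if not title:
        issues.append("missing <title> tag")
    elif not 50 <= len(title.group(1).strip()) <= 65:
        issues.append("title is %d characters (aim for 50-65)"
                      % len(title.group(1).strip()))
    desc = DESC_RE.search(html)
    if not desc:
        issues.append("missing meta description")
    elif len(desc.group(1)) > 160:
        issues.append("meta description exceeds 160 characters")
    if KEYWORDS_RE.search(html):
        issues.append("meta keywords tag present -- remove it")
    return issues

if __name__ == "__main__":
    with open("page.html", encoding="utf-8") as fh:   # any saved page
        for issue in audit_head(fh.read()) or ["Head tags look fine."]:
            print(issue)
```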

Webmaster Tools

Google and Bing Webmaster Tools have a few tools necessary for SEO. The most important tool is the sitemap submission feature. First, you will need to generate a sitemap for your website. Once you have generated a sitemap, upload it to the home directory of your server and submit it through Google and Bing Webmaster Tools. Sitemap submissions will tell search engines that you have new content, and it should be crawled on a regular basis. Submitting RSS feeds to the sitemap will also carry a lot of SEO value since Google and Bing love RSS feeds. Both webmaster tool suites have an option for doing this.
Webmaster tools can also be used to set the crawl rate, check for crawl errors, and to instantly submit the site for indexation. If your content is updated on a daily basis, you may want to change the crawl rate to run Googlebot on a daily basis. Another great tool is the “crawl errors” feature. This will check for errors found while indexing the site. All errors found should be fixed immediately. If you have just launched a new design, site, or just written a very important blog post, you may want to consider using the “Fetch as Googlebot” tool. Bing’s equivalent is called “Submit URLs”. This will instantly index the page you submit and give it a nice bump up in SERPs, but beware – if the click-through rate is very low, that page may get penalized.

Website Architecture

Two or three clicks from the homepage is all it should take to reach any page on a website. Search engines are a lot like human readers when indexing pages. If it takes four clicks to reach a page, Googlebot will place low importance on that page. Likewise, a page reachable directly from the homepage will carry the most weight. When designing a site, determine the importance you want to place on each page, and create a content hierarchy for those pages when planning out your website. Pages containing your top keywords should be placed towards the top of the hierarchy, while less important, long-tail keyword pages should be buried towards the bottom. Now that Google indexes AJAX and JavaScript, there are lots of options for creating a navigation menu where you can implement an effective content hierarchy.
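One way to verify the two-to-three-click rule is a small breadth-first crawl that records each internal page's depth from the homepage. The sketch below uses the third-party requests library plus the standard-library HTML parser, caps itself at a modest page limit, and is only an illustration: it ignores robots.txt, crawl delays, and URL canonicalization edge cases.

```python
# Sketch: breadth-first crawl that records click depth from the homepage,
# so pages deeper than 3 clicks can be flagged. Illustration only --
# no robots.txt handling, no politeness delays, small page cap.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
import requests

class HrefCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.hrefs.append(href)

def crawl_depths(start_url, max_pages=200):
    domain = urlparse(start_url).netloc
    depths = {start_url: 0}
    queue = deque([start_url])
    while queue and len(depths) < max_pages:
        url = queue.popleft()
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        collector = HrefCollector()
        collector.feed(html)
        for href in collector.hrefs:
            link = urljoin(url, href).split("#")[0]
            if urlparse(link).netloc == domain and link not in depths:
                depths[link] = depths[url] + 1
                queue.append(link)
    return depths

if __name__ == "__main__":
    results = crawl_depths("http://www.example.com/")   # placeholder start URL
    for page, depth in sorted(results.items(), key=lambda item: item[1]):
        flag = "  <-- deeper than 3 clicks" if depth > 3 else ""
        print(depth, page, flag)
```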

Website Design Mistakes

Three common web design errors that will wreck your website’s crawlability are frame usage, using all-Flash, and not setting up 301 redirects after moving content. Designers are not SEO professionals, so cut them some slack. Although, it is imperative that you review their plans before they implement any changes. If your designer wants to use frames, ask them if they plan on using Comic Sans while they’re at it. Much like Comic Sans, frames are completely outdated from a design and SEO standpoint. Frames require three html files to create the design while normal sites only use one html file. This confuses the heck out of spiders when crawling your site. Most of the time, none of the pages will end up getting indexed.
Flash is an eye-catching way to build a website, but if you plan on building an all-Flash site, then you might as well go sit in the corner, talking to a brick wall because that’s what you would be doing to search engines. Flash cannot be crawled, so don’t implement any site elements in Flash. Flash should be used sparingly, and for decorative purposes only. Another thing to watch out for is whether or not your designers set up 301 redirects after moving around any content. If they did, make sure they used 301’s instead of 302 redirects. If a page has already been indexed and gets moved without a redirect set up, it will come up as a 404 error, which will hurt the indexation of your site.

Website Security

Providing a safe and secure experience to your website’s audience will not only make them feel more confident in your site, but also search engines. If your company owns several domains, make sure none of the content is duplicated across the domains. If you run an e-commerce site, make sure a secure server version of the checkout process is available. Set up robots.txt to block certain pages from being crawled by spiders, but be careful not to block any pages that will hurt your internal linking structure. Check your website to make sure it isn’t infected by malware. Google Webmaster Tools has a free malware diagnostic tool. Web page errors will also cause search engines to mistrust your website. Run your site through the W3C Markup Validator to check for bugs that may cause indexation issues.
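The robots.txt point cuts both ways: it is just as easy to accidentally block pages you need crawled. The Python standard library ships a robots.txt parser, so a quick sanity check might look like this sketch (the site and paths are placeholders):

```python
# Sanity check: confirm robots.txt is not blocking pages that matter.
# The site and paths below are placeholders for your own important pages.
from urllib.robotparser import RobotFileParser

SITE = "http://www.example.com"
IMPORTANT_PATHS = ["/", "/category/office-chairs/", "/blog/", "/contact/"]

def check_robots(site, paths, user_agent="Googlebot"):
    parser = RobotFileParser()
    parser.set_url(site + "/robots.txt")
    parser.read()   # fetches and parses the live robots.txt
    for path in paths:
        allowed = parser.can_fetch(user_agent, site + path)
        print("%-40s %s" % (path, "crawlable" if allowed else "BLOCKED"))

if __name__ == "__main__":
    check_robots(SITE, IMPORTANT_PATHS)
```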

Summary

When performing a site audit, it’s helpful to create a checklist of violations to watch out for on the website. You can create one using this article, or you can use SEOmoz’s pre-made site audit checklist. Keep in mind that SEOmoz’s checklist is a few years old and is missing a few elements. Once the audit is complete, you may find it helpful to write up a list of structural recommendations with brief descriptions to help the client understand what you want to do for them.
Author : Harrison Jones is an SEO Analyst at Straight North

Source : http://www.techipedia.com
