SearchEngineLand shares its “worst & best practices in SEO” in two parts. I have collected them all in this post, and I believe this is the most up-to-date SEO checklist of 2010.
Many consider search engine optimization as a sort of black box. But once the essential features of a search engine optimal website are laid out in a concise list, SEO is not nearly as mystifying.
That’s where these checklists come in. They are designed for web marketers and web developers so that they can easily understand SEO and start tackling it. You can read a full description of each best and worst practice at the end of this article, after the two checklists.
Worst practices in SEO
Partially indexed, poorly ranked, penalized and possibly banned: such is the unpleasant fate of a website that’s not duly optimized for search engines. Even if you’ve mastered all the “best practices”, your site may not be safe.
The mission of search engines is to supply their visitors with relevant results, so penalizing or banning sites that appear to interfere with that mission is a necessity. Understanding which practices adversely impact your search engine rankings is a prerequisite to a well-optimized site.
Whether inadvertent or not, any of the following worst practices could doom your site to suboptimal traffic levels. Here are 29 critical “must nots” in SEO (this is not a comprehensive list, by the way):
Worst Practice | N/A | Will stop | Won’t stop |
---|---|---|---|
1. Do you use pull-down boxes for navigation? | |||
2. Does your primary navigation require Flash, Java or JavaScript to function? | |||
3. Is your web site done entirely in Flash or overly graphical with very little textual content? | |||
4. Is your home page a “splash page” or otherwise content-less? | |||
5. Does your site employ frames? | |||
6. Do the URLs of your pages include “cgi-bin” or numerous ampersands? | |||
7. Do the URLs of your pages include session IDs or user IDs? | |||
8. Do you unnecessarily spread your site across multiple domains? | |||
9. Are your title tags the same on all pages? | |||
10. Do you have pop-ups on your site? | |||
11. Do you have error pages in the search results (“session expired”, etc.)? | |||
12. Does your File Not Found error return a 200 status code? | |||
13. Do you use “click here” or any other superfluous copy for your hyperlink text? | |||
14. Do you have superfluous text like “Welcome to” at the beginning of your title tags? | |||
15. Do you unnecessarily employ redirects, or are they the wrong type? | |||
16. Do you have any hidden or small text meant only for the search engines? | |||
17. Do you engage in “keyword stuffing”? | |||
18. Do you have pages targeted to obviously irrelevant keywords? | |||
19. Do you repeatedly submit your site to the search engines? | |||
20. Do you incorporate your competitors’ brand names in your meta tags? | |||
21. Do you have duplicate pages with minimal or no changes? | |||
22. Does your content read like “spamglish”? | |||
23. Do you have “doorway pages” on your site? | |||
24. Do you have machine-generated pages on your site? | |||
25. Are you “pagejacking”? | |||
26. Are you cloaking? | |||
27. Are you submitting to FFA (“Free For All”) link pages and link farms? | |||
28. Are you buying expired domains with high PageRank scores to use as link targets? | |||
29. Are you presenting a country selector as your home page to Googlebot? | |||
Worst practices explained
- Do you use pull-down boxes for navigation? Search engine spiders can’t fill out forms, even short ones with just one pull-down. Thus, they can’t get to the pages that follow. If you’re using pull-downs, make sure there is an alternate means of navigating to those pages that the spiders can use. Note this is not the same as a mouseover menu, where sub-choices show up upon hovering over the main navigation bar; that’s fine if done using CSS (rather than JavaScript).
- Does your primary navigation require Flash, Java or JavaScript? If you rely on search engine spiders executing Flash, Java or JavaScript code in order to access links to deeper pages within your site, you’re taking a big risk. The search engines have a limited ability to deal with Flash, Java and JavaScript. So the links may not be accessible to the spiders, or the link text may not get associated with the link. Semantically marked-up HTML is always the most search engine friendly way to go.
- Is your site done entirely in Flash or overly graphical with very little textual content? Text is always better than graphics or Flash animations for search engine rankings. Page titles and section headings should be text, not graphics. The main textual content of the page should ideally not be embedded within Flash. If it is, then have an alternative text version within div tags and use SWFObject to determine whether that text is displayed based on whether the visitor has the Flash plugin installed.
- Is your home page a “splash page” or otherwise content-less? With most websites, as mentioned above, the home page is weighted by the search engines as the most important page on the site (i.e., given the highest PageRank score). Thus, having no keyword-rich content on your home page is a missed opportunity.
- Does your site employ frames? Search engines have problems crawling sites that use frames (i.e., where part of the page moves when you scroll but other parts stay stationary). Google advises not using frames: “Frames tend to cause problems with search engines, bookmarks, emailing links and so on, because frames don’t fit the conceptual model of the Web (every page corresponds to a single URL).” Furthermore, if a frame does get indexed, searchers clicking through to it from search results will often find an “orphaned page”: a frame without the content it framed, or content without the associated navigation links in the frame it was intended to display with. Often, they will simply find an error page. What about “iframes”, you ask? iframes are better than frames for a variety of reasons, but the content within an iframe on a page still won’t be indexed as part of that page’s content.
- Do the URLs of your pages include “cgi-bin” or numerous ampersands? As discussed, search engines are leery of dynamically generated pages. That’s because they can lead the search spider into an infinite loop called a “spider trap.” Certain characters (question marks, ampersands, equals signs) and “cgi-bin” in the URL are sure-fire tip-offs to the search engines that the page is dynamic and thus to proceed with caution. If the URLs have long, overly complex “query strings” (the part of the URL after the question mark), with a number of ampersands and equals signs (which signify that there are multiple variables in the query string), then your page is less likely to get included in the search engine’s index.
- Do the URLs of your pages include session IDs or user IDs? If your answer to this question is yes, then consider this: search engine spiders like Googlebot don’t support cookies, and thus the spider will be assigned a new session ID or user ID on each page of your site that it visits. This is the proverbial “spider trap” waiting to happen. Search engine spiders may just skip over these pages. If such pages do get indexed, there will be multiple copies of the same pages, each taking a share of the PageRank score, resulting in PageRank dilution and lowered rankings. If you’re not quite clear on why your PageRank scores will be diluted, think of it this way: Googlebot will find minimal links pointing to the exact version of a page with a particular session ID in its URL.
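If your platform insists on putting these IDs in URLs, one mitigation is to normalize every internal link to a single canonical form. Below is a rough Python sketch; the parameter names are invented examples, not a definitive list:

```python
# A rough sketch: strip session/user-ID query parameters so each page has one
# canonical URL. The parameter names below are invented examples; adjust them
# to whatever your platform actually emits.
from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

SESSION_PARAMS = {"sessionid", "sid", "userid", "phpsessid"}  # assumed names

def canonicalize(url: str) -> str:
    """Return the URL with any session/user-ID parameters removed."""
    parts = urlparse(url)
    kept = [(key, value) for key, value in parse_qsl(parts.query)
            if key.lower() not in SESSION_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(canonicalize("http://www.example.com/product?id=42&sessionid=a1b2c3"))
# -> http://www.example.com/product?id=42
```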
- Do you unnecessarily spread your site across multiple domains? This is typically done for load balancing purposes. For example, the links on the JCPenney.com home page point off to www2.jcpenney.com, or www3.jcpenney.com, or www4.jcpenney.com and so on, depending on which server is the least busy. This dilutes PageRank in a way similar to how session IDs in the URL dilute PageRank.
- Are your title tags the same on all pages? Far too many websites use a single title tag for the entire site. If your site falls into that group, you’re missing out on a lot of search engine traffic. Each page of your site should “sing” for one or several unique keyword themes. That “singing” is stifled when the page’s title tag doesn’t incorporate the particular keyword being targeted.
- Do you have pop-ups on your site? Most search engines don’t index Javascript-based pop-ups, so the content within the pop-up will not get indexed. If that’s not good enough reason to stop using pop-ups, you should know that people hate them – with a passion. Also consider that untold millions of users have pop-up blockers installed. (The Google Toolbar and Yahoo Companion toolbar are pop-up blockers, too, in case you didn’t know.)
- Do you have error pages in the search results (“session expired” etc.)? First impressions count . . . a lot! So make sure search engine users aren’t seeing error messages in your search listings. Hotmail took the cake in this regard, with a Google listing for its home page that, for years, began with: “Sign-In Access Error.” Not exactly a useful, compelling or brand-building search result for the user to see. Check to see if you have any error pages by querying Google, Yahoo and Bing for site:www.yourcompanyurl.com. Eliminate error pages from the search engine’s index by serving up the proper status code in the HTTP header (see below) and/or by including a meta robots noindex tag in the HTML.
- Does your “file not found” error page return a 200 status code? This is a corollary to the tip immediately above. Before the content of a page is served up by your Web server, an HTTP header is sent, which includes a status code. A status code of 200 is what’s usually sent, meaning that the page is “OK.” A status code of 404 means that the requested URL was not found. Obviously, a file not found error page should return a 404 status code, not a 200. You can verify whether this is the case using a server header checker: into its form, input a bogus URL at your domain, such as http://www.yourcompanyurl.com/blahblah. An additional, and even more serious, consequence of a 200 being returned with URLs that are clearly bogus/non-existent is that your site will look less trustworthy to Google (Google does check for this). Note that there are other error status codes that may be more appropriate to return than a 404 in certain circumstances, like a 403 if the page is restricted or a 503 if the server is overloaded and temporarily unavailable; a 200 (or a 301 or 302 redirect that points to a 200) should never be returned, regardless of the error, to ensure the URL with the error does not end up in the search results.
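For a quick programmatic check, a few lines of Python can confirm what a bogus URL actually returns. This is a minimal sketch using the same placeholder domain as above; note that urlopen follows redirects, so a 301 or 302 chain that ends at a 200 will also report 200, which is precisely the failure mode to catch:

```python
# Minimal sketch: confirm that a bogus URL on your domain returns a 404 rather
# than a 200. The domain is the same placeholder used in the text above.
import urllib.request
import urllib.error

def status_of(url: str) -> int:
    """Return the HTTP status code for a URL."""
    request = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(request) as response:
            return response.status
    except urllib.error.HTTPError as error:
        return error.code  # 4xx/5xx responses arrive as exceptions

status = status_of("http://www.yourcompanyurl.com/blahblah")
print(f"Status: {status}")
if status == 200:
    print("Problem: a non-existent URL is answering 200 OK.")
```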
- Do you use “click here” or other superfluous copy for your hyperlink text? Wanting to rank tops for the words “click here,” eh? Try some more relevant keywords instead. Remember, Google associates the link text with the page you are linking to, so make that anchor text count.
- Do you have superfluous text like “Welcome To” at the beginning of your title tags? No one wants to be top ranked for the word “welcome” (except maybe the Welcome Inn chain!), so remove those superfluous words from your title tags!
- Do you unnecessarily employ redirects, or are they the wrong type? A redirect is where the URL changes automatically while the page is still loading in the user’s browser. Temporary (status code of 302) redirects — as opposed to permanent (301) ones — can cost you valuable PageRank. That’s because temporary redirects don’t pass PageRank to the destination URL. Links that go through a click-through tracker first tend to use temporary redirects. Don’t redirect visitors when they first enter your site at the home page; but if you must, at least employ a 301 redirect. Whether 301 or 302, if you can easily avoid using a redirect altogether, then do that. If you must have a redirect, avoid having a bunch of redirects in a row; if that’s not possible, then ensure that there are only 301s in that chain. Most importantly, avoid selectively redirecting human visitors (but not spiders) immediately as they enter your site from a search engine, as that can be deemed a “sneaky redirect” and can get you penalized or banned.
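To audit an existing redirect, you can walk the chain hop by hop and confirm that every hop is a 301. A rough sketch (the start URL is a placeholder):

```python
# A rough sketch: walk a redirect chain hop by hop, printing each status code
# and flagging non-301 redirects. The start URL is a placeholder.
import http.client
from urllib.parse import urlparse, urljoin

def trace_redirects(url: str, max_hops: int = 10) -> None:
    for _ in range(max_hops):
        parts = urlparse(url)
        conn_class = (http.client.HTTPSConnection if parts.scheme == "https"
                      else http.client.HTTPConnection)
        conn = conn_class(parts.netloc)
        conn.request("HEAD", parts.path or "/")
        response = conn.getresponse()
        print(response.status, url)
        location = response.getheader("Location")
        conn.close()
        if response.status in (301, 302, 303, 307, 308) and location:
            if response.status != 301:
                print("  warning: temporary redirect, consider a 301")
            url = urljoin(url, location)  # Location may be relative
        else:
            return

trace_redirects("http://www.yourcompanyurl.com/old-page")
```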
- Do you have any hidden or small text meant only for the search engines? It may be tempting to obscure your keywords from visitors by using tiny text that is too small for humans to see, or as text that is the same color as the page background. However, the search engines are on to that trick.
- Do you engage in “keyword stuffing”? Putting the same keyword everywhere, such as in every ALT attribute, is just asking for trouble. Don’t go overboard with repeating keywords or adding a meta keywords tag that’s hundreds of words long. (Why even have a meta keywords tag? It doesn’t help with SEO; it only helps educate your competitors on which keywords you are targeting.) Google warns not to hide keywords in places that aren’t rendered, such as comment tags. A good rule of thumb to operate under: if you’d feel uncomfortable showing a Google employee what you’re doing, you shouldn’t be doing it.
- Do you have pages targeted to obviously irrelevant keywords? Just because “britney spears” is a popular search term doesn’t mean it’s right for you to be targeting it. Relevancy is the name of the game. Why would you want to be number one for “britney spears” anyway? The bounce rate for such traffic would be terrible.
- Do you repeatedly submit your site to the engines? At best this is unnecessary. At worst this could flag your site as spam, since spammers have historically submitted their sites to the engines through the submission form (usually multiple times, using automated tools, and without consideration for whether the site is already indexed). You shouldn’t have to submit your site to the engines; their spiders should find you on their own — assuming you have some links pointing to your site. And if you don’t, you have bigger issues: like the fact your site is completely devoid of PageRank, trust and authority. If you’re going to submit your site to a search engine, search for your site first to make sure it’s not already in the search engine’s index and only submit it manually if it’s not in the index. Note this warning doesn’t apply to participating in the Sitemaps program; it’s absolutely fine to provide the engines with a comprehensive Sitemaps XML file on an ongoing basis (learn more about this program at Sitemaps.org).
- Do you incorporate your competitors’ brand names in your meta tags? Unless you have their express permission, this is a good way to end up at the wrong end of a lawsuit.
- Do you have duplicate pages with minimal or no changes? The search engines won’t appreciate you purposefully creating duplicate content to occupy more than your fair share of available positions in the search results. Note that a dynamic (database-driven) website inadvertently offering duplicate versions of pages to the spiders at multiple URLs is not a spam tactic, as it is a common occurrence for dynamic websites (even Google’s own Googlestore.com suffers from this), but it is something you would want to minimize due to the PageRank dilution effects.
- Does your content read like “spamglish”? Crafting pages filled with nonsensical, keyword-rich gibberish is a great way to get penalized or banned by search engines.
- Do you have “doorway pages” on your site? Doorway pages are pages designed solely for search engines that aren’t useful or interesting to human visitors. Doorway pages typically aren’t linked to much from other sites or much from your own site. The search engines strongly discourage the use of this tactic, quite understandably.
- Do you have machine-generated pages on your site? Such pages are usually devoid of meaningful content. There are tools that churn out keyword-rich doorway pages for you, automatically. Yuck! Don’t do it; the search engines can spot such doorway pages.
- Are you “pagejacking”? “Pagejacking” refers to hijacking or stealing high-ranking pages from other sites and placing them on your site with few or no changes. Often, this tactic is combined with cloaking so as to hide the victimized site’s content from search engine users. The tactic has evolved over the years; for example, “auto-blogs” are completely pagejacked content (lifted from RSS feeds). Pagejacking is a big no-no! Not only is it very unethical, it’s illegal; and the consequences can be severe.
- Are you “cloaking”? “Cloaking” is the tactic of detecting search engine spiders when they visit and varying the content specifically for the spiders in order to improve rankings. If you are in any way selectively modifying the page content, this is nothing less than a bait-and-switch. Search engines have undercover spiders that masquerade as regular visitors to detect such unscrupulous behavior. (Note that cleaning up search engine unfriendly URLs selectively for spiders, like Yahoo.com does on their home page by dropping their ylt tracking parameter from all their links, is a legitimate tactic.)
- Are you submitting to FFA (“Free For All”) link pages and link farms? Search engines don’t think highly of link farms and such, and may penalize you or ban you for participating in them. How can you tell link farms and directories apart from each other? Link farms are poorly organized, have many more links per page, and have minimal editorial control.
- Are you buying expired domains with high PageRank scores to use as link targets? Google underwent a major algorithm change a while back to thwart this tactic. Now, when domains expire, their PageRank scores are reset to 0, regardless of how many links point to the site.
- Are you presenting a country selector as your home page to Googlebot? Global corporations sometimes present first-time visitors with a list of countries and/or languages to choose from upon entry to their site. An example of this is at EMC.com. This becomes a “worst practice” when this country list is represented to the search engines as the home page. Happily, EMC has done its homework on SEO and is detecting the spiders and waving them on. In other words, Googlebot doesn’t have to select a country before entry. You can confirm this yourself: do a Google search for “cache:www.emc.com” and you will see EMC’s U.S. home page.
If you’ve read this and thought, “Hmm, that was interesting” but you didn’t actually tick any marks on the above checklists, then you have extracted only a fraction of this article’s value. The simple action of printing out the checklists and checking the appropriate boxes one by one is the first step to doing things differently. Remember: if you always do what you’ve always done, you’ll always get what you’ve always gotten.
If you adhere to the advice laid out for you above, and follow the checklist of SEO best practices that comes next, you’ll be well on your way to a search engine optimal website. Go astray, and your rankings and perhaps even your reputation with the search engines could suffer.
Checklists are just the beginning on the path to SEO success. It’s important to engage with an SEO expert to help guide your organization through the changes necessary to optimize your site.
Best Practices
Implementing the 14 best practices below (or at least some of them) and avoiding the worst practices should offer you a straightforward approach to better visibility in search engines, including Google, Yahoo!, and Bing.
Best Practice | Doing now | Will do soon | Won’t or N/A |
---|---|---|---|
1. Are the keywords you are targeting relevant and popular with searchers? | |||
2. Do your page titles lead with your targeted keywords? | |||
3. Is your body copy of sufficient length and keyword-rich? | |||
4. Does the anchor text pointing to various pages within your site include good keywords? | |||
5. Do you employ text links from your home page to your most important secondary pages? | |||
6. If you must have graphical navigation, do you use the CSS image replacement technique as a workaround, and do those graphics have descriptive and keyword-rich ALT attributes that are useful for both humans and engines? | |||
7. Does your Web site have an XML Sitemap, as well as an HTML site map with text links? | |||
8. Are the URLs of your dynamic (database driven) pages short, simple, and static-looking? | |||
9. Do your home page and other key pages of your site have sufficient PageRank (link authority)? | |||
10. Does your site have an optimized internal linking structure? | |||
11. Do your pages have keyword-rich meta descriptions with a compelling call to action? | |||
12. Does your site have a custom error page that returns the correct “status code”? | |||
13. Do your filenames and directory names include targeted keywords? | |||
14. Are you actively building links to your Web site? | |||
Curious about the importance or relevance of some of the questions on the checklists? Read on for full descriptions of the implications of these questions.
Best Practices Explanations
- Are the keywords that you are targeting not only relevant but also popular with searchers? There is no point going after high rankings for keywords that no one searches for. Compare relative popularity of keywords using Google’s free tools (Google AdWords Keyword Tool and Google Insights for Search) and/or paid tools like KeywordDiscovery.com and WordTracker.com before deciding what keywords to employ on your Web pages. Despite the popularity of individual words, it’s best to target two- or three-word phrases (or even longer). Because of the staggering number of Web pages indexed by the major search engines, competing for a spot on the first or second page of search results on a one-word keyword will typically be a losing battle (unless you have killer link authority). This should go without saying, but the keywords you select should be relevant to your business.
- Do your page titles lead with your targeted keywords? The text within your page title (a.k.a. the title tag) is given more weight by the search engines than any other text on the page. The keywords at the beginning of the title tag are given the most weight. Thus, by leading with keywords that you’ve chosen carefully, you make your page appear more relevant to those keywords in a search.
- Is your body copy of sufficient length and keyword-rich? Ideally, incorporate at least several hundred words on each page so there’s enough “meat” there for the search engines to sink their teeth into and determine a keyword theme of the page. Include relevant keywords high up in the page, where they will be weighted more heavily by the search engines than keywords mentioned only at the bottom of the page, where it’s almost like an afterthought. This is known as keyword prominence. Think in terms of keyword prominence in the HTML, not the rendered page on the screen; Google doesn’t realize that something is at the top of the third column if it appears low in the HTML. Be careful not to go overboard to the point that your copy doesn’t read well; that’s called “keyword stuffing” and is discussed above, under “Worst Practices.”
- Does the anchor text pointing to various pages within your site include good keywords? Google, Yahoo, and Bing all associate the anchor text in the hyperlink as highly relevant to the page being linked to. So, use good keywords in the anchor text to help the engine better ascertain the theme of the page you are linking to. Keep the link text relatively succinct and tightly focused on just one keyword or key phrase. The longer the anchor text, the more diluted the overall theme conveyed to the engine.
- Do you employ text links from your home page to your most important secondary pages? Text links are, by far, the better option over ALT attributes in conveying to the search engine the context of the page to which you are linking. (An ALT attribute is the text that appears in a small box when you hover your cursor over an image.) ALT attributes can have an effect, but it’s small in comparison with that of text links. If you have graphical navigation buttons, switch them to keyword-rich text links; if that’s not an option, at least include text link navigation repeated elsewhere on the page, such as in the footer (note however that footer links are partially devalued), or consider the CSS image replacement technique, described below.
- If you must have graphical navigation, do you use the CSS image replacement technique as a workaround, and do those graphics have descriptive and keyword-rich ALT attributes that are useful for both humans and search engines? Image replacement is a technique that employs CSS (Cascading Style Sheets) to substitute in replacement copy and HTML – such as a text link or heading tag – when the stylesheet is not loaded (as is the case when the search engine spiders come to visit). The text-based replacement is weighted more heavily by the engines than the IMG ALT attribute — thus it is preferable to relying solely on the ALT attribute. Of the many ways to implement the image replacement technique, most use CSS to physically move the text off the screen (text-indent: -9999em; left: -9999em; display: none, etc.), which is not ideal because the search engines may discount this as hidden text. Important: resist the temptation to work additional keywords or text into the text replacement, or your site may be hit with a penalty. A few CSS image replacement methods exist that are preferable because they don’t physically move the content off-page and are still accessible, namely the Leahy/Langridge method, the Gilder/Levin method and the ‘Shea Enhancement’. It is still useful to have ALT attributes on your images, more for usability/accessibility than for SEO. ALT attributes should contain relevant keywords that convey the key information from the image that the user would not receive if she had image loading turned off.
- Does your Web site have an XML Sitemap, as well as an HTML site map with text links? An XML Sitemap file provides the search engines with a comprehensive list of all the URLs corresponding to the pages/documents contained on your website. This helps ensure all of your pages end up getting indexed by the search engines. But the XML Sitemap is more than just a list of URLs; it can include additional information about each URL, such as the page’s last modified date and priority (which can impact how frequently the page is visited by the search engine spiders and thus how quickly it is refreshed). It’s a best practice to also include the location of your sitemap file(s) in your site’s robots.txt, so that the search engines can “autodiscover” the sitemaps on their own without you having to specify the location of the file(s) in each search engine’s Webmaster Center. An HTML sitemap is a different thing altogether. It’s simply a page on your website with links to all your important pages, displayed usually in a hierarchical fashion. A link to the sitemap is typically present in the footer of every page of the site. HTML sitemaps have long been touted as good “spider food” because they provide the search engine spiders with links to key pages to explore and index. Use text links, since they are more search engine optimal than graphical links, as already mentioned. Bear in mind that you should ideally try to stay within 100 links per page, as a recommended best practice by Google (this is a rough guideline, not a hard and fast rule). That may mean breaking up your site map into multiple HTML pages.
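For a small site, the XML Sitemap can even be produced with a short script. Here is a minimal sketch; the URLs, dates, and priorities are invented placeholders, and the closing comment shows the robots.txt autodiscovery line described above:

```python
# Minimal sketch: write a tiny XML Sitemap for a handful of pages. The URLs,
# dates, and priorities are invented placeholders.
from xml.sax.saxutils import escape

pages = [
    ("http://www.yourcompanyurl.com/", "2010-12-01", "1.0"),
    ("http://www.yourcompanyurl.com/products/", "2010-11-15", "0.8"),
    ("http://www.yourcompanyurl.com/about/", "2010-10-02", "0.5"),
]

lines = ['<?xml version="1.0" encoding="UTF-8"?>',
         '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
for loc, lastmod, priority in pages:
    lines += ["  <url>",
              f"    <loc>{escape(loc)}</loc>",
              f"    <lastmod>{lastmod}</lastmod>",
              f"    <priority>{priority}</priority>",
              "  </url>"]
lines.append("</urlset>")

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write("\n".join(lines))

# For autodiscovery, robots.txt would then carry a line such as:
# Sitemap: http://www.yourcompanyurl.com/sitemap.xml
```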
- Are the URLs of your dynamic (database-driven) pages short, simple and static-looking? Pages with URLs that contain a question mark and numerous ampersands and equals signs aren’t as palatable to the search engines as simple, static-looking URLs. Either install a server module/plug-in that allows you to “rewrite” your URLs, or recode your site to embed your variables in the path info instead of the query string; or, if you need to minimize resource requirements by your IT team, you can enlist a “proxy serving” solution such as Organic Search Optimizer. I’ve written about this at length in this two-part article. Another, oft-neglected aspect of URL optimization is making them short for improved click-through from the search results. In my previous article on URL optimization I discussed an interesting study by MarketingSherpa that found that short URLs get clicked on twice as often as long URLs in the Google SERPs.
- Do your home page and other key pages of your site have sufficient PageRank (link authority)? PageRank is Google’s way of quantifying the importance of a Web page. Put another way, it’s as much about the quality of the links pointing to a given Web page as it is about the quantity (more so, actually). PageRank has been the cornerstone of Google’s ranking algorithm since the beginning. The more important (PageRank-endowed) pages wield more voting power; the page’s “vote” gets divvied up among all the links on the page and passed on to those pages. Of course, this is a massive over-simplification, and the PageRank algorithm has evolved over the years to include such things as trust and authority to stay ahead of the spammers. Nonetheless, a form of PageRank is still in use today by Google. You can check Google PageRank scores using the Google Toolbar. Mouse over the toolbar’s PageRank meter to display the numerical rating, an integer value between 0 and 10. Yahoo’s importance-scoring equivalent to PageRank has been referred to internally as both LinkFlux and Yahoo! Web Rank at various times. It’s best to refer to the PageRank-like algorithms of the three major engines more generally as “link authority,” “link equity,” or “link juice”. The PageRank scores delivered by Google’s toolbar server are on a logarithmic scale, meaning that integer increments are not evenly spaced. Thus, garnering more links and gaining in PageRank score from 3 to 4 is easy, but from 6 to 7 is a lot harder. Also bear in mind that the PageRank displayed in the Google Toolbar is not the same PageRank as what is used by Google’s ranking algorithm. In fact, the correlation between the two PageRanks has degraded over time. Potentially a better predictor of your true PageRank score is the “mozRank” score available from Linkscape. mozRank approximates Google PageRank using a sophisticated algorithm and an index of 30+ billion pages. mozRank scores are also on a logarithmic scale. A PageRank or mozRank score for your home page of 7 or 8 is a laudable goal. Note that each page has its own PageRank score. Because most of the inbound links your site has garnered point to the home page, your home page almost invariably ends up being the highest PageRank-endowed page of your site. The PageRank that has accumulated on your home page is passed to your internal pages through your internal linking structure. Bottom line: if a given page on your site doesn’t have enough PageRank (I’m referring to the super-secret, internal PageRank that Google doesn’t share with us SEOs via the Toolbar), then it doesn’t deserve to rank.
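To make the “divvied up vote” idea concrete, here is a toy Python run of the simplified, textbook PageRank iteration over a made-up four-page site. It is an illustration of the concept only, not Google's production formula:

```python
# A toy illustration of the "divvied up vote" described above: each page's
# score is split evenly across its outgoing links and passed along, repeatedly.
# The four-page link graph is made up.
links = {
    "home":     ["products", "about", "blog"],
    "products": ["home"],
    "about":    ["home", "products"],
    "blog":     ["home", "products"],
}

damping = 0.85
pages = list(links)
rank = {page: 1.0 / len(pages) for page in pages}

for _ in range(50):  # iterate until the scores settle
    new_rank = {page: (1 - damping) / len(pages) for page in pages}
    for page, outlinks in links.items():
        share = rank[page] / len(outlinks)  # each link gets an equal share
        for target in outlinks:
            new_rank[target] += damping * share
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda item: -item[1]):
    print(f"{page:8s} {score:.3f}")
```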
- Does your site have an optimized internal linking structure? Your site’s hierarchical internal linking structure conveys to the search engines how important you consider each page of your site, comparatively. This of course impacts these pages’ PageRank scores and ultimately their Google rankings. The deeper down a page is in the site tree (i.e., the more clicks away the page in question is from the home page), the less PageRank with which that page will be endowed. Therefore, it’s critical you think carefully about how you spend that hard-earned PageRank, i.e., where and how you link from your home page and from your site-wide navigation to the rest of your site. Generally speaking, the deeper in your hierarchy you hide key content, the less important that content appears to the search engines — if they even find it (which is not a given if it’s very deep). As an aside, this concept applies not only to your linking structure but also to your URL structure: too many slashes in the URL (i.e., too many sub-directories deep) and you convey to the engines that the page is unimportant. A flat directory structure, where you minimize the number of slashes in the URL, helps ensure more pages of your site get indexed.
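Click depth is easy to measure on your own site: a breadth-first walk from the home page gives each page's distance in clicks. A minimal sketch over an invented link graph:

```python
# Minimal sketch: measure each page's "click depth" from the home page with a
# breadth-first walk. The internal-link graph below is invented; a real audit
# would build it by crawling your own site.
from collections import deque

internal_links = {
    "/": ["/products/", "/about/"],
    "/products/": ["/products/widgets/"],
    "/products/widgets/": ["/products/widgets/blue/"],
    "/about/": [],
    "/products/widgets/blue/": [],
}

depth = {"/": 0}
queue = deque(["/"])
while queue:
    page = queue.popleft()
    for target in internal_links.get(page, []):
        if target not in depth:  # first time this page is reached
            depth[target] = depth[page] + 1
            queue.append(target)

for page, clicks in sorted(depth.items(), key=lambda item: item[1]):
    print(f"{clicks} click(s) from home: {page}")
```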
- Do your pages have keyword-rich meta descriptions with a compelling call to action? Because meta tags are tucked away in the HTML and hidden from the view of the human visitor, they have been abused like crazy by spammers trying to hide keywords out of view. The original purpose of meta tags was to provide meta-information about the page which could then be used by search engine spiders and other algorithms. One such piece of meta-information is a description of the page (e.g., its content and its purpose), a.k.a. the “meta description”. Although defining a meta description (or meta keywords or any other meta tag, for that matter) won’t improve your rankings, it is useful from the standpoint of influencing what text appears within your listing in the search results (i.e., the “snippet”), in order to better persuade the user to click through to your site. Yahoo will frequently employ the meta description as the description in your search results listing. Bing also displays meta descriptions in its search listings. Google may incorporate some or all of your meta description into the snippet displayed in your search listing; it’s more likely if the searcher’s keywords are present in your meta description. More on the intricacies of Google snippets here. The user’s search terms, and related keywords like those with the same root, are bolded in the search listing, which improves the clickthrough rate to your page from the search results. This is known as KWiC (KeyWords in Context).
- Does your site have a custom error page that returns the correct “status code”? Don’t greet users with the default “File not found” error page when they click through from a search engine results page to a page on your site that no longer exists. Offer a custom error page instead, with your logo and branding, navigation, site map, and search box. Important from an SEO standpoint: make sure that “File not found” error page returns in the HTTP header a “status code” of 404 (or potentially a different 400- or 500-level status code depending on the nature of the error), or that it 301 redirects to a URL that returns a 404. You can check this with a server header checker, such as this one. If you mistakenly send a 200 status code instead, this error page will likely end up in the index, and thus the search results. This is discussed further under “Worst Practices.” No matter what the reason for the page’s unavailability (e.g., discontinued product, site redesign, file renamed, server or database issues), you shouldn’t be driving visitors away with an ugly error page that doesn’t provide a path to your home page and other key areas of your site.
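How the 404 gets sent depends on your server or framework. As one illustration only, assuming a Python/Flask application (the template name is a placeholder), the handler could look like this:

```python
# One illustration only, assuming a Python/Flask application; the template
# name is a placeholder. The point is the explicit 404 in the return value:
# without it, the custom page would go out with a misleading 200 OK.
from flask import Flask, render_template

app = Flask(__name__)

@app.errorhandler(404)
def page_not_found(error):
    # Render the branded error page, but keep the correct status code.
    return render_template("404.html"), 404

if __name__ == "__main__":
    app.run()
```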
- Do your filenames and directory names include targeted keywords? Google engineer Matt Cutts has blogged that this is a useful “signal” to Google, so if it’s easy to do, why not? Separate keywords with hyphens, not with underscores. Avoid having more than a few keywords in a filename or directory name, as it could look spammy to the search engines.
- Are you actively building links to your site? A steady stream of high quality links doesn’t just “happen”, just like ongoing, consistently great media coverage doesn’t just “happen.” If it did, link builders and public relations pros would all be out of a job. The most basic of starting points for link building is the authoritative directories, like the Yahoo Directory and the Open Directory. Not only do the high quality directories improve your PageRank and consequently your rankings; they also drive direct click-through traffic. If you aren’t already listed in the Yahoo Directory or Open Directory, then you should identify the category most relevant to your business and submit your site. A listing in Open Directory also ensures a listing in the (largely forgotten) Google Directory and numerous other directories powered by Open Directory. Submitting to Yahoo’s directory costs $299, then $299 per year recurring (it’s free for noncommercial sites, though). Submitting to Open Directory is free, but it’s become practically impossible to get into, at least in the most appropriate category for your site, since the Open Directory’s owner (AOL) and its volunteer editors have left the Directory semi-abandoned. Don’t waste your time and money submitting to hundreds of directories; just pick the most critical ones that are relevant to your business/industry and that Google would likely consider authoritative and trustworthy. For example, a business-to-business company may wish to submit to business.com and ThomasNet.com. Directories that primarily target webmasters and SEOs to sell them listings, rather than end users who would actually browse the directory, are most likely being devalued by Google and thus would be a waste of your time and money to submit to. What’s next after the directories? I can’t get into that, or this already overly long article would quickly become a book! There’s an entire Search Engine Land column dedicated to this important topic: Link Week. Suffice it to say that within link building lies quite a spectrum of tactics, from the more basic like optimized press releases, article syndication, and guest blogging to the more advanced like consistently hitting the Digg.com front page with killer link bait. Diversify your link building tactics like you diversify your investment portfolio. Don’t just rely on one tactic.
SEO checklist from SearchEngineLand
NEW: 2011 SEO & Social Media Checklist – SearchEngineJournal
In addition to focusing on increasing last-minute holiday sales, now is the time to make sure everything is in place to make 2011 a great year for your site and business. We should all check our sites for errors and be focused on SEO and Social Media all year long, but not everyone has the time or knowledge to do everything they should. If you are one of the offenders, now is the time to make it up to your neglected website. Go through the checklist below and make sure you have everything in place to start 2011 off right.
An old Chinese proverb says:
“The best time to plant a tree was 20 years ago. The next best time is now.” That kinda says it all! Now on to the checklist:
SEO:
- Make sure all your “Local/Places/Maps” listings are set up and claimed
- Review your On-Page Optimization and make sure it’s all complete
- Make sure you have a content addition strategy in place
- Make sure your product feeds are all submitted
- Make sure your keyword research is done (for now), with a plan to check often for new keywords
Site:
- Review your Newsletter/Follow Up Series and make sure it’s current and compelling (create one if you don’t have one yet)
- Make sure web stats/analytics are installed and configured
- Check site for broken links (and fix them!); a minimal link-checker sketch follows this list
- Review/update text (make sure nothing is outdated or old info)
- Test order links
- Review order process for user-friendliness
- Update photos
- Create a plan to keep site fresh and updated
- Make sure you have a plan in place to review web stats regularly
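Here is the minimal link-checker sketch referenced in the “Site” list above; the start URL is a placeholder, and a real audit would crawl the whole site rather than one page:

```python
# Minimal sketch for the "check site for broken links" item above: fetch one
# page, collect its anchor hrefs, and report any that fail to return a healthy
# status. The start URL is a placeholder.
import urllib.request
import urllib.error
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def status_of(url):
    try:
        with urllib.request.urlopen(url) as response:
            return response.status
    except urllib.error.HTTPError as error:
        return error.code
    except urllib.error.URLError:
        return None  # DNS failure, refused connection, etc.

page = "http://www.yourcompanyurl.com/"
collector = LinkCollector()
with urllib.request.urlopen(page) as response:
    collector.feed(response.read().decode("utf-8", errors="replace"))

for link in collector.links:
    absolute = urljoin(page, link)
    status = status_of(absolute)
    if status is None or status >= 400:
        print(f"BROKEN ({status}): {absolute}")
```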
General Marketing:
- Put a plan in place to run split tests on marketing campaigns and landing pages
- Review results from previous split tests and see what you learned from it
- Put a plan in place to syndicate articles
Blog:
- Create a plan to post regularly
- Make sure your Blog is properly configured with the latest and best plugins
- Make sure you are submitting your RSS Feed
- Make sure you have a good strategy for using Keywords/Tagging/Categorizing on posts
- Make sure your Blog is properly connected to your various social media sites
Twitter:
- Make sure your custom background is done and compelling
- Take some time to be sure your goals and objectives are outlined (you’ll get more out of Twitter if you know what you are doing and why)
- Create a plan to build followers
- Review how often you are tweeting and what tweets you are getting results from – you’ll need to fine-tune your Twitter campaign as you go
Facebook:
- Take some time to be sure your goals and objectives are outlined (you’ll get more out of Facebook if you know what you are doing and why)
- Set up a strong personal profile
- Set up KILLER Fan Page with FBML Tabs and special features (like offering a discount after someone clicks the LIKE button to become a Fan)
- Create an action plan to increase Fans and promote page
- Make sure your FB badge is placed on your site and that you are linking from your sig line in emails, etc.
- Create an action plan to network and engage friends/fans
- Create an action plan to keep content updated
LinkedIn:
- Make sure your profile is set up and complete
- Request endorsements
- Participate in Questions and discussions
- Connect your profile to your other social media accounts
Other social media sites:
- Find other relevant industry social media sites
- Create your profile and create action plan to participate
- Consider a video strategy with YouTube
Holiday Marketing:
- Outline all holidays and an action plan for holiday promotions (I know it seems early, but holidays sneak up on you; at least outline a rough action plan for the year and assign deadlines in advance of each holiday to get work done and campaigns in place)
PPC:
- Review results and adjust bids
- Add new keywords
- Split test ad copy
- Split test landing pages
BEST SEO Audit List of Steve Wiideman
This may be the best SEO Audit checklist on the web, by Steve Wiideman. I hope you find it useful and that it helps your business.
Disclaimer: While I’d like to take credit for the techniques and strategies in this (maybe the best) SEO Audit List, there really isn’t anything new here that you wouldn’t find on the Google Webmaster Checklist, SEOmoz.org’s Top SEO Ranking Factors, or in David Mihm’s Local SEO Ranking Factors. All I did was consolidate them and provide links to help you learn more about each factor. Success will depend on proper execution, avoidance of spam tactics, and on the rare occasion, a handshake from a local chimney sweep.
Website-Level, On-Page SEO Audit
The list below involves changes you can make on your website that may increase your ranking in the search results. These tasks involve making your website search engine-friendly, reducing duplicate content, and avoiding actions that could get your website penalized from the search results altogether. I personally chose the order; email me if you disagree with the sequence or the technique.

- SEO Strategy was reviewed by an industry expert
- Website is not created in Full Flash
- Website is not created using frames
- Website has no broken links or images
- HTML validation to W3C standards
- CSS validation to W3C standards
- CSS table-less template used to reduce file size (okay for holding data)
- Average “time on site” duration greater than 2 min
- Keyword-focused anchor text from internal links
- Duplicate titles and meta descriptions corrected
- Internal link popularity adjusted
- Non-WWW 301 redirect corrected
- Custom 404 page added
- Trailing slash on pages without file name extensions
- Server/hosting uptime verified
- Use of an inclusive or segmented sitemap
- Dynamic URLs replaced with static or fixed with canonical tag
- Use of XML sitemap(s)
- JavaScripts extracted to JS folder
- CSS styles extracted to CSS folder
- Use of feeds on the domain
- Use of external links to reputable, trustworthy sites/pages
- Google Analytics installed (for your analyst)
- (Local) Country code TLD of the root domain
- (Local) Language of the content used on the site
- (Local) Geographic location of the host IP address of the domain
Webpage-Level, On-Page SEO Audit
So you’ve made your website search engine-friendly, but that doesn’t mean that you’re going to suddenly appear for every keyword you’d like to rank for in the search results. The following webpage-level, on-page SEO audit checklist relates to the individual pages that you will be creating to target specific keywords (or keyword themes). For example, when you search for “SEO Expert”, it’s not my Top 10 SEO Tips homepage that appears, it’s my seo_expert.htm page that appears (try it).

- NOT keyword stuffing in the on-page text
- NOT keyword stuffing in the title tag
- NOT cloaking by JavaScript/rich media support detection
- NOT cloaking by cookie detection
- NOT using “poison” keywords in anchor text of external links
- NOT linking to domains banned from Google’s index for web spam
- NOT having an excessive number of dynamic parameters in the URL
- Compiled a Content Tracking Spreadsheet
- Title tag principles used on keyword-targeted pages
- Including an engaging, useful video between text on the page
- Average “time on page” duration
- Existence of substantive, unique content on the page
- Keyword use in the meta description tag
- Keyword use in the page name URL
- Keyword use anywhere in the H1 headline tag
- Total page size is under 150k
- Use of links on the page that point to other URLs on the domain
- Keyword use in <strong> tags
- Short or moderate URL length
- Location in information architecture of the website
- Keyword use in image names included on the page
- Keyword use in image alt text
- Using keyword-rich links in anchor text (not necessarily w/ target keyword)
- Conversion form exists
- Page contains fewer than 100 outbound links
- Images are optimized for faster loading
- Keyword density proportional
- Keyword use in the first 50-100 words in HTML on the page
- Keyword use in other headline tags (<h2> – <h6>)
- (Local) Not using an 800 number exclusively
- (Local) Business name, city and state in title
- (Local) Address in on-page text content
- (Local) Product/service used in URL
Off-Page SEO Audit Checklist
The recipe for a delicious SEO campaign involves one part on-page SEO and one part off-page SEO. Mix in a little click-through rate (CTR) optimization and now you’re cookin’! The checklist below will help you improve your visibility over time. I strongly advise against launching a new website and raising a flag with Google by acquiring a massive number of links all at once. Keep your campaign as organic as possible. Don’t make your link building look like a blip on a chart; make your link building look like Google’s stock chart from 2004 to 2008 (you know what I mean). Additionally, getting a mention can in some cases be just as important as getting a link, so don’t discount the value of a profile on a trusted website just because they won’t link. Ask the major brands what life was like after the Vince update and you’re sure to get a smile.
- NOT having link acquisition from known link brokers
- NOT having excessive repetition of the same anchor text in external links
- Utilizing online classifieds websites for SEO
- Domain registration with Google, Yahoo! & Bing Webmaster Tools
- Website is listed in trusted web directories
- Business is listed in trusted business directories
- Website is listed where competitors are listed
- Website is listed with relevant associations & organizations
- Website continues to receive fresh bookmark links
- Business has an active profile & presence on Twitter
- Business has an active profile & presence on Facebook
- Business has an active profile & presence on YouTube
- Business and pages within the website are mentioned frequently in the blogosphere
- Business produces content noteworthy of being referenced by university websites
- Business is mentioned by government websites for providing community services
- Website is mentioned frequently in conjunction with the products or services offered
- Products and services are mentioned in conjunction with the brand on Craigslist, eBay, etc.
- Affiliates can register their domains during signup and receive static affiliate links
- Product/service expert has featured articles in popular online publications
- Product/service expert is interviewed and written about by online newspapers
- Questions answered frequently on Q&A websites (Askville, Yahoo! Answers, etc.)
- Friends, fans, and followers are sharing useful or extremely funny (or both) content
- Other industry experts are entertained and mention the business in blog posts
- Business is mentioned anytime a Google or Yahoo! Alert lists a competitor
- Inclusion of feeds from the domain in Google News
- Domain “mentions” (text citations of the domain name)
- Inclusion of feeds from the domain in Google Blog Search
- Citations/references in the Yahoo! Directory
- Citations/references of the domain in DMOZ.org
- Citations/references of the domain in Wikipedia
- Citations/references of the domain in Lii.org
- (Local) Get business reviews to influence CTR
- (Local) Manual review/targeting by Google engineers and/or Quality Raters
- (Local) Geo-targeting preference set inside Google Webmaster Tools
- (Local) Registration of the site with Google Local in the country/region
- (Local) Address associated with the registration of the domain

More to come!
Factors You Can’t Control & Shouldn’t Worry About
With every new account I take on comes the question: “Because my competitor’s domain is older, does that mean I’ll never get the top spot?” SEO is a journey, not a destination. With the right amount of nurturing, you can own the top spot for any non-ambiguous, relevant keyword you desire. These factors have historically played a very small role in ranking, but because you cannot control them, they should not be worried about. Being aware of these factors will help you understand why a competitor may have a top placement even though they may not be doing much search engine optimization work.

- Domain name
- Length of domain registration
- Historical click-through rate from search results to a specific URL
- Historical click-through rate from search results to all pages on a domain
- Geographic location of visitors to the site