This post is written by Alexander Zagoumenov
There are a number of ways to categorize your SEO efforts (on-site and off-site, for instance). In this article I want to discuss on-site SEO elements as opposed to off-site strategy. In other words, I'll provide you with tips and tools to keep your site clean and maintain a great relationship with search engines. This article will be useful both for professional SEOs (as a refresher) and for novice SEOs (to get a perspective on on-site SEO strategy).
First of all, let me define the on-site SEO strategy. For the purposes of this discussion, an on-site SEO strategy is a collection of tactics to ensure that:
- Search engines know about your site;
- Search bots can properly index your site;
- Your pages are well-formatted for SERPs.
It is useful to get hold of a good crawler tool such as SEOmoz, where you can track changes in site errors and warnings on a weekly basis. Alternatively, you can use desktop SEO tools such as Screaming Frog and Xenu.
Do search engines know about my site?
Because of the way Googlebot works (it discovers new pages through links), Google will eventually find your site even if you do nothing (a link or two from external resources are still needed). However, there are ways to 1) speed up the indexing process, and 2) ensure that all new updates (new pages, categories, etc.) get indexed in a timely manner. Here are a couple of things to keep in mind.

XML sitemaps
Sitemap.xml files are sitemaps in a format that is easy for search engine bots to understand. Such a file is not meant for humans (unlike /sitemap.html/ or /sitemap/). It's located in the root directory of your site so search engines can pick it up. Learn more about XML sitemaps. There are a number of ways to generate such a file once the site structure is finalized (a minimal example of the format follows the list below). Here are only a few of them:
- XML Sitemap generator (500-page limit in the free version)
- Screaming Frog Spider (500-page limit in the free version)
- WordPress SEO by Yoast Plugin (if you have a WP-based site)
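For reference, a minimal sitemap.xml follows the sitemaps.org protocol; the URL and date below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want search engines to index -->
  <url>
    <loc>http://www.yoursite.com/red-widgets/</loc>
    <lastmod>2013-01-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```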
Webmaster Tools
Webmaster tools such as Google Webmaster Tools (GWT) and Bing Webmaster Tools are the most direct doorway between your site and a search engine. These accounts will help you keep track of your site's health as it relates to search engines. So, ensure that you have such an account created and that your site's XML sitemap or feed is submitted.
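Both services ask you to verify that you own the site before showing data. One common verification method (they offer several) is placing a meta tag in your homepage's <head>; the token below is a placeholder that each account generates for you:

```html
<!-- Placeholder token: copy the real one from your webmaster tools account -->
<meta name="google-site-verification" content="YOUR-VERIFICATION-TOKEN" />
```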
Can search engines see what I want them to see?

To answer this question we need to make sure that there's nothing preventing robots from discovering pages inside the website. You can run a quick scan manually by searching Google for [site:www.domain.com] and taking note of the number of results displayed. If it's about the same as you expected, then you shouldn't have a problem. If it's not, read on. Let's take a look at several important domain-level and page-level elements.

Robots.txt
Robots.txt is a file in the root directory of your website (yourdomain.com/robots.txt) that instructs search bots on what to index on your site. Read more about robots.txt, what it is, and how to configure it here.
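As an illustration, a minimal robots.txt that lets bots crawl everything except a (hypothetical) admin area and points them at the XML sitemap looks like this:

```
# Rules apply to all crawlers
User-agent: *
# Keep bots out of a hypothetical admin area
Disallow: /admin/
# Tell crawlers where the XML sitemap lives
Sitemap: http://www.yoursite.com/sitemap.xml
```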
404 errors

404 error pages appear when a page on the site is absent. Keep in mind that 404 errors happen; they are OK and nothing to panic about. But if your site is not prepared for them, such errors can damage your reputation and play a role in reduced rankings. The worst case scenario is when a user gets a bare, default server error page. These pages are bad because they degrade the user experience on your site. Of course, your visitor can alter the URL in the address bar and land on the homepage (yoursite.com/page-that-does-not-exist/), BUT most likely he / she will close the window or press the back button in the browser and never come back.
So, in order to keep visitors happy and on your site longer (Google likes sites that keep users around longer), make sure you have a custom 404 error page.
A well-designed 404 page evokes a positive feeling. Here are a few things I would add to a typical 404 error page (a server-side sketch for serving the custom page follows this list):
- A few (2-3) text links pointing people to popular sections of the site
- A list (3-5) of options pointing to pages / posts related to what the visitor was looking for
- A search field that provides an additional way to navigate
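How you serve the custom page depends on your server. As a sketch for Apache, a single ErrorDocument directive in .htaccess does it (the /404.html path is an assumption; point it at your actual custom page):

```apache
# Serve a custom page whenever a URL is not found (Apache)
ErrorDocument 404 /404.html
```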
Redirects
It happens that you update or change URLs on the site (due to a new site structure or specific page optimization). Once you do that, it's important to make sure that your XML sitemap is updated with the new URL (and the old one removed). Also, it's a great idea to create a redirect from the old page to the new one. The steps will depend on the server you are running, but here's a good place to start.
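For example, on an Apache server a permanent (301) redirect from an old URL to a new one can live in .htaccess; both paths below are placeholders:

```apache
# 301 = moved permanently, so search engines transfer the old URL's value
Redirect 301 /old-page/ http://www.yoursite.com/new-page/
```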
URLs

Make sure your site's URLs follow a few rules in terms of depth, length, and descriptiveness (a before / after example follows this list):
- Fix URLs that are over 3-4 directories deep. Flatter URLs tend to get indexed faster when the page is created. Plus, they are less confusing for bots and users.
- URLs longer than 100 characters tend to rank worse. So, avoid stop words (in, a, the, etc.) and keep it short: 3-5 words.
- Make sure your URLs are descriptive of the content on the page. Stay away from keyword-stuffing the URL, but try to get a phrase in, if possible.
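To make the rules above concrete, here's a hypothetical before / after (both URLs are made up):

```
Before: http://www.yoursite.com/products/category/sub-category/the-very-best-cheap-red-widgets-you-can-buy/
After:  http://www.yoursite.com/red-widgets/
```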
Titles
Your titles need to be short, specific, and descriptive. A title should tell an engine what the page is about in a short form. Avoid stuffing the title tag with keywords. Optimal title length is between 70 and 100 characters, including spaces. For dynamic titles I recommend you go from detailed to broad: Product > Category > Brand name. Read more in my earlier article here.
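For instance, the detailed-to-broad pattern for a dynamic product page could render as a title tag like this (all names are placeholders):

```html
<!-- Product > Category > Brand name: most specific information first -->
<title>Red Widget 3000 | Widgets | Acme Store</title>
```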
Headings

Make sure there's one H1 per page, that it's descriptive, preferably short, and that it includes your page-level target term. Headings (H1, H2, etc.) are supposed to divide your page content into logical sections, thereby offering value to search bots trying to understand what your page is about. Make sure your pages use headings, and that those headings include your page's focus terms. Also, my rule of thumb: one H1, two H2s, and three H3s per page. You don't have to follow this exactly, but make sure there's only one H1 per page.
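A page skeleton following that rule of thumb might look like this (the widget topic is purely illustrative):

```html
<h1>Red Widgets: A Buyer's Guide</h1>  <!-- one H1 carrying the page's focus term -->

<h2>Choosing the Right Size</h2>
  <h3>Small Widgets</h3>
  <h3>Large Widgets</h3>

<h2>Caring for Your Widget</h2>
  <h3>Cleaning and Storage</h3>
```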
Links (interlinking)

I've worked with a number of Yandex SEO projects, and it appears that interlinking of pages is not as important for the Yandex algorithm as it is for Google. So, if you're optimizing for Google, have a read on internal linking here. Also, if you run a blog, forum, e-commerce site, or news site, you are likely to have pagination issues. Learn more about pagination issues for SEO and how to solve them here.
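One widely used fix for paginated series (not the only one) is declaring how the pages chain together with rel="prev" / rel="next" link tags in the <head>. The URLs below are placeholders showing what page 2 of a series would carry:

```html
<!-- Placed in the <head> of page 2 of a paginated series -->
<link rel="prev" href="http://www.yoursite.com/category/page/1/" />
<link rel="next" href="http://www.yoursite.com/category/page/3/" />
```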
Site speed
Users love faster sites, and Google rewards faster-loading pages with higher rankings. Make sure your site doesn't have slow pages that negatively affect your rankings and user experience. Test your site regularly using Pingdom or the page speed test tools from Google.
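There are many ways to speed a site up; as one sketch, enabling gzip compression for text resources on an Apache server could look like this in .htaccess (assuming mod_deflate is enabled):

```apache
<IfModule mod_deflate.c>
  # Compress text-based responses before they are sent to the browser
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```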
Site analytics

Analytics can tell you a lot about your site's performance (issues and speed) and your visitors. Regardless of whether you use Google Analytics or something else, make sure you keep an eye on content efficiency (how effective your pages are). Want to create custom reports and save time in Google Analytics? Check this guide from Google. Or feel free to import these templates built by other people here and here.