March 2, 2011

Do Your Own Site Structure Audit

It’s easy to forget the importance of a strong site architecture when so much focus is put on other on-page and off-page SEO efforts. When your site was being designed you might have built a sitemap to keep things in order, but things can get out of hand pretty quickly.
A nice, SEO-friendly site architecture at launch.
A few months later:
Bad, bad, bad site architecture. Uh-oh.
For a rapidly growing website with promotions, linkbait campaigns, and expanding product lines, it’s important to reassess the flow of PageRank and the usability of your site every so often. When auditing site structure we’re mainly dealing with indexation, but a poor architecture can keep pages from ranking as well. Let’s walk through a quick audit of your site’s information architecture.

First: The tools

There are a couple of essential tools for a site structure audit: Xenu’s Link Sleuth for crawling, and a link counter for spot-checking individual pages.

Step 0

Hopefully you’ve registered your site with Google’s and Bing’s Webmaster Tools, but if not, go for it. The crawl error reports can alert you to all sorts of problems you may have overlooked. Additionally, Sitemap submission can help you monitor indexation levels, especially if you segment your Sitemaps by site section, as detailed in Rob’s post.
Webmaster Tools Sitemap indexation report. Looking good!
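If you want to keep an eye on those numbers over time, the submitted-URL counts per segment are easy to pull yourself. Here’s a minimal Python sketch, assuming your segmented Sitemaps live at the hypothetical paths below; compare these counts against the indexed figures Webmaster Tools reports:

```python
# Minimal sketch: count the URLs listed in each segmented Sitemap so you
# can compare against the "indexed" counts Webmaster Tools reports.
# The Sitemap paths below are hypothetical -- substitute your own segments.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAPS = [
    "https://www.example.com/sitemap-products.xml",
    "https://www.example.com/sitemap-categories.xml",
    "https://www.example.com/sitemap-blog.xml",
]
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

for url in SITEMAPS:
    with urllib.request.urlopen(url) as resp:
        tree = ET.parse(resp)
    count = len(tree.findall(".//sm:url", NS))
    print(f"{url}: {count} URLs submitted")
```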

Step 1: Xenu

Begin your Xenu crawl. Depending on the size of your site this could take a while, so I like to get it going ASAP. I also like to check external links to make sure I’m not linking out to pages that have gone dead; even if it’s not related to site architecture, it’s bad for users (this will significantly increase time to completion, however). Once the crawl is complete, have Xenu output a report. From here we can find orphaned pages, dead links, redirecting links, and more.
Xenu also reports the ‘level’ of a URL, which is its distance in clicks from your home page.
Xenu’s depth report. Um, that’s not good.
In most cases, level 3 or 4 is about as far from the home page as you’d like a page to be if you expect it to rank. eCommerce store owners will want to make sure none of their product pages sit past this level.
If you’d like more info on breaking down Xenu’s output, I wrote a post about using Excel to score internal page power based on Xenu’s report.
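If you just want a quick pass without the spreadsheet work, deep pages can be pulled straight out of Xenu’s tab-separated export. This is only a sketch: the file name is made up, and the ‘Address’ and ‘Level’ column headers are assumptions that may vary by Xenu version, so check your own export first.

```python
# Sketch: flag URLs that Xenu reports as more than 3 clicks from the home
# page. Assumes a tab-separated Xenu export ("xenu_report.txt") with
# "Address" and "Level" columns -- check your export, as column names
# can vary by Xenu version.
import csv

MAX_LEVEL = 3  # pages deeper than this are at risk of poor indexation

with open("xenu_report.txt", encoding="latin-1") as f:
    reader = csv.DictReader(f, delimiter="\t")
    deep_pages = [
        row["Address"]
        for row in reader
        if row.get("Level", "").isdigit() and int(row["Level"]) > MAX_LEVEL
    ]

print(f"{len(deep_pages)} pages deeper than level {MAX_LEVEL}:")
for url in deep_pages:
    print(" ", url)
```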

Step 2: Too Many Links

Run your link counter on a representative sample of URLs. If your site is generated from templates, check the link count on a page built from each of those templates. The old SEO myth of a 100-links-per-page limit was officially busted by Matt Cutts, but there’s an important point made in the second half of his video:
As Mr. Cutts said, a page’s PageRank is divided and dispersed among the links it contains, so piling a ton of links onto a page might not be an effective way to get everything indexed. How much is a ton? It depends on a lot, but there’s definitely more leniency on powerful pages like the home page and first-level pages. If there are deep pages on your site with over 200 links apiece, there may not be much PageRank flowing through any one of them.
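If you don’t have a link counter handy, counting anchors on a sample of pages only takes a few lines. A rough Python sketch, with hypothetical sample URLs standing in for your own templates:

```python
# Sketch: count the <a href> links on a sample of URLs. If a deep page
# carries hundreds of links, each one receives only a small slice of that
# page's PageRank (very roughly PR/n for n outgoing links).
# The sample URLs are hypothetical -- use a spread of your own templates.
import urllib.request
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # Count anchors that actually carry an href attribute.
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

SAMPLE = [
    "https://www.example.com/",
    "https://www.example.com/category/widgets",
    "https://www.example.com/products/blue-widget",
]

for url in SAMPLE:
    html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
    counter = LinkCounter()
    counter.feed(html)
    print(f"{url}: {counter.count} links")
```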

Step 3: JavaScript Disabled ‘Crawl’

Now it’s time to fire up your browser of choice, turn JavaScript off with the Web Developer Toolbar, and behave like the Googlebot.
Start at your home page and ‘crawl’ through as many URLs as possible. Are there pages that can’t be reached? Take note of sections or pages that are too many clicks from the home page, and pay attention to the shortest path to poorly indexed pages. These will be the targets for the changes we’ll make.
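You can approximate this manual ‘crawl’ with a script that follows only plain HTML links, which is roughly what a JavaScript-disabled session sees. A sketch under those assumptions; the start URL and page cap are placeholders to tune for your site:

```python
# Sketch: a tiny breadth-first "crawl" that follows only static <a href>
# links (no JavaScript executed), recording each page's click depth from
# the home page. START_URL and MAX_PAGES are assumptions to adjust.
import urllib.request
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

START_URL = "https://www.example.com/"
MAX_PAGES = 200  # rough cap so the sketch terminates on large sites

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

host = urlparse(START_URL).netloc
depth = {START_URL: 0}   # URL -> clicks from home page
queue = deque([START_URL])

while queue and len(depth) < MAX_PAGES:
    url = queue.popleft()
    try:
        html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
    except Exception:
        continue  # skip pages that error out
    parser = LinkExtractor()
    parser.feed(html)
    for href in parser.links:
        absolute = urljoin(url, href).split("#")[0]  # drop fragments
        if urlparse(absolute).netloc == host and absolute not in depth:
            depth[absolute] = depth[url] + 1
            queue.append(absolute)

# Deepest pages are your candidates for restructuring.
for url, d in sorted(depth.items(), key=lambda item: item[1]):
    print(d, url)
```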

Step 4: Make Some Changes

Your poorly indexed pages might need to be relocated closer to the home page, or your HTML sitemap may need adjusting. Other than the home page, a good place for links to your unindexed pages is on the pages that are strongest thanks to external links. Check out the top pages report on Open Site Explorer or the most linked pages report in Google Webmaster Tools.
Google Webmaster Tools’ most linked pages report: a good place to determine where you should be dropping links to your poorly indexed pages.
If you’ve got a whole section of pages that are poorly indexed, it might be a good idea to build a widget/module that links to a few pages within that section. Take the latest-lists widget that appears on Yelp.com’s first-level pages as an example:
Yelp.com’s lists widget.
This widget is updated server-side, so each time the Googlebot encounters it, new pages get crawled and indexed.
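As a rough illustration of the idea (not Yelp’s actual implementation), a server-rendered module might look like the sketch below; fetch_latest_pages() is a hypothetical stand-in for your own data source:

```python
# Sketch: a server-rendered "latest in this section" module, in the spirit
# of Yelp's lists widget. Because the links are plain HTML rather than
# injected client-side, a JS-less crawler picks them up on its next visit.
def fetch_latest_pages(section, limit=5):
    # Hypothetical stand-in for a real database query; returns (title, url) pairs.
    return [
        ("Example page %d" % i, "/%s/example-%d" % (section, i))
        for i in range(1, limit + 1)
    ]

def latest_pages_widget(section):
    items = "\n".join(
        '  <li><a href="%s">%s</a></li>' % (url, title)
        for title, url in fetch_latest_pages(section)
    )
    return '<ul class="latest-pages">\n%s\n</ul>' % items

print(latest_pages_widget("restaurants"))
```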

Finally: Monitor Your Changes

There are a few ways to determine the efficacy of your changes. By far the most accurate picture of indexation comes from Google Webmaster Tools’ Sitemap report, assuming you’ve set it up properly. An alternative is to monitor your organic landing pages via Google Analytics. This has the added benefit of relating your indexation to traffic levels, but it’s subject to the many oddities that come with fluctuating search volume.
Google Analytics organic landing pages report. It isn’t flawless, but it’s pretty good at tying site architecture changes to actual traffic.
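One low-tech way to spot movement is to export the organic landing pages report before and after your changes and diff the two. A sketch, assuming CSV exports with a ‘Landing Page’ column; the file names are made up:

```python
# Sketch: diff two Google Analytics organic-landing-page exports to see
# which URLs started receiving organic visits after your changes.
# File names and the "Landing Page" column header are assumptions --
# match them to your actual exports.
import csv

def landing_pages(path):
    with open(path, newline="", encoding="utf-8") as f:
        return {row["Landing Page"] for row in csv.DictReader(f)}

before = landing_pages("organic_landing_before.csv")
after = landing_pages("organic_landing_after.csv")

new_pages = sorted(after - before)
print(f"{len(new_pages)} new organic landing pages:")
for page in new_pages:
    print(" ", page)
```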

Rinse, Repeat

If you’re working on a particularly large and frequently updated site, especially one with multiple internal groups making changes, keep site architecture in mind. Schedule a quick checkup every few months, and more frequently if your site isn’t getting the indexation it needs.
This has been a friendly reminder to not forget about your site’s architecture while you’re off building links and adding pages!
Written by Mike Pantoliano | Distilled blog
