Blog Disclaimer

The posts published on this blog are collected from different blogs and websites written by various well-known bloggers and writers. I have only collected these posts; they are not written by me. All content provided on this blog is for informational purposes only and is not used for any commercial purpose. These posts are collected solely for further reference and later review. At the end of every post, the visitor will find a link to the original source. It's a request to all visitors: please read the original post by clicking on the source link given with every post.

June 27, 2012

PPC Spring Cleaning Necessities

If you've just started running paid ads, or even if you've been doing it for a while, there are always improvements to be made: more keywords to target, bids to change, landing pages to optimize. To help you get a fresh start on your account this spring, I've put together the most frequent recommendations I give during paid search audits.

Geotargeting

We pair this with our Structure & Settings section, but it could really be an entire section on its own because it's so important. Here's where to find your campaign data split out by geographic location.
Let's pretend this is your account. You are spending quite a bit of money in California, Florida, and Illinois.
There are a few options we recommend:
a. If you aren't seeing strong returns within 30 days, or in your analytics, you can exclude these areas from your advertising to save the money.
b. If you are getting impressive returns in these states, consider excluding them from this campaign and creating new campaigns for each state with state-specific keywords and adverts.

Ad Extensions

Enable all possible ad extensions: phone number, sitelinks, products, G+, location, and even the mobile app ad extension. These are some of the things that make your ad stand out. Don't have a product feed or mobile app? No problem. There's still no reason not to use the others.
Opinion: thirty or more Google Maps reviews per location add star ratings for non-eCommerce businesses. No one in paid search discusses brick & mortar star ratings (as far as I'm aware). I've seen some adverts with seller ratings extensions, and when I click on the stars they take me to the Places page with reviews. Just one more reason to pimp out your Google Places account and link it up with AdWords. This is where your mailing list and your followers on Facebook and Twitter come in handy!

AdGroup Structure

One adgroup should contain one search intent so the advert is as closely targeted as possible. If you have 300 keywords and 25 adgroups, we recommend you separate your keywords by similar intent then expand your keyword list, segment by device and location, and label your adgroups and campaigns so you can easily filter data. This will help you quickly and easily review how different parts of the account are working.

A/B Testing AdText

With the latest update from Google capping ad rotation at 30 days, A/B testing for long-tail/low-impression keywords is probably on its way out. However, if you've been paying attention to your ads, you'll have noticed that Google was already favoring certain ads even when the 'rotate' setting was selected.

After you've created new ads for your adgroups, the least time-consuming way to change your rotation settings is to select ALL campaigns and go to the Settings tab.
If you've already done A/B testing, have selected the best ads by statistical significance, and are seeing more than 15 conversions in 30 days, select 'Optimize for conversions' and Google will help you target people who are actually going to convert.

Search Query Report

If you don't know what it is, you could be suffering from low Click Through Rates (CTRs), higher Costs Per Click (CPCs), decreased time on site, and wasted opportunity.
The search query report shows you the actual search queries that were matched to your keywords. Broad and modified broad match types throw the widest net (not always the best idea); phrase match connects your phrase "white boats" to queries containing it, such as "buy white boats" or "white boats for sale"; exact match pairs your ad only with searches for [white boats]. The modified broad match type and the search query report are now available in AdCenter as well. Make good use of them.

Pausing Quality Scores Below 3

If you just reacted negatively to this, let’s talk about it for a minute. You are probably receiving the majority of your traffic from broad match keywords with a QS of 3 or below, but any keyword with a 1 or 2 is barely showing anyway. You are only weighing down the adgroups and accounts by allowing these keywords to remain active. You need to do a few things:
a. Make better adgroups. I’ve seen keywords go from 6s to 10s when they were pulled out and regrouped with more closely associated terms. If you could increase QS just by restructuring, you need to do it.
b. You're paying several times more than people who have higher QSs. People aren't beating you because they are paying more; they are beating you because they have more relevant keyword groupings, ads, and landing pages (even though Google says landing pages don't make or break QSs).
c. Your ads aren’t showing the way you think they are. Even if you’ve enabled all extensions, you are probably showing without them more frequently than not. Google won’t reward you if they think you aren’t good for [their] business, and CTR is the way they determine how amazing you are.
If you weren’t put off by pausing lower QS keywords, then you should:
a. Pause low-traffic, low QS keywords
b. Create tighter themed adgroups (potentially split by match type)
c. Pause low QS, higher-than-you-will-ever-pay CPC keywords
d. Pause any low CTR ads.
If you take a massive traffic hit after pausing or deleting these low quality keywords, we would prefer you use broad match with higher QS keywords to supplement your traffic.

Segmenting by Network

It is now well known that we don’t mix display and search into one campaign. However, we always need to be cautious about the search partners network.
To see your campaign data by network, go to the Segment dropdown and select Network (with search partners).
This campaign shows exactly why we recommend eliminating the search partners.

a. CPA is much higher than that of just Google Search. If the CPA was lower, we would have kept the search partners network because it does not affect keyword quality score.
b. Google search network is more rewarding by $10+/conversion.
If you aren’t getting strong data from the search partners, go to the settings tab for that campaign and disable your ads from showing there. Remember to hit Save before leaving this page.

Segmenting by Device

This recommendation is very similar to the Network and Geotargeting advice we’ve already gone over.
If you decide (based on data) that you should be targeting devices individually, go to the campaign settings to make your changes.
If you know your website is not mobile or tablet friendly (graphic heavy, small text, gray text on a gray background, requires serious attention, etc.), then create a duplicate campaign targeted only at tablet and mobile traffic, choose Wi-Fi traffic, and your time on site will probably increase substantially.
If you are already low on budget and have high CPCs, you should stop advertising on devices that aren’t providing comparable, strong return.

There’s Always More

Segmenting by Device and Network is always a necessity, but don’t forget about the other ways to segment. Back in August I wrote about Top v Side, which is a great segment that allows you to bid more effectively. If you aren’t already familiar with all the available segments, you should spend some time reviewing them!
If you think your paid search account is so good it can’t be improved, you should start expanding to AdCenter, LinkedIn, Twitter, and Facebook. Distilled can help you with your paid search agenda.
If you aren’t too sure about your account, we offer paid search audits to businesses looking to switch from their current agency or find their first agency. Our audits provide an in depth analysis of your AdWords and AdCenter accounts.
We review:
  • account structure and settings
  • keyphrase selection and match types
  • search query report
  • quality scores
  • adverts
  • CRO potential
  • and display advertising.
We've recently started providing feedback on paid video ads as well, since YouTube advertising has been transferred to the AdWords interface.

Posted by Jasmine Aye
Source : distilled

June 20, 2012

Google’s Recommendations for Building Smartphone-Optimized Websites

This post covers Google's recommendations for building smartphone-optimized websites and explains how to follow them in a way that gives both your desktop- and smartphone-optimized sites the best chance of performing well in Google's search results.

Recommendations for smartphone-optimized sites

The full details of our recommendation can be found on our new help site, which we summarize here.
When building a website that targets smartphones, Google supports three different configurations:
  1. Sites that use responsive web design, i.e. sites that serve all devices on the same set of URLs, with each URL serving the same HTML to all devices and using just CSS to change how the page is rendered on the device. This is Google’s recommended configuration.
  2. Sites that dynamically serve all devices on the same set of URLs, but each URL serves different HTML (and CSS) depending on whether the user agent is a desktop or a mobile device.
  3. Sites that have separate mobile and desktop sites.

Responsive web design

Responsive web design is a technique for building web pages that alter how they look using CSS3 media queries. That is, there is one HTML document for the page regardless of the device accessing it, but its presentation changes using CSS media queries, which specify which CSS rules apply for the browser displaying the page. You can learn more about responsive web design from this blog post by Google's webmasters and in our recommendations.
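As a quick illustration of the technique (a minimal sketch; the class name and the 640px breakpoint are just examples), the same stylesheet serves every device:

/* Default (desktop) layout: the sidebar floats beside the content */
.sidebar { float: right; width: 300px; }
/* Narrow screens: stack the sidebar below the content instead */
@media only screen and (max-width: 640px) {
  .sidebar { float: none; width: 100%; }
}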
Using responsive web design has multiple advantages, including:
  • It keeps your desktop and mobile content on a single URL, which is easier for your users to interact with, share, and link to and for Google’s algorithms to assign the indexing properties to your content.
  • Google can discover your content more efficiently as we wouldn't need to crawl a page with the different Googlebot user agents to retrieve and index all the content.

Device-specific HTML

However, we appreciate that for many situations it may not be possible or appropriate to use responsive web design. That’s why we support having websites serve equivalent content using different, device-specific, HTML. The device-specific HTML can be served on the same URL (a configuration called dynamic serving) or different URLs (such as www.example.com and m.example.com).
If your website uses a dynamic serving configuration, we strongly recommend using the Vary HTTP header to communicate to caching servers and our algorithms that the content may change for different user agents requesting the page. We also use this as a crawling signal for Googlebot-Mobile. More details are here.
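For illustration, the response headers for a dynamically served page might look like this (a minimal sketch), with the Vary line telling caches and crawlers that the HTML depends on the requesting user agent:

HTTP/1.1 200 OK
Content-Type: text/html; charset=UTF-8
Vary: User-Agent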
As for the separate mobile site configuration, since there are many ways to do this, our recommendation introduces annotations that communicate to our algorithms that your desktop and mobile pages are equivalent in purpose; that is, the new annotations describe the relationship between the desktop and mobile content as alternatives of each other and should be treated as a single entity with each alternative targeting a specific class of device.
These annotations will help us discover your smartphone-optimized content and help our algorithms understand the structure of your content, giving it the best chance of performing well in our search results.
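Concretely, the annotations pair each desktop URL with its smartphone alternative (the URLs below are hypothetical). The desktop page declares its mobile version, and the mobile page points back with a canonical:

On http://www.example.com/page-1 (desktop):
<link rel="alternate" media="only screen and (max-width: 640px)" href="http://m.example.com/page-1">
On http://m.example.com/page-1 (mobile):
<link rel="canonical" href="http://www.example.com/page-1">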

Conclusion

This blog post is only a brief summary of our recommendation for building smartphone-optimized websites. Please read the full recommendation and see which supported implementation is most suitable for your site and users. And, as always, please ask on our Webmaster Help forums if you have more questions.


Source : Official Google Webmaster Central Blog

June 13, 2012

The Organic SEO Task List for 2012 & 2013

Ideally a blog post should be written and formatted for the web, meaning it should be short, snappy and image-rich. This post is exactly the opposite of that. Do as I say, not as I do? After beating myself up about it I concluded that the message I wanted to convey (i.e. Holy CRAP, we need to know a lot of stuff these days!) is best "illustrated" as a wall of text…
[Image: Jenga.3 by Herman Rhoids, on Flickr]
One of the exciting projects we’re working on here at seOverflow is to update our site audit and ongoing SEO task lists. Since the attrition rate of old tasks isn’t anywhere near the growth rate of new tasks our new audit task list has tripled in length from the one I had a few years ago.
Although a few things have dropped off the agenda (e.g. keyword meta tags) we’re still working on things like title tags, meta descriptions, robots.txt, robots meta, header tags, alt tags… and the myriad “old-school” on-page factors that have been part of our lexicon for many years.
In-fact, I can think of very few things that I used to do as part of an audit or ongoing SEO that I no longer have to do. But I can think of dozens of things that have been added, and most of them are more complex than their predecessors.
I’d like to outline a few of these things and then discuss what we can do about them. But first, let’s have a look at some of those predecessors…

What Doesn’t Work Anymore

[Image: the Jenga by egarc2, on Flickr]
Lots of things don't "work" anymore, or at least not as well as they used to, or not without so much risk as to render them undesirable. This includes keyword-stuffing content, site-wide footer text, sponsoring WordPress themes, embedding links in widgets, automated link exchanges, massive article spinning, bait-n-switch 301 redirecting, domain buying for the purpose of redirecting, automated comment spam, mass directory submissions, most paid link networks…
If you have one or two out of date SEO tactics in your foundation it is unlikely to cause your site to come tumbling down as long as you’ve worked hard on other areas. A few directory links and a site-wide footer probably aren’t going to kill your rankings if you’ve spent the last couple of years building up a good reputation in the industry by providing outstanding service or products, and useful content.
But old SEO tactics that rely on outsmarting the algorithm are being eroded one update at a time. If you are left with a foundation built upon templated content with the name of a city changed, or thousands of stub pages waiting to be filled with user-generated content that you think will magically appear someday, you may be in for a fall. Frankly, I'd be surprised if you haven't already had one.

What Has Been Added to the SEO Task List Recently

If you don’t have a cup of coffee you should go grab one. We’re going to cover a LOT of ground here. Unfortunately, this means that we have to sacrifice a little depth to gain the breadth. This post isn’t meant to be a tutorial on every aspect of SEO, but rather a way of putting into perspective the enormous, overwhelming, mind-boggling amount of STUFF on an SEO’s plate these days. Executives and Developers need to know this. SEOs need to know this. And most important of all, any moron out there saying “SEO is dead” needs to know this. SEO is like a hydra. Every time one task becomes obsolete, several more have taken its place. Here are some of the examples that I’ve come up with, though I’m sure there are many more I’ve left off the list (feel free to add them in the comments section).

Site Performance / Page Load Time

If you don't think high latency is worth fixing you might want to read this page, specifically this part: "…that's why we've decided to take site speed into account in our search rankings." How many other times have you actually heard Google categorically say that something is a ranking factor? One could argue that they chose to make this particular one public because it is in their best interest to have SEOs speeding up the web for their crawlers, which saves Google money. However, that doesn't change the fact that site speed, however small its weight, is a ranking factor. The percentage of sites affected by it is probably still in the single digits, but it is worth your time to make sure your sites aren't among them. Add site speed to the task list and use the tools below to diagnose and fix problems.
[Image: seoBook brand infographic]

Brand Building / PR

As my grandmother would say… goodness gracious, this is probably at the heart of how our jobs have changed over the last few years. Call it growing up, selling out, or just simply losing the battle, but the SEO of today may be just another marketer.
I’ve always disliked marketing because I tend to dislike being told I should want things that I don’t need and can’t afford (or at least shouldn’t be wasting my money on). I mute commercials. I roll my eyes at magazine and newspaper ads. Sometimes I’ll even get downright angry because I feel like marketers insult my intelligence by putting things like “Natural Essential Oil Shampoo” on a product so full of carcinogens I’m surprised they’re even legally allowed to sell it, much less use a word like “natural”.
So have we joined the dark side? Are we just marketers? I like to think we’re not. The first time I ever heard Danny Sullivan speak was at an SES in Chicago and the point of the keynote was about how search is different from other forms of marketing because the audience specifically asks to see whatever it is that you have to offer. They ASK by searching for something, which is entirely different than being bombarded with commercials and glossy ads that you have no desire to see whatsoever. And for this reason I embrace our newfound responsibilities as marketers and brand builders.
What are our tasks related to brand building and PR? First of all, if you think "PR" is writing a press release about how you launched a new page on your site so you can send it out to the nobody-cares list on PR Web or PR Newswire, you are on the wrong path. These days SEOs need to either specialize in and understand PR from the traditional standpoint, as taught in universities and PR firms around the world, or work with people who do specialize in that area. Frankly, if you have a good PR team to work with you don't need to hire a link builder. Other tasks include everything from major brand awareness campaigns and choosing branded domains over exact-match domains (EMDs), to simply recognizing when and why to put the brand at the front of a title tag instead of the end.
Search used to be where the playing field was level and mom-n-pop shops could get in front of a huge audience just by being a little bit more agile than the big brands that dominate the landscape off every major interstate highway exit across the country. Though I find it heartbreaking to admit this, those days are over. You need to build a brand. This is the single most important and single most difficult task that has been added to the SEO agenda over the last few years. And I wish I was better at it.
A Little Secret: Before coming over to seOverflow I spent a year as the SEO Director (read: Overpaid PR & Marketing Apprentice) for a guy named Luke Knowles. You've probably never heard of him, but he started several "super affiliate" sites, including Free Shipping Day, which was bigger than Black Friday (according to comScore) and was responsible for about $942 million in online revenue last year. In 2012 it will probably top $1 billion. Here's the secret: I didn't have to do any link building for his sites. Instead, I worked with a small in-house PR team to pitch article and TV segment ideas to major news outlets. We used CisionPoint to discover and manage these media relationships. Instead of building links directly into the sites, I worked on outreach to make sure articles featuring our websites, and TV news segments featuring our in-house talking heads (ahem, PR experts), received as much attention as possible. Part of this had to do with interlinking the stories so they could help prop each other up long enough for viral action to take place. I won't go into the details of how all of this was done, or what it takes to get someone from your company, or an article about your company, on CNN, ABC, Fox, NBC, NPR, USA Today, the New York Times… (all of which we did), but let's just say that if I had to choose between your average link builder and an expert PR professional who knew how to approach and interact with media outlets and presented well on camera, I'd go for the public relations person any day of the week.
So Why is Everyone Still Link Building? A: Link building is easier. B: Link building is cheaper. C: Not every business is sexy/interesting/new enough to warrant mainstream media coverage. D: Link building still works (our team rocks at it).
What if I Have a Boring Client? You can overcome a boring client, but not one without a decent budget. The story doesn’t have to be, and in fact shouldn’t be, about your company or client. It should be about a topic that is already in the news. Nobody wants to hear your personal injury client talk about his latest slip-and-fall case, but you can bet there  is a TV news station or online newspaper out there right now looking for a “legal expert” to talk about any number of high-profile legal cases going on at the moment. The trick is to develop your client as a “legal expert” worthy of being interviewed on the topic. This takes a commitment on their part. You have to start small, develop momentum, and be pro-active in contacting the media. The more interviews they get, the more – and higher-profile – outlets they can pitch. All of this is easier said than done.
PR and Brand Building is Hard! There's no doubt about that. And, unfortunately, the cost is way beyond the typical monthly link building budget for most SEO clients. Entire SEO business models are currently shifting around this area. Smarter people than me have already figured out how to make the leap without overpricing their services and alienating small business clients. A theme I keep returning to in this post is one of specialization. To be successful at this kind of public relations and brand building you need to know it forwards and backwards. If you want to be the next SEO-centric public relations expert I think your future will be bright, but you'll have to start specializing yesterday (or, failing that, now).

HTTP Header Status Codes

Even status code best practices are changing! The jury is still out on some of this, but a lot of people recommend the 410 over the 404 because it is faster and, technically, a more accurate code for web documents that have been permanently removed from the domain. John Mueller from Google recently verified this when he clarified his statements from a Google Webmaster Help thread in a response to Barry Schwartz:
We do treat 410s slightly differently than 404s. …If you want to speed up the removal (and don’t want to use a noindex meta tag or the urgent URL removal tools), then a 410 might have a small time-advantage over a 404.
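If you want to experiment with 410s, the change is small. On Apache, for example, mod_alias can return the status directly from .htaccess (a sketch; the path is hypothetical):

# Tell clients and crawlers this page has been removed for good (410 Gone)
Redirect gone /discontinued-product.html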
SEOs also need to keep their eyes out for soft 404s, though this has been an issue for many years now. The good news here is that Google doesn’t seem to be denying your XML sitemap just because you have soft 404s like they did a few years ago.

Content Quality

Content has always been important, and presumably “quality” has always been a major factor. However, with a few decent links hard-won by linkbuilding efforts and a decent, trusted domain – even crap content could rank well. A lot of SEOs figured it wasn’t their problem to deal with. Their job was to get a site to rank well, which meant that content had to be written for search engines.
Multiple algorithm updates have blasted such sites out of the index, but Panda was a particularly effective hatchet in this area. Once upon a time you could have thousands of stub pages (e.g. empty listing pages for businesses, coupons or products); templated content with geo-modified search & replace keywords (e.g. directories and IYPs); useless content designed to bring in specific searches; spun content; duplicate content… and the worst thing that could happen is that those pages would be put into the supplemental index and wouldn’t rank very well. The rest of your site would be fine. Internet Yellow Page (IYP) type sites, and vertical directories loved this. It meant they could put up a generic page for every single city in the country, fill it with templated content that replaced one city name with another, and once someone signed up to be listed on that page it would move from the supplemental to the main index and start ranking. Content mills loved it because they could have 80% crap articles as filler content and long-tail traffic generators without negatively affecting their short-tail rankings for the 20% of their content worthy of being featured on the forward-facing site.
Panda changed ALL of that. Now even a small percentage of crap content can pull down rankings for an entire site, including your award-winning, trusted resources with links from dozens of high-profile media outlets.
If you have an eCommerce site and don’t have content on category pages you are three years behind. If you have an eCommerce site and the content on your category pages consists of filler like “This is our blue widget category where you’ll find the best blue widgets to choose from…” then you’re only two years behind and should read this. If you have an eCommerce site with useful content on category pages that helps the shopper decide which sub-category, brand page or product page to click on next then I’d like to shake your hand because you know what’s up.
Does Your Content Deserve to Rank? Sometimes you just have to give yourself some tough love. Five years ago I might look at keyword use, internal and external links, how long the page has been up and other on-page factors like header, alt and strong tags to make this determination. These days I have to read the content and pretend I am a searcher who arrived at this page from Google after typing the targeted keyword. Does this page answer my question, entertain me, or otherwise provide me with the content I was looking for with that search? Or is it simply “good enough” to rank “for now” with the intention that I’ll either click on an ad, buy something or go to a different part of the site? This is the crux of the matter because now “good enough” isn’t good enough.
giant jenga by fudj, on Flickr

An Aside: What Makes a Good SEO These Days?

I really do feel for anyone who is just getting into this industry. Those who come from a technical background will likely have trouble with the PR/Marketing aspect, and those who come from a Marketing/PR background will likely have trouble with the technical aspect.
Everything is moving at warp speed and the tasks that fall on a typical SEO’s plate keep stacking up. I get overwhelmed and I’ve been doing this for about eight years now – not as long as some, but long enough to see some big changes. I can only imagine what the marketing team intern feels when they’re hired on full time to “do SEO” and get shipped off to their first conference to “learn how”. It must be similar to trying to take a sip of water from a fire hose.
So what makes a good SEO these days? I'd say the same thing that made a good SEO five, eight or ten years ago: Someone who loves to learn. Someone who gets bored easily and needs to be constantly challenged. Someone who can think with both sides of their brain.
Now back to the list of SEO tasks. Let’s see, what shall we stack on next…

Sitemap Changes

XML Sitemaps: While not “new” I do remember when they didn’t exist and a “sitemap” meant a bunch of footer links, an HTML sitemap page, or a visual site map of pages for use in designing and developing a new site. Now there are many different types of XML sitemaps and several approaches to implementation.
Sitemap Segmentation: Made popular by Vanessa Fox and others, submitting a separate sitemap for different sections of your site (e.g. ecommerce categories, or top categories and sub-categories) can give you insight that a single sitemap cannot; see the sketch at the end of this section. For instance, if your indexation rate is low you might want to know whether that is a site-wide issue or specific to one section.
Vertical Search Sitemaps (News, Images, Video…): What started off as a single XML sitemap for HTML pages, then grew into multiple sitemaps for each content type, then grew into multiple segmented sitemaps for each content type (each with their own markup), has now evolved into a single sitemap in which you can mix and match content types. #FullCircle?
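As a sketch of the segmentation idea (the file names are hypothetical), a standard sitemaps.org index file points at one sitemap per section, so each section's indexation can be tracked separately in Webmaster Tools:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>http://www.example.com/sitemap-categories.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/sitemap-products.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/sitemap-blog.xml</loc></sitemap>
</sitemapindex>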

Social Integration

I remember when social media marketing was just an offshoot of SEO. We would submit to sites like Digg and Reddit, then “network” with our friends to get the word out so people would vote for our content so we could get on the front page so we would get a firehose of traffic that often would crash the servers (the only time in my life when I remember being happy about crashed servers) so we could get a ton of links so we could rank higher. I remember when we hired our first social media expert because it became too much for me to manage on my own. And since that time social has grown up to be its own channel, completely independent of, yet inextricably connected to, search. Here are some of the new tasks and considerations that we need to make given the state of Search and Social today…
Rel Author & AuthorRank
[Image: rel-author-everett]
What if an article ranked well not just because of the keywords in the content, or the amount and quality of links, but just because of who the author was? What if you were your own brand and your brand followed you around the web, adding credibility to the content you write? What if you could increase CTR in the SERPs by 30% almost overnight just by adding a few lines of simple code and linking to the site from your Google+ profile? These questions highlight the importance of using rel author / rel me tags and getting your mugshot in the SERPs next to your content. This is one of those tasks that we’re thankful to have added to our priority list as SEOs.
There are multiple ways of handling rel author markup at this point, though I hesitate to mention them specifically because they tend to change on a weekly basis. My personal site esizemore.com is a single author WordPress site so I simply put the Rel=”me” tag in as part of the site-wide template. You can see it in action with this query. The seOverflow site has several authors so we opted to go with the rel=”author” link to our bio pages, which then link with rel=”me” to our Google+ profiles. You can see that in action here. In both scenarios you need to link back from the Google+ page under the “Contributor” section to the site where your rel author markup is located. Otherwise I’d just put Danny Sullivan or Barry Schwartz’s rel me link on everything I write.
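For the multi-author setup just described, the chain looks roughly like this (a sketch; the name, URLs and profile ID are made up):

On each article, the byline links to the author's bio page:
<a href="http://www.example.com/authors/jane-doe" rel="author">Jane Doe</a>
On the bio page, a link points to the author's Google+ profile:
<a href="https://plus.google.com/102345678901234567890" rel="me">Jane Doe on Google+</a>

The Google+ profile then lists the site under "Contributor to", closing the loop as described above.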
Below are a few resources you may find useful when sorting through the different rel author markup options to find the one most suitable to your needs…
Social Signals as Ranking Factors (especially for QDF keywords)
Both Google and Bing have publicly stated that they use social signals as ranking factors, and tests have shown that Facebook and Twitter shares/likes are at the very least “correlated” with higher rankings. Though there have been mixed signals from Google about whether G+ counts have a direct impact on rankings, I think the question is almost moot since they have an obvious impact on personalization, which in many cases translates to rankings.
Google Search Plus Your World (SPYW) and Personalization
Google has been personalizing search results for many years, and they just keep getting more personal. First it was for users who were logged into Google accounts, or just based on their geographic location. Then in 2009 they started personalizing results even if you weren't logged in. I think at that point many of us started using plugins that add the pws=0 parameter to search query URLs, but even that doesn't work if you don't have Instant Search turned off. Last year (2011) Google started blending social search results in with the universal search listings using a variety of social networks. There was data-drama between Google, Twitter and Facebook, which many believe was the catalyst behind what was to come next…
Google SPYW changes your search results based on your social connections and activity, specifically on Google+. You can turn it on or off and set some basic personalization preferences. The heavy-handed way Google+ and SPYW were implemented in the SERPs has made it virtually impossible for SEOs to ignore social signals as a ranking factor for logged-in users. Google Fellow Amit Singhal said at SMX London last month that SPYW "is the first baby step to achieve Google's dream" and that data shows users like personal results. In other words, search and social are married in the church of Google and they'll be having a family soon. This could be good for marketers who build their brand and social channels, because when "users like" a new SERP feature it also lifts clickthrough rates for early adopters.
As an added SEO task, on top of worrying about how many links you have, keyword use within page content, crawlability and other technical issues, we now need to think about the social graph. How many people have you in their circles? What words do you use in your G+ introduction? Should we put a G+ button on client sites? Does every client need a Facebook, Twitter and G+ page and, if so, how do we optimize them, and how do they interact with each other and with the main site? How do you implement the "rel publisher" tag? How do Google+ pages interact with rel author thumbs in the SERPs (e.g. does being in more circles make it more likely that your thumb will show up?), and so on and so forth…
Rather than reinvent the wheel, I’d like to link out to some folks who know more about the unfortunately-named Google products and some of the things SEOs need to think about in regard to them…
Google Places Moved to G+
I wasn’t sure whether to put this under local or social, but you can read up on it here. A lot of agencies out there are probably thinking the same thing. Does our local search person still handle this, or does it move over to the social media person now? Either way I think the local search SEO is going to have to get into social (if they hadn’t already) and the social media profile manager is going to have to get into local search. I feel sorry for them both.

Usability and User Experience

Both in terms of the user experience on Google, as well as on your site, SEOs are having to think about human beings more than search engine bots and algorithms these days. Here are a few of the things that have landed on our plates as of late…
On-Site UX: Probably for several years, but certainly since Panda, providing the best user experience has become an increasingly important part of the job. This is due to user feedback signals available to Google. We can theorize about what they are and how they are collected (Chrome, the deal with Firefox, logged-in Google accounts, Gmail links, Analytics, etc.), but the fact is Google has massive amounts of data pertaining to how users react to your site as a whole, as well as to links to your site, and to your site in the search engine results. They know who visits; how long they stay; what other sites they visit; what queries they used before, during and after visiting your site; where else they are interacting with your brand, and a whole host of known and unknown useful pieces of data. You can fool a bot pretty easily, but it's tough to fool a user. Factor in how algorithm updates are being evaluated by quality raters who are asked to determine the relevancy and trustworthiness of a set of results for given queries, and it becomes quite apparent that the ups and downs we see in the rankings, though not directly affected by any one person, are heavily influenced by user behavior, which is heavily influenced by the experiences they have on your site. And thus user experience has become "an SEO thing", which is why we partner with Conversion IQ.
Poorly Performing Pages: Whether Google uses Analytics data in their algorithm as a ranking factor is beside the point here. What is important is which pages and keywords are providing a poor user experience. I used to look at pages and keywords to see what was ranking and how I can improve rankings. I still do. However, nowadays I look at UX metrics like time on site, average order size, pages per visit, bounce rate, etc… to determine which pages or keywords aren’t doing well. Then I try to figure out what the problem is. Sometimes it’s a poorly matched keyword / landing page combination, in which case I update the internal and external linking strategy, as well as on-page factors, to help a better page show up or to improve the page that is already ranking. Sometimes it is a slow loading page, an ad or fly-out nav that covers up the content, or a broken HTML tag causing something not to work. It could be any number of things that make a visitor from the search engines decide they don’t like what they’ve found. If someone had told me six years ago that changing an image from an old white lady to a young African American would affect rankings for a product page I’d have thought they were crazy. And probably racist. But now an SEO really has to know who the target demographic is because you need to give those users the experience they want. While the image change is an extreme example, you can bet that a pop-up ad you can’t close will send your user experience metrics plummeting faster than a social media company’s IPO.
SERP Changes: We can't possibly cover all of these search engine results page user experience changes within the scope of this post, though we'll touch on certain ones elsewhere. It is sufficient to say that if there's one constant about Google SERPs these days, it's that they're always changing. From the Google Maps / local places listings that seem to change on a weekly basis; to the addition of more universal search elements from vertical search; to the real-time search feature that lasted from December 2009 to July 2011; to author thumbs; to various schema markups; to Google choosing its own title tags for you; to breadcrumbs; to bulleted lists; to organic Google Product Search (now Shopping) results, which are soon to become pay-to-play; to "Search Plus Your World"; to the new "Knowledge Graph"; to big sitelinks and little sitelinks and… If my fingers had lungs they'd have passed out from lack of oxygen several semicolons ago. For the sake of my arthritic future, I'm hoping you get the point. This is a LOT to keep up with!

Rel Canonical (single domain and cross-domain)

This is one of those tags that was introduced to solve one problem and ended up creating a half-dozen other ones. I know of sites that were cruising along just fine until one day someone implemented rel canonical tags incorrectly with disastrous results. Some SEOs insist that every page has one. Some tend to ignore them altogether for the problems they create, and seek instead to fix the problem instead of the symptom. Most of us do a little of both depending on the situation. Either way we need to A: See if a site uses them or needs them, and B: make sure they are implemented correctly.
Some Rel Canonical Issues to Look Out For…
  • Self-referencing cross-domain rel canonicals: Each domain references its own version of the content. This happens often when the rel canonical code uses a relative path and multiple domains access the same code.
  • Self-referencing rel canonical tags on non-canonical pages: Similar to the above issue, but on the same domain.
  • Conflicts with other tags: Rel canonical on ?page=2 pointing to the first page, but first page is indexable and page two is noindexed. There are many other examples of conflicting tags in this way, including rel next / prev and canonical view all pages.
  • Multiple rel canonical tags on a single page, sometimes with different URLs: I've seen this happen when one tag was hard-coded at the page level and another was an include in the header.php file, or some other site-wide implementation conflicting with a page-specific implementation.
While the rel canonical tag did solve some issues for SEOs it also made our jobs significantly more complicated and added several other potential issues for us to diagnose when things go south.
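When you do use the tag, hard-coding a single absolute URL per page sidesteps most of the issues above (a minimal example; the URL is hypothetical):

<link rel="canonical" href="http://www.example.com/widgets/blue-widget/">

An absolute href, rather than a relative path, is what prevents the self-referencing cross-domain problem described in the first bullet.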

Micro-Formatting, Rich Snippets and Structured Data

hCard, XFN, HTML5 microdata, hReview, rel tags, Schema.org, RDFa, rich snippets, structured data, micro-formats, the semantic web… oh my! You may already know all about this stuff, but if it makes your head spin it helps to think about it this way: just because you know that 555-123-4567 is a phone number doesn't mean Google does. As far as they know it could be a SKU, a serial number, or a song title. All of these acronyms and new phrases are essentially about dealing with that problem in its many manifestations. The fact that there are so many options and so many ways to implement them is just one more thing we have to deal with as SEOs. Don't feel bad if you have yet to get a full grip on them either. Google is having trouble too.
Which is Best? (RDFa, HTML5 Microdata, Microformats…) Personally, I go with Schema.org markup, which was a collaboration by Google, Bing and Yahoo to bring together structured data markup that all three search engines can support. Schema.org includes schemas that cover most situations, from products, events, organizations, people, places, offers and ratings to combinations of some of those things, in addition to many others. Google also recommends HTML5 microdata as a way to mark up information.
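To make the phone-number example above concrete, here is a minimal Schema.org microdata sketch (the business details are made up) that tells the engines exactly what each value is:

<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Example Widget Shop</span>
  Call us at <span itemprop="telephone">555-123-4567</span>.
</div>

Without the itemprop="telephone" label, that string is just digits; with it, it is unambiguously a phone number.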
My first encounter with contributing to the semantic web: I remember the day it was announced that Google had purchased MetaWeb, the startup behind the ambitious FreeBase.com project. I was working for Gaiam at the time (Disclosure: now an seOverflow client) and I immediately went over and started connecting the dots between Gaiam’s various brands, instructors who appeared in Gaiam workout videos and more. You can see some of that still in place here (notice my ugly mug in the top right corner) and here and here and here and here and… Among other things, I was able to describe all of the business units and tell Google that the yoga instructor Rodney Yee was married to Colleen Saidman, both of whom were yoga instructors for Gaiam, which owned the brands Gaiam Yoga Club and Spiritual Cinema Circle, which were started in this year by this person in this city, and here are the products they sell or the services they offer…. You can see some of the other things I messed around with in July of 2010 here and here. It is all transparent. Imagine my lack of surprise when Google announced a year later that they would be using structured data to help them figure out relationships in the semantic web. Add this to your task list: When Google buys a company you should look into it. They do it often, and they always do it for a reason.

Navigation & Pagination

The rel canonical, crawlability, navigation and pagination tasks and tools all blend together in many aspects. Learning what they are, what they’re for and how they interact with each other can be like untangling knots in fishing line sometimes. Add to that the aspect of figuring out which tool, or combination of tools, is best for each situation – or diagnosing what went wrong post-implementation – and you can see why many SEOs spend a lot of time dealing with these things.
Rel Next / Prev: This tag is a good way to consolidate the power of multiple paginated pages into the first page in a paginated set. It fills in a few gaps left open by the nofollow, noindex and rel canonical tags, which were typically used to solve a variety of pagination issues. You can use rel next / prev in addition to, or in place of, these other tags, depending on the situation. The first thing you may need to know is that the tag goes in the header of the document, rather than on the pagination links themselves, which does make implementation quite a bit trickier. Read up on these tags here and here. And here's one from the Beginning SEO blog by Rick Ramos.
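For example, page 2 of a paginated category might carry these two lines in its head (hypothetical URLs); the first page of the set omits rel="prev" and the last page omits rel="next":

<link rel="prev" href="http://www.example.com/widgets?page=1">
<link rel="next" href="http://www.example.com/widgets?page=3">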
View All Canonical: This is a good alternative to self-referencing rel canonical tags on each paginated page. While you wouldn’t want the rel canonical tag on page 2 to point to page 1 as the canonical, you could have ALL pages point to your “View All” page as the canonical. I’ve had mixed success with this. On fast-loading content pages it tends to work well. On category pages with hundreds of products it tends to not work very well. All I can say is to experiment, pay close attention to category page traffic and rankings after implementation, and read this post carefully before venturing down the View All Canonical road.
Faceted Navigation Strategy: I typically see faceted navigation used on eCommerce sites. It can be very useful for shoppers and can improve conversion rates and average order volume. However, it is also a bloody nightmare from an SEO perspective. Faceted navigation is when you can filter any given category in several different ways (e.g. category, price high to low, price low to high, color, best sellers…). This is different from the typical "sort by price" option that adds a single ?sort=price parameter to the URL. Imagine shopping for blue shoes and being able to access a category page by going to the blue category followed by a shoes filter, or the shoes category followed by a blue filter. Then you can add on parameters or folders to narrow the selection by price, feature, ratings, offers, brands… and once you're in brands you can re-sort by all of those other parameters all over again. If you want to see this in action, go printer shopping at BestBuy.com and pay attention to the URLs as you sift through the options. Some content management systems treat the category navigation and faceted filters separately, which makes fixing the issue rather easy, though you may limit crawling and lose some link equity going into those faceted URLs. It's when the faceted filtering menu IS the navigation that you run into major problems.
I don’t have a one-size-fits-all approach to faceted navigation, but generally I will try to limit the depth at which search engines can crawl and/or index the content while working on ways to make the primary, canonical category page rank higher with unique content. One example might be to use wildcards in the robots.txt file combined with a special parameter added at the third filter to limit crawling to only two levels deep, while making sure the primary category page is the only one with static, textual content and that all other pages link to it via breadcrumbs. For example…
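# For all crawlers, block any URL containing the "crawl=no" parameter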
User-agent: *
Disallow: /*crawl=no
Once the user applies a third filter to their product search ?crawl=no gets added into the URL as a parameter. You can also play with robots meta noindex and nofollow tags for added protection, but the more complicated you make it the easier it is for something to break.
Other options include rel canonical tags pointing to the main category page (which could limit crawling but may be the best option in certain circumstances), a View All canonical page (see above), or a combination of different tools. I highly recommend reading this post on SEOmoz by Mike Pantoliano if you ever have to deal with faceted navigation.
[Image: On the way down by Omer Wazir, on Flickr]

In the wise words of Douglas Adams…

DON’T PANIC!
You don't need to know everything there is to know about every aspect of SEO. We'll get to more about how to keep your sanity, and your job, in a climate of warp-speed change later. In the meantime, keep working on building a strong foundation because… well, a picture is worth a thousand words.
[Image: #GoodFoundationFTW]

eCommerce SEO

eCommerce SEO is something I’ve been heavily involved in over the last few years. I wrote a post back in 2009 that won the SEMMY award for Best SEO Blog Post of the year, and though I’m proud of the post I have to cringe when I read it these days because SO MUCH has changed since then. I really need to go back and update it. Below are a few things that have changed for SEOs who work with online stores…
Google Merchant Center / Product Search / Shopping: I started this post back when Google Product Search (aka Google Shopping) was free and getting your products listed was as easy as submitting a valid feed. Posts like this and this still have value, but now Google Shopping comes with a cover charge. Again, this development happened between the time I started writing and the time I published this post. That is how fast we're moving, folks. I hope you brought some dramamine.
Schema Markup: As covered in the section on microformatting above, I prefer using Schema.org markup for product information and product reviews. This is the place to start when looking into the markup options. It can get a little confusing when you drill down into the review schema and see the breadcrumb change from Thing > Product on one page to Thing > Creative Work > Review on the next. Luckily the good folks at Raven have created a tool that makes generating sample markup for developers super simple. The hard part is taking data like prices and reviews, which change over time, and updating the markup dynamically, but at least that is ONE thing that typical SEOs don’t have to do these days.
If you haven’t worked with Schema.org markup for products yet, here is what I’d advise in chronological order…
  1. Go here and read up on the product schema and all available attributes.
  2. Go here and use information from one of your product pages to populate the fields.
  3. Copy the code generated from the tool above and paste it into the HTML field here.
  4. Take a screenshot of the Google Search Preview snippet and send that, along with a txt file with the code, to your developers.
  5. Say please and tell them how awesome they are.
[Image: Product Schema Preview: a screenshot of a fake product review marked up using Raven's Schema Creator]
Product Descriptions: Writing unique product descriptions has always been a best practice. However, when you deal with enterprise eCommerce sites that have hundreds of thousands, or millions, of products it isn't exactly a scalable task for in-house copywriters. It used to be that you could write custom copy for the best-selling products, say the top 1,000, and let the rest rank where they may. Depending on how authoritative the domain and how trusted the brand was, you could bring in a lot of long-tail traffic from those pages, especially if some of them had a few user-generated reviews. However, since Panda came around you really can't afford to do that. Those low-quality or duplicate-content pages now have the ability to pull down rankings across your entire site, including those fantastic 1,000 top-shelf product pages you spent months optimizing. Furthermore, you can't afford to have thousands of little accessory product pages that have little or no description at all. You are faced with a choice: either get unique, useful content on these hundreds of thousands of pages, or keep Google from showing them in the search results. Given the scalability issue, you can guess which camp most enterprise-level eCommerce sites are falling into. I think this is very unfortunate and makes the web a worse place, but I don't own a search engine, so all I can do is react and complain (sorry: provide constructive criticism). A word of caution: I have found that simply putting a robots noindex tag in the page's header won't do the trick. You may have to think outside the box. For instance, I know some people who have had success with forcing their low-quality pages to return a 404 status code, even if they really do exist. Others have combined robots.txt blocks with robots meta blocks. Ideally, however, you need to have unique, useful copy on every product page. Never use the description supplied to you by brands, manufacturers and distributors if you can help it.
Here are a few other things you have to worry about now…
  • Product Feeds (How to generate them, where to submit them, and how to keep them from plastering duplicate content across dozens of channels and thus killing your product page rankings when your Amazon or eBay store outranks you with your own description, and the partner channel manager or affiliate manager takes credit for all of those sales while the executive team blames you for lost rankings and organic search revenue drops… can you tell I've been there before?).
  • Paid Inclusion in Google Shopping (This used to be an organic channel. Now that it’s pay-to-play whose job is it?)
  • Google Trusted Stores program (It isn't as easy as just applying. Though free to all merchants for now, you need to know how you're going to supply Google with regularly updated information, like when and why an order gets cancelled. And once in, you need to maintain high standards like 90%+ on-time shipping. My advice: let someone else handle this channel. You have enough to worry about without dealing with customer service issues too.)
  • Faceted Navigation Filters (See above Navigation section)
  • On-Site Reviews (Via third-party solutions like BazaarVoice and PowerReviews, or a home-grown solution)
  • Off-Site Reviews (From sites like PriceGrabber, epinions, rateitall, viewpoints and more… These may affect product rankings, especially on Google Shopping, so you need to know how to get lots of positive reviews without violating guidelines. Here’s a hint: Don’t ask the person who received their order late and had to send it back three times before they got the right color.)

International SEO

We recently finished a strategy and recommendations document for a US client looking to break into international markets. The scope was to discuss all of the different options, including pros/cons and best practices, and then to make implementation recommendations for best practices specific to the option they ended up going with. The three primary options haven’t really changed in the last few years. They included, in order of preference: #1 Separate CC TLDs for each country; #2 One TLD (e.g. .com) with sub-folders (e.g. /france/); and #3 One TLD with subdomains (e.g. france.globalsite.com).
However, while the three primary options haven't changed much, EVERYTHING else has. In fact, I started the document in mid-May, and by the time I'd finished the first draft a week later the game had totally changed in regard to even the most recent developments. Specifically, while I was recommending the painful process of bulking up the code on every page by using multiple rel href lang tags (see below) in the <head> section of every page, Google finally figured out that it would be much easier to just do this in the XML sitemap.
So just some of things to consider with an international site:
  • CC TLDs, Subdomains or Subfolders
  • XML Sitemap segmentation
  • Webmaster Tools geotargeting
  • Origin country homepage (e.g. US) or global homepage
  • Single rel canonical per page, or self-referencing rel canonicals
  • Country-specific pages, Language-specific pages or both
  • Auto-detection of geolocation or not
  • Auto-redirection to geolocation or not
  • Rel alternate href lang tags for Google
  • Meta content-language tags for Bing
  • Interlink all of the international pages to each other or not
  • Auto translation (e.g. Google Translate) or custom-written content
  • Geographic location of web host
  • A host of other factors, like cultural, budgeting and future-proofing considerations
Rel HREF Lang
While Bing seems to do pretty well with the meta language tag, Google has recently said they prefer the rel alternate hreflang tag, as described here. As I understand it, the tag can be used for defining pages with completely different languages; the same "language" with different spellings and geotargeted areas (e.g. UK and US, or Spain and Mexico); or cases where you translate the page template (header nav, footer, sidebars…) but the main content is in a single language. One important thing to remember is that the rel href lang tag for each country/language needs to go on all other pages. So the rel alternate hreflang tag referencing the Spanish page needs to be put on the English, Chinese, German… and all other pages. As you can imagine, maintaining separate headers for all of these different pages is a hassle. Google answered this with support for rel alternate hreflang tags in XML sitemaps.
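In sitemap form, each URL entry lists every language alternative, including itself (a sketch of the format; the URLs are hypothetical):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.com/en/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/en/"/>
    <xhtml:link rel="alternate" hreflang="es" href="http://www.example.com/es/"/>
  </url>
</urlset>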
Check out…

Local SEO

Local search has always been a strong area for seOverflow; Mike Belasco is a frequent speaker at national and international conferences on the topic of local SEO. I can’t keep track of how often this area changes. It is just insane and, quite frankly, I’m going to leave it to Mike to cover in a separate post (coming soon) about all of the changes that have happened recently in local search. Until then, here are some undoubtedly out-of-date local search ranking factors to which Mike contributed in 2011. And here is the Local Search Toolkit, powered by seOverflow. Also check out David Mihm’s “A Brief History of Google Places” timeline.

An Aside: Dealing with Frustration

All of the tasks we have to do, and the seemingly endless series of paradigm-shifting algorithm changes, features, products, guidelines and directives streaming out of the GooglePlex and elsewhere, are enough to drive you insane. For me, local search is a particularly sanity-consuming area of search. For this reason, I choose not to study it. While I’ve learned a lot through osmosis just by working with experts like Mike Belasco and Dev Basu, and talking with good folks like David Mihm, Andrew Shotland and Miriam Ellis when I have the chance, I know that local search isn’t my cup of tea. If I tried to know as much about local as I know about eCommerce I’d probably just throw my hands up in defeat and start a new career farming sheep.

Link Building

We will be expanding on the changes in link building in future weeks, as our link building expert Alex shares his latest tips. In the meantime, here are a few of the new considerations we have…
Stay tuned for a more in-depth post about link building changes because this is one area that has had massive and frequent disruptions over the last few years. Also, see my rant above about PR Vs. Link Building ROI and you’ll have a pretty good idea of where I think this part of the industry is heading over the next 2-3 years.

Analytics

As analytics has become more complicated and useful, we’ve also lost a lot of data. This includes Yahoo Site Explorer (not analytics, but useful data we no longer have) and most importantly the infamous (not provided) data. Here are some analytics changes and analysis tasks that have landed on the SEO industry’s collective plate…
(not provided): You need to know how to get information out of data that is (not provided), and there are several good articles out there about exactly that. This is the single biggest analytics shake-up since Google bought Urchin and turned it into Google Analytics. While Google told everyone they would probably only see single-digit percentages of (not provided) traffic, things quickly grew well beyond that and into double digits for at least half of all sites, and up to 20+% for many. These days it is rare to find a site that doesn’t have (not provided) as the top referring keyword, and often the top converting one as well. What a pity. Google claims this was a privacy issue, but if you don’t mind a bit of sailor talk: that is bullshit. They still give the keyword data to paying AdWords customers, so what Google is essentially saying is “Your information is private, unless someone buys it off of us.” If you read that in a privacy policy on any other website you’d close your browser and never visit that site again. As SEOs we need keyword data. It is our lifeblood. I’m not sure what else to say about this development, other than it makes me very angry.
Filters Galore! I’m a big fan of Google Analytics filters and segments. While I do miss the day when installing analytics from any major vendor was as easy as copying and pasting a piece of code into your template, setting up filters and segments to wrangle more information out of Google Analytics data is definitely worth the time. Here are a few resources that may help you…
I can’t possibly list all of the great filters and segmenting options out there, but showing the full URL and tracking rankings have been very important for me. Just to give a real-world example, one of my favorite reports to run is what I call the “Page Two Performers” report, though it could also be run as a “Below the Fold Performers” report. Simply show all keywords that are sending a reasonable amount of traffic and/or conversions from the second page of Google (or from spots #6-#10). Set the threshold in a way that gets rid of most of the “noise”. For instance, if a keyword on page two brought in two visits, one of which converted, it will show up as a 50% conversion rate. Don’t get too excited about a sample that small. I typically make sure that there were at least 20 visits in a month from that keyword, but that changes depending on the site and its overall traffic levels. The idea of this report is simple: if those keywords are sending traffic and conversions despite less-than-ideal rankings, imagine what they could do with just a bit of a boost. These are “diamonds in the rough”, and they become my focus keywords and landing pages for the coming month in terms of internal/external link building and on-page optimization.
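For reference, the classic “show full URL” profile filter mentioned above looks something like this in Google Analytics (a sketch from memory, so double-check the field names in your own account):

    Filter Type:              Custom filter > Advanced
    Field A -> Extract A:     Hostname     (.*)
    Field B -> Extract B:     Request URI  (.*)
    Output To -> Constructor: Request URI  $A1$B1
    Field A Required:         Yes
    Field B Required:         Yes
    Override Output Field:    Yes

With that in place, the Content reports show www.example.com/page instead of just /page, which makes subdomain and cross-domain data far easier to read.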
Real Time Analytics: Several companies have offered real time analytics dashboards in the past, and Google Analytics added it to all accounts in 2011, making it easy to see what’s going on during major traffic spikes after, for example, a TV commercial or interview airs.
Multi-Channel Attribution Goes Mainstream: It isn’t always as simple as knowing that a user came from search and used this or that keyword, then converted on the site. Likewise, it isn’t always enough to know that someone arrived from a display ad. Some businesses, particularly at the enterprise level or those with complex affiliate lead generation relationships, need to know what the first touch and last touch were. Some businesses may decide to attribute the conversion to the first touch, others to the last touch, and still others will break up the conversion and apply a percentage of it to each. As if analytics wasn’t complicated enough already… But to make things easier, LunaMetrics has provided an attribution modeling tool via Google Docs.
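A quick worked example with made-up numbers: say a $90 order was preceded by three touches: a display ad, then an organic search visit, then a branded paid click. First-touch attribution credits display with the full $90, last-touch credits the paid click with the full $90, and a simple linear model splits it $30 / $30 / $30. None of these is “right” in the abstract, which is exactly why the modeling tools matter.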
Webmaster Tools: Major changes have happened in both Google’s and Bing’s webmaster tool dashboards. In fact, Bing made some big changes as I was writing this. Here are a few Google / Bing WMT changes, some of which create new tasks for the SEO…
  • Google has been sending more messages to webmasters via GWT, including unnatural links, spam, major traffic drops…
  • SEOs now have control over parameter handling with both tools.
  • SEOs now have control over geotargeting in GWT
  • Bing added Link Explorer data in the Phoenix edition of BWT (presumably to replace lost YSE data)
  • Bing added an SEO analyzer in the Phoenix edition of BWT
  • You can fetch as Bing or Google bots, which is especially useful in identifying potential cloaking issues
  • Both tools now have URL removal tools
  • Reinclusion requests, including lots of clarification on manual Vs. algorithmic penalties

Paid Search

Admittedly, PPC is not my strong suit. That’s why our pay-per-click team’s Director, Billy Overin, will follow up this post with more information about the recent developments in paid search. In the meantime, here are a few of the things I know to have changed, off the top of my head…
  • Search Retargeting: SEO and PPC have always worked well together. For instance, you could test some keywords in the paid search arena before spending months going after them in organic only to find out they don’t convert for you. Search retargeting presents yet another way to leverage organic search data for paid search profits by paying for ads shown to users who have previously searched for your brand or keywords. As you can imagine, this opens up a big attribution model can of worms, but can also be extremely profitable.
  • Google changed the free Product Search to Google Shopping, which is now Pay-to-Play
  • Mobile ads took off in 2011
  • Call tracking on PPC ads
  • Multi-channel funnel analysis and new attribution models
  • New paid search channels open up on social media sites (e.g. Facebook)
  • New “automated rules” in AdWords make life a little easier for those not using bid / ad management software
  • Quality Score now showing in adCenter interface
  • GPlus +1 buttons showing on paid ads
Stay tuned for a more in-depth look by our resident PPC guru Billy Overin at what has changed in paid search over the last couple of years.

Mobile Search

Just what is a ‘mobile device’ anyway?
This question used to be pretty simple to answer: A phone. These days our phones are computers; our computers are phones; tablets are laptops; laptops are tablets; everything is a TV and almost everything is “mobile” in the portable sense of the word.
Mobile Explosion: There are 5.9 billion mobile subscribers (87% of the world’s population), with over 1.2 billion mobile web users accounting for about 8.9 percent of visits to websites globally (source). People don’t worry about using too much data now that unlimited data plans are the norm. Websites now render very well on most smartphones and tablet devices, making the browsing and shopping experience not all that different from being on a desktop. If none of this sounds like “news” to you, that is precisely the amazing part! Accessing information on the web from a mobile phone was so difficult and expensive when I started doing SEO that it just wasn’t something we had to worry about, though there was a lot of talk at conferences about the coming mobile revolution. Over the last eight years we’ve gone from that point to Mobile SEO sessions being among the most consistently packed rooms at every conference – because we need to know this stuff!
The Impending Death of Apps?
I did some work for a coupon site not long ago, and they had a really great mobile coupons app for iPhone and Android devices, so their “Mobile” page consisted of a big call-to-action asking visitors to download the application. Not long ago, if people were searching for “Mobile Something” it would be perfectly fine to ask them to download your app. However, in looking at the click-paths and digging around in some of the keyword-level metrics, it became apparent that the vast majority of visitors did not want to download anything at all. They were on a mobile device surfing the web and wanted what they wanted right away. We decided to move the app download CTAs over to the sidebar and populate the main content area of the page with coupons that were accessible (and usable) from a mobile device without having to download the app. I can’t share specific metrics, but suffice it to say that we made the right decision. You can see the page here. The point is: if you have a game or some multi-featured app that can’t be accessed from a mobile browser, then perhaps an app is the best way to go. But the days of building apps simply to show a bare-bones version of the same content you can find on the website are probably coming to an end. #ThankGod PS: Five minutes after publishing this I saw a tweet from AJ Kohn linking to a great ZURBlog post that also supports this probability.
Google’s Advice
As yet another example of how fast things change in this industry, I wrote the Adaptive Design paragraph below on Wednesday, June 6th, 2012 around lunch time, and by 2pm ET Barry Schwartz, while covering the iSEO panel at SMX Advanced, wrote that Google’s Webmaster Trends Analyst, Pierre Far, had finally given Google’s “official” stance on mobile SEO best practices. You can read the whole thing here, but the short version is: use responsive / adaptive design.
Adaptive / Responsive Design
Not long ago it was OK to send people to a separate mobile “version” of your site, but these days more and more mobile experts are touting Responsive Design (some call it Adaptive Design). Serving one URL and one set of code to every device avoids duplicate content, auto-redirects and unintentional cloaking, and it gracefully handles the fact that some tablets are more like laptops and some phones are little tablets.
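As a bare-bones illustration of the concept (the 600px breakpoint and class names here are arbitrary), a responsive page serves the same HTML to everyone and lets CSS adapt the layout to the screen:

    <meta name="viewport" content="width=device-width, initial-scale=1" />

    <style>
      .sidebar { float: right; width: 300px; }
      /* Below the breakpoint, drop the sidebar under the main content */
      @media screen and (max-width: 600px) {
        .sidebar { float: none; width: 100%; }
      }
    </style>

One URL means no duplicate content, no mobile redirects and no accidental cloaking, which is precisely why Google likes it.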

Other Stuff

Basic On-Page: Even the most basic on-page tactics have changed. I no longer write title tags for search engines. I write them like they are sales copy, and if that means not fitting a secondary or tertiary keyword, or a keyword variant, into the tag then so be it. The days of listing out keywords separated by commas, pipes or colons are gone – long gone, IMO – but looking at the SERPs I’d say about 90% of SEOs haven’t gotten the memo yet.
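A hypothetical before-and-after for a widget retailer shows what I mean:

    <!-- The old keyword-list style: -->
    <title>Blue Widgets | Cheap Blue Widgets | Blue Widget Store</title>

    <!-- Written as sales copy instead: -->
    <title>Blue Widgets with Free Shipping &amp; 90-Day Returns</title>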
Negative SEO: Are people buying crappy links to try and get your site banned or caught in a Penguin filter? Are people paying Mechanical Turks to drive up the related query count for things like BrandX Rip Off and BrandX Scam? If so, what do you do about it?
New Crawlability Developments: Yes, Google can see and execute JavaScript. Yes, Google can and will see stuff that you have blocked in the robots.txt file (such as that script you think hides all your affiliate links), because robots.txt blocks robots, not browsers – and Chrome is a browser. Additionally, the Google Preview bot needs to render everything on a page in order to generate a preview. And since when did Google actually care about anyone’s privacy or copyrights other than their own? How do you deal with AJAX sites, and what in the world is a HashBang?
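For the curious: the HashBang is the #! in an AJAX URL, and Google’s AJAX crawling scheme works roughly like this (the URLs are placeholders):

    <!-- The pretty AJAX URL users (and links) see: -->
    http://www.example.com/products#!category=widgets

    <!-- What the crawler requests instead. Your server must return the fully
         rendered HTML snapshot for this "ugly" version of the URL: -->
    http://www.example.com/products?_escaped_fragment_=category=widgets

    <!-- AJAX pages without a #! in the URL can opt in with this tag: -->
    <meta name="fragment" content="!">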
Site crawlability has always been a major SEO issue. The more complicated the code gets, the smarter the bots get. The smarter the bots get the more complicated the code gets. And as bots get smarter and code gets more complicated SEOs have to keep piling on little pieces of information and learning new skills.
Video Search: Getting video thumbnails used to be difficult. You had to use your own player, or come up with some creative ways to get around the fact that Google didn’t want to show thumbs for pages that only had embedded YouTube videos from YouTube.com. Nowadays all it takes is an XML video sitemap (sometimes not even that) and Ta-Da! You have yourself a video thumbnail. And another task gets added to the list.
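If you haven’t seen one, a bare-bones video sitemap entry looks something like this (the URLs are placeholders, and Google’s documentation lists plenty of optional tags beyond these):

    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
      <url>
        <loc>http://www.example.com/videos/blue-widget-demo.html</loc>
        <video:video>
          <video:thumbnail_loc>http://www.example.com/thumbs/blue-widget.jpg</video:thumbnail_loc>
          <video:title>Blue Widget Demo</video:title>
          <video:description>A two-minute demo of the blue widget in action.</video:description>
          <video:player_loc>http://www.example.com/player.swf?video=blue-widget</video:player_loc>
        </video:video>
      </url>
    </urlset>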

So SEO is dead?

Hardly. If an exponentially expanding task list is in any way associated with job security (and I think it is), SEO is a great field to be in for the foreseeable future. While those who fail to adapt are going to be looking for a job, everyone else has plenty of work to do. If your plan is to adapt you’ll need to incorporate the following two words into your SEO life:
#1 – FOCUS
I think we’re going to need far more specialists, and those who remain generalists will have to know who the specialists are so they can outsource as needed (see #2). That is why one of seOverflow’s goals for our entire team is to find out what most interests each of us and to encourage everyone to delve deeper into that specific area. For me it happens to be eCommerce SEO, but I’d also love to be an “expert” in video search and schema markup while keeping my technical SEO skills razor sharp. In the end, I may have to give up on video search in order to stay up-to-date on eCommerce.
As long as you know what you want to focus on and make it a priority to learn everything you can about that one area you will have a career in SEM / SEO / Internet Marketing / Competitive Webmastering / Inbound Marketing…
Or whatever they’ll call it next.
#2 – TEAMWORK
The dilemma is that you’re not going to be able to know how to do everything, but everything has to be done. For agencies this means you need to get your top SEOs to specialize in areas that complement each other, so you can provide clients the depth of knowledge, as well as the breadth of skills, needed to reach the top of the search results.
For an in-house SEO (and really for any SEO) this means being involved in the community, attending search conferences, networking online and knowing who is the expert in whatever it is you need at any given time.
When I have a video search question I go straight to Mark Robertson of ReelSEO. He isn’t cheap and his schedule is tight – and there is a reason for that. I recently had a stumper of a question about eBay and Amazon stores, so I asked the folks at Channel Advisor and got the answer I needed. Likewise, I am always happy to answer a tough technical problem or eCommerce SEO question for a friend, because I like to help people in the SEO community, and because they usually know things that I don’t – and I’ll be hitting them up with a question soon enough. I’m constantly going back and forth with Andrew Shotland (a local search guru, but also a very strong technical SEO), Marty Martin (I had a few Magento questions for him), and my fellow Q&A Associates and Mozzers. If I have an enterprise content or online news stumper I’ll try to get in touch with Marshall Simmonds (Define Media Group, former SEO for the New York Times). Dr. Pete can take a stab at about anything I throw his way. Debra Mastaler can help me think outside the box on building links in tough industries, and our team here at seOverflow is chock full of experts in PPC, link building, local search, project management and more. You do not have to know everything (I certainly don’t), but it helps to be on good terms with people whose opinions and knowledge you can trust regarding the stuff you don’t know. Likewise, when they hit you up for your input, be generous.
You actually read all the way down this far? Come find me at a conference, because you’ve earned a pat on the back!
I didn’t set out to write a post this long. I just wanted to make a point: You can’t know everything about everything in SEO unless you are way smarter than anyone I know. And if that’s the case, you should probably be working to cure cancer or sending your own rockets into space instead of getting websites to rank higher. Either that, or you should be working for the other guys. As for the rest of us, we can know a lot about a few things or a little about a lot of things. In my opinion, the SEO of today needs to focus on depth within themselves and their chosen SEO niche, and breadth within their team, community and professional support network. That was my original point, and it only took me about 11,500 words to say it.


By: Everett, who has served as an in-house SEO specialist in diverse corporate and startup environments, as well as running his own agency, learning the needs and roadblocks of clients in eCommerce and other hyper-competitive niches.

Source : http://www.seoverflow.com
