Blog Disclaimer

The posts published on this blog are collected from various blogs and websites written by well-known writers. I have only collected these posts; they were not written by me. All content on this blog is provided for informational purposes only and is not used for any commercial purpose. At the end of every post, visitors will find a link to the original source. These posts are kept here only for later reference and study, and I request all visitors to read the original post by clicking the source link given with each post.


January 31, 2011

Pay per Click Advertising

Major PPC Providers:

Google AdWords:

AdWords is Google's flagship advertising product and its main source of revenue. AdWords offers pay-per-click (PPC) advertising and site-targeted advertising for both text and banner ads. The AdWords program includes local, national, and international distribution. Google's text advertisements are short, consisting of one title line and two content text lines.
Lowering Click Price after Bidding:
After you get 15 or 20 clicks and have a decent click-through rate, you may want to cut your bids in half or by 2/3. Often it is best to start off with your ad around the #1 or #2 position to collect feedback, and then let it fall back after you drop the ad price.
Demographic Targeting:
The demographic targeting feature introduced by the Google AdWords program is a way to find and run your ads on sites with the right audience for your ad campaigns.
The right audience, or demographic group, is an audience that shares a particular trait or characteristic such as age, gender, income, etc. Some site publishers (e.g., social networks) ask their users to identify themselves by age and gender.

Yahoo Search Marketing:

Yahoo! Sponsored Search lets you create ads that appear in search results on the Web's most popular destination and other sites in the Yahoo! distribution network. Their start-up process helps get your campaign online in five easy steps:
  • Select keywords directly relevant to your site content. The editorial staff of Yahoo! Search Marketing will check your site content for correspondence with your keywords.
  • Write a search listing consisting of a title, a description of your site and what you have to offer, and a URL.
  • Determine bid amounts for your search listing.
  • Your listing is distributed across the web.
  • Finally, your listing appears in the results of search engines.

Microsoft AdCenter:

Microsoft offers its own pay-per-click ad-bidding system called Microsoft adCenter, which pairs search results with sponsored text ads from advertisers.
It helps you start your online advertising campaign and target ads to the times, places, and customers you want most. The program also has built-in tools to manage your advertising process for better results.
The keyword generation tool will automatically generate keywords based on a word or website address you choose.

Read the original post here -
Pay per Click Advertising Tutorials [Updated] - Session2

Onpage SEO Factors

1. Keyword research *
I am sure most of you already know the main keywords in your niche. Don't make one big mistake: do not ignore related keywords, which will help a lot if you use them in your body text (articles, etc.).
So go to the Keyword Tool once again and make a list of all related keywords that get any traffic.
2. Domain name *
Choose a domain name containing your main keyword or keywords if you don't rely on branding and a big advertising budget. For example, if you target "SEO tips", go for an exact match like seotips.com. Here comes the next big problem: most exact-match domains are already taken. In that case, it is better to buy something like seotips24.com, which is still short and good for direct traffic. If there is no good .com domain available, the next best TLD is .org, according to SEOmoz research. Choosing a keyword-rich domain name will save you a lot of time and money: these domains rank faster and better than non-keyword-rich domains. What else do you get for free? Direct traffic, which is free targeted traffic.
Meta Information in the Header (use a different title and description for every page) *
3. Keyword in title tag
Place your main keyword close to the beginning of the title; the best position is within the first three words. Keep the title short and descriptive.
4. Meta Description
Use related keywords and avoid keyword stuffing. The description is used for the automated Google snippet, like the one shown below.
(Image: a Google snippet in which this page's meta description, "Optimal onpage optimization for better search engine rankings", is displayed.)
The best use of the meta description is to convince web users to visit your website. This is the place where you must show what your webpage is about. Use it like an ad slogan.
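As a minimal sketch, the tags from points 3 and 4 sit together in the page header; the title and description text here are invented for illustration:
<head>
<title>SEO Tips: Onpage Optimization Checklist</title>
<meta name="description" content="Optimal onpage optimization for better search engine rankings." />
</head>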
5. Keyword density in body text
Good density varies by niche. A common practice is to keep it between 5% and 15% for all keywords combined, and below 5% for any individual keyword.
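As a quick illustration of how density is computed (the formula and numbers are my own example, not from the original article):
keyword density = (keyword occurrences / total words) × 100
So a 600-word article that uses "seo tips" 18 times has a density of 18 / 600 × 100 = 3%, safely under the 5% guideline for an individual keyword.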
6. Keyword in alt text
Use your keyword in image alt text, but do not stuff it.
7. Internal links
Use the power of your internal links. Link to your main page, or to any other internal page you want to rank high in search engines, using your keyword as the anchor text.
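For example, a keyword-anchored internal link might look like this (the URL and anchor text are placeholders):
<a href="http://yourwebsite.com/seo-tips">SEO tips</a>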
8. Use clean URLs
An address like http://yourwebsite.com/title-of-your-page is far better than http://yourwebsite.com/load=categoryl::Home&somethingId=16431. The first one tells search engines a lot more and will give you an advantage over competitors who don't use clean URLs.
9. Effective Link Structure
Try to keep your pages as accessible as possible. No page should be deeper than 3 clicks.
10. External links
More external links mean passing more link juice to external websites, which could hurt your Google ranking. Reduce your external links or use the nofollow attribute. Sometimes applying nofollow to all external and internal links from the homepage may help you achieve a higher SERP (search engine results position).
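Adding nofollow is a one-attribute change; the URL below is a placeholder:
<a href="http://externalsite.com/" rel="nofollow">external resource</a>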


Original article is published here Onpage Seo checklist | Danail SEO blog

The Effectiveness of Karma Systems in Social Websites

Karma as a game mechanic works wonderfully in social systems. We see karma game mechanics in action most frequently for user contributions in sites like Reddit, Hacker News (HN), Stack Overflow, and Foursquare.

While each of these sites (along with the many more that have similar features) employs its own algorithm for determining a user's karma standing, the general idea tends to be pretty consistent: good actions (such as submitting good links or flagging link spam) increase your karma, while bad actions (like submitting spam or trolling) decrease it.

In the social web, karma allows a community to self-regulate, which tremendously helps with scalability. For example, Reddit (one of the biggest websites in the world, garnering close to half a billion page views a month) is able to run with only 6 staff members, no doubt thanks to the help of the millions of people who use the site and the karma system Reddit has developed.

Karma as a Social Interaction Design Pattern in Websites

Karma System Design Pattern

Karma points are often referred to as reputation points. Users accumulate karma/reputation points as they use and interact with the site and its users. These points can garner prestige within the community, as well as special features only available after attaining a certain quantity of points.

The Stack Overflow Users page displays the top users with the highest amount of reputation points.

The points are earned by performing various site tasks such as submitting good content, leaving a good comment, or performing an encouraged site action such as filling out your user profile.

Karma points are awarded by other users (often called "upvoting") and by the site itself.

Karma/reputation points are earned by submitting good links and writing good comments on Reddit.

Foursquare and Gowalla serve as good examples of systems that reward users for taking advantage of their service and sharing that information with their friends. The badges that users unlock for checking in at new locations offer rewards that can be shared with friends online.

Foursquare badges are displayed on user profiles.

Some systems unlock features of the website once a user gets a certain number of karma points. Usually, these features give users — who have proven their dedication and are intimately familiar with the community guidelines as indicated by their high number of karma points — a set of special community-management abilities such as content moderation and the ability to flag content as spam. For example, in Hacker News, you are able to flag submissions and make user interface customizations (for yourself) once you reach a certain number of points.

Users of Hacker News estimate that at about 250 karma points, they are able to change the top bar color of the site.

Why Do Karma Systems Exist?

So what’s the point of having a karma system on websites? Do these websites really need to associate a rank to their users in order to become (or remain) successful?

The real kicker with karma-based game mechanics is that they keep users coming back. In the case of any website, the rate at which you can retain users is a key ingredient of a successful website. While it is certainly important to always be reaching out to new users, returning users should be a top priority. Rewarding your current users for their deeds (and also keeping score of these deeds for them) is a huge tool for encouraging loyalty. If you can get your users to put the time into building up a reputation in your site, they are much less likely to stray somewhere else.

User Benefits of a Karma System

We can see why a socially oriented website would want to take advantage of the benefits provided by a reputation points system. The potential bump in visitor loyalty and participation is certainly tempting to site owners and web developers, but the rewards are not restricted to those running the show.

What’s in it for users of the site?

Information-Filtering

A well-functioning site that ranks users and content based on the credibility of those who post and promote information is a great tool for the site’s audience. The internet is an easy place to get lost in information overload. An excellent method for sifting through information to find what interests us is to use sites like Digg and Reddit, whose featured content has found success amongst a scrutinizing (and often brutal) audience.

Humans are wired to trust other humans, a concept known as social proof. When we have a tool that ranks content based on the feedback of thousands of people, it saves us the time of wading through junk, allowing us to discover content that has been vetted by a community of people.

Opportunities for Top Users

On a more individual level, when we can see that a site user is experienced and reliable, the information they choose to share with us seems to bear more weight. Thus, a benefit for amassing reputation points is the influence that can come with it.

There are other rewards to be had outside of the obvious for a user who fulfills a contributor/curator role in a social media site. It is an arduous challenge to build a reputation within an online community of thousands (even millions), and the outcome of doing so can be immensely beneficial personally. For those who are looking to promote their own work, for example, having a great karma score will immediately increase the exposure of your own content. External career opportunities arise when people prove their mettle in these competitively social sites, as in the example of a software developer who got a Google interview by being a top contributor on Stack Overflow.

Downsides of Karma Systems

There are several issues in systems that incentivize participation.

Gaming the System

One primary issue is that, because attaining reputation points comes with benefits that increase a user's success in promoting their own work, there will always be a few users who will look for ways to exploit the system to their advantage. This is often referred to as gaming the system.

For example, on Digg, because of the amount of traffic a site can get when its content becomes popular there, some groups have taken to gaming the system to their benefit.

Theoretically, a system relying on reputation points to rank user-contributed content should, as a matter of course, be able to self-regulate; but in practice, karma-based systems regularly fail in this area because users gaming the system may have greater incentives (such as financial gain) than users who participate in the system fairly and without external influences on their site conduct.

No matter how sophisticated the system's spam detection algorithms are or how big the site's moderation staff is, there will always be users who persevere in figuring out a way to exploit the system for personal and financial gain. Any time a site or web service builds a high level of success, the race is on to exploit and abuse it to get a leg up on other users. Oftentimes, the result is disingenuous user-generated contributions by people who want to get more page views on their own sites.

Ironically, karma systems can promote a situation that is diametrically opposed to their core purpose; instead of encouraging good behavior and helpfulness among users, they entice some to behave in unscrupulous ways to get ahead of others.

Drowning Out New Users

Another downside of karma systems is that, as a community gets older and bigger, it becomes difficult for newcomers to get their voices heard amidst the old guard. Social classes begin to form (e.g., top users versus "noob" users). As a result, we see effects analogous to the rich-get-richer socioeconomic phenomenon. While anyone certainly has the opportunity to overcome the odds, it becomes harder as older, top users get more and more entrenched in the community, a situation not unlike the difficulty of unseating a political incumbent.

Implementing Karma Systems in Websites: Examples and Tools

So far, one thing that rings true about karma systems is that they work best on large-scale sites that currently have (or anticipate having) a high volume of user contribution and interaction. The examples used above are all custom-developed sites built from the ground up specifically to accommodate a user reputation system.

This, then, raises the question: can a karma system work in standard websites, such as a portfolio site or blog?

For example, much like large social media sites, blogs struggle with spam clogging up the comments system. Currently, the most prevalent way of catching comment spam is to employ software that automatically catches spam and/or manual moderation by site administrators. However, wouldn’t it be nice if users themselves helped a site they loved by contributing good comments, flagging comment spam, and rating content?

Let’s take a quick look at a few tools for incorporating karma systems in your websites.

Plugins and Extensions

If you happen to be using a content management system like WordPress or Drupal, there should be extensions available to you that will allow you to implement forms of karma-based interaction features.

Here’s a short list of extensions for WordPress and Drupal:

  • BuddyPress: Adds social networking capabilities to your WordPress installation
  • Disqus Comment System: Replaces the core WordPress comments system with a more robust system that has user options for flagging spam and upvoting comments
  • GD Star Rating: Allows users to rate and review content of a WordPress site
  • Comment Rating: WordPress plugin that allows users to rate comments with "like" or "dislike"
  • Comment Rating Widget: Extends the Comment Rating plugin above with additional features
  • Voting API: Powerful Drupal module for creating robust site voting systems
  • Fivestar: Configurable Drupal module for adding a rating widget
  • Flag: Can add highly customizable flagging features to a Drupal site (read more about it in our list of top 20 Drupal modules)

Third-Party, Social Networking APIs

Social networking sites such as Facebook and Twitter have APIs that site builders can use to obtain data about online content. Using a social networking API can allow site users to rate, share, and vote on content by way of retweeting or liking. Content with more retweets or Facebook likes can then provide social proof of the quality of a piece of content.

Smashing Network displays Facebook Like counts on content featured in the network.

One idea for a karma system on comments, for example, would be to provide a Facebook "Like" button on comments; other users could then use it to vote up a comment they like. Perhaps a tiered system could be put in place where the more retweets your comment gets, the more prominent it becomes, bubbling up to the top of the comment stack and providing special benefits such as the ability to leave a link to your own site.


Points of Discussion

How would you implement karma systems on typical websites and smaller communities? Do you find the systems used by sites like Reddit, Stack Overflow, and Hacker News to be effective?



Read The Original Post Here -
Karma as a Social Interaction Design Pattern in Websites

January 30, 2011

How To Interlink Your Site?

The first rule of this interlinking phenomenon is to link for humans and users first and to think about engines second. Now, the reason for this isn't because I'm telling you, oh, you have to be super pearly white hat and we can never do anything that would affect or impact or manipulate the search engines. I understand that you're going to want to get good anchor text. You're going to want to flow link juice from powerful pages to pages that need to be indexed, that you want to rank well. That's okay, so long as you think about why humans would want to pass from one page to another.

Let me give you an example. Let's say I am on this domain that I own and control. I can find a page on here where I talk about something that exists on domain two - a resource, a relevant topic, something where you can learn more information about that specific subject, whether it is commercial intent, information oriented, a cool piece of link bait, a piece of news, whatever it is that exists over here. That makes great sense to link over to, and I think it is perfectly fine and legitimate to link from those. For example, if there is some great news about how elephants at a particular zoo have been thriving in the new environment and you want to rank well for elephants at this particular zoo, let's say at the Bangalore Zoo, it's fine if you have some content over here that mentions those words, that is talking about them in a post, in an update, in a news item, something relevant, and you link over. That works great, because that way humans, who might want to learn more about this topic, can go over there and get that information. It is exactly what they want. Now engines as well will see and recognize that.

But if you do something spammy or manipulative, and this goes to rule number two, and you put something like "elephants Bangalore Zoo" in the footer of every page. This page has it. That page has it. This page has it. And they are all linking over to this one, and then domain three and domain four and domain five, they're all linking there too. That's super weird. So, putting those things in footers, having overly optimized anchor text, anchor text that just doesn't sound natural, doesn't fit with the flow of the page, has nothing to do with the content of what is on there, throwing in unrelated links, throwing in "elephants Bangalore Zoo" when this page is about where to buy pens. Just that kind of stuff is going to be confusing to humans as to why it exists, and that will mean that it might get spam reported to Google. It might be seen manually by quality raters, or it might be algorithmically detected. None of which you want to have happen. Besides which, you don't want to be scaring off your users with this manipulative, junky stuff anyway. Users are sensitive to spam and manipulation just like engines are and they'll be turned off. They'll think less of your brand and your site when they see that type of stuff. So, watch out for that.

Third, last rule, Google really knows a lot about what is happening on the Web. Not just through things like Google Analytics, but through Google Webmaster Tools, through e-mail accounts, through FeedBurner, through the Google toolbar. If they see that you appear to be trying to hide a link profile from them and link profiles are looking really similar between your different domains and there is lots of interlinking happening and the registration or hosting looks like it matches . . . . there are some SEOs certainly out there who are advanced and sophisticated enough to be able to spread their network out and have that rigid discipline about never visiting the same two sites with the Google toolbar on and making sure that no Webmaster Tools accounts are linked and all this kind of stuff that black hat operators often have to jump through these different hoops. There are ways to do it. As an ordinary marketer or ordinary operator, you know, SEOmoz-type white hat operator, you're really going to want to be authentic.

It's okay to link between sites that you own. I wouldn't be paranoid about this. Just make sure that you follow these rules and do it in a genuine, authentic way. Then Google is not going to be like, "Oh, this guy owns these sites. He's linking back and forth between these." They're going to be like, "Oh, yeah, he had some news about elephants and he points over here. Yeah, that makes total sense." Or, you know, "On his 'about page' he mentions the other sites that are also owned by the business owners of this site. That seems totally reasonable and fine." Maybe the privacy policy might say that this privacy policy or these terms of use govern these five different websites, and link out to all of them. That's okay. That makes total sense. But having manipulative anchor text in the footer of every page or in the sidebar of every page on all these different domains, that is going to get you in trouble. I'd watch out for that.


SEOmoz - SEO Software

Posted by caseyhen to Whiteboard Friday

January 27, 2011

How to Use Negative Keyword Lists

Frequently running search query reports and refining your keyword exclusions is a great way to cut down on traffic that simply spends your money without achieving results.  A negative keyword, if you need a refresher, is a term that can be added as a broad, phrase, or exact match to filter out “bad” traffic. Your ad won’t show when a searcher types in a query with the keyword in the form you exclude.
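For instance, using the men's dress shoe store described below as the example, the three negative match types could look like this (the terms are hypothetical; in the AdWords interface they are entered under the negative keywords section):
women's shoes      - negative broad match: blocks any query containing these words, in any order
"women's shoes"    - negative phrase match: blocks queries containing this exact phrase
[women's shoes]    - negative exact match: blocks only this exact query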
Recently, Google added negative keyword lists to AdWords. This handy new feature makes it much easier to manage your negatives at the campaign level, as you no longer have to update each campaign individually. You can just add a new negative keyword to a list, which will then apply to all of the campaigns sharing that list. What's really great about this is that if your site makes changes to products, geographical area, shipping, pricing, etc., you can simply delete the negative list from the corresponding campaign(s), or go straight to the list and delete a few terms without downloading your negatives and sorting through them.
We've always had the option to add negative keywords at the campaign level (made especially easy with the AdWords Desktop Editor); however, these new lists make your exclusions easier to manage and more targeted. Follow the steps below to learn more about making and applying negative keyword lists in the AdWords interface and Desktop Tool. Before we go into creating negative lists, let's take a minute to review a couple of good ways to generate negatives with the tools available to PPC managers.
Review for Finding Negative Keyword Ideas

Let's use the example of a shoe store website that specializes in men's dress shoes. Our hypothetical site doesn't sell any ladies' fashions or any athletic/casual men's options, but it might in the future.

The Keyword Tool can be used to generate both keyword and negative keyword ideas.  Add search terms in to the tool that you’re trying to get your ad to show up for then take the time to go through the list and identify keywords that will bring in traffic that isn’t looking for your product or service.
You can use the keyword tool to locate keywords and negatives at the same time by using the drop down menu on the keyword tool to download a keyword as a negative.  The example below is for a store selling dressy men’s shoes, but not women’s.  This can also be done easily by downloading the terms and doing it manually in good old Excel.
You can also use the search term report located on the keywords tab in the AdWords interface to see all of the queries that resulted in an impression for your ad in a specified time period.  From this list, you can sort by spend and conversions to determine if a certain query is being used a lot but not resulting in the right kinds of traffic for your site.  Then, take a broader view of the list and identify terms that just aren’t relevant (even if they aren’t spending a lot).
You'll want to group your negatives into themed "buckets" to add as a list later. So, taking our example above, let's say that for the time being your shoe store doesn't sell men's active shoes, but you plan to do so in the future. For now, it doesn't make sense to waste impressions and clicks on customers looking for hiking shoes (or shoes for some other activity).
You can choose to break your lists down to a broad “Sports Shoes” list or break it down even further with a list for “Hiking Shoes,” “Basketball Shoes,” “Cleats,” etc. Get the idea?  Then, when your site changes, it’s easier to find and delete the necessary negative keywords.  It’s really up to you how targeted you make these lists – look at your goals and plan accordingly.
Creating Negative Lists in the AdWords Interface
First, you need to click on the ‘Control Panel and Library’ option under All Online Campaigns, then ‘Negative Keyword Lists’.
Select the ‘Add a New List’ option and title your list.  Remember that creating themed lists of negative keywords will help you to use them as efficiently as possible.  Click save and your list is created.
You can return to your created lists and edit the term and match type by just clicking on the list. Then, just hit save and it will update your list, and all of the campaigns linked to the list.
Next, you need to decide which campaigns this list is going to apply to.  To do this, go to the keywords tab in the interface and scroll all the way down to the bottom where your negative keywords are located and click to expand.  There is an option for Keyword Lists on the Campaign level side.
From here, you can select which campaigns you want your list to apply to.  You simply highlight the campaigns and click “add”.  When you go to edit this negative list, it will automatically apply to all of the campaigns that you have the list set to.
According to AdWords Help, you can delete lists permanently by going back to the Control Panel and Library in the AdWords interface, clicking on Negative Keyword Lists, and selecting the list you want to remove. However, after I set up a practice list, I wasn't actually able to delete the list, just the keywords I had added to it and the campaigns I had applied it to. The list itself still technically exists. This isn't really a problem, as the list has no negatives applied to it; however, I found it frustrating.


Original Post is published here, by Jessica.Cates
Using Negative Keyword Lists | The Adventures of PPC Hero

Why A Site Is Not Indexed

Have a look at the following reasons why a website is not indexed:

  • No Inbound Links To The Site - You need at least one good quality link from another website to be included (i.e., "indexed") in the major search engines. This is the main reason why new websites are not indexed. Start by submitting your site to a good quality directory like The Open Directory Project, JoeAnt, or one of the others I recommend in my Building Links SEO Tips article. Ask other websites whose main topic is related to yours to link to your site. A good quality link comes from a prominent page on a high quality website. Forum signatures and blog comments are not likely to help your ranking much, but they can help get your site indexed.
  • Site Blocked By The robots.txt File - The robots.txt file controls how search engines access your site. Many website designers block the search engines from crawling a site while it's under construction, and then forget to remove that block when the design is completed. So check your robots.txt file in Google's Webmaster Tools, where a tool is provided that will test it. (A sample blocking rule is shown after this list.)
  • Site Blocked By robots <META> Tag - Similarly, some web designers will insert a robots <META> tag set to "noindex" on a site's main page while it's in the testing stage. This will prevent a site from being indexed.
  • Server And DNS Problems - It's rare, but not unheard of, for server problems to be severe enough to prevent the search engines from accessing a site. Sometimes hosting services have automatic controls that see search engine crawlers (a.k.a. "robots", a.k.a. "spiders") as malicious and block them. Use the "Fetch As Googlebot" tool in Google's Webmaster Tools console to make sure that Googlebot (Google's crawler) can access your site. Extremely slow response times can also prevent indexing. An improperly configured Domain Name Server (DNS) can sometimes cause problems as well. Check your domain's DNS at intoDNS, and if their system reports problems, consult your hosting service about it.
  • Frame-Forwarded Domain - Some free and low-cost services have webmasters create a <frameset> page as the main page of a domain name package, but the website's actual content resides on a different domain. The major search engines see such a website as having no content of its own and will not index it. They will, instead, index the content at the domain where it resides.
  • Poorly Constructed Webpages - The search engines' crawlers are very tolerant of errors in the HTML mark-up of webpages, but it is possible for this to prevent indexing in extreme cases. Valid HTML mark-up will not only help ensure that your site is indexed properly, it will also help ensure that your users see what you wanted them to see when they visit your site. Check your site's HTML with the W3C's HTML Validator and correct any errors it reports.
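For reference, here is a minimal sketch of the two blocking mechanisms described in the list above. A robots.txt file that blocks all crawlers from the entire site looks like this:
User-agent: *
Disallow: /
And the page-level equivalent, a robots meta tag placed in the page's <head> section:
<meta name="robots" content="noindex" />
Remove the Disallow rule (or the meta tag) once the site is ready to be crawled and indexed.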
Read the original post here -
Why A Site Is Not Indexed - Rainbo Design SEO Tips

Search Engine Issues with Multiple Domain Names

In the heady days of the Dot-Com Boom, there were countless people registering domain names that contained the names of celebrities, well-known companies, movie and television titles, and anything else they thought would bring in the big bucks when they sold the rights to them. Domain names are no longer the hot properties they once were, since the registrars have set up rules that protect trademarks and copyrights. But the biggest reason that catchy domain names are no longer the Holy Grail is that Internet users are now accustomed to using search engines to find products and services, rather than blindly typing in things like "bestcarprices.com". However, the desire for higher rankings in the search engines has given a great deal of momentum back to the practice of buying many domain names and having them all point at the same website - or, more precisely, the same webpages.
Search engines like Google do give some weight to the contents of a webpage's URL in their ranking algorithms. It's just a tiny factor, but some hucksters are so anxious for any benefit that they'll register dozens of domain names that are variations of the primary keywords for their sites just to get that small boost. However, there's a fly the size of Brazil in this ointment: the mirror site or duplicate content filter that all search engines impose when they detect duplicate content. The search engines have several concerns in this regard. First of all, allowing multiple URLs to point to the same content degrades their search results. Second, each webpage in a search engine's index consumes valuable resources in their networks, and it's understandable that they don't want to waste these resources on spam or even innocent copies.
When Google detects duplicate content, it tries to select the best version of the page (the "canonical version") and devalue the copies. The common phrase "duplicate content penalty" is a bit of a misnomer, since there is no overt penalty involved. The problem is that you don't get to pick which copy is selected as the canonical version and which one gets ignored. So you can have half of your content indexed under one domain and the rest under another, all of which kills your internal linking benefits and damages your overall rankings on both domains. Using a different language to convey the same information is NOT duplicate content, so don't worry about that.
The best advice is to never have multiple domain names pointing to a single website unless you have set up 301 redirects to a single URL: that is, the URL for the primary domain name. It is doubly or triply difficult to get enough link popularity spread among multiple domains in order to have any positive effects, as opposed to simply promoting and enhancing a single website with a single domain name. Many companies and organizations buy extra domain names that are common misspellings of their preferred domain name, or might otherwise be mistyped by users or misused by competitors. In those situations, of course, the best practice is still to install 301 redirects back to the primary domain name.
But if you have a compelling reason to use multiple domain names, such as using country-specific Top-Level Domains (i.e., TLDs like ".uk" or ".au") for their search engine ranking advantage in terms of geo-location factors, you need to take steps to avoid problems by making sure that there is very little duplication among the sites you operate. Using a different page design can help, but it's the actual text within each website that needs to be as unique as practical considerations allow. Google has recently posted an article on Multi-Regional Sites that discusses this issue in great detail.
If vital information is on the company's main website and it can't be rewritten for one reason or another, then you should use one of the following alternatives:
  • Don't duplicate the page on every site. Always link to the page on the main website from the secondary sites/domains.
  • Use the rel="canonical" tag if the page really must remain on the secondary website.
  • Block the duplicate pages on the secondary sites from the search engines with the robots.txt file or by using a robots <meta> tag set to "noindex".
If you already have multiple domain names in use, then you should merge them into a single, primary domain. The solution is to use server control methods to automatically redirect all requests for pages on the secondary domains to the URL on your primary domain. The server must return a "301 Moved Permanently" response code in order for the search engines to properly re-assign the link popularity and update their internal records of the page's true URL. Any other response code returned by your secondary domains will, at best, prevent the link popularity from passing to the primary domain and, at worst, can cause the duplicate content issues to spread to the primary domain and impair your rankings.
Websites running on hosts that use the Apache server software usually have it easiest in this regard, because they can control this problem on their own using the .htaccess control file. Just create a simple text file named ".htaccess" (with no filename extension) and insert the following commands:
# Redirect every request on the duplicate domain to the same path on the main domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?yourduplicatesite\.com$ [NC]
RewriteRule ^(.*)$ http://www.yourmainsite.com/$1 [R=301,L]

Simply replace "yourmainsite.com" in the above code with your primary website's domain name and "yourduplicatesite.com" with the name of your duplicate domain. Websites based on Microsoft's IIS server software will likely need to consult their system administrator for help. Again, be sure the server returns the redirect result code #301 or you're not really repairing it. A code 302 redirect will not do the job properly or reliably. You can check the result code that your server sends using my Server Result Checker.

Duplicate Pages on Sites You Own
Another instance where a webmaster might have duplicate content issues is when they operate separate domains for different countries. Often, there will be information that is important to include on all such sites. For those pages where the content is identical, or nearly so, it is a good idea to use the rel="canonical" tag on all such pages to point to a single, best quality version, telling the search engines to only index this "canonical" version. The syntax is:
<link rel="canonical" href="http://www.example.com/canonical-page.html">
The search engines treat this tag much like a 301 redirect. It prevents duplicate content problems while providing users with local copies of important pages. See my SEO Tips article on the rel="canonical" tag for more information on how this tag works and how to use it.

 

Original post is published here -
Multiple Domain Names & Search Engine Ranking - Rainbo Design SEO Tips

Geo Tag Generator

What are HTML Geo-Tags?

To answer this question, we first should clarify what HTML meta tags are. Well, meta tags are additional labels in the header of an HTML page, containing information that is not visible to the visitor of the web page. Meta tags supplement the web page with metadata designed to be read by machines (e.g., search robots). The most common HTML meta tags are "author", "description" and "keywords". This is what meta tags look like in the HTML code:
<meta name="author" content="Helmut Karger" />
And now let's have a closer look at geo tags. HTML geo tags are meta tags that provide information about the geographical location of the web page. They inform search robots where the web page is located.
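For illustration, a typical set of geo tags might look like this (the region code and coordinates here point roughly at Munich and are only an example):
<meta name="geo.region" content="DE-BY" />
<meta name="geo.placename" content="Munich" />
<meta name="geo.position" content="48.137154;11.576124" />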

What are Geo-Tags for?

According to a survey of The Kelsey Group (Princeton, NJ, USA) from the year 2004, 25% of all commercial online searches are local searches. It understandably makes little sense to look for a baker and find one who has his shop in a completely different town. Today search engines are already able to handle regional search requests, gathering city and street names from the text content of the explored web pages. Geo tags will simplify this task since they provide machine-readable information about country, region and exact latitude/longitude coordinates.

Creating Geo-Tags

The Geo Tag Generator will help you to create your own geo tags in a simple interactive way without having to deal with latitude or longitude degrees or the syntax of meta tags.
The easiest way to start is to use the address search function above the map window. As of March 2007, street-level search is available for the following countries: Australia, Austria, Canada, France, Germany, Italy, Japan, Netherlands, New Zealand, Portugal, Spain, Sweden, Switzerland and the United States. If there is no result for your complete address, then try the combination "city, country" or only the country name.
After a successful address search, many of the fields listed below should already be filled in correctly, but you may modify them if you want to. At the same time, the map window shows the found place. Zoom in (+) and switch to the satellite view to see more detail, and use the mouse to drag the marker to the accurate position (or just click at a new position).
To protect your privacy, you have the possibility to reduce the accuracy of the latitude and longitude values by reducing the number of digits after the decimal point. Just select your desired degree of accuracy - the default value is maximum precision.
The "Dublin Core Title Tag" is not a html geo tag. However it will be read out by database sites that store geo-referenced websites in their index. Optionally you may have generated this tag here, if you want to. The "DC.title" meta tag is nothing else but the title of your webpage which you already may have included in a "title" statement in the header section of your page. Let the generator create an empty "DC.title" tag, if you want to fill it with a copy of your title statement later in your html editor.

Transfer Geocode into your Html Page

With the input fields completed and the marker at the right position, click into the code area. The entire code is now marked and you can copy it to the clipboard (right mouse button and copy, or Ctrl-C). Then open your web page in your HTML editor and paste the code into the header section (between <head> and </head>). This can be done via the menu with "edit" and "paste" or with a simple Ctrl-V. If you are using a WYSIWYG HTML editor, make sure to switch to "source code view" before pasting, otherwise you won't get the header area displayed. If you're using a content management system (CMS) or blogging software, then insert the geo tag code into the page template.
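After pasting, the header section of your page should end up looking something like this (a sketch using the example Munich values from above; your generator output will differ):
<head>
<title>Your Page Title</title>
<meta name="geo.region" content="DE-BY" />
<meta name="geo.placename" content="Munich" />
<meta name="geo.position" content="48.137154;11.576124" />
<meta name="ICBM" content="48.137154, 11.576124" />
</head>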

Mark your Web Page as "Geocoded"

As already mentioned, geo tags, like all HTML meta tags, are invisible to your website visitors. If you want to show that your web page is geocoded, www.geo-tag.de provides three different icons to embed into your HTML page. Simply insert the corresponding code fragment into the body section of your web page. If you prefer to store the image file on your own server, feel free to copy it, but don't forget to adapt the HTML code (img src=...) correspondingly.

Have a look at this site -

Geo Tag Generator

Make Your Website Scream Local - 5 Ways to Make Local Friendly

Search engines have turned into one of the primary ways that people find products and services right in their hometown. This rising reality significantly increases the need for small local business owners to master this thing called local search.
There are many ways to make your website pages much more localized. This is one of the underlying elements that tell the search engines that yours is indeed a local business.

There are a number of things that website owners can do offsite, such as social media participation, that help them come up when people look for local goods and services, but the first step is to make sure that the content on your own site is local focused.
Below are five ways to make your website more local friendly.
Geo content
Simply adding geographic content to your web pages is one of the first steps. This can include your physical address, directions with street and town names, maps, suburb names, and the names of communities or neighborhoods where you do work.
It's also a great idea to do keyword research with local terms, using the Google Keyword Tool or Wordtracker, to find the best localized phrases to add to your pages.
Geo meta tags are also something worth investigating. While Google continues to ignore them, Bing has admitted they use them to help determine business location. These tags go in the head section of a page and list the latitude and longitude of a business as well as city, state and country.
The tags for my business are:
<meta name="geo.region" content="US-MO" />
<meta name="geo.placename" content="Kansas City" />
<meta name="geo.position" content="39.040409;-94.598657" />
<meta name="ICBM" content="39.040409, -94.598657" />
Here’s a great Geo Meta Tag tool that will create these for your business address

Internal Links and External Anchor Text
One of the ways that you can enhance the local nature of your onsite and offsite content is to add local keywords to the internal links on your pages (links that send people to other pages within your site). So a remodeling contractor showcasing kitchen and bath projects located in San Diego would have links to the project pages that read San Diego Kitchens and San Diego Baths rather than simply Baths or Kitchens.
You also want to add local keywords to the text used to link back to your site from places like LinkedIn or article directories. So if you're an attorney in Texas, rather than using your URL or firm name in a link, you might use Dallas Texas Bankruptcy Attorney as the words, or anchor text, for a link to your site.
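As a sketch of both cases (all URLs and business names are placeholders):
An internal link on the remodeler's own site:
<a href="http://example-remodeler.com/projects/kitchens/">San Diego Kitchens</a>
An inbound link placed on a profile or directory page:
<a href="http://example-lawfirm.com/">Dallas Texas Bankruptcy Attorney</a>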
Rich Snippets
Google is busy creating some of its own HTML markup to help it find and display local content. By using what are known as rich snippets, you can help Google find geographic information, information about people in your business, and reviews of products and services.
Beyond improving the presentation of your pages in search results, rich snippets also help users find your website when it references a local place. By using structured markup to describe a business or organization mentioned on your page, you not only improve the Web by making it easier to recognize references to specific places but also help Google surface your site in local search results.
Here’s a good tutorial for Rich Snippets and Google’s explanation of Rich Snippets for Local Search
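As an illustration, here is a minimal hCard-style address block, one of the structured formats Google's rich snippets documentation covered at the time (the business name and address are made up):
<div class="vcard">
  <span class="fn org">Example Shoe Store</span>
  <div class="adr">
    <span class="street-address">123 Main Street</span>,
    <span class="locality">Kansas City</span>,
    <span class="region">MO</span>
  </div>
</div>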
Community Resource
It's become an extremely good idea to add a blog, or even to use blog software to run your entire site. This format gives more flexibility when it comes to adding pages and content.
Many businesses can create tremendous local content by adding features such as an events calendar or coverage of local happenings around town. It’s not too hard to find an angle that is relevant to your business, interests or industry and then use it as a vehicle for producing local content.
If you partner with local non-profits you might consider giving them coverage on your site.
Local Contributors
One great marketing strategy is to develop a team of local strategic partners: other businesses that serve your same market. These partners should be looked at as a great source of potent local content.
Invite each member of your team to contribute content to your blog.  Create video interviews with team members and add directory pages with full local descriptions and ask that they link to these pages with local anchor text.
Find relevant local bloggers using a tool like placeblogger to exchange links and content with.
Don’t forget to get your customers in the act too. Create video success stories and describe the local nature of these customers.
Take a little time each week to knock out one of these tips, and in a little over a month your local site overhaul will be paying dividends.

Read Original Post Here -
5 Ways to Make Your Website Scream Local :: Small Business Marketing Blog from Duct Tape Marketing

January 26, 2011

What is Post-Click Marketing Heuristic?

Post-click marketing is a big umbrella. There are many concepts under it, intricately interrelated to each other. It can be challenging to picture the entire ecosystem. But let’s give it a try!
Here is a proposed "post-click marketing heuristic" that offers a guided tour through the creation and optimization of post-click experiences.
A Post-Click Marketing Heuristic
Post-click marketing starts—in the upper left corner that says START HERE—by subdividing a market into distinct audience segments. For this article, we won’t delve into pre-click marketing: how to place the right ads and distribute links to reach your audience. Instead, those sources of traffic will be our starting point.

Great Post-click Marketing Starts With Context

To create effective post-click marketing, you need to start with the context of your audience. Based on where someone clicks from — a particular ad, keyword, email, or socially distributed link — or what you may already know about them if they’ve visited you before, you want to answer five questions:
  • Who are these respondents? Do they have well-defined identities?
  • What “job” are they looking to “hire” a product or service to do?
  • In B2B/complex sales, what “roles” are participating in the decision?
  • What likely stage(s) in the buying process are they at?
  • Any special “in the moment” context, such as campaigns, seasons or current events?
You may not be able to answer these definitively, but the better sense you have of who your respondents are and the context in which they've arrived on your landing page, the more effective your post-click marketing will be.
To help answer these questions, recommended reading includes What Customers Want from Your Products by Clay Christensen to learn more about marketing to the “jobs” that your audience looks to “hire.” Read The BuyerSphere Project by Gord Hotchkiss to learn about marketing to different roles at different stages, especially in B2B. And read Real-Time Marketing & PR by David Meerman Scott to appreciate the power of “in the moment” marketing.

Great Post-click Marketing Delivers Great Content

Two questions should drive development of content on your landing page:
  • What do these respondents want or expect in this context?
  • After that immediate expectation is fulfilled, what is the next step?
It’s that “next step” that is often the true goal of the respondent — if not today, eventually. For example, a prospect may respond to an offer for a webinar on digital HR best practices. Access to the webinar is what they expect when they click through. But eventually, their goal is to implement those best practices.
You want to pursue purpose-driven design with that goal in mind. This informs which kind of flow and format your post-click experience should be organized around, such as:
  • A single page with an immediate offer
  • A multi-step conversion path to guide and segment
  • A microsite to encourage focused learning and exploration
  • “Romance” pages to segue people into a store or website
The most appropriate kind of navigation is often suggested by this format choice — full navigation, a more limited contextual navigation, or implicit navigation made by choices in the main page content.
If you offer visitors segmentation choices, carefully consider the axis of those choices: by “job” (per Christensen), by role, by stage, etc. Remember the MECE principle, and read Sheena S. Iyengar’s The Art of Choosing to make those choices as “frictionless” as possible.
The majority of your effort in building a post-click experience should go into crafting persuasive content. Often, many of the lessons of content marketing can be adapted on conversion-oriented pages. Such content should be relevant, engaging and authoritative (see the READY framework). It should be differentiated from the competition and reach people on three levels:
  • Rational and logical
  • Emotional and intuitive
  • Social—including social proof
Persuasive content should culminate in an offer — the “next step” you want visitors to take. The components of an offer include the proposition itself, which should be meaningful and timely, but also how it is presented: its position and layout, imagery, copy, and increasingly interaction features. Interaction can be nurtured with video, social features, or app-like experiences, such as a “configurator.”

Great Post-click Marketing Follows Through On Conversion

The moment of truth in an offer (where content turns into conversion) is the call-to-action. This is appropriately a key focus of conversion optimization. Forms should be considered carefully for length, relevance and ease of use. Progressive conversion (collecting information from the visitor across two or more pages) may be an effective engagement model to employ as well.
For the call-to-action itself, you should weigh several different aspects of its presentation:
  • Position
  • Size
  • Color
  • Wording
  • Button design
This is the nitty-gritty of conversion optimization, but it can make a tremendous difference. (Check out WhichTestWon.com to see examples of just how significant small changes here can be — you’ll be surprised and amazed.)
But post-click marketing isn't just about winning the conversion. It's about delivering upon the promise of that conversion to the customer. In e-commerce, the shopping cart and the checkout must facilitate an intended conversion through to receipt. Fulfillment delivered via email or video or as a downloadable file must be fast and work properly for all recipients.
Behind the scenes, you should efficiently incorporate that conversion event into the rest of your marketing operations machinery (e.g., CRMs, marketing automation platforms, business intelligence systems).
And as Sandra Niehaus wrote in an insightful column last year, don’t forget about the “thank you” page. It’s a terrific (and often overlooked) opportunity to build that budding relationship—and build your brand. It may even lead to an additional follow-up conversion.

Great Post-click Marketing Is Driven By Metrics & Testing

Finally, post-click marketing goes beyond optimizing a single experience. It’s about developing an engine for continuous production and improvement. To accomplish this, great post-click marketers embrace agile marketing practices.
Performance metrics are the dashboard for this engine: bounce rate, engagement, conversion rate, cost-per-acquisition (CPA), average order value (AOV), lifetime value (LTV), return on advertising spend (ROAS), and overall return on investment (ROI). Often the real value here is breaking down these indicators by segment—which in turn inspires new post-click experiences.
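As a quick numeric sketch of how a few of these metrics relate (all figures are invented, and ROI here counts only ad spend as cost, a simplification):
CPA = ad spend / conversions, e.g. $500 / 25 = $20
AOV = revenue / orders, e.g. $2,000 / 25 = $80
ROAS = revenue / ad spend, e.g. $2,000 / $500 = 4.0 (400%)
ROI = (revenue - cost) / cost, e.g. ($2,000 - $500) / $500 = 3.0 (300%)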
Correspondingly, testing — A/B tests for big ideas and multivariate testing (MVT) for more subtle optimizations — is the fuel for post-click marketing. As Bryan Eisenberg, John Quarto-vonTivadar and Lisa Davis advocate in their classic book of the same title: Always Be Testing.
And so post-click marketing continues in perpetual motion.


Original post was published here -
A Post-Click Marketing Heuristic

What is Link Spike?

A link spike is a sudden jump in the rate of backlink growth to your site, followed by a sharp decrease in that rate of growth. They say that a picture paints a thousand words, and in this case it really does. Illustrating a link spike is better than trying to explain it through pages of text, especially since the name "link spike" comes from the illustrated version of the figures:
Link Spike
Graph 1
We can see from graph 1 that the rate of growth of backlinks to this site is steadily growing, but in week 11 the rate of link growth shoots up, nearly trebling the growth rate, and then quickly subsiding to the usual rate of growth.
An example to help understand the link spike in graph 1 is: your site usually naturally accumulates about 50 links per week in week 1, but the rate of accumulation increases by 5 links per week; so in week 2 the site is accumulating around 55 links per week, and in week 3 it is accumulating 60 links per week, and in week four it is accumulating around 65 links a week…. and so on. But in week 11 you decide to do a lot of link building and because of this your site is accumulating around 270 links a week. Then in week 12 you stop your link building and the natural link accumulation rate returns. This could be how you get the spike in graph 1.
It is worth pointing out in graph 1 that this is not a graph showing the number of links to the site, but a graph showing the rate of growth of the number of links. The graph of the link count is more likely to look something like this:
Link Spike image 2
Graph 2
We can see in graph 2 that our "link spike" looks a lot less like a "spike" using the "link count" illustration.
So, now that we understand what a link spike is, and where the term comes from let’s discuss what effect it will have on your website, and why.
Well, to cut a long story short, a link spike is extremely likely to have Google penalize your site, and maybe even exclude it from its listings. Manual reviews happen on a daily basis. Generating a large number of inbound links over a short period is extremely rare but does happen (e.g., through excellent link bait or viral marketing), in which case a manual reviewer will not penalize you at all; however, if the inbound links generated are of poor quality, you will be penalized. Fighting the Google algorithm is a lot like Star Wars: Dark Side vs. Light Side. Sure, the Dark Side will win at first, but who always prevails? The Light Side. Black Hat or White Hat?
Why, you may ask? Well, back in the early days of Google, when the search engine began to put a lot of weight on the number of inbound links (backlinks) to a website in its search results, unethical spammers began to gather masses of backlinks using unsavory methods to push their spam sites high in the listings for popular key phrases. Google then began to change its algorithm to provide better-quality results for its users.
Some of the changes that Google made were to reduce or nullify the benefit given by links from spammy websites, and to qualify the importance of backlinks rather than merely quantify it, i.e. it was no longer just about the number of backlinks a site has, but also the quality of those links (hence the famous Google PageRank was born).
The other thing they did to prevent link spam was to flag a sudden build-up of links to a website. Google will see this sudden, quick build-up of links (the link spike) as someone trying to influence the rankings by building a mass of links to their site, rather than the natural accumulation of links a website gets without any interference.
So Google’s algorithm will now automatically flag a link spike as a possible problem with a website, which could later be combined with another problem to warrant a penalty; or the link spike alone may mean the website automatically incurs a penalty.
What do I mean by a penalty? A penalty may be Google pushing your website down the listings for your key phrases, so that it moves from 8th, on the first page, to 26th, on the 3rd page. A stiffer penalty is a ban, correctly termed an “exclusion from Google’s SERPs” (Search Engine Results Pages). This means your website will not be returned in Google’s listings for any search terms.
When can a link spike be OK? There are times when link spikes go unpunished. This is usually when a website is heavily referred to in the news. An example could be a website about tsunamis. This site would have had a small link growth rate until Boxing Day 2004, when the disastrous giant tsunami hit Asia. The site would then see its link growth hit the roof for a couple of weeks, until it returned to normal towards the end of January. This would produce a massive link spike, but because the tsunami was a major occurrence in the news, Google would realize it was a hot topic and so not penalize that site.

Read More Here -

Link Spike | What is a link spike? | Link Building Spike

How to measure Facebook Influence?

Appraising our social media influence is a complicated task. Raw numbers alone don’t mean much; it’s our “action potential” that really matters on networks. And while that much is obvious, once you dive into social networks, with all the complications and types of networks involved, measuring one’s potential or reach on Facebook, or any other social network for that matter, is difficult.

However, there are certain obvious clues that tell us a lot about a person’s social media/Facebook influence.

1. Comments to Time ratio – Measures Influence

If you’re getting a lot of comments on your status updates, it clearly shows that people want to listen to you. But it doesn’t necessarily mean that you’re influential. If you stack up the number of comments against time, though, you probably get a better metric that brings clarity to how influential you are. More comments in less time probably means you’re a celebrity, and anything less than that is graded accordingly by the number of comments and the time taken.
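
A minimal sketch of this idea, with hypothetical timestamps: comments divided by the hours elapsed since the update went up.

```python
# Minimal sketch of the comments-to-time ratio: comments per hour on a
# single status update. All timestamps below are hypothetical.
from datetime import datetime

def comments_per_hour(posted_at, comment_times):
    """Comments divided by hours elapsed up to the latest comment."""
    hours = (max(comment_times) - posted_at).total_seconds() / 3600
    return len(comment_times) / hours if hours else float("inf")

posted = datetime(2011, 1, 31, 9, 0)
comments = [datetime(2011, 1, 31, 9, m) for m in (5, 12, 20, 33, 41)]
print(f"{comments_per_hour(posted, comments):.1f} comments/hour")  # ~7.3
```
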

2. Likes to posting frequency – Measures Reach

This is probably the most obvious metric. The more the likes, the bigger your reach. But is that all? Probably not. If you match the posting frequency and number of posts against the number of likes for each, you get better clarity on the person’s consistency and reach. While this metric can be unfair, it is a good measure of the person’s interactivity with the medium. For example, someone who posts updates daily and gets a high number of likes is likely to be more popular and consistent than a celebrity who posts once a month while getting a large number of likes.
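
Here is one hedged way this could be scored, assuming hypothetical figures: likes per post weighted by posting frequency, so consistency counts alongside raw likes.

```python
# Minimal sketch: average likes per post weighted by posting frequency,
# so a daily poster isn't compared raw against a once-a-month celebrity.

def engagement_score(total_likes, posts, days_observed):
    posts_per_day = posts / days_observed
    likes_per_post = total_likes / posts
    return likes_per_post * posts_per_day  # likes earned per day of activity

daily_poster = engagement_score(total_likes=900, posts=30, days_observed=30)
monthly_celeb = engagement_score(total_likes=500, posts=1, days_observed=30)
print(daily_poster, monthly_celeb)  # 30.0 vs ~16.7: consistency counts
```
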

3. Network growth rate – Measures Quality

If you measure a decent sample size (let’s say 1,000) of consistent Facebook addicts, it wouldn’t be difficult to come up with an average “Network Growth Rate”, would it?
I’d define the average network growth rate as the average number of people a Facebook user adds to his network every month.
With that average as the benchmark, it makes sense to measure one’s popularity by stacking one’s own network growth rate against it: the bigger, the better. But again, there will be discrepancies. One thing I wish we could measure is whether friends were added by the person himself or via invitations received from others; the latter makes sense when measuring popularity, not the former.
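
A small sketch of the benchmark idea, with made-up sample figures: compute the average monthly additions over a sample of users, then stack one user's additions against it.

```python
# Minimal sketch: a user's monthly friend additions stacked against a
# benchmark average computed over a sample of users. All hypothetical.

def avg_network_growth(sample_monthly_adds):
    return sum(sample_monthly_adds) / len(sample_monthly_adds)

sample = [4, 7, 5, 9, 6, 5, 8, 4]        # monthly adds for sampled users
benchmark = avg_network_growth(sample)   # = 6.0
user_adds = 15                           # this user's adds last month
print(f"{user_adds / benchmark:.1f}x the average growth rate")  # 2.5x
```
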

4. Interactions from outside network – Measures popularity

Imagine a Facebook user and his sphere of friends as his immediate network, and the rest of Facebook as his “outside network”. There are a lot of folks who remain within their immediate network and are incredibly popular. But I think it makes sense to figure out how many “outside network” interactions a person receives, in the form of wall posts, friend invites or tagging; that gives clues on whether the person is “heard” in other circles.
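
A minimal sketch of this last metric, with hypothetical actor and friend lists: the share of interactions whose actor falls outside the immediate network.

```python
# Minimal sketch: the share of interactions (wall posts, invites, tags)
# that come from outside the user's immediate friend network.

def outside_share(interactions, friends):
    """interactions: list of actor ids; friends: set of friend ids."""
    outside = sum(1 for actor in interactions if actor not in friends)
    return outside / len(interactions)

friends = {"ann", "bob", "cam"}
actors = ["ann", "dee", "bob", "eve", "fay", "cam", "dee"]
print(f"{outside_share(actors, friends):.0%} from outside the network")  # 57%
```
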

Is it realistic to measure Facebook influence?

Probably not. However, it is possible to compare users on their influence and reach on social media. Grading each user against a standard probably cannot be error-free, but comparing metrics between two or more users is easier. Having said that, the above metrics are just hints at how influence can be measured, not definitive rules.
There are a lot of metrics like the above that can be used to measure influence on a social network, and one tool that does it right is Klout. They have recently launched their Facebook influence tool, which measures your presence on Facebook with some in-depth metrics like the ones mentioned above.

Read the original post here -
How to measure Facebook Influence ?

10 Social Media Monitoring Tools for Measuring Social Media Analytics

Many brands are now jumping onto the social media bandwagon and experimenting with things; that’s a fact. Many times, knowingly or unknowingly, we have engaged with them as well. So it makes sense for them to monitor it and see what the results are like.
Monitoring social media is not the regular analytics stuff; that’s where many go wrong.

- Social Media shouldn’t be measured like SEO analytics on your site.

Heck, with social media you won’t even see referrals as good as from an SEM campaign. The thing is, in social media the action is elsewhere and does not necessarily reflect in your site’s referral stats all the time. Now, don’t get me wrong: I’m not saying that if you have something to sell, it won’t sell at the end of the day. It might well sell, just that the numbers might not look as neat as in an SEO analytics report.

- Social Media measurement has to do with many “intangible” signals.

Now, this is a grey area, but measuring social media (at least at this point in time) has a lot to do with “intangible” or “volatile” signals, like the “feel-good factor” for a brand, which can be perceived if you ask someone about it but never shows up in the referral stats.

- Social Media monitoring is probably a better term than Social Media measurement.

Why? Because social media isn’t always about numbers and formulas. It’s more about “perception”, “engagement” and “emotions”.
Phew! I hope all that made sense. Now, on to some social media monitoring tools.

Honestly, many of these tools are “not really there” yet. But these are the ones we have anyway. They mostly focus on the numbers, unfortunately, while some attention is also given to brand perceptions and engagement ratios. I’d like to see more tools come up that focus on the latter rather than the numbers. But that requires some level of intelligence, and I guess we’re all in learning mode at the moment.

1. Radian6 – Social Media Monitoring Tool

Radian6 is a popular social media monitoring tool that helps you track quite a lot of signals and get insights into your brand’s performance on various social media channels. It covers almost all the social media channels, like blogs, Twitter and Facebook, and provides detailed analytics reports and charts, perfect for that presentation you want to pull off.

2. Alterian Social Media Monitoring Tools

Alterian’s SM2 social media monitoring tool gives advanced user-behavior statistics; it measures and analyzes daily volume, demographics, location, positive or negative content tone, themes, and trending topics for your brand/product. A little too much detail, perhaps, but a good tool nevertheless. This one, too, covers a range of social media channels like Digg/LinkedIn/Facebook/Twitter/Delicious etc.

3. ScoutLabs Web based Social Media Monitoring Tool

This one is eye candy. The Scout Labs social media monitoring tool is web based and, with an interface like Google Analytics, it tracks almost all the online social media channels. It measures the negative/positive signals and gives you reports on the overall performance.

4. Real Time Social Media Monitoring Tool – Self Service

This self-service analytics software supports both qualitative and quantitative research, providing automated semantic analytics that let you explore insights and opinions about your products and brands on social media. It claims the platform automatically captures large, relevant data sets through topic categorization.

5. Social Mention – Web based social media monitoring

Social Mention is a web-based tool that keeps it simple. It finds brand mentions on any particular channel, like blogs or microblogs, and gives you a comprehensive idea of how the brand is perceived by users. It also gives you an idea of the general “perception” of the brand in terms of positive, negative or neutral.

6. Brands Eye – Web based Social Media Measuring tool

This one is a bit different from the rest of the lot. It picks up signals from social media mentions and, unlike the others, puts them into a different perspective. It finds you details about reputation, media origin and the sentiment associated with your brand. Impressive!

7. Trendrr – Real Time Social Media Monitoring tool

Trendrr is a very Web 3.0 kind of app that does real-time digital/social media monitoring for you. It analyzes social media channels for your brand mentions and puts them into perspective with numbers. Its data sources include everything from blogs to microblogs, search, social networks and even video.

8. Spark – Social Media Monitoring Visualizer tool

Spiral16’s Spark takes social media analytics to a higher level by making it visual, and thus easy to interpret and compare. They claim their data is not the usual kind: it is not limited to RSS feeds as a data source, but instead combines proprietary crawler technology with public search engines. Spark captures a wider breadth of data and goes into more detail about web pages, resulting in less spam and more relevant results.

9. MAP – Social Media Analytics Tool

Sysomos’s MAP (Media Analytics Program) is another social media analytics program that focuses on the core loop of social media: listen > measure > understand > engage. The tool provides real-time intelligence for managing products, brands and reputations on social media.

10. Attentio – Social Media Trends / Analytics Tool

Basically, what Attentio does is collect information related to brands and products from all social media channels, then assimilate and study it to give you structured data. Its advantage is that it is multilingual and has more of a brand-centric focus.

11. DNA 13 Media Monitoring Tool

DNA 13’s media monitoring tool focuses more on a comprehensive media-coverage model, much like a PR tool. While I’m not a big fan of this model, the tool surely gives a lot of insight into how and why a brand is performing well or badly across media channels. They claim to provide TV, print, online news, social media and RSS feeds all in one location, an “umbrella” strategy, which is interesting.

So, which is the best Social Media Monitoring Tool out there?

There is no single best social media monitoring tool. You have here a bunch of the best ones out there. As I said, most of them do the same job of indexing and finding your brand in discussions, but the ones that excel do a good job of finding value in that analytics data and putting things into perspective. The idea is to mix and match and make use of the best ones from the lot.

Read the original post here -

10 Social Media Monitoring Tools for Measuring Social Media Analytics
