Part Two of Ten: Competitor Analysis

Welcome to part two in this ten-part SEO series. The ten parts of the SEO process we will be covering are:

  1. Keyword Research & Selection
  2. Competition Analysis
  3. Site Structure
  4. Content Optimization
  5. Link Building
  6. Social Media
  7. PPC
  8. Statistics Analysis
  9. Conversion Optimization
  10. Keeping It Up

What is a Competitor Analysis?

Have you ever wondered how a particular competitor always does so much better than you do in the search engines or online overall? A competitor analysis is one very effective method of deconstructing their online marketing strategy to discover how they are doing so well.

What Exactly Can a Competitor Analysis Reveal?

This is a very common question because many site owners don't realize the lengths to which a competitor may have gone to obtain top rankings. The following are some of the discoveries I have made in typical competitor analyses:

  • By examining a competitor's link structure I found that many of the links with the most credibility came from websites the competitor actually owned. (Determining the ownership of the domain names required some sleuthing because the whois information was 'private', but ultimately the info became available.) In a couple of cases several of these domains had legitimate websites, and this prompted some great ideas for my client to attract more traffic.
  • While researching a competitor I noticed that although the competitor's website was very similar to my client's, there was one major difference: the competitor's website structure was far better optimized. By outlining the structure the competitor used and improving on it with my own expertise, I gave our client the information he needed to apply changes to his own site.
  • In another instance I provided a client with a list of all the pay-per-click and organic keywords each competitor was currently using. The client was flabbergasted when she realized just how many keywords she had missed promoting for her own comparable services.

The Basics of Conducting Your Own Competitor Analysis

Now that you have seen some examples of what can be gleaned from a competitor analysis, you might want to conduct one of your own. For the purpose of this tutorial I am assuming that you are fairly new to SEO, so I have created a basic plan that works for most users; even this, however, will require a little preparatory reading on the fundamentals.

Plenty of free SEO tutorials are available online if you find yourself needing more information. The following is an outline of the most revealing steps requiring the least amount of technical expertise. Please keep in mind that the objective of this competitor analysis is to compare what you find to your own website later on. What you find may not seem earth-shattering (or it might), but this analysis is meant to show you what you might be missing:

Competitor Walkthrough

Grab a piece of paper and a pen, and while you walk through your competitor's website look for any particularly obvious search engine optimization techniques. Here are some elements you should check (a rough script that automates several of these checks follows the list):

  • Does the title tag appear well written, and if so, is there a common syntax used throughout the website?
  • Look at the source code of the home page and search for "H1", "H2" or "H3". Do any of these tags show up? If so, the competitor is using heading tags within the page. Now try identifying the text they used in the heading; you will likely find the competitor's keyphrase within the tag.
  • Check whether the navigation is search engine friendly. Sometimes the navigation is a drop-down menu; make sure it is a type that is search engine friendly. If not, check the footer of the page to see if a text menu is placed there.
  • Keep an eye out for a pattern of keywords being used in text links. Certain words are likely to appear more often, and these are likely some of the target phrases your competitor has decided to focus on.
  • Look for nofollow tags. Nofollow tags are often used to channel PageRank efficiently throughout a website. This is called a themed structure and it can have incredible ranking benefits. If you see a pattern of nofollow tag use, you can be relatively certain your competitor has/had a well-informed SEO firm on hire.
  • While you roam through the site, look for pages that have a particularly high Google PageRank and try to identify why. In most cases these pages have information that visitors decided to link to. Perhaps this will give you some ideas for creating similarly high-quality content for your website.
  • Check the site for the presence of an XML sitemap. Usually it will reside at the root of the website, so try typing in the basic URL of the competitor's website and adding (minus the quotes) "/sitemap.xml" on the end. The details within the sitemap might be a little confusing to you, but just knowing that the competitor has one is noteworthy.
  • Have you found any incidences of spam throughout the site? Take note: I have lost count of how many competitors have succeeded using shady tactics. This doesn't mean you should copy them, but it may at least give you yet another indication of what helped the competitor attain rankings. Believe me, in most cases these sites will get caught with their hands in the cookie jar, at which point you won't want to be associated with the same tactics.
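
For those comfortable with a little scripting, several of these checks can be automated. The following is a minimal sketch only, using Python with the third-party requests and beautifulsoup4 packages; the competitor URL is a placeholder, and the checks simply mirror the list above rather than any official audit tool.

```python
import requests
from bs4 import BeautifulSoup

# Placeholder URL -- substitute the competitor site you are walking through.
URL = "http://www.example-competitor.com/"

html = requests.get(URL, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Title tag: present and well written? Compare its syntax across pages.
title = soup.title.string.strip() if soup.title and soup.title.string else "(none)"
print("Title:", title)

# Heading tags: the competitor's keyphrase often appears in h1-h3.
for level in ("h1", "h2", "h3"):
    for tag in soup.find_all(level):
        print(f"{level}: {tag.get_text(strip=True)}")

# Text-link keywords: repeated anchor text hints at the target phrases.
anchors = [a.get_text(strip=True).lower() for a in soup.find_all("a")]
repeated = sorted(set(anchors), key=anchors.count, reverse=True)[:5]
print("Most repeated anchor texts:", repeated)

# nofollow pattern: may indicate deliberate PageRank channeling.
nofollow = soup.find_all("a", rel="nofollow")
print(f"{len(nofollow)} nofollow links on this page")

# XML sitemap: usually sits at the root of the site.
status = requests.get(URL.rstrip("/") + "/sitemap.xml", timeout=10).status_code
print("sitemap.xml present" if status == 200 else "no sitemap.xml at the root")
```

A pen and paper still win for judgment calls like spotting spammy tactics; the script simply saves you some view-source digging.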

I can’t possibly list everything you need to keep an eye out for when walking through a competitor’s website; at least not in an article format. Just keep an eye out for anything that looks particularly purposeful in the site and linking structure as well as the content of the website. If you find something you can’t be sure is worth noting, then try researching it online; chances are someone has written about the topic/concept or can provide you advice in a forum.

Backlink Analysis

This portion of the analysis will require that you use one of the following link analysis tools: OptiLink (not free but my first choice) or Backlink Analyzer from SEO Book (free). In each case these tools have excellent help files that I suggest reading in order to get the best results from the data they generate.

In this particular stage you are going to use your new tool to analyze the first 1000 backlinks of your competitor’s domain.

Program Setup Note: Be certain to set up the software to acquire Google PageRank and Alexa Rank information for each backlink and to filter out any rel=nofollow links. These settings are easily found on the front of both applications, with the exception of the rel=nofollow filter, which is an option in OptiLink but automatically enabled in Backlink Analyzer.

When the report is complete, sort the backlinks first by PageRank and then by Alexa Rank, and examine each sorting separately.

Why Are Both PageRank and Alexa Rank Used?

The reason both are used is that each has notable advantages and disadvantages. PageRank is notoriously unreliable, especially lately, since Google now penalizes the PageRank of any site with any relation to link buying; as a result, a quality site with low PR could be missed. Alexa Rank, meanwhile, is a decent indicator of a site's popularity, but I can't rely on it alone since it is not an established indicator of how well a site is regarded by Google. Between the two stats, however, we can glean a good indication of which sites are the most reputable targets for link building.

Creating a List of Authority Competitor Backlinks

Using Excel or another spreadsheet application, copy and paste the data you received from OptiLink or Backlink Analyzer into a worksheet. Then duplicate the sheet so that you have two identical copies of the data. Now follow these steps (a scripted equivalent is sketched after them):

  1. On the first worksheet, sort the data by Google PageRank (PR) from highest to lowest. Now remove all of the pages with a PageRank of less than 4 so you are left with the best sites according to this data, or simply move the lower-PageRank pages aside so they don't get in the way.
  2. On the second worksheet, sort the data first by Alexa Ranking (lowest to highest) and then do a secondary sort by Google PageRank (highest to lowest). Delete all sites that have no Alexa Ranking ("nm" is how it shows in OptiLink) or otherwise partition them from your more valuable data.
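
If you would rather script these two sorts than click through menus, here is a minimal sketch using Python with the pandas package. The CSV file name and column names are hypothetical stand-ins; adjust them to match whatever your link analysis tool actually exports.

```python
import pandas as pd

# Hypothetical export from OptiLink or Backlink Analyzer saved as CSV.
links = pd.read_csv("competitor_backlinks.csv")  # columns: url, pagerank, alexa_rank

# Worksheet 1: keep pages with a PageRank of 4 or better, strongest first.
by_pr = (links[links["pagerank"] >= 4]
         .sort_values("pagerank", ascending=False))

# Worksheet 2: drop rows with no Alexa Ranking ("nm" in OptiLink exports),
# then sort by Alexa Rank (lower is better) and PageRank (higher is better).
with_alexa = links[links["alexa_rank"] != "nm"].copy()
with_alexa["alexa_rank"] = with_alexa["alexa_rank"].astype(int)
by_alexa = with_alexa.sort_values(["alexa_rank", "pagerank"],
                                  ascending=[True, False])

by_pr.to_csv("authority_by_pagerank.csv", index=False)
by_alexa.to_csv("authority_by_alexa.csv", index=False)
```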

Now you have two excellent worksheets that provide lists of authority pages that have links pointing to your competitor.

How to Use the Backlink Data

Take some time now to filter the links by domain and you will see just how many links per domain each competitor has. If one domain links to your competitor many times, it is usually because the competitor either owns that domain or has purchased a link on it. To find out whether your competitor owns the domain, try running a Whois lookup on it.
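
If you have many domains to check, the lookup can be scripted too. This is a minimal sketch that assumes the standard whois command-line client is installed on your system; the domain is a placeholder.

```python
import subprocess

def whois_summary(domain: str) -> None:
    """Run the system whois client and print registrant-related lines."""
    result = subprocess.run(["whois", domain], capture_output=True, text=True)
    for line in result.stdout.splitlines():
        if any(key in line.lower() for key in ("registrant", "admin", "org")):
            print(line.strip())

# Placeholder domain pulled from your backlink list.
whois_summary("example.com")
```

As noted earlier, privacy-protected records won't yield much this way, but clusters of domains sharing a registrant stand out quickly.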

Also check the link data for how many of the pages listed are from the competitor's own website. If you see a great deal from their own website then you can be relatively assured they have good content, which is important to note; perhaps you need to focus on creating better content on your own website, or on getting others to notice the good content you have.

Now the most logical step is to figure out which links are worth getting for yourself. Chances are a decent number of the links you found are from pages that would be willing to link to you as well.

Don’t Lose Focus on Your Own Website

So now you have a few tools to conduct a cursory competitor analysis. You will likely find some very useful data that you can act on but is this all you need to do? Is a competitor analysis going to be the golden key to increased profits? No. I have a great deal of faith in competitor analysis because I know determining what a competitor is doing successfully can improve a marketing plan dramatically. That said, you also have to pay close attention to your own website and the quality information that can be gained from using free tools like Google Analytics or handy paid tools like ClickTracks Professional.

Using a quality analytics program will allow you to get as granular as monitoring the success of each page on your website, with details such as where visitors came from (a page within your site or an external one), how long visitors stayed on a particular page on average, which keywords led visitors to the page (if any), and much more.

With proper analytics you can actually compare and contrast the effects of minor edits to a page's content; this is called multivariate testing. For example, if you noticed that many visitors were entering at a page deep within your site that was not originally designed as an entry page, you could run tests to see whether a better image or a better tag line improves visitor retention there.
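
The arithmetic behind such a test is simple enough to sketch. The following compares the retention (non-bouncing) rate of the original page against the edited variant and asks whether the difference is statistically meaningful; the visit counts are invented purely for illustration.

```python
from math import sqrt, erf

def retention_test(kept_a: int, total_a: int, kept_b: int, total_b: int):
    """Two-proportion z-test on retained (non-bouncing) visitors."""
    p_a, p_b = kept_a / total_a, kept_b / total_b
    pooled = (kept_a + kept_b) / (total_a + total_b)
    se = sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, p_value

# Invented numbers: variant B adds the better image and tag line.
p_a, p_b, p = retention_test(kept_a=180, total_a=1000, kept_b=230, total_b=1000)
print(f"A retained {p_a:.1%}, B retained {p_b:.1%}, p-value {p:.4f}")
```

A small p-value (conventionally under 0.05) suggests the edit, not chance, changed visitor retention.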

Truly, the sky is the limit with analytics, and it would be irresponsible of me to state that competitor analysis is more important than making your own website run smoothly. Do yourself a favour: if you haven't already got an analytics program running on your site, get it done now, or learn how to use the one you have. It will pay off in the long run, especially when you want to monitor the success of the tactics you applied from your competitor analysis findings.

About the author:

Ross Dunn is the owner of StepForth Web Marketing and an all-round good guy and good SEO.

Next week the topic will be site structure and will be written by Beanstalk author and Director of Optimization, Daryl Quenet. Daryl will of course be on the show with us next Thursday along with some great guests.

SEO news blog post by @ 12:55 pm on February 7, 2008

Categories: SEO Articles

SEO Step One Of Ten: Keyword Research

Back in October 2004 I launched a series of articles outlining the ten crucial steps to a well optimized website. The steps were:

  1. Keyword Selection
  2. Content Creation
  3. Site Structure
  4. Optimization
  5. Internal Linking
  6. Human Testing
  7. Submissions
  8. Link Building
  9. Monitoring
  10. The Extras (all those things that didn’t fit in the first 9 steps)

Well, in case you've been asleep for the last few years or in case you've just recently joined us in the SEO realm, I – along with some of my good friends in the web marketing world – have decided to re-write the series with new information and new perspectives.

The New Series

In our updated series we'll be dropping some of the articles and adding others to account for changes in the industry. Another major change is that we're going to complement the series with a weekly segment on Webmaster Radio's Webcology on Thursday afternoons at 2PM EST, where we'll be conducting interviews and discussing tools with their makers to help our readers and listeners make the most of this information. If you miss the show, you can always download the podcast free of charge afterwards.

The 10 steps covered in this series will be:

  1. Keyword Research & Selection
  2. Competition Analysis
  3. Site Structure
  4. Content Optimization
  5. Link Building
  6. Social Media
  7. PPC
  8. Statistics Analysis
  9. Conversion Optimization
  10. Keeping It Up

Step One: Keyword Research & Selection

There are two times in a site's life when keyword research is conducted: when researching keywords to rank a site in the organic results on the search engines, and when researching keywords for a PPC campaign. In today's article we're going to focus on the former and save the research involved with PPC campaigns for step seven in this series.

So we’ve got the topic down to “just” keyword research and selection for organic SEO campaigns – from there the topic once again gets split into a variety of areas. Those that we will cover here are:

  • The raw data
  • Studying those who’ve gone before
  • Understanding your choices

The Raw Data

The raw data is the estimated number of searches you can expect a phrase to get on the major search engines. There are a number of tools you can use to compile this information. Here are some of the more commonly used:

Overture Keyword Suggestion Tool – Link Removed

Yahoo!'s keyword suggestion tool. It's fast and it's free, but it has some serious drawbacks. The tool often mixes singular, plural and common misspellings into one result, so it can lead you astray (admittedly it's gotten much better lately, but it is still far from perfect).

Is a bed and breakfast in Banff better off targeting "banff accommodation" or "banff accommodations"? How about the very common misspelling "banff accomodations"? That said, the tool is based on easily the largest pool of search data made available in this way, which gives it a huge edge in accuracy.

WordTracker – Link Removed

WordTracker is easily one of the most popular paid keyword research tools. It solves the singular vs. plural vs. misspellings problem; however, the data it draws on comes from a few meta engines and is not as comprehensive as one might like.

They offer a free trial and have options to pay for a single day or up to a year, so they suit everyone from people who simply need a quick round of research on one site to SEO firms that need the tool on a daily basis. It sells for $59/mth.

Keyword Discovery – Link Removed

This tool is very similar to WordTracker in its advantages and disadvantages: better specification of keywords, but a smaller pool of data behind them. I personally prefer Keyword Discovery simply for some of its features and the ability to export data for clients to view easily. Of course, that could well be due to my greater experience with it.

They have a free trial as well and it sells for $69/mth.

Aaron Wall’s Summary

Noted above are some of the most popular tools and the ones I've used the most, but there are other tools definitely worth taking a peek at. Aaron Wall did a great summary on his site of the major tools, their pros and cons, etc. Admittedly it's a couple of years old, so some of the features have changed a bit, but most of it is still valid and accurate.

Now What …

Now that we've looked at the tools, let's take a look at what we're supposed to do with them. As noted, we'll cover how to use these tools when launching or updating a PPC campaign in a future article; however, there are still a few areas and considerations to address here.

So let’s get started …

In case no one told you – size doesn't matter. It's not how big it is, it's who's using it. Let's use as an example a phrase we at Beanstalk targeted: "search engine positioning". At first this was our big phrase; it now gets 7,689 estimated searches/mth (a bit higher than it was back then), while "search engine positioning services" gets a lowly 2,636 searches/mth. Of course we should have been targeting the one with the higher number of searches (or so I thought).

Once we had attained top 3 rankings for both, I started looking through my stats and setting up filters for conversions (forms filled out and visits to our contact page). People who entered "search engine positioning" were certainly interested in our blog and articles, but only those who added the word "services" contacted us. And so the big phrase was abandoned as a target and we began focusing on what I refer to as "buy phrases". Bigger isn't better if the people you want are searching with phrases that have a lower search volume.
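
That filtering boils down to simple arithmetic once your analytics package can attribute conversions to search phrases. A minimal sketch with invented visit and conversion counts (not our actual stats):

```python
# Invented keyword stats: (visits, conversions) per phrase.
stats = {
    "search engine positioning": (2500, 3),
    "search engine positioning services": (400, 22),
}

for phrase, (visits, conversions) in stats.items():
    rate = conversions / visits
    print(f"{phrase!r}: {visits} visits, {conversions} conversions, {rate:.2%}")

# The lower-volume "services" phrase converts far better: a "buy phrase".
```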

There's another time when bigger isn't better. Which of those two phrases do you suppose we ranked for first? If you guessed the services phrase, you're right. When you launch a new website (which we had), you're likely up against sites that have been around for a while, have some solid backlinks and a good number of pages. You're not going to want to go up against them for the top phrases right out of the gate. Choosing phrases that are lower in search volume and lower in competition will almost always result in higher rankings faster, put some money back in your pocket and ready you to go for the bigger phrases.

It's here that the model we followed works well. When you're selecting your short-term and long-term targets, it's wise to choose phrases with the same root keywords ("search engine positioning" and "search engine positioning services", for example). This basically enables you to work towards your long-term goals while link building for your short-term targets. And who doesn't like to kill two birds with one stone? Or perhaps you have all the time in the world and you're one of those people who likes nothing more than working on developing incoming links.

Which brings us to …

Studying Those Who’ve Gone Before

Imitation is the sincerest form of flattery. Let’s just hang onto that thought while we research what those who are successful in your industry are targeting in order to glean some insight into what works.

I've recently discovered (much to my pleasure) a very cool tool that, while a bit pricey for some, simplifies MANY of the processes of keyword research, tracking and competitor keyword dissection. A company called AdGooroo has created what I've found to be an awesome keyword tracking tool (I'd call it keyword research, but it does a lot more than list off search phrases). The tool allows you to do the generic keyword research that we're all used to, with the same limitations as the tools above (i.e. Google doesn't hand out their search keyphrase volumes), but that's just the first step.

They then take a look at your competitors in both the organic and PPC results, figure out what they're ranking for and bidding on, and provide some great reports on saturation levels, competition levels, and a lot more. With this in hand you can then begin to analyze how they're ranking (that'll be covered next week in our article on competition analysis).

The folks at AdGooroo also store historical information so you can look back over trends in the past and compare that to what you see now. As noted, a bit pricey for some but worth it for those who can afford to know this level of information on who’s doing what and what you should be doing.

I should also note that my experience is with their SEM Insight product, which costs $399/mth. They also offer AdGooroo Express, which has a lot of the same features (but is missing a lot of the ones I personally feel can give a researcher a HUGE jump on their competitors). The Express version, however, sells at $89/mth, so it is far more affordable for some. And like all my favorite tools, they provide a free trial. :)

But if you can't afford that level of information, you'll want to run ranking reports on all your top competitors (you likely know who these are, but if you don't – they're the ones who rank in the top 10 for the most competitive phrases). You can either do this manually or use a reporting tool such as WebPosition Gold (again, it has a free trial).

If you find weaker sites ranking for large numbers of phrases, you know who to watch (again, we'll get into this more next week). The only problem with this method is that you can only think of what you can think of. The site might be ranking for phrases you never thought to look into which, if you knew them, might provide some great insight into additional targets and tactics. Of course, you might well be in an industry with very obvious and defined keywords.

Understanding Your Choices

So now you've got choices to make. You've got a list of perhaps hundreds of keywords and you need to shorten that list down. The number of phrases you target will only be limited by your site and the amount of time you have to dedicate to it.

You will likely need to pare down your choices to those that will produce the fastest and highest ROI possible. These will likely be the phrases with the lowest competition levels among the most-searched "buy phrases". Once you have attained these rankings you can move on.
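
One rough way to formalize "lowest competition for the highest-searched phrases" is to score each candidate by searches per competing page, in the spirit of the keyword-effectiveness indexes some tools report. The phrases and numbers below are invented for illustration.

```python
# Invented candidates: (estimated monthly searches, competing pages).
candidates = {
    "widget repair": (2600, 90_000),
    "widget repair services": (880, 12_000),
    "emergency widget repair": (320, 1_500),
}

# Higher score = more searches per competing page = likely faster ROI.
scored = sorted(candidates.items(),
                key=lambda kv: kv[1][0] / kv[1][1], reverse=True)
for phrase, (searches, competing) in scored:
    print(f"{phrase}: {searches / competing:.3f}")
```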

The alternative is to go for the gold and target the biggest phrases in your industry. This will take longer (99% of the time) but might be necessary if there are no suitable secondary phrases. In this event you have to ready yourself for a slow rise to the top and a longer period of stagnant traffic with a big return (hopefully) at the end.

Another major choice you'll have to make (especially if you have a large number of potential phrases) is whether to start out with a PPC campaign for the traffic or to test keyword phrases for an organic promotion. While these will be covered in more detail in part 7, if you just can't wait you can find a past article on the subject titled "Using PPC To Maximize Your Search Engine Positioning ROI".

More Info On This Series

As noted but worth mentioning again, this article series is being supplemented with a weekly show on WebmasterRadio.fm. Be sure to tune in or download the podcast to get the full information and hear some great interviews with the tool makers and experts.

Next week the topic will be competition analysis and will be written by StepForth, Inc. author and owner Ross Dunn. Ross will of course be on the show with us next Thursday along with some great guests.

SEO news blog post by @ 12:40 pm on January 31, 2008

Categories: SEO Articles

Ten Step SEO Series

This article series is an updated version of the 10-step series we wrote back in 2004. This time we're supplementing it with interviews on Webmaster Radio, and many of the articles will be written by guest authors – experts in their own fields.

The Series:

  1. Keyword Research & Selection
  2. Competition Analysis
  3. Site Structure
  4. Content Optimization
  5. Link Building
  6. Social Media
  7. PPC
  8. Statistics Analysis
  9. Conversion Optimization
  10. Keeping It Up

SEO news blog post by @ 2:40 pm on January 20, 2008

Categories: SEO Articles

The Dark Art Of Search Engine Optimization

The title of this article is designed to illustrate its point. Today we won't be taking a look at black-hat search engine optimization tactics. Admittedly, I've toyed with them in a "know your enemy" kind of way, but I'm no expert on advanced cloaking techniques or effective link sp@mming tactics. What we're going to cover here are the hidden (i.e. dark) areas of effective optimization strategy.

I've written numerous times in past articles and blog posts that using tricks to rank your site highly is, in the end, ineffective, as tricks imply a manipulation of the ranking formula and will eventually become obsolete as the search engines work to advance their algorithms and shut down such abuses. But here I'm going to illustrate some of the tricks we use to drive traffic to our site. Is this a conflict? Not really; these "tricks" aren't so much directed at search engines as at website owners and visitors. These are marketing tricks, not SEO tricks – they just happen to help you with your rankings.

Before we begin let's review an important point about Google. When most people think of Google they think of the dominant search engine (and in that they would be right); HOWEVER, if Google were primarily a search engine they would be much smaller than they are now. No, they are an advertising company, and the world's largest at that. To this end they need traffic, market share, and clicks. They need you to love Google.com, visit it often, and visit their other properties and offerings such as Gmail. If you do this, the odds of you clicking on one of the paid ads increase and their primary function is fulfilled. It is driven by this purpose that Google has developed the most complex search algorithm that has ever existed. Their search is their primary source of traffic. The better their results, the more you will return, the greater the likelihood you will click an ad, and the more revenue they generate (thus leading to their continued increases in reported revenue quarter-after-quarter).

Why is this important? Because this purpose is the driving force of their current algorithm, and will be for the foreseeable future. We can therefore assume that any action that increases relevant traffic to your site, increases the stickiness of your site and/or increases the number of links from relevant sites to yours will help your rankings – and will help Google keep its visitors loyal.

Let's also recall the purpose of this article. This is NOT an article about black-hat search engine optimization tactics; it's about the hidden aspects of SEO that are often overlooked. And so, without further ado, let's get down to the meat: the dark tactics you can use to boost your website rankings.

Building A Sticky Site

A point I've made in past articles, and one I will reinforce here rather than contradict, is the importance of a sticky site. Of course, monitoring your statistics to assess your visitors' behavior is an important practice for the conversions on your site; however, its importance from a search engine optimization perspective is often overlooked. I've mentioned before and I'll mention again: the search engines have the ability to monitor the length of time a visitor spends between visits to that engine. If you enter "seo services" into Google, visit the Beanstalk site, and spend only 5 seconds there before hitting the back button, Google can infer that the site was not what you were looking for. If 5 or 10 minutes pass before you return to Google, it can infer that you found content useful to your query.

To put that more plainly: having a site on which visitors find what they're looking for quickly, easily, and in a visually pleasing way will increase their time on your site, which in turn increases the search engines' assumption that you are relevant for the phrase the searcher queried. This will reinforce that your site does indeed belong among the top sites. As a disclaimer: this works on a mass scale, so don't go running off and clicking through to your competitors and quickly hitting the back button. First, it's unethical (like clicking their paid links) and second, it doesn't work like that (how big a hole would THAT be in the algorithm?), so it would only be a waste of your time.

The how-to of building a sticky site I will leave to designers (being an SEO, my skills lie more in understanding mathematical formulas).

Clickability Counts

The engines know when your site appears in a set of search results, and they further know how often your site was clicked on when it appeared. The more often your site is selected when presented in a set of results, the more relevant it is assumed to be and thus the more entrenched it becomes in that set of results (assuming your stickiness issues are dealt with).

What this means is that your title and description matter, not just as part of the classical search engine optimization tactics we've used them for since the '90s but also to draw visitors to your site. Fortunately the end goal of the engines closely matches what your own end goal should be for your site – maximizing traffic. Let's take a look at two example titles that the Beanstalk site could have:

An old-school over-optimized title: Search Engine Optimization (SEO) Services Company | Beanstalk Search Engine Optimization | SEO Services, Internet Marketing, Link Building, Consulting, Training & Copywriting

Our current title: Expert SEO Services by Beanstalk

Can you see the difference? While our title changes periodically as we test new titles for clickthroughs, we always keep it short, easily read, and such that the whole title will appear in the SERPs (Search Engine Results Pages). Our clickthroughs are much higher with shorter titles than longer ones, and we have seen the same results with client sites.

The same applies to your description tag, but the rules are a bit different. With your description tag you want to make sure to include your targeted keywords and make the copy compelling to a searcher. The reason for this is that when searched keywords are included in your description, it is typically the description that appears in the SERPs. This gives you an opportunity to determine how your ad to the world appears. You write your title, you write your description – write both well and your clickthroughs will increase. And when your clickthroughs go up, the implied relevancy the engines assume your site has for that phrase will increase with it, and thus so too will your rankings for that phrase.
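
If you want a quick sanity check while testing titles and descriptions, the snippet below flags tags likely to be truncated in the SERPs. The limits are rough assumptions (roughly 65 characters for titles, 155 for descriptions); the engines' actual cutoffs vary and change over time.

```python
# Rough SERP display limits -- assumptions, not published figures.
TITLE_LIMIT = 65
DESCRIPTION_LIMIT = 155

def check_snippet(title: str, description: str) -> None:
    for label, text, limit in (("title", title, TITLE_LIMIT),
                               ("description", description, DESCRIPTION_LIMIT)):
        status = "ok" if len(text) <= limit else f"may be cut near {limit} chars"
        print(f"{label} ({len(text)} chars): {status}")

check_snippet(
    "Expert SEO Services by Beanstalk",
    "Ethical and effective search engine positioning services that move "
    "your site up the rankings.",
)
```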

Getting People To Link To You

We're not going to bother discussing reciprocal link building, directory submissions or the other usual suspects. There are countless articles out there on those topics; what we're going to focus on here are the tactics for getting articles picked up widely by the resources you want them on (and if you're reading this – you know it works), as well as ways to get the links that both you and the search engines will love the most – the ones you don't ask for or work for, outside of creating a great site with useful content. The best part of these links is that they not only work to boost your link popularity but also tend to drive great traffic to your site. Let's begin with articles.

When you're working to publish an article there are two main audiences: the readers and, more importantly, the editors (I say more importantly as they're the ones who determine whether you have any readers at all). There are some tactics for winning over both:

  1. Write a compelling title. This gets back to the point I was making in the first paragraph. Everyone is interested in black-hat search engine optimization, even those of us who don't practice it. Readers will be drawn to it as it receives relatively low coverage, and editors like publishing something that they feel may draw some controversy. While this article doesn't get into black-hat tactics as some editors may have hoped, it will draw them in and get their attention.
  2. Find quality related resources and get the article published there. I generally use a tool like PR Prowler to find good, quality resources to submit articles to. You can do it manually through a search engine; PR Prowler just speeds up the process so much that after its first use it's paid for itself. You want the sites you submit to to be related to your industry, and you want them to provide a link back. If you can set up that link as anchor text instead of your URL – all the better.
  3. Keep a list and add to it. If you're going to publish multiple articles, don't start from scratch every time. Keep a list and try to add a few sites to it with each submission. This will keep your list growing and get you more exposure/links as time goes on.
  4. Keep a good relationship with the editors. They are the end-all-be-all of whether this tactic will work as a link and traffic building tactic. Make sure you're polite and don't write nasty emails if you get declined. Read what they say and make sure to take it into account with future articles.

But what if you don't want to build links with articles? What if you want to get links the old-fashioned way (and I'm talking about the old, old, old way – you know, before there was any SEO value to it)? What if you would like to get people to link to you simply because they like your content (I know, shocking, but it actually happens!!!)? There are a few different factors you need to take into account to accomplish this. Here are a few important rules to follow:

  1. You’ll need to create content that others will want to link to. This is an art in-and-of-itself. I wrote about some of the basic rules involved with this in a past article “Building Link Bait” and so I won’t repeat it here.
  2. Get the bait into social bookmarking sites. This will make people interested in your topic aware of it. If it's good, they may link to it. Don't just focus on Digg and the other majors; look around for some industry-specific bookmarking sites. For example, when this article is complete I'll work to get it into Sphinn, an SEO bookmarking site.
  3. Get the bait into forums and/or blogs. I'm not talking about blog sp@mming here; I'm talking about finding blogs and forums that are RELATED to your topic and whose visitors could be genuinely helped by the tool, information, etc. that you're providing. Don't worry if the blog has rel="nofollow" on the links. The purpose is webmaster awareness, not getting links from the blogs (I'll leave that to a different article).
  4. Promote the bait on your site. Use banners, links, your blog, etc. to build awareness.
  5. Provide the code to link to your bait. The easier you make it for people to link to you, the more of them will. Provide the code with a text and a banner option and you'll increase the number of people who link to you (a tiny snippet generator is sketched after this list).
  6. Put out a press release. If it’s big enough news, put out a press release. If the media grabs it you’ve won the lottery both in publicity and in high valued links.
  7. If the topic of your bait is searched on the engines, rank it. :)
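
For item 5, "providing the code" can be as simple as publishing a ready-made snippet beside the bait for people to copy and paste. A tiny sketch with placeholder URL and anchor text:

```python
def link_code(url: str, anchor: str) -> str:
    """Build a copy-and-paste HTML snippet readers can use to link to you."""
    return f'<a href="{url}">{anchor}</a>'

# Placeholder values -- swap in your bait's URL and preferred anchor text.
print(link_code("http://www.example.com/link-bait-tool/", "Free Widget Analyzer"))
```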

Conclusion

So these are the darker arts we're talking about. Not black-hat, just overlooked more often than not. Add these to your repertoire as you optimize and build links for your site and you've given yourself a one-up over most, if not all, of your competition.

SEO news blog post by @ 10:43 am on October 23, 2007

Categories: SEO Articles

Google Algorithm Update Analysis

Anybody who monitors their rankings with the same vigor that we in the SEO community do will have noticed some fairly dramatic shifts in the algorithm starting last Thursday (July 5th) and continuing through the weekend. Many sites are rocketing into the top 10 which, of course, means that many sites are being dropped at the same time. We were fortunate not to have any clients on the losing end of that equation; however, we have called and emailed the clients who saw sudden jumps into top positions to warn them that further adjustments are coming. After a weekend of analysis there are some curiosities in the results that simply require further tweaks to the ranking system.

This update seems to have revolved around three main areas: domain age, backlinks and PageRank.

Domain Age

It appears that Google is presently giving a lot of weight to the age of a domain and, in this SEO’s opinion, disproportionately so. While the age of a domain can definitely be used as a factor in determining how solid a company or site is, there are many newer sites that provide some great information and innovative ideas. Unfortunately a lot of these sites got spanked in the last update.

On this tangent I have to say that Google's use of domain age is, as a whole, a good filter, allowing them to "sandbox" sites on day one to ensure that they aren't just being launched to rank quickly for terms. Thinking back to the "wild west days" of SEO, when ranking a site was a matter of cramming keywords into content and using questionable methods to generate links quickly, I can honestly say that adding this delay was an excellent step that ensured the benefits of pumping out domains became extremely limited. So I approve of domain age being used to value a site – to a point.

After a period of time (let's call it a year, shall we?) the age should have, and generally has had, only a very small influence on a site's ranking, with the myriad of other factors overshadowing the site's whois data. This appears to have changed in the recent update, with age holding a disproportionate weight. In a number of instances this has resulted in older, less qualified domains ranking higher than newer sites of higher quality.

This change in the ranking algorithm will most certainly be adjusted as Google works to maximize the searcher's experience. We'll get into the "when" question below.

Backlinks

The way that backlinks are calculated and valued has seen some adjustments in the latest update as well. The way this has been done takes me back a couple of years to the more easily gamed Google of old. That statement alone reinforces the fact that adjustments are necessary.

The way backlinks are being valued appears to have lost some grasp on relevancy and placed more importance on sheer numbers. Sites with large, unfocused reciprocal link directories are outranking sites with fewer but more relevant links. Non-reciprocal links have lost the "advantages" that they held over reciprocal links until recently.

Essentially the environment is currently such that Google has made itself more easily gamed than it was a week ago. In the current environment, building a reasonably sized site with a large reciprocal link directory (even an unfocused one) should be enough to get you ranking. For obvious reasons this cannot (and should not) stand indefinitely.

PageRank

On the positive side of the equation, PageRank appears to have lost some of its importance, including its importance as it pertains to the value of backlinks. In my opinion this is a very positive step on Google's part and shows a solid understanding of the fact that PageRank means little in terms of a site's importance. That said, while PageRank is a less-than-perfect calculation subject to much abuse and manipulation from those pesky people in the SEO community, it did serve a purpose, and while it needed to be replaced it doesn't appear to have been replaced with anything of substantial value.

A fairly common belief has been that PageRank would be or is being replaced by TrustRank, and Google would not give us a green bar to gauge a site's trust (good call, Google). With this in mind, one of two things has happened: either Google has decided that both TrustRank and PageRank are irrelevant and scrapped both (unlikely), or they have shifted weight from PageRank to TrustRank to some degree and are just now sorting out the issues with their TrustRank calculations (more likely). Issues that may have existed with TrustRank may not have been clear due to its small weight in the overall algorithm, and with this shift reducing the importance of PageRank, the issues facing the TrustRank calculations may well be becoming more evident.

In truth, the question is neither here nor there (as important a question as it may be). We will cover why this is in the …

Conclusion

So what does all of this mean? First, it means that this Thursday or Friday we can expect yet another update to correct some of the issues we've seen arise out of the most recent round. This shouldn't surprise anyone too much; we've been seeing regular updates out of Google over the past few months.

But what does this mean regarding the aging of domains? While I truly feel that an aging delay or "sandbox" is a solid filter on Google's part, it needs to have a maximum duration. A site from 2000 is not, by default, more relevant than a site from 2004. After a year or so, the trust of a domain should hold steady or, at most, carry a very slight weight. This is an area in which we are very likely to see changes in the next update.

As far as backlinks go, we'll see changes in the way they are calculated unless Google is looking to revert to the issues they had in 2003. Lower-PageRank, high-relevancy links will once again surpass high-quantity, less relevant links. Google is getting extremely good at determining relevancy, so I assume the current algorithm issues have more to do with the weight assigned to different factors than with an inability to properly calculate a link's relevancy.

And in regards to PageRank, Google will likely shift back slightly to what worked and give more importance to PageRank, at least while they figure out what went awry here.

In short, I would expect that with an update late this week or over the weekend we're going to see a shift back to last week's results (or something very close to them), after which they'll work on the issues they've experienced and launch a new (hopefully improved) algorithm shift the following weekend. And so, if you've enjoyed a sudden jump from page 6 to the top 3, don't pop the cork on the champagne too quickly, and if you've noticed some drops, don't panic. More adjustments to this algorithm are necessary and, if you've used solid SEO practices and been consistent and varied in your link building tactics, keep at it and your rankings will return.

SEO news blog post by @ 3:57 pm on July 10, 2007

Categories: SEO Articles

What To Look For In An SEO

It’s been about two years now that I have wanted to write this article. Why haven’t I until now? Conflict of interest. Until recently I’d have been motivated by that necessary evil … getting business. Each time I started writing this article I subconsciously asked myself, “How can I spin this towards Beanstalk?” You can’t really begrudge me this. Such is the “curse” of living in a capitalist society. Recently however we have put a hold on taking in new SEO clients. The result: consistent questions regarding who people should choose and what they should look for. And so to kill two birds with one stone, I write this now. The first bird killed is my frustration at not being able to properly write a useful article on what to look for in an SEO without bias. The second bird killed is my wasted time outlining over-and-over what people should seek out. Now I can simply point them to this article.

You’ve read this far so you’re obviously interested in finding out what you should look for in an SEO and what you might want to avoid. So let’s get right to it shall we?

Can They Rank Their Own Site?

The first thing you should look for when hiring an SEO is whether or not they can rank their own website. This may seem obvious enough but I can’t count the number of times I have heard from people attracted to Beanstalk’s guarantee because they wasted both time and money on an SEO firm that couldn’t (or didn’t) get the job done. Too often when I take a look at the SEO’s website and research their targeted phrases (usually pretty obvious when you look at the title and heading tags) I find that they don’t even rank for their own phrases.

This is clearly a big strike three (in this case I wouldn’t even give the SEO firm a strike one or two). The only exception to this rule is if they are running a new company or website and have a proven track record from the past which can be used as their reference. In this case any consideration would require research into the individual, company, and circumstances. A good example would be Andy Beal of Marketing Pilgrim. Prior to starting Marketing Pilgrim he had been involved with two other SEO firms. When MarketingPilgrim.com started it didn’t rank well. He was still a great SEO consultant with a solid track record of success.

What Do They Promise?

If you have a new site or a site in a high competition area and you are told that the company can get you great rankings on Google in 60 days they’re either just telling you what they think will make you sign on the dotted line or they have no idea what they’re doing. In either event you’re in for disappointment.

An honest and straightforward SEO will give you realistic expectations, which will generally span many months and sometimes years, depending on the scope and competition levels involved. If you have a new site competing for moderately competitive phrases, any claims from a company that they will have you ranking on Google in anything less than 5 or 6 months (and even this may be optimistic) are likely untrue.

What Do They Include?

Asking your prospective SEO company what they'll be including with their services is a perfectly fair question. You don't need a full breakdown of each and every specific (nor are you likely paying your SEO for this); however, understanding what areas of the site will be changed, how the link building will be undertaken, and the overriding philosophy or approach your prospective SEO company will be taking are good questions to have answered.

If something doesn’t seem right in what you’re being told, ask in one of the many great SEO forums (see below).

How Are They Backing Their Services?

In one way or another, any good SEO company will be able to back up what they're offering. When we first started Beanstalk we decided that we were going to do this with a guarantee. Not all companies go this route – there are many excellent SEOs and firms that provide great services without a guarantee – but all such companies will be able to back their work.

To be clear, I know of many good SEO firms that don't offer guarantees, and I also know some that do offer guarantees but don't do a very good job. My purpose here, however, is not to point fingers but rather to point out what you should look for and how to tell the good from the bad. If the company offers a guarantee, what is it? I've seen a few "we guarantee you'll be satisfied" statements out there with no qualification as to what "satisfied" means and what will happen if you're not. If the person or company doesn't have a guarantee, then what do they have under their belt in the way of reputation? If a company isn't putting their money where their mouth is, they should have a very good reputation if they want your consideration. Are they well published or active in the SEO forums? Are they active in the SEO community in a public fashion, such as speaking roles or SEO community memberships? If they are, then they have a reputation to protect and they will be backing every contract with that reputation. This won't help you recover the money you've spent if you don't get the results you're looking for, but what it will do is ensure that you're hiring an SEO who is motivated towards your success.

What Are Some Major Warning Signs That You’re On The Wrong Track?

This term "warning signs" might be better put as "red flags", as the tactics noted here are ones that should send you immediately looking for a new SEO. Prepare to say, "Thank you, but no" if you hear any of the following among their list of recommendations (and note: there are more than those listed – but these are some of the more common ones that I've seen and heard lately):

  • Say goodbye if you hear an SEO recommend that you build multiple websites, either as a linking tool by linking them together or because it's easier to optimize a different site for a different engine. Unless you have two or more incompatible topics (a work site and a personal blog, for example) you have no need for more than one site. And as a link building tactic it hasn't worked in a good number of years.
  • If your SEO is using any kind of tool to automatically generate content of any kind it’s time to shake hands and be done.
  • If your SEO is not doing link building of some type and yet is telling you they can get you rankings for anything but the lowest-competition phrases, you might not need to run, but you definitely need them to justify what they're saying. If you have a 6-year-old site with a lot of good links already but some onsite issues that keep it from ranking, then they may be telling the truth. If you have a new site and/or low link counts, then they are not.
  • It seems obvious but I have to mention it anyway: if they're recommending the use of any black-hat tactics then you're in trouble. I can't possibly list everything that fits this category, but a quick read of Google's webmaster guidelines should help. If you read these guidelines and some of the recommended tactics seem amiss, questioning your SEO is completely justified. You can find some great examples and information on black-hat SEO on Wikipedia at http://en.wikipedia.org/wiki/Black-hat_SEO.
  • Advertising that they will "Submit your website to 18 billion search engines for just $x", or promising top rankings on engines you have barely heard of, is a clear issue. There are a lot of search engines out there and, in fact, a lot of pretty unique engines with some great offerings; however, when it comes down to brass tacks, there are four engines that matter when it comes to traffic (at least from a universally applicable standpoint). If an SEO is promising you great rankings on an engine like Dogpile, with its whopping 0.5% of the search engine market share, you may want to ask what they can do about the 91.8% of the market that's controlled by the top 4 engines (47.9% Google, 28.1% Yahoo!, 10.6% Microsoft and 5.2% Ask).

The Conclusion

I've tried to Coles-Notes above some of the main issues that I see, hear complaints about, and/or get questions on regularly. Of course there are many more. The best advice I can give is don't rush into a decision when you're choosing your SEO firm. Listen to what they're saying, ask questions, and if you don't know what questions to ask, take a few hours to find out on one of the many great SEO forums out there. As I don't want to leave anyone out by listing off some of the ones I visit (and I couldn't possibly include them all), I'll simply recommend searching for "seo forum" and "seo blog" and visiting some of the sites to ask what you should be asking. A company called Medium Blue, whose owner I had the pleasure of chatting with on Webmaster Radio a couple of weeks prior to this article's publication, wrote a 3-part series of questions to ask your potential SEO firm. You can find the first part here (and find the others from there).

And one final note: it isn't always about the fees they charge. We've had a number of clients come back to us after first opting to sign with a cheaper SEO firm. In the end it cost them both the lower fees and the sales lost by not ranking sooner. This is not to say that the most expensive firm will necessarily do the best job – just that you need to be aware that sometimes things can be "too good to be true". An SEO firm charging $500 will almost always be putting in different efforts than one charging $5,000. Find out what the differences are and do what's right for your business. And if you're really in doubt and don't know what to do, contact us. Even when we're not taking on clients I try to answer questions about choosing an SEO firm, though it might take a couple of days. Please specify in the title, "Need help choosing an SEO firm".

And good luck with your online promotions.

SEO news blog post by @ 12:30 pm on February 28, 2007

Categories: SEO Articles

How To Win Links And Influence Engines

The title of this article is designed to prove (in an SEO kind of way) the very point that Dale Carnegie was making when he wrote one of the most influential business books of all time, "How To Win Friends And Influence People" (arguably one of the best business books ever written as well). In titling his book Mr. Carnegie was trying to do two things:

  1. Write a title that captures everything that people want in order to sell more books, and
  2. Tie two important things together that are related but often viewed as different. In the case of the book it was winning friends and influencing people, which he points out are essentially based on the same core traits and actions. Similarly, our title here captures two of the key areas people interested in SEO are looking to read about, and thus we will show the essential tie between winning links and the influence it will have on your search engine rankings. We will also discuss methods for actually winning links, as opposed to settling for second-rate ones – rather like winning friends as opposed to settling for tolerable acquaintances.

How To Win Links

As with virtually every aspect of SEO, there are multiple areas within this single field. If there were one hard-and-fast answer to link building, we would all be ranking highly on Google and the top 10 would be a VERY crowded place. Fortunately this isn't the case and the rankings are becoming more and more a Darwinist exercise in "survival of the fittest" (which is how it should be). Proper link building will help you be the fittest and, over time, influence engines.

If you have a site in any competition level above "low", you will want to use at least two different methods for building links. Aside from speeding up the link building process, this will help ensure your site withstands changes in the way link values are calculated. While there are far too many link building methods to list them all here (and some launch so far into black-hat territory that I wouldn't want to), here are some of the main ones you should consider using:

Reciprocal Link Building:

There are many who would write that reciprocal link building is dead. While it is undeniable that the "rules" around reciprocal link building have changed, it is far from dead. That said, there are specific guidelines that must be followed to make a reciprocal link building campaign a success. Some of the more important are:

  1. Relevancy is arguably the single most important factor to consider when building recip links. For every link exchange you are considering you must ask yourself, “Is this a site that my visitors would be interested in?” If you can honestly answer that your site visitors would be genuinely interested in a site you are linking to then it’s a good link.
  2. PageRank is not the end-all-be-all that it once was; however, it is still a decent measure of the relative value of a website. While not as important as relevancy, it is a factor, and obtaining higher-PageRank links will require fewer links to be built.
  3. Does the site you are considering linking to have a solid link building strategy in place? Just because you're following the best practices of link building doesn't mean that everyone in your industry is. A good site may be following misguided link building practices (real estate sites should not link to poker sites), and if so, its overall value may well be reduced in the eyes of the search engines. If it has an active and ethical link building program in place, its overall value is likely to increase, making it more valuable down the road than it is today.
  4. How many links appear on each page, and where will yours be positioned? If your link will appear at the bottom of a page with 87 links, it is far less valuable than a link near the top of a page with 25 links. This fits into the "ethical" category of point 3 above but is worth mentioning again.
  5. Links that exist within content are weighted as more natural than directory-style links. Thus, when possible send HTML code that places your link within the descriptive text rather than in the title. For example, we may use the following HTML for a link to the Beanstalk site:

<strong>Beanstalk Search Engine Optimization</strong><br>

Beanstalk offers ethical and effective <a href="http://www.beanstalk-inc.com/">search engine positioning services</a> that will get your site to the top of the rankings. Whether you operate a small business and need regional results or if you are the VP of a Fortune 500 company needing consulting on new site changes and internal page ranking strategies, we have a search engine positioning solution to fit your needs.

These links are won as opposed to gained by default. Finding people to exchange links with on the net is easy; it's finding quality partners that will help influence the rankings (in a positive direction, at least) that requires a clear understanding of what the engines want and how to give it to them (a rough checklist sketch of these guidelines follows).
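Here is that sketch, a minimal go/no-go check in Python. The function, the 50-link cutoff, and the position scale are all illustrative assumptions on my part, not values any engine has published:

def evaluate_link_partner(relevant_to_visitors, pagerank,
                          links_on_page, position_from_top):
    # Crude go/no-go check for a prospective reciprocal link.
    # position_from_top: 0.0 = top of the page, 1.0 = bottom.
    if not relevant_to_visitors:      # point 1: relevancy is king
        return False
    if links_on_page > 50:            # point 4: crowded pages dilute value
        return False
    if position_from_top > 0.8 and pagerank < 3:
        # a buried link on a weak page is rarely worth the effort
        return False
    return True

# Example: a relevant PR3 page with 25 links, our link near the top
print(evaluate_link_partner(True, 3, 25, 0.2))  # True

Run your own link exchange candidates through a checklist like this and adjust the thresholds to suit your competition level.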

Non-Reciprocal Link Building:

The area of non-reciprocal link building is a slippery one. There are many methods that can be used with varying degrees of success; we can't get into them all here (and some shouldn't be used anywhere), so we will focus below on some of the most significant and more widely applicable:

Directory Submissions:

This is perhaps the easiest and fastest of all link building methods, though it can also be one of the more costly depending on the directories you submit your site to. Yahoo!, for example, charges $299 for a commercial site to be submitted to its directory. DMOZ, however, is free and is certainly the most important given that Google uses the DMOZ directory to provide the listings for the Google Directory. Note though: it can sometimes take months to get a listing there and sometimes even that's not enough.

That said, there are MANY topical directories and smaller business directories that will accept free submissions and these should definitely be considered. While they may have a relatively low PageRank they will provide reasonably relevant non-reciprocal links and help build your anchor text relevancy.

Articles:

Writing articles like the one you're reading right now is an excellent link building strategy. By providing valuable and useful content to other webmasters you are providing them a service, which will generally translate into a link to your site "in payment". One of the great features of articles is that the payment isn't only in link value but in the actual traffic you get from the link itself. But we're not talking about traffic, we're talking about rankings; so how do articles influence engines?

There are three main benefits of articles as a link building tactic:

  1. The link to your site will be on a page that is entirely related to your topic. If you have a site about search engine positioning for example, including that phrase in the title and content gives you the opportunity to build the relevancy between the linking page and the page it links to.

    (note: I know I have not used “search engine positioning” in the title – sometimes one has to consider the value of the title from a visitor standpoint and the fact that you came to this page and are reading this article indicates to me that the right decision was made not to change it just for a bit of added relevancy.)

  2. The link will be non-reciprocal. While we indicated above that reciprocal linking is not dead (and it’s not) there is a solid belief among SEO’s (myself included) that non-reciprocal links are weighted more heavily. Having more non-reciprocal links will also help safeguard your site against future changes in the algorithm that may reduce the value of recip links.
  3. You will likely have the ability to determine how the link to your site is worded and you may have the opportunity to link to more than one page on your site. Many people settle for a directory-style author bio. Personally, I prefer to submit my bio in a couple of formats (text and HTML), both of which place the links inside the content. The text format will simply include links such as http://www.beanstalk-inc.com/ whereas an HTML link will contain code very similar to that displayed above. As for multiple links: if the site you are submitting to will allow you to reference a couple of pages, you may want to link to your homepage as well as one or two internal pages you would like to see rank. Make sure these pages are related to your core article topic or a service the reader would be interested in (see the bio for this article as an example).

Quality Content:

This next part might be a bit shocking. There are actually people out there who will link to your site simply because they have found content there they believe will interest their readers. That's right, people actually link to sites they find of value. On the Beanstalk site, and specifically in our blog, we often link to other web pages that we have found useful. Other articles, tools, blog posts, etc. often receive non-recip links from us due to the value of the content they contain, and we're definitely not the only ones doing this.

Providing quality content, useful tools, or other helpful services can be a great way to attract non-reciprocal links. After all, this is the entire reason links received any value in the first place: they are perceived as a vote for the other site.

How To Influence Engines

With proper onsite optimization in place (attention to site structure, site size, cohesion of the content across the site, internal linking structure, keyword density, and the other onsite factors you've likely read much about), all that is left to do is to continue to grow your site (hopefully with quality content people will want to link to) while winning strong links to it.

If what you want to do is influence engines you will need to have strong onsite and offsite factors, but don't stop there. Influencing engines isn't just about rankings today. You will need to continue building links down the road to ensure that the search engines continue to be influenced, both by how people have linked to you in the past and kept those links in place, and by how new people are finding your site helpful and relevant. If the engines see a sudden spurt in link growth and then see that growth stop, you are not likely to hold a strong ranking indefinitely in any but the lowest competition sectors.

And remember: don't focus on just one link building method. To ensure a solid and secure influence you're going to need to win links through at least two of the methods discussed above, or other ethical methods you may be considering.

Additional Notes

While we couldn't possibly cover all the methods for link building in one article, I've tried to cover the main ones. A couple of methods that receive much attention but which we didn't have room for above are press release distribution and paid links.

Press releases are an excellent way to get exposure, but I have not found them as good as articles for links, which is why they weren't covered above. They are good for traffic, however, and you will get some links out of them if the release is good, so they were worth a short mention here.

Paid links are a dangerous area to discuss as there are so many factors and so many ways they can go wrong. The only advice I will give to those looking to purchase links is this: ask yourself, "Am I expecting to get traffic from this link?" At the very least, this will weed out small footer links and links on irrelevant sites. Basically, if the link is worth it without the boost in rankings then continue to pay for it and consider any ranking increase a bonus. If you aren't getting any traffic from the link then it's likely not worth paying for: the site likely isn't relevant or the link is in a poor location. The engines will likely pick up on either of these and you'll end up paying for a link that isn't passing on any weight anyway.

SEO news blog post by @ 2:06 pm on October 10, 2006


 

Google, Orion, SEO & You

Every now and then an event occurs that changes how the SEO community views the way websites are optimized and promotions are structured. The purchase of the rights to the Orion Algorithm by Google and, equally important, the interest that both Yahoo! and MSN took in the algorithm as they vied for ownership themselves, marks just such an event.

Bill Gates said to Forbes magazine about Orion:

“That we need to take the search way beyond how people think of it today. We believe that Orion will do that.”

What Is The Orion Algorithm?

There is much confusion about the Orion algorithm and much secrecy around the specifics. Here is the "What's Been Said" and "What It Means" breakdown:

What’s Been Said: Ori Allon, the developer of this technology described Orion in this way:

“The results to the query are displayed immediately in the form of expanded text extracts, giving you the relevant information without having to go to the Web site–although you still have that option if you wish.”

He cited an example of the keyword phrase “American Revolution.” The search would not only provide extracts with the phrase, but also additional information on topics such as American history, George Washington and the Declaration of Independence.*

* CNET News, April 10, 2006

What It Means: Most on the web take this to mean the results from Google will be displayed similarly to those at Ask.com, where you will be able to get a sample of the site and some of its quality content without having to visit the actual site. The part that most caught my attention, however, is where he cited the example and noted the additional phrases that would be considered, and the impact this technology will have on the way queries are dealt with.

From this standpoint the Orion Algorithm, in its essence, is a whole new way to score the value of websites that appear on the Internet. Rather than determining the value of a website based solely on the specific query entered into the search box, Orion may dig deeper and query related phrases as well. This may not be an entirely new concept (directories have been providing a "Related Categories" option for ages), but the addition of this function to standard search engines, and what it may mean for the methods required to rank sites on them, is extremely significant.

What Is Relevant?

One of the main hurdles that SEO's will face in reaction to this new function is figuring out exactly how the additional relevant phrases are determined. There are a few possible sources that come to mind:

Directories (least likely) – The directories are already using "Related Categories". It is possible that the engines will choose the simplest possible means of determining relevancy and opt to use sub-categories of a directory listing, with the "Related Categories" as the supplemental keyword sources.

Alternatively they could simply run the search itself on their directories and reference the categories that come up and run supplemental searches for those categories.

The main drawback to this approach is that many popular keywords would not be cross-referenced accurately. For example, a search for "seo" would result in a supplemental search set of "promotion", "web design and development", "Internet marketing" along with a variety of other phrases. While these phrases are related by industry, a visitor searching for "seo" may well not be interested in "web design and development".

Thesaurus (unlikely) – It may be that the engines choose to reference a thesaurus for related phrases; however, this doesn't work for many keyword phrases. Single-word phrases would be doable, but multi-word phrases would be far more difficult, and acronyms (such as "seo") would find no related words in the more common thesauruses.

Search Behavior (highly likely) – The most likely source of the relevancy data is also the most difficult to predict: search behavior patterns. While I have had some disagreements with members of a couple of SEO forums over whether the search engines can in fact know your search patterns, the conclusion is that they indeed can under many circumstances. Search engines are able to compile enough data from the users they are documenting to assess overall search behavior (and here you thought all those great tools the engines come out with were just them spending their money altruistically).

If Google "knows" that after someone enters "seo" as a query they follow it up with "seo service", this is likely to then be used as a supplemental search. Similarly, if they also know that these same searchers tend to search, shortly before or after, for another common phrase, say "w3c compliance", then this too is likely to be used as a supplemental search.
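As a rough illustration of how follow-up queries could be mined from search behavior, consider this Python sketch. The session data and the whole approach are hypothetical; nothing here describes Google's actual pipeline:

from collections import Counter, defaultdict

# Hypothetical session logs: each inner list is one user's queries
# in chronological order.
sessions = [
    ["seo", "seo service", "w3c compliance"],
    ["seo", "seo service"],
    ["seo", "link building", "seo service"],
]

follow_ups = defaultdict(Counter)
for queries in sessions:
    for current, nxt in zip(queries, queries[1:]):
        follow_ups[current][nxt] += 1

# The most common follow-ups become candidate supplemental searches.
print(follow_ups["seo"].most_common(2))
# [('seo service', 2), ('link building', 1)]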

Agree To Disagree: Implementation

Now that we have a better idea of what the Orion Algorithm is and how it works, the big question is: what will its implementation mean to search engine users and to how websites get ranked on those engines? At this time there appear to be two main schools of thought:

  • What I believe, and
  • Everything else that’s been published

I'll be the first to admit that my interpretation of how the Orion algorithm will affect search engine results is either not shared by other SEO's (at least those who have a published opinion on the topic) or has not occurred to them. That said, my take on the Orion Algorithm did not initially include their predicted effect, whereas I now believe both implementations are likely to be tested, if not brought into full effect, within the next 12-18 months (this may seem like a long time, but if you want to develop a strategy to react to it this is about the lead time you may well need). So what are these two possible outcomes?

Where we all agree: the addition of key parts of web content in the results. This is how the algorithm's developer explains it will function, and it is thus the obvious conclusion for most regarding how it will be implemented.

Everyone else: related information displayed separately. From what I have read, the majority of people believe that the related phrases will be displayed separately from the original query (though rightfully no one seems to be predicting exactly where or how). Essentially this will give searchers the ability to view information on related topics quickly and easily.

This is certain to be included in some capacity, and we have already seen similar functions added to the Google results for specific queries, though not in any capacity reliable enough to be launched across all Google search results.

And then there's my opinion: integration into standard search results. To me it seems short-sighted to believe that Google will take a technology that allows them to draw information and relevancy from multiple related phrases and use it only to display multiple options on a results page. With the processing power they have at their disposal, why would they not reference a site against its ability to rank for these other phrases and base the final results on that? Let's take a quick peek at the pros and cons of such a move:

Cons first: Processing power. That about covers the downside, and I'm sure we're all aware that if this ever becomes an issue they have more than enough capital and technical know-how to get around it.

Pros: Imagine a world where running a search for a query took into consideration whether a site ranked for multiple related phrases. What do you suppose the impact on the results would be if only those sites that had content related to a number of areas of a topic ranked highly? The answer: a much more relevant set of results.
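To illustrate my take, here is a toy Python model that blends a site's rank for the main query with its ranks for related phrases. The weights and the rank-to-score mapping are invented purely for illustration:

def rank_score(rank, max_rank=100):
    # Map a ranking position to a 0..1 score; unranked scores 0.
    if rank is None or rank > max_rank:
        return 0.0
    return (max_rank - rank + 1) / max_rank

def blended_score(main_rank, related_ranks, related_weight=0.4):
    main = rank_score(main_rank)
    related = (sum(rank_score(r) for r in related_ranks) /
               len(related_ranks)) if related_ranks else 0.0
    return (1 - related_weight) * main + related_weight * related

# A site ranked #3 for the main phrase but covering related phrases
# outscores a site ranked #1 for the main phrase alone.
print(blended_score(3, [5, 8, 12]))           # ~0.96
print(blended_score(1, [None, None, None]))   # 0.6

Under a model like this, broad topical coverage beats a single well-optimized page, which is exactly the behavior described above.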

Conclusion

Fortunately, while there may be some disagreement regarding how this new algorithm will be integrated into the search engine results pages, the resulting actions required are the same. Whether the new functions will be added in the form of additional links and information on the results pages, or whether they will be taken into consideration when ranking the site for the initial query, sites that rank well for a multitude of related phrases will fare better than those that rank for just one of the phrases.

The action required on the part of SEO's and website owners, then, is to provide quality unique content covering all the areas that may be considered relevant to the main keyword target. Once this is accomplished, these areas need to be promoted in order to ensure that they rank well.

The resulting web will be one that rewards websites with a large amount of quality content on the highest number of topics related to a specific issue. If one considers the end goal of any of the major search engines, to provide the most relevant results possible, this new technology is sure to help promote these types of results and ensure that the searcher receives results that are likely to provide the information they're looking for.

And let's also consider this: should you choose to be an "early adopter" and begin making changes to your site (adding new content, optimizing it, and getting it ranking well), what will the results be? Even if Orion isn't implemented for another decade, your website will gain stickiness and rank for more related keywords, bringing you more targeted traffic and keeping it on your site. Could this possibly be a bad thing?

Resources

While I have strived to provide some insight into the Orion Algorithm and what it means to you, there is a lot of information (and speculation) out there, some of which covers implementations of this technology not addressed in this article. Below you will find some of the better pieces of information.

I have included information that contradicts what you may have read above. This algorithm is sure to have an enormous impact on the way searchers find results and the way SEO's promote sites; thus, you need to have all the quality information at your disposal to make the right decisions for your website and your business.

Search Engine Watch – Danny Sullivan wrote a solid piece on the subject (as he always does) which includes some good links to related information and also a link to their forum thread on the subject where you can get other opinions on what this means to searchers and SEO’s.

E-Commerce Times – Jennifer LeClaire wrote a good piece on Orion which covers more on the integration of relevant listings into the results pages.

The Sydney Morning Herald – Stephen Hutcheon covers some of the basics regarding how the deal to purchase the algorithm came about, who the major players were, and a bit of the history behind Orion.

SEO news blog post by @ 4:54 pm on July 24, 2006

Categories: SEO Articles

 

Solid SEO Through De-Optimization

That's right, today we aren't going to so much discuss optimization as its antithesis. Some may wonder what sense this makes: how can one say that the road to higher rankings is built on trying not to rank? In fact, the effort is always to rank highly; it's just the tactics that are a bit different.

What Is De-Optimization?

De-optimization is the reduction of those tell-tale signs of SEO that once upon a time worked very well and have only recently come to be viewed as blatant attempts at, well, ranking highly. To properly de-optimize a website the following areas need to be addressed:

  • Keyword density
  • Backlink anchor text
  • The use of special text
  • Site relevancy

With these areas addressed properly a site stands a much higher chance of ranking for the phrases being targeted and perhaps more importantly, holding those rankings over time.

Keyword Density

For those not aware of the concept of keyword density it is the overall percentage of your page content that is made up of the targeted keywords. A simple example would be a 200 word page on which you are targeting a single keyword such as “google”. If you use the word “google” 20 times on the page you would have a keyword density of 10% (keywords / words * 100 = keyword density).

First, let's note that a 10% keyword density is WAY too high to begin with. I can't give a specific optimal keyword density in an article, as there's a good chance it will be read months from now when the optimal densities are different. The ideal way to calculate optimal densities is to figure out what the densities of the current top 10 are and target accordingly. But how do you de-optimize your keyword densities?

Let's say you've found that in your industry the optimal keyword density is 3.5% on Google, but you don't necessarily want to target the high end of the spectrum. Using variations of the keyword is the easiest and most effective route to go. Rather than using the word "google" 7 times in those 200 words, it will be more effective to use "google" four times, "googled" a couple of times and perhaps "googling" once. Plurals, past tenses, etc. are treated by the major search engines as similar but not identical keywords. Thus, you will get much of the benefit of including them in your keyword densities without running the same risk of hitting densities that are too high. As an added bonus, you may just rank for these other keywords as well.
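A quick Python sketch of the calculation described above, counting the keyword variations toward the total; the sample text and variant list are mine:

import re

def keyword_density(text, variants):
    # Percentage of words in `text` matching any keyword variant
    # (keywords / words * 100, per the formula above).
    words = re.findall(r"[a-z']+", text.lower())
    targets = {v.lower() for v in variants}
    hits = sum(1 for w in words if w in targets)
    return 100.0 * hits / len(words) if words else 0.0

page = ("I googled the phrase, then used Google again; "
        "googling is second nature now, and Google knows it.")
print(round(keyword_density(page, ["google", "googled", "googling"]), 1))
# 23.5 -- far too high for a real page, but it shows the arithmetic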

Backlink Anchor Text

The same principle that applies to keyword densities applies to backlink anchor text. Having 10,000 links to your site, all with the same keyword phrase in them, is going to look suspicious to say the least. Varying your anchor text is obviously useful in helping you attain rankings for multiple phrases, but even when you have only a couple of key phrases it helps to use plurals or otherwise altered forms of your main keywords. For example, in a link to the Beanstalk site we may use "seo services" much of the time, but if all our links appeared this way it would be far less effective; thus, we also use anchor text like "seo service", "search engine positioning services", "seo firm", etc. to ensure that we don't have too many identical backlinks while still promoting our core phrases throughout.

As with keyword density, done correctly there’s the added perk of ranking well for a variety of phrases.
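A simple way to audit your own profile is to tally the anchor text of your known backlinks and look for one phrase dominating. The sample data and the 50% flag below are illustrative assumptions:

from collections import Counter

# Hypothetical anchor texts collected from a backlink report.
anchors = (["seo services"] * 60 +
           ["seo service"] * 15 +
           ["search engine positioning services"] * 15 +
           ["beanstalk"] * 10)

counts = Counter(anchors)
total = sum(counts.values())
for text, n in counts.most_common():
    share = 100.0 * n / total
    flag = "  <-- suspiciously dominant?" if share > 50 else ""
    print(f"{text}: {share:.0f}%{flag}")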

The Use Of Special Text

As discussed back in the article on Content Optimization, special text (as we are using the term here) is any text that is set apart from the majority of text on your page using such things as bold, italics, anchor text, colors, etc. The use of this type of text implies to the search engines that the text in this format is more important than the standard text on your page and also that you want this text to stand out for your visitors (i.e. you want to ensure they see it).

For this reason special text is highly powerful from an SEO perspective (and for all the right reasons, as it's highly effective from a human visitor perspective). Based on the article noted above, we have noticed people bolding every instance of their targeted phrase on their site. This might have worked back when the engines first calculated this as a factor; however, as with any trick that's overused, it quickly got detected and filtered. The key here is to use these formats when appropriate. It should be noted that if you are targeting a phrase like "google" there will be spots in your text where you will naturally want to bold the text to draw the visitor's eye to the phrase they have searched. For example, you would likely use special text in a sentence at the top of your page that reads, "Google: The Way The World Searches." Whereas if further down your page you had a sentence that read, "Then one day I was sitting at my computer and Googled ..." you would be much less likely to need any formatting outside that used for your general content.

As a rule of thumb, try never to have more than 30 - 40% of your keyword instances formatted outside of your general content style.
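Here is a crude Python check of that ratio using regular expressions; a real audit would parse the DOM, and the sample page is invented:

import re

def special_text_ratio(html, keyword):
    # Share of keyword occurrences wrapped in b/strong/i/em tags.
    total = len(re.findall(re.escape(keyword), html, re.I))
    special = len(re.findall(
        r"<(b|strong|i|em)>[^<]*" + re.escape(keyword), html, re.I))
    return 100.0 * special / total if total else 0.0

page = ("<strong>Google</strong>: The Way The World Searches. "
        "Then one day I Googled my own name, as one does on Google.")
print(round(special_text_ratio(page, "google"), 1))  # 33.3, inside the rule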

Site Relevancy

This is the one area where de-optimization does not apply; it can be viewed as the balancing factor. While you are working to de-optimize your individual web pages you will want to simultaneously build the overall relevancy of your site to your targeted phrases. Ensuring that there is solid use of your main keywords across as many of your pages as possible, while keeping the visitor experience enjoyable and informative, will create a stronger overall site and further increase your ability to rank for multiple related phrases as well as your primary phrase.

Conclusion

De-optimization is not so much the reduction of ranking factors as it is a return to the original intent of a website, which is to cater to the visitor. As the end goal of the search engines is to provide the best possible set of results for a specific query, it logically follows that websites should strive for an excellent visitor experience in order to attain and maintain high rankings. The search engines themselves are getting much better at determining the characteristics of a website that will provide a solid visitor experience, and this ability is increasing every day (or at least, every update).

Ensuring that your content reads well, is organized properly with indicators in place to tell the engines (and the visitors) where the important content is, and offers easy navigation will help the engines know your site is built for the visitor and highly relevant to the search query. And what more do you want the engines to know than that?

Resources

Google Guidelines – These are the guidelines that Google has set out for webmasters. Read them and add them to your Favorites (or Bookmarks for those of us using Firefox). Visit this page and review it at least once a month.

High Rankings Newsletter – This is a newsletter put out by veteran SEO Jill Whalen. While I may disagree with some of her advice, overall she is one of the best SEO’s in the industry and her newsletter (to which I am subscribed) is a great resource for anyone learning to optimize their own websites.

Total Optimizer Pro (previously linked to "http://www.totaloptimizer.com/software/") – This is the primary SEO tool we use to determine optimal keyword densities and to analyze backlinks.

SEO news blog post by @ 10:26 am on July 10, 2006

Categories: SEO Articles

 

Tying It Together: SEO For The Big Three

This article is part four of a four part series on optimizing your website for the three major search engines. Part one, titled "SEO For MSN", covered optimizing your website to rank highly on MSN; part two, titled "SEO For Yahoo!", covered optimizing your website to rank on Yahoo!; and part three, titled "SEO For Google", covered how to rank highly on Google. In this article we will cover how to tie your optimization strategies together to attain the highest rankings possible on all three engines simultaneously.

The Major Factors:

There are some constants in search engine optimization; some factors that, by necessity, must be considered by all the major engines. Fortunately for us, these factors are generally the most important. Unfortunately, each of the engines uses them in different ways. Let's begin by listing these factors:

  • Age
  • Content
  • Keyword density
  • How it fares in the results
  • Site structure
  • Backlinks

Age

Many of you will already be familiar with the aging delay that is commonly referred to as "the sandbox". For those of you who aren't familiar with it, the sandbox is a penalty applied to new sites and new links under the assumption that they cannot play nicely with others. It is only after time that the penalty is lightened and eventually disappears, and the site is left to play in the park with the rest of the "nice sites".

This penalty is applied most strongly by Google and, to a lesser degree, by Yahoo! On Google a new website cannot expect to rank for any competitive phrases for between 6 and 8 months. Even then, the links being built to the site still have to age, so for most new sites competing for high-competition phrases you're looking at a good year or so to see top results, though you'll likely see good results for many of your secondary phrases well before then. The penalty applied by Yahoo! is both shorter and lighter than Google's. MSN does not apply such a penalty at the time of this writing.

Content

This is obviously a key factor across all the engines, but again Yahoo! and Google take the lead in penalizing sites that do not have a lot of content related to a single theme. Recently we have seen this act as a mixed blessing, at least on Google, with some major sites getting overlooked because they carry information on a wide variety of topics, in favor of sites focused on a single topic. With their recent tweaks, however, Google seems to be balancing overall content focus against other factors to create rankings that are relevant and likely to produce the desired information, without neglecting sites that cover a wide variety of topics yet provide a good deal of valuable content on the search's subject. Yahoo! hasn't quite caught up in this area, and there are some holes in their results. That said, as they are not "gamed" as much as Google, they haven't had to apply such strong filters and their results remain solid despite this.

It should be noted that the content does not necessarily have to contain the same keywords to be considered related. The engines are getting far better at determining the themes of sites and knowing which words are related to each other. For example, Google will view the words "personal" and "personalized" as related by theme. You may not rank the same for both words in a search; however, they are tied together, and content using one will reinforce your site's theme for the other.

Keyword Density

Keyword density is the overall percentage of your page content that is made up of the targeted keywords. An additional factor in keyword density is the percentage of your keyword content that uses special formatting such as bold, italic, anchor text, etc. While keyword density is not the be-all and end-all of SEO (there is no single factor that is), it is a factor, and one of the more difficult to optimize properly. While hitting specific densities for both overall content and special formats is easy enough, it becomes more difficult when you consider something even more important than optimization: your real-for-real human visitor!

One should try to attain near-optimal keyword densities using a tool such as Total Optimizer Pro (see below), GRKDA, or other similar software; however, one must always be aware of how the optimized content reads to your visitors. It's important to keep your visitors in mind and your sales message clear, and also to remember that if you have to sacrifice a bit in one area (like keyword density) it can be made up through stronger efforts in others (such as link building).

Keyword density holds the most weight on MSN, followed by Yahoo!, with Google coming in last. This does not mean it should not be considered, for reasons which will follow below.

How A Site Fares In The Results

One factor that is not often discussed among SEO's, and which is not known to many outside the community, is that how your site fares in the results is itself a factor. This factor is a fairly recent addition but is sure to become a stronger and stronger part of the overall algorithm as it matures. Google pioneered this technology; however, Yahoo! appears to be following suit and MSN is sure to do so as well, considering that this is information that is very easy for any engine to track and it truly adds to the "democracy" of the results in that it becomes the user's "vote" that helps secure or topple a high ranking site.

This factor breaks down as such: the search engine knows when you have clicked on a result. They also know when you have returned to the results to try another site. If a site shows up often for a specific search query yet visitors tend to return to the results quickly after visiting it, the engine can assume that the searcher did not find what they were looking for and the site can be deemed not relevant for that phrase. This factor alone has far-reaching effects on a number of traditionally non-SEO-related factors and pulls them into the SEO realm. Content now has to be more captivating, navigation has to be clear and easily accessed, and the visitor has to be able to find the information they're looking for quickly and easily. If the searcher returns to the search results quickly you will lose a point; if this happens often enough you will lose positioning.
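As a thought experiment, the calculation might look something like this Python sketch. The log format, field names, and 10-second threshold are all invented; no engine publishes its implementation:

clicks = [
    {"url": "site-a.com", "seconds_before_return": 4},
    {"url": "site-a.com", "seconds_before_return": 7},
    {"url": "site-a.com", "seconds_before_return": None},  # never returned
    {"url": "site-b.com", "seconds_before_return": 95},
]

def quick_return_rate(log, url, threshold=10):
    # Share of clicks on `url` where the searcher bounced back quickly.
    visits = [c for c in log if c["url"] == url]
    quick = [c for c in visits
             if c["seconds_before_return"] is not None
             and c["seconds_before_return"] < threshold]
    return 100.0 * len(quick) / len(visits) if visits else 0.0

print(round(quick_return_rate(clicks, "site-a.com"), 1))  # 66.7 -- a bad sign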

Site Structure

The way your site is structured determines how easily a search engine spider can get through it, the priority it gives specific content, and how much code the spider has to weed through to get to your content. Essentially, having a structure that allows the spider to move easily through your website, that places the content areas as high up in the HTML code as possible, and that minimizes the use of formatting code such as the font tag will increase the overall weight of your content and ensure that the content you want the spiders to focus on is what they "see" early on.

Many sites are structured such that the actual content doesn't appear until half-way down the page as far as the HTML code is concerned. Having a content area that starts at line 174 of the code is not a good start when it comes to SEO. While there is no specific answer as to what line the content area should start on, using proper table structures or, better yet, tableless design practices using CSS can greatly increase the weight your content is given. Using CSS we can also significantly reduce the need for formatting code, further reducing the amount of markup the search engine has to go through to get to the content.

The higher up in your HTML the content lies the greater the weight it is given. Optimized site structure, especially in moderate to high competition industries, is one of the first steps one can take to secure a competitive advantage over one’s competition.
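A rough way to see where your own content begins is to count the source lines before your main content marker. The sample markup and the div id are assumptions; use whatever uniquely opens your content area:

def content_start_line(html_source, content_marker):
    # Return the 1-based source line where the content area begins.
    for line_number, line in enumerate(html_source.splitlines(), start=1):
        if content_marker in line:
            return line_number
    return None

sample = """<html><head><title>Example</title></head>
<body>
<div id="nav">...links, banners, scripts...</div>
<div id="content"><h1>Search Engine Positioning</h1></div>
</body></html>"""

print(content_start_line(sample, '<div id="content">'))  # 4

The lower the number relative to the page's total, the better; content starting at line 174, as in the example above, is what you want to avoid.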

Backlinks

Ah, backlinks. Once upon a time, simply securing mass numbers of links to your site using whatever means were available was enough to rocket sites to the top of the rankings. Fortunately for search engine users this is no longer the case. With backlinks, as with websites in general, it's quality that counts. While there are numerous factors affecting the value of a link (many of which were discussed in the article "SEO For Google"), the basics are listed below (a toy scoring sketch follows the list):

  • Age. The older the link the more weight it has. (Google and Yahoo!)
  • Link location. Links higher up on the page hold more weight. (All three)
  • Link location two. Links occurring within content hold more weight than a directory-style link. (Google, and Yahoo! to a lesser degree)
  • Anchor text and formatting. The anchor text and the use of special formats in the text affect a link's weight. (All three)
  • Relevancy. The relevancy of the site linking to you. (Google and Yahoo! predominantly)
  • Number of links. The more links there are on a single page, the less valuable the link to your site from that page is.
  • Non-recip links. Non-reciprocal links hold more weight than reciprocal links. (Google, and Yahoo! to a lesser degree)
  • Authority sites. Links from authority sites (.gov, .edu and respected news and information-related sites) hold more weight.
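Here is the toy scoring sketch promised above, combining these factors into a single number in Python. Every weight is an invented illustration; the engines do not publish theirs:

def link_value(age_years, position_from_top, in_content,
               anchor_matches_keyword, relevant, links_on_page,
               non_reciprocal, authority_site):
    score = 0.0
    score += min(age_years, 5) * 0.5        # older links count more
    score += (1 - position_from_top) * 1.0  # higher on the page is better
    score += 1.5 if in_content else 0.0     # inline beats directory-style
    score += 1.0 if anchor_matches_keyword else 0.0
    score += 2.0 if relevant else -1.0      # relevancy cuts both ways
    score += 1.0 if non_reciprocal else 0.0
    score += 1.5 if authority_site else 0.0
    return score / max(links_on_page, 1) ** 0.5  # crowded pages dilute

# A two-year-old inline link near the top of a relevant page of 20 links
print(round(link_value(2, 0.1, True, True, True, 20, True, False), 2))  # 1.65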

Tying It Together

Knowing this, one must assess the best course of action when launching a new SEO campaign. For the purposes of this conclusion we will assume that the keywords we are targeting are in the moderate to high competition levels. In this event one must balance the various factors and timelines to produce the highest ROI in the short term with an eye on maximum profitability in the long term. What we mean by this is that, with aging delays occurring on Google and to a lesser degree Yahoo!, one should focus first on MSN. This means that when you are adjusting your keyword densities and tweaking the onsite factors early in the campaign you will want to focus on hitting optimal levels for MSN, knowing that Google, regardless of what you do, is unlikely to rank you highly for your primary phrases for some time.

Your link building efforts will need to take into account the long-term objective of ranking highly on Google, with the understanding that MSN is not going to penalize your newly created backlinks with aging delays. A balance of speed vs. perfection will be required. All the links you build should be relevant (if your visitors wouldn't be interested in going to the site then don't link to it), but if you can't always get inline links, or your link will appear lower on the page, you will still want to secure it.

After some time (assuming the right tactics have been used) you will notice your MSN rankings improve. This is a good benchmark for how your site will fare overall. Once you are ranking well on MSN it's time to focus your attention on Yahoo! At this stage you will want to slowly shift the onsite optimization towards Yahoo!'s optimal levels. You may be asking, "Am I about to lose my MSN rankings?" Good question, and the answer should be "no" if you're continuing on the right path. Non-optimal levels in one area can be offset by increased strength in another: while you are slowly shifting the onsite optimization away from MSN's optimal levels, you are continuing to develop more and more links, further strengthening your site in this area to make up the difference.

After a couple of months you will notice your Yahoo! rankings improving. A general timeline would be (assuming you are working diligently at it and are targeting fairly competitive phrases with a new site):

  • 2 – 3 months: MSN rankings secured
  • 4 – 6 months: Yahoo! rankings improving
  • 6 – 8 months: Yahoo! rankings secured and Google improving. Many secondary phrases are attained on Google.
  • 8 – 12 months: Google rankings secured.

The timelines will be quite different if you are working with an existing site (i.e. one with a solid history and good PageRank already), if you are targeting less competitive phrases, and depending on a variety of other considerations.

Conclusion

The path is not an easy one (or SEO's would be out of their jobs); however, with hard work and, perhaps more importantly, constant work, it can be done. Remember, there are currently 10 sites sitting on the first page. Match what they did, do 10% better, and you will be there too.

Resources

Below are a few important resources to help you on your path to higher rankings:

Total Optimizer Pro – Total Optimizer Pro is the tool we use for onsite and offsite competition reporting including keyword density and backlink analysis.

Google’s Webmaster Guidelines – They’re put out by Google but apply to all the major engines. Add this one to your favorites and reference it often.

Search Engine Watch – Great source for news on the search engines in general. Also great coverage of the Search Engine Strategies conferences when they’re being held.

Note: There are resources specific to each engine in the first three articles in this series noted above.

SEO news blog post by @ 9:59 am on May 15, 2006

Categories: Articles, SEO Articles

 
