

SEO Step Four of Ten: Content Optimization

Welcome to part four in this ten part SEO series. The ten parts of the SEO process we will be covering are:

  1. Keyword Research & Selection
  2. Competition Analysis
  3. Site Structure
  4. Content Optimization
  5. Link Building
  6. Social Media
  7. PPC
  8. Statistics Analysis
  9. Conversion Optimization
  10. Keeping It Up

Content Is King

Content is king. More than a truism, the phrase is a mantra. Content is the stuff people are looking for on a website. A commitment to developing and deploying great page, document and site content is a commitment to good SEO.

Comprising the most common site elements, content is the most effective tool SEOs have to work with. Loosely defined as All Things On-Page, the term “content” includes titles, tags, text, in-site links and outbound links. In some SEO practices the acronym ATOP is used to refer to the hands-on work environment (e.g., Mark sends the keyword targets to Jade, whose staff works ATOP in the overall SEO effort). Content optimization is where creative art gets mixed into the webmaster science of SEO.

In the SEO process, content optimization describes most of the hands-on work done to make unique documents place well in search engine rankings. For the purposes of search engine optimization, content either exists, has to be created, or both.

Sometimes optimization of existing site content only requires the SEO to perform minor textual tweaks. Sometimes content does not exist and has to be written by the SEO. Frequently, SEOs come across pre-existing page content that needs to be totally rewritten or redeveloped.

The object is two-fold. The first goal is to feed data to search engine spiders, the second to serve information to human visitors.

Writing for Robots

By basic definition, the goal of search engine optimization is to achieve high search engine rankings. That means writing for robotic consumption. The first rule of writing for robots is: keep it simple.

For all their silicon guts and algorithmic abilities, the robots are not that bright. They cope best with one concept at a time. Though a page might rank well for any number of keywords or phrases, the best site copy is written to focus on one topic per page. Addressing multiple topics per page dilutes the overall effectiveness of a site-wide SEO effort and the ranking potential of individual pages.

Limiting your focus to one topic per page makes it far easier to work keyword targets into each of the basic on-site content elements: titles, meta descriptions, body text and links. When optimizing site content, each of these elements needs to be worked on one-by-one and then examined in relation to the others. In practice, I prefer to work from the top to the bottom of a page before spending the bulk of my time messing around in the middle.

Titles are important

The first page element search engine spiders and most human visitors see is the page title. If you found this article on a search engine or through an RSS feed, chances are the title of the page was used to make the reference link you clicked on to get here. Passing primary topical information to bots and to search engine users, the title of a web document is used by SEOs to address specific keyword targets and to convince human visitors to select the page.

A lot of webmasters overlook the title when designing and maintaining their websites. To make the point, think of the countless number of websites with index pages sporting the title “Home”.

Look at the very top of your screen. See the words beside the Firefox or Internet Explorer symbol? That’s the title of this page. Being published in WebProNews, the title of the original page this piece was published on reads, “SEO Step Four of Ten: Content Optimization | WebProNews”.

Each page in a website should have a unique title. As pages in a website get more specific, so too should their titles. Since SEO is about getting good placements under a variety of keywords or phrases, including “long-tail” placements, topically relevant keywords should be worked into the title of each page.

Here are a few examples of optimized page titles in a general page-tree order:

  1. Eco-Friendly Products for Healing Healthy Hippies :: Green Wingnuts (INDEX page)
  2. Ecological Alternatives :: Healing Healthy Hippies :: About Green Wingnuts (About page)
  3. Magic Healing Balms, Tinctures and Lotions :: Health Products for Hippies :: Green Wingnuts (Product Stock Page)
  4. Organic Yellow Blue Algae Lotion :: Nutritious Health and Healing Products :: Green Wingnuts (Specific Product Page)

Search engines use titles to gauge the topical intent of individual pages in a website. So do human search engine users. It makes sense to give both the information they need to make the decisions you want them to.
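These title rules are easy to check mechanically. The sketch below is a minimal, regex-based audit in Python (the function names and the pages-as-a-dict shape are my own illustrative assumptions, not any tool mentioned in this series); it flags pages whose titles are missing, generic, or duplicated across the site:

```python
import re

def extract_title(html):
    """Pull the <title> text from a page's HTML, or None if missing."""
    match = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    return match.group(1).strip() if match else None

def audit_titles(pages):
    """Flag pages whose titles are missing, generic, or duplicated.

    `pages` maps a URL to its raw HTML. Returns a dict of URL -> problem.
    """
    problems = {}
    seen = {}  # title text -> first URL that used it
    for url, html in pages.items():
        title = extract_title(html)
        if not title:
            problems[url] = "missing title"
        elif title.lower() in ("home", "untitled", "index"):
            problems[url] = "generic title"
        elif title in seen:
            problems[url] = "duplicate of " + seen[title]
        else:
            seen[title] = url
    return problems
```

A regex is crude next to a real HTML parser, but it is enough to catch the “every page is called Home” problem described above.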

Meta Descriptions Make a Difference

There are dozens of meta tags that have been used in the history of search engine optimization. The only extremely important one is the meta DESCRIPTION tag. Though found in the source-code and not part of the visible website, the meta description tag can have a decisive impact on rankings and selection.

Search engines use the meta description to help confirm the topical intent of web pages. They also use it for a much more practical purpose. The description is often used to phrase the short paragraph found under the title in search engine results. When a search engine user is deciding which link to click, a well written meta description might make the difference. Don’t ignore this tag; each page should have a unique one.

<meta name="description" content="Green Wingnuts makes healing products for healthy hippies. Ecological alternative health products for a better planet" />
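The same kind of audit applied to titles works for descriptions. A small Python sketch (again regex-based, with hypothetical helper names) that flags pages with missing or duplicated meta descriptions:

```python
import re

def extract_description(html):
    """Pull the content of the meta description tag, or None if absent."""
    pattern = r'<meta\s+name=["\']description["\']\s+content=["\'](.*?)["\']'
    match = re.search(pattern, html, re.IGNORECASE | re.DOTALL)
    return match.group(1).strip() if match else None

def find_description_problems(pages):
    """Return URLs with missing or duplicated meta descriptions.

    `pages` maps a URL to its raw HTML.
    """
    seen = {}  # description text -> first URL that used it
    problems = {}
    for url, html in pages.items():
        desc = extract_description(html)
        if desc is None:
            problems[url] = "missing description"
        elif desc in seen:
            problems[url] = "same description as " + seen[desc]
        else:
            seen[desc] = url
    return problems
```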

Visible Elements, Text, Images and Links

When approaching a fresh optimization project, SEOs take stock of what they have to work with. SEOs often think like doctors when assessing a website, with the understanding that they could do quite a bit of harm if they are not extremely careful. More often than not, changes made to titles and meta descriptions are beneficial to clients. As they are frequently overlooked or under-utilized, augmenting the titles and descriptions of pages usually helps a site achieve better rankings. Changes to the text that appears on a page, on the other hand, might unleash a host of unintended consequences. Aside from the chance an SEO might mistakenly change the message the client is trying to convey, messing around with body-text might also damage current search engine rankings. Keep that in mind as we move into making content optimization decisions.

The first task in content optimization is analysis. Having a full understanding of where a client’s web pages rank, under which keyword phrases, and the degree of success current placements enjoy is critically important for making decisions about what to work on. Analysis requires data and data requires information.

In an earlier part of this series, Dave Davies addressed Keyword Research and Selection and the making of a list of several keyword phrase targets. Content optimization analysis is about figuring out which pages are most relevant to keyword phrase targets on the list.

Almost any page on a site has a good chance to achieve strong search engine placement under a limited number of keyword phrases. In deciding which phrases to apply to which pages, I start by dividing items on the keyword selection list into categories ranging from general to specific.

On the INDEX page of the Green Wingnuts site, the phrase “Green Wingnuts” would be the most general phrase as it is the business name of the client. The target market is deemed to be health conscious hippies, hence the slightly more specific variations on “healthy hippies”. Ecology is an important interest for most health conscious hippies, thus the use of “Eco-Friendly Products”. In this example, the index page is primed to rank for three unique keyword phrases and is easily associated with variations on each.
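One way to keep this general-to-specific assignment organized is a simple page-to-phrases map. The sketch below uses hypothetical URLs for the Green Wingnuts example (the file names are my own, matching the page-tree titles shown earlier):

```python
# Hypothetical keyword-to-page map for the Green Wingnuts example,
# ordered from the most general phrases (index page) down to the
# most specific (an individual product page).
keyword_map = {
    "/index.html": ["Green Wingnuts", "healthy hippies", "eco-friendly products"],
    "/about.html": ["ecological alternatives", "about Green Wingnuts"],
    "/products.html": ["healing balms", "tinctures", "lotions"],
    "/products/algae-lotion.html": ["organic yellow blue algae lotion"],
}

def pages_for_phrase(phrase, mapping):
    """Return every page assigned a given keyword phrase (case-insensitive)."""
    phrase = phrase.lower()
    return [url for url, phrases in mapping.items()
            if any(phrase == p.lower() for p in phrases)]
```

Keeping the map explicit makes it obvious when two pages are competing for the same phrase, which is exactly the dilution problem described earlier.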

At first mention, content optimization might be thought to be about writing primarily for search engine spiders. It’s not. Well optimized website content should be created for live-human visitors and deployed in a way that draws the reader towards a decision. Anyone can talk to a bot. Compelling website visitors to commit to an action and achieve a conversion is a bit more difficult.

As noted earlier, a good working rule is to stick to one topic per page and to consider the overall website as a document tree. The top of the tree is the INDEX page. Below the INDEX are the second or upper-level pages that tend to describe the company, its mission, goals, general services, and contact information. Pages found on subsequent levels of the website tend to feature more specific information the deeper a document is found on the tree. In the Green Wingnuts example, you can see in the titles how content gets more specific as we descend down the document tree.

Writing for web-based readers and search engine spiders is much like writing for newspaper readers. Because the web is a dynamic environment, readers have notoriously short attention spans. Important points and keyword phrases need to be mentioned early in the copy and, by the end of the third short paragraph, the reader should know what they are supposed to do next. Subsequent paragraphs are used to support the story told by the first three. The goal is to hold their interest long enough to confidently direct them to the next step.

For instance, when writing copy for a real estate website, I want to ensure the readers are A) getting the information they need to assess the local area and decide they want to live there, B) understanding that the realtor is there to provide whatever they need to make a decision, and C) confident enough to know how to move to the listings of properties for sale.

When applying text to a page, content optimizers need to think about its placement against other elements present on the page. How headlines or “strong” text look beside an image is as important as the slight algorithmic bump that emphasized text brings. More important to the goal of improving the page is making it accessible to all users. Adding descriptive alt attributes to images helps visitors who use screen readers and gives SEOs the opportunity to insert relevant keywords. While I still use <h1> and <h2> tags, I tend not to worry as much about SEO considerations as I do page layout considerations. As long as the target keyword phrases are prominent in the titles, meta description and body text, and judiciously used as anchor text, I trust the search spiders to find them.
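Alt text is another element worth auditing mechanically. A small regex-based sketch (fine for a quick pass over simple markup; a real HTML parser would be more robust) that lists images lacking a usable alt attribute:

```python
import re

def images_missing_alt(html):
    """Return the src of every <img> tag lacking a non-empty alt attribute."""
    missing = []
    for tag in re.findall(r"<img\b[^>]*>", html, re.IGNORECASE):
        src_match = re.search(r'src=["\']([^"\']+)["\']', tag)
        alt_match = re.search(r'alt=["\']([^"\']*)["\']', tag)
        # Flag images with no alt attribute at all, or an empty one.
        if src_match and (alt_match is None or not alt_match.group(1).strip()):
            missing.append(src_match.group(1))
    return missing
```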

I am far more concerned about where the pages I work on are being found. An emerging consideration in content creation asks the question, “What if it plays better in Pittsburgh than it does in Cleveland?” Search engines are getting far better at delivering the right information to the right person. Knowing that there are fewer common standards in search engine results, content optimizers have to think about the regionalization of search.

Finding your regional audience

One piece of SEO software I really like is called Enquisite. Designed to tell users how pages within their websites rank from the point of view of search engine users in regional markets around the world, Enquisite provides extraordinary information about what ranks well where. Having used Enquisite for over a year, Metamend finds it an indispensable tool.

When we develop new content or think about making changes to existing page content, we check how that site is performing in regional search markets using Enquisite. Because search engines have become extremely good at targeting where a search engine user is located, they are able to serve regionally relevant information to different users in different places. While the overall objective is high rankings for search queries everywhere, the advent of personalized, localized and “universal” search results makes us consider creating regionally specific content for the strongest markets indicated by Enquisite.


Helping site visitors move from their point of entry to an essential action or a conversion is an important part of content optimization which will be fully addressed in the ninth essay in this series. To touch on it briefly, if the overall site optimization effort goes according to plan, search engine users will be able to find specific product pages on the first page of search results. That’s an optimal visitor but a content creator has to think about directing visitors who find their way to a page from a link on another site.

Internal links are important enough to obsess over. Designing a practical and elegant navigation path through a website is essential to gaining and retaining converting visitors. A big part of an elegant navigation path is how internal links are written and phrased, a process that also has an effect on a search engine’s impression of the site.

Internal links should be short and, whenever possible, phrased with the keyword targets most relevant to the page the link leads to. A link leading to “Health Products” is far more compelling than one leading to “Green Wingnuts Products” and gets another mention of a target keyword phrase in an area that associates it with the page the link leads to. A similar approach should be taken to phrasing links in a sitemap file.

Content optimization comprises the bulk of the work SEOs do when working on a website but that work doesn’t stop when the initial optimization process ends. Content optimization also includes the regular creation of new pages and periodic changes of existing content. These topics will be covered in future essays in this series, most likely in the ninth and tenth articles, Conversion Optimization and Keeping it Up.

More Info on This Series

This article is part of a ten part series of essays on SEO written by search marketing experts from several unique disciplines. The series is being supplemented by a weekly show on Webcology, Thursdays at 2 PM Eastern. Be sure to tune in or download the podcast to hear the authors talk about their takes on search marketing.

The next article in this series will address one of the most important aspects of an overall SEO campaign, Link Building. That will be in two weeks, as next week’s Webcology broadcast will be pre-empted by WebmasterRadio coverage of the SMX-West conference in Santa Clara.

About the author:

Jim Hedger is a veteran SEO, a good friend, and reporter for Webmaster Radio.

Next week the topic will be site structure and will be written by Beanstalk author and Director of Optimization, Daryl Quenet. Daryl will of course be on the show with us next Thursday along with some great guests.

SEO news blog post by @ 1:30 pm on February 21, 2008

Categories:SEO Articles


Part Three of Ten: Site Structure

Welcome to part three in this ten part SEO series. The ten parts of the SEO process we will be covering are:

  1. Keyword Research & Selection
  2. Competition Analysis
  3. Site Structure
  4. Content Optimization
  5. Link Building
  6. Social Media
  7. PPC
  8. Statistics Analysis
  9. Conversion Optimization
  10. Keeping It Up


Website structure and SEO are a combination of topics that I’ve always had a particular interest in because of my background in software engineering. I have worked on or maintained over 150 corporate websites and have seen many of the things that can go wrong, which can seriously impact a website’s operation and search engine rankings.

Of the three pillars of SEO (structure, content, and links), I find the structure of a website to be one of the most underrated elements, even among search engine optimization companies. The structure of a website consists of several interdependent elements: the code behind your website, how your website interlinks, and the technologies used in your website.

At this point I’m going to strongly recommend that you use Firefox with the Web Developer Toolbar installed. The toolbar gives you an easy way to validate your website, test your site at multiple screen resolutions, and perform around 100 other functions.

Valid Markup and Cascading Style Sheets (CSS)

I have made it a practice to develop all my projects in XHTML 1.0 Transitional (my personal preference, so I can use target="_blank" and rel="nofollow" attributes) or XHTML 1.0 Strict and CSS 1.0. XHTML is a reformulation of HTML 4 as an XML 1.0 application. It is a very clean and semantic markup language which will also force you to write cleaner code. Whether you choose XHTML or HTML 4, your code will be friendly to the search engines (stay away from 3rd party standards like IHTML).

Cascading Style Sheets (CSS) give us the ability to abstract the design of a page, or an entire site, into a secondary document. This brings a lot of advantages and very few disadvantages. By removing redundant design code from your pages you place the content closer to the start of the document while reducing your markup-to-content ratio. It also makes your website easier and more cost effective to maintain, as you can implement simple design changes by editing one file.

When converting a website from table-based design to pure CSS-based design there is generally around a 40% decrease in code. The reason is that when most people use tables they end up placing tables within tables within tables, each with its own attributes (height, width, border, etc). Multiply all that redundant, unneeded markup by the number of pages on your site and you’ll quickly see how Google (or any other search engine) will be able to index your website more efficiently.
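You can put a rough number on that markup overhead. The sketch below estimates a page's ratio of visible text to total page size; the absolute figure is crude, but comparing a table-based and a CSS-based version of the same content makes the difference visible:

```python
import re

def text_to_markup_ratio(html):
    """Rough ratio of visible text length to total page size.

    A higher ratio means less markup overhead for the same content.
    This ignores scripts/styles and is only meant for comparisons.
    """
    text = re.sub(r"<[^>]+>", "", html)       # strip all tags
    text = re.sub(r"\s+", " ", text).strip()  # collapse whitespace
    return len(text) / len(html) if html else 0.0
```

Running this over the same paragraph marked up with nested tables versus a single styled element shows the table version carrying a much lower text ratio, which is the indexing-efficiency argument above in miniature.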

In my research and experience, I have concluded that using these two technologies together helps guarantee your website’s success, especially given their compatibility with Google. If you do any research on this topic you will also find a recurring mantra among CSS fanatics: tables are for tabular data, not design.

You’ll find that most of the highly organically ranked SEO companies implement CSS based design on their own websites. For examples of CSS based design check out Beanstalk Search Engine Optimization, SEOMoz, and Quenet Consulting.

Website Templating

Now I’m going to start this section with a rant about Dreamweaver templates and how useless they are. As an SEO / web developer there is nothing I loathe more than seeing a Dreamweaver template. If you’re going to template a site, use a technology like Server Side Includes, PHP includes, or ASP includes. The disadvantages of Dreamweaver templates are:

  1. Embedded comments in your code can wreak havoc on Keyword Density Tools
  2. If you need a non standard footer in an index file you will need to break it from the template, creating issues for future template updates.
  3. If you have a disagreement with your web developer / designer and part company, and they don’t supply you with the template, it will cost you.

When building websites I personally use PHP for implementing Server Side Includes. PHP is a relatively easy language to learn for implementing simple things like includes. It is also one of the most popular Apache modules; as of April 2007 there were 20,917,850 domains and 1,224,183 IP addresses with it installed. PHP is also available for the Microsoft IIS (Windows Server) web server.

Search Engine Friendly URLs

One thing that I can’t stress enough is to stay away from dynamic URLs: addresses with variables and values following the “?” character. Google used to state that it had trouble indexing sites with dynamic URLs, and to a degree this still holds true. If you are going to use dynamic URLs, always try to have fewer than two variables in your URL. I have seen sites with excessive products and URLs where Google, Live and Yahoo all have a different number of pages cached.

A better approach is to rewrite your URLs. On the Linux side Apache has mod_rewrite, and for Windows you can use ISAPI Rewrite. When you implement a URL rewriting system you are essentially creating a hash URL lookup table for your site; when a server query comes in, it checks the hash table for a match and serves the corresponding entry.

To put it into simple terms what we strive to accomplish with URL Rewrites is to mask our dynamic content by having it appear as a static URL. A URL like Article?Id=52&Page=5 could be rewritten to /Article/ID/52/Page/5/, which to a search engine appears to be a directory with an index.htm (or whatever default / index page your particular web server uses). To see an implementation of Mod Rewrites check out Dr. Madcow’s Web Portal in the Article Section, and Link Archive.
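The lookup the rewrite engine performs can be illustrated in miniature. This Python sketch is not mod_rewrite itself, just the idea: it maps a rewritten path like the one above back to the script name and the parameters the dynamic page actually expects:

```python
def rewritten_to_query(path):
    """Map a rewritten path like /Article/ID/52/Page/5/ back to the
    script name and query parameters the dynamic script expects.

    Assumes the rewrite scheme is /Script/Key1/Value1/Key2/Value2/.
    """
    parts = [p for p in path.strip("/").split("/") if p]
    script, pairs = parts[0], parts[1:]
    # Pair up alternating path segments as key/value parameters.
    params = dict(zip(pairs[0::2], pairs[1::2]))
    return script, params
```

A real mod_rewrite or ISAPI Rewrite rule does this translation at the server level, so the search engine only ever sees the static-looking path.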

Dynamic Websites and Duplicate Content

One recurring theme I see in a lot of dynamic websites is that they can present the same content on multiple pages. An example of this is a website that lets you “view a printer friendly version of this page”; a better implementation would be a printer-friendly Cascading Stylesheet.

Another goal is to avoid having extra URLs on your site, such as links that change currency through a redirect script, links to “Email to a friend” pages, or anything similar. Always use forms to POST data like this back to the same page, or to a static page, to reduce page count. This issue seems to plague a lot of custom-developed ecommerce systems and CMSes. I’ve actually seen CMSes that present up to five URLs / links for each page; in the long run the spiders got so confused indexing the catalog that some of the main content pages were not cached.

Internal Site Navigation

If built properly, most websites will never need an XML sitemap, other than to get new pages indexed that much quicker (ecommerce and enterprise sites being exceptions). I will however recommend that every website have a user-accessible sitemap linked from every page, both to aid your users and for internal linking.

Most sites with indexing problems have issues with their internal page linking structure. The biggest of these issues is websites that implement a pure JavaScript-based navigation system; these systems depend on JavaScript to insert HTML into pages as they are rendered. Google can parse JavaScript menus to find URLs, but all of those pages will only be linked from the JS, not from the pages they are located on (expect no internal PageRank passing). The best JavaScript menus are menus that manipulate the code on your page to change which sections are displayed via CSS. An example of a hybrid CSS / JavaScript menu that I like is QuickMenu by OpenCube (these guys have a great support department).

Keep in mind that the more internal links you have to a page, the more internal strength that page will be given. So when in doubt, link it up.

Testing Your Site Structure

When it comes to reliable website deployment, all I can say is “test it, test it, and then test it some more”. When testing structure I rely on three different programs / Firefox extensions. The first is Xenu Link Sleuth, a great tool to run on your website to figure out how many pages can be spidered and to find dead links. The second is the Web Developer extension for Firefox; make sure you always validate your code when you make changes. The last is to consult Google and Yahoo with a site: query to see how many pages are in their indexes compared to how many pages Xenu found (don’t use Live’s site: function; it is useless).

After you’ve finished testing your code if you need to debug it I strongly recommend the Firebug Firefox Extension, and the IE7 Developer Toolbar.


When trying to maximize your organic rankings, your internal structure is paramount; consider your site structure the foundation of your house. If the foundation is not built adequately, the house may be livable but may have long-term issues. With websites, the long-term issue is a failure to maximize the ROI of your website, so practice safe and smart structure.

SEO news blog post by @ 1:00 pm on February 14, 2008

Categories:SEO Articles


Part Two of Ten: Competitor Analysis

Welcome to part two in this ten part SEO series. The ten parts of the SEO process we will be covering are:

  1. Keyword Research & Selection
  2. Competition Analysis
  3. Site Structure
  4. Content Optimization
  5. Link Building
  6. Social Media
  7. PPC
  8. Statistics Analysis
  9. Conversion Optimization
  10. Keeping It Up

What is a Competitor Analysis?

Have you ever wondered how a particular competitor always does so much better than you do in the search engines or online overall? A competitor analysis is one very effective method of deconstructing their online marketing strategy to discover how they are doing so well.

What Exactly Can a Competitor Analysis Reveal?

This is a very common question because many site owners don’t know the lengths that a competitor may have gone to obtain top rankings. The following examples are some of the discoveries I have uncovered in a typical competitor analysis:

  • By examining a competitor’s link structure I have found that many of the links with the most credibility came from websites the competitor actually owned. (Determining the ownership of the domain names required some sleuthing because the whois information was ‘private’ but ultimately the info became available.) In a couple of cases several of these domains had legitimate websites and this prompted some great ideas for my client to attain more traffic.
  • While researching a competitor I noticed that although the competitor’s website was very similar to my client’s, there was one major difference; the competitor’s website structure was far better optimized. By outlining the structure the competitor used and improving on it with my own expertise our client had the information he needed to apply changes to his own site.
  • In another instance I provided a client the list of all the pay per click keywords and organic keywords that each competitor was currently using. The client was flabbergasted when she realized just how many keywords she had missed promoting for her own comparable services.

The Basics of Conducting Your Own Competitor Analysis

Now that you have seen some examples of what can be gleaned from a competitor analysis, you might want to conduct one of your own. For the purpose of this tutorial I am assuming that you are fairly new to SEO, so I created a basic plan that works for most users; but even this will require a little preparatory reading. The following is a list of essential reading material:

Many more free SEO tutorials are available if you find yourself needing more information. The following is an outline of the most revealing steps with the least amount of technical expertise required. Please keep in mind that the objective of this competitor analysis is to compare what you find to your own website later on. What you find may not seem earth shattering (or it might) but this analysis is meant to show you what you might be missing:

Competitor Walkthrough

Grab a piece of paper and a pen and while you walk through your competitor’s website look for any particularly obvious search engine optimization techniques. Here are some elements you should check:

  • Does the title tag appear well written and if so is there a common syntax used throughout the website?
  • Look at the source code of the home page and search for "H1", "H2" or "H3". Do any of these tags show up? If so, the competitor is using heading tags within the page. Now try identifying the text they used in the heading; you will likely find the competitor’s key phrase within the tag.
  • Check if the navigation is search engine friendly. Sometimes the navigation is a drop-down menu; make sure it is a type that is search engine friendly. If not, check the footer of the page and see if a text menu is placed there.
  • Keep an eye out for a pattern of keywords being used in text links. Certain words are likely to appear more often and these are likely some of the target phrases your competitor has decided to focus on.
  • Look for nofollow tags. Nofollow tags are often used to channel PageRank efficiently throughout a website. This is called a themed structure and it can have incredible ranking benefits. If you see a pattern of nofollow tag use then you can be relatively certain your competitor has/had a well-informed SEO firm on hire.
  • While you roam through the site look for pages that have particularly high Google PageRank and try to identify why. In most cases these pages have information that visitors decided to link to. Perhaps this will give you some ideas for creating similar quality content for your website.
  • Check the site for the presence of an XML sitemap. Usually it will reside at the root of the website, so try typing in the basic URL of the competitor’s website and adding (minus the quotes) "/sitemap.xml" on the end. The details within the sitemap might be a little confusing to you, but just acknowledging that the competitor has one is noteworthy.
  • Have you found any incidences of spam throughout the site? Take note: I have lost count of how many competitors succeeded using shady tactics. This doesn’t mean you should copy them, but it may at least give you yet another indication of what helped the competitor attain rankings. Believe me, in most cases these sites will get caught with their hands in the cookie jar, at which point you won’t want to be associated with the same tactics.
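Several of the checks above can be scripted for a quick first pass. A rough Python sketch (regex-based, so only suitable for simple pages; the function name is my own) that pulls the title, heading text and nofollow count from one page's HTML:

```python
import re

def quick_walkthrough(html):
    """Summarize a few of the on-page signals from the checklist above."""
    title = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    headings = re.findall(r"<h[123][^>]*>(.*?)</h[123]>",
                          html, re.IGNORECASE | re.DOTALL)
    nofollow = len(re.findall(r'rel=["\']nofollow["\']', html, re.IGNORECASE))
    return {
        "title": title.group(1).strip() if title else None,
        "headings": [h.strip() for h in headings],
        "nofollow_links": nofollow,
    }
```

Running this over a competitor's home page gives you the title syntax, the heading phrases and a hint about nofollow channeling in one glance, though the manual walkthrough is still where the real insight comes from.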

I can’t possibly list everything you need to keep an eye out for when walking through a competitor’s website; at least not in an article format. Just keep an eye out for anything that looks particularly purposeful in the site and linking structure as well as the content of the website. If you find something you can’t be sure is worth noting, then try researching it online; chances are someone has written about the topic/concept or can provide you advice in a forum.

Backlink Analysis

This portion of the analysis will require that you use one of the following link analysis tools: OptiLink (not free but my first choice) or Backlink Analyzer from SEO Book (free). In each case these tools have excellent help files that I suggest reading in order to get the best results from the data they generate.

In this particular stage you are going to use your new tool to analyze the first 1000 backlinks of your competitor’s domain.

Program Setup Note: Be certain to set up the software to acquire Google Rank and Alexa Rank information for each backlink and filter out any rel=nofollow links. The setting is easily found on the front of both applications with the exception of the rel=nofollow which is an option in Optilink but automatically checked in Backlink Analyzer.

When the report is completed sort the backlinks by both PageRank and then Alexa Rank; examine each sorting separately.

Why Are Both PageRank and Alexa Rank Used?

The reason both are used is that each has notable advantages and disadvantages. PageRank is notoriously unreliable, especially lately, since Google now penalizes the PageRank of any site with any relation to link buying; as a result, sites with low PR could be missed as quality sites. Alexa Rank, meanwhile, is a decent indicator of a site’s popularity, but I can’t rely on it alone since it is not an established indicator of how well a site is regarded in Google. Between the two stats, however, we can glean a good indication of the sites that have the best reputation for link building.

Creating a List of Authority Competitor Backlinks

Using Excel or another spreadsheet application, copy and paste the data you received from OptiLink or Backlink Analyzer into a worksheet. Then duplicate the sheet so that you have an exact copy of all the data on a second worksheet. Now follow these steps:

  1. On the first worksheet sort the data by Google PageRank (PR) from highest to lowest. Now remove all of the pages with a PageRank lower than 4 so you are left with the best sites according to this data, or simply separate the lower-PageRanked sites so they don’t get in the way.
  2. On the second worksheet sort the data first by Alexa Ranking (lowest to highest numbers) and then do a secondary sort by Google PageRank (highest to lowest numbers). Delete all sites that have no Alexa Ranking (shown as “nm” in OptiLink) or otherwise partition them from your more valuable data.

Now you have two excellent worksheets that provide lists of authority pages that have links pointing to your competitor.
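If you prefer scripting to spreadsheets, the two worksheets above can be produced programmatically. This is a minimal sketch only, assuming the exported report is a CSV with hypothetical column names url, pagerank and alexa (with OptiLink’s “nm” marking a missing Alexa Rank):

```python
import csv

def load_backlinks(path):
    # Read the exported backlink report (columns assumed: url, pagerank, alexa)
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def by_pagerank(rows, min_pr=4):
    # Worksheet 1: keep PR >= 4, sorted highest PR first
    kept = [r for r in rows if r["pagerank"].isdigit() and int(r["pagerank"]) >= min_pr]
    return sorted(kept, key=lambda r: int(r["pagerank"]), reverse=True)

def by_alexa(rows):
    # Worksheet 2: drop rows with no Alexa rank ("nm"), then sort Alexa
    # ascending with PageRank descending as the secondary key
    kept = [r for r in rows if r["alexa"] != "nm"]
    return sorted(kept, key=lambda r: (int(r["alexa"]), -int(r["pagerank"])))
```

The exact column names will depend on how your tool labels its export; adjust accordingly.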

How to Use the Backlink Data

Take some time now to filter the links by domain and you will see how many links per domain each competitor has. If one website links to your competitor a great deal, it is usually because the competitor either owns that website or has purchased a link on it. To find out whether your competitor owns the website, try running a Whois lookup on the domain.

Also check how many of the pages listed come from the competitor’s own website. If you see a great deal from their own website, then you can be relatively assured they have good content, which is important to note; perhaps you need to focus on better content on your own website, or on how to get others to notice the good content you have.

Now the most logical step is to figure out which links are worth getting for yourself. Chances are a decent number of the links you found are from pages that would be willing to link to you as well.

Don’t Lose Focus on Your Own Website

So now you have a few tools to conduct a cursory competitor analysis. You will likely find some very useful data that you can act on but is this all you need to do? Is a competitor analysis going to be the golden key to increased profits? No. I have a great deal of faith in competitor analysis because I know determining what a competitor is doing successfully can improve a marketing plan dramatically. That said, you also have to pay close attention to your own website and the quality information that can be gained from using free tools like Google Analytics or handy paid tools like ClickTracks Professional.

Using a quality analytics program will allow you to get as granular as monitoring the success of each page in your website with details such as: where did visitors come from (somewhere in your site or from another?), how long on average visitors stayed at a particular page, what keywords led visitors to the page (if any), and much more.

With proper analytics you can actually compare and contrast the effects of minor edits to a page’s content; testing one change at a time is split (A/B) testing, while testing several elements at once is multivariate testing. For example, you could test whether a better image or a better tag line improves visitor retention on a page deep within your site that many visitors were entering through, even though it was not originally designed as an entry page.

Truly, the sky is the limit with analytics, and it would be irresponsible of me to claim that competitor analysis is more important than making your own website run smoothly. Do yourself a favour: if you haven’t already got an analytics program running on your site, get it done now, or learn how to use the one you have. It will pay off in the long run, especially when you want to monitor the success of the tactics you apply based on your competitor analysis findings.

About the author:

Ross Dunn is the owner of StepForth Web Marketing and an all-round good guy and good SEO.

Next week the topic will be site structure and will be written by Beanstalk author and Director of Optimization, Daryl Quenet. Daryl will of course be on the show with us next Thursday along with some great guests.

SEO news blog post by @ 12:55 pm on February 7, 2008

Categories:SEO Articles


SEO Step One Of Ten: Keyword Research

Back in October 2004 I launched a series of articles outlining the ten crucial steps to a well optimized website. The steps were:

  1. Keyword Selection
  2. Content Creation
  3. Site Structure
  4. Optimization
  5. Internal Linking
  6. Human Testing
  7. Submissions
  8. Link Building
  9. Monitoring
  10. The Extras (all those things that didn’t fit in the first 9 steps)

Well, in case you’ve been asleep for the last few years, or in case you’ve just recently joined us in the SEO realm, I – along with some of my good friends in the web marketing world – have decided to re-write the series with new information and new perspectives.

The New Series

In our updated series we’ll be dropping some of the articles and adding others to account for changes in the industry. Another major change is that we’re going to complement the series with a weekly segment on Webmaster Radio’s Webcology, Thursday afternoons at 2PM EST, where we’ll be conducting interviews and discussing tools with their makers to help our readers and listeners make the most of this information. If you miss the show, you can always download the podcast free of charge afterwards.

The 10 steps covered in this series will be:

  1. Keyword Research & Selection
  2. Competition Analysis
  3. Site Structure
  4. Content Optimization
  5. Link Building
  6. Social Media
  7. PPC
  8. Statistics Analysis
  9. Conversion Optimization
  10. Keeping It Up

Step One: Keyword Research & Selection

There are two times in a site’s life when keyword research is conducted – when researching a site to rank in the organic results on the search engines and when researching keywords for a PPC campaign. In our article today we’re going to focus on the former and save the research involved with PPC campaigns for step seven in this series.

So we’ve got the topic down to “just” keyword research and selection for organic SEO campaigns – from there the topic once again gets split into a variety of areas. Those that we will cover here are:

  • The raw data
  • Studying those who’ve gone before
  • Understanding your choices

The Raw Data

The raw data is the raw estimated searches/day that you can expect a phrase to get on the major search engines. There are a number of tools you can use to compile this information. Here are some of the more commonly used:

Overture Keyword Suggestion Tool – Link Removed

Yahoo!’s keyword suggestion tool. It’s fast and it’s free, but it has some serious drawbacks. The tool often mixes singular, plural and common misspellings into one result, so it can lead you astray (admittedly it’s gotten much better lately, but it’s still far from perfect).

Is a bed and breakfast in Banff, Alberta better off targeting “banff accommodation” or “banff accommodations”? How about the very common misspelling “banff accomodations”? That said, the tool draws on easily the largest pool of search data made available in this way, which gives it a huge edge in accuracy.
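If a tool mixes variants together (or reports them all separately), you can total the volumes yourself by mapping each variant onto a canonical phrase. A minimal sketch; the variant map is something you would build by hand for your own keywords:

```python
def combine_variants(volumes, variant_map):
    # volumes: {phrase: estimated searches}
    # variant_map: {variant phrase: canonical phrase}
    totals = {}
    for phrase, count in volumes.items():
        canonical = variant_map.get(phrase, phrase)
        totals[canonical] = totals.get(canonical, 0) + count
    return totals
```

This lets you see the combined demand for a concept while still keeping the per-variant numbers for deciding exactly which spelling to target.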

WordTracker – Link Removed

WordTracker is easily one of the most popular paid keyword research tools. It solves the singular vs. plural vs. misspellings problem; however, the data it accesses comes from a few meta engines and is not as comprehensive as one might like.

They offer a free trial and have options to pay for a single day or up to a year, so they serve everyone from people who need a quick round of research on one site to SEO firms who use it on a daily basis. It sells for $59/mth.

Keyword Discovery – Link Removed

This tool is very similar to WordTracker in its advantages and disadvantages: better specification of keywords, but a smaller pool of data to base them on. I personally prefer Keyword Discovery for some of its features and for the ability to export data for clients to view easily. Of course, that could well be due to my greater experience with it.

They have a free trial as well and it sells for $69/mth.

Aaron Wall’s Summary

Noted above are some of the most popular tools and the ones I’ve used the most, but there are others definitely worth a look. Aaron Wall did a great summary on his site of the major tools, with their pros and cons. Admittedly it’s a couple of years old, so some features have changed a bit, but most of it is still valid and accurate.

Now What …

Now that we’ve looked at the tools, let’s take a look at what we’re supposed to do with them. As noted, we’ll cover how to use these tools when launching or updating a PPC campaign in a future article; however, there are still a few considerations to address here.

So let’s get started …

In case no one told you – size doesn’t matter. It’s not how big it is, it’s who’s using it. Let’s use as an example a phrase we at Beanstalk targeted: “search engine positioning”. At first this was our big phrase; it now gets 7,689 estimated searches/mth (a bit higher than it was back then). “search engine positioning services” gets a lowly 2,636 searches/mth. Of course we should be targeting the one with the higher number of searches (or so I thought).

Once we had attained top 3 rankings for both, I started looking through my stats and setting up filters for conversions (forms filled out and visits to our contact page). People who entered “search engine positioning” were certainly interested in our blog and articles, but only those who added the word “services” contacted us. And so the big phrase was abandoned as a target and we began focusing on what I refer to as “buy phrases”. Bigger isn’t better if the people you want are searching with phrases that have a lower search volume.
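The lesson generalizes: compare phrases by conversion rate, not raw search volume. A hedged sketch with made-up numbers of the kind those analytics filters would produce:

```python
def conversion_rates(stats):
    # stats: {phrase: (visits, conversions)} pulled from your analytics filters
    # (e.g. forms filled out, visits to the contact page)
    return {p: (conv / visits if visits else 0.0)
            for p, (visits, conv) in stats.items()}
```

Sorting phrases by this rate rather than by visits is what surfaces the “buy phrases”.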

There’s another time when bigger isn’t better. Which of those two phrases do you suppose we ranked for first? If you guessed the services phrase then you’re right. When you launch a new website (which we had) you’re likely up against sites that have been around for a while, have some solid backlinks and a good number of pages. You’re not going to want to go up against them for the top phrases out of the gate. Choosing to go with phrases that are lower in search volume and lower in competition will almost always result in higher rankings faster, put some money back in your pocket and ready you to go for the bigger phrases.

It’s here that the model we followed works well. When you’re selecting your short term and long term targets it’s wise to choose phrases with the same root keywords (“search engine positioning” and “search engine positioning services” for example). This basically enables you to work towards your long term goals during link building for your short term targets. And who doesn’t like to kill two birds with one stone? Or perhaps you have all the time in the world and you’re one of those people who likes nothing more than working on developing incoming links.

Which brings us to …

Studying Those Who’ve Gone Before

Imitation is the sincerest form of flattery. Let’s just hang onto that thought while we research what those who are successful in your industry are targeting in order to glean some insight into what works.

I’ve recently discovered (much to my pleasure) a very cool tool that, while a bit pricey for some, simplified MANY of the processes of keyword research, tracking and competitor keyword dissection. A company called AdGooroo has created what I’ve now discovered to be an awesome keyword tracking tool (I’d call it keyword research but it does a lot more than list off search phrases). The tool allows you to do the generic keyword research that we’re all used to with the same limitations as the tools above (i.e. Google doesn’t hand out their search keyphrase volumes) but that’s just the first step.

They then take a look at your competitors in both the organic and PPC results, figure out what they’re ranking for and bidding on, and provide some great reports on saturation levels, competition levels, and a lot more. With this in hand you can begin to analyze how they’re ranking (that’ll be covered next week in our article on competition analysis).

The folks at AdGooroo also store historical information so you can look back over trends in the past and compare that to what you see now. As noted, a bit pricey for some but worth it for those who can afford to know this level of information on who’s doing what and what you should be doing.

I should also note that my experience is with their SEM Insight product, which costs $399/mth. They also offer AdGooroo Express, which has many of the same features (but is missing a lot of the ones I personally feel give a researcher a HUGE jump on their competitors). The Express version, however, sells at $89/mth – far more affordable for some. And like all my favorite tools, they provide a free trial. :)

But if you can’t afford that level of information, you’ll want to run ranking reports on all your top competitors (you likely know who these are, but if you don’t – they’re the ones who rank in the top 10 for the most competitive phrases). You can either do this manually or use a reporting tool such as WebPosition Gold (again, it has a free trial).

If you find weaker sites ranking for large numbers of phrases, you know who to watch (again, we’ll get into this more next week). The only problem with this method is that you can only think of what you can think of. The site might be ranking for phrases you never thought to look into and which, in knowing, might provide some great insight into additional targets and tactics. Of course, you might well be from an industry with very obvious and defined keywords.

Understanding Your Choices

So now you’ve got choices to make. You’ve got a list of perhaps hundreds of keywords and you need to shorten that list. The number of phrases you target will only be limited by your site and the amount of time you have to dedicate to it.

You will likely need to pare down your choices to those that will produce the fastest and highest ROI possible. This will likely be the phrases that provide the lowest competition levels for the highest searched “buy phrases”. Once you have attained these rankings you can move on.
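One rough way to shortlist is to score each phrase by search volume relative to competition. The scoring formula below is just an illustrative proxy, not an industry standard; plug in whatever competition measure your tools report:

```python
def opportunity_score(phrases):
    # phrases: list of (phrase, monthly_searches, competing_pages)
    # Higher searches and lower competition -> better short-term target.
    scored = [(p, s / max(c, 1)) for p, s, c in phrases]
    return sorted(scored, key=lambda t: t[1], reverse=True)
```

The top of the resulting list approximates the "fastest ROI" phrases described above; the bottom holds the long-term, go-for-the-gold targets.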

The alternative is to go for the gold and target the biggest phrases in your industry. This will take longer (99% of the time) but might be necessary if there are no suitable secondary phrases. In this event you have to ready yourself for a slow rise to the top and a longer period of stagnant traffic with a big return (hopefully) at the end.

Another major choice you’ll have to make (especially if you have a large number of potential phrases) is whether to start out with a PPC campaign for the traffic or to test keyword phrases for an organic promotion. While these will be covered in more detail in part 7, if you just can’t wait you can find a past article on the subject titled “Using PPC To Maximize Your Search Engine Positioning ROI“.

More Info On This Series

As noted, but worth mentioning again, this article series is being supplemented with a weekly show on Webmaster Radio’s Webcology. Be sure to tune in or download the podcast to get the full information and hear some great interviews with the tool makers and experts.

Next week the topic will be competition analysis and will be written by StepForth, Inc. author and owner Ross Dunn. Ross will of course be on the show with us next Thursday along with some great guests.

SEO news blog post by @ 12:40 pm on January 31, 2008

Categories:SEO Articles


Ten Step SEO Series

This article series is an updated version of the 10-step series we wrote back in 2004. This time we’re supplementing it with interviews on Webmaster Radio, and many of the articles will be written by guest authors – experts in their own fields.

The Series:

  1. Keyword Research & Selection
  2. Competition Analysis
  3. Site Structure
  4. Content Optimization
  5. Link Building
  6. Social Media
  7. PPC
  8. Statistics Analysis
  9. Conversion Optimization
  10. Keeping It Up

SEO news blog post by @ 2:40 pm on January 20, 2008

Categories:SEO Articles


The Dark Art Of Search Engine Optimization

The title of this article is designed to illustrate the point of this article. Today we won’t be taking a look at black-hat search engine optimization tactics. Admittedly, I’ve toyed with them in a “know your enemy” kind of way but I’m no expert on advanced cloaking techniques nor effective link sp@mming tactics. What we’re going to cover here are the hidden (i.e. dark) areas of effective optimization strategy.

I’ve written numerous times in past articles and blog posts that using tricks to rank your site highly is, in the end, ineffective as tricks imply a manipulation of the ranking formula and will eventually become obsolete as the search engines work to advance their algorithms and shut down such possible abuses. But here I’m going to illustrate some of the tricks we use to drive traffic to our site. Is this a conflict? Not really; these “tricks” aren’t so much directed at search engines as they are website owners and visitors. These are marketing tricks, not SEO tricks – they just happen to help you with your rankings.

Before we begin, let’s review an important point about Google. When most people think of Google they think of the dominant search engine (and in that they would be right); HOWEVER, if Google was primarily a search engine they would be much smaller than they are now. No, they are an advertising company – the world’s largest at that. To this end they need traffic, market share, and clicks. They need you to love Google, visit it often, and visit their other properties and offerings such as Gmail. If you do this, the odds of you clicking on one of the paid ads increase and their primary function is fulfilled.

It is this purpose that has driven Google to develop the most complex search algorithm that has ever existed. Their search is their primary source of traffic: the better their results, the more you will return, the greater the likelihood you will click an ad, and the more revenue they generate (thus their continued increases in reported revenue quarter after quarter). Why is this important? Because this is the driving force of their current algorithm, and will be for the foreseeable future, we can assume that any action that increases relevant traffic to your site, increases the stickiness of your site and/or increases the number of links from relevant sites to yours will help your rankings – and help Google keep their visitors loyal.

Let’s also recall the purpose of this article. This is NOT an article about black-hat search engine optimization tactics, it’s about the hidden aspects of SEO that are often overlooked. And so, without further ado, let’s get down to the meat – what are the dark tactics that you can use to boost your website rankings.

Building A Sticky Site

A point I’ve made in past articles, and will reinforce here rather than contradict, is the importance of a sticky site. Of course, monitoring your statistics to assess your visitors’ behavior is an important practice for the conversions on your site; however, its importance from a search engine optimization perspective is often overlooked. I’ve mentioned before and I’ll mention again: the search engines have the ability to monitor the length of time a visitor spends between visits to that engine. If you enter “seo services” into Google and visit the Beanstalk site but spend only 5 seconds there before hitting the back button, Google can infer that the site was not what you were looking for. If 5 or 10 minutes pass before you return to Google, they can infer that you found content useful to your query.

So let’s put that more plainly: having a site on which visitors find what they’re looking for quickly, easily, and in a visually pleasing way will increase their time on your site, which in turn increases the search engines’ confidence that you are relevant for the phrase the searcher queried. This will reinforce that your site does indeed belong among the top sites. As a disclaimer: this works on a mass scale, so don’t go running off and clicking through to your competitors and quickly hitting the back button. First, it’s unethical (like clicking their paid links) and second, it doesn’t work like that (how big a hole would THAT be in the algorithm?), so it would only be a waste of your time.

The how-to of building a sticky site I will leave to designers (being an SEO, my skills lie more in understanding mathematical formulas).

Clickability Counts

The engines know when your site appears in a set of search results, and they further know how often your site was clicked on when it appeared. The more often your site is selected when presented in a set of results, the more relevant it is assumed to be and thus the more entrenched it becomes in that set of results (assuming your stickiness issues are dealt with).

What this means is that your title and description matter – not just as part of the classical search engine optimization tactics we’ve used them for since the ’90s, but also to draw visitors to your site. Fortunately the end goal of the engines closely matches what your own end goal should be for your site: maximizing traffic. Let’s take a look at two example titles the Beanstalk site could have:

An old-school over-optimized title: Search Engine Optimization (SEO) Services Company | Beanstalk Search Engine Optimization | SEO Services, Internet Marketing, Link Building, Consulting, Training & Copywriting

Our current title: Expert SEO Services by Beanstalk

Can you see the difference? While our title changes periodically as we test new titles for clickthroughs, we always keep it short, easily read, and such that the whole title will appear in the SERPs (Search Engine Results Pages). Our clickthroughs are much higher with shorter titles than longer ones, and we have seen the same results with client sites.

The same applies to your description tag, but the rules are a bit different. With your description tag you want to make sure to include your targeted keywords and make the copy compelling to a searcher. The reason for this is that when searched keywords are included in your description, it is typically the description that appears in the SERPs. This gives you an opportunity to determine how your ad to the world appears. You write your title, you write your description – write both well and your clickthroughs will increase. And when your clickthroughs go up, the implied relevancy the engines assume your site has for that phrase will increase with it, and thus so too will your rankings for that phrase.
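Since clickthrough rate is simply clicks divided by the number of times your listing appeared, comparing two candidate titles is straightforward once you have the numbers (the figures in the test below are invented):

```python
def clickthrough_rate(impressions, clicks):
    # CTR = clicks / impressions (times your listing appeared in the SERPs)
    return clicks / impressions if impressions else 0.0

def better_title(a, b):
    # a, b: (title, impressions, clicks); returns the title with the higher CTR
    return max((a, b), key=lambda t: clickthrough_rate(t[1], t[2]))[0]
```

In practice you would run one title for a period, swap in the other, and only compare once each has a comparable number of impressions.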

Getting People To Link To You

We’re not going to bother discussing reciprocal link building, directory submissions or the other usual suspects; there are countless articles out there on those topics. What we’re going to focus on here are tactics for getting articles picked up widely by the resources you want them on (and if you’re reading this – you know it works), as well as ways to get the links that both you and the search engines will love the most: the ones you don’t ask for or work for, outside of creating a great site with useful content. The best part of these links is that they not only boost your link popularity but also tend to drive great traffic to your site. Let’s begin with articles.

When you’re working to publish an article there are two main audiences: the readers and, more importantly, the editors (I say more importantly because they’re the ones who determine whether you have any readers at all). There are some tactics for appealing to both:

  1. Write a compelling title. This gets back to the point I was making in the first paragraph. Everyone is interested in black-hat search engine optimization, even those of us who don’t practice it. Readers will be drawn to it as it receives relatively little coverage, and editors like publishing something that they feel may draw some controversy. While this article doesn’t get into black-hat tactics as some editors may have hoped, it will draw them in and get their attention.
  2. Find quality related resources and get the article published there. I generally use a tool like PR Prowler to find good, quality resources to submit articles to. You can do it manually through a search engine, PR Prowler just speeds up the process so much that after its first use it’s paid for itself. You want the places you submit to, to be related to your industry and you want them to provide a link back. If you can setup that link as anchor text instead of your URL – all the better.
  3. Keep a list and add to it. If you’re going to publish multiple articles don’t start from scratch every time. Keep a list and try to add a few sites to it with each submission. This will keep your list growing and get you more exposure/links as time goes on.
  4. Keep a good relationship with the editors. They are the end-all-be-all of whether this tactic will work or not as a link and traffic building tactic. Make sure you’re polite and don’t write nasty emails if you get declined. Read what they say and make sure to take it into account with future articles.

But what if you don’t want to build links with articles? What if you want to get links the old-fashioned way (and I’m talking about the old, old, old way – you know, before there was any SEO value to it)? What if you would like people to link to you simply because they like your content (I know, shocking, but it actually happens!)? There are a few important rules to follow:

  1. You’ll need to create content that others will want to link to. This is an art in-and-of-itself. I wrote about some of the basic rules involved with this in a past article “Building Link Bait” and so I won’t repeat it here.
  2. Get the bait into social bookmarking sites. This will get people interested in your topic aware of it. If it’s good, they may link to it. Don’t just focus on Digg and the other majors, look around for some industry-specific bookmarking sites. For example, when this article is complete I’ll work to get it into Sphinn, an SEO bookmarking site.
  3. Get the bait into forums and/or blogs. I’m not talking about blog sp@mming here; I’m talking about finding blogs and forums that are RELATED to your topic and whose visitors could be genuinely helped by the tool, information, etc. that you’re providing. Don’t worry if the blog has rel=”nofollow” on the links. The purpose is webmaster awareness, not getting links from the blogs (I’ll leave that to a different article).
  4. Promote the bait on your site. Use banners, links, your blog, etc. to build awareness.
  5. Provide the code to link to your bait. The easier you make it for people to link to you, the more of them will. Provide the code with a text and banner option and you’ll increase the number of people who will link to you.
  6. Put out a press release. If it’s big enough news, put out a press release. If the media grabs it you’ve won the lottery both in publicity and in high valued links.
  7. If the topic of your bait is searched on the engines, rank it. :)


So these are the darker arts we’re talking about. Not black hat – just overlooked more often than not. Add these to your repertoire as you optimize and build links for your site and you’ve given yourself a one-up on most, if not all, of your competition.

SEO news blog post by @ 10:43 am on October 23, 2007

Categories:SEO Articles


Google Algorithm Update Analysis

Anybody who monitors their rankings with the same vigor that we in the SEO community do will have noticed some fairly dramatic shifts in the algorithm starting last Thursday (July 5th) and continuing through the weekend. Many sites are rocketing into the top 10 which, of course, means that many sites are being dropped at the same time. We were fortunate not to have any clients on the losing end of that equation however we have called and emailed the clients who saw sudden jumps into the top positions to warn them that further adjustments are coming. After a weekend of analysis there are some curiosities in the results that simply require further tweaks in the ranking system.

This update seems to have revolved around three main areas: domain age, backlinks and PageRank.

Domain Age

It appears that Google is presently giving a lot of weight to the age of a domain and, in this SEO’s opinion, disproportionately so. While the age of a domain can definitely be used as a factor in determining how solid a company or site is, there are many newer sites that provide some great information and innovative ideas. Unfortunately a lot of these sites got spanked in the last update.

On this tangent I have to say that Google’s use of domain age is, as a whole, a good filter, allowing them to “sandbox” sites on day one to ensure that they aren’t just being launched to rank quickly for terms. Recalling the “wild west days” of SEO, when ranking a site was a matter of cramming keywords into content and using questionable methods to generate links quickly, I can honestly say that adding this delay was an excellent step that ensured the benefits of pumping out domains became extremely limited. So I approve of domain age being used to value a site – to a point.

After a period of time (let’s call it a year, shall we?) the age should have, and generally has had, only a very small influence on a site’s ranking, with the myriad of other factors overshadowing the site’s whois data. This appears to have changed in the recent update, with age holding a disproportionate weight. In a number of instances this has resulted in older, less qualified domains ranking higher than newer sites of higher quality.

This change in the ranking algorithm will most certainly be adjusted as Google works to maximize the searchers experience. We’ll get into the “when” question below.


Backlinks

The way that backlinks are being calculated and valued has seen some adjustments in the latest update as well. The way this has been done takes me back a couple of years to the more easily gamed Google of old. That statement alone reinforces the fact that adjustments are necessary.

The way backlinks are being valued appears to have lost some grasp on relevancy and placed more importance on sheer numbers. Sites with large, unfocused reciprocal link directories are outranking sites with fewer but more relevant links. Non-reciprocal links have lost the “advantages” that they held over reciprocal links until recently.

Essentially the environment is currently such that Google has made itself more easily gamed than it was a week ago. In the current environment, building a reasonable sized site with a large recip link directory (even unfocused) should be enough to get you ranking. For obvious reasons this cannot (and should not) stand indefinitely.


PageRank

On the positive side of the equation, PageRank appears to have lost some of its importance, including its importance as it pertains to the value of a backlink. In my opinion this is a very positive step on Google’s part and shows a solid understanding of the fact that PageRank means little in terms of a site’s importance. That said, while PageRank is a less-than-perfect calculation, subject to much abuse and manipulation from those pesky people in the SEO community, it did serve a purpose; while it needed to be replaced, it doesn’t appear to have been replaced with anything of substantial value.

A fairly common belief has been that PageRank would be, or is being, replaced by TrustRank, and that Google would not give us a green bar to gauge a site’s trust on (good call, Google). With this in mind, one of two things has happened: either Google has decided that both TrustRank and PageRank are irrelevant and scrapped both (unlikely), or they have shifted weight from PageRank to TrustRank to some degree and are just now sorting out the issues with their TrustRank calculations (more likely). Issues with TrustRank may not have been clear while PageRank carried most of the weight in the overall algorithm; with that weight reduced, the issues facing the TrustRank calculations may well be becoming more evident.

In truth, the question is neither here nor there (as important a question as it may be). We will cover why this is in the …


So what does all of this mean? First, it means that this Thursday or Friday we can expect yet another update to correct some of the issues that have risen out of the most recent round. This shouldn’t surprise anyone too much; we’ve been seeing regular updates out of Google over the past few months.

But what does this mean regarding the aging of domains? While I truly feel that an aging delay or “sandbox” is a solid filter on Google’s part, it needs to have a maximum duration. A site from 2000 is not, by default, more relevant than a site from 2004. After a year or so, the trust of a domain should hold steady or, at most, carry a very slight weight. This is an area we are very likely to see changes in with the next update.

As far as backlinks go, we’ll see changes in the way they are calculated unless Google is looking to revert to the issues they had in 2003. Lower PageRank, high-relevancy links will once again surpass high-quantity, less relevant links. Google is getting extremely good at determining relevancy, so I assume the current algorithm issues have more to do with the weight assigned to different factors than with an inability to properly calculate a link’s relevancy.

And in regards to PageRank, Google will likely shift back slightly to what worked and give more importance to PageRank, at least while they figure out what went awry here.

In short, I would expect that with an update late this week or over the weekend we’re going to see a shift back to last week’s results (or something very close to them), after which they’ll work on the issues they’ve experienced and launch a new (hopefully improved) algorithm shift the following weekend. And so, if you’ve enjoyed a sudden jump from page 6 to the top 3, don’t pop the cork on the champagne too quickly, and if you’ve noticed some drops, don’t panic. More adjustments to this algorithm are necessary and, if you’ve used solid SEO practices and been consistent and varied in your link building tactics, keep at it and your rankings will return.

SEO news blog post by @ 3:57 pm on July 10, 2007

Categories: SEO Articles


What To Look For In An SEO

It’s been about two years now that I have wanted to write this article. Why haven’t I until now? Conflict of interest. Until recently I’d have been motivated by that necessary evil … getting business. Each time I started writing this article I subconsciously asked myself, “How can I spin this towards Beanstalk?” You can’t really begrudge me this. Such is the “curse” of living in a capitalist society. Recently however we have put a hold on taking in new SEO clients. The result: consistent questions regarding who people should choose and what they should look for. And so to kill two birds with one stone, I write this now. The first bird killed is my frustration at not being able to properly write a useful article on what to look for in an SEO without bias. The second bird killed is my wasted time outlining over-and-over what people should seek out. Now I can simply point them to this article.

You’ve read this far so you’re obviously interested in finding out what you should look for in an SEO and what you might want to avoid. So let’s get right to it shall we?

Can They Rank Their Own Site?

The first thing you should look for when hiring an SEO is whether or not they can rank their own website. This may seem obvious enough but I can’t count the number of times I have heard from people attracted to Beanstalk’s guarantee because they wasted both time and money on an SEO firm that couldn’t (or didn’t) get the job done. Too often when I take a look at the SEO’s website and research their targeted phrases (usually pretty obvious when you look at the title and heading tags) I find that they don’t even rank for their own phrases.

This is clearly a big strike three (in this case I wouldn’t even give the SEO firm a strike one or two). The only exception to this rule is if they are running a new company or website and have a proven track record from the past which can be used as their reference. In this case any consideration would require research into the individual, company, and circumstances. A good example would be Andy Beal of Marketing Pilgrim. Prior to starting Marketing Pilgrim he had been involved with two other SEO firms. When Marketing Pilgrim started it didn’t rank well, but he was still a great SEO consultant with a solid track record of success.

What Do They Promise?

If you have a new site or a site in a high competition area and you are told that the company can get you great rankings on Google in 60 days they’re either just telling you what they think will make you sign on the dotted line or they have no idea what they’re doing. In either event you’re in for disappointment.

An honest and straight-forward SEO will give you realistic expectations which will generally span over many months and sometimes over years depending on the scope and competition levels involved. If you have a new site competing for moderately competitive phrases, any claims from a company that they will have you ranking on Google in anything less than 5 or 6 months (and even this may be optimistic) are likely untrue.

What Do They Include?

Asking your prospective SEO company what they’ll be including with their services is a perfectly fair question. You don’t need a full breakdown of each and every specific (nor are you likely paying your SEO for this); however, what areas of the site will be changed, how the link building will be undertaken, and the overriding philosophy or approach your prospective SEO company will be taking are all good questions to have answered.

If something doesn’t seem right in what you’re being told, ask in one of the many great SEO forums (see below).

How Are They Backing Their Services?

In one way or another, any good SEO company will be able to back up what they’re offering. When we first started Beanstalk we decided that we were going to do this with a guarantee. Not all companies go this route and there are many excellent SEO’s and firms that provide great services without a guarantee but all such companies will be able to back their work.

To be clear, I know of many good SEO firms that don’t offer guarantees, and I also know some that do offer guarantees but don’t do a very good job. My purpose here, however, is not to point fingers but rather to point out what you should look for and how to tell the good from the bad. If the company offers a guarantee, what is it? I’ve seen a few “we guarantee you’ll be satisfied” statements out there with no qualification as to what “satisfied” means or what will happen if you’re not. If the person or company doesn’t have a guarantee, then what do they have under their belt in the way of reputation? If a company isn’t putting their money where their mouth is, they should have a very good reputation if they want your consideration. Are they well published or active in the SEO forums? Are they active in the SEO community in a public fashion, such as speaking roles or SEO community memberships? If they are, then they have a reputation to protect, and they will be backing every contract with that reputation. This won’t help you recover the money you’ve spent if you don’t get the results you’re looking for, but it will ensure that you’re hiring an SEO who is motivated towards your success.

What Are Some Major Warning Signs That You’re On The Wrong Track?

The term “warning signs” might better be put “red flags,” as the tactics noted here are ones that should send you immediately looking for a new SEO. Prepare to say, “Thank you, but no,” if you hear any of the following among their list of recommendations (and note: there are more than those listed, but these are some of the more common that I’ve seen and heard lately):

  • Say goodbye if you hear an SEO recommend that you build multiple websites, either as a linking tool by linking them together or because it’s easier to optimize a different site for a different engine. Unless you have two or more incompatible topics (a work site and a personal blog, for example) you have no need for more than one site. And as a link building tactic it hasn’t worked in a good number of years.
  • If your SEO is using any kind of tool to automatically generate content, it’s time to shake hands and be done.
  • If your SEO is not doing link building of some type, yet is telling you they can get you rankings for anything but the lowest-competition phrases, you might not need to run, but you definitely need them to justify what they’re saying. If you have a six-year-old site with a lot of good links already, but some onsite issues keep it from ranking, then they may be telling the truth. If you have a new site and/or low link counts, then they are not.
  • It seems obvious, but I have to mention it anyway: if they’re recommending the use of any black-hat tactics then you’re in trouble. I can’t possibly list off everything that fits this category, but a quick read of Google’s webmaster guidelines should help. If you read these guidelines and some of the tactics seem amiss, questioning your SEO is completely justified. You can find some great examples and information on black hat SEO on the Wikipedia site at
  • If they advertise that they will “Submit your website to 18 billion search engines for just $x”, or mention top rankings on engines you have barely heard of, that’s a clear issue. There are a lot of search engines out there, and in fact there are a lot of pretty unique engines with some great offerings. However, when it comes down to brass tacks, there are four engines that matter when it comes to traffic (at least from a universally applicable standpoint). If an SEO is promising you great rankings on an engine like Dogpile, with its whopping 0.5% of the search engine market share, you may want to ask what they can do about the 91.8% of the market that’s controlled by the top four engines (47.9% Google, 28.1% Yahoo!, 10.6% Microsoft and 5.2% Ask).

The Conclusion

I’ve tried to Coles Notes above some of the main issues that I see and hear complaints about and/or get questions on regularly. Of course there are many more. The best advice I can give is: don’t rush into a decision when you’re choosing your SEO firm. Listen to what they’re saying, ask questions, and if you don’t know what questions to ask, take a few hours to find out on one of the many great SEO forums out there. As I don’t want to leave anyone out by listing off some of the ones I visit (and I couldn’t possibly include them all), I’ll just recommend searching for “seo forum” and “seo blog”, visiting some of the sites, and asking what you should be asking. A company called Medium Blue, whose owner I had the pleasure of chatting with on Webmaster Radio a couple of weeks prior to this article’s publication, wrote a three-part series of questions to ask your potential SEO firm. You can find the first part here (and find the others from there).

And one final note: it isn’t always about the fees they charge. We’ve had a number of clients come back to us after first opting to sign with a cheaper SEO firm. In the end it cost them both the lower fees and the sales lost by not ranking sooner. This is not to say that the most expensive firm will necessarily do the best job, just that you need to be aware that sometimes things can be “too good to be true”. An SEO firm charging $500 will almost always be putting in different efforts than one charging $5,000. Find out what the differences are and do what’s right for your business. And if you’re really in doubt and don’t know what to do, contact us. Even when we’re not taking on clients I try to answer questions about choosing an SEO firm, though it might take a couple of days. Please specify in the title, “Need help choosing an SEO firm”.

And good luck with your online promotions.

SEO news blog post by @ 12:30 pm on February 28, 2007

Categories: SEO Articles


How To Win Links And Influence Engines

The title of this article is designed to prove (in an SEO kind of way) the very point that Dale Carnegie was making when he wrote one of the most influential business books of all times, “How To Win Friends And Influence People” (arguably one of the best business books ever written as well). In the titling of his book Mr. Carnegie was trying to do two things:

  1. Write a title that captures everything that people want in order to sell more books, and
  2. Tie two important things together that are related but often viewed as different. In the case of the book it was winning friends and influencing people, which he points out are essentially based on the same core traits and actions. Similarly, our title here captures two of the key areas people interested in SEO are looking to read about, and thus we will show the essential tie between winning links and the influence they will have on your search engine rankings. We will also discuss methods for actually winning them, as opposed to settling for second-rate links, rather like winning friends as opposed to settling for tolerable acquaintances.

How To Win Links

As with virtually every aspect of SEO, there are multiple areas within this single field. If there were one hard-and-fast answer to link building we would all be ranking highly on Google and the top 10 would be a VERY crowded place. Fortunately this isn’t the case, and the rankings are becoming more and more a Darwinist exercise in “survival of the fittest” (which is how it should be). Proper link building will help you be the fittest and, over time, influence engines.

If you have a site in any competition level above “low” you will want to use at least two different methods for building links. Aside from speeding up the link building process, this will help ensure your site withstands changes in the way link values are calculated. While there are far more methods for building links than can be listed here (and some launch so far into black hat territory that I wouldn’t want to), here are some of the main link building methods you should consider using:

Reciprocal Link Building:

There are many who would write that reciprocal link building is dead. While it is undeniable that the “rules” around reciprocal link building have changed, it is far from dead. That said, there are specific guidelines that must be followed to make a reciprocal link building campaign a success. Some of the more important are:

  1. Relevancy is arguably the single most important factor to consider when building recip links. For every link exchange you are considering you must ask yourself, “Is this a site that my visitors would be interested in?” If you can honestly answer that your site visitors would be genuinely interested in a site you are linking to then it’s a good link.
  2. PageRank is not the end-all-be-all that it once was; however, it is still a decent measure of the relative value of a website. While not as important as relevancy, it is a factor, and obtaining higher PageRank links means fewer links need to be built.
  3. Does the site you are considering linking to have a solid link building strategy in place? Just because you’re following the best practices of link building doesn’t mean that everyone in your industry is. A good site may be following misguided link building practices (real estate sites should not link to poker sites), and if they are, their overall value may well be reduced in the eyes of the search engines. If they have an active and ethical link building program in place, their overall value is likely to increase, making them more valuable down the road than they are today.
  4. How many links appear on each page, and where will yours be positioned? If your link will appear at the bottom of a page with 87 links, it is far less valuable than a link near the top of a page with 25 links. This fits into the “ethical” category of point 3 above but is worth mentioning again.
  5. Links that exist within content are weighted as more natural than directory-style links. Thus, when possible send HTML code that places your link within the descriptive text rather than in the title. For example, we may use the following HTML for a link to the Beanstalk site:

<strong>Beanstalk Search Engine Optimization</strong><br>

Beanstalk offers ethical and effective <a href=””>search engine positioning services</a> that will get your site to the top of the rankings. Whether you operate a small business and need regional results or if you are the VP of a Fortune 500 company needing consulting on new site changes and internal page ranking strategies, we have a search engine positioning solution to fit your needs.

These links are won as opposed to gained by default. Finding people to exchange links with on the net is easy, it’s finding quality partners that will help influence the rankings (in a positive direction at least) that requires a clear understanding of what the engines want and how to give it to them.
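Since several of the guidelines above lean on PageRank, it may help to see roughly what that calculation looks like. Below is a minimal sketch of the classic PageRank power iteration in Python; the four-page link graph and the damping factor of 0.85 (the commonly published value) are purely illustrative, not Google’s actual implementation.

```python
# Minimal PageRank power-iteration sketch over a hypothetical 4-page link graph.
links = {
    "home": ["services", "blog"],
    "services": ["home"],
    "blog": ["home", "services"],
    "partner": ["home"],
}
damping = 0.85
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}  # start with equal rank everywhere

for _ in range(50):  # iterate until the scores settle
    new_rank = {}
    for p in pages:
        # Each page linking to p passes on an equal share of its own rank.
        inbound = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
        new_rank[p] = (1 - damping) / len(pages) + damping * inbound
    rank = new_rank

for p in sorted(rank, key=rank.get, reverse=True):
    print(p, round(rank[p], 3))
```

Note how "home", the page with the most inbound links, ends up with the highest score: each link is a share of the linker's own rank, which is why a link from a higher-PageRank page carries more weight.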

Non-Reciprocal Link Building:

The area of non-reciprocal link building is a slippery one. There are many methods that can be used, with varying degrees of success. Since we won’t be able to get into them all here (and there are some that shouldn’t be used anywhere), we will focus below on some of the most significant and more widely applicable:

Directory Submissions:

This is perhaps the easiest and fastest of all link building methods, though it can also be one of the more costly depending on the directories you submit your site to. Yahoo!, for example, charges $299 for a commercial site to be submitted to its directory. DMOZ is free, however, and is certainly the most important given that Google uses the DMOZ directory to provide the listings for the Google Directory. Note, though: it can sometimes take months to get a listing there, and sometimes even that’s not enough.

That said, there are MANY topical directories and smaller business directories that will accept free submissions and these should definitely be considered. While they may have a relatively low PageRank they will provide reasonably relevant non-reciprocal links and help build your anchor text relevancy.


Article Writing:

Writing articles like the one you’re reading right now is an excellent link building strategy. By providing valuable and useful content to other webmasters you are providing them a service, which will generally translate into a link to your site “in payment”. One of the great features of articles is that the payment isn’t only in link value but in the actual traffic you get from the link itself. But we’re not talking about traffic, we’re talking about rankings; so how do articles influence engines?

There are three main benefits of articles as a link building tactic:

  1. The link to your site will be on a page that is entirely related to your topic. If you have a site about search engine positioning for example, including that phrase in the title and content gives you the opportunity to build the relevancy between the linking page and the page it links to.

    (note: I know I have not used “search engine positioning” in the title – sometimes one has to consider the value of the title from a visitor standpoint and the fact that you came to this page and are reading this article indicates to me that the right decision was made not to change it just for a bit of added relevancy.)

  2. The link will be non-reciprocal. While we indicated above that reciprocal linking is not dead (and it’s not), there is a solid belief among SEOs (myself included) that non-reciprocal links are weighted more heavily. Having more non-reciprocal links will also help safeguard your site against future changes in the algorithm that may reduce the value of reciprocal links.
  3. You will likely have the ability to determine how the link to your site is worded, and you may have the opportunity to link to more than one page on your site. Many people settle for a directory-style author bio. Myself, I prefer to submit my bio in a couple of formats (text and HTML), both of which place the links inside the content. The text format will simply include links such as whereas an HTML link will contain code very similar to that displayed above. As for multiple links: if the site you are submitting to will allow you to reference a couple of pages, you may want to link to your homepage as well as one or two internal pages that you would like to see rankings attained for. Make sure these pages are related to your core article topic or a service the reader would be interested in (see the bio for this article as an example).

Quality Content:

This next part might be a bit shocking. There are actually people out there who will link to your site simply because they have found content there they believe will interest their readers. That’s right, people actually link to sites they find of value. On the Beanstalk site, and specifically in our blog, we often link to other web pages that we have found useful. Other articles, tools, blog posts, etc. often receive non-reciprocal links from us due to the value of the content they contain, and we’re definitely not the only ones doing this.

Providing quality content, useful tools, or other helpful services can be a great way to attract non-reciprocal links. After all, this is the entire reason links received any value in the first place, that they are perceived as a vote for the other site.

How To Influence Engines

With proper onsite optimization in place that includes attention to such things as site structure, site size, cohesion of the content across the site, internal linking structure, keyword density and those other onsite factors you’ve likely read much about, all that is left to do is to continue to grow your site (hopefully with quality content people will want to link to) while winning strong links to it.
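Of the onsite factors just listed, keyword density is the easiest to make concrete: it is usually taken as the share of a page's words consumed by occurrences of the target phrase. A rough sketch in Python (the sample text and the regex tokenizer are my own simplifications, not any engine's actual method):

```python
import re

def keyword_density(text, phrase):
    """Percentage of the page's words taken up by occurrences of the phrase."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    target = phrase.lower().split()
    n = len(target)
    # Count every position where the phrase's words appear in sequence.
    hits = sum(words[i:i + n] == target for i in range(len(words) - n + 1))
    return 100.0 * hits * n / len(words) if words else 0.0

sample = ("Beanstalk offers ethical search engine positioning services. "
          "A search engine positioning solution should fit your needs.")
print(round(keyword_density(sample, "search engine positioning"), 1))
```

A number like this is only a diagnostic: unnaturally high densities read badly to visitors and can look like stuffing to engines, so the goal is natural use of the phrase, not a target percentage.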

If what you want to do is influence engines you will need to have strong onsite and offsite factors, but don’t stop there. Influencing engines isn’t just about rankings today. You will need to continue building links down the road to ensure that the search engines continue to be influenced, both by how people have linked to you in the past and kept those links in place, and by how new people are finding your site helpful and relevant. If the engines see a sudden spurt in link growth and then see that growth stop, you are not likely to hold a strong ranking indefinitely in any but the lowest competition sectors.

And remember: don’t focus on just one link building method. To ensure a solid and secure influence you’re going to need to win links through at least two of the methods discussed above, or other ethical methods you may be considering.

Additional Notes

While we couldn’t possibly cover all the methods for link building here in an article I’ve tried to cover the main ones. A couple of methods that receive much attention but which we didn’t have room for above are press release distribution and paid links.

Press releases are an excellent way to get exposure, but I have not found them as good as articles for links, which is why they weren’t covered above. They are good for traffic, however, and you will get some links out of them if the release is good, so they were worth a short mention here.

Paid links are a dangerous area to discuss, as there are so many factors and so many ways it can go wrong. The only advice I will give to those looking to purchase links is this: ask yourself, “Am I expecting to get traffic from this link?” At the very least, this will weed out small footer links and links on irrelevant sites. Basically, if the link is worth it without the boost in rankings, then continue to pay for it and consider any ranking increases a bonus. If you aren’t getting any traffic from the link, then it’s likely not worth paying for: the site likely isn’t relevant or the link is in a poor location. The engines will likely pick up on either of these and you’ll end up paying for a link that isn’t passing on any weight anyway.

SEO news blog post by @ 2:06 pm on October 10, 2006


Google, Orion, SEO & You

Every now and then an event occurs that changes how the SEO community views the way websites are optimized and structures promotions. The purchase of the rights to the Orion Algorithm by Google and equally important, the interest that both Yahoo! and MSN took in the algorithm as they vied for ownership themselves, marks just such an event.

Bill Gates said to Forbes magazine about Orion:

“That we need to take the search way beyond how people think of it today. We believe that Orion will do that.”

What Is The Orion Algorithm?

There is much confusion about the Orion algorithm and much secrecy around the specifics. Here is the “What’s Been Said” and “What It Means” breakdown:

What’s Been Said: Ori Allon, the developer of this technology described Orion in this way:

“The results to the query are displayed immediately in the form of expanded text extracts, giving you the relevant information without having to go to the Web site–although you still have that option if you wish.”

He cited an example of the keyword phrase “American Revolution.” The search would not only provide extracts with the phrase, but also additional information on topics such as American history, George Washington and the Declaration of Independence.*

* CNET News, April 10, 2006

What It Means: Most on the web take this to mean the results from Google will be displayed similar to those at where you will be able to get a sample of the site and some of its quality content without having to visit the actual site. The part that most caught my attention, however, is where he cited the example and noted the additional phrases that would be considered, and the impact this technology will have on the way queries are dealt with.

From this standpoint, the Orion Algorithm is, in its essence, a whole new way to score the value of websites on the Internet. Rather than determining the value of a website based only on the specific query entered into the search box, Orion may dig deeper and query related phrases as well. Now, this may not be an entirely new concept; directories have been providing a “Related Categories” option for ages. However, the addition of this function to standard search engines, and what it may well mean for the methods required to rank sites on them, is extremely significant.

What Is Relevant?

One of the main hurdles that SEOs will face in reaction to this new function is determining exactly how the additional relevant phrases are chosen. There are a few possible sources that come to mind:

Directories (least likely) – The directories are already using “Related Categories”. It is possible that the engines will choose the simplest possible means of determining relevancy, opting to use sub-categories of a directory listing and the “Related Categories” as the supplemental keyword sources.

Alternatively they could simply run the search itself on their directories and reference the categories that come up and run supplemental searches for those categories.

The main drawback to this approach is that many popular keywords would not be cross-referenced accurately. For example, a search for “seo” would result in a supplemental search set of “promotion”, “web design and development”, and “Internet marketing”, along with a variety of other phrases. While these phrases are related by industry, a visitor searching for “seo” may well not be interested in “web design and development”.

Thesaurus (unlikely) – It may be that the engines choose to reference a thesaurus for related phrases; however, this doesn’t work for many keyword phrases. Single-word phrases would be doable, but multi-word phrases would be far more difficult, and acronyms (such as “seo”) would find no related words in the more common thesauruses.

Search Behavior (highly likely) – The most likely source of the relevancy data is also the most difficult to predict: search behavior patterns. While I have had some disagreements with members of a couple of SEO forums over whether the search engines can in fact know your search patterns, the conclusion is that they indeed can under many circumstances. Search engines will be able to compile enough data from the users they are documenting to assess overall search behavior (and here you thought all those great tools the engines come out with were just them spending their money altruistically).

If Google “knows” that after someone enters “seo” as a query they follow that up with “seo service”, this is likely to then be used as a supplemental search. Similarly, if they also know that these same searchers tend to also search shortly before or after for another common phrase, say “w3c compliance” then this too is likely to be used as a supplemental search.
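The behaviour-based approach described above can be illustrated with a toy co-occurrence count over search sessions. Everything below (the session data, the function name) is invented for illustration; a real engine would work from billions of logged queries with far more sophisticated statistics:

```python
from collections import Counter

# Hypothetical sessions: queries issued close together by the same searcher.
sessions = [
    ["seo", "seo service", "w3c compliance"],
    ["seo", "seo service"],
    ["seo", "link building"],
    ["american revolution", "george washington"],
]

def related_queries(query, sessions, top=2):
    """Rank other queries by how often they share a session with `query`."""
    co = Counter()
    for session in sessions:
        if query in session:
            co.update(q for q in session if q != query)
    return [q for q, _ in co.most_common(top)]

print(related_queries("seo", sessions))
```

Because "seo service" appears alongside "seo" most often, it would become the first supplemental search, exactly the kind of pairing described in the paragraph above, while unrelated sessions contribute nothing.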

Agree To Disagree: Implementation

Now that we have a better idea of what the Orion Algorithm is and how it works, the big question is: what will its implementation mean to search engine users and to how websites get ranked on those engines? At this time there appear to be two main schools of thought:

  • What I believe, and
  • Everything else that’s been published

I’ll be the first to admit that my interpretation of how the Orion algorithm will affect search engine results is either not shared by other SEOs (at least those who have a published opinion on the topic) or has not occurred to them. That said, my take on the Orion Algorithm did not initially include their predicted effect, whereas I now believe it is likely both implementations will be tested, if not brought into full effect, within the next 12-18 months (this may seem like a long time, but if you want to develop a strategy to react to it, this is about the lead time you may well need). So, what are these two possible outcomes?

Where we all agree: the addition of key parts of web content in the results. This is how the algorithm is explained to function by its developer and is thus the obvious conclusion to most in regards to how it will be implemented.

Everyone else: related information displayed separately. From what I have read, the majority of people believe that the related phrases will be displayed separate from the original query (though rightfully no one seems to be predicting exactly where or how). Essentially this will give searchers the ability to view information on related topics quickly and easily.

This is certain to be included in some capacity and we have already seen similar functions added to the Google results for specific queries though not to any capacity reliable enough to be launched across all Google search results.

And then there’s my opinion: integration in standard search results. To me it seems short-sighted to believe that Google will take a technology that allows them to draw information and relevancy from multiple related phrases and use it only to display multiple options on a results page. With the processing power they have at their disposal, why would they not reference a site against its ability to rank for these other phrases and base the final results on that? Let’s take a quick peek at the pros and cons of such a move:

Cons first: processing power. That about covers the downside, and I’m sure we’re all aware that if this ever becomes an issue, Google has more than enough capital and technical know-how to get around it.

Pros: Imagine a world where running a search for a query took into consideration whether a site ranked for multiple related phrases. What do you suppose the impact on the results would be if only those sites with content covering a number of areas of a topic ranked highly? The answer: a much more relevant set of results.
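To make the idea above concrete, here is a toy sketch of what such a re-ranking might look like: a site’s score for the main query is blended with its scores on related phrases, so broad topical coverage is rewarded. This is purely illustrative — the function name, the weighting scheme, and all scores are my own invention, not anything Google has published about Orion.

```python
# Toy sketch of the hypothesized re-ranking: blend a site's score for the
# main query with its average score across related phrases.
# All names, weights, and scores here are invented for illustration.

def composite_score(query_score, related_scores, related_weight=0.5):
    """Blend the main-query score with the mean score across related
    phrases, using a hypothetical fixed weighting."""
    if not related_scores:
        return query_score
    avg_related = sum(related_scores.values()) / len(related_scores)
    return (1 - related_weight) * query_score + related_weight * avg_related

# A site with broad coverage of related phrases can outscore one that
# targets only the main phrase, even with a lower main-phrase score.
broad = composite_score(0.70, {"seo tips": 0.8, "link building": 0.9})
narrow = composite_score(0.85, {"seo tips": 0.1, "link building": 0.0})
print(broad > narrow)  # → True
```

Under this kind of blending, a site that ranks moderately well across many related phrases beats a site that ranks strongly for only the main phrase — which is exactly the behavior described above.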


Fortunately, while there may be some disagreement about how this new algorithm will be integrated into the search engine results pages, the resulting actions required are the same. Whether the new functions are added in the form of additional links and information on the results pages, or are taken into consideration when ranking a site for the initial query, sites that rank well for a multitude of related phrases will fare better than those that rank for just one.

The action required on the part of SEOs and website owners, then, is to provide quality, unique content on all the areas that may be considered relevant to the main keyword target. Once this is accomplished, these areas need to be promoted in order to ensure that they rank well.

The resulting web will be one that rewards websites with a large amount of quality content on the greatest number of topics related to a specific issue. If one considers the end goal of any major search engine — providing the most relevant results possible — this new technology is sure to help promote these types of results and ensure that searchers receive results likely to provide the information they’re looking for.

And let’s also consider this: should you choose to be an “early adopter” and begin making changes to your site — adding new content, optimizing it, and getting it ranking well — what will the results be? Even if Orion isn’t implemented for another decade, your website will gain stickiness and rank for more related keywords, bringing you more targeted traffic and keeping it on your site. Could this possibly be a bad thing?


While I have tried to provide some insight into the Orion algorithm and what it means to you, there is a lot of information and speculation out there about it, some of which covers implementations of this technology not addressed in this article. Below you will find some of the better pieces of information.

I have included information that contradicts what you may have read above. This algorithm is sure to have an enormous impact on the way searchers find results and the way SEOs promote sites, and thus you need all the quality information at your disposal to make the right decisions for your website and your business.

Search Engine Watch – Danny Sullivan wrote a solid piece on the subject (as he always does) which includes some good links to related information, as well as a link to their forum thread on the subject, where you can get other opinions on what this means to searchers and SEOs.

E-Commerce Times – Jennifer LeClaire wrote a good piece on Orion which covers more on the integration of relevant listings into the results pages.

The Sydney Morning Herald – Stephen Hutcheon covers some of the basics regarding how the deal to purchase the algorithm came about, who the major players were, and a bit of the history behind Orion.

SEO news blog post by @ 4:54 pm on July 24, 2006

Categories:SEO Articles


Copyright© 2004-2014
Beanstalk Search Engine Optimization, Inc.
All rights reserved.