
Beanstalk's Internet Marketing Blog

At Beanstalk Search Engine Optimization we know that knowledge is power. That's the reason we started this Internet marketing blog back in 2005. We know that the better informed our visitors are, the better the decisions they will make for their websites and their online businesses. We hope you enjoy your stay and find the news, tips and ideas contained within this blog useful.


February 27, 2009

Ecommerce & SEO

The purpose of any business website is to promote a product or service online. An ecommerce website takes this one step further and allows your visitors to purchase your products or services directly from your website. This model has some great advantages over the non-ecommerce website: once you have paid to have the site designed and maintained, it generates revenue with little or no selling time on your part, and it does not require the visitor to call you during business hours, which helps secure the sale from an impulse buyer. If your website provides all the information a buyer would want, you also save significant sales time, since the visitor can find everything they need to decide to buy from you without taking up your time or that of your sales staff. But ecommerce sites have a serious drawback as well: very few of them can be properly indexed by search engine spiders, and so they fail to rank highly.

A non-ecommerce website may have the disadvantage of not being able to take the visitor’s money the second they want to spend it, but if it can be found on the first page of the search engines while your beautifully designed ecommerce site sits on page eight, the advantage is theirs. The vast majority of visitors will never get to see your site, let alone buy from you. A non-ecommerce site may lose sales because it doesn’t sell online, but at least it is able to deliver its message to an audience in the first place. So what can be done? The key is in the shopping cart you select.

SEO & Shopping Carts

The biggest problem with many “SEO-friendly” ecommerce solutions is that search friendliness was bolted on after the initial product was built. Shopping cart systems such as Miva Merchant and osCommerce were not designed with the primary goal of creating pages that will be well received by the search engine spiders. Most shopping cart systems out there today are not, in and of themselves, even spiderable (typically because product pages sit behind session IDs and long query strings) and require third-party add-ons to provide even the lowest form of SEO-friendliness. The money you may have saved by choosing an inexpensive shopping cart may very well end up costing you your business in the long run, especially if you are using your shopping cart as the entire site, which we have seen many times in the past.

What Can Be Done?

There are essentially two solutions to this problem. The first is to create a front-end site separate from the shopping cart. What this will effectively do is create a number of pages that can be easily spidered (assuming that they’re well designed). The drawback to this course of action is that your website will forever be limited to the size of the front-end site, which brings us to the second option: choose a search engine friendly shopping cart system.

Finding an SEO-friendly shopping cart system is far easier said than done. There are many factors to take into account, including the spiderability of the pages themselves, the customization capacity of the individual pages, the ease of adding products and changing the pages down the road, and so on. While I’ve worked with many shopping cart and ecommerce systems, to date only one has truly impressed me: it is extremely simple to use, it allows for full customization of individual pages, and the product pages get fully spidered to the point where they have PageRank assigned. That is a rarity in the shopping cart world.

Easy As Apple Pie

Mr. Lee Roberts, President of Rose Rock Design and creator of the Apple Pie Shopping Cart, was kind enough to take the time to speak with me about how he developed his system. Trying to understand how the system was born, I asked what differentiated it from others. Without “giving away the farm”, Lee pointed out that his system was unique in that the search engines were a consideration from the birth of the project. Rather than trying to retrofit a system that was already in place, he initiated the development of a system whose first task was to allow for easily spidered and customized pages. A significant advantage, to be sure.

In further discussions he pointed out a few key factors that should be considered by anyone choosing a shopping cart system. While more advanced shopping cart systems that provide SEO-friendly pages may seem more expensive, they save you the cost of developing a front-end site and of maintaining pricing on static pages if you go that route. And of course, if all of your site’s pages are easily spidered, you can have hundreds of additional relevant pages contributing to your site’s overall strength and relevancy, which is a serious advantage in the SEO “game”. If a shopping cart system costs you an extra $100 per month to maintain but its use provides you with an additional $5,000 in sales that month, did it really “cost” you $100?

Conclusion

This is not to say that the Apple Pie Shopping Cart is the end-all-be-all of SEO for an ecommerce site; if it were, Lee wouldn’t be in the process of building a new version that will include many new features for Internet marketing and tracking, and we would be out of work. That said, if you’ve got an ecommerce site or are looking to have one built, consider what type of marketing strategy will be taken with the site and, if SEO is part of it, be sure to find a system that provides the same advantages as this one.

It may cost a bit more up front, but doing it right the first time is far less costly than building a site that can’t be marketed properly and to its maximum potential.


July 10, 2007

Google Algorithm Update Analysis

Anybody who monitors their rankings with the same vigor that we in the SEO community do will have noticed some fairly dramatic shifts in the algorithm starting last Thursday (July 5th) and continuing through the weekend. Many sites are rocketing into the top 10, which, of course, means that many sites are being dropped at the same time. We were fortunate not to have any clients on the losing end of that equation; however, we have called and emailed the clients who saw sudden jumps into the top positions to warn them that further adjustments are coming. After a weekend of analysis, there are some curiosities in the results that simply require further tweaks to the ranking system.

This update seems to have revolved around three main areas: domain age, backlinks and PageRank.

Domain Age

It appears that Google is presently giving a lot of weight to the age of a domain and, in this SEO’s opinion, disproportionately so. While the age of a domain can definitely be used as a factor in determining how solid a company or site is, there are many newer sites that provide some great information and innovative ideas. Unfortunately a lot of these sites got spanked in the last update.

On this tangent I have to say that Google’s use of domain age is, as a whole, a good filter, allowing them to “sandbox” sites on day one to ensure they aren’t just being launched to rank quickly for terms. Recalling the “wild west days” of SEO, when ranking a site was a matter of cramming keywords into content and using questionable methods to generate links quickly, I can honestly say that adding in this delay was an excellent step that ensured the benefits of pumping out domains became extremely limited. So I approve of domain age being used to value a site – to a point.

After a period of time (let’s call it a year, shall we) the age should have, and generally has had, only a very small influence on a site’s ranking, with the myriad of other factors overshadowing the site’s whois data. This appears to have changed in the recent update, with age holding a disproportionate weight. In a number of instances this has resulted in older, less qualified domains ranking higher than newer sites of higher quality.

This change in the ranking algorithm will most certainly be adjusted as Google works to maximize the searcher’s experience. We’ll get into the “when” question below.

Backlinks

The way that backlinks are being calculated and valued has seen some adjustments in the latest update as well. The way this has been done takes me back a couple years to the more easily gamed Google of old. This statement alone reinforces the fact that adjustments are necessary.

The way backlinks are being valued appears to have lost some grasp on relevancy and placed more importance on sheer numbers. Sites with large, unfocused reciprocal link directories are outranking sites with fewer but more relevant links. Non-reciprocal links have also lost the “advantage” they held over reciprocal links until recently.

Essentially, the environment is currently such that Google has made itself more easily gamed than it was a week ago. In the current environment, building a reasonably sized site with a large reciprocal link directory (even an unfocused one) should be enough to get you ranking. For obvious reasons this cannot (and should not) stand indefinitely.

PageRank

On the positive side of the equation, PageRank appears to have lost some of its importance, including the importance of PageRank as it pertains to the value of a backlink. In my opinion this is a very positive step on Google’s part and shows a solid understanding of the fact that PageRank means little in terms of a site’s importance. That said, while PageRank is a less-than-perfect calculation subject to much abuse and manipulation from those pesky people in the SEO community, it did serve a purpose, and while it needed to be replaced, it doesn’t appear to have been replaced with anything of substantial value.

A fairly common belief has been that PageRank would be, or is being, replaced by TrustRank, and that Google would not give us a green bar to gauge a site’s trust on (good call, Google). With this in mind, one of two things has happened: either Google has decided that TrustRank is irrelevant and so is PageRank and scrapped both (unlikely), or they have shifted the weight from PageRank to TrustRank to some degree and are just now sorting out the issues with their TrustRank calculations (more likely). Issues that may have existed with TrustRank may not have been apparent due to its low weight in the overall algorithm; with this shift reducing the importance of PageRank, the issues facing the TrustRank calculations may well be becoming more evident.

In truth, the question is neither here nor there (as important a question as it may be). We will cover why this is in the conclusion below.

Conclusion

So what does all of this mean? First, it means that this Thursday or Friday we can expect yet another update to correct some of the issues we’ve seen arise out of the most current round. This shouldn’t surprise anyone too much; we’ve been seeing regular updates out of Google over the past few months.

But what does this mean regarding the aging of domains? While I truly feel that an aging delay or “sandbox” is a solid filter on Google’s part, it needs to have a maximum duration. A site from 2000 is not, by default, more relevant than a site from 2004. After a year or so, the trust given to a domain’s age should hold steady or, at most, carry a very slight weight. This is an area we are very likely to see changes in with the next update.

As far as backlinks go, we’ll see changes in the way they are calculated unless Google is looking to revert to the issues they had in 2003. Lower PageRank, high-relevancy links will once again surpass high-quantity, less relevant links. Google is getting extremely good at determining relevancy, so I assume the current algorithm issues have more to do with the weight assigned to different factors than with an inability to properly calculate a link’s relevancy.

And in regards to PageRank, Google will likely shift back slightly to what worked and give more importance to PageRank, at least while they figure out what went awry here.

In short, I would expect that with an update late this week or over the weekend we’re going to see a shift back to last week’s results (or something very close to them), after which they’ll work on the issues they’ve experienced and launch a new (hopefully improved) algorithm shift the following weekend. And so, if you’ve enjoyed a sudden jump from page 6 to the top 3, don’t pop the cork on the champagne too quickly, and if you’ve noticed some drops, don’t panic. More adjustments to this algorithm are necessary and, if you’ve used solid SEO practices and been consistent and varied in your link building tactics, keep at it and your rankings will return.


October 10, 2006

How To Win Links And Influence Engines

The title of this article is designed to prove (in an SEO kind of way) the very point that Dale Carnegie was making when he wrote one of the most influential business books of all time, “How To Win Friends And Influence People” (arguably one of the best business books ever written as well). In titling his book, Mr. Carnegie was trying to do two things:

  1. Write a title that captures everything that people want in order to sell more books, and
  2. Tie two important things together that are related but often viewed as different. In the case of the book it was winning friends and influencing people, which he points out are essentially based on the same core traits and actions. Similarly, the title here captures two of the key areas people interested in SEO want to read about, and we will show the essential tie between winning links and the influence they will have on your search engine rankings. We will also discuss methods for actually winning links as opposed to settling for second-rate ones – rather like winning friends as opposed to settling for tolerable acquaintances.

How To Win Links

As with virtually every aspect in SEO, there are multiple areas of this single field. If there were one hard-and-fast answer to link building we would all be ranking highly on Google and the top 10 would be a VERY crowded place. Fortunately this isn’t the case and the rankings are becoming more and more a Darwinist exercise in “survival of the fittest” (which is how it should be). Proper link building will help you be the fittest and, over time, influence engines.

If you have a site in any competition level above “low” you will want to use at least two different methods for building links. Aside from speeding up the link building process, this will help ensure your site withstands changes in the way link values are calculated. While there are far more methods for building links than can be listed here (and there are some that launch so far into black hat tactics that I wouldn’t want to), here are some of the main link building methods you should consider using:

Reciprocal Link Building:

There are many who would write that reciprocal link building is dead. While it is undeniable that the “rules” around reciprocal link building have changed, it is far from dead. That said, there are specific guidelines that must be followed to make a reciprocal link building campaign a success. Some of the more important are:

  1. Relevancy is arguably the single most important factor to consider when building recip links. For every link exchange you are considering you must ask yourself, “Is this a site that my visitors would be interested in?” If you can honestly answer that your site visitors would be genuinely interested in a site you are linking to then it’s a good link.
  2. PageRank is not the end-all-be-all that it once was; however, it is still a decent measure of the relative value of a website. While not as important as relevancy, it is a factor, and obtaining higher PageRank links will mean fewer links need to be built.
  3. Does the site you are considering linking to have a solid link building strategy in place? Just because you’re following the best practices of link building doesn’t mean that everyone in your industry is. A good site may be following misguided link building practices (real estate sites should not link to poker sites), and if so, its overall value may well be reduced in the eyes of the search engines. If it has an active and ethical link building program in place, its overall value is likely to increase, making it more valuable down the road than it is today.
  4. How many links appear on each page and where will yours be positioned? If your link will appear at the bottom of a page with 87 links, it is far less valuable than a link near the top of a page with 25 links. This fits into the “ethical” category of point 3 above but is worth mentioning again.
  5. Links that exist within content are weighted as more natural than directory-style links. Thus, when possible, send HTML code that places your link within the descriptive text rather than in the title. For example, we may use the following HTML for a link to the Beanstalk site:

<strong>Beanstalk Search Engine Optimization</strong><br>

Beanstalk offers ethical and effective <a href="http://www.beanstalk-inc.com/">search engine positioning services</a> that will get your site to the top of the rankings. Whether you operate a small business and need regional results or if you are the VP of a Fortune 500 company needing consulting on new site changes and internal page ranking strategies, we have a search engine positioning solution to fit your needs.

These links are won as opposed to gained by default. Finding people to exchange links with on the net is easy; it’s finding quality partners that will help influence the rankings (in a positive direction, at least) that requires a clear understanding of what the engines want and how to give it to them.

Non-Reciprocal Link Building:

The area of non-reciprocal link building is a slippery one. There are many methods that can be used, with varying degrees of success. Since the sheer number of methods means we won’t be able to get into them all here (and there are some that shouldn’t be used anywhere), we will focus below on some of the most significant and more widely applicable:

Directory Submissions:

This is perhaps the easiest and fastest of all link building methods, though it can also be one of the more costly depending on the directories you submit your site to. Yahoo!, for example, charges $299 for a commercial site to be submitted to its directory. DMOZ is free, however, and is certainly the most important given that Google uses the DMOZ directory to provide the listings for the Google Directory. Note, though, that it can sometimes take months to get a listing there, and sometimes even that’s not enough.

That said, there are MANY topical directories and smaller business directories that will accept free submissions and these should definitely be considered. While they may have a relatively low PageRank they will provide reasonably relevant non-reciprocal links and help build your anchor text relevancy.

Articles:

Writing articles like the one you’re reading right now is an excellent link building strategy. By providing valuable and useful content to other webmasters you are providing them a service, which will generally translate into a link to your site “in payment”. One of the great features of articles is that the payment isn’t only in link value but in the actual traffic you get from the link itself. But we’re not talking about traffic, we’re talking about rankings; so how do articles influence engines?

There are three main benefits of articles as a link building tactic:

  1. The link to your site will be on a page that is entirely related to your topic. If you have a site about search engine positioning for example, including that phrase in the title and content gives you the opportunity to build the relevancy between the linking page and the page it links to.

    (note: I know I have not used “search engine positioning” in the title – sometimes one has to consider the value of the title from a visitor standpoint and the fact that you came to this page and are reading this article indicates to me that the right decision was made not to change it just for a bit of added relevancy.)

  2. The link will be non-reciprocal. While we indicated above that reciprocal linking is not dead (and it’s not) there is a solid belief among SEO’s (myself included) that non-reciprocal links are weighted more heavily. Having more non-reciprocal links will also help safeguard your site against future changes in the algorithm that may reduce the value of recip links.
  3. You will likely have the ability to determine how the link to your site is worded, and you may have the opportunity to link to more than one page on your site. Many people settle for a directory-style author bio. I prefer to submit my bio in a couple of formats (text and HTML), both of which place the links inside the content. The text format will simply include links such as http://www.beanstalk-inc.com/ whereas an HTML bio will contain code very similar to that displayed above. As far as multiple links go: if the site you are submitting to will allow you to reference a couple of pages, you may want to link to your homepage as well as one or two internal pages that you would like to see rank. Make sure these pages are related to your core article topic or a service the reader would be interested in (see the bio for this article as an example).

Quality Content:

This next part might be a bit shocking. There are actually people out there who will link to your site simply because they have found content there they believe will interest their readers. That’s right, people actually link to sites they find of value. On the Beanstalk site, and specifically in our blog, we often link to other web pages that we have found useful. Other articles, tools, blog posts, etc. often receive non-reciprocal links from us due to the value of the content they contain, and we’re definitely not the only ones doing this.

Providing quality content, useful tools, or other helpful services can be a great way to attract non-reciprocal links. After all, this is the entire reason links received any value in the first place: they are perceived as a vote for the other site.

How To Influence Engines

With proper onsite optimization in place, including attention to such things as site structure, site size, cohesion of the content across the site, internal linking structure, keyword density and the other onsite factors you’ve likely read much about, all that is left to do is continue to grow your site (hopefully with quality content people will want to link to) while winning strong links to it.

If what you want to do is influence engines you will need strong onsite and offsite factors, but don’t stop there. Influencing engines isn’t just about rankings today. You will need to continue building links down the road to ensure that the search engines continue to be influenced both by how people have linked to you in the past (and kept those links in place) and by how new people are finding your site helpful and relevant. If the engines see a sudden spurt in link growth and then see that growth stop, you are not likely to hold a strong ranking indefinitely in any but the lowest competition sectors.

And remember: don’t focus on just one link building method. To ensure a solid and secure influence you’re going to need to win links through at least two of the methods discussed above, or other ethical methods you may be considering.

Additional Notes

While we couldn’t possibly cover all the methods for link building in one article, I’ve tried to cover the main ones. A couple of methods that receive much attention but which we didn’t have room for above are press release distribution and paid links.

Press releases are an excellent way to get exposure, but I have not found them as good as articles for links, which is why they weren’t covered above. They are good for traffic, however, and you will get some links out of them if the release is good, so they were worth a short mention here.

Paid links are a dangerous area to discuss as there are so many factors and so many ways it can go wrong. The only advice I will give to those looking to purchase links is this: ask yourself, “Am I expecting to get traffic from this link?” At the very least, this will weed out small footer links and links on irrelevant sites. Basically, if the link is worth it without the boost in rankings, then continue to pay for it and consider any ranking increase a bonus. If you aren’t getting any traffic from the link, then it’s likely not worth paying for: the site likely isn’t relevant or the link is in a poor location. The engines will likely pick up on either of these and you’ll end up paying for a link that isn’t passing on any weight anyway.


July 24, 2006

Google, Orion, SEO & You

Every now and then an event occurs that changes how the SEO community views the way websites are optimized and promotions are structured. The purchase of the rights to the Orion Algorithm by Google – and, equally important, the interest that both Yahoo! and MSN took in the algorithm as they vied for ownership themselves – marks just such an event.

Bill Gates said to Forbes magazine about Orion:

“That we need to take the search way beyond how people think of it today. We believe that Orion will do that.”

What Is The Orion Algorithm?

There is much confusion about the Orion algorithm and much secrecy around the specifics. Here is the “What’s Been Said” and “What It Means” breakdown:

What’s Been Said: Ori Allon, the developer of this technology, described Orion in this way:

“The results to the query are displayed immediately in the form of expanded text extracts, giving you the relevant information without having to go to the Web site–although you still have that option if you wish.”

He cited an example of the keyword phrase “American Revolution.” The search would not only provide extracts with the phrase, but also additional information on topics such as American history, George Washington and the Declaration of Independence.*

* CNET News, April 10, 2006

What It Means: Most on the web take this to mean that results from Google will be displayed similarly to those at Ask.com, where you will be able to get a sample of the site and some of its quality content without having to visit the actual site. The part that most caught my attention, however, is the example he cited, the additional phrases that would be considered, and the impact this technology will have on the way queries are dealt with.

From this standpoint, the Orion Algorithm is, in essence, a whole new way to score the value of websites on the Internet. Rather than determining the value of a website based only on the specific query being entered into the search box, Orion may dig deeper and query related phrases as well. This may not be an entirely new concept – directories have been providing a “Related Categories” option for ages – however, the addition of this function to standard search engines, and what it may mean for the methods required to rank sites on them, is extremely significant.

What Is Relevant?

One of the main hurdles that SEO’s will face in reaction to this new function is determining exactly how the additional relevant phrases are selected. There are a few possible sources that come to mind:

Directories (least likely) – The directories are already using “Related Categories”. It is possible that the engines will choose the simplest possible means of determining relevancy and opt to use sub-categories of a directory listing and to use the “Related Categories” as the supplemental keyword sources.

Alternatively they could simply run the search itself on their directories and reference the categories that come up and run supplemental searches for those categories.

The main drawback to this approach is that many popular keywords would not be cross-referenced accurately. For example, a search for “seo” would result in a supplemental search set of “promotion”, “web design and development” and “Internet marketing”, along with a variety of other phrases. While these phrases are related by industry, a visitor searching for “seo” may well not be interested in “web design and development”.

Thesaurus (unlikely) – It may be that the engines choose to reference a thesaurus for related phrases; however, this doesn’t work for many keyword phrases. Single-word phrases would be doable, but multi-word phrases would be far more difficult, and acronyms (such as “seo”) would find no related words in the more common thesauruses.

Search Behavior (highly likely) – The most likely source of the relevancy data is also the most difficult to predict and this is search behavior patterns. While I have had some disagreements with members on a couple SEO forums over whether the search engines can in fact know your search patterns, the conclusion is that they indeed can under many circumstances. Search engines will be able to compile enough data based on the users they are documenting to assess overall search behavior (and here you thought all those great tools the engines come out with were just them spending their money altruistically).

If Google “knows” that after someone enters “seo” as a query they follow that up with “seo service”, this is likely to then be used as a supplemental search. Similarly, if they also know that these same searchers tend to also search shortly before or after for another common phrase, say “w3c compliance” then this too is likely to be used as a supplemental search.
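As a purely hypothetical sketch of how such follow-up patterns could be mined, the Python below tallies which queries tend to come next within a user session and treats the most common ones as candidate supplemental searches. The session data and threshold are invented, and nothing here reflects Google’s actual implementation.

from collections import Counter, defaultdict

# Hypothetical session logs: each inner list is one user's queries, in order.
sessions = [
    ["seo", "seo service", "w3c compliance"],
    ["seo", "seo service"],
    ["seo", "link building"],
    ["hockey scores", "nhl standings"],
]

# Count which queries tend to follow each query within a session.
followups = defaultdict(Counter)
for queries in sessions:
    for current, nxt in zip(queries, queries[1:]):
        followups[current][nxt] += 1

def supplemental_searches(query, top_n=2):
    """Return the most common follow-up queries as candidate supplemental searches."""
    return [q for q, _ in followups[query].most_common(top_n)]

print(supplemental_searches("seo"))  # ['seo service', 'link building']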

Agree To Disagree: Implementation

Now that we have a better idea of what the Orion Algorithm is and how it works, the big question is: what will its implementation mean to search engine users and to how websites get ranked on those engines? At this time there appear to be two main schools of thought:

  • What I believe, and
  • Everything else that’s been published

I’ll be the first to admit that my interpretation of how the Orion algorithm will affect search engine results is either not shared by other SEO’s (at least those who have a published opinion on the topic) or has not been thought up by them. That said, my take on the Orion Algorithm did not initially include their predicted effect, whereas I now believe it is likely both implementations will be tested, if not brought into full effect, within the next 12-18 months (this may seem like a long time, but if you want to develop a strategy to react to it, this is about the lead time you may well need). So … what are these two possible outcomes?

Where we all agree: the addition of key parts of web content in the results. This is how the algorithm is explained to function by its developer and is thus the obvious conclusion to most in regards to how it will be implemented.

Everyone else: related information displayed separately. From what I have read, the majority of people believe that the related phrases will be displayed separate from the original query (though rightfully no one seems to be predicting exactly where or how). Essentially this will give searchers the ability to view information on related topics quickly and easily.

This is certain to be included in some capacity and we have already seen similar functions added to the Google results for specific queries though not to any capacity reliable enough to be launched across all Google search results.

And then there’s my opinion: integration in standard search results. To me it seems short-sighted to believe that Google will take a technology that allows them to draw information and relevancy from multiple related phrases and use it only to display multiple options on a results page. With the processing power they have at their disposal, why would they not measure a site against its ability to rank for these other phrases and base the final results on that? Let’s take a quick peek at the pros and cons of such a move:

Cons first: Processing power. That about covers the downside and I’m sure we’re all aware of the fact that if this ever becomes an issue they have more than enough capital and technical know-how to get around it.

Pros: Imagine a world where running a search for a query took into consideration whether a site ranked for multiple related phrases. What do you suppose the impact on the results would be if only those sites that had content related to a number of areas of a topic ranked highly? The answer: a much more relevant set of results.

Conclusion

Fortunately, while there may be some disagreement about how this new algorithm will be integrated into the search engine results pages, the resulting actions required are the same. Whether the new functions are added in the form of additional links and information on the results pages, or taken into consideration when ranking the site for the initial query, sites that rank well for a multitude of related phrases will fare better than those that rank for just one of the phrases.

The action required on the part of SEO’s and website owners, then, is to provide quality unique content on all the possible areas that may be considered relevant to the main keyword target. Once this is accomplished, these areas need to be promoted in order to ensure that they rank well.

The resulting web will be one that rewards websites with a large amount of quality content on the highest number of topics related to a specific issue. If one considers the end goal of any of the major search engines – to provide the most relevant results possible – this new technology is sure to help promote these types of results and ensure that the searcher is receiving results that are likely to provide the information they’re looking for.

And let’s also consider this: should you choose to be an “early adopter” and begin making changes to your site, adding new content, optimizing it and getting it ranking well, what will the results be? Even if Orion isn’t implemented for another decade your website will gain stickiness and rank for more related keywords bringing you more targeted traffic and keeping it on your site. Could this possibly be a bad thing?

Resources

While I have strived to provide some insight into the Orion Algorithm and what it means to you, there is a lot of information and speculation out there, some of which also covers other implementations of this technology not addressed in this article. Below you will find some of the better pieces of information.

I have included information that contradicts what you may have read above. This algorithm is sure to have an enormous impact on the way searchers find results and the way SEO’s promote sites, and thus you need to have all the quality information at your disposal to make the right decisions for your website and your business.

Search Engine Watch – Danny Sullivan wrote a solid piece on the subject (as he always does) which includes some good links to related information and also a link to their forum thread on the subject where you can get other opinions on what this means to searchers and SEO’s.

E-Commerce Times – Jennifer LeClaire wrote a good piece on Orion which covers more on the integration of relevant listings into the results pages.

The Sydney Morning Herald – Stephen Hutcheon covers some of the basics regarding how the deal to purchase the algorithm came about, who the major players were, and a bit of the history behind Orion.


October 10, 2005

Google Reader Launched

Google has recently launched a new service called Google Reader. This service, available at http://reader.google.com/, allows users to search for and easily manage RSS feeds, giving quick and easy access to the most current information and news on the topics that interest you most.

For those of you unfamiliar with RSS feeds, they are simply itemized lists of content in a feed file that can be picked up and displayed on other websites and/or read through the use of RSS readers. Bloggers typically use RSS to syndicate their blog posts. For example, people wishing to keep updated on what’s going on at Google may use a reader to display the Google Blog feed (http://googleblog.blogspot.com/atom.xml), giving them instant access to any new posts in that blog.
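For the technically curious, this is roughly what any feed reader (Google’s included) does behind the scenes: fetch the feed file and list its newest items. The Python sketch below assumes a standard RSS 2.0 feed at a URL of your choosing; Atom feeds, such as the Google Blog feed above, use slightly different element names.

import urllib.request
import xml.etree.ElementTree as ET

def list_feed_items(feed_url, limit=5):
    """Fetch an RSS 2.0 feed and print the newest item titles and links."""
    with urllib.request.urlopen(feed_url) as response:
        tree = ET.parse(response)
    # RSS 2.0 layout: <rss><channel><item><title/><link/></item>...</channel></rss>
    for item in tree.findall("./channel/item")[:limit]:
        title = item.findtext("title", default="(no title)")
        link = item.findtext("link", default="")
        print(f"{title} - {link}")

# Example usage (any RSS 2.0 feed URL will do):
# list_feed_items("http://example.com/rss.xml")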

As most are aware, the popularity of blogs, and of RSS with them, has increased and will undoubtedly continue to increase for the foreseeable future. One has to admit, instant access to up-to-date information on topics of interest has its appeal. The launch of Google Reader is a giant step for the average surfer: it provides an easy and powerful tool for visitors to find and manage the feeds they are interested in.

How To Use Google Reader?

For many who are less familiar with RSS, the notion of downloading an RSS reader, configuring it, and so on seems a daunting task when one can simply visit their favorite news site or use the news search feature of their favorite search engine. Understandably, there are many who would rather not undertake the task of trying to understand something new when the information they are looking for is otherwise available through other means. While this is true, RSS allows a user to keep updated on news they might not even know to look for. Google knows this and thus Google Reader was born.

To use Google Reader (and I highly recommend at least giving it a quick try) you will need to take the following steps:

  1. Visit the Google Reader site at http://reader.google.com/.
  2. Use the search box for a topic of interest (“google” for example)
  3. Look through the results for a feed of interest (I personally chose “Google News”)
  4. Click “Subscribe”

That’s all there is to signing up for a feed. You can sign up for one or many of them.

At this point you’re probably wondering what you just got for those 20 seconds of effort. If you now click back to the Reader homepage (add it to your Favorites for easy access in the future) you will see on the left-hand side a list of the feeds you’ve subscribed to. If you choose a feed, on the right-hand side you’ll see all the new posts to that feed.

This is perhaps one of the best products to come from Google in quite a while. I’ll admit that the folks at Google are seldom short of interesting and innovative ideas; however, from a usability and “making your life easier” standpoint, the system they have developed here allows even the less technical to easily gain access to current information and keep updated effortlessly.

Who Should Use It?

Quite honestly, Google Reader, due to its power and ease of use, is a helpful tool for virtually anybody who wants to keep themselves updated on anything from world news to hockey scores. That said, there are definitely people who will be prone to become “power users” of this service. People who need quick access to the most current information, from reporters and researchers to business people and consultants, will find this service invaluable. I know as an SEO that I’ll be using it often, as keeping on top of even the smallest changes, services and search engine updates can be crucial to the success of a campaign.

Advanced Features

While all of the benefits noted above are good for the average user, they have also added some advanced features. The advanced features include:

GMail this – Never ones to miss an opportunity to promote their own services and drop some ads in, they have added a link to “GMail this” to others. Of course, you have to have a GMail account to use the service, which means you either have to be invited by an existing GMail user or have a mobile phone and be living in the US.

Blog This! – This is definitely my favorite of the advanced features. If you’re using Blogger (again, a Google property) you can click the “Blog This!” link and it will open a window to your Blogger account and insert a link to the blog you want to reference.

What Does This Mean For SEO’s?

The launch of Google Reader stands to make blogs and RSS an even more important component in a thorough Internet marketing strategy. With content syndication now made so much easier for the average user, its popularity is sure to climb significantly. People will begin reading and using feeds more regularly, and it won’t just be the more technical who can benefit from this highly effective communications method.

Resources

While it’s definitely a simple system to set up and use, there may still be many of you wondering exactly what RSS is, how to use it on your own site, how to set it up, and perhaps a few even wondering what a blog is and how you can add one to your site. For you, here are some links to helpful resources on the topic:

RSS (file format) – On the Wikipedia site you’ll find great information on this technology and links to other useful resources on the subject.

Client Communications As Ranking Tools – An article on the use of blogs by veteran SEO Jim Hedger. Covers the use of blogs for SEO as well as in client communications.

Blogger – There are many different tools and software packages for developing a blog on your website. Blogger is one of the easiest to use, includes a simple way to add an RSS feed of your blog, and do I need to mention that it’s owned by Google? This won’t get you a higher ranking in blog search, but it will certainly help ensure that your blog and feed are developed using a technology they can easily read.


August 24, 2005

SEO & Competition Analysis – Part Two

Once you have optimized the onsite factors from part one of this series it’s time to launch into the external factors. External SEO factors generally refer to the incoming links to your website and your competitors’ websites.

Analyzing the links to your competitors is not a simple matter of running a link:www.competitorsdomain.com query on Google and rushing off to duplicate what you find there. First of all, Google does not display all of the links it finds to a site, and thus this count will leave you with only about 5 or 6 percent of the real links to your main competitors. Yahoo! is much better at displaying all the links to a site; however, even this has its shortcomings in the analysis process. Secondly, the number of links is only a fraction of what’s important in their development.

To fully grasp how your competitors are ranking highly for your targeted phrases you will want to know a number of things about the links to their site including:

  • How many links do they have?
  • How many of these links come from the same sites?
  • Are these sites relevant?
  • What is the PageRank distribution of the links?
  • Are these links image or text links and if text,
  • What anchor text is used to link to your competitor’s site?

Why Are These Factors Important?

These factors are important as they define the value of the link. The number of links is perhaps the least important of these factors. A site can have 10,000 incoming links and if they are all from a single unrelated site with a low PageRank then the value of these links is negligible.

Knowing how many of the links to your competitor’s site come from the same site or sites will let you know where they have bought advertising and also help isolate weaknesses in their link counts. Multiple links from the same website are not given the same value as links from different websites. If your competitors have thousands of incoming links that come from only 5 different websites, you have far less work to do than if they had even a couple hundred, all from different sites.
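To make that kind of audit concrete, here is a minimal Python sketch of the tally involved. The backlink list is hypothetical (in practice it would come from a Yahoo! link: export or a tool such as Total Optimizer Pro); it simply groups links by referring domain and by anchor text.

from collections import Counter
from urllib.parse import urlparse

# Hypothetical backlink export: (linking page, anchor text) pairs.
backlinks = [
    ("http://www.design-blog.com/resources.html", "search engine positioning services"),
    ("http://www.design-blog.com/links.html", "SEO company"),
    ("http://www.hosting-site.com/partners.html", "search engine positioning services"),
    ("http://www.directory-example.com/seo/", "Beanstalk SEO"),
]

# Tally how many links come from each referring domain and each anchor text.
links_per_domain = Counter(urlparse(url).netloc for url, _ in backlinks)
anchor_text_counts = Counter(anchor for _, anchor in backlinks)

print("Unique referring domains:", len(links_per_domain))
print("Links per domain:", links_per_domain.most_common())
print("Anchor text breakdown:", anchor_text_counts.most_common())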

The relevancy of the incoming links is extremely important and gaining importance with every update. Unfortunately, this is also the hardest factor to gauge, as “what constitutes relevancy?” and “how exactly do I find out if my competitors’ links are relevant without visiting every one of them?” can be problematic questions.

Gauging relevancy can generally be done with a simple thought: if I am on a site and the link makes sense to be there (for example, a web design company linking to a web hosting company) then it can be considered relevant. Basically, if there are people who will actually click the link then it is relevant. Finding out if your competitor’s links are relevant without visiting every one of their link partners is a different hurdle to jump.

Rather than visiting each and every link, it is easier to view only the most important ones; that is, the ones from high PageRank pages. But how does one do that?

As with the use of a KDA tool in part one, we use the external analysis features of Total Optimizer Pro to tear apart the external factors our main competitors are using to hold top ten positions. While in part one I was able to note that there are other tools out there that break down keyword density elements, I am not able to do the same with offsite optimization factors. Total Optimizer Pro is the only tool we use that allows for such detailed analysis of external SEO factors when dealing with competition analysis.

The first step is to isolate which domains the links are coming from. The more links coming from just a few domains the better, as this indicates that the competition is lower than a pure link count would suggest. Moving on from there, we look at the PageRank breakdown of the links. The higher the number of high PageRank links, the more difficult the competition. However, once you have isolated which domains the links are coming from, it is often simply a matter of visiting those sites and establishing the same links to yours, whether through exchanges, directory listings, or other tactics.

While you are on these sites, assess whether the content is relevant. You will undoubtedly not want to visit each and every page that links to your competitors, but if you visit all the top sites (i.e. PageRank 3 or higher) you will get a very solid idea of the relevancy value of the links. Once we know the relevancy value of the content, we need to know what they’re doing to pass that relevancy along in the form of their links.

Using a tool such as Total Optimizer Pro it is simple to determine exactly what types of links are pointing to your competitors, however it is possible, though much more time-consuming, to do it manually (i.e. you will have to visit every page).

An important factor in SEO and in building relevancy to your site comes in the form of anchor text. The verbiage used to link to your site, or the alt text in the event the link is an image, can play an important role. To illustrate this with a great example: a search on Google for “msn” returns the page www.submit-it.com in position seven. If you view the cache, rather than seeing the highlighted use of the term “msn” (as noted in Part One of this series) you receive the note that “These terms only appear in links pointing to this page: msn”. The relevancy of the anchor text in this case is so strong that this page outranks many that are optimized for “msn” using onsite factors.

What Do We Know?

So what do we now know about our competition? We know where their links are coming from, the PageRank of those links, the relevancy of the top links, the anchor text and/or alt text used to link to our competitors’ sites, and how many of those are multiple links from the same site. Combined with the information attained in part one of this series regarding onsite factors, we effectively have a blueprint for what is required to hold a top position for a specific phrase.

Where Do We Go From Here?

So now you have a blueprint, but what do you do with it? The onsite factors covered in part one need to be duplicated. The offsite factors (i.e. incoming links) need to be duplicated as well; however, you must also keep in mind that you are working to beat someone out. They in turn will work to take back their position, and there may be others working to do the same that just haven’t shown up yet.

Here we follow the 10%-more rule. In regards to onsite factors, all you can do is work with the average keyword densities and make sure your content is well written while maximizing the use of keywords and special text to give you the biggest boost possible. After that, the 10% rule comes into effect: once you know exactly what your main competitors have done in regards to their incoming links, do that, but add 10%, either in numbers or in value and relevancy.

While this entire process can be very time-consuming, the goal here is not to save time; it is to maximize the effectiveness of the SEO performed on your site. Spending a fraction of the time to produce little or no result is never as desirable as ensuring you’re doing it right from the beginning and then taking the time to do what’s needed, thus greatly increasing your odds of success.


August 23, 2005

SEO & Competition Analysis – Part One

Analyzing your competition should be the second step taken during the SEO process (right after, and sometimes even during, keyword selection). Looking at what your competition has done, and how they have positioned their websites where you want yours to be, will lend great insight into how to get yours there.

The above statement should not be taken to mean that early in the campaign is the only time competition analysis is important. Once you are holding a top position, your competition will undoubtedly renew their efforts to take back what you have taken from them. Competition analysis is a step that must be taken to find out what you need to do to win a top position, but it should also be performed periodically to detect your competitors’ efforts to take back “their” former positions.

In this article we will cover onsite factors which must be considered and in part two we will cover external factor analysis including incoming links, anchor text, PageRank, etc.

Onsite Factors

Onsite factors of your website are the easiest to address as they are factors which are under your complete control. You have the power to change anything within your site from the content, internal linking structure, and even the design structure itself.

Key onsite factors that must be considered in competition analysis are:

  • Titles and meta tags
  • Keyword density and content
  • Special formats and positioning

There are many tools available to help you determine what the optimal levels are. Generally these are known as KDA (Keyword Density Analysis) tools. Of all of them, there is one that we use at Beanstalk that we have found provides better, more accurate information than the others, and that is Total Optimizer Pro by TopNet Solutions. The reason we chose this one above the others is twofold. First, it provides very easy-to-read and thorough information that can be analyzed quickly and, second, it has built-in tools to analyze offsite factors to a level that doesn’t exist in other software. Essentially this means that a single tool can give you the recipe you will need to take and hold your position in the top ten.

Title And Meta Tags

While meta tags definitely don’t hold the weight they once did, they are certainly worth adding to your site given that they take seconds to add. Titles, on the other hand, hold significant weight and must be created carefully to ensure that they hold maximum SEO effectiveness and also appeal to searchers.

In analyzing the titles and meta tags you are essentially looking for the optimal keyword density of those tags. A KDA tool will let you know what percentage of your competitors’ tags are made up of the targeted keywords. A good KDA tool will also display the range or average of percentages. Due to their low weight, meta tags don’t have to be given quite the attention that titles do. When you are optimizing your titles you will want to ensure that you fall somewhere near the middle of the pack. Hopefully, in your industry the top ten sites have relatively close percentages, in which case it is easy to determine what the optimal percentage is. If they don’t, you will want to gear your title tag to something that falls in the upper end of the range of densities (though not over it) and also keep that title interesting to the searcher who will see it as the link to your site in the search results.

Google at least, and probably the other major engines as well, has added or will be adding to the ranking algorithm a function that records the number of times a specific link is clicked when it appears in the results. If your site appears at the top of the results but is not clicked at a rate that is acceptable for that position, your website will slip. Like any other marketing tool, your title tag is the gateway from the search engine results to your website: ensure you’ve created an attractive welcome mat.

Keyword Density And Content

There has been much discussion over the years as to whether there even is an optimal keyword density or whether density even matters. While there are intelligent SEO’s out there who would disagree, the entire debate seems obvious to us at least. If the search engines are looking at onsite factors at all (which they are) and looking for relevancy then it naturally follows that there is a percentage of your content that can consist of the targeted keywords and indicate to the engines that your site is relevant for a given phrase.
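As a rough illustration of what a KDA tool measures, here is a minimal keyword-density sketch in Python. The sample text and phrase are made up, and real tools such as Total Optimizer Pro weigh titles, headings and other elements separately, which this sketch does not attempt.

import re

def keyword_density(text, phrase):
    """Percentage of the words in `text` accounted for by occurrences of `phrase`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    if not words or not phrase_words:
        return 0.0
    # Count occurrences of the phrase as a consecutive word sequence.
    hits = sum(
        words[i:i + len(phrase_words)] == phrase_words
        for i in range(len(words) - len(phrase_words) + 1)
    )
    return 100.0 * hits * len(phrase_words) / len(words)

sample = "Beanstalk offers search engine positioning services. Our search engine positioning articles explain how."
print(round(keyword_density(sample, "search engine positioning"), 1))  # high on this tiny sample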

That said, and as with titles, it is not about cramming in keywords anywhere to boost the density of your content. Using a KDA tool to find the optimal density for your industry will give you a good idea of any content changes you may need to make. From here you will want to look at two additional areas of your competitors’ sites: one you can get from an advanced KDA tool such as Total Optimizer Pro, and the other you can get right from the engines themselves. Which brings us to …

Special Formats And Positioning

Special formats are content elements such as bold, colors, anchor text, or any other characteristic that sets specific text out as different when a search engine is spidering your site. Positioning refers to the position of the keywords in relation to the entire content on a given page. Aside from this type of positioning, there is also the consideration of how the content and keywords are positioned relative to the code of the page (and sometimes these can be two very different things). This topic was touched on in a past article on table structures and will be covered in a future mini-series on W3C-compliant and search engine friendly design, to be published in September.

Special formats such as bold, colors, italics, highlights, etc. set specific content aside as more important than the rest. The use of these formats, provided it is done correctly, can not only help improve the rankings of your website for specific phrases but can also enhance the usability of your website in general by drawing the human eye to key content. This is not to say that you should bold, highlight and color every instance of your targeted phrase, but rather that you should use these elements to draw the eye to the key content you are most interested in getting read.

With positioning, the job is a bit more difficult to assess. One of the best ways to quickly isolate how your competitors have used special formats, and where they have positioned their keywords in relation to the entire page, is simply to run a search for the phrase on Google and view the cache of each page. The keywords will be highlighted in a variety of colors, allowing you to quickly glance through the page and isolate what special elements they are using and where they have positioned their keywords. You will want to do this for the top 10 competitors.

Conclusion

As with any competition, if you understand what those who have what you want are doing, it becomes a matter of doing the same and then adding 10% to your efforts. In the case of onsite optimization you’ll simply want to duplicate the best of the top ten; in part two, on external factors, you will be doing the 10% more.

SEO news blog post by @ 11:52 am

Categories:SEO Articles


July 18, 2005

Google PageRank Update Analysis

For those of you not yet aware, Google is currently updating the PageRank they are displaying in their toolbar. Each update causes a stir among the SEO community and webmasters trying to get their websites to the top of the Google Rankings.

What Is PageRank?

Without getting into too much detail, PageRank is essentially a score out of ten reflecting the "value" of your site in comparison to other websites on the Internet. It is based on two primary factors: the number of links you have pointing to your website and the value of those links. The value is calculated based on the PageRank of the page linking to you and, debatably, the relevancy of the page linking to you (I have seen no hard evidence to back up the relevancy factor in regard to PageRank; however, relevancy is definitely a factor in your overall ranking).
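For the curious, the classic formulation from the original PageRank paper looks roughly like the sketch below. The toolbar's zero-to-ten score is generally believed to be a coarse, roughly logarithmic bucketing of an underlying value like this, and Google's production algorithm has long since moved beyond this simple form, so treat it as conceptual only.

    def pagerank(links, damping=0.85, iterations=50):
        """Classic PageRank iteration. links maps each page to the pages it links to."""
        pages = set(links) | {p for targets in links.values() for p in targets}
        pr = {page: 1.0 for page in pages}
        for _ in range(iterations):
            new_pr = {}
            for page in pages:
                inbound = sum(
                    pr[src] / len(targets)  # each page splits its vote across its outbound links
                    for src, targets in links.items()
                    if targets and page in targets
                )
                new_pr[page] = (1 - damping) + damping * inbound
            pr = new_pr
        return pr

    demo = {"home": ["about", "services"], "about": ["home"], "services": ["home"]}
    print(pagerank(demo))  # "home" ends up with the highest score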

If you are interested in more information on PageRank you would do well to visit the many forums and articles on the topic and also visit Google’s own description on their website at http://www.google.com/technology/ where they give a brief description of the technology.

What’s New?

The most current PageRank update will undoubtedly cause a larger stir than usual in that many sites have shown drops in their visible PageRank while at the same time showing significant increases in their backlinks. This suggests that at least one of three things has occurred in this latest update:

  1. Google has raised the bar on PageRank, making it more difficult to attain a high level, or
  2. The way they are displaying their backlinks has changed, or
  3. The way they calculate the value of an incoming link has changed.

Any of these is possible, and each has been noted in the past as something Google is willing to do. Additionally, it is possible for all three to occur at the same time.

As we don't like to use clients as examples, I will use the Beanstalk site, its backlink counts, and its PageRank changes as the meter by which the following conclusions are drawn; however, this information was attained by looking at a number of client websites and their competitors.

Google Raising The Bar To Lower Yours

In the past few PageRank updates it has become quite apparent that Google is continuously raising the bar on PageRank. In their defense, with all of the reciprocal link building, link renting, etc. going on, this was a natural reaction to the growing number of high-PageRank sites that attained those ranks simply by building or buying hundreds or thousands of links.

There is no doubt that this is a factor in the changes in this current update. If your site has maintained its PageRank, along with the PageRank of your second-level pages, then you have done well in holding steady, and if your competitors have not been as diligent, their positions will slip.

New Backlink Calculations

I mention this one only to bring to light that it is a possibility for your future consideration during other updates. The Beanstalk website went from 750 shown backlinks on Google to 864. It should be noted that Google does not show all backlinks (if you want a more accurate backlink count, go to Yahoo! and enter "link:http://www.yourdomain.com" – don't forget the http://).

When the Beanstalk site showed 750 backlinks on Google we were showing around 12,000 on Yahoo! (about 6.3% showing on Google). The Beanstalk site is now showing 864 on Google and 15,500 on Yahoo! (about 5.6%). If anything, Google is showing a smaller share of links than before, which rules out the possibility that a website's PageRank is dropping due to a real decrease in links that is being hidden by a larger proportion being displayed.
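The ratios quoted above are simply the two counts divided through:

    # The share of Yahoo!-reported links that Google chose to display, before and after.
    before = 750 / 12_000
    after = 864 / 15_500
    print(f"before: {before:.2%}, after: {after:.2%}")  # before: 6.25%, after: 5.57%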

In short, while which backlinks Google chooses to display has certainly changed over time, it does not appear to be a major factor in this update. If you see an increase in your site's backlink counts during this update, you undoubtedly have an increased number of links.

The Value Of Links

Separate from the number of links you have is their value. This appears to be an area of significant change in this update. Areas that appear to have reduced value with regard to PageRank are listed below (a purely hypothetical scoring sketch follows the list):

  1. Multiple links from the same site or run-of-site links

    Intelligent and relevant reciprocal links do not seem to have been penalized, probably due to the increased relevancy factor. If you reduce the value of irrelevant links and raise the value of relevant ones then there is no need to penalize reciprocal links as, done incorrectly, they will penalize themselves.

  2. Links with text around them that indicate they are purchased such as “Partners”, “Advertising”, etc.

    Google has and is actively trying to reduce the value of paid links. This appears to have been moderately successful where there is clear indication that the link is paid for.

  3. Links from sites that hold little relevancy (this factor is based on educated speculation)

    The relevancy factor appears to have become more important. Links from sites with content related to yours are showing positive results, while sites with larger numbers of less relevant links are showing drops in PageRank.
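To pull the three factors above together, here is a purely hypothetical scoring sketch. Every weight, signal and word list in it is invented for illustration; nothing here reflects a known Google formula.

    # Hypothetical illustration of the three factors listed above. All weights are invented.
    PAID_HINTS = ("partners", "advertising", "sponsors", "sponsored")

    def link_value(source_pr, relevant, run_of_site, surrounding_text):
        value = source_pr
        if not relevant:                 # factor 3: source site holds little relevancy
            value *= 0.3
        if run_of_site:                  # factor 1: multiple / run-of-site links
            value *= 0.5
        if any(hint in surrounding_text.lower() for hint in PAID_HINTS):
            value *= 0.2                 # factor 2: surrounding text suggests a paid placement
        return value

    print(link_value(source_pr=5.0, relevant=True, run_of_site=False,
                     surrounding_text="More resources on skin care"))
    print(link_value(source_pr=5.0, relevant=False, run_of_site=True,
                     surrounding_text="Our advertising partners"))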

What Does This Mean?

For those of you who have been proactive in your link building and focused on relevant sites using the Google Directory, searches or a tool like PR Prowler, it means "stay the course". Those of you who have been building or buying links based only on PageRank, with little concern for the link's location or how it is presented, will need to adjust your link building efforts accordingly.

What Do I Do – My PageRank Dropped ?!!?

The first thing not to do is panic. Take a deep breath: PageRank is one of dozens of factors that Google uses to determine the ranking of your page; it is not the only thing. Now, visit your main competitors' sites – there's a good chance you'll see that they too dropped in PageRank. The plus side to these kinds of updates is that they're universal. It's not as if Google has it in for you specifically, and so when they do an update, the positive and negative impact is felt by all.

Now, if you've noticed that everyone around you has stayed the same or increased in PageRank, try to remember this: there's nothing you can do about where you're currently positioned in regard to PageRank, and it will probably be another 3 months before Google updates the public PageRank again. So start building some good quality (high relevancy, solid PageRank) links and work towards an increase in the next update.

Panicking now won’t help, intelligent reaction will.

What Happens Now?

Traditionally the search engine results will begin to fluctuate based on the new visible PageRank 3 to 7 days after it becomes visible. This does not have to be the case, as Google has had these numbers all along, but it has worked this way in the majority of cases in recent history. So monitor your search engine positions over the next week or two and watch for changes. Try to hold back on making major changes to your site during this time, as the final positions will often differ from those that can be viewed during the shuffling. In a couple of weeks' time, evaluate where you stand and tweak your site as necessary, but don't spend too much time on that … you have a solid link building effort to undertake.

SEO news blog post by @ 4:38 pm

Categories:SEO Articles


June 8, 2005

Anatomy Of An Internet Search Engine

For some unfortunate souls SEO is simply the learning of tricks and techniques that, according to their understanding, should propel their site into the top rankings on the major search engines. This understanding of the way SEO works can be effective for a time, however it contains one basic flaw … the rules change. Search engines are in a constant state of evolution in order to keep up with the SEOs, in much the same way that Norton, McAfee, AVG or any of the other anti-virus software companies are constantly trying to keep up with the virus writers.

Basing your entire website's future on one simple set of rules (read: tricks) about how the search engines will rank your site contains an additional flaw: there are more factors being considered than any SEO is aware of or can confirm. That's right, I will freely admit that there are factors at work that I may not be aware of, and even for those that I am aware of I cannot give you, with 100% accuracy, the exact weight they carry in the overall algorithm. Even if I could, the algorithm would change a few weeks later and, what's more, hold your hats for this one: there is more than one search engine.

So if we cannot base our optimization on a set of hard-and-fast rules, what can we do? The key, my friends, is not to understand the tricks but rather what they accomplish. Reflecting back on my high school math teacher Mr. Barry Nicholl, I recall a silly story that had a great impact. One weekend he had the entire class watch Dumbo The Flying Elephant (there was actually going to be a question about it on our test). Why? The lesson we were to get from it is that formulas (like tricks) are the feather in the story. They are unnecessary, and yet we hold on to them in the false belief that it is the feather that works and not the logic. Indeed, the tricks and techniques are not what works but rather the logic they follow, and that is their shortcoming.

And So What Is Necessary?

To rank a website highly and keep it ranking over time, one must optimize it with one primary understanding: that a search engine is a living thing. Obviously this is not to say that search engines have brains (I will leave those tales to Orson Scott Card and other science fiction writers); however, their very nature results in a lifelike being with far more storage capacity.

Consider for a moment how a search engine functions: it goes out into the world, follows the road signs and paths to get where it's going, and collects all of the information in its path. From this point, the information is sent back to a group of servers where algorithms are applied in order to determine the importance of specific documents. How are these algorithms generated? They are created by human beings who have a great deal of experience in understanding the fundamentals of the Internet and the documents it contains, and who also have the capacity to learn from their mistakes and update the algorithms accordingly. Essentially we have an entity that collects data, stores it, and then sorts through it to determine what's important, which it's happy to share with others, and what's unimportant, which it keeps tucked away.

So Let’s Break It Down …

To gain a true understanding of what a search engine is, it’s simple enough to compare it to the human anatomy as, though not breathing, it contains many of the same core functions required for life. And these are:

The Lungs & Other Vital Organs – The lungs of a search engine, and indeed the vast majority of its vital organs, are contained within the datacenters in which it is housed, be it in the form of power, Internet connectivity, etc. As with the human body, we do not generally consider these important in defining who we are; however, we're certainly grateful to have them and need them all to function properly.

The Arms & Legs – Think of the links from the engine itself as the arms and legs. These are the vehicles by which we get where we need to go and retrieve what needs to be accessed. While we don't commonly think of these as functions when we're considering SEO, they are the purpose of the entire thing. Much as the human body is designed primarily to keep you mobile and able to access other things, so too is the entire search engine designed primarily to access the outside world.

The Eyes – The eyes of the search engine are the spiders (AKA robots or crawlers). These are the 1s and 0s that the search engines send out over the Internet to retrieve documents. In the case of all the major search engines the spiders crawl from one page to another following the links, as you would look down various paths along your way. Fortunately for the spiders they are traveling mainly over fiber optic connections and so their ability to travel at light speed enables them to visit all the paths they come across whereas we as mere humans have to be a bit more selective.
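As a rough illustration of the crawling just described, here is a drastically simplified spider sketch using Python's standard library. A real spider respects robots.txt, throttles itself, and parses links far more robustly; example.com is just a placeholder start page.

    # A drastically simplified sketch of a crawler following links page to page.
    import re
    import urllib.request
    from collections import deque

    def crawl(start_url, max_pages=10):
        seen, queue = set(), deque([start_url])
        while queue and len(seen) < max_pages:
            url = queue.popleft()
            if url in seen:
                continue
            seen.add(url)
            try:
                html = urllib.request.urlopen(url, timeout=5).read().decode("utf-8", "ignore")
            except Exception:
                continue  # unreachable or unreadable page; move on
            for link in re.findall(r'href="(http[^"]+)"', html):
                if link not in seen:
                    queue.append(link)
        return seen

    print(crawl("http://www.example.com/"))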

The Brain – The brain of a search engine, like the human brain, is the most complex of its functions and components. The brain must have instinct, must know, and must learn in order to function properly. A search engine (and by search engine we mean the natural listings of the major engines) must also include these critical three components in order to survive.

The Instinct – The instinct of a search engine is defined in its core functions: the crawling of sites, and either the inability to read specific types of data or the programmed response to ignore files meeting specific criteria. Even the programmed responses become automated by the engines and thus fall under the category of instinct, much the same as the westernized human instinct to jump away from a large spider is learned. An infant would probably watch the spider or even eat it, meaning this is not an automatic human reaction.

The instinct of a search engine is important to understand; however, once you understand what can and cannot be read and how the spiders will crawl a site, this will become instinct for you too and can then safely be stored in the "autopilot" part of your brain.

The Knowing – Search engines know by crawling. What they know goes far beyond what is commonly perceived by most users, webmasters and SEOs. While the vast storehouse we call the Internet provides billions upon billions of pages of data for the search engines to know, they also pick up more than that. Search engines know a number of different methods for storing data, presenting data, prioritizing data and, of course, ways of tricking the engines themselves.

While the search engine spiders are crawling the web they are grabbing the stores of data that exist and sending it back to the datacenters, where that information is processed through existing algorithms and sp@m filters where it will attain a ranking based on the engine’s current understanding of the way the Internet and the documents contained within it work.

Similar to the way we process an article from a newspaper based on our current understanding of the world, the search engines process and rank documents based on what they understand to be true in the way documents are organized on the Internet.

The Learning – Once it is understood that search engines rank documents based on a specific understanding of the way the Internet functions, it then follows that a search engine must have the ability to "learn": it must ensure that new document types and technologies can be read, and its algorithm must change as new understandings of how the Internet functions are uncovered.

Aside from needing the ability to properly spider documents stored in newer technologies, search engines must also have the ability to detect and accurately penalize sp@m, as well as accurately rank websites based on new understandings of the way documents are organized and links arranged. Examples of areas where search engines must learn on an ongoing basis include, but are most certainly not limited to:

  • Understanding the relevancy of the content between sites where a link is found
  • Attaining the ability to view the content on documents contained within new technologies such as database types, Flash, etc.
  • Understanding the various methods used to hide text, links, etc. in order to penalize sites engaging in these tactics
  • Learning from current results and any shortcoming in them, what tweaks to current algorithms or what additional considerations must be taken into account to improve the relevancy of the results in the future.

The learning of a search engine generally comes from the uber-geeks hired by the search engines and from the engines' users. Once a factor is taken into account and programmed into the algorithm, it then moves into the "knowing" category until the next round of updates.

How This Helps in SEO

This is the point at which you may be asking yourself, "This is all well and good, but exactly how does this help ME?" An understanding of how search engines function, how they learn, and how they live is one of the most important understandings you can have in optimizing a website. It will ensure that you don't simply apply random tricks in hopes that you've listened to the right person in the forums that day, but rather that you ask what the search engine is trying to do and whether a given tactic fits with the long-term goals of the engine.

For a while, keyword density sp@mming was all the rage among the less ethical SEOs, as was building networks of websites to link together in order to boost link popularity. Neither of these tactics works today, and why? They do not fit with the long-term goals of the search engines. Search engines, like humans, want to survive. If the results they provide are poor then the engine will die a slow but steady death, and so they evolve.

When considering any tactic you must consider, does this fit with the long-term goals of the engine? Does this tactic in general serve to provide better results for the largest number of searches? If the answer is yes then the tactic is sound.

For example, the overall relevancy of your website (i.e. whether the majority of your content focuses on a single subject) has become more important over the past year or so. Does this help the searcher? The searcher will find more content on the subject they searched for on larger sites with more related content, and thus this shift does help the searcher overall. A tactic that includes the addition of more content to your site is therefore a solid one, as it helps build the overall relevancy of your website and gives the visitor more, and more current, information at their disposal once they get there.

Another example would be in link building. Reciprocal links are becoming less relevant, and reciprocal links between unrelated sites are virtually irrelevant. If you are engaging in reciprocal link building, ensure that the sites you link to are related to your site's content. As a search engine, I would want to know that a site in my results also provides links to other related sites, increasing the chance that the searcher will find the information they are looking for one way or another without having to switch to a different search engine.

In Short

In short, think ahead. Understand that search engines are organic beings that will continue to evolve. Help feed them when they visit your site and they will return often and reward your efforts. Use unethical tactics and you may hold a good position for a while but in the end, if you do not use tactics that provide for good overall results, you will not hold your position for long. They will learn.

SEO news blog post by @ 5:30 pm

Categories:Search Engine News


February 27, 2005

10 Steps To Higher Search Engine Positioning

There is perhaps no more level playing field in business than the Internet. It is this fact that has created millionaires from paupers. The amount of money that can be made depends of course on your industry and your products and/or services but to be sure, if it can be sold at all, it can be sold online.

While there are many methods out there for building a profitable website, from banner ads to email campaigns, by far the most cost effective over time has proven repeatedly to be search engine positioning. The major advantage search engine positioning has over other methods of producing revenue online is that once high rankings are attained, provided that the tactics used were ethical and that continued efforts are made to keep them, they can essentially hold and provide targeted traffic indefinitely. Your site will rise and may sometimes fall in the rankings, but a solid and complete optimization of your site will ensure that through algorithm changes you may fluctuate but you will not disappear.

I have been ranking websites highly on the Internet for quite a few years now, and there are some essential rules that, if followed, will ensure that over time your website does well and holds solid and profitable positions on the major search engines.

Here are the 10 steps to higher search engine positioning:

Step One – Choosing Keywords

You first must choose your keywords. This is perhaps the most important step of the process as incorrectly targeting phrases can result in traffic that is not interested in your product. There are three tools that I use virtually every day to help pick the most appropriate keywords:

  1. Overture’s Search Term Suggestion Tool
  2. WordTracker
  3. A Brain

The last in the list is the most important. Look through the potential keyword phrases and think, “Who would be searching using that phrase?” If the answer is, “a student looking for information” then chances are it won’t result in a sale. If the answer is “Someone who is looking specifically for a product I offer,” then obviously this is a prime candidate as a targeted keyword phrase.

Step Two – Site Content

Even before I optimize websites I like to get a good deal of new content down in order to ensure that I know exactly where I'm going and exactly what I need to do to get there. Creating some of the new content before starting the optimization process can be doubly helpful in that it can reveal potential additions to your website that you may not have considered (a forum or blog, for example). If you already have a site, perhaps simply sit on your back deck, sip a coffee, and imagine what you would do if your whole site was lost and you had to start again (other than launch into a very colorful discussion with your hosting company).

Step Three – Site Structure

A solid site structure is very important. Creating a site that is easily spidered by the search engines yet attractive to visitors can be a daunting and yet entirely rewarding endeavor. To adequately structure your website you must “think like a spider” which is not as difficult as it may sound. A search engine spider reads your web page like you would read a book. It starts at the top left, reads across, and then moves down.

Priority must be given then, to what you place near the top of your page.

Step Four – Optimization

Once you have your keyword targets, your content created and your site structure established you must now move on to the most obvious step, the optimization of your content.

As noted above, a spider places importance on what it reads highest on the page, and so beginning with a sentence that includes your targeted phrase only makes sense. That said, stuffing in keywords in the hope that it will add weight to your page generally doesn't work. The term "keyword density" refers to the percentage of your content that is made up of your targeted keywords. There are optimum densities according to many reputable SEOs, though exactly what they are is debatable. Estimates seem to range anywhere from 4 or 5% to 10 or 12% (quite a gap, isn't it?).

Personally, when it comes to keyword density I subscribe to one rule: put your keywords in the content as much as you can while keeping it comfortably readable to a human visitor.

Some do it first, I do it last; regardless of when you do it, you must choose your heading. At the beginning of your content you have the opportunity to use the <h1> tag to specify the heading of your content. This tag is given extra weight and is also an indicator to the search engine of where your actual content starts. Make sure to use your keywords in the heading, but don't shy away from also adding additional words (though not too many).
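Here is a small sketch of a check for the two on-page signals discussed in this step: that the target phrase appears in the <h1> and early in the body copy. The HTML snippet is a made-up example and the 200-character cutoff is an arbitrary illustration, not a known threshold.

    import re

    def check_on_page(html, phrase):
        """Check that the phrase appears in the <h1> and near the start of the visible text."""
        phrase = phrase.lower()
        h1 = re.search(r"<h1[^>]*>(.*?)</h1>", html, re.I | re.S)
        text = re.sub(r"<[^>]+>", " ", html).lower()
        return {
            "phrase in <h1>": bool(h1 and phrase in h1.group(1).lower()),
            "phrase early in copy": phrase in text[:200],
        }

    sample = "<h1>Acne Treatments That Work</h1><p>Our acne treatments are reviewed by dermatologists.</p>"
    print(check_on_page(sample, "acne treatments"))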

Step Five – Internal Linking

To ensure that your website gets fully indexed you have to make sure that the spiders have an easy path through your website. Text links make the best choice, as the anchor text (the actual words used to link to a specific page) adds relevancy to that page for the words used in the link. For example, if I ran a website on acne and had a treatments page, I could link to it with an image, with text reading "Click for more information on how to treat this skin condition", or simply with "Acne Treatments". When a search engine spider hits an image it has no idea what the image is and, while it will follow the link, it will not give any weight to the page it hits. If you use text that does not contain the keywords you are targeting, you are essentially supplying the engine with the same lack of relevancy as with an image; but if you use the phrase "Acne Treatments" to link to your acne treatments page, you are attaching relevancy to that page for those keywords.

There are two main ways to ensure that your site gets well spidered AND that the relevancy is added. The first is to place text links at the bottom of your homepage to your main internal pages (not EVERY page, that just looks odd). The second is to create a sitemap to all your internal pages and link to it from your homepage. Both methods have advantages and disadvantages, but that's a whole article unto itself.
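To audit what the spiders will actually see, a crude sketch like the one below can list a page's internal links with their anchor text and flag image-only or generic anchors. The regexes are deliberately simple and the markup is a made-up example.

    import re

    GENERIC = {"click here", "more", "read more", "here"}

    def audit_anchors(html):
        """List internal links with their anchor text, flagging image-only or generic anchors."""
        for href, inner in re.findall(r'<a\s+[^>]*href="(/[^"]*)"[^>]*>(.*?)</a>', html, re.I | re.S):
            text = re.sub(r"<[^>]+>", "", inner).strip()
            if not text:
                note = "image-only link - carries no anchor-text relevancy"
            elif text.lower() in GENERIC:
                note = "generic anchor - wasted relevancy"
            else:
                note = "descriptive anchor"
            print(f"{href:25} {text or '[image]':20} {note}")

    audit_anchors('<a href="/treatments.html"><img src="btn.gif"></a>'
                  '<a href="/treatments.html">Acne Treatments</a>')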

Step Six – Human Testing

So now you have your site, it’s optimized and you have your navigation in place. The next step is to put it past someone who has never seen your site (and preferably who won’t know how much work you’ve put in and tell you it’s great even if it’s not).

Ask them to find specific information and see how long it takes. Ask someone else to just surf your site and watch which links they click and ask them why they chose those ones.

Most importantly, find out how the content reads to them. You've spent hours working through the content at this point and are probably anything but unbiased about how it reads. Find out how it reads to someone who has no vested interest in the site and correct any issues they bring up.

Step Seven – Submissions

I take a different philosophy than most when it comes to search engine submissions. I submit to directories (both general and topic-specific) and to a few topical search engines, but for the most part I've found submitting to Google, Yahoo!, MSN and the other major engines to be a bit of a waste of time. The major search engines are spidering search engines, which means they will follow links wherever they go. Simply having sites that are spidered by the major search engines linking to you will get your site found.

When I have spent time submitting my sites I have found they get picked up in about a week. When I have simply skipped this step and sought out reputable directories and other sites to get links from I have found that at least the homepage of the site gets indexed in as little as two days.

Neither will hurt your rankings, but simply to make the best use of your time, seek out directories and other websites to get links from and leave the spiders to find you on their own.

Step Eight – Link Building

All of the major search engines give credit to sites that have quality links pointing to them. How many is enough depends on your industry and targeted phrases. Running a search on Google that reads "link:www.yourcompetition.com" will reveal approximately how many links a competitor has.

The first place to seek links is with general and topic-specific directories. After that you may want to move into reciprocal link building. Reciprocal link building is the exchange of links between two websites. Some webmasters will simply link to any website that links back to them. I highly recommend being more particular than that.

Find websites that you believe your site visitors would genuinely be interested in and you’ve probably found a good link partner. You want to find links from sites that are related to yours.

There are obviously many more methods to building links than directories and reciprocal link building. Again though, this is a whole article (or more) in itself.

Step Nine – Monitoring

Whether you use WebPosition Gold or just run searches manually, you will have to monitor the major search engines for your targeted phrases. You will also need to review your stats to see where your traffic is coming from and what search terms are being used to find you.

If a month passes and you don't see any changes, then more work needs to be done. I'm certainly not stating that you should take a month off; a solid search engine positioning strategy involves constantly adding content, building links, and ensuring that your visitors are getting the information they want and finding it as easily as possible.
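As a starting point for the stats review, here is a sketch that mines search phrases from referrer strings in a web server access log. The log line format and the "q"/"p" query parameters are assumptions that vary by server and engine; this approach relies on engines passing the search keywords in their referrers, as they did at the time.

    import re
    from collections import Counter
    from urllib.parse import urlparse, parse_qs

    def search_terms(log_lines):
        counts = Counter()
        for line in log_lines:
            m = re.search(r'"(https?://[^"]*(?:google|yahoo|msn)[^"]*)"', line)
            if not m:
                continue
            query = parse_qs(urlparse(m.group(1)).query)
            for key in ("q", "p"):  # Google used q, Yahoo! used p
                for phrase in query.get(key, []):
                    counts[phrase.lower()] += 1
        return counts.most_common(10)

    sample = ['1.2.3.4 - - [27/Feb/2005] "GET /treatments.html" 200 "http://www.google.com/search?q=acne+treatments"']
    print(search_terms(sample))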

Step Ten – Reward Yourself

So you've done it. It's taken many, many hours of work, but your rankings are doing well. What you've created is a solid position that will stand the test of time, provided that you continually revisit the above-noted steps and ensure that your website is always one step ahead of your competition (who will have noticed you climbing and succeeding, as you would notice others climbing up around your ranking).

Now it’s time to turn off your computer, take your partner out (you haven’t had much time for them lately) and have a great week(end). You’ve got a lot of work to do to maintain and build on these rankings but the hardest part is over. Congratulations!

SEO news blog post by @ 4:52 pm

Categories:SEO Articles


Copyright© 2004-2014
Beanstalk Search Engine Optimization, Inc.
All rights reserved.