

Canadian Court Orders Google to Remove Company From Global Search Results


In yet another international ruling, Google has been ordered to remove a website from its global search results. Today, B.C. Supreme Court Justice Lauri Ann Fenlon ruled that Google has 14 days to remove a company by the name of Datalink from its global search results. Datalink is a rival of the technology company Equustek, which manufactures networking devices for industrial equipment. Equustek alleges that Datalink stole product designs by recruiting a former Equustek engineer.

While Equustek has already won the battle in Canadian courts, this case sets a precedent for international rulings. Justice Lauri Ann Fenlon has stated:

“The courts must adapt to the reality of e-commerce with its potential for abuse by those who would take the property of others and sell it through the borderless electronic web of the internet,”

Google has argued that the B.C. court does not have jurisdiction to enforce such a ruling because its headquarters are located in the United States, but Justice Fenlon countered that the company clearly does business in the province by selling ads and providing search results.

For more information, read the full article:

SEO news blog post by @ 4:10 pm on June 18, 2014

Categories: Search Engine News


Is Penguin 3.0 On The Way?

The folks over at Search Engine Roundtable have reported a lot of chatter on some of the webmaster forums about a possible Google algorithm update. Although Google has yet to release any official word, the forums are abuzz with webmasters seeing shifts in search traffic and ranking positions. Suggestions range from a Penguin 3.0 update, which many suspect will be released in late May, to a Panda algorithm refresh, or it may simply be that Google is taking further action against link networks. Until we receive official word from Google, we’ll just have to wait and see how it all plays out.

SEO news blog post by @ 12:34 pm on May 20, 2014


A Panda Attack

Google today confirmed that a Panda update is rolling out. I find it odd that, after telling webmasters there would no longer be announcements of Panda updates, they made this announcement; one has to wonder why.

The official message from Google is that this Panda update is softer than previous ones and that new signals have been added. Some webmasters are reporting recoveries from previous updates with this one. I would love to hear feedback from any of our blog readers about changes you may have noticed in your rankings with this latest update.

I’ll publish a follow-up post to this one next week, after we’ve had a chance to evaluate the update.

SEO news blog post by @ 10:42 am on July 18, 2013


Google+ and the Potential Impact on SEO

Although you can only join by invitation at this point, you’ve no doubt heard of Google+, Google’s latest attempt to join (or, in time perhaps, completely overtake?) Facebook and Twitter as a must-have social networking tool. In the months before Google+ launched, Google began implementing the “+1” button, an option for users to signify that they enjoy a particular site or page, in an attempt to gather as much raw data as possible about the popularity and social value of sites and content before Google+ was rolled out to the masses. Preceding the Google+ launch and the +1 button was the introduction of real-time search, which incorporated results from Twitter, blogs and Facebook. Google, it would appear, has realized the immense value of social media and its impact on web search.

Search will continue to have a social element infused into it: the +1 button will change search results, as will live feeds from Google+ pages, much as Facebook “likes” and Twitter “tweets” currently affect search results by influencing user decisions through their value as endorsements of certain sites and content.

Google definitely wants websites to implement the +1 button in their pages so that it can track and measure changes in click-through rates. The +1 button will also be included on all SERPs as well as all Google+ feeds. What this means is that business owners and marketers must ensure that a positive customer experience is, perhaps more than ever before, their primary focus, in the hope that as many users as possible will +1 their site and, in doing so, endorse their business (and by association, its reputation).

While it is plain to see that the introduction of the +1 button was merely a precursor/trial balloon for Google+, the potential impact of the +1 button on search could be the bridge between all of the social oriented sites and tools and ways of doing things on the web and the subsequent influence on search results.

Recently, Rand Fishkin, head of SEOmoz, decided to test some theories about social sites influencing search results. He shared a number of un-indexed URLs via Twitter both before and after Google had unceremoniously aborted its real-time search feature. Fishkin then repeated the process using Google+. He asked his followers on Twitter and Google+ to share the post, with the only caveat being that they were not to share it outside the originating site.

What this yielded in terms of hard data was that even though Google has dropped real-time search, tweets and re-tweets are still assisting page indexation. As for Google+, Fishkin’s test page ended up ranking #1 on Google within a few hours. This illustrates that Google+ can also help pages get indexed, if not quite as quickly as Twitter.

But perhaps the most interesting concept presented by Google+, and one that could potentially have a significant impact on SEO, is the “Google Circles” feature.

The “Circles” feature is interesting because it grants users the ability to share whatever they choose with specific groups, or Circles, of people. As Google+ users build their Circles, they will subsequently be able to see, in Google’s SERPs, the sites that users in their Circles have +1’d. This has enormous potential: users will be far more likely to make a choice or purchase based on the recommendation of people they have invited to their Circles, people whom they know and whose opinions they trust. Most users are going to be far more likely to trust the recommendation of someone they know than a review from a stranger. Over time, Circles will become much more defined as more user data is integrated into them, and using that data to market effectively could be a potentially powerful SEO strategy.

Basically, Google has taken the ideas behind some of its social media competitors’ more influential and successful features in an attempt to make search more about real people. Google+ and the +1 button enable users to influence online activity and, as such, will have an effect on search results. Many experts are already proclaiming that Google+ will have no impact on SEO whatsoever, citing Google Wave and past attempts by Google to get in on the social side of the net as indicators that this new attempt will also fail. While it is far too early to make any definitive statement about the long-term usefulness or impact of Google+ and the +1 button on SEO, citing past failures as the basis for an argument that Google+ is going to fail as well is short-sighted at best. The fact of the matter is that social factors are already intertwined with search, and this is only going to become more prevalent as these sites expand and the way we interact on the internet continues to evolve. Whether Google+ ends up revolutionizing or merely co-existing with established SEO methodology remains to be seen, but the enormous potential of these features and their long-term impact is fairly clear: site ranking methods are changing thanks to the +1 button, and this will likely end up creating an altogether new method of SEO in the future.

SEO news blog post by @ 5:02 pm on August 31, 2011


What In The World Is Net Neutrality?

I had the great pleasure and privilege of speaking at Search Engine Strategies 2008 in San Jose. The topic? Net neutrality. This is the point where your eyes glaze over and the inevitable question, “What is net neutrality?” comes forth. And that’s the point of this article.

The session had very low turnout, and Kevin Ryan, the organizer of SES, was there asking everyone, “What could we do to increase awareness and attendance?” The issue is important to Kevin, important to me and, in fact, important to anyone who makes their living on or uses the Internet.

So What Is Net Neutrality?

Prior to speaking I would periodically get asked, “So, what panel are you on?” When I replied I would generally get a blank stare in return. It appears that the vast majority of the population, even the educated, Internet-savvy population, doesn’t understand the debate over net neutrality (or even know there is a debate to begin with). The fault for this lies with me and others who do understand the importance of the issue but have failed to convey that passion to others.

The idea of net neutrality seems a simple one. It is the idea that all those little 1s and 0s that float around (let’s call them web pages, downloads, videos and emails) should be treated equally and that under no circumstances should the ISPs be allowed to adjust, degrade or otherwise affect them other than to pass them to the requesting party. Seems simple enough right? So what’s the debate?

The debate is hugely complicated, with the providers claiming that they need to manage traffic to provide a solid experience for all, and net neutrality advocates claiming that the ISPs are destined to abuse the ability to manage traffic and that a huge array of issues will likely follow if we give them that ability. Advocates claim that legislation is necessary to ensure that all traffic is treated equally and that no site, download or other traffic source is adversely affected based on its content type, origin or requesting user.

Again, on the surface it seems fairly simple. The “greedy ISPs” are looking to gouge users and degrade our service and we need the government to protect us. At first glance that’s how I saw it too.

Why Does It Matter?

Depending on which camp you’re in, the reasons are different but the message is the same: the wrong decision is going to have widespread effects on how the Internet grows and how users and their 1s and 0s are treated. Basically, this issue will determine the health and growth of the Internet. To be sure, the Internet will survive regardless, but the questions remain:

  • How will we access it?
  • How will it be charged?
  • How fast will it be?
  • How fast will it grow?
  • Who will have access?
  • What will we be able to access?

Basically, the entire future of the Internet is carried on the back of this issue. An issue that most people aren’t even aware of and even those who do know are having trouble determining which side is actually the correct one.

Those opposing net neutrality legislation could refer to it as a solution looking for a problem, showing clear examples of how issues are being dealt with under current legislation and asserting that any additional legislation will restrict future enhancement. The solution itself may become a problem if it is too broad-reaching.

Supporters of legislation would point out that the ISPs are self-serving corporations and that, to that end, consumers need protection; that not taking action now will result in a scenario where abuses take place and there is no institutionalized solution.

Here are the main points of both sides:

In The Debate …

My session at the conference was constructed as a debate, with Cindy Krum from Blue Moon Works moderating. I was up against my good friend and co-host of my radio show Webcology, Jim Hedger of Markland Media. He took the side supporting net neutrality and I opposed it. In truth, we each see both sides of the argument – it’s that kind of issue.

Pro-Net Neutrality Legislation

Jim brought up many good points in his presentation. He illustrated the abuses that have taken place recently, including Comcast’s blocking of torrent seeding. For those of you unfamiliar with torrents, they are a peer-to-peer file-sharing format. Comcast allowed the download of a file; however, once it was downloaded, they blocked the user from seeding it. Comcast claims this was an effort to reduce the effect these users were having on the network.

In August 2008 the FCC stepped in and forced Comcast to cease these actions, enforcing the idea that it could not discriminate based on file type and degrade the service. Jim and other net neutrality advocates claim this as a victory.

In 2007 Verizon blocked pro-abortion text messages to a legitimate list of recipients. No legislation or laws were required, however; public outcry forced a reversal of the policy.

Jim painted a bleak future if net neutrality legislation is not passed: a future where ISPs degrade specific websites, give preferential treatment to other sites based on a payment structure, and promote their own self-interests and web properties by degrading the alternatives. He asserts that smaller businesses would suffer, unable to pay the fees required to compete with the “big boys”.

These are the common concerns among net neutrality advocates.

Anti-Net Neutrality Legislation

I presented the arguments opposing net neutrality legislation. It was a tougher stance, with the room (albeit small) against me from the beginning, but the points are legitimate nonetheless – there are solid concerns about net neutrality legislation. In the end, however, I isolated two main points that are clear.

The first point I brought up was the current legislation. As noted above regarding Comcast, there is existing legislation to protect consumers, and it works. That is where the argument “net neutrality is a solution looking for a problem” comes from.

In fact, some might argue that even the current legislation is too much, and we need only look at the Comcast decision to see why. Comcast is not allowed to block torrent traffic. Because of this, they are looking at other ways to manage bandwidth. The solution they’ve come up with, and that they’ll be toying with next year, is to monitor all users and, when traffic on their network is high, slow the speeds of those using the most.
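The described approach amounts to something like the following sketch. The usage figures, congestion threshold and "slow the top quarter" rule here are all invented for illustration; Comcast's actual system has not been made public.

```python
# Hypothetical illustration of "when the network is congested, slow the
# heaviest users". All numbers and thresholds are made up, not Comcast's.
def throttle(usage_gb, network_load, capacity=0.7, slow_fraction=0.25):
    """Return the set of users to slow when load exceeds capacity."""
    if network_load <= capacity:
        return set()                      # no congestion: nobody is slowed
    ranked = sorted(usage_gb, key=usage_gb.get, reverse=True)
    cutoff = max(1, int(len(ranked) * slow_fraction))
    return set(ranked[:cutoff])           # slow the heaviest fraction of users

users = {"alice": 1.2, "bob": 48.0, "carol": 95.5, "dave": 3.4}
print(throttle(users, network_load=0.9))  # congested → {'carol'}
```

Note that the rule is content-blind: it never asks what Carol is downloading, only how much, which is exactly the trade-off described in the next paragraph.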

What this boils down to is that if I were sitting at home downloading a site, chatting on Skype and maybe surfing a bit, there’s a good chance my access would get slowed down just to protect those downloading movies illegally (and yes, I am aware that torrents are used for legal downloads as well; however, I have a feeling that if their only use were legal, they wouldn’t be a problem for the ISPs).

Legitimate traffic may now well be affected negatively to protect “net neutrality”.

The second point (and probably the less popular of my arguments) was that capitalism and consumer choice in a non-monopolistic market are self-regulating. As we saw with Verizon’s blocking of pro-abortion text messages and its reaction to the public outcry (which was to let them through), the consumer has enormous influence, and when abuses occur, their reaction forces companies to adjust policy.

In the end we will get to choose our providers and the threat of losing business is an excellent motivator.

So Who’s Right?

The problem with asserting who’s right or wrong here is that key information is missing. We’re trying to give an answer when we don’t really know the question. So far the debate is over net neutrality legislation. What is that? What does it cover? How does it read?

Without knowing this it’s difficult to know what we’re really for or against, but the problem is, by the time there’s legislation it’ll likely be too late to back away from it.

It’s also difficult to look at the lists of supporters on each side without having them affect your decision, unless you really think about why they’re there. On the pro side we’ve got companies like Google and Facebook (two friendly “little” companies), and on the against side we’ve got telcos and business organizations (those evil people who just want to make money). In fact, both camps want to make money. Let’s not forget that Google and Facebook may be friendly companies, but both have billions of dollars and rely on the networks. This, and not some altruistic belief in a “free internet”, is the true motivation of these companies. They want to make sure their costs aren’t increased simply because they’re some of the biggest sources of Internet traffic, either directly or indirectly. Their main arguments may or may not be correct; however, one has to understand that each voice has its bias, and we need to understand that bias rather than simply choosing sides based on which camp looks the nicest.

So What’s The Answer?

While I’d love to give you my honest assessment of the situation, the fact is that the more I learn about the net neutrality issue, the less clear the right decision becomes. In discussing this with Cindy and Jim after the debate, we agreed that the biggest needs right now are awareness and a clear definition of what both camps are seeking, what the legislation would look like, and a third-party evaluation of how it would impact the Internet – as well as some real open dialogue, not just banner-waving from both sides.

One thing we do know is that net neutrality legislation would significantly impact the state and future of the Internet – what isn’t terribly clear is how. That’s what we need to know.

The next step in the discussion is public awareness and a serious discussion with both sides and our politicians on the issue. We need to understand exactly what’s at stake, what legislation would look like and how it would impact the ISPs and consumers. We need to look to the future, understand what is coming in the way of bandwidth requirements, and make sure that the average user will have access to the bandwidth they need and that the ISPs are motivated to ensure that it’s there to be had.

My Opinion

When I started preparing for the debate I leaned towards the net neutrality legislation camp. It seemed like the obvious choice; however, the more I learned, the more grey it became. Today I find my leanings favoring the anti-net-neutrality side. When I consider how current legislation has protected consumers adequately thus far, how public opinion has forced complete 180s in other cases, and how clumsily governments tend to create broad, sweeping laws in areas where the offenses are as yet unknown, it seems prudent to support the current state of affairs, at least until a genuine need arises for specific net neutrality legislation that can’t be addressed with current law.

That said, Jim leans to the other side and he too understands the issue and the arguments on both sides.

Two people who have researched the issue extensively, viewing common concerns from both sides, and who, in the end, land on different sides of the fence. Again, it’s that kind of an issue.

We Need

We need an open and honest debate on the issue. We need you involved with the discussion and we need those in government who support net neutrality legislation to stand up and explain what they believe it means and what the legislation would look like.

We need to hear all the points from the ISPs in regards to how the legislation would negatively impact services and future development on infrastructure and we need to hear from the pro-net neutrality camp on exactly what needs to be protected that isn’t already and why.

Until then I’ll continue to speak to less-than-packed rooms at conferences whose attendees are greatly affected by the issue – even if they’re not aware of it.

But at least you are now. Now it’s time to educate yourself further and find out for yourself why this issue is of paramount importance and what you can do to ensure that the Internet remains the highway of information and entertainment that it is, tomorrow and for years to come.

Additional Resources

Save The Internet – Save The Internet is a pro-net neutrality site dedicated to providing information supporting the idea of net neutrality legislation. It’s an excellent resource and required reading for anyone who wants to fully understand the issue.

Hands Off The Internet – Hands Off The Internet is an equally important website explaining the situation from the side of those opposing net neutrality legislation. As with Save The Internet, it is required reading for anyone who wants to fully understand the issues and what’s at stake.

I would warn all readers: this is not an issue to take sides on at face value. Read the two sites noted above and then go further and find blogs, news and other information sources. It’s easy to form a quick, biased opinion on either side, but it’s important that we all understand all the issues and all the risks.

SEO news blog post by @ 12:16 pm on September 22, 2008

Categories: Search Engine News


The Search Landscape Reflected In Paid Results

It’s important to note that this article was written on July 17, 2008. I mention this only to ensure that you can put it into context, and so that those who read it a day, week or month from now aren’t confused by my references to Q2 reports and “today”.

Any of you who have read some of my past articles or visited Beanstalk’s services pages will know I’m not a PPC guy. Quite honestly, it’s not in my primary skill set and it’s something I would definitely prefer to leave to the experts. That said, following Google and its health (which is tied directly to AdWords and AdSense) is something I’m keenly interested in. To this end, recent changes in Google’s paid search display and ranking systems will have huge impacts on advertisers and, more importantly for the purposes of this article, on Google itself.

A couple of weeks ago a friend of mine, Richard Stokes of AdGooroo, sent me a PDF titled “Search Engine Advertiser Update – Q208”. In this document they outline the changing trends in the paid search marketplace, and many of the stats are surprising. If you’re a PPC manager they’re obviously directly important. For those of us in the organic optimization world they are still both interesting and important. They’re interesting for reasons which will become clear below, and they’re important because anything that affects the economic health of the search engines affects the search landscape both inside and outside the paid search realm.

Paid Search Market Share

What could be more important to the engines than their share of the paid search arena? Does Google really care about being the dominant search engine as far as organic search goes? Let me put this a different way: if Google were standing in front of their shareholders, would they prefer to announce that they held 80% of all worldwide searches and reported revenues of $7.8 billion for the quarter, or would they rather stand up and say they hold 20% of all worldwide searches and reported revenues of $8.7 billion? Organic results drive traffic, which in turn results in clicks on paid ads. From a business standpoint, that’s the only reason organic search even matters.

So which engine has the healthiest paid search environment? According to AdGooroo, Q2 results show a different world than one might guess (which is why I noted that it is interesting).

Over the past twelve months advertiser growth (or lack thereof) breaks down as follows:

  • Google: -8.5%
  • Yahoo!: +9.8%
  • MSN: -6.7%

Advertiser counts (i.e. the number of advertisers on each engine) have also changed. Yahoo! leads in this area as well with growth of 0.03%, while Google dropped by 6.4% and MSN dropped by almost 20% (good thing they have their OS revenue to fall back on).

And A Drop In Ads

To go even further, Google has increased the importance of quality, which has resulted in a reduction of nearly 40% in the number of ads that appear on a results page. Six months ago roughly 6.5 ads appeared per page, whereas now that number is closer to 4. This has the potential to significantly help or significantly hinder Google’s revenue.
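The "nearly 40%" figure follows directly from AdGooroo's two per-page averages:

```python
# Average paid ads per Google results page, per AdGooroo's Q2 2008 report.
ads_before = 6.5  # roughly six months prior
ads_after = 4.0   # after the quality-score changes

reduction = (ads_before - ads_after) / ads_before
print(f"Reduction in ads per page: {reduction:.1%}")  # → 38.5%
```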

As Richard Stokes points out and I completely concur, this places Google in an environment where one of two things will happen:

  1. Advertisers will realize that their clicks are converting much higher, search marketers will spend more time and resources creating more and more relevant ads and landing pages and advertisers will be willing to bid more as the conversions increase, or
  2. The competition for the top spots will be reduced and so too will the average bid prices.

Google’s Q2 Report

What actually inspired the writing of this article was the release of Google’s Q2 report earlier today. After reading it I immediately contacted Richard to let him know that the results confirmed some of the predictions in his work. He writes:

“… the auction-based bidding system makes this a double-edged sword. As the number of advertisers declines, so does the competitive pressure for higher bid prices. If advertisers don’t step up to the plate and bid more aggressively for placement, then it’s possible that search revenues could stagnate.”

Google’s revenues were up only 3% over Q1 of this year, and revenue from paid clicks was down 1%. This is the first time post-IPO that I can remember Google showing a quarter-over-quarter reduction in revenue. It appears that this new paid search model is not quite as effective at pulling in money as the old one.

Now, to be fair, the system of requiring higher quality scores and better ads and landing pages is new – only a few months old at this point – and so there are likely still bugs to be worked out. But Wall Street did not react favorably to the announcements today, and I suspect the situation isn’t going to look better for Google at the close of day tomorrow (though what do I know about stocks).

What Does This Mean?

So what does this mean? It means that Google has a lot of work to do, and those in the paid search space need to pay close attention (even closer than normal), as shareholders don’t like to see losses and Google is going to need to make moves to recover and show significant gains by the time their Q3 reports come out.

One might guess that this also means Yahoo! is gaining ground (which is true), but it’s definitely a case of too little, too late. Also earlier today (it was a busy day in search), Yahoo! released a letter to its shareholders that on one hand referred to the alliance between Microsoft and Carl Icahn as a destroyer of shareholder value for Yahoo!, and then went on to say that they would be willing to sell the company to Microsoft at $33/share (which is what Microsoft had offered previously, and which is more than $10 above Yahoo!’s current market value).

One can’t look at the stronger relative results Yahoo! has achieved in paid search as a win when they seem to be backsliding on their initial position regarding the sale to Microsoft.

So Where Do We Go From Here?

For one thing, watch closely. Monitor resources such as AdGooroo’s research library and the Clix Marketing blog. Pay close attention, as we’re going to see a lot of changes, and these changes are likely to affect both the paid and the organic results as Google strives to provide the better results they’re now targeting through paid search while at the same time increasing their revenue.

This may involve adjustments to the quality scoring (I can pretty much guarantee that one) and may involve adjusting how the paid ads appear on the page with the organic results. All we can really do is watch, wait and adapt.

Note: a big thanks goes out to Richard Stokes and the AdGooroo team for providing the research and stats behind this article. Your keyword research tool and competition analysis capabilities are awesome!

SEO news blog post by @ 11:10 am on July 30, 2008

Categories: Search Engine News


Anatomy Of An Internet Search Engine

For some unfortunate souls, SEO is simply the learning of tricks and techniques that, according to their understanding, should propel their site into the top rankings on the major search engines. This understanding of the way SEO works can be effective for a time; however, it contains one basic flaw: the rules change. Search engines are in a constant state of evolution in order to keep up with SEOs, in much the same way that Norton, McAfee, AVG and the other anti-virus software companies are constantly trying to keep up with virus writers.

Basing your entire website’s future on one simple set of rules (read: tricks) about how the search engines will rank your site contains an additional flaw: there are more factors being considered than any SEO is aware of and can confirm. That’s right, I will freely admit that there are factors at work that I may not be aware of, and even for those I am aware of, I cannot with 100% accuracy give you the exact weight they carry in the overall algorithm. Even if I could, the algorithm would change a few weeks later and, what’s more – hold on to your hats for this one – there is more than one search engine.

So if we cannot base our optimization on a set of hard-and-fast rules, what can we do? The key, my friends, is not to understand the tricks but rather what they accomplish. Reflecting back on my high school math teacher, Mr. Barry Nicholl, I recall a silly story that had a great impact. One weekend he had the entire class watch Dumbo the Flying Elephant (there was actually going to be a question about it on our test). Why? The lesson we were to take from it is that formulas (like tricks) are the feather in the story: they are unnecessary, and yet we hold on to them in the false belief that it is the feather that works and not the logic. Indeed, it is not the tricks and techniques themselves that work but rather the logic they follow, and that is their shortcoming.

And So What Is Necessary?

To rank a website highly, and keep it ranking over time, one must optimize it with one primary understanding: that a search engine is a living thing. Obviously this is not to say that search engines have brains – I will leave those tales to Orson Scott Card and other science fiction writers – however, their very nature results in a lifelike being with far more storage capacity.

If we consider for a moment how a search engine functions: it goes out into the world, follows the road signs and paths to get where it’s going, and collects all of the information in its path. From this point the information is sent back to a group of servers, where algorithms are applied in order to determine the importance of specific documents. How are these algorithms generated? They are created by human beings who have a great deal of experience in understanding the fundamentals of the Internet and the documents it contains, and who also have the capacity to learn from their mistakes and update the algorithms accordingly. Essentially, we have an entity that collects data, stores it, and then sorts through it to determine what’s important (which it’s happy to share with others) and what’s unimportant (which it keeps tucked away).
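That collect-store-sort cycle can be sketched in miniature. The documents below are made up, and the scoring rule (raw term frequency) is a deliberately naive stand-in; real engines weigh hundreds of signals, and nothing here represents Google's actual algorithm.

```python
# Toy "search engine brain": store crawled text in an inverted index,
# then sort documents by a naive relevance score (term frequency).
from collections import defaultdict

documents = {  # hypothetical crawled pages
    "page1": "search engines crawl links and index pages",
    "page2": "seo means optimizing pages for search engines",
    "page3": "pandas eat bamboo",
}

# Store: map each word to the documents containing it, with counts.
index = defaultdict(dict)
for doc_id, text in documents.items():
    for word in text.lower().split():
        index[word][doc_id] = index[word].get(doc_id, 0) + 1

def rank(query):
    """Sort documents by how often the query terms appear in them."""
    scores = defaultdict(int)
    for word in query.lower().split():
        for doc_id, count in index.get(word, {}).items():
            scores[doc_id] += count
    return sorted(scores, key=scores.get, reverse=True)

print(rank("search engines"))  # → ['page1', 'page2']
```

The humans in the loop are the ones who decide what `rank` should reward; the engine merely applies that judgment at scale.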

So Let’s Break It Down …

To gain a true understanding of what a search engine is, it's simple enough to compare it to the human anatomy: though not breathing, it contains many of the same core functions required for life. These are:

The Lungs & Other Vital Organs – The lungs and, indeed, the vast majority of a search engine's vital organs are contained within the datacenters in which it is housed, be it in the form of power, Internet connectivity, etc. As with the human body, we do not generally consider these important in defining who we are; however, we're certainly grateful to have them and need them all to function properly.

The Arms & Legs – Think of the links out from the engine itself as its arms and legs: the vehicles by which it gets where it needs to go and retrieves what needs to be accessed. While we don't commonly think of these functions when considering SEO, they are the purpose of the entire thing. Much as the human body is designed primarily to keep you mobile and able to reach other things, the entire search engine is designed primarily to access the outside world.

The Eyes – The eyes of the search engine are the spiders (AKA robots or crawlers): the programs the search engines send out over the Internet to retrieve documents. The spiders of all the major search engines crawl from one page to another by following links, much as you would look down the various paths along your way. Fortunately for the spiders, they travel mainly over fiber-optic connections; moving at nearly light speed, they can visit every path they come across, whereas we mere humans have to be a bit more selective.
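The crawling behavior described above is, at heart, a simple graph traversal. Here is a minimal sketch of a spider walking an invented in-memory "web" (the URLs and link structure are purely illustrative), following every link it finds exactly once:

```python
from collections import deque

# A toy spider: follows links breadth-first, visiting each page once.
# `site` maps each URL to the outbound links found on that page.
def spider(site, start):
    seen = set()
    frontier = deque([start])
    order = []
    while frontier:
        url = frontier.popleft()
        if url in seen or url not in site:
            continue                 # already visited, or a dead link
        seen.add(url)
        order.append(url)            # document retrieved, sent back for indexing
        frontier.extend(site[url])   # follow every path we come across
    return order

site = {
    "/": ["/about", "/blog"],
    "/about": ["/"],
    "/blog": ["/blog/post-1"],
    "/blog/post-1": [],
}
print(spider(site, "/"))  # → ['/', '/about', '/blog', '/blog/post-1']
```

Note the practical lesson for SEO: a page no link points to is a path the spider's eyes never see.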

The Brain – The brain of a search engine, like the human brain, is the most complex of its functions and components. To function properly, the brain must have instinct, must know, and must learn. A search engine (and by search engine we mean the natural listings of the major engines) must also include these three critical components in order to survive.

The Instinct – The instinct of a search engine is defined in its core functions: the crawling of sites, and either the inability to read specific types of data or the programmed response to ignore files meeting specific criteria. Even the programmed responses become automated by the engines and thus fall under the category of instinct, in much the same way that the westernized human instinct to jump away from a large spider is learned: an infant would probably watch the spider, or even eat it, so this is not an innate human reaction.

The instinct of a search engine is important to understand; however, once you know what can and cannot be read and how the spiders will crawl a site, this will become instinct for you too and can safely be stored in the "autopilot" part of your brain.
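One concrete example of that "programmed response to ignore files meeting specific criteria" is the robots exclusion protocol (robots.txt). Python's standard library ships a parser for exactly these rules, so we can sketch the instinct directly; the rules and URLs below are invented for illustration:

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt rules: any spider must skip /private/.
rules = [
    "User-agent: *",
    "Disallow: /private/",
]
rp = RobotFileParser()
rp.parse(rules)

# The spider's "instinct" check before fetching a document:
print(rp.can_fetch("*", "http://example.com/index.html"))     # True
print(rp.can_fetch("*", "http://example.com/private/a.html")) # False
```

Once a rule like this is programmed in, the engine never reconsiders it per page; it is simply how the spider behaves, which is why it belongs under instinct rather than learning.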

The Knowing – Search engines know by crawling, and what they know goes far beyond what is commonly perceived by most users, webmasters, and SEOs. While the vast storehouse we call the Internet provides billions upon billions of pages of data for the search engines to know, they also pick up more than that. Search engines know a number of different methods of storing data, presenting data, prioritizing data and, of course, ways of tricking the engines themselves.

While the search engine spiders crawl the web, they grab the stores of data that exist and send them back to the datacenters, where the information is processed through the existing algorithms and sp@m filters and attains a ranking based on the engine's current understanding of the way the Internet and the documents contained within it work.

Similar to the way we process a newspaper article based on our current understanding of the world, the search engines process and rank documents based on what they understand to be true about the way documents are organized on the Internet.
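The processing step described above can be pictured as building an inverted index from the documents the spiders returned, then scoring documents against a query. This is a deliberately naive sketch (raw term counts, invented page names), nothing like the real algorithms, but the shape of the work is the same:

```python
from collections import defaultdict

# Documents the spiders brought back (illustrative).
docs = {
    "page-a": "widget reviews and widget comparisons",
    "page-b": "gardening tips",
    "page-c": "where to buy a widget",
}

# Build an inverted index: term -> {doc: occurrence count}.
index = defaultdict(dict)
for doc_id, text in docs.items():
    for term in text.lower().split():
        index[term][doc_id] = index[term].get(doc_id, 0) + 1

# "Ranking": a naive algorithm that orders documents by term count.
def search(term):
    hits = index.get(term.lower(), {})
    return sorted(hits, key=hits.get, reverse=True)

print(search("widget"))  # → ['page-a', 'page-c']
```

The engine's "current understanding" lives in the scoring function; when the understanding changes, only that function changes, while the stored index stays put.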

The Learning – Once it is understood that search engines rank documents based on a specific understanding of the way the Internet functions, it follows that a search engine must have the ability to "learn": it must ensure that new document types and technologies can be read, and its algorithms must change as new understandings of how the Internet functions are uncovered.

Aside from needing the ability to properly spider documents stored in newer technologies, search engines must also be able to detect and accurately penalize sp@m, as well as accurately rank websites based on new understandings of the way documents are organized and links are arranged. Examples of areas where search engines must learn on an ongoing basis include, but are most certainly not limited to:

  • Understanding the relevancy of the content between sites where a link is found
  • Attaining the ability to view the content of documents contained within new technologies such as new database types, Flash, etc.
  • Understanding the various methods used to hide text, links, etc. in order to penalize sites engaging in these tactics
  • Learning, from current results and any shortcomings in them, what tweaks to current algorithms or what additional considerations must be taken into account to improve the relevancy of results in the future.

The learning of a search engine generally comes from the uber-geeks the engines hire and from the engines' users. Once a factor is taken into account and programmed into the algorithm, it then moves into the "knowing" category until the next round of updates.

How This Helps in SEO

This is the point at which you may be asking yourself, "This is all well and good, but exactly how does this help ME?" An understanding of how search engines function, how they learn, and how they live is one of the most important understandings you can have when optimizing a website. It ensures that you don't simply apply random tricks in the hope that you've listened to the right person in the forums that day, but instead ask: what is the search engine trying to do, and does this tactic fit with the long-term goals of the engine?

For a while, keyword-density sp@mming was all the rage among the less ethical SEOs, as was building networks of websites that link together in order to boost link popularity. Neither of these tactics works today. Why? They do not fit with the long-term goals of the search engines. Search engines, like humans, want to survive; if the results they provide are poor, the engine will die a slow but steady death, and so they evolve.
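It is easy to see why density sp@mming was doomed once you can measure it: a stuffed page is a statistical outlier. A tiny sketch (the 15% threshold and the sample sentences are invented purely for illustration, not any engine's real filter):

```python
# Keyword density: fraction of words on a page that are the keyword.
def keyword_density(text, keyword):
    words = text.lower().split()
    return words.count(keyword.lower()) / len(words)

natural = "our widgets ship worldwide and every widget carries a warranty"
stuffed = "widget widget cheap widget best widget buy widget now widget"

print(round(keyword_density(natural, "widget"), 2))  # 0.1
print(keyword_density(stuffed, "widget") > 0.15)     # True -> flag as suspect
```

A filter this crude would have false positives, of course; the point is only that a tactic built on an abnormal signal hands the engine an easy way to learn to detect it.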

When considering any tactic you must ask: does this fit with the long-term goals of the engine? Does this tactic, in general, serve to provide better results for the largest number of searches? If the answer is yes, the tactic is sound.

For example, the overall relevancy of your website (i.e. whether the majority of your content focuses on a single subject) has become more important over the past year or so. Does this help the searcher? On larger sites with larger amounts of related content, the searcher will find more content on the subject they searched for, so this shift does help the searcher overall. A tactic that adds more content to your site is thus a solid one, as it builds the overall relevancy of your website and gives visitors more, and more current, information at their disposal once they arrive.

Another example is link building. Reciprocal links are becoming less relevant, and reciprocal links between unrelated sites are virtually irrelevant. If you are engaging in reciprocal link building, ensure that the sites you link to are related to your site's content. As a search engine, I would want to know that a site in my results also provides links to other related sites, increasing the chance that the searcher finds the information they're looking for one way or another, without having to switch to a different search engine.

In Short

In short, think ahead. Understand that search engines are organic beings that will continue to evolve. Feed them well when they visit your site and they will return often and reward your efforts. Use unethical tactics and you may hold a good position for a while, but in the end, if you do not use tactics that provide good overall results, you will not hold your position for long. They will learn.

SEO news blog post by @ 5:30 pm on June 8, 2005

Categories:Search Engine News


Copyright© 2004-2014
Beanstalk Search Engine Optimization, Inc.
All rights reserved.