
Beanstalk's Internet Marketing Blog

At Beanstalk Search Engine Optimization we know that knowledge is power. That's the reason we started this Internet marketing blog back in 2005. We know that the better informed our visitors are, the better the decisions they will make for their websites and their online businesses. We hope you enjoy your stay and find the news, tips and ideas contained within this blog useful.


July 5, 2012

Particle Physics and Search Engines

If you’ve been hiding under a rock, you may not have heard the news of the ‘God Particle’ discovery.

As someone who is fairly scientific, I look at this as more of a proof of concept than a discovery, and ‘God’ really needs to give Peter Higgs some credit for his theories.

I won’t dwell on the news surrounding the Higgs boson confirmation, but there is a parallel here: collisions have a way of revealing things that were previously unseen.

When Search Engines Collide

It’s been some time since Bing and Yahoo joined forces on search, so the data sets should be the same, right?

No. That would really be a wasted opportunity, and Microsoft is clearly smarter than that.

By not merging the search data or algorithms of Bing and Yahoo, Microsoft can now experiment with different updates and ranking philosophies without putting all its eggs in one basket.

An active/healthy SEO will be watching updates to search algorithms from as many perspectives as possible, which means a variety of sites on a variety of topics tracked on a variety of search engines.

Say a site gets a ton of extra 301 links from partner sites, and this improves traffic and rankings on Bing, leaves rankings stable on Yahoo, and causes a drop in traffic on Google.

It’s possible to say that the drop on Google was related to a ton of different factors: untrusted links, link spam, dilution of keyword relevance, anchor text spamming, you name it. This is because Google is always updating and always keeping us on our toes.

Bring on the data…

Let’s now take the data from Bing and Yahoo into consideration and look at what we know of recent algo changes on those search engines. This ‘collision’ of data still leaves us with unseen factors, but it gives us more to go on.

Since Bing has followed Google on some of the recent updates, the upswing on Bing for keyword positions would hint that it’s neither a dilution of relevance nor spamming on the keywords/anchor text.

Stability on Yahoo is largely unremarkable if you check the crawl info and cache dates. It’s likely just late to the game and you can’t bet the farm on this info.

What about the other engines? Without paying a penny for the data, we can fetch Blekko and DDG (DuckDuckGo) ranking history to see what changes have occurred to rankings on these engines.

Since Blekko is currently well known to be on the warpath against duplicate content, and they are starving for fresh crawl data, a rankings drop on that service can be very informative, especially if the data from the other search engines helps to eliminate key ranking factors.

In the case of our current example, I’d narrow down the list of ranking factors that changed with the last ‘Penguin’ update, contrast those with the data from the other engines, and probably suspect (in this example) that Google is seeing duplication from the 301s, something Bing wouldn’t yet exhibit but Blekko would punish as badly as or worse than Google.

The next step would be to check for issues of authority for the page content. Is there authorship mark-up and a reciprocal setup on the author’s end that helps establish the trust of the main site content? Does the site have the proper verified entries in Google WMT to pass authority? Barring WMT flags, what about a dynamic canonical tag in the header, even as a test, if it’s not already set up?
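For reference, here is roughly what that mark-up looked like at the time; the URLs below are placeholders rather than real profiles:

    <!-- in the <head> of the content page (example URLs only) -->
    <link rel="canonical" href="http://www.example.com/original-article/" />
    <link rel="author" href="https://plus.google.com/112233445566778899001/posts" />

The reciprocal half is done on the author’s end: the Google+ profile lists example.com under its "Contributor to" section, which is what lets Google tie the content back to a verified author.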

Start making small changes, watch the results, and be patient. If you’re not gaming Google and you’ve done something accidental to cause a drop in rankings, you need to think your way through the repairs step by step.

It’s not easy to evaluate, but the more data you can mash up, and the better you understand that data, the quicker you can troubleshoot ranking issues and ensure that your efforts are going to produce gains.

SEO news blog post by @ 12:12 pm


May 24, 2012

Yahoo Axis – What the Flock?

I had a friend working on the Flock browser team right up until it lost momentum and it became clear that it was too much, too soon…

[Image: Amy Winehouse - too soon?]

Here we go again with a new ‘all-in-one’ web browser concept, this time from a very big name.

Update: Turns out that the leaks were really just rumors. This hype mill is a ‘googol’ times more intense than it should be, considering this is ‘just a plugin’ (unless you count Apple devices).

 

[Image: Paul Rudd doing a double take - Yahoo..? New?!?]

Microsoft powers Yahoo’s search, right? So if Yahoo is releasing a new browser plus a suite of browser plugins for people who refuse to switch browsers, what’s going on?

Well, apparently giving people the option to ‘choose’ MSN/Bing/Yahoo wasn’t working out so well. Now you can run a browser or a plugin that removes that annoying hassle of choosing whose search services you are using.

Y’know how Firefox and Chrome allow you to sign in to your browser, letting you seamlessly move from one location to the next? Yeah, Axis is going to break ground and re-invent the web by also doing that same thing.

Y’know how Google is showing you previews of the sites you’re considering visiting within the search results? Yep, Axis will finally let you do that, again.

Is this even a new browser or just IE9 with some ‘fluff’ and Yahoo branding? Tonight we will get a chance to try it hands-on and find out, but for now we have a few videos we can watch over on Yahoo Video.

One of the points my economics teacher used to hammer home was to view each promotion as the promoter relating to their target audience.

If you have a good product with a smart client base, you can sell your product by focusing on real traits and strengths. Just demonstrate the product and avoid all pointless elements that distract the consumer from your product information.

Enjoy those videos and the clever/unique symbolism that hasn’t been copied too many times since Apple used it in 1984. :)

Does this mean Bing/Yahoo rankings will be important?

Who ever said they weren’t important? Okay, expert opinions aside, you should never burn the Bing bridge, especially not with cell phones that default to Bing and new versions of Windows that also default to Bing.

It’s never wise to put all your eggs in one basket, and this is true of search engine placement/rankings as well as eggs.

Even if Yahoo Axis only manages a week of public attention, that’s one week of people around the planet searching Bing for a change.

If you rank really well on Google, we’re not going to suggest you intentionally tank your rankings for a short-term gain on Bing. The cost of recovering from such a move would probably be far more than simply paying for some pay-per-click coverage via Microsoft’s AdCenter.

There are already folks worried about ‘Yahoo’ impressions vs. Bing impressions, and the following advice has been posted in the AdCenter help forum:

1) You are currently bidding on broad match only, add phrase and exact match to your bidding structure.
2) Look at keywords with low quality score and optimize for those specifically.
3) Install the MAI tool and check on expected traffic for adCenter, you can also see what average bids are for specific positions.

Only 7 Days Left!


Talk about old news? I mentioned this just 2 days ago?!

We still have 7 days left in our Beanstalk Minecraft Map Competition! Check it out, and even if you’re not entering, please let others know it’s coming to a close and we need all submissions by the 31st!

SEO news blog post by @ 10:03 am


January 17, 2012

Surviving the SOPA Blackout

Tomorrow, January 18th, is SOPA blackout day, and lots of very popular sites have committed to participate in the blackout.
[Image: SOPA blackout cartoon]
How can web companies such as SEOs, and supporters like us, maintain workflow in the midst of a major blackout?

We’ve got some tips!

I need to find things mid-blackout!

While some sites will be partially blacked out, a lot of the larger sites will be completely offline in terms of content for maximum effect.

This means that during the blackout folks will have to turn to caches to find information on the blacked out sites.

If Google and the Internet Archive both stay online during the blackout, you can use them to get cached copies of most sites.
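For example, here are two quick ways to pull up a cached copy of a page (the Wikipedia URL is just a stand-in for whichever blacked-out page you need):

    cache:en.wikipedia.org/wiki/SOPA
    http://web.archive.org/web/*/en.wikipedia.org/wiki/SOPA

The first is typed straight into Google search and returns Google’s stored copy of the page; the second brings up the Internet Archive’s list of Wayback Machine snapshots for it.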

If you’re not sure how you’d still find the information on Google, here’s a short video created by our CEO Dave Davies to help you along. :)

I want to participate without killing my SEO campaign!

If all your back-links suddenly don’t work, or they all 301 to the same page for a day, how will that affect your rankings?

Major sites get crawled constantly; even 30 minutes of downtime could get noticed by the crawlers.

A smaller site that gets crawled once a week would run a very low risk doing a blackout for the daytime hours of the 18th.

Further to that, you could also look at user agent detection and sort out people from crawlers, only blacking out the human traffic.
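Here’s a rough client-side sketch of that idea. The bot pattern is only illustrative, and a server-side check of the User-Agent header would be the more reliable route, since most crawlers don’t execute JavaScript in the first place:

    <script type="text/javascript">
    // Load the blackout overlay only for visitors that don't identify as a known crawler.
    // The pattern below is illustrative, not exhaustive.
    var isBot = /googlebot|bingbot|slurp|duckduckbot|baiduspider/i.test(navigator.userAgent);
    if (!isBot) {
        var s = document.createElement('script');
        s.src = '//js.sopablackout.org/sopablackout.js';
        document.getElementsByTagName('head')[0].appendChild(s);
    }
    </script>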

If that seems rather complex, there are two automated solutions already offered:

    • sopablackout.org is offering a JS file you can include that will black out visitors to the site and then let them click anywhere to continue.
      Simply putting this code in a main include (like a header or banner) will do the trick:
      <script type="text/javascript" src="//js.sopablackout.org/sopablackout.js"></script>

 

  • Get a SOPA plugin for your WordPress blog and participate without shutting down your site. It simply invokes the above JavaScript on the 18th automagically, so that visitors get the message and can then continue on to the blog.

I’d be a rotten SEO if I suggested you install an external JavaScript without also clearly telling folks to REMOVE it when you are done. It might be a bit paranoid, but I live by the better-safe-than-sorry rule. Plus, just because you are paranoid, it doesn’t mean people aren’t trying to track your visitors. :)

How’s Chia Bart doing? Well, I think he’s having a mid-life crisis right now, because he looks more like the Hulkster than Bart.

[Image: Chia Bart number 5 - Pastamania!]
To all my little Bartmaniacs, drink your water, get lots of sunlight, and you will never go wrong!

SEO news blog post by @ 11:28 am


October 11, 2011

What word to use for anchor text?

As a well-connected SEO, I digest a lot of publications from the web, and I try to limit my opinions to factual results, either from real-world feedback or from controlled tests. Google is constantly evolving and improving itself to render the best search results possible, or at least better search results than the competition.

Considering where Google was in terms of just hardware in 1999, things certainly keep changing:

[Image: Evolution of Google - the first Google server]

On Monday SEOmoz published a small test they did to gauge the importance of keywords in the anchor text of links. The test is discussed in detail over on SEOmoz, but the result was rather straightforward.

In a nutshell, they took 3 new, roughly equivalent sites and tried to build some controlled links to the sites using three different approaches:

  1. Build links with just ‘click here’ text
  2. Build links with the same main keyword phrase
  3. Build links with random components of the main keyword phrase

Obviously the test is a bit broken, because if you don’t have existing keyword relevance for a phrase, you should build relevance with keywords in the anchors. When Google is sorting out who will be ranked #1 for a query about candies, the site linked to with relevant keywords should always rank higher than a site with links like “click here” or “this site” which aren’t relevant. The only exception would be a situation where the links seem excessive or ‘spammy’, which may result in Google not counting any of the similar links towards relevance.

Outside of a clean test environment we know the best results would be a blend of all three types, with a bit of brand linking mixed in to avoid losing focus on brand keywords. A well established site with a healthy user base will constantly be establishing brand due to all the time on site and click-through traffic for that brand.

e.g. If I search for “Sears” and click on the first link only to find it’s a competitor, I’d hit back and find the right link to click. In most cases Google is watching and learning from that process, so brand links aren’t going to be a necessity once a site is quite popular, and the percentage of brand links wouldn’t need to be much at all.

Kudos to SEOmoz for publishing some of their SEO test info regardless of how experimental it was. We’re constantly putting Google’s updates to the test, and it’s often very hard to publish the results in such a clinical fashion for all to see. We will always make an attempt to blog on the topics we’re testing, but it’s still on the to-do list to publish more of the data.

SEO news blog post by @ 11:56 am


October 3, 2011

Why Article Spinning Has Spun its Last Spin

Anyone who has been involved with SEO for any length of time will undoubtedly be familiar with, or at least know of, article spinning. If not, article spinning is a "black hat" tactic where you write a single article and submit it to hundreds of article submission sites.


Spinning software is typically used to spin the content of the article, replacing specified keywords with synonyms while keeping about 90% of the content the same. Some spinner software employs automation functionality that can help bypass websites' CAPTCHAs to further streamline the submission of the spun articles. As with all grey or black hat tactics, sites may experience temporary gains in rankings or traffic, but it is only a matter of time before they incur the wrath of the Google Panda and are penalized.

Before the release of the Panda algorithm update at the beginning of 2011 and its subsequent refreshes, this was a widely used tactic amongst less-than-reputable SEOs and website owners as a method of garnering backlinks and rankings for their sites. It is now one of the worst tactics any legitimate website can utilize in the post-Panda web, and one that gives the rest of the SEO industry a bad name.

Article spinning was a major contributing factor in inundating the SERPs with webspam and garbage results. Not only did it make it difficult to conduct proper searches at the time, but it pushed legitimate search results further back in the SERPs, making good results and sites hard to find.

Many site owners and SEOs used article spinning as an easy road to improved rankings, and for a time they were very successful. However, one of the main objectives of Panda was to attack duplicate, low-value and spun content on the web in an effort to clean the garbage out of the SERPs, and Panda has become exceedingly efficient at doing so. Even large corporations and many industry leaders have incurred severe penalties by utilizing these practices. Panda forced websites to produce higher-quality, well-written content for their readers and not for rankings per se.

Duplicate content is never useful. It is as useful as going to a library only to find that all the books are the same. Syndicating the same duplicate, spun content repeatedly with the same anchor text sends up a huge red flag to the search engines and informs them that you are trying to spam the system, which will lead to substantial penalties.

It is vitally important to vary the keyword anchor text between every unique article that you write and syndicate. It is no longer worthwhile to submit your quality articles to hundreds of article sites. Choose no more than five of the most relevant, high-trust sites to submit your article to, as Google will only give credit to the first one it finds.

If you are still using these tactics and are using software to spin and distribute your content: stop. Now. There is no place for this kind of low-quality, duplicate content in a post-Panda internet, especially when there are so many other legitimate and organic tactics that will improve your rankings and drive traffic to your site.

Become an expert in your field and write for your audience. Use compelling subject matter and well-written content to entice your readers to not only read what you write, but to keep coming back for more. More importantly, create the type of content that your readers want to share with their friends and colleagues.

Speak with your SEO company for ideas on how to appeal to the market you are trying to reach and how to create the type of content that will keep your readers wanting more.

SEO news blog post by @ 12:23 pm


April 19, 2011

Panda Puts “Hit” on ciao.co.uk

In a follow-up to the post I published yesterday on the Top 20 “Losers” from Google’s Panda UK update, one of the worst-hit companies was Ciao.co.uk, a Microsoft-owned company that was leading an EU competition case against Google. Accusations from Microsoft state that Google is purposely using the Panda algorithm update to attack Ciao in an effort to reduce its rankings.

Ciao.co.uk was involved in initiating an EU investigation into Google in November 2010. Microsoft claims that Google has used its dominant position to limit rivals’ products. The Panda update was designed to lower the overall positioning of content farms and other low-quality websites and is part of a larger effort to reduce the amount of webspam that has permeated the search results for years.

Google’s head of search evaluation, Scott Huffman, said it was “almost absurd” to suggest that the results were rigged. Of course, “almost absurd” is not quite the same as “completely absurd.” Google and Microsoft have a great deal of animosity towards each other and are no strangers to the enmity that has existed between the two corporations for years.

Looking at the list of sites that have been negatively affected by Panda, most sites on the list appear to have been legitimately penalized. Panda was specifically designed to target product comparison sites, review sites and voucher code sites, and Ciao is no different.

After taking just a cursory look at the Ciao website, the site is found to publish duplicate reviews on multiple pages and sites. Ciao is continually regurgitating massive amounts of content, which is exactly what Panda was targeting sites for. One of the reviews on the site that I checked was republished in its entirety on over 30 individual pages and on no fewer than 3 other websites.

Majestic SEO reports 23,200,000 backlinks coming from 63,000 unique domains, which is an average of 368 links from each domain. Even looking at just the single domain http://www.ciao.co.uk/, there are 157,049 backlinks coming from 1,027 unique domains.

That averages 153 links from each domain (157,049 / 1,027 ≈ 153).

From the backlinks analyzed in Majestic, these were the IP blocks accounting for over 10,000 incoming backlinks each:

IP Block           # of Links
92.122.217.*          109,721
94.245.123.*           45,810
65.55.17.*             45,588
69.175.60.*            32,634
66.216.1.*             28,385
207.218.202.*          21,540
212.227.159.*          13,800
178.79.137.*           11,100
95.154.211.*           10,428
69.163.188.*           10,266

One of the worst offenders was http://small-business-service.com/, which has over 10,974 links pointing to ciao.co.uk from a single IP.

On the site, a visitor can see the huge proliferation of spammy, low-quality links that this site engages in. The total number of links to all pages on the Ciao domain, including subdomains and redirects, is even more astonishing:

Pages Indexed: 19,174,884
# of Backlinks: 23,199,785
# of Unique Domains: 62,886

It would appear that the newest iteration of the Panda algorithm update from Google is doing a great job of catching low-quality sites and dealing with them quite justly. The new algorithm certainly needs some tweaking, as many quality sites took penalties as well.

As lesser-quality sites are displaced, those sites that do offer a quality user experience, use legitimate linking strategies and offer quality content will begin to see their rankings increase.

Beanstalk is currently in the process of testing organic vs. non-organic strategies in an attempt to challenge the effectiveness of Panda’s filtering capabilities. Watch for our three-part blog series on this topic coming soon!

SEO news blog post by @ 10:16 pm


April 14, 2011

Surviving the Panda-mic

As most of us know, the Panda update, launched by Google in the US in February and this week in the UK, has caused a lot of confusion, a lot of ranking drops and a lot of people scratching their heads wondering what to do to recover from it.

Panda was designed to attack sites that spit out and aggregate low-quality content based on the most searched keywords on Google. The update caused a lot of shifts in the search results and helped to remove a lot of spam farms from the first page of search results. This was great for publishers who were honestly trying to produce quality content. We also saw many splogs removed from Google’s index and many spun-content sites lose their rankings, which in turn moved more legitimate sites up in the rankings.

I have put together a few tips for webmasters that may help to offset the effects of the Panda and should help repair the loss in rankings.

  • We know that sites with duplicate content got hammered by the update. Produce only original, high-quality, editorial or fact-based content.
  • Domain age is important. Do not switch domain names if possible. If you do need to register a new site, then go for keyword-specific terms that directly relate to your industry.
  • Google has clearly stated that social media is becoming increasingly important. Sites that were tied to Facebook, Twitter and LinkedIn accounts fared better.
  • Sites with embedded video content seem to do better.

Sometimes the best approach is to make the most of a situation. To get the most from Panda, try the following:

  • Install and utilize a blog on your site. Write fresh, quality content at least 2-3 times a week. This causes the Google bots to closely monitor your site for new updates.
  • Add in feeds from your social networking accounts. The more links you get coming in from Facebook, Twitter and other social sites, the better.

For those sites that took a large ranking hit from the Panda, try some of the following recommendations.

  • Don’t ignore your rankings in other popular search engines such as Yahoo and Bing. The ranking drop you experienced in Google should not have affected your rankings elsewhere.
  • Setup a Google Webmaster Tools account and use it to analyze each section of your website. This tool not only helps you analyze and correct problems, but it also gives you a clear indication of the factors that Google is looking at when assessing your site.
  • Study and ensure that your site adheres to Google’s well established quality guidelines.

Once you have completed these steps, and you are certain that you have performed an exhaustive and thorough repair of your site, you can ask Google to take another look by filing a reconsideration request.

Panda is by far the largest, most far-reaching change to the algorithm in the last decade. Reports indicate that as much as 16% of all search queries have been affected. By keeping abreast of the guidelines established by Google and employing best practices, you should be able to recoup your losses and regain your former ranking status.

SEO news blog post by @ 6:44 pm


April 7, 2011

Refuting Debunked SEO Practices

I came across an interesting blog post from ISEdb.com titled "16 SEO Tactics That Will NOT Bring Targeted Google Visitors" in which Jill Whalen discusses strategies that she feels are no longer valid SEO tactics. I have reposted some of the points here and added my comments on each. Jill’s points are quoted below.

Individually these tactics amount to very little; on this point I agree. However, add them together and they become significant to your rankings. Being so absolutely "Google-centric" in your tactics is going to hurt you in the long run. Suppose there was no Google (scary, I know): then you would have to redesign your sites for other search engines that may put more weight on these signals.

Meta Keywords:

"Lord help us! I thought I was done discussing the ole meta keywords tag in 1999, but today in 2011 I encounter people with websites who still think this is an important SEO tactic. My guess is it’s easier to fill out a keyword meta tag than to do the SEO procedures that do matter. Suffice it to say, the meta keyword tag is completely and utterly useless for SEO purposes when it comes to all the major search engines and it always will be."

There is sufficient evidence to show that Yahoo and Bing do use the keywords tag to help categorize and index pages. Google has been clear that they do not use the meta keywords tag as a ranking factor. The fact of the matter, though, is that unless the W3C totally deprecates it, it is still best practice to include the tag. Just don’t expect that it will put you at number 1 based solely on your use of it. There are many other search engines in use that may or may not use this tag to index your page. Again, this is a case where being too "Google-centric" can harm you in the long run. Ignoring all other search engines seems irresponsible and is poor business sense.
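For completeness, the tag in question is the single line below; the keyword list is only an example:

    <meta name="keywords" content="seo services, search engine optimization, seo company" />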

XML Site Maps or Submitting to Search Engines:

"If your site architecture stinks and important optimized pages are buried too deeply to be easily spidered, an XML site map submitted via Webmaster Tools isn’t going to make them show up in the search results for their targeted keywords. At best it will make Google aware that those pages exist. But if they have no internal or external link popularity to speak of, their existence in the universe is about as important as the existence of the tooth fairy (and she won’t help your pages to rank better in Google either!)."

I agree that proper site architecture is of vital importance to have your pages indexed properly. The fact that Google gives you the ability to upload XML sitemaps through their Webmaster Tools indicates that it has some import. It can be debated as to how much weight it carries, but the clear fact is that anything that helps the bots crawl your pages is not a bad thing.
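For anyone who hasn’t built one, a bare-bones sitemap follows the sitemaps.org protocol and looks something like this (the URL and dates are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2011-04-01</lastmod>
        <changefreq>weekly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>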

Link Title Attributes:

"Think that you can simply add descriptive text to your “click here” link’s title attribute? (For example: Click Here.) Think again. Back in the 1990s I too thought these were the bee’s knees. Turns out they are completely ignored by all major search engines. If you use them to make your site more accessible, then that’s great, but just know that they have nothing to do with Google."

This is another case where I don’t necessarily disagree. If the W3C states that best practice is to include title attributes, then they should be there. Google has clearly stated time and again that W3C validation IS a ranking factor, and as such it makes sense to follow W3C validation practices. What I do not recommend is using the generic "click here" as your anchor text, as this ends up building densities for "click here" which you do not want either.
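To illustrate the difference, with a placeholder URL and wording:

    <!-- descriptive anchor text, with a title attribute as a bonus for accessibility -->
    <a href="http://www.example.com/seo-services/" title="SEO services from Example Co.">our SEO services</a>

    <!-- the generic version this paragraph warns against -->
    <a href="http://www.example.com/seo-services/">click here</a>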

Header Tags Like H1 or H2:

"This is another area people spend lots of time in, as if these fields were created specifically for SEOs to put keywords into. They weren’t, and they aren’t. They’re simply one way to mark up your website code with headlines. While it’s always a good idea to have great headlines on a site that may or may not use a keyword phrase, whether it’s wrapped in H-whatever tags is of no consequence to your rankings."

This one I absolutely disagree with. Heading tags are of significant value, especially when used in conjunction with keywords in the page title and meta description. Google absolutely uses these factors as signals for indexing and determining relevance to search queries, which in turn affects your rankings.
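A minimal sketch of that alignment, with placeholder copy:

    <title>Hand-Made Candies | Example Candy Co.</title>
    <meta name="description" content="Hand-made candies shipped anywhere in Canada." />
    ...
    <h1>Hand-Made Candies</h1>
    <h2>Our Most Popular Flavours</h2>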

Keyworded Alt Text on Non-clickable Images:

"Thought you were clever to stuff keywords into the alt tag of the image of your pet dog? Think again, Sparky! In most cases, non-clickable image alt tag text isn’t going to provide a boost to your rankings. And it’s especially not going to be helpful if that’s the only place you have those words. (Clickable images are a different story, and the alt text you use for them is in fact a very important way to describe the page that the image is pointing to.)"

While this does not have a direct effect on rankings, it is again part of creating a W3C-validated page, which Google uses as a ranking factor. It is also an important consideration in keeping your site accessible to those with visual impairments or those using a text-based browser.
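For example, with placeholder file names and alt text, the distinction Jill draws looks like this in practice:

    <!-- non-clickable: the alt text mainly serves accessibility and validation -->
    <img src="/images/candy-display.jpg" alt="Display case of hand-made candies" />

    <!-- clickable: the alt text acts much like anchor text for the page being linked to -->
    <a href="/hand-made-candies/"><img src="/images/candy-thumb.jpg" alt="Hand-made candies" /></a>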

Keyword-stuffed Content:

"While it’s never been a smart SEO strategy, keyword-stuffed content is even stupider in today’s competitive marketplace. In the 21st century, less is often more when it comes to keywords in your content. In fact, if you’re having trouble ranking for certain phrases that you’ve used a ton of times on the page, rather than adding it just one more time, try removing some instances of it. You may be pleasantly surprised at the results."

Certainly there is a balance to be had. I agree that overdoing it will cause problems. The best practice is to write valuable, concise content that is not spammy or of low value. Google wants you to write quality content and your readers want clear, valuable content. Doing so should organically place the appropriate amount of keywords within the textual content.

Linking to Google or Other Popular Websites:

"It’s the links pointing to your pages from other sites that help you with SEO, not the pages you’re linking out to. ‘Nuff said."

Again, this is another instance where it may not help your rankings, but if you can serve your visitors better by sending them to an external link then you should do so. It is of paramount importance to provide a quality site experience for your viewers. If you have a great site that serves your visitors well, then rankings will follow.

IMHO, it makes sense as an SEO to always employ best practices. It covers all your bases and will never hurt any of your SEO efforts.

SEO news blog post by @ 9:38 pm


February 8, 2011

The Google Honeypot Sting – Part 2

This is a follow-up to my previous post regarding the accusations from Google that Bing is using click-through data as part of their ranking methodology. It is pretty certain that Google does as well, and there is evidence to show that they both have been doing so for some time. Even Matt Cutts said in 2002 that "using toolbar data could help provide better SERPs," although to this day Google hasn’t officially disclosed whether they use click-stream data as a factor in their search ranking algorithm.

To try to prove their accusation, Google created some fake SERPs for "non-words" and sent clicks through to Bing to make sure Bing got hold of the data. Even though it was nonsense data, Bing still took it seriously enough to use it in about 10% of those search results. Bing then accused Google of click fraud, but because there was no PPC component it was immediately dismissed.

Bing was not forthcoming about their practices, stating: "We do not copy results from any of our competitors. Period. Full stop." Bing now reveals that they DO use click-stream data from sources like their IE toolbars and use this information as a factor in their ranking algorithm.

In an additional statement, Bing revealed that:

"We use over 1,000 different signals and features in our ranking algorithm. A small piece of that is click-stream data we get from some of our customers, who opt-in to sharing anonymous data as they navigate the web in order to help us improve the experience for all users."

I think the bigger story here is why this seems to be such a contentious issue for Google. Why the cloak-and-dagger routine between the two? I can understand that Bing may not want to divulge its practices, but it seems like adding insult to injury to deny the accusations and then admit to them later. Both Google and Bing appear to be behaving like temperamental juveniles in a school yard.

What can we take away from this? Large corporations often behave like children. Even if clickstream data isn’t a leading factor in the rankings, and probably never will be, it is part of the equation and as such cannot be ignored. As SEOs, we should be looking for ways to get URLs into the data stream of toolbar users.

SEO news blog post by @ 6:56 pm

