Beanstalk's Internet Marketing Blog

At Beanstalk Search Engine Optimization we know that knowledge is power. That's the reason we started this Internet marketing blog back in 2005. We know that the better informed our visitors are, the better the decisions they will make for their websites and their online businesses. We hope you enjoy your stay and find the news, tips and ideas contained within this blog useful.


February 14, 2013

iOS popularity = Big Bills for Bing Hating

Let's call a spade a spade: Google is paying a fee to keep Bing from being the default search engine on iOS.

The fee is based on per-unit pricing, and not only are there more units than ever, but the per-unit price is also climbing, from $3.20 last year to an estimated $3.50 per unit in 2013!

[Image: a flock of sheep attempting to enter a building with an Apple logo, all at the same time.]
Given the growing user base these should almost be rabbits?

 
Since the prices are estimates, all we can honestly say is that exclusive rights to the default search engine on iOS will cost more in 2013.

However, there are certain ‘publications’ that have forgone the guessing part and are rather certain that Google will pay up.

For example:

Techcrunch title: GOOGLE TO PAY APPLE 1 BILLION
An honest title: GOOGLE COULD PAY APPLE 1 BILLION
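(For the math-inclined: at $3.50 per unit, a $1 billion payment would work out to roughly 285 million devices.)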

In fact, if Samsung, or Google (via its Motorola Mobility acquisition), can keep one-upping each new iPhone, then the cost of licensing that user base is likely peaking at a point it will never return to.

But is it worth the money, knowing how much of a search advantage Google has over Bing? Well, that depends entirely on who you ask!

Apple pundit:

People will use whatever is the default, like a pack of blind sheep. Everyone knows this.

Google fan:

If that's true, then why is the Google Maps app on iOS the most popular app on the device? People clearly don't just use the default Apple Maps.

…and really, if we're talking about users who skipped over the BlackBerries, Nokias, Samsungs, etc., for a specific device, then perhaps we should give them some credit for also choosing a better search experience?

After all, how many times would you let your phone load Bing before trying to switch it?

I personally would let a ‘Bing’ search happen once at the most, just to get info on “setting default search engine on iOS”. :)

SEO news blog post by @ 5:08 pm


 

 

January 31, 2013

Are you Modern? Take the test!

[Image: modern.IE logo]

Two pro-Microsoft posts in one week? I know, right?!

Clearly we are not masters of fate or IT news, so today's headline covers the new modern.IE test site, set up to assist web developers with creating IE-compatible site content.

Wasn't it just two days ago that I pointed out that the big flaw with IE is that the old versions create a web design nightmare? *tap tap* …Apparently this thing is turned on?

What does it test?

The tool is actually a suite of tests, with specific test cases for IE-specific issues.

Here’s a list of categories it will test and report on without setting up a ‘Site Owner’ account:

  • Fix common problems from supporting old versions of IE:
      • Known compatibility issues
      • Compatibility Mode
      • Frameworks & libraries
      • Web standards docmode
  • Help this webpage work well across browsers, across devices:
      • CSS prefixes
      • Browser plug-ins
      • Responsive web design
      • Browser detection
  • Consider building with some new features in Windows 8:
      • Touch browsing default
      • Start screen site tile

If you plug in your URL, the page will test all of these areas and report back on where improvements could be made.

Additionally, there is a direct link to the ‘Pinned Site Tile’ testing/design tool.

This tool lets you select an image (144×144 pixel PNG) and text for your website when a Windows 8 user wants to ‘Pin’ the site to their start menu.

My experience with the tool wasn’t great, likely due to some caching, but if you test your code against sites that do work properly you can still sort out the needed meta tags quickly enough.
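For reference, the Start screen tile boils down to a few meta tags in the page head. Here's a minimal sketch using the documented IE10 tag names, with hypothetical example values:

<!-- Pinned-site tile for IE10 / Windows 8 (example values only) -->
<meta name="application-name" content="Beanstalk SEO" />
<meta name="msapplication-TileColor" content="#2b5797" />
<meta name="msapplication-TileImage" content="/images/tile-144x144.png" />

Drop those into your <head> and the tile tool should pick up the image and background color.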

Other Goodies?

Included in the suite is a link to the Internet Explorer Test Drive site, where you can compare HTML5 features and performance with other browsers.

 
Technically, I ended up short on time to cover more, so if you dive in and start to wonder why we didn't point out something new or interesting, feel free to let us know; we're always open to feedback. :)

SEO news blog post by @ 12:20 pm


 

 

July 5, 2012

Particle Physics and Search Engines

If you've been hiding under a rock, then you may not have heard the news of the ‘God Particle’ discovery.

As someone who is fairly scientific, I look at this as more of a proof of concept than a discovery, and ‘God’ really needs to give Peter Higgs some credit for his theories.

 
I won't dwell on the news surrounding the Higgs boson confirmation, but there are parallels between objects colliding and revealing previously unseen matter.

When Search Engines Collide

It's been some time since Bing and Yahoo merged, so the data sets should be the same, right?

No. That would really be a wasted opportunity, and Microsoft is clearly smarter than that.





 
By not merging the search data or algorithms of Bing and Yahoo, Microsoft can experiment with different updates and ranking philosophies without putting all its eggs in one basket.

An active, healthy SEO will be watching search algorithm updates from as many perspectives as possible, which means a variety of sites on a variety of topics tracked across a variety of search engines.

Say a site gets a ton of extra 301 links from partner sites, and this improves traffic and rankings on Bing, leaves rankings stable on Yahoo, and causes a drop in traffic on Google.

It's possible to say the drop on Google was related to any number of factors: untrusted links, link spam, dilution of keyword relevance, keyword anchor text spamming, you name it. This is because Google is always updating and always keeping us on our toes.

Bring on the data..

Let's now take the data from Bing and Yahoo into consideration and look at what we know of recent algorithm changes on those engines. This ‘collision’ of data still leaves us with unseen factors, but it gives us more to go on.

Since Bing has followed Google on some of the recent updates, the upswing in keyword positions on Bing hints that it's neither a dilution of relevance nor spamming of the keywords/anchor text.

Stability on Yahoo is largely unremarkable if you check the crawl info and cache dates; Yahoo is likely just late to the game, and you can't bet the farm on this info.

What about the other engines? Without paying a penny for the data, we can fetch Blekko and DDG (DuckDuckGo) ranking history to see what changes have occurred on those engines.

Since Blekko is currently well known to be on the warpath against duplicate content, and they are starving for fresh crawl data, a rankings drop on that service can be very informative, especially if the data from the other search engines helps to eliminate key ranking factors.

In the case of our current example, I'd narrow down the list of ranking factors that changed in the last ‘Penguin’ update and contrast those with the data from the other engines. I'd probably suspect (in this example) that Google is seeing duplication from the 301s, something Bing wouldn't yet exhibit, but Blekko would immediately punish as badly as or worse than Google.
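To make that elimination process concrete, here's a rough JavaScript sketch. The observations and factor "signatures" below are entirely hypothetical; the idea is simply that each candidate factor predicts a pattern of movement across engines, and only the factors whose predictions match reality survive:

// Hypothetical movements observed after the extra 301 links appeared:
var observed = { google: 'drop', bing: 'gain', yahoo: 'flat', blekko: 'drop' };

// Each candidate factor predicts how the engines that enforce it would react:
var factors = [
  { name: 'anchor text spam', predicts: { google: 'drop', bing: 'drop' } },
  { name: 'keyword dilution', predicts: { google: 'drop', bing: 'drop' } },
  { name: 'duplicate content via 301s', predicts: { google: 'drop', bing: 'gain', blekko: 'drop' } }
];

// Keep only the factors whose predictions all match what we observed:
var plausible = factors.filter(function (f) {
  return Object.keys(f.predicts).every(function (engine) {
    return f.predicts[engine] === observed[engine];
  });
});

console.log(plausible.map(function (f) { return f.name; }));
// -> [ 'duplicate content via 301s' ]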

The next step would be to check for issues of authority with the page content. Is there authorship mark-up, with a reciprocal setup on the author's end, to help establish trust in the main site content? Does the site have the proper verified entries in Google WMT to pass authority? Barring WMT flags, what about a dynamic canonical tag in the header, even as a test if it's not already set up?
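For reference, both checks come down to a couple of tags in the page head; a minimal sketch with hypothetical URLs:

<!-- Canonical: declares the one authoritative URL for this content -->
<link rel="canonical" href="http://www.example.com/page/" />
<!-- Authorship: points to the author's Google+ profile, which should link back to the site to close the loop -->
<link rel="author" href="https://plus.google.com/112345678901234567890/" />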

Start making small changes, watch the results, and be patient. If you're not gaming Google and you've accidentally done something to cause a drop in rankings, you need to think your way through the repairs step by step.

It's not easy to evaluate, but the more data you can mash up, and the better you understand that data, the quicker you can troubleshoot ranking issues and ensure that your efforts will produce gains.

SEO news blog post by @ 12:12 pm


 

 

May 24, 2012

Yahoo Axis – What the Flock?

I had a friend working on the Flock browser team right up until it lost momentum and it became clear that it was too much, too soon…

[Image: Amy Winehouse. Too soon?]

Here we go again with a new ‘all-in-one’ web browser concept, this time from a very big name?

**Update: Turns out the leaks were really just rumors. This hype mill is a ‘googol‘ times more intense than it should be, considering this is ‘just a plugin’ (unless you count Apple devices).

 

[Image: Paul Rudd doing a double take. Yahoo..? New?!?]

Microsoft powers Yahoo's search, right? So if Yahoo is releasing a new browser plus a suite of browser plugins for people who refuse to switch browsers, what's going on?

Well, apparently giving people the option to ‘choose’ MSN/Bing/Yahoo wasn't working out so well. Now you can run a browser or a plugin that removes that annoying hassle of choosing whose search services you are using.

Y'know how Firefox and Chrome let you sign in to your browser, letting you seamlessly move from one location to the next? Yeah, Axis is going to break ground and re-invent the web by also doing that same thing.

Y'know how Google shows you previews of the sites you're considering visiting within the search results? Yep, Axis will finally let you do that, again.

Is this even a new browser, or just IE9 with some ‘fluff’ and Yahoo branding? Tonight we will get a chance to try it hands-on and find out, but for now there are a few videos we can watch over on Yahoo Video.

One of the points my Economics teacher used to hammer home was to view each promotion as the promoter relating to their target audience.

If you have a good product with a smart client base, you can sell your product by focusing on real traits and strengths. Just demonstrate the product and avoid all pointless elements that distract the consumer from your product information.

Enjoy those videos and the clever/unique symbolism that hasn’t been copied too many times since Apple used it in 1984. :)

Does this mean Bing/Yahoo rankings will be important?

Whoever said they weren't important? Okay, expert opinions aside, you should never burn the Bing bridge, especially not with cell phones that default to Bing and new versions of Windows that also default to Bing.

It’s never wise to put all your eggs in one basket, and this is true of search engine placement/rankings as well as eggs.

Even if Yahoo Axis only manages a week of public attention, that’s one week of people around the planet searching Bing for a change.

If you rank really well on Google, we’re not going to suggest you intentionally tank your rankings for a short-term gain on Bing. The cost of recovering from such a move would probably be far more than simply paying for some pay-per-click coverage via Microsoft’s AdCenter.

There are already folks worried about ‘Yahoo’ impressions vs. Bing impressions, and the following advice has been posted in the AdCenter help forum:

  1. You are currently bidding on broad match only; add phrase and exact match to your bidding structure.
  2. Look at keywords with low quality scores and optimize for those specifically.
  3. Install the MAI tool and check expected traffic for adCenter; you can also see average bids for specific positions.

Only 7 Days Left!


 

Talk about old news? I mentioned this just 2 days ago?!

We still have 7 days left in our Beanstalk Minecraft Map Competition! Check it out, and even if you're not entering, please let others know it's coming to a close; we need all submissions by the 31st!

SEO news blog post by @ 10:03 am


 

 

May 1, 2012

Search Engine Experiment in Spam Surfing

If you took a very heavily spam-influenced search engine, Bing for example, and removed the first 1 million results for a query, how good would the remainder be?

How about doing the same thing to the best-filtered search engines available?

Well, someone got curious and made the Million Short search engine.

What this new service does is remove a specific number of results from the top of a search and show you the remainder.
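Conceptually, the core of the service is a single operation (names hypothetical):

// Drop the first million results and keep whatever is left:
var remainder = allResults.slice(1000000);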

I became immediately curious about a few things:

  • Where are they getting their crawl data from?
  • What are they doing to searches where there’s only a few hundred results?
  • Where is the revenue stream? I see no ads?

Given the lack of advertising, I was expecting them to be pulling search data from another site.

There's no way they are pulling from Bing/Yahoo; there are 14+ sites paying for better spots than we've earned on Bing for our terms…

And while the top 10 list looks a bit like DuckDuckGo's, we're seemingly banned from their rankings, and not at #6 at all. It's funny when you look at their anti-spam approach and then look at the #1 site for ‘seo services’ on DDG; it's like a time machine back to the days of keyword link spam. Even more ironic is that we conform to DDG's definition of a good SEO:

“The ones who do in fact make web sites suck less, and apply some common sense to the problem, will make improvements in the search ranking if the site is badly done to start with. Things like meta data, semantical document structure, descriptive urls, and whole heap of other factors can affect your rankings significantly.

The ones who want to subscribe you to massive link farms, cloaked gateway pages, and other black hat type techniques are not worth it, and can hurt your rankings in the end.
Just remember: if it sounds too good to be true, it probably is. There are some good ones, and also a lot selling snake oil.”

We know the data isn't from Google either; we have the #1 seat for ‘seo services’ on Google and maintain that position regularly.

So what's going on?! This is the same company that gave us the ‘Find People on Plus‘ tool, and clearly they know how to monetize a property.

My guess is that they are blending results from multiple search engines, and likely caching a lot of the data, so it'd be very hard to tell who's done the heavy lifting for them.

All that aside, it's rare to see a search engine that blatantly gives you numbered SERPs, and for now Million Short is showing numbered positions for keywords in the left sidebar. That's sort of handy, I guess. :)

You can also change how many results to remove, so if your search is landing you in the spam bucket, try removing fewer results. If your search always sucks, and the sites you want to see are in the results on the right, you've apparently found a search phrase that isn't spammed! Congrats!

Weak one: Google Drive

Well, my enthusiasm for Google Drive just flew out the window in my second week of using it.

UPDATE: Turns out the disk was full and Google Drive gives no feedback at all. Thanks, Firefox, for telling me WHY the download failed. Oh man.

SEO news blog post by @ 11:01 am


 

 

January 17, 2012

Surviving the SOPA Blackout

Tomorrow, January 18th, is SOPA blackout day, and lots of very popular sites are committing to participate in the blackout.
[Image: SOPA blackout cartoon]
How can web companies such as SEOs, and supporters like us, maintain their workflow in the midst of a major blackout?

We’ve got some tips!

I need to find things mid-blackout!

While some sites will be partially blacked out, a lot of the larger sites will be completely offline in terms of content for maximum effect.

This means that during the blackout folks will have to turn to caches to find information on the blacked out sites.

If Google and the Internet Archive both stay online during the blackout, you can use them to get cached copies of most sites (for example, searching Google for cache:example.com will bring up Google's cached copy of that site).

If you’re not sure how you’d still find the information on Google, here’s a short video created by our CEO Dave Davies to help you along. :)

I want to participate without killing my SEO campaign!

If all your backlinks suddenly don't work, or they all 301 to the same page for a day, how will that affect your rankings?

Major sites get crawled constantly; even 30 minutes of downtime could get noticed by crawlers.

A smaller site that gets crawled once a week would run very little risk by doing a blackout for the daytime hours of the 18th.

Further to that, you could also look at user-agent detection to sort people from crawlers, blacking out only the human traffic.
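Here's a rough sketch of that idea as a tiny Node.js server; the crawler list is deliberately short and hypothetical, and a real deployment would match whatever your own logs actually show:

var http = require('http');

// Hypothetical, deliberately short list of crawler user-agent fragments:
var CRAWLERS = /googlebot|bingbot|slurp|baiduspider/i;

http.createServer(function (req, res) {
  var ua = req.headers['user-agent'] || '';
  res.writeHead(200, { 'Content-Type': 'text/html' });
  if (CRAWLERS.test(ua)) {
    // Crawlers get the normal page, so indexing carries on untouched:
    res.end('<html><body>Regular site content.</body></html>');
  } else {
    // Humans get the protest page for the day:
    res.end('<html><body><h1>Blacked out to protest SOPA.</h1></body></html>');
  }
}).listen(8080);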

If that seems rather complex, there are two automated solutions already on offer:

    • sopablackout.org is offering a JS file you can include that will black out visitors to the site and then let them click anywhere to continue.
      Simply putting this code in a main include (like a header or banner) will do the trick:
      <script type="text/javascript" src="//js.sopablackout.org/sopablackout.js"></script>

 

  • Get a SOPA plugin for your WordPress site and participate without shutting it down. The plugin simply invokes the above JavaScript on the 18th automagically, so that visitors get the message and can then continue on to the blog.

I'd be a rotten SEO if I suggested you install an external JavaScript without also clearly telling folks to REMOVE it when you are done. It might be a bit paranoid, but I live by the better-safe-than-sorry rule. Plus, just because you are paranoid, it doesn't mean people aren't trying to track your visitors. :)

How's Chia Bart doing? …Well, I think he's having a mid-life crisis right now, because he looks more like the Hulkster than Bart?

[Image: Chia Bart number 5 – Pastamania!]
To all my little Bartmaniacs, drink your water, get lots of sunlight, and you will never go wrong!

SEO news blog post by @ 11:28 am


 

 

October 11, 2011

What word to use for anchor text?

As a well-connected SEO, I digest a lot of publications from the web, and I try to limit my opinions to factual results, either from real-world feedback or from controlled tests. Google is constantly evolving and improving itself to render the best search results possible, or at least better results than the competition.

Considering where Google was, hardware-wise, in 1999, things certainly keep changing:

[Image: evolution of Google – the first Google server]

On Monday, SEOmoz published a small test they ran to gauge the importance of keywords in the anchor text of links. The test is discussed in detail over on SEOmoz, but the result was rather straightforward.

In a nutshell, they took three new, roughly equivalent sites and tried to build controlled links to them using three different approaches (sketched in markup after the list):

  1. Build links with just ‘click here’ text
  2. Build links with the same main keyword phrase
  3. Build links with random components of the main keyword phrase
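For a hypothetical candy site, the three approaches would look something like this in markup (domain and phrases made up):

<a href="http://example.com/">click here</a> <!-- 1: generic text -->
<a href="http://example.com/">chocolate candy gifts</a> <!-- 2: the main keyword phrase -->
<a href="http://example.com/">candy gifts</a> <!-- 3: a random component of the phrase -->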

Obviously the test is a bit broken, because if you don't have existing keyword relevance for a phrase, you should build relevance with keywords in the anchors. When Google is sorting out who will rank #1 for a site dealing with candies, the site linked to with relevant keywords should always rank higher than a site with links like "click here" or "this site", which aren't relevant. The only exception would be a situation where the links seem excessive or ‘spammy’, which may result in Google not counting any of the similar links for relevance.

Outside of a clean test environment, we know the best results come from a blend of all three types, with a bit of brand linking mixed in to avoid losing focus on brand keywords. A well-established site with a healthy user base will constantly be establishing its brand thanks to all the time on site and click-through traffic for that brand.

I.e., if I search for "Sears" and click on the first link only to find it's a competitor, I'd hit back and find the right link to click. In most cases Google is watching and learning from the process, so brand links aren't a necessity once a site is quite popular, and the percentage of brand links wouldn't need to be much at all.

Kudos to SEOmoz for publishing some of their SEO test info, regardless of how experimental it was. We're constantly putting Google's updates to the test, and it's often very hard to publish the results in such a clinical fashion for all to see. We will always make an attempt to blog on the topics we're testing, but it's still on the to-do list to publish more of the data.

SEO news blog post by @ 11:56 am


 

 

October 3, 2011

Why Article Spinning Has Spun its Last Spin

Anyone who has been involved with SEO for any time will undoubtedly be familiar with, or at least know of, article spinning. If not: article spinning is a "black hat" tactic where you write a single article and submit it to hundreds of article submission sites.

[Image: article spinning toilet]

Spinning software is typically used to spin the content of the article, replacing specified keywords with synonyms while keeping about 90% of the content the same. Some spinner software employs automation that can bypass websites' CAPTCHA codes to further streamline the submission of the spun articles. As with all grey- or black-hat tactics, sites may see temporary gains in rankings or traffic, but it is only a matter of time before they incur the wrath of the Google Panda and are penalized.
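For illustration, most spinning tools work from "spintax" templates along these lines (a made-up example); the software emits one variant per submission:

The {quick|fast|speedy} brown fox {jumps|leaps|hops} over the {lazy|idle} dog.

Swap a few synonyms, keep 90% of the sentence, and you have "new" content, at least until Panda reads it.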

Before the release of the Panda algorithm update at the beginning of 2011 and its subsequent refreshes, this was a widely used tactic amongst less-than-reputable SEOs and website owners as a method of garnering backlinks and rankings for their sites. It is now one of the worst tactics any legitimate website can use in the post-Panda web, and one that gives the rest of the SEO industry a bad name.

Article spinning was a major contributor to inundating the SERPs with webspam: garbage, spammy results. Not only did it make it difficult to conduct proper searches at the time, but it pushed legitimate results further back in the SERPs, making good sites hard to find.

Many site owners and SEOs used article spinning as an easy road to better rankings, and for a time they were very successful. However, one of the main objectives of Panda was to attack duplicate, low-value and spun content in an effort to clean the garbage out of the SERPs, and Panda has become exceedingly efficient at doing so. Even large corporations and prominent business leaders have incurred severe penalties for utilizing these practices. Panda forced websites to produce higher-quality, well-written content for their readers, not for rankings per se.

Duplicate content is never useful; it is as useful as going to a library only to find that all the books are the same.
Syndicating the same duplicate, spun content repeatedly with the same anchor text sends up a huge red flag to the search engines and informs them that you are trying to spam the system, which will lead to substantial penalties.

It is vitally important to vary the keyword anchor text between every unique article that you write and syndicate. It is no longer worthwhile to submit your articles to hundreds of article sites; choose no more than five of the most relevant, high-trust sites, as Google will only give credit to the first copy it finds.

If you are still using these tactics and are using software to spin and distribute your content: stop. Now. There is no place for this kind of low-quality, duplicate content in a post-Panda internet, especially when there are so many legitimate, organic tactics that will improve your rankings and drive traffic to your site.

Become an expert in your field and write for your audience. Use compelling subject matter and well-written content to entice your readers not only to read what you write, but to keep coming back for more. More importantly, create the type of content that your readers want to share with their friends and colleagues.

Speak with your SEO company for ideas on how to appeal to the market you are trying to reach and how to create the type of content that will keep your readers wanting more.

SEO news blog post by @ 12:23 pm


 

 

September 22, 2011

1st SEO Impressions of Windows 8

I started my computer life on an Apple II, my first gaming/entertainment electronics experience was the Leisure Vision, and it wasn't until high school that I met my first IBM, an XT with an attitude. So in my years you can bet I've seen a few operating system ‘revolutions’; heck, the first computer I paid for with my own money was the Mac Classic, back when it was the first PC to have a mouse and ‘Windows’ (plus it could talk!). :)

Things have changed a bit since that 8 MHz Macintosh with its single-color, 10″, non-upgradable screen. The 4 MB maximum RAM that was a selling point of my Mac isn't even enough for a modern CPU cache, let alone an OS plus applications, and ‘booting from disk’ has a totally different meaning now.

Along comes Windows 8, and I really felt that I needed a new operating system like I needed a new hair in my nose, so I was in no rush at all to review it. The situation reminded me of a quote from Tron: Legacy:

“..what sort of improvements have been made in Flynn… I mean, um, Windows 8?” .. “This year we put an 8 on the box!”
[Image: Encom OS-12]

Well, it's not really that bad; in fact, the more I poke at Windows 8, the more I see its potential, and I can see how it could be a game changer for a web-based business. Here's why:

  • The start menu is now a web page with tiled, animated content, including feeds from websites like XKCD.com:
    [Image: embedded websites in the Windows 8 Start menu]
    – Do you have your website set up properly to feed the new start menu when people add your site there?

  • IE10 is the browser the OS uses; you can install another, but it won't get loaded until you specifically load it.
    – Does your site look the way you'd expect in IE10? I know our aging site layout looks different in IE10.

  • There is no prompt to choose a search engine; you've got Bing, and what more could anyone want?
    – This could divide the consumer base between power users who have fiddled and those who just use things ‘as is’. Depending on your market, this could change the way you look at Bing.

  • Clicking the "Make Google my homepage." link on the google.com/.ca homepage currently causes IE10 to load a blank white page instead of the default home screen.
    – Does your site use similar JavaScript? Will you have the same issues with IE10 users? (The legacy IE mechanism is sketched after this list.)

  • Built-in applications for reaching social networks aren't broad enough. The "Socialite" program only works with FB, dropping support for Twitter, Reddit, Google Reader, Flickr, Digg, etc.
    – Speaking of which, how cozy are you with giving MS access to everything?
    [Image: Windows 8 Socialite preview for Facebook]
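For what it's worth, "make this my homepage" links have traditionally leaned on IE's proprietary DHTML behavior; a rough sketch of the usual pattern (not Google's actual code) looks like this:

<a href="#" onclick="this.style.behavior='url(#default#homepage)';
                     this.setHomePage('http://www.google.com/');
                     return false;">Make Google my homepage</a>

Since #default#homepage is an IE-only behavior, it's a reasonable guess that IE10's stricter handling of it is what leaves that blank page.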

Mind you, with all the stink being raised over the UEFI secure boot protocol, the rate of adoption for Windows 8 could be pitiful. If Microsoft's hardware partners go ahead with the new feature, it will lock out other OSes and force people to deal with one source for new OS installs/upgrades.

SEO news blog post by @ 11:05 am


 

 

April 19, 2011

Panda Puts “Hit” on ciao.co.uk

In a follow-up to the post I published yesterday on the top 20 "losers" from Google's Panda UK update: one of the worst-hit companies was Ciao.co.uk, a Microsoft-owned company that was leading an EU competition case against Google. Microsoft has accused Google of purposely using the Panda algorithm update to attack Ciao in an effort to reduce its rankings.

Ciao.co.uk was involved in initiating an EU investigation into Google in November 2010; Microsoft claims that Google has used its dominant position to limit rivals' products. The Panda update was designed to lower the overall positioning of content farms and other low-quality websites, and is part of a larger effort to reduce the webspam that has permeated the search results for years.

Google's head of search evaluation, Scott Huffman, said it was "almost absurd" to suggest that the results were rigged. Of course, "almost absurd" is not quite the same as "completely absurd." Google and Microsoft are no strangers to the animosity that has existed between the two corporations for years.

Looking at the list of sites that have been negatively affected, it appears that most sites on the list have been legitimately penalized. Panda was specifically designed to target product comparison sites, review sites and voucher code sites, and Ciao is no different.

Even a cursory look at the Ciao website shows that it publishes duplicate reviews on multiple pages and sites; Ciao continually regurgitates massive amounts of content, which is exactly what Panda was targeting sites for. One of the reviews I checked was republished in its entirety on over 30 individual pages and on no fewer than 3 other websites.

Majestic SEO reports 23,200,000 backlinks coming from 63,000 unique domains, which is an average of 368 links from each domain. Even looking at the single domain http://www.ciao.co.uk/, there are 157,049 backlinks coming from 1,027 unique domains.

That averages 153 (157,049 / 1,027 ≈ 153) links from each domain.

From the backlinks analyzed in Majestic, here is the data for the IP blocks sending over 10,000 incoming backlinks each:

IP Block          # of Links
92.122.217.*         109,721
94.245.123.*          45,810
65.55.17.*            45,588
69.175.60.*           32,634
66.216.1.*            28,385
207.218.202.*         21,540
212.227.159.*         13,800
178.79.137.*          11,100
95.154.211.*          10,428
69.163.188.*          10,266

One of the worst offenders was http://small-business-service.com/, which has over 10,974 links pointing to ciao.co.uk from a single IP.

On that site, a visitor can see the huge proliferation of spammy, low-quality links it engages in. The totals for all pages on the Ciao domain, including subdomains and redirects, are even more astonishing:

Pages Indexed: 19,174,884
# of Backlinks: 23,199,785
# of Unique Domains: 62,886

It would appear that the newest iteration of the Panda algorithm update from Google is doing a great job of catching low-quality sites and dealing with them quite justly. The algorithm certainly needs some tweaking, though, as many quality sites took penalties as well.

As lesser-quality sites are displaced, the sites that do offer a quality user experience, use legitimate linking strategies and offer quality content will begin to see their rankings increase.

Beanstalk is currently in the process of testing organic vs. non-organic strategies in an attempt to challenge the effectiveness of Panda's filtering capabilities. Watch for our 3-part blog series on this topic coming soon!

SEO news blog post by @ 10:16 pm


 

 
