
Surviving the Panda-mic

As most of us know, the Panda update, launched by Google in the US in February and this week in the UK, has caused a lot of confusion, a lot of ranking drops and a lot of people scratching their heads wondering how to recover from it.

Panda was designed to target sites that spit out and aggregate low-quality content based on the most-searched keywords on Google. The update caused a lot of shifts in the search results and helped remove a lot of spam farms from the first page of results. This was great for publishers who were honestly trying to produce quality content. We also saw many splogs removed from Google’s index and many spun-content sites lose their rankings, which in turn helped more legitimate sites move up in the rankings.

I have put together a few tips for webmasters that may help offset the effects of Panda and repair the loss in rankings.

  • We know that sites with duplicate content got hammered by the update. Produce only original, high-quality editorial or factual content. (One common technical remedy for duplicate content is sketched after this list.)
  • Domain age is important, so do not switch domain names if you can avoid it. If you do need to register a new site, go for keyword-specific terms that directly relate to your industry.
  • Google has clearly stated that social media is becoming increasingly important. Sites that were tied to Facebook, Twitter and LinkedIn accounts fared better.
  • Sites with embedded video content seem to do better.
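
As a minimal sketch of one widely used remedy for duplicate content (this is my illustration, not something from the update itself), a rel="canonical" link tag tells Google which version of a duplicated page is the preferred one. The URL below is a placeholder:

    <head>
      <!-- Point duplicate or near-duplicate pages at the preferred URL -->
      <link rel="canonical" href="http://www.example.com/widgets/">
    </head>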

Sometimes the best approach is to make the most of a situation. To get the most from Panda, try the following:

  • Install and utilize a blog on your site. Write fresh, quality content at least 2-3 times a week. This encourages the Google bots to monitor your site closely for new updates. (A sketch of how to advertise your blog’s feed follows this list.)
  • Add in feeds from your social networking accounts. The more links you get coming in from Facebook, Twitter and other social sites, the better.
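
As a minimal sketch (assuming a standard RSS feed at a placeholder URL), a feed auto-discovery link in your page head lets readers, browsers and crawlers find your blog’s new posts:

    <head>
      <!-- Feed auto-discovery: helps readers and crawlers find new posts -->
      <link rel="alternate" type="application/rss+xml"
            title="Example Blog Feed" href="http://www.example.com/blog/feed/">
    </head>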

For those sites that took a large ranking hit from the Panda, try some of the following recommendations.

  • Don’t ignore your rankings in other popular search engines such as Yahoo and Bing. The ranking drop you experienced in Google should not have affected your rankings elsewhere.
  • Set up a Google Webmaster Tools account and use it to analyze each section of your website. The tool not only helps you find and correct problems, but it also gives you a clear indication of the factors Google looks at when assessing your site. (A sketch of the verification step follows this list.)
  • Study Google’s well-established quality guidelines and ensure that your site adheres to them.
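
As a minimal sketch, one of the standard ways to verify site ownership for Google Webmaster Tools is a meta tag in your home page’s head. The content token below is a placeholder; Webmaster Tools generates the real one for your account:

    <head>
      <!-- Verification tag copied from your Google Webmaster Tools account -->
      <meta name="google-site-verification" content="YOUR-TOKEN-HERE">
    </head>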

Once you have completed these steps, and you are certain that you have performed an exhaustive and thorough repair of your site, you can submit a reconsideration request asking Google to take another look at your site.

Panda is by far the largest and most far-reaching change to the algorithm in the last decade. Reports indicate that as much as 16% of all search queries have been affected. By keeping abreast of the guidelines established by Google and employing best practices, you should be able to recoup your losses and regain your former ranking status.

SEO news blog post by @ 6:44 pm on April 14, 2011


 

Refuting Debunked SEO Practices

I came across an interesting blog post on ISEdb.com titled "16 SEO Tactics That Will NOT Bring Targeted Google Visitors," in which Jill Whalen discussed strategies that she felt were no longer valid SEO tactics. I have reposted some of the points here and added my comments on each. Jill’s points appear as the quoted passages below.

Individually these tactics amount to very little; on this point I agree. Add them all together, however, and they become significant to your rankings. Being so absolutely "Google-centric" in your tactics is going to hurt you in the long run. Suppose there were no Google (scary, I know…); you would then have to redesign your sites for other search engines that may put more weight on these signals.

Meta Keywords:

"Lord help us! I thought I was done discussing the ole meta keywords tag in 1999, but today in 2011 I encounter people with websites who still think this is an important SEO tactic. My guess is it’s easier to fill out a keyword meta tag than to do the SEO procedures that do matter. Suffice it to say, the meta keyword tag is completely and utterly useless for SEO purposes when it comes to all the major search engines and it always will be."

There is sufficient evidence to show that Yahoo and Bing do use the keywords tag to help categorize and index pages, while Google has been clear that they do not use it as a ranking factor. The fact of the matter, though, is that unless the tag is fully deprecated by the W3C, it is still best practice to include it. Just don’t expect it to put you at number one all by itself. Many other search engines in use may or may not rely on this tag to index your page. Again, this is a case where being too "Google-centric" can harm you in the long run; ignoring all other search engines seems irresponsible and is poor business sense.
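
For illustration, a minimal sketch of the tag in question, with placeholder values:

    <head>
      <!-- Ignored by Google as a ranking signal, but still valid, lightweight markup -->
      <meta name="keywords" content="seo services, search engine optimization, link building">
    </head>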

XML Site Maps or Submitting to Search Engines:

"If your site architecture stinks and important optimized pages are buried too deeply to be easily spidered, an XML site map submitted via Webmaster Tools isn’t going to make them show up in the search results for their targeted keywords. At best it will make Google aware that those pages exist. But if they have no internal or external link popularity to speak of, their existence in the universe is about as important as the existence of the tooth fairy (and she won’t help your pages to rank better in Google either!)."

I agree that proper site architecture is of vital importance in having your pages indexed properly. The fact that Google gives you the ability to upload XML sitemaps through their Webmaster Tools indicates that they carry some import. How much weight they carry can be debated, but the clear fact is that anything that helps the bots crawl your pages is not a bad thing.
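
As a minimal sketch, a bare-bones XML sitemap following the sitemaps.org protocol looks like this (the URL and values are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2011-04-07</lastmod>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
    </urlset>

Once it is uploaded to your server, you can submit it through Webmaster Tools or point crawlers at it with a Sitemap: line in your robots.txt file.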

Link Title Attributes:

"Think that you can simply add descriptive text to your “click here” link’s title attribute? (For example: Click Here.) Think again. Back in the 1990s I too thought these were the bee’s knees. Turns out they are completely ignored by all major search engines. If you use them to make your site more accessible, then that’s great, but just know that they have nothing to do with Google."

This is another case where I don’t necessarily disagree. If the W3C states that best practice is to include the title attribute on links, then it should be there. Google has clearly stated time and again that W3C validation IS a ranking factor, and as such it makes sense to follow W3C validation practices. What I do not recommend is using the generic "click here" as your anchor text, as this ends up building keyword density for "click here," which you do not want either.
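
A hypothetical illustration (the URL and anchor text are placeholders):

    <!-- Generic anchor text builds keyword density for "click here" -->
    <a href="/services/seo-consulting/" title="SEO consulting services">click here</a>

    <!-- Better: descriptive anchor text that matches the target page -->
    <a href="/services/seo-consulting/" title="SEO consulting services">SEO consulting services</a>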

Header Tags Like H1 or H2:

"This is another area people spend lots of time in, as if these fields were created specifically for SEOs to put keywords into. They weren’t, and they aren’t. They’re simply one way to mark up your website code with headlines. While it’s always a good idea to have great headlines on a site that may or may not use a keyword phrase, whether it’s wrapped in H-whatever tags is of no consequence to your rankings."

This one I absolutely disagree with. Heading tags are of significant value, especially when the keywords they contain are used in conjunction with keywords in the page title and meta description. Google absolutely uses these factors as signals for indexing and for determining relevance to search queries, which in turn affects your rankings.
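
A minimal sketch of that alignment, using placeholder keywords:

    <head>
      <title>Organic Coffee Beans | Example Roasters</title>
      <meta name="description" content="Fresh-roasted organic coffee beans shipped across North America.">
    </head>
    <body>
      <!-- Headings echo the keywords used in the title and meta description -->
      <h1>Organic Coffee Beans</h1>
      <h2>Our Single-Origin Organic Roasts</h2>
      <p>...</p>
    </body>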

Keyworded Alt Text on Non-clickable Images:

"Thought you were clever to stuff keywords into the alt tag of the image of your pet dog? Think again, Sparky! In most cases, non-clickable image alt tag text isn’t going to provide a boost to your rankings. And it’s especially not going to be helpful if that’s the only place you have those words. (Clickable images are a different story, and the alt text you use for them is in fact a very important way to describe the page that the image is pointing to.)"

While alt text does not have a direct effect on rankings, it is again part of creating a W3C-validated page, which Google uses as a ranking factor. It is also an important consideration in keeping your site accessible to visitors with visual impairments or those using a text-based browser.
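
A minimal sketch of both cases Jill describes (file names and alt text are placeholders):

    <!-- Non-clickable image: alt text aids accessibility and W3C validation -->
    <img src="/images/golden-retriever.jpg" alt="Golden retriever catching a frisbee">

    <!-- Clickable image: the alt text describes the page the image links to -->
    <a href="/dog-training/"><img src="/images/class.jpg" alt="Dog training classes"></a>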

Keyword-stuffed Content:

"While it’s never been a smart SEO strategy, keyword-stuffed content is even stupider in today’s competitive marketplace. In the 21st century, less is often more when it comes to keywords in your content. In fact, if you’re having trouble ranking for certain phrases that you’ve used a ton of times on the page, rather than adding it just one more time, try removing some instances of it. You may be pleasantly surprised at the results."

Certainly there is a balance to be had, and I agree that overdoing it will cause problems. The best practice is to write valuable, concise content that is not spammy or of low value. Google wants you to write quality content and your readers want clear, valuable content; doing so should organically place the appropriate number of keywords within the text.

Linking to Google or Other Popular Websites:

"It’s the links pointing to your pages from other sites that help you with SEO, not the pages you’re linking out to. ‘Nuff said."

Again, this is another instance where it may not help your rankings, but if you can serve your visitors better by sending them to an external link, then you should do so. It is of paramount importance to provide a quality site experience for your visitors. If you have a great site that serves them well, rankings will follow.

IMHO, it makes sense as an SEO to always employ best practices. It covers all your bases and will never hurt any of your SEO efforts.

SEO news blog post by @ 9:38 pm on April 7, 2011


 

The Google Honeypot Sting – Part 2

This is a follow-up to my previous post regarding the accusations from Google that Bing is using click-through data as part of their ranking methodology. It is pretty certain that Google does so as well, and there is evidence to show that both have been doing it for some time. Even Matt Cutts said in 2002 that "using toolbar data could help provide better SERPs." To this day, though, Google hasn’t officially disclosed whether they use click-stream data as a factor in their search ranking algorithm.

To try to prove their accusation, Google created some fake SERPs for "non-words" and sent clicks through to Bing to make sure Bing got hold of the data. Even though it was nonsense data, Bing still took it seriously enough to use it in about 10% of the planted search results. Bing then accused Google of click fraud, but because there was no PPC component the charge was immediately dismissed.

Bing was not initially forthcoming about their practices, stating: "We do not copy results from any of our competitors. Period. Full stop." Bing now reveals that they DO use click-stream data from sources like their IE toolbar, and that this information is a factor in their ranking algorithm.

In an additional statement, Bing revealed that:

"We use over 1,000 different signals and features in our ranking algorithm. A small piece of that is click-stream data we get from some of our customers, who opt-in to sharing anonymous data as they navigate the web in order to help us improve the experience for all users."

I think the bigger story here is why this seems to be such a contentious issue for Google. Why the cloak-and-dagger routine between the two? I can understand that Bing may not want to divulge its practices, but it seems like adding insult to injury to deny the accusations and then admit to them later. Both Google and Bing appear to be behaving like temperamental juveniles in a schoolyard.

What can we take away from this? Large corporations often behave like children. More practically: even if click-stream data isn’t a leading ranking factor and probably never will be, it is part of the equation and as such cannot be ignored. As SEOs, we should be looking for ways to get our URLs into the data stream of toolbar users.

SEO news blog post by @ 6:56 pm on February 8, 2011


 

Google Update & YaBing!

For those of you who have noticed significant fluctuations in your rankings – you’re not alone. Across the web people have reported significant changes in their rankings. We at Beanstalk were fortunate on this one in that we had ranking reports running for the past few days and got to watch the changes over the course of the report. A happy coincidence. :)

Unfortunately the algorithm shift isn’t particularly favorable to solid site optimization. There is an odd pattern in what we’re seeing: sites whose link building focused on high relevancy and high trust lost ground, while sites whose link building focused on volume in recent months gained ground. This indicates a shift to volume over quality. For obvious reasons, we’re convinced that this shift won’t last.

This shift in quality isn’t just apparent in the sites we’re working on; as we analyze various sites across the web, we’re noticing a larger number of sites with lower-quality backlinks ranking well.

Now – to be sure, we’re always in favor of diversified link building strategies, including some that focus more on volume and others that focus on trust and relevancy, but everything we can see indicates that this update puts a disproportionate emphasis on volume. I expect to see the rankings shift again, likely over the weekend.

I should note that this isn’t just something we’re noticing; it has been observed by a wide array of SEOs. My advice? Don’t react too quickly – corrections are coming and you don’t want to adjust the wrong way.

And in other news …

Also noticeable in the current ranking reports we’re running for our clients is the merging of Yahoo! and Bing search results. A couple of days ago Yahoo! announced that their organic results in North America are now being fed by Bing; these are the first ranking reports to reflect the change. This is (in my opinion) very exciting news, and you can read more about it on Search Engine Journal.

And stay tuned – I’ll be posting more as the Google update continues.

SEO news blog post by @ 3:13 am on August 27, 2010


 
