
Beanstalk's Internet Marketing Blog

At Beanstalk Search Engine Optimization we know that knowledge is power. That's the reason we started this Internet marketing blog back in 2005. We know that the better informed our visitors are, the better the decisions they will make for their websites and their online businesses. We hope you enjoy your stay and find the news, tips and ideas contained within this blog useful.


November 14, 2012

Google’s New ‘AuthorRank’ Bigger than Panda and Penguin Combined

If you work in the SEO industry, you have probably heard a new buzzword floating around the water cooler: "AuthorRank."
AuthorRank signals image
In August of 2005, Google filed a patent for a technology dubbed Agent Rank, in which content-creating 'agents' are ranked by the reception of the content they create and the resulting interactions. The patent goes on to suggest that more well-received and popular 'agents' could have their associated content rank higher than unsigned content or the content of other, less authoritative 'agents'.

After filing a continuation patent in 2011, Google is now able to attribute content to specific agents and to rank those agents, thanks to platforms like Google+. AJ Kohn goes into much detail about AuthorRank and why he feels it will be bigger than Panda and Penguin combined. AuthorRank will not be a replacement for PageRank; it will work in conjunction with it to enable Google to rank high-quality content more appropriately.
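For context, the authorship markup Google has been asking publishers to adopt ties a page to a Google+ profile. A minimal sketch (the profile ID below is a placeholder) looks like this:

<link rel="author" href="https://plus.google.com/100000000000000000000"/>

The Google+ profile then links back to the site from its 'Contributor to' section, closing the verification loop that lets Google attribute the content to that author.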

I certainly don't claim to be an expert on AuthorRank and in fact am only learning about it as I write this. What I did learn from the information I read is that content has been, and will always be, key to the success of any website. Google's mantra to publishers has always been that "content is king": provide high-quality content and the rankings and followers will follow. This new signal will soon be in place as a final coup de grâce for those still stuck in antiquated methods of content creation and syndication.

SEO news blog post by @ 10:59 am

Categories: Google, Google+

November 1, 2012

Google Image Optimization

Image optimization for Google can mean several things, from image compression to image resolution, or even optimization for Google Image Search.

Worry not: the topic is broad, but we can tackle it section by section, and along the way we'll point you to actual Google tools to ensure you're getting the best results.

Image Compression

The biggest gains you can get with the least effort typically come from looking at the wasted bytes (often kilobytes) when images aren’t compressed properly.

Here’s a comparison of JPEG image compression:

[Image comparison: JPEG at 5,899 bytes with poor compression vs. 3,493 bytes with quality compression]

And now PNG compression:

[Image comparison: PNG at 5,590 bytes with poor compression vs. 4,769 bytes with quality compression]

Now honestly, if I had hidden the image sizes and descriptions, could you tell me which was the 3.5kb image?

Google could tell you in a flash, and Google’s PageSpeed Insights scores your page speed by how optimized your images are.

An observant reader may wonder why the PNG with 'poor' compression is smaller than the JPEG. The answer is that it's transparent: the PNG only saves image data (compressed losslessly) for the visible pixels, whereas the JPEG has to save the additional information that 'these pixels are white'.

Also keep in mind that we used really small images to keep this page loading quickly; the larger the image, the bigger the difference compression quality can make.

Image Resolution

The term 'resolution' has so many definitions that I'd need to resolve whether this is a post or a full-length article.

For the context of this discussion I’m speaking of the image dimensions, not the pixels-per-inch.

As an SEO blog, we'd have to be really lazy not to mention the issue of image placement/size on a site, when we know that Google has a clear concept of what's most visible to your audience.

When I say 'your audience' it is not just a buzzword; I really mean that Google looks at its analytics data and the browser window sizes of your traffic, and actually knows when a site is delivering the right content for the majority of its user base.

So if your website is plastered with images that force the user to hunt for your content, and your content isn't images, then that's actually a problem in terms of SEO.

In fact, Google is just in the middle of moving its 'Browser Size' tool into the Google Analytics suite.

As you can see in this example of jQuery Mobile in the Browser Size tool, the existing results are generic and, dare I say, a bit 'unprofessional' looking.

Example of jQuery Mobile in the Google Browser Size tool
In the above image we can see what percentage of general web users can see each element of the page.

I would show off an example of the same page using the new tools, but Google Analytics is only for sites you own, and the new version is still in beta, throwing out ‘Not a Number’ (NaN) errors regardless of your choice of browser.

What you want to end up with, regardless, is a site that fits the screen size of your audience. So if you are running a forum that reviews 'apps', you probably want to aim for a design that fits your most important content above 'the fold' on mobile browsers (at least the current generation of mobile browsers).
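One small, widely used step toward fitting mobile screens (our suggestion here, not part of the Google tools above) is the viewport meta tag, which tells mobile browsers to render the page at the device's width rather than a zoomed-out desktop width:

<meta name="viewport" content="width=device-width, initial-scale=1">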

Image Site Maps

Sitemaps are typically XML documents that describe your website's pages to Google in a more technical manner.

An image site map is specifically for explaining the images that are on your site.

Google does a great job of finding pictures you've linked to, but if you use JavaScript to create galleries without using <noscript> tags, then Google could have difficulty indexing those images.
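As a rough sketch of that fallback (the file names here are made up), a JavaScript gallery can carry plain markup that crawlers and script-less visitors can still read:

<div id="gallery"></div>
<script src="gallery.js"></script>
<noscript>
<img src="/images/beach-sunset.jpg" alt="Beach sunset at low tide">
<img src="/images/mountain-trail.jpg" alt="Mountain trail in autumn">
</noscript>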

An image sitemap’s XML structure lets you clearly spell out each image with options like:

  • loc: The full URL for the image
  • caption: Description of the image
  • geo_location: Physical location, e.g. British Columbia, Canada
  • title: Title of the image
  • license: URL pointing to a license for the image
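To make that concrete, here is a minimal sketch of a sitemap entry using Google's image extension; the URLs are placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://www.example.com/gallery.html</loc>
    <image:image>
      <image:loc>http://www.example.com/images/sunset.jpg</image:loc>
      <image:caption>Sunset over the beach</image:caption>
      <image:geo_location>British Columbia, Canada</image:geo_location>
      <image:title>Beach Sunset</image:title>
      <image:license>http://www.example.com/image-license.html</image:license>
    </image:image>
  </url>
</urlset>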

Since each entry is tied to a <loc> URL, it's fine if your image is remotely hosted; Google understands the need for CDNs, but that remote site needs to be registered in Webmaster Tools for proper indexing of the images.

Once again I’ve gone a bit too far on the topic for a first round, but I will return with a deeper look beyond the surface of the issue in a part 2 post.

For now if you wanted to start working on an image sitemap (or adding image references to your existing sitemap) look at this answer in Google’s Webmaster Guidelines.

SEO news blog post by @ 1:32 pm

Categories: Coding, Google

October 17, 2012

New Webmaster Guidelines Part 2 – Technical Guidelines

This is part 2 of an in-depth look at the newly revised Webmaster Guidelines from Google. Google has recently updated their list of best practices and suggestions for site development. To give your site the best chance of ranking well, and to keep a competitive edge, the Google guidelines should be read like the gospel.

monkey fixes computer

• Did you ever wonder how Google processes your site to determine its focus and content? Try using a text-based browser like Lynx to understand what Google is using to interpret your site.

By displaying the page without dynamic elements such as Flash, JavaScript, cookies, session IDs or DHTML, you will gain keen insight into what is actually visible to Google. If there is not enough content to be read, Google is going to have a difficult time indexing your site and establishing your value in the SERPs.

• Allow bots to crawl your site without session IDs or arguments designed to track user activity. Disallow specific URLs that you don't want crawled in your robots.txt file. Session IDs are antiquated and should not be used in any new site development; you can use cookies instead for monitoring site traffic.

• Check that your web server supports the "If-Modified-Since" HTTP header. This tells Google whether your content has changed since it last crawled your site, saving bandwidth and overhead (see the example exchange after the synopsis).

• Use the robots.txt file to exclude directories that do not need to be crawled by Google. Keep it updated in your Webmaster Tools account, and ensure that you are not blocking Googlebot from crawling your site by testing it in Webmaster Tools.

• Keep advertisements (such as Google's AdSense and DoubleClick) to a minimum, and ensure that they do not affect your rankings by excluding their crawlers in your robots.txt file.

• If you use a content management system (CMS), make sure that it supports SEO-friendly URL structures and is easily crawled by bots.

• Test your site in several browsers (IE, Firefox, Chrome, Lynx, Opera, Safari) at different resolutions.

• Use tools to monitor page load speeds; this is becoming an increasingly big factor for rankings. Use Google's Page Speed or the Webmaster Tools Site Performance tool to gain insights on how to boost your page load speeds.
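Putting the robots.txt advice above into practice, here is a minimal sketch; the directory names are hypothetical and should be swapped for the parts of your own site that don't need crawling:

User-agent: *
Disallow: /cgi-bin/
Disallow: /search-results/
Disallow: /sessions/

Sitemap: http://www.example.com/sitemap.xml

The wildcard user-agent applies the rules to all well-behaved bots; remember to test any changes with the robots.txt tester in Webmaster Tools before relying on them.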

SYNOPSIS:

• Make use of the robots.txt file to keep your site accessible to the Google bots
• Block unneeded/irrelevant content from being crawled
• Use SEO-friendly URLs and move away from parameter-based URLs
• Monitor your page load speed and take steps to improve it.
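To illustrate the "If-Modified-Since" point from above, here is a rough sketch of the exchange (example.com and the date are placeholders): the crawler sends a conditional request, and a supporting server can answer with a tiny 304 instead of re-sending the whole page.

GET /index.html HTTP/1.1
Host: www.example.com
If-Modified-Since: Wed, 10 Oct 2012 16:00:00 GMT

HTTP/1.1 304 Not Modified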

SEO news blog post by @ 12:09 pm


October 10, 2012

New Webmaster Guidelines Part 1 – Design and Content

Google recently updated their webmaster guidelines following the latest algorithm update. It is easy to feel inundated with the amount of information regarding web design dos & don'ts and best practices for the internet. As an SEO I am frequently asked, "How can I get my site to rank?" The fact of the matter is that we follow Google's Webmaster Guidelines, which establish the best practices for websites to follow. Many are concerned about the Panda/Penguin updates and worried that their site will be hit; or they have a site that has already been hit. Our advice remains consistent: "Drink the Google Kool-Aid".

Magician pulling a rabbit out of a hat

At one time, it was exceedingly difficult to get a straight answer from Google in regards to what was considered best practice. This led to a wild-west frontier attitude, and many designers and SEOs adopted bad practices. That in turn led to an inundation of webspam in the Google SERPs and made it very difficult to get quality search results.

The Panda and Penguin algorithms and their subsequent updates were a very concerted effort to rid the SERPs of webspam. In the wake of these substantial updates, my advice to customers remains consistent: follow the Google-established guidelines. The mantra I repeat to my customers is: "Would I do this if search engines didn't exist?"

For many of us this is old news, but I still find myself learning new things to try and better practices to adopt. Much of the messaging from Google has been very consistent regarding what makes good content. This post looks specifically at Google's recommended Design and Content Guidelines to help Google find, crawl and index your site.

Site Hierarchy

  • Give your site a clear hierarchical structure and make it as easy to navigate as possible. Every page should be reachable from at least one static text link.
  • Think of your website as a book with logical sections and headings, each with its own unique and relevant content; a minimal page skeleton following this metaphor appears after this list.
    • The title of your book is your domain URL (e.g. www.booktitle.com).
    • Your title tag <title> is your topic for the page. It defines what content will be on the page (e.g. <title>Book Characters</title>).
    • Your heading tag is your chapter title, e.g. <h1>Book Characters</h1>. Typically this is the same as, or very close to, the page title and must be directly relevant.
    • Have only one topic per page and only one H1 tag on any page.
    • Use subsequent heading tags (h2, h3, h4) to define further related divisions of the chapter.
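As promised, here is a minimal page skeleton following the book metaphor; the domain and headings are made up:

<html>
<head>
<title>Book Characters</title>
</head>
<body>
<h1>Book Characters</h1>
<h2>Protagonists</h2>
<p>...</p>
<h2>Antagonists</h2>
<p>...</p>
</body>
</html>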

Site Map

  • Offer a sitemap for your visitors. Not only does this provide a valuable service to your customers, but it can help improve the indexing of your site by bots.
  • If you have an extensive number of links on your site, you may need to break your sitemap into multiple pages.
  • Remember that a website sitemap (an HTML page for your visitors) is different from the sitemap.xml file that you should submit to Google's Webmaster Tools.

Internal Linking

  • Keep the number of links on any page to the bare minimum. The guidelines used to state ‘around 100’ but this is one area where less is more.
  • In the most recent iteration of the Webmaster Guidelines, Google only states to 'keep it to a reasonable amount'. Too many links leading to other internal pages or offsite are distracting to the visitor; they lower conversion rates as people get lost, and they create frustration.

Textual Content

  • Google has always stated that ‘content is king’. It is absolutely imperative that you create rich, useful and dynamic content that engages your audience. All textual content needs to be well written and grammatically correct. It should clearly and accurately describe your content and it must be relevant to the page that it is found on.
  • Do not write for what you think Google wants to see. Think about what searchers would type into a search engine to find your page and ensure that your content actually includes those terms.
  • Do not concern yourself with keyword densities; inevitably the content comes across as spammy and does not read well. Google may regard this as keyword stuffing, and see broken/confused grammar as potential spam or scraped content…exactly what the Panda/Penguin updates are designed to target and penalize.

Page Coding

  • Use a crawler on your site, such as XENU's Link Sleuth or Google's Webmaster Tools, to check your site for broken links.
  • Check your site with the W3C to ensure that your site has valid HTML.
  • Avoid the use of dynamic pages with cryptic URLs (e.g., the URL contains a "?" character). Try to use keyword focused URLs that reflect the page you are building. If you must use a dynamic URL structure, keep them few and the parameters short.
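For instance (these URLs are hypothetical), a keyword-focused URL such as:

http://www.example.com/products/red-widgets

is preferable to a cryptic dynamic one like:

http://www.example.com/products?id=123&sessionid=8f2a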

Images

  • You can give Google additional details about your images, and provide the URL of images Google might not otherwise discover, by adding the information to an image sitemap.
  • Do not embed important content into images; always use text links instead of images for links, important names, etc., where possible. Google crawlers cannot determine the text displayed in an image. If you must use an image for textual content, use the image ALT attribute to describe the image in a few words.
  • Ensure that all image title and ALT attributes are descriptive (but not spammy) and accurate; see the sketch after this list. Follow these guidelines for creating great ALT text for your images.
  • Give your images detailed and informative filenames.
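As a quick sketch of that advice (the file name and wording are made up), a well-described image looks like this:

<img src="/images/acme-10lb-anvil.jpg" alt="Acme 10 lb cast-iron anvil" title="Acme 10 lb cast-iron anvil">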

The following areas (video and rich snippets) and their usage are best described by Google themselves:

Video

View the full post here: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=156442

Rich Snippets

View the full post here: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1093493

Coming next time, I will review the newly updated Technical Guidelines and then conclude with Google’s Quality Guidelines.

SEO news blog post by @ 1:15 pm


October 9, 2012

EMD Insanity and Coining Phrases

It's clearly time for Beanstalk to officially list ourselves as a 'sitedefibrillation' solution provider.

Why? Because apparently the secret to SERP dominance with an EMD is to coin your own phrase!

Do a search for ‘coinflation’ + ‘gold’ or really, almost any other keyword to see what Google considers an ‘improved’ result following the EMD update.

Google Search results for Coinflation 
If you didn’t get something like the results above, please let us know!

Okay, so that seems slightly silly, but how the heck did they pull that off? There's clearly PPC/AdWords competition for the phrase, and an EMD should either be a penalty or moot, shouldn't it?

Well apparently not! In fact EMD can still clearly be an asset if the ‘quality’ scores are all above par!

This means that if you have an organic campaign, with ongoing back links/references from trusted sources, and you aren’t hitting other penalties, you really should be feeling no loss at all from the EMD update.

Indeed, if your competition was using non-organic approaches to EMDs they should have taken a trust hit, and you may see an improvement in position due to their failings!

So while I can show you some examples of the EMD update apparently failing to work, we can assure you it's working, and overall it seems like a positive step for Google.

10″ Google Nexus from Samsung?

Last night CNET announced some 'highly probable' info that Samsung is manufacturing a new 10.1″ Nexus tablet for Google.

The article is more of a stub of hearsay, but it had some rather 'exact' details, including the resolution of the display:

The 2,560×1,600 display will have a PPI (pixels per inch) of about 299, said Shim. That tops the 264 PPI on the 9.7-inch 2,048×1,536 Retina iPad.

Clearly this will be the ‘high end’ model for the Nexus line (currently manufactured by Asus), especially when you consider that Google will be releasing a 7″ Nexus subsidized down to a $99 price this December!

In fact since we’re pondering things to come more than talking facts, I’d have to assume this will be a dual or quad core device with GPU acceleration of some sort to assist with up-scaling video content and 3d games to that eye-popping resolution.

So if this high-end Nexus tablet is anything less than $399 I’d be really shocked and very worried for Apple.

Okay, perhaps 'more worried for Apple' would be more accurate, given its current public affairs issues…

In case you're wondering 'who cares?': Tim Pool takes to the streets and broadcasts unedited footage of protests/events.

I’d like to think Apple is patenting this to prevent companies from doing this, but in actual fact this is very creepy stuff from the overly litigious makers of the most expensive walled gardens on the planet.

It seems almost as if Apple is testing how well its brand/product can weather a bad public image at this point.

SEO news blog post by @ 11:53 am


October 2, 2012

You may need an EMT after the EMD Update!

Last Friday Matt Cutts tweeted about Google’s latest update, which focuses on penalties for ‘low-quality’ Exact Match Domain names, hence the EMD TLA.

Twitter posts from Matt Cutts on the latest EMD Update

While Google is never big on giving us details, let's digest this together!

Using a relevant keyword in a domain has been a very long-standing ranking signal.
e.g. A consulting site for financial companies using 'financial-consulting.com' as a domain would be seen as relevant.

Over the years this has led to people grabbing up domains with keywords in them for SEO purposes.

JACOBS BY MARC JACOBS FOR MARC BY MARC JACOBS ETC..

Having your keywords in your domain name didn’t mean overnight dominance of the web, thankfully. Indeed, there was usually some trade-off between desirable keywords and a reasonably short domain name.

In fact, no organic/white-hat SEO would suggest you use something like:

‘best-value-online-financial-consulting-company-with-proven-results.com’

Why? Because the gains in SEO wouldn’t match the losses in user trust/conversions.

Would a good organic SEO/White Hat tell you NOT to purchase those types of domains for 301s to your main site?

I’d like to think so, but this was clearly a strategy for a lot of sites competing for top rankings.

Regardless of your SEO ethics, the practice of domain parking/selling because of search ranking signals is clearly an unnecessary burden on the internet.

While the ‘domains for sale’ issue would still exist without search engines, search engines honestly should be making your choice of domain name MUCH less relevant.

Ideally, fresh internet traffic should occur as a match between the searcher's needs and the services/information that your site provides.

And with this latest update, it would appear that Google agrees with the idea that a book should be found by more than what's on its cover.

As of this last update you can expect sites with nothing but some keyword-dense 301'd domains to now face a penalty instead of a positive ranking signal.

We didn’t see this coming!

EMD Update Results

I’m already seeing people post sad tales of the deep impact this update is having on certain sites, and I’ve had a laugh at a few ‘professionals’ claiming they never felt this day would come.

Personally, while I’ve watched some very good presentations on SEO and web ranking strategies, the one thing that helps me most as an SEO is Matt Cutts’ breakdown of the real philosophy behind ‘good SEO’ which boils down to:

Never do something for the sake of search engine rankings alone.

If you like ‘Lord of the Rings’ then look at this as:

‘One Rule to Lead them all, one Rule to be found by…’

..and you should never have to fear a Google update!

In fact you should look at each Google update as a chance for your rankings to improve as other sites are punished for their ‘clever’ attempts to game the system.

Another Google Easter Egg?

And finally, to end the post with a chuckle, here’s a Google search phrase for you to test out:

I was hoping this was more than just an ‘Easter Egg‘ in Google’s search, but alas Google hasn’t yet licked mathematical artificial intelligence. :p

SEO news blog post by @ 12:01 pm


September 20, 2012

Dublin the Airports: iOS 6 Maps is Rotten


Apple’s extra Airport..

Was anyone expecting Apple to replace Google’s Maps application with something superior? Apparently, the iPhone user base and Apple actually expected this to happen.

If you look at the most extremely biased sites reviewing the new ‘Apple’ Maps app for iOS 6 you will see guarded optimism and lots of ‘reasoning’ clash with angry rants from amazed and disappointed users.

One thing I don't see is anyone calling it the 'Maps app that Apple bought from TomTom'; the best I've seen is a mention that Apple relied heavily on TomTom and OpenStreetMap for data alone.

Instead I see a very consistent collection of sympathetic remarks like: ‘this is beta, it can only get better’, ‘for a first attempt this is outstanding’, ‘people will question anyone who takes their own path..’

But Apple isn’t taking their own path, they are merely attempting (badly) to replace something that wasn’t really broken.

Sure, Google wasn't toiling endlessly to bring the iOS app all the updates it was adding to the Android version of Google Maps.

I’m guessing Apple really expected Google to beta test ideas on the Android and then polish them up and finalize them on the iPhone?

So sure, Google put Android development first, and there were things that Google Maps did better on the Android, but that still doesn’t mean it ‘had to go’.

Apple could have offered both solutions in a 'use what you like' approach to pleasing its user base, but this is a company making headlines for outrageous profits and the working conditions of its manufacturing partners.

Removing the choice to pick another company's solution would clearly explain why Apple didn't take a settlement from Samsung and wanted to ban their phones. Apple wants profits, and if Apple wants really happy customers it could lower prices and focus on better apps instead of removing the best ones in favour of inferior versions.

And in other News

Google has blessed a new meta tag!
<meta name="news_keywords" content="Apple Maps, iOS 6, Google Maps, Android, TomTom, Google news meta tag">

Do you publish content that you would call ‘news’?
Would you like Google to better understand the topic of your posts?
Would you like the freedom to ignore keyword use in a topic for style reasons?

Then brothers and sisters, this new meta-tag is what you’ve been waiting for!

The format is very simple, and the tag belongs near the top of your page, usually in the <head> … </head> section.

Here’s an example:

<meta name="news_keywords" content="10 keywords, separated, by commas, just like, meta keywords, etc..">

That's some 'easy breezy' SEO, and it's great if you are indeed publishing 'news', not just ranting about Apple. :)

SEO news blog post by @ 11:51 am


August 27, 2012

Gmail Rank & the Moon Landing

Gmail Rank

Bill Slawski had an interesting blog post the other day about the rise of Gmail Rank and the importance of good subject lines. It seems that Google is experimenting with the possibility of including your own emails in your search results. Users will have to opt in, and only emails that have been received via Gmail will be used.

It is thought that the "rankings" used to decide which emails to show will be similar to the existing colored "importance rankings" currently used to display the relative importance of your emails. Gmail does allow the user to sort and filter emails by their importance markers and offers some other advanced search filters; whether this functionality will be carried over to an integrated web search remains speculative.

…In other news:

Neil Armstrong's footprint on the moon

Beanstalk would like to say a fond farewell to Neil Armstrong. Armstrong died Saturday, Aug. 25, 2012, at age 82. Armstrong commanded the Apollo 11 spacecraft that landed on the moon July 20, 1969. As the first man to walk on the moon, his passing truly marks the end of an era. The moon landing happened a bit before my time, but those who witnessed it remember where they were and what they were doing when they heard those famous lines: “That’s one small step for man, one giant leap for mankind.”

SEO news blog post by @ 12:16 pm

Categories: Google, Rankings

August 16, 2012

You don’t want the next Penguin update…

Scary Matt Cutts

Is Matt Cutts just goofing around or is he really trying to scare us?

The statement in the title of this article, from Matt Cutts, has the SEO world looking for further information as to just how bad the next Penguin update will be.

During the SES in San Francisco this week Matt Cutts got a chance to speak about updates and how they will affect SEOs. One of the things he was quoted as saying really caught my eye:

You don’t want the next Penguin update, the engineers have been working hard…

Mr. Cutts has recently eaten some words, retracting his statement that too much SEO is a bad thing and explaining that good SEO is still good.

Even with attendees saying that he spoke the words with no sign of ominous intent, how do you expect the SEO world to take follow-up statements like:

The updates are going to be jarring and jolting for a while.

That's just not positive-sounding at all, and it almost has the tone of an admission that the next updates are perhaps going to be 'too much' even in Matt's opinion, and he's one of Google's top engineers!

My take is that if you are doing anything even slightly shady, you’re about to see some massive ranking spanking.

Reciprocal links, excessive directories, participating in back-link cliques/neighborhoods, pointless press releases, redundant article syndication, duplicate content without authorship markup, poorly configured CMS parameters, etc.. These are all likely to be things, in my opinion, that will burn overly SEO’d sites in the next update.

The discussion also made its way to the issues with Twitter data feeds. Essentially, since Google and Twitter no longer have an agreement, Google is effectively 'blocked' from crawling Twitter.

Dead twitter bird

On the topic of Twitter crawling Matt Cutts was quoted as saying:

..we can do it relatively well, but if we could crawl Twitter in the full way we can, their infastructure[sic] wouldn’t be able to handle it

Which to me seems odd, since I don’t see any other sites complaining about how much load Google is placing on their infrastructure?

Clearly the issue is still political/strategic and neither side is looking to point fingers.

With Twitter's social media relevance diminished, you'd think +1's would be a focus point, but Matt Cutts also commented on the situation, stating that we shouldn't place much value on +1 stats for now.

A final point was made about Knowledge Graph, the new information panel that’s appearing on certain search terms.

Since the Google Search Quality team is now the Google Knowledge Graph team, Matt Cutts had some great answers on the topic of Knowledge Graph, including its data sources and potential harm to Wikipedia.

There had been a lot of cursing about Google simply abusing Wikipedia’s bandwidth/resources but it was made clear during the session that Wikipedia is not traffic dependent because they don’t use ads for revenue.

Essentially, if Wikipedia’s data is getting better utilized, and they haven’t had to do anything to make it happen, they are happy.

If you want more details, there are lots of #SESSF-hashtagged posts on Twitter and plenty of articles coming from the attendees.

I’m personally going to go start working on a moat for this Penguin problem..

SEO news blog post by @ 11:56 am


July 11, 2012

Google Puts Smack-Down on Infographics

Whether you know what they are called or not, most of us have seen those wonderful images that depict information in a pleasing graphical format and usually span 20 pages vertically. Infographics are visual representations of information, data or knowledge. For some time now, these infographics have been used as link bait and are all the rage because they offer content in an easily digestible format.

google smash

In a recent interview with Eric Enge, Matt Cutts stated that Google feels infographics are being abused as a link-building tactic and will soon be discounted. Mr. Cutts went on to state:

"This is similar to what people do with widgets as you and I have talked about in the past. I would not be surprised if at some point in the future we did not start to discount these infographic-type links to a degree. The link is often embedded in the infographic in a way that people don’t realize, vs. a true endorsement of your site."

"In principle, there’s nothing wrong with the concept of an infographic." Cutts told Enge. "What concerns me is the types of things that people are doing with them. They get far off topic, or the fact checking is really poor. The infographic may be neat, but if the information it’s based on is simply wrong, then its misleading people."

Of course this is indicative of a much larger problem: trying to obtain accurate information and statistics from the internet. While it is unlikely that the value of infographics will be completely abolished, the same rules apply as to the rest of the content on your website; if you expect people to link back to your site based on your infographic, you will need to ensure that it:

  • Is relevant to your industry and to your visitors.
  • Offers accurate sources for acquired information/statistics.
  • Gives the viewer new information, tells them how to do something, or describes a process.
  • Is free of spammy content and meta information.

"Any infographics you create will do better if they’re closely related to your business and it needs to be fully disclosed what you are doing," Cutts advised.
Similar to what happened with Squidoo lenses, we are seeing another web-trend that has been over-used and abused by online marketers and now we are seeing the resulting smack-down from Google.

Like all other web trends, it is not so much a question of the usefulness of the trend, but of how long it will take Google to devalue the tactic once it becomes abused. Any tactic that attempts to garner backlinks must always be relevant to the user, rich in content, and free of nefarious ploys to abuse the tactic.

By employing only white-hat tactics, you will be able to weather the storms of any Google update. It is this practice that has allowed Beanstalk SEO Inc. to pass through the barrage of Panda & Penguin updates unscathed and consistently maintain our rankings.

SEO news blog post by @ 12:03 pm

