
Windows 8 / IE10 and Flash Certification

Windows 8 is a tablet OS, and like any modern OS focused on tablets, touch, and mobility, it raises compatibility concerns for content not specifically written for a tablet/mobile device.

Apple is famous for its certification process, and for using it for more than just the sake of ‘quality’ or ‘compatibility’ controls.

Indeed, Microsoft has had certification for drivers and applications in Windows for some time, but never to the point where something cannot be used without it.

If you want to install something that isn’t certified you’ll get a spooky warning, but I’ve never seen anything completely fail to work due to a bad or missing certification on Windows.

Enter Windows 8 and IE10, a whole new ballgame, with two browser modes, one for normal use and a ‘desktop’ integration mode which has to play nice with the new Windows UI.

If you wish to publish web content that leverages the new ‘desktop mode’, you’ll want to visit Microsoft’s ‘developer guidance’ page for information on the new meta tags and HTTP headers that flag such content.

In a nutshell they explain that either the header:

X-UA-Compatible: requiresActiveX=true

OR the meta tag:

<meta http-equiv="X-UA-Compatible" content="requiresActiveX=true" />

… works to create a handy little prompt explaining that the page requires ‘desktop’ mode, and even gives a single-click shortcut to switch over:

IE10 desktop warning
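If you’d rather set the flag server-side than edit every page, the same value can be sent as a response header. Here’s a minimal sketch as a Python WSGI app (the app and its one-line body are made up for illustration):

```python
# Minimal WSGI app that flags its pages as requiring ActiveX (Flash),
# which prompts IE10 on Windows 8 to offer a switch to desktop mode.
def app(environ, start_response):
    headers = [
        ("Content-Type", "text/html; charset=utf-8"),
        # Same effect as the <meta http-equiv="X-UA-Compatible"> tag:
        ("X-UA-Compatible", "requiresActiveX=true"),
    ]
    start_response("200 OK", headers)
    return [b"<html><body><!-- Flash content here --></body></html>"]
```

Any server or framework that lets you add response headers can do the same thing.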

The same page also deals with ‘Compatibility Verification’ and the steps to test/certify that your Flash content is compatible with the extra features of a tablet OS.

Of particular interest is the option of a single registry entry that lets you test your site in ‘debugging’ mode to see just how broken your Flash content is.

The key is located here:
HKEY_LOCAL_MACHINE\Software\Microsoft\Internet Explorer\Flash\DebugDomain
…and if you wanted to make a .reg file for easy access, the contents would be (note the blank line after “REGEDIT4” and the trailing newline at the end):

REGEDIT4

[HKEY_LOCAL_MACHINE\Software\Microsoft\Internet Explorer\Flash\DebugDomain]
@="www.mywebsite.com"

At that point you can right-click the .reg file you made and choose ‘Merge’ from the pop-up menu.

Passing this .reg file to your developers would be fine, but since only one site can be specified, this is NOT a solution for your end users.
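If you find yourself generating that .reg file often, a short script can produce it for any test domain. This is just a sketch; the function name and default filename are made up:

```python
# Write a .reg file that points IE10's Flash debug mode at one domain.
# Only a single DebugDomain value can be set at a time.
def make_debug_reg(domain, path="flash-debug.reg"):
    contents = (
        "REGEDIT4\r\n"
        "\r\n"
        "[HKEY_LOCAL_MACHINE\\Software\\Microsoft\\Internet Explorer"
        "\\Flash\\DebugDomain]\r\n"
        '@="%s"\r\n' % domain
    )
    # newline="" keeps the literal \r\n line endings that .reg files expect
    with open(path, "w", newline="") as reg_file:
        reg_file.write(contents)
    return contents

make_debug_reg("www.mywebsite.com")
```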

Obviously the best advice we can give, as SEOs, is to ditch your Flash content completely.

HTML5, with all its perks, can replace almost anything you’ve done in Flash, and Google is even willing to help you make the switch by offering the Swiffy Flash-to-HTML5 conversion tool.

If you feel your content is too sophisticated for Swiffy, or you haven’t tried the tool recently, you should give it another look!

Here’s an example of how well the tool works on a flash game with keyboard and mouse controls:

[iframe src="https://swiffypreviews.googleusercontent.com/view/gallery/example3_swiffy_v4.9.html"][/iframe]

SEO news blog post by @ 12:07 pm on October 11, 2012


 

New Webmaster Guidelines Part 1 – Design and Content

Google recently updated their webmaster guidelines following the latest algorithm update. It is easy to feel inundated with the amount of information regarding web design dos & don’ts and best practices for the internet. As an SEO I am frequently asked, “How can I get my site to rank?” The fact of the matter is that we follow Google’s Webmaster Guidelines, which establish the best practices for websites to follow. Many are concerned about the Panda/Penguin updates and are worried that their site will be hit, or they have a site that has already been hit. Our advice remains consistent: "Drink the Google Kool-Aid".


At one time, it was exceedingly difficult to get a straight answer from Google as to what was considered best practice. This led to a wild-west frontier attitude, and many designers and SEOs adopted bad practices. The result was an inundation of webspam in the Google SERPs that made it very difficult to get quality search results.

The Panda and Penguin algorithms and their subsequent updates were a very concerted effort to rid the SERPs of webspam. In the wake of these substantial updates, my advice to customers remains consistent: follow Google’s established guidelines. The mantra I repeat to my customers is: "Would I do this if search engines didn’t exist?"

For many of us this is old news, but I still find myself learning new things to try and better practices to adopt. Much of the messaging from Google has been very consistent regarding what makes good content. This post will look specifically at Google’s recommended Design and Content Guidelines to help Google find, crawl and index your site.

Site Hierarchy

  • Give your site a clear hierarchical structure and make it as easy to navigate as possible. Every page should be reachable from at least one static text link.
  • Think of your website as a book with logical sections and headings; each with their own unique and relevant content.
    • The title of your book is your domain URL (eg. www.booktitle.com)
    • Your title tag <title> can be your topic for the page. It defines what content will be on this page (eg. <title>Book Characters</title>).
    • Your heading tag is your chapter title eg. <h1>Book Characters</h1>. Typically this is the same or very close to the page title and must be directly relevant.
    • Have only one topic per page and only one H1 tag on any page.
    • Use subsequent heading tags (h2, h3, h4) to define further related divisions of the chapter.
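The one-H1 rule is easy to check programmatically. Here’s a rough sketch using Python’s standard html.parser (the sample page is invented):

```python
from html.parser import HTMLParser

class H1Counter(HTMLParser):
    """Counts <h1> tags so you can flag pages that break the one-H1 rule."""
    def __init__(self):
        super().__init__()
        self.h1_count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.h1_count += 1

def count_h1(html):
    parser = H1Counter()
    parser.feed(html)
    return parser.h1_count

page = ("<html><head><title>Book Characters</title></head>"
        "<body><h1>Book Characters</h1><h2>Heroes</h2></body></html>")
print(count_h1(page))  # 1
```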

Site Map

  • Offer a sitemap for your visitors. Not only does this provide a valuable service to your customers, but it can help improve the indexing of your site by bots.
  • If you have an extensive number of links on your site, you may need to break your sitemap into multiple pages.
  • Remember that a website sitemap is different from the sitemap.xml file that you should submit through Google’s Webmaster Tools.
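For the sitemap.xml side of things, the file follows the sitemaps.org protocol and is simple to generate. A sketch with made-up URLs:

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    # Build a minimal sitemap.xml document from a list of page URLs.
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap(["http://www.booktitle.com/",
                     "http://www.booktitle.com/characters"])
print(xml)
```

Large sites would also want `<lastmod>` entries and index files once the URL count grows, per the protocol.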

Internal Linking

  • Keep the number of links on any page to the bare minimum. The guidelines used to state ‘around 100’ but this is one area where less is more.
  • In the most recent iteration of the Webmaster Guidelines, Google states only to ‘keep it to a reasonable amount’. Too many links leading to other internal pages or offsite are distracting to the visitor; they lower conversion rates as people get lost, and they create frustration.

Textual Content

  • Google has always stated that ‘content is king’. It is absolutely imperative that you create rich, useful and dynamic content that engages your audience. All textual content needs to be well written and grammatically correct. It should clearly and accurately describe your content and it must be relevant to the page that it is found on.
  • Do not write for what you think Google wants to see. Think about what searchers would type into a search engine to find your page and ensure that your content actually includes those terms.
  • Do not concern yourself with keyword densities. Inevitably the content comes across as spammy and does not read well. Google may regard this as keyword stuffing and see broken/confused grammar as potential spam or scraped content…exactly what the Panda/Penguin updates are designed to target and penalize.

Page Coding

  • Use a crawler on your site, such as XENU’s Link Sleuth or Google’s Webmaster Tools, to check your site for broken links.
  • Check your site with the W3C to ensure that your site has valid HTML.
  • Avoid the use of dynamic pages with cryptic URLs (e.g., the URL contains a "?" character). Try to use keyword focused URLs that reflect the page you are building. If you must use a dynamic URL structure, keep them few and the parameters short.
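If you want a quick in-house pass before reaching for Link Sleuth, the basic ingredients are a link extractor and a status check. A rough sketch (a real crawler also needs rate limiting, relative-URL resolution, and robots.txt handling):

```python
from html.parser import HTMLParser
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

def check_link(url, timeout=10):
    # Returns the HTTP status code, or None if the link is broken/unreachable.
    try:
        with urlopen(url, timeout=timeout) as resp:
            return resp.status
    except (HTTPError, URLError, OSError):
        return None
```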

Images

  • You can give Google additional details about your images, and provide the URLs of images it might not otherwise discover, by adding the information to a sitemap.
  • Do not embed important content in images; use text links instead of image links, important names, etc., where possible. Google’s crawlers cannot determine the text displayed in an image. If you must use an image for textual content, use the image’s ALT attribute to describe it in a few words.
  • Ensure that all image title and ALT attributes are descriptive (but not spammy) and accurate. Follow these guidelines for creating great ALT text for your images.
  • Give your images detailed and informative filenames.

The following areas (video and rich snippets) and their usage are best described by Google themselves:

Video

View the full post here: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=156442

Rich Snippets

View the full post here: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1093493

Coming next time, I will review the newly updated Technical Guidelines and then conclude with Google’s Quality Guidelines.

SEO news blog post by @ 1:15 pm on October 10, 2012


 

EMD Insanity and Coining Phrases

It’s clearly time for Beanstalk to officially list ourselves as a sitedefibrillation solution provider.

Why? Because apparently the secret to SERP dominance with an EMD is to coin your own phrase!

Do a search for ‘coinflation’ + ‘gold’ or really, almost any other keyword to see what Google considers an ‘improved’ result following the EMD update.

Google Search results for Coinflation 
If you didn’t get something like the results above, please let us know!

 
Okay so that seems slightly silly, but how the heck did they pull that off? There’s clearly PPC/AdWords competition for the phrase, and EMD should either be a penalty or moot, shouldn’t it?

Well apparently not! In fact EMD can still clearly be an asset if the ‘quality’ scores are all above par!

This means that if you have an organic campaign, with ongoing back links/references from trusted sources, and you aren’t hitting other penalties, you really should be feeling no loss at all from the EMD update.

Indeed, if your competition was using non-organic approaches to EMDs they should have taken a trust hit, and you may see an improvement in position due to their failings!

So while I can show you some examples of the EMD apparently failing to work, we can assure you it’s working, and overall seems like a positive step for Google.

10″ Google Nexus from Samsung?

Last night CNET announced some ‘highly’ probable info that Samsung is manufacturing a new 10.1″ Nexus tablet for Google.

The article is more of a stub of hearsay, but it had some rather ‘exact’ details, including the resolution of the display:

The 2,560×1,600 display will have a PPI (pixels per inch) of about 299, said Shim. That tops the 264 PPI on the 9.7-inch 2,048×1,536 Retina iPad.

Clearly this will be the ‘high end’ model for the Nexus line (currently manufactured by Asus), especially when you consider that Google will be releasing a 7″ Nexus subsidized down to a $99 price this December!

In fact since we’re pondering things to come more than talking facts, I’d have to assume this will be a dual or quad core device with GPU acceleration of some sort to assist with up-scaling video content and 3d games to that eye-popping resolution.

So if this high-end Nexus tablet is anything less than $399 I’d be really shocked and very worried for Apple.

Okay, ‘more worried for Apple’ would be more accurate, given its current public affairs issues..

[iframe width="549" height="309" src="http://www.youtube.com/embed/JEy2u2n_XTQ?rel=0" frameborder="0" allowfullscreen][/iframe]

In case you’re wondering ‘who cares?’; Tim Pool goes to the streets and broadcasts unedited footage of protests/events.

I’d like to think Apple is patenting this to prevent companies from doing this, but in actual fact this is very creepy stuff from the overly litigious makers of the most expensive walled gardens on the planet.

It seems almost like Apple is testing how well their brand/product can weather bad public image at this point?

SEO news blog post by @ 11:53 am on October 9, 2012


 

You may need an EMT after the EMD Update!

Last Friday Matt Cutts tweeted about Google’s latest update, which focuses on penalties for ‘low-quality’ Exact Match Domain names, hence the EMD TLA.

Twitter posts from Matt Cutts on the latest EMD Update

While Google is never big on giving us details, let’s digest this together!

Using a relevant keyword in a domain has been a very long-standing ranking signal.
i.e.: A consulting site for financial companies using ‘financial-consulting.com’ as a domain would be seen as relevant.

Over the years this has led to people grabbing up domains with keywords in them for SEO purposes.

JACOBS BY MARC JACOBS FOR MARC BY MARC JACOBS ETC..

Having your keywords in your domain name didn’t mean overnight dominance of the web, thankfully. Indeed, there was usually some trade-off between desirable keywords and a reasonably short domain name.

In fact, no organic/white-hat SEO would suggest you use something like:

‘best-value-online-financial-consulting-company-with-proven-results.com’

Why? Because the gains in SEO wouldn’t match the losses in user trust/conversions.

Would a good organic SEO/White Hat tell you NOT to purchase those types of domains for 301s to your main site?

I’d like to think so, but this was clearly a strategy for a lot of sites competing for top rankings.

Regardless of your SEO ethics, the practice of domain parking/selling because of search ranking signals is clearly an unnecessary burden on the internet.

While the ‘domains for sale’ issue would still exist without search engines, search engines honestly should be making your choice of domain name MUCH less relevant.

Ideally, fresh internet traffic should occur as a match between the searcher’s needs and the services/information that your site provides.

And with this latest update, it would appear that Google agrees with the idea that a book should be found by more than what’s on its cover.

As of this last update you can expect sites with nothing but some keyword-dense 301’d domains to face a penalty instead of a positive ranking signal.

We didn’t see this coming!

EMD Update Results

I’m already seeing people post sad tales of the deep impact this update is having on certain sites, and I’ve had a laugh at a few ‘professionals’ claiming they never felt this day would come.

Personally, while I’ve watched some very good presentations on SEO and web ranking strategies, the one thing that helps me most as an SEO is Matt Cutts’ breakdown of the real philosophy behind ‘good SEO’ which boils down to:

Never do something for the sake of search engine rankings alone.

If you like ‘Lord of the Rings’ then look at this as:

‘One Rule to Lead them all, one Rule to be found by…’

..and you should never have to fear a Google update!

In fact you should look at each Google update as a chance for your rankings to improve as other sites are punished for their ‘clever’ attempts to game the system.

Another Google Easter Egg?

And finally, to end the post with a chuckle, here’s a Google search phrase for you to test out:

I was hoping this was more than just an ‘Easter Egg‘ in Google’s search, but alas Google hasn’t yet licked mathematical artificial intelligence. :p

SEO news blog post by @ 12:01 pm on October 2, 2012


 

Dublin the Airports: iOS 6 Maps is Rotten


Apple’s extra Airport..

Was anyone expecting Apple to replace Google’s Maps application with something superior? Apparently, the iPhone user base and Apple actually expected this to happen.

If you look at the most extremely biased sites reviewing the new ‘Apple’ Maps app for iOS 6 you will see guarded optimism and lots of ‘reasoning’ clash with angry rants from amazed and disappointed users.

One thing I don’t see is anyone calling it ‘the Maps app that Apple bought from TomTom’; the best I’ve seen is a mention that they relied heavily on TomTom and OpenStreetMap for data alone.

Instead I see a very consistent collection of sympathetic remarks like: ‘this is beta, it can only get better’, ‘for a first attempt this is outstanding’, ‘people will question anyone who takes their own path..’

But Apple isn’t taking their own path, they are merely attempting (badly) to replace something that wasn’t really broken.

Sure, Google wasn’t toiling endlessly to bring iOS all the updates it was adding to the Android version of Google Maps.

I’m guessing Apple really expected Google to beta test ideas on the Android and then polish them up and finalize them on the iPhone?

So sure, Google put Android development first, and there were things that Google Maps did better on the Android, but that still doesn’t mean it ‘had to go’.

Apple could have offered both solutions in a ‘use what you like’ approach to pleasing its user base, but this is a company making headlines for outrageous profits and the working conditions of its manufacturing partners.

Removing the choice to pick another company’s solution would clearly explain why Apple didn’t take a settlement from Samsung and wanted to ban their phones. Apple wants profits, and if Apple wants really happy customers it could lower prices and focus on better apps vs. removing the best ones for inferior versions.

And in other News

Google has blessed a new meta tag!
<meta name="news_keywords" content="Apple Maps, iOS 6, Google Maps, Android, TomTom, Google news meta tag">

Do you publish content that you would call ‘news’?
Would you like Google to better understand the topic of your posts?
Would you like the freedom to ignore keyword use in a topic for style reasons?

Then brothers and sisters, this new meta-tag is what you’ve been waiting for!

The format is very simple, and the tag belongs in the <head> … </head> section near the top of your page.

Here’s an example:

<meta name="news_keywords" content="10 keywords, separated, by commas, just like, meta keywords, etc..">

That’s some ‘easy breezy’ SEO optimization, and it’s great if you are indeed publishing ‘news’; Not just ranting about Apple. :)
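If you want to verify the tag actually made it into your published pages, a quick check is easy to script. A sketch using Python’s standard html.parser (the sample page is invented; the ten-keyword cap follows the ‘10 keywords’ limit in the tag format):

```python
from html.parser import HTMLParser

class NewsKeywordsFinder(HTMLParser):
    """Grabs the content of a <meta name="news_keywords"> tag, if present."""
    def __init__(self):
        super().__init__()
        self.keywords = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name") == "news_keywords":
                self.keywords = [k.strip()
                                 for k in attrs.get("content", "").split(",")]

def news_keywords(html):
    finder = NewsKeywordsFinder()
    finder.feed(html)
    return finder.keywords

page = ('<head><meta name="news_keywords" '
        'content="Apple Maps, iOS 6, Google Maps"></head>')
print(news_keywords(page))  # ['Apple Maps', 'iOS 6', 'Google Maps']
```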

SEO news blog post by @ 11:51 am on September 20, 2012


 

Mmmmmm Bacon..

Did that get your attention? Some crispy fresh smoky bacon?

It’s a pity then that the story isn’t about hot pork but instead about degrees of bacon.

Degrees of Kevin Bacon to be exact.

Google has given us yet another nerdy Easter Egg, not unlike the StarCraft inspired ZergRush or StarFox inspired BarrelRoll, Easter Eggs. (Shame on PCWorld for their typo this morning!).

If you add ‘bacon number’ to an actor’s name in a Google Search, Google will tell you the degrees of separation between the actor and Kevin Bacon.

Heck it even works with actresses!

Try a Google search for: “Olivia Wilde bacon number”

..you should get a Bacon Number of “2”!

This is because Olivia worked with Ryan Reynolds in ‘The Change-Up’..

Ryan Reynolds is working with Kevin Bacon on the action/comedy film ‘R.I.P.D.’ that’s coming out in early 2013.

Thus Olivia Wilde’s ‘degree of separation’ from Kevin Bacon would be 2.

All Olivia needs to do now is add her Bacon number to her profile page like so:

 
Since you’d need to be pretty famous to have a Bacon Number I expect that it will be *the* thing to have, if you’re a movie star.

Fat Hacker – Cosmo the God & UGNazi

This is not my best segue between stories, but I was simply blown away by the tale of a chubby 15-year-old hacker in California who is in jail for widespread hacking and mischief.

This inventive teen, with poor supervision, has managed to hack a wide cross-section of some of the world’s biggest companies, including:

Amazon, Apple, AT&T, PayPal, AOL, Netflix, Network Solutions, and Microsoft

`Cosmo`, as he is called online, likes to point out that none of these hacks were particularly tricky, and is calling on companies to fix their easily exploitable systems, while he sits in a juvenile detention center after admitting to many of his `hacks`.

The story I read on Wired.com was so well written I’m not even going to try and do any excerpts, I’m just going to drop the link and insist you give it a read.

 
Nicely done Mat Honan, from a victim to a sympathizer, all in one interview. This is great investigative journalism, and we need more like it.

SEO news blog post by @ 1:01 pm on September 13, 2012


 

Bing Maps : 500 Terabytes Better

Donald Sutherland from the 1978 version of the Invasion of the Body Snatchers

There are fat ladies crooning in the shower, swine are airborne, and I found something in Bing that’s better than the same option in Google!?

Don’t send NASA to check for alien life/body-snatchers, it’s just a few really small perks that I’ve come across and they are pretty darn specific.

500 Terabytes of new image data

Microsoft started its ‘Global Ortho Project‘ in early 2010 with the very ambitious goal of mapping the Continental United States and Western Europe at a resolution of 30cm.

The concept is simple, just fly around with high resolution imaging devices, in this case the ‘UltraCamG‘ which Microsoft acquired in 2006 after purchasing Vexcel Imaging, GmbH in Austria.

The data is thus both detailed and current, a great thing when you are competing with Google’s constantly updated (~2 weeks) satellite images.

With a deadline of June 2012 the project is wrapping up almost on time and today the news sites are abuzz with the headlines that the project is completed and available to Bing Maps users.

For a comparison of the results here’s a look at the Beanstalk Office in Google Maps and then in Bing Maps:

Beanstalk’s Office in Google Maps

 

Beanstalk’s Office in Bing Maps

 
Can you see the difference? Even if Bing didn’t have the resolution bonus, they own their image data so they aren’t required to spam their name all over the map like Google has to with the Landsat image data.

I’d love to show off the difference between Google’s Streetview and Bing’s Streetside view, but Microsoft apparently couldn’t afford to send someone by to take some images of our office <raspberry>so I’m not going to be bothered to show that off</raspberry>.

Traffic Data?

While writing this article I stumbled upon another difference between Bing and Google: there’s traffic data for the highways in my city on Bing, but Google has no data for my city (the capital city of this entire province); instead they spent the time to build traffic data for our sister city, Vancouver.
Google Traffic view of Vancouver
Talk about a let-down from Google, and a surprising plus from Bing. Tsk tsk..

On that side of things though, Google’s traffic info is much better than Bing’s. Google Maps even lets you pick a day of the week and hour of the day for planning ahead, vs. assuming that you’ll only be looking moments before you travel, or as you travel.

Overall the user experience with Bing Maps still lags behind Google Maps, with each attempt to zoom/pan/adjust on Bing Maps feeling like a blurry and slow mess due to the bitmap labels that stretch vs. re-size.

I even loaded Bing Maps in Internet Explorer (64 bit version) and Google Chrome to make sure I gave them the best chance to compare to the very peppy results with Google Maps.

Building Maps

As I was wrapping up this piece I noticed that there was a funny ‘block’ covering one of the malls in town when using Bing Maps.

Being a curious fellow I clicked it and found that they have mapped out the mall’s floor plan and allow you to see where each store is located, floor by floor!

The Bay Center Mall in Bing Maps Building View

Oooh! A 4hr 40% off Sale!? I could get some cheap studded ballerina shoes!

 
To be really honest, both Bing and Google are developing some unique features that help maintain the competition between them, which is excellent for consumers, who can use either service or both.

Now if only I could get a service to tell me where my pens have gone..

SEO news blog post by @ 12:29 pm on August 30, 2012


 

Litigation vs. Innovation – The Apple Way

I’m really ashamed of my days of being an Apple loyalist, encouraging people to consider Apple solutions, and fighting for the ‘little guy’ computer company.

That ‘little guy’ I once championed has since grown up to be a thug making immoral decisions that I no longer agree with.

Apple is causing me deep personal embarrassment as they strut about the digital playground smashing things that compete with their creations.

A scene from the movie The Dictator where he wins by shooting his competition

You know something’s wrong with a company’s decisions when you’re watching a Sacha Baron Cohen movie (The Dictator) and the opening scenes of winning a race by shooting the competition reminds you of Apple’s choices to force litigation/product bans vs. accepting a financial settlement with Samsung.

http://www.youtube.com/watch?v=dcu5sYxcEuo

Samsung will fight the decision and have already announced that they will counter-sue Apple.

Since Samsung successfully defended themselves in many countries (Germany, Korea, Netherlands, and United Kingdom), winning court battles which ruled that they did not copy Apple’s designs, a counter suit and appeal are likely to change the situation drastically.

On top of everything else, jurors in this recent court case are already making headlines stating that they were unable to properly review all the evidence, and ignored the prior art evidence that proved Apple clearly copied others in its iPhone design.

The jury actually took a defensive role, putting themselves in the mindset of innovators defending their patents. Velvin Hogan, the 67-year-old jury foreman, stated that the jury:

“wanted to send a message to the industry at large that patent infringing is not the right thing to do, not just Samsung.”

With any luck, the same feelings will hold true as Motorola (Google-rola?) continues its legal action against Apple’s unpaid patent uses.

Since the patents in the current lawsuit are non-essential, one would assume that Google-rola has the opportunity to give Apple a taste of how it feels to block a company’s products via legal nonsense.

However, the likely result will be that even after (2?) years of trying to get Apple to pay the licensing fees, Google-rola won’t turn down an offer of fair payment just to block all product sales, unlike Apple.

Speaking of a ban on products, Samsung is already talking about releasing updated products that are completely free of Apple’s patent bans.

Zero Day Java Vulnerability

According to a few reputable sources online, there’s a new browser-based exploit for Java that is ‘in the wild’ and a patch won’t be coming very soon.

When someone says ‘in the wild’ it means that there’s reports of the exploit being used publicly, which means that there’s a high risk of contact.

In this case the exploit has been used to remote-control Windows based PCs that visit websites with hidden code on certain pages. The hacker in this case picked a Chinese proxy/IP and the ‘control network’ is also believed to be located in Singapore.

Since ‘wise’ hackers usually pick a point of origin outside their own country, this info actually points to someone non-Chinese as the source of the hack.

While that exploit only works on Windows computers, the payload is totally independent of the hack, so the same strategy will work on any computer and any browser.

To avoid getting hit, you may want to disable the Java plugin in your browser:

In Chrome:
- type “chrome://plugins/” into your address bar
- on the plugins page, scroll down to Java and disable it.

In Opera:
- go to “opera:plugins”
- on the plugins page, scroll down to Java(TM) Platform
- click on Disable
- also scroll down to Java Deployment Toolkit
- click on Disable

In Firefox:
- press the Firefox button
- go to Add-ons
- go to Plugins
- click the “Disable” button next to anything named “Java”

Finally, if you are using Internet Explorer you probably don’t care, but here are some recent instructions stolen from the help desk over at Indiana University:

To enable or disable Java in Internet Explorer:

From the Tools menu (or the Tools drop-down), select Internet options.

  • Click the Programs tab, and then click Manage Add-ons.
  • Highlight Java Plug-in.
  • Click Disable or Enable (located under “Settings” in version 7), as applicable.
  • Click OK twice.

To enable or disable JavaScript:

From the Tools menu (or the Tools drop-down), choose Internet options.

  • Click the Security tab.
  • Click Custom Level…
  • Scroll to the “Scripting” section of the list.
  • For “Active Scripting”, click Disable or Enable.
  • Click OK, and confirm if prompted.
  • Close and restart your browser.

SEO news blog post by @ 11:57 am on August 28, 2012


 

You don’t want the next Penguin update…

Scary Matt Cutts

Is Matt Cutts just goofing around or is he really trying to scare us?

The statement in the title of this article, from Matt Cutts, has the SEO world looking for further information as to just how bad the next Penguin update will be.

During SES San Francisco this week, Matt Cutts got a chance to speak about updates and how they will affect SEOs. One of the things he was quoted as saying really caught my eye:

You don’t want the next Penguin update, the engineers have been working hard…

Mr. Cutts has recently eaten some words, retracting his statement that too much SEO is a bad thing and explaining that good SEO is still good.

Even with attendees saying that he spoke the words with no signs of ominous intent, how do you expect the SEO world to take follow-up statements like:

The updates are going to be jarring and jolting for a while.

That’s just not positive sounding at all and it almost has the tone of admission that the next updates are perhaps going to be ‘too much’ even in Matt’s opinion, and he’s one of Google’s top engineers!

My take is that if you are doing anything even slightly shady, you’re about to see some massive ranking spanking.

Reciprocal links, excessive directories, participating in back-link cliques/neighborhoods, pointless press releases, redundant article syndication, duplicate content without authorship markup, poorly configured CMS parameters, etc.. These are all likely to be things, in my opinion, that will burn overly SEO’d sites in the next update.

The discussion also made its way to the issues with Twitter data feeds. Essentially, since Google and Twitter no longer have an agreement, Google is effectively ‘blocked’ from crawling Twitter.

Dead twitter bird

On the topic of Twitter crawling Matt Cutts was quoted as saying:

..we can do it relatively well, but if we could crawl Twitter in the full way we can, their infastructure[sic] wouldn’t be able to handle it

 

Which to me seems odd, since I don’t see any other sites complaining about how much load Google is placing on their infrastructure?

Clearly the issue is still political/strategic and neither side is looking to point fingers.

With Twitter’s social media relevance diminished, you’d think +1’s would be a focus point, but Matt Cutts also commented on the situation, stating that we shouldn’t place much value on +1 stats for now.

A final point was made about Knowledge Graph, the new information panel that’s appearing on certain search terms.

Since the Google Search Quality team is now the Google Knowledge Graph team Matt Cutts had some great answers on the topic of Knowledge Graph, including the data sources and harm to Wikipedia.

There had been a lot of cursing about Google simply abusing Wikipedia’s bandwidth/resources but it was made clear during the session that Wikipedia is not traffic dependent because they don’t use ads for revenue.

Essentially, if Wikipedia’s data is getting better utilized, and they haven’t had to do anything to make it happen, they are happy.

If you want more details, there are lots of #SESSF-hashtagged posts on Twitter and plenty of articles coming from the attendees.

I’m personally going to go start working on a moat for this Penguin problem…

SEO news blog post by @ 11:56 am on August 16, 2012


 

Particle Physics and Search Engines

If you’ve been hiding under a rock, you may not have heard the news of the ‘God Particle’ discovery.

As someone who is fairly scientific, I look at this as more of a proof of concept than a discovery, and ‘God’ really needs to give Peter Higgs some credit for his theories.

 
I won’t dwell on the news surrounding the Higgs boson confirmation, but there are parallels between objects colliding and revealing previously unseen matter.

When Search Engines Collide

It’s been some time since Bing and Yahoo merged, so the data sets should be the same, right?

No. That would really be a wasted opportunity, and Microsoft is clearly smarter than that.

By not merging the search data or algorithms of Bing and Yahoo, Microsoft can now experiment with different updates and ranking philosophies without putting all its eggs in one basket.

An active/healthy SEO will be watching search algorithm updates from as many perspectives as possible, which means tracking a variety of sites, on a variety of topics, across a variety of search engines.

Say a site gets a ton of extra 301 links from partner sites, and this improves traffic and rankings on Bing, leaves rankings stable on Yahoo, and causes a drop in traffic on Google.

It’s possible to say that the drop on Google was related to any number of factors: untrusted links, link spam, dilution of keyword relevance, keyword anchor-text spamming, you name it. This is because Google is always updating and always keeping us on our toes.

Bring on the data…

Let’s now take the data from Bing and Yahoo into consideration and look at what we know of recent algorithm changes on those search engines. This ‘collision’ of data still leaves us with unseen factors, but it gives us more to go on.

Since Bing has followed Google on some of the recent updates, the upswing in keyword positions on Bing hints that the problem is neither a dilution of relevance nor spamming of the keywords/anchor text.

Stability on Yahoo is largely unremarkable if you check the crawl info and cache dates; it’s likely just late to the game, and you can’t bet the farm on this info.

What about the other engines? Without paying a penny for the data, we can fetch Blekko and DDG (DuckDuckGo) ranking history to see what changes have occurred to rankings on those engines.

Since Blekko is currently well known to be on the warpath against duplicate content, and starving for fresh crawl data, a rankings drop on that service can be very informative, especially if the data from the other search engines helps eliminate key ranking factors.

In the case of our current example, I’d narrow down the list of ranking factors that changed in the last ‘Penguin’ update and contrast those with the data from the other engines. I’d probably suspect that Google is seeing duplication from the 301s: something Bing wouldn’t yet exhibit, but that Blekko would immediately punish as badly as, or worse than, Google.

The next step would be to check for issues of authority with the page content. Is there authorship markup, with a reciprocal setup on the author’s end, that helps establish trust in the main site content? Does the site have the proper verified entries in Google WMT to pass authority? Barring WMT flags, what about a dynamic canonical tag in the header, even as a test if it’s not already set up?
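As a rough sketch of those checks, here’s what the authorship and canonical markup of that era looked like in a page’s head; the URLs and the Google+ profile ID below are placeholders for illustration, not real values:

```html
<head>
  <!-- Canonical tag: declares the preferred URL for this content,
       helping search engines consolidate duplicate versions
       (placeholder URL) -->
  <link rel="canonical" href="http://www.example.com/original-article/" />

  <!-- Authorship markup: points at the author's Google+ profile;
       for trust to pass, the profile should link back to the site
       in its "Contributor to" section (placeholder profile ID) -->
  <link rel="author" href="https://plus.google.com/000000000000000000000" />
</head>
```

If the canonical tag is generated dynamically by the CMS, it’s worth verifying the emitted URL on a few live pages, since a misconfigured template can point every page at the same URL and make a duplicate-content problem worse.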

Start making small changes, watch the results, and be patient. If you’re not gaming Google and you’ve done something accidental to cause a drop in rankings, you need to think your way through the repairs step by step.

It’s not easy to evaluate, but the more data you can mash up, and the better you understand that data, the quicker you can troubleshoot ranking issues and ensure that your efforts turn into gains.

SEO news blog post by @ 12:12 pm on July 5, 2012


 

Copyright© 2004-2014
Beanstalk Search Engine Optimization, Inc.
All rights reserved.