Missing Authorship Photos?

If you’ve become accustomed to seeing your charming mug in the SERPs when you’re Googling your keywords, it can be rather unsettling to see those images suddenly disappear.

Rich Snippet SERP example

Fear not! This isn’t something you have done, or failed to do. The issue is kicking up a bit of a fuss on the SEO forums/discussion areas today and clearly looks to be a problem on Google’s end.

In fact, if you need reassurance, all you have to do is hop into your Webmaster Tools account and visit the ‘Rich Snippets Tool’ to get a preview of what your SERPs would normally look like.

If you are sure that you’re not part of the current issue, or you’re just curious what we’re talking about, the Troubleshooting Rich Snippets page is a great resource to tackle possible problems.

Google invests another $200,000,000.00 in renewable energy

I could have written .2 billion, or 200 million, or even 200 thousand thousands, but why play with such a large sum of money?

Google certainly isn’t playing around; with this latest investment, Google’s grand total in renewable/clean energy is over $1 billion US and growing.

This isn’t just charity either; some of these investments are simply smart business, because the returns are fixed and low-risk.

Illustration of power saved by using GMail vs. Postal Mail

Being honest about pollution is brave, and bragging about a low footprint is begging for trouble, but Google marches on, stating:

“100 searches on Google has about the same footprint as drying your hands with a standard electric dryer, ironing a shirt, or producing 1.5 tablespoons of orange juice.”

You can read more about Google’s efforts to reduce, eliminate, and help others with power consumption and carbon footprints over on the Google Green pages.

SEO news blog post by @ 11:57 am on January 10, 2013


 

How Short Content Can Help you Rank

A common misconception is that you need to provide at least 500 words of onsite content to have your page rank with Google. Your rankings depend on many factors and signals and are not necessarily determined by the number of words on a page, no matter how well written they are.


It all comes down to creating unique content that is not only interesting, but engages your viewers and drives ongoing conversations in the form of replies or comments. In a recent Google Webmaster Help thread, John Mueller of Google clarified this exact point.

"Rest assured, Googlebot doesn’t just count words on a page or in an article, even short articles can be very useful & compelling to users. For example, we also crawl and index tweets, which are at most 140 characters long. That said, if you have users who love your site and engage with it regularly, allowing them to share comments on your articles is also a great way to bring additional information onto the page. Sometimes a short article can trigger a longer discussion — and sometimes users are looking for discussions like that in search. That said, one recommendation that I’d like to add is to make sure that your content is really unique (not just rewritten, autogenerated, etc) and of high-quality."

Google crawls everything from full articles to 140-character tweets, and recognizes that even short comments or articles can trigger engaging conversations. There is no magic number; there are no “tricks” to SEO. Create unique and valuable content, and your visitors and rankings will follow.

SEO news blog post by @ 10:56 am on December 5, 2012


 

Google’s New ‘AuthorRank’ Bigger than Panda and Penguin Combined

If you are in the SEO industry, you have probably heard a new buzzword floating around the water cooler: “AuthorRank.”
AuthorRank signals image
In August of 2005, Google filed a patent for a technology dubbed Agent Rank, in which the reception of the content that ranking ‘agents’ create, and the resulting interactions, factor into those agents’ rankings. The patent goes on to suggest that more well-received and popular “agents” could have their associated content rank higher than unsigned content or the content of other, less authoritative “agents”.

After adding a continuation patent in 2011, Google is now able to attribute content to specific agents and can now rank these agents thanks to platforms like Google+. AJ Kohn goes into much detail about AuthorRank and why he feels it will be bigger than Panda and Penguin combined. AuthorRank will not be a replacement for PageRank, but will work in conjunction with it to enable Google to rank high quality content more appropriately.

I certainly don’t claim to be an expert on AuthorRank; in fact, I am only learning about it as I write this. What I did learn from the information I read is that content always has been, and always will be, key to the success of any website. Google’s mantra to publishers has always been that “content is king”: provide high-quality content, and the rankings and followers will follow. This new signal will soon be in place as a final coup de grâce to those still stuck in antiquated methods of content creation and syndication.

SEO news blog post by @ 10:59 am on November 14, 2012

Categories: Google, Google+

 

Google Image Optimization

Image optimization for Google can mean several things, from image compression to image resolution, or even Google Image Search optimization.

Worry not: the topic is broad, but we can tackle it section by section, and along the way we’ll point you to actual Google tools to ensure you’re getting the best results.

Image Compression

The biggest gains you can get with the least effort typically come from looking at the wasted bytes (often kilobytes) when images aren’t compressed properly.

Here’s a comparison of JPEG image compression:

  • Poor compression: 5,899 bytes
  • Quality compression: 3,493 bytes

And now PNG compression:

  • Poor compression: 5,590 bytes
  • Quality compression: 4,769 bytes

Now honestly, if I had hidden the image sizes and descriptions, could you tell me which was the 3.5 KB image?

Google could tell you in a flash, and Google’s PageSpeed Insights factors image optimization into your page speed score.

An observant reader may wonder why the PNG with ‘poor’ compression is smaller than the JPG. The answer is transparency: the PNG only saves (losslessly compressed) image data for the visible pixels, whereas the JPG has to store the additional information that ‘these pixels are white’.

Also keep in mind that we used really small images to keep this page loading quickly; the larger the image, the bigger the difference compression quality can make.
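If you want to experiment yourself, here’s a minimal sketch using ImageMagick (our tool choice is an assumption on our part — any decent image editor can do the same; the filenames and quality setting are just examples):

    # Re-save a JPEG at quality 85 and strip metadata (EXIF, colour profiles),
    # which is often where the wasted bytes hide
    convert photo.jpg -strip -quality 85 photo-optimized.jpg

Compare the before/after file sizes, then let PageSpeed Insights be the judge.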

Image Resolution

The word ‘resolution’ has so many definitions that I’d need a full article, rather than a post, to resolve them all.

For the context of this discussion I’m speaking of the image dimensions, not the pixels-per-inch.

As an SEO blog we’d have to be really lazy not to mention the issue of image placement/size on a site, when we know that Google has a clear concept of what’s most visible to your audience.

When I say ‘your audience’ it is not just a buzzword; I really mean that Google looks at its analytics data and the browser window sizes of your traffic, and actually knows when a site is delivering the right content for the majority of its user base.

So if your website is plastered with images that force the user to hunt for your content, and your content isn’t images, then that’s actually a problem in terms of SEO.

In fact, Google is just in the middle of moving its ‘Browser Size’ tool into the Google Analytics suite.

As you can see in this example of jQuery Mobile in the Browser Size tool, the existing results are generic and, dare I say, “unprofessional” looking.

Example of jQuery Mobile in the Google Browser Size tool
In the above image we can see what percentage of general web users can see each element of the page.

I would show off an example of the same page using the new tools, but Google Analytics is only for sites you own, and the new version is still in beta, throwing out ‘Not a Number’ (NaN) errors regardless of your choice of browser.

What you want to end up with, regardless, is a site that fits the screen size of your audience. So if you are running a forum that reviews ‘apps’, you probably want to aim for a design that fits your most important content above ‘the fold’ on mobile browsers (at least the current generation of mobile browsers).

Image Site Maps

A sitemap is typically an XML document that explains your website’s pages to Google in a more technical manner.

An image site map is specifically for explaining the images that are on your site.

Google does a great job of finding pictures you’ve linked to, but if you use JavaScript to create galleries without using <noscript> tags, then Google could have difficulty indexing those images.
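Here’s a minimal sketch of that kind of fallback; the gallery script and image paths are hypothetical:

    <script src="/js/gallery.js"></script>
    <noscript>
      <!-- Plain links so crawlers (and users without JavaScript) can reach the images -->
      <a href="/photos/harbour.jpg">Inner Harbour photo</a>
      <a href="/photos/sunset.jpg">Sunset photo</a>
    </noscript>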

An image sitemap’s XML structure lets you clearly spell out each image with options like:

  • loc: The full URL for the image
  • caption: Description of the image
  • geo_location: Physical location, e.g., British Columbia, Canada
  • title: Title of the image
  • license: URL pointing to a license for the image

Since each entry is tied to a <loc> URL, remotely hosted images are fine; Google understands the need for CDNs, but the remote site needs to be registered in Webmaster Tools for proper indexing of the images.
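To make that concrete, here’s a minimal sketch of an image sitemap entry using Google’s image sitemap extension (all URLs are hypothetical):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
      <url>
        <loc>http://www.example.com/gallery.html</loc>
        <image:image>
          <image:loc>http://cdn.example.com/photos/harbour.jpg</image:loc>
          <image:caption>Fishing boats in the Inner Harbour</image:caption>
          <image:geo_location>British Columbia, Canada</image:geo_location>
          <image:title>Inner Harbour</image:title>
          <image:license>http://www.example.com/image-license.html</image:license>
        </image:image>
      </url>
    </urlset>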

Once again I’ve gone a bit too far on the topic for a first round, but I will return with a deeper look beyond the surface of the issue in a part 2 post.

For now if you wanted to start working on an image sitemap (or adding image references to your existing sitemap) look at this answer in Google’s Webmaster Guidelines.

SEO news blog post by @ 1:32 pm on November 1, 2012

Categories: Coding, Google

 

New Webmaster Guidelines Part 2 – Technical Guidelines

This is part 2 of an in-depth look at the newly revised Webmaster Guidelines from Google. Google has recently updated its list of best practices and suggestions for site development. To give your site the best chance of ranking well, and to keep a competitive edge, the Google guidelines should be read like gospel.


• Did you ever wonder how Google processes your site to determine its focus and content? Try using a text-based browser like Lynx to understand what Google is using to interpret your site.

By displaying the page without dynamic elements such as Flash, JavaScript, cookies, session IDs or DHTML, you will gain a keen insight into what is actually visible to Google. If there is not enough content to be read, then Google is going to have a difficult time indexing your site and establishing your value in the SERPs.

• Allow bots to crawl your site without session IDs or arguments designed to track user activity, and disallow specific URLs that you don’t want crawled in your robots.txt file. Session IDs are antiquated and should not be used in any new site development; you can use cookies instead for monitoring site traffic.

• Check that your web server supports the “If-Modified-Since” HTTP header. This lets you tell Google whether your content has changed since it last crawled your site, saving bandwidth and overhead.

• Use the robots.txt file to exclude directories that do not need to be crawled by Google (see the sketch after this list). Keep it updated in your Webmaster Tools account, and ensure that you are not blocking Googlebot from crawling your site by testing it in Webmaster Tools.

• Keep advertisements (such as Google’s AdSense and DoubleClick) to a minimum, and ensure that they do not affect your rankings by excluding the ad-serving URLs in your robots.txt file.

• If you use a content management system (CMS), make sure that it supports SEO-friendly URL structures and is easily crawled by bots.

• Test your site in several browsers (IE, Firefox, Chrome, Lynx, Opera, Safari) at different resolutions.

• Use tools to monitor page load speeds; this is becoming an increasingly important ranking factor. Google’s Page Speed tool, or the Webmaster Tools Site Performance feature, can give you insights on how to boost your page load speeds.
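For reference, here’s a minimal sketch of the kind of robots.txt described above (the paths are hypothetical and need to be adapted to your own site):

    # Hypothetical robots.txt: block admin pages, scripts, and session-tracking URLs
    User-agent: *
    Disallow: /admin/
    Disallow: /cgi-bin/
    Disallow: /*?sessionid=
    Sitemap: http://www.example.com/sitemap.xml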

SYNOPSIS:

• Make use of the robots.txt file to keep your site accessible to the Google bots
• Block unneeded/irrelevant content from being crawled
• Use SEO-friendly URLs and move away from parameter-based URLs
• Monitor your page load speed and take steps to improve it.

SEO news blog post by @ 12:09 pm on October 17, 2012


 

New Webmaster Guidelines Part 1 – Design and Content

Google recently updated their webmaster guidelines following the latest algorithm update. It is easy to feel inundated with the amount of information regarding web design dos & don’ts and best practices for the internet. As an SEO I am frequently asked, “How can I get my site to rank?” The fact of the matter is that we follow Google’s Webmaster Guidelines, which establish the best practices for websites to follow. Many are concerned about the Panda/Penguin updates and are worried that their site will be hit, or they have a site that has already been hit. Our advice remains consistent: "Drink the Google Kool-Aid".


At one time, it was exceedingly difficult to get a straight answer from Google about what was considered best practice. This led to a wild-west frontier attitude, and many designers and SEOs adopted bad practices. That, in turn, led to an inundation of webspam in the Google SERPs and made it very difficult to get quality search results.

The Panda and Penguin algorithms and their subsequent updates were a very concerted effort to rid the SERPs of webspam. In the wake of these substantial updates, my advice to customers remains consistent: follow Google’s established guidelines. The mantra I repeat to my customers is: "Would I do this if search engines didn’t exist?"

For many of us this is old news, but I still find myself learning new things to try and better practices to adopt. Much of the messaging from Google has been very consistent regarding what makes good content. This post looks specifically at Google’s recommended Design and Content Guidelines to help Google find, crawl and index your site.

Site Hierarchy

  • Give your site a clear hierarchical structure and make it as easy to navigate as possible. Every page should be reachable from at least one static text link.
  • Think of your website as a book with logical sections and headings, each with its own unique and relevant content (a minimal sketch follows this list).
    • The title of your book is your domain URL (e.g., www.booktitle.com).
    • Your title tag <title> is the topic for the page; it defines what content will be on the page (e.g., <title>Book Characters</title>).
    • Your heading tag is your chapter title, e.g., <h1>Book Characters</h1>. Typically this is the same as, or very close to, the page title and must be directly relevant.
    • Have only one topic per page and only one H1 tag on any page.
    • Use subsequent heading tags (h2, h3, h4) to define further related divisions of the chapter.
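Here’s a minimal HTML sketch of that ‘book’ structure, reusing the hypothetical booktitle.com example:

    <html>
    <head>
      <!-- The page's topic: one topic per page -->
      <title>Book Characters</title>
    </head>
    <body>
      <!-- The "chapter title": only one H1, closely matching the title tag -->
      <h1>Book Characters</h1>
      <!-- Further related divisions of the chapter -->
      <h2>Protagonists</h2>
      <p>Unique, relevant content about the protagonists...</p>
      <h2>Antagonists</h2>
      <p>More unique, relevant content...</p>
    </body>
    </html>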

Site Map

  • Offer a sitemap for your visitors. Not only does this provide a valuable service to your customers, but it can help improve the indexing of your site by bots.
  • If you have an extensive number of links on your site, you may need to break your sitemap into multiple pages.
  • Remember that a website sitemap is different from the sitemap.xml that you should submit to Google’s Webmaster Tools.

Internal Linking

  • Keep the number of links on any page to a bare minimum. The guidelines used to say ‘around 100’, but this is one area where less is more.
  • In the most recent iteration of the Webmaster Guidelines, Google states only to ‘keep it to a reasonable amount’. Too many links leading to other internal pages or offsite are distracting to visitors; they lower conversion rates as people get lost and become frustrated.

Textual Content

  • Google has always stated that ‘content is king’. It is absolutely imperative that you create rich, useful and dynamic content that engages your audience. All textual content needs to be well written and grammatically correct. It should clearly and accurately describe your content and it must be relevant to the page that it is found on.
  • Do not write for what you think Google wants to see. Think about what searchers would type into a search engine to find your page and ensure that your content actually includes those terms.
  • Do not concern yourself with keyword densities. Inevitably the content comes across as spammy and does not read well. Google may regard this as keyword stuffing, and may see broken/confused grammar as potential spam or scraped content: exactly what the Panda/Penguin updates are designed to target and penalize.

Page Coding

  • Use a crawler on your site, such as XENU’s Link Sleuth or Google’s Webmaster Tools, to check your site for broken links.
  • Check your site with the W3C to ensure that your site has valid HTML.
  • Avoid the use of dynamic pages with cryptic URLs (e.g., where the URL contains a "?" character). Try to use keyword-focused URLs that reflect the page you are building; if you must use a dynamic URL structure, keep the URLs few and the parameters short (a rewrite sketch follows this list).
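One common way to serve keyword-focused URLs over an existing dynamic page is a rewrite rule. Here’s a minimal sketch assuming an Apache server with mod_rewrite enabled, placed in your .htaccess file; the page name and parameter are hypothetical:

    # Serve the keyword-focused URL /book-characters
    # from the existing dynamic page index.php?page=characters
    RewriteEngine On
    RewriteRule ^book-characters/?$ index.php?page=characters [L]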

Images

  • You can give Google additional details about your images, and provide the URLs of images it might not otherwise discover, by adding information to a web sitemap.
  • Do not embed important content into images; always use text links instead of images for links, important names, etc., where possible. Google’s crawlers cannot determine the text displayed in an image. If you must use an image for textual content, make use of the image’s ALT attribute to describe the image in a few words.
  • Ensure that all image title and ALT attributes are descriptive (but not spammy) and accurate. Follow these guidelines for creating great ALT text for your images (a short sketch follows this list).
  • Give your images detailed and informative filenames.
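Putting those points together, a minimal sketch (the filename, ALT and title text are hypothetical):

    <!-- Descriptive filename, informative ALT text, accurate title -->
    <img src="/images/victoria-inner-harbour-sunset.jpg"
         alt="Sunset over Victoria's Inner Harbour"
         title="Victoria's Inner Harbour at sunset">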

The following areas (video and rich snippets) and their usage are best described by Google themselves:

Video

View the full post here: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=156442

Rich Snippets

View the full post here: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1093493

Coming next time, I will review the newly updated Technical Guidelines and then conclude with Google’s Quality Guidelines.

SEO news blog post by @ 1:15 pm on October 10, 2012


 

EMD Insanity and Coining Phrases

It’s clearly time for Beanstalk to officially list ourselves as a sitedefibrillation solution provider.

Why? Because apparently the secret to SERP dominance with an EMD is to coin your own phrase!

Do a search for ‘coinflation’ + ‘gold’ or really, almost any other keyword to see what Google considers an ‘improved’ result following the EMD update.

Google Search results for Coinflation 
If you didn’t get something like the results above, please let us know!

 
Okay so that seems slightly silly, but how the heck did they pull that off? There’s clearly PPC/AdWords competition for the phrase, and EMD should either be a penalty or moot, shouldn’t it?

Well apparently not! In fact EMD can still clearly be an asset if the ‘quality’ scores are all above par!

This means that if you have an organic campaign, with ongoing back links/references from trusted sources, and you aren’t hitting other penalties, you really should be feeling no loss at all from the EMD update.

Indeed, if your competition was using non-organic approaches to EMDs they should have taken a trust hit, and you may see an improvement in position due to their failings!

So while I can show you some examples of the EMD apparently failing to work, we can assure you it’s working, and overall seems like a positive step for Google.

10″ Google Nexus from Samsung?

Last night CNET announced some ‘highly’ probable info that Samsung is manufacturing a new 10.1″ Nexus tablet for Google.

The article is more of a stub of hearsay, but it had some rather ‘exact’ details, including the resolution of the display:

The 2,560×1,600 display will have a PPI (pixels per inch) of about 299, said Shim. That tops the 264 PPI on the 9.7-inch 2,048×1,536 Retina iPad.
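As a quick sanity check, the quoted figure holds up if you assume a 10.1-inch diagonal:

    \text{PPI} = \frac{\sqrt{2560^2 + 1600^2}}{10.1} = \frac{\sqrt{9{,}113{,}600}}{10.1} \approx \frac{3018.9}{10.1} \approx 299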

Clearly this will be the ‘high end’ model for the Nexus line (currently manufactured by Asus), especially when you consider that Google will be releasing a 7″ Nexus subsidized down to a $99 price this December!

In fact, since we’re pondering things to come more than talking facts, I’d have to assume this will be a dual- or quad-core device with GPU acceleration of some sort to assist with up-scaling video content and 3D games to that eye-popping resolution.

So if this high-end Nexus tablet is anything less than $399 I’d be really shocked and very worried for Apple.

Okay, ‘more worried for Apple’ would be more accurate, given its current public-image issues.

(Embedded video: http://www.youtube.com/embed/JEy2u2n_XTQ)

In case you’re wondering ‘who cares?’: Tim Pool takes to the streets and broadcasts unedited footage of protests/events.

I’d like to think Apple is patenting this to prevent other companies from doing it, but in actual fact this is very creepy stuff from the overly litigious makers of the most expensive walled gardens on the planet.

It seems almost like Apple is testing how well their brand/product can weather bad public image at this point?

SEO news blog post by @ 11:53 am on October 9, 2012


 

You may need an EMT after the EMD Update!

Last Friday Matt Cutts tweeted about Google’s latest update, which focuses on penalties for ‘low-quality’ Exact Match Domain names, hence the EMD TLA.

Twitter posts from Matt Cutts on the latest EMD Update

While Google is never big on giving us the details, let’s digest this together!

Using a relevant keyword in a domain has been a very long-standing ranking signal.
For example, a consulting site for financial companies using ‘financial-consulting.com’ as a domain would be seen as relevant.

Over the years this has led to people grabbing up domains with keywords in them purely for SEO purposes.

JACOBS BY MARC JACOBS FOR MARC BY MARC JACOBS ETC..

Having your keywords in your domain name didn’t mean overnight dominance of the web, thankfully. Indeed, there was usually some trade-off between desirable keywords and a reasonably short domain name.

In fact, no organic/white-hat SEO would suggest you use something like:

‘best-value-online-financial-consulting-company-with-proven-results.com’

Why? Because the gains in SEO wouldn’t match the losses in user trust/conversions.

Would a good organic SEO/White Hat tell you NOT to purchase those types of domains for 301s to your main site?

I’d like to think so, but this was clearly a strategy for a lot of sites competing for top rankings.

Regardless of your SEO ethics, the practice of domain parking/selling because of search ranking signals is clearly an unnecessary burden on the internet.

While the ‘domains for sale’ issue would still exist without search engines, search engines honestly should be making your choice of domain name MUCH less relevant.

Ideally, fresh internet traffic should occur as a match between the searcher’s needs and the services/information that your site provides.

And with this latest update, it would appear that Google agrees with the idea that a book should be found by more than what’s on its cover.

As of this latest update, you can expect sites with nothing but some keyword-dense 301’d domains to face a penalty instead of a positive ranking signal.

We didn’t see this coming!

EMD Update Results

I’m already seeing people post sad tales of the deep impact this update is having on certain sites, and I’ve had a laugh at a few ‘professionals’ claiming they never thought this day would come.

Personally, while I’ve watched some very good presentations on SEO and web ranking strategies, the one thing that helps me most as an SEO is Matt Cutts’ breakdown of the real philosophy behind ‘good SEO’ which boils down to:

Never do something for the sake of search engine rankings alone.

If you like ‘Lord of the Rings’ then look at this as:

‘One Rule to Lead them all, one Rule to be found by…’

..and you should never have to fear a Google update!

In fact you should look at each Google update as a chance for your rankings to improve as other sites are punished for their ‘clever’ attempts to game the system.

Another Google Easter Egg?

And finally, to end the post with a chuckle, here’s a Google search phrase for you to test out:

I was hoping this was more than just an ‘Easter Egg‘ in Google’s search, but alas Google hasn’t yet licked mathematical artificial intelligence. :p

SEO news blog post by @ 12:01 pm on October 2, 2012


 

Dublin the Airports: iOS 6 Maps is Rotten


Apple’s extra Airport…

Was anyone expecting Apple to replace Google’s Maps application with something superior? Apparently, the iPhone user base and Apple actually expected this to happen.

If you look at even the most biased sites reviewing the new ‘Apple’ Maps app for iOS 6, you will see guarded optimism and lots of ‘reasoning’ clash with angry rants from amazed and disappointed users.

One thing I don’t see is anyone calling it the ‘Maps app that Apple bought from TomTom’; the best I’ve seen is a mention that Apple relied heavily on TomTom and OpenStreetMap for data alone.

Instead I see a very consistent collection of sympathetic remarks like: ‘this is beta, it can only get better’, ‘for a first attempt this is outstanding’, ‘people will question anyone who takes their own path..’

But Apple isn’t taking their own path, they are merely attempting (badly) to replace something that wasn’t really broken.

Sure, Google wasn’t toiling endlessly to bring the iPhone all the updates it was adding to the Android version of Google Maps.

I’m guessing Apple really expected Google to beta-test ideas on Android and then polish them up and finalize them on the iPhone?

So sure, Google put Android development first, and there were things that Google Maps did better on Android, but that still doesn’t mean it ‘had to go’.

Apple could have offered both solutions in a ‘use what you like’ approach to pleasing its user base, but this is a company making headlines for outrageous profits and the working conditions of its manufacturing partners.

Removing the choice to pick another company’s solution would also explain why Apple didn’t take a settlement from Samsung and wanted to ban their phones instead. Apple wants profits, and if Apple wants really happy customers, it could lower prices and focus on better apps rather than swapping the best ones for inferior versions.

And in Other News

Google has blessed a new meta tag!

<meta name="news_keywords" content="Apple Maps, iOS 6, Google Maps, Android, TomTom, Google news meta tag">

Do you publish content that you would call ‘news’?
Would you like Google to better understand the topic of your posts?
Would you like the freedom to ignore keyword use in a topic for style reasons?

Then brothers and sisters, this new meta-tag is what you’ve been waiting for!

The format is very simple: the tag belongs near the top of your page, in the <head> … </head> section.

Here’s an example:

<meta name="news_keywords" content="10 keywords, separated, by commas, just like, meta keywords, etc..">

That’s some ‘easy breezy’ SEO, and it’s great if you are indeed publishing ‘news’, not just ranting about Apple. :)

SEO news blog post by @ 11:51 am on September 20, 2012


 

Gmail Rank & the Moon Landing

Gmail Rank

Bill Slawski had an interesting blog post the other day about the rise of ‘Gmail Rank’ and the importance of good subject lines. It seems that Google is experimenting with the possibility of including your own emails in your search results. Users will have to opt in, and only emails received via Gmail will be used.

It is thought that the “rankings” used to decide which emails to show will be similar to the existing colored “importance rankings” currently used to display the relative importance of your emails. Gmail does allow the user to sort and filter emails by their importance markers, and offers some other advanced search filters; whether this functionality will be carried over to an integrated web search remains speculative.

…In other news:

Neil Armstrong's footprint on the moon

Beanstalk would like to say a fond farewell to Neil Armstrong. Armstrong died Saturday, Aug. 25, 2012, at age 82. Armstrong commanded the Apollo 11 spacecraft that landed on the moon July 20, 1969. As the first man to walk on the moon, his passing truly marks the end of an era. The moon landing happened a bit before my time, but those who witnessed it remember where they were and what they were doing when they heard those famous lines: “That’s one small step for man, one giant leap for mankind.”

SEO news blog post by @ 12:16 pm on August 27, 2012

Categories: Google, Rankings

 
