At Beanstalk Search Engine Optimization we know that knowledge is power. That's the reason we started this Internet marketing blog back in 2005. We know that the better informed our visitors are, the better the decisions they will make for their websites and their online businesses. We hope you enjoy your stay and find the news, tips and ideas contained within this blog useful.
In fact, if Samsung, or Google (via its Motorola Mobility acquisition), can keep one-upping each new iPhone, then the cost of licensing access to that user base will peak at a point it will never return to.
But is it worth the money knowing how much of a search advantage Google has over Bing? Well that depends entirely on who you ask!
People will use whatever is the default, like a pack of blind sheep. Everyone knows this.
If that’s true, then why is the Google Maps app the most popular app on iOS? People clearly don’t just stick with the default Apple Maps.
…and really, if we’re talking about users who skipped over the BlackBerrys, Nokias, Samsungs, etc., for a specific device, then perhaps we should give them some credit for also choosing a better search experience?
After all, how many times would you let your phone load Bing before trying to switch it?
I personally would let a ‘Bing’ search happen once at the most, just to get info on “setting default search engine on iOS”.
Set up a schedule to execute the PHP page regularly
Bingo, you now have your own ranking reports tool, and nobody is the wiser, besides Google, and they are usually too busy to care that you’re extra curious about your rankings.
Don’t get me wrong, there are a lot of fine details to explain and not everyone is comfortable installing programs like this or scripting, but I am going to look at getting permission to make this a step-by-step how-to guide with full downloads so even novices can give this a try.
A final point to make is that excessive/automated queries to Google are a breach of their TOS, and could result in annoying blocks/complaints from Google if you were to attempt to use this method for a large set of keyword phrases, or wanted the reports updated constantly.
If you are a ‘power user’ who needs a lot of data, you’ll end up paying someone, and either you pay to use someone’s API key at a premium, or you get your own API key from Google and only pay for what you use.
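To make the idea concrete, here is a minimal sketch in Python (rather than PHP) of the position-checking step the scheduled script would perform. The domain names and HTML below are made up, and real SERP markup changes often, so treat the pattern as illustrative only:

```python
import re

def rank_of(domain, serp_html):
    """Return the 1-based position of `domain` among result links,
    or None if it is absent. A toy sketch: the href pattern below is
    illustrative only and would need adapting to real SERP markup."""
    links = re.findall(r'href="(https?://[^"]+)"', serp_html)
    for position, url in enumerate(links, start=1):
        if domain in url:
            return position
    return None

# A tiny fabricated results page (not real Google markup):
html = ('<a href="https://example.org/a">1</a>'
        '<a href="https://your-site.example/">2</a>'
        '<a href="https://example.net/c">3</a>')
print(rank_of("your-site.example", html))  # → 2
```

Run by your scheduler once a day, a loop over your keyword list plus a line appending each result to a CSV is essentially the whole "reports tool" described above.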
If you’ve become accustomed to seeing your charming mug in the SERPs when you are Googling your keywords, it might be rather unsettling to see those images suddenly disappear.
Fear not! This isn’t something you have done, or not done, this is actually kicking up a bit of fuss on the SEO forums/discussion areas today and clearly looks to be an issue on Google’s end.
In fact, if you were in need of reassurance, all you have to do is hop into your Webmaster Tools account and visit the ‘Rich Snippets Tool’ to get a preview of what your SERPs would normally look like.
If you are sure that you’re not part of the current issue, or you’re just curious what we’re talking about, the Troubleshooting Rich Snippets page is a great resource to tackle possible problems.
Google invests another $200,000,000.00 in renewable energy…
I could have written .2 billion, or 200 million, or even 200 thousand thousands, but why play with such a large sum of money?
Google certainly isn’t playing around; with this latest investment, Google’s grand total in renewable/clean energy is over $1 billion US and growing.
This isn’t just charity either; some of these investments are simply smart business, because the returns are steady and low-risk.
Being honest about pollution is brave, and bragging about your low footprint is begging for trouble, but Google marches on stating:
“100 searches on Google has about the same footprint as drying your hands with a standard electric dryer, ironing a shirt, or producing 1.5 tablespoons of orange juice.”
You can read more about Google’s efforts to reduce, eliminate, and assist others with power consumption/carbon footprints, over on the Google Green Pages.
A common misconception is that you need to provide at least 500 words of onsite content to have your page rank with Google. Your rankings depend on many factors and signals, and are not necessarily determined by the number of words on a page, no matter how well written they are.
It all comes down to creating unique content that is not only interesting, but engages your viewers and drives ongoing conversations in the form of replies or comments. In a recent Google Webmaster Help thread, John Mueller of Google clarified this exact point.
"Rest assured, Googlebot doesn’t just count words on a page or in an article, even short articles can be very useful & compelling to users. For example, we also crawl and index tweets, which are at most 140 characters long. That said, if you have users who love your site and engage with it regularly, allowing them to share comments on your articles is also a great way to bring additional information onto the page. Sometimes a short article can trigger a longer discussion — and sometimes users are looking for discussions like that in search. That said, one recommendation that I’d like to add is to make sure that your content is really unique (not just rewritten, autogenerated, etc) and of high-quality."
Google crawls everything from full articles to 140-character tweets. Google recognizes that even short comments or articles can be triggers for engaging conversations. There is no magic number; there are no “tricks” to SEO. Create unique and valuable content, and your visitors and rankings will follow.
If you are in the SEO industry, you have probably heard a new buzzword floating around the water cooler: “AuthorRank.”
In August of 2005, Google filed a patent for a technology dubbed Agent Rank in which ranking ‘agents’ use the reception of the content they create and the resulting interactions as a factor in determining their rankings. The patent goes on to suggest that more well-received and popular “agents” could have their associated content rank higher than unsigned content or the content of other less-authoritative “agents”.
After adding a continuation patent in 2011, Google is now able to attribute content to specific agents and can now rank these agents thanks to platforms like Google+. AJ Kohn goes into much detail about AuthorRank and why he feels it will be bigger than Panda and Penguin combined. AuthorRank will not be a replacement for PageRank, but will work in conjunction with it to enable Google to rank high quality content more appropriately.
I certainly don’t claim to be an expert on AuthorRank and in fact am only learning about it as I write this. What I did learn from the information I read is that content always has been, and always will be, key to the success of any website. Google’s mantra to publishers has always been that “content is king”: provide high-quality content, and the rankings and followers will follow. This new signal will be in place soon as a final coup de grâce to those still stuck in antiquated methods of content creation and syndication.
Every once in a while it would be nice if there were some construction on the information superhighway.
Some road work that caused folks oblivious to our websites to take a detour?
We all want some traffic to take a pass through our pages, even if it’s just for a few minutes.
Ideally we’d want the detour sign to read:
“Turn here for great deals on XYZ!”
…but more often than not folks go for something a bit more catchy like:
“If you like kittens and free bacon turn now before it’s too late!”
The problem with the former is that people don’t respect honesty as much as they should. After all, everyone has something for sale; tell us something we didn’t know.
The problem with the latter is that while totally successful, the traffic driven to the site won’t be on target at all, will likely bounce, and the best anyone can hope for is brand recognition. Unless the site actually has kittens and free bacon, but who would be reading this if they had all that? (Note to self, make a site with endless kitten pictures where the uploader is paid in bacon.)
Ideally we wish to find a ‘Goldilocks’ approach where we aren’t too off-putting with boring honesty, nor are we luring in people who have zero interest in the site.
So let’s take a moment to look at two common approaches to traffic generation that I don’t see discussed often; one is very timely.
Google just launched a massive ARG called the Niantic Project and I am already 7 (make that 13) days behind on the clues/feeds…
The idea is that you become very curious about the game and subscribe to the daily clues. With luck this catches the eye of your friends, they get curious and sign on too. By the end of the game Google should have a large subscriber group waiting anxiously for their announcement.
Speaking of clues, one thing I seem to have discovered ahead of the crowd is the Interactive global Niantic XM (Exotic Matter) POI map that Google built:
If this game is an introduction to the recently released Google Field Trip app, then is it possible that Google associates have taken the time to embed ‘clues’ into major landmarks around the world that need local residents to ‘discover’ them using an Android device and the Google Field Trip application?
With any luck Google will use Niantic to reach more people than they normally would, and the more people who know about Field Trip, the better/more interesting it will be.
Think Outside the Box
In this case the box is the web/online world, and thinking outside it means creating web content that people will want to print/download and share.
All of our team is doing on-page optimization training so that all of us have some skills with on-page SEO. Even if we can’t have each member running live A/B tests and such, they should know why you would run one and be familiar with the current standards.
This means that each of us has an SEO cheat sheet pinned to our cork boards and each of these has branding on them that we’re fine with. In fact I’m very tempted to promote these as something all of you should print for your daily SEO but I need to check and see if they are still available to the public.
If your company has info pages that are getting a lot of traffic, I’d look at pulling together a PDF of the content for download with a quick-reference for printing.
Getting your brand out there and helping potential clients is a win-win, if the market you are in is something you want to be recognized for.
Giving it Away
If you felt like making a resource and simply giving it away was too much for your time/budget, then you’ll be shocked by the next suggestion:
Give something substantial to a charity, preferably an example of your trade.
As an example: if you sell shoes and there’s a drive for winter shoes for the homeless, putting free footwear on people who cannot afford your product won’t cut into potential customers/sales, and it will remind people where to get shoes, and that winter is coming.
If there’s nothing you can do for charity that lines up with your company, you can always just give some money away, many sites thank donors with an ad or a link, and even micro loans are a nice way to help out with friendly options to get you started.
There’s a ton of ways to get unexpected traffic to your site in a manner that will have the visitors eager to explore, and potentially buy your product. Anything else and you risk the traffic bouncing off your site and telling Google that you aren’t offering interesting content.
Today’s Google Doodle
It’s with pride that I re-share the daily doodle for the Canadarm!
Google is celebrating 31 years of Canadarm use today with the above doodle.
After 90 missions the Discovery and Atlantis Canadarm installations will be retired with the shuttles for museum display. The Canadarm that was fitted to the Endeavour was given back to the Canadian Space Agency and it is currently on display in the Quebec headquarters.
An observant reader may wonder why the PNG with ‘poor’ compression is smaller than the JPG. The answer is transparency: the PNG only saves image data (compressed losslessly) for the visible pixels, whereas the JPG has to save the additional information that ‘these pixels are white’.
Also keep in mind that we used really small images to keep this page loading quickly, the larger the image, the more of a difference compression quality can make.
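You can see the underlying reason with a quick experiment. PNG compresses pixel data with DEFLATE (the lossless algorithm zlib implements), which shrinks large uniform or transparent regions to almost nothing, while already-noisy data barely compresses at all:

```python
import os
import zlib

# 100 kB of identical bytes, standing in for a flat/transparent region,
# vs. 100 kB of random bytes, standing in for noisy photographic detail.
uniform = b"\x00" * 100_000
noisy = os.urandom(100_000)

print(len(zlib.compress(uniform)))  # a few hundred bytes at most
print(len(zlib.compress(noisy)))    # barely smaller than the input
```

This is why a mostly-transparent logo can be a tiny PNG, while the same dimensions full of photo detail are better served as JPG.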
The phrase ‘resolution’ has so many variable definitions that I would need to resolve the idea of this as a post vs. an article.
For the context of this discussion I’m speaking of the image dimensions, not the pixels-per-inch.
As an SEO blog, we’d have to be really lazy not to mention the issue of image placement/size on a site, given that Google has a clear concept of what’s most visible to your audience.
When I say ‘your audience’ it is not just a buzzword; I really mean that Google looks at its analytics data and the browser window sizes of your traffic, and actually knows when a site is delivering the right content for the majority of its user base.
So if your website is plastered with images that force the user to look for your content, and your content isn’t images, then that’s actually a problem in terms of SEO.
In fact, Google is just in the middle of moving its ‘Browser Size’ tool into the Google Analytics suite.
As you can see in this example of jQuery Mobile in the Browser Size tool, the existing results are generic and dare I say “unprofessional” looking?
In the above image we can see what % of general web users can see the elements of the page.
I would show off an example of the same page using the new tools, but Google Analytics is only for sites you own, and the new version is still in beta, throwing out ‘Not a Number’ (NaN) errors regardless of your choice of browser.
What you want to end up with, regardless, is a site that fits the screen size of your audience. So if you are running a forum that reviews ‘apps’, you probably want to aim for a design that fits your most important content above ‘the fold’ on mobile browsers (at least the current generation of mobile browsers).
Image Site Maps
Sitemaps are typically XML documents that describe your website’s pages to Google in a more technical manner.
An image site map is specifically for explaining the images that are on your site.
An image sitemap’s XML structure lets you clearly spell out each image with options like:
loc: The full URL for the image
caption: Description of the image
geo_location: Physical location, e.g. British Columbia, Canada
title: Title of the image
license: URL pointing to a license for the image
Since each entry is tied to a <loc> URL, a remotely hosted image is fine; Google understands the need for CDNs, but that remote site needs to be registered in Webmaster Tools for proper indexing of the images.
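To show how those options fit together, here is a small sketch that generates an image sitemap entry using the sitemap and image-sitemap XML namespaces. The page and image URLs are placeholders:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
IMAGE_NS = "http://www.google.com/schemas/sitemap-image/1.1"
ET.register_namespace("", SITEMAP_NS)
ET.register_namespace("image", IMAGE_NS)

def image_sitemap(page_url, images):
    """Build a minimal image sitemap. `images` is a list of dicts with
    a required 'loc' plus optional 'caption', 'geo_location', 'title',
    and 'license' keys (the options listed above)."""
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    url = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
    ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = page_url
    for img in images:
        node = ET.SubElement(url, f"{{{IMAGE_NS}}}image")
        for field in ("loc", "caption", "geo_location", "title", "license"):
            if field in img:
                ET.SubElement(node, f"{{{IMAGE_NS}}}{field}").text = img[field]
    return ET.tostring(urlset, encoding="unicode")

# Placeholder URLs for illustration only:
xml_out = image_sitemap("http://www.example.com/photos.html", [{
    "loc": "http://www.example.com/images/harbour.jpg",
    "caption": "Fishing boats at dusk",
    "geo_location": "British Columbia, Canada",
    "title": "Harbour at dusk",
}])
print(xml_out)
```

One entry per page, each listing its images, is all the structure an image sitemap needs; save the output as a .xml file and submit it in Webmaster Tools like any other sitemap.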
Once again I’ve gone a bit too far on the topic for a first round, but I will return with a deeper look beyond the surface of the issue in a part 2 post.
While the topic of this post is “quality guidelines” it is perhaps the most misunderstood part of the webmaster guidelines as it is open to interpretation; however, the core of the guideline remains the same:
“Don’t engage in tactics that are questionable, or that you would be hesitant to explain to a competitor or to Google.”
“How would you build and promote your site if there were no search engines?”
While I could go into specifics on each point, this is an instance where it is best to get the information directly from the source. Google has not really updated anything here, but does state the following suggestions:
• Make your webpages for your readers, not for Google or other search engines
• Do not deceive your visitors
• Avoid tricks/schemes designed to improve your rankings.
• Focus on what makes your site unique, valuable, or engaging and make it stand apart from others in your field
• Actively monitor your site for hacking and remove hacked content as soon as it appears
• Prevent and remove user-generated spam from your site.
The clearest recommendations Google makes are to avoid the following practices:
• Automatically generated content
• Link schemes or exchanges
• Cloaking/hidden text or links
• Suspicious redirects
• Doorway pages
• Scraped content
• Pages loaded with irrelevant keywords
• Abusing rich snippets markup
• Automated queries sent to Google
Once you have repaired your site and corrected any errors, you can submit a reconsideration request to Google:
This is part 2 of an in depth look at the newly revised Webmaster Guidelines from Google. Google has recently updated their list of best practices and suggestions for site development. To give your site the best chance of ranking well, and to keep a competitive edge, the Google guidelines should be read like the gospel.
• Did you ever wonder how Google processes your site to determine its focus and content? Try using a text-based browser like Lynx to understand what Google is using to interpret your site.
• Check to see that your web server supports the “If-Modified-Since” HTTP header. This tells Google if your content has changed since it last crawled your site, saving bandwidth and overhead.
• Use the robots.txt file to exclude directories that do not need to be crawled by Google. Keep it updated in your Webmaster Tools account and ensure that you are not blocking Googlebot from crawling your site by testing it in Webmaster Tools.
• Keep advertisements (such as Google’s AdSense and DoubleClick) to a minimum and ensure that they do not affect your rankings by excluding their crawlers in your robots.txt file.
• If you use a content management system (CMS), make sure that it supports an SEO-friendly URL structure and is easily crawled by bots.
• Test your site in several browsers (IE, Firefox, Chrome, Lynx, Opera, Safari) at different resolutions.
• Use tools to monitor page load speeds. This is becoming an increasingly bigger factor for rankings. Use Google’s Page Speed, or the Webmaster Tools Site Performance tool, to gain insights on how to boost your page load speeds.
• Make use of the robots.txt file to keep your site accessible to Googlebot
• Block unneeded/irrelevant content from being crawled
• Use SEO-friendly URLs and move away from parameter-based URLs
• Monitor your page load speed and take steps to improve it.
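The “If-Modified-Since” point above is easy to sketch. When Googlebot sends that header, a well-behaved server compares the timestamp against the resource’s last-modified time and returns 304 (Not Modified) with no body when nothing has changed. The dates below are made up for illustration:

```python
from email.utils import format_datetime, parsedate_to_datetime
from datetime import datetime, timezone

def conditional_status(last_modified, if_modified_since):
    """Server-side sketch: 304 if the resource has not changed since
    the timestamp the crawler sent, else 200 (full response)."""
    if if_modified_since is None:
        return 200
    since = parsedate_to_datetime(if_modified_since)
    return 304 if last_modified <= since else 200

# The page was last edited Nov 1; the crawler last saw it Nov 5.
last_mod = datetime(2012, 11, 1, 12, 0, tzinfo=timezone.utc)
header = format_datetime(datetime(2012, 11, 5, tzinfo=timezone.utc), usegmt=True)
print(conditional_status(last_mod, header))  # → 304
```

Most web servers handle this for static files automatically; it is dynamically generated pages (CMS output, for example) where you may need to emit the header yourself.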
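For the robots.txt points, the Webmaster Tools tester is the authoritative check, but you can also verify your rules locally before deploying them. A minimal sketch using Python’s standard-library parser, with a hypothetical robots.txt that blocks one private directory:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: blocks /private/ but leaves the rest of
# the site open to all crawlers, including Googlebot.
rules = """\
User-agent: *
Disallow: /private/
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("Googlebot", "http://www.example.com/products.html"))       # → True
print(rp.can_fetch("Googlebot", "http://www.example.com/private/notes.html"))  # → False
```

A quick check like this catches the classic mistake the guidelines warn about: an overly broad Disallow rule that accidentally blocks Googlebot from the content you actually want indexed.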