
Beanstalk's Internet Marketing Blog

At Beanstalk Search Engine Optimization we know that knowledge is power. That's the reason we started this Internet marketing blog back in 2005. We know that the better informed our visitors are, the better the decisions they will make for their websites and their online businesses. We hope you enjoy your stay and find the news, tips and ideas contained within this blog useful.


October 21, 2013

Google Q3, Mobile Ads & Hummingbird

Google announced their Q3 earnings last Thursday (October 17th) with higher-than-expected results, up 12% over the previous year at $14.89 billion. This pushed Google shares past the $1,000-per-share mark for the first time in the company's history. Before we get into how that's being accomplished, let me first insert my brief rant:

<rant>

THERE IS NO REASON GOOGLE SHOULD BE VALUED AT WHAT IT IS !!  IT’S LIKE WE’VE FORGOTTEN THE DOT COM BUBBLE BURSTING WITH THE CRAZY VALUATIONS WE GIVE TECH COMPANIES !!!

</rant>

Alright, feeling better …

At the end of the day, the higher-than-expected earnings came despite an 8% drop in the average cost per click. That drop was due mainly to growth in mobile (where rates are cheaper), and rather than indicating a decline in search, it indicates the contrary. Growth in the mobile realm is high enough to pull the overall averages down dramatically, yet desktop search did not decline. This is a case of Google winning in mobile without losing in desktop: a net gain despite a falling average cost per click.

If we think this growth in mobile wasn't key to the Hummingbird changes, we'd be kidding ourselves. Hummingbird has very little to do with desktop search and everything to do with mobile and mobile devices. With the sector growing as fast as it is, and the enormous revenue opportunities that exist there, it looks as though Google is adjusting their entire algorithm to accommodate it. And that makes sense as users demand more from mobile and from technology in general. The contest is on to feed more data faster and monetize it better. Tell us what we want before we know we want it.

Will Google be able to keep up?  Only time will tell.  It’s theirs to lose at this point but not that long ago it was Microsoft’s to lose.  Of course, Microsoft could buy Google if they wanted so …

And now, on a lighter note (albeit it only slightly relevant) let’s take a moment to remember what we have, what mobile is doing, what we take for granted and maybe even chuckle a bit …

SEO news blog post by @ 11:10 pm

Categories:Google,Update

 

 

October 8, 2013

So You Got Bit by a Flesh-Eating Panda

 

It's been a few months of watching the drastic changes Google has made in the SEO industry. Being so fresh to SEO, I never truly experienced what all the buzz around this impact was about. From my perspective as someone who owned a business before, this change was a great thing. I mean, who wouldn't want certain shady practices cleaned up and a true sense of marketing back on the table? If you own a shop, don't expect to get traffic simply by having a website or opening your doors. From what I hear, shady architecture, irrelevant and sloppy links, and no connection to your targeted demographic are, in any sense, an idiot's guide to bankruptcy. A well-built site that plays by the rules and relevant networking practices are what should drive the internet, as well as any business.

Fight club 1

So we have Penguin, Panda and now Hummingbird. What I think they should have had was a flesh-eating, four-headed dragon, a zombie bear and a vampire seagull. Why make businesses that follow bad practices feel calm with cute animals? Well, I guess it is a little more entertaining when you see a cute little bear tear the heads off of shady competitors. Sure, blackhat SEOs are under a lot of pressure to figure themselves out, but the whitehats I see primarily seem calm as Hindu cows, as Tyler Durden from Fight Club would say.

After weeks of Webcology broadcasts discussing change in the middle of change, it really seems to me that a majority of these SEOs, however uncomfortably, are welcoming all of this, and most have been prepared for months. However, the online businesses that sold their morals to the blackhat devil are under a lot of pressure. I don't feel bad for them at all, and they had better feel lucky that there is an honest playing field to help them out of rough water.

Fight club 2

Countless walking-dead sites take up positions on the net, and I don't want them popping up at the top of my search any more. We are becoming more social on the interweb, as well as practicing more transparent behaviors. It's this transparency that holds these digital entities accountable, and this is why there is a stronger social presence in search. I always do social research before I engage an online service, and if they don't interact or don't seem like an authority on a subject, then I don't bother with them. These companies need to do more than just operate and have a pretty-looking web site; they need personality, and social is what gives them this opportunity.

At one time I heard we had a thing called link building, but I would like to reintroduce it as "relevant networking". This means: tie up your boots, get out there and start making connections like your dad did; network with other companies or personalities who share a common interest and you will begin to gain traffic and business. Just like the good old days, call them up or send them an e-mail, and get creative with campaigns to draw in your crowd. It really isn't different from the good old brick-and-mortar mentality of running good inbound marketing.

Every industry evolves, and this is what keeps life interesting. This level of search evolution is a great change and will bring together our advances in technology as well as bring back healthy, vibrant and honest business. We should all ride bravely into the sunset atop our flesh-eating pandas and encourage the transformation of search innovation.

Pictures from Vanessa Mathews

SEO news blog post by @ 9:00 am

Categories:Google,Rankings

 

 

October 6, 2013

Penguin 2.1 (AKA Penguin 5)

Penguin 5

The newest iteration of the Penguin algorithm has rolled out.  According to a tweet by Google webspam chief Matt Cutts at 1:50 PM on October 4th, it will affect ~1% of all search queries.  The tweet is as follows:

For those paying attention, the link goes to the April 2012 information on the Penguin updates.  This is an indicator that nothing new has been introduced and that this is a tweaking of the current algorithm sub-set.  The core change (according to Google) is that where Penguin tended to detect quality issues with the homepage of a website only, this change takes it further, assessing the quality of the website as a whole.  This makes good sense given other recent changes at Google, which are generally seen to be pushing people to focus their energies on overall visitor experience and interaction as opposed to focusing on subsets of visitors and what you want them to do.  One need only look at the removal of keyword data from Analytics for reinforcement of this principle.

So what does this mean for you? 

Nothing that you wouldn't have gathered previously if you were paying attention for the past couple of weeks.  Focusing on visitor experience globally seems more crucial than ever, as does ensuring that you're putting out good, quality content on a regular basis to reinforce your knowledge or, alternatively, to give Google something to pull data from (insert Hummingbird here) and potentially engage visitors who aren't interested in your specific product/service.  In this I'm simply referring to using the informational portions of your website to give your generally-bouncing visitors something to do as opposed to heading straight back to Google.

While I may have issues with the expanded Knowledge Graph for what it does to publishers, clearly Google wants their visitors to get the information they want quickly, on any device, and decide for themselves what subset of that information they are interested in.  This tells us that we should do the same: while our product or service may not fit the searcher's needs, it's becoming more important than ever to ensure that we do provide them with something.  As a perk, done well, that something may well serve as great link bait. :)

SEO news blog post by @ 11:01 am

Categories:Google,Update

 

 

September 26, 2013

Much Ado About (not provided)

Our motto is (not provided).


As many of our readers may already know, earlier this week Google changed the way their search URLs function such that those who monitor their analytics (which should be all of you) will now only see (not provided) where once you would have seen your keyword. This move was met with disappointment and more than a bit of annoyance on the part of SEOs and website owners.

The stated reason is to protect the privacy of their users. The logic is that if keyword data passes, it can be picked up in the log files of the site being visited, along with data such as the IP address, allowing the user to be pinpointed with some degree of accuracy. So, to make sure that the owner of the custom t-shirt site I visited last week can't figure out it was me that searched "custom t-shirts canada", that data is now kept from the receiving site.

Now, here's the annoyance: the privacy argument would work UNTIL we realize that the same can't be said for paid traffic. If you purchase traffic through AdWords, the keyword data is still tracked. Of course it has to be, or we'd all just be paying for AdWords and trusting that we were getting the traffic we paid for and that the bids made sense, but the hypocrisy is pretty obvious: why is a user who clicks on an organic result more deserving of privacy than one who clicks on a paid result? They're not, obviously, and we're not being told the truth. BUT that's not really the discussion to be had, is it? The fact of the matter is, it's Google and they can do what they want with their own website. I believe I should get to do with my site what I want (within the confines of the law, of course), and so I won't take that away from others. So what is the real discussion …

What Do We Do Now?

While we’re all spending time arguing about the hypocrisy and crying foul, the fact of the matter is that it is what it is and now we have to figure out what to do.  We no longer have keyword data from Google.  There are two routes forward, the short term patch and the long term changes.

Short Term

In the short term we can use Advanced Segments to at least get a good idea of which keywords are producing which effects.  Essentially, we can use them to filter traffic that follows patterns similar to how specific keywords or keyword groups behaved.  This tends to only work well with large traffic groupings, so unless you get huge traffic for single phrases that behave uniquely, you'll probably have to group your traffic together: branded vs. non-branded, for example.  I'm not going to get into how this is done here in this blog post, simply because I wrote a lengthy piece on it for Search Engine Watch back when (not provided) was first becoming an issue.  You can read about it at http://searchenginewatch.com/article/2143123/How-to-Understand-Your-Google-Not-Provided-Traffic.
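To make the branded vs. non-branded split concrete, here is a minimal Python sketch of the idea.  The brand terms and the row layout are made-up examples for illustration, not anything pulled from Analytics itself:

```python
# Toy sketch: split keyword rows into branded vs. non-branded buckets so
# each group's aggregate behavior (here, average bounce rate) can be
# compared. The brand terms and row layout are hypothetical examples.

BRAND_TERMS = ("beanstalk",)  # hypothetical brand phrases

def bucket_keywords(rows):
    """rows: iterable of (keyword, visits, bounce_rate) tuples."""
    buckets = {"branded": [], "non-branded": []}
    for keyword, visits, bounce_rate in rows:
        label = ("branded"
                 if any(term in keyword.lower() for term in BRAND_TERMS)
                 else "non-branded")
        buckets[label].append((keyword, visits, bounce_rate))
    return buckets

def average_bounce(group):
    """Mean bounce rate of a bucket, 0.0 if the bucket is empty."""
    return sum(row[2] for row in group) / len(group) if group else 0.0
```

Once traffic is grouped this way, you compare each bucket's aggregate behavior rather than chasing individual phrases.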

This will only work for a while, however.  You'll see new traffic coming in and won't know how its behavior impacts the results.  Essentially, this will give you a decent idea until your traffic sources change, your site changes, or time passes.  So what do we do …

Long Term

In the long run we have no option but to make massive adjustments to the way we look at our sites.  We can no longer determine which keywords perform the best and craft the user experience for them.  Instead we have to look at our search traffic in one big bucket.  Or do we?

While this may be true for some traffic, we can still segment by the landing page (which will give you a good idea of the phrases) as well as look at groups of pages (all in a single directory, for example).  I know, for example, that this change comes right when we ourselves are redesigning our website, and in light of it I will be changing the way our directory structure and page naming system work to allow for better grouping of landing pages by common URL elements.  I imagine I won't be the last to consider this factor when building or redeveloping a website.
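As a rough illustration of that kind of grouping, here is a short Python sketch (the URLs are hypothetical) that rolls landing-page visits up by the first directory in each URL's path:

```python
from collections import defaultdict
from urllib.parse import urlparse

def visits_by_section(landing_pages):
    """Roll up (url, visits) pairs by the first path segment of each URL,
    so /services/seo/ and /services/ppc/ land in one 'services' group."""
    totals = defaultdict(int)
    for url, visits in landing_pages:
        path = urlparse(url).path.strip("/")
        section = path.split("/")[0] if path else "(home)"
        totals[section] += visits
    return dict(totals)
```

A directory structure planned with this in mind makes each section's totals a reasonable stand-in for its keyword themes.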

What will need to change is our reliance on specific pieces of data.  I know I like to see that phrase A produced X result and work to improve that.  We'll now have to look at larger groupings of data.  A downside to this (and Google will have to address it, or we as SEOs will) is that it's going to be a lot easier to mask bad practices, since specific phrase data won't be available.  I know, for example, that in an audit I was part of, we found bot traffic partly based on common phrase elements.  Today we wouldn't be able to do this and the violations would continue.

We’re All Still Learning

Over the next couple of months we'll all be adjusting our reporting practices to facilitate this change.  I expect some innovative techniques will be developed to report as accurately as possible which traffic is doing what.  I'll be staying on top of it, and we'll keep you posted here in our blog and on our Facebook page.

SEO news blog post by @ 10:48 am

Categories:Analytics,Google

 

 

July 18, 2013

A Panda Attack

Google today confirmed that there is a Panda update rolling out. I find it odd that, after telling webmasters there would no longer be announcements of Panda updates, they made this announcement, and one has to wonder why.

The official message from Google is that this Panda update is softer than previous ones and that new signals have been added. There are webmasters who are reporting recoveries from previous updates with this one. I would love to hear some feedback from any of our blog readers as to changes you may have noticed in your rankings with this latest update.

I’ll publish a followup post to this one next week after we’ve had a chance to evaluate the update.

SEO news blog post by @ 10:42 am


 

 

May 7, 2013

Google Update: Penguin #4?

Rumor has it that there's a Google update underway. While some noted changes as early as late Saturday/early Sunday, general experience has it starting on Monday, with many webmasters experiencing significant drops.

Given the significant drops reported by many webmasters, and the movement of only one or two positions we can see among our own clients here at Beanstalk, it seems that it may be the next Penguin update, which targets known unethical SEO practices.  Admittedly, this is simply an educated guess, and I'm not the first to suppose it; however, when one sees some sites taking massive drops and others, where the strategies are known to be solid, holding steady or even gaining, it's a safe assumption that whatever update it is, it's either targeting spam or devaluing bad links.

This couldn’t be better illustrated by one comment on the Webmaster World forums by user Martin Ice Web when he said:

It now seems like Google has the intention to find all the crap in the WWW and unfortunately they get it very good.
I don't know what kind of trust factor they are searching for but the sites i now see are complete without any trust.

It’s far too early to conclude much but I know we’ll be watching it closely here at Beanstalk.  If you’re interested, there’s a discussion on the subject over at Webmaster World here.

SEO news blog post by @ 9:08 am

Categories:Google

 

 

February 14, 2013

iOS popularity = Big Bills for Bing Hating

Let's call a spade a spade: Google is paying a fee to keep Bing from being the default search engine on iOS.

The fee is based on per-unit pricing, and not only are there more units than ever, but the per-unit price is also going from $3.20 last year to an estimated $3.50 per unit in 2013!

A flock of sheep attempting to enter a building with an apple logo at the same time.
Given the growing user base these should almost be rabbits?

 
Since the prices are a guesstimate, all one can honestly say is that it will cost more to be the exclusive default search engine on iOS in 2013.

However, certain 'publications' have forgone the guessing part and are rather certain that Google will pay up.

For example..

Techcrunch title: GOOGLE TO PAY APPLE 1 BILLION
An honest title: GOOGLE COULD PAY APPLE 1 BILLION

In fact, if Samsung or Google (via its Motorola Mobility acquisition) can keep one-upping each new iPhone, then the cost of licensing access to the user base will peak at a point to which it will never return.

But is it worth the money knowing how much of a search advantage Google has over Bing? Well that depends entirely on who you ask!

Apple pundit:

People will use whatever is the default, like a pack of blind sheep. Everyone knows this.

Google fan:

If that's true, then why is the Google Maps app on iOS the most popular app on the device? People clearly don't just use the default Apple Maps.

.. and really, if we’re talking about users who skipped over the BlackBerries, Nokias, Samsungs, etc.., for a specific device, then perhaps we should give them some credit for also choosing a better search experience?

After all, how many times would you let your phone load Bing before trying to switch it?

I personally would let a ‘Bing’ search happen once at the most, just to get info on “setting default search engine on iOS”. :)

SEO news blog post by @ 5:08 pm


 

 

January 24, 2013

Free Ranking Reports on Google!

I keep seeing people ask for their rank, asking what the best free ranking tools are, etc., like it's so darn hard to ask Google where your website stands for its keywords.

First of all, Google Webmaster Tools has an 'Average Position' column for popular search queries that tells you a lot of great info about your site's keywords.

Google WMT Search Queries chart
This is an example of Search Queries sorted by Average Position

 
The link to this area is:
https://www.google.com/webmasters/tools/top-search-queries?hl=en&siteUrl=
+ your URL.

Our website link would look like this:
…earch-queries?hl=en&siteUrl=http://www.beanstalk-inc.com/

You can also click at the top of the position column to sort it, or tack this onto the end of the URL:
&grid.sortBy=8&grid.currentOrder=2d
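If you'd rather not paste those parameters together by hand, a tiny script can build the link for you. This is just a Python sketch; note that urlencode will percent-encode the site URL, which is the standard encoded form of the same query string:

```python
from urllib.parse import urlencode

BASE = "https://www.google.com/webmasters/tools/top-search-queries"

def search_queries_url(site_url, sort_by_position=True):
    """Build the Webmaster Tools Search Queries link for a site,
    optionally pre-sorted by the Average Position column."""
    params = {"hl": "en", "siteUrl": site_url}
    if sort_by_position:
        # The sort parameters shown above in the post
        params["grid.sortBy"] = "8"
        params["grid.currentOrder"] = "2d"
    return BASE + "?" + urlencode(params)
```

Calling `search_queries_url("http://www.beanstalk-inc.com/")` produces the same link as the manual recipe above, sort included.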

If you aren’t getting enough data from this, first try out the download options, and load them up in a spreadsheet so you can sort/filter the data.

Most folks are surprised what a little bit of filtering and grouping can accomplish to provide you with a fresh perspective on data.

Still not enough? Well there’s a zillion free tools that will gladly trade your URL and keyword targets for a limited ranking report.

This is valuable data, so why not trade something free for it? Google does!

Indeed there’s enough free tools, that I won’t even bother mentioning one. Why don’t we just make one?

It's not 'hard' to get your rank, really; let's break it down:

  • Make a list of phrases you are trying to rank for
  • Do a Google search for your first phrase
  • Keep searching until you find your site
  • Take note of the position
  • Repeat

So how does the average person automate this? It gets pretty technical, but all the resources are out there, and free!

To break that down in simple terms:

  • Set up a server or install XAMPP
  • Set up a database/table to store your rankings by date
  • Make a page that uses cURL to fetch results for your keywords
  • Set up a schedule to execute the PHP page regularly

Bingo, you now have your own ranking report tool, and nobody is the wiser besides Google, and they are usually too busy to care that you're extra curious about your rankings.
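To give a flavour of the position-tracking part (leaving the fetching, storage, and scheduling aside), here is a small sketch in Python rather than the PHP page described above. It assumes you have already parsed a results page down to an ordered list of result URLs; the domain names used are hypothetical:

```python
from urllib.parse import urlparse

def find_rank(result_urls, your_domain):
    """Return the 1-based position of the first result hosted on
    your_domain (or a subdomain of it), or None if it never appears."""
    for position, url in enumerate(result_urls, start=1):
        host = urlparse(url).netloc.lower()
        if host == your_domain or host.endswith("." + your_domain):
            return position
    return None
```

Run one check per phrase, write the (date, phrase, position) rows to your table, and the ranking report is just a query over that table.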

Nerd reading a book

Don't get me wrong, there are a lot of fine details to explain, and not everyone is comfortable installing programs like this or scripting, but I am going to look at getting permission to make this a step-by-step how-to guide with full downloads so even novices can give it a try.

A final point to make is that excessive/automated queries on Google are a breach of their TOS and could result in annoying blocks/complaints from Google if you were to attempt to use this method for a large set of keyword phrases or wanted the reports updated constantly.

If you are a 'power user' who needs a lot of data, you'll end up paying someone: either you pay to use someone's API key at a premium, or you get your own API key from Google and only pay for what you use.

Seems like an easy decision to me!

SEO news blog post by @ 1:03 pm


 

 

January 10, 2013

Missing Authorship Photos?

If you've become accustomed to seeing your charming mug in the SERPs when you are Googling your keywords, it might be rather unsettling to see those images suddenly disappear.

Rich Snippet SERP example

Fear not! This isn’t something you have done, or not done, this is actually kicking up a bit of fuss on the SEO forums/discussion areas today and clearly looks to be an issue on Google’s end.

In fact, if you were in need of reassurance, all you have to do is hop into your Webmaster Tools account and visit the 'Rich Snippets Tool' to get a preview of what your SERPs would normally look like.

If you are sure that you’re not part of the current issue, or you’re just curious what we’re talking about, the Troubleshooting Rich Snippets page is a great resource to tackle possible problems.

Google invests another $200,000,000.00 in renewable energy..

I could have written .2 billion, or 200 million, or even 200 thousand thousands, but why play with such a large sum of money?

Google certainly isn't playing around; with this latest investment, Google's grand total in renewable/clean energy is over $1 billion US and growing.

This isn’t just charity either, some of these investments are just smart business because the returns are very fixed and low risk.

Illustration of power saved by using GMail vs. Postal Mail

Being honest about pollution is brave, and bragging about your low footprint is begging for trouble, but Google marches on stating:

“100 searches on Google has about the same footprint as drying your hands with a standard electric dryer, ironing a shirt, or producing 1.5 tablespoons of orange juice.”

You can read more about Google’s efforts to reduce, eliminate, and assist others with power consumption/carbon footprints, over on the Google Green Pages.

SEO news blog post by @ 11:57 am


 

 

December 5, 2012

How Short Content Can Help you Rank

A common misconception is that you need to provide at least 500 words of on-site content to have your page rank with Google. Your rankings depend on many factors and signals and are not necessarily determined by the number of words on a page, no matter how well written they are.

copywriting

It all comes down to creating unique content that is not only interesting, but engages your viewers and drives ongoing conversations in the form of replies or comments. In a recent Google Webmaster Help thread, John Mueller of Google clarified this exact point.

"Rest assured, Googlebot doesn’t just count words on a page or in an article, even short articles can be very useful & compelling to users. For example, we also crawl and index tweets, which are at most 140 characters long. That said, if you have users who love your site and engage with it regularly, allowing them to share comments on your articles is also a great way to bring additional information onto the page. Sometimes a short article can trigger a longer discussion — and sometimes users are looking for discussions like that in search. That said, one recommendation that I’d like to add is to make sure that your content is really unique (not just rewritten, autogenerated, etc) and of high-quality."

Google crawls everything from full articles to 140-character tweets, and recognizes that even short comments or articles can trigger engaging conversations. There is no magic number; there are no "tricks" to SEO. Create unique and valuable content, and your visitors and rankings will follow.

SEO news blog post by @ 10:56 am


 

 

Copyright© 2004-2014
Beanstalk Search Engine Optimization, Inc.
All rights reserved.