
Will Google’s EU Woes Ever End?


The EU's "Right to be Forgotten" ruling seems to be one endless headache for Google. The latest reports suggest that Google's handling of removal requests is the latest item up for scrutiny by European data protection authorities. What seems to be the problem? Well, it appears that Google is only removing links from search results on EU domains such as Google.co.uk, but not on Google.com.

This effectively defeats the purpose of the ruling, as the offending links can still be found through a non-localized search. Google was recently accused of purposely misinterpreting the ruling in order to stir up advocates who claim the ruling itself is censorship.

The issue of censorship seems to be a hazy line drawn in the sand. The intention is that content should, over time, lose relevance as the facts pertaining to an individual become outdated, but the decisions of who and what events should be included are still largely up for debate – and by "debate" I mean up to Google.

Another point of contention is Google’s notification process. Currently the search giant sends notification to sites that have had search results removed. The data protection authorities have voiced concerns over the effects this may have on those submitting the removal request.

Because the right to be forgotten ruling is still in its infancy, there's no doubt that this will be a hot topic for years to come as data protection authorities and Google hash out regulations and guidelines to cover most removal requests. Google is reported to have received some 91,000 removal requests affecting over 328,000 URLs. The countries with the largest numbers are reported as France and Germany, followed by the UK.

SEO news blog post by @ 12:00 pm on July 26, 2014


 

Google Updates Local Search Algorithm


Yesterday, SEO Roundtable's Barry Schwartz reported that Google has launched an algorithm update targeted at local search. The aim of the update (as always) is to provide more accurate and relevant local results, more closely tied to traditional ranking signals. It is suggested that the update will affect both Google Maps and standard web search results.

It is unknown at this point what percentage of search results will be affected by this update, but Schwartz has speculated that the changes will be reasonably significant. It is too soon to tell if this update has hit the mark, as many local search marketers are reporting the pendulum swing in positioning from one extreme to the other that often follows an algorithm update before things settle somewhere in the middle.

SEO news blog post by @ 11:11 am on July 25, 2014


 

Matt Cutts Answers “Will backlinks lose their importance in ranking?”

In the latest video published on Google's Webmaster Help channel, Matt Cutts (head of Google's Webspam team) weighs in on whether backlinks will lose their importance as a ranking metric in the near future. The answer? Ideally, over time, yes.

While Mr. Cutts states that links still have many years left in them, Google has made no secret that quality content and resource creation will be the way of the future. There is no denying that backlinks will still hold value as a measure of quality resources, but as Google shifts its focus towards understanding conversational language in search, we may see their influence take a back seat as a ranking factor.

Matt Cutts has stated:

"As we get better at understanding who wrote something and what the real meaning of that content is, inevitably over time, there will be a little less emphasis on links. I would expect that for the next few years we will continue to use links to assess the basic reputation of pages and of sites."

So while backlinks may be safe for now, it may be wise to begin investing in a good resource and content strategy that will encourage organic linking and establish your business as an authority, before the hammer falls.

SEO news blog post by @ 4:30 pm on May 6, 2014

Categories:Google,link building

 

Google Q3, Mobile Ads & Hummingbird

Google announced their Q3 earnings last Thursday (October 17th), with higher-than-expected revenue of $14.89 billion, up 12% over the previous year. This resulted in Google shares crossing the $1,000-per-share mark for the first time in the company's history. Before we get into how that's being accomplished, let me first insert my brief rant:

<rant>

THERE IS NO REASON GOOGLE SHOULD BE VALUED AT WHAT IT IS !!  IT’S LIKE WE’VE FORGOTTEN THE DOT COM BUBBLE BURSTING WITH THE CRAZY VALUATIONS WE GIVE TECH COMPANIES !!!

</rant>

Alright, feeling better …

At the end of the day, the higher-than-expected earnings came on the back of an 8% drop in the average cost per click. That drop was due mainly to growth in mobile (where the rates are cheaper), and rather than indicating a decline in search, it indicates the contrary. Growth in the mobile realm is high enough to drag the overall averages down dramatically, yet desktop search did not decline. This is a case of Google winning in mobile and not losing in desktop, creating a net gain even as the average cost per click dropped.

If we think this growth in mobile wasn't the key to the Hummingbird changes, we'd be kidding ourselves. Hummingbird has very little to do with desktop search and everything to do with mobile and mobile devices. With the growth in the sector being what it is, and the enormous revenue opportunities that exist there, it looks as though Google is adjusting their entire algorithm to accommodate it. And it makes sense, as users demand more from mobile and from technology in general. The contest is on to feed more data faster and monetize it better. Tell us what we want before we know we want it.

Will Google be able to keep up?  Only time will tell.  It’s theirs to lose at this point but not that long ago it was Microsoft’s to lose.  Of course, Microsoft could buy Google if they wanted so …

And now, on a lighter note (albeit only slightly relevant), let's take a moment to remember what we have, what mobile is doing, what we take for granted, and maybe even chuckle a bit …

SEO news blog post by @ 11:10 pm on October 21, 2013

Categories:Google,Update

 

So you got bit by a flesh eating panda

 

It's been a few months since Google made drastic changes to the SEO industry. Being so fresh to SEO, I never truly experienced what the impact of this buzz was all about. My perspective as someone who previously owned a business was that this change was a great thing. I mean, who wouldn't want certain shady practices cleaned up and a true sense of marketing back on the table? If you own a shop, don't expect to get traffic simply by having a website or opening your doors. From what I hear, shady architecture, irrelevant or sloppy links, and no connection to your targeted demographic are, in any sense, an idiot's guide to bankruptcy. A well-built site that plays by the rules and relevant networking practices are what should drive the internet, as well as any business.


So we have Penguin, Panda and now Hummingbird. What I think they should have had was a flesh-eating four-headed dragon, a zombie bear and a vampire seagull. Why make businesses that follow bad practices feel calm with cute animals? Well, I guess it is a little more entertaining when you see a cute little bear tear the heads off shady competitors. Sure, blackhat SEOs are under a lot of pressure to figure themselves out, but the whitehats primarily seem calm as Hindu cows, as Tyler Durden from Fight Club would say.

After weeks of Webcology broadcasts discussing change in the middle of change, it really seems to me that a majority of these SEOs are welcoming all of this, however uncomfortably, and most have been prepared for months. The online businesses that sold their morals to the blackhat devil, however, are under a lot of pressure. I don't feel bad for them at all, and they had better feel lucky that there is an honest playing field to help them out of rough water.


Countless walking-dead sites take up positions on the net, and I don't want them popping up at the top of my search any more. We are becoming more social on the interweb, as well as practicing more transparent behaviors. It's this transparency that holds these digital entities accountable, and this is why there is a stronger social presence in search. I always do social research before I engage an online service, and if they don't interact or don't seem like an authority on a subject, then I don't bother with them. These companies need to do more than just operate and have a pretty-looking website; they need personality, and social is what gives them that opportunity.

At one time I heard we had a thing called link building, but I would like to reintroduce it as "relevant networking". This means: tie up your boots, get out there and start making connections like your Dad did; network with other companies or personalities who share a common interest and you will begin to gain traffic and business. Just like the good old days, call them up or send them an e-mail and get creative with campaigns to draw in your crowd. It really isn't different from the good old brick-and-mortar mentality of running good inbound marketing.

Every industry evolves, and this is what keeps life interesting. This level of search evolution is a great change that will bring together our advances in technology and bring back healthy, vibrant and honest business. We should all ride bravely into the sunset atop our flesh-eating pandas and encourage the transformation of search innovation.

Pictures from Vanessa Mathews

SEO news blog post by @ 9:00 am on October 8, 2013

Categories:Google,Rankings

 

Penguin 2.1 (AKA Penguin 5)


The newest iteration of the Penguin algorithm has rolled out. According to a tweet by Google czar Matt Cutts at 1:50 PM on October 4th, it will affect ~1% of all search queries.

For those paying attention, the link in the tweet goes to the April 2012 information on the Penguin updates. This is an indicator that nothing new has been introduced and that this is a tweaking of the current algorithm sub-set. The core change (according to Google) is that where Penguin tended to detect quality issues with the homepage of a website only, this change takes it further, assessing the quality of a website as a whole. This makes good sense given other recent changes at Google, which are generally seen as pushing people to focus their energies on overall visitor experience and interaction, as opposed to focusing on subsets of visitors and what you want them to do. One need only look at the removal of keyword data from Analytics for reinforcement of this principle.

So what does this mean for you? 

Nothing that you wouldn't have gathered previously if you were paying attention for the past couple of weeks. Focusing on visitor experience globally seems more crucial than ever, as does ensuring that you're putting out good, quality content on a regular basis to reinforce your knowledge or, alternatively, to give Google something to pull data from (insert Hummingbird here) and occupy visitors who aren't interested in your specific product/service. By this I'm simply referring to using the informational portions of your website to give your generally-bouncing visitors something to do, as opposed to heading straight back to Google.

While I may have issues with the expanded knowledge graph for what it does to publishers, clearly Google wants their visitors to get the information they want quickly, on any device, and decide for themselves what subset of that information they are interested in. This tells us that we should do the same: while our product or service may not fit the searcher's needs, it's becoming more important than ever to ensure that we do provide them with something. As a perk, done well, that something may well serve as great link bait. :)

SEO news blog post by @ 11:01 am on October 6, 2013

Categories:Google,Update

 

Much Ado About (not provided)

Our motto is (not provided).

As many of our readers may already know, earlier this week Google changed the way their search URLs function so that, if you monitor your analytics (which should be all of you), you'll now only see (not provided) where once you would have seen your keyword. This move was met with disappointment and more than a bit of annoyance on the part of SEOs and website owners. The stated reason is to protect the privacy of their users. The logic is that if keyword data passes, it can be picked up in the log files of the site being visited, along with data such as the IP address, which would allow the user to be pinpointed with some degree of accuracy. So, to make sure that the owner of the custom t-shirt site I visited last week can't figure out it was me that searched "custom t-shirts canada", that data is now kept from the receiving site.

Now, here's the annoyance: to say that it's a case of protecting privacy would work UNTIL we realize that the same can't be said for paid traffic. If you purchase traffic through AdWords, the data is tracked. Of course it has to be, or we'd all just be paying for AdWords and trusting that we were getting the traffic we paid for and that the bids made sense, but the hypocrisy is pretty obvious: why is a user who clicks on an organic result more deserving of privacy than one who clicks on a paid result? They're not, obviously, and we're not being told the truth. BUT that's not really the discussion to be had, is it? The fact of the matter is, it's Google, and they can do what they want with their own website. I believe I should get to do with my site what I want (within the confines of the law, of course), and so I won't take that away from others. So what is the real discussion …
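To make the mechanics concrete, here's a minimal sketch (the URLs and function name are illustrative, not any real analytics package) of how a receiving site's log processing could once read the keyword out of a Google referrer, and why secure search leaves nothing to read:

```python
from urllib.parse import urlparse, parse_qs

def keyword_from_referrer(referrer):
    """Pull the search phrase out of a referrer URL's q parameter, if present."""
    query = parse_qs(urlparse(referrer).query)
    terms = query.get("q")
    return terms[0] if terms else "(not provided)"

# An old-style HTTP referrer carried the full query string:
old_referrer = "http://www.google.com/search?q=custom+t-shirts+canada"
# Secure search now sends a referrer with no q parameter at all:
new_referrer = "https://www.google.com/"

print(keyword_from_referrer(old_referrer))  # custom t-shirts canada
print(keyword_from_referrer(new_referrer))  # (not provided)
```

Note that AdWords clicks sidestep this entirely because the keyword is passed through tagged landing-page parameters rather than the referrer, which is exactly the asymmetry complained about above.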

What Do We Do Now?

While we’re all spending time arguing about the hypocrisy and crying foul, the fact of the matter is that it is what it is and now we have to figure out what to do.  We no longer have keyword data from Google.  There are two routes forward, the short term patch and the long term changes.

Short Term

In the short term we can use Advanced Segments to at least get a good idea of which keywords are producing which effects. Essentially, we can use them to filter traffic that follows patterns similar to how specific keywords or keyword groups behaved. This tends to only work well with large traffic groupings, so unless you get huge traffic for single phrases that behave uniquely, you'll probably have to group your traffic together – branded vs non-branded, for example. I'm not going to get into how this is done in this blog post, simply because I wrote a lengthy piece on it for Search Engine Watch back when (not provided) was first becoming an issue. You can read about it at http://searchenginewatch.com/article/2143123/How-to-Understand-Your-Google-Not-Provided-Traffic.
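As a rough sketch of that branded vs non-branded idea (the keywords, metric and averages below are made-up assumptions, not the actual Advanced Segments mechanics), you could profile how known keyword groups behaved before the change and assign a (not provided) visit to whichever profile it sits closer to:

```python
# Hypothetical pre-(not provided) rows: (keyword, pages per visit).
historical = [
    ("beanstalk seo", 5.2),
    ("beanstalk inc", 4.8),
    ("seo services", 1.9),
    ("link building tips", 1.4),
]

BRAND_TERMS = ("beanstalk",)  # assumption: branded queries contain the brand name

def is_branded(keyword):
    return any(term in keyword for term in BRAND_TERMS)

branded = [pages for kw, pages in historical if is_branded(kw)]
non_branded = [pages for kw, pages in historical if not is_branded(kw)]

branded_avg = sum(branded) / len(branded)
non_branded_avg = sum(non_branded) / len(non_branded)

def classify(pages_per_visit):
    """Assign a (not provided) visit to the closer historical behavior profile."""
    if abs(pages_per_visit - branded_avg) < abs(pages_per_visit - non_branded_avg):
        return "branded"
    return "non-branded"

print(classify(4.5))  # behaves like the historical branded group
```

A single behavioral metric is obviously crude; the point is only that grouped historical behavior can stand in for the keyword label you no longer get.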

This will only work for a while, however. You'll see new traffic coming in and won't know how its behavior impacts the results. Essentially, this will give you a decent idea until your traffic sources change, your site changes, or enough time passes. So what do we do …

Long Term

In the long run we have no option but to make massive adjustments to the way we look at our sites. We can no longer determine which keywords perform the best and craft the user experience for them. Instead we have to look at our search traffic as one big bucket. Or do we?

While this may be true for some traffic, we can still segment by the landing page (which will give you a good idea of the phrases involved), as well as look at groups of pages (all in a single directory, for example). I know, for example, that this change comes right when we ourselves are redesigning our website, and in light of it I will be changing the way our directory structure and page naming system work to allow for better grouping of landing pages by common URL elements. I imagine I won't be the last to consider this factor when building or redeveloping a website.
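A minimal sketch of that grouping idea (the paths and visit counts here are hypothetical) might aggregate landing-page traffic by top-level directory:

```python
from collections import defaultdict
from urllib.parse import urlparse

# Hypothetical (landing page, visits) rows exported from analytics.
rows = [
    ("/services/seo/", 120),
    ("/services/ppc/", 45),
    ("/blog/penguin-update/", 80),
    ("/blog/panda-update/", 60),
    ("/", 200),
]

def top_directory(path):
    """Return the first path segment, e.g. '/blog/post/' -> 'blog'."""
    segments = [s for s in urlparse(path).path.split("/") if s]
    return segments[0] if segments else "(root)"

totals = defaultdict(int)
for path, visits in rows:
    totals[top_directory(path)] += visits

# Report site sections from most to least visited.
for section, visits in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(section, visits)
```

This is exactly why a directory structure organized around content themes pays off: the first URL segment becomes a free keyword-group label.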

What will need to change is our reliance on specific pieces of data. I like to see that phrase A produced result X and work to improve it; now we'll have to look at larger groupings of data. A downside to this (and Google will have to address it, or we as SEOs will) is that it's going to be a lot easier to mask bad practices, as specific phrase data won't be available. I know, for example, that in an audit I was part of, we found bot traffic based in part on common phrase elements. Today we wouldn't be able to do this, and the violations would continue.

We’re All Still Learning

Over the next couple of months we'll all be adjusting our reporting practices to accommodate this change. I expect some innovative techniques will be developed to report, as accurately as possible, what traffic is doing what. I'll be staying on top of it, and we'll keep you posted here on our blog and on our Facebook page.

SEO news blog post by @ 10:48 am on September 26, 2013

Categories:Analytics,Google

 

A Panda Attack

Google today confirmed that there is a Panda update rolling out. I find it odd that, after telling webmasters there would no longer be announcements of Panda updates, they made this announcement, and one has to wonder why.

The official message from Google is that this Panda update is softer than previous ones and that new signals have been added. Some webmasters are reporting recoveries from previous updates with this one. I would love to hear feedback from any of our blog readers on changes you may have noticed in your rankings with this latest update.

I’ll publish a followup post to this one next week after we’ve had a chance to evaluate the update.

SEO news blog post by @ 10:42 am on July 18, 2013


 

Google Update: Penguin #4?

Rumor has it that there's a Google update underway. While some noted changes as early as late Saturday/early Sunday, general experience has it starting on Monday, with many webmasters experiencing significant drops.

Given the significant drops reported by many webmasters, and the movement of only one or two positions we can see among our own clients here at Beanstalk, it seems that it may be the next Penguin update, which targets known unethical SEO practices. Admittedly, this is simply an educated guess, and I'm not the first to suppose such. However, when one sees some sites taking massive drops while others, where the strategies are known to be solid, hold steady or even gain, it's a safe assumption that whatever update it is … it either targets spam or devalues bad links.

This couldn’t be better illustrated by one comment on the Webmaster World forums by user Martin Ice Web when he said:

It now seems like Google has the intention to find all the crap in the WWW and unfortunately they get it very good.
I don't know what kind of trust factor they are searching for but the sites i now see are complete without any trust.

It’s far too early to conclude much but I know we’ll be watching it closely here at Beanstalk.  If you’re interested, there’s a discussion on the subject over at Webmaster World here.

SEO news blog post by @ 9:08 am on May 7, 2013

Categories:Google

 

iOS popularity = Big Bills for Bing Hating

Let's call a spade a spade: Google is paying a fee to keep Bing from being the default search engine on iOS.

The fee is based on per-unit pricing, and not only are there more units than ever, but the per-unit price is also going from $3.20 last year to an estimated $3.50 per unit in 2013!

A flock of sheep attempting to enter a building with an apple logo at the same time.
Given the growing user base these should almost be rabbits?

 
Since the prices are a guesstimate, all one can honestly say is that it will cost more for the exclusive right to be the default search engine on iOS in 2013.

However, there are certain 'publications' that have forgone the guessing part and are rather certain that Google will pay up.

For example..

Techcrunch title: GOOGLE TO PAY APPLE 1 BILLION
An honest title: GOOGLE COULD PAY APPLE 1 BILLION

In fact, if Samsung or Google (via its Motorola Mobility acquisition) can keep one-upping each of the new iPhones, then the cost of licensing access to the user base will peak at a point to which it will never return.

But is it worth the money knowing how much of a search advantage Google has over Bing? Well that depends entirely on who you ask!

Apple pundit:

People will use whatever is the default, like a pack of blind sheep. Everyone knows this.

Google fan:

If that's true, then why is the Google Maps app on iOS the most popular app on the device? People clearly don't just use the default Apple Maps.

.. and really, if we're talking about users who skipped over the BlackBerrys, Nokias, Samsungs, etc., for a specific device, then perhaps we should give them some credit for also choosing a better search experience?

After all, how many times would you let your phone load Bing before trying to switch it?

I personally would let a 'Bing' search happen once at most, just to get info on "setting the default search engine on iOS". :)

SEO news blog post by @ 5:08 pm on February 14, 2013


 

Copyright© 2004-2014
Beanstalk Search Engine Optimization, Inc.
All rights reserved.