Beanstalk's Internet Marketing Blog

At Beanstalk Search Engine Optimization we know that knowledge is power. That's the reason we started this Internet marketing blog back in 2005. We know that the better informed our visitors are, the better the decisions they will make for their websites and their online businesses. We hope you enjoy your stay and find the news, tips and ideas contained within this blog useful.


January 7, 2014

Now Think About What You’ve Done: Rap Genius Explains Itself

Last week we looked at Rap Genius' phenomenally dumb SEO strategy, which earned them a Google penalty and all but removed them from the search results. Now, after ten days, the site appears to have cleared the penalty and is ranking for its own name (hey, it's a start). They've also published a lengthy blog post to come clean about exactly where they began and what went so wrong. It's a classic tale of the young upstart: growing up from nothing, riding high on the waves of success, and then blowing it all on a stupid, lazy move because of a mistaken belief that the rules no longer applied.

In short, Rap Genius has lived a 10-day version of Wall Street, give or take a few Sheen family members.

As they explain on their website, the Rap Genius founders started out small, and their attempts to reach out to major music sites had minimal success. But their innovative contribution-focused layout—which allows users to annotate any lyric to add comments and explanations about what it might mean—was a draw, and their early users often had music blogs of their own. As they began using Rap Genius as a resource, linking to track pages in their posts, the site saw good growth and an increased social media presence. Bloggers linked to Rap Genius' track pages because they became the go-to source for good rap analysis; in return, Rap Genius linked to and talked about the blogs that had become part of their everyday communication cycle.

Greed is Good?
Rap Genius has been in the headlines before, for both good and bad publicity. They began to collaborate with publications like The Atlantic and the Huffington Post—sometimes on a piece about using rap to teach science, and sometimes on the wild and controversial behavior of the site’s founders. All the while their blog network would link to Rap Genius—often in the context of their post, but occasionally a writer would include the links for a whole album at the bottom of a review. Rap Genius made it easy to do this by creating an embed function on their album pages, so that bloggers could instantly grab all the links to the tracks with a simple copy-paste.

But Rap Genius got greedy, and they began to promise to promote any blog whose owner linked to an album, regardless of the post's content. And, to their credit, Rap Genius acknowledges that they were completely stupid about the system: "The dubious-sounding 'Rap Genius blog affiliate program', the self-parodic used car salesman tone of the email to John, the lack of any discretion in the targeting of a partner – this all looked really bad. And it was really bad: a lazy and likely ineffective 'strategy', so over-the-top in its obviousness that it was practically begging for a response from Google," they say in their explanatory blog post.

The blog post also outlines (in a lot of detail) the method by which Rap Genius removed as many of the problematic links as possible; in the interest of total openness, it’s actually pretty nice to see them give some insight into their situation, realize that they broke the rules, and apologize to both Google and their fans. While getting a penalty is pretty humiliating, it’s always better to cop to it, fix it, and promise to do better in the future, rather than trying to dance around the issue or lay blame elsewhere. If you’re going to get caught, be honest about it; in the end, at least for me, Rap Genius looks a little bit smarter for how they responded, and hopefully they’ve learned their lesson.

SEO news blog post by @ 3:58 pm

Categories: link building, Rankings

 

 

October 21, 2013

Google Q3, Mobile Ads & Hummingbird

Google announced their Q3 earnings last Thursday (October 17th) with higher-than-expected revenue, up 12% over the previous year at $14.89 billion. This resulted in Google shares crossing the $1000-per-share mark for the first time in the company's history. Before we get into how that's being accomplished, let me first insert my brief rant:

<rant>

THERE IS NO REASON GOOGLE SHOULD BE VALUED AT WHAT IT IS!! IT'S LIKE WE'VE FORGOTTEN THE DOT-COM BUBBLE BURSTING WITH THE CRAZY VALUATIONS WE GIVE TECH COMPANIES!!!

</rant>

Alright, feeling better …

At the end of the day, the higher-than-expected earnings came on the back of an 8% drop in the average cost per click. That drop was due mainly to growth in mobile, where rates are cheaper, and rather than indicating a decline in search it indicates the contrary. Growth in the mobile realm is high enough to drag down the overall average dramatically, yet desktop search did not decline. This is a case of Google winning in mobile without losing in desktop: a net gain even as the average cost per click drops.

If we don't think this growth in mobile was the key to the Hummingbird changes, we'd be kidding ourselves. Hummingbird has very little to do with desktop search and everything to do with mobile devices. With the growth in the sector being what it is, and the enormous revenue opportunities that exist there, it looks as though Google is adjusting their entire algorithm to accommodate it. And it makes sense, as users demand more from mobile and from technology in general. The contest is on to feed more data, faster, and monetize it better. Tell us what we want before we know we want it.

Will Google be able to keep up?  Only time will tell.  It’s theirs to lose at this point but not that long ago it was Microsoft’s to lose.  Of course, Microsoft could buy Google if they wanted so …

And now, on a lighter note (albeit only slightly relevant), let's take a moment to remember what we have, what mobile is doing, what we take for granted, and maybe even chuckle a bit …

SEO news blog post by @ 11:10 pm

Categories: Google, Update

 

 

October 8, 2013

So you got bit by a flesh eating panda

 

It's been a few months of watching the drastic changes Google has made in the SEO industry. Being so fresh to SEO, I never truly experienced what all the buzz was about. From my perspective as a former business owner, this change was a great thing. I mean, who wouldn't want certain shady practices cleaned up and a true sense of marketing back on the table? If you own a shop, don't expect to get traffic simply by having a website or opening your doors. From what I hear, shady architecture, irrelevant or sloppy links, and no connection to your targeted demographic amount to an idiot's guide to bankruptcy. A well-built site that plays by the rules, backed by relevant networking practices, is what should drive the internet as well as any business.

Fight club 1

So we have Penguin, Panda and now Hummingbird. What I think they should have had was a flesh-eating, four-headed dragon, a zombie bear and a vampire seagull. Why make businesses that follow bad practices feel calm with cute animals? Well, I guess it is a little more entertaining when you see a cute little bear tear the heads off of shady competitors. Sure, blackhat SEOs are under a lot of pressure to figure themselves out, but the whitehats primarily seem "calm as Hindu cows," as Tyler Durden from Fight Club would say.

After weeks of Webcology broadcasts discussing change in the middle of change, it really seems to me that a majority of these SEOs are welcoming all of this, however uncomfortably, and most have been prepared for months. The online businesses that sold their morals to the blackhat devil, however, are under a lot of pressure. I don't feel bad for them at all, and they should feel lucky that there is an honest playing field to help them out of rough water.

Fight club 2

Countless walking-dead sites take up positions on the net, and I don't want them popping up at the top of my searches any more. We are becoming more social on the interweb, as well as practicing more transparent behaviors. It's this transparency that holds these digital entities accountable, and it's why there is a stronger social presence in search. I always do social research before I engage an online service, and if a company doesn't interact or come across as an authority on a subject, I don't bother with them. These companies need to do more than just operate and have a pretty-looking website; they need personality, and social is what gives them that opportunity.

At one time I heard we had a thing called link building, but I would like to reintroduce it as "relevant networking." This means: tie up your boots, get out there and start making connections like your dad did; network with other companies or personalities who share a common interest, and you will begin to gain traffic and business. Just like the good old days: call them up or send them an e-mail, and get creative with campaigns to draw in your crowd. It really isn't different from the good old brick-and-mortar mentality of running good inbound marketing.

Every industry evolves, and this is what keeps life interesting. This level of search evolution is a great change that will bring together our advances in technology and bring back healthy, vibrant and honest business. We should all ride bravely into the sunset atop our flesh-eating pandas and encourage the transformation of search innovation.

Pictures from Vanessa Mathews

SEO news blog post by @ 9:00 am

Categories: Google, Rankings

 

 

October 6, 2013

Penguin 2.1 (AKA Penguin 5)


The newest iteration of the Penguin algorithm has rolled out. According to a tweet from Google's Matt Cutts at 1:50 PM on October 4th, it will affect roughly 1% of all search queries.

For those paying attention, the link in the tweet goes to the April 2012 information on the Penguin updates. This is an indicator that nothing new has been introduced and that this is a tweaking of the current algorithm subset. The core change (according to Google) is that where Penguin tended to detect quality issues with the homepage of a website only, this change goes further, assessing the quality of a website as a whole. This makes good sense given other recent changes at Google, which are generally seen as pushing people to focus their energies on the overall visitor experience and interaction, as opposed to focusing on subsets of visitors and what you want them to do. One need only look at the removal of keyword data from Analytics for reinforcement of this principle.

So what does this mean for you? 

Nothing that you wouldn't have gathered previously if you were paying attention over the past couple of weeks. Focusing on the visitor experience globally seems more crucial than ever, as does ensuring that you're putting out good, quality content on a regular basis: content that reinforces your knowledge or, alternatively, gives Google something to pull data from (insert Hummingbird here) and potentially engages visitors who aren't interested in your specific product/service. By this I simply mean using the informational portions of your website to give your generally-bouncing visitors something to do, as opposed to heading straight back to Google.

While I may have issues with the expanded knowledge graph for what it does to publishers, clearly Google wants their visitors to get the information they want quickly, on any device, and decide for themselves what subset of that information they are interested in. This tells us that we should do the same: while our product or service may not fit the searcher's needs, it's becoming more important than ever to ensure that we provide them with something. As a perk, done well, that something may well serve as great link bait. :)

SEO news blog post by @ 11:01 am

Categories: Google, Update

 

 

September 26, 2013

Much Ado About (not provided)

Our motto is (not provided).


As many of our readers may already know, earlier this week Google changed the way their search results pass referrer data, such that those who monitor their analytics (which should be all of you) will now only see (not provided) where once they would have seen the visitor's keyword. This move was met with disappointment, and more than a bit of annoyance, on the part of SEOs and website owners.

The reason (so they say) is to protect the privacy of their users. The logic is that if keyword data is passed, it can be picked up in the log files of the site being visited along with data such as the IP address, which would allow the user to be pinpointed with some degree of accuracy. So, to make sure that the owner of the custom t-shirt site I visited last week can't figure out it was me that searched "custom t-shirts canada," that data is now kept from the receiving site.

Now, here's the annoyance: the privacy argument would hold up UNTIL we realize that the same can't be said for paid traffic. If you purchase traffic through AdWords, the keyword data is still tracked. Of course it has to be, or we'd all just be paying for AdWords and trusting that we were getting the traffic we paid for and that the bids made sense. But the hypocrisy is pretty obvious: why is a user who clicks on an organic result more deserving of privacy than one who clicks on a paid result? They're not, obviously, and we're not being told the truth. BUT that's not really the discussion to be had, is it? The fact of the matter is, it's Google, and they can do what they want with their own website. I believe I should get to do what I want with my site (within the confines of the law, of course), and so I won't take that away from others. So what is the real discussion …

What Do We Do Now?

While we're all spending time arguing about the hypocrisy and crying foul, the fact of the matter is that it is what it is, and now we have to figure out what to do. We no longer have keyword data from Google. There are two routes forward: the short-term patch and the long-term changes.

Short Term

In the short term we can use Advanced Segments to at least get a good idea of which keywords are producing which effects. Essentially, we can use them to filter traffic that follows patterns similar to how specific keywords or keyword groups behaved. This tends to only work well with large traffic groupings, so unless you get huge traffic for single phrases that behave uniquely, you'll probably have to group your traffic together: branded vs. non-branded, for example. I'm not going to get into how this is done here in this blog post, simply because I wrote a lengthy piece on it for Search Engine Watch back when (not provided) first became an issue. You can read about it at http://searchenginewatch.com/article/2143123/How-to-Understand-Your-Google-Not-Provided-Traffic.
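As a rough sketch of that branded vs. non-branded grouping, the bucketing logic looks something like the following. The keyword rows and the brand term are hypothetical; in practice the rows would come from your own historical analytics export.

```python
# Hypothetical example: bucket historical keyword rows into branded vs.
# non-branded segments and total their visits and bounces.
BRAND_TERMS = ("beanstalk",)  # assumed brand term; substitute your own

def segment_keywords(rows):
    """rows: iterable of (keyword, visits, bounces) tuples."""
    buckets = {"branded": {"visits": 0, "bounces": 0},
               "non-branded": {"visits": 0, "bounces": 0}}
    for keyword, visits, bounces in rows:
        segment = ("branded"
                   if any(term in keyword.lower() for term in BRAND_TERMS)
                   else "non-branded")
        buckets[segment]["visits"] += visits
        buckets[segment]["bounces"] += bounces
    return buckets

sample = [("beanstalk seo", 120, 12),
          ("custom t-shirts canada", 80, 40),
          ("seo services", 200, 90)]
totals = segment_keywords(sample)
print(totals["branded"]["visits"], totals["non-branded"]["visits"])  # 120 280
```

Once traffic is grouped this way, you can compare aggregate behavior (bounce rate, conversions) per bucket rather than per keyword.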

This will only work for a while, however. You'll see new traffic coming in and won't know how its behavior impacts the results. Essentially, this will give you a decent idea until your traffic sources change, your site changes, or time passes. So what do we do …

Long Term

In the long run we have no option but to make massive adjustments to the way we look at our sites. We can no longer determine which keywords perform best and craft the user experience for them. Instead we have to look at our search traffic as one big bucket. Or do we?

While this may be true for some traffic, we can still segment by landing page (which will give you a good idea of the phrases involved) as well as look at groups of pages (all the pages in a single directory, for example). This change comes right as we are redesigning our own website, and in light of it I will be changing the way our directory structure and page-naming system work to allow for better grouping of landing pages by common URL elements. I imagine I won't be the last to consider this factor when building or redeveloping a website.
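As a minimal sketch of that kind of grouping (the URLs and visit counts here are hypothetical), landing pages can be bucketed by their first path segment:

```python
from collections import Counter
from urllib.parse import urlparse

def group_by_section(landing_pages):
    """Tally visits per top-level directory, e.g. /services/seo.html -> 'services'."""
    totals = Counter()
    for url, visits in landing_pages:
        path = urlparse(url).path.strip("/")
        section = path.split("/")[0] if path else "(home)"
        totals[section] += visits
    return totals

pages = [("http://www.example.com/services/seo.html", 150),
         ("http://www.example.com/services/ppc.html", 70),
         ("http://www.example.com/articles/not-provided.html", 40),
         ("http://www.example.com/", 300)]
print(group_by_section(pages))  # services: 220, articles: 40, (home): 300
```

This is exactly why a clean directory structure pays off: the grouping is only as meaningful as the URL conventions behind it.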

What will need to change is our reliance on specific pieces of data. I like to see that phrase A produced result X and work to improve that; now we'll have to look at larger groupings of data. A downside to this (and Google will have to address it, or we as SEOs will) is that it's going to be a lot easier to mask bad practices once specific phrase data is unavailable. In an audit I was part of, for example, we found bot traffic partly based on common phrase elements. Today we wouldn't be able to do this, and the violations would continue.

We’re All Still Learning

Over the next couple of months we'll all be adjusting our reporting practices to accommodate this change. Some innovative techniques will likely be developed to report as accurately as possible which traffic is doing what. I'll be staying on top of it, and we'll keep you posted here on our blog and on our Facebook page.

SEO news blog post by @ 10:48 am

Categories: Analytics, Google

 

 

September 20, 2013

Let’s talk about Spam!

Salt n Pepa

 
Don't get me wrong: email was one of the cornerstones of the internet. Some might even argue that replacing postal mail is what drove the early growth of the internet.

 
So email is a fundamental part of the internet, and yet… just because YOU can do something, like emailing wonderful offers, does that make it right? If everyone sat around all day doing it, would it be sustainable?

So we come to the topic of email spam: its actual cost in terms of the time and effort it takes to dislodge from our inboxes, and what people can do about it.

- Never buy a service that’s spam-vertized.

This is a simple one. You wouldn't donate money to someone who proposes to stand outside your house and scream offers at you through the window, so why would you invest your earnings in a product advertised to you via unsolicited means?

- Identify spam without wasting time.

We're an SEO firm, so if you send across an offer to help the Beanstalk SEO website rank better, I'm pretty sure I can toss your email into the spam bin and forget about it. In fact, anyone who sends you an unsolicited SEO email out of the blue must be pretty desperate, and incapable of ranking their own site well enough to get the traffic they need to stay in business.

I personally keep a list of these domains, mostly to block them from using our contact forms, but also as a reference of companies to avoid when clients need referrals.

Heck even “www.google.com” gets similar offers to improve their ‘conversions’ and ‘organic search results’!

Over on Matt Cutts' blog he talks about a lot of email issues, and he's taken the time to laugh at SEO e-mail spam:

I was on your website www.google.com and wanted to shoot you a quick note. I think I can make a few changes (aesthetically and/or SEO – wise) to make your site convert more visitors into leads and to get it placed higher in the organic search results, for a few of the select terms.

This is NOT like one of those foreign emails you probably get in your inbox every day. Just to be upfront I have 3 agents that work with me for development /SEO.

I would just need to know which (if not both) services you’re open to checking out information about, either web design or SEO. Would you be open to seeing more brief info / quote for what I would like to accomplish?

As Matt Cutts summarized on his blog:

“this person is offering help to convert Google.com visitors into leads.
Or, you know, to improve Google.com’s rankings in organic search results. Sigh.”

 

- Use Opt-In lists that are re-checked regularly.

When you give people a chance to ‘opt-in’ to a mail campaign you win all around…

  • reach people who are interested
  • annoy fewer potential clients
  • avoid getting flagged as a spammer
  • spend less time trying to sell your validity
  • make the online world a better place

Keep in mind that one of the largest (if not the largest) anti-spam providers is Postini, which is now run by Google and used by many services, from Gmail to WordPress.

If you run afoul of Postini, you can expect a VERY LARGE group of recipients, including Gmail users and blog readers, to be filtering out your messages, spam or not.

So even if you have a great opt-in audience now, make sure to re-check that list before it gets stale and potentially starts to annoy folks that were previously interested.

I would NEVER forward spam to friends/associates, but if someone I know is interested in something well-maintained that I’ve opted into, I’ll recommend it to them for sure.

Food for thought.. to go along with that Salt n Pepa!

SEO news blog post by @ 1:38 pm


 

 


August 16, 2013

SEO concerns for Mobile Websites

You want to serve your clients' needs regardless of what device they visit your site with, but how do you do it easily without upsetting your SEO?

Let's look at the various options for tackling mobile sites and what each means in terms of SEO:

Responsive Design:
 
Visual demonstration of responsive web design

  • Responsive design is growing in popularity, especially as communications technology evolves, and bandwidth/memory use is less of a concern.
  • This method also gives us a single URL to work with which helps to keep the sitemap/structure as simple as possible without redirection nightmares.
  • On top of that, Googlebot won’t need to visit multiple URLs to index your content updates.
  • Less to crawl means Googlebot will have a better chance to index more of your pages/get deeper inside your site.
“Why is/was there a concern about mobile page size?”

Low-end mobiles, like a Nokia C6 from four-plus years ago (which was still an offering from major telcos last year), typically require that the total page data be less than 1MB for the phone to handle the memory needs of rendering and displaying the site.

If you go over that memory limit, you risk crashing the browser with an error that the device memory has been exceeded. Re-loading the browser drops the visitor on the device's default home page with all their history lost. I think we can all agree this is not a good experience for potential clients.

Higher-end devices are still victims of their real-world connectivity. Most third-generation devices can hit really nice peak speeds, but are rarely in a physical location where those speeds are sustained for a reasonable length of time.

Therefore, even with the latest gee-whiz handsets, your success rate in delivering your entire page to mobile users will be impacted by the amount of data you require them to fetch.

In a responsive web design scenario, the main HTML content is typically sent along with CSS that caters to the layout and screen limitations of a mobile browser. While this can mean omitting image data and other resources, many sites simply attempt to 'resize' and 'rearrange' the content, leading to very similar bandwidth and memory needs whether the design is responsive or not.

The SEO concern with responsive designs is that, since the written HTML content is included in the mobile styling, it's crucial that search engines and crawlers understand that the mobile-styled content is not cloaking or some other black-hat technique. Google does a great job of detecting this, and we discuss how a bit later on, with some links to Google's own pages on the topic.

Mobile Pages:

Visual demonstration of mobile web page design

 
If you’ve ever visited ‘mobile.site.com’ or something like that, you’ve already seen what mobile versions of a site can look like. Typically these versions skip reformatting the main site content and they get right down to the business of catering to the unique needs of mobile visitors.

Not only can it be a LOT easier to build a mobile version of your site and pages, you can also expect these versions to have more features and be compatible with a wider range of devices.

Tools like jQuery Mobile will have you making pages in a jiffy using modern techniques and HTML5. It's so easy you could even make a demo image purely for the sake of a blog post! ;)

This also frees up your main site design so you can make changes without worrying what impact it has on mobile.

“What about my content?”

Excellent question!

Mobile versions of sites with lots of useful content (AKA great websites) can feel like a major hurdle to tackle, but in most cases there are some awesome solutions for making your content work in a mobile version.

The last thing you'd want to do is block content from mobile visitors, and Google's ranking algorithm updates from June 2013 agree.

Even something as simple as a faulty redirect where your mobile site is serving up:
mobile.site.com/
..when the visitor requested:
www.site.com/articles/how_to_rank.html

.. is a really bad situation, and in Google’s own words:

“If the content doesn’t exist in a smartphone-friendly format, showing the desktop content is better than redirecting to an irrelevant page.”
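A sketch of the fix, with hypothetical hostnames and paths: redirect a smartphone visitor only when an equivalent mobile page actually exists, preserving the requested path, and otherwise serve the desktop page as-is (never dump the visitor on the mobile homepage).

```python
from urllib.parse import urlparse, urlunparse

MOBILE_HOST = "mobile.site.com"  # assumed mobile hostname
# Paths that have a mobile version; in practice this would be a lookup
# against your mobile site's routing, not a hard-coded set.
MOBILE_PATHS = {"/", "/articles/how_to_rank.html"}

def mobile_target(desktop_url):
    """Return the equivalent mobile URL, or None to serve the desktop page."""
    parts = urlparse(desktop_url)
    path = parts.path or "/"
    if path not in MOBILE_PATHS:
        return None  # no mobile equivalent: don't redirect to an irrelevant page
    return urlunparse(parts._replace(netloc=MOBILE_HOST))

print(mobile_target("http://www.site.com/articles/how_to_rank.html"))
# -> http://mobile.site.com/articles/how_to_rank.html
print(mobile_target("http://www.site.com/guides/deep-page.html"))  # -> None
```

The same decision logic would normally live in your web server or framework's redirect rules; the point is the existence check before the redirect.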

 
You might think the solution to 'light content' or 'duplicate content' in mobile versions is to block crawlers from indexing the mobile versions of a page, but you'd be off the mark: you actually want to make sure crawlers know you have mobile versions to evaluate and rank.

In fact if you hop on over to Google Analytics, you will see that Google is tracking how well your site is doing for mobile, desktop, and tablet visitors:
Example of Google Analytics for a site with mobile SEO issues.

(Nearly double the bounce rate for Mobile? Low page counts/duration as well!?)

 
Google Analytics will show you even more details, so if you want to know how well you do on Android vs. BlackBerry, they can tell you.

“How do the crawlers/search engines sort it out?”

A canonical URL is always a good idea, but using a canonical between a mobile page and the desktop version just makes sense.

A canonical can cancel out any fears of showing duplicate content and help the crawlers understand the relationship between your URLs with just one line of markup.

On the flip-side a rel=”alternate” link in the desktop version of the page will help ensure the connection between them is understood completely.

The following is straight from the Google Developers help docs:

On the desktop page, add:

<link rel="alternate" media="only screen and (max-width: 640px)" href="http://m.example.com/page-1" >

and on the mobile page, the required annotation should be:

<link rel="canonical" href="http://www.example.com/page-1" >

This rel=”canonical” tag on the mobile URL pointing to the desktop page is required.

Even with responsive design, Googlebot is pretty smart, and if you aren’t blocking access to resources intended for a mobile browser, Google can/should detect responsive design from the content itself.

Google’s own help pages confirm this and provide the following example of responsive CSS markup:

    @media only screen and (max-width: 640px) {...}

In this example they are showing us a CSS rule that applies when the screen width is 640px or less: a clear sign that the rules would apply to a mobile device vs. a desktop.

Google Webmaster Central takes the information even further, providing tips and examples for implementing responsive design.

Ever wondered how to control what happens when a mobile device rotates and the screen width changes? Click the link above. :)

SEO news blog post by @ 3:51 pm


 

 

July 18, 2013

A Panda Attack

Google today confirmed that a Panda update is rolling out. I find it odd that, after telling webmasters there would no longer be announcements of Panda updates, they made this announcement, and one has to wonder why.

The official message from Google is that this Panda update is softer than previous ones and that new signals have been added. Some webmasters are reporting recoveries from previous updates with this one. I would love to hear feedback from any of our blog readers as to changes you may have noticed in your rankings with this latest update.

I’ll publish a followup post to this one next week after we’ve had a chance to evaluate the update.

SEO news blog post by @ 10:42 am


 

 

May 7, 2013

Google Update: Penguin #4?

Rumor has it that there's a Google update underway. While some noted changes as early as late Saturday/early Sunday, the general experience has it starting on Monday, with many webmasters experiencing significant drops.

Given the significant drops reported by many webmasters, and the movement of only one or two positions we can see among our own clients here at Beanstalk, it seems this may be the next Penguin update, which targets known unethical SEO practices. Admittedly, this is simply an educated guess, and I'm not the first to suppose it. However, when one sees some sites taking massive drops while others, where the strategies are known to be solid, hold steady or even gain, it's a safe assumption that whatever update this is, it either targets spam or devalues bad links.

This couldn’t be better illustrated by one comment on the Webmaster World forums by user Martin Ice Web when he said:

It now seems like Google has the intention to find all the crap in the WWW and unfortunately they get it very good.
I don't know what kind of trust factor they are searching for but the sites i now see are complete without any trust.

It’s far too early to conclude much but I know we’ll be watching it closely here at Beanstalk.  If you’re interested, there’s a discussion on the subject over at Webmaster World here.

SEO news blog post by @ 9:08 am

Categories: Google

 

 

Copyright© 2004-2014
Beanstalk Search Engine Optimization, Inc.
All rights reserved.