

Google Instant Analysis

I’ve just completed my initial analysis of Google Instant and written up an article on how it affects searchers, Google, and SEOs/website owners. I will be continuing to monitor the changes in search and user patterns over the coming months and am very much looking forward to keeping you, our valued visitors, updated on how this technological change affects business and SEOs.

You can read the analysis on our site here. And while you’re reading the article you can listen to SEO Jim Hedger and me having an hour-long discussion on just this topic on Webmaster Radio.

SEO news blog post by @ 7:07 pm on September 12, 2010



Google AdWords

And so the test begins.

I sent over a couple of screenshots to the Google AdWords team showing that a ton of clicks were lasting only 0 seconds and yet I was being charged for them. Here is what I got back:

Please be assured that our system identifies invalid clicks and filters them so you are not charged for those clicks. Therefore, the charges that you have accrued are for legitimate clicks.

Whew. I feel better now. :)

They went on …

Dave, please understand that zero second visits do not indicate invalid click activity. Analytics calculates time spent on one page by looking at two time stamps: one from the request for the first page and one from the request of the second page. If your users have visited only one page then time on page will be zero regardless if he has actually spent time on that page. This is because Analytics does not have a reference point of another page to calculate Time on Page. Therefore, you may be seeing zero second visits even though users may have been on your site for some time. However, I will not be able to recommend any third tracking software for you.

OK – so now I’ve learned something. I’d made the mistake of thinking Google knew the time to exit, perhaps assuming they really are Big Brother and know everything – Analytics is more limited than I first thought. As I’m fundamentally an organic SEO who dabbles in PPC, I went off on this project all on my own. There are definitely phrases that would result in visitors who hit the site, read a review and head off to another site (hopefully via an affiliate link). So stay tuned – I’m going to be placing some redirect pages rather than direct links to see if this affects the stats. If so, my apologies to the Google AdWords team for a few of the words I’ve used over the past 48 hours. :)
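The two-timestamp behaviour described above is easy to reproduce. Here is a minimal sketch (not Analytics’ actual code, just the logic the AdWords rep describes) of why a single-page visit always reports zero seconds:

```python
from datetime import datetime

def time_on_page(pageview_timestamps):
    """Per-page time from consecutive pageview timestamps, the way the
    AdWords rep describes it: the last (or only) page of a visit has no
    following request to measure against, so it reports 0 seconds."""
    times = []
    for i, ts in enumerate(pageview_timestamps):
        if i + 1 < len(pageview_timestamps):
            times.append((pageview_timestamps[i + 1] - ts).total_seconds())
        else:
            times.append(0.0)  # no second timestamp: no reference point
    return times

# A single-page visit reports 0 seconds no matter how long the reader stayed.
print(time_on_page([datetime(2010, 8, 27, 17, 0, 0)]))  # [0.0]
```

A two-page visit would report the gap between the two requests for the first page – and still zero for the second.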

SEO news blog post by @ 5:51 pm on August 27, 2010



SEO for Blogs – A Starter

In a perfect world, your well-written, useful and refreshingly original blog would rank well in search engine results just because it’s good. But that’s not reality. Great content is the foundation of a good blog, but it doesn’t guarantee high rankings. There are some search engine optimization (SEO) things you simply must do to increase the chances that your blog will be found – and read.

Before we get to how you should SEO your blog, remember that your first and primary goal should be to create the most informative blog in your industry. Keep things simple, and make your goal to have first-rate, original and useful content on specific topics with the best, most current advice and information. You could have the best SEO, but if your content is junk, no visitor will ever return. You must fill your blog with great content that is helpful to your demographic. Once you have started down that road, there are a number of things you can do to increase your visibility.

Basic On Page Needs

Much of on-page work revolves around utilizing keywords intelligently – the best keywords, in the right places, done the right way. The first order of business is to do comprehensive keyword research to select strong keywords to begin with.

Good keywords are not just about what searchers type most often into the search box – you need a full understanding of what they are truly looking for. You have to understand their intent and their ultimate goals. Integrate this deeper market understanding into your keyword research.

Then analyze the search engine results pages (SERPs). If there is information out there similar to yours, what keywords find it? Look at your competitors – what keywords are they targeting, and how successful are their efforts? What are they doing right and wrong? When you run searches on your chosen keywords, what results come up and why?

Once you have developed the optimal keyword list, you want to use these keywords in a number of different, interrelated ways on the page. At minimum, they should appear in the:

  • title of the post
  • title tag (if different from the title of the post)
  • heading tags (not just h1, either)
  • internal linking
  • links to the post’s permanent link
  • outbound links
  • URL
  • alt tags
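As a quick illustration of those placements (every title, URL and filename below is invented for the example), a post’s markup might look like:

```html
<!-- Hypothetical post targeting "organic gardening tips" -->
<title>Organic Gardening Tips for Beginners</title>
<!-- ... -->
<h1>Organic Gardening Tips for Beginners</h1>
<h2>Getting Started with Organic Gardening</h2>
<p>New to <a href="/organic-gardening-tips/">organic gardening tips</a>?
   Start here ...</p>
<img src="/images/compost-bin.jpg" alt="compost bin used in organic gardening">
```

The phrase lands in the title tag, the headings, the URL, the internal anchor text and the alt attribute – without being forced.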

Keywords also need to be sprinkled throughout the copy. “Salted” – not too much, not too little, and in the right places. Make sure your main keywords are in the first few sentences. Don’t overdo it by cramming in the same term(s) again and again. Be wise: use related and relevant keywords to break things up and give Google and the other search engines the variety that they expect from normal, organic, real-world communication.

Your SEO will have the best chance for success if it is balanced, realistic and natural. None of your strategies overrules the need for good, useful, original content that people will read and recommend. Natural, intelligent keyword usage in your content will guide the search engines; quality, usable information in the content will encourage people to link to you, tremendously boosting your SEO efforts. Never kill good content by keyword spamming. Readers won’t like it and won’t return, other sites won’t link to it, and both of those will doom your efforts. Respect all readers and make them partners, not targets of clever schemes. Readers are not pawns to move around a chessboard – it is much better to consider them your advertising department.

Link Love

Among the most important factors affecting your SEO results is the number and quality of links that point to your blog. The better your content, the more people will reference it. Google essentially looks at inbound links as “votes” for the value of your blog and its individual posts – if you have a lot of quality links, your content must be good, and thus it should move up the results.

So generally speaking, the more people that link to your blog the better. That being said, higher quality links help more. The more powerful the site linking to you is, the better (i.e., links from the Wall Street Journal are better than links from some random hobby site). The more relevant the site (and page) linking to you is to your topic, the better. Links with good, relevant anchor text – especially when they occur inside the body text of the page that’s linking to you – are better. Get all the above and you’re golden. Or at least the link is.

Just recognizing a good link doesn’t get the job done – you have to have a link strategy. The best strategy by far is putting your energy into creating useful, first-rate content that people will read, use, link to, and recommend to others. You can contact people for links, use link-building programs and services, or even (gasp!) buy some, but the best approach is still to build inbound links the old-fashioned way – by earning them with good content.

To give your linking efforts a jump-start, you can notify other bloggers and site hosts that you’ve written something they or their readers may find useful. Sometimes people that would like your content don’t know you exist. Do NOT ask them for a link – just let them know about the post(s). Don’t spam them; just send a nice note making them aware of content they may like. Make each note custom – people can tell when you have written a template you are sending to many people.

You can still get some mileage out of directory submissions, but confine it to the quality directories like Best of the Web and avoid the overused, “spammy” ones ignored by the search engines. A good rule – if the directory offers free listings, run away.

Link out. Yes, I said it. Although linking out to external sites carries the risk of steering people away, the benefits of being associated with high-level content should outweigh any negatives. Would you really expect others to link to you if you don’t link to anyone else? Are you really a good authority if you don’t reference anyone else’s good information? Link out usefully and shrewdly, and don’t overdo it. As always, moderation works best, and make sure you are linking to solid, reputable sites and content.

Internal Linking Strategy

A comprehensive internal linking strategy is hugely beneficial, yet often underutilized. It helps the bots easily navigate your site, makes it easier for readers to find what they want, and is significant for SEO. No-brainer.

Yes, group your posts into categories to strengthen topical authority and relevancy by clustering related articles together – but that is just the beginning.

If in one post you reference a concept covered in another post, link to that other post – and do it from within the text, using the best anchor text possible. Each post should have a list of related posts. This will help guide the reader to related information, and the related posts’ titles contain keywords (right?), giving you good anchor text there as well.

Don’t use the default links “Next Post” and “Previous Post,” instead substituting the actual titles of the next and previous posts. This again gives you an opportunity to have relevant keywords in anchor text. For the same reason, do not use the defaults “Read more…” or “Continue reading…” to link to a post’s permalink page. Again, use the title of the post (for instance, “SEO for Blogs continued…”).
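A quick sketch of the idea (the post titles and URLs here are hypothetical): the navigation links carry the actual post titles as anchor text instead of the theme defaults.

```html
<!-- Theme default – wasted anchor text: -->
<a href="/2010/08/post-123/">Next Post</a>

<!-- Keyword-bearing alternative (titles and URLs invented for illustration): -->
<a href="/seo-for-blogs-part-two/">SEO for Blogs – Part Two</a>
<a href="/keyword-research-basics/">« Keyword Research Basics</a>
```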

Of course, every page should have links to your homepage and a sitemap, so every page on your site can be reached in just one or two clicks.

Content Considerations

Now, about that “great content,” there are a few ways to make sure you are creating it the right way. A post should only cover one topic. If you cover more than one topic in a post, the search engines don’t know which to rank you for. If your topic is long and involved, break it up into a series with each subtopic getting its own post. Again, don’t keyword spam. Focus on making your point, not on keyword placement. If you make your point eloquently, the keywords will find their way into the right place. Don’t try to stick a keyword in every or every other sentence. If you find yourself having to do that, maybe you are not covering the topic in adequate detail.

Most importantly, update often. This does not mean changing already-published posts, but writing new ones consistently. Publishing content regularly is incredibly undervalued. Search engines love new content, so the more frequently you post, the more frequently they will return. This simultaneously gives you more indexed pages faster, more opportunities for the bots to follow your well-laid-out links, and more chances for the search engines to see what great content you have.

Tying It Together

Now that you know that great content is the genesis for so many good SEO factors, have learned a few ways to get the word out and are starting to think strategically, just what amount is the “right” amount?

If your content is good, then more is better. The more good content you have on a subject, the more the search engines will see you as an authority on that subject. The operative word is “good.” Don’t write useless dreck just to fill space. It won’t strengthen your content foundation – on the contrary, it will dilute its focus and authority. Authority is like respect – it is built up by repeated quality. Publish good, useful content daily, and in time you will have a foundation of authority.

Don’t be in a hurry, and don’t look for shortcuts. Your blog is about quality and balance – conceiving it, writing it, promoting it, optimizing it for search engines and all the rest. You will make mistakes and make progress just as in any other enterprise, so the main thing is to keep learning, keep trying, and keep track of what you’re doing. Once you start seeing certain actions creating certain results, you are on your way to developing your own, customized method for developing and implementing the best search engine optimization for your blog.

On To Part Four >

SEO news blog post by @ 3:24 pm on


Google Update & YaBing!

For those of you who have noticed significant fluctuations in your rankings – you’re not alone. Across the web people have reported significant changes in their rankings. We at Beanstalk were fortunate on this one in that we had ranking reports running for the past few days and got to watch the changes over the course of the report. A happy coincidence. :)

Unfortunately the algorithm shift isn’t particularly favorable to solid site optimization. There is an odd pattern in what we’re seeing: sites whose link building focused on high relevancy and high trustability lost ground, while sites whose link building focused on volume in recent months gained ground. This indicates a shift to volume over quality. For obvious reasons we’re convinced that this shift won’t last.

This shift in quality isn’t just apparent in the sites we’re working on; as we analyze various sites across the web, we’re noticing a larger number of sites with lower-quality backlinks ranking well.

Now – to be sure, we’re always in favor of diversified link building strategies, and that includes strategies that focus more on volume and others that focus on trust and relevancy, but everything we can see indicates that this update puts a disproportionate emphasis on volume. I expect to see the rankings shift again – likely over the weekend.

I should note that this isn’t just something we’re noticing – it has been noticed by a wide array of SEOs. My advice? Don’t react too quickly – corrections are coming and you don’t want to adjust the wrong way.

And in other news …

Also noticeable in the current ranking report we’re running for our clients is the merging of Yahoo! and Bing search results. A couple of days ago Yahoo! announced that their organic results in North America were being fed by Bing. This is, of course, the first set of ranking reports that has reflected the change. This is (in my opinion) very exciting news and you can read more about it on Search Engine Journal here.

And stay tuned – I’ll be posting more as the Google update continues.

SEO news blog post by @ 3:13 am on


Top Authors On Link Building

I just wanted to take a moment to thank all our blog and article subscribers and just our visitors for helping me make the list of most influential writers on link building in a poll over on the Eightfold Logic blog (link removed – resource no longer exists).  Of course, I’ve always tried to educate and hopefully entertain in my works and I’m definitely glad it has been well-received.  So thanks to you all for voting and be sure to stay tuned, if anything this inspires me to write more often on this important topic and many others.

Sharing the honor with me is a great list of writers that I’d highly recommend following as well. They are:

  1. Eric Ward
  2. Wiep
  3. Debra Mastaler
  4. Dave Davies

and a tie for fifth:

  5. Rand Fishkin
  5. Ralph Tegtmeier – Fantomaster / “Fantomeister”

SEO news blog post by @ 5:15 pm on August 23, 2010

Categories:link building


Why Google Needs To Stand Up For Themselves

For the past week the Internet world has been abuzz with the Google/Verizon deal and how it will affect Net Neutrality. For those of you who have heard me speak at conferences or listened to my radio show, you’ll know that I’m not the biggest supporter of Net Neutrality legislation. I tend to take a pretty hard line in a debate (almost always against Jim Hedger), but so does he, and it makes for an entertaining debate, with him referring to me as a closed-minded hater of equality and me accusing him of communist tendencies and of wanting to implement policies and laws that counter the entire spirit of capitalism. It’s a fun debate.

But today Jim and I saw eye to eye. While we may argue about the reasons we agree, we both object to the way that Google is handling the current issue with their Verizon deal, which would give their 1s and 0s a bit of preferential treatment. More on that in just a bit. First, let’s get some basic history on Google’s stand on net neutrality and the arguments of those who oppose it, and go from there. But first –

What Is Net Neutrality?

Net Neutrality is, at its core, the idea that the Internet is a mandatory service and that complete equality is required in the way packets are treated as they flow across it. The idea that the telcos should have the ability to charge more for preferential treatment of certain packets (say … YouTube videos, if Google slipped them a few extra bucks) violates this principle. Well, who can argue with that? Don’t I have the same rights to the Internet as everyone else?

The problem arises in that the telcos need to pay for the infrastructure and access to that network. They argue (and let’s remember – we’re all capitalists here) that they have the right to monetize their services in a way that maximizes profits. The FTC (Federal Trade Commission) has opposed Net Neutrality legislation, noting that there are consumer protection laws in place that provide the protection in productive ways, and that bloating the law books with more jargon isn’t going to make the issue simpler or solve any problems that aren’t being solved with current legislation. This has been witnessed many times – including a decision against Comcast when they tried to restrict access to torrents on their network and were ordered to stop doing so. Basically, Net Neutrality is protected even for a file type that is used primarily for exchanging illegal material (yes, torrents are used for legitimate purposes, but …).

I wrote a lengthy article a couple of years ago that explains the basics well (and those haven’t changed). So what has?

The Players

Initially there were two camps, those who opposed net neutrality and those who supported it.  The line was drawn basically based on profit like so:

Against Legislation – the “greedy” Telcos who just want to make a buck.
For Legislation – a bunch of people who stand to profit from it, such as Google, Microsoft and others, who claim that this will hinder innovation and growth in the technology industry. To hear them tell it, it has nothing to do with the fact that it would cost them more.

In 2007 Google was on record as saying:

“The nation’s spectrum airwaves are not the birthright of any one company. They are a unique and valuable public resource that belong to all Americans. The FCC’s auction rules are designed to allow U.S. consumers — for the first time — to use their handsets with any network they desire, and download and use the lawful software applications of their choice.”

At the time they were bashing Verizon for taking a stand against the decision by the FCC (Federal Communications Commission) “that would require the eventual winner of the spectrum to offer open devices and applications,” with Verizon claiming such a decision was “arbitrary and capricious, unsupported by substantial evidence and otherwise contrary to law.” You can read more about this on Google’s Policy Blog here.

So Here We Are 3 Years Later …

So here we stand 3 years later, and Google and Verizon are in bed together working out a deal to prioritize some traffic over the rest – basically pulling a reference from George Orwell’s Animal Farm that “some animals are more equal than others.” They use the example of medical applications but left the door open to gaming, 3D, entertainment, and more. I’m sure none of us would have a problem with a heart monitor connected to a doctor’s office over the Internet getting priority over an MSN chat, but we all know that’s not where this is going, or it wouldn’t even be a debate.

Now on the table is the idea that mobile devices should be included in the list of exempt platforms and services. Alrighty – now we’re getting warmed up. So they’re OK with the standard old Internet getting Net Neutrality imposed (except for special applications and services as yet to be defined, of course) … but mobile, the up-and-comer and rapidly growing area of bandwidth consumption and connectivity – that area should be excluded from the legislation? Here’s where you lost me – not because I think it’s wrong to give preferential treatment, but because I don’t like when people are trying to be sly.

Here’s the thing … “not all animals are equal.” I can’t tell Google that all they can charge for a PPC click is $0.40 just to make sure that everyone can afford it. It’s just not that kind of a world (and I would argue further that it shouldn’t be).

What They Should Have Done …

Verizon has done exactly what they should have.  The way the message was delivered puts any backlash squarely on Google.  I have no advice for them, masterfully executed.

Google should have come forward and said:

“The world has changed in 3 years and we have a lot of great ideas about the direction of mobile that are going to require that Net Neutrality legislation doesn’t apply. We need to be able to pay more for preferential bandwidth to ensure that we can provide you with the services we know you’ll love at a price you’ll enjoy even more. We want to pay extra so you don’t have to.”

We would have called them on going against their earlier policies, but really – there would have been far fewer rumors and much less conjecture about what was going on. They should have stood up for their actions, admitted they were contrary to their former statements, and basically outlined what we all know: the Internet world moves fast and the rules have changed.

Sometimes it’s refreshing to just hear a spade called a spade. I don’t believe that Google has any huge secret plans to bring down the Internet – I think they just want to be more equal. At the end of the day I don’t even disagree with their right to be more equal – they just should have come out and said so. They should have stood up for themselves.

And Now For Some Fun…

And now that you’ve made it to the end of a post on Net Neutrality, here’s a video done by “Ask A Ninja” on the topic:

SEO news blog post by @ 10:47 pm on August 12, 2010

Categories:Google,Net Neutrality


How HTML 5 Will Change SEO Forever

Conceived by the Web Hypertext Application Technology Working Group (WHATWG), HTML 5 has been the basis of a W3C working group since 2007. The first working draft of the new HTML 5 specification was released in January 2008.

On the surface, HTML 5 – other than the exciting <canvas> element – does not appear to be much different from its predecessor, HTML 4. It remains a markup language and is not making any moves towards being a scripting language like PHP or other complex programming languages. It looks like the new standard will mainly introduce more effective tags for organizing the content of a webpage to make it more readable by search engine spiders. The main design goal of HTML 5 was to keep it accessible to the masses and to have it continue being backwards compatible … which means you will not have to re-learn the whole language.

Most HTML 4 content is currently wrapped in <div> or <span> tags regardless of what it is. New tags introduced by HTML 5 have a more semantic meaning. Tags like <article>, <nav>, <footer>, <header> and <aside> (which can be used to indicate a piece of content removed slightly from the rest of the page in terms of relevance) will be increasingly important for SEO efforts. The new <audio>, <video> and <dialog> tags will also be part of the upcoming HTML 5 standard and will allow for further segregation of page content into relevant categories.

The biggest change with the new standard will be the concept of Page Segmentation. Google already has a patent for this and many believe that the practice is already in use today. Currently, there is no way for a website developer to tell the bots how to segment pages correctly. By dividing pages into separate sections, a cleaner, more organized structure is created, allowing bots to parse your pages for content more efficiently. This also means that bots can analyze the segments individually and not waste time trying to divine content from navigation, scripts, CSS and other inline elements. This will drastically increase the understanding of the relevancy of the page and will allow bots to rank multi-topic pages more accurately.

Here are some of most important new HTML 5 tags and how they will relate to SEO:

The new article tag is probably one of the best additions to HTML 5 from an SEO perspective. This new tag will allow SEO’s to mark separate entries in online publications. It will clean up the code by reducing the need for excessive <div> tags. Search engines will probably place more importance on the content wrapped in the <article> tag compared to content on the other parts of the page.

The new section tag will be used to further organize the structure of the HTML document. By using the new <section> tag to identify separate sections on a page/chapter/book and maintaining a consistent hierarchical structure, each section can have its separate HTML heading. As with the <article> tag, it can be assumed that search engines will place more attention on the contents of identified sections. If the words of a search string are found in one section for instance, this implies higher relevance, as compared to when these words are found all across the page or in separate sections.

Not to be confused with the <head> element, the <header> tag is similar to the <h1> tag, the key difference being that it can contain <h1> elements, text content, hard-coded links (bonus!) and anything else you like. This one will be huge for SEOs!

While maybe not as important as the new <header> tag, the new <footer> tag will also allow for lots of “extra” SEO content. The real bonus is that both the <header> and <footer> tags can be used repeatedly in each <section> of the page. This gives a lot of flexibility to SEOs!

The new <nav> tag allows for the definition of site navigation or a series of internal or external links. This is another instance of HTML5 trying to organize page content in order to increase the effectiveness and efficiency of the bots that parse your site for content.
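Putting the tags above together, a segmented HTML 5 page might be laid out like this (the content and structure below are a sketch for illustration, not a recommendation from the spec itself):

```html
<body>
  <header>
    <h1>SEO News Blog</h1>
  </header>
  <nav>
    <a href="/blog/">Blog</a>
    <a href="/articles/">Articles</a>
  </nav>
  <section>
    <article>
      <header><h2>How HTML 5 Will Change SEO Forever</h2></header>
      <p>The main post content – the part the bots should weight most heavily.</p>
      <aside>A side note, slightly removed from the rest of the page in relevance.</aside>
      <footer>Categories: HTML 5, SEO</footer>
    </article>
  </section>
  <footer>
    <a href="/">Home</a> | <a href="/sitemap/">Sitemap</a>
  </footer>
</body>
```

Each block tells the bot exactly what it is looking at: navigation, a self-contained article, a tangent, and site-wide boilerplate.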

Like all W3C implementations, it will take some time for the standard to be completely ratified and for people to begin implementing the new tags into their website designs. Once enough web pages are using the new HTML 5 standard, search engines will inevitably begin to use it to improve search results in the SERPs. Links and content within certain tags will be treated differently from those using redundant or archaic tags, making the new HTML markup far more important to SEO efforts than it is currently.

Unlike other less popular HTML recommendations for past standardizations, I think this one is long overdue and will be embraced by SEOs and SEMs alike. Embrace the change and start building your sites with an eye on the not too distant future. Fortune favours the prepared!

SEO news blog post by @ 5:10 pm on August 10, 2010


“I’m Insecure”…or…”The Top 500 Worst Passwords of All Time”

We are all guilty of it at one time or another: creating an insecure password. There is a myriad of excuses that we make to justify our password infractions (can’t think of one, can’t remember it if it’s too complicated … etc.). With the ever-present threats from hackers and information piracy, we all need to do what we can to protect ourselves. Besides … creating a strong password just makes sense, doesn’t it?

Much to my chagrin, my own Gmail account was recently hacked. I am not a novice to password security or to the need to protect sensitive information, but this really made me sit up, take notice and re-evaluate my username/password usage very seriously.

I think there is an assumption that people just automatically know what constitutes a strong password. But for those of us who need a refresher, here we go:

Tips on Creating a Secure Password
• Make sure it is alpha-numeric (letters and numbers)
• Mix up uppercase and lowercase
• Do not use real words (words found in a dictionary)
• Do not use personal information (names, birthdates, license plates)
• Use a passphrase. (Take a sentence or line from a song, make it into an acronym, and substitute special characters for letters – like $ for “S” and ! for “1”. This makes it a lot easier to remember an otherwise abstract string that doesn’t mean anything on its own)
• Use different usernames and passwords for different accounts
• Change or rotate your passwords frequently
• Do not share your information with anyone
• Do not write down your usernames or passwords anywhere! Ever! (As a former computer tech, you wouldn’t believe how many times I went to an office and saw usernames/passwords conveniently displayed on monitors on bright yellow post-it notes!)
• MOST IMPORTANT! Make sure you are not using a username or password on the Top 500 Worst Passwords of All Time list.
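The passphrase tip above is mechanical enough to sketch in a few lines. The substitution map here is purely an illustration – pick your own, or the trick loses its value:

```python
def passphrase_to_password(phrase, subs=None):
    """Turn a memorable sentence into an acronym-style password.
    Takes the first character of each word, then applies character
    substitutions (the map below is only an example)."""
    subs = subs or {"s": "$", "i": "!", "a": "@", "o": "0"}
    initials = [word[0] for word in phrase.split()]
    return "".join(subs.get(ch.lower(), ch) for ch in initials)

# A line you will never forget becomes a string nobody will guess:
print(passphrase_to_password("May The Force Be With You Always 1977"))  # MTFBWY@1
```

The result is alpha-numeric, mixed-case, contains special characters, and is no dictionary word – yet you can rebuild it from memory any time.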

Some other common usernames and passwords to avoid:
ncc1701 – The ship number for the Starship Enterprise (and adding A, B, C, D or E does not suddenly make it more secure!)
thx1138 – The name of George Lucas’s first movie, a 1971 remake of an earlier student project
qazwsx – Follows a simple pattern when typed on a typical keyboard
qwerty – Another standard keyboard pattern
666666 – Six sixes
7777777 – Seven sevens
ou812 – The title of a 1988 Van Halen album
90210 – Some lame show from the 90’s ;-)
8675309 – The number mentioned in the 1982 Tommy Tutone song. This song supposedly caused an epidemic of people dialing “867-5309” and asking for “Jenny” (in my own defense … I just kept getting asked for the area code by the operator …)

With all that in mind, protect yourself by getting into the practice of creating strong passwords at every opportunity. Be confident and stop being insecure today!

SEO news blog post by @ 4:34 pm on August 5, 2010


Google Fonts

Have you ever wanted to use a font on your website and weren’t able to simply because it wasn’t a web-safe font?  Perhaps you wanted a beautiful scrolling heading but knew that doing so would require creating an image heading and really – that’s just not good SEO is it?

Last week the solution to this issue was brought to my attention by Jacob Gube over on the Mashable site in his article on the implementation of Google’s new Font API. Basically, this is a standardized mechanism for pulling external font definitions into IE, Firefox, Safari, etc., allowing designers and website owners to finally use the fonts they feel would best work with their designs.

I’m not going to bother outlining how it works in detail – Jacob does a great job, so head on over to the article on the Mashable site.
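That said, the basic shape is just a stylesheet link plus an ordinary CSS rule. A minimal sketch, using “Tangerine” (one of the families available at launch) – treat the exact URL parameters as assumptions and check Google’s own documentation:

```html
<link rel="stylesheet" type="text/css"
      href="http://fonts.googleapis.com/css?family=Tangerine">
<style>
  h1.fancy { font-family: 'Tangerine', serif; }
</style>
<h1 class="fancy">A heading in a non-web-safe font – still crawlable text</h1>
```

The heading stays real, indexable text rather than an image – which is exactly the SEO win.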

SEO news blog post by @ 9:28 pm on August 4, 2010



Let The Chaos Begin

As of late yesterday afternoon I noticed a few minor hiccups in the Google SERPs.  This morning those hiccups escalated into multi-page jumps, old versions of pages re-entering the index, pages being dropped from the index and different results appearing with a click of the refresh button.  It is far too early to even try to predict what type of update is underway or what it means but hang on to your hats as it looks like a fairly bumpy ride.

And note – if you see your site drop or jump up in the results – don’t count on that staying as we’re seeing bouncing in both directions and my prediction (the only one I’ll make at this early stage) is that what we’re seeing in both instances is not what we’ll see at the end of the day.

Good luck! :)

SEO news blog post by @ 4:49 pm on



Copyright© 2004-2014
Beanstalk Search Engine Optimization, Inc.
All rights reserved.