

Google Update

As expected, there are fluctuations occurring in Google’s results, with different rankings coming in depending on the datacenter hit. It appears that they are testing out some additions to the algorithm based on Caffeine results. Things should settle down by Monday or Tuesday if all goes as it generally has historically.

Hang on to your hats and hope the update treats you well. :)

For those of you not yet aware – you can view your results on Google’s Caffeine server at (Link removed – no longer available).

SEO news blog post by @ 3:56 pm on September 19, 2009



Toll Free Number Back Up

For those of you who regularly visit our blog or are a client of ours – you’ll be aware that for a couple of days our toll free number was not working. It was apparently going to take up to a week to get back online, but mentioning that it would actually be faster to switch to a new provider for telephone services got the matter dealt with in just a few hours.

We whole-heartedly apologize for any inconvenience this issue may have caused and we thank you for your patience.

In lighter news …

This day be International Talk Like A Pirate Day. Grab ye a pint of mead and a rowdy mate and get ta celebratin’ the day. For those among ye who be not sure how this be done … put yer eyes ta this …


SEO news blog post by @ 3:39 pm on



Gotta Love Telus (phone provider)

This is a special notice to our clients and visitors that our toll free number is down. Rather than calling 877-370-9750 you’ll have to call 250-370-9750 temporarily.

We apologize for any inconvenience this may cause.

SEO news blog post by @ 12:55 pm on September 17, 2009



BOTW Discount Code

Best Of The Web (my personal favorite paid directory) is offering a 20% discount on submissions to their directory and other services for the month of September.

Since DMOZ has become all but useless, Best of the Web (the oldest running directory) has become the most valuable and thorough directory on the web. For the price of a listing (less than $240 for a lifetime listing with this discount) it’s higher on our list than even the Yahoo! directory, which is, of course, a solid directory as well, though a bit overpriced for many people’s needs and budgets.

To submit to BOTW and get the discount you simply need to find the correct category and use the discount code SINCE94.

You can view their directory and/or submit your site at

SEO news blog post by @ 4:41 pm on September 1, 2009



Website Related Hackers and Malware Getting Smarter

Any legitimate website owner’s worst nightmare is to have their website hacked or used as a platform for serving malware (spyware, trojans, keyloggers, packet sniffers, etc.). Luckily, not only do hacking methods evolve but so do protections and safeguards, such as StopBadware and Google’s website warning integrated into the result set (the actual message displayed under the result is “This site may harm your computer”). But every so often hackers get a little more unique in their tactics.

Today when visiting the XXCOPY website (XXCOPY is a utility similar to Microsoft’s original XCOPY that extends its functionality with over 200 functions!) I ran into one of these issues. If you go directly to XXCOPY’s website there is no issue; however, if you Google the phrase XXCOPY and then click on the result, you may or may not get a “Reported Attack Site!” message in Firefox (Firefox has the best anti-malware detection scripts).

After discovering this issue I called one of the reps at XXCOPY, who proceeded to tell me that the issue was purely on my computer (talk about a slap in the face to a hardcore techie), and that he couldn’t replicate the issue so it must not exist. Digging further into the issue, I soon realized that I was being redirected intermittently over to kb971657 (dot )info (most likely originally set up so that people Googling this particular Microsoft Knowledge Base article would land on their website), but not every time. In fact, it took me 10 tries at one point to replicate the issue (clicking on the XXCopy SERP result, then clicking back and clicking it again).

By adding this seeming randomness to the malware redirection, as well as detection of the referring page (Google in my case), the attackers made it harder for the company to detect, as going directly to the site worked every time. My assumption is that this malware is using some form of detection and cloaking. Unlike blackhat cloaking, which shows different content to the search engine, this hides its behavior and only triggers it when certain conditions are met (i.e. the visitor comes from Google or some other referrer, and then a random number check meets a secondary condition). Hopefully XXCopy gets this issue sorted out.
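As a rough illustration only (this is not the actual malware code; the Google referrer check and the 10% trigger rate are assumptions based on the intermittent behavior observed above), the server-side logic might look something like this:

```python
import random

def should_redirect(referrer: str, trigger_rate: float = 0.1) -> bool:
    """Sketch of referrer-based, randomized cloaking.

    The redirect is considered only for visitors arriving from a
    search engine, and even then it fires only a fraction of the
    time, making the behavior intermittent and hard to reproduce.
    """
    if "google." not in referrer.lower():
        # Direct visits (including the site owner checking their
        # own pages) never see the redirect.
        return False
    # Secondary random condition: fire only some of the time.
    return random.random() < trigger_rate
```

A two-condition structure like this would explain why a direct visit to the site always looked clean while the SERP click-through only occasionally triggered the warning.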

SEO news blog post by @ 3:23 pm on August 20, 2009



Google Caffeine Update, Paid Links, and Rankings

Some of you out there may have heard about Google’s major algorithm update called Caffeine, which can currently be tested here (link removed – resource no longer exists). They have already revised it a few times, so don’t be alarmed if it is down for a few hours at some point with a maintenance message. This algorithm update is mainly related to how Google indexes websites, but as we have seen here at Beanstalk it has given even better results for most of our clients. For instance, we have a client on page 3 for the phrase Shade Sails (still mid-promotion), but in the sandbox he is in the top 10. Another client on the 7th page for California Health Insurance is on the 1st page in the sandbox. And lastly, one of the largest campaigns we are running, for the phrase Web Hosting, shows only a variance of 1 position (lower on Caffeine).

From what I can see, not only has the indexing changed but so has the value of higher numbers of links. We are being outranked by a company called Rankpay for the phrase SEO Service almost purely through paid links (Sample Paid Link) and links from their clients’ websites (Rankpay uses the anchor text SEO Service and SEO Services), whereas we rank #1 on the current Google search algorithm. Reviewing the major phrases we rank for (SEO Service, SEO Services, and SEO Consulting), there is no more than a one-position difference in our rankings. Needless to say, in the long run paid links will get you penalized from the rankings, so we aren’t going to change our tactics.

Below is a video produced by Michael McDonald from WebProNews and Matt Cutts (one of the top Google engineers) discussing the Caffeine update and its effects on rankings.


SEO news blog post by @ 1:05 pm on August 18, 2009



More SEO Article Copywriters Needed

As Beanstalk continues to grow we are constantly in need of talented SEO article and content writers. What we are looking for are talented writers who can produce articles of between 600 and 1,000 words. We are currently mainly looking for tech, law, financial and health article writers. Please send resumes to, as well as some samples. Cheers.

SEO news blog post by @ 12:11 pm on July 22, 2009



The Google Shuffle? Has your website recently sunk to “Davy Google Jones Jr’s Locker”?

Webmasters and SEO gurus alike have been scratching their heads for a few weeks now trying to figure out what has been happening to Google’s SERP rankings. After scouring blogs and forums for the last few days, it would seem that there is no real consensus. In fact, it seems that no one is willing to even speculate much as to what is happening. To date there has not been any official word from Google. We all know that Google does not announce their algorithm updates, much to the chagrin of webmasters everywhere.

The buzz recently on several blogs, and our own data, demonstrates significant changes in PageRank and wild fluctuations in websites’ SERPs. The last big news we did hear from Google was the June 16th, 2009 announcement on Matt Cutts’ blog about PageRank sculpting, where he discussed changes to how Google treats link juice when there are nofollow links. That’s another blog topic altogether, but it may be that the nofollow attribute has been rendered useless for sculpting PageRank. But then, PR sculpting was never really the intended function behind nofollow; it was merely a convenient side effect.

All that Google employee John Mu cared to say, when answering a customer’s inquiry as to why his site had suddenly dropped in PR with no apparent cause, was:

“Hi Radoslav

You have a nice-looking site :). As far as I can tell, it looks like the change in Toolbar PageRank for your site is only due to some technical quirk and not something that you need to worry about.”



Barry Schwartz (AKA “Rustybrick”) then pointedly asks:

“John, is the PR ‘Technical Quirk’ somewhat widespread?”

There was no further reply from Google. The post is available here:

Unfortunately, when a person’s website goes south in the rankings for no apparent reason, people do notice and do worry about it. So unless Google opens up a bit we are left scratching our heads as usual, trying to figure out what is going on.

The following thread gives another vote to the possibility that Google is replacing PageRank value with site trust and/or domain authority. This is also one of many threads where users are expressing frustration and beginning to consider trying the new alternative to Google, Bing. Watch your back, Google.

There have been some major experiments this year from Google that were relatively short lived, and those are fine. We all expect to see the occasional wild results for a weekend every few months, along with quarterly PageRank updates. The June PR update was enough of a surprise, coming so close on the heels of an update late in May. The update itself is not too shocking. What is interesting is that this is happening so soon after Google’s last update, and the fact that garbage results and rapid ranking changes have been coming steadily for weeks now. It’s about time Google lets things settle down before more people get the bright idea to give Bing a try.

Here are some direct comments from the forum members at

“It has been my observation “followgreg” (a username) when the SERP’s get like what you describe above this is what [Google] wants to happen so the Review team and Matt’s team can put the necessary data in place that will deal with what your describing. It is easier to review a site when they are on page 1 verses page 200 and [Google] knows what filters were relaxed that would allow for the “New” 1st page ranking to pop up. I myself don’t see the polluted SERP’s as your describing but then again I am not in every sector and can only look at the nitches I am working under.”

“and right now it looks like all sets of the results include some trivial and penalized and junk .edu pages rising into the top 50, along with some long-neglected good ones. This used to happen all the time with updates — shuffle things up, the poop rises, then it gets flushed, and things settle down. we haven’t had an update in that format in a long time, but it seems clear we are in the middle of whatever is changing and not the end.”

We can analyze the SERP’s, collect all the data we can find, and listen to all of the “buzz” we like, but at the end of the day we are still at the mercy of the “Big G”. It is not unusual for Google to conduct their more aggressive algorithm changes at this time of year but it is unusual to see so much experimentation so close together taking so long. With there being no official word coming from Google it’s hard to do more than speculate on the changes that we can observe. We all certainly hope that things stabilize soon and we’ll continue monitoring changes in the rankings.

But until Google decides to straighten things out can anyone say “Pay-per-click”? I knew you could…

So how does the widely varied public opinion on the matter line up with search results?

I am willing to make an educated guess that Google is experimenting with website trust and authority in their algorithm (and perhaps plenty more); however, as complaints from the forums echo, Google’s search results have seemed rather bipolar these last few weeks.

We have well established sites being outranked by new sites, by sites with very few backlinks, and by sites using black hat techniques, and unfortunately we see some established and often very trustworthy white hat websites simply disappearing from the rankings altogether. At the same time we have literally day-old Craigslist posts ranking in the top results. Some .edu and .gov sites have flown to the top while others have plummeted.

How often do you see day-old pages rank near the top for competitive search terms? If “trust” has that much of an effect on a new page’s rankings, it’s likely that “trusted” sites will dominate the rankings, with every new page of content flooding out the competition and reducing its ability to gain trust. I hope the minds at Google have their sober thinking caps on and not their beer hats. But so far there seems to be little consistent rhyme or reason, since we have some trusted sites disappearing and others dominating in the SERPs.

Luckily we had some old SERP analysis notes from June, where we had a close look at one of our clients’ top 5 competitors for their targeted search term on Google. We decided to compare each against the current search results since Google’s latest “technical quirk”. Here’s the rundown according to Yahoo’s API and our analysis:

Former #1 website – PR 4 landing page, PR 5 root domain.

1700+ external inbound links, 800+ internal backlinks.

Almost one thousand of these backlinks are from a handful of what appear to be partner sites. A significant amount are from various blogs.

Strong root domain with almost 5k external inbound links.

Now ranking at #2

Former #2 website – PR 6 landing page, PR 7 root domain.

Less than 100 external inbound links, over 15k internal backlinks.

Root domain has 140k+ external inbound links and 16k+ internal backlinks.

Very strong root domain and what should be a high trust name. Much of the pages ranking comes from the internal backlinks from the root domain and other pages on the site.

Now ranking at #5

Former #3 website – PR 4 landing page, PR 7 root domain.

5k+ external inbound links, less than 100 internal backlinks.

Root domain has 130k+ external inbound links and 16k+ internal backlinks.

Not only is this an extremely strong domain, its brand is a household name across North America. Not only would I trust this site based on its name and reputation, but I would say the incoming links are as organic as they come.

Strangely this website no longer ranks anywhere in the top 300 results.

Former #4 website – PR 4 root domain

1k+ external inbound links, 500+ internal backlinks.

Most external links are from articles, blogs, and directories.

Now ranking at #6

Former #5 website – PR 4 root domain

6k+ external inbound links, 400+ internal backlinks.

Many backlinks are from PR7 and PR8 blogs, hundreds from one PR5 blog in particular. The website is referenced and backlinked on some government websites as well.

No longer ranks anywhere in the top 300 results.

New #1 website – PR 6 landing page, PR 9 root domain, .gov site

700+ external inbound links, only several internal backlinks.

Root domain has 430k+ external inbound links and almost 630k internal backlinks.

New #3 website – PR 4 landing page, PR 5 root domain

Less than 100 external inbound links, 40 internal backlinks.

Root domain has less than 300 external inbound links and less than 150 internal backlinks

Despite the small number of links this site has come from nowhere. While it is a widely known brand name and should have some trust attached to that, it is strange to see it taking the place of an even larger household name which had approximately 1300 times more external inbound links.

New #4 website – PR 5 landing page, PR 8 root domain.

Less than 200 external inbound links, 200+ internal backlinks.

Root domain 3.7+ million external inbound links, 3k+ internal backlinks

It’s a wiki page and is therefore, most likely, a highly trusted authority according to Google. I believe it was ranking at #10 in our previous analysis.

The results show a polarized contradiction of trusted sites being brought to the top and others being shot to the bottom while sites with minimal links and reputation seem to be beating out well established competitors for their rankings. Black hat sites are seeing the same polarized change as the trusted sites with some jumping to the top and others being sent to Google’s version of Davy Jones locker.

And on that note I have to ask the same question I asked during Pirates of the Caribbean III: At World’s End … “When will this end?” And when will our plunder be kindly returned from “Davy Google Jones Jr’s Locker”?

SEO news blog post by @ 11:01 am on July 19, 2009

Categories:SEO Articles


Google Update

It appears that things are settling down over at Google. I can’t get into all the details right now as I haven’t had a ton of time to analyze everything, due in no small part to the fact that I’m on vacation in beautiful Whistler, BC (big thanks to Bryan from Whistler Retreats – as always, the Whistler accommodations are awesome, my friend). :)

Be sure to check back early next week after we’ve had some time to analyze the events and also monitor to see if there are any aftershocks. :)

SEO news blog post by @ 11:51 pm on July 16, 2009



How To Write For Search Engines

SEO (Search Engine Optimization) writing, as a distinct style, was born in the Internet era and has matured before our very eyes in a relatively short span of time. Although it is evolving and maturing still, and will continuously do so, we can define some of the tried and tested steps of content optimization to help unique pages place at or near the top of search engine rankings.

Some experts go on to say that the goal of SEO is two-fold, with the first objective to put out the appropriate “bait” for search engine spiders and the second to serve up useful information to people who want and need it. Debates about priorities continue among SEO professionals, but it is never a good idea to devalue the human factors in any success formula. The singular goal, then, would be to develop, position and refine content in such a way as to satisfy all visitors to the page and/or site, both human and bot alike.

Rethinking search engine content terms

“Content is king,” goes the old saying – and not only is good content king, it is becoming more important with every passing day. But the term content is best taken in its broadest sense. Content is not simply the written copy placed in a document, assembled on a page, or aggregated at a site. It includes all this, of course, but content actually comprises titles, headings, tags, intra-site links and external links, as well.

All of these components need to work together and form an interconnected whole so that both search engines and humans find the right things, come to the right conclusions and, most importantly, make the right decisions. Good writing is always targeted to the audience, and you are writing for an audience of two readers, human and software. Remember these two components of the audience and find creative ways to reach both of them at the same time.

First things first

Titles are critically important – they are usually the first thing read by both real and virtual visitors. A title is the “primary topical identifier” and, as such, has an invaluable function – again, a dual-purpose one. It must contain keyword targets at the individual word level while stoking interest in potential readers at the phrase level.

When a person performs a search, the title is both their first indication of your relevance to their needs and your first opportunity to compel them to click through. Search engines, more clinical and objective, give the title importance because they see it as an indicator of the page’s main idea.

Yet many pages on the Internet have no title at all, or share “Home” and “Untitled” with several million others. There is no excuse for this oversight. The ignorant cousin of these mistakes, making the company name by itself the title of every page, is just as bad. Keywords relevant to the page should be part of every page’s title.

Heading tags carry some importance too. Simply put, heading tags define the headings and subheadings of your article to both readers and spiders. By default they appear larger than normal text and are bolded. While not a magic ranking bullet, they are looked at with more importance than average text and are an opportunity to show spiders the themes of your content and what keywords you wish to rank for.

The H1 tag is the main heading of your article and carries the most importance, like a headline in a newspaper article. It should clearly convey the article’s topic to the reader and main keywords to the search engines. H2 tags are one level down in importance and structure. Use them to define subtopics under your main topic, and again use keywords where descriptive and useful. If you needed to break down your article to sub-sub-headings, you would use the H3 tags, and so forth.
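To tie the title and heading advice together, here is a minimal sketch of how such a page might be structured (the topic, keywords, and page names are invented purely for illustration):

```html
<!-- Title: keyword-bearing and unique to this page -->
<title>Blue Widget Buying Guide | Acme Widgets</title>

<!-- H1: the main heading, one per page, stating the page's topic -->
<h1>How to Choose a Blue Widget</h1>

<!-- H2: subtopics under the main topic -->
<h2>Blue Widget Sizes</h2>
<h2>Blue Widget Pricing</h2>

<!-- H3: a sub-subheading, only where the article needs one -->
<h3>Bulk Pricing Discounts</h3>
```

Both the human reader and the spider get the same message from this structure: one topic, broken into clearly labeled, keyword-relevant pieces.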

For both human and robotic readers, it is vital to keep page content focused. The “one topic per page” rule is an unwritten one, certainly, and it’s followed by most professional content developers. This has less to do with the intelligence of the readers (either kind) than it does with several other considerations. For one thing, search engine “crawlers” have algorithms that tend to work best on one concept at a time, and most humans work best this way, too.

In addition, limiting the focus eases the task of placing keywords in the meta descriptions, page title, body copy, tags and links. Finally, dealing with more than one topic necessarily means using more verbiage, which dilutes the potency of a site-wide SEO program and may negatively impact ranking. Better to give these other topics their own content, strengthening your site’s overall informational authority.

SEO copywriting balance

Much ink has been spilled and many pixels propagated in discussing SEO techniques, analyzing strategies, teaching “web content” writing, and chasing changing algorithms. Mentioned less but encompassing everything is that SEO copywriting, like all SEO, is about balance.

While articles such as this one can be helpful, it is important to understand that SEO will always evolve, change, adapt and improve. Study and implement tested techniques, but remain flexible and nimble. Writing for search engines and people at the same time is tricky and challenging at best, and can be frustrating and time-consuming, too. Approach the challenges in a businesslike fashion.

SEO content writing at its best balances art with science, blending the craft of engaging the reader with the dispassionate analysis of keywords on a page. Follow best practices, but fill each article to the brim with information useful to your demographic.

In simultaneously targeting a subject, an audience, and an algorithm, a great deal of creativity must take place to get effective SEO results. And, of course, it all has to happen in an environment that encourages short attention spans and constantly tries to lure people elsewhere. It is a major challenge to craft article titles and copy so compelling as to make people stop and read – or, better yet, stop and then click where you want them to.

Basics, opportunities, and consistency

The basic approach to writing for such a dynamic, ever-changing environment is to get to the point quickly. The “USA Today” news style – which relies on short headlines, descriptive sub-headlines and a few concise paragraphs – is perhaps the best analogy for good SEO writing. The important points (keywords) should appear early and often, and within a short period of time the human readers should know what they are supposed to do, while the search engines should be able to tell what the page is about from a consistency between your page structure and your body copy.

In the eyes of the search engines, everything that it can possibly see counts. That is, using image alt-text not only helps blind readers and people using phone- or text-based browsers, it also gives you another opportunity to add more descriptive strength to the overall page for the search engines. Do not miss any opportunity to further empower and refine your content.
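For instance, a minimal sketch of the alt-text opportunity described above (the filename and wording are invented for illustration):

```html
<!-- The alt text serves blind readers and text-only browsers,
     and adds one more descriptive signal for the page -->
<img src="blue-widget-frame.jpg"
     alt="Large blue widget mounted on a steel frame">
```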

And always remember when writing for search engines – keep writing. Write, write, write. Search engine bots gorge on new information, and if you consistently update your site with fresh content they will come around more often. This gives you more opportunities to display your value and, more importantly, builds out the foundation of information that demonstrates it.

There’s a lot to do, and it all needs to be done well. Use your numbers, metrics and analytics to point you in the right direction for creating more content. That’s some science. Your creativity and amount of useful information, on the other hand, will point site visitors and search engines in the right direction. That’s a touch of art. When both aspects of your SEO program are firing on all cylinders, you should soon be marching up the search engine rankings.

Next Week

Next week we’ll be releasing Part Three of the series – Writing For Conversions.

SEO news blog post by @ 3:20 pm on July 15, 2009


Copyright© 2004-2014
Beanstalk Search Engine Optimization, Inc.
All rights reserved.