

SEO concerns for Mobile Websites

You want to serve your clients' needs regardless of what device they use to visit your site, but how do you do that easily without hurting your SEO?

Let's look at the various options for tackling mobile sites and what each means in terms of SEO:

Responsive Design:
Visual demonstration of responsive web design

  • Responsive design is growing in popularity, especially as communications technology evolves, and bandwidth/memory use is less of a concern.
  • This method also gives us a single URL to work with which helps to keep the sitemap/structure as simple as possible without redirection nightmares.
  • On top of that, Googlebot won’t need to visit multiple URLs to index your content updates.
  • Less to crawl means Googlebot will have a better chance to index more of your pages/get deeper inside your site.
“Why is/was there a concern about mobile page size?”

Low-end mobiles, like a Nokia C6 from 4+ years ago (which major telcos were still offering last year), typically require that total page data stay under 1 MB for the phone to handle the memory demands of rendering/displaying the site.

If you go over that memory limit you risk crashing the browser with an error that the device's memory has been exceeded. Re-loading the browser drops the visitor on the device's default home page with all history lost. We can all agree this is not a good experience for potential clients.

Higher-end devices are still victims of their real-world connectivity. Most 3rd generation devices can hit really nice peak speeds, but rarely get into a physical location where those speeds are consistent for a reasonable length of time.

Therefore, even with the latest gee-whiz handsets, your success rate in delivering an entire page to mobile users will be affected by the amount of data you require them to fetch.

In a responsive web design scenario the main HTML content is typically sent along with CSS that caters to the layout/screen limitations of a mobile web browser. While this can mean omitting image data and other resources, many sites simply attempt to 'resize' and 'rearrange' the content, leading to very similar bandwidth/memory needs for mobile visitors on sites using responsive design.

The SEO concern with responsive design is that, since the written HTML content is shared with the mobile styling, it's crucial that search engines/crawlers understand the mobile-styled content is not cloaking or some other black-hat technique. Google does a great job of detecting this, and we discuss how a bit later on, with links to Google's own pages on the topic.

Mobile Pages:

Visual demonstration of mobile web page design

If you've ever visited an 'm.' subdomain (m.example.com, for example), you've already seen what mobile versions of a site can look like. Typically these versions skip reformatting the main site content and get right down to the business of catering to the unique needs of mobile visitors.

Not only can it be a LOT easier to build a mobile version of your site/pages, you can expect these versions to have more features and be compatible with a wider range of devices.

Tools like jQuery Mobile will have you making pages in a jiffy using modern techniques/HTML5. It's so easy you could even make a demo image purely for the sake of a blog post! ;)
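To give a sense of how little markup jQuery Mobile needs, here is a minimal page sketch (the CDN script/stylesheet URLs and version are assumptions for illustration, not from the original post):

```html
<!-- Minimal jQuery Mobile 1.x page sketch -->
<!DOCTYPE html>
<html>
<head>
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <link rel="stylesheet" href="https://code.jquery.com/mobile/1.4.5/jquery.mobile-1.4.5.min.css">
  <script src="https://code.jquery.com/jquery-1.11.1.min.js"></script>
  <script src="https://code.jquery.com/mobile/1.4.5/jquery.mobile-1.4.5.min.js"></script>
</head>
<body>
  <!-- data-role attributes tell jQuery Mobile how to style each block -->
  <div data-role="page">
    <div data-role="header"><h1>My Site</h1></div>
    <div data-role="content"><p>Mobile-friendly content here.</p></div>
  </div>
</body>
</html>
```

Everything inside the `data-role="page"` wrapper gets touch-friendly styling automatically, with no mobile-specific CSS of your own.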

This also frees up your main site design so you can make changes without worrying what impact it has on mobile.

“What about my content?”

Excellent question!

Mobile versions of sites with lots of useful content (AKA: great websites) can feel like a major hurdle to tackle, but in most cases there are some awesome solutions for making your content work in mobile versions.

The last thing you'd want to do is block content from mobile visitors, and Google's ranking algorithm updates from June 2013 agree.

Even something as simple as a faulty redirect, where your mobile site serves up its home page when the visitor actually requested a specific desktop page, is a really bad situation, and in Google's own words:

“If the content doesn’t exist in a smartphone-friendly format, showing the desktop content is better than redirecting to an irrelevant page.”

You might think the solution to 'light content' or 'duplicate content' in mobile versions is to block crawlers from indexing the mobile versions of a page, but you'd be off the mark: you actually want to make sure crawlers know you have mobile versions to evaluate and rank.

In fact if you hop on over to Google Analytics, you will see that Google is tracking how well your site is doing for mobile, desktop, and tablet visitors:
Example of Google Analytics for a site with mobile SEO issues.

(Nearly double the bounce rate for Mobile? Low page counts/duration as well!?)

Google Analytics will show you even more details, so if you want to know how well you do on Android vs. BlackBerry, they can tell you.

“How do the crawlers/search engines sort it out?”

A canonical URL is always a good idea, but using a canonical between a mobile page and the desktop version just makes sense.

A canonical can cancel out any fears of showing duplicate content and help the crawlers understand the relationship between your URLs with just one line of markup.

On the flip-side a rel=”alternate” link in the desktop version of the page will help ensure the connection between them is understood completely.

The following is straight from the Google Developers help docs:

On the desktop page, add:

<link rel="alternate" media="only screen and (max-width: 640px)" href="http://m.example.com/page-1" >

and on the mobile page, the required annotation should be:

<link rel="canonical" href="http://www.example.com/page-1" >

This rel=”canonical” tag on the mobile URL pointing to the desktop page is required.

As for responsive design, Googlebot is pretty smart: if you aren't blocking access to resources intended for a mobile browser, Google can and should detect responsive design from the content itself.

Google’s own help pages confirm this and provide the following example of responsive CSS markup:

    @media only screen and (max-width: 640px) {...}

In this example they are showing us a CSS rule that applies when the screen width is at most 640px, a clear sign that the rules apply to a mobile device rather than a desktop.

Google Webmaster Central takes the information even further, providing tips and examples for implementing responsive design.

Ever wondered how to control what happens when a mobile device rotates and the screen width changes? Click the link above. :)

SEO news blog post by @ 3:51 pm on August 16, 2013


How To Search Engine Optimize (SEO) an AJAX or Web 2.0 Site

One of the three major pillars of Search Engine Optimization is a website's content and onsite content optimization. All of the major search engine ranking algorithms have components that relate to the content contained on the website. Typically these components relate to keyword densities, word counts, content location, and sometimes age of content. The code that contains the content falls under the topic of structure rather than content, and will not be discussed in this article.

Asynchronous JavaScript and XML (AJAX) is a web development method used to create more responsive and interactive dynamic websites. AJAX accomplishes this by making requests back to the web server without having to refresh the browser; the responses are then typically used to update the content of the page currently being viewed. For the sake of this article I'm going to ignore the XML component of AJAX, as the search engines never see any of the XML data. Websites that use JavaScript to manipulate content without using AJAX suffer from the same issues described below.

When a search engine sends out a robot/spider to index your content, it only looks at what is presented in the markup. Generally a search engine does not behave like a user: it doesn't click buttons or links, it simply makes note of the URLs associated with each page and then visits those pages individually to index them. This largely runs against the goal of AJAX, which is to have as few pages as possible by interacting with the web server in a smarter way as users interact with the website.

To put the last paragraph simply: any content that is changed via AJAX or JavaScript, and is not hardcoded in a page, won't be cached by the search engines. This means that if you have great content the search engines might love, but you're serving it via AJAX, you may be missing out on traffic. There are two approaches to rectifying this, which may even give you an advantage over sites that don't use JavaScript/AJAX.

The first approach is to make sure your website degrades to normal flat markup for non-JavaScript-capable browsers and search engines. Essentially, every time you would have used an AJAX call, make sure there is a real page with the same content. For a lot of people this could mean a lot of work, but for those using a database with PHP or ASP it is not too hard to build a site that generates these pages with some effective web programming.
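One common way to get this degradation is the "Hijax" pattern: give every link a real href that a crawler can follow, and let JavaScript intercept the click to load the same URL via AJAX. A minimal sketch (the `data-ajax` attribute and `content` element ID are hypothetical, not a standard):

```javascript
// "Hijax" progressive enhancement sketch: every AJAX-loaded section
// also exists as a real page at the link's href, so crawlers that
// ignore JavaScript can still reach and index the content.

// Pure helper: only hijack same-origin links.
function shouldHijack(href, origin) {
  return typeof href === 'string' && href.indexOf(origin) === 0;
}

// Browser-only wiring; guarded so the sketch also loads outside a browser.
if (typeof document !== 'undefined') {
  document.addEventListener('click', function (e) {
    // data-ajax is a hypothetical marker attribute for links we enhance.
    var link = e.target.closest('a[data-ajax]');
    if (!link || !shouldHijack(link.href, location.origin)) return;
    e.preventDefault();
    // Fetch the same URL the crawler would index and swap it in.
    var xhr = new XMLHttpRequest();
    xhr.open('GET', link.href);
    xhr.onload = function () {
      document.getElementById('content').innerHTML = xhr.responseText;
    };
    xhr.send();
  });
}
```

Users with JavaScript get the smooth in-place update; everyone else, crawlers included, simply follows the link to the flat page.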

The second approach is to use AJAX in a more minimalist fashion. The goal here is to present the search engines with your optimized content while making sure that any AJAX calls a user triggers have no bearing on what you want the search engines to see. In fact, this can be used to remove content that may negatively affect your rankings, such as testimonials. I've seen very few testimonials that actually do good things for a site's keyword density; I've even been known to optimize testimonials on clients' websites. With JavaScript/AJAX you could insert a random testimonial into a page without affecting that page's keyword density. The only downside to this approach is that some offsite keyword density tools use web browser rendering engines, so they may report skewed results because they take the JavaScript into account.
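The testimonial trick above could be sketched like this, assuming a hypothetical page element with id="testimonial"; since the text is injected client-side, crawlers that ignore JavaScript never count it toward keyword density:

```javascript
// Inject one random testimonial client-side so it never appears in the
// crawled HTML and never skews the page's keyword density.
var testimonials = [
  'Great service, highly recommended!',
  'Fast turnaround and friendly staff.',
  'We saw results within a month.'
];

// rand is injectable for testing; defaults to Math.random().
function pickTestimonial(list, rand) {
  var r = (typeof rand === 'number') ? rand : Math.random();
  return list[Math.floor(r * list.length)];
}

// Browser-only: drop the chosen testimonial into the placeholder element.
if (typeof document !== 'undefined') {
  var el = document.getElementById('testimonial');
  if (el) el.textContent = pickTestimonial(testimonials);
}
```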

Now you may think I'm anti-AJAX from everything I've said, but there is a time and place for AJAX, provided it doesn't affect how the search engines see the beautiful, relevant content you're trying to rank. AJAX is great for member sections, interactive forms, slideshows, and a lot more; it just needs to be leveraged correctly to avoid missing out on search engine visitors. The final thing to keep in mind is that most search engines like to see more than a single-page website, which many AJAX websites appear to be; always strive for at least 5 or more indexable pages, as internal links and anchor text can have a lot of value.

SEO news blog post by @ 2:13 pm on July 14, 2009


An Introduction To SEO

Welcome to Daryl Quenet's introduction to Search Engine Optimization (SEO), optimizing design, and how to maximize your website's search engine positioning for the major search engines.

When it comes to running an effective website that ranks well on the search engine results pages (SERPs), there are three major factors that can influence the number of search engine referrals (incoming searches) you get. This applies to all the major search engines (Google, Yahoo, MSN, and Live).

Content Is King

The most important thing is the content on your page. Regardless of how much time you put into Search Engine Optimization (SEO) for your website, without the content people are searching for you will see very little return on your efforts.

Part of preparing your content is analyzing the keyword(s) for your given industry. Just putting keywords in the keywords meta tag will get you nowhere if those keywords don't exist in your content. This is where Keyword Density comes in: basically, the more often your keywords appear, the more relevant your content looks to a search engine for that search. Keep in mind an ideal density is around 3.5% per word in your phrase.

When writing your search engine optimized content, don't forget about the end user. If you can't get your keyword densities bang on, don't worry about it. I prefer lower density but higher quality content for the end user over spammy content and a lower conversion rate. The end goal is still to convert visitors to your products, services, or whatever your goal may be. Users, unlike search engines, are not interested in keyword density, so beware of keyword spam.
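As a rough illustration of the density calculation (assuming "density" simply means keyword occurrences divided by total word count; this is a sketch, not any search engine's actual formula):

```javascript
// Rough keyword-density check: occurrences of a single-word keyword
// divided by total word count, as a percentage. A simplification for
// illustration; real tools also handle phrases and punctuation.
function keywordDensity(text, keyword) {
  var words = text.toLowerCase().split(/\s+/).filter(Boolean);
  if (words.length === 0) return 0;
  var hits = words.filter(function (w) {
    return w === keyword.toLowerCase();
  }).length;
  return (hits / words.length) * 100;
}
```

For example, `keywordDensity('seo tips and seo tricks', 'seo')` returns 40, since 2 of the 5 words match.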

And a final note on content for this introduction: it is advisable to update your content regularly. The longer your content goes without updates, the staler it gets, and the more your search engine positioning will drop. With enough link building, however, this can be negated.

Link Building Your Way To Success

Link building is easily the second most important factor in SEO, and in some cases the most important. Building links into your website is the only way you, as a webmaster, can affect the authority of your website and the value your existing content has in the eyes of the search engines.

To conceptualize link building, think of your website as if it were a person. The more popular a person is, the more authority what they say carries with their audience. The big difference is that our target audience is Google and the other major search engines, and having quality links on other sites equates to your website's "popularity".

Now keep in mind when you start your link building that no two links are exactly the same. When Google calculates the value of a link it looks at several important things to figure out just how much strength to pass you. Here are just a few:

  1. How much strength the page with the link has
  2. The number of external links on the page
  3. The anchor text used for the link
  4. Whether a rel="nofollow" attribute is used
  5. How long the link has been there

All of the factors above are irrelevant if Google hasn't cached the page with the link; if Google hasn't found it, it is worth nothing. The stronger the page your link is on, the more strength you get in return. The more outgoing links there are on a page, the more that strength is divided between all the linked sites.

A link with a rel="nofollow" attribute is virtually useless to your website, other than increasing your overall link count to give your competitors a scare. You will mainly find nofollow attributes on blog comments, website advertisers/sponsors, paid links, or links to competitors (I use them on my resume for past work experience).
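The division-of-strength and nofollow points above can be caricatured in a toy model (illustrative only; Google's real formula is not public and the numbers here are made up):

```javascript
// Toy model of link value: a page's strength is split evenly across
// its followed external links, and a nofollow link passes nothing.
// Purely illustrative; not Google's actual calculation.
function linkValue(pageStrength, followedLinks, isNofollow) {
  if (isNofollow || followedLinks <= 0) return 0;
  return pageStrength / followedLinks;
}
```

So a link on a strength-12 page with 4 followed external links would pass 3 units, while the same link marked nofollow would pass 0.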

When a link is built, very few search engines will give you the full strength of that link right away. This is done to maintain the quality of the SERPs; if everyone could just go out, build thousands of links, and then rank, there would be no quality to the search results. Instead they slowly give you more strength as the links age, up until around the 6 month mark.

Lastly, you will constantly see something called Google PageRank. PageRank is an arbitrary Google measurement assigned to a website/page to denote that page's strength. Some people consider this measurement to be the be-all and end-all, but in truth it means very little other than being an indicator of your site's health. If you have PageRank on your homepage as well as on most of your internal pages, you're off to a good start. Also keep in mind that PageRank only updates every 3-6 months, and ultimately the proof is in the search engine results, not some number in the toolbar.

* It’s important to note that when I’m referring to PageRank above I’m referring to the visual PageRank displayed in the little green bar, not the actual PageRank that Google uses internally to calculate the value of a page.

Optimize Your Website Navigation Structure & Design

I purposely left site structure for last, as it can be the quickest way to royally mess up your website's rankings. In the worst case, bad structure means no part of your website gets cached and you see no visitors. I've seen a lot of sites with issues that stop search engines from crawling them at all. Some of the worst, yet simplest, structural issues that can hurt your search engine crawler visibility are:

  1. Automatically redirecting all visitors that come to your site to another page.
  2. Using HTTPS only
  3. Pure Javascript based navigation

On other sites I have seen Google cache only the index page, which may have an assigned PageRank, without spidering the rest of the website. The things to remember when mapping out your website's structure are:

  1. At all costs avoid dynamic URLs (e.g. index.php?PageId=1); a dynamic URL is one that contains HTTP GET variables. Search engines don't tend to spider these well, and to users they contain nothing relevant to their queries. Try to use page keys that contain your keywords; if you need dynamic scripts to build your website (i.e. through a Content Management System), use Apache mod_rewrite to give it a static appearance. If you must use dynamic URLs, keep the number of variables to no more than 2.
  2. If possible, use the keywords you are targeting for your industry in your URLs, files, and directories. This helps increase your keyword density, and gives users clicking through from Google file names relevant to their query.
  3. Don't constantly change your website structure. PageRank takes time to develop naturally, and Google holds new sites back in a sandbox. By renaming a page you can often kiss your pre-existing search engine positioning goodbye until its rank is redeveloped.
  4. When designing a new site, try to avoid filenames with extensions in the URL (e.g. Products.asp); this can limit your options if you later change programming languages (e.g. ASP to PHP), as well as the platforms your website can be hosted on (e.g. Windows vs. Linux hosting).
  5. When implementing a new structure or site, create a Google sitemap and register it with Google to let Google know what to index.
  6. Whenever possible, attach CSS and JavaScript as external files.
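For point 1, a hypothetical mod_rewrite sketch (assuming Apache with mod_rewrite enabled in a .htaccess file; the paths and parameter name are made up for illustration) that maps a keyword-rich URL onto a dynamic script:

```apache
# Hypothetical .htaccess sketch: serve /widgets/blue-widget from
# index.php?PageId=blue-widget, so the public URL carries keywords
# instead of GET variables.
RewriteEngine On
RewriteRule ^widgets/([a-z0-9-]+)/?$ index.php?PageId=$1 [L,QSA]
```

The visitor and the crawler both see /widgets/blue-widget, while your CMS still receives its PageId variable behind the scenes.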

Once you have decided on a website structure, or you have a pre-existing structure, the best way to score higher search engine positions is to keep the HTML coding minimal, maximizing your content-to-markup ratio. The best way to minimize the amount of HTML required is to use Cascading Style Sheets (CSS). CSS lets you pull the design out of your HTML pages and place it in a separate file. Not only does this remove a lot of HTML if you were using tables for layout, it makes maintenance a lot simpler, as all your design changes are made in one place.

When I moved my website from table based layout to Cascading Style Sheets I managed to reduce my markup code by around 60%! If you have a very large site this can be even more beneficial as some search engines limit the amount of hard drive space they will allocate to caching your website, as well as raise the position of your content higher up in your document.


And thus concludes my introduction to Search Engine Optimization (SEO). It may have been a bit long-winded, but that is really just a little of what goes into successfully positioning your website on the search engines. I'll finish with one last warning: do not buy or sell links, as you can easily be penalized right out of the SERPs for this (Google supplies a page for reporting websites that buy and sell links). Good luck with your search engine result pages and positioning!

SEO news blog post by @ 5:10 pm on February 27, 2009


Ecommerce & SEO

The purpose of any business website is to promote a product or service online. The purpose of an ecommerce website is to take it one step further and allow visitors to purchase your products or services directly from your website. This model has many advantages over the non-ecommerce website: it allows for revenue generation with little or no sales time spent beyond the cost of designing and maintaining the website, and it does not require the visitor to call you during business hours, which helps secure the sale from an impulse buyer. If your website provides all the information a buyer would want, you can save significant sales time, since the visitor can find everything they need to decide to buy from you without taking up your time or that of your sales staff. But ecommerce sites have a serious drawback as well: very few of them can be properly indexed by search engine spiders, and thus they fail to rank highly.

A non-ecommerce website may have the disadvantage of not being able to take a visitor's money the second they want to spend it; however, if it can be found on the first page of the search engines while your beautifully designed ecommerce site sits on page eight, the advantage is theirs. The vast majority of visitors will never get to see your site, let alone buy from you, whereas the non-ecommerce site may lose sales because it doesn't sell online, but at least it delivers its message to an audience in the first place. So what can be done? The key is in the shopping cart you select.

SEO & Shopping Carts

The biggest problem with many SEO-friendly ecommerce solutions is that they are created after the initial product. Shopping cart systems such as Miva Merchant and osCommerce were not designed with the primary goal of creating pages that will be well received by search engine spiders. Most shopping cart systems out there today are not in and of themselves even spiderable, and require 3rd-party add-ons to facilitate even the lowest form of SEO-friendliness. The money you may have saved choosing an inexpensive shopping cart may very well end up costing you your business in the long run, especially if you are using your shopping cart as the entire site, which we have seen many times in the past.

What Can Be Done?

There are essentially two solutions to this problem. The first is to create a front-end site separate from the shopping cart. What this will effectively do is create a number of pages that can be easily spidered (assuming that they’re well designed). The drawback to this course of action is that your website will forever be limited to the size of the front-end site. Which brings us to the second option: choose a search engine friendly shopping cart system.

Finding an SEO-friendly shopping cart system is far easier said than done. There are many factors to take into account, including the spiderability of the pages themselves, the customization capacity of individual pages, the ease of adding products and changing pages down the road, etc. While I've worked with many shopping cart and ecommerce systems, to date only one has truly impressed me: it is extremely simple to use, it allows full customization of individual pages, and the product pages get fully spidered to the point where they have PageRank assigned. A rarity in the shopping cart world.

Easy As Apple Pie

Mr. Lee Roberts, President of Rose Rock Design and creator of the Apple Pie Shopping Cart, was kind enough to take the time to speak with me regarding how he developed his system. Trying to get an understanding of how this system was born I inquired as to what differentiated their system from others. Without “giving away the farm”, Lee pointed out that his system was unique in that the search engines were a consideration from the birth of this project. Rather than trying to jerry-rig a system that was already in place, he initiated the development of a system whose first task was to allow for easily spidered and customized pages. A significant advantage to be sure.

In further discussions he pointed out a few key factors that should be considered when choosing a shopping cart system. While more advanced shopping cart systems that provide SEO-friendly pages may seem more expensive, they save you the cost of developing a front-end site and of maintaining pricing on static pages if one goes that route. And of course, if all your site's pages are easily spidered, you can have hundreds of additional relevant pages adding to your site's overall strength and relevancy, a serious advantage in the SEO "game". If a shopping cart system costs you an extra $100 per month but its use brings you an additional $5,000 in sales that month, did it really "cost" you $100?


This is not to say that the Apple Pie Shopping Cart is the end-all-be-all of SEO for an ecommerce site; if it were, Lee wouldn't be in the process of building a new version with many new features for Internet marketing and tracking, and we would be out of work. That said, if you've got an e-commerce site or are looking to have one built, consider what type of marketing strategy will be taken with the site, and if SEO is part of it, ensure you find a system that provides the same advantages as this one.

It may cost a bit more up front, but doing it right the first time is far less costly than building a site that can't be marketed properly and to its maximum potential.

SEO news blog post by @ 3:46 pm on


SEO Step Nine Of Ten: Conversion Optimization

Welcome to step nine in this ten part SEO series. The ten parts of the SEO process we will be covering are:

  1. Keyword Research & Selection
  2. Competition Analysis
  3. Site Structure
  4. Content Optimization
  5. Link Building
  6. Social Media
  7. PPC
  8. Statistics Analysis
  9. Conversion Optimization
  10. Keeping It Up

Give them Direction, then give them choice.

Marketing online doesn’t have to be terribly complicated or flashy in order to be successful.

I do a lot of market research for big corporations and smaller start-ups, all of which come to me with the same problems:

  1. They aren’t getting the right traffic to convert on their website.
  2. They aren’t getting return traffic to their website.
  3. Their abandonments are high and their website has no stickiness.
  4. Leads and/or sales are falling off.

Really, what they are all failing to do is build a real “brand relationship” with their market, and they cannot seem to figure out why.

But what is interesting is that the reason why is pretty much the same across the board – they never took the time to really consider the behavior of their current users and those users that they want to attract.

I know it is the oldest marketing tenet, but it's so old because it is absolutely true.

If you really want to succeed and get more online conversions, you have to:

  1. Look first to your market
  2. Seek to understand your market
  3. Identify what your market needs

Here’s the trick though…

  1. Give it to them in a way that seems natural, specific to them
  2. Coach them along the way

More often than not, whether it is B2B or B2C, most companies fail to give the audience what they want – because they don’t do it in the way the audience wants to hear it. Instead they use flash or messaging that is constructed around the way the business sells and not the way its customers buy.

Let me tell you a story.

Fifty years ago my grandfather opened a small butcher shop in the east end. He was an immigrant with some scraped together savings from working on the docks in the day and a family meat shop downtown during the nights. He would grind up sausages all night, cut porterhouses, and then break his back unloading booze and everything else off the boats.

I’m sure he kept a little for himself – a sort of cheap steaks on the waterfront and cheap booze from the butcher kind of thing – but he had a dream of making his own way and he did what he had to in order to make it happen.

I didn’t know him then, but he would tell me stories sometimes about how he worked for everything he had and how nothing came easy.

The thing I remember most was that he told me it took him 2 years to find his storefront; it had to be on a corner with windows shining light in from both sides, had to accommodate two 12-foot-long displays on the inside walls, and had to have a register in the middle.

Location was the key for his success.

What's more, he also made sure that it was at the crossing of two bus routes: one that went to the downtown offices, the other coming up from the waterfront. It also happened to be surrounded by two-bedroom family apartments.

He said he always put the right cuts of meat in each display, so there was a path for the bankers and a path for the workers.

By the time I knew him, he had over a dozen butcher shops in five cities.

The story may seem like a bit of a distraction, but I think that the same design my grandfather brought to his butcher shop is what is lacking in a lot of website design.

You see, my grandfather knew that his location would make stopping at the butcher a very natural extension of his customers' walk home. He also knew that the guy coming in the center door from the right side would look through the window as he marched to the door and follow that direct path to the register.

In his shop, nobody had to crisscross the store. The two displays were tailored to the cuts favored by bankers or those favored by longshoremen; the money at the register was the same. Everybody chose their distinct route, but they were definitely coached along the way.

Barrier Scanning in Websites

A few weeks ago we published a little whitepaper (free to download) about a trend we were starting to see more and more in our eye tracking studies. We started calling it barrier scanning in-house, and eventually we saw it so frequently that we thought we should write about it.

Figure 1: Funnel barrier scanning.
Barrier scanning happens on all types of websites, from e-commerce to lead-generation microsites. It is the act of some on-page element (a large graphic, whitespace, video, or even the page fold) being perceived as a natural barrier to a user's scan pattern. In essence, it interrupts the natural scan activity and either redirects, fences, or funnels a user's natural flow on the page.

Think of it like rocks in a stream… if the rocks are large enough, or there happens to be too many of them, the water will change its natural course.

The image (Figure 1) above is a perfect illustration of how graphic elements can alter a user’s scan pattern. (The red to blue spectrum overlay indicates the area and concentration of a user’s fixations.) See how the user’s scanning is funneled to the text links in the right rail by the two large display ads.

Figure 2: Size matters with images.
However, the size of an image can also determine whether it is perceived as a barrier that blocks the scanning of the page (see Figure 2 for an example). In fact, larger graphics without text are easier for a user to ignore, because they are not part of the foveal fixations (people use their peripheral vision to look at pictures).

Like a large graphic, video or Flash can also act as barriers, driving users around the multimedia portion of the page.

Think about getting Conversions

Now imagine that your website has a large graphic call to action… is there a good chance it is being perceived as a natural barrier, pushing users' scan patterns away instead of attracting eyeballs?

Now imagine that you have embedded your conversion trigger into your Flash file – right at the end of it… is there a good chance that user’s are scanning around the media looking for a clear navigation path?

The answer to both questions is “yes.”

Let me first offer a caveat: just because your website has this type of layout doesn't mean this is definitely happening. But if you aren't leaving enough natural information scent on the page, your page is more than likely failing you as a result.

What do I mean?

Think of your page like a treasure map: you can't just place your conversion trigger and hope a user will find it, even if you make it larger than life with flashing neon. Users scan a page quickly and make relevance and navigation decisions within fractions of a second. For those reasons, you have to lay clues or clear a path for the user's scanning.

There isn’t necessarily a simple design rule of thumb other than making a strong paradigm shift. Rather than focus on trying to be engaging, try to engage your particular customer. There is a difference there; often designers confuse “engaging” with salesy or entertaining – these are not synonyms. When I say “engage your particular customer,” I mean that you have to look at coaching your customer to a conversion trigger rather than directing them to it.

Try the following steps:

  1. Look at your page the way your customer sees it… if you can’t do eye tracking, asking them questions is usually a pretty good alternative.
  2. Look at your page and see if there are barriers in the way of your customer reaching your conversion trigger – if so, remove them and find better alternatives.
  3. Don’t be afraid of making a mistake, the worst mistake would be to stick with something that isn’t working – make some changes and test them.
  4. Remember that layout and design are only part of the equation; making every decision with your customer at the center is the best marketing practice. For any conversion trigger or process to really work, the page design has to naturally coach the user to the conversion trigger, and so do the messaging and the overall resonance of the website, or else none of your problems will be solved.
  5. Worst case, give me a call. I'm always willing to make myself available to talk about your website, about how barriers can impact conversions, and especially about how we need to start looking at online as a coaching medium and not a driving medium. In fact, we will soon be starting a recurring webinar in which a rotating panel of industry experts looks at audience-suggested websites and dissects what is working and what isn't, while suggesting some best practices for the industry as a whole.

In the end

Like my grandfather said, "it takes work to make a success." Mind you, my father said it took an inheritance, but I think he had a different perspective.

My grandfather was no fool, though; he spent his time understanding how his customers walked and talked before he built his butcher shop, and that made all the difference.

About The Author

Rick Tobin is the Director of Research at Enquiro Research and widely regarded as an authority on conversion optimization and study.

SEO news blog post by @ 2:01 pm on April 5, 2008

Categories:Web Design Articles


W3C Compliance & SEO

From reading the title, many of you are probably wondering what W3C compliance has to do with SEO, and many more are probably wondering what W3C compliance is at all. Let's begin by shedding some light on the latter.

What Is W3C Compliance?

The W3C is the World Wide Web Consortium; since 1994 the W3C has provided the guidelines by which websites and web pages should be structured and created. The rules they outline are based on best practices, and while websites don't have to comply to be viewed correctly in Internet Explorer and other popular browsers that cater to incorrect design practices, there are a number of compelling reasons to ensure that you or your designer follow the W3C guidelines and bring your site into compliance.

In an interview with Frederick Townes of W3 EDGE Web Design, he mentioned a number of less SEO-related though very compelling arguments for W3C compliance. Some non-SEO reasons to take on this important step in the lifecycle of your site are:

  • Compliance helps ensure accessibility for the disabled.
  • Compliance helps ensure that your website is accessible from a number of devices, from different browsers to the growing number of surfers using PDAs and cellular phones.
  • Compliance will also help ensure that, regardless of browser, resolution, or device, your website will look and function in the same or at least a very similar fashion.

At this point you may be saying, “Well that’s all well-and-good but what does this have to do with SEO?” Good question.

We at Beanstalk have seen many examples of sites performing better after we had brought them, or even just their homepage, into compliance with W3C standards. While discussing this with Frederick he explained it very well with:

“Proper use of standards and bleeding edge best practices makes sure that not only is the copy marked up in a semantic fashion which search engines can interpret and weigh without confusion, it also skews the content-to-code ratio in the direction where it needs to be while forcing all of the information in the page to be made accessible, thus favoring the content. We’ve seen several occasions where the rebuilding of a site with standards, semantics and our proprietary white hat techniques improves the performance of pages site-wide in the SERPs.”

Essentially what he is stating is a fairly logical conclusion: reduce the amount of code on your page, and the content (you know, the place where your keywords are) takes a higher priority. Additionally, compliance will, by necessity, make your site easily spidered and allow you greater control over which portions of your content are given more weight by the search engines.


The Beanstalk website and the W3 EDGE site themselves serve as good examples of sites that performed better after complying with W3C standards. With no other changes than those required to bring our site into compliance the Beanstalk site saw instant increases. The biggest jumps were on Yahoo! with lesser though still significant increases being noticed on both Google and MSN.

As we don’t give out client URLs I can’t personally list off client site examples we’ve noticed the same effect on, however we can use W3 EDGE as another example of a site that noticed increases in rankings based solely on compliance.

So How Do I Bring My Site In Compliance With W3C Standards?

To be sure, this is easier said than done. Obviously the ideal solution is to have your site designed in compliance to begin with. If you already have a website you have one of two options:

  1. Hire a designer familiar with W3C standards and have your site redone, or
  2. Prepare yourself for a big learning curve and a bit of frustration (though well worth both).


Assuming that you’ve decided to do the work yourself there are a number of great resources out there. By far the best that I’ve found in my travels is the Web Developer extension for FireFox. You’ll have to install the FireFox browser first and then install the extension. Among other great tools for SEO this extension provides a one-click check for compliance and provides a list of where your errors are, what’s causing them and links to solutions right from the W3C. The extension provides testing for HTML, XHTML, CSS and Accessibility compliance.

Other resources you’ll definitely want to check into are:

Where Do I Get Started?

The first place to start would be to download FireFox (count this as reason #47 to do so as it’s a great browser) and install the Web Developer extension. This will give you easy access to testing tools.

The next step is to bookmark the resources above.

Once you’ve done these you’d do well to run the tests on your own site while at the same time keeping up an example site that already complies so you can look at their code if need be.

To give you a less frustrating start I would recommend beginning with your CSS validation. Generally CSS validation is easier and faster than the other forms. In my humble opinion it’s always best to start with something you’ll be able to accomplish quickly to reinforce that you can in fact do it.

After CSS you’ll need to move on to HTML or XHTML validation. Be prepared to set aside a couple hours if you’re a novice with a standard site. More if you have a large site of course.

Once you have your CSS and HTML/XHTML validated, it's time to comply with Accessibility standards. What you will be doing is cleaning up a ton of your code and moving a lot of it into CSS, which means you'll be adding further to your style sheet. If you're not comfortable with CSS, you'll want to revisit the resources above. CSS is not a big mystery, though it can be challenging in the beginning. As a pleasant by-product, you are sure to find a number of interesting effects and formats that are possible with CSS that you didn't even know could be so easily added to your site.
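As a rough illustration of that kind of cleanup (the class name and copy are invented for the example), a presentational font tag can be replaced with a semantic element styled from the style sheet:

```html
<!-- Before: presentational markup repeated on every page -->
<font face="Arial" size="2" color="#333333">Call us today for a free quote.</font>

<!-- After: semantic markup; the presentation is defined once -->
<style type="text/css">
  p.callout { font-family: Arial, sans-serif; font-size: 0.9em; color: #333; }
</style>
<p class="callout">Call us today for a free quote.</p>
```

In practice the rule would live in your external style sheet rather than a page-level style block, so every page sharing the class gets lighter at once.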

But What Do I Get From All This?

Once you’re done you’ll be left with a compliant site that not only will be available on a much larger number of browsers (increasingly important as browsers such as FireFox gain more and users) but you’ll have a site with far less code that will rank higher on the search engines because of it.

To be sure, W3C validation is not the "magic bullet" to top rankings. In the current SEO world there is no one thing that is. However, as more and more websites are born and the competition for top positioning gets fiercer, it's important to take every advantage you can, not only to get to the first page but to hold your position against those who want to take it from you as you took it from someone else.

If you have any questions about W3C compliance or your website please feel free to contact us for additional information.

SEO news blog post by @ 12:10 pm on April 19, 2005

Categories:Web Design Articles


Table Structures For Top Search Engine Positioning by Mary Davies

So you have a beautiful website that you paid a pretty penny for and you are completely happy with it … except no one can find it. Many web designers do not understand search engine positioning, so when they design your website, little or no thought is given to the elements of design that may affect your rankings on search engines. On the other hand, some search engine positioning companies offer services that will boost your rankings but at the expense of your design. A good search engine positioning company can get your website ranking for key phrases without affecting the overall design and navigation of your site. You can make small changes to your website on your own as well, utilizing the tips noted in this article. There is a happy medium: you can "have your cake and eat it too."

The first step in designing a website that will rank well on search engines is to build the proper table structure. This framework will guide the search engine spiders through your site along the route you want them to take. A good table structure is essentially a map for the spiders; it guides the way as they travel through your website.

A standard website should use the table structure below:

This table structure is very beneficial to your website's ranking, as it guides the spiders to the text-rich, most important content of your site as quickly as possible. The spiders will enter your site at the first table, the header, and then travel through to the empty cell in row 1 of the second table; the next stop is your content area, the "meat" of your website. After going through your content, the spiders will continue on to row 2, where you have inserted a table holding your navigation. If your navigation is image based, as most is, it holds very little content that matters from the spider's perspective. The spiders will then follow through to row 3, where you have your footer, a place to add valuable text links rich in targeted keywords.
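Reconstructed from the description above (the cells and attributes are illustrative assumptions, not the original markup), the structure looks something like this:

```html
<!-- Table 1: the header -->
<table width="100%">
  <tr><td><!-- header image --></td></tr>
</table>

<!-- Table 2: the empty cell in row 1 sends spiders to the content first -->
<table width="100%">
  <tr>
    <td></td>                      <!-- row 1: empty cell -->
    <td rowspan="2">Your text-rich content goes here.</td>
  </tr>
  <tr>
    <td valign="top"><!-- row 2: image-based navigation --></td>
  </tr>
  <tr>
    <td colspan="2"><!-- row 3: footer with keyword-rich text links --></td>
  </tr>
</table>
```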

If this table structure does not fit your design, there are workarounds. A good example is an index page that, upon load, has a very simple look: no text, just a clean, crisp image. This is a good way to present your business with a simple entry page. If you scroll down such a page, you will see that there is in fact a lot of text below the main image. With a simple 100% x 100% table you can achieve this effect on your website. This table will open the same way in all resolutions, giving your index page the look of a text-free page when in fact it is not.

100% x 100% table used as mentioned above:
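A minimal sketch of such a table (the dimensions, alignment, and image path are assumptions for illustration):

```html
<!-- A single cell stretched to the full viewport, so only the image shows on load -->
<table width="100%" height="100%">
  <tr>
    <td align="center" valign="middle"><img src="/images/entry.jpg" alt="Company Name"></td>
  </tr>
</table>
<!-- The keyword-rich text sits below this table, reachable by scrolling -->
```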

If you choose to use the 100% x 100% table noted above, you should still try to stick to the general table structure shown first for your internal pages. There is always room for adjustments based on the size of your site; however, most websites use left-hand navigation, and this table structure is based on left-hand navigation. If your navigation runs across the top of your page, you may want to reconsider it. It can remain at the top, and I have designed a few websites that rank well with top navigation, but for the best result, left-hand navigation using the table structure first mentioned in this article is best.

All in all, I cannot stress enough that a well-designed site will rank higher. Even without a complete SEO overhaul of your site, this table structure will increase your rankings over those who do not have a clean, properly structured website. Though the table structure is not noticed much by your visitors, aside from the look of it, it is very much noticed by the spiders, and they will travel through your website based on it. For an added edge, have your site designed with this table structure, or do it yourself. Make a point of giving not only your visitors but also the spiders the information you want them to have, as quickly and easily as possible.

If you have any questions about table structure or your website please feel free to contact us for additional information.

SEO news blog post by @ 10:27 am on January 27, 2005

Categories:Web Design Articles


Ten Steps To A Well Optimized Website – Step Five: Internal Linking

Welcome to part five in this search engine positioning series. Last week we discussed the importance of content optimization. In part five we will cover your website’s internal linking structure and the role that it plays in ranking highly, and in ranking for multiple phrases.

While this aspect is not necessarily the single most important of the ten steps it can be the difference between first page and second page rankings, and can make all the difference in the world when you are trying to rank your website for multiple phrases.

Over this series we will cover the ten key aspects to a solid search engine positioning campaign.

The Ten Steps We Will Go Through Are:

  1. Keyword Selection – October 24, 2004
  2. Content – October 31, 2004
  3. Site Structure – November 7, 2004
  4. Optimization – November 14, 2004
  5. Internal Linking – November 21, 2004
  6. Human Testing – November 29, 2004
  7. Submissions – December 5, 2004
  8. Link Building – December 12, 2004
  9. Monitoring – December 19, 2004
  10. The Extras – December 28, 2004

Step Five – Internal Linking

With all the talk out there about linking, one might be under the impression that the only links that count are those from other websites. While these links certainly play an important role (as will be discussed in part eight of this series), they are not the only important links.

When you’re about to launch into your link work why not stop and consider the ones that are easiest to attain and maximize first. That would be, the ones right there on your own site and those which you have total and complete control of. Properly used internal links can be a useful weapon in your SEO arsenal.

The internal linking structure can:

  1. Ensure that your website gets properly spidered and that all pages are found by the search engines
  2. Build the relevancy of a page to a keyword phrase
  3. Increase the PageRank of an internal page

Here is how the internal linking structure can affect these areas and how to maximize the effectiveness of the internal linking on your own website.

Getting Your Website Spidered

Ensuring that every page of your website gets found by the search engine spiders is probably the simplest thing you can do for your rankings. Not only will this increase the number of pages that a search engine credits your site with, but it also increases the number of phrases that your website has the potential to rank for.

I have seen websites that, once the search engines find all of their pages, find that they are ranking on the first page and seeing traffic from phrases they never thought to even research or target.

This may not necessarily be the case for you; however, having a larger site with more pages related to your content will boost the value of your site overall. You are offering this content to your visitors, so why hide it from the search engines?

Pages can be hidden from search engines if the linking is done in a way that they cannot read. This is the case in many navigation scripts. If your site uses a script-based navigation system then you will want to consider the implementation of one of the internal linking structures noted further in the article.

Additionally, image-based navigation is spiderable; however, the search engines can't see what an image is and thus cannot assign any relevancy from an image to the page it links to, other than assigning it a place in your website hierarchy.

Building The Relevancy Of A Page To A Keyword Phrase

Anyone who wants to get their website into the top positions on the search engines for multiple phrases must start out with a clearly defined objective, including which pages should rank for which phrases. Generally speaking it will be your homepage that you will use to target your most competitive phrase and move on to targeting less competitive phrases on your internal pages.

To help build the relevancy of a page to a keyword phrase, you will want to use the keyword phrase in the anchor text of the links to that page. Let's assume that you have a website hosting company. Rather than linking to your homepage with the anchor text "home", link to it with the text "web hosting main". This will attach the words "web", "hosting", and "main" to your homepage. You can obviously leave the word "main" out if desirable; however, in many cases it works for the visitor (you know, those people you're actually building the site for).

This doesn’t stop at the homepage. If you are linking to internal pages either through your navigation, footers, or inline text links – try to use the phrases that you would want to target on those pages as the linking text. For example, if that hosting company offered and wanted to target “dedicated hosting”, rather than leaving the link at solely the beautiful graphic in the middle of the homepage they would want to include a text link with the anchor text “dedicated hosting” and link to this internal page. This will tie the keywords “dedicated hosting” to the page.

In a field as competitive as hosting this alone won’t launch the site to the top ten however it’ll give it a boost and in SEO, especially for competitive phrases, every advantage you can give your site counts.

Increasing The PageRank Of Internal Pages

While we will be discussing PageRank (a Google-based term) here, the same rules generally apply for the other engines. The closer a page is, in clicks, to your homepage, the higher the value (or PageRank) the page is assigned. Basically, if I have a page linked to from my homepage, it will be given more weight than a page that is four or five levels deep in my site.

This does not mean that you should link to all of your pages from your homepage. Not only does this diffuse the weight of each individual link but it will look incredibly unattractive if your site is significantly large.

Figure out what your main phrases are and which pages will be used to rank for them and be sure to include text links to these internal pages on your homepage. It’s important to pick solid pages to target keyword phrases on as you don’t want human visitors going to your “terms and conditions” page before they’ve even seen the products.

If the hosting company noted above has a PageRank 6 homepage, the pages linked from its homepage will generally be a PageRank 5 (sometimes 4, sometimes 6, depending on the weight of the 6 for the homepage). Regardless, that is significantly higher than if the page were linked to from a PageRank 3 internal page.

How To Improve Your Internal Linking Structure

There are many methods you can use to improve your internal linking structure. The three main ones are:

  1. Text link navigation
  2. Footers
  3. Inline text links

Text Link Navigation

Most websites include some form of navigation on the left-hand side. This makes it one of the first things read by a search engine spider (read "Table Structures For Top Search Engine Positioning" by Mary Davies for methods of getting your content read before your left-hand navigation). Because it is one of the first things the search engine spiders see when they go through your site, it carries a strong weight, so it must be optimized with care.

If you are using text link navigation, be sure to include the targeted keywords in the links. This should not be taken to mean "cram your keywords into each and every link": this is your navigation, and that would look ridiculous. I've seen sites that try to get the main phrase in virtually every link. Not only does this look horrible, but it may get your site penalized for spam (especially if the links are one after another).

You don’t have to get your keywords in every link but if workable, every second or third link works well. Also consider what you are targeting on internal pages. If your homepage target is “web hosting” and you’ve linked to your homepage in the navigation with “web hosting main” which is followed by your contact page so you’ve used “contact us”, it would be a good idea to use the anchor text “dedicated hosting” for the third link. It reinforces the “hosting” relevancy and also attaches relevancy to the dedicated hosting page of the site to the phrase “dedicated hosting” in the anchor text.


Footers

Footers are the often overused and abused area of websites. While they are useful for getting spiders through your site and for the other points noted above, they should not be used as spam tools. I have seen, in my travels, footers that are longer than the content areas of their pages, linking to every single page in the site. Not only does this look bad, but it reduces the value of each individual link (which then becomes 1 out of 200 links rather than 1 out of 10 or 20).

Keep your footers clean, use the anchor text well, and link to the key internal pages of your website, and you will have a well-optimized footer. You will also want to include a link to a sitemap in your footer. On the sitemap, link to every page in your site; this is where you can simply ensure that every page gets found. Well-worded anchor text is a good rule on your sitemap as well. You may also want to consider a limited description of each page on your sitemap, which gives you added verbiage to solidify the relevancy of the sitemap page to the pages you are linking to.
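A clean footer along those lines (pages and URLs invented for illustration) links only to key internal pages plus the sitemap:

```html
<p class="footer">
  <a href="/">Web Hosting Main</a> |
  <a href="/dedicated.html">Dedicated Hosting</a> |
  <a href="/shared.html">Shared Hosting</a> |
  <a href="/sitemap.html">Site Map</a>
</p>
```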

Internal Text Links

Internal text links are links placed within the content of your work. They were covered in last week’s article on content optimization, which gives me a great opportunity to use one as an example.

While debatable, inline text links do appear to be given extra weight as their very nature implies that the link is entirely relevant to the content of the site.

You can read more on this in last week’s article.

Final Notes

As noted above, simply changing your internal navigation will not launch your site to the top of the rankings; however, it's important to use each and every advantage available to create a solid top ten ranking for your site that will hold its position.

They will get your pages doing better, they will help get your entire site spidered, they will help increase the value of internal pages and they will build the relevancy of internal pages to specific keyword phrases.

Even if that’s all they do, aren’t they worth taking the time to do right?

Next Week

Next week in part six of our “Ten Steps To an Optimized Website” series we will be covering the importance of human testing. Having a well-ranked website will mean nothing if people can’t find their way through it or if it is visually unappealing.

SEO news blog post by @ 4:28 pm on November 22, 2004


Ten Steps To A Well Optimized Website – Step Three: Site Structure

Welcome to part three in this search engine positioning series. Last week we discussed the importance of, and the considerations that must be made while creating, the content that will provide the highest ROI for your optimization efforts. In part three we will discuss the importance of site structure.

While there are numerous factors involved with the search engine algorithms, site structure is certainly of constant importance. Cleaner structures that remove lines of code between your key content and the search engine spiders can mean the difference between second page and first page rankings.

Over this series we will cover the ten key aspects to a solid search engine positioning campaign.

The Ten Steps We Will Go Through Are:

  1. Keyword Selection – October 24, 2004
  2. Content – October 31, 2004
  3. Site Structure – November 7, 2004
  4. Optimization – November 14, 2004
  5. Internal Linking – November 21, 2004
  6. Human Testing – November 29, 2004
  7. Submissions – December 5, 2004
  8. Link Building – December 12, 2004
  9. Monitoring – December 19, 2004
  10. The Extras – December 28, 2004

Step Three – Site Structure

Developing the structure of your website is a very important step in its overall optimization. The site structure will dictate how the spiders read your site, what information they gather, what content holds the most weight, how much useless code they must weed through and more. You must structure your website to appeal to the visitor and the spiders.

When developing your website, you want to be sure not to create useless code that can confuse spiders and take away from the content of your site. I recommend hand coding as the best option; however, not everyone has the time or the skill to do this, so I would suggest Dreamweaver as a great alternative. (Though the code will not be as clean as hand coding, it does not create an over-the-top amount of extra code the way programs such as FrontPage do.) The object here is to keep the code as clean as possible! Remember: the more code you have, the more the spiders must weed through to get to your content, where you want them to be.

A great way to cut down on extra code is to use style sheets. You can use style sheets in ways as simple as defining fonts or as advanced as creating tableless designs. There are many ways to use style sheets, and the biggest perk of using them is cutting back on the code on any given page.
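For instance, fonts can be defined once in a shared style sheet rather than repeated in markup on every page (the file name below is an assumption for illustration):

```html
<!-- In the <head> of every page: one link replaces repeated font markup -->
<link rel="stylesheet" type="text/css" href="/styles.css">
```

Here /styles.css might contain a single rule such as `body { font-family: Verdana, Arial, sans-serif; font-size: 85%; }`, keeping each individual page that much lighter for the spiders.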

When you are setting up the initial structure of your site, you want to be sure that the table structure is laid out in such a way that the spiders can easily, and as quickly as possible, get to the most important content. A great way to attain this is to create your website using the table structure outlined in my article "Table Structures For Top Search Engine Positioning". When the spiders visit your site, they read through it top to bottom, left to right, following the rows and columns. The key to the table structure outlined above is the little empty row. Were this row not there, the spiders would read through the first column, hitting nothing but images and Alt tags (your navigation) before moving on to the next column, your content area. Placing this empty cell in the first row of the main table guides the spiders directly to your content: they hit the empty row and, with nothing to read, move on to the next column to the right, where you want them. After they have read your content, they will move back to the left in row 2 and read your navigation images and Alt tags; finally, they will end the page at your footer, a great place for keyword-rich text links. (Internal linking structures will be covered in part 5 of this 10-part series.)

Once you have created the site structure and inserted all of your content, you will begin the basic optimization of your site. In your code you will want to create Meta tags that fit your keyword choice. The two most important Meta tags are the Description tag and the Keyword tag. Your description should highlight your keyword phrase, keeping it focused, to the point, and readable. Your keyword tag should also be focused, using each keyword a maximum of 3 times in any set. These tags should be customized on each page to fit the specific phrase targeted.

After the Meta tags have been inserted appropriately on each page, it is important to title each page appropriately. The main targeted phrase should be the focus of the title. Keep it simple, focused, and to the point; do not bog it down with extra descriptive text. This is not your description, it is your title.
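Putting the title and the two Meta tags together, the head of a page targeting a hypothetical phrase such as "dedicated hosting" might look like this (the content values are invented for the example):

```html
<head>
  <title>Dedicated Hosting</title>
  <meta name="description" content="Dedicated hosting plans with 24/7 support and guaranteed uptime.">
  <meta name="keywords" content="dedicated hosting, dedicated servers, hosting">
</head>
```

Note that the keyword tag stays within the three-uses-per-keyword guideline noted above.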

Next, move on to Alt tags. Though it is good practice to add Alt tags to all your images, the spiders only put weight on those that are contained within links. An example of this:

<a href=""><img src="/Images/webhead.jpg" alt="Beanstalk Search Engine Optimization" width="461" height="145" border="0"></a>

These Alt tags allow you to make your images matter. Most main navigation is image based, so be sure to add appropriate Alt tags targeting your keywords to this very prominent area of your site. Another great place to add a link, along with its Alt tag, is your header image. Linking this image to your URL makes the first thing the spiders hit within your tables hold at least some content that "matters", rather than simply a static image.

H1 tags are also a great way to add weight to your content; however, use them wisely. You can use any of the H1, H2, H3, and H4 tags, the idea being that H1 carries the most weight, H2 a little less, and so on. Do not overuse these tags or they will lose their value altogether. The correct way to use them is where they actually belong, for example the first line of text on a page: the title. Also, if you are defining your fonts in a style sheet, which you should be, be sure not to abuse these tags; an H1 should be defined bigger than an H2, etc.
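A minimal sketch of that hierarchy (the font sizes are assumptions; the point is only that H1 renders larger than H2):

```html
<style type="text/css">
  h1 { font-size: 1.6em; }  /* most weight, largest */
  h2 { font-size: 1.3em; }  /* a little less */
</style>

<h1>Dedicated Hosting</h1>
<p>Introductory copy about the service.</p>

<h2>Server Specifications</h2>
<p>Details under a sub-heading.</p>
```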

Utilizing the above tips will create a site structure that is the perfect environment for the spiders: clean, focused, and easily read. Your site structure is now optimized and ready for the more advanced content optimization elements to come.

SEO news blog post by @ 4:19 pm on November 7, 2004


Ten Steps To A Well Optimized Website Series

Due to the great interest and feedback we received from our article “Ten Steps To Higher Search Engine Positioning” we decided to cover each of the ten steps in greater detail in a ten part series.

Below you will find links to all ten steps of this series.

The Ten Steps To A Well Optimized Website:

  1. Keyword Selection – October 24, 2004
  2. Content – October 31, 2004
  3. Site Structure – November 7, 2004
  4. Optimization – November 14, 2004
  5. Internal Linking – November 21, 2004
  6. Human Testing – November 29, 2004
  7. Submissions – December 5, 2004
  8. Link Building – December 12, 2004
  9. Monitoring – December 19, 2004
  10. The Extras – December 28, 2004

SEO news blog post by @ 2:13 pm on October 20, 2004


Copyright© 2004-2014
Beanstalk Search Engine Optimization, Inc.
All rights reserved.