
Beanstalk's Internet Marketing Blog

At Beanstalk Search Engine Optimization we know that knowledge is power. That's the reason we started this Internet marketing blog back in 2005. We know that the better informed our visitors are, the better the decisions they will make for their websites and their online businesses. We hope you enjoy your stay and find the news, tips and ideas contained within this blog useful.


August 28, 2013

Link Reduction for Nerds

Let’s face it, even with our best efforts to make navigation clear and accessible, many websites are not as easy to navigate as they could be.

It doesn’t matter if you are a first-page superstar or a mom-and-pop blog with low traffic; most efforts really are no match for the diversity of our visitors.

When I first started blogging on SEO topics for Beanstalk, I put a lot of effort into making my posts as accessible as I could, with a bunch of different tricks like <acronym> tags (now <abbr> tags) and hyperlinks to any content that could be explored further.

Like a good SEO I added the rel="nofollow" to any external links, because that totally fixes all problems, right?

“No... not really.”

External links, when they are actually relevant to your topic and point to a trusted resource, should not be marked as nofollow. This is especially true for discussions or dynamic resources, where you could be referencing a page that was recently updated with information on your topic. In that case you ‘need’ the crawlers to see that the remote page is relevant now.

Internal links are also a concern when they become redundant or excessive. If all your pages link to all your pages, you’re going to have a bad time.

If you went to a big new building downtown and asked the person at the visitors’ desk for directions, and the fellow stopped every few words to explain what he meant by each word, you might never understand the directions, at least not before you were late for whatever destination you had.

Crawlers, even smart ones like Googlebot, don’t really appreciate 12 different URLs on one page that all go to the same place. It’s a waste of resources to keep feeding the same URL to the spiders as a bot crawls each of your pages.

In fact, in some cases, if your pages have tons of repeated links to more pages with the same internal link structures, all the bots will see are the same few pages/URLs until they take the time to push past the repeated links and get deeper into your site.

The boy who cried wolf.

If the wolves hadn’t eaten him, the boy who cried wolf would probably be jumping up and down with another analogy: your competition will just as gladly eat your position in the SERPs if your site keeps sending the crawlers to all the same pages.

Dave Davies has actually spoken about this many times, both on our blog, and on Search Engine Watch: Internal Linking to Promote Keyword Clusters.

“You really only NEED 1 link per page.”

Technically, you don’t actually need any links on your pages; you could just use JavaScript that changes window.location when desired and your pages would still work. But how would the robots get around without a sitemap? How would they understand which pages connect to which? Madness!

But don’t toss JavaScript out the window just yet; there’s a middle ground where everyone can win!

If you use JavaScript to send clicks to actual links on the page, you can mark up more elements of your page without making a spaghetti mess of your navigation and without sending crawlers on repeated visits to duplicate URLs.

“In fact jQuery can do most of the work for you!”

Say I wanted to suggest you look at our Articles section, because we have so many articles, in the Articles section, but I didn’t want our articles page linked too many times?

Just tell jQuery to first find a matching anchor (<a>) element:
jQuery("a[href='/articles/']")

Then tell it to add an ID to that link:
.attr( 'id', '/articles/');

And then tell it to send a click to that ID:
document.getElementById('/articles/').click();

Finally, make sure that your element’s style clearly matches the site’s style for real hyperlinks (i.e. cursor: pointer; text-decoration: underline;).
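Putting those pieces together, here’s a minimal sketch of the whole pattern (this assumes jQuery is loaded, the page contains one real link to /articles/, and the .more-articles trigger class is a hypothetical example):

    // Find the single real link to the articles page and give it a known ID.
    jQuery("a[href='/articles/']").first().attr('id', '/articles/');

    // Let any other marked-up element act like a link without adding
    // another copy of the URL for the crawlers to chew on.
    jQuery('.more-articles').on('click', function () {
        // Forward the click to the one real hyperlink.
        document.getElementById('/articles/').click();
    });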

UPDATE: For Chrome browsers you need to either refresh the page or include the following (PHP shown) in your page header: header("X-XSS-Protection: 0");

SEO news blog post by @ 6:07 pm


 

 

August 16, 2013

SEO concerns for Mobile Websites

You want to serve your clients’ needs regardless of what device they visit your site with, but how do you do it easily without upsetting your SEO?

Let’s look at the various options for tackling mobile sites and what each means in terms of SEO:

Responsive Design:
 
Visual demonstration of responsive web design

  • Responsive design is growing in popularity, especially as communications technology evolves, and bandwidth/memory use is less of a concern.
  • This method also gives us a single URL to work with which helps to keep the sitemap/structure as simple as possible without redirection nightmares.
  • On top of that, Googlebot won’t need to visit multiple URLs to index your content updates.
  • Less to crawl means Googlebot will have a better chance to index more of your pages and get deeper into your site. (A minimal starting point is sketched below.)
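As a point of reference, a responsive page typically starts with a viewport declaration plus CSS media queries along these lines (a generic sketch, not tied to any particular framework):

    <!-- Tell mobile browsers to render at device width instead of a zoomed-out desktop view. -->
    <meta name="viewport" content="width=device-width, initial-scale=1">

    <style>
      /* Default (desktop) layout rules go here. */
      @media only screen and (max-width: 640px) {
        /* Narrow-screen overrides: single column, smaller images, etc. */
      }
    </style>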
“Why is/was there a concern about mobile page size?”

Low-end mobiles, like a Nokia C6 from 4+ years ago (which was still an offering from major telcos last year), typically require that total page data be less than 1 MB in order for the phone to handle the memory needs of rendering/displaying the site.

If you go over that memory limit/tipping point you risk causing the browser to crash with an error that the device memory has been exceeded. Re-loading the browser drops you on the device’s default home-page with all your history lost. I think we could all agree that this is not a good remote experience for potential clients.

Higher-end devices are still victims of their real-world connectivity. Most 3rd generation devices can hit really nice peak speeds, but rarely get into a physical location where those speeds are consistent for a reasonable length of time.

Therefore, even with the latest gee-whiz handsets, your ratio of successfully delivering your entire page to mobile users will be impacted by the amount of data you require them to fetch.

In a responsive web design scenario, the main HTML content is typically sent along with CSS markup that caters to the layout/screen limitations of a mobile web browser. While this can mean omitting image data and other resources, many sites simply attempt to ‘resize’ and ‘rearrange’ the content, leading to very similar bandwidth/memory needs for mobile sites using responsive design approaches.

The SEO concern with responsive designs is that, since the written HTML content is included in the mobile styling, it’s crucial that external search engines/crawlers understand that the mobile-styled content is not cloaking or some other black-hat technique. Google does a great job of detecting this, and we discuss how a bit later on, with some links to Google’s own pages on the topic.

Mobile Pages:

Visual demonstration of mobile web page design

 
If you’ve ever visited ‘mobile.site.com’ or something like that, you’ve already seen what mobile versions of a site can look like. Typically these versions skip reformatting the main site content and they get right down to the business of catering to the unique needs of mobile visitors.

Not only can it be a LOT easier to build a mobile version of your site/pages, but you can also expect these versions to have more features and be more compatible with a wider range of devices.

Tools like jQuery Mobile will have you making pages in a jiffy using modern techniques/HTML5. It’s so easy you could even make a demo image purely for the sake of a blog post! ;)

This also frees up your main site design, so you can make changes without worrying about the impact they have on mobile.

“What about my content?”

Excellent question!

Mobile versions of sites with lots of useful content (AKA: great websites) can feel like a major hurdle to tackle, but in most cases there are some awesome solutions for making your content work in mobile versions.

The last thing you’d want to do is block content from mobile visitors, and Google’s ranking algorithm updates in June 2013 agree.

Even something as simple as a faulty redirect where your mobile site is serving up:
mobile.site.com/
..when the visitor requested:
www.site.com/articles/how_to_rank.html

.. is a really bad situation, and in Google’s own words:

“If the content doesn’t exist in a smartphone-friendly format, showing the desktop content is better than redirecting to an irrelevant page.”
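If you do redirect mobile visitors, the redirect should preserve the requested path, never dump everyone on the mobile homepage. As a rough client-side sketch (hostnames reused from the example above; the user-agent test is illustrative, not exhaustive):

    // Send smartphone visitors to the same path on the mobile host,
    // never to the mobile homepage.
    if (/android|iphone|ipod|blackberry/i.test(navigator.userAgent)) {
        window.location.href = 'http://mobile.site.com' +
            window.location.pathname + window.location.search;
    }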

 
You might think the solution to ‘light content’ or ‘duplicate content’ in mobile versions is to block crawlers from indexing the mobile versions of a page, but you’d be a bit off the mark: you actually want to make sure crawlers know you have mobile versions to evaluate and rank.

In fact, if you hop on over to Google Analytics, you will see that Google is tracking how well your site is doing for mobile, desktop, and tablet visitors:
Example of Google Analytics for a site with mobile SEO issues.

(Nearly double the bounce rate for Mobile? Low page counts/duration as well!?)

 
Google Analytics will show you even more details, so if you want to know how well you do on Android vs. BlackBerry, they can tell you.

“How do the crawlers/search engines sort it out?”

A canonical URL is always a good idea, but using a canonical between a mobile page and the desktop version just makes sense.

A canonical can cancel out any fears of showing duplicate content and help the crawlers understand the relationship between your URLs with just one line of markup.

On the flip-side a rel=”alternate” link in the desktop version of the page will help ensure the connection between them is understood completely.

The following is straight from the Google Developers help docs:

On the desktop page, add:

<link rel="alternate" media="only screen and (max-width: 640px)" href="http://m.example.com/page-1" >

and on the mobile page, the required annotation should be:

<link rel="canonical" href="http://www.example.com/page-1" >

This rel=”canonical” tag on the mobile URL pointing to the desktop page is required.

Even with responsive design, Googlebot is pretty smart: if you aren’t blocking access to resources intended for a mobile browser, Google can/should detect responsive design from the content itself.

Google’s own help pages confirm this and provide the following example of responsive CSS markup:

    @media only screen and (max-width: 640px) {...}

In this example they are showing us a CSS rule that applies when the screen max-width is 640px; a clear sign that the rules would apply to a mobile device vs. a desktop.

Google Webmaster Central takes the information even further, providing tips and examples for implementing responsive design.

Ever wondered how to control what happens when a mobile device rotates and the screen width changes? Click the link above. :)
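For instance, a rotation-aware rule could look something like this (an illustrative sketch; the class name is made up):

    @media only screen and (orientation: landscape) {
      /* e.g. bring the sidebar back when the rotated screen is wide enough */
      .sidebar { display: block; }
    }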

SEO news blog post by @ 3:51 pm


 

 

August 6, 2013

Twitter’s New Anti-Abuse Policies and the Dark Side of Social Media

I won’t lie when I say that one of the best parts of my job is managing social media accounts; it can be legitimately fun, but it’s also a very important illustration of how the Internet affects customer/business interactions. My experience mostly comes from being a voracious and active social media user in my private life; I enjoy a following of 400+ people on Twitter, and I have seen what the network is capable of: live-blogging the Vancouver Olympic opening ceremonies, catching cheating politicians in the act, and spreading the word of everything from hot TV shows to full-blown revolutions. While some might resist it, social media is vital for modern reputation management and customer service; the web has democratized marketing in a very drastic way, making it nearly impossible for a company to cover up substantial issues with their products or service. When you do a great job, you might get the occasional positive mention; when you mess up, your customers will definitely air their grievances. And as a social media user myself, I can vouch for the fact that the public has come to respect businesses that address these issues honestly when they’re contacted about them.

Unfortunately, this democratization has led to some inevitable abuses of the system. In some cases it’s a rival company posting fake reviews in an attempt to discredit the competition; in others, a company (or person) may be the subject of a vicious complaint that goes viral online. Part of online reputation management is being able to mitigate these issues, whether by reporting abuse to site moderators or addressing complaints head-on.

I say all of this because some business owners on desktop and Android platforms may see a new feature on Twitter in the coming weeks: an in-tweet ‘Report Abuse’ button. Currently, users who wish to flag threats must visit the online help center and go through several extra steps to report abuse; the new button will make the process far quicker, and (hopefully) hasten the removal of hate speech. Twitter’s announcement wasn’t just a routine update; it was spurred largely by a British woman named Caroline Criado-Perez, and the flood of horrific rape, violence, and bomb threats she received over the weekend. These weren’t mere trolls; the abuse got so serious that at least one man was arrested on Sunday as a result. What did Criado-Perez do to warrant hundreds of 140-character threats of violence? She campaigned—successfully—for the British government to put author Jane Austen’s face on the new £10 banknote. The threats were also sent to a female Member of Parliament who tweeted her support for the campaign.

If it seems absurd, that’s because it is; this wasn’t a case of radical politics or controversial opinion, but a fairly tame move to represent more British women on currency. The horrifying result was a stark reminder of the abusive power of social media, especially against women and other marginalized groups in society. But even if you’re not an active participant in social issues online, it’s intimidating to realize just how quickly the anonymous web can turn against you. While some have applauded Twitter for finally taking a decisive action to make their website safer for all users, the decision has also drawn criticism from people who have seen how ‘Report Abuse’ functions on other websites have actually been used against legitimate accounts as a form of abuse in and of itself; a group of trolls flagging an account they disagree with can result in that account being suspended by the website, even when the owner hasn’t actually violated any rules.

Of course, the gender politics and personal vendettas of social media are quite a bit more intense than what we do as SEOs to help clients. In terms of reputation management online, the Report Abuse button will likely be a helpful way to ensure that a company doesn’t suffer from malicious treatment. However, it also may be far too easy to report a dissatisfied (and vocal) customer out of sheer frustration. Online reputation is a fickle beast; a few damning reviews can take down an entire small business, and the damage can be very difficult to control—it’s easy to feel helpless when it seems like nothing you do can push down a few dissatisfied customers in favor of the happy ones. Business owners on Twitter should still make it a priority to engage with unhappy customers on a personal level, rather than just report an account because of a particularly bad review—even if it makes the problem temporarily disappear, the Internet is not kind to those types of tactics.

The Criado-Perez debacle over the weekend has shown Twitter’s dark side, particularly when it comes to misogyny and online gender violence. The effect of the new reporting feature remains to be seen in that regard. While smaller businesses on social media may not engage in that debate, it’s a prudent reminder that the web’s anonymity can cause a lot of malicious action in the name of free speech. Reputation management isn’t going to get easier as a result of Twitter’s changes; it will still require a human touch and an honest connection, because that’s what garners respect in the social media sphere. But hopefully this small corner of the web will be a little safer for everyone who uses it, giving people more courage to speak their minds without fear of retaliatory attempts to forcibly silence them.

SEO news blog post by @ 3:14 pm


 

 

July 18, 2013

A Panda Attack

Google today confirmed that a Panda update is rolling out. I find it odd that, after telling webmasters there would no longer be announcements of Panda updates, they made this announcement, and one has to wonder why.

The official message from Google is that this Panda update is softer than previous ones and that new signals have been added. There are webmasters reporting recoveries from previous updates with this one. I would love to hear feedback from any of our blog readers about changes you may have noticed in your rankings with this latest update.

I’ll publish a followup post to this one next week after we’ve had a chance to evaluate the update.

SEO news blog post by @ 10:42 am


 

 

August 31, 2011

Google+ and the Potential Impact on SEO

Although you can only join by invitation at this point, you’ve no doubt heard of Google+, Google’s latest attempt to join (or, in time perhaps, completely overtake?) Facebook and Twitter as a must-have social networking tool. In the months before Google+ was launched, Google also began implementing the “+1” button as a usable option for users to signify that they enjoy a particular site or page, in an attempt to gather as much raw data as possible about the popularity and social value of sites and content before Google+ was rolled out for the masses. Preceding the Google+ and +1 button was the introduction of real-time search, which was able to incorporate search results from Twitter, blogs and Facebook. Google, it would appear, is realizing the immense value of social media and the impact of social media on web search.

Search will continue to have a social element infused into it as the addition of the +1 button will change search results, as will live feeds from Google+ pages, much like Facebook “likes” and Twitter “tweets” are currently affecting search results by influencing user decisions due to their value as endorsements of certain sites and content.

Google definitely wants websites to implement the +1 button in their pages so that they can track and measure changes in click through rates. The +1 button will also be included on all SERPs as well as all Google+ feeds. What this means is business owners and marketers must ensure that a positive customer experience is, perhaps more than ever before, their primary focus in the hope that as many users as possible will +1 their site, and in doing so, endorse their business (and by association, reputation).

While it is plain to see that the introduction of the +1 button was merely a precursor/trial balloon for Google+, the potential impact of the +1 button on search could be the bridge between all of the social oriented sites and tools and ways of doing things on the web and the subsequent influence on search results.

Recently, Rand Fishkin, head of SEOmoz, decided to test some theories on the subject of social sites influencing search results. He shared a number of un-indexed URLs via Twitter both before and after Google had unceremoniously aborted the real-time search results feature. Fishkin repeated the process, only this time using Google+. He then requested that his followers on Twitter and Google+ share the post, with the only caveat being that they were not to share it outside of the originating site.

What this yielded in terms of hard data was that, even though Google has dropped real-time search, re-tweeting and tweets are still assisting page indexation. As for Google+, Fishkin’s test page ended up ranking #1 on Google within a few hours. This illustrates that Google+ can also help pages get indexed, if not quite as quickly as Twitter.

But perhaps the most interesting concept presented by Google+, and one that could potentially have a significant impact on SEO, is the “Google Circles” feature.

The “Circles” feature is interesting because it grants users the ability to share whatever they choose with specific groups, or Circles, of people. As Google+ users build their Circles, they will subsequently be able to see the sites that users in their circles have +1’d in Google’s SERPs. This has enormous potential – users will be far more likely to make a choice or purchase based on the recommendation of people they have invited to their Circles – people who they know and whose opinions they trust. Most users are going to be far more likely to trust the recommendation of someone they know rather than the recommendation or review of a stranger. Over time, Circles will become much more defined as more available user data is integrated into them – using that data to effectively market could be a potentially powerful SEO strategy.

Basically, Google has taken the ideas behind some of their social media competitors’ more influential and successful features in an attempt to make search more about real people. Google+ and the +1 button are enabling users to influence online activity, and, as such, they will have an effect on search results. Many experts are already proclaiming Google+ to have no impact on SEO whatsoever, citing Google Wave and past attempts by Google to get in on the social side of the net as indicators that this new attempt will also fail. While it is far too early to make any kind of definitive statement as to the long-term usefulness or impact of Google+ and the +1 button on SEO, citing past failures as the basis for an argument as to why Google+ is going to fail as well is short-sighted at best. The fact of the matter is, social factors are already intertwined with search, and this is only going to become more prevalent, not less, as these sites expand and the way we interact on the internet continues to evolve. Whether or not Google+ ends up revolutionizing or merely co-existing with established SEO methodology remains to be seen, but the enormous potential of these features and their long-term impact is fairly clear – site ranking methods are changing thanks to the +1 button, and this will likely end up creating an altogether new method of SEO in the future.

SEO news blog post by @ 5:02 pm


 

 

September 21, 2010

Google Instant & SEO

From the moment Google Instant was announced back on September 8 there have been forum chats, blog posts, articles and podcasts discussing the ramifications of this new technology. Some have called it the “Death of SEO” while others (myself included) have proclaimed it a step forward and an opportunity for SEOs, not a threat. And then of course there are those who don’t even know there’s been a change at all; let’s call them “the vast majority”. In this article we’re going to discuss the pros and cons of Google Instant as it pertains to SEOs and to website owners, as well as cover some of the reasons this new technology may not have as large an impact on search behavior as some may fear/predict.

But first, let’s cover the basic question …

What Is Google Instant?

Google Instant is a technology that allows Google to predict what you are looking for as you type. They are referring to it as “search-before-you-type” technology (catchy). Essentially, as I type a phrase (let’s say “buy shoes online”), as soon as I get to “buy sh” I start seeing results for “buy shoes”. As soon as I’ve entered “buy shoes ” (with a space after shoes, indicating I want more than just the two-word phrase) I start seeing results for “buy shoes online”.

Technologically this is genius. Google is now serving likely billions of additional search results pages per day, as each query has multiple result sets that apply to it. Well … I suppose we all wondered what the Caffeine infrastructure update was all about, didn’t we? But what does this do in the real world?

Why Google Instant Isn’t A Big Deal

Alright, obviously it is a significant technological enhancement in search, but the way some react you’d think the whole universe was about to be turned on its head. There are two reasons why that’s not the case.

    1. I find it unlikely that many will notice right away that the change has occurred, and further, I find it even less likely that the majority will use the feature. You see – the major hindrance of this enhancement isn’t in the technology – it’s in the users. Only those who touch type, and can do so without looking at their keyboard, will be affected. If users look at their keyboard while typing, they won’t even notice the results coming in ahead of their actual search.

 

    2. This will only affect users who are searching in instances where the shorter or predicted terms match the user’s end goals. For example, if I am searching for “buy shoes online” and get as far as “buy sh”, the top results are sites which clearly suit the needs of a searcher for “buy shoes online”, and thus this may work to the detriment of sites that rank well for “buy shoes online”, as they may well lose traffic. In the case of a site targeting, oh – I don’t know – “seo consulting”, there will likely be little effect if any. The searcher, looking for an SEO consultant, will find once they’ve entered “seo” that they are presented with Wikipedia and Google – sites that, while informative, don’t offer the services (or results) desired, and thus the searcher would be less affected. Once they proceeded on to enter “seo c”, the searcher would be presented with the results for “seo company”, but I’m prone to believe that if the searcher wanted those results, they would have searched for them. For this phrase I’m confident we’ll see little in the way of negative effect from Google Instant.

So we’ve discussed why Google Instant isn’t a big deal, now let’s discuss …

Why Google Instant Is A Big Deal

On the other side of the coin lie the reasons why Google Instant brings forth a revolution in search technology. Followers of the Beanstalk blog or my radio show on WebmasterRadio.fm (Webcology) will know I’m not one to love everything Google does, but in this case the immediate effects and long-term effects may well be significant, and at the very least one has to appreciate the brilliance behind the effort. In this section of the article we’re going to cover the three important perspectives involved with the launch of this (or any) Google product. They are:

The Searcher – we’ll look at the pros and cons from a searcher perspective. It’s this aspect that will dictate whether the feature will matter at all.

Google – we’ll look at the positive effect on Google. Of course – this aspect is of paramount importance for this feature to be kept.

SEOs – I’m of course incredibly interested and have spent much of my analysis time determining the pros and cons for SEOs (admittedly – there’s more than a bit of self-interest here).

So let’s begin …

Google Instant And The Searcher

This is sort of a win-win for Google from a searcher’s perspective. One of two things will happen for the searcher: either they won’t notice the change or won’t be affected, and thus Google will be exactly where they are now, OR they will notice the change, select results quicker, and find the feature helpful. As I noted – it’s a win-win. There isn’t much of a scenario from a searcher’s perspective where the searcher will be negatively impacted, and if they are, they’d simply revert back to past searching patterns. From the perspective of impact on the user, Google has it made with this feature. Their worst-case scenario is that they’re exactly where they are now.

Google Instant From Google’s Perspective

Any feature added to any corporate system must serve a single primary function – it must make its developer money. We’ve already seen that the feature itself can’t really negatively impact the searcher, but can it make Google money? There are two ways that this can happen:

    1. Improved loyalty and marketshare, and

 

    2. Increased revenue directly from the initiative

Fortunately for Google – they’re going to win on both fronts here, and when we see the Q3 earnings and, more so, the Q4 earnings reports, we’ll begin to see how significant an impact this change will have for them – mainly via the second of the two monetary reward methods noted above. And here’s why …

We’ve already covered the improved loyalty this can have on searchers. Anything that makes my life easier and makes my quest for information faster will make me more loyal. At worst – Google will see my behavior stay the same, but for many, the search experience will become faster and more effective – especially once the technology is improved by user behavior to a degree that people trust it more. Overall there will be a net gain in the experience – we’ve only to wait to see how large that net gain is and how it translates into marketshare. The big win is in the second point.

For anyone who’s ever bid with AdWords, you’ll know that for the most part bids for generic terms are more expensive than bids for very specific terms. If I’m bidding on “shoes” I’m going to pay more than I would for “shoes online”. So let’s view the world where I start showing the results (and paid ads) for “shoes” while someone is searching for “shoes online”. And what if that person sees the ad that was written and bid on for “shoes”, finds it relates to their query, and clicks on it? Google just made more from the paid ad click. Maybe only pennies, but multiply that by billions of searches per day and you’ve got a significant increase in annual revenue.

The move is a huge win for Google, but it does come with a theoretical downside, and that is annoying the businesses that are paying for the ads. The argument I’ve heard is that if businesses find the cost of their campaigns increasing faster than the ROI, they might get annoyed. Fair enough, BUT I would argue – what are they going to do about it? As long as Google maintains the first consideration (the searcher), the advertisers have no choice. They can drop their bids, but at worst they’ll level off to what they were paying for the longtail phrases. Again – worst-case scenario, Google will find themselves where they are today.

Google Instant From The SEO’s Perspective

So let’s assume for a moment that Google Instant is here to stay. Based on all the ways Google and the searchers can win, and the limited situational permutations by which they could only come out even, I’d say that’s a safe assumption. Given this, what happens to SEOs and those optimizing their own websites?

For one thing – we can’t assume that research we did up to and before the 8th will be relevant down the road. I have already scheduled keyword research to be redone in a couple of months to see what industries and search types have been most (and least) affected by this change. The main reason for this is that I have a strong suspicion that specific industries will be more prone to being affected by the change, based mainly on search types (such as the “buy shoes” vs. “seo consulting” example above) and demographics. A Linux developer site is more likely to have a demographic of touch typists who can type without looking at the keyboard than, say, a life insurance site with a more scattered and thus less technically proficient overall demographic.

So in the short term – life is going to be very interesting for the SEO and website owner while we figure out which industries and phrase types are most affected. In a few months, when we see the trends and which phrases are being affected and how, we’ll likely have to make adjustments to many campaigns. The downside for many business owners will be that those whose campaigns focus on longtail phrases may find the search volumes for their phrases decrease, making a shift to more generic (and generally more expensive to attain) phrases necessary. Only time will tell what the best moves are, and we may not know exactly what will shift, and how, for a few months yet; even then we’ll know the trends, not where things will settle (if anything in online marketing can be referred to as “settling” anymore).

If there is a segment that should be concerned about the situation, it is small business owners with limited organic or PPC budgets. Google Instant – because it gives preference to more generic phrases – clearly favors businesses with larger budgets. How much so, we’ll know after we’ve had a chance to see how the search volumes shift. For SEOs this presents two opportunities, and for business owners who do their own SEO it offers one. And here’s the good news for those.

For SEOs you’ll find two new opportunities. The first is that there will be a shift to more generic terms in search volumes. This means that there will be stiffer competition for more competitive phrases. If this sounds like a bad thing, it’s not. If you’re a skilled SEO who knows how to get the job done, it means you’ll have access to larger volumes of traffic without the added effort required to rank for a wide array of phrases. Rather than needing to rank for 10 or 20 phrases to get traffic, you’ll be able to focus in more and reap the same rewards in the way of traffic. On top of that – SEOs will be able to charge more for the rankings, as fewer phrases have a higher value. A win-win for SEOs, and a win for business owners who either do their own SEO or have talented SEOs on staff.

The second opportunity will come in the form of improved clickthrough rates, though I’ll admit – at this point that’s just a theory (noted with a hint sent to Gord Hotchkiss to run eye-tracking tests on it). If I type while looking at my screen and I’m entering “buy shoes online”, and I rank organically or via PPC for both “buy shoes” and “buy shoes online”, I would hypothesize that searchers who complete the phrase “buy shoes online”, having seen the site (or ad) for “buy shoes” appear and then the same site appear for the full query, will have a tendency to click on the familiar. This same principle has been witnessed with sites appearing in both paid and organic results, which see an increase in their organic clickthrough rates. This will present opportunities for both PPC and organic marketers to improve traffic by ranking for specific phrases meant both to attain traffic on their own and to improve traffic for the other. I would suggest that down the road we’ll be hearing of this phenomenon when conducting and discussing keyword research.

Conclusion

There isn’t much to conclude that hasn’t been discussed above. Virtually every party wins, or at worst breaks even, with the introduction of this technology. The only victim appears to be small businesses without the budgets to compete for the more generic phrases, but even they may win with a shift away from these phrases by the larger companies. It may well occur that while the search volume shift heads in favor of large companies with larger budgets, the lower-hanging fruit, while reduced in its search volume, may drop in competition levels too, making it more affordable. Larger businesses may focus like snipers on the larger phrases, and smaller businesses may well be presented with the opportunity to go after more, lesser-searched phrases that aren’t worth targeting for larger companies – at least organically.

But only time will tell and of course – we have much data to collect and many algorithmic updates to come between here and there.

SEO news blog post by @ 4:32 pm

Categories: SEO Articles

 

 

June 8, 2010

Competition Analysis Basics for SEO

In my last article, “Keyword Research Basics for SEO”, I discussed keyword research and the basics of keyword selection. Of course – you can’t solidify your targets until you understand what you’re up against. All the keyword research in the world won’t help you rank for the keyword phrase “windows” in 6 months with a brand new site. So understanding how to analyze your competitors, and how to get a feel for who you can compete with in a reasonable period of time, is paramount to creating a solid strategy. I’ll also be flashing back a bit on keyword strategy.

In the last article we closed with a list of potential keyword phrases, the idea that we needed to divide our phrases into major and longtail phrases, and a new domain (just to keep things realistic). So where do we go from there?

Generally I start at the top. From the highest-searched phrases to the lowest, I do a quick analysis of the major phrases to determine the long-term goals and the short-term ones. I also like to look for what I call “holes”. These are phrases that have competition levels lower than one would expect when looking at the search volume. So let’s use the example I was using in the last article and imagine a US-based downhill mountain bike company. And let’s begin with the major targets.

The phrases we’ll examine for the purposes of this article are the top 10 phrases as ordered by search volume. They are:

  • mountain bike
  • mountain bikes
  • specialized mountain bike
  • trek mountain bike
  • mountain bike frame
  • full suspension mountain bike
  • cannondale mountain bike
  • giant mountain bike
  • mountain bike parts
  • mountain bike reviews

So what are we looking for? It’s obviously not feasible to do incredibly thorough competition analysis at this stage. I’ve listed 10 phrases here, but in reality there are hundreds to consider, and so we need a quick(ish) way to determine the competition levels of phrases. First, let’s install a couple of tools to help you make some quick decisions. You’ll need to install the Firefox browser and the SEO Quake add-on. Now when you run a search you’ll be able to quickly pull the competitor stats. I like to look at the PageRank, links to the ranking page, and sitelinks. Remember now – this is the basic competitor analysis.

Here are the stats for the top 5 ranking sites across the 10 top phrases (I’ll leave out the URLs so there’s no promotion):

Phrase: mountain bike

Site 1 – PR6, 70,268 page links, 71,177 domain links

Site 2 – PR6, 262,609 page links, 290,281 domain links

Site 3 – PR5, 0 page links, 604 domain links

Site 4 – PR6, 101,136 page links, 206,397 domain links

Site 5 – PR5, 741 page links, 118,791,902 domain links

Phrase: mountain bikes

Site 1 – PR5, 33,097 page links, 40,747 domain links

Site 2 – PR6, 42,010 page links, 91,385 domain links

Site 3 – PR6, 262,609 page links, 290,281 domain links

Site 4 – PR6, 101,136 page links, 206,397 domain links

Site 5 – PR5, 25,059 page links, 38,132 domain links

Phrase: specialized mountain bikes

Site 1 – PR6, 101,136 page links, 206,397 domain links

Site 2 – PR1, 1 page links, 206,397 domain links

Site 3 – PR4, 2,001 page links, 2,095 domain links

Site 4 – PR5, 734 page links, 738 domain links

Site 5 – PR2, 4 page links, 230 domain links

Phrase: trek mountain bikes

Site 1 – PR6, 65,464 page links, 178,712 domain links

Site 2 – PR4, 108 page links, 178,712 domain links

Site 3 – PR4, 127 page links, 523 domain links

Site 4 – PR4, 2,001 page links, 2,095 domain links

Site 5 – PR0, 0 page links, 3,854,233 domain links

Phrase: mountain bike frame

Site 1 – PR4, 6,348 page links, 44,535 domain links

Site 2 – PR2, 6 page links, 4,303 domain links

Site 3 – PR4, 196 page links, 523 domain links

Site 4 – PR0, 28 page links, 35 domain links

Site 5 – PR1, 0 page links, 294,361,703 domain links

Phrase: full suspension mountain bike

Site 1 – PR4, 58 page links, 178,712 domain links

Site 2 – PR4, 20 page links, 1,729 domain links

Site 3 – PR3, 7 page links, 9,959,894 domain links

Site 4 – PR5, 240 page links, 290,281 domain links

Site 5 – PR3, 0 page links, 294,362,703 domain links

Phrase: cannondale mountain bikes

Site 1 – PR6, 62,614 page links, 91,301 domain links

Site 2 – PR6, 410 page links, 91,301 domain links

Site 3 – PR4, 0 page links, 2,056 domain links

Site 4 – PR3, 3 page links, 80,580 domain links

Site 5 – PR2, 3 page links, 9,959,894 domain links

Phrase: giant mountain bikes

Site 1 – PR3, 7 page links, 136,232 domain links

Site 2 – PR4, 2,001 page links, 2,095 domain links

Site 3 – PR0, 6 page links, 6 domain links

Site 4 – PR4, 2,262 page links, 2,392 domain links

Site 5 – PR2, 1 page links, 60,131 domain links

Phrase: mountain bike parts

Site 1 – PR4, 610 page links, 2,366 domain links

Site 2 – PR4, 851 page links, 4,303 domain links

Site 3 – PR4, 6,348 page links, 44,535 domain links

Site 4 – PR5, 4,612 page links, 20,931 domain links

Site 5 – PR6, 4,612 page links, 20,931 domain links

Phrase: mountain bike reviews

Site 1 – PR6, 262,609 page links, 290,281 domain links

Site 2 – PR5, 240 page links, 290,281 domain links

Site 3 – PR6, 560 page links, 361,873 domain links

Site 4 – PR5, 0 page links, 604 domain links

Site 5 – PR4, 22 page links, 90,123 domain links

Now, I’d definitely look further down my keyword list than this, but for the purposes of this article let’s assume this is all we have. If that’s the case – what do you suppose would be the primary choice(s)? Were it up to me, I’d go with:

mountain bike frame – we have a range of PageRank, a range of links and a range of sites. Basically – we’re not up against a wall of high competition and the search volume is solid.

full suspension mountain bike – a full range of sites. Higher competition than “mountain bike frame” but we’re looking at a phrase that would sell a whole bike which needs to be considered and a slightly higher competition is thus acceptable.

So of these two phrases what would I do? Well – if this was all we had to work with I’d select “full suspension mountain bike” as the main phrase and follow that up with “mountain bike frame” as a major secondary phrase and thus a prime target for proactive internal page link building and optimization.

So now let’s look at whether there are any good longtail phrases. In this industry we’ll be looking for specific parts. Since going through all the different types of parts would be a nightmare in an article, I’ll focus on a couple of parts I ordered recently: a new handlebar and a new rim. To keep things simple I’m going to focus on just a couple of brands in the research, BUT in reality we’d take the extra time and look into all the part types and all the brands that we’d be able to sell on our site.

So for handlebars, here’s the long and short of the numbers and competition:

Brands researched – origin and easton

“easton handlebars”, with 1,000 estimated searches/mth and low competition outside of the manufacturer, is a great start. Further, when we look up the manufacturer we see that the EA70 and EA90 Easton models are both sought after as well.

When we build our site we obviously want a structure and hierarchy that are conducive to longtail rankings overall, but what we’re looking for here are ideas as to where to put our energies when it comes to content creation and link building. Handlebars look good by search volume. The average sale per item would be around $25.

And now to rims:

Brands researched – mavic and sun

“mavic rims” and “sun rims” both come in at 1,900 estimated searches, but the competition for “sun rims” is significantly lower, with lower link counts and lower-PageRank sites ranking. The average sale here is also going to be in the $40 to $45 range.

Based on this, my first efforts for the whole site would be “full suspension mountain bike” for the homepage, “mountain bike frame” as a major internal page, and I’d focus my first efforts on “rims” (“sun rims” specifically).

Now – we’d of course look further than this, but what we can see is the direction we’d go if all we had to go on was the above data. As noted – were we launching this site we’d look into every brand and every part type, and research further than the top 10 phrases, but that would have made for a book, not an article – and let’s be honest, it would have been a very boring book unless you were planning on launching a mountain bike site.

So now you’ve done enough competition analysis (remember – it’s basic research we’re talking about) to figure out what direction to head in. In my next article I’m going to cover more advanced competition analysis. We’ll go in knowing what we want to accomplish in the way of keywords and be working to map out how to take the top spots.

Until then – get your campaigns sorted out for potential keywords and keep reading … this is where it gets really interesting.

SEO news blog post by @ 2:35 pm

Categories: SEO Articles

 

 

January 25, 2010

How to Write Engaging Blogs People Want to Read 

Thomas Edison famously remarked that genius was “1% inspiration, 99% perspiration.” For bloggers this means that if you put your effort into it, you can create a blog that gathers a following. If you look at a group of bloggers, one with a worldwide following and the rest with small audiences, the former will not necessarily be the best writer, the funniest, the smartest or even the one with the most inside info or useful tips. The great bloggers you follow yourself could have varying amounts of these characteristics.

So what separates the good bloggers from the ones with larger followings? Many call it the “x factor.” Since this is a bit amorphous we’ll touch on it later. You can take your first steps toward creating an engaging blog that builds a loyal following by observing some simple guidelines. There are definitely tips, techniques and tools that will get you there and equip you to compete in the blogging big leagues. We’ll return to the “x factor” after getting you to that starting line.

Audience as foundation

Know your audience. Marshall McLuhan observed almost 50 years ago that the world was transforming into a “global village” through mass communication. The global village is here. People don’t log on to the Internet to be lectured. They log on for information, but also for intelligent dialogue – for exchange, for discussion, for sharing – with people like themselves. Know your audience and the information and conversation they are looking for. You need to engage your readers and speak directly to them with a personal touch, a sense of inclusion, and even a hint of intimacy. Blogs are about relationships, and relationships are about discussions and dialogues of all kinds. The “Monologue Era” is over. Your blog will succeed to the extent that you connect with your audience.

In our Dialogue Era, if you offer people something useful you can become a resource. People bookmark resources and return to them repeatedly, expecting more of the same. Once you have defined your audience you must set about adding value to their visits. Provide information helpful to your audience. Write clearly and don’t try too hard – be natural but concise, instructive but conversational. Produce useful, supportive and brief pieces that people can apply – today, tomorrow, whenever. That will show they can return for more information without wasting their time. Blogs are not articles, so keep them to the point, but do not enforce an arbitrary word limit. Your length will depend on your topic and your audience – make every word count.

Draw them in, move them along

To engage an audience in the first place, craft interesting headlines that invite readers in, and use subheads to move them along and allow them to scan for the specific information they are looking for. The flow is enhanced if you keep sentences shorter rather than longer, and active rather than passive. Don’t posture, pretend, boast or brag, and always maintain a healthy skepticism and sense of humor. You are not writing great literature; you’re helping your neighbor. Finally, always review your output and rewrite where necessary. During this process, make words “pay their rent” by weeding out unnecessary ones.

You have many things to consider, a number of bottom lines – plural. Bottom line: You need to read about writing, learn how to edit and refine your technique over time. Bottom line: You need to learn the particular writing techniques that have evolved around blogs, like how to craft good bullet points, when to use them, how to use the page layout to your advantage and so forth. Bottom line: You have to continue reading your competition and your colleagues, often one and the same, and analyze what works and what doesn’t. Bottom line: There are a lot of bottom lines in blogging.

Go forth and blog

Coming full circle, then, let’s consider that “x factor” again. Although it’s not possible to define it quite precisely, we know where it is located. It is in you. It is your personality, your spark, your unique outlook. Be yourself, not what you think they want you to be. In that jigsaw puzzle that is “you” there are many traits and abilities, opinions and truisms, dreams and fears, and the sum total of them all is what adds up to “you” – and no one else – and your own real personality coming off the page is often what engages people. How can you inject “you” into your writing? There’s only one way to draw it out, of course, and that is to write.

Since you are forming relationships, do what Dale Carnegie advised about 80 years ago and ask small favors of your readers. Invite their comments. Ask for their opinion. Encourage them to express their point of view. This tells them you value what they think. More importantly, it engages them and makes them a valuable active participant (instead of a passive visitor), a member of your community, and part of an ongoing and growing dialog. This is what will lead many of them to make the all-important cognitive leap that will have them bookmark your blog, link to your posts, tell all their friends about it and continue the dialog. The leap occurs when readers stop thinking of themselves as readers, and start thinking of themselves as “stakeholders” – readers that interact with you.

If you can convert readers into stakeholders, you’re on your way.

SEO news blog post by @ 2:20 pm


 

 

July 14, 2009

How To Search Engine Optimize (SEO) an AJAX or Web 2.0 Site

One of the three major pillars of Search Engine Optimization is a website’s content and onsite content optimization. All of the major search engine ranking algorithms have components that relate to the content contained on the website. Typically these components relate to keyword densities, number of words, content location, and sometimes age of content. The code that the content is contained in falls under the topic of structure, not content, and will not be discussed in this article.

Asynchronous JavaScript and XML (AJAX) is an advanced web development method which can be used to create more responsive and interactive dynamic websites. AJAX accomplishes this by making object request calls back to the web server without having to refresh your browser; these object calls are then processed and are typically used to update the content of the page that is currently being viewed. For the sake of this article I’m going to ignore the XML component of AJAX, as the search engines never view any of the XML data. Websites that use JavaScript to manipulate content without using AJAX will also suffer from the issues described.

When a search engine sends out a robot/spider to visit your website with the goal of indexing your content, it is only looking at what is being presented in the markup language. Generally a search engine does not behave like a user when indexing your website; it doesn’t click buttons or links, it simply makes note of the URLs associated with each page and then individually visits those pages to index them. This largely goes against the goal of AJAX, which is to have as few pages as possible by interacting with the web server in a smarter way as users interact with the website.

To put the last paragraph simply: any content that is changed via AJAX or JavaScript on a webpage, and that is not hardcoded in the page, won’t be cached by the search engines. This essentially means that if you have great content that the search engines might love, but you’re using AJAX, you may be missing out on traffic. There are two approaches to rectifying this, which may even give you an advantage over sites that don’t utilize JavaScript/AJAX.

The first approach is to make sure that your website degrades to normal flat markup for non-JavaScript-capable browsers and search engines. Essentially, every time you would have used an AJAX call, make sure you have a page with the same content. Unfortunately, for a lot of people this could mean a lot of work; for those individuals using a database with PHP or ASP it is not too hard to build a site that builds itself with some effective web programming.
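For example, here’s a minimal sketch of that idea (the URL, class name and element ID are hypothetical, and jQuery is assumed): the href points at a real, crawlable page, and JavaScript upgrades the click to an AJAX call for capable browsers:

    <a href="/parts/handlebars.html" class="ajax-nav">Handlebars</a>

    <script>
    // Browsers with JavaScript fetch the new content in place...
    jQuery('a.ajax-nav').click(function (e) {
        e.preventDefault();
        jQuery('#content').load(this.href + ' #content');
    });
    // ...while search engine spiders simply follow the plain link.
    </script>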

The second approach is to use AJAX in a more minimalist fashion. The goal here is to present the search engines with your optimized content while making sure that any AJAX calls a user makes have no bearing on what you want the search engines to see. In fact, this can be used to remove content from your website which may negatively affect your rankings, such as testimonials. I’ve seen very few testimonials that actually do good things for a site’s keyword density; I’ve even been known to optimize testimonials on clients’ websites. With JavaScript/AJAX you could insert a random testimonial into a page without affecting that page’s keyword density. The only downside to this approach is that some offsite keyword density tools actually use web browser rendering engines, so they may get false results as they take the JavaScript into account.
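A rough sketch of the random-testimonial trick (the testimonial text and element ID are made up; jQuery is assumed again):

    <div id="testimonial"></div>

    <script>
    // The testimonials live only in JavaScript, so spiders never index
    // them and they can't dilute the page's keyword density.
    var testimonials = [
        'Great bike, arrived fast!',
        'Support was excellent, would order again.'
    ];
    jQuery(function () {
        var pick = testimonials[Math.floor(Math.random() * testimonials.length)];
        jQuery('#testimonial').text(pick);
    });
    </script>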

Now you may think that I’m anti-AJAX from everything I’ve said, but there is a time and place for AJAX, provided it doesn’t affect how the search engines see the beautiful, relevant content you’re trying to rank. AJAX is great for member sections of your website, interactive forms, slideshows, and a lot more; it just needs to be leveraged correctly to avoid missing out on search engine visitors. The final thing to keep in mind is that most search engines like to see more than a single-page website, which many AJAX websites appear to be; always strive for at least 5 or more indexable pages, as internal links and anchor text can have a lot of value.

SEO news blog post by @ 2:13 pm


 

 

February 27, 2009

An Introduction To SEO

Welcome to Daryl Quenet’s introduction to Search Engine Optimization (SEO), optimizing design, and how to maximize your website’s search engine positioning for the major search engines.

When it comes to running an effective website that ranks well on the search engine results pages (SERPs), there are three major factors that can influence the number of search engine referrals (incoming searches) you get. This applies to all the major search engines (Google, Yahoo, MSN, and Live).

Content Is King

The most important thing is the content on your page. Regardless of how much time you put into Search Engine Optimization (SEO) for your website, without the content people are searching for you will find very little return on your efforts.

Involved with the preparation of your content is analyzing the keyword(s) for your given industry. Just putting keywords in the keywords meta tag will get you nowhere without those keywords existing in your content. This is known as Keyword Density: basically, the more often your keywords appear, the more relevant your content is for the searcher in the eyes of a search engine. Keep in mind an ideal density is around 3.5% per word in your phrase.
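To make that density figure concrete, here is a rough sketch of the usual calculation (the sample text is made up):

    // density = (keyword occurrences / total words) * 100
    function keywordDensity(text, keyword) {
        var words = text.toLowerCase().split(/\s+/);
        var hits = 0;
        for (var i = 0; i < words.length; i++) {
            if (words[i] === keyword) hits++;
        }
        return (hits / words.length) * 100;
    }
    // keywordDensity('trek mountain bikes and mountain bike frames', 'mountain')
    // returns roughly 28.6 (2 hits out of 7 words)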

When writing your Search Engine Optimized content, don’t forget about the end user. If you can’t get your keyword densities bang on, then don’t worry about it. I prefer to have a lower density but higher-quality content for the end user than spammy content and a lower conversion rate. The end goal is still to convert your visitors to your products, services, or whatever your goal may be. Users, unlike search engines, are not interested in keyword density, so beware of keyword spam.

A final note on content for this introduction: it is advisable to update your content regularly. The longer your content goes without updates, the staler it gets, and the further your search engine positioning will drop. With enough link building, however, this can be offset.

Link Building Your Way To Success

Link building is easily the second most important factor in SEO, and in some cases the most important one. Building links into your website is the only way you, as a webmaster, can affect the authority of your website and the value your existing content has in the eyes of the search engines.

To conceptualize link building, think of your website as if it were a person. The more popular a person is, the more authoritative what they have to say is to their target audience. The big difference is that our target audience is Google and the other major search engines, and having quality links on other sites equates to your website’s “popularity”.

Now keep in mind when you start your link building that almost no two links are worth exactly the same. When Google calculates the value of a link it looks at several important things to figure out just how much strength to give you. Here are just a few:

  1. How much strength the page with the link has
  2. The number of external links on that page
  3. The anchor text used for the link
  4. Whether a rel="nofollow" attribute is used
  5. How long the link has been there

Now keep in mind that all of the factors above are irrelevant if Google hasn’t cached the page with the link; if Google hasn’t found it, it is worth nothing. The stronger the page your link is on, the more strength you will get in return. The more outgoing links there are on a page, the more that strength will be divided between all the linked sites.
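As a simplified illustration only (this is the damping-factor model from the original PageRank paper, not Google’s actual live formula):

    <script type="text/javascript">
    // Each link passes roughly pageStrength * damping / outgoingLinks
    function linkValue(pageStrength, outgoingLinks, damping) {
      damping = damping || 0.85;
      return (pageStrength * damping) / outgoingLinks;
    }
    // A page of strength 6 with 10 external links passes linkValue(6, 10),
    // i.e. 0.51, to each site it links to; the same page with 100 external
    // links would pass only 0.051 each.
    </script>

The exact numbers are made up; the point is that every extra outgoing link dilutes what your link receives.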

A link with a rel="nofollow" attribute is virtually useless to your website other than increasing your overall link count to give your competitors a scare. You will mainly find nofollow attributes on blog comments, website advertisers / sponsors, paid links, or links to competitors (I use them on my resume for past work experience).
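For reference, the attribute sits on the anchor tag itself (the URL here is just a placeholder):

    <!-- This link passes little or no strength to the target site -->
    <a href="http://www.example.com/" rel="nofollow">A paid or untrusted link</a>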

When a link is built, very few search engines will give you the full strength of that link right away. This is done to maintain the quality of the SERPs; if everyone could just go out, build thousands of links, and rank immediately, there would be no quality left in the results. Instead they slowly give you more strength as the links age, up until around the six-month mark.

Lastly, you will constantly see something called Google PageRank. PageRank is an arbitrary Google measurement assigned to a website / page to denote that page’s strength. Some people consider this measurement to be the be-all and end-all, but in truth it means very little other than serving as an indicator of your site’s health. If you have PageRank on your homepage as well as on most of your internal pages, you’re off to a good start. Also keep in mind that PageRank only updates every 3–6 months, and ultimately the proof is in the search engine results, not some number in the toolbar.

* It’s important to note that when I refer to PageRank above, I mean the visible PageRank displayed in the little green toolbar, not the actual PageRank that Google uses internally to calculate the value of a page.

Optimize Your Website Navigation Structure & Design

I purposely left site structure for last, as it can be the quickest way to royally mess up your website’s rankings. In the worst case, bad structure means no part of your website gets cached and you see no visitors. I’ve seen a lot of sites with issues that kept the search engines from crawling them at all. Some of the worst yet simplest structural issues that can affect your visibility to search engine crawlers are:

  1. Automatically redirecting all visitors that come to your site to another page.
  2. Using HTTPS only
  3. Purely JavaScript-based navigation

On other sites I have seen Google cache only the index page, which may even have an assigned PageRank, without spidering the rest of the website. The things to remember when mapping out the structure of your website are:

  1. At all costs avoid dynamic URLs (i.e. index.php?PageId=1); a dynamic URL is a URL that contains HTTP GET variables. Search engines don’t tend to spider these sites well, and to users they don’t convey any information relevant to their queries. Try to use page keys that contain your keywords. If you need dynamic scripts to build your website (i.e. through a Content Management System), use Apache mod_rewrite to build a static-in-appearance website (see the sketch after this list). If you have to use dynamic URLs, keep the number of variables to no more than two.
  2. If possible, use the keywords you are targeting for your industry in your URLs, files, and directories. This helps increase your keyword density, as well as showing users clicking through on Google information relevant to their query right in your file names.
  3. Don’t constantly change your website structure. PageRank naturally takes time to develop, and Google holds new sites back in a sandbox. By renaming a page you can often kiss your pre-existing search engine positioning goodbye until the renamed page’s rank is redeveloped.
  4. When designing a new site, try to avoid filenames with extensions in the URL (i.e. Products.asp); this can limit your options in the future if you change programming languages (i.e. ASP to PHP), as well as the platforms your website can be hosted on (i.e. Windows vs. Linux hosting).
  5. When implementing a new structure or a new site, create a Google Sitemap and register it with Google to let Google know what to index.
  6. Whenever possible, attach CSS and JavaScript as external files.
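As promised in point 1 above, here is a small .htaccess sketch (the paths and PageId values are hypothetical) showing Apache mod_rewrite mapping keyword-friendly, static-looking URLs onto a dynamic script:

    # Serve keyword-rich URLs from the real dynamic pages behind the scenes
    RewriteEngine On
    RewriteRule ^seo-services/?$ index.php?PageId=1 [L]
    RewriteRule ^link-building/?$ index.php?PageId=2 [L]

Visitors and spiders see /seo-services/, while your CMS still receives its PageId variable.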

Once you have decided on a website structure, or you have a pre-existing structure, the best way to score higher search engine positions is minimalist HTML coding that maximizes your content-to-markup ratio. The best way to minimize the amount of HTML required is to use Cascading Style Sheets (CSS). CSS lets you pull the design out of your HTML pages and place it in a separate file. Not only does this remove a lot of HTML if you were using tables for layout, it also makes maintenance a lot simpler, as all your design changes are made in one place.
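A tiny before-and-after sketch (the class name and colours are invented) of the kind of markup this saves on every single page:

    <!-- Before: table-based layout repeated in every document -->
    <table width="100%" cellpadding="0" cellspacing="0">
      <tr><td bgcolor="#336699"><font color="#ffffff">Welcome</font></td></tr>
    </table>

    <!-- After: one line of HTML, styled from an external stylesheet -->
    <div class="banner">Welcome</div>

    /* styles.css, linked once in the page head */
    .banner { width: 100%; background: #336699; color: #ffffff; }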

When I moved my website from a table-based layout to Cascading Style Sheets, I managed to reduce my markup by around 60%! If you have a very large site this can be even more beneficial, as some search engines limit the amount of disk space they will allocate to caching your website, and slimmer markup also raises your content higher up in the document.

Conclusion

And thus concludes my introduction to Search Engine Optimization (SEO). It may be a little long-winded, but it is really just a small part of what goes into successfully positioning your website on the search engines. I’ll finish with one last warning: do not buy or sell links, as you can easily be penalized right out of the SERPs for it (Google supplies a page for reporting websites that buy and sell links). Good luck with your search engine result pages and positioning!

SEO news blog post by @ 5:10 pm