
An Introduction To SEO

Welcome to Daryl Quenet’s introduction to Search Engine Optimization (SEO): what it is, how to optimize your design, and how to maximize your website’s positioning on the major search engines.

When it comes to running an effective website that ranks well on the search engine results pages (SERPs), there are three major factors that can influence the number of search engine referrals (incoming searches) you get. This applies to all the major search engines (Google, Yahoo, MSN, and Live).

Content Is King

The most important thing is the content on your page. Regardless of how much time you put into Search Engine Optimization (SEO) for your website, without the content people are searching for you will see very little return on your efforts.

Part of preparing your content is analyzing the keyword(s) for your given industry. Just putting keywords in the keywords meta tag will get you nowhere if those keywords don’t also exist in your content. This is where keyword density comes in: the more often your keywords appear, the more relevant your content is for the searcher in the eyes of a search engine. Keep in mind an ideal density is around 3.5% per word in your phrase.
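
To make the math concrete, here is a minimal sketch (Python, with invented sample text) of how keyword density is typically calculated; the 3.5% figure above is the author’s rule of thumb, not something this snippet enforces:

  import re

  def keyword_density(text, phrase):
      """Share of the page's words taken up by occurrences of the phrase."""
      words = re.findall(r"[a-z0-9']+", text.lower())
      target = phrase.lower().split()
      n = len(target)
      if not words or n == 0:
          return 0.0
      # Count the phrase wherever it appears as a consecutive run of words.
      hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == target)
      return 100.0 * hits * n / len(words)

  # Tiny sample, so the density is unrealistically high; real page copy runs much lower.
  page_copy = "Search engine optimization takes time. Good search engine optimization starts with content."
  print(round(keyword_density(page_copy, "search engine optimization"), 1))  # 50.0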

When writing your search engine optimized content, don’t forget about the end user. If you can’t get your keyword densities bang on, don’t worry about it. I prefer a lower density with higher quality content for the end user over spammy content and a lower conversion rate. The end goal is still to convert your visitors to your products, services, or whatever your goal may be. Users, unlike search engines, are not interested in keyword density, so beware of keyword spam.

A final note on content for this introduction: it is advisable to update your content regularly. The longer your content goes without updates, the staler it gets, and the lower your search engine positioning will drop. With enough link building, however, this can be offset.

Link Building Your Way To Success

Link building is easily the second most important factor in SEO, and in some cases the most important. Building links into your website is the only way you, as a webmaster, can affect the authority of your website and the value your existing content has in the eyes of the search engines.

To conceptualize link building, think of your website as if it were a person. The more popular a person is, the more authoritative what they have to say is to their target audience. The big difference is that our target audience is Google and the other major search engines, and having quality links on other sites equates to your website’s “popularity”.

Now keep in mind when you start your link building that almost no two links are worth exactly the same. When Google calculates the value of a link it looks at several important things to figure out just how much strength to give you. Here are just a few:

  1. How much strength the page with the link has
  2. The number of external links on the page
  3. The anchor text used for the link
  4. Whether a rel="nofollow" attribute is used
  5. How long the link has been there

Now keep in mind that all of the factors above are irrelevant if Google hasn’t cached the page with the link; if Google hasn’t found it, it is worth nothing. The stronger the page your link is on, the more strength you will get in return. The more outgoing links there are on that page, the more that strength will be divided between all the linked sites.
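
As a rough illustration of those two effects (page strength and outgoing-link count), here is a toy Python sketch in the spirit of PageRank’s even split; the damping factor and numbers are illustrative assumptions, not Google’s actual formula:

  def link_value(page_strength, outgoing_links, nofollow=False, damping=0.85):
      """Toy model: a page's strength is damped, then split evenly
      across its outgoing links; a nofollow link passes nothing."""
      if nofollow or outgoing_links == 0:
          return 0.0
      return damping * page_strength / outgoing_links

  print(link_value(8.0, 5))                 # 1.36  - strong page, few links
  print(link_value(8.0, 100))               # 0.068 - same page, diluted 100 ways
  print(link_value(8.0, 5, nofollow=True))  # 0.0   - nofollow passes nothing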

A link with a rel=”nofollow” attribute is virtually useless to your website, other than increasing your overall link count to give your competitors a scare. You will mainly find nofollow attributes on blog comments, website advertisers / sponsors, paid links, or links to competitors (I use them on my resume for past work experience).

When a link is built, very few search engines will give you the full strength of that link right away. This is done to maintain the quality of the SERPs: if everyone could just go out, build thousands of links, and rank immediately, there would be no quality to the search results. Instead the engines slowly give you more strength as the links age, up until around the six month mark.

Lastly, you will constantly see something called Google PageRank. PageRank is an arbitrary Google measurement assigned to a website / page to denote that page’s strength. Some people consider this measurement to be the end-all-be-all, but in truth it means very little other than being an indicator of your site’s health. If you have PageRank on your homepage as well as on most of your internal pages, you’re off to a good start. Also keep in mind that PageRank only updates every 3 – 6 months, and ultimately the proof is in the search engine results, not some number in the toolbar.

* It’s important to note that when I’m referring to PageRank above I’m referring to the visual PageRank displayed in the little green bar, not the actual PageRank that Google uses internally to calculate the value of a page.

Optimize Your Website Navigation Structure & Design

I purposely left site structure for last, as it can be the quickest way to royally mess up your website rankings. In the worst case, no part of your website gets cached and you see no visitors. I’ve seen a lot of sites with issues that prevent search engines from crawling them at all. Some of the worst yet simplest structural issues that can affect your search engine crawler visibility are:

  1. Automatically redirecting all visitors that come to your site to another page.
  2. Using HTTPS only
  3. Pure Javascript based navigation

On other sites I have seen Google cache only the index page, which may have an assigned PageRank, without spidering the rest of the website. The things to remember when mapping out the structure of your website are:

  1. At all costs avoid dynamic URLs (ie index.php?PageId=1); a dynamic URL is a URL that contains HTTP GET variables. Search engines don’t tend to spider these sites well, and to users they carry no information relevant to their queries. Try to use page keys that contain your keywords. If you need dynamic scripts to build your website (i.e. through a Content Management System), use Apache mod_rewrite to build a static-looking website (see the sketch after this list). If you have to use dynamic URLs, keep the number of variables to no more than 2.
  2. If possible, use the keywords you are targeting for your industry in your URLs, files, and directories. This helps increase your keyword density and gives users clicking through from Google file names relevant to their query.
  3. Don’t constantly change your website structure. PageRank naturally takes time to develop, and Google holds new sites back in a sandbox. By renaming a page you can often kiss your pre-existing search engine positioning goodbye until the renamed page’s rank is redeveloped.
  4. When designing a new site try to avoid having filenames with extensions in the URL (ie Products.asp), this can limit your options in the future if you change programming languages (ie ASP to PHP), as well as the platform your website can be hosted on (ie Windows vs Linux Hosting).
  5. When implementing a new structure or new site, create a Google sitemap, and register it with Google to let Google know what to index.
  6. Whenever possible, attach CSS and Javascript as external files.
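
As promised in point 1, here is a minimal .htaccess sketch of the mod_rewrite approach; the URL pattern and the query parameters (category, page) are hypothetical, so adapt them to whatever your CMS actually expects:

  # Map keyword-rich, static-looking URLs like /widgets/blue-widget/
  # onto a dynamic CMS script without exposing GET variables to spiders.
  RewriteEngine On
  RewriteRule ^([a-z0-9-]+)/([a-z0-9-]+)/?$ index.php?category=$1&page=$2 [L,QSA]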

Once you have decided on a website structure, or you have a pre-existing structure, the best way to score higher search engine positions is minimalist HTML coding that maximizes your content-to-markup ratio. The best way to minimize the amount of HTML required is to use Cascading Style Sheets (CSS). Cascading Style Sheets let you pull the design out of your HTML pages and place it in a separate file. Not only does this remove a lot of HTML if you were using tables for layout, it makes maintenance a lot simpler, as all your design changes are made in one place.
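
A minimal before-and-after sketch of what that looks like in practice (the file names are placeholders, not from the article):

  <!-- Before: presentation mixed into the markup on every page -->
  <p><font face="Arial" color="#333333" size="2">Blue widgets ...</font></p>

  <!-- After: design and behaviour pulled out into external files (point 6 above) -->
  <head>
    <link rel="stylesheet" type="text/css" href="styles.css" />
    <script type="text/javascript" src="scripts.js"></script>
  </head>
  <body>
    <p class="copy">Blue widgets ...</p>
  </body>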

When I moved my website from a table based layout to Cascading Style Sheets I managed to reduce my markup code by around 60%! If you have a very large site this can be even more beneficial, as some search engines limit the amount of hard drive space they will allocate to caching your website; trimming the markup also raises your content higher up in each document.

Conclusion

And thus concludes my introduction to Search Engine Optimization (SEO). It may have been a bit long winded, but that is really just a little of what goes into successfully positioning your website on the search engines. I’ll finish up with one last warning: do not buy or sell links, as you can easily be penalized right out of the SERPs for this (Google supplies a page for reporting websites that buy and sell links). Good luck on your search engine result pages and positioning!

SEO news blog post published at 5:10 pm on February 27, 2009



Ecommerce & SEO

The purpose of any business website is to promote a product or service online. The purpose of an ecommerce website is to take it one step further and allow your visitors to purchase your products or services directly from your website. This model has many great advantages over the non-ecommerce website: it allows for the generation of revenue with little or no selling time beyond the cost of having the website designed and maintained, and it does not require the visitor to call you during business hours, which helps secure the sale from an impulse buyer. If your website provides all the information a buyer would want, you can save significant money in sales time: the visitor can find everything they need to decide to buy from you without taking up your time or that of your sales staff. But ecommerce sites have a serious drawback as well; very few of them can be properly indexed by search engine spiders, and thus they will fail to rank highly.

A non-ecommerce website may have the disadvantage of not being able to take the visitor’s money the second they want to spend it; however, if it can be found on the first page of the search engines while your beautifully designed ecommerce site sits on page eight, the advantage is theirs. The vast majority of visitors will never get to see your site, let alone buy from you, whereas a non-ecommerce site may lose sales because it doesn’t sell online, but at least it’s able to deliver its message to an audience to begin with. So what can be done? The key is in the shopping cart you select.

SEO & Shopping Carts

The biggest problem with many SEO-friendly ecommerce solutions is that the SEO is bolted on after the initial product. Shopping cart systems such as Miva Merchant and OS Commerce were not designed with the primary goal of creating pages that will be well received by the search engine spiders. Most shopping cart systems out there today are not in-and-of-themselves even spiderable, and require 3rd party add-ons to facilitate even the lowest form of SEO-friendliness. The money you may have saved in choosing an inexpensive shopping cart may very well end up costing you your business in the long run, especially if you are using your shopping cart as the entire site, which we have seen many times in the past.

What Can Be Done?

There are essentially two solutions to this problem. The first is to create a front-end site separate from the shopping cart. What this will effectively do is create a number of pages that can be easily spidered (assuming that they’re well designed). The drawback to this course of action is that your website will forever be limited to the size of the front-end site. Which brings us to the second option: choose a search engine friendly shopping cart system.

Finding an SEO-friendly shopping cart system is far easier said than done. There are many factors that have to be taken into account including the spiderability of the pages themselves, the customization capacity of the individual pages, the ease of adding products and changing the pages down the road, etc. While I’ve worked with many shopping cart and ecommerce systems, to date there has been only one that has truly impressed me in that it is extremely simple to use, it allows for full customization of individual pages and the product pages get fully spidered to the point where they have PageRank assigned. A rarity in the shopping cart world.

Easy As Apple Pie

Mr. Lee Roberts, President of Rose Rock Design and creator of the Apple Pie Shopping Cart, was kind enough to take the time to speak with me regarding how he developed his system. Trying to get an understanding of how this system was born I inquired as to what differentiated their system from others. Without “giving away the farm”, Lee pointed out that his system was unique in that the search engines were a consideration from the birth of this project. Rather than trying to jerry-rig a system that was already in place, he initiated the development of a system whose first task was to allow for easily spidered and customized pages. A significant advantage to be sure.

In further discussions he pointed out a few key factors that should be considered by all when choosing a shopping cart system. While more advanced shopping cart systems that provide SEO-friendly pages may seem more expensive, they save you the cost of developing a front-end site and of maintaining the pricing on a static page if you go that route. And of course, if all your site’s pages are easily spidered, you can have hundreds of additional relevant pages adding to your site’s overall strength and relevancy – a serious advantage in the SEO “game”. If a shopping cart system costs you an extra $100 per month to maintain but its use provides you with an additional $5,000 in sales that month, did it really “cost” you $100?

Conclusion

This is not to say that the Apple Pie Shopping Cart is the end-all-be-all of SEO for an ecommerce site; if it were, Lee wouldn’t be in the process of building a new version that will include many new features for Internet marketing and tracking, and we would be out of work. That said, if you’ve got an ecommerce site or are looking to have one built, consider what type of marketing strategy will be taken with the site and, if SEO is one of those, ensure you find a system that provides the same advantages as this one.

It may cost a bit more up front, but doing it right the first time is far less costly than building a site that can’t be marketed properly and to its maximum potential.

SEO news blog post published at 3:46 pm



What In The World Is Net Neutrality?

I had the great pleasure and privilege of speaking at Search Engine Strategies 2008 in San Jose. The topic? Net neutrality. This is the point where your eyes glaze over and the inevitable question, “What is net neutrality?” comes forth. And that’s the point of this article.

The session had a very low turnout, and Kevin Ryan, the organizer of SES, was there asking everyone, “What could we do to increase awareness and attendance?” The issue is important to Kevin, important to me and, in fact, important to anyone who makes their living on or uses the Internet.

So What Is Net Neutrality?

Prior to speaking I would periodically get asked, “So, what panel are you on?” When I replied I would generally get a blank stare in return. It appears that the vast majority of the population, even the educated, Internet-savvy population, doesn’t understand the debate over net neutrality (or even know there is a debate to begin with). The fault for this lies with me and with others who do understand the importance of the issue but have failed to communicate that passion to others.

The idea of net neutrality seems a simple one. It is the idea that all those little 1s and 0s that float around (let’s call them web pages, downloads, videos and emails) should be treated equally and that under no circumstances should the ISPs be allowed to adjust, degrade or otherwise affect them other than to pass them to the requesting party. Seems simple enough right? So what’s the debate?

The debate is hugely complicated, with the providers claiming that they need to manage traffic to provide a solid experience for all, and with net neutrality advocates claiming that the ISPs are destined to abuse the ability to manage traffic and that a huge array of issues will likely follow if we give them that ability. They claim that legislation is necessary to ensure that all traffic is treated equally and that no site, download or other traffic source is adversely affected based on its content type, origin or requesting user.

Again, on the surface it seems fairly simple. The “greedy ISPs” are looking to gouge users and degrade our service and we need the government to protect us. At first glance that’s how I saw it too.

Why Does It Matter?

Depending on which camp you’re in the reasons are different, but the message is the same – the wrong decision is going to have widespread effects on how the Internet grows and how users and their 1s and 0s are treated. Basically, this issue will determine the health and growth of the Internet. To be sure, the Internet will survive regardless, but the questions remain:

  • How will we access it?
  • How will it be charged?
  • How fast will it be?
  • How fast will it grow?
  • Who will have access?
  • What will we be able to access?

Basically, the entire future of the Internet is carried on the back of this issue. An issue that most people aren’t even aware of and even those who do know are having trouble determining which side is actually the correct one.

Those opposing net neutrality legislation could refer to it as a solution looking for a problem, showing clear examples of how issues are being dealt with under current legislation and asserting that any additional legislation will restrict future enhancement. The solution itself may become a problem if it is too broad-reaching.

Supporters, on the other hand, would point out that the ISPs are self-serving corporations and that, to that end, consumers need protection – that not taking action now will result in a scenario where the abuses take place and there is no institutionalized solution.

The main points of both sides are outlined below.

In The Debate …

My session at the conference was constructed as a debate with Cindy Krum from Blue Moon Works moderating. In the debate I was up against my good friend and the co-host on my radio show Webcology on WebmasterRadio.fm, Jim Hedger of Markland Media. He took the side supporting net neutrality and I opposed it. In truth, we each see both sides of the argument – it’s that kind of issue.

Pro-Net Neutrality Legislation

Jim brought up many good points in his presentation. He illustrated the abuses that have taken place recently, including ComCast’s blocking of torrent seeding. For those of you unfamiliar with torrents, they are a peer-to-peer file sharing format. ComCast allowed the download of a file, but once it was downloaded they blocked the user from seeding it. ComCast claims this was an effort to reduce the effect these users were having on the network.

In August 2008 the FCC stepped in and forced ComCast to cease these actions, enforcing the idea that it could not discriminate based on file type and degrade the service. Jim and other net neutrality advocates claim this as a victory.

In 2007 Verizon blocked pro-abortion text messages to a legitimate list of recipients. Reversing this policy didn’t require legislation or laws, however; public outcry alone forced the change.

Jim painted a bleak future if net neutrality legislation is not passed. A future where ISPs degrade specific websites, provide preferential treatment of other sites based on a payment structure, and promote their own self-interests and web properties through degradation of the alternatives. He would assert that smaller businesses will suffer, unable to pay the fees required to compete with the “big boys”.

These are the common concerns among net neutrality advocates.

Anti-Net Neutrality Legislation

I presented the arguments opposing net neutrality legislation. It was a tougher stance with the room (albeit small) against me from the beginning but the points are legitimate nonetheless – there are solid concerns against net neutrality legislation. In the end however I isolated two main points that are clear.

The first point I brought up was that of the current legislation. As noted above regarding ComCast, there is existing legislation to protect consumers and this legislation works. That is where the argument, “Net neutrality is a solution looking for a problem,” comes from.

In fact, some might argue that even the current legislation is too much and we only need to view the ComCast decision to witness why this might be. ComCast is not allowed to block torrent traffic. Due to this they are looking at other ways to manage bandwidth. The solution they’ve come up with and that they’ll be toying with next year is to monitor all users and when traffic is high on their network – slow the speeds of those using the most.

What this boils down to is that if I were sitting at home downloading a site, chatting on Skype and maybe surfing a bit, there’s a good chance my access would get slowed down just to protect those downloading movies illegally (and yes, I am aware that torrents are used for legal downloads as well; however, I have a feeling that if their only use were legal, they wouldn’t be a problem for the ISPs).

Legitimate traffic may now well be affected negatively to protect “net neutrality”.

The second point (and probably the less popular of my arguments) was that capitalism and consumer choice in a non-monopolistic area is self-regulating. As we saw with Verizon’s blocking of pro-abortion text messages and reaction to the public outcry (which was to let them through) the consumer has enormous influence and when abuses occur, their reaction forces companies to adjust policy.

In the end we will get to choose our providers and the threat of losing business is an excellent motivator.

So Who’s Right?

The problem with asserting who’s right or wrong here is that there is key information missing. We’re trying to give an answer when we don’t really know the question. So far the debate is over net neutrality legislation. What is that? What does it cover? How does it read?

Without knowing this it’s difficult to really know what we’re for or against, but the problem is, by the time there’s legislation it’ll likely be too late to back away from it.

It’s also difficult to look at the pro and con supporter lists without having them affect your decision, unless you really think about why each party is there. On the pro side we’ve got companies like Google and Facebook (two friendly “little” companies) and on the con side we’ve got telcos and business organizations (those evil people who just want to make money). In fact, both camps want to make money. Let’s not forget that friendly as they may be, Google and Facebook both have billions of dollars and rely on the networks. This, and not some altruistic belief in a “free Internet”, is the true motivation of these companies. They want to make sure their costs aren’t increased simply because they’re some of the biggest sources of Internet traffic, either directly or indirectly. Now, their main arguments may or may not be correct, but each voice has its bias, and we need to understand that bias rather than simply choosing sides based on which camp looks the nicest.

So What’s The Answer?

While I’d love to be able to give you my honest assessment of the situation, the fact is, the more I learn about the net neutrality issue the less clear the right decision becomes. In discussing this with Cindy and Jim after the debate, we agreed that the biggest needs right now are awareness, a clear definition of what both camps are seeking and what the legislation would look like, a third-party evaluation of how this would impact the Internet, and some real open dialogue, not just banner waving from both sides.

One thing we do know is that net neutrality legislation would significantly impact the state and future of the Internet – what isn’t terribly clear is how. That’s what we need to know.

The next step in the discussion is public awareness and a serious discussion with both sides and our politicians on the issue. We need to understand exactly what’s at stake, what legislation would look like and how it would impact the ISPs and the consumers. We need to look to the future, understand what is coming in the way of bandwidth requirements, and make sure that the average user will have access to the bandwidth they need and that the ISPs are motivated to insure that it’s there to be had.

My Opinion

When starting my preparation for the debate I leaned towards the net neutrality legislation camp. It seemed like the obvious choice; however, the more I learned, the more grey it became. Today I find my leanings favoring the anti-net neutrality side. When I think of how the current legislation has protected consumers adequately thus far, how public opinion has forced complete 180s in other cases, and how poorly governments tend to fare when creating broad-sweeping laws in areas where the offenses are as yet unknown, it seems prudent to support the current state of affairs, at least until a genuine need arises for specific net neutrality legislation that can’t be addressed with current law.

That said, Jim leans to the other side and he too understands the issue and the arguments on both sides.

Two people who have researched the issue significantly, viewing the common concerns from both sides, and who, in the end, land on different sides of the fence. Again, it’s that kind of issue.

We Need

We need an open and honest debate on the issue. We need you involved with the discussion and we need those in government who support net neutrality legislation to stand up and explain what they believe it means and what the legislation would look like.

We need to hear all the points from the ISPs in regards to how the legislation would negatively impact services and future development on infrastructure and we need to hear from the pro-net neutrality camp on exactly what needs to be protected that isn’t already and why.

Until then I’ll continue to speak to less-than-packed rooms at conferences whose attendees are greatly affected by the issue – even if they’re not aware of it.

But at least you are now. Now it’s time to educate yourself further and find out for yourself why this issue is of paramount importance and what you can do to ensure that the Internet remains the highway of information and entertainment that it is, tomorrow and for years to come.

Additional Resources

Save The Internet – Save The Internet is a pro-net neutrality site dedicated to providing information supporting the idea of net neutrality legislation. It’s an excellent resource and required reading for anyone who wants to fully understand the issue.

Hands Off The Internet (formerly linked to: http://www.handsoff.org/blog/) – Hands Off The Internet is an equally important website explaining the situation from the side of those opposing net neutrality legislation. As with Save The Internet, it is required reading for anyone who wants to fully understand the issues and what’s at stake.

I would warn all readers: this is not an issue to take sides on at face value. Read the two sites noted above and then go further and find blogs, news and other information sources. It’s easy to get a quick, biased opinion on either side, but it’s important that we all understand all the issues and all the risks.

SEO news blog post published at 12:16 pm on September 22, 2008

Categories: Search Engine News


The Search Landscape Reflected In Paid Results

It’s important to note that this article was written on July 17, 2008. I mention this only to ensure that you can put it into context, and so that those who read it a day or week or month from now aren’t confused by my noting of Q2 reports and references to “today”.

Any of you who have read some of my past articles or who have visited Beanstalk’s services pages will know – I’m not a PPC guy. Quite honestly, it’s not in my primary skill set and it’s something I would definitely prefer to leave to the experts. That said, following Google and its health (which is tied directly to AdWords and AdSense) is something I’m keenly interested in. To this end, recent changes in Google’s paid search display and ranking systems will have huge impacts on advertisers and, more importantly for the purpose of this article, on Google itself.

A couple of weeks ago a friend of mine, Richard Stokes from AdGooroo, sent me a PDF titled “Search Engine Advertiser Update – Q208”. In this document they outline the changing trends in the paid search marketplace, and many of the stats are surprising. If you’re a PPC manager they’re obviously directly important. For those of us in the organic optimization world they are still both interesting and important. They’re interesting for reasons which will become clear further below, and they’re important because anything that affects the economic health of the search engines affects the search landscape both inside and outside of the paid search realm.

Paid Search Market Share

What could be more important to the engines than their percentage of the paid search arena? Does Google really care about being the dominant search engine as far as organic search goes? Let me put this a different way: if Google was standing in front of their shareholders, would they prefer to announce that they held 80% of all worldwide searches and reported revenues of $7.8 billion for the quarter, or would they rather stand up and say they hold 20% of all worldwide searches and reported revenues of $8.7 billion? Organic results drive traffic, which in turn results in clicks on paid ads. From a business standpoint, that’s the only reason organic search even matters.

So which engine has the healthiest paid search environment? According to AdGooroo, Q2 results show a different world than one might guess (which is why I noted that it is interesting).

Over the past twelve months advertiser growth (or lack thereof) breaks down as follows:

  • Google: -8.5%
  • Yahoo!: +9.8%
  • MSN: -6.7%

Advertiser counts have also changed (i.e. the number of advertisers on the engine). Yahoo! leads in this area as well with a growth of 0.03%, Google dropped by 6.4% and MSN dropped by almost 20% (good thing they have their OS revenue to fall back on).

And A Drop In Ads

To go even further, Google has increased the importance of quality, which has resulted in a reduction of nearly 40% in the number of ads that appear on a results page. Six months ago roughly 6.5 ads appeared per page, whereas now that number is closer to 4. This has the potential to significantly help or significantly hinder Google’s revenue.

As Richard Stokes points out and I completely concur, this places Google in an environment where one of two things will happen:

  1. Advertisers will realize that their clicks are converting much higher, search marketers will spend more time and resources creating more and more relevant ads and landing pages and advertisers will be willing to bid more as the conversions increase, or
  2. The competition for the top spots will be reduced and so too will the average bid prices.

Google’s Q2 Report

And what inspired the writing of this article was actually the release of Google’s Q2 report earlier today. After reading it I immediately had to contact Richard and let him know that the results confirmed some of the predictions noted in his work. He writes:

“… the auction-based bidding system makes this a double-edged sword. As the number of advertisers declines, so does the competitive pressure for higher bid prices. If advertisers don’t step up to the plate and bid more aggressively for placement, then it’s possible that search revenues could stagnate.”

Google revenues were up only 3% over Q1 of this year, and revenue from paid clicks was down by 1%. This is the first time post-IPO that I can remember Google showing a reduction in revenue in one quarter over the previous. It appears that this new paid search model is not quite as effective at pulling in money as the old one.

Now, to be fair, the new system of requiring higher quality scores and better ads and landing pages is new – only a few months old at this point – and so there are likely still bugs to be worked out. But Wall Street did not react favorably to the announcements today, and I suspect that the situation isn’t going to look better for Google at the close of day tomorrow (though what do I know about stocks).

What Does This Mean?

So what does this mean? This means that Google has a lot of work to do and those in the paid search space need to pay close attention (even closer than normal) as shareholders don’t like to see losses and Google is going to need to make moves to recover and show significant gains by the time their Q3 reports come out.

One might guess that this also means that Yahoo! is gaining ground (which is true), but it’s definitely a case of too little, too late. Also earlier today (it was a busy day in search) Yahoo! released a letter to its shareholders that on one hand referred to the alliance between Microsoft and Carl Icahn as a destroyer of shareholder value for Yahoo!, and then went on to say that they would be willing to sell the company to Microsoft at $33/share (which is what Microsoft had offered previously and which is more than $10 above their current market value).

It seems that one can’t count Yahoo!’s stronger relative results in the paid search area as a win when they seem to be backsliding on their initial position regarding the sale to Microsoft.

So Where Do We Go From Here?

For one thing, watch closely. Monitor resources such as AdGooroo’s research library, and the Clix Marketing blog. Pay close attention as we’re going to see a lot of changes to what’s going on and these changes are likely going to have effects on both the paid and the organic results as Google strives to provide the better results they’re targeting through paid search now but at the same time increase their revenue.

This may involve adjustments to the quality scoring (I can pretty much guarantee that one) and may involve adjusting how the paid ads appear on the page with the organic results. All we can really do is watch, wait and adapt.

Note: a big thanks goes out to Richard Stokes and the AdGooroo team for providing the research and stats behind this article. Your keyword research tool and competition analysis capabilities are awesome!

SEO news blog post published at 11:10 am on July 30, 2008

Categories: Search Engine News


>1 Is The Loneliest Number

A lot of my recent speaking engagements at both Search Engine Strategies and SMX have been geared towards running an SEO company, dealing with a changing economic landscape and similar issues. It is with this in mind that I got thinking about what separates one company from another. There are many great SEO and SEM firms out there, I like to think that Beanstalk is among them but there are also a number of poor ones. What separates the two and why will some succeed and others fail?

In thinking this over I considered skills first. Is it that the companies that weather the years, ride out the ups-and-downs in the fiscal year and the trends in the economy have the highest skills? Not entirely. At first this seemed like a logical, “survival of the fittest”-type scenario but I have seen skilled people (in this industry and others) going down while those who have very little in the way of skill succeed. So it’s not entirely about the ability to get the job done. Or is it …

One defining trend that I have noticed (though I would be very interested to hear about any exceptions you might have) is that companies that specialize tend to be more successful than those that try to do many things. It’s the companies that start by doing, say, web design and get lured into SEO (“Why give away the client to someone else – it’s just a matter of packing in some meta tags and buying some software to submit the site to a billion search engines every month, right?”), or that try to host their own clients’ sites (“My reseller package gives me unlimited domains and unlimited traffic.”), or that offer other sideline services, that get into trouble.

So my advice has to be (and I’m not the first to say it): do one thing, be excellent at it, and leave the rest to the experts in other fields.

Honestly, I’ve been tempted over the years to try to delve into other areas. I’m a half-decent designer and I know my code well enough (or what kind of SEO would I be?) so when a client comes with no site but a great idea it’s always tempting to take the whole contract, but then reason sinks in (even when I have staff who can do the parts that I can’t). Even the Beanstalk site was designed by a professional web designer (and many thanks to Frederick from W3 EDGE Web Design for a solid site that converts well). The key then is to find experts in other areas that you can trust with your clients. To that end I personally look for other, similarly-minded companies that specialize in what they do best and leave the rest to others.

The Exception

Before I get an onslaught of comments and emails blasting me for saying such a “crazy” thing as no one can be an expert at everything, I should note some exceptions to this rule. There are firms out there that consist of multiple divisions, each of those divisions dedicated to an individual task. Let’s take for example a firm such as WeDo Hosting Canada (I used to work there more moons ago than I’d like to count, so they make a great example). Robert Gagnon (owner) built an excellent hosting facility, but it was to support his software development projects. Instead of trying to do it all, he created a hosting company and a software company, hired great hosting experts to manage and support the one, and developers for the other.

If you are yourself trying to be a designer, an SEO and a host (why not add in a little social media marketing and PPC management just for fun?), you’ve basically created a recipe for disaster, and if I keep my eye on my watch I should be able to figure out pretty closely the exact moment that it all tumbles into decline. It will be the moment an issue arises in an area where you are not an expert. If your host goes down and you’re on a standard reseller package, unable to directly fix the situation, you become reliant on others. What if they didn’t make a backup of the product and/or sales database? Now your client blames you and will pull the entire set of services you provide them – onto someone else’s shelf.

But I digress …

Who Are The Experts?

According to Merriam Webster’s dictionary, an expert is defined as “one with the special skill or knowledge representing mastery of a particular subject.” This seems like a pretty fair analysis of the word. Now, there are certainly Leonardo da Vincis out there who can unquestionably prove themselves to be experts and masters in a variety of fields; however, I am not on that plane, and thus I am limited to focusing all of my time and attention on a single endeavor. In my case I chose organic optimization, and let me tell you (if you don’t already know), there’s enough going on there to keep one’s attention fully occupied – if I had two brains instead of one, both could be kept busy.

The same can be said for all of the other areas that are commonly grouped by individuals. Developers are generally logic-based thinkers, designers are generally creative, good PPC managers have a knack and skill for weeding out specific trends and stats to maximize revenue while minimizing undesirable clicks (those would be the clicks from people unlikely to convert). Social media experts focus primarily on the here-and-now (i.e. what’s working right now to drive massive traffic through social media sites) and so on.

Because I like to avoid speaking ill of others, especially those in the SEO realm, I’ll focus on my own limitations, as I’m always welcome to pick on myself. My personal strengths and interests lie in evaluating and understanding trends in ranking fluctuations, analyzing competitors and applying the findings across multiple sites as appropriate. Ask me to design a site … goodness no – please don’t, for your own sake. I can’t create pretty things in my head (or on paper) and I certainly can’t move that image onto the web. When clients need design or development I send them to designers like Moonrise Design from San Francisco, who we’ve worked well with on a number of projects, or Atomic Crayon from Victoria.

Ask me to manage a large-scale PPC campaign – not if you want it to be successful. I can hold my own on small campaigns or campaigns just for testing keywords but when I think of titles and descriptions I’m thinking of the organic results – write them to get the click as it’s free and we can work on converting them when we get them to our site. This doesn’t apply well to PPC. I’d rather refer a client to David Szetela and crew over at Clix Marketing who have the same feeling we do – their monthly fees are based on your profit not your spend so they’re focused on making the most of each dollar in your budget – not just getting rid of it all.

How about hosting? – I’m not even going to go there. I’ll leave hosting to the likes of Lunarpages Web Hosting or Superb Hosting. Is there anything more critical to the success of an online business than hosting? No matter what was spent on SEO or PPC or your design, if a site isn’t up – what does it matter?

And So My Advice Is …

If you’re a provider of services, be excellent. Pick the one thing you do best and hopefully most enjoy, and be the best provider of that service your client could have. Find reliable and trustworthy partners to offer the services you do not, and refer your clients to them. You can likely take a commission. At Beanstalk we’ve opted not to take commissions on referrals, just to make sure we’re always giving the advice that’s actually in the best interest of the client; however, there’s nothing wrong with doing so if you know you’re giving great advice.

If you’re on the hunt for an SEO or other Internet marketing service – select a company that either does one thing extremely well and can help you find suitable providers of the other services or which has dedicated staff for specific tasks, thus enabling them to learn and focus on the skills best suited to the task they are performing on your site.

But What Does This Have To Do With The Loneliest Number?

Everything. Through this piece I’ve discussed essentially what will be the downfall of many Internet Marketing firms. The economy is changing. The fat is being trimmed and the most skilled may not be the ones who survive – if they extend themselves into areas where they’re not the best.

As a web services provider, or as the client of one, I’d want to know that I’ve got the best doing what they do best. Until Leonardo comes back and takes up Internet marketing, design and hosting, that’s going to need a team or set of teams – not an individual. If you don’t take this advice, well – it’s very lonely when you have a poorly designed site hosted on a slow server that doesn’t rank very well – or if you’re the company whose client had that site.

If you have any questions please feel free to contact us for additional information or sign up for our free search engine optimization review.

SEO news blog post published at 10:11 am on July 18, 2008

Categories: SEO Articles


Part Ten of Ten: Keeping It Up by Jim Hedger

Welcome to part ten in this ten part SEO series. The ten parts of the SEO process we will be covering are:

  1. Keyword Research & Selection
  2. Competition Analysis
  3. Site Structure
  4. Content Optimization
  5. Link Building
  6. Social Media
  7. PPC
  8. Statistics Analysis
  9. Conversion Optimization
  10. Keeping It Up

Sustaining Search Rankings and Increasing Site Conversions

Over the past three months, the WebmasterRadio.FM show Webcology and WebProNews have run a joint series of radio shows with corresponding articles on the ten basic steps or stages shared by most effective SEO campaigns. Today, we cover the last section (but certainly not the final points) in a round-up article aptly named “Keeping It Up”.

If the initial object of search engine optimization is to attract website traffic, the long-term objective is to retain and convert that traffic, and new traffic, into repeat business. As all analytical webmasters know, the bulk of site traffic tends to come from search engine referrals. Retaining strong search rankings is essential to sustaining strong traffic, increasing sales and thus expanding your business.

Even after a website has established itself with its visitors and has a strong conversion record, search engines will continue to provide the vast majority of all new site traffic. If the SEO has done his or her work properly, blogs, social media and paid-search ads should also be driving new and repeat visitors.

Under normal circumstances, sustaining strong search engine rankings, while hard work, is fairly straightforward. That doesn’t mean it is easy by any means, but search marketing maintenance is not necessarily rocket science either. There is a never-ending series of regular, methodical tasks to plan and work through. Depending on the size and scope of the website one is working on, a number of decisions should be made in order to prioritize work to most effectively use one’s time.

Generally, when we do the “final” touches on the initial phases of a client campaign, we wait a short period of seven to ten days before worrying too much about where pages or documents in the site are ranking. During that time, there are a number of tasks we perform to help boost the site’s performance, many of which have already been covered in the previous articles and radio segments that make up this series.

Post-SEO: Day 1, Whew! The initial phase is 99.9% done

Let’s pick up the story from the moment Metamend’s head SEO, Jade Carter, signs off on the completion of an initial project. All augmented or freshly created files have been uploaded to the host server, are live to the web and are presumably being actively spidered. While most of Jade’s staff have already moved the bulk of their attention to the next project, Jade still has a bunch of things to do before assigning the long-term life of the file to one of his account managers.

The first critical step is to perform a final “doh!-check” to ensure everything is working properly. Getting a site posted properly can be more complicated than one might think. At this point every minute counts, as we can expect search spiders to visit the site fairly quickly after changes are made. “Doh!-checks” are among the most important parts of his job and can only be conducted with 100% certainty once all files are live.

As with all busy search marketing agencies, Metamend’s SEO staff will see hundreds of files cross their monitors each week and, as previous parts of this series show, frequently perform several different task-sets each day. Clients often use their own in-house web-teams to implement SEO recommendations or upload new files. With larger clients, it can take a few days or even weeks to see changes implemented, time during which any number of issues can arise. Mistakes can happen but since we work in a professional best-practices environment, mistakes should never see the backlight of day.

Jade scans each page to make sure all our recommendations or hands-on changes have carried over to the live-site. Running through all links, he checks to see if the site looks and acts the way he expects it to.

Search engine optimization and marketing has often been described as a mix of art and science. When reviewing the on-page work of his staff, Jade needs to think with both sides of his brain. His first priorities are technical. Jade has to verify the website is functioning correctly and is, in fact, search spider-friendly. He then needs to turn his mind to marketing, to make sure the website is user-friendly and accessible.

Thinking like a techie, he needs to inspect the site structure, and link-paths while planning the continuance of link building efforts. Moments later, Jade shifts mental gears and thinks like a marketer, checking if fresh and optimized content files appear on the pages they are supposed to appear on and if the on-page layout is attractive and compelling. He also needs to look into any social media marketing and paid-search marketing campaigns to map out staff work time and assign long-term tasks appropriately. Assuming everything is found working according to plan, Jade is able to do the final “doh!-check” sign-off before moving his focus to post SEO management and metrics.

Post-SEO: Day 2 – Day 7, The Garden is Seeded, now we add water and watch

(Please see the other articles in the series and listen to corresponding Webcology podcasts for information on any specific steps or techniques)

The bulk of the heady, heavy work is done! Manually working through each file in a website takes a lot of time and energy at the beginning of a contract. It is bulk work that can be all-encompassing for days at a time. That’s why SEO/SEM firms often charge a large up-front fee along with slightly smaller monthly fees. Over the coming months, an enormous amount of work will continue, with focus on link building, social media and PPC management. As with the initial SEO phase, much of the research, budgeting and data entry – the heavy lifting – is done in the first week.

The day after an optimized site and PPC campaign go live, the analytics begin to kick in. This is when things get intellectually interesting as statistics analysis forms the foundations for future planning and improvements. Most web analytic and monitoring software we use begin collecting data immediately after installation but it takes a few days for truly informative metrics to emerge.

In the initial week, software suites such as Enquisite Pro, Click Tracks, Google Analytics, and soon Yahoo’s newly purchased suite IndexTools need to be tweaked, trained and tutored in order to get the best overviews of website traffic and unique page performance. The PPC monitoring and click compliance software suite, PPC Assurance begins working immediately to record details of PPC driven traffic.

The first week does give search marketers a fairly good idea of how their PPC campaigns will generally fare. A lot of tweaking and testing can go into the phrasing of winning PPC headlines and ad-copy through-out the life of the campaign but in the first week, SEMs get a sense of how the present competition will act. Running in conjunction with PPC Assurance, the dashboards of the PPC networks (Google and Yahoo) provide very good analytic information, providing enough data to run a global advertising campaign from your desktop or laptop computer.

A critical note on pay-per-click management: it’s necessary. Frequent checking and the use of alerts are important for monitoring bid rates and the cost of every click. Diligent PPC monitoring is essential throughout the life of the search marketing contract.

Post SEO: Day 8 to ∞, In Which SEO and SEM Become Website Marketing

SEO and SEM turn into website marketing when numerous tools are used in conjunction to improve website traffic and increase web page conversions. With the deployment of optimized content and a blog, pay per click advertising and social media marketing, the pillars of the campaign are set in motion.

Some pages are expected to perform better than others. Similarly, some pages will perform better than expected. Some pages will not perform well at all, including ones you initially had high hopes for. Getting a true grip on how each page in a website or facet of the overall website marketing campaign is performing is like examining each tree in a woodlot. Compared to the intensity of the initial hands-on SEO phase, the work of Keeping It Up (to sustain search rankings and improve conversions) is much more mental than manual.

Analytics play an enormous role in long term search marketing management. From basic ranking reports to local search results in specific cities to eye-tracking studies and mouse-tracking technologies; search marketers rely on data generated by website visitors. Examining the data shows search marketers which pages or files are working and which require improvement.

The first thing SEOs look at is simple: organic page and site rankings across the major search engines under target keyword phrases. Though ranking reports provide the most basic information, the vast majority of search engine referrals come from first-page placements, and of those, the greatest numbers come from placements above the fold in the top 1 – 5 spots. Placements on the second and third pages of results can be worked on and improved, moving them upwards in the search results.

A problem with basic ranking reports is search engine placements often differ from region to region and city to city. In many circumstances, websites with local relevance will place better in their respective regions. Quite often, people from different places use different words to describe the same things. This is where the Enquisite Pro toolset is particularly helpful as it generates an increasingly granular view of search traffic and keyword usage down to the micro-level of zip and postal codes. Knowing how unique pages rank in specific cities, and which keyword phrases to target in different places helps improve conversions by allowing webmasters to tailor specific promotions to specific locations.

Use of other search marketing tools to drive traffic and improve rankings starts to take an important role in website marketing. Blog posts relating to well selling products in a specific region or city might be written and distributed to relevant publications with links back to pages that require a boost or support in the rankings. A well thought through social media campaign could attract visitors to a landing page that re-directs them to a regionally specific landing page based on IP address or their own social media settings.

As time moves forward, analytics packages begin to flush out a clear view of how website visitors move through the various pages in the site. When navigation patterns become clear, the process of conversion optimization begins. Briefly, conversion optimization is the art of prompting website visitors to take pre-planned actions such as clicking a link, buying a product or requesting information.

As analytics and user-tracking begin to paint a picture of how web visitors move through a website, search marketers can improve pages in the site to best meet the actions of the visitors and improve chances of conversion. Phrases such as “time on page”, “bounce rate”, “referral page”, “entry and exit points”, capture the attention of web marketing analysts trying to figure out the very best way to make and remake web pages to increase conversions. For anyone who simply needs to know the language of search marketing analytics, the Interactive Advertising Bureau publishes a 29-page glossary of Interactive Advertising Terms.
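
To make a couple of those phrases concrete, here is a hypothetical Python sketch (the visit log is invented, and real analytics packages collect far more) showing how bounce rate and time on page reduce to simple counting:

  from datetime import datetime

  # (session id, page, timestamp) - one row per page view
  visits = [
      ("a", "/products", datetime(2008, 4, 13, 9, 0)),
      ("a", "/contact",  datetime(2008, 4, 13, 9, 3)),
      ("b", "/products", datetime(2008, 4, 13, 9, 5)),  # one-page session
  ]

  sessions = {}
  for sid, page, ts in visits:
      sessions.setdefault(sid, []).append((ts, page))

  # Bounce rate: share of sessions that viewed exactly one page.
  bounces = sum(1 for hits in sessions.values() if len(hits) == 1)
  print("bounce rate: %.0f%%" % (100.0 * bounces / len(sessions)))  # 50%

  # Time on page: gap between consecutive views within a session.
  for hits in sessions.values():
      hits.sort()
      for (t1, p1), (t2, _) in zip(hits, hits[1:]):
          print(p1, "time on page:", t2 - t1)  # /products 0:03:00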

Good use of analytics allows search marketers to keep track of dozens of site-critical factors, giving them an easy overview of web-traffic patterns and ways to increase them. Getting an easy overview is pretty important for seasoned search marketers; they are often responsible for several files at the same time.

Working on improving a website and bettering site conversions brings the welcome benefit of working to increase page and overall site rankings. The introduction of improved content, RSS feeds and (hopefully) higher volumes of web traffic create a stronger reputation in search databases. Additional traffic and incoming links are drawn by a blog, social media marketing and link building efforts.

As weeks move into months, the progress and prosperity of the website should be well established and the marketing campaign(s) fine tuned. Barring unforeseen circumstances and assuming best practices have been followed all the way through, the majority of a search marketer’s working time is likely spent on continuing to update the blog, link building and social media marketing. And on it goes …

About the author:

Jim Hedger is a veteran SEO, a good friend, and reporter for Webmaster Radio.

SEO news blog post by @ 2:05 pm on April 13, 2008

Categories: SEO Articles

SEO Step Eight Of Ten: Statistics Analysis

Welcome to step eight in this ten part SEO series. The ten parts of the SEO process we will be covering are:

  1. Keyword Research & Selection
  2. Competition Analysis
  3. Site Structure
  4. Content Optimization
  5. Link Building
  6. Social Media
  7. PPC
  8. Statistics Analysis
  9. Conversion Optimization
  10. Keeping It Up

The Final Stages

So you’ve done your research on keywords, tested them with PPC for conversions, analyzed your competitors, optimized your site, built your links and used social media to promote your cool tools, awesome news and more. You’ve watched your site crawl up the rankings and you’ve finally hit the sweet spot – you’re on the first page of Google or, better yet, above the fold on Google! Time to kick back, open up a beer (if you’re over 21 – 19 here in Canada), and watch the money pour in from whatever goods or services you’re providing. Here comes the bad news – now’s when the work REALLY begins. Now don’t get me wrong, you’ve gone through a lot to get here and you’ve definitely put in your hours – you do deserve to take a bit of time off from dealing with your site before you head into the final three stages (which, by the way, last for the lifetime of your site). This couple of weeks will give you time to collect some baseline data for what we’re going to discuss here …

Improving Your Site Health

You’ve just put in months of hard work to get your site ranking. You’re likely making some sales now and you’re probably pretty pleased with your efforts, and you should be. But are you getting all that you wanted or could have from the rankings? If you’re now making 5 sales per day, could you make 10? If your sales jumped from $5,000 per month to $50,000 … could they be $80,000 … how about $180,000? As an SEO I definitely understand how much simply ranking on the search engines can mean to a company’s bottom line. As someone who’s taken a good hard look at the stats for numerous sites and made or recommended numerous changes based on them, I also understand that while rankings will bring you the traffic … they don’t make visitors buy. This section also applies to those of you who already have sites ranking well, have some good traffic and just want to make the most of it.

When we’re thinking about increasing sales from existing traffic we generally think about conversion optimization. If this is your first thought you’re 100% right, but in order to increase conversions you first need to understand what your visitors are doing. Only then can you look at your site and understand what’s going wrong and what you need to do to increase your conversions. So today we’re going to determine how to tell what you’re doing wrong (and right) by looking into your stats.

Obviously we can’t get into each and every aspect of stats here. Your statistics are different than my statistics, and the issues that various sites face will dictate which aspects are most critical to look at and how they should be analyzed. That said, there are some areas of any site’s stats that need to be looked at and that can be reviewed, with some understanding, by people of virtually any skill level. We’re going to focus on these areas. First, let’s look at a couple of stats programs:

Webalizer & AWStats – Chances are your web hosting provider offers some sort of free stats program like Webalizer or AWStats. These programs are fairly elementary; they don’t give in-depth information, nor can you customize what you’re seeing. They are, however, very easy to read and understand. If you’ve never looked at a statistics program before, these might be a good place to start.

Google Analytics – This is Google’s adaptation of Urchin. The stats are collected and seen by Google and provided to you in a fairly simple-to-read format, complete with the ability to customize a lot of the data and set up goals so you can track specifically who is landing on specific pages (such as your “thank you” page). It’s probably the best of the free programs out there, but there’s just a part of me that’s nervous about handing over all the data about my visitors and their patterns until I know that data to be favorable. If the average visitor to your site only stays for 30 seconds and visits 1.5 pages … do you really want Google to know that? You might prefer to work on getting your pageviews and time on site up to good levels before letting Google see under the hood. Now, I don’t know that they use this data but then – just because you’re paranoid doesn’t mean they’re not out to get you.
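For those who haven’t set it up, installing Google Analytics amounts to pasting a small block of tracking JavaScript into every page you want tracked, just before the closing </body> tag. At the time of writing the standard ga.js snippet looks something like the following – the UA number is a placeholder you’d replace with your own account ID:

  <script type="text/javascript">
  var gaJsHost = (("https:" == document.location.protocol) ? "https://ssl." : "http://www.");
  document.write(unescape("%3Cscript src='" + gaJsHost + "google-analytics.com/ga.js' type='text/javascript'%3E%3C/script%3E"));
  </script>
  <script type="text/javascript">
  var pageTracker = _gat._getTracker("UA-XXXXXXX-X");
  pageTracker._trackPageview();
  </script>

Once the snippet is in place, data for that page starts showing up in your reports within a day or so.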

ClickTracks – This is my personal favorite of all the stats programs. The virtually endless ways it can be customized, the in-depth ability to calculate ROI and the reports make this a winner. It’s not the cheapest option (it’s hard to compete with free), but they have a free trial that’s well worth checking out. For more information be sure to tune in to WebmasterRadio.fm at 2PM EST on March 3rd, 2008 when we will have ClickTracks’ Andres Galdames on the show to discuss stats analysis with us. If you miss the show don’t worry – you can download the podcast at http://www.webmasterradio.fm/Search-Engine-Optimization/Webcology/.

What You’re Looking For

Alright, so we know what to use – but what are we looking for? There are some universally applicable measurements that virtually everyone should look at:

Visitors: This is exactly what it sounds like – it tells you how many visitors you’ve had to your site in a given timeframe. Obviously this is the number you want to see increase over time. This stat won’t help you increase conversions, but it will let you know how your rankings and links from other sites affect your traffic.

Referrers: This tells you where your traffic is coming from, including which engines are sending how much traffic.

Keywords: You know how you rank, but what does that REALLY mean? Your keyword stats will tell you how much traffic those rankings are actually producing.

Exit Pages: When you’re looking to improve your conversions and visitor patterns very few stats are as important as this one. Knowing which pages your visitors are leaving from can get you well on your way to repairing the major issues with your site.

When we’re using more advanced stats packages such as Google Analytics and ClickTracks we can start to further tailor the information we receive. You’ll be able to see the entry and exit pages for visitors from specific engines, or for those who came to your site after searching for a specific phrase. You’ll be able to track the value of a visitor if you have an e-commerce site, or track which visitors are contacting you most often.

How Will You Use This Data

When you’re trying to increase your conversions you first need to understand what’s going on with your site. In some cases you will be monitoring for problems; in others you’ll simply be looking for missed opportunities. In the end the truth of the matter is – there is always room for improvement. No matter how well designed the site, there will always be a top exit page, which means there is always testing to be done.

To give you an idea of the range of issues you will be looking for, I’ll take a page from Beanstalk’s history. When we first started out our primary phrase was “search engine positioning” (and thus our company name) with “search engine positioning services” as one of the secondary phrases. When we hit the first page for the services-based phrase we started getting some good traffic. It took us a few more months to rank for “search engine positioning”. After about a month we reviewed our stats and what did we find? Using ClickTracks I was able to specify that I wanted to know which visitors landed on one of our thank-you pages (indicating they had filled out our contact form or shown interest in one of our services). After collecting a few weeks of data I discovered that the majority of the people who contacted us had entered with “search engine positioning services” and that not a single person who came to our site with “search engine positioning” filled out a form.

Obviously this changed the entire SEO strategy from that point on. After changing our keywords dramatically and working on increasing traffic from keywords that would convert better, we then switched our focus to streamlining the process and working on our top exit pages. Now, our blog is a main exit and entry page, but that’s acceptable given that most visitors to the blog are there to read current news (i.e. they’ll only be visiting the index page of the blog and then leaving). When we see that a services page is a main exit page, however, we know we need to take a good look at that page and see what we can do to get the visitor to visit more pages or communicate with us.

In another example, we were working for a client that was undergoing a complete redevelopment of their site. This time we ransacked the stats from before the change as well as after, to understand exactly how visitors’ interaction with the site had changed. As can happen, I was surprised by many of the changes and by a lot of issues that I (not having gone all the way through the cart myself once I was told it was functioning) did not see coming, but which a look through the stats clearly outlined. There was serious abandonment at the cart level, as the final steps had been made more difficult than they previously were. The client had thought their traffic had declined; however, we were able to show that other than a few days’ dip it held pretty steady, and that a drop in conversions was responsible for the drop in sales, despite the site otherwise performing better (in regards to navigation and search functionality).

What Should YOU Do

The one thing that’s true in every case is that you need to check your stats. You don’t have to look at them every day, but a weekly check will help make sure you always know what’s going on, where your traffic is coming from and what visitors are doing.

If you have the patience to learn how to use the more advanced tools (and I highly recommend you do, or hire someone who can) you’ll get a feel for where things could be improved and where there are critical issues. With what are often a few minor adjustments you can see significant increases in sales and in the ROI from the site as a whole.

Conclusion

Unfortunately there is no way for me to cover all the possible areas you might want to look at in your stats. What I’ve tried to do here is give a brief outline of what you can use, the basics of what you’ll be looking for and some examples of how they can be useful. This only covers about 1% of what can really be understood about stats, the tools that can be used, and the uses for them.

Fortunately we’re going to have a chance to hear from the fine folks at Enquiro next week (who will be writing part nine of this series and who will also be on the radio show with me). They will be able to discuss some different stats techniques, though they will be focusing on the next step in the process – conversion optimization. Once you have your stats you need to know what to do to fix the issues you’ve found. Next week we’ll discuss exactly that.

If you take only one thing away from this article and the radio show I hope it’s this – your statistics are the key to understanding your site’s health and to making the most of it. They are the key to maximizing the ROI from a website and need to be reviewed regularly. If you don’t know how to read your stats or what specific things mean – that’s what forums, articles and the support documents for your stats program are for. And if you can’t find the help you need there (or just don’t have the patience) – contact your SEO, or contract one to help you gain the understanding you need to make the right decisions.

Next week’s topic will be conversion optimization, written by Rick Tobin of Enquiro.

SEO news blog post by @ 1:56 pm on March 29, 2008

Categories: SEO Articles

Starting An SEO Company

Starting an SEO company (correctly) is a lot harder than you might first think. Let’s assume for a moment that you’re at the stage where you’re launching the company, whether that means you’ll be starting a full-scale SEO company or going it alone is up to you.

Here’s a quick summary of what you’ve got: a new site with no links, no history and probably light content. You’ve likely got a name with very little brand recognition and an SEO (you) who probably has little in the way of recognition as well. If these don’t apply to you then this article likely doesn’t either, but if any of this sounds familiar then hopefully you’ll find some helpful advice here.

The Background

Just to give you a bit of an idea of the perspective taken in writing this article: I started an SEO company back in 2004. While the company now enjoys solid rankings and traffic from a variety of sources, as well as other marketing avenues – this obviously wasn’t always the case.

When the company was first started I had a budget of, well – I had enough to cover the hosting and about a month without a wage. So you can imagine it was pretty important to come up with a strategy that included getting business today while also working towards building growth down the road. If you’re just launching your own SEO company, you’re likely in a very similar situation. Here are some tactics to consider …

Getting Yourself Noticed

Getting yourself noticed can be hard in the beginning. It can sometimes seem like that saying, “it takes money to make money,” except more like, “it takes a name to make a name.” There are a few tactics that, if done properly, can help you get your name in front of your prospective clients.

Entering Contests

Entering SEO contests such as the classic nigritude ultramarine or the more recent yicrosoft directory is a great way to test your skills, get some publicity, win some prizes and build a bit of a reputation that will help with some of the other tactics we’ll cover later. There are pretty much always SEO contests of some sort going on. By doing your rounds on the forums and running searches every now and then you’ll quickly uncover what they are.

And as a bonus, even if you don’t win – this gives you the opportunity to deeply study the tactics of the winner knowing the domain ages, the backlinks, the onsite elements, etc.

Writing Articles

I’ve said it before and I’ll say it again: writing articles (and by articles, I mean good, informative articles) is a great way to get yourself known by both your potential clients and those in the publishing world (who, let’s remember, are the ones who control whether your work gets read by your prospective clients).

Often it can be challenging to write for your potential clients AND for the editors. For your potential clients you want to write something that shows you’re an expert but without “giving away the farm”. The editors and publishers have a different take on things. They’re not there to sell you business; they want readers, and that means they need to provide valuable content. Basically, you might not have to give away the farm but you will have to say good-bye to a few of the chickens.

There are three key points to understand when you’re pondering just how precious that nugget of insight you have into the way the engines work really is:

  1. You’re not going to cover everything in one article. Even if Matt Cutts was point-forming all the secrets to high rankings stored in his head – he could NEVER do it in a couple thousand words. You’re writing an article, not an encyclopedia, so relax – if everything you know about SEO can be given away in 2000 words, perhaps you’re starting the wrong kind of company.
  2. Tomorrow things will be different. SEO isn’t just knowing what you know today; it’s testing, keeping up, and reading so you know what’s coming tomorrow. Even if you gave away everything (let’s pretend that’s possible for a second), things will be different in a year.
  3. And perhaps the most important point: your clients don’t have time! The reason someone hires an SEO is that their time is more valuable doing what they do, and they’d rather hire a professional SEO than learn the skill set. I don’t do dentistry on my kids; dentists don’t promote their own websites.

Attend Conferences

Attending conferences such as Search Engine Strategies and SMX is a great way to get known. The specific conference and its location will influence the types of contacts you’ll make, but they’re always valuable.

For example, at Search Engine Strategies San Jose I’ve always tended to meet and mix with other SEOs, publishers and other assorted geeks. At the same convention in New York you’re more likely to meet business people and company executives there for research and to find consultants. Two very different groups – both valuable in their own way.

Parting Words

In the end, if you’re looking to start your own SEO company it’s going to take a variety of tactics and some patience to really see the benefits. The period before you’re able to secure a name for yourself or attain solid rankings is a difficult one, but it can be overcome with patience and hard work.

Use tactics such as well-written articles to get yourself business in the short term, and attend conferences and enter contests to build a name for yourself in the community (which is going to help get your articles syndicated much quicker), and you’ll have the recipe for a successful venture in the world of SEO companies.

Note: This article is written based on my own personal experiences. While I and the Beanstalk company were successful, there are other ways to do it, and the advice above should not be held up as the “recipe for success”. Before starting a business you need to consider all angles including laws, your own personal skills, funding, etc. We started with very little funding, a solid skill set, and a strong understanding of how to market an Internet company based on previous work experience. You need to consider your starting point and skills before engaging in any new business.

SEO news blog post by @ 10:35 am on March 4, 2008

Categories: SEO Articles

SEO Step Four of Ten: Content Optimization

Welcome to part four in this ten part SEO series. The ten parts of the SEO process we will be covering are:

  1. Keyword Research & Selection
  2. Competition Analysis
  3. Site Structure
  4. Content Optimization
  5. Link Building
  6. Social Media
  7. PPC
  8. Statistics Analysis
  9. Conversion Optimization
  10. Keeping It Up

Content Is King

Content is king. More than a truism, the phrase is a mantra. Content is the stuff people are looking for on a website. A commitment to developing and deploying great page, document and site content is a commitment to good SEO.

Comprising the most common site elements, content is the most effective tool SEOs have to work with. Loosely defined as All Things On-Page, the term “content” includes titles, tags, text, in-site links and out-bound links. In some SEO practices, the acronym ATOP is used to refer to the hands-on work environment (e.g.: Mark sends the keyword targets to Jade, whose staff works ATOP in the overall SEO effort). Content optimization is where creative art gets mixed into the webmaster science of SEO.

In the SEO process, content optimization describes most of the hands-on work done to make unique documents place well in search engine rankings. For the purposes of search engine optimization; content either exists, has to be created, or both.

Sometimes optimization of existing site content only requires the SEO to perform minor textual tweaks. Sometimes content does not exist and has to be written by the SEO. Frequently, SEOs come across pre-existing page content that needs to be totally rewritten or redeveloped.

The object is two-fold. The first goal is to feed data to search engine spiders, the second to serve information to human visitors.

Writing for Robots

By basic definition, the goal of search engine optimization is to achieve high search engine rankings. That means writing for robotic consumption. The first rule of writing for robots is, keep it simple.

For all their silicon guts and algorithmic abilities the robots are not that bright. They cope best with one concept at a time. Though a page might rank well for any number of keywords or phrases, the best site copy is written to focus on one topic per page. Addressing multiple topics per page dilutes the overall effectiveness of a site-wide SEO effort and the ranking potential of individual pages.

Limiting your focus to one topic per page makes it far easier to work keyword targets into each of the basic on-site content elements: titles, meta descriptions, body text and links. When optimizing site content, each of these elements needs to be worked on one by one and then examined in relation to the others. In practice, I prefer to work from the top to the bottom of a page before spending the bulk of my time messing around in the middle.

Titles are important

The first page element search engine spiders and most human visitors see is the page title. If you found this article on a search engine or through an RSS feed, chances are the title of the page was used to make the reference link you clicked on to get here. Passing primary topical information to bots and to search engine users, the title of a web document is used by SEOs to address specific keyword targets and to convince human visitors to select the page.

A lot of webmasters overlook the title when designing and maintaining their websites. To make the point, think of the countless number of websites with index pages sporting the title “Home”.

Look at the very top of your screen. See the words beside the Firefox or Internet Explorer symbol? That’s the title of this page. As published on WebProNews, the original page this piece appeared on has the title “SEO Step Four of Ten: Content Optimization | WebProNews”.

Each page in a website should have a unique title. As pages in the website get more specific, so too should the titles of those pages. Since SEO is about getting good placements under a variety of keywords or phrases, including “long-tail” placements, topically relevant keywords should be worked into the title of each page.

Here are a few examples of optimized page titles in a general page-tree order:

  1. Eco-Friendly Products for Healing Healthy Hippies :: Green Wingnuts (INDEX page)
  2. Ecological Alternatives :: Healing Healthy Hippies :: About Green Wingnuts (About page)
  3. Magic Healing Balms, Tinctures and Lotions :: Health Products for Hippies :: Green Wingnuts (Product Stock Page)
  4. Organic Yellow Blue Algae Lotion :: Nutritious Health and Healing Products :: Green Wingnuts (Specific Product Page)

Search engines use titles to gauge the topical intent of individual pages in a website. So do human search engine users. It makes sense to give both the information they need to make the decisions you want them to.
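In the page source, each of those titles is nothing more than the contents of the <title> element in the document head. Taking the specific product page from the list above as an illustration:

  <title>Organic Yellow Blue Algae Lotion :: Nutritious Health and Healing Products :: Green Wingnuts</title>

One unique line like this per page is all it takes to give both spiders and searchers something meaningful to work with.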

Meta Descriptions Make a Difference

There are dozens of meta tags that have been used in the history of search engine optimization. The only extremely important one is the meta DESCRIPTION tag. Though found in the source-code and not part of the visible website, the meta description tag can have a decisive impact on rankings and selection.

Search engines use the meta description to help confirm the topical intent of web pages. They also use it for a much more practical purpose: the description is often used as the short paragraph found under the title in search engine results. When a search engine user is deciding which link to click, a well-written meta description can make the difference. Don’t ignore this tag; each page should have a unique one.

<meta name="description" content="Green Wingnuts makes healing products for healthy hippies. Ecological alternative health products for a better planet" />

Visible Elements, Text, Images and Links

When approaching a fresh optimization project, SEOs take stock of what they have to work with. SEOs often think like doctors when assessing a website, understanding that they could do quite a bit of harm if they are not extremely careful. More often than not, changes made to titles and meta descriptions are beneficial to clients. As they are frequently overlooked or under-utilized, augmenting the titles and descriptions of pages usually helps a site achieve better rankings. Changes to the text that appears on a page, on the other hand, might unleash a host of unintended consequences. Aside from the chance an SEO might mistakenly change the message the client is trying to convey, messing around with body-text might also damage current search engine rankings. Keep that in mind as we move into making content optimization decisions.

The first task in content optimization is analysis. Having a full understanding of where a client’s web pages rank, under which keyword phrases, and how successful the current placements are is critically important for deciding what to work on. Analysis requires data, and data requires gathering.

In an earlier part of this series, Dave Davies addressed Keyword Research and Selection and the making of a list of several keyword phrase targets. Content optimization analysis is about figuring out which pages are most relevant to keyword phrase targets on the list.

Almost any page on a website has a good chance of achieving strong search engine placement under a limited number of keyword phrases. In deciding which phrases to apply to which pages, I start by dividing items on the keyword selection list into categories ranging from general to specific.

On the INDEX page of the Green Wingnuts site, the phrase “Green Wingnuts” would be the most general phrase as it is the business name of the client. The target market is deemed to be health conscious hippies, hence the slightly more specific variations on “healthy hippies”. Ecology is an important interest for most health conscious hippies, thus the use of “Eco-Friendly Products”. In this example, the index page is primed to rank for three unique keyword phrases and is easily associated with variations on each.

At first mention, content optimization might be thought of as writing primarily for search engine spiders. It’s not. Well-optimized website content should be created for live human visitors and deployed in a way that draws the reader towards a decision. Anyone can talk to a bot. Compelling website visitors to commit to an action and achieve a conversion is a bit more difficult.

As noted earlier, a good working rule is to stick to one topic per page and to consider the overall website as a document tree. The top of the tree is the INDEX page. Below the INDEX are the second or upper-level pages that tend to describe the company, its mission, goals, general services, and contact information. Pages found on subsequent levels of the website tend to feature more specific information the deeper a document is found on the tree. In the Green Wingnuts example, you can see in the titles how content gets more specific as we descend down the document tree.

Writing for web readers and search engine spiders is much like writing for newspaper readers. Because the web is a dynamic environment, readers have notoriously short attention spans. Important points and keyword phrases need to be mentioned early in the copy and, by the end of the third short paragraph, the reader should know what they are supposed to do next. Subsequent paragraphs are used to support the story told by the first three. The goal is to hold their interest long enough to confidently direct them to the next step.

For instance, when writing copy for a real estate website, I want to ensure the readers are A) getting the information they need to assess the local area and decide they want to live there, B) understanding that the realtor is there to provide whatever they need to make a decision, and C) confident enough to know how to move to the listings of properties for sale.

When applying text to a page, content optimizers need to think about its placement against the other elements present on the page. How headlines or “strong” text look beside an image is as important as the slight algorithmic bump that emphasized text brings. More important to the goal of improving the page is making it accessible to all users. Adding descriptive alt attributes to images helps visitors who use screen readers and gives SEOs an opportunity to insert relevant keywords. While I still use <h1> and <h2> tags, I tend not to worry as much about SEO considerations as I do page layout considerations. As long as the target keyword phrases are prominent in the titles, meta description and body text, and judiciously used as anchor text, I trust the search spiders to find them.
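As a quick sketch – the file name here is hypothetical – a descriptive alt attribute on a Green Wingnuts product image might read:

  <img src="/images/algae-lotion.jpg" alt="Organic yellow blue algae lotion from Green Wingnuts" />

The screen reader gets a useful description, and the page gets one more topically relevant mention of the target phrase.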

I am far more concerned about where the pages I work on are being found. An emerging consideration in content creation asks the question, “What if it plays better in Pittsburgh than it does in Cleveland?” Search engines are getting far better at delivering the right information to the right person. Knowing that there are fewer common standards in search engine results, content optimizers have to think about the regionalization of search.

Finding your regional audience

One piece of SEO software I really like is called Enquisite. Designed to tell users how pages within their websites rank from the point of view of search engine users in regional markets around the world, Enquisite provides extraordinary information about what ranks well where. Having used Enquisite for over a year, Metamend finds it an indispensable tool.

When we develop new content or think about making changes to existing page content, we check how the site is performing in regional search markets using Enquisite. Because search engines have become extremely good at working out where a search engine user is located, they are able to serve regionally relevant information to different users in different places. While the overall objective is high rankings for search queries everywhere, the advent of personalized, localized and “universal” search results makes us consider creating regionally specific content for the strongest markets indicated by Enquisite.

Link-seeding

Helping site visitors move from their point of entry to an essential action or a conversion is an important part of content optimization and will be fully addressed in the ninth essay in this series. To touch on it briefly: if the overall site optimization effort goes according to plan, search engine users will be able to find specific product pages on the first page of search results. That’s an optimal visitor, but a content creator also has to think about directing visitors who find their way to a page from a link on another site.

Internal links are important enough to obsess over. Designing a practical and elegant navigation path through a website is essential to gaining and retaining converting visitors. A big part of an elegant navigation path is how internal links are written and phrased, a process that also affects a search engine’s impression of the site.

Internal links should be short and, whenever possible, phrased with the keyword targets most relevant to the page the link leads to. A link reading “Health Products” is far more compelling than one reading “Green Wingnuts Products”, and it gets another mention of a target keyword phrase in an area that associates it with the page the link leads to. A similar approach should be taken to phrasing links in a sitemap file.
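In the markup the difference is simply the anchor text (the path here is a made-up example):

  <a href="/health-products/">Health Products</a>

is far more useful, to visitors and spiders alike, than:

  <a href="/health-products/">Green Wingnuts Products</a>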

Content optimization comprises the bulk of the work SEOs do on a website, but that work doesn’t stop when the initial optimization process ends. It also includes the regular creation of new pages and periodic changes to existing content. These topics will be covered in future essays in this series, most likely in the ninth and tenth articles, Conversion Optimization and Keeping It Up.

More Info on This Series

This article is part of a ten part series of essays on SEO written by search marketing experts from several unique disciplines. The series is being supplemented by a weekly show on Webcology at WebmasterRadio.fm Thursdays at 2PM eastern. Be sure to tune in or download the podcast to hear the authors talk about their takes on search marketing.

The next article in this series will address one of the most important aspects of an overall SEO campaign: Link Building. That will be in two weeks, as next week’s Webcology broadcast will be pre-empted by WebmasterRadio coverage of the SMX West conference in Santa Clara.

About the author:

Jim Hedger is a veteran SEO, a good friend, and reporter for Webmaster Radio.

Next week the topic will be site structure and will be written by Beanstalk author and Director of Optimization, Daryl Quenet. Daryl will of course be on the show with us next Thursday along with some great guests.

SEO news blog post by @ 1:30 pm on February 21, 2008

Categories: SEO Articles

Part Three of Ten: Site Structure

Welcome to part three in this ten part SEO series. The ten parts of the SEO process we will be covering are:

  1. Keyword Research & Selection
  2. Competition Analysis
  3. Site Structure
  4. Content Optimization
  5. Link Building
  6. Social Media
  7. PPC
  8. Statistics Analysis
  9. Conversion Optimization
  10. Keeping It Up

Overview

Website structure and SEO are a combination of topics that I’ve always had a particular interest in because of my background in software engineering. I have worked on or maintained over 150 corporate websites and have seen many of the things that can make a website go wrong, seriously impacting a website’s operation and search engine rankings.

Of the three pillars of SEO (structure, content, and links) I find the structure of a website to be one of the most underrated, even among search engine optimization companies. The structure of a website consists of several elements which are all interdependent: the code behind your website, how your website interlinks, and the technologies used to build it.

At this point I’m going to strongly recommend that you use Firefox with the Web Developer Toolbar installed. The Web Developer Toolbar gives you an easy way to validate your website, test your site at multiple screen resolutions, and perform around 100 other functions.

Valid Markup and Cascading Style Sheets (CSS)

I have made it a practice to develop all my projects in XHTML 1.0 Transitional (my personal preference, so I can use target=”_blank” and rel=”nofollow” attributes) or XHTML 1.0 Strict, with CSS 1.0. XHTML is a reformulation of HTML 4 as an XML 1.0 application. It is a very clean and semantic markup language which will also force you to write cleaner code. Whether you choose XHTML or HTML 4, your code will be friendly to the search engines (stay away from 3rd-party standards like IHTML).

As for Cascading Style Sheets (CSS), they give us the ability to abstract the design of a webpage or site into a secondary document. This has a lot of advantages and very few disadvantages. By removing redundant design code from your pages you place the content closer to the start of the document while improving your content-to-code ratio. It also makes it easier, and more cost effective, to maintain your website, as you can implement simple design changes by editing only one file.
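As a minimal sketch of what this looks like in practice (the stylesheet path and page content are placeholders), an XHTML page pulls its entire design in through one line in the document head:

  <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
  <html xmlns="http://www.w3.org/1999/xhtml">
  <head>
  <title>Widgets and More :: Example Page</title>
  <link rel="stylesheet" type="text/css" href="/css/style.css" />
  </head>
  <body>
  <p>Content starts here – no design tables in sight.</p>
  </body>
  </html>

Change style.css once and every page that references it updates at the same time.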

When converting a website from table-based design to pure CSS-based design there is generally around a 40% decrease in code. The reason is that when most people use tables they end up placing tables within tables within tables, each with their own attributes (height, width, border, etc). Now multiply all that redundant and unneeded markup by the number of pages on your site and you’ll quickly see how Google (or any other search engine) will be able to index your website more efficiently.

In my research and experience I have concluded that using these two technologies in conjunction is part of guaranteeing your website’s success, especially with regards to its compatibility with Google. If you do any research on this topic you will also find a recurring mantra among CSS fanatics: tables are for tabular data, not design.

You’ll find that most of the highly organically ranked SEO companies implement CSS based design on their own websites. For examples of CSS based design check out Beanstalk Search Engine Optimization, SEOMoz, and Quenet Consulting.

Website Templating

Now I’m going to start this section with a rant about Dreamweaver templates and how useless they are. As an SEO / web developer there is nothing I loathe more than seeing a Dreamweaver template. If you’re going to template a site, use a technology like Server Side Includes, PHP includes, or ASP includes. The disadvantages of Dreamweaver templates are:

  1. Embedded comments in your code can wreak havoc on keyword density tools.
  2. If you need a non-standard footer in an index file you will need to break it from the template, creating issues for future template updates.
  3. If you have a disagreement with your web developer / designer and part company, it’ll cost you if he doesn’t supply you with the template.

When building websites I personally use PHP for implementing server-side includes. PHP is a relatively easy language to learn for implementing simple things like includes. It is also one of the most popular Apache modules; as of April 2007 there were 20,917,850 domains and 1,224,183 IP addresses with it installed. PHP is also available for the Microsoft IIS (Windows Server) web server.
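A minimal sketch of a PHP-templated page – the include file names are placeholders:

  <?php include('header.php'); ?>
  <h1>Page Heading</h1>
  <p>Page-specific content goes here.</p>
  <?php include('footer.php'); ?>

Edit header.php once and the change appears on every page that includes it, with no Dreamweaver template comments cluttering the markup.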

Search Engine Friendly URLs

One thing that I can’t stress enough is to stay away from dynamic URLs – URL addresses with variables and values following the “?” character. Google used to state that it had trouble indexing sites with dynamic URLs, and to a degree this still holds true. If you are going to use dynamic URLs, always try to have fewer than 2 variables in the URL. I have seen sites with excessive products and URLs where Google / Live / Yahoo all have a different number of pages cached.

A better approach is to rewrite your URLs. On the Linux side Apache has mod_rewrite, and for Windows you can use ISAPI Rewrite. When you implement a URL rewriting system you are essentially creating a hash lookup table of URLs for your site: when a server query comes in, it checks the table for a match and then serves the corresponding entry.

To put it in simple terms, what we strive to accomplish with URL rewrites is to mask our dynamic content by having it appear as a static URL. A URL like Article?Id=52&Page=5 could be rewritten to /Article/ID/52/Page/5/, which to a search engine appears to be a directory with an index.htm (or whatever default / index page your particular web server uses). To see an implementation of mod_rewrite check out Dr. Madcow’s Web Portal in the Article Section and Link Archive.
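As a rough illustration of a rule that would accomplish exactly that mapping (the script name Article.php is assumed), the .htaccess entries for Apache’s mod_rewrite might look like:

  RewriteEngine On
  RewriteRule ^Article/ID/([0-9]+)/Page/([0-9]+)/?$ /Article.php?Id=$1&Page=$2 [L]

The visitor and the spider both see the clean /Article/ID/52/Page/5/ address while the server quietly serves the dynamic script behind it.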

Dynamic Websites and Duplicate Content

One recurring theme I see in a lot of dynamic websites is that they can present the same content on multiple pages. An example of this is when a website lets you “view a printer friendly version of this page”; a better implementation is a printer-friendly Cascading Style Sheet.
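A minimal sketch of that approach – the file name is a placeholder – is one extra line in the document head, which serves a separate stylesheet to printers without ever creating a duplicate URL:

  <link rel="stylesheet" type="text/css" media="print" href="/css/print.css" />

The print stylesheet can then hide navigation, ads and anything else that doesn’t belong on paper.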

Another goal is to avoid creating additional URLs on your site, such as links that change currency via a redirect script, links to “Email to a friend” pages, or anything similar. Always use forms to POST data like this so the action resolves to the same page, or to a single static page, keeping your page count down. This issue seems to plague a lot of custom-developed ecommerce systems and CMSes. I’ve actually seen CMSes that present up to 5 URLs / links for each page; in the long run the spiders got so confused indexing the catalog that some of the main content pages were never cached.

Internal Site Navigation

If built properly, most websites will never need an XML sitemap for anything other than getting new pages indexed that much quicker (ecommerce and enterprise sites being exceptions). I will, however, recommend that every website have a user-accessible sitemap linked from every page, both to aid your users and for internal linking.

Most sites with indexing problems have issues with their internal page linking structure. The biggest of these issues are websites that implement a pure JavaScript-based navigation system; these systems depend on JavaScript to insert HTML into pages as they’re rendered. Now, Google can parse JavaScript menus to find URLs, however all of those pages will only be linked from the JS and not from the pages they’re located on (expect no internal PageRank passing). The best JavaScript menus are those that manipulate the code already on your page, changing which sections are displayed via CSS. An example of a hybrid CSS / JavaScript menu that I like is QuickMenu by OpenCube (these guys have a great support department).

Keep in mind the more internal links you have to a page, the more internal strength that page will be given. So when in doubt, link it up.

Testing Your Site Structure

When it comes to reliably deploying a website all I can say is “Test It, Test It, and then Test It Some More”. When testing structure I rely on 3 different programs / Firefox extensions. The first is Xenu Link Sleuth, a great tool to run on your website to figure out how many pages can be spidered and to find dead links. The second is the Web Developer extension for Firefox – make sure you always validate your code when you make changes. And the last is to consult Google and Yahoo to see how many pages are in their index compared to how many pages Xenu found; on Yahoo or Google type site:www.yourdomain.com (don’t use Live’s site: function – it is useless).

After you’ve finished testing your code, if you need to debug it I strongly recommend the Firebug Firefox extension and the IE7 Developer Toolbar.

Conclusion

When trying to maximize your organic rankings your internal structure is paramount; consider your site structure to be the foundation of your house. If the foundation is not built adequately the house may be livable, but it may have long-term issues. With websites, the long-term issue will be a failure to maximize the ROI of your website, so practice safe and smart structure.

SEO news blog post by @ 1:00 pm on February 14, 2008

Categories: SEO Articles

Copyright© 2004-2014
Beanstalk Search Engine Optimization, Inc.
All rights reserved.