
Ten Steps To A Well Optimized Website – Step Eight: Link Building

Welcome to part eight in this search engine positioning series. Last week we discussed website submissions. In part eight we will be covering the importance of link building and developing inbound links to your website.

This is arguably one of the most important aspects of the SEO process and can mean the difference between first page rankings and 100th. It has to be done right and it has to be done on an ongoing basis.

Over this series we will cover the ten key aspects to a solid search engine positioning campaign.

The Ten Steps We Will Go Through Are:

  1. Keyword Selection – October 24, 2004
  2. Content – October 31, 2004
  3. Site Structure – November 7, 2004
  4. Optimization – November 14, 2004
  5. Internal Linking – November 21, 2004
  6. Human Testing – November 29, 2004
  7. Submissions – December 5, 2004
  8. Link Building – December 12, 2004
  9. Monitoring – December 19, 2004
  10. The Extras – December 28, 2004

Step Eight – Link Building

Link building: it’s pretty much understood that this is a critical component when you’re trying to attain top search engine positioning; however, the confusion enters when it’s time to decide exactly what you should do.

From talk about reciprocal link building one might come to believe that this is the golden egg of SEO. While reciprocal link building can definitely be beneficial to your rankings, it is far from the only or even the best method. In this article we will cover the following link-building tactics:

  • Reciprocal link building
  • Directory listings
  • Non-reciprocal link building tactics
  • Tools to maximize your efforts

And so, without further ado …

Reciprocal Link Building

Reciprocal link building is the trading of links between two websites. Essentially it’s an “I’ll post yours if you’ll post mine” sort of arrangement. There are many sites out there that will essentially link to any-and-all sites willing to link to them. This is not a good practice.

While purely speculation at this point, there is significant debate in the SEO community regarding how search engines might be altering their algorithms to take into account a Webmaster or SEO’s ability to manipulate rankings with reciprocal links. Whether or not these speculations are true currently, such factors are most certainly being integrated if they have not been already.

Essentially, the search engines need to protect themselves and provide relevant results to their users. While inbound links are certainly here to stay as part of search engine algorithms, the way these links are weighted changes constantly, both in reaction to the current environment and in anticipation of future developments. The way we build them must evolve accordingly.

There are some basic rules to follow when exchanging links:

  • Relevancy is more important than PageRank
  • Check and make sure the recips aren’t being blocked
  • Link pages with more than 50 links aren’t worth exchanging with
  • Prepare for the future

Relevancy

Many Webmasters focus only on the PageRank of a website when deciding whether to exchange links with it. Without a doubt PageRank is important; however, more important is whether or not that website’s content is related to yours. There are two reasons for this:

  1. The algorithms are changing to take into consideration the relevancy of links. A link from a relevant PageRank 3 page will be considered more valuable than a PageRank 5 link from a totally unrelated site. Some predict that unrelated links will soon be given little or no weight whatsoever.
  2. Believe it or not, Google is not the only search engine. PageRank is Google’s ranking of the value of a site. What Google gives a 3 out of 10, Yahoo! may give more weight to.

Basically, after a series of tests we have determined that links to related sites will never hinder your rankings. With this in mind feel free to link to any site you think your visitors would naturally be interested in if they are at your site.

Blocked Recips

Unethical website owners (or their SEOs) will sometimes block the links back to you from search engine spiders. Whether this is done in an effort to attain what appear to be one-way links as opposed to reciprocal ones, or simply to make their website appear to have fewer outbound links, it is not ethical and it certainly won’t help you.

When you’re looking at a potential link exchange page, check the source code for the robots tag. If it’s set to “noindex,nofollow” then the page is being blocked and the link won’t help at all.
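This source-code check can be automated. Below is a minimal sketch in Python; the function name and the regex-based parsing are my own illustration, not a tool mentioned in the article:

```python
import re

def page_is_blocked(html: str) -> bool:
    """Check a page's source for a robots meta tag that blocks indexing
    or link-following (the "noindex,nofollow" case described above)."""
    # Scan every meta tag, case-insensitively, regardless of attribute order
    for match in re.finditer(r"<meta[^>]+>", html, re.IGNORECASE):
        tag = match.group(0).lower()
        if 'name="robots"' in tag or "name='robots'" in tag:
            if "noindex" in tag or "nofollow" in tag:
                return True
    return False
```

A real check would fetch the candidate links page first and pass its HTML to this function.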

Some wiser webmasters will use the robots.txt file to block search engine spiders. If you look for robots.txt at the root of the domain (i.e. at http://www.domaininquestion.com/robots.txt) you will see the files/folders that are being blocked. Look for the links pages and/or the directory these pages are in, in this list. If you find it, then don’t exchange links with them.

A newer trick I’ve recently found along these lines is to draw the links from a script and block the script and database folders from the search engines. The links pages themselves won’t show up in the robots.txt exclusion list, but the links won’t be counted. The easiest way to detect this is to view the cache of the page. If the page is cached but none of the links appear, and the script directory is listed in the robots.txt file, then this tactic is being used. Again, don’t bother exchanging links.
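Python’s standard library can perform the robots.txt check described above. This sketch (the function name and the example rules are hypothetical) reports whether a given links page is blocked from crawlers:

```python
from urllib.robotparser import RobotFileParser

def links_page_blocked(robots_txt: str, links_page_url: str) -> bool:
    """Given the text of a site's robots.txt, report whether a links page
    (or the script directory that generates it) is blocked from spiders."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    # "*" stands in for any crawler; a thorough check might also test
    # specific user-agents such as "Googlebot"
    return not parser.can_fetch("*", links_page_url)

# Example robots.txt of the kind described in the article
robots = """User-agent: *
Disallow: /links/
Disallow: /cgi-bin/
"""
```

Here `links_page_blocked(robots, "http://www.example.com/links/partners.html")` would flag the page, so an exchange with that site would be pointless.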

If you find Webmasters employing any of these tactics they are unethical. Unethical Webmasters shouldn’t be rewarded with high PageRanks or good results. If you have the time and inclination you may want to email those websites listed on the page (heck, they may be good recip link partners anyway) and let them know what’s going on. You’ll be doing them a favor and they’ll probably be happy to exchange links with you as well.

Link Pages With More than 50 Links

Webmasters who are trying to actually do their link partners a favor will limit their links pages to 50 links (the lower the better). The reason is that every page gets one vote, and a link to another website counts as a vote for that site; this is why it can help improve rankings. As each page only gets one vote, a link from a page with 10 links counts as 0.1 of a vote, whereas a link from a page with 100 links counts as 0.01 of a vote. Anything past about 100 links is not counted at all.
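The one-vote model above works out to simple arithmetic. A sketch (the cutoff at 100 links mirrors the article’s estimate, not any published algorithm):

```python
def link_vote_value(outbound_links: int) -> float:
    """Approximate share of the vote a single link passes, under the
    simple model above: one vote per page, split evenly among its links.
    Pages with more than ~100 links are treated as passing nothing."""
    if outbound_links <= 0 or outbound_links > 100:
        return 0.0
    return 1.0 / outbound_links
```

So a page with 10 links passes roughly 0.1 of a vote per link, while a page with 100 links passes only 0.01, which is why shorter links pages are better trade partners.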

Additionally, the higher up on a page your link appears, the more weight it is given. If the page lists sites alphabetically, try to ensure that your title begins with a number or a letter early in the alphabet (which works well for companies like “Beanstalk Search Engine Optimization”).

Prepare For The Future

Just because a rule applies today does not mean that it will tomorrow. This is as true of on-site SEO as of link building. If generic reciprocal links work today, consider whether you believe it’s in the best interest of your targeted engine to keep it that way. As the answer will undoubtedly be “no”, it’s in your best interest to take the extra time to build links that will still be valuable months and even a year from now, saving yourself a drop in the rankings and additional work later.

Directory Listings

Having your website listed in quality directories is perhaps one of the most valuable things you can do for it in regards to inbound links. Directories like DMOZ and Yahoo! hold significant weight. Google draws its directory results from DMOZ and Yahoo! draws its directory results from, well, Yahoo!. These links are given a lot of weight.

Make sure that you submit your website to both of these directories and, if it’s not listed a couple of months down the road, try again (you may want to try a slightly different category if a relevant one exists, as you may have hit one of the many overworked editors who’s fallen behind).

Aside from these two there are literally thousands of other directories out there. Look for others and submit your site. Some may charge a fee. If this is the case, take a look at the page your site would be listed on, take a look at the PageRank and the number of outbound links on the page, and determine whether it’s worth the price. I’ve seen directories charging $10 for a permanent PageRank 5 link on a page with 3 other outbound links (though this number is certain to grow over time). Well worth the $10 investment.

You can find many great directories using search engines and, of course, the major directories. For example, were I looking for topical directories, a great place to start would be http://directory.google.com/Top/Reference/Directories/ in the Google directory.

Non-Reciprocal Link Building Tactics

There are a number of other tactics for building non-reciprocal links. Here we will outline three of the most popular:

  1. Articles
  2. Press Releases
  3. Paid Links

Articles

Writing articles is a great way of getting inbound links and generating quality traffic. Articles give you the opportunity to control the content on the linking page meaning that you can guarantee that it is totally relevant, it’s a one-way link, and it’s a link that you’ll actually get traffic from.

Let’s assume that you run a small computer shop. Why not write an article about how to troubleshoot a common Windows problem (no no, it’s true … Windows can be a bit buggy every now and then). The next step is to simply find places to submit your article to and do just that. From experience I would highly recommend keeping a list in your favorites of the sites you submit to. If you decide to publish another article you probably don’t want to have to find them all from scratch again.

If you were looking for places to submit to, you would run searches on the major search engines for “my topic articles” (in this case, searches for “windows errors articles” and “computer troubleshooting articles” would be great places to start). If you find that a lot of the results only post their own articles, you may want to add the word “submit” to the search string.
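Building those search strings is mechanical enough to script. A trivial sketch (the function name is my own):

```python
def article_submission_queries(topics, include_submit=False):
    """Build the search strings suggested above: '<topic> articles',
    optionally adding 'submit' to filter for sites that accept
    outside article submissions."""
    suffix = " articles submit" if include_submit else " articles"
    return [topic + suffix for topic in topics]
```

For the computer-shop example, `article_submission_queries(["windows errors", "computer troubleshooting"])` yields the two starting searches from the paragraph above.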

Press Releases

Press releases are another great way to attain one-way inbound links. If you have news that you feel is worth telling, submit a press release about it. While you’ll probably want to manually submit your release to the key online publishers, services such as PRWeb exist to submit your press release to a large audience at a very reasonable price.

Like articles, if the news is good you’re likely to get quality traffic from a press release and on top of that, you are likely to get some good, related links to your website.

Paid Links

Paid links are links from other websites purchased solely for the value of the link rather than for direct clicks. Paid links have become so popular that auction sites have sprouted up for just this purpose and they can even be bought on eBay.

There is no particular problem with paid links per se; however, I would recommend applying the same criteria that you would to reciprocal links. If you are going to purchase links, only purchase them from related sites and try to make sure the link is not buried down at the bottom of the page.

Run-of-site links (links that appear on every page) are not significantly more valuable than single links on the homepage other than for the traffic. If you’ve purchased a link in a good location and on a good site you’re likely to get some good traffic from it. In fact, this is the general rule I go into any paid link arrangement with – purchase the link for the traffic. If the link increases my PageRank it’s a great bonus but if I’ve bought the link for the traffic and I’m getting it, then the link value becomes secondary.

Link Building Tools

Because link building has become so important to search engine positioning, a number of great tools have been developed to help in the process. While I couldn’t possibly list them all here, there have been two developments by a company named TopNet Solutions that have truly impressed me and which are the only tools that I use in every link building campaign.

PR Prowler
PR Prowler from TopNet Solutions searches the web based on your specific criteria providing results with a minimum PageRank that you determine. A very handy tool for your link-building efforts.

Total Optimizer Pro
When we first purchased PR Prowler we thought we’d found the ultimate link building tool. That was, until we found Total Optimizer Pro. Made by the same folks who put out PR Prowler, this tool rips apart your competitors’ backlink profiles and tells you everything there is to know: the anchor text used to link to them, the PageRank distribution of their incoming links, and much more.

If you have any questions about these tools or how they are used feel free to contact us. I’m happy to answer any questions that you might have.

Next Week

Next week in part nine of our “Ten Steps To an Optimized Website” series we will be covering the importance of monitoring. This isn’t simply checking the rankings of your primary phrase every now and then but a scheduled check of all the key components of your optimization and search engine positioning efforts.

SEO news blog post by @ 5:35 pm on December 20, 2004


 

Climbing the Beanstalk December 14, 2004

Welcome to the December 14, 2004 edition of “Climbing The Beanstalk”, the bi-weekly newsletter on search engines and search engine positioning from Beanstalk. In this edition we will discuss the most recent development in the battle between Google and MSN for search dominance, the recent articles published by Beanstalk staff as well as some special tips on link building not included in our most recent article (Hey, we like to save a few special tips for our loyal newsletter subscribers).

If you have any questions regarding any of the areas covered in this newsletter please don’t hesitate to contact us.

Here it comes! Here it comes! Darn – Not yet.

There comes a time in a person’s life when they have to come to terms with who they are. Monday the 13th was that day for me. Last week I read a press release from Microsoft announcing a teleconference this Monday morning. The anticipation over the weekend was high: MSN was going to be announcing the official launch of their new search engine. I’ve obviously been tracking our clients’ results on the MSN beta site and they’re doing very well.

And then Monday morning came around. I woke up before my alarm clock and had hours to wait before the teleconference. I ransacked the MSN beta engine doing some final searches to ensure everything would be all good in the world of my clients, and the results were even better than they had been previously.

And then the conference started. The VP was introduced and the announcement came: MSN had developed a new toolbar and was putting it out in beta at http://beta.toolbar.msn.com/. I was incredibly disappointed, and that’s when I had to admit to myself … somewhere along the way I had become a geek (and a disappointed geek at that).

Nonetheless, the still-to-come announcement that the MSN search engine is going live will be enormous news for the search engine world, and Monday’s conference at least hinted that we won’t have too long to wait. The new toolbar includes a desktop search function (borrowing a feature from Google’s Desktop Search). The biggest thing from my perspective was the fact that the toolbar search function does not give results based on the current MSN listings but rather on the beta search from http://beta.search.msn.com/. That they’re choosing to pull results from their own engine as opposed to the results they’re presenting currently on msn.com gives a clear indication that the launch of the new engine is coming sooner rather than later.

With everything that’s going on over at Microsoft these days one would expect Google to be worried; however, they’re definitely holding their own in the product development and announcements category. Launching targeted engines such as Google Scholar, which allows people to search only scholarly literature (a great tool for those students and researchers who are aware of its presence), Google is sticking pretty much announcement-to-announcement with MSN.

Will this hurt the launch of the new MSN search engine? Perhaps. My guess would be that it will only hurt MSN in the short term and really only in the eyes of Google investors. Whether MSN grows to be “the king” or not remains to be seen however I’d put my dollar on it taking a big chunk of Google marketshare over the next couple years.

If you haven’t guessed, this means: learn the new algorithm and get ready to optimize for MSN. It already holds a large marketshare that promises to increase. It holds the potential to generate significant revenue and it’s best not to realize that after all your competitors have.

Recent Search Engine Positioning Articles

Beanstalk Search Engine Optimization has recently had two of its articles picked up by WebProNews, ISEDB, and an assortment of other SEO resource sites. These are recommended reading for anyone interested in attaining high rankings.

Website Submissions

With services offering to help you get more traffic and higher search engine positioning by submitting your website to “18 Bazillion Search Engines For Just $19.95 Per Month!” and other such claims, there has grown much confusion around website submissions. In this article we will clear up many of the misconceptions around submitting your website and may even save you “Just $19.95 Per Month!” in the process … <more>

Link Building

Link building: it’s pretty much understood that this is a critical component when you’re trying to attain top search engine positioning; however, the confusion enters when it’s time to decide exactly what you should do … <more>

Anchor’s Away!

Step eight of the ten step series currently being published by Beanstalk is on link building. We believe that our loyal subscribers deserve just a little bit more than the average surfer, and so we held a bit back just for you. What we held back is the portion on anchor text. This is one of the most important aspects of your link building campaign.

Everyone seems to know that it’s a good idea to use your keywords in the links to your site but there’s a heck of a lot more to know than that. If you are working on link building then be sure to follow these rules to maximizing the effect of your efforts.

Rule 1 – Keep the link text to a minimum. I’ve seen anchor text that’s 10 and sometimes even 20 words long. This diversifies your keyword focus so much you won’t do as well for any one of them. Focus your energies on a single phrase (or perhaps two at most provided they are related). An example of two related phrases would be “search engine positioning” and “guaranteed search engine positioning”.

Rule 2 – Mix up your link text. Some people send the same link information to everyone. It’s a better idea to at least mix up a couple of words. Instead of using “search engine positioning” every time, I would add the word beanstalk or guaranteed or services (to fit other keyword targets). Using the same phrase every time will get picked up as over-optimization. If you think about it, naturally occurring links are not phrased exactly the same every time, and neither should the links you request.
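Rule 2 can be sketched as a small helper (the function name and modifier words are illustrative, taken from the example phrases above):

```python
def vary_anchor_text(base_phrase, modifiers):
    """Generate varied anchor texts from a base phrase plus modifier
    words, so requested links don't all use the identical string."""
    variants = [base_phrase]
    for word in modifiers:
        variants.append(f"{word} {base_phrase}")
    return variants

anchors = vary_anchor_text("search engine positioning",
                           ["beanstalk", "guaranteed"])
```

Sending each link partner a different entry from `anchors` keeps the keyword focus while avoiding the identical-phrase pattern.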

Rule 3 – Use solid descriptions and make sure to use the targeted phrase in them. Not required, but a handy tactic, is to include your URL (i.e. www.beanstalk-inc.com) in the description. It won’t count as a link, but you can then run searches for your URL and find all the links you’ve set up. It also makes it easier to find out which sites have removed their links to your site.

If you follow these rules in addition to those outlined in the link building article referenced above you will do well in your link building efforts.

Thank You

Thank you very much for subscribing to “Climbing The Beanstalk”, the bi-weekly search engine positioning newsletter. If you have any questions about the areas covered or if there are any areas of search engine positioning that you would like to see covered in future articles/newsletter please don’t hesitate to contact us. We want to write what you want to know.

SEO news blog post by @ 1:03 pm on December 14, 2004

Categories: SEO Newsletters

 

Ten Steps To A Well Optimized Website – Step Seven: Website Submissions

Welcome to part seven in this ten-part search engine positioning series. Last week we discussed the importance of human testing. In part seven we will cover the best practices of website submissions, where to submit your website to, and how to do so.

With services offering to help you get more traffic and higher search engine positioning by submitting your website to “18 Bazillion Search Engines For Just $19.95 Per Month!” and other such claims, there has grown much confusion around website submissions. In this article we will clear up many of the misconceptions around submitting your website and may even save you “Just $19.95 Per Month!” in the process.

Over this series we will cover the ten key aspects to a solid search engine positioning campaign.

The Ten Steps We Will Go Through Are:

  1. Keyword Selection – October 24, 2004
  2. Content – October 31, 2004
  3. Site Structure – November 7, 2004
  4. Optimization – November 14, 2004
  5. Internal Linking – November 21, 2004
  6. Human Testing – November 29, 2004
  7. Submissions – December 5, 2004
  8. Link Building – December 12, 2004
  9. Monitoring – December 19, 2004
  10. The Extras – December 28, 2004

Step Seven – Website Submissions

While there are definitely more critical areas of the website optimization process, there is perhaps no area subject to as much misinformation aimed at such a vast audience. Here are some common misconceptions about search engine submissions:

  1. You need to submit your website often to keep it indexed by the search engines
  2. You need to submit your website to thousands and thousands of search engines to get decent traffic
  3. Submitting your website often will keep you at the top of the search engine rankings

These beliefs are all incorrect, perpetuated by those who can make a quick buck selling this disservice. If you have not recently received an email offering to “Submit Your Website To More Search Engines Than There Are Websites On The Internet For Just $19.95 Per Month!” then I can pretty much guarantee that you will in the not-too-distant future, if your email address can be found somewhere on your website.

An irony of this can be found in Google’s webmaster area where they note:

Amazingly, we get these spam emails too:

“Dear google.com,

I visited your website and noticed that you are not listed in most of the major search engines and directories…”

Reserve the same skepticism for unsolicited email about search engines as you do for “burn fat at night” diet pills or requests to help transfer funds from deposed dictators.

Good advice as I’m sure Google has their website submissions taken care of. Just because you receive such an email, doesn’t mean that you’re missing out on anything. Let’s first look at a breakdown of which engines are responsible for which traffic.

According to research, the major search engines were responsible for the following percentages of traffic as of June 2004:

  • Google – 41.6%
  • Yahoo! – 31.5%
  • MSN – 27.4% (MSN draws their results from Yahoo!/Overture)
  • AOL – 13.6% (AOL draws their results from Google)
  • Ask Jeeves – 7.0%
  • Lycos – 3.7%
  • Netscape – 3.0% (Netscape draws their results from Google)
  • AltaVista – 2.7% (AltaVista draws their results from Yahoo!/Overture)

Source: Nielsen//NetRatings

Note: These numbers total over 100% as people may use multiple search engines if they don’t find the information they are looking for at the first one they try.

So what does this tell us? It tells us that the vast majority of search engine traffic comes not from many thousands of search engines but from relatively few. This leads to the obvious question: “Is it worth paying to be submitted to thousands of search engines?” The real answer: “No.”

Then How Do I Submit My Own Website?

Automated search engine submission systems simply access the existing and readily accessible “Add URL” pages of the search engines and automatically submit your site. You can do this yourself simply by visiting the search engines and submitting through these same pages.

To simplify this process you can visit the “search engines” page of the Beanstalk Search Engine Optimization website where we link directly to the submissions pages of the major engines.

But What About The Other Engines? Surely They Provide Some Traffic?

Quite honestly, they may. You may get a visitor or two. Is it worth $19.95/mth or some such amount? No. You can get a better dollar-per-visitor ratio on any of the many PPC engines out there.

An additional point to note is that you may want to actually visit some of the lists of engines on the sites offering these services to you. You will discover a couple of important facts:

  • Many of these so-called “search engines” are not engines at all but rather FFA (Free-For-All) pages and classified ads sites. They will not help your rankings, you will not see traffic from them and your listing will probably last about as long as spam in your Inbox.
  • Many of the actual search engines and directories are topical. What this means is that they are focused on a single area and unless your site coincidentally is about space exploration, topographical mapping, etc. you won’t get listed. Submitting should not be confused with “guaranteed listing”. Submitting your site to thousands of engines is not the same as getting your website indexed on thousands of engines.

The Submission Myth

The truth of the matter is that submitting your website at all can realistically be considered a waste of time. Aside from a few key general directories (DMOZ, Yahoo!, etc.) and a number of SEO directories, we did not submit the website www.beanstalk-inc.com to any of the major search engines. It’s true: not a single submission.

Are we indexed? Yes we are.

How did we get indexed without submitting our site? If you take the time that you would be spending submitting your site and spend it instead finding quality inbound links (which we will write about next week) your site will be indexed and much quicker than you think.

You’ve probably heard the term “search engine spider”. Search engines crawl websites: they visit a page, follow all the links on that page, and so on. If you have a link on a website that is already known to the search engines, it is only a matter of time before your website is found. In fact, when the Beanstalk site went live and the first link was established to it, it did not take the weeks estimated on the engines’ submission pages for our site to be found. The homepage of beanstalk-inc.com was indexed by Google three days after the site went live and the other major engines followed within a week or so.
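The crawling behavior described above can be illustrated with a breadth-first traversal over a toy link graph (the domains are made up, and a real spider fetches pages over HTTP rather than reading a dict):

```python
from collections import deque

def crawl(link_graph, start_page):
    """Breadth-first 'spider' over an in-memory link graph: visit a
    page, follow all of its links, and repeat until nothing new is
    found. Returns pages in the order they were discovered."""
    seen = {start_page}
    queue = deque([start_page])
    order = []
    while queue:
        page = queue.popleft()
        order.append(page)
        for linked in link_graph.get(page, []):
            if linked not in seen:
                seen.add(linked)
                queue.append(linked)
    return order

# A known site links to a newly launched one; the spider finds it
# without any submission ever taking place.
web = {
    "known-site.com": ["known-site.com/links", "new-site.com"],
    "known-site.com/links": ["new-site.com"],
    "new-site.com": [],
}
```

Running `crawl(web, "known-site.com")` discovers `new-site.com` purely by following links, which is the mechanism behind getting indexed without submitting.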

Final Notes

If there are any points that I hope you take away from this article they are the following:

  1. Automated search engine submissions services are not worth the money they charge.
  2. You do not need to be submitted to thousands of “search engines”. The vast majority of traffic comes from the top few.
  3. You will want to consider whether it is even worth the time to submit to search engines or whether that time could be better spent building quality, relevant links to your site and submitting your site to the major and topical directories.

An additional failing of the automated submission systems not covered above is their inability to take into consideration the exact characteristics of your website when making directory submissions. When you’re submitting your website to directories you will have to choose the exact category your site falls into. Most directories have slightly different category hierarchies and the more exact you are in your submission, the higher the chance you will be listed. Automated systems can never be as exact across multiple directories as a human can.

Submitting your website, even correctly, will not guarantee you top rankings however it will leave you with money in your pocket to spend on other promotional endeavors that may actually produce a solid ROI. And THAT’S what it’s all about.

The rankings? You’ll have to read the other nine steps of the series to find out how to attain those.

Next Week

In part eight of this search engine positioning series we will cover the importance of link building, how to attain high quality, relevant links to your website, and the tools to reduce the time it takes to do so significantly. With the importance of inbound links to your overall rankings you won’t want to miss this very important step in the website optimization process.

SEO news blog post by @ 5:26 pm on December 9, 2004


 

Climbing the Beanstalk December 1, 2004

Welcome to the December 1, 2004 edition of “Climbing The Beanstalk”, the bi-weekly newsletter on search engines and search engine positioning from Beanstalk. In this edition we will explore some of the more useful tools out there to help you with your SEO, some recent articles from the Beanstalk staff on how to optimize your website, as well as discuss the ongoing battle between Google and MSN for search dominance and the innovations this is leading to.

If you have any questions regarding any of the areas covered in this newsletter please don’t hesitate to contact us.

Tools Of The Trade

Just as there are tools to help a carpenter build tables there are tools to help SEOs build rankings. In fact, there are literally hundreds of them of varying degrees of usefulness. Through an understanding of what to look for and through experimentation one can determine what characteristics are common among all the best tools to help you attain higher search engine positioning.

What To Look For In An SEO Tool

There are five main characteristics of a good SEO tool. They are:

  1. The tool should make suggestions and not changes. Any tool that automatically applies changes to your site and/or pages should not be used. If you don’t know what’s going on with your site you stand a very real chance of hindering your rankings more than helping. The worst part is, you may never even know that you did so.
  2. The tool should not be used to create doorway pages. Doorway pages are pages built solely to attain rankings for specific keywords and not to provide value to the visitor. This is a common and yet banned technique that is unlikely to work and more likely to get your website banned (if not by automatic detection, then when one of your competitors notices and reports you).
  3. Never use an automated search engine submission service. You do not need to submit your site once a month and no, this will not help you with your rankings. In fact, with some quality links to your site you don’t need to submit your site at all; it will be found on its own. We have submitted websites and do so as part of our service; however, when we were putting our own site up we decided to test how long it would take to be indexed by the major search engines without being submitted. Rather than submitting our site we took that time to build some quality inbound links. It took 3 days from the day the site went live to first appearing on Google and only a few days longer to be found on Yahoo! and MSN.
    Some might say, “But what about the other thousands of engines that the automated systems will submit my site to?” To those I would pose the question: Which search engine do you use? I would guess that it’s not one of the other “thousands of engines” but rather one of the top 5. You can visit our search engines page and find the add url links to the major search engines. This will get your site listed on the engines that account for over 95% of all search traffic (with the exception of the PPC engines, which you won’t get top placements on no matter which submission service you use unless you bid per click).
  4. A good tool will address issues that are of key importance to the search engines today. Doorway page generators, meta tag builders, etc. do not fit this category. Tools that help you find quality link partners, keyword density analyzers, and reporting tools that simply make your life easier do. Please note: tools to find quality link partners should not be confused with tools that automate links pages. You should look at every single site you exchange links with and only link to those that are relevant to your content. If your visitors wouldn't be interested in visiting the site, don't link to it.
  5. A good SEO tool will have positive feedback from credible sources. Look for testimonials on the site and on the search engines. Remember, on site testimonials may or may not be legitimate. Look for information on the tool on other sites and see what others have to say. If in doubt, search engine forums such as those at WebProWorld and SEOChat give you the opportunity to ask questions and get others’ opinions of the product.

Some Of The Tools We Recommend

There are some tools that we have found very useful indeed. On our site we give a bit of information on what they do and why they are useful. Some of the key SEO tools that we have found useful in our travels are:

There are certainly other tools out there, however these are the ones we use, and so they're the ones we recommend.

Recent Search Engine Positioning Articles

Beanstalk Search Engine Optimization has recently had two of its articles picked up by WebProNews, ISEDB, and an assortment of other SEO resource sites. These are recommended reading for anyone interested in attaining high rankings.

Human Testing

The most important part of your website is how it reaches the visitor. You have taken all the steps to create a great design and added SEO elements to your site; you have created the perfect online presence. Now it's time to see whether all that hard work has attained the main goal: to reach visitors and steer them in the direction you most desire … <more>

Internal Links

When you’re about to launch into your link work, why not stop and consider the links that are easiest to attain and maximize first: the ones right there on your own site, over which you have total and complete control. Properly used internal links can be a useful weapon in your SEO arsenal … <more>

Let The Games Begin!

The search engine wars have been going on for years, yet it seems as though the players were just getting warmed up in preparation for the largest battle in search engine history. With the threat of an upcoming MSN search launch, and a few pennies in its coffers from its IPO, Google has been working hard to “one up” MSN.

Google – It is most certainly not a coincidence that Google chose the date of the MSN beta search launch to announce the doubling of its indexed web pages (taking the number from a bit over 4 billion indexed pages to over 8 billion), thus overshadowing the MSN announcement.

Google has since made a number of announcements, with many more sure to come. With search engines now designed for specific sectors, including the recently launched Google Scholar for scientists, and the predicted but not yet announced Google Browser, Google is positioning itself for a major battle to hold its search engine dominance. (The belief that Google will be launching its own browser is reinforced by its registration of the domain gbrowser.com, though it is currently not being used.)

MSN, on the other hand, has put out little information regarding the launch of its new engine. It currently draws results from Yahoo!/Overture and has not said when this will change, though some predict it will be sometime this month. Based on how the beta engine has been received this would be a logical decision, as it has earned widespread kudos for dealing with many of the issues Google’s results present, including the pervasiveness of spam.

MSN holds a very strong position in this battle, as it has said it would integrate the search engine right into its next operating system. This would give it immediate access to 85% of computer users, something Google can only dream about.

Predictions

Who will win this war is a hotly contested subject among SEOs and Internet enthusiasts. The question is not simply whether Google will lose market share, but how much, and whether it will even be a serious contender in a few years’ time.

Were I to hazard a guess, I would put my money on MSN search, based on the results I have seen and Microsoft’s ability to virtually dominate whatever market it enters. But then, Google may very well have a few surprises left. If MSN delays its launch past January 2005 it may find itself in a significantly weakened position, having given Google, with the billions in its war chest, months to develop results, products and services that further entrench it as the only “viable solution” in search.

Thank You

Thank you very much for subscribing to “Climbing The Beanstalk”, the bi-weekly search engine positioning newsletter. If you have any questions about the areas covered, or if there are any areas of search engine positioning that you would like to see covered in future articles/newsletters, please don’t hesitate to contact us. We want to write what you want to know.

SEO news blog post by @ 2:02 pm on December 1, 2004

Categories:SEO Newsletters


Ten Steps To A Well Optimized Website – Step Six: Human Testing

Welcome to part six in this search engine positioning series. Last week we discussed the importance of internal linking. In part six we will cover the obvious and yet often overlooked importance of your website’s appeal to a real, live human being.

While not directly related to SEO, this step is so often overlooked in the quest for higher search engine positioning that it has become a fundamental part of our ten-step series.

Over this series we will cover the ten key aspects to a solid search engine positioning campaign.

The Ten Steps We Will Go Through Are:

  1. Keyword Selection – October 24, 2004
  2. Content – October 31, 2004
  3. Site Structure – November 7, 2004
  4. Optimization – November 14, 2004
  5. Internal Linking – November 21, 2004
  6. Human Testing – November 29, 2004
  7. Submissions – December 5, 2004
  8. Link Building – December 12, 2004
  9. Monitoring – December 19, 2004
  10. The Extras – December 28, 2004

Step Six – Human Testing

The most important part of your website is how it reaches the visitor. You have taken all the steps to create a great design and added SEO elements to your site; you have created the perfect online presence. Now it’s time to see whether all that hard work has attained the main goal: to reach visitors and steer them in the direction you most desire.

First things first: now’s the time to check for the careless errors that happen along the way, things like spelling mistakes, paragraph breaks, incorrect wording, etc. Once you have given your new beauty a once-over, pass it around and get others to do the same, preferably people who have never read the content before. The problem with relying on yourself to proofread is that you already expect what you are going to see and do not read the text in its entirety the way someone would at first glance.

Once the text is out of the way, have some fresh eyes take another look at the site. Are there images they find appealing, unappealing, distracting? Is there anything in the layout of the content that is too busy or confusing? Once you’ve checked the visual appeal of the site, move on to navigation.

When having someone test your site navigation it is again very important to use fresh eyes. Make sure these people have no idea what to expect or where to find anything; this way they will be free to follow your beautifully laid out website, or fumble and stumble into some dark hole of your site, lost and screaming for help. Okay, perhaps that is the worst-case scenario, but how many of us can say we have never been in that horrid place? These human testers will be sure to let you know just how your site navigation works for them. They are the average visitor, and if they find what they are looking for easily then you can congratulate yourself on your great intuition and move on to the rest of the tests to come. If there are problems in the navigation, I cannot stress enough how important it is that you address them immediately. You must get the desired information across as easily and quickly as possible.

While on the topic of navigation, let’s discuss the different possibilities for the placement of your main navigation. The majority of sites out there have their main nav either on the left or at the top of the page. Is one better than the other? They both have their perks; either is good, and anything else is bad, because the majority of visitors look in these two places to navigate out of sheer habit.

There will be other navigation elements throughout your site that are not listed in your main navigation area. These internal text and image links should be well placed and easily followed IN BOTH DIRECTIONS. It’s great to give visitors the option to check out information deeper in your site, but you really want to be sure they can get back to where they came from, especially if you are sending them off to information and away from the product pages. Ways to achieve this are to have the information open in a new window, add a “back to previous page” link, or add breadcrumb navigation. What you choose will depend on the overall structure of your site as well as its size. If the main nav includes all of your pages (as on some small sites) then there is no need for these extra nav elements; on larger sites, however, it is easy for a visitor to get lost if the navigation has not been tried, tested, and designed specifically for ease of use.

All in all, play with the navigation and test and retest it until there are no problems. Site navigation is so very important: your visitors MUST be able to browse through your site easily and without frustration.

The placement of your content is equally important. If you are selling something you obviously want it offered as easily as possible, and you don’t just want it to be available: you want to sell it. There are many ideas to consider when deciding on the placement of certain content. A great read that really shows the way a visitor looks through your site could be found at http://www.poynterextra.org/eyetrack2004/main.htm (Link removed – no longer available). That eyetracking research showed the way an average visitor views a website, the pattern their eyes follow through the information, the advertising positions that are most effective, and so on, and it offered plenty of tips for deciding on product and special-offer placement. This is a great resource for you and your company.

Quite possibly one of the most useful tools available is Clicktracks. It shows you, in very specific detail, how visitors are navigating your site. This tool is many steps above typical web stats: you can see not only the search term a visitor used to find you, but which search engine they came from and the path they followed through your site, right down to which search term is selling the most. This highly detailed information can be incredibly valuable. With access to such info you can, over time, adjust your content, navigation, and SEO based on these reports, watching the changes happen and seeing the effects rather than just making good guesses.

The value of having an average visitor test your site and give you real feedback is huge. You have no choice but to be a little biased when viewing your own site, and this outsider information can give you tips you may have only wished you had. Don’t put your site out there and wonder what all the visitors are thinking and doing; just ask! You may even go as far as including a poll on your website, so long as it’s not popping up every time they click a link. A simple “we welcome your feedback” email form on your contact or profile page is a simple, professional way to keep up with what visitors like and dislike on a continuous basis.

Next Week

Now that your site has been designed, had SEO elements added, and been tested and edited, you are ready to submit it to the search engines and get those visitors coming. Stay tuned for the next article in this 10-part series: “Submissions”.

SEO news blog post by @ 4:38 pm on November 30, 2004

Categories:SEO Articles


Ten Steps To A Well Optimized Website – Step Five: Internal Linking

Welcome to part five in this search engine positioning series. Last week we discussed the importance of content optimization. In part five we will cover your website’s internal linking structure and the role that it plays in ranking highly, and in ranking for multiple phrases.

While this aspect is not necessarily the single most important of the ten steps it can be the difference between first page and second page rankings, and can make all the difference in the world when you are trying to rank your website for multiple phrases.

Over this series we will cover the ten key aspects to a solid search engine positioning campaign.

The Ten Steps We Will Go Through Are:

  1. Keyword Selection – October 24, 2004
  2. Content – October 31, 2004
  3. Site Structure – November 7, 2004
  4. Optimization – November 14, 2004
  5. Internal Linking – November 21, 2004
  6. Human Testing – November 29, 2004
  7. Submissions – December 5, 2004
  8. Link Building – December 12, 2004
  9. Monitoring – December 19, 2004
  10. The Extras – December 28, 2004

Step Five – Internal Linking

With all the talk out there about linking, one might be under the impression that the only links that count are those from other websites. While those links certainly play an important role (as will be discussed in part eight of this series), they are certainly not the only important links.

When you’re about to launch into your link work, why not stop and consider the links that are easiest to attain and maximize first: the ones right there on your own site, over which you have total and complete control. Properly used internal links can be a useful weapon in your SEO arsenal.

The internal linking structure can:

  1. Ensure that your website gets properly spidered and that all pages are found by the search engines
  2. Build the relevancy of a page to a keyword phrase
  3. Increase the PageRank of an internal page

Here is how the internal linking structure can affect these areas and how to maximize the effectiveness of the internal linking on your own website.

Getting Your Website Spidered

Ensuring that every page of your website gets found by the search engine spiders is probably the simplest thing you can do for your rankings. Not only will this increase the number of pages a search engine credits your site with, it also increases the number of phrases your website has the potential to rank for.

I have seen websites that, once the search engines found all of their pages, ranked on the first page and saw traffic from phrases their owners never thought to even research or target.

This may not necessarily be the case for you, however having a larger site with more pages related to your content will boost the value of your site overall. You are offering this content to your visitors, so why hide it from the search engines?

Pages can be hidden from search engines if the linking is done in a way they cannot read, as is the case with many navigation scripts. If your site uses a script-based navigation system, you will want to consider implementing one of the internal linking structures noted further on in this article.

Additionally, while image-based navigation is spiderable, the search engines can’t see what an image is and thus cannot assign any relevancy from an image to the page it links to, other than assigning it a place in your website hierarchy.
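To make the spidering point concrete, here is a minimal sketch (in Python, using only the standard library) of how a spider discovers pages: it reads the raw HTML and collects the href values of plain text links. The sample markup and page names are invented for the illustration; anything a script generates at click time would simply never appear in this raw HTML, which is why script-based navigation can hide pages.

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects href values from <a> tags, much the way a search
    engine spider discovers pages through plain text links."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A page whose navigation uses plain text links is fully discoverable
# from the source alone (hypothetical page names).
page = '<a href="/hosting.html">web hosting</a><a href="/contact.html">contact us</a>'
collector = LinkCollector()
collector.feed(page)
print(collector.links)  # ['/hosting.html', '/contact.html']
```

Run against your own pages, an empty (or incomplete) list is an immediate warning that the spiders may not be finding everything.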

Building The Relevancy Of A Page To A Keyword Phrase

Anyone who wants to get their website into the top positions on the search engines for multiple phrases must start out with a clearly defined objective, including which pages should rank for which phrases. Generally speaking it will be your homepage that you will use to target your most competitive phrase and move on to targeting less competitive phrases on your internal pages.

To help build the relevancy of a page to a keyword phrase you will want to use that phrase in the anchor text of links to the page. Let’s assume you have a website hosting company. Rather than linking to your homepage with the anchor text “home”, link to it with the text “web hosting main”. This will attach the words “web”, “hosting” and “main” to your homepage. You can obviously leave the word “main” out if desired, however in many cases it does work for the visitor (you know, those people you’re actually building the site for).

This doesn’t stop at the homepage. If you are linking to internal pages through your navigation, footers, or inline text links, try to use the phrases you want to target on those pages as the linking text. For example, if that hosting company offered and wanted to target “dedicated hosting”, rather than relying solely on the beautiful graphic in the middle of the homepage, they would want to include a text link with the anchor text “dedicated hosting” pointing to that internal page. This will tie the keywords “dedicated hosting” to the page.

In a field as competitive as hosting this alone won’t launch the site to the top ten however it’ll give it a boost and in SEO, especially for competitive phrases, every advantage you can give your site counts.

Increasing The PageRank Of Internal Pages

While we will be discussing PageRank (a Google-based term) here, the same rules generally apply for the other engines. The closer a page is in clicks to your homepage, the higher the value (or PageRank) the page is assigned. Basically, if I have a page linked to from my homepage it will be given more weight than a page that is four or five levels deep in my site.

This does not mean that you should link to all of your pages from your homepage. Not only does this dilute the weight of each individual link, it will look incredibly unattractive if your site is significantly large.

Figure out what your main phrases are and which pages will be used to rank for them and be sure to include text links to these internal pages on your homepage. It’s important to pick solid pages to target keyword phrases on as you don’t want human visitors going to your “terms and conditions” page before they’ve even seen the products.

If the hosting company noted above has a PageRank 6 homepage, the pages linked from that homepage will generally be a PageRank 5 (sometimes 4, sometimes 6, depending on the weight behind the homepage’s 6). Regardless, that is significantly higher than if the page were linked to from a PageRank 3 internal page.
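The dilution described above can be sketched with a toy model. To be clear, this is not Google’s actual formula, just the commonly cited approximation in which a page divides its rank among its outbound links, damped by a constant; the numbers are invented:

```python
def passed_weight(page_rank, outbound_links, damping=0.85):
    """Toy approximation of the weight one link passes on:
    a page divides its (damped) rank among its outbound links."""
    return damping * page_rank / outbound_links

# The same link passes far more weight from a PR6-style homepage
# than from a PR3 internal page, assuming 20 links on each.
from_home = passed_weight(6, 20)
from_deep = passed_weight(3, 20)
print(from_home, from_deep)  # 0.255 0.1275
```

It also shows why cramming 200 links into a footer hurts: dividing by 200 instead of 20 cuts each link’s share tenfold.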

How To Improve Your Internal Linking Structure

There are many methods you can use to improve your internal linking structure. The three main ones are:

  1. Text link navigation
  2. Footers
  3. Inline text links

Text Link Navigation

Most websites include some form of navigation on the left-hand side, which makes it one of the first things read by a search engine spider (read “Table Structures For Top Search Engine Positioning” by Mary Davies for methods of getting your content read before your left-hand navigation). Because it is one of the first things the spider sees when it goes through your site, it carries strong weight and must be optimized with care.

If you are using text link navigation, be sure to include the targeted keywords in the links. Thankfully this cannot be taken to mean “cram your keywords into each and every link”: this is your navigation, and that would look ridiculous. I’ve seen sites that try to get the main phrase into virtually every link. Not only does this look horrible, it may get your site penalized for spam (especially if the links are one after another).

You don’t have to get your keywords in every link, but if workable, every second or third link works well. Also consider what you are targeting on internal pages. If your homepage target is “web hosting” and you’ve linked to your homepage in the navigation with “web hosting main”, followed by your contact page with “contact us”, it would be a good idea to use the anchor text “dedicated hosting” for the third link. It reinforces the “hosting” relevancy and also attaches the phrase “dedicated hosting” to the dedicated hosting page of the site.

Footers

Footers are the often overused and abused area of websites. While they are useful for getting spiders through your site and for the other points noted above, they should not be used as spam tools. I’ve seen footers longer than the content areas of the pages they sit on, from websites linking to every single page of their site in the footer. Not only does this look bad, it reduces the value of each individual link (which becomes 1 out of 200 links rather than 1 out of 10 or 20).

Keep your footers clean, use the anchor text well, and link to the key internal pages of your website and you will have a well-optimized footer. You will also want to include in your footer a link to a sitemap, and on that sitemap, link to every page in your site. Here is where you can simply ensure that every page gets found. Well-worded anchor text is a good rule on your sitemap as well. You may also want to consider a short description of each page on your sitemap. This gives you added verbiage to solidify the relevancy of the sitemap page to the pages you are linking to.
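As a small illustration of the sitemap advice, here is a hypothetical Python helper that builds sitemap entries from (URL, anchor text, description) tuples, giving each link keyword-rich anchor text plus a short description; the page names are invented:

```python
def sitemap_entries(pages):
    """Builds HTML list items for a sitemap page: a keyword-rich
    text link plus a short description for each target page."""
    lines = []
    for url, anchor, description in pages:
        lines.append(f'<li><a href="{url}">{anchor}</a> - {description}</li>')
    return "\n".join(lines)

pages = [
    ("/dedicated.html", "dedicated hosting", "Dedicated server hosting plans."),
    ("/shared.html", "shared hosting", "Affordable shared hosting packages."),
]
print(sitemap_entries(pages))
```

Generating the sitemap from one list like this also makes it easy to guarantee that every page of the site appears on it.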

Internal Text Links

Internal text links are links placed within the content of your work. They were covered in last week’s article on content optimization, which gives me a great opportunity to use one as an example.

While debatable, inline text links do appear to be given extra weight as their very nature implies that the link is entirely relevant to the content of the site.

You can read more on this in last week’s article.

Final Notes

As noted above, simply changing your internal navigation will not launch your site to the top of the rankings, however it’s important to use each and every advantage available to create a solid top ten ranking for your site that will hold its position.

These techniques will get your pages performing better: they will help get your entire site spidered, increase the value of internal pages, and build the relevancy of internal pages to specific keyword phrases.

Even if that’s all they do, aren’t they worth taking the time to do right?

Next Week

Next week in part six of our “Ten Steps To an Optimized Website” series we will be covering the importance of human testing. Having a well-ranked website will mean nothing if people can’t find their way through it or if it is visually unappealing.

SEO news blog post by @ 4:28 pm on November 22, 2004



Ten Steps To A Well Optimized Website – Step Four: Content Optimization

Welcome to part four in this search engine positioning series. Last week we discussed the importance of the structure of your website and the best practices for creating an easily spidered and easily read site. In part four we will discuss content optimization.

This is perhaps the single most important aspect of ranking your website highly on the search engines. While all of the factors covered in this series will help get your website into the top positions, it is your content that will sell your product or service and it is your content that the search engines will be reading when they take their “snapshot” of your site and determine where it should be placed in relation to the other billions of pages on the Internet.

Over this series we will cover the ten key aspects to a solid search engine positioning campaign.

The Ten Steps We Will Go Through Are:

  1. Keyword Selection – October 24, 2004
  2. Content – October 31, 2004
  3. Site Structure – November 7, 2004
  4. Optimization – November 14, 2004
  5. Internal Linking – November 21, 2004
  6. Human Testing – November 29, 2004
  7. Submissions – December 5, 2004
  8. Link Building – December 12, 2004
  9. Monitoring – December 19, 2004
  10. The Extras – December 28, 2004

Step Four – Content Optimization

There are aspects of the optimization process that gain and lose importance over time, and content optimization is no exception. Through the many algorithm changes that take place each year, the weight given to the content on your pages rises and falls. Currently, incoming links appear to supply a greater advantage than well-written and optimized content. So why are we taking an entire article in this series to focus on content optimization?

The goal for anyone following this series is to build and optimize a website that will rank well on the major search engines and, more difficult and far more important, hold those rankings through changes in the search engine algorithms. While a bunch of incoming links from high-PageRank sites will currently do well for you on Google, you must consider what will happen to your rankings when the weight given to incoming links drops, or how your website fares on search engines other than Google that don’t place the same emphasis on incoming links.

While there are many characteristics of your content that are in the algorithmic calculations, there are a few that consistently hold relatively high priority and thus will be the focus of this article. These are:

  1. Heading Tags
  2. Special Text (bold, colored, etc.)
  3. Inline Text Links
  4. Keyword Density

Heading Tags

The heading tag (for those who don’t already know) is code used to specify to the visitor and to the search engines the topic of your page and/or its subsections. You have six predefined heading tags to work with, ranging from <H1> to <H6>.

By default these tags appear larger than standard text in a browser and are bold. These aspects can be adjusted using the font tags or by using Cascading Style Sheets (CSS).

Due to their abuse by unethical webmasters and SEOs, the weight given to heading tags is not what it could be; however, the content between these tags is still given increased weight over standard text. There are rules to the use of heading tags that must be adhered to. If you use heading tags irresponsibly you run the risk of having your website penalized for spam, even if the abuse was unintentional.

When using your heading tags try to follow these rules:

  • Never use the same tag twice on a single page
  • Try to be concise with your wording
  • Use heading tags only when appropriate. If bold text will do then go that route
  • Don’t use CSS to mask heading tags

Never use the same tag twice on a single page. While the <H1> tag holds the greatest weight of all the heading tags, its purpose is to act as the primary heading of the page. If you use it twice you are obviously not using it to define the main topic of the page. If you need another heading tag use the <H2> tag, after that the <H3> tag, and so on. Generally I try never to use more than two heading tags on a page.

Try to be concise with your wording. If you have a 2 keyword phrase that you are trying to target and you make a heading that is 10 words long then your keyword phrase only makes up about 20% of the total verbiage. If you have a 4-word heading on the other hand you would then have a 50% density and increased priority given to the keyword phrase you are targeting.
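The arithmetic above is simple enough to check; a quick Python sketch (with invented headings) makes the point:

```python
def heading_density(heading, phrase):
    """Fraction of the heading's words taken up by the target phrase."""
    return len(phrase.split()) / len(heading.split())

# A 2-word phrase in a 10-word heading is diluted to 20%;
# a concise 4-word heading raises it to 50%.
print(heading_density("our great company offers cheap web hosting for every budget", "web hosting"))  # 0.2
print(heading_density("affordable web hosting plans", "web hosting"))  # 0.5
```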

Use heading tags only when appropriate. If bold text will do then go that route. I have seen sites with heading tags all over the place. If they are overused, the weight of the tags themselves is reduced, with decreasing content and “priority” being given to different phrases at various points in the content. If you have so much great content that you feel you need many heading tags, consider dividing the content into multiple pages, each with its own tag and keyword target possibilities. For the most part, rather than using additional heading tags, bolding the content will suffice: the sizing stays the same as your usual text, and it stands out to the reader as part of the text but with added importance.

Don’t use CSS to mask heading tags. This one just drives me nuts, and it’s unnecessary. Cascading Style Sheets (CSS) serve many great functions: they can define how a site functions, looks and feels. However, they can also be used to mislead search engines and visitors alike. Each tag has a default look and feel, and it is fine to use CSS to adjust this somewhat to fit how you want your site to look. What is not all right is to adjust the look and feel to mislead search engines. It is a simple enough task to define in CSS that your headings should appear as regular text. Some unethical SEOs will then also place their style sheet in a folder that is hidden from the search engine spiders. This is secure enough until your competitors look at the cached copy of your page (and they undoubtedly will at some point), see that you have hidden heading tags, and report you to the search engines for spamming. It’s an unnecessary risk that you don’t need to take. Use your headings properly and you’ll do just fine.

Special Text

“Special text” (as it is used here) is any content on your page that is set to stand out from the rest. This includes bold, underlined, colored, highlighted, sized and italic text. Such text is given weight higher than standard content, and rightfully so. Bold text, for example, is generally used to define sub-headings (see above), or to pull content out on a page to ensure the visitor reads it. The same can be said for the other “special text” treatments.

Search engines have thus been programmed to read this text as more important than the rest of the content and to give it increased weight. For example, on our homepage we begin the content with “Beanstalk Search Engine Optimization …” and have chosen to bold this text. This serves two purposes. The first is to draw the eye to these words and further reinforce the “brand”. The second purpose (and it should always be the second) is to add weight to the “Search Engine Optimization” portion of the name. It effectively does both.

Reread your content and, where appropriate for BOTH visitors and search engines, use special text when it will help draw the eye to important information and also add weight to your keywords. This does not mean you should bold every instance of your targeted keywords, nor that you should avoid using special text when it does not involve your keywords. Common sense and a reasonable grasp of sales and marketing techniques should be your guide in establishing what should and should not be drawn out with “special text”.

Inline Text Links

Inline text links are links added right into the verbiage of your content. For example, in this article series I may make reference to past articles in the series. Were I to refer to the article on keyword selection, rather than simply making a reference to it as I just have, it might be better to write it as, “Were I to refer to the article on keyword selection rather …” with the phrase “keyword selection” itself serving as the link.

Like special text this serves two purposes. The first is to give the reader a quick and easy way to find the information you are referring to. The second purpose of this technique is to give added weight to this phrase for the page on which the link is located and also to give weight to the target page.

While this point is debatable, there is a relatively commonly held belief that inline text links are given more weight than a text link which stands alone. If we were to think like a search engine this makes sense. If the link occurs within the content area then chances are it is highly relevant to the content itself, and the link should be counted with more strength than a link placed in a footer simply to get a spider through the site.

Like “special text” this should only be employed if it helps the visitor navigate your site. An additional benefit to inline text links is that you can help direct your visitors to the pages you want them on. Rather than simply relying on visitors to use your navigation bar, with inline text links you can point them to the internal pages you most want them to reach, such as your services page or product details.
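To make the example above concrete, an inline text link might be marked up as follows (the URL is hypothetical):

```html
<!-- The anchor text "keyword selection" adds weight to that phrase
     on this page and passes weight to the target page -->
<p>Were I to refer to the article on
<a href="/articles/keyword-selection.htm">keyword selection</a>
rather than simply mentioning it, the visitor could jump straight to it.</p>
```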

Keyword Density

For those of you who have never heard the term “keyword density” before, it is the percentage of your total content that is made up of your targeted keywords. There is much debate in forums, SEO chat rooms and the like as to what the “optimal” keyword density might be. Estimates seem to range from 3% to 10%.

I would be the first to admit that logic dictates there is indeed an optimal keyword density: knowing that search engines operate on mathematical formulas implies that this aspect of your website must have some magic number associated with it that will give your content the greatest chance of success.

With this in mind there are three points that you should consider:

  1. You do not work for Google or Yahoo! or any of the other major search engines (and if you do you’re not the target audience of this article). You will never know 100% what this “magic number” is.
  2. Even if you did know what the optimal keyword density was today, would you still know it after the next update? Like other aspects of the search engine algorithm, optimal keyword densities change. You will be chasing smoke if you try to constantly have the optimal density and chances are you will hinder your efforts more than help by constantly changing the densities of your site.
  3. The optimal keyword density for one search engine is not the same as it is for another. Chasing the density of one may very well ruin your efforts on another.

So what can you do? Your best bet is to simply place your targeted keyword phrase in your content as often as possible while keeping the content easily readable by a live visitor. Your goal here is not to sell to search engines, it is to sell to people. I have seen sites that have gone so overboard in increasing their keyword density that the content itself reads horribly. If you are simply aware of the phrase that you are targeting while you write your content then chances are you will attain a keyword density somewhere between 3 and 5%. Stay in this range and, provided that the other aspects of the optimization process are in place, you will rank well across many of the search engines.
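As a rough illustration of the arithmetic (the figures are invented, and counting conventions vary; here every word of the targeted phrase counts toward the keyword total):

```
page length:      400 words
targeted phrase:  "search engine positioning" (3 words), used 6 times
keyword words:    6 × 3 = 18
keyword density:  18 / 400 = 4.5%
```

This lands comfortably inside the 3–5% range suggested above without any deliberate padding.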

Also remember, when you’re looking over your page, that the targeted phrase may seem to stand out since it’s used more than any other phrase on the page; it may even seem like a bit too much. Unless you’ve obviously overdone it (approached the 10% rather than the 5% end of the spectrum), it’s alright for this phrase to stand out. This is the phrase the searcher was searching for. When they see it on the page it will remind them of what they are looking for, and seeing it a few times will reinforce that you can help them find the information they need to make the right decision.

Final Notes

In an effort to increase keyword densities, unethical webmasters will often use tactics such as hidden text, extremely small font sizes, and other tricks that essentially hide text from a live visitor while serving it to the search engines. Take this advice: write quality content, word it well and pay close attention to your phrasing, and you will do well. Use unethical tactics and your website may rank well in the short term, but once one of your competitors realizes what you’re doing you will be reported and your website may very well get penalized. Additionally, if a visitor realizes that you’re simply “tricking” the search engines, they may very well decide that you are not the type of company they want to deal with; one that isn’t concerned with integrity, but rather one that will use any trick to try to get at their money. Is this the message you want to send?

Next Week

Next week in part five of our “Ten Steps To an Optimized Website” series we will be covering internal linking strategies and best practices. This will cover everything from image links and scripts to inline and basic text links.

SEO news blog post by @ 3:12 pm on November 16, 2004


 

Climbing the Beanstalk November 16, 2004

Welcome to the November 16, 2004 edition of “Climbing The Beanstalk”, the bi-weekly newsletter on search engines and search engine positioning from Beanstalk. In this edition we will explore the MSN search engine launch, recent published articles by Beanstalk staff, and discuss the Google update currently underway (at the time of this writing).

If you have any questions regarding any of the areas covered in this newsletter please don’t hesitate to contact us.

The Microsoft Way

While global domination may be nothing more than a fanciful dream to the executives at Microsoft, it seems to be built into everything they do. From the integration of the browser and multimedia into their operating systems to offering Internet access, their business model seems determined to monopolize computer systems and everything you might do with them. And now, to further this virtual monopoly over all things electronic, Microsoft last week announced the official launch of the MSN search engine.

Until last week MSN results were served by Yahoo!/Overture. It has long been known that the folks over at MSN were hard at work developing their own search engine. Last week that engine was unveiled. While many were surprised that it wasn’t launched right in the live MSN results, SEOs worldwide have been searching http://beta.search.msn.com. It appears that Microsoft has decided to first launch this engine in beta, seeking feedback on both the interface and the results.

This is not the first time we’ve seen this from Microsoft in regards to this engine. Two brief “tech previews” predated the beta launch. The engine is reportedly being rolled out to the live MSN.com site sometime before January 2005.

It’s ironic that they have put their search engine out in beta. The reason I say this is because of their history of mistakes. Traditionally they put out their product, let us all test it for them, and then patch up the errors (does anyone remember Windows ME … I’m betting they wish we didn’t). It’s ironic because this is one of the better products I’ve seen from Microsoft.

The results from this search engine address many of the problems with Google. The results are relatively spam-free, and it has the same clean interface that Google does (but will it when it goes live?).

Does Microsoft want to rule the world? Not really. They just want to control everything you have on your computer and everything that travels across the Internet. With this one they just might do it. It’s a good engine and the folks over at Microsoft know what they’ve got. Expect this new search engine to be integrated into your next operating system. It may be monopolistic but if they keep the quality of their results up … who will care?

Recent Search Engine Positioning Articles

Beanstalk Search Engine Optimization has recently had several of its articles picked up by WebProNews, ISEDB, and an assortment of other SEO resource sites. These are recommended reading for anyone interested in attaining high rankings.

Website Optimization

This is perhaps the single most important aspect of ranking your website highly on the search engines. While all of the factors covered in this series will help get your website into the top positions, it is your content that will sell your product or service and it is your content that … <more>

Site Structure

Developing the structure of your website is a very important step in its overall optimization. The site structure will dictate how the spiders read your site, what information they gather, what content holds the most weight … <more>

Protecting Your Corporate Identity On The Search Engines

Companies go to great lengths to establish their corporate identity through marketing, advertisements, promotions, search engine positioning, and other means. As with any success, it may well happen that criticism follows. Any company is likely to do something that someone somewhere won’t like … <more>

The Google Shuffle

Starting on November 15, 2004, Google began its most recent update. Easily predictable, this update falls on the one-year anniversary of the Florida Update. For those who don’t recall the Florida Update, in November of last year Google decided to introduce an entirely new algorithm that sent websites jumping, with SEOs almost ready to do the same.

The results were jam-packed full of spam and came just when online businesses were prepped for the holiday rush. Those sites that deserved to do well dropped pages in the rankings while those who broke the rules rose to the top.

This last update did not hold the same fate (thank goodness). In fact, there was very little to report. From what we have seen, Google just got a little bit better. They increased their index to 8 billion pages (coincidentally on the same day that MSN launched its beta engine ;) Other than that there is very little difference.

If anything, links have dropped slightly in importance and the overall weight given to a site that focuses on one theme seems to have gone up. Basically this means that massive link building holds a bit less weight (though it is still a dominating factor) and sites that are about a single subject, or that focus on a basic set of phrases, will tend to do better than those that cover many topics and/or a variety of phrases.

All-in-all the past two weeks have seen good results in the search engine world with 2 of the 3 major players improving their positions. The official launch of the MSN engine promises to be a very interesting one indeed with Google working hard to introduce improvements equally as fast.

And when you want to know what’s going on … look for your Beanstalk newsletter. We’ll be watching this very closely and will let you know what comes of it all.

Thank You

Thank you very much for subscribing to “Climbing The Beanstalk”, the bi-weekly search engine positioning newsletter. If you have any questions about the areas covered or if there are any areas of search engine positioning that you would like to see covered in future articles/newsletter please don’t hesitate to contact us. We want to write what you want to know.

SEO news blog post by @ 1:41 pm on

Categories:SEO Newsletters

 

Ten Steps To A Well Optimized Website – Step Three: Site Structure

Welcome to part three in this search engine positioning series. Last week we discussed the importance of, and the considerations that must be made while creating, the content that will provide the highest ROI for your optimization efforts. In part three we will discuss the importance of site structure.

While there are numerous factors involved with the search engine algorithms, site structure is certainly of constant importance. Cleaner structures that remove lines of code between your key content and the search engine spiders can mean the difference between second page and first page rankings.

Over this series we will cover the ten key aspects to a solid search engine positioning campaign.

The Ten Steps We Will Go Through Are:

  1. Keyword Selection – October 24, 2004
  2. Content – October 31, 2004
  3. Site Structure – November 7, 2004
  4. Optimization – November 14, 2004
  5. Internal Linking – November 21, 2004
  6. Human Testing – November 29, 2004
  7. Submissions – December 5, 2004
  8. Link Building – December 12, 2004
  9. Monitoring – December 19, 2004
  10. The Extras – December 28, 2004

Step Three – Site Structure

Developing the structure of your website is a very important step in its overall optimization. The site structure will dictate how the spiders read your site, what information they gather, what content holds the most weight, how much useless code they must weed through and more. You must structure your website to appeal to the visitor and the spiders.

When developing your website you want to be sure not to create useless code that can confuse spiders and take away from the content of your site. I recommend hand coding as the best option; however, not everyone has the time or the skill to do this, so I would suggest Dreamweaver as a great alternative. (Though the code will not be as clean as hand coding, it does not create an over-the-top amount of extra code like programs such as FrontPage do.) The object here is to keep the code as clean as possible! Remember, the more code you have, the more the spiders must weed through to get to your content, where you want them to be.

A great way to cut down on extra code is to use style sheets. You can use them in ways as simple as defining fonts or as advanced as creating tableless designs. There are many ways to use style sheets, and the biggest perk is cutting back on the code of any given individual page.
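A minimal sketch of the idea (the rule and content are hypothetical): defining the font once in a style sheet keeps repeated `<font>` tags out of every page.

```html
<head>
  <!-- Defining fonts once here (or better, in an external .css file
       shared by every page) replaces repeated <font> tags -->
  <style type="text/css">
    p { font-family: Verdana, Arial, sans-serif; font-size: 12px; }
  </style>
</head>
<body>
  <p>Your keyword-rich content, free of presentational clutter.</p>
</body>
```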

When you are setting up the initial structure of your site you want to be sure that the table structure is laid out in such a way that the spiders can get to the most important content as easily and as quickly as possible. A great way to achieve this is to create your website using the table structure outlined in my article “Table Structures For Top Search Engine Positioning”. When the spiders visit your site they read through it top to bottom, left to right, following the rows and columns. The key to the table structure outlined above is the little empty row. Were this row not there, the spiders would read through the first column hitting nothing but images and Alt tags (your navigation) before moving on to the next column, your content area. Placing this empty cell in the first row of the main table guides the spiders directly to your content: they hit the empty cell and, with nothing to read, move on to the next column to the right, exactly where you want them. After they have read your content they move back to the left in row two and read your navigation images and Alt tags, and finally they end the page at your footer, a great place for keyword-rich text links. (Internal linking structures will be covered in part five of this ten-part series.)
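One possible reading of the structure just described, sketched as markup (the cell contents and file names are placeholders, not the actual layout from the article referenced above):

```html
<table>
  <tr>
    <!-- Empty cell: the spider finds nothing to read and moves right -->
    <td></td>
    <!-- Content column: reached before the navigation below -->
    <td rowspan="2">Your keyword-rich content ...</td>
  </tr>
  <tr>
    <!-- Navigation images and Alt tags, read after the content -->
    <td><a href="/services.htm"><img src="/images/nav-services.gif"
        alt="Search Engine Positioning Services" border="0"></a></td>
  </tr>
</table>
<!-- Footer row follows: keyword-rich text links, read last -->
```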

Once you have created the site structure and inserted all of your content you will then begin the basic optimization of your site. In your code you will want to create Meta tags that fit your keyword choices. The two most important Meta tags are the Description tag and the Keywords tag. Your description should highlight your keyword phrase while keeping it focused, to the point and readable. Your keywords tag should also be focused, using each keyword a maximum of three times in any set. These tags should be customized on each page to fit the specific phrase targeted.

After the Meta tags have been inserted appropriately for each page, it is important to title each page appropriately. The main targeted phrase should be the focus of the title. Keep it simple, focused and to the point; do not bog it down with extra descriptive text. This is not your description, it is your title.
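Putting the title and the two Meta tags together, a page head might look something like this (the phrases are placeholders for your own targeted keywords):

```html
<head>
  <title>Search Engine Positioning Services</title>
  <meta name="description" content="Search engine positioning services
    that bring your website to the top of the rankings.">
  <meta name="keywords" content="search engine positioning,
    search engine positioning services, seo services">
</head>
```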

Next, move on to Alt tags. Though it is good practice to add Alt tags to all your images, the spiders only put weight on those contained within links. An example of this:

<a href="http://www.beanstalk-inc.com"><img src="/Images/webhead.jpg" alt="Beanstalk Search Engine Optimization" width="461" height="145" border="0"></a>

These Alt tags allow you to make your images matter. Most main navigation is image based, so be sure to add appropriate Alt tags targeting your keywords in this very prominent area of your site. Another great place to add a link along with its Alt tag is your header image. Linking this image to your URL means that the first thing the spiders hit within your tables holds at least some content that “matters”, rather than simply a static image.

H1 tags are also a great way to add weight to your content; however, use them wisely. You can use any of the H1 through H4 tags, the idea being that H1 carries the most weight, H2 a little less, and so on. Do not overuse these tags or they will lose their value altogether. The correct way to use them is where they actually belong, for example the first line of text on a page: the title. Also, if you are defining your fonts in a style sheet (which you should be), be sure not to abuse these tags. An H1 tag should be defined bigger than an H2, and so on.
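A short sketch of a heading hierarchy, with the style sheet keeping the visual sizes in their natural order (the sizes and headings are illustrative):

```html
<style type="text/css">
  /* H1 stays visually bigger than H2, matching their relative weight */
  h1 { font-size: 22px; }
  h2 { font-size: 16px; }
</style>

<h1>Search Engine Positioning</h1>
<p>Opening content for the page ...</p>

<h2>Why Rankings Matter</h2>
<p>A sub-topic of the page ...</p>
```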

Utilizing the above tips will create a site structure that is the perfect environment for the spiders: clean, focused and easily read. Your site structure is now optimized and ready for the more advanced content optimization elements to come.

SEO news blog post by @ 4:19 pm on November 7, 2004


 

Climbing the Beanstalk November 3, 2004

Welcome to the November 3, 2004 edition of “Climbing The Beanstalk”, the bi-weekly newsletter on search engines and search engine positioning from Beanstalk. In this edition we will explore the launch of the new MSN search, recent articles by Beanstalk staff, and tips on ways to test keyword phrases before you pay for search engine optimization services or perform the optimization yourself.

If you have any questions regarding any of the areas covered in this newsletter please don’t hesitate to contact us.

The Launch Of The New MSN:
Something Old Or Something New?

As of the writing of this newsletter the new MSN search is available for preview. The launch of the new MSN search has been much anticipated, and for good reason. Recent statistics show MSN responsible for 27.4% of all searches. While Google still comes in at over 40% and their IPO has given them plenty of money to play with, anyone in the Internet age must wonder at competing head-to-head with Microsoft. It rarely ends in the favor of the competition.

The new MSN search, said to be launching later this month, stands to be the most significant competitor that Google has had to face. With enormous budgets on both sides and resources available that would make Scotty from Star Trek envious, both sides are launching and testing new products such as Google’s new Desktop Search and the rumored browser. MSN, on the other hand, seems focused on launching the search engine; however, the backing of Microsoft certainly gives them more brand identity off the starting block (or default search in your operating system, as the case may be).

After running a number of searches on the new MSN preview, however, I came to a startling conclusion: it’s very similar to the others. Of course any algorithm has its differences, but all in all I found the results to be essentially a mixture of Yahoo! and Google. Until it goes live we won’t know for sure what they’ll have up their sleeves, but I wouldn’t recommend losing sleep over your MSN rankings. Chances are they’ll be very similar to what you see now.

Still worried? View the new MSN search in its preview at http://techpreview.search.msn.com/ (link made inactive as it is now live and the preview redirects) and run some tests on your own keywords.

Recent Search Engine Positioning Articles

Beanstalk Search Engine Optimization has recently had two of its articles picked up by WebProNews, ISEDB, and an assortment of other SEO resource sites. These are recommended reading for anyone interested in attaining high rankings.

10 Steps To A Well Optimized Site: Part One – Keyword Selection

This is part one of ten in this search engine positioning series. In part one we will outline how to choose the keyword phrases most likely to produce a high ROI for your search engine positioning efforts. Over this ten part series we will go through ten essential elements and steps to optimizing a … <more>

10 Steps To A Well Optimized Site: Part Two – Content Creation

Content is the key to search engine rankings. While there are numerous factors involved with the search engine algorithms, content remains a constant in stable rankings for a number of important reasons … <more>

SEO & PPC

When most people think of PPC engines they think of Overture and Google. They think of expensive click-through campaigns that have to be monitored and then … well, many people stop thinking about them right about there.

Without getting too much into PPC management, it is important to note that a great additional use for PPC engines is in the testing of keyword phrases. The first step of any search engine positioning campaign is keyword selection (see article above). Once you’ve chosen what you believe to be the ideal keywords, you’re in the precarious situation of having to then either put in enormous amounts of work to optimize your website or spend money on a search engine positioning firm. In either case you need to know that the keyword phrases you are about to target will produce the highest ROI for your business.

For those of you who have read our articles you’ll recall that there are “phrases that sell” (as discussed in our “Keyword Selection” article). Let’s assume that what you have as options are a series of generic phrases or, a slightly better scenario, a bunch of “phrases that sell” and you can’t decide which are the best.

Now is when you turn to the PPC engines. While Overture and Google AdWords are great for their volume, for those on a limited budget you may want to use some of the smaller (and generally less expensive per click) PPC engines. A couple of them can be found on our website at /info/ppc.htm.

What you can now do is choose a few of your selected phrases and bid on them. Generally you can start an account with about $50. Get your site in the number 2 or 3 position (#1 if it’s not much of a jump in price) and monitor your click-throughs, sales, and stats. If you have the budget for it ClickTracks can give you even more information right down to which keyword phrase is delivering the most people to your shopping cart or “thank you” page. It runs at about $495 but is a fantastic tool for a variety of reasons that could take up entire articles unto itself.

Assuming that you won’t be spending $500 on a program, you’ll have to just test your keyword phrases a few at a time, but do test them all. If you think you have the perfect one but haven’t finished testing, try to be patient. You may forgo a great keyword phrase if you take the first one that promises to produce a decent ROI.

With this testing you’ll have a better (not 100% perfect but certainly far better) idea of how your keywords will produce for you in the real world. You may even learn a bit about your site and how visitors navigate it. This is another task for ClickTracks but you can get a good handle on the basics with the stats package provided by your hosting company (if they don’t supply stats then you may want to consider a new host as this is included with even the cheapest hosting packages out there).

Thank You

Thank you very much for subscribing to “Climbing The Beanstalk”, the bi-weekly search engine positioning newsletter. If you have any questions about the areas covered or if there are any areas of search engine positioning that you would like to see covered in future articles/newsletter please don’t hesitate to contact us. We want to write what you want to know.

SEO news blog post by @ 2:50 pm on November 3, 2004

Categories:SEO Newsletters

 

Copyright© 2004-2014
Beanstalk Search Engine Optimization, Inc.
All rights reserved.