Beanstalk's Internet Marketing Blog

At Beanstalk Search Engine Optimization we know that knowledge is power. That's the reason we started this Internet marketing blog back in 2005. We know that the better informed our visitors are, the better the decisions they will make for their websites and their online businesses. We hope you enjoy your stay and find the news, tips and ideas contained within this blog useful.


August 16, 2013

SEO Concerns for Mobile Websites

You want to serve your clients' needs regardless of which device they use to visit your site, but how do you do that easily without upsetting your SEO?

Let's look at the various options for tackling mobile sites and what each means in terms of SEO:

Responsive Design:

[Image: visual demonstration of responsive web design]

  • Responsive design is growing in popularity, especially as communications technology evolves, and bandwidth/memory use is less of a concern.
  • This method also gives us a single URL to work with which helps to keep the sitemap/structure as simple as possible without redirection nightmares.
  • On top of that, Googlebot won’t need to visit multiple URLs to index your content updates.
  • Less to crawl means Googlebot will have a better chance to index more of your pages/get deeper inside your site.
“Why is/was there a concern about mobile page size?”

Low-end mobiles, like a Nokia C6 from 4+ years ago (which was still offered by major telcos last year), typically require that total page data be less than 1 MB for the phone to handle the memory needed to render and display the site.

If you go over that memory limit you risk crashing the browser with an error that the device memory has been exceeded. Re-loading the browser drops the visitor on the device's default home page with all history lost. I think we can all agree that this is not a good experience for potential clients.

Higher-end devices are still victims of their real-world connectivity. Most 3G devices can hit impressive peak speeds, but they are rarely in a physical location where those speeds are sustained for any reasonable length of time.

Therefore, even with the latest gee-whiz handsets, your success rate in delivering your entire page to mobile users will be impacted by the amount of data you require them to fetch.

In a responsive web design scenario the main HTML content is typically sent along with CSS markup that caters to the layout/screen limitations of a mobile web browser. While this can mean omission of image data and other resources, many sites simply attempt to ‘resize’ and ‘rearrange’ the content leading to very similar bandwidth/memory needs for mobile sites using responsive design approaches.
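As a sketch of how a responsive stylesheet can actually trim the mobile payload rather than just rearranging it, a media query can suppress heavy decorative resources on small screens. The class name and image file here are hypothetical:

```css
/* Desktop visitors get the large decorative banner. */
.hero-banner {
  background-image: url("banner-large.jpg");
}

/* On narrow (likely mobile) screens, drop the heavy image entirely
   instead of merely scaling it, saving both bandwidth and memory. */
@media only screen and (max-width: 640px) {
  .hero-banner {
    background-image: none;
  }
}
```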

The SEO concern with responsive design is that, because the same HTML content is served with mobile-specific styling, it's crucial that search engine crawlers understand the mobile-styled content is not cloaking or some other black-hat technique. Google does a good job of detecting this, and we discuss how a bit later on, with links to Google's own pages on the topic.

Mobile Pages:

[Image: visual demonstration of mobile web page design]
If you’ve ever visited ‘mobile.site.com’ or something like that, you’ve already seen what mobile versions of a site can look like. Typically these versions skip reformatting the main site content and they get right down to the business of catering to the unique needs of mobile visitors.

Not only can it be a LOT easier to build a mobile version of your site/pages, you can expect these versions to have more features and be more compatible with a wider range of devices.

Tools like jQuery Mobile will have you making pages in a jiffy using modern techniques and HTML5. It's so easy you could even make a demo image purely for the sake of a blog post! ;)
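A minimal jQuery Mobile page looks something like the following. The library versions and CDN paths are illustrative of the era, not a recommendation:

```html
<!DOCTYPE html>
<html>
<head>
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <!-- jQuery Mobile's stylesheet and scripts; version numbers are examples -->
  <link rel="stylesheet" href="http://code.jquery.com/mobile/1.3.2/jquery.mobile-1.3.2.min.css">
  <script src="http://code.jquery.com/jquery-1.9.1.min.js"></script>
  <script src="http://code.jquery.com/mobile/1.3.2/jquery.mobile-1.3.2.min.js"></script>
</head>
<body>
  <!-- data-role attributes tell jQuery Mobile how to style each region -->
  <div data-role="page">
    <div data-role="header"><h1>My Mobile Site</h1></div>
    <div data-role="content"><p>Hello, mobile visitors!</p></div>
  </div>
</body>
</html>
```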

This also frees up your main site design so you can make changes without worrying what impact it has on mobile.

“What about my content?”

Excellent question!

Mobile versions of sites with lots of useful content (AKA: great websites) can feel like a major hurdle, but in most cases there are awesome solutions for making your content work in a mobile version.

The last thing you'd want to do is block content from mobile visitors, and Google's ranking algorithm updates in June 2013 agree.

Even something as simple as a faulty redirect, where your mobile site serves up:

mobile.site.com/

…when the visitor requested:

www.site.com/articles/how_to_rank.html

…is a really bad situation, and in Google's own words:

“If the content doesn’t exist in a smartphone-friendly format, showing the desktop content is better than redirecting to an irrelevant page.”
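If you do redirect mobile visitors, the redirect should preserve the requested path. Here is a hedged sketch using Apache mod_rewrite; the user-agent pattern and hostnames are simplified examples, not a complete device list:

```apache
RewriteEngine On
# Only redirect user agents that identify as mobile (simplified pattern).
RewriteCond %{HTTP_USER_AGENT} "android|iphone|mobile" [NC]
# Send the visitor to the SAME path on the mobile host, not its homepage.
RewriteRule ^(.*)$ http://mobile.site.com/$1 [R=302,L]
```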

 
You might think the solution to 'light content' or 'duplicate content' in mobile versions is to block crawlers from indexing the mobile versions of a page, but you'd be off the mark: you actually want to make sure crawlers know you have mobile versions to evaluate and rank.

In fact, if you hop on over to Google Analytics, you will see that Google is tracking how well your site is doing for mobile, desktop, and tablet visitors:

[Image: example of Google Analytics for a site with mobile SEO issues]

(Nearly double the bounce rate for Mobile? Low page counts/duration as well!?)

 
Google Analytics will show you even more details, so if you want to know how well you do on Android vs. BlackBerry, they can tell you.

“How do the crawlers/search engines sort it out?”

A canonical URL is always a good idea, but using a canonical between a mobile page and the desktop version just makes sense.

A canonical can cancel out any fears of showing duplicate content and help the crawlers understand the relationship between your URLs with just one line of markup.

On the flip-side, a rel="alternate" link in the desktop version of the page will help ensure the connection between them is understood completely.

The following is straight from the Google Developers help docs:

On the desktop page, add:

<link rel="alternate" media="only screen and (max-width: 640px)" href="http://m.example.com/page-1" >

and on the mobile page, the required annotation should be:

<link rel="canonical" href="http://www.example.com/page-1" >

This rel="canonical" tag on the mobile URL pointing to the desktop page is required.

Even with responsive design, Googlebot is pretty smart, and if you aren’t blocking access to resources intended for a mobile browser, Google can/should detect responsive design from the content itself.

Google’s own help pages confirm this and provide the following example of responsive CSS markup:

    @media only screen and (max-width: 640px) {...}

In this example they are showing us a CSS rule that applies when the screen max-width is 640px: a clear sign that the rules apply to a mobile device rather than a desktop.

Google Webmaster Central takes the information even further, providing tips and examples for implementing responsive design.

Ever wondered how to control what happens when a mobile device rotates and the screen width changes? Click the link above. :)
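For the curious, the usual CSS approach is an orientation media query; a small sketch (the selector is hypothetical):

```css
/* Applies while the device is held sideways */
@media only screen and (orientation: landscape) {
  .nav { float: left; width: 25%; }
}

/* Applies while the device is held upright */
@media only screen and (orientation: portrait) {
  .nav { float: none; width: 100%; }
}
```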

SEO news blog post by @ 3:51 pm


 

 

July 14, 2009

How To Search Engine Optimize (SEO) an AJAX or Web 2.0 Site

One of the three major pillars of search engine optimization is a website's content and its onsite optimization. All of the major search engine ranking algorithms have components that relate to the content contained on the website. Typically these components relate to keyword densities, word counts, content location, and sometimes content age. The code that the content is contained in falls under the topic of structure, not content, and will not be discussed in this article.

Asynchronous JavaScript and XML (AJAX) is an advanced web development method which can be used to create more responsive and interactive dynamic websites. AJAX accomplishes this by making object request calls back to the web server without refreshing your browser; these object calls are then processed and typically used to update the content of the page currently being viewed. For the sake of this article I'm going to ignore the XML component of AJAX, as the search engines never view any of the XML data. Websites that use JavaScript to manipulate content without using AJAX will suffer from the same issues described below.

When a search engine sends out a robot/spider to visit your website with the goal of indexing your content, it only looks at what is presented in the markup language. A search engine does not behave like a user when indexing your website: it doesn't click buttons or links; it simply makes note of the URLs associated with each page, then visits each of those URLs individually to index them. This largely works against the goal of AJAX, which is to have as few pages as possible by interacting with the web server in a smarter way as users interact with the website.

To put the last paragraph simply: any content that is changed via AJAX or JavaScript and is not hardcoded in a page won't be cached by the search engines. This essentially means that if you have great content the search engines might love, but you're using AJAX to deliver it, you may be missing out on traffic. There are two approaches to rectifying this which may even give you an advantage over sites that don't use JavaScript/AJAX.

The first approach is to make sure that your website degrades to normal flat markup for non-JavaScript-capable browsers and search engines. Essentially, every time you would have used an AJAX call, make sure you have a page with the same content. Unfortunately, for a lot of people this could mean a lot of work; for those individuals using a database with PHP or ASP, it is not too hard to build a site that builds itself with some effective web programming.
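One common pattern for this first approach is to make every AJAX trigger an ordinary link to a real page, then let JavaScript intercept it when available. A hedged sketch; the URL, element id, and helper function are hypothetical:

```html
<!-- Without JavaScript (or for a crawler), this is a normal link to a
     real, indexable page containing the same content. -->
<a id="reviews-link" href="/reviews.html">Customer Reviews</a>

<script type="text/javascript">
// With JavaScript, intercept the click and load the content in place
// instead; the crawler never needs to run this.
document.getElementById("reviews-link").onclick = function () {
  loadReviewsViaAjax();  // hypothetical AJAX helper
  return false;          // cancel the normal navigation
};
</script>
```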

The second approach is to use AJAX in a more minimalist fashion. The goal here is to present the search engines with your optimized content while making sure that any AJAX calls a user makes have no bearing on what you want the search engines to see. In fact, this can be used to remove content from your website which may negatively affect your rankings, such as testimonials. I've seen very few testimonials that actually do good things for a site's keyword density; I've even been known to optimize testimonials on clients' websites. With JavaScript/AJAX you could insert a random testimonial into a page without affecting that page's keyword density. The only downside to this approach is that some offsite keyword density tools actually use web browser rendering engines, so they may get false results because they take the JavaScript into account.
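The testimonial trick described above can be sketched in a few lines of JavaScript; the testimonial text and element id are made up for illustration:

```javascript
// Hypothetical client testimonials; on a real site these might come
// from a server-side script or a JSON file.
const testimonials = [
  "Beanstalk doubled our organic traffic in six months!",
  "Great service and clear reporting.",
  "Our rankings have never been better."
];

// Pick one testimonial at random. Because this runs in the visitor's
// browser, the text is never part of the crawled HTML, so it cannot
// dilute the page's keyword density.
function randomTestimonial() {
  return testimonials[Math.floor(Math.random() * testimonials.length)];
}

// In the browser you would then inject it into a placeholder element:
// document.getElementById("testimonial").textContent = randomTestimonial();
```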

Now you may think that I'm anti-AJAX from everything I've said, but there is a time and place for AJAX, provided it doesn't affect how the search engines see the beautiful, relevant content you're trying to rank. AJAX is great for member sections of your website, interactive forms, slideshows, and a lot more; it just needs to be leveraged correctly to avoid missing out on search engine visitors. The final thing to keep in mind is that most search engines like to see more than a single-page website, which is what many AJAX websites appear to be; always strive for at least 5 or more indexable pages, as internal links and anchor text can have a lot of value.

SEO news blog post by @ 2:13 pm


 

 

February 27, 2009

An Introduction To SEO

Welcome to Daryl Quenet's introduction to Search Engine Optimization (SEO), optimizing design, and how to maximize your website's search engine positioning for the major search engines.

When it comes to running an effective website that ranks well on the search engine results pages (SERPs), there are three major factors that can influence the number of search engine referrals (incoming searches) you get. This applies to all the major search engines (Google, Yahoo, MSN, and Live).

Content Is King

The most important thing is the content on your page. Regardless of how much time you put into search engine optimization for your website, without the content people are searching for you will see very little return on your efforts.

Part of preparing your content is analyzing the keyword(s) for your given industry. Just putting keywords in the keywords meta tag will get you nowhere without those keywords existing in your content. This is known as keyword density: basically, the more often your keywords appear, the more relevant your content is for the searcher in the eyes of a search engine. Keep in mind an ideal density is around 3.5% per word in your phrase.
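As a rough sketch of the arithmetic (real tools tokenize and weight phrases differently), keyword density is simply occurrences divided by total words:

```javascript
// Naive keyword-density calculation: occurrences / total words * 100.
// Splits on whitespace only, so punctuation attached to a word will
// prevent a match; real density tools are smarter about tokenizing.
function keywordDensity(text, keyword) {
  var words = text.toLowerCase().split(/\s+/).filter(Boolean);
  var target = keyword.toLowerCase();
  var hits = words.filter(function (w) { return w === target; }).length;
  return (hits / words.length) * 100;
}

// "hosting" appears 2 times in the 6 words below, so the density
// works out to roughly 33.3%.
var d = keywordDensity("web hosting plans and hosting support", "hosting");
```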

When writing your Search Engine Optimized content don’t forget about the end user. If you can’t get your keyword densities bang on, then don’t worry about it. I prefer to have a lower density but higher quality content for the end user, than having spammy content and a lower conversion rate. The end goal is still to convert your visitors to your products, services, or whatever your goal may be. Users, unlike search engines, are not interested in Keyword Density so beware of keyword spam.

And a final note on content for this introduction: it is advisable to constantly update your content. The longer your content goes without updates, the staler it gets, and the lower your search engine positioning will drop. With enough link building, however, this can be negated.

Link Building Your Way To Success

Link building is easily the second most important factor in SEO, and in some cases the most important. Building links into your website is the only way you, as a webmaster, can affect the authority of your website and the value your existing content has in the eyes of the search engines.

To conceptualize link building, think of your website as if it were a person. The more popular people are, the more authoritative what they have to say is to their target audience. The big difference is that our target audience is Google and the other major search engines, and having quality links on other sites equates to your website's "popularity".

Now keep in mind when you start your link building that nearly no two links are exactly the same. When Google calculates the value of a link it looks at several important things to figure out just how much strength to give you. Here are just a few:

  1. How much strength the page with the link has
  2. The number of external links on that page
  3. The anchor text used for the link
  4. Whether a rel=nofollow attribute is used
  5. How long the link has been there

Now keep in mind, all of the factors above are irrelevant if Google hasn't cached the page with the link; if Google hasn't found it, it is worth nothing. The stronger the page your link is on, the more strength you will get in return. The more outgoing links there are on a page, the more that strength will be divided between all the linked sites.
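A toy model of the division described above makes the point concrete. The numbers and formula are illustrative only; Google's real calculation is unpublished:

```javascript
// Toy model: a page's outgoing strength is split evenly among its
// external links. This is NOT Google's actual formula, just a way to
// see why more outgoing links mean less value passed per link.
function strengthPerLink(pageStrength, externalLinkCount) {
  if (externalLinkCount <= 0) return 0;
  return pageStrength / externalLinkCount;
}

// A strength-100 page with 4 external links passes 25 per link;
// the same page with 50 external links passes only 2 per link.
```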

A link with a rel="nofollow" attribute is virtually useless to your website, other than increasing your overall link count to give your competitors a scare. You will mainly find nofollow attributes on blog comments, website advertisers/sponsors, paid links, or links to competitors (I use them on my resume for past work experience).

When a link is built, very few search engines will give you the full strength of that link right away. This is done to maintain the quality of the SERPs: if everyone could just go out, build thousands of links, and then rank, there would be no quality to the search results. Instead the engines slowly give you more strength as these links age, up until around the 6-month mark.

Lastly, you will constantly see something called Google PageRank. PageRank is an arbitrary Google measurement assigned to a website/page to denote that page's strength. Some people consider this measurement to be the be-all and end-all, but in truth it means very little other than being an indicator of your site's health. If you have PageRank on your homepage as well as on most of your internal pages, you're off to a good start. Also keep in mind that PageRank only updates every 3-6 months, and ultimately the proof is in the search engine results, not some number in the toolbar.

* It’s important to note that when I’m referring to PageRank above I’m referring to the visual PageRank displayed in the little green bar, not the actual PageRank that Google uses internally to calculate the value of a page.

Optimize Your Website Navigation Structure & Design

I purposely left site structure to last, as it can be the quickest way to royally mess up your website rankings. In the worst case, bad structure means no part of your website will be cached and you will see no visitors. I've seen a lot of sites with issues that prevent search engines from crawling them at all. Some of the worst, yet simplest, structural issues that can affect your search engine crawler visibility are:

  1. Automatically redirecting all visitors that come to your site to another page.
  2. Using HTTPS only
  3. Pure Javascript based navigation

On other sites I have seen Google cache only the index page, which may have an assigned PageRank, without spidering the rest of the website. The things to remember when mapping out the structure of your website are:

  1. At all costs avoid dynamic URLs (i.e. index.php?PageId=1); a dynamic URL is a URL that contains HTTP GET variables. Search engines don't tend to spider these well, and to users they carry no information relevant to their queries. Try to use page keys that contain your keywords. If you need dynamic scripts to build your website (i.e. through a Content Management System), use Apache mod_rewrite to build a static-looking website (link removed). If you have to use dynamic URLs, keep the number of variables to no more than 2.
  2. If possible, use the keywords you are targeting for your industry in your URLs, files, or directories. This helps increase your keyword density, and gives users clicking through from Google file names relevant to their query.
  3. Don't constantly change your website structure. PageRank takes time to develop naturally, and Google holds new sites back in a sandbox. By renaming a page you can often kiss your pre-existing search engine positioning goodbye until the renamed page's rank is redeveloped.
  4. When designing a new site try to avoid having filenames with extensions in the URL (ie Products.asp), this can limit your options in the future if you change programming languages (ie ASP to PHP), as well as the platform your website can be hosted on (ie Windows vs Linux Hosting).
  5. When implementing a new structure or new site, create a Google sitemap, and register it with Google to let Google know what to index.
  6. Whenever possible, attach CSS and JavaScript as external files.
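The mod_rewrite trick in point 1 can be sketched like this, assuming an Apache host; the URL, script name, and PageId are examples:

```apache
RewriteEngine On
# Map a static-looking, keyword-rich URL onto the dynamic script, so
# visitors and crawlers never see the GET variables.
# /widgets/blue-widget.html  ->  /index.php?PageId=12
RewriteRule ^widgets/blue-widget\.html$ /index.php?PageId=12 [L]
```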

Once you have decided on a website structure, or you have a pre-existing structure, the best way to score higher search engine positions is to have minimalist coding in the HTML to maximize your Content to Markup Ratio. The best way to minimize the amount of HTML code required is to use Cascading Style Sheets (CSS). Cascading Style Sheets allow you to pull the design out of your HTML pages and place them into a separate file. Not only does this remove a lot of HTML if you were using Tables for layout, it makes maintenance a lot simpler as all your design changes are made in one place.
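Pulling the design out into a separate file is as simple as one line in the document head plus an external stylesheet; a minimal sketch with placeholder file names:

```html
<head>
  <!-- One external file holds all design rules for every page -->
  <link rel="stylesheet" type="text/css" href="/styles/site.css">
</head>
<body>
  <!-- Divs styled from site.css replace nested layout tables -->
  <div id="nav">...</div>
  <div id="content">Your keyword-rich content, now higher in the markup.</div>
</body>
```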

When I moved my website from table based layout to Cascading Style Sheets I managed to reduce my markup code by around 60%! If you have a very large site this can be even more beneficial as some search engines limit the amount of hard drive space they will allocate to caching your website, as well as raise the position of your content higher up in your document.

Conclusion

And thus concludes my introduction to Search Engine Optimization (SEO). It may sound long-winded, but that is really just a little bit of what goes into successfully positioning your website on the search engines. I'll finish with one last warning: do not buy or sell links, as you can easily be penalized right out of the SERPs for this (Google supplies a page for reporting websites that buy and sell links). Good luck with your search engine result pages and positioning!

SEO news blog post by @ 5:10 pm


 

 

Ecommerce & SEO

The purpose of any business website is to promote a product or service online. The purpose of an ecommerce website is to take it one step further and allow your visitors to purchase your products or services directly from your website. This model has many advantages over the non-ecommerce website: it allows for the generation of revenue with little or no time spent selling beyond the cost of having the website designed and maintained, and it does not require the visitor to call you during business hours, helping secure the sale from an impulse buyer. If your website provides all the information a buyer would want, you can save significant sales time because the visitor can find everything they need to decide to buy from you without taking up your time or that of your sales staff. But ecommerce sites have a serious drawback as well: very few of them can be properly indexed by search engine spiders, and thus they fail to rank highly.

A non-ecommerce website may have the disadvantage of not being able to take the visitor's money the second they want to spend it; however, if it can be found on the first page of the search engines while your beautifully designed ecommerce site sits on page eight, the advantage is theirs. The vast majority of visitors will never get to see your site, let alone buy from you, whereas a non-ecommerce site may lose sales because it doesn't sell online, but at least it is able to deliver its message to an audience in the first place. So what can be done? The key is in the shopping cart you select.

SEO & Shopping Carts

The biggest problem with many SEO-friendly ecommerce solutions is that they are created after the initial product. Shopping cart systems such as Miva Merchant and osCommerce are not designed with the primary goal of creating pages that will be well received by the search engine spiders. Most shopping cart systems today are not, in and of themselves, even spiderable, and require 3rd-party add-ons to provide even the lowest form of SEO-friendliness. The money you may have saved choosing an inexpensive shopping cart may very well end up costing you your business in the long run, especially if you are using your shopping cart as the entire site, which we have seen many times in the past.

What Can Be Done?

There are essentially two solutions to this problem. The first is to create a front-end site separate from the shopping cart. What this will effectively do is create a number of pages that can be easily spidered (assuming that they’re well designed). The drawback to this course of action is that your website will forever be limited to the size of the front-end site. Which brings us to the second option: choose a search engine friendly shopping cart system.

Finding an SEO-friendly shopping cart system is far easier said than done. There are many factors to take into account, including the spiderability of the pages themselves, the customization capacity of the individual pages, the ease of adding products and changing the pages down the road, etc. While I've worked with many shopping cart and ecommerce systems, to date only one has truly impressed me: it is extremely simple to use, it allows full customization of individual pages, and its product pages get fully spidered to the point where they have PageRank assigned. A rarity in the shopping cart world.

Easy As Apple Pie

Mr. Lee Roberts, President of Rose Rock Design and creator of the Apple Pie Shopping Cart, was kind enough to take the time to speak with me regarding how he developed his system. Trying to get an understanding of how this system was born I inquired as to what differentiated their system from others. Without “giving away the farm”, Lee pointed out that his system was unique in that the search engines were a consideration from the birth of this project. Rather than trying to jerry-rig a system that was already in place, he initiated the development of a system whose first task was to allow for easily spidered and customized pages. A significant advantage to be sure.

In further discussions he pointed out a few key factors that everyone should consider when choosing a shopping cart system. While more advanced shopping cart systems that provide SEO-friendly pages may seem more expensive, they save you the cost of developing a front-end site and of maintaining pricing on static pages if one goes that route. And of course, if all your site's pages are easily spidered, you can have hundreds of additional relevant pages adding to your site's overall strength and relevancy, a serious advantage in the SEO "game". If a shopping cart system costs you an extra $100 per month to maintain but its use provides you with an additional $5000 in sales that month, did it really "cost" you $100?

Conclusion

This is not to say that the Apple Pie Shopping Cart is the be-all and end-all of SEO for an ecommerce site; if it were, Lee wouldn't be in the process of building a new version with many new features for Internet marketing and tracking, and we would be out of work. That said, if you've got an ecommerce site or are looking to have one built, consider what type of marketing strategy will be taken with the site, and if SEO is part of it, ensure you find a system that provides the same advantages as this one.

It may cost a bit more up front, but doing it right the first time is far less costly than building a site that can't be marketed properly and to its maximum potential.

SEO news blog post by @ 3:46 pm


 

 

November 22, 2004

Ten Steps To A Well Optimized Website – Step Five: Internal Linking

Welcome to part five in this search engine positioning series. Last week we discussed the importance of content optimization. In part five we will cover your website’s internal linking structure and the role that it plays in ranking highly, and in ranking for multiple phrases.

While this aspect is not necessarily the single most important of the ten steps it can be the difference between first page and second page rankings, and can make all the difference in the world when you are trying to rank your website for multiple phrases.

Over this series we will cover the ten key aspects to a solid search engine positioning campaign.

The Ten Steps We Will Go Through Are:

  1. Keyword Selection – October 24, 2004
  2. Content – October 31, 2004
  3. Site Structure – November 7, 2004
  4. Optimization – November 14, 2004
  5. Internal Linking – November 21, 2004
  6. Human Testing – November 29, 2004
  7. Submissions – December 5, 2004
  8. Link Building – December 12, 2004
  9. Monitoring – December 19, 2004
  10. The Extras – December 28, 2004

Step Five – Internal Linking

With all the talk out there about linking, one might be under the impression that the only links that count are those from other websites. While these links certainly play an important role (as will be discussed in part eight of this series) these are certainly not the only important links.

When you're about to launch into your link work, why not stop and consider the links that are easiest to attain and maximize first: the ones right there on your own site, over which you have total and complete control. Properly used internal links can be a useful weapon in your SEO arsenal.

The internal linking structure can:

  1. Ensure that your website gets properly spidered and that all pages are found by the search engines
  2. Build the relevancy of a page to a keyword phrase
  3. Increase the PageRank of an internal page

Here is how the internal linking structure can affect these areas and how to maximize the effectiveness of the internal linking on your own website.

Getting Your Website Spidered

Ensuring that every page of your website gets found by the search engine spiders is probably the simplest thing you can do for your rankings. Not only will this increase the number of pages a search engine credits your site with, it also increases the number of phrases your website has the potential to rank for.

I have seen websites that, once the search engines found all of their pages, ranked on the first page and saw traffic from phrases they never thought to research or target.

This may not necessarily be the case for you; however, having a larger site with more pages related to your content will boost the value of your site overall. You are offering this content to your visitors, so why hide it from the search engines?

Pages can be hidden from search engines if the linking is done in a way that they cannot read. This is the case in many navigation scripts. If your site uses a script-based navigation system then you will want to consider the implementation of one of the internal linking structures noted further in the article.

Additionally, image-based navigation is spiderable; however, the search engines can't see what an image depicts and thus cannot assign any relevancy from an image to the page it links to, other than assigning it a place in your website hierarchy.

Building The Relevancy Of A Page To A Keyword Phrase

Anyone who wants to get their website into the top positions on the search engines for multiple phrases must start out with a clearly defined objective, including which pages should rank for which phrases. Generally speaking it will be your homepage that you will use to target your most competitive phrase and move on to targeting less competitive phrases on your internal pages.

To help build the relevancy of a page to a keyword phrase you will want to use the keyword phrase in the anchor text of the links to that page. Let’s assume that you have a website hosting company. Rather than linking to your homepage with the anchor text “home” link to it with the text “web hosting main”. This will attach the words “web” and “hosting” and “main” to your homepage. You can obviously leave the word “main” out if desirable however in many cases it does work for the visitor (you know, those people you’re actually building the site for).

This doesn’t stop at the homepage. If you are linking to internal pages either through your navigation, footers, or inline text links – try to use the phrases that you would want to target on those pages as the linking text. For example, if that hosting company offered and wanted to target “dedicated hosting”, rather than leaving the link at solely the beautiful graphic in the middle of the homepage they would want to include a text link with the anchor text “dedicated hosting” and link to this internal page. This will tie the keywords “dedicated hosting” to the page.
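In markup, the difference is just the anchor text, plus a plain text link alongside the graphic; the URLs here are placeholders:

```html
<!-- Weak: the anchor text tells the engines nothing -->
<a href="/">Home</a>

<!-- Better: ties "web hosting" to the homepage -->
<a href="/">Web Hosting Main</a>

<!-- A text link tying "dedicated hosting" to the internal page -->
<a href="/dedicated-hosting.html">Dedicated Hosting</a>
```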

In a field as competitive as hosting this alone won’t launch the site to the top ten however it’ll give it a boost and in SEO, especially for competitive phrases, every advantage you can give your site counts.

Increasing The PageRank Of Internal Pages

While we will be discussing PageRank (a Google-based term) here, the same rules generally apply to the other engines. The closer a page is in clicks from your homepage, the higher the value (or PageRank) the page is assigned. Basically, if I have a page linked to from my homepage it will be given more weight than a page that is four or five levels deep in my site.

This does not mean that you should link to all of your pages from your homepage. Not only does this dilute the weight of each individual link, but it will look incredibly unattractive if your site is significantly large.

Figure out what your main phrases are and which pages will be used to rank for them and be sure to include text links to these internal pages on your homepage. It’s important to pick solid pages to target keyword phrases on as you don’t want human visitors going to your “terms and conditions” page before they’ve even seen the products.

If that hosting company noted above has a PageRank 6 homepage, the pages linked from its homepage will generally be PageRank 5 (sometimes 4, sometimes 6, depending on the weight behind the homepage’s 6). Regardless, that is significantly higher than if the page were linked to from a PageRank 3 internal page.

How To Improve Your Internal Linking Structure

There are many methods you can use to improve your internal linking structure. The three main ones are:

  1. Text link navigation
  2. Footers
  3. Inline text links

Text Link Navigation

Most websites include some form of navigation on the left-hand side. This makes it one of the first things read by a search engine spider (read “Table Structures For Top Search Engine Positioning” by Mary Davies for methods of getting your content read before your left-hand navigation). Because it is one of the first things the spider sees when it goes through your site, it carries strong weight and must be optimized with care.

If you are using text link navigation, be sure to include the targeted keywords in the links. This should not be taken to mean “cram your keywords into each and every link”; this is your navigation, and that would look ridiculous. I’ve seen sites that try to get the main phrase into virtually every link. Not only does this look horrible, it may get your site penalized for spam (especially if the links appear one after another).

You don’t have to get your keywords into every link, but if workable, every second or third link works well. Also consider what you are targeting on internal pages. Say your homepage targets “web hosting” and you’ve linked to it in the navigation with “web hosting main”, followed by your contact page linked as “contact us”. It would then be a good idea to use the anchor text “dedicated hosting” for the third link: it reinforces the “hosting” relevancy and also attaches the phrase “dedicated hosting” to the dedicated hosting page of the site.
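For the hosting example above, a hypothetical text link navigation (page names and URLs invented) might weave the keywords into every second or third link rather than cramming them into all of them:

```html
<ul>
  <li><a href="/">Web Hosting Main</a></li>
  <li><a href="/contact.html">Contact Us</a></li>
  <li><a href="/dedicated-hosting.html">Dedicated Hosting</a></li>
  <li><a href="/about.html">About Us</a></li>
  <li><a href="/shared-hosting.html">Shared Hosting Plans</a></li>
</ul>
```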

Footers

Footers are the often overused and abused area of websites. While they are useful for getting spiders through your site and for the other points noted above, they should not be used as spam tools. In my travels I’ve seen footers longer than the content areas of their pages, linking to every single page on the site. Not only does this look bad, it reduces the value of each individual link (which becomes 1 of 200 links rather than 1 of 10 or 20).

Keep your footers clean, use the anchor text well, and link to the key internal pages of your website, and you will have a well-optimized footer. You will also want to include in your footer a link to a sitemap. On this sitemap, link to every page in your site; this is where you can ensure that every page gets found. Well-worded anchor text is a good rule on your sitemap as well. You may also want to consider a brief description of each page on your sitemap. This gives you added verbiage to solidify the relevancy of the sitemap page to the pages you are linking to.
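A hypothetical sketch of the two pieces described above (URLs and descriptions invented): a clean footer linking to a handful of key internal pages plus the sitemap, and a sitemap entry with worded anchor text and a short description:

```html
<!-- Footer: a few key pages and the sitemap, not every page on the site -->
<div id="footer">
  <a href="/">Web Hosting</a> |
  <a href="/dedicated-hosting.html">Dedicated Hosting</a> |
  <a href="/shared-hosting.html">Shared Hosting</a> |
  <a href="/sitemap.html">Sitemap</a>
</div>

<!-- On sitemap.html: every page linked, each with worded anchor text
     and a brief description to reinforce relevancy -->
<p><a href="/dedicated-hosting.html">Dedicated Hosting</a> –
   Dedicated server plans with full root access.</p>
```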

Internal Text Links

Internal text links are links placed within the content of your work. They were covered in last week’s article on content optimization, which gives me a great opportunity to use one as an example.

While debatable, inline text links do appear to be given extra weight, as their very nature implies that the link is entirely relevant to the content of the page.

You can read more on this in last week’s article.

Final Notes

As noted above, simply changing your internal navigation will not launch your site to the top of the rankings; however, it’s important to use each and every advantage available to create a solid top ten ranking for your site that will hold its position.

These techniques will get your pages performing better: they will help get your entire site spidered, increase the value of internal pages, and build the relevancy of internal pages to specific keyword phrases.

Even if that’s all they do, aren’t they worth taking the time to do right?

Next Week

Next week in part six of our “Ten Steps To an Optimized Website” series we will be covering the importance of human testing. Having a well-ranked website will mean nothing if people can’t find their way through it or if it is visually unappealing.

SEO news blog post by @ 4:28 pm



November 7, 2004

Ten Steps To A Well Optimized Website – Step Three: Site Structure

Welcome to part three in this search engine positioning series. Last week we discussed the importance of, and the considerations that must be made while, creating the content that will provide the highest ROI for your optimization efforts. In part three we will discuss the importance of site structure.

While there are numerous factors involved in the search engine algorithms, site structure is certainly of constant importance. Cleaner structures, which remove lines of code standing between your key content and the search engine spiders, can mean the difference between second-page and first-page rankings.

Over this series we will cover the ten key aspects to a solid search engine positioning campaign.

The Ten Steps We Will Go Through Are:

  1. Keyword Selection – October 24, 2004
  2. Content – October 31, 2004
  3. Site Structure – November 7, 2004
  4. Optimization – November 14, 2004
  5. Internal Linking – November 21, 2004
  6. Human Testing – November 29, 2004
  7. Submissions – December 5, 2004
  8. Link Building – December 12, 2004
  9. Monitoring – December 19, 2004
  10. The Extras – December 28, 2004

Step Three – Site Structure

Developing the structure of your website is a very important step in its overall optimization. The site structure will dictate how the spiders read your site, what information they gather, what content holds the most weight, how much useless code they must weed through and more. You must structure your website to appeal to the visitor and the spiders.

When developing your website you want to be sure not to create useless code that can confuse spiders and take away from the content of your site. I recommend hand coding as the best option; however, not everyone has the time or the skill to do this, so I would suggest Dreamweaver as a great alternative. (Though the code will not be as clean as hand coding, it does not create an over-the-top amount of extra code the way programs such as FrontPage do.) The object here is to keep the code as clean as possible! Remember: the more code you have, the more the spiders must weed through to get to your content, which is where you want them to be.

A great way to cut down on extra code is to use style sheets. You can use style sheets in ways as simple as defining fonts or as advanced as creating tableless designs. There are many ways to use them, and the biggest perk is cutting back on the code on any given individual page.
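As a simple sketch of the idea (values and filenames invented; in practice the rules would live in an external .css file shared by every page), a single style rule replaces font tags repeated throughout the site:

```html
<!-- Before: the same font tag repeated around every block of text -->
<font face="Arial" size="2" color="#333333">Welcome to our hosting plans.</font>

<!-- After: one line in the head pulls in the shared style sheet -->
<link rel="stylesheet" type="text/css" href="/styles.css">
<!-- styles.css contains a single rule such as:
     body { font-family: Arial, sans-serif; font-size: 12px; color: #333333; } -->
<p>Welcome to our hosting plans.</p>
```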

When you are setting up the initial structure of your site, be sure the table structure is laid out in such a way that the spiders can get to the most important content as easily and quickly as possible. A great way to achieve this is to build your website using the table structure outlined in my article “Table Structures For Top Search Engine Positioning“. When the spiders visit your site they read through it top to bottom, left to right, following the rows and columns.

The key to the table structure outlined above is the little empty cell. Were it not there, the spiders would read through the first column, hitting nothing but images and Alt tags (your navigation) before moving on to the next column, your content area. Placing this empty cell in the first row of the main table guides the spiders directly to your content: they hit the empty cell and, with nothing to read, move on to the next column to the right, exactly where you want them. After they have read your content they move back to the left in row 2 and read your navigation images and Alt tags; finally, they end the page at your footer, a great place for keyword-rich text links. (Internal linking structures will be covered in part 5 of this 10-part series.)
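A bare-bones sketch of that layout (cell contents are placeholders, not the exact markup from the referenced article): the empty cell in row 1, column 1 has nothing to read, so the spider moves right to the content before it ever reaches the navigation:

```html
<table>
  <tr>
    <td><!-- empty cell: spider finds nothing and moves right --></td>
    <td rowspan="2">Main content – the first real text the spider reads</td>
  </tr>
  <tr>
    <td>Image-based navigation with keyword-rich Alt tags</td>
  </tr>
  <tr>
    <td colspan="2">Footer with keyword-rich text links</td>
  </tr>
</table>
```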

Once you have created the site structure and inserted all of your content, you can begin the basic optimization of your site. In your code you will want to create Meta tags that fit your keyword choices. The two most important Meta tags are the Description tag and the Keywords tag. Your description should highlight your keyword phrase while staying focused, to the point and readable. Your keywords tag should also be focused, using each keyword a maximum of 3 times in any set. These tags should be customized on each page to fit the specific phrase targeted.

After the Meta tags have been inserted appropriately on each page, it is important to title each page appropriately. The main targeted phrase should be the focus of the title. Keep it simple, focused and to the point; do not bog it down with extra descriptive text. This is not your description, it is your title.
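Pulling the Meta tag and title advice together, a hypothetical head section for the dedicated hosting page of the earlier example might look like this (all content invented for illustration):

```html
<head>
  <!-- Title: simple and focused on the page's main phrase -->
  <title>Dedicated Hosting</title>
  <!-- Description: highlights the phrase; focused, to the point, readable -->
  <meta name="description" content="Dedicated hosting plans with full root access and 24/7 support.">
  <!-- Keywords: focused, with no keyword used more than 3 times -->
  <meta name="keywords" content="dedicated hosting, dedicated servers, web hosting plans">
</head>
```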

Next, move on to Alt tags. Though it is good practice to add Alt tags to all your images, the spiders only put weight on those contained within links. An example of this:

<a href="http://www.beanstalk-inc.com"><img src="/Images/webhead.jpg" alt="Beanstalk Search Engine Optimization" width="461" height="145" border="0"></a>

These Alt tags allow you to make your images matter. Most main navigation is image-based, so be sure to add appropriate Alt tags, targeting your keywords, to this very prominent area of your site. Another great place to add a link along with its Alt tag is your header image. Linking this image to your URL means the first thing the spiders hit within your tables holds at least some content that “matters”, rather than simply a static image.

H1 tags are also a great way to add weight to your content; however, use them wisely. You can use any of the H1, H2, H3, H4 tags, the idea being that H1 carries the most weight, H2 a little less, and so on. Do not overuse these tags or they will lose their value altogether. The correct way is to use them where they actually belong, for example the first line of text on a page: the title. Also, if you are defining your fonts in a style sheet (which you should be), be sure not to abuse these tags; an H1 should be defined bigger than an H2, and so on.
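A minimal sketch of that point (sizes and headings invented): heading tags used where they actually belong, with the style sheet keeping an H1 visibly larger than an H2:

```html
<style type="text/css">
  /* H1 stays bigger than H2 so the visual hierarchy matches the weight */
  h1 { font-size: 20px; }
  h2 { font-size: 16px; }
</style>

<!-- The page title is the natural H1; section titles take H2 -->
<h1>Dedicated Hosting</h1>
<h2>Plans and Pricing</h2>
```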

Utilizing the above tips will create a site structure that is the perfect environment for the spiders: clean, focused and easily read. Your site structure is now optimized and ready for the more advanced content optimization elements to come.

SEO news blog post by @ 4:19 pm



October 20, 2004

Ten Steps To A Well Optimized Website Series

Due to the great interest and feedback we received from our article “Ten Steps To Higher Search Engine Positioning” we decided to cover each of the ten steps in greater detail in a ten part series.

Below you will find links to all ten steps of this series.

The Ten Steps To A Well Optimized Website:

  1. Keyword Selection – October 24, 2004
  2. Content – October 31, 2004
  3. Site Structure – November 7, 2004
  4. Optimization – November 14, 2004
  5. Internal Linking – November 21, 2004
  6. Human Testing – November 29, 2004
  7. Submissions – December 5, 2004
  8. Link Building – December 12, 2004
  9. Monitoring – December 19, 2004
  10. The Extras – December 28, 2004

SEO news blog post by @ 2:13 pm



Copyright© 2004-2014
Beanstalk Search Engine Optimization, Inc.
All rights reserved.