Link Reduction for Nerds

Let’s face it, even with our best efforts to make navigation clear and accessible, many websites are not as easy to navigate as they could be.

It doesn’t matter if you’re a first-page superstar or a mom-and-pop blog with low traffic; most efforts really are no match for the diversity of our visitors.

When I first started blogging on SEO topics for Beanstalk, I put a lot of effort into making my posts as accessible as I could, with a bunch of different tricks like <acronym> tags (now <abbr> tags) and hyperlinks to any content that could be explored further.

Like a good SEO, I added rel="nofollow" to any external links, because that totally fixes all problems, right?

“No… not really.”

External links that are actually relevant to your topic and point to a trusted resource should not be marked as nofollow, especially in the case of discussions or dynamic resources where you could be referencing a page that was recently updated with information on your topic. In that case you need the crawlers to see that the remote page is relevant now.

Internal links are also a concern when they become redundant or excessive. If all your pages link to all your pages, you’re going to have a bad time.

If you went to a big new building downtown and asked the person at the visitors’ desk for directions, and that fellow stopped every few words to explain what he meant by each word, you might never understand the directions, at least not before you were late for wherever you were headed.

Crawlers, even smart ones like Googlebot, don’t really appreciate a dozen different URLs on one page that all go to the same place. It’s a waste of resources to keep feeding the same URL to the spiders as they crawl each of your pages.

In fact, in some cases, if your pages are full of repeated links to more pages with the same internal link structure, all the bots will see are the same few pages/URLs until they take the time to push past the repeated links and get deeper into your site.

The boy who cried wolf.

The boy who cried wolf would probably be jumping up and down with another analogy, if the wolves hadn’t eaten him, and your competition will just as gladly eat your position in the SERPs if your site keeps sending the crawlers to the same few pages.

Dave Davies has spoken about this many times, both on our blog and on Search Engine Watch: Internal Linking to Promote Keyword Clusters.

“You really only NEED 1 link per page.”

Technically, you don’t actually need any links on your pages; you could just use JavaScript that changes window.location when desired, and your pages would still work for visitors. But how would the robots get around without a sitemap? How would they understand which pages connect to which? Madness!
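
For illustration only, here’s a minimal sketch of that link-free approach (the goToArticles function and the articles-button element are made up for the example); visitors could still navigate, but crawlers would find no href to follow:

// Hypothetical link-free navigation: works for visitors, invisible to crawlers.
function goToArticles() {
    window.location.href = '/articles/';
}
document.getElementById('articles-button').addEventListener('click', goToArticles);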

But don’t toss JavaScript out the window just yet; there’s a middle ground where everyone can win!

If you use JavaScript to send clicks to actual links on the page, you can mark up more elements of your page without making a spaghetti mess of your navigation and without sending crawlers on repeated visits to duplicate URLs.

“In fact jQuery can do most of the work for you!”

Say I wanted to suggest you look at our Articles section, because we have so many articles there, but I didn’t want our articles page linked too many times.

Just tell jQuery to first find the matching anchor (<a>) tag:
jQuery("a[href='/articles/']")

Then tell it to add an ID to that anchor:
.attr('id', '/articles/');

And then tell it to send a click to that ID:
document.getElementById('/articles/').click();

Finally, make sure that your element’s style clearly matches the site’s style for real hyperlinks (i.e. cursor: pointer; text-decoration: underline;).
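
Putting those steps together, here’s a rough sketch of how it could all hang together (the .articles-teaser class is a made-up hook for whatever extra page elements should act like the one real link):

// Tag the single real anchor so other elements can find it.
jQuery("a[href='/articles/']").first().attr('id', '/articles/');

// Forward clicks from decorated elements to that one real link.
// '.articles-teaser' is a hypothetical class for those elements.
jQuery('.articles-teaser').css('cursor', 'pointer').on('click', function () {
    document.getElementById('/articles/').click();
});

This way the crawlers only ever see the single <a href="/articles/"> in the markup, while visitors can click any element you’ve wired up.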

UPDATE: For Chrome you either need to refresh the page or send the following response header (shown here as a PHP header() call): header("X-XSS-Protection: 0");

SEO news blog post by @ 6:07 pm on August 28, 2013


 
