Google+ plus company profiles, plus company page, plus site link?

Pleasing plus is presently proving to be a problem with the plethora of possibilities. Confused by all the Plus linking options suddenly available? Here’s a round-up of what it looks like right now.

  1. Create a Google+ page for the company.
  2. Create employee G+ pages.
  3. Add your employee G+ pages to the company.
  4. Add a link or badge from your website to the G+ page for the company.
  5. Add rel=author links between content on your site and your employee pages.
  6. Add +1 options to the homepage and content/product pages.

Here’s a very busy illustration of the process:

(Illustration: create the Google+ pages; link your website to the company G+ page; add rel=author links between your content pages and the employee G+ pages; and make sure your site's landing page, content/blog, and product pages have +1 buttons.)
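
As a rough sketch of the markup those steps involve (the page/profile IDs and author name below are placeholders, and the official G+ badge is another option for the site-to-page link):

On the homepage, linking the site to the company G+ page:
<link rel="publisher" href="https://plus.google.com/111111111111111111111">

On a content page, crediting the employee/author's G+ profile (the profile should also list your site under "Contributor to" so Google can confirm the relationship):
<a href="https://plus.google.com/222222222222222222222" rel="author">by Jane Example</a>

Adding a +1 button to landing, content, and product pages:
<script type="text/javascript" src="https://apis.google.com/js/plusone.js"></script>
<g:plusone></g:plusone>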

I’d put your content/blog posts on your website first, and then follow up with a share to the G+ profile page of the employee/author responsible for the content.

That’s the whole process for G+ interaction between a website, staff pages, and the company page. Doing this properly will tell Google your content is legitimate and maximize the potential ranking signals for your site as it pertains to Google Plus.

Last step is getting folks to follow your Google+ page, hit the +1 buttons, and interact with your Google Plus postings/profile. We'll have some ideas for this, and for gaining followers on other social networks, once the excitement over the recent Panda updates dies down and we have more time to get back to addressing followers/traffic. Don't forget that past articles (of which we've had a few) may still apply or at least offer some ideas.

Hope everyone has a good long weekend!

SEO news blog post by @ 3:22 pm on November 10, 2011


 

Get your own Google+ Page

Today Google announced they are ready to let users claim pages on the G+ domain. It's a bit busy over here: Create a Google+ Page

Stampede to get Google+ pages. The servers over at Google+ must feel a bit like this?

… but you may want to put up with the line-up, because this is where you claim your name, brand, or trademark for Google+ pages.

Since I’m waiting in said line-up, I can’t demo the experience and relay first hand info, but I can share what I do know:

- Pages are 'private' right now.
- Only the creator has access, so for a company page, use the company account.
- Access for other users on company pages is coming.
- Expect page invites to be a bit excessive on larger profiles to start with.

Oh joy, my page is waiting for me to set up! Are you folks still reading this? Go!

SEO news blog post by @ 11:55 am on November 8, 2011


 

Finding Your Way With Sitemaps

If you don’t know what a sitemap is, or have never created one…read on. A sitemap is a list of the individual pages on your website displayed in a hierarchical fashion similar to a table of contents, or index. They are sometimes used as a planning tool during the developmental stages of a site design, but more importantly, sitemaps act as a powerful navigational aid by providing a site overview at a glance. Sitemaps also benefit search engine optimization by ensuring that all the pages of a site can be found by web bots.


At one time, sitemaps were viewed as a luxury, or at the very least, not vital; that is no longer the case. For new sites they are especially critical, as it can take several months for a new site to get crawled and indexed by the search engines. Implementing a sitemap and submitting it to the search engines via tools such as Google Webmaster Tools will greatly aid in the indexing of your site. Sitemaps do not guarantee all links will be crawled, and being crawled does not guarantee indexing. However, a sitemap is still the best insurance for getting a search engine to learn about your entire site.

If your site is very large, has a complicated navigation system, or employs Flash or JavaScript menus that do not include HTML links, parts of the site may never get indexed. Even if you only have a small site, a sitemap will ensure that all your pages are linked to and will be picked up by the crawlers.

Users and crawlers will be able to access deep links and nested pages much more readily. Having well-named, SEO-friendly URLs in your sitemap also lets users search the sitemap site-wide for specific keywords they may be looking for. Sitemaps have also been shown to increase PageRank and link popularity for all the pages they link to. While it is more important to have high-quality links pointing to your site, you should not underestimate the usefulness of internal links pointing to your own pages.

Sitemaps are written and saved as an .xml file, the document structure and encoding standard web crawlers use to find and parse them. As such, they are very unforgiving and must contain only valid XML syntax (you can check your markup at http://validator.w3.org/). Pages can be prioritized on a sliding scale from 0.0 to 1.0, and the lastmod field lets search engine bots know when you last updated each page.
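
A minimal sketch of such a file, assuming a site at www.example.com (the URLs, dates, and priority values are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2011-11-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/services/</loc>
    <lastmod>2011-10-15</lastmod>
    <priority>0.8</priority>
  </url>
</urlset>

Save it as sitemap.xml in your site's root and submit the URL through Google Webmaster Tools.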

If, even after reading this post, you are still not convinced of the benefits of a sitemap, remember that Google has stated that a sitemap is a ranking factor for your site. It may be a small one, but added together with several other small ranking factors it all adds up, and a sitemap is considered a best practice for any website.

For further information, check out this page in the Google Webmaster Tools Help.

SEO news blog post by @ 11:52 am on November 2, 2011

Categories: Google, SEO Tips

 

Secure search service stirs SEOs slightly

Every once in a while there's an announcement that makes a huge kerfuffle online only to be yesterday's news the next week. Yesterday's news is that Google made the move towards secure searches for Google account holders who are logged in while searching. It was actually announced on the 18th, and I didn't see anything until Kyle mentioned it on the afternoon of the 19th, so it's actually worse than yesterday's news!


Anyone following search engine news could be forgiven for feeling a bit of déjà vu, since Google has offered secure search options since early 2010. The latest announcement that is stirring up responses is the fact that Google is now dropping the referrer info (including the search query) that would normally be passed along to the destination site, where it could be tracked and analyzed for SEO purposes.

Google has plenty of good reasons to make this move and only a few reasons against it. Here's a quick breakdown of the pros/cons:

Pros:

  • Most searchers are not logged in and won't be affected
  • Estimates suggest only 3% to 7% of current search traffic is logged in
  • The "(not provided)" entry in Google Analytics will show how much traffic is affected
  • Mobile users connecting from public WiFi networks can search securely
  • Users of free internet services will have additional privacy
  • HTTPS Everywhere is crucial and backed by Google
  • Webmaster Central still provides search terms to registered owners

Cons:

  • Mobile searchers tend to be logged in
  • Traffic projections for mobile search are growing
  • Google still makes the data accessible to its paid users
  • SSL is now becoming a much larger ranking factor

Amy Chang over on the Google Analytics blog had the following point to make:

“When a signed in user visits your site from an organic Google search, all web analytics services, including Google Analytics, will continue to recognize the visit as Google ‘organic’ search, but will no longer report the query terms that the user searched on to reach your site.”
“Keep in mind that the change will affect only a minority of your traffic. You will continue to see aggregate query data with no change, including visits from users who aren’t signed in and visits from Google ‘cpc’.”

Thom Craver, Web and Database specialist for the Saunders College at Rochester Institute of Technology (RIT) was quoted on Search Engine Watch as noting:

“Analytics can already run over https if you tell it to in the JavaScript Code … There’s no reason why Google couldn’t make this work, if the site owners cooperated by offering their entire site via HTTPS.”

Personally, as you can tell from my lead-in, I feel like this is much ado about nothing. Unless competing search engines are willing to risk user privacy/safety to cater to SEOs in a short term bid for popularity, this isn’t going to be repealed. I don’t like to see the trend of money = access, but in this case I don’t see much choice and I’ll stand behind Google’s move for now.

SEO news blog post by @ 12:12 pm on October 20, 2011


 

Resurrecting Dead Backlinks

I came across a great post today from JR Cooper on the SEOMoz site in which he was discussing how to use backlink checkers to find broken links and how to use these to obtain new links. First off he recommended a great new Chrome extension called "Check My Links."


I have only just installed the extension myself, so I can't comment on it directly yet. But the great things JR Cooper reports about it sound very compelling.

"Pretty much, it’s the greatest link building browser extension I’ve ever used. First of all, it’s extremely fast. Like almost too fast. It usually checks half the page in under 10 seconds. It also finds the links that are quickest to check, saving the links with long load times for last (I still don’t know how they do this). Best of all, I can check multiple pages at once, which saves some serious time because I usually find 50 pages at a time to check. As a bonus, it even tells you what kind of page error the broken link got (i.e. 404, 500, etc.)."

The description from the Chrome Web Store:

"Check My Links" is an extension developed primarily for web designers, developers and content editors (and SEOs).>When you’re editing a web page that has lots of links, wouldn’t it be handy to be able to quickly check that all the links on the page are working ok? That’s where &Check My Links" comes in. "Check My Links" quickly finds all the links on a web page, and checks each one for you. It highlights which ones are valid and which ones are broken, simple as that. HTTP response codes and full URLs of broken links are published in the Console log.

As most of us in the SEO industry are finding, it is becoming increasingly difficult to build links to your clients' websites. Tactics that were once widely utilized are now completely ineffective. At the risk of repeating myself again and again: the Panda algorithm has effectively changed everything about how links are obtained. For instance, subsequent updates have rendered posting to forums virtually ineffective for these purposes.

Cooper goes on to detail how this extension can be used for dead link building. The first tactic he describes is Direct Find and Replace. This is where you generate a list of broken links from blogrolls and link pages. You then contact the webmasters of the sites and ask to replace one of the dead links with a link back to your site.

The next method he describes is Content Replacement. He suggests looking at the actual pages that are broken and using the Internet Archive's "Wayback Machine" to find the original content that was being linked to, then recreating that content on your own site. You can then contact the webmaster to update their links to the new (and improved) content. Subsequently, you can use free tools such as Open Site Explorer or Yahoo Site Explorer to discover other sites that were linking to the original content and ask if they would like to link to the new and improved version as well.

The last technique he describes is Broken Blogger Blogs, where you use the tools to find broken links on blogrolls that point to subdomains on blogspot.com and then check whether you can register the abandoned blog yourself. If so, you put up a static page with a desired keyword linking back to the new blog location. Not only does this give you the anchor text of your choice, but it gives a link with a higher amount of link juice (depending on how many links are pointing to that page). He does state that this is a fairly "greyhat" tactic and has requested reader feedback on the ethics of such a tactic.

To recap; the Panda updates are forcing all users to generate better content. It is a bold effort by Google to reduce the amounts of web-spam that have inundated the SERPs for far too long. As an end-user you should love Google for their efforts; as an SEO it means that the whole game has changed and that we have to continue to evolve with the changes to remain effective in our industry.

SEO news blog post by @ 11:53 am on October 19, 2011


 

What word to use for anchor text?

As a well-connected SEO I digest a lot of publications from the web, and I try to limit my opinions to factual results, either from real-world feedback or from controlled tests. Google is constantly evolving and improving itself to render the best search results possible, or at least better search results than the competition.

Considering where Google was with regards to just hardware in 1999, things certainly keep changing:

(Image: the evolution of Google, starting with its first server)

On Monday SEOmoz published a small test they did to gauge the importance of keywords in the anchor text of links. The test is discussed in detail over on SEOmoz, but the result was rather straightforward.

In a nutshell they took three new, roughly equivalent sites and tried to build some controlled links to them using three different approaches (sketched in markup after the list):

  1. Build links with just ‘click here’ text
  2. Build links with the same main keyword phrase
  3. Build links with random components of the main keyword phrase
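
To make those three approaches concrete, here's roughly what each link type looks like in markup (example.com and the candy keywords are our own placeholders, not SEOmoz's test data):

  1. <a href="http://www.example.com/">click here</a>
  2. <a href="http://www.example.com/">gourmet candy gifts</a>
  3. <a href="http://www.example.com/">gourmet candy</a> … <a href="http://www.example.com/">candy gifts</a>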

Obviously the test is a bit broken, because if you don't have existing keyword relevance for a phrase, you should build relevance with keywords in the anchors. When Google is sorting out which candy site will be ranked #1, the site linked to with relevant keywords should always rank higher than a site with links like "click here" or "this site" which aren't relevant. The only exception would be a situation where the links seem excessive or 'spammy' and may result in Google not considering any of the similar links for relevance.

Outside of a clean test environment we know the best results would be a blend of all three types, with a bit of brand linking mixed in to avoid losing focus on brand keywords. A well established site with a healthy user base will constantly be establishing brand due to all the time on site and click-through traffic for that brand.

ie. If I search for “Sears” and click on the first link only to find it’s a competitor, I’d hit back and find the right link to click. In most cases Google’s watching/learning from the process, so brand links aren’t going to be a necessity after a site is quite popular, and the % of brand links wouldn’t need to be much at all.

Kudos to SEOMoz for publishing some of their SEO test info regardless of how experimental it was. We’re constantly putting Google’s updates to the test and it’s often very hard to publish the results in such a clinical fashion for all to see. We will always make an attempt to blog on the topics we’re testing but it’s still on the to-do list to publish more of the data.

SEO news blog post by @ 11:56 am on October 11, 2011


 

Early October SEO Shakeups at Google

New Panda updates that target tag clouds and forum links? New paid AdWords offerings that seem to be diminishing the quality of the free service? Landing page quality score improvements to be had with the latest AdWords updates? What hasn't changed over at Google this month?


Tag Clouds and Forum Links?

For some time now it's been easy to add tag clouds to blogs and websites; most of them are even dynamically built so they reflect the ongoing topics of your pages, and the really clever ones make each keyword a link.

The result of all that effort is a typical tag cloud that looks something like this:
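
A hypothetical sketch of the kind of markup such a cloud generates (the keywords and classes are made up for illustration):

<div class="tag-cloud">
  <a href="/tag/seo" class="tag-size-5">SEO</a>
  <a href="/tag/link-building" class="tag-size-3">link building</a>
  <a href="/tag/panda" class="tag-size-4">Panda</a>
  <a href="/tag/adwords" class="tag-size-2">AdWords</a>
  <a href="/tag/sitemaps" class="tag-size-1">sitemaps</a>
  <!-- ...and often dozens more keyword links... -->
</div>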

.. and that's a LOT of keywords + links for a crawler to ignore! Word from some of the worst-hit sites points to keyword clouds as the common factor and the likely target of this most recent Panda update over at Google. We're a really aggressive source of content with a high level of trust, so I doubt one instance of using a tag cloud will tank our blog, but I did debate making the above example an image only.

Forum Links are Worth-Less?

One site that's been taking a beating from Panda over and over again (eh! rocko!) is DaniWeb. They have been acting as a lightning rod during the storm of over 500 changes Google has made to its ranking algorithms this year alone. In a recent video post on WebProNews, the CEO and founder of DaniWeb raised the topic of the diminished value of forum links, and it begs for testing:
(Video removed – no longer available)

New AdWords Pro and Language improvements?

This is a topic we can't just lump into a big multi-topic post, and we know it needs in-depth discussion. Many SEOs are discussing how the professional offerings from AdWords coincide with 'improvements' to the free service that have actually been viewed as setbacks by users.

Right now we're still working with the free version that all our clients are using, but I'd bet we'll give the pro service a trial by year's end and will have some input on how valuable we think the upgrade is. I doubt we'll extract enough value to cover the monthly fees Google is currently asking for, but we would have to try it and see to be sure.

The recently improved AdWords language support means that targeted ads are improving the quality score of landing pages. This could be a bit of a change depending on where your competition is based. If you serve a local US market you probably won't see much, if any, change in competition, but if you market internationally, your customers in other countries could be looking at a fresh set of SERPs. As a result, SEOs and people watching their stats closely would do well to note this factor.

Expect to hear more about these changes, and really any changes that affect SEO in a way that matters. It's one thing to mention things as they happen; it's another situation entirely to have tested them first-hand and have intimate experience to share. Soon!

SEO news blog post by @ 12:41 pm on October 4, 2011


 

Blech to Blekko

One of the biggest SEO stirrings this morning is over the $30 million that Yandex just invested in Blekko. If you haven't heard of either one, don't sweat it; my spell check is painting red squiggly lines under both of them too.


Why so negative? Well Blekko is trying to ‘get started’ in a game that’s already been in play for some time. When you look at the competition’s investment in search engine work it’s a bit like France suddenly saying they’re ready to join WWII. In this case ‘better late than never’ really doesn’t fit the situation.

Why is a new contender such a bad idea? Take a moment to compare search results on Google, Bing, Blekko, and Yandex for a major site, something that’s been around for some time, had some serious competition and SEO efforts.

This site (Beanstalk) in particular is a great example: we've had thousands of our pages duplicated over the years, so try a search for 'seo services' or another keyword we really should be at the top of the rankings for. Blekko won't show us in the top 20; heck, even if you search for "beanstalk" we're #4 because of 'duplication' penalties. To Blekko, crawling the web with fresh spiders, all the duplication looks the same; they can't tell who owns the content or who published it first. They would have to use Google or some other really well-developed search engine to get that data.

The fact that Yandex's CEO, Arkady Volozh, will be joining Blekko's board is notable: if Yandex works out a deal to improve Blekko's crawl data using the much better indexes over at Yandex, Blekko could make up for some lost time.

It’s not all bad over at Blekko, in fact it’s interesting to see what information they are sharing with searchers in an attempt to explain their anti-spam approach to search results. Have a look at the SEO link in any Blekko search result:

(Screenshot: the SEO link on a Blekko search result)

Clicking that link will take you to a metrics page where Blekko seems to explain its result/ranking for that site. The tools they offer on these screens, including drilling down to backlinks by site, are fantastic:

(Screenshot: SEO backlink results on Blekko)

I’ve seen worse services from paid products pitched at professional SEOs!

If the crawl data wasn’t so poorly pulled together and had better history, it would be at the top of my list for SEO tools. It certainly is a great free way to see some SEO statistics from a fresh perspective, even if you can’t really get an accurate picture from the limited index.

SEO news blog post by @ 11:57 am on September 29, 2011


 

Blogging Trackbacks, Pingbacks & SEO

There seems to be a lot of confusion amongst newbie bloggers over the definition and use of trackbacks, pingbacks and how they can be used for SEO. If you have done any blogging before and have comments enabled, you probably realized very quickly that the amount of spam that comes from the comments of your post can be quite overwhelming.


Comments on blogs are often criticized as lacking authority, since anyone can post anything using any name they like and because there is no verification process available to ensure that the person is in fact who they claim to be.

Trackbacks and Pingbacks were implemented in an effort to provide some level of verification to blog commenting. Pingbacks and trackbacks use drastically different communication technologies (XML-RPC and HTTP POST, respectively).

Trackbacks

A Trackback shows an excerpt from an originating blog post and is editable by the trackback recipient. Trackbacks are an automated process of notifying a blog when you make a post that references it. By sending a trackback, you create a link back to your blog from the blog you are referencing. The trackback was designed to provide a method of notification between websites: a way of commenting on someone else's post from your own blog, with an excerpt of your post showing on theirs for their readers to view.

Person A’s blog receives the trackback and displays it as a comment to the original post. This comment contains a link to Person B’s original post. The excerpt then acts as a teaser and encourages the reader of person A’s blog to go to the originating source of the post to read more.

Person B’s trackback to Person A’s blog generally gets posted along with all the comments. This means that Person A can edit the contents of the trackback on his own server, which means that the whole idea of “authenticity” isn’t really solved. Person A can only edit the contents of the trackback on his own site. He cannot edit the post on Person B’s site that sent the trackback.

When you want to use the trackback feature, you will need to use a special link provided on the blog you want to reference. Most trackback links appear just after the blog post content and before the comments and will sometimes appear as a plain text link.
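
Under the hood, a trackback is just an HTTP POST of a few form fields to that special link. A rough sketch, assuming a trackback URL on example.org and a responding post on example.com (all values are placeholders):

POST /trackback/42 HTTP/1.1
Host: www.example.org
Content-Type: application/x-www-form-urlencoded

title=My+Response&excerpt=Here+is+what+I+think+about+your+post&url=http%3A%2F%2Fwww.example.com%2Fmy-response&blog_name=Example+Blog

The receiving blog answers with a small XML document indicating success or an error, and the excerpt and link then appear among its comments.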

Pingbacks

Pingbacks were introduced as a method to alleviate some of the issues that people found with trackbacks lacking authenticity. Pingbacks allow you to notify a blog of your entry just by posting its permalink directly in the content of your blog entry.

This leaves all editorial control over the posts exclusively with the author. The receiving blog automatically fetches your post to verify that the link really exists, and that verification grants a level of authenticity which ultimately makes it more difficult to fake a pingback. No special trackback link is necessary, and pingbacks do not send any content. In order for pingbacks to work, you must enable them within WordPress.
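
For the curious, the notification itself is a single XML-RPC call to the receiving blog's endpoint (xmlrpc.php on WordPress); the two URLs below are placeholders for the linking post and the post being linked to:

<?xml version="1.0"?>
<methodCall>
  <methodName>pingback.ping</methodName>
  <params>
    <param><value><string>http://www.example.com/my-post-that-links-to-you</string></value></param>
    <param><value><string>http://www.example.org/the-post-i-linked-to</string></value></param>
  </params>
</methodCall>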

Some feel that trackbacks are superior because the readers of Person A's blog can at least see some of what Person B has to say before deciding if they want to read more and follow the link to the original blog source. Others feel that pingbacks are superior as they create a verifiable connection between posts. Pingbacks are akin to having remote comments.

SEO

Many blogging platforms treat the links from trackbacks, pingbacks and comments as "nofollow" so that you do not lose any link juice or other SEO "value" in using them. Other than the rare "diamond in the rough" link you might acquire from using these features, there is not much SEO value to trackbacks or pingbacks. However, it is possible that you may get some value from these tactics by linking to an authoritative site such as the Google Blog, which may bring in a lot of traffic to your site.

It is good to link to others in your posts, but it does not mean you have to allow pingbacks or trackbacks. If you do decide to use these features, you should beware of sending both a trackback and a pingback. This creates two separate links on the blog you are referencing and could be considered spam. Of course you should only trackback or pingback if you actually reference the site you are sending the trackback to.

For further information, please refer to the online documentation from WordPress in their Introduction to Blogging.

SEO news blog post by @ 12:26 pm on September 28, 2011

Categories: SEO Tips

 

Google announces rel=standout

I wouldn't normally blog on a Monday, but everyone's got a cold or is travelling, and Google just announced a very important new feature called rel=standout.


The attribute works the same way as the other link rel attributes (like rel=canonical):

  • The tag should be placed in the <head> section of the source code on the page
  • The syntax is <link rel="standout" href="URL">

For example:

<link rel="standout" href="http://www.beanstalk-inc.com/blog/google-announces-relstandout">

You can use this on your own domain up to 7 times per week, but you can point to other domains as much as you’d like.

Google’s News service will consider this link as an indication of items that should be included in the ‘featured’ news feeds.

Some sites are also mentioning the importance of tying this in with the rel=canonical and rel=author tags for maximum SEO. Since this is a new feature and all these features require testing we’ll likely speak more on this later when we’ve had a chance to test things first-hand.
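
As a rough sketch of what that combination could look like in the <head> of an article page (the URLs and profile ID are placeholders, and whether the tags reinforce each other is exactly what we plan to test):

<link rel="canonical" href="http://www.example.com/news/big-announcement">
<link rel="standout" href="http://www.example.com/news/big-announcement">
<link rel="author" href="https://plus.google.com/111111111111111111111">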

In the meantime, better start including the tag for maximum effect, up to the allowed 7 times per week.

(UPDATE: We have a lot of clients who use WordPress and they may want to know how we updated our blog so quickly. The patches we've applied to our blog require a plugin which we cannot endorse, and the code is very specific to our site, so it's nothing we'd share in public. If the days pass and you don't see a rel=standout solution for WordPress or your blog, we can probably help, but we'll need to look at how your blog is set up to assist. I am working on a specific plugin solution for WordPress that applies the link to only 'post' headers, and only when a specific category/tag is used. If I get the kinks worked out it will be offered to all our WP-enabled clients.)

SEO news blog post by @ 11:50 am on September 26, 2011


 
