Beanstalk's Internet Marketing Blog

At Beanstalk Search Engine Optimization we know that knowledge is power. That's the reason we started this Internet marketing blog back in 2005. We know that the better informed our visitors are, the better the decisions they will make for their websites and their online businesses. We hope you enjoy your stay and find the news, tips and ideas contained within this blog useful.


December 15, 2011

We’d feel dirty not posting about SOPA today.

This is the day, folks: the bill is in Congress as I type, and here are some good spots to follow the proceedings closely:
Dirty Bar of Soap
EFF Twitter Feed
Video Webcast
Justin.tv re-broadcast of the live feed

Wondering what all the fuss is about?
Here’s a great read:
Wikipedia -> Stop Online Piracy Act

Who supports SOPA?
Domino Project’s SOPA Supporter List

What sort of organizations are opposed to SOPA? The bill is considered such a bad move that Wikipedia was publicly contemplating a blackout of the service just to make clear how bad it is!

There are also a few very active discussions over on Reddit in the r/technology section that give a good nerd's-eye view of the bill reading.

Wonder why Google was opposed to the bill? Here’s a humorous take on the essence of their fears:
Mockery of SOPA's effect on Google in 2012

If I had to personally sum everything up into a TL;DR, I would have to go with:

“Artist and labour groups, who don’t have a nerdy understanding of how the internet works or how to approach piracy, are joining with other anti-piracy groups to fast-track an ill-considered and potentially dangerous bill.

While most folks don’t understand the internet well enough to argue the bill as experts, the general reaction today has been ‘we are rushing something we don’t understand and we can’t proceed.’”

With any luck that's exactly how bill H.R. 3261 will end: some potential, but not ready. *Fingers crossed*



 

 

November 16, 2011

How Rich Snippets for Apps Increase CTR

Yesterday, Beanstalk blogger Ryan Morben introduced a list of 10 new changes to the Google algorithm. One of the updates he mentioned was the use of "Better Snippets." I thought I would take this opportunity to elaborate on these further.

In September this year, Google introduced rich snippets to be used for reviews, events and music sites. This was an effort to help users determine whether a particular website had the relevant information they were searching for. The application snippets allow you to see information about an app, including reviews and pricing, within the actual search results before you download it.

These rich snippets are becoming increasingly important not just for sites offering mobile apps, but for any software application available for download. They are becoming increasingly critical for software developer sites, software publishers, download portals and review sites that want to stand out from the rest of the SERPs.

The new rich snippet format has two additional attributes that help specify which countries currently support the new app and which do not. At this time, however, there is no formal standardization of the format specification on schema.org.
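
As a rough illustration, here is what application markup of this kind can look like using schema.org microdata. This is a hedged sketch: the property names (including the two country-availability attributes, countriesSupported and countriesNotSupported) are drawn from the schema.org SoftwareApplication vocabulary rather than quoted from this post, and the app name, rating, and price are invented for the example.

  <div itemscope itemtype="http://schema.org/SoftwareApplication">
    <span itemprop="name">Example Photo Editor</span>
    (<span itemprop="applicationCategory">Multimedia</span> app for
    <span itemprop="operatingSystem">Android</span>)
    <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
      Rated <span itemprop="ratingValue">4.4</span>/5 by
      <span itemprop="ratingCount">1024</span> users
    </div>
    <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
      Price: <span itemprop="price">0.99</span>
      <meta itemprop="priceCurrency" content="USD" />
    </div>
    <!-- the country-availability attributes mentioned above -->
    <meta itemprop="countriesSupported" content="US,CA,GB" />
    <meta itemprop="countriesNotSupported" content="CN" />
  </div>

Marking up a page along these lines (and verifying it with Google's rich snippet testing tool) is what makes the extra detail available for Google to display in the result.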

Sites that have utilized the new snippet, specifically those with large review sections or downloadable content, show much larger images in the SERPs than the author rich snippets do. These larger images tend to lead to higher CTRs and ultimately help to increase conversions.

Example SERP screenshots showing application rich snippets.

Google will also tend to prefer sites whose rich snippets/microformats contain more complete and detailed metadata. For this reason, it is important to provide meaningful data in all of the available attributes rather than only the required ones. You should always test new rich snippets, and apply to Google to have the new extensions shown in the SERPs, to help boost your CTR.



 

 

November 15, 2011

10 new changes to Google algorithms

Yesterday, over on the Google Inside Search blog, Matt Cutts shared 10 recent changes to the Google search algorithms from the last few weeks.

As always, these posts can get a bit technical, and anyone subscribed to the feed can get it straight from the horse's mouth. The goal of this post is to put the changes into clearer terms from an SEO perspective:

Translated search titles:
When someone searches in a language where limited web content is available, Google can translate the English-only results and display the translated titles directly below the English titles in the search results. The result itself is also translated automatically, thereby increasing the web content available to non-English searchers. If you sell products that appeal to a global market but haven't yet invested in translations or a global site structure, this could drive fresh traffic to your sites/products.

Better Snippets:
Google’s mantra is always ‘content, content, + more content’, and now the snippet code is focusing on the page content vs. header/menu areas. Because of the way sites use keywords in the headers/menus, coding the snippets to seek out body content will result in more relevant text in search snippets.

Improved Google generated page titles:
When a page is lacking a title, Google has code in place to assign a title to the page using various signals. A key signal used is back-link anchor text pointing to the page. If a site has a ton of duplicate anchor text in the back-links, Google has found that putting less emphasis on those links creates a far more relevant title than previously. In this way the titles in the search results should be much less misleading.

Improved Russian auto-complete:
Languages are a constant headache for search engines, and new features like auto-complete can take a very long time to mature in languages outside of English. Recently the prediction system for auto-completed queries was improved to avoid overly long comparisons to the partial query to make auto-complete function much better in Russian, and closer to how well it works for English queries.

More information in application snippets:
Last week Google announced a new method of improved snippets for applications. The feature is pretty technical, and it looks like an entire blog post is coming on just this topic. Here's an example image that hopefully gives you the gist of how the snippets surface details like prices, ratings, and user reviews.

Example of application snippet from Google search results.

The feature has been very popular, and Google recently added even more options; they will get a full blog post here soon.

Less document relevance in Image searches:
If you look up search engine optimization on Wikipedia and check the entry for image search optimization, you will note that there's really nothing said about SEO tactics for images. That has never really been true: there are signals Google looks for when deciding which image to show for a particular keyword.
Previously, an image referenced multiple times in PDFs or other searchable documents would get higher placement in the results. Google has done away with this signal, as it wasn't giving improved results and could easily be abused. *Innocent whistling*

Higher ranking signals on fresh content:
Consider, if you will, how Google would look if it never gave new sites and fresh content a shot at the top, or a moment in the limelight. Most rating systems show you the 'best of the most recent' by default, just to avoid older content dominating the results. As a person on the phones taking SEO leads, I can tell you there's always been a '10 minutes of fame' situation on Google, where the inexplicable happens in the search results for fresh sites/content, only to return to normal later on when the dust settles. Google claims the recent change impacts roughly 35% of total search traffic, which could be a significant boost for sites that take the time to publish fresh content, or for new sites looking for a chance to be seen.

Improved official page detection:
We've blogged recently about the importance of the rel=author attribute: tying your content to a G+ profile and completing the circle with a back-link from the profile to your site. Google has added even more methods of establishing 'official' pages and is continuing to give official pages higher rankings on searches where authority is important. If you missed our article on this topic from last week, here's the link.

Better date specific results:
The date a page is discovered may not always be the date the information was published. Google has the difficult task of sorting out date relevance for search results, and they keep improving on this where possible. A good example would be using duplicate matches to avoid showing you a three-year-old article that was re-posted two days ago when you specify that you only want results from, say, 'last week'.

Enhanced prediction for non-Latin characters:
You'd think it's hard enough to get a predictive query straight when the character set is limited to Latin, and you'd be right. When it takes several keystrokes to complete a single character in a non-Latin script, a service like Google's auto-complete is hard pressed to know when to start guessing. Prior to this update, predictions in Russian, Arabic, and Hebrew could give gibberish results while the user was still forming characters.

These are 10 changes out of the 500+ made so far this year. We try to document the most important changes for you, but there are plenty of times when Google can't release info because of exploits/cheating. When that happens you'll see us chime in with experiments and our personal experience where we can. So while I'd normally suggest folks interested in this topic subscribe to the Inside Search blog, know that you'll only be getting part of the story by doing so. ;)



 

 

November 10, 2011

Google+ plus company profiles, plus company page, plus site link?

Pleasing plus is presently proving to be a problem with the plethora of possibilities. Confused by all the Plus linking options suddenly available? Here’s a round-up of what it looks like right now.

  1. Create a Google+ page for the company.
  2. Create employee G+ pages.
  3. Add your employee G+ pages to the company.
  4. Add a link or badge from your website to the G+ page for the company.
  5. Add rel=author links between content on your site and your employee pages.
  6. Add +1 options to the homepage and content/product pages.

Here’s a very busy illustration of the process:

  • URLs and code pages
  • Create Google+ pages
  • Link your website to the company G+ page
  • Add rel=author links between your content pages and the employee G+ pages
  • Make sure your site's landing page, content (blog), and product pages have +1 buttons

I’d put your content/blog posts on your website first, and then follow up with a share to the G+ profile page of the employee/author responsible for the content.

That’s the whole process for G+ interaction between a website, staff pages, and the company page. Doing this properly will tell Google your content is legitimate and maximize the potential ranking signals for your site as it pertains to Google Plus.
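
To make the markup side of that process concrete, here is a rough sketch of the pieces described above. Every profile URL and ID below is a placeholder, not one of Beanstalk's actual pages.

  <!-- On the site's homepage: a link (or badge) pointing to the company G+ page -->
  <a href="https://plus.google.com/112233445566778899000/" rel="publisher">Find us on Google+</a>

  <!-- On each content/blog page: credit the author with rel="author";
       the employee's G+ profile should link back to the site in its
       "Contributor to" section to close the loop -->
  <a href="https://plus.google.com/998877665544332211000/" rel="author">Jane Example</a>

  <!-- On landing, content, and product pages: a +1 button -->
  <script type="text/javascript" src="https://apis.google.com/js/plusone.js"></script>
  <div class="g-plusone" data-size="medium"></div>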

Last step is getting folks to follow your Google+ page, hit the +1 buttons, and interact with your Google Plus postings/profile. We'll have some ideas for this, and for building followers on other social networks, as the excitement over the recent Panda updates quiets down and we have more time to get back to addressing followers/traffic. Don't forget that past articles (of which we've had a few) may still apply or at least offer some ideas.

Hope everyone has a good long weekend!



 

 

November 9, 2011

Why Great Content Is Seldom Seen


The post-Panda Internet has left many website owners desperate to regain their former rankings. The main directive of the new algorithm was to force websites to produce higher-quality, relevant content if they hoped to remain competitive and to keep or increase their SERP rankings.

With the advent of social media en masse, Google (and the web in general) began using the public sharing of a website’s content across social networks as a predominant search engine ranking factor.

While this was a wonderful idea from a user perspective as it forced sites to produce better quality content for their visitors, many content developers found that their rankings were still suffering due to an apparent inability to generate interest in the wonderful content they were developing.

"Content marketing is an umbrella term encompassing all marketing formats that involve the creation or sharing of content for the purpose of engaging current and potential consumer bases."

Often we produce what we feel is great content only to find our efforts falling on deaf ears. In most cases it is not the content itself that fails; the failure is more often the result of what we do (or do not do) once the content has been created. Great ideas do not simply propagate themselves into the collective consciousness of the public. Viral web content is the reward of a well-devised promotion and a carefully planned implementation strategy. The deployment of your content marketing strategy is crucial to its success.

Most of us rate how "great" content is by the number of page visits, tweets, or likes that the post receives; but what makes good content and more importantly, what can you do to ensure that it is distributed by as many visitors as possible?

Credibility

The best content comes from writing about topics that you know well. Subjects you have intimate knowledge of, or that are drawn from your own experience, will always make for more credible content and will be considered higher quality from a reader's perspective.

Good content takes time and effort to develop. Great works (in any medium) rarely come on a whim or from spontaneous inspiration. If you have taken the time to prepare your piece by researching the subject, and can offer something new and fresh or communicate it in an especially novel or exciting fashion, it is much more likely to be shared by your readers. If you are particularly passionate and articulate in your delivery, your content becomes an effective vehicle for instilling confidence in the reader's mind and generating credibility, allowing you to be considered a "specialist" in your area of expertise.

Actionability

Effective content should elicit an emotional response or create a call to action for the reader. Try to make your content actionable. Leave your readers with the sense that they have gained wisdom from your piece and give them something they can take away from it. Content will be shared more readily if it speaks to your readers directly in an actionable way. A particularly well-written piece of content will almost share itself. If you can be proud of the content you have developed and are excited to share it with others, chances are that your readers will want to share it as well.

Marketing

Once you have taken the time to do your research and have composed a wonderful piece of content, how do you get others to read it and share it? While great content is more likely to be shared virally, it is of little use without proper promotion and a comprehensive marketing strategy.

The first distribution base for your content can be the friends, coworkers and acquaintances in your email contacts. Remember that it is probably okay to ask your close contacts if they would mind redistributing your content as well. Customer newsletters are still a viable option if you have one in place.

"Content marketing subscribes to the notion that delivering high-quality, relevant and valuable information to prospects and customers drives profitable consumer action. Content marketing has benefits in terms of retaining reader attention and improving brand loyalty."

Effective syndication relies on your company having a strong social media presence. Reach out to your online community through the social media profiles you have set up. While there are myriad social networks you can share your content with, Facebook, Twitter and LinkedIn are the most popular and the best places to start syndicating your content. Don't neglect other niche social networks that may be closely geared towards your industry as well.

Delivery

Timing is everything. You will need to ensure that your content is not being syndicated at ineffective times. Plan to release your content on a Monday morning rather than on a Friday afternoon or on the weekend. Statistics show that most people check their social accounts at the beginning of the work day and after lunch. A 9-5 Monday through Thursday deployment strategy will typically be more effective, with Mondays, Tuesdays and Wednesdays being the most effective days. Remember that it is okay to tweet your content in the morning and again in the afternoon. However, over-using this tactic can quickly annoy your followers.

Plan your content publishing as you would plan a product launch. If it is a particularly large story or news item, you can pre-announce its arrival as well; this is an effective way to build anticipation. Consider distributing a press release through a noteworthy online syndication service such as PRWeb.

Promotion

Perhaps one of the biggest reasons marketing content fails to attract views is a lack of follow-through and ongoing promotion of the piece. You need to continue promoting your content long after it has been initially syndicated. Develop a promotion plan that includes reminding people of your content via your social networks, and actively work to build links to your content on relevant sites through press releases, your website, guest blogging, online advertising or online radio shows. Any medium where you can gain exposure for your content will be beneficial in securing views.

Any online content takes time to develop traction and get noticed. The Internet has caused most of us to believe success happens overnight. Careful planning and implementation over the course of a well-planned promotion will always yield better returns.



 

 

November 8, 2011

Get your own Google+ Page

Today Google announced they are ready to let users claim pages on the G+ domain. It's a bit busy over here: Create a Google+ Page

The servers over at Google+ must feel a bit like they're facing a stampede right now.

… but you may want to brave the line-up anyway, because this is where you claim your name, brand, or trademark for Google+ pages.

Since I’m waiting in said line-up, I can’t demo the experience and relay first hand info, but I can share what I do know:

- Pages are ‘private’ right now.
- Only the creator has access, so for a company, use the company account
- Access on company pages for other users is coming
- Expect page invites to be a bit excessive on larger profiles to start with

Oh joy, my page is waiting for me to set up! Are you folks still reading this? Go!



 

 

November 2, 2011

Finding Your Way With Sitemaps

If you don’t know what a sitemap is, or have never created one…read on. A sitemap is a list of the individual pages on your website displayed in a hierarchical fashion similar to a table of contents, or index. They are sometimes used as a planning tool during the developmental stages of a site design, but more importantly, sitemaps act as a powerful navigational aid by providing a site overview at a glance. Sitemaps also benefit search engine optimization by ensuring that all the pages of a site can be found by web bots.


At one time, sitemaps were viewed as a luxury, or at the very least not vital; that is no longer the case. For new sites they are especially important, as it can take several months for a new site to get crawled and indexed by the search engines. Implementing a sitemap and submitting it to the search engines through tools such as Google Webmaster Tools will greatly aid in the indexing of your site. Sitemaps do not guarantee all links will be crawled, and being crawled does not guarantee indexing. However, a sitemap is still the best insurance for getting a search engine to learn about your entire site.
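
Submitting the file directly isn't the only route, either: the sitemaps protocol also lets crawlers auto-discover a sitemap through a robots.txt entry. A minimal sketch, assuming the sitemap lives at the site root:

  # robots.txt at http://www.example.com/robots.txt
  User-agent: *
  Disallow:

  Sitemap: http://www.example.com/sitemap.xml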

If your site is very large, has a complicated navigation system, or employs Flash or JavaScript menus that do not include HTML links, parts of the site may never get indexed. Even if you only have a small site, having a sitemap will ensure that all your pages are linked to and will be picked up by the crawlers.

Users and crawlers will be able to access deep links and nested pages much more readily. Having well-named, SEO-friendly URLs in your sitemap also lets users search the sitemap for specific keywords they may be looking for on the site. Sitemaps have also been shown to increase PageRank and link popularity for all the pages they link to. While it is more important to have high quality links pointing to your site, you should not underestimate the usefulness of internal links pointing to your own pages.

Sitemaps are written and saved as an .xml file, the document structure and encoding standard web crawlers use to find and parse them. As such they are very unforgiving and must contain only valid XML syntax (see http://validator.w3.org/). Individual pages can be prioritized on a sliding scale from 0.1 to 1.0, and sitemaps also let search engine bots know when you last updated your website.
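
To make the format concrete, here is a minimal sitemap sketch following the sitemaps.org schema; the URLs, dates, and priority values are placeholders:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>http://www.example.com/</loc>
      <lastmod>2011-11-01</lastmod>
      <changefreq>weekly</changefreq>
      <priority>1.0</priority>
    </url>
    <url>
      <loc>http://www.example.com/services/seo.html</loc>
      <lastmod>2011-10-20</lastmod>
      <priority>0.8</priority>
    </url>
  </urlset>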

If, even after reading this post, you are still not convinced of the benefits of a sitemap, remember that Google has stated that a sitemap is a ranking factor for your site. Although it may be a small one, added together with several other small ranking factors it can make a substantial difference, and a sitemap is considered a best practice for any website.

For further information, check out this page in the Google Webmaster Tools Help.


Categories: Google, SEO Tips

 

 

October 27, 2011

SEO Articles

Today was a pretty big day for Beanstalk in the category of putting out some solid content for our valued readers. Two new articles have been published, one written by the reputable Kyle Krenbrink titled "Updating Your Website's Content". The title is just a titch misleading. When he first handed it to me for reading I thought I was about to read a technical piece on how to get new content onto your website. Instead it's a well-written piece discussing what content you should be looking at adding and how often. I may be biased, but to me it's a great piece for anyone interested in rankings in a post-Panda world.

The second article is written by yours truly and appears over on the Search Engine Watch website. The article covers ranking your website in image search and discusses everything from the tactics to do so to cautions about when it might not actually be beneficial (and yes, there are times when that is the case). With an author's bias, I have to say it's a solid piece and definitely worth the read. The article is appropriately titled "Ranking on Image Search". Enjoy. :)


Categories: Articles, Google

 

 

October 26, 2011

The Google, The FTC, and The News headlines

If anyone’s been looking at the tech headlines today, particularly the really big sites with very political writers, you may have read something about the FTC having Google in chains over outrageous privacy violations.

Some of that info is based on fact but most of what I’ve read is personal takes on the news with a heavy spin to sidetrack the facts and make a story.

Google behind bars

First, let's just get the elephant in the room to step into the light so we're all looking at it:

Google’s bread and butter is handling trust and privacy properly.
If users can’t trust Google, we can’t use them.

This is why Google has repeatedly been its own whistle-blower.

  • The web was programmed by humans..
  • Humans make mistakes..
  • The real measure of things is dealing with the mistakes!

When Google's engineers came up with a shockingly brilliant method of 'fingerprinting' WiFi access points by sampling the data coming to/from the devices, it wasn't anyone outside the company that complained.

The fact is that many homes (and some businesses) have zero wireless security, so what was a brilliant plan to get a 'fingerprint' ended up becoming a nightmare of unencrypted data that had to be destroyed properly.

Google also had to figure out what it could do to prevent this from happening again, so as part of the punishment Google helped devise for themselves, they set up a fund to create a privacy resource/knowledge base.

At the time, many sites tried to make news from the issue and imply that Google was a privacy nightmare stealing data from unsuspecting users, etc., totally overlooking the fact that anyone could (and probably does) roam around in a vehicle and collect the exact same data Google collected.

The majority of the media coverage was almost insulting to the intellect of the readers, but I saw smart people drinking the Kool-Aid, so don't feel bad if you saw the headlines and got the wrong idea too.

This latest issue is no different at all in terms of Google acting responsibly and the news makers trying to generate headlines.

So here’s a factual take on the actual settlement, not some poorly considered opinion that I’m hoping will make this a headline:

“Google Inc. has agreed to settle an FTC complaint that it used deceptive tactics and violated its own privacy policy when it launched the Google Buzz social network last year. In addition to alleged FTC privacy violations, this is the first time the FTC has alleged violations of the substantive privacy requirements of the U.S.-EU Safe Harbor Framework, a method for U.S. companies to transfer personal data lawfully from the European Union to the United States.

The settlement agreement bars the Google from future privacy misrepresentations, requires it to implement a comprehensive privacy program and includes regular, independent privacy audits for the next 20 years. This is the first time an FTC settlement order has required a company to implement a comprehensive privacy program to protect the privacy of consumers’ information.

According to the FTC complaint, on the day Buzz was launched through the Gmail service, users got a message announcing the new service and were given two options: “Sweet! Check out Buzz,” and “Nah, go to my inbox.” However, some Gmail users who clicked on “Nah…” were enrolled in certain features of the Google Buzz social network anyway. For those Gmail users who clicked on “Sweet!,” the FTC alleges that they were not adequately informed that the identity of individuals they emailed most frequently would be made public by default. Google also offered a “Turn Off Buzz” option that did not fully remove the user from the social network.

When Google launched Buzz, its privacy policy stated that “When you sign up for a particular service that requires registration, we ask you to provide personal information. If we use this information in a manner different than the purpose for which it was collected, then we will ask for your consent prior to such use.” The FTC complaint charges that Google violated its privacy policies by using information provided for Gmail for another purpose – social networking – without obtaining consumers’ permission in advance.

The agency also alleges that by offering options like “Nah, go to my inbox,” and “Turn Off Buzz,” Google misrepresented that consumers who clicked on these options would not be enrolled in Buzz. In fact, they were enrolled in certain features of Buzz.

The complaint further alleges that a screen that asked consumers enrolling in Buzz, “How do you want to appear to others?” indicated that consumers could exercise control over what personal information would be made public. The FTC charged that Google failed to disclose adequately that consumers’ frequent email contacts would become public by default.

Finally, the agency alleges that Google misrepresented that it was treating personal information from the European Union in accordance with the U.S.-EU Safe Harbor privacy framework. The framework is a voluntary program administered by the U.S. Department of Commerce in consultation with the European Commission. To participate, a company must self-certify annually to the Department of Commerce that it complies with a defined set of privacy principles. The complaint alleges that Google’s assertion that it adhered to the Safe Harbor principles was false because the company failed to give consumers notice and choice before using their information for a purpose different from that for which it was collected.”



 

 

October 20, 2011

Secure search service stirs SEOs slightly

Every once in a while there's an announcement that makes a huge kerfuffle online, only to be yesterday's news the next week. Yesterday's news is that Google has made the move to secure searches for Google account holders who are logged in while searching. It was actually announced on the 18th, and I didn't see anything until Kyle mentioned it on the afternoon of the 19th, so it's actually worse than yesterday's news!


Anyone following search engine news could be forgiven for feeling a bit of déjà vu, since Google has had secure search options since way back in early 2010. The latest announcement that is stirring up responses is that Google is now dropping the header info that would normally be passed along to the destination site, where it could be tracked and analyzed for SEO purposes.

Google has plenty of good reasons to make this move and only a few reasons against it. Here's a quick breakdown of the pros/cons:

Pros:

  • Most searchers are not logged in and won't be affected
  • Estimates suggest only 3%-7% of current search traffic comes from logged-in users
  • Tracking the “not provided” searches in Google Analytics will show the missing traffic
  • Mobile users connecting from public WiFi networks can search securely
  • Users of free internet services will have additional privacy
  • HTTPS Everywhere is crucial and backed by Google
  • Webmaster Central still provides search terms to registered owners

Cons:

  • Mobile searchers tend to be logged in
  • Traffic projections for mobile search are growing
  • Google has to make the data accessible to its paid users
  • SSL is now becoming a much larger ranking factor

Amy Chang over on the Google Analytics blog had the following point to make:

“When a signed in user visits your site from an organic Google search, all web analytics services, including Google Analytics, will continue to recognize the visit as Google ‘organic’ search, but will no longer report the query terms that the user searched on to reach your site..”
“Keep in mind that the change will affect only a minority of your traffic. You will continue to see aggregate query data with no change, including visits from users who aren’t signed in and visits from Google ‘cpc’.”

Thom Craver, Web and Database specialist for the Saunders College at Rochester Institute of Technology (RIT) was quoted on Search Engine Watch as noting:

“Analytics can already run over https if you tell it to in the JavaScript Code … There’s no reason why Google couldn’t make this work, if the site owners cooperated by offering their entire site via HTTPS.”
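
For anyone wondering what that looks like in practice, here is a sketch of the classic (2011-era) asynchronous ga.js snippet with tracking beacons forced over SSL. The UA-XXXXX-X account ID is a placeholder, and the _gat._forceSSL option should be checked against Google's current Analytics documentation before you rely on it:

  <script type="text/javascript">
    var _gaq = _gaq || [];
    _gaq.push(['_setAccount', 'UA-XXXXX-X']);  // placeholder Analytics account ID
    _gaq.push(['_gat._forceSSL', true]);       // send tracking beacons over https
    _gaq.push(['_trackPageview']);
    (function() {
      // standard async loader; uses the ssl host when the page itself is https
      var ga = document.createElement('script');
      ga.type = 'text/javascript';
      ga.async = true;
      ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') +
               '.google-analytics.com/ga.js';
      var s = document.getElementsByTagName('script')[0];
      s.parentNode.insertBefore(ga, s);
    })();
  </script>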

Personally, as you can tell from my lead-in, I feel this is much ado about nothing. Unless competing search engines are willing to risk user privacy/safety to cater to SEOs in a short-term bid for popularity, this isn't going to be reversed. I don't like to see the trend of money = access, but in this case I don't see much choice, and I'll stand behind Google's move for now.



 

 
