Beanstalk's Internet Marketing Blog

At Beanstalk Search Engine Optimization we know that knowledge is power. That's the reason we started this Internet marketing blog back in 2005. We know that the better informed our visitors are, the better the decisions they will make for their websites and their online businesses. We hope you enjoy your stay and find the news, tips and ideas contained within this blog useful.


June 12, 2012

Microsoft sues Google: Rankings on Google are too crucial!

Microsoft knows the pains of anti-trust lawsuits, million-dollar fines, and the expensive business of dividing up a company so it doesn’t look like a monopoly.
Breaking up the monopoly
So it’s no shock that one of the biggest weapons in Microsoft’s war chest is a handful of small companies that can claim Google services have stymied their opportunities to succeed.

According to this “Google treads carefully in European antitrust case” article posted yesterday on Canada.com, companies with direct links to Microsoft are suing because they cannot compete in EU markets without ranking well on Google:

Google’s competition includes Microsoft but is mostly small, specialist Internet services which argue the Silicon Valley giant is ensuring their names come low or don’t even figure in searches. In Europe, 80 per cent of Web searches are run on Google, according to the most recent figures by comScore, compared with 67 per cent in the United States. Its opponents say that means Google, which makes its money by advertising sales, can make or break a business by its ranking.

… followed by:

Moreover, Google says the small companies claiming to be its victims are linked to Microsoft. The third original complainant, Ciao.de, is a German travel search site owned by Microsoft. Several are also members of I-comp, whose most prominent member is Microsoft, and which produces position papers on subjects such as web market concentration. I-comp lawyer Wood acknowledges the organization is not independent, but says “our palette is much broader than Microsoft’s.”
 
The scary truth is that if actions like this succeed, any company like Google that offers free services could be forced to reorganize or dismantle simply because those free services prevent smaller companies from selling the same thing.

Typically such a thing would never happen here in North America, since due diligence requires proof of consumer harm, not just harm to competitors.

No matter how you look at it, Google is the opposite of consumer harm, but in the EU courts this may not matter.

If Google loses in the EU courts it will be ‘game-on’ for every other country to dog-pile on the remains of Google, letting greed kill off one of the best things that’s ever happened to us.

Looking at the history of humanity and greed vs. virtue, we probably should have seen this coming.

In my opinion it is as if Microsoft woke up one morning, looked into their magical mirror to reflect on how beautiful they are, and came to realize that some poison apples need to be handed out post-haste.

Speaking of humanity vs. greed, I MUST comment on this whole FunnyJunk vs. Oatmeal ‘fiasco’.

Either this is some brilliant promotional scheme or the owners of FunnyJunk painted a bullseye on their own foot. I am really not sure which one, but man is it sad.

Give it a read if you really want to be shocked at how low a business can stoop to make a profit from artists and the community.

It’s also refreshing to see The Oatmeal prove they could shut down FunnyJunk, but instead use the $20,000 they raised in 64 minutes to fund cancer research and support the National Wildlife Federation.

SEO news blog post by @ 11:08 am


 

 

June 5, 2012

Google Advisor: Where have you been all my life?

Admittedly, when I read the announcement that Google Advisor was here to help me manage my money, my first thoughts were about privacy and the last bastion of private information Google hasn’t touched yet: banking.

Gloved hand that is reaching for banking and credit info

Being wrong never felt so good!

Google Advisor is not (at the moment) a way to suck more private information from you, it’s actually more of a consulting service for comparing bank accounts, credit cards, certificates of deposit, and more.

Google Advisor

As someone who’s set up review sites for various services/offerings, I can tell you how handy and popular it is to break down competing services so the consumer can select something that meets their exact needs.

Google Advisor claims that the information it’s showing is based on my data, but a 0% intro rate on transfers for 18 months? If that’s really available to me I’m going to have to send Google some chocolates.

Google bought QuickOffice

QuickOffice Logo

Google bought the mobile office suite ‘QuickOffice’, which gives mobile devices ‘App-level’ access to office documents on Android, iOS and Symbian.

This move may seem redundant, since Google’s ‘Docs’ suite already offers connectivity to your documents/spreadsheets/presentations, but Docs is a cloud service, not an ‘App’, and an ‘App’ gives you more offline control of your work.

Plus you can’t argue with the users, they want ‘Apps’ and will pay for them.

Google bought Meebo

Meebo Logo

I’m not sure if this was related to Yahoo’s ‘Axis’ bar plugin that came and went with zero fanfare, but it’s an interesting purchase for SEO interests.

Meebo is a handy social media tool with some great options for ad placement and on-line marketing. SEOs not already dabbling with the tool should take a look, like yesterday.

If you’ve been managing your Twitter, Google+, Facebook, etc.., profiles without a management tool, aggregation sites like Meebo are really what you’ve been missing out on.

We know that Google owned properties have more relevance and trust on the web than similar services/products. After all, if you can’t trust yourself, who can you trust?

So if you were using some other social aggregation tool, and were doing it solely for SEO awareness, you can safely assume it’s worth the effort to try out Meebo for a potentially improved result/relevance from your efforts.

We will be doing some testing (as we always do) and will blog about our results to further expand on what the service offers over others. This may even warrant an article or two?

SEO news blog post by @ 12:42 pm


 

 

May 28, 2012

GoogleBot Now Indexing JavaScript, AJAX & CSS

Google Bot

Improving the way that GoogleBot parses and interprets content on the web has always been integral to the Google mandate. It now seems that GoogleBot has been bestowed the ability to parse JavaScript, AJAX and Cascading Style Sheets.

In the past, developers avoided using JavaScript to deliver content or links to content due to the difficulty GoogleBot had correctly indexing this dynamic content. Over the years GoogleBot has become so good at the task that Google is now asking us to allow it to scan the JavaScript used in our websites.

Google did not release specific details of how or what GoogleBot does with the JavaScript code it finds, fearing the knowledge would quickly be incorporated into BlackHat tactics designed to game Search Engine Results Pages (SERPs). A recent blog post on Tumblr is responsible for the media attention: it showed server logs in which the bot could be seen accessing JavaScript files.

The ability of GoogleBot to successfully download and parse dynamic content is a huge leap forward in the indexing of the web, and it stands to cause many fluctuations in rankings as sites are re-crawled and re-indexed with this dynamic content now factored into the page’s content.

Previously, Google attempted to get developers to standardize the way dynamic content was handled so that it could be crawled, but the proposal (https://developers.google.com/webmasters/ajax-crawling/) was more or less ignored.
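For context, that proposal defined a simple URL mapping: a page served under a ‘#!’ (hash-bang) URL was to be fetched by the crawler with the fragment percent-encoded and moved into an ‘_escaped_fragment_’ query parameter, which the server could answer with a static HTML snapshot. A minimal sketch of the mapping (the function name is mine):

```python
from urllib.parse import quote

def escaped_fragment_url(url):
    """Map a '#!' (hash-bang) URL to the crawler-friendly form from
    Google's AJAX-crawling proposal: the fragment is percent-encoded
    and moved into an '_escaped_fragment_' query parameter."""
    if "#!" not in url:
        return url  # not an AJAX-crawlable URL; fetch as-is
    base, fragment = url.split("#!", 1)
    separator = "&" if "?" in base else "?"
    return base + separator + "_escaped_fragment_=" + quote(fragment, safe="")

print(escaped_fragment_url("http://example.com/page#!state=profile"))
# http://example.com/page?_escaped_fragment_=state%3Dprofile
```

A server opting in to the scheme would watch for that parameter and return a pre-rendered snapshot of the dynamic page, which is exactly the extra work most developers never bothered with.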

GoogleBot has to download the JavaScript and execute it on the Google servers running the bot, leading some to the conclusion that it may be possible to use GoogleBot to compute data at a large scale on Google’s infrastructure.

SEO news blog post by @ 11:22 am

Categories: Coding, Google

 

 

May 24, 2012

Yahoo Axis – What the Flock?

I had a friend working on the Flock browser team right until it lost momentum and became clear that it was too much, too soon…

Amy Winehouse - Too soon?

Here we go again with a new ‘all-in-one’ web browser concept, this time from a very big name?

Update: Turns out that the leaks were really just rumors. This hype mill is a ‘googol‘ times more intense than it should be considering this is ‘just a plugin’ (unless you count Apple devices).

 

Paul Rudd doing the Double Take
Yahoo..? New?!?

Microsoft powers Yahoo’s search, right? So if Yahoo is releasing a new browser plus a suite of browser plugins for people who refuse to switch browsers, what’s going on?

Well apparently giving people the option to ‘choose’ MSN/Bing/Yahoo wasn’t working out so well. Now you can run a browser or a plugin that removes that annoying hassle of choosing whose search services you are using.

Y’know how Firefox and Chrome let you sign in to your browser so you can seamlessly move from one location to the next? Yeah, Axis is going to break ground and re-invent the web by also doing that same thing.

Y’know how Google is showing you previews of the sites you’re considering visiting within the search results? Yep Axis will finally let you do that, again.

Is this even a new browser or just IE9 with some ‘fluff’ and Yahoo branding? Tonight we will get a chance to try it hands-on and find out, but for now we have a few videos we can watch over on Yahoo Video.

One of the points my Economics teacher used to hammer home is to view each promotion as the promoter relating to their target audience.

If you have a good product with a smart client base, you can sell your product by focusing on real traits and strengths. Just demonstrate the product and avoid all pointless elements that distract the consumer from your product information.

Enjoy those videos and the clever/unique symbolism that hasn’t been copied too many times since Apple used it in 1984. :)

Does this mean Bing/Yahoo rankings will be important?

Whoever said they weren’t important? Okay, expert opinions aside, you should never burn the Bing bridge, especially not with cell phones that default to Bing and new versions of Windows that also default to Bing.

It’s never wise to put all your eggs in one basket, and this is true of search engine placement/rankings as well as eggs.

Even if Yahoo Axis only manages a week of public attention, that’s one week of people around the planet searching Bing for a change.

If you rank really well on Google, we’re not going to suggest you intentionally tank your rankings for a short-term gain on Bing. The cost of recovering from such a move would probably be far more than simply paying for some pay-per-click coverage via Microsoft’s AdCenter.

There are already folks worried about ‘Yahoo’ impressions vs. Bing impressions, and the following advice has been posted in the AdCenter help forum:

1) You are currently bidding on broad match only, add phrase and exact match to your bidding structure.
2) Look at keywords with low quality score and optimize for those specifically.
3) Install the MAI tool and check on expected traffic for adCenter, you can also see what average bids are for specific positions.

Only 7 Days Left!

7 DAYS LEFT!

 

Talk about old news? I mentioned this just 2 days ago?!

We still have 7 days left in our Beanstalk Minecraft Map Competition! Check it out and even if you’re not entering, please let others know it’s coming to a close and we need all submissions by the 31st!

SEO news blog post by @ 10:03 am


 

 

May 22, 2012

FB stock drops as SpaceX soars to success!

There were so many interesting technology/internet developments between Friday and now today that I can’t really pick which one to focus on?

Sliding FB stock prices, Google finally taking over what was the mobility division of Motorola, SpaceX reaching the ISS, Wiki-leaks’ social media platform, the Google Knowledge Graph… and more!

If we looked at them from an SEO standpoint I would still struggle a bit to pick the most interesting story, but it’s a great way to dive in, so let’s take a look at the weekend’s headlines from an SEO angle.

Facepalm – FB IPO = Uh Oh

 
Dave nailed this one really well on Friday in this post:
Facebook IPO vs Ford (real world) Valuation Comparison

The image of money flushing down the toilet was very ‘apt’ since that’s exactly where I see the stock price going:
https://www.google.ca/finance?q=NASDAQ%3AFB

The current ‘low’ appears to be $31/share at the moment, with the price currently dancing around $32.50/share as I write this.

Google Mobility

Google already makes some cool hardware for their servers and other projects, but most people I know wouldn’t think of them as a manufacturer.

And yet here we are today, watching history unfold, as the mobile division of one of the world’s best handset manufacturers changes hands to the company at the head of the Android software alliance.

Google does a lot of things for free, even at a loss, because they see value in things that others would squander and ignore. Now that they have a hardware division to support this bad habit things are going to get very interesting.

We already know from looking through Project Glass’s details that Google will need a very skilled manufacturer with assets in micro mobility and wireless. HTC has always been very willing to participate in Google’s projects, but they are a vastly successful hardware manufacturer with no visible brand loyalty.

I personally had Android running on an HTC Windows Mobile phone, so why couldn’t I run Windows Mobile on a Google-subsidized Android HTC phone? I probably could, which is why it’d be very silly for Google to subsidize HTC hardware.

If Google can produce the hardware and find ways to keep 90%+ of the owners using Google services, it’s a much safer bet, and it appears to be exactly what they are doing. Heck, if they make the hardware they might not even care what OS you use, so long as they are allowed to sniff the data and still learn about the users from it.

The only part of the puzzle that’s missing is the deployment of Google-owned, Motorola-equipped cell towers, so that Google can offer hardware, software, and services on their terms, in a model that makes sense to them, which would likely mean no caps on network use for Google products.

Yeah I could be dreaming but if I was a competitive cellular provider I’d be strongly considering opening my arms to Google before it’s an arms race against Google. ;)

Google Knowledge Graph

While this news item’s bearing on SEO is rather debatable, the feature itself is incredibly handy and something Google has a unique opportunity to provide.

By taking key points of knowledge and building hard links relating that knowledge to other data points, Google has developed a Wikipedia of its own design.

Knowing the struggles Wikipedia has faced in terms of moderation and updating content, it’s anyone’s guess how Google is going to maintain its Knowledge Graph without someone manipulating the results, but kudos to Google for trying.

Right now the coverage on this is going to be all the same because the content in Google KG is still being built up, but you can expect further discussion as the service grows.

FoWL – Wiki-Leaks’ Social Media Service

Since this service claims to be private and encrypted, it would be very foul of me to really spend much of your time discussing it.

As it can’t be officially crawled by Google, it’s probably going to have very little effect on SEO and rankings in general. The only real bearing I can see is using it as a traffic tool for sites that are in line with the Wiki-leaks mantra of public information. So if you can pretend that your services are so good the FBI doesn’t want you talking about them…?

SpaceX reaches ISS

This isn’t search engine related at all. I suppose you could point to the success of Google vs. government-run indexes, and then to the success of SpaceX vs. NASA with a bunch of startling similarities, but that’s some serious reaching.

At the same time, posting this on the same day the first private effort has docked with the International Space Station? I am obligated as a nerd to at least tuck this into the tail of the post. It’s pretty cool!

9 Days Left!

 
9 DAYS LEFT!

 

We still have 9 days left in our Beanstalk Minecraft Map Competition! Check it out and even if you’re not entering, please let others know it’s coming to a close and we need all submissions by the 31st!

Good Luck! :)

SEO news blog post by @ 12:01 pm


 

 

May 9, 2012

SEOmoz SPAM Outing

In the recent wake of the Penguin update from Google and the impact it has had on many sites, Rand Fishkin, CEO of SEOmoz, announced on his Google+ page that SEOmoz is currently developing tools to facilitate the "classifying, identifying and removing/limiting link juice passed from sites/pages."

feathers mcgraw

SEOmoz wants to add software to the toolset already available to subscribers on their website, to help determine whether their own website, or a competitor’s, appears to be spammy in nature.

If SEOmoz has developed a method to analyze signals that indicate a site is spammy, it is safe to assume that Google is viewing the page or site in question in the same light. Links determined to be spammy will pass little link juice and could potentially incur a penalty from Google. Fishkin summed up the process by saying that if they (SEOmoz) classify a site or page as having spammy backlinks, “we’re pretty sure Google would call it webspam.”
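As a rough illustration of what such a classifier boils down to (the signal names, weights and threshold below are invented stand-ins, not SEOmoz’s or Google’s actual model), per-site link signals get combined into a score that is then thresholded:

```python
# Hypothetical link-spam scorer. The signals, weights and threshold are
# invented for illustration; they are not SEOmoz's or Google's model.
SIGNAL_WEIGHTS = {
    "exact_match_anchor_ratio": 3.0,  # share of backlinks with exact-match anchor text
    "link_network_overlap": 2.5,      # share of links from heavily interlinked sites
    "thin_content_ratio": 1.5,        # share of linking pages with little real content
}

def spam_score(signals):
    """Weighted sum of signals; each signal value is expected in 0..1."""
    return sum(SIGNAL_WEIGHTS[name] * value for name, value in signals.items())

def looks_spammy(signals, threshold=3.5):
    return spam_score(signals) >= threshold

clean_profile = {"exact_match_anchor_ratio": 0.10,
                 "link_network_overlap": 0.05,
                 "thin_content_ratio": 0.20}
spammy_profile = {"exact_match_anchor_ratio": 0.90,
                  "link_network_overlap": 0.80,
                  "thin_content_ratio": 0.70}

print(looks_spammy(clean_profile), looks_spammy(spammy_profile))  # False True
```

The hard part in practice is not the scoring but measuring the signals at scale, which is presumably what SEOmoz’s crawl data is for.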
Some in the SEO community are angered at Rand Fishkin’s policy of “outing” SEOs for spamming practices, so this time, Rand has enlisted the public to answer whether or not he should do so.

Some of our team members, though, do have concerns about whether SEOs will be angry that we’re “exposing” spam. My feeling is that it’s better to have the knowledge out there (and that anything we can catch, Google/Bing can surely better catch and discount) than to keep it hidden. I’m also hopeful this can help a lot of marketers who are trying to decide whether to acquire certain links or who have to dig themselves out of a penalty (or reverse what might have caused it).

antispam

Preliminary results show that most are in favor of Rand’s reporting of other SEOs for spammy practices. Certainly the reporting of offenders will help Google combat the unwanted webspam that has permeated search results since the Internet entered mainstream society. It is the new mantra of the modern web: you need to follow the rules and guidelines established by Google, whether or not you agree with them, for fear of serious reprisal. Ultimately, what benefits the search results benefits the searcher.

On a slightly related note, I would like to suggest Feathers McGraw as the new face of the Penguin algorithm update from Google…

SEO news blog post by @ 10:49 am

Categories: Google, Rankings

 

 

May 1, 2012

Search Engine Experiment in Spam Surfing

If you took a very heavily spam-influenced search engine, Bing for example, and removed the first 1 million results for a query, how good would the results be?

How about doing the same thing to the best filtered search engines available?

Well, someone got curious and made the Million Short search engine.

What this new service does is remove a specific number of search results and show you the remainder.
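Mechanically, the idea is trivial to sketch (a toy stand-in, not their actual pipeline):

```python
def million_short(ranked_results, remove_top=1_000_000):
    """Toy sketch of Million Short's core idea: discard the first
    `remove_top` ranked results and surface whatever remains."""
    return ranked_results[remove_top:]

ranked = ["site-%d" % i for i in range(1, 11)]
print(million_short(ranked, remove_top=3))
# ['site-4', 'site-5', 'site-6', 'site-7', 'site-8', 'site-9', 'site-10']
```

All the interesting questions are about what feeds that list, which is exactly what I started wondering: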

I became immediately curious about a few things:

  • Where are they getting their crawl data from?
  • What are they doing to searches where there’s only a few hundred results?
  • Where is the revenue stream? I see no ads?

Given the lack of advertising I was expecting them to be pulling search data from another site?

There’s no way they are pulling from Bing/Yahoo; there are 14+ sites paying for better spots than we’ve earned on Bing for our terms.

And while the top 10 list looks a bit like DuckDuckGo’s, we’re seemingly banned from their rankings, and not at #6 at all. It’s funny when you look at their anti-spam approach and then look at the #1 site for ‘seo services’ on DDG; it’s like a time machine back to the days of keyword link spam. Even more ironic is that we conform to DDG’s definition of a good SEO:

“The ones who do in fact make web sites suck less, and apply some common sense to the problem, will make improvements in the search ranking if the site is badly done to start with. Things like meta data, semantical document structure, descriptive urls, and whole heap of other factors can affect your rankings significantly.

The ones who want to subscribe you to massive link farms, cloaked gateway pages, and other black hat type techniques are not worth it, and can hurt your rankings in the end.
Just remember, if it sounds too good to be true, it probably is. There are some good ones, and also a lot selling snake oil.”

We know the data isn’t from Google either, we have the #1 seat for ‘seo services’ on Google and maintain that position regularly.

So what’s going on?! This is the same company that gave us the ‘Find People on Plus‘ tool and clearly they know how to monetize a property?

My guess is that they are blending results from multiple search engines, and likely caching a lot of the data so it’d be very hard to tell who’s done the heavy lifting for them?

All that aside, it’s rare to see a search engine that blatantly gives you numbered SERPs, and for now MillionShort is showing numbered positions for keywords in the left side-bar. That’s sort of handy, I guess. :)

You can also change how many results to remove, so if your search is landing you in the spam bucket, try removing fewer results. If your search always sucks, and the sites you want to see are in the results on the right, you’ve apparently found a search phrase that isn’t spammed! Congrats!

Weak one: Google Drive

Well my enthusiasm for Google Drive just flew out the window on my second week using it.

UPDATE: Turns out the disk was full and Google Drive has no feedback at all. Thanks FireFox for telling me WHY the download failed. Oh man.

SEO news blog post by @ 11:01 am


 

 

March 22, 2012

Don’t drink the link bait..

Kool-Aid
Kool-Aid
Thanks to the recent (March/April) Google updates, ‘tread lightly’ has never been better advice for anyone in the SEO industry.

Between the extra offers in my inbox to ‘exchange links’, ‘sell links’ and ‘purchase links’, all seemingly coming from GMail accounts, and reports of simple JavaScript causing pages to drop from Google’s index, I’m about ready to dig a fox hole and hide in it.

First off, let’s talk about how dumb it is to even offer to sell/buy/exchange links at this stage of Google’s anti-spam efforts.

Even if the offer came from some part of the universe where blatantly spamming services, using GMail of all things, was not the most painfully obvious way for a person who SHOULD be hiding every effort to get detected, it still doesn’t bode well for the ethics of a company selling you ‘success’ when they can’t even afford their own mail account and have to use a free one.

Further, even if the offer came from someone magically smart enough to send out all that spam without it being tracked, any success they have adds your site to a group of sites ‘cheating’ the system. The more sites in the ‘exchange’, the more likely it is to get you caught and penalized. So technically, any success to be had will also be your successful undoing.

Secondly, let’s consider how you would try to catch people buying/selling links if you were Google. It’s an invasion of privacy to snoop through someone’s GMail to see if they bought/sold links, but if Google sends you an email asking to purchase a link on your site, is that an invasion of privacy or just a really accurate way to locate the worst spam sites on-line? The same goes for selling a back link to your site: just send out an email, wait for positive responses from the verified site owner, and start demoting the site. Talk about making it easy for Google.

Heck, as an SEO trying to do things the right way, if I get enough offers to sell/buy links from a particular spammer, wouldn’t it be worth my time to submit a report to Google’s quality team? The ‘lack of wisdom’ of these offers should be very obvious by now, but they still persist for some curious reason; perhaps they are all coming from those relentless Nigerian email scammers?

JavaScript?

The next issue is on-page JavaScript with questionable tactics. I know Google can’t put a human in front of every page review, even if they actually do a LOT of human-based site review. So the safe assumption for now is that your site will be audited by ‘bots’ that have to make some pretty heavy decisions.

When a crawler bot comes across JavaScript, the typical response is to isolate and ignore the information inside the <script></script> tags. Google, however, seems to be adding JavaScript interpreters to their crawler bots in order to properly sort out what the JavaScript is doing to the web page.
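To make that traditional behaviour concrete, here is a minimal sketch (my own toy, not Google’s crawler) of a text extractor that keeps visible copy but discards anything inside script tags:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Toy model of the traditional crawler behaviour: collect visible
    text, but ignore everything inside <script> tags."""
    def __init__(self):
        super().__init__()
        self.in_script = False
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            self.in_script = True

    def handle_endtag(self, tag):
        if tag == "script":
            self.in_script = False

    def handle_data(self, data):
        if not self.in_script and data.strip():
            self.chunks.append(data.strip())

page = '<p>Visible copy</p><script>document.write("hidden copy");</script>'
extractor = TextExtractor()
extractor.feed(page)
print(" ".join(extractor.chunks))  # Visible copy
```

Anything a script writes into the page is invisible to a bot like this, which is exactly why executing the JavaScript changes what gets indexed.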

Obviously, if a JavaScript is confusing the crawler, the most likely reaction is to not process the page for consideration in SERPs, and this appears to be what we’re seeing a lot of recently, with people claiming they have been ‘banished’ from Google due to JavaScript that was previously ignored. We even ran some tests on our blog late in 2011 for JavaScript impact, and the results were similar to what I’m hearing from site owners in this last update.

So, the bottom line is to re-evaluate your pages and decide: is the JavaScript you’ve been using worth risking your rankings over?

If you are implementing JavaScript for appearance reasons, using something very common like jQuery, you probably have nothing to fear. Google endorses jQuery and even hosts an on-line version to make it easier to implement.

On the flip-side, if you are using something obscure/custom, like a click-tracker/traffic JavaScript that inserts links to known ‘SEO’ services, I’d remove it now to avoid any stray rounds from Google’s anti-SEO flak-cannon.
Google Flak Cannon

I did toss some Minecraft demo map videos on-line last night/this morning, but they didn’t turn out so swell for a bunch of reasons and I’m just going to re-record them with better software. Stay tuned!

SEO news blog post by @ 12:42 pm


 

 

March 19, 2012

Newest Panda Attacks Onsite Optimization

Google will be penalizing websites that overuse onsite optimization tactics. Matt Cutts of Google announced the new algorithm update during a SXSW panel discussion named "Dear Google & Bing: Help Me Rank Better!" alongside Danny Sullivan of Search Engine Land and Microsoft’s Senior Product Marketing Manager of Bing.

panda conspiracy

Cutts revealed that over the last few months Google has been working on a new update specifically designed to target sites that are "over-optimized" or "overly SEO’d."

This is the latest effort by Google to reduce the amount of webspam that still permeates the SERPs. Reminiscent of the Panda update, the new update is designed to target and penalize those that utilize black hat SEO tactics and try to manipulate Google’s search results through less-than-savory optimization tactics.

Sites that keep to white hat SEO tactics apparently will have nothing to fear (fingers crossed). The new update is designed to address sites that focus only on SEO and not on delivering quality content.

In search results, Google wants to "level the playing field" regarding "all those people doing, for lack of a better word, over optimization or overly SEO–versus those making great content and great sites," Schwartz quotes Cutts as saying, in a rough transcription.

"We are trying to make GoogleBot smarter, make our relevance better, and we are also looking for those who abuse it, like too many keywords on a page, or exchange way too many links or go well beyond what you normally expect," the transcript continues.

The new update is expected to be implemented and to begin affecting search results in the upcoming month or next few weeks, although Google had no official comment on the matter.

The Wall Street Journal reported earlier this week that Google is about to embark on the biggest-ever overhaul of its search system, one that involves semantic search as well as changes to search engine optimization, advertising, and page-ranking results.

SEO news blog post by @ 12:07 pm

Categories: Google

 

 

February 29, 2012

Panda 3.3: Building a Better Panda

Following up on my previous blog post from Monday, where I recapped the launch and implementation of the Google algorithm update affectionately known as "Panda": first released in February of 2011, Panda has been through several iterations and has had a profound effect on the quality of search results, webspam and SEO.

Panda Terminator

Google confirmed on February 27th the release of the Panda 3.3 update, in conjunction with forty other search updates occurring in February or currently in progress. Although on the surface it seems very similar to Google’s January release of Panda 3.2, which was described as a "data refresh," Google describes this update as follows: "This launch refreshes data in the Panda system, making it more accurate and more sensitive to recent changes on the web."

In their blog post, Google states that they are retiring a link evaluation signal that has been employed for many years, an act that is going to cause some heated discussion around SEO water coolers everywhere. Google was reluctant to release too much information, for fear of revealing details about ranking signals.

Link evaluation. We often use characteristics of links to help us figure out the topic of a linked page. We have changed the way in which we evaluate links; in particular, we are turning off a method of link analysis that we used for several years. We often re-architect or turn off parts of our scoring in order to keep our system maintainable, clean and understandable.

Another part of the Panda 3.3 release focuses on local search rankings. Google revealed that traditional algorithmic ranking factors are now playing a larger part in triggering local search results.

Here are the released details of the Panda 3.3 algorithm update:

  • More coverage for related searches. [launch codename “Fuzhou”] This launch brings in a new data source to help generate the “Searches related to” section, increasing coverage significantly so the feature will appear for more queries. This section contains search queries that can help you refine what you’re searching for.
  • Tweak to categorizer for expanded sitelinks. [launch codename “Snippy”, project codename “Megasitelinks”] This improvement adjusts a signal we use to try and identify duplicate snippets. We were applying a categorizer that wasn’t performing well for our expanded sitelinks, so we’ve stopped applying the categorizer in those cases. The result is more relevant sitelinks.
  • Less duplication in expanded sitelinks. [launch codename “thanksgiving”, project codename “Megasitelinks”] We’ve adjusted signals to reduce duplication in the snippets for expanded sitelinks. Now we generate relevant snippets based more on the page content and less on the query.
  • More consistent thumbnail sizes on results page. We’ve adjusted the thumbnail size for most image content appearing on the results page, providing a more consistent experience across result types, and also across mobile and tablet. The new sizes apply to rich snippet results for recipes and applications, movie posters, shopping results, book results, news results and more.
  • More locally relevant predictions in YouTube. [project codename “Suggest”] We’ve improved the ranking for predictions in YouTube to provide more locally relevant queries. For example, for the query [lady gaga in ] performed on the US version of YouTube, we might predict [lady gaga in times square], but for the same search performed on the Indian version of YouTube, we might predict [lady gaga in India].
  • More accurate detection of official pages. [launch codename “WRE”] We’ve made an adjustment to how we detect official pages to make more accurate identifications. The result is that many pages that were previously misidentified as official will no longer be.
  • Refreshed per-URL country information. [Launch codename “longdew”, project codename “country-id data refresh”] We updated the country associations for URLs to use more recent data.
  • Expand the size of our images index in Universal Search. [launch codename “terra”, project codename “Images Universal”] We launched a change to expand the corpus of results for which we show images in Universal Search. This is especially helpful to give more relevant images on a larger set of searches.
  • Minor tuning of autocomplete policy algorithms. [project codename “Suggest”] We have a narrow set of policies for autocomplete for offensive and inappropriate terms. This improvement continues to refine the algorithms we use to implement these policies.
  • “Site:” query update [launch codename “Semicolon”, project codename “Dice”] This change improves the ranking for queries using the “site:” operator by increasing the diversity of results.
  • Improved detection for SafeSearch in Image Search. [launch codename "Michandro", project codename “SafeSearch”] This change improves our signals for detecting adult content in Image Search, aligning the signals more closely with the signals we use for our other search results.
  • Interval based history tracking for indexing. [project codename “Intervals”] This improvement changes the signals we use in document tracking algorithms. 
  • Improvements to foreign language synonyms. [launch codename “floating context synonyms”, project codename “Synonyms”] This change applies an improvement we previously launched for English to all other languages. The net impact is that you’ll more often find relevant pages that include synonyms for your query terms.
  • Disabling two old fresh query classifiers. [launch codename “Mango”, project codename “Freshness”] As search evolves and new signals and classifiers are applied to rank search results, sometimes old algorithms get outdated. This improvement disables two old classifiers related to query freshness.
  • More organized search results for Google Korea. [launch codename “smoothieking”, project codename “Sokoban4”] This significant improvement to search in Korea better organizes the search results into sections for news, blogs and homepages.
  • Fresher images. [launch codename “tumeric”] We’ve adjusted our signals for surfacing fresh images. Now we can more often surface fresh images when they appear on the web.
  • Update to the Google bar. [project codename “Kennedy”] We continue to iterate in our efforts to deliver a beautifully simple experience across Google products, and as part of that this month we made further adjustments to the Google bar. The biggest change is that we’ve replaced the drop-down Google menu in the November redesign with a consistent and expanded set of links running across the top of the page.
  • Adding three new languages to classifier related to error pages. [launch codename "PNI", project codename "Soft404"] We have signals designed to detect crypto 404 pages (also known as “soft 404s”), pages that return valid text to a browser but the text only contain error messages, such as “Page not found.” It’s rare that a user will be looking for such a page, so it’s important we be able to detect them. This change extends a particular classifier to Portuguese, Dutch and Italian.
  • Improvements to travel-related searches. [launch codename “nesehorn”] We’ve made improvements to triggering for a variety of flight-related search queries. These changes improve the user experience for our Flight Search feature with users getting more accurate flight results.
  • Data refresh for related searches signal. [launch codename “Chicago”, project codename “Related Search”] One of the many signals we look at to generate the “Searches related to” section is the queries users type in succession. If users very often search for [apple] right after [banana], that’s a sign the two might be related. This update refreshes the model we use to generate these refinements, leading to more relevant queries to try.
  • International launch of shopping rich snippets. [project codename “rich snippets”] Shopping rich snippets help you more quickly identify which sites are likely to have the most relevant product for your needs, highlighting product prices, availability, ratings and review counts. This month we expanded shopping rich snippets globally (they were previously only available in the US, Japan and Germany).
  • Improvements to Korean spelling. This launch improves spelling corrections when the user performs a Korean query in the wrong keyboard mode (also known as an “IME”, or input method editor). Specifically, this change helps users who mistakenly enter Hangul queries in Latin mode or vice-versa.
  • Improvements to freshness. [launch codename “iotfreshweb”, project codename “Freshness”] We’ve applied new signals which help us surface fresh content in our results even more quickly than before.
  • Web History in 20 new countries. With Web History, you can browse and search over your search history and webpages you’ve visited. You will also get personalized search results that are more relevant to you, based on what you’ve searched for and which sites you’ve visited in the past. In order to deliver more relevant and personalized search results, we’ve launched Web History in Malaysia, Pakistan, Philippines, Morocco, Belarus, Kazakhstan, Estonia, Kuwait, Iraq, Sri Lanka, Tunisia, Nigeria, Lebanon, Luxembourg, Bosnia and Herzegowina, Azerbaijan, Jamaica, Trinidad and Tobago, Republic of Moldova, and Ghana. Web History is turned on only for people who have a Google Account and previously enabled Web History.
  • Improved snippets for video channels. Some search results are links to channels with many different videos, whether on mtv.com, Hulu or YouTube. We’ve had a feature for a while now that displays snippets for these results including direct links to the videos in the channel, and this improvement increases quality and expands coverage of these rich “decorated” snippets. We’ve also made some improvements to our backends used to generate the snippets.
  • Improvements to ranking for local search results. [launch codename “Venice”] This improvement improves the triggering of Local Universal results by relying more on the ranking of our main search results as a signal. 
  • Improvements to English spell correction. [launch codename “Kamehameha”] This change improves spelling correction quality in English, especially for rare queries, by making one of our scoring functions more accurate.
  • Improvements to coverage of News Universal. [launch codename “final destination”] We’ve fixed a bug that caused News Universal results not to appear in cases when our testing indicates they’d be very useful.
  • Consolidation of signals for spiking topics. [launch codename “news deserving score”, project codename “Freshness”] We use a number of signals to detect when a new topic is spiking in popularity. This change consolidates some of the signals so we can rely on signals we can compute in realtime, rather than signals that need to be processed offline. This eliminates redundancy in our systems and helps to ensure we can continue to detect spiking topics as quickly as possible.
  • Better triggering for Turkish weather search feature. [launch codename “hava”] We’ve tuned the signals we use to decide when to present Turkish users with the weather search feature. The result is that we’re able to provide our users with the weather forecast right on the results page with more frequency and accuracy.
  • Visual refresh to account settings page. We completed a visual refresh of the account settings page, making the page more consistent with the rest of our constantly evolving design.
  • Panda update. This launch refreshes data in the Panda system, making it more accurate and more sensitive to recent changes on the web.
  • Link evaluation. We often use characteristics of links to help us figure out the topic of a linked page. We have changed the way in which we evaluate links; in particular, we are turning off a method of link analysis that we used for several years. We often rearchitect or turn off parts of our scoring in order to keep our system maintainable, clean and understandable.
  • SafeSearch update. We have updated how we deal with adult content, making it more accurate and robust. Now, irrelevant adult content is less likely to show up for many queries.
  • Spam update. In the process of investigating some potential spam, we found and fixed some weaknesses in our spam protections.
  • Improved local results. We launched a new system to find results from a user’s city more reliably. Now we’re better able to detect when both queries and documents are local to the user.
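One item in the list above, the "Soft404" classifier, is easy to picture concretely: a soft 404 is a page that returns a normal 200 OK response, but whose text is really just an error message like "Page not found." The sketch below is purely illustrative; the phrases and thresholds are our own assumptions, not Google's actual signals, which the company does not disclose.

```python
# Toy "soft 404" heuristic, inspired by Google's Soft404 update notes.
# A soft 404 returns HTTP 200 but its body is essentially an error message.
# The phrase list and word-count threshold are illustrative assumptions only.

ERROR_PHRASES = [
    "page not found",            # English
    "pagina niet gevonden",      # Dutch
    "pagina non trovata",        # Italian
    "pagina nao encontrada",     # Portuguese (unaccented for simplicity)
]

def looks_like_soft_404(status_code: int, body_text: str) -> bool:
    """Flag pages that return 200 OK but whose text is mostly an error message."""
    if status_code != 200:
        return False  # a real 404/410 response is not a *soft* 404
    text = body_text.lower()
    has_error_phrase = any(phrase in text for phrase in ERROR_PHRASES)
    is_thin = len(text.split()) < 50  # error pages tend to carry little content
    return has_error_phrase and is_thin
```

For site owners, the practical takeaway is the inverse of this check: if a page no longer exists, return a real 404 or 410 status code rather than a friendly "not found" page served with 200 OK.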

Other details of the update, and of changes Google has made recently, can be found here:

SEO news blog post by @ 10:57 am


 

 

Copyright© 2004-2014
Beanstalk Search Engine Optimization, Inc.
All rights reserved.