At Beanstalk Search Engine Optimization we know that knowledge is power. That's the reason we started this Internet marketing blog back in 2005. We know that the better informed our visitors are, the better the decisions they will make for their websites and their online businesses. We hope you enjoy your stay and find the news, tips and ideas contained within this blog useful.
I sent over a couple of screenshots to the Google AdWords team showing that a ton of clicks were staying for only 0 seconds and yet I was being charged for them. Here is what I got back:
Please be assured that our system identifies invalid clicks and filters them so you are not charged for those clicks. Therefore, the charges that you have accrued are for legitimate clicks.
Whew. I feel better now.
They went on …
Dave, please understand that zero second visits do not indicate invalid click activity. Analytics calculates time spent on one page by looking at two time stamps: one from the request for the first page and one from the request of the second page. If your users have visited only one page then time on page will be zero regardless if he has actually spent time on that page. This is because Analytics does not have a reference point of another page to calculate Time on Page. Therefore, you may be seeing zero second visits even though users may have been on your site for some time. However, I will not be able to recommend any third tracking software for you.
OK – so now I’ve learned something. I’d made the mistake of thinking Google knew the time to exit, perhaps assuming they really are Big Brother and know everything – Analytics is more limited than I first thought. As I’m fundamentally an organic SEO who dabbles in PPC, I went off on this project all on my own. There are definitely phrases that would result in visitors who hit the site, read a review and head off to another site (hopefully via an affiliate link). So stay tuned – I’m going to be placing some redirect pages rather than direct links in to see if this affects the stats. If so, my apologies to the Google AdWords team for a few of the words I’ve used over the past 48 hours.
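That two-timestamp behaviour is easy to sketch. Here’s a rough illustration (my own, not Google’s actual code) of why a one-page visit always reports zero seconds:

```python
# Time on a page is the gap between that pageview and the next one.
# The last (or only) page of a visit has no "next" request, so it is
# recorded as 0 seconds regardless of how long the visitor stayed.

def time_on_pages(pageview_timestamps):
    """Given the Unix timestamps of one visitor's pageviews, in order,
    return the time-on-page figure for each pageview."""
    times = []
    for i, ts in enumerate(pageview_timestamps):
        if i + 1 < len(pageview_timestamps):
            times.append(pageview_timestamps[i + 1] - ts)
        else:
            times.append(0)  # no second timestamp to measure against
    return times

# A three-page visit gets real numbers; a one-page visit reads as 0
# seconds no matter how long the reader actually spent on the page.
print(time_on_pages([1000, 1045, 1100]))  # [45, 55, 0]
print(time_on_pages([1000]))              # [0]
```

This is exactly the situation described in the reply above: a visitor who reads one review page and leaves via an affiliate link looks identical, to Analytics, to someone who bounced instantly.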
For those of you who have noticed significant fluctuations in your rankings – you’re not alone. Across the web people have reported significant changes in their rankings. We at Beanstalk were fortunate on this one in that we had ranking reports running for the past few days and got to watch the changes over the course of the report. A happy coincidence.
Unfortunately the algorithm shift isn’t particularly favorable to solid site optimization. There’s an odd pattern in what we’re seeing: sites whose link building focused on high relevancy and high trustability lost ground, while sites whose link building focused on volume in recent months gained ground. This indicates a shift to volume over quality. For obvious reasons we’re convinced that this shift won’t last.
This shift in quality isn’t just apparent in the sites we’re working on; as we analyze various sites across the web we’re noticing a larger number of sites with lower-quality backlinks ranking.
Now – to be sure, we’re always in favor of diversified link building strategies, and that includes strategies that focus more on volume and others that focus on trust and relevancy, but everything we can see indicates that this update puts a disproportionate emphasis on volume. I expect to see the rankings shift again – likely over the weekend.
I should note that this isn’t just something we’re noticing – it has been noticed by a wide array of SEOs. My advice? Don’t react too quickly – corrections are coming and you don’t want to adjust the wrong way.
And in other news …
Also noticeable in the current ranking reports we’re running for our clients is the merging of Yahoo! and Bing search results. A couple of days ago Yahoo! announced that their organic results in North America were being fed by Bing. This is, however, the first set of ranking reports that have reflected it. This is (in my opinion) very exciting news and you can read more about it on Search Engine Journal here.
And stay tuned – I’ll be posting more as the Google update continues.
I just wanted to take a moment to thank all our blog and article subscribers, and all our visitors, for helping me make the list of most influential writers on link building in a poll over on the Eightfold Logic blog (link removed – resource no longer exists). Of course, I’ve always tried to educate and hopefully entertain in my work, and I’m definitely glad it has been well received. So thanks to you all for voting and be sure to stay tuned – if anything, this inspires me to write more often on this important topic and many others.
Sharing the honor with me is a great list of writers that I’d highly recommend following as well. They are:
and a tie for fifth:
5. Rand Fishkin
5. Ralph Tegtmeier – Fantomaster / “Fantomeister”
For the past week the Internet world has been abuzz with the Google/Verizon deal and how it will affect Net Neutrality. For those of you who have heard me speak at conferences or listened to my radio show, you’ll know that I’m not the biggest supporter of Net Neutrality legislation. I tend to take a pretty hard line in a debate (almost always against Jim Hedger), but so does he, and it makes for an entertaining debate, with him referring to me as a closed-minded hater of equality and me accusing him of communist tendencies and of wanting to implement policies and laws that counter the entire spirit of capitalism. It’s a fun debate.
But today Jim and I saw eye to eye. While we may argue about the reasons, we agree – we both object to the way Google is handling the current issue with their Verizon deal, which would give their 1s and 0s a bit of preferential treatment. More on that in just a bit. First, let’s get some basic history on Google’s stand on Net Neutrality and the arguments of those who oppose it, and go from there. But first –
What Is Net Neutrality?
Net Neutrality is, at its core, the idea that the Internet is a mandatory service and that complete equality is required in the way packets are treated as they flow across it. The idea that the telcos should have the ability to charge more for preferential treatment of certain packets (say … YouTube videos, if Google slipped them a few extra bucks) violates this principle. Well, who can argue with that? Don’t I have the same rights to the Internet as everyone else?
The problem arises in that the telcos need to pay for the infrastructure and access to that network. They argue (and let’s remember – we’re all capitalists here) that they have the right to monetize their services in a way that maximizes profits. The FTC (Federal Trade Commission) has opposed Net Neutrality legislation, noting that there are consumer protection laws in place that provide the protection in productive ways, and that bloating the law books with more jargon isn’t going to make the issue simpler or solve any problems that aren’t being solved with current legislation. This has been witnessed many times, including a decision against Comcast when they tried to restrict access to torrents on their network and were ordered to stop doing so. Basically, Net Neutrality is protected even for a file type that is used primarily for exchanging illegal material (yes, torrents are used for legitimate purposes, but …).
Initially there were two camps, those who opposed net neutrality and those who supported it. The line was drawn basically based on profit like so:
Against legislation – the “greedy” telcos who just want to make a buck. For legislation – a bunch of people who stand to profit from it, such as Google, Microsoft and others, who claim that legislation-free networks would hinder innovation and growth in the technology industry. To hear them tell it, it has nothing to do with the fact that it would cost them more.
In 2007 Google was on record as saying:
“The nation’s spectrum airwaves are not the birthright of any one company. They are a unique and valuable public resource that belong to all Americans. The FCC’s auction rules are designed to allow U.S. consumers — for the first time — to use their handsets with any network they desire, and download and use the lawful software applications of their choice.”
At the time they were bashing Verizon for fighting the FCC (Federal Communications Commission) decision “that would require the eventual winner of the spectrum to offer open devices and applications,” a decision Verizon claimed was “arbitrary and capricious, unsupported by substantial evidence and otherwise contrary to law.” You can read more about this on Google’s Policy Blog here.
So Here We Are 3 Years Later …
So here we stand 3 years later and Google and Verizon are in bed together, working out a deal to prioritize some traffic over other traffic – basically pulling a reference from George Orwell’s Animal Farm: “some animals are more equal than others.” They use the example of medical applications but leave the door open to gaming, 3D, entertainment and more. I’m sure none of us would have a problem with a heart monitor connected to a doctor’s office over the Internet getting priority over an MSN chat, but we all know that’s not where this is going or it wouldn’t even be a debate.
Now on the table is that mobile devices should be included in the list of exempt platforms and services. Alrighty – now we’re getting warmed up. So they’re OK with the standard old Internet having Net Neutrality imposed (except for special applications and services as yet to be defined, of course) … but mobile, the up-and-comer and a rapidly growing area of bandwidth consumption and connectivity – that area should be excluded from the legislation? Here’s where they lost me, not because I think it’s wrong to give preferential treatment but because I don’t like it when people try to be sly.
Here’s the thing … “not all animals are equal.” I can’t tell Google that all they can charge for a PPC click is $0.40 just to make sure that everyone can afford it. It’s just not that kind of world (and I would argue further that it shouldn’t be).
What They Should Have Done …
Verizon has done exactly what they should have. The way the message was delivered puts any backlash squarely on Google. I have no advice for them – masterfully executed.
Google should have come forward and said:
“The world has changed in 3 years and we have a lot of great ideas about the direction of mobile that’s going to require that Net Neutrality legislation doesn’t apply. We need to be able to pay more for preferential bandwidth to ensure that we can provide you with the services we know you’ll love at a price you’ll enjoy even more. We want to pay extra so you don’t have to.”
We would have called them on going against their earlier policies, but really – there would have been far fewer rumors and much less conjecture about what was going on. They should have stood up for their actions, admitted they were contrary to their former statements, and basically outlined what we all know: the Internet world moves fast and the rules have changed.
Sometimes it’s refreshing to just hear a spade called a spade. I don’t believe that Google has any huge secret plans to bring down the Internet – I think they just want to be more equal. At the end of the day I don’t even disagree with their right to be more equal – they just should have come out and said so. They should have stood up for themselves.
And Now For Some Fun…
And now that you’ve made it to the end of a post on Net Neutrality, here’s a video done by “Ask A Ninja” on Net Neutrality:
As mentioned over at WebConfs.com, on the surface, HTML 5, other than the exciting <canvas> element, does not appear to be much different than its predecessor, HTML 4. It will still be XML based and is not making any moves towards being a scripting language like PHP or similar complex programming languages. It looks like the new standard will mainly introduce more effective tags for organizing the content of a webpage to make it more readable by search engine spiders. The main prerequisite of HTML 5 was to keep it accessible to the masses and to have it continue being backwards compatible…which means you will not have to re-learn the whole language.
Most HTML 4 content is currently wrapped in <div> or <span> tags regardless of what it is. New tags introduced by HTML 5 have a more semantic meaning. Tags like <article>, <nav>, <footer>, <header>, <dialog> and <aside> (which can be used to indicate a piece of content removed slightly from the rest of the page in terms of relevance) will be increasingly important for SEO efforts. The new <audio>, <video> and <dialog> tags will be part of the upcoming HTML 5 standard and will allow for further segregation of page content into relevant categories.
The biggest change with the new standard will be the concept of Page Segmentation. Google already has a patent for this and many believe that the practice is already in use today. Currently, there is no way for a website developer to tell the bots how to segment the pages correctly. By dividing pages into separate sections, a cleaner, more organized structure will be created, allowing bots to parse your pages for content more efficiently. This also means that bots are able to analyze the segments individually and are not wasting time trying to divine content from navigation, scripts, CSS and other inline elements. This will drastically increase the understanding of the relevancy of the page and will allow bots to rank multi-topic pages more accurately.
Here are some of the most important new HTML 5 tags and how they will relate to SEO:
The new article tag is probably one of the best additions to HTML 5 from an SEO perspective. This new tag will allow SEO’s to mark separate entries in online publications. It will clean up the code by reducing the need for excessive <div> tags. Search engines will probably place more importance on the content wrapped in the <article> tag compared to content on the other parts of the page.
The new section tag will be used to further organize the structure of the HTML document. By using the new <section> tag to identify separate sections on a page/chapter/book and maintaining a consistent hierarchical structure, each section can have its separate HTML heading. As with the <article> tag, it can be assumed that search engines will place more attention on the contents of identified sections. If the words of a search string are found in one section for instance, this implies higher relevance, as compared to when these words are found all across the page or in separate sections.
Not to be confused with the <head> element, the <header> tag is similar to the <h1> tag, the key difference being that it can contain <h1> elements, text content, hard-coded links (bonus!) and anything else you like. This one will be huge for SEOs!
While maybe not as important as the new <header> tag, the new <footer> tag will also allow for lots of “extra” SEO content. The real bonus is that both the <header> and <footer> tags can be used repeatedly in each <section> of the page. This gives a lot of flexibility to SEOs!
The new <nav> tag allows for the definition of site navigation or a series of internal or external links. This is another instance of HTML5 trying to organize page content in order to increase the effectiveness and efficiency of the bots that parse your site for content.
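Putting those tags together, a basic HTML 5 page layout might look something like this (a hypothetical sketch – the headings and content are made up for illustration):

```html
<body>
  <header>
    <h1>Beanstalk SEO Blog</h1>
  </header>
  <nav>
    <a href="/blog/">Blog</a>
    <a href="/services/">Services</a>
  </nav>
  <section>
    <article>
      <header><h2>HTML 5 and SEO</h2></header>
      <p>The post content a search engine would likely weight most heavily.</p>
      <footer>Posted by the Beanstalk team</footer>
    </article>
    <aside>
      <p>A related note, slightly removed from the main content in relevance.</p>
    </aside>
  </section>
  <footer>
    Copyright notice, extra links and so on.
  </footer>
</body>
```

Note how <header> and <footer> appear both at the page level and inside the <article> – that repetition is exactly the flexibility described above, and it hands the bots a ready-made segmentation of the page.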
Like all W3C recommendations, it will take some time for the standard to be completely ratified and for people to begin implementing the new tags in their website designs. Once enough web pages are using the new HTML 5 standard, search engines will inevitably begin to use it to improve results in the SERPs. Links and content within certain tags will be treated differently from those using redundant or archaic tags, making the new HTML markup far more important to SEO efforts than it is currently.
Unlike other less popular HTML recommendations for past standardizations, I think this one is long overdue and will be embraced by SEOs and SEMs alike. Embrace the change and start building your sites with an eye on the not too distant future. Fortune favours the prepared!
We are all guilty of it at one time or another: creating an insecure password. There is a myriad of excuses that we make to justify our password infractions (can’t think of one, can’t remember it if it’s too complicated … etc.). With the ever-present threats from hackers and information piracy, we all need to do what we can to protect ourselves. Besides … creating a strong password just makes sense, doesn’t it?
Much to my chagrin, my own Gmail account was recently hacked. I am not a novice when it comes to password security or the need to protect sensitive information, but this really made me sit up, take notice and re-evaluate my username/password usage very seriously.
I think there is an assumption that people just automatically know what constitutes a strong password. But for those of us who need a refresher, here we go:
Tips on Creating a Secure Password
• Make sure it is alpha-numeric (letters and numbers)
• Mix up uppercase and lowercase
• Do not use real words (words found in a dictionary)
• Do not use personal information (names, birthdates, license plates)
• Use a passphrase. (Take a sentence or line from a song, make it into an acronym and substitute special characters for letters, like $ for “S” and ! for “1”, etc. This makes it a lot easier to remember an abstract phrase that doesn’t mean anything.)
• Use different usernames and passwords for different accounts
• Change or rotate your passwords frequently
• Do not share your information with anyone
• Do not write down your usernames or passwords anywhere! Ever! (As a former computer tech, I can’t tell you how many times I walked into an office and saw usernames/passwords conveniently displayed on monitors on bright yellow post-it notes!)
• MOST IMPORTANT! make sure you are not using a username or password on the Top 500 Worst Passwords of All Time list.
Some other common usernames and passwords to avoid:
• ncc1701 – The ship number of the Starship Enterprise (and adding A, B, C, D or E does not suddenly make it more secure!)
• thx1138 – The name of George Lucas’s first movie, a 1971 remake of an earlier student project
• qazwsx – Follows a simple pattern when typed on a typical keyboard
• qwerty – Another standard keyboard pattern
• 666666 – Six sixes
• 7777777 – Seven sevens
• ou812 – The title of a 1988 Van Halen album
• 90210 – Some lame show from the 90s
• 8675309 – The number mentioned in the 1982 Tommy Tutone song. This song supposedly caused an epidemic of people dialing “867-5309” and asking for “Jenny” (in my own defense … I just kept getting asked for the area code by the operator …)
With all that in mind, protect yourself by getting into the practice of creating strong passwords at every occasion. Be confident and stop being insecure today!
Have you ever wanted to use a font on your website and weren’t able to simply because it wasn’t a web-safe font? Perhaps you wanted a beautiful scrolling heading but knew that doing so would require creating an image heading and really – that’s just not good SEO is it?
Last week the solution to this issue was brought to my attention by Jacob Gube over on the Mashable site in his article on the implementation of Google’s new Font API. Basically this is a standardized mechanism for pulling external font definitions into IE, Firefox, Safari, etc., allowing designers and website owners to finally use the fonts they feel would best work with their design.
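As a quick illustration, using a font through the API takes one stylesheet link and one CSS rule. (The font family here, Tangerine, is just one example from Google’s font directory; the heading class name is my own.)

```html
<link rel="stylesheet" href="http://fonts.googleapis.com/css?family=Tangerine">
<style>
  h1.fancy { font-family: 'Tangerine', serif; font-size: 48px; }
</style>
<h1 class="fancy">A heading in a non-web-safe font</h1>
```

Because the heading stays real text rather than an image, search engines can still read it – which is exactly why this matters for SEO.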
As of late yesterday afternoon I noticed a few minor hiccups in the Google SERPs. This morning those hiccups escalated into multi-page jumps, old versions of pages re-entering the index, pages being dropped from the index and different results appearing with a click of the refresh button. It is far too early to even try to predict what type of update is underway or what it means but hang on to your hats as it looks like a fairly bumpy ride.
And note – if you see your site drop or jump up in the results – don’t count on that staying as we’re seeing bouncing in both directions and my prediction (the only one I’ll make at this early stage) is that what we’re seeing in both instances is not what we’ll see at the end of the day.
Last week I talked about “gleaning” information from a variety of news blogs and websites. Of the sites that I mentioned, SEOmoz is by far at the top of my favorites. They have well-written and informative blog posts, a slew of great (and free!) SEO tools and a lot of great resources that any new or experienced SEO tech would be remiss in neglecting.
If you have not already, it is only a matter of time before you hear the name “Rand Fishkin” from SEOmoz. Rand’s name is synonymous with SEO, and I would again like to send out props to him and the peeps at SEOmoz for putting together the latest version of their “Beginner’s Guide to SEO”.
Even those of us who consider ourselves adept at SEO would do well to give this guide a once-over. It’s like watching an epic movie like Star Wars … you always see something that you didn’t catch before. For the SEO “Padawan” learner, I have found this small (51-page) guide an absolutely indispensable trainer. Fortunately for all of us, Rand is on the light side of the SEO “force” and has not been corrupted by the “dark side”. More on the dark side later.
Well, first let me welcome Kyle to Beanstalk’s blogging realm. Kyle has been with us for quite some time and has moved from Link Builder to Link Department Manager to his current position as an SEO Technician. We hope you will enjoy reading Kyle’s blogs, where he will be writing about his take on current SEO happenings and sharing some of the SEO tips and resources he’s learned and used in developing his SEO chops. Welcome, Kyle.
And for our avid readers I’ve got a first-come-first-served AdWords credit for you. Sign up for a new account and the following code will start you off with $100 (though you’ll have to deposit $10 to get the account started). Caution – I did this with one of my affiliate accounts and it proved so profitable that now Google’s made thousands off me (of course – I’ve made more, so …).
The code for your $100 is: 4QD3-7WET-9SAC-74Y9-DVM2.