As many of our readers may already know, earlier this week Google changed the way its search URLs function so that those of you who monitor your analytics (which should be all of you) will now only see (not provided) where once you would have seen your keyword. This move was met with disappointment and more than a bit of annoyance on the part of SEOs and website owners.

The reason (so they say) is to protect the privacy of their users. The logic is that if keyword data passes, it can be picked up in the log files of the site being visited along with data such as the IP address, which would allow the user to be pinpointed with some degree of accuracy. So, to make sure that the owner of the custom t-shirt site I visited last week can't figure out it was me who searched "custom t-shirts canada," that data is now kept from the receiving site.

Now, here's the annoyance: to say it's a case of protecting privacy would work UNTIL we realize that the same can't be said for paid traffic. If you purchase traffic through AdWords, the keyword data is still tracked. Of course it has to be, or we'd all just be paying for AdWords and trusting that we were getting the traffic we paid for and that the bids made sense. But the hypocrisy is pretty obvious: why is a user who clicks on an organic result more deserving of privacy than one who clicks on a paid result? They're not, obviously, and we're not being told the truth. BUT that's not really the discussion to be had, is it? The fact of the matter is, it's Google and they can do what they want with their own website. I believe I should get to do what I want with my own site (within the confines of the law, of course), and so I won't take that away from others. So what is the real discussion …
What Do We Do Now?
While we're all spending time arguing about the hypocrisy and crying foul, the fact of the matter is that it is what it is, and now we have to figure out what to do. We no longer have keyword data from Google. There are two routes forward: the short-term patch and the long-term changes.
In the short term we can use Advanced Segments to at least get a good idea of which keywords are producing which effects. Essentially, we can use them to filter traffic that follows patterns similar to how specific keywords or keyword groups behaved. This tends to only work well with large traffic groupings, so unless you get huge traffic for single phrases that behave uniquely, you'll probably have to group your traffic together: branded vs. non-branded, for example. I'm not going to get into how this is done here in this blog post simply because I wrote a lengthy piece on it for Search Engine Watch back when (not provided) was first becoming an issue. You can read about it at http://searchenginewatch.com/article/2143123/How-to-Understand-Your-Google-Not-Provided-Traffic.
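To make the branded vs. non-branded grouping concrete, here's a minimal sketch of the idea in Python. It assumes you've exported keyword rows from your analytics as (keyword, visits) pairs; the brand pattern ("beanstalk") and the sample numbers are hypothetical stand-ins, not real data.

```python
import re

# Hypothetical brand terms; swap in your own brand names and variations.
BRAND_PATTERN = re.compile(r"beanstalk", re.IGNORECASE)

def classify_keyword(keyword):
    """Bucket a keyword as 'branded', 'non-branded', or 'not provided'."""
    if keyword == "(not provided)":
        return "not provided"
    return "branded" if BRAND_PATTERN.search(keyword) else "non-branded"

def bucket_visits(rows):
    """Tally visits per bucket from (keyword, visit_count) rows."""
    totals = {"branded": 0, "non-branded": 0, "not provided": 0}
    for keyword, count in rows:
        totals[classify_keyword(keyword)] += count
    return totals

sample = [
    ("custom t-shirts canada", 120),
    ("beanstalk seo", 40),
    ("(not provided)", 300),
]
print(bucket_visits(sample))
# {'branded': 40, 'non-branded': 120, 'not provided': 300}
```

The branded/non-branded ratio from the keywords you can still see gives you a rough baseline for estimating how the (not provided) bucket probably splits.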
This will only work for a while, however. You'll see new traffic coming in and won't know how its behavior impacts the results. Essentially, this will give you a decent idea until your traffic sources change, your site changes, or time passes. So what do we do …
In the long run we have no option but to make massive adjustments to the way we look at our sites. We can no longer determine which keywords perform the best and craft the user experience for them. Instead we have to look at our search traffic as one big bucket. Or do we?
While this may be true for some traffic, we can still segment by the landing page (which will give you a good idea of the phrases) as well as look at groups of pages (all the pages in a single directory, for example). I know, for example, that this change comes right when we ourselves are redesigning our website, and in light of it I will be changing the way our directory structure and page-naming system work to allow for better grouping of landing pages by common URL elements. I imagine I won't be the last to consider this factor when building or redeveloping a website.
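The directory-grouping idea above can be sketched in a few lines of Python. This assumes your analytics export gives you (landing_page_url, visits) rows; the example.com URLs and visit counts are made up for illustration.

```python
from collections import defaultdict
from urllib.parse import urlparse

def group_by_directory(rows):
    """Aggregate (landing_page_url, visits) rows by top-level directory."""
    totals = defaultdict(int)
    for url, visits in rows:
        path = urlparse(url).path
        parts = [p for p in path.split("/") if p]
        # "/services/seo-audit" groups under "/services/"; the root stays "/".
        directory = "/" + parts[0] + "/" if parts else "/"
        totals[directory] += visits
    return dict(totals)

rows = [
    ("http://example.com/services/seo-audit", 85),
    ("http://example.com/services/link-building", 40),
    ("http://example.com/blog/not-provided-update", 210),
    ("http://example.com/", 60),
]
print(group_by_directory(rows))
# {'/services/': 125, '/blog/': 210, '/': 60}
```

This is exactly why a sensible directory structure matters: if related pages share a common URL prefix, grouping landing pages becomes a one-liner instead of a manual tagging exercise.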
What will need to change is our reliance on specific pieces of data. I know I like to see that phrase A produced result X and work to improve on that. We'll now have to look at larger groupings of data. A downside to this (and Google will have to address it, or we as SEOs will) is that it's going to be a lot easier to mask bad practices, since specific phrase data won't be available. I know, for example, that in an audit I was part of, we found bot traffic partly based on common phrase elements. Today we wouldn't be able to do this, and the violations would continue.
We’re All Still Learning
Over the next couple of months we'll all be adjusting our reporting practices to accommodate this change. I expect some innovative techniques will be developed to report as accurately as possible which traffic is doing what. I'll be staying on top of it, and we'll keep you posted here on our blog and on our Facebook page.
SEO news blog post by Dave Davies, CEO @ 10:48 am on September 26, 2013