

Facebook’s Contest Rules: NOW They Change Things!

This summer I put together a Facebook contest for a client. Until last week, the site’s rules were explicitly clear: absolutely no promotion-related content could be administered within Facebook itself. If you wanted to run a promotion, you had to build it through a third-party app developer and host it as a tab on your page. Users could not enter by commenting on a post; likes could not count as votes. While my contest was a fantastic learning experience, the actual process—researching what Facebook would and wouldn’t allow, vetting third-party developers, trying to design and program the tab itself—was complicated and sometimes frustrating.

Facebook has now revamped its contest guidelines. The biggest change is the removal of the third-party administration requirement; it’s a single alteration, but it has massive ramifications for how businesses conduct themselves and interact with their fans. A comment, post, or like can now function as an entry or a vote. While third-party apps can still be used for larger campaigns, the change makes a quick giveaway or draw much simpler—as easy as posting an update and asking for comments. This is obviously a big plus for page owners; fans are more likely to enter a giveaway when all they have to do is comment or like. Hosting a promotion also becomes a great deal cheaper; while contests can be real business-builders, app developers often charge a subscription fee for their service and may only offer a bare-bones free option, if any.

So the changes are a good thing for small businesses and pages looking to increase their traffic with giveaways and contests. Facebook still encourages the use of apps for larger and more personalized experiences; it also forbids pages from asking users to take part in promotions by liking or posting something to their own personal Timelines. And if I’d done this client contest just a few months later, I could very possibly have pulled it off more quickly than I did (though I would have missed the opportunity to become truly acquainted with Photoshop).

That said, there are some legal ramifications of this change that will be interesting to follow as the new rules go into practice. For one, entry management may become a great deal more difficult; while the apps are very good at tracking exactly who enters a contest and what they must do, it could easily become a hassle to verify that each entry is legitimate when you’re just asking people to like a post. The change can also run up against official location rules; if a giveaway is tailored to the sweepstakes rules of the United States and the winner is in Britain, the winner’s legal claim to the prize—and the legality of their participation in the first place—may not be simple.

With apps, page owners must make clear exactly what counts as an entry and how the winners will be chosen. The US has very strict rules which dictate that all entries into a sweepstakes or draw must have an equal chance of winning. But if users can enter through a variety of actions, it can be difficult to track them; it also removes Facebook’s careful denial of liability, which had been so prominent in the earlier rules.

I’m interested to see how these rule changes will work out in the long run. While they’ll make things much easier for a lot of businesses, I can see many ways things could go wrong, and the results remain to be seen. Until then, you can enter that draw for free wings without worry. Go forth and like to your heart’s content.

SEO news blog post by @ 11:58 am on September 4, 2013


Think About It

On Tuesday, August 27th, 2013, the Syrian Electronic Army (pro-government hackers) shut down The New York Times and compromised Twitter. By deceptive means they managed to gain access using an ID and password stolen from a “reseller” account. I won’t give the SEA credit by allowing them the word “hack,” a term I reserve for peeps who employ movie-like techniques with code and lasers to break in. This was a classic case of old-school deception, and it could have been prevented if the weak link had been smart enough to notice the scam. This wasn’t some high-tech offense; this was a schoolyard technique for stealing another kid’s Fruit Roll-Up. In fact, a large percentage of today’s digital break-and-enters are done in this same way. Our vulnerability is based on how educated we are about deceptive motives.

The most important tip for preventing vulnerability is to play smart. Never wear your innocence on your sleeve, and get into the habit of second-guessing information. It’s the internet: a worldwide connection at the touch of a keyboard. If you searched for a cold cure and a page told you to jump off a cliff, I’m sure you would second-guess that direction. Likewise, when you receive an e-mail from Grandpa Joe with a subject line that just says “Hi” and a bare link, it’s more likely a malicious gift waiting to get its hands on your information. Second-guess it: contact Grandpa and find out if he actually sent it. I know it’s a simple example, but just that much caution could save you from trouble. Take the time to educate yourself and become aware of these simple deceptive tactics; it will prevent future loss.
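The link-checking habit above can even be sketched in code. The following is an illustrative sketch only (the regex heuristic and the example URLs are assumptions, not a real anti-phishing tool): it compares the domain shown in a link’s visible text against the domain the link actually points to.

```javascript
// Illustrative sketch only: flag links whose visible text shows one
// domain while the href actually points somewhere else entirely.
// The simple regex heuristic here is an assumption for demonstration.
function looksDeceptive(displayText, href) {
  const shown = displayText.toLowerCase().match(/([a-z0-9.-]+\.[a-z]{2,})/);
  if (!shown) return false;               // no domain visible in the text
  const actual = new URL(href).hostname;  // where the link really goes
  return !actual.endsWith(shown[1]);
}

// "Grandpa's" e-mail shows a familiar domain but links elsewhere:
console.log(looksDeceptive('Hi! See www.paypal.com', 'http://phish.example.net/x')); // true
console.log(looksDeceptive('Visit example.com', 'https://www.example.com/'));        // false
```

A real mail client or scanner does far more than this, of course, but the principle is the same one described above: second-guess what the link claims before you trust where it goes.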

There are obviously many techniques for compromising your digital security, far more advanced and complex than what I’ve described here. But at the end of the day, we have to remain vigilant about protecting our information, take matters into our own hands, and think twice about how we approach the internet as well as how it approaches us.

SEO news blog post by @ 10:26 am on August 30, 2013



Link Reduction for Nerds

Let’s face it, even with our best efforts to make navigation clear and accessible, many websites are not as easy to navigate as they could be.

It doesn’t matter if you are a first-page superstar or a mom-and-pop blog with low traffic; most efforts are really no match for the diversity of our visitors.

When I first started blogging on SEO topics for Beanstalk, I put a lot of effort into making my posts as accessible as I could, with a bunch of different tricks like <acronym> tags (now <abbr> tags) and hyperlinks to any content that could be explored further.

Like a good SEO I added the rel="nofollow" to any external links, because that totally fixes all problems, right?

“No.. Not really.”

External links, when they are actually relevant to your topic and point to a trusted resource, should not be marked as nofollow. This is especially true for discussions or dynamic resources, where you could be referencing a page that was recently updated with information on your topic; in that case you ‘need’ the crawlers to see that the remote page is relevant now.

Internal links are also a concern when they become redundant or excessive. If all your pages link to all your pages, you’re going to have a bad time.

If you went to a big new building downtown and asked the person at the visitors’ desk for directions, and the fellow stopped every few words to explain what he meant by each word, you might never get through the directions, at least not before you were late for whatever destination you had.

Crawlers, even smart ones like Googlebot, don’t really appreciate 12 different URLs on one page that all go to the same place. It’s a waste of resources to keep feeding the same URL to the spiders as a bot crawls each of your pages.

In fact, in some cases, if your pages have tons of repeated links to more pages with the same internal link structures, all the bots will see are the same few pages/URLs until they take the time to push past the repeated links and get deeper into your site.

The boy who cried wolf.

The boy who cried wolf would probably be jumping up and down with another analogy, if the wolves hadn’t eaten him, just as your competition will gladly eat your position in the SERPs if your site is sending the crawlers to all the same pages.
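One practical way to see the problem is to count how often each href target appears on a page. Here’s a rough audit sketch (regex-based purely for brevity; a real audit would use a proper HTML parser, and the sample markup below is made up):

```javascript
// Rough audit sketch: count how many times each href target appears
// in a page's HTML. Regex "parsing" is a shortcut for illustration only.
function countLinkTargets(html) {
  const counts = {};
  const re = /<a\s[^>]*href=["']([^"']+)["']/gi;
  let match;
  while ((match = re.exec(html)) !== null) {
    counts[match[1]] = (counts[match[1]] || 0) + 1;
  }
  return counts;
}

const page =
  '<a href="/articles/">Articles</a>' +
  '<a href="/articles/">More articles</a>' +
  '<a href="/contact/">Contact</a>';
console.log(countLinkTargets(page)); // { '/articles/': 2, '/contact/': 1 }
```

Run something like this over your templates and any target with a suspiciously high count is a candidate for the link-reduction tricks discussed below.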

Dave Davies has actually spoken about this many times, both on our blog, and on Search Engine Watch: Internal Linking to Promote Keyword Clusters.

“You really only NEED 1 link per page.”

Technically, you don’t actually need any links on your pages; you could just use Javascript that sets window.location when desired and your pages would still work. But how would the robots get around without a sitemap? How would they understand which pages connect to which? Madness!

But don’t toss Javascript out the window just yet, there’s a middle ground where everyone can win!

If you use Javascript to send clicks to actual links on the page, you can mark up more elements of your page without making a spaghetti mess of your navigation and without sending crawlers on repeated visits to duplicate URLs.

“In fact jQuery can do most of the work for you!”

Say I wanted to suggest you look at our Articles section, because we have so many articles in there, but I didn’t want our articles page linked yet again?

Just tell jQuery to find the first matching anchor (the /articles/ path below assumes that’s where the page lives):

$('a[href="/articles/"]').first()

Then tell it to give that anchor an ID:

.attr( 'id', 'articles-link' );

And then tell any extra clickable element to send its clicks to that one real link (the .articles-teaser class is just an example name):

$('.articles-teaser').on( 'click', function() { $('#articles-link')[0].click(); } );

Finally, make sure that your extra elements are styled to match the site’s style for real hyperlinks (ie: cursor: pointer; text-decoration: underline;).

UPDATE: For Chrome browsers you need to either refresh the page or send the following header with your page (e.g. from PHP): header("X-XSS-Protection: 0");

SEO news blog post by @ 6:07 pm on August 28, 2013



Take that Google

Yahoo! made it into the number one spot this past July for unique visitors, according to comScore. It won’t stay that way, but it makes us reflect and ask what they did to gain this kind of interest. Of course, Marissa Mayer will have to get credit for taking this position, placing the big Goog (Google) behind them. Tumblr was a big $1.1 billion cash acquisition for Yahoo!, and maybe that had something to do with it. It’s an obvious guess; however, Yahoo! and Tumblr are still being ranked as two different entities, so this fluctuation could just be seasonal movement. Andrew Lipsman, comScore’s vice president of industry analysis, stated, “Tumblr did not contribute to Yahoo!’s visitor tally.” Before anyone gets too carried away, both search giants’ numbers are large, but the gap between them is clearly only marginal. Not enough to knock our socks off. This sort of ranking would have to hold up over a longer trend before we should even think of spending more effort on the topic.

The Need for Speed

Even though Google hasn’t been clear about whether page speed matters for rankings, I speculate that it does. Forget Google deciding how you would want to view a page: I have a very short attention span, and if a page takes forever to load on mobile or PC, I trash it and probably won’t go back to the site. It’s simple, not too complicated, and we don’t need Google to tell us whether it takes page load speed into consideration for ranking. I think the same notion goes for many other people who coast the web. We want information fast, simple, and uncluttered, maybe with a little visual bling to keep us around a little longer. Nonetheless, I’m just stating the obvious: if your page is slow, nobody has time for it, so count your loss.

Who’s Watching Your Back?

On yesterday’s Webmaster Radio show, Webcology, they mentioned SEMPO, the “Search Engine Marketing Professional Organization,” and asked whether it needs to exist on today’s internet. My opinion is no. I think Jim Hedger brought up a good point when he said, “Back when everyone needed them to do something, they didn’t, and they aren’t showing any signs of doing so today, so why bother?” I don’t think anyone can direct the internet and how things are played, or even defend the needs of online marketers, so again, in my opinion, forget about it. If it’s education you’re after, sure, SEMPO might be a great resource, but there are many other places to find education and networking. That’s my two cents.

A video to entertain you until Google makes an official statement about page speed.

SEO news blog post by @ 1:48 pm on August 23, 2013



The Sci-Fi Reality of Google’s Pay-Per-Gaze Patent

Steven Spielberg’s 2002 film Minority Report takes place in Washington, DC, in the year 2054. It centers around a police officer (Tom Cruise) who is the head of the PreCrime police force, which uses precognitive visions to prevent murders before they take place. When Cruise’s character is predicted to commit murder, he is forced to go on the run and try to clear his name. The film garnered praise not only for its action-packed plot, but also for its uniquely plausible vision of the future of American life. One of the most memorable—and plausible—aspects of the setting was the way retinal scanners were used to track citizens at all times. But the technology wasn’t only for identification purposes; it was also used by electronic billboards in public areas, which would deliver direct advertisements to passersby. In fact, the constant identification forces Cruise’s character to undergo a black market eye replacement so that he can move in public without being called out by name and tipping off the authorities.

Spielberg received praise for Minority Report‘s examination of privacy invasion and the consequences of having personal information used for commercial gains; it was a unique spin on the conventional Orwellian surveillance scenario that was grounded in the established advertising industry’s continual efforts to maximize their advertisement ROI. According to Jeff Boortz, who oversaw the product placement in the film, the billboards would “recognize you—not only recognize you, but recognize your state of mind.”

Last week, tech blogs reported that back in 2011, Google patented a Gaze Tracking System for a head-mounted device that—in 2013—sounds an awful lot like Google Glass. The technology (found here) monitors eye movements to track what a user is looking at, and can even sense emotional responses via pupil dilation. The technology is proposed to have several useful applications, but one of the most pertinent for Google is a “pay-per-gaze” advertising feature. According to the patent, the system could charge advertisers based solely on whether a user actually looked at their ad—not just for online advertisements, but also for billboards, newspapers, and other commercials. The idea is similar to the existing pay-per-click model used on Google search results, except it would apply to everything you viewed while walking to work on a Monday morning.

The patent was filed two years ago, but only became public in mid-August, and it sounds remarkably similar to the constant surveillance in Minority Report—where your personal information is most highly valued for its ability to direct efficient advertisements your way. To companies, it’s a dream come true; rather than trying to guess how to appeal to a large demographic, they could target individuals who are far more likely to buy the product. The ratio of advertising cost to return on investment could shrink immensely. There are even benefits for the user, who would only see relevant ads and wouldn’t have to suffer through annoying ones they’d normally ignore. But it’s also not surprising that some have voiced concerns over being constantly tracked like this; it’s enough to give any privacy expert nightmares, and it’s not difficult to envision how the pay-per-gaze system could be used against you. While a set of removable glasses is far less invasive than the retinal scanners in Minority Report, and it’s unlikely that a fugitive on the run would don the specs, it’s still not impossible to imagine a scenario where a private matter is made public by advertisers because of what you’ve looked at recently.

To their credit, Google has anticipated the possible backlash; the patent details options to anonymize data and to opt out of what information is gathered and collected. Furthermore, as has been pointed out, a patent does not necessarily guarantee a product will be developed. But that said, Google Glass already exists, and its use in commercial advertising ventures has yet to be seen. Time will tell if this technology will end up integrated into the glasses, and whether we as a society will be willing to sacrifice a large amount of our privacy for the convenience of personalized advertisements.

SEO news blog post by @ 3:45 pm on August 20, 2013


SEO concerns for Mobile Websites

You want to serve your clients’ needs regardless of what device they visit your site with, but how do you do it easily without upsetting your SEO?

Let’s look at the various options for tackling mobile sites and what each means in terms of SEO:

Responsive Design :
Visual demonstration of responsive web design

  • Responsive design is growing in popularity, especially as communications technology evolves, and bandwidth/memory use is less of a concern.
  • This method also gives us a single URL to work with which helps to keep the sitemap/structure as simple as possible without redirection nightmares.
  • On top of that, Googlebot won’t need to visit multiple URLs to index your content updates.
  • Less to crawl means Googlebot will have a better chance to index more of your pages/get deeper inside your site.
“Why is/was there a concern about mobile page size?”

Low-end mobiles, like a Nokia C6 from 4+ years ago (which was still an offering from major telcos last year), typically require that total page data be less than 1 MB in order for the phone to handle the memory needs of rendering/displaying the site.

If you go over that memory limit/tipping point, you risk crashing the browser with an error that the device memory has been exceeded. Re-loading the browser drops you on the device’s default home page with all your history lost. I think we can all agree that this is not a good remote experience for potential clients.

Higher-end devices are still victims of their real-world connectivity. Most 3rd-generation devices can hit really nice peak speeds, but rarely stay in a physical location where those speeds are consistent for a reasonable length of time.

Therefore, even with the latest gee-whiz handsets, your ratio of successfully delivering your entire page to mobile users will be impacted by the amount of data you require them to fetch.
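That 1 MB ceiling can be treated as a simple page-weight budget. A minimal sketch, with made-up resource sizes:

```javascript
// Sketch: total up a page's resource sizes and compare against a
// low-end device budget (~1 MB, per the limit discussed above).
function checkPageWeight(resourceBytes, budgetBytes = 1024 * 1024) {
  const total = resourceBytes.reduce((sum, size) => sum + size, 0);
  return { total, withinBudget: total <= budgetBytes };
}

// Hypothetical page: 90 KB HTML, 60 KB CSS, 700 KB of images, 300 KB JS.
const result = checkPageWeight([90, 60, 700, 300].map(kb => kb * 1024));
console.log(result.withinBudget); // false: 1150 KB exceeds the 1024 KB budget
```

The numbers are illustrative, but the exercise is worth doing against your real templates: one oversized hero image can blow the whole budget on its own.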

In a responsive web design scenario the main HTML content is typically sent along with CSS markup that caters to the layout/screen limitations of a mobile web browser. While this can mean omission of image data and other resources, many sites simply attempt to ‘resize’ and ‘rearrange’ the content leading to very similar bandwidth/memory needs for mobile sites using responsive design approaches.

The SEO concern with responsive designs is that, since the written HTML content is included in the mobile styling, it’s crucial that external search engines/crawlers understand the mobile-styled content is not cloaking or another black-hat technique. Google does a great job of detecting this, and we discuss how a bit later on, with some links to Google’s own pages on the topic.

Mobile Pages :

Visual demonstration of mobile web page design

If you’ve ever visited ‘’ or something like that, you’ve already seen what mobile versions of a site can look like. Typically these versions skip reformatting the main site content and they get right down to the business of catering to the unique needs of mobile visitors.

Not only can it be a LOT easier to build a mobile version of your site/pages, you can expect these versions to have more features and be more compatible with a wider range of devices.

Tools like jQuery Mobile will have you making pages in a jiffy using modern techniques/HTML5. It’s so easy you could even make a demo image purely for the sake of a blog post! ;)

This also frees up your main site design so you can make changes without worrying what impact it has on mobile.

“What about my content?”

Excellent question!

Mobile versions of sites with lots of useful content (AKA: great websites) can feel like a major hurdle to tackle, but in most cases there are awesome solutions for making your content work with mobile versions.

The last thing you’d want to do is block content from mobile visitors, and Google’s ranking algorithm updates in June 2013 agree.

Even something as simple as a faulty redirect, where your mobile site serves up its mobile homepage when the visitor actually requested a specific desktop page, is a really bad situation, and in Google’s own words:

“If the content doesn’t exist in a smartphone-friendly format, showing the desktop content is better than redirecting to an irrelevant page.”
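The faulty-redirect pattern can be sketched as a pure mapping check. The ‘m.’ subdomain scheme below is an assumption for illustration; the point is simply that each desktop URL should map to its own mobile equivalent, not to the mobile homepage:

```javascript
// Sketch: given a desktop URL, compute the mobile URL it *should*
// redirect to, and flag redirects that land somewhere else (typically
// the mobile homepage). The m.-subdomain scheme is an assumed example.
function expectedMobileUrl(desktopUrl) {
  const u = new URL(desktopUrl);
  return 'http://m.' + u.hostname.replace(/^www\./, '') + u.pathname;
}

function isFaultyRedirect(desktopUrl, actualMobileUrl) {
  return actualMobileUrl !== expectedMobileUrl(desktopUrl);
}

console.log(expectedMobileUrl('http://www.example.com/page-1')); // http://m.example.com/page-1
console.log(isFaultyRedirect('http://www.example.com/page-1',
                             'http://m.example.com/'));          // true: sent to the homepage
```

Feeding a crawl of your desktop URLs and their observed mobile redirect targets through a check like this is a quick way to spot the exact situation Google warns about above.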

You might think the solution to ‘light content’ or ‘duplicate content’ in mobile versions is to block crawlers from indexing the mobile versions of a page, but you’d be a bit off the mark: you actually want to make sure crawlers know you have mobile versions to evaluate and rank.

In fact if you hop on over to Google Analytics, you will see that Google is tracking how well your site is doing for mobile, desktop, and tablet visitors:
Example of Google Analytics for a site with mobile SEO issues.

(Nearly double the bounce rate for Mobile? Low page counts/duration as well!?)

Google Analytics will show you even more details, so if you want to know how well you do on Android vs. BlackBerry, they can tell you.

“How do the crawlers/search engines sort it out?”

A canonical URL is always a good idea, but using a canonical between a mobile page and the desktop version just makes sense.

A canonical can cancel out any fears of showing duplicate content and help the crawlers understand the relationship between your URLs with just one line of markup.

On the flip side, a rel=”alternate” link in the desktop version of the page will help ensure the connection between them is understood completely.

The following is straight from the Google Developers help docs:

On the desktop page, add:

<link rel="alternate" media="only screen and (max-width: 640px)" href="" >

and on the mobile page, the required annotation should be:

<link rel="canonical" href="" >

This rel=”canonical” tag on the mobile URL pointing to the desktop page is required.
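If you generate these annotations programmatically, a small helper keeps the pair consistent. This is only a sketch following the Google Developers example above; the example.com URLs are placeholders:

```javascript
// Sketch: build the matched rel="alternate" / rel="canonical" pair
// from the Google Developers example for a desktop/mobile URL pair.
// The example.com URLs used below are placeholders.
function mobileAnnotations(desktopUrl, mobileUrl, maxWidth = 640) {
  return {
    // goes in the <head> of the desktop page:
    desktop: '<link rel="alternate" media="only screen and (max-width: ' +
             maxWidth + 'px)" href="' + mobileUrl + '" >',
    // required annotation in the <head> of the mobile page:
    mobile: '<link rel="canonical" href="' + desktopUrl + '" >'
  };
}

const tags = mobileAnnotations('http://www.example.com/page-1',
                               'http://m.example.com/page-1');
console.log(tags.mobile); // <link rel="canonical" href="http://www.example.com/page-1" >
```

Generating both tags from the same URL pair makes it much harder to ship a mismatched alternate/canonical pair across a large site.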

Even with responsive design, Googlebot is pretty smart, and if you aren’t blocking access to resources intended for a mobile browser, Google can/should detect responsive design from the content itself.

Google’s own help pages confirm this and provide the following example of responsive CSS markup:

    @media only screen and (max-width: 640px) {...}

In this example they are showing us a CSS rule that applies when the screen’s max-width is 640px: a clear sign that the rule would apply to a mobile device vs. a desktop.

Google Webmaster Central takes the information even further, providing tips and examples for implementing responsive design.

Ever wondered how to control what happens when a mobile device rotates and the screen width changes? Click the link above. :)
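As a small preview of what that page covers: orientation can be targeted the same way as width in a media query. A hedged sketch (the breakpoint simply mirrors Google’s 640px example above):

```css
/* Illustrative only: rules applied when a mobile-width device is
   rotated into landscape; the 640px breakpoint mirrors the example above. */
@media only screen and (max-width: 640px) and (orientation: landscape) {
  /* landscape-specific layout tweaks go here */
}
```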

SEO news blog post by @ 3:51 pm on August 16, 2013


The Elephant Beneath My Feet

Using social media to generate strategy and effective business plays a mighty role in our digital world. Many clients are concerned with how the big brand names are stuffing smaller labels under their feet by pushing them to the next page on Google. We should recognize that social media is here for a reason and can be harnessed to drive sales and traffic to the smaller label. The benefits of harnessing social media range from generating long-lasting impressions to creating community and relationships to connecting with clients. As Dave Davies has mentioned, at the end of the day it’s about creating traffic and making that decent paycheck.

How do I generate long lasting impressions with social media?

In early 2010, Blendtec, one of the leading blender companies, used shock marketing to create long-lasting impressions. The “Will It Blend?” YouTube series took everything from an iPad to an iPhone and blended it up. Owning an iPad at the time was a must, but throwing it in the blender was inconceivable. Why would anybody do that? Blendtec did, and it clearly showed the machine could pulverize. It also created an impression that was virally shared, counted beyond the millions, and brought home economic growth. This type of campaigning has been around for more than a century; think of how Edison electrocuted an elephant as a propaganda campaign against Tesla. We all know how that worked out for Edison. The truth is, we don’t have to go as far as electrocuting an elephant, but we can use this technique to conquer the elephant (box stores) with an internet impression.

When a business enters the digital world, it doesn’t mean you toss out old-school networking ideals. When a brick-and-mortar business opens its doors for the first time, it focuses on its community and begins to build relationships within it. This connects the store to the community and establishes credibility for its products and service. If a small business is well accepted within its own community, people begin to speak highly of it. Word of mouth can have a positive impact or a negative one, and it can happen fast. When a website connects with its clients using social media, it gains credibility, and if a client is happy, it’s easy enough for them to hit that share button. This kind of word of mouth happens at a much faster pace, with an even larger number of people reached. You can now take hundreds of individuals and give each the feeling of being personally catered to. All of this benefits the small brand by giving it a voice in its community and helping it avoid being crushed by the popular giant.

Big corporate companies don’t have to wipe out the local mom-and-pop store, Penguin and Panda notwithstanding. It’s about playing smart and using the same techniques that were used in the days before the internet. Be personable when creating a community and relationships, but never fear going outside your comfort zone to create a long-lasting impression. Develop that traffic, spike that interest, and make that decent check at the end of the day. Soon enough your presence will strengthen, and you will hold that elephant beneath your foot.

SEO news blog post by @ 2:00 pm on


Got Bad Links To Your Site? Don’t Fret It, Disavow It!


This week in the Webcology universe we boogled with Google’s latest WMT video, hacked into DEF CON, scratched our heads at Facebook, and congratulated the Ninjas on 15 years of Cre8asite Forums.


We love Matt Cutts and he really makes us smile, but when he nonchalantly said, in simple form, that if you have bad links you should just disavow them… well, Matt, we wish it were that simple and that the process could be done with the snap of a finger. In reality, it’s left to us to explain the steps to our clients. Oversimplifying a disavow makes it misleading and creates misunderstanding about what actually needs to be done. Clients are number one, but when their backlink history is wrought with unnatural links and black-hat sleight of hand, cleanup can take time as well as a little money. So, in response to the latest Webmaster Tools video: please make sure you point out more than just the color of the shoes.

If there were such a thing as a robotic shoe, we know for a fact there are people out there who could hack into it. DEF CON recently took place in Las Vegas, bringing the most intelligent minds together and leaving attendees feeling more vulnerable than they ever thought they were. “Thus being the perfect place for an SEO,” says Kristine Schachinger, an internet guru who puts the best Jedi to shame. Knowing what we don’t know isn’t scary; in fact, it can strengthen our ability to better direct clients and keep us aware of new tactics that can maliciously destroy a site. Among the amazing technology Kristine found was a simple scan that can search for vulnerabilities and simultaneously upload disabling viruses, all within thirty seconds or less. This conference is not for the faint of heart and shouldn’t be taken lightly, but without this type of gathering we couldn’t prepare stronger strategies in the SEO industry.


Facebook, You Love Changing and We Hate Not Knowing

What’s with all these Facebook changes, and do we care? “Yes we do,” says Firestarter Social Media CEO Michelle Stinson Ross. Facebook is adapting to its 1.11 billion users and looking at innovative ways to make the platform work economically. Does it make sense to have older posts show up later on a timeline? From a promotional point of view, it does: it generates more of a need to craft posts that drive interest, likes, and shares. Facebook is trying to stay interesting while the meaning of social posting is being misinterpreted by online businesses. A social platform exists to stir up conversation and create communication between users, but that has been getting snuffed out as businesses take the less creative route and prevent stoked conversation. Facebook is now allowing popular posts to be recognized, herding creativity back onto the timeline. And with Facebook announcing future implementations on their blog, we can all better prepare our marketing direction for our clients.


Cre8asite Forums has long been a hangout for many of the web industry’s leaders. Their 15-year anniversary is a milestone dedicated to connecting professionals around the world so they can share and educate each other about the fast-paced and often crazy world wide web. It’s great to know that throughout the years there has been a place on the web that offers networking for famous experts and beginners alike. Congratulations on an amazing fifteen years.

At the end of the day, this matrix we call the internet is changing while algorithms fluctuate and disappear. A new era in marketing is evolving, creating a new today for social media and SEO; without this change, the endless space we once took for granted would be rendered useless. Recalling what Kristine Schachinger said, that “Knowing what we don’t know isn’t scary, but, in fact, it can strengthen our ability to better direct clients and make a better future,” let’s take this and move forward to a better future of the internet and respect the change that is happening at this moment.


SEO news blog post by @ 2:54 pm on August 9, 2013



Twitter’s New Anti-Abuse Policies and the Dark Side of Social Media

I won’t lie when I say that one of the best parts of my job is managing social media accounts; it can be legitimately fun, but it’s also a very important illustration of how the Internet affects customer/business interactions. My experience mostly comes from being a voracious and active social media user in my private life; I enjoy a following of 400+ people on Twitter, and I have seen what the network is capable of: live-blogging the Vancouver Olympic opening ceremonies, catching cheating politicians in the act, and spreading the word of everything from hot TV shows to full-blown revolutions. While some might resist it, social media is vital for modern reputation management and customer service; the web has democratized marketing in a very drastic way, making it nearly impossible for a company to cover up substantial issues with their products or service. When you do a great job, you might get the occasional positive mention; when you mess up, your customers will definitely air their grievances. And as a social media user myself, I can vouch for the fact that the public has come to respect businesses that address these issues honestly when they’re contacted about them.

Unfortunately, this democratization has led to some inevitable abuses of the system. In some cases it's a rival company posting fake reviews in an attempt to discredit the competition; in others, a company (or person) may be the subject of a vicious complaint that goes viral online. Part of online reputation management is being able to mitigate these issues, whether by reporting abuse to site moderators or addressing complaints head-on.

I say all of this because some business owners on desktop and Android platforms may see a new feature on Twitter in the coming weeks: an in-tweet ‘Report Abuse’ button. Currently, users who wish to flag threats must visit the online help center and go through several extra steps to report abuse; the new button will make the process far quicker, and (hopefully) hasten the removal of hate speech. Twitter’s announcement wasn’t just a routine update; it was spurred largely by a British woman named Caroline Criado-Perez, and the flood of horrific rape, violence, and bomb threats she received over the weekend. These weren’t mere trolls; the abuse got so serious that at least one man was arrested on Sunday as a result. What did Criado-Perez do to warrant hundreds of 140-character threats of violence? She campaigned—successfully—for the British government to put author Jane Austen’s face on the new £10 banknote. The threats were also sent to a female Member of Parliament who tweeted her support for the campaign.

If it seems absurd, that’s because it is; this wasn’t a case of radical politics or controversial opinion, but a fairly tame move to represent more British women on currency. The horrifying result was a stark reminder of the abusive power of social media, especially against women and other marginalized groups in society. But even if you’re not an active participant in social issues online, it’s intimidating to realize just how quickly the anonymous web can turn against you. While some have applauded Twitter for finally taking a decisive action to make their website safer for all users, the decision has also drawn criticism from people who have seen how ‘Report Abuse’ functions on other websites have actually been used against legitimate accounts as a form of abuse in and of itself; a group of trolls flagging an account they disagree with can result in that account being suspended by the website, even when the owner hasn’t actually violated any rules.

Of course, the gender politics and personal vendettas of social media are quite a bit more intense than what we do as SEOs to help clients. In terms of reputation management online, the Report Abuse button will likely be a helpful way to ensure that a company doesn’t suffer from malicious treatment. However, it also may be far too easy to report a dissatisfied (and vocal) customer out of sheer frustration. Online reputation is a fickle beast; a few damning reviews can take down an entire small business, and the damage can be very difficult to control—it’s easy to feel helpless when it seems like nothing you do can push down a few dissatisfied customers in favor of the happy ones. Business owners on Twitter should still make it a priority to engage with unhappy customers on a personal level, rather than just report an account because of a particularly bad review—even if it makes the problem temporarily disappear, the Internet is not kind to those types of tactics.

The Criado-Perez debacle over the weekend has shown Twitter’s dark side, particularly when it comes to misogyny and online gender violence. The effect of the new reporting feature remains to be seen in that regard. While smaller businesses on social media may not engage in that debate, it’s a prudent reminder that the web’s anonymity can cause a lot of malicious action in the name of free speech. Reputation management isn’t going to get easier as a result of Twitter’s changes; it will still require a human touch and an honest connection, because that’s what garners respect in the social media sphere. But hopefully this small corner of the web will be a little safer for everyone who uses it, giving people more courage to speak their minds without fear of retaliatory attempts to forcibly silence them.

SEO news blog post by @ 3:14 pm on August 6, 2013


A Panda Attack

Google today confirmed that a Panda update is rolling out. I find it odd that, after telling webmasters there would no longer be announcements of Panda updates, they made this announcement; one has to wonder why.

The official message from Google is that this Panda update is softer than previous ones and that new signals have been added. Some webmasters are reporting recoveries from earlier updates with this one. I would love to hear feedback from any of our blog readers about changes you may have noticed in your rankings with this latest update.

I’ll publish a followup post to this one next week after we’ve had a chance to evaluate the update.

SEO news blog post by @ 10:42 am on July 18, 2013


Copyright© 2004-2014
Beanstalk Search Engine Optimization, Inc.
All rights reserved.