Local Interactive Strategies

Content farms and aggressive SEO

So in a blog post today, Matt Cutts, Principal Engineer at Google, says Google is cracking down on content farms whose sole purpose is getting high ranks in search results, thus generating clicks, and thus click-throughs on ads.

We hear the feedback from the web loud and clear: people are asking for even stronger action on content farms and sites that consist primarily of spammy or low-quality content. We take pride in Google search and strive to make each and every search perfect.

Makes sense to me. Google’s only asset is the quality of its search results. As long as Google delivers me search results that answer my questions, I’ll come back. Why would I bother trying some other search engine?

Google has always seemed to be working hard to anticipate what I’m looking for, versus what someone’s trying to shove in front of me.

For instance, I just now Googled “restaurant” and, yeah, OK, there’s the Wikipedia entry for “restaurant,” but most of the page displays restaurants around my current location. Google understands that when I search “restaurant,” I’m probably not asking how to start a restaurant, or what a restaurant is, or who makes the best restaurant equipment.

Point being, Google’s priority is delivering results that the user values – that’s basically Google’s definition of “quality.” If anything starts diluting that quality, according to Google, they’ll push back. And “webspam” is diluting the quality of search results. Cutts writes:

“Just as a reminder, webspam is junk you see in search results when websites try to cheat their way into higher positions in search results…”

The obvious implication is that these content farms could have a brief lifespan. They exist only because of Google’s search algorithms, and Google can change those algorithms anytime.

The second implication is that Google is paying attention to tactics that drive results higher than their “natural” ranking. That’s called “search engine optimization” – but it’s a very broad topic. At one extreme, it simply means avoiding mistakes that will keep your site unnaturally low in Google rankings: failing to assign a unique URL to every page, failing to give each page a unique title, failing to put relevant terms in titles and headlines.
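That tame end of SEO is mechanical enough to audit with a script. As a rough sketch – the page data below is hypothetical, and this is just one way to flag the duplicate-title mistake, not any tool Google or anyone else ships – you could collect each page’s `<title>` and report any title shared by more than one URL:

```python
from html.parser import HTMLParser
from collections import defaultdict

class TitleParser(HTMLParser):
    """Collect the text inside a page's <title> tag."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False
    def handle_data(self, data):
        if self.in_title:
            self.title += data

def find_duplicate_titles(pages):
    """pages: dict mapping URL -> raw HTML.
    Returns {title: [urls]} for titles used by more than one URL."""
    by_title = defaultdict(list)
    for url, html in pages.items():
        parser = TitleParser()
        parser.feed(html)
        by_title[parser.title.strip()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

# Hypothetical site: two pages share one generic title, one is unique.
pages = {
    "/listings/portland": "<html><head><title>Maine Real Estate</title></head></html>",
    "/listings/bangor":   "<html><head><title>Maine Real Estate</title></head></html>",
    "/listings/camden":   "<html><head><title>Camden, ME Homes for Sale</title></head></html>",
}
print(find_duplicate_titles(pages))
```

In a real audit you’d feed this fetched HTML from a crawl rather than inline strings, but the check itself – group URLs by title, flag any group larger than one – stays the same.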

At the other extreme is what I consider aggressive SEO: figuring out some secret sauce that gets your site into a high rank that makes no sense to the casual observer. Try this. Google “Maine real estate.” I offer this example because I’m familiar with the architecture of the sites under the MaineToday brand (where I worked until 2008). You’ll see those sites in the first page of Google results (numbers 3 and 4 – sweet!). They have home listings from all over Maine, so logically, they should be ranked high on a search for “Maine real estate.” But check out the other sites on your first page of Google results. On this day, I’m seeing three sites that have only a few listings for a couple of very small areas of Maine, and another that’s nothing but a set of links to other sites. So out of 10 search results on the first page, 40 percent are not very useful for someone generally looking to buy a home in Maine.

Under Google’s definition of “webspam,” and Google’s goal of “perfect” search results, that’s a 40 percent error rate.

So the second big implication of Google’s crackdown on “webspam” is this: Does aggressive SEO have a future?


January 21, 2011 - Posted by | Uncategorized

1 Comment »

  1. Does aggressive SEO have a future? Of course it does, so long as people continue to search for things on the web using search engines.

    Indeed, I’ve heard it said that if you take a month or two off from running a web business, the whole world will have changed when you come back.

    Well in this case, Deb and I have just taken the past four years off (but who’s counting?) since selling … and as I start to drop back into the workaday world, it seems to me as if Matt Cutts is still involved in the same whack-a-mole work that he was involved with four and five and six years ago.

    Back when we ran Old House Web, search engines in general delivered about 80% of the 20,000 or so people who wandered through our site on any given day — and Google delivered about 80% of that. And so we paid close attention to what Matt and his team liked and disliked … what they considered quality content and what they considered spammy.

    Ironically, the quotes you include in your column from Matt could have come out of his mouth (or off his fingers) in 2004 or 2006 or on any day since he joined Google.

    So, does aggressive SEO have a future? Of course it does. The techniques change. The nuances shift. But as long as Google and its competitors continue to drive huge volumes of traffic, there will be spammers who know how to push sites to the boundaries — and who will discover new ways to game the system. And there will be whitehat SEO consultants who know how to advise clients how to be aggressive in presenting their sites to the search engines in the best possible light.

    Comment by Ken Holmes | February 11, 2011 | Reply
