
Can Google Truly Make Quality Content King of The Web?

For those webmasters and e-marketers who can remember a web without Google, life back then was much less complicated and a lot less tumultuous. Since Google came onto the scene and became the world's dominant search engine, things have changed drastically.

And not necessarily in a negative way: those same webmasters probably jumped for joy when they reached the top of Google for their keywords. Then they complained just as loudly when Google made one of its never-ending algorithm changes and they saw their rankings drop or, in some severe cases, disappear from the results altogether. In those early days, most of Google's major updates were kept secret until the fallout left webmasters fuming or rejoicing.

In recent algorithm updates, however, Google has openly announced these changes to anyone who was listening. The same openness applies to Google's recent changes dealing with "content farms" and "low-quality content" in its SERPs. Matt Cutts stated on his blog, "we're evaluating multiple changes that should help drive spam levels even lower, including one change that primarily affects sites that copy others' content and sites with low levels of original content."

Basically, what Google is trying to do with these changes is improve the overall quality of its search results by lowering the rankings of sites it perceives as low quality, containing little or no original content. These are typically sites that have scraped content from other websites and displayed it, usually alongside ads and/or links to affiliate products or other related sites.
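How scraped or near-duplicate content gets detected is not something Google publishes, but a common textbook technique is "shingling": comparing the sets of overlapping word n-grams two pages share. The sketch below is purely illustrative of that idea; the function names and the three-word shingle size are my own choices, not anything Google has confirmed using.

```python
# Illustrative sketch of near-duplicate detection via shingling.
# Google's actual method is not public; this is the textbook idea only.

def shingles(text, n=3):
    """Return the set of overlapping n-word shingles in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard_similarity(a, b, n=3):
    """Jaccard similarity of two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "quality content should rise to the top of the search results"
scraped = "quality content should rise to the top of the web results"

print(round(jaccard_similarity(original, scraped), 2))  # → 0.64
```

A scraper that changes only a few words still shares most of its shingles with the original, so the similarity stays high; genuinely rewritten content shares far fewer.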

At present, this only affects search traffic in the United States, but it is no small update, since 11% of queries have changed. And as some webmasters have noted, these changes are indeed improving search results.

One interesting example comes from Alexis Madrigal, who looked at Google's newly improved search results for the keywords "drywall dust" and found that there were indeed fewer "content farm" listings in the new results.

However, it's Google's definition of "content farms" which has many long-time webmasters concerned. As an online marketer who attributes most of his success to article marketing, I am somewhat worried by Google's recent updates. I contribute articles on a regular basis to many online article directories, most of which are free for other webmasters to use as long as they keep my resource box and links attached. These articles get picked up and displayed on countless sites around the web. I also feature many of those same articles on my own site. I am sure there are thousands of webmasters who do the same thing and who are also wondering how Google's new changes will affect all this duplicate content.

In most cases, my articles on EzineArticles get displayed at the top of Google's rankings, sometimes even above the same article on my own main site. This is understandable, since EzineArticles is a much more respected authority site in the eyes of the search engines. Each site also has its own unique keyword ranking "DNA": when a site is already optimized for certain keywords, any related content added to it will rank higher in the search engines, especially Google.

Years ago, I tried on several occasions to use "spin software" to make all my articles unique, but I couldn't ever bring myself to accept the resulting spun versions of my articles. They just didn't seem right and didn't have the right flow. For me, writing has always been more of a pleasure than a chore, and corrupting it in any way is just not worth it. Besides, I have been horrified more than once to see fragments of my articles mutilated on some of the aforementioned low-quality sites which have scraped my content from the web.

Instead, I started writing unique articles and content to place on other sites. One of the main directories for this was Buzzle, which switched over to accepting only unique content two or three years ago. I have monitored Buzzle over the years and noticed that its traffic stats keep climbing in a steady line upwards, probably due to all this constant, unique content being added. Other article directories show a more see-saw pattern in their traffic numbers, if you compare them on sites like Alexa.

I believe this whole issue comes down to quality content and what the search engines perceive as quality. Just because something is unique doesn't mean it's quality content. Google has to judge the quality of the content it finds on the web, and it has over 200 ranking factors which, it says, can filter out the top content and present it to the searcher. The recent "content farm" update is difficult to evaluate: just because content is duplicated or appears on another site doesn't mean it lacks quality.

As a webmaster, I have always placed related videos and press releases on my sites to complement my own content. I also reference other sites and data in my articles to back up an opinion or to prove a point. Going forward, though, I will be very wary of placing content on my sites which is not unique.

Sometimes I find it ironic that Google, since day one, has not created any unique content of its own: its robots crawl the web and compile that information into search results. The quality of those results largely depends on the quality of the scraped content and how well its algorithm can filter out the low-quality stuff.

Supposedly, no human eyes judge this whole process, which I don't believe for a minute. Google's engineers are constantly monitoring the results and constantly adjusting the algorithm to filter out what they don't like, which brings us back to the question at hand: can Google really perfect a system where only the quality content on the web rises to the top?

Obviously, they can use such factors as bounce rates, time spent on-site, pageviews per visitor, direct access, backlinks from authority sites… and bookmarks in the social media/networking sites. Let’s face it, if a piece of content has 2,000 re-tweets, it must contain something of interest/quality for a lot of people. Likewise, if a piece of content or video has 5,000 comments attached to it and 10,000 Diggs or Likes… chances are good that it is of high quality.
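To make the idea concrete, here is a toy sketch of how the engagement signals above might be combined into a single quality score. The formula and weights are entirely invented for illustration (log-scaled share counts minus a bounce-rate penalty); nothing here reflects Google's actual ranking factors, which remain secret.

```python
# Toy quality score from engagement signals. The formula and weights
# are invented for illustration only -- Google's real factors are not public.
import math

def engagement_score(retweets, comments, likes, bounce_rate):
    """Log-scaled social counts minus a penalty for a high bounce rate.

    bounce_rate is a fraction between 0.0 and 1.0.
    """
    social = math.log1p(retweets) + math.log1p(comments) + math.log1p(likes)
    penalty = bounce_rate * 5
    return social - penalty

# A piece with 2,000 re-tweets, 5,000 comments and 10,000 likes scores
# well above a page nobody shares that most visitors bounce from.
popular = engagement_score(2000, 5000, 10000, 0.4)
ignored = engagement_score(0, 0, 0, 0.9)
print(popular > ignored)  # → True
```

The log scaling reflects the intuition in the paragraph above: the jump from 0 to 100 re-tweets says more about quality than the jump from 9,900 to 10,000.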

Of course, there is also the technological/mechanical side to site quality. If a site loads slowly and has countless dead links, then it can be easily ranked as low quality. Google has always maintained a user/surfer's experience is important to what it lists in its results. Content farms and sites with little or no unique content would probably be high on Google's list of what not to display.
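Dead links, at least, are something any webmaster can check mechanically. The sketch below uses only Python's standard library to pull the `href` targets out of a page and report which ones fail to respond; a real crawler would also need rate limiting, robots.txt handling, and a proper HTML parser, so treat this as a minimal illustration.

```python
# Minimal dead-link check using only the standard library.
# Illustrative sketch: a production crawler needs rate limiting,
# robots.txt handling, and more robust parsing.
from html.parser import HTMLParser
from urllib.request import Request, urlopen
from urllib.error import URLError

class LinkExtractor(HTMLParser):
    """Collect absolute http(s) href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value and value.startswith("http"):
                    self.links.append(value)

def find_dead_links(html):
    """Return the links in the page that fail to load."""
    parser = LinkExtractor()
    parser.feed(html)
    dead = []
    for url in parser.links:
        try:
            with urlopen(Request(url, method="HEAD"), timeout=5):
                pass
        except (URLError, OSError):
            dead.append(url)
    return dead
```

Running `find_dead_links` over your own pages periodically catches exactly the kind of rot that makes a site look neglected.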

However, judging the quality of a piece of information or writing, without human eyes viewing it, is not so easy. Unless Google has a thousand little Watsons running quietly in the background, intelligently reading and rating all that content, making only quality content king of the web will be extremely difficult for Google to do. Time will tell.

About The Author
Titus Hoskins is a full-time online marketer who operates numerous niche sites, as well as two sites on Internet Marketing, where you can get valuable marketing tips for free: internet marketing tools or marketing tools.


Comments

Yeah, maybe this time quality content truly will play an important role in search engine rankings. Google has penalized some important article sites that created a lot of duplicate content. Hope for the best. SEO is not easy work these days.
