
New Google Patent May Be Less News for Small News

Google’s never-ending quest to provide a quality end-user experience has culminated in a bullet with patent number WO 2005/029368 stamped across the side. Unfortunately for smaller news services, that bullet may strike the heart of aspiring upstarts, a casualty of Google’s friendly fire.

Of course, it will all depend on how heavily certain factors are weighted in the news-ranking technology Google has just sealed in the patent offices of the US and other countries. News giants like CNN and the New York Times will barely notice the drop in air pressure in the blogosphere, and probably won’t mourn the impending loss of younger cousins vying for their thrones.

The patent is aimed at increasing the quality of news delivered into search results, a noble effort to weed out inaccurate, biased, and disreputable sources. Until the new algorithm is implemented, news is ranked according to relevance to the search query and the date (or timeliness) of the article; the source itself is not considered.

The new technology will take several additional factors into account, continually measuring qualitative signals such as how long the news source has been in existence, the number of stories published, the credibility of the source, average story length, the number of stories with bylines, the size of the organization’s staff, circulation, the number of global operations, the number of links to stories from the source, and Web traffic to the site.
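To make the idea concrete, here is a minimal sketch of how source-quality signals like those listed above could be blended with the existing relevance-and-timeliness ranking. Everything here is hypothetical: the patent does not publish its weights or formulas, so the signal names, weights, and the blending ratio are invented purely for illustration.

```python
# Hypothetical quality signals loosely based on the factors named in the
# patent filing. The weights are invented; the real algorithm is unpublished.
SIGNAL_WEIGHTS = {
    "years_in_operation": 0.15,
    "stories_published": 0.10,
    "avg_story_length": 0.10,
    "stories_with_bylines": 0.10,
    "staff_size": 0.15,
    "circulation": 0.15,
    "inbound_links": 0.15,
    "web_traffic": 0.10,
}

def quality_score(signals: dict) -> float:
    """Combine normalised signals (each in [0, 1]) into one source score."""
    return sum(SIGNAL_WEIGHTS[name] * signals.get(name, 0.0)
               for name in SIGNAL_WEIGHTS)

def rank_results(results: list) -> list:
    """Blend the old relevance/freshness ranking with the new source score.
    Each result is a dict with 'relevance', 'freshness' (both in [0, 1])
    and 'source_signals'. The 0.5/0.2/0.3 mix is an assumption."""
    return sorted(
        results,
        key=lambda r: (0.5 * r["relevance"]
                       + 0.2 * r["freshness"]
                       + 0.3 * quality_score(r["source_signals"])),
        reverse=True,
    )
```

Under a scheme like this, a satire site with little staff, circulation, or history would score near zero on the source component and slip down the page even when its story is just as relevant and fresh as the wire services’.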

Currently, typing in “George W. Bush” brings up two sources at the top that probably lack many of the things the new algorithm will be looking for: a satire site called “Unconfirmed Sources” and an editorial from “Guerrilla News Network.” These aren’t really news sources; they’re entertainment and editorial outlets.

What concerns many is not so much the increase in objectivity Google painstakingly aspires to create. It is the loss of sites like those above from the rankings, dropping them into obscurity where they are difficult to find, read, and mull over. The filtering is taken out of the minds of users and handed to computer-generated objectivity. Many would rather that job be left to them; Reuters and the Washington Post are easy enough to find already.

Europe is already leery of the far-reaching arms of Google’s information channeling. Over the next ten years, Google will pour some $200 million into its information-indexing effort as it digitizes some 15 million books from the most respected libraries in the US. Fears of “googlization” are becoming widespread, and, wary of American cultural imperialism, Europe is mobilizing against the search engine by setting up its own literature database.

This is not to say that Europe’s anxiety is well placed. In all honesty, Google should be praised for its digitization and quality efforts and for setting the benchmark by which other search engines are measured. And Europe should have had its digitization effort underway independently of, rather than in response to, Google’s. All information indexing should be welcomed on the web.

There will always be critics. Critics are the warts that come with power and fame. But it’s hard not to be at least a little worried about the underdogs, the upstarts, the legitimate news sources without thousands on staff, who haven’t been around for decades and haven’t comfortably embedded themselves in the establishment. You have to pine just a little for the voices that could be lost in the fight for search engine credibility.

About the Author:
Jason L. Miller is a staff writer for WebProNews covering technology and business.


Camilla Todd
Camilla Todd is Head of Digital Marketing at WNW Digital and manages Search Engine Optimisation, PPC, Social Media campaigns and Brand Awareness for WNW Digital SEO clients. You can follow her on Twitter @camilla_wnw, email her at camilla@wnwdigital.co.uk or phone on 01392 349580