

Why we need an underground Google

Mike Elgan | July 15, 2014
Governments are forcing search engines to show wrong results. It's time for search engines to go rogue so they can be right.

Governments ranging from democracies, such as Turkey's, to authoritarian regimes, such as Iran's, are increasingly following China's lead, using sophisticated methods to censor search engine results.

The bottom line is that with each passing year, search engine results are becoming more inaccurate and unreliable, and search engines are therefore increasingly failing at their most basic function — helping you find what you're looking for on the Internet.

It's clear that the world — from the victims of the most repressive governments to the citizens of the freest democracies — needs search engines that can't be made inaccurate by governments.

So what's the solution?
An immodest proposal

The reason governments can force search engines to be inaccurate is that search engines are caught in a catch-22.

In order to be comprehensive and fully index the Internet, search engines need a lot of money for massive server farms and highly trained employees. In order to make money, they need to cooperate with governments and obey national laws and rules in whatever country they operate in, so they can sell ads to pay for it all. However, that cooperation ends up requiring them to skew search results, which prevents them from fully indexing the Internet.

And that's why there is no accurate search engine. The search sites with the money can't have the freedom to be accurate; the sites with the freedom can't make the money.

One solution might be a distributed search engine: instead of being housed in one location that can be shut down, the index would be spread across many locations and shifted on the fly.

It's been tried before. Projects like InfraSearch, Opencola, YaCy and FAROO have attempted to build distributed search engines.

The problem is that the Internet is too large and changes too quickly for these small-time projects to come anywhere near the big search engines in comprehensiveness, even accounting for government censorship.

So instead of trying to duplicate the indices of the major search engines, we need a distributed search engine that focuses exclusively on the censored content, where the major search engines have been forced to provide inaccurate results. Perhaps Google, Microsoft and others might even help this effort by freely providing data about what has been censored and why.

This distributed search engine should display the results from the search engine chosen by the user (Google, Bing, etc.), alongside results known to be censored somewhere -- anywhere.

Together, these two sets of results would give users not only an accurate view of what's really on the Internet, but also a clear picture of what has been censored.
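To make the idea concrete, here is a minimal sketch of that merge step. Everything in it is an assumption for illustration — the data shapes, the hypothetical `censored_index` feed, and the `merge_results` function are invented, not a real API from Google, Bing, or any existing project.

```python
# Illustrative sketch: combine results from the user's chosen search
# engine with entries known (from a hypothetical shared censorship
# index) to be suppressed somewhere. All names here are assumptions.

def merge_results(engine_results, censored_index, query):
    """Return the engine's results plus any censored entries that match
    the query, each tagged with the regions where it is blocked."""
    merged = [dict(r, censored_in=[]) for r in engine_results]
    seen = {r["url"] for r in engine_results}
    for entry in censored_index:
        if query.lower() in entry["title"].lower() and entry["url"] not in seen:
            merged.append({
                "url": entry["url"],
                "title": entry["title"],
                "censored_in": entry["regions"],  # where it is blocked
            })
    return merged

# Toy data standing in for a real engine's results and a censorship feed.
engine_results = [
    {"url": "https://example.org/a", "title": "An ordinary query result"},
]
censored_index = [
    {"url": "https://example.net/b",
     "title": "A query result removed in some jurisdictions",
     "regions": ["XX"]},
]

merged = merge_results(engine_results, censored_index, "query")
```

The point of the sketch is the display contract: censored entries appear alongside normal results, clearly labeled with where they were suppressed, rather than silently missing.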

I believe that anyone paying attention to the corrosive power of government censorship of search engine results sees how necessary this is. And if you don't see it, just wait a year or two.


