Sunday, May 23, 2010

Entropy loss problem

The past decades are widely regarded as a period of enormous growth in the amount of information available around the world. Although the overall amount of information on the planet has indeed grown significantly (and continues to grow), in my view a much bigger basis for this impression lies in the dramatic decrease of entropy in the data surrounding us.

I cannot say “the entropy of the information surrounding us”, since that would not be formally correct: in information-theoretic terms, “information” already implies maximal entropy, whereas abstract “data” may be entirely redundant and carry no information at all (and thus have zero entropy).
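To make the distinction concrete, here is a minimal sketch (in Python, my own illustration rather than part of the original argument) of the empirical Shannon entropy H = -Σ p·log2(p) of a byte string, using byte frequencies as a rough proxy for information content: a long run of identical bytes is plenty of data but zero information, while a string where every byte value is equally likely reaches the 8-bits-per-byte maximum.

import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Empirical Shannon entropy in bits per byte (ranges from 0 to 8)."""
    if not data:
        return 0.0
    total = len(data)
    # Sum -p * log2(p) over the observed frequency of each byte value.
    return sum(-(c / total) * math.log2(c / total)
               for c in Counter(data).values())

print(shannon_entropy(b"a" * 1000))        # 0.0 -- lots of data, no information
print(shannon_entropy(bytes(range(256))))  # 8.0 -- every byte value equally likely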

What we have today is an enormous growth in the amount of data, not information. We have more books, more music bands, more radio stations, more TV channels. Yet all those sources bring us far less information than they did 50 years ago. Most news channels rely on the good old copy-and-paste method, obtaining their news from a handful of agencies (e.g. Thomson Reuters or Associated Press), and only a few of them work on producing anything more intellectual than reproducing the facts. Bookstore shelves are full of books, yet it is quite difficult to pick a good one to read. Finally, I have just filled my new 400 GB hard drive completely, but I doubt my PC contains ten times more information than it did with a 40 GB drive ten years ago.

Why is entropy loss a problem? Because of its flooding effect, of course. There is simply too much low-entropy data around us, so retrieving high-entropy data takes more and more effort. Moreover, it takes a certain skill to tell a smile from a veil (c); it is too easy to be led astray by fake ideals. Finally, the world is not that simple by itself, so why should we make it more complicated by bringing extra trash into it?
