Forgetting things

I am unpleasantly surprised by society’s response to modern computer capabilities. The problem is simple to state: computers are not limited in the ways humans are. With sufficient technology, computers can store information indefinitely. This is a genuine problem, but the public response is a lot like the response of an emu on a concrete floor – a bloody mess. I came to writing this post after seeing several fascinating news items discussing Google Glass. Most of them went like this: OMG, if you run around with electronic equipment, this can rob us of our privacy!!! And considering the right to be forgotten, as the European Court names it, the situation is even more grim, as society happily jumps on the bandwagon of thought with the destination I Do Not Want To Be Judged For My Past.

However, by discarding the past, one is not entitled to a future. Past actions offer a useful prediction of future behavior, and by discarding this obvious consequence of the expansion of human memory, people are willingly limiting their capacity for judgment. The most radical call is to remove from the Internet anything that might affect one’s reputation negatively. If this is done, no information can be deemed reliable, as it will be subject to one giant confirmation bias, with legal restrictions on negative tests of the hypothesis “Can we trust them?”. As an illustration, suppose I am a political expert. I tell you exciting stories about the dangers of the loloization (my newly invented buzzword) of society and make fun (for you and me) and profitable (indirectly, for me) predictions that, with 99% certainty, society will turn into idiots incapable of thinking by 2015. The next year, when this does not happen, I politely ask to have all my interviews containing the word loloization removed, because I have the right to be forgotten, and my reputation as a political guru will obviously be damaged if these interviews remain visible. This is an obvious exploit.

Furthermore, by demanding the right to be forgotten, people are delegating their agency to the goodwill of the legislative and judicial systems. And as responsibility for actions in public is (seemingly) being weakened back to the pre-digital status quo, nothing can stop technically competent private enterprises from creating a mass archive of public human (mis)behavior as long as there is some public or corporate interest in it. On the other hand, the same corporate interest can, using the right to be forgotten, remove traces of corporate misbehavior. The most prominent example is the Spanish man who had Google entries about his past bankruptcy removed. A sane solution here would be an explicit display of the date of the relevant entry, not its erasure.

The trickier use case is party pictures on the Internet. This is something that is regulated, but only in part; some countries enforce very strict regulation, which is something I would generally prefer. However, the more responsible approach would be to design technical devices responsibly and, most importantly, not to use devices that you cannot control. (Should I rise to power with my fellow technocrats, I will enforce strict computer literacy.)

As someone who grew up in the computer era, I would like humanity to embrace these new technological capabilities and act responsibly. Technology always carries risks, and the first man to discover fire undoubtedly made this observation. It should be clear that irresponsible behavior with technology should be limited, but it should also be obvious that legal restrictions can themselves be imposed irresponsibly (consider, as a not directly related but similar case, the current Russian legislation against “gay propaganda”: it is neither useful, nor important, nor responsible, and one day the relevant law will be defunct anyway), and it is also our responsibility to prevent that.
