Professor Writes Research Paper on Computers Needing to Forget

Lifestreaming inevitably seems to prompt privacy concerns. Creating a Lifestream begins with either already using, or finding, web services that by design let anybody see our activity: one service lets others find our profile page and see what music we listen to, while another lets them view our profile and see what we’re bookmarking.

Some sites and services allow you to filter what data you share. One bookmarking service lets you set a flag to disable sharing each time you bookmark a page, and Cluztr allows you to define sites which are private and won’t appear in your linkstream. Basically, we, as the authors of our own Lifestreams, choose how open a book we want our lives to be for others to view.

Is Lifestreaming for everyone? No, but I want to nip misconceptions about privacy in the bud as quickly and easily as possible so that others are more open to the concept. Jon (founder of Cluztr) sent word to me about several new features on his site, and from them I quote the following:

Beefed-up privacy
This was the number one requested feature, and we delivered. Users now have the ability to set the privacy level of their clickstreams. The default setting is public, but it can now be set to private so that only their friends can view it. Additionally, other new features allow users to distill their clickstreams by tagging sites or pages as “private”. This effectively gives users full control over their own clickstreams.
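The two controls described above, a stream-level public/private flag plus per-site “private” tags, amount to a simple filtering rule. Here is a minimal sketch of how such filtering might work; Cluztr’s actual implementation isn’t public, so every name and structure below is a hypothetical illustration, not their API.

```python
# Hypothetical sketch of clickstream privacy filtering.
# All names here are invented for illustration; none come from Cluztr.

from dataclasses import dataclass, field
from urllib.parse import urlparse

@dataclass
class Clickstream:
    public: bool = True                              # stream-level privacy flag
    private_sites: set = field(default_factory=set)  # domains tagged "private"
    clicks: list = field(default_factory=list)

    def record(self, url):
        """Log a visited URL to the stream."""
        self.clicks.append(url)

    def visible_to(self, viewer_is_friend):
        """Return the clicks a given viewer is allowed to see."""
        # A private stream is visible only to friends.
        if not self.public and not viewer_is_friend:
            return []
        # Pages on privately tagged sites never appear, for anyone.
        return [u for u in self.clicks
                if urlparse(u).netloc not in self.private_sites]

stream = Clickstream(private_sites={"bank.example.com"})
stream.record("https://news.example.com/story")
stream.record("https://bank.example.com/account")
print(stream.visible_to(viewer_is_friend=False))
```

The key design point is that the private-site tag filters the stream for everyone, friends included, while the public/private flag only gates who may look at all.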

It’s good to see how users can help shape the quality and privacy of the sources for our Lifestreams. And honestly, the data in our Lifestreams are harmless, interesting nuggets compared to the amount of private data about us that we can’t control, sitting on corporate computers and hard drives all over the world. So, without further ado and with that long-winded spiel out of the way, here’s a snippet from the article “Why computers must learn to forget”:

“If whatever we do can be held against us years later, if all our impulsive comments are preserved, they can easily be combined into a composite picture of ourselves,” he writes in the paper. “Afraid how our words and actions may be perceived years later and taken out of context, the lack of forgetting may prompt us to speak less freely and openly.”

I was also impressed to learn what a panopticon is. You can read the rest of the article on Ars Technica.

1 thought on “Professor Writes Research Paper on Computers Needing to Forget”

  1. Privacy will always be a concern for some, not everyone, but it’s certainly LESS of a concern than, say, it was for most people even a few years ago.

    Web behavior, as you’ve mentioned before, is moving in a direction where people are more willing to give up aspects of their privacy to get services in return.

    But to many people it’s still like getting into a pool of cold water: some people jump right in, others take a while. To each their own, and Cluztr facilitates that.

    It would be great if computers could forget, I mean who hasn’t left a flaming rant in some forum or in a blog comment? …but I think the real lesson is that in the virtual world, as in the real world, your actions matter and can have consequences. So think before you click “submit”.

    Anonymity on the web never existed.

Comments are closed.