Wired.com, LiveScience and other sites have published stories in the last few days about University of Wisconsin grad student Adam Wilson's recent development of a brain-computer interface system that can send Twitter messages. (There's also a video demonstration of the technology you can watch on YouTube.) Essentially, it allows a user wearing an electrode-laden cap to select letters by concentrating on them as they flash on a screen, instead of typing them in by hand. I first learned about brain-computer interface devices of this sort when reading this blog about a year ago, and I've blogged elsewhere (on PrawfsBlawg) about recent work on adapting them for video games.
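For readers curious about the mechanics, the flash-and-concentrate interface described above resembles what researchers call a P300-style speller. A rough, hypothetical sketch of the selection logic follows; the letter grid, the noise model, and the `simulated_response` stand-in for an EEG signal are all my own simplifying assumptions, not details of Wilson's actual system:

```python
import random

# Hypothetical simulation of a P300-style speller: rows and columns of a
# letter grid flash in turn, and the row/column containing the letter the
# user is concentrating on evokes a stronger brain response. Intersecting
# the strongest row and column identifies the letter.

GRID = [
    "ABCDEF",
    "GHIJKL",
    "MNOPQR",
    "STUVWX",
    "YZ1234",
    "56789_",
]

def simulated_response(flashed_is_target):
    """Stand-in for an EEG amplitude: attended flashes score higher."""
    return random.gauss(0.0, 0.2) + (1.0 if flashed_is_target else 0.0)

def select_letter(target, n_repetitions=10):
    """Flash each row and column repeatedly, accumulate response scores,
    and return the letter at the intersection of the strongest row/column."""
    target_row = next(r for r, row in enumerate(GRID) if target in row)
    target_col = GRID[target_row].index(target)
    row_scores = [0.0] * len(GRID)
    col_scores = [0.0] * len(GRID[0])

    for _ in range(n_repetitions):
        for r in range(len(GRID)):
            row_scores[r] += simulated_response(r == target_row)
        for c in range(len(GRID[0])):
            col_scores[c] += simulated_response(c == target_col)

    best_row = row_scores.index(max(row_scores))
    best_col = col_scores.index(max(col_scores))
    return GRID[best_row][best_col]

if __name__ == "__main__":
    print("".join(select_letter(ch) for ch in "HI"))
```

The point of the repetitions is noise averaging: any single flash response is noisy, but summing over many flashes lets the attended row and column stand out, which is why composing even a short tweet this way is slow.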
This is clearly of enormous value for those who are prevented, by physical injury or illness, from typing or speaking. As the CNN story on it says: "The development could be a lifeline for people with 'locked-in syndrome' -- whose brains function normally but who cannot speak or move because of injury or disease."
But the technology also raises interesting privacy questions, as is highlighted by LiveScience's interesting title for the story: "Mind Reading Device Sends Twitter Messages." A mind reading device, of course, might send more than a Twitter message:
it might pick up -- and communicate to others -- more than you want to communicate about what's in your mind. As the blog post here last year pointed out, brain-interface devices can detect not just electrical activity in the brain associated with a letter you're focusing on, but also with "non-conscious emotions."
This isn't to say this kind of intrusiveness is an inevitable part of using the technology. From everything I've read so far, the Twitter message featured in this story was indistinguishable from a Twitter message typed in by hand. It didn't contain any hidden information about the Twitterer's mood or uncertainty.
But the fact that such technology **can** be privacy-protective in this sense doesn't automatically mean it will be. Some people (or companies or agencies), after all, may decide they have an interest in knowing not only what someone says but also the feelings or internal mental states that accompany it. Is it possible that we'll one day live in a world where, at least in certain social or business interactions, people will be expected to reveal themselves not just through words, but in technological displays of feelings that might otherwise be unshared? In short, these developments seem to vindicate a fear that Justice Brandeis expressed in 1928 when he warned (in a Supreme Court dissent demanding greater constitutional privacy protection) that "[a]dvances in the psychic and related sciences may bring means of exploring unexpressed beliefs, thoughts and emotions."
There has been a lot of discussion about whether evidence gleaned from brain-based methods of lie detection should be used in court. (Perhaps most notably in the recent discussion and debate over the use of it in a murder case in India.) But it is not only public courts that will be considering whether to use technology of this sort, but also many private entities -- and perhaps even private individuals interested in doing what they can to assess the truthfulness of a friend or business partner. And even if courts find it unreliable, individuals or institutions could decide to use such devices in the "private justice" systems one finds in arbitration hearings and in the disciplinary decisions of many schools, businesses, and other private entities. Non-state actors might ask people to don electrodes as part of an evidence-collection process, or might ask them to keep and provide recordings of their feeling states in certain interactions in the event such evidence is needed at a later time.
None of this is to say I think an Orwellian privacy-free world is just around the corner. People aren't always expected to communicate by video chat even though it's now widely available. They still communicate with each other in voice calls or chats, and in e-mails or text messages that reveal far less than is possible. But I suspect what will come soon is lots of questions and challenges, both in law and in the development of social norms, about what kinds of information about people's internal states it will be fair (and legal) for individuals, institutions, and businesses to expect people to reveal about themselves, or to capture and store for possible future use or examination.