Real-life HAL 9000: computers reacting to human emotions?

Analysis
Mar 03, 2011 | 5 mins
Data and Information Security, Enterprise Applications, Microsoft

We are poised on the brink of computers recognizing and responding to users' emotions. Would you view that as cool or as a privacy invasion? Are you ready for HAL 9000 in real life?

Would you trust your computer to interact with you on a deeply personal, touchy-feely level, reading, interpreting, reacting to, or storing your emotions? Most of us at some point have had one of those days when we’ve tried sweet-talking or cussing at a malfunctioning computer, but what if the computer apologized? What if computers could sense your emotions and respond appropriately?

The idea of computers in touch with human feelings sent me on a quick flashback to 2001: A Space Odyssey, to the moment after HAL 9000 had just killed the rest of the crew. HAL said, “Look Dave, I can see you’re really upset about this. I honestly think you ought to sit down calmly, take a stress pill, and think things over.”

Yet we are poised on the brink of computers recognizing and responding to users’ emotions.

An engineering and consulting firm, Design Interactive, reports it partnered with VRSonic to develop an Affective Virtual Environment Training System (A-VETS), a tool that uses “noninvasive” methods of evaluating emotional responses in real time. The A-VETS tool is targeted at military training and research community needs.
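Design Interactive hasn’t published the internals of A-VETS, but the general recipe behind such tools is simple to picture: sample noninvasive physiological signals and map them to a coarse emotional-state label in real time. Here is a minimal Python sketch of that idea; the sensor readings, thresholds, and labels are all invented for illustration and are not A-VETS internals.

```python
# Hypothetical sketch: mapping noninvasive physiological readings to a
# coarse emotional-state label. All sensor names and thresholds here are
# invented for illustration; they are not A-VETS internals.
from dataclasses import dataclass

@dataclass
class Reading:
    heart_rate_bpm: float       # e.g., from a chest strap
    skin_conductance_us: float  # galvanic skin response, in microsiemens

def classify_state(r: Reading) -> str:
    """Crude rule-of-thumb classifier; a real system would use calibrated
    per-user baselines and trained models, not fixed cutoffs."""
    if r.heart_rate_bpm > 110 and r.skin_conductance_us > 8.0:
        return "high stress"
    if r.heart_rate_bpm > 90 or r.skin_conductance_us > 5.0:
        return "elevated arousal"
    return "calm"

if __name__ == "__main__":
    sample = Reading(heart_rate_bpm=118, skin_conductance_us=9.2)
    print(classify_state(sample))  # -> high stress
```

A training system could poll such a classifier every few seconds and adapt the scenario, say, easing off when the trainee stays in “high stress” too long.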

Technology Review reported that Design Interactive is working with DARPA and the Office of Naval Research. Design Interactive owner Kay Stanney said that a lot of information about a user’s mental and physiological state can be measured, and that this data can help computers cater to that user’s needs.

According to Columbus Cars, the computers in some cars are already somewhat emotionally attentive in an attempt to help keep drivers safe. Although Volvo has “heartbeat sensors,” those are more likely meant for security, detecting a heartbeat when an intruder is hiding in the back seat. But Ferrari is working on a mind-reading car whose biometric and psychometric in-cockpit sensors will be able to “monitor a driver’s heart rate, blood pressure, facial reactions and brain activity.” Besides helping to monitor Ferrari drivers’ fatigue levels, the sensors may also “forensically measure driver reactions in the moments before a road-rage incident or high-speed crash.”

A company called EmSense claims to be able to measure emotions and offers marketers the “largest neuromarketing database in the world.” When consumers wear its funky headband, the company says it can measure emotion and engagement, providing “a window into the mind of consumers.” While that’s not quite the same as a computer sensing and reacting to your emotions, it delves into the possibilities of computers reading your moods and emotional states.

You can scan ports and scan for other security vulnerabilities, but how about scanning for love?

An older technology called Layered Voice Analysis (LVA) supposedly detects and measures emotions in voices. The Israelis have used LVA tech in the war against terrorism to expose hidden hatred and malicious intent in the voices and hearts of terrorist suspects. This security tool is also allegedly a tool for love: marketed as ‘The LOVE Detector,’ it is meant to check for fluctuations in the voice that may betray real emotions. The civilian Love Detector version measures only 5 of the 129 emotion parameters that the security version measures while detecting whether the speaker is lying or telling the truth. The Big Blue Marble reported that the security version “is being licensed to governments and intelligence agencies by the Israeli company Nemesysco.” Yet when two Swedish scientists published a paper claiming LVA was not an accurate security tool, Nemesysco Limited threatened to sue them, and the electronic version of the paper was taken down.
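Nemesysco keeps the actual LVA parameters proprietary, but the broad concept, detecting emotional fluctuations in a voice, can be illustrated with a toy example. The Python sketch below scores how much a speaker’s pitch jitters between short frames; the crude pitch estimator, the frame size, and the assumption that jitter tracks stress are all illustrative stand-ins, not Nemesysco’s algorithm.

```python
# Toy voice-"stress" score: mean frame-to-frame pitch jitter.
# An invented illustration, NOT Nemesysco's proprietary LVA.
import numpy as np

def estimate_pitch(frame: np.ndarray, sample_rate: int) -> float:
    """Crude autocorrelation pitch estimate for one audio frame."""
    frame = frame - frame.mean()
    corr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo, hi = sample_rate // 400, sample_rate // 50  # 50-400 Hz speech range
    lag = lo + int(np.argmax(corr[lo:hi]))
    return sample_rate / lag

def jitter_score(audio: np.ndarray, sample_rate: int, frame_ms: int = 40) -> float:
    """Mean absolute pitch change between consecutive frames, in Hz."""
    n = sample_rate * frame_ms // 1000
    pitches = [estimate_pitch(audio[i:i + n], sample_rate)
               for i in range(0, len(audio) - n, n)]
    return float(np.mean(np.abs(np.diff(pitches)))) if len(pitches) > 1 else 0.0

if __name__ == "__main__":
    sr = 8000
    t = np.arange(sr) / sr
    # A wobbly 120 Hz tone stands in for a nervous voice.
    wobbly = np.sin(2 * np.pi * (120 + 30 * np.sin(2 * np.pi * 3 * t)) * t)
    print(f"jitter score: {jitter_score(wobbly, sr):.1f} Hz")
```

Whether such low-level acoustic features actually reveal lying, let alone love, is exactly what the Swedish paper disputed.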

Although it might prove to be wonderful, I’m not too excited about the prospect of my computer reading, reacting to, or storing my emotions. I don’t want to worry about whether that touchy-feely personal data would be shared with marketers or anyone else not of my choosing. How often is the tone of a chat or email misunderstood based on the reader’s state of mind at the time? Even people face-to-face can misread each other’s emotions, so what if the computer misunderstood? It would not be cool if the computer started closing applications because it judged the user too tired, too frustrated, or too excited to continue.

On the other hand, it might be nice if airport security could simply ask people whether they intended to hijack the plane, and lie-detector glasses, a voice analyzer, or a computer at the TSA station could quickly verify that a person was not a terrorist or any other kind of threat. It would, however, be very bad news if the analysis were wrong.
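That “very bad news” is the base-rate problem: when genuine threats are rare, even a very accurate detector flags far more innocent people than guilty ones. A back-of-the-envelope check (all numbers below are assumptions, not TSA figures):

```python
# Base-rate arithmetic with assumed numbers, not real TSA statistics.
travelers = 1_000_000
actual_threats = 10           # assume 1 threat per 100,000 travelers
sensitivity = 0.99            # detector catches 99% of real threats
false_positive_rate = 0.01    # and wrongly flags 1% of innocents

true_alarms = actual_threats * sensitivity
false_alarms = (travelers - actual_threats) * false_positive_rate

print(f"true alarms:  {true_alarms:.0f}")   # ~10
print(f"false alarms: {false_alarms:.0f}")  # ~10,000
# Roughly 99.9% of flagged travelers would be innocent.
```

Even at 99 percent accuracy, nearly everyone the machine accuses would be innocent, which is a lot of wrongly detained passengers.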

Are you ready for your own personal “HAL” to get touchy-feely, interpreting and reacting to your emotions?

  • All of today’s Microsoft news and blogs
  • Are smart meters real-time surveillance spies?
  • Digital Data Mined Dating
  • Behavioral Ads Appearing On Online Banking Statements
  • Digital Signage: Privacy in a ‘One-Way Mirror Society’
  • Snooping on Social Networks to Vet Jurors and Hire Employees
  • Microsoft: Want Unrestricted Net Access? Need PC Health Certificate
  • Watchdog Group questions Google’s relationship with NSA
  • Former FBI Agent Turned ACLU Attorney: Feds Routinely Spy on Citizens
  • Hackers needed to save the world — at least America

Follow me on Twitter @PrivacyFanatic

Ms. Smith

Ms. Smith (not her real name) is a freelance writer and programmer with a special and somewhat personal interest in IT privacy and security issues. She focuses on the unique challenges of maintaining privacy and security, both for individuals and enterprises. She has worked as a journalist and has also penned many technical papers and guides covering various technologies. Smith is a self-described privacy and security freak.