We are poised on the brink of computers recognizing and responding to users' emotions. Would you view that as cool or as a privacy invasion? Are you ready for HAL 9000 in real life? Would you trust your computer to interact with you on a deep, personal, touchy-feely level, reading, interpreting, reacting to, or storing your emotions? Most of us at some point have had one of those days when we've tried sweet-talking or cussing at a malfunctioning computer, but what if the computer apologized? What if computers could sense your emotions and respond appropriately?

Computers in touch with human feelings sent me on a quick flashback to 2001: A Space Odyssey, just after HAL 9000 had killed the rest of the crew. HAL said, "Look Dave, I can see you're really upset about this. I honestly think you ought to sit down calmly, take a stress pill, and think things over."

Yet we are poised on the brink of computers recognizing and responding to users' emotions. Design Interactive, an engineering and consulting firm, reports that it partnered with VRSonic to develop the Affective Virtual Environment Training System (A-VETS), a tool that uses "noninvasive" methods of evaluating emotional responses in real time. The A-VETS tool is targeted at the needs of the military training and research community.

Technology Review reported that Design Interactive is working with DARPA and the Office of Naval Research. Design Interactive owner Kay Stanney said that "a lot of information about a user's mental and physiological state can be measured, and that this data can help computers cater to that user's needs."

According to Columbus Cars, the computers in some cars are already somewhat emotionally attentive in an attempt to help keep drivers safe. Although Volvo has "heartbeat sensors," they are more likely meant to detect an erratic heartbeat, such as when an intruder suddenly pops up from hiding in the back seat.
But Ferrari is working on a mind-reading car whose biometric and psychometric in-cockpit sensors will be able to "monitor a driver's heart rate, blood pressure, facial reactions and brain activity." Besides helping to monitor Ferrari drivers' fatigue levels, it may also "forensically measure driver reactions in the moments before a road-rage incident or high-speed crash."

A company called EmSense claims to be able to measure emotions and offers marketers the "largest neuromarketing database in the world." If consumers wear a funky headband, it can measure emotion and engagement like "a window into the mind of consumers." While that's not quite the same as a computer sensing and reacting to your emotions, it delves into the possibilities of computers reading your moods and emotional states.

You can scan ports and scan for other security vulnerabilities, but how about scanning for love? An older technology called Layered Voice Analysis (LVA) supposedly detects and measures emotions in voices. The Israelis have used LVA tech in the war against terrorism to expose hidden hatred and malicious intent in the voices and hearts of terrorist suspects. This security tool is also allegedly a tool for love, meant to check for fluctuations in the voice that may betray real emotions; it was marketed as "The LOVE Detector." The civilian Love Detector version measures only 5 of the 129 emotion parameters that the security version measures while detecting whether the speaker is lying or telling the truth. The Big Blue Marble reported that the security version "is being licensed to governments and intelligence agencies by the Israeli company Nemesysco." Yet when two Swedish scientists published a paper claiming LVA was not an accurate security tool, Nemesysco Limited threatened to sue them, and the electronic version of the paper was taken down.

Although it might prove to be wonderful, I'm not too excited about the prospect of my computer reading, reacting to, or storing my emotions.
I don't want to worry about whether that touchy-feely personal data would be shared with marketers or anyone else not of my choosing. How often is the tone of a chat or email misunderstood based on the reader's state of mind at the time? Even people face-to-face can misread each other's emotions, so what if the computer misunderstood? It would not be cool if the computer started closing applications because it judged the user too tired, too frustrated, or too excited to continue.

On the other hand, it might be nice if airport security could simply ask people whether they intended to hijack the plane, and lie-detector glasses, a voice analyzer, or a computer at the TSA station could quickly verify that a person was not a terrorist or any other kind of threat. It would, however, be very bad news if the analysis was wrong.

Are you ready for your own personal "HAL" to get touchy-feely, interpreting and reacting to your emotions?

Follow me on Twitter @PrivacyFanatic