AI experts building ‘world’s angriest robot’ to constantly run ‘what if’ scenarios

May 13, 2015 | 4 mins

Touchpoint Group is planning to develop "the world's angriest artificial intelligence machine," which is named Radiant after a sci-fi supercomputer that could predict future behavior.

We’ve all heard recorded announcements such as “this call may be recorded for quality assurance or training purposes.” Telecoms, insurance firms and one of Australia’s biggest banks are handing over “reams of real-life customer interactions that have been collated over the past two years.” The data will be fed into “the world’s angriest robot” to help determine what causes customers to reach the breaking point and go nuclear.

The New Zealand-based Touchpoint Group is investing $500,000 to develop “the world’s angriest artificial intelligence machine.” Data scientists will work on the platform and algorithms that, by the end of next year, “will simulate hundreds of millions of angry customer interactions that will help companies better understand the behaviors and processes that trigger customer outbursts.”

Touchpoint chief executive Frank van der Velden told The Australian the research will help identify “how customers were affected by the various products, systems, policies, processes and people they interacted with in the lead-up to reaching breaking point.” The AI engine will test past examples of what works with customers and what doesn’t.

The angry AI project has been dubbed “Radiant” from Isaac Asimov’s Foundation science fiction novel series. The Australian added, “In the Foundation series, Prime Radiant was a supercomputer that could predict the future behavior and development of humanity through the analysis of history, sociology and mathematical statistics.”

“Radiant will be searching for behavioral patterns that typically suggest moments of risk or opportunity when interacting with customers,” Touchpoint Group’s CEO told CMO. “Effectively, it will be constantly running ‘what if’ scenarios, to see if a particular scenario is likely to enrage or benefit the customer.”

While it would be great if Radiant could make calling customer support less painful, let’s hope the angry AI and predictor of future behavior doesn’t take on a life of its own and bring about the apocalypse, as Tesla’s Elon Musk, super-brain Stephen Hawking, Bill Gates, and other smart techies have warned.

Regarding recorded messages heard after placing a call to a business, one of my favs is “Your call is very important to us. All representatives are currently busy. Please stay on the line and your call will be answered in the order it was received.” After 15 minutes on hold with that recording looped over elevator music, it’s hard to believe “your call is very important”; it gets even better when a real person finally answers yet manages to disconnect the call. By the next callback, there may be no way to placate or otherwise pacify the customer.

Another favorite for ticking me off before I’ve even spoken to anyone is redundantly pushing a number to reach a department: press “9” for technical support; press “9” if you are experiencing problems connecting to the internet; press “9” if you wish to report an internet outage; press “9” if you cannot connect to the internet; press “9” to speak to a technician about an internet outage.

One ISP does exactly this; each time, it leads to technical support for “if you’re experiencing problems with your internet connection.” You must redundantly push buttons, as if you kept changing your mind about reporting the net being down, only to eventually hear a recording that, “due to the overwhelmingly positive response and high demand for our award-winning services, all technicians are currently busy. Your call may be monitored for quality assurance purposes. Please hold.” Trying to reach the department that handles internet outages seems like a silly time to tell consumers about the “high demand” for “award-winning services.”

Data from real-life problem calls fed into Touchpoint’s angry Radiant robot will surely include voice-recognition failures. But that seems to be a common problem even when you’re not dealing with a call center or automated customer representative.

Touchpoint’s “end goal is to build an engine that can recommend solutions to companies. We’re not in the business of managing complaints; we are in the business of managing issues that might turn into complaints. We’re at the top of the cliff, not at the bottom. This will allow companies to better predict and identify those issues.”

ms smith

Ms. Smith (not her real name) is a freelance writer and programmer with a special and somewhat personal interest in IT privacy and security issues. She focuses on the unique challenges of maintaining privacy and security, both for individuals and enterprises. She has worked as a journalist and has also penned many technical papers and guides covering various technologies. Smith is herself a self-described privacy and security freak.