The chatbot revolution will (not) be secure

Two blockchain projects hope to prevent bot authentication issues before they start.


Verizon Vice President of Digital Ashok Kumar believes chatbots will soon replace websites as the user interface (UI) of choice. He’s likely right: According to a 2017 Grand View Research report, roughly 45 percent of consumers already prefer to address customer service concerns by bot, and Oracle claims that by 2020, 80 percent of sales and marketing departments will be using bots.

While bad UI can make sites difficult to maneuver, chat is naturally intuitive. "Everybody knows language and everybody knows how to speak," Kumar says.

But when the chatbot revolution comes, will its ecosystem be secure enough to take over?

Just like websites, chatbots must be secure in multiple ways: Bot developers have to store data safely, guard their systems against hacks, and make sure connections to the core messaging platform (such as Slack or Facebook Messenger) are secure.

"When you go to a website, there's [an SSL] certificate," says chatbot investor Andrew Rollins. SSL is short for Secure Sockets Layer, a protocol that ensures encryption between any given website and the user’s browser. “There's some certificate authority that granted that certificate,” he continues. “There's a private key installed in the web server.”

These measures aren’t just there for the developer’s benefit, Rollins explains, but also for the user: “When you see that URL in the top search bar is green instead of red, you have some semblance of an expectation that that is in fact Amazon [for example] that you just went to.” For chatbots, he continues, there’s nothing.

There needs to be. “If you want to text something — whether it's on Twitter or through Facebook or just an SMS or whatever, how do you know that the thing you're talking to on the other side is what you think you're talking to?” Rollins asks. Without a bot version of SSL, hackers could pose as authorized chatbots, leaving you none the wiser.
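The gap Rollins describes can be sketched in a few lines of code. Without a shared trust anchor, a client has nothing to distinguish a genuine bot from an impostor that claims the same name; with even a simple challenge-response check against a known key, the impostor fails. This is a toy illustration only: it uses an HMAC over a shared secret as a stand-in for real certificate-based authentication, and the bot names and keys are invented.

```python
import hashlib
import hmac
import secrets


class Bot:
    """Toy bot that holds a secret key (a stand-in for the private
    key matched to a registered certificate)."""

    def __init__(self, name: str, secret: bytes):
        self.name = name
        self._secret = secret

    def answer_challenge(self, nonce: bytes) -> str:
        # Prove identity by MACing the verifier's random nonce.
        return hmac.new(self._secret, nonce, hashlib.sha256).hexdigest()


def verify(bot: Bot, nonce: bytes, known_secret: bytes) -> bool:
    expected = hmac.new(known_secret, nonce, hashlib.sha256).hexdigest()
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(bot.answer_challenge(nonce), expected)


secret = secrets.token_bytes(32)
real_bot = Bot("support-bot", secret)
impostor = Bot("support-bot", secrets.token_bytes(32))  # same name, wrong key

nonce = secrets.token_bytes(16)
print(verify(real_bot, nonce, secret))  # True
print(verify(impostor, nonce, secret))  # False
```

The point is the one Rollins makes: the name alone ("support-bot") proves nothing; only a key anchored to some trusted authority does.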

Solving the trust issue

Of course, in Slack, Ashok Pitchamani says, identity is easier. Pitchamani is a product manager with BotChain, an identity and audit ledger for chatbots currently in public beta. “In Slack, it's easy to trust a bot because they're embedded,” he explains, “but if you go to Twitter or Facebook or something like that, it's really, really hard to figure out if this is a legit bot or not — especially Twitter.”

According to Pitchamani, the answer, of course, is BotChain: "In essence, it’s a registry where a set of people can come and prove that this AI or this bot is what it’s supposed to be." He says the blockchain-based platform doesn’t want to be the end-all, be-all of bot security, but rather a starting point “where all the bot vendor companies can provide a framework where you can register your bot and make sure you have an entity that you can re-verify with.” Vendors can also compare their security to others’.

While chatting, users simply click on the bot, Pitchamani explains: “Just like the way you’re used to an HTTPS certificate and the SSL certificate, you can see a bot certificate too, which will be powered by the bot in the backend.” This, he continues, creates a system “by which a consumer can easily figure out a way to see if that bot is real or not.”
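The registry flow Pitchamani describes can be approximated with a simple lookup: a vendor registers a fingerprint of the bot's public metadata, and a client later recomputes the fingerprint and compares it against the registered value. This is a hypothetical sketch, not BotChain's actual on-chain format; the dictionary registry, field names, and URLs below are all invented for illustration.

```python
import hashlib
import json

# In-memory stand-in for an on-chain registry: bot ID -> fingerprint.
registry: dict[str, str] = {}


def fingerprint(metadata: dict) -> str:
    # Hash a canonical serialization so identical metadata always
    # yields an identical fingerprint.
    canonical = json.dumps(metadata, sort_keys=True).encode()
    return hashlib.sha256(canonical).hexdigest()


def register(bot_id: str, metadata: dict) -> None:
    registry[bot_id] = fingerprint(metadata)


def verify(bot_id: str, presented_metadata: dict) -> bool:
    # A client re-derives the fingerprint and checks it against the
    # registered one; any tampering with the metadata changes the hash.
    return registry.get(bot_id) == fingerprint(presented_metadata)


meta = {
    "name": "acme-support",
    "vendor": "Acme Inc.",
    "endpoint": "https://bots.example/acme",
}
register("acme-support", meta)

print(verify("acme-support", meta))  # True
tampered = dict(meta, endpoint="https://evil.example")
print(verify("acme-support", tampered))  # False
```

A real ledger adds immutability and shared custody on top of this lookup, but the client-side check — recompute, compare, trust only on a match — is the same shape as the "bot certificate" Pitchamani describes.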

Although BotChain is still in beta, the company claims more than 150,000 chatbots currently run 4 billion conversations through the system per month. For a platform that doesn’t launch until July 17, these figures might sound exaggerated — especially in light of the fact that only 2 billion conversations total run through Facebook Messenger every month. But chatbots aren’t restricted to Messenger; they also connect through Slack, Skype, Microsoft Teams, and pretty much every other communication platform you can think of, including websites and text.

“Bots are gonna be the next paradigm shift,” says Nathan Shedroff, executive director of Seed Vault. “There’s gonna be the paradigm shift that is similar to what happened with the web and what happened when we moved from command line to graphic user interfaces.” As this shift occurs, Shedroff continues, “Authentication is gonna be a problem. Trust is gonna be a real issue.”

As a BotChain alternative, Seed Vault plans to offer developers a blockchain-based, open-source community. Right now, Shedroff says, the group is “concentrating on building some infrastructure that we think will create some standards and create trust, and create a platform for this [ecosystem] to grow well from.” In other words, by sharing code, Shedroff hopes developers will share security best practices as well. Early standards are currently available on Seed Vault’s GitHub, with a complete developer environment planned for launch in the second quarter of next year.

Needed: a platform for chatbot security

Whether the community embraces BotChain registration or follows Seed Vault’s less direct route, Rollins says the time for action is now, while the ecosystem around chatbots is still developing: “We have to think a lot about lock-in and what we're building on top of and then the immutability of the chain. It's very hard to take things back and to configure them once [they’re] out there.”

Once identity is solved, though, Pitchamani says all the other necessary pieces for chatbot security will follow: “Once we nail identity, I think, the rest of the thing can come through. For example, reputation can follow...Somebody can build like an audit and compliance service for enterprise use cases and stuff like that. We felt like bots needed some kind of standard, just like the way the rest of us have standards by which we follow. And we thought, ‘We can take the first steps and take an ecosystem with us.’”

Copyright © 2018 IDG Communications, Inc.
