by Anthony Caruana

AusCERT 2017 – The rise of the machines: AI and machine learning in infosec

Jun 05, 2017 | 4 mins
Access Control, Application Security, Data and Information Security

While AI and machine learning are buzzwords, Symantec’s Nick Savvides said during this year’s AusCERT conference that they have been a big deal in computing circles since the 1950s. But it was in the 1980s that AI entered mainstream thinking and culture. Movies like War Games and The Terminator, and TV shows like Knight Rider, took this important technology and moved it into mainstream consciousness.

Savvides pointed to KITT, the automotive star of Knight Rider, as an example of what AI might one day deliver.

“It had the ability to perceive, to provide constant analysis and make decisions,” said Savvides.

Machine learning differs from traditional programming, said Savvides. Whereas programs were traditionally developed as systems where data was provided to the compiled application, in machine learning systems, the data is part of the program.

In today’s driverless vehicles, which use machine learning, the software can recognise patterns and improve that ability over time. The data it works with is part of the program and is used to recognise a pattern and then carry out some further operation.

For example, by providing the computer in a car with braking distances based on speed, road conditions and tyre wear, it can automatically apply the brakes to avoid a collision.
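That kind of decision can be sketched in a few lines. The following is illustrative only – the numbers and the quadratic stopping-distance model are invented for the sketch, not taken from any real vehicle system: fit a simple model to observed stopping distances, then brake when the predicted distance (plus a margin) exceeds the gap to the obstacle ahead.

```python
# Illustrative sketch: learn a stopping-distance model from example data,
# then use it to decide whether to brake. All numbers are made up.

def fit_quadratic(speeds, distances):
    # Stopping distance grows roughly with the square of speed,
    # so fit distance = a * speed^2 by least squares (closed form).
    num = sum(d * s * s for s, d in zip(speeds, distances))
    den = sum((s * s) ** 2 for s in speeds)
    return num / den

def should_brake(a, speed_kmh, gap_m, margin=1.2):
    # Brake if the predicted stopping distance, padded by a safety
    # margin, exceeds the gap to the obstacle ahead.
    predicted = a * speed_kmh ** 2
    return predicted * margin >= gap_m

# "Training data": observed stopping distances (m) at various speeds (km/h).
speeds = [30, 50, 70, 90, 110]
distances = [6.5, 18.0, 35.0, 58.0, 87.0]

a = fit_quadratic(speeds, distances)
print(should_brake(a, 100, 60))  # → True: not enough room to stop
print(should_brake(a, 40, 60))   # → False: plenty of room
```

The point of the sketch is the one Savvides makes: the example data lives inside the program and drives its behaviour, rather than being fed to a fixed, pre-compiled rule.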

While many of the concepts developed and documented through academia and industry in the 1950s are still important today, Savvides says a fundamental shift took place in 2006.

“It was the democratisation of machine learning. It was when computing power became effectively cheap enough to run these algorithms.”

One of the stimuli, said Savvides, came from Netflix, which offered a million dollars to developers who could improve its recommendation engine. This drove significant development, he said.

“One of the big benefits is that it can act as a force multiplier,” said Savvides.

An example was the production of a trailer for the movie Morgan. While Savvides described this as a movie “you’d only watch on a plane” he noted that an algorithm was used to choose which scenes from the movie ought to be used in the trailer. The algorithm was “taught” to recognise action sequences and other elements that made a good trailer.

While people then reordered the scenes and added the soundtrack, this cut the production time for the trailer from about three months to 24 hours, including the human editing. When it comes to cyber-security, Savvides says there are several potential applications for machine learning and AI.

“The main application is in threat detection. Machine learning is very effective at being a threat detector. But it’s also very good at watching human behaviour – observing a user, building a constant record of their behaviours and a profile of what they’re doing. And finally, there’s anomaly detection – spotting when something doesn’t look right.”

With threat detection, machine learning can go from the end-point to the security operations centre. With a global shortage of security analysts, machine learning can help fill the gaps we traditionally filled with people, and it can free up the resources we have so they can work on higher-level tasks.

And these systems can provide information to other machines that carry out other specialised tasks. Or they can look at data being collected in other systems, detect patterns, and point humans towards targeted investigations.

As well as working with known threats, the ability to detect anomalies means machine learning can be used to detect previously unknown threats.
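A minimal sketch of that idea – using invented numbers and a deliberately simple z-score test, not anything Savvides described – flags behaviour that deviates sharply from a user’s baseline, with no signature of a known threat required:

```python
# Minimal anomaly-detection sketch: flag values that deviate strongly
# from the baseline using a z-score. The data is invented.
import statistics

def find_anomalies(values, threshold=2.5):
    # Return indices of values more than `threshold` standard
    # deviations from the mean of the series.
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    return [i for i, v in enumerate(values)
            if abs(v - mean) / stdev > threshold]

# Baseline: megabytes uploaded per hour by one user; hour 7 is far
# outside the user's normal profile.
uploads_mb = [12, 9, 15, 11, 10, 14, 13, 950, 12, 11]
print(find_anomalies(uploads_mb))  # → [7]
```

Real systems use far more robust statistics (a single extreme value inflates the standard deviation, which is why the threshold here is modest), but the principle is the same: the system learns what “normal” looks like and flags departures from it, known threat or not.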

One of the other areas Savvides highlighted is model extraction – an attack on machine learning systems themselves.

“This is where I feed data into the machine, and I see what comes back and learn the model. The intellectual property is the model. If I understand the model I can poison it,” said Savvides.
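The mechanics can be illustrated with a toy example. Here the “secret” model is just a single decision threshold, and a hypothetical attacker recovers it purely by sending inputs and observing labels – a stand-in for the query-and-learn attack Savvides describes:

```python
# Hypothetical sketch of model extraction: the "secret" model is a
# simple threshold classifier; by querying it, an attacker recovers
# the threshold (the intellectual property) without ever seeing the
# model's internals.

SECRET_THRESHOLD = 0.73  # unknown to the attacker

def black_box(x):
    # The only interface the attacker has: send an input, see a label.
    return "malicious" if x >= SECRET_THRESHOLD else "benign"

def extract_threshold(query, lo=0.0, hi=1.0, steps=30):
    # Binary search over the input space using only query responses.
    for _ in range(steps):
        mid = (lo + hi) / 2
        if query(mid) == "malicious":
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

stolen = extract_threshold(black_box)
print(round(stolen, 3))  # → 0.73
```

Once the model is known, an attacker can craft inputs that sit just on the “benign” side of the boundary, or feed poisoned training data to move the boundary itself – which is exactly the risk Savvides flags.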

This points towards new machine-learning-driven defensive models, in which machines are also used to turn an attack back on the attacker. For example, in a business email compromise attack – where a senior corporate officer is targeted through email to transfer funds to an unauthorised third party – a machine learning system could detect the attack and then respond using a bot, in the voice of the corporate officer, to help track the source of the fraud.

As for the future, Savvides says these systems will move from being reactive to being predictive, telling us that an attack is imminent. The challenge, he said, may come from our inability to understand how the computer came to its conclusions – and a resulting lack of trust in it.