How to handle security vulnerability reports

Jan 17, 2017

There are people out there willing to help with your company’s security issues. Isn’t it time your company had its own ‘see something, say something’ policy?


If there’s a flaw in your IT security — and there probably is — you can’t assume that someone in your organization will be the first to find it. But if you’re lucky, instead of ending up with ransomware or a data breach, you might hear about it from a security researcher or even a smart customer who’s spotted the problem and wants to warn you. Are you ready to listen?

Many companies aren’t, warns security consultant Troy Hunt. Hunt runs Have I Been Pwned, a website that helps people discover if any of their accounts have been compromised by data breaches. Because of his role with the website, he routinely finds himself in a position to contact organizations about breaches and other security issues that he’s found or that other people pass on to him.

“It’s often very difficult just to get in touch with a company in the first place — even the big ones. I’m going through multiple data breaches and I just can’t get the contacts,” he says. When he discovered that some 40,000 patient reports from an Indian pathology lab — including sensitive information like HIV status — were publicly available, the obvious ways of contacting the lab didn’t work. “Email was bouncing; even the WHOIS contact information for their domain was bouncing.”

Another security researcher had found a SQL injection bug on the website of a large grocery store. The site offered no contact details, so they used the online form for submitting issues. About 10 days later, they were notified that their ‘request’ had been forwarded to the store manager. Two days after that, they received a survey asking how they’d rate the customer service they’d received. At the time of writing, they still hadn’t managed to confirm that the grocery store had received the warning.

Slow responses aren’t uncommon, notes Hunt. “There are a lot of even large organizations that don’t have people in dedicated security roles and they don’t have a process for how they’re going to deal with this incident.” When he notified consultancy Capgemini that someone had found the database backups for their customer, recruitment firm Michael Page, in a publicly accessible folder share, “it took over a week for them to figure out what had happened, prepare a response and notify people.”

Other researchers — even those working at large software companies — report that companies they try to notify about issues ignore reports, or respond with legal threats.

A 2015 survey of the Forbes Global 2000 companies showed the scale of the problem: only 6 percent had a public method to report a security vulnerability. Yet it can be done: in November 2016, the U.S. Department of Defense posted its policy for how security researchers could submit vulnerabilities they’d discovered in DoD websites, with clear guidelines for what researchers could investigate and when they’d get a response. Isn’t it time your company had its own ‘see something, say something’ policy?

Process and point of contact

The first thing most security researchers wish organizations would do is tell them who to talk to. “What I’d like to see on every site is a contact; here’s who to contact if you have a security question,” says Hunt.

But you also need to respond sensibly, and that means having the right procedures in place. “I’d like to get to where every company has a published statement acknowledging that they’re going to have vulnerabilities and recognizing how they’re going to handle them. You need to be clear on how you accept feedback, how you handle feedback and how you handle disclosure.”
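One lightweight way to publish the contact and policy Hunt asks for — not mentioned in this article, but since standardized as RFC 9116 — is a security.txt file served at `/.well-known/security.txt`. The values below are placeholders, not a real deployment:

```
Contact: mailto:security@example.com
Expires: 2026-12-31T23:59:59.000Z
Policy: https://example.com/security-policy
Preferred-Languages: en
```

`Contact` and `Expires` are the required fields; `Policy` can point at the published vulnerability-handling statement Hunt describes.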

“It is important to establish a process and an ‘inbox’ so that every request or report doesn’t become a one-off request (or, worse, a fire drill),” says Dwayne Melancon, vice president for products at security software company Tripwire. “Create an inbound email address or alias that is designed to receive these kinds of reports, and a known, documented process for dealing with the inbound reports.”

You’ll need to triage the reports that come in: you want to know if they’re legitimate problems, how severe they are and what risk they expose you to. “You need to separate things that are clear and obvious security vulnerabilities — anything from SQL injection vulnerabilities to exposed files to improperly implemented HTTPS; they’re in a different realm from a customer complaining that your password rules don’t allow spaces,” says Hunt.

“A highly exploitable issue that would have a critical business impact should jump to the front of the remediation queue and engage your top security analysts,” Melancon advises. “In contrast, a question about your password policy should be routed to a customer-facing responder rather than a specialized security analyst.”
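The triage step Hunt and Melancon describe can be sketched as a simple routing rule. This is a hypothetical illustration — the category names, queues and severity labels are invented for the example, not taken from any real tooling:

```python
# Categories Hunt names as clear-cut security vulnerabilities.
SECURITY_CATEGORIES = {"sql_injection", "exposed_files", "broken_https"}

def route_report(category: str, exploitable: bool, impact: str) -> str:
    """Return the queue an inbound report should land in."""
    if category not in SECURITY_CATEGORIES:
        # e.g. "your password rules don't allow spaces":
        # route to a customer-facing responder, not a security analyst.
        return "customer_support"
    if exploitable and impact == "critical":
        # Highly exploitable, critical business impact:
        # jumps the remediation queue, engages top analysts.
        return "security_critical"
    return "security_backlog"
```

A password-policy complaint (`route_report("password_policy", False, "low")`) lands in `customer_support`, while an exploitable SQL injection with critical impact lands in `security_critical`.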

Don’t ignore even small issues, warns HackerOne co-founder, Michiel Prins. “Some of the biggest vulnerabilities are achieved by chaining together small and innocent vulnerabilities or missing best practices. My advice would be to investigate any claim of the existence of a vulnerability, even if it is around a missing best practice. If it is easy to implement, then why not do it? Defense in depth will pay off in the long term.”

Those password questions still need to get a good answer, with a sound security basis, but beware of doing that on social media, Hunt warns. “Many people like to just fire off a tweet or leave comments on Facebook pages, but the last thing you want is a security discussion on public social media. First of all, if you’ve got stupid password rules then you have an issue you need to deal with, and then the people controlling the social media account are often out of their depth. So many times, I see companies making bad statements under the corporate banner.”

Your social media staff will be turning to the company handbook to get an answer, so make sure it’s in there. “Then get the feedback offline and take them through your process.”

Secure by culture

Fostering a culture of security starts with not getting defensive when you get a report of a problem with your systems. “Even when it may be awkward in the beginning, show gratitude to the person who has the bravery to tell you about a security problem,” says Prins. “You must realize that this person comes to you in good faith and that a security vulnerability in the wrong hands could do serious damage to your business and brand. Keep the person who reported the vulnerability to you informed throughout the process all the way from receipt to validation to resolution.”

That’s part of having the right attitude and culture in not just your security team, but also your engineering team, Prins says. “Your engineering team is most likely responsible for fixing reported vulnerabilities. It is very likely they are also the group that introduced them in the first place. This is why the security team and engineering team need to have a healthy relationship. If engineering sees the security team as ‘oh, those guys again,’ you have a cultural and attitude problem to overcome first. If you have a dedicated product management team, you need to get them on board too.”

In fact, dealing with vulnerability reports needs to be part of the way you design, implement and test your systems, which includes making sure everyone knows when a system is being launched or updated, he says. “Especially engineering teams, product teams, legal team and your communications team. When the first vulnerability from an external party comes in, it should not be a surprise for anyone.”

“In short, develop a culture where everyone involved truly believes security is important and fixing an externally discovered security vulnerability gets priority over most other things.”

That culture will help the escalation process for serious vulnerabilities work well. “You need to have a smooth internal communication process, tight integration with the issue tracking software your engineering team uses, and maybe even appoint an individual who manages the internal coordination until a fix has been rolled out,” Prins says. “Some of the questions you need answered are: ‘who owns doing the root cause analysis?’, ‘who owns the development and deployment of the fix?’ and ‘who owns investigating if the security issue has been exploited maliciously before?’ For less impactful security issues, you can build upon your existing prioritization and escalation process for fixing bugs.”

If you’re looking for a formal method, surprisingly few people know that there are ISO standards for vulnerability disclosure (ISO 29147) and vulnerability handling processes (ISO 30111). The first is now free to download and gives you best practices for receiving vulnerability reports and responding to them. If you want to see how well equipped you are for that, work through the questions in the Vulnerability Coordination Maturity Model, which covers executive support, communication, analytics and incentives as well as IT and engineering.

Contributing writer

Mary Branscombe is a freelance journalist who has been covering technology for over two decades and has written about everything from programming languages, early versions of Windows and Office and the arrival of the web to consumer gadgets and home entertainment.
