3 DevSecOps success stories

Sep 26, 2019 | 10 mins
Application Security | Security | Software Development

Learn how three very different companies — Microsoft, Verizon and the Pokémon Company — got their development and security teams to work together smoothly.

Credit: Dean Mitchell / Getty Images

The relationship between development and security teams is often contentious. Security might see developers as a liability when it comes to protecting data and systems, and developers often view security as a disruption to their workflow.

Both parties are right if the organization in which they work fails to create an environment of collaboration and shared goals between development and security. Without that kind of culture, the two groups will inevitably be at odds with one another.

DevSecOps is an approach where security becomes an integral part of the development process. It requires developers and security personnel to have a mutual understanding and respect for what the other group does. A successful DevSecOps process reduces stress on both teams and avoids vulnerabilities being inadvertently built into the code.

The three companies profiled here — Microsoft, Verizon and the Pokémon Company — have very different business models and security needs. However, they all benefited from taking a DevSecOps approach to their internal development process.

Verizon developer dashboard provides vulnerability visibility

Verizon IT’s AppSec team needed a way to facilitate secure DevOps practices as it moved to the cloud. They also wanted to drive a culture change within the company. “We needed something that is more sustainable that can help us build a larger influence of our centralized team, and at the same time, not burn the IT application team by keep dumping more work on their to-do list,” explains Manah Khalil, IT director of application security. 

To accomplish those goals, they adopted a DevSecOps approach, but they still had to convince developers to accept it. To help with that and nurture a security culture, Verizon created the developer dashboard program. It combines technical aspects of vulnerability management with individual accountability to help instill a security mindset among the company’s developers.

The developer dashboard is a centralized, real-time record of how vulnerabilities are introduced into applications across Verizon’s business. It tracks scanning frequency and results, as well as the types and density of vulnerabilities (measured per 10,000 lines of code) in each of the 2,100 applications being monitored. It also shows where in the development lifecycle each vulnerability was introduced and by whom.
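Verizon hasn’t published how its dashboard is built, but the density metric it describes is straightforward to express. A minimal sketch, with all names hypothetical, might look like this:

```python
# Hypothetical sketch of the vulnerability-density metric described above:
# findings per 10,000 lines of code, tracked per application. Verizon's
# actual implementation is not public; everything here is illustrative.

def vulnerability_density(vuln_count: int, lines_of_code: int) -> float:
    """Vulnerabilities per 10,000 lines of code."""
    if lines_of_code <= 0:
        raise ValueError("lines_of_code must be positive")
    return vuln_count / lines_of_code * 10_000

# Example: 12 findings in a 150,000-line application
print(round(vulnerability_density(12, 150_000), 2))  # 0.8
```

Normalizing per 10,000 lines lets a small utility and a million-line application be compared on the same scale, which is what makes the metric usable on a single dashboard.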

“Typically, in a software, you can measure the number of vulnerabilities, you can measure the density, but how does it relate to the culture?” says Khalil. “You have so many dashboards out there: dashboards that are looking at your build quality, at your code quality, how often are you generating new builds, testing it, deploying, etc,” he says. “But those are more meant as a to-do list. We’re trying to use this as a way to look for the change and the culture change.”

The dashboard pulls information from a variety of sources including asset management, the learning management system (LMS), version control, code analysis, integrated development environment (IDE) tools, third-party scanning tools, configuration data, and web and firewall logs. “Everything we built in the developer dashboard was designed so that we can make a change and then quantify that change,” says Khalil.

The developer dashboard provides a centralized view of Verizon’s vulnerability risk and gives near real-time feedback to developers on the risks they may be introducing to the business. It is also designed to help drive longer-term change for both the individual and the organization.

Lesson learned: Look for ways to increase ownership and opportunities for developers to get better. Ordering developers to fix vulnerabilities is not a sustainable approach and only helps you in the short term. Giving the individual developer immediate feedback on an ongoing basis empowers them to find ways to improve their skills.

Pokémon Company embraces security by design to protect children’s privacy

The runaway success of the Pokémon Go mobile app in 2016 created a liability issue for the Pokémon Company. With 800 million downloads—many of them by children—the company knew it not only had to comply with privacy standards like the EU’s General Data Protection Regulation (GDPR), but also show worried parents that it could be trusted with their children’s personal data. That meant creating a security culture for the entire organization, including development.

Pokémon Go was developed by Google spin-off Niantic. The two companies share responsibility for how the game operates. “They control what runs on the application, and then they control a certain percentage of the backend, and we control a bit of the backend for Child Online Privacy Protection Act (COPPA) compliance,” says John Visneski, director of information security and data protection officer at The Pokémon Company International.

However, that’s just one part of the data puzzle. In addition to Pokémon Go, the company holds user data from its other applications, its in-person tournaments for the physical trading card game, some of the video games, and more.

“The goal isn’t for me to say yes we can collect something or no we can’t collect that,” says Visneski. “You have to raise the security culture of your entire organization. Eventually, people become privacy and security experts and they won’t even realize it.”

Visneski feels this mantra of not simply being a naysayer means security is held in higher regard by the rest of the business, and is more involved as a result. “Our philosophy when it comes to security is that we’re business enablers first and security professionals second,” he says.

“What we try to do is make sure that the first question we ask ourselves isn’t about risk; it isn’t about threats or fancy tools. What we think about first is ‘what does the business need, what are the business objectives, and how is our technology arm going to make the business be more effective and efficient?’,” he says. 

Adopting this mindset, Visneski argues, prevents security being seen merely as the people telling the business what it can’t do. Security, then, becomes a team the business wants in meetings to help the business achieve its goals.

One example of being an enabler is the company’s use of Sumo Logic. The log management and analytics provider was brought in primarily for its security analytics capabilities, but is now widely used across the business, including DevOps. “Our second biggest power user is someone that works in our games studio. That allowed us to sit at that nexus of how to integrate data securely across the company in order to enable the business,” says Visneski.

Lesson learned: Security needs to be seen by developers as an enabler and partner. If both security and development teams have a “what’s best for the business” mindset, then they are more likely to be in sync during the development and vulnerability testing processes. 

Sharing information, best practices brings development and security together at Microsoft

Software giant Microsoft believes it has achieved a common purpose between its development and security operations, and that this shared purpose has resulted in better security for both its internal and commercial software and services.

Microsoft’s approach is simple and is based on good, consistent training and communication. Executing that approach is not so simple. It requires buy-in from both groups, ongoing training, effective communication and, importantly, a strong endorsement from executive management.

“Originally, we had the secure development lifecycle, which was really the philosophy of how to do threat modeling and ensure code quality in the boxed products we ship. Then you go to 2008, where we started doing online services,” says Microsoft CISO Bret Arsenault. “Now our security engineering teams are doing operational security as well as service and product security. We review that with every team every month and make sure we’re there.”

Those teams share security best practices at what Microsoft calls Red Zone meetings. “We’re leveraging each other’s [best practices], both technology, learnings and capability,” says Arsenault.

A lack of understanding and poor communication can doom the relationship between security and development teams. The two groups need to share knowledge, and they need to feel empowered to help each other achieve shared goals. To that end, Microsoft has created an ongoing Strike training program. “How you change and drive a culture – change the DNA – is through education and a set of behaviors and measurement,” says Bharat Shah, vice president for security engineering in Microsoft’s cloud and AI division.

Microsoft looks at that change from three perspectives. First is training for all employees through the standards of business conduct, which Arsenault says always includes security training. The next level is what Microsoft calls “security foundations,” and it allows them to address security for all employees in greater depth.

The third level adds Strike training designed exclusively for Microsoft engineers. It is closed-session training that walks them through what threat actors are doing and helps them understand the global threat landscape.

These Strike sessions train developers and engineers to understand the reasons behind Microsoft’s security practices, the techniques and tactics hackers use, and the engineering tools available to them. The goal is to help them build a network of peers and resources they can leverage to ensure that security is built into everything they do. 

“How do you make sure you’re doing the right thing with code, with identity, with secrets, with all the other things?” says Arsenault. “We’re very prescriptive.”

While Arsenault sees training as a critical factor in bringing IT and security together, what makes it all work are the monthly reviews with the operations teams, how Microsoft manages and assesses risk, and how that risk assessment is reported from the engineers up to the board and risk management council.

“Once a month, my boss reviews the security scorecards with each of these directs and pushes things that you think are important for the team,” says Shah. “That is a culture that is top-down. Training is super awesome at the bottom-up level.”

Security assurance specialists on Shah’s team see the mistakes and compromises in the code and look at all the code reviews. They then pass what they learn to the rest of the engineering team. In some cases, this has helped “eliminate a whole class of bugs,” says Shah. “Occasionally, what we learn, we push it back into our tools group or into our compiler group, into even things like static analysis, to just catch these things at scale.”

A large part of what Shah and his team do is build security services for Microsoft’s engineers, allowing them to “blend” security into their engineering processes and systems. “Number one, we build these high-scale services that help our engineers get security right.”

One of those services addresses vulnerability management and scanning. “Azure runs across more than 90 data centers, millions of VMs,” says Shah. “We don’t have the option of scanning a VM at a time, so we’ve built a large-scale vulnerability scanning infrastructure, where we can scan twice, thrice, or even four times a day, if we have to, at scale.” This allows Microsoft’s engineers to quickly find and fix unpatched VMs or services.
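Microsoft hasn’t published how that scanner works, but the fan-out pattern Shah describes — scanning a large fleet in parallel rather than one VM at a time — can be sketched roughly like this. The scan itself is stubbed out, and every name here is illustrative:

```python
# Illustrative sketch only: fan-out scanning of a VM fleet in parallel,
# rather than one VM at a time. The per-VM scan is a stub; Microsoft's
# real scanning infrastructure is not public.
from concurrent.futures import ThreadPoolExecutor

def scan_vm(vm_id: str) -> dict:
    """Stub for a single-VM vulnerability scan."""
    # A real scanner would check patch levels, open ports, known CVEs, etc.
    # Here we pretend VMs whose IDs end in "7" are missing a patch.
    return {"vm": vm_id, "unpatched": vm_id.endswith("7")}

def scan_fleet(vm_ids: list[str], workers: int = 64) -> list[dict]:
    """Scan the whole fleet concurrently; return only findings needing fixes."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(scan_vm, vm_ids))
    return [r for r in results if r["unpatched"]]

fleet = [f"vm-{i:04d}" for i in range(1000)]
print(len(scan_fleet(fleet)))  # 100 VMs flagged in this toy fleet
```

Running scans concurrently and surfacing only the failing results is what turns “millions of VMs, several times a day” from impossible into a queueing problem.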

The engineers on Shah’s team build large-scale services just as engineers in other Microsoft groups do. “In that sense, we have a better sense of empathy when security becomes a little bit of a pain in the neck.” That, along with what security shares with them about threat risk, gives them needed insight into why they need to develop with security in mind.

Lesson learned: The more security and development understand about what the other group does, the more empathetic and collaborative they will be during the development process. This will lead to fewer vulnerabilities in the final product and faster fixes.