The relationship between development and security teams is often contentious. Security might see developers as a liability when it comes to protecting data and systems, and developers often view security as a disruption to their workflow.

Both parties are right if the organization in which they work fails to create an environment of collaboration and shared goals between development and security. Without that kind of culture, the two groups will inevitably be at odds with one another.

DevSecOps is an approach where security becomes an integral part of the development process. It requires developers and security personnel to have a mutual understanding of and respect for what the other group does. A successful DevSecOps process reduces stress on both teams and avoids vulnerabilities being inadvertently built into the code.

The three companies profiled here (Microsoft, Verizon and the Pokémon Company) have very different business models and security needs. However, they all benefited from taking a DevSecOps approach to their internal development processes.

Verizon developer dashboard provides vulnerability visibility

Verizon IT's AppSec team needed a way to facilitate secure DevOps practices as it moved to the cloud. The team also wanted to drive a culture change within the company. "We needed something that is more sustainable, that can help us build a larger influence of our centralized team, and at the same time not burn the IT application team by keep dumping more work on their to-do list," explains Manah Khalil, IT director of application security.

To accomplish those goals, the team adopted a DevSecOps approach, but it still had to convince developers to accept it. To help with that and nurture a security culture, Verizon created the developer dashboard program.
It combines technical aspects of vulnerability management with individual accountability to help instill a security mindset among the company's developers.

The developer dashboard is a centralized, real-time record of how vulnerabilities are introduced into applications across Verizon's business. It tracks scanning frequency and results, as well as the types and density of vulnerabilities (measured per 10,000 lines of code) within any one of the 2,100 applications being monitored. It also shows where in the development lifecycle a vulnerability was introduced and by whom.

"Typically, in software, you can measure the number of vulnerabilities, you can measure the density, but how does it relate to the culture?" says Khalil. "You have so many dashboards out there: dashboards that are looking at your build quality, at your code quality, how often are you generating new builds, testing it, deploying, etc.," he says. "But those are more meant as a to-do list. We're trying to use this as a way to look for the change and the culture change."

The dashboard pulls information from a variety of sources including asset management, the learning management system (LMS), version control, code analysis, integrated development environment tools, third-party scanning tools, configuration data, and web and firewall logs.

"Everything we built in the developer dashboard was designed so that we can make a change and then quantify that change," says Khalil.

The developer dashboard provides a centralized view of Verizon's vulnerability risk and gives near real-time feedback to developers on the risks they may be introducing to the business. It is also designed to drive longer-term change for both the individual and the organization.

Lesson learned: Look for ways to increase ownership and opportunities for developers to get better.
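As an aside on the mechanics: the density figure Verizon tracks, vulnerabilities per 10,000 lines of code, is a simple normalization that makes applications of very different sizes comparable on one dashboard. A minimal sketch of that calculation (the function name and example numbers are illustrative, not Verizon's actual implementation):

```python
def vulnerability_density(vuln_count: int, lines_of_code: int) -> float:
    """Return vulnerability density as findings per 10,000 lines of code.

    Normalizing by code size lets a 20,000-line service and a
    2-million-line application be compared on the same scale.
    """
    if lines_of_code <= 0:
        raise ValueError("lines_of_code must be positive")
    return vuln_count / lines_of_code * 10_000


# Example: 12 open findings in a 250,000-line application
print(round(vulnerability_density(12, 250_000), 2))  # 0.48 per 10k LOC
```

Tracked over time per team or per developer, a falling density is the kind of quantifiable culture change Khalil describes.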
Ordering developers to fix vulnerabilities is not a sustainable approach and only helps you in the short term. Giving the individual developer immediate feedback on an ongoing basis empowers them to find ways to improve their skills.

Pokémon Company embraces security by design to protect children's privacy

The runaway success of the Pokémon Go mobile app in 2016 created a liability issue for the Pokémon Company. With 800 million downloads, many of them by children, the company knew it not only had to comply with privacy standards like the EU's General Data Protection Regulation (GDPR), but also show worried parents that it could be trusted with their children's personal data. That meant creating a security culture for the entire organization, including development.

Pokémon Go was developed by Google spin-off Niantic, and the two companies share responsibility for how the game operates. "They control what runs on the application, and then they control a certain percentage of the backend, and we control a bit of the backend for Children's Online Privacy Protection Act (COPPA) compliance," says John Visneski, director of information security and data protection officer at the Pokémon Company International.

However, that is just one part of the data puzzle. As well as Pokémon Go, the company has user data from its other applications, its in-person tournaments for the physical trading card game, some of the video games, and more.

"The goal isn't for me to say yes we can collect something or no we can't collect that," says Visneski. "You have to raise the security culture of your entire organization. Eventually, people become privacy and security experts and they won't even realize it."

Visneski believes this mantra of not simply being a naysayer means security is held in higher regard by the rest of the business, and it is more involved as a result.
"Our philosophy when it comes to security is that we're business enablers first and security professionals second," he says.

"What we try to do is make sure that the first question we ask ourselves isn't about risk; it isn't about threats or fancy tools. What we think about first is 'what does the business need, what are the business objectives, and how is our technology arm going to make the business more effective and efficient?'," he says.

Adopting this mindset, Visneski argues, prevents security from being seen merely as the people telling the business what it can't do. Security then becomes a team the business wants in meetings to help it achieve its goals.

One example of being an enabler is the company's use of Sumo Logic. The log management and analytics provider was brought in primarily for its security analytics capabilities, but it is now widely used across the business, including DevOps. "Our second-biggest power user is someone that works in our games studio. That allowed us to sit at that nexus of how to integrate data securely across the company in order to enable the business," says Visneski.

Lesson learned: Security needs to be seen by developers as an enabler and partner. If both security and development teams have a "what's best for the business" mindset, they are more likely to be in sync during the development and vulnerability testing processes.

Sharing information and best practices brings development and security together at Microsoft

Software giant Microsoft believes it has achieved a common purpose between its development and security operations, and that this shared purpose has resulted in better security for both its internal and commercial software and services.

Microsoft's approach is simple and is based on good, consistent training and communication. Executing that approach is not so simple.
It requires buy-in from both groups, ongoing training, effective communication and, importantly, a strong endorsement from executive management.

"Originally, we had the secure development lifecycle, which was really the philosophy of how to do threat modeling and ensure code quality in the boxed products we ship. Then you go to 2008, where we started doing online services," says Microsoft CISO Bret Arsenault. "Now our security engineering teams are doing operational security as well as service and product security. We review that with every team every month and make sure we're there."

Those teams share security best practices at what Microsoft calls Red Zone meetings. "We're leveraging each other's [best practices], both technology, learnings and capability," says Arsenault.

A lack of understanding and poor communication can doom the relationship between security and development teams. The two groups need to share knowledge, and they need to feel empowered to help each other achieve shared goals. To that end, Microsoft has created an ongoing Strike training program.

"How you change and drive a culture – change the DNA – is through education and a set of behaviors and measurement," says Bharat Shah, vice president for security engineering in Microsoft's cloud and AI division.

Microsoft looks at that change from three perspectives. First is training for all employees through the standards of business conduct, which Arsenault says always includes security training. The next level is what Microsoft calls "security foundations," which addresses security for all employees in greater depth.

The third level adds Strike training, designed exclusively for Microsoft engineers.
It is closed-session training that walks them through what threat actors are doing and helps them understand the global threat landscape.

These Strike sessions train developers and engineers to understand the reasons behind Microsoft's security practices, the techniques and tactics hackers use, and the engineering tools available to them. The goal is to help them build a network of peers and resources they can leverage to ensure that security is built into everything they do.

"How do you make sure you're doing the right thing with code, with identity, with secrets, with all the other things?" says Arsenault. "We're very prescriptive."

While Arsenault sees training as a critical factor in bringing IT and security together, what makes it all work is the monthly reviews with the operations teams, how Microsoft manages and assesses risk, and how that risk assessment is reported from the engineers up to the board and the risk management council.

"Once a month, my boss reviews the security scorecards with each of these directs and pushes things that you think are important for the team," says Shah. "That is a culture that is top-down. Training is super awesome at the bottom-up level."

Security assurance specialists on Shah's team see the mistakes and compromises in the code and look at all the code reviews. They then pass what they learn to the rest of the engineering team. In some cases, this has helped "eliminate a whole class of bugs," says Shah. "Occasionally, what we learn, we push it back into our tools group or into our compiler group, into even things like static analysis, to just catch these things at scale."

A large part of what Shah and his team do is build security services for Microsoft's engineers, allowing them to blend security into their engineering processes and systems.
"Number one, we build these high-scale services that help our engineers get security right."

One of those services addresses vulnerability management and scanning. "Azure runs across more than 90 data centers, millions of VMs," says Shah. "We don't have the option of scanning a VM at a time, so we've built a large-scale vulnerability scanning infrastructure, where we can scan twice, thrice, or even four times a day, if we have to, at scale." This allows Microsoft's engineers to quickly find and fix unpatched VMs or services.

The engineers on Shah's team build large-scale services just as engineers in other Microsoft groups do. "In that sense, we have a better sense of empathy when security becomes a little bit of a pain in the neck," says Shah. That empathy, along with what security shares about threat risk, gives engineers needed insight into why they should develop with security in mind.

Lesson learned: The more security and development teams understand about what the other group does, the more empathetic and collaborative they will be during the development process. That leads to fewer vulnerabilities in the final product and faster fixes.