The purported veering of a jetliner caused by an onboard hacker points to a larger problem, experts say – airlines and other service providers may be blind to the value such security researchers can offer in the name of public safety.

While it's far from clear that security researcher Chris Roberts actually did commandeer the avionics system of an airplane and force it to steer to one side, the story is prompting other security experts to call for better cooperation between white-hat hackers and the industries whose infrastructures they probe.

Airlines have to get used to the idea that this type of hacking can be useful, and ought to make test environments that simulate aircraft systems available for researchers to hack against, says Jeremiah Grossman, the founder of WhiteHat Security.

There is an emotional reaction on the part of corporate executives against admitting their systems have weaknesses, he says, but as their view matures, they become more cooperative with the research community. "Their first reaction is only authorized people can test their network. Google got over it. Facebook got over it," he says.

He points to his own experience in 2000 with Yahoo, when he hacked his own Yahoo Mail account – an exploit he could have carried out against anyone else's account. But instead he told the company, and it not only worked with him to resolve the issue without penalizing him, it gave him a job, he says.

But that's not always the case, and hackers with the best of intentions are warned against probing for flaws without permission. The first thing Grossman tells researchers is, "Never touch or test anything when you don't have express written consent or own it.
Otherwise you're at the whim of the target."

If that whim is to call in law enforcement, the consequences can be dire not only for the hacker personally, but also for general public safety, says Josh Corman, CTO of Sonatype and founder of I Am The Cavalry, a grassroots group "focused on issues where computer security intersects public safety and human life."

The group has been working closely with the auto industry for more than a year to build enough trust between hackers and car manufacturers that they can work toward safer vehicles, he says. And it is hoping to do the same with airlines and medical-device manufacturers.

A big hurdle is that leaders in the industries that need protection are not well versed in the nature of cyber security. Their worlds are governed by static sets of facts and principles of science, like those used by the engineers who make their products. "Cyber lacks physics," Corman says, which makes security an elusive target. "Assumptions move or are incomplete or are no longer sound." It's a slow process to get industries first to understand the difference and then to build enough trust with security researchers that they can work together toward safer products and services, he says.

The problem is compounded by these industries wanting to project an image of safety and security to the public, says Paul Kocher. For example, services that store data online want customers to trust that their data is safe. "If I'm operating a service, my financial interest is usually in trying to make my customers feel comfortable, which is not necessarily to disclose accurately what the risks are," he says. "So they are understandably reluctant to allow hackers to test their systems, particularly if the flaws are revealed publicly, even if finding vulnerabilities and fixing them is desirable.
"If you want to have researchers providing this level of authentic, unfiltered information – at least when people are doing things badly – it's going to be challenging," he says.

He thinks there are black and white areas about what is and is not appropriate for hackers to do, and then there is an enormous gray area where things are not so clear. His company tries to stay "very, very far into that white zone, but I recognize the challenges. The question of messing with a flying plane's avionics is pretty clearly for me in the black area. I see a lot of places where there's a big debate, but I don't see this as one where the shades of gray are as nuanced as they will be in future situations that will come along."

One such area is medical devices, which must receive lengthy FDA approval. When changes are made, devices must be reapproved. "But that doesn't fit very well into your zero-day response cycles," Kocher says. "How do you patch the Linux install running inside your implantable medical device? We haven't worked these things out yet, and perhaps we never will. It's just one of these horribly messy problems that we'll be struggling with for a long time."

But that doesn't mean hackers and industry shouldn't come to agreement, Corman says. Constant testing of existing systems is necessary, even if they have previously been deemed secure. For example, the Shellshock bugs lived in the Unix Bash shell from 1989 to 2014 before they were exposed. Similarly, the Heartbleed bug existed in the OpenSSL library from 2011 to 2014.

Hostile reactions to responsible bug disclosures by altruistic hackers could hurt the security of vital systems, he says. "We have a role to play and are willing and able to help," he says. If white-hat hackers face legal consequences for finding flaws in avionics networks, for example, they may abandon studying them.
"They can easily research something less risky and not run afoul of the law," he says. "If it's too risky or politically loaded, who in their right mind would do it?"

One obvious answer: the black-hat hackers.