Did FBI v. Apple – the most high-profile conflict to date over government access to private, encrypted data – change the world of digital surveillance and personal privacy?

With the first anniversary of the court fight over unlocking a terrorist's iPhone approaching, the answer, both from the evidence and from various stakeholders, is: Not much – yet.

According to experts, it did not significantly increase law enforcement surveillance, but it certainly didn't curb it, either. Nor did it lead to any major legal precedents, through either legislation or Supreme Court rulings.

Some of those may be coming, however, possibly this year. Congressional committees have studied and reported on the issue. Draft legislation focused on it is in the works. And there are ongoing conflicts – legal, legislative and philosophical – over whether forcing private firms to grant government access to data for criminal investigations or surveillance can be done without eroding personal privacy and civil rights.

There is also the arrival of President Donald Trump, who has not promised any executive orders on the matter, but did famously call for a boycott of Apple when the company refused to comply with the FBI's demand.

And there is the march of technology – as a number of experts have pointed out, there are dozens of messaging apps that use strong encryption, which could make creating a backdoor to unlock a phone somewhat irrelevant.

But at a minimum, the case intensified the debate over whether there is a way for competing privacy and government public safety interests to coexist.

The obvious reason FBI v.
Apple didn't set any precedents is that it never got legally resolved. Just before a scheduled hearing on the case, the FBI withdrew, saying it had found a vendor that was able to unlock the phone.

In the view of some privacy advocates, that's unfortunate, because they think the FBI would have lost. Nate Cardozo, senior staff attorney with the Electronic Frontier Foundation (EFF), who called the FBI complaint "wild overreach," is one of them. He said the FBI knew its case was weak.

"The fact that the government pulled the plug on the litigation on the literal eve of the hearing speaks volumes about the strength of its legal argument. Instead of risking binding precedent on our side, the government blinked," he said.

That is also the view of Greg Nojeim, director of the Freedom, Security and Technology Project at the Center for Democracy & Technology (CDT).

"The fact that the government was able to access the content it sought without Apple's assistance undermined its argument that it needed to be able to compel Apple and other providers for the assistance they sought," he said.

A bit of background

By now, the basics of the case are well known: On Dec. 2, 2015, Syed Rizwan Farook and his wife, Tashfeen Malik, killed 14 people and wounded 22 in a shooting rampage in San Bernardino, Calif. Both were then killed in a shootout with police.

About two months later, on Feb.
9, the FBI announced that it had been unable to unlock Farook's employer-issued iPhone 5C, and demanded that Apple provide a way to do it.

A week after that, a federal magistrate upheld the demand, ordering Apple to disable the security feature that wipes the data on the phone after more than 10 unsuccessful attempts to guess the passcode.

Apple appealed. CEO Tim Cook said the company didn't have the capability to defeat that feature, and that if it were forced to create a "backdoor" into the device, it would amount to "the software equivalent of cancer," endangering the privacy and security of hundreds of millions of iPhone users throughout the world.

More than 30 tech companies, including Google, Facebook and Twitter, along with a host of privacy and civil rights advocates, supported the appeal.

But 43 days after the original court order, the FBI withdrew its complaint, saying it had hired a vendor that was able to break into the phone. The agency refused to name the vendor or say what method had been used to hack the phone.

That, of course, didn't make the issue go away. While no case since has generated that level of publicity, the conflicts continue. Last October, after a mass stabbing at a Minnesota mall linked to the terrorist group ISIS, the FBI said it was seeking to unlock the iPhone of the attacker, Dahir Adan. That potential conflict never made it to the courts.

The same is true of a standoff between the government and WhatsApp, the world's largest mobile messaging service, which is owned by Facebook.
Within the past 18 months, it began encrypting communications, which meant the Justice Department couldn't eavesdrop on them, even with a judge's wiretap order. Cardozo said that, as far as he knows, that case is currently dormant.

Of course, unlocking or decrypting devices are not the only forms of government surveillance that remain contentious and unsettled. One is the use by police departments, for more than a decade, of the Stingray – a device that "impersonates" a cell tower and thereby monitors cell phone traffic in a given area.

The manufacturer, Harris Corp., has fought to keep information about the device secret, arguing that disclosure would help criminals. But its use can amount to mass surveillance without a warrant – police departments frequently deploy it without one, and it gathers information from any users in the area of the device.

There is also the change to Rule 41 of the Federal Rules of Criminal Procedure, which took effect this past Dec. 1 and allows any US judge – even a magistrate – to issue search warrants that give the FBI and other law enforcement agencies the authority to hack multiple computers remotely in any jurisdiction, including outside the United States.

But, as Cardozo noted, while concerning, those forms of surveillance are different from the Apple case in that they don't require a company to "subvert its security."

And a high-intensity battle over that may be in the works. In Congress, less than a month after the FBI withdrew its complaint, Senate Select Committee on Intelligence Chairman Richard Burr (R-N.C.) and Vice Chairman Dianne Feinstein (D-Calif.) issued a draft of what they labeled the "Compliance With Court Orders Act of 2016."

The draft, which declared that "all entities must comply with court orders to protect Americans from criminals and terrorists," was never filed as actual legislation.
But it would have required that "covered entities" – defined to include "device manufacturers, software manufacturers, electronic communication services, remote communication services, providers of wire or electronic communication services, providers of remote communication services, or any person who provides a product or method to facilitate a communication or to process or store data" – comply with court orders to turn over data in an intelligible format.

Feinstein said at the time, in a press release, that the proposal was not intended to undermine privacy, but simply to protect the public. "We need strong encryption to protect personal data, but we also need to know when terrorists are plotting to kill Americans," she said.

But the draft drew broad and loud condemnation from privacy advocates and technology experts, who said requiring a backdoor into devices would undermine security for all devices. Julian Sanchez, founding editor of Just Security and a senior fellow at the Cato Institute, called it "insanely misguided."

Backdoor for the good guys?

John Verdi, vice president of policy at the Future of Privacy Forum (FPF), was among hundreds of critics who said it couldn't work. "Some have argued that firms like Apple can create backdoors that allow the good guys to access data, but prevent access by bad actors. Unfortunately, this isn't possible," he said.

And John Bambenek, threat systems manager at Fidelis Cybersecurity, noted that most tech companies are global.
If they are forced to provide a backdoor for US intelligence or law enforcement, it could then be used by "less-friendly jurisdictions that may have their own motivations."

While there have been reports since last fall that a revised version of Burr-Feinstein may be filed this year, the logistics are not clear, since Feinstein has moved to the Judiciary Committee.

"Burr-Feinstein will be reintroduced," said Paul Rosenzweig, founder of Red Branch Consulting and a former deputy assistant secretary for policy at the Department of Homeland Security. "But with Feinstein at Judiciary now, the exact structure will be different."

A report from another congressional group has received a warmer reception. The Encryption Working Group of the House Judiciary Committee and the House Energy and Commerce Committee issued its report in December, which included the following "observations":

- Any measure that weakens encryption works against the national interest.
- Encryption technology is a global technology that is widely and increasingly available around the world.
- The variety of stakeholders, technologies and other factors create different and divergent challenges with respect to encryption and the "going dark" phenomenon, and therefore there is no one-size-fits-all solution to the encryption challenge.
- Congress should foster cooperation between the law enforcement community and technology companies.

Bambenek called the report "bang on," especially with regard to weakening encryption, because doing so would "work against the national interest."

Of course, any legislation that results will depend in large measure on how the various stakeholders define "cooperation."

There is also a bill in the works from Rep. Michael McCaul (R-Texas), who chairs the House Homeland Security Committee, and Sen.
Mark Warner (D-Va.), that would create a 16-member "Encryption Commission" to report on how the conflicts might be resolved. It would include tech industry executives, privacy advocates, cryptologists, law enforcement officials and members of the intelligence community.

But the EFF opposes it, arguing that the "questions" the commission would address have already been answered.

They haven't been answered to the satisfaction of all parties, of course. As Nojeim put it, "companies and law enforcement are trying to adapt to new technology, and there is no road map for how that should best be done."

Establishing that road map will inevitably be contentious. Bruce Schneier, CTO of Resilient Systems and an encryption expert who blogs frequently on the topic, wrote in a post last month that "there will be more government surveillance and more corporate surveillance. I expect legislative and judicial battles along several lines: a renewed call from the FBI for backdoors into encryption, more leeway for government hacking without a warrant, no controls on corporate surveillance, and more secret government demands for that corporate data.

"And if there's a major terrorist attack under Trump's watch, it'll be open season on our liberties. We may lose a lot of these battles, but we need to lose as few as possible and as little of our existing liberties as possible," he wrote.