While the voluntary code on data collection appears extensive, privacy advocates say it favors business interests over those of consumers.

A federal effort to get industry and consumer advocates to agree on a privacy code of conduct for mobile app developers is being lambasted for falling far short of protecting personal information.

The National Telecommunications and Information Administration, an agency within the Commerce Department, announced Thursday that app makers and advertisers could start testing the voluntary code to determine how it would affect business models. The test followed more than a year of haggling among government, industry and privacy advocates.

Companies adopting the proposal would agree to use standardized onscreen notices detailing data collection in smartphone and tablet apps. The notices would cover a variety of information, including biometrics; browser history; phone or text logs; contacts; financial information; health, medical or therapy information; location data; and user files, such as photos or videos.

The code would also require disclosure of whether data is shared with ad networks, carriers, consumer data resellers, data analytics providers, government entities, other apps or social networks.

While the code’s requirements appear extensive, some privacy advocates said it heavily favored business interests over those of consumers, an allegation industry denied.

“I think it’s a farce, but a tragic one,” said Jeffrey Chester, executive director of the Center for Digital Democracy in Washington, D.C. The CDD pushed unsuccessfully for independent testing to see whether the code’s requirements were effective in helping consumers understand what was collected and how it was used.

While privacy advocates could conduct their own tests, Chester said most of the organizations did not have the money to hire professionals.

Sarah Hudgins, director of public policy for the Interactive Advertising Bureau, said trade groups representing advertisers and developers were not against independent testing. However, because each product is unique, companies believed they were in the best position to do it.

“I look at the sophisticated people that are in product development at many of our companies, and they frankly have the best expertise to do [the testing],” Hudgins said.

The Consumer Federation of America also denounced the code and the process used by the NTIA, saying they were “seriously flawed.” For example, app makers would not have to disclose whether data was being shared with third parties, such as social networks and ad networks, if those parties are part of the same corporate structure as the developers, the CFA said in a statement.

Critics also complained that developers and advertisers were not first required to explain in detail the data collected, how it is shared and with whom. In addition, there was no legal framework on which to build the code, so terms, even ones as simple as “user data,” were not universally understood.

“The NTIA acted like a surgeon who operates without first conducting an examination of the patient,” Chester said.

The IAB contended that the industry could not list all the data that advocates wanted upfront. “There’s no reasonable way to identify every single data element that could ever be collected by any type of app in the universe,” Hudgins said. “That is not a realistic list that one can make.” Instead, the industry opted to discuss the types of data collected and how users shared that information.
“That’s a more practical conversation and that’s one that we repeatedly had,” she said.

Other privacy advocates were less harsh in their criticism. The Electronic Privacy Information Center agreed that the code was plagued by problems, but said it was a “slight improvement” over what exists now.

Today, there are no standards for privacy notices on mobile devices, and important pieces of information are often omitted, said David Jacobs, a consumer protection fellow at EPIC. The group has not decided whether to support the code.

The American Civil Liberties Union, a supporter, called the NTIA-led effort a “modest but important step forward.” By knowing in advance which developers had agreed to follow the code, consumers would at least be able to make an informed decision about whether to reject those who haven’t, the ACLU said in a statement.

Despite varying degrees of criticism, advocates universally agreed that federal legislation was still necessary to protect consumer privacy against abuse. Having a baseline privacy law would shift the discussion from the rules of behavior for industry to how best to put the law into practice, the CFA said.

Beyond protecting privacy, efforts like the NTIA’s are important in trade negotiations between the European Union and the Obama administration. The EU views privacy for its citizens as a legally protected right and has penalized U.S. companies for their handling of personal information. The administration has argued that its efforts at industry self-regulation are sufficient to protect privacy.