Big data, as its proponents have been saying for nearly a decade now, can bring big benefits: advertisements focused on what you actually want to buy, smart cars that can help you avoid collisions or call for an ambulance if you happen to get in one anyway, wearable or implantable devices that can monitor your health and notify your doctor if something is going wrong.

It can also lead to big privacy problems. By now it is glaringly obvious that when people generate thousands of data points every day (where they go, who they communicate with, what they read and write, what they buy, what they eat, what they watch, how much they exercise, how much they sleep and more), they are vulnerable to exposure in ways unimaginable a generation ago.

It is just as obvious that such detailed information, in the hands of marketers, financial institutions, employers and government, can affect everything from relationships to getting a job, and from qualifying for a loan to even getting on a plane. Yet while there have been multiple expressions of concern from privacy advocates and government, there has been little action to improve privacy protections in the online, always-connected world.

More than five years ago, in February 2012, the Obama administration published a blueprint for what it termed a Consumer Privacy Bill of Rights (CPBR). That document declared that “the consumer privacy data framework in the U.S. is, in fact, strong … (but it) lacks two elements: a clear statement of basic privacy principles that apply to the commercial world, and a sustained commitment of all stakeholders to address consumer data privacy issues as they arise from advances in technologies and business models.”

Three years later, in February 2015, that blueprint became proposed legislation by the same name, but it was immediately attacked both by industry groups, which said it would impose “burdensome” regulations, and by privacy advocates, who said it was riddled with loopholes. It never made it to a vote.

The CPBR’s declaration that the “consumer privacy data framework in the U.S. is, in fact, strong …” came, ironically, about a year before revelations by former NSA contractor Edward Snowden that the U.S. government was, in fact, spying on its citizens.

Beyond that, government hasn’t been able to agree on other privacy initiatives.
The so-called broadband privacy rules issued by the Federal Communications Commission (FCC) just before the 2016 election, which would have limited data collection by internet service providers (ISPs), were repealed by Congress in March 2017, before they could take effect.

Susan Grant, director of consumer protection and privacy at the Consumer Federation of America (CFA), called the repeal “a terrible setback” and said it would allow ISPs “to spy on their customers and sell their data without consent.”

Others, however, have argued that putting limits on ISPs alone would still leave online giants like Google free to collect and sell the data they gather, and that consumers would see few, if any, benefits.

Given all that, it should be no surprise that experts say privacy risks are more intense than ever, and that the challenges of protecting privacy have become even more complicated.

Organizations like the CFA, the Electronic Privacy Information Center (EPIC) and the Center for Democracy and Technology (CDT), along with individual advocates like Rebecca Herold, CEO of The Privacy Professor, have enumerated multiple ways that big data analytics, and the automated decision-making built on it, can invade the personal privacy of individuals. They include:

1. Discrimination

EPIC declared more than three years ago, in comments to the U.S. Office of Science and Technology Policy, that “the use of predictive analytics by the public and private sector … can now be used by the government and companies to make determinations about our ability to fly, to obtain a job, a clearance or a credit card. The use of our associations in predictive analytics to make decisions that have a negative impact on individuals directly inhibits freedom of association.”

Since then, things have gotten worse, privacy advocates say. While discrimination is illegal, automated decision-making makes it harder to prove.

“Big data algorithms have matured significantly over the past several years, along with the increasing flood of data from the nascent internet of things, and the ability to analyze these data using variants of artificial intelligence,” says Edward McNicholas, global co-leader of the Privacy, Data Security, and Information Law Practice at Sidley Austin LLP. “But despite this technological growth, the legal protections have not advanced materially.”

“I think the discussion around big data has moved beyond mere accusations of discrimination to larger concerns about automated decision-making,” says Joseph Jerome, policy counsel at the CDT, who noted that it has been used “to direct calls at call service centers, evaluate and fire teachers, and even predict recidivism.”

Herold has been saying for years that big data analytics can make discrimination essentially “automated,” and therefore more difficult to detect or prove. She says that is true “in more ways than ever” today.

“Big data analytics coupled with internet of things (IoT) data will be, and has already been, able to identify health problems and genetic details of individuals that those individuals didn’t even know themselves,” she says.

McNicholas believes “the most significant risk is that it is used to conceal discrimination based on illicit criteria, and to justify the disparate impact of decisions on vulnerable populations.”
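Detecting that kind of concealed discrimination generally comes down to auditing outcomes. The sketch below is a purely hypothetical illustration (the decision records and group labels are invented, not drawn from any system mentioned in this article) of how an auditor might compare favorable-outcome rates across groups against the “four-fifths” adverse-impact guideline long used in U.S. employment-discrimination guidance.

```python
# Minimal sketch of auditing automated decisions for disparate impact.
# The records below are invented; "group" stands in for a protected class,
# and the 0.8 threshold reflects the "four-fifths" adverse-impact guideline.
from collections import defaultdict

decisions = [  # (group label, was the automated decision favorable?)
    ("group_a", True), ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

totals = defaultdict(int)
favorable = defaultdict(int)
for group, approved in decisions:
    totals[group] += 1
    favorable[group] += approved  # True counts as 1

rates = {group: favorable[group] / totals[group] for group in totals}
highest_rate = max(rates.values())

for group, rate in sorted(rates.items()):
    ratio = rate / highest_rate
    verdict = "possible disparate impact" if ratio < 0.8 else "within guideline"
    print(f"{group}: favorable rate {rate:.0%}, ratio {ratio:.2f} -> {verdict}")
```

The point of such a check is not legal proof; it simply shows that when the outcome data are available, skewed treatment becomes measurable rather than invisible.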
2. An embarrassment of breaches

By now, after catastrophic data breaches at retailers like Target and Home Depot, restaurant chains like P.F. Chang’s, online marketplaces like eBay, universities, online services giants like Yahoo, and the federal Office of Personnel Management, whose breach exposed the personal information of 22 million current and former federal employees, public awareness of credit card fraud and identity theft is probably at an all-time high.

Unfortunately, the risks remain just as high, especially given that billions of IoT devices, in everything from household appliances to cars, remain rampantly insecure, as encryption and security guru Bruce Schneier, CTO at IBM Resilient, frequently observes on his personal blog.

3. Goodbye anonymity

It is increasingly difficult to do much of anything in modern life “without having your identity associated with it,” Herold says.

She says even de-identified data does not necessarily remove privacy risks. “The standards used even just a year or two ago are no longer sufficient. Organizations that want to anonymize data to then use it for other purposes are going to find it increasingly difficult. It will soon become almost impossible to effectively anonymize data in a way that the associated individuals cannot be re-identified,” she says.
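To make that warning concrete, here is a toy sketch of the classic linkage technique: a “de-identified” record set with names stripped is joined to a publicly available roster on quasi-identifiers such as ZIP code, birth date and sex, and the identities come right back. Every record shown is invented for illustration.

```python
# Toy illustration of re-identifying "anonymized" records via quasi-identifiers.
# All data below are invented; the point is only that stripping names is not
# enough when ZIP code, birth date and sex survive in both datasets.
deidentified_health_records = [
    {"zip": "02138", "birth_date": "1945-07-31", "sex": "F", "diagnosis": "hypertension"},
    {"zip": "02139", "birth_date": "1962-01-15", "sex": "M", "diagnosis": "diabetes"},
]

public_roster = [  # e.g., a voter roll or scraped social-media profiles
    {"name": "J. Doe", "zip": "02138", "birth_date": "1945-07-31", "sex": "F"},
    {"name": "R. Roe", "zip": "02139", "birth_date": "1962-01-15", "sex": "M"},
]

QUASI_IDENTIFIERS = ("zip", "birth_date", "sex")

def quasi_key(record):
    """Build the linkage key from the quasi-identifier fields."""
    return tuple(record[field] for field in QUASI_IDENTIFIERS)

roster_by_key = {quasi_key(person): person["name"] for person in public_roster}

for record in deidentified_health_records:
    name = roster_by_key.get(quasi_key(record))
    if name is not None:
        print(f"Re-identified {name}: {record['diagnosis']}")
```

The more auxiliary data that exists about each of us, the more such linkage keys there are, which is why Herold argues that effective anonymization keeps getting harder.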
Besides being vulnerable to breaches, IoT devices are a massive data collection engine of users’ most personal information.

“Individuals are paying for smart devices, and the manufacturers can change their privacy terms at a moment’s notice,” Jerome says. “It’s one thing to tell a user to stop using a web service; it’s another to tell them to unplug their smart TV or disconnect their connected car.”

4. Government exemptions

According to EPIC, “Americans are in more government databases than ever,” including that of the FBI, which collects personally identifiable information (PII) including name, any aliases, race, sex, date and place of birth, Social Security number, passport and driver’s license numbers, address, telephone numbers, photographs, fingerprints, financial information like bank accounts, and employment and business information.

Yet, “incredibly, the agency has exempted itself from Privacy Act (of 1974) requirements that the FBI maintain only ‘accurate, relevant, timely and complete’ personal records,” along with other safeguards of that information required by the Privacy Act, EPIC says.

The NSA also opened a storage facility in Bluffdale, Utah, in 2014 that is reportedly capable of storing 12 zettabytes of data; a single zettabyte is the amount of information it would take 750 billion DVDs to store.

While there have been assurances, including from former President Obama, that government is “not listening to your phone calls or reading your emails,” that obviously ducks the question of whether government is storing them.

5. Your data gets brokered

Numerous companies collect and sell consumer data that are used to profile individuals, without much control or limits.

There was the famous case of a retailer that began marketing baby products to a pregnant woman before she had told others in her family, thanks to automated decision-making. The same can be true of things like sexual orientation or an illness like cancer.

“Since 2014, data brokers have been having a field day in selling all the data they can scoop up from anywhere they can find it on the internet. And there are few (none explicit that I know of) legal protections for involved individuals,” Herold says. “This practice is going to increase, unfettered, until privacy laws restricting such use are enacted.”

There is also little or no accountability, or even any guarantee, that the information is accurate.

Where do we go from here?

Those are not the only risks, and there is no way to eliminate them. But there are ways to limit them. One, according to Jerome, is to use big data analytics for good: to expose problems.

“In many respects, big data is helping us make better, fairer decisions,” he says, noting that it can be “a powerful tool to empower users and to fight discrimination. More data can be used to show where something is being done in a discriminatory way. Traditionally, one of the biggest problems in uncovering discrimination is a lack of data.”

There is general agreement among advocates that Congress needs to pass a version of the CPBR, which called for consumer rights to include:

- Individual control over what personal data companies collect from them and how they use it.
- Transparency, or easily understandable and accessible information about privacy and security practices.
- Collection, use and disclosure of personal data in ways that are consistent with the context in which consumers provide the data.
- Security and responsible handling of personal data.
- Access to their personal data in usable formats, with the power to correct errors.
- Reasonable limits on the personal data that companies collect and retain.

McNicholas says that “transparency” should include an overhaul of privacy policies, which are so dense and filled with legalese that almost nobody reads them. “Telling consumers to read privacy policies and exercise opt-out rights seems to be a solution better suited to last century,” he says. “Consumer privacy must shift to consumer-centric, where consumers have real control over their information.”

Jerome agrees. “I certainly don’t think we can expect consumers to read privacy policies. That’s madness. What we should expect are better and more controls. It’s a good thing that users can review and delete their Echo recordings. It’s great that Twitter allows users to toggle all sorts of personalization and see who has targeted them,” he says. “But ultimately, if individuals aren’t given more options over collection and sharing, we’re going to have serious issues about our personal autonomy.”

Given the contentious atmosphere in Congress, there is little chance of something resembling the CPBR being passed anytime soon.

That doesn’t mean consumers are defenseless, however. What can they do?

Jerome says that even if users don’t read an entire policy, they should “still take a moment before clicking ‘OK’ to consider why and with whom they’re sharing their information. A recent study suggested that individuals would give up sensitive information about themselves in exchange for homemade cookies.”

Herold offers several other individual measures to lower your privacy risks:
- Quit sharing so much on social media. “If you only have a few people you want to see photos or videos, then send directly to them instead of posting where many can access them,” she says.
- Don’t provide information to businesses or other organizations that isn’t necessary for the purposes for which you’re doing business with them. Unless they really need your address and phone number, don’t give that information to them.
- Use an anonymous browsing tool, like Hotspot Shield or Tor (The Onion Router), when visiting sites that might yield information that could cause people to draw inaccurate conclusions about you. (A brief sketch of routing traffic through Tor appears at the end of this article.)
- Ask others not to share information about you online without your knowledge. “It may feel awkward, but you need to do it,” she says, adding that the hard truth is that consumers need to protect themselves, because nobody else will do it for them.

Regarding legislation, Herold says she has not heard of any other drafts of the CPBR in the works, “and I quite frankly do not expect to see anything in the next four years that will improve consumer privacy. Indeed, I expect to see government protections deteriorate.”

“I hope I am wrong,” she says.
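On the anonymous-browsing suggestion above, the following is a minimal sketch, not an endorsement of any particular setup, of how a script’s web requests can be routed through a locally running Tor client. It assumes Tor is already installed and listening on its default SOCKS port (9050), and that the requests and PySocks packages are available; api.ipify.org is just one public echo-your-IP service, used here only to verify the route.

```python
# Minimal sketch: route HTTP requests through a local Tor client.
# Assumes Tor is running locally on its default SOCKS port 9050,
# and that `pip install requests[socks]` has been run for SOCKS support.
import requests

TOR_PROXY = "socks5h://127.0.0.1:9050"  # "socks5h" also resolves DNS through Tor

session = requests.Session()
session.proxies = {"http": TOR_PROXY, "https": TOR_PROXY}

# Compare the IP address a website sees with and without Tor.
direct_ip = requests.get("https://api.ipify.org", timeout=10).text
tor_ip = session.get("https://api.ipify.org", timeout=10).text

print("IP seen without Tor:", direct_ip)
print("IP seen through Tor: ", tor_ip)
```

For everyday use, the Tor Browser accomplishes the same thing without any code; the sketch simply shows what “routing through Tor” means in practice.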