Juglar and Kondratiev tell us about the structural mechanisms of economic cycles. But when it comes to regulation, it is tempting to fix the conjunctural root cause. Efforts to secure the financial industry have leveraged various mechanisms, including requirements for banks to improve their auditability, reinforce the vetting of transactions, and fight money laundering with enhanced due diligence on clients, counterparties, and so on. The "Know Your Customer" (KYC) process, for instance, has been drastically expanded in the U.S. and abroad.

The banking tradition of keeping evidence forever has been extended to a wide scope of information, to support and build audit trails while keeping proof of each step, of major decisions, and of supporting material. The Fed and the OCC are looking at auditing global banks in a single place. The Department of Financial Services expects evidence supporting audit trails for any material transaction. In this perspective, even the KYC process has become a huge machine collecting and storing information from customers and counterparties, all the way to the ultimate beneficiary of a transaction.

Banks serve clients and are structured around activities and products. Data are shared between business lines to serve the clients' interests as well as possible. Sharing data often means duplicating it, so the data architecture becomes complex, with two main challenges: what is the master data, and where are all the instances of a specific piece of information located?

Recent and coming regulations push to reduce the risks related to data management on both sides: the integrity of the data and their confidentiality. Numerous regulations are already published and applicable in the U.S. Dedicated regulations will focus on the data life cycle.
Others will protect client confidentiality through cybersecurity efforts, such as the NYS-DFS regulation (23 NYCRR 500) in New York.

The industry leverages various frameworks, based either on regulatory sources or on best practices, to ensure proper identification of what needs to be protected. Unfortunately, in an intertwined ecosystem, pulling one string drags the whole ecosystem along. This issue is faced by the front office when defining the scope of data to be protected, by the back office with a similar effort on applications, and by IT with all the underlying infrastructure.

Many solutions are applied to tie the efforts to the risks and to narrow the scope of initiatives down to what matters most. But this arbitration introduces challengeable choices that will come under scrutiny and criticism the day a breach occurs.

Gambling with information security is messing with customers

In a world where no system can be protected with 100 percent success, there is no reason to believe that focusing on a critical sub-scope will protect 100 percent of clients. Private institutions have to face malicious actors, sometimes sponsored by other countries; against teams with such capabilities, a private institution with a limited budget can only lose.

There is no excuse for institutions failing their clients by not efficiently protecting their data; Equifax is a good example. But there is also no way to guarantee that the data managed are 100 percent safe. So maybe the simplest idea is: if we cannot protect our pockets, let's not put jewelry in them.

The concept of a zero-data process enforces the idea of limiting, to the extreme, the amount of information collected to perform an activity, and of reducing to zero the information kept after the process has executed. The effort to protect customer data can then focus only on the internal confidential information deemed to be material.
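As a minimal sketch of what a zero-data process could look like (all names here are hypothetical, not an existing KYC API): the raw customer document is checked entirely in memory, and only a non-sensitive attestation is retained once the check completes, consisting of the decision, a timestamp, and a one-way hash of the evidence.

```python
import hashlib
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class Attestation:
    """Non-sensitive record kept after the check; the raw document is not stored."""
    case_id: str
    decision: str       # "approved" or "rejected"
    checked_at: str     # ISO-8601 timestamp of the check
    evidence_hash: str  # one-way digest of the raw evidence

def run_kyc_check(case_id: str, raw_document: bytes) -> Attestation:
    # 1. Collect only what the check needs, and keep it in memory.
    # 2. Perform the screening (a placeholder rule, for the sketch only).
    decision = "rejected" if b"sanctioned" in raw_document else "approved"
    # 3. Keep a one-way hash so the check can later be re-proven against a
    #    re-presented document, without retaining the document itself.
    evidence_hash = hashlib.sha256(raw_document).hexdigest()
    attestation = Attestation(
        case_id=case_id,
        decision=decision,
        checked_at=datetime.now(timezone.utc).isoformat(),
        evidence_hash=evidence_hash,
    )
    # 4. Zero-data step: drop the local reference to the raw material once the
    #    process has executed (the caller must likewise not retain its copy).
    del raw_document
    return attestation

att = run_kyc_check("case-001", b"passport scan of an ordinary customer")
print(att.decision)  # approved
```

The design choice is that the attestation is internal content of average sensitivity: it proves the check happened and binds it to the evidence, but leaks nothing about the customer if breached.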
A KYC process that used to be a huge generator of confidential material will tomorrow produce only internal content of average sensitivity.

The path to the zero-data process is clearing up

Regulations are challenging the adoption of such processes by the financial industry. The requirement to keep a permanent audit trail of all major decisions or significant transactions forces institutions to store and keep data. Meanwhile, the pain and cost of data theft are booming and will challenge the inconsistent demand of the legacy regulations: store vs. dispose.

The path to a zero-data process is clearing up: audits should focus on process efficiency and excellence more than on legacy evidence of past failures. For instance, if we can audit the KYC process today on its active cases and prove that the process performs perfectly, we can rely on and trust its past results. Audit will have to shift to live performance assessment rather than postmortem review. This is a strong evolution of the audit dogma, and it will impact not only the regulators' approach but also internal audit activities and compliance roles.

The protection of customer data and citizen privacy will therefore depend on our capability to evolve our convictions around what-I-need-to-know and what-I-need-to-keep, and on how laws and rules enforcement control the stability of the financial industry.

Let's hope the zero-data process will not remain just another good idea before the next big storms.