Is your information security program giving you static?

Opinion
Oct 29, 2015 | 5 mins
Application Security | Cybercrime | Internet Security

The importance of moving to a dynamic approach to threat management

Consider the following hypothetical, but probably typical, scenario: Your organization experiences a minor (if there is such a thing) security breach, which you discover by accident and correct immediately. You then call in the best information security consultant you can find and follow their advice to the letter. You bring in new technology and run extensive vulnerability scans on all of your applications. You correct everything you find, and then have the consultant check your security one more time. You get a clean bill of health. You then tell your board that all is well, take a deep breath, and relax.

If you are resting on your laurels for more than a few days, I would suggest that you are another security breach waiting to happen. Why? After all, the best security consultant available said you were fine.

The issue is that cyber security is a rapidly moving target. An application that is vulnerability-free this week may have major exposures next week. I have encountered this a number of times with customers who were surprised at the difference between two vulnerability scans run a short time apart.

The issue we all face with information security is the dynamic nature of the threat. I follow a variety of security organizations, such as US CERT, on Twitter, and I am constantly amazed at the number of new vulnerabilities discovered in existing software programs and products every week. Some recent examples:

  • Adobe Flash vulnerability allowing remote code execution 
  • NTP (Network Time Protocol) issue allowing denial-of-service attacks
  • Cisco ASA firewall exposure allowing for denial-of-service attacks
  • Apple, thought for a long time to be invulnerable, releasing iOS 9, quickly followed by additional releases to correct newly discovered exposures

And the list goes on. All of these exposures relate to vulnerabilities in existing products that were unknown just weeks ago. The affected parties did not have to change anything to be exposed, or be negligent in any way. They were secure until some hacker discovered a new vulnerability to exploit. They were secure one minute, and insecure the next.

If this seems unfair, welcome to life. Our inability to stay on top of threats is largely a function of our static approach to security.

Most of us in information technology have many responsibilities other than security. Our adversaries, on the other hand, are a growing number of hackers whose only job is to find new ways to break into our systems. Their motivations vary from the hobbyist who loves technology but lacks ethics, to those sponsored by organized crime, to those backed by foreign governments. Whatever their motivation, they have vast amounts of time to put into finding ways into our networks and applications.

Since we can’t keep up, should we just give up? While tempting, this is obviously not possible. As such, we must find ways to replace our static approach to security with a dynamic one.

A few weeks ago, former NSA Director Keith Alexander, speaking at a conference, put it well, stating that “We need to move now to a new approach to cybersecurity — an approach that is proactive, agile and adaptive.” We need to view information security as a constantly changing landscape, rather than a still painting of one. Given our deficit of resources compared to those we battle against, this is not an easy undertaking. Challenging or not, however, dynamic security is essential if we are to have a fighting chance of winning. I would suggest the following as good starting points in moving from a static to a dynamic model:

Make vulnerability checks a regular and frequent task

I recently worked with a HIPAA-covered customer who had a clean vulnerability scan that was close to a year old. They felt pretty comfortable about their situation, given that nothing significant had changed in their environment during that time. My first act on their behalf was to have them rerun the tests, which of course found numerous issues, some of them major. I would suggest that external vulnerability scans need to happen at least monthly. Internal scans should take place on the same schedule, or whenever software or configuration changes are made, whichever comes first. The latter is a specific requirement of PCI DSS.

Fortunately, there are excellent tools to help us with this task. My favorite is Qualys, which makes such scans an easy process, can handle both internal and external scans, and offers some good free options. There is a growing number of other options as well, including some, like OpenVAS, that are open source.
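For illustration, here is a minimal sketch of what putting scans on a calendar might look like, using nmap's vulnerability scripts as a free stand-in for a commercial scanner such as Qualys; the target host and output location are hypothetical and should be replaced with your own.

```python
#!/usr/bin/env python3
"""Minimal sketch of a recurring vulnerability scan, assuming nmap with its
"vuln" NSE script category is installed. This is a stand-in for a full
scanner such as Qualys or OpenVAS; the target below is hypothetical."""

import subprocess
from datetime import date
from pathlib import Path

TARGET = "scanme.example.com"       # hypothetical external host to scan
OUTPUT_DIR = Path("scan-history")   # keep every report so runs can be compared

def run_monthly_scan() -> Path:
    OUTPUT_DIR.mkdir(exist_ok=True)
    report = OUTPUT_DIR / f"{TARGET}-{date.today():%Y-%m}.txt"
    # -sV: probe service versions; --script vuln: run nmap's vulnerability checks
    subprocess.run(
        ["nmap", "-sV", "--script", "vuln", "-oN", str(report), TARGET],
        check=True,
    )
    return report

if __name__ == "__main__":
    # Schedule this from cron (e.g. monthly) so scans happen on a calendar,
    # not only after an incident.
    print(f"Report written to {run_monthly_scan()}")
```

Comparing this month's report against last month's is where the value lies; a scan that is never reviewed is just as static as no scan at all.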

Pay attention to the fundamentals

I have preached for some time that, despite the availability of new, expensive products to prevent and detect security breaches, the answer actually lies in the fundamentals — the day-to-day things we should do, such as checking logs and auditing access rights. Fortune reminds us of this in their recent article, “Asking these 4 questions will stop up to 90% of hacks.”
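As a concrete example of one such fundamental, the sketch below reviews a syslog-style authentication log for repeated failed logins; the log path and alert threshold are assumptions to adapt to your own environment.

```python
#!/usr/bin/env python3
"""Minimal sketch of a daily log review, assuming a syslog-style SSH auth log.
The path and threshold are illustrative, not a product recommendation."""

import re
from collections import Counter

AUTH_LOG = "/var/log/auth.log"   # typical Debian/Ubuntu location; adjust per OS
THRESHOLD = 10                   # arbitrary example threshold

FAILED = re.compile(r"Failed password for (?:invalid user )?(\S+) from (\S+)")

def failed_logins_by_source(path: str = AUTH_LOG) -> Counter:
    counts = Counter()
    with open(path, errors="replace") as log:
        for line in log:
            match = FAILED.search(line)
            if match:
                counts[match.group(2)] += 1   # tally by source IP address
    return counts

if __name__ == "__main__":
    for source, hits in failed_logins_by_source().most_common():
        if hits >= THRESHOLD:
            print(f"{source}: {hits} failed logins -- review this address")
```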

Stay on top of firmware updates

Many of the exposures we face today result from issues found in the firmware of devices attached to our networks. These may be core devices, including routers or firewalls, or Internet of Things devices, such as printers and copiers. Know what devices you have, and check for firmware updates frequently.
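A simple inventory is enough to start. The following sketch assumes a hypothetical CSV inventory of devices and flags any that have not had a firmware check within an example 30-day window.

```python
#!/usr/bin/env python3
"""Minimal sketch of a firmware-review reminder, assuming you keep a simple
device inventory in a CSV file (the file name and columns are illustrative)."""

import csv
from datetime import date, datetime

INVENTORY = "device_inventory.csv"   # columns: device,firmware_version,last_checked
MAX_AGE_DAYS = 30                    # example review interval

def overdue_devices(path: str = INVENTORY) -> list[dict]:
    overdue = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            checked = datetime.strptime(row["last_checked"], "%Y-%m-%d").date()
            if (date.today() - checked).days > MAX_AGE_DAYS:
                overdue.append(row)   # flag devices with stale firmware checks
    return overdue

if __name__ == "__main__":
    for device in overdue_devices():
        print(f"{device['device']} (firmware {device['firmware_version']}): "
              f"not checked since {device['last_checked']}")
```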

Be threat aware

The odds are good that you will not be the first victim of a new vulnerability. Check threat sources, such as US CERT or IBM X-Force Exchange, to know what to watch for.
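To make that checking routine, a short script can watch a feed for the products you actually run. The sketch below assumes the feedparser package and a placeholder feed URL; substitute the RSS/Atom feed of whichever source you follow, such as US CERT, and your own product keywords.

```python
#!/usr/bin/env python3
"""Minimal sketch of polling a threat-advisory feed for relevant items.
Assumes the feedparser package (pip install feedparser); the feed URL below
is a placeholder to be replaced with your preferred source's RSS/Atom feed."""

import feedparser

FEED_URL = "https://example.com/security-advisories.xml"   # placeholder URL
KEYWORDS = ("cisco", "adobe flash", "ntp")                  # products you run

def matching_advisories(url: str = FEED_URL) -> list[str]:
    feed = feedparser.parse(url)
    hits = []
    for entry in feed.entries:
        title = entry.get("title", "")
        if any(keyword in title.lower() for keyword in KEYWORDS):
            hits.append(f"{title} -- {entry.get('link', '')}")
    return hits

if __name__ == "__main__":
    for advisory in matching_advisories():
        print(advisory)
```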

Bottom line — migrate your security efforts from static to dynamic, and sleep better tonight.

Contributor

Robert C. Covington, the "Go To Guy" for small and medium business security and compliance, is the founder and president of togoCIO.com. Mr. Covington has a B.S. in Computer Science from the University of Miami and over 30 years of experience in the technology sector, much of it at the senior management level. His functional experience includes major technology implementations, small and large-scale telecom implementation and support, and operations management, with an emphasis on high-volume, mission-critical environments. His expertise includes compliance, risk management, disaster recovery, information security and IT governance.

Mr. Covington began his Atlanta career with Digital Communications Associates (DCA), a large hardware/software manufacturer, in 1984. He worked at DCA for over 10 years, rising to the position of Director of MIS Operations. He managed the operation of a large 24x7 production data center, as well as the company’s product development data center and centralized test lab.

Mr. Covington also served as the Director of Information Technology for Innotrac, which was at the time one of the fastest growing companies in Atlanta, specializing in product fulfillment. Mr. Covington managed the IT function during a period when it grew from 5 employees to 55, and oversaw a complete replacement of the company’s systems, and the implementation of a world-class call center operation in less than 60 days.

Later, Mr. Covington was the Vice President of Information Systems for Teletrack, a national credit bureau, where he was responsible for information systems and operations, managing the replacement of the company’s complete software and database platform, and the addition of a redundant data center. Under Mr. Covington, the systems and related operations achieved SAS 70 Type II status, and received a high audit rating from the Federal Deposit Insurance Corporation and the Office of the Comptroller of the Currency.

Mr. Covington also served as Director of Information Technology at PowerPlan, a software company providing software for asset-intensive industries such as utilities and mining concerns, and integrating with ERP systems including SAP, Oracle Financials, and Lawson. During his tenure, he redesigned PowerPlan's IT infrastructure using a local/cloud hybrid model, implemented IT governance based on ITIL and COBIT, and managed the development of a new corporate headquarters.

Most recently, Mr. Covington, concerned about the growing risks facing small and medium business, and their lack of access to an experienced CIO, formed togoCIO, an organization focused on providing simple and affordable risk management and information security services.

Mr. Covington currently serves on the board of Act Together Ministries, a non-profit organization focused on helping disadvantaged children, and helping to strengthen families. He also leads technical ministries at ChristChurch Presbyterian. In his spare time, he enjoys hiking and biking.

The opinions expressed in this blog are those of Robert C. Covington and do not necessarily represent those of IDG Communications, Inc., its parent, subsidiary or affiliated companies.
