


by Senior Editor

Ouch! Security Pros’ Worst Mistakes

Sep 03, 2008 | 11 mins
Careers | CSO and CISO | IT Jobs

We've all done regrettable things on the job, but does any valuable wisdom come of it? Four security pros candidly explain their biggest blunders and what they learned in the process.

It was a mistake so bad the person who made it asked that his name and company not be mentioned here. Let’s call him Frank.

Frank was regional manager for a physical security vendor in the Southeast. One day, a huge business deal was left hanging in the balance because a client couldn’t get his schedule straight. The client sent an e-mail to say he had changed his mind about the timing of a scheduled site visit, and Frank blew his top. He used the same medium to let the client and others know exactly how he felt. “I sent an e-mail to the girl I was working with in setting up the arrangements where I blasted the guy, saying he was a flake, that he was treating us like his own personal travel agency, that he was obnoxious for never picking up the phone, etc, etc,” Frank says. The trouble is, “I also sent the e-mail to him, accidentally.”

Practically everyone has a moment where they make a huge mistake on the job. Some are fired as a result; others are not. What matters in the long run is whether the mistake-maker gains any wisdom from what happened.

Here are four tales in which security professionals recall their biggest professional mistakes and the lessons they learned.


  • Mistake maker: “Frank”
  • Position: Regional Manager for a vendor of physical security equipment
  • Location: Southeast United States
  • The incident: Blasted a client by e-mail

“I had a large deal with a university on the line. As I am the manufacturer’s rep, I was working closely with an integrator. I was trying to schedule a tour of the manufacturing facility but was having difficulty as the customer continued to change his mind about travel dates, departure cities, etc. It was also very difficult to get him on the phone or to reply in a timely manner.

“We thought we had it all set up (I was working with the manufacturer to make the arrangements) when he changed his mind again, notifying us via email.

“I sent an email to the girl I was working with in setting up the arrangements where I blasted the guy, saying he was a flake, that he was treating us like his own personal travel agency, that he was obnoxious for never picking up the phone, and that we should tell him ‘take it or leave it’ as far as the most recent itinerary we had sent to him. Except that I sent the email to him, accidentally.

“It was very surreal. I was in a hotel and was tired and it was around 10 p.m. I sent an e-mail to my boss letting him know what happened. I also sent an e-mail to the guy apologizing, copying both of the managing partners of my firm. I also sent an e-mail to the managing partners offering to resign, as this deal was worth hundreds of thousands of dollars; high-level sales.

“I didn’t go to sleep that night and went to work early, around 4 a.m., to work on what I was going to say. My boss called me when he got to the office around 5:30 a.m. We decided to see if we could recall the e-mail. We kept sending each other e-mails and trying to erase them and/or recall them, to no avail. So I decided to face the music. My boss did a good job of reassuring me that I still had a job.

“At 7:45 a.m. my boss called and said that the guy at the university had sent out a very reactive e-mail to everyone involved saying that he would not be flying to the manufacturer at that time nor anytime in the future. It had not popped up in my PDA yet due to the delay (there is usually a two-minute delay in e-mails reaching the PDA), so I assumed he was on campus, too, and I began to search out the IT building. I saw two guys walking together who looked like they worked there so I asked them where the IT building was, which they pointed out to me. One of the guys peeled off to go to another building, but the remaining guy said he was going to the IT building so I followed him. Upon entering, he said he could help me find the guy I was looking for as he worked in the IT building. I told him the name as we walked up the stairs — to which he replied ‘That’s me.’

“I followed him down the hallway — it felt like I was going to the principal’s office — and sat in his office. I apologized profusely. Then he started to defend his actions over the past couple weeks and I cut him off saying that he was right and that I was completely wrong. That calmed him down. I also offered to excuse myself from the deal. At the end of the conversation, he did say that it took a lot of guts to come and meet in person with someone who was so angry with me. We ended up getting the deal, which I found out about two weeks later.”


“I learned that I really needed to work, on a regular basis, to maintain a healthy detachment. I also learned to pick up the phone. Rather than sending out those e-mails late at night, I should have waited until the morning to call everyone and deal with it on the phone. E-mails are a cold way of communicating, anyway, so I have become much more reliant on the phone now. Also, it was difficult to go meet with him in person that morning, but I have learned that you reduce the damage if you are willing to accept responsibility and meet the mistake head-on.”


  • Mistake maker: Jennifer Jabbusch (and colleagues)
  • Position: CISO at Carolina Advanced Digital, Inc., security blogger
  • Location: Raleigh-Durham area, North Carolina
  • The incident: Found out the hard way that one shouldn’t neglect business continuity planning

“I would have to say the biggest mistake has been the sin of ‘priority pass-over.’ When we sat down to review and revise policies for our data security and business continuity, we updated our procedures for discovery, data classification, retention, backups and continuity. We had each of these items in place, but wanted to structure them a bit more and come up with a more definitive schedule for verification.

“[But] with all the hustle and bustle happening, customer projects and service deadlines, we got caught up in other ‘to-dos’ and didn’t complete our changes to comply with our new policies. Of course, the possibility of losing all your data, equipment and PCs is a huge concern, so revising our backup scheme should have been (and was) a priority. But the customer projects were more in our faces and they got the attention first.

“A couple of weeks later, the unthinkable happened. We had a fire AND a flood in the office. The fire, which started in an unused portion of the old warehouse above our space, set off the sprinklers. And there were lots of them. It might have been okay, but the sprinklers didn’t stop. The water flooded into our office space. Ceiling tiles came crashing down, the floor was covered with 8 inches of water, burnt chunks of wood were falling through, and just about every piece of equipment was ruined. It was Easter Saturday. I’ll never forget the call that morning, or the feeling I had when I walked into the dark, water-logged building. It was such a mess.

“My first two thoughts were how would we ever clean all this up and where was our data. As we started the cleanup process, I had other thoughts: How do we secure our printed records while the clean-up crew is here? How will we document and destroy all these ruined records? And still, where is our data? We had water, sogginess and mildew to contend with so the cleanup process was much more involved than I could imagine. Security was a priority for us, and the whole team was on board to ensure everything was handled properly. We successfully sequestered sensitive paperwork in a locked facility, waited for it to dry, and then had it destroyed.

“We were lucky. Our primary server room remained unscathed; our servers, backups and main networking equipment were all intact (and dry). As you can imagine, after the clean-up the what-ifs started flying through our heads. What if the server room had been destroyed? In our case, the previous incarnation of our backup procedures would have saved us. Our critical data was indeed secured at an off-site location, but in its state at the time, it would have made maintaining business continuity a much slower process.”


“The most important lesson is this: Never displace your organization’s business priorities with day-to-day ‘emergencies.’ You never know when something incredibly unexpected will occur.”


  • Mistake maker: Andrew Cardwell
  • Position: Computing Security Officer, Director at Cardwell Security Ltd.
  • Location: United Kingdom
  • The incident: A mistyped serial number caused Internet domains to stagnate for a week

“I worked for an ISP that at the time was responsible for controlling the domains. We were essentially the authority for any domains under that TLD [top-level domain]. This was around 1995, when domain names were all manually applied for, approved, updated and controlled.

“I had to update the main registry file, insert a new name, and update the serial number which controlled the updates on the DNS server. The serial number was in the form of YYYYMMDDXX. XX represented the number of changes that day, so in order to get it updated we had to do new XX = old XX + 1. Sadly, I removed one of the digits, so the serial number turned into YYYYMMDDX. As a result, the name server did not pull in the new file and update the domains for a week, until we discovered it on closer inspection — and after several complaints.”


“This is an ideal example of a lack of controls around the software, a lack of a sanity check, and human error. Over the years, the places that now run the TLDs have introduced controls to ensure this kind of human error is sanity-checked through logical rules. I added a check so the serial number can only go up, not down, to help eliminate or reduce human errors.”
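The sanity check Cardwell describes can be sketched in a few lines. This is a hypothetical illustration, not code from any real registry system: a DNS zone's SOA serial in YYYYMMDDXX form must be well-formed and strictly greater than the previous value, or secondary name servers will ignore the update.

```python
import re

def validate_serial(old_serial: str, new_serial: str) -> bool:
    """Return True if new_serial is well-formed and greater than old_serial."""
    # A YYYYMMDDXX serial is exactly 10 digits; the mistyped 9-digit
    # value in the story would fail this check immediately.
    if not re.fullmatch(r"\d{10}", new_serial):
        return False
    # The serial must strictly increase for the zone to propagate.
    return int(new_serial) > int(old_serial)

print(validate_serial("1995061502", "1995061503"))  # True: normal bump
print(validate_serial("1995061502", "199506150"))   # False: dropped digit
print(validate_serial("1995061502", "1995061501"))  # False: serial went down
```

A rule this simple would have caught the dropped digit before the zone file was published, which is exactly the kind of "logical rule" the lesson points to.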


  • Mistake maker: Dave Bixler
  • Position: CISO, Siemens IT Solutions and Services Inc.
  • Location: Mason, Ohio
  • The incident: Sarcasm with the CEO

“Many years ago, during one of the last great e-mail-based virus outbreaks — it was six or seven years ago and may have been the Anna Kournikova virus — I was wearing two hats as the information security person, and also responsible for infrastructure, including the e-mail servers. The virus outbreak had spread rapidly through all seven of our mail servers, and by the time we had a virus signature that could clean out the virus, the mail servers had ground to a screeching halt.

“We took the servers offline and were in the process of getting them cleaned up when I received a call from the CEO, asking for a status. I proceeded to explain where we were, what the impact was, and how long it would take before the servers would be back online. At the end of my explanation, he joked ‘Better you than me.’ Naturally, my mouth engaged well in advance of my brain and I responded with ‘Well, that’s what you underpay me for.'”


Think before you speak.

“Fortunately, my CEO had an excellent sense of humor and I was still employed the following day.”

Editor’s note: This is the first of a two-part feature. The next installment will focus on ‘mistakes’ that had a positive result for the security pros involved. If there’s a mistake you would happily make all over again, e-mail