



5 signs that your software probably isn’t secure

Oct 03, 2017 · 5 mins
Application Security | Data and Information Security | Enterprise Applications

Software that is sloppily written probably isn't very secure. Here are some things to look for.


If developers, project managers, and business owners aren’t paying attention to the basics of software development, then they certainly aren’t spending any time making sure their software is secure. Following that logic, if a piece of software shows signs of inattentive design in one area, it is reasonable to assume there was inattentive design in other areas. Since security is typically addressed after functionality (although it shouldn’t be!), it stands to reason that software with sloppy architecture, design, or user interface is probably not secure.

Here are the top 5 signs that software is poorly written and is probably not very secure:

1. Format compliance warnings (Phone numbers, SSN, etc.)

One of my first clues that the development team didn’t spend any time on processing input and just did the minimum to make the software functional is when I get a warning that an input is not formatted properly.  We have all seen these.  You enter your phone number as “1234567890” and the software responds “you must enter your phone number in the format (123) 456-7890.” 


If the software can determine that the number isn’t in the right format, then it should be able to format it. Warnings like this tell me that the programmer is looking for one particular pattern rather than handling data as data, and is trying to constrain input into a predetermined format. That might sound like a good thing, but in reality it means the programmer didn’t take the time to write a true data handling routine and instead wrote for minimum functionality.
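Treating the input as data can be sketched in a few lines of Python — normalize whatever the user typed and produce the canonical format yourself. (The function name and the US-only assumption are mine, for illustration.)

```python
import re

def normalize_us_phone(raw: str) -> str:
    """Accept a US phone number in any common format and
    return it canonically as (123) 456-7890, instead of rejecting it."""
    digits = re.sub(r"\D", "", raw)          # keep only the digits
    if len(digits) == 11 and digits.startswith("1"):
        digits = digits[1:]                  # drop a leading country code
    if len(digits) != 10:
        raise ValueError("expected a 10-digit US phone number")
    return f"({digits[:3]}) {digits[3:6]}-{digits[6:]}"
```

With this approach, “1234567890”, “123-456-7890”, and “(123) 456-7890” all land in the same stored form, and the user is only bothered when the input genuinely isn’t a phone number.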

2. Case sensitivity on email addresses

This is one of my biggest pet peeves. It shows that the developers were either very lazy or that they didn’t understand how the internet works. Either way, they are not writing secure code. Email addressing is not case sensitive in practice: bill@mysite.com and BiLl@MYsItE.COm are exactly equal. (Strictly speaking, the standards permit a case-sensitive local part, but virtually every mail provider ignores case.) When software asks me to input my email twice to confirm that it is the same and then tells me the two don’t match because of capitalization, it tells me that the programmer just did a byte-by-byte compare and didn’t actually analyze the input. If that is true here, then they probably didn’t fully analyze input in other fields either, and a hacker could use that fact to inject input the programmer wasn’t expecting. That is the foundation of SQL injection vulnerabilities.
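Doing the comparison correctly is a one-liner. A minimal sketch in Python (the helper name is mine):

```python
def emails_match(a: str, b: str) -> bool:
    """Compare two email addresses the way mail delivery actually
    treats them: case-insensitively, ignoring surrounding whitespace."""
    return a.strip().casefold() == b.strip().casefold()

# The two spellings from the article compare equal:
emails_match("bill@mysite.com", "BiLl@MYsItE.COm")  # True
```

`casefold()` is preferred over `lower()` here because it handles the few non-ASCII characters whose lowercase mapping is irregular; for plain ASCII addresses the two behave identically.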

3. Limiting the length of a password

Seriously? We are trying to make the software secure, yet we are limiting the length of passwords? This tells me that the programmer was probably very junior and either didn’t know how to handle variable-length strings or didn’t understand how to convert a variable-length string into the fixed-length key that an encryption algorithm needs. Either demonstrates a rudimentary level of programming skill that is not industrial-grade, and someone who doesn’t understand security at all.
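Turning a password of any length into a fixed-length key is a solved problem. Here is a minimal sketch using PBKDF2 from Python’s standard library (the iteration count and 32-byte output are illustrative choices of mine, not requirements):

```python
import hashlib
import os

def derive_key(password: str, salt: bytes, length: int = 32) -> bytes:
    """Stretch a password of arbitrary length into a fixed-length key
    using PBKDF2-HMAC-SHA256. No password length cap is ever needed."""
    return hashlib.pbkdf2_hmac(
        "sha256",
        password.encode("utf-8"),
        salt,
        600_000,            # illustrative iteration count
        dklen=length,
    )

salt = os.urandom(16)
key = derive_key("correct horse battery staple, any length at all", salt)
assert len(key) == 32      # fixed output size regardless of input length
```

Because the derivation always produces the same output size, the storage schema and the encryption code are indifferent to how long the user’s password is — which removes the only technical excuse for a maximum length.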

4. “Complex” passwords

I know that NIST just changed its stance on “complex” passwords after advocating password complexity and aging for decades, but security practitioners have long known that aging passwords and enforcing “complexity” standards just causes people to create passwords they can’t remember, which means they write them down somewhere. Practitioners have also long known that the human element is the easiest to breach. A complex password that is only 8 characters long is easier to break than an easy-to-remember passphrase made up of multiple everyday words and running 16-20 characters. Personally, I feel that disallowing dictionary words in a passphrase is as bad as enforcing password complexity. Of course, if we are going to allow dictionary words, then the passphrase needs a minimum length. (But not a maximum length!)
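The back-of-envelope arithmetic bears this out. Assuming an attacker who knows the password policy, an 8-character “complex” password drawn from the ~95 printable ASCII characters carries less guessing entropy than a five-word passphrase drawn from even a modest 7,776-word list (the Diceware list size; word count and list size are my illustrative choices):

```python
import math

# 8-character "complex" password over ~95 printable ASCII characters
complex_bits = 8 * math.log2(95)        # ~53 bits

# 5-word passphrase drawn uniformly from a 7,776-word list
passphrase_bits = 5 * math.log2(7776)   # ~65 bits

print(f"8-char complex password: ~{complex_bits:.0f} bits")
print(f"5-word passphrase:       ~{passphrase_bits:.0f} bits")
```

Roughly 12 extra bits means the passphrase is on the order of 4,000 times harder to brute-force, and it is the one a human can actually remember without a sticky note.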

5. Emailing lost password

This is the absolute worst.  No one should do business with a company that does this.   I thought this was a thing of the past but I experienced it just last week.  This is how it happens:

You forget your password to a site and hit the “Forgot Password” button.  You receive a message or popup that says “your password has been sent to your email.”  When you open your email, there in plain, unencrypted text is your username and your password.

There are two things horrifically wrong with this.  First, the company is actually storing your clear-text password instead of storing a hash of your password.  That means that when (not if) they are compromised, hackers will instantly have your password.  Second, they are sending your username and password via email in clear-text.  They might as well print it on the outside of an envelope and send it through the mail.  Everyone along the way will be able to see both the username and the password. 

Writing secure code isn’t easy, but it isn’t difficult either. What it takes is knowledge of what is safe and what is not, and a desire to place security on at least an equal footing with functionality. If you see any of these signs in software that you use, it is an indicator that functionality mattered more to that company than security, and the software is probably not very secure. If your company’s software shows these signs, it is time for a rewrite!


Michael Lester is the chief information security officer of Magenic Technologies and the co-founder and director of LegacyArmour LLC, a secure digital asset delivery company.

A graduate of the U.S. Naval Academy in Annapolis, Md., and of the Naval Postgraduate School, Michael was a decorated U.S. Marine Corps pilot and an IT and leadership instructor at the Naval Academy. Early in his career, Michael worked as a software developer, a QA manager and a project manager. He also served in multiple leadership roles, both locally and nationally, as a general manager, a national director, a vice president and, now, as CISO. A member of Mensa and the holder of a security patent, Michael has bachelor’s degrees in history and electrical engineering, a master’s degree in electrical engineering and an MBA with an emphasis in leadership development. He is a Certified Information Privacy Professional (CIPP/US) and a Certified Information Security Manager (CISM), and is a frequent speaker on security topics.

The opinions expressed in this blog are those of Michael T. Lester and do not necessarily represent those of IDG Communications Inc. or its parent, subsidiary or affiliated companies.