According to a recent report, academics have been analyzing the brainwaves of computer users to improve how they are alerted to cybersecurity dangers. I'm sorry, but getting users to pay stricter attention to security isn't brain surgery: It's all about money and job security. Come to think of it, job security itself is all about money, which makes money the only carrot and the only stick that IT needs.

That report, courtesy of Bloomberg BNA, said, "Many computer users automatically swat away repetitive dialogue box warnings of impending doom, especially when they are engaged in another activity. Now, engineers are using data analytics based on user tracking to discover what might help users pay attention to warnings. Software engineers are exploring promising techniques, such as changing background colors in warning notifications and switching formats to distinguish substantial security warnings from mundane messages. Tapping people's brains helps the engineers design more effective user interfaces."

"Especially when they are engaged in another activity"? As opposed to when, exactly? The problem this approach addresses is not the actual problem. Changing background colors and switching formats might help if the problem were that users weren't noticing these warnings.

In reality, the problem is that users don't care about these warnings. More precisely, users don't care about security nearly as much as they care about that other activity, which is typically work.

This is also the problem with much of security training. It teaches and preaches and drills and quizzes, all with the goal of making users familiar with proper security procedures. But it does little to convince them to prioritize those procedures over the work project that is about to hit its deadline.

Consider, for example, Employee Emma. Emma knows all about phishing schemes and how attachments can deliver viruses and Trojans.
But when Emma is rushing to finish a project by deadline and sees an email from her boss saying, "Urgent. Project change," with what appears to be a Word attachment, what is she likely to do?

Proper procedure for any attachment that wasn't explicitly expected is for Emma to phone, text or email her boss directly (but not by replying to the suspect message) to see whether her boss did indeed send her something. Alternatively, she can just click on the attachment and see what it is. Does she risk her deadline by ignoring what appears to be an urgent, project-related message from her boss?

Undercutting employees' incentive to do the right thing for security is the fact that the vast majority of email attachments from a boss are, in fact, legitimate attachments from the boss. Even with the rampant phishing attacks happening today, most attachments are legitimate, in the same way that most people ringing your home doorbell are not homicidal maniacs.

Statistical reality aside, employees' perception is that the odds are dramatically against their opening a contaminated attachment, having damage result and having that damage traced back to their own actions.

In short, employees are rushed, and they think it's a decent gamble to open attachments that at least look legit. (The really bad ones are easy to dismiss.)

If a company is serious about getting people to strictly and routinely follow proper security procedures, it needs to change those odds. Bosses should send and track attachments that their staffers did not expect. Anyone who opens one without checking should face some consequence.

It could be a small amount of pay docked, or it could be the reverse: a small amount of money awarded to people who, over the course of a month, never clicked on one of the trap attachments.

Call this catching employees at being good, if you will.
But somehow, you have to convince people to behave properly, and money is the only effective motivator you have.