How acceptable is your acceptable use policy?

Dec 14, 2022 | 8 mins
Access Control | Business IT Alignment | Business Operations

If users resent, fear, or ignore policies around the use of corporate resources, it may be time for a different approach that incentivizes rather than punishes.

Credit: Thinkstock

In a world before smartphones, social media, and hybrid workplaces, an acceptable use policy was a lot easier to write—and to enforce. These days, it’s a lot more complicated. Work can take place almost anywhere, on any number of devices. An employee can accept a job and then never physically set foot in the office, working from home (or the Caribbean) on their personal laptop. That’s why an acceptable use policy, or AUP, is more critical than ever—not just to protect the organization, but to protect employees as well.

What is an acceptable use policy?

From an IT perspective, an AUP outlines the acceptable use of corporate data, devices, and networks. In a hybrid workplace, that policy should also include terms and conditions for working on personal devices or home networks. And it should include guests, gig workers, contractors, and other non-employees who use company systems and networks.

Even if some of those terms and conditions may seem obvious (such as not watching porn on a company-issued laptop), it’s still important to have employees sign off on the policy so they’re aware of the rules—and the consequences of breaking them. After all, we may have speed limits, but people still speed.

“People know that cybersecurity is important,” says Alex Michaels, principal adviser at Gartner. “They just aren’t doing what we want them to do.” That’s because they may not view cybersecurity as their personal responsibility. Yet a significant number of data breaches are caused by human error, such as clicking on a malicious link. The problem is that many AUPs are written in technical jargon, full of “thou shalt not” phrases—or the security team simply printed out a generic template they found on the internet. But there are far more progressive, and effective, approaches to establishing and enforcing policies.

“A lot of people in the security space grew up in the security space,” Michaels says. “But what about involving experts who have knowledge in behavioral economics and change management? Those types of things should be part of the conversation as you write your policies and as you look to shift and reframe the perception of security.”

AUPs set rules around IT security policies

An AUP typically sets rules around IT security policies, such as passwords, authentication procedures, and the use of public Wi-Fi. It can also be used to set standards of behavior on social media sites.

“I think everybody needs to reassess this right now,” says Frank Sargent, senior director of security workshops with Info-Tech Research Group. In years past, an acceptable use policy was easier to enforce with technical controls, such as firewalls. Nowadays, many employees are working remotely, possibly even in another country (with or without their employer’s knowledge)—and that can have security, compliance, and tax ramifications. Companies need to “catch up with the new realities of how people are working,” says Sargent. “You can pull an Elon Musk and say ‘thou shalt be back in the office’ to manage that. But that’s going to be a short-term fix.”

While companies need a way to handle variances—such as accommodating an employee who wants to work in the Caribbean for the winter—they also need to ensure their policies remain relevant in a changing world.

“You’re going to have to continue to learn as an organization what these risks are to you, and keep adjusting your policies, keep adjusting your controls, keep adjusting how you’re assessing risk, so it gets to where you need it to be,” Sargent says.

Evolving your acceptable use policy

Your AUP needs to be auditable and enforceable—but there’s a tricky balance between protecting employees and making them feel like they’re working for an authoritarian regime. “It should be written to the end user rather than the technical person who works in security,” says Michaels. “One of the pitfalls that we see in the development of policies is the security leader will either own the creation of the policy or delegate it to somebody on their team, and they won’t go out and source feedback and check that they’re on the right track.”

More mature security programs source feedback and have closer partnerships with HR and the other functions in the business. But many companies are “still trying to do the basic blocking and tackling,” Michaels says. “They’re still more focused on the technology and the process rather than the people that they’re impacting.”

The AUP should be clear, concise, and easy to understand—not technobabble or legalese. But getting employee buy-in could also come down to something as simple as word choice. “My specialty is respectful language and policy,” says policy drafting expert Lewis Eisen. “Respectful language means policies that don’t sound like parents yelling at their children.” For example, rather than using the phrase “You must get the CEO’s permission before borrowing a laptop,” you could say “Laptops are available for borrowing with the permission of the CEO.”

“If you sound like a parent yelling at your kids, you’re going to get the same results parents get when they yell at their kids,” Eisen says.

Enforcing acceptable use policies

An AUP isn’t worth the paper it’s printed on if it isn’t enforceable. But “security people are not HR admins. They’re not disciplinarians. And they’ve taken on the role of disciplinarian, and I don’t believe that’s appropriate,” Eisen says. Indeed, they might not be comfortable in that role: they signed on to work in information security, not to police employees.

The security team has the subject matter expertise to determine what constitutes an infraction and the risk level of that infraction (from minor to fireable offense). But, says Eisen, disciplinary action should come from HR, which already has processes in place for such matters.

“[Today’s] HR department is not your grandfather’s HR department,” says Claudiu Popa, CEO of Datarisk Canada. “Now they’re accountable for enforcing policies like this one, because up until now, they’re like, ‘Oh, well, that’s obviously a technical policy.’ Well, guess what—it isn’t. They’re required to enforce this policy. So suddenly we have this requirement for HR to be properly trained in security.”

In some jurisdictions, HR could also oversee privacy compliance, so Popa recommends HR personnel undergo both security and privacy awareness training. He also recommends they be present at security team meetings, “because they need to understand what it is that they’re enforcing.” While a policy should have teeth, another approach to enforcement is incentivizing good behavior rather than doling out punishment for breaking the rules.

Make rules easy for users to follow

“People are exponentially more likely to do the things the security team wants them to do if they have a sense of responsibility as it relates to cybersecurity and the impact on the business,” says Michaels.

Gartner’s PIPE (practices, influences, platforms, and expertise) framework is designed to do just that: shift from awareness to behavior and culture. Part of that involves using “cyber judgment,” a term coined by Gartner, which helps users make informed decisions about risk in the absence of security or risk management leaders.

The other side of that equation is making it as easy as possible for users to follow the rules by reducing digital friction, Michaels says. When it comes to passwords, for example, requiring users to change them every six months creates a high level of digital friction, while dropping passwords altogether in favor of biometrics creates a low one.

Technology has a role to play here, such as using machine learning and artificial intelligence to drive decisions and provide targeted training moments about what users should or shouldn’t do in a particular situation. Michaels is seeing more sophisticated organizations bake their AUP into collaboration tools or the help desk, or automatically deliver it to users based on their role in a specific project.

“So, you’re not relying on things you learned 12 months ago that you quickly clicked through,” says Michaels. “It’s delivering that insight in real time, at the moment when you actually need it. We’re seeing more organizations and vendors hyper-focused on that.”

When an organization is updating or evolving its AUP, the focus shouldn’t just be on what employees should not be doing, but on what the organization as a whole can do to create a culture of security.

“There is a difference between following rules about security and adopting a culture of security at the organization,” says Eisen. “Are we locking people in cages with guards on every door? Is that the kind of culture you want? Or are we going to put in controls that are appropriate to the level of what it is that we’re asking for?”


Vawn Himmelsbach is a Toronto-based journalist with a keen interest in cybersecurity. She got her start as a reporter for tech trade magazines in the late ’90s, followed by three years as a correspondent in Asia—and she’s been writing about tech ever since.