Successful Security? Stop Blaming Users
Experts Offer Insights on Gaining Information Security Buy-In

To encourage individuals to improve their security practices, begin by not blaming them.
That was one takeaway that Angela Sasse, a professor of human-centered technology at University College London, offered at the Infosecurity Europe conference. Sasse is also the director of the U.K. Research Institute for Science of Cyber Security, or RISCS.
Officials at Britain's National Cyber Security Center - part of the GCHQ signals intelligence agency and a primary funder of RISCS - have made user-blaming verboten.
"It's counterproductive; it doesn't help us to change" or to develop more effective security models, Sasse said in a keynote presentation.
Organizations have to find collaborative ways of working with employees and getting them to change, she argued. But they also need to be open to rigorously testing security-related assumptions - and practices - and revising their thinking based on empirical evidence.
Sasse is a proponent of the discipline known as usability, which draws from such fields as design, psychology and cognitive science. Usability refers to the practice of not just designing and building something, but making it easy to master. Great usability often means building something that many people find intuitive to use - for example, the Amazon.com website, an iPhone, or the automated external defibrillators increasingly being installed in public places.
Usability is predicated on not just designing something - such as a product or system - but studying how people actually use it, refining the design, testing it again, refining it further, and repeating that process.
In an ideal world, information security wouldn't require any human interaction. It could just be like an airbag that deploys when necessary.
Of course, the real world is more complicated. Even so, people will not prioritize security simply because others lecture them about its importance.
Look Beyond Awareness
"Security awareness and education are not the answer," Sasse said. "Security is security experts' main job; for anyone else, it's a productivity drain. Users will bend over backwards to bypass things that make their life difficult."
Hence, security experts need to persuade individuals and business managers that they need to work together and then work with them to manage the risks they face, she said.
Old dogs will need time, encouragement and guidance to learn new tricks. And their teachers need to understand which levers work best.
"It's not for amateurs. If you've not been trained on how to change ... risky behaviors," Sasse said, "you're going to need help from professionals to do this. Changing a highly learned behavior is effort. It's like [Star Trek's] Jean-Luc Picard - 'engage, engage, engage, engage.'"
Don't Mistake Tests for Collaboration
All security guidance must be actionable, Sasse said. "Complexity and vagueness are the enemy. Communications must be NEAT - necessary, explained, actionable, tested.
"You want them to change their risky behaviors and to start doing good security practices. That is not a quick fix; that is not something you're going to achieve by sending out a few emails."
Sasse also sees phishing tests as a waste of time. Such tests subject employees to fake phishing attempts in an effort to raise security awareness. "I'm against them, totally and unequivocally," she said. "I think it's the wrong approach, because it's not really engaging users, it's not really building the [security] mindset.
"Half of security problems are down to crap IT, So instead of Band-Aid spending, why not focus on the real, underlying problems? ... What you want is to build this collaborative mindset."
Test All Assumptions
Security professionals should not be afraid to upend their long-held assumptions if field tests with users reveal flawed models.
"Don't be afraid to change; accept the fact that we have done things imperfectly in the past, but now is the time as a profession to move on," Jonathan Kidd, CISO at financial services company Hargreaves Lansdown, said during an Infosecurity Europe panel discussion.
For example, Sasse - also a panel participant - pointed to the new password guidance NCSC issued last year, which she said was the result of RISCS research into how people actually use passwords in the real world (see Why Are We *Still* So Stupid About Passwords?).
NCSC's new recommendations included no longer forcing regular password expiration. For years, a prevalent line of security thinking has been that passwords should be reset every 30 or 60 days. But press security experts for an explanation, and what emerges is that the time period reflected how long it would theoretically have taken an attacker who dumped all of the password hashes from a 1980s-era Unix system to brute-force crack them.
Properly designed password systems today - salted, deliberately slow hashes with a high work factor - can keep passwords uncrackable not for days or months, but for decades or centuries.
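To make that claim concrete, here is a minimal Python sketch of the underlying arithmetic, assuming PBKDF2 from the standard library as the slow hash; the iteration count and the 12-character random password are illustrative assumptions, not NCSC recommendations.

```python
import hashlib
import os
import time

# Illustrative work factor only; real deployments should follow current
# guidance and may prefer a memory-hard KDF such as Argon2. PBKDF2 is
# used here simply because it ships with Python's standard library.
ITERATIONS = 600_000  # each guess costs this many HMAC-SHA256 rounds


def hash_password(password: str) -> tuple[bytes, bytes]:
    """Derive a salted, deliberately slow hash; returns (salt, digest)."""
    salt = os.urandom(16)  # unique per password, so precomputed tables don't help
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest


# Back-of-envelope crack-time estimate: time one guess on this machine,
# then scale to an assumed keyspace. A random 12-character password over
# 62 alphanumeric symbols gives 62**12 (about 3.2e21) candidates.
start = time.perf_counter()
hash_password("example-password")
seconds_per_guess = time.perf_counter() - start

keyspace = 62 ** 12
avg_years = keyspace / 2 * seconds_per_guess / (3600 * 24 * 365)
print(f"{seconds_per_guess * 1000:.0f} ms per guess -> "
      f"~{avg_years:.1e} years to crack on average")
```

Even granting that real attackers with GPU rigs guess far faster than this single-machine estimate, the arithmetic shows why the defense rests on combining a slow, salted hash with a password of adequate entropy rather than on forcing frequent resets.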
But Sasse said usability research on passwords revealed that users who are forced to change their passwords too often settle on easy-to-remember options or reuse passwords across accounts. Such choices make passwords easier to crack and are one reason why many security experts no longer advocate forced password expiration.
Train Technologists to Speak Plainly
A report from the SANS Institute "Securing the Human" project found that many security awareness programs are run by those who have a technology background, rather than a background in disciplines such as sociology, psychology or communications that are devoted to studying groups of users and persuading them to change their behavior.
Cybersecurity consultant Jessica Barker said at the conference that this skill deficit in information security teams can be a major liability. Accordingly, for any organization that says it does "awareness training," Barker said she'll always ask: "Do you have human training for the technologists?"
"Organizations are expecting human teams to learn technology, but not technologists to learn human," Barker quipped.
Avoid Fear
She also warned against using fear to influence behavior.
"If you're talking about something scary, it's important to say how they're susceptible to that threat, and how you expect them to respond, as well as the time and resources they will need to respond," Barker said. "We can't just give people the problem; we need to give them the solution."
She also cautioned against focusing exclusively on bad behaviors, rather than rewarding positive ones. "Too often we become the department or organization that is telling people off, rather than saying, 'That was a good job; well done.'"