Cyber security and the importance of usability

There is nothing new or unusual about the need to design usable systems. A whole industry has grown up around the business of making sure that commercial websites and apps are easy to use and deliver the behaviour, such as spending money, that the owners of those websites and apps want to see.

Usable systems generally require three things: the system has to be useful, or at least perceived as useful, by the end user; it has to be easy for the end user to use; and it has to be persuasive, so that the user takes the actions that the owner desires.

Is cyber security any different?

These three requirements of utility, usability and persuasiveness are seen in cyber security systems. However, there are some differences compared with the consumer-facing world. Making sure a cyber security system succeeds is in some ways more important than making a commercial system succeed.

One issue is that a cyber security system has to work for everyone: if just one person fails to use the system properly, the whole organisation may be put at risk.

In addition, cyber security systems are like stable doors: they need to be shut before the horse has bolted, because there is no use locking them after a breach has happened. If an online shop doesn’t work for some reason, the user can go back and try again; with a cyber security system, if it doesn’t work the first time, the damage may already be done.

These are stringent requirements. Unfortunately the nature of cyber security means that these requirements are hard to meet:

  • Users have little motivation to comply with security requirements as keeping secure is not their main purpose; indeed security systems are part of a technical infrastructure that may have no real meaning or relevance to the end users
  • Security systems can “get in the way” of tasks and so can be thought of as a nuisance rather than a benefit
  • Security systems are often based on arbitrary and little understood rules set by other people, such as those found in security policies, rather than on the desires of the end user
  • Users may find complying with the requirements of security systems socially difficult as they may force the user to display distrust towards colleagues

These are all challenging issues, and any security system you design needs to ask the very minimum of effort from the user if it is to overcome them.

Unfortunately many cyber security systems demand a degree of technical knowledge. For instance they may use jargon: “Do you want to encrypt this document?” will have an obvious meaning to anyone working in IT but may mean nothing to some users.

Furthermore some security requirements may of necessity require a degree of “cognitive overload”: the requirement to remember a strong password (perhaps 12 random characters) is an example. Again this will cause additional difficulty.
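The scale of that overload is easy to put a number on. Password strength is usually measured in bits of entropy: a password of n characters drawn uniformly at random from an alphabet of s symbols carries n × log2(s) bits. A short Python sketch (the 94-symbol printable-ASCII alphabet used here is an illustrative assumption):

```python
import math

def password_entropy_bits(length: int, alphabet_size: int) -> float:
    """Entropy (in bits) of a password of `length` characters drawn
    uniformly at random from an alphabet of `alphabet_size` symbols."""
    return length * math.log2(alphabet_size)

# 12 random printable-ASCII characters (94-symbol alphabet, an assumption)
strong = password_entropy_bits(12, 94)
# A typical memorable 8-character lowercase password, for comparison
weak = password_entropy_bits(8, 26)

print(f"12 random chars: {strong:.1f} bits")   # prints "12 random chars: 78.7 bits"
print(f"8 lowercase chars: {weak:.1f} bits")   # prints "8 lowercase chars: 37.6 bits"
```

Twelve random printable characters give roughly 79 bits of entropy – excellent for security, but far beyond what most people can memorise reliably, which is exactly the trade-off a usable system has to manage.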

Users are not naturally motivated towards cyber security systems. And they may find them hard to use. So how can success – universal and efficient use of systems – be achieved?

Delivering success

Start with the end user. Use a combination of interviews (including the standard “think aloud” protocol used by many UX practitioners), observation and expert evaluation to identify where the obstacles to successful use of the system lie. Obviously the usual rules of good usability will apply: consistency, reduced cognitive load, feedback, and help when mistakes are made.

Learnability is also important. Accept that some form of help may be needed by the user and ensure that this is available, ideally within the system. Help files shouldn’t just tell people how to achieve something; they should also explain why it is important.

But for cyber security systems there is also a lot of work to be done around persuasion. This will involve educating the end user about the importance of the system – how it protects their organisation, and how it protects them as individuals.

It will also involve ensuring that the system is credible – that end users realise that the system does what it is supposed to do and isn’t just a tick-box exercise or something dreamed up by the geeks in IT to make everyone’s lives that little bit harder.

And it will involve demonstrating to the end user that all their colleagues are using the system – and if they don’t use it then they will be out of line with the majority.

“Usability is not enough” is a common theme in retail website design. It is even more important in the design of cyber security systems.


Why Human Resources need to engage with cyber security

You may think that cyber security is something for your IT department to manage. If you work in Human Resources, you need to think again. Because cyber security is very much your responsibility.

No, I am not saying you need to go around seeing if your organisation has installed the latest firewall or if all your Internet of Things ports have been secured.

What you do need to do though, is to check whether your colleagues across the organisation are cyber safe.

That’s because only around one third of data breaches are caused by malicious outsiders. The rest are caused by insiders – your colleagues – acting foolishly, carelessly and, yes, sometimes maliciously.

What can go wrong? A lot of things. Personal information about customers is leaked because a laptop gets left in a taxi. An email leads to an unintentional contract variation. A social media post leads to a libel suit. An unwary worker shares their log-in details, leading to data theft.

So what should you be doing?

Start with strategy

A good place to start is strategy. Most organisations have some understanding of cyber risk. But often they focus on protecting corporate networks from external risks such as hackers. What is your organisation’s cyber security strategy? Does it include sufficient analysis of internal “human” risks? If it doesn’t then you need to work with the Information Security team to identify and manage these human risks.

Develop practical policies

Developing appropriate policies to help manage cyber security and spell out the “rules” is important. You are likely to need policies in several areas: web and computer use, data privacy, social media use at work, a “Bring your own device” policy to manage personal phones and tablets, and even policies about the software and cloud services that people are allowed to use.

Writing these policies should not be a “tick box” exercise. They need to make sense: they should be easy to understand by everyone in the organisation; and they need to benefit your organisation. They shouldn’t simply be designed to make the IT department’s life easier. Sure, pouring digital super-glue into all the USB ports would stop people uploading corporate data to insecure USB sticks, but it might not improve business efficiency. HR executives, with a feel for wider business needs, as well as an understanding of what will motivate or demotivate employees, are an essential part of any cyber policy development process.

Training: tell people how they should behave

The next step is training. Training is essential because without it most people won’t know how to act in a cyber safe manner.

You might as well accept that almost no one is going to read your policies. So you will have to tell everybody about them, face to face. And it won’t be enough to read out a list of rules and corresponding sanctions for disobedience. Apart from putting everyone’s backs up, people will generally ignore rules if they don’t know why they are in place. You will need to explain what the rules mean, why keeping to them is important, and quite possibly when they can be ignored (and when they can’t).

You will need to train the way people think too. This isn’t just about describing dangers: it’s about how people interact safely with colleagues, with suppliers and customers, and with people outside the organisation. It’s not about following rigid processes: it’s about understanding how to avoid risk in the first place. For instance you can’t tell people the precise information to avoid sharing on social media. But you can help them understand what types of information they shouldn’t share and how competitors can draw conclusions from seemingly innocent pieces of data.

Build continued awareness

Don’t think a one-off (or even annual) training session will cut it though. You need to keep awareness of cyber safe behaviour at the front of people’s minds. This means developing assets designed to deliver continued awareness of cyber risks – posters (that change design and location regularly), screen savers, sign in messages, even mouse mats and mugs.

Develop a cyber secure culture

An even more important issue for HR to address is culture. An organisation that doesn’t take cyber security seriously is unlikely to be changed by training and awareness. HR may need to address underlying cultural assumptions.

Start by auditing the security culture. Do this from the perspective of employees: what cyber risks do they know of; what do they think of existing security processes; to what extent do they feel security is their responsibility? And do it from the perspective of the organisation: how are employees expected to behave; what sort of resources are provided for security; is dangerous behaviour stopped, tolerated – or not even noticed?

Once you know what needs to change, you can start thinking about how to do that. Build persuasion tools, such as leader boards of cyber-safe behaviour; incentivise safe behaviour with praise or other rewards – and make sure it is not disincentivised accidentally; ensure that leaders walk the cyber security walk; develop an intolerance to unsafe behaviour. (“Why are you putting my job at risk by doing that?”)

But don’t develop a blame culture. That way you will just drive unsafe behaviour underground.

Encourage people to be less trusting

Sadly, one element of culture you will need to work on is trust. People are often very trusting and this can be a problem for cyber security. They need to be taught to question: emails don’t always come from the people they appear to, friendly people on the phone aren’t always who they say they are, confident people striding round the office without a visitor’s badge don’t necessarily have the right to be there. Defending against people who take advantage of trust doesn’t need complex software: it needs awareness, sometimes combined with robust processes.

Make sure cyber security is usable

HR teams also need to work on the usability of any security processes.

By their nature most IT people are very logical. In addition they understand the purpose of systems they are developing. And of course they are focussed on their responsibility to protect IT systems.

In HR you are also focussed on cyber security. But you may have a wider view of the organisation. Almost certainly you understand what motivates people. You understand how people perform their tasks. And you probably provide a receptive ear to frustrated colleagues. In fact you are probably going to be one of the first people to hear about cyber security initiatives that are counterproductive – because they block efficiency. And you may even hear how people would like to alter them.

All this means that you are in pole position to identify usability problems, to construct the analysis that proves (to sceptical colleagues in IT) problems exist and to make the case for change.

Monitor “off network” activities

Not everything that should concern your organisation will be happening within your corporate network. Your colleagues, almost inevitably, will be using social media. And many will be commenting on colleagues, clients, your organisation and your industry. In addition they may be using cloud computing services such as Dropbox and Google Docs to store, edit and share corporate information. This type of activity needs to be managed, to preserve information security and to protect reputation.

Recruit sensibly

When recruiting, watch out for people who may not be cyber secure. Anyone who comes from a competitor boasting they can bring a list of clients on a disk may well be less than trustworthy. You might also need to think twice about people whose social media posts are irresponsible – perhaps complaining about their current employers or giving information away about new initiatives.

Keep an eye on risky people

Some people will be higher risks than others. Sometimes this will be a result of personality. For instance sales people are likely to be more open, and possibly more trusting, than finance people. But that’s not where the real risk lies. The people you will need to monitor most closely are those who feel disengaged from your organisation. These may include temporary staff, new recruits during a probation period, people on low pay or in boring jobs, people who have handed their notice in, and people who are having difficulties at work, perhaps experiencing disciplinary procedures.

Yes, cyber security really is an issue for HR

Human Resources managers may not be particularly focussed on technology. But they have a responsibility to learn about cyber security because the role that HR can play in preserving security is an enormous one. In other words, if your HR and IT departments are not working closely together on cyber security you are opening your organisation up to some major and unnecessary risks.

Why your employees are your biggest cyber threat

People and cyber risks

Cyber threat is a problem. 90% of large UK organisations suffered an information breach in 2014. But ask an IT manager what keeps them awake at night and they are likely to say “my colleagues”.

Human error is responsible for around two thirds of data breaches in the UK with only one third being caused by malicious outsiders.

These human errors vary widely: the use of weak passwords, the loss of mobile phones containing confidential information, emails forwarded accidentally, and people succumbing to phishing attacks that steal log-in details.

Why are people such a risk? There are three main problems: ignorance, inconvenience, and trust.

Ignorance

When were you last trained on cyber risks? Chances are that if you don’t work in IT you won’t have had any training beyond an IT “policy” hidden somewhere in your employee handbook.

And yet there are cyber risks everywhere: people who use public wi-fi to log on to your corporate network; people who store sensitive information such as a new product design insecurely in the “cloud”; people who accidentally give away strategic plans through conversations or behaviour on social media.

It isn’t sufficient to tell people about the risks. You also need to help people understand the importance of complying with information security policies. Too many people feel that security policies are irrelevant: perhaps they think a security breach won’t affect them; or they feel that it’s not their job to police security; they might even think they are too important to bother with security rules.

Inconvenience

Badly designed systems that are inconvenient to use are another major cause of cyber risk. If security requirements get in the way of doing a job efficiently, people will look for ways to get around them. Usable systems need to be developed with input from users, so that they protect corporate systems but avoid hampering employees. Forget that simple rule and expect the number of information breaches to grow.

Trust

The fact that most people are very trusting is also a problem for cyber security. Passwords get shared because people trust colleagues to act appropriately – even though sometimes they don’t. And trust is the reason that so many people fall for phishing attacks.

People are social animals. Because we trust people we have a tendency to follow the crowd. If everybody is doing something, then we will do it too. This is particularly true when that “everybody” is influential. In other words, if the CEO is seen to be routinely flouting cyber security requirements, they shouldn’t be surprised if the rest of the company does it too.

Managing people risks

Managing cyber risk isn’t easy – because managing people isn’t easy. You can tell them what to do but that doesn’t mean they will do it!

Nonetheless, the first step is education. Explaining cyber risks and why they are important should be done face to face. Do it regularly to keep it front of mind. And use different media to keep awareness up: emails, posters, on-screen messages, “advertisements” on the intranet. And socialise it: use the fact that we are social animals by presenting and discussing cyber security advice in groups, and by encouraging people to share best practice.

Back up your education with appropriate tools – to make it easy for people to comply with the guidelines, or to monitor and manage people’s compliance. There are numerous tools although of course the resources your organisation has to hand will dictate how many can be used.

Consider email management tools that can encrypt content, prevent alteration of emails, or manage the distribution of content and attachments. Investigate “Bring your own device” tools such as software that allows mobile devices to be locked or even wiped if they are stolen. Password sharing is also a problem, especially in relation to corporate social media accounts. The solution here may be implementing “single sign on” systems that allow people who sign on to a corporate network to be given access only to those systems they are authorised to access.

You may also want to stop your employees from being so trusting. A good place to start is with an anti-phishing tool. These allow organisations to create and circulate spoof phishing emails which flash up warning messages when clicked on and record data about who is being fooled by them.
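The record-keeping behind such a campaign is conceptually simple: note who clicked the spoof link and report the proportion fooled. A minimal Python sketch of that logic (the function names and data shapes are illustrative assumptions, not any real product’s API):

```python
# Minimal sketch of the record-keeping behind a phishing simulation:
# log who clicked a spoof link, then report the "fool rate".
# All names here are illustrative assumptions, not a real product's API.

def record_click(clicks: set, user: str) -> None:
    """Record that `user` clicked the spoofed phishing link."""
    clicks.add(user)

def fool_rate(clicks: set, recipients: list) -> float:
    """Fraction of recipients who clicked the spoof email."""
    if not recipients:
        return 0.0
    return len(clicks) / len(recipients)

recipients = ["alice", "bob", "carol", "dave"]
clicks: set = set()
record_click(clicks, "bob")
record_click(clicks, "dave")

print(f"{fool_rate(clicks, recipients):.0%} of staff clicked")  # prints "50% of staff clicked"
```

In practice the valuable output is not the raw rate but the follow-up: who needs extra training, and whether the rate falls over successive campaigns.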

Finally ensure that you manage people appropriately. Personalise the information they get so that it is perceived as relevant. Play games with them, such as spoofing phishing attacks and seeing whether they fall for them. Give them instant feedback about the things they do well – and the things they do badly. And don’t expect people to change all of their risky behaviour overnight – push them gently towards safety by suggesting a series of small changes over time.

It’s important not to forget network security when thinking about cyber security. But with so much information being held and used outside the corporate network it is vital to address the very real cyber risks that your employees represent.

Why social media privacy settings are a waste of time

Social media sites: they are private, right? There are lots of privacy settings; so whatever I post is safe and secure and can only be seen by people I choose. Right?

Wrong!

Social media sites are not private. So if you wouldn’t want your mum (or your boss) to see something, then don’t post it on a social media site.

First of all, are you sure you have your privacy settings set in a way you want them? Or are you just trusting the default settings?

While the majority of people do alter their privacy settings, around 40% of people have either public or only partially private settings. And while Facebook is making efforts to increase the ability of users to tweak their privacy settings, the very fact that it is having to do this shows that there is a problem. And if you don’t have your privacy settings the way you want them, the chances are you are sharing information you don’t want to.

But difficulty choosing the right privacy setting is not the only problem. Another problem involves who you choose to share with.

The average Facebook user has 338 Facebook “friends”. And yet, according to researchers at Oxford University, the average person has fewer than 10 close friends. So that’s about 330 people on Facebook most people can’t be sure they can trust. (Even if you are sure you can trust your real friends…) Sharing only with Facebook friends doesn’t guarantee that those Facebook “friends” won’t share your embarrassing posts with the wider community.

And using ephemeral sites like Snapchat doesn’t necessarily lessen the risk. Those ephemeral photos may well be stored deep in the recipient’s phone, and in any case it is a simple matter to take a screenshot of them or even just tap them to store them for future use.

The potential lack of privacy doesn’t end there. The risk of a social media account being hacked is considerable, especially when poor passwords are used. And if that happens then who knows where those embarrassing posts will end up! And finally of course you are trusting that the platform itself won’t get hacked or share information by mistake.

The wrong privacy settings. Friends you can’t trust. Ephemeral content that really exists for ever. Accounts getting hacked. Websites releasing your information by mistake. All in all, social media platforms are not guaranteed to preserve your privacy.

And as that is the case, then you should make sure that you could never be ashamed of anything you post.

Digital natives? Meh! (or “Prensky revisited”)

Call me old fashioned, but: I really don’t think business needs to worry about so called “digital natives”.

Who are they? Well, I suppose we had better go back to the person who invented the term, Marc Prensky, who used it when talking about students enrolling at university in 2001. These people would have been born in around 1983, so they will be around 30 now. 

These people (so the Prensky theory goes) are radically different from previous generations, his so-called “digital immigrants”:

  • They are used to receiving information really fast
  • They like to parallel process and multi-task
  • They prefer their graphics before their text rather than the opposite
  • They prefer random access (like hypertext)
  • They function best when networked
  • They thrive on instant gratification and frequent rewards
  • They prefer games to “serious” work

All of this is because they have been surrounded by digital technology such as “computers, video games, digital music players, video cams, cell phones” for most of their lives. And because of this there is a “discontinuity” between these people and the people who grew up before them.

It’s probably unfair to criticise Prensky at this distance in time. But the trouble is people are still talking about his digital natives and saying that they are somehow different from the rest of us. I’d disagree. For a start most of the technologies that Prensky mentions had been around for quite a while in 2001 and would have been very familiar to many “digital immigrants” as well as his “digital natives”.

Many people growing up in the 1970s had been exposed to computers. They would have been playing video games from an early age (Space Invaders anyone?). They would have used portable (if not digital) music players on the way to school and video cams at the weekend.

Other technologies came too late for Prensky’s “digital natives”. Cell phones? These didn’t achieve 10% penetration in the USA until about 1993, so most “digital natives” didn’t grow up with them. Same for the internet: when our “digital natives” were around 12 years old in 1995, internet penetration in the USA had only just hit 10%.

OK, perhaps Prensky was a bit too quick off the mark back in 2001. Those digital natives weren’t really around then. But they might exist by now. And, if you have read this far, that might be what you are worried about.

Perhaps people who grow up with digital technology that they

  • interact with
  • regularly
  • at home

really are different. Perhaps, as some people have claimed, their brains are wired differently because of their early experiences. That might be. But I don’t think the differences are those that Prensky described.

They like to parallel process and multi-task: they are used to doing homework while watching TV or listening to music. That’s really new? I don’t think so. It’s been standard teenage and student behaviour ever since I was a kid. (It’s also inefficient, as most psychologists will tell you, which is why it is illegal to use your mobile when you are driving a car.)

They prefer their graphics before their text rather than the opposite. This isn’t particularly important from a business perspective, but again I am not convinced this is new. You just have to look at learn-to-read books from the 1950s (and earlier) to see that graphics and text often go together in a learning context (not “before” or “after”).

They function best when networked. It is increasingly true that networked information is always available. This is changing people, but not necessarily for the better. For example, people who come to rely on SatNav devices tend to lose map-reading skills; they follow navigation instructions rather than using common sense. And it isn’t just (some) digital natives who display an over-reliance on information from third parties. This is, unfortunately, behaviour that anyone can learn.

They thrive on instant gratification and frequent rewards. Most people do. Only the most disciplined can resist the opportunity of getting something now at the expense of having less in the future. Ask the credit card companies.

They prefer games to “serious” work. Only an academic would think this an unusual human condition.

This leaves one thing that could actually be connected with technology. They prefer random access to information. If this is true, there may well be implications for education in terms of course structure and the structure of learning aids.

But is it something that business needs to respond to (beyond including hypertext links in websites)?

In some cases, it is. Music and other media products (generally excluding things with a “story” element) need to allow random exploration.

But what else does? Not cornflakes (random information features would be boring). Not cars (random information features would be dangerous). Not retail (random menus would be highly frustrating).

In fact randomness of information access is not a particularly useful feature for most products, although of course the option to explore information in a non-linear fashion is. There is little new about that, however: even printed books have indexes.

None of this is to say that society isn’t changing. It is. But there hasn’t been a discontinuous change that makes digital natives (whenever they were born) different from digital immigrants. One thing less to worry about? Perhaps. But that still means you need to understand the behaviour of today’s “digital consumers”. And some aspects of it might just surprise you.

To find out how your business needs to respond to changing consumer behaviour why not email me at jeremy@mosoco.co.uk.