The dangers of hidden data

How many times have you leaked strategic data by accident? And do you even know when you have?

There are a multitude of opportunities to accidentally share strategic information with third parties such as clients and suppliers. Information that could seriously damage your negotiating position. And if you are not aware of these dangers, it is all too easy to do.

Take Microsoft Office documents. If you ever share Excel spreadsheets with clients, do you make sure that any “hidden” columns don’t contain information you would rather keep private? Creating pivot tables to communicate your data analysis? Are you sure that the original detailed data isn’t available somewhere? And what about PowerPoint? Are those “Notes” pages suitable for sharing, or do they contain thoughts that you would rather not put in writing? And those text boxes that you pulled off the side of slides when you were writing them – you know they are still there of course!

Have you collaborated with others to produce a document? Most likely you will have written notes and tracked changes. If you are not careful, much of the history of your document could be available to the final recipients – and that could be embarrassing!

Don’t forget document metadata either. Are there any interesting titbits in the “Properties” of your documents – the original author perhaps, or the date the document was first drafted? Who knows what value that might have to someone else.
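
Curious what is travelling with your own files? A .docx file is simply a zip archive, and those “Properties” live in a small XML file inside it. Here is a minimal sketch, using only Python’s standard library (the helper name is mine), that lists them:

```python
import zipfile
import xml.etree.ElementTree as ET

def docx_metadata(path):
    """Return the core document properties hidden inside a .docx file.

    A .docx is just a zip archive; the author, creation date and other
    "Properties" live in docProps/core.xml.
    """
    with zipfile.ZipFile(path) as z:
        xml_bytes = z.read("docProps/core.xml")
    root = ET.fromstring(xml_bytes)
    # Strip the XML namespaces so we get readable names like "creator".
    return {elem.tag.split("}")[-1]: elem.text for elem in root}
```

Running this over a typical document will often reveal a creator, a “last modified by” name and revision dates that the sender never intended to share.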

Perhaps you think you have blocked some text out. Ineffective “redaction” is the cause of a lot of data leakage. For instance, blocking out text using a “highlight” the same colour as the text won’t delete it – and it could be very easy to get rid of the highlight.

It’s not just documents though. There are lots of places where information can be hidden. Are your social media posts geo-tagged for instance? If you are regularly visiting a particular location, that could be of interest to competitors – or your colleagues.

Software can be another culprit. Is there any hidden text in your website, perhaps in an “invisible” font or in a comment tag? And that software you have commissioned – are you sure the developers haven’t left any notes that could give away secrets?
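
A quick way to check your own pages is to scan them for comment tags, a common hiding place for forgotten developer notes. This sketch uses Python’s standard library; the class and function names are illustrative:

```python
from html.parser import HTMLParser

class CommentFinder(HTMLParser):
    """Collect the text of every HTML comment in a page."""
    def __init__(self):
        super().__init__()
        self.comments = []

    def handle_comment(self, data):
        # Called once for each <!-- ... --> block encountered.
        self.comments.append(data.strip())

def find_comments(html):
    finder = CommentFinder()
    finder.feed(html)
    return finder.comments
```

Point it at your site’s source and you may be surprised what the developers left behind.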

Is there strategic data hidden in plain sight? You might be surprised where interesting data lurks. Security blogger Brian Krebs tells how he analysed an airline boarding card and found a wealth of information in the bar code – including information that could have helped him disrupt future travel plans.

And finally – do be careful how you delete sensitive files. It isn’t sufficient to “delete” them as they will probably still exist in some form on your hard drive, easy for anyone reasonably skilled to find. You need to actively scrub them out. There is plenty of free software available online to do this. (Make sure you do this carefully when you recycle a personal computer or smartphone.)
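
To illustrate the idea of scrubbing rather than merely deleting, here is a minimal Python sketch (the function name is mine) that overwrites a file with random bytes before removing it. Note this shows only the basic principle: on SSDs and journalling file systems, overwriting in place gives no guarantees, which is why dedicated scrubbing tools or full-disk encryption are the safer choice.

```python
import os
import secrets

def scrub_file(path, passes=3):
    """Overwrite a file with random bytes several times, then delete it.

    This reduces the chance that the original contents can be recovered
    from the disk after deletion (a plain delete only removes the
    directory entry, not the data).
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(secrets.token_bytes(size))
            f.flush()
            os.fsync(f.fileno())  # force the overwrite onto the disk
    os.remove(path)
```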

The data you don’t value is often surprisingly valuable to other people, especially competitors and suppliers. Don’t share it accidentally because you simply can’t see it.

Cyber security and the importance of usability

There is nothing new or unusual about the need to design usable systems. A whole industry has grown up around the business of making sure that commercial websites and apps are easy to use and deliver the behaviour, such as spending money, that the owners of those websites and apps want to see.

Usable systems generally require three things: the system has to be useful, or at least perceived as useful, by the end user; the system has to be easy for the end user to use; and the system has to be persuasive, so that the user takes the actions that the owner desires.

Is cyber security any different?

These three requirements of utility, usability and persuasiveness are seen in cyber security systems too. However, there are some differences compared with the consumer-facing world. Making sure a cyber security system succeeds is in some ways more important than making a commercial system succeed.

One issue is that the cyber security system has to work for everyone: potentially if just one person fails to use the system properly then the organisation will be put at risk.

In addition, cyber security systems are like stable doors – there is no use shutting them after the horse has bolted. If an online shop doesn’t work for some reason then the user can go back and try again, but with a cyber security system, if it doesn’t work first time then the damage may be done.

These are stringent requirements. Unfortunately the nature of cyber security means that these requirements are hard to meet:

  • Users have little motivation to comply with security requirements as keeping secure is not their main purpose; indeed security systems are part of a technical infrastructure that may have no real meaning or relevance to the end users
  • Security systems can “get in the way” of tasks and so can be thought of as a nuisance rather than a benefit
  • Security systems are often based on arbitrary and little understood rules set by other people, such as those found in security policies, rather than on the desires of the end user
  • Users may find complying with the requirements of security systems socially difficult as they may force the user to display distrust towards colleagues

These are all challenging issues, and any security system you design needs to ask the very minimum of effort from the user if it is to overcome them.

Unfortunately many cyber security systems demand a degree of technical knowledge. For instance they may use jargon: “Do you want to encrypt this document?” will have an obvious meaning to anyone working in IT but may mean nothing to some users.

Furthermore some security requirements may of necessity require a degree of “cognitive overload”: the requirement to remember a strong password (perhaps 12 random characters) is an example. Again this will cause additional difficulty.

Users are not naturally motivated towards cyber security systems. And they may find them hard to use. So how can success – universal and efficient use of systems – be achieved?

Delivering success

Start with the end user. Use a combination of interviews (including the standard “think aloud” protocol used by many UX practitioners), observation and expert evaluation to identify where the obstacles to successful use of the system lie. Obviously the usual rules of good usability will apply: consistency, reduced cognitive load, feedback, and help when mistakes are made.

Learnability is also important. Accept that some form of help may be needed by the user and ensure that this is available, ideally within the system. Help files shouldn’t just tell people how to achieve something but also why it is important.

But for cyber security systems there is also a lot of work to be done around persuasion. This will involve educating the end user about the importance of the system – how it protects their organisation, and how it protects them as individuals.

It will also involve ensuring that the system is credible – that end users realise that the system does what it is supposed to do and isn’t just a tick box exercise or something dreamed up by the geeks in IT to make everyone’s lives that little bit harder.

And it will involve demonstrating to the end user that all their colleagues are using the system – and if they don’t use it then they will be out of line with the majority.

“Usability is not enough” is a common theme in retail website design. It is even more important in the design of cyber security systems.

A New Year’s resolution for CEOs

“I am going to take cyber security seriously in 2016.”

On the whole senior executives claim that they want to act in an ethical manner. And yet if they fail to embrace cyber security they are clearly lying.

Why do I say that? Because playing fast and loose with customer data wrecks lives. It is as simple as that. Lose your customers’ data and you expose them to a major risk of identity theft – and that can and does cause people massive personal problems.

The problems that David Crouse experienced in 2010 are typical. When his identity was stolen he saw $900,000 in goods and gambling drained from his credit card account in less than six months. His credit score was ruined and he spent around $100,000 trying to solve the problems.

Higher interest rates and penalty fees for missed payments just made his financial situation worse. His debts resulted in his security clearance for government work being rescinded. Having lost his job, other employers wouldn’t touch him because of his debts and credit score. He felt suicidal. “It ruined me, financially and emotionally” he said.

Data breaches frequently result in identity theft. And this can have a devastating emotional impact on the victims, as it did with David Crouse. Research from the Identity Theft Resource Center indicates that 6% of victims actually feel suicidal while 31% experience overwhelming sadness.

The directors of any company whose negligence results in customers feeling suicidal cannot consider themselves to be ethical.

Unfortunately most data breaches that don’t involve the theft of credit card details are dismissed by corporations as being unimportant. And yet a credit card can be cancelled and replaced within hours. A stolen identity can take months, or longer, to repair.

And all sorts of data can be used to steal an identity. An email address and password; a home and office address; the names of family members; a holiday destination; a regular payment to a health club… Stolen medical records, which are highly effective if you want to steal an identity, will sell for around £20 per person online, while credit card details can be bought for as little as £1. Go figure, as they say in the USA.

Organisations must accept that any loss of customer data puts those customers in harm’s way. And if they want to be seen as ethical they must take reasonable steps to prevent data breaches. Until they do, well the EU’s new data protection rules can’t come on-stream quickly enough for me!

Does your cyber security have the right aura?

Can cyber security have auras?

How can cyber security have an “aura”? It sounds like a meaningless question. But step back a little and think about how direct marketing works.

Commonly, people in direct marketing use a simple mnemonic to describe the steps they take consumers through when persuading them to buy: AURAL. I think this is relevant for cyber security.

AURAL stands for Awareness, Understanding, Relevance, Action, and Loyalty. In other words:

  1. You start by making people aware of your product
  2. You move on to helping them understand what it does – its benefits and features
  3. Then you persuade them that the product is relevant for their own needs, that it solves a particular problem they have
  4. Next you call them to the action you want them to take, which is generally putting their hand in their pocket and shelling out for whatever you are selling
  5. And finally you hope to generate some loyalty so that they will come back and buy again, and perhaps even recommend your product to their friends.

As I said, this process (which by the way doesn’t have to be linear) is pretty relevant for cyber security too. Except that “loyalty” isn’t really appropriate. But rather than simply getting rid of the “L” I am going to change it to an S: AURAS. The final S stands for Socialise. You will see what I mean in a moment.

So what do I mean by “AURAS”?

Awareness

As with direct marketing, in cyber security we need Awareness. This is aimed at keeping cyber threats, and the need for cyber security, at the front of everyone’s minds.

You might create awareness with posters (remember to move them around and change their message so that people don’t become blind to them), emails (personalised messages can be highly effective), messages when people start their computers up or start to do certain things (again remember to change them), even things like mugs and mouse mats which can be given to reward cyber safe behaviour.

Understanding

It isn’t enough to be aware of a threat though. People also need Understanding about what they can do. For instance, if you have a policy of insisting on complex passwords that are changed every month then you need to give people the tools to do this – otherwise they are likely to write their passwords down on sticky notes and put them on their monitors, hardly the cyber safe behaviour you want to encourage. (There is a hint about complex passwords at the end of this post.) This is where training comes in: helping people understand how they need to behave to keep safe.

Relevance

You also need to ensure that people feel the training they have had has real Relevance to their own lives. Not everyone lives to work. Most people regard work as a way of getting the things they want in life. Of course their job is important – so stressing that unsafe behaviour could damage their employer, and hence their own job, is one tactic.

A stronger tactic though (and one that might even generate a bit of gratitude) is to show them how being cyber safe can help them outside their work life – protecting their identity, their bank accounts, their children’s physical safety.

Action

Now you need to call them to Action. This involves communication at the moment they are doing something. For instance, BAE’s email security service has a very handy feature: if a user is tempted to click on a link in an email (generally accepted as unsafe behaviour unless you are certain who the email is from) they can be served a CAPTCHA image which makes them stop and think about what they are doing before they click on the link.

(I haven’t seen these images: it would be nice to think that instead of a standard CAPTCHA image such as a random set of numbers they contain a little message like “Are you sure?” or “Links can hurt”.)

Socialise

And finally you need to Socialise cyber safe behaviour into the organisation. The aim will be to make unsafe behaviour socially unacceptable – just as drink driving, not showering after a lunchtime run, or eating fish soup at your desk are all pretty unacceptable.

One of the most powerful ways of socialising behaviour is telling people that the majority of their fellows act in the way you are hoping to persuade them to act. This doesn’t have to be complicated. For instance Northern Illinois University halved the amount of binge drinking by students simply by promoting the message “Most students drink in moderation.” People follow the crowd.

AURAS

AURAS: it’s a great way of thinking about the different things you need to do to change the way people think about cyber security and to change the way they behave.

An easy way to complex passwords

Now I did say I would give you a tip about remembering complex passwords that change every month. It’s easy. You need two things: a memorable phrase; and a date “protocol” (I’ll explain).

Let’s say your IT people have demanded a password of at least 12 characters that includes at least one of each of the following: upper case letter, lower case letter, number and symbol. They also want you to change it every month.

First of all, the phrase. This isn’t the same as a “pass phrase”, where people use several words as a password: there is some evidence that this isn’t very secure.

You need to think of a phrase such as: I love my job at Acme Widgets, Dorking! Take the first letter of each word and the symbols and you get: 1lmj@AW,D! (the word “at” is useful as it turns nicely into a symbol and the “I” is useful as you can turn it into a number 1).

Now think about a date “protocol”. A really simple one might be to use the first of the month. It’s October 2015 so that makes: 01 10 15. Just for a bit of fun I am going to put the first three numbers at the start and the last three numbers at the end. So my password this month is: 0111lmj@AW,D!015. Easy to remember and I can change it every month.
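
For the curious, the whole scheme can be sketched in a few lines of Python (the function name is mine; the initials string is the one derived from the phrase above):

```python
import datetime

def monthly_password(initials, when=None):
    """Build this month's password from fixed phrase initials plus a
    date "protocol": the first of the month as DDMMYY, with the first
    three digits at the front and the last three at the end.
    """
    when = when or datetime.date.today()
    stamp = when.strftime("%d%m%y")   # first of the month, e.g. "011015"
    return stamp[:3] + initials + stamp[3:]
```

So `monthly_password("1lmj@AW,D!", datetime.date(2015, 10, 1))` reproduces the October 2015 password above, and next month the same call with a new date gives you a fresh one.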

Keep cyber safe!

Why Human Resources need to engage with cyber security

You may think that cyber security is something for your IT department to manage. If you work in Human Resources, you need to think again. Because cyber security is very much your responsibility.

No, I am not saying you need to go around seeing if your organisation has installed the latest firewall or if all your Internet of Things ports have been secured.

What you do need to do though, is to check whether your colleagues across the organisation are cyber safe.

That’s because only around one third of data breaches are caused by malicious outsiders. The rest are caused by insiders, your colleagues: acting foolishly, carelessly, and yes sometimes maliciously.

What can go wrong? A lot of things. Personal information about customers is leaked because a laptop gets left in a taxi. An email leads to an unintentional contract variation. A social media post leads to a libel suit. An unwary worker shares their log-in details, leading to data theft.

So what should you be doing?

Start with strategy

A good place to start is strategy. Most organisations have some understanding of cyber risk. But often they focus on protecting corporate networks from external risks such as hackers. What is your organisation’s cyber security strategy? Does it include sufficient analysis of internal “human” risks? If it doesn’t then you need to work with the Information Security team to identify and manage these human risks.

Develop practical policies

Developing appropriate policies to help manage cyber security and spell out the “rules” is important. You are likely to need policies in several areas: web and computer use, data privacy, social media use at work, a “Bring your own device” policy to manage personal phones and tablets, and even policies about the software and cloud services that people are allowed to use.

Writing these policies should not be a “tick box” exercise. They need to make sense: they should be easy to understand by everyone in the organisation; and they need to benefit your organisation. They shouldn’t simply be designed to make the IT department’s life easier. Sure, pouring digital super-glue into all the USB ports would stop people uploading corporate data to insecure USB sticks, but it might not improve business efficiency. HR executives, with a feel for wider business needs, as well as an understanding of what will motivate or demotivate employees, are an essential part of any cyber policy development process.

Training: tell people how they should behave

The next step is training. Training is essential because without it most people won’t know how to act in a cyber safe manner.

You might as well accept that almost no one is going to read your policies. So you will have to tell everybody about them, face to face. And it won’t be enough to read out a list of rules and corresponding sanctions for disobedience. Apart from putting everyone’s backs up, people will generally ignore rules if they don’t know why they are in place. You will need to explain what the rules mean, why keeping to them is important, and quite possibly when they can be ignored (and when they can’t).

You will need to train the way people think too. This isn’t just about describing dangers: it’s about how people interact safely with colleagues, with suppliers and customers, and with people outside the organisation. It’s not about following rigid processes: it’s about understanding how to avoid risk in the first place. For instance you can’t tell people the precise information to avoid sharing on social media. But you can help them understand what types of information they shouldn’t share and how competitors can draw conclusions from seemingly innocent pieces of data.

Build continued awareness

Don’t think a one-off (or even annual) training session will cut it though. You need to keep awareness of cyber safe behaviour at the front of people’s minds. This means developing assets designed to deliver continued awareness of cyber risks – posters (that change design and location regularly), screen savers, sign in messages, even mouse mats and mugs.

Develop a cyber secure culture

An even more important issue for HR to address is culture. An organisation that doesn’t take cyber security seriously is unlikely to be changed by training and awareness. HR may need to address underlying cultural assumptions.

Start by auditing the security culture. Do this from the perspective of employees: what cyber risks do they know of; what do they think of existing security processes; to what extent do they feel security is their responsibility? And do it from the perspective of the organisation: how are employees expected to behave; what sort of resources are provided for security; is dangerous behaviour stopped, tolerated – or not even noticed?

Once you know what needs to change, you can start thinking about how to do that. Build persuasion tools, such as leader boards of cyber-safe behaviour; incentivise safe behaviour with praise or other rewards – and make sure it is not disincentivised accidentally; ensure that leaders walk the cyber security walk; develop an intolerance to unsafe behaviour. (“Why are you putting my job at risk by doing that?”)

But don’t develop a blame culture. That way you will just drive unsafe behaviour underground.

Encourage people to be less trusting

Sadly, one element of culture you will need to work on is trust. People are often very trusting and this can be a problem for cyber security. They need to be taught to question: emails don’t always come from the people they appear to, friendly people on the phone aren’t always who they say they are, confident people striding round the office without a visitor’s badge don’t necessarily have the right to be there. Defending against people who take advantage of trust doesn’t need complex software: it needs awareness, sometimes combined with robust processes.

Make sure cyber security is usable

HR teams also need to work on the usability of any security processes.

By their nature most IT people are very logical. In addition they understand the purpose of systems they are developing. And of course they are focussed on their responsibility to protect IT systems.

In HR you are also focussed on cyber security. But you may have a wider view of the organisation. Almost certainly you understand what motivates people. You understand how people perform their tasks. And you probably provide a receptive ear to frustrated colleagues. In fact you are probably going to be one of the first people to hear about cyber security initiatives that are counterproductive – because they cause blocks in efficiency. And you may even hear how people would like to alter them.

All this means that you are in pole position to identify usability problems, to construct the analysis that proves (to sceptical colleagues in IT) problems exist and to make the case for change.

Monitor “off network” activities

Not everything that should concern your organisation will be happening within your corporate network. Your colleagues, almost inevitably, will be using social media. And many will be commenting on colleagues, clients, your organisation and your industry. In addition they may be using cloud computing services such as Dropbox and Google Docs to store, edit and share corporate information. This type of activity needs to be managed, to preserve information security and to protect reputation.

Recruit sensibly

When recruiting, watch out for people who may not be cyber secure. Anyone who comes from a competitor boasting they can bring a list of clients on a disk may well be less than trustworthy. You might also need to think twice about people whose social media posts are irresponsible – perhaps complaining about their current employers or giving information away about new initiatives.

Keep an eye on risky people

Some people will be higher risks than others. Sometimes this will be a result of personality. For instance sales people are likely to be more open, and possibly more trusting, than finance people. But that’s not where the real risk lies. The people you will need to monitor most closely are those who feel disengaged from your organisation. These may include temporary staff, new recruits during a probation period, people on low pay or in boring jobs, people who have handed their notice in, and people who are having difficulties at work, perhaps experiencing disciplinary procedures.

Yes, cyber security really is an issue for HR

Human Resources managers may not be particularly focussed on technology. But they have a responsibility to learn about cyber security because the role that HR can play in preserving security is an enormous one. In other words, if your HR and IT departments are not working closely together on cyber security you are opening your organisation up to some major and unnecessary risks.

Eight steps to change cyber security culture

Hackers are always a problem. And naturally, your IT Department has network security buttoned down. But they are probably more worried about something else: you and your colleagues.

The big challenge in cyber security is people: how to change an organisation’s culture from one that relies on IT for security into one where everyone takes responsibility. Everyone, from the CEO to the newest intern.

John Kotter famously proposed an eight step process for changing organisational culture, starting with “Establish a sense of urgency” and finishing with “Institutionalise the change”. Well, most people realise that the cyber security problem is pretty urgent. So I thought I’d outline a separate set of eight steps that organisations can follow to strengthen their cyber security culture.

Step 1. Build your guiding coalition

Start by building a multifunctional team to guide change. Cyber security shouldn’t be the responsibility of IT alone, so you will need people from across the organisation to be involved: sales, marketing, operations, finance… This is essential so you get buy-in across the organisation.

More importantly though, if your approach to security doesn’t take account of the way people work, it will fail.

Step 2. Form your vision and scope out your intentions

Next you need to form your vision for cyber security. That should be simple: to protect your assets, reputation, efficiency and information from computer based threats, and to ensure that your digital information is private, is accessible by people who have authority, and has integrity (think “the truth, the whole truth and nothing but the truth”).

In addition you will need to identify the scope of your vision: who it applies to, and what assets, processes and information are relevant. You will also need – and this is a big task – to identify the risks that your vision faces and how best to manage them.

Step 3. Define the details of what you want to achieve

Out of your vision will come the detailed policies you need around cyber security (including policies on IT and web use, Bring your own device, Privacy, and Social media). These need to be expressed in clear language: avoid techie jargon at all costs. Having a truly multifunctional team should mean that the policies should be relevant and effective for your whole organisation.

Step 4. Build new processes

Based on your policies you will be able to identify the tools you need to implement and the processes you need to develop that will help to protect you from cyber risks. It is vital to include a cross section of employees in the design of these systems. Without them you are likely to end up with unusable, frustrating and inflexible processes. If that happens your workforce will soon be looking for ways to work around them. So remove any barriers to people being cyber safe.

Step 5. Educate

Bring your policies to your workforce and educate them about any new tools and processes. Tell them why cyber security is important – for your organisation but also for them personally. And make sure they understand what they should do if they have problems or if things go wrong (as they surely will).

Don’t rely on one off training sessions: make sure that security is constantly “front of mind” with reminders using different techniques, messages and media hitting them as often as possible.

Step 6. Persuade

You can “educate” all you want, but if you fail to persuade them about the importance and effectiveness of what you are proposing then you won’t change anyone’s behaviour.

There are lots of methods that you can steal from marketing and from behavioural economics here. For instance, make sure authority and other credible figures are seen to follow the rules (if the Chief Exec is lax with security you can be certain everyone will happily follow their example). Prove to people that your new ways of working actually deliver benefits. Help people realise that they face constant and sometimes personal risks but (and this is very important) that there is plenty they can do to keep safe.

Keep an eye on how people are incentivised as well. Not about cyber security but about their every day tasks. Don’t put incentives in place that could persuade people to behave in an insecure manner.

Step 7. Socialise cyber security

Kotter talks about “enlisting a volunteer army” and that’s exactly what you have to do. You need everyone in your organisation buying in to the idea of cyber security. Part of this will be ensuring that “the organisation” behaves properly: if it is seen to be cavalier with the security of customer data for instance your internal processes will lose credibility. Ultimately you want your workforce disapproving of people who behave unsafely.

Disapproval doesn’t mean developing a blame culture. That would be very damaging – given the ever changing nature of cyber threats you need people to be able to feel safe if they make a mistake or if they respond wrongly to a new threat. But you do need people to accept cyber safety as the norm and as something that has value in protecting their career and indeed themselves personally, as well as protecting their colleagues and the organisation as a whole.

You might want to take some ideas from Sales as well – leader-boards for people who are particularly effective, prizes for good behaviour, simple recognition for jobs well done…

Step 8. Monitor and enforce

Measurement is very important. Your organisation needs to know how well it is maintaining a positive security culture. Identify some relevant KPIs so you will know if you need to take remedial action.

Enforcement is also important. If people who act unsafely are seen to get away with it then others will quickly follow them. Regular negligence and malicious behaviour may need disciplinary sanctions. More often than not though, you will simply need to offer a little “re-education”. And treat this as a learning opportunity for the organisation as well as the individual concerned. After all if someone is regularly breaking the rules it could well be the fault of the rules!

Managing the “people” part of cyber risk

Why are people such a security risk at work? This is particularly the case when it comes to cyber security. For instance, why do people so often seem to forget common sense and share passwords or leave secret documents exposed for others to see? There are a good few reasons.

  • It may be because some people find the rules too difficult to follow: for instance complex password protocols are often avoided because “who could ever remember a random 12 character password that is different for every site and that changes every month?”
  • Or it may simply be because they don’t understand how to follow the rules. They haven’t been taught the techniques of following them properly and so rather than getting it wrong they do nothing.

These reasons can be addressed through education. But there are other reasons for security lapses, and several relate to the credibility of cyber security initiatives.

  • Some people may ignore the rules because they feel that the rules are simply unnecessary and don’t actually do anything useful.
  • Or they may feel (and perhaps they have evidence) that the rules don’t work: better rules need to be put in place, but in the meantime they will ignore the useless current ones.
  • They might even think that they are doing the right thing in ignoring the rules – that there are certain circumstances that justify their actions.

These reasons may be a little harder to address, but again, with some education supported by evidence, people will come to understand why the rules are necessary and gain confidence that they work.

There are however a number of reasons that are harder to manage.

  • Sometimes people feel that it’s inconvenient to follow security rules. Perhaps they feel that their time is too valuable – or that they are too important – to follow the rules. Perhaps they are simply too lazy.
  • Some people feel that rules are an attack on their freedom. It’s up to them what they do – and no one is going to tell them otherwise.
  • Others feel that breaking the rules is in some way funny or exciting, and that breaking them will be emotionally rewarding: they will be liked or admired for it. Or perhaps the emotional reward will be more negative – revenge on a colleague, boss or organisation that they feel has treated them (or someone else) badly.
  • And others will be out for personal gain: perhaps they steal information to further their careers elsewhere, or even in response to a bribe.
  • Still others will submit to peer pressure: they follow the crowd – everyone else is doing it so what can the harm be?

Education can’t be the whole answer here. While it is important, there may also need to be a degree of coercion, an acceptance that ignoring the rules may result in unpleasant consequences. After all, not everyone will do the right thing simply because it is the right thing. And remember – coercion doesn’t always have to involve disciplinary procedures. With the right approach the coercion can come from colleagues and rule breaking can become socially unacceptable, as has largely happened with drink driving.

But there are some other reasons that are particularly difficult to manage.

  • First of all there is trust. Most people are happy to believe another person, especially one who looks and sounds confident and trustworthy, and to help them if they ask for something – for instance printing out a document from a USB stick for a stranger, or fetching someone a coffee while they wait alone in a room with a network connection.
  • And secondly there is fear – fear of being wrong, fear of embarrassment. It is this fear of speaking up when people see something that they think isn’t quite right that opens many organisations to risk.

These characteristics are very hard to manage. And it is because of these that organisations need to go further than education and coercion when managing cyber security.

Gartner has created an interesting model that may help with this. It proposes seven core principles that help to establish “people-centred security”. These are:

  1. Accountability: the owners of data are accountable for its security and so make decisions about who can have access to it
  2. Autonomy: people who have access to data make their own decisions about how they use it, based on the requirements of their role balanced with the organisation’s security requirements
  3. Community: people do not make decisions in isolation and so organisations should promote a positive culture of collaboration that supports good decision-making
  4. Responsibility: people who have access to information are expected to act responsibly and, perhaps more importantly, are held responsible for the consequences of their actions
  5. Proportionality: controls must be appropriate and proportionate to the risks
  6. Transparency: people’s behaviour is monitored, but any punitive actions are open to scrutiny to prevent vigilantism, bullying or unfair behaviour
  7. Immediacy: if someone fails to act responsibly, the reaction will be immediate, although as a rule there will be a greater focus on supporting compliance than on punishment

These principles are sound (although perhaps a little “right on” for such an important subject). The focus on education as the fundamental enabler, and community policing of decisions and punishments as the chief source of influence, should be effective in many situations, especially with people who are too trusting or too fearful.

However, a reliance on autonomy may be dangerous, especially where people do not feel confident of their own abilities, or where people are simply incompetent.

And there is a danger that many actions are not open to the scrutiny of the wider community, so the community may in reality be powerless to act as a policeman. In addition there are always some mavericks who do not care whether they have community approval, or who even actively seek to generate community disapproval.

Organisational security should of course have a strong focus on people and their motivations and behaviour. But there also needs to be an equal focus on business processes so that employees have structures to support their decisions and actions. And underpinning all this must be the technical solutions that are fundamental to any effective cyber security.