The dangers of hidden data

How many times have you leaked strategic data by accident? And do you even know when you have?

There are a multitude of opportunities to share strategic information with third parties such as clients and suppliers by accident. Information that could seriously damage your negotiating position. And if you are not aware of these dangers, it is very easy to do this.

Take Microsoft Office documents. If you ever share Excel spreadsheets with clients, do you make sure that any “hidden” columns don’t contain information you would rather keep private? Creating pivot tables to communicate your data analysis? Are you sure that the original detailed data isn’t available somewhere? And what about PowerPoint? Are those “Notes” pages suitable for sharing, or do they contain thoughts that you would rather not put in writing? And those text boxes that you pulled off the side of slides when you were writing them – you know they are still there, of course!

Have you collaborated with others to produce a document? Most likely you will have written notes and tracked changes. If you are not careful much of the history of your document could be available to the final recipients: and that could be embarrassing!

Don’t forget document metadata either. Are there any interesting titbits in the “Properties” of your documents – the original author perhaps, or the date the document was first drafted? Who knows what value that might be to someone else?
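
If you are curious about what your own files give away, it is easy to look. Office documents are really just zip archives, and the core properties live inside them in a file called docProps/core.xml. Here is a minimal sketch in Python, using only the standard library (the filename report.docx is hypothetical):

```python
# Peek at the "Properties" metadata hidden inside a .docx file.
# A .docx is a zip archive; docProps/core.xml holds the author,
# creation date and similar fields.
import zipfile
import xml.etree.ElementTree as ET

def docx_metadata(path):
    """Return the core properties of a Word document as a dict."""
    with zipfile.ZipFile(path) as zf:
        root = ET.fromstring(zf.read("docProps/core.xml"))
    # Strip the XML namespaces to leave plain field names like "creator"
    return {el.tag.split("}")[-1]: el.text for el in root}

# for field, value in docx_metadata("report.docx").items():
#     print(field, "->", value)
```

The same trick works for .xlsx and .pptx files, which share the same package format.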

Perhaps you think you have blocked some text out. Ineffective “redaction” is the cause of a lot of data leakage. For instance, blocking out text using a “highlight” the same colour as the text won’t delete it – and it could be very easy to get rid of the highlight.

It’s not just documents though. There are lots of places where information can be hidden. Are your social media posts geo-tagged for instance? If you are regularly visiting a particular location, that could be of interest to competitors – or your colleagues.

Software can be another culprit. Is there any hidden text in your website, perhaps in an “invisible” font or in a comment tag? And that software you have commissioned – are you sure the developers haven’t left any notes that could give away secrets?
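
Checking for leftover comments is straightforward. Here is a minimal sketch using Python’s standard library that lists every comment tag in a page’s HTML (the example markup is invented):

```python
# List the <!-- comment --> tags buried in a page's HTML, a quick
# way to check whether developers have left notes behind.
from html.parser import HTMLParser

class CommentFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.comments = []

    def handle_comment(self, data):
        # Called by the parser for each <!-- ... --> it encounters
        self.comments.append(data.strip())

def find_comments(html):
    finder = CommentFinder()
    finder.feed(html)
    return finder.comments

# find_comments('<p>Prices</p><!-- TODO: hide trade discount before launch -->')
```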

Is there strategic data hidden in plain sight? You might be surprised where interesting data lurks. Security blogger Brian Krebs tells how he analysed an airline boarding card and found a wealth of information in the bar code – including information that could have helped him disrupt future travel plans.

And finally – do be careful how you delete sensitive files. It isn’t sufficient to “delete” them as they will probably still exist in some form on your hard drive, easy for anyone reasonably skilled to find. You need to actively scrub them out. There is plenty of free software available online to do this. (Make sure you do this carefully when you recycle a personal computer or smartphone.)
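
As a very rough illustration of the idea, the sketch below overwrites a file with random bytes before deleting it. Treat it as a sketch only: on SSDs and journaling file systems, overwriting in place is not guaranteed to destroy every copy, which is why dedicated scrubbing tools (and full-disk encryption) exist.

```python
# Crude "scrub" of a sensitive file: overwrite its contents with
# random bytes several times, then delete it.
import os

def scrub(path, passes=3):
    """Overwrite a file with random bytes, then delete it."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))
            f.flush()
            os.fsync(f.fileno())  # push the overwrite to disk
    os.remove(path)
```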

The data you don’t value is often surprisingly valuable to other people, especially competitors and suppliers. Don’t share it accidentally because you simply can’t see it.

Cyber security and the importance of usability

There is nothing new or unusual about the need to design usable systems. A whole industry has grown up around the business of making sure that commercial websites and apps are easy to use and deliver the behaviour, such as spending money, that the owners of those websites and apps want to see.

Usable systems generally require three things: the system has to be useful, or at least perceived as useful, by the end user; it has to be easy for the end user to use; and it has to be persuasive, so that the user takes the actions that the owner desires.

Is cyber security any different?

These three requirements of utility, usability and persuasiveness are seen in cyber security systems. However there are some differences compared with the consumer-facing world. Making sure a cyber security system succeeds is in some ways more important than making a commercial system succeed.

One issue is that the cyber security system has to work for everyone: potentially if just one person fails to use the system properly then the organisation will be put at risk.

In addition cyber security systems are like stable doors – they need to be shut when you want them to be as there is no use locking them after a breach has happened. If an online shop doesn’t work for some reason then the user can go back and try again, but with a cyber security system, if it doesn’t work first time then the damage may be done.

These are stringent requirements. Unfortunately the nature of cyber security means that these requirements are hard to meet:

  • Users have little motivation to comply with security requirements as keeping secure is not their main purpose; indeed security systems are part of a technical infrastructure that may have no real meaning or relevance to the end users
  • Security systems can “get in the way” of tasks and so can be thought of as a nuisance rather than a benefit
  • Security systems are often based on arbitrary and little understood rules set by other people, such as those found in security policies, rather than on the desires of the end user
  • Users may find complying with the requirements of security systems socially difficult as they may force the user to display distrust towards colleagues

These are all challenging issues, and any security system you design needs to ask the very minimum of effort from the user if it is to overcome them.

Unfortunately many cyber security systems demand a degree of technical knowledge. For instance they may use jargon: “Do you want to encrypt this document?” will have an obvious meaning to anyone working in IT but may mean nothing to some users.

Furthermore some security requirements may of necessity require a degree of “cognitive overload”: the requirement to remember a strong password (perhaps 12 random characters) is an example. Again this will cause additional difficulty.

Users are not naturally motivated towards cyber security systems. And they may find them hard to use. So how can success – universal and efficient use of systems – be achieved?

Delivering success

Start with the end user. Use a combination of interviews (including the standard “think aloud” protocol used by many UX practitioners), observation and expert evaluation to identify where the obstacles to successful use of the system lie. Obviously the usual rules of good usability will apply: consistency, reduced cognitive load, feedback, and help when mistakes are made.

Learnability is also important. Accept that some form of help may be needed by the user and ensure that this is available, ideally within the system. Help files shouldn’t just tell people how to achieve something but also why it is important.

But for cyber security systems there is also a lot of work to be done around persuasion. This will involve educating the end user about the importance of the system – how it protects their organisation, and how it protects them as individuals.

It will also involve ensuring that the system is credible – that end users realise that the system does what it is supposed to do and isn’t just a tick-box exercise or something dreamed up by the geeks in IT to make everyone’s lives that little bit harder.

And it will involve demonstrating to the end user that all their colleagues are using the system – and if they don’t use it then they will be out of line with the majority.

“Usability is not enough” is a common theme in retail website design. It is even more important in the design of cyber security systems.

A New Year’s resolution for CEOs

“I am going to take cyber security seriously in 2016.”

On the whole senior executives claim that they want to act in an ethical manner. And yet if they fail to embrace cyber security they are clearly lying.

Why do I say that? Because playing fast and loose with customer data wrecks lives. It is as simple as that. Lose your customers’ data and you expose them to a major risk of identity theft – and that can and does cause people massive personal problems.

The problems that David Crouse experienced in 2010 are typical. When his identity was stolen he saw $900,000 in goods and gambling being drained from his credit card account in less than 6 months. His credit score was ruined and he spent around $100,000 trying to solve the problems.

Higher interest rates and penalty fees for missed payments just made his financial situation worse. His debts resulted in his security clearance for government work being rescinded. Having lost his job, other employers wouldn’t touch him because of his debts and credit score. He felt suicidal. “It ruined me, financially and emotionally” he said.

Data breaches frequently result in identity theft. And this can have a devastating emotional impact on the victims, as it did with David Crouse. Research from the Identity Theft Resource Center indicates that 6% of victims actually feel suicidal while 31% experience overwhelming sadness.

The directors of any company whose negligence results in customers feeling suicidal cannot consider themselves to be ethical.

Unfortunately most data breaches that don’t involve the theft of credit card details are dismissed by corporations as being unimportant. And yet a credit card can be cancelled and replaced within hours. A stolen identity can take months, or longer, to repair.

And all sorts of data can be used to steal an identity. An email address and password; a home and office address; the names of family members; a holiday destination; a regular payment to a health club… Stolen medical records, which are highly effective if you want to steal an identity, will sell for around £20 per person online, while credit card details can be bought for as little as £1. Go figure, as they say in the USA.

Organisations must accept that any loss of customer data puts those customers in harm’s way. And if they want to be seen as ethical they must take reasonable steps to prevent data breaches. Until they do, well the EU’s new data protection rules can’t come on-stream quickly enough for me!

Persuasion and cyber security

You can’t rely on technology to solve your cyber security issues.

Cyber security is largely a “people” issue: cyber breaches are generally caused by people behaving in an unsafe manner, whether they know they are doing so or not. The solution is to persuade them to behave safely.

But how can you persuade people to do this?

Effective cyber communication

The first step is developing an appropriate communication programme. Of course you already know that this shouldn’t be a “death by PowerPoint” style lecture.

You are going to make your communication engaging and interactive with lots of colour and interesting imagery. You are going to start training sessions off with uplifting material that gets people into a good mood – games, stories, or other activities designed to generate a feeling of well being.

But what about the content of your communications? How should you structure the messages that you need to get across? Here are a few dos and don’ts:

  • Do describe security problems in a clear cut and simple way so that people can understand everything you are saying. Don’t use jargon and make it all sound frightfully difficult because you want to look clever
  • Do give people hope – while 100% security is impossible, you should emphasise that there is a lot that can be done to minimise threats and the consequences of a cyber incident. Don’t use “fear, uncertainty, doubt” to persuade people of the importance of the risk: they will just bury their heads in the sand.
  • Do make the risk relevant to the individuals you are talking to – describe personal risks, to their reputation or their jobs. Don’t describe it as a risk to the faceless organisation they work for.
  • Do stress that the risks are immediate ones that are all around you as you speak. Use examples of things that have happened, ideally to your organisation or a competitor. Don’t describe potential incidents that might happen sometime in the future.

Marketing techniques

There are also a number of marketing techniques you may be able to bring into your communications:

  • Use the power of FREE when describing techniques that people can take to avoid risk; this could be FREE training to avoid phishing, or some FREE software people can download to use at work and at home
  • Use the power of loss. When faced with a potential loss, people are risk averse. So emphasise what people might lose if they behave unsafely, not what they might gain if they behave safely. The loss needs to be personal, for instance it could relate to losing money when shopping online
  • Use the power of authority to persuade people. If you can ensure that your organisation’s leaders will act – and you can show them acting – in a cyber safe way then you have a good chance that people will follow their lead.
  • Use the power of peer pressure. People will often follow the lead of the people around them as they don’t want to seem out of step with the majority’s way of behaving. So if you can persuade some people to endorse safe behaviour during a training session, others will inevitably follow them. Having a few “stooges” as part of your audience may help!
  • Use the power of discovery. Guide people towards uncovering solutions to cyber risks, rather than telling them what to do. If they are responsible for defining solutions they will value those solutions. If you simply give them someone else’s solution it may well be discounted as “Not invented here”

You are trying to change people’s behaviour and it is important that you succeed. Think about what will persuade people. And don’t be afraid to use a few cheesy marketing techniques along the way.

Tackling invoice fraud

Invoice fraud is on the rise. It may involve a spoof email, apparently from the CEO of your organisation, “authorising” the payment of a fake invoice. In other cases the email seems to come from a trusted supplier.

Two of my friends who run SMEs have recently been exposed to invoice fraud, in one case for around £70,000. In both cases the fraud was picked up before payment was made.

But in a lot of cases it isn’t. For instance last year a small Norfolk manufacturer was scammed out of £350,000 by a fraudulent email. And because of the nature of online banking, once the money has left your account it is very hard to retrieve.

There are ways of reducing the risk of fraudulent emails. For large organisations an anomalytics service might be the answer. These services build up a picture of normal email traffic in order to identify unusual emails that can be subjected to further examination.

Another tool is the DMARC email standard. This helps stop people sending emails that appear to come from your domain. It doesn’t of course stop people from breaking into your email account and actually sending emails “from” you. Nor does it prevent every phishing attack. But it is a useful tool nonetheless as it makes life harder for fraudsters.
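
By way of illustration, a DMARC policy is published as a DNS TXT record under _dmarc. on your domain. In this sketch both example.com and the reporting address are placeholders:

```
_dmarc.example.com.  IN  TXT  "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@example.com"
```

Here p=quarantine asks receiving mail servers to treat messages that fail authentication as suspicious (p=reject goes further and refuses them outright), while rua nominates an address to receive aggregate reports.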

But the real way to address invoice fraud is through implementing stronger business processes. These will include:

  • Ensuring only properly trained and authorised people are able to make online payments
  • Creating a “whitelist” of approved suppliers and their agreed payment instructions (bank details etc)
  • Double checking any changes in payment instructions from suppliers on the whitelist with people you know are authorised to approve those changes
  • Checking any payment requests made by managers in your company with that person on the phone or face to face (and not as an email reply)
  • Ensuring that appropriate documentation is present and checked before any payment is authorised: this might include an invoice, a purchase order, and a “goods received” slip or equivalent
  • Creating a process where additional authorisation is needed to sign off payments
    • Over a certain amount
    • To new suppliers who are not yet on an approved “whitelist”
    • To existing whitelist suppliers who have provided new bank details
    • Where a payment is requested to a country outside normal trading patterns.

Use your common sense and raise a query when something seems odd. (This of course requires a culture that is sympathetic to juniors raising queries. If you don’t have this sort of culture, or if senior staff bully their juniors, this type of fraud becomes much more likely.)
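
To make the idea concrete, the additional-authorisation rules above might be sketched like this. The supplier records, field names and £10,000 threshold are all invented for illustration; a real finance system would be considerably richer:

```python
# A sketch of the extra-authorisation checks described above.
from dataclasses import dataclass

APPROVAL_THRESHOLD = 10_000  # hypothetical limit in pounds

@dataclass
class Supplier:
    name: str
    bank_details: str

@dataclass
class Payment:
    supplier_name: str
    amount: int
    bank_details: str
    country: str

def needs_extra_authorisation(payment, whitelist, usual_countries):
    """Return True if a payment should be escalated for sign-off."""
    if payment.amount > APPROVAL_THRESHOLD:
        return True                              # over a certain amount
    approved = {s.name: s for s in whitelist}
    supplier = approved.get(payment.supplier_name)
    if supplier is None:
        return True                              # not yet on the whitelist
    if payment.bank_details != supplier.bank_details:
        return True                              # new bank details supplied
    if payment.country not in usual_countries:
        return True                              # outside normal trading patterns
    return False
```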

Process won’t be enough on its own though. Training the finance team is also important so they are aware of the nature of invoice fraud. This training should include advice about how to take extra care with urgent or aggressive requests for payment.

Keep cyber safe!

Cyber security and peer pressure

Cyber security training – most IT security people would say that’s the answer to solving the insider threat problem.

But is it? Giving people information is certainly part of the solution. But training rarely changes behaviour.

Peer pressure does.

And that is why socialising safe cyber behaviour is an essential strategy if you want to ensure cyber security.

What does socialising cyber safety mean?

Socialising safe cyber behaviour involves changing an organisation’s culture so that unsafe cyber behaviour becomes as socially unacceptable as bullying and sexual harassment. It uses peer pressure or “social influence” to steer people into an acceptable way of behaving.

One of the reasons that peer pressure works is because most people like being in groups. Being in a group is safer than being isolated and, because humans evolved in a very dangerous world full of nasty predators (including other humans), we are hard wired to want this.

If we want to belong to a particular group it helps a lot to behave like the other people in the group – wearing the same types of clothing, supporting the same football team, looking, sounding, thinking and behaving like them.

In other words, people in a group often think in similar ways. Promote the right way of thinking and you can use people’s need to belong to a group to enhance cyber security and encourage cyber safe behaviour.

Following the herd

The most powerful way of doing this is to make safe behaviour seem to be what everyone else does – the “social norm”. There is lots of evidence that telling people about social norms changes people’s behaviour.

A good example of a company that successfully used social norms to change people’s behaviour is Opower, a US energy company. Opower wanted to drive energy use down. It tried several messages to do this, such as:

  • You can save $54 this month
  • You can save the planet
  • You can be a good citizen

None of these were particularly successful. So they tried a fourth message along the lines of:

  • Your neighbours are doing better than you

The results were amazing: more than 75% of the people who found this out started to save energy. In the same way, a message along the lines of “83% of your colleagues never click on links in emails from unknown sources” is likely to influence people.


Making safe behaviour “visible” is another effective technique. People have a tendency to follow other people, as we have seen. So if one person is seen as behaving in a particular way, then others are likely to follow.

You are in the canteen when the person behind you at the counter sees what you have on your tray and says “That looks nice. I’ll have one of those.” Why do they do it? It might be because they didn’t notice it before. But it is just as likely to be because they want to make a good impression on you by imitating you.

You will see this with body language too. When you are sitting opposite someone try changing your posture: cross your legs, fold your arms, rub your chin. The chances are the person opposite will “mirror” at least some of your changes in posture.

The same with cyber security. If one person acts in a particular way, then people near them may well imitate them. For instance, if you can get someone to claim they always use a complicated password to log on or say “No way I’d log on to our corporate network using Costalotta’s free wi-fi”, then others may well follow their lead.


People follow authority figures. So you want your leaders to act in a cyber safe manner.

There are different sorts of leaders in any group. I’d always suggest that you try to get your organisation’s formal leaders to act in a visibly cyber safe way (or at least avoid obviously unsafe behaviour).

But the CEO might not have much credibility when it comes to technology. One of the new interns may be far more credible, and influential. Or perhaps there are some popular social leaders in your organisation: these too will have lots of leadership power. Empowering your leaders to act as cyber safety role models will pay dividends.

Incentivising the group

Using group rewards that disappear if anyone steps out of line is an interesting idea. With this technique there is a reward when everyone behaves well but no one gains if only one person behaves wrongly. And as most people dislike being unpopular they do their utmost to ensure that others don’t lose out.

Creative agency 23Red used this technique to get people to complete their time sheets.

Of course this technique only works if everyone belongs to the same social group. If there is a clique of people, perhaps in sales, who don’t interact much with the rest of the organisation, then they may well not feel obliged to behave well for the sake of their colleagues.

Social shaming

This is a little controversial, although I have seen it used successfully as a way of keeping the size of email directories down. With this technique bad behaviour is reported publicly – the digital equivalent of being put in the stocks. People may not throw cabbages at you, but it is still embarrassing to be called out in front of your peers for antisocial behaviour.

Peer groups

It may be practical to start the socialising process off using one or more small groups rather than trying to influence the whole of an organisation. Socialising behaviour in a small group has obvious limitations, but get some people to engage with cyber safety and their behaviour will soon be copied by others.

Cyber security expert Richard Knowlton suggested to me that telling stories in groups is a great way of generating understanding and acceptance of the threats that cyber brings. “My email was hacked…”, “I definitely got a phishing message on LinkedIn the other day…”, “One of my friends was emailed a fraudulent invoice the other day…” Share stories and you bring the problem to life and make it seem relevant for the people in your group.

You can even think about generating solutions as a group. Thinking tends to converge when people are in small groups, especially when people are faced with a hard problem. If you set your group a cyber threat problem they will probably come up with a common view of how to solve it.

Of course you want that view to be effective and practical, so you may want one or two “stooges” in your group who have been briefed about good solutions and who can lead the conversation in the right direction. But once you have arrived at the solution, the whole group are likely to agree with it as they have been involved in uncovering it.

Rewarding good behaviour

Providing public rewards and status can also generate social pressure. If people who have behaved safely – perhaps challenging a stranger who isn’t wearing a visitor’s badge, or politely suggesting to their boss that they shouldn’t use the business centre’s PC to log into the office network – are rewarded, and if the rewards are made public, that will encourage others.

Sales teams often work this way, with the most successful salesperson being publicly rewarded, and applauded by their (no doubt slightly envious) peers. The key to this is to make the reward public: “Cyber Safe Employee of the Month” notices, mugs and mouse mats, special privileges such as being allowed to go home early…

All in all…

Socialising cyber safe behaviour is a very powerful tool. It isn’t the only tool you can use of course, and it’s not a magic wand. You will need the right tools and the right processes in place as well. But used with imagination it can make a big difference to your cyber security as well as helping more general team bonding.

Keep cyber safe!

Does your cyber security have the right aura?

Can cyber security have auras?

How can cyber security have an “aura”? It sounds like a meaningless question. But step back a little and think about how direct marketing works.

Commonly, people in direct marketing use a simple mnemonic to describe the steps they take consumers through when persuading them to buy: AURAL. I think this is relevant for cyber security.

AURAL stands for Awareness, Understanding, Relevance, Action, and Loyalty. In other words:

  1. You start by making people aware of your product
  2. You move on to helping them understand what it does – its benefits and features
  3. Then you persuade them that the product is relevant for their own needs, that it solves a particular problem they have
  4. Next you call them to the action you want them to take, which is generally putting their hand in their pocket and shelling out for whatever you are selling
  5. And finally you hope to generate some loyalty so that they will come back and buy again, and perhaps even recommend your product to their friends.

As I said, this process (which by the way doesn’t have to be linear) is pretty relevant for cyber security too. Except that “loyalty” isn’t really appropriate. But rather than simply getting rid of the “L” I am going to change it to an S: AURAS. The final S stands for Socialise. You will see what I mean in a moment.

So what do I mean by “AURAS”?


As with direct marketing, in cyber security we need Awareness. This is aimed at keeping cyber threats, and the need for cyber security, at the front of everyone’s minds.

You might create awareness with posters (remember to move them around and change their message so that people don’t become blind to them), emails (personalised messages can be highly effective), messages when people start their computers up or start to do certain things (again remember to change them), even things like mugs and mouse mats which can be given to reward cyber safe behaviour.


It isn’t enough to be aware of a threat though. People also need Understanding about what they can do. For instance, if you have a policy of insisting on complex passwords that are changed every month then you need to give people the tools to do this – otherwise they are likely to write their passwords down on sticky notes and put them on their monitors, hardly the cyber safe behaviour you want to encourage. (There is a hint about complex passwords at the end of this post.) This is where training comes in: helping people understand how they need to behave to keep safe.


You also need to ensure that people feel the training they have had has real Relevance to their own lives. Not everyone lives to work. Most people regard work as a way of getting the things they want in life. Of course their job is important – so stressing that unsafe behaviour could damage their employer, and hence their own job, is one tactic.

A stronger tactic though (and one that might even generate a bit of gratitude) is to show them how being cyber safe can help them outside their work life – protecting their identity, their bank accounts, their children’s physical safety.


Now you need to call them to Action. This involves communication at the moment they are doing something. For instance, BAE’s email security service has a very handy feature: if a user is tempted to click on a link in an email (generally accepted as unsafe behaviour unless you are certain who the email is from) they can be served a CAPTCHA image which makes them stop and think about what they are doing before they click on the link.

(I haven’t seen these images: it would be nice to think that instead of a standard CAPTCHA image such as a random set of numbers they contain a little message like “Are you sure?” or “Links can hurt”.)


And finally you need to Socialise cyber safe behaviour into the organisation. The aim will be to make unsafe behaviour socially unacceptable – just as drink driving, not showering after a lunchtime run, or eating fish soup at your desk are all pretty unacceptable.

One of the most powerful ways of socialising behaviour is telling people that the majority of their fellows act in the way you are hoping to persuade them to act. This doesn’t have to be complicated. For instance Northern Illinois University halved the amount of binge drinking by students simply by promoting the message “Most students drink in moderation.” People follow the crowd.


AURAS: it’s a great way of thinking about the different things you need to do to change the way people think about cyber security and to change the way they behave.

An easy way to complex passwords

Now I did say I would give you a tip about remembering complex passwords that change every month. It’s easy. You need two things: a memorable phrase; and a date “protocol” (I’ll explain).

Let’s say your IT people have demanded a password of at least 12 characters that includes at least one of each of the following: upper case letter, lower case letter, number and symbol. They also want you to change it every month.

First of all, the phrase. This isn’t the same as a “pass phrase”, where people use several words as a password: there is some evidence that this isn’t very secure.

You need to think of a phrase such as: I love my job at Acme Widgets, Dorking! Take the first letter of each word and the symbols and you get: 1lmj@AW,D! (the word “at” is useful as it turns nicely into a symbol and the “I” is useful as you can turn it into a number 1).

Now think about a date “protocol”. A really simple one might be to use the first of the month. It’s October 2015, so that makes: 01 10 15. Just for a bit of fun I am going to put the first three numbers at the start and the last three numbers at the end. So my password this month is: 0111lmj@AW,D!015. Easy to remember and I can change it every month.
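
If you like, the whole recipe can be written down in a few lines of Python. This sketch hard-codes my two substitutions (“I” becomes 1, “at” becomes @) and my first-three/last-three date protocol; adapt both to your own phrase and protocol:

```python
from datetime import date

SUBSTITUTIONS = {"i": "1", "at": "@"}  # the swaps used in the phrase above

def initials(phrase):
    """First letter of each word, keeping trailing punctuation."""
    parts = []
    for word in phrase.split():
        core = word.rstrip("!,.?")          # peel off trailing punctuation
        trailing = word[len(core):]
        parts.append(SUBSTITUTIONS.get(core.lower(), core[0]) + trailing)
    return "".join(parts)

def monthly_password(phrase, when):
    """Wrap the phrase initials in the split first-of-month date."""
    digits = when.strftime("%d%m%y")        # 1 Oct 2015 -> "011015"
    return digits[:3] + initials(phrase) + digits[3:]

# monthly_password("I love my job at Acme Widgets, Dorking!", date(2015, 10, 1))
# gives "0111lmj@AW,D!015"
```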

Keep cyber safe!