The dangers of hidden data

How many times have you leaked strategic data by accident? And do you even know when you have?

There are a multitude of opportunities to share strategic information with third parties such as clients and suppliers by accident. Information that could seriously damage your negotiating position. And if you are not aware of these dangers, it is very easy to do this.

Take Microsoft Office documents. If you ever share Excel spreadsheets with clients, do you make sure that any “hidden” columns don’t contain information you would rather keep private? Creating pivot tables to communicate your data analysis? Are you sure that the original detailed data isn’t available somewhere? And what about PowerPoint? Are those “Notes” pages suitable for sharing, or do they contain thoughts that you would rather not put in writing? And those text boxes that you pulled off the side of slides while you were writing them – you know they are still there, of course!

Have you collaborated with others to produce a document? Most likely you will have written notes and tracked changes. If you are not careful, much of the history of your document could be available to the final recipients – and that could be embarrassing!

Don’t forget document metadata either. Are there any interesting titbits in the “Properties” of your documents – the original author perhaps, or the date the document was first drafted? Who knows what value that might have to someone else.
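You can check for yourself what a recipient might see. Modern Office files (.docx, .xlsx, .pptx) are just ZIP archives, with the core properties stored in docProps/core.xml inside the archive. Here is a minimal sketch using only the Python standard library; the function name is my own invention.

```python
import zipfile
import xml.etree.ElementTree as ET

# Namespaces used by the Office core-properties part (docProps/core.xml)
NS = {
    "cp": "http://schemas.openxmlformats.org/package/2006/metadata/core-properties",
    "dc": "http://purl.org/dc/elements/1.1/",
    "dcterms": "http://purl.org/dc/terms/",
}

def office_metadata(path):
    """Return the core properties hidden inside an Office document."""
    with zipfile.ZipFile(path) as z:
        root = ET.fromstring(z.read("docProps/core.xml"))
    props = {}
    for tag in ("dc:creator", "cp:lastModifiedBy",
                "dcterms:created", "dcterms:modified"):
        el = root.find(tag, NS)
        if el is not None and el.text:
            props[tag] = el.text
    return props
```

Running something like `office_metadata("report.docx")` on a file just before you send it will show you exactly which author names and dates are travelling with it.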

Perhaps you think you have blocked some text out. Ineffective “redaction” is the cause of a lot of data leakage. For instance, blocking out text with a “highlight” the same colour as the text won’t delete it – and it can be very easy to get rid of the highlight.

It’s not just documents though. There are lots of places where information can be hidden. Are your social media posts geo-tagged for instance? If you are regularly visiting a particular location, that could be of interest to competitors – or your colleagues.

Software can be another culprit. Is there any hidden text in your website, perhaps in an “invisible” font or in a comment tag? And that software you have commissioned – are you sure the developers haven’t left any notes that could give away secrets?
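If you want to audit your own pages, a quick scan for the most obvious tricks is easy to script. This is a minimal sketch using Python’s standard-library HTML parser; it catches only comment tags and `display:none`/`visibility:hidden` inline styles, not invisible fonts or off-screen positioning.

```python
from html.parser import HTMLParser

class HiddenTextFinder(HTMLParser):
    """Collect HTML comments and elements styled to be invisible."""
    def __init__(self):
        super().__init__()
        self.findings = []

    def handle_comment(self, data):
        # Comment tags never render, but they travel with the page source
        self.findings.append(("comment", data.strip()))

    def handle_starttag(self, tag, attrs):
        style = dict(attrs).get("style", "").replace(" ", "").lower()
        if "display:none" in style or "visibility:hidden" in style:
            self.findings.append(("hidden element", tag))
```

Feed a page’s source to `HiddenTextFinder().feed(...)` and inspect `findings` – anything listed there is invisible to visitors but perfectly visible to anyone who views the source.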

Is there strategic data hidden in plain sight? You might be surprised where interesting data lurks. Security blogger Brian Krebs tells how he analysed an airline boarding card and found a wealth of information in the bar code – including information that could have helped him disrupt future travel plans.

And finally – do be careful how you delete sensitive files. It isn’t sufficient to “delete” them as they will probably still exist in some form on your hard drive, easy for anyone reasonably skilled to find. You need to actively scrub them out. There is plenty of free software available online to do this. (Make sure you do this carefully when you recycle a personal computer or smartphone.)
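As an illustration of what “actively scrubbing” means, here is a minimal Python sketch that overwrites a file with random bytes before deleting it. Be warned: on SSDs and journalling file systems an in-place overwrite is not a guarantee – the old blocks may survive elsewhere – which is why dedicated wiping tools (or full-disk encryption from day one) are the safer option.

```python
import os
import secrets

def scrub_and_delete(path, passes=3):
    """Overwrite a file with random bytes, then delete it.

    Caveat: on SSDs and journalling file systems the old blocks may
    survive elsewhere; treat this as an illustration, not a guarantee.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(secrets.token_bytes(size))
            f.flush()
            os.fsync(f.fileno())  # push the overwrite to the disk
    os.remove(path)
```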

The data you don’t value is often surprisingly valuable to other people, especially competitors and suppliers. Don’t share it accidentally because you simply can’t see it.

Cyber security and the importance of usability

There is nothing new or unusual about the need to design usable systems. A whole industry has grown up around the business of making sure that commercial websites and apps are easy to use and deliver the behaviour, such as spending money, that the owners of those websites and apps want to see.

Usable systems generally require three things: the system has to be useful, or at least perceived as useful, by the end user; the system has to be easy to use; and the system has to be persuasive, encouraging the user to take the actions that the owner desires.

Is cyber security any different?

These three requirements of utility, usability and persuasiveness are also seen in cyber security systems. However, there are some differences compared with the consumer-facing world. Making sure a cyber security system succeeds is in some ways more important than making a commercial system succeed.

One issue is that the cyber security system has to work for everyone: potentially if just one person fails to use the system properly then the organisation will be put at risk.

In addition, cyber security systems are like stable doors – they need to be shut at the right moment, as there is no use locking them after the breach has happened. If an online shop doesn’t work for some reason then the user can go back and try again, but with a cyber security system, if it doesn’t work first time then the damage may already be done.

These are stringent requirements. Unfortunately the nature of cyber security means that these requirements are hard to meet:

  • Users have little motivation to comply with security requirements as keeping secure is not their main purpose; indeed security systems are part of a technical infrastructure that may have no real meaning or relevance to the end users
  • Security systems can “get in the way” of tasks and so can be thought of as a nuisance rather than a benefit
  • Security systems are often based on arbitrary and little understood rules set by other people, such as those found in security policies, rather than on the desires of the end user
  • Users may find complying with the requirements of security systems socially difficult as they may force the user to display distrust towards colleagues

These are all challenging issues, and any security system you design needs to ask the very minimum of effort from the user if it is to overcome them.

Unfortunately many cyber security systems demand a degree of technical knowledge. For instance they may use jargon: “Do you want to encrypt this document?” will have an obvious meaning to anyone working in IT but may mean nothing to some users.

Furthermore some security requirements may of necessity require a degree of “cognitive overload”: the requirement to remember a strong password (perhaps 12 random characters) is an example. Again this will cause additional difficulty.
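For comparison, generating such a password takes a few lines with a cryptographically secure random source – the difficulty lies entirely in remembering the result, which is exactly the usability problem. A Python sketch:

```python
import secrets
import string

# Letters, digits and punctuation: roughly 94 possible characters
ALPHABET = string.ascii_letters + string.digits + string.punctuation

def random_password(length=12):
    """Return `length` characters chosen by a cryptographically secure RNG."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))
```

One common way to sidestep the cognitive overload is a password manager, which remembers strings like these so the user doesn’t have to.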

Users are not naturally motivated towards cyber security systems. And they may find them hard to use. So how can success – universal and efficient use of systems – be achieved?

Delivering success

Start with the end user. Use a combination of interviews (including the standard “think aloud” protocol used by many UX practitioners), observation and expert evaluation to identify where the obstacles to successful use of the system lie. Obviously the usual rules of good usability will apply: consistency, reduced cognitive overload, feedback, and help when mistakes are made.

Learnability is also important. Accept that some form of help may be needed by the user and ensure that this is available, ideally within the system. Help files shouldn’t just tell people how to achieve something but also why it is important.

But for cyber security systems there is also a lot of work to be done around persuasion. This will involve educating the end user about the importance of the system – how it protects their organisation, and how it protects them as individuals.

It will also involve ensuring that the system is credible – that end users realise that the system does what it is supposed to do and isn’t just a tick-box exercise or something dreamed up by the geeks in IT to make everyone’s lives that little bit harder.

And it will involve demonstrating to the end user that all their colleagues are using the system – and if they don’t use it then they will be out of line with the majority.

“Usability is not enough” is a common theme in retail website design. It is even more important in the design of cyber security systems.


A New Year’s resolution for CEOs

“I am going to take cyber security seriously in 2016.”

On the whole senior executives claim that they want to act in an ethical manner. And yet if they fail to embrace cyber security they are clearly lying.

Why do I say that? Because playing fast and loose with customer data wrecks lives. It is as simple as that. Lose your customers’ data and you expose them to a major risk of identity theft – and that can and does cause people massive personal problems.

The problems that David Crouse experienced in 2010 are typical. When his identity was stolen he saw $900,000 in goods and gambling being drained from his credit card account in less than 6 months. His credit score was ruined and he spent around $100,000 trying to solve the problems.

Higher interest rates and penalty fees for missed payments just made his financial situation worse. His debts resulted in his security clearance for government work being rescinded. Having lost his job, other employers wouldn’t touch him because of his debts and credit score. He felt suicidal. “It ruined me, financially and emotionally” he said.

Data breaches frequently result in identity theft. And this can have a devastating emotional impact on the victims, as it did with David Crouse. Research from the Identity Theft Resource Center indicates that 6% of victims actually feel suicidal while 31% experience overwhelming sadness.

The directors of any company whose negligence results in customers feeling suicidal cannot consider themselves to be ethical.

Unfortunately most data breaches that don’t involve the theft of credit card details are dismissed by corporations as being unimportant. And yet a credit card can be cancelled and replaced within hours. A stolen identity can take months, or longer, to repair.

And all sorts of data can be used to steal an identity. An email address and password; a home and office address; the names of family members; a holiday destination; a regular payment to a health club… Stolen medical records, which are highly effective if you want to steal an identity, will sell for around £20 per person online, while credit card details can be bought for as little as £1. Go figure, as they say in the USA.

Organisations must accept that any loss of customer data puts those customers in harm’s way. And if they want to be seen as ethical they must take reasonable steps to prevent data breaches. Until they do, well the EU’s new data protection rules can’t come on-stream quickly enough for me!

Business processes and cyber risk

Cyber risk doesn’t just involve malicious techies hacking into corporate accounts. It can also involve risk to everyday business processes: “process cyber risk”. Unfortunately, because the IT department is kept busy defending the corporate network from the hackers, these process risks are often left to look after themselves.

What do I mean by process cyber risk? Quite simply, a risk of loss or damage to an organisation caused by a weak business process combined with the use of computer technology. These weak processes are often found within finance departments, but you will also find them in HR, in marketing and across organisations.

Process risk and identity

Many business processes rely on a particular document being signed off by an authorised individual. As many processes migrate online, the assumption is that the sign-off process can also be undertaken online. Sign on as an individual and perhaps you have authorisation to access a particular document or process.

As most people have to log in to company systems with a name and password, this shouldn’t be a problem. Except that passwords get shared. Busy people often share log-in details with juniors, allowing unauthorised people to access systems and documents.

Any authorisation process that simply relies on someone logging in with a name and password is weak because it is easily subverted. Issuing “dongles” as second-factor authentication devices isn’t much better, as these can get shared too (unless they are integral to a company identity card). Robust processes, where sensitive data or decisions are concerned, should assume that a password has been shared (or stolen) and require additional security such as a second pair of eyes.
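The “second pair of eyes” rule is simple to express in code. In this Python sketch (the signatory list is invented for illustration) an action is approved only when two different authorised people have signed it off – so a shared password lets an attacker supply one signature, not both.

```python
# Hypothetical list of people authorised to sign off sensitive actions
AUTHORISED_SIGNATORIES = {"asmith", "bjones", "cpatel"}

def approved(sign_offs):
    """Require sign-off from at least two *different* authorised people."""
    valid = {user for user in sign_offs if user in AUTHORISED_SIGNATORIES}
    return len(valid) >= 2
```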

Process risks and finance departments

One big risk for finance departments is invoice fraud. This can happen in several ways. A common way is for thieves to gather information about a company, perhaps the news that it is investing in new technology. They will then use this information plus other easily obtainable assets such as company logos and the names of senior people in an organisation to put together a scam.

This might involve an email “from” a director of the organisation to a mid-ranking person in the finance department asking for an invoice to be paid promptly; the invoice, which is of course a fake, is attached to the email.

In other cases the invoice is genuine. For instance thieves may pose as a supplier and ask for details of any unpaid invoices. They then resubmit a genuine invoice – but with the bank payment details changed.

All too often the unwitting finance executive passes the invoice for payment. Once the money has reached the thief’s bank account it is quickly transferred to another account making it unrecoverable.

This type of fraud is big business. Earlier this year Ubiquiti Networks disclosed that thieves stole $46.7 million in this way. Meanwhile in the UK, the police’s Action Fraud service received around 750 reports of this type of fraud in the first half of 2015. And of course many similar frauds go unreported – or undetected.

What can you do to protect against this? Well, start by educating staff about the nature of the threat – all staff, not just those in the finance department. Ensure that the details of all invoices are scrutinised carefully: Is the logo up-to-date? Is the email address correct (perhaps it is a .org instead of a .com)? Are the bank payment details the same as usual (if they have changed, then telephone someone you know at the supplier to ask for confirmation)? And take extra care with larger invoices, for instance requiring them to be checked by two separate people.
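The bank-details check in particular is easy to automate. A minimal sketch, with an invented supplier master record: any invoice whose payment details differ from the record – or that comes from an unknown supplier – gets flagged for a phone call before payment.

```python
# Hypothetical master record of known-good supplier payment details
SUPPLIER_RECORDS = {
    "Acme Ltd": {"sort_code": "12-34-56", "account": "12345678"},
}

def needs_phone_check(supplier, sort_code, account):
    """Flag invoices whose bank details differ from the master record."""
    known = SUPPLIER_RECORDS.get(supplier)
    if known is None:
        return True  # unknown supplier: always verify by phone
    return (sort_code, account) != (known["sort_code"], known["account"])
```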

There are other cyber risks within finance processes – and often these are internal risks, initiated by employees. Examples include purchase fraud, when personal items are bought using company money or when required items are bought at inflated prices, with the purchaser then getting a kickback at a later date. Again fake emails can be used to support these purchases. And again simple processes can disarm the threat.

Process risks within HR

Within HR there are numerous process risks. Let’s start with recruitment. The risks here can involve social media profiles designed to misinform, perhaps with fake endorsements or untrue job details. Looking at a LinkedIn profile is an easy way to identify potential candidates – but it is important to realise that the profile you see may well be substantially embroidered.

Another shortcut, especially when looking for “knowledge leaders”, is to see what sort of “rating” candidates have on sites like Klout.com. Superficially this is fine. However, it is essential to be aware of how the site rates people (for instance, what data is used) before making a judgement based on this type of data, as you may well be given a misleading perspective.

Another risk of using social media to identify candidates is that you open yourself to accusations of discrimination. A well-presented CV may contain no information about age, ethnicity or sexual orientation. Social media will. You really don’t want to know this sort of information, but once you know something you can’t “unknow” it: and this can open you up to accusations of bias. It isn’t unknown for companies to commission an edited summary of a candidate’s social media profiles, with anything that could lead to accusations of discrimination taken out, in order to de-risk the profile before it is given to the recruiter.

In fact HR is full of cyber risk, especially where social media is concerned. There may be problems with the posts employees make on social media. There may be issues around bullying or discrimination at work. And maintaining a positive “employer brand” can be very difficult if an ex-employee starts to deride their old employer online on sites such as Glassdoor.

Process risk and marketing

Process risk is also very much at home in marketing. Again social media is one of the culprits. Not everyone, even in marketing, is a social media addict. Senior marketers frequently hand over their brands’ social media profiles to junior marketers, or even interns, because “they have a Facebook page”.

It’s a mistake. Not only is the output likely to be poor, the junior marketer may well (they frequently do) break advertising regulations (for instance by glamorising alcohol) or even fair trading laws (e.g. by including “spontaneous” endorsements from paid celebrities).

This shouldn’t be difficult: there is no reason that the processes that govern advertising in general can’t be applied to social media.

Procurement and cyber risk

Finally there is procurement – and the process of ensuring that third party suppliers don’t represent a cyber risk. This is a huge area of risk and one that is not always well appreciated.

The issue is not just that the third party may be insecure (the massive hack of US retailer Target, for instance, came about via an insecure supplier) and that it is hard to know whether they are secure or not. It is also that people working for a supplier who have been given access may then leave the supplier without you being told: as a result they retain access to your information, perhaps after they have joined a competitor. In addition, suppliers may well have their own reasons for being a risk – they are in dispute with you, they are in financial difficulty, they have been taken over by a competitor…

Business processes frequently have the potential to be undermined by online technologies. It takes imagination to identify where the threats lie. However once they have been identified, actions to reduce the effect of the threat are often very simple.

Persuasion and cyber security

You can’t rely on technology to solve your cyber security issues.

Cyber security is largely a “people” issue: cyber breaches are generally caused by people behaving in an unsafe manner, whether they know they are doing so or not. The solution is to persuade them to behave safely.

But how can you persuade people to do this?

Effective cyber communication

The first step is developing an appropriate communication programme. Of course you already know that this shouldn’t be a “death by PowerPoint” style lecture.

You are going to make your communication engaging and interactive with lots of colour and interesting imagery. You are going to start training sessions off with uplifting material that gets people into a good mood – games, stories, or other activities designed to generate a feeling of well being.

But what about the content of your communications? How should you structure the messages that you need to get across? Here are a few Do’s and Dont’s:

  • Do describe security problems in a clear cut and simple way so that people can understand everything you are saying. Don’t use jargon and make it all sound frightfully difficult because you want to look clever
  • Do give people hope – while 100% security is impossible, you should emphasise that there is a lot that can be done to minimise threats and the consequences of a cyber incident. Don’t use “fear, uncertainty, doubt” to persuade people of the importance of the risk: they will just bury their heads in the sand.
  • Do make the risk relevant to the individuals you are talking to – describe personal risks, to their reputation or their jobs. Don’t describe it as a risk to the faceless organisation they work for.
  • Do stress that the risks are immediate ones that are all around you as you speak. Use examples of things that have happened, ideally to your organisation or a competitor. Don’t describe potential incidents that might happen sometime in the future.

Marketing techniques

There are also a number of marketing techniques you may be able to bring into your communications:

  • Use the power of FREE when describing techniques that people can take to avoid risk; this could be FREE training to avoid phishing, or some FREE software people can download to use at work and at home
  • Use the power of loss. When faced with a potential loss, people are risk averse. So emphasise what people might lose if they behave unsafely, not what they might gain if they behave safely. The loss needs to be personal, for instance it could relate to losing money when shopping online
  • Use the power of authority to persuade people. If you can ensure that your organisation’s leaders will act – and you can show them acting – in a cyber safe way then you have a good chance that people will follow their lead.
  • Use the power of peer pressure. People will often follow the lead of the people around them as they don’t want to seem out of step with the majority’s way of behaving. So if you can persuade some people to endorse safe behaviour during a training session, others will inevitably follow them. Having a few “stooges” as part of your audience may help!
  • Use the power of discovery. Guide people towards uncovering solutions to cyber risks, rather than telling them what to do. If they are responsible for defining solutions they will value those solutions. If you simply give them someone else’s solution it may well be discounted as “Not invented here”

You are trying to change people’s behaviour and it is important that you succeed. Think about what will persuade people. And don’t be afraid to use a few cheesy marketing techniques along the way.

Eight steps to change cyber security culture

Hackers are always a problem. And naturally, your IT Department has network security buttoned down. But they are probably more worried about something else: you and your colleagues.

The big challenge in cyber security is people. It is how to change an organisation’s culture from relying on IT for security into one where everyone takes responsibility. Everyone, from the CEO to the newest intern.

John Kotter famously proposed an eight step process for changing organisational culture, starting with “Establish a sense of urgency” and finishing with “Institutionalise the change”. Well, most people realise that the cyber security problem is pretty urgent. So I thought I’d outline a separate set of eight steps that organisations can follow to strengthen their cyber security culture.

Step 1. Build your guiding coalition

Start by building a multifunctional team to guide change. Cyber security shouldn’t be the responsibility of IT alone, so you will need people from across the organisation to be involved: sales, marketing, operations, finance… This is essential so you get buy-in across the organisation.

More importantly though, if your approach to security doesn’t take account of the way people work, it will fail.

Step 2. Form your vision and scope out your intentions

Next you need to form your vision for cyber security. That should be simple: to protect your assets, reputation, efficiency and information from computer based threats, and to ensure that your digital information is private, is accessible by people who have authority, and has integrity (think “the truth, the whole truth and nothing but the truth”).

In addition you will need to identify the scope of your vision: who it applies to, and what assets, processes and information are relevant. You will also need – and this is a big task – to identify the risks that your vision faces and how best to manage them.

Step 3. Define the details of what you want to achieve

Out of your vision will come the detailed policies you need around cyber security (including policies on IT and web use, Bring your own device, Privacy, and Social media). These need to be expressed in clear language: avoid techie jargon at all costs. Having a truly multifunctional team should mean that the policies should be relevant and effective for your whole organisation.

Step 4. Build new processes

Based on your policies you will be able to identify the tools you need to implement and the processes you need to develop that will help to protect you from cyber risks. It is vital to include a cross section of employees in the design of these systems. Without them you are likely to end up with unusable, frustrating and inflexible processes. If that happens your workforce will soon be looking for ways to work around them. So remove any barriers to people being cyber safe.

Step 5. Educate

Bring your policies to your workforce and educate them about any new tools and processes. Tell them why cyber security is important – for your organisation but also for them personally. And make sure they understand what they should do if they have problems or if things go wrong (as they surely will).

Don’t rely on one off training sessions: make sure that security is constantly “front of mind” with reminders using different techniques, messages and media hitting them as often as possible.

Step 6. Persuade

You can “educate” all you want, but if you fail to persuade them about the importance and effectiveness of what you are proposing then you won’t change anyone’s behaviour.

There are lots of methods that you can steal from marketing and from behavioural economics here. For instance, make sure authority and other credible figures are seen to follow the rules (if the Chief Exec is lax with security you can be certain everyone will happily follow their example). Prove to people that your new ways of working actually deliver benefits. Help people realise that they face constant and sometimes personal risks but (and this is very important) that there is plenty they can do to keep safe.

Keep an eye on how people are incentivised as well. Not about cyber security but about their every day tasks. Don’t put incentives in place that could persuade people to behave in an insecure manner.

Step 7. Socialise cyber security

Kotter talks about “enlisting a volunteer army” and that’s exactly what you have to do. You need everyone in your organisation buying in to the idea of cyber security. Part of this will be ensuring that “the organisation” behaves properly: if it is seen to be cavalier with the security of customer data for instance your internal processes will lose credibility. Ultimately you want your workforce disapproving of people who behave unsafely.

Disapproval doesn’t mean developing a blame culture. That would be very damaging – given the ever changing nature of cyber threats you need people to be able to feel safe if they make a mistake or if they respond wrongly to a new threat. But you do need people to accept cyber safety as the norm and as something that has value in protecting their career and indeed themselves personally, as well as protecting their colleagues and the organisation as a whole.

You might want to take some ideas from Sales as well – leader-boards for people who are particularly effective, prizes for good behaviour, simple recognition for jobs well done…

Step 8. Monitor and enforce

Measurement is very important. Your organisation needs to know how well it is maintaining a positive security culture. Identify some relevant KPIs so you will know if you need to take remedial action.

Enforcement is also important. If people who act unsafely are seen to get away with it then others will quickly follow them. Regular negligence and malicious behaviour may need disciplinary sanctions. More often than not though, you will simply need to offer a little “re-education”. And treat this as a learning opportunity for the organisation as well as the individual concerned. After all if someone is regularly breaking the rules it could well be the fault of the rules!

Uncovering waste in digital service delivery

Services need to be delivered efficiently if an organisation is to thrive. And digitisation can deliver many efficiencies. But it is important to ensure that as much waste as possible is stripped out of services as they are digitised. Otherwise digitisation can simply be an excuse for avoiding hard decisions about existing wasteful processes.

“Muda” in service delivery

Ideas of “lean” production were developed in post-war Japan by companies like Toyota and helped lead to that country’s reinvention as a commercial dynamo.

Lean production involves stripping waste (muda in Japanese) out of the production process to maximise profitability. How can this powerful idea be used when considering digital transformation?

According to Shoichiro Toyoda (President of Toyota until 1999) waste is “anything other than the minimum amount of equipment, materials, parts, space, and workers’ time which are absolutely essential to add value to the product”.

Toyota identified eight “wastes” in their production process. With a little imagination these can be matched with potential wastes in service processes.

The eight wastes

1. Defective processes

Accuracy is as fundamental to services as it is to manufacturing. Defects in processes can include clerical errors in data entry (for example the wrong data being recorded) or a lack of the data necessary for a complete record.

Alternatively, defects might involve the wrong data being used to service an individual: a call centre employee might pull up records for the wrong person, or the records available to a retailer might not match the promises being made elsewhere in the organisation – for instance when an advert promises something the retailer can’t offer to someone who requests it.

2. Over-production

The most important form of over-production in service delivery is the failure to retain existing customers; this results in an expensive search for new customers. Waste here could be caused by a failure to service customers properly, but it is just as likely to be caused by a failure to generate loyalty through communications (for instance when offers are targeted only at new customers) or by a failure to recognise a customer’s status as an existing customer.

Within the service itself, over-production could involve the creation of records that are not required e.g. keeping records of people who are not customers may be a waste if they are not (legitimately and ethically) used for other things. Alternatively requiring unnecessary data fields to be completed is a waste e.g. in a sales form a requirement for a telephone number in addition to an email address may be unnecessary (as well as being off-putting to the customer). This seems to be a fairly common issue in e-commerce forms where data is gathered unnecessarily “just in case” it might be useful. If unnecessary data is collected and stored then there is a data compliance issue in Europe as data rules state that data should only be held when necessary.

3. Damage during production

When you are building a car it is easy to see how damage to delicate components can happen. It is not immediately obvious how waste can be caused during the process of providing the service.

But it could be generated by someone accessing and changing the customer data used in a service. For instance, if someone accesses a customer’s file and changes, adds to or deletes the data without any appropriate record being made, the record could be damaged as it would no longer be complete.

4. The use of unnecessary physical resources or inventory

Using too much steel in a motor car is an obvious waste of resource. Keying data in twice is an example of an unnecessary use of resource in a service process. For instance if a salesperson takes down the details of a prospect on a paper form and then those details need to be transposed to an online system there is an obvious waste, as well as an increased risk of inaccuracy when transposition errors occur.

Waste is a big problem in any service where the service provider isn’t using their own money to provide the service. The bloated management seen in many public service organisations is a manifestation of this.

Examples include the use of unnecessary equipment such as expensive tablet computers bought for reasons of fashion rather than function, or decisions made about unnecessary software, or software upgrades, that cause unnecessary expenditure. Note that the use of unnecessary software could also act as a cyber risk by expanding the “risk surface” of the organisation while the use of non-standard computing equipment could have a similar effect: another reason for rooting out this type of waste.

Another important resource is information. Making it unnecessarily hard to find information could be very wasteful: knowledge workers have been estimated to spend up to 20% of their time looking for information. Thinking of ways to reduce this – better file structures, efficient desktop search engines, more effective knowledge management, even a library of books – could reduce this waste considerably as well as making employees feel better about their jobs.

Related to this is the waste associated with unnecessary work – such as emails where people are “copied in” for no reason and unnecessary “meetings about meetings”, or meetings where everyone is given a chance to speak even if they have nothing to say! (Holding meetings standing up is a good way of speeding them up.) The creation of long meeting minutes rather than brief outlines of decisions made is often wasteful. Compulsory training can also be wasteful – where it is provided to people who don’t need it, perhaps because training plans are not granular enough and fail to distinguish between different types of worker.

Office costs can also be a source of waste – heating and lighting left on in empty rooms, unnecessary use of printer ink and paper, etc.; these can add substantially to the cost of delivering services. Comfortable working conditions are of course important for maintaining staff morale and efficiency, but where some parts of an organisation are seen as getting special treatment this can cause resentment.

5. Unnecessary transportation costs

Generally services are not “transported”, unlike motor cars. However the people who deliver them are: wasteful costs here could therefore include unnecessary offices kept physically near to consumers when the service could just as well be delivered remotely. This can be part of the case for digitising processes: for instance a customer consultation or an internal meeting held over Skype might be far more time-efficient than a face-to-face meeting.

There could also be “transportation” waste caused by the inability to access records remotely once they are created, requiring people to visit a separate location to access the information they need or to download data to a system. I have seen this caused by inefficient (i.e. over-secure) security protocols that allow people to log on to a system from one work location but not from another.

6. Unnecessary time taken

If part of a service takes an unnecessarily long time to deliver, other people involved in the service waste their time waiting. It can also mean the customer waiting for something to be ready for them – and waiting will reduce their loyalty.

Time waste can be caused by inefficient “critical paths”, where actions dependent on other actions are not ordered as well as they could be. In addition, unnecessary processes such as duplicated data entry can delay the delivery of services. A large number of versions of a “version controlled” document could indicate inefficiency in the way that document is handled.
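The idea behind a critical path can be made concrete with a little code. The sketch below uses entirely hypothetical task names and durations: the earliest the whole service can finish is set by the longest chain of dependent tasks, so reordering or speeding up tasks off that chain saves nothing.

```python
# Minimal critical-path sketch. Task names and durations are invented
# for illustration: each task maps to (duration in hours, prerequisites).
tasks = {
    "receive_order": (1, []),
    "check_stock":   (2, ["receive_order"]),
    "pick_items":    (3, ["check_stock"]),
    "enter_data":    (2, ["receive_order"]),   # duplicated data entry
    "dispatch":      (1, ["pick_items", "enter_data"]),
}

memo = {}

def earliest_finish(task):
    """Earliest time `task` can finish, once all its prerequisites are done."""
    if task not in memo:
        duration, prereqs = tasks[task]
        memo[task] = duration + max(
            (earliest_finish(p) for p in prereqs), default=0
        )
    return memo[task]

print(earliest_finish("dispatch"))  # → 7: receive (1) + check (2) + pick (3) + dispatch (1)
```

Note that "enter_data" finishes at hour 3, well before "pick_items" at hour 6: making data entry faster would not shorten the service at all, which is why waste on the critical path matters most.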

One technique to uncover unnecessarily complex processes is “process mining” where the relationships between different parts of a process are mapped out and any loops or repeated steps can be identified.
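A very simple flavour of process mining can be shown in a few lines. The event log below is made up for illustration: each case is a list of activities in the order they happened, and the sketch counts which step directly follows which, and flags steps that repeat within a case (a sign of a rework loop).

```python
# Minimal process-mining sketch over a hypothetical event log.
from collections import Counter

traces = [  # one activity list per case, invented for illustration
    ["receive", "check", "pack", "dispatch"],
    ["receive", "check", "check", "pack", "dispatch"],  # "check" repeated
    ["receive", "check", "pack", "check", "dispatch"],  # rework loop back to "check"
]

follows = Counter()  # how often activity b directly follows activity a
repeats = Counter()  # in how many cases an activity occurs more than once

for trace in traces:
    for a, b in zip(trace, trace[1:]):
        follows[(a, b)] += 1
    for step, count in Counter(trace).items():
        if count > 1:
            repeats[step] += 1

print(follows[("check", "pack")])  # → 3
print(dict(repeats))               # → {'check': 2}: "check" repeats in 2 of 3 cases
```

Real process-mining tools build a full process map from logs like this, but even this toy version points straight at "check" as the step being done more than once.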

7. Unnecessarily high quality of components

We want our motor cars to contain components of the appropriate quality. For instance some European motor manufacturers experienced quality problems when they decided to save money on components during the economic downturn.

In service processes, consumers of course want an appropriate quality of customer service. But if customer service elements don’t actually generate extra sales or loyalty then they are wasteful. For instance, call-centre interactions with customers who have queries about a product they have bought may be seriously wasteful compared with a good online FAQ.

Timing is also important here: asking a customer at a restaurant “is everything satisfactory” may well show appropriate customer care when it happens just after they have been served; but asking the same on the way out after they have paid (rather than just saying “goodbye”) could be considered wasteful and indeed unnecessarily risky.

8. Failure to use staff skills

Where the wrong people are doing the wrong jobs – for example where professionals are doing administrative work – there is a clear waste of talent and resource. This can happen if tasks are not allocated properly, or if weak management allows people who should be undertaking routine tasks for more qualified colleagues to “delegate upwards”.

Even if professionally qualified people are employed at a cheaper rate to perform a routine task, this may still be wasteful for an organisation, because they are likely to be bored and less efficient – unless they know they are being trained up for a harder job in the future.

Finding the bottlenecks

Waste can occur anywhere in a service process. However some waste is worse than other waste. In particular, when the waste is happening in a part of the process that is already struggling to perform effectively then this waste needs to be prioritised.

Most processes are only as strong – or as efficient – as their weakest (or most inefficient) part. It is therefore sensible to locate any bottlenecks that are reducing service efficiency or extending delivery times and to start identifying waste there.

Let’s take a process that must deliver a service within a set time – say the delivery of groceries in a promised time slot. There may be waste in several areas: receiving the order, picking and packing, loading the van, getting to the customer. But if a resource problem around loading the van is affecting the ability to meet promised delivery times, solving a resource issue in the picking and packing area won’t solve the problem of late delivery.
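The grocery example can be sketched with some made-up hourly capacities. The whole process can never deliver faster than its slowest stage, so that stage is where waste-hunting pays off first.

```python
# Bottleneck spotting with hypothetical capacities (orders per hour per stage).
capacity = {
    "receive_order": 120,
    "pick_and_pack": 60,
    "load_van":      35,
    "deliver":       50,
}

# The stage with the lowest capacity caps the throughput of the whole process.
bottleneck = min(capacity, key=capacity.get)
print(bottleneck, capacity[bottleneck])  # → load_van 35
```

With these numbers, doubling the speed of picking and packing would change nothing: overall throughput stays pinned at 35 orders per hour until van loading is improved.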

Waste and the digitising of processes

Digitisation does not in itself guarantee efficiency. Any project to digitise a business process needs to identify waste in the process and then consider how digitising it could reduce that waste. It is important to avoid digitisation that merely makes processes more complex – paper is an excellent interface, and in some circumstances (e.g. where data doesn’t need to be shared or stored for any length of time) it can be a perfectly good part of a process.

In addition, it is important to consider any risks (especially around security and data compliance) that might arise as a result of digitising a process. If these risks outweigh the advantages of digitisation and cannot be reduced, then the case for digitisation is weakened accordingly.