‘Security isn’t a dirty word, Blackadder. Crevice is a dirty word, but security isn’t.’
General Melchett, Blackadder Goes Forth
Security plays an ever more visible role in our lives because we leave a trail of personal data wherever we go online.
In my current engagement I’m managing a security project that underpins part of the UK’s critical national infrastructure, so I come into contact with information security PhDs, grey hat hackers, and people who spend a lot of time in Cheltenham. It’s genuinely frightening how easy it is for people with the right skills and some basic tools to breach the security most of us put around our online identities – whether we’re reusing the same password (or variations of it) for everything, or using ‘password1’ because it’s easy to remember.
When it comes to the systems that we build, a lot of organisations find it hard to strike a balance between security and usability – erring on the side of ever-more security. This is probably because there are standards for every aspect of IT security, from verification of identity to backups, and there is an inevitable tension between hard (defined standards, audit, certification) and soft (user research, business need, ideas) – hard almost always wins.
The problem is that this often backfires. In trying to lock down what people can do to an ever-greater extent, organisations find that people work around the constraints to accomplish what they need to do.
The more complex we make password rules, the less easy they are to remember and the more likely it is that users write them down in a place that’s easily accessible.
The more we lock down access at work, the more we’ll use our own devices – where corporate IT security has no control at all.
The more rules there are, the less we’ll appreciate which are the important ones.
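The password point can be made concrete with a back-of-the-envelope entropy calculation. This is a deliberately naive sketch – it assumes characters are chosen uniformly at random, which real users never do – but it illustrates why a long, memorable passphrase can beat a short ‘complex’ password that composition rules tend to produce:

```python
import math

def naive_entropy_bits(length: int, pool_size: int) -> float:
    """Naive entropy estimate: assumes each character is picked
    uniformly and independently from a pool of pool_size symbols."""
    return length * math.log2(pool_size)

# An 8-character 'complex' password drawn from upper, lower, digits
# and symbols (~94 printable ASCII characters): ~52 bits.
complex_pw = naive_entropy_bits(8, 94)

# A 20-character all-lowercase passphrase (26 letters): ~94 bits,
# and far easier to remember than Tr0ub4dor-style strings.
passphrase = naive_entropy_bits(20, 26)

print(f"8-char complex password: {complex_pw:.1f} bits")
print(f"20-char lowercase passphrase: {passphrase:.1f} bits")
```

The model flatters the complex password, since users under strict rules gravitate to predictable patterns (capital first, digit and symbol last) – so in practice the gap is wider still.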
In almost every design workshop I repeat the mantra ‘users are like water – they will find the easiest path. If we don’t provide the easiest path for them to accomplish what they need, they will find another way.’
We can find this path, and a balance, by asking what we need to accomplish as users, and fitting this within a framework of security considerations. We need standards and rules, but we also need to be able to do our jobs. We don’t need isolated decisions (which are easy) that layer up into a hellish bureaucracy (making things hard).
Designing systems (in the organisation-as-a-system sense) is hard work. But by mapping out the consequences of our decisions, asking ‘what will people do?’ when we make architectural choices, and then watching what people actually do with early versions of the systems they use, we’ll find a better balance between security and usability – systems as secure as they need to be, and users who can do their jobs.