Tuesday, October 3, 2023

AskAppSec - Painless Usable Security

Imagine security being painless, easily usable, and just the usual way we do things. Imagine this both for those who develop products and those who use them. Wouldn't that be amazing? My optimism tells me it's possible, and yet we're often far from it.

In one way or another, I kept thinking about this over the last months. At SoCraTes and FroGS conf, I facilitated sessions on the topic. Together with participants, we gathered lots of insights: what struggles they faced, what opportunities they found, and what we can do to change the narrative. Many thanks to everyone who contributed! Here are the points that came up repeatedly when people were asked what's painful.

  • Fear. It's scary to ask questions, especially about security. Security teams (if you even have them) might be very detached from teams' everyday realities and not approachable; they might even be condescending, or just wave a policy around without being helpful. There's a lot of secrecy and gatekeeping going on as well. And what if we make a mistake and people blame us? What if I see something, yet every incentive encourages me to look the other way instead of reporting it?
  • Fatigue. So many alerts, we're overwhelmed already. So many false positives reported by tools, which makes it even easier to ignore yet another scanner result and just bypass it so we can move forward, as we're pressured to do. Security theater plays a huge role here as well: everyone talks about how important security is without real actions ever being taken. Why not just tick those boxes in the easiest way, so we can say we comply without actually fulfilling the spirit behind the regulations?
  • Future. Well, security is indeed a future problem, isn't it? Yeah, that risk exists, yet will it really ever happen? We'd rather cross that bridge when we come to it. We have so many other things to do, after all. And as we can't invest in prevention now, security ends up last by default. Hence, we can ignore the issues we see, as no real pain is perceived - until suddenly the pain is super high.
  • Friction. I know this is the more secure way, yet I'd have to jump through ten hoops, get approval from a hundred people and then sign a contract in blood - or... I just do this one-liner change. Procedural problems are real. Poor experience is real. Difficult cross-team collaboration and dependencies are real. And they have a very real impact on behavior. If something takes way too much effort for what it's worth, we usually won't go the extra mile (or at least aren't rewarded for it).
  • Futility. Security is just such a huge area, security work is never finished, we'll never know everything. The system is so complex. We lack knowledge, and we have so much else to know already. We struggle to see the actual impact of vulnerabilities anyway. We can only ever know that a system is insecure, never the other way around. All this feels really futile, so why invest at all?

This list resonates with me a lot, and I see these points in my own work context as well. Especially when there's a whole backlog of things we know we need to improve, yet we struggle to balance competing priorities. Fatigue is a real challenge indeed - like the fatigue of pointing out problems and proposing solutions that just don't make the cut of the most valuable things to do right now.

I've also talked with several people in the communities I'm in, where security was sometimes perceived as painful for other reasons. Like: why is security the only quality aspect that gets considered and receives buy-in - what about all the other ways in which we can harm our users, our own people, our product and company? Why do we get external experts for security yet not for other topics (like accessibility)? Why do security policies always seem to make things harder? Why is it that security slows us down while not actually achieving more secure outcomes?

Finally, there's the angle from friends and family not working in tech. Security? Well, that's often perceived as the thing that annoys you, the thing you skip. Oh my, another update, why do people have to change things all the time? Oh no, another factor to log in with, why does everybody need this nowadays? My goodness, another popup to click away so I can do my job and get on with my life. I heard a lot of statements like this, usually accompanied by frustration and anger. Or by shrugging things off. I don't care if they have my data, what would they do with it anyway? Yeah, I know this company has been proven to do bad things, and yet they offer the most usable solution compared to more secure competitors.

So how can we reduce the pain and friction, increase usability and make security the easy route to take? In all my conversations, the following points came up repeatedly.

  • Ease the development experience. Anything that makes security easier and reduces friction and cognitive load from the start can help. Include thinking exercises like evil personas you could use for user stories, and do threat modelling to raise awareness before designing and building solutions. Provide good code examples. Have secure defaults - in frameworks, in infrastructure, in your own product. Keep things in shape and up to date. Enable folks to deliver fast and well, so we can respond fast and well to new threats. Plan for mistakes (that will inevitably happen) and for recovery, and foster a culture where postmortems are considered an invaluable learning opportunity.
  • Collaborate with security experts early on. No ivory towers, no jargon being thrown around. Instead, security folks being approachable and helpful, enabling team after team, pairing and ensembling hands-on. Security champion programs that actually build bridges and help scale good practices. Do this early and continuously to get the best leverage. Then consider bringing in external people to point out problems and provide advice, be it in the form of consultants, audits, bug bounty programs, or coordinated disclosure.
  • Be clear about risks to help prioritization. Not only do we need to assess risks, risks can also have vastly different impact depending on our specific context. What has terrible consequences in one context might be reasonably ignorable in another. Learning what's most relevant in yours, and probing the risk appetite of the organization, helps figure out priorities.
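
To make the "secure defaults" idea concrete, here's a small hypothetical sketch in Python (not tied to any particular framework, and the class and names are made up for illustration): security-relevant options default to the safe choice, so the easy path is also the secure one, and opting out has to be explicit and therefore visible in code review.

```python
from dataclasses import dataclass

# Hypothetical sketch: cookie settings whose security-relevant flags
# default to the safe choice. Deviating from the default requires an
# explicit argument, which shows up clearly in code review.
@dataclass(frozen=True)
class CookieSettings:
    secure: bool = True      # only send the cookie over HTTPS
    http_only: bool = True   # not readable from JavaScript
    same_site: str = "Lax"   # limit cross-site sending by default

    def as_header_attributes(self) -> str:
        """Render the flags as Set-Cookie style attributes."""
        parts = []
        if self.secure:
            parts.append("Secure")
        if self.http_only:
            parts.append("HttpOnly")
        parts.append(f"SameSite={self.same_site}")
        return "; ".join(parts)

# The default is safe without any extra effort:
print(CookieSettings().as_header_attributes())
# → Secure; HttpOnly; SameSite=Lax

# Opting out is an explicit, reviewable decision:
print(CookieSettings(secure=False).as_header_attributes())
# → HttpOnly; SameSite=Lax
```

The point isn't this particular class, it's the pattern: whoever does nothing special gets the secure behavior for free.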

One thing that stuck with me is what both security and UX folks repeat over and over: security that's not usable is not security. As Jared Spool puts it, "If it’s not usable, it’s not secure." Because people simply won't do it, or will find a way around it. They have a task to accomplish, a goal to achieve, a job to do. If security blocks them instead of supporting them, it might as well not be there in the first place.

The same applies to development teams. If practices leading to more security aren't usable, or don't fit into our everyday lives, they simply won't happen. We have to find ways to make security the easy and frictionless route; anything else is simply not sustainable.

This whole topic reminds me again of testing and quality, as so many things in security do. It's a lot about culture, a lot about advocacy and change. In the end it's about people. I'm wondering now: the team transformation tactics I found helpful for moving towards holistic testing and quality - could I try them out for moving towards painless, usable security as well? I probably should give it a try. Now that I'm thinking about it, I realize I literally just applied them over the last weeks. And before as well.

Let me give an example. One of our current focus topics is keeping our dependencies up to date. Updating them is one part, yet having a team reliably keep doing so is a whole different story. What I did was build on existing energies - in this case, on existing practices that already worked for the team. And just last week, for the first time, it worked out well enough. People felt responsible and updated dependencies on their own without my nudging. It was clear, it was easy, it was part of everyday work. It didn't cause friction. Okay, granted, I'll have to observe and evaluate this experiment further, yet at first glance it does look like less friction than before.

When it comes to security improvements for our product, I believe we need to work a lot more, and a lot more closely, with UX folks and product designers. This expertise is invaluable and yet too often underrated. The points listed above give lots of pointers as to why.

That leaves me wondering: why not also work together with UX and design to find more painless, usable, secure ways to build more painless, usable, secure solutions? There's a lot more for me to think about and try out.

I'm sure many people have made their own experiences at this intersection of security and usability, and with its pain points, be it for their team, organization, product, or just generally in life. Therefore, let me ask you all: what's your approach to moving towards painless, usable security?

UPDATE: This post hasn't received much response from the community yet. I really appreciated this person taking the time to share their thoughts and experiences!

  1. An interesting article (as always), Lisi! What did you and your team do to make keeping dependencies up to date work? This is a hot issue in our teams at the moment.

    1. Thank you, John, as always! :) As I started writing my reply, I realized I should write a separate blog post about this topic, there's more thinking to be done. Here's my short version for now, maybe it provides helpful pointers already.

      Keeping dependencies up to date is a hot issue for my team as well, especially as we own a whole bunch of services and most of them have been around for quite a long time (and are still valuable). So far, the following has worked for us in our context.

      1) Establish, encourage and ensure 20% time for every team member used to drive tech initiatives (like getting dependencies of our services in shape).

      2) Use tooling to support easier updates, like automated scanners for outdated dependencies, utility tools to adapt required related documentation for compliance reasons, and automated checks to discover potential regressions where feasible (in combination with relying on system knowledge to quickly unveil more surprises where automation reaches its limits).

      3) Build on existing energies and practices to keep dependencies in shape (in our case we have regular tasks needed for each release, and updating dependencies became one of them).
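
      To illustrate what point 2) means by an automated scanner for outdated dependencies, here's a toy sketch in Python (entirely hypothetical - real setups would rely on tools like Dependabot or Renovate, and on proper version parsing, rather than hand-rolled code): compare pinned versions against the latest known versions and report what's behind.

```python
# Hypothetical sketch of an outdated-dependency check. All names and
# example data are made up for illustration; real tooling (Dependabot,
# Renovate, pip list --outdated, etc.) does this far more robustly.

def parse_version(version: str) -> tuple[int, ...]:
    """Turn a simple version string like '1.2.3' into (1, 2, 3)."""
    return tuple(int(part) for part in version.split("."))

def find_outdated(
    pinned: dict[str, str], latest: dict[str, str]
) -> dict[str, tuple[str, str]]:
    """Return {name: (pinned, latest)} for every dependency that is behind."""
    outdated = {}
    for name, version in pinned.items():
        newest = latest.get(name)
        if newest and parse_version(version) < parse_version(newest):
            outdated[name] = (version, newest)
    return outdated

# Example data (invented for the sketch):
pinned = {"requests": "2.28.0", "flask": "2.3.2"}
latest = {"requests": "2.31.0", "flask": "2.3.2"}
print(find_outdated(pinned, latest))
# → {'requests': ('2.28.0', '2.31.0')}
```

      The value of wiring something like this into the pipeline is that nobody has to remember to check: staying up to date becomes a visible, routine signal rather than an occasional heroic effort.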

      All this, however, likely only worked due to the team culture we fostered, where people share everything: knowledge, skills, load, a common goal, and more. This made it clear from the start that keeping dependencies up to date is a team task as well, and we're all responsible for it together. I hope to share more once we've lived this approach for a longer time; I'm curious whether we can manage to keep our system in shape.