By George Finney, Chief Security Officer for Southern Methodist University and Author of Well Aware: Master the Nine Cybersecurity Habits to Protect Your Future
My first job out of college was at a call center doing tech support for an Internet Service Provider. This was a long time ago, but one of the first things I learned was the phrases “ID10T Error” and “PEBKAC”. Both were jabs at the sometimes-frustrating customers who would do weird things like using their CD tray as a cup holder. We still use these terms today and have built them into our culture as though they were a motto.
In cybersecurity, everyone knows our secret motto: “People are the weakest link.” We say this even though it’s totally wrong. People aren’t the weakest link. As Lance Spitzner of the SANS Institute says, “People aren’t the weakest link, they are the largest attack surface.” And this way of thinking is making us less secure.
In the 1960s, Lenore Jacobson conducted an experiment. Jacobson was an elementary school principal, and she had just read a study by psychologist Dr. Robert Rosenthal about how expectations can lead to higher performance. So she set out to give all the students in her elementary school an IQ test. Then she shared this information with the teachers. But she lied to the teachers about the students’ scores.
The students that she said had the highest test scores were actually the lowest and vice versa.
At the end of the school year, the students were tested again. The students the teachers believed to have started with the highest scores improved significantly more than the students the teachers believed to have the lowest scores. What mattered more than the students’ innate intellectual ability was the teachers’ belief that the students were “intellectual bloomers”.
If we in the cybersecurity community believe that people are the weakest link and always will be, then our belief will ensure that this comes true. But what if we believed something different?
When I came into my role as a CISO, I gave a monthly report to my executive team with lots of dashboards. I was constantly searching for metrics that would show how effective our security program was. There are lots of metrics you can report on, such as the total volume of attacks, that help you understand the scope of the problem but don’t reflect how good a job your team is doing. A large volume of attacks doesn’t mean you aren’t good at your job; it just means you are a large target.
We began sending simulated phishing messages to our users in 2014, and I started reporting on the number of users who clicked on the phishing links. Over time this number went down, but I realized that this metric didn’t tell the whole story. Dwelling on how low the percentage got emphasized the negative aspects of my campaign and distracted from the positive. Instead of saying that we reduced our click-through rate to 3%, I started saying that we increased our phishing recognition rate to 97%.
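The reframing is just arithmetic on the same campaign data: the recognition rate is the complement of the click-through rate. A minimal sketch, with illustrative numbers rather than real campaign data:

```python
# Sketch: the same phishing-simulation result, reported two ways.
# The message and click counts below are made up for illustration.

def phishing_metrics(messages_sent: int, links_clicked: int) -> dict:
    """Return both framings of the same underlying result."""
    click_rate = links_clicked / messages_sent
    recognition_rate = 1 - click_rate
    return {
        "click_through_rate": round(click_rate * 100, 1),      # negative framing
        "recognition_rate": round(recognition_rate * 100, 1),  # positive framing
    }

print(phishing_metrics(1000, 30))
# {'click_through_rate': 3.0, 'recognition_rate': 97.0}
```

Same data, but the second number sends the message that the vast majority of the community is getting it right.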
For me, this was a big change. Instead of normalizing bad behavior, I started sending the message that the vast majority of our community was highly effective at recognizing phishing.
This approach was, for lack of a better term, infectious. In my security awareness newsletters, I began using images of people, not random pictures of technology, to reinforce the message that people are the ones we’re protecting. I began telling stories of how people were impacted by security incidents, and more importantly how they responded. I wanted to show my community how to improve rather than constantly telling them to improve.
But all of this required that I let go of the belief that people are the problem and start believing that they are the solution. One of the ways I’ve changed my security program is to embrace what I call “fearless learning”. When someone makes a mistake, whether they can learn from it comes down to whether they’re afraid of what happens afterward. If they feel they could be made a scapegoat and fired, then, from a neuroscience perspective, their cognitive capacity will be reduced. We see this degradation of mental capacity in all kinds of stressful situations.
When a user clicks on a phishing message, I never report this information to anyone. I’ve gotten requests from people who want to use this information to discipline employees. I’ve resisted this at all costs because I want to create a culture where users have a safe environment to learn and practice before there is an incident. I do this because I believe that they can change their habits. And I’ve seen that this is possible.
Stanford Professor BJ Fogg believes the reason we fail at changing things in our lives is that we start big. In his book, Tiny Habits, he describes habits as a rope with hundreds of knots. If you go for the largest knot to unravel, you will fail. But if you loosen an easy knot, you will be able to work your way up to the bigger challenges. And with each small knot, you build your own skill at mastering change.
Changing our cybersecurity cultures may seem like an insurmountable problem, but it’s not. We can start, not just small, but tiny. We need to make it incredibly easy to get started. We need to celebrate even the smallest successes rather than condemning mistakes. And over time, we can start to build momentum.
In researching the habits we use in cybersecurity, I’ve distilled all of the advice and training we give to people into nine distinct categories of habits: Literacy, Skepticism, Vigilance, Secrecy, Culture, Diligence, Community, Mirroring, and Deception.
The nine cybersecurity habits are what Fogg calls constellations of tiny habits. Change works best when you focus on related habits at the same time. If you miss a habit for a day because you went on vacation, that’s ok. If you only do the minimum, you still celebrate because you’re building a lasting habit. And you get the satisfaction of knowing that you’re not just protecting yourself, but you’re protecting those around you as well.
Can making tiny changes really change the whole culture of an entire organization?
To be successful, we need to start small. We don’t need to change everyone all at once. But to start, we do need a small, committed group of people to be our vanguard. This group will create the tipping point that changes our culture. According to Dr. Damon Centola at the University of Pennsylvania, the tipping point for creating large-scale change is only around 25% of the population of a group.
25% is still a large number, but we don’t need to start big. We can start by working with 10 people to teach them how to change their cybersecurity habits. And if we deputize them to be cybersecurity habit evangelists, each of them can teach 10 more. But it starts with believing people are the solution to our cybersecurity challenges.
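The arithmetic behind this can be sketched in a few lines. This is a deliberately simple model, assuming an illustrative organization size of 4,000 people and that every trained person teaches ten more in each round; the only figure from the text is the ~25% tipping point:

```python
# Sketch of the "each evangelist teaches ten more" idea.
# Organization size and teaching rate are illustrative assumptions.

def rounds_to_tipping_point(org_size: int, seed: int = 10,
                            taught_per_person: int = 10,
                            tipping_fraction: float = 0.25) -> int:
    """Count training rounds until the trained group reaches the tipping point."""
    trained = seed
    rounds = 0
    while trained < org_size * tipping_fraction:
        trained += trained * taught_per_person  # every trained person teaches 10 more
        rounds += 1
    return rounds

print(rounds_to_tipping_point(4000))
# 2 rounds: 10 evangelists -> 110 trained -> 1,210 trained (past the 1,000 mark)
```

Even with far less generous assumptions, the point stands: a committed vanguard compounds quickly, which is why we don’t need to start big.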
Changing culture won’t happen overnight. But it will happen if we change one habit at a time.
About the Author
George Finney is a CISO, author, speaker, professor, and consultant who believes that people are the key to solving our cybersecurity challenges. He has worked in cybersecurity for nearly 20 years and has helped startups, global telecommunications firms, and nonprofits improve their security posture. As a part of his passion for education, George has taught cybersecurity at Southern Methodist University and is the author of Well Aware: Master the Nine Cybersecurity Habits to Protect Your Future. George has been recognized by Security Magazine as one of their top cybersecurity leaders in 2018 and is a part of the Texas CISO Council.
George can be reached via LinkedIn, Twitter @wellawaresecure, and on his website where you can find more information about the nine cybersecurity habits http://www.wellawaresecurity.com/