The Wall Street Journal published an article criticizing phishing simulations. Frankly, when I read the title, I thought I would agree with it. Unfortunately, as I read it, my impression was, “What a specious and pompous piece of crap!”
Before I go on, I should say that I wrote to the Wall Street Journal editor offering to publish these points in their venue, but I received the following response.
Hi! Thanks so much for writing! While we take pieces by reporters and academic researchers, we don't run essays by cybersecurity company executives or consultants. That said, it sounds like your work, studies, and writing provide you with an interesting perspective. You've given me something to think about, and I'll keep in mind the points you've made.
In other words, if you are a qualified practitioner with a learned position, don’t bother trying to reach the WSJ audience. I will leave it at that, but the following is what I would have published.
Two things to be clear about:
- I make no money from phishing simulation companies, nor am I proactively offering services to perform this work. I have no financial interest in this.
- I will not link the article and give it more exposure. Unfortunately, it is much easier to find that article than this article.
As background, the article was written by an associate professor at a legitimate institution who sometimes conducts research on the human aspects of cybersecurity. The author has no practitioner experience and has never run an awareness program; they just occasionally perform somewhat relevant research.
The article in question seems to make its points through unsubstantiated references to “research” and a single phishing simulation at a company of 2,000 people. Academically, vague references to research are rejected by default. Professionally, a single simulation of 2,000 people is a small test run ahead of a full assessment at many organizations. Either way, professionals know not to place any measure of legitimacy on a single test.
The article uses this foundation to offer some specious reasons to end phishing simulations. It portrays simulations as generating fear and distrust. It says that some phishing lures generated more responses than others. It also says that users experience stress and anxiety. Technically, all of these things can be true, and they are, in poorly designed programs.
I will say this once and leave it here. Many awareness programs, including phishing simulations, are poorly designed and a waste of time and money. I used this analogy on LinkedIn: If you give a monkey a violin, and the monkey makes horrible music, it’s not the violin’s fault. Awareness tools are the violin, and in this case, the academic researcher and bad security program designs are the monkey.
Yes, awareness managers should not just put out phishing simulations without making sure that people know why they are being run. I once spoke to a financial analyst at one of the largest investment banks in the world, who told me he hates phishing simulations because he could be fired if he failed three of them in a year. I asked how he felt about that, and his response was enlightening: “Well, if I click on a real phish at this company, I can cost them more money than I will make in a lifetime.” He knew the “why,” to the credit of the awareness team.
Regarding the rest, I will simply say that some phishing lures are better for training than others, and this is exactly why you don’t use an academic researcher running an experiment to conduct the phishing assessment. You want a skilled practitioner who performs these tests frequently and knows which lures are better suited to a given purpose. It really is that simple.

Regarding users being stressed, instilling a sense of diligence does, unfortunately, add to their stress. However, it should be considered no different from instilling good driving habits in truck drivers, safe food handling habits in food service workers, or proper diligence in accountants. Frankly, ransomware today is infinitely more of an existential threat to an organization than a single unsafe truck driver, and we don’t talk about driving tests as being too stressful for truck drivers.
The author’s citing of other studies with only vague references is incredibly poor practice. You can’t confirm or deny the claims. But I can say with 100% confidence that I have legitimate studies that will rebut whatever points they make. There is a study for everything.
There are a few facts and commonly accepted bodies of science that the author makes no acknowledgement of. For example, there is the Ebbinghaus Forgetting Curve, which essentially finds that if you give someone educational information and they don’t apply it, their memory of that information fades (by about 70% after one day and 90% after one month). Phishing simulations interrupt the Forgetting Curve and reinforce awareness lessons.
Likewise, in safety science there is the principle of complacency: when nothing happens, people become complacent and ignore safety practices. Phishing simulations help interrupt not just the Forgetting Curve but also the complacency that sets in because secure email gateways are so effective at weeding out real phishing messages.
There is also some interesting research by Elevate Security and The Cyentia Institute that found that 4% of users caused 80% of damage. Phishing simulations help identify that 4% of users, allowing the organization to train them better and otherwise reduce the risk they create.
Independently, I performed research and found that users with high levels of self-confidence who fell for a phishing message would never click on a second one. So, in those cases, the risk these individuals posed to the organization, however minimal, was totally mitigated. Clearly, this does not describe everyone, but it does reduce risk.
Again, I will not argue that many, if not most, phishing simulations are poorly designed and implemented. That does not mean they will not provide value when implemented properly. The problem is that the Wall Street Journal is considered a legitimate source of information and is read by executives. This article did a tremendous disservice to awareness professionals and to the cybersecurity profession as a whole. It is Ivory Tower pontification at its worst: dangerously specious, issuing dictates without critical context, and leaving out other relevant information.
About the author
Ira Winkler is a prominent CISO and cybersecurity leader who, with his background in psychology and the military, has written some of the most authoritative books in the industry on security awareness. All opinions are his own, and do not necessarily reflect the opinion or perspective of Hoxhunt.