How optimism undermines cybersecurity
While we like to think of ourselves as highly rational and logical, the human brain is sometimes too optimistic for its own good. In our personal lives, for example, we tend to grossly underestimate the odds that we will experience a negative life event or be involved in an accident. Psychologists call this optimism bias, a cognitive bias that extends far beyond our personal lives. It is why CEOs and CCOs invest heavily in cloud-based business transformation efforts, yet try to save money on the robust cybersecurity programs needed to keep those efforts safe, secure, and functioning.
When it comes to cybersecurity, the “It won’t happen to us” attitude is not a smart bet; it’s like playing Russian roulette. It breeds complacency and overconfidence and puts an organization at risk. To overcome this, organizations must take an evidence-based approach to cybersecurity, assessing the impact of cyber attacks in their industry and the cost of a potential breach.
Hope for the best, prepare for the worst
In the world of cybersecurity, many of the incidents we hear about were almost certainly preceded by an assumption – or at least a reasonable expectation – that preventative measures were in place to stop such an incident. If you were to ask how many security practitioners would knowingly, let alone intentionally, jeopardize their security posture, the answer would probably be very few, or none at all. The point is, most organizations are still assessing their risk position and looking for ways to effectively adopt “best practices” that improve their ability to mitigate and respond to attacks. That could mean new technology, new processes or workflows, or simply more staff (if you are lucky enough to find qualified people for a vacant position). After many years of talking with organizations and helping them improve their solutions and security effectiveness, I would say it is often a mix of all three.
What a lot of organizations are finding is that they have good oversight tools, decent defense mechanisms, and staff with a legitimate desire to improve their security posture – not to mention that I have never, in my career, spoken with someone who wanted to experience a breach! So how do you systematically eliminate these biases and make decisions based on facts, shifting from a “we don’t have to worry” opinion to a position of informed awareness and preparedness? Start with incident rates for your industry or for organizations of your size. These are facts, not emotions, and they can help set reasonable expectations.
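The article leaves the arithmetic open, but one common way to turn those industry incident rates into a concrete, emotion-free expectation is annualized loss expectancy (ALE), a standard risk-quantification formula. Here is a minimal sketch; the figures are placeholder assumptions for illustration, not real breach statistics:

```python
# Hypothetical illustration: turning industry incident rates into an
# expected annual loss, using the standard formula ALE = SLE x ARO.
# All figures below are placeholders, not real breach statistics.

def annualized_loss_expectancy(single_loss_expectancy: float,
                               annual_rate_of_occurrence: float) -> float:
    """Expected yearly loss from one threat scenario."""
    return single_loss_expectancy * annual_rate_of_occurrence

# Assumed inputs for illustration:
# - a breach in our sector costs ~$4.5M on average (SLE)
# - peer organizations of our size see ~0.3 breaches per year (ARO)
sle = 4_500_000
aro = 0.3

print(f"Expected annual loss: ${annualized_loss_expectancy(sle, aro):,.0f}")
# -> Expected annual loss: $1,350,000
```

Even a rough figure like this gives the “it is possible” conversation a dollar value that is hard to dismiss with optimism alone.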
Choose the right tools
After moving from “It would never happen to us” to “It is possible,” it’s time to take a look at tool integrations and opportunities to improve workflows. Many organizations already have the data they need to at least identify suspicious activity. This is where you’ll hear about event volume, data glut, or Security Operations Center (SOC) fatigue.
These are not just complaints or personal gripes; they are legitimate: without intelligent analysis tools that help refine the signals contributing to the root incident, an incident becomes more and more likely. This is where good machine learning or AI-based analysis and tooling play a role. SIEMs, for example, aggregate a lot of data into one place and serve a purpose, but unless the logic is programmed with exactly what to look for, an important event can be missed. Most L2 or L3 SOC analysts would eventually reach the same verdict, but on a human timescale. Attacks, meanwhile, run on the scale of compute cycles and, like it or not, in most cases move much faster than human analysts.
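A toy sketch makes the limitation concrete. This is not any vendor’s SIEM API, just an assumed comparison: a hard-coded rule only catches what its author anticipated, while even a simple statistical baseline can flag activity the rule misses:

```python
# Toy illustration (assumed data, no real SIEM API): a static detection
# rule versus a simple statistical baseline over the same event stream.
from statistics import mean, stdev

failed_logins_per_hour = [3, 5, 4, 2, 6, 4, 3, 5, 4, 47]  # last value is anomalous

# Rule-based logic: fires only if the exact programmed condition is met.
RULE_THRESHOLD = 100  # the rule's author guessed too high; the event slips through
rule_fired = any(n >= RULE_THRESHOLD for n in failed_logins_per_hour)

# Baseline logic: flag anything far outside the historical norm (z-score).
history = failed_logins_per_hour[:-1]
mu, sigma = mean(history), stdev(history)
z = (failed_logins_per_hour[-1] - mu) / sigma

print(f"Static rule fired: {rule_fired}")                  # False: missed
print(f"Baseline z-score: {z:.1f} -> anomalous: {z > 3}")  # True: caught
```

The point is not this particular math; it is that analytics which learn the norm reduce dependence on someone having predicted the attack in advance.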
The good news is that many solutions include inference logic that improves the quality of the verdict and reduces the number of data points to a finite, actionable volume. This allows a specific security workflow to be executed repeatably, whether the incident or event is handled by an L3 SOC engineer or an L2 support technician. Having accurate, defensible data and a workflow that resolves incidents as repeatably as a chocolate chip cookie recipe removes bias, opinion, and “maybe” from the equation, replacing them with a clear verdict: mitigated, resolved, or irrelevant.
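As a sketch of what that repeatability can look like in practice, here is a minimal deterministic triage function. The field names, thresholds, and dispositions are illustrative assumptions, not a specific product’s schema; the idea is only that the same input always yields the same outcome, regardless of who runs it:

```python
# Hypothetical triage workflow sketch: every enriched alert is routed to one
# of a fixed set of dispositions, so an L2 technician and an L3 engineer
# reach the same result. All names and thresholds are assumptions.
from dataclasses import dataclass

@dataclass
class Alert:
    source_ip: str
    confidence: float    # 0.0-1.0 verdict quality from upstream analytics
    asset_critical: bool

def triage(alert: Alert, known_benign_ips: set[str]) -> str:
    if alert.source_ip in known_benign_ips:
        return "irrelevant"   # known-good source: close without escalation
    if alert.confidence >= 0.9 and alert.asset_critical:
        return "mitigated"    # auto-contain first, then confirm
    if alert.confidence >= 0.9:
        return "resolved"     # standard remediation playbook
    return "escalate"         # ambiguous: needs human judgment

benign = {"10.0.0.5"}
print(triage(Alert("203.0.113.7", 0.95, True), benign))  # -> mitigated
print(triage(Alert("10.0.0.5", 0.99, True), benign))     # -> irrelevant
```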
Eliminate bias
Information security is a practice that must be constantly questioned and tested for weaknesses and opportunities for improvement. There is no such thing as “this can’t happen to me” for anyone, given the pace of change in every industry and in the fundamental technology we rely on each day. If an organization were to stop using software, disconnect from the Internet, remove outside access to all valuable assets, and ensure that nothing could come in or go out, then perhaps that degree of isolation would require no further investment in cybersecurity. But that organization would probably also cease to exist.
One approach to reducing bias – and to recognizing it in a practical way – is to run exercises and tests. This is an effective way to build confidence in a process or in the outcome of a solution, and to surface problems. Whatever is not proven or tested remains a variable, and that is where prejudices and assumptions live. As the old saying goes, “practice makes perfect” – or, in this case, reduces the variables. With that increased predictability, biases give way to evidence and to the knowledge that the approach is sound. Dealing with bias is a process: first become aware of the bias, then methodically eliminate the triggers that lead to a biased response.
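The same principle can apply to detection content itself: replay a known-bad sample on a schedule and assert that it still fires, instead of assuming it does. A minimal sketch, where detect() is a stand-in placeholder for whatever pipeline you actually run, not a real product API:

```python
# Minimal sketch of testing an assumption instead of trusting it.
# detect() is a toy placeholder for a real detection pipeline.

def detect(event: dict) -> bool:
    """Toy detector: flags PowerShell launched with an encoded command."""
    return (event.get("process") == "powershell.exe"
            and "-enc" in event.get("args", ""))

def test_encoded_powershell_is_flagged():
    simulated_attack = {"process": "powershell.exe", "args": "-enc SQBFAFgA..."}
    assert detect(simulated_attack), "Detection gap: encoded PowerShell missed"

def test_normal_admin_activity_is_not_flagged():
    benign = {"process": "powershell.exe", "args": "Get-Process"}
    assert not detect(benign), "False positive: benign activity flagged"

if __name__ == "__main__":
    test_encoded_powershell_is_flagged()
    test_normal_admin_activity_is_not_flagged()
    print("All detection assumptions verified.")
```

A failing assertion here is a bias made visible: something you believed was covered, proven otherwise before an attacker proves it for you.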
With awareness of a bias comes the ability to recognize its effects and deal with them appropriately, which leads to genuine confidence that you are effectively reducing your risk. Understanding that threats exist and are real for everyone is the first step toward eliminating the bias and protecting an organization. From there, the right tools can be adopted and steps taken to be ready for anything. Hope for the best, prepare for the worst: that is the ultimate mantra of a great security program.