Lack of an established recovery plan
Posted: Mon Feb 10, 2025 3:54 am
Unfortunately, hackers bypass even the most meticulous cyber hygiene practices. Organizations need to prepare for this eventuality with a robust recovery plan. According to Ontrack research, nearly 40% do not have one.
“I’ll hire [an outside provider] when there’s a problem. And I’ll pay a lot of money to fix it,” is how Williams describes the typical mentality. It’s not a sustainable model, he says: “We need to move from prevention to full business continuity. But how do we maintain business when we’re under attack?”
As the saying goes, an ounce of prevention is worth a pound of cure. But in the event of a catastrophe, an organization still needs a last resort in reserve. And building a resilient organization is nearly impossible when overconfident leaders believe they are insured against everything from the outset.
Human factor
According to the World Economic Forum, human error accounts for up to 95% of cybersecurity incidents. Verizon’s data breach report puts the figure at 85%. However, only 67% of board members surveyed by Harvard Business Review believe that human error is the leading cause of breaches.
These errors take many forms: falling for phishing scams, using obvious passwords or single-factor authentication, and leaving open backdoors in code. Alert fatigue can also cause analysts to ignore real security issues when overly sensitive tools regularly flag benign events.
Ultimately, many of the problems stem from a failure to build a culture of security within an organization through regular training and testing. But even trained people are surprisingly susceptible to manipulation by attackers. Psychology literature shows that hackers often have strong personality traits known as the “dark triad”: Machiavellianism, narcissism, and psychopathy. People with these traits are skilled manipulators and tend to feel no guilt about using these skills to get what they want. This is what social engineering is all about—gaining someone’s trust in order to get them to do something dangerous, like hand over passwords.