In software development, there is a concept called “technical debt,” the cost companies pay when they choose to build software the easy way instead of the right way. They cobble together temporary solutions to satisfy a short-term need. Over time, as teams struggle to maintain a patchwork of such solutions, tech debt accrues in the form of lost productivity or poor customer experience.
Howard Boville writes about how cybersecurity defenses are accumulating a similar kind of debt. The true cost of this “cybersecurity debt” is difficult to quantify.
The exact causes of attacks are often unknown, but the aftermath costs consumers and businesses dearly. For example, after the Colonial Pipeline attack, gas prices rose and shortages followed.
The more security solutions and vendors a company uses, the more potential entry points an attacker has, and the more cybersecurity debt accumulates.
To update the foundation and pay off cybersecurity debt, two steps must be taken:
1. Embrace open standards across all critical digital infrastructure, especially the infrastructure used by private contractors to service the government.
2. Close the remaining loopholes in the data security supply chain.
The cost of not addressing cybersecurity debt extends far beyond monetary damages. Food and fuel supplies are at risk, and entire economies can be disrupted.
The need for significant improvement in the state of operational technology (OT) cybersecurity in critical infrastructures has been broadly acknowledged for almost two decades.
It is essential to recognize that improving cybersecurity requires progress in multiple areas, including people and skills development, governance and process development, and better technology.
Sharing experiences can begin with simple steps, such as posting specific experiences and lessons learned on social media or in forums dedicated to industrial cybersecurity. What has worked? What has not? What might you do differently if you were to do it again?
There must be more peer-to-peer sharing of the kind that happens at industry events, and this information should be readily available to everyone, not just those able to attend such events.
There have been several attempts to establish searchable repositories of incidents that people can learn from. Why not extend this model to include case studies at various levels of detail? These can be anonymized to protect confidentiality. Some of this is already being done, but shared information is often limited to those in a specific community, such as an industry group or information sharing and analysis center (ISAC). Standards committees such as ISA99 are also investigating ways to complement their formal standards with simpler and more practical examples. Clearly, this is not a new or revolutionary idea; we simply must engage more people. There are opportunities for you to contribute.
A recent study found that digital forensics analysts could interpret the same data differently depending on the information or comments given to them about the suspect or the case before they began their analysis.
For example, analysts told that the suspect was innocent before conducting their digital search were much less likely to find incriminating evidence, even when it existed, and vice versa: an analyst who had reason to believe the suspect was guilty before starting the analysis was more likely to find evidence than an analyst with no prior opinion or knowledge of the case.
The study was conducted by Nina Sunde, a police superintendent at the Norwegian Police University College, and Itiel Dror, a professor at University College London. The same hard drive of evidence was given to 53 digital forensics analysts across eight countries.
The results showed that each analyst’s prior belief about whether the suspect was guilty played a large part in their findings. According to the article, the study also revealed inconsistencies even among analysts who were given the same preliminary context.
According to the article, the report raises a larger issue of bias within a process that should be focused solely on searching for evidence. Unlike traditional forensic science, which has been practiced and refined for hundreds of years, digital forensics has only been evolving since the global boom in technology.