Cyber Attacks Take Hours to Detect (Source: CSG International)
More than one-third of cyber attacks take hours to detect. Even more alarming, resolving breaches takes days, weeks, and in some cases even months. Despite increased resource allocation designed to protect networks, a CSG Invotas survey conducted by the independent research firm IDG finds that 82 percent of respondents report no decrease in the number of network security events or breaches last year, and more than a quarter of those surveyed report an increase.

“There’s no doubt that improving intrusion response and resolution times reduces the window of exposure from a breach,” said Jen McKean, research director at IDG Research. “More companies seek security automation tools that will enable them to resolve breaches in mere seconds and help maintain business-as-usual during the remediation period.”

Researchers polled decision makers responsible for information security, strategy, and solution implementation at companies with 500 or more employees, exploring the security challenges commercial organizations face when confronted with security breaches across their networks. Key findings include:
- More than one-third of breaches take hours to detect.
- Resolving breaches can take days, weeks, or months.
- Ongoing management of the electronic identities that control access to enterprise, cloud, and mobile resources takes the most time to change or update during a security event.
- A majority of respondents seek ways to reduce response time in order to address risk mitigation, preserve their company’s reputation, and protect customer data.
- Sixty-one percent of respondents admit they are looking for ways to improve response times to security events.
Business process automation solutions offer a new approach to the most difficult step in security operations: taking immediate and coordinated action to stop security attacks from proliferating. Building digital workflows that can be synchronized across an enterprise allows a rapid counter-response to cyber attacks. Speed, accuracy, and efficiency are accomplished by applying carrier-grade technology, replicating repetitive actions with automated workflows, and reducing the need for multiple screens.

A quarter of respondents say they are comfortable with the idea of automating some security workflows and processes and that they deploy automation tools where they can. Fifty-seven percent of respondents say they are somewhat comfortable with automation for some low-level and a few high-level processes, but they still want security teams involved. On average, respondents report that 30 percent of their security workflows are automated today, and nearly two-thirds of respondents expect they will automate more security workflows in the coming year.
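The "automation with security teams still involved" preference the survey describes can be sketched in a few lines. This is a minimal illustration, not any vendor's actual workflow engine: the event types, severity scale, and action names are invented for the example, and a real system would integrate with directory services, ticketing, and SIEM tooling.

```python
from dataclasses import dataclass

@dataclass
class SecurityEvent:
    kind: str       # hypothetical event category, e.g. "credential_misuse"
    severity: int   # 1 (low) .. 5 (high) -- an assumed scale
    account: str

def automated_response(event: SecurityEvent) -> list[str]:
    """Return the ordered response actions for an event.

    Low-severity events are closed automatically; high-severity events
    trigger an immediate containment step but still queue work for the
    security team, keeping humans in the loop as most respondents prefer.
    """
    actions = []
    if event.kind == "credential_misuse":
        # Immediate, repeatable containment action (the kind of
        # repetitive step the article says automation should replicate).
        actions.append(f"disable_account:{event.account}")
    if event.severity >= 4:
        actions.append("escalate_to_security_team")
    else:
        actions.append("log_and_close")
    return actions

print(automated_response(SecurityEvent("credential_misuse", 2, "jdoe")))
# ['disable_account:jdoe', 'log_and_close']
```

The design point is that the automated portion handles the time-critical, mechanical steps in seconds, while anything above a severity threshold is handed to analysts rather than resolved unattended.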
Israel’s Cellebrite delves deep into cell phone memory (Source: Ari Rabinovitch, Reuters, PETAH TIKVAH, Israel, June 5, 2014)

(Reuters) – Israel’s Cellebrite has seen a huge jump in sales of its mobile forensic technology as smartphones have become an increasingly vital tool for investigators solving crimes across the world. A deleted picture or text message can often be key to a case – whether for police detectives or bank auditors – and the ability to extract and analyze that data can prove a suspect’s innocence or guilt, said Yossi Carmil, corporate co-chief executive. Cellebrite, a fully owned subsidiary of Japan’s Sun Corp, developed a system it says can do just that: retrieve data hidden deep inside nearly all mobile devices on the market. And with people becoming more dependent on their smartphones, which have in turn become more sophisticated, Cellebrite is playing a “more and more significant” role for Sun Corp, Carmil said.

The company’s forensic department has seen average growth of 25-30 percent for three straight years. It controls a major portion of the global forensics market, which Carmil estimates at over $150 million in total but expects to exceed $1 billion within a decade as the field broadens and new technologies are introduced. Cellebrite also sells products to retailers and cellular operators that back up and transfer data and can quickly diagnose problems on a phone. About 150,000 shops worldwide use these devices, which bring in slightly less revenue than the forensics business, Carmil said.

“Ten years ago someone would have to sit and physically scroll through the phone. If you had erased a message, it was gone,” Carmil said. “But as with computers, even if you delete something, it is actually still there on the smartphone.”

“Our system can retrieve it. This is harder to do than with computers, since there are so many systems and devices,” he said.
Leeor Ben-Peretz, vice president of products, said a key advantage for Cellebrite is its speed to market in supporting new phones and its coverage of a wide range of operating systems and devices, including those with higher levels of encryption and protection. (Editing by Tova Cohen).
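Carmil's point that deleted data "is actually still there" follows from how most file systems work: deletion typically removes only the index entry for a file, while the underlying bytes remain on storage until they happen to be overwritten. The toy model below illustrates that principle only; it is not Cellebrite's technology, and the class and method names are invented for the example.

```python
class ToyDisk:
    """A toy 'filesystem': deleting a file removes its directory entry
    but leaves the data bytes in place, which is why forensic carving
    can recover 'deleted' content from the raw blocks."""

    def __init__(self, size: int = 64):
        self.blocks = bytearray(size)   # raw storage
        self.index = {}                 # name -> (offset, length)

    def write(self, name: str, data: bytes) -> None:
        # Append after the highest allocated region.
        off = max((o + l for o, l in self.index.values()), default=0)
        self.blocks[off:off + len(data)] = data
        self.index[name] = (off, len(data))

    def delete(self, name: str) -> None:
        del self.index[name]            # the data bytes are NOT erased

    def carve(self) -> bytes:
        # Forensic-style scan of the raw blocks, ignoring the index.
        return bytes(self.blocks).rstrip(b"\x00")

disk = ToyDisk()
disk.write("msg", b"meet at 5pm")
disk.delete("msg")
print(disk.carve())  # b'meet at 5pm' -- still recoverable after deletion
```

Real devices complicate this picture considerably (wear leveling, encryption, dozens of proprietary storage layouts), which is the difficulty Carmil alludes to when he says phones are harder to carve than computers.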
Forensics Tool Validation (Source: “Training is Not Enough: A Case for Education Over Training” by Tim Wedge)
The premise that an effective digital forensic examiner must be able to validate all of the tools that he or she uses is universally accepted in the digital forensic community. Yet I have seen some less-educated members of the community champion a particularly insidious and, I will argue, invalid method of tool validation, often referred to as the two-tool validation method. The premise of this method is that if two different tools produce the same result, they must both be correct. The problem with that assumption is that both tools may share the same flaw. This may be due to unforeseen changes to operating systems or file systems, or it may simply be the result of invalid but widely accepted assumptions.

Few practitioners would suggest testing a gas chromatograph by merely seeing whether two of them produced the same result; they would insist on testing a calibrated sample and ensuring that the results matched the known values of that sample. Digital forensic tools cover a much broader spectrum of data recovery and presentation of recovered data, and are correspondingly more difficult to test. Nonetheless, we cannot hold them to a lower standard. Errors and limitations in the recovery and presentation of data may occur as a result of oversight, design flaws, or simply because the ability to interpret every data structure that exists in the real world is an intrinsically unattainable goal.
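The calibrated-sample approach the author advocates can be demonstrated concretely: instead of comparing two tools against each other, the examiner constructs a sample whose ground truth is known in advance and checks the tool against it. The sketch below uses an invented example tool (a simple ASCII-string recoverer) and a hand-built sample; both are hypothetical and stand in for whatever tool and reference image an examiner would actually use.

```python
# Hypothetical 'tool under test': recovers printable ASCII strings
# (length >= min_len) from a raw byte image.
def recover_strings(image: bytes, min_len: int = 4) -> list[str]:
    out, cur = [], bytearray()
    for b in image:
        if 32 <= b < 127:           # printable ASCII
            cur.append(b)
        else:
            if len(cur) >= min_len:
                out.append(cur.decode())
            cur = bytearray()
    if len(cur) >= min_len:
        out.append(cur.decode())
    return out

# Calibrated sample: because we constructed it ourselves, its ground
# truth is known exactly -- the digital analogue of a calibrated
# sample for a gas chromatograph.
KNOWN_STRINGS = ["deleted.txt", "hello world"]
sample = b"\x00\x01deleted.txt\xff\xfehello world\x00"

def validate(tool) -> bool:
    """Pass only if the tool reproduces the known ground truth.

    Note what this avoids: two flawed tools could agree with each
    other (two-tool 'validation') while both missing a string, and
    that agreement would prove nothing. Checking against known
    values catches the shared flaw.
    """
    return tool(sample) == KNOWN_STRINGS

print(validate(recover_strings))  # True
```

In practice the reference sample would be a full test image with documented contents, such as those published by tool-testing programs, rather than a few bytes, but the validation logic is the same: compare tool output to known values, not to another tool.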