A new report from cybersecurity firm Bromium reveals that most organizations that deploy detection-based security to protect their systems also face soaring hidden costs.
Upfront licensing and deployment costs for detection-based security tools such as anti-virus are dwarfed by the cost of additional hires and the effort required to manage and assess millions of false-positive threat-intelligence reports and alerts.
The findings are based on a survey of 500 CISOs at global enterprises, which reveals that organizations typically invest $345,000 per year on average in detect-to-protect security tools. The average annual cost to maintain this endpoint security can exceed $16 million per enterprise.
Specifically, the figure is based on security teams spending some 413,920 hours per year triaging alerts, an additional 2,448 hours rebuilding compromised machines, and 780 hours on emergency patching. Running the numbers, that adds up to 417,148 hours per year, leading to an annual labor cost of $16.3 million per enterprise.
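The arithmetic behind those figures can be sketched as follows; the hourly labor rate is an assumption back-derived from the report's totals, not a number stated in the article:

```python
# Hours cited in the Bromium survey for detect-to-protect security work
triage_hours = 413_920    # triaging alerts per year
rebuild_hours = 2_448     # rebuilding compromised machines
patching_hours = 780      # emergency patching

total_hours = triage_hours + rebuild_hours + patching_hours
print(total_hours)  # 417148 hours per year, matching the article

# Implied blended labor rate if the annual cost is $16.3 million
# (an assumption derived from the report's totals, roughly $39/hour)
implied_rate = 16_300_000 / total_hours
print(round(implied_rate, 2))
```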
Researchers also determined that organizations are investing in multiple security layers to defend against hackers, including advanced threat detection ($159,220 annual spend); next-generation and traditional anti-virus ($44,200); whitelisting and blacklisting ($29,540); and detonation environments ($112,340).
“Detection requires a patient zero — someone must get owned and then protection begins. Yet, because of this, rebuilds are unavoidable; false positives balloon; triage becomes more complex and emergency patching is increasingly disruptive,” said Bromium CEO Gregory Webb. “It’s no surprise that 63 percent of the CISOs we surveyed said they’re worried about alert fatigue.”