I was talking about favorite security metrics with Phil Agcaoili at RSA. For me, the best one isn’t even a metric; it’s a statistic that highlights maturity across a number of IT processes. Quick fun quiz:
- What was the biggest surprise when you completed your first device enumeration scan?
For the shops I’ve worked in, it was the extreme difference between the number of devices identified and what IT thought they had in their directories and CMDB. The discrepancy was so profound it helped justify the entire vulnerability management process. It also highlighted the need for an internal, “independent” assessment of security posture in IT. I’m not talking about an annual scan-and-fix check-box exercise; I’m talking about a continuous scan, report, fix, publish-metrics kind of process. This opens the door to measuring the % of “managed devices,” not just the % patched.
Device enumeration is such a great iceberg statistic: “If we don’t know how many devices we have on the network, what about the ‘harder’ stuff like apps, data, users, roles, etc.?” Caution: use this power responsibly. The last thing you want is to surprise your IT peers. It’s a great opportunity to demonstrate the value of the security org and to gauge your organization’s appetite for transparency, accountability, and risk acceptance.
Hmm, I suppose you could turn this into a metric: % difference in devices enumerated vs. the CMDB, with a target of 100% over an xx-day rolling period. However, that may not earn you any friends at the Friday IT mixer. I think this statistic should be called the ultimate metric enabler.
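To make the idea concrete, here is a minimal sketch of how that comparison might be computed. The device names and the `coverage_stats` helper are hypothetical, invented for illustration; the point is just the set arithmetic between a scan result and a CMDB export.

```python
# Hypothetical sketch: compare scanner-enumerated devices against a CMDB
# export. All device names below are made up for illustration.

def coverage_stats(enumerated, cmdb):
    """Return (unmanaged, stale, pct_managed):
    - unmanaged: devices seen on the network but missing from the CMDB
    - stale: CMDB entries never seen on the network
    - pct_managed: percent of enumerated devices the CMDB actually tracks
    """
    enumerated, cmdb = set(enumerated), set(cmdb)
    unmanaged = enumerated - cmdb      # on the wire, unknown to IT
    stale = cmdb - enumerated          # in the records, not on the wire
    pct_managed = 100 * len(enumerated & cmdb) / len(enumerated)
    return unmanaged, stale, pct_managed

scan = {"host-01", "host-02", "printer-07", "rogue-ap-3"}
cmdb = {"host-01", "host-02", "host-99"}

unmanaged, stale, pct = coverage_stats(scan, cmdb)
print(f"unmanaged devices: {sorted(unmanaged)}")
print(f"stale CMDB entries: {sorted(stale)}")
print(f"% managed: {pct:.0f}%")  # 2 of 4 enumerated devices -> 50%
```

Run continuously over a rolling window, the `pct_managed` number is the “% of managed devices” statistic, and the `unmanaged` set is the list you hand to IT (gently) after the scan.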