Maturity Levels
From the beginning, one of the biggest issues with the CSF has been understanding a control's intended maturity level. Back in 2014, when I developed the initial version of this tool, there was no formal way to monitor and manage a control's maturity.
However, even with the publication of the four official NIST CSF Implementation Tiers, applying them in a modern enterprise leaves much open to interpretation and much to be desired.
This tool was designed to leverage the Capability Maturity Model (SEI-CMM) developed by the Carnegie Mellon Software Engineering Institute (SEI). Since many enterprises use this model as a base to measure general information technology maturity, measuring an enterprise's security maturity on the same scale provides a clear apples-to-apples comparison.
That being said, every enterprise is different, so the ability to adjust maturity levels to match your IT teams' practices is crucial.
The basic presumptions built into the tool reflect security Standards of Good Practice rather than any specific enterprise requirement. You can, and should, adjust them to suit your individual needs.
Anyone who has spent any time in security is well aware of the gap between what written policies and technical standards say and what actually happens in day-to-day IT operations. The most common example is a policy that states "All users must use two-factor authentication," only to find out that the CEO or CFO refuses to use it.
When it comes to the CSF, the presumption that policies and processes align perfectly presents a unique challenge - not knowing which one to measure. Most companies would have stellar results on the CSF if all they had to do was worry about policy documents. The reality is that the policies are rarely adhered to 100% of the time.
This is the foundational reason for breaking out policy compliance vs process compliance. By measuring both aspects, you achieve a far better understanding of the actual risk throughout the enterprise.
The maturity levels defined in the tool are generally accepted Standards of Good Practice. They are not Best Practices in the usual sense, but rather the targets that leading organizations use to elevate their ITSM.
A key part of developing a strong assessment plan is being honest with oneself. Measuring yourself against absolute perfection in every category is not only a futile effort but also a significant waste of time. To set this up correctly, you should partner with your IT leadership to understand what their expected service levels are. Perhaps in your organization, a Level 3 - Defined maturity has policy exceptions at 3% and process exceptions at 5%, or perhaps 10% and 20% respectively. Either way, understanding the rest of the organization is a crucial part of setting up the assessment correctly.
| Maturity Level | Expectation of Policy Maturity Level | Expectation of Process Maturity Level |
|---|---|---|
| Level 1 - Initial | Policy or standard does not exist or is not formally approved by management. | Standard process does not exist. |
| Level 2 - Repeatable | Policy or standard exists, but has not been reviewed in more than 2 years. | Ad-hoc process exists and is done informally. |
| Level 3 - Defined | Policy and standard exist with formal management approval. Policy exceptions are documented and approved and occur less than 5% of the time. | Formal process exists and is documented. Evidence can be provided for most activities. Less than 10% exceptions. |
| Level 4 - Managed | Policy and standards exist with formal management approval. Policy exceptions are documented and approved and occur less than 3% of the time. | Formal process exists and is documented. Evidence can be provided for all activities, and detailed metrics of the process are captured and reported. A minimal target for metrics has been established. Less than 5% of process exceptions occur, with minimal recurring exceptions. |
| Level 5 - Optimizing | Policies and standards exist with formal management approval. Policy exceptions are documented and approved and occur less than 0.5% of the time. | Formal process exists and is documented. Evidence can be provided for all activities, and detailed metrics of the process are captured and reported. Minimal targets for metrics have been established and are continually improving. Less than 1% of process exceptions occur. |
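The numeric thresholds in the table can be sketched in code. This is a hypothetical illustration, not part of the tool itself: it models only the exception-rate cutoffs, since the qualitative criteria (formal approval, documentation, evidence, metrics) still require human judgment. The function names are my own.

```python
# Hypothetical sketch: map measured exception rates (0.0-1.0) to the
# maturity levels in the table above. Only the numeric thresholds are
# modeled; qualitative criteria must be assessed separately.

def policy_maturity(exception_rate: float) -> int:
    """Highest policy maturity level whose exception threshold is met,
    assuming the policy otherwise exists with formal approval."""
    if exception_rate < 0.005:
        return 5  # Optimizing: < 0.5% exceptions
    if exception_rate < 0.03:
        return 4  # Managed: < 3% exceptions
    if exception_rate < 0.05:
        return 3  # Defined: < 5% exceptions
    return 2      # Policy exists, but exceptions exceed all thresholds

def process_maturity(exception_rate: float) -> int:
    """Same idea for the process side of the table."""
    if exception_rate < 0.01:
        return 5  # Optimizing: < 1% exceptions
    if exception_rate < 0.05:
        return 4  # Managed: < 5% exceptions
    if exception_rate < 0.10:
        return 3  # Defined: < 10% exceptions
    return 2      # Ad-hoc process

print(policy_maturity(0.02), process_maturity(0.07))  # -> 4 3
```

If your organization negotiates different cutoffs with IT leadership (say, 10% and 20% at Level 3, as mentioned above), only the threshold constants change.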
The Target Maturity Score is the pre-agreed score that the enterprise wishes to achieve for each sub-category.
For example, perhaps AcmeCo feels strongly that Identity Management needs to be very mature, so they give “Identity Management, Authentication, and Access Control (PR.AA)” a target score of 4.5; but since they are 100% cloud-based, “Technology Infrastructure Resilience (PR.IR)” applies to far less of their environment, so they give that a target score of 3.0. Both of these targets are completely justifiable for their specific technology stack and enable the organization to focus both time and money on the higher-risk areas.
Since, conceptually, Policy and Practice should reasonably be at the same maturity level, there is only one target score for each sub-category.
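The target-versus-assessed comparison can be sketched as follows. The assessed scores are invented for illustration; the targets reuse the AcmeCo example above, and the gap simply ranks where remediation effort should go first.

```python
# Hypothetical sketch: rank sub-categories by the gap between the
# pre-agreed Target Maturity Score and the assessed score. Assessed
# values here are made up for illustration.

targets = {"PR.AA": 4.5, "PR.IR": 3.0}
assessed = {"PR.AA": 3.5, "PR.IR": 3.0}  # e.g. blended policy/practice score

# Positive gap = below target; those sub-categories get attention first.
gaps = {sub: round(targets[sub] - assessed[sub], 1) for sub in targets}

for sub, gap in sorted(gaps.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{sub}: target {targets[sub]}, assessed {assessed[sub]}, gap {gap}")
```

Because there is a single target per sub-category, the same comparison works whether the assessed score comes from the policy measure, the process measure, or an average of the two.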