Step 6: Avoid The Trap of Compromised Data
Lately, there is a welcome topic I hear frequently from all sorts of companies. A few weeks ago, a visit with a large enterprise customer confirmed that the topic is alive and kicking. When I answered the question they posed, several jaws hit the floor as I suggested starting with a whiteboard rather than a tool. Think about data first.

Organizational and cultural issues add significantly to the reasons why accuracy and completeness are a concern, and they are major barriers both to adoption of a new IT system and to overall IT transformation. Audit is a word that typically sends the average IT resource running for the exits. In previous posts I have tried to disarm the fear of this activity and provide ways and means to get ahead of the curve, be proactive, and avoid fire drills.
Data accuracy and completeness are difficult to measure or even quantify, given that the boundaries can be so vast that the effort can turn into a never-ending journey. I like to define them in the following way:
- Accuracy – Data has been validated and is within an agreed period of certification.
- Completeness – The extent to which the data is not missing.
What are the benefits of efforts to ensure accuracy and completeness?
- To allow a user to more objectively decide whether to trust the data being shown.
- To allow users, configuration managers and stakeholders to quickly and accurately spot poor data and escalate accordingly.
- To be able to highlight trends across data within the CMDB to target process and data remediation activities.
- To allow the tracing of data quality over time as the user strives for continual service improvement.
- To allow reporting of data quality. For example, reporting the average data quality associated with a given support group.
There are many ways and means of measuring data quality and applying specific dimensions so that a score can be calculated. The benefits are obvious, yet data quality and accuracy projects rarely land near the top of any investment list.
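To make the idea of dimension-based scoring concrete, here is a minimal sketch in Python. The field names, the 90-day certification window, and the equal weighting of dimensions are all illustrative assumptions, not taken from any particular CMDB product; a real implementation would use your own data model and agreed weights.

```python
from datetime import datetime, timedelta

# Hypothetical required fields for a CI record (illustrative only).
REQUIRED_FIELDS = ["name", "owner", "environment", "support_group"]

def completeness_score(ci: dict) -> float:
    """Fraction of required fields that are present and non-empty."""
    filled = sum(1 for f in REQUIRED_FIELDS if ci.get(f))
    return filled / len(REQUIRED_FIELDS)

def accuracy_score(ci: dict, certification_days: int = 90) -> float:
    """1.0 if the CI was validated within the agreed certification window, else 0.0."""
    certified = ci.get("last_certified")
    if certified is None:
        return 0.0
    return 1.0 if datetime.now() - certified <= timedelta(days=certification_days) else 0.0

def quality_score(ci: dict) -> float:
    """Equal-weight average of the two dimensions (an assumed weighting)."""
    return (completeness_score(ci) + accuracy_score(ci)) / 2
```

For example, a CI with three of four required fields filled and a certification 30 days old would score `(0.75 + 1.0) / 2 = 0.875`. Averaging such scores per support group gives exactly the kind of reporting described above.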
Completeness and currency of data are two key metrics I like to measure. Completeness is the extent to which data we expect to be there is not missing; currency is the frequency with which that data can be expected to be refreshed or updated. Stale data is sometimes more dangerous than incomplete data. There are many horror stories of technicians implementing changes to infrastructure who touched the wrong CI, causing untold issues. They thought they were acting on the data in the system of record, but they were not. The issue can be as basic as having incorrect escalation information for paging or calling the on-call resource during an incident in the middle of the night.
Currency is a key driver to ensuring that decisions can be made in a timely and appropriate manner. The frequency of feeds should also determine which fields are updated and when, as not everything changes on a frequent basis. Defining what you care about, and when, is just as important, as we have discussed previously.
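One way to sketch this idea of per-field currency is to record an expected refresh interval for each attribute and flag anything older than its interval. The field names and intervals below are assumptions chosen for illustration, not a standard:

```python
from datetime import datetime, timedelta

# Illustrative refresh expectations per field; tune these to your own feeds.
EXPECTED_REFRESH = {
    "ip_address": timedelta(days=1),       # refreshed by daily discovery
    "on_call_contact": timedelta(days=7),  # reviewed weekly
    "serial_number": timedelta(days=365),  # rarely changes
}

def stale_fields(last_updated: dict, now: datetime) -> list:
    """Return fields whose last update is older than its expected refresh interval.

    Fields with no recorded update at all are treated as stale.
    """
    return [field for field, limit in EXPECTED_REFRESH.items()
            if now - last_updated.get(field, datetime.min) > limit]
```

A nightly report built on something like this surfaces the on-call contact that has not been reviewed in a month long before a 2 a.m. incident does.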
I challenge you to review your implementation and focus on three key drivers: accuracy, completeness and currency. Compare the results to the overall process goals and objectives that consume or are driven by the data in your CMDB. You might be surprised.
Up Next: The last mile…