Do you think data is a boring, irrelevant topic for your company? Then let’s imagine the following situation for a minute. You work for a grid operator, and suppose you don’t know the construction year or material of some of the assets you are responsible for. Which materials should a field worker bring to repair an asset in case of an outage, and how long would the repair take? Your customers may want to know this from you. But probably even more critical is the question of what the main risk factors in the grid are. Your CEO may want to know this from you, and perhaps your local authorities as well.
Hence, data quality should play an important role in your company. In fact, better data quality can not only improve the processes that depend on it, such as those touched on in the example above; it can also lead to immediate cost savings. At Alliander, the largest grid operator in The Netherlands, it is expected that approximately 1 to 2% of the annual spend on asset maintenance can be saved through better data quality alone.
Despite these business benefits, improving data quality and keeping it high is a challenging task. At Alliander, several data quality improvement programs were executed successfully during the past years. It turned out that the following three steps are crucial:
Structure the problem: When you start tackling data quality problems, it is vital to address the most important data objects first. Next, identify the most important attributes of these objects and define the target quality for them.
Get clean: Most people immediately think of writing business rules when it comes to improving data quality. This approach, however, has some disadvantages. Not only is it impossible to detect unknown rules in your data this way, but it also requires tremendous involvement from your asset experts. We propose a different method based on data mining: instead of manually deriving rules from the data, you use statistical methods to derive these rules from the data itself, automatically. This approach has many advantages. For instance, it immediately leads to lower project costs and a faster improvement in data quality.
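To make the data-mining idea concrete, here is a minimal sketch of how rules might be derived from asset data automatically. All field names and thresholds are illustrative assumptions, not Alliander's actual schema or method: the sketch proposes a rule "asset type X implies material Y" whenever one material clearly dominates for an asset type, then flags the records that deviate from it.

```python
from collections import Counter, defaultdict

# Hypothetical asset records; in practice these would come from the asset register.
records = (
    [{"asset_type": "cable", "material": "XLPE"}] * 19
    + [{"asset_type": "cable", "material": "paper"}]   # rare value, likely an error
    + [{"asset_type": "pipe", "material": "steel"}] * 10
)

def mine_rules(records, antecedent, consequent, min_support=5, min_confidence=0.9):
    """Propose rules 'antecedent value -> consequent value' wherever one
    consequent value dominates: a statistical stand-in for expert-written rules."""
    groups = defaultdict(Counter)
    for rec in records:
        groups[rec[antecedent]][rec[consequent]] += 1
    rules = {}
    for a_val, counts in groups.items():
        total = sum(counts.values())
        c_val, freq = counts.most_common(1)[0]
        if total >= min_support and freq / total >= min_confidence:
            rules[a_val] = c_val
    return rules

def flag_violations(records, rules, antecedent, consequent):
    """Return records that contradict a mined rule, as candidates for cleansing."""
    return [rec for rec in records
            if rec[antecedent] in rules and rec[consequent] != rules[rec[antecedent]]]

rules = mine_rules(records, "asset_type", "material")
violations = flag_violations(records, rules, "asset_type", "material")
```

Here the single cable record with material "paper" is flagged because 95% of cables have material "XLPE". Real programs would use richer techniques (association rule mining, outlier detection), but the principle is the same: the data itself suggests the rules, and experts only review the candidates.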
Stay clean: It may happen that get-clean programs are executed without putting any data management processes in place to sustain the improved data quality. In such a case, the quality of your data inevitably deteriorates, and you may have to execute another get-clean project in the future. At Alliander, we took a different approach: while the get-clean programs were still running, the required changes for data governance, data structure, data quality, and data security were put in place. Hence, by the end of the get-clean program, all data management processes for staying clean were already operational.
Have you ever had issues with the quality of the data about your assets? Have you ever wondered how such issues can be solved rapidly and sustainably? Are you at the start of a data quality improvement program, or in the middle of executing one? Then join Rob Jansen from Alliander and me at our presentation at the SAP Conference for Utilities in Mannheim.
Join us at our presentation “IT/OT Convergence at Alliander through Improved Data Transparency, Quality, and Monitoring” on April 10th, from 13:45 to 14:30.