The Butterfly Effect in Data Quality – 2
In the first part of this series of blog posts, we talked about how a small error in data can lead to serious issues, and we looked at the different ways such errors can affect data quality. Now it is time to learn how to avoid these errors and minimize the chances of a catastrophe down the road.
Here are some tips and techniques you can use to prevent small data quality issues from leading to serious consequences.
- Standardize the data – Standards are great – they provide an easy way to manage and identify your data. A global taxonomy such as UNSPSC makes the task of material master data management much simpler than it otherwise would be. Most departments have their own way of storing data, so make sure you standardize everything. This is one of the easiest ways to avoid and spot errors.
- Pay attention to the reference data – Depending on the industry you work in, there are bound to be many types of reference data that can help in managing your master data. However, these fields might not always be mandatory. If that is the case, still try to capture as much reference data as you can; it will help you spot errors in the master data.
- Use data matching properly – Data matching takes care of duplicates in the data and also helps you recognize relationships in your master data (if any). The two common types of matching are deterministic matching and probabilistic matching. In deterministic matching, records are compared using exact, rule-based logic on agreed identifiers, so two records either match or they do not. In probabilistic matching, statistical analysis is performed on the data – each comparison produces a likelihood score – on the basis of which decisions about duplicates and relationships are made.
- Keep checking the data – No system in this world works to its full potential without proper monitoring. Decide on the critical metrics and keep track of them. This will not only help you spot errors creeping in, but will also help you follow their trends and check the evolution of your organization’s data quality.
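To make the first tip concrete, here is a minimal sketch of description standardization. The abbreviation map and the UNSPSC lookup table are illustrative assumptions, not a real classification service – in practice you would maintain a much larger dictionary or use a taxonomy tool.

```python
def standardize(description):
    """Normalize a raw material description into one canonical form:
    uppercase, collapsed whitespace, expanded abbreviations."""
    text = " ".join(description.upper().split())
    # Illustrative abbreviation map; a real one would be far larger.
    replacements = {"MTR": "METER", "SS": "STAINLESS STEEL"}
    tokens = [replacements.get(t, t) for t in text.split()]
    return " ".join(tokens)


# Hypothetical UNSPSC lookup keyed on standardized descriptions.
UNSPSC_CODES = {"BOLT HEX STAINLESS STEEL M8": "31161616"}


def classify(description):
    """Map a raw description to a UNSPSC code, or None if unknown."""
    return UNSPSC_CODES.get(standardize(description))
```

Once every description passes through the same normalization step, two records that differ only in casing, spacing, or abbreviations become trivially comparable.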
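The second tip – validating master data against reference data – can be sketched as a simple rule check. The reference sets and field names below (`uom`, `plant`) are assumptions for illustration; in a real system they would come from your ERP's reference tables.

```python
# Illustrative reference data; real tables would be loaded from the ERP.
VALID_UOM = {"EA", "KG", "M", "L"}      # valid unit-of-measure codes
VALID_PLANTS = {"1000", "2000"}         # valid plant codes


def validate(record):
    """Return a list of reference-data violations for one master record."""
    errors = []
    if record.get("uom") not in VALID_UOM:
        errors.append(f"unknown unit of measure: {record.get('uom')}")
    if record.get("plant") not in VALID_PLANTS:
        errors.append(f"unknown plant code: {record.get('plant')}")
    return errors
```

Running every incoming record through such checks catches typos at the door, before they spread into downstream systems.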
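The difference between the two matching styles can be shown in a few lines. This is a deliberately simplified sketch: the probabilistic side uses Python's standard-library `difflib.SequenceMatcher` as a stand-in for the statistical models real matching engines use, and the 0.85 threshold is an assumed tuning value.

```python
from difflib import SequenceMatcher


def deterministic_match(a, b):
    """Deterministic: exact comparison on normalized keys -
    the records either match or they do not."""
    norm = lambda s: " ".join(s.upper().split())
    return norm(a) == norm(b)


def probabilistic_match(a, b, threshold=0.85):
    """Probabilistic (simplified): compute a similarity score and
    flag a likely duplicate when it exceeds a tuned threshold."""
    score = SequenceMatcher(None, a.upper(), b.upper()).ratio()
    return score >= threshold, score
```

Note how the misspelled pair below slips past the deterministic check but is still caught by the score-based one – which is exactly why probabilistic matching is worth the extra tuning effort.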
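Finally, the monitoring tip amounts to computing a few metrics on a schedule and watching their trend. A minimal sketch, assuming records are dictionaries keyed by a hypothetical `material_id` field:

```python
def quality_metrics(records, required_fields):
    """Compute simple data-quality metrics worth tracking over time:
    completeness (share of records with all required fields filled)
    and duplicate rate (share of records sharing a material_id)."""
    total = len(records)
    if total == 0:
        return {"completeness": 0.0, "duplicate_rate": 0.0}
    complete = sum(
        all(r.get(f) not in (None, "") for f in required_fields)
        for r in records
    )
    unique_keys = len({r.get("material_id") for r in records})
    return {
        "completeness": complete / total,
        "duplicate_rate": 1 - unique_keys / total,
    }
```

Logging these numbers daily or weekly turns vague worries about data quality into a trend line you can act on.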
Clean and reliable data is a critical asset for any organization. If the people working with the data cannot rely on it, it becomes very difficult for them to be enthusiastic about their work. We hope these two posts help you provide your business with better quality data.
Note – Interested in data quality and master data management? You might like to register for the upcoming webinar – ‘ASR Group shares Supply chain best practices to drive SAP consolidation success’. In it, Arthur Raguette, EVP, Verdantis, Inc. gets together with Jose Bobrek, Central MDM team lead, American Sugar Refining Group, to discuss how large corporations can drive SAP roll-out success by leveraging harmonized material masters. The complimentary 60-minute webinar, scheduled for 10th December (2 PM ET), will be hosted by ASUG (Americas’ SAP Users’ Group), the largest community of SAP professionals.