Data Quality Management: The key to unleashing business potential
In an effort to engage with clients in the wave of digital transformation, financial institutions have been establishing omni-channels that collect a vast quantity of new market and customer data. This information takes the form of trillions of structured and unstructured fragments stored in on-premises or cloud databases, waiting for business owners to crack the code and unlock its insights.
Data scientists are now in high demand in the race to harness the power of such data. Yet one common industry challenge is the tendency to spend a large, and largely unnecessary, amount of time and effort on data preparation – identifying, cleansing and transforming data – instead of on model building and opportunity discovery.
With the mission to enrich, enable and empower our banking clients through data, Synpulse proudly launches a new solution — DATAMANAGEMENTINABOX. It combines our deep topic expertise and hands-on industry experience to create a solid reference model for solving persistent data problems. The out-of-the-box design provides a step-by-step guide for organisations to identify the root causes of their data struggles, and a full set of artefacts to guide stakeholders through the roadmap to the desired state.
In this article, we take a look at data quality, its importance and our solution to better data quality management.
The impact of data quality on business
Throughout the end-to-end business service cycle, the heterogeneous data generated runs through all departments and teams, and the business impact can be significant if its quality is poor.
If the source input – for example, customer information – is not accurate and complete, the Product team will not be able to build a reasonable pricing model for distinct customer segments based on the organisation’s risk appetite and profit margin.
When the data captured is not timely or in a valid format, the Customer Due Diligence team will need to spend extra hours on public websites, emails and phone interviews to verify information on natural persons or entities.
Also, mass or personalised communications about key term changes or promotions may not reach the right target audience when their recorded contact details have expired.
From a control perspective, the Risk team will not be able to differentiate customer profiles to perform risk assessment for rated, unrated, public and private entities if the data records of the same person or group are not unique or consistent. The alert mechanisms for abnormal transaction patterns and client activities could also be weakened, subsequently causing a flawed regulatory control framework with potential AML fines.
In its most optimal form, data can facilitate businesses to obtain a clear and valid representation of their customers, products, operations and compliance landscape. However, we often hear that poor data quality is a big impediment blocking success.
Data should be ready to use, allowing all business units to simply start and complete their business-as-usual operations or regular and ad-hoc reporting at the drop of a hat. Actionable insights should also be generated to enable accurate and timely strategic decisions in response to rapid market competition and ever-changing customer demands. In addition, data should enable intelligent automation that replaces tedious and complex manual processes, while dependency on IT is minimised to reduce operational cost and turnaround time, and no unfair or subjective judgements or negative outcomes are produced.
Synpulse's Data Quality Management
Synpulse’s DATAMANAGEMENTINABOX solution provides a Data Quality Framework which emphasises both control and monitoring. The Data Quality Framework serves as a reference model and guide for our clients to understand their current data quality in an objective manner, and to fill the gaps identified with industry best practices.
As we understand that no two data quality projects are the same, each design is customised to enable our clients to be self-sufficient in controlling their data quality issues, as well as in monitoring future data impediments.
Data quality control
The objective of data quality control is to ensure that the data elements with material impact on the business are constantly kept in check in terms of their accuracy, completeness, validity, timeliness, uniqueness and consistency.
Business requirements on these dimensions vary from one department to another. Our framework provides a comprehensive step-by-step guide to assist clients in defining these requirements along with the acceptable error tolerance levels, ensuring that the data is fit for business, both for operations and analytics. This will serve as the standard of data quality control implementation, and we work closely with our clients to fine-tune the controls in order to achieve the best results.
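As an illustration, the dimensions above can be expressed as simple rule-based checks whose failure rates are compared against an agreed tolerance. The sketch below is a minimal, hypothetical example – the field names, rules and tolerance levels are illustrative assumptions, not Synpulse's actual implementation:

```python
# Minimal sketch of rule-based data quality checks on customer records.
# Field names, value domains, and tolerances are hypothetical illustrations.
from datetime import date, timedelta

def check_completeness(record):
    """Completeness: mandatory fields must be populated."""
    return all(record.get(f) for f in ("customer_id", "name", "country"))

def check_validity(record):
    """Validity: values must fall within the expected domain."""
    return record.get("country", "") in {"SG", "HK", "CH", "GB"}

def check_timeliness(record, max_age_days=365):
    """Timeliness: the record must have been refreshed recently."""
    last_updated = record.get("last_updated")
    return (last_updated is not None
            and date.today() - last_updated <= timedelta(days=max_age_days))

def check_uniqueness(records):
    """Uniqueness: no duplicate customer identifiers across the dataset."""
    ids = [r["customer_id"] for r in records if "customer_id" in r]
    return len(ids) == len(set(ids))

def error_rate(records, check):
    """Share of records failing a check, to be compared against the
    business-defined tolerance level for that dimension."""
    failures = sum(1 for r in records if not check(r))
    return failures / len(records) if records else 0.0
```

In practice each department would contribute its own rules and tolerances per data element; the point of the sketch is that every dimension reduces to a measurable pass/fail test.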
Ultimately, the data quality control framework will safeguard the end-to-end data quality management lifecycle to ensure optimum data quality.
Data quality monitoring
Once the requirements for the above-mentioned dimensions are defined, they become the data quality metrics and key performance indicators (KPIs).
Our data quality monitoring framework assists our clients in measuring these metrics and KPIs, and in evaluating them against the defined data quality criteria. The aim is to allow our clients to be self-sufficient in identifying and examining the quality of their data in a proactive manner, and to be alerted to any data that violates the data quality metrics that are set. This enables our clients to take immediate steps to remediate the data and to continuously improve the data quality measures and controls over time, ultimately preventing the recurrence of the same data issues. The data quality monitoring framework also serves as the baseline for our clients to include additional data quality metrics and measures to build up the rigour of the overall data quality management lifecycle.
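The monitoring step described above can be sketched as comparing measured error rates against the tolerance thresholds defined for each dimension and flagging any breach for remediation. The metric names and thresholds below are hypothetical illustrations:

```python
# Minimal sketch of data quality monitoring: compare measured error rates
# against defined tolerance thresholds and flag breaches for remediation.
# Dimension names and threshold values are hypothetical illustrations.

THRESHOLDS = {            # maximum acceptable error rate per dimension
    "completeness": 0.02,
    "validity": 0.01,
    "timeliness": 0.05,
}

def evaluate_kpis(measured):
    """Return the dimensions whose measured error rate breaches its
    tolerance threshold, i.e. the alerts to raise to the data owners."""
    return [dim for dim, rate in measured.items()
            if rate > THRESHOLDS.get(dim, 0.0)]

# Example: completeness and validity are within tolerance,
# timeliness breaches its 5% threshold and triggers an alert.
alerts = evaluate_kpis({"completeness": 0.01,
                        "validity": 0.005,
                        "timeliness": 0.08})
# alerts == ["timeliness"]
```

Running such an evaluation on every data load, rather than ad hoc, is what makes the monitoring proactive: breaches surface as alerts before downstream teams consume the affected data.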
With DATAMANAGEMENTINABOX, we consolidate our deep data expertise and hands-on industry experience into a solid reference model for all financial institutions to solve their data management struggles. This article introduces our Data Quality Management Framework, the first of the seven components within DATAMANAGEMENTINABOX. We will be introducing each of the components in the coming months, so do stay tuned.
This article was authored by Yoong Chung (Associate Partner) and Tony Cheung (Consultant).