What Doesn’t Kill You Will Make You Stronger - A Deep Dive Into Data Remediation Execution


Data remediation activities in financial services will never cease. The best that can be achieved is to significantly reduce the frequency and scope of remediations over time. Remediation does not always indicate a negative financial impact on a customer, but it does indicate a negative financial impact on the organisation.

The mantra ‘what doesn’t kill you will make you stronger’ is very applicable to the execution and implementation of remediation projects in financial services. It is often a demanding process with short deadlines, and it typically requires multiple attempts and internal reviews to get right. It is not uncommon to uncover new insights during execution that alter the course of the remediation. It is therefore critical to have a structured process ready when entering this phase, covering everything from data ingestion to analysis and reporting.

Once remediation is underway, it usually garners significant internal attention, given that it often exposes a dormant issue. Considerable pressure is placed on assessing scope, product impact, and legal and regulatory breaches within a very short timeframe.

This article aims to dive deeper into the foundation for the successful execution of a remediation project: data.

The first step is acquiring the correct data. To do this, a deep dive with business subject matter experts (SMEs) must be undertaken, with the objective of understanding which key data sets are required.

During the data collection phase there are at least three things to consider:

Scope

Data extraction can be a costly process requiring time and resources. Limiting the data set to strictly cover your scope will save both analysis time and the cost of subsequent re-extractions. Ensure this scope is clearly communicated to the technology team providing the extracts. For instance, many business units use member number, client number and account number interchangeably, yet in the back-end database these are very different data elements.

Sources

Identify all the sources of the required data sets. Typically, these will be various groups within the technology department that can provide source system extracts or set up direct access to database replicas. They could also be project managers who hold relevant data as flat files, or SMEs who hold data on product or business rules. Having an overview of this will aid in scoping out the work involved in centralising all the data in one structured database.

Auditability

Remediation activities will almost always be subject to either an internal or external audit. It is important to consider which steps will need evidence and/or attestation to prove that correct, production data was used in the analysis. This applies to any data received in the form of extracts, as well as data referenced through specific product rules. These attestations need to be clear about the data being provided and any known limitations or filters applied to the data sets. It is always easier to request these upfront from the stakeholder providing the data. For instance, in the case of technology extracts, it allows the team to save the logic they used for extraction at the time they performed that task.
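A simple way to keep these attestations consistent is to capture them as a structured record alongside each extract. The sketch below is purely illustrative; the field names and values are assumptions rather than a prescribed standard, and would be adapted to each organisation's evidencing requirements.

    # Illustrative attestation record for a single data extract.
    # All field names and values are hypothetical placeholders.
    extract_attestation = {
        "extract_name": "member_transactions",              # hypothetical extract
        "source_system": "core_registry",                   # hypothetical source system
        "provided_by": "Technology - Data Services",
        "extraction_date": "2020-07-01",
        "extraction_logic": "member_transactions_v1.sql",   # saved copy of the query used
        "known_limitations": "Excludes accounts closed before 2015",
        "filters_applied": "Account status = 'ACTIVE' at extraction date",
        "attested_by": "Source system owner",
    }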

Once the data has been sourced and attested to, it should be pulled together into one structured data set through the data load and transform process. The main goals here are to ensure:

  • No data loss

For example, use a loading process that reports the number of lines pulled into the database from the data extract, and compare this to the number of lines in the extract itself.

  • Data integrity is maintained

For example, ensure values retrieved are sensible. Common checks cover data types, date ranges, duplicated records, missing or null values in key fields, truncation of leading zeroes and so forth.

  • Data is complete

Investigate relationships between data sets that are meant to be related and see whether they converge to the same story. For example, does the data set meet the scope requirement? Does a simple trend analysis show large, unexpected gaps in the data?

Once the data is loaded, these tests must be carried out and documented, as they are likely to form part of a review or audit. Extensive testing at this point is imperative, as the loaded data will form the foundation of the analysis performed for the remediation. A rough sketch of what such checks might look like follows.
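The sketch below is a minimal example of the three checks above, assuming the extract arrives as a delimited file and is loaded into a pandas DataFrame. The file name, column names and key fields are hypothetical and will differ for every remediation.

    import pandas as pd

    # Hypothetical extract file and key columns - adjust to the actual data set.
    extract_path = "member_transactions.csv"

    # Load, forcing identifiers to strings so leading zeroes are not truncated.
    df = pd.read_csv(
        extract_path,
        dtype={"member_number": str},
        parse_dates=["transaction_date"],
    )

    # No data loss: compare rows loaded against lines in the source file (minus header).
    with open(extract_path) as f:
        source_lines = sum(1 for _ in f) - 1
    assert len(df) == source_lines, f"Loaded {len(df)} rows, extract has {source_lines} lines"

    # Data integrity: duplicates, nulls in key fields, and sensible date ranges.
    print("Duplicate rows:", df.duplicated().sum())
    print("Null member numbers:", df["member_number"].isna().sum())
    print("Date range:", df["transaction_date"].min(), "to", df["transaction_date"].max())

    # Data completeness: a simple monthly trend to highlight unexpected gaps.
    print(df.set_index("transaction_date").resample("M").size())

The point is not the specific tooling but that each check produces an output that can be saved and evidenced as part of the review or audit trail.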

Core Data Analysis

Following on from these validations is the core data analysis, where much of the complexity lies. Some of the key steps listed below also apply to broader analytics projects:

  • Keep the business informed

Keep the business informed about any findings that are unexpected; they could point to another core issue or to a misinterpretation of a requirement. Try to take the business SME on the data analytics journey, as their input and feedback will help cut through a lot of the noise inherently present in the data.

  • Keep a record of all the changes

Keep a record of all changes made to the analytic script or logic being developed, particularly if your logic hard-codes anything or implements specific product requirements. The logic implemented here will be under heavy scrutiny and review, so all decisions need to be clearly documented. It is a good idea to do this as you develop, while design decisions are fresh and supporting evidence is likely to be at hand.

  • Visualise your data for better communication

The analysis will need to be communicated back to other stakeholders, so it is good practice to envisage what form this data visualisation will take early on. Developing this view early in the analysis also gives the data analyst a good overall understanding of the data; a simple sketch follows this list.
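As a rough illustration, and assuming the validated DataFrame from the earlier sketch, a simple time-series view of potentially impacted members is often enough to support early stakeholder conversations. The flag and column names here are hypothetical placeholders.

    import matplotlib.pyplot as plt

    # Hypothetical: 'potentially_impacted' is a boolean flag set during the analysis.
    impacted = df[df["potentially_impacted"]]

    # Unique potentially impacted members per month.
    monthly = (
        impacted.set_index("transaction_date")
        .resample("M")["member_number"]
        .nunique()
    )

    monthly.plot(kind="bar", title="Potentially impacted members by month")
    plt.ylabel("Unique members")
    plt.tight_layout()
    plt.savefig("impacted_members_by_month.png")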

Using the data sets at hand, it is recommended to establish a rough baseline of the size and scope of the remediation at the very beginning of the project. In the early stages, this will likely rest on several broad assumptions (which must be explicitly stated), with the goal of developing a heuristic for the member numbers and dollar impacts at play.

The process of diving into the data to develop the baseline might expose data sets not originally thought to be required. As the remediation progresses, the baseline also serves as a sanity check that the numbers are still in the right ballpark. If there are major differences, it is worth taking a closer look at the area with the discrepancy and justifying it either quantitatively or qualitatively.
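To show just how rough this baseline can be, the sketch below multiplies an estimated impacted population by an assumed average impact and applies a wide margin. Every number is a placeholder assumption for illustration only, not a figure from any actual remediation.

    # Hypothetical, order-of-magnitude baseline - all inputs are placeholder assumptions.
    estimated_impacted_members = 12_000   # e.g. members holding the affected product
    assumed_average_impact = 85.00        # assumed average dollar impact per member
    assumed_error_margin = 0.5            # +/- 50% to reflect early-stage uncertainty

    baseline = estimated_impacted_members * assumed_average_impact
    low = baseline * (1 - assumed_error_margin)
    high = baseline * (1 + assumed_error_margin)
    print(f"Baseline impact estimate: ${baseline:,.0f} (range ${low:,.0f} - ${high:,.0f})")

As the stated assumptions are tested against the actual data, the range should tighten; a baseline that drifts well outside it is the trigger to investigate and justify the discrepancy.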

Continuous Improvement

If new data is entering a system, remediation activities will not be going away, nor should they. Data is inherently ‘dirty’ and must undergo a systematic quality process.

Remediation activities themselves are not necessarily an indication of poor controls, but what gave rise to them can be. If most remediations are instigated by ad-hoc triggers and external parties, then there is a clear lack of a reliable data quality process.

A mature data management system can minimise the number of ad-hoc triggers and remediation activities overall. While it will not completely eradicate them, it will allow project teams to be better prepared for the resulting remediation, minimise the risk of introducing new issues, and demonstrate that the controls put in place are effective.

When done correctly, data remediation contributes to the important cycle of continuous improvement, raising the value of the data within any financial services organisation.


If your organisation needs assistance with remediation projects, QMV can help.

QMV are experts in remediation and known for delivering on complex requirements. Our team includes specialist consultants with the hybrid technical and analytical skills required for remediation. For further information, please telephone our office on +61 3 9620 0707 or submit an online form.

Like what you see? Please subscribe to receive original QMV content!