Data Quality Archives - MDM Team

Big Data – Visualization Tools

Traditional tools in data visualization have both a data and a graphical component. Advanced visualization and discovery tools use in-memory architecture to allow users to interact with the data. Patterns in a large data set can be difficult to recognize in a numbers display. A visual pattern can be picked up quickly when thousands of data points are loaded into a sophisticated display.

Information graphics or info-graphics are graphical representations stylized for effective interaction and comprehension. Marketing adopted these to provide visual appeal to presentations. Journalists, bloggers, and teachers found info-graphics useful for trend analysis, presentation, and distribution. Information visualization methods like radar charts, parallel coordinate plots, tag charts, heat maps, and data maps are now supported by many tool-sets. These allow users to rapidly discern changes in data over time, gain insights into related items, and understand potential cause and effect relationships before impacts occur. These tools have several benefits over traditional visualization tools:

  • Sophisticated analysis and visualization types, such as small multiples, spark lines, heat maps, histograms, waterfall charts, and bullet graphs (see the heat map sketch after this list)
  • Built-in adherence to visualization best practices
  • Interactivity enabling visual discovery
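
As a minimal illustration of one of the visualization types listed above, the following sketch renders a heat map of data quality scores by source system and month using matplotlib. The systems, months, scores, and layout are invented for the example and are not tied to any particular tool mentioned in this post.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical data: quality scores (0-100) for five source systems over six months.
systems = ["CRM", "ERP", "Billing", "Web", "Warehouse"]
months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
rng = np.random.default_rng(seed=42)
scores = rng.integers(60, 100, size=(len(systems), len(months)))

fig, ax = plt.subplots(figsize=(7, 3))
im = ax.imshow(scores, cmap="RdYlGn", vmin=0, vmax=100)  # red = poor, green = good

# Label axes with systems and months, and annotate each cell with its score.
ax.set_xticks(range(len(months)))
ax.set_xticklabels(months)
ax.set_yticks(range(len(systems)))
ax.set_yticklabels(systems)
for i in range(len(systems)):
    for j in range(len(months)):
        ax.text(j, i, int(scores[i, j]), ha="center", va="center", fontsize=8)

fig.colorbar(im, ax=ax, label="Quality score")
ax.set_title("Data quality scores by source system (illustrative)")
plt.tight_layout()
plt.show()
```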

Data Quality – Policy and Metrics

Data Quality – Policy

Data Quality efforts should be supported by and should support data governance policies. For example, governance policies can authorize periodic quality audits and mandate compliance with standards and best practices. All Data Management Knowledge Areas require some level of policy, but data quality policies are particularly important, as they often touch on regulatory requirements. Each policy should include:

  • Purpose, scope and applicability of the policy
  • Definitions of terms
  • Responsibilities of the Data Quality program
  • Responsibilities of other stakeholders
  • Reporting
  • Implementation of the policy, including links to risk, preventative measures, compliance, data protection, and data security

Data Quality – Metrics

Much of the work of a Data Quality team will focus on measuring and reporting on quality. High-level categories of data quality metrics include:

  • Return on Investment: Statements on cost of improvement efforts vs. the benefits of improved data quality
  • Levels of Quality: Measurements of the number and percentage of errors or requirement violations within a data set or across data sets (see the error-rate sketch after this list)
  • Data Quality Trends: Quality improvement over time (i.e., a trend) against thresholds and targets, or quality incidents per period
  • Data Issue Management Metrics:
    • Counts of issues by dimensions of data quality
    • Issues per business function and their statuses (resolved, outstanding, escalated)
    • Issues by priority and severity
    • Time to resolve issues
  • Conformance to Service Levels: Organizational units involved and responsible staff, project interventions for data quality assessments, overall process conformance
  • Data Quality Plan Rollout: As-is and roadmap for expansion
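
As a minimal sketch of the 'Levels of Quality' category referenced above, the code below counts rule violations in a small record set and reports the error percentage. The records, rule names, and field names are invented for illustration; real metrics would be driven by profiled production data.

```python
# Hypothetical records and validity rules (illustrative only).
records = [
    {"customer_id": "C001", "email": "a@example.com", "country": "US"},
    {"customer_id": "C002", "email": "", "country": "US"},
    {"customer_id": "", "email": "c@example.com", "country": "ZZ"},
]

rules = {
    "customer_id_present": lambda r: bool(r["customer_id"]),
    "email_present": lambda r: bool(r["email"]),
    "country_is_valid": lambda r: r["country"] in {"US", "CA", "GB"},
}

# Count violations per rule and compute an overall error rate across all checks.
violations = {name: sum(0 if check(r) else 1 for r in records) for name, check in rules.items()}
total_checks = len(records) * len(rules)
total_violations = sum(violations.values())

for name, count in violations.items():
    print(f"{name}: {count} violation(s) out of {len(records)} records")
print(f"Overall error rate: {100 * total_violations / total_checks:.1f}%")
```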

Data Quality Program – Readiness Assessment / Risk Assessment

Findings from a readiness assessment will help determine where to start and how quickly to proceed. Findings can also provide the basis for roadmapping program goals. If there is strong support for data quality improvement and the organization knows its own data, then it may be possible to launch a full strategic program. If the organization does not know the actual state of its data, then it may be necessary to focus on building that knowledge before developing a full strategy. Organizational readiness to adopt data quality practices can be assessed by considering the following characteristics:

  • Management commitment to managing data as a strategic asset: As part of asking for support for a Data Quality program, it is important to determine how well senior management understands the role that data plays in the organization. To what degree does senior management recognize the value of data to strategic goals? What risks do they associate with poor quality data? How knowledgeable are they about the benefits of data governance? How optimistic are they about the ability to change the culture to support quality improvement?
  • The organization’s current understanding of the quality of its data: Before most organizations start their quality improvement journey, they generally understand the obstacles and pain points that signify poor quality data. Gaining knowledge of these is important. Through them, poor quality data can be directly associated with negative effects, including direct and indirect costs, on the organization. An understanding of pain points also helps identify and prioritize improvement projects.
  • The actual state of the data: Finding an objective way to describe the condition of data that is causing pain points is the first step to improving the data. Data can be measured and described through profiling and analysis, as well as through quantification of known issues and pain points. If the DQ team does not know the actual state of the data, then it will be difficult to prioritize and act on opportunities for improvement.
  • Risks associated with data creation, processing, or use: Identifying what can go wrong with data and the potential damage to an organization from poor quality data provides the basis for mitigating risks. If the organization does not recognize these risks, it may be challenging to get support for the Data Quality program.
  • Cultural and technical readiness for scalable data quality monitoring: The quality of data can be negatively impacted by business and technical processes. Improving the quality of data depends on cooperation between business and IT teams. If the relationship between business and IT teams is not collaborative, then it will be difficult to make progress.

Data Quality Program – Implementation Guidelines

Typically, a hybrid approach works best – top-down for sponsorship, consistency, and resources, but bottom-up to discover what is actually broken and to achieve incremental successes. Improving data quality requires changes in how people think about and behave toward data. Cultural change is challenging. It requires planning, training, and reinforcement. While the specifics of cultural change will differ from organization to organization, most Data Quality Program implementations need to plan for:

  • Metrics on the value of data and the cost of poor quality data: One way to raise organizational awareness of the need for Data Quality Management is through metrics that describe the value of data and the return on investment from improvements. These metrics (which differ from data quality scores) provide the basis for funding improvements and changing the behavior of both staff and management.
  • Operating model for IT/Business interactions: Business people know what the important data is, and what it means. Data Custodians from IT understand where and how the data is stored, and so they are well placed to translate definitions of data quality into queries or code that identify specific records that do not comply.
  • Changes in how projects are executed: Project oversight must ensure project funding includes steps related to data quality (e.g., profiling and assessment, definition of quality expectations, data issue remediation, prevention and correction, building controls and measurements). It is prudent to make sure issues are identified early and to build data quality expectations upfront in projects.
  • Changes to business processes: Improving data quality depends on improving the processes by which data is produced. The Data Quality team needs to be able to assess and recommend changes to non-technical (as well as technical) processes that impact the quality of data.
  • Funding for remediation and improvement projects: Some organizations do not plan for remediating data, even when they are aware of data quality issues. Data will not fix itself. The costs and benefits of remediation and improvement projects should be documented so that work on improving data can be prioritized.
  • Funding for Data Quality Operations: Sustaining data quality requires ongoing operations to monitor data quality, report on findings, and continue to manage issues as they are discovered.

Employees need to think and act differently if they are to produce better quality data and manage data in ways that ensure quality. This requires training and reinforcement. Training should focus on:

  • Common causes of data problems
  • Relationships within the organization’s data ecosystem and why improving data quality requires an enterprise approach
  • Consequences of poor quality data
  • Necessity for ongoing improvement (why improvement is not a one-time thing)
  • Becoming ‘data-lingual’: able to articulate the impact of data on organizational strategy and success, regulatory reporting, and customer satisfaction

Training should also include an introduction to any process changes, with assertions about how the changes improve data quality.

Data Quality – Audit Code Module and Metrics

Quality Check and Audit Code Modules

Create shareable, linkable, and re-usable code modules that execute repeated data quality checks and audit processes, and that developers can get from a library. If the module needs to change, all the code linked to that module is updated with it. Such modules simplify the maintenance process. Well-engineered code blocks can prevent many data quality problems. As importantly, they ensure processes are executed consistently. Where laws or policy mandate reporting of specific quality results, the lineage of results often needs to be described; quality check modules can provide this. For data that is highly valued but questionable along any quality dimension, qualify the information in shared environments with quality notes and confidence ratings.
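
One way to package such a module is as a small, importable function library with a common result structure, so every pipeline that links to it runs the same checks and summarizes results the same way. The sketch below is illustrative only; the module name, check names, and result fields are assumptions, not a prescribed interface.

```python
"""dq_checks.py - a shareable module of repeatable data quality checks (illustrative sketch)."""
from dataclasses import dataclass
from typing import Callable, Iterable


@dataclass
class CheckResult:
    check_name: str
    records_checked: int
    records_failed: int

    @property
    def pass_rate(self) -> float:
        # Treat an empty input as fully passing rather than dividing by zero.
        if self.records_checked == 0:
            return 1.0
        return 1 - self.records_failed / self.records_checked


def run_check(name: str, records: Iterable[dict], predicate: Callable[[dict], bool]) -> CheckResult:
    """Apply a single rule to every record and return a summary callers can log or report."""
    checked = failed = 0
    for record in records:
        checked += 1
        if not predicate(record):
            failed += 1
    return CheckResult(name, checked, failed)


# Example usage from any pipeline that imports this module:
if __name__ == "__main__":
    data = [{"amount": 10.5}, {"amount": -3.0}, {"amount": 99.0}]
    result = run_check("amount_is_non_negative", data, lambda r: r["amount"] >= 0)
    print(f"{result.check_name}: pass rate {result.pass_rate:.0%}")
```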

Effective Data Quality Metrics

A critical component of managing data quality is developing metrics that inform data consumers about quality characteristics that are important to their uses of data. Many things can be measured, but not all of them are worth the time and effort. In developing metrics, DQ analysts should account for these characteristics:

  • Measurability: A data quality metric must be measurable – it needs to be something that can be counted. For example, data relevancy is not measurable, unless clear criteria are set for what makes data relevant. Even data completeness needs to be objectively defined in order to be measured. Expected results should be quantifiable within a discrete range.
  • Business Relevance: While many things are measurable, not all translate into useful metrics. Measurements need to be relevant to data consumers. The value of the metric is limited if it cannot be related to some aspect of business operations or performance. Every data quality metric should correlate with the influence of the data on key business expectations.
  • Acceptability: The data quality dimensions frame the business requirements for data quality. Quantifying along the identified dimensions provides hard evidence of data quality levels. Determine whether data meets business expectations based on specified acceptability thresholds. If the score is equal to or exceeds the threshold, the quality of the data meets business expectations; if it is below the threshold, it does not (see the threshold-check sketch after this list).
  • Accountability / Stewardship: Metrics should be understood and approved by key stakeholders (e.g., business owners and Data Stewards). They are notified when the measurement for the metric shows that the quality does not meet expectations. The business data owner is accountable, while a data steward takes appropriate corrective action.
  • Controllability: A metric should reflect a controllable aspect of the business. In other words, if the metric is out of range, it should trigger action to improve the data. If there is no way to respond, then the metric is probably not useful.
  • Trending: Metrics enable an organization to measure data quality improvement over time. Tracking helps Data Quality team members monitor activities within the scope of a data quality SLA and data sharing agreement, and demonstrate the effectiveness of improvement activities. Once an information process is stable, statistical process control techniques can be applied to detect changes to the predictability of the measurement results and the business and technical processes on which it provides insight.
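
A minimal sketch of the acceptability characteristic referenced above: compare a measured score with the business-defined threshold and decide whether the data meets expectations. The metric names, thresholds, and scores are invented for the example.

```python
# Illustrative thresholds agreed with business stakeholders (fraction of records passing).
acceptability_thresholds = {
    "customer_email_completeness": 0.98,
    "order_date_validity": 0.995,
}

# Measured scores from the most recent quality run (hypothetical values).
measured_scores = {
    "customer_email_completeness": 0.973,
    "order_date_validity": 0.998,
}

for metric, threshold in acceptability_thresholds.items():
    score = measured_scores[metric]
    status = "meets expectations" if score >= threshold else "below threshold - notify data steward"
    print(f"{metric}: score={score:.3f}, threshold={threshold:.3f} -> {status}")
```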

Data Quality – Preventive and Corrective Actions

Preventive Actions

The best way to create high quality data is to prevent poor quality data from entering an organization. Preventive actions stop known errors from occurring. Inspecting data after it is in production will not improve its quality. Approaches include:

  • Establish Data Entry Controls: Create data entry rules that prevent invalid or inaccurate data from entering a system.
  • Train Data Producers: Ensure staff in upstream systems understand the impact of their data on downstream users. Give incentives or base evaluations on data accuracy and completeness, rather than just speed.
  • Define and Enforce Rules: Create a ‘Data Firewall’: a table of all the business data quality rules used to check whether data is of acceptable quality before it is used in an application such as a data warehouse (see the sketch after this list). A data firewall can inspect the level of quality of data processed by an application, and if the level of quality is below acceptable levels, analysts can be informed about the problem.
  • Demand High Quality Data from Data Suppliers: Examine an external data provider’s processes to check their structures, definitions, and data source(s) and data provenance. Doing so enables assessment of how well their data will integrate and helps prevent the use of non-authoritative data or data acquired without permission from the owner.
  • Implement Data Governance and Stewardship: Ensure roles and responsibilities are defined that describe and enforce rules of engagement, decision rights, and accountabilities for effective management of data and information assets (McGilvray, 2008). Work with data stewards to revise the process of, and mechanisms for, generating, sending, and receiving data.
  • Institute Formal Change Control: Ensure all changes to stored data are defined and tested before being implemented. Prevent changes directly to data outside of normal processing by establishing gating processes.
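
A minimal sketch of the ‘Data Firewall’ idea from the list above: a table of business rules applied to incoming records before they are loaded into a target such as a data warehouse, with failing records routed to analysts instead of being loaded. The rule names, fields, and rejection policy are assumptions made for illustration.

```python
# Rule table: each entry names a business rule and the predicate that enforces it.
firewall_rules = [
    ("order_id_present", lambda r: bool(r.get("order_id"))),
    ("quantity_positive", lambda r: isinstance(r.get("quantity"), (int, float)) and r["quantity"] > 0),
    ("currency_known", lambda r: r.get("currency") in {"USD", "EUR", "GBP"}),
]

# Hypothetical incoming records awaiting load.
incoming = [
    {"order_id": "O-1", "quantity": 2, "currency": "USD"},
    {"order_id": "", "quantity": 5, "currency": "USD"},
    {"order_id": "O-3", "quantity": -1, "currency": "JPY"},
]

accepted, rejected = [], []
for record in incoming:
    failures = [name for name, rule in firewall_rules if not rule(record)]
    if failures:
        rejected.append((record, failures))   # route to analysts instead of the warehouse
    else:
        accepted.append(record)

print(f"Accepted {len(accepted)} record(s); rejected {len(rejected)}.")
for record, failures in rejected:
    print(f"  rejected {record} -> failed rules: {', '.join(failures)}")
```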

Corrective Actions

Corrective actions are implemented after a problem has occurred and been detected. Data quality issues should be addressed systemically and at their root causes to minimize the costs and risks of corrective actions. ‘Solve the problem where it happens’ is the best practice in Data Quality Management. This generally means that corrective actions should include preventing recurrence of the causes of the quality problems. Data correction is generally performed in one of three ways:

  • Automated Correction: Automated correction techniques include rule-based standardization, normalization, and correction. The modified values are obtained or generated and committed without manual intervention. An example is automated address correction, which submits delivery addresses to an address standardizer that conforms and corrects delivery addresses using rules, parsing, standardization, and reference tables. Automated correction requires an environment with well-defined standards, commonly accepted rules, and known error patterns. The amount of automated correction can be reduced over time if this environment is well-managed and corrected data is shared with upstream systems.
  • Manually-Directed Correction: Use automated tools to remediate and correct data but require manual review before committing the corrections to persistent storage. Apply name and address remediation, identity resolution, and pattern-based corrections automatically, and use some scoring mechanism to propose a level of confidence in the correction. Corrections with scores above a particular level of confidence may be committed without review, but corrections with scores below the level of confidence are presented to the data steward for review and approval. Commit all approved corrections, and review those not approved to understand whether to adjust the applied underlying rules. Environments in which sensitive data sets require human oversight (e.g., MDM) are good examples of where manually-directed correction may be suited (see the confidence-scoring sketch after this list).
  • Manual Correction: Sometimes manual correction is the only option in the absence of tools or automation or if it is determined that the change is better handled through human oversight. Manual corrections are best done through an interface with controls and edits, which provide an audit trail for changes. The alternative of making corrections and committing the updated records directly in production environments is extremely risky. Avoid using this method.
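
A minimal sketch of the manually-directed pattern referenced above: proposed corrections carry a confidence score, high-confidence changes are committed automatically, and low-confidence ones are queued for steward review. The corrections, scoring values, and threshold are invented for the example.

```python
# Proposed corrections with a confidence score from an (assumed) standardization engine.
proposed_corrections = [
    {"record_id": 101, "field": "state", "old": "Calif.", "new": "CA", "confidence": 0.97},
    {"record_id": 102, "field": "state", "old": "N Carolina", "new": "NC", "confidence": 0.88},
    {"record_id": 103, "field": "state", "old": "XX", "new": None, "confidence": 0.20},
]

AUTO_COMMIT_THRESHOLD = 0.95  # illustrative value, agreed with data stewards

auto_committed, steward_review_queue = [], []
for correction in proposed_corrections:
    if correction["confidence"] >= AUTO_COMMIT_THRESHOLD:
        auto_committed.append(correction)          # commit without manual review
    else:
        steward_review_queue.append(correction)    # hold for steward approval

print(f"Auto-committed: {len(auto_committed)}; awaiting steward review: {len(steward_review_queue)}")
```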

Statistical Process Control

Statistical Process Control (SPC) is a method to manage processes by analyzing measurements of variation in process inputs, outputs, or steps. The technique was developed in the manufacturing sector in the 1920s and has been applied in other industries, in improvement methodologies such as Six Sigma, and in Data Quality Management. Simply defined, a process is a series of steps executed to turn inputs into outputs. SPC is based on the assumption that when a process with consistent inputs is executed consistently, it will produce consistent outputs. It uses measures of central tendency (how values cluster around a central value, such as a mean, median, or mode) and of variability around a central value (e.g., range, variance, standard deviation), to establish tolerances for variation within a process.
SPC is used for control, detection, and improvement. The first step is to measure the process in order to identify and eliminate special causes. This activity establishes the control state of the process. Next is to put in place measurements to detect unexpected variation as soon as it is detectable. Early detection of problems simplifies investigation of their root causes. Measurements of the process can also be used to reduce the unwanted effects of common causes of variation, allowing for increased efficiency.
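
A minimal sketch of the SPC idea, assuming the daily error rate of a data feed is the measured quantity: compute the mean and standard deviation from a baseline period, set control limits at the mean plus or minus three standard deviations, and flag observations outside the limits as potential special causes. All figures are invented.

```python
import statistics

# Baseline daily error rates (%) from a period when the process was believed stable.
baseline = [1.2, 0.9, 1.1, 1.3, 1.0, 1.2, 1.1, 0.8, 1.0, 1.2]

mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)
upper_limit = mean + 3 * stdev
lower_limit = max(0.0, mean - 3 * stdev)
print(f"Control limits: {lower_limit:.2f}% .. {upper_limit:.2f}% (mean {mean:.2f}%)")

# New observations: values outside the control limits suggest a special cause to investigate.
new_observations = [1.1, 1.0, 2.4, 1.2]
for day, rate in enumerate(new_observations, start=1):
    flag = "in control" if lower_limit <= rate <= upper_limit else "OUT OF CONTROL - investigate"
    print(f"Day {day}: {rate:.1f}% -> {flag}")
```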

Root Cause Analysis

A root cause of a problem is a factor that, if eliminated, would remove the problem itself. Root cause analysis is a process of understanding factors that contribute to problems and the ways they contribute. Its purpose is to identify underlying conditions that, if eliminated, would mean problems would disappear.
Common techniques for root cause analysis include Pareto analysis (the 80/20 rule), fishbone diagram analysis, track and trace, process analysis, and the Five Whys (McGilvray, 2008).
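
A minimal sketch of Pareto analysis applied to issue counts by cause: sort causes by frequency and report the cumulative share, which typically shows that a small number of causes account for most issues. The cause names and counts are invented.

```python
# Hypothetical issue counts by root cause over a reporting period.
issue_counts = {
    "missing source field": 420,
    "manual entry typo": 310,
    "late file delivery": 95,
    "duplicate customer record": 60,
    "reference table out of date": 40,
    "other": 25,
}

total = sum(issue_counts.values())
cumulative = 0
print(f"{'Cause':<30}{'Count':>7}{'Cum %':>8}")
for cause, count in sorted(issue_counts.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += count
    print(f"{cause:<30}{count:>7}{100 * cumulative / total:>7.1f}%")
```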

Data Quality – Tools

Tools should be selected and tool architectures should be set in the planning phase of the enterprise Data Quality program. Tools provide a partial rule-set starter kit, but organizations need to create and input their own context-specific rules and actions into any tool.

  • Modeling and ETL Tools: The tools used to model data and create ETL processes have a direct impact on the quality of data. If used with the data in mind, these tools can enable higher quality data. If they are used without knowledge of the data, they can have detrimental effects.
  • Data Querying Tools: Data profiling is only the first step in data analysis. It helps identify potential issues. Data Quality team members also need to query data more deeply to answer questions raised by profiling results and find patterns that provide insight into root causes of data issues.
  • Data Profiling Tools: Data profiling tools produce high-level statistics that enable analysts to identify patterns in data and perform initial assessment of quality characteristics (see the sketch after this list). Some tools can be used to perform ongoing monitoring of data. Profiling tools are particularly important for data discovery efforts because they enable assessment of large data sets. Profiling tools augmented with data visualization capabilities will aid in the process of discovery.
  • Data Quality Rule Templates: Rule templates allow analysts to capture expectations for data. Templates also help bridge the communications gap between business and technical teams. Consistent formulation of rules makes it easier to translate business needs into code, whether that code is embedded in a rules engine, the data analyzer component of a data-profiling tool, or a data integration tool. A template can have several sections, one for each type of business rule to implement.
  • Metadata Repositories: Defining data quality requires Metadata, and definitions of high quality data are a valuable kind of Metadata. DQ teams should work closely with teams that manage Metadata to ensure that data quality requirements, rules, measurement results, and documentation of issues are made available to data consumers.
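
A minimal sketch of the kind of high-level statistics a profiling tool produces, computed here directly in Python over a single small column; real tools scale this to full data sets and add pattern and frequency analysis. The column values are invented.

```python
from collections import Counter

# A hypothetical column extracted from a source table.
birth_years = ["1984", "1990", None, "1975", "1990", "", "19900", "1962", None, "1984"]

total = len(birth_years)
nulls = sum(1 for v in birth_years if v in (None, ""))
non_null = [v for v in birth_years if v not in (None, "")]
distinct = set(non_null)
bad_length = [v for v in non_null if len(v) != 4]   # simple pattern check: expect 4 characters

print(f"Rows: {total}")
print(f"Null or blank: {nulls} ({100 * nulls / total:.0f}%)")
print(f"Distinct non-null values: {len(distinct)}")
print(f"Min/Max: {min(non_null)} / {max(non_null)}")
print(f"Values failing the 4-character pattern: {bad_length}")
print("Most frequent values:", Counter(non_null).most_common(3))
```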

Data Quality – Incident Tracking System

The incident tracking system will collect performance data relating to issue resolution, work assignments, volume of issues, frequency of occurrence, as well as the time to respond, diagnose, plan a solution, and resolve issues. These metrics can provide valuable insights into the effectiveness of the current workflow, as well as systems and resource utilization, and they are important management data points that can drive continuous operational improvement for Data Quality Control.

Incident tracking data also helps data consumers. Decisions based upon remediated data should be made with knowledge that it has been changed, why it has been changed, and how it has been changed. That is one reason why it is important to record the methods of modification and the rationale for them. Make this documentation available to data consumers and developers researching code changes. While changes may be obvious to the people who implement them, the history of changes will be lost to future data consumers unless it is documented. Data quality incident tracking requires staff be trained on how issues should be classified, logged, and tracked. To support effective tracking:

  • Standardize Data Quality Issues and Activities: Since the terms used to describe data issues may vary across lines of business, it is valuable to define a standard vocabulary for the concepts used. Doing so will simplify classification and reporting. Standardization also makes it easier to measure the volume of issues and activities, identify patterns and inter-dependencies between systems and participants, and report on the overall impact of data quality activities. The classification of an issue may change as the investigation deepens and root causes are exposed.
  • Provide an Assignment Process for Data Issues: The operational procedures direct the analysts to assign data quality incidents to individuals for diagnosis and to provide alternatives for resolution. Drive the assignment process within the incident tracking system by suggesting those individuals with specific areas of expertise.
  • Manage Issue Escalation Procedures: Data quality issue handling requires a well-defined system of escalation based on the impact, duration, or urgency of an issue. Specify the sequence of escalation within the data quality Service Level Agreement. The incident tracking system will implement the escalation procedures, which helps expedite efficient handling and resolution of data issues (see the escalation sketch after this list).
  • Manage Data Quality Resolution Workflow: The data quality SLA specifies objectives for monitoring, control, and resolution, all of which define a collection of operational workflows. The incident tracking system can support workflow management to track progress with issues diagnosis and resolution.
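
A minimal sketch of severity- and age-based escalation of the kind described above: issues that stay open longer than the limit for their severity move up an assumed escalation path. The severities, age limits, roles, and timestamps are invented for the example.

```python
from datetime import datetime, timezone

# Escalation policy (illustrative): maximum open hours per severity before escalating a level.
escalation_hours = {"critical": 4, "high": 24, "medium": 72, "low": 168}
escalation_path = ["data analyst", "data steward", "data governance board"]

open_issues = [
    {"id": "DQ-101", "severity": "critical", "opened": datetime(2022, 3, 1, 6, 0, tzinfo=timezone.utc), "level": 0},
    {"id": "DQ-102", "severity": "medium", "opened": datetime(2022, 2, 20, 9, 0, tzinfo=timezone.utc), "level": 1},
]

now = datetime(2022, 3, 1, 18, 0, tzinfo=timezone.utc)  # fixed "current time" for a reproducible example
for issue in open_issues:
    age_hours = (now - issue["opened"]).total_seconds() / 3600
    if age_hours > escalation_hours[issue["severity"]] and issue["level"] < len(escalation_path) - 1:
        issue["level"] += 1   # escalate one level up the path
    print(f"{issue['id']} ({issue['severity']}, {age_hours:.0f}h open) -> assigned to {escalation_path[issue['level']]}")
```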

Data Quality – SLA – Service Level Agreements

A data quality Service Level Agreement (SLA) specifies an organization’s expectations for response and remediation for data quality issues in each system. Data quality inspections as scheduled in the SLA help to identify issues to fix and, over time, reduce the number of issues. In addition to enabling the isolation and root cause analysis of data flaws, the operational procedures are expected to provide a scheme for remediating root causes within an agreed timeframe. Having data quality inspection and monitoring in place increases the likelihood of detection and remediation of a data quality issue before a significant business impact can occur. Operational data quality control defined in a data quality SLA includes:

  • Data elements covered by the agreement
  • Business impacts associated with data flaws
  • Data quality dimensions associated with each data element
  • Expectations for quality for each data element for each of the identified dimensions in each application or system in the data value chain
  • Methods for measuring against those expectations
  • Acceptability threshold for each measurement
  • Steward(s) to be notified in case the acceptability threshold is not met
  • Timelines and deadlines for expected resolution or remediation of the issue
  • Escalation strategy, and possible rewards and penalties

The data quality SLA also defines the roles and responsibilities associated with performance of operational data quality procedures. The operational data quality procedures provide reports in conformance with the defined business rules, as well as monitoring staff performance in reacting to data quality incidents. Data stewards and the operational data quality staff, while upholding the level of data quality service, should consider their data quality SLA constraints and connect data quality to individual performance plans.

When issues are not addressed within the specified resolution times, an escalation process must exist to communicate non-observance of the level of service up the management and governance chain.

Given the set of data quality rules, methods for measuring conformance, the acceptability thresholds defined by the business clients, and the service level agreements, the Data Quality team can monitor compliance of the data to the business expectations, as well as how well the Data Quality team performs on the procedures associated with data errors.
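
A minimal sketch of monitoring conformance to an SLA’s resolution timelines: for each closed incident, compare the actual resolution time with the agreed deadline for its severity and report the conformance rate. The severities, deadlines, and timings are invented for the example.

```python
# Agreed resolution deadlines (hours) per severity, as they might appear in a data quality SLA.
sla_resolution_hours = {"critical": 8, "high": 48, "medium": 120}

closed_incidents = [
    {"id": "DQ-090", "severity": "critical", "hours_to_resolve": 6},
    {"id": "DQ-091", "severity": "high", "hours_to_resolve": 60},
    {"id": "DQ-092", "severity": "medium", "hours_to_resolve": 100},
    {"id": "DQ-093", "severity": "critical", "hours_to_resolve": 12},
]

met = sum(1 for i in closed_incidents if i["hours_to_resolve"] <= sla_resolution_hours[i["severity"]])
print(f"SLA conformance: {met}/{len(closed_incidents)} incidents resolved within the agreed timeframe "
      f"({100 * met / len(closed_incidents):.0f}%)")
for incident in closed_incidents:
    limit = sla_resolution_hours[incident["severity"]]
    status = "within SLA" if incident["hours_to_resolve"] <= limit else "missed SLA"
    print(f"  {incident['id']}: {incident['hours_to_resolve']}h vs {limit}h -> {status}")
```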

Data Quality Reporting

The work of assessing the quality of data and managing data issues will not benefit the organization unless the information is shared through reporting so that data consumers understand the condition of the data. Reporting should focus on:

  • Data Quality Scorecard, which provides a high-level view of the scores associated with various metrics, reported to different levels of the organization within established thresholds
  • Data Quality Trends, which show over time how the quality of data is measured, and whether trending is up or down
  • SLA Metrics, such as whether operational data quality staff diagnose and respond to data quality incidents in a timely manner
  • Data Quality Issue Management, which monitors the status of issues and resolutions
  • Conformance of the Data Quality team to governance policies
  • Conformance of IT and business teams to Data Quality policies
  • Positive Effects of improvement projects

Reporting should align to metrics in the data quality SLA as much as possible, so that the team’s goals are aligned with those of its customers. The Data Quality program should also report on the positive effects of improvement projects. It is best to do this in business terms to continually remind the organization of the direct effect that data has on customers.
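
A minimal sketch of a trend view for a scorecard: monthly scores per metric with a simple up/down indicator against the prior period and a status against the agreed threshold. The metrics, scores, and thresholds are invented for the example.

```python
# Monthly quality scores per metric (illustrative), plus the agreed acceptability threshold.
history = {
    "customer_email_completeness": {"scores": [0.95, 0.96, 0.97], "threshold": 0.98},
    "order_date_validity":         {"scores": [0.999, 0.997, 0.998], "threshold": 0.995},
}

for metric, info in history.items():
    latest, previous = info["scores"][-1], info["scores"][-2]
    direction = "up" if latest > previous else "down" if latest < previous else "flat"
    status = "meets threshold" if latest >= info["threshold"] else "below threshold"
    print(f"{metric}: {latest:.3f} ({direction} vs prior period) - {status}")
```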
