
Small or Traditional Data Management Strategy

There are basic elements of a small, internal, or traditional data management strategy that are essential to giving analysts the foundation they need to produce value-added deliverables. These elements include:

  • Data Governance Structure

  • Advanced Data Stewardship Services

  • Data Quality Analytics

  • Data Quality Controls

  • Data Standardization

  • Metadata Catalog and Data Integration

  • Data Flows Services

  • Data Rationalization Services

Data Governance Structure

The Governance Structure outlines the roles and responsibilities of those within the Data Management space who manage and govern the integrity of the data provided.  Additionally, the structure identifies who is responsible for monitoring and maintaining the quality of the data used to generate the performance metrics for the businesses.

Other elements of the Governance include:

  • Data Governance Principles

Principles are established to standardize the definitions of the performance metrics and to define the data quality goals for those metrics so that they meet the needs of the user community.

  • Data Processes and Procedures

Processes and Procedures define the rules of engagement between those representing the Analytics Governance organization and the relevant parties, including representatives from the business units.

  • Data Quality Tracking

Refers to the design, development, and implementation of the reports and scorecards used to track the quality of the data supporting the performance metrics, as well as the quality of the internal Operating Model processes within the data team (a simple scorecard sketch follows this list).

  • Data Architecture

Refers to providing guidelines and direction on the recommended path for retrieving the information needed to support the performance metrics, as well as on implementing the recommended data control and monitoring processes that enforce the data quality goals of each performance metric.

  • Other Governance Tools include:

Data Governance Activity Matrix and Enforcement
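
As a hedged illustration of the Data Quality Tracking element above, the short Python sketch below rolls individual control results up into a simple scorecard. The check names, metric names, and pass/fail results are hypothetical examples; an actual scorecard would normally be produced in the company's existing reporting tooling.

from collections import defaultdict

# Minimal sketch of a data quality scorecard: roll individual control
# results up into a pass rate per metric. Check names and results below
# are hypothetical examples.

check_results = [
    {"metric": "on_time_delivery_rate", "check": "completeness", "passed": True},
    {"metric": "on_time_delivery_rate", "check": "referential_integrity", "passed": True},
    {"metric": "on_time_delivery_rate", "check": "reconciliation", "passed": False},
    {"metric": "gross_margin_pct", "check": "completeness", "passed": True},
]

def scorecard(results):
    """Return the percentage of checks passed for each metric."""
    totals, passes = defaultdict(int), defaultdict(int)
    for r in results:
        totals[r["metric"]] += 1
        passes[r["metric"]] += r["passed"]
    return {m: round(100.0 * passes[m] / totals[m], 1) for m in totals}

for metric, score in scorecard(check_results).items():
    print(f"{metric}: {score}% of checks passed")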

 

Advanced Data Stewardship Services

The Data Stewardship Services Team maintains the company’s data assets so as to improve data quality, reusability, and accessibility; team members are drawn from multiple business units. The Team works closely with the Data Quality analysts and the Security administrators from IT to define the data quality standards and security measures. Data Stewards understand the full front-to-back flow of the data sets, not only from the systems and feeds perspectives but also from the source, user, and usability perspectives.  Therefore, Data Stewards are responsible for developing the processes, policies, and metrics to improve and maintain the quality of data in their domains.

 

Data Quality Analytics

The Data Quality Analytics Team defines the data quality standards and definitions required to meet the metric requirements of the user community.  This includes an initial assessment of the quality of the data from the various systems of record relevant to the requested metrics, as well as the definition of the business and/or technical rules needed to meet the data quality targets expected by the user community.  The Team is also responsible for defining the data quality controls required to support the integrity of the performance metrics managed by the Analytics Team, working with either the Data Stewardship Team or the Data Management Team.

 

Data Quality Controls

Data quality controls are designed, developed, and implemented to provide continual monitoring of the integrity of each metric’s data over time.  Such controls include checks embedded within the ETL processes that enforce the business and technical rules, reconciliation logic that verifies the information captured for each metric is consistent with other information within the company, and referential integrity checks.
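
To make the kinds of controls described above concrete, the Python sketch below shows a minimal referential integrity check and a reconciliation against an independent control total. The table names, column names, and tolerance are hypothetical; a real implementation would be embedded in whatever ETL tooling the company already runs.

# Minimal sketch of two ETL-embedded data quality controls.
# Table names, column names, and the tolerance are hypothetical examples.

def referential_integrity_check(fact_rows, dim_keys, fk_field):
    """Return fact rows whose foreign key has no match in the dimension."""
    return [row for row in fact_rows if row[fk_field] not in dim_keys]

def reconciliation_check(metric_total, control_total, tolerance=0.001):
    """Verify a computed metric against an independent control total."""
    if control_total == 0:
        return metric_total == 0
    return abs(metric_total - control_total) / abs(control_total) <= tolerance

# Example usage with hypothetical trade data.
trades = [
    {"trade_id": 1, "desk_id": "EQ-01", "notional": 1_000_000},
    {"trade_id": 2, "desk_id": "FX-99", "notional": 250_000},   # unknown desk
]
known_desks = {"EQ-01", "EQ-02"}

orphans = referential_integrity_check(trades, known_desks, "desk_id")
balanced = reconciliation_check(
    metric_total=sum(t["notional"] for t in trades),
    control_total=1_250_000,
)

print(f"Referential integrity violations: {len(orphans)}")
print(f"Reconciliation within tolerance: {balanced}")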

 

Data Standardization

Data Standardization refers to the creation and maintenance of standardized definitions for performance metrics, data elements, attributes, and schemas.  Information such as naming standards, data classifications, business rules, data models, data dictionary entries, and data format standards is shared with users across the company through a metadata repository environment.
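
As a sketch of what one standardized definition might look like when captured in such a repository, the Python example below models a single metric definition. The field names and the sample metric are illustrative assumptions, not a prescribed schema.

from dataclasses import dataclass, field

# Illustrative sketch of a standardized metric definition as it might be
# stored in a metadata repository; field names are hypothetical.

@dataclass
class MetricDefinition:
    name: str                      # follows the agreed naming standard
    business_definition: str       # plain-language definition shared with users
    data_classification: str       # e.g. "Internal", "Confidential"
    business_rules: list = field(default_factory=list)
    source_elements: list = field(default_factory=list)
    data_format: str = ""          # agreed format standard

on_time_delivery = MetricDefinition(
    name="ops_on_time_delivery_rate",
    business_definition="Orders delivered by the promised date divided by all orders shipped.",
    data_classification="Internal",
    business_rules=["Exclude cancelled orders", "Use warehouse local time for cutoffs"],
    source_elements=["orders.promised_date", "orders.delivered_date", "orders.status"],
    data_format="percentage, two decimal places",
)

print(on_time_delivery.name, "-", on_time_delivery.business_definition)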

 

Metadata Catalog and Data Integration

The Metadata Catalog area focuses on defining and managing the content required for the Metadata and Metrics Catalog. This also includes integrating the catalog’s metadata with the metadata maintained by the technology teams.  Such integration is critical in facilitating impact and root cause analysis.  Additionally, this area defines the user interface, navigation, and organization of the information contained within the Metadata Catalog to meet the reporting and metrics needs of the company.
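
The sketch below illustrates why that integration matters for impact analysis: once business and technical metadata are linked into a single lineage graph, a change to one source element can be traced to every downstream metric it feeds. The graph contents are a hypothetical example, not an extract from any real catalog.

from collections import deque

# Minimal sketch of impact analysis over integrated metadata (data lineage):
# starting from a changed source element, walk the lineage graph to find
# every downstream object that could be affected.

lineage = {
    "crm.customer.region": ["stg.customer_dim"],
    "stg.customer_dim":    ["mart.sales_by_region", "mart.churn_rate"],
    "erp.orders.amount":   ["stg.order_fact"],
    "stg.order_fact":      ["mart.sales_by_region"],
}

def downstream_impact(changed_node, graph):
    """Breadth-first walk of the lineage graph from the changed element."""
    impacted, queue = set(), deque([changed_node])
    while queue:
        node = queue.popleft()
        for child in graph.get(node, []):
            if child not in impacted:
                impacted.add(child)
                queue.append(child)
    return impacted

print(sorted(downstream_impact("crm.customer.region", lineage)))
# ['mart.churn_rate', 'mart.sales_by_region', 'stg.customer_dim']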

 

Data Flows Services

The Analytics Team provides recommendations to IT representatives and developers on the flow of data from the system of record to each performance metric’s ultimate destination in order to maintain the integrity of each metric.  Such recommendations are provided as part of the design and review process between Analytics and IT. In many companies, particularly those with global locations and exposure, the integrity of the flow process becomes important enough to be assigned to a dedicated individual or team.

 

Data Rationalization Services

As part of the Data Request Process, the Data Management Team provides services related to the rationalization of incoming data requests for new metrics, both to reduce or avoid redundancy and to facilitate the justification and prioritization process, so that the metrics developed are “Metrics That Matter”.

The Rationalization Services Team leverages the metadata repository to determine whether a requested metric is similar to existing metrics managed by the Analytics Team.  These services also provide information that facilitates identification of the system of record for the requested metric, including analysis of each candidate source system and of the granularity and availability of the information required.
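
As a hedged sketch of that first rationalization pass, the Python example below compares a requested metric’s name and definition against existing catalog entries and flags likely duplicates for review. The catalog contents, the string-similarity approach, and the threshold are illustrative assumptions; a real rationalization review would weigh far richer metadata.

from difflib import SequenceMatcher

# First-pass rationalization check: compare a requested metric against
# existing catalog entries and flag likely duplicates for human review.
# Catalog content and the threshold are hypothetical examples.

catalog = {
    "ops_on_time_delivery_rate": "Orders delivered by the promised date divided by all orders shipped.",
    "finance_gross_margin_pct":  "Gross profit divided by net revenue.",
}

def similarity(a, b):
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def rationalize(request_name, request_definition, threshold=0.6):
    """Return existing metrics that look similar enough to warrant review."""
    matches = []
    for name, definition in catalog.items():
        score = max(similarity(request_name, name),
                    similarity(request_definition, definition))
        if score >= threshold:
            matches.append((name, round(score, 2)))
    return sorted(matches, key=lambda m: m[1], reverse=True)

print(rationalize("on_time_delivery_percentage",
                  "Share of orders delivered on or before the promised date."))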
