
Data Quality Management

Data governance encompasses the program management required to manage data consumer expectations and requirements, along with collaborative semantic metadata management. However, the operational nexus is the integration of data quality rules into the business process and application development life cycle. Directly embedding data quality controls into the data production workflows reduces the continual chore of downstream parsing, standardization and cleansing. These controls also alert data stewards to potential issues long before they lead to irreversible business impacts.

Engaging business data consumers and soliciting their requirements allows data practitioners to translate requirements into specific data quality rules. Data controls can be configured with rules and fully incorporated into business applications. Data governance procedures guide data stewards through the workflow tasks for addressing emerging data quality issues. Eliminating the root causes for introducing flawed data not only supports the master data management initiative, it also improves the overall quality of enterprise data. Data quality management incorporates tools and techniques for:

Data quality rules and standards. Providing templates for capturing, managing and deploying data quality rules – and the standards to which the data sets and applications must conform – establishes quantifiable measures for reporting quality levels. Since the rules are derived from data consumer expectations, the measures provide relevant feedback as to data usability.

Data quality controls. Directly integrating data quality controls as part of the application development process means that data quality is “baked in” to the application infrastructure. Enabling rule-based data validation lifts data quality out of a downstream, reactive mode and helps data practitioners address issues within the context of the business application (a minimal sketch of such a rule-based control appears after this list).

Monitoring, measurement and reporting. A direct benefit of data quality rules, standards and controls is the ability to continuously inspect and monitor data sets and data streams for any recognizable issues, and to alert the right set of people when a flaw is detected.

Data quality incident management and remediation. One of the most effective techniques for improving data quality is instituting a framework for reporting, logging and tracking the status of data quality issues within the organization. Providing a centrally managed repository with integrated workflow processes and escalation means that issues are not ignored. Instead, issues are evaluated, investigated and resolved either by addressing the cause or determining other changes to obviate the issue. The visibility into the point of failure (or introduction of a data error) coupled with the details of the data quality rules that were violated help the data steward research the root cause and develop a strategy for remediation.
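
As a minimal sketch of how these ideas fit together, the following Python fragment defines declarative data quality rules, applies them as a validation control at the point of data creation, and computes a conformance measure suitable for monitoring and reporting. The rule names, record layout and thresholds are illustrative assumptions, not the conventions of any particular tool.

    import re
    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class DataQualityRule:
        """A quantifiable rule derived from a data consumer expectation."""
        name: str
        check: Callable[[dict], bool]  # returns True when the record conforms

    # Illustrative rules for a hypothetical customer record.
    RULES = [
        DataQualityRule("postal_code_format",
                        lambda r: bool(re.fullmatch(r"\d{5}(-\d{4})?", r.get("postal_code", "")))),
        DataQualityRule("email_present",
                        lambda r: bool(r.get("email"))),
    ]

    def validate(record: dict) -> list[str]:
        """Control embedded at the point of data creation: returns the names of violated rules."""
        return [rule.name for rule in RULES if not rule.check(record)]

    def conformance(records: list[dict]) -> float:
        """Monitoring measure: the fraction of records passing every rule."""
        return sum(1 for r in records if not validate(r)) / len(records) if records else 1.0

    # Each violation list can feed an incident-management queue and alert data stewards.
    print(validate({"postal_code": "1234", "email": ""}))  # ['postal_code_format', 'email_present']

Because each rule is named and quantifiable, the same definitions can drive validation at data entry, ongoing monitoring, and the incident reports that data stewards work from.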

While one of the proposed benefits of MDM is improved data quality, in reality it’s the other way around: To ensure a quality MDM deployment, establish best practices for proactive data quality assurance.

 

Integrating Identity Management into the Business Process Model

The previous phases – oversight, understanding and control – lay the groundwork for a necessary MDM capability: entity identification and identity resolution. The growing inclusion of data sets from a variety of internal and external sources brings growing variation in how master data entities such as customer, product, vendor or employee are represented. As a result, organizations need high-quality, precise and accurate methods for parsing entity data and linking similar entity instances together.

Similarity scoring, identity resolution algorithms and record linkage are mature techniques that have been refined over the years and are necessary for any MDM implementation. But the matching and linking techniques for identity resolution are just one part of the solution. When unique identification becomes part and parcel of the business process, team members become aware of how their commitment to maintaining high-quality master data adds value across the organization. Identity resolution methods need to be fully incorporated into the business processes that touch master entity data, implying the need for the following (a minimal matching sketch appears after this list):

Enumerating the master data domains. It may seem obvious that customer and product are master data domains, but each organization – even within the same industry – may have numerous data domains that could be presumed to be “mastered.” Entity concepts that are used and shared by numerous organizations are candidate master domains. Use the data governance framework to work with representatives from across the corporation to agree on the master data domains.

Documenting business process models and workflows. Nearly every business process touches at least one master data entity. For an MDM program, it’s critical to understand the flow of business processes – and how those processes are mapped to specific applications. The organization must also be able to determine which applications touch master data entities.

CRUD (create, read, update, delete) characteristics and process touch points. Effective use of master data cuts horizontally across different business functions. Understanding how business processes create, read or update master data entity instances helps the data practitioner delineate expectations for key criteria for managing master data (such as consistency, currency and synchronization).

Data access services. Facilitating the delivery of unobstructed access to a consistent representation of shared information means standardizing the methods for access. Standard access methods are especially important when master data repositories are used as transaction hubs requiring the corresponding synchronization and transaction semantics. This suggests the need to develop a layer of master data services that can be coupled with existing strategies for enterprise data buses or data federation and virtualization fabrics (a minimal service-layer sketch appears at the end of this section).

“Master entity-aware” system development. If one of the root causes for the inadvertent replication of master data stems from siloed application development, the remedy is to ensure that developers use master data services as part of the system development life cycle. Couple the delivery of master data services with the proper training and oversight of application design and development.
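
To make the matching step concrete, here is a minimal, illustrative sketch of similarity scoring for record linkage, using simple name normalization and a string-similarity ratio. Production identity resolution relies on far richer techniques (phonetic keys, probabilistic weighting, multi-attribute matching); the field names and threshold here are assumptions for illustration.

    from difflib import SequenceMatcher

    def normalize(name: str) -> str:
        """Standardize an entity name before comparison (the parsing/standardization step)."""
        return " ".join(name.lower().replace(".", "").replace(",", "").split())

    def similarity(a: str, b: str) -> float:
        """Score two names in [0, 1] after normalization."""
        return SequenceMatcher(None, normalize(a), normalize(b)).ratio()

    def resolve(candidate: dict, master: list[dict], threshold: float = 0.85) -> dict | None:
        """Link a record to an existing master entity instance if one scores above the threshold."""
        best = max(master, key=lambda m: similarity(candidate["name"], m["name"]), default=None)
        if best and similarity(candidate["name"], best["name"]) >= threshold:
            return best
        return None  # no match: the candidate becomes a new master record

    master = [{"id": 1, "name": "Acme Corporation"}, {"id": 2, "name": "Globex, Inc."}]
    print(resolve({"name": "Globex Inc"}, master))  # links to {'id': 2, 'name': 'Globex, Inc.'}

The threshold embodies a business decision about tolerable false matches – exactly the kind of rule a data governance council should own rather than leave to individual developers.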

The methods used for unique identification are necessary but not sufficient for MDM success. Identifying the business applications that touch master data entities is a prelude to exploring how the related business processes can be improved through greater visibility into the master data domains.
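
As an illustration of what a thin layer of master data services might expose, this hypothetical Python sketch wraps a customer master repository behind one standard access method. The class and method names are invented for the example; a production service would add the transaction semantics, synchronization and security controls discussed above.

    class CustomerMasterService:
        """Hypothetical standardized access layer over the customer master repository."""

        def __init__(self, repository: dict[int, dict]):
            self._repo = repository  # stand-in for the actual master data store

        def get_customer(self, customer_id: int) -> dict:
            """Return the unified customer view; every application calls this one method."""
            try:
                return self._repo[customer_id]
            except KeyError:
                raise LookupError(f"No master record for customer {customer_id}")

        def update_contact(self, customer_id: int, email: str) -> None:
            """A single update path, so changes propagate consistently to all consumers."""
            self.get_customer(customer_id)["email"] = email

    # Applications consume the service rather than replicating master data into silos.
    service = CustomerMasterService({101: {"name": "Acme Corporation", "email": "info@acme.example"}})
    service.update_contact(101, "sales@acme.example")
    print(service.get_customer(101))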


Overview

For practical purposes, master data management is a set of disciplines and tools enabling business data consumers to access a unified view of shared data about one or more specific master data domains such as customer or product. Yet, while the technical processes for extraction and consolidation drive the activity in most MDM programs, the actual intent of MDM is to satisfy business consumer needs for access and delivery of consistent, shared data.

Satisfying user needs means “working backward” by understanding how master data is employed within critical business functions and processes – and how that data is to be consumed by the users. This means identifying the key data consumers in the organization and soliciting their needs and expectations (both now and in the future). Information architects must work with the business teams to understand how the organizational mission, policies and strategic performance objectives are related to the use of master data. Finally, it’s necessary to understand how improvements in information sharing will maximize corporate value.

A gap analysis is performed to determine what must be done to the current data environment in order to satisfy future data consumption objectives. Performing this gap analysis helps in two ways. First, it isolates key strategic goals for data sharing that must be put into place before any technical MDM approach can add value. Second, and more importantly, it establishes the value of fundamental data management best practices that benefit the organization beyond the needs of a specific master data project. Effective data management practices penetrate all aspects of information use across the enterprise, such as:

Data governance, which formulates the policies, processes and procedures to ensure that data use complies with explicit business policies; engages business data owners; identifies key data assets to be managed and shared; and delineates specific data requirements and quantifiable measures.

Metadata collaboration, which defines standards for business glossary terms and definitions, representations for conceptual data elements, and alignment of models in ways that will reduce conflicts when data sets are merged into a unified master view.

Data quality management, especially inspecting and monitoring compliance with defined data standards and rules, and integrating services that implement data controls directly into the application infrastructure.

Integration of identity resolution and management within the business process model, which best satisfies the ongoing need for maintaining unified views of specific entities such as customer or product.

Use of these best practices does more than lay the foundation for improved information. These practices highlight the relationship among business processes, information and use. They also emphasize the need to adjust staff member behaviors as access to master data provides greater customer and product data visibility. And while the success of your strategic management plan for MDM must have milestones and deliverables aligned with these disciplines, the organization will directly benefit from each practice area independently.

 

How Data Governance Supports the Data Strategy

Because enabling comprehensive visibility into a composed view of uniquely identifiable entities will continue to be part of the information strategy, there must be a reliable approach for:

  • Ensuring that proper validation and verification are performed and approved as new data is created or acquired.
  • Confirming that enterprise requirements for the quality of shared information are satisfied.
  • Accessing and managing the composed view of shared information within defined security controls.
  • Guaranteeing the consistency, coherence and synchrony of data views composed from a variety of sources.

Data governance provides the foundation for mapping operational needs to the framework of a sound data strategy designed around unified master data domains. There are numerous aspects of instituting a data governance program, including these key practices:

  • Data governance program management – Developing an ongoing program management plan that identifies roles, defines responsibilities, provides templates for key artifacts (such as data policies, data quality requirements and policy deployment plans) and specifies the types of tools to support the data governance program.
  • Data governance operating model – Specifying the organizational structure for operationalizing data governance, the interaction and engagement model for creating a data governance council, the development of ongoing agendas and meeting schedules, and the establishment of measures to ensure that progress continues.
  • Definition and deployment of data policies – Developing a framework for the process and workflows related to drafting, reviewing, approving and deploying data policies, as well as integrating business rule validation within the application infrastructure.
  • Data stewardship – Defining the operational procedures for data controls, inspection, monitoring and issue remediation related to data policy violation.
  • Collaborative agreements – Introducing data governance and data stewardship opens the door for agreeing to existing or newly defined data standards, business glossary terms and definitions, data element concepts, and corresponding data types and sizes. A governance program facilitates these agreements, and collaboration tools can supplement the tasks associated with the metadata management life cycle.
  • Data lineage – Data lineage needs to be mapped from creation (or acquisition) points to the various modification or usage touch points across the application landscape. An accurate lineage map helps in understanding the business application and process impacts of modifications to definitions or underlying data structures. It also enables the insertion of data validity controls for inspecting and monitoring data quality and usability (a minimal lineage-graph sketch appears after this list).
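
As an illustrative sketch only, a lineage map can be represented as a directed graph from creation points to downstream touch points, which turns impact analysis into a reachability question. The node names below are hypothetical.

    from collections import defaultdict

    # Hypothetical lineage map: each edge points from a source to a downstream consumer.
    lineage = defaultdict(list)
    for src, dst in [("crm.customer", "staging.customer"),
                     ("staging.customer", "mdm.customer_master"),
                     ("mdm.customer_master", "billing.invoices"),
                     ("mdm.customer_master", "marketing.campaigns")]:
        lineage[src].append(dst)

    def impacted(node: str) -> set[str]:
        """All downstream touch points affected by a change to `node` (depth-first reachability)."""
        seen, stack = set(), [node]
        while stack:
            for nxt in lineage[stack.pop()]:
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append(nxt)
        return seen

    # A change to the CRM customer definition impacts every downstream consumer.
    print(impacted("crm.customer"))  # all four downstream nodes, in arbitrary set order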

A reasonable investment in metadata management can add value to the organization by facilitating communication about shared concepts across business functions, while reducing variance and complexity. Metadata management also adds value by smoothing the differences between data sourced from functional silos – and paves the way for an effective MDM effort.

The Changing WAN Landscape

Cloud services are at the center of growth for many enterprises. According to IDC, about 80 percent of U.S. companies are considering public or private cloud, and many are actively building out hybrid cloud strategies. This growth in cloud usage is having a transformational impact on IT resource plans. A recent IDC survey revealed a flip in how IT budgets would be allocated over time: at the time of the survey, respondents were spending 58 percent of their IT budget on non-cloud architecture and 42 percent on cloud, and they anticipated that split shifting to 44 percent non-cloud and 56 percent cloud within 24 months.

The research company also found that enterprises already using the cloud expected to increase cloud spending by 34 percent in the subsequent 24 months. Even more importantly, cloud is driving business growth. IDC forecasts that businesses will spend $122 billion on public cloud IT services in 2018.

This investment in cloud-based applications is spawning a new set of networking needs. Traditional WAN architectures were designed for efficiency and performance when applications primarily resided in the data center. With the proliferation of cloud and software-as-a-service (SaaS) applications, the traditional means of connecting branches and users to applications need to change. In short, businesses must look beyond the WAN-connectivity technologies currently in place, particularly Multiprotocol Label Switching (MPLS), to address their needs.

In addition to new ways of connecting users to SaaS and cloud-based applications, enterprises must also ensure that the WAN delivers consistent performance across all sources of connectivity (e.g., DSL, cable, LTE and MPLS), visibility and control for both legacy and cloud-based applications, and faster service provisioning.

 

Times Are Changing

It’s been well over a decade since MPLS rose to prominence, replacing frame relay as the preferred WAN solution. MPLS’ reliability, combined with its ability to deliver on SLAs, helped drive its ascent. MPLS offered reliable access to data-center-based applications (which were predominant at the time), provided support for branch-to-branch communication for voice and video, and could easily handle the small amount of non-critical Internet traffic that passed through the network.

However, in the past five years things have dramatically changed and continue to do so. Applications are moving to the cloud – and the architecture, agility and flexibility that accompany such a transition do not favor MPLS, which is more rigid, expensive and not optimized for cloud application environments. For example, with MPLS, accessing a cloud-based application follows a very different path from accessing a data-center-based application. While MPLS provides branch users with direct access to an application housed in the data center, it creates a circuitous and more expensive path for branch users accessing cloud-based applications: the traffic must first travel over the MPLS network from the branch office to the data center before going out to the Internet, and then return by the same route. This can degrade performance and drive up costs. According to IDC, 90 percent of new applications are being developed specifically for the cloud – a gap that will only continue to grow and render MPLS less effective.

But cost isn’t the only issue. User experience has also become problematic. The challenges of using a traditional MPLS network to connect to cloud-based applications often become apparent when employees, frustrated by application performance at the office, discover that accessing the same cloud-based applications over their home Internet connections is faster.

 

Looking Beyond MPLS

The questions enterprise IT teams are consequently asking are: Is there a way to leverage broadband for the enterprise WAN to make accessing cloud-based applications more efficient and less expensive? Can they introduce multiple sources of connectivity – MPLS, broadband, LTE and so on – without compromising the high level of reliability, security and performance they expect from their traditional WAN architecture?

Finding a solution that combines the flexibility, scalability and cost of broadband with the control and reliability of MPLS seemed an impossible feat. Until now. Enterprises now have a solution: the software-defined WAN (SD-WAN). An SD-WAN resolves many of the shortcomings found in traditional WAN architectures by putting a secure, virtualized overlay on top of the WAN to enable simple, centralized provisioning; application and user visibility; and the simultaneous use of multiple sources of connectivity through dynamic multi-path control. More advanced SD-WAN solutions also deliver consistent performance regardless of the type of connectivity – all while driving down costs significantly. Essentially, an SD-WAN turns the WAN into a geographically distributed LAN, providing the enterprise with a dynamic solution that leverages multiple sources of connectivity, is faster to deploy and can be centrally managed.

Gartner cites four key components of an SD-WAN solution (a minimal path-selection sketch follows this list). SD-WANs:

1. Provide a lightweight replacement for traditional WAN routers and are agnostic to WAN transport (e.g., support MPLS, Internet and LTE).

2. Allow for load sharing of traffic across multiple WAN connections in an efficient and dynamic fashion that can be based on business or application policies.

3. Simplify the complexity associated with management, configuration and orchestration of WANs.

4. Provide secure VPNs and have the ability to integrate additional network services.
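
To illustrate the dynamic multi-path control in item 2, here is a minimal, assumption-laden Python sketch of application-policy-based path selection across multiple WAN links. Real SD-WAN controllers measure latency, loss and jitter continuously and steer traffic per-session or per-packet; the link metrics and policies below are invented for the example.

    from dataclasses import dataclass

    @dataclass
    class Link:
        name: str          # e.g., "mpls", "broadband", "lte"
        latency_ms: float  # measured latency
        loss_pct: float    # measured packet loss
        cost: float        # relative monetary cost

    # Illustrative point-in-time measurements for three sources of connectivity.
    LINKS = [Link("mpls", 20, 0.0, 5.0),
             Link("broadband", 35, 0.5, 1.0),
             Link("lte", 60, 1.0, 8.0)]

    # Hypothetical business policies: per-application constraints plus an optimization goal.
    POLICIES = {
        "voip": {"max_latency_ms": 40, "max_loss_pct": 0.2, "optimize": "latency"},
        "saas": {"max_latency_ms": 80, "max_loss_pct": 1.0, "optimize": "cost"},
    }

    def select_path(app: str) -> Link:
        """Pick the best link that satisfies the application's policy constraints."""
        policy = POLICIES[app]
        eligible = [l for l in LINKS
                    if l.latency_ms <= policy["max_latency_ms"]
                    and l.loss_pct <= policy["max_loss_pct"]]
        key = (lambda l: l.latency_ms) if policy["optimize"] == "latency" else (lambda l: l.cost)
        return min(eligible, key=key)

    print(select_path("voip").name)  # 'mpls': the only link meeting the tight latency/loss policy
    print(select_path("saas").name)  # 'broadband': the cheapest link meeting the looser policy

Re-running the selection as measurements change is what lets an SD-WAN shift traffic dynamically rather than pinning each application to a single circuit.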