Reference And Master Data Management Exam Questions and Answers
An organization chart where a high level manager has department managers with staff and non-managers without staff as direct reports would best be maintained in which of the following?
Options:
A fixed level hierarchy
A ragged hierarchy
A reference file
A taxonomy
A data dictionary
Answer:
B
Explanation:
A ragged hierarchy is an organizational structure where different branches of the hierarchy can have varying levels of depth. This means that not all branches have the same number of levels. In the given scenario, where a high-level manager has department managers with staff and non-managers without staff as direct reports, the hierarchy does not have a uniform depth across all branches. This kind of structure is best represented and maintained as a ragged hierarchy, which allows for flexibility in representing varying levels of managerial relationships and reporting structures.
References:
DAMA-DMBOK2 Guide: Chapter 7 – Data Architecture Management
"Master Data Management and Data Governance" by Alex Berson, Larry Dubov
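To illustrate the idea (this sketch is not part of the exam material, and the node names are hypothetical), a ragged hierarchy can be modeled as a simple nested structure in which branches end at different depths — the manager has both a department manager with staff and a non-manager with no reports:

```python
# A minimal sketch of a ragged hierarchy. Node names are hypothetical.
# One branch is three levels deep (manager -> dept manager -> staff),
# the other only two (manager -> non-manager), so depths are unequal.
org = {
    "High-Level Manager": {
        "Dept Manager A": {"Staff 1": {}, "Staff 2": {}},
        "Non-Manager B": {},
    }
}

def max_depth(node: dict) -> int:
    """Depth of the deepest branch below (and including) this node."""
    if not node:
        return 1
    return 1 + max(max_depth(child) for child in node.values())

def min_depth(node: dict) -> int:
    """Depth of the shallowest branch below (and including) this node."""
    if not node:
        return 1
    return 1 + min(min_depth(child) for child in node.values())

root = org["High-Level Manager"]
# Unequal branch depths are what make the hierarchy "ragged".
print(max_depth(root), min_depth(root))
```

A fixed-level hierarchy would require both branches to bottom out at the same level, which this org chart does not.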
The following is a technique that you can find useful when implementing your Reference and Master Data program:
Options:
Business key cross references
Root Cause Analysis
Process Management
None of the answers is correct
Extract Transformation Load (ETL)
Answer:
A
Explanation:
When implementing a Reference and Master Data Management (RMDM) program, it is crucial to utilize techniques that ensure consistency, accuracy, and reliability of data across various systems. Business key cross-references is one such technique. This technique involves creating a mapping between different identifiers (keys) used across systems to represent the same business entity. This mapping ensures that data can be accurately and consistently referenced, integrated, and analyzed across different systems.
References:
DAMA-DMBOK: Data Management Body of Knowledge (2nd Edition), Chapter 11: Reference and Master Data Management.
"Master Data Management and Data Governance" by Alex Berson and Larry Dubov, which emphasizes the importance of business key cross-referencing in MDM.
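As an illustration (system names and keys below are hypothetical, not from the exam material), a business key cross-reference can be as simple as a lookup table that maps each source system's local key to a single enterprise master key:

```python
# Hypothetical business key cross-reference: the same customer is known
# by different local keys in different source systems, and the table
# maps each (system, local_key) pair to one enterprise master key.
xref = {
    ("CRM", "C-1001"): "MDM-42",
    ("ERP", "98765"):  "MDM-42",
    ("WEB", "u_jane"): "MDM-42",
}

def master_key(system, local_key):
    """Resolve a source-system key to its enterprise master key, or None."""
    return xref.get((system, local_key))

# Records from different systems resolve to the same business entity.
assert master_key("CRM", "C-1001") == master_key("ERP", "98765") == "MDM-42"
```

This mapping is what lets downstream consumers join and analyze data about the same entity regardless of which system it originated in.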
Which of the following is NOT an example of Master Data?
Options:
A categorization of products
A list of account codes
Planned control activities
A list of country codes
Currency codes
Answer:
C
Explanation:
Planned control activities are not considered master data. Here’s why:
Master Data Examples:
Categories and Lists: Master data typically includes lists and categorizations that are used repeatedly across multiple business processes and systems.
Examples: Product categories, account codes, country codes, and currency codes, which are relatively stable and broadly used.
Planned Control Activities:
Process-Specific: Planned control activities pertain to specific actions and checks within business processes, often linked to operational or transactional data.
Not Repeated Data: They are not reused or referenced as a stable entity across different systems.
References:
Data Management Body of Knowledge (DMBOK), Chapter 7: Master Data Management
DAMA International, "The DAMA Guide to the Data Management Body of Knowledge (DMBOK)"
Sharing of Reference and Master data across an enterprise requires which of the following?
Options:
A staging area as an intermediate data store
Maintaining and storing history records
Collaboration between multiple parties internal to the organization
Identification of the business key and surrogate keys
Creation of foreign keys to support dimensions
Answer:
C
Explanation:
Sharing reference and master data across an enterprise requires effective collaboration and communication among various stakeholders within the organization.
Staging Area:
A staging area can be used for intermediate data storage during processing but is not a requirement for sharing data.
Maintaining and Storing History Records:
Historical records are important for auditing and tracking changes but do not directly facilitate the sharing of current reference and master data.
Collaboration Between Multiple Parties Internal to the Organization:
Effective sharing of master and reference data requires collaboration among different departments and stakeholders to ensure data consistency, quality, and governance.
This includes establishing clear communication channels, defining roles and responsibilities, and ensuring alignment on data standards and practices.
Identification of Business Key and Surrogate Keys:
Keys are important for data integration and linking but do not by themselves ensure effective sharing of data.
Creation of Foreign Keys to Support Dimensions:
Foreign keys are used in relational databases to link tables but are not specifically required for the sharing of master data.
An organization's master data can be acquired from an external third-party?
Options:
True
False
Answer:
A
Explanation:
An organization's master data can indeed be acquired from external third parties. Here’s how and why:
Third-Party Data Acquisition:
Enrichment: External data sources can be used to enrich an organization's master data, providing additional details and context.
Accuracy and Completeness: Acquiring data from reputable third-party sources can enhance the accuracy and completeness of master data.
Use Cases:
Market Data: Organizations may purchase market data to complement their internal customer or product data.
Reference Data: Common reference data, such as postal codes or industry classifications, are often obtained from external providers.
Integration:
Data Integration: Master data acquired from third parties needs to be integrated into the organization’s MDM system, ensuring it aligns with existing data standards and governance policies.
References:
Data Management Body of Knowledge (DMBOK), Chapter 7: Master Data Management
DAMA International, "The DAMA Guide to the Data Management Body of Knowledge (DMBOK)"
Which of the following is NOT a metric that can be tied to Reference and Master Data Quality?
Options:
Data sharing usage
The rate of change of data values
Service Level Agreements
Data sharing volume
Operational functions
Answer:
E
Explanation:
Metrics tied to Reference and Master Data Quality generally include:
Data Sharing Usage: Measures how often master data is accessed and used across the organization.
Rate of Change of Data Values: Tracks how frequently master data values are updated or modified.
Service Level Agreements (SLAs): Monitors adherence to agreed-upon service levels for data availability, accuracy, and timeliness.
Data Sharing Volume: Measures the volume of data shared between systems or departments.
Excluded Metric - Operational Functions: While operational functions are important, they are not typically considered metrics for data quality. Operational functions refer to the various tasks and processes performed by systems and personnel but do not directly measure data quality.
References:
Data Management Body of Knowledge (DMBOK), Chapter 7: Master Data Management
DAMA International, "The DAMA Guide to the Data Management Body of Knowledge (DMBOK)"
The concept of tracking the number of MDM subject areas and source system attributes is referred to as:
Options:
Publish and Subscribe
Hub and Spoke
Mapping and Integration
Subject Area and Attribute
Scope and Coverage
Answer:
D
Explanation:
Tracking the number of MDM subject areas and source system attributes refers to defining the scope and coverage of the subject areas and attributes involved in an MDM initiative. This process includes identifying all the data entities (subject areas) and the specific attributes (data elements) within those entities that need to be managed across the organization. By establishing a clear scope and coverage, organizations can ensure that all relevant data is accounted for and appropriately managed.
References:
DAMA-DMBOK2 Guide: Chapter 10 – Master and Reference Data Management
"Master Data Management and Data Governance" by Alex Berson, Larry Dubov
A division of power approach to master data governance provides the benefit of:
Options:
Better alignment of decisions based on varying levels of organizational data sharing
Spreads the blame for bad decisions
Centralizing responsibility
Lower expense
Facilitating a decision by committee model
Answer:
A
Explanation:
Division of Power in Data Governance: This approach distributes decision-making authority across different levels or areas within the organization.
Benefits:
Better alignment of decisions: By distributing power, decisions can be made that are better suited to the specific needs and contexts of different parts of the organization. This ensures that decisions about data management are relevant and effective for each particular area.
Avoids centralization issues: Centralized decision-making can often be disconnected from the needs of different departments or functions.
Improved responsiveness: Decentralized governance can enable faster and more contextually appropriate responses to data management issues.
Other Options Analysis:
Spreads the blame for bad decisions: This is not a strategic benefit but rather a negative consequence.
Centralizing responsibility: This contradicts the concept of division of power.
Lower expense: While decentralization might lead to better decision-making, it doesn't inherently mean lower costs.
Facilitating a decision by committee model: This can lead to slower decision-making processes and isn't the primary benefit of a division of power.
Conclusion: The key benefit of a division of power approach in master data governance is the better alignment of decisions based on varying levels of organizational data sharing.
References:
DMBOK Guide, sections on Data Governance and Organizational Structures.
CDMP Examination Study Materials.
ISO 8000 is a Master Data international standard for what purpose?
Options:
Provides a standard format for defining a model for a data dictionary
Provide guidance only to the Buy side of the supply chain
To replace the ISO 9000 standard
Define and measure data quality
Defines a format to exchange data between parties
Answer:
D
Explanation:
ISO 8000 is an international standard focused on data quality and information exchange. Its primary purpose is to define and measure the quality of data, ensuring that it meets the requirements for completeness, accuracy, and consistency. The standard provides guidelines for data quality management, including requirements for data governance, data quality metrics, and procedures for improving data quality over time. ISO 8000 is not meant to replace ISO 9000, which is focused on quality management systems, but to complement it by addressing data quality specifically.
References:
ISO 8000: Overview and Benefits of ISO 8000, International Organization for Standardization (ISO)
DAMA-DMBOK2 Guide: Chapter 12 – Data Quality Management
The 'consumer' of master data content received from an MDM platform can also be referred to as a:
Options:
Metadata repository
System of record
Single version of the truth
Subscriber
MDM platform
Answer:
D
Explanation:
The 'consumer' of master data content received from an MDM platform is often referred to as a subscriber. Here’s why:
Role of Subscriber:
Data Consumption: A subscriber is an entity (individual, department, or system) that consumes or uses the master data provided by the MDM platform.
Access and Utilization: Subscribers access the master data to support various business functions, ensuring they have consistent and accurate data.
MDM Context:
Data Distribution: In an MDM context, the MDM platform distributes master data to its subscribers, who rely on this data for operational and analytical purposes.
Stakeholders: Subscribers are key stakeholders in the MDM ecosystem, as they benefit directly from the standardized and high-quality data provided.
References:
Data Management Body of Knowledge (DMBOK), Chapter 7: Master Data Management
DAMA International, "The DAMA Guide to the Data Management Body of Knowledge (DMBOK)"
What is the critical need of any Reference & Master Data effort?
Options:
Funding
Metadata
Project Management
Executive Sponsorship
ETL toolset
Answer:
D
Explanation:
The critical need of any Reference & Master Data effort is executive sponsorship. Executive sponsorship provides the necessary authority, visibility, and support for the MDM initiative. Key aspects include:
Strategic Alignment: Ensures that the MDM effort aligns with the organization's strategic goals and objectives.
Resource Allocation: Secures the required funding, personnel, and other resources needed for the MDM program.
Stakeholder Engagement: Facilitates engagement and commitment from key stakeholders across the organization.
Governance and Oversight: Provides governance and oversight to ensure the MDM program adheres to best practices and delivers value.
Without executive sponsorship, MDM initiatives often struggle to gain traction, secure necessary resources, and achieve long-term success.
References:
DAMA-DMBOK: Data Management Body of Knowledge, 2nd Edition.
"Master Data Management and Data Governance" by Alex Berson and Larry Dubov.
International Classification of Diseases (ICD) codes are an example of:
Options:
Industry Reference Data
None of these
Geographic Reference Data
Computational Reference Data
Internal Reference Data
Answer:
A
Explanation:
International Classification of Diseases (ICD) codes are a type of industry reference data.
ICD Codes:
Developed by the World Health Organization (WHO), ICD codes are used globally to classify and code all diagnoses, symptoms, and procedures recorded in conjunction with hospital care.
They are essential for health care management, epidemiology, and clinical purposes.
Industry Reference Data:
Industry reference data pertains to standardized data used within a particular industry to ensure consistency, accuracy, and interoperability.
ICD codes fall into this category as they are standardized across the healthcare industry, facilitating uniformity in data reporting and analysis.
Other Options:
Geographic Reference Data: Includes data like country codes, region codes, and GPS coordinates.
Computational Reference Data: Used in computational processes and algorithms.
Internal Reference Data: Data used internally within an organization that is not standardized across industries.
Master Data Curation is used for improving the overall quality of the data throughout the business by doing the following:
Options:
Providing a look up service for definitions
Recording who owns the data
Performing a data audit
Creating a map of the enterprise data stores
De-duplication of data.
Answer:
E
Explanation:
Master Data Curation is a process aimed at improving the overall quality of data throughout the business. Here’s how:
Data Quality Improvement:
De-duplication: The process involves identifying and eliminating duplicate records to ensure a single, accurate version of each data entity.
Data Cleaning: Removes inaccuracies and inconsistencies, enhancing the reliability of the data.
Benefits of De-duplication:
Accuracy: Ensures that each entity (e.g., customer, product) is represented only once, improving data accuracy and reducing redundancy.
Operational Efficiency: Streamlines operations by eliminating duplicate records that can cause confusion and errors in business processes.
References:
Data Management Body of Knowledge (DMBOK), Chapter 7: Master Data Management
DAMA International, "The DAMA Guide to the Data Management Body of Knowledge (DMBOK)"
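The de-duplication step described above can be sketched in a few lines. This is a deliberately naive illustration (the record fields, the match key, and the "first record wins" survivorship rule are all hypothetical simplifications), but it shows the core idea of collapsing records that represent the same entity:

```python
# Minimal de-duplication sketch. Fields and the matching rule are
# hypothetical: records normalizing to the same (name, email) pair are
# treated as duplicates, and the first seen record survives.
records = [
    {"name": "Jane Doe",   "email": "JANE@EXAMPLE.COM"},
    {"name": "jane doe",   "email": "jane@example.com"},
    {"name": "John Smith", "email": "john@example.com"},
]

def dedupe(rows):
    seen = {}
    for row in rows:
        # Normalize before comparing so case/whitespace differences
        # don't hide duplicates.
        key = (row["name"].strip().lower(), row["email"].strip().lower())
        seen.setdefault(key, row)  # naive survivorship: first record wins
    return list(seen.values())

unique = dedupe(records)
print(len(unique))  # two distinct entities remain
```

Real MDM tools use far richer matching (fuzzy comparison, weighted fields) and configurable survivorship rules, but the quality benefit is the same: one record per real-world entity.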
Which of the following is NOT a possible outcome of a probabilistic matching algorithm?
Options:
Likely Match
Non-match
None of the answers is correct
Underminable match
Match
Answer:
D
Explanation:
Understanding Probabilistic Matching: Probabilistic matching algorithms are used in data matching processes to compare records and determine if they refer to the same entity. These algorithms use statistical techniques to calculate the likelihood of matches.
Possible Outcomes of Probabilistic Matching:
Likely Match: The algorithm determines that the records are probably referring to the same entity based on calculated probabilities.
Non-match: The algorithm determines that the records do not refer to the same entity.
Match: The algorithm determines with high confidence that the records refer to the same entity.
Non-Standard Outcome (D): The term "Underminable match" is not a standard term used in probabilistic matching outcomes. Typically, if the algorithm cannot determine a match or non-match, it might categorize it as a "possible match" or leave it undecided, but not as "underminable."
Conclusion: The term "Underminable match" does not fit into the standard categories of probabilistic matching outcomes.
References:
DMBOK Guide, specifically the sections on Data Quality and Data Matching Techniques.
Industry standards and documentation on probabilistic data matching algorithms.
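In practice, these outcomes are usually derived from a similarity score compared against thresholds. The sketch below is illustrative only — the 0.9 and 0.6 cutoffs are hypothetical values, not standards from the exam material:

```python
# Sketch of mapping a probabilistic similarity score to the standard
# outcome categories. The thresholds (0.9 / 0.6) are hypothetical;
# real matchers tune them per data domain.
def classify(score: float) -> str:
    if score >= 0.9:
        return "Match"
    if score >= 0.6:
        return "Likely Match"  # often routed to a data steward for review
    return "Non-match"

assert classify(0.95) == "Match"
assert classify(0.75) == "Likely Match"
assert classify(0.20) == "Non-match"
```

The middle band ("Likely Match") is where human review typically happens, which is why the question's invented category "underminable match" has no standard counterpart.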
The ISO definition of Master Data quality is which of the following?
Options:
Data meets the objective dimensions but not the subjective dimensions
Data meets all common requirements of all data users
Data is compliant to all international, country, and industry standards
The degree to which the data's characteristics fulfill individual users' requirements
Identifies the company that created and owns the Master Data
Answer:
D
Explanation:
The ISO definition of Master Data quality focuses on the degree to which the data's characteristics meet the requirements of individual users. This implies that quality is subjective and depends on whether the data is suitable and adequate for its intended purpose, fulfilling the specific needs of its users.
References:
ISO 8000-8:2015 - Data quality — Part 8: Information and data quality: Concepts and measuring.
DAMA-DMBOK: Data Management Body of Knowledge (2nd Edition), Chapter 13: Data Quality Management.
Master Data Management resolves uncertainty by clearly stating that:
Options:
To have master data you must focus resources properly
Some entities (master entities) are more important than others
Only those entities in the Enterprise Data Model are considered Master Data.
All entities are equal across an enterprise and need to be managed
Data elements must be stored in a repository before they are considered master data
Answer:
B
Explanation:
Master Data Management (MDM) aims to establish a single, reliable source of key business data (master data). The correct answer here is B, which states that "Some entities (master entities) are more important than others."
Definition of Master Data: Master data refers to the critical data that is essential for operations in a business, such as customer, product, and supplier information.
Significance in MDM: MDM focuses on identifying and managing these key entities because they are vital for business processes and decision-making. This is why these entities are considered more important than others.
Resolution of Uncertainty: By emphasizing the importance of master entities, MDM reduces ambiguity around which data should be prioritized and managed meticulously, ensuring consistency and accuracy across the enterprise.
References:
DAMA-DMBOK: Data Management Body of Knowledge, 2nd Edition.
CDMP Study Guide
The MDM process step responsible for determining whether two references to real world objects refer to the same object or different objects is known as:
Options:
Data Model Management
Data Acquisition
Entity Resolution
Data Sharing & Stewardship
Data Validation. Standardization, and Enrichment
Answer:
C
Explanation:
Entity resolution is a critical step in the MDM process that identifies whether different data records refer to the same real-world entity. This ensures that each entity is uniquely represented within the master data repository.
Data Model Management:
Focuses on defining and maintaining data models that describe the structure, relationships, and constraints of the data.
Data Acquisition:
Involves gathering and bringing data into the MDM system but does not deal with resolving entities.
Entity Resolution:
This process involves matching and linking records from different sources that refer to the same entity. Techniques such as deterministic matching (based on exact matches) and probabilistic matching (based on similarity scores) are used.
Entity resolution helps in deduplication and ensuring a single, unified view of each entity within the MDM system.
Data Sharing & Stewardship:
Focuses on managing data access and ensuring that data is shared responsibly and accurately.
Data Validation, Standardization, and Enrichment:
Ensures data quality by validating, standardizing, and enriching data but does not directly address entity resolution.
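The deterministic-then-probabilistic pattern described above can be sketched briefly. Everything here is illustrative — the field names (`tax_id`, `name`, `city`), the 0.7/0.3 weights, and the 0.85 threshold are hypothetical choices, and `difflib.SequenceMatcher` stands in for the specialized similarity measures real matchers use:

```python
# Illustrative entity resolution: deterministic matching on an exact
# key, with a naive probabilistic fallback. Fields, weights, and the
# threshold are hypothetical.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Case-insensitive string similarity in [0.0, 1.0]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def resolve(rec_a: dict, rec_b: dict) -> bool:
    # Deterministic rule: identical tax IDs mean the same entity.
    if rec_a.get("tax_id") and rec_a.get("tax_id") == rec_b.get("tax_id"):
        return True
    # Probabilistic fallback: weighted name/city similarity score.
    score = (0.7 * similarity(rec_a["name"], rec_b["name"])
             + 0.3 * similarity(rec_a["city"], rec_b["city"]))
    return score >= 0.85

a = {"name": "Acme Corp",  "city": "Boston", "tax_id": "12-345"}
b = {"name": "ACME Corp.", "city": "Boston", "tax_id": "12-345"}
assert resolve(a, b)  # deterministic match on tax_id
```

Records matched this way are then linked (or merged) so the MDM hub holds a single, unified view of each entity.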
What MDM style allows data to be authored anywhere?
Options:
Consolidation
Centralized style
Persistent
Registry style
Coexistence
Answer:
E
Explanation:
Master Data Management (MDM) styles define how and where master data is managed within an organization. One of these styles is the "Coexistence" style, which allows data to be authored and maintained across different systems while ensuring consistency and synchronization.
Coexistence Style:
The coexistence style of MDM allows master data to be created and updated in multiple locations or systems within an organization.
It supports the integration and synchronization of data across these systems to maintain a single, consistent view of the data.
Key Features:
Data Authoring: Data can be authored and updated in various operational systems rather than being confined to a central hub.
Synchronization: Changes made in one system are synchronized across other systems to ensure data consistency and accuracy.
Flexibility: This style provides flexibility to organizations with complex and distributed IT environments, where different departments or units may use different systems.
Benefits:
Enhances data availability and accessibility across the organization.
Supports operational efficiency by allowing data updates to occur where the data is used.
Reduces the risk of data silos and inconsistencies by ensuring data synchronization.
When 2 records are not matched when they should have been matched, this condition is referred to as:
Options:
False Positive
A True Positive
A False Negative
A True Negative
An anomaly
Answer:
C
Explanation:
Definitions and Context:
False Positive: This occurs when a match is incorrectly identified, meaning records are deemed to match when they should not.
True Positive: This is a correct identification of a match, meaning records that should match are correctly identified as matching.
False Negative: This occurs when a match is not identified when it should have been, meaning records that should match are not matched.
True Negative: This is a correct identification of no match, meaning records that should not match are correctly identified as not matching.
Anomaly: This is a generic term that could refer to any deviation from the norm and does not specifically address the context of matching records.
Explanation:
The question asks about a scenario where two records should have matched but did not. This is the classic definition of a False Negative.
In data matching processes, this is a critical error because it means that the system failed to recognize a true match, which can lead to fragmented and inconsistent data.
References:
DAMA-DMBOK: Data Management Body of Knowledge, 2nd Edition, Chapter 11: Master and Reference Data Management.
ISO 8000-2:2012, Data Quality - Part 2: Vocabulary.
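The four outcomes above form a simple confusion matrix, which can be captured as a tiny helper (an illustrative sketch, not part of the exam material):

```python
# Map (truly the same entity?, did the matcher link them?) to the
# standard confusion-matrix terms used in record matching.
def outcome(truly_same: bool, matched: bool) -> str:
    if truly_same and matched:
        return "True Positive"
    if truly_same and not matched:
        return "False Negative"  # the condition described in the question
    if not truly_same and matched:
        return "False Positive"
    return "True Negative"

assert outcome(True, False) == "False Negative"
```

False negatives fragment the master data (one entity appears as two records), while false positives wrongly merge distinct entities; both are monitored in matching quality metrics.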
Should both in-house and commercial tools meet ISO standards for metadata?
Options:
Yes, at the very least they should provide guidance
No, each organization needs to develop its own standards based on its needs
Answer:
A
Explanation:
Adhering to ISO standards for metadata is important for both in-house and commercial tools for the following reasons:
Standardization:
Uniformity: ISO standards ensure that metadata is uniformly described and managed across different tools and systems.
Interoperability: Facilitates interoperability between different tools and systems, enabling seamless data exchange and integration.
Guidance and Best Practices:
Structured Approach: Provides a structured approach for defining and managing metadata, ensuring consistency and reliability.
Compliance and Quality: Ensures compliance with internationally recognized best practices, enhancing data quality and governance.
References:
ISO/IEC 11179: Information technology - Metadata registries (MDR)
Data Management Body of Knowledge (DMBOK), Chapter 7: Master Data Management
DAMA International, "The DAMA Guide to the Data Management Body of Knowledge (DMBOK)"
Which is NOT considered a type of Master Data relationship?
Options:
Customer Household
Survivorship
Fixed-Level Hierarchy
Ragged-Level Hierarchy
Grouping based on common criteria
Answer:
B
Explanation:
Master Data relationships define how different master data entities are related to each other within an organization. These relationships are crucial for understanding and managing the data effectively. The types of master data relationships generally include hierarchies, groupings, and associations that help in organizing and categorizing the data.
Customer Household:
This refers to grouping individual customers into a single household entity. It is commonly used in consumer industries to understand the relationships and dynamics within a household.
Fixed-Level Hierarchy:
A hierarchy with a predetermined number of levels. Each level has a specific position and relationship to other levels, such as organizational hierarchies or product categorization.
Ragged-Level Hierarchy:
Similar to fixed-level hierarchies, but with varying levels of depth. It accommodates entities that may not fit neatly into a fixed-level structure, providing flexibility in the hierarchy.
Grouping based on common criteria:
This involves creating groups or segments of data based on shared attributes or criteria. For example, grouping products by category or customers by region.
Survivorship (NOT a relationship):
Survivorship pertains to the process of determining the most accurate and relevant data when multiple records exist for the same entity. It is a data quality and management process, not a type of relationship.
Is there a standard for defining and exchanging Master Data?
Options:
Yes, ISO 22745
No, every corporation uses their own method
Yes, it is called ETL
No, there are no standards because not everyone uses Master Data
Answer:
A
Explanation:
ISO 22745 is an international standard for defining and exchanging master data.
ISO 22745:
This standard specifies the requirements for the exchange of master data, particularly in industrial and manufacturing contexts.
It includes guidelines for the structured exchange of information, ensuring that data can be shared and understood across different systems and organizations.
Standards for Master Data:
Standards like ISO 22745 help ensure consistency, interoperability, and data quality across different platforms and entities.
They provide a common framework for defining and exchanging master data, facilitating smoother data integration and management processes.
Other Options:
ETL: Refers to the process of Extract, Transform, Load, used in data integration but not a standard for defining master data.
Corporation-specific Methods: Many organizations may have their own methods, but standardized frameworks like ISO 22745 provide a common foundation.
No Standards: While not all organizations use master data, standards do exist for those that do.
Master and Reference Data are forms of:
Options:
Data Mapping
Data Quality
Data Architecture
Data Integration
Data Security
Answer:
C
Explanation:
Master and Reference Data are forms of Data Architecture. Here’s why:
Data Architecture Definition:
Structure and Design: Data architecture involves the structure and design of data systems, including how data is organized, stored, and accessed.
Components: Encompasses various components, including data models, data management processes, and data governance frameworks.
Role of Master and Reference Data:
Core Components: Master and Reference Data are integral components of an organization’s data architecture, providing foundational data elements used across multiple systems and processes.
Organization and Integration: They play a critical role in organizing and integrating data, ensuring consistency and accuracy.
References:
Data Management Body of Knowledge (DMBOK), Chapter 7: Master Data Management
DAMA International, "The DAMA Guide to the Data Management Body of Knowledge (DMBOK)"
Which of the following is a reason why MDM programs are often not successful?
Options:
Too much emphasis on technology rather than people and process components
All of the above
Poor positioning of MDM program responsibility within the IT organization
Not enough business commitment and engagement
MDM initiative is run as a project rather than a program
Answer:
B
Explanation:
MDM programs often face challenges and can fail due to a combination of factors. Here’s a detailed explanation:
Emphasis on Technology:
Technology-Centric Approach: Overemphasis on technology solutions without addressing people and process components can lead to failure. Successful MDM programs require balanced attention to technology, people, and processes.
Positioning within IT:
IT Focus: Poor positioning of the MDM program within the IT organization can lead to it being seen as a purely technical initiative, missing the necessary business alignment and support.
Business Commitment and Engagement:
Lack of Engagement: Insufficient commitment and engagement from the business side can result in inadequate support, resources, and buy-in, leading to failure.
Program vs. Project:
Long-Term Perspective: Treating MDM as a one-time project rather than an ongoing program can limit its effectiveness. MDM requires continuous improvement and adaptation to evolving business needs.
References:
Data Management Body of Knowledge (DMBOK), Chapter 7: Master Data Management
DAMA International, "The DAMA Guide to the Data Management Body of Knowledge (DMBOK)"
What characteristics does Reference data have that distinguish it from Master Data?
Options:
It is more volatile and needs to be highly structured
It is always data from an outside source such as a governing body
It always has foreign database keys to link it to other data
It is less volatile, less complex, and typically smaller than Master Data sets
It provides data for transactions
Answer:
D
Explanation:
Reference data and master data are distinct in several key characteristics. Here’s a detailed explanation:
Reference Data Characteristics:
Stability: Reference data is generally less volatile and changes less frequently compared to master data.
Complexity: It is less complex, often consisting of simple lists or codes (e.g., country codes, currency codes).
Size: Reference data sets are typically smaller in size than master data sets.
Master Data Characteristics:
Volatility: Master data can be more volatile, with frequent updates (e.g., customer addresses, product details).
Complexity: More complex structures and relationships, involving multiple attributes and entities.
Size: Larger in size due to the detailed information and numerous entities it encompasses.
References:
Data Management Body of Knowledge (DMBOK), Chapter 7: Master Data Management
DAMA International, "The DAMA Guide to the Data Management Body of Knowledge (DMBOK)"
What type of interactive system model is most often used for Master Data Management?
Options:
Hub-and-Spoke
Application-coupling
Publish-Subscribe
Synchronized interface
Point-to-point
Answer:
A
Explanation:
The hub-and-spoke model is most often used for Master Data Management because it provides a central hub where master data is maintained, while the spokes represent different systems or applications that interact with the hub. This model allows for efficient management, synchronization, and distribution of master data across the enterprise, ensuring consistency and quality.
References:
DMBOK (Data Management Body of Knowledge), 2nd Edition, Chapter 11: Reference & Master Data Management.
The Data Warehouse Toolkit: The Definitive Guide to Dimensional Modeling by Ralph Kimball and Margy Ross.
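The hub-and-spoke interaction can be sketched as follows. This is a minimal, assumption-laden illustration (class and system names are invented): the hub holds the authoritative master record and pushes synchronized copies to every registered spoke system.

```python
class MasterDataHub:
    """Central hub: maintains the golden record and distributes it."""

    def __init__(self):
        self._master = {}   # authoritative master records, keyed by id
        self._spokes = []   # subscribed spoke systems

    def register_spoke(self, spoke):
        self._spokes.append(spoke)

    def upsert(self, record_id, record):
        # Maintain the master record centrally, then push to every spoke.
        self._master[record_id] = record
        for spoke in self._spokes:
            spoke.receive(record_id, record)


class SpokeSystem:
    """A consuming application that holds a synchronized local copy."""

    def __init__(self, name):
        self.name = name
        self.local_copy = {}

    def receive(self, record_id, record):
        self.local_copy[record_id] = record


hub = MasterDataHub()
crm, billing = SpokeSystem("CRM"), SpokeSystem("Billing")
hub.register_spoke(crm)
hub.register_spoke(billing)

hub.upsert("C-1001", {"name": "Acme Corp"})
# Both spokes now hold an identical, hub-managed copy.
print(crm.local_copy == billing.local_copy)  # True
```

Because every spoke talks only to the hub, adding a new system requires one new connection rather than the N connections a point-to-point model would need.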
Key processing steps for successful MDM include all of the following, with the exception of which processing step?
Options:
Data Indexing
Data Acquisition
Data Sharing & Stewardship
Entity Resolution
Data Model Management
Answer:
AExplanation:
Key processing steps for successful MDM typically include:
Data Acquisition: The process of gathering and importing data from various sources.
Data Sharing & Stewardship: Involves ensuring data is shared appropriately across the organization and that data stewards manage data quality and integrity.
Entity Resolution: Identifying and linking data records that refer to the same entity across different data sources.
Data Model Management: Creating and maintaining data models that define how data is structured and related within the MDM system.
Excluded Step - Data Indexing: While indexing is a critical database performance optimization technique, it is not a primary processing step specific to MDM. MDM focuses on consolidating, managing, and ensuring the quality of master data rather than indexing, which is more about search optimization within databases.
References:
Data Management Body of Knowledge (DMBOK), Chapter 7: Master Data Management
DAMA International, "The DAMA Guide to the Data Management Body of Knowledge (DMBOK)"
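Of the steps above, entity resolution is the most readily illustrated in code. The sketch below is a simplified, deterministic example (source records and field names are hypothetical); real MDM tools typically add probabilistic and fuzzy matching on top of normalization like this.

```python
def match_key(record):
    """Build a deterministic match key from a normalized name plus postcode."""
    name = "".join(record["name"].lower().split())
    return f"{name}|{record['postcode']}"

# Two sources describing the same real-world customer with different ids.
source_a = [{"id": "A1", "name": "Acme Corp", "postcode": "90210"}]
source_b = [{"id": "B7", "name": "ACME corp", "postcode": "90210"}]

# Index one source by match key, then link records from the other source.
index = {match_key(r): r["id"] for r in source_a}
links = [(index[match_key(r)], r["id"])
         for r in source_b if match_key(r) in index]
print(links)  # [('A1', 'B7')]
```

The linked id pairs are exactly the business-key cross-references that let the MDM hub treat "A1" and "B7" as one master entity.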
MDM Harmonization ensures that the data changes of one application:
Options:
Are synchronized with all other applications that depend on that data
Are recorded in the repository or data dictionary
Agree with the overall MDM architecture
Include changes to the configuration of the database as well as the data
Has a data steward to preview the data for quality
Answer:
AExplanation:
Master Data Management (MDM) Harmonization ensures that the data changes of one application are synchronized with all other applications that depend on that data.
MDM Harmonization Definition: This process involves aligning and reconciling data from different sources to ensure consistency and accuracy across the enterprise.
Synchronization: Ensuring that changes in one application are reflected across all dependent applications prevents data inconsistencies and maintains data integrity.
References:
DAMA-DMBOK: Data Management Body of Knowledge, 2nd Edition.
CDMP Study Guide
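The synchronization behavior described above can be sketched minimally. This is an illustrative example under invented names: two dependent applications hold copies of a shared customer attribute, and a harmonization step applies one application's change to every dependent copy.

```python
# Registry of application data stores that depend on shared customer data.
subscribers = []

def depends_on_customer_data(app_store):
    subscribers.append(app_store)
    return app_store

# Two applications start out with consistent copies of the same record.
erp = depends_on_customer_data({"C-1001": {"address": "123 Main St"}})
shipping = depends_on_customer_data({"C-1001": {"address": "123 Main St"}})

def harmonize_change(customer_id, field, value):
    """Propagate one application's change to all dependent applications."""
    for store in subscribers:
        store[customer_id][field] = value

# A change made in one place is synchronized everywhere.
harmonize_change("C-1001", "address", "456 Oak Ave")
print(erp["C-1001"] == shipping["C-1001"])  # True
```

Without the harmonization step, the two stores would silently diverge, which is precisely the inconsistency MDM harmonization exists to prevent.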
Bringing order to your Master Data would solve what?
Options:
20-40% of the need to buy new servers
Distributing data across the enterprise
The need for a metadata repository
60-80% of the most critical data quality problems
Provide a place to store technical data elements
Answer:
DExplanation:
Definitions and Context:
Master Data Management (MDM): MDM involves the processes and technologies for ensuring the uniformity, accuracy, stewardship, semantic consistency, and accountability of an organization’s official shared master data assets.
Data Quality Problems: These include issues such as duplicates, incomplete records, inaccurate data, and data inconsistencies.
Explanation:
Bringing order to your master data, through processes like MDM, aims to resolve data quality issues by standardizing, cleaning, and governing data across the organization.
Effective MDM practices can address and mitigate a significant proportion of data quality problems, as much as 60-80%, because master data is foundational and pervasive across various systems and business processes.
References:
DAMA-DMBOK: Data Management Body of Knowledge, 2nd Edition, Chapter 11: Master and Reference Data Management.
Gartner Research, "The Impact of Master Data Management on Data Quality."
Reference Data Dictionaries are authoritative listings of:
Options:
Master Data entities
External sources of data
Master Data sources
Master Data systems of record
Semantic rules
Answer:
BExplanation:
Definitions and Context:
Reference Data Dictionaries: These are authoritative resources that provide standardized definitions and classifications for data elements.
External Sources of Data: These are data sources that come from outside the organization and are used for various analytical and operational purposes.
Explanation:
Reference Data Dictionaries often contain listings and definitions for data that are used across different organizations and systems, ensuring consistency and interoperability.
They typically include external data sources, which need to be standardized and understood in the context of the organization’s own data environment.
References:
DAMA-DMBOK: Data Management Body of Knowledge, 2nd Edition, Chapter 11: Master and Reference Data Management.
ISO/IEC 11179-3:2013, Information technology - Metadata registries (MDR) - Part 3: Registry metamodel and basic attributes.
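A reference data dictionary of external sources can be sketched as a simple registry. The field names below are hypothetical, loosely modeled on ISO/IEC 11179 registry attributes; the ISO standards named are real external sources of reference data.

```python
# Each entry documents an external, authoritative source of reference data.
reference_data_dictionary = [
    {
        "code_set": "Country Codes",
        "external_source": "ISO 3166-1",
        "steward": "Data Governance Office",
        "refresh_cycle": "annual",
    },
    {
        "code_set": "Currency Codes",
        "external_source": "ISO 4217",
        "steward": "Finance Data Steward",
        "refresh_cycle": "annual",
    },
]

# Look up which external body governs a given code set.
sources = {e["code_set"]: e["external_source"]
           for e in reference_data_dictionary}
print(sources["Currency Codes"])  # ISO 4217
```

Keeping such a dictionary lets the organization trace every reference code set back to the external authority that defines it.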