TOP 70 Informatica MDM Interview Questions


In this article, we share the most frequently asked Master Data Management (MDM) interview questions, covering everything from the fundamentals to the most difficult topics, explained in simple language.

Informatica MDM is a large platform, and it takes time to become comfortable with it. Candidates who study these questions thoroughly tend to perform well both in MDM interviews and on the job.

An Informatica MDM interview usually opens with fundamental questions such as "Explain MDM", "Explain OLAP and OLTP", "What is data warehousing?", and "Define Informatica PowerCenter", before moving on to more advanced topics. All of these are covered below.

1. Explain about MDM?

Ans: Master Data Management (MDM) is a methodology that allows an organization to link all of its essential data into a single file, called a master file. This file serves as a standard base of reference for the organization and helps in making crucial business decisions. Master Data Management also acts as a network to share data among various firms.


2. What does the term MDM mean?

The term MDM (Master Data Management) refers to a comprehensive method that helps an enterprise link its entire critical data to a single master file serving as a common point of reference. Informatica's Master Data Management simplifies data sharing across departments and members. 

3. List out different components of Powercenter?

Ans: The following are the different components of PowerCenter: 

  • Metadata Manager
  • PowerCenter Domain
  • Repository Service
  • Administration Console
  • Integration Service
  • PowerCenter Repository Reports
  • PowerCenter Client
  • Web Services Hub
  • Data Analyzer
  • PowerCenter Repository

4. Explain to us about Data Warehousing?

Ans: A data warehouse is the primary means of gathering and managing information from various sources to provide firms with helpful insights. Using a data warehouse, you can analyze and integrate data from various sources. A data warehouse connects different technologies and components that allow firms to utilize data in a well-organised process. It collects information in digital form for further interpretation, moulds the data into an understandable form, and makes it usable for business users. 

5. Explain to us about various fundamental phases of data warehousing?

Ans: Following are the four fundamental phases of data warehousing: 

  • Offline Operational Databases 
  • Offline data warehouse
  • Real-time data warehouse
  • Integrated data warehouse

Now, let us know about each phase of the data warehouse with useful insights. 

Offline Operational Databases: This is the initial stage of the data warehouse, created by copying the operational system's database onto an offline server. This process doesn't impact the performance of the operational system. 

Offline Data Warehouse: In this phase, data warehouses are updated from the operational data regularly (daily, weekly, or monthly), and the data is stored in a homogeneous, report-oriented structure.  

Real-time Data Warehouse: In this phase, the data warehouse is updated every time the operational system executes a transaction or an event.  

Integrated Data Warehouse: In this last phase, the data warehouse passes generated transactions and activities back to the operational system, which assists the organization in its daily business functionality. 

6. What are the most significant technical and management challenges in adopting MDM?

Ans: Technical teams face challenges when pitching an MDM project and securing the funds for it. Management actively looks for ROI, and they need MDM to deliver quantifiable advantages and profits for their business.

7. What is meant by Dimensional Modelling?

Ans: Dimensional modelling is a data structure technique distinct from the third normal form. The model uses two kinds of tables: fact tables, which hold the business measurements, and dimension tables, which provide the context for those measurements. 

8. What is meant by dimension table?

Ans: A dimension table is a compilation of logic, categories, and hierarchies that a user can traverse through hierarchy nodes. It contains the textual attributes that describe the measurements stored in fact tables. 

9. Explain various methods to load the data in dimension tables?

Ans: There are two methods to load the data in dimension tables. Following are the two methods:

  • Conventional (Slow): Before the data is stored in the dimension table, all constraints and keys are validated against it. This data-integrity check makes the process slow. 
  • Direct (Fast): All constraints and keys are disabled before the data is loaded into the dimension table. Once loading is complete, the data is validated against the constraints and keys; any invalid or irrelevant rows are excluded from the index and from all future operations. 

10. Define fact tables?

Ans: In a data warehouse, a fact table consists of two types of columns: those containing the metrics and measurements of a business process, and foreign keys to the dimension tables. It sits at the centre of the star or snowflake schema, surrounded by dimension tables.

11. Explain the term Mapping?

Ans: Mapping represents the flow of data between sources and targets. It is a set of source and target definitions linked by transformation objects that define the data transformation rules. 

12. Define Mapplet?

Ans: A Mapplet is a reusable element consisting of a set of transformations, which allows that transformation logic to be reused across multiple mappings. 

13. Explain to us about Transformation?

Ans: A transformation is a repository element that helps to generate and modify data. In a mapping, a transformation represents the operations that the Integration Service performs on the data. The data passes through the transformation's ports, which are linked within the mapping. 

14. What is Data Mining?

Ans: Data Mining is the process of analysing the vast amount of data to find valuable hidden insights and compiling it into useful data. It allows the users to discover unknown relationships and patterns between various elements in the data. The useful insights extracted from the data might help you in scientific discovery, marketing, fraud detection, and many more. 

15. List out various objects that cannot be used in the Mapplets?

Ans: Following are the various objects that cannot be used in mapplets: 

  • COBOL source definition
  • Normalizer transformations
  • Joiner transformations
  • Post or pre-session stored procedures
  • Non-reusable Sequence Generator transformations
  • XML source definitions
  • Target definitions
  • IBM MQ source definitions
  • PowerMart 3.5-style Lookup functions

16. What are the foreign key columns in fact and dimension tables?

Ans: In a fact table, the foreign keys are the primary keys of the dimension tables; in a dimension table, the foreign keys are the primary keys of the entity tables. 
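This foreign-key relationship can be sketched with an in-memory SQLite database; the table and column names below are illustrative, not from any Informatica schema:

```python
import sqlite3

# In-memory star-schema sketch: the dimension table's primary key
# appears as a foreign key in the fact table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (
        product_id INTEGER PRIMARY KEY,
        product_name TEXT
    );
    CREATE TABLE fact_sales (
        sale_id INTEGER PRIMARY KEY,
        product_id INTEGER REFERENCES dim_product(product_id),  -- FK to dimension
        amount REAL
    );
    INSERT INTO dim_product VALUES (1, 'Widget'), (2, 'Gadget');
    INSERT INTO fact_sales VALUES (10, 1, 99.5), (11, 1, 10.0), (12, 2, 25.0);
""")

# Joining on the foreign key resolves the textual context for each measurement.
rows = conn.execute("""
    SELECT d.product_name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d ON f.product_id = d.product_id
    GROUP BY d.product_name
    ORDER BY d.product_name
""").fetchall()
print(rows)  # [('Gadget', 25.0), ('Widget', 109.5)]
```

The join through the foreign key is exactly how a fact's numeric measurements pick up their descriptive context from the dimension table.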


17. Explain different ways used in Informatica to switch one environment to another?

Ans: Following are the different ways used in Informatica to switch one environment to another:

  • By copying folder/objects
  • By exporting the repository and importing it into the new environment
  • By dumping every mapping to XML and using them in a new environment
  • Informatica deployment groups

18. Differentiate Mapping variables and Mapping parameters?


  • Mapping Variable: It is dynamic and can change value during the session. PowerCenter reads the initial value of the mapping variable at the start of the session and stores the final value in the repository when the session completes; that stored value is used the next time the session runs. 
  • Mapping Parameter: It is static; you define its value before running the session, and the Integration Service keeps that same value throughout the session and after it ends. 
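The difference can be illustrated with a toy Python sketch. This is not Informatica's API; the `Session` class and repository file below are purely hypothetical stand-ins showing how a variable's final value persists between runs while a parameter stays constant:

```python
import json
import os
import tempfile

# Toy model: a "parameter" is fixed per run; a "variable" is read at session
# start and its final value is written back to a repository file for next time.
class Session:
    def __init__(self, repo_path, param):
        self.repo_path = repo_path
        self.param = param                  # mapping parameter: constant
        self.var = self._load()             # mapping variable: persisted

    def _load(self):
        if os.path.exists(self.repo_path):
            with open(self.repo_path) as f:
                return json.load(f)["last_value"]
        return 0                            # initial value on the first run

    def run(self, rows_processed):
        self.var += rows_processed          # variable changes mid-session
        with open(self.repo_path, "w") as f:
            json.dump({"last_value": self.var}, f)
        return self.param, self.var

repo = os.path.join(tempfile.mkdtemp(), "repo.json")
first = Session(repo, param=100).run(5)
second = Session(repo, param=100).run(3)
print(first, second)  # (100, 5) (100, 8) -- the variable carried over
```

Note how the parameter stays 100 in both runs while the variable resumes from its last stored value, mirroring the behaviour described above.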
19. Explain various ways to eliminate duplicate records from Informatica?

Ans: Following are the different ways to delete the duplicate records from Informatica: 

  • By selecting the Select Distinct option in the Source Qualifier
  • By overriding the SQL query in the Source Qualifier
  • By using an Aggregator transformation and grouping by all fields
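The "group by all fields" idea can be sketched in plain Python: grouping rows on every column keeps exactly one copy of each distinct row, which is essentially what the Aggregator approach does (illustrative only):

```python
# Grouping by the whole row (all fields) keeps one copy per distinct row,
# mimicking an Aggregator transformation with group-by on every port.
rows = [
    ("alice", "NY", 30),
    ("bob", "LA", 25),
    ("alice", "NY", 30),   # exact duplicate
]

def dedupe(records):
    seen, unique = set(), []
    for rec in records:          # group key = the entire row
        if rec not in seen:
            seen.add(rec)
            unique.append(rec)   # first occurrence wins, order preserved
    return unique

print(dedupe(rows))  # [('alice', 'NY', 30), ('bob', 'LA', 25)]
```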
20. How to find invalid mappings in a folder?

Ans: You can find invalid mappings in a folder by running a query against the repository tables.
21. Explain different repositories that can be created using the Informatica Repository Manager?

Ans: Here are the various types of repositories that you can create in Informatica. 

  • Standalone Repository: It is a single repository that operates individually, which is not associated with other repositories. 
  • Global Repository: It is a central repository and carries shared objects over various repositories of a domain. 
  • Local Repository: A local repository stays within the domain. It can connect to the global repository through global shortcuts and make use of the objects in its shared folders.

22. Explain different data movement modes that are available in Informatica Server?

Ans: Data movement modes provide a set of instructions to handle the process of character data. Based on that, you can determine the data movement from Informatica server configuration settings. Here are the two data movement modes:

  • ASCII Mode
  • Unicode Mode

23. Explain different types of Locks that are used in Informatica MDM 10.1?

Ans: Following are the two types of locks used in Informatica MDM 10.1:

  • Exclusive Lock: allows a single user to access and make changes to the underlying metadata.
  • Write Lock: allows multiple users to access and make changes to the underlying metadata at the same time. 

24. List out the tools that do not require a Lock in Informatica MDM?

Ans: Following is the list of tools that do not require a lock:

  • Hierarchy Manager
  • Data Manager
  • Merge Manager
  • Audit Manager

25. List out the tools that require Lock in Informatica MDM?

Ans: Following are the tools that require a lock to make configuration changes to the MDM Hub Master Database in Informatica MDM:

  • Message Queues
  • Users
  • Databases
  • Tool Access
  • Security Providers
  • Repository Manager

26. Explain about OLAP and OLTP?


  • Online Analytical Processing (OLAP) is a technology that works in the background of many business intelligence (BI) applications. It manages, gathers, and transforms multidimensional data for analysis. 
  • Online Transaction Processing (OLTP) is a process that manages day-to-day transactional data stored in normalized tables. OLTP applications are designed to insert and modify data as part of routine operations rather than to analyze it. 
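The contrast can be sketched with SQLite: many small row-level writes on the OLTP side versus one aggregate read on the OLAP side (the table and column names are illustrative):

```python
import sqlite3

# OLTP vs OLAP sketch: many small transactional writes, then one
# analytical aggregate query over the accumulated data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")

# OLTP-style: individual transactions, one row at a time.
for i, (region, amount) in enumerate([("east", 10.0), ("west", 20.0), ("east", 5.0)]):
    with conn:  # each insert commits as its own transaction
        conn.execute("INSERT INTO orders VALUES (?, ?, ?)", (i, region, amount))

# OLAP-style: one analytical query summarising many rows.
summary = conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(summary)  # [('east', 15.0), ('west', 20.0)]
```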
27. What is the expiration module of automatic look in Informatica MDM?

Ans: The hub console refreshes the current connection every 60 seconds. A user can release a lock manually. If a user switches to another database while holding a lock, the lock is released automatically. The lock also expires within a minute if the user terminates the hub console.

28. Explain various components of the Informatica hub console?

Ans: Following are the components of Informatica hub console:

  • Design Console: This element helps in configuration solution during the deployment stage and allows the changes according to the requirements in the ongoing configuration. 
  • Data Steward Console: This component is used to review consolidated data and to check data queued for exception handling. 
  • Administration Console: This component is used to assign various database administrative activities and role-based security. 

29. List the tables that can be linked to the staging data?

Ans: Following are the multiple tables that can integrate with the staging data in MDM. They are:

  • Raw Table
  • Staging Table
  • Landing Table
  • Rejects Table

30. Tell us about various loading phases in MDM?

Ans: Following are the sequential phases in which data is loaded into the Hub Store:

  • Land
  • Stage
  • Load
  • Match
  • Consolidate
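These phases can be sketched as a minimal Python pipeline. The cleansing rules and the match key below (matching on a lowercased email) are illustrative assumptions, not actual MDM match rules:

```python
# Minimal sketch of the Land -> Stage -> Load -> Match -> Consolidate flow.
landing = [
    {"name": " Alice Smith ", "email": "ALICE@X.COM"},
    {"name": "alice smith",   "email": "alice@x.com"},
    {"name": "Bob Jones",     "email": "bob@y.com"},
]

def stage(rows):                       # cleanse / standardise
    return [{"name": r["name"].strip().title(),
             "email": r["email"].lower()} for r in rows]

def load(rows):                        # load into the base-object "table"
    return list(rows)

def match(rows):                       # group records sharing a match key
    groups = {}
    for r in rows:
        groups.setdefault(r["email"], []).append(r)
    return groups

def consolidate(groups):               # keep one surviving record per group
    return [recs[0] for recs in groups.values()]

golden = consolidate(match(load(stage(landing))))
print(len(golden))  # 2 golden records survive from 3 landing rows
```

The two "Alice" rows standardise to the same email during staging, match into one group, and consolidate into a single golden record.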

31. Tell us about the Informatica Powercenter?

Ans: Informatica PowerCenter is data integration software developed by Informatica Corporation. It is a widely used ETL (Extract, Transform, Load) tool for building organizational data warehouses. The components of Informatica PowerCenter assist in extracting data from multiple sources, transforming it according to business requirements, and loading it into the target data warehouse.


32. Describe all the biggest management and technical challenges in adopting MDM?

Some of the most difficult challenges in adopting MDM include: 

  • Model Agility – the selected data management model must be agile to suit the business operations and for seamless data integration.
  • Data Governance – a strong data governance strategy must be used to identify, capture, measure, and rectify data quality issues in the source system. 
  • Data Standards – the data standard set for the master data must also be suitable for all other data types in the organisation.
  • Data Integration – accurate data integration policies must be set to avoid errors in the data integrations process which may result in loss of data during the transfer.
  • Data Stewardship – data stewardship is an important step towards maintaining data quality.

33. What is Data Warehousing?

Data warehouses are a trove of information updated periodically. They play a key role in the decision making of any business, as they contain all the data relating to the processes, transactions, etc., of a company and serve as a deciding factor. Data warehousing allows data analysts to perform analysis, execute complex queries on structured data sets, and conduct data mining. Data warehouses help identify the current position of the business by comparing all factors. 

34. Define Dimensional Modeling?

Dimensional modeling is an important data structure technique to optimise data storage in data warehouses. Dimensional modeling includes Facts tables and Dimension tables determining measurements of the business (Facts table) and dimensions and other calculations of the measurement (Dimension table).

35. Describe various fundamental stages of Data Warehousing?

The data warehousing stages play an integral role in determining the changes in data in the warehouse. The fundamental stages of data warehousing are: 

  • Offline Operational Databases – it is the first fundamental stage in data warehousing. The database development of an operational system to an offline server takes place in this stage by copying the databases.
  • Offline Data Warehouse – the data warehouses are updated periodically from the operational systems and the data is stored in a data structure that is reporting-oriented. 
  • Real-time Data Warehouse – data warehouses are updated each time the operational system executes a particular transaction or event. 
  • Integrated Data Warehouse – it is the last stage of data warehousing. The activity or transaction is passed back into the operational system, and the generated transactions are ready for an organisation's regular use. 

36. Define Informatica PowerCenter.

Informatica PowerCenter is an ETL (Extract, Transform, and Load) tool used for enterprise data warehouses. PowerCenter helps extract data from the selected source, transforms it, and loads it into the chosen data warehouse. Informatica PowerCenter consists of client tools, a server, a repository, and a repository server as its core components. It executes tasks generated by the Workflow Manager and allows mapping using the Mapping Designer. 

37. Name various components of Informatica PowerCenter.

There are many components that form the foundation of Informatica PowerCenter. They include: 

  • PowerCenter Repository
  • PowerCenter Domain
  • PowerCenter Client 
  • Administration Console 
  • Integration Service 
  • Repository Service 
  • Data Analyser 
  • Web Services Hub
  • PowerCenter Repository Reports 
  • Metadata manager

38. What is Mapping?

Mapping can be described as a set of target definitions and sources that are connected with transformation objects that define data transformation rules. Mapping represents the flow of data between the targets and sources.

39. What is a Mapplet?

A Mapplet is a reusable object consisting of transformations which can be reused in multiple mappings. A Mapplet can be created in the Mapplet Designer.

40. What is Transformation?

Informatica transformations are objects in the repository capable of reading, modifying, or passing data to defined target structures such as tables, files, etc. Transformations represent a set of rules determining data flow and data loading into targets.

41. What is Data Mining?

Data Mining is also known as knowledge discovery in data (KDD) for the reason that it involves sorting through and performing complex analysis procedures on multiple data sets to discover underlying information crucial for business growth.

42. What is a Fact Table?

In data warehousing, a fact table is present at the centre of a star schema and contains quantitative information relating to the metrics, measurements, or facts of a business process.

43. What is a Dimension Table?

A dimension table is part of the star, snowflake, or starflake schema in data warehousing. Dimension tables contain the descriptive attributes of a fact and are connected to the fact table. Dimension tables form an integral component of dimensional modelling. 

44. How do you connect the foreign key columns in dimension and fact tables?

Each dimension table contains a primary key that corresponds to a foreign key column in the fact table; the fact table's key is typically a composite of these foreign key columns.

45. Describe different methods to load dimension tables.

The methods of loading dimension tables are: 

  • Conventional Loading – all the table constraints and keys are checked before loading the data. 
  • Direct Loading – all the constraints are disabled for direct data loading. This loading process checks table constraints post-data loading and only indexes the qualified data. 
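A rough sketch of the two strategies in Python, with a stand-in constraint check (illustrative only):

```python
# Conventional loading validates each row before insert; direct loading
# bulk-loads first and validates afterwards.
def valid(row):
    return row.get("key") is not None   # stand-in for key/constraint checks

incoming = [{"key": 1}, {"key": None}, {"key": 3}]

def conventional_load(rows):
    table = []
    for r in rows:
        if valid(r):                    # validate up front (slower per row)
            table.append(r)
    return table

def direct_load(rows):
    table = list(rows)                  # bulk load with checks disabled
    # validate after loading; failing rows are excluded from the result
    return [r for r in table if valid(r)]

print(conventional_load(incoming))      # [{'key': 1}, {'key': 3}]
print(direct_load(incoming))            # [{'key': 1}, {'key': 3}]
```

Both strategies end with the same qualified rows; they differ in when the validation cost is paid.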

46. Name various objects that can’t be used in a mapplet.

Objects that cannot be used in a mapplet include:

  • COBOL source definition
  • Target definitions
  • IBM WMQ source definitions 
  • XML source definitions
  • Joiner transformations 
  • Normaliser transformations 
  • Non-reusable Sequence Generator transformations 
  • PowerMart 3.5-style Lookup functions
  • Post or pre-session stored procedures 

47. Define different ways used in Informatica to migrate from one environment to another.

The following are the ways to migrate from one environment to another in Informatica: 

  • The repository can be exported and then imported into the new environment 
  • Using Informatica deployment groups 
  • Copying folders or objects 
  • All mappings can be exported to XML and later imported into a new environment

48. What are the ways for deleting duplicate records in Informatica?

Duplicate records can be deleted from Informatica by:

  • Using Select Distinct in the Source Qualifier 
  • Using an Aggregator and grouping by all fields
  • Overriding the SQL query in the Source Qualifier 

49. Differentiate between variable and mapping parameters.

  • A mapping parameter holds a constant value before running the session and maintains the same constant value throughout the complete session. The value of the mapping parameter can be updated through the parameter file.
  • A mapping variable does not hold a constant value. The value in a mapping variable can change through the session. Value of the mapping variable is stored by the Informatica server in the repository at the end of each successful session and the same value is used in the next session. 

50. Describe various repositories that can be generated using Informatica Repository Manager.

There are four types of repositories that can be generated using Informatica Repository Manager:

  • Global Repository – the global repository acts as an information hub and stores common objects that can be used by multiple developers through shortcuts. The objects may be operational, application source definitions, reusable transformations, mapplets, and mappings. 
  • Local Repository – local repositories are usually used in the case of development. A local repository facilitates creating shortcuts to objects in shared folders in the global repository. These objects may include source definitions, lookups, common dimensions, and enterprise standard transformations. Copies of the objects can also be created in non-shared folders.
  • Version Control – versioned repositories store multiple versions of an object and each version acts as a separate object with individual properties. PowerCenter’s version control feature allows developing, testing, and deploying metadata into productions.
  • Standalone Repository – a standalone repository is not related to any other repositories and functions by itself. 

51. How to find all the invalid mappings in a folder?

The invalid mappings in a folder can be found by querying the repository tables.

52. Name various data movement modes in Informatica.

The data movement mode determines how the Informatica server handles character data. It can be selected in the Informatica server configuration settings. There are two modes of data movement in Informatica: 

  • ASCII mode and 
  • Unicode mode

53. What is OLTP?

OLTP is the abbreviation of Online Transaction Processing. OLTP involves capturing, storing and processing data from multiple transactions in real-time. All the transaction data is stored in a database.

54. Describe the parallel degree of data loading properties in MDM.

In Informatica MDM, the parallel degree of data loading property specifies the degree of parallelism set on the base object table and its related tables. Though it does not affect all batch processes, it can significantly improve performance where it applies. The appropriate degree depends on the number of CPUs on the database server and the available memory. The default value is 1. 
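The idea of a configurable degree of parallelism can be sketched with Python's thread pool, where `max_workers` plays the role of the parallel degree. This is illustrative only, not how MDM implements it internally:

```python
from concurrent.futures import ThreadPoolExecutor

# "Degree of parallelism" modelled as the number of workers processing
# batch chunks concurrently (degree 1 = serial, the default).
def process_chunk(chunk):
    return sum(chunk)                 # stand-in for per-chunk batch work

chunks = [[1, 2], [3, 4], [5, 6]]

def run_batch(degree):
    with ThreadPoolExecutor(max_workers=degree) as pool:
        return list(pool.map(process_chunk, chunks))

print(run_batch(1))  # [3, 7, 11]  serial, like the default degree of 1
print(run_batch(3))  # [3, 7, 11]  same result, chunks processed in parallel
```

Raising the degree changes throughput, not results, which is why the right value depends on available CPUs and memory rather than on the data itself.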


55. Explain various types of LOCK used in Informatica MDM 10.1.

Informatica MDM 10.1 has two types of LOCK:

  • Exclusive LOCK – Exclusive LOCK allows only one user to make changes to the underlying operational reference store. 
  • Write LOCK – Write LOCK allows multiple users to make changes to the underlying metadata, at the same time.

56. What is the expiration module of automatic lock-in Informatica MDM?

The hub console is refreshed in the current connection every minute i.e, every 60 seconds. A lock can be manually released by a user. If a user switches to another database while holding a lock, the lock is released automatically. If a user terminates the hub console, the lock expires after one minute. 


57. Name the tools which do not require Lock in Informatica MDM.

Tools which do not require Lock in Informatica include: 

  • Merge manager
  • Audit manager
  • Data manager and 
  • Hierarchy manager

58. Name various tools that require LOCK in Informatica MDM.

In Informatica, some tools require LOCK to make configuration changes to the database of the Hub Master in MDM. These tools are:

  • Tool Access
  • Message Queues
  • Security Providers 
  • Databases
  • Users, and 
  • Repository Manager

59. Name the tables that are linked with staging data in Informatica MDM.

The tables linked with staging data in Informatica are:

  • Raw Table
  • Landing Table
  • Rejects table, and 
  • Staging Table

60. What is OLAP?

OLAP (Online Analytical Processing) software performs multidimensional analysis on large volumes of data. It collects, processes, manages, and presents data for analysis and management. 

61. What are the processes involved in Informatica MDM?

The data from different sources undergoes complex processing; the processes in Informatica MDM include:

  • Landing – the data is acquired from the source system and pushed into the MDM landing tables. 
  • Staging – all the data in the landing tables is cleansed, standardised and then pushed into the MDM staging tables. 
  • Load – the data from the staging table is collected and loaded into the BO table.
  • Tokenization – the tokenization process is used after the configuration of match rules to generate match tokens. 
  • Match – the match process plays an integral role in helping match the records.
  • Merge or Consolidation – all the records that have been matched are consolidated during the merge process.
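The tokenization and match steps can be sketched in Python. The match token here (lowercased last name plus first initial) is a deliberately crude illustrative assumption; real MDM match rules are far richer:

```python
# Toy tokenization + match: each record gets a match token, and records
# sharing a token become match candidates for the merge step.
records = [
    {"rowid": 1, "first": "Jon",  "last": "Smith"},
    {"rowid": 2, "first": "John", "last": "smith"},
    {"rowid": 3, "first": "Mary", "last": "Jones"},
]

def match_token(rec):
    # crude key: lowercased last name + first initial
    return rec["last"].lower() + ":" + rec["first"][0].lower()

tokens = {}
for rec in records:                         # tokenization step
    tokens.setdefault(match_token(rec), []).append(rec["rowid"])

matches = [ids for ids in tokens.values() if len(ids) > 1]  # match step
print(matches)  # [[1, 2]] -- rows 1 and 2 are candidates for consolidation
```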

62. What is a stage process?

The stage process involves the transfer of source data from the landing tables to the stage tables. The stage process is completed using the stage mapping between the landing table and the stage table. Data cleansing and standardisation is also done during the stage process. 
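A minimal sketch of the cleansing and standardisation that happens between the landing and stage tables; the column names and rules are illustrative:

```python
# Stage-mapping sketch: rows are cleansed and standardised in transit
# from the landing table to the stage table.
landing_table = [
    {"cust_name": "  john DOE ", "phone": "(555) 123-4567"},
    {"cust_name": "JANE ROE",    "phone": "555.987.6543"},
]

def cleanse(row):
    digits = "".join(ch for ch in row["phone"] if ch.isdigit())
    return {
        "cust_name": " ".join(row["cust_name"].split()).title(),  # trim + title-case
        "phone": digits,                                          # digits only
    }

stage_table = [cleanse(r) for r in landing_table]
print(stage_table)
# [{'cust_name': 'John Doe', 'phone': '5551234567'},
#  {'cust_name': 'Jane Roe', 'phone': '5559876543'}]
```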


With this article, we aim to cover the most frequently asked interview questions and encourage learners to make the most of the answers listed above to clear the interview with maximum knowledge and confidence. If we have missed an important question, please mention it in the comments so that we can add it to the article. 
