Implementing a Vendor Master Data governance solution based on BPM, MDM & ECC: Part 3


This part of the blog series focuses on the technical architecture of the Vendor Master Data governance solution built on SAP MDM, BPM and ECC.

We will take the example of a Vendor Master creation scenario: after a request for vendor creation is received, certain approvals are required (such as clearance from Procurement, Finance, etc.) before a vendor record is actually created in the system. So we designed an approval process wherein a requester initiates the creation request by logging on to the portal, then filling in and submitting a vendor creation request form. This request in turn triggers the approval process that we modeled in SAP NetWeaver BPM. We used the MDM Java APIs to perform the various CRUD operations for maintaining global data in MDM, while Remote Function Calls were used to read local data from ECC and PI web services were used to post the local data to ECC.
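As an illustration, creating a vendor record through the MDM Java API typically follows a session–schema–command pattern. The sketch below shows the general shape of such a call; it is not runnable as-is, since it depends on the proprietary SAP MDM Java API jars, and the repository name, table name, and field names ("Vendors", "Name") are placeholders that depend on the actual repository schema. Exact class and method signatures can also vary by MDM support-package level, so treat this as pseudocode against the `com.sap.mdm` packages rather than a definitive implementation.

```java
// Hedged sketch: assumes the SAP MDM Java API (com.sap.mdm.*) jars on the
// classpath and an MDM 7.1 repository reachable at "mdm-server-host".
// Host, repository, credentials, table and field names are placeholders.
import com.sap.mdm.data.Record;
import com.sap.mdm.data.RecordFactory;
import com.sap.mdm.data.commands.CreateRecordCommand;
import com.sap.mdm.schema.RepositorySchema;
import com.sap.mdm.schema.TableSchema;
import com.sap.mdm.session.SessionManager;
import com.sap.mdm.session.UserSessionContext;
import com.sap.mdm.valuetypes.StringValue;

public class VendorCreateSketch {
    public static void main(String[] args) throws Exception {
        // 1. Build a session context and authenticate against the repository.
        UserSessionContext ctx =
            new UserSessionContext("mdm-server-host", "VendorRepository", "Admin");
        SessionManager.getInstance();  // obtain the session manager singleton
        // (an authenticate/createSession call with the password goes here;
        //  its exact form differs between API versions)

        // 2. Read the repository schema to resolve table and field IDs.
        RepositorySchema schema = /* fetched via a GetRepositorySchemaCommand */ null;
        TableSchema vendors = schema.getTableSchema("Vendors");

        // 3. Create an empty record for the main table and set global fields.
        Record record = RecordFactory.createEmptyRecord(vendors.getTable().getId());
        record.setFieldValue(vendors.getFieldId("Name"), new StringValue("ACME Corp"));

        // 4. Execute the create command; MDM assigns the new record's ID.
        CreateRecordCommand create = new CreateRecordCommand(ctx);
        create.setRecord(record);
        create.execute();
    }
}
```

In our solution, calls of this shape were wrapped behind the BPM process steps, so the approval flow never touched the API directly; the RFC reads and PI web-service posts to ECC followed the same wrapper pattern on the local-data side.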

Below is the block architecture diagram showing various linkages between the components:

The MDM web services generator (available starting with MDM 7.1 SP04) can also be used to generate these web services, reducing the complexity and development time otherwise required for writing Java API code.

The next part of this blog series focuses on the challenges faced while implementing this solution.

Part 1 and Part 2 can be reached via the respective part links.


Implementing a Vendor Master Data governance solution based on BPM, MDM & ECC: Part 2

This part of the blog series will focus on the approach that we used to implement the Vendor Master Data governance solution based on SAP MDM, BPM and ECC.

Let's start with an overview of the client's existing process. Before this solution was realized, the client used SAP MDM to master the global attributes of the vendor (name, address data, contact details) and maintained the local attributes (purchasing-area-specific data such as controlling data) directly in the SAP ECC system. An SAP SRM system was also in place but was not tightly integrated with MDM. Process flows varied across the organization, and local processes were followed at local sites across geographies.

We started by gathering inputs on the local processes followed across the organization and studied them in order to propose a common global process for Vendor Master Data maintenance that was not specific to any local site and fulfilled the core requirements in the initial release. We prioritized the requirements based on their business impact, listing the core features that were required along with other nice-to-have features, which we agreed to implement at a later stage as enhancements. This ensured that we stayed on track and delivered the most needed functionality to the business within the given time frame.

When we started the actual realization of the solution, we partly followed an agile development method: we developed the core requirements first and demonstrated the working piece to the client to obtain feedback. This way we involved the different business teams throughout the development stage, accepted (minor) changes as we went, and saved a lot of time compared with making changes at a later stage, while also delighting the customer. In fact, we completed the realization well ahead of schedule, so we went ahead and added the nice-to-have features, which led to a positive customer experience. Once development finished, we had a round of rigorous testing by the functional team, then a user acceptance test, followed by end-user training and finally go-live.

Everything went well during go-live and post-go-live support: we did not receive a single high-priority issue for the solution. It runs efficiently and has helped the Master Data organization process requests quickly while maintaining data quality. The next part of this blog series focuses on the technical architecture used for the solution.

Part 1 of this series can be found here.
