This part of the blog series focuses on the approach we used to implement the Vendor Master Data governance solution based on SAP MDM, BPM and ECC.
Let's start with an overview of the client's existing process. Before this solution was realized, the client used SAP MDM to master the Global attributes of the Vendor (name, address data, contact details) and maintained the Local attributes (purchasing-area-specific data, such as controlling data) directly in the SAP ECC system. They also had an SAP SRM system in place that was not tightly integrated with MDM. Process flows varied across the organization, and local sites across geographies followed their own local processes.
We started by gathering inputs on the local processes followed across the organization and studied them in order to propose a common global process for Vendor Master Data maintenance, one that was not specific to any local site and fulfilled the core requirements in the initial release. We prioritized the requirements by business impact, listing out the core features that were required alongside the "nice to have" features, which we agreed to implement at a later stage as enhancements. This ensured we stayed on track and delivered the most-needed functionality to the business within the given time frame.
When we started the actual realization of the solution, we partly followed an Agile development method: we built the core requirements first and demonstrated the working pieces to the client to obtain feedback on the developments. This way we involved the different business teams right through the development stage, accepted minor changes along the way, and saved a lot of time compared to making those changes at a later stage, all of which contributed to customer delight. In fact, we were able to complete the realization well ahead of schedule, so we went ahead and added the "nice to have" features as well, which made for an even more positive customer experience. Once development finished, we had a round of rigorous testing by the functional team, followed by User Acceptance Testing, end-user training, and finally go-live.
Everything went well during go-live and post-go-live support: we did not receive a single high-priority issue for the solution. It continues to work efficiently and has helped the Master Data organization process requests quickly while maintaining data quality. The next part of this blog series focuses on the technical architecture used for the solution.
Part 1 of this series can be found here.