Daniel Freed of Merck discussed their SAP implementation, and how their integration strategy uses TIBCO to integrate with non-SAP systems. As with Connie Moore’s presentation this morning, the room was packed (I’m sitting on the floor, and others are standing around the perimeter of the room), and I have to believe that TIBCO completely underestimated attendees’ interest in BPM, since we’re in a room that is half the size (or less) of those for some of the other streams. Of course, this presentation is really about application integration rather than BPM…
They have four main integration scenarios:
- Master data replication (since each system expects to maintain its own data, but SAP is typically the true master data source), both event-driven publish-subscribe and batch point-to-point.
- Cross-system business process, using event-driven publish-subscribe and event-driven point-to-point.
- Analytical extraction/consolidation with batch point-to-point from operational systems to the data warehouse.
- Business to business, with event-driven point-to-point as well as event-driven publish-subscribe and batch point-to-point.
They have some basic principles for integration:
- Architect for loosely coupled connectivity, in order to increase flexibility and improve BPM; the key implications are that they needed to move from point-to-point integrations to a hub-and-spoke architecture, publish from the source to all targets rather than chaining from one system to another, and use canonical data models.
- Leverage industry standards and best practices
- Build and use shared services
- Architect for "real-time business" first
- Proactively engage the business in considering new opportunities enabled by new integration capabilities
- Architect to insulate Merck from external complexity
- Design for end-to-end monitoring
- Leverage integration technology to minimize application remediation (i.e., changes to SAP) required to support integration requirements
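The loose-coupling principle above can be sketched in a few lines of Python: a hub that holds subscriptions and fans messages out, with each source mapping its local format into a canonical model before publishing. This is a minimal illustration only; the names (`Hub`, `sap_to_canonical`, the IDoc-style field names) are my own invented placeholders, not Merck’s or TIBCO’s actual artifacts.

```python
from collections import defaultdict

class Hub:
    """Minimal hub-and-spoke publish-subscribe sketch."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, canonical_message):
        # The source publishes once; the hub fans out to every target,
        # so no system needs to know about (or chain to) any other.
        for handler in self._subscribers[topic]:
            handler(canonical_message)

def sap_to_canonical(idoc):
    # Map the source system's local structure to the canonical model
    # at the edge, so every subscriber sees the same shape.
    return {"material_id": idoc["MATNR"], "description": idoc["MAKTX"]}

hub = Hub()
received = []
hub.subscribe("material.master", received.append)   # e.g. a CRM feed
hub.subscribe("material.master", received.append)   # e.g. a warehouse feed

hub.publish("material.master",
            sap_to_canonical({"MATNR": "100042", "MAKTX": "Aspirin 500mg"}))
```

The point of the sketch is the shape, not the code: adding a new target system is one `subscribe` call, and no publisher changes at all.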
SAP, of course, isn’t just one monolithic system: Merck is using multiple SAP components (ECC, GTS, SCM, etc.) that have out-of-the-box integration provided by SAP through Process Integrator (PI), and Merck doesn’t plan to switch out PI for TIBCO. Instead, PI bridges to TIBCO’s bus, then all other applications (CRM, payroll, etc.) connect to TIBCO.
Gowri Chelliah of HCL (the TIBCO partner involved in the project) then discussed some of the common services that they developed for the Merck project, including auditing, error handling, cross-referencing, monitoring, and B2B services. He covered the error handling, monitoring, cross-reference and B2B services in more detail, showing the specific components, adapters and technologies used for each.
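Of the shared services mentioned, cross-referencing is the easiest to picture: each system keeps its own local identifier for the same business object, and the service translates between them. Here is a rough sketch of the idea, assuming a simple in-memory lookup; the class and field names are illustrative, not the actual HCL/Merck service.

```python
class CrossReferenceService:
    """Translate one system's local ID for a business object into another's."""

    def __init__(self):
        self._to_key = {}    # (object_type, system, local_id) -> shared key
        self._from_key = {}  # (object_type, shared_key, system) -> local_id

    def register(self, object_type, shared_key, system, local_id):
        self._to_key[(object_type, system, local_id)] = shared_key
        self._from_key[(object_type, shared_key, system)] = local_id

    def translate(self, object_type, from_system, local_id, to_system):
        # Resolve the local ID to the shared key, then to the target system.
        key = self._to_key[(object_type, from_system, local_id)]
        return self._from_key[(object_type, key, to_system)]

xref = CrossReferenceService()
xref.register("customer", "C-1", "SAP", "0000104711")
xref.register("customer", "C-1", "CRM", "ACME-US")
print(xref.translate("customer", "SAP", "0000104711", "CRM"))
```

Centralizing this as a shared service means every integration uses the same mappings instead of each interface maintaining its own lookup tables.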
Freed came back up to discuss their key success factors:
- Creation of shared services
- Leverage global sourcing model
- Integration strategy updated for SAP
- Buy-in from business on integration strategy
- Program management
- High visibility into the development process
- Comprehensive on-boarding process for quick ramp-up
- Factory approach to integration: de-skill certain tasks and roles to leverage less experienced and/or offshore resources
- Thorough and well-documented unit testing
- Blogs and wiki for knowledge dissemination and sharing within the team, since it was spread over 5 cities
- Architecture team responsible for consistency and reuse
- Defined integration patterns and criteria for applicability
- Enhanced common services and frameworks
- Architecture defined to support multiple versions of services and canonical data models
- Development templates for integration patterns
- Canonical data models designed early
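Supporting multiple canonical model versions side by side typically means each message carries its version, and registered upgrade functions bring older messages up to the version a consumer expects. A toy sketch of that pattern, with invented field names and versions (nothing here comes from the actual Merck models):

```python
def upgrade_v1_to_v2(message):
    # Hypothetical v1 -> v2 change: "country" renamed to "country_code".
    m = dict(message)
    m["country_code"] = m.pop("country", "US")
    m["version"] = 2
    return m

# Map each version to the function that upgrades it one step.
UPGRADES = {1: upgrade_v1_to_v2}

def upgrade_to(message, target_version):
    # Apply single-step upgrades until the message reaches the target.
    while message["version"] < target_version:
        message = UPGRADES[message["version"]](message)
    return message

v1 = {"version": 1, "material_id": "100042", "country": "DE"}
v2 = upgrade_to(v1, 2)
```

Chaining single-step upgrades keeps each version change small and testable, and lets producers and consumers move to new model versions independently.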
In short, they’ve done a pretty massive integration project with SAP at the heart of their systems, and use TIBCO (and its bridge to SAP’s PI) to move towards a primarily event-driven publish-subscribe integration with all other systems.