BPM And MDM For Business Performance Improvement

Andrew White presented a session at Gartner BPM 2013 on how process, applications and data work together, from his perspective as an analyst focused on master data management (MDM). He was quick to point out that process is more important than data 😉 but put forward MDM as a business-led discipline for maintaining a single version of the truth for business data. The focus is on how that data is created and maintained in business applications, assuring the integrity of the processes that span those applications. Since his background is in ERP systems, his view is that processes are instantiated by applications, which are in turn underpinned by data; however, the reality that I see with BPMS is that data resides there as well, so it’s fair to say that processes can consume data directly, too.

Master data is the common set of attributes that are reused by a wide variety of systems, not application-specific data — his example of master data was the attributes of a specific inventoried product such as size and weight — but there is also shared data: that grey area between the common master data and application-specific data. There are different tiers of systems identified in Gartner's pace-layering model, with different data access: systems of record (e.g., ERP) tend to consume enterprise master data and transaction data; systems of differentiation (e.g., CRM) consume master data, analytic data and rich data; and systems of innovation (e.g., a Facebook app) consume analytic data, rich data and cloud-sourced data that might be someone else's master data. End-to-end business processes may link all of these systems together, and be guided by different data sources along the way. It all makes my head hurt a little bit.
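To make the master-versus-application-specific distinction concrete, here's a minimal sketch (my own illustration, not anything White presented — the product attributes and system names are invented): each application keeps its own local fields but references a single shared master record rather than copying the common attributes.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ProductMaster:
    sku: str          # shared identifier used by every system
    size_cm: float    # physical attributes reused across applications
    weight_kg: float

@dataclass
class WarehouseProduct:
    master: ProductMaster  # reference to the single source of truth
    bin_location: str      # application-specific: only the warehouse system cares

@dataclass
class CatalogProduct:
    master: ProductMaster
    marketing_copy: str    # application-specific: only the web store cares

master = ProductMaster(sku="SKU-1001", size_cm=30.0, weight_kg=1.2)
wms_view = WarehouseProduct(master=master, bin_location="A-17-3")
web_view = CatalogProduct(master=master, marketing_copy="Now in stock!")

# Both applications agree on the shared attributes because they reference
# the same master record instead of maintaining their own copies.
assert wms_view.master.weight_kg == web_view.master.weight_kg
```

The "shared data" grey area is exactly what this structure doesn't resolve: attributes used by two or three systems but not enterprise-wide, where it's a judgment call whether they belong on the master record.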

MDM programs have some of the same challenges as BPM programs: they need to focus on specific business outcomes, and focus on which processes need improving. And like the Fight Club reference that I heard earlier today (“the first rule of process is that you don’t talk about process”), you want MDM to become transparent and embedded, not be a silo of activity on its own. Also in common with some BPM initiatives is that MDM is often seen as an IT initiative, not a business initiative; however, just like defining business processes, it’s up to the business to identify their master data. MDM isn’t about data storage and retention; it’s about how data is used (and abused) throughout the business lifecycle. In my opinion, we still need better ways to model the data lifecycle at the same time as we model business processes; BPMN 2.0 added some provisions for data modeling, but it’s woefully inadequate for a whole data lifecycle model.
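The data lifecycle point deserves a sketch of what I mean: a data entity moves through states (created, active, archived, deleted) with rules about which transitions are legal, which is more than BPMN 2.0's data objects can express. This is my own toy illustration — states and actions are invented:

```python
# A data lifecycle as a simple state machine, modeled alongside the
# process model rather than inside it. Hypothetical states/transitions.
LIFECYCLE = {
    "draft":    {"validate": "active"},
    "active":   {"update": "active", "retire": "archived"},
    "archived": {"purge": "deleted"},
}

def transition(state, action):
    """Return the next lifecycle state, or raise if the action is illegal."""
    try:
        return LIFECYCLE[state][action]
    except KeyError:
        raise ValueError(f"action {action!r} not allowed in state {state!r}")

state = "draft"
for action in ("validate", "update", "retire"):
    state = transition(state, action)
print(state)  # archived
```

A process model tells you *when* a step runs; a lifecycle model like this tells you *what* may legitimately happen to the data at that step — and catching an illegal transition (updating archived data, say) is exactly the kind of abuse MDM governance is meant to prevent.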

White presented a number of things that we need to think about when creating an enterprise data model, and best practices for aligning BPM and MDM. The two initiatives can be dovetailed, so that BPM provides priority and scope for the MDM efforts. Business processes (and not just those implemented in a BPMS) create and consume data, and once a process is defined, the points where data is created, viewed and updated can be identified and used as input to the master data model. From an EA standpoint, the conceptual, logical and physical models for data and process (or column 1 and column 2, if you’re a Zachman follower) need to be aligned.
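The idea of mining defined processes for data touchpoints can be sketched as a simple CRUD matrix — this is my own illustration of the approach, with invented process steps and entities, not anything from White's slides:

```python
# Tabulate where each data entity is created, read or updated across
# process steps; entities touched by many processes are candidates
# for the master data model. All names are hypothetical.
process_data_touchpoints = [
    ("Open Account",    "Customer", "create"),
    ("Verify Identity", "Customer", "read"),
    ("Update Address",  "Customer", "update"),
    ("Place Order",     "Product",  "read"),
]

def crud_matrix(touchpoints):
    """Build {entity: {operation: [steps]}} from (step, entity, op) rows."""
    matrix = {}
    for step, entity, op in touchpoints:
        matrix.setdefault(entity, {}).setdefault(op, []).append(step)
    return matrix

matrix = crud_matrix(process_data_touchpoints)
print(matrix["Customer"]["update"])  # ['Update Address']
```

Reading the matrix by entity shows which processes would break if that entity's definition changed — which is one way BPM provides the priority and scope for MDM that White described.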

TIBCO Product Strategy With Matt Quinn

Matt Quinn, CTO, gave us the product strategy presentation that will be seen in the general session tomorrow. He repeated the "capture many events, store few transactions" message as well as the five key components of a 21st century platform that we heard from Murray Rode in the previous session; this is obviously a big part of the new messaging. He drilled into their four broad areas of interest from a product technology standpoint: event platform innovation, big data and analytics, social networking, and cloud enablement.

In the event platform innovation, they released BusinessEvents 5.0 in April this year, including the embedded TIBCO Datagrid technology, temporal pattern matching, stream processing and rules integration, and some performance and big data optimizations. One result is that application developers are now using BusinessEvents to build applications from the ground up, which is a change in usage patterns. For the future, they’re looking at supporting other models, such as BPMN and rule models, integrating statistical models, improving queries, improving the web design environment, and providing ActiveMatrix deployment options.
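For readers unfamiliar with temporal pattern matching, here's a generic sketch of the kind of thing an event platform does — this is purely illustrative, not the BusinessEvents API, and the event types and thresholds are invented:

```python
from collections import deque

def sliding_window_alert(events, event_type, threshold, window_seconds):
    """Yield timestamps at which the count of `event_type` within the
    trailing time window reaches `threshold` — a simple temporal pattern
    over an event stream."""
    recent = deque()
    for ts, etype in events:
        if etype != event_type:
            continue
        recent.append(ts)
        # Drop events that have fallen out of the trailing window.
        while recent and ts - recent[0] > window_seconds:
            recent.popleft()
        if len(recent) >= threshold:
            yield ts

stream = [(0, "login_fail"), (2, "login_ok"), (3, "login_fail"),
          (4, "login_fail"), (30, "login_fail")]
alerts = list(sliding_window_alert(stream, "login_fail", 3, 10))
print(alerts)  # [4] -- three failures within 10 seconds at t=4
```

A production engine adds rule integration, state management and distributed deployment on top of patterns like this, but the core idea — correlating events across time rather than inspecting them one at a time — is the same.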

In ActiveMatrix, they’ve released a fully integrated stack of BusinessWorks, BPM and ServiceGrid with broader .Net and C++ support, optimized for large deployments and with better high-availability support and hot deployment capabilities. AMX/BPM has a number of new enhancements, mostly around the platform (such as the aforementioned HA and hot deployment), with their upcoming 1.2 release providing some functional enhancements such as custom forms and business rules based on BusinessEvents. We’ll see some Nimbus functionality integration before long, although we didn’t see that roadmap; as Quinn pointed out, they need to be cautious about positioning which tools are for business users versus technical users. When asked about case management, he said that “case management brings us into areas where we haven’t yet gone as a company and aren’t sure that we want to go”. Interesting comment, given the rather wild bandwagon-leaping that has been going on in the ACM market by BPM and ECM vendors.

The MDM suite has also seen some enhancements, with ActiveSpaces integration and collaborative analytics with Spotfire, allowing MDM to become a hub for reference data from the other products. I’m very excited to see that one-click integration between MDM and AMX/BPM is on the roadmap; I think that MDM integration is going to be a huge productivity boost for overall process modeling, and when I reviewed AMX/BPM last year, I liked their process data modeling but stated that “the link between MDM and process instance data needs to be firmly established so that you don’t end up with data definitions within your BPMS that don’t match up with the other data sources in your organization”. In fact, the design-time tool for MDM is now the same as that used for business object data models that I saw in AMX/BPM, which will make it easier for those who move across the data and process domains.

TIBCO is trying to build out vertical solutions in certain industries, particularly those where they have acquired or built expertise. This not only changes what they can package and offer as products, but changes who (at the customer) that they can have a relationship with: it’s now a VP of loyalty, for example, rather than (or in addition to) someone in IT.

Moving on to big data and analytics technology advances, they have released FTL 2.0 (low-latency messaging) to reduce inter-host latency below 2.2 microseconds as well as provide some user interface enhancements to make it easier to set up the message exchanges. They’re introducing TIBCO Web Messaging to integrate consumer mobile devices with TIBCO messaging. They’ve also introduced a new version of ActiveSpaces in-memory data grid, providing big data handling at in-memory speeds by easing the integration with other tools such as event processing and Spotfire.

They’ve also released Spotfire 4.0 visual analytics, with a big focus on ease of use and dashboarding, plus tibbr integration for social collaboration. In fact, tibbr is being used as a cornerstone for collaboration, with many of the TIBCO products integrating with tibbr for that purpose. In the future, tibbr will include collaborative calendars and events, contextual notifications, and other functionality, plus better usability and speed. Formvine has been integrated with tibbr for forms-based routing, and Nimbus Control integrates with tibbr for lightweight processes.

Quinn finished up discussing their Silver Fabric cloud platform to be announced tomorrow (today, if you count telling a group of tweet-happy industry analysts) for public, private and hybrid cloud deployments.

Obviously, there was a lot more information here than I could possibly capture (or than he could even cover; some of the slides just flew past), and I may have to get out of bed in time for his keynote tomorrow morning since we didn’t even get to a lot of the forward-looking strategy. With a product suite as large as what TIBCO has now, we need much more than an hour to get through an analyst briefing.

TIBCO Now Roadshow: Toronto Edition (Part 2)

We started after the break with Jeremy Westerman, head of BPM product marketing for TIBCO, presenting on AMX BPM. The crowd is a bit slow returning, which I suspect is due more to the availability of Wii Hockey down the hall than to the subject matter. Most telling, Westerman has the longest timeslot of the day, 45 minutes, which shows the importance that TIBCO is placing on marketing efforts for this new generation of their BPM platform. As I mentioned earlier, I’ve had 3+ hours of briefing on AMX BPM recently and think that they’ve done a good job of rearchitecting – not just refactoring – their BPM product to a modern architecture that puts them in a good competitive position, assuming that they can get the customer adoption. He started by talking about managing business processes as strategic assets, and the basics of what it means to move processes into a BPMS, then moved on to the TIBCO BPM products: Business Studio for modeling, the on-premise AMX BPM process execution environment, and the cloud-based Silver BPM process execution environment. This built well on their earlier messages about integration and SOA, since many business processes – especially for the finance-heavy audience here today – are dependent on integrating data and messaging with other enterprise systems. Business-friendly is definitely important for any BPM system, but the processes also have to be able to punch at enterprise weight.

His explanation of work management also covered optimizing people within the process: maximizing utilization while still meeting business commitments through intelligent routing, unified work lists and process/work management visibility. A BPM system allows a geographically distributed group of resources to be treated as a single pool for dynamic, tunable work management, so that the actual organizational model can be used rather than an artificial model imposed by location or other factors. This led into a discussion of workflow patterns, such as separation of duties, which they are starting to build into AMX BPM as I noted in my recent review. He walked through other functionality such as UI creation, analytics and event processing; although I’ve seen most of this before, it was almost certainly new to everyone except the few people in the room who had attended TUCON back in May. The BPM booth was also the busiest one during the break, indicating a strong audience interest; I’m sure that most BPM vendors are seeing this same level of interest as organizations still recovering from the recession look to optimize their processes to cut costs and provide competitive advantage.
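For those who haven't met the separation-of-duties pattern: the user who performed one step of a process instance is excluded from the pool eligible for a later, conflicting step. A minimal sketch of the routing logic (my own illustration with invented names, not how AMX BPM implements it):

```python
def eligible_for_approval(pool, instance_history, submit_step="submit"):
    """Return users from `pool` who did not perform the submit step on
    this process instance -- only they may be offered the approval step."""
    submitters = {user for step, user in instance_history if step == submit_step}
    return [user for user in pool if user not in submitters]

pool = ["alice", "bob", "carol"]
history = [("submit", "alice")]
print(eligible_for_approval(pool, history))  # ['bob', 'carol']
```

Note that the constraint is evaluated per process instance, not globally: alice can still approve requests that someone else submitted, which is what makes this a routing rule rather than a static role restriction.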

Ivan Casanova, director of cloud marketing for TIBCO, started with some pretty simple Cloud 101 stuff, then outlined their Silver line of cloud platforms: Silver CAP for developing cloud services, Silver Fabric for migrating existing applications, Silver BPM for process management, and Silver Spotfire for analytics. Some portion of the IT-heavy audience was probably thinking “not in my data centre, dude!”, but eventually every organization is going to have to think about what a cloud platform brings in terms of speed of deployment, scalability, cost and ability to collaborate outside the enterprise. Although he did talk about using Fabric for “private cloud” deployments that leverage cloud utility computing principles for on-premise systems, he didn’t mention the most likely baby step for organizations that are nervous about putting production data in the cloud, which is to use the cloud for development and testing, then deploy on premise. He finished with a valid point about how they have a lot of trust from their customers, and how they’ve built cloud services that suit their enterprise customers’ privacy needs; IBM uses much the same argument about why you want to use a large, established, trusted vendor for your cloud requirements rather than some young upstart.

We then heard from Greg Shevchik, a TIBCO MDM specialist, for a quick review of the discipline of master data management and TIBCO’s Collaborative Information Manager (CIM). CIM manages the master data repositories shared by multiple enterprise systems, and allows other systems – such as AMX BPM – to use data from that single source. It includes a central data repository; governance tools for validation and de-duplication; workflow for managing the data repository; synchronization of data between systems; and reporting on MDM.
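The de-duplication function Shevchik mentioned is worth a sketch: normalize candidate records on key attributes, then collapse matches into a single master record. This is a generic illustration of the technique — the matching rule and field names are invented, and real MDM tools use far more sophisticated fuzzy matching:

```python
def normalize(record):
    """Reduce a record to a comparison key on its identifying attributes."""
    return (record["name"].strip().lower(),
            record["postcode"].replace(" ", ""))

def deduplicate(records):
    """Keep the first record seen for each normalized key (survivorship
    rules in real MDM tools are much richer than first-wins)."""
    seen = {}
    for rec in records:
        key = normalize(rec)
        if key not in seen:
            seen[key] = rec
    return list(seen.values())

candidates = [
    {"name": "Acme Corp",  "postcode": "M5V 2T6"},
    {"name": " acme corp", "postcode": "M5V2T6"},  # duplicate after normalization
    {"name": "Beta Inc",   "postcode": "K1A 0B1"},
]
masters = deduplicate(candidates)
print(len(masters))  # 2
```

The governance workflow that Shevchik described sits around exactly this step: ambiguous matches get routed to a data steward for a human decision rather than being merged automatically.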

Last up for the Toronto TIBCO Now was Al Harrington (the mystery man who opened the day), giving us a quick view of the new generation of TIBCO’s CEP product, BusinessEvents. There’s a lot to see here, and I probably need to get a real briefing to do it justice; events are at the heart of so many business processes that CEP and BPM are becoming ever more intertwined.

My battery just hit 7% and we’re after 5pm, so I’ll wrap up here. The TIBCO Now roadshow provides a good overview of their updated technology portfolio and the benefits for customers; check for one coming your way.