My last session for the day at CASCON is to hear Alex Lau of IBM, and Andrei Solomon and Marin Litoiu of York University, discuss their research project on real-time monitoring and simulation of business processes (they also presented a related paper this morning, although I missed the session). This is one of the CAS Innovation Impact sessions, which are based on fellowship projects that are “ready for harvesting”, which I think means that they’re ready to move towards productization. These projects involve one student (Solomon) and one professor (Litoiu) from a university, plus one person from CAS (Lau) and one from IBM development (Jay Benayon, also credited on the research). In this case, the need for the research was identified by IBM development; I’m not sure if this is the usual method, although it seems logical that these projects act as mini research teams for work that is beyond the scope of production development.
In addition to their research, they’ve prototyped a system that integrates with WebSphere Process Modeler and can feed the monitoring data from an in-flight process back into a simulation model of the process, improving what-if scenario analysis and process KPI forecasting. The key research challenge was the use of the monitoring data: that data is typically quite noisy, since it can include hidden overhead such as queuing, and tends to skew results so that task durations appear longer than they actually are. This noise can’t be measured directly, but they’ve attempted to filter it out of the monitoring data using a particle filter before feeding it into the simulation model.
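To make the idea concrete, here’s a minimal sketch (not the authors’ code) of how a particle filter could strip queuing overhead out of noisy task-duration measurements. The model assumed here — observed duration equals the true duration plus a non-negative, exponentially distributed overhead — is my own illustration, as are all the names and parameters:

```python
import math
import random

def filter_durations(observations, n_particles=1000,
                     drift_std=0.1, overhead_mean=2.0):
    """Bootstrap particle filter over the latent 'true' task duration.

    Assumes observed duration = true duration + queuing overhead,
    with overhead modelled as exponential (it can only add time).
    """
    # Initialise particles below the first observation, since overhead
    # inflates what monitoring sees.
    particles = [max(0.1, observations[0] - random.expovariate(1.0 / overhead_mean))
                 for _ in range(n_particles)]
    estimates = []
    for z in observations:
        # Predict: allow the true duration to drift slightly between instances.
        particles = [max(0.1, p + random.gauss(0.0, drift_std)) for p in particles]
        # Update: weight each particle by how plausible the observed duration
        # is, given that the overhead (z - p) must be non-negative.
        weights = [math.exp(-(z - p) / overhead_mean) if z >= p else 1e-12
                   for p in particles]
        # Resample particles in proportion to their weights.
        particles = random.choices(particles, weights=weights, k=n_particles)
        estimates.append(sum(particles) / n_particles)
    return estimates
```

Run against synthetic observations of a 5-time-unit task with exponential queuing overhead, the filtered estimate lands well below the raw mean of the observations, which is exactly the skew the researchers describe.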
Their proof-of-concept prototype linked together three IBM products, and could either change automated decisions in the runtime process or suggest alternatives to a human participant, based on the near-past performance of that process instance and any approaching SLA deadlines. One caveat is that they didn’t use real-world (e.g., customer) business processes for this, but created their own processes and ran them to generate their results.
They see this research as applicable to any process modeling tool where processes can be simulated against a set of metrics, KPIs can be forecasted, and simulation parameters are entered at modeling time and remain static until explicitly updated. They also see potential to extend the technique, for example, by providing better trend predictions based on regression techniques. Other vendors are working on research like this, and I think that we’ll see a lot more of this sort of functionality in BPM solutions in the future, where users are presented with suggestions (or automated decisions are made) based on a prediction of how well a process instance is likely to meet its KPIs.
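As a rough illustration of the regression-based trend prediction they mention, a least-squares line fitted to the per-task durations of an in-flight instance can project the remaining work and flag an SLA at risk. This is my own hypothetical sketch, not their implementation; the function name and inputs are assumptions:

```python
def forecast_completion(elapsed_times, remaining_tasks):
    """Fit a least-squares trend line y = slope*x + intercept to the
    per-task durations observed so far, then project the durations of
    the remaining tasks to forecast total instance time."""
    n = len(elapsed_times)
    xs = list(range(n))
    mean_x = sum(xs) / n
    mean_y = sum(elapsed_times) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, elapsed_times))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    # Extrapolate the trend over the tasks still to come.
    projected = sum(slope * (n + i) + intercept for i in range(remaining_tasks))
    return sum(elapsed_times) + projected

# Durations trending upward, three tasks left: forecast ≈ 18.97 time units,
# so a 20-unit SLA would still (just) be met.
forecast = forecast_completion([2.0, 2.2, 2.5, 2.7], 3)
```

A production version would presumably use something more robust than a straight line, but even this shows how a trend, rather than a static modeling-time parameter, changes the forecast.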
That’s it for me today, but I’ll be back here tomorrow morning for the workshop on practical ontologies, the women in technology lunch panel, and the afternoon keynote.