Peter Dadam of the University of Ulm opened the last day of the conference (and my last session, since I’m headed out at the morning break) with a keynote on the future of BPM: “Flying with the Eagles, or Scratching with the Chickens?”
He went through some of his history in getting into research (in the IBM DB2 area), with the conclusion that when you ask current users what they want, they tend to treat the current technology as a given, and only request workarounds within the constraints of the existing solution. The role of research is, in part, to disseminate knowledge about what is possible: the new paradigm for the future. Anyone who has worked on the bleeding edge of innovation recognizes this, and realizes that you first have to educate the market on what’s possible before you can start developing the use cases for it.
He discussed the nature of university research versus industrial research, where the pendulum has swung from research being done in universities, to the more significant research efforts being done (or being perceived as being done) in industrial research centers, to the closing of many industrial research labs and a refocusing on pragmatic, product-oriented research by the rest. This puts the universities back in the position of being able to offer more visionary research, but there is a risk of just being the research tail that the industry dog wags.
Moving on to BPM, and looking at it against a historical background, we have the current SOA frenzy in industry, but many enterprises implementing it are hard-pressed to say what their current SOA infrastructure provides that CORBA didn’t. There’s a big push to bring in BPM tools, particularly modeling tools, without considering the consequences of putting tools like this in the hands of users who don’t understand the impact of certain design decisions. We need to keep both manual and automated processes in mind, and consider that exceptions are often not predictable; enterprises cannot take the risk of becoming less flexible through BPM implementations that make the mistake of designing completely structured, rigid processes.
There’s also the issue of how the nature of web services can trivialize the larger relationship between a company and its suppliers: realistically, you don’t replace one supplier with another just because they have the same web services interface, without significant other changes (the exception to this is, of course, when the product provided by the supplier is the web service itself).
He sees that there is a significant risk that BPM technology will not develop properly, and that the current commercial systems are not suitable for advanced applications. He described several challenges in implementing BPM (e.g., complex structured processes; exceptions cannot be completely anticipated), and the implications in terms of what must exist in the system in order to overcome this challenge (e.g., expressive process meta model; ad-hoc deviations from the pre-planned execution sequence must be possible). He discussed their research (more than 10 years ago now) in addressing these issues, considering a number of different tools and approaches, how that resulted in the ADEPT process meta model and eventually the AristaFlow process management system. He then gave us a demo of the AristaFlow process modeler — not something that you see often in a keynote — before moving on to discuss how some of the previously stated challenges are handled, and how the original ADEPT research projects fed into the AristaFlow project. The AristaFlow website describes the motivation for this joint university-industry project:
In particular, in dynamic environments it must be possible to quickly implement and deploy new processes, to enable ad-hoc modifications of single process instances at runtime (e.g. to add, delete or shift process steps), and to support process schema evolution with instance migration, i.e. to propagate process schema changes to already running instances. These requirements must be met without affecting process consistency and by preserving the robustness of the process management system.
Although lagging behind many commercial systems in terms of user interface and some functionality, it provides much more dynamic functionality in areas such as allowing users to make minor modifications to the process instance that they are currently running.
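To make the idea concrete, here’s a minimal sketch (my own illustration, not AristaFlow’s actual API or the ADEPT meta model) of what an ad-hoc deviation might look like: a running instance keeps its own copy of the template’s step sequence, so a user can insert a step into that one instance without touching the template or the already-executed steps:

```python
# Illustrative sketch only -- not the AristaFlow API. Models the idea of
# ad-hoc deviation: an instance created from a process template can have
# steps inserted at runtime without altering the template itself.

class ProcessTemplate:
    def __init__(self, name, steps):
        self.name = name
        self.steps = list(steps)  # pre-planned execution sequence

    def instantiate(self):
        return ProcessInstance(self)

class ProcessInstance:
    def __init__(self, template):
        self.template = template
        self.steps = list(template.steps)  # instance-local copy
        self.position = 0                  # index of the next step to run

    def complete_next(self):
        step = self.steps[self.position]
        self.position += 1
        return step

    def insert_step(self, step):
        """Ad-hoc deviation: add a step to this instance only.

        Only future steps may be changed; the already-executed prefix
        stays immutable -- a crude stand-in for the consistency checks
        a real process management system would enforce.
        """
        self.steps.insert(self.position, step)

# Usage: deviate from the pre-planned sequence in a single instance.
template = ProcessTemplate("claim", ["receive", "assess", "pay"])
inst = template.instantiate()
inst.complete_next()                   # "receive" is done
inst.insert_step("request-documents")  # ad-hoc step before "assess"
print(inst.steps)      # ['receive', 'request-documents', 'assess', 'pay']
print(template.steps)  # template unchanged: ['receive', 'assess', 'pay']
```

The point of the separation is the one Dadam made: the template captures the pre-planned sequence, while flexibility lives at the instance level, so exceptions don’t force users outside the system.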
He concluded with the idea that BPM technology could become as important as database technology, if done correctly, but it’s a very complex issue due to the impact on the work habits of the people involved, and the desire not to limit flexibility while still providing the benefits of process automation and governance. It’s difficult to predict what real-world process exceptions will occur, and therefore what type of flexibility will be required during execution. By providing a process template rather than a rigidly structured process instance, some of this flexibility can be achieved within the framework of the BPMS rather than forcing the users to break the process in order to handle exceptions.