Today’s keynote was by Richard Hull, Research Manager at the T.J. Watson Research Center, on business entities with lifecycles. He started by describing the process of process: how business strategy maps to business goals, which in turn map to business operations, that is, the multiple conceptual models involved in BPM design and implementation. He stated that today’s approach to BPM environments is fundamentally disjointed: there’s one conceptual model for rules and policies, another for analytics and dashboards, and a core business process model based on activity flows, while the data being manipulated is often not related to any of these conceptual models but is more of an afterthought in the design.
Process and data are two sides of the same coin (as Clay Richardson stated at BPM 2010, and many of us have been saying for a while), and Hull thinks that we need to change the way that we model data to account for the fact that it is being used to support process, and change the way that we model processes to account for the tight linkage between process and data. This has led to a new way to model and implement business processes and operations: business entities with lifecycles (BEL). He defined a BEL as a key conceptual dynamic entity/object that is used in guiding the operation of a business, such as a FedEx package delivery that includes all data and processing steps from pickup through delivery and payment. This is an integrated view of the process and data, with different views according to the lifecycle of the entity. That means that it needs to include an information model containing the relevant data, and a lifecycle model showing the possible lifecycles. This research group has published a couple of papers on BEL, and has integrated it into IBM GBS both as a methodology and as tools.
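To make the idea concrete, here’s a minimal sketch (in Python, with names I’ve made up for illustration – this is not IBM’s actual model for the FedEx example) of what a BEL might look like: the information model carries the data, and the lifecycle model constrains the state transitions.

```python
# A minimal sketch of a BEL; hypothetical names, not IBM's actual tooling.
from dataclasses import dataclass
from typing import Optional

# Lifecycle model: allowed state transitions, as a simple finite state machine
LIFECYCLE = {
    "created":    {"picked_up"},
    "picked_up":  {"in_transit"},
    "in_transit": {"delivered"},
    "delivered":  {"paid"},
    "paid":       set(),
}

@dataclass
class PackageDelivery:
    # Information model: all of the data the entity carries through its life
    tracking_id: str
    sender: str
    recipient: str
    weight_kg: float
    state: str = "created"
    payment_ref: Optional[str] = None

    def advance(self, new_state: str) -> None:
        # Only transitions permitted by the lifecycle model may occur
        if new_state not in LIFECYCLE[self.state]:
            raise ValueError(f"illegal transition: {self.state} -> {new_state}")
        self.state = new_state

# pkg = PackageDelivery("FX123", "Acme Corp", "B. Smith", 2.5)
# pkg.advance("picked_up")   # OK
# pkg.advance("paid")        # raises: can't skip delivery and payment steps
```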
He presented BEL as a new way of thinking – like the shift to OOP – that we need to get our heads around in order to understand the methodology and the benefits. He walked through the characteristics of the first generation of BEL, starting with end-to-end process visibility across multiple silos and their interactions. The example was an automotive engineering/manufacturing process, where they defined business entities for an engineering change, an engineering work order and a part, each of which has an information (data) model and a lifecycle (process, represented as a finite state machine) model that may span the engineering, purchasing and manufacturing silos. Each task in the process, which is likely executed within a single silo, relates to a view on the information model, as in the sketch below. He sees this as coming before requirements: once the information and lifecycle models are created, the requirements can be derived from them.
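As a rough illustration of that task-to-view relationship (again with hypothetical names, not the actual models from the automotive example), each task could declare which slice of the entity’s information model it touches:

```python
# Hypothetical sketch: each task, executed within a single silo, declares
# the subset of the entity's information model that it can see.
TASK_VIEWS = {
    # task                (silo,            visible fields)
    "raise_change":       ("engineering",   ["part_number", "change_reason"]),
    "approve_purchase":   ("purchasing",    ["part_number", "supplier", "cost"]),
    "schedule_build":     ("manufacturing", ["part_number", "quantity", "due_date"]),
}

def view_for_task(entity: dict, task: str) -> dict:
    """Project the entity's information model down to one task's view."""
    _silo, visible = TASK_VIEWS[task]
    return {f: entity.get(f) for f in visible}

# view_for_task({"part_number": "P-42", "supplier": "Acme", "cost": 12.0},
#               "approve_purchase")
# -> {"part_number": "P-42", "supplier": "Acme", "cost": 12.0}
```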
He described BEL as “upper middleware”, since BELs can be used as an upper layer to control legacy applications via their service wrappers. The business entity approach relies on identifying things that matter to define the objects; he discussed a banking example, where those things included customer, campaign, product, element (product building block) and deal. He showed the lifecycle for the deal entity, where the information model was represented alongside the process model. Although BPMN 2.0 now has the ability to represent data models, BPEL does not, and it’s probably safe to say that most process modeling is still done fairly separately from data modeling. However, BPM tools are changing to provide better linkages between process and data models, and I think that we’ll start to see full integration between process modeling tools and master data management tools in the next year or so.
By bringing together the data and process models, BELs have all the structure required to automatically generate user interface screens for the human activities in the process; this is something that we’ve been seeing in BPM tools for a number of years, based on the process instance data model as defined in the process. Although you typically can’t just use those auto-generated screens as is, the BEL models and tooling, along with the UI auto-generation, provide a significant accelerator for creating process applications.
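A toy sketch of what that acceleration might look like, assuming a dataclass-style information model like the one above (the type-to-widget mapping and field names are illustrative assumptions, not any vendor’s API): walk the entity’s attributes and emit one form field spec per attribute, which a rendering layer could turn into an editable screen.

```python
# Toy screen auto-generation from an information model; illustrative only.
import dataclasses
from dataclasses import dataclass

@dataclass
class Deal:  # a stand-in for the banking example's deal entity
    customer: str
    product: str
    amount: float

WIDGETS = {str: "text_input", float: "number_input", int: "number_input"}

def generate_form(entity_cls) -> list:
    """Emit a simple form field spec for each attribute of the entity."""
    form = []
    for f in dataclasses.fields(entity_cls):
        widget = WIDGETS.get(f.type, "text_input") if isinstance(f.type, type) else "text_input"
        form.append({"label": f.name.replace("_", " ").title(),
                     "widget": widget,
                     "bind": f.name})
    return form

# generate_form(Deal) yields one field spec per attribute; in practice you'd
# refine the generated screen by hand rather than using it as is.
```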
Maybe I’ve just seen too many leading-edge products and methods, but I don’t find anything here all that new. Yes, it’s nice to have more formalized models and methodologies around this, but the idea of linking business object and process models, and auto-generating user interface screens from them, isn’t cutting edge. He positioned this as not really competitive with WebSphere (process modeler), but rather for cases where BPMN just doesn’t cut it. He described the differentiator as disaggregating operations into chunks that are meaningful to the business, then modeling the lifecycle of those chunks relative to a more global information model.
There’s more research being done on this, including moving to a more declarative style; I’d be interested in seeing more about this to understand the benefits over the current crop of BPM products that allow for a more integrated business object/process modeling environment.
I don’t find auto-generating screens cutting edge either.
I do think that a fair number of people overlook BELs when tackling process problems, and a fair number of people overlook process when tackling data problems 🙂
Thanks for this overview of Rick’s work. I think this is a crucial paradigm shift that companies will embrace sooner or later.
It doesn’t make any sense to consider data and processes as separate realms, with different kings and armies fighting at their borders.
I have also been doing integrated modeling of different aspects of applications for a while, not limited to BP and data models, but also covering other aspects like user role modeling, application logic modeling, user interaction modeling, and so on. We envision an integrated model-driven development (MDD) approach covering all these aspects and providing automatic alignment and quick prototyping.
I strongly believe that model integration, alignment and continuous cross-validation down to the implementation is the only response to the complex needs of today’s enterprises.
Here is an overview of what we do:
https://www.slideshare.net/stefano_butti/webratio-a-mdd-approach-to-bpm-4872510
And a summary of MDD responses to current BPM trends (including MDM integration):
https://www.slideshare.net/mbrambil/webratio-bpm-trends-and-challenges
I would be glad to get feedback on this.
Scott and Marco, thanks for your comments. As you are likely aware, I’ve been pushing the benefits of modeling data and process together for a while: maybe I should change the name of this blog from “Column 2” to “Columns 1 and 2” 🙂
Marco, I like the standardized language approach to MDD that I see in your slides; the idea of MDD including full application modeling is not new, but it would be nice to see some standardization so that we can consider moving full application models between tools, not just the process models.