BonitaSoft Open Source BPM

I recently had my first briefing with BonitaSoft about their open source BPM product. Although the project has been going on for some time, with the first release in 2001, the company is only just over a year old; much of the development has been done as part of BPM projects at Bull. Like many open source companies, their business model is to sell services, support and training around the software, while the software is available as a free download and supported by a broader community. They partner with a number of other open source companies – Alfresco for content management, SugarCRM for CRM, Jaspersoft for BI – in order to provide integrated functionality without having to build it themselves. They’ve obviously hit some critical mass point in terms of functionality and market, since their downloads have increased significantly in the past year and have just passed half a million.

A French company, they have a strong European customer base and a growing US customer base, mostly comprising medium and large customers. They’ve just announced the opening of two US offices, and co-founder/CEO Miguel Valdés Faura is moving to the San Francisco area to run the company from there; that’s the second European company that I’ve heard of lately where the top executives are moving to the Bay Area, suggesting that the “work from anywhere” mantra doesn’t necessarily pan out in practice. They’ve hired Dave Cloyd away from open source content management company Nuxeo as a key person in building the US market; he was VP of sales at Staffware prior to the TIBCO acquisition, so he knows both the open source and BPM sides.

Open source BPM solutions have been around for a while, but the challenges are the same as with any open source project: typically, it takes greater technical skills to get up and running with open source, especially if it doesn’t do everything that you need and has to be integrated with other (open source or not) products. In many cases, open source BPM provides the process engine embedded inside a larger solution created by a systems integrator or business process outsourcing firm; in other words, it’s more like a toolkit for adding process capabilities into another application or environment. BonitaSoft considers jBPM, Activiti and ProcessMaker to be in this “custom BPM development” camp, as opposed to the usual commercial players in the “standalone BPM suites” category; they see themselves as being able to play on both sides of that divide.

Taking a look (finally, after 35 minutes of PowerPoint) at a product demo, I saw their four main components of process modeling, process development, process execution, and process administration and monitoring.

The modeler is a desktop Eclipse-based application providing BPMN 2.0 modeling, including importing of BPMN models from other tools. The distinctions between these tools are starting to blur, as all the vendors pick up the user interface tricks that make process modeling work better: auto-alignment, automatic connector creation, and tool tips with the most likely next element to add. The distinguishing characteristics start to become how the non-standard modeling aspects are handled: data modeling, for example, and integration with other systems using proprietary connectors that go beyond the capabilities of a simple web services call.

[Image: Bonitasoft BPM – Alfresco connector actions]

I like what they’ve done with some of the out-of-the-box connectors: the Sharepoint and Alfresco connectors allow you to browse and select a specific document repository event (such as checking in a file) directly from within the process designer, and associate it with an activity in the process model. I saw a fairly comprehensive database connector that allowed for graphical query creation, and this connection can be used to transfer a data model from a database to the process model to build out the process instance data. There’s a wizard to create your own connectors, or you can browse the BonitaSoft community to find connectors created by others – a free marketplace for incremental functionality.

You can create a web form for a particular step in the process, which will auto-generate based on the defined data model, then allow new fields to be added based on external database calls, and reformatted in a graphical editor. Effectively, this capability allows a quick process-based application to be created with a minimum of code, just using the forms designer and connectors to databases and other systems.

Key performance indicators (KPIs) can be defined in the process modeler; these are effectively data objects that can be populated by any step of the process, then reported on via a BI engine such as the integrated Jaspersoft.

Although they describe their modeling as collaborative, it’s asynchronous collaboration: the model and associated forms are saved to the Bonita model repository, where they are properly versioned and can be checked out by another user.

[Image: Bonitasoft BPM – user inbox view]

The end-user experience uses an inbox metaphor in a portal, with the forms displayed as the user interacts with the process. Individual process instances (or entire processes) can be tagged with private labels by a user – similar to labels applied to conversations in Gmail – and categories can be applied to processes so that every instance of that process has the same category, visible to all users. I love the instance and process tagging: this is a capability that I’ve been predicting for years, and am just starting to see emerge.

I was surprised by the lack of flexibility in the runtime environment: the only change that a user can make to a process at runtime is to reassign a task, although they are working on other features to handle more dynamic situations.

The big product announcements from last month, with the release of version 5.3, included process simulation and support for cloud environments with multi-tenancy and REST APIs. However, by this time we were getting to the end of our time and I didn’t get all the details; that will have to wait for another day, or you can check out the brief videos on their site.

Getting Started With BPM: A Series

I’ve been working with Steve Russell, SVP of Engineering at Global 360, to create a series of articles on how to get started with BPM. Since this is sponsored content, I won’t publish it here, but will point you over to a link each week when it is published on BPM.com.

This week, we introduce the series with a short description of each of the upcoming six articles:

  • Picking the right first process
  • Gaining business buy-in for project success
  • Ensuring user adoption
  • Structured versus unstructured work
  • Measuring success
  • Moving to wider adoption across the organization

We’re not wedded to these ideas (although I’m already working on the first, so that’s unlikely to change), and if you have ideas of things that you’d like to see included in the series, add a comment here and we’ll work it in if possible.

IBM Blueworks Live Sneak Peek

When I wrote a post yesterday about the slow convergence between BPM and social software, I had forgotten about the analyst briefing that I had scheduled with IBM later in the day for a sneak peek of the new Blueworks Live site. Lombardi has always been at the forefront of the integration of social and BPM, although previously focused purely on the process discovery/design phase, and the IBM acquisition has allowed Lombardi’s social process discovery to be combined with IBM’s online BPM community to create something greater than the sum of the parts. For all my criticism of IBM, they have some incredible pockets of innovation that sometimes burst out into actual product.

Yesterday’s session was hosted by Phil Gilbert; apparently this was the first public viewing of the site, which will be officially unveiled this Saturday, November 20th. Phil, whom I’ve known for a number of years through his time at Lombardi, explained some of the motivation for Blueworks Live, and in a weird echo of the post that I wrote just hours before, he said “BPM is ready to meet social networking”. They are trying to reinvent the public BPM community, while avoiding the problems that they perceive with other vendors’ community sites:

  • They are mainly product support sites
  • They have high membership numbers, but low participation
  • A majority of the information is from the sponsor company
  • The customer perception is that these sites are proprietary and biased, and that there are already too many sources of information on BPM

Blueworks Live Community

In their search for a truly public BPM community, they turned to that universal public community: Twitter. They are taking the public BPM-focused Twitter stream, based on both BPM-focused users (including everyone on the analyst call, said Phil) and the #bwlive hashtag, to create a public stream that will be displayed alongside a user’s private activity stream in Blueworks Live. The private activity stream is based on processes and projects in which the user is a participant, or that the user has selected to follow.
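The stream construction that Phil described – a curated list of BPM-focused users plus anything tagged #bwlive – amounts to a simple filter over the public Twitter firehose. Here’s a minimal sketch in Python; the user names and tweet structure are hypothetical, since IBM’s actual implementation isn’t public:

```python
# Hypothetical curated list of BPM-focused Twitter users; IBM's actual
# list (which included everyone on the analyst call) is not published.
CURATED_USERS = {"bpm_analyst", "process_geek"}
HASHTAG = "#bwlive"

def in_public_stream(tweet: dict) -> bool:
    """A tweet joins the public stream if its author is on the curated
    list, or if it carries the #bwlive hashtag."""
    return tweet["user"] in CURATED_USERS or HASHTAG in tweet["text"].lower()

tweets = [
    {"user": "bpm_analyst", "text": "BPM is ready to meet social networking"},
    {"user": "newcomer", "text": "Watching the #bwlive unveiling"},
    {"user": "bystander", "text": "Lunch was great today"},
]
public_stream = [t for t in tweets if in_public_stream(t)]  # keeps the first two
```

The private activity stream would be a similar filter, but keyed on the processes and projects that the user participates in or follows rather than on authors and hashtags.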

Blueworks Live is a combination of the previous BPM BlueWorks Beta community and the (Lombardi) BPM Blueprint process discovery tool; although BPM BlueWorks Beta had some process modeling tools, they were not of the sophistication of Blueprint. However, it’s more than just community and process modeling: Blueworks Live also includes process automation for the long tail of low-volume administrative processes, that is, those simple human-based processes that don’t warrant a BPM implementation involving IT. IBM estimates that 75% of all business processes fall into this category – including processes from HR, IT, accounting, marketing and a number of other areas – and most end up being done in email.

[Image: IBM Blueworks Live]

We moved on to a product demo by Cliff Vars, a product manager, who started with the view of the site by unregistered (that is, unpaid) users. Without signing in, you can view:

  • Under the Community section, the aforementioned public BPM Twitter stream, made up of specific Twitter users and tweets containing the #bwlive hashtag. Although the pricing chart indicated that free users could see both public and private communities, we only saw the public BPM Twitter stream before logging in.
  • Under the Library section, blog posts migrated from the old BPM BlueWorks Beta site. I believe that a lot of the content from the old site was written by IBM employees and was moderated, so can’t exactly be considered public community content.
  • Also in the Library section, a number of process templates that appear to be in the (Lombardi) Blueprint format – it’s not clear how useful these would be if you weren’t a paid user, since you couldn’t use the Blueprint modeler to open them.

Creating a Process Automation

We then logged on to take a look at how simple process automation works. In the logged-in view, the “Getting Started” section is replaced by the “Work” section, which contains all of the tasks assigned to the user, the process instances that they’ve launched, the ability to launch a new process instance, plus links to create a Blueprint process design or a new automated process. It’s important to recognize that there are two distinct types of processes here: complex processes modeled in Blueprint (the former Lombardi tool), which may eventually be transferred to an on-premise IBM Lombardi process engine for execution; and simple processes, which are modeled using a completely different tool and executed directly within the Blueworks Live site. When we look at process automation, it’s the latter that we’re seeing.

[Image: Creating a process automation in IBM Blueworks Live]

To automate a process, then, you click the big green “Automate a Process” button to get started, then specify the following:

  • A process application name.
  • The process type, either “Simple Workflow” or “Checklist”. In the demo, we saw the simple workflow type, which is a linear sequence of tasks assigned to users; we didn’t get a look at the checklist type, so I’m not sure how its functionality differs. These are the only types available for automated processes in Blueworks Live, although they plan to add more in the future.
  • Select the space for the process definition, which might be a personal sandbox or a department such as Marketing.
  • Add instructions to be provided when an instance of the process is launched.
  • Configure some of the labels that will appear in the running process to make them more specific to the process.
  • Add one or more tasks, which will be executed sequentially in each process instance. For each task, specify the description, who the task is assigned to (or leave it blank to have it assigned at runtime), and whether the task is an approval step.
  • Share the process definition with participants of that space, who will then have it available as a process type to instantiate from their Work section.
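The wizard steps above boil down to a small amount of structured data plus strictly sequential execution. A sketch of what such a definition might look like, using hypothetical field names (Blueworks Live’s internal format isn’t published):

```python
# Hypothetical representation of a "Simple Workflow" definition,
# mirroring the wizard fields described above.
process_definition = {
    "name": "Purchase Request",       # process application name
    "type": "Simple Workflow",        # the other option is "Checklist"
    "space": "Marketing",             # where the definition is shared
    "instructions": "Attach a quote before launching this request.",
    "tasks": [                        # run sequentially in each instance
        {"description": "Review request", "assignee": "supervisor", "approval": True},
        {"description": "Issue purchase order", "assignee": None, "approval": False},
    ],
}

def run_instance(definition, complete_task):
    """Execute tasks one at a time, in order; a blank assignee is
    filled in at launch time by whoever starts the instance."""
    for task in definition["tasks"]:
        assignee = task["assignee"] or "assigned-at-launch"
        complete_task(assignee, task)
```

The key constraint is that the engine only supports this linear shape: no branches, no parallel paths, just one task completing before the next is assigned.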

The whole process creation took only a couple of minutes, and when we returned to the user’s Work section where we had started, the new process template was available in the sidebar.

Launching and Participating in a Process

We then logged on as a different user to create a process instance from that template. Since this user presumably has access to the space in which the process designer saved the process template, it appears in the sidebar of our Work section. Clicking that link kicks off a process instance:

  • The instructions specified by the process designer are displayed.
  • Fill in the name and details fields.
  • Add a desktop document as an attachment; this is uploaded and shared with all the participants.
  • Select a due date for each of the tasks.
  • For the task that wasn’t pre-assigned to a user, assign the user.
  • Launch it to kick off the first task.

Returning to the main Work section, we can now see that process instance in the “Work I’ve launched” tab, and can open and track its progress from there.

[Image: Launching a process in IBM Blueworks Live]

When we move over to the Community section, we can now see our private activity stream, which includes two new events: first, that we launched the workflow, and second, that the first task in that workflow was received by the user to which it was assigned. By default, all of the events for every process that we’ve launched will appear in our activity stream.

We then switched back to the original user, who was also the user to whom the first task in the process was assigned, to see what it looks like to participate in a process. An email was already waiting to tell us that we had a new task, complete with a link to the task, or we could have found the task directly in the Work section of Blueworks Live under the “Tasks assigned to me” tab. Regardless of how it was opened, we can then complete the task:

  • View the process name and details provided by the process originator.
  • View the attached document. It appears that we could also have added more documents at this point, although we didn’t see that.
  • Add a comment, which appears in a comments timeline on the side of the process information.
  • View the tasks to be completed. Since the first one is assigned to us, and it was an approval task, there are Approve and Reject buttons on the task.
  • Click the Approve button to mark the task as completed. I assume that tasks that are not approval tasks have a simple Complete button or something similar so that the participant can mark the task as complete, although we didn’t look at that.

[Image: Participating in a process in IBM Blueworks Live]

There are a number of other options that appear to be available at this point, such as reassigning the remaining tasks to different users, although we didn’t explore them; essentially, this user is done with their task and the process. If we move to the Community area and look at the private activity stream for this user, we can see that in addition to creating and sharing the process template, the approval task also appears there.

Overall, although there’s nothing really new about this sort of easy sequential workflow design and execution, the user interface is clean and uncluttered, and pop-up tips on the fields assist the user on what to enter. Assuming that you can wrench your users away from using email for these processes, there won’t be much of a learning curve for them to create new processes on their own, and even less to use processes created by others. If you want to see this in action, there’s a Blueworks Live YouTube channel with a couple of videos on creating and participating in a process.

A user with administrative privileges can view some basic aggregate reports on these processes, including some graphical views of process template usage, user participation and on-time completion; this is generated as an Excel spreadsheet that is downloaded and viewed on the desktop, not as an integrated reporting or dashboard view. It’s very rudimentary, but may be sufficient for the types of processes that are likely to be automated using this tool.

To finish up, we also looked at the Library section again; as a logged-in user, we could now see some additional content areas, including links to Blueprint process models, which could then launch the familiar Blueprint environment within Blueworks Live for complex process discovery and modeling. As I mentioned earlier, this is a completely different modeling environment than the “process automation” that I described above; these processes will be exported to an on-premise IBM Lombardi process engine for execution.

There are three levels of Blueworks Live users:

  • Community, free, which allows you to view the public and private communities, although it’s not clear what the private community is in the case of a free user.
  • Contributor, $10/month, which adds all the functionality of creating and running the simple process applications that I’ve described above, plus the ability to review and comment on Blueprint process models.
  • Editor, $50/month, which adds the full Blueprint modeling capability.

Although paid users now get more than former (paid) Blueprint users did, with the addition of the simple process automation, free users of the old BPM BlueWorks Beta site have lost a whole bunch of capabilities, unless we just skipped that part of the demo.

The Verdict

In a nutshell, Blueworks Live provides some private and public community functionality, allows you to create (Lombardi) Blueprint process designs, and automate simple processes. But these are two very different tools: the online mini processes with the Blueworks Live automation engine (based on two basic templates, workflow and checklist), and the Blueprint processes, some of which will be moved to an on-premise Lombardi system. Different interfaces, different engines, different everything except that they’re contained within the same portal.

The Twitter stuff is pretty useless for those of us who are already competent at monitoring Twitter using a tool such as Tweetdeck. I’m never going to go to Blueworks Live to look at the public Twitter stream; I probably already follow the same people in my BPM Twitter list, and if I want to see what’s happening with #bwlive, I’ll just add it as a search column. It’s probably good for Twitter newbies who haven’t figured out groups, hashtags or Tweetdeck yet; maybe that’s more representative of the expected user base.

Except for the Twitter stream, the only community content appears to be the current BlueWorks blog content, written mostly by IBM. The online execution isn’t really community, it’s process execution in a semi-collaborative space, which is different. The forums (mostly product/site help) and media library (including webinars, white papers and the various modeling tools such as strategy and capability maps) from the old BPM BlueWorks Beta site are missing, or at least not displayed in the version that we saw. Although Blueworks Live definitely has some improved functionality such as process execution, this is really a collection of non-integrated tools, and it’s not clear that they’ve reached their goals regarding a public BPM community.

They’re not the first to have cloud-based process execution, but they are IBM, and that lends some credibility to the whole notion of running your business processes outside the firewall. Like the entry of other large players into the cloud BPM marketplace, I believe that this will be a benefit to all cloud BPM providers since it will validate and enlarge the market. This validation of cloud-based BPM is a real game-changer, if not Blueworks Live itself.

[Image: Blueworks Live Launch]

What Organizations Want From Case Management

There was an AIIM webinar today on supporting the information worker with case management, featuring Craig Le Clair from Forrester.

Le Clair introduced the concept of information workers, a term that they use instead of knowledge worker, defined as “everyone between 18 and 88 with a job in which they use a computer or other connected device” – a definition so broad as to be completely useless, but it allows them to use the cute abbreviation iWorker. Today, however, he focused just on those iWorkers who are involved with case management, in other words, what the rest of us would call knowledge workers. Whatever.

Forrester uses the term dynamic case management – others use advanced or adaptive case management, but we’re talking about the same thing – to mean “a semistructured but also collaborative, dynamic, human, and information-intensive process that is driven by outside events and requires incremental and progressive responses from the business domain handling the case.” Le Clair provided a quick summary of dynamic case management, with the document-centric case file as the focus, surrounded by processes/tasks, data, events and other aspects that make up the entire history of a case. There are some business challenges now that are driving the adoption of case management, including cost and risk management for servicing customer requests, enforcing compliance in less structured processes, and support for ad hoc knowledge work. He spoke specifically about transparency in regulatory compliance situations, where case management provides a way to handle regulatory processes for maximum flexibility while still enforcing necessary rules and capturing a complete history of a case, although most customers are more focused on case management specifically for improving customer service.

He described case management as a convergence of a number of technologies, primarily ECM, BPM, analytics and user experience, although I would argue that events and rules are equally important. Dynamic allocation of work is key: a case can select which tasks should be applied to it, and even who should be involved, in order to reach the specified states/goals of the case. Some paths will include structured processes, others will be completely ad hoc, and others may involve a task checklist. Different paths selected may trigger rules and events, or offer guidance on completion. Different views of the case may be available to different roles. In other words, case management tries to capture the flexible experience of working on a case manually, but provides a guided experience where regulations demand it, and captures a complete audit trail as well as analytics of what happened to the case.

Forrester predicts that three categories of case management will emerge – investigative, service requests and incident management (can you sense three separate Forrester Waves coming?) – focused on different aspects of customer experience, cost control and risk mitigation. Key to making these work will be integration of core customer data directly into the case management environment, both for display to a case worker as well as allowing for automated rules to be triggered based on customer data. There are some challenges ahead: IT is still leading the configuration of case management applications, and it just takes too long to make changes to the rules, process models and reporting.

He was followed by Ken Bisconti from IBM’s ECM software products group, since IBM sponsored the webinar, talking about their new Case Manager product; I wrote about what Ken and many others said about this at the IOD conference last month, and just had an in-depth briefing on the product that I will be writing about, so won’t cover his part of the presentation today.

Time For Enterprise 2.0 To Get Enterprisey

The funny thing about “Enterprise 2.0”, or social business software, is that it’s not very enterprisey: yes, it is deployed in enterprises, but it often doesn’t deal with the core business of an enterprise. You hear great stories about social software being used to strengthen weak ties through internal social networking, or fostering social production by using a wiki for project documents, but far fewer stories about using social software to actually run the essential business processes. Andrew McAfee recently wrote about his experience talking to a group of CIOs, and how they were seeing social software as becoming mainstream, but one comment struck me:

[The CIOs] weren’t too worried that their people would use the tools to waste time or goof off. In fact, quite the opposite; they were concerned that the busy knowledge workers within their companies might not have enough time to participate.

The fact that the knowledge workers had a choice of whether to participate tells me that the use of social business software is still somewhat discretionary in these companies, that is, it’s not running the core business operations; if it were, there wouldn’t be a question of participation.

At the Enterprise 2.0 conference in June, my only blog post was something of a rant on the emperor having no clothes, since I believe that this has to be about the core business or it’s just not very interesting (and likely won’t survive an economic downturn). Interestingly, Michael Idinopulos of Socialtext was at the same conference, and saw some evidence of the shift towards the idea that “social software delivers business value when it integrates with business process” (I wish I had been in some of the sessions that he was, since he obviously saw evidence of this opinion being further along than I did).

I’m starting to see some similar opinions emerging from a variety of sources, or maybe the recent Enterprise 2.0 conference in Santa Clara has just heated up the same discussion again. Klint Finley of ReadWrite Enterprise, hearkening back to Idinopulos’ post, thinks that enterprise 2.0 needs to be tied to business processes. Tom Davenport recently wrote about the need to add structure to social in order to bring enterprise value:

Well before personal computers enabled online chatter, they helped bring structure to work. Transaction systems like ERP and CRM, tools for workflow and document management, and project management systems all made it more clear to people what they need to do next in their jobs. That capability has undoubtedly led to productivity gains.

But work effectiveness also demands that people share their knowledge and expertise with each other. That’s where social media comes in. It makes it easy to reach out to others for help in making a decision or taking an action. And the transfer of knowledge through social media doesn’t require a lot of difficult knowledge management work in advance.

Be sure to read Davenport’s example of what’s happening at Cognizant, where they’re combining project/task management and social resources: effectively combining social and core business processes.

Meanwhile, while the social business software vendors have been stumbling towards process, the BPMS vendors have been stumbling towards social. I first presented on the ideas of social features in BPMS in 2006, and while a lot of what I predicted then has come to pass, there are many things that I didn’t even imagine four years ago. Although many vendors focus on the social aspects of process discovery and design, I don’t think that’s where the true impact will be felt: social process execution is the key to bringing together the productivity, governance and quality improvements of BPM with the networking and cultural aspects of social software. Having social features at runtime as innate capabilities for all process participants – through the entire spectrum from structured processes to unstructured collaboration – is what will really make social software (or rather, social features of enterprisey software such as BPM) mainstream.

What concerns me is the divide between social business software and enterprise software vendors. I don’t think that most social business software is capable of managing industrial-strength core business processes. I also don’t think that most BPM software is capable of doing social collaboration really, really well – at least, not yet. However, the BPMS vendors have already done the heavy lifting of creating tools to manage business processes and gaining the trust of customers to manage those processes, and I expect that we’ll continue to see rapid expansion of the social features of BPMS, through acquisition or organic internal development. Although there’s still undoubtedly a place for social business software as a standalone category, those companies looking to take on the social aspects of core business processes may want to position themselves for acquisition by one of those deep-pocketed BPMS vendors.

Taking a BPMS Test Drive

Last week, I tried out the Ultimus Test Drive, a guided hands-on session using a process application built with the Ultimus BPMS. The process itself was fairly simple – a purchase request process – since this test drive is targeted at end users, not analysts or developers, and intended to give the user a hands-on look at what it’s like to participate in the process.

The script for the session is pretty straightforward: a purchase request is made by one person, approved by their supervisor, then sent to their manager if the amount is over a threshold. Once approved, the purchase order is created and emailed to the requester.
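The routing in this demo process amounts to a single threshold rule. A sketch of that rule in Python, with an illustrative threshold amount (the test drive doesn’t state the actual value):

```python
APPROVAL_THRESHOLD = 1000  # illustrative cutoff; the actual amount isn't stated

def approval_route(amount: float) -> list:
    """Return the ordered list of approvers for a purchase request:
    the supervisor always approves; the manager is added only when
    the amount exceeds the threshold."""
    route = ["supervisor"]
    if amount > APPROVAL_THRESHOLD:
        route.append("manager")
    return route
```

This is exactly the kind of conditional routing that makes a guided end-user demo worthwhile: the participant sees the process behave differently depending on the data they enter.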

[Image: Ultimus Test Drive 2010]

The real innovation here is not the application itself, but the method that Ultimus is using to have people try it out. Although the demo is scripted and partially guided, you’re actually interacting with a live version of Ultimus via GoToMyPC, not a screen capture animation. Optionally, someone from Ultimus will watch over what you’re doing and be on the phone with you, providing help if you get off track.

If you’re in the market for a BPMS, getting your hands on software as part of the evaluation process is important, but is rarely done very satisfactorily. In many cases, organizations still buy (very expensive) software based only on what they see in a demo given by the vendor’s sales team, without ever trying it out for themselves. I have seen some situations that improve on that by providing the analyst/developer view of the product in a hosted environment (such as EC2) for a trial period; this allows the techies to do an evaluation of the look and feel of the components that they will use, but rarely the end users. Short of an onsite proof of concept, it’s rare for the potential end-users to have a chance to try things out. The exception, of course, is cloud-based BPMS where you can get a limited trial license for free, or nearly so, but that covers only a small subset of the BPMS vendors out there today, and if you’re looking at on-premise software for your final solution, you may not want to limit your search to those that also have a cloud version.

We need more of what Ultimus is doing in terms of customer (or prospect) education: the chance for a quick, hands-on demo of software that we’re expecting people to spend a good part of their day interacting with in the future.

Webinar on Fast-Tracking BPM Projects

I’ll be speaking on a webinar this Thursday, November 18th, along with Michael Rowley of Active Endpoints, about fast-tracking BPM projects. The Active Endpoints tools are targeted at an IT audience of architects, developers and technical business analysts, and that’s exactly who we’re focusing this webinar on as well. From the description:

Time and resources are limited and you are tasked with automating processes for your business faster, cheaper and better than ever before. How do you develop service-oriented process applications that address the needs of the business AND meet their deadlines? What process automation tools and techniques are used today successfully by others in your situation? How do you quickly create working prototypes, while avoiding a drawn out process of creating written requirements, and still provide traceability from requirements to implementation?

You can register here for the webinar; hope to see you there.

Smarter Infrastructure For A Smarter Planet

Kristof Kloeckner, IBM’s VP of Strategy & Enterprise Initiatives, System and Software, and CTO of Cloud Computing, delivered today’s keynote on the theme of a smarter planet and IBM’s cloud computing strategy. Considering that this is the third IBM conference that I’ve been to in six months (Impact, IOD and now CASCON), there’s not a lot new here: people + process + information = smarter enterprise; increasing agility; connecting and empowering people; turning information into insights; driving effectiveness and efficiency; blah, blah, blah.

I found it particularly interesting that the person in charge of IBM’s cloud computing strategy would make a comment from the stage that he could see audience members “surreptitiously using their iPads”, as if those of us using an internet-connected device during his talk were not paying attention or connecting with his material. In actual fact, some of us (like me) are taking notes and blogging on his talk, tweeting about it, looking up references that he makes, and other functions that are more relevant to his presentation than he understands.

I like the slide that he had on the hype versus enterprise reality of IT trends, such as how the consumerization of IT hype is manifesting in industrialization of IT, or how the Big Switch is becoming reality through multiple deployment choices ranging from fully on-premise to fully virtualized public cloud infrastructure. I did have to laugh, however, when he showed a range of deployment models where he labeled the on-premise enterprise data center as a “private cloud”, as well as enterprise data centers that are on-premise but operated by a 3rd party, and enterprise infrastructure that is hosted and operated by a 3rd party for an organization’s exclusive use. It’s only when he gets into shared and public cloud services that he reaches what many of us consider to be “cloud”: the rest is just virtualization and/or managed hosting services where the customer organization still pays for the entire infrastructure.

It’s inevitable that larger (or more paranoid) organizations will continue to have on-premise systems, and might combine them with cloud infrastructure in a hybrid cloud model; there’s a need to have systems management that spans across these hybrid environments, and open standards are starting to emerge for cloud-to-enterprise communication and control.

Kloeckner feels that one of the first major multi-tenanted platforms to emerge (presumably amongst their large enterprise customers) will be databases; although it seems somewhat counterintuitive that organizations nervous about the security and privacy of shared services would use them for their data storage, in retrospect, he’s probably talking about multi-tenanted on-premise or private hosted systems, where the multiple tenants are parts of the same organization. I do agree with his concept of using cloud for development and test environments – I’m seeing this as a popular solution – but believe that the public cloud infrastructure will have the biggest impact in the near term on small and medium businesses by driving down their IT costs, and in cross-organization collaborative applications.

I’m done with CASCON 2010; none of the afternoon workshops piqued my interest, and tomorrow I’m presenting at a seminar hosted by Pegasystems in downtown Toronto. As always, CASCON has been a great conference on software research of all types.

Iterative Development in BPM Applications Using Traceability

The last speaker at the CASCON workshop is Sebastian Carbajales of the IBM Toronto Software Lab (on the WebSphere BPM team), on the interaction between a technical business analyst using WebSphere Business Modeler (WBM) and an IT developer using WebSphere Integration Developer (WID), particularly how changes made to the business-level model in WBM are reflected in WID. The business view is quite different from the IT view, and doesn’t necessarily directly represent the IT view; this is a common theme at this conference, and in general for vendors who don’t use a shared model approach but rely on some sort of model transformation. Given that there are two different models, then, how do business and IT collaborate in order to keep the models in sync?

They first looked at maintaining a separation of concerns between business and IT that would minimize the need for changes by IT in response to a business change. This comes down to separating the business logic and rules from the implementation, and the separation of artifacts with well-defined logic from those with undefined logic. I’m not sure that I really get the distinction between the well-defined and undefined logic artifacts, or the benefits of separating them, although my instinct would be to externalize much of the business logic into a rules environment that the business analyst and/or business manager could manipulate directly.

They also looked at tool-assisted model merging to allow models to be compared in the specific user’s domain, then selectively apply and merge the changes into existing models. This would speed development as well as improve model quality by reducing translation errors. There are some very similar concepts to those discussed in the previous paper on model change management, although with the added complexity of multiple modeling environments. A key goal is to improve the accuracy of model change detection, both to identify the objects in each model type and the relationships across the business-IT model transformation, and they used a traceability mechanism to do this. They generate a traceability map when the business to IT model transformation is originally done, capturing the identity of, and the relationships between, each object in the models, which allows traceability of changes on either model type.
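To make the idea of a traceability map concrete, here’s a minimal sketch of what such a bidirectional mapping might look like; all class, method and element names are my own invention for illustration, not IBM’s actual implementation or format.

```python
# Illustrative sketch of a traceability map: at transformation time it
# records which IT-model artifacts each business-model element maps to,
# so a later change on either side can be traced to its counterparts.

class TraceabilityMap:
    def __init__(self):
        self.biz_to_it = {}   # business element id -> set of IT artifact ids
        self.it_to_biz = {}   # IT artifact id -> set of business element ids

    def record(self, biz_id, it_id):
        """Capture one correspondence created by the business-to-IT transformation."""
        self.biz_to_it.setdefault(biz_id, set()).add(it_id)
        self.it_to_biz.setdefault(it_id, set()).add(biz_id)

    def it_artifacts_for(self, biz_id):
        """Which IT artifacts are affected if this business element changes?"""
        return self.biz_to_it.get(biz_id, set())

    def biz_elements_for(self, it_id):
        """Which business elements does a change to this IT artifact touch?"""
        return self.it_to_biz.get(it_id, set())

# Recording the transformation of one business activity (hypothetical names):
tmap = TraceabilityMap()
tmap.record("ApproveRequest", "ApproveRequest_HumanTask")
tmap.record("ApproveRequest", "ApprovalEscalation_Rule")
```

With a map like this, a change to the “ApproveRequest” activity in the business model can be flagged against both IT artifacts it generated, and an IT-side change can be filtered down to only those updates that matter at the business level.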

He walked through a typical scenario, where the BA creates a process model in WBM, then exports/publishes it, which is then imported by IT into WID and enhanced with implementation artifacts. When a change is made by the BA to the original model, and re-exported, that modified model is compared to the enhanced WID model to create a new, merged WID model. Then, the change report is exported from WID, and any business-level changes are compared and merged back into the WBM model. Yikes.

Having a traceability map allows an IT developer to filter changes based on the business or IT artifacts, do visual comparisons and selective merging of the models. On the return trip, the BA can view updates to the process model, business services and business service objects that might impact the business logic, and select to apply them to the business-level models. The traceability is the key to model governance when multiple model types undergo transformations as part of the modeling and implementation lifecycle.

Following Carbajales’ presentation, we had a round-table discussion on the two process modeling themes of collaboration and consistency management to finish up the workshop. Some good ideas on the reality of business-IT collaboration in process modeling.

Process Model Change Management

Jochen Küster of the IBM Zurich research lab (where they do a lot of BPM research) was first after the morning break at our CASCON workshop on collaboration and consistency management in BPM, presenting on process model change management. This was more of a technical talk about the tools required. As motivation, process models are a key part of model-driven development, and become input to the IT process modeling efforts for SOA implementations. Multiple models of different types will be created at different points in the modeling lifecycle, but individual models will also go through multiple revisions that need to be compared and merged. This sort of version management – which allows models to be compared with differences highlighted, then selectively merged – doesn’t exist in most process modeling tools, and didn’t exist at all in the IBM process modeling tools when their research started. This is different from just keeping a copy of all revised process models, where any one can be selected and used.

In order to do this sort of comparison and selective merging, it’s necessary to generate a change log that can be used for this purpose, logging not only atomic operations but also rolling those up into compound operations that denote an entire process fragment. Furthermore, the merged model generated by the selective application of the changes must still be valid, and must be checked for correctness: a resolution of the changes following a detection of the changes.
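The roll-up from atomic to compound operations can be sketched as follows; the operation and fragment names here are hypothetical, and this is only a toy illustration of the grouping idea, not the actual change-log format.

```python
# Hedged illustration: grouping atomic edit operations by the process
# fragment they touch, so each fragment's edits form one compound
# operation that can be applied or rejected as a unit during merging.

atomic_log = [
    ("insert_activity", "CheckBudget",                  "fragment_7"),
    ("insert_activity", "NotifyFinance",                "fragment_7"),
    ("insert_edge",     "CheckBudget->NotifyFinance",   "fragment_7"),
    ("delete_activity", "OldReview",                    "fragment_3"),
]

def roll_up(log):
    """Group atomic operations into compound operations, one per fragment."""
    compounds = {}
    for op, element, fragment in log:
        compounds.setdefault(fragment, []).append((op, element))
    return compounds

compound_ops = roll_up(atomic_log)
# compound_ops["fragment_7"] now holds all three edits that together
# insert one new process fragment
```

Presenting the three “fragment_7” edits as a single compound operation is what lets a user accept or reject the insertion of the whole fragment at once, rather than facing a list of individual node and edge changes that only make sense together.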

The solution starts with a tree-like decomposition of the process model into fragments, with correspondences being determined between model elements and fragments; this was the subject of research by the Zurich lab that I saw presented at BPM 2008 in Milan on parsing using a refined process structure tree (PST). A key part of this is to identify the compound operations that denote the insertion, movement or deletion of a process fragment. The current research is focused on computing a joint PST (J-PST) for the merged process, which is the combination of two PSTs determined by the earlier decomposition, based on the correspondences found between the two models. The dependencies are also computed: that is, which process fragments and activities need to be inserted, moved or deleted before others can be handled.

The results of this research have been integrated into WebSphere Business Modeler v7.0, although it’s not clear whether this is part of production code or a prototype integration. In the future, they’re looking at improving usability, particularly around model difference resolution, then integrating and extending these concepts of change management and consistency checking to other artifacts such as use cases and SCA component diagrams.