Day 2 Keynote at BPMCM15

Second day at the BPM and Case Management Summit in DC, and our morning keynote started with Jim Sinur — former Gartner BPM analyst — discussing opportunities in BPM and case management. He pointed out the proven benefits of process and case management in terms of improving revenue, costs, time to market, innovation and visibility, while paving a path to digital transformation. However, these tried-and-true ROI measures just aren’t enough these days: we also need to consider customer loyalty, IoT, disruptive companies and business models, and in general, maintaining competitive differentiation in whatever way necessary to thrive in the emerging marketplace. Accommodating this, as well as attracting good workers, requires breaking the specialist mindset and allowing people to become knowledge workers. I gave a workshop last week at the IRM BPM conference on the future of work, and I agree that this is a key part of it: more of the routine work is being automated, leaving the knowledge work for the people in the process; this requires a work environment that allows people to do the right thing at the right time to achieve a goal, not just work at a pre-defined task in a pre-defined way. Sinur cited a number of examples of processes that are leveraging emerging technologies, including knowledge workers’ workbenches that incorporate smart automated agents and predictive analytics, and IoT applications in healthcare and farming. The idea is to create goal-driven and proactive “swarming” processes that figure out on their own how to accomplish a goal through both human and automated intelligence, then assemble the resources to do it. Instead of pre-defining processes, you provide goals, constraints, analytics and contexts; the agents — including people, services, bots and sensors — create each process instance on the fly to best fit the situation. Although his case studies included a number of other technologies, he finished with a comment on how BPM and case management can be used to coordinate and orchestrate these processes as we move to a new world of digital transformation of the customer experience.
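
To make the swarming idea concrete, here’s a toy sketch (my illustration, not Sinur’s) of assembling a process instance on the fly from whatever agents can satisfy a goal; the goal steps, agent names and capabilities are all hypothetical:

```java
import java.util.List;

public class SwarmSketch {
    // An agent is anything that can advance a goal: a person, service, bot or sensor.
    record Agent(String name, String capability) {}

    public static void main(String[] args) {
        // A hypothetical goal, decomposed into required capabilities.
        List<String> goalSteps = List.of("triage", "assess", "notify");

        List<Agent> available = List.of(
                new Agent("claims-bot", "triage"),
                new Agent("adjuster", "assess"),
                new Agent("email-service", "notify"),
                new Agent("field-sensor", "measure"));

        // No pre-defined flow: the instance is assembled at runtime from
        // whichever agents can satisfy each step of the goal.
        for (String step : goalSteps) {
            Agent chosen = available.stream()
                    .filter(a -> a.capability().equals(step))
                    .findFirst()
                    .orElseThrow(() -> new IllegalStateException("no agent for " + step));
            System.out.println(step + " -> " + chosen.name());
        }
    }
}
```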

Next up was Tom Debevoise, now with Signavio to help promote their recently-released DMN modeler, which we saw in a sneak peek at bpmNEXT. He talked about three levels of decisions — strategic (e.g., should we change our business model), tactical (e.g., which customers to target) and operational (e.g., which discount to apply to this transaction) — and how these tend to be embedded within process models and business application logic, rather than externalized into decision models where they can be explicitly managed. Most organizations manage both their human and their automated decisions very poorly, resulting in inconsistent or just plain wrong decisions being made. In other words, our business decisions are at the same point now as business processes were a decade or more ago, before BPM systems became widespread, and the path to improvement is to treat decision management as a discipline, supported by systems to model and automate decisions. We now have a decision modeling standard, DMN 1.0, which is expected to drive the adoption of decision modeling in organizations in the same way that BPMN did for process modeling. He proposed a decision management lifecycle similar to a BPM lifecycle, starting with decision discovery and modeling using the DMN-standard elements: decisions, input data, knowledge sources, information requirements, authority requirements and knowledge requirements. He wrapped up with the linkage between process and decision models, particularly using the Signavio BPMN and DMN modelers: decisions that are defined external to a process can be used to assign process activity participants, decide on next steps, select the process pathway, define data access control, or detect and respond to events. We saw yesterday how Trisotech’s tools combine BPMN, CMMN and DMN, and today how Signavio combines BPMN and DMN; as more process modeling vendors expand to include decision modeling, we are going to see more integrated implementations of these modeling standards.
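
To make the operational level concrete, here’s a minimal sketch of what externalizing a “which discount to apply” decision can look like, with the rules held as a data table rather than buried in application code; the segments, thresholds and rates are hypothetical, and this is an illustration of the concept rather than Signavio’s tooling:

```java
import java.util.List;

public class DiscountDecision {
    // One row of a decision table: two conditions and an outcome.
    record Rule(String segment, double minOrderValue, double discount) {}

    // The table lives outside the process and application logic, so the
    // business can change the rules without touching the calling code.
    static final List<Rule> TABLE = List.of(
            new Rule("gold",      0, 0.10),
            new Rule("silver", 500, 0.05),
            new Rule("any",   1000, 0.02));

    // First matching row wins, like a DMN decision table with "first" hit policy.
    static double decide(String segment, double orderValue) {
        for (Rule r : TABLE) {
            boolean segmentMatches = r.segment().equals("any") || r.segment().equals(segment);
            if (segmentMatches && orderValue >= r.minOrderValue()) {
                return r.discount();
            }
        }
        return 0.0; // no rule fired: no discount
    }

    public static void main(String[] args) {
        System.out.println(decide("silver", 750.0)); // 0.05
    }
}
```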

The last speaker in the keynote was Lloyd Dugan, on how business architecture and BPM work together, following on from a paper that he wrote last year with Neal McWhorter. Although dense (I recommend checking out the paper at the link), his presentation discussed some of the issues with reconciling business architecture and BPM, such as mapping value stream, balanced scorecard and other BA models to activities within a process model. He reviewed a number of definitions and model types, cutting a wide swath through pretty much everything even remotely related to process and architecture, and highlighting some of the failures of mapping enterprise architecture frameworks to BPMN. He finished with a spectrum from the business model perspective (what the business is doing) to the operational model perspective (how the business is doing it), and how the business architecture and BPM viewpoints differ but can still both use BPMN as a modeling language. I’m pretty sure of two things from this: 1) I missed a lot of the detail, and 2) Dugan has never heard that you’re supposed to have fewer than 500 words on each PowerPoint slide.

BPMN, CMMN and DMN with @denisgagne at BPMCM15

Last session of day 1 of the BPM and Case Management Summit 2015 in DC, and Denis Gagne of Trisotech is up to talk about the three big standards: Business Process Model and Notation (BPMN), Case Management Model and Notation (CMMN), and Decision Model and Notation (DMN). BPMN has been around for a few years and is well-established — pretty much every business process modeling and automation vendor uses BPMN in some form in their process modelers, and it is OMG’s most-adopted standard — but CMMN and DMN are much newer and less widespread in the market: a few vendors offer CMMN modelers, and even fewer offer DMN. There are two major benefits to standards such as BPMN, CMMN and DMN, in addition to the obvious benefit of providing an unambiguous format for modeling processes, cases and decisions: they allow models to be interchanged between different vendors’ products, and they provide a common and readily-transferable “language” learned by analysts. This interchangeability, both of models and of skills, means that organizations don’t need to be quite so worried about which modeling tool they use, or about the people that they hire to use it. Denis was at OMG’s Model Interchange Working Group (MIWG) meeting in Berlin last week, where they showed all types of interchange for BPMN; with luck, we’ll be seeing the same activities for the other standards as they become widely adopted.

There are some grey areas about when to use BPMN versus CMMN, since both are (sort of) process-based. However, the main focus in BPMN is on activities within processes, whereas CMMN focuses on events that impact cases. He showed a chart comparing different facets of the three standards:

BPMN           CMMN                            DMN
Processes      Cases                           Decisions
Activities     Events                          Rules
Transitional   Contextual                      Applied
Data           Information                     Knowledge
Procedural     Declarative                     Functional
Token          Event Condition Action (ECA)    First Order Logic (FOL)

The interesting part (at least to me) comes when we look at the bridges between these standards: in BPMN, there is a business rule task that can call a decision in DMN; in CMMN, there is a process task that can call a process defined in BPMN. Trisotech’s versions of these modelers also provide a case task type in BPMN that can call a CMMN case, and a decision task in CMMN that can call a DMN decision (these task types are not yet in the standards, but count on Denis to get them in there). There are some patterns to watch for when modeling that might indicate that you should be using another model type:

  • In BPMN, if you have a lot of gateways expressing business logic, then consider moving the gateway logic to DMN
  • In BPMN, if you have a lot of events especially boundary events, then consider encapsulating that portion into a CMMN case
  • In BPMN, if you have a lot of ad hoc subprocesses, then consider using CMMN to allow for greater specification of the ad hoc activities
  • In CMMN, if you have a lot of task interdependencies, consider using BPMN to replace the temporal dependencies with flow diagrams

The recognition and refactoring of these patterns are pretty critical for choosing the right model type, and are likely an area where a more technically trained analytical eye can suggest improvements to models created by a less-technical analyst who isn’t familiar with all of the model types, or with how to think about this sort of decomposition and linking.

He demonstrated integration between the three model types using the Trisotech BPMN, CMMN and DMN modelers, where a decision task in the BPMN modeler can link directly to a decision within a model in the DMN modeler, and a case task in BPMN can link directly to a case model in the CMMN modeler. Nice integration, although it remains to be seen what analyst skill level is required to model across all three types, or how to coordinate different analysts who might each be modeling in only one of the three types, with the resulting models loosely coupled across authors.

Disclosure: I’m doing some internal work with Trisotech, which means that I have quite a bit of knowledge about their products, although I have not been compensated in any way for writing about them here on my blog.

Fannie Mae Case Study on Effective Process Modeling at BPMCM15

Amit Mayabhate from Fannie Mae (a US government-sponsored mortgage lender that buys mortgages from the banks and packages them for sale as securities) gave a session at the BPM and Case Management Summit on outcome-based process modeling for delivering business value. He had a few glitches getting started — apparently Fannie Mae doesn’t allow employees to download a presentation to their laptop, so he had to struggle through getting connected to the conference wifi and then the Fannie Mae VPN to open a PDF of his presentation — but did tell the best joke of the day when he was restarting his computer in front of us and said “now you know my password…it’s 8 dots in a row”.

Back on track, he discussed their business architecture efforts and how process modeling fits into them. Specifically, he talked about their multifamily housing division, which had its own outdated and inflexible technology platform that they wanted to swap out for a simpler infrastructure that would give them better access to information for better decision-making. To get there, they decided to start with the best possible outcome in mind, but first had to get the organization to understand not only that it had problems, but also some quantification of how big those problems were, in order to set future goals. They identified several key metrics where they could compare today’s measurements with their desired future goals, such as operational efficiency (manual versus automated) and severability. To map from the current to the future state, they needed a transformation roadmap and a framework for achieving the steps on the roadmap; this included mapping their journey to greater levels of process maturity, and creating a business capability model that included 17 capabilities, 65 functions, 262 sub-functions, and around 300 process flows.

Their business architecture transformation framework started with the business model (how do we make money), the operating model (how do we behave to make money) and the business capability model (what abilities are needed), using the Business Model Canvas framework. They used other architecture analysis tools as well, such as plotting business process standardization against business process integration for both their current state and desired future state, to help develop the strategy for moving between them. They used Mega’s business strategy module for most of the architecture analysis, which helps them identify business processes that are ripe for automation before moving to a BPMS for process modeling and automation. In that way, they model only the processes where they know that the architectural change will provide value, rather than attempting to boil the ocean by modeling every process in the organization.

Top 10 Trends of Digital Enterprise with @setrag at PegaWorld 2015

I finished my visit to PegaWorld 2015 in the breakout session by Setrag Khoshafian, Pega’s chief BPM evangelist, on the top 10 trends for the adaptive digital enterprise:

  1. Context matters. Analyze and understand the information about your customers and your interactions with them.
  2. Connecting customers to operations. Think of your customers’ actions online as part of your business process, and implement your processes accordingly.
  3. The rise of things. Specifically, the process of things: devices become actors in sense-and-respond processes, and channels for customer engagement, across the seven levels of the IoT World Forum reference model.
  4. Design time blurs with runtime. Model-driven development, and moving from explicit requirements collection to having the business make changes on the fly.
  5. The digital executive. Making it so simple that even a C-level executive can understand it. Or better yet, finding executives that actually have a digital clue.
  6. Two-speed IT. Setrag talked about mapping business KPIs and value streams directly onto case phases; being able to do this using Agile model-driven development while still doing more traditional maintenance on underlying legacy systems allows for fast-tracking strategic projects.
  7. The database is dead, long live the database. Moving from transactional SQL-based RDBMS to the new crop of NoSQL databases optimized for big data, analytics and more dynamic applications.
  8. Smart enterprises avoid the app-ocalypse. Allow for native apps as well as responsive web for the most flexibility.
  9. The hybrid cloud. Well, that goes back to what we can really call “cloud”.
  10. The rise of design. Users expect beautiful design; if you don’t live up to those expectations, you degrade the customer experience and therefore your relationship with them.

He was joined by Adam Field from Pega’s technology innovation area to show some examples of what Pega is doing in mobile user experience and device management.

Fast-paced and fun view of what’s driving the digital enterprise, and a bit of how Pega is helping its customers meet those needs.

That’s it for PegaWorld 2015: it was a quick but information-filled trip. I’ll be at the IRM BPM conference in London next week, then the BPM and Case Management Summit in DC the week after; watch for my coverage from those events.

TD Bank at PegaWorld 2015

I attended a breakout presented by TD Bank (there was also a TCS presenter, since they’ve done the implementation) on their workflow system for customer maintenance requests – it’s a bit of a signal about the customer, and possibly the SI, that this is called “workflow” – and how they have implemented this using Pega. They started with PRPC 6.3 and there was no indication that they’ve upgraded to Pega 7, which would give them a whole raft of new functionality.

Customer maintenance requests include any non-financial transaction that a customer may request, such as an address change, which may be fulfilled manually, semi-automatically or automatically based on Pega business rules. They’re measuring the ROI primarily in terms of improving efficiency (increased throughput, reduced processing time, reduced paper) and improving quality and regulatory compliance (reconciliation of work received and processed, data capture validation, identification of trends, better reporting to compliance). He did mention the improved customer experience, although mostly in terms of the call center/branch staff rather than the actual customer, and turned even that back to branch efficiency and productivity. There was a mention that this would result in lower wait times for customers while they were in the branch making the request, but this is far out of touch with the realities of customer experience these days, as evidenced by the keynote that we saw this morning with AIG and RBS. It was (I think) a technical presenter from TCS going through this part, but the lack of awareness of how far they are from understanding the customer journey was depressing. This is one of the dangers of treating internal stakeholders as the customer rather than maintaining an awareness of the actual customer and their requirements: the internal operations customer is mostly motivated by improving efficiency and compliance, not by making sure that their real customer isn’t walking out the door to a bank that pays attention to their needs. We can’t throw away the concepts of efficiency and compliance, but I find in dealing with my banks (yes, more than one, because none of them gives me everything that I need) that there are still too many processes that require my presence in a branch, a physical signed document or a call to a call center, when they have already authenticated me in so many ways online.

They talked about their development process and some of the best practices and lessons learned: allowing time for visual screen mockups during inception in order to reduce rework later (they seriously didn’t know that?), participation from other groups such as application integration (?!), and including a Pega deployment architect to make sure that things get into production the right way. TD Bank has been using Pega for about eight years, and they seem to be rooted in older versions and older development methodologies. Definitely in need of some digital transformation.

I didn’t attend this session with the goal of poking fun at TD or TCS, but this is really an example of old-school (probably waterfall) development methods that are not going to give them big wins in the long run. It’s clear that there is very deep integration with their other systems, and a lot of use of the Pega CPM framework and rules, but also that there has been a lot of custom work here: PRPC used as an application development tool. This is pretty typical of what I have seen with Pega customers in the past, although Pega’s recent shift to providing applications rather than frameworks is an obvious strategy to move to less-customized solutions that can be deployed faster. For the customers still plugging away on 6.x, that might be more of a dream than reality.

Pega 7 Express at PegaWORLD 2015

Adam Kenney and Dennis Grady of Pega gave us the first look at Pega 7 Express: a new tool for building apps on top of the Pega infrastructure, allowing Pega to push into the low-code end of the BPM/ACM market. In part, this is likely driven by the somewhat high degree of technical skill that has traditionally been required to create applications using Pega, but also by the fact that customer experience is becoming a key differentiator, creating pressure to build good customer-facing applications faster. Customer experience, of course, is much more than just the type of apps that you’re going to create using Pega 7 Express: it’s new devices and methods of interaction, but all of these are setting the bar high and changing customer expectations for how they should be able to deal with vendors of goods and services. Pega 7 Express is part of the Pega 7 platform, using the same underlying infrastructure: it’s (just) a simpler authoring experience that requires little or no advance training.

We saw an introductory video, then a live demo. It includes graphical data modeling, form building and case configuration, all with multi-device support.

Call it end-user computing (EUC), citizen computing or low-code model-driven development: Express addresses the problem area of applications that were traditionally built using email, spreadsheets and local desktop databases (I’m looking at you, Excel and Access). I’m not going to enumerate the problems with building apps like these; suffice it to say that Express allows you to leverage your existing Pega infrastructure while allowing non-Java developers to build applications. They even include some badges for gamifying achievements – when you build your first app, or personalize your dashboard. Just-in-time learning is integrated so that you can see an instructional video or read the help as you need it, plus in-context guidance while you’re working.

In the demo, we created a new case-based app by specifying the following:

  • Application name, description and logo
  • Case type
  • Major phases (a straight-through process view)
  • Steps in each phase
  • Form design for specific steps – the data model is created behind the scenes from the form fields (see the sketch after this list)
  • Routing/assignment to reviewers and approvers
  • Milestones and deadlines
  • Device support
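
Pega doesn’t expose this step as code, but the general idea of deriving a data model behind the scenes from form fields can be sketched as follows; the field names and types are hypothetical, and this is an illustration of the concept rather than Pega’s implementation:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class FormToDataModel {
    // A field as drawn in the form designer: a label and a declared type.
    record FormField(String name, String type) {}

    public static void main(String[] args) {
        // Hypothetical fields on an "expense claim" form.
        FormField[] form = {
                new FormField("claimant", "Text"),
                new FormField("amount", "Currency"),
                new FormField("submittedOn", "Date")};

        // Behind the scenes, each form field becomes a typed property
        // on the case's data model.
        Map<String, String> dataModel = new LinkedHashMap<>();
        for (FormField f : form) {
            dataModel.put(f.name(), f.type());
        }
        System.out.println(dataModel); // {claimant=Text, amount=Currency, submittedOn=Date}
    }
}
```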

In part, this looks a lot like their Directly Capture Objectives (DCO) tools, but with more support for creating an actual executable app rather than just input to the more technical Designer Studio development environment. We also saw customizing the dashboard, which was a pretty standard portal configuration.

As with any good Pega demo, however, Kenney went off-screen to “create a little data table” while Grady showed us the graphical form/case builder; they are definitely the masters of “pay no attention to the man behind the curtain” during demos, where one person does the user-friendly stuff on-screen while a couple of others do a bit of heavy lifting in the background. Luckily for us (unluckily for Kenney), he couldn’t connect to the wifi, so we did get to see the data table definition, which was straightforward.

This does look like a pretty usable low-code application development environment. Like any other low-code model-driven development tool, however, it’s not really for complete non-techies: you need to understand data types, how to design a form, the concept of linking case types and separately-defined data types, and how to decompose a case into phases and steps. It wasn’t clear from the brief demo how this would interact with any sort of expected case automation or other parts of the underlying Pega infrastructure: predictions, automated steps/service calls, more complex process flow or temporal dependencies, access control, etc. It’s also unclear whether there is any sort of migration path from Express to the full Designer Studio, which would allow Express to be used as an operational prototyping tool for more complex development. They did respond to a question about reporting: there is some out of the box, and they will be adding more, as well as ad hoc reporting.

Pega 7 Express was announced today, with the cloud version available immediately on a 30-day free trial followed by subscription pricing; when Pega 7.1.9 rolls out to on-premise installations, it will also include Express. They’re not really pushing it yet, but will start to roll out the marketing around it in Q3.

PegaWORLD 2015 Keynote: CRM Evolved and Pega 7 Express

Orlando in June? Check. Overloaded wifi? Check. Loud live band at 8am? Check. I must be at PegaWORLD 2015!

Alan Trefler kicked off the first day (after the band) by looking at the new world of customer engagement, and how both organizations and the supporting technology need to change to support it. He took a direct shot at the silos of old-school companies such as traditional financial services (“What *is* a middle office, anyway?”, a question that I’ve often asked), and at how many applications and platforms fail to move them beyond that model: conforming (to how an application works out of the box) versus strategic (mixing your own DNA into the software). Like many other vendors in this space who are repositioning as process-centric application development platforms, the term BPM (business process management) didn’t come up; Pega is repositioning as “CRM Evolved”. To be fair, Pega has always had a strong CRM (customer relationship management) bias, but it looks like they’re rebranding the entire business of their customers as CRM, from sales and onboarding through support and back into operations. This includes anticipating and operationalizing customer actions, so that you can respond to a potential problem before it ever occurs, and moving from conforming to strategic software in order to evolve quickly to meet those needs. He warned against implementing the Frankenstack, pieced together from “dead software products”, and decried the term BPM in favor of case management as the way that customer engagement and operations need to work, although arguably there is a lot of what we think of as traditional BPM implemented as part of Pega’s customers’ solutions.

We’re definitely seeing the BPM market (broadly defined to include dynamic and ad hoc process management, including case management) bifurcating into application development platforms such as Pega, and more out-of-the-box, low-code process platforms. BPM is really about much more than just process management, of course: many of these platforms include mobile, social, IoT, analytics, big data and all of the other popular features that are being built into almost all enterprise applications. Trefler talked about Pega 7 Express – I’ll be going to a session on that after the keynote – which is a simpler user experience for application development. Having seen their more complex user experience in a few client projects, this is definitely needed to cut through the complexity and address end-user computing/citizen computing needs. In other words, although they are primarily in the heavy-duty application development space, they realize that they can’t ignore the “low end” of the market if they want to achieve greater awareness and penetration in their customer environments beyond the IT development group.

Trefler also talked about Pega’s vertical industry applications, and we heard from Dr. Mark Boxer of Cigna Healthcare. He discussed how they use Pega’s Smart Claims App, although we mostly saw futuristic videos of what healthcare could be like, including big data and gamification. Plus Apollo 13. It’s not clear how much of this Cigna has implemented (presumably they are not working on the moon shot), although I know that some US healthcare companies are reducing premiums for customers who use wearables to monitor their health, since it allows for early problem detection.

Don Schuerman, Pega’s CTO and VP of Product Marketing, took the stage to talk about their technology, with a big focus on strategic applications rather than the platform itself – Trefler did make a comment earlier about how their marketing used to be really bad, and I think that someone told them that applications show better than platforms – plus their cloud infrastructure. He was joined by Jim Smith, CIO of the State of Maine, who was not afraid to talk about BPM: he sees BPM plus agile plus legacy system modernization as the cornerstones of their enterprise strategy, underpinned by a cloud platform for speed and security. He showed some pictures of their filing cabinets, pending files in paper folders and other paper-based inefficiencies; it’s interesting to see how much of their digital transformation – and that of many other organizations that I work with – still relies on getting paper into digital form, either natively (i.e., online forms replacing paper ones) or through image and data capture.

Brian Matsubara, head of Global Technology Alliances at Amazon, talked briefly about the Amazon Web Services offerings, and about their partnership with Pega to create the Pega Cloud on which Pega 7 Express and other products are hosted. I don’t need to be sold on cloud in general or AWS in particular, since I trust critical business data to AWS, but there are still a lot of skittish organizations who think that their own data centers are better, faster, cheaper and more secure than AWS. (Hint: they’re not.) I just finished the materials for a workshop that I’m giving in London next week on the Future of Work, and I agree with what Matsubara said about (public) cloud: it’s not just cheaper infrastructure, it enables ways of doing business that just weren’t possible before, especially consumer mobile and external collaboration applications. Schuerman stressed at the end that they need to help their customers make cloud strategic.

The keynote finished with Kerim Akgonul, SVP of Products, who discussed changing customer attitudes: customers now expect more, and will quickly make their displeasure public when the experience is less than awesome. He talked about Pega’s suite of applications – Marketing, Sales Automation, Customer Service, and Operations – and how decision-based Next Best Action predictions and recommendations are an underlying feature that drives all of them. The Pega Marketing application brings tools to help improve customer engagement, including next best action and 1:1 targeted marketing. Their Sales Automation application offers guided selling through the end-to-end sales process. Their Customer Service application uses case management paradigms and next best actions for guided customer conversations, while interacting with social media and other channels. Akgonul is always willing to participate in the on-stage hijinks: last year, it was a wild motorcycle ride; this year, it was a wellness app on an Apple Watch and iPhone that tied in with a customer service agent’s screen, with some assistance from his colleagues David Wells and Don Schuerman. Fun, and it drove home the point about how these technologies can be used to improve customer engagement: mobile, omni-channel, next best action, gamification and more. He wrapped up with a more serious, if somewhat breathless, look at some of the newer features, including offline mobile apps that can synchronize data later, pattern detection in real-time streaming data such as dropped calls, dashboard personalization, and the new Pega 7 Express lightweight application builder.

Consolidated Inbox in SAP Fiori at SapphireNow 2015

I had a chance to talk with Benny Notheis at lunchtime today about the SAP Operational Intelligence product directions, and followed on to his session on a consolidated inbox that uses SAP’s Fiori user experience platform to provide access to SAP’s Business Suite workflow, BPM and Operational Process Intelligence work items, as well as work items from non-SAP workflow systems. SAP has offered a few different consolidated inboxes over the years — some prettier than others — but they all serve the same purpose: to make things easier for users by providing a single point of contact for all work items, and easier for IT by reducing maintenance and support. In the case of the Fiori My Inbox, it also provides a responsive interface across mobile and desktop devices. Just as the underlying database and transaction platform for SAP is converging on HANA, all user experience for applications and analytics is moving to Fiori. Fiori (and therefore the consolidated My Inbox) is not yet available on the cloud platform, but that’s in the works.

As a consolidated work list manager, My Inbox supports multiple device types including mobile, manages work items from multiple systems in a single list, and is fully integrated into the Fiori launchpad. It has some nice features such as mass approvals, full-text searching, sorting and filtering, and sharing tasks via email and SAP JAM; work items can have attachments, comments and custom attributes that are exposed in the work list UI or by launching the UI specific to the work item.

We saw a demo of My Inbox, with a user-configurable view that allows workers to create filtered lists within their inbox for specific task types or source systems, in order to organize their work in the way that they want to view it. Work items can be viewed and managed in the work list view within Fiori, or the work item can be launched for full interaction using its native UI. Tasks can be forwarded to other users or suspended, in addition to task type-specific actions such as approve and reject. Attachments can be added and viewed directly from the work list view, as can direct links into other systems. The history for a work item is maintained directly in My Inbox for viewing by the user, although the underlying workflow systems are likely also maintaining their own separate history logs; this provides a more collaborative history by allowing users to add comments that become part of the My Inbox history. Emailing a task to a user sends a direct link to the task but does not interrogate or allocate access rights; I assume that this could mean that a task could be sent to someone who does not have rights to open or edit it, and the original sender would not be informed. Within any list view, a multi-select function can be used to select multiple items for approval; these all have to be approval-type items rather than notifications, so this might be most useful in a list view that is filtered for a single task type. There is no view of tasks that a user has delegated or completed — a sort of Sent Items box — so a user can’t monitor the progress of something that they forward to someone else. Substitutions for out-of-office times are set in My Inbox, meaning that the user does not need to visit each of the underlying systems of record to set up substitution rules; these rules can be applied based on task groups, which are established by how task profiles are set up during the initial technical configuration.

A good demonstration of the new generation of SAP user experience, and how Fiori can be used in a production transaction-oriented environment. There obviously needs to be a fair amount of cooperation between the Fiori-based My Inbox and the systems of record that contribute work items: My Inbox needs to be able to interrogate quite a bit of data from each work item, send actions, and manage user substitution rules via a common task consumption model that interacts with gateways to each type of underlying system. There is likely still quite a bit of work to do in those integration points to make this a fully-functional universal inbox, especially for systems of record that are more reluctant to yield their secrets to other systems; SAP has published specifications for building task gateways that could then be plugged into this model, which would expose work items from any system in My Inbox via a compatible gateway.
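
As a rough illustration of what such a task gateway contract needs to cover (the names and signatures here are my own sketch, not SAP’s published specification):

```java
import java.util.List;

// A work item as surfaced in the consolidated list; the fields are illustrative.
record WorkItem(String id, String sourceSystem, String taskType,
                String subject, boolean approvable) {}

// Each system of record implements this contract so that My Inbox can consume
// its work items through a common task consumption model.
interface TaskGateway {
    List<WorkItem> fetchWorkItems(String userId);       // populate the list view
    void completeTask(String taskId, String decision);  // e.g., approve or reject
    void forwardTask(String taskId, String toUserId);   // note: no access-rights check in the demo
    void setSubstitution(String userId, String substituteId,
                         String taskGroup);             // out-of-office rules per task group
}
```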

(Image from the SDN specification mentioned above)

The next good trick will be to have a consolidated history log, combining the logs from My Inbox with those in the systems of record to build a more complete history of a work item for reporting and decisioning.

London Calling To The Faraway Towns…For EACBPM

I missed the IRM Business Process Management Europe conference in London last June, but will be there this year from June 15-18 with a workshop, a breakout session and a panel session. It’s co-located with the Enterprise Architecture Europe conference, and you can attend sessions from either conference.

There are five conference tracks and 40 case studies over three days of the conference, plus a day of pre-conference workshops. Here’s what I’m presenting:

  • On the morning of June 15, I’ll present a half-day workshop/tutorial on The Future of Work, looking at how work is changing in the face of changing technology and culture, and how to adapt your organization for this brave new world.
  • On the morning of June 17, I’ll give a breakout session that excerpts some of the material from the workshop on Changing Incentives for Knowledge Workers.
  • Also on the morning of June 17, I’ll be on a panel of “BPM Gurus” with Roger Burlton, Ron Ross and Howard Smith, moderated by Chris Potts, discussing ten years of BPM.

IRM runs a good conference with a lot of great content; hope to see you there. If you plan to attend, I have a 10% discount code that I can provide to colleagues: send me a note or add a comment here and I’ll send it to you.

bpmNEXT 2015 Day 3 Demos: Camunda, Fujitsu and Best In Show

Last demo block of the conference, and we’re focused on case management and unstructured processes.

Camunda, CMMN and BPMN Combined

Jakob Freund presented on OMG’s (relatively) new standard for case management modeling, CMMN, and how Camunda combines it with BPMN to create processes that have a mix of pre-defined flows and case structures. They use the Trisotech CMMN modeler embedded in their environment, running both the CMMN and BPMN models on the same engine; they are looking at adding DMN for decision modeling as well. He demonstrated an insurance application example where BPMN is used to model the overall process, with the underwriting subprocess actually being a CMMN model within the BPMN model. The user task list can show a consolidated view of both BPMN tasks and CMMN tasks, or a dedicated UI can be used for a case, since it can also show enabled activities that are not yet instantiated (and hence would not appear in a task list) as available user actions. BPMN processes can also be triggered from the CMMN model, providing pre-defined process fragments that the case worker can trigger to perform standard operations. He also showed their developer workbench, including a full-featured debugger with stepwise execution and the ability to execute code at any step. Since their paradigm is to provide process management services to a developer writing in Java, their tooling is more technical than what is found in a no-code or low-code environment. Also, a BPMN font.
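
Since Camunda’s paradigm is the engine as a library for Java developers, the combination shown in the demo maps onto a few engine API calls. Here’s a minimal sketch, assuming a deployed BPMN model (the key “insuranceApplication” is hypothetical) that embeds the CMMN underwriting case; the API calls themselves are standard Camunda 7:

```java
import java.util.List;
import org.camunda.bpm.engine.ProcessEngine;
import org.camunda.bpm.engine.ProcessEngines;
import org.camunda.bpm.engine.task.Task;

public class ClaimDemo {
    public static void main(String[] args) {
        ProcessEngine engine = ProcessEngines.getDefaultProcessEngine();

        // Start the BPMN process; the CMMN underwriting case is
        // instantiated by the model itself, on the same engine.
        engine.getRuntimeService().startProcessInstanceByKey("insuranceApplication");

        // One consolidated task list: user tasks from both the BPMN
        // process and the CMMN case appear here.
        List<Task> tasks = engine.getTaskService().createTaskQuery().list();
        for (Task t : tasks) {
            System.out.println(t.getName());
        }
    }
}
```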

Fujitsu: Using Agents to Coordinate Patient Care across Independent Specialists

Keith Swenson finished the demos by presenting healthcare research from the University of Granada, which helps to create patient treatment plans based on rules and iterative goal-seeking rather than pre-defined processes. This allows different medical specialists to have their own sets of rules and best practices for dealing with their own specialization; automated agents can combine and negotiate the rules from multiple specialists to create a consolidated treatment plan for patients with multiple conditions, allowing each of the participants to monitor progress. He demonstrated a prototype/sample application that allows each specialist to set out a schedule of actions that make up a treatment plan; the multiple treatment plans are reconciled against each other — basically, modifying a plan by adding steps from another plan — and presented back to the referring physician, who can then select one of the plan processes for execution. He used the IActive Knowledge Studio to show how the plans and rules are designed, and discussed how the processes for the interacting agents would be emergent as they communicate and negotiate.

That’s it for bpmNEXT for me. Great conference, as always. As a matter of disclosure, I was not charged the conference fee to attend, although I paid my own travel and living expenses. A number of the vendors that I have written about here over the past three days are my clients or have been so in the past, but that did not allow them to escape the snarky Twitter comments.

Update: waiting to take off at Santa Barbara airport, and I see from the Twitter stream that SAP won the Best In Show award for their Internet of Everything demo – congratulations! Top five presentations: W4, Camunda, Trisotech, Bonitasoft and BP-3. Kudos all around.