bpmNEXT 2014 Wednesday Morning: Cloud, Synthetic APIs and Models

I’m not going to say anything about last night, but it’s a bit of a subdued crowd here this morning at bpmNEXT. 🙂

We started the day with Tom Baeyens of Effektif talking about cloud workflow simplified. I reviewed Effektif in January at the time of launch and liked the simple and accessible capabilities that it offers; Tom’s premise is that BPM is just as useful as email, and needs to be just as simple to use as email, so that we are not reliant on a handful of power users inside an organization to make processes work. To do this, we need to strip out features rather than add them, and reduce the barriers to trying it out by offering it in the cloud. Inspired by Trello (simple task management) and IFTTT (simple cloud integration, which basically boils every process down to a trigger and an action), Effektif brings personal DIY workflow to the enterprise while also providing a bridge to enterprise process management through layers of functionality: individual users can get started building their own simple workflows to automate their day-to-day tasks, then more technical resources can add functionality to turn these into fully-integrated business processes.

Tom gave a demo of Effektif, starting with creating a checklist of items to be completed, with the ability to add comments, include participants and add attachments to the case. There have been a few changes since my review: you can use Markdown to format comments (I think that understanding of Markdown is very uncommon in business, so this may not be as well-adopted as, for example, a TinyMCE formatted text field would be); cases can now be started by a form as well as manually or via email; and Google Drive support is emerging to support usage patterns such as storing an email attachment when the email is used to instantiate the case. He also talked about some roadmap items, such as migrating case instances from one version of a process definition to another.
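As an aside, the trigger-plus-action reduction that IFTTT popularized is simple enough to sketch in a few lines of Python. Everything below (the Workflow class, the event fields) is my own illustration of the concept, not Effektif’s API:

```python
# Minimal sketch of the IFTTT-style "trigger + action" reduction: every
# workflow is a trigger condition paired with an ordered list of actions.
# Names and event shapes here are illustrative only.

from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Workflow:
    trigger: Callable[[dict], bool]              # fires when an event matches
    actions: list = field(default_factory=list)  # run in order when triggered

    def handle(self, event: dict) -> list:
        """Return the results of each action, or [] if the trigger doesn't fire."""
        if not self.trigger(event):
            return []
        return [action(event) for action in self.actions]

# Example: start a checklist case whenever an email with an attachment arrives.
start_case = Workflow(
    trigger=lambda e: e["type"] == "email" and bool(e.get("attachments")),
    actions=[lambda e: f"case created from {e['subject']}",
             lambda e: f"stored {len(e['attachments'])} attachment(s)"],
)

print(start_case.handle({"type": "email", "subject": "Loan app",
                         "attachments": ["form.pdf"]}))
```

The "layers of functionality" idea fits naturally here: a power user could later swap the simple lambdas for integration calls without changing the trigger.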

Next up was Stefan Andreasen of Kapow (now part of Kofax) on automation of manual processes with synthetic APIs – I’m happy for the opportunity to see this because I missed seeing anything about Kapow during my too-short trip to the Kofax Transform conference a couple of weeks ago. He walked through a scenario of a Ferrari dealership that looks up SEC filings to see who sold their stock options lately (hence has some ready cash to spend on a Ferrari), narrows that down with Bloomberg data on age, salary and region to find some pre-qualified sales leads, then loads them into Salesforce. Manually, this would be an overwhelming task, but Kapow can create synthetic APIs on top of each of these sites/apps to allow for data extraction and manipulation, then run those on a pre-set schedule. He started with a “Kapplet” (applications for business analysts) that extracts the SEC filing data, allows easy manual filtering by criteria such as filing amount and age, then lets the user select records to commit to Salesforce. The idea is that there are data sources out there that people don’t think of as data sources, and many web applications that don’t easily integrate with each other, so people end up manually copying and pasting (or re-keying) information from one screen to another; Kapow provides the modern-day equivalent of screen-scraping that taps into the presentation logic and data (not the physical layout or style, hence less likely to break when the website changes) of any web app to add an API using a visual flow/rules editor. Building by example, elements on a web page are visually tagged as being list items (requiring a loop), data elements to extract, and much more.
It can automate a number of other things as well: Stefan showed how a local directory of cryptically-named files can be renamed to their actual titles based on a table of contents HTML document; this is very common for conference proceedings, and I have hundreds of file sets like this that I would love to rename. The synthetic APIs are exposed as REST services, and can be bundled into Kapplets so that the functionality is exposed through an application that is usable by non-technical users. Just as Tom Baeyens talked about lowering the barriers for BPM inside enterprises in the previous demo, Kapow is lowering the bar for application integration to serve unmet needs.
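For the curious, the renaming trick can be approximated with a short script: parse the table-of-contents HTML, map each cryptic file name to its link text, and rename accordingly. This is my own rough sketch of the idea, not Kapow’s implementation:

```python
# Map cryptic file names to human-readable titles using a TOC HTML document.
# rename_plan() returns the {old: new} mapping; applying it with os.rename
# is left to the caller.

import os
import re
from html.parser import HTMLParser

class TocParser(HTMLParser):
    """Collect {file name: link text} pairs from <a href="..."> elements."""
    def __init__(self):
        super().__init__()
        self.links, self._href, self._text = {}, None, []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href:
            self.links[os.path.basename(self._href)] = "".join(self._text).strip()
            self._href = None

def rename_plan(toc_html: str, filenames):
    """Return {old_name: new_name}, keeping extensions and sanitizing titles."""
    parser = TocParser()
    parser.feed(toc_html)
    plan = {}
    for name in filenames:
        title = parser.links.get(name)
        if title:
            safe = re.sub(r'[\\/:*?"<>|]', "_", title)  # strip unsafe characters
            plan[name] = safe + os.path.splitext(name)[1]
    return plan

toc = '<ul><li><a href="s01.pdf">Cloud Workflow Simplified</a></li></ul>'
print(rename_plan(toc, ["s01.pdf", "readme.txt"]))
```

Files not mentioned in the TOC (like `readme.txt` above) are simply left out of the plan rather than renamed blindly.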

It would be great if Tom and Stefan put their heads together at lunch and whipped up an Effektif-Kapow demo; it seems like a natural fit.

Next was Scott Menter of BP Logix on a successor to flowcharts, namely their Process Director GANTT chart-style process interface – he said that he felt like he was talking about German Shepherds to a conference of cat-lovers – as a different way to represent processes that is less complex to build and modify than a flow diagram, and that also provides better information on the temporal aspects and dependencies, such as when a process will complete and the impacts of delays. Rather than a “successor” model such as a flow chart, which models what comes after what, a GANTT chart is a “predecessor” model, which models the preconditions for each task: a subtle but important difference when the temporal dependencies are critical. Although you could map between the two model types on some level, BP Logix has a completely different model designer and execution engine, optimized for a process timeline. One cool thing about it is that it incorporates past experience: the time required to do a task in the past is overlaid on the process timeline, and predictions are made for how well this process is doing based on current instance performance and past performance, including for tasks that are far in the future. In other words, predictive analytics are baked right into each process since it is a temporal model, not an add-on such as you would have in a process based on a flow model.
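To make the predecessor idea concrete, here’s a small sketch (entirely my own illustration, not BP Logix’s engine) that projects a finish time for every task from its duration and its list of preconditions:

```python
# A "predecessor" model as a dependency graph: each task lists the tasks that
# must finish before it can start. Given durations (e.g. from historical
# averages), we can project a finish day for every task, including ones far
# in the future.

def projected_finish(tasks):
    """tasks: {name: (duration_days, [predecessor names])} -> {name: finish day}."""
    finish = {}

    def resolve(name):
        if name in finish:
            return finish[name]
        duration, preds = tasks[name]
        start = max((resolve(p) for p in preds), default=0)  # wait for all preds
        finish[name] = start + duration
        return finish[name]

    for name in tasks:
        resolve(name)
    return finish

plan = {
    "intake":   (2, []),
    "review":   (3, ["intake"]),
    "appraise": (5, ["intake"]),            # runs in parallel with review
    "approve":  (1, ["review", "appraise"]),
}
print(projected_finish(plan))
```

A delay in `appraise` automatically pushes out the projected finish of `approve`, which is exactly the kind of impact-of-delay question that a successor-style flow chart doesn’t answer directly.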

For the last demo of this session, Jean-Loup Comeliau of W4 presented their BPMN+ product, which provides model-driven development using BPMN 2, UML 2, CMIS and other standards to generate web-based process applications without generating code: the engine interprets and executes the models directly. The BPMN modeling is pretty standard compared to other process modeling tools, but they also allow UML modeling of the data objects within the process model; I see this in more complete stack tools such as TIBCO’s, but it is less common from the smaller BPM vendors. Resources can be assigned to user tasks using various rules, and user interface forms are generated based on the activities and data models, and can be modified if required. The entire application is deployed as a web application. The data-centricity is key: if the models change, the interface and application will automatically update to match. There is definitely a strong message here on the role of standards, and how we need more than just BPMN if we’re going to have fully model-driven application development.

We’re taking a break, and will be back for the Model Interchange Working Group demonstration with participants from around the world.

bpmNEXT 2014: Work Management And Smart Processes

Bruce Silver always makes me break the rules, and tonight I’m breaking the “everything is off the record after the bar opens” rule since he scheduled sessions after dinner and with an open bar in the back of the room. Rules, as they say, are made to be broken.

Roger King of TIBCO attempted to start this demo during the earlier session but there were problems with the fancy projector setup. He’s back now to talk about model-driven work management. TIBCO’s core customer base (like mine) is traditional enterprises such as financial services, and they’re seeing a lot of them retiring legacy enterprise apps in favor of process-centric apps built on platforms such as TIBCO. They see specific problems with work management in very large, branch-network organizations like retail banks; by work management and resource management, they mean the way that work is distributed to and accessed by end users, one of the things that BPMN doesn’t cover when you define processes. With tens of thousands of participants, even a small increment in productivity through better work management can yield a significant ROI in absolute terms, but traditionally this has been done through custom user interfaces and work distribution/matching. There are a number of resource patterns that have been studied and documented, e.g., separation of duties and round robin; Roger demonstrated how these are being incorporated into TIBCO’s AMX BPM (modeled within their Business Studio product) through organizational models, where you can define the resources, groups and custom organizational units that bring your business vocabulary to how work is distributed within your organization. The idea is that once you have this defined, you can then use very fine-grained rules for determining which person gets which piece of work, or who has access to what. This now becomes something that you can attach to an activity in a process model using simple assignments or with a resource query language that assigns work dynamically, including based on process instance variables – essential when you have hundreds or thousands of branches and can’t realistically administer your organizational model and work distribution methods manually.
Furthermore, you need to be looking at having people go to the work rather than having work sent to the people. This is the only feasible work distribution approach when you’re creating declarative processes, where configuration needs to be much more dynamic than what might be drawn in the process model.
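Two of the resource patterns Roger mentioned, round robin and separation of duties, are easy to sketch as simple distribution rules. The code below is my own hypothetical illustration, not TIBCO’s resource query language:

```python
# Round-robin work distribution across a group, with a separation-of-duties
# check so that anyone who already acted on a case can't be assigned its
# next task.

from itertools import cycle

class Distributor:
    def __init__(self, group):
        self.group = group
        self._rotation = cycle(group)

    def next_worker(self, case):
        """Pick the next worker in rotation who hasn't already acted on this case."""
        prior_actors = set(case.get("history", []))
        for _ in range(len(self.group)):
            worker = next(self._rotation)
            if worker not in prior_actors:   # separation of duties
                return worker
        raise RuntimeError("no eligible worker in group")

branch_team = Distributor(["ana", "ben", "carla"])
case = {"id": 17, "history": ["ana"]}   # ana submitted the request
print(branch_team.next_worker(case))    # ana is skipped, so ben gets the work
```

The point of a resource query language is that the group itself (here a hard-coded list) would instead be resolved dynamically from the organizational model, possibly parameterized by process instance variables such as the branch.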

We finished off the short opening day of bpmNEXT with a keynote by Jim Sinur, late of Gartner (but not hesitant to use the materials that he helped to create there) and now an independent analyst, on how his processes are smarter than him. Processes based on machine learning, however, can only go so far: although machines are more accurate and consistent (and never complain when you ask them to work overtime), people are better at handling unexpected situations. The key is to have computers and people work together within intelligent processes: let the computers work on the parts that they do best, including events, analytics, standardized decisions, pre-defined processes and the resulting actions from combining all of these, and exploit emerging technologies such as cognitive systems, what-if scenarios via simulation, intelligent business operations, visualization and social analytics. Intelligent agents are a big part of this, but we need to have goal-directed processes to really make this work, or abandon the concept of processes altogether except for the footprints that they leave behind.

Rule-breaking done. Back tomorrow for a full day of bpmNEXT 2014.

bpmNEXT 2014 Tuesday Session: It’s All About Mobile

I’ll blog this year’s bpmNEXT demos the same way as last year’s, with each session of multiple demos in a single post. The posts are a bit long, but the sessions are usually grouped into themes so it works better that way.

First up was Brian Reale of Colosa (makers of ProcessMaker open source BPM and ProcessMapper) on self-organizing groups, ad hoc work and expectations of simplicity. This is a topic that I’m really interested in, since I’ve been presenting on worker incentives in collaborative work, which includes some of the same issues as self-organization. One of his key points is about the effort required to start using a typical BPMS, and how that differs between design time (where there is typically a large amount of effort required and very little organic adoption) and runtime (where there is much less effort, and which is the main target of ROI). What they are trying to do is increase adoption by reducing the effort required at design time by providing more ad hoc capabilities, with a resultant lower ROI but also lower cost. The result is FormSlider, an app environment for ad hoc workflow of structured data with minimal setup, which is what Brian demonstrated (still in alpha). He demoed the tablet interface for a loan application that allows for mobile capture of a client requesting a loan, including pictures and signatures, which then interfaces with ProcessMaker or other back-ends. More interestingly, he showed how an easily-set-up app can be used for mobile data capture that the user can then route to whomever they want (possibly limited to a selection list), with a few other fields such as due date and priority. There’s some informational context, such as seeing how long it is taking each of the possible participants to process cases, and it also allows for routing to be round-trip or one-way. The standard user interface is pretty simple: My Cases for things that I’m working on, an Inbox for new things, and a simple forms interface for working on items. There’s an historical view of cases, showing the participants and their responses.
He demoed a simple flow going through a round-trip from the initiator through two people and back to the initiator; this can be used for adding a collaborative workflow on top of existing pre-defined processes and systems, taking the place of emailing around for approvals and other simple collaboration. He finished up the demo in ProcessMaker showing us how an app and forms are created and deployed in a few minutes, including how potential users and groups are associated with the forms as they are designed. They have email and forum connectors for ProcessMaker and will be using the same methods with FormSlider for providing people with ways to be notified about work but also to interact with it directly.

Next up was Romeo Elias of Interneer on extending enterprise software using mobile apps built with BPM, addressing the issue that many companies face: they don’t have skilled mobile app developers, but there are no commercial apps available for their needs. Their Intellect BPMS has mobile app capabilities, and allows custom mobile apps to be built quickly that can connect directly to the back-end processes. Since BPMSs are often used as full application development platforms, this is not that much of a stretch: the BPM platform already has a lot of the integration and other capabilities, and Interneer’s platform is intended to be used mostly in a drag-and-drop model-driven development environment. Romeo demonstrated creating a new application template that consisted of laying out a UI form for the mobile app using the full web interface (there could also have been a process attached, but the point of his demo was to show the mobile UI), then using it as an app on a tablet interface. The design interface on the web provides the ability to specify sidebar content as well as multiple pages (shown as tabs in the designer). The resultant app – immediately available as soon as it is created in the designer – is a native mobile app, not viewed through a mobile browser, so it can take advantage of device-specific features as well as cache data offline. The app was a mobile data capture/reporting application that connected to a database; he demonstrated adding records to the table that include text (free text and restricted using a selection list) and a photo field, with any new records stored locally if connectivity is lost.

Scott Francis and Greg Harley of BP3 presented on bringing process to the people using their Brazos mobile BPM responsive UI toolkit; at the time of last year’s bpmNEXT, they were focused on hybrid mobile apps, but now are directed towards responsive UI, that is, applications that run in a browser but behave appropriately regardless of the form factor of the device. Native apps can cause a lot of problems because of the lack of mobile development and deployment skills within enterprises, but also because of the hurdles that many companies have to go through to deploy a mobile app that connects to their enterprise apps. Conversely, many enterprise applications already have web interfaces, so adding a new web UI that happens to be responsive, and hence appropriate for mobile devices, may have a much shorter adoption path and require less effort, since there’s a single application to design and deploy for any platform: no specialized mobile browser apps versus desktop browser apps. Plus, they’re giving it away for free, with plans to open source it in the future. Greg demoed a UI for an IBM BPM process in the full desktop browser version, then the same form on a phone (simulator). The same features in the full form are available in the mobile version, just resized and reformatted for the smaller screen in either orientation. He showed a bit of the form designer; I had the sense that this would take a bit more effort than what we saw in the previous two demos, but would offer quite a bit more capability. They support IBM BPM and Activiti (the two platforms that BP3 supports in its consulting practice) and can be made to work with pretty much any BPMS that has a REST API, since those APIs turn out to be surprisingly similar between different BPMS vendors. If you want to try out the Brazos UI toolkit, they have a sandbox where you can run it against an Activiti instance.
This is quite the opposite technology strategy from Interneer’s: I can understand BP3’s motivation for going with responsive UI, as well as the rapid uptake, but I can also understand the challenges of a browser-based app when you have spotty connectivity (as I often do when I’m travelling), and they admittedly give up some of the device-specific capabilities.
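The point about BPMS REST APIs being surprisingly similar can be illustrated with a tiny normalization layer that maps two engines’ task payloads into one shape for a single UI to render. The field names below are invented for illustration and should be checked against each vendor’s actual REST documentation:

```python
# Normalize task-list JSON from two hypothetical BPMS engines into the one
# common shape that a single responsive UI would render.

def normalize_task(payload: dict) -> dict:
    """Map either engine's task JSON to the UI's common shape."""
    if "taskId" in payload:                       # hypothetical engine A shape
        return {"id": payload["taskId"],
                "title": payload["displayName"],
                "due": payload.get("dueDate")}
    return {"id": payload["id"],                  # hypothetical engine B shape
            "title": payload["name"],
            "due": payload.get("due")}

a = {"taskId": "41", "displayName": "Approve loan", "dueDate": "2014-04-01"}
b = {"id": "41", "name": "Approve loan", "due": "2014-04-01"}
print(normalize_task(a) == normalize_task(b))   # one UI, two back-ends
```

In practice the differences are mostly naming and pagination conventions, which is why a thin adapter per vendor is usually enough.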

We’re heading off to dinner, then back with a last demo (which was aborted from this session due to projector difficulties) and a keynote by Jim Sinur before we get down to the serious business of the evening drinks reception.

bpmNEXT 2014 Begins!

We’re at the lovely oceanside Asilomar conference grounds a couple of hours drive south of San Francisco for this year’s bpmNEXT conference. Last year’s inaugural conference was a great experience – I wrote 7,000+ words in two days, if that’s any indication – and this year’s lineup looks like a winner.

This conference is about what’s happening next in BPM (as you might guess by the name): no sales pitches or death by PowerPoint, but a look at the technology directions as seen through demos. It’s also a great opportunity for networking, with a lot of the well-known names in BPM here in person meeting each other face-to-face for a change.

Bruce Silver and Nathaniel Palmer, our hosts and organizers, kicked off the conference and laid out the rules: each session (except for the keynote and a multi-company interoperability demo) is strictly 30 minutes long, with 20 minutes for the demo and 10 for Q&A. Last year, Nathaniel would start to look a bit threatening when the speaker reached their deadline, and everything ran on time.

We have sessions this afternoon and into the evening focused on mobile apps and interfaces, then all day tomorrow and until early afternoon on Thursday on a variety of other BPM topics, so get ready for the firehose.

My Spring 2014 BPM Conference Schedule

Last night, a friend asked me about where I’m travelling next, and when I responded “Newark, Philadelphia, San Diego, Orlando and San Francisco”, she assumed that was everything up to the end of May. Alas, that only gets me to the end of March. Here are the conferences that I’ll be attending or presenting at over the next couple of months:

  • Kofax Transform, San Diego, March 9-11: I am making a joint presentation with Craig LeClair of Forrester on Planning, Designing and Implementing a Smart Process Application. I was also asked to judge their customer and partner awards, although I won’t be sticking around for the awards ceremony.
  • DST AWD Advance, Orlando, March 17-19: I’m presenting The Technical Side of Process Excellence, particularly around the use of configurable process-based applications for quick solution delivery.
  • bpmNEXT, Monterey, March 25-27: I’m attending and blogging, as I mentioned in yesterday’s post. I’ll be in San Francisco for the beginning of that week, and possibly stopping in the South Bay area at the end of the week to visit the Computer History Museum.
  • IBM Impact, Las Vegas, April 27-30: I’m attending the analyst event at Impact and as much of the show as I can cram into the short time, because after almost a month without conferences, I’ll be doing two in one week.
  • Appian World, DC, April 30-May 2: I’m attending after a year away (recently, this always conflicts with IBM Impact).
  • BPM Portugal, Lisbon, May 8: I’m presenting on incentives for social enterprise, including social BPM. This will be an updated version of the presentation that I gave at the APQC conference last fall, and if you have any case studies to contribute to this, I would love to hear about them.
  • PegaWorld, DC, June 8-10: Again, one that I’ve missed a few times since it was conflicting with the IRM BPM conference in London, but this year they are a week apart and I’ll be there.
  • BPM Europe, London, June 16-18: I haven’t yet been added to the agenda for IRM’s annual BPM conference, but I’ve been there the past several years so it’s likely that I’ll be there again.

Hopefully, that’s it for the next four months, although there are always last-minute changes. Let me know if you’ll be nearby or at any of these and want to meet up. It’s a fair bet that I’ll be blogging from each of these as well.

Countdown To #bpmNEXT 2014

The conference that I was most excited to attend last year was bpmNEXT, conceived and executed by Bruce Silver and Nathaniel Palmer: “it’s like DEMO for BPM” is how Bruce originally described it to me, and that’s how it turned out. I blogged almost 7,000 words about bpmNEXT and the individual sessions in two days (on an Android tablet, no less), which gives you an idea of the value that I got from it; you can read what I wrote or watch the recorded sessions from 2013 to see for yourself. Of course, a lot of the interesting bits weren’t in the sessions, but in the face-to-face interactions with the world’s BPM aficionados, many of whom I hadn’t previously met IRL.

This year’s bpmNEXT is coming up on March 25-27, back at Asilomar – a lovely setting, although a bit of a drive from San Francisco – and you can see the list of scheduled presentations here and register here. You have until February 28 to get the early bird pricing, which includes housing and meals.

To be clear, this is an opportunity for learning, networking and collaborating, not selling to customers. Send your people in charge of strategic product direction and innovation, not your usual conference team. If you’re giving a demo, you have the chance to show off your cool new BPM stuff, whether early-stage demo or released product, and get feedback from your peers. If you’re in the audience, you’ll have your mind expanded and your creativity sparked with the mix of new ideas, and have time to discuss them and make some new business connections.

Disclosure: Bruce and Nathaniel have been kind enough to waive the conference portion of my fee, so that I pay only the housing/meals portion plus my own travel expenses. Note that this is one of the few conferences where I pay my own travel expenses to attend (I would be broke, otherwise), so you can take that as my further endorsement of bpmNEXT.

bpmNEXT Wrapup: The Good, The Bad And The Best In Show

The first bpmNEXT conference has finished, and it was a great experience for me. I’m still on the west coast, enjoying a half-day in San Francisco before flying home, and having a bit of time to reflect on what I liked — and the few things I didn’t like — about this week’s event.

First and foremost, this was primarily a meeting of peers to discuss BPM innovation in an open fashion. It was not targeted at customers, so there was little of the peacock-like preening and posturing that you get when vendors parade in front of them. It was also not attended by the major analyst firms, so ditto. Since the format required that presenters give a demo, there was a much heavier bias towards technical attendees, although many of them were in product management/marketing or company founder roles, so not code monkeys. For me, that’s the perfect group for networking: people who are technical and articulate, but want to talk about more than just the technology. The atmosphere was collegial and friendly, even (for the most part) between competitors, and I had the feeling that many of the presenters just wanted to show off their cool new stuff because they knew that this was the audience that would most appreciate it. I really think that Bruce and Nathaniel achieved their goal of making this “like DEMO for BPM”. For the vendors that didn’t attend, or who attended but didn’t participate because they didn’t want to show all their cool new stuff to their competitors: you are totally missing the point, and you missed a great opportunity. Ideas need a bit of exposure to the light in order to grow properly, and this is the place for that.

Second, cool and awesome demos! The “Best in Show” awards that we all voted on at the end (to Fluxicon, Whitestein and Fujitsu) were well-deserved, although many others were equally deserving. I loved the bleeding edge demos — Gero Decker of Signavio accidentally showed us how to do process modeling with head gestures — and the skunkworks projects that may never see the light of day but represent some different thinking — Keith Swenson of Fujitsu with his Cognoscenti demo really brought advanced case management to life. Anne Rozinat and Christian Gunther were in the first demo slot, which could have been tough, but they set the bar extremely high (they won first prize, after all) and wowed a lot of the North Americans there who had never had the chance to see their Disco process mining product, born out of their work at Eindhoven University. The demos that didn’t work as well were those that spent too much time on slides instead of the demo, those that were customer-facing rather than optimized for peer review, and those that tried to cover too much ground in a 20-minute demo. If I can give a word of advice to those vendors who have given me briefings in the past, treat this like that: no slides, no bullshit, no distractions.

Third, nice location, although there were some minuses as well as pluses here. Asilomar is beautiful, but we had no spare time to do more than take a quick walk to the beach on a break. I don’t propose lengthening the schedule or reducing the number of demos, but rather rearranging the first day: take a nice long break in the middle of the day after lunch for people to explore, then go a bit later or even have some evening demos after dinner. Since pretty much everyone arrived by Tuesday dinner, there could have been demos on that evening, too. It’s a fairly remote location, people mostly didn’t do much after dinner anyway (except, for those of us from time zones further east, go to bed early), and evening demos could have been fun sessions with some libations involved — maybe a BPM buzzword drinking game? The remote location (a 2-1/2 hour drive south of San Francisco) may have deterred some, but it did mean that we mostly hung out with groups of other attendees, making for better networking.

Fourth, this was an unparalleled networking experience. As an extroverted introvert (really, just check my Myers-Briggs), I usually dislike networking events, but this felt less like an uncomfortable meet-and-greet and more like chatting with old friends. Which, for the most part, it was: I knew a lot of people there, both face-to-face and online. I had the chance to meet several people who I knew only online in the past but already felt like I knew, such as Anatoly Belaychuk and Ashish Bhagwat. About 1/3 of the 80 attendees were international (including three Canadians), meaning that there was a significant European contingent here, showing off some of the outstanding BPM innovation that is less often seen in North America, and possibly creating some trans-Atlantic relationships that might bloom into partnerships in the future.

Lastly, a few healthier, lower-carb snacks would not have gone amiss: I think that I ate my own weight in chocolate. Only the health benefits of the red wine offset all that. 🙂

bpmNEXT – Trisotech, EnterpriseWeb, Computas, Fujitsu

Full bpmNEXT program here. The format is 30 minutes per speaker: 20 minutes of demo, 10 minutes of Q&A.

Day 2, third session – last of the conference: dynamic processes and case management

Performing Collections of Activities as Means to Business Ends, Denis Gagne, Trisotech

Recorded demo of gathering requirements in the Discovery Accelerator on Business Process Incubator, gradually structuring the data elements collected into a data model using a pinboard paradigm. Can switch to a text view where a text description is added and key terms extracted for use in the board representation. The result is a structured set of activities that have been identified from requirements sessions and documents. This coordinated collection of activities is used to guide BPMN or CMMN modeling, creating the activities within the model as a starting point for further modeling. Their Visio add-on provides BPMN and CMMN modeling support, including model validation. There are also web-based modelers for BPMN and CMMN that can access and edit models from the same repository as the Visio-based modelers, providing the same user experience on multiple versions of Visio and any browser platform.

Event-Driven Rules-based Business Processes for the Real-Time Enterprise, Dave Duggal, EnterpriseWeb

Automated agents connect people, data and services on the fly (late binding) based on interpretation of models, context and available data. Every executing process may be different, and can be correlated with other instances. Brief demo showing searches and relationships between objects, e.g., between people and projects. Allows for creation of dynamic processes by the user as required.

Malleable Tasks and ACM, Helle Frisak Sem, Computas AS

Demo of the MATS system developed for the Norwegian Food Safety Authority, winner of the 2012 ACM Award for the public sector. Knowledge workers are involved in food safety inspections and audits of farms, fisheries, the food industry and restaurants, governed by thousands of rules from Norway, harmonized across the EU. A case represents an entity subject to inspection — a person or business — with the case folder containing all information and documentation related to that entity, from which a knowledge worker can launch any of a number of tasks to be performed on that entity. The rules provide guidance to the user on which tasks are required, based on the general template for that task type, since the users are food safety subject matter experts but the specific tasks to be applied are often a legal issue and based on the context. The tasks may be executed in any order unless there are specific dependencies. The data in the case folder is central, with the transitory tasks/process fragments acting on that data. Control objects are modeled declaratively, significantly reducing coding. The demo showed system use in response to a telephone call regarding a potential health safety violation: the task template that most closely matches the caller report is selected, and the required steps are added or removed based on the parameters selected. This provides support to ensure that workers are performing legally-required activities, with flexibility for them to control their work order and environment.

Antifragile Systems for Innovation and Learning Organizations, Keith Swenson, Fujitsu America

From the antifragile concepts in Nassim Nicholas Taleb’s latest book: business systems that are highly adaptive due to exposure to variable and adverse conditions can be significantly stronger than those that are protected. Creative, innovative organizations that thrive in an unpredictable world have to rely less on predefined processes, rules and predictions, and more on adapting to the current context and information. Demo of Interstage Cognoscenti (an unreleased Fujitsu research prototype) that allows users to create a case, referred to as a project, that is completely empty. The case owner can add documents, including sending an email to external participants with a URL for uploading documents without an explicit login, and write notes. A project can be used as a template, and its characteristics merged into an existing project. Goals (tasks) can be created and assigned, and turned into subprojects for more complex activities. Other users may have the project in their watch list, and have goals assigned to them. In order to link projects together, a project can generate a streaming link that is linked into another project; the projects can then be set up to synchronize goals and documents between the linked projects. The system is intended for non-technical knowledge workers to create cases on the fly; there is no “design time” environment or other technical requirements.

That’s it for the bpmNEXT sessions — it’s been an awesome conference in terms of content, participants and atmosphere. We’re going to vote on Best in Show and wrap up, and I’ll likely post some final thoughts in a day or two after I’ve had some time to digest everything. Next week, I’ll be at DST’s AWD ADVANCE conference, although my volume of blogging will be lower since I’m giving a presentation there rather than just being an observer.

bpmNEXT – Kofax, Knowledge Partners/Sapiens, Bosch

Full bpmNEXT program here. The format is 30 minutes per speaker: 20 minutes of demo, 10 minutes of Q&A.

Day 2, second session.

Fully Exploiting the Potential of BPM in the Cloud, Carl Hillier, Kofax

A prime motivator for cloud is instant provisioning; demo showed the live provisioning of a Kofax TotalAgility cloud instance (based on Microsoft Azure). Instances must be provisioned by Kofax, not directly by the customer. Each customer gets their own SQL database and data storage, but the (stateless) web application and presentation layers are multi-tenant. After entry of the instance attributes, the instance was generated within 3 minutes and the TotalAgility designer was available for immediate use in the new instance for creating and executing process models. Identical functionality is provided on public cloud, private cloud and on premise. Software updates are applied automatically in the public cloud, but can be controlled by the customer in the private cloud and on premise environments.

The Decision Model, Michael Grohs, Knowledge Partners International

Business-friendly decision models can compress the months that it normally takes for IT to encode business rules into enterprise systems down to a few weeks; the goal is to eventually be able to directly author business policies and the underlying rules in a business-understandable and machine-readable form, allowing for near-instant deployment of new business rules. The Decision Model is a methodology and book, manifested in the DECISION product from Sapiens, for representing business logic as a separate component within an architecture. Any decisions that can be defined declaratively are extracted from the process model and stored in the decision model, which can then be referenced within the process model. The decision model notation is a goal-driven hierarchical representation of rules and rule families, along with the data that is acted upon by those rules. The elements in the decision model are linked to the implementation methods, forming the hand-off point between business and IT. The decision model connects the process model, rule models, use cases and business motivation models. Very brief demo of DECISION showing the graphical representation of a decision model as well as the tabular structure of the rule families associated with the model. When creating a new model, you can import a text document of the policy wording; the system detects synonyms and performs other vocabulary analysis to identify inconsistencies in the policies and assist with creation of the decision model.
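To make the rule-family idea concrete, here’s a minimal sketch of a declarative decision table evaluated against case facts. The function and rule names are mine for illustration, not the DECISION product’s actual API or The Decision Model’s formal notation.

```python
# Hypothetical sketch: a rule family as a declarative decision table.
# Each row pairs a set of conditions with a conclusion; evaluation is
# separate from any process logic, as described above.

def evaluate_rule_family(rows, facts):
    """Return the conclusion of the first row whose conditions all match the facts."""
    for conditions, conclusion in rows:
        if all(facts.get(key) == value for key, value in conditions.items()):
            return conclusion
    return None  # no rule applies

# Illustrative rule family: determine a policy discount from applicant facts.
discount_rules = [
    ({"risk": "low", "loyal_customer": True}, 0.15),
    ({"risk": "low", "loyal_customer": False}, 0.10),
    ({"risk": "high"}, 0.0),
]

print(evaluate_rule_family(discount_rules, {"risk": "low", "loyal_customer": True}))  # -> 0.15
```

Because the table is data rather than code, a business-readable policy document could in principle be translated row by row into such a structure, which is the separation of decision logic from process logic that the talk described.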

BPM for the Internet of Things, Tom Debevoise and Troy Foster, Bosch

In the Internet of Things, there are potentially billions of devices out there generating data and requiring instructions. These are typically organized as massive distributed systems of systems, such as Smart Home or Smart Grid, which organize and control collections of devices. Localized rules can monitor collections of sensors/devices and report up the chain to higher-level controllers when certain events occur, allowing information to be aggregated and actions to be taken, including launching BPM processes. Rules are present at the device level and at the higher collector level, as well as potentially within the BPM process. A demo of their monitoring dashboard (from the inubit acquisition) showed the status of specific machines as well as aggregate statistics, e.g., how many machines are in a critical state requiring maintenance or replacement. The rules engine (from the Visual Rules acquisition) allows rules to be created for different machine types, e.g., to display an alert when a vibration sensor indicates that a machine requires preventive maintenance to avoid failure; this, in turn, could instantiate a BPM process to dispatch a field maintenance worker to the machine.
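The device-rule/collector-rule layering described above can be sketched in a few lines. This is an illustrative toy, not Bosch’s Visual Rules or inubit API; the threshold value, event names and dispatch action are all assumptions.

```python
# Illustrative sketch of layered IoT rules: a device-level rule raises an
# event when a vibration reading is out of range, and a collector-level
# rule aggregates those events and decides on actions (e.g., a hypothetical
# BPM process to dispatch a field maintenance worker).

VIBRATION_LIMIT = 7.0  # assumed threshold, e.g., mm/s RMS

def check_device(device_id, vibration):
    """Device-level rule: report an event when vibration exceeds the limit."""
    if vibration > VIBRATION_LIMIT:
        return {"device": device_id, "event": "preventive_maintenance_required"}
    return None

def collect(readings):
    """Collector-level rule: gather device events and map them to actions."""
    events = [e for e in (check_device(d, v) for d, v in readings) if e]
    actions = [f"dispatch_field_worker({e['device']})" for e in events]
    return events, actions

events, actions = collect([("m1", 3.2), ("m2", 9.5)])
print(actions)  # -> ['dispatch_field_worker(m2)']
```

The point of the layering is that only the aggregated, meaningful events travel up the chain; raw sensor readings stay local to the device-level rules.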

Break for lunch. One more session after lunch, then Best of Show voting and wrap-up.

bpmNEXT – Process Analytica, Whitestein, Oracle, SAP

Full bpmNEXT program here. The format is 30 minutes per speaker: 20 minutes of demo, 10 minutes of Q&A.

Day 2, first session: data-driven processes and analytics

Visual Analytics and Smart Tools, Robert Shapiro, Process Analytica

Optima is their product for performing statistical analysis and optimizing process models, with four quadrants providing visibility into Model, Simulate, Analyze and Optimize toolkits, which act on a shared model. Model provides standard BPMN modeling including setting advanced attributes on elements. Simulation attributes are set in Simulate, then Optimize is used to define the parameters for optimization, e.g., cycle time, number of resources, cost. Analyze shows different representations of simulated runtime information, including histograms and scatterplots to allow identification of outliers, and a Gantt chart for critical path calculation. Can filter the data set based on the analytics, e.g., show only the process instances represented by a cluster on the scatterplot of cycle times. Remote demo via Skype, which made the communication a bit stilted.

Goals in the Process Continuum: from BPM to ACM and Beyond, Dominic Greenwood, Whitestein Technologies

Their target is to allow process automation across the continuum from unstructured to structured processes, for which they propose executable goal-oriented BPMN. They see rules-driven processes as being reactive, and goal-based processes as having the potential to be proactive: assessing what to do next based on current and target future states, and selecting the best actions to achieve a specific goal. They have a taxonomy of process goals: milestone goals (aligning intent with action, representing something to be achieved) and governance goals (obtaining/maintaining a specific target, representing something to be maintained), plus a layered process scoping to provide both tactical (single instance) governance and strategic (multiple/aggregate instance) governance. Demo of an Eclipse-based design environment showing a hierarchy of goals and activities: activities are the leaf nodes for achieving a specific atomic goal, and can be expanded into a full BPMN model. Individual processes are essentially functions that are called to achieve the goals, not the top-level artifact; the goals are constantly evaluated to determine how best to satisfy them, and a goal-seeking controller instantiates and controls the processes as required. A GO-BPMN extension is planned.
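The milestone/governance distinction and the goal-seeking controller loop can be sketched roughly as below. All class and goal names are hypothetical; this is my reading of the taxonomy, not Whitestein’s GO-BPMN implementation.

```python
# Hedged sketch of goal-oriented control: a milestone goal stays satisfied
# once achieved, while a governance goal must be continuously maintained
# and re-triggers a process whenever its condition fails.

class Goal:
    def __init__(self, name, condition, kind):
        self.name, self.condition, self.kind = name, condition, kind  # kind: "milestone" or "governance"
        self.achieved = False

    def needs_action(self, state):
        if self.kind == "milestone" and self.achieved:
            return False  # milestones remain satisfied once reached
        satisfied = self.condition(state)
        if satisfied and self.kind == "milestone":
            self.achieved = True
        return not satisfied

def controller_step(goals, state, launch):
    """One pass of the goal-seeking controller: launch a process per unmet goal."""
    for goal in goals:
        if goal.needs_action(state):
            launch(goal.name)

launched = []
goals = [Goal("order_shipped", lambda s: s["shipped"], "milestone"),
         Goal("stock_above_min", lambda s: s["stock"] >= 10, "governance")]
controller_step(goals, {"shipped": False, "stock": 5}, launched.append)
print(launched)  # -> ['order_shipped', 'stock_above_min']
```

The processes here are just named callbacks, which mirrors the talk’s framing: processes are functions invoked to satisfy goals, not the top-level artifact.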

KPI Risk Assessment, Manoj Das, Oracle

Demonstration of unreleased analytics capabilities for Oracle BPM. BAM Composer is a web tool targeted at business analysts, allowing creation of Dashboard, KPI, Query, Alert, View or Data Object. A dashboard is a standard BAM-style dashboard including process data and external data sources, with graphical visualizations, made up of the other elements. KPIs can be realtime (constantly calculated) or scheduled (periodically calculated), and are defined by the measurement of a specific data object, optionally over a rolling window; a threshold, with high/medium/low deviation ranges and actions to be triggered when thresholds are reached; and additional optional risk indicators such as other data values that may make this KPI particularly critical. Queries are a measurement of a specific data object to show trending changes, also allowing rules and actions to be triggered. Their event processing is focused on meaningful business patterns, including trends, moving averages and missing events.
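The KPI structure described — a measurement over a rolling window, checked against a target with high/medium/low deviation bands — looks roughly like this. The band boundaries and class names are my assumptions, not Oracle BAM’s actual model.

```python
# Minimal sketch of a KPI with a rolling measurement window and
# high/medium/low deviation classification against a target value.

from collections import deque

class KPI:
    def __init__(self, target, window=5):
        self.target = target
        self.values = deque(maxlen=window)  # rolling window of measurements

    def record(self, value):
        self.values.append(value)

    def deviation_band(self):
        """Classify the deviation of the windowed average from the target."""
        avg = sum(self.values) / len(self.values)
        deviation = abs(avg - self.target) / self.target
        if deviation > 0.20:   # assumed band boundaries
            return "high"
        if deviation > 0.10:
            return "medium"
        return "low"

kpi = KPI(target=100, window=3)
for v in (98, 103, 99):
    kpi.record(v)
print(kpi.deviation_band())  # -> "low"
```

In a realtime KPI the band would be re-evaluated on every `record` call and an action triggered on a band change; a scheduled KPI would evaluate it periodically instead.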

Operational Process Intelligence for Real-Time Business Process Visibility, Patrick Schmidt, SAP

Intelligent business operations powered by HANA: in-memory big data analytics for real-time informational context to support decision-making in processes. Demonstration of a business dashboard showing internal data from multiple systems (BPM, Business Suite, databases, etc.), plus Twitter feeds and other external data sources. Interactively drill into or filter by problem areas highlighted by graphical representations to see the underlying processes and data, including drilling into specific process instances. KPIs for processes are updated and displayed in realtime. In the NetWeaver BPM design environment, a phase view (a higher-level sequential process view, similar to a value chain) can be created where process phases are a collection of the activities in an existing process model; the phase view is also available in the runtime business dashboard with the aggregate runtime statistics for the process activities contained in each phase. NW BPM can now use HANA directly as its database, so the history logs don’t need to be exported and imported before the analytics can be generated.
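The phase-view roll-up — phases grouping activities from an existing model, with activity-level runtime statistics aggregated per phase — amounts to a simple grouping, sketched here with invented phase and activity names rather than anything from the NetWeaver BPM API.

```python
# Illustrative sketch of phase-view aggregation: each phase groups
# activities from an existing process model, and activity-level runtime
# counts are rolled up to the phase level for the dashboard.

phases = {
    "Order Intake": ["receive_order", "validate_order"],
    "Fulfilment": ["pick_items", "ship_order"],
}

# per-activity runtime stats, e.g., open instance counts from the history log
activity_counts = {"receive_order": 4, "validate_order": 2, "pick_items": 7, "ship_order": 1}

def phase_totals(phases, counts):
    """Aggregate activity-level counts up to the phase level."""
    return {phase: sum(counts.get(a, 0) for a in acts) for phase, acts in phases.items()}

print(phase_totals(phases, activity_counts))  # -> {'Order Intake': 6, 'Fulfilment': 8}
```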

Mid-morning break, time to check out of my room and get back here for the next session.