Full bpmNEXT program here. The format is 30 minutes per speaker: 20 minutes of demo, 10 minutes of Q&A.
Day 1, first session: it’s Model Morning!
Process Mining: Discovering Process Maps from Data, Anne Rozinat and Christian Gunther, Fluxicon – uncovering the secrets within your processes
Process mining, beginning with an analogy of analyzing wind and currents to optimize sailing routes. The process reality is always more complex than the ideal model. The model is helpful for understanding the overall flow, but you need to see what’s actually happening if you want to understand – and optimize – it in more detail. Demo of their Disco product, starting with a log file in a spreadsheet that includes a case ID, start and end timestamp, activity name, resource and role for each activity in a process (resource and role are not required, but can help with the analysis). Import this into Disco, identify the fields, and Disco reconstructs the process as it was actually executed. The colors, arc thickness and numbers indicate the frequency of each path – great for identifying exceptions and non-compliance. Interactive filters can remove less frequent paths to show the main flow.

Statistics views show aggregate information for the data set, such as case (process instance) duration and mean wait time. Filters can be added, e.g., to identify the cases with the longest duration, then switch back to the process diagram view to see the flow for that filtered set of cases and identify the bottlenecks and loopbacks. The performance view of the process diagram shows the time for each path. The animation view feeds the actual log data through the process diagram to help with visualization of bottlenecks. Clicking on a path creates a filter for only those cases that follow that path — the example shown is to find the small number of cases that bypassed a mandatory compliance step — then view the actual log data to identify the cases and resources involved. Disco can also generate a performance view of the log data rather than a process view, which shows the paths between roles, indicating handoffs in the process and allowing for identification of resource bottlenecks.
They’re working with a simulation vendor to link the models that they generate with simulation capabilities, which would allow for what-if scenarios. A number of questions about the log file: obviously, if the data’s not there, you can’t analyze it, and you might have to massage the log file prior to importing it into Disco.
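For the curious, the core of what a discovery tool like Disco does with that log file can be sketched in a few lines: group events by case ID, then count how often one activity directly follows another. Everything below (the log contents, the activity names) is invented for illustration; the real product obviously does far more than this.

```python
from collections import defaultdict

# Hypothetical event log: (case ID, activity) pairs, assumed already sorted
# by timestamp within each case -- the minimum needed to reconstruct a
# process map (resource and role are optional extras in the real tool).
event_log = [
    ("case-1", "Receive order"), ("case-1", "Check credit"), ("case-1", "Ship"),
    ("case-2", "Receive order"), ("case-2", "Ship"),  # bypasses the check
    ("case-3", "Receive order"), ("case-3", "Check credit"), ("case-3", "Ship"),
]

def discover_process_map(log):
    """Count directly-follows transitions per case: these arc frequencies
    are what drive the colors and arc thickness in the discovered map."""
    traces = defaultdict(list)
    for case_id, activity in log:
        traces[case_id].append(activity)
    arcs = defaultdict(int)
    for trace in traces.values():
        for a, b in zip(trace, trace[1:]):
            arcs[(a, b)] += 1
    return dict(arcs)

arcs = discover_process_map(event_log)
# The rare path that bypasses the mandatory step stands out by its low count:
# ("Receive order", "Ship") -> 1, vs ("Receive order", "Check credit") -> 2
```

Filtering on an arc is then just selecting the cases whose traces contain that pair, which is exactly what clicking on a path in the demo appeared to do.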
Managing Process Roles and Relationships, Roy Altman, PeopleServ – putting people into context
People in today’s organizations may assume multiple roles and have complex relationships with others, which can make it difficult to manage an organizational view of the company as well as people’s responsibilities within processes. Most organizations have some sort of matrix structure when it comes to projects, for example, so that a person’s manager on a project is not the same as their reporting manager. Issues such as delegation can make things more complex. PeopleServ’s Connexxions uses a rules-based approach to determine the correct role for each resource in the context of each process; it creates a functional integration between people and a technical integration between systems. Within the tool, create a context, then add a person and department and create the relationship between them (e.g., manages). Add another person and create a relationship between them (e.g., peer). Add rules, which may include links to data from other systems such as HR systems (e.g., all of the people actively working in that department), LDAP and other sources of organizational information. Contexts can be nested, so one context can be added as a node in another context. The context can be traversed starting at a node (e.g., find a person’s manager in that context, or find all people reporting to a person): this is where a lot of the power comes in, since it allows identification of someone’s role and relationships in each individual context, e.g., projects, teams and departments. These contexts/traversals can be accessed/consumed from applications in order to control access and establish contextual reporting structures. People throughout the organization could create and edit contexts as appropriate: HR may manage the overall organization chart, departmental and project managers may manage roles within their teams, and anyone may manage roles for more casual groups such as social teams.
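To make the context idea concrete, here’s a toy sketch (my own, not PeopleServ’s code) of contexts as small graphs of typed relationships, where the same traversal – “who is this person’s manager?” – returns a different answer depending on the context, which is the essence of the matrix-organization problem described above.

```python
from collections import defaultdict

class Context:
    """Hypothetical sketch of a context: a named graph of typed, directed
    relationships between people, traversable independently per context."""
    def __init__(self, name):
        self.name = name
        self.edges = defaultdict(list)   # person -> [(relationship, person)]

    def relate(self, a, relationship, b):
        self.edges[a].append((relationship, b))

    def manager_of(self, person):
        # Traverse upward: find who "manages" this person in this context.
        for src, rels in self.edges.items():
            if any(rel == "manages" and tgt == person for rel, tgt in rels):
                return src
        return None

    def reports_of(self, person):
        # Traverse downward: everyone this person manages in this context.
        return [tgt for rel, tgt in self.edges[person] if rel == "manages"]

# A person's manager differs by context: department vs. project (matrix org).
dept = Context("Engineering department")
dept.relate("Dana", "manages", "Alex")

project = Context("Website relaunch project")
project.relate("Sam", "manages", "Alex")
```

An application consuming these contexts would pick the relevant one (department for HR approvals, project for task assignment) and traverse from there, rather than assuming a single org chart.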
Lowering the Barriers to BPMN, Gero Decker, Signavio – BPMN made easy, and cool
Signavio is used not just for BPMN process modeling for IT projects such as ERP, CRM and BPMS implementations; in fact, that’s only 30% of their business. 40% of customers use it purely for process analysis and improvement, and the remaining 30% use it as part of their certification, risk or compliance efforts. They focus on providing capabilities both for easy creation of an initial graphical process model and for more complex validation, understanding and refinement of the model, in order to lower the barriers to good process modeling. People less familiar with graphical modeling can use Quick Model to enter the activities in a spreadsheet-style format, and see the diagram generated as each activity is entered or edited. The columns in the activity entry table are What, Who, How, IT Systems, Input documents, and Output documents; only the What is required (which becomes the activity name), and specifying the Who causes swimlanes to appear. At any point, you can switch to the full graphical editor, which allows addition of gateways and other elements, and also supports voice input. The shape palette can be filtered by the BPMN 2.0 specification, e.g., core elements only, and there is a tool to validate against Bruce Silver’s BPMN Method and Style conventions, which displays the improper usages and suggests fixes. They can also apply best practices and other style guides, such as DODAF, and can show a list of the rules and which are violated by the model. The tool can maintain and compare different versions of a model, much like a code diff, either for subsequent revisions of the same model or for variants that are branched off. Models can be exported in BPMN XML and other formats, and their tool is integrated and/or OEMed into several BPM automation tools as the front-end modeler.
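As a rough illustration of the Quick Model idea – assuming, as in the demo, that only the What is required and that the role column groups tasks into swimlanes – the table-to-diagram step boils down to something like this (all names and structures here are my own invention, not Signavio’s internals):

```python
# Hypothetical Quick Model rows: each needs only a "what" (the task name);
# a "who" assigns the task to a swimlane.
rows = [
    {"what": "Receive application", "who": "Clerk"},
    {"what": "Assess risk", "who": "Underwriter"},
    {"what": "Send decision", "who": "Clerk"},
]

def build_model(rows):
    """Derive lanes and a simple sequence flow from the activity table."""
    lanes = {}       # role -> list of tasks in that swimlane
    sequence = []    # tasks in entry order
    for row in rows:
        who = row.get("who") or "Unassigned"
        lanes.setdefault(who, []).append(row["what"])
        sequence.append(row["what"])
    # Sequence flows connect tasks in entry order, as the generated diagram
    # does before gateways and other elements are added in the full editor.
    flows = list(zip(sequence, sequence[1:]))
    return lanes, flows

lanes, flows = build_model(rows)
```

The real tool regenerates the diagram on every edit and emits standard BPMN XML, but the mapping from table rows to lanes and flows is the part that makes the spreadsheet-style entry feel immediate.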
They have an experimental feature — not yet, and possibly never, part of the core product — using a device that detects hand movements and gestures (or, as we saw accidentally, head movements) to edit the process models; although this would not work for fine-grained modeling, it would be cool when presenting and doing some minor edits on a model in front of a larger group. They’re also working on vocabulary analysis for refactoring/correcting element labels, which would ensure that standardized language is used within the models.
Automated Assessment of BPMN 2.0 Model Quality, Stephan Fischli and Antonio Palumbo, itp commerce – a quality engineer’s dream
itp commerce offers a long-standing process modeling and analysis product that uses Microsoft Visio with their add-on as the front-end graphical environment, which provides a transition for existing Visio users to create BPMN 2.0 standard process models. Bruce Silver uses this in his training (he also now uses Signavio; not sure if he’ll be maintaining both), since they were the first to provide validation of models against his style guide. They also provide a number of process model quality metrics — validation, conformance, complexity, consistency and vagueness — some of which are difficult to quantify, fuzzy, or both. Looking at the model quality pop-up (from their Visio add-on), it shows overall model quality with a long list of checks to turn on/off, and preset filters for process analysis, process execution, diagram interchange, and process simulation to show the quality relative to those intended uses of the model. Models can also be assessed for maturity level, which is an indicator of an organization’s overall process maturity; obviously, models are only one aspect of organizational maturity, but looking at a maturity model such as OMG’s BPMM can yield requirements for process models at each maturity level, e.g., KPIs would be required in models at process maturity level 4. In their Repository Explorer tool, you can run a quality report on a process chain (a set of related processes), which generates a full spreadsheet of all of the individual metrics for each process, plus aggregate statistics on quality, conformance, maturity and much more.
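The general shape of automated model-quality checking is easy to sketch: each check is a predicate over the model structure, and the report is the list of violations plus some numeric metrics. The checks and the model representation below are my own simplified inventions in the spirit of the demo, not itp commerce’s actual rules:

```python
# Hypothetical minimal model structure: named tasks and typed gateways.
model = {
    "tasks": [{"name": "Check invoice"}, {"name": ""}],      # unlabeled task
    "gateways": [{"type": "exclusive", "default_flow": None}],
}

def quality_report(model):
    """Run simple checks in the spirit of vagueness and style validation,
    plus a crude complexity signal (gateways per task)."""
    violations = []
    for task in model["tasks"]:
        if not task["name"].strip():
            violations.append("vagueness: task without a label")
    for gw in model["gateways"]:
        if gw["type"] == "exclusive" and gw["default_flow"] is None:
            violations.append("style: exclusive gateway without a default flow")
    complexity = len(model["gateways"]) / max(len(model["tasks"]), 1)
    return violations, complexity

violations, complexity = quality_report(model)
```

Aggregating such results across every process in a chain, as the Repository Explorer report does, is then just running the same checks over a set of models and tabulating the counts.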
Awesome first session at bpmNEXT. It’s only the mid-morning break on the first day, lots more to come.