Category Archives: BPM


BPM skills in 2017 – ask the experts!

Zbigniew Misiak over at BPM Tips decided to herd the cats, and asked a number of BPM experts about the skills that are required – and those that are no longer relevant – as we move into 2017 and beyond. I was happy to be included in that group, and my contribution is here.

In a nutshell, I had advice for both the process improvement/engineering groups and the IT groups that are involved in BPM implementations. Basically, the former need to learn more about the potential power of automation as a process improvement tool and how a BPMS can help with that, while the latter need to stop using agile low-code BPMS tools to do monolithic, waterfall-driven implementations. I also addressed the need for citizen developers – usually semi-technical business analysts who build “end user computing” solutions directly within business units – to start using low-code BPMS tools for this instead of spreadsheets.

On the side of skills that are no longer relevant, I’m seeing less need for Lean/Six Sigma efforts that focus on incremental process improvements rather than innovation. There are definitely industries with material assets and processes that benefit greatly from LSS methodologies, but its use in knowledge-based service organizations is waning.

Check out the entire post at the link above for the views of several others in the industry.

BPM books for your reading list

I noticed that Zbigniew’s reading list of BPM books for 2017 included both of the books where I have author credit on Amazon: Social BPM and Best Practices for Knowledge Workers.

You can find the ebooks on Amazon for about $10 each.

I’ve also been published in a couple of academic books and journals, but somehow it’s more exciting to see my name on Amazon, since I don’t really think of myself as an author. After writing almost a million words on this blog (968,978 prior to this post), maybe I should reconsider!

RPA just wants to be free: @WorkFusion RPA Express

Last week, WorkFusion announced that their robotic process automation product, RPA Express, will be released in 2017 as a free product; they published a blog post as well as the press release, and today they hosted a webinar to talk more about it. They are taking requests to join the beta program now, with a plan to launch publicly at the end of Q1 2017.

WorkFusion has a lot of interesting features in their RPA Express and Smart Process Automation (SPA) products, but today’s webinar was really about their business model for RPA Express. This is not a limited-time trial: it’s a free, enterprise-ready product that can generate business benefit. It’s free to purchase, with no annual maintenance fees, although you obviously have infrastructure costs for the servers/desktops on which RPA Express runs. Their goal in making it free is to bypass the whole RFP-POC-ROI dance that goes on in most organizations, where a decision to implement RPA – which can typically show a pretty good ROI within a matter of weeks – can take months. With a free product, one major barrier to implementation has been removed.

So what’s the catch? WorkFusion has a more intelligent automation tool, SPA, and they’re hoping that once organizations see the benefits of RPA Express, they will want to try out SPA on their more complex automation needs. RPA Express uses deterministic, rules-based automation, which requires explicit training or specification of each action to be taken; SPA uses machine learning to learn from user actions in order to automate tasks that would typically require human intervention, such as handling unstructured and dynamic data. WorkFusion envisions a “stairway to digital operations” that starts with RPA, then steps up the intelligence with cognitive processing, chatbots and crowdsourcing to a full set of cognitive services in SPA.

This doesn’t mean that RPA Express is just a “starter edition” for SPA: there are entire classes of processes that can be handled with deterministic automation, for example, synchronizing data between systems that don’t talk to each other, such as SAP and Salesforce. This replaces having a worker copy and paste information between screens, or (horrors!) re-type the information into two or more systems; it can result in a huge reduction in cost and time, and it removes the tedious work from people to free them up for more complex decision-making or direct customer interaction.

RPA Express bots can also be called from other orchestration and automation tools, including a BPMS, and can run on a server or on individual desktops. We didn’t get a rundown of the technology, so more to come on that as they get closer to the release. We did see one or two screens: it’s based on modeling processes using a subset of BPMN (start and end events, activities, XOR gateways) that can be easily handled by a business user/analyst to create the automation flows, plus recorder bots that capture actions while users run through the processes to be automated. There was a mention of coding on the platform as well, although it was clear that this is not required in many cases, hence development skills are not essential.

Removing the cost of the software changes the game, allowing more organizations to get started with this technology without having to do an internal cost justification for the licensing costs. There are still training and implementation costs, but WorkFusion plans to cover some of this through online video courses, as well as by training the large SIs and other partners so that they have RPA Express in their toolkit when working with client organizations. More daunting is the architectural review that most new software needs to go through before being installed within a larger organization: this can still block the implementation even if the software is free.

I’m looking forward to seeing a more complete technical picture when the product is closer to its launch date. I’m also looking forward to seeing how this changes the price point of RPA offerings from other vendors.

What’s on your agenda for 2017? Some BPM conferences to consider

I just saw a call for papers for a conference next October, and went through to do a quick update of my BPM Event Calendar. I certainly don’t attend all of these events, but I like to keep track of who’s doing what, when and where. Here’s what I have in there so far; if you have others, send me a note or add them as a comment to this post and I’ll add them to the calendar. I’m posting just the major conferences here, not every regional seminar.

Many vendors are eschewing a single large annual conference in favor of several regional conferences, easing the travel concerns of attendees; since these are usually just one day long, they aren’t announced this far in advance. It will be interesting to see if more vendors decide to go this way, or do more livestreaming to allow people to participate in more of the conference content remotely.

At this point, I don’t have confirmed attendance or speaking spots at any of these, although I will almost certainly be attending bpmNEXT and a few of the vendor conferences, either as a speaker or as an analyst/blogger. If you’re interested in having me attend your conference, let me know; I require that my travel expenses be covered (otherwise they come out of my own pocket, on top of the billable days that I give up to attend), and a speaking fee if you want me to do a keynote or other presentation.

Keynoting at @ABBYY_USA Technology Summit

I’ve been in the BPM field since long before it was called BPM, starting with imaging and workflow projects back in the early 1990s. Although my main focus is on process now (hence the name of my blog), almost every project that I’m involved in has some element of content and capture, although not necessarily from paper documents. Basically, content capture is the onramp to many business processes: either the capture of a piece of content is what triggers a process (e.g., an application form), or it adds information to a process to move it along. Capture can mean traditional document scanning with intelligent recognition in the form of OCR, ICR, barcode and mark-sense recognition, or it can be the capture of information that is already in electronic form but not in a structured format (e.g., emails).

To get to the point, this is why I’m excited to be keynoting at the ABBYY Technology Summit coming up on November 16-18, in a presentation entitled How Digital Transformation is Increasing the Value of Capture and Text Analytics:

As the business world has been wrestling with the challenge of digital transformation, the last few years have seen a shift away from BPM and Case Management technology platforms towards the more solutions-oriented approach offered by Smart Process Applications and Case Management Frameworks. A critical component of these business solutions is the capability to capture the key business information at the point of origin.

This information is often buried inside forms and other business documents. Capturing this data through recognition technologies and automatic document classification transforms streams of documents of any structure and complexity into business-ready data.

This enables organizations of any size to streamline their existing business processes, increasing efficiency and reducing costs; it also enables real-time customer self-service processes triggered by mobile document capture.

I’ll be covering trends and benefits of intelligent capture, providing ABBYY’s customers and partners in attendance with solid advice on how to best start integrating these technologies to make their business processes run better. I’m also writing a paper covering these topics, sponsored by ABBYY, which will be available in time for the conference.

If you’re at the conference, please stop by and say hi; I’ll be hanging out there for the rest of the day after my keynote.

Strategy to execution – and back: it’s all about alignment

I recently wrote a paper sponsored by Software AG called Strategy To Execution – And Back, which you can find here (registration required). From the introduction:

When planning for business success, corporate management sets business strategy and specifies goals in terms of critical success factors and key performance indicators (KPIs). Although senior management is not concerned with the technical details of how business operations are implemented, they must have confidence that the operations are aligned with the strategy, and be able to monitor performance relative to the goals in real time.

In order to achieve operational alignment, there must be a clear path that maps strategy to execution: a direct link from the strategic goals in the high-level business model, through IT development and management practices, to the systems, activities and roles that make the business work. However, that’s only half the story: there must also be a path back from execution to strategy, allowing operational performance to be measured against the objectives in order to guide future strategy. Without both directions of traceability, there’s a disconnect between strategy and operations that can allow a business to drift off course without any indication until it’s far too late.

I cover how you need to have links from your corporate strategy through various levels of architecture to implementation, then be able to capture the operational metrics from running processes and roll those up relative to the corporate goals. If you don’t do that, then your operations could just be merrily going along their own path rather than working towards corporate objectives.

Another rift in the open source BPM market: @FlowableBPM forks from @Alfresco Activiti

In early 2013, Camunda – at the time a value-added Activiti consulting partner as well as a significant contributor to the open source project – created a fork of Activiti to form what is now the Camunda open source BPM platform, as well as their commercial version based on the open source core. As I wrote at the time:

At the end of 2012, I had a few hints that Alfresco’s Activiti BPM group was undergoing some amount of transition: Tom Baeyens, the original architect and developer of Activiti (now CEO of the Effektif cloud BPM startup announced last week), was no longer leading the Activiti project and had decided to leave Alfresco after less than three years; and camunda, one of the biggest Activiti contributors (besides Alfresco) as well as a major implementation consulting partner, was making noises that Activiti might be too tightly tied to Alfresco’s requirements for document-centric workflow rather than the more general BPM platform that Activiti started as.

Since then, Effektif became Signavio Workflow and Camunda decided to use a capital letter in their name; what didn’t change, however, is that as the main sponsor of Activiti, Alfresco obviously has a need to make Activiti work for document-centric BPM, and has skewed the product in that direction. That’s not bad if you’re an Alfresco ECM customer, but it was likely not the direction that the original Activiti BPM team wanted to go.

Last month, I heard that key Activiti people had left Alfresco but had no word about where they were heading; last week, former Activiti project lead Tijs Rademakers and Activiti co-founder and core developer Joram Barrez announced that they were creating an Activiti fork to form Flowable with a team including former Alfresco Activiti people and other contributors.

To be clear for those who don’t dabble in open source, forking is a completely normal activity (no pun intended…well, maybe only a little) wherein a copy of the source code is taken at a point in time to create a new open source project with its own name and developer community. This may be done because of a disagreement in product direction – as appears to be the case here – or because someone wants to take it in a radically different direction to address a different need or market.

I heard about all of this from Jeff Potts, who has been involved in the Alfresco open source community for a decade, via his blog. He wrote about the Activiti leads leaving Alfresco back in September, although he reported it as three people leaving independently who just happened to depart at the same time. Possibly not completely accurate, in hindsight, but that was the word at the time. He then saw the Flowable announcement (linked above) and wrote about that, which is where I first saw it.

Alfresco’s founder and CTO, John Newton, posted about the team departure and the fork:

Unfortunately, some of my early friends on the Activiti project have disagreed with our direction and have taken the step of forking the Activiti code. This is disappointing because the work that they have done is very good and has generally been in the spirit of open source. However, the one thing that we could not continue to give them was exclusive control of the project. I truly wish that we could have found a way to work with them within a community framework.

This seemed to confirm my suspicion that this was a disagreement in product direction as well as a philosophical divide; with Alfresco now a much bigger company than it was at the time that they took Activiti under their wing, it’s not surprising that the corporate mindset wouldn’t always agree with open source direction. Having to spend much more effort on the enterprise edition than the open source project and seeing BPM subsumed under ECM would not sit well with the original Activiti BPM team.

Newton’s comments are also an interesting echo of Barrez’s post at the time of the Camunda fork: in both situations, there’s a sense of disappointment – and maybe a bit of betrayal? – although now Barrez is on the other side of the equation.

Flowable was quick to offer a guide on how to move from Activiti to Flowable – trivial at this point since the code base is still the same – and Camunda jumped in with a guide on moving from Activiti to Camunda, an update on what they’ve done to the engine since their fork in 2013, and reasons why you should make the switch.

If you’re using Activiti right now, you might be a bit nervous about this news, but don’t be.

  • If you’re using it primarily for document workflow with your Alfresco ECM, you’re probably best to stay put in the long run: undoubtedly, Activiti will be restaffed with a good team and will continue to integrate tightly with Alfresco, and it’s possible that some of the capabilities might find their way from the Activiti project to the Alfresco project over time. There’s obviously going to be a bit of a gap in the team for a while: the project has shown no new commits in the past month, and questions on the forum are going unanswered.
  • If you use Activiti without Alfresco ECM (or with very little of it), you still don’t need to do anything right now: as Camunda showed in their post, a migration path from Activiti to Flowable or Camunda or any other fork will still exist in the future because of the shared heritage. It will get more complex over time, but not impossible. Sit tight for 6-12 months, and reassess your options then.
  • If you’re considering Activiti as a new installation, consider your use cases. If you’re a heavy Alfresco ECM user and want it tightly integrated, Activiti is the way to go. For other cases, it’s not so clear. We need to hear a bit more from Alfresco on their plans, but it’s telling that Newton’s post said “Business Process Management is one of those critical components of any content management system” which seems to place ECM as the primary focus and BPM as a component. He also said that they would be more explicit in their roadmap, and I recommend that you wait for that if you’re in the evaluation stage for a pure (non-ECM) BPM platform.

In the meantime, Flowable has released their initial 5.22.0 engine, and has plans for version 6 by the end of November. They haven’t published a product roadmap yet, but I’m expecting significant divergence from Activiti to happen quickly in order to incorporate new technologies that bring the focus back to BPM.

Note: the photo accompanying this post was taken by my talented photographer friend, Pat Anderson, with whom I have eaten many delicious and photogenic meals.

Bridging the bimodal IT divide

I wrote a paper a few months back on bimodal IT: a somewhat controversial subject, since many feel that IT should not be bimodal. My position is that it already is – with a division between “heavy” IT development and lighter-weight citizen development – and that we need to deal with what’s there using a range of development environments, including low-code BPMS. From the opening section of the paper:

The concept of bimodal IT – having two different application development streams with different techniques and goals – isn’t new. Many organizations have development groups that are not part of the standard IT development structure, including developers embedded within business units creating applications that IT couldn’t deliver quickly enough, and skunkworks development groups prototyping new ideas.

In many cases, this split didn’t occur by design, but out of situational necessity when mainstream IT development groups were unable to service the needs of the business, especially transformational business units created specifically to drive innovation. However, in the past few years, analysts have been positioning this split as a strategic action to boost innovation. By 2013, McKinsey & Company was proposing “greenfield IT” – technology managed independently of the legacy application development and infrastructure projects that typically consume most of the CIO’s attention and budget – as a way to innovate more effectively. They found a correlation between innovation and corporate growth, and saw greenfield IT as a way to achieve that innovation. By the end of 2014, the term “bimodal IT” was becoming popular, with Mode 1 being the traditional application development cycle focused on stability, well suited to infrastructure and legacy maintenance, and Mode 2 focused on agility and innovation, similar to McKinsey’s greenfield IT.

Read on by downloading the paper from Software AG’s site; right now, it looks like registration isn’t required.

AIIM Toronto keynote with @jmancini77

I’m at the AIIM Toronto seminar today — I pretty much attend anything that is in my backyard and looks interesting — and John Mancini of AIIM is opening the day with a talk on business processes. Yes, Mr. Column 1 is talking about Column 2, if you get the Zachman reference. This is actually pretty significant: content management isn’t just about content, just as process management isn’t just about process, but both need to overlap and work together. I had a call with Mancini yesterday in advance of my keynote at ABBYY’s conference next month, and we spent 30 minutes talking about how disruption in capture technologies has changed all business processes. Today, in his keynote, he talked about disruptive business processes that have transformed many industries.

He gave us a look at people, process and technology against the rise (and sometimes fall) of different technology platforms: document management and workflow; enterprise content management; mobile and cloud. There are a lot of issues as we move from one type of platform to another: moving to a cloud SaaS offering, for example, drives the move from perimeter-based security to asset-based security. He showed a case study for financial processes within organizations – such as accounts payable and receivable – with both a tactical dimension of getting things done and a strategic side of building a bridge to digital transformation. Most businesses (especially traditional ones) operate on a slim profit margin, making it necessary to look at ways to reduce costs: not through small, incremental improvements, but through more transformational means. For financial processes, in many cases this means getting rid of manual data capture and manipulation: no more manual data entry, no more analysis via spreadsheets. And cost reduction isn’t the key driver behind transforming financial business processes any more: it’s the need for better business analytics. Done right, these analytics provide real-time insight into your business that provides a strategic competitive differentiator: the ability to predict and react to changing business conditions.

Mancini finished by allowing today’s sponsors, with booths around the room, to introduce themselves: Precision Content, AIIM, Box, Panasonic, SystemWare, ABBYY, SourceHOV, and Lexmark (Kofax). I’ll be here for the rest of the morning, and look forward to hearing from some of the sponsors and their customers here today.

Camunda Community Day: @CamundaBPM technical sessions

I’m a few weeks late completing my report on the Camunda Community Day. The first part covered the community contributions and sessions, while this second half covers Camunda showing new things that can be used by the community developers in the audience.

First up was Vladimirs Katusenoks, core developer on BPMN.io, with a presentation on bpmn-js: how it works, and how to extend it with custom functionality such as adding colour to BPMN diagrams, which is a permitted extension to BPMN XML. His live coding presentation showed changing the colour of a shape background, either statically in code for the element class or by adding a colour picker to an individual element’s context palette; this was based on the core bpmn-js BPMN functionality, using bpmn-moddle to read/write the underlying metamodel and diagram-js to render it. There are a number of other bpmn-js examples on Github.
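To give a sense of what that looks like in practice, here’s a minimal sketch – not Vladimirs’ actual demo code – of recolouring a single shape through the bpmn-js modeling API; the container selector, element ID and colour values are made up for illustration, and a recent, promise-based version of bpmn-js is assumed.

```typescript
// Minimal sketch: recolouring a BPMN shape with bpmn-js (hypothetical element id)
import BpmnModeler from 'bpmn-js/lib/Modeler';

const modeler = new BpmnModeler({ container: '#canvas' });

async function highlightTask(diagramXml: string): Promise<void> {
  // Load the BPMN 2.0 XML into the modeler
  await modeler.importXML(diagramXml);

  const elementRegistry = modeler.get('elementRegistry');
  const modeling = modeler.get('modeling');

  // Look up a shape by its BPMN id and set its fill/stroke colours; bpmn-js
  // writes the colours back into the diagram interchange (DI) section of the
  // BPMN XML, which is the permitted extension mentioned above
  const task = elementRegistry.get('Task_ReviewClaim');
  modeling.setColor([task], { fill: '#ffe0b2', stroke: '#e65100' });
}
```

A colour picker added to an element’s context pad would presumably end up calling the same setColor operation on the selected element.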

Next, Felix Müller discussed KPI management, expanding on his August blog post on the topic. KPI management is based on quantitative indicators for process improvement, such as cycle time and overdue time, plus definitions of the time period, unit of measure and calculation method. In Camunda, the KPIs are defined in the Modeler, then monitored in Cockpit. He showed how to use the concept of element templates (which extend the core definitions) to create custom fields on the collaboration object (the process) or on individual tasks, e.g., KPI unit (hours, days, minutes) and KPI threshold (number). In Cockpit, this appears as a new KPI Overview tab, showing a list of individual instances with target/current/average duration, plus an indicator of the overdue status of the instance and any contained tasks; there is also a decorator bubble on the top right of a task in the process model to show the number of overdue instances on the aggregate model, or the overdue status as a check mark or exclamation mark on individual models. The Cockpit modifications were done by creating a plug-in to display KPI statistics, which queries and calculates on the fly – a potential performance problem that might be improved through pre-aggregation of statistics. He also demonstrated how to modify this basic KPI model to include an expected duration as well as a maximum duration. A good start, although I think there’s a lot more that’s needed here.
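As a rough illustration of the mechanism, here’s a minimal sketch of what such an element template might look like; the template id, property names and allowed values are hypothetical, and it assumes a Camunda Modeler version that supports element templates with Dropdown fields and camunda:property bindings. In practice this would be a standalone JSON file in the Modeler’s element-templates folder; it’s shown here as a TypeScript constant for readability.

```typescript
// Minimal sketch of a Camunda element template that adds KPI fields to a task.
// The template id and property names are hypothetical; the object would normally
// be saved as a .json file in the Modeler's element-templates folder.
const kpiTaskTemplate = {
  name: 'KPI Task',
  id: 'com.example.bpm.KpiTask',   // hypothetical template id
  appliesTo: ['bpmn:Task'],        // which BPMN elements get the extra fields
  properties: [
    {
      label: 'KPI unit',
      type: 'Dropdown',
      choices: [
        { name: 'minutes', value: 'minutes' },
        { name: 'hours', value: 'hours' },
        { name: 'days', value: 'days' },
      ],
      // stored as a camunda:property extension element on the task
      binding: { type: 'camunda:property', name: 'kpiUnit' },
    },
    {
      label: 'KPI threshold',
      type: 'String',
      binding: { type: 'camunda:property', name: 'kpiThreshold' },
    },
  ],
};

export default kpiTaskTemplate;
```

A Cockpit plug-in can then read those properties from the deployed process definitions and compare them against the history data to calculate the overdue indicators described above.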

Thorben Lindhauer, a Camunda core BPM developer, discussed how to contribute to the Camunda open source community, both at camunda.org (the engine and desktop modeler, same as the commercial product) and bpmn.io (the JS tools). Possible contributions include answering questions on the forums; logging error reports; documenting ideas for new functionality; and working on code. Code contributions typically start with a forum discussion about the planned new functionality, then a decision is made on whether it will be core code (which has higher quality standards, since it will become part of the commercial product and will eventually be maintained by Camunda) or a community extension; this is followed by ongoing development, merge and release cycles. Camunda is very supportive of community contributions, even if they don’t become part of the core product: community involvement is critical to the health of any open source project.

The last presentation of the community day was Daniel Meyer discussing the product roadmap. The next release, 7.6, will be on November 30 – they have a strict twice-yearly release cycle. This release includes updates to DMN, CMMN, BPMN, rolling updates, Cockpit features, and UI/UX in the web apps; I have captured a few notes below, but see the linked roadmap for a more complete and accurate description, and the online documentation as it is rolled out.

  • DMN:
    • Simpler decision table editing with drop-down lists of comparison/range operators instead of having to remember FEEL or JUEL syntax
    • Ability to add list of selection values (advanced mode still exists for full flexibility)
    • Decisions with literal expressions
    • DMN engine performance 4-6x faster
    • Support for decision requirements diagrams/graphs (DRD/DRG) that can link decision tables; visualization in Modeler and Cockpit isn’t there yet, but the structures are supported – in my experience, this is typical of Camunda, which builds and releases the engine capabilities early, then follows with the visualization, allowing for a quicker start for executable diagrams
  • CMMN:
    • Modeler now completely models CMMN including technical attributes such as listeners
    • Cockpit (visualization still incomplete, although we saw a brief view) will allow linking models of the same or different types
    • Engine feature and functionality improvements
  • Rolling updates allow the Camunda process engine to be updated without a shutdown: the database schema is guaranteed to be backwards compatible so that the database can be updated first, then the engines are updated in a rolling fashion by taking each one offline individually while the load balancer reroutes sessions.
  • BPMN:
    • BPMN conditional event supported
    • Improved modeling, including labels, collapsing/expanding subprocesses to switch between view types, and field injections in the property panel.
  • Cockpit:
    • More flexible/granular human task monitoring
    • New welcome page with links to apps (Cockpit, Tasklist, Admin), user profile, and frequent links
    • Batch operations (cancel, suspend, etc.) based on batch action capability built for instance migration
    • CMMN and DMN DRD visualization

Daniel discussed some other minor improvements based on customer feedback, plus plans for 2017, including a web modeler for collaborative BPMN, CMMN and DMN modeling via a SaaS offering and a future on-premise version. They finished the day with a poll and community feedback to establish priorities for future versions.

I stayed on for the second day, which is actually a separate conference: BPMCon for Camunda’s enterprise (commercial) customers. Or rather, I stayed on for Neil Ward-Dutton’s keynote, then ducked out for most of the rest of the day, which was in German. Neil’s keynote included results from workshops that he has done with executives on digital transformation, and how BPM can be used to create the bridges between the diverse parts of a digital business (internal to external, automated to people-centric), while tracking and coordinating the work that flows between the different areas.

Disclaimer: Camunda paid my travel expenses to attend both conference days. I was not compensated in any way for attending or for writing this post, and the opinions here are my own.