BPM for Small and Medium Businesses

Tom Bellinson of IT Methods looked at why BPM in small businesses (under 500 people) is different from that in larger businesses, based on fundamental differences in how small and large businesses work, and therefore how they deal with process improvement. Large companies have some advantages when it comes to process improvement – more human and financial resources, and a longer time frame – but small businesses have the advantage that their processes can usually be understood and tackled end to end rather than piecemeal.

Because of fewer available resources, a big bang approach sometimes doesn’t work for small companies: the big bang is good for just getting it done all at once, but causes a great deal more stress and requires more concentrated effort. Bellinson proposes using a big bang approach for the analysis and planning phases, then implementing using a more incremental approach. This can still cause significant stress, particularly during the analysis phase, when you may be challenging the company owner’s personality directly as it manifests in the company culture and processes. Although process analysis always challenges management in some way in any size of company, when the owner personally created specific processes, and those processes are now being called out as broken, that can be uncomfortable all around.

To take a big bang approach to process mapping in a small business, it needs to be made the #1 priority in the company so that it doesn’t get pushed aside when the inevitable emergencies occur; you’ll also need to hire an external consultant to guide the process and gather the information, since the odds of those skills being inside your company and readily available are near zero. This is really a 2-4 week effort, not the months that it might take in a larger company, so although it will be a bit stressful and disruptive during that time, you need to bite the bullet and just get it done. The analysis itself isn’t unique to small businesses – map the as-is processes, find and eliminate the non-value-added activities, determine ROI – but sometimes the roles are a bit different, with the process consultant actually doing the process improvement exercise and presenting it to the company, rather than internal participants being involved in the reengineering efforts.

I’ve been approached by a few smaller businesses lately who were interested in BPM, and I think that the tools are finally at a price point that SMBs can consider implementing BPM inside their organizations. I agree with Bellinson that many of the techniques are just different for smaller businesses; having started and run two small businesses of up to 40-50 people in size, I can certainly understand how the owner’s personality can have a big influence on the corporate culture, and therefore on the way that business process improvement has to happen. However, there are still a lot of standard BPM principles and methodologies that can be applied, just on a smaller scale.

SaaS BPM at Surrenda-link

Bruce Spicer of Keystar Consultancy presented on a project that he did with Surrenda-link Investment Management to implement Appian cloud-based BPM for the process of procuring US life settlement assets (individual life insurance policies) to become part of their investment funds. They were specifically looking at a software-as-a-service offering in order to reduce cost and risk (considering the small size of their IT group), since SaaS allows them to scale up and down seamlessly without increasing costs significantly. They’ve built their own portal/user interface, using Appian Anywhere as the underlying process and analytics engine; it surprises me a bit that they’re not using more of the out-of-the-box UI.

They were over time and over budget, mostly because they (admittedly) screwed up the process mapping due to immature processes, inexperience with process analysis, and inexperience with gathering requirements versus just documenting the as-is state. Even worse, someone senior signed off on these incorrect process models, which were then used for initial development in the proof of concept before corrections were made. They made some methodology corrections after that, improving their process analysis by looking at broad processes before doing a detailed view of a functional silo, and moving to agile development methodologies. Even with the mistakes that were made, they’re in production and on track to achieve their three-year ROI.

This should have been a compelling case study, but maybe because it was just after lunch, or maybe because the presentation averaged 120+ words per slide, I had a hard time getting into it.

Resolving Case Management Challenges with Dynamic BPM

Dermot McCauley of Singularity discussed case management and its need for dynamism. He’s one of the co-authors of Mastering the Unpredictable: How Adaptive Case Management Will Revolutionize The Way That Knowledge Workers Get Things Done, and started with a definition of case management:

Case management is the management of long-lived collaborative processes that require secure coordination of knowledge, content, correspondence and resources to achieve an objective or goal. The path of execution cannot be predefined. Human judgment is required in determining how to proceed, and the state of a case can be affected by external events.

As he pointed out, cases are inherently unpredictable, emerging and changing over time, and must allow case workers to chart their own course through the process of managing the case, deciding on the right tasks to do and the right information to include at the right time. He discussed 14 key characteristics of case management, including “goal driven”, “information complexity” and “fluid participants and roles”, and how a case management technology platform must include aspects of BPMS, ECM and collaboration technologies in order to effectively support knowledge workers. He also discussed the criticality of a case’s history, even more so than with structured processes, since cases are typically long-running and might have several workers added partway through the case timeline. Case workers need a flexible work environment, since that’s the nature of their work, which means that they need to be able to configure their own desktop environment via mashup-like functionality in order to organize their work effectively.
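Purely to make those ingredients concrete, here’s a minimal Python sketch (my own illustration with assumed names, not Singularity’s data model) of a case record that combines a goal, ECM-style content, fluid participants and an append-only history:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Any

@dataclass
class CaseEvent:
    """One immutable entry in the case history: who did what, and when."""
    actor: str
    action: str
    detail: dict[str, Any]
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

@dataclass
class Case:
    """A long-lived, goal-driven case with fluid participants and full history."""
    case_id: str
    goal: str
    participants: set[str] = field(default_factory=set)
    documents: dict[str, bytes] = field(default_factory=dict)  # ECM-style content
    history: list[CaseEvent] = field(default_factory=list)

    def record(self, actor: str, action: str, **detail: Any) -> None:
        """Append to the history; nothing is ever updated in place."""
        self.history.append(CaseEvent(actor, action, detail))

    def add_participant(self, actor: str, new_worker: str) -> None:
        # Workers can join partway through; the history lets them catch up.
        self.participants.add(new_worker)
        self.record(actor, "participant_added", worker=new_worker)
```

The append-only history is the key design point: a worker who joins a year into the case can reconstruct everything that happened before they arrived.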

He also covered a bit of their own product; it was interesting to see that there is a process map underlying a case, with a “happy path” showing what the case should be doing, but providing the user at any point with the ability to skip forward or back in the process map, initiate other (pre-defined) tasks, reassign the task to another user, and change case characteristics such as priority and expected completion time. This is not a purely unstructured process, where there would be no predefined model, but dynamic BPM, where the model is predefined but can be readily changed while in flight. They have implemented a solution with the UK Insolvency Service, dealing with individual bankruptcy; this was targeted at a new low-cost program that the Insolvency Service was putting in place to handle the large number of low-asset individual insolvency cases in the face of the recent economic crisis. They used an agile approach, moving the case files from paper to electronic and providing a more flexible and efficient case management process that was live within 12 months of the original government legislation that enacted the program.
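As a rough illustration of that distinction between unstructured and dynamic process (a sketch of the concept under my own assumptions, not Singularity’s implementation), a dynamic process keeps a predefined happy path but lets the worker deviate from it in flight:

```python
class DynamicCaseProcess:
    """A predefined 'happy path' that a case worker may deviate from in flight."""

    def __init__(self, happy_path: list[str], optional_tasks: set[str]):
        self.happy_path = happy_path          # the predefined model
        self.optional_tasks = optional_tasks  # pre-defined tasks available ad hoc
        self.position = 0
        self.assignee = None
        self.priority = "normal"
        self.log: list[str] = []

    def complete_current(self) -> None:
        """Normal flow: finish the current step and advance along the happy path."""
        self.log.append(f"completed {self.happy_path[self.position]}")
        self.position = min(self.position + 1, len(self.happy_path) - 1)

    def jump_to(self, step: str) -> None:
        """Skip forward or back: the model constrains, but does not dictate."""
        self.position = self.happy_path.index(step)
        self.log.append(f"jumped to {step}")

    def initiate(self, task: str) -> None:
        """Launch one of the pre-defined optional tasks at any point."""
        if task not in self.optional_tasks:
            raise ValueError(f"{task} is not a pre-defined task")
        self.log.append(f"initiated {task}")

    def reassign(self, user: str) -> None:
        self.assignee = user
        self.log.append(f"reassigned to {user}")
```

Every deviation is still recorded against the model, so the case history stays intact even when the worker goes off the happy path.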

Bridging Process Modeling and IT Solutions Design at adidas

Eduardo Gonzalez of the adidas Group talked about how they are implementing BPM within their organization, particularly the transition from business process models to designing a solution, which ties in nicely with the roundtable that I moderated yesterday. The key issue is that process models are created for the purpose of modeling the existing and future business processes, but the linkage between that and requirements documents – and therefore on to solution design – is tenuous at best. One problem is with traceability: there is no way to connect the process models to the thick stack of text-based requirements documents, and from the requirements documents to the solution modules; this means that when something changes in a process model, it’s difficult to propagate that change through to the requirements and solution design. Also, the requirements leave a bit too much to the developers’ imaginations, so often the solution doesn’t really meet the requirements.
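A simple way to picture what’s missing is a traceability index; this is my own hypothetical sketch (the element, requirement and module names are invented, and it’s not what adidas built), but it shows the kind of linkage that makes change propagation possible:

```python
from collections import defaultdict

# Map each process model element to the requirements and solution modules
# that depend on it, so a model change can be propagated rather than lost.
trace: dict[str, dict[str, set[str]]] = defaultdict(
    lambda: {"requirements": set(), "modules": set()}
)

def link(model_element: str, requirement: str, module: str) -> None:
    trace[model_element]["requirements"].add(requirement)
    trace[model_element]["modules"].add(module)

def impact_of_change(model_element: str) -> dict[str, set[str]]:
    """Everything that needs review when this part of the process model changes."""
    return trace[model_element]

link("approve claim", "REQ-042", "claims-service")
print(impact_of_change("approve claim"))
# {'requirements': {'REQ-042'}, 'modules': {'claims-service'}}
```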

The question becomes how to insert the business process models into the software development lifecycle. Different levels of the process model are required, from high-level process flows to executable workflows; they wanted to tie this in to their V-cycle model of solution design and development, which appears to be a modified waterfall model with integrated testing. Increasingly granular process models are built as the solution design moves from requirements and architecture to design and implementation; the smaller and more granular process building blocks, translated into solution building blocks, are then reassembled into a complete solution that includes a BPMS, a rules engine, a portal, and several underlying databases and other operational systems that are being orchestrated by the BPMS.
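As an illustration of that decomposition (a sketch under my own assumptions, not adidas’ actual models), each high-level flow breaks down into increasingly granular blocks, with the leaves mapped to the solution component that implements them:

```python
from dataclasses import dataclass, field

@dataclass
class ProcessBlock:
    """A process building block; leaf blocks map onto solution components."""
    name: str
    implemented_by: str | None = None   # e.g. "BPMS", "rules engine", "portal"
    children: list["ProcessBlock"] = field(default_factory=list)

# A hypothetical high-level flow decomposed into executable building blocks.
order_to_cash = ProcessBlock("order-to-cash", children=[
    ProcessBlock("capture order", implemented_by="portal"),
    ProcessBlock("validate order", children=[
        ProcessBlock("check credit", implemented_by="rules engine"),
        ProcessBlock("check stock", implemented_by="ERP, orchestrated by BPMS"),
    ]),
    ProcessBlock("fulfil order", implemented_by="BPMS"),
])
```

Reassembling the solution then becomes a matter of walking the tree: every leaf must name an implementing component, which gives a completeness check that pure text requirements can’t.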

Gonzalez has based some of their object-driven project decomposition methods on Martyn Ould’s Business Process Management: A Rigorous Approach, although he found some shortcomings to that approach and modified it to suit adidas’ needs. Their approach uses business and solution objects in an enterprise architecture sort of approach (not surprising when he mentioned at the end of the presentation that he is an enterprise architect), moving from purely conceptual object models to logical object models to physical object models. Once the solution objects have been identified, they model each object’s states through its lifecycle, plus object handling cases (analogous to use cases) that describe how the system handles an object through its full lifecycle, including both system and human interaction. He made the point that you have to have the linkage to master data; this is becoming recognized as a critical part of process applications, and some BPMS vendors are starting to consider MDM connectivity.
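Here’s a minimal sketch of what object lifecycle modeling can look like in code (the object type, states and events are my assumptions, not adidas’ models):

```python
# Hypothetical lifecycle for a solution object such as a purchase order:
# each state lists the events allowed in it, and the resulting next state.
LIFECYCLE: dict[str, dict[str, str]] = {
    "created":   {"submit": "submitted"},
    "submitted": {"approve": "approved", "reject": "rejected"},
    "approved":  {"fulfil": "fulfilled"},
    "rejected":  {},   # terminal state
    "fulfilled": {},   # terminal state
}

class SolutionObject:
    def __init__(self, object_id: str):
        self.object_id = object_id
        self.state = "created"

    def handle(self, event: str) -> None:
        # An "object handling case" drives the object through its lifecycle,
        # whether the event comes from a system or from a human interaction.
        transitions = LIFECYCLE[self.state]
        if event not in transitions:
            raise ValueError(f"{event!r} not allowed in state {self.state!r}")
        self.state = transitions[event]

po = SolutionObject("PO-001")
po.handle("submit")
po.handle("approve")   # po.state is now "approved"
```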

The end solution includes a portal, BPMS, BRMS, ESB, MDM, BI and back-end systems – a fairly typical implementation – and although the cycle for moving from process model to solution design isn’t automated, at least they have a methodology that they use to ensure that all the components are covered and in synchronization. Specific models at particular points in their cycle include models from multiple domains, including process and data. They did a proof of concept with this methodology last year, and are currently running a live project using it, further refining the techniques.

Their cycle currently includes the model and execute phases of a standard BPM implementation cycle; next, they want to take on the monitor and optimize phases, and add modeling techniques to derive KPIs from functional and non-functional requirements. They also plan to look at more complex object state modeling techniques, as well as how adaptive case management fits into some of their existing concepts.

I posed a question at the end of my roundtable yesterday: if a tool existed that allowed for the definition of the process model, user interface, business rules and data model, then generated an executable system from that, would there still be a need for written requirements? Once we got past the disbelief that such tools exist (BPMS vendors – you have a job to do here), the main issue identified was one of granularity: some participants in the process modeling and requirements definition cycle just don’t need to see the level of detail that will be present in these models at an executable level. Obviously, there are still many challenges in moving seamlessly from conceptual process models to an executable process application; although some current BPMS provide a partial solution for relatively simple processes, this typically breaks down as processes (and related integrations) become more complex.

Conversation with Keith Harrison-Broninski

You may have noticed that I haven’t been blogging for the first two days of the IRM BPM conference here in London: that’s because I gave a half-day seminar on the BPM technology landscape on Monday, then presented a session on collaboration and BPM yesterday morning, then moderated a roundtable on transforming process models to IT requirements yesterday afternoon. Last night, a small group of us had dinner at the lovely Institute of Directors club, where we had a fascinating conversation about all things related to BPM – off the record, of course. 🙂

This morning, we started the day with Roger Burlton, the conference organizer, interviewing Keith Harrison-Broninski about the future of work. Keith, who I first met at the BPMG conference here in London four years ago, created the theory of Human Interaction Management (HIM), with the idea that you start with the complex human relationships – strategy, goals and deliverables – and work your way out to the transactional stuff. In other words, get a handle on the collaborative human-to-human processes first, with no technology involved, then use the successes in that sort of process improvement to gain support for the greater funding and time commitments required for implementing a BPMS. When Roger said that HIM sounds a lot like project management, Keith replied that project management is a use case of HIM.

Keith comes across as a bit of an old-school technophobe: he pooh-poohs blogging, tweeting and all other social media, and (based on his involvement in my roundtable yesterday afternoon) considers BPMS implementations to take much too long and cost too much, although he appears to have little practical experience with any modern-day model-driven BPMS. Ignoring that, he does have some interesting ideas that get back to the definition of BPM that we all give lip service to, but often ignore: the management practice of improving processes, separate from the technology. This is about knowledge work, however, not routine work: people are given goals and deliverables, and work out how to achieve them based on their own knowledge. He refers to these as information-based processes, and to everything that could be represented by a process model as task-based processes, where the mundane task-based processes are merely programs (in the software sense) to be implemented with much time and effort by the lowly engineers and developers. The answer to all this, of course, is his software, HumanEdj, and the workshops and services that he provides to help you implement it.

An interesting discussion, showing some of the huge gaps that exist in BPM today, especially between how we deal with knowledge work versus routine work.