Fujitsu process discovery case study #GartnerBPM

I first saw Fujitsu’s process discovery offering last year, and it looked pretty useful at the time, but it didn’t have much of a track record yet. Today’s session brought forward Greg Mueller of Electro Scientific Industries (ESI), a manufacturer of photonic and laser systems for microengineering applications, to talk about their successes with it.

Basically, Automated Process Discovery (APD) uses log files and similar artifacts from a variety of systems to derive a process model, analyzing the frequencies of process variations, and slicing and dicing the data based on any of the contributing parameters. I’ve written a lot about why you would want to do process discovery, including some of the new research that I saw at BPM 2009 in Germany last month.
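
The core mechanics are easy to sketch, even if APD’s correlation and analysis go much deeper. Here’s a rough illustration – not Fujitsu’s implementation – of deriving process variants from an event log; the CSV layout and column names are my own assumptions:

```python
# A rough sketch of variant discovery from an event log -- not Fujitsu's
# implementation. Assumes a hypothetical CSV export with case_id,
# activity and timestamp columns, one row per event.
from collections import Counter

import pandas as pd

log = pd.read_csv("event_log.csv", parse_dates=["timestamp"])

# Order events within each case, then reduce each case to its sequence
# of activities: one "variant" (unique process flow) per distinct sequence.
log = log.sort_values(["case_id", "timestamp"])
variants = log.groupby("case_id")["activity"].apply(tuple)

# Count how many cases follow each variant; a well-controlled process
# shows a few dominant flows, an out-of-control one shows thousands.
variant_counts = Counter(variants)
print(f"{len(variant_counts)} unique flows across {len(variants)} cases")
for flow, count in variant_counts.most_common(5):
    print(f"{count:6d}  {' -> '.join(flow)}")
```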

ESI wanted to reduce inventory and improve manufacturing cycle time, and needed to understand their opportunity-to-order process better in order to do that. They used APD to determine the actual process flows based on about 15 months of data from SAP and other systems, then validated those flows with the teams who actually work with them. They wanted to look at variations based on business unit and other factors to figure out what was causing some of their cycle time and inventory problems.

They assumed a relatively simple four-step process of opportunity-quote-order-shipment, possibly with 3-4 additional steps to allow for revisions along the way; what they actually found when they looked at about 11,500 process instances is that they had over 1,300 unique process flows. Yikes. Some of this was cycling through steps such as order change: you would expect an order to be changed, but not 120 times, as they found in some of their instances. There were also loopbacks from order to quote, each of these representing wasted employee time and increased cycle time. They found that one task took an average of 58 days to complete, with a standard deviation of 68 days – again, a sign of a process out of control. They realize that they’re never going to get it down to 25 unique process flows, but they are aiming for something far lower than 1,300.
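
Statistics like that 58-day average with a 68-day standard deviation fall straight out of the same event log. A minimal sketch, using the same assumed column names as above, measures the time spent at each step as the gap between consecutive events in a case:

```python
# Per-step duration analysis: a standard deviation larger than the mean
# is the classic sign of a step that's out of control. Column names are
# my own assumptions, not APD's schema.
import pandas as pd

log = pd.read_csv("event_log.csv", parse_dates=["timestamp"])
log = log.sort_values(["case_id", "timestamp"])

# Time spent at each step = gap to the next event in the same case.
gap = log.groupby("case_id")["timestamp"].shift(-1) - log["timestamp"]
log["duration_days"] = gap.dt.total_seconds() / 86400

stats = log.groupby("activity")["duration_days"].agg(["mean", "std", "count"])
print(stats[stats["std"] > stats["mean"]])  # flag the high-variance steps
```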

They did a lot of data slicing and analysis: by product, by region, by sales manager and many other factors. APD allows for that sort of analysis pretty easily (from what I saw last year), much like any sort of dimensional modeling that you would do in a data warehouse.
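
That kind of slicing is essentially dimensional analysis over case-level attributes. A minimal sketch of the idea (the region column and CSV layout are my own assumptions, not APD’s schema):

```python
# Dimensional slicing of cycle time, in the spirit of APD's analysis;
# the region column and file layout are hypothetical.
import pandas as pd

log = pd.read_csv("event_log.csv", parse_dates=["timestamp"])

# Roll the event log up to one row per case: its dimension values and
# its end-to-end cycle time.
cases = log.groupby("case_id").agg(
    region=("region", "first"),
    cycle_days=("timestamp", lambda t: (t.max() - t.min()).days),
)

# Compare any dimension against cycle time to find the problem slices.
print(cases.groupby("region")["cycle_days"].describe())
```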

They observed that fewer than 20% of their opportunities followed the happy path; the rest took too long, duplicated effort, looped back for rework too often, and sometimes didn’t even ship after a great deal of up-front work.

In their process improvement phase, they established 22 projects covering a number of improvements, such as automating processes to reduce repeated steps, improving entry flow to reduce time intervals, and requiring the entry of initial data early in the process to reduce loopbacks and rework. Since their business runs on SAP, a lot of this was implemented there (which raises the question of who did such a crappy SAP implementation for them in the first place that they had problems like this – seriously, insufficient required data entry at the start of a process?), and they’re able to keep extracting and analyzing the logs from there to see what level of improvement they’re achieving.

After a much too short presentation by ESI, Ivar Alexander from Fujitsu gave us a demo of APD with ESI’s basic process; I’ve seen a demo before, but it’s still fascinating to see how the system correlates data and extracts the process flows, then performs detailed dimensional analysis on the data. All of this is done without a lot of interviews of knowledge workers, so it’s non-invasive from both a people and a systems standpoint.

It’s important to recognize that since APD is using the system logs to generate the process flows, only process steps that have some sort of system touch-point will be recorded: purely manual process steps will not. Ultimately, although they can make big improvements to their SAP-based processes based on the analysis through APD, they will probably need to combine this with some manual analysis of off-system process steps in order to fully optimize their operations.

Dynamic BPM versus agility #GartnerBPM

Jim Sinur led a session this morning on dynamic BPM and how to deal with the demands for change. He started with the statement that dynamic BPM is more than just another type of BPM technology – it’s a requirement for transformational advantage – and took a look at how BPM will become more dynamic in the future.

Change is driven by unexpected exceptions in processes, and patterns of these unexpected events can indicate trends in your business environment that the processes need to accommodate. Typical change cycles in IT, however, tend to be slow and steady, which doesn’t at all match either the business dynamics or the external forces that shape them. Being able to handle these spiky demands drives the requirement for more dynamism in how processes and rules are managed, and drives the requirement for the business to be able to manage these directly rather than having to engage IT for all changes.

Gartner’s definition of dynamic BPM is the ability to support process change by any role, at any time, with very low latency. Change agents include everyone from customers and business people through business and process analysts, and on to architects and developers; if the people at the business end of this spectrum aren’t allowed to make process changes, then they’ll just work around it and invent their own processes using their own tools. This isn’t just about each individual’s personal preferences for how they work, however: if knowledge workers can make changes to their processes, they will tend to make them more efficient and effective, which has enterprise benefits.

A significant part of this is the inclusion of explicit rules within processes, so that scenario-driven rule sets can detect and respond to conditions without the process participants having to make those changes themselves: the basis of what James Taylor was saying in his presentation this morning. What used to be monolithic lumps of code can be split into several parts, each of which has the potential to be agile: user interface is managed by portals and the web; decision points are handled by rules engines; paths of execution are managed by a BPMS; and data definitions are handled in databases or XML data representations. All of those parts used to be under the control of developers, but turning it inside out and using more agile technologies allows people to customize their UI, change their rules on a daily basis, modify their processes, and define their own data structures. Dynamic BPM isn’t just about changing process models: it spans requirements, recompilation, data binding, loading and versioning.
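
To make that split concrete, here’s a minimal sketch of a decision point pulled out of the process code and into an externally managed rule set; the rule format, names and thresholds are all hypothetical:

```python
# Sketch of an externalized decision point: the process step invokes a
# rule set rather than owning the logic, so the business can change the
# thresholds without a code release. Rules and values are hypothetical.

# In practice these would live in a rules engine or database, not in code.
approval_rules = [
    {"min_order_value": 100_000, "approver": "vp_sales"},
    {"min_order_value": 10_000, "approver": "sales_manager"},
    {"min_order_value": 0, "approver": "auto_approve"},
]

def route_approval(order_value: float) -> str:
    """The process step asks the rule set; it holds no business logic."""
    for rule in sorted(approval_rules,
                       key=lambda r: r["min_order_value"], reverse=True):
        if order_value >= rule["min_order_value"]:
            return rule["approver"]
    return "auto_approve"

print(route_approval(25_000))  # -> sales_manager
```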

There was quite a bit about services composition environments and CEP that I felt didn’t really belong in a presentation on dynamic BPM: yes, you need to have services and CEP in order to build agile processes in the first place, but it seems like filler.

There was one brief slide on “Web 2.0” – really just a quick laundry list of enterprise social software aspects that could impact BPM, including collaborative process design and execution – but no meat. Sinur merely read the list and pointed out that there are vendors at the showcase showing some of these capabilities. That was a bit of a disappointment, considering that the term “dynamic BPM” is being used by many (including Forrester and several vendors) to describe collaborative processes that are created or modified at runtime by the user.

He finished up with some sensible advice about separating rules and other application components from the processes in order to move towards more agile processes, although it’s no different from the message that we’ve been hearing for quite a while now.

This wasn’t a new presentation: it was mostly recycled material that I had seen in previous Gartner presentations (either at conferences or on webinars) about agile BPM using rules, services and complex event processing. There’s been some new verbiage put around it and a few new slides, but only the briefest nod to the type of user-created ad hoc collaborative processes that represent the most dynamic form of BPM.

Using BPM to survive, thrive and capitalize #GartnerBPM

Last session of the day: a panel with Jim Sinur, Elise Olding and Michele Cantara on using BPM to survive, thrive and capitalize in a turbulent economy. I realize that this session has the same title as a webinar that Cantara and Janelle Hill did a while back, and there’s a lot of repeat material from that, so I won’t bother to recapture it here. There’s a link to the webinar replay in my earlier post about that webinar, and I recommend checking it out if you weren’t here in Orlando today.

Off to the vendor showcase; that’s it for day 1 of the Gartner BPM summit.

Hidden costs of unstructured processes #GartnerBPM

Elise Olding and Carol Rozwell kicked off the afternoon with a session on the hidden costs of unstructured processes: although a lot of focus of BPM efforts (time and money) is on structured processes, as much as 60% of an organization’s processes are unstructured – and probably also unmonitored, unmanaged, unknown and unruly.

Gartner defines unstructured processes as “work activities that are complex, nonroutine processes, predominantly executed by an individual or group highly dependent on the interpretation and judgment of the humans doing the work for their successful completion”, and notes that most business processes are made up of both structured and unstructured parts. Unstructured processes cost organizations a lot of money in lost productivity, lack of compliance and other factors, and you can’t afford to ignore them. Although most processes aimed at meeting regulatory requirements are structured, unstructured processes provide a company’s unique identity and often its competitive differentiation, as well as supporting operational activities.

In order to start managing unstructured processes, you need to get some visibility into them; start by understanding the critical path through the process. This can be a bit tricky, since as you start to map out your unstructured processes, there will be some points at which the process participant just has to wing it and make their own decisions. These are, after all, knowledge workers, and it’s not possible (or desirable) to map every possible process permutation. Instead, map the structured portions of the process, then the points at which it becomes unstructured, but don’t try to overengineer what happens in the unstructured parts. The unstructured parts can be modeled by the notification mechanism (how someone is notified that a piece of work requires attention), the information provided to the participant to allow them to complete the unstructured work, and how the outcome is recorded.
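
A simple data structure makes the point that only the edges of the unstructured work get modeled; the field names here are my own illustration, not Gartner’s:

```python
# A minimal model of an unstructured process segment: capture how work
# arrives, what the knowledge worker is given, and how the outcome is
# recorded -- and deliberately nothing about the work in between.
from dataclasses import dataclass, field

@dataclass
class UnstructuredSegment:
    notification: str              # how the participant learns work is waiting
    inputs: list[str]              # information provided to do the work
    outcome: str | None = None     # the only record of what was decided
    notes: list[str] = field(default_factory=list)

    def record_outcome(self, outcome: str, note: str = "") -> None:
        self.outcome = outcome
        if note:
            self.notes.append(note)  # optional narrative, not a flow model

seg = UnstructuredSegment(
    notification="email alert to the account manager",
    inputs=["customer history", "draft quote"],
)
seg.record_outcome("quote revised", "negotiated terms by phone")
```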

They presented a number of analysis techniques for getting to the heart of unstructured and folklore processes:

  • Observe work being done, and challenge tasks that don’t make sense. Keep asking “why”.
  • Use storytelling (“tell me what happens when…”) to uncover decision-making logic, methods and best practices: these types of narratives are not well-captured in standard process documentation.
  • Analyze the unstructured interactions between people (e.g., customers and CSRs) and extract the themes and patterns. Rozwell wrote a report, “Business Narratives Supplement Traditional Data Analysis”, that discusses one technique for doing this, although it wasn’t quite clear from the discussion exactly what that technique involves.
  • Get clarity around roles and who is the decision-maker in any given process.

There are a variety of different areas of knowledge that you need to consider when analyzing unstructured processes, from identifying what metadata is used for collaboration, to looking at alternative analysis techniques such as mind mapping and social network analysis. Understanding collaborative technologies is also key, since unstructured processes are often collaborative in nature, and make use of the participants’ social graphs.

Their final recommendations are to keep an eye on the technologies that can support unstructured processes, but not to go overboard on monitoring and managing these processes.

Navigating the BPM Wonderland #GartnerBPM

Alan Trefler of Pegasystems gave his traditional lunch address – entertaining as always, starting with a “White Rabbit” audio clip – with an Alice in Wonderland theme of how we have to chase our business goals down whatever rabbit hole they disappear into. Continuing the theme, he contrasted the “one pill makes you larger” end of the spectrum, monolithic applications, with the “one pill makes you small” end, point solutions, and how you need to look for something in the middle. Don’t be afraid to ask for advice (even from hookah-smoking caterpillars), watch out for those delusional Mad Hatter software salespeople, and be sure to meet the needs of the Red Queen boss lady so that you don’t get your head chopped off in the process.

Trefler is a former chess champion, so it’s inevitable that he introduced an Alice-themed chess analogy when examining his recommended steps for implementing BPM:

  • Directly capture objectives, so that your BPM implementation is focused on business intents and goals.
  • Automate the programming: the computer can write code much better than human beings, which is much less expensive in the long run even if off-shoring development appears to make it cheaper up front. In other words, use a system that allows for model-driven development and zero-code (or near-zero-code) deployment.
  • Automate the work wherever steps can be automated.

I love the term that he introduced: “heritage systems”, which are just legacy systems that we like a little bit better, probably because we’ve wrapped them to allow them to be more easily integrated with other systems and processes.

Deciding on process modeling tools #GartnerBPM

Bill Rosser presented a decision framework for identifying when to use BPA (business process analysis), EA (enterprise architecture) and BPM modeling tools for modeling processes: all of them can model processes, but which should be used when?

It’s first necessary to understand why you’re modeling your processes, and the requirements for the model: these could be related to quality, project validation or process implementation, part of a larger enterprise architecture modeling effort, or many other reasons. In the land of BPM, we tend to focus on modeling for process implementation because of the heavy focus on model-driven development in BPMS, and hence model within the BPMS; but many organizations have other process modeling needs that are not directly related to execution in a BPMS. Much of this goes back to EA modeling, where several levels of process modeling occur in order to fulfill a number of different requirements: they’re all typically in one column of the EA framework (column 2 in Zachman, hence the name of this blog), but stretch across multiple rows of the framework, such as conceptual, logical and implementation.

Different types and levels of process models are used for different purposes, and different tools may be used to create those models. He showed a very high-level business anchor model that shows business context, a conceptual process topology model, a logical process model showing tasks within swimlanes, and a process implementation model that looked very similar to the conceptual model but included more implementation details.

As I’ve said before, introspection breeds change, and Rosser pointed out that the act of process modeling reaps large benefits in process improvement since the process managers and participants can now see and understand the entire process (probably for the first time), and identify problem areas. This premise is what’s behind many process modeling initiatives within organizations: they don’t plan to build executable processes in a BPMS, but model their processes in order to understand and improve the manual processes.

Process modeling tools can come in a number of different guises: BPA tools, which are about process analysis; EA tools, which are about processes in the larger architectural context; BPM tools, which are about process execution; and process discovery tools, which are about process mining. They all model processes, but they provide very different functionality around that process model, and are used for different purposes. The key problem is that there’s a lot of overlap between BPA, EA and BPM process modeling tools, making it more difficult to pick the right kind of tool for the job. EA tools often have the widest scope of modeling and analysis capabilities, but don’t do execution and tend to be more complex to use.

He finished by matching up process modeling tools with BPM maturity levels:

  • Level 1, acknowledging operational inefficiencies: simple process drawing tools, such as Visio
  • Level 2, process aware: BPA, EA and process discovery tools for consistent process analysis and definition of process measurement
  • Levels 3 and 4, process control and automation: BPMS and BAM/BI tools for execution, control, monitoring and analysis of processes
  • Levels 5 and 6, agile business structure: simulation and integrated value analysis tools for closed-loop connectivity of process outcomes to operational and strategic outcomes

He advocates using the simplest tools possible at first, creating some models and learning from the experience, then evaluating more advanced tools that cover more of the enterprise’s process modeling requirements. He also points out that you don’t have to wait until you’re at maturity level 3 to start using a BPMS; you just don’t have to use all the functionality up front.

Patterns for Business Process Implementations #GartnerBPM

Benoit Lheureux from Gartner’s Infrastructure and Architecture group gave a presentation on process implementation patterns. I think that he sees BPM as just part of SOA, and presents it as such, but I’m willing to give him a pass on that.

He discussed five styles of flow management in SOA:

  1. Microflows: fine-grained services implemented via flows amongst software components. This is a process from a software development standpoint, not a business-level process: probably 3GL code snippets assembled into what we old-timers might refer to as a “subroutine”. 🙂
  2. Service composition: coarse-grained services implemented by assembling fine-grained flows (microflows). This may be done with a BPMS tool, but is low-level service composition rather than business processes.
  3. Straight-through process: automating business processes involving multiple services across systems, but without human intervention.
  4. Workflow: pretty much the same as STP, but with human intervention at points in the process.
  5. Semi-structured processes: a combination of structured processes with unstructured activities or collaboration.

He has some good strategic planning assumptions based on these flow styles, such as that 75% of companies will use at least three different products to implement at least three different styles of flows. His primary focus, however, is on B2B: how internal processes connect to multi-enterprise processes, with the ultimate goal of shared process execution across enterprises. This led to the four B2B flow management styles:

  1. Blind document/transaction exchange: loosely-coupled, with each partner managing their own internal processes, and no visibility outside their own processes.
  2. Intelligent document/transaction exchange: visibility across the shared process to provide a shared version of the truth, such as a BAM dashboard that provides an end-to-end view of an order-to-cash process across enterprises. Although this isn’t that popular yet, it is providing significant benefits for companies that are implementing it, and Lheureux estimates that 50% of B2B relationships will include this by 2013.
  3. Multi-enterprise applications: shared execution of a process that spans the enterprises, such as vendor-managed inventory. This may be hosted by one of the partners, or may be hosted by a third-party service provider.
  4. Multi-enterprise BPMS and rules: centralized processes and rules, such as shared compliance management on a shared process. By 2013, he predicts that at least 40% of new multi-enterprise integration projects will leverage BPMS technology.

He showed a chart that I’ve seen at earlier conferences on identifying process characteristics: classifying your processes as case management, form-driven workflow, content collaboration, multiparty transactional workflow, participant-driven workflow, or optimization of network relationships, based on the unit of work, process duration, degree of expertise required, exception rate, and the critical milestones that progress work. Then, consider using a BPMS rather than code when a process has specific characteristics such as complexity and changeability.

The final recommendations: don’t try to use the same tool to handle every type of process implementation, but be aware of which ones can be best handled by a BPMS (and by different types of BPMS) and which are best handled in code.

BPM in Times of Rapid Change #GartnerBPM

For the next couple of days, I’m at the Gartner BPM Summit in Orlando. Jim Sinur and Janelle Hill gave the opening keynote this morning on BPM in times of rapid change, starting with a view of the global economy: basically, it’s down this year, although not as bad as expected, and the leading economic indicators are starting to trend up.

Gartner did a survey of CEOs in late 2008, and found that their top priority is shifting back from cutting operating costs to increasing revenues, although only by a slim margin. The resulting message: the time to return to business growth is now, and leveraging BPM to assist growth can provide a first-mover advantage if the economy does trend up in 2010 as predicted. BPM still provides assistance in restructuring operations (including mergers and acquisitions) and cutting costs that goes along with a down economy, so you might as well leverage what you’re already using to cut costs, and start looking forward and repositioning for growth. In many cases (in my experience), improving business processes using BPM has the impact of reducing costs of the specific processes, which can either translate to reduced operational costs through reduced headcount, or increased revenues due to the increased capacity of the process to handle new business: these are just two sides of the same process improvement coin.

Going into 2010, most large enterprises have already completed their cutbacks – reduced headcounts, reduced infrastructure, renegotiated contracts and elimination of redundant technologies – but their budgets are going to be pretty flat. If you already have a BPMS in your organization, then this might mean some incremental expansion, but if you don’t, you need to look at how to justify the technology acquisition. Fortunately, that’s getting easier as the capabilities of the BPMS products expand: consider the value of process modeling (reduced redundancy and better use of people in the process) as well as process and application orchestration (automating the linkages between many existing applications) and composite application development environments (bringing together many applications into a single user view).

Focus on improving processes that defend revenue and cash without impacting customer experience, such as order-to-cash, sales processes, and customer service. Depending on your industry, this could also be the time to take some risks in order to gain that first-mover advantage: reconsider institutionalized behaviors and what you might think of as best practices, and see if there’s an innovative way to improve processes that provide a competitive edge. There should be no processes that are immune to change: challenge the status quo. I see this all the time with how companies are embracing social media in addressing customer relationships: the ones that are successful at it are those that throw away all the old ideas about how companies communicate and interact with their customers. These customer-facing processes are no longer about executing transactions, they’re about coordinating social interactions and developing social relationships.

The hot button these days is unstructured processes (which I’m sure that we’ll hear a lot more about this week), and how some new BPMS functionality allows for dynamic collaboration instead of, or within the context of, a structured process. This provides methods for gaining visibility into processes that might exist now only in email or other ad hoc methods, and likely aren’t managed well in their current state.

It’s not good enough, however, to use old-style BPMS/workflow products: you need to be considering products that have model-driven development, composite application development, process discovery and optimization, and customized dashboards for different roles and personas within a process. Otherwise, you’ll just be stuck back in the same old waterfall development methodology, and won’t achieve a lot of benefit from BPM. Interestingly, Sinur and Hill highlighted three specific products to show examples of what they consider BPMS innovation: Vitria’s composite application development, Pallas Athena’s process discovery and simulation, and Global 360’s persona-based user interfaces.

In the recession of the 1980s, business process reengineering was a high-profile, strategic activity with top executives involved; as the recession eased, the executives’ interest in BPR waned. The same cycle will repeat now: executives are very interested in process improvement and BPM right now, but that’s not going to last when the economy starts to recover, so you may want to take advantage of their interest and get something going.

You can track the Twitter backchannel for the Gartner BPM summit here.

Skelta BPM.NET

A while back, I had an email from Phil Larson, who I have known since he was at Appian; he spent the summer in India on an MBA internship. One thing led to another: he connected me up with Skelta, and I fostered India-Canada relations by getting up early for an online demo with Sanjay Shah and Arvind Agarwal of Skelta. They’ve published a corporate presentation if you want to take a look.

[Screenshot: application with BPM embedded]

They started by creating OEM workflow components that were embedded in other products, then built that out into a full-blown BPM suite, BPM.NET, while retaining a focus on componentized, embeddable pieces. They have significant penetration into the Indian business process outsourcing (BPO) market, both as the BPOs’ product offerings and for their own internal processes. Because of the OEM nature of their product, they also end up embedded in SaaS BPM implementations, although white-labeled, so you may not know that they’re there. This is much like the Fujitsu model – create BPM primarily for the OEM market, then launch as a direct BPMS product – and Skelta has leveraged this into business that includes OEM and full product sales as well as multi-tenanted hosted BPM. Even their browser-based process modeler can be embedded as a component in another application, not just the run-time UI components.

[Screenshot: SharePoint activities built in]

As you might guess from the product name, they have a strong Microsoft bias: there is significant integration with SharePoint and other Microsoft products to capture and act on events generated from those systems, plus adapters for SAP, PeopleSoft and Microsoft Dynamics. The number of integration services that they provide is quite extensive, and is likely what has made their product attractive to BPOs as a base upon which to build applications. These are available directly from the process modeler: there is a built-in palette of SharePoint activities, as well as BizTalk activities and other integration activities.

The process modeler includes the ability to set up data points that will be used as KPIs in reporting. Queue filtering and prioritization can be based on multiple factors so that process participants see only the work that they should be able to access, served to them in the correct order. Process models are consumed directly by the process engine without translation.

[Screenshot: personal work list]

They include an AJAX forms designer for creating task user interfaces, including scripting to control contextual behavior: the view on the form (and therefore the visible and editable fields) can change depending on which step the process is at. The main processing paradigm has a user requesting the next item at a particular process step from a shared queue, which moves it to their personal work list for working in that AJAX form; escalation can be based on the time that a work item spends in a shared queue before selection, or in a user’s work list. The user’s view can have monitoring graphs built in, since these are all components that can be assembled into a web application. The user can view the process map for the current instance, including a history of the process to date.
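
The queue mechanics they described are worth a sketch. This is a generic illustration of shared-queue filtering, prioritization and time-based escalation, not Skelta’s actual API; all the field names are invented:

```python
# Generic sketch of a shared-queue model: filter to what a user may see,
# serve items in priority order, and flag items that have waited too
# long. Not Skelta's API; fields and rules are invented.
from datetime import datetime, timedelta

def next_item(queue: list[dict], user_roles: set[str]) -> dict | None:
    visible = [w for w in queue if w["role"] in user_roles]   # filtering
    # Prioritization on multiple factors: priority first, then age.
    visible.sort(key=lambda w: (-w["priority"], w["queued_at"]))
    return visible[0] if visible else None

def overdue(queue: list[dict], max_wait: timedelta) -> list[dict]:
    # Escalation based on time spent in the shared queue before selection.
    now = datetime.now()
    return [w for w in queue if now - w["queued_at"] > max_wait]

queue = [
    {"id": 1, "role": "underwriter", "priority": 2,
     "queued_at": datetime.now() - timedelta(hours=30)},
    {"id": 2, "role": "underwriter", "priority": 1,
     "queued_at": datetime.now() - timedelta(hours=2)},
]
print(next_item(queue, {"underwriter"})["id"])                 # -> 1
print([w["id"] for w in overdue(queue, timedelta(hours=24))])  # -> [1]
```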

There is no full rules engine in the product: expressions and rules can be built into the forms and process definitions, or rules services can be integrated by calling web services or BizTalk rules, or by writing the rules in .NET.

There’s a big focus on components used to monitor processes and their SLAs: this is critical for the BPO market, since BPO compensation is typically based on meeting SLAs, often with penalty clauses for missing them, so they need to be monitored closely. There are other BPO needs that Skelta is seeking to address: the ability to embed white-labeled BPM within other applications; multi-tenant software-as-a-service infrastructure; and, for the Indian BPO marketplace, the fact that Microsoft infrastructure is cheaper to build and maintain in India than a comparable Java infrastructure. In some ways, BPOs have needs similar to those of large enterprises, such as quickly-changing user requirements that can vary widely across the user base, and the need to simplify training and roll-out of the system.

HandySoft BizFlow BPM

I caught up with Garth Knudson from HandySoft a few weeks ago; I’ve looked at their BizFlow product previously, and they’re currently at version 11.3, so it has a pretty long track record. Although HandySoft handles the same sort of structured processes as most other BPMS vendors, they really focus on ad hoc and dynamic (unstructured) processes, where a user either needs to jump out of an existing process definition at a particular step into an unstructured flow and bring the results back to the structured process, or to create a new dynamically-defined process. Some processes just can’t be modeled in advance, due to non-standard procedures, changing roles and responsibilities, or process participants and actions that depend on the initiating user request: this is more like managing a project than a traditional process, but with BPM capabilities and structure applied to it rather than trying to manage it all in email. These types of dynamic processes can form a huge portion of an organization’s processes: think of all the ad hoc processes that you have now in email, but with no control or monitoring. Some significant research efforts are underway on dealing with dynamic processes, as I saw at the academic conference in Ulm two weeks ago; Gartner and Forrester are all over this area as well, so I expect that we’ll see advances from many vendors in the next few years.

The structured parts of the process are managed by BizFlow BPM, whereas the unstructured workflow portions, whether spawned from a structured process or initiated directly, are managed by the OfficeEngine front-end application; in both cases, the process engine is BizFlow. Although you use an email-like interface to kick things off, and email is used as a transport for external recipients, this provides tracking of ad hoc processes that’s just not possible in email.

[Screenshot: specifying ad hoc task detail]

To start a completely ad hoc process, you create a task, specify properties such as instructions and deadlines, and attach any documents required, or link to documents in an ECM repository using a URL in the rich comments field on the launch form. You use Active Directory/LDAP or type in external email addresses to select participants, specify whether the participants can reassign the task further, and submit the task; the task is then available for monitoring, and you can see who has done what in a graphical view. Process participants receive tasks as calendar invitations, then click through to log in to BizFlow and work on the task assigned to them, which may include adding other people to the collaboration. The web-based user interface includes a list of ad hoc tasks in which you are participating, a work list for your activities within structured processes, a launch pad for initiating new tasks or processes, and a graphical view of your SLA scorecard. From there, you can click through to the task monitor for ad hoc tasks that you have created, and see the state of each participant.

[Screenshot: task monitoring]

Since external participants can’t access BizFlow directly, they do their work outside the system and reply; an internal user then acts as a proxy, entering the response manually. This sort of one-step collaborative process – including multiple participants and reassignments – can replace the current practice of emailing multiple people for information or comments, then manually tracking who has responded. In an environment dominated by ad hoc processes in email, this provides a big benefit for tracking who is doing what, and when.

It’s fairly similar for launching an ad hoc task from a structured process: the structured process is modeled (using BPMN) in a similar fashion to other BPMS tools, and launched using a web form. From the participant’s UI at any step, however, you have an “Assign a task” tab that pops up the same form as was used for the purely ad hoc tasks; essentially, this allows delegating the structured process activity to the collaborative task, which can then include people who were not originally involved in the structured process. It doesn’t change the structured process; it just pops out to a collaborative task at this point, and when that completes, it returns to this step in the structured process and continues on. Just as with the standalone ad hoc tasks, this reduces the amount of unmonitored email activity that is prevalent in many structured processes where someone needs to request more information at a step.
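
The pop-out-and-return pattern is easy to sketch: the structured step blocks on a tracked collaborative task, then continues when it completes. A rough illustration, with my own names rather than BizFlow’s API:

```python
# Rough sketch of the pop-out-and-return pattern: a structured step
# delegates to a monitored ad hoc task and resumes when every
# participant has responded. Names are mine, not BizFlow's API.
from dataclasses import dataclass, field

@dataclass
class AdHocTask:
    instructions: str
    participants: list[str]
    responses: dict[str, str] = field(default_factory=dict)

    def respond(self, who: str, answer: str) -> None:
        self.responses[who] = answer          # every reply is audited

    @property
    def complete(self) -> bool:
        return set(self.responses) >= set(self.participants)

def structured_step() -> str:
    # The structured process is unchanged: it pops out to a collaborative
    # task here, which can pull in people outside the original process.
    task = AdHocTask("Confirm pricing exception",
                     participants=["alice", "finance"])
    task.respond("alice", "approved")          # participants reply...
    task.respond("finance", "approved")
    assert task.complete                       # ...then the step resumes
    return "continue structured process"

print(structured_step())
```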

In many BPM implementations, there is an attempt to capture all possible exceptions and collaborations as part of the structured process, but in reality, this just isn’t possible; they end up in email, phone calls and other untraceable activities. As Clay Richardson of Forrester pointed out in his vendor snapshot on HandySoft:

Traditional BPM platforms perpetuate the myth of neatly structured processes – with most vendors providing ample support for capturing reoccurring and well-defined workflows, but minimal support for managing unstructured and dynamic business processes. This chasm between the worlds of structured and unstructured processes forces teams to develop custom workarounds to handle ad hoc routing and collaborative interactions, ultimately increasing the time and cost to deliver BPM solutions.

[Screenshot: detailed stats of an SLA violation]

Allowing an ad hoc, yet monitored, task to be launched from any point in a structured process reduces the complexity of the structured process without sacrificing monitoring and auditability. In BizFlow, launching an ad hoc task from a structured process causes an indicator to appear on the graphical view of the executing process to show that a task has been launched from that point, and the complete audit trail of structured and unstructured processes is maintained. If the ad hoc task isn’t completed within the specified deadline, that SLA violation shows on the structured process monitoring, and you can click through to the OfficeEngine interface for detailed monitoring of the task.

On their product release agenda for later this year are a reporting services module – there’s already fairly capable BAM functionality – and full rich internet application development capabilities to create more usable web forms for the user interface.