Beyond BPM: Virtusa’s Experience With Process-Led Transformation

I was late arriving at the breakout session by Virtusa’s Stuart Chandler this afternoon – as Ron Locklin tweeted, we’re over here in the outer Siberia of the conference center, and it was a long trip – so I arrived as Chandler was discussing their experiences with customers who are undergoing business transformation powered by BPM and RPM.

Virtusa’s experience with process-led transformation among their customers has manifested in a number of ways:

  • Upsell/cross-sell opportunities to expand into new markets and products
  • Cost reduction and quality improvement
  • A collaborative work environment that aligns cross-functional activities and strengthens linkages along the supply chain
  • Incorporation of the customer into the process to improve service levels
  • Greater transparency and visibility, particularly into real-time activity
  • Business ownership of processes, where the business makes its own process changes rather than relying on IT
  • Case management and a 360-degree view of the customer

Not surprisingly, they’ve seen a number of gaps and emerging needs driven by these transformation efforts, both on the project teams and in the business areas being transformed. Several of these I saw highlighted in research presented at BPM 2011 in Clermont-Ferrand a few weeks ago: process discovery (automated and manual), multi-platform integration, the impact of events on processes, and dynamic runtime capabilities. There are also issues that arise in any technology-heavy business transformation, not specific to BPM, but change management problems are exacerbated by the rapidly-changing environments that we create with BPM and related technologies.

They are seeing the focus of BPM shifting from reactive to proactive through event handling, interactive process management, sense and respond processes, and fast visualization and definition of new situations and contexts. This, of course, moves far beyond just BPM to more of the RPM portfolio that Progress is promoting, since traditional BPM platforms can’t easily handle the dynamic nature of real business processes, and are unable to handle and provide visibility into exceptions and dynamic processes, especially those that span multiple organizations.

Progress isn’t the only BPM vendor in Virtusa’s portfolio, but Chandler pointed out how RPM – which integrates BPM, BEP and RTA – is a more effective transformation tool because of its treatment of transactions, events and processes, and because of Control Tower as a common interface for not just monitoring, but effecting change in the underlying components. However, some changes need to be made within organizations in order to properly adopt a transformational RPM implementation, including gaining platform/architecture familiarity, finding the right methodology, and making the necessary cultural shifts.

Although he gave some lip service to Progress RPM, I have the sense that this is a presentation that he gives at a variety of their BPM partner conferences with just a few slides switched out. He talked about the Virtusa BPM implementation methodology – a waterfall/Agile hybrid – and a case study that probably won the prize for most words ever crammed onto a single PowerPoint slide. Some good information, but like most partner presentations that I see at vendor conferences, a bit too self-serving in parts to be completely credible.

OpenEdge BPM: Modifying an OE Application To Integrate With BPM

I sat in on a breakout session today at the Progress Revolution conference on OpenEdge BPM and migrating existing OpenEdge applications to work with (Savvion) BPM. There are some new ways of doing this coming in OE 11 that weren’t covered in this session, but I’ve had a few conversations with people since my blog post yesterday and expect to have a more in-depth briefing on this soon.

Traditionally, in OE development as in many other application development environments, the business process and logic are built directly into the application, intertwined with the user interface and other components. This pre-SOA app dev methodology is pretty old-school, but I think that’s what we’re dealing with in considering a lot of the existing OE apps that have been in production for years: not only were they designed before multi-tiered architecture was generally accepted practice, in many cases they may have been designed by less-technical business people who weren’t aware of these types of coding standards.

Now that Savvion BPM is integrated with OE, the plan is that business processes will be made explicit in the BPM layer, and eventually much of the user interface will also move to the BPM forms layer, while the more traditional OE code will provide the services layer that is consumed by the process. This creates a more flexible and open service-oriented architecture, allowing the BPM processes to consume both the OE services (currently via web services, but presumably with tighter couplings in the future) and services from other systems in an enterprise’s environment in order to orchestrate multiple applications.
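
To make that target architecture concrete, here’s a minimal sketch – Python with the zeep SOAP library standing in for the BPM engine’s web service call – of what consuming an OE function exposed as a web service might look like; the endpoint URL and operation name are hypothetical, for illustration only:

```python
# A minimal sketch of the target architecture: an orchestration layer (here,
# plain Python standing in for the BPM engine) consuming an OpenEdge function
# that has been exposed as a SOAP web service. The WSDL URL and operation
# name are hypothetical.
from zeep import Client

# The OE AppServer function, exposed as a web service, looks like any other
# SOAP endpoint to the consumer.
client = Client("http://oe-appserver.example.com/wsa/OrderService?wsdl")

# Invoke a hypothetical order-lookup function in the OE services layer.
order = client.service.GetOrderDetails(OrderNum=1234)
print(order)
```

Once the OE logic is service-enabled like this, the BPM layer (or anything else) can orchestrate it without knowing anything about the underlying ABL code.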

If you were starting fresh with a new app, that would not be a significant problem: build your processes and UIs in BPM, build your services in OE, and you’re done. The problem, however, is the huge body of existing OE applications that need to start migrating in this direction. This problem exists far beyond the OpenEdge world, of course: those with legacy code and monolithic applications of any sort are having to deal with this as they struggle to become more service-oriented, and to integrate BPM as an overarching orchestration framework.

Brian Bowman, a Progress senior systems consultant, led this session and gave a demo of creating a process in BPM – all the while explaining some BPM basics to what I assumed was a room full of OE developers. Like a huge portion of the business/IT world, most OE customers and partners have no idea what BPM looks like or what it can do for them, meaning that Progress has a lot of education to do before anyone actually starts integrating BPM into their OE apps. A huge opportunity for Progress, undoubtedly, but also a huge challenge. I’m also struck by the idea that a lot of the Progress people, particularly the SCs who will be demoing this to customers and partners, need better general process modeling training, including more rigorous BPMN education, not just training on pushing the mouse around in the BPM tool.

Brian was joined by Sandy (I missed her last name), another SSC, who moved the demo from Brian’s “business analyst” role – creating the process in BPM – to a “developer” role in OE. She had a pre-built OE app with a button that instantiated a process and displayed the process ID; she showed the underlying OE code, which made a CreateInstance call followed by some updateDataSlot calls to update the parameters in the process instance with the OE database parameters. The rest of the integration happened on the BPM side, with the basic points of integration as follows (a rough sketch follows the list):

  • Create a process instance from an OE app, and populate the data fields. I don’t know OE code, but it appears that it uses a couple of new or repurposed functions (CreateInstance and updateDataSlot) to call BPM.
  • Call an OE function from a process step using a SOAP call. This requires that the function be exposed in OE as a web service, but BPM would not have to be changed in order to make the call, since that’s standard functionality in Savvion.
  • Update the OE database from a process step. This is based on the OE database connectivity functionality that has been added to BPM.
  • Embed a WebSpeed form page in a BPM UI form: basically, replacing a BPM UI form with an existing WebSpeed form to complete a BPM step. It is not possible to use an existing OE GUI form in this way, only a WebSpeed form, since its HTML can be embedded via a URL. This is done by embedding the search parameters directly in the URL that invokes the embedded WebSpeed form, which may be a security concern in some situations.

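As promised above, here’s a rough sketch of the first and last of these integration points – written as Python pseudocode rather than actual ABL, and with everything except the CreateInstance/updateDataSlot call names (which were shown in the demo) being hypothetical stand-ins:

```python
# Rough sketch of the OE-to-BPM integration pattern from the demo: create a
# process instance, then push OE data into its dataslots. The bpm_client
# module and server URL are hypothetical; CreateInstance and updateDataSlot
# mirror the call names shown in the demo.
from bpm_client import BPMSession  # hypothetical client wrapper

session = BPMSession("http://bpm-server.example.com")

# Create an instance of a deployed process template and capture its ID,
# as the demo app did when its button was clicked.
process_id = session.CreateInstance(template="OrderApproval")

# Copy the relevant OE database fields into the instance's dataslots.
session.updateDataSlot(process_id, "CustomerName", "Acme Corp")
session.updateDataSlot(process_id, "OrderTotal", 1542.50)

# The WebSpeed embedding pattern passes its search parameters on the URL
# that invokes the form -- convenient, but potentially a security concern,
# since the parameters are exposed in the query string.
webspeed_url = (
    "http://oe-web.example.com/scripts/order.w"
    f"?processId={process_id}&custNum=1234"
)
```
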
There’s definitely an issue for those using the OE GUI (again, I’m not that familiar with OE, but I assume this is a thick-client UI), since these forms can’t be used directly as BPM task UIs the way WebSpeed forms can, although you could add API calls to your OE GUI application to update the BPM process instance, effectively creating a shadow process in OE and BPM. Not best practice, I’m sure, but possibly a stop-gap measure for those migrating from the OE GUI to either WebSpeed forms or BPM UI forms.

OE and BPM have separate servers (as you would expect), and deployment is done independently, as far as I can tell. That makes sense, since the eventual goal is to have OE become more of the services layer consumed by BPM; there is no real need to have the deployment tightly coupled, although you do have to be concerned about dependencies, as you would with any SOA deployment.

Some of the questions at the end related to how OE functionality, such as roles and security, would be replicated in BPM; Progress definitely needs to do some work educating customers and partners on how the features that OE developers currently rely on will be handled on the BPM side, for functions such as UI and work management that are expected to move over to this new environment.

7 Reasons To Build Business Apps on BPM by @neilwd

I was late to Neil Ward-Dutton’s session due to another meeting, and ended up arriving just as he started on reason #7. However, in the summary, he did list all seven reasons why you would want to build business applications using BPM technology:

  1. Demonstrate value transparently
  2. Support iterative, collaborative changes
  3. Speed up user acceptance
  4. Improve management of customizations
  5. Enable transition to SaaS delivery
  6. Reach new stakeholders
  7. Support continuous improvement

He finished with the signals that you can use to identify the maximum opportunity for value from using BPM:

  • Strong service differentiation focus
  • Strong need for performance transparency
  • Dynamic regulation or policy environment
  • Need to coordinate work/information across or between organizations

I was sorry to miss most of the presentation, but apparently there is a paper that Neil wrote on the Progress website that goes through these points as well (although I don’t have the link).

RPM For Top and Bottom Line Improvement

Dr. Ketabchi, who was CEO of Savvion before the acquisition and is now a chief strategist at Progress, presented a breakout session on using RPM to enable enterprises to improve their top and bottom lines. BPM isn’t just about cutting costs and improving quality anymore (although those are still important, and expected); it’s also about increasing revenue by taking advantage of opportunities as they arise.

He gave another version of what I saw in Wilner’s presentation on the justification for BPM (explicit business processes leading to agility, visibility and better business understanding of processes) which really drives home that I’m not at a BPM vendor’s conference, I’m at an application development tool vendor’s conference where they are introducing this hot new technology called BPM. This is, of course, the stage that most of the business world is at with respect to BPM understanding; I’m just so used to being in the BPM echo chamber that I rarely hear these messages unless I’m delivering them to a client.

Dr. K pointed out that BPM isn’t enough to increase revenue, although it obviously pained him to say that; business event processing (BEP) and embedded real time analytics are also required. Revenue generating opportunities are always customer-facing and situational, based on time, location, occasion, connection, exceptions and/or actions. This requires understanding the customers’ situations, analyzing those situations, and delivering (or offering) services and products that they need immediately. Sensing and understanding the customers’ situations requires the processing and correlation of events from a variety of sources using BEP, extracting information from the context of those events, and triggering actions and events as a result. In part, it’s about recognizing patterns and exceptions.

He went on to discuss these functionalities in the context of the Progress RPM suite, along with some customer examples of using RPM to take advantage of revenue-generating opportunities as they arise, such as a mobile phone company pushing offers to their (opted-in) customers based on location. No real new information here, but it shows a realignment of the focus of RPM to be as much about improving the top line as the bottom line of the business.

Dr. K will be speaking at the Forrester BPM event here in Boston on Thursday, along with Progress customer Reliance Capital.

OpenEdge BPM Introduction with @KenWilner

Ken Wilner, Progress’ VP of Technology for the OpenEdge product, gave a breakout session on OpenEdge BPM, which integrates the former Savvion BPM platform into the OpenEdge application development environment to allow the business process to be made explicit – externalized from the application – in order to improve agility and visibility. It’s interesting to hear this message, which is no longer even a topic of conversation in mainstream BPM circles because it is so well-understood, presented to a group of OpenEdge application developers.

Does the integration of BPM just relegate OpenEdge to being the scripting/coding language subordinate to BPM? Maybe, but that’s not necessarily a bad thing. Instead of layering BPM on top of a monolithic application developed with OpenEdge, it’s about having an integrated development platform that includes BPM as part of the toolkit. It will be interesting to see how well this message is received by the OpenEdge development community, and how long it takes to actually impact their development methods. I had a number of questions yesterday during my workshop on exactly this issue: how does BPM fit with an application developed in OpenEdge? It’s about pulling the process out of the app and into BPM, as Wilner pointed out, but also about orchestrating and integrating apps, including OpenEdge and other systems such as CRM and accounting.

Although (Savvion) BPM Studio and the OpenEdge Architect development environment are both Eclipse-based, it doesn’t appear that they’ve been integrated in any significant manner. Similarly, there are two different servers – although a BPM process can call OpenEdge functionality, at least via web services – and two different end-user portal environments, although the BPM server functionality can be surfaced in the OpenEdge portal.

He gave a live demo of creating a process in BPM Studio. This was pretty straightforward, a BPMN diagram of a five-step order processing flow with roles assigned to human steps, plus a simple data model with specific fields exposed at the steps in order to auto-generate the related forms. He then assigned a system step to an OpenEdge function, using web services (SOAP) calls, and another system step using the standard Savvion email functionality. He ran the process in the BPM portal, showing how the tasks were routed to the different users, and how you can monitor the process as it moves through the steps. Nice, and probably new and exciting for the purely OpenEdge people in the audience, but so far, this is just standard BPM with no specific integration between OpenEdge and Savvion BPM, only the standard loosely-coupled web services that would have been there in BPM anyway.

Wilner discussed (but did not demo) the high level of reusability of existing OpenEdge application components in the context of a BPM process, including the use of existing UI forms, but it’s not clear whether this is a level of integration specific to OpenEdge, or just standard integration points and/or custom development.

There is no doubt that BPM provides value as a tool for any application developer, but this demo could have been done with any BPMS and/or any application that exposes functionality as a web service. I know that this session was listed as an introduction to OpenEdge BPM, but it appears to be more of an introduction to BPM for OpenEdge developers. I hope that there is more to OpenEdge BPM than this, as well as a comprehensive roadmap for further integration. His closing slides indicated that this is coming in OpenEdge 11 at the end of this year, and I look forward to seeing how they are going to push this forward.

Active Endpoints’ Cloud Extend For Salesforce Goes Live

Next week at Dreamforce, Active Endpoints’ new Cloud Extend for Salesforce will officially go live. I had a briefing a few months back when it hit beta, and an update last week where I saw little new functionality since the first briefing, but some nice case studies and partner support.

[Image: Introduction Call guide – set up meeting]

Cloud Extend for Salesforce is a helper layer for Salesforce that allows business users to create lightweight processes and guides – think screenflows with context-sensitive scripting – to help users through complex processes in Salesforce. In Salesforce, as in many other ERP and CRM systems, achieving a specific goal sometimes requires a complex set of manual steps. Adding a bit of automation and structure, along with some documentation displayed to the user at the right time, can mean the difference between a process being done correctly and having some critical steps missed. If you look at the Cloud Extend case study with PSA Insurance & Financial Services covered in today’s press release, a typical “first call” sales guide created with Cloud Extend includes such actions as recording the prospect’s current policy expiration date, setting reminders for call-backs, sending out collateral and emails to the prospect, and interacting with other PSA team members via Salesforce Chatter. This means that fewer follow-up items are missed, and improves the overall productivity of the sales reps, since some of the actions are automated or semi-automated. Michael Rowley, CTO of Active Endpoints, wrote about Cloud Extend at the time of the beta release, covering more of the value proposition that they see in adding process to data-centric applications such as Salesforce. Lori MacVittie of F5 wrote about how, although data and core processes can be commoditized and standardized, customer interaction processes need to be customized to be of greatest value. Interestingly, the end result is still a highly structured, pre-defined process, although one that can be created by a business user using a simple tree structure.

When I saw a demo of Cloud Extend, I was reminded of similar guides and scripts that I’ve seen overlaid on other enterprise software to assist in user interactions, usually for telemarketing or customer service agents who are prompted on what to say on the phone to a customer; but this is more interactive than just scripts: it can actually update Salesforce data as part of the screenflow, making it more of a BPM tool than just a user scripting tool. Considering that the ActiveVOS BPMS is running behind the scenes, that shouldn’t come as a surprise, since it is optimized around integration activities. Yet this is not an Active Endpoints application: the anchor application is Salesforce, and Cloud Extend is a helper app around it rather than taking over the user experience. In other words, instead of a BPMS silo in the clouds as we’re seeing from many BPMS cloud vendors, this is using a BPMS platform to provide functionality integrated into another platform. A cloud “OEM” arrangement, if you please.

[Image: Creating a new guide – set automated email step action]

The Guide Designer – a portal into the ActiveVOS functionality from within Salesforce – allows a non-technical user to create a screen flow, add screens and automated steps, call subflows, and call Salesforce functions. The flow can be simulated graphically, stepping forwards and backwards through it, in order to test different conditions; note that this is simulation to determine flow correctness, not to optimize the flow under load, hence quite different from the simulation that you might see in a full-featured BPA or BPMS tool. Furthermore, this is really intended to be a single-person screen flow, not a workflow that moves work between users: sort of like a macro, only more so. Although it is possible to interrupt a screen flow and have another person restart it, that doesn’t appear to be the primary use case.

There are a few bits that a non-technical user likely couldn’t do without a bit of initial help, such as creating automated steps and connecting the completed guides to the Salesforce portal, but it is pretty easy to use. It uses a simple branching tree structure to represent the flow, where the presence of multiple possible responses at a step creates the corresponding number of outbound branches. In flowcharting terms, that means only OR gates, with no merges and no loopbacks (although there is a Jump To Step capability that would allow looping back): it’s really more of a decision tree than what you might think of as a standard process flow.

Creating a guide, or “guidance tree” as it is called in the Guide Designer, consists of adding an initial step and specifying whether it is a screen (i.e., a script for the user to read), an automated step that will call a Salesforce or other function, a subflow step that will call a predefined subflow, a jump-to step that will transfer execution to another point in the tree, or an end step. Screen steps include a prompt and up to four answers to the prompt; this is the question that the user will answer at this point in response to what is happening on their customer call. One outbound path is added to the step for each possible answer, with a subsequent step automatically created on that path. The branches keep growing until end steps are defined on each branch.
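
To make the structure concrete, here’s a hypothetical data model for a guidance tree – my own sketch in Python, not Active Endpoints’ actual representation – showing how each answer on a screen step spawns its own branch:

```python
# Hypothetical representation of a guidance tree: each node is one of the
# five step types described above, and each answer on a screen step leads
# to its own branch. Not Active Endpoints' actual data model.
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class GuideStep:
    kind: str                     # "screen", "automated", "subflow", "jump" or "end"
    prompt: Optional[str] = None  # screen steps: the script shown to the user
    answers: List[str] = field(default_factory=list)  # up to four answers
    branches: Dict[str, "GuideStep"] = field(default_factory=dict)  # answer -> next step
    action: Optional[str] = None  # automated steps: the function to call
    target: Optional[str] = None  # jump steps: the step to transfer execution to

# A screen step with two answers, each spawning its own outbound branch;
# branches keep growing until every path reaches an end step.
root = GuideStep(
    kind="screen",
    prompt="Is the prospect's current policy up for renewal this quarter?",
    answers=["Yes", "No"],
)
root.branches["Yes"] = GuideStep(kind="automated", action="create_callback_task")
root.branches["No"] = GuideStep(kind="end")
```

Note how the branching dictionary makes the OR-gates-only, no-merge constraint structural: a step can fan out, but branches never rejoin.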

[Image: Complex guidance tree – additional steps on the right revealed on navigation]

A complex tree can obviously get quite large, but the designer UI has a nice way of scrolling up and down the tree: as you select a particular step, you see only the connected steps twice removed in either direction, with a visual indicator to show that the branch continues to extend in that direction.

Regardless of the complexity of the guidance tree, there is no palette of shapes or anything vaguely BPMN-ish: the user just creates one step after another in the tree structure, and the prompts and answers create the flow through the tree. Active Endpoints sees this tree-like method of process design, rather than something more flowchart-like, as a key differentiator. In reality, under the covers, it is creating BPMN that is published to BPEL, but the designer user interface limits the design to a simple branching tree structure that is a subset of both BPMN and BPEL.

Once a flow is created and tested, it is published, which makes it available to run in the Salesforce sales guides section directly on a sales lead’s detail page. As the guide executes, it displays a history of the steps and responses, making it easy for the user to see what’s happened so far while being guided through the steps.

[Image: Cloud Extend, Socrates modeler and multi-tenant ActiveVOS in the PaaS stack]

Obviously, the Active Endpoints screen flows are executing in the cloud, although as of the April release, they were hosted on Terremark rather than on Salesforce’s platform. Keeping it on an independent platform is critical for them, since there are other enterprise cloud software platforms with which they could integrate for the same type of benefits, such as QuickBooks and SuccessFactors. Since very little data is persisted in the process instances within Cloud Extend – just some execution metrics for reporting and the Salesforce object ID for linking back to records in Salesforce – there is less concern about where this data is hosted, as it will never contain any personally identifiable information about a customer.

We’re starting to see client-side screen flow creation from a few of the BPMS vendors – I covered TIBCO’s Page Flow Models in my review of AMX/BPM last year – but those screen flows are only available at a step in a larger BPMS model, whereas Cloud Extend has encapsulated that capability for use in other platforms. For small, nimble vendors who don’t need to own the whole application, providing embeddable process functionality for data-centric applications can make a lot of sense, especially in a cloud environment where they don’t need to worry about the usual software OEM problems of installation and maintenance.

I’m curious about whatever happened to Salesforce’s Visual Process Manager and whether it will end up competing with Cloud Extend; I had a briefing on Visual Process Manager over a year ago that amounted to little, and I haven’t heard anything about it since. Neil Ward-Dutton mentions these two possibly-competing offerings in his post on the beta release of Cloud Extend, but as he points out, Visual Process Manager is more of a general-purpose workflow tool, while Cloud Extend is focused on task-specific screen flows within the Salesforce environment. Just about the opposite of what you might have expected to come out of these respective vendors.


Business Process Manifesto By @RogerBurlton – Open For Comments

Roger Burlton, who has been doing business process stuff for even longer than me, has written The Business Process Manifesto, which he describes as “a necessary foundation for all things process”. He’s gathered some feedback on it at a few conferences, and has asked me to post it here for more comments.

Please review and add any comments here; I’d appreciate it if you would use your real name and email address in the comment form (only I can see the email addresses) so that I can pass these on to Roger in case he wants to follow up with you on your comments.

Strategic Synergies Between BPM, EA and SOA

I just had to attend Claus Jensen’s presentation on actionable architecture with synergies between BPM, EA and SOA, since I read two of his white papers in preparing the workshop that I delivered here on Wednesday on BPM in an EA context. I also found out that he co-authored a new Redbook on EA and BPM.

Lots of great ideas here – I recommend that you read at least the first of the two white papers that I link to above, which is the short intro – about how planning (architecture) and solution delivery (BPM) are fundamentally different, and how you can’t necessarily transform statements and goals from architecture into functions in BPM, although information is passed in both directions between the two lifecycles.

He went through descriptions of scenarios for aligning and interconnecting EA and BPM, also covered in the white papers, which are quite “build an (IBM-based) solution”-focused, but still contain some good nuggets of information.

Workshop: BPM In An EA Context

Here are my presentation slides from the workshop that I gave on Wednesday here at the IRM BPM conference in London, entitled Architecting A Business Process Environment:

As always, some slides may not make much sense without my commentary (otherwise, why would I be there live?), but feel free to ask any questions here in the comments.

BPM Rapid Development Methodology

Today at the IRM BPM conference, I started with a session by Chris Ryan of Jardine Lloyd Thompson on a rapid development methodology with a BPMS in their employee benefits products area.

They’ve been a HandySoft customer since 2004, using BizFlow both for internal applications and for external-facing solutions that their customers use directly; they switched off a Pega project that was underway (and struggling) in one of their acquisitions and replaced it with BizFlow in about 4 months. However, they were starting to become a victim of their own success, with many parts of the organization wanting their process applications developed by the same small development team.

They weren’t building their BPM solutions in a consistent and efficient fashion, and were using a waterfall methodology; they decided to move to an Agile development methodology, where the requirements, process definition and application/form design are all done pretty much simultaneously, with full testing happening near the end but still overlapping with ongoing development. They’ve also started thinking about their processes in a more service-oriented way, designing (sub)processes for specific discrete functions so that different people can work on the subprocesses that will make up part of a higher-level process. This has tracking implications as well: users viewing a process in flight can look at the top-level process, or drill down into the individual functions/subprocesses as required.

They’ve established best practices and templates for their user interface design, greatly reducing the time required and improving consistency. They’ve built in a number of usability measures, such as reducing navigation and presenting only the information required at a specific step. I think that this type of standardization is something rarely done in the user interface end of BPMS development, and I can see how it would accelerate their development efforts. It’s also interesting that they moved away from cowboy-style development into a more disciplined approach, even while implementing Agile: the two are definitely not mutually exclusive.

This new methodology and best practices – resulting in a lean BPM team of analysts, PMs, developers, testers and report writers – have allowed them to complete five large projects incorporating 127 different processes in the past year. Their business analysts actually design the processes, involving the developers only for the technical bindings; this means that the BAs do about 50% of the “development”, which is what all of the BPMS vendors will tell you should be happening, but which rarely actually happens in practice.

From an ROI standpoint, they’ve provided the infrastructure that has allowed the company to grow its net profit by 46%, in part through headcount reduction of as much as 50% in some areas, and also in the elimination of redundant systems (e.g., Pega).

They’ve built a business process competency center, and he listed the specific competencies that they’ve been developing in their project managers, analysts, developers and “talent development” (training, best practices and standards). Interestingly, he pointed out that their developers don’t need really serious technical skills, because BizFlow development really doesn’t get that technical.

He finished up with their key success factors: business involvement and user engagement, constant clear communications amongst stakeholders, and good vendor support. They found that remote teams can work well, as long as the communication methods support the user engagement throughout the process, since Agile requires constant review by users and retuning of the application under development throughout the lifecycle, not just during a final testing stage.

Great success story, both for JLT and for HandySoft.