Beyond BPM: Virtusa’s Experience With Process-Led Transformation

I was late arriving at the breakout session by Virtusa’s Stuart Chandler this afternoon – as Ron Locklin tweeted, we’re over here in the outer Siberia of the conference center, and it was a long trip – so arrived as Chandler was discussing their experiences with customers who are undergoing business transformation powered by BPM and RPM.

Virtusa’s experience with process-led transformation in their customers has manifested in a number of ways:

  • Upsell/cross-sell opportunities to expand into new markets and products
  • Reduced costs and improved quality
  • A collaborative work environment that aligns cross-functional activities and strengthens linkages along the supply chain
  • Incorporation of the customer into the process to improve service levels
  • Greater transparency and visibility, particularly into real-time activity
  • An empowered business that takes ownership of its processes, making its own process changes rather than relying on IT
  • Case management and a 360-degree view of the customer

Not surprisingly, they’ve seen a number of gaps and emerging needs driven by these transformation efforts, both on the project teams and in the business areas being transformed. Several of these were highlighted in research presented at BPM 2011 in Clermont-Ferrand a few weeks ago: process discovery (automated and manual), multi-platform integration, the impact of events on processes, and dynamic runtime capabilities. These issues arise in any technology-heavy business transformation, not just those involving BPM, but change management problems are exacerbated by the rapidly-changing environments that we create with BPM and related technologies.

They are seeing the focus of BPM shifting from reactive to proactive through event handling, interactive process management, sense and respond processes, and fast visualization and definition of new situations and contexts. This, of course, moves far beyond just BPM to more of the RPM portfolio that Progress is promoting, since traditional BPM platforms can’t easily handle the dynamic nature of real business processes, and are unable to handle and provide visibility into exceptions and dynamic processes, especially those that span multiple organizations.

Progress isn’t the only BPM product in Virtusa’s portfolio, but Chandler pointed out how RPM – which integrates BPM, BEP and RTA – is a more effective transformation tool because of its treatment of transactions, events and processes, and because Control Tower provides a common interface not just for monitoring, but for effecting change in the underlying components. However, organizations need to make some changes in order to properly adopt a transformational RPM implementation: becoming familiar with the platform and architecture, finding the right methodology, and making the necessary cultural shifts.

Although he gave some lip service to Progress RPM, I have the sense that this is a presentation that he gives at a variety of their BPM partner conferences with just a few slides switched out. He talked about the Virtusa BPM implementation methodology – a waterfall/Agile hybrid – and a case study that probably won the prize for most words ever crammed onto a single PowerPoint slide. Some good information, but like most partner presentations that I see at vendor conferences, a bit too self-serving in parts to be completely credible.

OpenEdge BPM: Modifying an OE Application To Integrate With BPM

I sat in on a breakout session today at the Progress Revolution conference on OpenEdge BPM and migrating existing OpenEdge applications to work with (Savvion) BPM. There are some new ways of doing this coming in OE 11 that weren’t shown in this session, but I’ve had a few conversations with people since my blog post yesterday and expect to have a more in-depth briefing on this soon.

Traditionally, in OE development as in many other application development environments, the business process and logic are built directly in the application, intertwined with the user interface and other components. This pre-SOA app dev methodology is pretty old-school, but I think that’s what we’re dealing with in considering a lot of the existing OE apps that have been in production for years: not only were they designed before multi-tiered architecture was generally accepted practice, in many cases they may have been designed by less-technical business people who weren’t aware of these types of coding standards.

Now that Savvion BPM is integrated with OE, the plan will be that business processes will be made explicit in the BPM layer, and eventually much of the user interface will also be moved to the BPM forms layer, while the more traditional OE code will provide the services layer that is consumed by the process. This creates a more flexible and open service oriented architecture, allowing the BPM processes to consume both the OE services (currently with web services, but presumably with tighter couplings in the future) and services from other systems in an enterprise’s environment in order to orchestrate multiple applications.

If you were starting fresh with a new app, that would not be a significant problem: build your processes and UIs in BPM, build your services in OE, and you’re done. The problem, however, is the huge body of existing OE applications that need to start migrating in this direction. This problem exists far beyond the OpenEdge world, of course: those with legacy code and monolithic applications of any sort are having to deal with this as they struggle to become more service-oriented, and to integrate BPM as an overarching orchestration framework.

Brian Bowman, a Progress senior systems consultant, led this session and gave a demo of creating a process in BPM – all the while explaining some BPM basics to what I assumed was a room full of OE developers. Like a huge portion of the business/IT world, most OE customers and partners have no idea what BPM looks like or what it can do for them, meaning that Progress has a lot of education to do before anyone actually starts integrating BPM into their OE apps. A huge opportunity for Progress, undoubtedly, but also a huge challenge. I’m also struck by the idea that a lot of the Progress people, particularly the SCs who will be demoing this to customers and partners, need better general process modeling training, including more rigorous BPMN education, not just training on pushing the mouse around in the BPM tool.

Brian was joined by Sandy (I missed her last name), another senior systems consultant, who shifted from the “business analyst” role that Brian had played in creating the process in BPM to a “developer” role in OE. She had a pre-built OE app with a button that instantiated a process and displayed the process ID; she showed the underlying OE code, which made a CreateInstance call followed by some updateDataSlot calls to populate the process instance parameters from the OE database. The rest of the integration happened on the BPM side, with the basic points of integration as follows:

  • Create a process instance from an OE app, and populate the data fields. I don’t know OE code, but it appears that it uses a couple of new or repurposed functions (CreateInstance and updateDataSlot) to call BPM.
  • Call an OE function from a process step using a SOAP call. This requires that the function be exposed in OE as a web service, but BPM would not have had to be changed in order to make the call, since that’s a standard functionality in Savvion.
  • Update the OE database from a process step. This is based on the OE database connectivity functionality that has been added to BPM.
  • Embed a WebSpeed form page in a BPM UI form: basically, replacing a BPM UI form with an existing WebSpeed form to complete a BPM step. It is not possible to use an existing OE GUI form in this way, only a WebSpeed form since the HTML can be embedded as a URL. This is done by embedding the search parameters directly in the URL that is called to invoke the embedded WebSpeed form, which may be a security concern in some situations.
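
The first integration point above can be sketched in a few lines. CreateInstance and updateDataSlot are the call names mentioned in the session; the in-memory server stub and all field names below are my own invention, purely so the sketch runs standalone rather than reflecting the actual Savvion API.

```python
import itertools


class BpmServerStub:
    """Hypothetical stand-in for the BPM server's instance-management API."""

    def __init__(self):
        self._ids = itertools.count(1)
        self.instances = {}  # process instance ID -> {slot name: value}

    def create_instance(self, process_name):
        # Mirrors the CreateInstance call from the demo: start a named
        # process and return the new instance's ID.
        pid = next(self._ids)
        self.instances[pid] = {"_process": process_name}
        return pid

    def update_data_slot(self, pid, slot, value):
        # Mirrors updateDataSlot: copy one application field into the
        # process instance's data slots.
        self.instances[pid][slot] = value


# What the OE-side code effectively did in the demo: create an instance,
# then populate it from database fields before the first human step runs.
server = BpmServerStub()
pid = server.create_instance("OrderApproval")
for slot, value in {"customer": "ACME", "order_total": 1250.00}.items():
    server.update_data_slot(pid, slot, value)
```

The point of the pattern is that the application only pushes data into the instance at creation time; everything after that is driven from the BPM side.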

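The security concern in the last bullet is worth making concrete: the record identifiers ride in the query string, where they are visible in logs and browser history and can be tampered with by the user. A hypothetical sketch (the URL, parameter names and helper are all invented for illustration):

```python
from urllib.parse import urlencode


def webspeed_form_url(base, order_id, customer_id):
    # The BPM form would simply embed this URL; anyone who can edit the
    # query string can point the form at a different record, so the
    # WebSpeed server must still re-check authorization on every request.
    params = {"order": order_id, "cust": customer_id}
    return f"{base}?{urlencode(params)}"


url = webspeed_form_url("https://example.internal/orderform", 1001, "ACME")
```

Signing the parameters or passing an opaque token instead would mitigate this, at the cost of more coordination between the BPM and WebSpeed sides.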
There’s definitely an issue with those using the OE GUI (again, I’m not that familiar with OE, but I assume this is a thick client UI) since these can’t be used directly as BPM task UIs as you can with the WebSpeed forms, although you could add API calls to your OE GUI application to update the BPM process instance, effectively creating a shadow process in OE and BPM. Not best practice, I’m sure, but possibly a stop-gap measure for those migrating from the OE GUI either to WebSpeed forms or BPM UI forms.

OE and BPM have separate servers (as you would expect), and deployment is done independently, as far as I can tell. That makes sense, since the eventual goal is for OE to become more of a services layer consumed by BPM; there is no real need for tightly coupled deployment, although you do have to be concerned about dependencies, as with any SOA deployment.

Some of the questions at the end were related to how OE functionality would be replicated in BPM, such as roles and security; Progress definitely needs to do some work around educating the customers and partners on how the features that OE developers currently rely on will be done on the BPM side, for those functions such as UI and work management that are expected to move over to this new environment.

7 Reasons To Build Business Apps on BPM by @neilwd

I was late to Neil Ward-Dutton’s session due to another meeting, and ended up arriving just as he started on reason #7. However, in the summary, he did list out his 7 reasons why you want to build business applications using BPM technology:

  1. Demonstrate value transparently
  2. Support iterative, collaborative changes
  3. Speed up user acceptance
  4. Improve management of customizations
  5. Enable transition to SaaS delivery
  6. Reach new stakeholders
  7. Support continuous improvement

He finished with the signals that you can use to identify the maximum opportunity for value from using BPM:

  • Strong service differentiation focus
  • Strong need for performance transparency
  • Dynamic regulation or policy environment
  • Need to coordinate work/information across or between organizations

I was sorry to miss most of the presentation, but there apparently is a paper that Neil wrote on the Progress website going through these points as well (although I don’t have the link).

RPM For Top and Bottom Line Improvement

Dr. Ketabchi, who was CEO of Savvion before the acquisition and is now a chief strategist at Progress, presented a breakout session on using RPM to enable enterprises to improve their top and bottom line. BPM isn’t just about cutting costs and improving quality any more (although those are still important, and expected), it’s also about increasing revenue by taking advantage of opportunities as they arise.

He gave another version of what I saw in Wilner’s presentation on the justification for BPM (explicit business processes leading to agility, visibility and better business understanding of processes) which really drives home that I’m not at a BPM vendor’s conference, I’m at an application development tool vendor’s conference where they are introducing this hot new technology called BPM. This is, of course, the stage that most of the business world is at with respect to BPM understanding; I’m just so used to being in the BPM echo chamber that I rarely hear these messages unless I’m delivering them to a client.

Dr. K pointed out that BPM isn’t enough to increase revenue, although it obviously pained him to say that; business event processing (BEP) and embedded real time analytics are also required. Revenue generating opportunities are always customer-facing and situational, based on time, location, occasion, connection, exceptions and/or actions. This requires understanding the customers’ situations, analyzing those situations, and delivering (or offering) services and products that they need immediately. Sensing and understanding the customers’ situations requires the processing and correlation of events from a variety of sources using BEP, extracting information from the context of those events, and triggering actions and events as a result. In part, it’s about recognizing patterns and exceptions.

He went on to discuss these functionalities in the context of the Progress RPM suite, and some customer examples of using RPM to take advantage of revenue-generating opportunities as they arise, such as a mobile phone company pushing offers to their (opted-in) customers based on their location. No real new information here, but it shows a realignment of RPM’s focus to be as much about improving the top line as the bottom line of business.

Dr. K will be speaking at the Forrester BPM event here in Boston on Thursday, along with Progress customer Reliance Capital.

OpenEdge BPM Introduction with @KenWilner

Ken Wilner, Progress’ VP of Technology for the OpenEdge product, gave a breakout session on OpenEdge BPM, which integrates the former Savvion BPM platform into the OpenEdge application development environment to allow the business process to be made explicit – externalized from the application – in order to improve agility and visibility. It’s interesting to hear this message, which is no longer even a topic of conversation in mainstream BPM circles because it is so well-understood, presented to a group of OpenEdge application developers.

Does the integration of BPM just relegate OpenEdge to the scripting/coding language slaved to BPM? Maybe, but that’s not necessarily a bad thing. Instead of layering BPM on top of a monolithic application developed with OpenEdge, it’s about having an integrated development platform that includes BPM as a part of the toolkit. It will be interesting to see how well this message is received by the OpenEdge development community, and how long it takes to actually impact their development methods. I had a number of questions yesterday during my workshop on exactly this issue: how does BPM fit with an application developed in OpenEdge? It’s about pulling the process out of the app and into BPM, as Wilner pointed out, but also about orchestrating and integrating apps including OpenEdge and other systems such as CRM and accounting.

Although (Savvion) BPM Studio and the OpenEdge Architect development environment are both Eclipse-based, it doesn’t appear that they’ve been integrated in any significant manner. Similarly, there are two different servers – although a BPM process can call an OpenEdge functionality, using web services at least – and two different end-user portal environments, where the BPM server functionality can be surfaced in the OpenEdge portal.

He gave a live demo of creating a process in BPM Studio. This was pretty straightforward, a BPMN diagram of a five-step order processing flow with roles assigned to human steps, plus a simple data model with specific fields exposed at the steps in order to auto-generate the related forms. He then assigned a system step to an OpenEdge function, using web services (SOAP) calls, and another system step using the standard Savvion email functionality. He ran the process in the BPM portal, showing how the tasks were routed to the different users, and how you can monitor the process as it moves through the steps. Nice, and probably new and exciting for the purely OpenEdge people in the audience, but so far, this is just standard BPM with no specific integration between OpenEdge and Savvion BPM, only the standard loosely-coupled web services that would have been there in BPM anyway.

Wilner discussed (but did not demo) the high level of reusability of existing OpenEdge application components in the context of a BPM process, including the use of existing UI forms, but it’s not clear that this is a level of integration specific to OpenEdge, or just using standard integration points and/or custom development.

There is no doubt that BPM provides value as a tool for any application developer, but this demo could have been done with any BPMS, and/or any application that exposes functionality as a web service. I know that this session was listed as an introduction to OpenEdge BPM, but it was really more of an introduction to BPM for OpenEdge developers. I hope that there is more to OpenEdge BPM than this, as well as a comprehensive roadmap for further integration. His closing slides indicated that this is coming in OpenEdge 11 at the end of this year, and I look forward to seeing how they push this forward.

Progress Revolution Keynotes: Goodson and more @DrJohnBates

The second morning keynote was by John Goodson, SVP of products, digging a bit more deeply into the technology and products behind responsive business management (Control Tower, Savvion, Apama, Actional, Visual Analytics) and responsive business integration (Actional, DataXtend, DataDirect, Sonic). This is a formidable suite of products, and there are 20 more in their portfolio that he didn’t even cover; some degree of integration between these is really required to make this a bit less unwieldy.

He showed a (canned) demo of Progress RPM and Control Tower, including event and process monitoring as well as process modeling directly in the same environment. He stressed that Control Tower is not just about visibility, it’s about being able to make changes in real time.

He quoted Mike Gualtieri of Forrester, who said “Java is a dead-end for enterprise application development” earlier this year; Goodson pointed out that Java coding is not inherently agile enough for a truly responsive business as he announced OpenEdge 11: fully integrated with BPM, multi-tenanted and deployable in the cloud, with support for iPad and Silverlight. It appears that OpenEdge is now positioned as part of RPM, with OpenEdge BPM being positioned as “the world’s first business process-enabled application development platform”. Furthermore, it’s deployable in the cloud on their Arcade platform.

John Bates returned to the stage – complete in a British red coat and tricorner hat last seen in Boston around 1776 – to discuss the industry solution accelerators available for RPM: capital markets, telecom and more. These accelerators allow developers to quickly assemble applications that combine RPM capabilities and their own industry knowledge. Interestingly, Savvion refocused on solution accelerators a year or more before their acquisition by Progress, and these are now considered by the analysts to be a required feature in a BPMS.

He finished with a recorded demo of detecting wash trades in the market, based on the capital markets solution accelerator, including discovering alerts, analyzing the related data, and setting up custom event correlation on the fly.

Progress Revolution Kicks Off: @RReidy and @DrJohnBates Keynotes

I arrived in Boston yesterday for Progress Software’s user conference, Revolution, and to deliver an Introduction to BPM workshop yesterday afternoon. With that out of the way, I can focus on the other speakers here, and what’s happening with Progress these days.

The keynotes opened with Rick Reidy, the CEO, with a bit of history of Progress as a supporter of business-led software development, and their current leading position in helping companies to become more operationally responsive to external events. This seems a bit strange, given that it was announced six weeks ago that Reidy will be leaving as soon as a successor is found – what was the board thinking, allowing this to happen just prior to the user conference without a replacement in place?

Regardless of this elephant in the room, he spoke well about creating the responsive business as a revolution in both technology and business, and what Progress is doing to lead that with new versions of OpenEdge and RPM, the latter of which includes the Progress Control Tower, an interactive cockpit. In spite of Progress’ long history with their OpenEdge software development environment, it’s clear that much of their future success is based on the Apama CEP and Savvion BPM acquisitions, and the integration of these product functionalities into a comprehensive solution.

Next up was John Bates, CTO, talking about how business success is defined by how you respond to the continuous disruption that occurs in everyday business life. Looking just at stock trading – a favorite topic for Bates, who has done a lot of work in this area – consider flash crashes and events such as the recent UBS $2B loss due to a single rogue trader; but he also touched on the business responses to natural disasters such as the Japan earthquake earlier this year, where entire supply chains were reconfigured. On a smaller scale, consider business responses to customer events such as missed SLAs, complaints and even Twitter messages about your company or products. It’s not just disaster management/avoidance: there are also fleeting opportunities for revenue such as mobile promotions, web upsells, and algorithmic trading, which can’t rely on a person noticing that something needs to be done, but must be automated based on external events.

Bates pointed out that companies that use classic business intelligence are driving in the rear view mirror: using past data to try to predict the future. Instead, you need to become operationally responsive:

  • Continuously mitigating risk
  • Optimizing operations dynamically
  • Capitalizing on real-time revenue opportunities

Progress has developed a responsive business prescription, which is a set of methods and tools for ensuring operational responsiveness. That starts with continuous business visibility, so that you can figure out how your business is doing in real time. That allows the next step, which is to sense and respond to opportunities and threats. Finally, you can improve your business processes based on this real-time sense and respond capability. See—Respond—Improve.

In order to do this, Progress provides an integrated suite of products that they call Responsive Process Management, which includes BPMS, CEP, business transaction management (BTM) and business analytics. The analysts are jumping on board: Gartner refers to this as intelligent business operations; IDC refers to it as business navigation systems.

BPM and CEP together are key to this: the BPMS allows for agile modeling and deployment of processes, while CEP correlates external events to determine what should impact the processes on the fly. He gave an example using the “tarmac rule” – the rule in the US that results in huge fines for airlines who load up a plane and leave the passengers sitting for more than 3 hours – where the combination of weather events, flight operations events, maintenance events and crew events can be used to avoid violating the tarmac rule by redeploying crew and aircraft as required, or avoid boarding a plane where it’s already known that the rule would be violated.
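The tarmac-rule example boils down to correlating delay events from several sources against a hard time limit before committing to boarding. A toy sketch of that logic follows; the event shapes, delay figures and threshold arithmetic are all invented for illustration, and a real CEP engine such as Apama would express this as declarative event patterns rather than imperative code.

```python
# US tarmac rule: heavy fines kick in after 3 hours on the tarmac.
TARMAC_LIMIT_MIN = 180


def estimated_ground_delay(events):
    """Sum the delay contributions correlated for one flight."""
    return sum(e["delay_min"] for e in events)


def should_hold_boarding(events, taxi_out_min=20):
    # If the known delays plus a normal taxi-out time already approach the
    # limit, don't board: a gate delay beats a tarmac violation.
    return estimated_ground_delay(events) + taxi_out_min >= TARMAC_LIMIT_MIN


# Correlated events for one flight, from the sources named above.
flight_events = [
    {"source": "weather", "delay_min": 90},  # ground stop at destination
    {"source": "ops", "delay_min": 45},      # de-icing queue
    {"source": "crew", "delay_min": 30},     # inbound crew delayed
]
hold = should_hold_boarding(flight_events)  # 165 + 20 >= 180, so hold
```

The interesting part operationally is that no single event source predicts the violation; it only becomes visible when the three delay streams are correlated against the same flight.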

In order to become more responsive, business analytics have to combine real-time and historical data, without necessarily replacing the core legacy systems that run the business. I touched on this yesterday during my workshop, in response to questions about the value of BPM when you have a perfectly good monolithic legacy application: basically, you want to gain visibility into (and potentially control over) those legacy applications. This is where business transaction management comes into play, allowing you to monitor the events and state changes within the legacy applications, track and orchestrate the entire flow in BPM – for example, linking together CRM, ERP and logistics systems, plus adding new functionality in an end-to-end process flow – then use Control Tower to view and control the entire process, including the underlying legacy applications.

Bates is a great speaker (and a really smart guy), so it’s always a pleasure to see him present: equally informative and entertaining.

Introduction to BPM Workshop at Progress Revolution

I gave a workshop on Introduction to BPM at the pre-conference day yesterday for Progress Software’s user conference, Revolution:

The room was full, well over 120 people, and I’ve had a lot of good feedback from the session. As always, I could have talked all day about this stuff, but had to limit myself to about 2.5 hours, with another hour hanging around talking to attendees afterwards. It was great to get my presentation out of the way early in the conference; now I can relax and focus on (and blog about) other people’s presentations.

The Changing Nature of Work

I’ve been completely remiss with my blogging at the academic/research conference on BPM this week in Clermont-Ferrand, France. This isn’t for lack of good material from the workshops and presentations, but I’ve been a bit busy preparing for the keynote that I gave this morning:

I really enjoyed the discussion afterwards, both in the Q&A and at the break later. There is quite a bit of research already happening in the areas that I list as “unsolved problems” (by which I mean, generally unsolved in commercial products), and I gained some good insights on removing the boundary between what we now think of as structured and unstructured processes.

There is a bit of Twitter action going on at the conference, using both the hashtags #BPM2011 and #BPM11, with Michael zur Muehlen leading the pack on live-tweeting my presentation.

Active Endpoints’ Cloud Extend For Salesforce Goes Live

Next week at Dreamforce, Active Endpoints’ new Cloud Extend for Salesforce will officially go live. I had a briefing a few months back when it hit beta, and an update last week where I saw little new functionality since the first briefing, but some nice case studies and partner support.

[Screenshot: Introduction Call guide – set up meeting]

Cloud Extend for Salesforce is a helper layer that integrates with Salesforce, allowing business users to create lightweight processes and guides – think screenflows with context-sensitive scripting – to help users through complex processes in Salesforce. In Salesforce, as in many other ERP and CRM systems, achieving a specific goal sometimes requires a complex set of manual steps. Adding a bit of automation and a bit of structure, along with some documentation displayed to the user at the right time, can mean the difference between a process being done correctly and critical steps being missed. In the Cloud Extend case study with PSA Insurance & Financial Services covered in today’s press release, a typical “first call” sales guide created with Cloud Extend includes such actions as recording the prospect’s current policy expiration date, setting reminders for call-backs, sending out collateral and emails to the prospect, and interacting with other PSA team members via Salesforce Chatter. This means that fewer follow-up items are missed, and improves the overall productivity of the sales reps, since some of the actions are automated or semi-automated.

Michael Rowley, CTO of Active Endpoints, wrote about Cloud Extend at the time of the beta release, covering more of the value proposition that they are seeing by adding process to data-centric applications such as Salesforce. Lori MacVittie of F5 wrote about how, although data and core processes can be commoditized and standardized, customer interaction processes need to be customized to be of greatest value. Interestingly, the end result is still a highly structured pre-defined process, although one that can be created by a business user using a simple tree structure.

When I saw a demo of Cloud Extend, I was reminded of similar guides and scripts that I’ve seen overlaid on other enterprise software to assist in user interactions, usually for telemarketing or customer service to be prompted on what to say on the phone to a customer, but this is more interactive than just scripts: it can actually update Salesforce data as part of the screenflow, hence making it more of a BPM tool than just a user scripting tool. Considering that the ActiveVOS BPMS is running behind the scenes, that shouldn’t come as a surprise, since it is optimized around integration activities. Yet, this is not an Active Endpoints application: the anchor application is Salesforce, and Cloud Extend is a helper app around that rather than taking over the user experience. In other words, instead of a BPMS silo in the clouds as we’re seeing from many BPMS cloud vendors, this is using a BPMS platform to facilitate a functionality integrated into another platform. A cloud “OEM” arrangement, if you please.

[Screenshot: Creating a new guide – set automated email step action]

The Guide Designer – a portal into the ActiveVOS functionality from within Salesforce – allows a non-technical user to create a screen flow, add more screens and automated steps, call subflows, and call Salesforce functions. The flow can be simulated graphically, stepping forwards and backwards through it, in order to test different conditions; note that this is simulation in order to determine flow correctness, not for the purpose of optimizing the flow under load, hence is quite different from simulation that you might see in a full-featured BPA or BPMS tool. Furthermore, this is really intended to be a single-person screen flow, not a workflow that moves work between users: sort of like a macro, only more so. Although it is possible to interrupt a screen flow and have another person restart it, that doesn’t appear to be the primary use case.

There are a few bits that a non-technical user likely couldn’t do without some initial help, such as creating automated steps and connecting the completed guides to the Salesforce portal, but it is pretty easy to use. It uses a simple branching tree structure to represent the flow, where the presence of multiple possible responses at a step creates the corresponding number of outbound branches. In flowcharting terms, that means only OR gates, no merges and no loopbacks (although there is a Jump To Step capability that would allow looping back): it’s really more of a decision tree than what you might think of as a standard process flow.

Creating a guide, or a “guidance tree” as it is called in the Guide Designer, consists of adding an initial step and specifying whether it is a screen (i.e., a script for the user to read), an automated step that will call a Salesforce or other function, a subflow step that will call a predefined subflow, a jump to step that will transfer execution to another point in the tree, or an end step. Screen steps include a prompt and up to four answers to the prompt; this is the question that the user will answer at this point in response to what is happening on their customer call. One outbound path is added to the step for each possible answer, and a subsequent step is automatically created on that path. The branches keep growing until end steps are defined on each branch.
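The guidance-tree model described above can be sketched with a small data structure: screen steps carry a prompt and up to four answers, each answer spawning its own branch, with end steps terminating branches. The class and field names here are my own; Cloud Extend’s internal model (BPMN published to BPEL) is not public in this form.

```python
class Step:
    """Hypothetical node in a guidance tree."""

    MAX_ANSWERS = 4  # per the Guide Designer: up to four answers per prompt

    def __init__(self, kind, prompt=None):
        self.kind = kind      # "screen", "automated", "subflow", "jump", "end"
        self.prompt = prompt  # script/question shown to the user (screen steps)
        self.branches = {}    # answer text -> next Step on that branch

    def add_answer(self, answer, next_step):
        # Only screen steps branch; each answer opens one outbound path.
        if self.kind != "screen":
            raise ValueError("only screen steps branch on answers")
        if len(self.branches) >= self.MAX_ANSWERS:
            raise ValueError("a screen step allows at most four answers")
        self.branches[answer] = next_step
        return next_step


# Build a tiny first-call guide: one question, two branches, one ending.
root = Step("screen", prompt="Did the prospect answer the phone?")
root.add_answer("Yes", Step("screen", prompt="Is their policy up for renewal?"))
root.add_answer("No", Step("end"))
```

Because every answer creates exactly one outbound branch and branches never merge, the structure stays a pure tree, which is what makes it a strict subset of a general BPMN flowchart.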

[Screenshot: Complex guidance tree – additional steps on the right revealed on navigation]

A complex tree can obviously get quite large, but the designer UI has a nice way of scrolling up and down the tree: as you select a particular step, you see only the connected steps twice removed in either direction, with a visual indicator to show that the branch continues to extend in that direction.

Regardless of the complexity of the guidance tree, there is no palette of shapes or anything vaguely BPMN-ish: the user just creates one step after another in the tree structure, and the prompt and answers create the flow through the tree. Active Endpoints sees this tree-like method of process design, rather than something more flowchart-like, as a key differentiator. In reality, under the covers, it is creating BPMN that is published to BPEL, but the designer user interface limits the design to a simple branching tree structure that is a subset of both BPMN and BPEL.

Once a flow is created and tested, it is published, which makes it available to run directly in the Salesforce sales guides section directly on a sales lead’s detail page. As the guide executes, it displays a history of the steps and responses, making it easy for the user to see what’s happened so far while being guided along through the steps.

[Screenshot: Cloud Extend, Socrates modeler and multi-tenant ActiveVOS in the PaaS stack]

Obviously, the Active Endpoints screen flows are executing in the cloud, although as of the April release, they were using Terremark rather than hosting on Salesforce’s platform. Keeping it on an independent platform is critical for them, since there are other enterprise cloud software platforms with which they could integrate for the same type of benefits, such as Quickbooks and SuccessFactors. Since there is very little data persisted in the process instances within Cloud Extend – just some execution metrics for reporting and the Salesforce object ID for linking back to records in Salesforce – there is less concern about where this data is hosted, since it will never contain any personally identifiable information about a customer.

We’re starting to see client-side screen flow creation from a few of the BPMS vendors – I covered TIBCO’s Page Flow Models in my review of AMX/BPM last year – but those screen flows are only available at a step in a larger BPMS model, whereas Cloud Extend has encapsulated that capability for use in other platforms. For small, nimble vendors who don’t need to own the whole application, providing embeddable process functionality for data-centric applications can make a lot of sense, especially in a cloud environment where they don’t need to worry about the usual software OEM problems of installation and maintenance.

I’m curious about whatever happened to Salesforce’s Visual Process Manager and whether it will end up competing with Cloud Extend; I had a briefing of Visual Process Manager over a year ago that amounted to little, and I haven’t heard anything about it since. Neil Ward-Dutton mentions these two possibly-competing offerings in his post on the beta release of Cloud Extend, but as he points out, Visual Process Manager is more of a general purpose workflow tool, while Cloud Extend is focused on task-specific screen flows within the Salesforce environment. Just about the opposite of what you might have expected to come out of these respective vendors.
