WebSphere Business Performance and Service Optimization

I sat in on a roundtable with Doug Hunt, VP of Business Performance and Service Optimization (which appears to be a fancy name for industry accelerators) and Alan Godfrey of Lombardi. Basically, BP&SO is a team within the software group (as opposed to services) that works with GBS (the services part of IBM) to build out industry vertical accelerators based on actual customer experience. In other words, these are licensed software packs that would typically be bundled with services. A BP&SO center of excellence within GBS has been launched in order to link the efforts between the two areas.

I heard a bit about the accelerators in the BPM portfolio update this morning; they’re focused on making implementation faster by providing a set of templates, adapters, event interfaces and content for a specific industry process, which can then be built out into a complete solution by GBS or a partner. In particular, the accelerators look at how collaboration, monitoring, analytics, rules and content can be used specifically in the context of the vertical use case. They’re not really focused on the execution layer, since that tends to be where the ISVs play, but on more prescriptive pieces such as the control layer for real-time monitoring across multiple business silos.

Interestingly, Hunt described the recently-revealed advanced case management (ACM) as a use case around which an accelerator could be developed; I’m not sure that everyone would agree with this characterization, although it may be technically closer to the truth than trying to pass off the ACM “strategy” as a product.

This trend for vertical accelerators has been around in the BPM market for a while with many other vendors, and the large analyst firms typically treat this as a measure of a BPMS vendor’s maturity in BPM. The WebSphere accelerators are less than a packaged application, but more than a sales tool; maybe not much more, since they were described as being suitable for an “advanced conference room pilot”. In any case, they’re being driven in part by the customers’ need to be more agile than is permitted with a structured packaged application. There’s no doubt that some highly regulated processes, such as in healthcare, may still be better suited to a packaged application, but the more flexible accelerators widen the market beyond that of the packaged applications.

WebSphere BPM Analyst Update

There was a lunchtime update for the analysts on all the new WebSphere offerings; this was, in part, a higher-level (and more business oriented) view of what I saw in the technical update session earlier.

We also saw a demo of using Cast Iron (which was just acquired by IBM this morning) to integrate an on-premise SAP system with Salesforce.com; this sort of integration across the firewall is essential if cloud platforms are going to be used effectively, since most large enterprises will have a blend of cloud and on-premise.

There’s a ton of great traffic going on at #ibmimpact on Twitter and the IBM Impact social site, and you can catch the keynotes and press conference on streaming video. Maybe a bit too much traffic, since the wifi is a bit of a disaster.

WebSphere BPM Product Portfolio Technical Update

The keynote sessions this morning were typical “big conference”: too much loud music, and too many comedians and irrelevant speakers, for my taste, although the brief addresses by Steve Mills and Craig Hayman as well as this morning’s press release showed that process is definitely high on IBM’s mind. The breakout session that I attended following that, however, contained more of the specifics about what’s happening with IBM WebSphere BPM. This is a portfolio of products – in some cases, not yet really integrated – including Process Server and Lombardi.

Some of the new features:

  • A whole bunch of infrastructure stuff such as clustering for simple/POC environments
  • WS CloudBurst Appliance supports Process Server Hypervisor Edition for fast, repeatable deployments
  • Database configuration tools to help simplify creation and configuration of databases, rather than the back and forth with a DBA that was required with previous versions
  • Business Space has some enhancements, and is being positioned as the “Web 2.0 interface into BPM” (a message that they should probably pass on to GBS)
  • A number of new and updated widgets for Business Space and Lotus Mashups
  • UI integration between Business Space and WS Portal
  • Webform Server removes the need for a client form viewer on each desktop in order to interact with Lotus Forms – this is huge in cases where forms are used as a UI for BPM participant tasks
  • Version migration tools
  • BPMN 2.0 support, using different levels/subclasses of the language in different tools
  • Enhancements to WS Business Modeler (including the BPMN 2.0 support), including team support, and new constructs including case and compensation
  • Parallel routing tasks in WPS (amazing that they existed this long without that, but an artifact of the BPEL base)
  • Improved monitoring support in WS Business Monitor for ad hoc human tasks
  • Work baskets for human workflow in WPS, allowing for runtime reallocation of tasks – I’m definitely interested in more details on this
  • The ability to add business categories to tasks in WPS to allow for easier searching and sorting of human tasks; these can be assigned at design time or runtime
  • Instance migration to move long-running process instances to a new process schema
  • A lot of technical implementation enhancements, such as new WESB primitives and improvements to the developer environment, that likely meant a lot to the WebSphere experts in the room (which I’m not)
  • Allowing Business Monitor to better monitor BPEL processes
  • Industry accelerators (previously known as industry content packs) that include capability models, process flows, service interfaces, business vocabulary, data models, dashboards and solution templates – note that these are across seven different products, not some sort of all-in-one solution
  • WAS and BPM performance enhancements enabling scalability
  • WS Lombardi Edition: not sure what’s really new here except for the bluewashing

I’m still fighting with the attendee site to get a copy of the presentation, so I’m sure that I’ve missed things here, but I have some roundtable and one-on-one sessions later today and tomorrow that should clarify things further. Looking at the breakout sessions for the rest of the day, I’m definitely going to have to clone myself in order to attend everything that looks interesting.

In terms of the WPS enhancements, many of the things that we saw in this session seem to be starting to bring WebSphere BPM level with other full BPM suites: it’s definitely expanding beyond being just a BPEL-based orchestration tool to include full support for human tasks and long-running processes. The question lurking in my mind, of course, is what happens to FileNet P8 BPM and WS Lombardi (formerly TeamWorks) as mainstream BPM engines if WPS can do it all in the future? Given that my recommendation at the time of the FileNet acquisition was to rip out BPM and move it over to the WebSphere portfolio, and the spirited response that I had recently to a post about customers not wanting 3 BPMSs, I definitely believe that more BPM product consolidation is required in this portfolio.

PegaWORLD: Managing Aircraft at Heathrow Airport

Eamonn Cheverton of BAA discussed the recent event-driven implementation of Pega at Heathrow airport for managing aircraft from touchdown to wheels-up at that busiest of airports. In spite of the recent interruption caused by the volcanic eruption in Iceland, Heathrow sees millions of passengers each year, yet had little operational support or information sharing between all of the areas that handle aircraft, resulting in a depressingly low (for those of us who fly through Heathrow occasionally) on-time departure rate of 68%. A Europe-wide initiative to allow for a three-fold increase in capacity while improving safety and reducing environmental effects drove a new business architecture, and led them to look at more generic solutions such as BPM rather than expensive airport-specific software.

We’ll be looking more at their operations tomorrow morning in the case management workshop, but in short, they are managing aircraft air-to-air: all activities from the point that an aircraft lands until it takes off again, including fuel, crew, water, cleaning, catering, passengers and baggage handling. Interestingly, the airport has no visibility into the inbound flights until about 10 minutes before they land, which doesn’t provide the ability to plan and manage the on-ground activities very well; the new pan-European initiative will at least allow them to know when planes enter European airspace. For North Americans, this is a bit strange, since the systems across Canada and the US are sufficiently integrated that a short-haul flight doesn’t take off until it has a landing slot already assigned at the destination airport.

Managing the events that might cause a flight departure to be delayed allows for much better management of airline and airport resources, such as reducing fuel spent due to excessive taxi times. By mapping the business processes and doing some capability mapping at the business architecture level, BAA is able to understand the interaction between the activities and events, and therefore understand the impact of a delay in one area on all the others. As part of this, they documented the enterprise objects (such as flights) and their characteristics. Their entire business architecture and set of reference models are created independent of Pega (or any other implementation tool) as an enterprise architecture initiative; to the business and process architects, Pega is a black box that manages the events, rules and processes.
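
BAA didn’t go into implementation specifics, but the core idea of propagating a delay in one turnaround activity through everything that depends on it can be sketched roughly as follows; the activity names, durations and dependencies here are entirely hypothetical, just to illustrate the dependency-and-impact analysis rather than BAA’s actual model or Pega’s implementation.

    # Rough sketch (not BAA's model): propagate the knock-on effect of a delayed
    # turnaround activity to the planned off-blocks (departure) time.
    # All activities, durations and dependencies are made up for illustration.

    TURNAROUND = {
        # activity: (planned duration in minutes, activities it depends on)
        "deboard":   (20, []),
        "clean":     (25, ["deboard"]),
        "catering":  (30, ["deboard"]),
        "fuel":      (35, []),
        "board":     (30, ["clean", "catering"]),
        "push_back": (10, ["board", "fuel"]),
    }

    def finish_time(activity, delays):
        """Earliest finish of an activity, given per-activity delays in minutes."""
        duration, deps = TURNAROUND[activity]
        start = max((finish_time(dep, delays) for dep in deps), default=0)
        return start + duration + delays.get(activity, 0)

    baseline = finish_time("push_back", {})
    delayed = finish_time("push_back", {"catering": 20})   # catering runs 20 minutes late
    print(f"planned off-blocks at +{baseline} min; with late catering, +{delayed} min")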

Due in part to this initiative, Heathrow has moved from being considered the world’s worst airport to the 4th best, with the infamous “disastrous” Terminal 5 now voted best in the world. They’re saving 90 liters of fuel per flight, have raised their on-time departure rate to 83%, and now involve all stakeholders in the processes as well as sharing information.

In the near future, they’re planning for real-time demand capacity balancing through better integration, including coordinating aircraft movement across Europe and not just within Heathrow’s airspace. They’re also looking at physical improvements that will improve movement between terminals, such as underground baggage transport links that allow passengers to check in baggage at any terminal. Their current airport plan is based around separate plans for each stand, gate, person, vehicle, baggage and check-in resource; in the future, they will have a single integrated plan for the airport based on flights. They’re also adopting ideas from other industries: providing a timed entry ticket to security at the time that you check in, for example, similar to the fast-track system in theme parks, and (in a move that will raise some security hackles) tracking you on public transit on your way to the airport so that your flight can be rescheduled if your subway is delayed. With some luck, they’ll be able to solve some of the airport turnaround problems such as I experienced in Frankfurt recently.

The tracking and management system, created using Pega, was built in about 180 days: this shows the status of arrivals, departures, turnarounds (the end-to-end process) and a real-time feed of aircraft locations on the airport property, plus historical and predictive reports on departures, arrivals and holdings. Really fascinating case study of using BPM in a non-traditional industry.

PegaWORLD: SunTrust Account Opening

Trace Fooshee, VP and Business Process Lead at SunTrust, discussed how they are using Pega to improve account opening in their wealth management area. He’s in a group that acts as internal management consultants for process transformation and related technology implementation: this covers issues ranging from lack of integration to inconsistent front-back office communications to inefficient manual work management. Some of the challenges within their wealth management account opening process were increasing costs due to inefficient and time-consuming manual processes, inconsistent processes, and poor operational and management control.

In order to address the challenges, they set a series of goals: reducing account opening time from 15 to 4 days, improving staff productivity, eliminating client setup inconsistencies, streamlining approvals, and converting 40% of maintenance requests to straight-through processing (STP). In addition to these specific targets, they also wanted to develop enterprise account opening services and assets that could be used beyond wealth management. They approached all of this with online intent-driven forms, imaging and automated work management, online reporting and auditing, backend system integration, and standardized case management to share front and back office information.

Having some previous experience with Pega, they looked at how BPM could be applied to these issues, and found advantages in terms of flexibility, costs and other factors compared to both in-house builds and purchase of an account opening application. In considering their architecture, they classified some parts as enterprise assets, such as scanned versions of the executed trust documents that went into their FileNet enterprise content repository, versus line-of-business and business unit assets, such as specific process flows for account setup.

Using an iterative waterfall approach, they took a year to put their first pilot into production: it seems like they could have benefited from a more agile approach that would have seen incremental releases sooner, although this was seen as fast compared to their standard SDLC. Considering that the system just went into production a couple of weeks ago, they don’t really know how many more iterations will be required – or how long each will take – to optimize this for the business. They were unable to use Pega’s Directly Capture Objectives (DCO) methodology for requirements since it conflicted with their current standard for requirements; as he discussed their SDLC and standard approaches, it appears that they’re caught in the position of many companies, where they know that they should go agile, but just can’t align their current approach to that. The trick is, of course, that they have to get rid of their current approach, but I imagine that they’re still in denial about that.

Some of the lessons that they learned:

  • Break down complex processes and implement iteratively.
  • Strong business leadership accelerates implementation.
  • Track and manage deferred requirements, and have a protocol for making decisions on which to implement and which to defer.
  • Get something working and in the hands of the users as soon as possible.

The year that they took for their first release was 3-4 months longer than originally planned: although that doesn’t sound like much, consider it as a 30-40% schedule overrun. Combining that with the lesson that they learned about putting something into the users’ hands early, moving from an iterative waterfall to an agile approach could likely help them significantly.

Their next steps include returning to their deferred requirements (I really hope that they re-validate them relative to the actual user experience with the first iteration, rather than just implementing them), expanding into other account opening areas in the bank, and leveraging more of the functionality in their enterprise content management system.

PegaWORLD: Zurich’s Operational Transformation Through BPM

Nancy Mueller, EVP of Operations at Zurich Insurance North America, gave a keynote today on their operational transformation. Zurich has 60,000 employees, 9,500 of them in North America, and serves customers in 170 countries. Due to growth by acquisition, they ended up with five separate legal entities within the US, only one of which was branded as Zurich; this tended to inhibit enterprise-wide transformation. Their business in North America is purely commercial, which tends to result in much more complex policies that require highly-skilled and knowledgeable underwriters.

She admitted that the insurance industry isn’t exactly at the forefront of technology adoption and progressive change, but that they are recognizing that change is necessary: to stay competitive, to adapt to changing economic environments, and to meet customer demands. Their focus for change is the vision of a target operating model with specific success criteria around efficiency, effectiveness and customer satisfaction. They started with process, a significant new idea for Zurich, and managing the metrics around the business processes: getting the right skills doing the right parts of the process. For example, in underwriting, there were a lot of steps being done by the highly-skilled underwriters because it was just easier than handing things off (something that I’ve seen in practice with my insurance clients), although it could be much more effective to have underwriter support people involved who can take on the tasks that don’t need to be done by an underwriter. One of the challenges was managing a paperless process: trying to get people to stop printing out emails and sending them around to drive processes – something that I still see in many financial and insurance organizations.

As they looked into their processes, they found that there were many ways to do the same process, when it should be much more structured, and ended up standardizing their processes using Lean methods in order to reduce waste and streamline processes. The result of just looking at the process was a focus on the things that their current systems didn’t do: many of the process aberrations were due to workarounds for lack of system functionality. Also, they saw the need for electronic underwriting files in order to allow collaboration during the underwriting process: as simple as that sounds, many insurance companies are just not there yet. Moving to an electronic file in turn drives the need for BPM: you need something to move that electronic file from one desk to another in order to complete that standardized underwriting process.

Once those two components of technology are in place – electronic underwriting files and BPM – portions of the process can be done by people other than underwriters without decreasing efficiency. They’re just starting to roll this out, but expect to deploy it across the organization later this year. This also provides a base for looking at other ways to be more agile and flexible in their business, not just incremental improvements in their existing processes.

So far, they are seeing improvements in quality that are being noticed by their brokers and customers: policies are being issued right the first time, with less rework and fewer returns, which is critical for their commercial customer base. They’ve also improved their policy renewal and delivery timeframe, which is required to meet commercial insurance regulations. They’re looking forward to their full roll-out later this year, and how this can help them to further improve on their major performance metric, customer satisfaction.

PegaWORLD: SmartBPM at TD Bank Financial Group

Adrian Hopkins from TD’s Visa Systems and Technology group talked about their experiences with Pega, through various major upgrades over the years and now with SmartBPM for multi-channel customer management in their call centers. TD is one of Canada’s “big five” banks, but is also the 6th-largest bank in North America due to its diverse holdings in the US as well as Canada, serving 17 million customers. They’ve been a Pega customer for quite a while; I first wrote about it back in 2006.

8-10% of TD’s workforce – 7,000 employees – are in call centers spread across 23 locations in North America and India, handling 47 million calls per year, hence their need to commoditize the agent role and provide the ability to route any call to any center and have the customer’s questions answered satisfactorily. The key here is service leveling: providing the same level of service regardless of where the call is serviced from, through training, access to technology and information, and scripting. The goals were to improve service levels, increase capacity, and provide opportunities for up-selling by the agents while they have the customer on the call. TD is using BPM automation to achieve some of this – automating fulfillment and integrating disparate systems – plus providing more intuitive processes that require little training.

In one example, they’ve consolidated the content and functionality of 12 different mainframe green screens into a single screen that can be used to handle a majority of the inbound calls; another allows them to process a credit card fraud claim in a single screen and a small number of steps, replacing an overly-complex manual process of 95 steps that involved managing the claim, handling the fraud and replacing the card. In the latter case, they’ve moved to a completely paperless process for handling a fraud claim, and reduced the case handling time from 7 hours down to minutes. Interestingly, they didn’t take away the old green screen methods of doing things when they deployed the new system, since some of the call center old-timers insisted that it was faster for them; however, they gradually removed that access since the new interfaces enforced rules and procedures that were not built into the green screens, generally improving quality of service.

They’re looking at savings and benefits in several areas:

  • Reduce training time: at a 10% attrition rate, saving one week of training per new employee means a savings of $750,000/year (a rough back-of-envelope follows this list)
  • Reduce callback rates, which increases customer satisfaction as well as increasing agent capacity
  • Improve compliance, and reduce the cost of achieving and proving compliance
  • Increase customer-facing time due to less follow-up paperwork, increasing agent capacity
  • Reduce secondary training requirements by guiding agents through complex inquiries, allowing less-skilled agents to handle a wider variety of calls, and reducing handoffs
  • Increase ability to drive incremental sales from every contact, or “would you like fries with that?”, although they’re not yet actively doing this in their implementation
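
As a quick sanity check on the first bullet (my own arithmetic, assuming the 10% attrition applies to the full 7,000-agent workforce), the quoted figure implies a fully loaded cost of roughly $1,070 per new hire per week of training saved:

    # Back-of-envelope on the quoted training saving; the per-week cost is derived,
    # not a figure that TD provided.
    agents = 7_000
    attrition = 0.10                      # roughly 700 new hires to train each year
    new_hires = agents * attrition
    annual_saving = 750_000               # quoted saving for one less week of training per hire
    print(f"implied cost per training week: ${annual_saving / new_hires:,.0f}")   # ~$1,071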

They’re still using SmartBPM 5.3, but are looking forward to some of the new capabilities in version 6, which should reduce the amount of code that they’re writing and allow them to put more control of the business process rules in the hands of the users.

Based on the screen snapshots that we saw, however, they’re still building fairly large desktop applications; this must be impacting their agility, in spite of their Agile approach, even though it is providing a huge benefit over using the green screens directly.

PegaWORLD: Alan Trefler keynote

Weather and Air Canada conspired against me getting to Philadelphia yesterday, but here I am at the opening keynote as Alan Trefler gives us the high-level view of Pega’s progress (including the Chordiant acquisition) and what’s coming up. Pega is one of the longest-standing BPM players, now 27 years old – although not all of that strictly in BPM, I think – which gives them a good perspective of how the industry is changing. My links post this morning was a collection of posts about adaptive case management, dynamic BPM and social BPM, and Pega is part of this trend. In fact, Trefler claimed that many of the other vendors are hopping on the agility bandwagon late (even the BPM bandwagon), or in words only.

He pointed out how many of the pure play vendors have been acquired recently, and sees this as a play by the acquiring companies to reduce choice in the market, and artificially bolster the "x% of companies run our software" claims. In his usual style, he used a giant photo of a shark on the screen behind him to illustrate this point. He made direct hits on Oracle, SAP and IBM with his comments, claiming that if consolidation results in the lowest common denominator – a common level of mediocrity – then the customers will lose out. The acquiring stack vendors end up offering a Frankenstack of products that do not integrate properly (if at all), and that so much custom code is required in order to deploy these that they become the new legacy systems, unchangeable and not able to meet the customer needs, since they require that you change your business in order to fit the software rather than the other way around.

He discussed their approach to case management, stating that a case is a metaphor for whatever you need it to be in your business, not a construct that is pre-defined by a vendor. Like the comments that I saw about the recent Process.gov conference, I think that this is also going to be a conference about adaptive case management (ACM) as well as BPM.

Pega is about to announce a set of managed services; they are already pretty cloud-friendly (the recent demo that I had from them was done on an EC2 instance, for example) since they allow for complete configuration and administration via a browser. They’ve been talking about platform as a service for over a year now, so this isn’t a big surprise, but it’s good to see something concrete rolling out.

He finished up by stating that Pega intends to be the dominant player in the space. They announced on Friday that they’ve added 114 people in the first quarter of 2010, and have just announced 11 consecutive quarters of record revenues. They will continue to invest in R&D in order to achieve and maintain this position.

Judging by the tweets that I’m already seeing, several key BPM analysts are here with me, so expect a lot of good coverage of the conference.

Global 360’s analystView Simulation

It’s the first day of Gartner’s BPM summit in Las Vegas, so expect to see a lot of vendor announcements this week. Some, like Global 360, had the decency to arrange for a briefing for me last week so that I could write something about their announcement in advance; others, who shall remain nameless, waited until Friday afternoon to send me a content-free advance press release that is not worth repeating (although some undoubtedly will). You know who you are.

Global 360’s products are tightly tied to Microsoft platforms, and they use Visio as their business-facing process modeler. Although I have a philosophical problem with not using a shared model approach to process modeling, I’m also realistic enough to know that Visio for process modeling is not going away any time soon. There are some nice things in Visio 2010 that are allowing them to move Visio from a passive role to a more active role, although only in the Premium edition.

Visio 2010 Premium supports BPMN 1.2 with a stencil, and also has a number of ease-of-use enhancements to make it easier to draw process diagrams, such as the ability to easily add connected shapes, auto-alignment, reflowing the process map to vertical or horizontal alignment, and allowing a selected group of elements to be converted to a subprocess. In short, Visio is becoming a competent BPMN modeler, although the key will be how quickly they release BPMN 2.0 support now that the standard is with the finalization task force and can be expected to be released pretty much as it is currently defined. For those of you who aren’t that familiar with the differences between BPMN 1.x and 2.0, there are a number of new event types (some of them will be rarely used, although the non-interrupting boundary events are going to be a big hit), the addition of standardized task types, and most importantly, a serialization format that can be used for process map interchange between tools.
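
To make the interchange point concrete: the BPMN 2.0 serialization means that a process drawn in one tool can be read back, structure intact, by any other compliant tool. The toy process and parsing code below are my own illustration, not output from Visio or analystView, and the namespace shown is the one in the near-final spec, so tools built against earlier drafts may differ.

    # Minimal illustration of BPMN 2.0 XML interchange (my own toy example).
    # Parses a tiny process definition and lists its flow elements.
    import xml.etree.ElementTree as ET

    BPMN_NS = "http://www.omg.org/spec/BPMN/20100524/MODEL"   # near-final spec namespace

    sample = f"""<definitions xmlns="{BPMN_NS}" targetNamespace="http://example.com/claims">
      <process id="claimIntake" isExecutable="false">
        <startEvent id="start"/>
        <userTask id="reviewClaim" name="Review claim"/>
        <endEvent id="end"/>
        <sequenceFlow id="f1" sourceRef="start" targetRef="reviewClaim"/>
        <sequenceFlow id="f2" sourceRef="reviewClaim" targetRef="end"/>
      </process>
    </definitions>"""

    process = ET.fromstring(sample).find(f"{{{BPMN_NS}}}process")
    for element in process:
        tag = element.tag.split("}")[1]                       # strip the namespace for readability
        print(tag, element.get("id"), element.get("name") or "")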

So far, that’s just Microsoft Visio Premium. What Global 360 provides is the analystView plugin to add simulation to these BPMN models right within Visio. This is intended to be simulation “for the masses”: really, we’re talking simulation for the statistically-minded business analyst, although they’ve made the user interface fairly simple, and will include tutorials and interactive help to support the user who is just getting to know simulation. This actually runs discrete events through the model, and can pump out the results to the managerView analytics just as if it were an executing process. It can also do the reverse, taking historical data from managerView and using it as baseline data for the simulation. There are a number of fairly sophisticated simulation functions: data can be simulated through the model; routes can be selected conditionally rather than just by weighted decisions; roles can be used; and specific statistics can be tracked, such as logged events, timed sequences of events, or SLAs for the entire process, an activity or a timed sequence.
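
For anyone who hasn’t worked with discrete event simulation, the core mechanic that analystView wraps in a friendlier interface can be sketched in a few lines: run many instances through the model, route them by weighted (or conditional) decisions, and collect timing statistics against an SLA. This is my own toy model with made-up durations, not Global 360’s engine or data.

    # Toy simulation of a two-path process: weighted routing plus an SLA check.
    # Durations and probabilities are invented; this is not the analystView engine.
    import random

    random.seed(42)
    SLA_MINUTES = 30

    def simulate_instance():
        elapsed = random.uniform(5, 15)            # "review" activity
        if random.random() < 0.3:                  # 30% of cases escalate (weighted decision)
            elapsed += random.uniform(20, 40)      # "escalate" activity
        else:
            elapsed += random.uniform(2, 8)        # "approve" activity
        return elapsed

    durations = [simulate_instance() for _ in range(10_000)]
    breaches = sum(d > SLA_MINUTES for d in durations)
    print(f"mean cycle time: {sum(durations) / len(durations):.1f} min")
    print(f"SLA breaches (>{SLA_MINUTES} min): {breaches / len(durations):.1%}")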

After watching the simulation in action, I’m left with two thoughts: first, it looks quite fully functional, although you’re still going to need some basic statistical background to use it; and second, I’m very glad that they didn’t use little animated running people while the simulation is running, because we’re all just so over that as a user experience. The simulation engine, by the way, is what they acquired from Cape Visions around 2004, and is the same engine used for simulation in the IBM FileNet and Fujitsu BPM products.

A second part of this announcement, also riding on the shoulders of Microsoft, is their SharePoint integration. Process maps created with Visio can be checked in to SharePoint for collaboration; although this can be done with the prior version of SharePoint, the 2010 version allows the process models to be checked in as Web Drawing files, which allows viewing and commenting by non-Visio users, since the built-in Visio Services allow the diagram to be viewed as a web part. A process model in this form is still fully functional: for example, clicking on a subprocess will link to that subprocess, and clicking on an element shows the parameters associated with that element, including the simulation parameters.

When an analystView simulation is run, the simulation data is stored in SharePoint with the process model as XML; although Global 360 hasn’t yet launched the web part that will allow viewing of the simulation data directly in SharePoint, that’s expected to come along within a couple of months.

The critical component here is Visio 2010, which is still in beta, required for the BPMN 1.2 support; SharePoint 2010 is a nice-to-have because it allows non-Visio users to collaborate on process models, but isn’t required for any of the other functionality. Global 360 is hedging on the BPMN 2.0 support, saying that they’re pushing Microsoft to support it as soon as it is available, but if Microsoft doesn’t come through by the end of 2010, they’re going to have to make a move on their own. There’s also the issue of what happens when Microsoft decides that they really want to play in the BPM market, although Global 360 (and many other Microsoft-centric BPM vendors) are so far ahead of them that it would likely take an acquisition to make any of the current BPM vendors lose sleep over it. In the meantime, Microsoft and Global 360 are doing some nice co-marketing, and Microsoft’s Visio website will offer the Global 360 analystView plugin for sale ($349) as well as Visio Premium 2010 ($800).

This isn’t the first thing that Global 360 has done with SharePoint, however: they built 20-30 “ViewParts” that are SharePoint web parts that provide a front-end to the Global 360 process engine, allowing you to assemble a user interface for executing their processes in a SharePoint portal view. They’ve done quite a bit of research into persona-based user interfaces, which has resulted in their viewPoint set of interfaces tuned to each particular type of user: builders, managers and participants. The newly-released analystView is for analyst-type builders, whereas some of the userView applications that I saw last fall are for end users in various roles.

For example, a user’s inbox would show their current task list, a Current Workload feedback widget that tells their supervisor how busy they feel they are, a performance comparison with other users, and a message center for other work-related information. A heads-down transaction processing user’s view could be similar, but with push-type task lists instead of browsable lists. A user can then open a work packet, view any attached documents, and complete the tasks within the packet. A supervisor, on the other hand, would see the managerView, containing SLAs, warnings and reports, and allowing the supervisor to reallocate work and roles.

The designerView, for more technical builders than the Visio tools described above, provides a process modeler with palettes for standard BPMN process elements, but also messaging, document functions, scripting and a variety of other integration functions. The data model for the process is fully exposed and integrated with the process model, something that more BPM vendors are realizing is critical. Comments can be added to each element in a model, and a documentation view then collects all of those comments into a single view. The process models were not fully BPMN compliant when I saw them last fall, although full compliance was planned for early 2010.

Once a process is designed, a builder can create the UI using web parts that are auto-wired when dropped onto the SharePoint canvas. Composite applications can be built using SharePoint or ASP.Net; a number of production-quality starter applications are provided out of the box.

I’ve been waiting for the analystView piece, announced today, to complete this picture: now they have business-facing process modeling and simulation in Visio with analystView, collaboration on the process models in SharePoint, redrawing (or at least enrichment) of the process models in the designerView, and user interface design in SharePoint. The suite feels a bit disjointed, although taking advantage of the penetration of Visio and SharePoint within enterprises could be a huge advantage for Global 360. The major challenges are direct competition from Microsoft at some point in the future, as I discussed previously, and the slow migration of many organizations onto the 2010 versions of the Microsoft platforms required for full functionality. Given that I still have enterprise customers using SharePoint 2003, that could be a while yet.

Business Process Incubator: Another Online BPM Community, But With Standards

BPM standards, I mean. 😉

Yesterday saw the public beta launch of the Business Process Incubator; although this was inadvertently announced by Robert Shapiro during a public webinar last month, it only moved out of closed preview yesterday. I had a briefing from Denis Gagné of Trisotech, one of the driving forces behind BPI, and have had a test account to try it out for the past month.

BPI has a focus on BPM standards, especially BPMN and XPDL, and is intended to be a hub for content and tools related to standards. That doesn’t mean that this is another walled garden of content; rather, a lot of content is mashed in from other locations rather than being published directly on the site. For example, if you search for me on the site, you’ll find links to this blog and to a number of my presentations on Slideshare, plus the ability to rate the content or flag it on a My Interests list. That means that there’s a lot of content available (but not necessarily hosted) on the site from the start, and it’s growing every day as more people link in BPM-related content that they know about.

The site is divided into four main areas:

  • Do, including services for verifying, visualizing, validating, publishing and converting process models in various standard formats. These are premium services available either directly on the site or via an API: you can try them out a few times with a free membership, but they require payment if you use them more than a few times each month.
  • Share, for contributing content such as process models, tools and blogs; this is also used to view process models shared by others.
  • Learn, for viewing the links, blogs, books, training and other content added in the Share section.
  • Tools, for viewing the tools added in the Share section; these are categorized as diagramming, BPMS, BPA, BAM and BRE. Trisotech’s own free BPMN add-in for Visio is here, but is also featured directly on most other pages on the site, something that competing diagramming tools might object to.

Most content on the site can be tagged and rated, allowing the community to provide feedback. There needs to be better integration with other social networks beyond the standard “community share” options on Facebook, Twitter and LinkedIn, and this site just begs for a BPI iPhone app, or at least a mobile version of the site.

Although I like the clean user interface, the categorization takes a bit of getting used to: for example, you add both content and tools in the Share section, but you view the links to content in Learn and the links to tools in Tools. Furthermore, you both contribute and view process models in the Share section; this appears to be the only type of contribution that is viewed in Share rather than another section. Also, the distinctions between some of the functions in the Do section are a bit esoteric: most users, for example, may not make the distinction between Transform (which is an XML transformation) and Convert, since both turn a file of one type into another type. Similarly, Verify ensures that the file is a BPMN file based on the schema, whereas Validate ensures that there are no syntax errors in the BPMN file.
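
The Verify/Validate split maps roughly onto the general distinction between checking a file against a schema and checking that the file itself is syntactically sound. A generic sketch of the two checks is below; it uses lxml with placeholder file names, and has nothing to do with how BPI actually implements its services.

    # Generic illustration of the two kinds of check (not BPI's code).
    # "model.bpmn" and "bpmn20.xsd" are placeholder file names.
    from lxml import etree

    def is_well_formed(path):
        """Syntax check: does the file even parse as XML?"""
        try:
            etree.parse(path)
            return True
        except etree.XMLSyntaxError:
            return False

    def conforms_to_schema(path, xsd_path):
        """Schema check: is it structurally a valid document for this schema?"""
        schema = etree.XMLSchema(etree.parse(xsd_path))
        return schema.validate(etree.parse(path))

    if is_well_formed("model.bpmn"):
        print("schema-valid:", conforms_to_schema("model.bpmn", "bpmn20.xsd"))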

Although vendors can participate in the community as partners, it is vendor-independent. Rather than vendor sponsorships, the site is monetized through a membership model that allows access to most of the content for free, but requires a $300/year premium membership for unrestricted access to premium features, such as process model validation and translation services. In that way, the bulk of the site revenue is expected to come from corporate end-user organizations that use a combination of free and premium memberships for their users, and can sign up for a corporate membership that gives them four premium memberships plus 50% off any additional ones. End-user organizations are becoming more aware of the value of BPM standards, and understand the value proposition of a standard notation when using process models to communicate broadly within their organization; BPI will help them to learn more about BPM standards as well as being a general resource for BPM information.

Businesses can have their own page on the site using a custom URL, fancy it up with their own logo and business description, and list all of the site content that belongs to them, whether links to tools, blogs or other content. Partner pages are free, but are monetized by referral or commission fees on any RFI/RFQs, services, training or paid content offered via those pages.

The cloud-based functions offered in the Do section are also available through a public API for vendors to include directly or white-label in their own offerings; although monetization for this wasn’t settled as of last month, it would be possible to do it through an API key, much like other public APIs. Both APIs and a toolbar are available for including BPI content and functions on another site.
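
No API details were shared in the briefing, so purely to illustrate the general pattern of a keyed call to a hosted validation service, here is a hypothetical example: the endpoint URL, field names and response format are all invented and almost certainly don’t match BPI’s actual API.

    # Hypothetical call to a hosted model-validation service with an API key.
    # Endpoint, parameters and response shape are invented for illustration only.
    import requests

    API_KEY = "your-key-here"                                 # assumed to come with a premium membership
    ENDPOINT = "https://api.example.com/validate"             # placeholder URL

    with open("model.bpmn", "rb") as model_file:
        response = requests.post(
            ENDPOINT,
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"model": model_file},
        )

    response.raise_for_status()
    print(response.json())                                    # e.g. a list of validation issues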

Partners are already ramping up on the site, and by fall, BPI will be in general availability for all members. There’s now quite a bit of choice in BPM online communities: in addition to all the BPM-themed social networking sites and discussion groups, there are now several public communities offering tools and functionality specific to BPM, such as BPM Blueworks and ARISalign. Gagné sees BPI as complementary to and partnering with those sites – for example, those sites could have a partner page, as BPM Institute does – since it augments the other sites’ content with standards-focused materials. BPI’s openness via APIs and a toolbar allows it to be added as a BPM community from another site, and will likely see many referrals from BPM vendors who don’t want to build their own community site, but like the idea of participating in one that’s vendor-neutral. Although BPI is focused on BPM standards, the open platform gives it the potential to grow into a full BPM social networking site with a broad variety of content.

By the way, as your reward for reading this entire post, here’s a link to get a free premium membership. Enjoy!