TIBCO BPM Product Update and Strategy

The TUCON 2012 keynotes are done, and all of my analyst meetings have finished, so I’m free to attend some of the breakout sessions. This afternoon, I went to the BPM update with Roger King (director of BPM product strategy) and Justin Brunt (BPM product manager).

The current AMX BPM release is 1.3.1, with 2.0.0 coming in November, bringing a number of enhancements:

  • Some impressive performance statistics within a single engine: 20,000 simultaneous users opening and completing work items, and management of 1.6M process instances and 18.8M activities in a 10-hour working day.
  • Better administrative tools for managing halted process instances, including being able to examine the instance payload.
  • Worklist views can now be server-based, so that a filtered view of a work list is passed to the client rather than the entire list for client-side filtering.
  • Pageflow debugging and testing tools.
  • Calendaring to control deadlines and work assignment. This allows different calendars to be created for different business areas, each with its own time zone, holiday schedule and working hours. These calendars are then used to calculate deadlines for work assigned to a business unit that references that calendar. Calendars can be assigned to work at design time or dynamically at runtime (see the sketch after this list).
  • Changes to the in-process event handling, with improved support for event handler patterns including non-interrupting boundary events for catching external thrown events.
  • Enhancements to work item deadline handling and priority management.
  • Immediate reply with process ID when starting a process instance using a web service.
  • Optimization of large forms when they are used as application interfaces.
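
Of these, the calendaring item is the most algorithmic, so here’s a minimal sketch of how business-calendar deadline calculation generally works. This is not the AMX BPM API – the class and rules are invented – it just illustrates adding working time while skipping nights, weekends and holidays (a real calendar would also carry its time zone):

```java
import java.time.DayOfWeek;
import java.time.Duration;
import java.time.LocalDate;
import java.time.LocalDateTime;
import java.time.LocalTime;
import java.util.EnumSet;
import java.util.Set;

// Hypothetical business calendar: working hours, weekend days and holidays.
// Illustration only; AMX BPM's actual calendar API is not shown here.
public class BusinessCalendar {
    private final LocalTime workStart;
    private final LocalTime workEnd;
    private final Set<DayOfWeek> weekend = EnumSet.of(DayOfWeek.SATURDAY, DayOfWeek.SUNDAY);
    private final Set<LocalDate> holidays;

    public BusinessCalendar(LocalTime workStart, LocalTime workEnd, Set<LocalDate> holidays) {
        this.workStart = workStart;
        this.workEnd = workEnd;
        this.holidays = holidays;
    }

    private boolean isWorkingDay(LocalDate day) {
        return !weekend.contains(day.getDayOfWeek()) && !holidays.contains(day);
    }

    // Add working minutes to a start time, rolling over nights, weekends and holidays.
    public LocalDateTime addWorkingMinutes(LocalDateTime start, long minutesLeft) {
        LocalDateTime t = start;
        while (minutesLeft > 0) {
            // Outside working time: jump to the start of the next working day.
            if (!isWorkingDay(t.toLocalDate()) || !t.toLocalTime().isBefore(workEnd)) {
                t = LocalDateTime.of(t.toLocalDate().plusDays(1), workStart);
                continue;
            }
            if (t.toLocalTime().isBefore(workStart)) {
                t = LocalDateTime.of(t.toLocalDate(), workStart);
            }
            long available = Duration.between(t.toLocalTime(), workEnd).toMinutes();
            long step = Math.min(available, minutesLeft);
            t = t.plusMinutes(step);
            minutesLeft -= step;
        }
        return t;
    }

    public static void main(String[] args) {
        // A calendar for one business unit: 9-5, with one holiday defined.
        BusinessCalendar cal = new BusinessCalendar(
                LocalTime.of(9, 0), LocalTime.of(17, 0),
                Set.of(LocalDate.of(2012, 11, 22)));
        // 16 working hours starting Friday afternoon lands Tuesday, not Saturday.
        System.out.println(cal.addWorkingMinutes(
                LocalDateTime.of(2012, 11, 16, 15, 0), 16 * 60));
    }
}
```

The point of per-business-unit calendars is exactly this kind of arithmetic: the same 16-hour SLA produces different concrete deadlines depending on which calendar the work references.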

This is a fairly long list of mostly minor features, many of them added specifically to address customer requests; the release gets the x.0 numbering mainly to signal that the product is moving from an early-stage to a stable version.

Going forward, they’re looking at satisfying the needs of the initial customers and core market, then adding features for pure BPMS functionality and the intelligent business platform. The high-priority candidates include a number of performance enhancements, plus improvements to single sign-on and multi-tenancy. Medium priorities include enhanced form control support; getting BPM event data into external systems by defining custom events and including business data in externally published events; enhanced access control; enhanced web services; and greater control over process instance data purging.

In order to compete better in the BPMS market, they’re looking at updating their standards compliance (BPMN 2.0 and CMIS), and providing a case management offering. Case management will include an adaptive end-user interface, global case data and content management integration, improved integration with social streams including tibbr and Nimbus, and plan-based process management (i.e., using a Gantt chart style interface). They will be moving onto the Silver Marketplace structure for public cloud, which I know will be exciting for a number of customers.

Looking at the intelligent business platform functionality, their vision includes event intelligence and the use of realtime data to inform process execution. There will also be enhancements to their AMX Decisions rules product, which is pretty rudimentary right now.

iProcess is still alive and well, with a list of performance and feature enhancements, but they’re certainly not encouraging new development on iProcess. However, they do not appear to be throwing the existing iProcess customers under the bus by sunsetting the product any time soon.

BPM systems are becoming complete development environments for enterprise application development, and any BPMS needs to offer a full suite of capabilities for application developers as well as business analyst and end-user tools. AMX BPM is continuing to build out its feature set, in part by integrating with other products in the TIBCO portfolio, and it offers a fairly complete set of functions as it moves into the version 2.x product cycle. The challenge for them is not so much winning new customers, which they are now well-positioned to do, but convincing the existing iProcess customers to redevelop their iProcess applications on AMX BPM, or at least to start new application development on AMX BPM.

TIBCO TUCON2012 Day 2 Keynotes: Consumerization of IT

We’re on the last of the four themes for TUCON, with Matt Quinn kicking off the session on the consumerization of enterprise IT. It’s a telling sign that many vendors now refer to their products as being like “Facebook/Twitter/iTunes/<insert popular consumer software here> for the enterprise” – enterprise app vendors definitely have consumer app envy, and TIBCO is no exception. As we saw in the earlier keynote session about tibbr, TIBCO is offering a lot of functionality that mimics (and extends) successful consumer software, and they’re providing Silver Mobile Server as a way to put all of that enterprise functionality that you build using TIBCO products out onto mobile devices. We saw a demo of an app built using Silver Mobile Server for submitting and managing auto insurance claims, and it looks like the platform is pretty capable, both in using the native device capabilities and in linking directly to back-end processes.

They showed some new enhancements to Silver Fabric for private cloud provisioning and management, and discussed their public cloud applications (tibbr, Spotfire, Loyalty Lab) and public Silver Marketplace functionality. Today, they announced Silver Integrator, running on Silver Marketplace, providing enterprise-class integration services on the public cloud. We saw a brief demo of Silver Integrator: it launches a cloud version of the TIBCO Designer with some additional palettes for cloud connectors such as Facebook, Salesforce and REST services.

Being able to extend enterprise applications onto mobile devices and into the cloud is a critical capability for the consumerization of IT, and TIBCO (as well as other vendors) is offering it. The problem of adoption, however, is usually not about product capabilities but about organizational culture: there is a lot of resistance to this trend not in the user community, but within IT. I saw a graphic and blog post by Dion Hinchcliffe of Dachis Group today about the major shifts happening in IT, and one thing that he wrote was especially impactful:

Never in my two decades of experience in the IT world have I seen such a disparity between where the world is heading as a whole and the technology approach that many companies are using to run their businesses.

Cloud, mobile, social, consumerization and big data: we’re all doing it in the consumer space, but IT departments are continuing to drag their feet at bringing this into the enterprise. The organizations that fail to embrace this are going to fall further behind in their ability to serve customers effectively and to innovate, and will suffer for it.

Quinn wrapped up with a list of their product announcements, including two new policy-driven governance products as part of the ActiveMatrix suite: AMX Service Gateway for providing enterprise services outside the firewall, and AMX Policy Director for managing security, auditing and logging rules for services. He covered their AMX BPM 2.0 release announcement briefly, with new functionality for work assignment and scheduling, case management, and page flow debugging, plus some new capabilities in Nimbus Control and FormVine.

That’s it for the keynotes, since the rest of today and all of tomorrow is breakout sessions where we can dig into the details of the products and how they’re being used by customers. I’ll be heading to the BPM product update this afternoon by Roger King and Justin Brunt, and will probably also drop in on the tibbr product strategy session to see more details of what we heard this morning.

I’ve been asked to step in for a last-minute cancellation and will be presenting tomorrow morning at 10am on the process revolution (https://column2.com/2012/09/tibco-tucon2012-you-say-you-want-a-process-revolution/), which is moving beyond implementing BPM just for cost savings to look at new business process metrics such as maximizing customer satisfaction. If you’re here at TUCON, come on out to the session tomorrow morning.

TIBCO TUCON2012: You Say You Want A (Process) Revolution?

I’ve been asked to fill in for a last-minute cancellation at a breakout session here tomorrow (Thursday) morning, and keeping with the retro music theme that we’ve been hearing here at TUCON, we’re going to have a process revolution:

Process Revolution: Turn Operational Efficiency into Operational Advantage

When starting a business process management (BPM) program, the first step is to achieve operational efficiency and its benefits: cut costs, improve agility through process automation, and ensure regulatory compliance with process transparency. What are the next steps? In this session, learn how you can take your BPM program to the next level. We’ll showcase strategies you can use to not only support process innovation, but ensure every opportunity is a potential source of revenue generation and increased customer satisfaction. Additionally, we’ll outline how our next-generation BPM fits into the overall event-enabled enterprise.

Rachel Brennan, TIBCO’s BPM product marketing manager, will be up there with me to cover the TIBCO-specific parts about their AMX BPM product, while I discuss the main theme of moving from implementing BPM for cost and compliance reasons to a focus on other criteria such as customer satisfaction.

It’s at 10am, but I’m not sure of the room: the original presentation was added late enough that it didn’t make the printed schedule, the online schedule shows the session but not the location, and the app is missing it completely (for now).

Update: the session is on Thursday from 10:00-10:50am in Juniper 1.

Update 2: we’re now in the app!

TIBCO TUCON2012 Day 2 Keynotes: More Big Data and Social Collaboration

Yesterday, the keynotes talked about customer experience management and big data; this morning, we continued on the big data theme with John Shewell of McKesson Enterprise Intelligence talking about the issues in US healthcare. Costs are increasing, and are far beyond those of other industrialized countries, yet the US does not have better healthcare than many other countries: it ranks 50th in life expectancy, and 30th in infant mortality rate. Most of the healthcare spending is on a small percentage of the population, often to treat chronic conditions that are preventable and/or treatable. Some portion of the healthcare spend is waste, to the tune of an estimated $700B per year, some of which can be eliminated by ensuring that standard procedures are followed by hospitals, physicians, pharmacies and other healthcare professionals. McKesson, as a provider of healthcare information and systems, has systems in place with hospitals and physicians that can collect information about healthcare procedures as they are executed, but the move to electronic health records is still ongoing and a lot of the data is fairly unstructured, presenting challenges in mining the data for opportunities to improve procedural compliance and the quality of care.

Historically, healthcare data was in silos, making it difficult to get a holistic view of a patient. In the US, the only place where all the data came together was at the health insurance company (if you had health insurance, of course), and that might be several weeks after the event. If follow-up care was required after a hospital visit, for example, that information didn’t pop up anywhere, since it fell through the cracks in responsibility. One change that can improve this is to align incentives between the provider, payer and patient, so that it’s to everyone’s benefit if the patient is not readmitted due to a missed follow-up appointment. It’s also important to manage patients earlier to detect and avoid problems before they occur. Big data can help with all of these by detecting patterns and ensuring procedural compliance. In closing, he pointed out that this is not a government issue: it needs to be fixed by the industry.

We moved on to the social collaboration theme, with Ram Menon, TIBCO’s president of social computing, talking about tibbr: as an enterprise social networking platform, it’s positioned as a “social bus”, much as TIBCO’s earlier technology success was built on the enterprise message bus. In 18 months, tibbr has grown to be used by over a million people – more than half using smartphones – in 104 countries. TIBCO’s heritage with events and messaging is essential to this success, because tibbr isn’t just about following people, it’s also about following physical devices/items, business processes, files and applications. Earlier this year, they launched tibbrGEO, which has physical locations pushing information to nearby people based on their profiles.

Menon was joined briefly by Hervé Coureil, CIO of Schneider Electric, then Jay Grant, Secretary General of InterPortPolice to talk about how they are using tibbr for social networking within and across organizations. He then announced tibbr 4, to be released within a few weeks, with a number of new features:

  • Social Profile – presenting a view of yourself to your colleagues (think LinkedIn)
  • Peer Influence – the impact that you have on the things with which you interact (think Klout)
  • tibbr Insights – social analytics, showing a summary of what’s happening in your social network, including both activities and requests waiting for action

We saw a demo of tibbr, which presents a Facebook-like interface for seeing updates from your social graph, but also allows something very similar to Facebook pages for other entities, such as customers. From a CRM standpoint, this allows all information about the customer to be surfaced in one place: a single pane of glass through which to view a customer.

tibbr 4 also provides a social graph API that allows the social graph being collected within tibbr to be accessed from other applications, as well as to provide add-on functionality to tibbr, plus a marketplace for these applications that allows tibbr users to add them to their own tibbr environment. They have some new partners offering applications for tibbr right now: box for cloud content storage and sharing; Wayin for surveys and sentiment analysis; Badgeville for engagement through gamification and incentives; and several others including Whodini, ManyWorlds, BrightIdea, Teamly and FileBoard.

TIBCO Corporate and Technology Analyst Briefing at TUCON2012

Murray Rode, COO of TIBCO, started the analyst briefings with an overview of technology trends (as we heard this morning, mobile, cloud, social, events) and business trends (loyalty and cross-selling, cost reduction and efficiency gains, risk management and compliance, metrics and analytics) to create the four themes that they’re discussing at this conference: digital customer experience, big data, social collaboration, and consumerization of IT. TIBCO provides a platform of integrated products and functionality in five main areas:

  • Automation, including messaging, SOA, BPM, MDM, and other middleware
  • Event processing, including events/CEP, rules, in-memory data grid and log management
  • Analytics, including visual analysis, data discovery, and statistics
  • Cloud, including private/hybrid model, cloud platform apps, and deployment options
  • Social, including enterprise social media, and collaboration

A bit disappointing to see BPM relegated to being just a piece of the automation middleware, but important to remember that TIBCO is an integration technology company at heart, and that’s ultimately what BPM is to them.

Taking a look at their corporate performance, they have almost $1B in revenue for FY2011, showing growth of 44% over the past two years, with 4,000 customers and 3,500 employees. They continue to invest 14% of revenue into R&D with a 20% increase in headcount, and significant increases in investment in sales and marketing, which is pushing this growth. Their top verticals are financial services and telecom, and while they still do 50% of their business in the Americas, EMEA is at 40%, with APJ making up the other 10% and showing the largest growth. They have a broad core sales force, but have dedicated sales forces for a few specialized products, including Spotfire, tibbr and Nimbus, as well as for vertical industries.

They continue to extend their technology platform through acquisitions and organic growth across all five areas of the platform functionality. They see the automation components as being “large and stable”, meaning we can’t expect to see a lot of new investment here, while the other four areas are all “increasing”. Not too surprising considering that AMX BPM was a fairly recent and major overhaul of their BPM platform and (hopefully) won’t need major rework for a while, and the other areas all include components that would integrate as part of a BPM deployment.

Matt Quinn then reviewed the technology strategy: extending the number of components in the platform as well as deepening the functionality. We heard about some of this earlier, such as the new messaging appliances and Spotfire 5 release, some recent releases of existing platforms such as ActiveSpaces, ActiveMatrix and Business Events, plus some cloud, mobile and social enhancements that will be announced tomorrow so I can’t tell you about them yet.

We also heard a bit more on the rules modeling that I saw before the sessions this morning: it’s their new BPMN modeling for rules. This uses BPMN 1.2 notation to chain together decision tables and other rule components into decision services, which can then be called directly as tasks within a BPMN process model, or exposed as web services (SOAP only for now, but since ActiveMatrix now supports REST/JSON, I’m hopeful that will follow here too). Sounds a bit weird, but it actually makes sense when you think about how rules are formed into composite decision services.
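
To make that chaining idea concrete, here’s a minimal sketch of decision tables composed into a decision service. This is not TIBCO’s API – all names and rules are invented – it only illustrates how individual rule components can be sequenced into a composite decision, in the spirit of the BPMN-style chaining described above:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Function;

// Hypothetical illustration of chaining decision tables into a decision
// service; not TIBCO's rule modeling API.
public class DecisionChain {

    // A decision step maps a set of named facts to an enriched set of facts.
    interface Decision extends Function<Map<String, Object>, Map<String, Object>> {}

    // First "table": score a customer from their annual spend.
    static final Decision scoreCustomer = facts -> {
        double spend = (double) facts.get("annualSpend");
        facts.put("riskScore", spend > 10_000 ? "LOW" : "HIGH");
        return facts;
    };

    // Second "table": pick a credit limit from the risk score.
    static final Decision assignLimit = facts -> {
        facts.put("creditLimit", "LOW".equals(facts.get("riskScore")) ? 5_000 : 500);
        return facts;
    };

    // The composite decision service chains the tables in order, like a
    // BPMN-sequenced decision flow, and could itself be exposed as a service.
    static Map<String, Object> decide(Map<String, Object> facts, List<Decision> chain) {
        for (Decision d : chain) {
            facts = d.apply(facts);
        }
        return facts;
    }

    public static void main(String[] args) {
        Map<String, Object> facts = new HashMap<>();
        facts.put("annualSpend", 12_500.0);
        System.out.println(decide(facts, List.of(scoreCustomer, assignLimit)));
        // e.g. {annualSpend=12500.0, riskScore=LOW, creditLimit=5000}
    }
}
```

Seen this way, a decision service is just a pipeline over a fact set, which is why exposing it either as a process task or as a web service is a natural fit.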

There was a lot more information about a lot more products, and then my head exploded.

Like others in the audience, I started getting product fatigue, and just picking out details of products that are relevant to me. This really drove home that the TIBCO product portfolio is big and complex, and the briefing might benefit from being split into a few separate analyst sessions with some sort of product grouping, although there is so much overlap and integration between product areas that I’m not sure how they would sensibly split it up. Even for my area of coverage, there was just too much information to capture, much less absorb.

We finished up with a panel of the top-level TIBCO execs, and the first question was about how the sales force can even start to comprehend the entire breadth of the product portfolio in order to be successful selling it. This isn’t a problem unique to TIBCO: broad-based platform vendors such as IBM and Oracle have the same issue. TIBCO’s answer: specialized sales force overlays for specific products and industry verticals, and selling solutions rather than individual products. Both of those work to a certain extent, but often solutions end up being no more than glorified templates developed as sales tools rather than actual solutions, and can lead to more rather than less legacy code.

Because of the broad portfolio, there’s also confusion in the customer base, many of whom see one TIBCO product and have no idea of everything else that TIBCO does. Since TIBCO is not quite a household name like IBM or Oracle, companies don’t necessarily know that TIBCO has other things to offer. One of my banking clients, on hearing that I am at the TIBCO conference this week, emailed “Heard of them as a player in the Cloud Computing space. What’s different or unique about them vs others?” Yes, they play in the cloud. But that’s hardly what you would expect a bank (that uses very little cloud infrastructure, and likely has some TIBCO products installed somewhere) to think of first when you mention TIBCO.

TIBCO TUCON2012 Day 1 Keynotes, Part 2: Big Honking Data

Back from the mid-morning break, CMO Raj Verma shifted gears from customer experience management to look at one of the other factors introduced in the first part of the session: big data.

Matt Quinn was back to talk about big data: in some ways, this isn’t new, since there has been a lot of data within enterprises for many years. What’s changed is that we now have the tools to deal with it, both in place and in motion, to find the patterns hiding within it through cleansing and transformation. He made a sports analogy: a game is not just about the final score, but about all of the events that make up the entire game; similarly, it is no longer sufficient to just measure outcomes in business transactions, you have to monitor patterns in the event streams and combine that with historical data to make the best possible decisions about what is happening right now. He referred to this combination of event processing and analytics as closing the loop between data in motion and data at rest. TIBCO provides a number of products that combine to handle big data: not just CEP, but ActiveSpaces (the in-memory data grid) to enable realtime processing, and Spotfire for visual analytics and integration with Hadoop.
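
As an illustration of that “closing the loop” idea – purely my own sketch, not anything TIBCO showed – here is a minimal example that scores a stream of events (data in motion) against a baseline computed from history (data at rest):

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Illustrative-only sketch of combining a live event stream with a
// historical baseline; no TIBCO API is used or implied here.
public class EventLoopMonitor {
    private final double historicalMean;    // from the data-at-rest side (e.g. a warehouse)
    private final double historicalStdDev;
    private final Deque<Double> window = new ArrayDeque<>();
    private final int windowSize;

    public EventLoopMonitor(double mean, double stdDev, int windowSize) {
        this.historicalMean = mean;
        this.historicalStdDev = stdDev;
        this.windowSize = windowSize;
    }

    // Called for each event in the stream; returns true when the recent
    // pattern drifts well outside the historical baseline.
    public boolean onEvent(double value) {
        window.addLast(value);
        if (window.size() > windowSize) window.removeFirst();
        double recentMean = window.stream()
                .mapToDouble(Double::doubleValue).average().orElse(0);
        return Math.abs(recentMean - historicalMean) > 3 * historicalStdDev;
    }

    public static void main(String[] args) {
        // Baseline mean 100, std dev 5, computed offline from history.
        EventLoopMonitor m = new EventLoopMonitor(100.0, 5.0, 10);
        double[] stream = {101, 99, 140, 150, 160, 155, 150, 145, 160, 150};
        for (double v : stream) {
            if (m.onEvent(v)) System.out.println("pattern drift detected at value " + v);
        }
    }
}
```

The baseline comes from analytics over stored data; the detection happens per event as it arrives, which is the point Quinn was making about acting on what is happening right now rather than reporting on it later.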

We saw a demo of LogLogic, recently acquired by TIBCO, which provides analytics and event detection on server logs. This might sound like a bit of a boring topic, but I’m totally on board with this: too many companies just turn off logging on their servers because it generates too many events that they can’t do anything with, and it impacts performance since logging is done on the operational server. LogLogic’s appliance can collect enormous amounts of log data, detect unusual events based on various rules, and integrate with Spotfire for visualization of potential security threats.

Mark Lorion, CMO for TIBCO Spotfire, came up to announce Spotfire 5, with a complete overhaul to the analytics engine, and including the industry’s first enterprise runtime for the R statistical language, providing 10 times the performance of the open source R project for predictive analytics. Self-service predictive analytics, ftw. They are also going beyond in-memory, integrating with Teradata, Oracle and Microsoft SQL Server for in-database analysis. With Teradata horsepower behind it – today’s announcement of Spotfire being optimized for in-database computation on Teradata – you can now do near-realtime exploration and visualization of some shocking amounts of data. Brad Hopper gave us a great Spotfire demo, not something that most TUCON attendees are used to seeing on the main stage.

Rob Friel, CEO of PerkinElmer, took the stage to talk about how they are using big data and analytics in their scientific innovations in life sciences: screening patient data, environmental samples, human genomes, and drug trials to detect patterns that can improve quality of life in some way. They screened 31 million babies born last year (one in four around the globe) through the standard heel-prick blood test, and detected 18,000 with otherwise undiagnosed disorders that could be cured or treated. Their instrumentation is key in acquiring all the data, but once it’s there, tools such as Spotfire empower their scientists to discover and act on what they find in the data. Just as MGM Grand is delivering unique experiences to each customer, PerkinElmer is trying to enable personalized health monitoring and care for each patient.

To wrap up the big data section, Denny Page, TIBCO’s VP of Engineering, came on stage with his new hardware babies: an FTL message switch and an EMS appliance, both to be available by the end of November 2012.

For the final part of the day 1 keynotes, we heard from an innovators’ panel of Scott McNealy (founder of Sun Microsystems, now chairman of Wayin), Tom Siebel (founder of Siebel Systems, now at C3 Energy where they are using TIBCO for energy usage analytics), Vivek Ranadivé, and KR Sridhar (CEO of Bloom Energy), chaired by David Kirkpatrick. Interesting and wide-ranging discussion about big data, analytics, sentiment analysis, enterprise social media, making data actionable, the internet of things and how a low barrier to platform exit drives innovation. The panel thinks that the best things in tech are yet to come, and I’m in agreement, although those who are paranoid about the impact of big data on their privacy should be very, very afraid.

I’ll be blogging from the analyst event for the rest of the day: we have corporate and technology briefings from the TIBCO execs plus some 1:1 sessions. No pool time for me today!

TIBCO TUCON2012 Day 1 Keynotes, Part 1

The keynotes started with TIBCO’s CEO, Vivek Ranadivé, talking about the forces driving change: a massive explosion of data (big data), the emergence of mobility, the emergence of platforms, the rise of Asia (he referenced the Gangnam Style video, although did not actually do the dance), and how math is trumping science (e.g., the detection and exploitation of patterns). The ability to harness these forces and produce extreme value is a competitive differentiator, and is working for companies like Apple and Amazon.

Raj Verma, TIBCO’s CMO, was up next, continuing the message of how fast things are changing: more iPhones were sold over the past few days than babies were born worldwide, and Amazon added more computing capacity last night than they had in total in 2001. He (re)introduced their concept of the two-second advantage – the right information a little bit before an event is worth infinitely more than any amount of information after the event – enabled by an event-enabled enterprise (or E3, supported by, of course, TIBCO infrastructure). Regardless of whether or not you use TIBCO products, this is a key point: if you’re going to exploit the massive amounts of data being generated today in order to produce extreme value, you’re going to need to be an event-enabled enterprise, responding to events rather than just measuring outcomes after the fact.

He discussed the intersection of four forces: cloud, big data, social collaboration and mobility. This is not a unique message – every vendor, analyst and consultant is talking about this – but he dug into some of these in detail: mobile, for example, is no longer discretionary, even (or maybe especially) in countries where food and resources are scarce. The four of these together overlap in the consumerization of IT, and are reshaping enterprise IT. A key corporate change driven by these is customer experience management: becoming the brand that customers think of first when the product class is mentioned, and turning customers into fans. Digital marketing, properly done, turns your business into a social network, and turns customer management into fan management.

Matt Quinn, CTO, continued the idea of turning customers into fans, and solidifying customer loyalty. To do this, he introduced TIBCO’s “billion dollar backend” with its platform components of automation, event processing, analytics, cloud and social, and hosted a series of speakers on the subject of customer experience management.

We then heard from a customer, Chris Nordling, EVP of Operations and CIO of MGM Resorts and CityCenter, which uses TIBCO for its MLife customer experience management/loyalty program. Their vision is to track everything about you, from your gambling wins/losses to your preferences in restaurants and entertainment, and use that to build personalized experiences on the fly. By capturing the flow of big data and responding to events in realtime, the technology provides their marketing team with the ability to make a zero-friction offer to each customer individually before they even know that they want something: offering reduced entertainment tickets just as you’re finishing a big losing streak at the blackjack tables, for example. It’s a bit creepy, but at the same time, has the potential to provide a better customer experience. Just a bit of insight into what they’re spending that outrageous $25/day resort fee on.

Quinn came back to have a discussion with one of their “loyalty scientists” (really??) about Loyalty Lab, TIBCO’s platform/service for loyalty management, which is all about analyzing events and data in realtime, and providing “audience of one” service and offerings. Traditional loyalty programs were transaction-based, but today’s loyalty programs are much more about providing a more holistic view of the customer. This can include not just events that happen in a company’s own systems, but include external social media information, such as the customer’s tweets. I know all about that.

Another customer, Rick Welts of the Golden State Warriors (who, ironically, play at Oracle Arena) talked about not just customer loyalty management, but the Moneyball-style analytics that they apply to players on a very granular scale: each play of each game is captured and analyzed to maximize performance. They’re also using their mobile app for a variety of customer service initiatives, from on-premise seat upgrades to ordering food directly from your seat in the stadium.

Mid-morning break, and I’ll continue afterwards.

As an aside, I’m not usually wide awake enough to get much out of the breakfast-in-the-showcase walkabout, but this morning prior to the opening sessions, I did have a chance to see the new TIBCO decision services integrated into BPM, also available as standalone services. Looked cool, more on that later.

Enterprise BPM Webinar Q&A Followup

I know, two TIBCO-related posts in one day, but I just received the link to the replay of the Enterprise BPM webinar that I did for TIBCO last week, along with the questions that we didn’t have time to answer during the webinar, and wanted to summarize here. First of all, my slides:

These were the questions that came in during the webinar via typed chat that are not related to TIBCO or its products; I think that we covered some of these during the session but will respond to all of them here.

Is it possible to implement BPM (business process management) without a BPMS?

How to capture process before/without technology?

These are both about doing BPM without a BPMS. I wrote recently about Elevations Credit Union (the fact that they are an IBM customer is completely immaterial in this context), which gained a huge part of its BPM success long before touching any technology. Basically, they carved out some high-level corporate goals related to quality, modeled their value streams, then documented their existing business processes relative to those value streams. Every business process had to fit into a value stream (which was in turn related to a corporate goal), or else it didn’t survive. They saw how processes touched various different groups, and where the inefficiencies lay, and they did all of this using manual mapping on whiteboards, paper and sticky notes. In other words, they used the management discipline and methodology side of BPM before they (eventually) selected a tool for collaborative process modeling, which then helped them to spread the word further in their organization. There is a misperception in some companies that if you buy a BPMS, your processes will improve, but you really need to reorient your thinking, management and strategic goals around your business processes before you start with any technology, or you won’t get the benefits that you are expecting.

In enterprises that do not have SOA implemented horizontally across the organization, how can BPM be leveraged to implement process governance in the LOB silos, yet have enterprise control?

A BPM center of excellence (CoE) would be the best way to ensure process governance across siloed implementations. I wrote recently about a presentation where Roger Burlton spoke about BPM maturity; at the end, he had some advice for organizations that are only at level 1 or 2 in process maturity (which, if you’re still very siloed, you probably are): get a CoE in place and target it more at change initiatives than governance. However, you will be able to leverage the CoE to put standards in place, provide mentoring and training, and eventually build a repository of reusable process artifacts.

I work in the equipment finance industry. Companies in this space are typically classified as banks/bank-affiliates, captives and independents. With a few exceptions it’s my understanding that this particular industry has been rather slow at adopting BPMS. Have you noticed this in other industries and, if so, what do you see as being the “tipping point” for greater BPMS adoption rates? Does it ultimately come down to a solid ROI, or perhaps a few peer success stories?

My biggest customers are in financial services and insurance, so they’re also fairly conservative. Insurance, in particular, tends to adopt technology at the very end of the adoption tail. I have seen a couple of factors that can speed up a slower-moving adoption of any sort of technology, not just BPMS: first, if they just can’t do business the old way any more, and have to adopt the new technology. An example of this was a business process outsourcer for back-office mutual fund transactions that started losing bids for new work because it was actually written into the RFP that they had to have “imaging and workflow” technology rather than paper-based processes. Second, if they can’t change quickly enough to be competitive in the market, which is usually the case when many of their competitors have already started using the technology. So, yes, it does come down to a solid ROI and some peer success stories, but in many cases, the ROI is one of survival rather than just incremental efficiency improvements.

Large scale organizations tend to have multiple BPM / workflow engines. What insights can you share to make these different engines in different organizational BUs into an enterprise BPM capability?

Every large organization that I work with has multiple BPMS, and this is a problem that they struggle with constantly. Going back to the first question, you need to think about both sides of BPM: it’s the management discipline and methodology, then it’s the technology.  The first of these, which is arguably the one with the biggest impact, is completely independent of the specific BPMS that you’re using: it’s about getting the organization oriented around processes, and understanding how the end-to-end business processes relate to the strategic goals. Building a common BPM CoE for the enterprise can help to bring all of these things together, including the expertise related to the multiple BPM products. By bringing them together, it’s possible to start looking at the target use cases for each of the systems currently in use, and selecting the appropriate system for each new implementation. Eventually, this may lead to some systems being replaced to reduce the number of BPMS used in the organization overall, but I rarely see large enterprises without at least two different BPMS in use, so don’t be fanatical about getting it down to a single system.

Typically, what is the best order to implement: BPM first and SOA last, or vice versa?

I recommend a hybrid approach rather than purely top-down (BPM first) or bottom-up (SOA first). First, do an inventory of your environment for existing services, since there will almost always be some out there, even if just in your packaged applications such as ERP. While this is happening, start your BPM initiative by setting the goals and doing some top-down process modeling. Assuming that you have a particular process in mind for implementation, do the more detailed process design for that, taking advantage of any services that you have discovered, and identifying what other services need to be created. If possible, implement the process even without the services: it will be no worse from an efficiency standpoint than your current manual process, and will provide a framework both for adding services later and for process monitoring. As you develop the services for integration and automation, replace the manual steps in the process with the services.
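
A minimal sketch of that last idea, using invented names and no particular BPMS API: if each process step sits behind a common interface, a manual placeholder step can run on day one and be swapped for an automated service call later, without restructuring the process:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative sketch (invented names, no BPMS API): each process step sits
// behind a common interface, so a manual placeholder can later be swapped
// for an automated service call without restructuring the process.
public class HybridProcess {

    interface Step { void execute(Map<String, Object> processData); }

    // Day one: the step is manual; the process still runs end to end and
    // provides a framework for monitoring and for adding services later.
    static class ManualStep implements Step {
        private final String instruction;
        ManualStep(String instruction) { this.instruction = instruction; }
        public void execute(Map<String, Object> data) {
            // A real BPMS would create a work item in a user's worklist here;
            // we just record that the human step happened.
            System.out.println("WORK ITEM routed to a user: " + instruction);
            data.put("approved", Boolean.TRUE);
        }
    }

    // Later: the same slot is filled by an automated service as it is built.
    static class CreditCheckService implements Step {
        public void execute(Map<String, Object> data) {
            data.put("creditOk", Boolean.TRUE); // stand-in for a real service call
        }
    }

    public static void main(String[] args) {
        Map<String, Step> process = new LinkedHashMap<>();
        process.put("credit check", new CreditCheckService()); // was a ManualStep at first
        process.put("approval", new ManualStep("Review and approve the application"));

        Map<String, Object> data = new LinkedHashMap<>();
        process.forEach((name, step) -> {
            System.out.println("Executing step: " + name);
            step.execute(data);
        });
        System.out.println("Process data: " + data);
    }
}
```

The design choice here is that the process definition never changes as services come online: only the binding of a step to a manual or automated implementation does, which is exactly why implementing the process before the services exist is not wasted work.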

Re: Enterprise BPM Goals – Develop, Execute, but what about Governance?

This was in response to the material on my agenda for the webinar. Yes, governance is important, but I only had 40 minutes and could barely cover the design/develop/execute parts of what we wanted to cover. Maybe TIBCO will have me back for another webinar on governance. 😉

Data/content centric processes vs. people-centric vs. EAI/integration centric re: multiple BPMS platforms. Any guidelines for when and where to demarcate?

These divisions are very similar to the Forrester divisions of the BPMS landscape from a few years ago, and grew mostly out of the different types of systems that were all lumped together as “BPMS” by the analysts in the early 2000’s. Many of today’s products offer strength in more than one area, but you need to have a good understanding of your primary use cases when selecting a product. Personally, I think that content-centric and human-centric isn’t the right way to split it: more like unstructured (case management) versus structured; even then, there is more of a spectrum of functionality in most cases than purely unstructured or purely structured. So really, the division is between processes that have people involved (human-centric) or those that are more for automated integration (system-centric), with the latter having to accommodate this wider spectrum of process types. If you have mostly automated integration processes, then certainly an integration-centric BPMS makes sense; if you have human-facing processes, then the question is a bit more complex, since you’re dealing with content/documents, process types, social/collaborative capabilities and a host of other requirements that you need to look at relative to your own use cases. In general, the market is moving towards the full range of human-facing processes being handled by a single product, although specialist product companies would differ.

Thoughts on the role of the application/solution architect within an LOB or COE vs. that of the enterprise architect assigned to the BPM domain?

An enterprise architect assigned to the BPM CoE/domain is still (typically) part of the EA team, and therefore involved with the broader scope of enterprise architecture issues. An application/solution architect tends to be more product and technology focused, and in many cases that is just a fancy term for a developer. In other words, the EA should be concerned with overall strategy and goals, whereas the solution architect is focused on implementation.

Role of the COE in governance? How far does/should it extend?

The CoE is core to governance: that’s what it’s there for. At the very least, the CoE will set the standards and procedures for governance, and may rely on the individual projects to enforce that governance.

Is it really IT giving up control? In many cases, the business does whatever they do — and IT has little (or aged) information about the actual processes.

This was in reference to slide #11 in my deck about cultural issues. Certainly the business can (and often does) go off and implement its own processes, but that is outside the context of enterprise-wide systems. In order to have the business doing that within the enterprise BPMS, IT has to ensure that the business can access the process discovery and modeling tools that become the front end of process design. That way, business and IT share models of the business processes, which means that what gets implemented in the BPMS might actually resemble what is required by the business. In some cases, I see a company buy a BPMS but not allow the business users to use the business-level tools to participate in process modeling: this is usually the result of someone in IT thinking that this is beyond the capability of the business people.

Is following any BPM notation standards part of BPM development? I saw that there was no mention of it.

There was so much that I did not have time to address with only 40 minutes or so to speak, and standards didn’t make the cut. In longer presentations, I always address the issue of standards, since a common process modeling notation is essential to communication between various stakeholders. BPMN is the obvious front-runner there, and if used properly, can be understood by both business and IT. It’s not just about process models, however: a BPMS implementation has to also consider data models, organizational models and more, around which there is less standardization.

Regarding Common UI: shouldn’t it be Common Architecture, accessed by different UIs that fit the user’s roles, knowledge, etc?

In the context of slide #6, I did mean a common UI, literally. In other words, using the BPMS’ composite application development and forms environment to create a user interface that hides multiple legacy applications behind a single user interface, so that the user deals with this new integrated UI instead of multiple legacy UIs. Your point seems to be more about persona-based (or role-based) interfaces into the BPMS, which is a valid, but different, point. That “single UI” that I mention would, in fact, be configurable for the different personas who need to access it.

How does a fully fledged BPM tool stack up against the workflow tools that are part of other COTS applications, e.g., workflow in a document management tool or in a trouble ticketing tool?

A full BPMS tends to be much more flexible than the embedded workflow that you will find within another platform, and is more of an application development platform than just a way to control processes within that application. On the other side, the workflow within those applications is typically already fully integrated with the other business objects within them (e.g., documents, trouble tickets), so the implementation may be faster for that particular type of process. If the only type of process management that you need is document approvals within your document management system, it may make sense to use that rather than purchase a full BPMS; if you have broader process management needs, start looking at a more general BPMS platform that can handle more of your use cases.

How do you see BPM tools surviving when CRM tools with more or less the same capabilities, and with out-of-the-box processes defined, are being widely accepted by enterprises?

Similar to my response to the previous question, if the processes are related only to the business objects within the CRM, then you may be better off using the workflow tools within it. However, as soon as you want to integrate in other data sources, systems or users, you’ll start to get beyond the functional capabilities of the simpler workflow tools within the CRM. There’s room in the market for both; the trick is, for customers, to understand when to use one versus the other.

What are the reasons you see for BPM tools not getting quickly and widely accepted and what are the solutions to overcome that?

There are both cost and complexity components with BPMS adoption, but a big reason before you even start looking at tools is moving your organization to a process-driven orientation, as I discussed above. Once people start to look at the business as end-to-end processes, and those processes as assets and capabilities that the business offers to its customers, there will be a great pull for BPMS technologies to help that along. Once that motivation is in place, the cost and complexity barriers are still there, but are becoming less significant: first of all, more vendors are offering cloud-based versions of their software that allow you to try it out – and even do your full development and testing – without capital expenditures. If they offer the option, you can move your production processes on-premise, or leave them in the cloud to keep the total cost down. As for complexity, the products are getting easier to use, but are also offering a lot more functionality. This shifts the complexity from one of depth (learning how to do a particular function) to breadth (learning what all the functions are and when to use which), which is still complex but less of a technological complexity.

Is it possible to start introducing and implementing BPM in one department or module only, and then extend BPM to other departments or modules? Or should this be an enterprise-wide decision, since bringing in BPM technologies involves heavy costs?

Almost every organization that I work with does their BPM implementation in one department first, or for one process first (which may span departments): it’s just not possible to implement everything that you will ever implement in BPM at the same time, first time. There needs to be ROI within that first implementation, but you also have to look at enterprise cost justification as with any horizontal technology: plan for the other projects that will use this, and allocate the costs accordingly. That might mean that some of the initial costs come from a shared services or infrastructure budget rather than the project budget, because they will eventually be allocated to future projects and processes.

How difficult would it be to replace a legacy workflow system with BPM?

It depends (that’s always the consultant’s answer). Seriously, though, it depends on the level of integration between the existing workflow system and other systems, and how much of the user interface it provides. I have seen situations where a legacy workflow system is deeply embedded in a custom application platform, with fairly well-defined integration points to other systems, and the user interface hiding the workflow system from the end user. In this case, although it’s not trivial, it is a straightforward exercise to rip out the workflow system since it is being used purely as a process engine, replace it with a new one, refactor the integration points so that the new system calls the other systems in the environment (usually easier since modern BPMS’ have better integration capabilities), and refactor the custom UI so that it calls the new BPMS (also usually easier because of updated functionality). That’s the best case, and as I said, it’s still not trivial. If the legacy workflow system also provides the user interface, then you’re looking at redeveloping your entire UI either in the new BPMS or in some other UI development tool, plus the back-end systems integration work. A major consideration in either case is that you don’t want to just replicate the functionality of the old workflow system, since the new BPMS will have far greater functionality: you need to think about how you are going to leverage capabilities such as runtime collaboration that never existed in the old system, in order to see the greatest benefit from the upgrade.
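
Those “well-defined integration points” are the key. As a rough sketch (all names invented, no specific product API): if the application only ever talks to the process engine through a narrow facade like this, swapping engines means re-implementing one adapter rather than touching every caller:

```java
import java.util.Map;

// Illustrative sketch (all names invented): callers depend only on the
// facade, so the engine behind it can be replaced in one place.
public class EngineSwap {

    interface ProcessEngine {
        String startProcess(String definition, Map<String, Object> payload);
        void completeTask(String processId, String taskId);
    }

    // The legacy engine behind the facade...
    static class LegacyWorkflowAdapter implements ProcessEngine {
        public String startProcess(String def, Map<String, Object> payload) {
            return "LEGACY-" + def;  // would call the legacy engine's proprietary API
        }
        public void completeTask(String processId, String taskId) { /* legacy call */ }
    }

    // ...replaced by the new BPMS without changing any caller.
    static class ModernBpmsAdapter implements ProcessEngine {
        public String startProcess(String def, Map<String, Object> payload) {
            return "BPMS-" + def;    // would call the new BPMS web service instead
        }
        public void completeTask(String processId, String taskId) { /* new call */ }
    }

    public static void main(String[] args) {
        ProcessEngine engine = new ModernBpmsAdapter(); // the only line that changes
        System.out.println(engine.startProcess("claims-intake", Map.of()));
    }
}
```

Of course, real legacy workflow systems rarely hide behind so clean a seam, which is why the answer above calls even the best case non-trivial.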

Is it possible to switch between BPM vendors without having pain?

No. Similar to the previous answer, this is a non-trivial exercise, and depending on how much of the BPMS functionality you were using, it could be pretty much a complete redevelopment. If the BPMS was used primarily for orchestration of automated processes, it will be much easier, but as soon as you get into custom integration/orchestration and user interfaces, it gets a lot more complicated (and painful).

Do we really need to go for BPM in a situation where we need integration orchestration only?

One end of the BPMS market is integration-centric systems, which primarily do just integration orchestration. The advantage of using a BPMS for this instead of orchestrating directly in application code is that you get all of the other stuff that comes with the BPMS “for free”: graphical process modeling, execution monitoring, process governance and whatever other goodies are in the BPMS. It’s not really free, of course, but it’s valid to consider a comparison of all of that functionality against what parts of it you would have to custom-build if you were to do the orchestration in code.
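
To make that “for free” argument concrete, here’s a rough sketch (entirely invented, no TIBCO code) of what hand-coded orchestration looks like once you add even the most basic audit trail and failure handling – things an integration-centric BPMS provides as standard, alongside the graphical modeling and monitoring consoles you would otherwise never build:

```java
import java.util.List;
import java.util.function.Supplier;

// Illustrative sketch (entirely invented): orchestrating services directly in
// application code means hand-building the audit trail, monitoring hooks and
// failure handling that an integration-centric BPMS provides out of the box.
public class HandRolledOrchestration {

    record ServiceCall(String name, Supplier<Boolean> invoke) {}

    static void orchestrate(List<ServiceCall> sequence) {
        for (ServiceCall call : sequence) {
            long start = System.currentTimeMillis();
            boolean ok = call.invoke().get();  // the actual service invocation
            long elapsed = System.currentTimeMillis() - start;
            // Hand-rolled stand-ins for what a BPMS gives you as standard:
            System.out.printf("AUDIT: %s -> %s (%d ms)%n",
                    call.name(), ok ? "ok" : "failed", elapsed);
            if (!ok) {
                // ...and you'd still need compensation logic, retries, a
                // monitoring console and a graphical model of this sequence.
                System.out.println("orchestration halted at " + call.name());
                return;
            }
        }
    }

    public static void main(String[] args) {
        orchestrate(List.of(
                new ServiceCall("validate-order", () -> true),
                new ServiceCall("reserve-stock", () -> true),
                new ServiceCall("bill-customer", () -> false)  // simulated failure
        ));
    }
}
```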

That’s it for the Q&A. If you listen to the replay, or were on the live broadcast, my apologies for the rushed beginning: I got off on the wrong foot out of the gate, but settled down after the first few minutes.

Colonial Life at TUCON

I’m wrapping up my TUCON trip with the Colonial Life presentation on their TIBCO iProcess and BusinessWorks implementation in their back office. I work a lot with insurance companies, and find that they can be very conservative in terms of technology implementations: many are just implementing document imaging and workflow, and haven’t really looked at full BPM functionality that includes orchestration of different systems as well as work management. I had a chance to talk with the two presenters, Bijit Das from the business side and Phil Johnston from IT, in a private briefing yesterday; I heard about their business goals to do better work management and improve efficiencies by removing paper from the process, as well as their technical goal to build an agile solution that could be used across multiple process flows. They have done their first implementation in their policy administration area, where they receive about 180k pages of inbound documentation per year, resulting in about 10k work items per month.

They ended up using iProcess primarily as a state management and queuing engine, embedding most of the process flow rules in external database tables, and having just simple process flows in iProcess that routed work based on the table values rather than logic within the process model itself. Once a piece of work ended up in the right queue (or in a user-filtered view of a common work queue), the user could complete it, route it elsewhere, or put it on hold while they performed some activity outside of BPM. A huge part of their improvements came from using BW to create reusable services, where these services could be called from the processes, but they also turned that around and have some cases where iProcess is called as a service from BW for queue and state management, using services that had been developed by Unum (their parent company) for their implementation. They wrote their own custom user interface portal, allowing users to select the queue that they want to work, filter the queue manually, and select the work item that they want to work on. This is a bit unusual for back-office transactional systems, which typically push the next piece of work to a user rather than allowing them to select it, but it’s a lot harder to build those rules when you’re effectively writing all the work management in database tables rather than leveraging work management capabilities in a BPMS.
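
To illustrate the table-driven approach (this is my own sketch with invented rules, not Colonial Life’s schema): the process model stays trivial, and a routing table – in their case, database tables – decides where each work item lands:

```java
import java.util.List;
import java.util.Map;

// Illustrative sketch of table-driven routing: the process model stays
// trivial, and a lookup table decides which queue a work item lands in.
// Names and rules are invented; the real rules lived in database tables.
public class TableDrivenRouter {

    record Rule(String docType, String state, String queue) {}

    // Stand-in for the external routing table.
    static final List<Rule> ROUTING_TABLE = List.of(
            new Rule("address-change", "new",     "policy-admin-queue"),
            new Rule("address-change", "on-hold", "pended-work-queue"),
            new Rule("beneficiary",    "new",     "senior-review-queue")
    );

    static String route(Map<String, String> workItem) {
        return ROUTING_TABLE.stream()
                .filter(r -> r.docType().equals(workItem.get("docType"))
                          && r.state().equals(workItem.get("state")))
                .map(Rule::queue)
                .findFirst()
                .orElse("exception-queue"); // no rule matched: send to manual triage
    }

    public static void main(String[] args) {
        System.out.println(route(Map.of("docType", "address-change", "state", "new")));
        // -> policy-admin-queue; changing a routing rule is a table update,
        //    not a process model redeployment.
    }
}
```

The upside is agility (routing changes are data changes); the downside, as noted below, is that the work management smarts stay in your tables rather than in the BPMS.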

They transitioned from a very waterfall development model to a much more agile methodology throughout their first project lifecycle, which meant that the business area was seeing the code as it was being developed and tested, allowing for much smoother iterations. Although their first production release took about nine months (after an additional six months to implement the infrastructure), they did their next release in two months. They still do a lot of swivel-chair integration with their legacy policy administration system, and need to build better integration to further improve efficiencies and start to do some straight-through processing.

They’ve seen some impressive improvements:

  • The discovery and modeling that happened prior to the implementation forced them to look at their processes critically, and reorganize their teams so that similar work was processed by the same team
  • Minimized handoffs have improved SLAs by 4%
  • Increased visibility into processes
  • Removed 180k pieces of paper per year from the operations area
  • 20% efficiency improvement
  • Standardized solution for future implementations in other areas

They also learned some lessons and best practices, such as establishing scope, using tools for process discovery, and bringing in the right resources at the right time. Yesterday, when I met with them, I mentioned Nimbus to them, which they had not yet looked at; they obviously had time to check it out since then, since Bijit called it out during the presentation, saying that it could have helped them with process discovery. Their next steps are to do more system integration to further improve efficiencies by automating where possible, add input channels, and integrate smart forms to drive processes.

Although they have seen a huge amount of improvement in their processes, this still feels a bit like an old-school document workflow implementation, with table-driven simple process flows. Undoubtedly, the service layer is more modern, but I’m left feeling like they could see a lot more benefit to their business if they were to take advantage of newer BPM capabilities. However, this was probably a necessary first step for such a paper-bound organization that was cautiously dipping its toe into the BPM waters.

Driving the Adoption of Business Process Initiatives With @NimbusIP

Mark Cotgrove and Clark Swain from Nimbus Partners presented in a breakout session on Nimbus and how it fits into the bigger TIBCO picture, as an expansion of the short presentation we saw from Cotgrove at the analyst session yesterday. To sum up the message from yesterday, Nimbus Control provides an essential bit of business-driven process discovery functionality that isn’t really covered in TIBCO’s AMX/BPM offering, but more importantly, the ability to create intelligent operations manuals that can then interact with AMX/BPM in a variety of ways.

Nimbus Control doesn’t do process automation: they do process and procedural documentation that can also be linked to supporting documentation and other content required to perform a manual process. Some of the manual steps may be to interact with systems in specific ways, such as entering an order on an ERP system; others may be to perform purely manual tasks such as having a customer sign a paper document. There are a few competitors in this space, such as BusinessGenetics and Business Optix (formerly ProcessMaster), and there is some overlap with BPA tools such as ARIS and Blueprint in terms of the process discovery side, but not the end-user procedural help.

Swain started on a demo, but due to the late session start (apparently the keynote went way overtime), I had to leave for another meeting, and will have to see a more detailed demo some other time.