Category Archives: SOA

service oriented architecture

Transforming Healthcare At Maccabi With webMethods And ARIS

Israel has a mandatory but mostly privatized healthcare system, and Maccabi Healthcare Services is the country’s second largest and fastest growing health maintenance organization (HMO), with about 1.9 million members using the services of 5,000 physicians. Maccabi’s chief enterprise architect, Irena Kurman, gave a presentation in the Integration and Automation breakout track at Innovation World on how they are putting the model-to-execution message into practice to improve their processes and integrate their legacy systems better.

She talked about three case studies – medical referral follow-up, doctor visit management, and pregnancy tracking – that highlighted the challenges that they had with multiple systems and data sources, as well as uncontrolled and non-standardized processes. For example, their x-ray results process had point-to-point links between 14 different systems, making it very difficult to understand what was happening, much less consider modifications to the process. When something went wrong, there was no single process owner, and no visibility into the end-to-end process.

They started with webMethods for integration and SOA governance, and have more recently started to model their processes using ARIS and automate some of them using webMethods BPMS. That original spaghetti x-ray process still has the same source systems, but now uses an ESB middleware layer, with the BPMS (as well as external partners) accessing the legacy systems via services: changes to the process are made in the BPMS, not by rewiring the legacy systems.
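
To make that concrete, here's a minimal sketch (my own illustration, not Maccabi's code; all of the type and service names are hypothetical) of what a service layer between the process and the legacy systems buys you: the BPMS codes against one stable contract, and the ESB decides which source system actually answers.

```java
// Hypothetical types for illustration only.
record XRayResult(String patientId, String orderId, String findings) {}

interface EsbClient {
    <T> T invoke(String service, Class<T> type, Object... args);
}

interface XRayResultsService {
    XRayResult fetchResult(String patientId, String orderId);
}

class EsbXRayResultsService implements XRayResultsService {
    private final EsbClient esb;

    EsbXRayResultsService(EsbClient esb) {
        this.esb = esb;
    }

    @Override
    public XRayResult fetchResult(String patientId, String orderId) {
        // The ESB, not the process model, routes to whichever of the
        // source systems holds the result; the BPMS sees only this one
        // canonical operation, so a process change never requires
        // rewiring the point-to-point links.
        return esb.invoke("radiology.getResult", XRayResult.class,
                patientId, orderId);
    }
}
```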

The results in the case studies are pretty striking. In the medical referral follow-up process, they now have the ability to capture life-threatening cases in near real time, and since the entire process is linked and monitored, test samples can’t go missing without notice. For doctor visit management, payments to doctors are more accurate and are calculated in a transparent manner, improving relationships between Maccabi and their physicians. And for pregnancy tracking, a mobile application provides the patient with access to information relevant to her pregnancy stage, and lets her view results, such as recorded ultrasound video, from anywhere.

Along the way, they’ve developed a model for approaching process and integration projects:

  • Start by modeling the business processes with ARIS
  • Integrate systems with webMethods Integration Platform
  • Execute and monitor processes with webMethods BPMS
  • Enable flexibility with the rules engine
  • Manage software services with CentraSite and Insight

Kurman feels that they’ve just started on their journey to process excellence, but it looks like they have a good roadmap on how they’re going to get there.

The Digital Agility Layer: Time To Get Intentionally Digital

Wolfram Jost, CTO of Software AG, started us off on the first full day of Innovation World with a keynote on innovations for the digital enterprise. As I mentioned yesterday, the use of the term “digital enterprise” (and even more, “digitization”) is a bit strange, since pretty much everything is digital these days; it’s just not necessarily the right type of digital. We still need to think about integration between systems to make automation seamless, but more importantly, we need to think about interaction patterns that put control in the hands of customers, and mobile and social platforms that make the digital forms ubiquitous. So maybe the right phrase is that we have to start being intentionally digital enterprises, rather than letting it happen accidentally.

I definitely agree with Jost’s key point: it’s all about the process. We need end-to-end processes at the business/customer layer, but have to interact with a plethora of silos down below, both on premise and in the cloud, some of which are decades old. Software AG, naturally, provides tools to help that happen: in-memory data management, integration/SOA, BPM, EA and intelligent business operations (IBO, including event processing and analytics). This is made up of a number of acquisitions – Apama, alfabet, LongJump, Nirvana, JackBe – plus the pre-existing portfolio including ARIS and webMethods. Now, we’re seeing some of that in Software AG Live, their PaaS vision for a unified cloud offering: Process Live for modeling and process publishing; Portfolio Live for IT portfolio management; AgileApps Live for application development and case management; and Integration Live for cloud-to-cloud and cloud-to-on premise integration. Integration Live is coming next year, but the rest of the platform is available as of today.

We had a demo of Process Live, which provides cloud-based BPMN process modeling including collaboration; and Portfolio Live to see the systems with which the modeled processes may interact, including a wide variety of portfolio management functions such as assessing the usage and future development potential of any given system or application. We also saw an AgileApps Live application, including an analytics dashboard plus forms data entry and task/case management; interestingly, this is still sporting a longjump.com URL. I last reviewed LongJump in 2007 in conjunction with the Enterprise 2.0 conference, and obviously there have been some advances since then: it’s still an application development tool for web-based apps, but it includes a lot of ad hoc task/case management functionality that allows knowledge workers to create their own multi-step tasks (subprocesses, in effect) as well as perform other case-type functions such as gathering artifacts and completing tasks related to case resolution.

Although Integration Live isn’t there yet, we did hear about the different deployment styles that will be supported: development and/or operations can be in the cloud; there can be an on premise ESB or direct connections to systems.

Jost drilled down into several of the specific products, starting out with the overarching premise that Software AG is moving from a more traditional multi-tier architecture into an event-driven architecture (EDA), where everything is based around the event bus. Product highlights included:

  • ARIS positioning and use cases from process modeling to governance, and the radical UI redesign in ARIS 9 that matches the Process Live UI
  • Mobile and social BPM UI
  • Elastic ESB using virtual private cloud as well as public and private cloud
  • API management, representing an extension to the Centrasite concepts
  • Intelligent business operations architecture including in-memory analytics and event processing
  • Terracotta strategy for in-memory data management
  • Integration of Apama, big memory (Terracotta) and messaging for big data/event correlation


I’m sure that we’ll see a lot more about these over the next two days so I’m not trying to cover everything here.

We had a brief demo from John Bates on audience sentiment analysis for price level setting using Apama, then wrapped up with a presentation from Edy Liongosari, Managing Director at Accenture, on how to bring some of this into practice. One thing that Liongosari said really resonated: next year, none of us are going to be talking about cloud, because it will be so ubiquitous. The same is true, I believe, of the terms social and mobile. Not to mention digital.

Kicking Off @SoftwareAG @InnovationWorld

For the first time in a few years, I’m at Software AG’s Innovation World conference in San Francisco (I think that the last time I was here, it was still the webMethods Integration World), and the focus is on the Digital Enterprise. At the press panel that I attended just prior to this evening’s opening keynote, one journalist made the point that “digital enterprise” is kind of a dumb term (I paraphrase here) because everything is digital now: we need a more specific term to mean what Software AG is getting at with this. Clay Richardson of Forrester, who I dragged along to the press session, said that his colleagues are talking about the post-digital age, which I take to be based on the assumption that all business is digital, so that term is a bit meaningless – although “post-digital” isn’t exactly descriptive either.

Terminology aside, Software AG’s heart is in the right place: CEO Karl-Heinz Streibich took the stage at the opening keynote to talk about how enterprises need to leverage this digital footprint by integrating systems in ways that enable transformation through alignment and agility. You can still detect the schisms in the Software AG product portfolio, however: many of the customer case studies were single-product (e.g., ARIS or webMethods), although we did hear about the growing synergy between Apama (CEP and analytics) and webMethods for operational visibility, as well as Apama and Terracotta (in-memory big data number crunching). As with many of the other large vendors that grow through acquisitions, it will take time for those pieces to knit together into a unified portfolio.

We heard briefly from Ivo Totev, Software AG’s CMO; saw presentations of two of their customer innovation awards; then had a lengthier talk on the power of mobile and social from Erik Qualman, author of Socialnomics and Digital Leader. Unlike the usual pop culture keynote speaker, Qualman’s stuff is right on for this audience: looking at how successful companies are leveraging online social relationships, data and influence to further their success through engagement: listening, interacting and reacting (and then selling). He points out that trying to sell first before engaging doesn’t work online because it doesn’t work offline; the methods of engagement are different online and offline, but the principles from a sales lead standpoint are the same. You can’t start the conversation by saying “hey, I’m great, buy this thing that I’m selling” (something that a *lot* of people/companies just starting with Twitter and/or blogging haven’t learned yet).

Qualman took Dave Carroll’s popular “United Breaks Guitars” example from a couple of years ago, and talked about not just how United changed their policies on damage as a result, but the other people who leveraged the situation into increased sales: Taylor Guitars; a company that created a “Dave Carroll” travelling guitar case; and Carroll himself, through sales of the song and his subsequent book on the power of one voice in the age of social media. He looked at companies that have transformed their customer experience through mobile (e.g., the Starbucks mobile app, which has personally changed my café loyalty) by giving the customer a way to do what they want to do – which hopefully involves buying your product – in the easiest possible way; and how a fast and slightly cheeky social media presence can give you an incredible boost for very little investment (e.g., Oreo’s “dunk in the dark” tweet when the lights went out during the Super Bowl). I gave a presentation last year on creating your own process revolution that talked about some of these issues and the new business models that are emerging because of them.

Great to see John Bates here, who I know from his tenure at Progress Software and who came to Software AG with the Apama acquisition; also great to finally meet Theo Priestley face to face after years of tweeting at each other.

Disclosure: Software AG is a customer (I’m in the middle of creating some white papers and webinars for them), and they paid my travel expenses to be at this conference. However, what I write here is my own opinion and I have not been financially compensated for it.

TIBCO Corporate and Technology Analyst Briefing at TUCON2012

Murray Rode, COO of TIBCO, started the analyst briefings with an overview of technology trends (as we heard this morning, mobile, cloud, social, events) and business trends (loyalty and cross-selling, cost reduction and efficiency gains, risk management and compliance, metrics and analytics) to create the four themes that they’re discussing at this conference: digital customer experience, big data, social collaboration, and consumerization of IT. TIBCO provides a platform of integrated products and functionality in five main areas:

  • Automation, including messaging, SOA, BPM, MDM, and other middleware
  • Event processing, including events/CEP, rules, in-memory data grid and log management
  • Analytics, including visual analysis, data discovery, and statistics
  • Cloud, including private/hybrid model, cloud platform apps, and deployment options
  • Social, including enterprise social media, and collaboration

A bit disappointing to see BPM relegated to being just a piece of the automation middleware, but important to remember that TIBCO is an integration technology company at heart, and that’s ultimately what BPM is to them.

Taking a look at their corporate performance, they have almost $1B in revenue for FY2011, showing growth of 44% over the past two years, with 4,000 customers and 3,500 employees. They continue to invest 14% of revenue into R&D with a 20% increase in headcount, plus significant increases in investment in sales and marketing, which is pushing this growth. Their top verticals are financial services and telecom; while they still do 50% of their business in the Americas, EMEA is at 40%, with APJ making up the other 10% and showing the largest growth. They have a broad core sales force, but have dedicated sales forces for a few specialized products, including Spotfire, tibbr and Nimbus, as well as for vertical industries.

They continue to extend their technology platform through acquisitions and organic growth across all five areas of the platform functionality. They see the automation components as being “large and stable”, meaning we can’t expect to see a lot of new investment here, while the other four areas are all “increasing”. Not too surprising considering that AMX BPM was a fairly recent and major overhaul of their BPM platform and (hopefully) won’t need major rework for a while, and the other areas all include components that would integrate as part of a BPM deployment.

Matt Quinn then reviewed the technology strategy: extending the number of components in the platform as well as deepening the functionality. We heard about some of this earlier, such as the new messaging appliances and the Spotfire 5 release, some recent releases of existing platforms such as ActiveSpaces, ActiveMatrix and Business Events, plus some cloud, mobile and social enhancements that will be announced tomorrow, so I can’t tell you about them yet.

We also heard a bit more on the rules modeling that I saw before the sessions this morning: it’s their new BPMN modeling for rules. This uses BPMN 1.2 notation to chain together decision tables and other rule components into decision services, which can then be called directly as tasks within a BPMN process model, or exposed as web services (SOAP only for now, but since ActiveMatrix now supports REST/JSON, I’m hopeful that REST support is coming). Sounds a bit weird, but it actually makes sense when you think about how rules are formed into composite decision services.
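
Chaining decision tables into a callable decision service sounds abstract, so here's a rough Java sketch of the idea (my own illustration of the pattern, not TIBCO's implementation): one table classifies the input, a second acts on that classification, and the chain is wrapped as a single service that a process task could invoke.

```java
import java.util.List;
import java.util.function.Predicate;

// One row of a decision table: if the condition matches, the outcome applies.
record Row<I, O>(Predicate<I> condition, O outcome) {}

class DecisionTable<I, O> {
    private final List<Row<I, O>> rows;
    private final O defaultOutcome;

    DecisionTable(List<Row<I, O>> rows, O defaultOutcome) {
        this.rows = rows;
        this.defaultOutcome = defaultOutcome;
    }

    O evaluate(I input) {
        return rows.stream()
                .filter(r -> r.condition().test(input))
                .map(Row::outcome)
                .findFirst()
                .orElse(defaultOutcome);
    }
}

// Two tables chained into one decision service, callable as a process task
// or exposed as a web service.
class DiscountDecisionService {
    private final DecisionTable<Integer, String> tier = new DecisionTable<>(
            List.of(new Row<Integer, String>(spend -> spend > 10_000, "GOLD"),
                    new Row<Integer, String>(spend -> spend > 1_000, "SILVER")),
            "BRONZE");

    private final DecisionTable<String, Double> discount = new DecisionTable<>(
            List.of(new Row<String, Double>("GOLD"::equals, 0.15),
                    new Row<String, Double>("SILVER"::equals, 0.05)),
            0.0);

    double discountFor(int annualSpend) {
        return discount.evaluate(tier.evaluate(annualSpend));
    }
}
```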

There was a lot more information about a lot more products, and then my head exploded.

Like others in the audience, I started getting product fatigue, and ended up just picking out details of products that are relevant to me. This really drove home that the TIBCO product portfolio is big and complex; they might benefit from having a few separate analyst sessions with some sort of product grouping, although there is so much overlap and integration between product areas that I’m not sure how they would sensibly split it up. Even for my area of coverage, there was just too much information to capture, much less absorb.

We finished up with a panel of the top-level TIBCO execs, the first question of which was about how the sales force can even start to comprehend the entire breadth of the product portfolio in order to be successful selling it. This isn’t a problem unique to TIBCO: any broad-based platform vendor, such as IBM or Oracle, has the same issue. TIBCO’s answer: specialized sales force overlays for specific products and industry verticals, and selling solutions rather than individual products. Both of those work to a certain extent, but often the solutions end up being no more than glorified templates developed as sales tools rather than actual solutions, and can lead to more rather than less legacy code.

Because of the broad portfolio, there’s also confusion in the customer base, many of whom see one TIBCO product and have no idea of everything else that TIBCO does. Since TIBCO is not quite a household name like IBM or Oracle, companies don’t necessarily know that TIBCO has other things to offer. One of my banking clients, on hearing that I am at the TIBCO conference this week, emailed “Heard of them as a player in the Cloud Computing space. What’s different or unique about them vs others?” Yes, they play in the cloud. But that’s hardly what you would expect a bank (that uses very little cloud infrastructure, and likely does have some TIBCO products installed somewhere) to think of first when you mention TIBCO.

TIBCO TUCON2012 Day 1 Keynotes, Part 2: Big Honking Data

Back from the mid-morning break, CMO Raj Verma shifted gears from customer experience management to look at one of the other factors introduced in the first part of the session: big data.

Matt Quinn was back to talk about big data: in some ways, this isn’t new, since there has been a lot of data within enterprises for many years. What’s changed is that we now have the tools to deal with it, both in place and in motion, to find the patterns hiding within it through cleansing and transformation. He made a sports analogy: a game is not just about the final score, but about all of the events that happen to make up the entire game; similarly, it is no longer sufficient to just measure outcomes in business transactions – you have to monitor patterns in the event streams and combine them with historical data to make the best possible decisions about what is happening right now. He referred to this combination of event processing and analytics as closing the loop between data in motion and data at rest. TIBCO provides a number of products that combine to handle big data: not just CEP, but ActiveSpaces (the in-memory data grid) to enable realtime processing, Spotfire for visual analytics, and integration with Hadoop.
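
As a sketch of what closing that loop can look like in code (my own simplified illustration, not an Apama or ActiveSpaces example): a baseline computed from historical data at rest scores each event in motion, so anomalies surface as they happen rather than in an after-the-fact report.

```java
// Flags live events that deviate sharply from a historical baseline.
class LoopCloser {
    private final double historicalMean;    // precomputed from data at rest
    private final double historicalStdDev;

    LoopCloser(double historicalMean, double historicalStdDev) {
        this.historicalMean = historicalMean;
        this.historicalStdDev = historicalStdDev;
    }

    void onEvent(double value) {
        // Score the event in motion against the at-rest baseline.
        double z = (value - historicalMean) / historicalStdDev;
        if (Math.abs(z) > 3.0) {
            // A real deployment would raise a CEP alert or start a process,
            // not just print to the console.
            System.out.printf("Anomalous event: %.2f (z-score %.1f)%n", value, z);
        }
    }

    public static void main(String[] args) {
        LoopCloser monitor = new LoopCloser(100.0, 5.0);
        monitor.onEvent(102.3);  // within normal range, ignored
        monitor.onEvent(131.8);  // more than 3 standard deviations out: flagged
    }
}
```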

We saw a demo of LogLogic, recently acquired by TIBCO, which provides analytics and event detection on server logs. This might sound like a bit of a boring topic, but I’m totally on board with this: too many companies just turn off logging on their servers because it generates too many events that they just can’t do anything with, and because it impacts performance when logging is done on the operational server. LogLogic’s appliance can collect enormous amounts of log data, detect unusual events based on various rules, and integrate with Spotfire for visualization of potential security threats.

Mark Lorion, CMO for TIBCO Spotfire, came up to announce Spotfire 5, with a complete overhaul to the analytics engine, and including the industry’s first enterprise runtime for the R statistical language, providing 10 times the performance of the open source R project for predictive analytics. Self-service predictive analytics, ftw. They are also going beyond in-memory, integrating with Teradata, Oracle and Microsoft SQL Server for in-database analysis. With Teradata horsepower behind it – today’s announcement of Spotfire being optimized for in-database computation on Teradata – you can now do near-realtime exploration and visualization of some shocking amounts of data. Brad Hopper gave us a great Spotfire demo, not something that most TUCON attendees are used to seeing on the main stage.

Rob Friel, CEO of PerkinElmer, took the stage to talk about how they are using big data and analytics in their scientific innovations in life sciences: screening patient data, environmental samples, human genomes, and drug trials to detect patterns that can improve quality of life in some way. They screened 31 million babies born last year (one in four around the globe) through the standard heel-prick blood test, and detected 18,000 with otherwise undiagnosed disorders that could be cured or treated. Their instrumentation is key in acquiring all the data, but once it’s there, tools such as Spotfire empower their scientists to discover and act on what they find in the data. Just as MGM Grand is delivering unique experiences to each customer, PerkinElmer is trying to enable personalized health monitoring and care for each patient.

To wrap up the big data section, Denny Page, TIBCO’s VP of Engineering, came on stage with his new hardware babies: an FTL Message switch and an EMS appliance, both to be available by the end of November 2012.

For the final part of the day 1 keynotes, we heard from an innovators’ panel of Scott McNealy (founder of Sun Microsystems, now chairman of Wayin), Tom Siebel (founder of Siebel Systems, now at C3 Energy where they are using TIBCO for energy usage analytics), Vivek Ranadivé, and KR Sridhar (CEO of Bloom Energy), chaired by David Kirkpatrick. Interesting and wide-ranging discussion about big data, analytics, sentiment analysis, enterprise social media, making data actionable, the internet of things and how a low barrier to platform exit drives innovation. The panel thinks that the best things in tech are yet to come, and I’m in agreement, although those who are paranoid about the impact of big data on their privacy should be very, very afraid.

I’ll be blogging from the analyst event for the rest of the day: we have corporate and technology briefings from the TIBCO execs plus some 1:1 sessions. No pool time for me today!

TIBCO TUCON2012 Day 1 Keynotes, Part 1

The keynotes started with TIBCO’s CEO, Vivek Ranadivé, talking about the forces driving change: a massive explosion of data (big data), the emergence of mobility, the emergence of platforms, the rise of Asia (he referenced the Gangnam Style video, although did not actually do the dance), and how math is trumping science (e.g., the detection and exploitation of patterns). The ability to harness these forces and produce extreme value is a competitive differentiator, and is working for companies like Apple and Amazon.

Raj Verma, TIBCO’s CMO, was up next, continuing the message of how fast things are changing: more iPhones were sold over the past few days than babies were born worldwide, and Amazon added more computing capacity last night than they had in total in 2001. He (re)introduced their concept of the two-second advantage – the right information a little bit before an event is worth infinitely more than any amount of information after the event – enabled by an event-enabled enterprise (or E3, supported by, of course, TIBCO infrastructure). Regardless of whether or not you use TIBCO products, this is a key point: if you’re going to exploit the massive amounts of data being generated today in order to produce extreme value, you’re going to need to be an event-enabled enterprise, responding to events rather than just measuring outcomes after the fact.

He discussed the intersection of four forces: cloud, big data, social collaboration and mobility. This is not a unique message – every vendor, analyst and consultant is talking about this – but he dug into some of these in detail: mobile, for example, is no longer discretionary, even (or maybe especially) in countries where food and resources are scarce. These four forces all overlap in the consumerization of IT, and are reshaping enterprise IT. A key corporate change driven by these is customer experience management: becoming the brand that customers think of first when the product class is mentioned, and turning customers into fans. Digital marketing, properly done, turns your business into a social network, and turns customer management into fan management.

Matt Quinn, CTO, continued the idea of turning customers into fans, and solidifying customer loyalty. To do this, he introduced TIBCO’s “billion dollar backend” with its platform components of automation, event processing, analytics, cloud and social, and hosted a series of speakers on the subject of customer experience management.

We then heard from a customer, Chris Nordling, EVP of Operations and CIO of MGM Resorts and CityCenter, which uses TIBCO for their MLife customer experience management/loyalty program. Their vision is to track everything about you, from your gambling wins/losses to your preferences in restaurants and entertainment, and use that to build personalized experiences on the fly. By capturing the flow of big data and responding to events in realtime, the technology provides their marketing team with the ability to make a zero-friction offer to each customer individually before they even know that they want something: offering reduced entertainment tickets just as you’re finishing a big losing streak at the blackjack tables, for example. It’s a bit creepy, but at the same time, it has the potential to provide a better customer experience. Just a bit of insight into what they’re spending that outrageous $25/day resort fee on.

Quinn came back to have a discussion with one of their “loyalty scientists” (really??) about Loyalty Lab, TIBCO’s platform/service for loyalty management, which is all about analyzing events and data in realtime, and providing “audience of one” service and offerings. Traditional loyalty programs were transaction-based, but today’s loyalty programs are much more about providing a more holistic view of the customer. This can include not just events that happen in a company’s own systems, but include external social media information, such as the customer’s tweets. I know all about that.

Another customer, Rick Welts of the Golden State Warriors (who, ironically, play at Oracle Arena) talked about not just customer loyalty management, but the Moneyball-style analytics that they apply to players on a very granular scale: each play of each game is captured and analyzed to maximize performance. They’re also using their mobile app for a variety of customer service initiatives, from on-premise seat upgrades to ordering food directly from your seat in the stadium.

Mid-morning break, and I’ll continue afterwards.

As an aside, I’m not usually wide awake enough to get much out of the breakfast-in-the-showcase walkabout, but this morning prior to the opening sessions, I did have a chance to see the new TIBCO decision services integrated into BPM, also available as standalone services. Looked cool, more on that later.

IBM Impact Day 2: Engage. Extend. Succeed.

Phil Gilbert spoke at the main tent session this morning, summarizing how they announced IBM BPM as a unified offering at last year’s Impact, and since then they’ve combined Business Events and ILOG to form IBM ODM (operational decision management). Business process and decision management provide visibility and governance, forming a conduit to provide information about transactions and data to people who need to access it. IBM claims to have the broadest, most integrated process portfolio, having taken a few dozen products and turned them into two products; Phil was quick to shoot down the idea that this is a disjointed, non-integrated collection of tools, referring to it instead as a “loosely coupled integration architecture”. Whatever.

Around those two core products (or product assemblies) are links to other enterprise tools – Tivoli, MDM, ECM and SAP – forming the heart of business processes and system orchestration. In version 8 of BPM and ODM, they’ve added collaboration, which is the third key imperative for business alongside visibility and governance.

We saw a demo of the new capabilities, most of which I talked about in yesterday’s post. For ODM, that included the new decision console (social activity stream, rules timeline) and global rules search. For BPM, there’s the new socially-aware process portal, which has been created on their publicly-available APIs so that you can roll your own portal with the same level of functionality. There’s searching in the process portal to find tasks easily. The new coach (UI form) designer allows you to create very rich task interfaces more easily, including the sidebar of task/instance details, instance-specific activity stream, and experts available for collaboration. They’ve incorporated the real-time collaboration capabilities of Blueworks Live into the BPM coaches to allow someone to request and receive help from an expert, with the user and the expert seeing each other’s inputs synchronously on the form in question. Lastly, Approve/Reject type tasks can be completed in-line directly in the task list, making it much faster to move through a long set of tasks that require only simple responses. He wrapped up with the obligatory iPad demo (have to give him credit for doing that part of the live demo himself, which most VPs wouldn’t consider).

The general session also included presentations of some innovative uses of BPM and ODM by IBM’s customers: Ottawa General Hospital, which has put patient information and processes on an iPad in the doctors’ pockets, and BodyMedia, which captures, analyzes and visualizes a flood of biometric data points gathered by an armband device to assist with a weight loss program.

IBM Vision for BPM, ODM and SOA

Opening day at IBM Impact 2012 (there were some sessions yesterday, but today is the real start), and a good keynote focused on innovation. The wifi is appalling – if IBM can’t get this right with their messages about scalability, who can? – so not sure if I’ll have the chance to post any of this throughout the day, or if you’ll get it all when I get back to my hotel room.

This post is based on a pre-conference briefing that I had a week or two ago, a regular conference breakout session this morning, and the analyst briefing this afternoon, covering IBM’s vision for BPM, ODM (decision management) and SOA. Their customers are using technology to drive process innovation, and the IBM portfolio is working to address those needs. Cross-functional business outcomes, which in turn require cross-functional processes, are enabled by collaboration and by better technical integration across silos. And, not surprisingly, their message is moving towards Gartner’s upcoming iBPMS vision: support for structured and unstructured processes; flexible integration; and rules and analytics for repeatable, flexible decisions. Visibility, collaboration and governance are key, not just within departmental processes, but when linking together all processes in an organization into an enterprise process architecture.

The key capabilities that they offer to help clients achieve process innovation include:

  • Process discovery and design (Blueworks Live)
  • Business process management (Process Server and Process Center)
  • Operational decision management (Decision Server and Decision Center)
  • Advanced case management (Case Manager, the FileNet-based offering that is not part of this portfolio, but is integrated with it)
  • Business monitoring (Business Monitor)

Underpinning these are master data management, integration, analytics and enterprise content management, surrounded by industry expertise and solutions. IBM is using the term intelligent business operations (which was front and center at Gartner BPM last week) to describe the platform of process, events and decision, plus appropriate user interfaces for visibility and governance.

Blueworks Live is positioned not just as a front-end design tool for process automation, but as a tool for documenting processes. Many of the 300,000 processes that have been documented in Blueworks Live are never automated in IBM BPM or any other “real” BPMS; instead, it acts as a repository for discovering and documenting processes in a collaborative environment, allowing process stakeholders to track changes to processes and see how they impact their business. There is an expanded library of templates, plus an insurance framework and other templates/frameworks coming up.

One exciting new feature (okay, exciting to me) is that Blueworks Live now allows decision tasks to be defined in process models, including the creation of decision tables: this provides an integrated process/decision discovery environment. As with process, these decisions do not need to become automated in a decision management system; this may just document the business rules and decisions as they are applied in manual processes or other systems.

Looking at IBM BPM v8, which is coming up soon, Ottosson took us through the main features:

  • Social collaboration to allow users to work together on tasks via real-time interactions, view activity streams, and locate experts. This manifests in the redesigned task interface, or “coach”, with a sidebar that includes task details, the activity stream for the entire process, and experts that are either recommended by the system based on past performance or curated manually by others. An expert can be requested to collaborate on a task with another user – it includes presence, so that you can tell who is online at any given time – allowing the expert to view the work that the user is doing and offer assistance. Effectively, multiple people are given access to the same piece of work, and updates made by anyone are shown to all participants; this can be asynchronous or synchronous.
  • There is also a redesigned inbox UI, with a more up-to-date look and feel with lots of AJAX-y goodness, sorting and coloring by priority, plus the ability to respond to simple tasks inline directly in the inbox rather than opening a separate task view. It provides a single task inbox for a variety of sources, including IBM BPM, Blueworks workflows and Case Manager tasks.
  • Situational awareness with process monitoring and analysis in a performance data warehouse.
  • Mobile access via an iOS application that can interface with Blueworks Live and IBM BPM; if you search for “IBM BPM” in the iTunes app store (but not, unfortunately, in the Android Market), you’ll find it. It supports viewing the task list, completing tasks, attaching documents and adding comments. They are considering releasing the source code to allow developers to use it as a template, since there is likely to be demand for customized or branded versions of this. In conjunction with this, they’ve released a REST API tester similar to the sort of sandbox offered by Google, which allows developers to create REST-based applications (mobile or otherwise) without having to own the entire back-end platform (see the sketch after this list). This will certainly open up the add-on BPM application market to smaller developers, where we are likely to see more innovation.
  • Enhancements to Process Center for federation of different Process Centers, each of which implies a different server instance. This allows departmental instances to share assets, as well as draw from an internal center of excellence plus one hosted by IBM for industry standards and best practices.
  • Support for the CMIS standard to link to any standard ECM repository, as well as direct integration to FileNet ECM, to link documents directly into processes through a drag-and-drop interface in the process designer.
  • There are also some improvements to the mashup tool used for forms design using a variety of integration methods, which I saw in a pre-conference briefing last week. This uses some of the resources from the IBM Mashup Centre development team, but the tool was built new within IBM BPM.
  • Cloud support through IBM SmartCloud, which appears to be more of a managed server environment if you want full IBM BPM, but does offer BPM Express as a pre-installed cloud offering. At last year’s Impact, their story was that they were not doing BPM (that is, execution, not the Blueworks-type modeling and lightweight workflow) in the cloud since their customers weren’t interested in that; at the time, I said that they needed to rethink their strategy and stop offering expensive custom hosted solutions. They’ve taken a small step by offering a pre-installed version of BPM Express, but I still think this needs to advance further.
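
As promised above, here's the kind of minimal client that a REST API plus a developer sandbox makes possible. Note that the endpoint path, host and query parameter here are invented for illustration, not IBM's documented API, so check the actual REST reference before writing anything real.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Fetches a user's task list as JSON from a hypothetical BPM REST endpoint.
public class TaskListClient {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // Hypothetical endpoint; a real client would use the vendor's
        // documented path and authentication scheme.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://bpm.example.com/rest/tasks?user=jdoe"))
                .header("Accept", "application/json")
                .build();

        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("Task list JSON: " + response.body());
    }
}
```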

WebSphere Operational Decision Management (ODM) is an integration/bundling of WebSphere Business Events and ILOG, bringing together events and rules into a single decision management platform for creating policies and deploying decision services. It has a number of new features:

  • Social interface for business people to interact with rules design: decisions are assets that are managed and modified, and the event stream/conversation shows how those assets are being managed. This interface makes it possible to subscribe to changes on specific rules.
  • Full text searching across rules, rule flows, decision tables and folders within a project, with filtering by type, status and date.
  • Improved decision table interface, making it easier to see what a specific table is doing.
  • Track rule versions through a timeline (weirdly reminiscent of Facebook’s Timeline), including snapshots that provide a view of rules at a specific point in time.
  • Any rule can emit an event to be consumed/managed by the event execution engine; conversely, events can invoke rulesets. This close integration of the two engines within ODM (rules and events) is a natural fit for agile and rapid automated decisions.
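
That last point – rules emitting events, and events invoking rulesets – is the interesting one, so here's a tiny sketch of the round trip. All names are hypothetical, and the real product wires this through its event execution engine rather than a toy bus like this.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// A toy event bus standing in for the event execution engine.
class EventBus {
    private final List<Consumer<String>> subscribers = new ArrayList<>();

    void subscribe(Consumer<String> handler) {
        subscribers.add(handler);
    }

    void emit(String event) {
        subscribers.forEach(s -> s.accept(event));
    }
}

// A ruleset whose rules can emit events when they fire.
class CreditRuleset {
    private final EventBus bus;

    CreditRuleset(EventBus bus) {
        this.bus = bus;
    }

    void evaluate(double exposure) {
        if (exposure > 1_000_000) {
            bus.emit("credit.limit.breached");  // the rule emits an event
        }
    }
}

public class RulesEventsDemo {
    public static void main(String[] args) {
        EventBus bus = new EventBus();
        CreditRuleset rules = new CreditRuleset(bus);

        // Conversely, an arriving event can invoke a ruleset; here it just
        // logs, but it could call rules.evaluate with fresh data.
        bus.subscribe(event -> System.out.println("Event received: " + event));

        rules.evaluate(2_500_000);  // fires the rule, which emits the event
    }
}
```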

There’s also zOS news: IBM BPM v8 will run on zOS (not sure if that includes all server components), and the ODM support for zOS is improved, including COBOL support in rules. It would be interesting to see the cost relative to other server platforms, and the compelling reasons to deploy on zOS versus those other platforms, which I assume are mostly around integrating with other zOS applications for better runtime performance.

Since last year’s big announcement about bringing the platforms together, they appear to have been working on integration and design, putting a more consistent and seamless user interface on the portfolio as well as enhancing the capabilities. One of the other analysts (who will remain nameless unless he chooses to identify himself) pointed out that a lot of this is not all that innovative relative to market leaders – he characterized the activity stream social interface as being like Appian Tempo three years ago, and some of the functionality as just repackaged Lombardi – but I don’t think that it’s necessarily IBM’s role to be at the very forefront of technology innovation in application software. By being (fairly) fast followers, they have the effect of validating the market for the new features, such as mobile and social, and introducing their more conservative customer base to what might seem like pretty scary concepts.

Emerging Trends in BPM – Five Years Later

I just found a short article that I wrote for Savvion (now part of Progress Software) dated November 21, 2006, and decided to post it with some updated commentary on the 5th anniversary of the original paper. Enjoy!

Emerging trends in BPM
What happened in 2006, and what’s ahead in 2007

The BPM market continues to evolve, and although 2006 has seen some major events, there will be even more in 2007. This column takes a high-level view of four areas of ongoing significant change in BPM: the interrelationship between SOA and BPM; BPM standards; the spread of process modeling tools; and the impact of Web 2.0 on BPM.

SOA and BPM, together at last. A year ago, many CIOs couldn’t even spell SOA, much less understand what it could do for them. Now, Service-Oriented Architecture and BPM are seen as two ends of the spectrum of integration technologies that many organizations are using as an essential backbone for business agility.

SOA is the architectural philosophy of exposing functionality from a variety of systems as reusable services with standardized interfaces; these, in turn, can be orchestrated into higher-level services, or consumed by other services and applications. BPM systems consume the services from the SOA environment and add in any required human interaction to create a complete business process.

As with every year for the last several years, 2006 has seen ongoing industry consolidation, particularly with vendors seeking to bring SOA and BPM together in their product portfolios. This trend will continue as SOA and BPM become fully recognized as being two essential parts of any organization’s process improvement strategy.

There has certainly been consolidation in the BPM vendor portfolios, especially the integration vendors adding better human-centric capabilities through acquisitions: Oracle acquired BEA in 2008, IBM acquired Lombardi in 2009, Progress acquired Savvion in 2010, and TIBCO acquired Nimbus in 2011. Although BPM is being used in some cases to orchestrate and integrate systems using services, this is still quite a green field for many organizations who have implemented BPM but are still catching up on exposing services from their legacy applications, and orchestrating those with BPM.

BPM standards. 2006 was the year that the Business Process Modeling Notation (BPMN), a notational standard for the graphical representation of process models, went mainstream. Version 2 of the standard was released, and every major BPM vendor is providing some way for their users to make use of the BPMN standard, whether it’s through a third-party modeling tool or directly in their own process modelers.

But BPMN isn’t the only standard that gained importance this year. 2006 also saw the widespread adoption of XPDL (XML Process Definition Language) by BPM vendors as an interchange format: once a process is modeled in BPMN, it’s saved in the XPDL file format to move from one system to another. A possible competitor to XPDL, the Business Process Definition Metamodel (BPDM) had its first draft release this year, but we won’t know the impact of this until later in 2007. On the SOA side, the Business Process Execution Language (BPEL), a service orchestration language, is now widely accepted as an interchange format, if not a full execution standard.

The adoption of BPM standards is critical as we consider how to integrate multiple tools and multiple processes to run our businesses. There’s no doubt that BPMN will remain the predominant standard for the graphical representation of process models, but 2007 could hold an interesting battle between XPDL, BPDM and BPEL as serialization formats.

The “Version 2” that I referred to was actually the second released version of the BPMN standard, but the actual version number was 1.1. That battle for serialization formats still goes on: most vendors support XPDL (and will continue to do so) but are also starting to support the (finally released) BPMN file format as well. BPDM disappeared somewhere in the early days of BPMN 2.0. BPEL is used as a serialization and interchange format primarily between systems that use BPEL as their core execution language, which are a minority in the broader BPMS space.

Modeling for the masses. In March of 2006, Savvion released the latest version of their free, downloadable process modeler: an application that anyone, not just Savvion customers, could download, install and run on their desktop without requiring access to a server. This concept, pioneered by Savvion in 2004, lowers the barrier significantly for process modeling and allows anyone to get started creating process models and finding improvements to their processes.

Unlike generic modeling tools like Microsoft Visio, a purpose-built process modeler can enforce process standards, such as BPMN, and can partially validate the process models before they are even imported into a process server for implementation. It can also provide functionality such as process simulation, which is essential to determining improvements to the process.

2006 saw other BPM vendors start to copy this initiative, and we can expect more in the months to come.

Free or low-cost process modelers have proliferated: there are web-based tools, downloadable applications and Visio BPMN add-ons that have made process modeling accessible – at least financially – to the masses. The problem continues to be that many people using the process modeling tools lack the analysis skills to do significant process optimization (or even, in some cases, representation of an event-driven process): the hype about having all of your business users modeling your business processes has certainly exceeded the reality.

Web 2.0 hits BPM. Web 2.0, a set of technologies and concepts embodied within the next generation of internet software, is beginning to impact enterprise software, too.

Web 2.0 is causing changes in BPM by pushing the requirement for zero-footprint, platform-independent, rich user interfaces, typically built using AJAX (Asynchronous JavaScript and XML). Although browser-based interfaces for executing processes have been around for many years in BPM, the past year has seen many of these converted to AJAX for a lightweight interface with both functionality and speed.

There are two more Web 2.0 characteristics that I think we’re going to start seeing in BPM in 2007: tagging and process syndication. Tagging would allow anyone to add freeform keywords to a process instance (for example, one that required special handling) to make it easier to find that instance in the future by searching on the keywords. Process event syndication would allow internal and external process participants to “subscribe” to a process, and feed that process’ events into a standard feed reader in order to monitor the process, thereby improving visibility into the process through the use of existing feed technologies such as RSS (Really Simple Syndication).
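
To show how little machinery process event syndication would actually need, here's a sketch that renders a process instance's events as an RSS 2.0 feed that any feed reader could poll. The types are hypothetical, and a real implementation would also need to XML-escape the text.

```java
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;
import java.util.List;

// A single event in a process instance's history.
record ProcessEvent(String title, String detail, ZonedDateTime at) {}

class ProcessFeed {
    // Renders the events as a minimal RSS 2.0 document.
    static String toRss(String processName, List<ProcessEvent> events) {
        StringBuilder rss = new StringBuilder()
                .append("<?xml version=\"1.0\"?>\n")
                .append("<rss version=\"2.0\"><channel>\n")
                .append("<title>").append(processName).append(" events</title>\n");
        for (ProcessEvent e : events) {
            rss.append("<item>")
               .append("<title>").append(e.title()).append("</title>")
               .append("<description>").append(e.detail()).append("</description>")
               .append("<pubDate>")
               .append(e.at().format(DateTimeFormatter.RFC_1123_DATE_TIME))
               .append("</pubDate>")
               .append("</item>\n");
        }
        return rss.append("</channel></rss>").toString();
    }

    public static void main(String[] args) {
        List<ProcessEvent> events = List.of(
                new ProcessEvent("Claim 12345 escalated",
                        "Routed to senior adjuster", ZonedDateTime.now()));
        System.out.println(toRss("Claims process", events));
    }
}
```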

Bringing Web 2.0 to BPM will require a few changes to corporate culture, especially those parts that require different – and more creative – types of end-user participation. As more people at all levels in the organization participate in all facets of process improvement, however, the value of this democratization of business processes will become clear.

I’ve been writing and presenting about the impact of social software on BPM for over five years now; adoption has been slower than I predicted, although process syndication (subscribing to a process’ events) has finally become mainstream. Tagging of processes is just starting to emerge; I’ve seen it in BonitaSoft but few other places.

I rarely do year-end prediction posts, but it was fun to look back at one that I did five years ago to see how well I did.

Enterprise BPM Webinar Q&A Followup

I know, two TIBCO-related posts in one day, but I just received the link to the replay of the Enterprise BPM webinar that I did for TIBCO last week, along with the questions that we didn’t have time to answer during the webinar, and wanted to summarize here. First of all, my slides:

These were the questions that came in during the webinar via typed chat that are not related to TIBCO or its products; I think that we covered some of these during the session but will respond to all of them here.

Is it possible to implement BPM (business process management) without a BPMS?

How to capture process before/without technology?

These are both about doing BPM without a BPMS. I wrote recently about Elevations Credit Union (the fact that they are an IBM customer is completely immaterial in this context), which gained a huge part of their BPM success long before they touched any technology. Basically, they carved out some high-level corporate goals related to quality, modeled their value streams, then documented their existing business processes relative to those value streams. Every business process had to fit into a value stream (which was in turn related to a corporate goal), or else it didn’t survive. They saw how processes touched various different groups, and where the inefficiencies lay, and they did all of this using manual mapping on white boards, paper and sticky notes. In other words, they used the management discipline and methodology side of BPM before they (eventually) selected a tool for collaborative process modeling, which then helped them to spread the word further in their organization. There is a misperception in some companies that if you buy a BPMS, your processes will improve, but you really need to reorient your thinking, management and strategic goals around your business processes before you start with any technology, or you won’t get the benefits that you are expecting.

In enterprises that do not have SOA implemented horizontally across the organization, how can BPM be leveraged to implement process governance in the LOB silos, yet have enterprise control?

A BPM center of excellence (CoE) would be the best way to ensure process governance across siloed implementations. I wrote recently about a presentation where Roger Burlton spoke about BPM maturity; he had some advice at the end of that for organizations at only level 1 or 2 in process maturity (and if you’re still very siloed, that’s probably where you are): get a CoE in place and target it more at change initiatives than governance. However, you will be able to leverage the CoE to put standards in place, provide mentoring and training, and eventually build a repository of reusable process artifacts.

I work in the equipment finance industry. Companies in this space are typically classified as banks/bank-affiliates, captives and independents. With a few exceptions it’s my understanding that this particular industry has been rather slow at adopting BPMS. Have you noticed this in other industries and, if so, what do you see as being the “tipping point” for greater BPMS adoption rates? Does it ultimately come down to a solid ROI, or perhaps a few peer success stories?

My biggest customers are in financial services and insurance, so are also fairly conservative. Insurance, in particular, tends to adopt technology at the very end of the adoption tail. I have seen a couple of factors that can overcome slow-moving adoption of any sort of technology, not just a BPMS: first, if they just can’t do business the old way any more, and have to adopt the new technology. An example of this was a business process outsourcer for back-office mutual fund transactions that started losing bids for new work because it was actually written into the RFP that they had to have “imaging and workflow” technology rather than paper-based processes. Secondly, if they can’t change quickly enough to be competitive in the market, which is usually the case when many of their competitors have already started using the technology. So, yes, it does come down to a solid ROI and some peer success stories, but in many cases, the ROI is one of survival rather than just incremental efficiency improvements.

Large scale organizations tend to have multiple BPM / workflow engines. What insights can you share to make these different engines in different organizational BUs into an enterprise BPM capability?

Every large organization that I work with has multiple BPMS, and this is a problem that they struggle with constantly. Going back to the first question, you need to think about both sides of BPM: it’s the management discipline and methodology, then it’s the technology.  The first of these, which is arguably the one with the biggest impact, is completely independent of the specific BPMS that you’re using: it’s about getting the organization oriented around processes, and understanding how the end-to-end business processes relate to the strategic goals. Building a common BPM CoE for the enterprise can help to bring all of these things together, including the expertise related to the multiple BPM products. By bringing them together, it’s possible to start looking at the target use cases for each of the systems currently in use, and selecting the appropriate system for each new implementation. Eventually, this may lead to some systems being replaced to reduce the number of BPMS used in the organization overall, but I rarely see large enterprises without at least two different BPMS in use, so don’t be fanatical about getting it down to a single system.

Typically, what is the best order to implement: BPM first and SOA last, or vice versa?

I recommend a hybrid approach rather than a purely top-down (BPM first) or bottom-up (SOA first) one. First, do an inventory of existing services in your environment, since there will almost always be some out there, even if just in your packaged applications such as ERP. While this is happening, start your BPM initiative by setting the goals and doing some top-down process modeling. Assuming that you have a particular process in mind for implementation, do the more detailed process design for that, taking advantage of any services that you have discovered, and identifying what other services need to be created. If possible, implement the process even without the services: it will be no worse from an efficiency standpoint than your current manual process, and it will provide a framework both for adding services later and for process monitoring. As you develop the services for integration and automation, replace the manual steps in the process with the services, as shown in the sketch below.
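
The "implement the process even without the services" advice is easier to see in code: each integration step is written against an interface, with a manual (human task) implementation swapped out for a real service call once it exists. A rough sketch, with all of the service and queue names invented for illustration:

```java
// Each integration point in the process is coded against a contract.
interface AddressLookupStep {
    String lookupAddress(String customerId);
}

// Hypothetical stand-ins for the BPMS work queue and a future CRM service.
class HumanTaskQueue {
    static String assign(String instructions) {
        return "pending: " + instructions;  // a person completes this task
    }
}

class CrmService {
    static String getAddress(String customerId) {
        return "123 Example St";
    }
}

// Phase 1: the step routes work to a person, as the manual process does today.
class ManualAddressLookup implements AddressLookupStep {
    public String lookupAddress(String customerId) {
        return HumanTaskQueue.assign("Look up address for customer " + customerId);
    }
}

// Phase 2: same contract, now fulfilled by a discovered or newly built
// service; the process definition doesn't change, only the step's implementation.
class CrmAddressLookup implements AddressLookupStep {
    public String lookupAddress(String customerId) {
        return CrmService.getAddress(customerId);
    }
}
```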

Re: Enterprise BPM Goals – Develop, Execute, but what about Governance?

This was in response to the material on my agenda for the webinar. Yes, governance is important, but I only had 40 minutes and could barely cover the design/develop/execute parts of what we wanted to cover. Maybe TIBCO will have me back for another webinar on governance. ;-)

Data/content centric processes vs. people-centric vs. EAI/integration centric re: multiple BPMS platforms. Any guidelines for when and where to demarcate?

These divisions are very similar to the Forrester divisions of the BPMS landscape from a few years ago, and grew mostly out of the different types of systems that were all lumped together as “BPMS” by the analysts in the early 2000’s. Many of today’s products offer strength in more than one area, but you need to have a good understanding of your primary use cases when selecting a product. Personally, I think that content-centric and human-centric isn’t the right way to split it: more like unstructured (case management) versus structured; even then, there is more of a spectrum of functionality in most cases than purely unstructured or purely structured. So really, the division is between processes that have people involved (human-centric) or those that are more for automated integration (system-centric), with the latter having to accommodate this wider spectrum of process types. If you have mostly automated integration processes, then certainly an integration-centric BPMS makes sense; if you have human-facing processes, then the question is a bit more complex, since you’re dealing with content/documents, process types, social/collaborative capabilities and a host of other requirements that you need to look at relative to your own use cases. In general, the market is moving towards the full range of human-facing processes being handled by a single product, although specialist product companies would differ.

Thoughts on the role of the application/solution architect within an LOB or COE vs. that of the enterprise architect assigned to the BPM domain?

An enterprise architect assigned to the BPM CoE/domain is still (typically) part of the EA team, and is therefore involved with the broader scope of enterprise architecture issues. An application/solution architect tends to be more product and technology focused, and in many cases that is just a fancy term for a developer. In other words, the EA should be concerned with overall strategy and goals, whereas the solution architect is focused on implementation.

Role of the COE in governance? How far does/should it extend?

The CoE is core to governance: that’s what it’s there for. At the very least, the CoE will set the standards and procedures for governance, and may rely on the individual projects to enforce that governance.

Is it really IT giving up control? In many cases, the business does whatever they do — and IT has little (or aged) information about the actual processes.

This was in reference to slide #11 in my deck about cultural issues. Certainly the business can (and often does) go off and implement their own processes, but that is outside the context of enterprise-wide systems. In order to have the business do that within the enterprise BPMS, IT has to ensure that the business can access the process discovery and modeling tools that become the front end of process design. That way, business and IT share models of the business processes, which means that what gets implemented in the BPMS might actually resemble what is required by the business. In some cases, I see a company buy a BPMS but not allow the business users to use the business-level tools to participate in process modeling: this is usually the result of someone in IT thinking that this is beyond the capability of the business people.

Is following any BPM notation standards part of BPM development? I saw that there was no mention of it.

There was so much that I did not have time to address with only 40 minutes or so to speak, and standards didn’t make the cut. In longer presentations, I always address the issue of standards, since a common process modeling notation is essential to communication between various stakeholders. BPMN is the obvious front-runner there, and if used properly, can be understood by both business and IT. It’s not just about process models, however: a BPMS implementation has to also consider data models, organizational models and more, around which there is less standardization.
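To make that concrete, here's a minimal sketch of what a standard notation buys you. It uses the open-source Camunda BPMN model API purely as my choice of illustration (it wasn't part of the webinar), and the process and task names are invented for the example: the point is that the model serializes to standard BPMN 2.0 XML, which is what lets different stakeholders and tools share the same process definition.

```java
import org.camunda.bpm.model.bpmn.Bpmn;
import org.camunda.bpm.model.bpmn.BpmnModelInstance;

public class BpmnExample {
    public static void main(String[] args) {
        // Build a trivial three-step process; each builder call maps to a
        // standard BPMN 2.0 construct (start event, user task, end event).
        BpmnModelInstance model = Bpmn.createExecutableProcess("claimApproval")
                .startEvent("claimReceived")
                .userTask("reviewClaim").name("Review claim")
                .endEvent("claimClosed")
                .done();

        // Serialize to standard BPMN 2.0 XML, the interchange format that
        // any compliant modeling tool should be able to open.
        System.out.println(Bpmn.convertToString(model));
    }
}
```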

Regarding Common UI: shouldn’t it be Common Architecture, accessed by different UIs that fit the user’s roles, knowledge, etc?

In the context of slide #6, I did mean a common UI, literally. In other words, using the BPMS' composite application development and forms environment to create a user interface that hides multiple legacy applications behind a single user interface, so that the user deals with this new integrated UI instead of multiple legacy UIs. Your point seems to be more about persona-based (or role-based) interfaces into the BPMS, which is valid, but different. That "single UI" that I mention would, in fact, be configurable for the different personas who need to access it.
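For illustration only, here's a minimal sketch of that composite idea in plain Java. The service and field names are hypothetical, and in a real BPMS the equivalent would be assembled in its composite application and forms environment rather than hand-coded; the structure is what matters.

```java
// Hypothetical clients for two legacy back ends; in a BPMS these would be
// generated service connectors rather than hand-written interfaces.
interface PolicySystem { String getPolicy(String customerId); }
interface BillingSystem { String getBalance(String customerId); }

// What the "common UI" actually talks to: one facade, so the user sees a
// single integrated screen instead of logging into each legacy application.
class CustomerCompositeService {
    private final PolicySystem policies;
    private final BillingSystem billing;

    CustomerCompositeService(PolicySystem policies, BillingSystem billing) {
        this.policies = policies;
        this.billing = billing;
    }

    // One call aggregates data from both legacy systems; the UI never
    // needs to know which back end owns which field.
    CustomerView loadCustomer(String customerId) {
        return new CustomerView(
                policies.getPolicy(customerId),
                billing.getBalance(customerId));
    }
}

record CustomerView(String policy, String balance) {}
```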

How does a fully-fledged BPM tool stack up against the workflow tools that are part of other COTS applications, e.g., workflow in a document management tool or in a trouble ticketing tool?

A full BPMS tends to be much more flexible than the embedded workflow that you will find within another platform, and is more of an application development platform than just a way to control processes within that application. On the other hand, the workflow within those applications is typically already fully integrated with their other business objects (e.g., documents, trouble tickets), so the implementation may be faster for that particular type of process. If the only process management that you need is document approvals within your document management system, it may make sense to use that rather than purchase a full BPMS; if you have broader process management needs, start looking at a more general BPMS platform that can handle more of your use cases.

How do you see BPM tools surviving when CRM tools with more or less the same capabilities, and with out-of-the-box processes defined, are being widely accepted by enterprises?

Similar to my response to the previous question, if the processes are related only to the business objects within the CRM, then you may be better off using the workflow tools within it. However, as soon as you want to integrate other data sources, systems or users, you'll start to get beyond the functional capabilities of the simpler workflow tools within the CRM. There's room in the market for both; the trick, for customers, is understanding when to use one versus the other.

What are the reasons you see for BPM tools not getting quickly and widely accepted and what are the solutions to overcome that?

There are both cost and complexity components to BPMS adoption, but a bigger issue, before you even start looking at tools, is moving your organization to a process-driven orientation, as I discussed above. Once people start to look at the business as end-to-end processes, and at those processes as assets and capabilities that the business offers to its customers, there will be a strong pull for BPMS technologies to help that along. Once that motivation is in place, the cost and complexity barriers are still there, but are becoming less significant. On cost, more vendors are offering cloud-based versions of their software that let you try it out – and even do your full development and testing – without capital expenditures; if the vendor offers the option, you can then move your production processes on-premise, or leave them in the cloud to keep the total cost down. As for complexity, the products are getting easier to use, but are also offering a lot more functionality. This shifts the complexity from one of depth (learning how to do a particular function) to breadth (learning what all the functions are, and when to use which), which is still complex, but less of a technological complexity.

Is it possible to start introducing and implementing BPM in one department or module only, and then extend BPM to other departments or modules? Or should this be an enterprise-wide decision, since it involves heavy costs to bring in BPM technologies?

Almost every organization that I work with does their BPM implementation in one department first, or for one process first (which may span departments): it's just not possible to implement everything that you will ever do in BPM all at once, the first time. There needs to be ROI within that first implementation, but you also have to look at enterprise cost justification, as with any horizontal technology: plan for the other projects that will use this, and allocate the costs accordingly. That might mean that some of the initial costs come from a shared services or infrastructure budget rather than the project budget, since they will eventually be allocated to future projects and processes.

How difficult would it be to replace a legacy workflow system with BPM?

It depends (that’s always the consultant’s answer). Seriously, though, it depends on the level of integration between the existing workflow system and other systems, and how much of the user interface that it provides. I have seen situations where a legacy workflow system is deeply embedded in a custom application platform, with fairly well-defined integration points to other systems, and the user interface hiding the workflow system from the end user. In this case, although it’s not trivial, it is a straightforward exercise to rip out the workflow system since it is being used purely as a process engine, replace it with a new one, refactor the integration points so that the new system calls the other systems in the environment (usually easier since modern BPMS’ have better integration capabilities) and refactor the custom UI so that it calls the new BPMS (also usually easier because of updated functionality). That’s the best case, and as I said, it’s still not trivial. If the legacy workflow system also provides the user interface, then you’re looking at redeveloping your entire UI either in the new BPMS or in some other UI development tool, plus the back-end systems integration work. A major consideration in either case is that you don’t just want to replace the same functionality of the old workflow system, since the new BPMS will have far greater functionality: you need to think about how you are going to leverage capabilities such as runtime collaboration that never existed in the old system, in order to see the greatest benefit from the upgrade.

Is it possible to switch between BPM vendors without having pain?

No. Similar to the previous answer, this is a non-trivial exercise, and depending on how much of the BPMS' functionality you were using, it could be pretty much a complete redevelopment. If the BPMS was used primarily for orchestration of automated processes, it will be much easier, but as soon as you get into custom integration/orchestration and user interfaces, it gets a lot more complicated (and painful).

Do we really need to go for BPM in a situation where we need integration orchestration only?

One end of the BPMS market is integration-centric systems, which primarily do just integration orchestration. The advantage of using a BPMS for this instead of orchestrating directly in application code is that you get all of the other stuff that comes with the suite "for free": graphical process modeling, execution monitoring, process governance and whatever other goodies it includes. It's not really free, of course, but it's worth comparing all of that functionality against the parts of it that you would have to custom-build if you did the orchestration in code.
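To give a feel for what "for free" means, here's a hypothetical sketch of the plumbing that even a trivial hand-coded orchestration ends up re-implementing – step sequencing, retries, and rudimentary execution logging – before any business logic appears. All of the names are invented for the example.

```java
import java.util.List;

// A hand-coded stand-in for what an integration-centric BPMS provides
// out of the box: sequencing, retries, and execution logging.
class HandRolledOrchestrator {
    record Step(String name, Runnable action) {}

    void run(List<Step> steps) {
        for (Step step : steps) {
            execWithRetry(step, 3);
        }
    }

    private void execWithRetry(Step step, int attempts) {
        for (int i = 1; i <= attempts; i++) {
            try {
                System.out.println("starting " + step.name()); // poor man's monitoring
                step.action().run();
                System.out.println("finished " + step.name());
                return;
            } catch (RuntimeException e) {
                System.out.println("attempt " + i + " of " + step.name()
                        + " failed: " + e.getMessage());
            }
        }
        throw new IllegalStateException(step.name() + " failed after " + attempts + " attempts");
    }

    public static void main(String[] args) {
        new HandRolledOrchestrator().run(List.of(
                new Step("fetch-order", () -> System.out.println("calling order system")),
                new Step("post-invoice", () -> System.out.println("calling billing system"))));
    }
}
```

Note that none of this is business logic: it's exactly the kind of infrastructure that an integration-centric BPMS supplies, with graphical modeling and monitoring layered on top.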

That’s it for the Q&A. If you listen to the replay, or were on the live broadcast, my apologies for the rushed beginning: I got off to a rough start, but settled down after the first few minutes.