IBM FileNet BPM Product Update

Last session of the day, and Mike Fannon and Dave Yockelson are giving an update on FileNet BPM, particularly the 5.x release. The highlights:

  • The Process Engine (PE) was ported completely to a standard Java application, with some dramatic performance increases: 60% improvement in response time through the Java API, 70% (or more) reduction in CPU utilization, near-linear growth in CPU utilization for vertical scaling (i.e., more processes on a single server), and constant CPU utilization on horizontal scaling (e.g., twice as many processes on twice as many servers).
  • Linux and zLinux support.
  • Multi-tenancy, allowing multiple PE instances to run on the same virtual server, so that different isolated regions can be tied to separate PE database stores. If you currently have multiple isolated regions in a single store, there will be a procedure for migrating them to separate stores for better multi-tenancy.
  • Simplified installation, configuration and operation.
  • Deployment/upgrade paths directly from pretty much any currently supported FileNet BPM environment to 5.x, going all the way back to eProcess (there was one person in the audience who admitted to still using it), as well as v3.5.3, 4.0.3, 4.5.0 and 4.5.1.
  • Process Analyzer is now Case Analyzer, having been extended to add capabilities for Case Manager. Case Analyzer reporting is now supported through Cognos BI in addition to the old-school Excel pivot tables.
  • Process Monitor is now Case Monitor (I seem to be seeing a trend here), with Cognos Real-time Monitoring 10.1 (previously called Cognos Now) bundled in as an interactive dashboard solution.
  • Integration of IBM Forms (as we saw in the Case Manager product update) to be used in the same way as FileNet eForms are used in FileNet BPM today, namely, for a richer UI replacement that provides functionality such as digital signatures.

We moved on to yet another presentation on Case Manager; I could probably have skipped the previous session and just come to this one, but there was no indication on the conference materials that that would be a good idea.

Time for a quick sprint through the vendor expo, then off to the evening networking event, which promises displays highlighting 100 years of the history of IBM and the computing industry. We’ll also have a concert by Train, which is the third Train concert at the three large vendor conferences that I’ve attended in the last six weeks: Progress, TIBCO and now IBM. Not sure if the corporate gig is a new market strategy for Train; maybe I’ll actually make it to tonight’s concert after missing the previous two.

What’s New in IBM ECM Products

Feri Clayton gave an update on the ECM product portfolio and roadmap, in a bit more depth than yesterday’s Bisconti/Murphy ECM product strategy session. She reinforced the message that the products are made up of suites of capabilities and components, so that you’re not using different software silos. I’m not sure I completely buy into IBM’s implementation of this message as long as there are still quite different design environments for many of these tools, although they are making strides in consolidating the end user experience.

She showed the roadmap for what has been released in 2011, plus the remainder of this year and 2012: on the BPM side, there will be a 5.1 release of both BPM and Case Manager in Q4, which I’ll be hearing more about in separate BPM and Case Manager product sessions this afternoon. The new Nexus UI will preview in Q4, and be released in Q2 of 2012. There’s another Case Manager release projected for Q4 2012.

There was a question about why BPM didn’t appear in the ECM portfolio diagram, and Clayton stated that “BPM is now considered part of Case Manager”. Unlike the BPM vendors who think of ACM as a part of BPM, I think that she’s right: BPM (that is, structured process management that you would do with IBM FileNet BPM) is a functionality within ACM, not the other way around.

She went through the individual products in the portfolio, and some of the updates:

  • Production Imaging and Capture now includes remote capture, which is nice for organizations that don’t want to centralize their scanning/capture. It’s not clear how much of this is the Datacap platform versus the heritage FileNet Capture, but I imagine that the Datacap technology is going to be driving the capture direction from here on. They’ve integrated the IBM Classification Module for auto recognition and classification of documents.
  • Content Manager OnDemand (CMOD) for report storage and presentment will see a number of enhancements including CMIS integration.
  • Social Content Management uses an integration of IBM Connections with ECM to allow content in an ECM library to be accessed and managed from within Connections, display ECM content within a Connections Community, and support a few other cross-product integrations. There are a couple of product announcements about this, but they seem to be in the area of integration between Connections and ECM as opposed to adding any native social content management to ECM.
  • FileNet P8, the core content management product, had a recent release (August) with such enhancements as bidirectional replication between P8 and Image Services, content encryption, and a new IBM-created search engine (replacing Verity).
  • IBM Content Manager (a.k.a., the product that used to compete with P8) has a laundry list of enhancements, although it still lags far behind P8 in most areas.

We had another short demo of Nexus, pretty much the same as I saw yesterday: the three-pane UI dominated by an activity stream with content-related events, plus panes for favorites and repositories. They highlighted the customizability of Nexus, including lookups and rules applied to metadata field entry during document import, plus some nice enhancements to the content viewer. The new UI also includes a work inbasket for case management tasks; not sure if this also includes other types of tasks such as BPM or even legacy Content Manager content lifecycle tasks (if those are still supported).

Nexus will replace all of the current end-user clients for both content and image servers, providing a rich and flexible user experience that is highly customizable and extensible. They will also be adding more social features to this; it will be interesting to see how this develops as they expand from a simple activity stream to more social capabilities.

Clayton then moved on to talk about ACM and the Case Manager product, which is now coming up to its second release (called v5.1, naturally). Given that much of the audience probably hasn’t seen it before, she went through some of the use cases for Case Manager across a variety of industries. Even more than the base content management, Case Manager is a combination of a broad portfolio of IBM products within a common framework. She listed some of the new features, but I expect to see these in more detail in this afternoon’s dedicated Case Manager session so will wait to cover them then.

She discussed FileNet P8 BPM version 5.x: now Java-based for significant performance and capacity improvements (also due to a great deal of refactoring to remove old code sludge, as I have heard). As I wrote about last month, it provides Linux and zLinux support, and also allows for multi-tenancy.

With only a few minutes to go, she whipped through information lifecycle governance (records and retention management), including integration of the PSS Atlas product; IBM Content Collector; and search and content analytics. Given the huge focus on analytics in the morning keynote, it’s kind of funny that it gets about 30 seconds at the end of this session.

Better Together: IBM Case Manager, IBM Content Manager and IBM BPM

Dave Yockelson from ECM product marketing and Amy Dickson from IBM BPM product management talked about something that I’m sure is on the minds of all FileNet customers who are doing anything with process: how do the (FileNet-based) Case Manager and Content Manager fit together with the WebSphere BPM products?

They started with a description of the IBM BPM portfolio – nothing new here – and how ACM requires an integrated approach that addresses repeatable patterns. Hmmmm, not completely sure I agree with that. Yockelson went through the three Forrester divisions of case management from their report on the ACM space, then went through a bit more detail on IBM Case Manager (ICM) and how it knits together functionality from the entire IBM software portfolio: content, collaboration, workflow, rules, events, integration, and monitoring and analytics. He positioned it as a rapid application development environment for case-based solutions, which is probably a good description. Dickson then went through IBM BPM (the amalgam of Lombardi and WebSphere Process Server that I covered at Impact), which she promised would finish up the “background” part and allow them to move on to the “better together” part.

So, in the aforementioned better together area:

  • Extend IBM BPM processes with content, using document and list widgets that can be integrated in a BPM application. This does not include content event processes, e.g., spawning a specific process when a document event such as check-in occurs, so is no different than integrating FileNet content into any BPMS.
  • Extend IBM BPM Advanced (i.e., WPS) processes with content through a WebSphere CMIS adapter into the content repository. Ditto re: any BPMS (or other system) that supports CMIS being able to integrate with FileNet content; see the sketch after this list.
  • Invoke an IBM BPM Advanced process from an ICM case task. Assuming that this is via a web service call (since WPS allows processes to be exposed as web services), not specifically an IBM-to-IBM integration.
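
Since CMIS is an open standard, that point generalizes: any CMIS client can reach into a FileNet (or other CMIS-compliant) repository. As a rough illustration only, here’s a minimal sketch using Apache Chemistry OpenCMIS to connect and query for documents; the endpoint URL, repository ID, credentials and query string are hypothetical placeholders, not the configuration of IBM’s WebSphere CMIS adapter.

```java
// Minimal OpenCMIS sketch: connect to a CMIS-compliant repository and run a query.
// The URL, repository ID, credentials and query are hypothetical placeholders.
import java.util.HashMap;
import java.util.Map;

import org.apache.chemistry.opencmis.client.api.ItemIterable;
import org.apache.chemistry.opencmis.client.api.QueryResult;
import org.apache.chemistry.opencmis.client.api.Session;
import org.apache.chemistry.opencmis.client.api.SessionFactory;
import org.apache.chemistry.opencmis.client.runtime.SessionFactoryImpl;
import org.apache.chemistry.opencmis.commons.SessionParameter;
import org.apache.chemistry.opencmis.commons.enums.BindingType;

public class CmisContentLookup {
    public static void main(String[] args) {
        SessionFactory factory = SessionFactoryImpl.newInstance();

        Map<String, String> params = new HashMap<String, String>();
        params.put(SessionParameter.USER, "bpmServiceAccount");                  // hypothetical
        params.put(SessionParameter.PASSWORD, "secret");                          // hypothetical
        params.put(SessionParameter.ATOMPUB_URL,
                "https://content.example.com/cmis/resources/Service");            // hypothetical endpoint
        params.put(SessionParameter.BINDING_TYPE, BindingType.ATOMPUB.value());
        params.put(SessionParameter.REPOSITORY_ID, "TargetObjectStore");          // hypothetical object store

        Session session = factory.createSession(params);

        // A process step might run a CMIS query like this to find the documents
        // related to a case before routing or displaying them.
        ItemIterable<QueryResult> results = session.query(
                "SELECT cmis:objectId, cmis:name FROM cmis:document WHERE cmis:name LIKE 'CLAIM-%'",
                false);
        for (QueryResult hit : results) {
            System.out.println(hit.getPropertyValueById("cmis:objectId")
                    + " : " + hit.getPropertyValueById("cmis:name"));
        }
    }
}
```

The same pattern would apply whether the caller is IBM BPM Advanced, another vendor’s BPMS, or plain application code, which is exactly why this integration point isn’t much of a differentiator.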

Coming up, we’ll see some additional integration points:

  • Invoke an IBM BPM Express/Standard process from an ICM case task. This, interestingly, implies that you can’t expose a BPM Express/Standard process as a web service; if you could, this could have been done without additional integration. The selection of the process and mapping of case to process variables is built right into the ICM Builder, which is definitely a nice piece of integration to make it relatively seamless to integrate ICM and BPM.
  • Provide a federated inbox for ICM and BPM (there was already an integrated inbox for the different types of BPM processes) so that you see all of your tasks in a single list, based on the Business Space Human Tasks widget. When you click on a task in the list, the appropriate widgets are spawned to handle that type of work.
  • Interact with ICM cases directly from a BPM process through an integration service that allows cases to be created, retrieved and updated (metadata only, it appears) as part of a BPM process.

This definitely fits IBM’s usual modus operandi of integrating rather than combining products with similar functionality; this has a lot of advantages in terms of reducing the time to release something that looks (sort of) like a single product, but has some disadvantages in the underlying software complexity, as I discussed in my IBM BPM review from Impact. An audience member asked about consolidation of the design environment; as expected, the answer is “yes, over time”, which is similar to the answer I received at Impact about consolidation of the process engines. I expect that we’ll see a unified design environment at some point for ICM and both flavors of BPM by pulling ICM design into the Process Center, but there might still be three engines under the covers for the foreseeable future. Given the multi-product mix that makes up ICM, there will also be separate engines (and likely design environments) for non-process functions such as rules, events and analytics; the separate engines are inevitable in that case, but there could definitely be some better integration on the design side.

Enabling Agile Processes With IBM BPM For z/OS

Dave Marquard, Janet Wall and Eric Herness from the IBM BPM team gave an analyst briefing today on BPM on the z/OS platform. At Impact earlier this year, we saw a merging of the Lombardi acquisition and WebSphere Process Server into a unified IBM BPM product, and this month, they released BPM on z/OS. This is intended to unify across the historic divide between z (mainframe) and non-z assets, and allow the benefits of BPM for agility and visibility to be combined more easily with the z/OS applications and data.

The presentation highlighted a typical process problem in a System z environment: account opening in a financial institution, where paper-based manual processes at the front end are combined with multiple repositories of customer information, a variety of systems for risk assessment and customer care, and legacy account management systems. In their new vision, this can be replaced with explicit process management and better orchestration between the components; this, of course, is not unique to this platform, but is a general benefit of BPM. Deploying BPM on z/OS, however, leverages co-location for better performance and access to data, as well as the scalability that you would expect on this larger platform.

From an IBM BPM architecture standpoint, the Process Server components can now be hosted on z/OS, while the Process Center and its repository stay on Windows, AIX or Linux. Process Server Advanced for z/OS is more than just a simple port: it leverages native z/OS data structures, supports languages such as COBOL, provides local adapters to other z/OS applications, and allows reusable services to be created more easily. Since the process and services are both running on z/OS, WebSphere z/OS does optimization for cross-memory local communications to improve performance and resource utilization, providing the most benefit when the processes frequently interact with DB2, CICS and IMS on the same platform, and also providing seamless integration with other facilities such as RACF.

This plugs into Business Monitor for z/OS that monitors the processes, other z/OS applications and events, and provides user-customizable dashboards for overall monitoring and some KPI-based predictive analytics. Other process-related offerings that are already on z/OS include business rules, ESB and message broker, so the migration of BPM to this platform provides a pretty robust set of tools for those companies who rely on z/OS for their primary operations. This is now providing a much more model-driven, process-oriented platform, allowing the underlying DB2 and CICS applications to be abstracted and orchestrated more easily.

They talked about a couple of case studies (without naming the clients), highlighting scalability, performance and resilience as the key differentiators of running BPM on z/OS for existing z/OS clients.

Since most of my customers are in financial services and insurance, many of them are on IBM mainframe platforms. Although not all will choose to deploy BPM on z/OS, this does provide an option for those who want to more fully integrate their mission-critical processes with their existing z/OS applications.

Colonial Life at TUCON

I’m wrapping up my TUCON trip with the Colonial Life presentation on their TIBCO iProcess and BusinessWorks implementation in their back office. I work a lot with insurance companies, and find that they can be very conservative in terms of technology implementations: many are just implementing document imaging and workflow, and haven’t really looked at full BPM functionality that includes orchestration of different systems as well as work management. I had a chance to talk with the two presenters, Bijit Das from the business side and Phil Johnston from IT, in a private briefing yesterday; I heard about their business goals to do better work management and improve efficiencies by removing paper from the process, as well as their technical goal to build an agile solution that could be used across multiple process flows. They have done their first implementation in their policy administration area, where they receive about 180k pages of inbound documentation per year, resulting in about 10k work items per month.

They ended up using iProcess primarily as a state management and queuing engine, embedding most of the process flow rules in external database tables, and having just simple process flows in iProcess that routed work based on the table values rather than logic within the process model itself. Once a piece of work ended up in the right queue (or in a user-filtered view of a common work queue), the user could complete it, route it elsewhere, or put it on hold while they performed some activity outside of BPM. A huge part of their improvements came from using BW to create reusable services, where these services could be called from the processes, but they also turned that around and have some cases where iProcess is called as a service from BW for queue and state management, using services that had been developed by Unum (their parent company) for their implementation. They wrote their own custom user interface portal, allowing users to select the queue that they want to work, filter the queue manually, and select the work item that they want to work on. This is a bit unusual for back-office transactional systems, which typically push the next piece of work to a user rather than allowing them to select it, but it’s a lot harder to build those rules when you’re effectively writing all the work management in database tables rather than leveraging work management capabilities in a BPMS.
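
To make the table-driven approach concrete, here’s a minimal sketch of the kind of lookup it implies, with routing rules held in a database table rather than in the process model; the table, column and queue names are hypothetical rather than Colonial Life’s actual schema, and in their implementation the equivalent lookup drives simple iProcess flows rather than standalone code.

```java
// A minimal sketch of table-driven work routing: the next queue for a work item
// is looked up from a routing_rules table (doc_type, policy_status, target_queue)
// instead of being hard-wired into the process model. Table, column and queue
// names are hypothetical.
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class RoutingLookup {

    /** Returns the queue for a work item, falling back to a default queue if no rule matches. */
    public static String resolveQueue(Connection conn, String docType, String policyStatus)
            throws SQLException {
        String sql = "SELECT target_queue FROM routing_rules "
                + "WHERE doc_type = ? AND policy_status = ?";
        try (PreparedStatement stmt = conn.prepareStatement(sql)) {
            stmt.setString(1, docType);
            stmt.setString(2, policyStatus);
            try (ResultSet rs = stmt.executeQuery()) {
                return rs.next() ? rs.getString("target_queue") : "GENERAL_REVIEW_QUEUE";
            }
        }
    }
}
```

The appeal of this style is that routing changes become table updates rather than process model redeployments; the tradeoff, as noted above, is that work management capabilities such as push-based work distribution have to be rebuilt by hand.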

They transitioned from a very waterfall development model to a much more agile methodology throughout their first project lifecycle, which meant that the business area was seeing the code as it was being developed and tested, allowing for much smoother iterations. Although their first production release took about nine months (after an additional six months to implement the infrastructure), they did their next release in two months. They still do a lot of swivel-chair integration with their legacy policy administration system, and need to build better integration to further improve efficiencies and start to do some straight-through processing.

They’ve seen some impressive improvements:

  • The discovery and modeling that happened prior to the implementation forced them to look at their processes critically, and reorganize their teams so that similar work was processed by the same team
  • Minimized handoffs have improved SLAs by 4%
  • Increased visibility into processes
  • Removed 180k pieces of paper per year from the operations area
  • 20% efficiency improvement
  • Standardized solution for future implementations in other areas

They also learned some lessons and best practices, such as establishing scope, tools for process discovery and bringing in the right resources at the right time. Yesterday, when I met with them, I mentioned Nimbus to them, which they had not yet looked at; obviously, they had time to check it out since then, since Bijit called it out during the presentation, saying that it could have helped them during process discovery. Their next steps are to do more system integration to further improve efficiencies by automating where possible, add input channels, and integrate smart forms to drive processes.

Although they have seen a huge amount of improvement in their processes, this still feels a bit like an old-school document workflow implementation, with table-driven simple process flows. Undoubtedly, the service layer is more modern, but I’m left feeling like they could see a lot more benefit to their business if they were to take advantage of newer BPM capabilities. However, this was probably a necessary first step for such a paper-bound organization that was cautiously dipping its toe into the BPM waters.

Driving the Adoption of Business Process Initiatives With @NimbusIP

Mark Cotgrove and Clark Swain from Nimbus Partners presented in a breakout session on Nimbus and how it fits into the bigger TIBCO picture, as an expansion of the short presentation we saw from Cotgrove at the analyst session yesterday. To sum up the message from yesterday, Nimbus Control provides an essential bit of business-driven process discovery functionality that isn’t really covered in TIBCO’s AMX/BPM offering, but more importantly, the ability to create intelligent operations manuals that can then interact with AMX/BPM in a variety of ways.

Nimbus Control doesn’t do process automation: they do process and procedural documentation that can also be linked to supporting documentation and other content required to perform a manual process. Some of the manual steps may be to interact with systems in specific ways, such as entering an order on an ERP system; others may be to perform purely manual tasks such as having a customer sign a paper document. There are a few competitors in this space, such as BusinessGenetics and Business Optix (formerly ProcessMaster), and there is some overlap with BPA tools such as ARIS and Blueprint in terms of the process discovery side, but not the end-user procedural help.

Swain started on a demo, but due to the late session start (apparently the keynote went way overtime), I had to leave for another meeting, and will have to see a more detailed demo some other time.

TIBCO Acquisitions With Tom Laffey: OpenSpirit, Loyalty Lab and Nimbus

Tom Laffey, EVP of products and technology, moderated a session highlighting three of TIBCO’s recent acquisitions: OpenSpirit, Loyalty Lab and Nimbus.

Clay Harter, CTO of OpenSpirit (which was acquired by TIBCO a year ago), discussed their focus on delivering data and integration applications to the oil and gas industry. Their runtime framework provided a canonical data model over a heterogeneous set of data stores, and their desktop applications integrated with spatial data products such as ESRI’s ArcGIS and Schlumberger’s remote sensing. Due to their knowledge of the specialized data sources, they have a huge penetration into 330+ oil companies and relationships with industry-specific ISVs. In October, they will release a BusinessWorks plugin for OpenSpirit to make oil and gas technical data available through the TIBCO ESB. They are also prototyping a Spotfire extension for OpenSpirit for visualizing and analyzing this data, which is pretty cool – I worked as a field engineer in oil and gas in the early ’80s, and the sensing and visualization of data was a whole different ball game then, mostly black magic. OpenSpirit’s focus is on reducing exploration costs and increasing safety through better analysis of the petrotechnical data, particularly through interdisciplinary collaboration. From TIBCO’s standpoint, they were building their energy vertical, and the acquisition of OpenSpirit brings them expertise and credibility in that domain.

Keith Rose, formerly president of Loyalty Lab and now leading the sales efforts in that area since their acquisition by TIBCO, presented on their event-driven view of managing customer loyalty, particularly loyalty programs such as those used by airlines and retailers. They have a suite of products that support marketers in terms of visualizing and analyzing loyalty-related data, and building loyalty programs that can leverage that information. Their focus on events – the core of real-time and one-to-one loyalty marketing programs – was likely the big reason for the TIBCO acquisition, since TIBCO’s event and messaging infrastructure seems like a natural fit to feed into Loyalty Lab’s analysis and programs. Spotfire for visualization and analysis of data also makes a lot of sense here, if they can work out how to integrate that with their existing offerings. With 99% of their customers on a hosted cloud solution, they may also want to consider how a move to TIBCO’s cloud platform can benefit them and integrate with other initiatives that their customers may have.

Less than a month ago, Nimbus was acquired by TIBCO, and Mark Cotgrove, a founder and EVP, gave us a briefing on their product and why it made sense for TIBCO to acquire them. Nimbus provides tools for process discovery and analysis, including the 80% (or so) of an organization’s activities that are manual and are likely to remain manual. Currently, the automated activities are handled with enterprise applications and automated BPM (such as AMX/BPM), but the manual ones are managed with a mix of office productivity software (Word, PowerPoint, Visio) and business process analysis tools. Furthermore, end-to-end processes range back and forth between manual and automated activities as they progress through their lifecycle, such that often a single process instance ends up being managed by a variety of different tools. Nimbus provides what are essentially storyboards or guided walkthroughs for business processes: like procedures manuals, but more interactive. These “intelligent operations manuals” can include steps that will instruct the user to interact with a system of some sort – for example, an ERP system, or a BPMS such as AMX/BPM – but documents all of the steps including paper handling and other manual activities. Just as a BPMS can be an orchestration of multiple integrated systems, Nimbus Control can be an orchestration of human activities, including manual steps and interaction with systems. There are a few potential integration points between Nimbus and a few different TIBCO products: metrics in the context of a process using Spotfire; exporting discovered processes from Nimbus to BusinessStudio; instantiating an AMX/BPM process from Nimbus; worker accessing a Nimbus operations manual for instructions at the step in an AMX/BPM process; collaborative process discovery using tibbr; and tibbr collaboration as part of a manual process execution. Some or all of these may not happen exactly like this, but there is some interesting potential here. There’s also potential within an organization for finding opportunities for AMX/BPM implementation through process discovery using Nimbus.

An interesting view of three different acquisitions, based on three very different rationales: industry vertical; horizontal application platform; and expansion of core product functionality. TIBCO is definitely moving from their pure technology focus to one that includes verticals and business applications.

TIBCO Product Strategy With Matt Quinn

Matt Quinn, CTO, gave us the product strategy presentation that will be seen in the general session tomorrow. He repeated the “capture many events, store few transactions” message as well as the five key components of a 21st century platform that we heard from Murray Rode in the previous session; this is obviously a big part of the new messaging. He drilled into their four broad areas of interest from a product technology standpoint: event platform innovation, big data and analytics, social networking, and cloud enablement.

In the area of event platform innovation, they released BusinessEvents 5.0 in April of this year, including the embedded TIBCO Datagrid technology, temporal pattern matching, stream processing and rules integration, and some performance and big data optimizations. One result is that application developers are now using BusinessEvents to build applications from the ground up, which is a change in usage patterns. For the future, they’re looking at supporting other models, such as BPMN and rule models, integrating statistical models, improving queries, improving the web design environment, and providing ActiveMatrix deployment options.

In ActiveMatrix, they’ve released a fully integrated stack of BusinessWorks, BPM and ServiceGrid with broader .Net and C++ support, optimized for large deployments and with better high-availability support and hot deployment capabilities. AMX/BPM has a number of new enhancements, mostly around the platform (such as the aforementioned HA and hot deployment), with their upcoming 1.2 release providing some functional enhancements such as custom forms and business rules based on BusinessEvents. We’ll see some Nimbus functionality integration before too much longer, although we didn’t see that roadmap; as Quinn pointed out, they need to be cautious about positioning which tools are for business users versus technical users. When asked about case management, he said that “case management brings us into areas where we haven’t yet gone as a company and aren’t sure that we want to go”. Interesting comment, given the rather wild bandwagon-leaping that has been going on in the ACM market by BPM and ECM vendors.

The MDM suite has also seen some enhancements, with ActiveSpaces integration and collaborative analytics with Spotfire, allowing MDM to become a hub for reference data from the other products. I’m very excited to see that one-click integration between MDM and AMX/BPM is on the roadmap; I think that MDM integration is going to be a huge productivity boost for overall process modeling, and when I reviewed AMX/BPM last year, I liked their process data modeling and stated that “the link between MDM and process instance data needs to be firmly established so that you don’t end up with data definitions within your BPMS that don’t match up with the other data sources in your organization”. In fact, the design-time tool for MDM is now the same as that used for business object data models that I saw in AMX/BPM, which will make it easier for those who move across the data and process domains.

TIBCO is trying to build out vertical solutions in certain industries, particularly those where they have acquired or built expertise. This not only changes what they can package and offer as products, but changes who (at the customer) that they can have a relationship with: it’s now a VP of loyalty, for example, rather than (or in addition to) someone in IT.

Moving on to big data and analytics technology advances, they have released FTL 2.0 (low-latency messaging) to reduce inter-host latency below 2.2 microseconds as well as provide some user interface enhancements to make it easier to set up the message exchanges. They’re introducing TIBCO Web Messaging to integrate consumer mobile devices with TIBCO messaging. They’ve also introduced a new version of ActiveSpaces in-memory data grid, providing big data handling at in-memory speeds by easing the integration with other tools such as event processing and Spotfire.

They’ve also released Spotfire 4.0 visual analytics, with a big focus on ease of use and dashboarding, plus tibbr integration for social collaboration. In fact, tibbr is being used as a cornerstone for collaboration, with many of the TIBCO products integrating with tibbr for that purpose. In the future, tibbr will include collaborative calendars and events, contextual notifications, and other functionality, plus better usability and speed. Formvine has been integrated with tibbr for forms-based routing, and Nimbus Control integrates with tibbr for lightweight processes.

Quinn finished up discussing their Silver Fabric cloud platform to be announced tomorrow (today, if you count telling a group of tweet-happy industry analysts) for public, private and hybrid cloud deployments.

Obviously, there was a lot more information here than I could possibly capture (or than he could even cover; some of the slides just flew past), and I may have to get out of bed in time for his keynote tomorrow morning since we didn’t even get to a lot of the forward-looking strategy. With a product suite as large as what TIBCO has now, we need much more than an hour to get through an analyst briefing.

TIBCO Corporate Strategy Session with Murray Rode

I’m in Vegas this week at TUCON, TIBCO’s user conference, and this afternoon I’m at the analyst event. For the corporate strategy session, they put the industry analysts and financial analysts together, meaning that there were way too many dark suits in the room for my taste (and my wardrobe).

Murray Rode, COO, gave us a good overview presentation on the corporate strategy, touching on market factors, their suite of products, and their growth in terms of products, geographies and verticals. Definitely, event-driven processes are a driving force behind businesses these days – matching with the “responsive business” message I saw at the Progress conference last week – and TIBCO sees their product suite as being ideally positioned to serve those needs.

Rode defined the key components of a 21st century platform as:

  • Automation (SOA, messaging, BPM) as core infrastructure
  • Event processing
  • Social collaboration
  • Analytics
  • Cloud

Their vision is to be the 21st century middleware company, continuing to redefine the scope and purpose of middleware, and to provide their customers with the “2-second advantage” based on event processing, real-time analytics and process management. They see the middleware market as taking a bite out of application development platforms and out-of-the-box suites by providing higher-functioning, more agile capabilities, and they plan to continue their pure-play leadership in middleware.

Looking at their performance in verticals, financial services is now only 25% of their business as they diversify into telecom, government, energy, retail and other market segments. This is an interesting point, since many middleware (including many BPM) vendors grew primarily in financial services, and have struggled to break out of that sector in a significant way.

From a product standpoint, their highest growth is happening in CEP, analytics and MDM, while core stable growth continues in BPM and SOA. They are starting to see new growth in cloud, tibbr, low-latency messaging and Nimbus to drive their future innovation.

They see their key competitors as IBM and Oracle, and realize that they’re the small fish in that pond; however, they see themselves as being more innovative and in touch with current trends, and having a better pure-play focus on infrastructure. Their strategy is to keep defining the platform through a culture of continuous innovation, so as not to become a one-hit wonder like many other now-defunct (or acquired) middleware vendors of the past; to maximize sales execution strengths for growth by setting vertical go-to-market strategies across their product suite; to organize for innovation particularly through cross-selling the newer products into mature opportunities; to cultivate their brand; and to manage for growth and continued profitability, in part by branching beyond their direct sales force, which has been a significant strength for them in the past, to invest in partner and SI relationships to broaden their sales further.

Rode spoke briefly about acquisitions (we’re slated for a longer session on this later today), and positioned Nimbus as having applicability to core infrastructure in terms of analytics and events, not just BPM. It will be interesting to see how that plays out. In general, their focus is on smaller acquisitions to complement and enhance their core offering, rather than big ones that would be much harder to align with their current offerings.

Empowering The Customer Through Process Improvement And BPM

Nick Deacon of Nokia Siemens Networks (they do the mobile networks for 65M users, not the phones) gave a presentation on empowering the customer through process improvement and BPM. With the recent acquisition of Motorola’s networks business, they have almost 80,000 employees in 150 countries, with over half of their employees in service areas. Telecom is pretty volatile these days, with telecom, IT and media ecosystems mixing to create a data traffic explosion. This means that the networks need to be both efficient and resilient while delivering the desired customer experience, so that we can all watch YouTube on our smartphones.

It used to be that you could just make your customer work the way that you needed them to work within your predefined efficient processes; now, however, the customers need more control over the services that they consume. NSN looks at their customer interaction points – much like what we heard from Bill Band this morning on analyzing the customer journey – and focuses on improving those interaction points that are most critical to improving the customer perception.

They are a big SAP customer, but find that they use Appian BPM to fill the gaps that SAP just doesn’t do without major customization, and to bridge between different systems. They’ve implemented BPM in five major business areas with more than 22,000 users. By reusing some components but adapting to each particular business area, they’re able to roll out new systems in a matter of months. They are pushing into social capabilities to facilitate faster decision-making, and mobile platforms to better support remote users.

As Deacon said in his summary, BPM enables them to react quickly to meet business needs and to respond effectively to better serve their customers.