Category Archives: CEP


Conference Within A Conference, Part Two: Big Fast Data World

John Bates, who I know from his days at Progress Software, actually holds the title SVP of Big Fast Data at Software AG. Love it. He led off the Big Fast Data World sub-conference at Innovation World talking about real-time decisioning based on events, whether those are financial events such as trades or device events from an oil rig. This isn’t just simple “if this event occurs, then trigger this action” decisioning, but real-time adaptive intelligence that might draw on social media, internal business systems, market information and more. It’s where events, data, analytics and process all come together.

The goal is to use all of the data and events possible to appear to be reading your customer’s mind and offering them the most likely thing that they want right then (without being too creepy about it), using historical patterns, current context and location information. For example, a customer is in the process of buying something, and their credit card company or retail partner uses that opportunity to upsell them on a related product or a payment plan, directly to their mobile phone and before they have finished making their purchase. Or, a customer is entering a mall, they are subscribed to a sports information service, there are available tables at a sports bar in the mall, so they are pushed a coupon to have lunch at the sports bar right then. Even recommendation engines, such as we see every time that we visit Amazon or Netflix, are examples of this. Completely context sensitive, and completely personalized.
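To make the mall scenario concrete, here’s a minimal sketch (in Python, with entirely invented names and data stores; a real system would use a CEP engine and in-memory data grids rather than dictionaries) of how an incoming location event might be combined with subscription and availability context to trigger a personalized offer:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class LocationEvent:
    """Hypothetical device event: a known customer enters a venue."""
    customer_id: str
    venue: str

# Stand-ins for context that would live in CRM systems and an
# in-memory data grid in a real deployment.
SUBSCRIPTIONS = {"alice": {"sports"}}        # customer -> interests
AVAILABLE_TABLES = {"mall-sports-bar": 3}    # venue partner -> free tables

def on_location_event(event: LocationEvent) -> Optional[str]:
    """Correlate the event with subscription and availability context
    to decide, right now, whether to push a personalized offer."""
    if event.venue != "mall":
        return None
    if "sports" not in SUBSCRIPTIONS.get(event.customer_id, set()):
        return None
    if AVAILABLE_TABLES.get("mall-sports-bar", 0) == 0:
        return None
    return f"Lunch coupon for the sports bar, pushed to {event.customer_id}'s phone"

print(on_location_event(LocationEvent("alice", "mall")))
```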

On the flip side, companies have to monitor social media channels continuously for proactive customer care: real-time event and data analysis for responding to unhappy customers before situations blow up on them. People like Dave Carroll and Heather Armstrong (and sometimes even me, on a much smaller scale) can strike fear in the hearts of customer service organizations that are unable to respond appropriately and quickly, but can create big wins for companies that move fast and fix things for their customers.

What do you need to do to make this happen? Not much, just low-latency universal messaging, in-memory unstructured data, real-time predictive analytics, intelligent actions via real-time integration to operational systems, and real-time visual analytics. If you’re a Software AG customer, they’re bringing together Terracotta, Apama and JackBe into a unified platform for this sort of adaptive intelligence, producing intelligent actions from big data, in real time, to/from anywhere.

Software AG Big Fast Data

We then got a bit of a lesson on big data from Nathaniel Rowe, a research analyst at Aberdeen Group: how big is big, what’s the nature of that data, and some of the problems with it. The upshot: the sheer volume of data is important, but it’s the unstructured nature of it that presents many of the difficult analytical problems. It’s about volume, but also variety and velocity: the data could be coming from anywhere, and you don’t have control over a lot of it, such as social media data or that from business partners. You have to have a clear picture of what you want out of big data, such as better customer insights or operational visibility; Rowe had a number of use cases from e-commerce to healthcare to counterterrorism. The ability to effectively use unstructured data is key: best-in-class companies do this better than the industry average, and it translates directly into measures such as sales, customer satisfaction and net promoter score. He covered some of the tools required – automatic data capture, data compression, and data cleansing – and how those translate directly into employees’ ability to find data, particularly from multiple sources at once. Real-time analytics and in-memory analytics are the two high-speed technologies that produce the largest measurable benefits when working with big data, making the difference between seconds (or even sub-second) to see a result or take an action, versus minutes or hours. He finished with the correlation between investing in big data and various customer experience measures (15-18% increases) as well as revenue measures (12-17% increases). Great presentation, although I’m pretty sure that I missed 75% of it since he is a serious speed-talker and zipped through slides at the speed of light.

And we’re done for the day: back tomorrow for another full day of Innovation World. I’m off to the drinks reception then a customer party event; as always, everything is off the record as soon as the bar opens.

Kicking Off @SoftwareAG @InnovationWorld

For the first time in a few years, I’m at Software AG’s Innovation World conference in San Francisco (I think that the last time I was here, it was still the webMethods Integration World), and the focus is on the Digital Enterprise. At the press panel that I attended just prior to this evening’s opening keynote, one journalist made the point that “digital enterprise” is kind of a dumb term (I paraphrase here) because everything is digital now: we need a more specific term for what Software AG is getting at with this. Clay Richardson of Forrester, who I dragged along to the press session, said that his colleagues are talking about the post-digital age, which I take to be based on the assumption that all business is digital, making the term a bit meaningless, although “post-digital” isn’t exactly descriptive either.

Terminology aside, Software AG’s heart is in the right place: CEO Karl-Heinz Streibich took the stage at the opening keynote to talk about how enterprises need to leverage this digital footprint by integrating systems in ways that enable transformation through alignment and agility. You can still detect the schisms in the Software AG product portfolio, however: many of the customer case studies were single-product (e.g., ARIS or webMethods), although we did hear about the growing synergy between Apama (CEP and analytics) and webMethods for operational visibility, as well as Apama and Terracotta (in-memory big data number crunching). As with many of the other large vendors that grow through acquisitions, fully integrating the product portfolio is going to take some time.

We heard briefly from Ivo Totev, Software AG’s CMO; saw presentations of two of their customer innovation awards; then had a lengthier talk on the power of mobile and social from Erik Qualman, author of Socialnomics and Digital Leader. Unlike the usual pop culture keynote speaker, Qualman’s stuff is right on for this audience: looking at how successful companies are leveraging online social relationships, data and influence to further their success through engagement: listening, interacting and reacting (and then selling). He points out that trying to sell first before engaging doesn’t work online because it doesn’t work offline; the methods of engagement are different online and offline, but the principles from a sales lead standpoint are the same. You can’t start the conversation by saying “hey, I’m great, buy this thing that I’m selling” (something that a *lot* of people/companies just starting with Twitter and/or blogging haven’t learned yet).

Qualman took Dave Carroll’s popular “United Breaks Guitars” example from a couple of years ago, and talked about not just how United changed their policies on damage as a result, but the other people who leveraged the situation into increased sales: Taylor Guitars; a company that created a “Dave Carroll” travelling guitar case; and Carroll himself through sales of the song and his subsequent book on the power of one voice in the age of social media. He looked at companies that have transformed their customer experience through mobile (e.g., the Starbucks mobile app, which has personally changed my café loyalty) by giving the customer a way to do what they want to do – which hopefully involves buying your product – in the easiest possible way; and how a fast and slightly cheeky social media presence can give you an incredible boost for very little investment (e.g., Oreo’s “dunk in the dark” tweet when the lights went out during the Super Bowl). I gave a presentation last year on creating your own process revolution that talked about some of these issues and the new business models that are emerging because of them.

Great to see John Bates here, who I know from his tenure at Progress Software and who came to Software AG with the Apama acquisition, as well as finally meeting Theo Priestley face to face after years of tweeting at each other.

Disclosure: Software AG is a customer (I’m in the middle of creating some white papers and webinars for them), and they paid my travel expenses to be at this conference. However, what I write here is my own opinion and I have not been financially compensated for it.

Can BPM Save Lives? Siemens Thinks So

My last session at Gartner BPM 2013 is a discussion between Ian Gotts of TIBCO and their customer Tommy Richardson, CTO of Siemens Medical Solutions. I spoke with Siemens last year at Gartner and TUCON and was very interested in their transition from the old iProcess BPM platform (which originally came from TIBCO’s Staffware acquisition) to the newly-engineered AMX platform, which includes BPM and several other stack components such as CEP. Siemens isn’t an end-user, however: they OEM the TIBCO products into their own Soarian software, which is then sold to medical organizations for what Richardson refers to as “ERP for hospitals”. If you go to a hospital that uses their software, a case (process instance) is created for you at check-in, and is maintained for the length of your stay, tracking all of the activity that happens while you’re there.

With about 150 customers around the world, Siemens offers both hosted and on-premise versions of their software. Standard processes are built into the platform, and the hospitals can use the process modeler to create or modify the models to match their own business processes. These processes can then guide the healthcare professionals as they administer treatment (without forcing them to follow a flow), and capture the actions that did occur so that analytics can determine how to refine the processes to better support patient diagnosis and treatment. This is especially important for complex treatment regimes such as when an unusual infectious disease is diagnosed, which requires both treatment and isolation actions that may not be completely familiar to the hospital staff. Data is fed to and from other hospital systems as part of the processes, so the processes are not executing in isolation from all of the other information about the patient and their care.
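As a thought experiment (this is my own minimal sketch with invented names, not Siemens’ implementation), the “guide without forcing” idea might look something like this: the treatment model only recommends next steps, any permitted action can be taken in any order, and everything is captured for later analysis:

```python
class PatientCase:
    """Hypothetical case: opened at check-in, closed at discharge."""

    def __init__(self, patient_id: str):
        self.patient_id = patient_id
        self.history = []  # every action is recorded for later analytics

    def suggested_next(self, treatment_model: list) -> list:
        """Guidance only: what the model recommends, given what's done."""
        done = {action for action, _ in self.history}
        return [step for step in treatment_model if step not in done]

    def perform(self, action: str, actor: str) -> None:
        """No enforcement: clinicians can take any action, in any order."""
        self.history.append((action, actor))

ISOLATION_PROTOCOL = ["order tests", "isolate patient", "begin treatment"]
case = PatientCase("patient-42")
case.perform("isolate patient", "Dr. Lee")  # out of 'model' order: allowed
print(case.suggested_next(ISOLATION_PROTOCOL))  # ['order tests', 'begin treatment']
```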

For Siemens, BPM is a silver bullet for software development: they can make changes quickly since little is hard-coded, allowing treatment processes to be modified as research and clinical results indicate new treatment methods. In fact, the people who maintain the flows (both at Siemens and their customers) are not developers: they have clinical backgrounds, making them actual subject matter experts, although they are trained on the tools and work in a process analyst role rather than a medical practitioner role. If more technical integration is required, then developers do get involved, but not for process model changes.

The Siemens product does a significant amount of integration between the executing processes and other systems, such as waiting for and responding to test results, and monitoring when medications are administered or the patient is moved to another location in the hospital. This is where the move to AMX is helping them, since there’s a more direct link to data modeling, organizational models, analytics, event handling from other systems via the ESB, and other functionality in the TIBCO stack, replacing some amount of custom software that they had developed as part of the previous generations of the system. As I’ve mentioned previously, there is no true upgrade from iProcess to AMX/BPM since it’s a completely new platform, so Siemens actually did a vendor evaluation to see if this was an opportunity to switch the product that they OEM into their own, and decided to stay with TIBCO. When they roll out the AMX-based version in the months ahead, they will keep the existing iProcess-based system in place for each existing client for a year, with new patient cases being entered on the new system while allowing the existing cases to be worked in place on the old system. Since a case completes when a patient is discharged, there will be very few cases remaining on the iProcess system after a year, and those can then be transferred manually to the new system. This migration strategy is far beyond what most companies do when switching BPM platforms, but necessary for Siemens because of the potentially life-threatening (or life-saving) nature of their customers’ processes. This also highlights how the BPMS is used for managing the processes, but not as a final repository for the persistent patient case information: once a case/process instance completes on patient check-out, the necessary information has been pushed to other systems that maintain the permanent record.

Modernizing the healthcare information systems such as what Siemens is doing also opens up the potential for better sharing of medical information (subject to privacy regulations, of course): the existence of an ESB as a basic component means that trusted systems can exchange information, regardless of whether they’re in the same or different organizations. With their hosted software, there’s also the potential to use the Siemens platform as a way for organizations to collaborate; although this isn’t happening now (as far as I can tell), it may be only a matter of time before Siemens is hosting end-to-end healthcare processes with participants from hospitals, speciality clinics and even independent healthcare professionals in a single case to provide the best possible care for a patient.

TIBCO Corporate and Technology Analyst Briefing at TUCON2012

Murray Rode, COO of TIBCO, started the analyst briefings with an overview of technology trends (as we heard this morning, mobile, cloud, social, events) and business trends (loyalty and cross-selling, cost reduction and efficiency gains, risk management and compliance, metrics and analytics) to create the four themes that they’re discussing at this conference: digital customer experience, big data, social collaboration, and consumerization of IT. TIBCO provides a platform of integrated products and functionality in five main areas:

  • Automation, including messaging, SOA, BPM, MDM, and other middleware
  • Event processing, including events/CEP, rules, in-memory data grid and log management
  • Analytics, including visual analysis, data discovery, and statistics
  • Cloud, including private/hybrid model, cloud platform apps, and deployment options
  • Social, including enterprise social media, and collaboration

A bit disappointing to see BPM relegated to being just a piece of the automation middleware, but important to remember that TIBCO is an integration technology company at heart, and that’s ultimately what BPM is to them.

Taking a look at their corporate performance, they have almost $1B in revenue for FY2011, showing growth of 44% over the past two years, with 4,000 customers and 3,500 employees. They continue to invest 14% of revenue into R&D with a 20% increase in headcount, and significant increases in investment in sales and marketing, which is pushing this growth. Their top verticals are financial services and telecom; they still do 50% of their business in the Americas, with EMEA at 40% and APJ making up the other 10% while showing the largest growth. They have a broad core sales force, but have dedicated sales forces for a few specialized products, including Spotfire, tibbr and Nimbus, as well as for vertical industries.

They continue to extend their technology platform through acquisitions and organic growth across all five areas of the platform functionality. They see the automation components as being “large and stable”, meaning we can’t expect to see a lot of new investment here, while the other four areas are all “increasing”. Not too surprising considering that AMX BPM was a fairly recent and major overhaul of their BPM platform and (hopefully) won’t need major rework for a while, and the other areas all include components that would integrate as part of a BPM deployment.

Matt Quinn then reviewed the technology strategy: extending the number of components in the platform as well as deepening the functionality. We heard about some of this earlier, such as the new messaging appliances and Spotfire 5 release, some recent releases of existing platforms such as ActiveSpaces, ActiveMatrix and Business Events, plus some cloud, mobile and social enhancements that will be announced tomorrow so I can’t tell you about them yet.

We also heard a bit more on the rules modeling that I saw before the sessions this morning: it’s their new BPMN-based modeling for rules. This uses BPMN 1.2 notation to chain together decision tables and other rule components into decision services, which can then be called directly as tasks within a BPMN process model, or exposed as web services (SOAP only for now, but since ActiveMatrix now supports REST/JSON, I’m hopeful that those options are coming too). Sounds a bit weird, but it actually makes sense when you think about how rules are formed into composite decision services.
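The chaining concept is easier to see in code than in prose. A toy sketch (invented tables and names, not TIBCO’s actual rule language): each decision table is a function, and the composite decision service simply chains them, much as tasks are chained in a flow:

```python
def risk_table(applicant: dict) -> str:
    """First decision table: classify the applicant's risk."""
    if applicant["age"] < 25 or applicant["claims"] > 2:
        return "high"
    return "low"

def pricing_table(risk: str, base_premium: float) -> float:
    """Second decision table: price based on the risk classification."""
    surcharge = {"low": 1.0, "high": 1.5}
    return base_premium * surcharge[risk]

def quote_decision_service(applicant: dict) -> float:
    """The composite decision service: decision tables chained together,
    callable as a process task or exposed as a web service endpoint."""
    risk = risk_table(applicant)
    return pricing_table(risk, base_premium=500.0)

print(quote_decision_service({"age": 22, "claims": 1}))  # 750.0
```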

There was a lot more information about a lot more products, and then my head exploded.

Like others in the audience, I started getting product fatigue, and just picked out details of the products that are relevant to me. This really drove home that the TIBCO product portfolio is big and complex; the briefing might benefit from being split into a few separate analyst sessions with some sort of product grouping, although there is so much overlap and integration between product areas that I’m not sure how they would sensibly split it up. Even for my area of coverage, there was just too much information to capture, much less absorb.

We finished up with a panel of the top-level TIBCO execs, the first question of which was about how the sales force can even start to comprehend the entire breadth of the product portfolio in order to be successful selling it. This isn’t a problem unique to TIBCO: any broad-based platform vendor, such as IBM or Oracle, has the same issue. TIBCO’s answer: specialized sales force overlays for specific products and industry verticals, and selling solutions rather than individual products. Both of those work to a certain extent, but often solutions end up being no more than glorified templates developed as sales tools rather than actual solutions, and can lead to more rather than less legacy code.

Because of the broad portfolio, there’s also confusion in the customer base, many of whom see one TIBCO product and have no idea of everything else that TIBCO does. Since TIBCO is not quite a household name like IBM or Oracle, companies don’t necessarily know that TIBCO has other things to offer. One of my banking clients, on hearing that I am at the TIBCO conference this week, emailed “Heard of them as a player in the Cloud Computing space. What’s different or unique about them vs others?” Yes, they play in the cloud. But that’s hardly what you would expect a bank (that uses very little cloud infrastructure, and likely does have some TIBCO products installed somewhere) to think of first when you mention TIBCO.

TIBCO TUCON2012 Day 1 Keynotes, Part 2: Big Honking Data

Back from the mid-morning break, CMO Raj Verma shifted gears from customer experience management to look at one of the other factors introduced in the first part of the session: big data.

Matt Quinn was back to talk about big data: in some ways, this isn’t new, since there has been a lot of data within enterprises for many years. What’s changed is that we now have the tools to deal with it, both in place and in motion, to find the patterns hiding within it through cleansing and transformation. He made a sports analogy: a game is not just about the final score, but about all of the events that make up the entire game; similarly, it is no longer sufficient to just measure outcomes in business transactions: you have to monitor patterns in the event streams and combine them with historical data to make the best possible decisions about what is happening right now. He referred to this combination of event processing and analytics as closing the loop between data in motion and data at rest. TIBCO provides a number of products that combine to handle big data: not just CEP, but ActiveSpaces (the in-memory data grid) for realtime processing, Spotfire for visual analytics, and integration with Hadoop.
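A minimal sketch of that loop (my own illustration; names and numbers are invented): a sliding window over the live stream (data in motion) is compared against a stored baseline (data at rest) so that action can be taken while the pattern is still unfolding:

```python
from collections import deque

# Data at rest: historical per-customer baselines (invented numbers).
HISTORICAL_AVG_ORDER = {"cust-1": 40.0}

recent_orders = deque(maxlen=10)  # data in motion: a sliding window

def on_order(customer_id: str, amount: float) -> None:
    """Compare live behaviour against history to decide what is
    happening right now, rather than measuring the outcome later."""
    recent_orders.append(amount)
    window_avg = sum(recent_orders) / len(recent_orders)
    baseline = HISTORICAL_AVG_ORDER.get(customer_id)
    if baseline and window_avg > 2 * baseline:
        print(f"{customer_id}: live pattern diverging from history; act now")

on_order("cust-1", 95.0)  # triggers: 95.0 is more than twice the baseline
```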

We saw a demo of LogLogic, recently acquired by TIBCO, which provides analytics and event detection on server logs. This might sound like a bit of a boring topic, but I’m totally on board with this: too many companies simply turn off logging on their servers because it generates too many events that they can’t do anything with, and it impacts performance since logging is done on the operational server. LogLogic’s appliance can collect enormous amounts of log data, detect unusual events based on various rules, and integrate with Spotfire for visualization of potential security threats.
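To illustrate the kind of rule such an appliance might run (a generic sketch, not LogLogic’s actual rule syntax): flag any host that accumulates several authentication failures within a short window:

```python
from collections import defaultdict

failed_logins = defaultdict(list)  # host -> timestamps of recent failures

def on_log_line(host: str, timestamp: float, message: str) -> None:
    """Flag a host with 5+ authentication failures inside 60 seconds."""
    if "authentication failure" not in message:
        return
    window = [t for t in failed_logins[host] if timestamp - t < 60]
    window.append(timestamp)
    failed_logins[host] = window
    if len(window) >= 5:
        print(f"ALERT: possible brute-force attempt on {host}")

for t in range(5):  # five failures, one second apart
    on_log_line("web-01", float(t), "authentication failure for root")
```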

Mark Lorion, CMO for TIBCO Spotfire, came up to announce Spotfire 5, with a complete overhaul to the analytics engine, and including the industry’s first enterprise runtime for the R statistical language, providing 10 times the performance of the open source R project for predictive analytics. Self-service predictive analytics, ftw. They are also going beyond in-memory, integrating with Teradata, Oracle and Microsoft SQL Server for in-database analysis. With Teradata horsepower behind it – today’s announcement of Spotfire being optimized for in-database computation on Teradata – you can now do near-realtime exploration and visualization of some shocking amounts of data. Brad Hopper gave us a great Spotfire demo, not something that most TUCON attendees are used to seeing on the main stage.

Rob Friel, CEO of PerkinElmer, took the stage to talk about how they are using big data and analytics in their scientific innovations in life sciences: screening patient data, environmental samples, human genomes, and drug trials to detect patterns that can improve quality of life in some way. They screened 31 million babies born last year (one in four around the globe) through the standard heel-prick blood test, and detected 18,000 with otherwise undiagnosed disorders that could be cured or treated. Their instrumentation is key in acquiring all the data, but once it’s there, tools such as Spotfire empower their scientists to discover and act on what they find in the data. Just as MGM Grand is delivering unique experiences to each customer, PerkinElmer is trying to enable personalized health monitoring and care for each patient.

To wrap up the big data section, Denny Page, TIBCO’s VP of Engineering, came on stage with his new hardware babies: an FTL message switch and an EMS appliance, both to be available by the end of November 2012.

For the final part of the day 1 keynotes, we heard from an innovators’ panel of Scott McNealy (founder of Sun Microsystems, now chairman of Wayin), Tom Siebel (founder of Siebel Systems, now at C3 Energy where they are using TIBCO for energy usage analytics), Vivek Ranadivé, and KR Sridhar (CEO of Bloom Energy), chaired by David Kirkpatrick. Interesting and wide-ranging discussion about big data, analytics, sentiment analysis, enterprise social media, making data actionable, the internet of things and how a low barrier to platform exit drives innovation. The panel thinks that the best things in tech are yet to come, and I’m in agreement, although those who are paranoid about the impact of big data on their privacy should be very, very afraid.

I’ll be blogging from the analyst event for the rest of the day: we have corporate and technology briefings from the TIBCO execs plus some 1:1 sessions. No pool time for me today!

TIBCO TUCON2012 Day 1 Keynotes, Part 1

The keynotes started with TIBCO’s CEO, Vivek Ranadivé, talking about the forces driving change: a massive explosion of data (big data), the emergence of mobility, the emergence of platforms, the rise of Asia (he referenced the Gangnam Style video, although he did not actually do the dance), and how math is trumping science (e.g., the detection and exploitation of patterns). The ability to harness these forces and produce extreme value is a competitive differentiator, and is working for companies like Apple and Amazon.

Raj Verma, TIBCO’s CMO, was up next, continuing the message of how fast things are changing: more iPhones were sold over the past few days than babies were born worldwide, and Amazon added more computing capacity last night than they had in total in 2001. He (re)introduced their concept of the two-second advantage – the right information a little bit before an event is worth infinitely more than any amount of information after the event – enabled by an event-enabled enterprise (or E3, supported by, of course, TIBCO infrastructure). Regardless of whether or not you use TIBCO products, this is a key point: if you’re going to exploit the massive amounts of data being generated today in order to produce extreme value, you’re going to need to be an event-enabled enterprise, responding to events rather than just measuring outcomes after the fact.

He discussed the intersection of four forces: cloud, big data, social collaboration and mobility. This is not a unique message – every vendor, analyst and consultant is talking about this – but he dug into some of these in detail: mobile, for example, is no longer discretionary, even (or maybe especially) in countries where food and resources are scarce. These four forces all overlap in the consumerization of IT, and are reshaping enterprise IT. A key corporate change driven by these is customer experience management: becoming the brand that customers think of first when the product class is mentioned, and turning customers into fans. Digital marketing, properly done, turns your business into a social network, and turns customer management into fan management.

Matt Quinn, CTO, continued the idea of turning customers into fans, and solidifying customer loyalty. To do this, he introduced TIBCO’s “billion dollar backend” with its platform components of automation, event processing, analytics, cloud and social, and hosted a series of speakers on the subject of customer experience management.

We then heard from a customer, Chris Nordling, EVP of Operations and CIO of MGM Resorts and CityCenter, who use TIBCO for their MLife customer experience management/loyalty program. Their vision is to track everything about you from your gambling wins/losses to your preferences in restaurants and entertainment, and use that to build personalized experiences on the fly. By capturing the flow of big data and responding to events in realtime, the technology provides their marketing team with the ability to provide a zero-friction offer to each customer individually before they even know that they want something: offering reduced entertainment tickets just as you’re finishing a big losing streak at the blackjack tables, for example. It’s a bit creepy, but at the same time, has the potential to provide a better customer experience. Just a bit of insight into what they’re spending that outrageous $25/day resort fee on.

Quinn came back to have a discussion with one of their “loyalty scientists” (really??) about Loyalty Lab, TIBCO’s platform/service for loyalty management, which is all about analyzing events and data in realtime, and providing “audience of one” service and offerings. Traditional loyalty programs were transaction-based, but today’s programs are about providing a more holistic view of the customer. This can include not just events that happen in a company’s own systems, but external social media information, such as the customer’s tweets. I know all about that.

Another customer, Rick Welts of the Golden State Warriors (who, ironically, play at Oracle Arena), talked about not just customer loyalty management, but the Moneyball-style analytics that they apply to players on a very granular scale: each play of each game is captured and analyzed to maximize performance. They’re also using their mobile app for a variety of customer service initiatives, from on-premise seat upgrades to ordering food directly from your seat in the stadium.

Mid-morning break, and I’ll continue afterwards.

As an aside, I’m not usually wide awake enough to get much out of the breakfast-in-the-showcase walkabout, but this morning prior to the opening sessions, I did have a chance to see the new TIBCO decision services integrated into BPM, also available as standalone services. Looked cool, more on that later.

IBM Impact Day 2: Engage. Extend. Succeed.

Phil Gilbert spoke at the main tent session this morning, summarizing how they announced IBM BPM as a unified offering at last year’s Impact, and since then they’ve combined Business Events and ILOG to form IBM ODM (operational decision management). Business process and decision management provide visibility and governance, forming a conduit to provide information about transactions and data to people who need to access it. IBM claims to have the broadest, most integrated process portfolio, having taken a few dozen products and turned them into two products; Phil was quick to shoot down the idea that this is a disjointed, non-integrated collection of tools, referring to it instead as a “loosely coupled integration architecture”. Whatever.

Around those two core products (or product assemblies) are links to other enterprise tools – Tivoli, MDM, ECM and SAP – forming the heart of business processes and system orchestration. In version 8 of BPM and ODM, they’ve added collaboration, which is the third key imperative for business alongside visibility and governance.

We saw a demo of the new capabilities, most of which I talked about in yesterday’s post. For ODM, that included the new decision console (social activity stream, rules timeline) and global rules search. For BPM, there’s the new socially-aware process portal, which has been created on their publicly-available APIs so that you can roll your own portal with the same level of functionality. There’s searching in the process portal to find tasks easily. The new coach (UI form) designer allows you to create very rich task interfaces more easily, including the sidebar of task/instance details, instance-specific activity stream, and experts available for collaboration. They’ve incorporated the real-time collaboration capabilities of Blueworks Live into the BPM coaches to allow someone to request and receive help from an expert, with the user and the expert seeing each other’s inputs synchronously on the form in question. Lastly, Approve/Reject type tasks can be completed in-line directly in the task list, making it much faster to move through a long set of tasks that require only simple responses. He wrapped up with the obligatory iPad demo (have to give him credit for doing that part of the live demo himself, which most VPs wouldn’t consider).

The general session also included presentations of some innovative uses of BPM and ODM by IBM’s customers: Ottawa General Hospital, which has put patient information and processes on an iPad in the doctors’ pockets, and BodyMedia, which captures, analyzes and visualizes a flood of biometric data points gathered by an armband device to assist with a weight loss program.

What Analysts Need to Understand About Business Events

Paul Vincent, CTO of Business Rules and CEP at TIBCO (and possibly the only person at Building Business Capability sporting a bow tie), presented a less technical view of events than you would normally see in one of his presentations, intended to help the business analysts here understand what events are, how they impact business processes, and how to model them. He started with a basic definition of events – an observation, a change in state, or a message – and why we should care about them. I cover events in the context of processes in many of the presentations that I give (including the BPM in EA tutorial that I did here on Monday), and his message is the same: life is event-driven, and our business processes need to learn to deal with that fact. Events are one of the fundamentals of business and business systems, but many systems do not handle external events well. Furthermore, many process analysts don’t understand events or how to model them, and can end up creating massive spaghetti process models to try to capture the result of events since they don’t understand how to model events explicitly.

He went through several different model types that allow for events to be captured and modeled explicitly, and compared the pros and cons of each: state models, event process chain models, resources-events-agents (REA) models, and BPMN models. The BPMN model is the only one that really models events in the context of business processes, and relates events as drivers of process tasks, but is really only appropriate for fairly structured processes. It does, however, allow for modeling 63 different types of events, meaning that there’s probably nothing that can happen that can’t be modeled by a BPMN event. The heavy use of events in BPMN models can make sense for heavily automated processes, and can make the process models much more succinct. Once the event notation is understood, it’s fairly easy to trace through the models, but events are the one thing in BPMN that probably won’t be immediately obvious to the novice process analyst.

In many cases, individual events are not the interesting part, but rather the correlation between many events; for example, fraud may be detected only after many small related transactions have occurred. This is the heart of complex event processing (CEP), which can be applied to a wide variety of business situations that rely on large volumes of events, and which distinguishes it from simple process patterns and business rules that apply to individual transactions.
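A toy version of the fraud case (my own sketch; a real CEP engine would also bound the correlation window by time): no single transaction triggers anything, but the accumulation of many small ones on a single card does:

```python
from collections import defaultdict

small_txns = defaultdict(list)  # card -> amounts of recent small transactions

def on_transaction(card: str, amount: float) -> None:
    """Each small transaction is unremarkable alone; the complex event
    is the correlation across many of them on the same card."""
    if amount < 20.0:
        small_txns[card].append(amount)
    if len(small_txns[card]) >= 10:
        print(f"possible card-testing fraud on {card}")
        small_txns[card].clear()

for _ in range(10):
    on_transaction("card-123", 4.99)  # the tenth one raises the alert
```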

Looking at events from an analyst’s view, it’s necessary to identify actors and roles, just as in most use cases, then identify what they do and (more importantly) when they do it in order to drive out the events, their sources and destinations. Events can be classified as positive (e.g., something that you are expecting to happen actually happened), negative (e.g., something that you are expecting to happen didn’t happen within a specific time interval) or sets (e.g., the percentage of a particular type of event is exceeding an SLA). In many cases, the more complex events that we start to see in sets are the ones that you’re really interested in from a business standpoint: fraud, missed SLAs, gradual equipment failure, or customer churn.
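Negative events are the trickiest of these to implement, because there is no event to react to: the engine has to notice an absence. A bare-bones sketch of the idea (an invented example using a simple timer, rather than a real CEP engine):

```python
import threading

def expect_event(name: str, timeout_seconds: float) -> threading.Timer:
    """Negative event detection: alert if the expected event does NOT
    occur within the interval; cancel the timer if it does arrive."""
    def missed():
        print(f"ALERT: expected event '{name}' did not occur in time")
    timer = threading.Timer(timeout_seconds, missed)
    timer.start()
    return timer

# e.g., a package scan expected within the SLA window
watch = expect_event("package scanned at depot", timeout_seconds=5.0)
# ...when the scan event arrives in time, the pending alert is cancelled:
watch.cancel()
```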

He presented the EPTS event reference architecture for complex events, then discussed how the different components are developed during analysis (a rough code sketch of these stages follows the list):

  • Event production and consumption, namely, where events come from and where they go
  • Event preparation, or what selection operations need to be performed to extract the events, such as monitoring, identification and filtering
  • Event analysis, or the computations that need to be performed on the individual events
  • Complex event detection, that is, the event correlations and patterns that need to be performed in order to determine if the complex event of interest has occurred
  • Event reaction, or what event actions need to be performed in reaction to the detected complex event; this can overlap to some degree with predictive analytics in order to predict and learn the appropriate reactions
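Here’s a rough code sketch of one pass through those stages, with invented oil-rig events: production supplies the raw events, preparation filters them, analysis computes on each one, detection looks for the correlated pattern, and reaction triggers the action:

```python
raw_events = [                                   # event production
    {"type": "temp", "rig": "A", "value": 98},
    {"type": "heartbeat", "rig": "A"},
    {"type": "temp", "rig": "A", "value": 104},
    {"type": "temp", "rig": "A", "value": 107},
]

temps = [e for e in raw_events if e["type"] == "temp"]  # preparation: filter

over_limit = [e for e in temps if e["value"] > 100]     # analysis per event

if len(over_limit) >= 2:                                # complex event detection
    print("rising-temperature pattern on rig A: dispatch inspection")  # reaction
```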

He discussed event dependencies models, which show event orderings, and relate events together as meaningful facts that can then be used in rules. Although not a common practice, this model type does show relationships between events as well as linking to business rules.

He finished with some customer case studies that include CEP and event decision-making: FedEx achieving zero latency in determining where a package is right now; and Allstate using CEP to adjust their rules on a daily basis, resulting in a 15% increase in closing rates.

A final thought that he left us with: we want agile processes and agile decisions; process changes and rule changes are just events. Analyzing business events is good, but exploiting business events is even better.

TIBCO Acquisitions With Tom Laffey: OpenSpirit, Loyalty Lab and Nimbus

Tom Laffey, EVP of products and technology, moderated a session highlighting three of TIBCO’s recent acquisitions: OpenSpirit, Loyalty Lab and Nimbus.

Clay Harter, CTO of OpenSpirit (which was acquired by TIBCO a year ago), discussed their focus on delivering data and integration applications to the oil and gas industry. Their runtime framework provided a canonical data model over a heterogeneous set of data stores, and their desktop applications integrated with spatial data products such as ESRI’s ArcGIS and Schlumberger’s remote sensing. Due to their knowledge of the specialized data sources, they have a huge penetration into 330+ oil companies and relationships with industry-specific ISVs. In October, they will release a BusinessWorks plugin for OpenSpirit to make oil and gas technical data available through the TIBCO ESB. They are also prototyping a Spotfire extension for OpenSpirit for visualizing and analyzing this data, which is pretty cool – I worked as a field engineer in oil and gas in the early ’80s, and the sensing and visualization of data was a whole different ball game then, mostly black magic. OpenSpirit’s focus is on reducing exploration costs and increasing safety through better analysis of the petrotechnical data, particularly through interdisciplinary collaboration. From TIBCO’s standpoint, they were building their energy vertical, and the acquisition of OpenSpirit brings them expertise and credibility in that domain.

Keith Rose, formerly president of Loyalty Lab and now leading the sales efforts in that area since their acquisition by TIBCO, presented on their event-driven view of managing customer loyalty, particularly loyalty programs such as those used by airlines and retailers. They have a suite of products that support marketers in terms of visualizing and analyzing loyalty-related data, and building loyalty programs that can leverage that information. Their focus on events – the core of real-time and one-to-one loyalty marketing programs – was likely the big reason for the TIBCO acquisition, since TIBCO’s event and messaging infrastructure seems like a natural fit to feed into Loyalty Lab’s analysis and programs. Spotfire for visualization and analysis of data also makes a lot of sense here, if they can work out how to integrate that with their existing offerings. With 99% of their customers on a hosted cloud solution, they may also want to consider how a move to TIBCO’s cloud platform can benefit them and integrate with other initiatives that their customers may have.

Less than a month ago, Nimbus was acquired by TIBCO, and Mark Cotgrove, a founder and EVP, gave us a briefing on their product and why it made sense for TIBCO to acquire them. Nimbus provides tools for process discovery and analysis, including the 80% (or so) of an organization’s activities that are manual and are likely to remain manual. Currently, the automated activities are handled with enterprise applications and automated BPM (such as AMX/BPM), but the manual ones are managed with a mix of office productivity software (Word, PowerPoint, Visio) and business process analysis tools. Furthermore, end-to-end processes range back and forth between manual and automated activities as they progress through their lifecycle, such that often a single process instance ends up being managed by a variety of different tools. Nimbus provides what are essentially storyboards or guided walkthroughs for business processes: like procedures manuals, but more interactive. These “intelligent operations manuals” can include steps that will instruct the user to interact with a system of some sort – for example, an ERP system, or a BPMS such as AMX/BPM – but documents all of the steps including paper handling and other manual activities. Just as a BPMS can be an orchestration of multiple integrated systems, Nimbus Control can be an orchestration of human activities, including manual steps and interaction with systems. There are a few potential integration points between Nimbus and a few different TIBCO products: metrics in the context of a process using Spotfire; exporting discovered processes from Nimbus to BusinessStudio; instantiating an AMX/BPM process from Nimbus; worker accessing a Nimbus operations manual for instructions at the step in an AMX/BPM process; collaborative process discovery using tibbr; and tibbr collaboration as part of a manual process execution. Some or all of these may not happen exactly like this, but there is some interesting potential here. There’s also potential within an organization for finding opportunities for AMX/BPM implementation through process discovery using Nimbus.

An interesting view of three different acquisitions, based on three very different rationales: industry vertical; horizontal application platform; and expansion of core product functionality. TIBCO is definitely moving from their pure technology focus to one that includes verticals and business applications.

TIBCO Product Strategy With Matt Quinn

Matt Quinn, CTO, gave us the product strategy presentation that will be seen in the general session tomorrow. He repeated the “capture many events, store few transactions” message as well as the five key components of a 21st century platform that we heard from Murray Rode in the previous session; this is obviously a big part of the new messaging. He drilled into their four broad areas of interest from a product technology standpoint: event platform innovation, big data and analytics, social networking, and cloud enablement.

In the event platform innovation, they released BusinessEvents 5.0 in April this year, including the embedded TIBCO Datagrid technology, temporal pattern matching, stream processing and rules integration, and some performance and big data optimizations. One result is that application developers are now using BusinessEvents to build applications from the ground up, which is a change in usage patterns. For the future, they’re looking at supporting other models, such as BPMN and rule models, integrating statistical models, improving queries, improving the web design environment, and providing ActiveMatrix deployment options.

In ActiveMatrix, they’ve released a fully integrated stack of BusinessWorks, BPM and ServiceGrid with broader .Net and C++ support, optimized for large deployments and with better high-availability support and hot deployment capabilities. AMX/BPM has a number of new enhancements, mostly around the platform (such as the aforementioned HA and hot deployment), with their upcoming 1.2 release providing some functional enhancements such as custom forms and business rules based on BusinessEvents. We’ll see some Nimbus functionality integration before too much longer, although we didn’t see that roadmap; as Quinn pointed out, they need to be cautious about positioning which tools are for business users versus technical users. When asked about case management, he said that “case management brings us into areas where we haven’t yet gone as a company and aren’t sure that we want to go”. Interesting comment, given the rather wild bandwagon-leaping that has been going on in the ACM market by BPM and ECM vendors.

The MDM suite has also seen some enhancements, with ActiveSpaces integration and collaborative analytics with Spotfire, allowing MDM to become a hub for reference data from the other products. I’m very excited to see that one-click integration between MDM and AMX/BPM is on the roadmap; I think that MDM integration is going to be a huge productivity boost for overall process modeling. When I reviewed AMX/BPM last year, I liked their process data modeling but stated that “the link between MDM and process instance data needs to be firmly established so that you don’t end up with data definitions within your BPMS that don’t match up with the other data sources in your organization”. In fact, the design-time tool for MDM is now the same as that used for the business object data models that I saw in AMX/BPM, which will make it easier for those who move across the data and process domains.

TIBCO is trying to build out vertical solutions in certain industries, particularly those where they have acquired or built expertise. This not only changes what they can package and offer as products, but changes who they can have a relationship with at the customer: it’s now a VP of loyalty, for example, rather than (or in addition to) someone in IT.

Moving on to big data and analytics technology advances, they have released FTL 2.0 (low-latency messaging) to reduce inter-host latency below 2.2 microseconds as well as provide some user interface enhancements to make it easier to set up the message exchanges. They’re introducing TIBCO Web Messaging to integrate consumer mobile devices with TIBCO messaging. They’ve also introduced a new version of ActiveSpaces in-memory data grid, providing big data handling at in-memory speeds by easing the integration with other tools such as event processing and Spotfire.

They’ve also released Spotfire 4.0 visual analytics, with a big focus on ease of use and dashboarding, plus tibbr integration for social collaboration. In fact, tibbr is being used as a cornerstone for collaboration, with many of the TIBCO products integrating with tibbr for that purpose. In the future, tibbr will include collaborative calendars and events, contextual notifications, and other functionality, plus better usability and speed. Formvine has been integrated with tibbr for forms-based routing, and Nimbus Control integrates with tibbr for lightweight processes.

Quinn finished up discussing their Silver Fabric cloud platform to be announced tomorrow (today, if you count telling a group of tweet-happy industry analysts) for public, private and hybrid cloud deployments.

Obviously, there was a lot more information here than I could possibly capture (or that he could even cover; some of the slides just flew past), and I may have to get out of bed in time for his keynote tomorrow morning since we didn’t even get to a lot of the forward-looking strategy. With a product suite as large as TIBCO’s is now, we need much more than an hour to get through an analyst briefing.