Activiti BPM Suite – Sweet!

There are definitely changes afoot in the open source BPM market, with both Alfresco’s Activiti and camunda releasing out-of-the-box end-user interfaces and model-driven development tools to augment their usual [Java] developer-friendly approach. In both cases, they are targeting “citizen developers”: people who have technical skills and do some amount of development, but in lighter-weight languages than Java. There are a lot of people who fall into this category, including those (like me) who used to be hard-core developers but fell out of practice, and those who have little formal training in software development but have some other form of scientific or technical background.

Prior to this year, Activiti BPM was not available as a standalone commercial product from Alfresco, only bundled with Alfresco or as the community open source edition; as I discussed last year, their main push was to position Activiti as the human-centric workflow within their ECM platform. However, Activiti sports a solid BPMN engine that can be used for more than just document routing and lifecycle management, and in May Alfresco released a commercially-supported Alfresco Activiti product, albeit one focused on the human-centric BPM market. This provides them with opportunities to monetize the existing Activiti community and to evolve the BPM platform independently of their ECM platform, such as by providing cloud and hybrid services; however, it may have some impact on partners who were relying on support revenue for the community version.

The open source community engine remains the core of the commercial product – in fact, the enterprise release of the engine lags behind the community release, as it should – but the commercial offering adds all of the UI tools for design, administration and end-user interface, plus cluster configuration for the execution engine.
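
Since the community engine remains the core, the usual embedded usage still applies; here’s a minimal sketch of driving it from Java (Activiti 5.x API; the process key, BPMN resource, group and user names are made up for illustration):

```java
// A minimal sketch of embedding the community engine (Activiti 5.x API).
// The process key, BPMN resource, group and user names are hypothetical;
// assumes an activiti.cfg.xml on the classpath for database configuration.
import org.activiti.engine.ProcessEngine;
import org.activiti.engine.ProcessEngines;
import org.activiti.engine.TaskService;
import org.activiti.engine.task.Task;

import java.util.Collections;

public class EngineDemo {
    public static void main(String[] args) {
        ProcessEngine engine = ProcessEngines.getDefaultProcessEngine();

        // Deploy a BPMN 2.0 model and start an instance with a process variable
        engine.getRepositoryService().createDeployment()
                .addClasspathResource("onboarding.bpmn20.xml")
                .deploy();
        engine.getRuntimeService().startProcessInstanceByKey(
                "onboarding", Collections.singletonMap("customerId", (Object) "42"));

        // Claim and complete the first human task in a candidate group's queue
        TaskService taskService = engine.getTaskService();
        Task task = taskService.createTaskQuery()
                .taskCandidateGroup("operations").singleResult();
        taskService.claim(task.getId(), "kermit");
        taskService.complete(task.getId());
    }
}
```

The commercial UI tools ultimately sit on top of this same engine API.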

The Activiti Administrator is an on-premise web application for managing clusters, deploying process models from local packages or the Activiti Editor, and technical monitoring and administration of in-flight processes. There’s a nice setup wizard for new clusters – the open source version requires manual configuration of each node – and nodes within a cluster can be auto-discovered and monitored. The monitoring of process instances allows drilling into processes to see variables, the in-flight process model, and more. It’s not a business monitoring tool, but it seems like a solid technical monitoring tool for on-premise Activiti Enterprise servers.

The Activiti Editor is a web-based BPMN process modeling environment that is a reimplementation of other open-source tools, refactored with JavaScript libraries for better performance. The palette can be configured based on the user profile in order to restrict the environment; this would typically be used to limit the number of BPMN objects available for modeling, reducing complexity for business users who only need to create simple models – a nice feature for companies that want to OEM this into a larger environment. Models can be shared for comments (in a history stream format) and versioned, then accessed from the Eclipse plug-in to create more technical executable models. Although I saw this as a standalone web app back in April, it is now integrated as the Visual Editor portion of Kickstart within the Activiti Suite.

The Activiti Suite is a web application that brings together several applications into a single portal:

  • Kickstart is their citizen development environment, providing a simple step editor that generates BPMN 2.0 – which can then be refined further using the full BPMN Visual Editor or imported into the Eclipse-based Activiti Designer – plus a reusable forms library and the ability to bundle processes into a single process application for publishing within the Suite. In the SaaS version, it will integrate with cloud services including Google Drive, Alfresco, Salesforce, Dropbox and Box.
  • Tasks is the end-user interface for starting, tracking and participating in processes. It provides an inbox and other task lists, and supports task collaboration by allowing a task recipient to add others who can then view and comment on the task. Written in AngularJS.
  • Profile Management, for user profiles and administration.
  • Analytics, for process statistics and reports.

The Suite is not fully responsive and doesn’t have a mobile version, although apparently mobile solutions are on the way. Since BP3 is an Activiti partner, some of the Brazos tooling is available already, and I suspect that more mobile support will follow from BP3 or Alfresco directly.

They have also partnered with Fluxicon to integrate process mining, allowing for introspection of the Activiti BPM history logs; I think that this is still a bit ahead of the market for most process analysts, but it will make it easy for them to start doing process discovery for bottlenecks and outliers when they are ready.

I played around with the cloud version, and it was pretty easy to use (I even found a few bugs); it would be usable by someone with some process modeling and lightweight development skills to build apps. The Step Editor provides a non-BPMN flowcharting style with a limited number of functions, but certainly enough to build functional human-centric apps: implicit process instance data definition via graphical forms design; step types for human, email, “choice” (gateway), sub-process and publishing to Alfresco Cloud; a large variety of form field types; and timeouts on human tasks (although timers based on business days, rather than calendar days, are not there yet). The BPMN Editor has a pretty complete palette of BPMN objects if you want to create a more technical model that includes service tasks and a large variety of events.
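
For reference, here’s roughly what such a model looks like under the covers; this is a hypothetical BPMN 2.0 fragment (definitions wrapper and namespace declarations elided) showing a human step with a two-day timeout, of the sort the human-step and timer options produce:

```xml
<!-- Hypothetical fragment; definitions wrapper and namespace declarations elided -->
<process id="vacationRequest" name="Vacation request">
  <startEvent id="start"/>
  <sequenceFlow id="flow1" sourceRef="start" targetRef="approveRequest"/>
  <userTask id="approveRequest" name="Approve request"
            activiti:candidateGroups="managers"/>
  <!-- Timeout on the human step: an interrupting timer boundary event.
       P2D is two calendar days (ISO 8601); durations are calendar time,
       consistent with the business-day limitation noted above -->
  <boundaryEvent id="approvalTimeout" attachedToRef="approveRequest"
                 cancelActivity="true">
    <timerEventDefinition>
      <timeDuration>P2D</timeDuration>
    </timerEventDefinition>
  </boundaryEvent>
  <sequenceFlow id="flow2" sourceRef="approveRequest" targetRef="end"/>
  <sequenceFlow id="flow3" sourceRef="approvalTimeout" targetRef="escalate"/>
  <userTask id="escalate" name="Escalate"/>
  <sequenceFlow id="flow4" sourceRef="escalate" targetRef="end"/>
  <endEvent id="end"/>
</process>
```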

Although initially launched in a public cloud version, everything is also available on premise as of the end of November. They have pricing for departmental (single-server up to four cores with a limit on active processes) and enterprise (eight cores over any number of servers, with additional core licensing available) configurations, and subscription licensing for the on-premise versions of Kickstart and Administrator. The cloud version is all subscription pricing. It seems that the target is really for hybrid BPM usage, with processes living on premise or in the cloud depending on the access and security requirements. Also, with the focus on integration with content and human-centric processes, they are well-positioned to make a play in the content-centric case management space.

Instead of just being an accelerator for adding process management to Java development projects, we’re now seeing open source BPM tools like Activiti being positioned as accelerators for lighter-weight development of situational applications. This is going to open up an entire new market for them: an opportunity, but also some serious new competition.

ActiveMatrix BPM at Citibank Brazil

Roberto Mercadante, SVP of operations and technology at Citibank Brazil, presented a session on their journey with AMX BPM. I also had a chance to talk to him yesterday about their projects, so have a bit of additional information beyond what he covered in the presentation. They are applying AMX BPM to their commercial account opening/onboarding processes for “mid-sized” companies (between $500M and $1B in annual revenue), where there is a very competitive market in Brazil that requires fast turnaround, especially for establishing credit. As a global company operating in 160 countries, they are accustomed to dealing with very large multi-national organizations; unfortunately, some of those very robust features manifest as delays when handling smaller single-country transactions, such as their need to have a unique customer ID generated in their Philippines operation for any account opening. Even for functions performed completely within Brazil, they found that processes created for handling large corporate customers were just too slow and cumbersome for the mid-size market.

Prior to the BPM implementation, the process was very paper-intensive, with 300+ steps to open an account, requiring as many as 15 signatures by the customer’s executives. Because it took so long, the commercial banking salespeople would try to bypass the process by collecting the paperwork and walking it through the operations center personally; this is obviously not a sustainable method for expediting processes, and wasn’t available to those far from the processing center in São Paulo. Salespeople were spending as much as 50% of their time on operations, rather than building customer relationships.

They use an Oracle ERP, but found that it really only handled about 70% of their processes and was not, in the opinion of the business heads, a good fit for the remainder; they brought in AMX BPM to help fill that gap, which typically represents localized processes driven by unique market needs or regulations. In fact, they really consider AMX BPM to be their application development environment for building agile, flexible, localized apps around the centralized ERP.

When Citi implemented AMX BPM last year — for which they won an award — they were seeking to standardize and automate processes, with the primary target of reducing cycle time, which could be as long as 40 days. Interestingly, instead of reengineering the entire process, they did some overall modeling and process improvement (e.g., removing or parallelizing steps), but only did a complete rework on activities that would impact their goal of reducing cycle time, while enforcing their regulatory and compliance standards.

A key contributor to reducing cycle time, not surprisingly, was to remove the paper documents as early as possible in the process, which meant scanning documents in the branches and pushing them directly into their IBM FileNet repository, then kicking off the related AMX BPM processes. The custom scanning application included a checklist so that the branch-based salespeople could immediately see which documents they were missing. Because they had some very remote branches with low communications bandwidth, they also had to create some custom store-and-forward mechanisms to defer document transmission to times of low bandwidth usage, although that was eventually retired as their telecom infrastructure was upgraded. I’ve seen similar challenges with some of my Canadian banking customers regarding branch capture, with solutions ranging from using existing multifunction printers to actually faxing documents to a central operational facility; paper capture still represents some of the hairiest problems in business processes, in spite of the fact that we’re all supposed to be paperless.

They built BPM analytics in Spotfire (this was prior to the Jaspersoft acquisition; Jaspersoft might have been a better fit for some parts of this) to display a real-time dashboard that identifies operational bottlenecks — they felt strongly about including this from the start, since they needed to be able to show real benefits in order to prove the value of BPM and justify future development. The result: a 70% reduction in their onboarding cycle time within 3 months of implementation, from as much as 40 days down to a best time of about 3 days; it’s likely that they will not be able to reduce it much further, since some of that time is waiting for the customers to provide necessary documentation, although they do all the steps possible even in the absence of some documents so that the process can complete quickly as soon as the documents arrive. They also saw a 90% reduction in standard deviation, since no one was skewing the results by personally escorting documents through the operations center. Their customer rejection rate was reduced by 58%, so they captured a much larger portion of the companies that applied.

The benefits, however, extended beyond operational efficiency: the implementation allowed for decentralization of some of the front-office functions, and relocation of some back-office operations. This allows for leveraging shared services in other Citibank offices, relocating operations to less-expensive locations, and even outsourcing some operations completely.

They’re now looking at implementing additional functionality in the onboarding process, including FATCA compliance, mobile analytics, more legacy integration, and ongoing process improvement. They’re also looking at related problems that they can solve in order to achieve the same level of productivity, and considering how they can expand the BPMS implementation practices to support other regions. For this, they need to implement better BPM governance on a global basis, possibly through some center of excellence practices. They plan to do a survey of Citibank worldwide to identify the critical processes not handled by the ERP, and try to leverage some coordinated efforts for development as well as sharing experiences and best practices.

There’s one more breakout slot but nothing catches my eye, so I’m going to call it quits for TIBCO NOW 2014, and head out to enjoy a bit of San Francisco before I head home tomorrow morning. This is my last conference for the year, but I have a backlog of half-written product reviews that I will try to get up here before too long.

Event Analytics in Oil and Gas at TIBCONOW

Michael O’Connell, TIBCO’s chief data scientist, and Hayden Schultz, a TIBCO architect, discussed and demonstrated an event-handling example using remote sensor data with Spotfire and StreamBase. One oil company may have thousands of submersible pumps moving oil up from the well, and these modern pumps include sensors and telemetry that allow them to be monitored and controlled remotely. One of their oil and gas customers said that through active monitoring and control such as this, they are avoiding downtime worth $1,000/day/well, meaning an additional $100M in revenue each year. In addition to production monitoring, they can also use remote monitoring in drilling operations to detect conditions that might pose a physical risk. They use standards for sensor data formats, and a variety of data sources including SAP HANA.

For the production monitoring, the submersible pumps emit a lot of data about their current state: monitoring for changes to temperature, pressure and current shows patterns that can be correlated with specific pre-failure conditions. By developing models of these pre-failure patterns using Spotfire’s data discovery capabilities on historical failure data, data pushed into StreamBase can be monitored for the patterns, with Spotfire then used to trigger a notification and allow visualization and analytics by someone monitoring the pumps.
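
As an illustration of the pattern-matching idea (a hypothetical sketch in plain Java, not StreamBase’s actual EventFlow/StreamSQL tooling), a monitor might watch a sliding window of readings for a learned pre-failure signature such as rising temperature combined with falling pressure:

```java
// Hypothetical sketch of the pattern-matching idea in plain Java; the real
// implementation uses StreamBase's visual tooling, not code like this.
import java.util.ArrayDeque;
import java.util.Deque;

public class PumpMonitor {
    record Reading(double temperature, double pressure, double current) {}

    private static final int WINDOW_SIZE = 60;  // e.g., the last 60 samples
    private final Deque<Reading> window = new ArrayDeque<>();

    // A pre-failure signature learned from historical failure data might be:
    // temperature trending up while pressure trends down across the window.
    // The thresholds here are invented for illustration.
    private boolean preFailure() {
        if (window.size() < WINDOW_SIZE) return false;
        Reading first = window.peekFirst(), last = window.peekLast();
        return (last.temperature() - first.temperature()) > 5.0
                && (first.pressure() - last.pressure()) > 10.0;
    }

    public void onReading(Reading r) {
        window.addLast(r);
        if (window.size() > WINDOW_SIZE) window.removeFirst();
        if (preFailure()) {
            // In the product, this would raise a LiveView alert or Spotfire notification
            System.out.println("Pre-failure pattern detected: notify engineer");
        }
    }
}
```

In the actual product, these rules are built with visual modeling and XML snippets generated by Spotfire, as described below.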

We saw a demonstration of how the pre-failure patterns are modeled in Spotfire, then how the rules are implemented in StreamBase for real-time monitoring and response using visual modeling and some XML snippets generated by Spotfire. We saw the result in StreamBase LiveView, which provides visualization of streaming data and highlights the data points that are exhibiting the pre-failure condition. The engineers monitoring the pumps can change some of the configuration of the failure conditions, allowing them to fine-tune to reduce false positives without missing actual failure events. Events can kick off notification emails, generate Spotfire root cause analysis reports, or invoke other applications such as instantiating a BPM process.

There are a number of similar industrial applications, such as in mining: wherever there are a large number of remote devices that require monitoring and control.

AMX BPM and Analytics at TIBCONOW

Nicolas Marzin, from the TIBCO BPM field group, presented a breakout session on the benefits of combining BPM and analytics — I’m not sure that anyone really needs to be convinced of the benefits, although plenty of organizations don’t implement this very well (or at all), so it obviously isn’t given a high priority in some situations.

BPM analytics have a number of different audiences — end users, team leaders, line-of-business managers, and customer service managers — and each of them is interested in different things, from operational performance to customer satisfaction measures. Since we’re talking about BPM analytics, most of these focus on processing work, but on different views and aspects of that process-related information. Regardless of the information that they seek, the analytics need to be easy to use as well as informative; analytics is driven more by questions than static reporting is.

There are some key BPM metrics regardless of industry:

  • Work backlog breakdown, including by priority, segment and skillset (required to determine resourcing requirements) or SLA status (required to calculate risk)
  • Resource pool and capacity
  • Aggregate process performance
  • Business data-specific measures, e.g., troublesome products or top customers

Monitoring and analytics are important not just for managing daily operations, but also to feed back into process improvement: actions taken based on the analytics can include work reprioritization, resource reallocation, or a request for process improvement. Some of these actions can be automated, particularly the first two; there’s also value in doing an in situ simulation to predict the impacts of these actions on the SLAs or costs.
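
As a sketch of what automating the first of those actions might look like (the work item fields and risk measure here are invented, purely illustrative), work items could be re-sorted by SLA risk whenever the analytics detect a backlog problem:

```java
// Hypothetical sketch of automated work reprioritization by SLA risk;
// the work item fields and risk measure are invented for illustration.
import java.time.Duration;
import java.time.Instant;
import java.util.Comparator;
import java.util.List;

record WorkItem(String id, Instant slaDeadline, Duration expectedEffort) {
    // Risk grows as the remaining slack shrinks relative to the effort required
    double slaRisk(Instant now) {
        double slackMinutes = Duration.between(now, slaDeadline).toMinutes();
        return expectedEffort.toMinutes() / Math.max(slackMinutes, 1.0);
    }
}

class Reprioritizer {
    // Re-order the queue so items most at risk of breaching their SLA come first
    static void reprioritize(List<WorkItem> queue) {
        Instant now = Instant.now();
        queue.sort(Comparator.comparingDouble((WorkItem w) -> w.slaRisk(now)).reversed());
    }
}
```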

By appropriately combining BPM and analytics, you can improve productivity, improve visibility, reduce time to action and improve the user experience. A good summary of the benefits; as I mentioned earlier, this is likely not news to the customers in the audience, but I am guessing that a lot of them are not yet using analytics to the full extent in their BPM implementations, and this information might help them to justify it.

In AMX BPM, Spotfire was previously positioned for analytics and visualization, but TIBCO’s acquisition of Jaspersoft means that they are now bundling Jaspersoft with AMX BPM. You can use either (or both), and I think that TIBCO needs to get on top of identifying the use cases for each so that customers are not confused by two apparently overlapping BPM analytics solutions. Spotfire allows for very rich interactive visualizations of data from multiple sources, including drill-downs and what-if scenarios, especially when the analysis is more ad hoc and exploratory; Jaspersoft is better suited for pre-defined dashboards for monitoring well-understood KPIs.

TIBCONOW ActiveMatrix BPM Roadmap

On Monday, we heard an update on the current state of AMX BPM from Roger King; today, he gave us more on the new release and future plans in his “BPM for Tomorrow” breakout session. He started out introducing ActiveMatrix BPM 3.1, including the following key themes:

  • Case management
  • Data
  • Usability and productivity

As we saw in the previous breakout, the addition of ad hoc activities to process models enables case management capabilities. Ad hoc (disconnected) activities are fully supported in BPMN; TIBCO provides tooling to add preconditions and a choice of manual or automatic invocation, allowing an activity to be started manually or to start itself once its preconditions are met. If there are no preconditions, the activity will start (or be available to start) as soon as the process is instantiated. Manually-startable activities are surfaced for the user in the case UI, in the task list and in the process list. Case states and actions are defined in the case model, specifying the states, the actions, and which actions are valid for each state. Support for CMIS has been extended to allow the addition of content (in an external ECM system) to a case object via a case folder paradigm; this includes some new document operations such as linking/unlinking to a case object.
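
To make the precondition mechanism concrete, here is a generic illustration (not TIBCO’s actual API; all names are hypothetical) of how precondition-gated ad hoc activities could be evaluated against case data:

```java
// Generic illustration of precondition-gated ad hoc activities; this is not
// TIBCO's actual API, and all names here are hypothetical.
import java.util.List;
import java.util.Map;
import java.util.function.Predicate;

public class AdHocActivities {
    enum Invocation { MANUAL, AUTOMATIC }

    record Activity(String name, Invocation invocation,
                    Predicate<Map<String, Object>> precondition) {
        // No precondition means the activity is available as soon as the case exists
        boolean enabled(Map<String, Object> caseData) {
            return precondition == null || precondition.test(caseData);
        }
    }

    // Called whenever case data changes: start AUTOMATIC activities whose
    // preconditions are now met, and surface enabled MANUAL ones to the user
    static void evaluate(List<Activity> activities, Map<String, Object> caseData) {
        for (Activity a : activities) {
            if (!a.enabled(caseData)) continue;
            if (a.invocation() == Invocation.AUTOMATIC) {
                System.out.println("auto-starting: " + a.name());
            } else {
                System.out.println("available to start manually: " + a.name());
            }
        }
    }
}
```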

Data and self-service reporting are now enabled with the inclusion of the full capabilities of Jaspersoft — acquired by TIBCO in April 2014 — in AMX BPM (limited in use to BPM), plus a number of out-of-the-box reports and dashboards. This works with case data as well as process data. The messaging and capabilities of Spotfire for BPM analytics have been a bit lacking in the past, and obviously Jaspersoft is being positioned as the “right” way to do BPM analytics (which is probably not happy news for the customers that sweated through BPM-Spotfire implementations).

On the usability side, they have improved some BPM developer tools such as developer server configuration, and added “live development” capability for iterative development of UI forms without needing to rebuild and redeploy: just edit, save and test directly.

He then talked about their future product direction, which is predicated on their role in managing the “crown jewel” core business processes, necessitating a lot of non-functional capability such as high availability and scalability. As for market trends, they are seeing the cloud being used to drive innovation through experimentation because of the low cost of failure, and the rise of disposable enterprise apps. As enterprise processes become more and more digital, organizations are starting to respond with more automated business processes as well as case management for more dynamic processes. Not surprisingly, they are seeing BPMS with HTML5 as an enterprise rapid application development platform: I have been seeing a merging of the high end of the BPMS market with the application development platform market for some time.

Every organization has a lot of non-differentiating applications with standardized experiences, such as those that support procurement and HR; TIBCO’s target is the differentiating apps within an enterprise, which may not be the systems of record but likely are the systems of engagement. The key to this is enterprise case management and process-centric apps, which include data, process, organizational and social aspects, but also UI composition capabilities, since out-of-the-box UI is rarely differentiating. They are moving toward having some large part of their development environment on the web rather than Eclipse, which will roll out around the time that Microsoft finally forces companies onto Internet Explorer 11 where HTML5 is properly supported. Through this, they will support more of the composable situational apps that can be built, rolled out, used and discarded in less time than it used to take you to write the requirements for an app.

Declarative (data and rules-driven) versus imperative (predefined flow) process models are on their roadmap, and they will start to roll out declarative models in the context of case management: not to the exclusion of imperative models, but to augment them where they provide a better fit. Tied into this, at least in my mind, they are providing stronger support for rules integrated into BPM development.

He restated the official TIBCO party line that BPMN is not for business users, but that they need something more like Nimbus UPN instead; however, those are currently offered by two separate and non-integrated products that can’t exchange models, making Nimbus less useful for process discovery that will lead to automation. In the future, they will address this with enterprise BPM in the cloud, providing a “Nimbus-style” experience for business users and business-IT collaboration to start, then more analyst-style BPMN modeling, design and implementation. Not clear how they are going to reconcile UPN and BPMN, however.

King then announced TIBCO Cloud BPM — not yet available, but soon — which will be a BPM service powered by AMX BPM. They deprecated their Silver Fabric BPM support, which allowed you to run AMX BPM in the Amazon cloud; it wasn’t a particularly flexible or supportable cloud BPM offering, and a true SaaS offering will be a good addition when it comes along.

Case Management at TIBCONOW 2014

Yesterday, I attended the analyst sessions (which were mostly Q&A with Matt Quinn on the topics that he covered in the keynote), then was on the “Clash of the BPM Titans” panel, so not a lot of writing. No keynotes today, on this last day of TIBCO NOW 2014, but some BPM breakouts on the calendar — stay tuned.

I started the day with Jeremy Smith and Nam Ton That presenting on case management. They discussed customer journeys, and how their Fast Data platform allows you to detect and respond to that journey: this often includes semi-structured, dynamic processes that need to change based on external events and the process to date. It’s more than just process, of course; there needs to be context, actionable analytics, internal and external collaboration, and recommended actions, all working adaptively towards the customer-centric goal.

TIBCO addresses case management with additions to AMX BPM, not with a separate product; I believe that this is the best way to go for a lot of case management use cases that might need to combine more traditional structured processes with adaptive cases. The new capabilities added to support case management are:

  • Case data, providing context for performing actions. The case data model is created independently of a process model; the modeling uses UML to create relational-style ERDs, but also supports scripting and other functions beyond simple data modeling. This appears to be where the power — and the complexity — of the case management capabilities lie.
  • Case folders, integrating a variety of document sources, including from multiple ECM systems using CMIS, to act as the repository for case-related artifacts.
  • Case state and actions, allowing a user (or agent) to view and set the state of a case — e.g., received, in process, closed — and take any one of a number of actions allowed for the case when it is in that state. This is modeled graphically with a state/action model, which can also apply user/role permissions, in a very similar fashion to their existing page flows capability (see the sketch after this list). Actions can include social interactions, such as requesting information from an expert, accessing a Nimbus-based operations manual related to the current action, applying/viewing analytics to provide context for the action at that state, or providing recommendations such as next best action. Rules can be integrated through pre-conditions that prevent, require or invoke actions.
  • Ad hoc tasks, allowing the case user to instantiate a user task or subprocess; it appears they are doing this by pre-defining these in the process model (as ad hoc, or disconnected, tasks) so although they can be invoked on an ad hoc basis, they can’t be created from scratch by the user during execution. Given that multiple process models can be invoked from a case, there is still a lot of flexibility here.
  • Case UI, providing some out of the box user interfaces, but also providing a framework for building custom UIs or embedding these capabilities within another UI or portal.
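
Here’s the state/action idea from the list above as a hypothetical sketch (illustrative only, not TIBCO’s actual model format): each case state maps to the set of actions that a user is offered while the case is in that state.

```java
// Hypothetical sketch of a case state/action model; the states, actions and
// API here are invented for illustration, not TIBCO's actual model format.
import java.util.EnumMap;
import java.util.EnumSet;
import java.util.Map;
import java.util.Set;

public class CaseStateModel {
    enum State { RECEIVED, IN_PROCESS, CLOSED }
    enum Action { ASSIGN, REQUEST_EXPERT_INPUT, CLOSE, REOPEN }

    private final Map<State, Set<Action>> allowed = new EnumMap<>(State.class);

    public CaseStateModel() {
        allowed.put(State.RECEIVED, EnumSet.of(Action.ASSIGN));
        allowed.put(State.IN_PROCESS, EnumSet.of(Action.REQUEST_EXPERT_INPUT, Action.CLOSE));
        allowed.put(State.CLOSED, EnumSet.of(Action.REOPEN));
    }

    // Only the actions valid for the case's current state are offered to the user;
    // role permissions would add a second filter on top of this one
    public boolean isAllowed(State current, Action action) {
        return allowed.getOrDefault(current, EnumSet.noneOf(Action.class)).contains(action);
    }
}
```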

Related cases can be linked via an association field created in the case data model; since this is, at heart, an integration application development environment, you can do pretty much anything although it looks like some of it might result in a fairly complex and technical case data model.

They didn’t do an actual demo during the presentation; I’ll drop by the showcase and take a peek at it later today.

TIBCONOW 2014 Day 2 Keynote: Product Direction

Yesterday’s keynote was less about TIBCO products and customers, and more about discussions with industry thought leaders about disruptive innovation. This morning’s keynote continued that theme with a pre-recorded interview with Vivek Ranadive and Microsoft CEO Satya Nadella talking about cloud, mobile, big data and the transformational effects on individual and business productivity. Nadella took this as an opportunity to plug Microsoft products such as Office 365, Cortana and Azure; eventually he moved on to talk about the role of leadership in providing a meaningful environment for people to work and thrive. Through the use of Microsoft products, of course.

Thankfully, we then moved on to actual TIBCO products.

We had a live demo of TIBCO Engage, their real-time customer engagement marketing product, showing how a store can recognize a customer and create a context-sensitive offer that can be immediately consumed via their mobile app. From the marketer’s side, they can define and monitor engagement flows — almost like mini-campaigns, such as social sharing in exchange for points, or enrolling in their VIP program — that are defined by their target, trigger and response. The target audience can be filtered by past interests or demographics; triggers can be a combination of geolocation (via their app), social media interactions, shopping cart contents and time of day; and responses may be an award such as loyalty points or a discount coupon, a message or both, with a follow link customized to the customer. A date range can then be set for each engagement flow, and set to be live/scheduled to start, or in a draft or review mode. Analytics are gathered as the flows execute, and the effectiveness can be measured in real time.
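
Based on that description, an engagement flow is essentially data: a target filter, a trigger condition, a response, and a date range plus status. A hypothetical sketch of that structure (not TIBCO Engage’s actual API):

```java
// Hypothetical data model for an engagement flow (target / trigger / response);
// this illustrates the structure described above, not TIBCO Engage's actual API.
import java.time.LocalDate;
import java.util.List;
import java.util.function.Predicate;

record Customer(String id, List<String> interests, boolean inGeofence) {}

record EngagementFlow(
        String name,
        Predicate<Customer> target,      // filter by past interests or demographics
        Predicate<Customer> trigger,     // geolocation, social interaction, cart, time of day
        String response,                 // loyalty points, coupon and/or message with link
        LocalDate start, LocalDate end,  // date range for the flow
        boolean live) {                  // live/scheduled, as opposed to draft or review

    // The flow fires for a customer only when it is live, within its date
    // range, and both the target filter and the trigger condition match
    boolean fires(Customer c, LocalDate today) {
        return live && !today.isBefore(start) && !today.isAfter(end)
                && target.test(c) && trigger.test(c);
    }
}
```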

Matt Quinn, TIBCO’s CTO, spoke about the challenges of fast data: volume, speed and complexity. We saw the three blocks of the TIBCO Fast Data platform — analytics, event processing, and integration — in a bit more detail, with him describing how these three layers work together. Their strategy for the past 12 months, and going forward, has three prongs: evolution of the Fast Data platform; improved ease of use; and delivery of the Fast Data platform including cloud and mobile support. The Fast Data platform appears to be a rebranding of their large portfolio of products as if it were a single integrated product; that’s a bit of marketing-speak, although they do appear to be doing a better job of providing integrations and use cases of how the different products within the platform can be combined.

In the first part of the strategy, evolution of the platform (that is, product enhancements and new releases), they continue to make improvements to their messaging infrastructure. Fast, secure message transactions are where they started, and they continue to do this really well, in software and on their FTL appliances. Their ActiveSpaces in-memory data grid has improved monitoring and management, as well as multi-site replication, and is now more easily consumed via Node.js and other lighter-weight development protocols. BusinessWorks 6, their integration IDE, now provides more integrated development tooling with greatly improved user interfaces to more easily create and deploy integration applications. They’ve provided plug-ins for SaaS integrations such as Salesforce, and made it easier to create your own plug-ins for integration sources that they don’t yet support directly. On the event processing side, they’ve brought together some related products to more easily combine stream processing, rules and live data marts for real-time aggregation and visualization. And to serve the internet of things (IoT), they are providing connectivity to devices and sensors.

User experience is a big challenge with any enterprise software company, especially one that grows through acquisition: in general, user interfaces end up as a hodge-podge of inconsistent interfaces. TIBCO is certainly making some headway at refactoring these into a more consistent and easier to use suite of interfaces. They’ve improved the tooling in the BusinessWorks IDE, but also in the administration and management of integrations during development, deployment and runtime. They’ve provided a graphical UI designer for master data management (MDM). Presented as part of the ease of use initiative, he discussed the case management functions added to AMX BPM, including manual and automatic ad hoc tasks, case folder and documents with CMIS/ECMS access, and support for elastic organization structures (branch model). BPM reporting has also been improved through the integration of Jaspersoft (acquired by TIBCO earlier this year) with out of the box and customizable reports, and Jaspersoft also has been enhanced to more easily embed analytics in any application. They still need to do some work on interoperability between Jaspersoft and Spotfire: having two analytics platforms is not good for the customers who can’t figure out when to use which, and how to move between them.

The third prong of the strategy, delivery of the platform, is being addressed by offering on-premise, cloud, Silver Fabric platform-as-a-service, TIBCO Cloud Bus for hybrid cloud/on premise configurations, consumable apps and more; it’s not clear that you can get everything on every delivery platform, and I suspect that customers will have challenges here as TIBCO continues to build out their capabilities. In the near future, they will launch Simplr for non-technical integration (similar to IFTTT), and Expresso for consuming APIs. They are also releasing TIBCO Clarity for cleansing cloud data, providing cleaner input for these situational consumable apps. For TIBCO Engage, which we saw demonstrated earlier, they will be adding next best engagement optimization and support for third-party mobile wallets, which should improve the hit rate on their customer engagement flows.

He discussed some of the trends that they are seeing impacting business, and which they have on the drawing board for TIBCO products: socialization and gamification of everything; cloud requirements becoming hybrid to combine public cloud, private cloud and on premise; the rise of micro-services from a wide variety of sources that can be combined into apps; and HTML5/web-based developer tooling rather than the heavier Eclipse environments. They are working on Project Athena, a triplestore database that includes context to allow for faster decisioning; this will start to show up in some of the future product development.

Good review of the last year of product development and what to expect in the next year.

The keynote finished with Raj Verma, EVP of sales, presenting “trailblazer” awards to their customers that are using TIBCO technologies as part of their transformative innovation: Softrek for their ClearView CRM that embeds Jaspersoft; General Mills for their internal use of Spotfire for product and brand management; jetBlue for their use of TIBCO integration and eventing for operations and customer-facing services; and Three (UK telecom) for their use of TIBCO integration and eventing for customer engagement.

Thankfully shorter than yesterday’s 3-hour marathon keynote, and lots of good product updates.

Spotfire Content Analytics At TIBCONOW

(This session was from late yesterday afternoon, but I didn’t remember to post until this morning. Oops.)

Update: the speakers were Thomas Blomberg from TIBCO and Rik Tamm-Daniels from Attivio. Thanks, guys!

I went to the last breakout on Monday to look at the new Spotfire Content Analytics, which combines Spotfire in-memory analytics and visualization with Attivio content analysis and extraction. This is something that the ECM vendors (e.g., IBM FileNet) have been offering for a while, and I was interested to see the Spotfire take on it.

Basically, content analytics is about analyzing documents, emails, blogs, press releases, website content and other human-created textual data (also known as unstructured content) in order to find insights; these days, a primary use case is to determine sentiment in social media and other public data, in order for a company to get ahead of any potential PR disasters.

Spotfire Content Analytics — or rather, the Attivio engine that powers the extraction — uses four techniques to find relevant information in unstructured content:

  • Text extraction, including metadata
  • Key phrase analysis, using linguistics to find “interesting” phrases
  • Entity extraction, identifying people, companies, places, products, etc.
  • Sentiment analysis, to determine degree of negative/positive sentiment and confidence in that score

Once the piece of content has been analyzed to extract this relevant information, more traditional analytics can be applied to detect patterns, tie these back to revenue, and allow for handling of potential high-value or high-risk situations.
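
A sketch of what the extracted information might look like as a data structure feeding that downstream analysis (the field names and ranges are hypothetical):

```java
// Sketch of the extracted information as a data structure that downstream
// analytics could consume; the field names and ranges are hypothetical.
import java.util.List;
import java.util.Map;

record ContentAnalysis(
        Map<String, String> metadata,  // text extraction, including document metadata
        List<String> keyPhrases,       // "interesting" phrases found via linguistic analysis
        Map<String, String> entities,  // entity name -> type (person, company, place, product)
        double sentiment,              // degree of negative/positive sentiment, e.g., -1.0 to 1.0
        double confidence) {}          // confidence in the sentiment score
```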

Spotfire Content Analytics (via the Attivio engine) uses machine learning that allows you to train the system using sample data, since the information that is considered relevant is highly dependent on the specific content type (e.g., a tweet versus a product review). They provide rich text analytics, seamless visualization via Spotfire, agility through combining sources and transformations, and support for diverse content sources. They showed a demo based on a news feed by country from the CIA factbook site (I think), analyzing and showing aggregate sentiment about countries: as you can imagine, countries experiencing war and plague right now aren’t viewed very positively. Visualization using Spotfire allows for some nice geographic map-based searching, as well as text searching. The product will be available later this month (November 2014).

Great visualizations, as you would expect from Spotfire; it will be interesting to see how this measures up to IBM’s and other content analytics offerings once it’s released.

BPM For Today At TIBCONOW

Roger King, who heads up TIBCO’s BPM product strategy, gave us an update on ActiveMatrix BPM, and some of the iProcess to AMX BPM tooling (there is a separate session on this tomorrow that I may attend, so possibly more on that then). It’s been four years since they launched AMX BPM; that forms the model-driven implementation side of their BPM offering, augmented by Nimbus for business stakeholders for procedure documentation and business-IT collaboration. AMX BPM provides a number of process patterns (e.g., maker-checker) built in, intelligent work and resource management, actionable analytic insights and more. This is built on an enterprise-strength platform — as you would expect from TIBCO — to support 24×7 real-time operations.

In May of this year, they released AMX BPM 3.0 with a number of new features:

  • Support all styles of processes in a single solution: human workflow, case management, rules-based processes, automation, etc.
  • To support case management, they enable global data to allow the creation of a case data model in a central repository separate from processes, allowing cases to exist independent of processes, although they can be acted upon by processes. Work items representing actions on cases can retrieve and update case data on demand, since it references the case data rather than having it copied to local instance data.
  • In work management enhancements, support for elastic organizations (branches, such as you see in retail banking). This allows defining a model for a branch — you could have different models for different sizes of branches, for example — then link to those from branch nodes in the static organization model. Work can then be managed relative to the features of those underlying models, e.g., “send to manager”.
  • Also in work management, they have added dynamic performers to allow for distribution based on business data in a running instance rather than pre-determined role assignments. This is supported by dynamic RQL (resource query language), a query language specifically for manipulating resource assignments.
  • Some new LDAP functions.

There will be another session on Wednesday that covers the features added since May, including a lot about case management; I’ll report more from that.

He also gave us some of the details of the iProcess to AMX BPM “conversion” tools, which migrate the process models (although not the applications that use those models): I assume that the conversion rate of their iProcess customers to AMX BPM has been lower than they expected, and they are hoping that this will move things along.

We then heard a Nimbus update from Dan Egan: version 9.5 will be released this month. It is positioned as a “how to” guide for the enterprise, showing process models in a more consumable format than a full technical BPMN model. They have added collaboration capabilities so that users can review and provide feedback on the business processes, and the ability to model multiple process variants as multiple drill-downs from a single object. The idea is that you use Nimbus both as a place to document manual procedures that people need to perform, and as a process discovery tool for eventual automation, although the former is what Nimbus was originally designed for and seems to still be the main use case. They’ve spiffed up the UI, and will soon be offering their authoring, admin and governance functions on the web, allowing them to offer a fully web-based solution.

Nimbus uses their universal process notation (UPN) rather than BPMN for process models; in response to a question about Nimbus supporting BPMN, King stated that they do not believe that BPMN is a user-consumable format. They don’t have tooling — or at least haven’t talked about it — to convert UPN to BPMN; they’re going to need that if they want to position UPN as being for business-led process discovery as well as procedural documentation.

If you want to see the replay of this morning’s keynote, or watch tomorrow’s keynotes live or on demand, you can see them here.

BPM COE at TIBCONOW 2014

Raisa Mahomed of TIBCO presented a breakout session on best practices for building a BPM center of excellence. She started with a description of different types of COEs based on Forrester’s divisions (I’m too lazy to hack the HTML to add a table in WordPress for Android, so imagine a 2×2 quadrant with one axis being centralized versus decentralized, the other tactical, i.e., focused on cost and efficiency, versus strategic, i.e., focused on revenue and growth):

  • Center of Expertise (decentralized, strategic) – empowers business stakeholders with expert assistance, provides best practice, governance, technology that is configurable and consumable by business
  • Center of Excellence (centralized, strategic) – governs all processes in organization, enforces strict guidelines and process methodology governance, owns the BPMS, engagement models foster trust and collaboration including internal evangelists
  • Community of Practice (decentralized, tactical) – small teams, departmental priorities and scope, basic workflow capabilities, little or no governance
  • Process Factory (centralized, tactical) – optimized for process automation projects, processes as application development, frameworks

Center of Expertise and Process Factory work well together and are often seen in combination.

Best practices (these went by pretty quickly with a lot of detail on the slides, so I’ve just tried to capture some of the high points):

  • Find executive sponsorship for the COE: they must be influential across the organization, and be in the right place for the COE within your organization (e.g., COO, CIO, separate architecture group)
  • Create a governance framework – style will be based on the type(s) of COEs in use
  • Establish a methodology, which may have to accommodate different levels of BPM maturity within organization; be sure to address reusability and common components
  • Start with a core process, but relatively low complexity – this is exactly what I recommend, and I’m always frustrated by the “experts” that recommend starting with a non-core process even if the core processes are the target for implementation.
  • Encourage innovation and introduce disruptive technology.
  • Collaboration is key, via co-location and online collaboration spaces.
  • Don’t skip the metrics: remember that measuring project success is essential for future funding, as well as day-to-day operations and feeding the continuous improvement cycle.
  • Don’t let the program go stale, or become an ivory tower; rotate SMEs from the COE back into the business.
  • There’s not a single BPM skillset: you need a variety of skills spread across multiple people and roles.
  • Make a business case to provide justification for BPM projects.
  • Empower and educate through training and change management.
  • Avoid the “build it and they will come” mentality: just because you create some cool technology, that doesn’t mean that business people will stop doing the things that they’re doing to take it up.
  • Institute formal reviews of process models and solutions.

Nothing revolutionary here, but a good introduction and review of the best practices.