HandySoft Process Intelligence and User Experience

Wow, has it really been a month since I last blogged? A couple of weeks’ vacation, general year-end busyness and a few non-work side projects have kept me quiet, but it’s time to get back at it. I have a few partially-finished product briefings sitting around, and thought it best to get them out before the vendors come out with their next versions and render these posts completely obsolete. 🙂

I had a chat with Garth Knudson of HandySoft in late November about the latest version of their BizFlow product, specifically around the new reporting capabilities and their WebMaker RIA development environment. Although these don’t show off the core BPM capabilities in their product suite (which I reviewed in late 2009), these are two well-integrated tools that allow for easy building of reports and applications within a BizFlow BPM environment. I always enjoy talking with Garth because he says good things about his competitors’ products, which means that not only does he have good manners, but he takes enough care to learn something about the competition rather than just tarring them all with the same brush.

We first looked at their user-driven reporting – available from the My AdHoc Reports option on the BizFlow menu – which is driven by OEM versions of the Jaspersoft open source BI server components; by next year, they’ll have the entire Jaspersoft suite integrated for more complete process analytics capabilities. Although you can already monitor the current processes from the core BizFlow capability, the ad hoc reporting add-on allows users (or more likely, business analysts) to define their own reports, which can then be run on demand or on a schedule.

HandySoft BizFlow Advanced Reporting - select data domain

If you’ve seen Jaspersoft (or most other ad hoc reporting tools) at work, there isn’t much new here: you can select the data domain from the list of data marts set up by an administrator, then select the type of report/graph, the fields, filtering criteria and layout. It’s a bit too techie for the average user to actually create a new report definition, since it provides a little too much close contact with the database, such as displaying the actual SQL field names instead of aliases, but once the definition is created, it’s easy enough to run from the BizFlow interface. Regular report runs can be scheduled to output to a specific folder in a specific format (PDF, Excel, etc.), based on the underlying Jaspersoft functionality.
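
To make the shape of a definition concrete, here’s a hypothetical sketch of one expressed as data; the domain, field and structure names are all invented for illustration, not actual BizFlow or Jaspersoft identifiers.

```python
# Hypothetical ad hoc report definition; all names invented for illustration.
report_definition = {
    "data_domain": "claims_data_mart",      # picked from admin-defined data marts
    "report_type": "bar_chart",             # type of report/graph
    "fields": ["REGION_CD", "CLAIM_AMT"],   # raw SQL column names, as the UI exposes them
    "filters": [("CLAIM_DT", ">=", "2010-01-01")],
    "layout": {"group_by": "REGION_CD", "aggregate": "sum"},
}

# A scheduled run outputs to a folder in a chosen format, per the Jaspersoft
# functionality underneath; again, this structure is purely illustrative.
schedule = {
    "when": "every Monday 06:00",
    "output_folder": "/reports/claims/weekly",
    "output_format": "PDF",                 # PDF, Excel, etc.
}
```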

The key integration points with BizFlow BPM, then, are the ability of an administrator to include process instance data in the data marts as well as any other corporate data, allowing for composite reporting across sources; and access to the report definitions in the My AdHoc Reports tab.
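
As a rough illustration of what composite reporting across sources enables, here’s a hypothetical query over an invented data mart schema, sketched in Python with SQLite standing in for the real JDBC sources:

```python
import sqlite3  # stand-in for the JDBC data mart connection an admin would configure

# Invented schema: process instance metrics joined with line-of-business data.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE process_instances (id, process_name, customer_id, start_time, end_time);
CREATE TABLE customers (customer_id, customer_segment);
INSERT INTO process_instances VALUES
  (1, 'claim', 101, '2010-11-01', '2010-11-04'),
  (2, 'claim', 102, '2010-11-02', '2010-11-03');
INSERT INTO customers VALUES (101, 'retail'), (102, 'commercial');
""")

# Composite report: average cycle time per process, broken down by customer segment
COMPOSITE_QUERY = """
SELECT p.process_name, c.customer_segment,
       AVG(julianday(p.end_time) - julianday(p.start_time)) AS avg_days
FROM process_instances p
JOIN customers c ON c.customer_id = p.customer_id
GROUP BY p.process_name, c.customer_segment
"""
for row in conn.execute(COMPOSITE_QUERY):
    print(row)  # e.g., ('claim', 'retail', 3.0)
```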

The second part of the demo was on their WebMaker application development environment. Most BPM suites these days have some sort of RIA development tool, allowing you to build user forms, screens, portals and dashboards without using a third-party tool. This is driven in part by the former lack of good tools for doing this, and in part by the major analyst reports that state that a BPMS has to have some sort of application development built in to it. Personally, I’m torn on that: most BPMS vendors are not necessarily experts at creating application development tools, and making the BPMS capabilities available for consumption by more generic application development environments through standard component wrappers fits better with a best-of-breed approach that I tend to favor. However, many organizations that buy a BPMS don’t have modern application development tools at all, so the inclusion of at least an adequate one is usually a help.

HandySoft BizFlow WebMaker - specify field visibility

HandySoft’s WebMaker is loosely coupled with BizFlow, so it can be used for any web application development, not just BPM-related applications. It does integrate natively with BizFlow, but can also connect with any web service or JDBC-compliant database (as you would expect) and uses the Model-View-Controller (MVC) paradigm. For a process-based application, you define the process map first, then create a new WebMaker project, define a page (form), and connect the page to the process definition. Once that’s done, you can then drag the process variables directly onto the form to create the user interface objects. There’s a full array of on-form objects available, including AJAX partial pages, maps, charts, etc., as well as the usual data entry fields, drop-downs and buttons. Since the process parameters are all available to the form, the form can change its appearance and behavior depending on the process variables, for example, to allow a partial page group to be enabled or disabled based on the specific step in the process or the value of the process instance variables at that step. This allows a single form to be used for multiple steps in a process that require a similar but not identical look and feel, such as a data entry screen and a QA screen; alternatively, multiple forms can be defined and assigned to different steps in the same process.
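
As a rough sketch of the kind of step-dependent behavior this enables, consider the following; the step names, variable names and logic are invented, and this is not WebMaker’s actual scripting model:

```python
# Invented sketch: one form serving multiple process steps, with partial page
# groups enabled or disabled based on the step and the process variables.
def visible_sections(step_name, process_vars):
    sections = {"data_entry": True, "qa_review": False}
    if step_name == "QA":
        sections["data_entry"] = False       # same form, but showing the QA view
        sections["qa_review"] = True
    if process_vars.get("amount", 0) > 10000:
        sections["manager_approval"] = True  # extra group for large amounts
    return sections

print(visible_sections("QA", {"amount": 25000}))
# {'data_entry': False, 'qa_review': True, 'manager_approval': True}
```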

To be clear, WebMaker is not a tool for non-technical people: although a trained business analyst could probably get through the initial screen designs, there is far too much technical detail exposed if you want to do anything except very vanilla static forms; the fact that you can easily expose the MVC execution stack is a clue that this is really a developer tool. It is, however, well-integrated with BizFlow BPM, allowing the process instance variables to be used in WebMaker, and the WebMaker forms to be assigned to each activity using the Process Studio.

HandySoft is one of the small players in the BPMS market, and has focused on ad hoc and dynamic processes from the start. Now that all of the BPMS vendors have jumped into the dynamic BPM fray, it will be interesting to see if these new BizFlow tools round out their suite sufficiently to compete with the bigger players.

Real Time Monitoring & Simulation of Business Processes

My last session for the day at CASCON is to hear Alex Lau of IBM, and Andrei Solomon and Marin Litoiu of York University discuss their research project on real time monitoring and simulation of business processes (they also presented a related paper this morning, although I missed the session). This is one of the CAS Innovation Impact sessions, which are based on fellowship projects that are “ready for harvesting”, which I think means that they’re ready to move towards productization. These projects involve one student (Solomon) and one professor (Litoiu) from a university, then one person from CAS (Lau) and one from IBM development (Jay Benayon, also credited on the research). In this case, the need for the research was identified by IBM development; I’m not sure if this is the usual method, although it seems logical that these projects act as mini research teams for work that is beyond the scope of production development.

In addition to their research, they’ve prototyped a system that integrates with WebSphere Process Modeler and can use the monitoring data from an in-flight process, feeding it back into a simulation model of the process to improve what-if scenario analysis and process KPI forecasting. The key research challenge was the use of the monitoring data: that data is typically quite noisy, since it can include hidden overhead such as queuing, and tends to skew results by making task durations appear longer than they actually are. This noise can’t be measured directly, but they’ve attempted to filter it out of the monitoring data using a particle filter before feeding it into the simulation model.
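
The details of their filter weren’t covered, so here’s only a toy bootstrap particle filter in Python, with all model parameters invented, to show the general idea: recovering a latent task duration from observations inflated by non-negative queuing overhead.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy model: the true task duration drifts slowly; observed durations add
# non-negative queuing overhead, which is the "noise" to be filtered out.
T, N = 50, 1000                                     # time steps, particles
true_dur = 10 + np.cumsum(rng.normal(0, 0.05, T))   # latent durations (hours)
observed = true_dur + rng.exponential(3.0, T)       # inflated by queuing

particles = rng.normal(12, 2, N)   # initial guesses at the true duration
estimates = []
for z in observed:
    particles = particles + rng.normal(0, 0.2, N)   # propagate (random walk)
    # Likelihood under exponential overhead: observation must exceed the state
    overhead = z - particles
    weights = np.where(overhead >= 0, np.exp(-overhead / 3.0), 0.0)
    if weights.sum() == 0:
        weights = np.ones(N)                        # guard against degeneracy
    weights /= weights.sum()
    estimates.append(np.sum(weights * particles))   # posterior mean estimate
    particles = particles[rng.choice(N, size=N, p=weights)]  # resample

print(f"mean observed {observed.mean():.1f}h, filtered {np.mean(estimates):.1f}h, "
      f"true {true_dur.mean():.1f}h")
```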

Their proof of concept prototype linked together three IBM products, and could either change automated decisions in the runtime process or suggest alternatives to a human participant, based on the near-past performance of that process instance and any approaching SLA deadlines. One caveat is that they didn’t use real-world (e.g., customer) business processes for this, but created their own processes and ran them to generate their results.

They see this research as applicable to any process modeling tools where processes can be simulated against a set of metrics, KPIs can be forecasted, and simulation parameters are entered at modeling time and remain static until explicitly updated. They have the potential to extend their technique, for example, by providing better trend predictions based on regression techniques. There are other vendors working on research like this, and I think that we’ll see a lot more of this sort of functionality in BPM solutions in the future, where users are presented with suggestions (or automated decisions made) based on a prediction of how well a process instance is likely to meet its KPIs.
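
As a minimal illustration of that kind of forecasting, this sketch (numbers invented) fits a linear trend to an in-flight instance’s cumulative elapsed time and compares the projected completion against an SLA:

```python
import numpy as np

# Invented in-flight instance: cumulative hours elapsed after each completed step
steps_done = np.array([1, 2, 3, 4, 5])
elapsed_hr = np.array([2.1, 4.6, 6.8, 9.5, 11.9])
total_steps, sla_hr = 8, 20.0

# Fit a linear trend and project the elapsed time at the final step
slope, intercept = np.polyfit(steps_done, elapsed_hr, 1)
forecast = slope * total_steps + intercept

if forecast > sla_hr:
    print(f"forecast {forecast:.1f}h breaches the {sla_hr}h SLA: suggest a faster path")
else:
    print(f"forecast {forecast:.1f}h is within the {sla_hr}h SLA")
```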

That’s it for me today, but I’ll be back here tomorrow morning for the workshop on practical ontologies, the women in technology lunch panel, and the afternoon keynote.

IOD Keynote: Computational Mathematics and Freakonomics

I attended the keynote this morning, on the theme of looking forward: first we heard from Mike Rhodin, an exec in the IBM Software group, then Brenda Dietrich, a mathematician (and VP – finally, a female IBM exec on stage) from the analytics group in IBM Research. IBM Research has nine labs around the world, including a new one just launched in Brazil, and a number of collaborative research facilities, or “collaboratories”, where they work with universities, government agencies and private industries on research that can be leveraged into the market more quickly. I’ve met a few of the BPM researchers from the Zurich lab at the annual academic BPM conference, but the range of the research across the IBM labs is pretty vast: from nanotechnology, to the cloud, to all of the event generation that leads to the “smarter planet” that IBM has been promoting. She’s here from the analytics group because analytics is at the top of this pyramid of research areas, especially in the context of the smarter planet: all of our devices are generating a flood of events and data, and some pretty smart analytics have to be in place to be able to make sense of all this.

The future of analytics is moving from today’s static model of collect-analyze-present results, to more predictive analytics that can create models of the future based on what’s happened in the past, and use that flood of data (such as Twitter) as input to these analytical models.

I have a lot of respect for IBM for trying out their own ideas on systems on themselves as one big guinea pig, and this analytics research is no exception. They’re using data from all sorts of internal systems, from manufacturing plants to software development processes to human resources, to feed into this research, and benefit from the results. When this starts to hit the outside market, it has impacts on a much wider variety of industries, such as telco and oil field development. Not surprisingly, this ties in with master data management, since you need to deal with common data models if you’re going to perform complex analytics and queries across all of this data, and their research on using the data stream to actually generate the queries is pretty cool.

She showed a short video clip on Watson, an AI “question answering system” that they’ve built, and showed it playing Jeopardy, interpreting the natural language questions – including colloquialisms – and responding to them quickly, beating out some top human Jeopardy players. She closed with a great quote that is inspirational in so many ways, especially to girls in mathematics: “It’s a great time to be a computational mathematician”.

The high-profile speakers of the keynote were up next: Steven Levitt and Stephen Dubner, authors of Freakonomics and Superfreakonomics, with some interesting anecdotes about how they started working together (Levitt’s the genius economist, and Dubner’s the writer who collaborated with him on the books). They talked about turning data into ideas, tying in with the analytics theme; they had lots of interesting and humorous stories on an economic theme, such as teaching monkeys about money as a token to be exchanged for goods and (ahem) services, and what that teaches us about risk and loss aversion in people.

I have a noon flight home to Toronto, so this ends my time at IOD 2010. This is my first IOD: I used to attend FileNet’s UserNet conference before the acquisition, but had never been to IOD or Impact until this year. With over 10,000 people registered, this is a massive conference that covers a pretty wide range of information management technologies, including the FileNet ECM, BPM and now Case Manager software that is my main focus here. I’ve had a look at the new IBM Case Manager, as you’ve read in my posts from yesterday, and give it a bit of a mixed review, although it’s still not even released. I’m hoping for an in-depth demo sometime in the coming weeks, and will be watching to see how IBM launches itself into the case management space.

IBM Announcements: Case Manager, CMIS and More

I had a pre-IOD analyst briefing last week from IBM with updates to their ECM portfolio, given by Ken Bisconti, Dave Caldera and Craig Rhinehart. IOD – Information on Demand – is IBM’s conference covering business analytics and information management, the latter of which includes data management and content management. The former FileNet products fall into their content management portfolio (including FileNet BPM, which was repositioned as document-centric BPM following the acquisition so as to not compete with the WebSphere BPM products), and include case management capabilities in their Business Process Framework (BPF). I also had a one-to-one session with Bisconti while at IOD to get into a bit more detail.

The big announcement, at least to me, was the new Case Manager product, to ship in Q4 (probably November, although IBM won’t commit to that). IBM has been talking about an advanced case management strategy for several months now, and priming the pump about what “should” be in a case management product, but this is the first time that we’ve seen a real product as part of that strategy; I’m sure that the other ACM vendors with products already released are ROFL over IBM’s statement in the press release that this is the “industry’s first advanced case management product”. With FileNet Content Manager at the core for managing the case file and the associated content, they’ve drawn on a variety of offerings across different software groups and brands to create this product: ILOG rules, Cognos realtime monitoring, Lotus collaboration and social networking, and WebSphere Process Server to facilitate integration to multiple systems. This is one of their “industry solutions” that spans multiple software groups, and I can just imagine the internal political wrangling that went on to make this happen. As excited as they sounded about bringing all these assets together in a new product, they’ll need to demonstrate a seamless integration and common user experience so that this doesn’t end up looking like some weird FrankenECM. Judging from the comments at the previous session that I attended, it sounds like the ILOG integration, at the very least, is a bit shaky in the first release.

They’re providing analytics – both via the updated Content Analytics offering (discussed below) and Cognos – to allow views of individual case progression as well as analysis of persistent case information to detect patterns in case workload. It sounds like they’re using Cognos for analyzing the case metadata, and Content Analytics for analyzing the unstructured information, e.g., documents and emails, associated with the case.

A key capability of any case management system, and this is no exception, is the ability to handle unstructured work, allowing a case worker to use their own experience to determine the next steps to progress the case towards an outcome. Workers can create tasks and activities that use the infrastructure of queues and inboxes; this infrastructure is apparently new as part of this offering, and not based on FileNet BPM. Once a case is complete, it remains in the underlying Content Manager repository, where it is subject to retention policies like any other content. They’ve made the case object and its tasks native content types, so like any other content class in FileNet Content Manager, you can trigger workflows (in BPM) based on the native event types of the content class, such as when the object is created or updated. The old Business Process Framework (BPF), which was the only prior IBM offering in the case management arena, isn’t being discontinued, but customers will definitely be encouraged to create any new case management applications on Case Manager rather than BPF, and eventually to rewrite their BPF applications to take advantage of new features.
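
Conceptually, the event wiring looks something like the sketch below; the names are invented, and the real subscriptions are configured through FileNet’s administration tools rather than an API like this:

```python
# Conceptual sketch only, with invented names: because cases and tasks are
# native content types, content events on them can launch BPM workflows just
# as document events can.
SUBSCRIPTIONS = []

def subscribe(content_class, event_type, workflow):
    SUBSCRIPTIONS.append((content_class, event_type, workflow))

def on_content_event(content_class, event_type, object_id):
    for cls, evt, workflow in SUBSCRIPTIONS:
        if cls == content_class and evt == event_type:
            print(f"launching workflow '{workflow}' for {content_class} {object_id}")

subscribe("CaseTask", "created", "AssignTaskReview")
on_content_event("CaseTask", "created", "task-0042")  # -> launches AssignTaskReview
```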

As we’re seeing in many other BPM and case management products, they’ve created the ability to deploy reusable templates for vertical solutions in order to reduce the time required to deploy a solution from months down to days. IBM’s focus will initially be on the horizontal platform, and they’re relying on partners and customers to build the industry-specific templates. Partners in the early adoption program are already providing templates for claims, wealth management and other solutions. The templates are designed for use by business analysts, so that a BA can use a pre-defined template to create and deploy a case management solution with minimal IT involvement.

For user experience, they’re providing three distinct interfaces:

  • A workbench for BAs to create case solutions, based on the aforementioned templates, using a wizard-based interface. This includes building the end user portal environment with the IBM iWidget component (mashup) environment.
  • A role-based portal for end users, created by the BAs in the workbench, with personalization options for the case worker.
  • Analytics/reporting dashboards reporting on case infrastructure for managers and case workers, leveraging Cognos and Content Analytics.

They did have some other news aside from the Case Manager announcement; another major content-related announcement is support for the CMIS standard, allowing IBM content repositories (FileNet CM, IBM CM8 and CMOD) to integrate more easily with non-IBM systems. This is only a technology preview at this point, but since IBM co-authored the standard, you can expect full support for it in the future. I had a recent discussion with Pega indicating that they were supporting CMIS in their case management/BPM environment, and we’re seeing the same from other vendors, meaning that you’ll be able to integrate an industrial-strength repository like FileNet CM into the BPM or ACM platform of your choice.
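
As a sketch of what that vendor-neutral access looks like, here’s a query via Apache Chemistry’s open source cmislib Python client; the endpoint URL and credentials are placeholders, and the call details may vary by cmislib version and repository:

```python
from cmislib import CmisClient  # Apache Chemistry's Python CMIS client

# Placeholder endpoint and credentials; a real CMIS-enabled repository
# (FileNet CM, CM8, CMOD or a competitor's) would publish its own URL.
client = CmisClient('http://ecm.example.com/cmis/atom', 'user', 'password')
repo = client.defaultRepository

# CMIS defines a SQL-like query language over content metadata
results = repo.query(
    "SELECT cmis:objectId, cmis:name FROM cmis:document "
    "WHERE cmis:lastModificationDate > TIMESTAMP '2010-01-01T00:00:00.000Z'")
for result in results:
    print(result.getProperties().get('cmis:name'))
```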

They had a few other announcements and points to discuss on the call:

  • IBM recently acquired Datacap, a document capture (scanning) product company, which refreshes their high-performance document scanning and automated recognition capabilities. This integrates with FileNet CM, but also with the older IBM CM8 Content Manager and (soon) CMOD, plus other non-IBM content repositories. Datacap uses a rules-based capability for better content capture, recognition and classification.
  • There are improvements to Office Document Services; this is one of the areas where CMIS will help as well, allowing IBM to hold its nose and improve its integration with SharePoint and Exchange. There’s a big focus on content governance, such as managing retention lifecycles, including content federation across multiple heterogeneous repositories.
  • There are updates to the information lifecycle governance (ILG) portfolio, including Content Collector and eDiscovery. Content Collector has better content collection, analysis and management capabilities for office documents, email and SAP data. eDiscovery now provides better support for legal discovery cases, with enhanced security roles for granular content access, redaction APIs and better keyword identification. This ties back into governance, content lifecycle management and retention management: disposal of information at the appropriate times is key to reducing legal discovery costs, since you’re not having to retrieve, distribute and review a lot of content that is no longer legally required.
  • IBM’s recent acquisition of PSS Systems complements the existing records management and eDiscovery capabilities with retention-related analytics and policy solutions.
  • The relatively new IBM Content Analytics (ICA) product has been updated, providing analytics on content retention management (i.e., find what you need to decommission) as well as more general “BI for content” for advanced analytics on what’s in your content repositories and related contextual data from other sources. This integrates out of the box with Cognos (which begs the question, why isn’t this actually just Cognos) as well as the new Case Manager product to provide analytics for the manager dashboard views. The interesting thing is that “content” in this situation is more than just IBM content repositories: it also covers competitors’ content repositories and even things like Twitter feeds via IBM’s new BigInsights offering. They have a number of ICA technology demos here at IOD, including the BigInsights/Twitter analysis, and ICA running on Hadoop infrastructure for scalability.
  • The only announcement for FileNet BPM seemed to be expanding to some new Linux platforms; I’ve heard that they’re refactoring the process engine to improve performance and maintainability, but there’s no whiff of new functionality aside from the Case Manager announcement. I plan to attend the BPM technical briefing this afternoon, and should have some more updates after that.

I still find the IBM ECM portfolio – much like their BPM and other portfolios – to contain too many products: clearly, some of these should be consolidated, although IBM’s strategy seems to be to never sunset a product if they have a couple of others that do almost the same thing and there’s a chance that they can sell you all of them.

IBM IOD Opening Session: ACM and Analytics

I’m at IBM’s Information On Demand (IOD) conference this week, attending the opening session. There are 10,000 attendees here (including, I assume, IBM employees) for a conference that covers information management of all sorts: databases, analytics and content management. As at other large vendor conferences, they feel obligated to assault our senses in the morning with loud performance art: today, it’s Japanese drummers (quite talented, and thankfully short). From a logistics standpoint, the wifi fell to its knees before the opening session even started (what, like you weren’t expecting this many people??); IBM could learn a few lessons about supporting social media attendees from SAP, which provided a social media section with tables, power and wired internet to ensure that our messages got out in a timely fashion.

Getting back to the session, it was hosted by Mark Jeffries, who provided some interesting and amusing commentary between sessions, told us the results of the daily poll, and moderated some of the Q&A sessions; I’ve seen him at other conferences and he does a great job. First up from IBM was Robert LeBlanc (I would Google his title, but did I mention that there’s no wifi in here as I type?), talking about how the volume of information is exploding, and yet people are starved for the right information at the right time: most business people say that it’s easier to get information on the internet than out of their own internal systems. Traditional information management – database and ECM – is becoming tightly tied with analytics, since you need analytics to make decisions based on all that information, and gain insights that help to optimize business.

They ran some customer testimonial videos, and the term “advanced case management” came up early and often: I sense that this is going to be a theme for this conference, along with the theme of being analytics-driven to anticipate and shape business outcomes.

LeBlanc was then joined on stage by two customers: Mike Dreyer of Visa and Steve Pratt of CenterPoint Energy. In both cases, these organizations are leveraging information in order to do business better: for example, Visa used analytics to determine that “swipe-and-go” for low-value neighborhood transactions, such as those at Starbucks, was so low-risk that immediate verification wasn’t needed, speeding each transaction and therefore getting your morning latte to you faster. CenterPoint, an energy distributor, uses advanced metering and analytics not only for end-customer metering, but to monitor the health of the delivery systems so as to avoid downtimes and optimize delivery costs. They provided insights into how to plan and implement an information management strategy, from collecting the right data to analyzing and acting on that information.

We then heard from Arvind Krishna, IBM’s GM of Information Management, discussing the cycle of information management and predictive analytics, including using analytics and event processing to optimize real-time decisions and improve enterprise visibility. He was then joined on a panel by Rob Ashe, Fred Balboni and Craig Hayman, moderated by Mark Jeffries; this started to become more of the same message about the importance of information management and analytics. I think that they put the bloggers in the VIP section right in front of the stage so that we don’t bail out when it starts to get repetitive. I’m looking forward to attending some of the more in-depth sessions to hear about the new product releases and what customers are doing with them.

Since the FileNet products are showcased at IOD, this is giving me a chance to catch up with a few of my ex-FileNet friends from when I worked there in 2000-1: last night’s reception was like old home week with lots of familiar faces, and I’m looking forward to meeting up with more of them over the next three days. Looking at the all-male group of IBM executives speaking at the keynote, however, reminded me why I’m not there any more.

Disclosure: In addition to providing me with a free pass to the conference, IBM paid my travel expenses to be here this week. I flew Air Canada coach and am staying at the somewhat tired Luxor, so that’s really not a big perq.

Webinar on Process Intelligence and Predictive Analytics

Summer is the time when no one holds conferences, because vacation schedules make it difficult to get the attendance, so webinars tend to expand to fill the gap. I’ll be presenting on another BP Logix webinar on August 10th, discussing process intelligence and predictive analytics; you can register (and see my smiling face in a video) here.

I first presented on the combination of BPM, business rules and business intelligence at Business Rules Forum in 2007.

Near the end of the presentation, I talk about self-learning decisions in processes, where process statistics are captured with business intelligence, analyzed and fed back to business rules, which then modify the behavior of the processes. In the three years since then, technology has advanced significantly: rules are now accepted as a necessary part of BPM, and process intelligence has moved far beyond simple visualizations of process instance data. In the webinar, I’ll be discussing those trends and what they mean for process improvement.
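
To make that loop concrete, here’s a stylized Python sketch with all names, data and thresholds invented: statistics gathered from completed instances drive a rule update, which then changes the routing of new instances.

```python
# Stylized feedback loop: BI -> rules -> process routing. Everything invented.
def analyze(history):
    """BI step: average cycle time (hours) per process path."""
    totals = {}
    for path, duration in history:
        totals.setdefault(path, []).append(duration)
    return {path: sum(d) / len(d) for path, d in totals.items()}

def update_rules(rules, path_stats, target_hr=24.0):
    """Rules step: stop routing to paths whose average exceeds the target."""
    rules["blocked_paths"] = {p for p, avg in path_stats.items() if avg > target_hr}
    return rules

def route(rules, candidate_paths):
    """Process step: take the first path the current rules still allow."""
    return next(p for p in candidate_paths if p not in rules["blocked_paths"])

history = [("manual_review", 30.0), ("manual_review", 28.5), ("auto_approve", 6.0)]
rules = update_rules({"blocked_paths": set()}, analyze(history))
print(route(rules, ["manual_review", "auto_approve"]))  # -> auto_approve
```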

TIBCO Silver Spotfire: BI/Analytics in the Cloud

TIBCO announces their cloud-based BI/analytics today: TIBCO Silver Spotfire, and you can even sign up for a free one-year trial.

This shouldn’t be a huge surprise to those watching TIBCO announcements to date: at their conference in May, “Silver Analytics” was mentioned in the general session as an upcoming product release, and they’ve made so much ado about moving all of their other products onto the Silver cloud platform that this seemed inevitable.

I haven’t had a demo or a chance to play with Silver Spotfire yet, but from their press release, it appears that it provides the usual sort of easy-to-use BI capabilities plus a social aspect: collaborative building and sharing of reports, dashboards and other visualizations and analytics. Spotfire has made a name for itself as an incredibly easy to use yet powerful BI platform; moving this to the cloud and adding social aspects should help to push adoption of Spotfire as well as start to make BI a bit more mainstream.

Update: There’s a short video showing the installation (yes, there’s a desktop client), data loading and web publication to get you started.

Can We Make A Sustainability-BPM Connection?

Peter Graf, SAP’s Chief Sustainability Officer, and Scott Bolick, VP Sustainability, spoke to a group of bloggers and analysts at a sustainability roundtable today. Graf started with SAP’s definition of sustainability: increase short and long-term profitability by holistically managing economic, social and environmental risks and opportunities. Sustainability changes business processes drastically, especially those processes that span multiple organizations. SAP is leading by example, improving their own internal efficiencies by enacting sustainability measures such as reducing carbon emissions, but they also see their software as an enabler for other organizations to implement sustainable solutions. SAP has a number of customers that are using SAP solutions across five general areas of sustainability: carbon impact, environmental compliance, people health and safety, product safety, and sustainability performance management. In addition to cost savings, sustainability can become a recruitment factor: younger people, in particular, want to work for a company that shares their environmental concerns.

They have made sustainability a focus of presentations at this conference, but also have made a number of sustainable logistics choices at the actual event. They have a new sustainability report that has already become hugely popular for fostering stakeholder dialog, and a sustainability map structured by line of business and business case. They are the first technology company to join the Sustainability Consortium, and we heard about acquisitions, customers and partners that are all focused on sustainability.

SAP sees Business Objects Explorer as being a key tool for helping to identify areas for sustainability; for example, providing an analytical view into office and plant costs to determine where unusual electricity consumption is occurring. SAP uses this internally for their own sustainability data analysis, and had a nice spiffy iPad version to show us, since you can’t have a conference these days without showing an iPad at least once. Analytics, especially real-time dashboards that allow for drilling into data, have been gaining popularity in a number of areas lately: we’ve seen everything from academic papers to mainstream reports in The Economist discussing analytics, and this is just one more high-profile example.
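
For example, flagging unusual electricity consumption can be as simple as comparing each reading against a site’s historical distribution; here’s a toy sketch with made-up numbers (a real implementation would live inside the BI tool, not a script):

```python
import statistics

# Made-up kWh readings per site; one suspicious spike at the plant
readings = {
    "office_walldorf": [410, 425, 398, 417, 403],
    "plant_newtown":   [880, 905, 2100, 892, 915],
}

for site, kwh in readings.items():
    mean, stdev = statistics.mean(kwh), statistics.stdev(kwh)
    for value in kwh:
        # loose 1.5-sigma threshold, chosen to suit this tiny toy sample
        if stdev and abs(value - mean) > 1.5 * stdev:
            print(f"{site}: {value} kWh is unusual (mean {mean:.0f}, sd {stdev:.0f})")
```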

Bolick then took the stage to talk about their new sustainability report in more detail; if you want more information on everything from the basic definitions of sustainability to measuring performance to more complex solutions, check it out online. This is not a static PDF that you’ll never read; this is an interactive website that includes up-to-date SAP sustainability news and social content, as well as their own analytics tools allowing a drill-down into performance (e.g., carbon footprint reduction) numbers. The sustainability map is pretty interesting (under the Solutions tab), showing all the different targets for sustainability, organized by who is responsible for solutions in that area.

SAP Sustainability Map

There’s a pretty strong commitment to corporate transparency from SAP: they show both positive and negative performance measures in the report, such as the significant drop in employee engagement. This would make a great tool for other companies to measure and publish their sustainability measures; Tom Raftery asked when they planned to productize a sustainability report generator for their customers, but since this is currently pretty specific to SAP’s operations, it’s not clear how easy that would be to do; they spoke about the potential to provide at least part of this as an on-demand solution, as well as providing benchmark performance data to help companies measure their “return on sustainability”.

The conversation came back to business processes, and the impact of IT in enabling more efficient and sustainable processes. There’s a key piece missing, however: their focus today was on analyzing sustainability performance data for human consumption, but I’m not hearing anything about using those analytics as events to feed back into any sort of automated process optimization, where optimization in this sense would be sustainability performance optimization rather than the usual type of process optimization that we do. I suspect that much of this sort of optimization is still fairly manual due to the nature of the measurement and what is required to optimize it (e.g., number of women in the workforce in order to create a more sustainable workforce), and also since many of these are such high level measures that they don’t relate to just a single process: optimizing sustainability performance is up in the first row of your enterprise architecture, and over in those columns dealing with motivation, and we haven’t yet worked out all the transformations needed to map that down to the nitty-gritty of actual business processes and rules.

Credit to Jon Reed for the title of this blog post; I was in the blogger area of the communications center (did I mention that SAP’s treatment of media in general and social media in particular really rocks?) and I told him my impressions of the roundtable and how I thought they should have more of a focus on a round-trip push back to BPM, and he popped out the phrase “the sustainability-BPM connection”. Thanks, Jon!

TIBCO Product Stack and New Releases

We’re overtime on the general session, 2.75 hours without a break, and Matt Quinn is up to talk about the TIBCO product stack and some of the recent releases as well as upcoming releases:

  • Spotfire 3.1
  • BusinessEvents 4.0, with an improved Eclipse-based development environment including a rule debugger, and a multi-threaded engine
  • BEViews (BusinessEvents Views) for creating real-time customizable dashboards for monitoring the high-speed events (as opposed to Spotfire, which can include data from a much broader context)
  • ActiveSpaces Suite for in-memory processing, grid computing and events, with the new AS Transactions and AS Patterns components
  • Silver Suite for cloud deployment, including Fabric, Grid and CAP (Composite Application Platform)
  • PeopleForms, which I saw a brief view of yesterday: a lightweight, forms-based application development environment
  • tibbr, their social microblogging platform; I think that they’re pushing too much of the social aspect here, when I think that their sweet spot is in being able to “follow” and receive messages/events from devices rather than people
  • Silver Analytics
  • ActiveMatrix 3.0, which is an expansion of the lightweight application development platform to make it more enterprise-ready
  • ActiveMatrix BPM, which he called “the next generation of BPM within TIBCO” – I’ll have more on this after an in-depth briefing
  • Silver BPM, the cloud-deployable version of BPM
  • Design Collaborator, which is a web-based design discovery tool that will be available in 2011: this appears to be their version of an online process discovery tool, although with more of a services focus than just processes; seems late to be introducing this functionality to the market

I heard much of this yesterday from Tom Laffey during the analyst session, but this was a good refresher since it’s a pretty big set of updates.

TIBCO Products Update

Tom Laffey, EVP of Products and Technology, gave us an update at the analyst session yesterday on their new product releases (embargoed until today), but started with an interesting timeline of their acquisitions. Unlike some companies, who make acquisitions just to remove a competitor from the market, TIBCO appears to have made some thoughtful buys over the years in order to build out a portfolio of infrastructure products. More than just being a Wall Street messaging company with Rendezvous, they have a full stack of mission-critical event processing, messaging, process management, analytics and more that puts them squarely in competition with the big players. Their competition differs for the different product segments: IBM is their biggest competitor, but others include Oracle, some small players and even open source in some cases. They offer fully-responsive 7×24 support through a series of worldwide support centers, handling more than 40,000 support requests per year.

Unfortunately, this leaves them with more than 200 products: a massive portfolio that makes it difficult for them to explain, and even more difficult for customers to understand. A core part of the portfolio is the “connect” part that we heard about earlier: moving point-to-point integrations onto a service bus, using products such as Rendezvous, EMS, BusinessWorks, all manner of adapters, ActiveMatrix, BusinessConnect, CIM, ActiveSpaces and tibbr. On the “automate” part of the platform are all of their BPM offerings: iProcess, the newly-announced ActiveMatrix BPM, Business Studio and PeopleForms. Laffey claimed up front that iProcess is not being replaced by ActiveMatrix BPM (methinks he doth protest too much), which means that there is likely some functionality overlap. The third part, “optimize”, includes Spotfire Suite, S+, BusinessEvents and Netrics.

He discussed their cloud strategy, which includes “internal clouds” (which, to many of us, are not really clouds) as well as external clouds such as AWS; the new Silver line of products – CAP, Grid, Integrator, Fabric, Federator and BPM – are deployable in the cloud.

The major product suites are, then:

  • ActiveMatrix (development, governance and integration)
  • ActiveMatrix BPM (BPM)
  • Spotfire (user-driven analytics and visualization)
  • BusinessEvents (CEP)
  • ActiveSpaces (in-memory technologies, datagrid, matching, transactions)
  • Silver (cloud and grid computing)

He dug back into the comparison between iProcess and ActiveMatrix BPM by considering the small number of highly-complex core business processes (such as claims processing) that are the focus for iProcess, versus the large number of tactical or situational small applications with simple workflows that are served by PeopleForms and ActiveMatrix BPM. He gave a quick demo that showed this sort of simple application development being completely forms-driven: create a form using a browser-based graphical form designer, then email it to a group of people to gather responses to the questions on the form. Although he referred to this as “simple BPM” and “BPM for the masses”, it’s not clear that there was any process management at all: just an email notification and gathering responses via a web form. Obviously, I need to see a lot more about this.