SAPPHIRENOW Day 1 Wrapup

Not a lot of blogging yesterday; a couple of good keynotes (but I’m not going to blog about Richard Branson, Al Gore and Colin Powell), a press conference, the sustainability roundtable, a couple of other short meetings and networking at Blogger Central.

Today, I’ll be getting a briefing on NetWeaver BPM: what’s happened in the last few months, and what’s coming up in future releases; I haven’t heard a peep since TechEd last fall.

Can We Make A Sustainability-BPM Connection?

Peter Graf, SAP’s Chief Sustainability Officer, and Scott Bolick, VP Sustainability, spoke to a group of bloggers and analysts at a sustainability roundtable today. Graf started with SAP’s definition of sustainability: increase short and long-term profitability by holistically managing economic, social and environmental risks and opportunities. Sustainability changes business processes drastically, especially those processes that span multiple organizations. SAP is leading by example, improving their own internal efficiencies through sustainability measures such as reducing carbon emissions, but they also see their software as an enabler for other organizations to implement sustainable solutions. SAP has a number of customers that are using SAP solutions across five general areas of sustainability: carbon impact, environmental compliance, people health and safety, product safety, and sustainability performance management. In addition to cost savings, sustainability can become a recruitment factor: younger people, in particular, want to work for a company that shares their environmental concerns.

They have made sustainability a focus of presentations at this conference, but also have made a number of sustainable logistics choices at the actual event. They have a new sustainability report that has already become hugely popular for fostering stakeholder dialog, and a sustainability map structured by line of business and business case. They are the first technology company to join the Sustainability Consortium, and we heard about acquisitions, customers and partners that are all focused on sustainability.

SAP sees Business Objects Explorer as being a key tool for helping to identify areas for sustainability; for example, providing an analytical view into office and plant costs to determine where unusual electricity consumption is occurring. SAP uses this internally for their own sustainability data analysis, and had a nice spiffy iPad version to show us, since you can’t have a conference these days without showing an iPad at least once. Analytics, especially real-time dashboards that allow for drilling into data, have been gaining popularity in a number of areas lately: we’ve seen everything from academic papers to mainstream reports in The Economist discussing analytics, and this is just one more high-profile example.

Bolick then took the stage to talk about their new sustainability report in more detail; if you want more information on everything from the basic definitions of sustainability to measuring performance to more complex solutions, check it out online. This is not a static PDF that you’ll never read; this is an interactive website that includes up-to-date SAP sustainability news and social content, as well as their own analytics tools allowing a drill-down into performance (e.g., carbon footprint reduction) numbers. The sustainability map is pretty interesting (under the Solutions tab), showing all the different targets for sustainability, organized by who is responsible for solutions in that area.

SAP Sustainability Map

There’s a pretty strong commitment to corporate transparency from SAP: they show both positive and negative performance measures in the report, such as the significant drop in employee engagement. This would make a great tool for other companies to measure and publish their own sustainability measures. Tom Raftery asked when they planned to productize a sustainability report generator for their customers; since this is currently pretty specific to SAP’s operations, it’s not clear how easy that would be to do, but they spoke about the potential to provide at least part of this as an on-demand solution, as well as providing benchmark performance data to help companies measure their “return on sustainability”.

The conversation came back to business processes, and the impact of IT in enabling more efficient and sustainable processes. There’s a key piece missing, however: their focus today was on analyzing sustainability performance data for human consumption, but I’m not hearing anything about using those analytics as events to feed back into any sort of automated process optimization, where “optimization” in this sense means sustainability performance optimization rather than the usual kind of process optimization that we do. I suspect that much of this optimization is still fairly manual due to the nature of the measurements and what is required to optimize them (e.g., the number of women in the workforce in order to create a more sustainable workforce). Also, many of these are such high-level measures that they don’t relate to just a single process: optimizing sustainability performance is up in the first row of your enterprise architecture, and over in those columns dealing with motivation, and we haven’t yet worked out all the transformations needed to map that down to the nitty-gritty of actual business processes and rules.
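To make that missing piece concrete, here’s a purely hypothetical sketch (all names invented, not an SAP API) of what the round trip could look like: a sustainability metric arrives as an event, gets evaluated against a target, and raises a signal that a process engine could subscribe to in order to adjust a process:

```java
// Purely hypothetical sketch: sustainability analytics as events that feed
// back into process optimization, rather than just dashboards for humans.
record SustainabilityReading(String site, String metric, double value) {}

class SustainabilityFeedback {
    // Assumed monthly electricity target per site; in reality this would come
    // from the sustainability performance management system.
    static final double KWH_TARGET = 12_000.0;

    void onReading(SustainabilityReading reading) {
        if ("electricity_kwh".equals(reading.metric()) && reading.value() > KWH_TARGET) {
            // In a real round trip this would publish an event that a BPM
            // engine subscribes to, e.g. spawning an energy-review process.
            System.out.printf("Raise process event: review energy use at %s (%.0f kWh)%n",
                    reading.site(), reading.value());
        }
    }
}
```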

Credit to Jon Reed for the title of this blog post; I was in the blogger area of the communications center (did I mention that SAP’s treatment of media in general and social media in particular really rocks?) and I told him my impressions of the roundtable and how I thought they should have more of a focus on a round-trip push back to BPM, and he popped out the phrase “the sustainability-BPM connection”. Thanks, Jon!

BPM Summer Camp Is Starting!

As the weather gets warmer, don’t you just naturally think of BPM? Okay, maybe that’s just me.

I’ll be presenting a series of three webinars for Active Endpoints over the summer, starting this Thursday, that we’re calling BPM Summer Camp. First up: “Team Dynamics in BPM Projects: Avoiding the Pitfalls of Forced Marriage,” in which I’ll discuss what it takes to make a great cross-departmental BPM team, and some of the challenges that you might face in building that team.

We’ll do live Q&A at the end, and I’ll warn you that Active Endpoints will keep the session going until all the questions are answered (last time, we ran over by 30 minutes), so you might want to plan a bit of slack in your schedule. They always make their webinars available for replay afterwards on iTunes; just search for VOSibilities on iTunes and subscribe to the free podcast to get both audio and video podcasts, including webinar replays.

Open Source BPM with Alfresco’s Activiti

When Tom Baeyens announced that he and Joram Barrez were stepping down from the jBPM project, he hinted at a new project, but details have been sparse until now, except for a post stating that they’re working on an open source BPMN 2.0 offering, plus one that gave unprecedented (for Tom) attention to ECM, which should have tipped me off as to their direction. It turns out that they have both joined Alfresco and are spearheading Activiti, an Apache-licensed open source BPM project that announced its Alpha 1 release today, with GA planned for November. From the press release:

An independently-run and branded open source project, Activiti will work independently of the Alfresco open source ECM system. Activiti will be built from the ground up to be a light-weight, embeddable BPM engine, but also designed to operate in scalable Cloud environments. Activiti will be liberally licensed under Apache License 2.0 to encourage widespread usage and adoption of the Activiti BPM engine and BPMN 2.0, which is being finalized as standard by OMG.

I met Tom face-to-face a couple of years ago when we ended up at different conferences in the same conference center and had a chat about total BPM world domination; interestingly, at the time he expressed that “BPMN should stick to being a modeling notation…and the mapping approach to concrete executable process languages should be left up to the vendors”; obviously, BPMN 2.0 execution semantics have changed his mind. 😉

Activiti Modeler - process design

John Newton, CTO of Alfresco, and Tom Baeyens, in his new role as Chief Architect of BPM, briefed me last week on Activiti. The project is led by Alfresco and includes SpringSource, Signavio and Camunda; Alfresco’s motivation was to have a more liberally-licensed default process engine, although they will continue to support jBPM. Alfresco will build a business around Activiti only for content-centric applications by tightly integrating it with their ECM, leaving other applications of BPM to other companies. I’ll be very interested to see the extent of their content-process integration, and if it includes triggering of process events based on document state changes as well as links from processes into the content repository.

They believe that BPEL will be replaced by BPMN for most general-purpose BPM applications, with BPEL being used only for pure service orchestration. Although that’s a technically virtuous viewpoint that I can understand, there’s already a lot of commitment to BPEL by some major vendors, so I don’t expect that it’s going to go away any time soon. Although they are only supporting a subset of the BPMN 2.0 standard now – which could be said of any of the process modelers out there, since the standard is vast – they are committed to supporting the full standard, including execution semantics and the interchange format.

Activiti includes a modeler, a process engine, an end-user application for participating in processes, and an administration console. Not surprisingly, we spent quite a bit of time talking about Activiti Modeler, which is really a branded version of Signavio’s browser-based BPMN 2.0 process modeler. This uses AJAX in a browser to provide functionality similar to an Eclipse-based process modeler, but without the desktop installation hassles and the geeky window dressing. It is possible to create a fully executable process model in the Activiti Modeler, although in most cases a developer will add the technical underpinnings, likely in a more developer-oriented environment rather than the Modeler. Signavio includes a file-based model repository, which has been customized for inclusion in the Activiti Modeler; it would be great to see them do something a bit more robust to manage the process models, especially for cloud deployments. They are including support for proprietary scripting instead of Java code for some interfaces, such as their Alfresco interface.

Activiti Explorer - end-user interface

Activiti Explorer provides a basic end-user application for managing task lists, working on tasks, and starting new processes. Without a demo, it was hard to see much of the functionality, although it appears to have support for private task lists as well as shared lists of unassigned tasks; a typical paradigm for managing tasks is to allow someone to claim an unassigned task from the shared list, thereby moving it to their personal list (illustrated in the sketch after the next paragraph).

The Activiti Engine, which is the underlying process execution engine, is packaged as a small JAR file that can be embedded within other applications, as is done in Alfresco for content management workflows. It can also be easily deployed in the cloud, allowing for cross-enterprise processes. The only thing that I saw of Activiti Probe, the technical administration console, was its view on the underlying database tables, although it will gain a number of other capabilities to manage the process engine as it develops. Not surprisingly, they don’t have all the process engine functionality available yet, but have been focusing on stabilizing the API in order to allow other companies to start working with Activiti before the GA release.
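Since the engine is just a library, embedding it comes down to building a ProcessEngine and calling its services. Here’s a minimal sketch based on the public Activiti API as it stabilized in the 5.x line; the process file, group and user names are hypothetical, and the alpha API may differ in detail:

```java
import java.util.List;
import org.activiti.engine.ProcessEngine;
import org.activiti.engine.ProcessEngineConfiguration;
import org.activiti.engine.RuntimeService;
import org.activiti.engine.TaskService;
import org.activiti.engine.task.Task;

public class EmbeddedEngineSketch {
    public static void main(String[] args) {
        // Build an in-memory engine: no server install, just a JAR on the classpath
        ProcessEngine engine = ProcessEngineConfiguration
                .createStandaloneInMemProcessEngineConfiguration()
                .buildProcessEngine();

        // Deploy a BPMN 2.0 definition, e.g. one produced in Activiti Modeler
        engine.getRepositoryService()
                .createDeployment()
                .addClasspathResource("reviewInvoice.bpmn20.xml") // hypothetical file
                .deploy();

        // Start an instance of the deployed process
        RuntimeService runtime = engine.getRuntimeService();
        runtime.startProcessInstanceByKey("reviewInvoice"); // hypothetical process key

        // The Explorer paradigm: claim an unassigned task from a shared
        // (candidate group) list, moving it to a personal list, then complete it
        TaskService tasks = engine.getTaskService();
        List<Task> unassigned = tasks.createTaskQuery()
                .taskCandidateGroup("accounting") // hypothetical group
                .list();
        tasks.claim(unassigned.get(0).getId(), "kermit"); // hypothetical user
        tasks.complete(unassigned.get(0).getId());
    }
}
```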

Activiti Cycle mockup - design collaboration

I also saw a mockup of Activiti Cycle, a design-time collaboration tool that includes views (but not editing) of process models, related documents from Alfresco, and discussion topics. Activiti Cycle can show multiple models and establish traceability between them, since their expectation is that an analyst and a developer will have different versions of the model. This is an important point: models are manually forward-engineered from the analyst’s version to the developer’s version, and there are no inherent automated updates when the model changes, although there are alerts to notify when other versions of the same model are updated. This assumption that there can be no fully shared model between analyst and developer has been part of a long-standing discussion between Tom and me since before we met; although I believe that a shared model provides the best possible technical solution, it’s not so easy for a non-technical analyst to understand BPMN models once you get past the basic subset of elements. Activiti Cycle may not be in GA until after the other components, although they are working on it concurrently.

The screen shots that I saw looked nice, although I haven’t seen a demo yet; Tom gave credit to Alfresco’s UI designers for raising this above just another developer’s BPM tool into something that could be used by non-developers without a lot of customization. I’m looking forward to a demo next month, and seeing how this progresses to the November release and beyond.

BPM and Case Management Webinar Q&A

I presented a webinar on business process management and case management today, hosted by Pegasystems. Great fun as always, and a ton of questions that we didn’t have time to answer. I captured a lot of them and will address them here; if you attended the webinar, Pega will also send out their responses as well as a link to the recording of the webinar. First of all, here are my slides from the presentation:

Update: the webinar replay is here on the Pega site (registration required).

And here’s the Q&A. I’ve grouped together questions that I’ve responded to in a single answer.

What work is being done on the BPMN standard to improve support for case management?

What about modeling? What effect do you think this dynamic/structured divide has on modeling. Can you model structured processes in the same way as dynamic cases?

is there any standard for case management like BPM?

Do you work on Case Management Process Modeling of OMG Standardisation

Although you can use BPMN to model ad hoc processes, it doesn’t currently lend itself that well to modeling case management situations: it doesn’t include good support for the rich content required in most case management scenarios, nor for completely on-the-fly subprocess definition by a participant. BPM products that only support BPMN are going to struggle with representing cases as well as structured processes. I’m not sure what OMG is doing (if anything) to address case management within the BPMN standard, but the fact that they have issued an RFP for Case Management Process Modeling indicates that they’re going to do something; in my mind, it makes sense to consider some sort of extension to BPMN, since there are so many processes that include aspects of both. I am not involved in that standards work.

Is “Case Management” just another name for Forrester’s “Human-Centric BPM” ?

Not really, or at least not based on their last definition of human-centric BPM. There are many structured BPM situations that involve a large number of human-facing steps; that doesn’t make them case management since they are neither dynamic nor collaborative. Forrester does have a recent report on dynamic case management that is separate from their BPM reports.

Case Management seems limited to user self-selected processes, that are pre-defined. Wouldn’t a truly dynamic case management system be guiding users based on current case context and customizing responses to the specific need, rather than simply insert pre-defined segment?

Absolutely. Although I showed the situation where someone could add pre-defined process fragments, I didn’t mean to imply that that’s the only method. Most of the time, the user has a pre-defined set of actions (which are more granular than process fragments) from which they can select; however, it is possible in most case management systems to allow a user to define actions and subprocesses on the fly.

Is it typical for someone other than the “case worker” to initiate or create a case, e.g. by submitting some sort of request directly into the CM application, or is it more typical for the case worker to create the case based on input via some other channel such as phone, email, etc.?

To what extent do organizations integrate the case with legacy systems – claims, investment etc.

I’ve grouped these two questions together because my response to the first one is “yes, but not only someone – it could be a trigger from another system”. In almost every situation, a case is created in response to an event; if that event occurs in another system, it could be used to create the case directly. Otherwise, an event such as a phone call could be used by someone other than the case worker to create the case, such as the CSR who took a call from a customer. Depending on the case management system, there may be further integration with other business systems in order to update information or trigger other events, or it may rely on the case worker to check those systems for additional actions and information.

If you’re trying to provide tools for front line service agents who take a wide variety of requests, including routine and knowledge based, is CM the best approach; linking in to BPM to support the routine workflows or is it better to have BPM. The main challenge is that the frontline worker could receive queries on certain query types very rarely.

Great question, to which the answer is “that depends”. That’s a design issue that would depend on the nature of the requests as well as the need to cross over between BPM and CM within those requests. For example, if the requests are independent from each other, you could spawn individual processes or cases depending on the type of request; whether the two different types are handled by one or two different systems could be completely transparent to the service agent. However, if a request could come in that needs to be combined with or related to an earlier request, then CM would likely be the way to go.

Will presenters discuss measurability for individuals participating in the case, time and actions needed to close – and sense of ownership of the customer solution?

Just because a case is dynamic doesn’t mean that it’s not measured: keep in mind that a case is based on goals, and there should be KPIs associated with those goals that can be measured. For example, it may be important that a case be completed within a specific timeframe, although not important that any given action within the case be done within a specific time, as long as it doesn’t jeopardize the case milestones. The reverse may also be true: a case could have no specific deadline since it is open-ended (as in managing a chronic care patient), although there may be deadlines and milestones on actions and subprocesses within the case. As for ownership, usually a case has a specific case manager who holds ultimate responsibility, even if some of the actions are performed by other people, although that’s not always the case. In situations where there is not a single case manager, the identification and monitoring of KPIs becomes more important, with alerts being raised to someone who can take responsibility if required in order to achieve the case result.

BPM fits easily in a Quality Management System – Plan Do Check Act. How would Case Management fit into a QM system and repository?

I’m not a QM expert, but much of what I see of how QM is applied is through the development and application of fairly specific procedures. In the presentation, I spoke about structured subprocesses that could be invoked from a case in order to complete specific actions; these would obviously fit well within a QM framework. A structured PDCA model isn’t going to fit for most case management, although it could be applied at a higher level, since there is often some design of a framework or template for cases, and KPIs against which you measure the success of the case.

If you had a process that was this complicated would you not rationalise it using something like six sigma?

This was related to the scenario that Emily Burns from Pegasystems presented, but I’ll address the more general issue of complexity and measurement, in part based on my previous response to the QM question. Six Sigma in particular is based on statistical measurement of processes, with a goal of reducing defects in the process. Although you could apply some of the Six Sigma measurement principles, in general, since you don’t have predefined processes, it’s difficult to make a lot of statistical calculations about those processes. Case management isn’t a replacement for process analysis: if you have a highly complex but structured process, then analyze it and implement it using more standard structured BPM techniques. Case management is for when, regardless of the amount of analysis that you do, it’s just not possible to model every possible pathway through the business process. That being said, there are situations where using case management for a while does end up producing some emergent processes: processes that weren’t understood to be predictable and structured until they were done enough times in a case management framework to see the patterns emerge.

since case is so dynamic, what is the best practice when designing system to handle CM?

how do you decide the granularity of a case ?

I’ve grouped these two together since they’re both involved with case design. As I mentioned in my previous response, CM is not a replacement for analysis: you still need to understand your business processes before you start designing your CM system. You will need to design a case framework that doesn’t restrict what the case managers can do, while collecting the information that is required in order to document and act upon the case. Things to design into your case will include an overall data model (which will determine the ability of people to find and monitor a specific case), any required actions or subprocesses that need to be executed at some point, and content that needs to be collected before the case can be completed. Other things to include will be case context (the information from other systems that may be used by the case manager in order to complete their work) as well as events between the case and other systems, both inbound and outbound. You will also want to set KPIs, milestones and related alerts or escalations on specific actions or the entire case. Emily will likely respond with more specifics on how they set out cases and subcases within Pega, but I suspect that you might find that your definition of a case may shift once you start doing case management for a while. I had a chance to speak with the person from BAA who presented the case study that Emily showed at the end, and he said that they were in the process of rolling up the previously separate cases that they had for things such as passenger handling and luggage handling into a single case for each flight, with those as subcases.
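To make that list concrete, here’s a hypothetical sketch of the elements a case framework design pulls together; the names are invented, and this is not Pega’s (or anyone’s) actual object model:

```java
import java.time.LocalDate;
import java.util.ArrayList;
import java.util.List;

// Hypothetical case framework elements, purely illustrative
class CaseFile {
    String caseId;                                      // identity; part of the data model used to find and monitor cases
    String caseManager;                                 // who holds ultimate responsibility
    List<String> requiredContent = new ArrayList<>();   // content to collect before the case can close
    List<String> availableActions = new ArrayList<>();  // pre-defined actions/subprocesses a worker can invoke
    List<String> contextSources = new ArrayList<>();    // other systems supplying case context
    LocalDate milestone;                                // case-level KPI; missing it raises an alert or escalation
    List<CaseFile> subcases = new ArrayList<>();        // e.g., BAA's per-flight case with handling subcases

    // A case completes only when its required content has been collected
    boolean canClose(List<String> collectedContent) {
        return collectedContent.containsAll(requiredContent);
    }
}
```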

What i understand from a case is that they are basically business scenarios. Can we assume that?

If you mean “business scenario” in the enterprise architecture sense, then a case and business scenario could be considered as equivalent in some situations, although business scenarios usually end up with some sort of structured process model defined. There are many common aspects, however, and I think that we can learn much about defining CM standards by looking at what has been done in EA.

To understand cases or to handle case management solutions, some extra tools are needed that handle things like case history, status etc., so, what generic list of tools do you think are needed from a holistic case management tool?

The list of tools and functionality is still emerging, and will continue to evolve over the next while, but Forrester’s report on dynamic case management has a useful diagram showing what to expect in a case management platform.

They also list the capabilities that would translate directly from BPMS in that part of the framework, such as human interaction, integration and analytics.

With a Case Management structure, is work typically or ever completed in the Case Management tool itself? If not, does the Case Management tool depend on the users updated the case periodically to indicate what stage they are in & how far they are toward completion, etc.?

Work is typically done both within the CM structure and in other systems. Since part of the expected functionality is integration between the CM system and other systems, there may be some degree of automated exchange of events and information between them, or users may be required to update the case directly with their progress in non-integrated systems. Since the case file serves as a permanent record of the case, it is often considered the system of record rather than transient information, as a typical process instance might be: that means that updating the case isn’t just a matter of documenting what was done in other systems; it could be the only place in which that information is captured.

If you were on the call and have other questions, feel free to add them in the comments and I’ll respond.

TIBCO BPM Now and Future: iProcess, Meet ActiveMatrix BPM

The session that I’ve been waiting all day for is with Roger King, who runs BPM product management and strategy for TIBCO; he discussed the new ActiveMatrix BPM and TIBCO Silver BPM offerings for on-premise and cloud deployments. They’ve been working on this for a couple of years, and are obviously keen to get it out of the gate. As I tweeted earlier after taking a look at ActiveMatrix BPM in the solutions showcase, this isn’t a complementary product to iProcess: it’s the successor to iProcess, in spite of what was said about this yesterday. Have no doubt: AMX BPM is not an upgrade to iProcess, it’s a new product, based on a new technical architecture, and already (at version 1) provides more functionality than iProcess.

With both AMX BPM and Silver BPM, Business Studio is used for modeling the process; ActiveMatrix versus Silver is just a choice made at deployment time, which means that you can deploy the exact same process to an on-premise ActiveMatrix application server or to the cloud. In fact, if you’re modeling your iProcess processes now in Business Studio, rather than in the iProcess Modeler, you can deploy those directly to AMX or Silver, too. What’s changed from iProcess is that they’ve bundled much more into the BPM product: it’s a full composite application development and deployment platform, including forms-based user interfaces, rules and SOA capabilities, so that all of the process-related artifacts can be modeled in a single environment. Their previous focus on support for process patterns now extends to resource, business and data patterns, too, and there’s more work management and workforce optimization functionality. Their tag line: “Business Friendly, Enterprise Strength”.

This model-driven development is based on five types of models: process (which we’re used to in BPM), form, data, organizational and client application. In order to do this, they reused some pieces that will be familiar to iProcess customers, but added some new stuff too:

  • Business Studio for modeling, extended for new functionality
  • New OSGi-based deployment model, where an application package (process, rules, services, etc.) is deployed rather than just a process
  • New container-based grid platform
  • New runtime, which is an ActiveMatrix application
  • Workspace, similar to that used by iProcess, but extended
  • New Openspace gadget-based client, including interfaces for mobile devices

The architecture starts with the OSGi runtime and the ActiveMatrix service platform as the base, with the ActiveMatrix BPM SCA composite application running on top of it as the BPM platform, including the Process Manager, Openspace, Event Collector, Work Manager and Workspace components. Everything used by the AMX BPM components is visible to other applications, meaning that it can be easily embedded in or integrated with other AMX BPM applications.

Both business analysts and process developers create executable process models with the other supporting models and forms user interfaces, while the SOA developer creates process-based services, all within the AMX BPM environment. Work is managed and executed by various levels of workers, using organizational models that can be extracted from LDAP. Users may access work using Workspace (the same interface as is used for iProcess), Openspace (a mashup-type interface) or Mobilespace (the mobile version of Openspace, currently available for iPhone), or through a custom interface. Performance data is visible to different levels of monitors, again through standard dashboards or custom interfaces.

One of the interesting things that can be done is modeling of page flows. Since AMX BPM allows for both user interface and process to be modeled, some parts of the flow aren’t run in the process engine, but are executed in the web tier as a series of pages/views linked by rules and services, presented to the same user during a single session with the state information maintained during the flow; this provides smart capabilities to an otherwise simple forms user interface, without having to round-trip to the process engine for basic decisioning and screen flows. It also allows for a more seamless interface in the modeler: a page flow model is shown almost as if it were an expanded subprocess from a task in the main process model, so that you can view the whole process – the parts that run on the process engine as well as in the web tier – in a common environment. This reminds me somewhat of the screen flow capabilities that are starting to emerge as part of web application platforms such as Salesforce and NetSuite, although in the context of a larger process rather than a packaged application.
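As a rough sketch of the concept (all names hypothetical; this is not TIBCO’s implementation), a page flow amounts to a small state machine held in the user’s web session, with rules choosing the next page without a round-trip to the process engine:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical page flow: pages linked by rules, with state kept in the web tier
enum Page { ENTER_CLAIM, ATTACH_RECEIPTS, CONFIRM, DONE }

class ClaimPageFlow {
    private Page current = Page.ENTER_CLAIM;             // session-scoped state
    private final Map<String, Object> flowData = new HashMap<>();

    void put(String key, Object value) {
        flowData.put(key, value);                        // form values gathered across pages
    }

    Page next() {
        switch (current) {
            case ENTER_CLAIM: {
                // Simple decisioning in the web tier, no call to the process engine
                double amount = (double) flowData.getOrDefault("amount", 0.0);
                current = amount > 100.0 ? Page.ATTACH_RECEIPTS : Page.CONFIRM;
                break;
            }
            case ATTACH_RECEIPTS:
                current = Page.CONFIRM;
                break;
            case CONFIRM:
                current = Page.DONE;                     // only now does control return to the engine task
                break;
            default:
                break;
        }
        return current;
    }
}
```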

I also like the data modeling capabilities in their business object models: you can interrogate an existing database directly in order to derive the data model for your process instance data, which saves a lot of redefinition of the data model (and the errors that redefinition can introduce) as part of the process model. You can also import the data model from UML and other formats. Eventually, this needs to be able to integrate with enterprise MDM initiatives, but this is a good start.
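TIBCO didn’t show the mechanics of their import, but as a general illustration of the technique, here’s how a tool can derive a data model by interrogating a live database through standard JDBC metadata (the connection URL and table name are hypothetical):

```java
import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;
import java.sql.ResultSet;

public class SchemaIntrospectionSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection; any JDBC data source exposes the same metadata
        try (Connection conn = DriverManager.getConnection("jdbc:h2:mem:orders")) {
            DatabaseMetaData meta = conn.getMetaData();
            // Walk the columns of an existing table to derive a business object model
            try (ResultSet cols = meta.getColumns(null, null, "ORDERS", null)) {
                while (cols.next()) {
                    System.out.printf("%s : %s%n",
                            cols.getString("COLUMN_NAME"), // becomes a model attribute
                            cols.getString("TYPE_NAME"));  // maps to a model data type
                }
            }
        }
    }
}
```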

The forms-based UI designer has some nice features as well, being able to automatically generate master-detail forms with grids for detail records based on joins in the data model. Although it’s not a really complex forms designer, it does allow styling with a style sheet, and I expect to see some improvements here as they figure out what their customers really want. They can separate presentation from page flow, and some companies may decide to use the AMX BPM page flow but do their own presentation screens.

They’ve moved away from the concept of queues that supported iProcess to dynamic work lists that are generated on the fly; this makes sense given the advances in dynamic data access. In general, creating a new BPM product from the ground up today makes not only their 20-year-old iProcess architecture look dated, but also the 10-year-old generation of products from other vendors that started the current BPM revolution in the early 2000s.
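The distinction is worth a quick sketch (hypothetical names, no particular product’s API): a static queue holds items that were pushed into it, while a dynamic work list is just a query evaluated against live data at the moment the user looks at it:

```java
import java.util.Comparator;
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical work item; in a product this would live in the engine's database
record WorkItem(String id, String requiredSkill, int priority) {}

class DynamicWorkList {
    // No queue membership to maintain: the list is recomputed on demand,
    // so reassignments and rule changes take effect immediately
    List<WorkItem> listFor(String userSkill, List<WorkItem> liveItems) {
        return liveItems.stream()
                .filter(item -> item.requiredSkill().equals(userSkill))
                .sorted(Comparator.comparingInt(WorkItem::priority).reversed())
                .collect(Collectors.toList());
    }
}
```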

Tons of interesting stuff here, more than I can absorb on the fly for a live blogging post, but I’ll nail down a full briefing in the next couple of weeks.

AMX BPM shares common components with the AMX SOA product line, but does not include AMX Service Grid (which includes more containers) or AMX Service Bus – if you’re a TIBCO customer (or planning to become one), these are details that you’ll want to work out in terms of licensing to make sure that you have all the right pieces, and aren’t paying for things that you’re not using. If you’re an iProcess customer, then don’t look for AMX BPM as part of your upgrade maintenance: it’s not an upgrade, it’s a new product. iProcess is not being end-of-lifed; it will be maintained and have minor enhancements for some time to come, but I don’t get the idea that you’re going to see a lot happening here since King stated that the major BPM investment for them will be in AMX BPM. If you have one of the other BPM products, such as InConcert, you may want to start saying your prayers now (although there has been no EOL notice as yet). In any case, at some point you’re going to want to consider a migration path off these older platforms for processes that you want to continuously improve, since they are not going to see any significant upgrades in the future, even though the official line is that iProcess “is not going away for a long, long time”.

The current plan is to provide for coexistence of iProcess and AMX BPM in Workspace so that users can pull work from either system without having to worry about which one it is on. And, although you could take an iProcess model in Business Studio and deploy it in AMX BPM, you’d probably want to take advantage of much more of the new functionality, such as the forms-based user interface designer, which means essentially rewriting everything except the process model. Although there is some service composition capability in AMX BPM, you’re probably going to leave most of the service composition heavy lifting in BusinessWorks, since AMX BPM really is geared towards turning processes into services, not general composition.

Interestingly, when I saw a quick demo at the booth earlier today, I detected essence of BPEL in the process model (such as catch and throw events); King confirmed that at the composition level, this is heavily extended BPEL.

Essentially, AMX BPM provides BPM on an SOA platform, but without the BPM designers having to worry about the SOA parts. From that standpoint, the BPM functionality competes well with the pure play BPM suites, but it provides a great deal more flexibility in dealing with services than you’ll see from the pure plays. They see their competition as the other stack vendors, IBM and Oracle, but with the lack of innovation and cohesion in both of those vendors’ BPM offerings, TIBCO seems to come out ahead in BPM functionality. Seems like the best of both worlds.

TIBCO Product Stack and New Releases

We’re overtime on the general session, 2.75 hours without a break, and Matt Quinn is up to talk about the TIBCO product stack and some of the recent and upcoming releases:

  • Spotfire 3.1
  • BusinessEvents 4.0, with an improved Eclipse-based development environment including a rule debugger, and a multi-threaded engine
  • BEViews (BusinessEvents Views) for creating real-time customizable dashboards for monitoring high-speed events (as opposed to Spotfire, which can include data from a much broader context)
  • ActiveSpaces Suite for in-memory processing, grid computing and events, with the new AS Transactions and AS Patterns components
  • Silver Suite for cloud deployment, including Fabric, Grid and CAP (Composite Application Platform)
  • PeopleForms, which I got a brief look at yesterday: a lightweight, forms-based application development environment
  • tibbr, their social microblogging platform; I think that they’re pushing too much of the social aspect here, when their real sweet spot is in being able to “follow” and receive messages/events from devices rather than people
  • Silver Analytics
  • ActiveMatrix 3.0, which expands the lightweight application development platform to make it more enterprise-ready
  • ActiveMatrix BPM, which he called “the next generation of BPM within TIBCO” – I’ll have more on this after an in-depth briefing
  • Silver BPM, the cloud-deployable version of BPM
  • Design Collaborator, which is a web-based design discovery tool that will be available in 2011: this appears to be their version of an online process discovery tool, although with more of a services focus than just processes; seems late to be introducing this functionality to the market

I heard much of this yesterday from Tom Laffey during the analyst session, but this was a good refresher since it’s a pretty big set of updates.

TIBCO: Now FTL!

We had a brief comment from Tom Laffey in the general session about TIBCO’s new ultra low latency messaging platform to be released by year end, which breaks the microsecond barrier. They’re calling it FTL, which makes my inner (or not so inner) geek giggle with happiness: for sci-fi fans, that’s the acronym for “Faster Than Light” spaceship drives. I love it when technology companies give a nod to the geeks who use and write about their products, while remaining on topic.

It’s also new (for TIBCO) since it provides content-based routing and structured data support, which are, apparently, just as important as a cool name.

Deutsche Bank’s Wolfgang Gaertner at TUCON

The third keynote speaker this morning was Wolfgang Gaertner, CIO of Deutsche Bank: we’ve moved from international crime-fighting to the somewhat more mundane – but every bit as international and essential – world of banking. Their biggest challenge over the past few years has been to reduce the paper flow that was slowing the communication between their processing centers, reduce processing time, and improve customer service levels: all of which they have achieved. They’ve used TIBCO to integrate their multiple legacy systems, especially those from mergers and acquisitions such as they had with Berliner Bank, where they wanted to maintain the customer brand but integrate the back-end systems to allow for greater efficiency and governance.

They’re using BPM to manage some of the processes, such as special account opening and exception handling, and are finding that the new technology drives new opportunities: as other areas in the bank see what can be done with integration and BPM, they want to have that for their applications as well. They’re also planning to rip out their core legacy systems and replace them with SAP, and use TIBCO for integration and workflow: TIBCO is a big enabler here, since Deutsche Bank now has sufficient experience with TIBCO products to understand how it can be used to help with this technology transformation.

INTERPOL at TUCON

The special guest speaker at this morning’s keynote was Ronald Noble, Secretary General of INTERPOL, speaking about why speed matters in law enforcement, and using technology to stay a step ahead of the criminals.

He engaged the crowd with very funny and completely deadpan humor, but addressed the very serious topic of how the expedient exchange of information between different countries is a crucial part of law enforcement on an international scale: the two-second advantage can mean that an immigration agent has the full background of the person that they’re screening in near real-time, from both local and international databases. I’m not a huge fan of much of the “security theater” that happens in the name of airport security, but having this type of information can make a real difference in terms of identifying people traveling on lost and stolen passports, or tracking the international movements of suspected criminals. How that information is used, however, is where human rights violations can occur (a subject that Noble didn’t address), since suspicion is not the same as conviction, and not all countries treat those suspected of a crime in a humane manner.

This is one of those areas where technology has moral implications, and the impact of using every bit of data about someone in order to make decisions can be a slippery slope in some cases.