SAP NetWeaver BPM

This post is both long, and long overdue. It’s based on several online sessions with Donka Dimitrova and Jie Deng of the SAP NetWeaver BPM product management team, then an update with Wolfgang Hilpert and Thomas Volmering at SAPPHIRE in May when the product entered unrestricted release. In the past few weeks, there’s been a series of “Introduction to SAP NetWeaver BPM” posts by Arafat Farooqui of Wipro on the SAP SDN site (part 1, part 2, part 3 and part 4, which are really about how to hook up a Web Dynpro UI to a human task in BPM, then invoke a process instance using web services from a portal), and I’m inspired to finally gather up all my notes and impressions.

The driver for BPM at SAP is pretty obvious: Business Workflow within the SAP ERP suite just isn’t agile or functional enough to compete with what’s been happening in the BPM market, and SAP customers have been bringing in other BPM suites for years to complement their SAP systems. I had to laugh at one of Dimitrova’s comments on the justification for BPM during our discussion – "process changes in an ERP are difficult and require many hours from developers" – oh, the irony of this coming from an SAP employee!

The Eclipse-based Process Composer is part of the NetWeaver Developer Studio, and is used to create processes in the context of other development tools, such as the Yasu rules engine (which they bought) and user interfaces. Like most modern BPMS’, what you draw in the Process Composer (in BPMN) is directly executed, although user interfaces must be created in other development tools such as Web Dynpro or Adobe Interactive Forms, then linked to the process steps. There are future plans to generate a UI from the process context data or to provide some sort of in-place graphical forms designer, but that’s not there yet.

[Image: SAP NetWeaver BPM perspectives]

As with most Eclipse-based process modelers that I’ve seen, Process Composer has multiple perspectives for different types of process design participants, with a shared process model. Initially, there is only a process architect (technical) perspective in the modeler, and the business analyst view will be released this year. Future releases will include a line-of-business manager view to see task sequences and parallelism, but no details of gateways; and an executive view of major phases with analytics and KPI dashboards.

There is no link between ARIS-based modeling (SAP Enterprise Modeling Applications by IDS Scheer) and NetWeaver BPM in this version; integration is planned for a later version, although it will be interesting to see how that plays out now that IDS Scheer has been purchased by Software AG, which competes with SAP in (at least) the BPM arena.

Although all you can do now is create your BPM processes in this environment, there are plans for a common modeler and composition environment that provides visibility into ERP processes too, which will be a huge selling point for existing SAP customers who need more agility in their ERP processes. This common process layer will provide not just a unified design experience, but common runtime services, such as monitoring and performance management.

One huge issue from an orchestration standpoint is the lack of support for asynchronous web services calls, meaning that you have to use the existing NetWeaver Process Integrator (PI) environment to create system-centric processes, then invoke those (as synchronous web services) from NetWeaver BPM as required. I didn’t get a clear answer on future plans to merge the two process management platforms; keeping them separate will definitely cause some customer pushback, since most organizations don’t want to buy two different products to manage system-centric and human-centric processes, as they are encouraged to do by stack vendors such as IBM and Oracle.

[Image: SAP NetWeaver BPM Process Composer]

Taking a look at the Process Composer environment, this is a fairly standard Eclipse-based BPMN process modeling environment: you create a process, add pools, add steps and link them together. For human-facing tasks, you use the properties of the step to connect it to a UI for that step, which must already be built by a developer using something like Web Dynpro. As I mentioned previously, the first version only has the “process architect” perspective, and is targeted at creating human-centric processes without full orchestration capabilities, since that’s what SAP’s customers who were involved in the product development cycle said that they most wanted. The environment is fairly technical, and I wouldn’t put it in front of any but the most technical of business analysts.

Roles can be set by lanes and overridden by task role assignment, which allows using the lanes for a department (for example) and overriding manager-specific tasks without moving them to another lane. Expressions can also be used to assign roles, such as the manager of the user who started the process. User IDs, roles and groups are pulled from the NetWeaver user management engine (UME).
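
To make that lane-default-plus-override idea concrete, here’s a minimal sketch; the UserDirectory interface and its methods are my own invention for illustration, not the NetWeaver UME API:

```java
import java.util.List;

// Hypothetical sketch only: these interfaces are invented for illustration,
// not the NetWeaver UME API.
interface UserDirectory {
    List<String> usersInRole(String role);   // e.g. everyone in the lane's department role
    String managerOf(String userId);         // e.g. resolve a user's manager
}

class TaskRoleResolver {
    private final UserDirectory directory;

    TaskRoleResolver(UserDirectory directory) {
        this.directory = directory;
    }

    // Default: the task is offered to the role implied by the lane it sits in.
    List<String> laneDefaultOwners(String laneRole) {
        return directory.usersInRole(laneRole);
    }

    // Override: an expression-style assignment such as "manager of the process initiator",
    // so a manager-specific task can stay in the department lane.
    List<String> managerOfInitiator(String processInitiatorId) {
        return List.of(directory.managerOf(processInitiatorId));
    }
}
```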

Each step can have other properties, including deadlines (and the events that occur when they are exceeded) and user texts that appear for this step in the user worklist, which can include parameters from the process instance. These are all maintained (I think) in a task object, which is then associated with a step on the process map; that allows the same task to be easily reused within the same process or across processes.

[Image: SAP NetWeaver BPM Process Composer]

There are a number of things that I like about Process Composer:

  • Some nice UI pop-ups on the process map to make specifying the next step easier.
  • An explicit process data model, called the process context, driven by master data management concepts; this is used for expressions and conditions in gateways, and to map to the inputs and outputs of the UI of human steps or the service interface of automated steps. It can be imported as an XSD file if you already have the schema elsewhere (see the sketch after this list).
  • The visuals used to map and transform from the process context to a human or web service step make it obvious what’s getting mapped where, while allowing for sophisticated transformations as part of the mapping. Furthermore, a mapping – including transformation functions – can be saved and reused in other processes that have the same process context parameters.
  • Lots of fairly obvious drag-and-drop functionality: drag a task to create a step on a process map, drag a role to assign to a pool, or drag a WSDL service definition to create a system task.
  • Nice integration of the Yasu rules engine, which can be purely within the context of the process with rules appearing as functions available when selecting gateway conditions, or as a more loosely-coupled full rules engine.
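
As a concrete (and entirely hypothetical) example of a process context, here’s the sort of XSD that could be imported to define the data model for a simple leave request process; all of the element names are invented:

```xml
<!-- Hypothetical process context schema for a leave request process -->
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           targetNamespace="http://example.com/leave-request"
           xmlns:lr="http://example.com/leave-request"
           elementFormDefault="qualified">
  <xs:element name="LeaveRequest">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="employeeId" type="xs:string"/>
        <xs:element name="startDate"  type="xs:date"/>
        <xs:element name="days"       type="xs:positiveInteger"/>
        <xs:element name="approved"   type="xs:boolean" minOccurs="0"/>
        <xs:element name="comments"   type="xs:string"  minOccurs="0"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>
```

Once imported, those elements become the parameters available for gateway conditions and for mapping to the inputs and outputs of human and automated steps.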

Process Composer is just one tab within the whole NetWeaver Project Explorer environment: you can open other tabs for UI design, rules and other types of components. This allows the process to be visible while rules are being modeled, for example: handy for those of us with a short attention span. Rules are created using decision tables, or by writing in a Java-based rules language; Dimitrova referred to the latter as being “a bit complicated for business people”, which is a bit of an understatement, although decision tables are readily usable by business analysts. Future releases will have a business perspective in the rules modeler.
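
For readers who haven’t worked with decision tables, the underlying idea amounts to something like the sketch below: rows pair conditions with an outcome, evaluated top-down with the first matching row winning. This is a generic illustration in plain Java, not the Rules Composer’s own format:

```java
import java.util.List;

// Generic decision-table sketch: rows of conditions plus an outcome,
// evaluated top-down with the first matching row winning.
public class DiscountDecisionTable {

    record Row(String customerTier, double minOrderValue, double discountPct) {}

    // The "table": this is what a business analyst would maintain in a grid.
    private static final List<Row> ROWS = List.of(
            new Row("GOLD",   1000.0, 15.0),
            new Row("GOLD",      0.0, 10.0),
            new Row("SILVER", 1000.0,  7.5),
            new Row("ANY",       0.0,  0.0)   // default row
    );

    static double discountFor(String tier, double orderValue) {
        for (Row row : ROWS) {
            boolean tierMatches = row.customerTier().equals("ANY") || row.customerTier().equals(tier);
            if (tierMatches && orderValue >= row.minOrderValue()) {
                return row.discountPct();
            }
        }
        return 0.0;
    }

    public static void main(String[] args) {
        System.out.println(discountFor("GOLD", 1200.0));   // 15.0
        System.out.println(discountFor("SILVER", 200.0));  // 0.0 (default row)
    }
}
```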

The Rules Composer is a full rules modeling environment, including debugging for incomplete or over-constrained rules in a decision table, and rules versioning. Parameters from a process context can be passed in to rules. Rules can be exposed as web services and called just like any other web service; in fact, although there is tight integration between the rules and process environment allowing for easy creation of a rule directly from within the Process Composer perspective, the rules management system is a separate entity and can be used independent of BPM: really the best of both worlds.

[Image: SAP Universal Worklist]

Having spent about 3 sessions going through the design environments, we moved on to process execution. Processes can be initiated using a web services call, from an Adobe form, or manually by an administrator. Since process models are versioned, all of the versions available on the server can be seen and instantiated.
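
Since a deployed process is exposed for instantiation via a web service, starting an instance from outside is conceptually just a SOAP call. Here’s a rough JAX-WS sketch of what such a call looks like; the service name, port name, endpoint URL and payload schema are all hypothetical placeholders, not the actual interface generated by NetWeaver BPM:

```java
import java.io.StringReader;
import javax.xml.namespace.QName;
import javax.xml.transform.Source;
import javax.xml.transform.stream.StreamSource;
import javax.xml.ws.Dispatch;
import javax.xml.ws.Service;
import javax.xml.ws.soap.SOAPBinding;

// Generic JAX-WS client sketch; assumes a JAX-WS runtime on the classpath.
// Service name, port name, endpoint and payload are hypothetical placeholders.
public class StartProcessClient {
    public static void main(String[] args) {
        String ns = "http://example.com/leave-request";
        QName serviceName = new QName(ns, "LeaveRequestService");
        QName portName = new QName(ns, "LeaveRequestPort");

        Service service = Service.create(serviceName);
        service.addPort(portName, SOAPBinding.SOAP11HTTP_BINDING,
                "http://bpm-server:50000/leave-request");   // hypothetical endpoint

        // PAYLOAD mode: we supply only the SOAP body content; the runtime wraps it in an envelope.
        Dispatch<Source> dispatch =
                service.createDispatch(portName, Source.class, Service.Mode.PAYLOAD);

        String payload =
                "<lr:startLeaveRequest xmlns:lr='" + ns + "'>"
              + "<lr:employeeId>12345</lr:employeeId>"
              + "<lr:days>5</lr:days>"
              + "</lr:startLeaveRequest>";

        Source response = dispatch.invoke(new StreamSource(new StringReader(payload)));
        System.out.println("Start-process call returned a response: " + (response != null));
    }
}
```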

Human tasks can be seen in the SAP Universal Worklist (UWL) through a connector that I heard about at SAPPHIRE, appearing alongside any other tasks that are surfaced there, whether from SAP ERP or from other systems that have developed a connector into the UWL. I like the unified inbox approach that they’re presenting: other BPM systems could, in fact, add their own human tasks here, and it provides a common inbox that is focused on human workflow. Although an email inbox could be used for the same purpose, it doesn’t provide adequate management of tasks from a BPMS. The UWL is fairly independent of NetWeaver BPM; it’s just one way, provided by SAP in a portal environment, to surface a worklist of BPM tasks, but it doesn’t have to be done that way.

[Image: SAP NetWeaver BPM Task Interface]

Once a task is selected and opened, there is a frame across the top with standard task information that will be common across all tasks: information such as start date, deadline and status; common task functions of Close, Delegate and Revoke; and notes and attachments to the task. Below that is the Web Dynpro UI form that was connected to that task in the Process Composer, which contains the instance data that is specific to the process context for this process. The user can interact with that form in whatever manner specified by the Web Dynpro developer, which might involve accessing data from databases or ERP systems; that part is completely independent of NetWeaver BPM.

The user can also click through to a process view showing where they are in the context of the entire process map, plus runtime task parameters such as priority and start date.

The all-important areas of monitoring and managing work in progress are a bit weak in the first version. In the next version, there will be a dashboard showing process status and cycle time, with drill-down to process instances, combining exported BI data and real-time work in progress statistics. There is no way to update the process design of work in progress; there are actually only a few BPMS’ that do this very well, and most either don’t do it at all or require manual modification of each instance. Wherever possible, things that might change should be put into business rules, so that the correct rule is invoked at the point in time that it is required, not when the process instance was created.
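
To illustrate why that matters, here’s a minimal sketch (my own, with an invented ApprovalRules interface) contrasting a threshold frozen into the instance at creation time with a rule consulted at the moment the gateway is evaluated:

```java
// Sketch with an invented ApprovalRules interface, just to illustrate the point.
interface ApprovalRules {
    double autoApproveLimit();   // maintained by the business; can change at any time
}

class InvoiceApprovalStep {

    // Fragile: the threshold was copied into the instance when it started, so instances
    // created before a rule change keep using the stale value for their whole lifetime.
    static boolean needsManagerApproval(double amount, double limitCapturedAtStart) {
        return amount > limitCapturedAtStart;
    }

    // Preferred: consult the rule when the gateway is actually evaluated, so even
    // long-running instances always apply the current business rule.
    static boolean needsManagerApproval(double amount, ApprovalRules rules) {
        return amount > rules.autoApproveLimit();
    }
}
```

With the second form, a long-running instance picks up a rule change automatically, without anyone touching the process design or the instance itself.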

At the end of all the demos, I was impressed with what SAP has released for a version 1.0, especially some of the nice handling of data and rules, yet aware of the many things that are still missing:

  • task UI generation
  • simulation
  • KPI measurement
  • asynchronous web services calls
  • links to/from ARIS
  • common process composition environment across BPM and ERP processes
  • BPEL translation
  • business analyst perspective in process and rules modelers
  • BPMN 2.0 support
  • strategy for merging or coexisting with NetWeaver process orchestration platform

In the briefing at SAPPHIRE, I did see a bit of the roadmap for what’s coming in the next year or two. In 2009, the focus will be on releasing the common process layer to allow for discovery, design and management of processes that include core (ERP) processes, human tasks in BPM, and service orchestration. This, in my opinion, is the make-or-break feature for NetWeaver BPM: if they can’t show much deeper integration with their ERP suite than any other BPMS vendor can offer, then they’re just another behind-the-curve BPMS struggling for market share. If they do this right, they will be positioned to win deals against other BPMS vendors that target SAP customers, as well as having a pre-existing relationship with SAP customers who may not yet have considered BPM.

Also in 2009, expect to see convergence of their BPM and BI, which is badly needed in order to provide dashboard/monitoring capabilities for BPM.

Further out, they’re planning to introduce a UI generator that will create a simple forms-based UI for tasks based on the process context (data model), as well as reports generated from the process definition and point-and-click integration of analytics at process steps. There will be more robust event provisioning tied to the existing event structure in the ERP layer, allowing events to be propagated to external applications such as BPM, and intermediate message events integrated with the Business Suite. As mentioned previously, there will be new perspectives in the Process Composer, initially a business analyst perspective with a different focus than the existing technical perspective – not just a dumbed-down version as I’ve seen in other tools – and eventually they’ll use the Eclipse rich client platform (RCP) for an even lighter-weight (and less geeky) Eclipse interface. There are plans to allow ad hoc collaboration at a process step – necessary for case management functionality – as well as to give operations managers control over interactive rule thresholds, providing greater business control over processes once they are in operation.

There’s a lot still missing in this first version: simulation, KPIs and asynchronous web services calls, just to name a few. That doesn’t mean, however, that it’s not usable – I know many customers using BPMS’ that do support those functions, but the customers never use them: great demo and sales tools, but not always used in reality.

NetWeaver BPM is not the best BPMS on the market. However, they don’t need to be the best BPMS on the market: they need to be the best BPMS for SAP customers. They’re not quite there yet, but it’s an achievable goal.

Heather Kreger, IBM, on SOA standards

It’s impossible for me to pass up a standards discussion (how sad is that?), so I switched from the business analysis stream to the SOA stream for Heather Kreger’s discussion of SOA standards at an architectural level. OASIS, the Open Group and OMG got together to talk about some of the overlapping standards impacting this: they branded the process as “SOA harmonization” and even wrote a paper about it, Navigating the SOA Open Standards Landscape Around Architecture (direct PDF link).

As Kreger points out, there are differences between the different groups’ standards, but they’re not fatal. For example, both the Open Group and OASIS have SOA reference architectures; the Open Group one is more about implementation, but there’s nothing that’s completely contradictory about them. Similarly, there are SOA governance standards from both the Open Group and OASIS.

They created a continuum of reference architectures, from the most abstract conceptual SOA reference architectures through generic reference architectures to SOA solution architectures.

The biggest difference in the standards is that of viewpoint: the standards are written based on what the author organizations are trying to do with them, but contain a lot of common concepts. For example, the Open Group tends to focus on how you build something within your own organization, whereas OASIS looks more at cross-organization orchestration. In some cases, specifications can be complementary (not complimentary as stated in the presentation 🙂 ), as we see with SoaML being used with any of the reference architectures.

Good summary, and I’ll take time to review the paper later.

The Open Group Conference

I was already planning to attend the Open Group Conference in Toronto next week to catch up on what’s happening in the enterprise architecture space, and now I’ve been invited to join Dana Gardner’s panel on Monday morning, which will also be recorded as a podcast. The panel is on architecture’s scope extending beyond the enterprise, bringing together elements of EA, SOA and cloud computing. Also on the panel will be John Gotze, president of the Association of Enterprise Architects, and Tim Westbrock from EAdirections. There are big aspects of business process to this, specifically when you start orchestrating processes that span the enterprise’s boundaries, and I hope that we’ll have time to explore that.

I’ll probably be at the conference each day to check out some of the other sessions, and I may stick around for some of the evening events, such as CloudCamp on Wednesday. If you’re there, drop by my session and say hi.

Webinars and podcasts

This seems to be my month for webinars and podcasts. Here’s the line-up:

  • I recorded a webinar for SearchSOA a few weeks ago on a pragmatic approach to using SOA and BPM together, particularly in the area of service discovery and specification. Unfortunately, I can’t find it on their site, so I’m not sure if it’s been published yet; I’ll keep looking.
  • On March 18th, I’ll be doing a live webinar on BPM centers of excellence that will become part of the Appian-sponsored BPM Basics informational site. You can sign up for the webinar here if you want to listen to it live, which will include Q&A from the audience; the version without Q&A will be available for replay on the BPM Basics site.
  • That same week, I’ll be recording a podcast with Savvion’s Dr. Ketabchi on BPM in a down economy. There have been a few other webinars on this topic lately, but right now it’s a very popular message and there’s lots to talk about. This will be published on ITO America, which provides broad coverage of technology issues for higher-level technical management.

The fun part of these three is that not only are they three completely different topics, they’re targeted at three different audiences: the first for developers and other technical people, the second for business and mid-level project team members, and the third at CIOs. Although doing webinars and white papers is a small part of my business, the research, analysis and writing that goes into them really helps to hone my ideas for applicability with my enterprise clients who are implementing BPM.

Innovation World: ChoicePoint external customer solutions with BPM, BAM and ESB

I took some time out from sessions this afternoon to meet with Software AG’s deputy CTOs, Bjoern Brauel and Miko Matsumura, but I’m back for the last session of the day with Cory Kirspel, VP of identity risk management at ChoicePoint (a LexisNexis company), on how they have created externally-facing solutions using BPM, BAM and ESB. ChoicePoint screens and authenticates people for employment screening, insurance services and other identity-related purposes, plus does court document retrieval. There’s a fine line to walk here: companies need to protect the privacy of individuals while minimizing identity fraud.

Even though they only really do two things — credential and investigate people and businesses — they had 43+ separate applications on 12 platforms with various technologies in order to do this. Not only did that make it hard to do what they needed internally, but customers also wanted to integrate ChoicePoint’s systems directly into their own, with an implementation time of only 3-4 months, and to have visibility into the processes.

They were already a Software AG customer through the legacy modernization products, so they took a look at their BPM, BAM and ESB. The result is that they had better visibility, and could leverage the tools to build solutions much faster since they weren’t building everything from the ground up. He walked us through some of the application screens that they developed for use in their customers’ call centers: allow a CSR to enter some data about a caller, select a matching identity by address, verify the identity (e.g., does the SSN match the name), authenticate the caller with questions that only they could answer, then provide a pass/fail result. The overall flow and the parameters of every screen can be controlled by the customer organization, and the whole flow is driven by a process model in the BPMS, which allows them to assign and track KPIs on each step in the process.
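
As a rough sketch of how a process model can drive that kind of flow while timing each step for KPIs, here’s a generic illustration; the step names echo the demo described above, but the code is my own invention, not ChoicePoint’s or Software AG’s implementation:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Generic sketch only: step names echo the demo described above, but this is my own
// illustration of per-step KPI timing, not ChoicePoint's or Software AG's implementation.
public class IdentityVerificationFlow {

    enum Step {
        CAPTURE_CALLER_DATA,
        MATCH_IDENTITY_BY_ADDRESS,
        VERIFY_SSN_AGAINST_NAME,
        AUTHENTICATE_WITH_KNOWLEDGE_QUESTIONS,
        RETURN_PASS_FAIL
    }

    public static void main(String[] args) throws InterruptedException {
        Map<Step, Long> stepDurationsMs = new LinkedHashMap<>();
        for (Step step : Step.values()) {
            long start = System.currentTimeMillis();
            Thread.sleep(10);   // placeholder for the real work performed at this step
            stepDurationsMs.put(step, System.currentTimeMillis() - start);
        }
        // A BPMS would feed figures like these into its KPI tracking for each step.
        stepDurationsMs.forEach((step, ms) -> System.out.println(step + ": " + ms + " ms"));
    }
}
```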

They’re also moving their own executives from the old way of keeping an eye on business — looking at historical reports — to the new way with near real-time dashboards. As well as having visibility into transaction volumes, they are also able to detect unusual activity that might indicate fraud or other increased risk, and alert their customers. They found that BAM and BI were misunderstood, poorly managed and under-leveraged; these technologies could be used on legacy systems to start getting benefits even before BPM was added into the mix.

All of this allowed them to reduce the cost of ownership, which protects them in a business that competes on price, as well as offering a level of innovation and integration with their customers’ systems that their competitors are unable to achieve.

They used Software AG’s professional services, and paired each external person with an internal one in order to achieve knowledge transfer.

Innovation World: Day 1 keynote

Karl-Heinz Streibich gave the opening keynote; some of the same messages as his address at the media and analyst forum yesterday, plus some messaging about how SOA is a paradigm shift, and the Net generation entering the workforce is a strong driver for modernization and integration. He made the point that process innovation outranks product innovation, and that BPM is the future: companies that apply process management are more agile (hence more competitive) and have more optimized processes (hence are more efficient).

He finished up by referring to their new process frameworks — vertical templates including process models, user interfaces, business rules and KPI definitions for building solutions quickly — but didn’t elaborate; however, they issued a press release today discussing them in more detail. Software AG is providing the following process frameworks:

  • Order-to-Cash
  • Procure-to-Pay
  • Payments
  • Underwriting
  • Product Recall
  • New Product Introduction
  • Monthly/Quarterly Closing
  • Employee On-boarding

In addition, their partners are providing frameworks for automotive claims management, electronic check processing, claims management, field services jeopardy management, invoice dispute management, and telecommunications service provisioning.

We also heard from Dr. Peter Kürpick, with much the same message as he delivered yesterday at the analyst forum, although I think that the animated graphics in his slides were nicer today, especially the one showing a process orchestrated across several organizations participating in the end-to-end supply chain. He talked about some of their customers and how they’re improving their business processes using SOA and BPM.

I believe that he had an error in one of his slides, however: in showing how Software AG is a leader in several categories of the analyst reports, and their competitors are not, he showed that they’re a leader in Forrester’s human-centric wave but TIBCO is not. I’m looking at the wave report right now, and TIBCO actually places higher in the leader category than Software AG. I may have mis-read the slide, though, since it went by quickly and I didn’t have a chance to snap a photo.

He finished up highlighting some of the things coming out of their research lab that will be seen at the demo jam competition today, including BAM capabilities that can be viewed on the iPhone.

Innovation World: Media and Analyst Forum

I’m spending the morning at the media and analyst forum at Software AG’s user conference, Innovation World, in Miami. The first half of the morning covered mainframe modernization, plus a presentation by Miko Matsumura (who I met last week at the Business Rules Forum), Deputy CTO, on the state of SOA adoption. He’s just published a book — more of a booklet at 86 pages — on SOA Adoption for Dummies, continuing Software AG’s trend of using the Dummies brand to push out lengthy white papers in a lighthearted format. For example, chapter 10 is SOA Rocket Science, which covers three principles of SOA:

  1. Keep the pointy end up (instrumentation)
  2. Maintain upward momentum (organization)
  3. Don’t stop until you’re in orbit (automation)

He finished up with a discourse on SOA as IT postmodernism, casting postmodernism as an architectural pattern language: given a breakdown in the dominant metanarrative and the push towards deconstructionism, a paradigm of composition emerges…

After the break, we heard from Ian Walsh of webMethods product marketing, who gave us an overview of the webMethods suite:

  • webMethods BPM, including process management, rules management and simulation
  • CAF (composite application framework), for codeless application design and web-based composite applications
  • BAM, including process monitoring and alerting, and predictive management

He stated that the “pure-play” BPMS vendors (mentioning Lombardi, Savvion and Pega) are having problems because they sold on the ability to allow the business to create their own processes quickly, but that doesn’t work in reality when you have to integrate complex systems. He also said that the platform vendors (Microsoft, IBM, Oracle) have confusing offerings that are not well integrated, hence take too long to implement. He mentioned TIBCO as a special case, neither pure-play nor platform, but sees their weakness as being very focused on events: good for their CEP strategy, but not good for their broader BPM/SOA strategy.

Walsh sees their strengths in both BPM and SOA as their differentiator: customers are buying both their BPM and SOA products together, not individually.

Bruce Williams was up next, speaking on BPM as the killer application for SOA. He’s a Six Sigma guy, so spent some time talking about BPM in the context of quality management initiatives: if we can manage processes well, we can achieve our business goals; in order to manage processes, we need some systems and infrastructure. He defines the killer app as being flexible and dynamic, not a fixed state or a system with unchangeable functionality. He sees BPM as being the language that can be spoken and understood by both business and IT: not the Tower of Babel created by technology-speak, but how process ties to business strategy.

Logistics are not great: they’ve billeted me in the down-market Marriott Courtyard next door rather than at the Hyatt where the conference is being held (I had to change rooms due to no hot water, can’t run the a/c at night because of the noise, and I have a view — complete with sound effects — of the I95 onramp), and there’s no wifi or power in the lecture hall. There’s supposed to be wifi, but it’s a hidden, protected network that only some people seem to be able to connect to (yes, I added it manually to my wireless network settings). They’ve promised us power at the desks and some assistance with the wifi after lunch.

In case my policy about vendor conferences isn’t crystal clear from previous posts, Software AG is paying my travel expenses to be here, although they are not compensating me for my time nor do they have any editorial control over what I write.

Business Rules Forum: Pedram Abrari on MDA, SOA and rules

Pedram Abrari, founder and CTO of Corticon, did a breakout session on model-driven architecture, SOA, and the role that rules play in all of this. I’m also in the only room in the conference center that’s close enough to the lobby to pick up the hotel wifi, and I found an electrical outlet, so I’m in blogger heaven.

It’s a day for analogies, and Abrari uses the analogy of a car for a business application: the driver represents the business, and the mechanic represents IT. A driver needs to have control over where he’s going and how he gets there, but doesn’t need to understand the details of how the car works. The mechanic, on the other hand, doesn’t need to understand where the driver is going, but keeps the car and its controls in good working order. Think of the shift from procedural to declarative development concepts, where we’ve moved from stating how to do something to stating what needs to be done. A simple example: the difference between writing code to sum a series of numbers, and just selecting a range of cells in Excel and applying the SUM function.
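
In Java terms, that contrast looks something like this (a trivial example of my own, not from the presentation):

```java
import java.util.List;

public class SumExample {
    public static void main(String[] args) {
        List<Integer> amounts = List.of(3, 5, 8, 13);

        // Procedural: spell out *how* to compute the total.
        int total = 0;
        for (int amount : amounts) {
            total += amount;
        }

        // Declarative: state *what* you want, much like =SUM(A1:A4) in a spreadsheet.
        int declarativeTotal = amounts.stream().mapToInt(Integer::intValue).sum();

        System.out.println(total + " " + declarativeTotal);   // 29 29
    }
}
```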

The utopia of model-driven architecture (MDA) is that business applications are modeled, not programmed; they’re abstract yet comprehensive, directly executable (or at least deployable to an execution environment without programming), the monitoring and analytics are tied directly to the model, and optimization is done directly on the model. The lack of programming required for creating an executable model is critical for keeping the development in the model, and not having it get sucked down into the morass of coding that often happens in environments that are round-trippable in theory, but end up with too much IT tweaking in the execution environment to ever return to the modeling environment.

He then moved on to define SOA: the concept of reusable software components that can be loosely coupled, and use a standard interface to allow for platform neutrality and design by contract. Compound/complex services can be built by assembling lower-level services in an orchestration, usually with BPM.

The key message here is that MDA and SOA fit together perfectly, as most of us are aware: those services that you create as part of your SOA initiative can be assembled directly by your modeling environment, since there is a standard interface for doing so, and services provide functionality without having to know how (or even where) that function is executed. When your MDA environment is a BPMS, this is a crystal-clear connection: every BPMS provides easy ways to interrogate and integrate web services directly into a process as a process step.

From all of this, it’s a simple step to see that a BRMS can provide rules/decisioning services directly to a process; essentially the same message that I discussed yesterday in my presentation, where decision services are no different than any other type of web services that you would call from a BPMS. Abrari stated, however, that the focus should not be on the rules themselves, but on the decision service that’s provided, where a decision is made up of a complete and consistent set of rules that addresses a specific business decision, within a reasonable timeframe, and with a full audit log of the rules fired to reach a specific decision in order to show the decision justification. The underlying rule set must be declarative to make it accessible to business people.
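
A minimal sketch of what such a decision-service contract might look like, with the audit trail of fired rules built into the result; the names and the naive rule logic are my own placeholders, not Corticon’s API:

```java
import java.util.List;

// Hypothetical decision-service contract; names and the naive rule logic are placeholders.
// The point is the shape: facts in, answer out, plus a record of which rules fired
// so the decision can be justified later.
public class CreditDecisionDemo {

    interface CreditDecisionService {
        Decision decide(CreditApplication application);
    }

    record CreditApplication(String applicantId, double requestedAmount, int creditScore) {}

    record Decision(boolean approved, List<String> firedRules) {}   // firedRules is the audit trail

    static class NaiveCreditDecisionService implements CreditDecisionService {
        @Override
        public Decision decide(CreditApplication app) {
            // A BRMS would evaluate a managed, versioned rule set here; this is just a stand-in.
            if (app.creditScore() >= 700 && app.requestedAmount() <= 50_000) {
                return new Decision(true, List.of("score_at_least_700", "amount_within_auto_limit"));
            }
            return new Decision(false, List.of("default_decline"));
        }
    }

    public static void main(String[] args) {
        CreditDecisionService service = new NaiveCreditDecisionService();
        Decision decision = service.decide(new CreditApplication("A-42", 20_000, 720));
        System.out.println(decision.approved() + " via " + decision.firedRules());
    }
}
```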

He ended up with a discussion of the necessity to extract rules out of your legacy systems and put them into a central rules repository, and a summary of the model-driven service-oriented world:

  • Applications are modeled rather than coded
  • Legacy applications are also available as web services
  • Business systems are agile and transparent
  • Enterprise knowledge assets (data, decisions, processes) are stored in a central repository
  • Management has full visibility into the past, present and future of the business
  • Enterprises are no longer held hostage by the inability of their systems to keep up with the business

Although the bits on MDA and SOA might have been new to some of the attendees, some of the rules content may have been a bit too basic for this audience, and/or already covered in the general keynotes. However, Abrari is trying to make that strong connection between MDA and rules for model-driven rules development, which is the approach that Corticon takes with their product.

Ultimus: V8 technical demo

I got wrapped up in a discussion at the break that had me arrive late to the last session of the day; Steve Jones of Ultimus is going through many of the technical underpinnings of V8 for designers and developers, particularly those that are relevant to the people in the audience who will be upgrading from those old V7 systems soon.

There’s a nice way to integrate with web services, where the WSDL can be interrogated and a data structure matching the interface parameters created directly from that; most other systems that I’ve seen require that you define the process parameters explicitly, then map from one to the other. Of course, there are lots of cases where you don’t want a full representation of the web services interface, or you want to filter or combine parameters during the mapping, but this gives you the option of setting up a lot of web services really quickly.
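
As a rough illustration of WSDL interrogation (using the open-source WSDL4J library rather than anything Ultimus-specific), the sketch below walks the port types in a WSDL and lists each operation’s input parts, which is essentially the information a designer tool needs in order to generate a matching data structure; the WSDL URL is a placeholder:

```java
import javax.wsdl.Definition;
import javax.wsdl.Message;
import javax.wsdl.Operation;
import javax.wsdl.Part;
import javax.wsdl.PortType;
import javax.wsdl.factory.WSDLFactory;
import javax.wsdl.xml.WSDLReader;

// Sketch using the open-source WSDL4J library (not Ultimus code): walk the port types
// and list each operation's input parts. The WSDL URL is a placeholder.
public class WsdlInspector {
    public static void main(String[] args) throws Exception {
        String wsdlUri = args.length > 0 ? args[0] : "http://example.com/service?wsdl";
        WSDLReader reader = WSDLFactory.newInstance().newWSDLReader();
        Definition definition = reader.readWSDL(wsdlUri);

        for (Object pt : definition.getPortTypes().values()) {
            PortType portType = (PortType) pt;
            for (Object op : portType.getOperations()) {
                Operation operation = (Operation) op;
                System.out.println("Operation: " + operation.getName());
                if (operation.getInput() == null) {
                    continue;   // e.g. notification-style operations have no input message
                }
                Message input = operation.getInput().getMessage();
                for (Object p : input.getParts().values()) {
                    Part part = (Part) p;
                    Object type = part.getTypeName() != null ? part.getTypeName() : part.getElementName();
                    System.out.println("  part " + part.getName() + " : " + type);
                }
            }
        }
    }
}
```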

The integrated rules editor allows you to drag and drop process variables — including recipients — onto a graphical decision tree; you don’t have the full power of a business rules system, but this may be enough for a lot of human-centric processes where most of the complex decisions in the process are made by people rather than the system.

For interfacing with any of the external components, such as the email connector or a form, it’s possible to drag and drop data fields from the process instance schema or org chart/Active Directory directly to assign variables for that component, which is a pretty intuitive way to make the link between the data sources and the external calls. They’ve also eliminated some of the coding required for things like getting the current user’s supervisor’s email address, which used to require a bit of code in V7.

Ultimus provides a virtual machine with the software pre-installed as part of their training offerings, which is a great way to learn how to work with all of this; I don’t understand why more vendors don’t provide this to their customers.

I looked back at some old notes from early 2007 when I had a demo of Ultimus V7; my impression at that time was that it was very code-like, with very little functionality that was appropriate for business analysts; V8 looks like a significant improvement over this. They’re still behind the curve relative to many of their competitors, but that’s not completely surprising considering their management upheavals over the past year. If you’re a pure Microsoft shop, however, you’ll likely be willing to overlook some of those issues; Forrester placed Ultimus in the leaders sector (in an admittedly small field) in their report on human-centric BPM on Microsoft platforms. In the broader market of all BPM vendors, Gartner placed them in the visionaries quadrant: good completeness of vision, but not quite enough ability to execute to make it into the leaders quadrant, although this latter assessment seemed to be based on the performance of the previous management team.

Steve spent a bit of time showing the V8 end-user interface: reconfigurable columns in task lists, including queries and filters; shared views to allow a personal view to be shared with another user (and allow that other user to complete work on your behalf); and the ability to run reports directly out of the standard user environment, not a separate interface.

They’ve also done some performance improvements, such as moving completed process instances to a separate set of tables (or even archived out to another database) for historical reporting without impacting the performance of work in progress.
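
The pattern is straightforward; a generic sketch (invented table and column names, nothing to do with Ultimus’ actual schema) might look like this:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

// Generic sketch of the archive-completed-instances pattern; table and column names
// are invented placeholders.
public class InstanceArchiver {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost/bpm", "bpm", "secret")) {   // hypothetical database
            conn.setAutoCommit(false);
            try (Statement stmt = conn.createStatement()) {
                // Copy completed instances to the archive table, then remove them from the
                // active table so work-in-progress queries stay fast. In production you would
                // archive by a captured list of instance ids (or use a suitable isolation level)
                // to avoid racing with instances that complete mid-run.
                stmt.executeUpdate(
                    "INSERT INTO process_instance_archive SELECT * FROM process_instance WHERE status = 'COMPLETED'");
                stmt.executeUpdate(
                    "DELETE FROM process_instance WHERE status = 'COMPLETED'");
                conn.commit();
            } catch (Exception e) {
                conn.rollback();
                throw e;
            }
        }
    }
}
```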

That’s it for me for the conference (and the week); tonight, we’ll be down by the Riverwalk drinking margaritas while listening to a Mariachi band. Tomorrow is an Ultimus partner day and I’ll be on an early morning flight home. Next week, I’ll be at the Business Rules Forum in Orlando, where I’m giving a presentation on mixing rules and process. The following week, I’m headed to Miami for the Software AG analyst/blogger roundtable and a day at their user conference, a late addition to my schedule.