AIIM Toronto seminar: FNF Canada’s data capture success

Following John Mancini’s keynote, we heard from two of the sponsors, SourceHOV and ABBYY. Pam Davis of SourceHOV spoke about EIM/ECM market trends, based primarily on analyst reports and surveys, before giving an overview of their BoxOffice product.

ABBYY chose to give their speaking slot to a customer, Anjum Iqbal of FNF Canada, who spoke about their capture and ECM projects. FNF provides services to financial institutions in a variety of lending areas, and deals with a lot of faxed documents. A new business line would push their volume to 4,500 inbound faxes daily, mostly time-sensitive documents such as mortgage or loan closings that need to be processed within an hour of receipt. To do this manually, they would have needed to increase their 4 full-time staff to 10 people to handle the inbound workflow, even at a rate of 1 document/minute; instead, they used ABBYY FlexiCapture to build a capture solution for the faxes that would extract the data using OCR, and interface with their downstream content and workflow systems without human intervention. The presentation went by pretty quickly, but we learned that they had a 3-month implementation time.
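The staffing math above works out neatly; here's a quick back-of-the-envelope check (the 7.5-hour productive workday is my assumption, not something stated in the presentation):

```python
# Back-of-the-envelope staffing estimate for manual fax processing.
# Assumption: 7.5 productive hours per person per day (not from the talk).
FAXES_PER_DAY = 4_500
DOCS_PER_PERSON_PER_MINUTE = 1
WORKDAY_MINUTES = 7.5 * 60  # 450 minutes

person_minutes_needed = FAXES_PER_DAY / DOCS_PER_PERSON_PER_MINUTE
staff_needed = person_minutes_needed / WORKDAY_MINUTES
print(round(staff_needed))  # 10 people, matching the figure from the talk
```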

I stayed on for the roundtable that ABBYY hosted, with Iqbal giving more details on their implementation. They reached a tipping point when the volume of inbound printed faxes just couldn’t be handled manually, particularly when they added some new business lines that would increase their volume significantly. Unfortunately, the processes involving the banks were stuck on fax technology — that is, the banks refused to move to electronic transfer rather than faxes — so they needed to work with that fixed constraint. They needed quality data with near-zero error rates extracted from the faxes, and selected ABBYY and one of their partners to help build a solution that took advantage of standard form formats and 100% machine printing on the forms (rather than handwriting). The forms weren’t strictly fixed format, in that some critical information such as mortgage rates may be in different places on the document depending on the other content of the form; this requires more intelligent document classification as well as content analytics to extract the information. They have more than 40 templates that cover all of their use cases, although they still need one person in the process to manage any exceptions where the recognition certainty falls below a set threshold. Given the generally poor quality of faxed documents, undoubtedly this capture process could also handle documents scanned on a standard business scanner or even a mobile device in addition to their current RightFax server. Once the data is captured, it’s formatted as XML, which their internal development team uses to integrate with the downstream processes, while the original faxes are stored in a content management system.
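That exception step — a person reviews anything where recognition certainty falls below a threshold — is a standard capture pattern, and can be sketched roughly like this (the field names and the 95% threshold are my illustrative assumptions, not FNF's actual configuration):

```python
# Route OCR-extracted fields to straight-through processing or manual review,
# based on a recognition-confidence threshold (threshold value is illustrative).
CONFIDENCE_THRESHOLD = 0.95

def route_document(fields):
    """fields: dict of field name -> (value, confidence in 0..1).
    Returns ('auto', fields) if every field clears the threshold,
    otherwise ('review', low_fields) listing what a person must verify."""
    low = {name: vc for name, vc in fields.items() if vc[1] < CONFIDENCE_THRESHOLD}
    if low:
        return "review", low
    return "auto", fields

doc = {"mortgage_rate": ("3.49%", 0.97), "closing_date": ("2016-03-01", 0.88)}
decision, payload = route_document(doc)
print(decision)  # review — closing_date is below the threshold
```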

Given that these processes accept mortgage/loan application forms and produce the loan documents and other related documentation, this entire business seems ripe for disruption, although given the glacial pace of technology adoption in Canadian financial services, this could be some time off. With the flexible handling of inbound documents that they’ve created, FNF Canada will be ready for it when it happens.

That’s it for me at the AIIM Toronto seminar; I had to duck out early and missed the two other short sponsor presentations by SystemWare and Lexmark/Kofax, as well as lunch and the closing keynote. Definitely worth catching up with some of the local people in the industry as well as hearing the customer case studies.

AIIM Toronto keynote with @jmancini77

I’m at the AIIM Toronto seminar today — I pretty much attend anything that is in my backyard and looks interesting — and John Mancini of AIIM is opening the day with a talk on business processes. Yes, Mr. Column 1 is talking about Column 2, if you get the Zachman reference. This is actually pretty significant: content management isn’t just about content, just as process management isn’t just about process, but both need to overlap and work together. I had a call with Mancini yesterday in advance of my keynote at ABBYY’s conference next month, and we spent 30 minutes talking about how disruption in capture technologies has changed all business processes. Today, in his keynote, he talked about disruptive business processes that have transformed many industries.

He gave us a look at people, process and technology against the rise (and sometimes fall) of different technology platforms: document management and workflow; enterprise content management; mobile and cloud. There are a lot of issues as we move from one type of platform to another: moving to a cloud SaaS offering, for example, drives the move from perimeter-based security to asset-based security. He showed a case study for financial processes within organizations — such as accounts payable and receivable — with both a tactical dimension of getting things done and a strategic side of building a bridge to digital transformation. Most businesses (especially traditional ones) operate at a slim profit margin, making it necessary to look at ways to reduce costs: not through small, incremental improvements, but through more transformational means. For financial processes, in many cases this means getting rid of manual data capture and manipulation: no more manual data entry, no more analysis via spreadsheets. And cost reduction isn’t the key driver behind transforming financial business processes any more: it’s the need for better business analytics. Done right, these analytics provide real-time insight into your business that provides a strategic competitive differentiator: the ability to predict and react to changing business conditions.

Mancini finished by allowing today’s sponsors, with booths around the room, to introduce themselves: Precision Content, Box, Panasonic, SystemWare, ABBYY, SourceHOV, and Lexmark (Kofax). I’ll be here for the rest of the morning, and look forward to hearing from some of the sponsors and their customers here today.

Join the AIIM paper-free pledge

AIIM recently posted about the World Paper-Free Day on November 6th, and although I’m not sure that it’s recognized as a national holiday or anything, it’s certainly a good idea. I blogged almost three years ago about my mostly paperless office, and how to achieve such a thing yourself. Since that time, I’ve added an Epson DS-510 scanner, which has a nice small footprint and a sheet feeder; it sits right on my desk and there is never a backlog of scanning.

It’s not just about scanning and shredding, although those are pretty important activities: you have to have a proper retention plan that adheres to any regulatory requirements, and a secure offsite (cloud or otherwise) backup capability to ameliorate any physical site disasters.

You also need to consider how much backfile conversion that you’ll do: I decided to back-scan everything except my financial records at the time that I started going completely paperless, then scan everything including financials from that date forward. Each year, another batch of old paper financial records reached their destruction date and were shredded, the last of them just last year, and I no longer have any paper files. If back-scanning is too time-consuming for you but you want to start scanning everything day-forward, then store your old paper files by destruction date so that you can easily shred the batch of expired files each year until there are none left.
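The day-forward approach above — box old paper by destruction date, then shred each batch as it expires — is just as easy to script for the electronic side; here's a minimal sketch (the seven-year retention period is a common rule for financial records, but check your own jurisdiction's requirements):

```python
from datetime import date

RETENTION_YEARS = 7  # common for financial records; verify for your jurisdiction

def destruction_year(created: date, retention_years: int = RETENTION_YEARS) -> int:
    """Year in which a record becomes eligible for destruction."""
    return created.year + retention_years

def due_for_shredding(files, today: date):
    """files: list of (name, created_date). Returns names past their destruction year."""
    return [name for name, created in files
            if destruction_year(created) <= today.year]

files = [("taxes-2005", date(2005, 4, 30)), ("taxes-2012", date(2012, 4, 30))]
print(due_for_shredding(files, date(2014, 1, 1)))  # ['taxes-2005']
```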

These things – scanning, document destruction, retention plan, secure backup, backfile conversion – are the same things that I’ve dealt with at large enterprise customers in the past on ECM projects, just on a small-office scale.

IBM ECM Strategy at Content2015

Wrapping up the one-day IBM Content 2015 mini-conference in Toronto (repeated in several other cities across North America) is Feri Clayton, director of document imaging and capture. Feri and I were two of the few female engineers at FileNet back during my brief time there in 2000-1, and I have many fond memories of our “women in engineering” lunch club of three members.

Clayton talked about how enterprises are balancing the three key imperatives of staying competitive through productivity and cost savings, increasing growth through customer centricity, and protecting the organization through security and compliance. With ECM initiatives, this boils down to providing the right information to employees and customers to allow them to make the right decisions at the right time. From an ECM capabilities standpoint, this requires the critical capabilities of content capture, content protection, activating content by putting it into business processes, analyzing content to reveal insights, and engaging people in content-centric processes and collaboration. Some recent advances for IBM: they have been moving towards a single unified UI for all of their ECM portfolio, and IBM Content Navigator now provides a common modern user experience across all products; they have also been recognized as a market leader in Case Management by the big analysts.

She did a pretty complete review of the entire ECM portfolio, including recent major releases as well as what’s coming up.

Looking forward, they’re continuing to improve Navigator Cloud (hosted ECM), advancing mobile capture and other document capture in Datacap, releasing managed cloud (IBM hosted) offerings for CMOD and Case Manager, and releasing a new Information Lifecycle Governance solution. They’re also changing their release cadence, moving to quarterly releases rather than the usual 1-2 years between releases, while making the upgrades much easier so that they don’t require a lot of regression testing.

IBM Navigator Cloud — the cloud ECM product, not the unified UI — has a new mobile UI and a simplified web UI that includes external file sharing; soon it will have a Mac sync client, and an ECM solution platform on the cloud codenamed “Galaxy” that provides for much faster development using solution patterns. There’s quite an extensive ECM mobile roadmap, with Case Manager and Datacap coming soon on mobile. The core content platform continues to be enhanced, but they’re also expanding to integrate with web-based editors such as Office 365 and Google Docs, and enhancing collaboration for external participants.

Case Manager, which is my key product of interest here today, will soon see a mobile interface (or app?), enhanced case analytics, enhanced property layout editor, simplified solution deployment and packaging, and more industry and vertical solutions. Further out, they’re looking at hybrid use cases with cloud offerings.

Good summary of the IBM ECM roadmap, and a wrap for the day.

IBM ECM and Cloud

I’m at the IBM Content 2015 road show mini-conference in Toronto today, and sat in on a session with Mike Winter (who I know from my long-ago days at FileNet prior to its acquisition by IBM) discussing ECM in the cloud.

The content at the conference so far has been really lightweight: I think that IBM sees this more as a pre-sales prospecting presentation than an actual informational conference for existing customers. Although there is definitely a place for the former, it should not necessarily be mixed with the latter; it just frustrates knowledgeable customers who were really looking for more product detail and maybe some customer presentations.

ECM in the cloud has a lot of advantages, such as being able to access content on mobile devices and share with external parties, but also has a lot of challenges in terms of security — or, at least, perceived security — when implementing in larger enterprise environments. IBM ECM has a very robust and granular security/auditing model that was already in place for on-premise capabilities; they’re working to bring that same level of security and auditing to hybrid and cloud implementations. They are using the CMIS content management standard as the API into their Navigator service for cloud implementation: their enhanced version of CMIS provides cloud access to their repositories. The typical use case is for a cloud application to access an ECM repository that is either on premise or in IBM’s SoftLayer managed hosting in a sync-and-share scenario; arguably, this is not self-provisioned ECM in the cloud as you would see from cloud ECM vendors such as Box, although they are getting closer to it with per-user subscription pricing. This is being rolled out under the Navigator brand, which is a bit confusing since Navigator is also the term used for the desktop UI. There was a good discussion on user authentication for hybrid scenarios: basically, IBM replicates the customers’ LDAP on a regular basis, and is moving to do the same via a SAML service in the future.
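CMIS defines a SQL-like query language alongside its bindings, so a rough illustration of what a client hitting a CMIS-compliant endpoint might issue looks like this (the folder id is hypothetical; `cmis:document`, `cmis:lastModificationDate`, `IN_FOLDER` and the `TIMESTAMP` literal are part of the CMIS 1.1 query language):

```python
# Build a CMIS Query Language statement for documents recently modified
# within a given folder. The folder id shown is a made-up example.
def recent_documents_query(folder_id: str, since_iso: str) -> str:
    return (
        "SELECT cmis:objectId, cmis:name, cmis:lastModificationDate "
        "FROM cmis:document "
        f"WHERE IN_FOLDER('{folder_id}') "
        f"AND cmis:lastModificationDate >= TIMESTAMP '{since_iso}'"
    )

q = recent_documents_query("idf_1234", "2015-01-01T00:00:00.000Z")
print(q)
```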

Winter gave us a quick demo of the cloud (hosted) Navigator running against a repository in Amsterdam: adding a document, adding tags (metadata) and comments, viewing via an HTML5 viewer that includes annotations, and more. Basically, a nice web-based UI on an IBM ECM repository, with most of the rich functionality exposed. It’s quick to create a shared teamspace and add documents for collaboration, and create simple review workflows. He’s a tech guy, so didn’t know the SLA or the pricing, although he did know that the pricing is tiered.

Activiti BPM Suite – Sweet!

There are definitely changes afoot in the open source BPM market, with both Alfresco’s Activiti and camunda releasing out-of-the-box end-user interfaces and model-driven development tools to augment their usual [Java] developer-friendly approach. In both cases, they are targeting “citizen developers”: people who have technical skills and do some amount of development, but in languages lighter weight than Java. There are a lot of people who fall into this category, including those (like me) who used to be hard-core developers but fell out of practice, and those who have little formal training in software development but have some other form of scientific or technical background.

Prior to this year, Activiti BPM was not available as a standalone commercial product from Alfresco, only bundled with Alfresco or as the community open source edition; as I discussed last year, their main push was to position Activiti as the human-centric workflow within their ECM platform. However, Activiti sports a solid BPMN engine that can be used for more than just document routing and lifecycle management, and in May Alfresco released a commercially-supported Alfresco Activiti product, although focused on the human-centric BPM market. This provides them with opportunities to monetize the existing Activiti community, as well as evolving the BPM platform independently of their ECM platform, such as providing cloud and hybrid services; however, it may have some impact on their partners who were relying on support revenue for the community version.

The open source community engine remains the core of the commercial product – in fact, the enterprise release of the engine lags behind the community release, as it should – but the commercial offering adds all of the UI tools for design, administration and end-user interface, plus cluster configuration for the execution engine.

The Activiti Administrator is an on-premise web application for managing clusters, deploying process models from local packages or the Activiti Editor, and technical monitoring and administration of in-flight processes. There’s a nice setup wizard for new clusters – the open source version requires manual configuration of each node – and it allows nodes within the cluster to be auto-discovered and monitored. The monitoring of process instances allows drilling into processes to see variables, the in-flight process model, and more. Not a business monitoring tool, but seems like a solid technical monitoring tool for on-premise Activiti Enterprise servers.

The Activiti Editor is a web-based BPMN process modeling environment that is a reimplementation of other open-source tools, refactored with JavaScript libraries for better performance. The palette can be configured based on the user profile in order to restrict the environment – typically to limit the number of BPMN objects available to certain business users so that they can create simple models without unnecessary complexity – a nice feature for companies that want to OEM this into a larger environment. Models can be shared for comments (in a history stream format), versioned, then accessed from the Eclipse plug-in to create more technical executable models. Although I saw this as a standalone web app back in April, it is now integrated as the Visual Editor portion of Kickstart within the Activiti Suite.

The Activiti Suite is a web application that brings together several applications into a single portal:

  • Kickstart is their citizen development environment, providing a simple step editor that generates BPMN 2.0 – which can then be refined further using the full BPMN Visual Editor or imported into the Eclipse-based Activiti Designer – plus a reusable forms library and the ability to bundle processes into a single process application for publishing within the Suite. In the SaaS version, it will integrate with cloud services including Google Drive, Alfresco, Salesforce, Dropbox and Box.
  • Tasks is the end-user interface for starting, tracking and participating in processes. It provides an inbox and other task lists, and provides for task collaboration by allowing a task recipient to add others who can then view and comment on the task. Written in Angular JS.
  • Profile Management, for user profiles and administration.
  • Analytics, for process statistics and reports.

The Suite is not fully responsive and doesn’t have a mobile version, although apparently there are mobile solutions on the way. Since BP3 is an Activiti partner, some of the Brazos tooling is available already, and I suspect that more mobile support may be on the way from BP3 or Alfresco directly.

They have also partnered with Fluxicon to integrate process mining, allowing for introspection of the Activiti BPM history logs; I think that this is still a bit ahead of the market for most process analysts but will make it easy when they are ready to start doing process discovery for bottlenecks and outliers.
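Process mining at its simplest builds a directly-follows graph from the event log — counting how often activity B immediately follows activity A within each case — which is already enough to spot the dominant path and the outliers. A toy version over a log shaped like Activiti's history data (the event fields are simplified; Fluxicon's Disco does far more than this):

```python
from collections import Counter, defaultdict

def directly_follows(events):
    """events: list of (case_id, activity) in timestamp order.
    Returns a Counter of (predecessor, successor) transitions across all cases."""
    by_case = defaultdict(list)
    for case_id, activity in events:
        by_case[case_id].append(activity)
    transitions = Counter()
    for trace in by_case.values():
        transitions.update(zip(trace, trace[1:]))  # consecutive pairs per case
    return transitions

log = [(1, "submit"), (1, "review"), (1, "approve"),
       (2, "submit"), (2, "review"), (2, "reject")]
print(directly_follows(log).most_common(1))  # [(('submit', 'review'), 2)]
```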

I played around with the cloud version, and it was pretty easy to use (I even found a few bugs), and it would be usable by someone with some process modeling and lightweight development skills to build apps. The Step Editor provides a non-BPMN flowcharting style that includes a limited number of functions, but certainly enough to build functional human-centric apps: implicit process instance data definition via graphical forms design; step types for human, email, “choice” (gateway), sub-process and publishing to Alfresco Cloud; a large variety of form field types; and timeouts on human tasks (although timers based on business days, rather than calendar days, are not there yet). The BPMN Editor has a pretty complete palette of BPMN objects if you want to do a more technical model that includes service tasks and a large variety of events.
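The business-day timer gap mentioned above is straightforward to work around in code until the product supports it; a minimal sketch that skips weekends (statutory holidays are deliberately left out for brevity):

```python
from datetime import date, timedelta

def add_business_days(start: date, days: int) -> date:
    """Advance by `days` business days, skipping Saturdays and Sundays.
    Holidays are intentionally ignored in this sketch."""
    current = start
    remaining = days
    while remaining > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Mon=0 .. Fri=4
            remaining -= 1
    return current

# Friday + 2 business days lands on Tuesday
print(add_business_days(date(2014, 12, 5), 2))  # 2014-12-09
```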

Although initially launched in a public cloud version, everything is also available on premise as of the end of November. They have pricing for departmental (single-server up to four cores with a limit on active processes) and enterprise (eight cores over any number of servers, with additional core licensing available) configurations, and subscription licensing for the on-premise versions of Kickstart and Administrator. The cloud version is all subscription pricing. It seems that the target is really for hybrid BPM usage, with processes living on premise or in the cloud depending on the access and security requirements. Also, with the focus on integration with content and human-centric processes, they are well-positioned to make a play in the content-centric case management space.

Instead of just being an accelerator for adding process management to Java development projects, we’re now seeing open source BPM tools like Activiti being positioned as accelerators for lighter-weight development of situational applications. This is going to open up an entire new market for them: an opportunity, but also some serious new competition.


Spotfire Content Analytics At TIBCONOW

(This session was from late yesterday afternoon, but I didn’t remember to post until this morning. Oops.)

Update: the speakers were Thomas Blomberg from TIBCO and Rik Tamm-Daniels from Attivio. Thanks, guys!

I went to the last breakout on Monday to look at the new Spotfire Content Analytics, which combines Spotfire in-memory analytics and visualization with Attivio content analysis and extraction. This is something that the ECM vendors (e.g., IBM FileNet) have been offering for a while, and I was interested to see the Spotfire take on it.

Basically, content analytics is about analyzing documents, emails, blogs, press releases, website content and other human-created textual data (also known as unstructured content) in order to find insights; these days, a primary use case is to determine sentiment in social media and other public data, in order for a company to get ahead of any potential PR disasters.

Spotfire Content Analytics — or rather, the Attivio engine that powers the extraction — uses four techniques to find relevant information in unstructured content:

  • Text extraction, including metadata
  • Key phrase analysis, using linguistics to find “interesting” phrases
  • Entity extraction, identifying people, companies, places, products, etc.
  • Sentiment analysis, to determine degree of negative/positive sentiment and confidence in that score

Once the piece of content has been analyzed to extract this relevant information, more traditional analytics can be applied to detect patterns, tie these back to revenue, and allow for handling of potential high-value or high-risk situations.
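As a toy illustration of the last two techniques — entity extraction and sentiment scoring with a confidence value — here's a crude lexicon-based version (the word lists and regexes are purely illustrative; real engines like Attivio's use trained linguistic models, not fixed lexicons):

```python
import re

# Illustrative word lists only; a real engine uses trained models.
POSITIVE = {"great", "excellent", "love"}
NEGATIVE = {"terrible", "war", "plague", "hate"}

def extract_entities(text):
    """Naive entity extraction: capitalized tokens."""
    return re.findall(r"\b[A-Z][a-z]+\b", text)

def sentiment(text):
    """Returns (score in -1..+1, confidence in 0..1)."""
    words = re.findall(r"[a-z']+", text.lower())
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    hits = pos + neg
    if hits == 0:
        return 0.0, 0.0
    score = (pos - neg) / hits          # net polarity of opinionated words
    confidence = hits / len(words)      # crude: share of opinionated words
    return score, confidence

print(sentiment("War and plague make terrible news"))  # (-1.0, 0.5)
```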

Spotfire Content Analytics (via their ) uses machine learning that allows you to train the system using sample data, since the information that is considered relevant is highly dependent on the specific content type (e.g., a tweet versus a product review). They provide rich text analytics, seamless visualization via Spotfire, agility through combining sources and transformations, and support for diverse content sources. They showed a demo based on a news feed by country from the CIA factbook site (I think), analyzing and showing aggregate sentiment about countries: as you can imagine, countries experiencing war and plague right now aren’t viewed very positively. Visualization using Spotfire allows for some nice geographic map-based searching, as well as text searching. The product will be available later this month (November 2014).

Great visualizations, as you would expect from Spotfire; it will be interesting to see how this measures up to IBM’s and other content analytics offerings once it’s released.

AIIM Information Chaos Rescue Mission – Toronto Edition

AIIM is holding a series of ECM-related seminars across North America, and since today’s is practically in my back yard, I decided to check it out. It’s a free seminar so heavily sponsored; most of the talks are from the sponsor vendors or conversations with them, but John Mancini kicked things off and moderated mini-panels with the sponsor speakers to tease out some of the common threads.

The morning started with John Mancini talking about disruptive consumer technologies — cloud, mobile, IoT — and how these are “breaking” our internal business processes by fragmenting the channels and information sources. The result is information chaos, where information about a client lives in multiple places and often can’t be properly aggregated and contextualized, while still remaining secure. Our legacy systems, designed to be secure, were put in place before the devices that are causing security leaks were even invented; those older systems can’t even envision all the ways that information can leak out of an organization. Furthermore, the more consumer technologies advance, the further behind our IT people seem, making it more likely that business users will just go outside/around IT for what they need. New technologies need to be put in the context of our information management practices, and those practices adjusted to include the disruptors rather than just ignoring them: consider how to minimize risk in this information chaos state; how to use information to engage and collaborate, rather than just shutting it away in a vault; how to automate processes that involve information that may not be stored in an ECM; and how to extract insights from this information.

A speaker from Fujitsu was up next, sharing some interesting statistics on just how big the information chaos problem is:

  • 50% of business documents are still on paper; most businesses have many of their processes still reliant on paper.
  • Departmental CM systems have proliferated: 75% of organizations with a CM system have more than one, and 25% have more than four. SharePoint is like a virus among them, with an estimated 50% of organizations worldwide using SharePoint ostensibly for collaboration, but usually for ad hoc content management.
  • Legacy CM systems are themselves a hidden source of costs, inefficiency and risk.

In other words, we have a lot of problems to tackle still: large organizations tend to have a lot of non-integrated content management systems; smaller organizations tend to have none at all.

We finished the first morning segment with introductions from the event sponsors at small booths around the room.

An obvious omission (to me, anyway) was IBM/FileNet — not sure why they are not here as a sponsor considering that they have a sizable local contingent.

The rest of the morning was taken up with two sets of short vendor presentations, each followed by a Q&A session moderated by John Mancini: first Epson, K2 and EMC; then KnowledgeLake, HP Autonomy, Kodak Alaris and OpenText. There were audience questions about information security and risk, collaboration/case management, ECM benefits and change management, auto-classification, SharePoint proliferation, cloud storage, managing content retention and disposal, and many other topics; lots of good discussions from the panelists. I was amazed (or maybe just sadly accepting) at the number of questions dealing with paper capture and disposal; I’ve been working in scanning/workflow/ECM/BPM since the late 80’s, and apparently there are still a lot of people and processes resistant to getting rid of paper. As a small business owner, I run a paperless office, and have spent a big chunk of my career helping much larger enterprises go paperless as part of streamlining their processes, so I know that this is not only possible, but has a lot of benefits. As one of the vendors pointed out, just do something, rather than sitting frozen, surrounded by ever-increasing piles of paper.

I skipped out at lunchtime and missed the closing keynote since it was the only bit remaining after the lunch break, although it looked like a lot of the customer attendees stayed around for the closing and the prize draws afterwards, plus to spend time chatting with the vendors.

Thanks to AIIM and the sponsors for today’s seminar; the presentations were a bit too sales-y for me but some good nuggets of information. There’s still one remaining in Chicago and one in Minneapolis coming up next week if you want to sign up.

The Case For Smarter Process At IBM Impact 2014

At analyst events, I tend to not blog every presentation; rather, I listen, absorb and take some time to reflect on the themes. Since I spent the first part of last week at the analyst event at IBM Impact, then the second half across the country at Appian World, I waited until now to pull all the threads together here. IBM keeps the analysts busy at Impact, although I did get to the general session and a couple of keynotes, which were useful to provide context for the announcements that they made in pre-conference briefings and the analyst event itself.

A key theme at Impact this year was that of “composable business” (I have to check carefully every time I type that to make sure I don’t write “compostable business”, but someone did point out that it is about reuse…). I’m not sure that’s a very new concept: it seems to be about assembling the building blocks of business capabilities, processes and technologies in order to meet the current needs without completely retooling, which is sort of what we’ve all been saying that BPM, ODM and SOA can do for some years now.

Smarter Process is positioned as an enabler of composable business, and is IBM’s approach for “reinventing business operations” by enabling the development of customer-centric applications that push top-line growth, while still providing the efficiency and optimization table stakes. Supporting knowledge workers has become a big part of this, which leads to IBM’s biggest new feature in BPM: the inclusion of “basic” case management within BPM. The idea is that you will be able to support a broader range of work types on a single platform: pre-defined “structured” processes, structured processes with some ad hoc activities, ad hoc (case) work that can invoke structured process segments, and fully ad hoc work. I’ve been talking about this range of work types for quite a while, and how we need products that can range across them, because I see so few real-world processes that fit into the purely structured or the purely unstructured ends of the spectrum: almost everything lies somewhere in the middle, where there is a mix of both. In fact, what IBM is providing is “production case management”, where a designer (probably not technical, or not very technical) creates a case template that pre-defines all of the possible ad hoc activities and structured process fragments; the end user can choose which activities to run in which order, although some may be required or have dependencies. This isn’t the “adaptive case management” extreme end of the spectrum, where the end user has complete control and can create their own activities on the fly, but PCM covers a huge range of use cases in business today.

“But wait,” you say, “IBM already has case management with IBM Case Manager. What’s the difference?” Well, IBM BPM (Lombardi heritage) provides full BPM capabilities including process analytics and governance, plus basic case capabilities, on the IBM BPM platform; IBM Case Manager (FileNet heritage) provides full content and case capabilities including content analytics and governance, plus basic workflow capabilities, on the IBM ECM platform. Hmmm, sounds like something that Marketing would say. The Smarter Process portfolio graphic includes the three primary components of Operational Decision Management, Business Process Management and Case Management, but doesn’t actually specify which product provides which functionality, leaving it open for case management to come from either BPM or ICM. Are we finally seeing the beginning of the end of the split between process management in BPM and ICM? The answer to that is likely more political than technical – these products report up through different parts of IBM, turning the merging/refactoring of them into a turf war – and I don’t have a crystal ball, but I’m guessing that we’ll gradually see more case capabilities in BPM and a more complete integration with ECM, such that the current ICM capabilities become redundant, and IBM BPM will expand to manage the full spectrum of work types. The 1,000th cut may finally be approaching. Unfortunately for ICM users, there’s no tooling or migration path to move from ICM to BPM (presumably, no one is even talking about going the other way) since they are built on different infrastructure. There wasn’t really a big fuss made about this new functionality, or how it might overlap with ICM, outside the BPM analyst group; in fact, Bruce Silver quipped “IBM Merges Case into BPM but forgets to announce it”.

The new case management functions are embedded within the BPM environment, and appear fairly well integrated: although a simple web-based case design tool is used instead of the BPM Eclipse authoring environment, the runtime appears within the BPM process portal. The case detail view shows the case data, case documents and subfolders, running tasks, activities that can be started manually (including processes), and an overall status – similar enough to what you would see with any work item that it won’t be completely foreign, but with the information and controls required for case management. Under the covers, the ad hoc activities execute in the BPM (not ICM) process engine, and a copy of ECM is embedded within BPM to support the case folder and document artifacts.

The design environment is pretty simple, and very similar to some parts of the ICM design environment: required and optional ad hoc activities are defined, and the start trigger of each activity (manual, or automatic based on declarative logic or an event) is specified. Preconditions can be set, so that an activity can’t be started (or won’t automatically start) until certain conditions are met. If you need ad hoc activities in the context of a structured process, these can be defined in the usual BPM design environment – there’s no real magic about this, since ad hoc (that is, not connected by flow lines) activities are part of the BPMN standard and have been available for some time in IBM BPM. The case design environment is integrated with Process Designer and Process Center for repository and versioning, and case management is being sold as an add-on to IBM BPM Advanced.
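The production case management semantics described above – a template that pre-defines required and optional ad hoc activities, with preconditions that gate when each can start – can be sketched in a few lines of code. This is purely an illustration of the concept; the class and activity names are hypothetical, not IBM BPM or ICM APIs.

```python
# Minimal sketch of production case management semantics: a case template
# pre-defines ad hoc activities (some required, some with preconditions),
# and a worker may only start an activity once its preconditions are met.
# All names here are illustrative, not any vendor's actual API.

from dataclasses import dataclass, field


@dataclass
class Activity:
    name: str
    required: bool = False
    preconditions: set = field(default_factory=set)  # activities that must complete first


@dataclass
class Case:
    activities: list
    completed: set = field(default_factory=set)

    def startable(self):
        """Activities not yet done whose preconditions are all satisfied."""
        return [a.name for a in self.activities
                if a.name not in self.completed
                and a.preconditions <= self.completed]

    def complete(self, name):
        self.completed.add(name)

    def can_close(self):
        """A case can close once every required activity has completed."""
        return all(a.name in self.completed
                   for a in self.activities if a.required)


# Hypothetical loan-origination case template
case = Case(activities=[
    Activity("Review documents", required=True),
    Activity("Request appraisal"),
    Activity("Approve loan", required=True,
             preconditions={"Review documents"}),
])

case.complete("Review documents")   # unblocks "Approve loan"
case.complete("Approve loan")       # all required activities now done
```

The key distinction from a structured BPMN flow is that nothing forces an order beyond the declared preconditions: the worker picks from whatever `startable()` returns, which is the ad hoc flexibility with guardrails that distinguishes PCM from both fully structured processes and fully adaptive case management.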

Aside from the case management announcement, there are some new mobile capabilities in IBM BPM: the ability to design and playback responsive Coaches (UI) for multiple form factors, and pushing some services down to the browser. These will make the UI look better and work faster, so all good there. IBM also gave a shout out to BP3’s mobile portal product, Brazos, for developing iOS and Android apps for IBM BPM; depending on whether you want to go with responsive browser or native apps as a front-end for BPM, you’re covered.

They also announced some enhancements to Business Monitor: a more efficient, high-performance pub-sub style of event handling, and the ability to collect events from any source, although the integration into case management (either in BPM or ICM) at design time still seems a bit rudimentary. They’ve also upgraded to Cognos BI 10.2.1 as the underlying platform, which brings more powerful visualizations via the RAVE engine. I have the impression that Business Monitor isn’t as popular a BPM add-on as expected: possibly by the time that organizations get their processes up and running, they don’t have the time, energy or funds for a full-on monitoring and analytics solution. That’s too bad, since monitoring can yield a lot of process improvement benefits; it might make sense to bundle in some of this capability to at least give a teaser to BPM customers.
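The pub-sub style of event handling mentioned above boils down to a simple pattern: event sources publish to named topics without knowing who is listening, and the monitor subscribes to whichever topics it wants to collect from. A bare-bones sketch, with hypothetical topic names (this is the general pattern, not the Business Monitor API):

```python
# Minimal publish-subscribe sketch: sources publish events to named topics;
# a monitor subscribes to collect events from any source. Topic and event
# names are made up for illustration.

from collections import defaultdict


class EventBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Sources don't know (or care) who is listening
        for handler in self._subscribers[topic]:
            handler(event)


collected = []
bus = EventBus()
bus.subscribe("process.completed", collected.append)   # the "monitor"
bus.publish("process.completed", {"instance": 42, "duration_ms": 1800})
bus.publish("case.opened", {"case_id": 7})             # no subscriber; dropped
```

The appeal for monitoring is exactly this decoupling: any system that can publish an event becomes a source, without the monitor needing point-to-point integrations.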

In BPM cloud news, there are some security enhancements to the SoftLayer-based BPM implementations, including two-factor authentication and SAML for identity management, plus new pricing at $199/user/month with concurrent user pricing scenarios for infrequent users. What was more interesting is what was not announced: the new Bluemix cloud development platform offers decision services, but no process services.

Blueworks Live seemed to have the fewest announcements, although it now has review and approval processes for models, which is a nice governance addition. IBM can also now provide Blueworks Live in a private cloud – still hosted but isolated as a single tenant – for those who are really paranoid about their process models.

BPM For Product Lifecycle Management At Johnson & Johnson

In this last breakout of Innovation World, simultaneous sessions from Johnson & Johnson and Johnson Controls were going on in adjacent rooms. I’m guessing that a few people might have ended up in the wrong session.

I was in the J&J session, where Pieter Boeykens and Sanjay Mandloi presented on web collaboration and process automation for global product development in the highly regulated health and pharmaceutical industry. They have a standardized set of processes for developing and launching products, with four different IT systems supporting the four parts of the PLM cycle. A lot of this focuses on collecting documents from employees and suppliers all over the world, but there was no control over the process for doing this and the form of the information collected – they had five different processes for this in four regions. They rationalized this into a single standardized global process, modeled in webMethods BPM, then spent a significant amount of time on the human interaction at each step in the process: creating wireframes, then going through several versions of the UI design in collaboration with the business users to ensure that it was intuitive and easy to use. They integrated BrainTribe for content management, which apparently handles the documents (the architecture diagram indicated that the actual documents are in Documentum) but also integrates structured content from other systems such as SAP.

In conjunction with this, they performed a webMethods upgrade from 8.2.x to 9 for their existing integration applications, migrating over their existing applications with little impact. Interestingly, this aspect generated far more questions from the audience than any of the functionality of the new BPM implementation, which gives you an idea of the business-technical mix in the audience. :)

That’s it for Software AG’s Innovation World 2013. Next week, I’ll be in Vegas for TIBCO’s TUCON conference, where I’ll be on an analyst panel on Wednesday, then back to Vegas the following week for SAP TechEd (not next week, as I tweeted earlier) with a detour through Houston on the way home to speak at the APQC process conference. If you’re at any of those events, look me up and say hi.