Kofax Capture Product Portfolio

I finished the first day at Kofax Transform with a briefing on the Capture product portfolio from Bruce Orcutt. A key trend in content capture is that content can really come from anywhere, in any format: paper, web, data, fax, etc.; Kofax Capture is attempting to be that universal gateway for all content capture, not just the scanning that they’re best known for.

Some Kofax Capture feature updates:

  • Thin-client indexing and validation, so that the work that an indexing/validation operator would normally do in the desktop client can now be done in a thin client at a lower TCO.
  • Access to KTM capabilities, including structured forms processing for extraction and other document and page processing. There are still some functions that require a full KTM implementation, such as content-based classification and separation, but a big chunk of KTM functionality is now right in KC.
  • Import connector is now a separate product, handling import from fax, email, file, SMS and other sources. This isn’t just a simple import; VRS can be applied to enhance images before downstream recognition. No more printing of faxes and emails so that they can be scanned!
  • Kofax Front Office Server (KFS) allows KC to be extended to the front panel of an MFP, so that KC processes can be initiated there. I covered this in more detail in my post about the MFP session earlier today, although I missed noting that only Ricoh MFPs support card swipes for authentication.
  • Centralized configuration of VRS, which is then pushed out to the individual scanning stations running KC and VRS.
  • Detection and reporting of device problems based on image quality degradation from within VRS, e.g., stretched images may indicate ADF transport roller wear, allowing maintenance to be performed before catastrophic equipment failure occurs.

This was more of an incremental update than a review of the entire portfolio, but worthwhile nonetheless.

Kofax Transform Keynote: Craig Le Clair

Craig Le Clair from Forrester gave a keynote to discuss the role of capture and dynamic case management. He co-authored the Forrester Wave for Dynamic Case Manager published in January 2011, in which Singularity (acquired by Kofax last year) places in the leaders section. If I had wifi right now, I’d look up and link to his Forrester profile, but I recall that he also does a lot of CRM and enterprise software of various sorts.

I have little respect for middle-aged people (many younger than me) who just don’t make the effort to get plugged into this century, and tell cute anecdotes about “digital natives” – usually children under 10 who do something clever with an iPad – as an introduction to talking about social media and mobile applications in business environments. After that initial misstep, however, Le Clair laid out how the shift of consumer power to mobile devices will drive functions such as mobile capture, which Kofax provides by allowing the Atalasoft portal and mobile devices to become the point of origin for captured content.

He continued on to talk about managing untamed business processes, the topic of presentations that I’ve seen him do on webinars, and how case management can help knowledge workers deal with unstructured processes within an information-rich context. This was a bit of an introduction to case management, which is probably appropriate for most of the audience, who come from the Kofax customer/partner side; it included the three main use cases that Forrester is predicting for case management: investigations, service requests, and incidents.

He then went completely off on a tangent to talk about SharePoint and content frameworks, recommending that targeting SharePoint in your organization requires a view of its strengths and weaknesses. Duh, yeah. This appeared to be some sort of weak lead-in to a division between SharePoint targets and capture-driven process targets, but didn’t really make sense, or possibly there just wasn’t sufficient time to develop the idea. Not sure why the discussion of content ecosystems was even in this presentation.

He finished with a comparison between “Process 2011” (meaning today, so the slide should be updated to “Process 2012”) and “Process 2020”: in today’s world, processes are dictated by the business, not the customers, and mobile is just a pretty face on a traditional process that keeps peeking out at the most inopportune moments. There is a shift happening that puts customers in control of business processes, and enterprise software needs to adapt to accommodate that.

Kofax Transform Keynote: Reynolds Bish

I’m in San Diego today and tomorrow for Kofax’s annual user and partner conference, Transform. It’s been a while since I’ve had to complain about no conference wifi, so I’ll just get that out of the way now – seriously, it’s 2012, wifi should be ubiquitous at conferences. The session moderator just pointed out all of the countries from which international attendees have traveled, and how many of them do you think have US data coverage for their smartphones? I can’t even pick up the Hilton room wifi (which I paid for) or the lobby wifi (which is free) in the main conference area. Grrr. Also very little social media promotion: although there is a Foursquare venue, the conference guide doesn’t mention a hashtag, Twitter account to follow or any other social media links, plus no app or mobile-friendly website.

This is a bit of a FileNet reunion here, since the west coast location attracted a lot of people from FileNet who didn’t stay on with IBM after the acquisition. Many people that I know from my short period working at FileNet in 2000-1 (plus the work that I’ve done with their customers over the years) are Kofax employees or partners, and it seems to be a good fit for them especially since the acquisition of Singularity to give more weight to their “capture-enabled BPM” message. I’ve always thought of Kofax as the “gold standard” for document capture from the time that I first met them over 20 years ago, but they have fallen off my radar recently and it’s good to get caught up.

Back to the keynotes, Reynolds Bish, Kofax CEO, was up to talk about their timeline and future. He headed Captiva before their acquisition by EMC, and came on to Kofax about five years ago when it was in a bit of a slump in terms of vision. There’s been quite a bit of transformation since then – the Atalasoft and Singularity acquisitions, divestiture of their hardware business, trimming of the underperforming partners – and their financial results are starting to show, with increased revenues, no debt and cash in the bank. Their Europe numbers are down, as I expect many enterprise software vendors’ are, and Bish talked quite openly about the global economic issues causing that and what they are expecting for the coming months. They’re closing a number of deals over $1M, showing that they’ve grown far beyond their document scanning origins. They own 35% of the batch image capture market (the largest position), and hold significant market share in batch and ad hoc content capture, primarily in the enterprise market.

Their Atalasoft acquisition adds the capability to add internet portals as a point of origin for their capture platform, allowing consumers to capture their own documents and submit them through a secure portal. The Singularity acquisition, adding BPM and dynamic case management, will allow them to extend their capture workflow into full downstream process and case management. He stated that this allows them to double their addressable market, and showed statistics comparing the capture and BPM market sizes; he implied that they could achieve a similar market share in BPM as they have in capture, which is clearly not going to happen, but combining capture and BPM/case management does provide some compelling capabilities. Although other vendors, such as IBM (with the Datacap acquisition) have capture and BPM products, Kofax is pushing to have a single unified product that will be a competitive differentiator in both spaces, and through both on-premise and cloud licensing. He stated that Kofax Capture and KTM are strong products that are continuing to develop, but I assume that this new combined product will eventually offer an alternative platform for capture that extends into full BPM functionality. This is going to be very interesting to watch, since Kofax can potentially define and own this market, but also risks splitting their customer and partner base between the existing and new platforms. I also think that the existing Kofax partner base may not be the best channel for BPM, much as FileNet found in 2000 when they tried to push their new eProcess product through the existing document imaging partners (and their own sales teams) and found the results less than satisfactory.

They plan to continue with organic growth but also make some additional strategic acquisitions, and eventually augment their London Stock Exchange listing with a NASDAQ listing. I might just buy some of those shares if they can keep to their vision and work their way through some of the challenges.

Process Intelligence White Paper

And here’s the white paper from this afternoon’s webinar (registration required) on Enabling Process Intelligence Through Process Mining & Analytics. Sponsored by Fujitsu, written by me, reviewed with many helpful comments by Keith Swenson and the rest of the Fujitsu team. When the on-demand webinar is available, I’ll post a link.

Enabling Process Intelligence Through Process Mining & Analytics

A bit short notice, but I’m doing a webinar this afternoon at 2pm (Eastern) on process intelligence through process mining and analytics, along with Keith Swenson of Fujitsu, and you can sign up here. Fujitsu is sponsoring the webinar, as well as the white paper that I have written to go into the topic in a bit more detail; if you sign up for the webinar, I think that they will send you a copy of the white paper.

Fujitsu has recently repackaged their Automated Process Discovery (process mining, which I have reviewed here previously) together with their analytics to create a sort of “intelligent process” suite: you use the mining part to find out what your processes are across multiple systems, then add the necessary instrumentation to those processes in order to feed into a consolidated analytics visualization. Whereas most process analytics are based just on the processes automated or orchestrated by a BPMS, Fujitsu is trying to expand that level of visibility into systems that aren’t connected to the BPMS. With my background in pattern recognition, I have a bit of interest in process mining, and have written about their process discovery tool previously as well as working my way through Wil van der Aalst’s recent book, Process Mining: Discovery, Conformance and Enhancement of Business Processes.
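
At its simplest, the discovery side of process mining starts from an event log and counts which activities directly follow which others across cases; the directly-follows relation is the raw material for reconstructing a process graph. Here’s a minimal sketch of that idea (the activity names and log are invented for illustration, not taken from any Fujitsu tool):

```python
from collections import Counter

def directly_follows(traces):
    """Count how often activity b directly follows activity a
    across a set of completed process traces."""
    df = Counter()
    for trace in traces:
        for a, b in zip(trace, trace[1:]):
            df[(a, b)] += 1
    return df

# Hypothetical event log: each trace is the ordered activity list of one case.
log = [
    ["receive", "review", "approve", "archive"],
    ["receive", "review", "reject"],
    ["receive", "review", "approve", "archive"],
]

for (a, b), n in sorted(directly_follows(log).items()):
    print(f"{a} -> {b}: {n}")
```

Real mining algorithms (the alpha algorithm and its successors, covered at length in van der Aalst’s book) go well beyond this, but they all begin with relations of this kind extracted from system logs.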

Hope that you can join us for the webinar today.

Upcoming Webinars with Progress Software

Blogging around here has been sporadic, to say the least. I have several half-finished posts about product reviews and some good BPM books that I’ve been reading, but I have that “problem” that independent consultants sometimes have: I’m too busy doing billable work to put up much of a public face, both with work with vendors and some interesting end-customer projects.

Today, I’ll be presenting the second in a series of three webinars for Progress Software, focused on how BPM fits with more traditional application development environments and existing custom applications. Progress continues to integrate the Savvion and Corticon acquisitions into their product set, and wanted to put forward a webinar series that would speak to their existing OpenEdge customers about how BPM can accelerate their application development without having to abandon their existing custom applications. I really enjoyed the first of the series, because Matt Cicciari (Progress product marketing manager) and I had a very conversational hour – except for the part where he lost his voice – and this time we’ll be joined by Ken Wilmer, their VP of technology, to dig into some of their technology a bit more. My portion will focus on generic aspects of combining BPM and traditional application development, not specific to the Progress product suite, so this may be of use even if you’re not using Progress products but want to understand how these seemingly disparate methodologies and technologies come together.

We’re doing today’s webinar twice: once at 11am Eastern to cover Europe and North America, then a repeat at 7pm ET (that’s 11am tomorrow in Sydney) for the Asia Pacific region or those of you who just didn’t get enough in the first session. It will be live both times, so I will have the chance to think about what I said the first time around, and completely change it. 😉

You can sign up for today’s session here, plus the next session on February 29th that will include more about business rules in this hybrid environment.

Q&A From Making Social Mean Business

We had a few unanswered questions left from our webinar on Tuesday, so I’ve included the ones that were not related to Pega’s products below, with answers from both Emily Burns and myself:

There’s a lot of discussion about the readiness of an org before social features are introduced to its employees. What would be a way to assess maturity/readiness of an org for such features with regards to BPM?

Emily: Boy, I guess I am on the more liberal side of that discussion and would err on the side of providing access to these features and seeing how they evolve—collective intelligence is pretty impressive, and can take things in many positive directions that a designer just wouldn’t think of. It’s hard for me to see the downside to fostering better communication and collaboration between people who are already working on the same cases, but may not currently be aware of who the other people are.

Sandy: There is a lot of work being done on social business readiness by organizations such as the Social Business Council (http://council.dachisgroup.com/) that can serve as a reference for how that will work with social features in BPM. In assessing readiness, you can look at the use of other social tools within the organization, such as wikis for documentation, or instant messaging for chat, to get an idea of whether the users have been provided with tools such as this in the past. However, just because they haven’t used these in the workplace before is no reason to avoid social BPM functions since users may be using similar capabilities in consumer applications, and as Emily points out, the best thing is to provide them with the tools and see what emerges.

Emily: Features that impact the application more, such as design-by-doing, are an area that I think does need careful consideration. In the case of design-by-doing, more often than not, it is limited to certain roles; and even then, while the default is to allow the new type of case to be instantly in production, in reality most of our clients use it more as a way of gathering suggestions for application improvements. As it becomes more widely used, and best practices develop around governance, I expect this type of thing to be used more aggressively to foster the kind of real-time adaptation for which it was conceived.

Sandy: Although many organizations are worried about users “going wild” with collaborative and social tools, the opposite is often true: it is more difficult to get users to participate unless they can see a clear personal benefit, such as being able to get their job done better or more efficiently. This may require creating some rewards specifically geared at users who are taking advantage of the social tools, in order to help motivate the process.

While the knowledge that we can glean from social networking sites is indeed powerful, and allows us to serve up tailored offers, it can also irritate some customers, or seem “creepy”, like it’s a bit of an invasion of privacy.

Emily: I totally agree, and am just such a customer. In fact, I won’t go to a company’s Facebook page unless I am logged out of Facebook, because I don’t want them to know anything about me, nor do I want my friends to know about my interactions with different companies. In order to get around this sort of stonewalling, there are a few things that organizations can do.

  1. Make the content and actions that can be performed from your Facebook page sufficiently compelling that you overcome this resistance.
  2. DON’T BE SNEAKY! Do not default settings to “post to my wall” so that all of a client’s friends see that she just applied for a new credit card. Be frank and up front about any information that might be broadcast, and about how you are using the information that they have so graciously allowed you to access by virtue of logging in via Facebook. If you want to give people the option of posting something, make sure they are forced to make the choice. And make it transparent and easy to change settings in the future. This will help you gain trust and increase the uptake of these low-cost, highly viral channels.

Sandy: I completely agree – transparency is the key here for organizations starting on a social media path. Anything less than complete transparency about what you’re doing with the consumer’s information – including their actions on your site – will be exposed in the full glare of public scrutiny on the web when people discover it. Accept, however, that there is a wide range of social behavior for customers: some want to be seen to be associated with your product or brand, and will “Like” your Facebook page or check-in on Foursquare at your location, whereas others will not want that information to be publicized in any way.

Do you think there is trust built up yet for customers to interact with companies via social?

Emily: See my response above. I think that in many cases, organizations have started out on the wrong foot, taking advantage of how easily available the information is to really milk it for all it’s worth. The fact that many of the social networking sites had low-granularity privacy settings initially made it so that this wasn’t entirely the fault of the different organizations, either. Because of this, and in light of continually improving granularity and control over privacy settings, I think now is the time to try to re-establish trust, and establish what it means to be a good “social” corporate citizen.

Sandy: Social media is becoming a powerful channel for customer interaction, particularly in situations where the company is monitoring Twitter and Facebook updates to track any problems that customers are experiencing. From my own personal experience (and in part because I have a large Twitter following and use my real name on Twitter), I have had near-immediate responses to problems with hotels, car rentals and train travel that I tweeted about. In some cases, the social media wasn’t necessarily well-integrated with the rest of their customer service channel, but when it is well-integrated, it’s a very satisfying customer experience for someone like me with a strong social media focus. There are initiatives to create the type of trusted online behavior that we would all like to see, such as the Respect Trust Framework; early days for these, but we’ll see more organizations adopt this as customers insist on their online rights.

I’ve also included my slides below, although not Emily’s deck. I’ll update this post with the link to the webinar replay when it is available.

Making Social BPM Mean Business

When I owned a boutique consulting firm in the 1990s, our catchphrase was “Making Technology Mean Business”, and when we were coming up with a title for the webinar that I’m doing with Pegasystems next week, an updated version of that phrase just seemed to fit. We’ll be discussing the social aspects of business processes, particularly in the context of case management. I’ll be expanding on a discussion point from my Changing Nature of Work keynote at BPM 2011 to discuss the social dimension and how that correlates with structure (i.e., a priori modeling), triggered in part by some of the discussion that arose from that presentation. As with the spectrum of structure, I believe that there’s a spectrum of socialness in business processes: some processes are just inherently more social than others (or can benefit from social features).

Interested? The webinar is on Tuesday at 11am Eastern, and you can register here.

Emerging Trends in BPM – Five Years Later

I just found a short article that I wrote for Savvion (now part of Progress Software) dated November 21, 2006, and decided to post it with some updated commentary on the 5th anniversary of the original paper. Enjoy!

Emerging trends in BPM
What happened in 2006, and what’s ahead in 2007

The BPM market continues to evolve, and although 2006 has seen some major events, there will be even more in 2007. This column takes a high-level view of four areas of ongoing significant change in BPM: the interrelationship between SOA and BPM; BPM standards; the spread of process modeling tools; and the impact of Web 2.0 on BPM.

SOA and BPM, together at last. A year ago, many CIOs couldn’t even spell SOA, much less understand what it could do for them. Now, Service-Oriented Architecture and BPM are seen as two ends of the spectrum of integration technologies that many organizations are using as an essential backbone for business agility.

SOA is the architectural philosophy of exposing functionality from a variety of systems as reusable services with standardized interfaces; these, in turn, can be orchestrated into higher-level services, or consumed by other services and applications. BPM systems consume the services from the SOA environment and add in any required human interaction to create a complete business process.
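
That division of labour — automated services orchestrated by a process, with human steps inserted only where needed — can be sketched in a few lines. The service and task names here are invented for illustration; real BPM systems would call SOA services over standardized interfaces and route human steps as work items:

```python
def credit_check(application):
    """Stand-in for an automated SOA service call."""
    return application["amount"] < 10000

def human_review(application):
    """Stand-in for a work item routed to a person's queue."""
    print(f"Routing {application['id']} to an underwriter for review")
    return True  # the reviewer's decision

def loan_process(application):
    """A BPM-style process: orchestrate automated services, adding
    a human step only when the services can't decide on their own."""
    if credit_check(application):
        return "approved"
    return "approved" if human_review(application) else "declined"

print(loan_process({"id": "A-1", "amount": 5000}))    # automated path
print(loan_process({"id": "A-2", "amount": 50000}))   # human-in-the-loop path
```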

As with every year for the last several years, 2006 has seen ongoing industry consolidation, particularly with vendors seeking to bring SOA and BPM together in their product portfolios. This trend will continue as SOA and BPM become fully recognized as being two essential parts of any organization’s process improvement strategy.

There has certainly been consolidation in the BPM vendor portfolios, especially the integration vendors adding better human-centric capabilities through acquisitions: Oracle acquired BEA in 2008, IBM acquired Lombardi in 2009, Progress acquired Savvion in 2010, and TIBCO acquired Nimbus in 2011. Although BPM is being used in some cases to orchestrate and integrate systems using services, this is still quite a green field for many organizations who have implemented BPM but are still catching up on exposing services from their legacy applications, and orchestrating those with BPM.

BPM standards. 2006 was the year that the Business Process Modeling Notation (BPMN), a notational standard for the graphical representation of process models, went mainstream. Version 2 of the standard was released, and every major BPM vendor is providing some way for their users to make use of the BPMN standard, whether it’s through a third-party modeling tool or directly in their own process modelers.

But BPMN isn’t the only standard that gained importance this year. 2006 also saw the widespread adoption of XPDL (XML Process Definition Language) by BPM vendors as an interchange format: once a process is modeled in BPMN, it’s saved in the XPDL file format to move from one system to another. A possible competitor to XPDL, the Business Process Definition Metamodel (BPDM) had its first draft release this year, but we won’t know the impact of this until later in 2007. On the SOA side, the Business Process Execution Language (BPEL), a service orchestration language, is now widely accepted as an interchange format, if not a full execution standard.

The adoption of BPM standards is critical as we consider how to integrate multiple tools and multiple processes to run our businesses. There’s no doubt that BPMN will remain the predominant standard for the graphical representation of process models, but 2007 could hold an interesting battle between XPDL, BPDM and BPEL as serialization formats.

The “Version 2” that I referred to was actually the second released version of the BPMN standard, but the actual version number was 1.1. That battle for serialization formats still goes on: most vendors support XPDL (and will continue to do so) but are also starting to support the (finally released) BPMN file format as well. BPDM disappeared somewhere in the early days of BPMN 2.0. BPEL is used as a serialization and interchange format primarily between systems that use BPEL as their core execution language, which are a minority in the broader BPMS space.
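
For a sense of what these serialization formats look like in practice, here is a hand-written, much-abbreviated BPMN 2.0 XML fragment being read back programmatically; the process content is invented, but the namespace is the one defined by the BPMN 2.0 specification:

```python
import xml.etree.ElementTree as ET

# The BPMN 2.0 model namespace, per the OMG specification.
BPMN = "http://www.omg.org/spec/BPMN/20100524/MODEL"

# A minimal serialized process: start -> user task -> end.
doc = f"""
<definitions xmlns="{BPMN}">
  <process id="claims">
    <startEvent id="start"/>
    <userTask id="review" name="Review claim"/>
    <endEvent id="end"/>
    <sequenceFlow id="f1" sourceRef="start" targetRef="review"/>
    <sequenceFlow id="f2" sourceRef="review" targetRef="end"/>
  </process>
</definitions>
"""

root = ET.fromstring(doc)
for task in root.iter(f"{{{BPMN}}}userTask"):
    print(task.get("id"), task.get("name"))
```

This is the round-trip that the interchange formats are meant to enable: one tool writes the XML, another reads the same elements back without loss.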

Modeling for the masses. In March of 2006, Savvion released the latest version of their free, downloadable process modeler: an application that anyone, not just Savvion customers, could download, install and run on their desktop without requiring access to a server. This concept, pioneered by Savvion in 2004, lowers the barrier significantly for process modeling and allows anyone to get started creating process models and finding improvements to their processes.

Unlike generic diagramming tools such as Microsoft Visio, a purpose-built process modeler can enforce process standards, such as BPMN, and can partially validate the process models before they are even imported into a process server for implementation. It can also provide functionality such as process simulation, which is essential to determining improvements to the process.
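
The partial validation mentioned here can be as simple as structural checks on the model graph — exactly the errors a generic drawing tool would happily let through. A sketch, using an invented in-memory model structure:

```python
def validate(model):
    """Structural checks a BPMN-aware modeler might run before export:
    presence of start/end events, and no disconnected nodes."""
    errors = []
    kinds = {n["id"]: n["kind"] for n in model["nodes"]}
    if "startEvent" not in kinds.values():
        errors.append("process has no start event")
    if "endEvent" not in kinds.values():
        errors.append("process has no end event")
    connected = {end for flow in model["flows"] for end in flow}
    for node_id, kind in kinds.items():
        if node_id not in connected:
            errors.append(f"{kind} '{node_id}' is not connected to any flow")
    return errors

# A deliberately broken model: no end event, one orphaned task.
model = {
    "nodes": [
        {"id": "start", "kind": "startEvent"},
        {"id": "review", "kind": "userTask"},
        {"id": "orphan", "kind": "serviceTask"},
    ],
    "flows": [("start", "review")],
}
print(validate(model))
```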

2006 saw other BPM vendors start to copy this initiative, and we can expect more in the months to come.

Free or low-cost process modelers have proliferated: there are web-based tools, downloadable applications and Visio BPMN add-ons that have made process modeling accessible – at least financially – to the masses. The problem continues to be that many people using the process modeling tools lack the analysis skills to do significant process optimization (or even, in some cases, representation of an event-driven process): the hype about having all of your business users modeling your business processes has certainly exceeded the reality.

Web 2.0 hits BPM. Web 2.0, a set of technologies and concepts embodied within the next generation of internet software, is beginning to impact enterprise software, too.

Web 2.0 is causing changes in BPM by pushing the requirement for zero-footprint, platform-independent, rich user interfaces, typically built using AJAX (Asynchronous JavaScript and XML). Although browser-based interfaces for executing processes have been around for many years in BPM, the past year has seen many of these converted to AJAX for a lightweight interface with both functionality and speed.

There are two more Web 2.0 characteristics that I think we’re going to start seeing in BPM in 2007: tagging and process syndication. Tagging would allow anyone to add freeform keywords to a process instance (for example, one that required special handling) to make it easier to find that instance in the future by searching on the keywords. Process event syndication would allow internal and external process participants to “subscribe” to a process, and feed that process’ events into a standard feed reader in order to monitor the process, thereby improving visibility into the process through the use of existing feed technologies such as RSS (Really Simple Syndication).
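
Process event syndication needs nothing more exotic than rendering a process instance’s event history as a standard feed. A minimal sketch — the instance ID and events are invented, and a real implementation would add links, GUIDs and the other required channel elements from the RSS 2.0 specification:

```python
import xml.etree.ElementTree as ET

def process_feed(instance_id, events):
    """Render a process instance's events as a minimal RSS 2.0 feed,
    so that any standard feed reader can subscribe to the process."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = f"Process {instance_id} events"
    for when, what in events:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = what
        ET.SubElement(item, "pubDate").text = when
    return ET.tostring(rss, encoding="unicode")

# Invented events for one hypothetical process instance.
feed = process_feed("PO-1234", [
    ("Tue, 22 Nov 2011 09:00:00 GMT", "Order received"),
    ("Tue, 22 Nov 2011 10:30:00 GMT", "Credit check completed"),
])
print(feed)
```

Tagging is even simpler at the storage level — a set of freeform keywords per instance ID, indexed for search — which is partly why it’s surprising it took so long to appear in BPM products.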

Bringing Web 2.0 to BPM will require a few changes to corporate culture, especially those parts that require different – and more creative – types of end-user participation. As more people at all levels in the organization participate in all facets of process improvement, however, the value of this democratization of business processes will become clear.

I’ve been writing and presenting about the impact of social software on BPM for over five years now; adoption has been slower than I predicted, although process syndication (subscribing to a process’ events) has finally become mainstream. Tagging of processes is just starting to emerge; I’ve seen it in BonitaSoft but few other places.

I rarely do year-end prediction posts, but it was fun to look back at one that I did five years ago to see how well I did.