Kofax Capture Technical Session

It’s been a long time since I looked at much of the Kofax technology, so I took the opportunity of their Transform conference to attend a two-hour advanced technical session with Bret Hassler, previously the Capture product manager but now responsible for BPM, and Bruce Orcutt from product marketing. They started by asking the attendees about areas of interest so that they could tailor the session, and thereby rescue us from the PowerPoint deck that would be the default. This session contained a lot more technical detail than I will ever use (such as the actual code used to perform some of the functions), but that part went by fairly quickly and overall it was a useful session for me. I captured some of the capabilities and highlights following.

Document routing allows large scan batches to be broken up into sub-batches that can be tracked and routed independently, with documents and pages moved between the child batches. This makes sense both for splitting work into manageable sizes for human processing, and for reducing the amount of presorting of documents required prior to scanning. For my customers who are considering scanning at the point of origination, this can make a lot of sense where, for example, a batch loaded on an MFD in a regional office may contain multiple types of transactions that go to different types of users in the back office. Child batch classes can be changed independently of the main batch, so that the properties and rules applied are based on the child batch class rather than the original class. A reference batch ID, which can be exported to an ECM repository as metadata on the resulting documents, can be used to recreate the original batch and identify the child batch that a document belonged to during capture. Batch splitting, together with the ability to change routing and permissions on the child batch, makes particular sense for processing that is done in the Capture workflow, so that the child batches follow a specific processing path and are available to specific roles. This will also fit well when they start to integrate TotalAgility (the Singularity product that they acquired last year) for full process management, as described in this morning’s keynote. Integrating TotalAgility for capture workflow will also, as Hassler pointed out, bring in a graphical process modeler; currently, this is all done in code.
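To make the splitting idea concrete, here is a minimal sketch of the routing behavior described above: a mixed batch is split into child batches by document type (a stand-in for the child batch class), each capped at a manageable size and tagged with the parent's reference batch ID so the original batch can be reconstructed from exported metadata. This is purely illustrative Python, not the Kofax Capture API; all names and fields are my own assumptions.

```python
from collections import defaultdict

def split_batch(parent_batch_id, documents, max_child_size=50):
    """Split a scanned batch into child batches by document type.

    Each child batch carries the parent's reference batch ID so the
    original batch can be recreated from exported metadata, and a
    batch class that would drive routing and permissions downstream.
    (Hypothetical sketch, not the actual Kofax mechanism.)
    """
    by_type = defaultdict(list)
    for doc in documents:
        by_type[doc["type"]].append(doc)

    child_batches = []
    for doc_type, docs in by_type.items():
        # Cap each child batch at a size manageable for human processing.
        for i in range(0, len(docs), max_child_size):
            child_batches.append({
                "reference_batch_id": parent_batch_id,
                "batch_class": doc_type,  # drives routing and permissions
                "documents": docs[i:i + max_child_size],
            })
    return child_batches

# Example: a mixed batch scanned on a regional-office MFD.
docs = [{"id": 1, "type": "loan_application"},
        {"id": 2, "type": "address_change"},
        {"id": 3, "type": "loan_application"}]
children = split_batch("BATCH-0042", docs)
```

The point of the reference batch ID on every child is that once documents land in the ECM repository, the capture-time batch structure is still recoverable even though the children were routed separately.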

Disaster recovery allows remote capture sites connected to a centralized server to fail over to a DR site with no administrative intervention. In addition to supporting ongoing operations, batches in flight are replicated between the central sites (using, in part, third-party replication software) and held at remote capture locations until replication is confirmed, so that batches can be resumed on the DR server. The primary site manages batch classes and designates and manages the alternate sites. There’s some manual cleanup to do after a failure, but that’s to be expected.
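The hold-until-replicated handshake described above can be sketched as a simple state check: the remote capture site keeps its copy of an in-flight batch until every required site confirms replication, which is what makes resumption on the DR server possible. This is a conceptual Python sketch under my own assumptions, not the actual replication mechanism (which Kofax delegates in part to third-party software).

```python
class HeldBatch:
    """Hold-until-replicated handshake: a batch stays at the remote
    capture site until every required site confirms it has a copy.
    (Illustrative sketch, not the Kofax implementation.)"""

    def __init__(self, batch_id):
        self.batch_id = batch_id
        self.replicated_to = set()

    def confirm_replication(self, site):
        # Called when the replication layer acknowledges a copy at `site`.
        self.replicated_to.add(site)

    def safe_to_release(self, required_sites):
        # The remote site may only discard its local copy once all
        # required sites hold the batch, so that processing can resume
        # on the DR server after a failover.
        return required_sites <= self.replicated_to

# A batch scanned remotely, awaiting confirmation from primary and DR.
batch = HeldBatch("BATCH-0042")
batch.confirm_replication("primary")
released_early = batch.safe_to_release({"primary", "dr"})  # still held
batch.confirm_replication("dr")
released = batch.safe_to_release({"primary", "dr"})        # safe now
```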

Kofax has just released a Pega connector; like other custom connectors, they ship it with source code so that you can make changes to it (that, of course, is not necessarily a good idea since it might compromise your upgradability). The Kofax Export Connector for PRPC does not send the images to Pega, since Pega is not a content repository; instead, it exports the document to an IBM FileNet, EMC Documentum or SharePoint repository, gets the CMIS ID back again, then creates a Pega work object that has that document ID as an attachment. Within Pega, a user can then open the document directly from that link attachment. You have to configure Pega to create a web service method that allows a work object to be created for a specific work class (which will be invoked from Kofax), and create the attribute that will hold the CMIS document ID (which will be specified in the invocation method parameters). There are some technicalities around the data transformation and mapping, but it looks fairly straightforward. The advantage of doing this rather than pushing documents into Pega directly as embedded attachments is that the chain of custody of documents is preserved and the documents are immediately available to other users of the ECM system.
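The connector flow above (export to the ECM repository first, get the CMIS ID back, then create a Pega work object that links to the document) can be sketched as follows. Both functions are hypothetical stand-ins of my own invention, not the connector's actual API: the real export would be a CMIS call against FileNet, Documentum or SharePoint, and the work-object creation would go through the web service method you configure in Pega.

```python
def export_document(repository, image_path, metadata):
    """Hypothetical stand-in for the connector's CMIS export: commit
    the document to the ECM repository and return its CMIS document
    ID. (The real connector would call CMIS createDocument.)"""
    return f"cmis:{repository}:{image_path}"

def create_pega_work_object(work_class, cmis_id, properties):
    """Hypothetical stand-in for the web service method configured in
    Pega: create a work object for the given work class, holding the
    CMIS ID as a link attachment rather than an embedded image."""
    return {"work_class": work_class,
            "attachment_link": cmis_id,  # the document stays in the ECM
            **properties}

# Connector flow: export to the repository first, then create the Pega
# work object that links back to the committed document.
cmis_id = export_document("documentum", "claims/0042.tif",
                          {"type": "claim"})
work = create_pega_work_object("Insurance-Claim", cmis_id,
                               {"claimant": "J. Smith"})
```

The design point is visible in the sketch: the work object carries only a link, so the document's chain of custody stays with the ECM system and other repository users see it immediately.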

Good updates, although I admit to doing some extracurricular reading during the parts with too much detail.

4 thoughts on “Kofax Capture Technical Session”

  1. Hi Sandy,
    This is really informative.
    In the last few lines, you said that Kofax would first put a document into, say, Documentum, get the CMIS ID, and create a Pega work object with that ID.
    Is Kofax creating the Pega work object here?
    Or, if I’m on the Pega side, could I create a service provider that Kofax connects to, which receives a Kofax request with the document details and uses them to create a work object?

    It would be very helpful if you could clarify this.

    Thanks in advance!


  2. Hi Santhosh,

    It was my understanding that Kofax’s Pega connector is creating the Pega work item and passing it the CMIS ID that it receives after committing the document to Documentum or FileNet. I’m not sure how it would work if you wanted to call Kofax as a service from Pega instead — that doesn’t seem to fit with the flow, since Kofax is controlling the content creation and knows when there are documents that require a corresponding work item to be created, so it seems to make more sense to call Pega from Kofax. Even if you have a Pega work item waiting for an inbound document, that should be triggered from Kofax (or the content repository) by the arrival of the document, since Pega wouldn’t know when the document had arrived.

    Obviously, you should talk to Kofax for more details on how this works, since I only had a brief technical briefing and don’t have any hands-on experience with it.

  3. Hi Santhosh,

    What was the reply from the Kofax team?
    How is the Pega work item created? Is it created by Kofax’s Pega connector, or can we create it in Pega?
