Closing the loop with analytics: TIBCONOW 2016 day 2 keynote

Yesterday at TIBCO NOW 2016, we heard about the first half of TIBCO’s theme — interconnect everything — and today, Matt Quinn introduced the second half — augment intelligence — before turning the stage over to Mark Palmer, SVP engineering for streaming analytics.


Palmer talked about the role of analytics over history, and how today’s smart visual analytics allow you to be first to insight, then first to action. We then had a quick switch to Brad Hopper, VP strategy for analytics, for a demo of Spotfire visual analytics, delivered while wearing a long blond wig (attempting to make a point about the importance of beauty, I think). He built an analytics dashboard while he talked, showing how easy it is to create visual analytics and trigger smart actions. He went on to talk about data preparation and cleansing, which can often take as much as 50% of an analyst’s time, and demonstrated importing a CSV file and using quick visualizations to expose and correct potential problems in the underlying data. As always, the Spotfire demos are very impressive; I don’t follow Spotfire closely enough to know what’s new, but it all looks pretty slick.
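
Spotfire does all of this interactively, but as a rough feel for the kind of data-quality checks that the cleansing step performs, here is a minimal sketch in plain pandas; the file name, the checks and the thresholds are my own illustration, not anything from the demo:

```python
# Hypothetical quick profiling of an imported CSV to surface problems in the
# underlying data before analysis: missing values, outliers, duplicate rows.
import pandas as pd

df = pd.read_csv("sales_extract.csv")   # hypothetical file

# Columns with missing values, and how many
missing = df.isna().sum()
print(missing[missing > 0])

# Numeric columns with extreme outliers (more than 3 standard deviations from the mean)
numeric = df.select_dtypes("number")
outliers = ((numeric - numeric.mean()).abs() > 3 * numeric.std()).sum()
print(outliers[outliers > 0])

# Candidate duplicate rows, another common data-quality problem
print(df[df.duplicated()])
```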

Michael O’Connell, TIBCO’s chief analytics officer, came up to demonstrate a set of analytics applications for a fictitious coffee company: sales figures and drilldowns, with what-if predictions for planning promotions; and supply chain management and smart routing of product deliveries.

Palmer came back to talk about TIBCO Jaspersoft, the other side of their analytics portfolio that provides business intelligence capabilities built in to applications, but it was a pretty quick mention with no demo. A Jaspersoft demo would look pretty mundane after seeing all of the sexy Spotfire features, but it undoubtedly is a workhorse for analytics with many customers. He moved on to ways that TIBCO is helping customers to roll analytics out, from accelerators and sample source code to engagement in the community.



He continued on with streaming analytics (Palmer was the CEO of StreamBase before it was acquired by TIBCO), and O’Connell came back to show an oil industry application that leverages sensor analytics to maximize equipment productivity by initiating preventative maintenance when the events emitted by the device indicate that failure may be imminent. He showed a more comprehensive interface that would be used in the head office for real-time monitoring and analysis, and a simpler tablet interface for field service personnel to receive information about wells requiring service. Palmer finished the analytics segment with a brief look at LiveView Web, a zero-code environment for building operational intelligence dashboards.
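
The demo runs this through TIBCO’s streaming analytics stack, but the underlying pattern is simple enough to sketch outside any product: watch a window of sensor readings and open a maintenance work order when the trend suggests imminent failure. The threshold, field names and dispatch function below are all hypothetical.

```python
# Toy preventive-maintenance check over a stream of well-head sensor events.
from collections import deque
from statistics import mean

VIBRATION_LIMIT = 7.0   # hypothetical threshold derived from historical failures
WINDOW = 20             # number of recent readings to average (single well, for simplicity)

recent = deque(maxlen=WINDOW)

def dispatch_maintenance(well_id, score):
    # Placeholder for creating a field-service work order
    print(f"Preventive maintenance requested for well {well_id} (score {score:.2f})")

def on_sensor_event(event):
    """Handle one event such as {'well': 'W-103', 'vibration': 7.4}."""
    recent.append(event["vibration"])
    if len(recent) == WINDOW and mean(recent) > VIBRATION_LIMIT:
        dispatch_maintenance(event["well"], mean(recent))
        recent.clear()
```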


Quinn returned to talk about their B-tree-based Graph Database, which is in preview mode now with an open API, and other areas where they are looking to provide innovative solutions. He went through a history of how they’ve grown as a technology organization, and got quite verklempt when thanking his team for how awesome they’ve continued to be over the past 18 months since the acquisition, which was really touching.

After the break, Adam Steltzner, NASA’s lead engineer on the Mars Rover and author of The Right Kind of Crazy: A True Story of Teamwork, Leadership, and High-Stakes Innovation, talked about innovation, collaboration and decision-making under pressure. Check out the replay of the keynote for his talk, a fascinating story of the team that built and landed the Mars landing vehicles, along with some practical tips for leaders to foster exploration and innovation in teams.

Murray Rode returned to close out the keynote by announcing the winners of their Trailblazer customer awards:

  • Norfolk Southern (Pioneer) for implementing a real-time view of their railway operations
  • CargoSmart (Innovator) for incorporating real-time optimization of shipping logistics into their cargo management software
  • First Citizens Bank (Impact) for simplifying IT structure to allow for quick creation and delivery of new branch services
  • University of Chicago Medicine (Visionary) for optimizing operating room turnover to save costs and improve service
  • TUI Group (Transformer) for transforming their platforms through integration to enable new customer-facing tourism applications

That’s it for the morning keynote; I’m off to catch some of the breakout sessions before we come back for the customer panel and closing keynote at the end of the day.

The Enterprise Digital Genome with Quantiply at BPMCM15

“An operating system for a self-aware quantifiable predictive enterprise” definitely gets the prize for the most intriguing presentation subtitle, for an afternoon session that I went to with Surendra Reddy and David Chaney from Quantiply (a stealth startup that has just publicly launched), and their customer, a discount brokerage service whose name I have been requested to remove from this post.

Said customer has some significant event data challenges, with a million customers and 100,000 customer interactions per day across a variety of channels, and five billion log messages generated every day across all of their product systems and platforms. Having this data exist in silos with no good aggregation tools means fragmented and poor customer support, and also significant challenges in system and internal support.

To address these types of heterogeneous data analysis problems, Quantiply has a two-layer tool: Edge Cloud for the actual data analysis, which can then be exposed to different roles based on access control (business users, operational users, data scientists, etc.); and Pulse for connecting to various data sources including data warehouses, transactional databases, BPM systems and more. It appears that they’re using some sort of dimensional fact model, which is fairly standard for data warehouse analytics, but their Pulse connectors allow them to pour in data on a near-real-time basis, then make the connections between capabilities and services to enable fast problem resolution on their critical trading platforms. Because of the nature of the graph connectivity that they’re deriving from the data sources, they’re able to not only resolve the problem by drilling down, but also determine which customers were impacted by the problem in order to follow up. In response to a question, the customer said that they had used Splunk and other log analytics tools, but that this was “not Splunk”, in terms of both the real-time nature and the front-end user experience, plus deeper analytical capabilities such as long-term interaction trending. In some cases, the Quantiply representation is sufficient analysis; in other cases, it’s a starting point for a data scientist to dig in and figure out some of the more complex correlations in the data.
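
To make the graph-connectivity idea concrete, here is a small sketch (mine, not Quantiply’s) using networkx: edges between platforms, services and customers are derived from log and transaction data, and walking the graph from a failing component yields the customers affected. All node names are invented.

```python
# Impact analysis over a dependency graph inferred from log data.
import networkx as nx

g = nx.DiGraph()
# component -> dependent service -> customer, as inferred from logs
g.add_edge("order-db", "trading-api")
g.add_edge("trading-api", "web-channel")
g.add_edge("web-channel", "customer:1001")
g.add_edge("web-channel", "customer:1002")
g.add_edge("trading-api", "mobile-channel")
g.add_edge("mobile-channel", "customer:1003")

failing = "order-db"
impacted = [n for n in nx.descendants(g, failing) if n.startswith("customer:")]
print(impacted)   # the three customers, in arbitrary order
```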

There was a lot of detail in the presentation about the capabilities of the platform and what the customer is doing with it, and the benefits that they’re seeing; there’s not a lot of information on the Quantiply website since they’re just publicly launching.

Update: The original version of this post included the name of the customer and their representative. Since this was a presentation at a public conference with no NDA or confidentiality agreements in place, not even a verbal request at any time during the session, I live-blogged as usual. A day later, the vendor, under pressure from the customer’s PR group, admitted that they did not have clearance to have this customer speak publicly, which is a pretty rookie mistake on their part, although it lines up with my general opinion on their social media skills. As a favor to the conference organizers, who put a lot of effort into making a great experience for all of us, I’ve decided to remove the customer’s name from this post. I’m sure that those of you who really want to know it won’t have any trouble finding it, because of this thing called “the internet”.

The Personology of @RBSGroup at PegaWorld 2015

Andrew McMullan, director of analytics and decisioning (aka “personologist”) at Royal Bank of Scotland, gave a presentation on how they are building a central (Pega-based) decisioning capability to improve customer engagement and change their culture along the way. He started with a personal anecdote about how RBS did the right thing for a family member and gained a customer for life – a theme echoed from this morning’s keynote that also included RBS. He showed a short video of their current vision, which stated goals of making RBS easier to do business with, and to work for, in addition to being more efficient. In that order, in case you other banks are following along.

RBS is now government owned, having been bailed out during the financial crisis; I’m not sure how much this has allowed them to focus on customer engagement rather than short-term profits, but they do seem to be talking the right talk.

RBS uses Pega’s Chordiant – primarily the decision management components, if I am reading it correctly – although they are implementing Pega 7 for an August 2015 rollout to bring in more robust Next Best Action capabilities; they also use SAS Visual Analytics for reporting. This highlights the huge role of decisioning as well as process in customer engagement, especially when you’re applying analytics to a broad variety of customer information in order to determine how to interact with the customer (online or IRL) at any particular moment. RBS is proactive about having their customers do things that will save them money, such as renewing a mortgage at a lower rate, or choosing a package of banking services that doesn’t overlap with other services that they are paying for elsewhere. Contrary to what nay-sayers within RBS said about lost revenue, this tends to make customers more loyal and ultimately do more business with them.

There was a good question from the audience about how much of this was changes to organizational culture, and how much was the data science: McMullan said that it’s really critical to win the hearts and minds of the employees, although obviously you need to have at least the beginnings of the analytics and recommendations to get that started. Also, they use Net Promoter Score as their main internal metric, which tends to reward relationship-building over short-term profits; having the right incentives for employees goes a long way towards helping them to do the right thing.

PegaWorld 2015 Day 2 Customer Keynotes: Big Data and Analytics at AIG and RBS

After the futurist view of Brian Solis, we heard more down-to-earth views from two Pega customers, starting with Bob Noddin from AIG Japan on how to turn information that they have about customers into an opportunity to do something expected and good. Insurance companies have the potential to help their customers reduce risk, and therefore insurance claims: they have a lot of information about general trends in risk reduction (e.g., tell an older customer that if they have a dog and walk it regularly, they will stay healthier and live longer) as well as customer-specific actions (e.g., suggest a different route for someone to drive to work in order to reduce the likelihood of an accident, based on where they live and work, and the accident rates for the roads in between). This is not a zero-sum game: fewer claims are good for both AIG and its customers. Noddin was obviously paying close attention to Solis, since he wove elements of that into his presentation in how they are engaging customers in the way that the customer chooses, and have reworked their customer experience – and their employee and agent experience – with that in mind.

Between the two customers, we heard from Rob Walker, VP of Decision Management and Analytics at Pega, about the always-on customer brain and strategies for engaging with them:

  • Know your customer: collect and analyze their data, then put it in the context of their entire customer journey
  • Reach your customer: break down the silos between different channels, and also between inbound and outbound communications, to form a single coherent conversation
  • Delight your customer: target their needs and wants based on what you know about them, using the channels through which you know that they can be reached.

He discussed how to use Pega solutions to achieve this through data, analytics and decisioning; obviously, the principles are universal.

The second customer on stage was Christian Nelissen from Royal Bank of Scotland, who I also saw yesterday (but didn’t blog about) on the big data panel. RBS has a good culture of knowing their customer from their roots as a smaller, more localized bank: instead of the branch manager knowing every customer personally, however, they now rely on data about customers to create 1:1 personalized experiences based on predictive and adaptive analytics in the ever-changing context of the customer. He talked about the three pillars of their approach:

  • It’s about the conversation. If you focus on doing the right thing for the customer, not always explicitly selling to them, you build the relationship for the long term.
  • One customer, one bank. A customer may have products in different bank divisions, such as retail banking, credit cards and small business banking, and you need to be cognizant of their complete relationship with the bank and avoid internal turf wars.
  • You can do a lot with a little. Data collection and analytics technologies have become increasingly cheaper, allowing you to start small and learn a lot before expanding your customer analytics program.

Alan Trefler closed out the keynote before sending us off to the rest of the day of breakout sessions. Next year, PegaWorld is in Las Vegas; not my favorite place, but I’ll be back for the quality of the presentations and interactions here.

These two keynotes this morning have been great to listen to, and also closely aligned with the future of work workshop that I’m doing at IRM BPM in London next week, as well as the session on changing incentives for knowledge workers. Always good when the planets align.

SapphireNow 2015 Day 2 Keynote with Bernd Leukert

The second day of SAP’s SAPPHIRENOW conference started with Bernd Leukert discussing how some customers’ employees worry about being disintermediated by the digital enterprise, but also how the digital economy can be used to accentuate the promise of your original business to make your customers happier without spending the same amount of time (and hopefully, money) on enterprise applications. It’s not just about changing technologies but about changing business models and leveraging business networks to address the changing world of business. All true, but I still see a lot of resistance to the digital enterprise in large organizations, with both mid-level management and front-line workers feeling threatened by new technologies and business models until they can see how it can be of benefit to them.

Although Leukert is on the stage, the real star of the show is S/4HANA: the new generation of their Business Suite ERP solutions based natively on the in-memory HANA data and transaction engine for faster processing, a simplified data model for easier analytics and faster reconciliation, and a new user interface with their Fiori user experience platform. With the real-time analytical capabilities of HANA, including non-SAP as well as S/4HANA data from finances and logistics, they are moving from being just a system of record to a full decision support system. We saw a demo of a manufacturing scenario, walking through a large order process where a combination of financial and logistics data is presented in real time for making recommendations on how to deal with a shortage in fulfilling an order. Potential solutions — in this case, moving stock allocated from one customer to another higher priority customer — are presented with a predicted financial score, allowing the user to select one of the options. Nice demo of analytics and financial predictions directly integrated with order processing.
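
The demo itself runs on S/4HANA, but the decision it presents boils down to scoring each way of covering the shortage and letting the user pick; the sketch below shows that shape with invented options and weights, and is not SAP’s scoring model.

```python
# Rank hypothetical fulfillment options by a simple predicted financial score.
options = [
    {"action": "reallocate stock from customer A", "margin": 120_000, "penalty_risk": 40_000},
    {"action": "expedite from plant 2", "margin": 120_000, "penalty_risk": 15_000, "extra_cost": 22_000},
    {"action": "partial shipment now", "margin": 90_000, "penalty_risk": 5_000},
]

def financial_score(opt):
    # margin minus expected contractual penalties and any extra logistics cost
    return opt["margin"] - opt.get("penalty_risk", 0) - opt.get("extra_cost", 0)

for opt in sorted(options, key=financial_score, reverse=True):
    print(f"{opt['action']:35s} score {financial_score(opt):>10,}")
```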

(Screenshots: order processing dashboard, order processing recommendations, and order process simulation results)

The new offering is modular, with additional plug-ins for their other products such as Concur and SuccessFactors to enhance the suite capabilities. It runs in the cloud and on-premise. Lots of reasons to transition, but having this type of new functionality requires significant work to adopt the new programming model: both on SAP’s side in building the new platform, and also on the customers’ side for refactoring their applications to take advantage of the new features. Likely this will take several months, if not years, for widespread adoption by customers that have highly customized solutions (isn’t that all of them?), in spite of the obvious advantages. As we have seen with other vendors who completely re-architect their product, new customers are generally very happy with starting on the new platform, but existing customers can take years even when there is a certified migration path. However, since they launched in February, 400 customers have committed to S/4HANA, and they are now supporting all 25 industries that they serve.

As we saw last year, SAP is pushing to have existing customers first migrate to HANA as the underlying database in their existing systems (typically displacing Oracle), which is a non-trivial but straightforward operation that is likely to improve performance; then, reconsider whether the customizations that they have in their current system are handled out of the box with S/4HANA or can be easily re-implemented based on the simpler data model and more functional capabilities. Sounds good, and I imagine that they will get a reasonable share of their existing customers to make the first step and migrate to HANA, but the second step starts to look more like a new implementation than a simple migration, which will scare off a lot of customers. Leukert invited a representative from their customer Asian Paints to the stage to talk about their migration: they have moved to HANA and the simplified finance core functionality, and are still working on implementing the simplified logistics and other modules with a vision to soon be completely on S/4HANA. A good success story, but indicative of the length of time and amount of work required to migrate. For them, definitely worth the trip since they have been able to re-imagine their business model to reach new markets through a better understanding of their customers and their own business data.

He moved on to talk about the HANA Cloud Platform (HCP), a general-purpose application development platform that can be used to build applications unrelated to SAP applications, or to build extensions to SAP functionality. He mentioned an E&Y application built on HCP for fraud detection that is directly integrated with core SAP solutions, which is just one of 1,000 or more third-party applications available on the HCP marketplace. HCP provides structured and unstructured data models, geospatial, predictive, Fiori UX platform as a service, mobile support, analytics portfolio, and integration layers that provide direct connection to your business both on the device side through IoT events and into the operational business systems. With the big IoT push that we saw in the panel yesterday, Siemens has selected HCP as their cloud platform for IoT: the Siemens Cloud for Industry. Peter Weckesser of Siemens joined Leukert on stage to talk more about this newly-launched platform, and how it can be added to their customer installations as a monitoring (not control) layer: remote devices, such as sensors on manufacturing equipment, push their event streams to the Siemens cloud (based on HCP) in public, hybrid or on-premise configurations; analytics can then be applied for predictive maintenance scheduling as well as aggregate operational optimization.

We saw a demo based on the CenterPoint IoT example at the panel yesterday, showing monitoring and maintenance of energy distribution networks: tracking the health of transformers, grid storage and other devices and identifying equipment failures, sometimes before they even happen. CenterPoint already has 100,000 sensors out in the field, and since this is integrated with S/4HANA, this is not just monitoring: an operator can trigger a work order directly from the predictive equipment maintenance analytics dashboard.

(Screenshots: energy grid analytics and drill-down)

Leukert touched on the HANA roadmap, with the addition of Hadoop and SPARK Cluster Manager to handle infinite volumes of data, then welcomed Walmart CIO Karenann Terrell to discuss what it is like to handle a really large HANA implementation. Walmart serves 250 million customers per week through 11,000 locations with 2.2 million employees, meaning that they generate a lot of data just in their daily operations: literally trillions of financial transactions. Because technology is so core to managing this well, she pointed out that Walmart is creating a technology company in the middle of the world’s largest retail company, which allows them to stay focused on the customer experience while reducing costs. Their supply chain is extensive, since they are directly plugged into many of their suppliers, and innovating along that supply chain has driven them to partner with SAP more closely than most other customers. HANA allows them to have 5,000 people hitting data stores of a half-billion records simultaneously with sub-second response time to provide a real-time view of their supply chain, making them a true data-driven retailer and shooting them to the top of yesterday’s HANA Innovation Awards. She finished by saying that seeing S/4HANA implemented at Walmart in her lifetime is on her bucket list, which got a good laugh from the audience but highlighted the fact that this is not a trivial transition for most companies.

Leukert finished with an invitation — or maybe it was a challenge — to use S/4HANA and HCP to reinvent your business: “clean your basement” to remove unnecessary customization in your current SAP solutions or convert it to the HCP or S/4HANA extension platforms; change your business model to become more data-driven; and leverage business networks to expand the edges of your value chain. Thrive, don’t just survive.

Steve Singh, CEO of Concur (acquired by SAP last December) then took over to look at reinventing the employee travel experience, from booking through trip logistics to expense reporting. For companies with large numbers of traveling employees, managing travel can be a serious headache from both a logistics and financial standpoint. Concur does this by creating a business network (or a network of networks) that directly integrates with suppliers — such as airlines and car rental companies — for booking and direct invoice capture, plus easy functions for inputting travel expenses that are not captured directly from the supplier. I heard comments yesterday that SAP already has travel and expense management, and although Concur’s functionality in that area is likely a bit better, the networks that they bring are the real prize here. The networks, for example, allow for managing the extraction of an employee who finds themselves in a disaster or other dangerous travel scenario, which becomes part of a broader human resources risk management strategy.

At the press Q&A later, Leukert fielded questions about how they have simplified the complete core of their ERP solution in terms of data model and functionality but still have work to do for some industry modules: although all 25 industries are supported as of now in the on-premise version, they need to do a bit of tinkering under the hood and do additional migration for the cloud version. They’re also still working on the cloud version of everything, and are recommending the HCM and CRM standalone products if the older Business Suite versions don’t meet requirements. In other words, it’s not done yet, although core portions are fully functional. Singh talked about the value of business networks such as Ariba in changing business models, and sees that products such as Concur using HCP and the SAP business networks will help drive broader adoption.

There was a question on the ROI for migration to S/4HANA: it’s supposed to run 1,800 times faster than previous versions, but customers may not be seeing much (if any) savings, opening things up to competitive displacement. I heard this same sentiment from some customers last night at the HANA Innovation Awards reception; since there is little or no cost reduction in terms of license and deployment costs, they need to make the case based on the additional capabilities that HANA enables, such as real-time analytics and predictions, that allow companies to run their businesses differently, and a longer-term reduction in IT complexity and maintenance costs. Since a lot of more traditional companies don’t yet see the need to change their business models, this can be a hard sell, but eventually most companies will need to come around to the need for real-time insights and actions.

Consolidated Inbox in SAP Fiori at SapphireNow 2015

I had a chance to talk with Benny Notheis at lunchtime today about the SAP Operational Intelligence product directions, and followed on to his session on a consolidated inbox that uses SAP’s Fiori user experience platform to provide access to SAP’s Business Suite workflow, BPM and Operational Process Intelligence work items, as well as work items from non-SAP workflow systems. SAP has offered a few different consolidated inboxes over the years — some prettier than others — but they all serve the same purpose: to make things easier for users by providing a single point of contact for all work items, and easier for IT by reducing maintenance and support. In the case of the Fiori My Inbox, it also provides a responsive interface across mobile and desktop devices. Just as the underlying database and transaction platform for SAP is converging on HANA, all user experience for applications and analytics is moving to Fiori. Fiori (and therefore the consolidated My Inbox) is not yet available on the cloud platform, but that’s in the works.

As a consolidated work list manager, My Inbox provides multi-device support including mobile, manages work items from multiple systems in a single list, and is fully integrated into the Fiori launchpad. It has some nice features such as mass approvals, full-text searching, sorting and filtering, and sharing tasks via email and SAP JAM; work items can have attachments, comments and custom attributes that are exposed in the work list UI or by launching the UI specific to the work item.

We saw a demo of My Inbox, with a user-configurable view that allows workers to create filtered lists within their inbox for specific task types or source systems in order to organize their work in the way that they want to view it. Work items can be viewed and managed in the work list view within Fiori, or the work item launched for full interaction using its native UI. Tasks can be forwarded to other users or suspended, and task type-specific actions such as approve and reject can be applied. Attachments can be added and viewed directly from the work list view, as well as direct links into other systems. The history for a work item is maintained directly in My Inbox for viewing by the user, although the underlying workflow systems are likely also maintaining their own separate history logs; this provides a more collaborative history by allowing users to add comments that become part of the My Inbox history. Emailing a task to a user sends a direct link to the task but does not interrogate or allocate access rights; I assume that this could mean that a task could be sent to someone who does not have rights to open or edit the task, and the original sender would not be informed. Within any list view, a multi-select function can be used to select multiple items for approval; these all have to be approval-type items rather than notifications, so this might be most useful in a list view that is filtered for a single task type. There is no view of tasks that a user delegated or completed — a sort of Sent Items box — so a user can’t monitor the progress of something that they forward to someone else. Substitutions for out-of-office times are set in My Inbox, meaning that the user does not need to visit each of the underlying systems of record to set up substitution rules; these rules can be applied based on task groups, which are established by how task profiles are set up during the initial technical configuration.

A good demonstration of the new generation of SAP user experience, and how Fiori can be used in a production transaction-oriented environment. There obviously needs to be a fair amount of cooperation between the Fiori-based My Inbox and the systems of record that contribute work items: My Inbox needs to be able to interrogate quite a bit of data from each work item, send actions, and manage user substitution rules via a common task consumption model that interacts with gateways to each type of underlying system. There is likely still quite a bit of work to do in those integration points to make this a fully-functional universal inbox, especially for systems of record that are more reluctant to yield their secrets to other systems; SAP has published specifications for building task gateways that could then be plugged into this model, which would expose work items from any system in My Inbox via a compatible gateway.
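
For a sense of what such a task gateway and common task consumption model might look like, here is a rough sketch; this is my own illustration, not SAP’s published specification, and every class and field name is invented.

```python
# One common task model, one adapter per system of record, merged into a single inbox.
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import List

@dataclass
class Task:
    task_id: str
    source_system: str
    title: str
    status: str
    assignee: str

class TaskGateway(ABC):
    """Adapter contract that each underlying workflow system implements."""

    @abstractmethod
    def list_tasks(self, user: str) -> List[Task]: ...

    @abstractmethod
    def complete(self, task_id: str, action: str) -> None: ...

class ConsolidatedInbox:
    def __init__(self, gateways: List[TaskGateway]):
        self.gateways = gateways

    def tasks_for(self, user: str) -> List[Task]:
        # Merge work items from every system of record into a single list
        return [t for gw in self.gateways for t in gw.list_tasks(user)]
```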


(Image from SDN link above)

The next good trick will be to have a consolidated history log, combining the logs from My Inbox with those in the systems of record to build a more complete history of a work item for reporting and decisioning.

SapphireNow 2015 Day 1 Keynote with Bill McDermott

Happy Cinco de Mayo! I’m back in Orlando for the giant SAP SAPPHIRE NOW and ASUG conference to catch up with the product people and hear about what organizations are doing with SAP solutions. If you’re not here, you can catch the keynotes and some of the other sessions online either in real time or on demand. The wifi is swamped as usual, my phone kicked from LTE down to 3G and on down to Edge before declaring No Service during the keynote, and since I’m blogging from my tablet/keyboard configuration, I didn’t have connectivity at the keynote (hardwired connections are provided for media/analysts, but my tablet doesn’t have a suitable port) so this will be posted sometime after the keynote and the press conference that follows.

We kicked off the 2015 conference with CEO Bill McDermott asking what the past can teach us about the present. Also, a cat anecdote from his days as a door-to-door Xerox salesman, highlighting the need for empathy and understanding in business, in addition to innovation in products and services. From their Run Simple message last year, SAP is moving on to Making Digital Simple, since all organizations have a lot of dark data that could be exploited to make them data-driven and seamless across the entire value chain: doing very sophisticated things while making them look easy. There is a sameness about vendors’ messaging these days around the digital enterprise — data, events, analytics, internet of things, mobile, etc. — but SAP has a lot of the pieces to bridge the data divide, considering that their ERP systems are at the core of so many enterprises and that they have a lot of the other pieces including in-memory computing, analytics, BPM, B2B networks, HR systems and more. Earlier this year, SAP announced S/4HANA: the next generation of their core ERP suite running on the HANA in-memory database and integrating with their Fiori user experience layer, providing a more modular architecture that runs faster, costs less to run and looks better. It’s a platform for innovation because of the functionality and platform support, and it’s also a platform for generating and exposing so much of the data that you need to make your organization data-driven. The HANA cloud platform also provides infrastructure for customer engagement, while allowing organizations to run their SAP solutions in on-premise, hybrid and cloud configurations.

SAP continues to move forward with HR solutions, and recently acquired Concur — the company that owns TripIt (an app that I LOVE) as well as a number of other travel planning and expense reporting tools — to better integrate travel-related information into HR management. Like many other large vendors, SAP is constantly acquiring other companies; as always, the key is how well that they can integrate this into their other products and services, rather than simply adding “An SAP Company” to the banner. Done well, this provides more seamless operations for employees, and also provides an important source of data for analyzing and improving operations.

A few good customer endorsements, but pretty light on content, and some of the new messaging (“Can a business have a soul?”) seemed a bit glib. The Stanley Cup made a short and somewhat superfluous appearance, complete with white-gloved handler. Also, there was a Twitter pool running on how many times the word “simple” was used in the keynote, another indication that the messaging might need a bit of fine-tuning.

There was a press conference afterwards, where McDermott was joined by Jonathan Becher and Steve Lucas to talk about some other initiatives (including a great SAP Store demo by Becher) and answer questions from press and analysts both here in Orlando and in Germany. There was a question about supporting Android and other third-party development; Lucas noted that HANA Cloud Platform is available now for free to developers as a full-stack platform for building applications, and that there are already hundreds of apps built on HCP that do not necessarily have anything to do with SAP ERP solutions. Building on HCP provides access to other information sources such as IoT data: Siemens, for example, is using HCP for their IoT event data. There’s an obvious push by SAP to their cloud platform, but even more so to HANA, either cloud or on-premise: HANA enables real-time transactions and reconciliations, something rarely available in ERP systems, while allowing for far superior analytics and data integration without complex customization and add-ons. Parts of the partner channel are likely a bit worried about this since they exploit SAP’s past platform weaknesses by providing add-on products, customization and services that may no longer be necessary. In fact, an SAP partner that relies on the complexity of SAP solutions by providing maintenance services just released a survey claiming to show a lack of customer interest in S/4HANA; although this resulted in a flurry of sensational headlines today, if you look at the numbers that show some adoption and quite a bit of non-committed interest — not bad for three months after release — it starts to look more like an act of desperation. It will be more interesting to ask this question a few quarters from now. HANA may also be seen as a threat to SAP’s customers’ middle management, who will be increasingly disintermediated as more information is gathered, analyzed and used to automatically generate decisions and recommendations, replacing manually-collated reports that form the information fiefdoms within many organizations.

Becher and Lucas offered welcome substance as a follow-on to McDermott’s keynote; I expect that we’ll see much more of the product direction details in tomorrow’s keynote with Bernd Leukert.

bpmNEXT 2015 Day 3 Demos: IBM (again), Safira, Cryo

It’s the last (half) day of bpmNEXT 2015, and we have five presentations this morning followed by the Best in Show award. Unfortunately, I have to leave at lunchtime to catch a flight, so you will have to check the Twitter hashtag to see who won — or maybe I’ll do a wrapup post from the road.

IBM: BPM, say Hello to Watson. A New Era of Cognitive Work – Here Today

First up was Chris Vavra discussing how Watson’s cognitive computing and natural language analysis capabilities can be used in the context of BPM, acting as an expert advisor to knowledge workers to enhance, scale and accelerate their work with its (or as Chris said, “his”) reasoning capabilities. There are a number of Watson services offered on their Bluemix cloud development platform; he demonstrated an example of an HR hiring process where the HR person uses Watson to analyze a candidate’s personality traits as part of the evaluation process. This is based on a written personal statement provided by the candidate; Watson analyzes that text (or could link through to a personal website or blog) to provide a personality analysis. From the Bluemix developer dashboard, you can create applications that include any of the services, including Watson Personality Insights, which provides ranking on several factors in the five basic personality traits of Openness, Conscientiousness, Extraversion, Agreeableness and Emotional Range, with a graphical representation to highlight values and needs that may be of concern in the hiring process. It’s unlikely that a hiring manager would use solely this information to make a decision, but it’s interesting for exploring a candidate’s personality characteristics as part of the process. There are a number of other Watson-based services available on Bluemix to bind into BPM (and other) applications; in the IBM cloud BPM designer, this just appears as a service connector that can be configured with the Watson authentication information, and invoked at a service step in a process flow. Lots of other potential applications for bringing this level of expert recommendations into processes, such as healthcare condition diagnoses or drug interactions.
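
In the BPM designer this is just a configured service connector, but under the hood it amounts to posting the candidate’s text to the service and reading back trait scores. The sketch below shows that call shape with a placeholder endpoint, credentials and response fields; it is not the documented Watson API.

```python
# Hypothetical REST call to a personality-analysis service bound from Bluemix.
import requests

SERVICE_URL = "https://example.bluemix.net/personality-insights/profile"  # placeholder, not the real endpoint
AUTH = ("service-username", "service-password")                           # from the service binding

def personality_profile(statement: str) -> dict:
    resp = requests.post(
        SERVICE_URL,
        auth=AUTH,
        headers={"Content-Type": "text/plain"},
        data=statement.encode("utf-8"),
    )
    resp.raise_for_status()
    return resp.json()   # e.g. scores for openness, conscientiousness, extraversion, ...
```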

Safira: Managing Unstructured Processes with AdHoc BPM Framework

Filipe Pinho Pereira addressed the long tail of organizations’ processes: only the high-volume, high-value structured processes are implemented as full BPM projects by IT, while the long tail of less critical and ad hoc processes ends up being handled manually. Using IBM BPM, he demonstrated their Ad-Hoc BPM Framework add-on that allows a business user to create a new ad-hoc process based on a predefined request-intervention process pattern, which has only an initial data capture/launch step, then a single “do it” human step with a loop that keeps returning to the same step until explicitly completed. The example was an expense report process, where a blank expense spreadsheet was attached, a form created to capture basic data, and SLAs specified. Routing is created by specifying the primary recipient, and notifications that will be issued on start, end and SLA violations. Users can then create an instance of that process (that is, submit their own expense report), which is then routed to the primary recipient; the only routing options at that point are Postpone, Forward and Complete, since it’s in the main human task loop part of the process pattern. This distills ad-hoc processes to their simplest form, where the current recipient of the main task decides on who the next recipient is or whether to complete the task; this is functionally equivalent to an email-based process, but with proper process monitoring and SLA analytics. By looking at the analytics for the process, we saw the number of interventions (the number of times that the human step loop was executed for an instance), and the full history log could be exported to perform mining to detect patterns for process improvement. Good example of very simple user-created ad hoc processes based on an industrial-strength infrastructure; you’re not going to buy IBM BPM just to run this, but if you’re already using IBM BPM for your high-volume processes, this add-on allows you to leverage the infrastructure for the long tail of your processes.
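
The request-intervention pattern is simple enough to render as a toy loop, shown below; this is my own sketch of the pattern as described, not Safira’s implementation.

```python
# Initial capture, then one human "do it" step that loops until someone chooses Complete.
def run_adhoc_instance(initial_data, first_recipient, get_decision):
    recipient = first_recipient
    history = [("started", first_recipient, initial_data)]
    while True:
        # One pass of the human step; get_decision returns
        # ("postpone" | "forward" | "complete", optional next recipient).
        decision, next_recipient = get_decision(recipient, initial_data)
        history.append((decision, recipient, initial_data))
        if decision == "complete":
            return history                 # instance done; history feeds monitoring and SLA analytics
        if decision == "forward":
            recipient = next_recipient     # route to a new recipient
        # "postpone" leaves the task with the same recipient for the next pass
```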

Cryo: Tools for Flexibility in Collaborative Processes

Rafael Fazzi Bortolini and Leonardo Luzzatto presented on processes that lie somewhere in the middle of the structured-unstructured spectrum, and how to provide flexibility and dynamic aspects within structured constraints through decision support, flexible operations, ad-hoc task execution and live changes to processes. Demonstrating with their Orquestra BPMS, they showed a standard process task interface with the addition of localized analytics based on the history of that task in order to help the user decide on their actions at that point. Flexible routing options allow the user to return the process to an earlier step, or forward the current task to a colleague for consultation before returning it to the original user at the same step; this does not change the underlying process model, but may move the instance between activities in a non-standard fashion or reassign it to users who were not included in the original process definition. They also have an ad-hoc process pattern, but unlike Safira, they are using actual ad-hoc activities in BPMN, that is, tasks that are not connected by flow lines. Users are presented with the available ad hoc tasks in the process model, allowing them to “jump” between the activities in any order. They also demonstrated live changes to production processes; the examples were adding a field to a form and changing the name of a task in the process, both of which are presumably loaded at runtime rather than embedded within the instantiated process to allow these types of changes.

bpmNEXT 2015 Day 2 Demos: Kofax, IBM, Process Analytica

Our first afternoon demo session included two mobile presentations and one on analytics, hitting a couple of the hot buttons of today’s BPM.

Kofax: Integrating Mobile Capture and Mobile Signature for Better Multichannel Customer Engagement Processes

John Reynolds highlighted the difficulty in automating processes that involve customers if you can’t link the real world — in the form of paper documents and signatures — with your digital processes. Kofax started in document scanning, and they’ve expanded their repertoire to include all manner of capture that can make processes more automated and faster to complete. Smartphones become intelligent scanners and signature capture devices, reducing latency in capturing information from customers. John demonstrated the Kofax Mobile Capture app, both natively and embedded within a custom application, using physical documents and his iPhone; it captures images of a financial statement, a utility bill and a driver’s license, then pre-processes them on the device to remove irregularities that might impact automated character recognition and threshold them to binary images to reduce the data transmission size. These can then be directly injected into a customer onboarding process, with both the scanned image and the extracted data included, for automated or manual validation of the documents to continue the process. He showed the back-end tool used to train the recognition engine by manually identifying the data fields on sample images, which can accept a variety of formats for the same type of document, e.g., driver’s licenses from different states. This is done by a business person who understands the documents, not developers. Similarly, you can also use their Kapow Design Studio to train their system on how to extract information from a website (John was having the demo from hell, and his Kapow license had expired) by marking the information on the screen and walking through the required steps to extract the required data fields. They take on a small part of the process automation, mostly around the capture of information for front-end processes such as customer onboarding, but are seeing many implementations moving toward an “app” model of several smaller applications and processes being used for an end-to-end process, rather than a single monolithic process application.
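
The on-device pre-processing is proprietary, but the thresholding step it describes can be approximated in a few lines with Pillow; the file name and threshold below are arbitrary, and this is only a stand-in for what the app actually does.

```python
# Convert a photographed document to greyscale, then threshold to a 1-bit image:
# this removes lighting irregularities and shrinks the file before upload.
from PIL import Image

THRESHOLD = 160   # tune per capture conditions

img = Image.open("utility_bill_photo.jpg").convert("L")              # greyscale
bw = img.point(lambda p: 255 if p > THRESHOLD else 0).convert("1")   # binary image
bw.save("utility_bill_bw.png")
print(f"original mode {img.mode}, binarized mode {bw.mode}")
```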

IBM: Mobile Case Management and Capture in Insurance

Mike Marin and Jonathan Lee continued on the mobile theme, stressing that mobile is no longer an option for customer-facing and remote worker functionality. They demonstrated IBM Case Manager for an insurance example, showing how mobile functionality could be used to enhance the claims process by mobile capture, content management and case handling. Unlike the Kofax scenario where the customer uses the mobile app, this is a mobile app for a knowledge worker, the claims adjuster, who may need a richer informational context and more functionality such as document type classification than a customer would use. They captured the (printed and filled) claims form and a photo of the vehicle involved in the claim using a smartphone, then the more complete case view on a tablet that showed more case data and related tasks. The supervisor view shows related cases plus a case visualizer that shows a timeline view of the case. They finished with a look at the new IBM mobile UI design concepts, which presented a more modern mobile interface style including a high-level card view and a smoother transition between information and functions.

Process Analytica: Process Discovery and Analytics in Healthcare Systems

Robert Shapiro shifted the topic to process mining/discovery and analytics, specifically in healthcare applications. He started with a view of process mining, simulation and other analytical techniques, and how to integrate with different types of healthcare systems via their history logs. Looking at their existing processes based on the history data, missed KPIs and root causes can be identified, and potential solutions derived and compared in a systematic and analytic manner. Using their Optima process analytics workbench, he demonstrated importing and analyzing an event log to create a BPMN model based on the history of events: this is a complete model that includes interrupting and non-interrupting boundary events, and split and merge gateways based on the patterns of events, with probabilistic weights and/or decision logic calculated for the splitting gateways. Keeping in mind that the log events come from systems that have no explicit process model, the automatic derivation of the boundary events and gateways and their characteristics provides a significant step in process improvement efforts, and can be further analyzed using their simulation capabilities. Most of the advanced analysis and model derivation (e.g., for gateway and boundary conditions) is dependent on capturing data value changes in the event logs, not just activity transitions; this is an important distinction since many event logs don’t capture that information.
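
As a tiny illustration of one piece of what such a tool derives from a history log, the sketch below computes the relative frequency of each outgoing path from an activity, which is the kind of probabilistic weight attached to a splitting gateway in the mined model; the log rows are invented, and this is not Optima’s algorithm.

```python
# Derive branch probabilities out of an activity from (case_id, activity) event rows.
from collections import Counter, defaultdict
from itertools import pairwise   # Python 3.10+

log = [
    (1, "Register"), (1, "Triage"), (1, "Treat"),
    (2, "Register"), (2, "Triage"), (2, "Refer"),
    (3, "Register"), (3, "Triage"), (3, "Treat"),
]

by_case = defaultdict(list)
for case_id, activity in log:            # rows assumed ordered by timestamp within each case
    by_case[case_id].append(activity)

transitions = Counter()
for trace in by_case.values():
    transitions.update(pairwise(trace))   # count direct activity-to-activity transitions

out = {t: n for t, n in transitions.items() if t[0] == "Triage"}
total = sum(out.values())
for (_, target), n in out.items():
    print(f"Triage -> {target}: {n / total:.2f}")   # e.g. Treat 0.67, Refer 0.33
```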

Analytics For Kofax TotalAgility With @Altosoft

Last session here at Kofax Transform, and as much as I’d like to be sitting around the pool, I also like to squeeze every bit out of these events, and support the speakers who get this most unenviable timeslot. I’ve been in a couple of the analytics sessions over the past two days, which are based on the Kofax Altosoft Insight product. Married with TotalAgility for process analytics, they offer a simple version with some pre-defined dashboards, a more complete version but tied only to the KTA databases, and the full version that has the full Insight functionality with any data sources including KTA. The focus seems to be only on document capture workflow analytics, with many of the default reports on things like productivity, extraction rates and field accuracy in the scan and extraction modules; although these are definitely important, and likely of primary importance to Kofax’s current customer base of capture clients, the use cases for their demos need to push further into the post-capture business processes if they expect to be taken seriously as a BPM vendor. I know that KTA is a “first mile” solution and the capture processes are essential, but there should be more to apply analytics to across the customer journey managed within a SPA.

The visualization and dynamic filtering are pretty nice, as you would expect in the Altosoft environment, allowing you to drill into specific processes and tasks to find problem areas in process quality and operator performance. Traditional capture customers in the audience are going to like this, since it provides a lot of information on those front-end processes that can become an expensive bottleneck to downstream processing.

We had another look at the process intelligence that I saw in an earlier session, monitoring event logs from capture workflows plus downstream processing in KTA or another system such as a third-party BPM or ERP system. Although that’s all good stuff, it does highlight that the Kofax end-to-end solution is made up of a number of systems strung together, rather than an integrated platform with shared infrastructure. It’s also completely document-centric since it uses document ID as the instance ID: again, well-suited for their current capture customers, but not necessarily the mind-set required to approach a more general BPM/case management market that is more data-centric than document-centric.

This wraps up Kofax Transform 2015. There is a customer awards dinner tonight that I plan to attend, then head home tomorrow. Thanks to the entire Kofax team, especially the amazing analyst relations crew, for inviting me here and making sure my time was well-spent. As a matter of disclosure, Kofax paid my travel expenses to be here, but did not otherwise compensate me for my time or for anything that I wrote here on my blog. Kofax has been a customer of mine in the past for presentations at Transform as well as webinars and white papers.

My next event is bpmNEXT in Santa Barbara at the end of the month — if you’re interested in the next generation of BPM or just want to hang with a bunch of BPM geeks in a relatively non-partisan environment, I highly recommend that you check it out.