Category Archives: analytics

analytics, business intelligence and business activity monitoring

The collision of capture, content and analytics

Martyn Christian of UNDRSTND Group, whom I worked with back at FileNet in 2000-1, gave a keynote at ABBYY Technology Summit 2017 on the evolution and ultimate collision of capture, content and analytics. He started by highlighting some key acquisitions in the industry, including the entry of private capital, as well as a move to artificial intelligence in the capture space, as harbingers of the changes in the capture market. Meanwhile, Gartner has declared enterprise content management dead — long live content services platforms! — introducing new players in the magic quadrant alongside the traditional ECM vendors, and shifting IBM from the leaders quadrant back to the challengers quadrant.

Intelligent capture is gaining visibility and importance, particularly as a driver for digital transformation. Interestingly, capture was traditionally about converting analog (paper) to digital (data); now, however, many forms of information are natively digital, and capture is not only about performing OCR on scanned paper documents but about extracting and analyzing actionable data from both analog and digital content. High-volume in-house production scanning operations are being augmented — or replaced — with customers doing their own capture, such as we now see with depositing a check using a mobile banking application. Information about customer actions and sentiment is being automatically gleaned from their social media actions. Advanced machine learning is being used to classify content, reducing the need for manual intervention further downstream, and enabling straight-through processing or the use of autonomous agents.
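As a rough illustration of that last point (using machine learning to classify captured content so that it can flow straight through), here's a minimal sketch using scikit-learn; the document types and training snippets are invented for the example, and a real capture product would obviously use far more sophisticated models.

```python
# Minimal sketch of ML-based document classification for capture, using
# scikit-learn; document types and training snippets are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

training_docs = [
    ("Please find attached the invoice for services rendered in March", "invoice"),
    ("I wish to file a claim for water damage to my property", "claim"),
    ("Enclosed is my completed account application form", "application"),
    ("Invoice number 4471, payment due within 30 days", "invoice"),
    ("Claim reference: storm damage to roof, photos attached", "claim"),
    ("Application for a new chequing account, ID documents enclosed", "application"),
]
texts, labels = zip(*training_docs)

# TF-IDF features plus a simple linear classifier stand in for whatever
# models a real intelligent capture product would use.
classifier = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
classifier.fit(texts, labels)

# Classify the extracted text of a newly captured document so it can be
# routed downstream without manual indexing.
new_doc = "Attached invoice 9982 for consulting work, net 30 terms"
print(classifier.predict([new_doc])[0])   # expected: 'invoice'
```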

As a marketing guy, he had a lot of advice on how this can be positioned and sold into customers; UNDRSTND apparently ran a workshop yesterday for some of the channel partner companies on bringing this message to their customers who are seeking to move beyond simple capture solutions to digital transformation.

Strategy to execution – and back: it’s all about alignment

I recently wrote a paper sponsored by Software AG called Strategy To Execution – And Back, which you can find here (registration required). From the introduction:

When planning for business success, corporate management sets business strategy and specifies goals in terms of critical success factors and key performance indicators (KPIs). Although senior management is not concerned with the technical details of how business operations are implemented, they must have confidence that the operations are aligned with the strategy, and be able to monitor performance relative to the goals in real time.

In order to achieve operational alignment, there must be a clear path that maps strategy to execution: a direct link from the strategic goals in the high-level business model, through IT development and management practices, to the systems, activities and roles that make the business work. However, that’s only half the story: there must also be a path back from execution to strategy, allowing operational performance to be measured against the objectives in order to guide future strategy. Without both directions of traceability, there’s a disconnect between strategy and operations that can allow a business to drift off course without any indication until it’s far too late.

I cover how you need to have links from your corporate strategy through various levels of architecture to implementation, then be able to capture the operational metrics from running processes and roll those up relative to the corporate goals. If you don’t do that, then your operations could just be merrily going along their own path rather than working towards corporate objectives.
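Purely as a toy illustration of the "back" direction, rolling operational metrics up against corporate goals, here's a hedged sketch; the process names, metrics and targets are all invented, and a real implementation would pull them from a BPMS and a performance management layer.

```python
# Toy sketch of rolling operational metrics up against corporate KPI targets;
# metric names, values and targets are hypothetical.
from statistics import mean

# Operational metrics captured from running processes (e.g., from a BPMS).
process_metrics = {
    "claims_handling": {"cycle_time_days": 4.2, "first_call_resolution": 0.71},
    "new_accounts":    {"cycle_time_days": 1.1, "first_call_resolution": 0.88},
}

# Corporate KPIs expressed as targets on the same metrics.
kpi_targets = {
    "cycle_time_days": ("max", 3.0),        # average cycle time must stay below 3 days
    "first_call_resolution": ("min", 0.80), # average FCR must stay above 80%
}

def kpi_status(metrics, targets):
    """Compare the average of each metric across processes to its target."""
    status = {}
    for kpi, (direction, target) in targets.items():
        actual = mean(p[kpi] for p in metrics.values())
        met = actual <= target if direction == "max" else actual >= target
        status[kpi] = {"actual": round(actual, 2), "target": target, "met": met}
    return status

for kpi, result in kpi_status(process_metrics, kpi_targets).items():
    print(kpi, result)
```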

Analytics customer keynote at TIBCONOW 2016

Michael O’Connell hosted the last general session for TIBCO NOW 2016, focusing on analytics customer stories with the help of five customers: State Street, Shell, Vestas, Monsanto and Western Digital. I’m not going to try to attribute specific comments to the customer representatives, just capture a few thoughts as they go by.


  • Spotfire is allowing self-service analytics to be pushed down to the business users
  • Typically, the analyses going on in a number of different solutions — from Excel to BI tools — can be consolidated onto a single analytics platform
  • Analytics is allowing the business to discover the true nature of their business, especially with outliers
  • Real-time analytics on physical processes (e.g., supply chain) generates significant benefits
  • Providing visual analytics to the business changes the way that they use data and collaborate across the organization
  • The enterprise-class back-end and the good visualizations in Spotfire are helping it to win over both IT and business areas
  • Data and events are being generated faster and in greater volumes from more devices, making desktop analytics solutions impractical
  • Business users who are not data specialists can understand — and leverage — fairly complex analytical models when it concerns their own data
  • Analytics about manufacturing quality can be used to identify potential problems before they occur

We finished up with a brief presentation from Fred Ehlers, VP of IT at Norfolk Southern, about their use of TIBCO products to help manage their extensive railway operations. He talked about optimizing their intermodal terminals, where goods shipped in containers are moved between trains, trucks and ships; asset utilization, to ensure that empty cars are distributed to the right place at the right time for expected demand; and their customer service portal that shows an integrated view of a shipment lifecycle to give customers a more accurate, real-time view. As an old company, they have a lot of legacy systems, and used TIBCO to integrate them, centralizing operational events, data and business rules. For them, events can come from their physical assets (locomotives and railway sensors), legacy reporting systems, partner networks for assets not under their ownership, and external information including weather. On this, they build asset state models, and create applications that automatically correlate information and optimize operations. They now have one source of data and rules, and a reusable set of data and services to make application development faster. Their next steps are predictive maintenance, gathering information from locomotives, signal systems, switches and trackside defect detectors to identify problems prior to an equipment failure; and real-time visual analytics with alerts on potential problem areas. They also want to improve operational forecasting to support better allocation of resources, allowing them to divert traffic and take other measures to avoid service disruptions. Great case study that incorporates the two conference themes of interconnecting everything and augmenting intelligence.
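I don't know the details of Norfolk Southern's implementation, but the general pattern of correlating trackside detector events against a simple asset state model to flag equipment before it fails might look something like this sketch; the event fields and thresholds are invented.

```python
# Hedged sketch of correlating trackside defect-detector events against a
# simple asset state model; event fields and thresholds are hypothetical.
from collections import defaultdict

# Stream of detector readings: (car_id, detector_location, bearing_temp_c)
events = [
    ("CAR-1042", "MP-210", 68), ("CAR-1042", "MP-245", 74),
    ("CAR-1042", "MP-280", 83), ("CAR-7731", "MP-210", 61),
]

WARN_TEMP = 80      # single-reading warning threshold
TREND_DELTA = 10    # rise across consecutive detectors that indicates trouble

asset_state = defaultdict(list)   # car_id -> recent readings

def correlate(event):
    """Update the asset state model and return a maintenance alert if needed."""
    car, location, temp = event
    history = asset_state[car]
    alert = None
    if temp >= WARN_TEMP:
        alert = f"{car}: bearing temp {temp}C at {location} exceeds {WARN_TEMP}C"
    elif history and temp - history[-1] >= TREND_DELTA:
        alert = f"{car}: bearing temp rising quickly ({history[-1]} -> {temp}C)"
    history.append(temp)
    return alert

for ev in events:
    if (msg := correlate(ev)):
        print("MAINTENANCE ALERT:", msg)
```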

We're at the end of day 2, and the end of my blogging at TIBCO NOW; there are breakout sessions tomorrow but I'll be on my way home. Some great new stuff in BPM and analytics, although far too many sessions going on at once to capture more than a fraction of what I wanted to see.

Intelligent Business Operations at TIBCONOW 2016

Nicolas Marzin of TIBCO gave a breakout session on making business operations intelligent, starting with the drivers of efficiency, agility, quality and transparency. There are a number of challenges to achieving this in terms of work management: workers may have too many queues to monitor and not know which is most important, or people may have work assigned to them that they are either over- or under-qualified to complete. This can result in missed SLAs and unhappy customers, lower efficiency, and lack of agility since business priorities aren't enforced.

Looking at a day in the life of an operational business user, they need to know their own and their team's performance goals, and what work they should be completing that day in order to achieve those goals. Managers are concerned about their team as a whole, including whether they are meeting goals and SLAs, whether they have sufficient resources, and how to prioritize work. Managers need tools for real-time metrics, workforce administration, workload balancing, and changing priorities on the fly. ActiveMatrix BPM provides the ability to model your workforce in terms of roles, groups, privileges, relationships and capabilities; rules are applied to create a distribution strategy that determines what work is assigned to which resource at any point in a business process. Typically, work is assigned to a subset of the workforce whose skills match the requirement, since allocating work to an individual creates an operational risk if that person is absent or overloaded with work. AMX BPM includes process patterns for resource management: separation of duties, retain familiar, chaining and piling.
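TIBCO's distribution strategies are configured in AMX BPM rather than hand-coded, but the underlying idea (offer each work item to the least-loaded resources whose role and capabilities match, rather than to a single individual) can be sketched roughly as follows; the roles, capabilities and queue limit are hypothetical.

```python
# Rough sketch of a rule-based work distribution strategy: offer each work
# item to the least-loaded resources whose capabilities match. The roles,
# capabilities and workload limit are hypothetical, not AMX BPM configuration.
workforce = [
    {"name": "alice", "roles": {"claims"}, "capabilities": {"auto", "property"}, "queue_depth": 3},
    {"name": "bob",   "roles": {"claims"}, "capabilities": {"auto"},             "queue_depth": 7},
    {"name": "carol", "roles": {"fraud"},  "capabilities": {"auto", "property"}, "queue_depth": 1},
]
MAX_QUEUE = 5  # don't offer more work to someone already at capacity

def distribute(work_item):
    """Return candidate resources for a work item, best match first."""
    candidates = [
        r for r in workforce
        if work_item["role"] in r["roles"]
        and work_item["skill"] in r["capabilities"]
        and r["queue_depth"] < MAX_QUEUE
    ]
    # Offer to the least busy qualified people rather than one individual,
    # to avoid the single-person operational risk mentioned above.
    return sorted(candidates, key=lambda r: r["queue_depth"])

print([r["name"] for r in distribute({"role": "claims", "skill": "property"})])
# expected: ['alice']  (bob lacks the skill and is over the queue limit anyway)
```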

Analytics comes into play in the management dashboard, where Spotfire is used to monitor operational performance and trigger actions directly from the dashboard. Typical visualizations include work backlog and SLAs, resource pool workload and capacity, process and case performance, and business data in context. Marzin showed examples of dashboards for real-time tracking of work backlog and staffing, plus as-is forecasting that identifies bottlenecks. The charts show the factors that are most important for a manager to make resource allocation decisions, understand staffing needs based on combinations of skills, and reprioritize specific work types, which can then be pushed back to AMX BPM.

This is fairly traditional BPM and case management, with rule-based workforce management, but that's a huge part of where AMX BPM is being used in practice. However, their workforce management is fairly advanced compared to many competitive solutions, and using Spotfire for operational analytics raises the bar in active manager dashboards while allowing for what-if prediction and simulation on the fly. This ties in to the "closing the loop" theme of the day, where manager dashboard actions feed directly back to adjust the workforce management rules. This level of integrated visual analytics for AMX BPM is long overdue, but it looks like they've turned the previous demo-ware into something much more robust and generally applicable.

As an aside, I’ve done some presentations recently about the need to align incentives with corporate goals; although individual performance statistics are important, it’s key to ensure that they match up with overall goals, and include measurements of collaboration and teamwork too. Metrics for collaboration are just starting to emerge, and are not included in most BPM or other work management platforms.

Closing the loop with analytics: TIBCONOW 2016 day 2 keynote

Yesterday at TIBCO NOW 2016, we heard about the first half of TIBCO’s theme — interconnect everything — and today, Matt Quinn introduced the second half — augment intelligence — before turning the stage over to Mark Palmer, SVP engineering for streaming analytics.


Palmer talked about the role of analytics over history, and how today's smart visual analytics allow you to be first to insight, then first to action. We then had a quick switch to Brad Hopper, VP strategy for analytics, for a demo of Spotfire visual analytics while wearing a long blond wig (attempting to make a point about the importance of beauty, I think). He built an analytics dashboard while he talked, showing how easy it is to create visual analytics and trigger smart actions. He went on to talk about data preparation and cleansing, which can often take as much as 50% of an analyst's time, and demonstrated importing a CSV file and using quick visualizations to expose and correct potential problems in the underlying data. As always, the Spotfire demos are very impressive; I don't follow Spotfire closely enough to know what's new, but it all looks pretty slick.
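The Spotfire data prep demo was visual and interactive, but the same kind of first-pass profiling, where you load a CSV and expose missing values, bad types and outliers before analysis, looks roughly like this in pandas; the file and column names are placeholders.

```python
# Quick-and-dirty data profiling sketch in pandas, standing in for the kind
# of visual data prep shown in the Spotfire demo. File name and column
# names are placeholders.
import pandas as pd

df = pd.read_csv("sales_extract.csv")   # hypothetical export

# Expose the usual suspects before any analysis:
print(df.dtypes)                 # columns imported with the wrong type
print(df.isna().sum())           # missing values per column
print(df.describe())             # out-of-range values and suspicious outliers
print(df.duplicated().sum())     # duplicate rows

# One typical correction: coerce a numeric column that came in as text,
# turning unparseable entries into NaN so they can be reviewed or dropped.
df["order_amount"] = pd.to_numeric(df["order_amount"], errors="coerce")
```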

Michael O'Connell, TIBCO's chief analytics officer, came up to demonstrate a set of analytics applications for a fictitious coffee company: sales figures and drilldowns, with what-if predictions for planning promotions; and supply chain management and smart routing of product deliveries.

Palmer came back to talk about TIBCO Jaspersoft, the other side of their analytics portfolio that provides business intelligence capabilities built in to applications, but it was a pretty quick mention with no demo. A Jaspersoft demo would look pretty mundane after seeing all of the sexy Spotfire features, but it undoubtedly is a workhorse for analytics with many customers. He moved on to ways that TIBCO is helping customers to roll analytics out, from accelerators and sample source code to engagement in the community.


He continued on with streaming analytics (Palmer was the CEO of Streambase before it was acquired by TIBCO), and O'Connell came back to show an oil industry application that leverages sensor analytics to maximize equipment productivity by initiating preventative maintenance when the events emitted by the device indicate that failure may be imminent. He showed a more comprehensive interface that would be used in the head office for real-time monitoring and analysis, and a simpler tablet interface for field service personnel to receive information about wells requiring service. Palmer finished the analytics segment with a brief look at LiveView Web, a zero-code environment for building operational intelligence dashboards.


Quinn returned to talk about their B-tree-based Graph Database, which is in preview mode now with an open API, and other areas where they are looking to provide innovative solutions. He went through a history of how they’ve grown as a technology organization, and got quite verklempt when thanking his team for how awesome they’ve continued to be over the past 18 months since the acquisition, which was really touching.

After the break, Adam Steltzner, NASA's lead engineer on the Mars Rover and author of The Right Kind of Crazy: A True Story of Teamwork, Leadership, and High-Stakes Innovation, talked about innovation, collaboration and decision-making under pressure. Check out the replay of the keynote for his talk, a fascinating story of the team that built and landed the Mars landing vehicles, along with some practical tips for leaders to foster exploration and innovation in teams.

Murray Rode returned to close out the keynote by announcing the winners of their Trailblazer customer awards:

  • Norfolk Southern (Pioneer) for implementing a real-time view of their railway operations
  • CargoSmart (Innovator) for incorporating real-time optimization of shipping logistics into their cargo management software
  • First Citizens Bank (Impact) for simplifying IT structure to allow for quick creation and delivery of new branch services
  • University of Chicago Medicine (Visionary) for optimizing operating room turnover to save costs and improve service
  • TUI Group (Transformer) for transforming their platforms through integration to enable new customer-facing tourism applications

That’s it for the morning keynote, and I’m off to catch some of the breakout sessions for most of the rest of the day before we come back for the customer panel and closing keynote at the end of the day.

The Enterprise Digital Genome with Quantiply at BPMCM15

“An operating system for a self-aware quantifiable predictive enterprise” definitely gets the prize for the most intriguing presentation subtitle, for an afternoon session that I went to with Surendra Reddy and David Chaney from Quantiply (a stealth startup that has just publicly launched), and their customer, a discount brokerage service whose name I have been requested to remove from this post.

Said customer has some significant event data challenges, with a million customers and 100,000 customer interactions per day across a variety of channels, and five billion log messages generated every day across all of their product systems and platforms. Having this data exist in silos with no good aggregation tools means fragmented and poor customer support, and also significant challenges in system and internal support.

To address these types of heterogeneous data analysis problems, Quantiply has a two-layer tool: Edge Cloud for the actual data analysis, which can then be exposed to different roles based on access control (business users, operational users, data scientists, etc.); and Pulse for connecting to various data sources including data warehouses, transactional databases, BPM systems and more. It appears that they're using some sort of dimensional fact models, which are fairly standard data warehouse analytical structures, but their Pulse connectors allow them to pour in data on a near-real-time basis, then make the connections between capabilities and services to be able to do fast problem resolution on their critical trading platforms. Because of the nature of the graph connectivity that they're deriving from the data sources, they're able to not only resolve the problem by drilling down, but also determine which customers were impacted by the problem in order to follow up. In response to a question, the customer said that they had used Splunk and other log analytics tools, but that this was "not Splunk", in terms of both the real-time nature and the front-end user experience, plus deeper analytical capabilities such as long-term interaction trending. In some cases, the Quantiply representation is sufficient analysis; in other cases, it's a starting point for a data scientist to dig in and figure out some of the more complex correlations in the data.
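Quantiply didn't show any internals, but the idea of using graph connectivity to go from a failing platform component to the customers that it impacts can be illustrated with a plain dependency-graph traversal; the nodes and edges here are invented.

```python
# Sketch of drilling from a failing platform component to impacted customers
# by traversing a dependency graph; the graph itself is hypothetical.
from collections import deque

# Edges point from a component to the things that depend on it.
depends_on_me = {
    "order-db":        ["trade-service"],
    "trade-service":   ["web-trading", "mobile-trading"],
    "web-trading":     ["customer:1001", "customer:1002"],
    "mobile-trading":  ["customer:1002", "customer:1003"],
}

def impacted(start):
    """Breadth-first walk downstream from a failing component."""
    seen, queue = set(), deque([start])
    while queue:
        node = queue.popleft()
        for dependent in depends_on_me.get(node, []):
            if dependent not in seen:
                seen.add(dependent)
                queue.append(dependent)
    return sorted(n for n in seen if n.startswith("customer:"))

print(impacted("order-db"))
# expected: ['customer:1001', 'customer:1002', 'customer:1003']
```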

There was a lot of detail in the presentation about the capabilities of the platform and what the customer is doing with it, and the benefits that they’re seeing; there’s not a lot of information on the Quantiply website since they’re just publicly launching.

Update: The original version of this post included the name of the customer and their representative. Since this was a presentation at a public conference with no NDA or confidentiality agreements in place, not even a verbal request at any time during the session, I live-blogged as usual. A day later, the vendor, under pressure from the customer’s PR group, admitted that they did not have clearance to have this customer speak publicly, which is a pretty rookie mistake on their part, although it lines up with my general opinion on their social media skills. As a favor to the conference organizers, who put a lot of effort into making a great experience for all of us, I’ve decided to remove the customer’s name from this post. I’m sure that those of you who really want to know it won’t have any trouble finding it, because of this thing called “the internet”.

The Personology of @RBSGroup at PegaWorld 2015

Andrew McMullan, director of analytics and decisioning (aka "personologist") at Royal Bank of Scotland, gave a presentation on how they are building a central (Pega-based) decisioning capability to improve customer engagement and change their culture along the way. He started with a personal anecdote about how RBS did the right thing for a family member and gained a customer for life – a theme echoed from this morning's keynote that also included RBS. He showed a short video of their current vision, which stated goals of making RBS easier to do business with, and to work for, in addition to being more efficient. In that order, in case you other banks are following along.

RBS is now government owned, having been bailed out during the financial crisis; I’m not sure how much this has allowed them to focus on customer engagement rather than short-term profits, but they do seem to be talking the right talk.

RBS uses Pega's Chordiant – primarily the decision management components, if I am reading it correctly – although they are implementing Pega 7 for an August 2015 rollout to bring in more robust Next Best Action capabilities; they also use SAS Visual Analytics for reporting. This highlights the huge role of decisioning as well as process in customer engagement, especially when you're applying analytics to a broad variety of customer information in order to determine how to interact with the customer (online or IRL) at any particular moment. RBS is proactive about having their customers do things that will save them money, such as renewing a mortgage at a lower rate, or choosing a package of banking services that doesn't overlap with other services that they are paying for elsewhere. Contrary to what nay-sayers within RBS said about lost revenue, this tends to make customers more loyal and ultimately do more business with them.

There was a good question from the audience about how much of this was changes to organizational culture, and how much was the data science: McMullan said that it’s really critical to win the hearts and minds of the employees, although obviously you need to have at least the beginnings of the analytics and recommendations to get that started. Also, they use Net Promoter Score as their main internal metric, which tends to reward relationship-building over short-term profits; having the right incentives for employees goes a long ways towards helping them to do the right thing.
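For reference, Net Promoter Score is a simple calculation: the percentage of promoters (scores of 9 or 10) minus the percentage of detractors (scores of 0 to 6), as in this toy example with made-up survey responses.

```python
# Toy Net Promoter Score calculation; survey responses are made up.
def nps(scores):
    """NPS = % promoters (9-10) minus % detractors (0-6), on a 0-10 survey."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

responses = [10, 9, 9, 8, 7, 6, 10, 3, 9, 8]
print(nps(responses))   # 5 promoters, 2 detractors out of 10 -> NPS of 30
```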

PegaWorld 2015 Day 2 Customer Keynotes: Big Data and Analytics at AIG and RBS

After the futurist view of Brian Solis, we had some more down-to-earth views from two Pega customers, starting with Bob Noddin from AIG Japan on how to turn information that they have about customers into an opportunity to do something expected and good. Insurance companies have the potential to help their customers to reduce risk, and therefore insurance claims: they have a lot of information about general trends in risk reduction (e.g., tell an older customer that if they have a dog and walk it regularly, they will stay healthier and live longer) as well as customer-specific actions (e.g., suggest a different route for someone to drive to work in order to reduce the likelihood of an accident, based on where they live and work, and the accident rates for the roads in between). This is not a zero-sum game: fewer claims is good for both AIG and the customers. Noddin was obviously paying close attention to Solis, since he wove elements of that talk into his presentation, describing how they are engaging customers in the way that the customer chooses, and have reworked their customer experience – and their employee and agent experience – with that in mind.

Between the two customers, we heard from Rob Walker, VP of Decision Management and Analytics at Pega, about the always-on customer brain and strategies for engaging with customers:

  • Know your customer: collect and analyze their data, then put it in the context of their entire customer journey
  • Reach your customer: break down the silos between different channels, and also between inbound and outbound communications, to form a single coherent conversation
  • Delight your customer: target their needs and wants based on what you know about them, using the channels through which you know that they can be reached.

He discussed how to use Pega solutions to achieve this through data, analytics and decisioning; obviously, the principles are universal.

The second customer on stage was Christian Nelissen from Royal Bank of Scotland, who I also saw yesterday (but didn't blog about) on the big data panel. RBS has a good culture of knowing their customer from their roots as a smaller, more localized bank: instead of the branch manager knowing every customer personally, however, they now rely on data about customers to create 1:1 personalized experiences based on predictive and adaptive analytics in the ever-changing context of the customer. He talked about the three pillars of their approach:

  • It's about the conversation. If you focus on doing the right thing for the customer, not always explicitly selling to them, you build the relationship for the long term.
  • One customer, one bank. A customer may have products in different bank divisions, such as retail banking, credit cards and small business banking, and you need to be cognizant of their complete relationship with the bank and avoid internal turf wars.
  • You can do a lot with a little. Data collection and analytics technologies have become increasingly cheaper, allowing you to start small and learn a lot before expanding your customer analytics program.

Alan Trefler closed out the keynote before sending us off to the rest of the day of breakout sessions. Next year, PegaWorld is in Las Vegas; not my favorite place, but I'll be back for the quality of the presentations and interactions here.

These two keynotes this morning have been great to listen to, and also closely aligned with the future of work workshop that I’m doing at IRM BPM in London next week, as well as the session on changing incentives for knowledge workers. Always good when the planets align.

SapphireNow 2015 Day 2 Keynote with Bernd Leukert

The second day of SAP's SAPPHIRENOW conference started with Bernd Leukert discussing how some customers' employees worry about being disintermediated by the digital enterprise, but how the digital economy can be used to accentuate the promise of your original business to make your customers happier without spending the same amount of time (and hopefully, money) on enterprise applications. It's not just about changing technologies but about changing business models and leveraging business networks to address the changing world of business. All true, but I still see a lot of resistance to the digital enterprise in large organizations, with both mid-level management and front-line workers feeling threatened by new technologies and business models until they can see how it can be of benefit to them.

Although Leukert is on the stage, the real star of the show is S/4HANA: the new generation of their Business Suite ERP solutions based natively on the in-memory HANA data and transaction engine for faster processing, a simplified data model for easier analytics and faster reconciliation, and a new user interface with their Fiori user experience platform. With the real-time analytical capabilities of HANA, including non-SAP as well as S/4HANA data from finance and logistics, they are moving from being just a system of record to a full decision support system. We saw a demo of a manufacturing scenario that walked through a large order, with a combination of financial and logistics data presented in real time for making recommendations on how to deal with a shortage in fulfilling the order. Potential solutions — in this case, moving stock allocated from one customer to another higher-priority customer — are presented with a predicted financial score, allowing the user to select one of the options. Nice demo of analytics and financial predictions directly integrated with order processing.

(Screenshots: order processing dashboard, order processing recommendations, and order process simulation results)
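We only saw the demo, not the underlying logic, but scoring the options for resolving a shortage and presenting a predicted financial impact might be sketched something like this; the customers, priorities and penalty figures are invented rather than SAP's model.

```python
# Hedged sketch of scoring options for resolving an order shortage, in the
# spirit of the S/4HANA demo; customers, priorities and penalties are invented.
options = [
    {"action": "reallocate from Customer A", "customer_priority": "low",
     "revenue_at_risk": 20_000, "late_penalty_avoided": 50_000},
    {"action": "expedite from alternate plant", "customer_priority": None,
     "revenue_at_risk": 0, "late_penalty_avoided": 50_000, "expedite_cost": 35_000},
]

def financial_score(option):
    """Predicted net financial impact of an option (bigger is better)."""
    return (option.get("late_penalty_avoided", 0)
            - option.get("revenue_at_risk", 0)
            - option.get("expedite_cost", 0))

# Present the options best-first, as the demo did with its predicted scores.
for opt in sorted(options, key=financial_score, reverse=True):
    print(f"{financial_score(opt):>8,}  {opt['action']}")
```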

The new offering is modular, with additional plug-ins for their other products such as Concur and SuccessFactors to enhance the suite capabilities. It runs in the cloud and on-premise. There are lots of reasons to transition, but having this type of new functionality requires significant work to adopt the new programming model: both on SAP's side in building the new platform, and also on the customers' side for refactoring their applications to take advantage of the new features. Likely this will take several months, if not years, for widespread adoption by customers that have highly customized solutions (isn't that all of them?), in spite of the obvious advantages. As we have seen with other vendors who completely re-architect their product, new customers are generally very happy with starting on the new platform, but existing customers can take years even when there is a certified migration path. However, since they launched in February, 400 customers have committed to S/4HANA, and they are now supporting all 25 industries that they serve.

As we saw last year, SAP is pushing to have existing customers first migrate to HANA as the underlying database in their existing systems (typically displacing Oracle), which is a non-trivial but straightforward operation that is likely to improve performance; then, reconsider whether the customizations that they have in their current system are handled out of the box with S/4HANA or can be easily re-implemented based on the simpler data model and more functional capabilities. Sounds good, and I imagine that they will get a reasonable share of their existing customers to make the first step and migrate to HANA, but the second step starts to look more like a new implementation than a simple migration, which will scare off a lot of customers. Leukert invited a representative from their customer Asian Paints to the stage to talk about their migration: they have moved to HANA and the simplified finance core functionality, and are still working on implementing the simplified logistics and other modules with a vision to soon be completely on S/4HANA. A good success story, but indicative of the length of time and amount of work required to migrate. For them, definitely worth the trip since they have been able to re-imagine their business model to reach new markets through a better understanding of their customers and their own business data.

He moved on to talk about the HANA Cloud Platform (HCP), a general-purpose application development platform that can be used to build applications unrelated to SAP applications, or to build extensions to SAP functionality. He mentioned an E&Y application built on HCP for fraud detection that is directly integrated with core SAP solutions, which is just one of 1,000 or more third-party applications available on the HCP marketplace. HCP provides structured and unstructured data models, geospatial, predictive, Fiori UX platform as a service, mobile support, analytics portfolio, and integration layers that provide direct connection to your business both on the device side through IoT events and into the operational business systems. With the big IoT push that we saw in the panel yesterday, Siemens has selected HCP as their cloud platform for IoT: the Siemens Cloud for Industry. Peter Weckesser of Siemens joined Leukert on stage to talk more about this newly-launched platform, and how it can be added to their customer installations as a monitoring (not control) layer: remote devices, such as sensors on manufacturing equipment, push their event streams to the Siemens cloud (based on HCP) in public, hybrid or on-premise configurations; analytics can then be applied for predictive maintenance scheduling as well as aggregate operational optimization.

We saw a demo based on the CenterPoint IoT example at the panel yesterday, showing monitoring and maintenance of energy distribution networks: tracking the health of transformers, grid storage and other devices and identifying equipment failures, sometimes before they even happen. CenterPoint already has 100,000 sensors out in the field, and since this is integrated with S/4HANA, this is not just monitoring: an operator can trigger a work order directly from the predictive equipment maintenance analytics dashboard.

(Screenshots: energy grid analytics and drill-down)

Leukert touched on the HANA roadmap, with the addition of Hadoop and SPARK Cluster Manager to handle infinite volumes of data, then welcomed Walmart CIO Karenann Terrell to discuss what it is like to handle a really large HANA implementation. Walmart serves 250 million customers per week through 11,000 locations with 2.2 million employees, meaning that they generate a huge amount of data just in their daily operations: literally trillions of financial transactions. Because technology is so core to managing this well, she pointed out that Walmart is creating a technology company in the middle of the world's largest retail company, which allows them to stay focused on the customer experience while reducing costs. Their supply chain is extensive, since they are directly plugged into many of their suppliers, and innovating along that supply chain has driven them to partner with SAP more closely than most other customers. HANA allows them to have 5,000 people hitting data stores of a half-billion records simultaneously with sub-second response time to provide a real-time view of their supply chain, making them a true data-driven retailer and shooting them to the top of yesterday's HANA Innovation Awards. She finished by saying that seeing S/4HANA implemented at Walmart in her lifetime is on her bucket list, which got a good laugh from the audience but highlighted the fact that this is not a trivial transition for most companies.

Leukert finished with an invitation — or maybe it was a challenge — to use S/4HANA and HCP to reinvent your business: “clean your basement” to remove unnecessary customization in your current SAP solutions or convert it to HCP or S/4HANA extension platform; change your business model to become more data-driven; and leverage business networks to expand the edges of your value chain. Thrive, don’t just survive.

Steve Singh, CEO of Concur (acquired by SAP last December) then took over to look at reinventing the employee travel experience, from booking through trip logistics to expense reporting. For companies with large numbers of traveling employees, managing travel can be a serious headache both from a logistics and a financial standpoint. Concur does this by creating a business network (or a network of networks) that directly integrates with suppliers — such as airlines and car rental companies — for booking and direct invoice capture, plus easy functions for inputting travel expenses that are not captured directly from the supplier. I heard comments yesterday that SAP already has travel and expense management, and although Concur's functionality is likely a bit better, the networks that they bring are the real prize here. The networks, for example, allow for managing the extraction of an employee who finds themselves in a disaster or other dangerous travel scenario, which becomes part of a broader human resources risk management strategy.

At the press Q&A later, Leukert fielded questions about how they have simplified the complete core of their ERP solution in terms of data model and functionality but still have work to do for some industry modules: although all 25 industries are supported as of now in the on-premise version, they need to do a bit of tinkering under the hood and do additional migration for the cloud version. They’re also still working on the cloud version of everything, and are recommending the HCM and CRM standalone products if the older Business Suite versions don’t meet requirements. In other words, it’s not done yet, although core portions are fully functional. Singh talked about the value of business networks such as Ariba in changing business models, and sees that products such as Concur using HCP and the SAP business networks will help drive broader adoption.

There was a question on the ROI for migration to S/4HANA: it’s supposed to run 1,800 times faster than previous versions, but customers may not be seeing much (if any) savings, opening things up to competitive displacement. I heard this same sentiment from some customers last night at the HANA Innovation Awards reception; since there is little or no cost reduction in terms of license and deployment costs, they need to make the case based on what additional capabilities that HANA enables, such as real-time analytics and predictions, that allow companies to run their businesses differently, and a longer-term reduction in IT complexity and maintenance costs. Since a lot of more traditional companies don’t yet see the need to change their business models, this can be a hard sell, but eventually most companies will need to come around to the need for real-time insights and actions.

Consolidated Inbox in SAP Fiori at SapphireNow 2015

I had a chance to talk with Benny Notheis at lunchtime today about the SAP Operational Intelligence product directions, and followed on to his session on a consolidated inbox that uses SAP’s Fiori user experience platform to provide access to SAP’s Business Suite workflow, BPM and Operational Process Intelligence work items, as well as work items from non-SAP workflow systems. SAP has offered a few different consolidated inboxes over the years — some prettier than others — but they all serve the same purpose: to make things easier for users by providing a single point of contact for all work items, and easier for IT by reducing maintenance and support. In the case of the Fiori My Inbox, it also provides a responsive interface across mobile and desktop devices. Just as the underlying database and transaction platform for SAP is converging on HANA, all user experience for applications and analytics is moving to Fiori. Fiori (and therefore the consolidated My Inbox) is not yet available on the cloud platform, but that’s in the works.

As a consolidated work list manager, My Inbox provides multiple device support including mobile, managing work items from multiple systems in a single list and fully integrated into the Fiori launchpad. It has some nice features such as mass approvals, full-text searching, sorting and filtering, and sharing tasks via email and SAP JAM; work items can have attachments, comments and custom attributes that are exposed in the work list UI or by launching the UI specific to the work item.

We saw a demo of My Inbox, with a user-configurable view that allows workers to create filtered lists within their inbox for specific task types or source systems in order to organize their work in the way that they want to view it. Work items can be viewed and managed in the work list view within Fiori, or the work item launched for full interaction using its native UI. Tasks can be forwarded to other users or suspended, and task type-specific actions such as approve and reject can be applied. Attachments can be added and viewed directly from the work list view, as well as direct links into other systems. The history for a work item is maintained directly in My Inbox for viewing by the user, although the underlying workflow systems are likely also maintaining their own separate history logs; this provides a more collaborative history by allowing users to add comments that become part of the My Inbox history. Emailing a task to a user sends a direct link to the task but does not interrogate or allocate access rights; I assume that this could mean that a task could be sent to someone who does not have rights to open or edit the task, and the original sender would not be informed. Within any list view, a multi-select function can be used to select multiple items for approval; these all have to be approval-type items rather than notifications, so this might be most useful in a list view that is filtered for a single task type. There is no view of tasks that a user delegated or completed — a sort of Sent Items box — so a user can't monitor the progress of something that they forward to someone else. Substitutions for out-of-office times are set in My Inbox, meaning that the user does not need to visit each of the underlying systems of record to set up substitution rules; these rules can be applied based on task groups, which are established by how task profiles are set up during the initial technical configuration.

A good demonstration of the new generation of SAP user experience, and how Fiori can be used in a production transaction-oriented environment. There obviously needs to be a fair amount of cooperation between the Fiori-based My Inbox and the systems of record that contribute work items: My Inbox needs to be able to interrogate quite a bit of data from each work item, send actions, and manage user substitution rules via a common task consumption model that interacts with gateways to each type of underlying system. There is likely still quite a bit of work to do in those integration points to make this a fully-functional universal inbox, especially for systems of record that are more reluctant to yield their secrets to other systems; SAP has published specifications for building task gateways that could then be plugged into this model, which would expose work items from any system in My Inbox via a compatible gateway.


(Image from SDN link above)
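SAP's published task gateway specification is their own; purely to illustrate the integration pattern of a common task model that each system of record exposes through its own gateway, here's a hedged sketch where the class and method names are invented and are not the SAP API.

```python
# Illustrative sketch of the consolidated-inbox pattern: a common task model
# plus per-system gateways. Class and method names are invented for the
# example and are not SAP's published task gateway API.
from dataclasses import dataclass
from typing import Protocol

@dataclass
class Task:
    task_id: str
    source_system: str   # e.g. "Business Suite workflow", "BPM", "non-SAP"
    title: str
    task_type: str       # e.g. "approval" vs "notification"
    assignee: str

class TaskGateway(Protocol):
    """What each system of record would expose to the consolidated inbox."""
    def fetch_tasks(self, user: str) -> list[Task]: ...
    def complete(self, task_id: str, action: str) -> None: ...
    def set_substitute(self, user: str, substitute: str) -> None: ...

def consolidated_inbox(user: str, gateways: list[TaskGateway]) -> list[Task]:
    """Merge work items from every registered gateway into a single list."""
    items: list[Task] = []
    for gw in gateways:
        items.extend(gw.fetch_tasks(user))
    # Group approvals together so multi-select mass approval is easy.
    return sorted(items, key=lambda t: t.task_type)
```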

The next good trick will be to have a consolidated history log, combining the logs from My Inbox with those in the systems of record to build a more complete history of a work item for reporting and decisioning.