CamundaCon 2023 Day 2: Healthcare Workflow to Improve Patient Outcomes

Steven Gregory of Cardinal Health™ Sonexus™ Access and Patient Support, a healthcare technology provider, presented on some of the current US healthcare trends — including value-based care and telemedicine — and the technology trends that are changing healthcare, from IoT wearable devices to AI for clinical decisioning. Healthcare is a very process-driven industry, but many of its processes are manual, or embedded within forms or legacy systems: scheduling, admission/discharge, insurance, and health records management. As in many other industries, these “hidden” workflows are critical to patient outcomes, but it’s not possible to see how the flows work at any level, much less end-to-end.

Clinical workflow automation has a bit of history: I worked with Siemens Medical Systems (now Cerner) on their implementation of TIBCO’s workflow more than 10 years ago, and even wrote a paper on the uses of BPM in healthcare back in 2014. What Steven is talking about is a much more modern version of that, using Camunda and a microservice architecture to automate processes and link legacy systems.

They have effectively implemented a number of patient journey workflows: appointment creation, rescheduling and cancellation; benefits verification and authorization; digital enrollment; and some patient-facing chatbot flows. Many of these are simply automations of the existing manual processes, but there’s a lot of benefit to be gained, as long as you recognize that this isn’t the final version of the flow but a milestone on the journey to process improvement.
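
For those who haven’t seen this pattern in practice, here’s roughly what “Camunda orchestrating, microservices doing the work” looks like: a minimal external task worker sketched against Camunda 7’s standard REST API. The topic name, worker ID and variable names are hypothetical, not from the presentation.

```python
# Minimal Camunda 7 external task worker (sketch). A microservice polls the
# engine for work on a topic, does its job, and completes the task.
import time

import requests

ENGINE = "http://localhost:8080/engine-rest"  # default Camunda 7 REST base path
WORKER_ID = "benefits-verification-worker"    # hypothetical worker name

def fetch_and_lock():
    resp = requests.post(f"{ENGINE}/external-task/fetchAndLock", json={
        "workerId": WORKER_ID,
        "maxTasks": 5,
        "topics": [{
            "topicName": "verify-benefits",  # hypothetical topic
            "lockDuration": 60000,           # ms before the task is re-offered
            "variables": ["patientId"],
        }],
    })
    resp.raise_for_status()
    return resp.json()

def complete(task_id: str, verified: bool):
    requests.post(f"{ENGINE}/external-task/{task_id}/complete", json={
        "workerId": WORKER_ID,
        "variables": {"benefitsVerified": {"value": verified, "type": "Boolean"}},
    }).raise_for_status()

while True:
    for task in fetch_and_lock():
        patient_id = task["variables"]["patientId"]["value"]
        # ... call the payer or legacy system here to verify benefits ...
        complete(task["id"], verified=True)
    time.sleep(5)  # crude polling; fetchAndLock's asyncResponseTimeout is nicer
```

The point of the pattern is that the process model stays in Camunda, visible end-to-end (exactly the “hidden workflows” problem above), while any worker can be rewritten or replaced without touching the model.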

He discussed a really interesting use case of cell and gene therapy: although they haven’t rolled this out yet, it involves complex systems integration, data tracking across systems, and unique manufacturing processes, all while providing personalized care to patients. He feels that Camunda is key for orchestrating complex processes like this. In the Q&A, he also spoke about the difference in ramp-up time for their developers, and how much faster it is to learn Camunda and individual microservices than a legacy system.

Great examples of moving beyond straightforward process orchestration for improving critical processes.

TechnicityTO 2018: Cool tech projects

The afternoon session at Technicity started with a few fast presentations on cool projects going on in the city. Too quick to grab details from the talks, but here’s who we heard from:

  • Dr. Eileen de Villa, medical officer of health at Toronto Public Health, and Lawrence Eta, deputy CIO at the City of Toronto, on using AI to drive public health outcomes.
  • Angela Chung, project director at Toronto Employment and Social Services, Children’s Services, Shelter Support and Housing, on client-centric support through service platform integration.
  • Matthew Tenney, data science and visualization team supervisor, on IoT from streetcars to urban forestry for applications such as environmental data sensing.
  • Arash Farajian, policy planning consultant, on Toronto Water’s use of GIS, smart sensors, drones (aerial and submersible) and augmented reality.

The rest of the afternoon was the 10th annual Toronto’s Got IT Awards of Excellence, but unfortunately I had to duck out for other meetings, so that’s it for my Technicity 2018 coverage.

TechnicityTO 2018: Taming Transportation Troubles with Technology

Every year, IT World Canada organizes the Technicity conference in Toronto, providing a technology showcase for the city and an opportunity to hear about some of the things that are happening both in the city government and organizations that operate here. Fawn Annan, president of ITWC, opened the conference and introduced the city manager, Chris Murray, for a backgrounder on the city as an economic engine and how technology enables that.

The sessions started with a panel on transportation technology, moderated by Jaime Leverton, GM of Cogeco Peer 1, and featuring three people from the City of Toronto: Barb Gray, General Manager of Transportation Services; Ryan Landon, Autonomous Vehicle Lead; and Jesse Coleman, Transportation Big Data Team Leader. Erik Mok, Chief Enterprise Architect for the Toronto Transit Commission, was also supposed to be on the panel but had not yet arrived: hopefully not delayed on the TTC. 🙂

They spoke about the need for data collection in order to determine how to improve transportation in the city, whether related to personal vehicles, public transit, cycling or walking. In the past, this required manual data collection on the street; these days, the proliferation of traffic cameras, embedded sensors and smartphones means that a lot of data is being collected about how people are moving around the streets. This creates a need to understand how to work with the resulting big data, and huge opportunities for gaining better insights into making the streets more efficient and safer for everyone. Since the city is a big proponent of open data, the data that the city collects is available (in an anonymized format) to anyone who wants to analyze it. The city is trying to do some of this analysis themselves (without the benefit of a data scientist job classification at the city), but the open data initiative means that a lot of commercial organizations — from big companies to startups — are incorporating it into apps and services.

For the King Street Pilot, a year-old project that restricts the travel of private cars on our busiest streetcar route in order to prioritize public transit, the city deployed new types of sensors to measure the impact: Bluetooth sensors that track devices, traffic cameras with embedded AI, and more. This allows for unbiased measurement of the actual impact of the pilot (and other initiatives) that can be communicated to constituents.

There are privacy safeguards in place to ensure that tracked Bluetooth devices can’t be traced to an individual on an ongoing basis, but video is a larger issue: in general, the transportation-related intelligence is extracted from the video, then the video is discarded. They mentioned the need for privacy by design, that is, building privacy considerations into any data collection project from the start rather than trying to add them on later.
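
The panel didn’t describe the exact mechanism, but a common privacy-by-design approach to device tracking is a salted, rotating hash of the device identifier: detections can be matched within a window (enough to compute travel times), while the mapping back to a real device becomes unrecoverable once the salt is discarded. A minimal sketch, with all parameters assumed:

```python
# Sketch of salted, rotating anonymization for Bluetooth MAC addresses
# (my assumption of the general technique, not the city's implementation).
import hashlib
import secrets
from datetime import date

_daily_salts = {}  # in practice, generated and destroyed on a schedule

def _salt_for(day: date) -> bytes:
    if day not in _daily_salts:
        _daily_salts[day] = secrets.token_bytes(16)
    return _daily_salts[day]

def anonymize(mac: str, day: date) -> str:
    digest = hashlib.sha256(_salt_for(day) + mac.encode())
    return digest.hexdigest()[:16]  # truncated token is all that gets stored

# Same device, same day -> same token, so two sensors can compute a travel time;
# once the day's salt is destroyed, nothing links the token back to the device.
print(anonymize("AA:BB:CC:DD:EE:FF", date.today()))
```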

They also discussed some of the smart sensors and signals being used for controlling traffic signals, where the length of the waiting queue of vehicles can influence when the traffic signals change. This isn’t just about vehicles, however: there’s an impact on pedestrians who use the same intersections, and on public health in terms of people with mobility challenges.
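
As a toy illustration of that queue-actuated logic (my reading of the general idea rather than the city’s actual control system, with every threshold invented):

```python
# Toy queue-actuated green-time calculation. Real controllers follow signal
# standards, add pedestrian phases, and coordinate along corridors.
MIN_GREEN, MAX_GREEN, EXTENSION = 10, 60, 5  # seconds; invented values

def next_green_duration(queue_length: int, pedestrians_waiting: bool) -> int:
    green = MIN_GREEN + EXTENSION * queue_length  # longer queue, longer green
    if pedestrians_waiting:
        green = max(green, 15)  # guarantee enough crossing time
    return min(green, MAX_GREEN)  # cap it so cross traffic is never starved

print(next_green_duration(queue_length=7, pedestrians_waiting=True))  # -> 45
```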

Cities like Seattle, San Francisco and New York, which started with transportation data collection much earlier than Toronto, are doing some innovative things, but the panel feels that we’re catching up: for example, there’s an autonomous shuttle project in the works to fill some of the gaps in our transit system. There’s also some work being done with drones to monitor traffic congestion around special events (presumably both vehicle and pedestrian) in order to understand dispersal patterns.

Interesting audience questions on data storage (Amazon AWS) and standardization of data formats, especially related to IoT.

As a Toronto resident who uses public transit, walks a lot and sometimes even drives, some great information on how big data is feeding into improving mobility for everyone.

Consumer IoT potential: @ZoranGrabo of @ThePetBot has some serious lessons on fun

I’m back for a couple of sessions on the second day of Big Data Toronto, and just attended a great session by Zoran Grabovac of PetBot on the emerging markets for consumer IoT devices. His premise is that success with consumer IoT devices comes from saving/creating time, strengthening connections, and having fun.

It also helps to be approaching an underserved market, and if you believe his somewhat horrifying stat that 70% of pet owners consider themselves to be “pet parents”, there’s a market of people who want to interact with and entertain their pets with technology while they’re gone during working hours. PetBot’s device gives you a live video feed of your pet remotely, but can also play sounds, drop treats (cue Pavlov) and record pet selfies using facial recognition to send to you while you’re out. This might seem a bit frivolous, but his lessons stand: use devices to “create” time (allowing for interaction during a time when you would not normally be available), make your own types of interactions (e.g., create a training regimen using voice commands), and have fun to promote usage retention (who doesn’t like cute pet selfies?).

I asked about integrating with pet activity trackers and he declined to comment, so we might see something from them on this front; other audience questions asked about the potential for learning and recognition algorithms that could automatically reward specific behaviours. I’m probably not going to run out and get a PetBot – it seems much more suited to dogs than cats – but his insights into consumer IoT devices are valid across a broader range of applications.

Pegaworld 2016 Day 1 Keynote: Pega direction, Philips and Allianz

It seems like I was just here in Vegas at the MGM Grand…oh, wait, I was just here. Well, I’m back for Pegaworld 2016, and 4,000 of us congregated in the Grand Garden Arena for the opening keynote on the first day. If you’re watching from home, or want to catch a replay, there is a live stream of the keynotes that will likely feature an on-demand replay at some point.

Alan Trefler, Pega’s CEO, kicked things off by pointing out the shift from a focus on technology to a focus on the customer. Surveys show that although most companies think that they understand their customers, the customers don’t agree; companies need to undergo a serious amount of digital transformation in order to provide the level of service that today’s customers need, while still improving efficiencies to support that experience. One key to this is a model-driven technology environment that incorporates insights and actions, allowing the next best action to be offered at any given point depending on the current context, while supporting organizational evolution to allow constant change to meet future demands. Model-driven environments let you create applications that are future-proof, since it is relatively quick to make changes to the models without changing a lot of code. Pega has a lot of new online training at the Pega Academy, a marketplace of third-party Pega applications at the Pega Exchange, and continuing support of their easy-to-use Pega Express modeler; they continue to work on breaking free from their tech-heavy past to support more agile digital transformation. Pega recently sponsored an Economist report on digital transformation; you can grab that here.

Don Schuerman, Pega’s CTO, took over as MC for the event to introduce the other keynote speakers, but first announced a new partnership with Philips that links Pega’s care management package with Philips’ HealthSuite informatics and cloud platform for home healthcare. Jeroen Tas, CEO of Connected Care & Health Informatics at Philips, presented more on this, specifically in the context of the inefficient and unevenly-distributed US healthcare system. He had a great chart that showed the drivers for healthcare transformation: from episodic to continuous, by orchestrating 24/7 care; from care provider to human-centric, by focusing on patient experience; from fragmented to connected, by connecting patients and caregivers; and from volume to value, by optimizing resources. Connected, personalized care links healthy living to disease prevention, and supports proper diagnosis and treatment since healthcare providers all have access to a comprehensive set of the patient’s information. There are lots of cool personal healthcare devices, such as ultrasound-as-a-service, where they will ship a device that can be plugged into a tablet to allow your GP to do scans that might normally be done by a specialist; continuous glucose meters and insulin regulation; and tools to monitor elderly patients’ medications. Care costs can be reduced by 26% and readmissions by 52% through active monitoring in networked care delivery environments, such as monitoring heart patients for precursors of a heart attack; this requires a combination of IoT, personal health data, data analytics and patient pathways provided by Philips and Pega. He ended by stating that it’s a great time to be in healthcare, and that there are huge benefits for patients as well as healthcare providers.

Although Tas didn’t discuss this aspect, there’s a huge amount of fear of connected healthcare information in user-pay healthcare systems: people are concerned that they will be refused coverage if their entire health history is known. Better informatics and analysis of healthcare information improves health and reduces overall healthcare costs, but it needs to be provided in an environment that doesn’t punish people for exposing their health data to everyone in the healthcare system.

We continued on the healthcare topic, moving to the insurance side with Birgit König, CEO of Allianz Health Germany. Since basic healthcare in Germany is provided by the state, health insurance is for additional services not covered by the basic plan, and for travelers while they are outside Germany. There is a lot of competition in the market, and customer experience for claims is becoming a competitive differentiator, especially with younger customers. To accommodate this, Allianz is embracing a bimodal architecture approach, where back-end systems are maintained using traditional development techniques that focus on stability and risk, while front-end systems are more agile and innovative with shorter release cycles. I’ve just written a paper on bimodal IT and how it plays out in enterprises; it’s not published yet, but it’s completely aligned with what König discussed. Allianz is using Pega for more agile analytics and decisioning at the front end of their processes, while keeping their back-end systems stable. Not surprisingly, innovation and fast development have been greatly aided by co-locating their development and business teams.

The keynote finished with Kerim Akgonul, Pega’s SVP of Products, giving a high-level product update. He started by looking at the alignment between internal business goals and the customer journey, spanning marketing, sales, customer service and operations. The Pega Customer Decision Hub sits at the middle of these four areas, linking information so that, for example, offers sent to customers are based on their past orders.


  • Marketing: A recent Forrester report stated that Pega Marketing yields an 8x return on marketing investment (ROMI) due to the next-best-action strategies and other smart uses of analytics. Marketers don’t need to be data scientists to create intelligent campaigns based on historical and real-time data, and send those to a targeted list based on filters including geolocation. We saw this in action, with a campaign created in front of us to target Pegaworld attendees who were actually in the arena, then sent out to the recipients via the conference mobile app.
  • Sales: The engagement map in the Pega Sales Automation app uses the Customer Decision Hub information to provide guidance that links products to opportunities for salespeople; we saw how the mobile sales automation app makes this information available and recommends contacts and actions, such as a follow-up contact or training offer. There are also some nice tools such as capturing a business card using the mobile camera and importing the contact information, merging it if a similar record is found.
  • Customer service: The Pega customer service dashboard shows individual customer timelines, but the big customer service news in this keynote is the OpenSpan acquisition that provides robotic process automation (RPA) to improve customer service environments. OpenSpan can monitor desktop work as it is performed, and identify opportunities for RPA based on repetitive actions. The new automation is set up by recording the actions that would be done by a worker, such as copying and pasting information between systems. The example was an address change, where a CSR would take a call from a customer then have to update three different systems with the same information by copying and pasting between applications. We saw the address change being recorded, then played back on a new transaction; this was also included as an RPA step in a Pega Express model, although I’m not sure if that was just to document the process as opposed to any automation driven from the BPM side.
  • Operations: The Pega Field Service application provides information for remote workers doing field support calls, reducing the time required to complete the service while documenting the results and tracking the workers. We saw a short video of Xerox using this in Europe for their photocopier service calls: the field engineer sees the customer’s equipment list, the inventory that he has with him, and other local field engineers who might have different skills or inventory to assist with his call. Xerox has reduced their service call time, improved field engineer productivity, and increased customer satisfaction.

Good mix of vision, technology and customer case studies. Check out the replay when it’s available.

BPM and IoT in Home and Hospice Healthcare with @PNMSoft

I listened in on a webinar today by Vasileios Kospanos of PNMSoft about business process management (BPM) and the internet of things (IoT). They started with some basic definitions and origins of IoT – I had no idea that the term was coined back in 1999, which is about the same time that the term BPM came into use – as a part of controls engineering that relied on a lot of smart devices and sensors producing data and responding to remote commands. There are some great examples of IoT in use, including environmental monitoring, manufacturing, energy management, and medical systems, in addition to the more well-known consumerized applications such as home automation and smart cars. Gartner claims that there will be 26B devices on the internet by 2020, which is probably not a bad estimate (and is also driving adoption of the IPv6 addressing standard).

Dominik Mazur from Amedar Consulting Group (a Polish business and technology consulting firm) joined to discuss a case study from one of their healthcare projects, helping to improve the flow of medical information and operations – including home care and hospices, parts of the medical system that are often orphaned from an information-gathering standpoint – tied into their National Health Fund systems. This included integrating the information from various devices used to measure the patients’ vital signs, and supported processes for admission and discharge from medical care facilities. The six types of special-purpose devices communicate over mobile networks, and can store the data for later forwarding if there is no signal at the point of collection. Doctors and other health care professionals can view the data and participate in remote diagnosis activities or schedule patient visits.
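
The webinar didn’t show device internals, but the store-and-forward behaviour Mazur described follows a well-known pattern: persist every reading locally first, then flush the buffer whenever connectivity returns. A minimal sketch, with the endpoint and data shapes assumed:

```python
# Store-and-forward sketch for an intermittently connected medical device.
# Illustrative pattern only; not Amedar's actual implementation.
import json
import sqlite3

import requests

INGEST_URL = "https://example.invalid/ingest"  # hypothetical collection endpoint

db = sqlite3.connect("readings.db")  # survives power cycles, unlike an in-memory list
db.execute("CREATE TABLE IF NOT EXISTS buffer (id INTEGER PRIMARY KEY, payload TEXT)")

def record(reading: dict) -> None:
    """Always write locally first; transmission is a separate concern."""
    db.execute("INSERT INTO buffer (payload) VALUES (?)", (json.dumps(reading),))
    db.commit()

def flush() -> None:
    """Forward buffered readings in order, stopping at the first failure."""
    rows = db.execute("SELECT id, payload FROM buffer ORDER BY id").fetchall()
    for row_id, payload in rows:
        try:
            requests.post(INGEST_URL, json=json.loads(payload), timeout=5).raise_for_status()
        except requests.RequestException:
            return  # no signal: keep the rest buffered, retry on the next flush
        db.execute("DELETE FROM buffer WHERE id = ?", (row_id,))
        db.commit()  # delete only after confirmed delivery

record({"device": "bp-monitor-01", "systolic": 122, "diastolic": 81})
flush()
```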

Mazur showed the screens used by healthcare providers (with English annotations, since their system is in Polish) as well as some of the underlying architecture and process models implemented in PNMSoft, such as the admitting interview and specialist referrals process for patients, as well as coordination of physician and specialist visits, plus home medical equipment rental and even remote configuration through remote monitoring capabilities. He also showed a live demo of the system, highlighting features such as alarms that appear when patient data falls outside of normal boundaries; they are integrating third-party and open-source tools, such as Google charting, to bring data directly into their dashboards. He also discussed how other devices can be paired to the systems using Bluetooth; I assume that this means that a consumer healthcare device could be used as an auxiliary measurement device, although manufacturers of these devices are quick to point out that they are not certified healthcare devices in order to absolve themselves of responsibility for bad data.
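
Those out-of-bounds alarms come down to simple range checks on incoming readings. A sketch with the boundary values invented (real clinical limits would be per-patient and clinician-configured):

```python
# Threshold alarm sketch for patient vital signs (boundary values invented).
NORMAL_RANGES = {"heart_rate": (50, 110), "spo2": (92, 100), "systolic_bp": (90, 160)}

def alarms(reading: dict) -> list:
    """Return a description of every measurement outside its normal range."""
    out = []
    for measure, value in reading.items():
        low, high = NORMAL_RANGES.get(measure, (float("-inf"), float("inf")))
        if not low <= value <= high:
            out.append(f"{measure}={value} outside [{low}, {high}]")
    return out

print(alarms({"heart_rate": 128, "spo2": 95}))
# -> ['heart_rate=128 outside [50, 110]']
```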

He wrapped up with lessons that they learned from the project, which sound much like those from many other BPM projects: use model-driven Agile development (with PNMSoft, in their case), and work closely with key stakeholders. However, the IoT aspect adds complexity, and they learned some key lessons around that, too: start device integration sooner, and allow 20-30% of the time for testing. They developed a list of best practices for similar projects, including extending business applications to mobile devices, and working in parallel on applications, device integration and reporting.

We wrapped up with an audience Q&A, although there were many more questions than we had time for. One of the more interesting ones was around automated decisioning: they are not doing any of that now, just alerting that allows people to make decisions or kick off processes, but this work lays the foundation for learning what can be automated without risk in the future. Both patients and healthcare providers are accepting the new technology, and the healthcare providers in particular find that it is making their processes more efficient (reducing administration) and transparent.

Great webinar. It will be available on demand from the resources section on PNMSoft’s website within a few days.

Update: PNMSoft published the recording on their YouTube channel within a couple of hours. No registration required!

The Enterprise Digital Genome with Quantiply at BPMCM15

“An operating system for a self-aware quantifiable predictive enterprise” definitely gets the prize for the most intriguing presentation subtitle, for an afternoon session that I went to with Surendra Reddy and David Chaney from Quantiply (a stealth startup that has just publicly launched), and their customer, a discount brokerage service whose name I have been requested to remove from this post.

Said customer has some significant event data challenges, with a million customers and 100,000 customer interactions per day across a variety of channels, and five billion log messages generated every day across all of their product systems and platforms. Having this data exist in silos with no good aggregation tools means fragmented and poor customer support, and also significant challenges in system and internal support.

To address these types of heterogeneous data analysis problems, Quantiply has a two-layer tool: Edge Cloud for the actual data analysis, which can then be exposed to different roles based on access control (business users, operational users, data scientists, etc.); and Pulse for connecting to various data sources including data warehouses, transactional databases, BPM systems and more. It appears that they’re using some sort of dimensional fact model, which is fairly standard data warehouse analytics, but their Pulse connectors allow them to pour in data on a near-real-time basis, then make the connections between capabilities and services to enable fast problem resolution on their critical trading platforms. Because of the nature of the graph connectivity that they’re deriving from the data sources, they’re able to not only resolve a problem by drilling down, but also determine which customers were impacted by the problem in order to follow up. In response to a question, the customer said that they had used Splunk and other log analytics tools, but that this was “not Splunk”, in terms of both the real-time nature and the front-end user experience, plus deeper analytical capabilities such as long-term interaction trending. In some cases, the Quantiply representation is sufficient analysis; in other cases, it’s a starting point for a data scientist to dig in and figure out some of the more complex correlations in the data.
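
The presentation didn’t include implementation detail, but the graph-connectivity idea (drill down from a failing component, then fan out to everything that consumes it) can be pictured as a traversal over a dependency graph. A toy sketch with entirely hypothetical services:

```python
# Toy impact analysis over a dependency graph (hypothetical data; Quantiply's
# actual model is far richer and built from live connector feeds).
from collections import deque

# edge A -> B means "A depends on B"
depends_on = {
    "web-trading-app": ["order-service"],
    "mobile-app": ["order-service"],
    "order-service": ["trade-db"],
}

def impacted_by(failed: str) -> set:
    """BFS upward from a failed node to every consumer that it affects."""
    consumers = {}
    for node, deps in depends_on.items():
        for dep in deps:
            consumers.setdefault(dep, []).append(node)
    seen, queue = set(), deque([failed])
    while queue:
        for consumer in consumers.get(queue.popleft(), []):
            if consumer not in seen:
                seen.add(consumer)
                queue.append(consumer)
    return seen

print(impacted_by("trade-db"))
# -> {'order-service', 'web-trading-app', 'mobile-app'}: the customers to follow up with
```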

There was a lot of detail in the presentation about the capabilities of the platform and what the customer is doing with it, and the benefits that they’re seeing; there’s not a lot of information on the Quantiply website since they’re just publicly launching.

Update: The original version of this post included the name of the customer and their representative. Since this was a presentation at a public conference with no NDA or confidentiality agreements in place, not even a verbal request at any time during the session, I live-blogged as usual. A day later, the vendor, under pressure from the customer’s PR group, admitted that they did not have clearance to have this customer speak publicly, which is a pretty rookie mistake on their part, although it lines up with my general opinion on their social media skills. As a favor to the conference organizers, who put a lot of effort into making a great experience for all of us, I’ve decided to remove the customer’s name from this post. I’m sure that those of you who really want to know it won’t have any trouble finding it, because of this thing called “the internet”.

Wearable Workflow by @wareFLO at BPMCM15

Charles Webster gave a breakout session on wearable workflow, looking at some practical examples of combining wearables — smart glasses, watches and even socks — with enterprise processes, allowing people wearing these devices to have device events integrated directly into their work without having to break to consult a computer (or at least a device that self-identifies as a computer). Webster is a doctor, and has a lot of great case studies in healthcare, such as detecting when a healthcare worker hasn’t washed their hands before approaching a patient by instrumenting the soap dispenser and the worker. Interestingly, the technology for the hand hygiene project came from smart dog collars, and we’re now seeing devices such as Intel’s Curie that are making this much more accessible by combining sensors and connectivity as we commercialize the internet of things (IoT).
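
Webster didn’t walk through the detection logic, but instrumenting both the soap dispenser and the worker comes down to correlating two event streams within a time window. A rough sketch, with the window length and event shapes assumed:

```python
# Hand-hygiene event correlation sketch (window length and event shapes are my
# assumptions, not Webster's actual system).
WASH_VALID_FOR = 120  # seconds a dispenser event "counts" before a patient approach

last_wash = {}  # worker_id -> timestamp of that worker's last dispenser event

def on_dispenser_event(worker_id: str, ts: float) -> None:
    last_wash[worker_id] = ts

def on_patient_approach(worker_id: str, ts: float) -> bool:
    """Return True if compliant; otherwise fire a reminder to the worker."""
    compliant = ts - last_wash.get(worker_id, float("-inf")) <= WASH_VALID_FOR
    if not compliant:
        print(f"reminder: {worker_id} approached a patient without a recent handwash")
    return compliant

on_dispenser_event("nurse-17", ts=1000.0)
on_patient_approach("nurse-17", ts=1060.0)  # within the window -> compliant
on_patient_approach("nurse-17", ts=1300.0)  # stale -> reminder fires
```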

He was an early adopter of Google Glass, and talked to us about the experience of having a wearable integrated into his lifestyle, such as for voice-controlled email and photography, plus some of his ideas for Google Glass in healthcare workflows, where electronic health records (EHR) and other device information can be integrated with work patterns. Google Glass, however, was not a commercial success: it was too bulky and geeky-looking, and required frequent recharging under heavy use. It needs more miniaturization before most people would consider it, but that’s a matter of time, and probably a short amount of time, especially if the electronics are integrated directly into eyeglass frames, which likely have a lot of unused volume that could be filled with components.

Webster talked about a university curriculum for healthcare technology and IoT that he designed, which would include the following courses:

  • Wearable human factors and workflow ergonomics
  • Data and process mining of wearable data, since wearables generate so much more interesting data that needs to be analyzed and correlated
  • Designing and prototyping wearable products

He is working on a prototype for a 3D-printed, Arduino-based wearable interactive robot, MrRIMP, intended to be used by pediatric healthcare professionals to amuse and distract their young patients during medical examinations and procedures. He showed us a video of him and MrRIMP interacting, and the different versions that he’s created. Great ideas about IoT, wearables and healthcare.

SapphireNow 2015 Day 2 Keynote with Bernd Leukert

The second day of SAP’s SAPPHIRENOW conference started with Bernd Leukert discussing how some customers’ employees worry about being disintermediated by the digital enterprise, but also how the digital economy can be used to accentuate the promise of your original business, making your customers happier without spending the same amount of time (and, hopefully, money) on enterprise applications. It’s not just about changing technologies but about changing business models and leveraging business networks to address the changing world of business. All true, but I still see a lot of resistance to the digital enterprise in large organizations, with both mid-level management and front-line workers feeling threatened by new technologies and business models until they can see how these can benefit them.

Although Leukert is on the stage, the real star of the show is S/4HANA: the new generation of their Business Suite ERP solutions, based natively on the in-memory HANA data and transaction engine for faster processing, a simplified data model for easier analytics and faster reconciliation, and a new user interface built with their Fiori user experience platform. With the real-time analytical capabilities of HANA, including non-SAP as well as S/4HANA data from finances and logistics, they are moving from being just a system of record to a full decision support system. We saw a demo of a manufacturing scenario, walking through a large order process in which a combination of financial and logistics data was presented in real time to make recommendations on how to deal with a shortage in fulfilling an order. Potential solutions — in this case, moving stock allocated from one customer to another, higher-priority customer — are presented with a predicted financial score, allowing the user to select one of the options. Nice demo of analytics and financial predictions directly integrated with order processing.

The new offering is modular, with additional plug-ins for their other products such as Concur and SuccessFactors to enhance the suite capabilities. It runs both in the cloud and on-premise. There are lots of reasons to transition, but this type of new functionality requires significant work to adopt the new programming model: on SAP’s side in building the new platform, and on the customers’ side in refactoring their applications to take advantage of the new features. This will likely take months, if not years, to reach widespread adoption by customers that have highly customized solutions (isn’t that all of them?), in spite of the obvious advantages. As we have seen with other vendors who completely re-architect their product, new customers are generally very happy to start on the new platform, but existing customers can take years even when there is a certified migration path. However, since they launched in February, 400 customers have committed to S/4HANA, and they are now supporting all 25 industries that they serve.

As we saw last year, SAP is pushing existing customers to first migrate to HANA as the underlying database in their existing systems (typically displacing Oracle), which is a non-trivial but straightforward operation that is likely to improve performance; then, to reconsider whether the customizations in their current system are handled out of the box with S/4HANA or can be easily re-implemented based on the simpler data model and more functional capabilities. Sounds good, and I imagine that they will get a reasonable share of their existing customers to make the first step and migrate to HANA, but the second step starts to look more like a new implementation than a simple migration, which will scare off a lot of customers. Leukert invited a representative from their customer Asian Paints to the stage to talk about their migration: they have moved to HANA and the simplified finance core functionality, and are still working on implementing the simplified logistics and other modules, with a vision to be completely on S/4HANA soon. A good success story, but indicative of the length of time and amount of work required to migrate. For them, definitely worth the trip, since they have been able to re-imagine their business model to reach new markets through a better understanding of their customers and their own business data.

He moved on to talk about the HANA Cloud Platform (HCP), a general-purpose application development platform that can be used to build applications unrelated to SAP applications, or to build extensions to SAP functionality. He mentioned an E&Y application built on HCP for fraud detection that is directly integrated with core SAP solutions, just one of 1,000 or more third-party applications available on the HCP marketplace. HCP provides structured and unstructured data models, geospatial and predictive capabilities, the Fiori UX platform as a service, mobile support, an analytics portfolio, and integration layers that connect directly to your business, both on the device side through IoT events and into the operational business systems. With the big IoT push that we saw in the panel yesterday, Siemens has selected HCP as their cloud platform for IoT: the Siemens Cloud for Industry. Peter Weckesser of Siemens joined Leukert on stage to talk more about this newly-launched platform, and how it can be added to their customer installations as a monitoring (not control) layer: remote devices, such as sensors on manufacturing equipment, push their event streams to the Siemens cloud (based on HCP) in public, hybrid or on-premise configurations; analytics can then be applied for predictive maintenance scheduling as well as aggregate operational optimization.

We saw a demo based on the CenterPoint IoT example from yesterday’s panel, showing monitoring and maintenance of energy distribution networks: tracking the health of transformers, grid storage and other devices, and identifying equipment failures, sometimes before they even happen. CenterPoint already has 100,000 sensors out in the field, and since this is integrated with S/4HANA, it’s not just monitoring: an operator can trigger a work order directly from the predictive equipment maintenance analytics dashboard.
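
The demo didn’t expose the analytics internals, but “identifying failures before they happen” often starts with something as simple as flagging drift from a sensor’s recent baseline. A rolling z-score sketch, with the window, threshold and readings all invented:

```python
# Rolling z-score anomaly flag for a transformer temperature feed
# (illustrative only; window, threshold and readings are invented).
from collections import deque
from statistics import mean, stdev

WINDOW, Z_LIMIT = 50, 3.0
history = deque(maxlen=WINDOW)

def check(reading: float) -> bool:
    """Flag a reading that drifts far from the recent baseline."""
    anomalous = False
    if len(history) >= 10 and stdev(history) > 0:
        z = (reading - mean(history)) / stdev(history)
        anomalous = abs(z) > Z_LIMIT
    history.append(reading)
    return anomalous  # in the demo, this is where the work order gets triggered

for temp in [61.0, 60.5, 61.2, 60.8, 61.1, 60.9, 61.3, 60.7, 61.0, 60.6, 79.4]:
    if check(temp):
        print(f"anomaly at {temp} °C: open a maintenance work order")
```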

Leukert touched on the HANA roadmap, with the addition of Hadoop and SPARK Cluster Manager to handle infinite volumes of data, then welcomed Walmart CIO Karenann Terrell to discuss what it’s like to handle a really large HANA implementation. Walmart serves 250 million customers per week through 11,000 locations with 2.2 million employees, generating a lot of data just in their daily operations: literally trillions of financial transactions. Because technology is so core to managing this well, she pointed out that Walmart is creating a technology company in the middle of the world’s largest retail company, which allows them to stay focused on the customer experience while reducing costs. Their supply chain is extensive, since they are directly plugged into many of their suppliers, and innovating along that supply chain has driven them to partner with SAP more closely than most other customers. HANA allows them to have 5,000 people hitting data stores of a half-billion records simultaneously with sub-second response time to provide a real-time view of their supply chain, making them a true data-driven retailer and shooting them to the top of yesterday’s HANA Innovation Awards. She finished by saying that seeing S/4HANA implemented at Walmart in her lifetime is on her bucket list, which got a good laugh from the audience but highlighted the fact that this is not a trivial transition for most companies.

Leukert finished with an invitation — or maybe it was a challenge — to use S/4HANA and HCP to reinvent your business: “clean your basement” to remove unnecessary customization in your current SAP solutions, or convert it to the HCP or S/4HANA extension platforms; change your business model to become more data-driven; and leverage business networks to expand the edges of your value chain. Thrive, don’t just survive.

Steve Singh, CEO of Concur (acquired by SAP last December), then took over to look at reinventing the employee travel experience, from booking through trip logistics to expense reporting. For companies with large numbers of traveling employees, managing travel can be a serious headache from both a logistics and a financial standpoint. Concur does this by creating a business network (or a network of networks) that directly integrates with suppliers — such as airlines and car rental companies — for booking and direct invoice capture, plus easy functions for inputting travel expenses that are not captured directly from the supplier. I heard comments yesterday that SAP already has travel and expense management, and although Concur’s functionality there is likely a bit better, the networks that they bring are the real prize here. The networks, for example, allow for managing the extraction of an employee who finds themselves in a disaster or other dangerous travel scenario, and become part of a broader human resources risk management strategy.

At the press Q&A later, Leukert fielded questions about how they have simplified the complete core of their ERP solution in terms of data model and functionality but still have work to do for some industry modules: although all 25 industries are supported as of now in the on-premise version, they need to do a bit of tinkering under the hood and do additional migration for the cloud version. They’re also still working on the cloud version of everything, and are recommending the HCM and CRM standalone products if the older Business Suite versions don’t meet requirements. In other words, it’s not done yet, although core portions are fully functional. Singh talked about the value of business networks such as Ariba in changing business models, and sees that products such as Concur using HCP and the SAP business networks will help drive broader adoption.

There was a question on the ROI for migration to S/4HANA: it’s supposed to run 1,800 times faster than previous versions, but customers may not see much (if any) cost savings, opening things up to competitive displacement. I heard this same sentiment from some customers last night at the HANA Innovation Awards reception; since there is little or no cost reduction in terms of license and deployment costs, they need to make the case based on the additional capabilities that HANA enables, such as real-time analytics and predictions that allow companies to run their businesses differently, and a longer-term reduction in IT complexity and maintenance costs. Since a lot of more traditional companies don’t yet see the need to change their business models, this can be a hard sell, but eventually most companies will need to come around to the need for real-time insights and actions.

IoT Solutions Panel at SapphireNow 2015

Steve Lucas, president of platform solutions at SAP, led a panel on the internet of things at SAPPHIRENOW 2015. He kicked off with some of their new IoT announcements: SAP HANA Cloud Platform (HCP) for IoT with free access to SAP SQL Anywhere embeddable database for edge intelligence; a partner ecosystem that includes Siemens and Intel; and customer success stories from Tennant and Tangoe. Their somewhat complex marketecture diagram shows a fairly comprehensive IoT portfolio that includes connecting to people and things at the edges of your value chain, and integrating the events that they generate to optimize your core corporate planning and reporting, providing real-time insights and automated decisioning. The cloud platform is key to enabling this, since it provides the fabric that weaves all of the data, actions, rules and decisions into a single connected enterprise.

SAP IoT marketecture

He was joined on stage by Austin Swope, who demonstrated remote equipment monitoring using a tiny but operational truck on the stage, complete with onboard sensors that pushed events and data to the cloud for remote monitoring and problem detection. We saw some of the real-time analytics (when the wifi cooperated) on-screen while the truck ran around the stage, and some of the other types of dashboards and analytics that would be used for broader equipment management programs. Since the equipment is now fully instrumented, analytics can be used to visualize and optimize operations: reducing costs, improving maintenance cycles, and increasing equipment load factors through a better understanding of what each piece of equipment is doing at any given time.

Next, Lucas was joined by Gary Hayes, CIO of CenterPoint Energy; Paul Wellman, CIO of Tennant; and Peter Weckesser, CEO Customer Service, Digital Factory at Siemens. Hayes talked about how CenterPoint is using smart meters, grid storage, digital distribution networks and other IoT-enabled technologies to drastically reduce costs and improve service, while maintaining safety and security standards. They’re starting to use predictive analytics on HANA to model and predict underground cable failures, among several other innovations in intelligent energy management. Wellman discussed how Tennant, which has fleets of large-scale cleaning machines such as you would see in conference centers and airports, has added telemetry to provide machine monitoring and predictive maintenance, and exposes this information to customers so that they can understand and reduce costs themselves through fleet management and usage. Last up, Weckesser talked about how Siemens devices (of which there are millions out there in a variety of industrial applications) generate events that can be analyzed to optimize industrial plants and machines as well as energy and resources. As an SAP partner, Siemens is offering an open cloud platform for industry customers based on HANA; customers can easily connect their existing Siemens devices to the Siemens Cloud for Industry apps via public cloud, private cloud or on-premise infrastructure. This allows them to do analysis for predictive maintenance on individual machines, as well as aggregate fleet operations optimization, through apps provided by Siemens, SAP, SAP partners or the customers themselves.

I was disappointed not to see the SAP Operational Process Intelligence offering involved in this discussion: it seems a natural fit since it can be used to monitor events and control processes from a variety of underlying systems and sources, including event data in HANA. However, good to see that SAP is providing some real-world examples of how they are supporting their customers’ and partners’ IoT efforts through the HANA Cloud Platform.