Category Archives: IoT

Internet of Things

Consumer IoT potential: @ZoranGrabo of @ThePetBot has some serious lessons on fun

I’m back for a couple of sessions on the second day of Big Data Toronto, and just attended a great session by Zoran Grabovac of PetBot on the emerging markets for consumer IoT devices. His premise is that creating success with IoT devices is based on saving/creating time, strengthening connections, and having fun.

It also helps to be approaching an underserved market, and if you believe his somewhat horrifying stat that 70% of pet owners consider themselves to be “pet parents”, there’s a market with people who want to interact with and entertain their pets with technology while they are gone during working hours. PetBot’s device gives you a live video feed of your pet remotely, but can also play sounds, drop treats (cue Pavlov) and record pet selfies using facial recognition to send to you while you’re out. This might seem a bit frivolous, but his lessons hold up: use devices to “create” time (allowing for interaction at times when you would not normally be available), make your own type of interactions (e.g., create a training regimen using voice commands), and have fun to promote usage retention (who doesn’t like cute pet selfies?).
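
PetBot hasn’t published how the selfie feature is implemented, but the core idea – only snap a photo when a pet’s face is in frame – is easy to illustrate. Here’s a minimal sketch using OpenCV’s stock cat-face detector; the camera index and the 30-second cooldown are my own assumptions:

```python
# Hypothetical sketch of a "pet selfie" loop: save a frame whenever a cat
# face is detected, then pause so the owner isn't flooded with duplicates.
# This illustrates the concept only; it is not PetBot's actual code.
import time
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalcatface.xml")
camera = cv2.VideoCapture(0)  # assumed: first attached camera

while True:
    ok, frame = camera.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) > 0:
        cv2.imwrite(f"pet_selfie_{int(time.time())}.jpg", frame)
        time.sleep(30)  # cooldown between selfies
```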

I asked about integrating with pet activity trackers and he declined to comment, so we might see something from them on this front; other audience members asked about the potential for learning and recognition algorithms that could automatically reward specific behaviours. I’m probably not going to run out and get a PetBot – it seems much more suited for dogs than cats – but his insights into consumer IoT devices are valid across a broader range of applications.

Pegaworld 2016 Day 1 Keynote: Pega direction, Philips and Allianz

It seems like I was just here in Vegas at the MGM Grand…oh, wait, I *was* just here. Well, I’m back for Pegaworld 2016, and 4,000 of us congregated in the Grand Garden Arena for the opening keynote on the first day. If you’re watching from home, or want to catch a replay, there is a live stream of the keynotes that will likely feature an on-demand replay at some point.

Alan Trefler, Pega’s CEO, kicked things off by pointing out the shift from a focus on technology to a focus on the customer. Surveys show that although most companies think that they understand their customers, the customers don’t agree; companies need to undergo a serious amount of digital transformation in order to provide the level of service that today’s customers need, while still improving efficiencies to support that experience. One key to this is a model-driven technology environment that incorporates insights and actions, allowing the next best action to be provided at any given point depending on the current context, while supporting organizational evolution to allow constant change to meet future demands. Model-driven environments let you create applications that are future-proof, since it is relatively quick to make changes to the models without changing a lot of code. Pega has a lot of new online training at the Pega Academy, a marketplace of third-party Pega applications at the Pega Exchange, and the continuing support of their easy-to-use Pega Express modeler; they continue to work on breaking free from their tech-heavy past to support more agile digital transformation. Pega recently sponsored an Economist report on digital transformation; you can grab that here.

Don Schuerman, Pega’s CTO, took over as MC for the event to introduce the other keynote speakers, but first announced a new partnership with Philips that links Pega’s care management package with Philips’ HealthSuite informatics and cloud platform for home healthcare. Jeroen Tas, CEO of Connected Care & Health Informatics at Philips, presented more on this, specifically in the context of the inefficient and unevenly-distributed US healthcare system. He had a great chart that showed the drivers for healthcare transformation: from episodic to continuous, by orchestrating 24/7 care; from care provider to human-centric, by focusing on patient experience; from fragmented to connected, by connecting patients and caregivers; and from volume to value, by optimizing resources. Connected, personalized care links healthy living to disease prevention, and supports proper diagnosis and treatment since healthcare providers all have access to a comprehensive set of the patient’s information. He showed lots of cool personal healthcare devices, such as ultrasound-as-a-service, where they will ship a device that can be plugged into a tablet to allow your GP to do scans that might normally be done by a specialist; continuous glucose meters and insulin regulation; and tools to monitor elderly patients’ medications. Care costs can be reduced by 26% and readmissions by 52% through active monitoring in networked care delivery environments, such as monitoring heart patients for precursors of a heart attack; this requires a combination of IoT, personal health data, data analytics and patient pathways provided by Philips and Pega. He ended by stating that it’s a great time to be in healthcare, and that there are huge benefits for patients as well as healthcare providers.

Although Tas didn’t discuss this aspect, there’s a huge amount of fear of connected healthcare information in user-pay healthcare systems: people are concerned that they will be refused coverage if their entire health history is known. Better informatics and analysis of healthcare information improves health and reduces overall healthcare costs, but it needs to be provided in an environment that doesn’t punish people for exposing their health data to everyone in the healthcare system.

We continued on the healthcare topic, moving to the insurance side with Birgit König, CEO of Allianz Health Germany. Since basic healthcare in Germany is provided by the state, health insurance is for additional services not covered by the basic plan, and for travelers while they are outside Germany. There is a lot of competition in the market, and customer experience for claims is becoming a competitive differentiator, especially with new younger customers. To accommodate this, Allianz is embracing a bimodal architecture approach, where back-end systems are maintained using traditional development techniques that focus on stability and risk, while front-end systems are more agile and innovative with shorter release cycles. I’ve just written a paper on bimodal IT and how it plays out in enterprises; it’s not published yet, but completely aligned with what König discussed. Allianz is using Pega for more agile analytics and decisioning at the front end of their processes, while keeping their back-end systems stable. Innovation and fast development have been greatly aided by co-locating their development and business teams, not surprisingly.

The keynote finished with Kerim Akgonul, Pega’s SVP of Products, giving a high-level product update. He started by looking at the alignment between internal business goals and the customer journey, spanning marketing, sales, customer service and operations. The Pega Customer Decision Hub sits at the middle of these four areas, linking information so that, for example, offers sent to customers are based on their past orders.

  • Marketing: A recent Forrester report stated that Pega Marketing yields an 8x return on marketing investment (ROMI) due to the next-best-action strategies and other smart uses of analytics. Marketers don’t need to be data scientists to create intelligent campaigns based on historical and real-time data, and send those to a targeted list based on filters including geolocation. We saw this in action, with a campaign created in front of us to target Pegaworld attendees who were actually in the arena, then sent out to the recipients via the conference mobile app.
  • Sales: The engagement map in the Pega Sales Automation app uses the Customer Decision Hub information to provide guidance that links products to opportunities for salespeople; we saw how the mobile sales automation app makes this information available and recommends contacts and actions, such as a follow-up contact or training offer. There are also some nice tools such as capturing a business card using the mobile camera and importing the contact information, merging it if a similar record is found.
  • Customer service: The Pega customer service dashboard shows individual customer timelines, but the big customer service news in this keynote is the OpenSpan acquisition that provides robotic process automation (RPA) to improve customer service environments. OpenSpan can monitor desktop work as it is performed, and identify opportunities for RPA based on repetitive actions. The new automation is set up by recording the actions that would be done by a worker, such as copying and pasting information between systems (a rough sketch of this record-and-replay idea follows this list). The example was an address change, where a CSR would take a call from a customer then have to update three different systems with the same information by copying and pasting between applications. We saw the address change being recorded, then played back on a new transaction; this was also included as an RPA step in a Pega Express model, although I’m not sure if that was just to document the process as opposed to any automation driven from the BPM side.
  • Operations: The Pega Field Service application provides information for remote workers doing field support calls, reducing the time required to complete the service while documenting the results and tracking the workers. We saw a short video of Xerox using this in Europe for their photocopier service calls: the field engineer sees the customer’s equipment list, the inventory that he has with him, and other local field engineers who might have different skills or inventory to assist with his call. Xerox has reduced their service call time, improved field engineer productivity, and increased customer satisfaction.
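
Since OpenSpan’s recorder isn’t something I can show here, a toy version of the record-and-replay idea from the customer service bullet above might look like the following; the coordinates and field values are placeholders, and a real RPA tool works at the UI-control level rather than by screen position:

```python
# Minimal sketch of record-and-replay desktop automation: capture a worker's
# actions once as a list of steps, then replay them against each new
# transaction. Illustration only; not how OpenSpan actually works internally.
import pyautogui  # pip install pyautogui

# A "recording" of an address change: each step is (action, arguments).
recording = [
    ("click", (120, 240)),          # focus the address field in system A
    ("hotkey", ("ctrl", "a")),      # select the old value
    ("write", ("42 New Street",)),  # type the new address
    ("hotkey", ("alt", "tab")),     # switch to system B
    ("click", (300, 180)),          # focus the matching field there
    ("write", ("42 New Street",)),
]

def replay(steps):
    """Replay a recorded action list against the current desktop."""
    for action, args in steps:
        getattr(pyautogui, action)(*args)

replay(recording)
```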

Good mix of vision, technology and customer case studies. Check out the replay when it’s available.

BPM and IoT in Home and Hospice Healthcare with @PNMSoft

I listened in on a webinar by Vasileios Kospanos of PNMSoft today about business process management (BPM) and the internet of things (IoT). He started with some basic definitions and origins of IoT – I had no idea that the term was coined back in 1999, which is about the same time that the term BPM came into use – as a part of controls engineering that relied on a lot of smart devices and sensors producing data and responding to remote commands. There are some great examples of IoT in use, including environmental monitoring, manufacturing, energy management, and medical systems, in addition to the more well-known consumerized applications such as home automation and smart cars. Gartner claims that there will be 26B devices on the internet by 2020, which is probably not a bad estimate (and is also driving adoption of the IPv6 addressing standard).

Dominik Mazur from Amedar Consulting Group (a Polish business and technology consulting firm) joined to discuss a case study from one of their healthcare projects, helping to improve the flow of medical information and the operational processes covering home care and hospices – parts of the medical system that are often orphaned from an information-gathering standpoint – tied into their National Health Fund systems. This included integrating the information from various devices used to measure the patients’ vital statistics, and supported processes for admission and discharge from medical care facilities. The six types of special-purpose devices communicate over mobile networks, and can store the data for later forwarding if there is no signal at the point of collection. Doctors and other health care professionals can view the data and participate in remote diagnosis activities or schedule patient visits.
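
The store-and-forward behaviour that Mazur described – buffer readings locally when there’s no mobile signal, then forward the backlog – is a standard pattern; here’s a minimal sketch, with has_signal() and send_reading() standing in for the devices’ real radio stack and collection service (the names are mine, not Amedar’s):

```python
# Sketch of the store-and-forward pattern: buffer vital-sign readings in a
# local SQLite table while offline, flush oldest-first once the mobile
# network comes back. has_signal() and send_reading() are placeholders.
import json
import sqlite3

def has_signal() -> bool:
    return True  # placeholder: query the device's modem

def send_reading(reading: dict):
    print("uplink:", reading)  # placeholder: POST to the collection service

db = sqlite3.connect("readings_buffer.db")
db.execute("CREATE TABLE IF NOT EXISTS buffer (payload TEXT)")

def flush():
    rows = db.execute("SELECT rowid, payload FROM buffer ORDER BY rowid").fetchall()
    for rowid, payload in rows:
        send_reading(json.loads(payload))
        db.execute("DELETE FROM buffer WHERE rowid = ?", (rowid,))
    db.commit()

def submit(reading: dict):
    """Send immediately if possible, otherwise buffer for later forwarding."""
    if has_signal():
        flush()  # drain any backlog first, preserving order
        send_reading(reading)
    else:
        db.execute("INSERT INTO buffer VALUES (?)", (json.dumps(reading),))
        db.commit()

submit({"patient": "p-07", "pulse": 72, "spo2": 97})
```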

Mazur showed the screens used by healthcare providers (with English annotations, since their system is in Polish), plus some of the underlying architecture and process models implemented in PNMSoft: the admitting interview and specialist referral process for patients, coordination of physician and specialist visits, and home medical equipment rental with remote configuration through remote monitoring capabilities. He also showed a live demo of the system, highlighting features such as alarms that appear when patient data falls outside of normal boundaries; they are integrating third-party and open-source tools such as Google charting tools directly into their dashboards. He also discussed how other devices can be paired to the systems using Bluetooth; I assume that this means that a consumer healthcare device could be used as an auxiliary measurement device, although manufacturers of these devices are quick to point out that they are not certified healthcare devices in order to absolve themselves of responsibility for bad data.

He wrapped up with lessons that they learned from the project, which sound much like those from many other BPM projects: use model-driven agile development (PNMSoft, in their case), and work closely with key stakeholders. However, the IoT aspect adds complexity, and they learned some key lessons around that, too: start device integration sooner, and allow 20-30% of project time for testing. They developed a list of best practices for similar projects, including extending business applications to mobile devices, and working in parallel on applications, device integration and reporting.

We wrapped up with an audience Q&A, although there were many more questions than we had time for. One of the more interesting ones was around automated decisioning: they are not doing any of that now, just alerting that allows people to make decisions or kick off processes, but this work lays the foundation for learning what can be automated without risk in the future. Both patients and healthcare providers are accepting the new technology, and the healthcare providers in particular find that it is making their processes more efficient (reducing administration) and transparent.

Great webinar. It will be available on demand from the resources section on PNMSoft’s website within a few days.

Update: PNMSoft published the recording on their YouTube channel within a couple of hours. No registration required!

The Enterprise Digital Genome with Quantiply at BPMCM15

“An operating system for a self-aware quantifiable predictive enterprise” definitely gets the prize for the most intriguing presentation subtitle, for an afternoon session that I went to with Surendra Reddy and David Chaney from Quantiply (a stealth startup that has just publicly launched), and their customer, a discount brokerage service whose name I have been requested to remove from this post.

Said customer has some significant event data challenges, with a million customers and 100,000 customer interactions per day across a variety of channels, and five billion log messages generated every day across all of their product systems and platforms. Having this data exist in silos with no good aggregation tools means fragmented and poor customer support, and also significant challenges in system and internal support.

To address these types of heterogeneous data analysis problems, Quantiply has a two-layer tool: Edge Cloud for the actual data analysis, which can then be exposed to different roles based on access control (business users, operational users, data scientists, etc.); and Pulse for connecting to various data sources including data warehouses, transactional databases, BPM systems and more. It appears that they’re using some sort of dimensional fact model – fairly standard data warehouse analytics – but their Pulse connectors allow them to pour in data on a near-real-time basis, then make the connections between capabilities and services to be able to do fast problem resolution on their critical trading platforms. Because of the nature of the graph connectivity that they’re deriving from the data sources, they’re able to not only resolve the problem by drilling down, but also determine which customers were impacted by the problem in order to follow up. In response to a question, the customer said that they had used Splunk and other log analytics tools, but that this was “not Splunk”, in terms of both the real-time nature and the front-end user experience, plus deeper analytical capabilities such as long-term interaction trending. In some cases, the Quantiply representation is sufficient analysis; in other cases, it’s a starting point for a data scientist to dig in and figure out some of the more complex correlations in the data.
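
Quantiply didn’t show code, but the graph-based impact analysis they described can be sketched generically: model platforms, services and customers as a dependency graph, then walk downstream from a failed component. The nodes and edges below are invented for illustration:

```python
# Generic sketch of graph-based impact analysis: walk downstream from a
# failed component to find affected customers. Nodes and edges are made up.
import networkx as nx

g = nx.DiGraph()
g.add_edges_from([
    ("trading-db", "order-service"),
    ("order-service", "web-channel"),
    ("order-service", "mobile-channel"),
    ("web-channel", "customer:1001"),
    ("mobile-channel", "customer:1002"),
])

def impacted_customers(failed_component: str):
    """Everything reachable from the failure, filtered to customer nodes."""
    return sorted(n for n in nx.descendants(g, failed_component)
                  if n.startswith("customer:"))

print(impacted_customers("trading-db"))  # ['customer:1001', 'customer:1002']
```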

There was a lot of detail in the presentation about the capabilities of the platform and what the customer is doing with it, and the benefits that they’re seeing; there’s not a lot of information on the Quantiply website since they’re just publicly launching.

Update: The original version of this post included the name of the customer and their representative. Since this was a presentation at a public conference with no NDA or confidentiality agreements in place, not even a verbal request at any time during the session, I live-blogged as usual. A day later, the vendor, under pressure from the customer’s PR group, admitted that they did not have clearance to have this customer speak publicly, which is a pretty rookie mistake on their part, although it lines up with my general opinion on their social media skills. As a favor to the conference organizers, who put a lot of effort into making a great experience for all of us, I’ve decided to remove the customer’s name from this post. I’m sure that those of you who really want to know it won’t have any trouble finding it, because of this thing called “the internet”.

Wearable Workflow by @wareFLO at BPMCM15

Charles Webster gave a breakout session on wearable workflow, looking at some practical examples of combining wearables — smart glasses, watches and even socks — with enterprise processes, allowing people wearing these devices to have device events integrated directly into their work without having to break to consult a computer (or at least a device that self-identifies as a computer). Webster is a doctor, and has a lot of great case studies in healthcare, such as detecting when a healthcare worker hasn’t washed their hands before approaching a patient by instrumenting the soap dispenser and the worker. Interestingly, the technology for the hand hygiene project came from smart dog collars, and we’re now seeing devices such as Intel’s Curie that are making this much more accessible by combining sensors and connectivity as we commercialize the internet of things (IoT).
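
Webster didn’t detail the hand-hygiene logic, but the event correlation at its heart is simple to sketch: track each worker’s last dispenser event and flag any patient approach that isn’t preceded by a recent wash. The event names and the five-minute window below are my assumptions, not his design:

```python
# Hypothetical hand-hygiene correlation: flag a patient approach that isn't
# preceded by a soap-dispenser event from the same worker within a window.
from datetime import datetime, timedelta

WASH_VALID_FOR = timedelta(minutes=5)  # assumed window
last_wash: dict[str, datetime] = {}   # worker id -> time of last wash

def on_event(worker: str, kind: str, at: datetime):
    if kind == "soap_dispensed":
        last_wash[worker] = at
    elif kind == "patient_approach":
        washed = last_wash.get(worker)
        if washed is None or at - washed > WASH_VALID_FOR:
            print(f"ALERT: {worker} approached a patient without washing")

on_event("rn-17", "soap_dispensed", datetime(2015, 6, 23, 9, 0))
on_event("rn-17", "patient_approach", datetime(2015, 6, 23, 9, 2))   # fine
on_event("rn-17", "patient_approach", datetime(2015, 6, 23, 9, 30))  # alert
```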

He was an early adopter of Google Glass, and talked to us about the experience of having a wearable integrated into his lifestyle, such as for voice-controlled email and photography, plus some of the ideas that he has for Google Glass in healthcare workflows, where electronic health records (EHR) and other device information can be integrated with work patterns. Google Glass, however, was not a commercial success, since it was too bulky and geeky-looking, as well as requiring frequent recharging with heavy use. It requires more miniaturization to be considered a possibility for most people, but that’s a matter of time, and probably a short amount of time, especially if the electronics are integrated directly into eyeglass frames, which likely have a lot of unused volume that could be filled with electronic components.

Webster talked about a university curriculum for healthcare technology and IoT that he designed, which would include the following courses:

  • Wearable human factors and workflow ergonomics
  • Data and process mining wearable data, since wearables generate so much more interesting data that needs to be analyzed and correlated
  • Designing and prototyping wearable products

He is working on a prototype for a 3D-printed, Arduino-based wearable interactive robot, MrRIMP, intended to be used by pediatric healthcare professionals to amuse and distract their young patients during medical examinations and procedures. He showed us a video demo of him and MrRIMP interacting, and the different versions that he’s created. Great ideas about IoT, wearables and healthcare.

SapphireNow 2015 Day 2 Keynote with Bernd Leukert

The second day of SAP’s SAPPHIRENOW conference started with Bernd Leukert discussing how some customers’ employees worry about being disintermediated by the digital enterprise, but also how the digital economy can be used to accentuate the promise of your original business to make your customers happier without spending the same amount of time (and hopefully, money) on enterprise applications. It’s not just about changing technologies, but about changing business models and leveraging business networks to address the changing world of business. All true, but I still see a lot of resistance to the digital enterprise in large organizations, with both mid-level management and front-line workers feeling threatened by new technologies and business models until they can see how these can be of benefit to them.

Although Leukert is on the stage, the real star of the show is S/4HANA: the new generation of their Business Suite ERP solutions based natively on the in-memory HANA data and transaction engine for faster processing, a simplified data model for easier analytics and faster reconciliation, and a new user interface with their Fiori user experience platform. With the real-time analytical capabilities of HANA, including non-SAP as well as S/4HANA data from finances and logistics, they are moving from being just a system of record to a full decision support system. We saw a demo of a manufacturing scenario, walking through a large order where a combination of financial and logistics data is presented in real time for making recommendations on how to deal with a shortage in fulfilling the order. Potential solutions — in this case, moving stock allocated from one customer to another higher-priority customer — are presented with a predicted financial score, allowing the user to select one of the options. Nice demo of analytics and financial predictions directly integrated with order processing.
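
SAP didn’t expose the scoring model behind the demo, but the decision step – attach a predicted financial score to each way of covering the shortage and present the options ranked – can be sketched with invented numbers:

```python
# Invented numbers illustrating the demo's decision step: score each way of
# covering the shortage and present the options ranked by predicted impact.
candidates = [
    {"option": "reallocate from customer A", "margin_gain": 120_000, "penalty_risk": 40_000},
    {"option": "expedite from plant 2",      "margin_gain": 120_000, "penalty_risk": 65_000},
    {"option": "partial shipment",           "margin_gain": 70_000,  "penalty_risk": 10_000},
]

for c in candidates:
    c["score"] = c["margin_gain"] - c["penalty_risk"]  # toy financial score

for c in sorted(candidates, key=lambda c: c["score"], reverse=True):
    print(f'{c["option"]:30s} predicted impact: {c["score"]:>9,}')
```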

The new offering is modular, with additional plug-ins for their other products such as Concur and SuccessFactors to enhance the suite capabilities. It runs in the cloud and on-premise. There are lots of reasons to transition, but this type of new functionality requires significant work to adopt the new programming model: both on SAP’s side in building the new platform, and on the customers’ side in refactoring their applications to take advantage of the new features. This will likely take several months, if not years, for widespread adoption by customers that have highly customized solutions (isn’t that all of them?), in spite of the obvious advantages. As we have seen with other vendors who completely re-architect their product, new customers are generally very happy with starting on the new platform, but existing customers can take years even when there is a certified migration path. However, since the launch in February, 400 customers have committed to S/4HANA, and it now supports all 25 industries that SAP serves.

As we saw last year, SAP is pushing existing customers to first migrate to HANA as the underlying database in their existing systems (typically displacing Oracle), which is a non-trivial but straightforward operation that is likely to improve performance; then, to reconsider whether the customizations in their current system are handled out of the box with S/4HANA or can be easily re-implemented based on the simpler data model and more functional capabilities. Sounds good, and I imagine that they will get a reasonable share of their existing customers to make the first step and migrate to HANA, but the second step starts to look more like a new implementation than a simple migration, which will scare off a lot of customers. Leukert invited a representative from their customer Asian Paints to the stage to talk about their migration: they have moved to HANA and the simplified finance core functionality, and are still working on implementing the simplified logistics and other modules, with a vision to soon be completely on S/4HANA. A good success story, but indicative of the length of time and amount of work required to migrate. For them, it was definitely worth the trip, since they have been able to re-imagine their business model to reach new markets through a better understanding of their customers and their own business data.

He moved on to talk about the HANA Cloud Platform (HCP), a general-purpose application development platform that can be used to build applications unrelated to SAP applications, or to build extensions to SAP functionality. He mentioned an E&Y application built on HCP for fraud detection that is directly integrated with core SAP solutions, which is just one of 1,000 or more third-party applications available on the HCP marketplace. HCP provides structured and unstructured data models, geospatial, predictive, Fiori UX platform as a service, mobile support, analytics portfolio, and integration layers that provide direct connection to your business both on the device side through IoT events and into the operational business systems. With the big IoT push that we saw in the panel yesterday, Siemens has selected HCP as their cloud platform for IoT: the Siemens Cloud for Industry. Peter Weckesser of Siemens joined Leukert on stage to talk more about this newly-launched platform, and how it can be added to their customer installations as a monitoring (not control) layer: remote devices, such as sensors on manufacturing equipment, push their event streams to the Siemens cloud (based on HCP) in public, hybrid or on-premise configurations; analytics can then be applied for predictive maintenance scheduling as well as aggregate operational optimization.

We saw a demo based on the CenterPoint IoT example at the panel yesterday, showing monitoring and maintenance of energy distribution networks: tracking the health of transformers, grid storage and other devices, and identifying equipment failures, sometimes before they even happen. CenterPoint already has 100,000 sensors out in the field, and since this is integrated with S/4HANA, it is not just monitoring: an operator can trigger a work order directly from the predictive equipment maintenance analytics dashboard.
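
The monitoring-to-action link is the interesting part here: a risk score computed from sensor readings crosses a threshold and opens a work order. A toy version, with a made-up risk heuristic and a placeholder create_work_order() (CenterPoint’s actual analytics run on HANA):

```python
# Toy version of the predictive-maintenance trigger: score each transformer
# with an invented risk heuristic and open a work order above a threshold.
# create_work_order() stands in for the integration back to S/4HANA.
RISK_THRESHOLD = 0.8

def failure_risk(temp_c: float, load_pct: float) -> float:
    """Invented risk score in [0, 1]; a real model would be fit to history."""
    return min(1.0, max(0.0, 0.02 * (temp_c - 60) + 0.01 * (load_pct - 50)))

def create_work_order(asset_id: str, risk: float):
    print(f"work order opened for {asset_id} (risk={risk:.2f})")

for asset_id, temp_c, load_pct in [("xfmr-114", 95.0, 88.0),
                                   ("xfmr-115", 62.0, 45.0)]:
    risk = failure_risk(temp_c, load_pct)
    if risk >= RISK_THRESHOLD:
        create_work_order(asset_id, risk)  # only xfmr-114 crosses the line
```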

Leukert touched on the HANA roadmap, with the addition of Hadoop and Spark Cluster Manager to handle infinite volumes of data, then welcomed Walmart CIO Karenann Terrell to discuss what it is like to handle a really large HANA implementation. Walmart serves 250 million customers per week through 11,000 locations with 2.2 million employees, meaning that they generate a lot of data just in their daily operations: literally trillions of financial transactions. Because technology is so core to managing this well, she pointed out that Walmart is creating a technology company in the middle of the world’s largest retail company, which allows them to stay focused on the customer experience while reducing costs. Their supply chain is extensive, since they are directly plugged into many of their suppliers, and innovating along that supply chain has driven them to partner with SAP more closely than most other customers. HANA allows them to have 5,000 people hitting data stores of a half-billion records simultaneously with sub-second response time to provide a real-time view of their supply chain, making them a true data-driven retailer and shooting them to the top of yesterday’s HANA Innovation Awards. She finished by saying that seeing S/4HANA implemented at Walmart in her lifetime is on her bucket list, which got a good laugh from the audience but highlighted the fact that this is not a trivial transition for most companies.

Leukert finished with an invitation — or maybe it was a challenge — to use S/4HANA and HCP to reinvent your business: “clean your basement” to remove unnecessary customization in your current SAP solutions or convert it to HCP or S/4HANA extension platform; change your business model to become more data-driven; and leverage business networks to expand the edges of your value chain. Thrive, don’t just survive.

Steve Singh, CEO of Concur (acquired by SAP last December), then took over to look at reinventing the employee travel experience, from booking through trip logistics to expense reporting. For companies with large numbers of traveling employees, managing travel can be a serious headache from both a logistics and a financial standpoint. Concur does this by creating a business network (or a network of networks) that directly integrates with suppliers — such as airlines and car rental companies — for booking and direct invoice capture, plus easy functions for inputting travel expenses that are not captured directly from the supplier. I heard comments yesterday that SAP already has travel and expense management, and although Concur’s functionality there is likely a bit better, the networks that they bring are the real prize here. The networks, for example, allow for managing the extraction of an employee who finds themselves in a disaster or other dangerous travel scenario, and become part of a broader human resources risk management strategy.

At the press Q&A later, Leukert fielded questions about how they have simplified the complete core of their ERP solution in terms of data model and functionality but still have work to do for some industry modules: although all 25 industries are supported as of now in the on-premise version, they need to do a bit of tinkering under the hood and do additional migration for the cloud version. They’re also still working on the cloud version of everything, and are recommending the HCM and CRM standalone products if the older Business Suite versions don’t meet requirements. In other words, it’s not done yet, although core portions are fully functional. Singh talked about the value of business networks such as Ariba in changing business models, and sees that products such as Concur using HCP and the SAP business networks will help drive broader adoption.

There was a question on the ROI for migration to S/4HANA: it’s supposed to run 1,800 times faster than previous versions, but customers may not be seeing much (if any) savings, opening things up to competitive displacement. I heard this same sentiment from some customers last night at the HANA Innovation Awards reception; since there is little or no cost reduction in terms of license and deployment costs, they need to make the case based on the additional capabilities that HANA enables, such as real-time analytics and predictions, that allow companies to run their businesses differently, and a longer-term reduction in IT complexity and maintenance costs. Since a lot of more traditional companies don’t yet see the need to change their business models, this can be a hard sell, but eventually most companies will need to come around to the need for real-time insights and actions.

IoT Solutions Panel at SapphireNow 2015

Steve Lucas, president of platform solutions at SAP, led a panel on the internet of things at SAPPHIRENOW 2015. He kicked off with some of their new IoT announcements: SAP HANA Cloud Platform (HCP) for IoT with free access to SAP SQL Anywhere embeddable database for edge intelligence; a partner ecosystem that includes Siemens and Intel; and customer success stories from Tennant and Tangoe. Their somewhat complex marketecture diagram shows a fairly comprehensive IoT portfolio that includes connecting to people and things at the edges of your value chain, and integrating the events that they generate to optimize your core corporate planning and reporting, providing real-time insights and automated decisioning. The cloud platform is key to enabling this, since it provides the fabric that weaves all of the data, actions, rules and decisions into a single connected enterprise.

He was joined on stage by Austin Swope, who demonstrated remote equipment monitoring using a tiny but operational truck on the stage, complete with onboard sensors that pushed events and data to the cloud for remote monitoring and problem detection. We saw some of the real-time analytics (when the wifi cooperated) on-screen while the truck ran around the stage, and some of the other types of dashboards and analytics that would be used for broader equipment management programs. Since the equipment is now fully instrumented, analytics can be used to visualize and optimize operations: reducing costs, improving maintenance cycles, and increasing equipment load factors through a better understanding of what each piece of equipment is doing at any given time.

Next, Lucas was joined by Gary Hayes, CIO of CenterPoint Energy; Paul Wellman, CIO of Tennant; and Peter Weckesser, CEO Customer Service, Digital Factory at Siemens. Hayes talked about how CenterPoint is using smart meters, grid storage, digital distribution networks and other IoT-enabled technologies to drastically reduce costs and improve service, while maintaining safety and security standards. They’re starting to use predictive analytics on HANA to model and predict underground cable failures, among several other innovations in intelligent energy management. Wellman discussed how Tennant, which has fleets of large-scale cleaning machines such as you would see in conference centers and airports, has added telemetry to provide machine monitoring and predictive maintenance, and exposes this information to customers so that they can understand and reduce costs themselves through fleet management and usage. Last up, Weckesser talked about how Siemens devices (of which there are millions out there in a variety of industrial applications) generate events that can be analyzed to optimize industrial plants and machines, as well as energy and resources. As an SAP partner, Siemens is offering an open cloud platform for industry customers based on HANA; customers can easily connect their existing Siemens devices to the Siemens Cloud for Industry apps via public cloud, private cloud or on-premise infrastructure. This allows them to do analysis for predictive maintenance on individual machines, as well as aggregate fleet operations optimization, through apps provided by Siemens, SAP, SAP partners or the customers themselves.

I was disappointed not to see the SAP Operational Process Intelligence offering involved in this discussion: it seems a natural fit since it can be used to monitor events and control processes from a variety of underlying systems and sources, including event data in HANA. However, good to see that SAP is providing some real-world examples of how they are supporting their customers’ and partners’ IoT efforts through the HANA Cloud Platform.

SapphireNow 2015 Day 1 Keynote with Bill McDermott

Happy Cinco de Mayo! I’m back in Orlando for the giant SAP SAPPHIRE NOW and ASUG conference to catch up with the product people and hear about what organizations are doing with SAP solutions. If you’re not here, you can catch the keynotes and some of the other sessions online either in real time or on demand. The wifi is swamped as usual, my phone kicked from LTE down to 3G and on down to Edge before declaring No Service during the keynote, and since I’m blogging from my tablet/keyboard configuration, I didn’t have connectivity at the keynote (hardwired connections are provided for media/analysts, but my tablet doesn’t have a suitable port) so this will be posted sometime after the keynote and the press conference that follows.

We kicked off the 2015 conference with CEO Bill McDermott asking what the past can teach us about the present. Also, a cat anecdote from his days as a door-to-door Xerox salesman, highlighting the need for empathy and understanding in business, in addition to innovation in products and services. From their Run Simple message last year, SAP is moving on to Making Digital Simple, since all organizations have a lot of dark data that could be exploited to make them data-driven and seamless across the entire value chain: doing very sophisticated things while making them look easy. There is a sameness about vendors’ messaging these days around the digital enterprise — data, events, analytics, internet of things, mobile, etc. — but SAP has a lot of the pieces to bridge the data divide, considering that their ERP systems are at the core of so many enterprises and that they have a lot of the other pieces including in-memory computing, analytics, BPM, B2B networks, HR systems and more. Earlier this year, SAP announced S/4HANA: the next generation of their core ERP suite running on the HANA in-memory database and integrating with their Fiori user experience layer, providing a more modular architecture that runs faster, costs less to run and looks better. It’s a platform for innovation because of the functionality and platform support, and it’s also a platform for generating and exposing so much of that data that you need to make your organization data-driven. The HANA cloud platform also provides infrastructure for customer engagement, while allowing organizations to run their SAP solutions in on-premise, hybrid and cloud configurations.

SAP continues to move forward with HR solutions, and recently acquired Concur — the company that owns TripIt (an app that I LOVE) as well as a number of other travel planning and expense reporting tools — to better integrate travel-related information into HR management. Like many other large vendors, SAP is constantly acquiring other companies; as always, the key is how well they can integrate this into their other products and services, rather than simply adding “An SAP Company” to the banner. Done well, this provides more seamless operations for employees, and also provides an important source of data for analyzing and improving operations.

A few good customer endorsements, but pretty light on content, and some of the new messaging (“Can a business have a soul?”) seemed a bit glib. The Stanley Cup made a short and somewhat superfluous appearance, complete with white-gloved handler. Also, there was a Twitter pool running on how many times the word “simple” was used in the keynote, another indication that the messaging might need a bit of fine-tuning.

There was a press conference afterwards, where McDermott was joined by Jonathan Becher and Steve Lucas to talk about some other initiatives (including a great SAP Store demo by Becher) and answer questions from press and analysts both here in Orlando and in Germany. There was a question about supporting Android and other third-party development; Lucas noted that HANA Cloud Platform is available now for free to developers as a full-stack platform for building applications, and that there are already hundreds of apps built on HCP that do not necessarily have anything to do with SAP ERP solutions. Building on HCP provides access to other information sources such as IoT data: Siemens, for example, is using HCP for their IoT event data. There’s an obvious push by SAP to their cloud platform, but even more so to HANA, either cloud or on-premise: HANA enables real-time transactions and reconciliations, something rarely available in ERP systems, while allowing for far superior analytics and data integration without complex customization and add-ons. Parts of the partner channel are likely a bit worried about this since they exploit SAP’s past platform weaknesses by providing add-on products, customization and services that may no longer be necessary. In fact, an SAP partner that relies on the complexity of SAP solutions by providing maintenance services just released a survey claiming to show a lack of customer interest in S/4HANA; although this resulted in a flurry of sensational headlines today, if you look at the numbers that show some adoption and quite a bit of non-committed interest — not bad for three months after release — it starts to look more like an act of desperation. It will be more interesting to ask this question a few quarters from now. HANA may also be seen as a threat to SAP’s customers’ middle management, who will be increasingly disintermediated as more information is gathered, analyzed and used to automatically generate decisions and recommendations, replacing manually-collated reports that form the information fiefdoms within many organizations.

Becher and Lucas offered welcome substance as a follow-on to McDermott’s keynote; I expect that we’ll see much more of the product direction details in tomorrow’s keynote with Bernd Leukert.

bpmNEXT 2015 Day 1 Demos: SAP, W4 and Whitestein

The demo program kicked off in the afternoon, with time for three of them sandwiched between two afternoon keynotes. Demos are strictly limited to 30 minutes, with a 5-minute, 20-slide, auto-advancing Ignite-style presentation (which I am credited with suggesting after some of last year’s slideware dragged on), followed by a 15-minute demo and 10 minutes for Q&A and changeover to the next speaker.

SAP: BPM and the Internet of Everything

Harsh Jegadeesan and Benjamin Notheis were in the unenviable first position, given the new presentation format; they gave an introduction to the internet of everything, referring to things, people, places and content. Events are at the core of many BPM systems, which sense and respond to them; patterns of events are detected, and managed with rules and workflow. They introduced Smart Process Services on HANA Cloud Platform, including an app marketplace, and looked at a case study of pipeline incident management, where equipment sensor events will trigger maintenance processes: a machine-to-process scenario. The demo showed a dashboard for pipeline management, with a geographic view of a pipeline overlaid with pump locations and details, and highlighting abnormal readings and predicted failures. This is combined with cost data, including the cost of various risk scenarios such as a pipeline break or pump failure. The operator can drill down into abnormal readings for a pump, see predicted failure and maintenance records, then trigger an equipment repair or replacement. The incident case can be tracked, and tasks assigned and escalated. Aggregates for incident cases show the number of critical cases or those approaching deadlines, and can be used to cluster the incidents to detect contributing factors. Nice demo; an expansion of the operational intelligence dashboards that I’ve seen from SAP previously, with good integration of predictions. Definitely a two-person demo with the inclusion of a tablet, a laptop and a wearable device. They finished with a developer view of the process-related services available on the HANA cloud portal plus the standard Eclipse environment for assembling services using BPMN. This does not have their BPM engine (the former Netweaver engine) behind it: the workflow microservices compile to Javascript and run in an in-memory cloud workflow engine. However, they see that some of the concepts from the more agile development that they are doing on the cloud platform could make their way back to the enterprise BPM product.

W4: Events, IOT, and Intelligent Business Operations

Continuing on the IoT theme, Francois Bonnet talked about making business operations more intelligent by binding physical device events together with people and business events in a BPMS. His example was fall management — usually for the elderly — where a device event triggers a business process in a call center; the device events can be integrated into BPMN models using standard event constructs. He demonstrated with a sensor made from a Raspberry Pi tied to positional sensors that detect orientation; by tipping over the sensor, a process instance was created that triggered a call to the subscriber, using GPS data to indicate the location on a map. If the call operator indicated that the subscriber did not answer, they would be prompted to call a neighbour, and then emergency services. KPIs such as falls within a specified period are tracked, along with a history of the events for the subscriber’s device. The sensor being out of range or having no movement over a period of time can also trigger a new task instance, while reorienting the sensor to the upright position within a few seconds after a fall can cancel the process. Looking at the BPMN for managing events from the sensor, they are using the event objects in standard BPMN to their fullest extent, including both in-line and boundary events, with the device events translating to BPMN signal events. Great example of responsive event handling using BPMN.
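
The cancel-on-upright behaviour is a nice detail: the fall event arms a grace timer, and righting the sensor in time cancels the process before anyone is called. A minimal sketch of that logic, with the ten-second grace period and start_process() as my assumptions rather than W4’s actual values:

```python
# Sketch of fall handling with a cancellation window: a tip-over arms a
# timer; righting the sensor before it fires cancels the alert, otherwise
# a call-centre process instance is started.
import threading

GRACE_SECONDS = 10
pending: dict[str, threading.Timer] = {}  # device id -> armed timer

def start_process(device_id: str):
    print(f"fall process started for {device_id}")  # placeholder: call the BPMS

def on_tip_over(device_id: str):
    timer = threading.Timer(GRACE_SECONDS, start_process, args=[device_id])
    pending[device_id] = timer
    timer.start()

def on_upright(device_id: str):
    timer = pending.pop(device_id, None)
    if timer is not None:
        timer.cancel()  # sensor righted in time: no call is made

on_tip_over("sensor-42")
on_upright("sensor-42")  # within the grace period, so the process is cancelled
```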

Whitestein: Demonstrating Measurable Intelligence in an Enterprise Process Platform

The last demo of the day, from Dan Neason of Whitestein, was also on the theme of events, but more focused on intelligent agents and measurable intelligence in processes. Their LSPS solution models and executes goal-driven processes, where the system uses previous events to evolve its methods for reaching the goals, predicting outcomes, and recommending alternatives. The scenario was a mortgage application campaign, where information about applicants is gathered and the success of the campaign determined by the number of completed mortgages; potential fraud cases are detected and recommended actions presented to a user to handle the case. Feedback from the user, in the form of accepting or rejecting recommendations, is used to tune the predictions. In addition to showing standard dashboards of events that have occurred, it can also give a dashboard view of predictions, such as how many mortgage applications are expected to fail, including those that may be resolved favorably through some recommended actions. The system is self-learning based on statistical models and domain knowledge, so it can detect predefined patterns or completely emergent patterns; it can be applied to provide predictive analytics and goal-seeking behavior across multiple systems, including other BPMS.
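
Whitestein’s statistical models are richer than this, but the feedback loop itself – accept/reject outcomes tuning which recommendations get surfaced – can be shown in a few lines; the action names, Laplace prior and acceptance floor are all invented:

```python
# Toy feedback loop: track accept/reject outcomes per recommended action and
# only surface actions whose smoothed acceptance rate beats a floor.
from collections import defaultdict

counts = defaultdict(lambda: [1, 2])  # action -> [accepts, times shown], with a Laplace prior

def record_feedback(action: str, accepted: bool):
    counts[action][1] += 1
    if accepted:
        counts[action][0] += 1

def recommended(actions, floor=0.4):
    """Keep actions whose estimated acceptance rate is above the floor."""
    return [a for a in actions if counts[a][0] / counts[a][1] > floor]

for accepted in (False, False, False, True):
    record_feedback("escalate-to-fraud-team", accepted)

# The often-rejected action drops out; the untried one keeps its prior.
print(recommended(["escalate-to-fraud-team", "request-more-documents"]))
```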

Wrapping up this set of demos on intelligent, event-driven processes, we had a keynote from Jim Sinur (formerly of Gartner, now an independent consultant) on goal-directed processes. He covered concepts of hybrid processes, made up of multiple heterogeneous systems and processes that may exhibit both orchestration and collaboration to solve business problems.

Great first set of demos, definitely setting the bar high for tomorrow’s full day of 11 demos, and a good first day. We’re all off to the roof deck for a reception, wine tasting and dinner, so that’s it for blogging for today.

By the way, I realize that we completely forgot to create bpmNEXT bingo cards, although it did take until after 4pm for “ontology” to come up.

Software AG Analyst Day: The Enterprise Gets Digital

After the DST Advance conference in Phoenix two weeks ago, I headed north for a few days vacation at the Grand Canyon. Yes, there was snow, but it was lovely:

Grand Canyon

Back at work, I spent a day last week in Boston for the first-ever North American Software AG analyst event, attended by a collection of industry and financial analysts. It was a long-ish half day followed by lunch and opportunities for one-on-one meetings with executives: worth the short trip, especially considering that I managed to fly in and out between the snow storms that have been plaguing Boston this year. I didn’t live-blog this since there was a lot of material spread over the day, so had a chance to see some of the other analysts’ coverage published after the event, such as this summary from Peter Krensky of Aberdeen Group.

The focus of the event was squarely on the digital enterprise, a trend that I’m seeing at many other vendors but not so many customers yet. Software AG’s CEO, Karl-Heinz Streibich, kicked off the day talking about how everywhere you turn, you hear about the digital enterprise: not just using digital technology, but having enough real-time data and devices integrated into our work and lives that they can be said to be truly digital. Streibich feels that companies with a basis in integration middleware – like Software AG with webMethods and other products – are in a good position to enable digital enterprises by integrating data, devices and systems of all types.

Although Software AG is not a household consumer name, its software is in 70% of the Fortune 1000, with a community of over 2M developers; it’s fair to say that you will likely interact with a company that uses Software AG products at least once per day: banks, airports and airlines, manufacturing, telecommunications, energy and more. Their revenues are split fairly evenly between Europe and the Americas, with a small amount in Asia Pacific. License revenues are 32% of the total, with maintenance and consulting splitting the remainder; this relatively low proportion of license revenue is an indicator of a mature software company, and not unexpected from a company more than 40 years old. I found a different representation of their revenues more interesting: they had 66% of their business in the “digital business” segment in 2014, expected to climb to 75% this year, which includes their portfolio minus the legacy ADABAS/NATURAL mainframe development tools. Impressive, considering that it was about a 50:50 split in 2010. Part of this increase is likely due to their several acquisitions over that period, but also because they are repositioning their portfolio as the Digital Business Platform, a necessary shift towards the systems of engagement where more of the customer spend is happening. Based on the marketecture diagram, this platform forms a cut-out layer between back office core operational systems and front office customer engagement systems. Middleware, by any other name; but according to Streibich, more business logic is moving to the middleware layer, although this is what middleware vendors have been telling us for decades.

There are definitely a lot of capable products in the portfolio that form this “development platform for digital business” – webMethods (integration and BPM), ARIS (BPA), Terracotta (in-memory big data), Longjump (application PaaS), Metaquark (mobility), Alfabet, Apama, JackBe and more – but the key will be to see how well they can make them all work together as a true platform rather than just a collection of Software AG-branded tools.

We had an in-depth presentation on their Digital Business Platform from Wolfram Jost, Software AG’s CTO; you can read the long version on their site, so I’ll just hit the high points. He started with some industry quotes, such as “every company will become a software company”, and one analyst firm’s laughable brainstorm for 2014, “Big Change”, but moved on to define digital business as having the following characteristics:

  • Blurring the digital and physical world
  • More influence of customers (on business direction as well as external perceptions)
  • Combining people, business and physical things
  • Agility, speed, scale, responsiveness
  • “Supermaneuverable” business processes
  • Disrupting existing business models

The problem with this shift in business models is that conventional business applications don’t support the way that the new breed of business applications are designed, developed, used and operated. Current applications and development techniques are still valuable, but are being pushed behind the scenes as core operational systems and packaged applications.

Software AG’s Digital Business Platform, then, is based on the premise that few packaged applications are useful in the face of business transformation and the required agility. We need tools to create adaptive applications – built to change, not to last – especially in front office customer engagement applications, replacing or augmenting packaged CRM and other applications. This is not fundamentally different from the message about any agile/adaptive/mashup/model-driven application development environment over the past few years, including BPMS; it’s interesting to see how a large vendor such as Software AG positions their entire portfolio around that message. In fact, one of their slides refers to the adaptive application platform as iBPMS, since the definition of iBPMS has expanded to include everything related to model-driven application development.

The core capabilities of their platform include intelligent business operations (webMethods Operational Intelligence, Apama Streaming Analytics); agile processes (webMethods BPM and AgileApps); integration (webMethods Integration and API Management); in-memory data fabric (Terracotta); and business and IT transformation (ARIS BPA and GRC, Alfabet IT Portfolio Management and EA Management). In a detailed slide overlaying their products, they also added a transaction processing capability to allow the inclusion of ADABAS-NATURAL, as well as the cloud offerings that they’ve released over the past year.

Jost dug further into definitions of business application layers and architectural requirements. They provide the structure and linkages for event routing and event persistence frameworks, using relatively loose event-based coupling between their own products to allow them to be deployed selectively, but also (I imagine) to reduce the amount of refactoring of the products that would be required for tighter coupling. Their cloud IoT offering plays an interesting role by ingesting events from smart devices – developed via co-innovation with device companies such as Bosch and Siemens – for integration with on-premise business applications.
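
That loose event-based coupling is worth a concrete picture: each product publishes to named topics and subscribes to the ones it cares about, so components can be added or omitted without refactoring their neighbours. A minimal in-process sketch (topic names invented):

```python
# Minimal illustration of loose event-based coupling: publishers and
# subscribers only share topic names, never direct references to each other.
from collections import defaultdict

subscribers = defaultdict(list)  # topic -> list of handler callables

def subscribe(topic, handler):
    subscribers[topic].append(handler)

def publish(topic, event):
    for handler in subscribers[topic]:
        handler(event)  # each product reacts independently

# e.g. an analytics product and a BPM product both consume device events
subscribe("device.reading", lambda e: print("analytics saw:", e))
subscribe("device.reading", lambda e: print("process triggered by:", e))
publish("device.reading", {"device": "bosch-sensor-9", "temp_c": 81.2})
```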

We then heard two shorter presentations, each followed by a panel. First was Eric Duffaut, the Chief Customer Officer, presenting their go-to-market strategy then moderating a panel with two partners, Audi Lucas of Wipro and Chris Brinton of Mosaic Data Science. Their GTM plan was fairly standard for a large enterprise software vendor, although they are improving effectiveness by having a single marketing team across all products as well as improving the sales productivity processes. Their partners are critical for scalability in this plan, and provide the necessary industry experience and solutions; both of the partner panelists talked about co-innovation with Software AG, rather than just providing resources trained on the products.

The second presentation and panel was led by John Bates, CMO and head of industry solutions; he was joined by a customer panel including Bryan Zigler of Boeing, Mark DuBrock of Standard & Poor’s, and Greg James of Outerwall. Bates discussed the role of industry solutions and solution accelerators, built by Software AG and/or partners, that provide pre-built, customizable and adaptive applications for fast deployment. They’re not using the Smart Process Application terminology that other vendors adopted from the Forrester trend of a couple of years ago, but it’s a very similar concept, and Bates announced the solution marketplace that they are launching to allow these to be easily discovered and purchased by customers.

My issue with solution accelerators and industry solutions in general is that many of these solutions are tied to a specific version of the underlying technology, and are templates rather than frameworks in that you change the solution itself during implementation: upgrades to the platform may not be easily performed, and upgrades to the actual solution likely require re-customizing each deployed instance. I didn’t get a chance to ask Bates how SAG helps partners and customers to create and deploy more upgradable solutions, e.g., recommended technology guardrails; this is a sticky problem that every technology vendor needs to deal with.

Bates also discussed the patterns of digital disruption that can be seen in the marketplace, and how these are manifesting in three specific areas that they can help to address with their Digital Business Platform:

  • Connected customers, providing opportunities for location-based marketing and offers, automated concierge service, customer location tracking, demographic marketing
  • Internet of Things/Machine-to-Machine (IoT/M2M), with real-time monitoring and diagnostics, and predictive maintenance
  • Proactive risk and compliance, including proactive financial trade surveillance for unusual/rogue behavior

After a wrapup by Streibich, we received copies of his latest book, The Digital Enterprise, plus Thingalytics by Bates; ironically, these were paper rather than digital copies. ;)

Disclosure: Software AG paid my airfare and hotel to attend this event, plus gave me a nice lunch and two books, but did not otherwise compensate me for my time nor for anything that I have written here.

This week, I’m in Las Vegas for Kofax Transform, although just as an attendee this year rather than a speaker; expect to see a few notes from here over the two days of the conference.