Rethinking personal data: Pegaworld 2016 panel

I attended a breakout panel on how the idea and usage of personal data are changing. It was moderated by Alan Marcus of the World Economic Forum (nice socks!), and included Richard Archdeacon of HP, Rob Walker from Pega and Matt Mobley from Merkle.


The focus is on customer data as it is maintained in an organization’s systems, and the regulations that now drive how that data is managed. The talk was organized around three key themes that are emerging from the global dialog: strengthening trust and accountability; understanding usage-based, individual-centric frameworks; and engaging the individual. Thoughts from the panel:

  • Once you have someone’s data, you remain responsible for it even as you pass it to other parties
  • Customer data management is now regulation-driven
  • It’s not enough to restrict values in a customer data set; it’s now possible to derive hidden values (such as gender or race) from other values, which can result in illegal targeting: how much effort should be put into anonymizing data when it can be easily deanonymized?
  • Organizations need to inform customers of what data that they have about them, and how it is being used
  • Consumers want the convenience offered by giving up their data more than they fear misuse of the data
  • The true currency of identity for organizations is an email address and one other piece of data, which can then be matched to a vast amount of data from other sources
  • The biggest consumer fear is data privacy violation from a security breach (about which there is a high level of hysteria), but possibly they should be more afraid of how the companies that they willingly give their data to are going to use it
  • Personal data includes data that you create, data that others create about you, and data that is inferred based on your activities
  • Many people maintain multiple identities on social media sites, curated differently for professional and personal audiences
  • Personal health data, including genetic data, has an additional set of concerns since it can impact individual healthcare options
  • Unresolved question of when personal data is no longer personal data, e.g., after a certain amount of aggregation and analysis occurs
  • Issues of consent (by customers to use their data) are becoming more prominent, and using data without consent will be counter to the regulations in most jurisdictions
  • Many smaller businesses will find it difficult to meet security compliance regulations; this may drive them to use cloud services where the provider assumes some degree of security responsibility

Food for thought. A lot of unresolved issues in personal data privacy and management.

Pegaworld 2016 day 2 keynote: digital transformation and the 4th industrial revolution

Day 2 of Pegaworld 2016 – another full day on the schedule.

The keynote started with Gilles Leyrat, SVP of Customer and Partner Services at Cisco, discussing how they became a more digital operation in order to provide better customer service and save costs. Cisco equipment provides a huge part of the backbone of the internet, supporting digital transformation for many other organizations, but this was about how they are transforming themselves to keep pace with their customers as well as their competitors. They are using Pega to digitize their business by connecting people and technology, automating processes, and using data for real-time analytics and process change to support their 20,000-strong sales team and 2M orders per year.


Their digitization has three key goals: operational excellence, revenue growth, and “delightful” customer experience. Customer experience is seen as being crucial to revenue growth, with strong causal links showing up in research. He compared the old world — offshore customer service centers augmented by onshore specialists — with the new digital world, where digitization is a means to achieving their customer experience goal by simplifying, automating and using analytics. By reducing human touch in many standard processes, they are able to reduce wait time for customers while allowing workers to focus on interacting with customers to resolve problems: 93% of cases are now handled with zero touch, saving 2M hours of wait time per year and reducing order resolution time to 6 hours. The employee experience is improved through integrated workplaces and actionable intelligence that support their work patterns. He ended with the advice to understand what you’re trying to achieve, and to link your digital transformation initiatives to those goals.

Next was a panel on digital transformation moderated by Christopher Paquette, Digital Principal at McKinsey, including Alistair Currie, COO at ANZ Bank; Toine Straathof, EVP at Rabobank; Kevin Sullivan, SVP and Head of the Decision Sciences Group at Fifth Third Bank; and Nicole Gleason, Practice Lead for Business Intelligence & Analytics at Comet Global Consulting. A few notes from the panel (I mostly haven’t attributed to the specific speaker since the conversation was free-ranging):

  • Digital transformation is being driven by rapidly-changing customer expectations
  • Banking customers prefer mobile/online first, then ATM, then branch, then call center; this aligns well with operational costs but requires that the digital platforms be built out first
  • Moving internal stakeholders off their old methods and out of operational silos can be more difficult than dealing with regulators and other external parties
  • Making IT and business people responsible for results (e.g., a guiding business architecture) rather than dictating their exact path can lead to innovation and optimal solutions
  • Employee incentives need to be consistent across channels to lessen the competition across them
  • A lot of current digitization efforts are to bridge/hide the complexity of existing legacy systems rather than actual digital transformation

Alan Trefler returned to the stage to introduce the concepts of the fourth industrial revolution and workforce disruption; he sees what is happening now as a step change in how society works and how we interact with technology. We heard from Alan Marcus, Head of the Technology Agenda at the World Economic Forum, on this topic, and how new categories of jobs and the required skill sets will completely transform employment markets. Lots of opportunities, but also lots of disruption, in both first world and emerging markets. He covered a timeline of changes and their impacts, and stressed that skill sets are changing quickly: 35% of core skills will change by 2020. Companies need to expose workers to new roles and training, and particularly open doors to women in all roles. Creativity will become a core skill, even as AI technologies gain acceptance. Governments and education systems need to innovate to support the changing workforce. Organizations need to reinvent their HR to help employees to move into this brave new world.

The keynote finished with Gerald Chertavian, Founder and CEO at Year Up, an organization that helps low-income youth prepare for a professional job. There’s a social justice goal of helping young adults who have no college degree (and no path to get one) to become hireable talent through practical training and internships; but there’s also the side benefit of feeding skilled workers into the rapidly-changing technology-heavy employment market that Marcus discussed earlier. Year Up was contacted by American Express, who needed people trained in Java and Pega in order to re-onshore some of their development work; they created a curriculum targeted at those jobs and trained up a large number of people who then competed successfully for those jobs. Year Up is now in 18 cities across the US, working with large organizations to identify skills gaps and train people to suit the employment pipeline. They’re changing tens of thousands of lives by providing a start on the path to upward mobility, and feeding a need for companies to hire the right skills in order to transform in this fourth industrial revolution.

 

Pegaworld 2016 Day 1 Keynote: Pega direction, Philips and Allianz

It seems like I was just here in Vegas at the MGM Grand…oh, wait, I was just here. Well, I’m back for Pegaworld 2016, and 4,000 of us congregated in the Grand Garden Arena for the opening keynote on the first day. If you’re watching from home, or want to catch a replay, there is a live stream of the keynotes that will likely feature an on-demand replay at some point.

Alan Trefler, Pega’s CEO, kicked things off by pointing out the shift from a focus on technology to a focus on the customer. Surveys show that although most companies think that they understand their customers, the customers don’t agree; companies need to undergo a serious amount of digital transformation in order to provide the level of service that today’s customers need, while still improving efficiencies to support that experience. One key to this is a model-driven technology environment that incorporates insights and actions, allowing the next best action to be provided at any given point depending on the current context, while supporting organizational evolution to allow constant change to meet future demands. Model-driven environments let you create applications that are future-proof, since it is relatively quick to make changes to the models without changing a lot of code. Pega has a lot of new online training at the Pega Academy, a marketplace of third-party Pega applications at the Pega Exchange, and the continuing support of their Pega Express easy-to-use modeler; they continue to work on breaking free from their tech-heavy past to support more agile digital transformation. Pega recently sponsored an Economist report on digital transformation; you can grab that here.

Don Schuerman, Pega’s CTO, took over as MC for the event to introduce the other keynote speakers, but first announced a new partnership with Philips that links Pega’s care management package with Philips’ HealthSuite informatics and cloud platform for home healthcare. Jeroen Tas, CEO of Connected Care & Health Informatics at Philips, presented more on this, specifically in the context of the inefficient and unevenly-distributed US healthcare system. He had a great chart that showed the drivers for healthcare transformation: from episodic to continuous, by orchestrating 24/7 care; from care provider to human-centric, by focusing on patient experience; from fragmented to connected, by connecting patients and caregivers; and from volume to value, by optimizing resources. Connected, personalized care links healthy living to disease prevention, and supports the proper diagnosis and treatment since healthcare providers all have access to a comprehensive set of the patient’s information. Lots of cool personal healthcare devices, such as ultrasound-as-a-service, where they will ship a device that can be plugged into a tablet to allow your GP to do scans that might normally be done by a specialist; continuous glucose meters and insulin regulation; and tools to monitor elderly patients’ medications. Care costs can be reduced by 26% and readmissions reduced by 52% through active monitoring in networked care delivery environments, such as by monitoring heart patients for precursors of a heart attack; this requires a combination of IoT, personal health data, data analytics and patient pathways provided by Philips and Pega. He ended by stating that it’s a great time to be in healthcare, and that there are huge benefits for patients as well as healthcare providers.

Although Tas didn’t discuss this aspect, there’s a huge amount of fear of connected healthcare information in user-pay healthcare systems: people are concerned that they will be refused coverage if their entire health history is known. Better informatics and analysis of healthcare information improves health and reduces overall healthcare costs, but it needs to be provided in an environment that doesn’t punish people for exposing their health data to everyone in the healthcare system.

We continued on the healthcare topic, moving to the insurance side with Birgit König, CEO of Allianz Health Germany. Since basic healthcare in Germany is provided by the state, health insurance is for additional services not covered by the basic plan, and for travelers while they are outside Germany. There is a lot of competition in the market, and customer experience for claims is becoming a competitive differentiator, especially with new younger customers. To accommodate this, Allianz is embracing a bimodal architecture approach, where back-end systems are maintained using traditional development techniques that focus on stability and risk, while front-end systems are more agile and innovative with shorter release cycles. I’ve just written a paper on bimodal IT and how it plays out in enterprises; not published yet, but completely aligned with what König discussed. Allianz is using Pega for more agile analytics and decisioning at the front end of their processes, while keeping their back-end systems stable. Innovation and fast development has been greatly aided by co-locating their development and business teams, not surprisingly.

The keynote finished with Kerim Akgonul, Pega’s SVP of Products, for a high-level product update. He started by looking at the alignment between internal business goals and the customer journey, spanning marketing, sales, customer service and operations. The Pega Customer Decision Hub sits at the middle of these four areas, linking information so that, for example, offers sent to customers are based on their past orders.


  • Marketing: A recent Forrester report stated that Pega Marketing yields an 8x return on marketing investment (ROMI) due to the next-best-action strategies and other smart uses of analytics. Marketers don’t need to be data scientists to create intelligent campaigns based on historical and real-time data, and send those to a targeted list based on filters including geolocation. We saw this in action, with a campaign created in front of us to target Pegaworld attendees who were actually in the arena, then sent out to the recipients via the conference mobile app.
  • Sales: The engagement map in the Pega Sales Automation app uses the Customer Decision Hub information to provide guidance that links products to opportunities for salespeople; we saw how the mobile sales automation app makes this information available and recommends contacts and actions, such as a follow-up contact or training offer. There are also some nice tools such as capturing a business card using the mobile camera and importing the contact information, merging it if a similar record is found.
  • Customer service: The Pega customer service dashboard shows individual customer timelines, but the big customer service news in this keynote is the OpenSpan acquisition that provides robotic process automation (RPA) to improve customer service environments. OpenSpan can monitor desktop work as it is performed, and identify opportunities for RPA based on repetitive actions. The new automation is set up by recording the actions that would be done by a worker, such as copying and pasting information between systems (a rough sketch of the record-and-replay idea follows this list). The example was an address change, where a CSR would take a call from a customer then have to update three different systems with the same information by copying and pasting between applications. We saw the address change being recorded, then played back on a new transaction; this was also included as an RPA step in a Pega Express model, although I’m not sure if that was just to document the process as opposed to any automation driven from the BPM side.
  • Operations: The Pega Field Service application provides information for remote workers doing field support calls, reducing the time required to complete the service while documenting the results and tracking the workers. We saw a short video of Xerox using this in Europe for their photocopier service calls: the field engineer sees the customer’s equipment list, the inventory that he has with him, and other local field engineers who might have different skills or inventory to assist with his call. Xerox has reduced their service call time, improved field engineer productivity, and increased customer satisfaction.
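As a rough idea of what a recorded address-change automation looks like once it’s replayed, here is a generic Python sketch using the pyautogui library. This is not OpenSpan; the window coordinates, field order and keyboard shortcuts are made up for illustration, and a real recorder would capture them automatically while watching the CSR work.

```python
import pyautogui

# Steps "recorded" against one target system: click a field, select its
# contents, type the new value. Coordinates are placeholders.
RECORDED_STEPS = [
    {"field": "street", "click_at": (420, 310)},
    {"field": "city",   "click_at": (420, 350)},
    {"field": "zip",    "click_at": (420, 390)},
]

def replay_address_change(new_address: dict) -> None:
    """Replay the recorded clicks and keystrokes with this transaction's data."""
    for step in RECORDED_STEPS:
        x, y = step["click_at"]
        pyautogui.click(x, y)                      # focus the field
        pyautogui.hotkey("ctrl", "a")              # select the existing contents
        pyautogui.write(new_address[step["field"]], interval=0.02)
    pyautogui.hotkey("ctrl", "s")                  # save the record

# The same replay would be repeated against each of the three systems that the
# CSR previously updated by copying and pasting.
replay_address_change({"street": "123 Main St", "city": "Springfield", "zip": "12345"})
```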

Good mix of vision, technology and customer case studies. Check out the replay when it’s available.

Analytics customer keynote at TIBCONOW 2016

Michael O’Connell hosted the last general session for TIBCO NOW 2016, focusing on analytics customer stories with the help of five customers: State Street, Shell, Vestas, Monsanto and Western Digital. I’m not going to try to attribute specific comments to the customer representatives, just capture a few thoughts as they go by.


  • Spotfire is allowing self-service analytics to be pushed down to the business users
  • Typically, the analysis going on in a number of different solutions — from Excel to BI tools — can be consolidated onto a single analytics platform
  • Analytics is allowing the business to discover the true nature of their business, especially with outliers
  • Real-time analytics on physical processes (e.g., supply chain) generates significant benefits
  • Providing visual analytics to business changes the way that they use data and collaborate across the organization
  • The enterprise-class back-end and the good visualizations in Spotfire are helping it to win over both IT and business areas
  • Data and events are being generated faster and in greater volumes from more devices, making desktop analytics solutions impractical
  • Business users who are not data specialists can understand — and leverage — fairly complex analytical models when it concerns their own data
  • Analytics about manufacturing quality can be used to identify potential problems before they occur

We finished up with a brief presentation from Fred Ehlers, VP of IT at Norfolk Southern, about their use of TIBCO products to help manage their extensive railway operations. He talked about optimizing their intermodal terminals, where goods shipped in containers are moved between trains, trucks and ships; asset utilization, to ensure that empty cars are distributed to the right place at the right time for expected demand; and their customer service portal that shows an integrated view of a shipment lifecycle to give customers a more accurate, real-time view. As an old company, they have a lot of legacy systems, and used TIBCO to integrate them, centralizing operational events, data and business rules. For them, events can come from their physical assets (locomotives and railway sensors), legacy reporting systems, partner networks for assets not under their ownership, and external information including weather. On this, they build asset state models, and create applications that automatically correlate information and optimize operations. They now have one source of data and rules, and a reusable set of data and services to make application development faster. Their next steps are predictive maintenance, gathering information from locomotives, signal systems, switches and trackside defect detectors to identify problems prior to an equipment failure; and real-time visual analytics with alerts on potential problem areas. They also want to improve operational forecasting to support better allocation of resources, allowing them to divert traffic and take other measures to avoid service disruptions. Great case study that incorporates the two conference themes of interconnecting everything and augmenting intelligence.

We’re at the end of day 2, and the end of my blogging at TIBCO NOW; there are breakout sessions tomorrow but I’ll be on my way home. Some great new stuff in BPM and analytics, although far too many sessions going on at once to capture more than a fraction of what I wanted to see.

Closing the loop with analytics: TIBCONOW 2016 day 2 keynote

Yesterday at TIBCO NOW 2016, we heard about the first half of TIBCO’s theme — interconnect everything — and today, Matt Quinn introduced the second half — augment intelligence — before turning the stage over to Mark Palmer, SVP engineering for streaming analytics.


Palmer talked about the role of analytics over history, and how today’s smart visual analytics allow you to be first to insight, then first to action. We then had a quick switch to Brad Hopper, VP strategy for analytics, for a demo of Spotfire visual analytics while wearing a long blond wig (attempting to make a point about the importance of beauty, I think). He built an analytics dashboard while he talked, showing how easy it is to create visual analytics and trigger smart actions. He went on to talk about data preparation and cleansing, which can often take as much as 50% of an analyst’s time, and demonstrated importing a CSV file and using quick visualizations to expose and correct potential problems in the underlying data. As always, the Spotfire demos are very impressive; I don’t follow Spotfire closely enough to know what’s new, but it all looks pretty slick.
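The data preparation step he showed (import a CSV, use quick visualizations to spot problems, then correct them) is the same workflow many analysts script by hand. A quick pandas sketch of that kind of profiling and cleanup; the file name and column names are hypothetical:

```python
import pandas as pd

# Hypothetical sales extract; in the demo this was a CSV pulled into Spotfire.
df = pd.read_csv("sales_extract.csv")

# Profile the data to expose problems before analysis.
print(df.dtypes)              # columns that imported as text instead of numbers
print(df.isna().sum())        # missing values per column
print(df["region"].unique())  # inconsistent category labels, e.g. "EMEA " vs "emea"

# Typical cleanup of the issues a quick visual scan would surface.
df["region"] = df["region"].str.strip().str.upper()
df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
df = df.dropna(subset=["order_date", "amount"])
```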

Michael O’Connell, TIBCO’s chief analytics officer, came up to demonstrate a set of analytics applications for a fictitious coffee company: sales figures and drilldowns, with what-if predictions for planning promotions; and supply chain management and smart routing of product deliveries.

Palmer came back to talk about TIBCO Jaspersoft, the other side of their analytics portfolio that provides business intelligence capabilities built in to applications, but it was a pretty quick mention with no demo. A Jaspersoft demo would look pretty mundane after seeing all of the sexy Spotfire features, but it undoubtedly is a workhorse for analytics with many customers. He moved on to ways that TIBCO is helping customers to roll analytics out, from accelerators and sample source code to engagement in the community.


He continued on with streaming analytics (Palmer was the CEO of StreamBase before it was acquired by TIBCO), and O’Connell came back to show an oil industry application that leverages sensor analytics to maximize equipment productivity by initiating preventative maintenance when the events emitted by the device indicate that failure may be imminent. He showed a more comprehensive interface that would be used in the head office for real-time monitoring and analysis, and a simpler tablet interface for field service personnel to receive information about wells requiring service. Palmer finished the analytics segment with a brief look at LiveView Web, a zero-code environment for building operational intelligence dashboards.
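The preventive-maintenance trigger boils down to watching a stream of sensor events and raising an alert when a pattern associated with imminent failure appears. This toy Python generator shows the shape of that logic; the thresholds and event fields are invented, and it is not the actual StreamBase implementation:

```python
from collections import deque

WINDOW = 20             # number of recent readings to consider
VIBRATION_LIMIT = 7.5   # hypothetical level that tends to precede pump failure

def maintenance_alerts(events):
    """Yield an alert when the rolling average vibration exceeds the limit."""
    recent = deque(maxlen=WINDOW)
    for event in events:  # event: {"well_id": ..., "vibration": ...}
        recent.append(event["vibration"])
        if len(recent) == WINDOW and sum(recent) / WINDOW > VIBRATION_LIMIT:
            yield {"well_id": event["well_id"], "action": "schedule maintenance"}

# Example: a slowly rising vibration signal eventually crosses the threshold.
stream = ({"well_id": "W-7", "vibration": 5.0 + i * 0.2} for i in range(40))
for alert in maintenance_alerts(stream):
    print(alert)
    break
```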


Quinn returned to talk about their B-tree-based Graph Database, which is in preview mode now with an open API, and other areas where they are looking to provide innovative solutions. He went through a history of how they’ve grown as a technology organization, and got quite verklempt when thanking his team for how awesome they’ve continued to be over the past 18 months since the acquisition, which was really touching.

After the break, Adam Steltzner, NASA’s lead engineer on the Mars Rover and author of The Right Kind of Crazy: A True Story of Teamwork, Leadership, and High-Stakes Innovation, talked about innovation, collaboration and decision-making under pressure. Check out the replay of the keynote for his talk, a fascinating story of the team that built and landed the Mars landing vehicles, along with some practical tips for leaders to foster exploration and innovation in teams.

Murray Rode returned to close out the keynote by announcing the winners of their Trailblazer customer awards:

  • Norfolk Southern (Pioneer) for implementing a real-time view of their railway operations
  • CargoSmart (Innovator) for incorporating real-time optimization of shipping logistics into their cargo management software
  • First Citizens Bank (Impact) for simplifying IT structure to allow for quick creation and delivery of new branch services
  • University of Chicago Medicine (Visionary) for optimizing operating room turnover to save costs and improve service
  • TUI Group (Transformer) for transforming their platforms through integration to enable new customer-facing tourism applications

That’s it for the morning keynote, and I’m off to catch some of the breakout sessions for most of the rest of the day before we come back for the customer panel and closing keynote at the end of the day.

Destination: Digital at the TIBCONOW 2016 day 1 keynote

TIBCO had a bit of a hiatus on their conference while they were being acquired, but are back in force this week in Las Vegas with TIBCO NOW 2016. The theme is “Destination: Digital” with a focus on innovation, not just optimization, and the 2,000 attendees span TIBCO’s portfolio of products. You can catch the live stream here, which covers at least the general sessions each morning.

CMO Thomas Been opened the day by positioning TIBCO as a platform for digital transformation, then was joined by CEO Murray Rode. Rode talked about TIBCO’s own transformation over the last 18 months since the last conference, and how their customers are using TIBCO technology for real-time operations, analyzing and predicting the consumers’ needs, and enhancing the customer experience in this 4th industrial revolution that we’re experiencing. He used three examples to illustrate the scope of digital business transformation:

  • A banking customer applies and is approved for a loan through the bank’s mobile app, without documents and signatures
  • A consumer’s desires are predicted based on their behavior, and they are offered the right product at the right time
  • A customer’s order (or other interaction with a business) is followed in real-time to enhance their experience

Although TIBCO has always been about real-time, he pointed out that real-time has become the new norm: consumers don’t want to wait for information or responses, and the billions of interconnected smart devices are generating events all the time. The use of TIBCO’s software is shifting from the systems of record — although that is still their base of strength — to the systems of engagement: from the core to the edge. That means not only different types of technologies, but also different development and deployment methodologies. Their goals: interconnect everything, and augment intelligence; this seems to also represent the two main divisions for their products.

That set the stage for Ray Kurzweil, the author and futurist, who spoke about the revolution in artificial intelligence-driven innovation supported by the exponential growth in computing capabilities. The drastically dropping price performance ratio of computing is what is enabling innovation: in some cases, innovation doesn’t occur on a broad scale if it’s not cost effective. He had lots of great examples of how innovation has occurred and will continue to evolve in the future, especially around human biology, finishing up with Thomas Been joining him on stage for a conversation about Kurzweil’s research as well as the opportunities facing TIBCO’s customers. I didn’t put most of the detail in here; check for a replay on the live stream.


Matt Quinn, TIBCO’s CTO, took over with a product overview. In this keynote, he looked at the “interconnect everything” products, leaving the “augment intelligence” side of the portfolio for tomorrow’s keynote. They’ve set some core principles for all product development: cloud first (including on-premise and hybrid, as well as public cloud), ease of use (persona-based UX, industry solutions, and support community), and industrialization (cross-product integration, more open DevOps, and IoT). He expanded the idea of “interconnect everything” to “interconnect everything, everywhere”, and brought in VP of engineering Randy Menon to talk about their cloud platform strategy specifically as it relates to integration. As Quinn mentioned, he talked about how TIBCO has always built great products for the core, or “products for the CIO” as he put it, but that they are now looking at addressing different audiences. He went through some of the new functionality in their interconnection portfolio, including enhancements to ActiveMatrix BusinessWorks, ActiveMatrix BPM (now including case management and more flexible UI building), TIBCO MDM, and FTL messaging. He also introduced and showed demos of BusinessWorks Container Edition for cloud-native integration, supporting a number of standard cloud container services; TIBCO Cloud Integration, allowing iPaaS use cases to be enabled using a point-and-click environment; and Microflows using Node.js. He talked about their Mashery acquisition and what’s coming up in the API management product with real-time APIs, richer visual analytics leveraging Spotfire, and a cloud-native hybrid gateway. Combined with the other cloud products, this provides an end-to-end environment for creating and deploying cloud APIs. But their technology advances aren’t just about developers: it’s also for “digital citizens” who want to integrate and automate a variety of cloud tools using Simplr, which allows for simple workflows and forms. Nimbus Maps, a slimmed-down version of Nimbus, is also a tool for business people who want to do some of their own process documentation.

Rajeev Kozhikkattuthodi, director of product marketing, came up to announce Project Flogo, a lightweight IoT integration product, which they intend to make open source. It can be used to create simple workflows that integrate with a variety of devices, using a Golang-based engine, a design bot in Slack and an interactive debugger; the runtime is 20-50 times smaller than similar development environments. It’s not released yet but he showed a brief demo and it’s apparently on the show floor.


Quinn returned to mention a few other products — TIBCO Expresso, Momento, and their IoT innovations — before turning over to Raj Verma, EVP of worldwide sales, to talk about their customers’ journey during the purchasing process. With 10,000+ customers and $1B in revenue, TIBCO is big but has room to grow, and a better experience during the purchase, installation and scaling of TIBCO products would help with that. They are starting to roll out some of this, which includes much more self-service for product information and downloadable trials, plus enhancements to the TIBCO community to include more training materials and support; standardized pricing for product suites; and online purchasing. Although there is still a significant field sales force to help you along, it’s possible to do much more directly, and they’re enhancing their partner channel (which Verma admitted has had some significant problems in the past) if you already have a trusted service provider. A much more customer-focused approach to sales and implementation, which was certainly required to make them more competitive.

A marathon 3-hour general session, with a lot of good content. I’m looking forward to the rest of the conference.

I’ll be speaking on a panel this afternoon on the topic of digital business, drop by and say hi if you’re at the conference.

bpmNEXT 2016 demos: Oracle, OpenRules and Sapiens DECISION

This afternoon’s first demo session shifts the focus to decision management and DMN.

Decision Modeling Service – Alvin To, Oracle

Oracle Process Cloud as an alternative to their Business Rules, implementing the DMN standard and the FEEL expression language. Exposes decisions as services that can be called from a BPMN process. Create a space (container) to contain all related decision models, then create a DMN decision model in that space. Create test data records in the space, which will be deleted before final deployment. Define decisions using expressions, decision tables, if-then-else constructs and functions. Demo example was a loyalty program, where discounts and points accumulation were decided based on program tier and customer age. The decisions can be manually executed using the test data, and the rules changed and saved to immediately change the decision logic. A second demo example was an order approval decision, where an order number could be fed into the decision and an approval decision returned, including looping through all of the line items in the order and making decisions at that level as well as an overall decision based on the subdecisions. Once created, expose the decisions or subdecisions as services to be called from external systems, such as a step in a BPMN model (or presumably any other application). Good way to introduce standard DMN decision modeling into any application without having an on-premise decision management system.
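To make the “decision as a service” idea concrete, here’s a minimal sketch of how an external application (or a BPMN service task) might call a deployed decision over REST. The endpoint URL, payload fields and response shape are hypothetical stand-ins based on the loyalty example in the demo, not the actual Oracle Process Cloud API.

```python
import requests

# Hypothetical endpoint for a deployed "loyalty" decision service; the real URL,
# authentication and payload schema depend on the platform and the decision model.
DECISION_URL = "https://example.com/decision-services/loyalty/1.0/evaluate"

def get_loyalty_outcome(program_tier: str, customer_age: int) -> dict:
    """Call the exposed decision with the input variables the model expects."""
    payload = {"programTier": program_tier, "customerAge": customer_age}
    response = requests.post(DECISION_URL, json=payload, timeout=10)
    response.raise_for_status()
    # Assume the decision outputs come back as JSON,
    # e.g. {"discountPercent": 10, "pointsMultiplier": 2}
    return response.json()

if __name__ == "__main__":
    print(get_loyalty_outcome("gold", 34))
```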

Dynamic Decision Models: Activation/Deactivation of Business Rules in Real Time – Jacob Feldman, OpenRules

What-If Analyzer for decision modeling, for optimization, to show conflicts between rules, and to enable/disable rules dynamically. Interface shows glossary of decision variables, and a list of business rules with a checkbox to activate/deactivate each. Deactivating rules using the checkboxes updates the values of the decision results to find a desired solution, and can find minimum and maximum values for specified decision variables that will still yield the same decision result. The demo example was a loan approval calculation, where several rules were disabled in order to have the decision result of “approved”, then a maximum value generated for accumulated debt that would still give an “approved” result. Second example was how to build a good burger, optimizing cost for specific health and taste standards by selecting different rules and optimizing the resulting sets of decision variables. Third example was a scheduling problem, optimizing activities when building a house in order to maintain precedence and resulting in the earliest possible move-in date, working within budget and schedule constraints. Interesting analysis tool for gaining a deep understanding of how your rules/decisions interact, far beyond what can be done using decision tables, especially for goal-seeking optimization problems. All open source.
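The goal-seeking part of the demo (which combinations of active rules still yield “approved”) can be pictured as a search over rule activation subsets. This is a toy brute-force illustration in Python, not how the What-If Analyzer itself works; the rule names and thresholds are invented.

```python
from itertools import combinations

# Toy loan-approval rule set: each rule, when active, can veto approval.
rules = {
    "debt_ratio_over_40pct": lambda a: a["debt"] / a["income"] > 0.40,
    "credit_score_below_620": lambda a: a["credit_score"] < 620,
    "employment_under_1_year": lambda a: a["years_employed"] < 1,
}

def decide(applicant, active_rule_names):
    """Approve only if no active rule rejects the applicant."""
    return all(not rules[name](applicant) for name in active_rule_names)

def subsets_that_approve(applicant):
    """Enumerate which activation subsets still produce an approved result."""
    names = list(rules)
    for k in range(len(names) + 1):
        for subset in combinations(names, k):
            if decide(applicant, subset):
                yield subset

applicant = {"debt": 45_000, "income": 100_000, "credit_score": 600, "years_employed": 3}
for subset in subsets_that_approve(applicant):
    print("approved with active rules:", list(subset) or ["none"])
```

A real analyzer would use something smarter than enumeration (constraint solving, for example), which is what makes the minimum/maximum value search over decision variables practical.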

The Dirty Secret in Process and Decision Management: Integration is Difficult – Larry Goldberg, Sapiens DECISION

Data virtualization to create in-memory logical units of data related to specific business entities. Demo started with a decision model for an insurance policy renewal, with input variables included for each decision and subdecision. Acquiring the data for those input variables can require a great deal of import/export and mapping from source systems containing that data; their InfoHub creates the data model and allows setup of the integration with external sources by connecting data sources and defining mapping and transformation between source and destination data fields. When deployed to the InfoHub server, web service interfaces are created to allow calling from any application; at runtime, InfoHub ensures that the logical unit of data required for a decision is maintained in memory to improve performance and reduce implementation complexity of the calling application. There are various synchronization strategies to update their logical units when the source data changes — effectively, a really smart caching scheme that synchronizes only the data that is required for decisions.
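Conceptually, the “logical unit in memory” behaves like a per-entity cache that pulls only the fields a decision needs from the mapped source systems and refreshes them when the sources change. A minimal sketch of that idea, with hypothetical fetch functions standing in for the configured integrations:

```python
from time import time

# Hypothetical source-system readers; in reality these are the mapped and
# transformed integrations configured in the hub (policy admin, CRM, etc.).
def fetch_policy_fields(policy_id):   return {"premium": 1200.0, "term_months": 12}
def fetch_customer_fields(policy_id): return {"age": 44, "claims_last_3y": 1}

class LogicalUnitCache:
    """Keeps the decision inputs for each policy in memory, refreshed on demand."""
    def __init__(self, ttl_seconds=300):
        self.ttl = ttl_seconds
        self._units = {}  # policy_id -> (timestamp, merged decision inputs)

    def get(self, policy_id):
        ts, unit = self._units.get(policy_id, (0.0, None))
        if unit is None or time() - ts > self.ttl:
            unit = {**fetch_policy_fields(policy_id), **fetch_customer_fields(policy_id)}
            self._units[policy_id] = (time(), unit)
        return unit

    def invalidate(self, policy_id):
        """Called when a source system signals that the underlying data changed."""
        self._units.pop(policy_id, None)

cache = LogicalUnitCache()
print(cache.get("POL-123"))  # the renewal decision reads its inputs from memory
```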

bpmNEXT 2016 demos: W4 and BP3

Second round of demos for the day, with more case management. This time with pictures!

BPM and Enterprise Social Networks for Flexible Case Management – Francois Bonnet, W4 (now ITESOFT Group)

Adding ESN to case management (via Jamespot plugin) to improve collaboration and flexibility, enhancing a timeline of BPM events with the comments and other collaboration events that occur as the process executes. Initiates social routing as asynchronous event call. Example shows collaborative ownership assignment on an RFP, where an owner must self-select within the ESN before a process deadline is reached, or the assignment is made automatically. Case ID shared between W4 BPM and Jamespot ESN, so that case assignments, comments and other activities are sent back to BPM for logging in the process engine to create a consolidated timeline. Can create links between content artifacts, such as between RFP and proposal. Nice use of BPMN events to link to ESN, and a good example of how to use an external (but integrated) ESN for collaborative steps within a standard BPMN process, while capturing events that occur in the ESN as part of the process audit trail.

A Business Process Application with No Process – Scott Francis, BP3 Global

Outpatient care example with coordination of resources (rooms, labs) and people (doctors, patients); BPMN may not be the best way to model and coordinate resources since it can end up as a single-task anti-pattern. Target UI on tablet, using their Brazos tools with responsive UI, but can be used on desktop or phone. Patient list allows provider to manage high-level state of waiting versus in progress by assigning room, then add substates such as “Chaperone Required”, with immediate updates regardless of platform used. Patient and doctor notifications can be initiated from action menu. A beautiful UI implementation of a fairly simple state management application built on IBM BPM, although the infrastructure is there to tie in events and data from other systems.

DSTAdvance16 Keynote with @KevinMitnick

Hacker and security consultant Kevin Mitnick gave today’s opening keynote at DST’s ADVANCE 2016 conference. Mitnick became famous for hacking into a lot of places that he shouldn’t have been, starting as a phone-phreaking teenager, and spending some time behind bars for his efforts; these days, he hacks for good, being paid by companies to penetrate their security and identify the weaknesses. A lot of his attacks used social engineering in addition to technical exploits, and that was a key focus of his talk today, starting with the story of how Stanley Rifkin defrauded the bank where he worked of $10.2M by conning the necessary passwords and codes out of employees.

Hacking into systems using social engineering is often undetectable until it’s too late, because the hacker is getting in using valid credentials. People are strangely willing to give up their passwords and other security information to complete strangers with a good story, or unintentionally expose confidential information on peer-to-peer networks, or even throw out corporate paperwork without shredding. Not surprisingly, Mitnick’s company has a 100% success rate of hacking into systems if they’re permitted to use social engineering in addition to technical hacks; the combination of internal information and technical vulnerabilities is deadly. He walked us through how this could be done by looking just at metadata about a company, its users and their computers in order to build a target list and likely attack vector. He also discussed hacks that can be done using a USB stick, such as installing a rootkit or keylogger, reminding me of a message exchange that I had a couple of days ago with a security-conscious friend.


Mitnick demonstrated how to create a malicious wifi hotspot using a WiFi Pineapple to hijack a connection and capture information such as login credentials, or trigger what looks like a legitimate update (such as Adobe Flash Player) but actually installs malware, gaining complete access to the computer. He pointed out that you can avoid these types of attacks by using a VPN every time you connect to a non-trusted wifi hotspot.

He demonstrated an access (HID) card reader that can read a card from three feet away, allowing the card and site ID to be read from the card, then played back to gain physical access to a building as if he had the original card. Even high-security HID cards can be read with a newer device that they’ve created.

He described how phishing attacks can be used in conjunction with cloned IVR systems and man-in-the-middle attacks, where an unsuspecting consumer calls what they think is their credit card company’s number, but that call is routed via a malicious system that tracks any information entered on the keypad, such as credit card number and zip code.

Next, he showed the impact of opening a PDF with a malicious payload, where an Acrobat vulnerability can be exploited to insert malware on your computer. Java applets can use the same type of approach, making you think that the applet is signed by a trusted source.

Using an audience volunteer, he showed how online tracing sites can be used to search for a person, retrieving their SSN, date of birth, address, phone numbers and mother’s maiden name: more than enough information to be able to call in to any call center and impersonate that person.

Although he demonstrated a lot of technical exploits that are possible, the message was that many of these can be avoided by educating people, and testing them on their compliance with the procedures necessary to thwart social engineering attacks. He referred to this as the “human firewall”, and had a lot of good advice on how to strengthen it, such as advising people to use Google Docs to open untrusted attachments, and using technology to protect information from internal people when they don’t need to see it.

Lots of great — and scary — demos of ways that you can be hacked.

This is the last day for ADVANCE 2016; I might make it to a couple of sessions later today, then we have a private concert with Heart tonight.

DSTAdvance16 Day 1 Keynote with @PeterGSheahan

I’m back at DST‘s annual AWD ADVANCE user conference, where I’ll be speaking this afternoon on microservices and component architectures. First, however, I’m sitting in on the opening keynote where John Vaughn kicked things off, then passed off to Steve Hooley for a market overview. He pointed out that we’re in a low-growth environment now, with uncertain markets, making it necessary to look at cash conservation and business efficiencies as survival mechanisms. Since most of DST’s AWD customers are financial services, he talked specifically about the disruption coming to that industry, and how current companies have to drive down costs to be positioned to compete in the new landscape. Only a few minutes into his talk, Hooley mentioned blockchain, and how decentralized trust and transactions have the potential to turn financial services on its ear: in other words, the disruptions are technological as well as cultural.

He turned things over to the main keynote guest speaker, Peter Sheahan, author of several business innovation books as well as head of Karrikins Group. Sheahan talked about finding opportunity in disruption rather than fighting it. He presented four strategies for turning the challenge of disruption into opportunity: move towards the disruption; focus on higher order opportunities; question assumptions; and partner like you mean it. These all depend on looking beyond the status quo to identify where the disruption is happening to drive recognition of the opportunities, not just trying to do the same thing that you’re doing now, just better and faster. Some good case studies, such as Burberry — where the physical stores’ biggest competition is their own online shopping site, forcing them to create unique in-store experiences — with a focus on how the convergence of a number of disruptive forces can result in a cornucopia of opportunities. It’s necessary to look at the higher order opportunities, orienting around outcomes rather than processes, and not spend too much time optimizing lower-level activities without looking at how the entire business model could be disrupted.

A dynamic and inspiring talk to kick off the conference. Not sure I’ll be attending many more sessions before my own presentation this afternoon since I’m doing some last-minute preparations, although there are some pretty interesting ones tempting me.