OpenText EIM Day Toronto, Company/Product Keynotes

I always try to drop in on vendor events that happen in my own backyard, so today I’m at OpenText’s EIM Day in Toronto. OpenText is a success story in the Canadian software space, focused on enterprise information management, which includes content and process management. They have grown significantly through acquisitions, acquiring (somewhat controversially) two different BPM vendors (Metastorm and Global 360) to add to their home-grown content management capabilities.

Following a welcome from Jim McIntyre, the regional VP of sales, we heard a keynote from Mark Barrenechea, CEO. Barrenechea was with Oracle in the past, and has obviously continued to leverage those strong ties to the ERP market by integrating and partnering with SAP and other ERP vendors. He sees information-based strategies as the direction of business-technology transformation today, providing support for all of the unstructured information that lives alongside the structured information in ERP and other line of business systems. He outlined several transformations going on in the information enterprise: paper to digital; hierarchical to social; on premise to hybrid cloud; fragmented to managed, secured and governed; products to platforms; and ERP to EIM. He claimed that they will be able to replace multiple different products with a single platform from OpenText covering everything from capture to archive — capture, content management, process management, customer experience management (CEM) — although it appears that’s not yet released, and it’s not clear whether this will be a product branding exercise or a fully integrated platform.

This appeared to be a fairly conservative audience in terms of product adoption — I sat with someone who was just in the process of converting their LiveLink installation to Content Server, which I think is a bit overdue — so I’m not sure how well the message about their Tempo social collaboration platform went down, but OpenText will be pushing it later this year by using it for customer support and service interactions. What did go over well was Barrenechea’s scare tactic about Dropbox and Google Docs licensing — “did you know that they have the right to use your content for whatever purposes that they want?” — as a lead-in to the need for content security.

Barrenechea wrapped up with a product overview in their five main categories:

  • ECM, with Content Server, Tempo Box (an enterprise Dropbox-like product) and Archive (storage management)
  • CEM, with Tempo Social, DAM (digital asset management), WEM (web experience management) and CCM (customer communication management) making up the social suite
  • BPM, with Assure, MBPM and targeted apps making up their Smart Process Apps
  • iX (information exchange), with Secure iX, EDI and MFT (managed file transfer) providing secure transactions
  • DX (discovery), with InfoFusion and Semantic Navigation, indicating OpenText’s reentry into enterprise search; keep in mind that OpenText was a spin-off from a University of Waterloo project for indexing and searching the Oxford English Dictionary, making search part of their DNA

This still seems like a lot of products to me, many of which came through acquisitions and hence may have quite different internal architectures. Although Barrenechea made claims that these are integrated, I did hear the qualifier “…on some level”. Hopefully they are integrated in more than his slide deck.

We had a deeper product view with Lynn Elwood, VP of product marketing, walking us through a (fictional) customer use case for a tablet manufacturer:

  • Creating and publishing product web pages using WEM (this functionality originated with the Vignette acquisition), including a review/approval cycle for the content before publication, plus cross-platform publication to update Facebook and Twitter with the newly published information, as well as mobile-optimized sites. This also gathers metrics and KPIs about the published information, including user actions, sentiment, ratings and comments.
  • Customer communications using StreamServe for customizing any customer communications, including adding customer-specific messages to invoices and letters.
  • Dynamic case management for help desk and product complaints/returns, which can include scanned documents with content captured automatically and added as case metadata. Mobile device support and Tempo Box allow a customer to take a photo of damaged goods and upload it for the CSR to review.
  • Process analytics with ProVision (previously acquired by Metastorm, which was then acquired by OpenText) to model and simulate processes for improvement.
  • Records management within their Content Server product. This includes direct integration with Microsoft Outlook, so that emails can be manually dragged (or automatically moved) into folders that are managed by Content Server, hence can be part of a case and controlled by records management. There’s a lot of automated classification built in, so that content can be automatically found, classified and managed according to policies and usage.
  • Content storage management using their Archive product, which includes media staging and access control (including geographic constraints) based on policies.

A good overview of the product suite, but I’m still left with the feeling that this is a huge grab-bag of partially integrated components based on a variety of acquisitions over the years. They are definitely making progress in bringing them together, and the sort of use cases that Elwood showed us will help customers to understand the range of capabilities that OpenText can provide. As long as the products are individually capable and moving towards a common vision in terms of architecture, integration and user experience, there is an advantage to dealing with a single vendor for an array of related information management functionality: after all, that’s the same reason that many enterprises buy IBM products, in spite of an equally fragmented product acquisition and development strategy.

Smarter Process At IBM Impact 2013

Day 1 at IBM Impact 2013, following a keynote full of loud drums, rotating cars and a cat video, David Millen and Kramer Reeves gave a presentation on IBM’s vision for Smarter Process, which focuses on improving process effectiveness with BPM, case management and decision management. There are a number of drivers that they mentioned here that we’ll address in our panel this afternoon on “What’s Next For BPM” — the big four of mobile, social, cloud and big data — with the point that the potential for these is best seen when tied to mission-critical business processes. Not surprisingly, their research shows that 99% of CIOs looking to transform their business realize that they have to change their processes to do so.

Processes are not just about internal operations, but extend beyond the walls of the organization to take the customers’ actions into consideration, binding the systems of record to the systems of engagement. Therefore, it’s not just about process efficiency any more: we’re being forced to move beyond automation and optimization by the aforementioned disruptive forces, and directly address customer-centricity. In a customer-centric world, processes need to be responsive, seamless and relevant in order to engage customers and keep them engaged and well-served, while still maintaining efficiencies that we learned from all those years of process automation.

This isn’t new, of course; analysts (including me) and vendors have been talking about this sort of transformation for some time. What is new (-ish) is that IBM has a sufficiently robust set of product functionality to now have some solid case studies that show how BPM, CM and/or DM are being used with some configuration of mobile, social, cloud and big data. They’re also emphasizing the cross-functional approach required for this, with involvement of operations as well as IT and line of business teams.

Their key platforms for Smarter Process are BPM, Case Manager and ODM, and we had a summary of the relevant new features in each of these. BPM and ODM v8.5 are announced today and will be available in the next month or so. Here are some of the key enhancements that I caught from the torrent of information.

BPM v8.5:

  • Dashboards that allow you to click through directly to take action on the process. The dashboards provide a much better view of the process context, both for instance information such as the process timeline and activity stream, and for insights into team performance. This is now a more seamless integration with their “Coach” UI framework that is used for task UI, including presence, collaboration and social activity. I think that this is pretty significant, since it blurs the line between the inbox/task UI and the report/dashboard UI: analytics are context for actionable information. The process timeline provides a Gantt chart view — similar to what we’ve seen for some time in products such as BP Logix — and includes the beginnings of their predictive process analytics capabilities to predict if a specific instance will miss its milestones. There’s so much more that can be done here, such as what-if simulation scenarios for a high-value instance that is in danger of violating an SLA, but it’s a start. The team performance view provides real-time management of a team’s open tasks, and some enhanced views of the team members and their work.
  • Mobile enhancements with some new mobile widgets and sample apps, plus a non-production Worklight license bundled in for jumpstarting an organization’s mobile application development. You would need to buy full Worklight licenses before production deployment, but so many organizations are still at the tire-kicking stage so this will help move them along, especially if they can just modify the sample app for their first version. The design environment allows you to playback the mobile UI so that you can see what it’s going to look like on different form factors before deploying to those devices. As expected, you can take advantage of device capabilities, such as the camera and GPS, within mobile apps.
  • Social/collaboration enhancements, including presence indicators.
  • Integration into IBM Connections and IBM Notes, allowing for task completion in situ.
  • Blueworks Live integration, providing a link back to BWL from a BPM application that was originally imported from BWL. This is not round-tripping; in fact, it’s not even forward-tripping since any changes to the process in BWL require manual updates in BPM, but at least there’s an indication of what’s connected and that the changes have occurred.
  • Integration with the internal BPM content repository now uses the CMIS standard, so that there is a single consistent way to access content regardless of the repository platform (see the sketch after this list).
  • A new BPM on SmartCloud offering, providing a full IBM BPM platform including design and runtime tools in IBM’s cloud. This can be used for production as well as development/test scenarios, and is priced on a monthly subscription basis. No official word on the pricing or minimums; other BPM vendors who go this route often put the pricing and/or minimum license numbers prohibitively high for a starter package, so hoping that they do this right. Applications can be moved between cloud and on-premise BPM installations by networking the Process Centers.

ODM v8.5:

  • MobileFirst for business rules on the go, with RESTful API adapters inside the Worklight environment for building mobile apps that invoke business rules (see the sketch after this list).
  • Decision governance framework for better reusability and control of rules, allowing business users to participate in rule creation, review, management and release. Considering that rules are supposed to be the manifestation of business policies, it’s about time that the business is given the tools to work with the rules directly. There’s a full audit trail so that you can see who worked on and approved rules, and when they were promoted into production, and the ability to compare rule and decision table versions.
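
To make the mobile rules bullet above a bit more concrete, here’s a rough sketch of invoking a hosted decision service over REST/JSON from an app’s back end. The URL pattern, ruleapp/ruleset path and payload shape are my assumptions for illustration, not the documented ODM or Worklight adapter API.

```python
# Illustrative only: calling a rules decision service over REST/JSON. The URL,
# ruleapp/ruleset path and payload are assumptions, not the documented ODM or
# Worklight adapter API -- check the product docs for the real endpoints.
import json
import urllib.request

DECISION_SERVICE_URL = (
    "https://rules.example.com/DecisionService/rest/v1/"
    "LoanApprovalApp/1.0/EligibilityRules/1.0"  # hypothetical ruleapp/ruleset path
)

payload = {"applicant": {"age": 42, "creditScore": 710, "requestedAmount": 25000}}

req = urllib.request.Request(
    DECISION_SERVICE_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    decision = json.load(resp)

print(decision)  # response shape depends entirely on the ruleset's output parameters
```
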

Blueworks Live, for the enhancements already released into production a couple of weeks ago:

  • Decision discovery through graphical models, using the emerging Decision Model and Notation (DMN) standard from OMG. Decisions can now be documented as first-class artifacts in BWL, so that the rules are modeled and linked with processes. Although the rules can be exported to Excel, there’s no way to get them into IBM ODM right now, but I’m sure we can expect to see this in the future. The graphical representation starts with a root decision/question, and breaks that down to the component decisions to end up with a decision table (see the sketch after this list). Metadata about the decisions is captured, just as it is for processes, leveraging the glossary capability for consistency and reuse.
  • Natural language translation, allowing each user to specify their language of choice; this allows for multi-language collaboration (although the created artifacts are not translated, just the standard UI).
  • Process modeling and discovery
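
As promised above, here’s a toy illustration of the decomposition idea (my example, not BWL’s notation or export format): a root decision broken down into rules over a couple of inputs, evaluated top-down against a decision table.

```python
# Toy illustration of a decision table like the ones BWL lets you model.
# This is not BWL's notation or export format, just the underlying idea:
# a root decision evaluated against ordered rules over a few inputs.

# Each rule: (customer tier, minimum order total) -> discount percentage
DISCOUNT_RULES = [
    ("gold",   1000, 15),
    ("gold",      0, 10),
    ("silver",  500,  5),
    ("any",       0,  0),   # default row
]

def discount_decision(tier: str, order_total: float) -> int:
    """Return the discount percentage for the first matching rule."""
    for rule_tier, min_total, discount in DISCOUNT_RULES:
        if (rule_tier == "any" or rule_tier == tier) and order_total >= min_total:
            return discount
    return 0

print(discount_decision("gold", 1200))   # 15
print(discount_decision("silver", 600))  # 5
```
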

Case Manager v5.1.1:

  • Enhanced knowledge worker control and document handling, bringing better decision management control into the case environment.
  • Modeling complex cases.
  • Two solutions built on top of Case Manager: intelligent (fraud) investigation management, and patient care and insight.

Integration Bus v9.0:

  • Decision services built in so that decisions can be applied to in-flight data.
  • Policy-driven workload management to manage traffic flow on the ESB based on events.
  • Mobile enablement to allow push notifications to mobile devices.

The Case Manager stuff went by pretty quickly, and wasn’t included in my pre-conference briefing last week, but I think that it’s significant that we’re (finally) seeing the FileNet-based Case Manager here at Impact and on the same marketecture chart as BPM and ODM. I’m looking forward to hearing more about the level of integration that they’re going to achieve, and whether the products actually combine.

Underlying the main product platforms, they’re leveraging Business Monitor and ODM to develop operational intelligence capabilities, including predictive analytics. This can gather events from a variety of sources, not just BPM, and perform continuous analysis in real-time to aid decision-making.

They are also including their services offerings as part of the Smarter Process package, supporting an organization’s journey from pilot to project to program. They offer industry solution accelerators — I assume that these are non-productized templates — and can assist with the development of methodologies and a BPM COE.

There are a number of breakout sessions on the different products and related topics over the next couple of days, but I’m not sure how much I’ll be able to see given the hectic schedule that they’ve given me as part of the analyst program.

Apologies for those who saw (briefly) an earlier version of this post; the new version of the WordPress Android app has a new button, and I went ahead and clicked it.

IBM Impact Next Week

I’m off to IBM Impact next week, where I’m speaking on a panel on Monday afternoon about “What’s Next For BPM”, along with Neil Ward-Dutton, Bruce Silver, Eric Herness and Pierre Haren, hosted by Irene Lyakovetsky. I’ll also be attending the analyst briefings and will post about what’s new with IBM BPM, Blueworks Live and related products. Annoyingly, there doesn’t appear to be any way to see the agenda unless you’re signed up for the conference, meaning that I can’t link directly to session descriptions, but will blog about whatever I attend if I have time.

It will be a pretty crammed few days, but if you’re going to be there and want to say hi, let me know and we can try to connect. And speaking of connecting, get yourself invited to the BP3 Connect cocktail hour on Tuesday evening (I’m sure that Scott Francis can help you with that), I’ll be there for sure [and everything will be off the record, if you know what I mean 🙂 ].

Smart Process Apps with Kofax and Forrester

Kofax sponsored a webinar this week (replay here) featuring Andy Bartels of Forrester Research speaking about Smart Process Applications (SPA): a term introduced by Forrester to describe collaborative, process-based packaged applications for human-centric work. In their terms: “a new generation of applications to help make human-centric, collaborative business activities be more effective”, with the goal to “help people be smarter in executing critical business activities”. You can check out their report on this from last year; the name is still struggling to gain acceptance, but vendors such as Kofax and OpenText (for whom I did a webinar and white paper on this topic last month) are helping to push it as a slice of the ECM/BPM/CM market where they have product offerings [by CM, I mean case management, including advanced CM (ACM), adaptive CM (also ACM), production CM (PCM) and dynamic CM (DCM), the latter term preferred by Forrester].

Forrester makes the distinction between transactional process apps and SPAs: transactional process apps tend to have standardized processes and little collaboration, whereas SPAs have a greater degree of collaboration as well as decision-making by the participants. If that was all, then this would just fall into the case management category – probably production case management – but an important focus of SPAs is that they are packaged applications for a specific activity: contract lifecycle management, customer support, procurement and the like. Bartels described them as filling in the gaps between the transactional apps, rather than using email and spreadsheets to bridge those gaps. He kept referring to these apps as “making people smarter”, which I think is a slightly awkward way of saying that they provide informational context for human decision-making, providing the right information to people at the right time to do their work.

He pointed out that BPM/DCM platforms provide an application development environment for companies to build their own SPAs, and that companies can then keep that app to themselves as a competitive differentiator, give it back to the vendor to incorporate into the base product, or sell it themselves (possibly in conjunction with the vendor). I think that a lot of these apps will come from the vendors directly, possibly via code developed for customer projects.

Martyn Christian of Kofax took the second part of the webinar to talk about Kofax solutions that fit into the Smart Process Apps envelope: capture of content as it moves from systems of engagement to systems of record is definitely their sweet spot. He overlaid their technology portfolio on Forrester’s “jigsaw” graphic to show that they offer something in all five pieces, although they are really pushing a platform for building SPAs, not the fully packaged SPAs that we’re seeing from some other vendors that are starting from a more comprehensive platform. That being said, Kofax is offering a customer onboarding SPA for capturing information at the point of origination, automating NIGO (not in good order) resolution and integrating with line of business and ECM systems; this sort of capture-focused SPA, or what they call “First Mile Solutions”, is what we’re likely to see from Kofax in the future, especially as they continue to integrate the functionality of the Singularity (BPM/CM) and Altosoft (BI/analytics) acquisitions.

Forrester has a brand new Wave for SPAs; you can get this from the Kofax site here (registration required), plus a copy of a Forrester market analysis of multichannel capture, BPM and SPA, commissioned by Kofax. I’m sure that many of the other vendors in the Wave will have the report available as well, and it’s an interesting group of vendors: some horizontal BPM/ECM vendors, Salesforce, and a supply chain software vendor. This category is still such a mixed bag, and it does have the feeling of Forrester running a clustering algorithm on characteristics of existing solutions to see what they had in common, then “creating” the SPA category to describe them. Whether this is a true market category or just a speed bump on the way to a new age of applications and their development platforms remains to be seen.

Can BPM Save Lives? Siemens Thinks So

My last session at Gartner BPM 2013 is a discussion between Ian Gotts of TIBCO and their customer Tommy Richardson, CTO of Siemens Medical Solutions. I spoke with Siemens last year at Gartner and TUCON and was very interested in their transition from the old iProcess BPM platform (which originally came from TIBCO’s Staffware acquisition) to the newly-engineered AMX platform, which includes BPM and several other stack components such as CEP. Siemens isn’t an end-user, however: they OEM the TIBCO products into their own Soarian software, which is then sold to medical organizations for what Richardson refers to as “ERP for hospitals”. If you go to a hospital that uses their software, a case (process instance) is created for you at check-in, and is maintained for the length of your stay, tracking all of the activity that happens while you’re there.

With about 150 customers around the world, Siemens offers both hosted and on-premise versions of their software. Standard processes are built into the platform, and the hospitals can use the process modeler to create or modify the models to match their own business processes. These processes can then guide the healthcare professionals as they administer treatment (without forcing them to follow a flow), and capture the actions that did occur so that analytics can determine how to refine the processes to better support patient diagnosis and treatment. This is especially important for complex treatment regimes such as when an unusual infectious disease is diagnosed, which requires both treatment and isolation actions that may not be completely familiar to the hospital staff. Data is fed to and from other hospital systems as part of the processes, so the processes are not executing in isolation from all of the other information about the patient and their care.

For Siemens, BPM is a silver bullet for software development: they can make changes quickly since little is hard-coded, allowing treatment processes to be modified as research and clinical results indicate new treatment methods. In fact, the people who maintain the flows (both at Siemens and their customers) are not developers: they have clinical backgrounds so that they are actually subject matter experts, although they are trained on the tools and work in a process analyst role rather than a medical practitioner role. If more technical integration is required, then developers do get involved, but not for process model changes.

The Siemens product does a significant amount of integration between the executing processes and other systems, such as waiting for and responding to test results, and monitoring when medications are administered or the patient is moved to another location in the hospital. This is where the move to AMX is helping them, since there’s a more direct link to data modeling, organizational models, analytics, event handling from other systems via the ESB, and other functionality in the TIBCO stack, replacing some amount of custom software that they had developed as part of the previous generations of the system. As I’ve mentioned previously, there is no true upgrade from iProcess to AMX/BPM since it’s a completely new platform, so Siemens actually did a vendor evaluation to see if this was an opportunity to switch which BPM product they OEM into their own, and decided to stay with TIBCO. When they roll out the AMX-based version in the months ahead, they will keep the existing iProcess-based system in place for each existing client for a year, with new patient cases being entered on the new system while allowing the existing cases to be worked in place on the old system. Since a case completes when a patient is discharged, there will be very few cases remaining on the iProcess system after a year, which can then be transferred manually to the new system. This migration strategy is far beyond what most companies do when switching BPM platforms, but necessary for Siemens because of the potentially life-threatening (or life-saving) nature of their customers’ processes. This also highlights how the BPMS is used for managing the processes, but not as a final repository for the persistent patient case information: once a case/process instance completes on patient check-out, the necessary information has been pushed to other systems that maintain the permanent record.
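
The parallel-run strategy is worth spelling out, since it’s unusual; here’s a rough sketch of the routing logic as I understand it. The cutover date, system names and function shapes are placeholders, not Siemens’ actual implementation.

```python
# Rough sketch of the parallel-run migration described above. The cutover date,
# system names and function shapes are placeholders, not Siemens' actual code.
from datetime import date, timedelta

CUTOVER_DATE = date(2013, 6, 1)      # hypothetical go-live of the AMX-based version
PARALLEL_RUN = timedelta(days=365)   # keep the iProcess system alive for a year

def system_for_new_case(check_in: date) -> str:
    """New admissions open their case on the new platform once it's live."""
    return "amx" if check_in >= CUTOVER_DATE else "iprocess"

def system_for_existing_case(opened_on: str) -> str:
    """Existing cases are worked where they were opened, until discharge."""
    return opened_on

def cases_to_migrate_manually(today: date, open_iprocess_cases: list) -> list:
    """After the parallel-run year, the few cases still open on iProcess
    are handed to a migration team rather than converted automatically."""
    if today >= CUTOVER_DATE + PARALLEL_RUN:
        return open_iprocess_cases
    return []
```
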

Modernizing the healthcare information systems such as what Siemens is doing also opens up the potential for better sharing of medical information (subject to privacy regulations, of course): the existence of an ESB as a basic component means that trusted systems can exchange information, regardless of whether they’re in the same or different organizations. With their hosted software, there’s also the potential to use the Siemens platform as a way for organizations to collaborate; although this isn’t happening now (as far as I can tell), it may be only a matter of time before Siemens is hosting end-to-end healthcare processes with participants from hospitals, speciality clinics and even independent healthcare professionals in a single case to provide the best possible care for a patient.

The Neuroscience Of Change

We wrapped up day 2 of Gartner BPM 2013 with David Rock, author of Your Brain at Work: Strategies for Overcoming Distraction, Regaining Focus, and Working Smarter All Day Long, on neuroleadership and the neuroscience behind organizational change. Neuroleadership deals with how leaders make decisions and solve problems, regulate emotions, collaborate, and facilitate change; this last one was the key focus of his presentation today. In order to create change effectively using a brain-based model, we need to create a “toward” state, facilitate new connections, and embed new habits. Basically, our brains are really bad at doing things that we’ve never done before, because that requires using the relatively small prefrontal cortex. In other words, if we have to think about something, it’s hard. Furthermore, if you’re threatened or stressed, the capability of the prefrontal cortex decreases, meaning that you’re only going to be able to do simple tasks that you’ve done before.

He outlined three levels of thinking: level 1 tasks are simplistic things that you’ve seen/done a lot before, such as deleting emails; level 2 tasks are things that you’ve seen less often, such as scheduling meetings; and level 3 are more complex concepts that you’ve never seen before, such as writing a business plan. When you’re really stressed, you’re pretty much only good for doing level 1 tasks, although peak performance does happen when you’re under a bit of stress.

Change requires a lot of cognitive processing, but when change is perceived as a threat, cognitive processing function decreases. Having change not be perceived as a threat requires creating a toward state, that is, something that is rewarding; since our brains are deeply social, to the point where social pain is the same as physical pain within the brain, social rewards can be used to create that toward state. The five domains of social pain/pleasure are status (your perception of your position relative to others), certainty (uncertainty arouses the limbic system), autonomy (the brain likes to predict and have a say in the future, and having some degree of choice can reduce stress levels), relatedness (categorizations of similar/different to decide who’s on your team and shares your goals), and fairness (unfairness is the same as pain, to the brain). Having higher levels of these social rewards reduces stress, and we protect against the threat of them decreasing. Change, however, can create threats in all of these domains, and you need to find offsetting rewards in one or more of these domains in order to get people thinking about the future rather than just mentally cowering in a corner.

Once a toward state is created by addressing the social reward domains, you can facilitate new connections in people’s brains by creating an environment that permits them to have insights, which starts to form those new pathways that lead to habits.

Thought-provoking talk about the neurological motivations behind change, and a good way to end the day.

Tonight, I’m off to a TIBCO customer event — as a matter of disclosure, TIBCO provided me with one of their conference passes to be here, although I paid my own travel expenses — and I’ll only have time for one or two sessions in the morning before I head for the airport.

Empowering Business Roles For Dynamic BPM

It’s the end of day 2 at Gartner BPM 2013, and I’m in my first session with Janelle Hill — hard to believe, because usually I gravitate to her topics since she’s such an insightful speaker. She admitted that she is feeling like it’s day 4 (of a 2.5-day conference), so glad to know that I’m not the only one experiencing a bit of conference fatigue. Of course, this is my third week in a row at conferences, so that could be a contributor, too.

She’s doing one of their short “To The Point” sessions, only 30 minutes including Q&A, with a quick review of dynamic BPM and what it means to change a process in-flight. There are a number of things that can be done to change a process, ranging from small changes such as reassigning work during runtime, deleting outdated activities, or changing a monitoring dashboard; to mid-range changes such as adding new performance metrics or changing a business rule; to large changes such as major resequencing or mapping to different underlying services. This was a bit of a confusing classification, since some of these were clearly runtime changes to a specific process instance or set of instances, while others were more design-time template changes that would impact all process instances created from that point on. Regardless, it comes down to what kind of things you might want to change in your process, and what could be considered as changes that business could make directly or could collaborate on with IT. And, as soon as process changes are made by the business — or made at all — there need to be changes to the mindset: developers no longer should think about building to last, but rather building to change. This seems like just good practice for most of us, but there are still a lot of enterprise developers who don’t think about modular service-oriented architecture or about using declarative business rules to enforce constraints.
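
As a small illustration of the “build to change” point (my own example, not Gartner’s): the same constraint hard-coded versus expressed as a declarative rule that the business could maintain directly.

```python
# My own illustration of "build to change": the approval threshold lives in a
# rule definition that the business can edit, not in a hard-coded condition.

# Hard-coded (build to last): changing the threshold means a code release.
def needs_manager_approval_hardcoded(amount: float) -> bool:
    return amount > 10000

# Declarative (build to change): the rule is data, loadable from a rules
# repository or a config file, so the business can change it directly.
APPROVAL_RULES = [
    {"field": "amount", "operator": ">", "value": 10000, "action": "manager_approval"},
    {"field": "region", "operator": "==", "value": "offshore", "action": "compliance_review"},
]

OPERATORS = {">": lambda a, b: a > b, "==": lambda a, b: a == b}

def required_actions(request: dict) -> list:
    """Evaluate each declarative rule against the request and collect actions."""
    return [
        rule["action"]
        for rule in APPROVAL_RULES
        if OPERATORS[rule["operator"]](request.get(rule["field"]), rule["value"])
    ]

print(required_actions({"amount": 25000, "region": "domestic"}))  # ['manager_approval']
```
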

She finished up with some must-haves for enabling dynamic BPM, which were all technology-based; this was a bit disappointing since she only briefly addressed the topic of what cultural and role/responsibility changes need to be made in order to have business people actually make the changes that the technology now allows them to make. The technology in this area is becoming fairly mature, but I find that the mindset required for IT to relinquish control of process changes, and business to assume responsibility, is the part that’s lagging. She did point out that business people are becoming more comfortable with being involved with model-driven design, but it’s way more than just knowing how to use the tools.

BPM COE At Pershing

Barbara Fackelman and Regina DeGennaro of Pershing (a BNY Mellon subsidiary providing outsourced financial transaction services) presented at Gartner BPM 2013 about their BPM initiative, as it grew from reengineering a broken process related to federal reserve exchanges — saving them $1M/year — to a BPM center of excellence (COE). As I often recommend for growing a COE, they built their COE as an offshoot of their initial project by building a reusable BPM framework along the way, then communicated that out to the rest of the organization to uncover other potential spots for process improvement.

They started to identify sub-processes and functions that are reusable across different processes, such as document rendezvous, which impacted document scanning and handling processes as well as the downstream transaction processing. With that in their portfolio, they were able to implement additional BPM projects with significant savings, making the BPM COE a very popular service inside Pershing.

Their BPM COE reports up to the executive committee, and gets input from a number of other sources internally:

  • Architecture review board
  • Technology prioritization committee
  • Dedicated programming groups and QA
  • Process owners
  • Quality management office
  • BPM solutions team

They have a number of key roles in the BPM COE:

  • Executive sponsor
  • Process owner
  • Process architect
  • Product owner
  • Governors
  • Development leads
  • Quality assurance
  • Product manager
  • Business analyst
  • Process librarian
  • Metrics master (BI architect)

With all of this in place, they have a mature COE that supports process optimization and innovation, and reviews new technologies to support the enhanced vision. Interestingly, they treat their BPM COE like any other process project: having defined and implemented it, they are constantly monitoring what/how their COE is doing, and continuously optimizing it. As an outsourcing firm, their main focus is on maximizing straight-through processing (STP), and they can measure the performance of the COE since STP is a specific mission of the COE. As they have found, nothing succeeds like success: their STP process improvements to date have led to more collaboration and projects in other areas of their organization.

They’re using a lot of homegrown stuff, plus IBM BPM and Pega; like most big financial services organizations, they are piecing together a lot of this themselves to make it work best for them.

BPM And MDM For Business Performance Improvement

Andrew White presented a session at Gartner BPM 2013 on how process, applications and data work together, from his perspective as an analyst focused on master data management (MDM). He was quick to point out that process is more important than data 😉 but put forward MDM as a business-led discipline for maintaining a single version of the truth for business data. The focus is on how that data is created and maintained in business applications, assuring the integrity of the processes that span those applications. Since his background is in ERP systems, his view is that processes are instantiated by applications, which are in turn underpinned by data; however, the reality that I see with BPMS is that data resides there as well, so it’s fair to say that processes can consume data directly, too.

Master data is the common set of attributes that are reused by a wide variety of systems, not application-specific data — his example of master data was the attributes of a specific inventoried product such as size and weight — but there is also shared data: that grey area between the common master data and application-specific data. There are different tiers of systems identified in their pace layering, with different data access: systems of record (e.g., ERP) tend to consume enterprise master data and transaction data; systems of differentiation (e.g., CRM) consume master data, analytic data and rich data; and systems of innovation (e.g., Facebook app) consume analytic data, rich data and cloud-sourced data that might be someone else’s master data. End-to-end business processes may link all of these systems together, and be guided by different data sources along the way. It all makes my head hurt a little bit.

MDM programs have some of the same challenges as BPM programs: they need to focus on specific business outcomes, and focus on which processes need improving. And like the Fight Club reference that I heard earlier today (“the first rule of process is that you don’t talk about process”), you want MDM to become transparent and embedded, not be a silo of activity on its own. Also in common with some BPM initiatives is that MDM is often seen as an IT initiative, not a business initiative; however, just like defining business processes, it’s up to the business to identify their master data. MDM isn’t about data storage and retention; it’s about how data is used (and abused) throughout the business lifecycle. In my opinion, we still need better ways to model the data lifecycle at the same time as we model business processes; BPMN 2.0 added some provisions for data modeling, but it’s woefully inadequate for a whole data lifecycle model.

White presented a number of things that we need to think about when creating an enterprise data model, and best practices for aligning BPM and MDM. The two initiatives can be dovetailed, so that BPM provides priority and scope for the MDM efforts. Business processes (and not just those implemented in a BPMS) create and consume data, and once a process is defined, the points where data is created, viewed and updated can be identified and used as input to the master data model. From an EA standpoint, the conceptual, logical and physical models for data and process (or column 1 and column 2, if you’re a Zachman follower) need to be aligned.

Process Intelligence And Real-Time Visibility At Electrolux With SoftwareAG

Jean Lastowka of Electrolux and Dave Brooks of SoftwareAG presented at Gartner BPM 2013 on process intelligence and visibility; apparently, SoftwareAG chose to include a white paper that I wrote for them in the conference handouts (which I neglected to keep), so if you’re here, check out the package.

Electrolux makes home and professional appliances — best known in North America for vacuum cleaners, but they have a much broader repertoire — and were looking to do some internal alignment in order to serve customers better. To meet this goal, they established their BPM practice a week before last year’s Gartner BPM conference, established a BPM framework, did collaborative process modeling and launched a new ERP system for their new end-customer distribution channel over the next five months, then brought on SoftwareAG’s iKnow product for process visibility in October, and launched it to their business community in November.

Their BPM efforts were initially around end-to-end process mapping of the new processes in ARIS, allowing business and IT to have a shared knowledge of the processes; they are not using a BPMS to automate processes, but the processes are encapsulated in the ERP system implementation and procedural knowledge. Unfortunately, with these new processes and a new ERP system, people were still trying to manage the processes in the old ways (including Excel), causing a lot of customer dissatisfaction. iKnow allowed them to take their process models, connect up event feeds (I assume) from the ERP system (and presumably other systems), and show real-time order tracking and KPIs overlaid on the process model. This allows for predictive analytics, providing advance warning of potential lead time failures based on inventory levels, for example, and allowed them to track order trends and provide a single view of on-hand and in-transit inventory. Best of all, the visualizations — inventory displayed on a geographic map, for example, as well as real-time alerts based on KPIs — allowed the business to more easily consume the data than in the textual format that they had previously received.
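
As a rough sketch of the kind of lead-time warning described here (my own simplification, not how iKnow computes it): compare each open order’s quantity against on-hand plus in-transit inventory that will arrive before the promised date, and flag the orders at risk.

```python
# My simplified illustration of the lead-time warning described above -- not
# how iKnow computes it, just the shape of the check: flag orders whose promised
# date can't be covered by on-hand plus in-transit inventory.
from datetime import date

def orders_at_risk(orders, on_hand, in_transit):
    """Return order ids that may miss their promised ship date.

    orders: list of dicts with 'id', 'sku', 'qty', 'promised' (date)
    on_hand: {sku: qty available now}
    in_transit: {sku: [(arrival_date, qty), ...]}
    """
    at_risk = []
    for order in sorted(orders, key=lambda o: o["promised"]):
        sku = order["sku"]
        # inventory usable for this order: on hand now, plus anything arriving in time
        available = on_hand.get(sku, 0) + sum(
            qty for arrival, qty in in_transit.get(sku, []) if arrival <= order["promised"]
        )
        if available < order["qty"]:
            at_risk.append(order["id"])
        else:
            on_hand[sku] = on_hand.get(sku, 0) - order["qty"]  # naively reserve stock
    return at_risk

print(orders_at_risk(
    [{"id": "A1", "sku": "vac-900", "qty": 50, "promised": date(2013, 5, 10)}],
    on_hand={"vac-900": 20},
    in_transit={"vac-900": [(date(2013, 5, 8), 40)]},
))  # [] -- 20 on hand plus 40 arriving in time covers the order
```
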

This was a good example of what BPM looks like without a BPMS automation project: collaborative process modeling, processes implemented in some other system (an ERP system, in this case), then metrics and KPIs gathered and displayed relative to the process model in a dashboard, with items requiring action flagged and pushed to the appropriate people. Bracketing the ERP system with process modeling and monitoring transforms it into a process-centric BPM initiative that drives process improvement and provides actionable information.

There are a couple of vendors in this part of the BPM technology business, providing tooling to allow you to see the processes that are running in your other (non-BPMS) systems in real-time. For many organizations, this is the only option since they have core ERP, CRM and legacy systems that run their business, but that don’t provide good visualizations or explicit process models. Process visibility is the first step to process excellence.