Bruce Silver Now Stylish With DMN As Well As BPMN

I thought that Bruce Silver’s blog had been quiet for a while: it turns out that he moved to a new, more representative domain name, and my feed reader wasn’t updating from there. He’s rebranding his business, including his blog, under Method & Style, mirroring the title of his popular book and training course BPMN Method and Style, and now his new book and training options for DMN: DMN Method and Style: The Practitioner’s Guide to Decision Modeling with Business Rules.

His blog has a ton of new content on DMN, starting with a great piece that compares the path of the DMN standard with that of BPMN, which is considerably more mature. He discusses the five key elements of DMN, then goes into each of those in detail in the next five posts: Decision Requirements Diagrams, Decision Tables, FEEL (a new expression language developed for DMN), Boxed Expressions and the Metamodel and Schema. It’s really interesting to read his analysis comparing the evolution of the two standards: there was a time when everyone thought that BPMN was just about the visual notation, but to make it really useful, the interchange format and execution semantics have to come along at some point. Still, it’s useful to get started in DMN now with DRDs and decision tables, since that at least makes the decision models explicit instead of being buried in text requirements.
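If you haven’t seen a DMN decision table yet, the core idea is simple enough to sketch in a few lines of code. This is my own toy illustration in Python (not FEEL, and not from Silver’s book): a single-hit decision table under the unique hit policy, where exactly one rule is allowed to match the inputs.

```python
# Hypothetical illustration of a DMN-style decision table (unique hit
# policy): each rule maps input conditions to one output value, and
# exactly one rule may match for any given input combination.

def decide_discount(customer_type: str, order_size: int) -> float:
    """Evaluate a single-hit decision table for a customer discount."""
    rules = [
        # (condition over inputs, output)
        (lambda t, s: t == "business" and s >= 10, 0.15),
        (lambda t, s: t == "business" and s < 10,  0.10),
        (lambda t, s: t == "private",              0.05),
    ]
    matches = [out for cond, out in rules if cond(customer_type, order_size)]
    if len(matches) != 1:
        raise ValueError(f"unique hit policy violated: {len(matches)} rules matched")
    return matches[0]

print(decide_discount("business", 12))  # 0.15
```

The value of the standard, of course, is that the same table is drawn, interchanged and executed in a vendor-neutral notation rather than buried in application code like this.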

Once you’ve brushed up on his posts covering the five key elements, you can also read about the conformance levels that vendors can choose to implement, and what didn’t make it into DMN 1.1, which is the first real version of the standard.

He doesn’t pull any punches in his discussion, and is not very complimentary about some aspects of the standard and how some vendors choose to implement it. Just as he is with BPMN. 🙂

HoHoTO 2015: be a sponsor, or just come for the party

HoHoTO is a fundraiser event put on each year by Toronto’s digital community: a great party with dancing, raffles and a chance to catch up with your friends (at the top of your lungs to be heard over the dance tunes). Since its inception in 2008, HoHoTO has raised over $350,000 for the Daily Bread Food Bank – an awesome organization that helps to feed people in our community – but this year, HoHoTO has turned its eye to supporting “the next generation of founders, funders and tech professionals”. In particular, the focus will be on organizations that help to bring more women and minorities into technology and digital businesses. The event is on December 11 at the Mod Club, and early bird tickets are on sale here.

The primary focus is on YWCA Toronto’s Girls’ Centre, with a three-year goal to completely fund the Girls’ Centre and push for the opening of another one. This centre provides programs for girls aged 9 to 18 to allow them to try activities and develop skills, including “Miss Media” for designing online media such as blogs and websites. It’s located in Scarborough, the easternmost third of Toronto, serving a community that has upwards of 65% visible minorities (and the best ethnic food in the world, according to one economist), meaning that it is a great match with HoHoTO’s focus on promoting women and minorities in business and technology from an early age. HoHoTO is also bringing together professional women as mentors, including me.

The HoHoTO event, run by unpaid volunteers, is raising money through tickets and sponsorships. If you or your organization recognizes the value of diversity in business, and wants to support the success of women and minorities in digital and technology fields, consider becoming a sponsor of the event. Details are here, and most of your contribution is eligible for a tax receipt. You’ll get recognition on HoHoTO’s site and at the event, other promotional opportunities throughout the year, a handful of event and drink tickets to bring your team out to enjoy the evening, and a nice warm feeling in your heart.

Join the AIIM paper-free pledge

AIIM recently posted about World Paper-Free Day on November 6th, and although I’m not sure that it’s recognized as a national holiday or anything, it’s certainly a good idea. I blogged almost three years ago about my mostly paperless office, and how to achieve such a thing yourself. Since that time, I’ve added an Epson DS-510 scanner, which has a nice small footprint and a sheet feeder; it sits right on my desk and there is never a backlog of scanning.

It’s not just about scanning and shredding, although those are pretty important activities: you have to have a proper retention plan that adheres to any regulatory requirements, and a secure offsite (cloud or otherwise) backup capability to ameliorate any physical site disasters.

You also need to consider how much backfile conversion you’ll do: I decided to back-scan everything except my financial records at the time that I started going completely paperless, then scan everything including financials from that date forward. Each year, another batch of old paper financial records reached its destruction date and was shredded, the last of them just last year, and I no longer have any paper files. If back-scanning is too time-consuming for you but you want to start scanning everything day-forward, then store your old paper files by destruction date so that you can easily shred the batch of expired files each year until there are none left.
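If you want to make that destruction-date filing systematic, it’s trivial to script. Here’s a toy sketch with made-up retention periods (check your own regulatory requirements before using anything like this) that indexes records by the year they can be shredded or deleted:

```python
# Toy illustration: index records (paper or scanned) by destruction date
# so that each year's expired batch is easy to find. Retention periods
# here are example values only, not legal advice.
from datetime import date
from collections import defaultdict

RETENTION_YEARS = {"financial": 7, "contract": 10, "correspondence": 2}

def destruction_year(record_date: date, record_type: str) -> int:
    """Year in which a record's retention period expires."""
    return record_date.year + RETENTION_YEARS[record_type]

records = [
    (date(2012, 4, 15), "financial", "2012 tax return"),
    (date(2014, 9, 1), "correspondence", "insurance letter"),
]

by_year = defaultdict(list)
for rec_date, rec_type, label in records:
    by_year[destruction_year(rec_date, rec_type)].append(label)

for year in sorted(by_year):
    print(year, "->", by_year[year])  # shred (or delete) this batch that year
```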

These things – scanning, document destruction, retention plan, secure backup, backfile conversion – are the same things that I’ve dealt with at large enterprise customers in the past on ECM projects, just on a small-office scale.

Avoiding a surfeit of conferences

This time of year, I’m usually flying back and forth to Las Vegas to engage in the fall conference season: software vendors hold their annual user conferences, and invite me to attend in exchange for covering most of my travel expenses. They don’t pay me to attend unless I give a presentation – in fact, many are not even my clients – and since I’m self-employed, that means I’m giving up billable days to attend. Usually, I consider that a fair trade, since it allows me to get a closer look at the products and talk to the vendor’s employees and customers, and I typically blog about what I see.

This year, however, I stepped away from most of the conferences, including the entire slate of fall events. A couple of family crises over the summer required a lot of my attention and energy, and when I started getting requests to attend fall conferences, I just didn’t feel that they were worth my time.

Many vendors have become overly focused on the amount of blogging that I do at their conference, rather than on strengthening our relationship. My conference blogging, described as “almost like being there”, is seen by some vendors as a savant party trick, and they consider themselves cheated in some way if I don’t publish enough content during the conference. What they forget is that by attending their conference, I’m gaining insights into their company and products that I can use in future discussions with enterprise clients, as well as in any future projects that I might do with the vendor. I generate revenue as a consultant and industry analyst; blogging is something that I do to analyze and solidify my observations, to discuss opinions with others in the field, and to expand my business reach, but I’m never paid for it, and it is never a condition of attending an event – at least in my mind.

Another factor is the race to the bottom in travel expenses. Many vendors require that they book my air travel, and when booking the one conference that I was going to attend this fall, I asked their travel group to pay the $20 fee to select a decent (economy) seat for the 5-hour tourist-class flight, but they refused. Many times in the past I’ve just paid for seat assignments and upgrades out of my own pocket, but this time it became about the principle: the vendor in question, who is not an active client of mine, placed that little value on my attendance.

So if you’re a vendor, here’s the deal. A paid client relationship with me is not a prerequisite of me attending your conference, and has never been in the past, but there has to be a mutual recognition of the value that we each bring to the table. I bring 25 years of experience and opinions as a systems implementer, consultant and industry analyst, and I offer those opinions freely in conversation: consider it free consulting while I’m at your conference. I expect to gain insights into your company, products and customers, through public conference sessions and private discussions. I may blog about what I see and hear (at least the parts not under non-disclosure), or use that information in future discussions with enterprise clients. Or I may not, if I don’t find it relevant or interesting. Lastly, when you ask me to fly somewhere, keep in mind that it is not a treat for me to travel to Las Vegas or Orlando, and at least make sure that I’m not in the middle seat at the back of a 50-row aircraft.

As always, everything after the bar opens is off the record.

The Enterprise Digital Genome with Quantiply at BPMCM15

“An operating system for a self-aware quantifiable predictive enterprise” definitely gets the prize for the most intriguing presentation subtitle, for an afternoon session that I went to with Surendra Reddy and David Chaney from Quantiply (a stealth startup that has just publicly launched), and their customer, a discount brokerage service whose name I have been requested to remove from this post.

Said customer has some significant event data challenges, with a million customers and 100,000 customer interactions per day across a variety of channels, and five billion log messages generated every day across all of their product systems and platforms. Having this data exist in silos with no good aggregation tools means fragmented and poor customer support, and also significant challenges in system and internal support.

To address these types of heterogeneous data analysis problems, Quantiply has a two-layer tool: Edge Cloud for the actual data analysis, which can then be exposed to different roles based on access control (business users, operational users, data scientists, etc.); and Pulse for connecting to various data sources including data warehouses, transactional databases, BPM systems and more. It appears that they’re using some sort of dimensional fact models, which are fairly standard in data warehouse analytics, but their Pulse connectors allow them to pour in data on a near-real-time basis, then make the connections between capabilities and services to be able to do fast problem resolution on their critical trading platforms. Because of the nature of the graph connectivity that they’re deriving from the data sources, they’re able to not only resolve the problem by drilling down, but also determine which customers were impacted by the problem in order to follow up. In response to a question, the customer said that they had used Splunk and other log analytics tools, but that this was “not Splunk”, in terms of both the real-time nature and the front-end user experience, plus deeper analytical capabilities such as long-term interaction trending. In some cases, the Quantiply representation is sufficient analysis; in other cases, it’s a starting point for a data scientist to dig in and figure out some of the more complex correlations in the data.
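Quantiply hasn’t published implementation details, so treat this as a loose sketch of the general idea rather than their actual approach: once log and configuration data are linked into a dependency graph, finding the customers impacted by a failing component is a simple graph traversal. The topology and names here are entirely hypothetical.

```python
# Sketch of impact analysis over a service dependency graph (my own
# illustration; Quantiply's internals are not public): walk outward from
# a failed component through everything that depends on it, collecting
# the customers of any customer-facing service encountered.
from collections import defaultdict, deque

depends_on_me = defaultdict(set)  # component -> components that depend on it

def add_dependency(component, dependency):
    depends_on_me[dependency].add(component)

# Hypothetical topology, as might be derived from logs and config data
add_dependency("trading-api", "orders-db")
add_dependency("web-portal", "trading-api")
add_dependency("mobile-app", "trading-api")

customers_of = {"web-portal": {"cust-1001", "cust-1002"},
                "mobile-app": {"cust-2001"}}

def impacted(failed_component):
    """BFS from the failed component; returns affected services and customers."""
    seen, queue, customers = set(), deque([failed_component]), set()
    while queue:
        node = queue.popleft()
        if node in seen:
            continue
        seen.add(node)
        customers |= customers_of.get(node, set())
        queue.extend(depends_on_me[node])
    return seen, customers

print(impacted("orders-db"))  # downstream services, plus customers to follow up with
```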

There was a lot of detail in the presentation about the capabilities of the platform and what the customer is doing with it, and the benefits that they’re seeing; there’s not a lot of information on the Quantiply website since they’re just publicly launching.

Update: The original version of this post included the name of the customer and their representative. Since this was a presentation at a public conference with no NDA or confidentiality agreements in place, not even a verbal request at any time during the session, I live-blogged as usual. A day later, the vendor, under pressure from the customer’s PR group, admitted that they did not have clearance to have this customer speak publicly, which is a pretty rookie mistake on their part, although it lines up with my general opinion on their social media skills. As a favor to the conference organizers, who put a lot of effort into making a great experience for all of us, I’ve decided to remove the customer’s name from this post. I’m sure that those of you who really want to know it won’t have any trouble finding it, because of this thing called “the internet”.

PCM Requirements Linking Capability Taxonomy and Process Hierarchy at BPMCM15

I’m in Washington DC for a couple of days at the BPM and Case Management Summit; I missed this last year because I was at the IRM BPM conference in London, and in fact I was home from IRM less than 36 hours this weekend before I got back on a plane to head down to DC this morning.

I’m in a breakout session to hear John Matthias from the Court Consulting Services of the National Center for State Courts, who focuses on developing requirements for court case management systems. As might be expected, the usual method for courts to acquire their case management systems is to just pick the commercial off-the-shelf (COTS) software from the leading packaged solution vendor, then customize it to suit. Except in general, the leading vendor’s software doesn’t meet the current needs of courts’ case workers and court clerks, and Matthias is trying to rework the best practices to create definitive links and traceability between requirements, processes and the business capabilities taxonomy.

As he noted, justice case management is a prime example of production case management (PCM), wherein there are well-worn paths and complicated scenarios; multiple agents are involved (court and clerks, prosecution, public defender) and the specific order of activities is not always pre-defined, so the key role of the PCM system is to track and respond to state changes in the system of record. There are, however, some points at which there are very specific rules of procedure and deadlines, including actions that need to be taken in the event of missed deadlines. The problem comes with the inflexibility of the existing COTS justice case management software available in the market: regardless of how much study and customization is done at the time of original installation (or, perhaps, in spite of it), the needs change over time and there is no way for the courts to make adjustments to how the customized COTS package behaves.

To address the issue of requirements, Matthias has developed a taxonomy of business capabilities: a tree structure that breaks each business capability down into increasingly specialized capabilities that can be mapped to the capability’s constituent requirements. He’s also looked at a process hierarchy, where process stages break down into process groups, and then into elementary processes. This process hierarchy is necessary for organizing the processes, particularly when it comes to reusability across various case types. Process groups show hand-offs between workers on a case, while the elementary processes are the low-level workflows that may be able to be fully automated, or at least represent atomic tasks performed by workers. The elementary processes are definitely designed to be reusable, so that a process such as “Issue Warrant” can be related to a variety of business capabilities. Managing the relationships between requirements gets complex fast, and they’re looking at requirements management software that allows them to establish relationships between business capabilities, business rules, processes, system requirements and more, then understand traceability when there is a change to one component.
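As a rough illustration of how those pieces fit together (my own sketch, not Matthias’s actual model), here is the capability taxonomy as a tree whose nodes link to reusable elementary processes, with a where-used query providing the traceability:

```python
# Sketch of the two hierarchies and the many-to-many link between them:
# a capability taxonomy tree whose nodes reference reusable elementary
# processes, plus a traceability query over that structure.
from dataclasses import dataclass, field

@dataclass
class ElementaryProcess:
    name: str  # e.g. "Issue Warrant" -- reusable across case types

@dataclass
class Capability:
    name: str
    children: list = field(default_factory=list)   # sub-capabilities
    processes: list = field(default_factory=list)  # linked elementary processes

issue_warrant = ElementaryProcess("Issue Warrant")

# Hypothetical taxonomy fragment: one process reused by two capabilities
taxonomy = Capability("Case Initiation", children=[
    Capability("Criminal Filing", processes=[issue_warrant]),
    Capability("Failure to Appear", processes=[issue_warrant]),  # reuse
])

def where_used(process, node, path=()):
    """Traceability: yield the path to every capability using a process."""
    path = path + (node.name,)
    if process in node.processes:
        yield path
    for child in node.children:
        yield from where_used(process, child, path)

print(list(where_used(issue_warrant, taxonomy)))
```

A real requirements management tool adds business rules and system requirements as further node types in the same relationship graph, so that a change to any one component can be traced to everything that depends on it.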

Unlike systems with completely pre-defined processes, the requirements for PCM systems need to have the right degree of granularity (not so much that they overconstrain the workers, and not so little that they provide insufficient guidance), have performance measurement built in, and link to systems of record to provide state awareness and enable process automation. The goal is to achieve some amount of process discipline and standardization while still allowing variations in how the case managers operate: provide guidance, but allow for flexible selection of actions. Besides that ability to provide guidance without overconstraining, developing requirements for a PCM system isn’t that much different from other enterprise systems: consider the future state, build to change, and understand the limits of the system’s configurability. I would also argue that requirements for any user-facing system shouldn’t be gathered using a waterfall methodology, where complete detailed requirements are necessary before implementation, but rather with a more Agile approach: collect the high-level requirements, then just enough detailed requirements to get to a first implementation in an iterative development cycle. At which time all of the requirements will change anyway.

The Personology of @RBSGroup at PegaWorld 2015

Andrew McMullan, director of analytics and decisioning (aka “personologist”) at Royal Bank of Scotland, gave a presentation on how they are building a central (Pega-based) decisioning capability to improve customer engagement and change their culture along the way. He started with a personal anecdote about how RBS did the right thing for a family member and gained a customer for life – a theme echoed from this morning’s keynote that also included RBS. He showed a short video of their current vision, which stated goals of making RBS easier to do business with, and to work for, in addition to being more efficient. In that order, in case you other banks are following along.

RBS is now government owned, having been bailed out during the financial crisis; I’m not sure how much this has allowed them to focus on customer engagement rather than short-term profits, but they do seem to be talking the right talk.

RBS uses Pega’s Chordiant – primarily the decision management components, if I am reading it correctly – although they are implementing Pega 7 for an August 2015 rollout to bring in more robust Next Best Action capabilities; they also use SAS Visual Analytics for reporting. This highlights the huge role of decisioning as well as process in customer engagement, especially when you’re applying analytics to a broad variety of customer information in order to determine how to interact with the customer (online or IRL) at any particular moment. RBS is proactive about having their customers do things that will save them money, such as renewing a mortgage at a lower rate, or choosing a package of banking services that doesn’t overlap with other services that they are paying for elsewhere. Contrary to what nay-sayers within RBS predicted about lost revenue, this tends to make customers more loyal and ultimately do more business with the bank.
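Pega’s actual next-best-action engine uses adaptive and predictive models and is far more sophisticated than this, but the core idea can be sketched in a few lines: score each candidate action for a customer, suppress what they already hold, and propose the highest scorer. All names and numbers here are made up.

```python
# Toy next-best-action sketch (the general technique only, not Pega's
# implementation): rank candidate actions by expected value to the
# customer relationship and propose the best one.

def score(action, customer):
    # Hypothetical scoring: propensity * long-term relationship value,
    # zeroed out for products the customer already holds.
    if action["product"] in customer["holdings"]:
        return 0.0
    return action["propensity"](customer) * action["relationship_value"]

actions = [
    {"product": "lower-rate-mortgage",
     "propensity": lambda c: 0.6 if c["has_mortgage"] else 0.0,
     "relationship_value": 9.0},
    {"product": "packaged-account",
     "propensity": lambda c: 0.3,
     "relationship_value": 4.0},
]

customer = {"holdings": {"current-account"}, "has_mortgage": True}
best = max(actions, key=lambda a: score(a, customer))
print(best["product"])  # lower-rate-mortgage: saves the customer money
```

Note that weighting by long-term relationship value, rather than immediate margin, is exactly what lets a recommendation like a lower-rate mortgage win even though it reduces short-term revenue.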

There was a good question from the audience about how much of this was changes to organizational culture, and how much was the data science: McMullan said that it’s really critical to win the hearts and minds of the employees, although obviously you need to have at least the beginnings of the analytics and recommendations to get that started. Also, they use Net Promoter Score as their main internal metric, which tends to reward relationship-building over short-term profits; having the right incentives for employees goes a long way towards helping them to do the right thing.
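For reference, NPS itself is simple arithmetic on the standard 0-10 “how likely are you to recommend us?” question: the percentage of promoters (scores of 9-10) minus the percentage of detractors (0-6), with passives (7-8) ignored.

```python
# Net Promoter Score: % promoters (9-10) minus % detractors (0-6)
# on the standard 0-10 recommendation scale; passives (7-8) count
# only in the denominator.

def nps(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

print(nps([10, 9, 8, 7, 6, 10, 3, 9]))  # 4 promoters, 2 detractors -> 25.0
```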

PegaWorld 2015 Day 2 Customer Keynotes: Big Data and Analytics at AIG and RBS

After the futurist view of Brian Solis, we had somewhat more down-to-earth views from two Pega customers, starting with Bob Noddin from AIG Japan on how to turn information that they have about customers into an opportunity to do something expected and good. Insurance companies have the potential to help their customers reduce risk, and therefore insurance claims: they have a lot of information about general trends in risk reduction (e.g., tell an older customer that if they have a dog and walk it regularly, they will stay healthier and live longer) as well as customer-specific actions (e.g., suggest a different route for someone to drive to work in order to reduce the likelihood of an accident, based on where they live and work, and the accident rates for the roads in between). This is not a zero-sum game: fewer claims is good for both AIG and its customers. Noddin was obviously paying close attention to Solis, since he wove elements of that into his presentation in how they are engaging customers in the way that the customer chooses, and have reworked their customer experience – and their employee and agent experience – with that in mind.

Between the two customers, we heard from Rob Walker, VP of Decision Management and Analytics at Pega, about the always-on customer brain and strategies for engaging with them:

  • Know your customer: collect and analyze their data, then put it in the context of their entire customer journey
  • Reach your customer: break down the silos between different channels, and also between inbound and outbound communications, to form a single coherent conversation
  • Delight your customer: target their needs and wants based on what you know about them, using the channels through which you know that they can be reached.

He discussed how to use Pega solutions to achieve this through data, analytics and decisioning; obviously, the principles are universal.

The second customer on stage was Christian Nelissen from Royal Bank of Scotland, whom I also saw yesterday (but didn’t blog about) on the big data panel. RBS has a good culture of knowing their customers from their roots as a smaller, more localized bank: instead of the branch manager knowing every customer personally, however, they now rely on data about customers to create 1:1 personalized experiences based on predictive and adaptive analytics in the ever-changing context of the customer. He talked about the three pillars of their approach:

  • It’s about the conversation. If you focus on doing the right thing for the customer, not always explicitly selling to them, you build the relationship for the long term.
  • One customer, one bank. A customer may have products in different bank divisions, such as retail banking, credit cards and small business banking, and you need to be cognizant of their complete relationship with the bank and avoid internal turf wars.
  • You can do a lot with a little. Data collection and analytics technologies have become increasingly cheaper, allowing you to start small and learn a lot before expanding your customer analytics program.

Alan Trefler closed out the keynote before sending us off to the rest of the day of breakout sessions. Next year, PegaWorld is in Las Vegas; not my favorite place, but I’ll be back for the quality of the presentations and interactions here.

These two keynotes this morning have been great to listen to, and also closely aligned with the future of work workshop that I’m doing at IRM BPM in London next week, as well as the session on changing incentives for knowledge workers. Always good when the planets align.

PegaWORLD 2015 Keynote with @BrianSolis: Innovate or Die!

Brian Solis from Altimeter Group was the starting keynote, talking about disruptive technology and how businesses can undergo digital transformation. One of the issues with companies and change is that executives don’t live the way the rest of us do, and have to think of the shareholders first, but may not have sufficient insight into how changing customer attitudes and the supporting technology will impact their profitability, or even their ability to survive. “A Kodak moment” is now about how you go bankrupt when you ignore disruptive technology: not something that you want to capture for posterity.

Digital Darwinism

Customer experience can just happen by accident, or it can be something that we design in order to achieve a “higher purpose” of being customer centric. That doesn’t mean that we have complete control over that customer experience any more, since our brands are made up of what we put out there, and what other people say about us. Customer experience is not about what we say, but about what we do, since that’s what will be examined under the social media microscope. Altimeter’s research shows that almost all companies are undergoing their digital transformation specifically because of customer experience, but that few of them really understand what the problem is. 67% of the buyer’s journey is now done online, consulting 11 different sources for information even if the purchase happens IRL, and your online customer experience is the difference between surviving or not. Part of this is omni-channel presence, since almost none of those pre-buying search journeys happen on a single device. You can’t force customers to do business your way: you have to do it their way. And in order to do it their way, you have to understand what that is (that sounds kind of obvious, but many companies don’t get that). You have to think through the eyes of your customers: as Solis said, “Think like a customer. Act like a startup.”

Innovate or Die

Solis’ message, in short: if you don’t disrupt yourself, someone else will do it for you. Innovate or die.

IBM ECM Strategy at Content2015

Wrapping up the one-day IBM Content 2015 mini-conference in Toronto (repeated in several other cities across North America) is Feri Clayton, director of document imaging and capture. Feri and I were two of the few female engineers at FileNet back during my brief time there in 2000-2001, and I have many fond memories of our “women in engineering” lunch club of three members.

Clayton talked about how enterprises are balancing the three key imperatives of staying competitive through productivity and cost savings, increasing growth through customer centricity, and protecting the organization through security and compliance. With ECM initiatives, this boils down to providing the right information to employees and customers to allow them to make the right decisions at the right time. From an ECM capabilities standpoint, this requires the critical capabilities of content capture, content protection, activating content by putting it into business processes, analyzing content to reveal insights, and engaging people in content-centric processes and collaboration. Some recent advances for IBM: they have been moving towards a single unified UI for all of their ECM portfolio, and IBM Content Navigator now provides a common modern user experience across all products; they have also been recognized as a market leader in Case Management by the big analysts.

She did a pretty complete review of the entire ECM portfolio, including recent major releases as well as what’s coming up.

Looking forward, they’re continuing to improve Navigator Cloud (hosted ECM), advancing mobile capture and other document capture in Datacap, releasing managed cloud (IBM hosted) offerings for CMOD and Case Manager, and releasing a new Information Lifecycle Governance solution. They’re also changing their release cadence, moving to quarterly releases rather than the usual 1-2 years between releases, while making the upgrades much easier so that they don’t require a lot of regression testing.

IBM Navigator Cloud — the cloud ECM product, not the unified UI — has a new mobile UI and a simplified web UI that includes external file sharing; soon it will have a Mac sync client, and an ECM solution platform on the cloud codenamed “Galaxy” that provides for much faster development using solution patterns. There’s quite an extensive ECM mobile roadmap, with Case Manager and Datacap coming soon on mobile. The core content platform continues to be enhanced, but they’re also expanding to integrate with web-based editors such as Office 365 and Google Docs, and enhancing collaboration for external participants.

Case Manager, which is my key product of interest here today, will soon see a mobile interface (or app?), enhanced case analytics, enhanced property layout editor, simplified solution deployment and packaging, and more industry and vertical solutions. Further out, they’re looking at hybrid use cases with cloud offerings.

Good summary of the IBM ECM roadmap, and a wrap for the day.