Exploring City of Toronto’s Digital Transformation at TechnicityTO 2016

I’m attending the Technicity conference today in Toronto, which focuses on the digital transformation efforts in our city. I’m interested in this both as a technologist, since much of my work is related to digital transformation, and as a citizen who lives in the downtown area and makes use of a lot of city services.

After brief introductions by Fawn Annan, President and CMO of IT World Canada (the event sponsor), Mike Williams, GM of Economic Development and Culture with City of Toronto, and Rob Meikle, CIO at City of Toronto, we had an opening keynote from Mark Fox, professor of Urban Systems Engineering at University of Toronto, on how to use open city data to fix civic problems.

Fox characterized the issues facing digital transformation as potholes and sinkholes: the former are a bit more cosmetic and can be easily paved over, while the latter indicate that some infrastructure change is required. Cities are, he pointed out, not rocket science: they’re much more complex than rocket science. As systems, cities are complicated as well as complex, with many different subsystems and components spanning people, information and technology. He showed a number of standard smart city architectures put forward by both vendors and governments, and emphasized that data is at the heart of everything.

He covered several points about data:

  • Sparseness: the data that we collect is only a small subset of what we need; it’s often stored in silos and not easily accessed by other areas; and it’s frequently lost (or inaccessible) after a period of time. In other words, some of the sparseness is due to poor design, and some is due to poor data management hygiene.
  • Premature aggregation, wherein raw data is aggregated spatially, temporally and categorically when you think you know what people want from the data, removing their ability to do their own analysis on the raw data.
  • Interoperability and the ability to compare information between municipalities, even for something as simple as date fields and other attributes. Standards for these data sets need to be established and used by municipalities in order to allow meaningful data comparisons.
  • Completeness of open data, that is, what data a government chooses to make available, and whether it is available as raw data or in aggregate. This needs to be driven by the problems that consumers of the open data are trying to solve.
  • Visualization, which is straightforward when you have a couple of data sets, but much more difficult when you are combining many data sets — his example was the City of Edmonton using 233 data sets to come up with crime and safety measures.
  • Entitlement: governments often feel a sense of entitlement about their data and choose to hold back more than they should, when they should instead be in the business of empowering citizens to use that data to solve civic problems.

Smart cities can’t be managed in a strict sense, Fox believes; rather, it’s a matter of managing complexity and uncertainty. We need to understand the behaviours that we want the system (i.e., the smart city) to exhibit, and work towards achieving those. This means more than just sensing the environment: it also means understanding limits and constraints, knowing when deviations are significant, and knowing who needs to hear about them. These systems need to be responsive and goal-oriented, flexibly responding to events based on desired outcomes rather than a predefined process (or, as I would say, unstructured rather than structured processes): this requires situational understanding, flexibility, shared knowledge and empowerment of the participants. Systems also need to be introspective, that is, able to compare their performance to goals, find new ways to achieve those goals more effectively, and predict outcomes. Finally, cities (and their systems) need to be held accountable for their actions, which requires that activities be auditable to determine responsibility, and that the underlying basis for decisions be known, so that a digital ombudsman can provide oversight.

Great talk, and very aligned with what I see in the private sector too: although the terminology is a bit different, the principles, technologies and challenges are the same.

Next, we heard from Hussam Ayyad, director of startup services at Ryerson University’s DMZ — a business incubator for tech startups — on Canadian FinTech startups. The DMZ has incubated more than 260 startups that have raised more than $206M in funding over its six years in existence, making it the #1 university business incubator in North America and #3 in the world. It’s also ranked the most supportive of FinTech startups, which makes sense considering its geographic proximity to Toronto’s financial district. Toronto is already a great place for startups, and the DMZ definitely gives the hot FinTech market a step up by providing coaching, customer links, capital and community.

Unfortunately, I had to duck out partway through Ayyad’s presentation for a customer meeting, but plan to return for more of Technicity this afternoon.

What’s on your agenda for 2017? Some BPM conferences to consider

I just saw a call for papers for a conference next October, and went through to do a quick update of my BPM Event Calendar. I certainly don’t attend all of these events, but I like to keep track of who’s doing what, when and where. Here’s what I have in there so far; if you have others, send me a note or add them as a comment to this post and I’ll add them to the calendar. I’m posting just the major conferences here, not every regional seminar.

Many vendors are eschewing a single large annual conference in favor of several regional conferences, easing the travel concerns of attendees; since these are usually just one day long, they aren’t announced this far in advance. It will be interesting to see if more vendors decide to go this way, or do more livestreaming to allow people to participate in more of the conference content remotely.

At this point, I don’t have confirmed attendance or speaking spots at any of these, although I will almost certainly be attending bpmNEXT and a few of the vendor conferences, either as a speaker or as an analyst/blogger. If you’re interested in having me attend your conference, let me know; I require that my travel expenses are covered (otherwise they come out of my own pocket in addition to the billable days that I’m giving up to attend), and a speaking fee if you want me to do a keynote or other presentation.

Intelligent Capture enables Digital Transformation at #ABBYYSummit16

I’ve been in beautiful San Diego for the past couple of days at the ABBYY Technology Summit, where I gave the keynote yesterday on why intelligent capture (including recognition technologies and content analytics) is a necessary onramp to digital transformation. I started my career in imaging and workflow over 25 years ago – what we would now call capture, ECM and BPM – and I’ve seen over and over again that if you don’t extract good data up front as quickly as possible, then you just can’t do a lot to transform those downstream processes. You can see my slides at Slideshare as usual:

I’m finishing up a white paper for ABBYY on the same topic, and will post a link here when it’s up on their site. Here’s the introduction (although it will probably change slightly before final publication):

Data capture from paper or electronic documents is an essential step for most business processes, and often is the initiator for customer-facing business processes. Capture has traditionally required human effort – data entry workers transcribing information from paper documents, or copying and pasting text from electronic documents – to expose information for downstream processing. These manual capture methods are inefficient and error-prone, but more importantly, they hinder customer engagement and self-service by placing an unnecessary barrier between customers and the processes that serve them.

Intelligent capture – including recognition, document classification, data extraction and text analytics – replaces manual capture with fully-automated conversion of documents to business-ready data. This streamlines the essential link between customers and your business, enhancing the customer journey and enabling digital transformation of customer-facing processes.

I chilled out a bit after my presentation, then decided to attend one presentation that looked really interesting. It was, but it was an advance preview of a product that’s embargoed until it comes out next year, so you’ll have to wait for my comments on it.

A well-run event with a lot of interesting content, attended primarily by partners who build solutions based on ABBYY products, as well as many of ABBYY’s team from Russia (where a significant amount of their development is done) and other locations. It’s nice to attend a 200-person conference for a change, where – just like Cheers – everybody knows your name.

Keynoting at @ABBYY_USA Technology Summit

I’ve been in the BPM field since long before it was called BPM, starting with imaging and workflow projects back in the early 1990s. Although my main focus is on process now (hence the name of my blog), almost every project that I’m involved in has some element of content and capture, although not necessarily from paper documents. Basically, content capture is the onramp to many business processes: either the capture of a piece of content is what triggers a process (e.g., an application form) or it adds information to a process to move it along. Capture can mean traditional document scanning with intelligent recognition in the form of OCR, ICR, barcode and mark sense recognition, or can also be capture of information already in electronic form but not in a structured format (e.g., emails).

To get to the point, this is why I’m excited to be keynoting at the ABBYY Technology Summit coming up on November 16-18, in a presentation entitled How Digital Transformation is Increasing the Value of Capture and Text Analytics:

As the business world has been wrestling with the challenge of digital transformation, the last few years have seen the shift away from BPM and Case Management technology platforms towards the more solutions-oriented approach offered by Smart Process Applications and Case Management Frameworks. A critical component of these business solutions is the capability to capture the key business information at the point of origin.

This information is often buried inside forms and other business documents. Capturing this data through recognition technologies and automatic document classification transforms streams of documents of any structure and complexity into business-ready data.

This enables organizations of any size to streamline their existing business processes, increasing efficiency and reducing costs; it also enables real-time customer self-service processes triggered by mobile document capture.

I’ll be covering trends and benefits of intelligent capture, providing ABBYY’s customers and partners in attendance with solid advice on how to best start integrating these technologies to make their business processes run better. I’m also writing a paper covering these topics, sponsored by ABBYY, which will be available in time for the conference.

If you’re at the conference, please stop by and say hi; I’ll be hanging out there for the rest of the day after my keynote.

Strategy to execution – and back: it’s all about alignment

I recently wrote a paper sponsored by Software AG called Strategy To Execution – And Back, which you can find here (registration required). From the introduction:

When planning for business success, corporate management sets business strategy and specifies goals in terms of critical success factors and key performance indicators (KPIs). Although senior management is not concerned with the technical details of how business operations are implemented, they must have confidence that the operations are aligned with the strategy, and be able to monitor performance relative to the goals in real time.

In order to achieve operational alignment, there must be a clear path that maps strategy to execution: a direct link from the strategic goals in the high-level business model, through IT development and management practices, to the systems, activities and roles that make the business work. However, that’s only half the story: there must also be a path back from execution to strategy, allowing operational performance to be measured against the objectives in order to guide future strategy. Without both directions of traceability, there’s a disconnect between strategy and operations that can allow a business to drift off course without any indication until it’s far too late.

I cover how you need to have links from your corporate strategy through various levels of architecture to implementation, then be able to capture the operational metrics from running processes and roll those up relative to the corporate goals. If you don’t do that, then your operations could just be merrily going along their own path rather than working towards corporate objectives.
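To make that roll-up a bit more concrete, here’s a minimal sketch in Java (with illustrative process names and KPI targets that aren’t from the paper) of comparing average cycle times from completed process instances against management’s targets:

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class KpiRollup {
    // One completed process instance's operational metrics (illustrative fields only)
    record InstanceMetric(String processKey, double cycleTimeHours) {}

    // Average cycle time per process, ready to compare against strategic targets
    static Map<String, Double> averageCycleTime(List<InstanceMetric> metrics) {
        return metrics.stream().collect(Collectors.groupingBy(
                InstanceMetric::processKey,
                Collectors.averagingDouble(InstanceMetric::cycleTimeHours)));
    }

    public static void main(String[] args) {
        // In practice these would come from the BPMS history/analytics interface
        List<InstanceMetric> metrics = List.of(
                new InstanceMetric("newAccountOpening", 26.0),
                new InstanceMetric("newAccountOpening", 30.0),
                new InstanceMetric("claimsIntake", 4.5));

        // Hypothetical KPI targets set at the strategy/architecture level
        Map<String, Double> targetHours = Map.of(
                "newAccountOpening", 24.0,
                "claimsIntake", 8.0);

        averageCycleTime(metrics).forEach((process, avg) -> {
            double target = targetHours.getOrDefault(process, Double.MAX_VALUE);
            System.out.printf("%s: average %.1fh vs target %.1fh (%s)%n",
                    process, avg, target, avg <= target ? "on track" : "off course");
        });
    }
}
```

The interesting part isn’t the aggregation itself, which is trivial, but making sure that the metrics and the targets refer to the same things: that’s exactly the alignment problem that the paper is about.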

Another rift in the open source BPM market: @FlowableBPM forks from @Alfresco Activiti

In early 2013, Camunda – at the time, a value-added Activiti consulting partner as well as a significant contributor to the open source project – created a fork from Activiti to form what is now the Camunda open source BPM platform as well as their commercial version based on the open source core. As I wrote at the time:

At the end of 2012, I had a few hints that things at Alfresco’s Activiti BPM group were undergoing some amount of transition: Tom Baeyens, the original architect and developer of Activiti (now CEO of the Effektif cloud BPM startup announced last week), was no longer leading the Activiti project and had decided to leave Alfresco after less than three years; and camunda, one of the biggest Activiti contributors (besides Alfresco) as well as a major implementation consulting partner, was making noises that Activiti might be too tightly tied to Alfresco’s requirements for document-centric workflow rather than the more general BPM platform that Activiti started as.

Since then, Effektif became Signavio Workflow and Camunda decided to use a capital letter in their name; what didn’t change, however, is that as the main sponsor of Activiti, Alfresco obviously has a need to make Activiti work for document-centric BPM and skewed the product in that direction. That’s not bad if you’re an Alfresco ECM customer, but likely was not the direction that the original Activiti BPM team wanted to go.

Last month, I heard that key Activiti people had left Alfresco but had no word about where they were heading; last week, former Activiti project lead Tijs Rademakers and Activiti co-founder and core developer Joram Barrez announced that they were creating an Activiti fork to form Flowable with a team including former Alfresco Activiti people and other contributors.

To be clear to those who don’t dabble in open source, forking is a completely normal activity (no pun intended…well, maybe only a little) wherein a copy of the source code is taken at a point in time to create a new open source project with its own name and developer community. This may be done because of a disagreement in product direction – as appears to be the case here – or because someone wants to take it in a radically different direction to address a different need or market.

I heard about all of this from Jeff Potts, who has been involved in the Alfresco open source community for a decade, via his blog. He wrote about the Activiti leads leaving Alfresco back in September, although he reported it as three people leaving independently that just happened to occur at the same time. Possibly not completely accurate, in hindsight, but that was the word at the time. He then saw the Flowable announcement (linked above) and wrote about that, which is where I first saw it.

Alfresco’s founder and CTO, John Newton, posted about the team departure and the fork:

Unfortunately, some of my early friends on the Activiti project have disagreed with our direction and have taken the step of forking the Activiti code. This is disappointing because the work that they have done is very good and has generally been in the spirit of open source. However, the one thing that we could not continue to give them was exclusive control of the project. I truly wish that we could have found a way to work with them within a community framework.

This seemed to confirm my suspicion that this was a disagreement in product direction as well as a philosophical divide; with Alfresco now a much bigger company than it was at the time that they took Activiti under their wing, it’s not surprising that the corporate mindset wouldn’t always agree with open source direction. Having to spend much more effort on the enterprise edition than the open source project and seeing BPM subsumed under ECM would not sit well with the original Activiti BPM team.

Newton’s comments are also an interesting echo of Barrez’ post at the time of the Camunda fork: in both situations, there’s a sense of disappointment – and maybe a bit of betrayal? – although now Barrez is on the other side of the equation.

Flowable was quick to offer a guide on how to move from Activiti to Flowable – trivial at this point since the code base is still the same – and Camunda jumped in with a guide on moving from Activiti to Camunda, an update on what they’ve done to the engine since their fork in 2013, and reasons why you should make the switch.
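For context on why the migration is trivial right now: the engine API hasn’t diverged yet, so existing embedded-engine code barely changes. As a rough, hypothetical illustration (this is generic Activiti 5.x bootstrap code with a made-up process file, not something taken from either migration guide), this is the kind of code that would need to keep working after swapping the Activiti dependency for Flowable’s or Camunda’s engine:

```java
import org.activiti.engine.ProcessEngine;
import org.activiti.engine.ProcessEngineConfiguration;
import org.activiti.engine.RuntimeService;
import org.activiti.engine.runtime.ProcessInstance;

public class EngineBootstrap {
    public static void main(String[] args) {
        // Standard Activiti 5.x embedded engine with an in-memory H2 database
        ProcessEngine engine = ProcessEngineConfiguration
                .createStandaloneInMemProcessEngineConfiguration()
                .buildProcessEngine();

        // Deploy a BPMN 2.0 definition from the classpath (file name is hypothetical)
        engine.getRepositoryService()
                .createDeployment()
                .addClasspathResource("loanApproval.bpmn20.xml")
                .deploy();

        // Start an instance by its process definition key
        RuntimeService runtime = engine.getRuntimeService();
        ProcessInstance instance = runtime.startProcessInstanceByKey("loanApproval");
        System.out.println("Started process instance " + instance.getId());
    }
}
```

Presumably the switch becomes more involved once Flowable 6 starts reworking the engine internals and packaging, which is when those migration guides will really earn their keep.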

If you’re using Activiti right now, you have to be a bit nervous about this news, but don’t be.

  • If you’re using it primarily for document workflow with your Alfresco ECM, you’re probably best to stay put in the long run: undoubtedly, Activiti will be restaffed with a good team and will continue to integrate tightly with Alfresco; it’s possible that some of the capabilities might find their way from the Activiti project to the Alfresco project over time. There’s obviously going to be a bit of a gap in the team for a while: the project shows no new commits for a month, and questions on the forum are going unanswered.
  • If you use Activiti without Alfresco ECM (or with very little of it), you still don’t need to do anything right now: as Camunda showed in their post, a migration path from Activiti to Flowable or Camunda or any other fork will still exist in the future because of the shared heritage. It will get more complex over time, but not impossible. Sit tight for 6-12 months, and reassess your options then.
  • If you’re considering Activiti as a new installation, consider your use cases. If you’re a heavy Alfresco ECM user and want it tightly integrated, Activiti is the way to go. For other cases, it’s not so clear. We need to hear a bit more from Alfresco on their plans, but it’s telling that Newton’s post said “Business Process Management is one of those critical components of any content management system” which seems to place ECM as the primary focus and BPM as a component. He also said that they would be more explicit in their roadmap, and I recommend that you wait for that if you’re in the evaluation stage for a pure (non-ECM) BPM platform.

In the meantime, Flowable has released their initial 5.22.0 engine, and plans to release version 6 by the end of November. They haven’t published a product roadmap yet, but I’m expecting significant divergence from Activiti to happen quickly as they incorporate new technologies that bring the focus back to BPM.

Note: the photo accompanying this post was taken by my talented photographer friend, Pat Anderson, with whom I have eaten many delicious and photogenic meals.

Bridging the bimodal IT divide

I wrote a paper a few months back on bimodal IT: a somewhat controversial subject, since many feel that IT should not be bimodal. My position is that it already is – with a division between “heavy” IT development and lighter-weight citizen development – and we need to deal with what’s there with a range of development environments including low-code BPMS. From the opening section of the paper:

The concept of bimodal IT – having two different application development streams with different techniques and goals – isn’t new. Many organizations have development groups that are not part of the standard IT development structure, including developers embedded within business units creating applications that IT couldn’t deliver quickly enough, and skunkworks development groups prototyping new ideas.

In many cases, this split didn’t occur by design, but out of situational necessity when mainstream IT development groups were unable to service the needs of the business, especially transformational business units created specifically to drive innovation. However, in the past few years, analysts have been positioning this split as a strategic action to boost innovation. By 2013, McKinsey & Company was proposing “greenfield IT” – technology managed independently of the legacy application development and infrastructure projects that typically consume most of the CIO’s attention and budget – as a way to innovate more effectively. They found a correlation between innovation and corporate growth, and saw greenfield IT as a way to achieve that innovation. By the end of 2014, the term “bimodal IT” was becoming popular, with Mode 1 being the traditional application development cycle focused on stability, well suited to infrastructure and legacy maintenance, and Mode 2 focused on agility and innovation, similar to McKinsey’s greenfield IT.

Read on by downloading the paper from Software AG’s site; right now, it looks like registration isn’t required.

AIIM Toronto seminar: FNF Canada’s data capture success

Following John Mancini’s keynote, we heard from two of the sponsors, SourceHOV and ABBYY. Pam Davis of SourceHOV spoke about EIM/ECM market trends, based primarily on analyst reports and surveys, before giving an overview of their BoxOffice product.

ABBYY chose to give their speaking slot to a customer, Anjum Iqbal of FNF Canada, who spoke about their capture and ECM projects. FNF provides services to financial institutions in a variety of lending areas, and deals with a lot of faxed documents. A new business line would increase their volume to 4,500 inbound faxes daily, mostly time-sensitive documents such as mortgage or loan closings that need to be processed within an hour of receipt. To do this manually, they would have needed to increase their 4 full-time staff to 10 people to handle the inbound workflow, even at a rate of 1 document/minute; instead, they used ABBYY FlexiCapture to build a capture solution for the faxes that extracts the data using OCR and interfaces with their downstream content and workflow systems without human intervention. The presentation went by pretty quickly, but we learned that they had a 3-month implementation time.
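As a quick sanity check on that headcount estimate (my arithmetic, not from the presentation): 4,500 faxes per day at roughly one document per minute is 4,500 minutes, or 75 person-hours, of manual handling every day, which works out to about ten people working 7.5-hour days.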

I stayed on for the roundtable that ABBYY hosted, with Iqbal giving more details on their implementation. They reached a tipping point when the volume of inbound printed faxes just couldn’t be handled manually, particularly when they added some new business lines that would increase their volume significantly. Unfortunately, the processes involving the banks were stuck on fax technology — that is, the banks refused to move to electronic transfer rather than faxes — so they needed to work within that fixed constraint. They needed quality data with near-zero error rates extracted from the faxes, and selected ABBYY and one of their partners to help build a solution that took advantage of standard form formats and 100% machine printing on the forms (rather than handwriting). The forms aren’t strictly fixed format, in that some critical information such as mortgage rates may be in different places on the document depending on the other content of the form; this requires more intelligent document classification as well as content analytics to extract the information. They have more than 40 templates that cover all of their use cases, although they still need one person in the process to manage any exceptions where the recognition certainty falls below a certain threshold. Given the generally poor quality of faxed documents, this capture process could undoubtedly also handle documents scanned on a standard business scanner or even a mobile device, in addition to their current RightFax server. Once the data is captured, it’s formatted as XML, which their internal development team then uses to integrate with the downstream processes, while the original faxes are stored in a content management system.
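To give a sense of what the hand-off to their internal developers might look like, here’s a minimal, hypothetical sketch in Java that reads an extracted-data XML file and decides whether it needs to go to the human exception handler based on a recognition-confidence threshold. The element and attribute names are my assumptions for illustration only; the actual export schema depends entirely on how the FlexiCapture project is configured:

```java
import java.io.File;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class CaptureRouter {
    // Hypothetical threshold below which a document is routed to manual review
    private static final double MIN_CONFIDENCE = 0.95;

    public static boolean needsManualReview(String exportedXmlPath) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new File(exportedXmlPath));

        // Assumed structure: <field name="..." confidence="0.97">value</field>
        NodeList fields = doc.getElementsByTagName("field");
        for (int i = 0; i < fields.getLength(); i++) {
            Element field = (Element) fields.item(i);
            double confidence = Double.parseDouble(field.getAttribute("confidence"));
            if (confidence < MIN_CONFIDENCE) {
                return true;   // at least one uncertain field: send to the exception queue
            }
        }
        return false;          // all fields confident: straight through to downstream systems
    }
}
```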

Given that these processes accept mortgage/loan application forms and produce the loan documents and other related documentation, this entire business seems ripe for disruption, although given the glacial pace of technology adoption in Canadian financial services, this could be some time off. With the flexible handling of inbound documents that they’ve created, FNF Canada will be ready for it when it happens.

That’s it for me at the AIIM Toronto seminar; I had to duck out early and missed the two other short sponsor presentations by SystemWare and Lexmark/Kofax, as well as lunch and the closing keynote. Definitely worth catching up with some of the local people in the industry as well as hearing the customer case studies.

AIIM Toronto keynote with @jmancini77

I’m at the AIIM Toronto seminar today — I pretty much attend anything that is in my backyard and looks interesting — and John Mancini of AIIM is opening the day with a talk on business processes. Yes, Mr. Column 1 is talking about Column 2, if you get the Zachman reference. This is actually pretty significant: content management isn’t just about content, just as process management isn’t just about process, but both need to overlap and work together. I had a call with Mancini yesterday in advance of my keynote at ABBYY’s conference next month, and we spent 30 minutes talking about how disruption in capture technologies has changed all business processes. Today, in his keynote, he talked about disruptive business processes that have transformed many industries.

He gave us a look at people, process and technology against the rise (and sometimes fall) of different technology platforms: document management and workflow; enterprise content management; mobile and cloud. There are a lot of issues as we move from one type of platform to another: moving to a cloud SaaS offering, for example, drives the move from perimeter-based security to asset-based security. He showed a case study for financial processes within organizations — such as accounts payable and receivable — with both a tactical dimension of getting things done and a strategic side of building a bridge to digital transformation. Most businesses (especially traditional ones) operate at a slim profit margin, making it necessary to look at ways to reduce costs: not through small, incremental improvements, but through more transformational means. For financial processes, in many cases this means getting rid of manual data capture and manipulation: no more manual data entry, no more analysis via spreadsheets. And cost reduction isn’t the key driver behind transforming financial business processes any more: it’s the need for better business analytics. Done right, these analytics provide real-time insight into your business that provides a strategic competitive differentiator: the ability to predict and react to changing business conditions.

Mancini finished by allowing today’s sponsors, with booths around the room, to introduce themselves: Precision Content, AIIM, Box, Panasonic, SystemWare, ABBYY, SourceHOV, and Lexmark (Kofax). I’ll be here for the rest of the morning, and look forward to hearing from some of the sponsors and their customers here today.

Case management in insurance

I recently wrote a paper on how case management technology can be used in insurance claims processing, sponsored by DST (but not about their products specifically). From the paper overview:

Claims processing is a core business capability within every insurance company, yet it is often one of the most inefficient and risky processes. From the initial communication that launches the claim to the final resolution, the end-to-end claims process is complex and strictly regulated, requiring highly-skilled claims examiners to perform many of the activities to adjudicate the claim.

Managing a manual, paper-intensive claims processing operation is a delicate balance between risk and efficiency, with most organizations choosing to decrease risk at the cost of lower operational efficiency. For example, skilled examiners may perform rote tasks to avoid the risk of handing off work to less-experienced colleagues; or manual tracking and logging of claims activities may have to be done by each worker to ensure a proper audit trail.

A Dynamic Case Management (DCM) system provides an integrated and automated claims processing environment that can improve claim resolution time and customer satisfaction, while improving compliance and efficiency.

You can download it from DST’s site (registration required).