TechnicityTO 2018: Administrative Penalty System case study

We had a quick review of the City’s Administrative Penalty System (APS), which lets you pay or dispute your parking ticket online, with a panel made up of Lenny Di Marco, senior systems integrator; Kelli Chapman, director of prosecution services; and Susan Garossino, director of court services.

Technologically, this was a challenge of integrating old COBOL systems and newer systems across both city and provincial agencies, but there was also a cultural change in moving some level of dispute resolution online rather than through the courts. Paying online isn’t new (I seem to remember paying a ticket online years ago when I still had a car), but the process of requesting a review and appealing a review result now happens in a matter of weeks rather than years. In addition to the obvious benefit of a timely outcome – better for citizens who want to get things sorted out, for the city in terms of resolving tickets faster, and for police officers who don’t have to attend court if the issue is resolved online – this also frees up court time for more serious charges. It’s still possible to do all of this in person, but a lot of people don’t have the time to get to a city office during business hours, or don’t want to go through the face-to-face process.

This is not just a matter of keeping up with regular day-to-day parking violations, but managing peaks that occur when the city has ticketing blitzes (usually caused when an elected official wants to make a statement about being tough on parking offenders).

The whole project took 12-14 months from inception to rollout, and is based on integrating and extending their COBOL back end and other existing systems, rather than purchasing new technology or bringing in outside help. There were definitely some technology challenges, but also the work of assessing the needs of stakeholders from the city, the province and the police so that they could do their jobs, including the new online review and adjudication roles.

Cool stuff, even if you don’t like paying parking tickets. Sounds like they’re already working on another integration project for next year related to Vision Zero, although we didn’t get the details.

TechnicityTO 2018: Innovative Toronto

The second session at today’s Technicity conference highlighted some of the technology innovation going on at the city, with a panel featuring Grant Coffey, director of strategy and program management at the City of Toronto; Tina Scott, Blockchain proof of concept lead for the city; and Gabe Sawhney, executive director of Code for Canada and a representative for Civic Hall Toronto. Jim Love, CIO of IT World Canada, moderated.

There are a number of different technology innovations underway at the city: some of them are public services, such as public WiFi and the offerings of Code for Canada and Civic Hall, while others are about how the city does business internally and with its commercial partners, such as blockchain in procurement processes.

Civic Hall has some interesting programs for connecting city government with other organizations for the purpose of building solutions together — I’ve been aware of and involved in things like this over several years, and they can yield great results in conjunction with the open data initiative at the city. Toronto also has a Civic Innovation Office as an in-house accelerator to help come up with innovative solutions to tough problems. These private and public programs aren’t in competition: they both foster innovation, and support different constituents in different ways.

Blockchain is starting to gain a foothold in the city through some training and an internal hackathon earlier this year to develop proofs of concept; this provided exposure to both business and technology areas about the potential for blockchain applications. Now, they are trading ideas with some of the other levels of government, such as provincial ministries, about using blockchain, and developing use cases for initial applications. They’re still just coming out of the experimental stage, and are looking at uses such as cross-jurisdictional/cross-organizational information sharing as near-term targets.

It’s not all positive, of course: challenges exist in evolving the city employee culture to take advantage of innovation and do things differently (which is pretty much the same as in private industry), as well as changing policies and governance best practices to be ready for innovation rather than playing catch-up. Sharing success stories is one of the best ways to help promote those changes.

TechnicityTO 2018: Taming Transportation Troubles with Technology

Every year, IT World Canada organizes the Technicity conference in Toronto, providing a technology showcase for the city and an opportunity to hear about some of the things that are happening both in the city government and in organizations that operate here. Fawn Annan, president of ITWC, opened the conference and introduced the city manager, Chris Murray, for a backgrounder on the city as an economic engine and how technology enables that.

The sessions started with a panel on transportation technology, moderated by Jaime Leverton, GM of Cogeco Peer 1, and featuring three people from the City of Toronto: Barb Gray, General Manager of Transportation Services; Ryan Landon, Autonomous Vehicle Lead; and Jesse Coleman, Transportation Big Data Team Leader. Erik Mok, Chief Enterprise Architect for the Toronto Transit Commission, was also supposed to be on the panel but had not arrived yet: hopefully not delayed on the TTC. 🙂

They spoke about the need for data collection in order to determine how to improve transportation in the city, whether related to personal vehicles, public transit, cycling or walking. In the past, this required manual data collection on the street; these days, the proliferation of traffic cameras, embedded sensors and smartphones means that a lot of data is being collected about how people are moving around the streets. This creates a need to understand how to work with the resulting big data, and huge opportunities for gaining better insights into making the streets more efficient and safer for everyone. Since the city is a big proponent of open data, the data that the city collects is available (in an anonymized format) to anyone who wants to analyze it. The city is trying to do some of this analysis themselves (without the benefit of a data scientist job classification at the city), but the open data initiative means that a lot of commercial organizations – from big companies to startups – are incorporating it into apps and services.

For the King Street Pilot, a year-old project that restricts the travel of private cars on our busiest streetcar route in order to prioritize public transit, the city deployed new types of sensors to measure the impact: Bluetooth sensors that track devices, traffic cameras with embedded AI, and more. This allows for unbiased measurement of the actual impact of the pilot (and other initiatives) that can be communicated to constituents.

There are privacy safeguards in place to ensure that Bluetooth devices that are tracked can’t be traced to an individual on an ongoing basis, but video is a larger issue: in general, intelligence related to transportation issues is extracted from the video, then the video is discarded. They mentioned the need for privacy by design, that is, building privacy considerations into any data collection project from the start rather than trying to add them on later.
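As a concrete illustration of what privacy by design might look like in a sensor pipeline like this, here’s a minimal sketch (my own, not the city’s actual implementation) of hashing Bluetooth identifiers with a salt that rotates daily: two detections of the same device on the same day can be matched to compute a travel time, but devices can’t be traced across days or back to their real MAC address.

```python
import hashlib
import os
from datetime import date

# Hypothetical sketch: salt the raw device identifier with a per-day secret,
# so the same device can be matched between two sensors on the same day
# (to compute a travel time) but not linked across days or to a real MAC.
_daily_salts = {}

def _salt_for(day: date) -> bytes:
    # Generate (and discard at end of day) a random salt per calendar day.
    if day not in _daily_salts:
        _daily_salts[day] = os.urandom(16)
    return _daily_salts[day]

def anonymize(mac_address: str, seen_on: date) -> str:
    """Return a one-way pseudonym for a detected Bluetooth MAC address."""
    digest = hashlib.sha256(_salt_for(seen_on) + mac_address.encode("utf-8"))
    return digest.hexdigest()

# Two detections of the same device on the same day still match...
assert anonymize("AA:BB:CC:DD:EE:FF", date(2018, 12, 4)) == \
       anonymize("AA:BB:CC:DD:EE:FF", date(2018, 12, 4))
```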

They also discussed some of the smart sensors and signals being used for controlling traffic signals, where the length of the waiting queue of vehicles can influence when the signals change. This isn’t just about vehicles, however: there’s an impact on pedestrians who use the same intersections, and on public health in terms of people with mobility challenges.
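To make the queue-length idea concrete, here’s a toy sketch of adaptive signal timing (my own simplification, not the city’s actual control logic): the green phase is extended in proportion to the detected queue, but clamped so that pedestrians on the cross street still get served.

```python
# Toy adaptive-signal sketch: extend the green phase in proportion to the
# detected queue, but clamp it so cross-street pedestrians always get served.
MIN_GREEN_S = 15   # minimum green time (seconds), protects the pedestrian crossing
MAX_GREEN_S = 60   # maximum green time, caps cross-street waiting
EXTENSION_PER_VEHICLE_S = 2

def green_time(queue_length: int) -> int:
    """Green duration for the approach, given the sensed vehicle queue."""
    extension = queue_length * EXTENSION_PER_VEHICLE_S
    return max(MIN_GREEN_S, min(MAX_GREEN_S, MIN_GREEN_S + extension))

print(green_time(0))   # 15 - short queue, pedestrians cross sooner
print(green_time(12))  # 39 - longer queue earns a longer green
print(green_time(40))  # 60 - clamped at the maximum
```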

Cities like Seattle, San Francisco and New York, which started with transportation data collection much earlier than Toronto, are doing some innovative things, but the panel feels that we’re catching up: there’s an autonomous shuttle project in the works now to fill some of the gaps in our transit system, for example. There’s also some work being done with drones to monitor traffic congestion around special events (presumably both vehicle and pedestrian) in order to understand dispersal patterns.

Interesting audience questions on data storage (Amazon AWS) and standardization of data formats, especially related to IoT.

As a Toronto resident who uses public transit, walks a lot and sometimes even drives, I found this a great look at how big data is feeding into improving mobility for everyone.

Unintended consequences (the good kind) of DigitalTransformation with @jkyriakidis

Jordan Kyriakidis, CEO of QRA Corp, spoke at a session at ITWC’s Toronto digital transformation conference on some of the unexpected consequences of technological advances in terms of collaboration and cross-fertilization of ideas. QRA is a tech startup in Atlantic Canada, and Kyriakidis’ examples are about how companies in that relatively small (economically) region are encouraging new ways of thinking about solving business problems through these sorts of “collisions”.

Addressing the complexity introduced by advancing technology means that we have to invent new methods and tools. He gave an example of industrial complexity, where design moved from paper to computer-aided design, then added electronic design automation when the complexity of deciding where to put which chip overwhelmed human capabilities; now design verification allows model-based (requirements-driven) designs to be validated before more expensive engineering and production begins.

Another example of precision diagnosis and treatment was data-driven farming, which combines computer vision and big data analytics (plus drone delivery of individual plant treatments) to optimize crop yields.

His third example was of integrating and analyzing a variety of data sources about a specific athlete to allow a coach to optimize training and performance for that athlete in their chosen sport.

His main theme of precision diagnosis and treatment – essentially, doing something different for every case based on the context – can be extended to pretty much any industry: consider the attempts by many consumer-facing companies to customize individual customer experiences. It was an interesting look at companies that are actually doing it.

FinTech panel at ITWC DigitalTransformation 2018

Lynn Elwood, VP of Cloud and Services at OpenText, hosted a panel on FinTech to close out the morning at the ITWC digital transformation conference in Toronto. She started with some background on digital transformation in financial services, where there is still a strong focus on cost reduction, but customer engagement has become more important. She included survey results with a somewhat disappointing view on paperless offices: more than 75% of the respondents said that they would not be going paperless for as much as five years, or maybe never. Never??!! Maybe just not within the career lifetime of the respondents, but c’mon, never? I understand that digital transformation is not the same as content digitization, but if you’re still running on paper, that’s just going to fundamentally limit the degree of your transformation. At the same time, more than 75% said that they plan to use AI already or within the short term (hopefully to replace the people who think that they’re never going to be paperless), and most organizations said that they were equal to or better than their peers in digital transformation (statistically unlikely). Unintentionally hilarious.

The panel was made up of Michael Ball, CISO Advisor for a number of firms including Freedom Mobile; Amer Matar, CTO of Moneris (a large Canadian payment processor); and Patrick Vice, partner at Insurance-Canada.ca (an industry organization for P&C insurance). Matar talked about how legacy technology holds back companies: existing companies have the advantage of being established incumbents, but newer players (e.g., Square in the payments market) can enter with a completely new business model and no legacy customers or infrastructure to drag along. Vice talked about how companies can combat this by spinning off separate business units to provide a more streamlined digital experience and brand, such as how Economical Insurance did with Sonnet (a project that I had the pleasure of working on last year), which still uses the established insurance organization behind a modern customer experience. Ball stressed that the legacy systems are evolving at a much slower rate than is required for digital transformation, and the new front ends need to go beyond just putting a friendly UI on the old technology: they need to incorporate new services to present a transformed customer experience.

They had an interesting discussion about security, and how moving to digital business models means that companies need to offer a more secure environment for customers. Many people are starting to look at security (such as two-factor authentication) as a competitive differentiator when they are selecting service providers, and while most people wouldn’t now change their bank just because it didn’t provide 2FA, it won’t be long before that is a decision point. It’s not just about cloud versus on-premise, although there are concerns about hosting Canadian customers’ financial data outside Canada, where financial laws (and government access to data) may be different; it’s about an organization’s ability to assure their customer that their information won’t be improperly accessed while offering a highly secure customer-facing portal. There’s a huge spend on security these days, but that needs to settle down as this becomes just baked into the infrastructure rather than an emergency add-on to existing (insecure) systems.

Good discussion, although it points out that it’s still early days for digital transformation in financial services.

Digital government with @AlexBenay at IT World DigitalTransformation 2018

I’ve attended IT World Canada conferences in Toronto before – easy for me to attend as a local, and some interesting content such as Technicity – and today they’re running a digital transformation conference (which, oddly, has the hashtag #digitaltransformation, as if that were a unique tag).

Alex Benay, CIO of the government of Canada, gave the opening keynote: with $6B/year in IT spend and more than a few high-profile mistakes that happened before he arrived in the job in early 2017, he has some views on how to do things better. He’s even written a book about digital government, but given that the federal government takes five years to write requirements, he’ll probably be long retired before we know if any of his predictions come true. He talked about some models of digital government, such as Estonia, and how the government of Canada is attempting to integrate their digital services into our everyday lives by partnering with the private sector: think Transport Canada road alerts built into your GM car, or passport renewal and customs forms triggered by an Expedia booking. He admits to a lot of obstacles, including untrained staff in spite of massive training spends, but also many enablers to reaching their goals, such as changing policies around cloud-first deployments. He finished with five core tenets for any government IT organization moving forward:

  • Open data by default while protecting citizens
  • Collaborate in the open
  • Grow digital talent
  • Change laws/policies to avoid situations like Facebook/Cambridge Analytica
  • Adapt business models to focus only on meeting user needs (procurement, tech management, service design)

Good principles, and I hope that our government can learn to live by them.

Integrating your enterprise content with cloud business applications? I wrote a paper on that!

Just because there’s a land rush towards SaaS platforms like Salesforce for some of your business applications, it doesn’t mean that your content and data are all going to be housed on those platforms. In reality, you have a combination of cloud applications, cloud content that may apply across several applications, and on-premise content; users end up searching in multiple places for information in order to complete a single transaction.

In this paper, sponsored by Intellective (who have a bridging product for enterprise content/data with SaaS business applications), I wrote about some of the architecture and design issues that you need to consider when you’re linking these systems together. Here’s the introduction:

Software-as-a-service (SaaS) solutions provide significant utility and value for standard business applications, including customer relationship management (CRM), enterprise resource planning (ERP), supply chain management (SCM), human resources (HR), accounting, insurance claims management, and email. These “systems of engagement” provide a modern and agile user experience that guides workers through actions and enables collaboration. However, they rarely replace the core “systems of record”, and don’t provide the range of content services required by most organizations.

This creates an issue when, for example, a customer service worker’s primary environment is Salesforce CRM, but for every Salesforce activity they may also need to access multiple systems of record to update customer files, view regulatory documentation or initiate line-of-business (LOB) processes not supported in Salesforce. The worker spends too much time looking for information, risks missing relevant content in their searches, and may forget to update the same information in multiple systems.

The solution is to integrate enterprise content from the systems of record – data, process and documents – directly with the primary user-facing system of engagement, such that the worker sees a single integrated view of everything required to complete the task at hand. The worker completes their work more efficiently and accurately because they’re not wasting time searching for information; data is automatically updated between systems, reducing data entry effort and errors.
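To make that pattern concrete (stepping outside the paper’s text for a moment), here’s a minimal sketch of what a bridging layer does conceptually, with entirely hypothetical endpoints and field names rather than Intellective’s actual API: when a worker opens a customer record in the system of engagement, fetch the related documents from the system of record and return a single combined view.

```python
import requests

# Hypothetical endpoints -- placeholders for a content repository (system of
# record) and a CRM (system of engagement), not any specific vendor's API.
ECM_BASE = "https://ecm.example.com/api"
CRM_BASE = "https://crm.example.com/api"

def documents_for_customer(customer_id: str) -> list:
    """Fetch documents filed against this customer in the system of record."""
    resp = requests.get(f"{ECM_BASE}/documents", params={"customerId": customer_id})
    resp.raise_for_status()
    return resp.json()

def build_customer_view(customer_id: str) -> dict:
    """Single combined view: CRM record plus related enterprise content."""
    crm = requests.get(f"{CRM_BASE}/customers/{customer_id}")
    crm.raise_for_status()
    return {
        "customer": crm.json(),
        "documents": documents_for_customer(customer_id),
    }
```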

Head on over to get the full paper (registration required).

Upcoming webinar on digital transformation in financial services featuring @BPMdotcom and @ABBYY_USA – and my white paper

There’s something strange about receiving an email about an upcoming webinar featuring two people who I know well…

 …then scrolling down to see that ABBYY is featuring the paper that I wrote for them as follow-on bonus material!

Nathaniel Palmer and Carl Hillier are both intelligent speakers with long histories in the industry; tune in to hear them talk about the role that content capture and content analytics play in digital transformation.

Low-Code webinar with @TIBCO – new ways for business and IT to develop and innovate together

I’m back at the webinars this Thursday (April 26), with the first of two parts in a series on low-code and how it enables business and IT to work better together. Together with Roger King and Nicolas Marzin of TIBCO, we’re doing another one of our free-ranging “fireside chat” discussions, such as the one we did on case management last November. This time, we dig into more of the technical and governance issues of how low-code application development platforms are used across organizations by both business developers and IT.

You can sign up for the webinar here.

I’m also putting the finishing touches on a white paper that goes into more of these concepts in depth. Sign up for the webinar and you’ll get a link to the paper afterwards.

bpmNEXT 2018: All DMN all the time, with Trisotech, Bruce Silver Associates and Red Hat

First session of the afternoon on the first day of bpmNEXT 2018, and this entire section is on DMN (Decision Model and Notation) and the requirement for decision automation based on DMN.

Decision as a Service (DaaS): The DMN Platform Revolution, Trisotech

Denis Gagne of Trisotech, who knows as much about DMN and other related standards as anyone around, started off the session with his ideas on the need for decision automation driven by requirements such as GDPR. He walked through their suite of decision-related products that can be used to create decision services to be consumed by other applications, as well as their conformance to the DMN standards. His demo showed a decision model for determining the best price to offer a rental vehicle customer, and walked through the capabilities of their platform with this model: DMN style check, import/export, execution, team collaboration, and governance through versioning. He also showed how decision models can be reused, so that elements from one model can be used in another model. Then, he showed how to take portions of the model and define them as a service using a visual wrapper, much like a subprocess wrapper visualization in BPMN, where the relationship lines that cross the service boundary become the inputs and outputs to the service. Cool. The service can then be deployed as an executable service using (in his demo) the Red Hat platform; from there, he tested its execution using a generated HTML form, generated the REST API or Open API interface code, ran predefined test cases based on the DMN TCK, promoted the service from test to production, and published it to an API publisher platform such as WSO2 for public consumption. The execution environment includes debugging and audit logs, providing traceability on the decision services.
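Denis didn’t dwell on the generated API itself, but a consuming application would call the deployed decision service with something like the following sketch; the endpoint and input names here are hypothetical placeholders for whatever the platform actually generates.

```python
import requests

# Hypothetical REST endpoint for a deployed "rental price" decision service;
# the real URL and input names would come from the generated API definition.
DECISION_SERVICE_URL = "https://decisions.example.com/rental-pricing/v1/evaluate"

def best_rental_price(customer_tier: str, vehicle_class: str, rental_days: int) -> dict:
    """Invoke the decision service with the model's input data, get its outputs back."""
    payload = {
        "Customer Tier": customer_tier,
        "Vehicle Class": vehicle_class,
        "Rental Days": rental_days,
    }
    resp = requests.post(DECISION_SERVICE_URL, json=payload, timeout=10)
    resp.raise_for_status()
    return resp.json()  # e.g. {"Offered Price": 54.00}

print(best_rental_price("gold", "compact", 3))
```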

Timing the Stock Market with DMN, Bruce Silver Associates

Bruce Silver, also a huge contributor to the BPMN and DMN standards, author of the BPMN Method & Style books and now the DMN Method & Style, presented an application for buying a stock at the right time based on price patterns. For investors who time the market based on price, the best way to do this is to look at daily min/max trends and fit them to one of several base type models. Bruce figured that this could be done with a decision table applied to a manipulated version of the data, and automated this for a range of stocks using a one-year history, processing in Excel, and decision services in the Trisotech cloud. This is a practical example of using decision services in a low-code environment by non-programmers to do something useful. His demo showed us the decision model for doing this, then the data processing (smoothing) done in Excel. However, for an application that you want to run every day, you’re probably not going to want to do the manual import/export of data, so he showed how to automate/orchestrate this with Microsoft Flow, which can still use the Excel sheet for data manipulation but automates the data import, executes the decision service, and publishes the results back to the same Excel file. This was a good demonstration of the democratization of decisioning applications through easy-to-use tools such as the graphical DMN modeler, Excel and Flow, highlighting that DMN is an execution language as well as a requirements language. Bruce has also just published a new book, the DMN Cookbook, co-authored with Edson Tirelli of Red Hat, on getting started with DMN business implementations using lightweight stateless decision services called via REST APIs.
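As a rough idea of the kind of logic involved (my own simplified reconstruction, not Bruce’s actual model), the smoothing and the decision-table-style classification might look something like this, with made-up thresholds:

```python
# Simplified reconstruction of the idea: smooth daily prices, then classify
# the recent trend with decision-table-style rules. Thresholds are made up.
def smooth(prices: list, window: int = 5) -> list:
    """Simple moving average over the daily closing prices."""
    return [
        sum(prices[max(0, i - window + 1): i + 1]) / len(prices[max(0, i - window + 1): i + 1])
        for i in range(len(prices))
    ]

def classify(smoothed: list) -> str:
    """Decision-table-style rules on the recent smoothed trend."""
    recent, earlier = smoothed[-1], smoothed[-10]
    change = (recent - earlier) / earlier
    if change > 0.05:
        return "uptrend: hold"
    if change < -0.05:
        return "downtrend: wait"
    return "flat bottom: candidate buy"

prices = [20 + 0.1 * d for d in range(60)]  # stand-in for a history of daily closes
print(classify(smooth(prices)))
```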

Smarter Contracts with DMN, Red Hat

Edson Tirelli of Red Hat, Bruce Silver’s co-author on the above-mentioned DMN Cookbook, finished this section of DMN presentations with a combination of blockchain and DMN, where DMN is used to define the business language for calculations within a smart contract. His demo showed a smart land registry case: a transaction for selling a property involving a seller, a buyer and a settlement service created in DMN that calculates taxes and insurance, with the purchase being executed using cryptocurrency. He mentioned Vanessa Bridge’s demo from earlier today, which showed using BPMN to define smart contract flows; this adds another dimension to the same problem, and there’s likely no reason why you wouldn’t use them all together in the right situation. Edson said that he was inspired, in part, by this post on smart contracts by Paul Lachance, in which Lachance said “a visual model such as a BPMN and/or DMN diagram could be used to generate the contract source code via a process-engine”. He used Ethereum for the blockchain smart contract and the Ether cryptocurrency, Trisotech for the DMN models, and Drools for the rules execution. All in all, not such a far-fetched idea.
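The division of labour in a setup like this might look roughly as follows; this is a hypothetical sketch rather than Edson’s actual code, with a made-up settlement service endpoint. The DMN service computes the settlement figures, and those figures are packaged into the payload that the smart contract’s transfer function would receive on-chain.

```python
import requests

# Hypothetical DMN settlement service endpoint; the real property-transfer
# inputs and outputs would come from the DMN model itself.
SETTLEMENT_SERVICE = "https://decisions.example.com/land-transfer/v1/evaluate"

def settlement_figures(sale_price: float, province: str) -> dict:
    """Ask the DMN decision service for the taxes and insurance on the sale."""
    resp = requests.post(
        SETTLEMENT_SERVICE,
        json={"Sale Price": sale_price, "Province": province},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()  # e.g. {"Land Transfer Tax": ..., "Title Insurance": ...}

def build_transfer(seller: str, buyer: str, sale_price: float, province: str) -> dict:
    """Assemble the payload a smart contract's transfer function would receive."""
    figures = settlement_figures(sale_price, province)
    total = sale_price + sum(figures.values())
    return {
        "from": buyer,
        "to": seller,
        "salePrice": sale_price,
        "settlement": figures,
        "totalDue": total,   # amount to be paid in cryptocurrency on-chain
    }
```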

I’m still catching flak for suggesting the now-ubiquitous Ignite style for presentations here at bpmNEXT; my next lobbying effort will be around restricting the maximum number of words per slide. 🙂