Category Archives: digital business

AIIM breakfast meeting on Feb 16: digital transformation and intelligent capture

I’m speaking at the AIIM breakfast meeting in Toronto on February 16, with an updated version of the presentation that I gave at the ABBYY conference in November on digital transformation and intelligent capture. ABBYY is generously sponsoring this meeting and will give a brief presentation/demo on their intelligent capture and text analytics products after my presentation.

Here’s the description of my talk:

This presentation will look at how digital transformation is increasing the value of capture and text analytics, recognizing that these technologies provide an “on ramp” to the intelligent, automated processes that underlie digital transformation. Using examples from financial services and retail companies, we will examine the key attributes of this digital transformation. We will review, step by step, the role of intelligent capture in digital transformation, showing how a customer-facing financial services process is changed by intelligent capture technologies. We will finish with a discussion of the challenges of introducing intelligent capture technologies as part of a digital transformation initiative.

You can register to attend here, and there’s a discount if you’re an AIIM member.

You can read about last month’s meeting here, which featured Jason Bero of Microsoft talking about SharePoint and related Microsoft technologies that are used for classification, preservation, protection and disposal of information assets.

TechnicityTO 2016: Challenges, Opportunities and Change Agents

The day at Technicity 2016 finished up with two panels: the first on challenges and opportunities, and the second on digital change agents.

The challenges and opportunities panel, moderated by Jim Love of IT World Canada, was more of a fireside chat with Rob Meikle, CIO at City of Toronto, and Mike Williams, GM of Economic Development and Culture, both of whom we heard from in the introduction this morning. Williams noted that they moved from an environment of few policies and fewer investments under the previous administration to a more structured and forward-thinking environment under Mayor John Tory, and that this introduced a number of IT challenges; although the City can’t really fail in the way that a business can fail, it can be ineffective at serving its constituents. Meikle added that they have a $12B operating budget and $33B in capital investments, so we’re not talking about small numbers: even at those levels, there needs to be a fair amount of justification that a solution will solve a civic problem rather than just buying more stuff. This is not just a challenge for the City, but for the vendors that provide those solutions.

There are a number of pillars to technological advancement that the City is striving to establish:

  • be technologically advanced and efficient in their internal operations
  • understand and address digital divides that exist amongst residents
  • create an infrastructure of talent and services that can draw investment and business to the City

This last point becomes a bit controversial at times, when there is a lack of understanding of why City officials need to travel to promote the City’s capabilities, or support private industry through incubators. Digital technology is how we will survive and thrive in the future, so promoting technology initiatives has widespread benefits.

There was a discussion about talent: both people who work for the City, and bringing in businesses that draw private-sector talent. Our now-vibrant downtown core is attractive to tech companies and their employees, fueled by the talent coming out of our universities. The City still has challenges with procurement to bring in external services and solutions: Williams admitted that their processes need improvement, and are hampered by cumbersome procurement rules. Democracy is messy, and it slows down things that could probably be done a lot faster in a less free state: a reasonable trade. 🙂

The last session of the day looked at examples of digital change agents in Toronto, moderated by Fawn Annan of IT World Canada, and featuring Inspector Shawna Coxon of the Toronto Police Service, Pam Ryan from Service Development & Innovation at the Toronto Public Library, Kristina Verner, Director Intelligent Communities of Waterfront Toronto, and Sara Diamond, President of OCAD University. I’m a consumer and a supporter of City services such as these, and I love seeing the new ways that they’re finding to include all residents and advance technology. Examples of initiatives include fiber broadband for all waterfront community residences regardless of income level; providing mobile information access to neighbourhood police officers to allow them to get out of their cars and better engage with the community; integrating arts and design education with STEM for projects such as transit and urban planning (STEAM is the new STEM); and digital innovation hubs at some library branches to provide community access to high-tech gadgets such as 3D printers.

There was a great discussion about what it takes to be a digital innovator in these contexts: it’s as much about people, culture and inclusion as it is about technology. There are always challenges in measuring success: metrics need to include the public’s opinion of these agencies and their digital initiatives, an assessment of the impact of innovation on participants, as well as more traditional factors such as number of constituents served.

That’s it for Technicity 2016, and kudos to IT World Canada and the City of Toronto for putting this day together. I’ve been to a couple of Technicity conferences in the past, and always enjoy them. Although I rarely do work for the public sector in my consulting business, I really enjoy seeing how digital transformation is occurring in that sector; I also like hearing how my great city is getting better.

TechnicityTO 2016: Open data driving business opportunities

Our afternoon at Technicity 2016 started with a panel on open data, moderated by Andrew Eppich, managing director of Equinix Canada, and featuring Nosa Eno-Brown, manager of Open Government Office at Ontario’s Treasury Board Secretariat, Lan Nguyen, deputy CIO at City of Toronto, and Bianca Wylie of the Open Data Institute Toronto. Nguyen started out talking about how data is a key asset to the city: they have a ton of it gathered from over 800 systems, and are actively working at establishing data governance and how it can best be used. The city wants to have a platform for consuming this data that will allow it to be properly managed (e.g., from a privacy standpoint) while making it available to the appropriate entities. Eno-Brown followed with a description of the province’s initiatives in open data, which includes a full catalog of their data sets including both open and closed data sets. Many of the provincial agencies such as the LCBO are also releasing their data sets as part of this initiative, and there’s a need to ensure that standards are used regarding the availability and format of the data in order to enable its consumption. Wylie covered more about open data initiatives in general: the data needs to be free to access, machine-consumable (typically not in PDF, for example), and free to use and distribute as part of public applications. I use a few apps that use City of Toronto open data, including the one that tells me when my streetcar is arriving; we would definitely not have apps like this if we waited for the City to build them, and open data allows them to evolve in the private sector. Even though those apps don’t generate direct revenue for the City, success of the private businesses that build them does result in indirect benefits: tax revenue, reduction in calls/inquiries to government offices, and a more vibrant digital ecosystem.
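
In practice, “machine-consumable” usually means a structured format like CSV or JSON rather than a PDF. As a minimal sketch of why that matters (the transit feed below is made up for illustration, not an actual City of Toronto data set), a structured extract can be parsed and queried in a few lines:

```python
import csv
import io

# Hypothetical open-data extract in CSV: trivially parseable,
# unlike the same table locked inside a PDF.
feed = """route,stop,next_arrival_min
504,King & Spadina,3
504,King & Bathurst,7
"""

# Parse the feed into dictionaries keyed by column name.
rows = list(csv.DictReader(io.StringIO(feed)))

# Find the stop with the soonest arrival -- the kind of query a
# third-party transit app runs against open data.
next_up = min(rows, key=lambda r: int(r["next_arrival_min"]))
print(next_up["stop"])  # King & Spadina
```

This is exactly the sort of processing that private-sector apps built on open data depend on, and it only works when the data is published in a standard, parseable format.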

Although data privacy and security are important, these are often used as excuses for not sharing data when an entity benefits unduly by keeping it private: the MLS comes to mind with the recent fight to open up real estate listings and sale data. Nguyen repeated the City’s plan to build a platform for sharing open data in a more standard fashion, but didn’t directly address the issue of opening up data that is currently held as private. Eno-Brown more directly addressed the protectionist attitude of many public servants towards their data, and how that is changing as more information becomes available through a variety of online sources: if you can Google it and find it online, what’s the sense in not releasing the data set in a standard format? They perform risk assessments before releasing data sets, which can result in some data cleansing and redaction, but they are focused on finding a way to release the data if at all feasible. Interestingly, many of the consumers of Ontario’s open data are government of Ontario employees: it’s the best way for them to find the data that they need to do their daily work. Wylie addressed the people and cultural issues of releasing open data, and how understanding what people are trying to do with the data can facilitate its release. Open data for business and open data for government are not two different things: they should be covered under the same initiatives, and private-public partnerships leveraged where possible to make the process more effective and less costly. She also pointed out that shared data — that is, within and between government agencies — still has a long way to go, and should be prioritized over open data where it can help serve constituents better.

The issue of analytics came up near the end of the panel: Nguyen noted that it’s not just the data, but what insights can be derived from the data in order to drive actions and policies. Personally, I believe that this is well-served by opening up the raw data to the public, where it will be analyzed far more thoroughly than the City is likely to do themselves. I agree with her premise that open data should be used to drive socioeconomic innovation, which supports my idea that many of the apps and analyses are likely to emerge from outside the government, but likely only if more complete raw data are released rather than pre-aggregated data.

TechnicityTO 2016: IoT and Digital Transformation

I missed a couple of sessions, but made it back to Technicity in time for a panel on IoT moderated by Michael Ball of AGF Investments, featuring Zahra Rajani, VP Digital Experience at Jackman Reinvents, Ryan Lanyon, Autonomous Vehicle Working Group at City of Toronto, and Alex Miller, President of Esri Canada. The title of the panel was Drones, Driverless Cars and IoT, with a focus on how intelligent devices interact with citizens in the context of a smart city. I used to work in remote sensing and geographic information systems (GIS), and having the head of Esri Canada talk about how GIS acts as a smart fabric on which these devices live is particularly interesting to me. Miller talked about how there needs to be a framework and processes for enabling smarter communities, from observation and measurement, data integration and management, visualization and mapping, analysis and modeling, planning and design, and decision-making, all the way to action. The vision is a self-aware community, where smart devices that are built into infrastructure and buildings can feed back into an integrated view that can inform and decide.

Lanyon talked about autonomous cars in the City of Toronto, from the standpoint of the required technology, public opinion, and cultural changes away from individual car ownership. Rajani followed with a brief on potential digital experiences that brands create for consumers, then we circled back to the other two participants about how the city can explore private-public sensor data sharing, whether for cars or retail stores or drones. They also discussed issues of drones in the city: not just regulations and safety, but the issue of sharing space both on and above the ground in a dense downtown core. A golf cart-sized pizza delivery robot is fine for the suburbs with few pedestrians, but just won’t work on Bay Street at rush hour.

The panel finished with a discussion on IoT for buildings, and the advantages of “sensorizing” our buildings. It goes back to being able to gather better data, whether it’s external local factors like pollution and traffic, internal measurements such as energy consumption, or visitor stats via beacons. There are various uses for the data collected, both by public and private sector organizations, but you can be sure that a lot of this ends up in those silos that Mark Fox referred to earlier today.

The morning finished with a keynote by John Tory, the mayor of Toronto. This week’s shuffling of City Council duties included designating Councillor Michelle Holland as Advocate for the Innovation Economy, since Tory feels that the city is not doing enough to enable innovation for the benefit of residents. Part of this is encouraging and supporting technology startups, but it’s also about bringing better technology to bear on digital constituent engagement. Just as I see with my private-sector clients, online customer experiences for many services are poor, internal processes are manual, and a lot of things only exist on paper. New digital services are starting to emerge at the city, but it’s a bit of a slow process and there’s a long road of innovation ahead. Toronto has made commitments to innovation in technology as well as arts and culture, and is actively seeking to bring in new players and new investments. Tory sees the Kitchener-Waterloo technology corridor as a partner with the Toronto technology ecosystem, not a competitor: building a 21st century city requires bringing the best tools and skills to bear on solving civic problems, and leveraging technology from Canadian companies brings benefits on both sides. We need to keep moving forward to turn Toronto into a genuinely smart city to better serve constituents and to save money at the same time, keeping us near or at the top of livable city rankings. He also promised that he will step down after a second term, if he gets it. 🙂

Breaking now for lunch, with afternoon sessions on open data and digital change agents.

By the way, I’m blogging using the WordPress Android app on a Nexus tablet today (aided by a Microsoft Universal Foldable Keyboard), which is great except it doesn’t have spell checking. I’ll review these posts later and fix typos.

Exploring City of Toronto’s Digital Transformation at TechnicityTO 2016

I’m attending the Technicity conference today in Toronto, which focuses on the digital transformation efforts in our city. I’m interested in this both as a technologist, since much of my work is related to digital transformation, and as a citizen who lives in the downtown area and makes use of a lot of city services.

After brief introductions by Fawn Annan, President and CMO of IT World Canada (the event sponsor), Mike Williams, GM of Economic Development and Culture with City of Toronto, and Rob Meikle, CIO at City of Toronto, we had an opening keynote from Mark Fox, professor of Urban Systems Engineering at University of Toronto, on how to use open city data to fix civic problems.

Fox characterized the issues facing digital transformation as potholes and sinkholes: the former are a bit more cosmetic and can be easily paved over, while the latter indicate that some infrastructure change is required. Cities are, he pointed out, not rocket science: they’re much more complex than rocket science. As systems, cities are complicated as well as complex, with many different subsystems and components spanning people, information and technology. He showed a number of standard smart city architectures put forward by both vendors and governments, and emphasized that data is at the heart of everything.

He covered several points about data:

  • Sparseness: the data that we collect is only a small subset of what we need, it’s often stored in silos and not easily accessed by other areas, and it’s frequently lost (or inaccessible) after a period of time. In other words, some of the sparseness is due to poor design, and some is due to poor data management hygiene.
  • Premature aggregation, wherein raw data is aggregated spatially, temporally and categorically when you think you know what people want from the data, removing their ability to do their own analysis on the raw data.
  • Interoperability and the ability to compare information between municipalities, even for something as simple as date fields and other attributes. Standards for these data sets need to be established and used by municipalities in order to allow meaningful data comparisons.
  • Completeness of open data, that is, what data a government chooses to make available, and whether it is available as raw data or in aggregate. This needs to be driven by the problems that the consumers of the open data are trying to solve.
  • Visualization, which is straightforward when you have a couple of data sets, but much more difficult when you are combining many data sets — his example was the City of Edmonton using 233 data sets to come up with crime and safety measures.
  • Entitlement: governments often feel a sense of entitlement about their data, such that they choose to hold back more than they should, whereas they should be in the business of empowering citizens to use this data to solve civic problems.
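
Fox’s point about premature aggregation can be shown in a few lines. In this minimal sketch (the service-request records are invented for illustration), publishing only per-ward totals destroys the temporal detail that a later analysis needs, while the raw records keep every question answerable:

```python
from collections import Counter

# Hypothetical raw service-request records: each record keeps
# full spatial, temporal and categorical detail.
raw = [
    {"ward": 4, "hour": 8,  "type": "pothole"},
    {"ward": 4, "hour": 9,  "type": "pothole"},
    {"ward": 5, "hour": 23, "type": "noise"},
    {"ward": 5, "hour": 2,  "type": "noise"},
]

# Premature aggregation: publishing only per-ward counts...
aggregated = Counter(r["ward"] for r in raw)  # Counter({4: 2, 5: 2})

# ...makes a time-of-day question impossible to answer from the
# published data, although the raw records support it directly.
night_noise = [
    r for r in raw
    if r["type"] == "noise" and (r["hour"] >= 22 or r["hour"] < 6)
]
print(dict(aggregated), len(night_noise))
```

Once the data is rolled up, there is no way back to the hourly view: any analysis the publisher didn’t anticipate is simply foreclosed, which is exactly Fox’s argument for releasing raw rather than pre-aggregated data.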

Smart cities can’t be managed in a strict sense, Fox believes; rather, it’s a matter of managing complexity and uncertainty. We need to understand the behaviours that we want the system (i.e., the smart city) to exhibit, and work towards achieving those. This involves more than just sensing the environment: it also means understanding limits and constraints, plus knowing when deviations are significant and who needs to know about them. These systems need to be responsive and goal-oriented, flexibly responding to events based on desired outcomes rather than a predefined process (or, as I would say, unstructured rather than structured processes): this requires situational understanding, flexibility, shared knowledge and empowerment of the participants. Systems also need to be introspective, that is, compare their performance to goals and find new ways to achieve goals more effectively and predict outcomes. Finally, cities (and their systems) need to be held accountable for actions, which requires that activities be auditable to determine responsibility, and the underlying basis for decisions be known, so that a digital ombudsman can provide oversight.

Great talk, and very aligned with what I see in the private sector too: although the terminology is a bit different, the principles, technologies and challenges are the same.

Next, we heard from Hussam Ayyad, director of startup services at Ryerson University’s DMZ — a business incubator for tech startups — on Canadian FinTech startups. The DMZ has incubated more than 260 startups that have raised more than $206M in funding over their six years in existence, making them the #1 university business incubator in North America, and #3 in the world. They’re also ranked most supportive of FinTech startups, which makes sense considering their geographic proximity to Toronto’s financial district. Toronto is already a great place for startups, and this definitely provides a step up for the hot FinTech market by providing coaching, customer links, capital and community.

Unfortunately, I had to duck out partway through Ayyad’s presentation for a customer meeting, but plan to return for more of Technicity this afternoon.

Intelligent Capture enables Digital Transformation at #ABBYYSummit16

I’ve been in beautiful San Diego for the past couple of days at the ABBYY Technology Summit, where I gave the keynote yesterday on why intelligent capture (including recognition technologies and content analytics) is a necessary onramp to digital transformation. I started my career in imaging and workflow over 25 years ago – what we would now call capture, ECM and BPM – and I’ve seen over and over again that if you don’t extract good data up front as quickly as possible, then you just can’t do a lot to transform those downstream processes. You can see my slides at Slideshare as usual:

I’m finishing up a white paper for ABBYY on the same topic, and will post a link here when it’s up on their site. Here’s the introduction (although it will probably change slightly before final publication):

Data capture from paper or electronic documents is an essential step for most business processes, and often is the initiator for customer-facing business processes. Capture has traditionally required human effort – data entry workers transcribing information from paper documents, or copying and pasting text from electronic documents – to expose information for downstream processing. These manual capture methods are inefficient and error-prone, but more importantly, they hinder customer engagement and self-service by placing an unnecessary barrier between customers and the processes that serve them.

Intelligent capture – including recognition, document classification, data extraction and text analytics – replaces manual capture with fully-automated conversion of documents to business-ready data. This streamlines the essential link between customers and your business, enhancing the customer journey and enabling digital transformation of customer-facing processes.
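
The capture steps named above can be reduced to a toy sketch. This is purely illustrative (not ABBYY’s API or any real product): document classification collapsed to a keyword rule, and data extraction collapsed to regular expressions over already-recognized text:

```python
import re

def classify(text: str) -> str:
    """Naive document classification by keyword (stand-in for a real classifier)."""
    return "invoice" if "invoice" in text.lower() else "correspondence"

def extract(text: str) -> dict:
    """Naive data extraction: pull an invoice number and a total amount."""
    number = re.search(r"Invoice\s+#(\d+)", text)
    total = re.search(r"Total:\s*\$([\d.]+)", text)
    return {
        "invoice_number": number.group(1) if number else None,
        "total": float(total.group(1)) if total else None,
    }

# Pretend this text came out of the recognition (OCR) step.
doc = "Invoice #1042\nWidgets x 3\nTotal: $74.50"
print(classify(doc), extract(doc))
```

A real intelligent capture product replaces each of these stubs with trained models, but the shape of the pipeline — recognize, classify, extract, hand business-ready data downstream — is the same.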

I chilled out a bit after my presentation, then decided to attend one presentation that looked really interesting. It was, but was an advance preview of a product that’s embargoed until it comes out next year, so you’ll have to wait for my comments on it. 😉

A well-run event with a lot of interesting content, attended primarily by partners who build solutions based on ABBYY products, as well as many of ABBYY’s team from Russia (where a significant amount of their development is done) and other locations. It’s nice to attend a 200-person conference for a change, where – just like Cheers – everybody knows your name.

American Express digital transformation at Pegaworld 2016

Howard Johnson and Keith Weber from American Express talked about their digital transformation to accommodate their expanding market of corporate card services for global accounts, middle market and small businesses. Digital servicing using their @work portal was designed with customer engagement in mind, and developed using Agile methodologies for improved flexibility and time to market. They developed a set of guiding principles: it needed to be easy to use, scalable to be able to manage any size of servicing customer, and proactive in providing assistance on managing cash flow and other non-transactional interactions. They also wanted consistency across channels, rather than their previous hodge-podge of processes and teams that varied depending on the channel.


AmEx used to be a waterfall development shop — which enabled them to offshore a lot of the development work but meant 10-16 months delivery time — but have moved to small, agile teams with continuous delivery. Interesting when I think back to this morning’s keynote, where Gerald Chertavian of Year Up said that they were contacted by AmEx about providing trained Java/Pega developers to help them with re-onshoring their development teams; the AmEx presenter said that he had four of the Year Up people on his team and they were great. This is a pretty negative commentary on the effectiveness of outsourced, offshore development teams for agile and continuous delivery, which is considered essential for today’s market. AmEx is now hiring technical people for onshore development that is co-located with their business process experts, greatly reducing delivery times and improving quality.


Technology-wise, they have moved to an omni-channel platform that uses Pega case management, standardizing 65% of their processes while providing a single source of the truth. This has resulted in faster development (lower cost per market and integration time, with improved configurability) while enabling future capabilities including availability, analytics and a process API. On the business side, they’re looking at a lot of interesting capabilities for the future: big data-enabled insights, natural language search, pluggable widgets to extend the portal, and frequent releases to keep rolling this out to customers.

It sounds like they’re starting to use best practices from a technology design and development standpoint, and that’s really starting to pay off in customer experience. It will be interesting to see if other large organizations — with large, slow-moving offshore development shops — can learn the same lessons.

Rethinking personal data: Pegaworld 2016 panel

I attended a breakout panel on how the idea and usage of personal data are changing, moderated by Alan Marcus of the World Economic Forum (nice socks!), with panelists Richard Archdeacon of HP, Rob Walker of Pega and Matt Mobley of Merkle.


The focus is on customer data as it is maintained in an organization’s systems, and the regulations that now drive how that data is managed. The talk was organized around three key themes that are emerging from the global dialog: strengthening trust and accountability; understanding usage-based, individual-centric frameworks; and engaging the individual. Thoughts from the panel:

  • Once you have someone’s data, you remain responsible for it even as you pass it to other parties
  • Customer data management is now regulation-driven
  • It’s not enough to restrict values in a customer data set; it’s now possible to derive hidden values (such as gender or race) from other values, which can result in illegal targeting: how much effort should be put into anonymizing data when it can be easily deanonymized?
  • Organizations need to inform customers of what data that they have about them, and how it is being used
  • Consumers want the convenience offered by giving up their data more than they fear misuse of the data
  • The true currency of identity for organizations is an email address and one other piece of data, which can then be matched to a vast amount of data from other sources
  • The biggest consumer fear is data privacy violation from a security breach (about which there is a high level of hysteria), but possibly they should be more afraid of how the companies that they willingly give the data to are going to use it
  • Personal data includes data that you create, data that others create about you, and data that is inferred based on your activities
  • Many people maintain multiple identities on social media sites, curated differently for professional and personal audiences
  • Personal health data, including genetic data, has an additional set of concerns since it can impact individual healthcare options
  • Unresolved question of when personal data is no longer personal data, e.g., after a certain amount of aggregation and analysis occurs
  • Issues of consent (by customers to use their data) are becoming more prominent, and using data without consent will be counter to the regulations in most jurisdictions
  • Many smaller businesses will find it difficult to meet security compliance regulations; this may drive them to use cloud services where the provider assumes some degree of security responsibility

Food for thought. A lot of unresolved issues in personal data privacy and management.

Pegaworld 2016 day 2 keynote: digital transformation and the 4th industrial revolution

Day 2 of Pegaworld 2016 – another full day on the schedule.

The keynote started with Gilles Leyrat, SVP of Customer and Partner Services at Cisco, discussing how they became a more digital operation in order to provide better customer service and save costs. Cisco equipment provides a huge part of the backbone of the internet, supporting digital transformation for many other organizations, but this was about how they are transforming themselves to keep pace with their customers as well as their competitors. They are using Pega to digitize their business by connecting people and technology, automating processes, and using data for real-time analytics and process change to support their 20,000-strong sales team and 2M orders per year.


Their digitization has three key goals: operational excellence, revenue growth, and “delightful” customer experience. Customer experience is seen as being crucial to revenue growth, with strong causal links showing up in research. He compared the old world — offshore customer service centers augmented by onshore specialists — with the new digital world, where digitization is a means to achieving their customer experience goal by simplifying, automating and using analytics. By reducing human touch in many standard processes, they are able to reduce wait time for customers while allowing workers to focus on interacting with customers to resolve problems: 93% of cases are now handled with zero touch, saving 2M hours of wait time per year and reducing order resolution time to 6 hours. The employee experience is improved through integrated workplaces and actionable intelligence that support their work patterns. He ended with the advice to understand what you’re trying to achieve, and to link your digital transformation initiatives to those goals.

Next was a panel on digital transformation moderated by Christopher Paquette, Digital Principal at McKinsey, including Alistair Currie, COO at ANZ Bank; Toine Straathof, EVP at Rabobank; Kevin Sullivan, SVP and Head of the Decision Sciences Group at Fifth Third Bank; and Nicole Gleason, Practice Lead for Business Intelligence & Analytics at Comet Global Consulting. A few notes from the panel (I mostly haven’t attributed to the specific speaker since the conversation was free-ranging):

  • Digital transformation is being driven by rapidly-changing customer expectations
  • Banking customers prefer mobile/online first, then ATM, then branch, then call center; this aligns well with operational costs but requires that the digital platforms be built out first
  • Moving internal stakeholders off their old methods and out of operational silos can be more difficult than dealing with regulators and other external parties
  • Making IT and business people responsible for results (e.g., a guiding business architecture) rather than dictating their exact path can lead to innovation and optimal solutions
  • Employee incentives need to be consistent across channels to lessen the competition across them
  • A lot of current digitization efforts are to bridge/hide the complexity of existing legacy systems rather than actual digital transformation

Alan Trefler returned to the stage to introduce the concepts of the fourth industrial revolution and workforce disruption; he sees what is happening now as a step change in how society works and how we interact with technology. We heard from Alan Marcus, Head of the Technology Agenda at the World Economic Forum, on this topic, and how new categories of jobs and the required skill sets will completely transform employment markets. Lots of opportunities, but also lots of disruption, in both first world and emerging markets. He covered a timeline of changes and their impacts, and stressed that skill sets are changing quickly: 35% of core skills will change by 2020. Companies need to expose workers to new roles and training, and particularly open doors to women in all roles. Creativity will become a core skill, even as AI technologies gain acceptance. Governments and education systems need to innovate to support the changing workforce. Organizations need to reinvent their HR to help employees to move into this brave new world.

The keynote finished with Gerald Chertavian, Founder and CEO at Year Up, an organization that helps low-income youth prepare for a professional job. There’s a social justice goal of helping young adults who have no college degree (and no path to get one) to become hireable talent through practical training and internships; but there’s also the side benefit of feeding skilled workers into the rapidly-changing technology-heavy employment market that Marcus discussed earlier. Year Up was contacted by American Express, who needed people trained in Java and Pega in order to re-onshore some of their development work; they created a curriculum targeted at those jobs and trained up a large number of people who then competed successfully for those jobs. Year Up is now in 18 cities across the US, working with large organizations to identify skills gaps and train people to suit the employment pipeline. They’re changing tens of thousands of lives by providing a start on the path to upward mobility, and feeding a need for companies to hire the right skills in order to transform in this fourth industrial revolution.


Destination: Digital at the TIBCONOW 2016 day 1 keynote

TIBCO took a bit of a hiatus from its conference schedule while being acquired, but is back in force this week in Las Vegas with TIBCO NOW 2016. The theme is “Destination: Digital” with a focus on innovation, not just optimization, and the 2,000 attendees span TIBCO’s portfolio of products. You can catch the live stream here, which covers at least the general sessions each morning.

CMO Thomas Been opened the day by positioning TIBCO as a platform for digital transformation, then was joined by CEO Murray Rode. Rode talked about TIBCO’s own transformation over the 18 months since the last conference, and how their customers are using TIBCO technology for real-time operations, analyzing and predicting consumer needs, and enhancing the customer experience in the fourth industrial revolution that we’re experiencing. He used three examples to illustrate the scope of digital business transformation:

  • A banking customer applies and is approved for a loan through the bank’s mobile app, without documents and signatures
  • A consumer’s desires are predicted based on their behavior, and they are offered the right product at the right time
  • A customer’s order (or other interaction with a business) is followed in real-time to enhance their experience

Although TIBCO has always been about real-time, he pointed out that real-time has become the new norm: consumers don’t want to wait for information or responses, and the billions of interconnected smart devices are generating events all the time. The use of TIBCO’s software is shifting from the systems of record — although that is still their base of strength — to the systems of engagement: from the core to the edge. That means not only different types of technologies, but also different development and deployment methodologies. Their goals: interconnect everything, and augment intelligence; these also seem to represent the two main divisions of their products.

That set the stage for Ray Kurzweil, the author and futurist, who spoke about the revolution in artificial intelligence-driven innovation supported by the exponential growth in computing capabilities. The drastically dropping price/performance ratio of computing is what is enabling innovation: in some cases, innovation doesn’t occur on a broad scale if it’s not cost effective. He had lots of great examples of how innovation has occurred and will continue to evolve in the future, especially around human biology, finishing up with Thomas Been joining him on stage for a conversation about Kurzweil’s research as well as the opportunities facing TIBCO’s customers. I didn’t capture most of the detail here; check for a replay on the live stream.


Matt Quinn, TIBCO’s CTO, took over with a product overview. In this keynote, he looked at the “interconnect everything” products, leaving the “augment intelligence” side of the portfolio for tomorrow’s keynote. They’ve set some core principles for all product development: cloud first (including on-premises and hybrid as well as public cloud), ease of use (persona-based UX, industry solutions, and a support community), and industrialization (cross-product integration, more open DevOps, and IoT). He expanded the idea of “interconnect everything” to “interconnect everything, everywhere”, and brought in VP of engineering Randy Menon to talk about their cloud platform strategy, specifically as it relates to integration. Echoing Quinn, Menon talked about how TIBCO has always built great products for the core (“products for the CIO”, as he put it), but is now looking at addressing different audiences.

Quinn went through some of the new functionality in their interconnection portfolio, including enhancements to ActiveMatrix BusinessWorks, ActiveMatrix BPM (now including case management and more flexible UI building), TIBCO MDM, and FTL messaging. He also introduced and demoed BusinessWorks Container Edition for cloud-native integration, supporting a number of standard cloud container services; TIBCO Cloud Integration, which enables iPaaS use cases in a point-and-click environment; and Microflows using Node.js. He talked about their Mashery acquisition and what’s coming up in the API management product: real-time APIs, richer visual analytics leveraging Spotfire, and a cloud-native hybrid gateway. Combined with the other cloud products, this provides an end-to-end environment for creating and deploying cloud APIs. But their technology advances aren’t just for developers: they’re also for “digital citizens” who want to integrate and automate a variety of cloud tools using Simplr, which allows for simple workflows and forms. Nimbus Maps, a slimmed-down version of Nimbus, is also a tool for business people who want to do some of their own process documentation.

Rajeev Kozhikkattuthodi, director of product marketing, came up to announce Project Flogo, a lightweight IoT integration product that they intend to make open source. It can be used to create simple workflows that integrate with a variety of devices, built on a Go-based engine with a Slack design bot and an interactive debugger; the runtime is 20-50 times smaller than that of similar development environments. It’s not released yet, but he showed a brief demo, and it’s apparently on the show floor.


Quinn returned to mention a few other products (TIBCO Expresso, Momento, and their IoT innovations) before turning over to Raj Verma, EVP of worldwide sales, to talk about their customers’ journey during the purchasing process. With 10,000+ customers and $1B in revenue, TIBCO is big but has room to grow, and a better experience during the purchase, installation and scaling of TIBCO products would help with that. They are starting to roll out some of this, including much more self-service for product information and trial downloads; enhancements to the TIBCO community to include more training materials and support; standardized pricing for product suites; and online purchasing. Although there is still a significant field sales force to help you along, it’s possible to do much more directly, and they’re enhancing their partner channel (which Verma admitted has had some significant problems in the past) if you already have a trusted service provider. It’s a much more customer-focused approach to sales and implementation, and one that was certainly required to make them more competitive.

A marathon 3-hour general session, with a lot of good content. I’m looking forward to the rest of the conference.

I’ll be speaking on a panel this afternoon on the topic of digital business; drop by and say hi if you’re at the conference.