Reading And Writing Resolutions For 2014

The past year was a pretty busy one for me, with quite a bit of enterprise client work that I couldn’t discuss here, so my blogging was mostly limited to conferences that I attended and webinars/white papers that I created. Okay for information dissemination, but not so much for starting conversations, which is why I started blogging 9 years, 2,400 posts and 850k words ago. I’m also way behind on my reading of other blogs, so much so that the older unread ones are starting to drop out of my newsreader.

Catching up on the reading will likely involve committing a drastic act in my newsreader (clearing all unread – yikes!), trimming down the blogs that I follow, and making time regularly to browse for interesting developments in blogs and Twitter.

Getting back to some more interesting writing will follow from the reading: reading other people’s interesting ideas always helps me to generate some of my own, then it’s just a matter of putting hands to keyboard on a regular basis, and letting the ideas out into the wild.

Here’s to 2014!

camunda BPM 7.0 Refactors Engine And Process Administration

On August 31, camunda released camunda BPM platform 7.0 (community open source and enterprise editions), the first major release of the software since it was forked from the Activiti project in March, although there were nine community releases between the fork and the release of 7.0. In the weeks that followed, I had a couple of briefings with Daniel Meyer, the project lead, and promised that I’d actually get this post written in time for Christmas. 🙂

The 7.0 release contains a significant amount of new code, but their focus remains the same: a developer-friendly BPM platform rather than a tool positioned for use by end users or non-technical analysts. As I discussed in a recent webinar, BPMSs have become model-driven application development environments, so a BPMS positioned explicitly for developers addresses a large market segment, especially for complex core processes.

The basic tasklist and other modeler and runtime features are mostly unchanged in this version, but there are big changes to the engine and to Cockpit, the technical process monitoring/administration module. Here’s what’s new:

Cockpit:

  • Inspect/repair process instances, including retrying failed service calls.
  • Create instance variables at runtime, and update variable values.
  • Reassign human activities.
  • Send a link directly to a specific instance or view.
  • Create a business key corresponding to a line-of-business system variable, providing a fast and easy way to search on LOB data.
  • Extensible via third-party plug-ins. The aim with Cockpit is to solve 80% of use cases, then allow plug-ins from consulting partners and customers to handle the remainder; they provide full instructions on how to develop a Cockpit plug-in (see the sketch following this list).
  • Add tabs to the detailed views of process instance, e.g., a link to LOB or other external data.
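
For a sense of what that extensibility looks like, here’s a minimal sketch of the server-side entry point along the lines of their plug-in tutorial; the class name and plug-in id are my own placeholders, the SPI details are from memory of their documentation rather than definitive, and a real plug-in also adds custom queries, REST resources and client-side assets as described in their instructions.

```java
import org.camunda.bpm.cockpit.plugin.spi.impl.AbstractCockpitPlugin;

// Server-side entry point; Cockpit discovers it via a META-INF/services entry
// for the CockpitPlugin SPI. Custom queries and UI assets are added by
// overriding further methods described in the camunda tutorial.
public class LobDataPlugin extends AbstractCockpitPlugin {

  public static final String ID = "lob-data-plugin"; // hypothetical plug-in id

  @Override
  public String getId() {
    return ID;
  }
}
```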

Engine:

  • A new authorization framework (also manifesting in admin capabilities for users/groups/authorizations): this is a preview feature in 7.0, supporting only application, group and group membership authorizations. In the future, this will be expanded to include process definition and instance authorization. Users can be maintained in an internal camunda database or through a direct link to LDAP.
  • A complete rewrite of the history/audit log, splitting the history and runtime databases, which is potentially a huge performance booster. Updates to the history are triggered by events on running instances, whereas previously, writing history records required querying and updating existing records for that instance. The history log can be redirected to a database shared by multiple process engines; since some of the Cockpit monitoring is done against the history database, redirecting the history logs of several engines to the same database makes it easier to consolidate their monitoring. The logs can also be written directly to an external store based on the new history event stream API (see the sketch following this list). Writes to the history log are asynchronous, which also improves performance. At the time of release, they were seeing preliminary benchmarks of a 10-20% performance improvement in the process engine, and a significant reduction in the runtime database index size.
  • There is some increase in the coverage of the BPMN 2.0 standard; their reference document shows supported elements in orange, with a link from each element through to its description and usage, including code snippets where appropriate. Data objects/stores are still not supported, nor are about half of the event types, but their coverage is similar to that of most vendors in this regard.
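
As a rough sketch of what hooking into that event stream looks like: the engine exposes a history event handler that you can swap in via the process engine configuration. The interface lives in the engine’s internal packages, so treat the signatures below as indicative of the 7.x codebase rather than a stable public API; the queue is just a stand-in for whatever external store you would actually write to.

```java
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

import org.camunda.bpm.engine.impl.history.event.HistoryEvent;
import org.camunda.bpm.engine.impl.history.handler.HistoryEventHandler;

public class ExternalStoreHistoryHandler implements HistoryEventHandler {

  // Stand-in for the external history store: a queue drained by a separate
  // writer, so the runtime transaction never waits on the history write.
  private final BlockingQueue<HistoryEvent> outbound = new LinkedBlockingQueue<HistoryEvent>();

  @Override
  public void handleEvent(HistoryEvent event) {
    outbound.offer(event); // hand off asynchronously instead of writing to the runtime tables
  }

  @Override
  public void handleEvents(List<HistoryEvent> events) {
    for (HistoryEvent event : events) {
      handleEvent(event);
    }
  }
}
```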

Version 7.0 is all open source, but a consolidation release (7.1) is already in alpha and will contain some proprietary administration features in Cockpit that are not available in the open source version: bulk edit/restart of instances, complex search/filter across instances from different process definitions, and a process-level authorizations UI (although the authorization structure will be built into the common engine). camunda is pretty open about their development, as you might expect from an open source company; you can even read some quite technical discussions about design decisions, such as how a new Activity Instance Execution Model was introduced in the process engine in order to improve performance.

In September, camunda released a cloud application for collaborating on process models, camunda share. This is not a full collaborative authoring environment, but a place to upload, view and discuss process models. The camunda team created it during their “ShipIt-Day”, where they are tasked with creating something awesome within 24 hours. There’s no real security: your uploaded model generates a unique URL that you can send to others, although there is an option to anonymize the process model by removing labels if your process contains proprietary information. A cool little side project that could let you avoid sending around PDFs of your process models for review.

camunda’s business model is based on providing and supporting the enterprise edition of the software, which includes some proprietary functions in Cockpit but is otherwise identical to the community open source edition, plus consulting and training services to help you get started with camunda BPM. They provide a great deal of the effort behind the community edition, while encouraging and supporting platform extensions such as fluent testing, PHP developer support and enterprise integration via Apache Camel.


Webinar On Business-IT Alignment In Process Applications

This afternoon, I’m giving a webinar (hosted by Software AG) on business-IT alignment when developing process-centric applications: you can sign up for it or see the replay here.

I’ll be covering some interesting material on model-driven development, and also on why we usually need to use separate modeling tools when we’re building applications for complex core processes.

We’re also developing a white paper on this topic, to be released in the next few weeks; I’ll post a link to that when it’s out.

Technicity2013 Cybersecurity Panel: How Prepared Is Business?

Our afternoon panel was moderated by Pete Deacon of Blackiron Data (another conference sponsor), and featured panelists from private industry: Kevvie Fowler, forensic advisory services at KPMG; Daniel Tobok, digital forensics at TELUS; Jeff Curtis, chief privacy officer at Sunnybrook Hospital; and Greg Thompson, enterprise security services at Scotiabank.

Security breaches happen. And as Deacon reminded us, over 60% of those breaches take months (or years) to detect, and are usually detected by someone outside the organization. What are the real cybersecurity risks, what are companies’ perceptions of the risk, and what are the challenges that we face? Fowler believes that since security is often treated as a low-level IT issue, the security message isn’t making its way up the ladder to the C-suite unless a high-profile breach occurs that requires some sort of executive damage control. Curtis agreed, adding that hospitals are used to dealing with clinical risks right up through the executive levels, but that IT security risks are a new topic for their executive risk management participants. Both noted that it’s important to have the right people to carry that message: it has to be technically correct, but integrated with the business context and goals. Thompson added that the message doesn’t need to be dumbed down for the C-suite: their board is very used to assessing complex financial risk, and is capable of assessing other types of complex risk, although it may need to become versed in some of the cybersecurity language and technology.

The next topic was BYOD (bring your own device), and Thompson pushed the conversation beyond this to BYON(etwork), where people bring their own network, even if just through a smartphone hotspot. Companies are losing control of where people do their work, both devices and network, and solutions should be designed to assume that all endpoints and networks are potentially hostile. Business and productivity have to be balanced with risk in these cases: people will do what they need to do in order to get their job done, and if you think that you’ve avoided security breaches by locking down someone’s access on their corporate device, you can be sure that they’re finding a way around that, possibly on their own device. Curtis agreed, and pointed out that they have a lot of students and interns who come in and out of the hospital environment with their own devices: the key is to enable workers to get their work done and protect the data, not to hamstring their work environment, so they have a device registration policy for BYOD that is working well. Tobok works with a lot of law firms, and notes a recent trend of new lawyers using technology capabilities (including openness to BYOD) as a competitive criterion when selecting a firm to work for.

Moving on to security analytics, Fowler said that there are few organizations actually getting value from predictive security analytics, versus more straightforward data mining: it’s important to query the vendors providing predictive analytics on the models that they’re actually using and the success rates. Thompson agreed that predictive analytics is a bit of black magic right now, but sees a lot of value in historical data analysis as a guide to improving the security environment. In my opinion, in the next two years, predictive analytical models are going to start to become mainstream and useful, moving out of a more purely research phase; we’re seeing this in predictive process analytics as well, which I still talk about in the context of “emerging technologies”. This is all tied up with reporting and compliance, of course: business intelligence and analytics have played, and will continue to play, a key role in detecting breaches and auditing cybersecurity. Both Curtis and Thompson spoke about the regulatory pressures in their respective industries and the growth of analytics and other GRC-related tools; healthcare is obviously a highly-regulated industry, and Scotiabank does business in 55 countries and has to deal with the regulations in all of them. Auditors and regulatory bodies are also having to step up their knowledge about cybersecurity.

There was a question from the audience on investigations of security breaches in cloud environments. Tobok is involved in cybersecurity forensic investigations, including cloud, and discussed the changes that have happened in the industry over the four years that he’s been involved in cloud security forensics in order to provide better traceability and auditing. Fowler added that forensic science is adapting for these types of investigations, and half of the work is just figuring out which systems the data has been resident on, since the typical cloud contract only allows a client to access their data, not the actual servers on which it resides. These investigations can also involve a number of other factors, such as hackers who use compromised credit cards to lease space in a data centre in order to hack into another organization’s data in that same centre; obviously, these complexities don’t exist in breaches of a company’s own data centre.

There was a final panel with five of the vendors who are sponsoring the conference, but my brain was pretty full of security information by then (and I thought that this might be a bit more about their products than I care about) so I decided to duck out before the end.

Another great Technicity conference, and I look forward to next year.

Technicity2013 Cybersecurity Keynote: Microsoft’s Angela McKay

This morning at Technicity 2013, we mostly heard from academics and the public sector; this afternoon, it’s almost all private sector presentations and panels, starting with a keynote from Angela McKay, director of cybersecurity and strategy at Microsoft, on managing cyber risks through different approaches to addressing uncertainty. Risk, and therefore the answer to the question “am I secure enough?”, is quite an individual choice: different people and different companies (and cultures) have different risk thresholds, and therefore may have different cybersecurity strategies.

By 2020, we will have 4B internet users, 50B connected devices, and data volumes 50x those of 2010. As users have evolved, so have cyber threats: from early web defacement hacks, to worms, to the present-day botnets and targeted attacks. There is a spectrum of cybersecurity threats, from crime and disruptions (e.g., DDoS attacks) through espionage and conflict to war; there is a lot of technological development going on around these, but there are also cultural and policy issues, namely the expectations of consumers, companies and governments. McKay discussed the EU network and information security directive and the US executive order and presidential policy directive on cybersecurity, and the levels of new regulation that are coming.

Reducing the impact of cyber threats involves risk management, information exchange, and effective partnership (both public-private and between private organizations). You can’t do risk management without information, and this means that cybersecurity is a CIO-level issue, not just some technical plumbing. Information sharing, however, can’t be indiscriminate; it has to be focused on specific outcomes. [As an aside, I’m not sure that I agree with this in some situations: open data initiatives work because the “owners” of the data can’t conceive of what anyone would do with their data, yet emergent uses happen with interesting results.] Private-public partnerships bring together the public sector’s policies and goals related to public safety, and the private sector’s technical know-how.

She spoke about the shared responsibilities for managing cyber risks: awareness and education, partnering effectively, driving and incentivizing cybersecurity, adopting best practices, building and advancing capabilities, and developing a secure workforce. Furthermore, academia has to step up and start teaching security concepts and remedies at the college and university level, since most developers don’t have much of an idea about cyber risks unless they specialized in security post-graduation.

Microsoft is the premier sponsor of Technicity 2013, although to be fair, McKay’s talk covered very little about their products and services except for some generic discussion about automated cyberdefense at a machine level. Her slides used that ubiquitous font that we see on the Microsoft Windows 8 website, however, so probably some subliminal messaging going on. 🙂

Technicity2013 Cybersecurity Panel: Is Canada Ready?

Andy Papadopulous of Navantis moderated a panel on the Canadian context of cybersecurity, with panelists Rob Meikle, CIO of City of Toronto; Ritesh Kotak, Operation Reboot (cybercrime initiative) at Toronto Police Service; Wesley Wark, professor at University of Ottawa’s graduate school of public and international affairs, and a specialist in national security policy; and Stephen McCammon, legal counsel at the Ontario Information and Privacy Commissioner.

They each spoke about their specific take on privacy and security in Canada:

Meikle: The interconnection and importance of data and technology, and how these are no longer just on computers inside our offices: in addition to cloud computing, we consume information on mobile devices, but also collect and process information from remote devices such as transit vehicles. He addressed the Toronto open data initiative, and how it is critical to look at data from a public citizen perspective rather than an organizational perspective; similar views would not go amiss in private sector organizations and their data.

Kotak: How TPS is having to redefine crime in the era of cybercrime, and how the police force is having to adapt in order to track down online crimes in the same way that they do “real world” crimes in order to protect public safety. His experience in researching how police services are addressing cybercrime is that many of them equated it only with child exploitation (driven, likely, by the federal government’s tendency to do the same in order to justify the over-reaching anti-privacy legislation that we heard about from Michael Geist earlier), but there are obviously many other forms of cybercrime, from financial fraud to hacking pacemakers. They identified a number of areas that they needed to address with respect to cybercrime: overt communication (e.g., social media), investigations, covert operations, and policies and procedures.

Wark: Cyberaggression and its impact on us, with five possible outlets: cyberwar, cyberterrorism, cyber covert operations, cyberespionage and cybercrime. He feels that the first two do not actually exist, that covert operations is an emerging area, while espionage and crime are well-established cyber activities. He maintains that the government’s focus on terrorism in general is a bit ridiculous, considering the lack of any evidence that this is occurring or even imminent (a recent US study showed that Americans are more likely to be killed by their own furniture than by terrorism); and that the government has a difficult time establishing their role and responsibilities in cybersecurity beyond throwing out some simplistic barriers around classified government data. We need to do more with private-public partnerships and education — starting with some simple sharing of best practices — in order to appropriately address all forms of cyberaggression. We need to decide what we really mean by privacy, then define the legal framework for protecting that.

McCammon: How to achieve the balance between privacy and openness. Usability is critical: it’s not enough just to have good authentication, encryption and other services to protect people’s privacy; those tools need to be easy enough for everyone to use (or completely and transparently embedded in other platforms), although Wark countered that this was unlikely to happen. More information is being gathered, and will continue to be gathered, and analytics allow it to be integrated in new ways; there is no putting the toothpaste back in that particular tube, so we need to learn to deal with it in ways that protect us without requiring us to pull the plug and move to the woods. Trust is essential for privacy (although I would add that enforcement of that trust is pretty critical, too).

Good discussion.

Technicity2013 – Focus On Cybersecurity Michael Geist Keynote @mgeist

I can’t believe that it’s been a year since the last Technicity conference: a free conference hosted by IT World Canada, and sponsored this year by McAfee and Microsoft. Last year, the focus was on crowdfunding including some lessons from crowdfunding in the UK and a panel on legalizing equity crowdfunding; this year, it’s about cybersecurity. There’s a strong presence from the city of Toronto here, including an opening address from Councillor Gary Crawford, and the participation of the city’s CIO Rob Meikle on a panel; plus provincial government participation with Blair Poetschke, director of the international trade branch for the Ontario Ministry of Economic Development, and Stephen McCammon, legal counsel at the Office of the Ontario Information and Privacy Commissioner.

Ontario is a hotbed for technology development in Canada, with a large software development community in and around Toronto. Toronto has also been a relatively early provider of open government data, publishing a catalogue of online data, which in turn fosters innovation. The G8 countries have now signed on to a full open data initiative, and this is a good thing: we, as taxpayers, pay to have this information collected, and as long as it doesn’t violate anyone’s privacy, it should be freely available to us. Although this conference isn’t about open data, an environment of freely-available government data is a good place to start talking about security and privacy.

It wouldn’t be a Canadian event about cybersecurity without a keynote by Michael Geist, and he delivered on the topic of “The Internet: Friend or Foe?” (a question that many of us ask daily). Although he started with the answer “friend”, he also immediately addressed the privacy and security concerns that arise from the recent news that the NSA has hacked pretty much everyone on the planet, and the ramifications of Edward Snowden’s revelations: it’s not just metadata (as if that weren’t bad enough), and there are a lot of governments and companies complicit in this, including ours. You can read more about this from a Canadian security perspective on Geist’s excellent blog; as a law professor and the Canada Research Chair on internet and e-commerce law, he has a pretty good perspective on this. Geist and others think that what has come out from Snowden’s information is just the tip of the iceberg, and that we have many more horror stories to come.

A big challenge in this environment is with cloud computing, specifically any cloud storage that is resident in the US or owned by a US company: many companies are now calling for local (and locally-owned, therefore out of the grasp of the US Patriot Act) storage from their cloud providers. It’s a small consolation that I’ve been asking about locally-hosted — or at least, non-US hosted — BPM cloud providers for a number of years now; finally, the general business public has woken up to the potential surveillance dangers.

Encryption is becoming a much more visible issue, whereas previously it was a purely technical concern: cloud providers (Google, Microsoft and Twitter, to name three) are ramping up encryption of their traffic in what is rapidly becoming a technology arms race against our own governments. Similarly, businesses and individuals are demanding greater transparency from cloud providers with respect to the disclosures that they are making to government intelligence agencies. Many international bodies are calling for control of internet domains and standards to be wrested away from US-based organizations, since these have been shown to include a variety of government intelligence and corporate sock puppets.

In Canada, our conservative government is busy sucking up to the US government, so we have seen a number of privacy-busting attempts at an online surveillance bill by positioning “lawful access” (i.e., the government can access all of your information without explicit permission) as “protecting our children” by tossing in a bit about cyberbullying. Geist discussed some of the dangers of this bill (Bill C-13, just introduced last week) in a post yesterday, specifically that companies have immunity against prosecution for violating our privacy and information security if they hand that information over to the government under the definitions of this bill. 

He finished up with a look at Canada’s anti-spam law that is coming into effect shortly; this includes making communication from businesses opt-in rather than opt-out, and also requiring consent before installing computer programs in the course of a commercial activity.

It was great to see Geist in person: he’s a great speaker, full of passion and knowledge about his subject. As always, he inspires me to help make Canada a better place for our online activities.

Breakfast Seminar On Intelligent Business Processes (Toronto) – December 3

I recently wrote a white paper and gave a webinar on intelligent business processes, sponsored by Software AG (although not about their products), and I’m now giving a breakfast seminar for them on the same topic in Toronto on December 3rd. If you’re Toronto-based, or are going to be there that day, you can see more information on the free seminar here and sign up for it here.

From Brazil To Vegas: BPM Global Trends and Building Business Capability

My frenzy of seven conferences in six weeks (eight, if you count the two different cities in Brazil as different conferences) is drawing to a close, but the past two weeks have been a bit brutal. Last week, I was in São Paulo and Brasília for the BPM Global Trends seminar series, where I presented in both cities along with Jan vom Brocke from University of Liechtenstein. It was arranged by ELO Group with the strong support of ABPMP Brazil, and was most interesting because I was presenting from a Portuguese version of my slides (with an English version visible to me) while United Nations-style simultaneous translators worked their magic from a booth at the back of the room.

I did a longer presentation in São Paulo earlier in the week with a workshop in the afternoon, then split it into two presentations with some added material for the public sector seminar in Brasília:

Many thanks to my hosts, and to those voices in my head: Leonardo and Daniel, the wonderful translators who brought my material alive for the Portuguese audience, and translated the questions and comments into English for me.

Unfortunately, I didn’t get to see a lot of Brazil except for hotels, airports and conference rooms, although I did get a short tour (thanks, Jones!) of the weird and wonderful modernist architecture of Brasília on the day that I flew out.

I arrived home in Toronto on Sunday morning, then 24 hours later was on a flight to Las Vegas for the Building Business Capability conference – my third trip to Vegas in a month. I presented a half-day seminar yesterday on emerging BPM technology, an ever-changing topic that continues to fascinate me:

I finished up today with a breakout session on the interplay of rules, process and content in case management, which combines a number of different themes that I’ve been playing with over the past few years, although this is the first time that I’ve presented it in this form:

I’m off to the evening reception to meet up with my peeps here, then tomorrow I get to take it easy and listen to someone else present for a change. Or maybe sit by the pool and let my brain relax for a day before I fly home to get back to my regular client work, and start to work through that backlog of product briefings that I have piled up in my drafts folder.

That’s the last of my conference travel for the year, but not the last of my conferences: I’ll be attending at least one day of CASCON next week for a workshop on Real Time Patient Flow Management using Business Process Management, Location Tags, and Complex Events Processing and to hear some of the research papers, then the Technicity Focus on Cyber Security event on November 26th. I’m also speaking at a Toronto breakfast seminar on intelligent business processes on December 3rd for Software AG.

Whew!

Intelligent Business Processes Webinar Q&A

Earlier this week, I gave a webinar on intelligent business processes, sponsored by Software AG; the slides are embedded following, and you can get a related white paper that I wrote here.

There were a number of questions at the end that we didn’t have time to answer, and I promised to answer them here, so here goes. I have made wording clarifications and grammatical corrections where appropriate.

First of all, here are the questions that we did have time for, with a brief response to each – listen to the replay of the webinar to catch my full answers.

  • How do you profile people for collaboration so that you know when to connect them? [This was in response to me talking about automatically matching up people for collaboration as part of intelligent processes – some cool stuff going on here with mining information from enterprise social graphs as well as social scoring]
  • How complex is it to orchestrate a BPMS with in-house systems? [Depends on the interfaces available on the in-house systems, e.g., web services interfaces or other APIs]
  • Are intelligent business processes less dynamic than dynamic business processes? [No; although many intelligent processes rely on an a priori process model, there’s a lot of intelligence that can be applied via rules rather than process, so that the process is dynamic]
  • How to quantify the visibility to the management? [I wasn’t completely sure of the intention of this one, but discussed the different granularities of visibility to different personas]
  • Where does real-time streaming fit within the Predictive Analytics model? [I see real-time streaming as how we get events from systems, devices or whatever as input to the analytics that, in turn, feed back to the intelligent process]

And here are the ones that we didn’t get to, with more complete responses. Note that I was not reading the questions as I was presenting (I was, after all, busy presenting), so some of them may be referring to a specific point in the webinar and may not make sense out of context. If you wrote the question, feel free to elaborate in the comments below. If something was purely a comment or completely off topic, I have probably removed it from this list, but ping me if you require a follow-up.

There were a number of questions about dynamic processes and case management:

We are treating exceptions more as a normal business activity pattern, called dynamic business process, to reflect the future business trend [not actually a question, but may have been taken out of context]

How does this work with case management?

I talked about dynamic processes in response to another question; although I primarily described intelligent processes through the concepts of modeling a process, then measuring and providing predictions/recommendations relative to that model, a predefined process model is not necessarily required for intelligent processes. Rules form a strong part of the intelligence in these processes, and even if you don’t have a predefined process, you can consider measuring a process relative to the accomplishment of goals that are expressed as rules rather than as a detailed flow model. As long as you have some idea of your goals – whether those are expressed as completing a specific process, executing specific rules or other criteria – and can measure against those goals, then you can start to build intelligence into the processes.
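
As a toy illustration of what “measuring against goals rather than a flow model” might look like (my own sketch, not from the webinar or any particular product): each goal is just a rule evaluated against the current case state, and progress is simply the fraction of goals satisfied.

```java
import java.util.List;
import java.util.Map;
import java.util.function.Predicate;

// A goal is a named rule over the current case state.
class Goal {
  final String name;
  final Predicate<Map<String, Object>> rule;

  Goal(String name, Predicate<Map<String, Object>> rule) {
    this.name = name;
    this.rule = rule;
  }
}

class GoalTracker {
  private final List<Goal> goals;

  GoalTracker(List<Goal> goals) {
    this.goals = goals;
  }

  // Fraction of goals currently satisfied, with no flow model in sight.
  double completion(Map<String, Object> caseState) {
    if (goals.isEmpty()) {
      return 1.0;
    }
    long satisfied = goals.stream().filter(g -> g.rule.test(caseState)).count();
    return (double) satisfied / goals.size();
  }
}
```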

Is process visibility about making process visible (documented and communicated) or visibility about operational activities through BPM adoption?

In my presentation, I was mainly addressing visibility of processes as they execute (operational activities), but not necessarily through BPM adoption. The activities may be occurring in any system that can be measured; hence my point about the importance of having instrumentation on systems and their activities in order to have them participate in an intelligent process. For example, your ERP system may generate events that can be consumed by the analytics that monitor your end-to-end process. The process into which we are attempting to gain visibility is that end-to-end process, which may include many different systems (one or more of which may be BPM systems, but that’s not required) as well as manual activities.
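
To make that concrete, here’s a toy sketch (my own, not tied to any particular product) of the correlation that underpins end-to-end visibility: every instrumented system emits activity events carrying a shared business key, and a monitor stitches them into a single trace that the analytics can work from.

```java
import java.time.Duration;
import java.time.Instant;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// One activity-level event emitted by any instrumented system (ERP, BPMS,
// a UI that logs manual steps); the business key ties them together.
class ActivityEvent {
  final String businessKey;
  final String sourceSystem;
  final String activity;
  final Instant at;

  ActivityEvent(String businessKey, String sourceSystem, String activity, Instant at) {
    this.businessKey = businessKey;
    this.sourceSystem = sourceSystem;
    this.activity = activity;
    this.at = at;
  }
}

class EndToEndMonitor {
  private final Map<String, List<ActivityEvent>> instances = new HashMap<>();

  // Correlate incoming events into one end-to-end trace per business key.
  void onEvent(ActivityEvent event) {
    instances.computeIfAbsent(event.businessKey, k -> new ArrayList<>()).add(event);
  }

  // Elapsed time so far for one end-to-end instance, regardless of which systems were involved.
  Duration elapsed(String businessKey) {
    List<ActivityEvent> trace = instances.getOrDefault(businessKey, new ArrayList<>());
    if (trace.isEmpty()) {
      return Duration.ZERO;
    }
    Instant first = trace.get(0).at;
    Instant last = trace.get(0).at;
    for (ActivityEvent e : trace) {
      if (e.at.isBefore(first)) first = e.at;
      if (e.at.isAfter(last)) last = e.at;
    }
    return Duration.between(first, last);
  }
}
```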

Do we have real-world data to show how accurate the predictions from intelligent processes are?

There’s not a simple (or single) answer to this. In straightforward scenarios, predictions can be very accurate. For example, I have seen predictions that make recommendations about staff reallocation in order to handle the current workload within a certain time period; however, predictions such as that often don’t include “wild card” factors such as “we’re experiencing a hurricane right now”. The accuracy of the predictions is going to depend greatly on the complexity of the models used as well as the amount of historical information that can be used for analysis.

What is the best approach when dealing with a cultural shift?

I did the keynote last week at the APQC process conference on changing incentives for knowledge workers, which covers a variety of issues around dealing with cultural shifts. Check it out.

In terms of technology and methodology, how do you compare intelligent processes with the capabilities that process modeling and simulation solutions (e.g., ARIS business process simulator) provide?

Process modeling and simulation solutions provide part of the picture – as I discussed, modeling is an important first step to provide a baseline for predictions, and simulations are often used for temporal predictions – but they are primarily process analysis tools and techniques. Intelligent processes are operational, running processes.

What is the role of intelligent agents in intelligent processes?

Considering the standard definition of “intelligent agent” from artificial intelligence, I think that it’s fair to say that intelligent processes are (or at least border on being) intelligent agents. If you implement intelligent processes fully, they are goal-seeking and take autonomous actions in order to achieve those goals.

Can you please talk about the learning curve for intelligent business processes?

I assume that this is referring to the learning curve of the process itself – the “intelligent agent” – and not the people involved in the process. Similar to my response above regarding the accuracy of predictions, this depends on the complexity of the process and its goals, and the amount of historical data that you have available to analyze as part of the predictions. As with any automated decisioning system, it may be good practice to have it run in parallel with human decision-making for a while in order to ensure that the automated decisions are appropriate, and fine-tune the goals and goal-seeking behavior if not.
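
A minimal sketch of that parallel-run idea, assuming you can capture both the automated recommendation and the human decision for each case; the class and method names are my own.

```java
import java.util.concurrent.atomic.AtomicLong;

// Tracks how often the automated recommendation matches what a person actually
// decided; the automation only takes over once the agreement rate is acceptable.
class ShadowModeEvaluator {
  private final AtomicLong total = new AtomicLong();
  private final AtomicLong agreed = new AtomicLong();

  void record(String automatedDecision, String humanDecision) {
    total.incrementAndGet();
    if (automatedDecision.equals(humanDecision)) {
      agreed.incrementAndGet();
    }
  }

  double agreementRate() {
    long t = total.get();
    return t == 0 ? 0.0 : (double) agreed.get() / t;
  }
}
```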

Any popular BPM Tools from the industry and also any best practices?

Are ERP solutions providers and CRMs doing anything about it?

I grouped these together since they’re both dealing with products that can contribute to intelligent processes. It’s fair to say that any BPM system and most ERP and CRM systems could participate in intelligent processes, but they are likely not the entire solution. Intelligent processes combine processes and rules (including processes and rules from ERP and CRM systems), events, analytics and (optionally) goal-seeking algorithms. Software AG, the sponsor of the webinar and white paper, certainly has products that can be combined to create intelligent processes, but so do most of the “stack” software vendors that have BPM offerings, including IBM, TIBCO and SAP. It’s important to keep in mind that an intelligent process is almost never a single system: it’s an end-to-end process that may combine a variety of systems to achieve a specific business goal. You’re going to have BPM systems in there, but also decision management, complex event processing, analytics and integration with other enterprise systems. That is not to say that the smaller, non-stack BPM vendors can’t piece together intelligent processes, but the stack vendors have a bit of an edge, even if their internal product integration is lightweight.
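
To illustrate the shape of that combination (a deliberately over-simplified sketch of my own, not any vendor’s architecture): events from the participating systems feed a piece of analytics, a rule decides whether intervention is needed, and an action is pushed back into the process.

```java
import java.util.List;
import java.util.function.Consumer;
import java.util.function.Predicate;

// Conceptual loop: events (from BPM, ERP, CRM, CEP...) flow through analytics,
// a rule decides whether the process needs a nudge, and an action is taken.
class IntelligentProcessLoop {
  private final Predicate<Double> rule;   // e.g., "predicted cycle time exceeds SLA"
  private final Consumer<String> action;  // e.g., reassign work, raise an alert

  IntelligentProcessLoop(Predicate<Double> rule, Consumer<String> action) {
    this.rule = rule;
    this.action = action;
  }

  // "Analytics" here is just an average of recent measurements; a real
  // implementation would plug in predictive models and CEP output instead.
  void evaluate(String businessKey, List<Double> recentMeasurements) {
    double predicted = recentMeasurements.stream()
        .mapToDouble(Double::doubleValue).average().orElse(0.0);
    if (rule.test(predicted)) {
      action.accept(businessKey);
    }
  }
}
```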

How to quantify the intelligent business process benefits for getting funding?

I addressed some of the benefits on slide 11, as well as in the white paper. Some of the benefits are very familiar if you’ve done any sort of process improvement project: management visibility and workforce control, improved efficiency by providing information context for knowledge workers (who may be spending 10-15% of their day looking for information today), and standardized decisioning. However, the big bang from intelligent processes comes in the ability to predict the future, and avoid problems before they occur. Depending on your industry, this could mean higher customer satisfaction ratings, reduced risk/cost of compliance, or a competitive edge based on the ability for processes to dynamically adapt to changing conditions.

What services, products do you offer for intelligent business processes?

I don’t offer any products (although Software AG, the webinar sponsor, does). You can get a better idea of my services on my website or contact me directly if you think that I can add value to your process projects.

How are Enterprise Intelligent Processes related to Big Data?

If your intelligent process is consuming external events (e.g., Twitter messages, weather data), or events from devices, or anything else that generates a lot of events, then you’re probably having to deal with the intersection between intelligent processes and big data. Essentially, the inputs to the analytics that provide the intelligence in the process may be considered big data, and have some specific data cleansing and aggregation required on the way in. You don’t necessarily have big data with intelligent processes, but one or more of your inputs might be big data. 
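
As a toy example of that inbound cleansing and aggregation step (my own sketch, with an arbitrary filtering rule): a noisy external feed is filtered and rolled up into a coarser signal before the process analytics ever see it.

```java
import java.time.Instant;
import java.time.temporal.ChronoUnit;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// One raw reading from an external feed (weather station, sensor, social signal).
class RawReading {
  final Instant at;
  final double value;

  RawReading(Instant at, double value) {
    this.at = at;
    this.value = value;
  }
}

class FeedAggregator {
  // Drop obviously bad readings, then roll up to a per-minute average that the
  // process analytics can consume without caring about the raw event volume.
  Map<Instant, Double> perMinuteAverage(List<RawReading> raw) {
    return raw.stream()
        .filter(r -> !Double.isNaN(r.value) && Math.abs(r.value) < 1000) // crude cleansing rule
        .collect(Collectors.groupingBy(
            r -> r.at.truncatedTo(ChronoUnit.MINUTES),
            Collectors.averagingDouble(r -> r.value)));
  }
}
```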

And my personal favorite question from the webinar:

Humans have difficulty acting in an intelligent manner; isn’t it overreaching to claim processes can be “intelligent”?

I realize that you’re cracking a joke here (it did make me smile), but intelligence is just the ability to acquire and apply knowledge and skills, which are well within the capabilities of systems that combine process, rules, events and analytics. We’re not talking HAL 9000 here.

To the guy who didn’t ask a question, but just said “this is a GREAT GREAT webinar ” – thanks, dude. 🙂