A good article by Bruce Silver on the convergence of BPM and SOA, including some speculation about what Oracle might be about to spring on the BPMS market.
Author: sandy
SOA myths
From Intelligent Enterprise, the top 5 myths of SOA:
- If you’re using Web services, you’ve achieved SOA.
- You can buy SOA out of the box.
- You can simply wrap legacy systems with services.
- Once the top executives are sold on SOA, your troubles are over.
- SOA is easy.
Check out the article for the details (not particularly meaty) on each.
I consider the last two myths to be self-evident for any type of non-trivial technology, so they're a bit frivolous to include in this list. The first three, however, I run across on a regular basis, and they're good to keep in mind when evaluating what you're doing with SOA, and what vendors are trying to sell you. My response when I encounter these three myths:
- Web services is a set of specific technology standards for building interoperable components. SOA is an architectural and design philosophy that is typically implemented using web services standards, but doesn’t have to be.
- You can’t buy philosophy out of the box: “I introspect, therefore I integrate” provides design guidance, but it’s not a ready-to-go product (with apologies to René Descartes).
- I’m not so sure that this is a myth, since you could actually wrap legacy systems with service interfaces and they would become valid parts of your SOA environment. The point in the IE article is that maybe this is a good time to reevaluate the systems that you have and replace some of that old legacy stuff, but that’s not an SOA/not-SOA decision per se.
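To make that last point concrete, here's a minimal sketch of the legacy-wrapping idea. Everything here is invented for illustration (the function names, the account data, the JSON request shape): a thin service layer exposes an old routine through a clean, self-describing interface that the rest of an SOA environment can call, without rewriting the legacy code itself.

```python
import json

# Hypothetical legacy routine: positional arguments, cryptic op codes,
# no notion of services -- typical of code you'd rather not rewrite.
def legacy_lookup(acct_no, op_code):
    records = {"10042": {"name": "ACME Corp", "balance": 1250.00}}
    if op_code != "Q":
        raise ValueError("unsupported op code")
    return records.get(acct_no)

# Thin service wrapper: a stable interface with a self-describing
# request/response format, leaving the legacy code untouched.
def account_service(request_json):
    request = json.loads(request_json)
    record = legacy_lookup(request["accountId"], "Q")
    if record is None:
        return json.dumps({"status": "NOT_FOUND"})
    return json.dumps({"status": "OK", "account": record})

print(account_service('{"accountId": "10042"}'))
```

Whether wrapping like this is the right call, versus replacing the legacy system, is exactly the reevaluation the IE article suggests, but the wrapper itself is a perfectly valid SOA citizen.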
RedMonk Radio podcasts
I’ve been catching up on some podcasts lately — I subscribe to more than 20 via iTunes but only have a chance to listen to a couple of my favourites each day — and finally had a chance to listen to some of the recent RedMonk Radio podcasts from RedMonk, a small analyst firm. I like RedMonk Radio because it’s a conversation between the participants, rather than a formal interview or a standalone speaker, so it’s fun to listen to as well as informative (I especially like James’ imitation of American accents 🙂 ).
In episode 6, James Governor and Michael Coté talk about SOA testing, and the specific problems of testing the heterogeneous SOA environment that most organizations have that includes legacy systems and newer services. There’s a great bit where James states “EAI and SOA are not the same thing, but on the other hand, some of the concepts of EAI, some of the approaches and even some of the technologies are relevant.” Coté asks “What do you think the differences are between the two?” James responds “$40,000 per CPU” (referring to the former market for expensive adapters and connectors that have been replaced by standard web services interfaces), which completely cracked me up. Also some interesting thoughts on testing when some or all of the software/services are outsourced.
In episode 8, Scott Mark interviews James and Coté about life as an industry analyst, which is really about the small-firm analyst experience: I’m sure that things are much different for Gartner analysts, for example. The outside view of analysts is much more glamorous than the reality, but these guys are obviously in it because they have a passion for it, not because they’re making huge buckets of money at it. The funny thing is that the typical day that James described for himself is much like what I do: customer/prospect followups, industry research, reviewing blogs, vendor briefings, writing vendor and technology reviews, and consulting. Maybe I’m an analyst in disguise.
A Short History of BPM, Part 4
Continued from Part 3.
Part 4: Creating the Light, Stars, Moon and Sun
(Okay, the Genesis analogy is getting a bit old, but this really is a tale of creation.)
Organizations that had implemented workflow quickly realized that once the process became electronic instead of paper-driven, it wasn’t possible to monitor the process just by walking around the shop floor. Workflow monitoring and reporting were born from the need to understand where the process was — and wasn’t — working, and this was often a highly customized capability. After years of customers building their own custom reporting and monitoring tools, or using third-party analytics, the workflow products started to extend their native capabilities into this area. The early real-time monitoring grew to become a part of what we now know as business activity monitoring, or BAM. Process management and governance, both through this real-time monitoring and historical reporting and analysis, became a critical part of the process to customers, especially those implementing quality programs such as Six Sigma. The need to optimize processes pushed beyond analytics to process simulation and optimization tools.
EAI products grew in a different direction altogether. Most of the products provided some degree of reporting and analytics, and didn’t need the process governance associated with human-facing workflow. Instead, EAI vendors started to look outside the organization, and extended EAI to business-to-business integration, or B2Bi. This process collaboration allowed their customers to implement processes — still primarily system-to-system — that loosely coupled their business processes with those of their customers and other trading partners, not just flows between internal systems.
This would become one of the most significant advances for the type of BPM that we have today, allowing organizations to include both human and system participants in processes that span multiple organizations. The ROI on B2Bi can be enormous, and allows for the creation of a sort of configurable “business process firewall” as a standard process interface between an organization and its trading partners so that either can change their internal processes and data without disturbing the other.
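One way to picture that "business process firewall" in code (with message formats invented purely for illustration): each organization maps between its internal representation and an agreed external one at its own boundary, so either side can restructure its internal processes and data without the other noticing.

```python
# The agreed external (partner-facing) format is the only contract
# the two organizations share; these field names are hypothetical.

def to_external(internal_order):
    # Our side can rename or restructure internal fields freely,
    # as long as this mapping keeps producing the agreed format.
    return {"orderId": internal_order["po_num"], "items": internal_order["lines"]}

def from_external(external_order):
    # Inbound messages are translated back at the boundary; internal
    # systems never see the partner's representation directly.
    return {"po_num": external_order["orderId"], "lines": external_order["items"]}

order = {"po_num": "PO-778", "lines": [{"sku": "X", "qty": 2}]}
# A round trip across the boundary preserves the order exactly.
assert from_external(to_external(order)) == order
```

The "firewall" value is in that one-sided configurability: changing `po_num` to something else internally means changing one mapping function, not renegotiating the interface with every trading partner.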
Next: convergence and confusion.
A Short History of BPM, Part 3
Continued from Part 2.
Part 3: Creating the Firmament
In the late 1990’s, as workflow vendors saw the benefits of EAI and understood how it could replace the more heavily customized integration that was typical for workflow solutions, they began adding EAI capabilities to their workflow products. Although there were exceptions, this was typically done by the workflow vendor striking an OEM agreement with an EAI vendor to allow the EAI product to be embedded seamlessly into the workflow product.
Not surprisingly, EAI vendors saw the benefits of having some human-facing steps in their system-to-system processes, and began adding basic human-facing capabilities. These functions, usually built by the EAI vendor directly rather than through a partnership with a workflow vendor, were fairly rudimentary, since they were intended only for the repair of processes that hit an exception condition that could not be handled automatically. You could say that these processes were “human interrupted” rather than “human facing”.
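The "human interrupted" pattern can be sketched in a few lines (all names here are illustrative, not from any particular EAI product): automated steps run straight through, and only when one fails does a work item land on a queue for a person to repair and resubmit.

```python
from collections import deque

repair_queue = deque()  # work items awaiting human attention

def post_to_ledger(message):
    # Hypothetical automated step: reject records missing an amount.
    if "amount" not in message:
        raise ValueError("missing amount")
    return {"posted": True, **message}

def process(message):
    """Run the system-to-system step; divert failures to a human queue."""
    try:
        return post_to_ledger(message)
    except ValueError as err:
        # Human interrupted: the process stops here until someone
        # fixes the data and resubmits it.
        repair_queue.append({"message": message, "error": str(err)})
        return None

process({"id": 1, "amount": 99.50})  # flows straight through
process({"id": 2})                   # lands on the repair queue
```

Note that the human only ever sees the broken cases; there's no modelled human step in the happy path, which is exactly what distinguished these systems from true human-facing workflow.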
Although we now had systems that (theoretically) spanned the full range of the integration space, these were really just workflow systems with a bit of EAI, or EAI systems with a bit of workflow, and the vendors still sold to their strong suit.
Next: sideways diversification.
Gartner podcast on EA leaders
A good 9-minute podcast from Gartner on the role of enterprise architecture leaders, featuring Robert Handler, one of their research VPs. He spends some time discussing how an EA role is more than just IT:
“It’s become a truly leadership role, one that involves strong communication skills and fundamentally bilingual capabilities, being able to speak both business and IT.”
Handler mentions that the EA role often includes aspects of process optimization now. He also says that although EA’s have moved up from reporting a couple of levels below the CIO to reporting directly to the CIO, that they should really be reporting to the CEO — something that I’ve posted about previously. Lots of good stuff on who the enterprise architect should be forming strong relationships with, in order to help foster architectural alignment throughout the organization.
Update: there’s also an EA-related podcast from Macehiter Ward-Dutton, a UK-based consultancy. The first part is unrelated industry news analysis, but there’s some EA bits based on a discussion that Neil Macehiter had (offline) with James McGovern starting at the 23-minute mark. Apparently, they’ll be having McGovern on the podcast later this month.
A Short History of BPM, Part 2
Continued from Part 1.
Part 2: Creating Night and Day
While workflow was getting its start in the 1980’s, enterprise application integration, or EAI, emerged independently for system-to-system integration.
A key player in the early days (and still today) was IBM with their MQ Series messaging, which became a de facto standard for system-to-system communication in many large organizations. Many other vendors entered the market, and a variety of technical architectures for message brokers emerged, but all had the same basic goal: to automate the near-real-time exchange of data between systems, typically mainframe-based transaction processing systems or server-based relational databases.
Even the simple functionality provided by early versions of EAI could reap huge benefits in two areas:
- It was no longer necessary to re-enter data in order to transfer it between systems, reducing data entry efforts and error rates.
- Information could be exchanged between systems in near-real-time, rather than relying on overnight batch updates.
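A toy version of the broker pattern behind those benefits (names invented for illustration; a real broker like MQ Series is persistent, transactional and distributed): the sending and receiving systems know only about the queue, not about each other, which is what makes the exchange near-real-time rather than an overnight batch file transfer.

```python
import queue

broker = queue.Queue()  # stands in for a message broker's queue

def system_a_send(order):
    # System A publishes a message and carries on; no re-keying,
    # no batch extract waiting for the nightly run.
    broker.put({"type": "order", "body": order})

def system_b_receive():
    # System B consumes messages as they arrive.
    message = broker.get_nowait()
    return message["body"]

system_a_send({"sku": "X-100", "qty": 3})
print(system_b_receive())  # the order arrives without manual re-entry
```

The two benefit bullets fall directly out of this structure: no re-entry because the data moves as a message, and near-real-time delivery because nothing sits waiting for a batch window.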
These early EAI systems had no human-facing functionality, since they assumed that the applications being integrated provided all required user interface. In other words, if an error occurred sending data from system A to system B, the data or other problem would be fixed in one of those two systems, not in the EAI system.
Workflow and EAI lived very separate lives during that time, although they are now seen as two ends of the same integration spectrum. Not only were they created by two different camps of vendors (with a few not-very-successful exceptions), and based on different technologies and standards, but they were sold to two different parts of the customer organization: workflow was typically sold to business units as a departmental solution, whereas EAI was sold to IT as an infrastructure component.
Next: workflow meets EAI.
Another BPM blogger
Keith Swenson, chief architect at Fujitsu Software (and therefore of their Interstage BPM product), now has a blog. He’s involved in the WfMC and BPM standards, which I heard him speak about at the Gartner BPM Summit earlier this year.
BPM and BI
Lots of interesting news recently on BPM and BI. Last month, Lombardi and Cognos signed an OEM agreement to embed Lombardi’s Teamworks into Cognos’ analytics applications. Bruce Silver had a good post about the implications of this agreement, and the blurring lines between BPM and analytics. Then last week, IBI announced that they’ve embedded their WebFocus business intelligence into iWay’s Process Manager (iWay is a subsidiary of IBI), further indicating this blurring of technologies.
A Short History of BPM, Part 1
Since I am fast approaching the 19th anniversary of starting my first company, which provided integration services for imaging, document management, workflow, e-commerce and eventually BPM, I have a bit of an historical perspective on the field and often end up explaining to customers how BPM got to where it is today: sometimes as part of my Making BPM Mean Business course, and sometimes just as a standalone presentation. Although a lot of readers of this blog are BPM professionals of some sort, I’m going to reproduce this history lesson here in several parts. Click on the category BPMhistory in the sidebar to see the complete set to date.
I welcome comments from any of you other “old-timers” out there, and will incorporate relevant corrections and comments into the body of the post.
Part 1: In The Beginning
In the beginning there was workflow. To be more precise, there was person-to-person routing of scanned documents through a pre-determined process map.
As far as I know, FileNet was the first to use the term “workflow” in this context, back in the early 1980’s when they started up, although they were quickly joined in the marketplace by IBM and other product vendors. Most of these early workflow systems were document-focused, that is, the only purpose of the workflow was to move a scanned image of a paper document from one person to another so that they could perform some action on a different system, such as transaction data entry. This was a logical step after organizations started scanning their documents in order to preserve and share them: why not scan them before working on them, then pass them around electronically to try and improve efficiencies? Unfortunately, any direct integration between these other systems and the workflow processes was custom-built, expensive, and not very flexible.
From these early beginnings, workflow systems evolved fairly slowly during the 90’s into products that were much more functional and flexible, primarily in the area of better integration capabilities, more diverse server and client platforms, and some basic process modelling and process monitoring tools. In many cases, however, the workflow products were still very document-centric: the document scanning/management business was such a cash cow for many of these vendors that they didn’t show a lot of vision when it came to finding uses for workflow concepts outside of routing documents around an office. That set the stage for two things to happen:
- Third-party add-ons to popular workflow products would proliferate, to do all the things that the workflow vendors didn’t deem important.
- More nimble “pure” workflow startups (the early BPM products) would ignore the document-centric side and build highly functional workflow products that couldn’t yet handle an enterprise workload, but pushed the envelope in terms of functionality.
Before we get to that, however, we need to talk a bit about EAI.