Recording a “Hello World” Podcast with @PBearne at #pcto2012

I’ve been blogging for a long time, and I participate in webinars with some of my vendor clients, but I don’t do any podcasting (yet). Here at PodCamp Toronto 2012, I had the opportunity to sit in on a short session with Paul Bearne on doing a simple podcast: record, edit and post to WordPress.

In addition to a headset and microphone – you can start with a minimal $30 headset/mic combo, such as the USB Skype headset with decent transcoding that he showed us, or move up to a more expensive microphone for better quality – he also recommends at least a basic two-channel audio mixer.

He walked us through what we need from a software standpoint:

  • Audacity – Free open source audio recording software. We recorded a short podcast using Audacity from a script that Bearne provided, checked the playback for distortion and other quality issues, trimmed out the unwanted portions, and adjusted the gain. I’ve used Audacity a bit before to edit audio, so this wasn’t completely unfamiliar, but I saw a few new tricks. Unfortunately, he wasn’t completely familiar with some features of the tool – it appears that he actually uses some other tool for this – so there was a bit of fumbling around when it came to inserting a pre-recorded piece of intro music and converting the mono voice recording to stereo. There’s also the option to add metadata to the recording, such as title and artist.
  • Levelator – After exporting the Audacity project as a WAV or AIFF file, we could load it into Levelator to fix the recording levels. It’s not clear whether Levelator has any settings or whether it just equalizes the levels, but the result was a total mess the first time, not at all as expected. He ran it again (using an AIFF export instead of WAV) and the result was much better, although it wasn’t clear what caused the difference. After fixing the levels with Levelator and importing back into Audacity, the final podcast was exported in MP3 format.
  • WordPress – As he pointed out, the difference between a podcast and a regular audio recording is the ability to subscribe to it, and using WordPress for publishing podcasts allows anyone to subscribe to them using an RSS reader or podcatcher. You may not host the files on your WordPress site since you may not have the storage or bandwidth there, but we used WordPress in this case to set up the site that provides the links and feed to the podcasts.
  • Filezilla FTP – For transferring the resulting MP3 files to the destination; a scripted equivalent is sketched just after this list.
  • PowerPress – A WordPress plugin from Blubrry that allows you to do things such as creating the link to iTunes so that the podcast appears in the podcast directory there, and publishing the podcast directly into a proper podcast post with its own unique media RSS feed, allowing you to mix podcasts and regular posts on the same WordPress site.
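Filezilla is a GUI client, but the upload step can just as easily be scripted. Here’s a minimal sketch using Python’s standard ftplib; the host name, credentials and paths are placeholders, not anything from the session.

```python
# Minimal sketch: upload the exported MP3 to the media host over FTP.
# Host, credentials and paths are hypothetical placeholders.
from ftplib import FTP

def upload_episode(local_path: str, remote_name: str) -> None:
    with FTP("media.example.com") as ftp:      # hypothetical media host
        ftp.login(user="podcaster", passwd="secret")
        ftp.cwd("/podcasts")                   # destination directory
        with open(local_path, "rb") as f:
            # STOR transfers the file in binary mode, preserving the MP3 bytes
            ftp.storbinary(f"STOR {remote_name}", f)

upload_episode("hello-world.mp3", "hello-world.mp3")
```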

He also discussed the format of the content, such as the inclusion of intro music, title and sponsorship information before the actual content begins.

There was definitely value in this session, although if I hadn’t already been familiar with a lot of these concepts, it would have been a lot less useful. I’m still not sure that I’m going to be podcasting any time soon, but it’s interesting to know how to make it work.

Upcoming Webinars with Progress Software

Blogging around here has been sporadic, to say the least. I have several half-finished posts about product reviews and some good BPM books that I’ve been reading, but I have that “problem” that independent consultants sometimes have: I’m too busy doing billable work – both with vendors and on some interesting end-customer projects – to put up much of a public face.

Today, I’ll be presenting the second in a series of three webinars for Progress Software, focused on how BPM fits with more traditional application development environments and existing custom applications. Progress continues to integrate the Savvion and Corticon acquisitions into their product set, and wanted to put forward a webinar series that would speak to their existing OpenEdge customers about how BPM can accelerate their application development without having to abandon their existing custom applications. I really enjoyed the first of the series, because Matt Cicciari (Progress product marketing manager) and I had a very conversational hour – except for the part where he lost his voice – and this time we’ll be joined by Ken Wilmer, their VP of technology, to dig into some of their technology a bit more. My portion will focus on generic aspects of combining BPM and traditional application development, not specific to the Progress product suite, so this may be of use even if you’re not using Progress products but want to understand how these seemingly disparate methodologies and technologies come together.

We’re doing today’s webinar twice: once at 11am Eastern to cover Europe and North America, then a repeat at 7pm ET (that’s 11AM tomorrow in Sydney) for the Asia Pacific region or those of you who just didn’t get enough in the first session. It will be live both times, so I will have the chance to think about what I said the first time around, and completely change it. 😉

You can sign up for today’s session here, plus the next session on February 29th that will include more about business rules in this hybrid environment.

Q&A From Making Social Mean Business

We had a few unanswered questions left from our webinar on Tuesday, so I’ve included the ones that were not related to Pega’s products below, with answers from both Emily Burns and myself:

There’s a lot of discussion about the readiness of an org before social features are introduced to its employees. What would be a way to assess the maturity/readiness of an org for such features with regard to BPM?

Emily: Boy, I guess I am on the more liberal side of that discussion and would err on the side of providing access to these features and seeing how they evolve—collective intelligence is pretty impressive, and can take things in many positive directions that a designer just wouldn’t think of. It’s hard for me to see the downside to fostering better communication and collaboration between people who are already working on the same cases, but may not currently be aware of who the other people are.

Sandy: There is a lot of work being done on social business readiness by organizations such as the Social Business Council (http://council.dachisgroup.com/) that can serve as a reference for how that will work with social features in BPM. In assessing readiness, you can look at the use of other social tools within the organization, such as wikis for documentation, or instant messaging for chat, to get an idea of whether the users have been provided with tools like these in the past. However, the fact that they haven’t used these in the workplace before is no reason to avoid social BPM functions, since users may already be using similar capabilities in consumer applications; and as Emily points out, the best thing is to provide them with the tools and see what emerges.

Emily: Features that impact the application more, such as design-by-doing, are an area that does need careful consideration. In the case of design-by-doing, more often than not, it is limited to certain roles; and even then, while the default is to allow the new type of case to go instantly into production, in reality most of our clients use it more as a way of gathering suggestions for application improvements. As it becomes more widely used, and best practices develop around governance, I expect this type of thing to be used more aggressively to foster the kind of real-time adaptation for which it was conceived.

Sandy: Although many organizations are worried about users “going wild” with collaborative and social tools, the opposite is often true: it is more difficult to get users to participate unless they can see a clear personal benefit, such as being able to get their job done better or more efficiently. This may require creating some rewards specifically geared at users who are taking advantage of the social tools, in order to help motivate the process.

While the knowledge that we can glean from social networking sites is indeed powerful, and allows us to serve up tailored offers, it can also irritate some customers, or seem “creepy”, like a bit of an invasion of privacy.

Emily: I totally agree, and am just such a customer. In fact, I won’t go to a company’s Facebook page unless I am logged out of Facebook, because I don’t want them to know anything about me, nor do I want my friends to know about my interactions with different companies. To get around this sort of stonewalling, there are a few things that organizations can do:

  1. Make the content and actions that can be performed from your Facebook page sufficiently compelling that you overcome this resistance.
  2. DON’T BE SNEAKY! Do not default settings to “post to my wall” so that all of a client’s friends see that she just applied for a new credit card. Be frank and up front about any information that might be broadcast, and about how you are using the information that they have so graciously allowed you to access by virtue of logging in via Facebook. If you want to give people the option of posting something, make sure they make that choice explicitly. And make it transparent and easy to change settings in the future. This will help you gain trust and increase the uptake of these low-cost, highly viral channels.

Sandy: I completely agree – transparency is the key here for organizations starting on a social media path. Anything less than complete transparency about what you’re doing with the consumer’s information – including their actions on your site – will be exposed in the full glare of public scrutiny on the web when people discover it. Accept, however, that there is a wide range of social behavior for customers: some want to be seen to be associated with your product or brand, and will “Like” your Facebook page or check-in on Foursquare at your location, whereas others will not want that information to be publicized in any way.

Do you think there is enough trust built up yet for customers to interact with companies via social channels?

Emily: See my response above. I think that in many cases, organizations have started out on the wrong foot, taking advantage of how easily available the information is to really milk it for all it’s worth. Many of the social networking sites initially had only coarse-grained privacy settings, so this wasn’t entirely the fault of the organizations, either. Because of this, and in light of continually improving granularity and control over privacy settings, I think now is the time to try to re-establish trust, and to establish what it means to be a good “social” corporate citizen.

Sandy: Social media is becoming a powerful channel for customer interaction, particularly in situations where the company is monitoring Twitter and Facebook updates to track any problems that customers are experiencing. From my own personal experience (and in part because I have a large Twitter following and use my real name on Twitter), I have had near-immediate responses to problems with hotels, car rentals and train travel that I tweeted about. In some cases, the social media channel wasn’t necessarily well integrated with the rest of their customer service channels, but when it is, it’s a very satisfying customer experience for someone like me with a strong social media focus. There are initiatives to create the type of trusted online behavior that we would all like to see, such as the Respect Trust Framework; it’s early days for these, but we’ll see more organizations adopt them as customers insist on their online rights.

I’ve also included my slides below, although not Emily’s deck. I’ll update this post with the link to the webinar replay when it is available.

Making Social BPM Mean Business

When I owned a boutique consulting firm in the 1990s, our catchphrase was “Making Technology Mean Business”, and when we were coming up with a title for the webinar that I’m doing with Pegasystems next week, an updated version of that phrase just seemed to fit. We’ll be discussing the social aspects of business processes, particularly in the context of case management. I’ll be expanding on a discussion point from my Changing Nature of Work keynote at BPM 2011 to discuss the social dimension and how it correlates with structure (i.e., a priori modeling), triggered in part by some of the discussion that arose from that presentation. As with the spectrum of structure, I believe that there’s a spectrum of socialness in business processes: some processes are just inherently more social than others (or can benefit more from social features).

Interested? The webinar is on Tuesday at 11am Eastern, and you can register here.

Emerging Trends in BPM – Five Years Later

I just found a short article that I wrote for Savvion (now part of Progress Software) dated November 21, 2006, and decided to post it with some updated commentary on the 5th anniversary of the original paper. Enjoy!

Emerging trends in BPM
What happened in 2006, and what’s ahead in 2007

The BPM market continues to evolve, and although 2006 has seen some major events, there will be even more in 2007. This column takes a high-level view of four areas of ongoing significant change in BPM: the interrelationship between SOA and BPM; BPM standards; the spread of process modeling tools; and the impact of Web 2.0 on BPM.

SOA and BPM, together at last. A year ago, many CIOs couldn’t even spell SOA, much less understand what it could do for them. Now, Service-Oriented Architecture and BPM are seen as two ends of the spectrum of integration technologies that many organizations are using as an essential backbone for business agility.

SOA is the architectural philosophy of exposing functionality from a variety of systems as reusable services with standardized interfaces; these, in turn, can be orchestrated into higher-level services, or consumed by other services and applications. BPM systems consume the services from the SOA environment and add in any required human interaction to create a complete business process.
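As a rough illustration of that division of labor – a sketch only, with invented service and process names, not any particular vendor’s API – the services expose system functionality, and the process layer sequences them around a human step:

```python
# Illustrative only: hypothetical 'services' orchestrated into a process
# that adds a human approval step between automated calls.

def credit_check(customer_id: str) -> int:
    """A service wrapping a back-end system; returns a stubbed score."""
    return 680

def open_account(customer_id: str) -> str:
    """Another service, perhaps exposed from a core banking system."""
    return f"ACCT-{customer_id}"

def human_approval(customer_id: str, score: int) -> bool:
    """The human interaction that a BPMS layers on top of the services."""
    answer = input(f"Approve {customer_id} (score {score})? [y/n] ")
    return answer.strip().lower() == "y"

def onboarding_process(customer_id: str) -> None:
    score = credit_check(customer_id)               # automated service call
    if human_approval(customer_id, score):          # human task
        print("Opened", open_account(customer_id))  # automated service call
    else:
        print("Application declined")

onboarding_process("C-1001")
```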

As with every year for the last several years, 2006 has seen ongoing industry consolidation, particularly with vendors seeking to bring SOA and BPM together in their product portfolios. This trend will continue as SOA and BPM become fully recognized as being two essential parts of any organization’s process improvement strategy.

There has certainly been consolidation in the BPM vendor portfolios, especially the integration vendors adding better human-centric capabilities through acquisitions: Oracle acquired BEA in 2008, IBM acquired Lombardi in 2009, Progress acquired Savvion in 2010, and TIBCO acquired Nimbus in 2011. Although BPM is being used in some cases to orchestrate and integrate systems using services, this is still quite a green field for many organizations that have implemented BPM but are still catching up on exposing services from their legacy applications, and orchestrating those with BPM.

BPM standards. 2006 was the year that the Business Process Modeling Notation (BPMN), a notational standard for the graphical representation of process models, went mainstream. Version 2 of the standard was released, and every major BPM vendor is providing some way for their users to make use of the BPMN standard, whether it’s through a third-party modeling tool or directly in their own process modelers.

But BPMN isn’t the only standard that gained importance this year. 2006 also saw the widespread adoption of XPDL (XML Process Definition Language) by BPM vendors as an interchange format: once a process is modeled in BPMN, it’s saved in the XPDL file format to move from one system to another. A possible competitor to XPDL, the Business Process Definition Metamodel (BPDM) had its first draft release this year, but we won’t know the impact of this until later in 2007. On the SOA side, the Business Process Execution Language (BPEL), a service orchestration language, is now widely accepted as an interchange format, if not a full execution standard.

The adoption of BPM standards is critical as we consider how to integrate multiple tools and multiple processes to run our businesses. There’s no doubt that BPMN will remain the predominant standard for the graphical representation of process models, but 2007 could hold an interesting battle between XPDL, BPDM and BPEL as serialization formats.

The “Version 2” that I referred to was actually the second released version of the BPMN standard, but the actual version number was 1.1. That battle for serialization formats still goes on: most vendors support XPDL (and will continue to do so) but are also starting to support the (finally released) BPMN file format as well. BPDM disappeared somewhere in the early days of BPMN 2.0. BPEL is used as a serialization and interchange format primarily between systems that use BPEL as their core execution language, which are a minority in the broader BPMS space.

Modeling for the masses. In March of 2006, Savvion released the latest version of their free, downloadable process modeler: an application that anyone, not just Savvion customers, could download, install and run on their desktop without requiring access to a server. This concept, pioneered by Savvion in 2004, lowers the barrier significantly for process modeling and allows anyone to get started creating process models and finding improvements to their processes.

Unlike generic modeling tools like Microsoft Visio, a purpose-built process modeler can enforce process standards, such as BPMN, and can partially validate the process models before they are even imported into a process server for implementation. It can also provide functionality such as process simulation, which is essential to determining improvements to the process.

2006 saw other BPM vendors start to copy this initiative, and we can expect more in the months to come.

Free or low-cost process modelers have proliferated: there are web-based tools, downloadable applications and Visio BPMN add-ons that have made process modeling accessible – at least financially – to the masses. The problem continues to be that many people using these process modeling tools lack the analysis skills to do significant process optimization (or even, in some cases, to correctly represent an event-driven process): the hype about having all of your business users modeling your business processes has certainly exceeded the reality.
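As an aside, here’s a minimal sketch of the kind of structural check that a purpose-built modeler can enforce and a generic drawing tool can’t: verifying that every BPMN task has at least one incoming and one outgoing sequence flow. It uses the BPMN 2.0 XML namespace (which postdates the original article) and a hypothetical model file name.

```python
# Sketch: partial structural validation of a BPMN 2.0 model file,
# checking that every task has incoming and outgoing sequence flows.
import xml.etree.ElementTree as ET

BPMN = "{http://www.omg.org/spec/BPMN/20100524/MODEL}"

def check_task_connectivity(bpmn_file: str) -> list:
    root = ET.parse(bpmn_file).getroot()
    sources, targets = set(), set()
    for flow in root.iter(f"{BPMN}sequenceFlow"):
        sources.add(flow.get("sourceRef"))
        targets.add(flow.get("targetRef"))
    problems = []
    for task in root.iter(f"{BPMN}task"):
        tid = task.get("id")
        if tid not in targets:
            problems.append(f"task {tid} has no incoming flow")
        if tid not in sources:
            problems.append(f"task {tid} has no outgoing flow")
    return problems

print(check_task_connectivity("process.bpmn"))  # hypothetical model file
```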

Web 2.0 hits BPM. Web 2.0, a set of technologies and concepts embodied within the next generation of internet software, is beginning to impact enterprise software, too.

Web 2.0 is causing changes in BPM by pushing the requirement for zero-footprint, platform-independent, rich user interfaces, typically built using AJAX (Asynchronous JavaScript and XML). Although browser-based interfaces for executing processes have been around for many years in BPM, the past year has seen many of these converted to AJAX for a lightweight interface with both functionality and speed.

There are two more Web 2.0 characteristics that I think we’re going to start seeing in BPM in 2007: tagging and process syndication. Tagging would allow anyone to add freeform keywords to a process instance (for example, one that required special handling) to make it easier to find that instance in the future by searching on the keywords. Process event syndication would allow internal and external process participants to “subscribe” to a process, and feed that process’ events into a standard feed reader in order to monitor the process, thereby improving visibility into the process through the use of existing feed technologies such as RSS (Really Simple Syndication).

Bringing Web 2.0 to BPM will require a few changes to corporate culture, especially those parts that require different – and more creative – types of end-user participation. As more people at all levels in the organization participate in all facets of process improvement, however, the value of this democratization of business processes will become clear.

I’ve been writing and presenting about the impact of social software on BPM for over five years now; adoption has been slower than I predicted, although process syndication (subscribing to a process’ events) has finally become mainstream. Tagging of processes is just starting to emerge; I’ve seen it in BonitaSoft but few other places.
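To make process event syndication concrete, here’s a toy sketch that renders a process instance’s events as an RSS 2.0 feed that any standard feed reader could poll; the events and URLs are invented for illustration.

```python
# Toy sketch: expose process instance events as an RSS 2.0 feed so that
# participants can "subscribe" to a process with an ordinary feed reader.
# The events and URLs below are invented for illustration.
from email.utils import formatdate
from xml.sax.saxutils import escape

events = [
    ("Claim 1234 escalated", "Exceeded 48h SLA at the adjudication step"),
    ("Claim 1234 approved", "Approved by senior adjudicator"),
]

items = "\n".join(
    f"  <item>\n"
    f"    <title>{escape(title)}</title>\n"
    f"    <description>{escape(desc)}</description>\n"
    f"    <pubDate>{formatdate()}</pubDate>\n"
    f"  </item>"
    for title, desc in events
)

feed = (
    '<?xml version="1.0"?>\n'
    '<rss version="2.0"><channel>\n'
    "  <title>Process events: Claim 1234</title>\n"
    "  <link>http://bpms.example.com/claims/1234</link>\n"
    "  <description>Event feed for one process instance</description>\n"
    f"{items}\n"
    "</channel></rss>"
)
print(feed)
```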

I rarely do year-end prediction posts, but it was fun to look back at one that I did five years ago to see how well I did.

Enterprise BPM Webinar Q&A Followup

I know, two TIBCO-related posts in one day, but I just received the link to the replay of the Enterprise BPM webinar that I did for TIBCO last week, along with the questions that we didn’t have time to answer during the webinar, and wanted to summarize here. First of all, my slides:

These were the questions that came in during the webinar via typed chat that are not related to TIBCO or its products; I think that we covered some of these during the session but will respond to all of them here.

Is it possible to implement BPM (business process management) without a BPMS?

How do you capture processes before/without technology?

These are both about doing BPM without a BPMS. I wrote recently about Elevations Credit Union (the fact that they are an IBM customer is completely immaterial in this context), which gained a huge part of their BPM success long before they touched any technology. Basically, they carved out some high-level corporate goals related to quality, modeled their value streams, then documented their existing business processes relative to those value streams. Every business process had to fit into a value stream (which was in turn related to a corporate goal), or else it didn’t survive. They saw how processes touched various different groups, and where the inefficiencies lay, and they did all of this using manual mapping on whiteboards, paper and sticky notes. In other words, they used the management discipline and methodology side of BPM before they (eventually) selected a tool for collaborative process modeling, which then helped them to spread the word further in their organization. There is a misperception in some companies that if you buy a BPMS, your processes will improve, but you really need to reorient your thinking, management and strategic goals around your business processes before you start with any technology, or you won’t get the benefits that you are expecting.

In enterprises that do not have SOA implemented horizontally across the organization, how can BPM be leveraged to implement process governance in the LOB silos, yet have enterprise control?

A BPM center of excellence (CoE) would be the best way to ensure process governance across siloed implementations. I wrote recently about a presentation where Roger Burlton spoke about BPM maturity; he had some advice at the end for organizations at only level 1 or 2 in process maturity (which is probably where you are, if you’re still very siloed): get a CoE in place and target it more at change initiatives than governance. However, you will be able to leverage the CoE to put standards in place, provide mentoring and training, and eventually build a repository of reusable process artifacts.

I work in the equipment finance industry. Companies in this space are typically classified as banks/bank-affiliates, captives and independents. With a few exceptions it’s my understanding that this particular industry has been rather slow at adopting BPMS. Have you noticed this in other industries and, if so, what do you see as being the “tipping point” for greater BPMS adoption rates? Does it ultimately come down to a solid ROI, or perhaps a few peer success stories?

My biggest customers are in financial services and insurance, so are also fairly conservative. Insurance, in particular, tends to adopt technology at the very end of the adoption tail. I have seen a couple of factors that can affect a slower-moving adoption of any sort of technology, not just BPMS: first, if they just can’t do business the old way any more, and have to adopt the new technology. An example of this was a business process outsourcer for back-office mutual fund transactions that started losing bids for new work because it was actually written into the RFP that they had to have “imaging and workflow” technology rather than paper-based processes. Second, if they can’t change quickly enough to be competitive in the market, which is usually the case when many of their competitors have already started using the technology. So, yes, it does come down to a solid ROI and some peer success stories, but in many cases, the ROI is one of survival rather than just incremental efficiency improvements.

Large scale organizations tend to have multiple BPM / workflow engines. What insights can you share to make these different engines in different organizational BUs into an enterprise BPM capability?

Every large organization that I work with has multiple BPMSs, and this is a problem that they struggle with constantly. Going back to the first question, you need to think about both sides of BPM: first the management discipline and methodology, then the technology. The first of these, which is arguably the one with the biggest impact, is completely independent of the specific BPMS that you’re using: it’s about getting the organization oriented around processes, and understanding how the end-to-end business processes relate to the strategic goals. Building a common BPM CoE for the enterprise can help to bring all of these things together, including the expertise related to the multiple BPM products. By bringing them together, it’s possible to start looking at the target use cases for each of the systems currently in use, and selecting the appropriate system for each new implementation. Eventually, this may lead to some systems being replaced to reduce the number of BPMSs used in the organization overall, but I rarely see large enterprises without at least two different BPMSs in use, so don’t be fanatical about getting it down to a single system.

Typically, what is the best order to implement: BPM first and SOA last, or vice versa?

I recommend a hybrid approach rather than purely top-down (BPM first) or bottom-up (SOA first). First, do an inventory of existing services in your environment, since there will almost always be some out there, even if just in your packaged applications such as ERP. While this is happening, start your BPM initiative by setting the goals and doing some top-down process modeling. Assuming that you have a particular process in mind for implementation, do the more detailed process design for that, taking advantage of any services that you have discovered, and identifying what other services need to be created. If possible, implement the process even without the services: it will be no worse from an efficiency standpoint than your current manual process, and will provide a framework both for adding services later and for process monitoring. As you develop the services for integration and automation, replace the manual steps in the process with the services.
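One way to picture “implement the process even without the services” is to model each step as a callable, start with manual placeholders, and swap in service implementations as they are built. A sketch, with invented step names:

```python
# Sketch: a process whose steps begin as manual placeholders and are
# replaced by automated services as they become available.

def manual_step(name: str):
    def step(case: dict) -> dict:
        input(f"[MANUAL] Do '{name}' for case {case['id']}, then press Enter ")
        return case
    return step

def address_lookup_service(case: dict) -> dict:
    """An automated replacement for the manual address lookup step."""
    case["address"] = "123 Main St"  # stubbed service result
    return case

# Initial deployment: every step is manual, but the end-to-end process
# is already in the engine and can be monitored.
process = {
    "validate order": manual_step("validate order"),
    "look up address": manual_step("look up address"),
    "ship": manual_step("ship"),
}

# Later, as the service is developed, swap it into the same process.
process["look up address"] = address_lookup_service

case = {"id": "ORD-42"}
for step_name, step in process.items():
    case = step(case)
    print(f"completed: {step_name}")
```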

Re: Enterprise BPM Goals – Develop, Execute, but what about Governance?

This was in response to the material on my agenda for the webinar. Yes, governance is important, but I only had 40 minutes and could barely cover the design/develop/execute parts of what we wanted to cover. Maybe TIBCO will have me back for another webinar on governance. 😉

Data/content centric processes vs. people-centric vs. EAI/integration centric re: multiple BPMS platforms. Any guidelines for when and where to demarcate?

These divisions are very similar to the Forrester divisions of the BPMS landscape from a few years ago, and grew mostly out of the different types of systems that were all lumped together as “BPMS” by the analysts in the early 2000s. Many of today’s products offer strength in more than one area, but you need to have a good understanding of your primary use cases when selecting a product. Personally, I don’t think that content-centric versus human-centric is the right way to split it: it’s more like unstructured (case management) versus structured, and even then, there is more of a spectrum of functionality in most cases than purely unstructured or purely structured. So really, the division is between processes that have people involved (human-centric) and those that are more for automated integration (system-centric), with the former having to accommodate this wider spectrum of process types. If you have mostly automated integration processes, then certainly an integration-centric BPMS makes sense; if you have human-facing processes, then the question is a bit more complex, since you’re dealing with content/documents, process types, social/collaborative capabilities and a host of other requirements that you need to look at relative to your own use cases. In general, the market is moving towards the full range of human-facing processes being handled by a single product, although specialist product companies would differ.

Thoughts on the role of the application/solution architect within an LOB or COE vs. that of the enterprise architect assigned to the BPM domain?

An enterprise architect assigned to the BPM CoE/domain is still (typically) part of the EA team, and therefore involved with the broader scope of enterprise architecture issues. An application/solution architect tends to be more product and technology focused, and in many cases that is just a fancy term for a developer. In other words, the EA should be concerned with overall strategy and goals, whereas the solution architect is focused on implementation.

Role of the COE in governance? How far does/should it extend?

The CoE is core to governance: that’s what it’s there for. At the very least, the CoE will set the standards and procedures for governance, and may rely on the individual projects to enforce that governance.

Is it really IT giving up control? In many cases, the business does whatever they do — and IT has little (or aged) information about the actual processes.

This was in reference to slide #11 in my deck about cultural issues. Certainly the business can (and often does) go off and implement its own processes, but that happens outside the context of enterprise-wide systems. For the business to do that within the enterprise BPMS, IT has to ensure that the business can access the process discovery and modeling tools that become the front end of process design. That way, business and IT share models of the business processes, which means that what gets implemented in the BPMS might actually resemble what the business requires. In some cases, I see a company buy a BPMS but not allow the business users to use the business-level tools to participate in process modeling: this is usually the result of someone in IT thinking that this is beyond the capability of the business people.

Is following any BPM notation standard part of BPM development? I saw no mention of it.

There was so much that I did not have time to address with only 40 minutes or so to speak, and standards didn’t make the cut. In longer presentations, I always address the issue of standards, since a common process modeling notation is essential to communication between various stakeholders. BPMN is the obvious front-runner there, and if used properly, can be understood by both business and IT. It’s not just about process models, however: a BPMS implementation has to also consider data models, organizational models and more, around which there is less standardization.

Regarding Common UI: shouldn’t it be Common Architecture, accessed by different UIs that fit the user’s roles, knowledge, etc?

In the context of slide #6, I did mean a common UI, literally. In other words, using the BPMS’ composite application development and forms environment to create a user interface that hides multiple legacy applications behind a single user interface, so that the user deals with this new integrated UI instead of multiple legacy UIs. Your point seems to be more about persona-based (or role-based) interfaces into the BPMS, which is a valid, but different, point. That “single UI” that I mention would, in fact, be configurable for the different personas who need to access it.
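In code terms, that single UI sits on a facade: the composite layer aggregates the legacy systems so that the screen never talks to them directly. An illustrative sketch, with made-up legacy clients:

```python
# Illustrative facade: the composite application layer aggregates several
# legacy systems behind one call, so the user-facing form binds to a
# single interface. The legacy client classes are invented.

class PolicySystemClient:
    def get_policy(self, customer_id: str) -> dict:
        return {"policy": "POL-9", "status": "active"}  # stubbed legacy call

class BillingSystemClient:
    def get_balance(self, customer_id: str) -> float:
        return 125.50  # stubbed legacy call

class CustomerFacade:
    """What the composite UI calls, instead of each legacy screen."""
    def __init__(self):
        self.policies = PolicySystemClient()
        self.billing = BillingSystemClient()

    def customer_summary(self, customer_id: str) -> dict:
        return {
            **self.policies.get_policy(customer_id),
            "balance": self.billing.get_balance(customer_id),
        }

print(CustomerFacade().customer_summary("C-7"))
```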

How does a fully fledged BPM tool stack up against the workflow tools that are part of other COTS applications, e.g., workflow in a document management tool or in a trouble ticketing tool?

A full BPMS tends to be much more flexible than the embedded workflow within another platform, and is more of an application development platform than just a way to control processes within that application. On the other side, the workflow within those applications is typically already fully integrated with the other business objects within them (e.g., documents, trouble tickets), so the implementation may be faster for that particular type of process. If the only type of process management that you need to do is document approvals within your document management system, it may make sense to use that rather than purchase a full BPMS; if you have broader process management needs, start looking at a more general BPMS platform that can handle more of your use cases.

How do you see BPM tools surviving when CRM tools with more or less the same capabilities, with out-of-the-box processes defined, are being widely accepted by enterprises?

Similar to my response to the previous question, if the processes are related only to the business objects within the CRM, then you may be better off using the workflow tools within it. However, as soon as you want to integrate in other data sources, systems or users, you’ll start to get beyond the functional capabilities of the simpler workflow tools within the CRM. There’s room in the market for both; the trick is, for customers, to understand when to use one versus the other.

What are the reasons you see for BPM tools not getting quickly and widely accepted and what are the solutions to overcome that?

There are both cost and complexity components to BPMS adoption, but a bigger hurdle, before you even start looking at tools, is moving your organization to a process-driven orientation, as I discussed above. Once people start to look at the business as end-to-end processes, and at those processes as assets and capabilities that the business offers to its customers, there will be a great pull for BPMS technologies to help that along. Once that motivation is in place, the cost and complexity barriers are still there, but are becoming less significant: first of all, more vendors are offering cloud-based versions of their software that allow you to try it out – and even do your full development and testing – without capital expenditures. If they offer the option, you can move your production processes on-premise, or leave them in the cloud to keep the total cost down. As for complexity, the products are getting easier to use, but are also offering a lot more functionality. This shifts the complexity from one of depth (learning how to do a particular function) to breadth (learning what all the functions are and when to use which), which is still complex, but less of a technological complexity.

Is it possible to start introducing and implementing BPM in one department or module only, and then extend BPM to other departments or modules? Or should this be an enterprise-wide decision, since it involves heavy costs to bring in BPM technologies?

Almost every organization that I work with does their BPM implementation in one department first, or for one process first (which may span departments): it’s just not possible to implement everything that you will ever implement in BPM at the same time, first time. There needs to be ROI within that first implementation, but you also have to look at enterprise cost justification as with any horizontal technology: plan for the other projects that will use this, and allocate the costs accordingly. That might mean that some of the initial costs come from a shared services or infrastructure budget rather than the project budget, because they will eventually be allocated to future projects and processes.

How difficult would it be to replace a legacy workflow system with BPM?

It depends (that’s always the consultant’s answer). Seriously, though, it depends on the level of integration between the existing workflow system and other systems, and how much of the user interface it provides. I have seen situations where a legacy workflow system is deeply embedded in a custom application platform, with fairly well-defined integration points to other systems, and the user interface hiding the workflow system from the end user. In this case, although it’s not trivial, it is a straightforward exercise to rip out the workflow system – since it is being used purely as a process engine – replace it with a new one, refactor the integration points so that the new system calls the other systems in the environment (usually easier, since modern BPMSs have better integration capabilities), and refactor the custom UI so that it calls the new BPMS (also usually easier because of updated functionality). That’s the best case, and as I said, it’s still not trivial. If the legacy workflow system also provides the user interface, then you’re looking at redeveloping your entire UI, either in the new BPMS or in some other UI development tool, plus the back-end systems integration work. A major consideration in either case is that you don’t want to just replicate the functionality of the old workflow system, since the new BPMS will have far greater functionality: you need to think about how you are going to leverage capabilities such as runtime collaboration that never existed in the old system, in order to see the greatest benefit from the upgrade.
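The well-defined integration points that make such a swap feasible look something like this in miniature – a sketch with invented engine APIs, not any real product’s interface – where the application codes against a narrow interface and each engine sits behind an adapter:

```python
# Sketch: the application depends on a narrow interface, and each workflow
# engine (legacy or new) is wrapped in an adapter. Engine APIs are invented.
from abc import ABC, abstractmethod

class ProcessEngine(ABC):
    @abstractmethod
    def start(self, process_name: str, data: dict) -> str: ...

    @abstractmethod
    def complete_task(self, instance_id: str, task: str) -> None: ...

class LegacyWorkflowAdapter(ProcessEngine):
    def start(self, process_name: str, data: dict) -> str:
        # would call the legacy engine's proprietary API here
        return "LEGACY-001"

    def complete_task(self, instance_id: str, task: str) -> None:
        print(f"legacy engine: {task} done on {instance_id}")

class NewBPMSAdapter(ProcessEngine):
    def start(self, process_name: str, data: dict) -> str:
        # would call the new BPMS, e.g. via its web services, here
        return "BPMS-001"

    def complete_task(self, instance_id: str, task: str) -> None:
        print(f"new BPMS: {task} done on {instance_id}")

def submit_claim(engine: ProcessEngine) -> None:
    """Application code: unchanged when the engine behind it is swapped."""
    instance = engine.start("claims", {"amount": 500})
    engine.complete_task(instance, "triage")

submit_claim(LegacyWorkflowAdapter())  # before the migration
submit_claim(NewBPMSAdapter())         # after, with no application changes
```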

Is it possible to switch between BPM vendors without having pain?

No. As with the previous answer, this is a non-trivial exercise and, depending on how much of the BPMS functionality you were using, could amount to a complete redevelopment. If the BPMS was used primarily for orchestration of automated processes, it will be much easier, but as soon as you get into custom integration/orchestration and user interfaces, it gets a lot more complicated (and painful).

Do we really need to go for BPM in a situation where we need integration orchestration only?

One end of the BPMS market is integration-centric systems, which primarily do just integration orchestration. The advantage of using a BPMS for this instead of orchestrating directly in application code is that you get all of the other stuff that comes with the BPMS “for free”: graphical process modeling, execution monitoring, process governance and whatever other goodies are in the BPMS. It’s not really free, of course, but it’s valid to consider a comparison of all of that functionality against what parts of it you would have to custom-build if you were to do the orchestration in code.
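To see what “for free” means in practice, consider what even a trivial hand-coded orchestration has to reinvent – status tracking, logging, an audit trail – before it approaches what a BPMS includes out of the box. A sketch with stubbed service calls:

```python
# Hand-coded orchestration: even this trivial version must build its own
# logging and status tracking; the service calls themselves are stubbed.
import logging

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")

def orchestrate_order(order_id: str) -> None:
    steps = ["reserve inventory", "charge payment", "schedule shipment"]
    for step in steps:
        logging.info("order %s: starting %r", order_id, step)
        # ... call the actual service here ...
        logging.info("order %s: finished %r", order_id, step)
    # Still missing: a process model, an execution dashboard, history
    # storage, escalation -- all custom-built in the code-only approach.

orchestrate_order("ORD-7")
```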

That’s it for the Q&A. If you listen to the replay, or were on the live broadcast, my apologies for the rushed beginning: I got off on the wrong foot out of the gate, but settled down after the first few minutes.

TIBCO Spotfire 4.0

I had a briefing with TIBCO on their Spotfire 4.0 release, announced today and due to be released by the end of November. Spotfire is the analytics platform that TIBCO acquired a few years back, and provides end-user tools for dimensional analysis of data. This includes both visualization and mashups with other data sources, such as ERP systems.

In 4.0, they have focused on two main areas:

  • Analytic dashboards for monitoring and interactive drilldowns. This seems more like the traditional BI dashboards market – Spotfire is better known for its multidimensional visualizations – but I expect that business customers find those just a bit too unstructured at times.
  • Social collaboration around data analysis, both in terms of finding contributors and publishing results, by allowing Spotfire analysis to be embedded in SharePoint or shared with tibbr, and by including tibbr context in an analysis.

I did get a brief demo, starting with the dashboards. This starts out like a pretty standard dashboard, but does show some nice tools for the user to change the views, apparently including custom controls that can be created without development. The dynamic visualization is good, as you would expect if you have ever seen Spotfire in full flight: highlighting parts of one visualization object (graph or table) causes the corresponding bits in the other visualizations on the screen to be highlighted, for example.

Spotfire 4.0 - tibbr in sidebar of dashboard

There’s also some built-in collaboration: a chart on the Spotfire dashboard can be shared on tibbr, which shows a static snapshot of the chart in a discussion thread but links back directly to the live visualization, whereas sharing in SharePoint embeds the live visualization rather than a static shot. [Insert obligatory iPad demo here.] Embedding a tibbr discussion as context within an analysis is really less an integration than a side-by-side complementary view: you can have a tibbr discussion thread displayed on the same page as part of the analysis, although the tibbr thread is not itself being analyzed.

I found that the integration/collaboration was a bit lightweight, some of it no more than screen sharing (like the difference between a portal and a composite application). However, the push into the realm of more traditional dashboards will allow Spotfire to take on the more traditional BI vendors, particularly for data related to other TIBCO products, such as ActiveMatrix BPM.

[Update: All screenshots from briefing; for some reason, Flickr doesn’t want to show them as an embedded slideshow]

Taking Time To Remember

Today is Remembrance Day in Canada (Veterans’ Day if you are in the US), which marks the anniversary of the signing of the armistice in World War I on November 11, 1918. The day is now used to honor soldiers of all wars.

I started a little project last year, after finding my grandfather’s WWI journals and my father’s WWII journal: I have been blogging the journals on a daily basis, with each day’s entry posted on the same day, just shifted by 94 years and 67 years, respectively. My grandfather’s journal covers the entire period from the day he shipped out in 1916 until he arrived home in 1919; we’re at November 1917 right now, so there’s a year and a half to go. My father’s journal, unfortunately, only covers January through September 1944, so is already finished, but considering that he was in the navy, and took troops onto the beaches of Normandy during the invasion, it makes for some pretty interesting reading.

For the most part, these are just a few sentences each day written by small-town boys who were not recognized war heroes: not usually the kind of stories that we read about the wars. My grandfather’s sense of melancholy and my father’s sense of adventure are interesting contrasts. Feel free to follow along, and help out with the handwriting when I can’t decipher the journal myself.

SAP NetWeaver Business Warehouse with HANA

Continuing at the SAP World Tour in Toronto today, I went to a breakout innovation session on NetWeaver Business Warehouse (BW) and HANA, with Steve Holder from their BusinessObjects center of excellence. HANA, in case you’ve been hiding from all SAP press releases for the past two years, is an analytic appliance (High-performance ANalytic Appliance, in fact) that includes hardware and in-memory software for real-time analysis of non-aggregated information (i.e., not complex event processing). Previously, you would have had to move your BW data (which had probably already been ETL’d from your ERP to BW) over to HANA in order to take advantage of that processing power; now, you can actually make HANA the persistence layer for BW instead of a relational database such as Oracle or DB2, so that the database behind BW becomes HANA. All the features of BW (such as cubes and analytic metadata) can be used just as they always could be, and any customizations such as custom extractors already done on BW by customers are supported, but moving to an in-memory database provides a big uplift in speed.

Previously, BW provided data modeling, an analytical/planning engine, and data management, with the data stored in a relational database. Now, BW only provides the data modeling, and everything else is pushed into HANA for in-memory performance. What sort of performance increases? Early customer pilots are seeing 10x faster data loading, 30x faster reporting (3x faster than BW Accelerator, another SAP in-memory analytics option), and a 20% reduction in administration and maintenance (no more RDBMS admins and servers). This is before the analytics have been optimized for in-memory: this is just a straight-up conversion of their existing data into HANA’s in-memory columnar storage. Once you turn on in-memory InfoCubes, you can eliminate physical cubes in favor of virtual cubes; there are a lot of other optimizations that can be done by eventually refactoring to take advantage of HANA’s capabilities, allowing for things such as interfacing to predictive analytics, and providing linear scaling of data, users and analysis.
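As a toy illustration of why in-memory columnar storage speeds up analytics – the general idea only, not HANA’s implementation – an aggregate over one field reads a single contiguous array instead of walking every record:

```python
# Toy contrast between row and column storage for one aggregate query.
# Illustrates the general columnar idea, not HANA's implementation.

rows = [  # row store: each record kept together
    {"region": "East", "revenue": 100.0},
    {"region": "West", "revenue": 250.0},
    {"region": "East", "revenue": 175.0},
]

columns = {  # column store: each field kept as one contiguous array
    "region": ["East", "West", "East"],
    "revenue": [100.0, 250.0, 175.0],
}

# Row store: the query must touch every record to sum one field.
print(sum(r["revenue"] for r in rows))

# Column store: the query reads exactly one array, which scans
# sequentially in memory and compresses well per column.
print(sum(columns["revenue"]))
```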

This is not going to deprecate BW Accelerator, but it provides options for moving forward that include a migration path from BWA to BW on HANA. BWA, however, provides performance increases for only a subset of BW data, so you can be sure that SAP will be encouraging people to move from BWA to BW on HANA.

A key message is that customers’ BW investments are completely preserved (although not time spent on BWA), since this is really just a back-end database conversion. Eventually, the entire Business Suite ERP system will run on top of HANA, so that there will be no ETL delay in moving operational data over to HANA for analysis; presumably, this will have the same sort of transparency to the front-end applications as does BW on HANA.