BEAParticipate: Daniel Weiler

The Ministry of Education of Luxembourg has implemented an educational portal using BEA technology, and Daniel Weiler, a professor at the Center of Technology for Education, discussed what they’ve learned over the 5 years since first implementation. Their vision 5 years ago was of a digital learning place that was accessible anywhere on any type of device, with social computing aspects such as collaboration, that provided access to eLearning and other educational content, used single sign-on authentication and security, and offered different personalities depending on whether the user was a teacher, a student or another type of participant. Today, these requirements are no big deal. Five years ago, in an educational environment, it was a bit more revolutionary.

Weiler did a live demonstration of the portal from a teacher’s viewpoint, which provides access to educational content such as MSN Encarta, internally-created documents, email, collaboration communities, and services such as forms creation. They’ve built and integrated a number of custom applications, such as resource scheduling and library management. In the student-facing version of the portal, they have eLearning collaboration spaces that include direct Skype links to teachers as well as content specific to a certain subject area.

Each school can have a personalized extranet portal that is hosted under the main mySchool portal, but has its own look and feel, including navigational structures; he showed examples of an elementary and a primary school that looked and behaved completely differently, but were based on the same platform. They also have a media gallery with both photos and video on various arts, literature and other topics, all managed and accessible through the portal platform.

BEAParticipate: Frank Ybarra and Bambi George

Next up were Frank Ybarra and Bambi George of Applebee’s, a customer of BEA, discussing their corporate (internal-facing) portal and business agility. They’re a public company (for now) with almost 2,000 restaurants (75% of them franchised) in 49 states and 17 other countries, so they have a lot of internal communications challenges. They use the portal to publish operational changes, such as recipe changes that need to be implemented; sales figures and other branch-specific metrics; schedules and other HR information; and collaboration spaces for project teams such as new country startups.

They’ve had their portal in place for long enough to reach their entire target audience, and are now looking for new content to deliver through the portal to increase its value to the internal community. For example, their head chef (love his title: VP of Menu) writes a regular blog-like column, and answers questions that come in as feedback on the column; once this was put in place, no IT support was required to keep it running since the chef was posting his columns and reading/responding to feedback directly.

They’ve done some manual search optimization to help people to find documents on the portal site, and they’re looking forward to trying out AquaLogic Pathways to see how community tagging and bookmarking improves the search capabilities. They also mentioned incorporating Pages for blogging and other user content creation in the future.

BEAParticipate: Mark Carges

Day 1 of the BEA user conference in Atlanta, and we start out with a morning of general sessions hosted by Ira Pollack, SVP Sales at BEA; the remainder of the 2-1/2 day conference is all breakout sessions. There’s wifi around but I seem to be missing the conference code necessary to get logged on, so posts will be delayed throughout the conference as I’ll be gathering them up to publish at times when I can get internet access. There’s also not a power source in sight, which could mean that the last parts of this are really delayed as I transcribe them from paper. 🙁

BEAParticipate is a new user conference dedicated to portals, BPM and social computing, with tracks for business and developer-focussed audiences. My focus on BEA only began with their acquisition of Fuego a year or so ago, so I’m not sure what they had in terms of user/developer conferences prior to (or in addition to) this, although I talked last night with a web developer who has been a Plumtree customer for years and moved over from the Plumtree conference when it was rolled into this one.

We started out with Mark Carges, EVP of BEA (who many years ago helped develop the source code for Tuxedo), with a high-level vision of how these technologies can create new types of agile applications, and how BEA is delivering BPM, SOA and enterprise social computing (Enterprise 2.0). He talked about the difference between traditional and situational applications, the top-most point being that traditional ones are built for permanence whereas situational ones are built for change: exactly the point that I made last week in my talk at TUCON. He covered other comparative points, such as tightly- versus loosely-coupled, non-collaborative versus collaborative, homogeneous vertical integration in application silos versus heterogeneous horizontal integration, and application-driven versus business process-driven.

He walked us through a few examples of their customers’ portal applications (purely intranet, customer-facing, and public) and one example of BPM at a customer, before moving on to talk about BEA’s strategy and product development, particularly in Enterprise 2.0. He made the point that enterprise applications are having to learn from consumer-facing Web 2.0 applications by allowing for different types and degrees of user participation. Instead of just listing consumer Web 2.0 applications, however, Carges made analogies with how the same sort of technology could be used inside an enterprise: Digg-like ranking used to rank sales tools internally; social bookmarking and implicit connections for internal expert knowledge discovery (much like what IBM is doing with Dogear, which I’m sure they’ll turn into a commercial product once companies like BEA prove the market for it); mashups for creating a single view of a customer from multiple sources including product, support incidents and account information; and wikis to capture competitive intelligence. This is where their new product suite fits: AquaLogic Pages (to create pages, blogs and wikis), Ensemble (for developers to create mashups) and Pathways (for tagging and bookmarking). All of these mesh with IT governance such as security and versioning, but the content isn’t controlled by IT.
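None of this maps to a specific AquaLogic API, but to make the Digg-style ranking analogy concrete, here’s a minimal sketch in Java of vote-based ranking applied to internal sales tools. The class names and the decay formula are purely illustrative assumptions on my part, not anything BEA described.

    import java.util.*;

    // Hypothetical sketch of Digg-style ranking applied to internal sales tools.
    // Nothing here corresponds to an actual AquaLogic API; it just illustrates the
    // idea of letting votes (rather than an administrator) decide what surfaces first.
    public class SalesToolRanking {

        static class SalesTool {
            final String name;
            final int votes;       // "diggs" from the sales force
            final long ageInDays;  // time since posting

            SalesTool(String name, int votes, long ageInDays) {
                this.name = name;
                this.votes = votes;
                this.ageInDays = ageInDays;
            }

            // Simple decay: newer, more-voted tools float to the top.
            double score() {
                return votes / Math.pow(ageInDays + 1, 1.5);
            }
        }

        public static void main(String[] args) {
            List<SalesTool> tools = new ArrayList<>(List.of(
                new SalesTool("Competitive battlecard: vendor X", 42, 3),
                new SalesTool("ROI calculator spreadsheet", 87, 30),
                new SalesTool("New pricing FAQ", 15, 1)));

            tools.sort(Comparator.comparingDouble(SalesTool::score).reversed());
            tools.forEach(t -> System.out.printf("%-35s %.2f%n", t.name, t.score()));
        }
    }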

Interesting that the focus of his talk has really been on their new Enterprise 2.0 products rather than portals or BPM; they obviously see this as a strong potential growth area.

BEA user conference this week

I’ll be blogging live from the BEA user conference this week in Atlanta. I arrived here today and had a quick look around the evening partner exhibition/drinks reception, but I’m really looking forward to the sessions tomorrow. Jesper Joergensen has also promised that I’ll get a proper demo of both their BPM product and the new Enterprise 2.0 product suite this week.

All of my posts for this conference will be here.

Disclosure: BEA paid my travel expenses to be at this conference.

Transformation & Innovation conference coming up this month

The Transformation & Innovation conference is running in Washington DC on May 21-24, with several sessions on BPM.

I won’t be there; the dates are sandwiched in between a vacation trip to Nova Scotia and a presentation at the Shared Insights Portals & Collaboration conference in Las Vegas.

Yahoo 404’s at AT&T Park

I was in San Francisco this week for a vendor conference, and the gala event was 750 of us heading off to AT&T Park to watch the Giants play the Colorado Rockies. I’ve never been in this stadium before, and one of the first things that I noticed was this juxtaposition of Yahoo’s ad and the 404′ distance marker on the outfield fence.

I pointed this out to a few people, had to explain it to a few others, and generally concluded that no one else had noticed this. I thought it was hilarious, and can’t believe that it’s accidental.

Oh yeah, I was there for Barry Bonds’ 743rd home run, too.

TUCON: The Face of BPM

Thursday morning, and it seems like a few of us survived last night’s baseball game (and the after-parties) to make it here for the first session of the day. This will be my last session of the conference, since I have a noon flight in order to get back to Toronto tonight.

Tim Stephenson and Mark Elder from TIBCO talked about Business Studio, carrying on from Tim’s somewhat shortened bit on Business Studio on Tuesday when I took up too much of our joint presentation time. The vision for the new release coming this quarter is that one tool can be used by business analysts, graphical tools developers and operational administrators by allowing for different perspectives, or personas. There are nine key functions, from business process analysis and modelling to WYSIWYG forms design to service implementation.

The idea of the personas within the product is similar to what I’ve seen in the modelling tools of other BPMS vendors: each persona has a different set of functions available and somewhat different views onto the process being modelled. Tim gave some great insight into how they considered the motivations and requirements of each of the types of people who might use the product in order to develop the personas, and showed how they mapped out the user experience flow with the personas overlaid to show the interfaces and overlaps in functionality. This shows very clearly the overlap between the business analyst and developer functionality, which is intentional: who does what in the overlap depends on the skills of the particular people involved.

As we heard in prior sessions, Business Studio provides process modelling using BPMN, plus concept modelling (business domain data modelling) using UML to complement the process model. There’s a strong focus on how BPM can consume web services and BusinessWorks services, because much of the audience is likely developers who use TIBCO’s other products like BusinessWorks to create service wrappers around legacy applications. At one point between sessions yesterday, I had an attendee approach me and thank me for the point that I made in my presentation on Tuesday about how BPM is the killer app for SOA (a point that I stole outright from Ismael Ghalimi — thanks, Ismael!), because it helped him to understand how BPM creates the ROI for SOA: without a consumer of services, the services themselves are difficult to justify.

We saw a (canned) demo of how to create a simple process flow made up of a number of steps: a human-facing step, a database call to a stored procedure, a web service call based on introspecting the WSDL and performing some data mapping/transformation, a script task that uses JavaScript to perform some parameter manipulation, and an email task that allows the runtime process instance parameters to be mapped to the email fields. Then, the process definition is exported to XPDL and imported into the iProcess Modeler in order to get it into the repository that’s shared with the execution engine. Once that’s done, the process is executable: it can be started using the standard interface (which is built in General Interface), and the human-facing steps have a basic form UI auto-generated.

It is possible to generate an HTML document that describes a process definition, including a graphical view of the process map and tabular representations of the process description.

As I’ve mentioned in other posts, and in many posts that I’ve made about BPA tools, there’s no shared model between the process modeller and the execution environment, which is a serious issue for process agility and round-tripping unless you do absolutely nothing to the process in the iProcess Modeler except use it as a portal to the execution repository. TIBCO has brought a lot (although not all) of the functionality of the Modeler into Studio, and are working towards a shared model between analysts and developers; they believe that they can remove the need for Modeler altogether over time. There’s no support at this time, however, for deploying directly from Studio, that is, Studio won’t plug directly into the execution engine environment. Other vendors who have gone the route of a downloadable disconnected process modeller or a separate process discovery tool are dealing with the same issue; ultimately, they all need to make this new generation of modelling tools as integrated with the execution environment as those that they’re replacing in order to eliminate the requirement for round-tripping.

TUCON: Continuous Process Improvement Using iProcess Analytics

For the last breakout session of the day, Mark Elder of TIBCO talked about reporting and analytics with iProcess Analytics, their historical (rather than real-time) analytics product. The crowd’s thin this time of day, although I understand that the lobby bar is well-populated; this was the same timeslot that Tim and I had yesterday, but the attendees now have an additional day of conference fatigue. It also doesn’t help that the presentation PC acted up and we were 10 minutes late starting.

He looks at the second half of any business process implementation: after it’s up and running, you need to measure what’s going on, then feed that back to the design stage for process improvement. iProcess Analytics has a number of built-in metrics, plus wizard interfaces that allow a business analyst to build new KPIs by specifying dimensions and filters, then create interactive reports. It’s even possible to set different threshold values for filtered subsets of data, such as setting different cycle-time goals for different geographic regions.
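This isn’t the iProcess Analytics API (the product does this through its wizards), but as a rough sketch of the underlying idea of a KPI sliced by a dimension with per-slice thresholds, here’s how a cycle-time-by-region check might look in Java; all names and numbers are made up for illustration.

    import java.util.*;
    import java.util.stream.*;

    // Not the iProcess Analytics API; just a sketch of the concept described in
    // the session: a cycle-time KPI grouped by a "region" dimension, with a
    // different goal (threshold) applied to each region.
    public class RegionalCycleTimeKpi {

        record CaseRecord(String region, double cycleTimeHours) {}

        public static void main(String[] args) {
            List<CaseRecord> cases = List.of(
                new CaseRecord("EMEA", 26.0), new CaseRecord("EMEA", 31.5),
                new CaseRecord("APAC", 44.0), new CaseRecord("APAC", 39.0),
                new CaseRecord("AMER", 22.5));

            // Different cycle-time goals per region (illustrative values only).
            Map<String, Double> goalHours = Map.of("EMEA", 30.0, "APAC", 48.0, "AMER", 24.0);

            Map<String, Double> avgByRegion = cases.stream().collect(
                Collectors.groupingBy(CaseRecord::region,
                    Collectors.averagingDouble(CaseRecord::cycleTimeHours)));

            avgByRegion.forEach((region, avg) -> {
                boolean withinGoal = avg <= goalHours.get(region);
                System.out.printf("%s: avg %.1f h (goal %.1f h) -> %s%n",
                    region, avg, goalHours.get(region), withinGoal ? "OK" : "MISSED");
            });
        }
    }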

He moved on to a live demo after a few minutes of slideware to show us just how easy it is to create a chart or report for a process, or even a single step within a process. The wizard looks pretty easy to use, although chart generation isn’t exactly rocket science. There are some nice report distribution capabilities, much like what you’d see in a business intelligence suite, so that you can share a chart or report with other members of your team. You can’t do a lot of calculations on the data, but you can export tabular data to Excel for further calculations and aggregation.

One very cool feature is that for a given set of data that’s being used to generate a report, you can reconstruct the process map from the report data to see where the data is coming from, since process metadata is passed over to iProcess Analytics along with the execution data.

It appears that if you want to get real historical data into Business Studio for simulation, you’re going to have to create a tabular report in iProcess Analytics, export it to Excel, then save it as a text file for importing. Not as integrated as I would have expected; this needs to be fixed as more people start to use the simulation functionality within Business Studio.
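For what it’s worth, the glue for that handoff is trivial to script if you’d rather not round-trip through Excel by hand. Here’s a rough sketch; the file names and the tab-delimited target format are my assumptions, not anything prescribed by iProcess Analytics or Business Studio.

    import java.io.IOException;
    import java.nio.file.*;
    import java.util.List;
    import java.util.stream.Collectors;

    // Rough sketch of scripting the report-to-simulation handoff instead of doing
    // it by hand in Excel. File names and the tab-delimited output format are
    // assumptions for illustration only.
    public class ReportToSimulationFile {
        public static void main(String[] args) throws IOException {
            Path csvExport = Path.of("cycle_time_report.csv"); // exported report data
            Path textFile = Path.of("cycle_time_report.txt");  // text file for import

            List<String> lines = Files.readAllLines(csvExport);
            List<String> tabDelimited = lines.stream()
                .map(line -> line.replace(",", "\t"))           // naive: assumes no embedded commas
                .collect(Collectors.toList());

            Files.write(textFile, tabDelimited);
            System.out.println("Wrote " + tabDelimited.size() + " rows to " + textFile);
        }
    }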

It’s all browser-based, and can generate either the interactive web-based reports that we saw, or static HTML or PDF reports. It will be interesting to see how TIBCO moves forward with their analytics strategy, since they now have both iProcess Analytics and iProcess Insight (BAM). Although historical analytics and BAM serve different purposes, they’re opposite ends of the same spectrum, and analytics requirements will continue to creep from both ends towards the middle. Like many other vendors who started with an historical analytics tool then bolted on an OEM BAM tool, they’re going to have to reconcile this at some point in the future. There’s also the question, which was raised by an audience member, about the boundary between iProcess Analytics and a full BI suite like Cognos. Although there’s a lot of nice functionality built into iProcess Analytics that’s specific to analyzing processes, many customers are going to want to go beyond this fairly rudimentary BI capability.

That’s the end of day 2 at TUCON; tomorrow morning I’ll probably only be able to catch one session before I have to leave for the flight home. Tonight, 750 of us are off to the SF Giants game, where we’ll see if Vivek Ranadivé’s throwing practice paid off when he throws out the first pitch. Watch for all of us in our spiffy new TIBCO jackets; with free wifi in the stadium, there’s likely to be some geeks there with their laptops, too.

TUCON: Architecting for Success

In an afternoon breakout session, Larry Tubbs from AmeriCredit talked about using TIBCO to automate their contract processing workflow, that is, the part between loan origination/approval and the contract administration system. Their business case was similar to what I’ve seen in many other financial and insurance applications: visibility into the processes, appropriate management of resources, and ever more stringent regulatory requirements. They did a product evaluation and selected TIBCO, using iProcess Suite, BusinessFactor, BusinessWorks, and EMS as the underlying service bus. They implemented really quickly: for their initial release, it was a matter of months from initial design to rollout to five branches handling 1,600 cases simultaneously (the system is designed for a peak load of 7,000 cases).

Nimish Rawal from TIBCO, who was involved in the implementation, described some details of what they did and the best practices that they used: use the iProcess engine for process orchestration and BusinessWorks for integration; put application data in a separate schema (they had 583 instance data fields and 257 metadata fields); create a queue/group structure according to business divisions; and allow the business to control the rules to allow for easy changes to the process flow or any changing regulations. They used master and slave iProcess servers hitting against a common database to distribute the load, and used clustering for high availability, although the failover process is not automatic (which surprised me a bit, since clustering software or hardware can automate this). They also planned for disaster recovery by distributing nodes between two physical locations and sending archive files from the master to the DR site about once every five minutes; again, the failover is not automatic, but that’s less expected in the case of a total site loss.

Rawal also went through the TIBCO professional services engagement model. On the AmeriCredit side, they had four core developers working with the TIBCO team (which went from five to seven to two people), and now the TIBCO people only do mentoring, with all development being done by AmeriCredit’s developers.

TUCON: BPM, The Open Source Debate

Ryan Herd, who heads the BPM centre of competence within RMB Private Bank, was up next to talk about the analysis that they did on open source BPM alternatives. Funny that the South Africans, like us understated Canadians, use the term “centre of competence” as opposed to the very American “center of excellence”. 🙂

Don’t tell Ismael Ghalimi, but Herd thinks that JBoss’s jBPM is the only open source BPM alternative; it was the only one that they evaluated, along with a number of proprietary solutions including TIBCO. Given that he’s here speaking at this conference, you can guess which one they picked.

Their BPM project started with some strategic business objectives:

  • operational efficiency
  • improved client service
  • greater business process agility

and some technology requirements:

  • a platform to define, improve and automate business processes
  • real-time and historical process instance statistics
  • single view of a client and their related activities

They found that they needed to focus on three things:

  • Process: dynamic quality verification, exception handling that can step outside the defined process, and a focus on the end-to-end process.
  • People: have their people be obsessed with the client, develop an end-to-end process culture in order to address SLAs, and create full-function teams rather than an assembly-line process.
  • Systems: a single processing front-end, a reusable business object layer and centralized work management.

Next, they started looking at vendors, and for whatever reason, open source was considered in the mix: quite forward-thinking for a bank. In addition to TIBCO and jBPM, they considered DST’s AWD, IBM’s BPM, eiStream (now Global 360) and K2: a month and a half to review all of the products, then another month and a half doing a more focussed comparison of TIBCO and jBPM.

For process design, jBPM has only a non-visual programmer-centric environment, and has support for BPEL but not (obviously, since it’s not visual) BPMN. It does allow modelling freedom, but that can be a problem with enforcing internal standards. It also has no process simulation. TIBCO, on the other hand, has a visual process modelling environment that supports BPMN, has a near zero-code process design and provides simulation. Point: TIBCO.
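To give a sense of what a non-visual, programmer-centric environment meant in jBPM’s case at the time, here’s a hello-world along the lines of the examples in the jBPM 3 documentation: the process graph is written as jPDL (an XML dialect) and driven entirely from Java code. This is a sketch from the public jBPM 3 API of the day, so treat the details as approximate rather than as a tested example.

    import org.jbpm.graph.def.ProcessDefinition;
    import org.jbpm.graph.exe.ProcessInstance;
    import org.jbpm.graph.exe.Token;

    // jBPM 3-style "hello world": the process definition is an XML string (jPDL)
    // and execution is driven by signalling the process token from code, which is
    // what "non-visual, programmer-centric" amounted to in this comparison.
    public class JbpmHelloWorld {
        public static void main(String[] args) {
            ProcessDefinition processDefinition = ProcessDefinition.parseXmlString(
                "<process-definition name='hello'>" +
                "  <start-state>" +
                "    <transition to='waiting' />" +
                "  </start-state>" +
                "  <state name='waiting'>" +
                "    <transition to='end' />" +
                "  </state>" +
                "  <end-state name='end' />" +
                "</process-definition>");

            ProcessInstance processInstance = new ProcessInstance(processDefinition);
            Token token = processInstance.getRootToken();

            token.signal(); // leave the start state
            System.out.println("Now in node: " + token.getNode().getName()); // "waiting"

            token.signal(); // move on to the end state
            System.out.println("Process ended: " + processInstance.hasEnded());
        }
    }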

On the integration side, jBPM has no graphical application integration environment, although it has useful integration objects and methods and excellent component-based design. Adapters are available but not easily reused, and there are no out-of-the-box communication or integration facilities. TIBCO has a graphical front-end for application integration, and a lot of adapters and integration facilities. Point: TIBCO.

On the UI side, jBPM has only a rudimentary web-based end user environment, whereas TIBCO has the full GI arsenal at their disposal. Point: TIBCO.

Reporting and analytics: jBPM has nothing, TIBCO has iProcess Analytics and (now) iProcess Insight.

Support: don’t even go there, although jBoss wins on price. 🙂

Overall, they found that the costs would be about the same (because of the greater jBPM customization requirement), but a much longer time to deploy with jBPM, which led them to choose TIBCO.

Given what they found, I find it amazing that they spent three months looking at jBPM, since jBPM is, in its raw form, a developer tool whereas TIBCO spans a broader range of analyst and developer functionality. The results as presented are so biased in favour of TIBCO that it should have been obvious long before any formal evaluation was done that jBPM wasn’t suited for their particular purposes and should not have made their short list; likely, open source was someone’s pet idea and was thrown into the mix on a lark. Possibly an open source BPM solution like Intalio, which wasn’t available as open source at the time of their evaluation, would have been a much better fit for their needs if they were really dedicated to open source ideals. I’m pretty sure that anyone in the room who had not considered open source in the past would run screaming away from it in the future.

Getting past the blatant TIBCO plug masquerading as a product comparison, Herd went on to show the architecture of their solution, which uses a large number of underlying services managed by a messaging layer to interface with the BPM layer — a fairly standard configuration. They expect to go live later this year.