Category Archives: cloud

bpmNEXT 2014 Wednesday Morning: Cloud, Synthetic APIs and Models

I’m not going to say anything about last night, but it’s a bit of a subdued crowd here this morning at bpmNEXT.

We started the day with Tom Baeyens of Effektif talking about cloud workflow simplified. I reviewed Effektif in January at the time of launch and liked the simple and accessible capabilities that it offers; Tom’s premise is that BPM is just as useful as email, and it needs to be just as simple to use as email so that we are not reliant on a handful of power users inside an organization to make it work. To do this, we need to strip out features rather than add features, and reduce the barriers to trying it out by offering it in the cloud. Inspired by Trello (simple task management) and IFTTT (simple cloud integration, which basically boils down every process to a trigger and an action), Effektif brings personal DIY workflow to the enterprise that also provides a bridge to enterprise process management through layers of functionality. Individual users can get started building their own simple workflows to automate their day-to-day tasks, then more technical resources can add functionality to turn these into fully-integrated business processes. Tom gave a demo of Effektif, starting with creating a checklist of items to be completed, with the ability to add comments, include participants and add attachments to the case. There have been a few changes since my review: you can use Markdown to format comments (I think that understanding of Markdown is very uncommon in business and may not be as well-adopted as, for example, a TinyMCE formatted text field); cases can now be started by a form as well as manually or via email; and Google Drive support is emerging to support usage patterns such as storing an email attachment when the email is used to instantiate the case. He also talked about some roadmap items, such as migrating case instances from one version of a process definition to another.

Next up was Stefan Andreasen of Kapow (now part of Kofax) on automation of manual processes with synthetic APIs – I’m happy for the opportunity to see this because I missed seeing anything about Kapow during my too-short trip to the Kofax Transform conference a couple of weeks ago. He walked through a scenario of a Ferrari sales dealership that looks up SEC filings to see who sold their stock options lately (hence has some ready cash to spend on a Ferrari), narrows that down with Bloomberg data on age, salary and region to find some pre-qualified sales leads, then loads them into Salesforce. Manually, this would be an overwhelming task, but Kapow can create synthetic APIs on top of each of these sites/apps to allow for data extraction and manipulation, then run those on a pre-set schedule. He started with a “Kapplet” (applications for business analysts) that extracts the SEC filing data, allows easy manual filtering by criteria such as filing amount and age, then lets you select records for committing to Salesforce. The idea is that there are data sources out there that people don’t think of as data sources, and many web applications that don’t easily integrate with each other, so people end up manually copying and pasting (or re-keying) information from one screen to another; Kapow provides the modern-day equivalent to screen-scraping that taps into the presentation logic and data (not the physical layout or style, hence less likely to break when the website changes) of any web app to add an API using a visual flow/rules editor. Building by example, elements on a web page are visually tagged as being list items (requiring a loop), data elements to extract, and much more. 
It can automate a number of other things as well: Stefan showed how a local directory of cryptically-named files can be renamed to the actual titles based on a table of contents HTML document; this is very common for conference proceedings, and I have hundreds of file sets like this that I would love to rename. The synthetic APIs are exposed as REST services, and can be bundled into Kapplets so that the functionality is exposed through an application that is usable by non-technical users. Just as Tom Baeyens talked about lowering the barriers for BPM inside enterprises in the previous demo, Kapow is lowering the bar for application integration to service the unmet needs.
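The pattern from the demo – extract records from a site that wasn’t designed as a data source, filter by business criteria, then push the survivors into Salesforce – can be sketched in a few lines. This is purely illustrative: the record shape, field names and filter criteria are invented, not Kapow’s or Salesforce’s actual APIs.

```javascript
// Records as a hypothetical synthetic API wrapping the SEC filings site
// might return them (field names invented for illustration).
const filings = [
  { name: 'A. Marchetti', amountUsd: 2400000, age: 48, region: 'West' },
  { name: 'B. Osei',      amountUsd: 150000,  age: 61, region: 'East' },
  { name: 'C. Lindqvist', amountUsd: 5100000, age: 39, region: 'West' },
];

// Filter by criteria such as filing amount and age, as in the Kapplet demo.
function prequalify(records, { minAmountUsd, maxAge }) {
  return records.filter(r => r.amountUsd >= minAmountUsd && r.age <= maxAge);
}

// Map the surviving records to the shape a lead-creation endpoint might
// expect (again, the target field names are assumptions).
function toSalesforceLeads(records) {
  return records.map(r => ({ LastName: r.name, Description: `Filing: $${r.amountUsd}` }));
}

const leads = toSalesforceLeads(prequalify(filings, { minAmountUsd: 1000000, maxAge: 50 }));
```

In the real product the extraction step is built by example against the live site rather than coded, but the downstream filter-and-commit flow behaves much like this.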

It would be great if Tom and Stefan put their heads together at lunch and whipped up an Effektif-Kapow demo; it seems like a natural fit.

Next was Scott Menter of BP Logix on a successor to flowcharts, namely their Process Director GANTT chart-style process interface – he said that he felt like he was talking about German Shepherds to a conference of cat-lovers – as a different way to represent processes that is less complex to build and modify than a flow diagram, and also provides better information on the temporal aspects and dependencies such as when a process will complete and the impacts of delays. Rather than a “successor” model such as a flow chart, which models what comes after what, a GANTT chart is a “predecessor” model, which models the preconditions for each task: a subtle but important difference when the temporal dependencies are critical. Although you could map between the two model types on some level, BP Logix has a completely different model designer and execution engine, optimized for a process timeline. One cool thing about it is that it incorporates past experience: the time required to do a task in the past is overlaid on the process timeline, and predictions made for how well this process is doing based on current instance performance and past performance, including tasks that are far in the future. In other words, predictive analytics are baked right into each process since it is a temporal model, not an add-on such as you would have in a process based on a flow model.
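The predecessor model makes temporal predictions almost free: if every task lists its preconditions, projected completion dates fall out of a simple forward pass over the dependency graph. A minimal sketch of that idea (task names and durations are invented; this is not BP Logix’s engine):

```javascript
// A "predecessor" model: each task declares its preconditions rather than
// its successors, which is what makes timeline math natural.
const tasks = {
  gather:  { durationDays: 2, predecessors: [] },
  review:  { durationDays: 3, predecessors: ['gather'] },
  approve: { durationDays: 1, predecessors: ['review'] },
  notify:  { durationDays: 1, predecessors: ['gather'] },
  close:   { durationDays: 1, predecessors: ['approve', 'notify'] },
};

// Earliest finish of a task = its duration plus the latest finish among
// its predecessors (memoized recursion; assumes the graph is acyclic).
function earliestFinish(name, memo = {}) {
  if (memo[name] !== undefined) return memo[name];
  const t = tasks[name];
  const ready = Math.max(0, ...t.predecessors.map(p => earliestFinish(p, memo)));
  return (memo[name] = ready + t.durationDays);
}

const projectedEnd = earliestFinish('close');  // days from process start
```

Swap the static durations for observed historical durations and recompute as each instance runs, and you get the kind of live completion prediction described above.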

For the last demo of this session, Jean-Loup Comeliau of W4 presented their BPMN+ product, which provides model-driven development using BPMN 2, UML 2, CMIS and other standards to generate web-based process applications without generating code: the engine interprets and executes the models directly. The BPMN modeling is pretty standard compared to other process modeling tools, but they also allow UML modeling of the data objects within the process model; I see this in more complete stack tools such as TIBCO’s, but this is less common from the smaller BPM vendors. Resources can be assigned to user tasks using various rules, and user interface forms are generated based on the activities and data models, and can be modified if required. The entire application is deployed as a web application. The data-centricity is key, since if the models change, the interface and application will automatically update to match. There is definitely a strong message here on the role of standards, and how we need more than just BPMN if we’re going to have fully model-driven application development.

We’re taking a break, and will be back for the Model Interchange Working Group demonstration with participants from around the world.

Effektif: Simple BPM In The Cloud

Ten months ago, Tom Baeyens (creator of jBPM and Activiti) briefed me on a new project that he was working on: Effektif, a cloud-based BPM service that seeks to bridge the gap between simple collaborative task lists and complex IT-driven BPMS. In October, he gave me a demo on the private beta version, with some discussion of what was coming up, and last week he demonstrated the public version that was launched today. With Cabaret-inspired graphics on the landing page and a name spelling that could only have been dreamed up by a Belgian influenced by Germans ;-), the site has a minimalistic classiness but packs a lot of functionality in this first version.

We talked about his design inspirations: IFTTT and Zapier, which handle data mappings transparently and perform the simplest form of integration workflow; Box and Dropbox, which provide easy content sharing; Trello and Asana, which enable micro-collaboration around individual tasks; and Wufoo, which allows anyone to build online forms. As IFTTT has demonstrated, smaller-grained services and APIs are available from a number of cloud services to more easily enable integration. If you bring together ideas about workflow, ad hoc tasks, collaboration, content, forms and integration, you have the core of a BPMS; if you’re inspired by innovative startups that specialize in each of those, you have the foundation for a new generation of cloud BPM. All of this with a relatively small seed investment by Signavio and a very lean development team.
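The IFTTT-style model mentioned above reduces every integration to a trigger and an action. The whole pattern fits in a few lines, which is exactly why it lowers the barrier so far; this sketch uses invented function and event names, not any product’s actual API:

```javascript
// Registry of trigger/action rules: the entire "workflow" vocabulary of
// an IFTTT-style integration service.
const rules = [];
function whenThen(trigger, action) { rules.push({ trigger, action }); }

// Dispatch an incoming event to every rule whose trigger matches.
function fireEvent(event) {
  const results = [];
  for (const { trigger, action } of rules) {
    if (trigger(event)) results.push(action(event));
  }
  return results;
}

// "When an email with an attachment arrives, store the attachment" -- the
// kind of usage pattern described for Effektif's email trigger.
whenThen(
  e => e.type === 'email' && e.attachments.length > 0,
  e => `stored ${e.attachments[0]}`
);

const out = fireEvent({ type: 'email', attachments: ['invoice.pdf'] });
```

Effektif’s contribution is letting that trigger start a multi-step process with human tasks, rather than a single action.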

One design goal of Effektif is based on a 5-minute promise: users should be able to have a successful experience within 5 minutes. This is achievable, considering that the simplest thing that you can do in Effektif is create a shared task list, which is no more complex than typing in the steps and (optionally) adding participants to the list or individual tasks. However, rather than competing with popular shared task list services such as Trello and Asana, Effektif allows you to take that task list and grow it into something much more powerful: a reusable process template with BPMN flow control, multiple start event types, and automated script tasks that allow integration with common cloud services. Non-technical users who just want to create and reuse task lists never need to go beyond that paradigm or see a single BPMN diagram, since the functionality is revealed as you move from tasks to processes, but technical people can create more complex flows and add automated tasks.

Within the Effektif interface, there are two main tabs: Tasks and Processes. Tasks is for one-off collaborative task lists, whereas Processes allows you to create a process, which may be a reusable task list or a full BPMN model.

Within Tasks:

  • The Tasks interface is a simple list of tasks, with a default filter of “Assigned to me”. The user can also select “I’m a candidate”, “Unassigned” or “Assigned to others” as task filters.
  • Each task is assigned to the creator by default, but can be assigned to another user or have other users added as participants, which will cause the task to appear on their task lists.
  • Each task can have a description, and can have documents attached to it at any point by any participant, either through uploading or via URL. Since any URL can be added, this doesn’t have to be a “document” per se, but any link or reference. Eventually, there will be direct integration with Google Drive and Box for attachments, but for the next month or two, you have to copy and paste the URL. Although you can upload documents as attachments, this really isn’t meant as a document repository, and the intention is that most documents will reside in an external ECM (cloud or on-premise).
  • Each task can have subtasks, created by any participants; each of those subtasks is the same as a task, that is, it can have a description, documents and subtasks, but is nested as part of the parent task.
  • Any participant can add comments to a task or subtask, which appear in the activity stream alongside the task list but only in context: that is, a comment added to a subtask will only appear when that subtask is selected. Other actions, such as task creation and completion, are also shown in the activity stream.
  • When the subtask assignee checks Done to complete the subtask, they are prompted with the remaining subtasks in that task that are assigned to them. This does not happen when completing a top-level task, which seemed a bit inconsistent, but I probably need to play around with this functionality a bit more. In looking at how process instances are handled, likely a task is executed as a process instance with its subtasks as activities within that instance, but that distinction probably isn’t clear to (or cared about by) a non-technical user.

Within Processes, the basic process creation looks very much like creating a task list in Tasks, except that you’re creating a reusable process template rather than a one-off task list. In its simplest form, a process is defined as a set of tasks, and a process instance is executed in the same way as a task with the process activities as subtasks. When defining a new process:

  • Each process has a name. By default, instances of this process will use the same name followed by a unique number.
  • Each process has a trigger, either manually in Effektif using the Start Process button, or by email to a unique email address generated for that process template.
  • The activities in the process are initially defined as a task list, where each is either a User Task or Script Task.
  • Each user task can have a description and be assigned to a user, similar to in the Tasks tab, but can also have a form created for that activity that includes text fields, checkboxes and drop-down selection lists. A key functionality with forms is that defining the form fields at any activity within a process creates process instance variables that can be reused at other activities in the process, including within scripts. In other words, you create the process data model implicitly by designing the UI form.
  • Each script task allows you to write JavaScript code that will be executed in a secure NodeJS environment. Some samples are provided, plus field mapping for mapping instance variables to JavaScript variables, and an inline test environment.
  • Optionally, the activities can be viewed as a BPMN process flow using an embedded, simplified version of the Signavio modeler: the list of tasks is just converted to process activities, and you can then draw connectors between them to define serial logic. XOR gateways can also be added, which automatically adds buttons to the previous activity to select the outbound pathway from the gateway. You can switch between the Activities (task list) and Process Flow (BPMN) views, creating tasks in either view, although I was able to cause some weird behaviors by doing that – my Secret Superpower is breaking other people’s code.
  • The process is published in order to allow process instances to be started from it.
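The interplay between form fields and script tasks is worth spelling out: form fields defined on any activity implicitly become process instance variables, which a script task can then read and extend. A rough sketch of how such a script task might behave – the variable names and the mapping mechanism here are invented for illustration, not Effektif’s actual API:

```javascript
// Instance variables created implicitly by form fields on earlier
// activities (names invented for illustration).
const instanceVariables = { customerName: 'Acme GmbH', orderTotal: 1800 };

// Field mapping: expose a copy of selected instance variables to the
// script, then merge whatever the script returns back into the instance.
function runScriptTask(vars, script) {
  const updates = script({ ...vars });   // script sees a copy, not the originals
  return { ...vars, ...updates };        // merged back into the process instance
}

// A script task that derives a new variable from a form-supplied one.
const after = runScriptTask(instanceVariables, ({ orderTotal }) => ({
  needsApproval: orderTotal > 1000,
}));
```

A later user task (or a gateway condition) could then use `needsApproval` without the form designer ever having declared a data model explicitly.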

To create a simple reusable task list template, you just give it a name, enter the activities as a list, and publish. If you want to enhance it later with triggers, forms and script tasks, you can come back and edit it later, and republish.

When running a process instance:

  • The process is started either by an email or manual trigger, which then creates a task in the assigned user’s task list for the process instance, containing the activities as subtasks. If no process flow was defined, then all activities appear as subtasks; if a flow was defined, then only the next available one is visible.
  • As with the ad hoc tasks, participants can create new subtasks for this process instance or its activities at execution time.
  • If gateways were added, then buttons will appear at the step prior to the gateway prompting which path to follow out of the gateway. I’m not sure what happens if the step prior is a script task, e.g., a call to a rules engine to provide the flow logic.
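On that open question about a script task feeding a gateway: one plausible design is that the script writes an instance variable and each outbound path carries a condition on it, so no button is needed. This is a guess at the mechanism, sketched with invented names, not Effektif’s documented behavior:

```javascript
// Hypothetical outbound paths from an XOR gateway, each with a condition
// on the process instance variables (path names invented).
const paths = [
  { to: 'managerReview', when: vars => vars.risk === 'high' },
  { to: 'autoApprove',   when: vars => vars.risk === 'low'  },
];

// Route to the first path whose condition matches; fall back to a default
// path if nothing matches (a common BPMN convention).
function routeFromGateway(vars) {
  const match = paths.find(p => p.when(vars));
  return match ? match.to : 'defaultPath';
}

// A preceding script task (e.g. a rules-engine call) would have set `risk`.
const next = routeFromGateway({ risk: 'high' });
```

With button-driven gateways, the human choice effectively plays the role of the `risk` variable here.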

As I played around with Effektif, the similarities and differences between tasks, processes (templates) and process instances started to become clearer, but that’s definitely not part of the 5-minute promise.

I’m not sure of the roadmap for tenanting within the cloud environment and sharing of information; currently they are using a NoSQL database with shards by tenant to avoid bottlenecks, but it’s not clear how a “tenant” is defined or the scope of shared process templates and instances.

Other things on the roadmap:

  • Importing and exporting process models from the full Signavio modeler, or from other BPMN 2.0-compliant modelers, although only a small subset of activity types are supported: start, end, user task, script task, XOR gateway, plus an implied AND gateway by defining multiple paths out of a task.
  • Additional start event types, e.g., embeddable form, triggers from ECM systems such as triggering a workflow when a document is added to a folder.
  • Google Drive/Box integration for content.
  • Salesforce integration for content and triggers.
  • Common process patterns built in as wizards/templates, allowing users to deploy with simple parameterization (and learn BPMN at the same time).

Effektif is not targeting any particular industry verticals, but is positioned as a company-wide BPM platform for small companies, or as a departmental/team solution for support processes within larger companies. A good example of this is software development: both the Effektif and Signavio teams are using it for managing some aspects of their software development, release and support processes.

There will be three product editions, available directly on the website or (for the Enterprise version) through the Signavio sales force:

  • Collaborate, providing shared task list functionality and document sharing. Free for all users.
  • Team Workflow, adding process flows (BPMN modeling) and connectors to Salesforce.com, Google Drive and a few other common cloud services. The first five users are free, then paid for more than five.
  • Enterprise Process Management, adding advanced integration including with on-premise systems such as SAP and Oracle, plus analytics. That will be a paid offering for all users, and likely significantly more than the Team Workflow edition due to the increased functionality.

I don’t know the final pricing, since the full functionality isn’t there yet: Box, Google Drive and Salesforce integration will be released in the next month or two (currently, you still need to copy and paste the URL of a document or reference into Effektif, and those systems can’t yet automatically trigger a workflow), and the enterprise integration and analytics will be coming later this year.

Go ahead and sign up: it only takes a minute and doesn’t require any information except your name and email address. If you want to hear more about Effektif, they are holding webinars on February 3rd (English) and 6th (German).

Q&A With Vishal Sikka @vsikka

Summary of this morning’s keynote (replay available online within 24 hours):

  • Have seen the “HANA effect” over the past 2.5 years, and see HANA as being not just a commercial success for SAP but a change in the landscape for enterprise customers. A key technology to help people do more.
  • Partnerships with SAS, Amazon, Intel, Cisco, Cloud Foundry.
  • Enterprise cloud and cloud applications.
  • SuccessFactors Learning products.
  • 1000 startup companies developing products on HANA.
  • Team of 3 teenagers using HANA Cloud and Lego to build shelf-stacking robots.

Vishal Sikka keynote

Q&A with audience (in person and online):

  • SAP has always had an excellent technology platform for building applications, used to build their core enterprise applications. HANA is the latest incarnation of that platform, and one that they are now choosing to monetize directly as an application development platform rather than only through the applications. HANA Enterprise Cloud and HANA Cloud Platform are enterprise-strength managed cloud versions of HANA, and HANA One uses AWS for a lower entry point; they’re the same platform as on-premise HANA for cloud or hybrid delivery models. I had a briefing yesterday with Steve Lucas and others from the Platform Solution Group, which covers all of the software tools that can be used to build applications, but not the applications themselves: mobile, analytics, database and technology (middleware), big data, and partners and customers. PSG now generates about half of SAP revenue through a specialist sales force that augments the standard sales force; although obviously selling platforms is more of an IT sell, they are pushing to talk more about the business benefits and verticals that can be built on the platform. In some cases, HANA is being used purely as an application development platform, with little or no data storage.
  • Clarification on HANA Cloud: HANA Enterprise Cloud is the cloud deployment of their business applications, whereas HANA Cloud Platform is the cloud version of HANA for developing applications.
  • SAP is all about innovation and looking forward, not just consolidating their acquisitions.
  • Examples of how SAP is helping their partners to move into their newer innovation solutions: Accenture has a large SuccessFactors practice, for example. I think that the many midrange SIs who have SAP ERP customization as their bread and butter may find it a bit more of a challenge.
  • Mobile has become a de facto part of their work, hence has a lower profile in the keynotes: it is just assumed to be there. I, for one, welcome this: mobile is a platform that needs to be supported, but let’s just get to the point where we don’t need to talk about it any more. Fiori provides mobile and desktop support for the new UI paradigms.

As with the keynote, too much information to capture live. This session was recorded, and will be available online.

SAP TechEd Day 1 Keynote With @vsikka

Vishal Sikka – head technology geek at SAP – started us off at TechEd with a keynote on the theme of how great technology always serves to augment and amplify us. He discussed examples such as the printing press, Nordic skis and the Rosetta Stone, and ended up with HANA (of course) and how a massively parallel, in-memory columnar database with built-in application services provides a platform for empowering people. All of SAP’s business applications – ERP, CRM, procurement, HR and others – are available on or moving to HANA, stripping out the complexity of the underlying databases and infrastructure without changing the business system functionality. The “HANA effect” also allows for new applications to be built on the platform with much less infrastructure work through the use of the application services built into HANA.

He also discussed their Fiori user interface paradigm and platform which can be used to create better UX on top of the existing ERP, CRM, procurement, HR and other business applications that have formed the core of their business. Sikka drew the architecture as he went along, which was a bit of fun:

SAP architecture with HANA and Fiori

He was joined live from Germany by Franz Faerber, who heads up HANA development, who discussed some of the advances in HANA and what is coming next month in version SP7, then Sam Yen joined on stage to demonstrate the HANA developer experience, the Operational Intelligence dashboard that was shown at SAPPHIRE earlier this year as in use at DHL for tracking KPIs in real time, and the HANA Cloud platform developer tools for SuccessFactors. We heard about SAS running on HANA for serious data scientists, HANA on AWS, HANA and Hadoop, and much more.

There’s a lot of information being pushed out in the keynote: even if you’re not here, you can watch the keynotes live (and probably watch them recorded after the fact), and there will be some new information coming out at TechEd in Bangalore in six weeks. The Twitter stream is going by too fast to read, with lots of good insights in there, too.

Bernd Leukert came to the stage to highlight how SAP is running their own systems on HANA, and to talk more about building applications, focusing on Fiori for mobile and desktop user interfaces: not just a beautification of the existing screens, but new UX paradigms. Some of the examples that we saw are very tile-based (think Windows 8), but also things like fact sheets for business objects within SAP enterprise systems. He summed up by stating that HANA is for all types of businesses due to a range of platform offerings; my comments on Hasso Plattner’s keynote from SAPPHIRE earlier this year called it the new mainframe (in a good way). We also heard from Dmitri Krakovsky from the SuccessFactors team, and from Nayaki Nayyar about iFlows for connecting cloud solutions.

TechEd is inherently less sales and more education than their SAPPHIRE conference, but there’s a strong sense of selling the concepts of the new technologies to their existing customer and partner base here. At the heart of it, HANA (including HANA cloud) and Fiori are major technology platform refreshes, and the big question is how difficult – and expensive – it will be for an existing SAP customer to migrate to the new platforms. Many SAP implementations, especially the core business suite ERP, are highly customized; this is not a simple matter of upgrading a product and retraining users on new features: it’s a serious refactoring effort. However, it’s more than just a platform upgrade: having vastly faster business systems can radically change how businesses work, since “reporting” is replaced by near-realtime analytics that provide transparency and responsiveness; it also simplifies life for IT due to footprint reduction, new development paradigms and cloud support.

We finished up 30 minutes late and with my brain exploding from all the information. It will definitely take the next two days to absorb all of this and drill down into my points of interest.

Disclosure: SAP is a customer, and they paid my travel expenses to be at this conference. However, what I write here is my own opinion and I have not been financially compensated for it.

Still More Conference Within A Conference: ARIS World

The irrepressible Joerg Klueckmann, Director of Product Marketing for ARIS, hosted the ARIS World session, third in the sub-conferences that I’ve attended here at Innovation World.

Georg Simon, SVP of Product Marketing, discussed some of the drivers for ARIS 9: involving occasional users in processes through social collaboration, shortening the learning curve with a redesigned UI, modernizing the look and feel of the UI with new colors and shapes, lowering the TCO with centralized user and license management, and speeding content retrieval with visual and ad hoc search capabilities. There are new role-specific UI perspectives, allowing users to decide what capabilities they want to see on their interface (based on what they have been allocated by an administrator). There’s a new flexible metamodel, allowing you to create new object types beyond what is provided in the standard metamodel.

He also briefly mentioned Process Live, which moves this newly re-architected ARIS into the public cloud and went live yesterday, and discussed their plans to release a mobile ARIS framework, allowing some functionality to be exposed on mobile devices: consuming, collaborating and designing on tablets, and approvals on smartphones as well.

Their recent acquisition, Alfabet, is being integrated with ARIS so that its repository of IT systems can be synchronized with the ARIS process repository for a more complete enterprise architecture view. This allows for handoffs in the UI between activities in an ARIS process model and systems in an Alfabet object model, with direct navigation between them.

Klueckmann gave us a demo of Process Live and how it provides social process improvement in the cloud. This is hardly a market leader – cloud-based process discovery/modeling collaboration started with Lombardi Blueprint (now IBM’s Blueworks Live) around 2007 – but it is definitely significant that a leading BPA player like ARIS is moving into the cloud. They’re also offering a reasonable price point: about $140/month for designers, and less than $6/month for viewers, which you can buy directly on their site with a credit card – and there’s a one-month free trial available. Contrast this with Blueworks Live, where an editor is $50/month, a contributor (who can comment) is $10/month, and a viewer is $2/month (but has to be purchased in multiples of 1,000): considerably more expensive for the designer, but likely much more functionality since it brings much of the ARIS functionality to the cloud.

Process Live offers three templates for creating new project databases, ranging from a simple one with four model types to the full-on professional one with 74 model types. Process Live doesn’t provide the full functionality of ARIS 9: it lacks direct support from Software AG, instead relying on community support; it is missing a number of advanced modeling and analysis features; and it can’t be customized since it’s a multi-tenant cloud service. You can check out some of their video tutorials for more information on how it works. Data is stored on the Amazon public cloud, which might offer challenges for those who don’t want to include the NSA as a collaborator.

We heard from Fabian Erbach of Litmus Group, a large consulting organization using Process Live with their customers. For them, the cloud aspect is key since it reduces the setup time by eliminating installation and providing pre-defined templates for initiating projects; furthermore, the social aspects promote engagement with business users, especially occasional ones. Since it’s accessible on mobile (although not officially supported), it is becoming ubiquitous rather than just a tool for BPM specialists. The price point and self-provisioning makes it attractive for companies to try it out without having to go through a software purchasing cycle.

ARIS World ended with a panel of three ARIS customers plus audience participation, mostly discussing future features that customers would like to have in ARIS as well as Process Live. This took on the feel of a user group meeting, which offered a great forum for feedback from actual users, although I missed a lot of the nuances since I’m not a regular ARIS user. Key topics included the redesigned ARIS 9 UI, and the distinction between ARIS and Process Live capabilities.

OpenText Acquires Cordys And Assure Platform For Cloud-Based Smart Process Apps

I had a briefing two weeks ago with OpenText about their acquisition of Cordys, an interesting move considering their acquisition just a few weeks before that of ICCM, a partner company that created the Assure platform on top of OpenText’s MBPM (hat tip to Neil Ward-Dutton for tying together these two acquisitions). This provides them with two important pillars in their strategy:

  • A robust cloud platform. Although OpenText Cloud existed, it wasn’t what you’d call industrial strength and it was more of an infrastructure platform: it wasn’t fully multi-tenant, and it’s not clear that they had the skills internally to build out the functionality into a competitive solution-oriented platform as a service (PaaS). Cordys, in spite of being a relative unknown, has a strong cloud and PaaS heritage. They also bring other parts of the infrastructure that are missing or deficient in OpenText – including a rapid application development framework, ESB, rules, analytics and MDM – which should accelerate their time to market for a full cloud offering, as well as benefit the on-premise platform.
  • A “Smart Process Application factory” for creating vertical process-centric applications. OpenText first announced Assure almost a year ago as a development platform, plus their initial Assure HR application built on the platform, and have since released (or at least announced on their product pages) customer service, ITSM and insurance client management apps on Assure as well. Now, presumably, they own all of the underlying intellectual property so are free to port it to other platforms.

They have immediate plans (for release near the beginning of 2014) for bringing things together, at least on the surface: they are porting Assure to Cordys so that the same development platform and vertical applications are available there as on MBPM, which gives them greatly improved cloud functionality, and will create additional smart process applications along the way. Effectively, they are using Assure as a technology abstraction layer: not only will the user experience be the same regardless of the underlying BPM platform, the Assure application developer experience will also be the same across platforms, although obviously there’s completely different technology under the covers.
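A technology abstraction layer of this kind is essentially the classic adapter pattern: the application is written once against a common interface, with an adapter per underlying engine. The engine names below are real, but the interface and method names are invented for illustration; this is not OpenText’s actual API:

```javascript
// One adapter per underlying BPM engine, each exposing the same
// application-facing interface (method name invented for illustration).
class MbpmAdapter {
  startProcess(name) { return `MBPM instance of ${name}`; }
}
class CordysAdapter {
  startProcess(name) { return `Cordys instance of ${name}`; }
}

// The Assure-style application layer is written once against the adapter
// interface, so swapping engines doesn't change application code.
function launch(engine, processName) {
  return engine.startProcess(processName);
}

const onMbpm = launch(new MbpmAdapter(), 'Assure HR');
const onCordys = launch(new CordysAdapter(), 'Assure HR');
```

The pattern makes the porting promise credible at the application level, though as noted below, it does nothing to reduce the cost of maintaining two engines underneath.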

There are some obvious issues with this. In spite of the OpenText CEO’s claim that this is a “single platform”, it’s not. OpenText was still working out the Metastorm/Global360 picture from earlier acquisitions, and now has the Cordys platform thrown into the mix. In 2011, when IBM pulled Lombardi, WPS and a few other bits and pieces into IBM BPM, some of us called them out for the fact that although it was presented as a single platform, there were still multiple engines with some degree of overlapping functionality. IBM’s response, and I’m sure OpenText would line up behind this, is that it doesn’t matter as long as it’s integrated “at the glass”, that is, there’s a common user interface for business users and potentially some more technical functions. As someone who has designed and written a lot of code in the past, I see a fundamental flaw with that logic: in general, more engines means more code, which means more maintenance, which means less innovation. Hence, I consider refactoring redundant engines into a single back-end, as well as a common UI, to be critical for growth. Getting Assure in place quickly as a technology abstraction layer will definitely help OpenText along this route, although primarily for existing Assure customers and new customers for whom Assure will be the primary OpenText application development environment; existing MBPM and OpenText customers will likely need to consider porting their applications to the Assure platform at some point in order to get on the main upgrade bandwagon.

Following the Cordys announcement, Gartner released their analysis casting doubt on whether OpenText can bring together the acquisitions into a coherent, unified strategy. Aside from the stunningly obvious point in the summary, “If OpenText’s BPM suites do not fully meet your needs, evaluate other providers” (when is that not true for any vendor/product?), they see that this just makes the OpenText landscape more complex, and go so far as to wave off prospective customers. As one person who passed on this link said: ouch.


Disclosure: OpenText has been a customer in the past for which I have created white papers, the most recent of which was “Thinking Beyond Traditional BPM” (and a related webinar). I was not compensated in any way for my recent briefing nor for writing this blog post.

SAPPHIRENOW Vishal Sikka Keynote – HANA For Speed, Fiori For Usability

Vishal Sikka, who leads technology and innovation at SAP, followed Hasso Plattner onto the keynote stage; I decided to break the post and publish just Plattner’s portion since my commentary was getting a bit long.

Sikka also started his part of the keynote with HANA, and highlighted some customer case studies from their “10,000 Club”, where operations are more than 10,000 times faster when moved to HANA, plus one customer with an operation that runs 1 million times faster on HANA. He talked about how imperatives for innovation are equal parts math and design: it has to be fast, but it also has to solve business problems. HANA provides the speed and some amount of the problem-solving, but really good user experience design has to be part of the equation. To that end, SAP is launching Fiori, a collection of 25 easy-to-use applications for the most common SAP ERP and data warehouse functions, supported on phone, tablet and desktop platforms with a single code base. Although this doesn’t replace the thousands of existing screens, it can likely replace the old screens for many user personas. As part of the development of Fiori, they partnered with Google and optimized the applications for Chrome, which is a pretty bold move. They’ve also introduced a lot of new forms of data visualization, replacing mundane list-style reports with more fluid forms that are more common on specialized data visualization platforms such as Spotfire.

Fiori doesn’t depend on HANA (although you can imagine the potential for HANA analytics with Fiori visualization), but can be purchased directly from the HANA Marketplace. You can find out more about SAP’s UX development, including Fiori, on their user experience community site.

Returning to HANA, and to highlight that HANA is also a platform for non-SAP applications, Sikka showed some of the third-party analytics applications developed by other companies on the HANA platform, including eBay and Adobe. There are over 300 companies developing applications on HANA, many addressing specific vertical industries.

That’s it for me from SAPPHIRE NOW 2013 — there’s a press Q&A with Plattner and Sikka coming up, but I need to head for the airport so I will catch it online. As a reminder, you can see all of the recorded video (as well as some remaining live streams today) from the conference here.

SAPPHIRENOW Hasso Plattner Keynote – Is HANA The New Mainframe (In A Good Way)?

It’s the last day of SAP’s enormous SAPPHIRE NOW 2013 conference here in Orlando, and the day opens with Hasso Plattner, one of the founders of SAP who still holds a role in defining technology strategy. As expected, he starts with HANA and cloud. He got a good laugh from the audience when, talking about how HANA is there to radically speed up some of the very slow bits in SAP’s ERP software such as overnight processing, he stated apologetically: “I had no idea that we had software that took longer than 24 hours to run. You should have sent me an email.” He also discussed cloud architectures, specifically multi-tenancy versus dedicated instances, and said that although many large businesses didn’t want to share instances with anyone else for privacy and competitive reasons, multi-tenancy becomes less important when everything is in memory. They have three different cloud architectures to deal with all scenarios: HANA One on Amazon AWS, which is a fully public multi-tenant cloud currently used by about 600 companies; their own managed cloud, using virtualization to provide a private instance for medium to large companies; and dedicated servers without virtualization in their managed cloud (really a hosted server configuration) for huge companies where the size warrants it.

Much of his keynote was spent rebutting myths about HANA — obviously, SAP has been a bit stung by the press and competitors calling their baby ugly — including the compression factor between how much data is on disk versus in memory at any given time, the relative efficiency of HANA columnar storage over classic relational record storage, support on non-proprietary hardware, continued support of other database platforms for their Business Suite, HANA stability, and the use of HANA for non-SAP applications. I’m not sure that was the right message: it seemed very defensive rather than talking about the future of SAP technology, although maybe the standard SAP user sitting in the audience needed to hear this directly from Plattner. He did end with some words on how customers can move forward: even if they don’t want to change database or platform, moving to the current version of the suite will provide some performance and functionality improvements, while putting them in the position to move to Business Suite on HANA (either on-premise or on the Enterprise Cloud) in the future for a much bigger performance boost.

HANA is more than just a database: it’s a database, application server, analytics and portals bundled together for greater performance. It’s like the new mainframe, except running on industry-standard x86-based hardware, and in-memory, so lacking the lengthy batch operations that we associate with old-school mainframe applications. It’s OLTP and OLAP all in one, so there’s no separation between operational data stores and data warehouses. As long as all of the platform components remain (relatively) innovative, this is great, for the same reason that mainframes were great in their day. HANA provides a great degree of openness, allowing code written in Java and a number of other common languages to be deployed in a JVM environment and use HANA as just a database and application server, but the real differentiating benefits will come from using the HANA-specific analytics and other functionality. Therein lies the risk: if SAP can keep HANA innovative, then it will be a great platform for application development; if they harken to their somewhat conservative roots and the innovations are slow to roll out, HANA developers will become frustrated, and less likely to create applications that fully exploit (and therefore depend upon) the HANA platform.

SAP HANA Enterprise Cloud

Ingrid Van Den Hoogen and Kevin Ichhpurani gave a press briefing on what’s coming for HANA Enterprise Cloud following the launch last week. Now that the cloud offering is available, existing customers can move any of their HANA-based applications — Business Suite, CRM, Business Warehouse, and custom applications — to the cloud platform. There’s also a gateway that allows interaction between the cloud-based applications and other applications left on premise. Customers can bring their own HANA licences, and use SAP services to onboard and migrate their existing systems to the cloud.

HANA Enterprise Cloud is the enterprise-strength, managed cloud version of HANA: there’s also HANA One, which uses the Amazon public cloud for a lower-end entry point at $0.99/hour and a maximum of 30GB of data. Combined with HANA on premise (using gear from a certified hardware partner) and hosting partners’ OEM versions of HANA cloud that they repackage and run in their own environments (e.g., IBM or telcos), this provides a range of HANA deployment options. HANA functionality is the same whether on AWS, on premise or on SAP’s managed cloud; moving between environments (such as moving an application from development/test on HANA One to production on HANA Enterprise Cloud) is a simple “lift and shift”: export from one environment and import into the target environment. The CIO of Florida Crystals was in the audience to talk about their experience moving to HANA in the cloud; they moved their SAP ERP environment from an outsourced data center to HANA Enterprise Cloud in 180 hours (that’s the migration time, not the assessment and planning time).

SAP is in the process of baking some of the HANA extensions into the base HANA platform; currently, there’s some amount of confusion about what “HANA” will actually provide in the future, although I’m sure that we’ll hear more about this as the updates are released.

Effektif BPM In The Cloud

No, that’s not a typo in the title: Effektif is the newest cloud BPM offering, and it has some pretty impressive BPM credentials: the newly formed company is led by Tom Baeyens, who created not one, but two open source BPM projects: jBPM (now part of Red Hat/JBoss) and Activiti (now part of Alfresco). It’s funded (and partially staffed) by Signavio, creators of a popular process modeling tool; Signavio is headed up by Gero Decker, who has a strong process modeling background, including participation in the BPMN standard. I had a chance to chat last week with Tom and Gero about the market needs that led them to start Effektif, and what it’s going to look like. On Monday this week they launched an information-only Effektif website, and they expect an initial rollout of the prototype product by early summer.

The aim of Effektif is to provide a simpler cloud BPM platform than what is currently available, at a lower price point to reduce the barriers for smaller organizations and non-technical users. It will be built entirely for the cloud, as simple and streamlined as possible but with more complex functionality available when needed. We talked a lot about that aspect – having a less technical, yet still complete, set of functionality for business people to create their own processes, and more technical perspectives for other personas and uses – and agreed that many existing products don’t do a good job of segmenting into usable layers by complexity or technical ability: typically, there are some required functions that are just too technical for business users, even if the product is billed as business-friendly.

They also want to include checklist paradigms for simple task management and case management, some of which can be created and configured on the fly, plus collaboration around the content on checklist forms. I haven’t seen a demo yet, but they described three levels of functionality:

  1. The first level will provide a non-technical way to coordinate people using forms-based processes for tasks with email notification. The idea is to allow these to be implemented in five minutes by a non-technical user, but provide a bit more functionality than simple online task lists like Do or Asana since they will have processes built into them, even if the user doesn’t think of them as processes. Tom compared this level with IFTTT, in that it will have simple business paradigms that create operational processes; I’m looking forward to seeing what this looks like, since this is key to adoption in the consumer cloud.
  2. The second level will provide integration to pre-built services such as Google Apps and Salesforce, with Salesforce likely as the first integration target since those users already have the right mindset about cloud services. This will be a bit more technical, but will still require no coding.
  3. The third level will allow custom code to be developed and run inside a process step, more like the functionality that we see in the process portion of most full BPMS.
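To make the first level’s trigger-action idea concrete, here is a minimal sketch (my own illustration, not Effektif’s actual API; the `Workflow` and `Task` names are hypothetical) of how a simple forms-based checklist can carry a process underneath:

```python
# Hypothetical sketch of a trigger-action checklist process.
# This is NOT Effektif's API -- just an illustration of how a
# non-technical checklist can hide an ordered workflow beneath it.

from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    assignee: str
    done: bool = False

@dataclass
class Workflow:
    trigger: str                       # e.g. "form_submitted" or "email_received"
    tasks: list = field(default_factory=list)

    def fire(self, event: str):
        """Start the workflow when its trigger event occurs."""
        if event != self.trigger:
            return None
        # In a real system each task would generate an email notification;
        # here we just return the ordered to-do list for the participants.
        return [t.name for t in self.tasks]

onboarding = Workflow(
    trigger="form_submitted",
    tasks=[Task("Collect documents", "hr"),
           Task("Set up accounts", "it"),
           Task("Schedule orientation", "manager")],
)

print(onboarding.fire("form_submitted"))
```

The point is that the user only ever sees a checklist, while the system sees a triggered, ordered process that more technical users could later extend with service integrations (level two) or custom code (level three) without switching products.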

The target market is not the same as the heavyweight BPMS being used to create enterprise applications; rather, this is aimed at “end-user computing” that is currently the domain of Excel spreadsheets and Access databases. However, with the planned administration capabilities, this is not just a platform for the business to develop and provision their own applications, but also to allow for IT support and governance. That’s a delicate balance to strike, and one that is often not done very well (think of the rampant SharePoint virus in many organizations), but it is essential to provide some degree of governance before anyone would allow Effektif to be used for anything beyond simple processes, namely, the second and third levels discussed above.

Key to this transition between levels is that even the simple first level task management will be based on processes, even if hidden to the user, which provides the potential to expand to more complex functionality without changing products: a continuum of functionality from a simple to-do list to a full business process. Although some BPMS vendors cover the range of functionality, they often split this up between two or more products, or expose too much of the technical underpinnings at the simplest level.

We spent quite a bit of time discussing the process modeling paradigm throughout the levels:

  • Starts as simple sequential modeling
    • Value chain visualization for steps
    • Bullet list of items within steps
    • Model individual tasks as (presumably sequential) BPMN activities
    • Checklist within tasks for to do items: modeled as form data
  • Guide towards BPMN for more complex implementations, since that is what’s underneath the simpler model view
  • For less-rigid ACM functionality/flexibility:
    • Allow adding new items to task list on the fly; switch to form editing view
    • Allow deviation from downstream process steps in current instance or all new instances
    • Activity feed shows what is happening, allows social interaction/collaboration
    • Can deactivate flexibility for more rigid steps/processes, but available by default

As expected, everything is logged in the audit trail, regardless of whether pre-defined or on the fly.

One interesting feature for runtime collaboration will be the ability to allow simultaneous access for multiple users to form-based checklists, so that a single form could be filled out and checked off by multiple participants. I’m not sure what the controls will be for ensuring that these collaborating users don’t overwrite each other’s changes, but presumably there will be some mechanism.
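For what it’s worth, one common mechanism for this kind of control (purely my speculation, not anything Tom or Gero described) is optimistic locking: each form field carries a version number, and a write based on a stale read is rejected rather than silently applied. A minimal sketch:

```python
# Speculative sketch of optimistic locking on a shared checklist form.
# Not a described Effektif mechanism -- just one common way to keep
# collaborators from silently overwriting each other's changes.

class ConflictError(Exception):
    pass

class SharedForm:
    def __init__(self):
        self.fields = {}   # field name -> (value, version)

    def read(self, name):
        """Return (value, version); unset fields start at version 0."""
        return self.fields.get(name, (None, 0))

    def write(self, name, value, expected_version):
        """Apply a change only if the field hasn't moved since it was read."""
        _, current = self.read(name)
        if expected_version != current:
            raise ConflictError(f"{name} was changed by another user")
        self.fields[name] = (value, current + 1)

form = SharedForm()
_, v = form.read("approved")
form.write("approved", True, v)          # first writer succeeds
try:
    form.write("approved", False, v)     # second writer holds a stale version
except ConflictError:
    print("conflict detected")
```

A real implementation would surface the conflict in the UI and let the second user re-read the field and retry, rather than just raising an error.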

Mobile apps are not planned for the first half-year after release, but are on the radar. The site is built with HTML5 and JavaScript, which will allow for mobile websites even before native apps are developed.

Pricing, although likely to be based on the number of activity (process) instances, is still to be determined. There may be a free option, although the idea is that the cost is low enough that there is no financial barrier to adoption. Hosting will likely be in both central Europe and the US, and you can be sure that I reminded them about the issues with data stored on US servers, particularly data owned by non-US companies, and how that would be a complete non-starter for most non-US companies.

I’ve had a lot of interest in what Tom and Gero have been doing separately over the past years; now, I’m very interested to see what they can do together. Looking forward to my first demo of Effektif.