New Theme for Column2.com

I’ve been looking for a new theme for this blog that has more functionality built in, especially a more responsive feel on mobile devices. With the latest upgrade to WordPress 3.6 and the release of the Twenty Thirteen theme, I thought that I’d give it a try. I’ve left the sidebar pretty much the same, although I removed the Pages and Search items because they are in the theme’s title bar, and added links to my online social presence. I added a custom header image that matches my business cards. The default location for widgets with this theme is at the bottom, but I need to think about whether I want to exclude any content before I move to that format. There are now nested comments, gravatars and the ability to log in using your social media accounts in order to post a comment.

Let me know if there are any strange behaviors on any platforms. I’ve tested it on Chrome and IE (newer versions only) on Windows, Chrome and Safari on iOS, and Chrome and Firefox on Android; so far, so good. Note that the mobile theme in Jetpack is disabled; otherwise, the header was a bit wonky. The header is a bit large on an iPhone but looks okay on the desktop and tablet versions, and the header fonts are a bit big on the desktop; I’ll continue to do some tuning. And hopefully I’ll start adding some meaningful content again soon.

Activiti Update 2013: New Functionality And New Partners

I had a briefing on the latest version of Alfresco’s Activiti BPM a couple of months back, but decided to wait until the news about their new partners – BP3 and Edorasware – was released before I posted. This strong showing of enterprise support partners is crucial for them following the defection of camunda from the Activiti fold, since many large enterprises won’t deploy an open source product without some level of support from the open source vendor directly or via their partner channel.

Alfresco’s interest in Activiti is as a part of their open source enterprise content management suite: they don’t offer Activiti as a standalone commercial open source product, only bundled within their ECM. Activiti exists as an Apache-licensed open source project with about 1/3 of its main developers – likely representing more than 1/3 of the actual development effort – being Alfresco employees, making Alfresco the main project sponsor. Obviously, Alfresco’s document-centric interests are going to be represented within the Activiti project, but that doesn’t make it unsuitable as a general-purpose BPMS; rather, Alfresco makes use of the BPM platform functionality for document flow and tasks, but doesn’t force content concepts into Activiti, nor does using Activiti require Alfresco in any way. Activiti is continuing to develop functionality that has nothing to do with ECM, such as integration with Mule ESB.

Activiti was one of the first BPMS platforms to execute BPMN 2.0 natively, and provides full support for the standard. It’s not a “zero-code” approach, but intended as a developer tool for adding high-performance, small-footprint BPM functionality to applications. You can read more about full Activiti functionality on the main project site and some nuances of usage on the blog of core developer Joram Barrez; in this post, I just want to cover the new functionality that I saw in this briefing.
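
To make the “developer tool” point concrete, here’s a minimal sketch of what embedding the Activiti 5.x engine in a plain Java application looks like; the BPMN file name and process definition key are hypothetical placeholders.

```java
import java.util.List;

import org.activiti.engine.ProcessEngine;
import org.activiti.engine.ProcessEngineConfiguration;
import org.activiti.engine.task.Task;

public class EmbeddedActivitiSketch {
    public static void main(String[] args) {
        // Build a standalone engine backed by an in-memory H2 database --
        // no separate server installation required
        ProcessEngine engine = ProcessEngineConfiguration
                .createStandaloneInMemProcessEngineConfiguration()
                .buildProcessEngine();

        // Deploy a BPMN 2.0 definition from the classpath
        // ("orderProcess.bpmn20.xml" is a hypothetical file)
        engine.getRepositoryService().createDeployment()
                .addClasspathResource("orderProcess.bpmn20.xml")
                .deploy();

        // Start an instance by its process definition key
        engine.getRuntimeService().startProcessInstanceByKey("orderProcess");

        // Query the user tasks now waiting for human input
        List<Task> tasks = engine.getTaskService().createTaskQuery().list();
        for (Task task : tasks) {
            System.out.println("Waiting task: " + task.getName());
        }

        engine.close();
    }
}
```

That small footprint — a few JARs plus a database — is a large part of the appeal for embedding BPM in an application rather than standing up a separate BPM server.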

Activiti BPM 5.12 ad hoc task collaboration

Like all of the other BPMS out there, Activiti is jumping on the ad hoc collaborative task bandwagon, allowing any user to create a task on the fly, add participants to the task and transfer ownership of the task to another participant. The task definition can include a due date and priority, and have subtasks and attached content. Events for the task are shown in an activity feed sidebar, including an audit trail of actions such as adding people or content to the task, plus the ability to post a comment directly into the activity feed. The Activiti Explorer UI shows tasks that you create in the My Tasks tab of the Tasks page, although they do not appear in the Inbox tab unless (I think) the task is actually assigned to you. If someone includes you as a participant (“involves” you) in a task, then it shows in the Involved tab. This is pretty basic case management functionality, but provides quite a bit of utility, at least in part because of the ability to post directly to the activity feed: instead of having to build data structures specific to the task, you can just post any information in the feed as a running comments section. It’s mostly unconstrained, but at least it’s in a collaborative environment.
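
For the developer-minded, ad hoc tasks go through the same TaskService API as process tasks; here’s a rough sketch, assuming an already-built engine (see the earlier snippet) and hypothetical user IDs, of creating a task, involving a participant and posting to its feed.

```java
import java.util.Date;

import org.activiti.engine.ProcessEngine;
import org.activiti.engine.TaskService;
import org.activiti.engine.task.IdentityLinkType;
import org.activiti.engine.task.Task;

public class AdHocTaskSketch {
    public static void createAdHocTask(ProcessEngine engine) {
        TaskService taskService = engine.getTaskService();

        // Create a standalone task -- not tied to any process definition
        Task task = taskService.newTask();
        task.setName("Review Q2 contract");
        task.setOwner("kermit");     // the creator retains ownership...
        task.setAssignee("fozzie");  // ...while the work is assigned to someone else
        task.setPriority(50);
        task.setDueDate(new Date());
        taskService.saveTask(task);

        // Involve another user as a participant, which is what makes the
        // task show up in their Involved tab in Activiti Explorer
        taskService.addUserIdentityLink(task.getId(), "gonzo",
                IdentityLinkType.PARTICIPANT);

        // Post a comment directly into the task's activity feed
        taskService.addComment(task.getId(), null,
                "Draft attached; please review section 3 before Friday.");

        // Attach content to the task by URL
        taskService.createAttachment("url", task.getId(), null,
                "Contract draft", "Q2 contract PDF",
                "http://example.com/contracts/q2-draft.pdf");
    }
}
```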

Activiti BPM 5.12 table-driven process definition

The other big new thing is a table-driven process definition as an alternative to the full BPMN modeler, providing a simpler modeling interface for business users to create models without having to know BPMN, or for fast process outlining. This allows you to create a process definition, then add any number of tasks, the order of which implies the sequence flow. Each task has a name, assignee, group (which I believe is a role rather than a direct assignment to a person) and description; you can also set a task to start concurrently with the previous task, which implies a parallel branch in the flow. Optionally, you can define the form that will be displayed for each task by adding a list of the properties to display, including name, type and whether each is mandatory; this implicitly defines the process instance variables. The value of these properties can then be referenced in the description or other fields using a simple ${PropertyName} syntax. You can preview the BPMN diagram at any time, although you can’t edit in diagram mode. You can deploy and run the process in the Activiti Explorer environment; each task in the process will show up in the Queued tab of the Tasks page if not assigned, or in the Inbox tab if assigned to you. The same task interface as seen in the ad hoc task creation is shown at each step, with the addition of the properties fields if a form was defined for the task. The progress of the process instance can be viewed against the model diagram or in tabular form. Indeed, for very simple processes without a lot of UI requirements, an entire process could be defined and deployed this way by a non-technical user within the Explorer. Typically, however, this will be used by business people to prototype a process or create a starting point; the model will then make a one-way trip into the Eclipse modeling environment (or, since it can be exported as BPMN, into any other BPMN-compliant tool) for the developers to complete the process application. Once the simple table-driven process is moved over to the Eclipse-based Activiti Modeler, it can be enhanced with BPMN attributes that can’t be represented in the table-driven definition, such as events and subprocesses.
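
As a rough illustration of how those implicit variables behave at runtime, here’s a sketch assuming a table-driven definition deployed with a hypothetical key “orderReview” and a first-step form property “orderId”: the property value is supplied when the instance starts, and a ${orderId} expression in the task description resolves against it.

```java
import java.util.HashMap;
import java.util.Map;

import org.activiti.engine.ProcessEngine;
import org.activiti.engine.task.Task;

public class FormPropertySketch {
    public static void runWithVariables(ProcessEngine engine) {
        // The form properties defined in the table become
        // process instance variables when the instance starts
        Map<String, Object> variables = new HashMap<String, Object>();
        variables.put("orderId", "PO-2013-0042");

        engine.getRuntimeService()
              .startProcessInstanceByKey("orderReview", variables);

        // A description authored as "Check order ${orderId}" in the
        // table editor is resolved against the instance variables
        Task task = engine.getTaskService().createTaskQuery()
                .processDefinitionKey("orderReview")
                .singleResult();
        System.out.println(task.getDescription()); // "Check order PO-2013-0042"
    }
}
```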

There were a few other things, such as enhanced process definition and instance management functions, including the ability to suspend a process definition (and optionally, all instances based on that definition) either immediately or at a scheduled time in the future; some end-user reporting with configurable parameters; and integration of an SMS notification functionality that sent me a text telling me that my order for 2 iPads was shipped. Sadly, the iPads never arrived. 😉

We finished with a brief description of their roadmap for the future:

  • Hybrid workflow that allows on-premise and cloud execution (including instant deployment on CloudBees) for different tasks in the same flow, solving the issue of exposing part of a process to external participants without putting the entire process off premise.
  • Project KickStart, which builds on the table-driven process definition that I saw in the demo to provide better UI form display (making it a real contender as a runtime environment, rather than just a prototyping tool) and the ability to make changes to the process definition on the fly.
  • Polyglot BPM, allowing Activiti to be called from other (non-Java) languages via an expanded REST API and language-specific libraries for Ruby, C#, JavaScript and others.

It’s great to see Activiti continue to innovate after so much change (losing both the original product architect and their main partner) within a short period of time; it certainly speaks to their resiliency as an organization, as you would expect from a robust open source project.

Activiti April 2013 

I also talked with Scott Francis of BP3 about their new Activiti partnership; apparently the agreement was unrelated to the camunda departure, but definitely well-timed. I was curious about their decision to take on another BPM product, given their deep relationship with IBM (and formerly with Lombardi), but they see IBM BPM and Activiti as appealing to different markets due to organizational cultural choices. Certainly to begin with, most of their new Activiti customers will be existing Activiti users looking for an enterprise support partner, just as many of their new IBM BPM customers are already IBM BPM customers; however, I’ve been in a couple of consulting engagements recently where organizations had both commercial and open source solutions under evaluation, so I’m anticipating a bit of channel conflict here. BP3 has no existing Activiti customers (or any customers on a BPM product other than IBM’s), and has no significant open source contribution experience, but plans to contribute to the Activiti open source community, possibly with hybrid/HTML mobile front-ends, REST API architecture and other areas where they have some expertise from building add-ons to IBM BPM. Interestingly, they do not plan to build/certify WAS support for Activiti; although they didn’t see this as a big market, I’m wondering whether this also just cuts a bit too close to the IBM relationship.

Aside from the obvious potential for awkwardness in their IBM relationship, I see a couple of challenges for BP3: first, getting people with the right skills to work on the Activiti projects. Since IBM BPM skills are pretty hard to come by, they won’t be redeploying those people, so they will presumably have to train up other team members or make some new hires. The other challenge is around production support, which is not something that BP3 does a lot of now: typically, IBM would be the main production support for any IBM BPM installation even if BP3 was involved, although BP3 would support their own custom code and may act as triage for IBM’s support. With Activiti, they will have to decide whether they will offer full production support (and if not them, then who?) or just provide developer support during business hours.

SAPPHIRENOW Vishal Sikka Keynote – HANA For Speed, Fiori For Usability

Vishal Sikka, who leads technology and innovation at SAP, followed Hasso Plattner onto the keynote stage; I decided to break the post and publish just Plattner’s portion since my commentary was getting a bit long.

Sikka also started his part of the keynote with HANA, and highlighted some customer case studies from their “10,000 Club”, where operations are more than 10,000 times faster when moved to HANA, plus one customer with an operation that runs 1 million times faster on HANA. He talked about how the imperatives for innovation are equal parts math and design: it has to be fast, but it also has to solve business problems. HANA provides the speed and some amount of the problem-solving, but really good user experience design has to be part of the equation. To that end, SAP is launching Fiori, a collection of 25 easy-to-use applications for the most common SAP ERP and data warehouse functions, supported on phone, tablet and desktop platforms with a single code base. Although this doesn’t replace the thousands of existing screens, it can likely replace the old screens for many user personas. As part of the development of Fiori, they partnered with Google and optimized the applications for Chrome, which is a pretty bold move. They’ve also introduced a lot of new forms of data visualization, replacing mundane list-style reports with more fluid forms that are more common on specialized data visualization platforms such as Spotfire.

SAP Fiori

Fiori doesn’t depend on HANA (although you can imagine the potential for HANA analytics with Fiori visualization), but can be purchased directly from the HANA Marketplace. You can find out more about SAP’s UX development, including Fiori, on their user experience community site.

Returning to HANA, and to highlight that HANA is also a platform for non-SAP applications, Sikka showed some of the third-party analytics applications developed by other companies on the HANA platform, including eBay and Adobe. There are over 300 companies developing applications on HANA, many addressing specific vertical industries.

That’s it for me from SAPPHIRE NOW 2013 — there’s a press Q&A with Plattner and Sikka coming up, but I need to head for the airport so I will catch it online. As a reminder, you can see all of the recorded video (as well as some remaining live streams today) from the conference here.

SAPPHIRENOW Hasso Plattner Keynote – Is HANA The New Mainframe (In A Good Way)?

It’s the last day of SAP’s enormous SAPPHIRE NOW 2013 conference here in Orlando, and the day opens with Hasso Plattner, one of the founders of SAP who still holds a role in defining technology strategy. As expected, he starts with HANA and cloud. He got a good laugh from the audience when saying that HANA is there to radically speed up some of the very slow bits in SAP’s ERP software, such as overnight processes; he stated apologetically, “I had no idea that we had software that took longer than 24 hours to run. You should have sent me an email.” He also discussed cloud architectures, specifically multi-tenancy versus dedicated instances, and said that although many large businesses didn’t want to share instances with anyone else for privacy and competitive reasons, multi-tenancy becomes less important when everything is in memory. They have three different cloud architectures to deal with all scenarios: HANA One on Amazon AWS, which is a fully public multi-tenant cloud currently used by about 600 companies; their own managed cloud, using virtualization to provide a private instance for medium to large companies; and dedicated servers without virtualization in their managed cloud (really a hosted server configuration) for huge companies where the size warrants it.

Much of his keynote was spent rebutting myths about HANA — obviously, SAP has been a bit stung by the press and competitors calling their baby ugly — including the compression factor between how much data is on disk versus in memory at any given time, the relative efficiency of HANA columnar storage over classic relational record storage, support on non-proprietary hardware, continued support of other database platforms for their Business Suite, HANA stability and use of HANA for non-SAP applications. I’m not sure that was the right message: it seemed very defensive rather than talking about the future of SAP technology, although maybe the standard SAP user sitting in the audience needed to hear this directly from Plattner. He did end with some words on how customers can move forward: even if they don’t want to change database or platform, moving to the current version of the suite will provide some performance and functionality improvements, while putting them in a position to move to Business Suite on HANA (either on-premise or on the Enterprise Cloud) in the future for a much bigger performance boost.

HANA is more than just a database: it’s a database, application server, analytics and portals bundled together for greater performance. It’s like the new mainframe, except running on industry-standard x86-based hardware, and in-memory, so it lacks the lengthy batch operations that we associate with old-school mainframe applications. It’s OLTP and OLAP all in one, so there’s no separation between operational data stores and data warehouses. As long as all of the platform components are (relatively) innovative, this is great, for the same reason that mainframes were great in their day. HANA provides a great degree of openness, allowing code written in Java and a number of other common languages to be deployed in a JVM environment and use HANA as just a database and application server, but the real differentiating benefits will come from using the HANA-specific analytics and other functionality. Therein lies the risk: if SAP can keep HANA innovative, then it will be a great platform for application development; if they hark back to their somewhat conservative roots and the innovations are slow to roll out, HANA developers will become frustrated, and less likely to create applications that fully exploit (and therefore depend upon) the HANA platform.

SAP HANA Enterprise Cloud

Ingrid Van Den Hoogen and Kevin Ichhpurani gave a press briefing on what’s coming for HANA Enterprise Cloud following the launch last week. Now that the cloud offering is available, existing customers can move any of their HANA-based applications — Business Suite, CRM, Business Warehouse, and custom applications — to the cloud platform. There’s also a gateway that allows interaction between the cloud-based applications and other applications left on premise. Customers can bring their own HANA licences, and use SAP services to onboard and migrate their existing systems to the cloud.

HANA Enterprise Cloud is the enterprise-strength, managed version of HANA in the cloud; there’s also HANA One, which uses the Amazon public cloud for a lower-end entry point at $0.99/hour and a maximum of 30GB of data. Combined with HANA on premise (using gear from a certified hardware partner) and hosting partners’ OEM versions of HANA cloud that they repackage and run in their own environments (e.g., IBM or telcos), this provides a range of HANA deployment environments. HANA functionality is the same whether on AWS, on premise or on SAP’s managed cloud; moving between environments (such as moving an application from development/test on HANA One to production on HANA Enterprise Cloud) is a simple “lift and shift”: export from one environment and import into the target environment. The CIO of Florida Crystals was in the audience to talk about their experience moving to HANA in the cloud; they moved their SAP ERP environment from an outsourced data center to HANA Enterprise Cloud in 180 hours (that’s the migration time, not the assessment and planning time).

SAP is in the process of baking some of the HANA extensions into the base HANA platform; currently, there’s some amount of confusion about what “HANA” will actually provide in the future, although I’m sure that we’ll hear more about this as the updates are released.

SAPPHIRENOW Day 2 Keynote

This morning, our opening keynote was from SAP’s other co-CEO, Jim Snabe. He started with a bit about competitive advantage and adaptation to changing conditions, illustrated with the fact that Sumatran tigers have evolved webbed feet so that they can chase their prey into water: evolution and even extinction in business is not much different from that in the natural world, it just happens at a much faster pace. In business, we have both gradual evolution through continuous improvement, and quantum leaps caused primarily by the introduction of disruptive technology. Snabe positions HANA as being one of those disruptive technologies.

McLaren racing dashboard

Ron Dennis, chairman of McLaren Group, joined Snabe to talk about how they’re using HANA to gather, analyze and visualize data from their cars during Formula 1 races: 6.5 billion data points per car per race. We saw a prototype dashboard for visualizing that data, and heard how the data is used to make predictions and optimize performance during the race. Your processes probably don’t generate 6.5B events per instance, but in-flight optimization is something that’s beyond the capabilities of many organizations unless they use big data and predictive analytics. Integrating this functionality into process management may well be what allows the large vendors such as SAP and IBM to regain the BPM innovation advantage over some of the smaller and more nimble vendors. Survival of the fittest, indeed.

Snabe talked about other applications for HANA, such as in healthcare, where big data allows for comprehensive individual DNA analysis and disease prevention, before returning to the idea of using it for realtime business optimization that allows organizations to adapt and thrive. SAP is pushing all of their products onto HANA as the database platform, first providing data warehousing capabilities, SuccessFactors and now their Business Suite on HANA for greatly improved performance due to in-memory processing. They’ve opened up the platform so that other companies can develop applications on HANA, which will help to drive it into vertical industries. Interestingly, Snabe made the point that having realtime in-memory processing not only makes things faster, it also makes applications less complex, since some of the complexity in code is due to disk and processing latency. They have 1,500 customers on HANA now, and that number is growing fast.

HANA and in-memory processing was just one of the three “quantum leaps” that SAP has been effecting during the last three years; the second is having everything available in the cloud. Just as in-memory processing increases speed and reduces complexity in applications, cloud increases speed and reduces complexity in IT implementations. In the three years that they’ve been at it, and including their SuccessFactors and Ariba acquisitions, they’ve gained 29 million users in the cloud. He was joined by executives from PepsiCo, Timken and Nespresso to talk about their transitions to cloud, which included SuccessFactors for cloud-based performance management and HR across their global operations, and CRM in the cloud.

Combining their HANA and cloud initiatives, SAP launched HANA Enterprise Cloud last week, with HANA running on SAP’s infrastructure, which will allow organizations to run all of their SAP applications in the cloud, with the resulting benefits of elasticity and availability. I have a more detailed briefing on HANA Enterprise Cloud this afternoon.

Their third quantum leap in the past three years is user experience, culminating in today’s launch of Fiori, a new user interface that brings the aesthetic of consumer UI — including mobile interfaces — to enterprise software. We’ll be hearing more about this in tomorrow’s keynote with Vishal Sikka.

By the way, you can watch the keynotes live and replays of many sessions here; I confess to having watched this morning’s keynote online from my hotel room in order to have reliable wifi for research while I watched and wrote this post.

Process Intelligence With @alanrick

I met up with the NetWeaver BPM product management team and sat in on a session given by Alan Rickayzen of SAP and their customer King Tantivejkul of Colgate-Palmolive on putting intelligence into processes. This wasn’t about process automation — it was assumed that you have some sort of process automation in some system already, which constitutes the instrumentation on the processes — but rather taking all of the process events from a heterogeneous collection of systems and analyzing them in the aggregate in order to drive and support decision-making.

Colgate funnels all of their data from their global operations through a master data hub to their SAP back-end, including financials, materials, customer and reference data. SAP’s business suite ERP software is great for crunching data, but not so great at visualizing it — Colgate was using some hard-coded monthly reports that showed some metrics, but little about the process itself — so Colgate signed up for the operational process intelligence (OPINT) ramp-up (first customer release) to help them identify potential issues and bottlenecks in the process. They don’t have anything to show yet, but seem pretty excited about what they can get out of it.

OPINT, built on HANA, provides a more responsive and flexible view of process metrics. Without writing any Java or ABAP code, you can put together a dashboard that shows metrics from multiple systems, since HANA is acting as a process event warehouse for Business Workflow and NetWeaver BPM process events as well as custom processes made visible via Process Observer. In the future, they’ll be adding in other data sources, so you can pull in process models and event data from other systems. The HANA studio design environment allows these processes to be imported from the back-end systems and represents them as BPMN; events in these processes can then be mapped to different phases of a business scenario in order to generate the dashboard.

Predictive analytics are built in, as you might expect given the capabilities of HANA, allowing for forecasts of whether specific KPIs and milestones will be missed. As we saw at IBM Impact a couple of weeks ago, predictive process analytics are becoming big for high-value process instances: it’s not enough to know if you’re meeting a specific KPI right now, you need to know how the process is going to roll out through its entire lifecycle.

The dashboard widgets that we saw in a short video clip look completely adequate: different data visualizations, colors to denote states, KPIs and drilldowns. No big UI innovations, but the real gold here is in the HANA analytics going on behind the scenes, and the ease with which a solution developer can create a dashboard view of the HANA data. Furthermore, this runs completely on HANA: HANA is the database, the analytics engine and the app server, making it a bit easier to deploy than some other analytics solutions. This is big data applied to process, and it’s fair to say that this combination is going to be significant for the future of BPM.

Back At SAPPHIRENOW – Day 1 Keynote

It’s been a couple of years since I last attended SAP’s huge SAPPHIRE NOW conference, but this week I’m here with my 20,000 closest friends at the Orlando Convention Center (plus another 80,000 watching online) to get caught up. The conference kicked off with a keynote from Bill McDermott, SAP’s co-CEO, and it’s all about HANA and cloud: everything from SAP now runs on HANA, which, combined with their cloud platforms, realizes the dream of realtime, predictive supply chains. HANA is also at the heart of how SAP is addressing social enterprise functionality, allowing a company to analyze a flood of consumer social data to find what’s relevant.

They highlighted some of their sports-related customers’ applications — which definitely allowed for some good lead-in video — with executives from Under Armour, the San Francisco 49ers and the NBA. In part, sports applications are about helping teams play better and manage their talent through play/player data analysis (think Moneyball), but they are also about customer engagement online and in the stadium. The most traditional usage of SAP on the panel is at Under Armour, which manufactures sportswear and sports-related biometrics devices, but whose incredible growth means that they needed enterprise systems that they won’t outgrow. An interesting new industry vertical focus for SAP.

The keynote finished with Bob Calderoni, CEO of Ariba (recently acquired by SAP) talking about how cloud — in the form of private business networks, of course — drives productivity. Good focus, since too often the current technology buzzwords (social, mobile, cloud) are discussed purely as the end, not the means, and we can lose sight of how these can make us more productive and efficient, as well as fully buzzword-enabled.

As usual, wifi in the keynote area is impossible, and since I’m tablet-only, I couldn’t even plug into the hard-wired internet that they provided for us guests of Global Communications – I’m not the only one in this section with a tablet rather than a laptop, so I imagine that they’ll have to do something in the future to allow the media to consume and publish during the keynotes. T-Mobile’s iPhone coverage is resolutely stuck at EDGE in this area, so I can’t even reliably set up a hotspot, although that would just contribute to the wifi problems. The WordPress Android app works fine offline, however, so I was able to take notes and publish later.

OpenText EIM Day Toronto, Financial Services Session

After lunch at the Toronto OpenText EIM Day, Catharine MacKenzie of the Mutual Fund Dealers Association talked about how they’re using OpenText MBPM (from the Metastorm acquisition). She spoke on an OpenText webinar last year, and I was interested in how they’ve progressed since then.

The MFDA is very process-based, since they’re a regulatory body, and although their policies don’t change that often, the processes used to deal with members and policies are constantly being improved. There was no packaged solution for their regulatory processes, and the need for process flexibility without a full-on custom solution (which was beyond their budget and IT capabilities) led them to BPM. As I described in the post about the webinar (linked above), they started with four processes including compliance and enforcement, and sped through the implementation of several other processes through 2012. Although she stated during the webinar that they would be implementing five new processes in 2012, most of that work has been pushed to 2013, in part (it appears) because of a platform upgrade to MBPM 9.

She pointed out that everyone in MFDA is using BPM for internal administrative processes, such as booking time off, as well as for the member-facing processes; for many of these processes, the users don’t even know that they’re using BPM. They’re also an OpenText eDocs customer, so can present content within processes, although apparently they have had to do a lot of that integration work themselves.

As for benefits, they’re seeing a huge decrease in development and deployment time compared to the custom applications that they build in Visual Studio, with process versioning and auditing built in. They’ve had challenges around having the business own the processes, rather than IT, while maintaining good process design and disciplined testing; the MBPM upgrade and migration is also taking longer than expected, which is delaying some of their planned process implementations. This is an interesting result against the backdrop of this morning’s customer keynote about major system upgrades: an upgrade that requires data migration and custom application refactoring is almost always going to cause delays in a previously-defined schedule of roll-outs, but it is very necessary for setting the stage for future functionality.

I’m skipping out for the rest of the afternoon to get back to my desk, but this has been a good opportunity to get caught up on the entire OpenText product suite and talk to some of their local customers.

Disclosure: OpenText is a customer, for whom I recently did a webinar and related white paper, but I am not paid to be here today, nor for writing any of these blog posts.

OpenText EIM Day Toronto, Customer Keynotes

Following the company/product keynotes, we heard from two OpenText customers.

First up was Tara Drover from Hatch Engineering, a Canadian engineering firm with 11,000 employees worldwide. They have OpenText Content Server on 10 corporate instances containing 32 million documents for more than 37,000 projects, almost half at their corporate headquarters in the Toronto area. They use it for project documentation, but also for a variety of other administrative and management documents. It appears that they have configured and customized Content Server, and built add-ons, to be the core of their corporate information store. They’ve been using Content Server since 2002 (v9.1), and have upgraded through v9.5 (including “de-customization”, a term and philosophy that I adore), v9.7.1 and v10. The latest upgrade, to CS10, is the one that she focused on in her presentation. Their drivers for the upgrade were to move to a 64-bit platform for scalability and performance reasons, to get off v9.7.1 before support ended, and to set the stage for some of the features in CS10: facets and columns, an improved search engine, and multilingual support. However, they wanted to keep the UI as similar as possible, providing more of a back-end upgrade as a platform for growth rather than a radical user experience change.

They started in March 2012 with strategy, change assessment and planning, then continued on to environmental assessment, development and testing, people change management and their first deployment in July 2012. Their readiness assessment identified that they first had to update their Windows Server and SQL Server instances (to 2008 — hardly cutting edge), and showed some of the changes to the integration points with other Hatch systems. As part of their development and testing, they developed an 80-page deployment guide, since this would have to roll out to all of the Content Server sites worldwide, including estimates of times required for the upgrade in order to avoid downtime during local business hours, and plans for using local staff for remote upgrades. During development and testing, they simultaneously ran the v9.7.1 production environment on the upgraded Windows Server platform, plus a CS10 development environment and a separate CS10 test/staging environment where the production processes were cloned and tested.

If you’re upgrading a single Content Server instance, you’re unlikely to go to this level of complexity in your upgrade plans and rollout, but for multiple sites across multiple countries (and languages), it’s a must. In spite of all the planning, they did have a few hiccups and some production performance issues, in part because they didn’t have a load testing tool. From their first rollout in Santiago, Chile in July 2012, followed by a few months of tuning and testing, they’re now rolling out about one site per month. They’re seeing improvements in the UI and search functions, and are ready to start thinking about how to use some of the new CS10 features.

They had a number of success factors that are independent of whatever product you’re upgrading, such as clearly defined scope, issue management, and upgrading the platform without adding too many new user features all at once.

The second customer keynote was from Robin Thompson, CIO for the shared services sector of the Government of Ontario. They had some pretty serious information and records management issues, pretty much leaving the retention and disposition of information in the hands of individuals, with little sharing of information between ministries. To resolve this, they have developed a framework for information management covering the next several years, targeted at improving efficiencies and improving services to constituents. Their guiding principles are that information needs to be protected and secure, managed, governed, accessible and relevant, and valued; in other words, the information needs to be managed as business records. Their roadmap identified an enterprise records and document management service as a necessary starting point, which they have deployed (based on OpenText) in the past year to the Social Services Ministries, with six more areas queued up and ready to implement. In addition to deploying in more ministries, they are also expanding functionality, bringing email records management to the Ministry of Finance later this year. This information management framework and vision is long overdue for the Ontario government, and hopefully will lead to better services for those of us who live here.

She shared a number of lessons that they learned along the way: the importance of change management and stakeholder communication; the time required for developing data architecture and taxonomy; the balance between overly-rigid standardization and too many customized instances; the need for external and internal resources to develop and maintain a records/document management practice; and the importance of governance. They’ve focused on an incremental approach, and have allowed the business leaders to pull the functionality rather than have IT push it into the business areas.