Aligning BPM and EA Tutorial at BBCCon11

I reworked my presentation on BPM in an enterprise architecture context (a.k.a., “why this blog is called ‘Column 2’”) that I originally did at the IRM BPM conference in London in June, and presented it at the Building Business Capability conference in Fort Lauderdale last week. I removed much of the detailed information on BPMN, refined some of the slides, and added in some material from Michael zur Muehlen’s paper on primitives in BPM and EA. Some nice improvements, I thought, and it came in right on time at 3 hours without having to skip over some material as I did in London.

Here are some of the invaluable references that I used in creating this presentation:

That should give you plenty of follow-on reading if you find my slides to be too sparse on their own.

NSERC BI Network at CASCON2011 (Part 1)

I only have one day to attend CASCON this year due to a busy schedule this week, so I am up in Markham (near the IBM Toronto software lab) to attend the NSERC Business Intelligence Network workshop this morning. CASCON is the conference run by IBM’s Centers for Advanced Studies throughout the world, including the Toronto lab (where CAS originated), as a place for IBM researchers, university researchers and industry to come together to discuss many different areas of technology. Sometimes, this includes BPM-related research, but this year the schedule is a bit light on that; however, the BI workshop promises to provide some good insights into the state of analytics research.

Eric Yu from University of Toronto started the workshop, discussing how BI can enable organizations to become more adaptive. Interestingly, after all the talk about enterprise architecture and business architecture at last week’s Building Business Capability conference, that is the focus of Yu’s presentation, namely, that BI can help enterprises to better adapt and align business architecture and IT architecture. He presented a concept for an adaptive enterprise architecture that is owned by business people, not IT, and geared at achieving measurable business success. He discussed modeling variability at different architectural layers, and the traceability between them, and how making BI an integral part of an organization – not just the IT infrastructure – can support EA adaptability. He finished by talking about maturity models, and how a closed loop deployment of BI technologies can help meet adaptive enterprise requirements. Core to this is the explicit representation of change processes and their relationship to operational processes, as well as linking strategic drivers to specific goals and metrics.

Frank Tompa from University of Waterloo followed with a discussion of mapping policies (from a business model, typically represented as high-level business rules) to constraints (in a data model) so that these can be enforced within applications. My mind immediately went to why you would map these to a database model rather than a rules management system; his view seems to be that a DBMS is what monitors at a transactional level and ensures compliance with the business model (rules). His question: “how do we make the task of database programming easier?” My question: “why aren’t you doing this with a BRMS instead of a DBMS?”

Accepting his premise that this should be done by a database programmer, the approach is to start with object definitions, where an object is a row (tuple) defined by a view over a fixed database schema, and represents all of the data required for policy making. Secondly, consider the states that an object can assume: an object x is in state S if its attributes satisfy S(x). An object can be in multiple states at once; the states seem to be more like predicates than states, but whatever. Thirdly, the business model has to be converted to an enforcement model through a sort of process model that also includes database states; really more of a state diagram that maps business “states” to database states, with constraints on states and state transitions denoted explicitly. I can see some value in the state transition constraint models in terms of representing some forms of business rules and their temporal relationships, but his representation of a business process as a constraint diagram is not something that a business analyst is ever going to read, much less create. However, the role of the business person seems to be restricted to “policy designer” listing “states of interest”, and the goal of this research is to “form a bridge between the policy manager and the database”. Their future work includes extracting workflows from database transaction logs, which is, of course, something that is well underway in the process mining community.

I asked (explicitly to the presenter, not just snarkily here in my blog post) about the role of rules engines: he said that one of the problems is vocabulary definition, which is often not done in organizations at the policy and rules level; by the time things get to the database, the vocabulary is sufficiently constrained that you can ensure that you’re getting what you need. He did say that if things could be defined in a rules engine using a standardized vocabulary, then some of the rules/constraints could be applied before things reached the database; there does seem to be room for both methods, at least as long as a standardized business rules vocabulary (which does exist) is not yet well-entrenched.
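To make the object/state idea concrete, here’s a minimal Python sketch with entirely invented object and state names (the actual research expresses these as views and constraints over a database schema, not application code):

```python
from dataclasses import dataclass

@dataclass
class Invoice:                             # hypothetical object: one row of a view
    amount: float
    approved: bool
    paid: bool

# States as predicates S(x) over the object's attributes; an object
# can satisfy more than one state at a time
STATES = {
    "pending":  lambda x: not x.approved and not x.paid,
    "approved": lambda x: x.approved and not x.paid,
    "settled":  lambda x: x.paid,
}

# The enforcement model: explicit constraints on state transitions
ALLOWED_TRANSITIONS = {("pending", "approved"), ("approved", "settled")}

def states_of(obj):
    return {name for name, pred in STATES.items() if pred(obj)}

def check_transition(before, after):
    """Reject any update whose state change violates a transition constraint."""
    for s_old in states_of(before):
        for s_new in states_of(after) - states_of(before):
            if (s_old, s_new) not in ALLOWED_TRANSITIONS:
                raise ValueError(f"illegal transition {s_old} -> {s_new}")

# Approving an invoice moves it from "pending" to "approved": allowed
check_transition(Invoice(100.0, False, False), Invoice(100.0, True, False))
```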

Jennifer Horkoff from University of Toronto was up next, discussing strategic models for BI. Her research is about moving BI from a technology practice to a decision-making process that starts with strategic concerns, generates BI queries, interprets the results relative to the business goals, and decides on necessary actions. She started with the OMG Business Motivation Model (BMM) for building governance models, and extended that to a Business Intelligence Model (BIM), or business schema. The key primitives include goals, situations (which can model SWOT), indicators (quantitative measures), influences (relationships) and more. This model can be used at the high-level strategic level, or at a more tactical level that links more directly to activities. There is also the idea of a strategy, which is a collection of processes and quality constraints that fulfill a root-level goal. Reasoning can be done with BIMs, such as determining whether a specific strategy can fulfill a specific goal, and influence diagrams with probabilities on each link can be used to help determine decisions. They are using BIM concepts to model a case study with Rouge Valley Health System to improve patient flow and reduce wait times; results from this will be seen in future research.
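As a rough illustration of the influence-diagram reasoning – with made-up element names and probabilities, not figures from Horkoff’s case study – here’s a minimal sketch that propagates influence probabilities along a chain of links:

```python
# Each link: source element -> (target element, probability the positive
# influence holds). All names and numbers here are invented.
influences = {
    "reduce_wait_times":    ("improve_patient_flow", 0.8),
    "improve_patient_flow": ("increase_satisfaction", 0.7),
}

def chain_probability(start: str, goal: str) -> float:
    """Probability that achieving `start` propagates to `goal`,
    assuming independent influences along a single chain of links."""
    node, p = start, 1.0
    while node != goal:
        if node not in influences:
            return 0.0                     # no path of influences to the goal
        node, edge_p = influences[node]
        p *= edge_p
    return p

print(chain_probability("reduce_wait_times", "increase_satisfaction"))  # ~0.56
```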

Each of these presentations could have filled a much bigger time slot, and I could only capture a flavor of their discussions. If you’re interested in more detail, you can contact the authors directly (links to each above) to get the underlying research papers; I’ve always found researchers to be thrilled that anyone outside the academic community is interested in what they’re doing, and are happy to share.

We’re just at the mid-morning break, but this is getting long so I’ll post this and continue in a second post. Lots of interesting content; I’m looking forward to the second half.

Process and Information Architectures

Last day of the Building Business Capability conference, and I attended Louise Harris’ session on process and information architectures as the missing link to improving enterprise performance. She was on the panel on business versus IT architecture that I moderated yesterday, and had a lot of great insight into business architecture and enterprise architecture.

Today’s session highlighted how business processes and information are tightly interconnected – business processes create and maintain information, and information informs and guides business processes – but that different types of processes use information differently. This is a good distinction: looking at what she called “transactional” (structured) versus “creative” (case management) versus “social” (ad hoc) processes, where transactional processes require exact data, but the creative and social processes may require interpretation of a variety of information sources that may not be known at design time. She showed the Burlton Hexagon to illustrate how information is not just input to be processed into output, but is also used to guide processes, inform decisions and measure process results.

This led to Harris’ definition of a business process architecture as “defining the business processes delivering results to stakeholders and supported by the organization/enterprise showing how they are related to each other and to the strategic goals of the organization/enterprise”. (whew) This includes four levels of process models:

  • Business capability models, also called business service models or end-to-end business process models, which form the top level of the work hierarchy and define what business processes are, but not how they are performed. Louise related this to a classic EA standpoint, placing it at row 1 of Zachman (in column 2).
  • Business process models, which provide deeper decomposition of the end-to-end models that tie them to the KPIs/goals. This has the effect of building process governance into the architecture directly.
  • Business process flow models, showing the flow of business processes at the level of logistical flow, such as value chains or asset lifecycles, depending on the type of process.
  • Business process scope models (IGOEs, that is, Inputs, Guides, Outputs, Enablers), identifying the resources involved in the process, including information, people and systems.

She moved on to discuss information architecture, and its value in defining information assets as well as content and usage standards. This includes three models:

  • Information concept model, capturing the top level of the information related to the business, often organized into domains such as finance or HR. For example, in the information domain of finance, we might have information subject areas (concepts) of invoicing, capital assets, budget, etc.
  • Information relationship model, which defines the relationships between the concepts identified in the information concept model, and can span different subject areas. This can look like an ERD, but the objects being connected are higher-level business objects rather than database objects; this makes it fairly tightly tied to the processes that those business objects undergo.
  • Information governance model, which defines what has to be done to maintain information integrity: governance structure, responsible roles, and policy and business standards.

Next was bringing together the process and information architectures, which is where the IGOEs (business process scope models) come into play, since they align information subject areas with top-level business processes or business capabilities, allowing identification of gaps between process and information. This creates a framework for ensuring alignment at the design and operational levels, but does not map information subject areas to business functions, since that is too dependent on the organizational structure.

Harris presented these models as being the business architecture, corresponding to rows 1 and 2 of Zachman (for example), which can then be used to provide context for the remainder of the enterprise architecture and into design. For example, once these models are established, the detailed process design can be integrated with logical data models.

She finished up by looking at how process and information architectures need to be developed in lock step, since business process ensures information quality, while information ensures process effectiveness.

Accepting The Business Architecture Challenge with @logicalleap

Forrester analyst Jeff Scott presented at Building Business Capability on what business architecture is and what business architects do. According to their current research, interest in business architecture is very high – more than half of organizations consider it “very important”, and all organizations surveyed showed some interest – and more than half also have an active business architecture initiative. This hasn’t changed all that much since their last survey on this in 2008, although the numbers have crept up slightly. Surprisingly, less than half of the business architecture activities are on an enterprise-wide level, although if you combine that with those that have business architecture spanning multiple lines of business, it hits about 85%. When you look at where these organizations plan to take their business architecture programs, over 80% are planning for them to be enterprise-wide, but that hasn’t changed in 3 years, meaning that although the intention is there, it may not actually be happening with any speed.

He moved on to a definition of business architecture, and how it has changed in the past 15 years. In the past, it used to be more like business analysis and requirements, but now it’s considered an initiative (either by business, EA or IT) to improve business performance and business/IT alignment. The problem is, in my opinion, that the term “business/IT alignment” has become a bit meaningless in the past few years as every vendor uses it in their marketing literature. Process models are considered a part of business architecture by a large majority of organizations with a business architecture initiative, as are business capability models and business strategy, application portfolio assessments, organizational models and even IT strategy.

Business architecture has become the hot new professional area to get into, whether you’re a business analyst or an enterprise architect, which means that it’s necessary to have a better common understanding of what business architecture actually is and who the business architects are. I’m moderating a panel on this topic with three business/IT/enterprise architects today at 4:30pm, and plan to explore this further with them. Scott showed their research on what people did before they became (or labeled themselves as) business architects: most held a business analyst role, although many were also enterprise architects, application architects and other positions. Less than half of the business architects are doing it full time, so may still be fulfilling some of those other roles in addition. Many of them reside in the EA group, and more than half of organizations consider EA to be responsible for the outcomes of business architecture.

It’s really a complex set of factors in figuring out what business architects do: some of them are working on enterprise-wide business transformation, while others are looking at efficiency within a business unit or project. The background of the business architect – that is, what they did before they became a business architect – can hugely impact (obviously) the scope and focus of their work as a business architect. In fact, is business architecture a function performed by many players, or is it a distinct role? Who is really involved in business architecture besides those who hold the title, and where do they fit in the organization? As Scott pointed out, these are unanswered questions that will plague business architecture for a few years still to come.

He presented several shifts to make in thinking:

  • Give up your old paradigms (think different; act different to get different results)
  • Start with “why” before you settle on the how and what
  • “Should” shouldn’t matter when mapping from “what is” to “what can be”
  • Exploration, not standardization, since enterprise architecture is still undergoing innovation on its way to maturity
  • Business architecture, not technology architecture, is what provides insight, risk management and leadership (rather than engineering, knowledge and management)
  • Stress on “business” in business architecture, not “architecture”, which may not fit into the EA frameworks that are more focused on knowledge
  • Focus on opportunity rather than efficiency, which is aligned with the shift in focus for BPM benefits that I’ve been seeing in the past few years
  • Complex problems need different solutions, including looking at the problems in context rather than just functional decomposition
  • Solve the hard “soft” problems of building business architect skills and credibility, leveraging local successes to gain support and sponsorship, and overcoming resistance to change
  • Think like the business before applying architectural thinking to the business problems and processes

He finished up with encouragement to become more business savvy: not just the details of business, but innovation and strategy. This can be done via some good reading resources, finding a business mentor and building relationships, while keeping in mind that business architecture should be an approach to clarify and illuminate the organization’s business model.

He wrote a blog post on some of the challenges facing business architects back in July, definitely worth a read as well.

Strategic Synergies Between BPM, EA and SOA

I just had to attend Claus Jensen’s presentation on actionable architecture with synergies between BPM, EA and SOA, since I read two of his white papers in preparing the workshop that I delivered here on Wednesday on BPM in an EA context. I also found out that he’s co-authored a new IBM Redbook on EA and BPM.

Lots of great ideas here – I recommend that you read at least the first of the two white papers that I link to above, which is the short intro – about how planning (architecture) and solution delivery (BPM) are fundamentally different: you can’t necessarily transform statements and goals from architecture into functions in BPM, but information is passed in both directions between the two lifecycles.

He went through descriptions of scenarios for aligning and interconnecting EA and BPM, also covered in the white papers; these are quite focused on building an (IBM-based) solution, but still offer some good nuggets of information.

Workshop: BPM In An EA Context

Here’s my presentation slides from the workshop that I gave on Wednesday here at the IRM BPM conference in London, entitled Architecting A Business Process Environment:

As always, some slides may not make much sense without my commentary (otherwise, why would I be there live?), but feel free to ask any questions here in the comments.

Building a Business Architecture Capability and Practice Within Shell

For the first breakout of the day, I attended Dan Jeavons’ session on Shell’s business architecture practice. For such a massive company – 93,000 employees in 90 countries – this was a big undertaking, and they’ve been at it for five years.

He defines business architecture as the business strategy, governance, organization and key business process information, as well as the interaction between these concepts, which is taken directly from the TOGAF 9 definition. Basically, this involves design; it must be implemented, not just conceptual; and it requires flexibility based on business agility requirements. They started on their business architecture journey because of factors that affect many other companies: globalization, competition, regulatory requirements, realization of current inefficiencies, and the emergence of a single governance board for the multi-national company.

Their early efforts were centered on a huge new ERP system, especially with the problems due to local variations from the global standard process models. “Process” (and ERP) became naughty words to many people, with connotations of bloated, not-quite-successful projects. Following on from some of the success points, their central business architecture initiative actually started with process modeling/design: standard processes across the different business areas with global best practices. This was used to create and roll out a standard set of financial processes, with a small core team doing the process redesign, and coordinating with IT to create a common metamodel and architectural standards. As they found out, many other parts of the company had similar process issues – HR, IT and others – so they branched out to start building a business architecture for other areas as well.

They had a number of challenges in creating a process design center of excellence:

  • Degree of experience with the tool and the methodology; initial projects weren’t sufficiently structured, reducing benefits.
  • Perceived value to the business, especially near-term versus long-term ROI.
  • Impact of new projects, and ensuring that they follow the methodology.
  • Governance and high-level sponsorship.

They also found a number of key steps to implementing their CoE and process architecture:

  • Sponsorship
  • Standard methodology, embedded within standard project delivery framework
  • Communication of success stories

Then, they migrated their process architecture initiative to a full business architecture by looking at the relationships to other elements of business architecture; this led them to do business architecture (mostly) as part of process design initiatives. Recent data quality/management initiatives have also brought a renewed focus on architecture, and Jeavons feels that although the past five years have been about process, the next several years will be more about data.

He showed a simplified version of their standard metamodel, including aspects of process hierarchy models, process flow models, strategy models and organization models. He also showed a high-level view of their enterprise process model in a value stream format, with core processes surrounded by governing and supporting processes. From there, he showed how they link the enterprise process model to their enterprise data catalogue, which links to the “city plan” of their IT architecture and portfolio investment cycle; this allows for traceability as well as transparency. They’ve also been linking processes to strategy – this is one of the key points of synergy between EA and BPM – so that business goals can be driven down into process performance measures.

The EA and process design CoE have been combined (interesting idea) into a single EA CoE, including process architects and business architects, among other architect positions; I’m not sure that you could include an entire BPM CoE within an EA CoE, due to BPM’s operational implementation focus, but there are certainly a lot of overlapping activities and functions, and they should have overlapping roles and resources.

He shared lots of great lessons learned, as well as some frank assessment of the problems that they ran into. I particularly found it interesting how they morphed a process design effort into an entire business architecture, based on their experience that the business really is driven by its processes.

Integrating BPM and Enterprise Architecture

Michael zur Muehlen presented this morning on integrating BPM and enterprise architecture, based on work that he’s done with the US Department of Defense. Although they use the DoDAF architecture framework in particular, the concepts are applicable to other similar EA frameworks. Like the Zachman framework, DoDAF prescribes the perspectives that are required, but doesn’t specify the artifacts (models) required for each of those perspectives; this is particularly problematic in DoD EA initiatives where there are likely to be many contractors and subcontractors involved, all of whom may use different model types to represent the same EA perspective.

He talked briefly about what makes a good model: the information must be correct, relevant (and complete) and economical (with respect to level of detail), as well as clear, comparable (linked to reality) and systematic. From there, he moved on to their selection of BPMN as the dominant standard for process modeling, since it has better event handling than UML activity diagrams, better organizational modeling than IDEF0, and better cross-organizational modeling than simple flowcharts. However, many tools support only a subset of BPMN – particularly those intended for process execution rather than just process modeling – and some tools have non-standard enhancements to BPMN that inhibit interoperability. Another issue is that the BPMN specification is enormous, with over 100 elements, including different constructs that mean the same thing, such as explicit versus implicit gateways.

They set out to design primitives for the use of BPMN, “outlawing” certain symbols such as complex gateways, and developing best practices for BPMN usage. They also mapped the frequency of BPMN symbol usage across internal DoD models, those that Michael sees in his practice as a professor of BPM at Stevens Institute of Technology, and samples found on the web, and came up with a distribution of the BPMN elements by frequency of usage. This research led to the creation of the subsets that are now part of the BPMN standard, as well as usage guidelines for BPMN in terms of both primitives and patterns.
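As a rough illustration of how such a primitives guideline might be enforced – this is my own sketch with a hypothetical approved element list, not the actual DoD tooling – a model check could scan BPMN 2.0 XML for outlawed elements:

```python
import xml.etree.ElementTree as ET

NS = "{http://www.omg.org/spec/BPMN/20100524/MODEL}"

# Hypothetical approved subset, loosely inspired by the Descriptive subclass
ALLOWED = {"startEvent", "endEvent", "task", "userTask", "serviceTask",
           "exclusiveGateway", "parallelGateway", "sequenceFlow",
           "laneSet", "lane", "dataObject", "textAnnotation"}

def check_model(path):
    """Return the disallowed element types used in each process of the model."""
    root = ET.parse(path).getroot()
    violations = []
    for process in root.iter(NS + "process"):
        for elem in process:                  # direct flow elements only
            local = elem.tag.removeprefix(NS)
            if local not in ALLOWED:
                violations.append(local)      # e.g. "complexGateway"
    return violations

# check_model("claims_process.bpmn") might return ['complexGateway']
```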

In addition to the BPMN subsets (e.g., the most commonly implemented Descriptive subclass), they developed naming conventions to use within models, driven by the vocabulary related to their domain content. This idea of separating the control of model structure from the vocabulary makes sense: the first is more targeted at an implementer, while the second is targeted at a domain/business expert. This in turn led to vocabulary-driven development, where the relationships between capabilities, activities, resources and performers (CARP analysis) are established as a starting point for the labels used in process models, data models (or ontologies/taxonomies), security models and more as the enterprise architecture artifacts are built out.
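As a simple illustration of vocabulary-driven development – using a made-up claims domain, not actual DoD content – the CARP vocabulary can be established first, with model labels then validated against it:

```python
# The CARP vocabulary for a hypothetical claims domain: capabilities,
# activities, resources and performers are related up front, and the
# labels used in downstream models are drawn from this vocabulary.
carp = {
    "capability": "Claims Management",
    "activities": ["Assess Claim", "Approve Claim", "Pay Claim"],
    "resources":  ["Claim", "Policy", "Payment"],
    "performers": ["Claims Adjuster", "Claims Manager"],
}

def valid_task_label(label: str) -> bool:
    """A process-model task label must come from the agreed activity vocabulary."""
    return label in carp["activities"]

print(valid_task_label("Approve Claim"))  # True
print(valid_task_label("Handle Stuff"))   # False: not in the vocabulary
```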

Having defined how to draw the right models and how to select the right words to put in them, they looked at different levels of models to be used for different purposes: models focused on milestones, handoffs, decisions and procedures. These are not just more detailed versions of the same model, but rather different views on the process. The milestones view is a high-level view of the major process phases; the handoffs view looks at transitions between lanes, with all activities within a lane rolled up to a single activity, primarily showing the happy path; the decisions view looks at major decision points and exception/escalation paths; and the procedures view shows a full requirements-level view of the process, i.e., the greatest level of detail that a business analyst is likely to create before involving technical resources to add things such as service calls.

To finish up, he tied this back to the six measures of model quality and how this approach based on primitives conforms to these measures. They’ve achieved a number of benefits, including minimizing modeling errors, ensuring that models are clear and consistent, and ensuring that the models can be converted to an executable form. I’m seeing an increased interest with my clients and in the marketplace on how BPM and EA can work together, so this was a great example of how one large organization manages to do it.

Michael posted earlier this year on the DoDAF subset of BPMN (in response to a review that I wrote of a BPMN update presentation by Robert Shapiro). If we go back a couple of years before that, there was quite a dust-up in the BPMN community when Michael first published the usage distribution statistics – definitely worth following the links to see where all this came from.

Bridging Process Modeling and IT Solutions Design at adidas

Eduardo Gonzalez of the adidas Group talked about how they are implementing BPM within their organization, particularly the transition from business process models to designing a solution, which ties in nicely with the roundtable that I moderated yesterday. The key issue is that process models are created for the purpose of modeling the existing and future business processes, but the linkage between them and requirements documents – and therefore on to solution design – is tenuous at best. One problem is traceability: there is no way to connect the process models to the thick stack of text-based requirements documents, and from the requirements documents to the solution modules; this means that when something changes in a process model, it’s difficult to propagate that change through to the requirements and solution design. Also, the requirements leave a bit too much to the developers’ imaginations, so often the solution doesn’t really meet the requirements.

The question becomes how to insert the business process models into the software development lifecycle. Different levels of the process model are required, from high-level process flows to executable workflows; they wanted to tie this in to their V-cycle model of solution design and development, which appears to be a modified waterfall model with integrated testing. Increasingly granular process models are built as the solution design moves from requirements and architecture to design and implementation; the smaller and more granular process building blocks, translated into solution building blocks, are then reassembled into a complete solution that includes a BPMS, a rules engine, a portal, and several underlying databases and other operational systems that are being orchestrated by the BPMS.

Gonzalez has based some of their object-driven project decomposition methods on Martyn Ould’s Business Process Management: A Rigorous Approach, although he found some shortcomings in that approach and modified it to suit adidas’ needs. Their approach uses business and solution objects in an enterprise architecture sort of approach (not surprising, since he mentioned at the end of the presentation that he is an enterprise architect), moving from purely conceptual object models to logical object models to physical object models. Once the solution objects have been identified, they model each object’s states through its lifecycle, and object handling cases (analogous to use cases) that describe how the system handles an object through its full lifecycle, including both system and human interaction. He made the point that you have to have the linkage to master data; this is becoming recognized as a critical part of process applications now, and some BPMS vendors are starting to consider MDM connectivity.
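As an illustration of the object lifecycle idea – a minimal sketch with invented states, not adidas’ actual models – each solution object can carry an explicit lifecycle that its handling cases must follow:

```python
from enum import Enum, auto

class OrderState(Enum):                    # hypothetical lifecycle states
    CREATED = auto()
    VALIDATED = auto()
    FULFILLED = auto()
    CLOSED = auto()

# Permitted lifecycle transitions for the object
TRANSITIONS = {
    OrderState.CREATED:   {OrderState.VALIDATED},
    OrderState.VALIDATED: {OrderState.FULFILLED},
    OrderState.FULFILLED: {OrderState.CLOSED},
}

class Order:
    """A solution object whose handling cases must follow its lifecycle."""
    def __init__(self):
        self.state = OrderState.CREATED

    def advance(self, new_state: OrderState):
        if new_state not in TRANSITIONS.get(self.state, set()):
            raise ValueError(f"cannot move from {self.state.name} to {new_state.name}")
        self.state = new_state

order = Order()
order.advance(OrderState.VALIDATED)        # allowed by the lifecycle
# order.advance(OrderState.CLOSED)         # would raise: must be FULFILLED first
```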

The end solution includes a portal, BPMS, BRMS, ESB, MDM, BI and back-end systems – a fairly typical implementation – and although the cycle for moving from process model to solution design isn’t automated, at least they have a methodology that they use to ensure that all the components are covered and in synchronization. Specific models at particular points in their cycle include models from multiple domains, including process and data. They did a proof of concept with this methodology last year, and are currently running a live project using it, further refining the techniques.

Their cycle currently includes the model and execute phases of a standard BPM implementation cycle; next, they want to take on the monitor and optimize phases, and add modeling techniques to derive KPIs from functional and non-functional requirements. They also plan to look at more complex object state modeling techniques, as well as how adaptive case management fits into some of their existing concepts.

I posed a question at the end of my roundtable yesterday: if a tool existed that allowed for the definition of the process model, user interface, business rules and data model, then generated an executable system from that, would there still be a need for written requirements? Once we got past the disbelief that such tools exist (BPMS vendors – you have a job to do here), the main issue identified was one of granularity: some participants in the process modeling and requirements definition cycle just don’t need to see the level of detail that will be present in these models at an executable level. Obviously, there are still many challenges in moving seamlessly from conceptual process models to an executable process application; although some current BPMS provide a partial solution for relatively simple processes, this typically breaks down as processes (and related integrations) become more complex.

Metastorm M3 Demonstration

I had a briefing on Metastorm’s M3 collaborative modeling and Smart Business Workspace two weeks ago, and last week we had a follow-up demo. This is the start of a push towards a full BPM suite in the cloud, providing collaborative process modeling and the end user runtime hosted on Azure, but Microsoft still needs to add some planned functionality to Azure in order to allow Metastorm to move the BPM engine there as well. When that happens, however, the Azure Fabric Connector will allow the BPM engine to connect to on-premise systems and data sources, regardless of whether the Azure instance is on-premise or hosted elsewhere.

We first walked through M3, which provides self-registration for a modeling account. This isn’t just process modeling, however; based on their ProVision acquisition, there are 11 different types of models available: Workflow, Organization, Goal, Location, System, Capability, Activity, Deliverable, Project, Requirement and Rule. Although I have seen multiple model types in some of the other collaborative modeling tools – such as strategy and capability maps in IBM’s BPM BlueWorks – this goes beyond them in scope, and has the more robust backing of the ProVision metamodel, allowing models to be exported from M3 and imported into the full version of ProVision. It’s also possible to create associations between different model types: for example, linking an activity in a workflow model with a measurement or location. Models can be exported only in ProVision’s CIF (Common Interchange Format), although there are tools to transform a process model in CIF to XPDL or BPEL.

We also viewed a sharing session, which is a synchronous collaboration of two or more people that allows for interactive whiteboarding and chat. Although users in an interactive whiteboarding environment will more likely use the telephone as their primary communications tool rather than chat, the chat is useful because it is logged as part of the session history, so it can be used to record decisions and notes. A shared session can be played back using a VCR-like control to see how a model evolved over the session.

M3 provides extensive help for modelers, including best practices and strategies for modeling, and will continue to be augmented with feedback from the online Metastorm community. There’s no direct link to that community, which would be useful; it seems that some of the best practice sections in the help have just been copied from the community site rather than directly linked.

Metastorm M3 - 2010

The second part of the demo was on Smart Business Workspace, Metastorm’s Silverlight-based composite application development (mashup) environment. Except for the fact that it’s based on Silverlight (which may not be considered an advantage in some circles), there’s not much different here from most other mashup environments, except for the inclusion of their own BPM and model widgets. There’s a role-based starting point for the workspace, and pages can be fully personalized if the user has the appropriate permissions. Widgets are dragged in from a predefined palette, can be dynamically resized, and the general page layout can be changed. Administrators and page designers can lock down specific pages and widgets for a more controlled environment. Depending on the type of widget, there is publish/subscribe wiring between the widgets to allow for standard use cases such as list-detail or map display of data; a sketch of this wiring pattern is below. Branding and general appearance of the workspace can be styled with CSS and .Net resource files.
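For those unfamiliar with the pattern, widget wiring like this typically boils down to publish/subscribe; here’s a minimal sketch (invented names, not Metastorm’s actual API) of a list widget driving a detail widget:

```python
class EventBus:
    """Routes published events to any widgets subscribed to the topic."""
    def __init__(self):
        self.subscribers = {}

    def subscribe(self, topic, handler):
        self.subscribers.setdefault(topic, []).append(handler)

    def publish(self, topic, payload):
        for handler in self.subscribers.get(topic, []):
            handler(payload)

bus = EventBus()
# The detail widget subscribes to selections published by the list widget
bus.subscribe("record.selected", lambda rec: print(f"detail widget shows {rec}"))
bus.publish("record.selected", {"id": 42, "name": "Acme Corp"})
```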

Smart Business Workspace allows you to add any Silverlight widget, but does not support other widely-used widget standards. Although you can add any web page as a “personal widget”, these are really more like unwired portlets than true widgets; you’ll have to use the widget designer to turn something into a first-class widget.

Metastorm Smart Business Workspace - 2010