Skype me!

Tons of stuff is showing up these days about Skype, a free VoIP service, such as a ZDNet article, “Skype goes for the gold”, discussing how the newly-developing paid add-ons will eventually allow Skype to become profitable while remaining a free service for computer-to-computer calls. The longest-standing paid service is SkypeOut, which allows you to call any landline at greatly reduced rates, presumably because it makes the connection from your computer to the target country via IP, then bridges to a landline for a local call. New services coming out are Skype Voicemail and SkypeIn, the latter being a phone number for your Skype identity that allows a landline user to call you. For example, if you live in South Africa but do a lot of calling with the UK, you can get a UK phone number that, when called, will ring through to your Skype session on your computer, no matter where you’re located at the time.

I’ve been using Skype for a number of months now, for both voice and text (IM). Although I primarily use it to talk to other computer-based users in Australia and North America, I also use SkypeOut for making overseas calls, and for making calls when I’m travelling in order to avoid mobile roaming charges. If my hotel doesn’t have broadband (a rarity these days), I can just find a wireless hotspot, connect my laptop, plug in my headset and make calls on Skype while I download my email. Okay, I look a bit geeky doing that, but it’s worth it.

My only problem is that at my current rate, I won’t use up my €10 SkypeOut credit before I’m 90: I made a four-minute call to the UK earlier this week that cost me less than €0.07.
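For the curious, here’s that arithmetic as a quick Python sketch; the per-minute rate is just what that single call implies, so treat it as an upper bound (and the minutes as a lower bound):

```python
# Rough arithmetic on SkypeOut credit, using the call above:
# a four-minute UK call cost less than EUR 0.07.
call_cost_eur = 0.07            # upper bound on what the call cost
call_minutes = 4
rate_per_minute = call_cost_eur / call_minutes   # <= EUR 0.0175/minute

credit_eur = 10.0               # the SkypeOut credit I bought
minutes_available = credit_eur / rate_per_minute

print(f"Rate: at most EUR {rate_per_minute:.4f}/minute")
print(f"EUR {credit_eur:.0f} of credit buys at least {minutes_available:.0f} minutes")
# -> at least 571 minutes, or roughly 9.5 hours of UK calls
```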

Alphabet soup for lunch

Just watched a great lunchtime webinar, “The Business Value of Process Standards”, although I was dining al desko and managed to drop sunflower seeds into my keyboard, necessitating some mid-webinar keyboard surgery. I mentioned this webinar in a previous post on BPM standards, and was glad to see that it lived up to my expectations. ebizQ usually makes the webinars available for replay on the same link within a couple of days, so you can check it out if you missed it live.

Jeanne Baker, the primary speaker, is VP of Technology at Sterling Commerce, but she was speaking in her capacity as Chairman of BPMI and didn’t even mention Sterling’s products. In fact, the webinar was sponsored by Oracle, so the only product that was (briefly) discussed was Oracle’s BPEL Process Manager.

Since the discussion was on standards, the inevitable alphabet soup resulted, but two acronyms floated to the top of the broth: BPMN and BPEL. BPMN is a standard notation for modelling business processes, and BPEL is a standard language for executing business processes. Conveniently, BPMN maps directly to BPEL, so they work in concert for designing and implementing business processes. Both of the speakers stressed the importance of the BPMN and BPEL standards, a point with which I fully agree.
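To make that mapping a bit more concrete, here’s a toy sketch in Python of how a few BPMN elements line up with BPEL4WS activities; the pairings follow the commonly cited mapping, but this lookup table is purely illustrative, not any vendor’s actual transformation tooling:

```python
# Illustrative only: a few BPMN elements and the BPEL4WS activities
# they commonly map to (roughly per the mapping in the BPMN spec).
BPMN_TO_BPEL = {
    "task":                "<invoke>",    # call a service/partner
    "sequence flow":       "<sequence>",  # ordered execution
    "exclusive gateway":   "<switch>",    # data-based XOR branch
    "parallel gateway":    "<flow>",      # concurrent paths
    "timer event":         "<wait>",      # intermediate timer
    "message start event": "<receive>",   # process instantiation
}

def to_bpel(bpmn_elements):
    """Map a list of BPMN element names to BPEL activity tags."""
    return [BPMN_TO_BPEL[e] for e in bpmn_elements]

# A trivial process: start on an incoming message, then do two tasks in order
print(to_bpel(["message start event", "task", "task"]))
# ['<receive>', '<invoke>', '<invoke>']
```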

On the subject of modelling standards, I especially liked Ms. Baker’s comment on the use of UML for process design (which echoes my own sentiments from my previously-referenced posting): “UML is used by poor, hapless process modellers [who didn’t have anything better, such as BPMN].” I was laughing so hard when I heard that, I didn’t get the whole quote, but that’s the gist of it.

It’s also worth checking out the newly-designed BPMI site; it’s a lot nicer to look at and has a great deal more information than on my last visit. It gives a much better definition of BPMI’s role in standards development, and features an interesting graphic that Ms. Baker also used in her presentation to illustrate BPMI’s involvement in the lifecycle of business process management:

The diagram shows Process Designers as an essential link between business analysts and system architects, but that skill set is often absent on BPM projects. As she spoke about the importance of that role, it struck me that although I started my career as a developer and software architect, I now usually sit in the process designer role, while spreading in both directions into business analysis and system architecture. I describe myself as a “technology catalyst”, because I like to make stuff happen, especially that bridge between business and technology.

From the BPMI site:

BPMI focuses upon the Business Process as the inflexion point between the business environment and a technology implementation. Our work is relevant to a wide range of audiences as we innovate a seamless transition ‘path to execution’ for Business Processes. Our aim is to unify process thinking across Business and IT disciplines.

A lofty, but very worthwhile goal.

A “Column 2” sort of girl

Although process, not content, is my main focus, I dropped by the e-Content Institute’s 16th annual Information Highways conference in Toronto today to sit in on a workshop called “Bridging Obstacles in E-mail, Workflow and Compliance Management: Best Practices”. Although it was a vendor presentation by Tower Software, I figured that I’d see something interesting along the way.

As an aside, I have to say something about the conference title: the term “information highway” is a bit of a blast from the past. It might have been on the cutting edge 15 years ago when the conference started, but when’s the last time that you heard it without a snicker involved?

Meanwhile, back at the presentation, the speaker from Tower’s Toronto office couldn’t make it, and the replacement had flown in from DC this morning. She spent half of her intro expressing surprise that none of us had met the original speaker (from what she called “our Canada office”), as if Toronto were a small town where we all have dinner at the local Legion hall together on Saturday night. Sigh.

Tower’s view of workflow is interesting: they consider that it’s either ad hoc, transaction-based or knowledge-based, where the latter can be email-based, process-based or document-based… huh? Okay, I’ll cut the speaker some slack for having to work from someone else’s slide deck, but what was the original speaker thinking? Maybe he was trying to categorize everyone else’s stuff as ad hoc or transaction-based, then show why their “knowledge-based” workflow was better, but it wasn’t clear to me, and I’ve spent enough years around workflow that anyone’s explanation of where they fit in the space should be pretty obvious to me within a couple of minutes.

In spite of all that, there were some interesting tidbits, particularly about how email messages are now considered evidentiary in many cases, with their legal admissibility being based on authenticity, which in turn relies on content, context and structure being preserved and auditable. Unfortunately, although IT is usually responsible for email, they know nothing about records management (RM); furthermore, individuals manage/delete/archive (or don’t manage/delete/archive) their corporate email as if it belonged to them personally, not to the corporation. The answer, as even Tower admits, is not a software package, but the creation and enforcement of email RM policies.

Although their system can capture everything without user intervention, that’s not really recommended, because you just end up with a mass of undifferentiated data, not unlike what’s on many corporate email servers now. They state that every user needs to take some responsibility for RM (presumably because there are insufficient business rules built into the system to allow it to automatically categorize messages, or even recognize duplicates catalogued by multiple recipients), but I think the chances of that happening on a suitably complete basis are pretty small when you consider that most people don’t even file the items in their Inbox and Sent Items into properly categorized folders.
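As a sketch of what recognizing duplicates might involve (my illustration, not Tower’s actual implementation; the message fields are assumptions), a records system could fingerprint each captured message on a few normalized fields, so that the same email catalogued from several recipients’ mailboxes collapses to a single record:

```python
import hashlib

def message_fingerprint(msg):
    """Fingerprint a message on normalized key fields. 'msg' is assumed
    to be a dict with these keys; the Message-ID header alone is often
    sufficient when it has been preserved."""
    key = "|".join([
        msg["message_id"].strip().lower(),   # RFC 2822 Message-ID header
        msg["sender"].strip().lower(),
        msg["subject"].strip().lower(),
        msg["body"].strip(),
    ])
    return hashlib.sha1(key.encode("utf-8")).hexdigest()

catalogue = {}

def file_message(msg):
    """File the message only if this fingerprint hasn't been seen;
    otherwise point at the existing record instead of duplicating it."""
    fp = message_fingerprint(msg)
    if fp in catalogue:
        return catalogue[fp]    # duplicate from another recipient's mailbox
    catalogue[fp] = msg
    return msg
```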

All good stuff, but a bit of a yawn: now I remember why I kept my focus on process and became less and less interested in content except as an adjunct to process. I recall working on a client project several months ago where I was designing a BPM implementation to integrate with a line-of-business database, hence I had a lot of discussions with the data architect who was designing the database side of things. I dropped by his desk one afternoon and we had a rather passionate discussion about the relative roles of data and process in the system.

After some amount of discussion, I said, “Do you know the Zachman framework? Well, I’m a column 2 kind of girl.”

“That explains it,” he said. “I’m a column 1 kind of guy.”

Clearly incompatible.

BPMG London conference

Looking at the BPMG’s 13th annual conference in London next month, there’s some interesting material at all levels. It appears that this is really the main BPMG conference worldwide, which makes sense because BPMG started in the UK and there is a strong BPM community there. The conference in Las Vegas later in May doesn’t seem to have the breadth of London’s event, and I really don’t want the Vegas immersion experience anyway. Besides which, they have one price for “Industry Professionals” and a higher price for “Vendors/Consultants” — what’s with that? I do consulting for a living (although I hate the word “consultant” because of the high number of IT charlatans who assume that title), but I’m certainly considered an industry professional: would I have to pretend not to be a consultant to get a fair price?

When did 3-day conferences get so expensive? The London 3-day package (workshops plus conference) is the equivalent of C$3,200, plus the cost of travel and living… a pretty significant outlay for a small business. The Vegas conference costs even more for a day of workshops plus the conference, and adding the evening workshops tops it out at over C$5,000. Given that budgets are still tight in many organizations, who can justify attending these?

Networking games

I recently started using LinkedIn, a very cool professional networking site that allows you to establish a network of trusted references, then link to other people through your connections on the assumption that they’d be willing to pass along your request for contact to someone that they know. You can search for people based on a number of criteria such as location and market segment, with the search results being all the people who are connected to you by four or fewer degrees of separation (plus people like me who allow you to request a connection directly without going through an intermediary). I’ve built up a list of 35 people in the last two weeks, which links me to 220,000 people through these connections up to four degrees away. That’s a drop in the bucket compared to one LinkedIn member with over 6,000 people on his immediate connections list, and probably about a gazillion people that he’s linked to, but enough to have me test this out as a networking method. So far, it’s working well enough that every time I meet someone in business now, I search for them on LinkedIn, and either get connected or invite them to join.

Like most other cool things on the web, there’s a game to be had with this. Remember the Googlewhack craze of ’02, where you tried to find a combination of two search words that yielded a single result? LinkedIn lends itself to similar games. Our latest one (thanks, Damir) is to find how many people you are connected to in any random country, counting only the connections up to four degrees and not the people who allow themselves to be contacted without a connection (who will also show up in any search). For example, I’m connected to two people in Gibraltar, even though I’ve never been there, both of them at four degrees of separation. (I also see four others in my search results, but they aren’t connected to me, so they don’t count according to the strict rules of our game.) The really cool thing is that when I select one of them to see through which of my direct connections they are connected to me, the first one turns out to be connected through seven of my direct connections, and I’m pretty sure that four of those seven have never met each other. So far, I haven’t found a country where I’m not connected to at least one person, which definitely says something about the power of networking.
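Under the hood, a search like this is just a breadth-first traversal of the connection graph, cut off at four degrees. Here’s a minimal sketch with a toy graph (certainly not LinkedIn’s actual implementation):

```python
from collections import deque

def within_degrees(graph, me, max_degrees=4):
    """Return {person: degrees} for everyone reachable from 'me' in at
    most max_degrees hops. 'graph' maps each person to the set of
    their direct connections."""
    reached = {me: 0}
    queue = deque([me])
    while queue:
        person = queue.popleft()
        if reached[person] == max_degrees:
            continue                    # don't expand past the cutoff
        for contact in graph.get(person, ()):
            if contact not in reached:
                reached[contact] = reached[person] + 1
                queue.append(contact)
    return reached

# Toy example: me -> Damir -> Ana -> someone in Gibraltar
graph = {
    "me": {"Damir"},
    "Damir": {"me", "Ana"},
    "Ana": {"Damir", "Gibraltar contact"},
}
print(within_degrees(graph, "me"))
# {'me': 0, 'Damir': 1, 'Ana': 2, 'Gibraltar contact': 3}
```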

Professional networking: a necessity for business? Of course. A fun weekend activity? You bet.

BPM standards

I feel pretty strongly about the benefits of standards in all areas of technology, and BPM is no exception. For years, we’ve been using different notations in different tools to create BPM systems that don’t communicate with each other because they speak different languages. There’s been a lot of headway in standardizing the communications part — the messaging protocols, for example — but there’s still work to be done on the design end. How can we hope to make BPM systems truly interoperable when they don’t use the same notation to model the flow within the system?

Every BPM system has its own process designer tool, but when it comes right down to it, most people model their processes in Microsoft Visio before implementing them in the BPM system. First of all, a lot of people already have Visio on their desktop and know how to use it, so there are fewer licensing and training issues. Secondly, a process model will usually include all sorts of non-automated steps that will never be represented in the BPM system (especially in the “as-is” model), but need to be modelled for proper business analysis. I’m not interested in tools, however, but in the actual BPM notation, which can be drawn in Visio or any number of other process modelling tools.

There are standards emerging for BPM notation, with two strong contenders: UML activity diagrams, and BPMN business process diagrams. If you have a technical background and haven’t had your head in a paper sack for the past 10 years, you know about UML; however, you probably know it as a modelling tool for software design, with activity diagrams used for computational processes, which is what they were originally designed for. UML activity diagrams have recently been repurposed for business process modelling: a type of object-oriented flowchart, if you please.

As much as I like UML for software design, I like the emerging BPMN standard (from BPMI, an industry standards body) better: it’s been designed as a business process modelling notation, not retrofitted from some other standard; it’s more understandable to business analysts and other non-technical participants; and it has a direct mapping to WS-BPEL for process orchestration. It was only introduced last year and may take a while to catch on, but it’s worth knowing about.

If you’re interested in learning more about BPM standards, there’s a non-vendor webinar (a rarity!) called “The Business Value of Process Standards” at ebizQ on April 6th.

Testing for real life

I watched the movie K-19: The Widowmaker on TV last night; it’s about a Soviet nuclear submarine on its maiden voyage in 1961 where pretty much everything goes wrong. In the midst of watching reactor failures and other slightly less catastrophic mishaps, I started thinking about software testing. I’ve seen software that exhibited the functional equivalent of a reactor failure: a major point of failure that required immediate shutdown for repairs. Fortunately, since I have worked primarily on back-office BPM systems for financial services clients over the years, the impact of these catastrophic system failures is measured in lost efficiencies (time and money) from having to revert to paper-based processes, not in human lives.

When I owned a professional services company in the ’90s, I spent many years being directly responsible for the quality of the software that left our hands and was installed on our clients’ systems. In the early days, I did much of the design, although that was later spread over a team of designers, and I like to think that good design led to systems with a low “incident” rate. That’s only part of the equation, however. Without doubt, the single most important thing that I did to maximize the quality of our product was to create an autonomous quality assurance and testing team that was equivalent in rank (and capabilities) to the design and development teams, and had the power to stop the release of software to a client. Because of this, virtually all of our “showstopper” bugs occurred while the system was still in testing, saving our clients the expense of production downtime, and maintaining our own professional reputation. Although we always created emergency system failure plans that would allow our client to revert to a manual process, these plans were rarely executed due to faults in our software, although I did see them used in cases of hardware and environmental failures.

When I watched Liam Neeson’s character in K-19 try to stop the sea trials of the sub because it wasn’t ready, and be overruled for political reasons, I heard echoes of so many software projects gone wrong, so many systems put into production with inadequate testing despite a QA team’s protests. But not on my watch.

A lesson in disintermediation

I recall learning the real meaning of the word “disintermediation” in the mid-’90s, when I was helping mutual fund companies build systems to do exactly that: cut out the middleman (the broker or dealer) in some of the transactions that they have with their end-customers. The primary vehicle for this disintermediation is, of course, the web, where now almost all financial services companies provide some sort of self-service, bypassing a mutual funds dealer, securities trader, bank teller or insurance broker whenever regulations allow. This trend is not restricted to financial services: travel agents, for example, have been practically disintermediated out of existence in some market segments.

The “bad guy” in this has always been the big company: by allowing their customers direct access to their services, they endanger the livelihood of the intermediary. (Having been self-employed for most of 20 years, I don’t believe in the sanctity of any job, but that’s a topic for another day.)

Now it’s the banks’ turn to be disintermediated. The Economist reports in a recent article on the launch of Zopa, a UK-based online lending and borrowing exchange: consider it peer-to-peer lending for the rest of us.

Since borrowers and lenders can get together without a bank in the middle, Zopa effectively disintermediates the bank, while providing bank-like security through credit checks, spreading each loan over multiple parties, and committing to collecting from overdue borrowers. Borrowers are classified by their risk, and lenders choose the level of risk that they want to assume when picking a market within Zopa. Zopa takes 1% of the deal, which means that borrowers get a better rate than a bank could offer them for a consumer loan, and lenders get a better rate than a bank savings account. To quote their site:

Lenders can choose what rate to lend at and, by looking at the markets, decide what sort of people to lend to and when. Borrowers can choose to take a rate offered or to wait and see whether the rates drop. Both avoid paying needless chunks of commission to Financial MegaCorp plc and can get better rates of interest as a result.

I love that dig about “Financial MegaCorp plc”.
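To put some rough numbers on that spread, here’s a small sketch; every rate in it is my own illustrative guess, with only the 1% exchange fee coming from the article:

```python
# Illustrative numbers only -- not Zopa's actual markets or any bank's rates.
bank_loan_rate    = 0.12   # what a bank might charge for a consumer loan
bank_savings_rate = 0.04   # what a bank might pay on a savings account
# The bank keeps the 8-point spread between those two.

exchange_fee  = 0.01       # Zopa takes 1% of the deal (from the article)
lender_rate   = 0.07       # a rate a lender might choose to lend at
borrower_rate = lender_rate + exchange_fee   # 8% all-in for the borrower

print(f"Borrower pays {borrower_rate:.0%} instead of {bank_loan_rate:.0%}")
print(f"Lender earns {lender_rate:.0%} instead of {bank_savings_rate:.0%}")
```

With only a 1% fee in the middle, the borrower’s rate and the lender’s return can sit much closer together than a bank’s loan/savings spread allows, which is the whole point.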

You can be sure that the big financial institutions will fight the growth of exchanges like Zopa when it starts to impact their business, but then, every party being disintermediated fights the trend. This style of borrowing and lending won’t be for everyone, but it will attract those who don’t want to spend the extra money just to have a bank in the middle, any more than I would pay a higher fare to have a travel agent book an airline ticket for me.