BPM – Is it a Software Engineering Tool? A Technology? or a Management Discipline?

November 30, 2008

In his excellent posting, Keith Swenson makes many good points. He points to the range of interpretations of BPM, and particularly highlights the issues associated with its interpretation by software engineers as just another piece of hype on the road to good programs. But I think there is another, perhaps more important strand buried in there. As Keith points out, BPM is about the Management of Business Processes.

As we all know, everyone’s interpretation of the term Business Process is different. In my training (whether that be BPMN, or higher level training on BPM Methods), it is one of the first things I get people to write down (inside the first 5 mins), and no surprise, every definition is entirely different. And when those people are senior managers in a company, their interpretation of the term is invariably what I call “Process as Purpose“. The point is that they see Processes as being more about the purpose than the constraint implied by sequencing of steps. They are there (at the training) because they see the importance of “Managing” their processes. Indeed, they see that concept (Managing Business Processes) as central to the success of their companies.

[[ I am still down in Brazil, and I am really struck by the process sophistication of the people I am meeting. They all get it. I was more than surprised to find two “Business Owners” (people who own significant businesses), giving up 3 days of their time to come on a course around how to structure and run BPM programs. They were cherry picking from the broad range of techniques we covered, but ask yourself whether you could imagine the CEO or COO deciding that they should attend a public training course. That’s what I am getting at about the sophistication of the Brazilian business climate. ]]

Coming back to Keith’s post, he describes a spectrum of BPM interpretation – from pure Software Engineering (where the SW Eng tries to reflect the needs of the business person’s Business Process); through Business Processes being modeled by a business person, then tossed over the fence to a Software Engineer to finish them off; to the Business Process as modeled by the business person, then being directly executed (what he called “Pure BPM”). I am not quite sure I agree with the Pure BPM bit, but I do know what he is pointing to … where the processes of the firm are driven by models, without translation to some intermediate executable format (like, say, BPEL).

One of the comments on Keith’s post points to the challenges of getting business people to model their own processes and make the resulting collection of stuff useful. He described the usual resulting mess as an “expensive disaster”. And the reason for this is that business people don’t usually have the sophistication to understand their business problem as a set of inter-related processes that between them deliver on the “Process as Purpose” concept I referred to earlier.

Invariably, process modelers (whether IT or business) tend to see a process problem as a single process. They interpret the high level Purpose as a single implementation process (which invariably it is not). They make all sorts of mistakes, such as mixing up “Handle an Instance” with “Manage the Flow of Instances”; they switch from batch mode to handling a single instance; they don’t think about the interfaces between processes (handing information from one to another), etc. What they do is try to connect up everything that sounds like an Activity into one convoluted process.

Now software engineers are usually more adept at the necessary abstract thinking, but that doesn’t mean business people cannot get their heads around the notions. It is merely a reflection of the fact that they have not had adequate training. What is missing (across this entire industry) is better learning around “Process Architecture” – what “chunks” do you need and why. Poor chunking leads to unnecessary complexity (and even “expensive disasters”).

We are still stuck with decomposition as the prevailing mindset – where sub-processes are always contained within the parent. SOA concepts seek to get around this, but there is also a higher level “Business Services Oriented Architecture”. Processes should not be regarded as some sort of static hierarchy; they are more accurately regarded as a network of interacting instances. Think more jigsaw puzzle than org chart.

When I gave a “Power Breakfast” at the last Gartner BPM Summit on BPM and Process Architecture, I had a packed room (it was starting at 7:30 in the morning, so these people were keen). I described a set of methods that you could use to go from “What Business Are You In” to “What Processes Do You Need”, right down to the SOA components if that is what you wanted to do (I would recommend looking at a BPM Suite first, rather than going straight to the SOA software engineer’s paradise). I only saw one person out of the 90 or so get up and leave, and nearly everyone else gave me their card at the end of it. The room was composed mostly of Enterprise Architecture folks from the IT community, all of whom struggle with this transition.

Switching tacks – the vendors’ BPM Suites are unconsciously making this architecture problem worse. With only a few exceptions (Pega, Itensil, BizAgi … I am sure there are others in this category too; these are the ones that spring to mind), vendors interpret the business process problem as being entirely separate from the data and artifacts associated with the process (business people see them as intertwined). They regard the process-relevant data as a set of named value pairs … the information required by Process A is declared on Process A, and must be recreated on Process B and then mapped from one to the other (if the processes need to communicate with each other). That means that there is an extra (unnecessary) layer of complexity for business people trying to reflect their business problem. Moreover, if you change one process, then you need to refactor all the interfaces. This is “software engineering” oriented thinking.

The other approach is to define your data structures (perhaps as an “Entity” defined as an extensible set of XML artifacts) and then describe the views on those artifacts at the level of the data structure. Then it is merely a matter of associating your processes with the Data Entity, and all the different views become available. Process interfaces become an order of magnitude more accessible (to the business user), you can use any number of processes to support a single case of work, and again it helps move away from the software engineering mindset we find in so many BPM tools (which were often created to solve the problem of Enterprise Application Integration … hence their association with Software Engineering).
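To make the contrast concrete, here is a minimal sketch (the field and entity names are invented for illustration) of the two approaches just described: per-process named value pairs that require an explicit mapping layer, versus a shared Entity on which views are defined once.

```python
# Approach 1: each process declares its own named-value pairs, so passing
# data from Process A to Process B needs an explicit mapping that must be
# maintained whenever either process changes.
process_a_vars = {"cust_name": "Acme Corp", "cust_addr": "1 Main St"}
mapping_a_to_b = {"cust_name": "customer", "cust_addr": "address"}
process_b_vars = {b_key: process_a_vars[a_key]
                  for a_key, b_key in mapping_a_to_b.items()}

# Approach 2: both processes reference one shared Entity; views are defined
# once, at the level of the data structure, and any process associated with
# the entity sees the same underlying data - no mapping layer at all.
class Entity:
    def __init__(self, **fields):
        self.fields = fields

    def view(self, *names):
        # A named subset of the entity's fields, usable by any process.
        return {n: self.fields[n] for n in names}

customer = Entity(name="Acme Corp", address="1 Main St", vat_id="GB123")
view_for_a = customer.view("name", "address")
view_for_b = customer.view("name", "vat_id")
```

Changing the entity in the second approach changes what every associated process sees; in the first, every mapping dictionary has to be refactored by hand.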

Some amazing results already – Business Analyst Survey

August 12, 2008

Well, I must say I am more than just a little surprised at some of the results that have popped out of our Business Analyst survey. With nearly 300 responses now, it is starting to become statistically significant (although with hindsight some of the questions could have been phrased more directly).

First, there is the apparent popularity of Role Activity Diagrams. To my great surprise, the technique is more widely used than BPMN at this point. Something like 47% of respondents have indicated that they use RADs. Now I am sure that there is a certain element of confusion here (RADs are not the same as Flow Diagrams segmented by Swimlanes), but nevertheless, it indicates a heightened level of awareness around the technique. Of course, most Business Analysts still model processes without any formal techniques underpinning their approaches.

Secondly, despite the attempted spin from some naysayers in our industry, around 65% of firms do use a BPM Suite or Workflow product somewhere in their organization (and a significant proportion have more than one BPMS). This is validated by the fact that a similar number of people say their own process models are used to support a BPM or Workflow implementation. And it is quite insightful to see the distribution of the products used (it roughly follows what I would have guessed at the outset … which is somewhat at odds with the touted penetration of the vendors involved).

Thirdly, there is the percentage of respondents who feel they would benefit from further training. For the moment I will keep the details of this to myself, along with the areas that are of most interest (let’s wait till the survey is completed in mid-October).

If you haven’t taken the survey then get your votes in … it should only take about 5-10 mins (something to do on those quiet summer days in the office ;-)). To encourage you to take part, we have put up some significant prizes (a Nikon D200 or a Bose Home Entertainment System being the first prize). The drawing (from those that complete the survey) will take place at the next BPM Technology Showcase in Washington DC (October 14-16).

Itensil Dynamic Process Platform

April 15, 2008

Clearly I have been quiet for some time – busy organizing conferences and events and also spending more and more time exploring the world of Software as a Service (SaaS) as it applies to BPM. Some are talking about this conjunction as really a Platform as a Service (PaaS). Over the last 6 months or so I have been playing in the SaaS-BPM space – initially developing the process modeling training material for Appian, and more recently experimenting with a relative newcomer – Itensil.

A while back, I talked quite a bit about the Appian experience – a powerful application building environment that is now delivered On Demand. However, in common with virtually all BPM Suites, it suffered from an overdose of complexity – it’s just not really accessible to the common person.

Over recent months, I have been experimenting with Itensil as a vehicle to deliver both on demand BPM Training and a robust collaboration support environment for BPM Projects. I have also used it to support the preparation of the two major events I have been involved in putting on – the BPM Technology Showcase and BPM Lisbon 2008. I think you would agree this is not your run-of-the-mill BPM problem, yet it shares many characteristics with difficult problems found in many businesses. With this series of postings I will explore what I have found out about Itensil.


If you are reading this, then you are probably already fairly knowledgeable about Business Process, and will understand that I have spent most of the last 20 years focusing almost exclusively in that domain. Initially this was developing a product called Office Engine (I killed that off in 1992 – it was not a good time to have the greatest thing since sliced bread), and since then I’ve been crawling all over, and teasing apart, just about every major product out there concerned with Business Process – my main interest is in using business processes to drive the way that work happens.

Now if you are like me, you are probably more than just a little frustrated with the rigid and inflexible nature of most workflow systems. What I am getting at is the lack of adaptability inherent in most workflow and BPM systems – you are stuck with whatever process was described at the time it was built. Now while that might be a good thing for a big insurance company or bank wanting to make sure that a clerk doesn’t get creative with a bank draft, it is a cumbersome problem for the rest of us. It smacks of control for control’s own sake. Even for the smallest change, you have to refer your application back to at least a business analyst (or worse, the IT department) and then wait for them to get around to understanding what you want, before they reflect that in the next rev that is rolled out some time in the future. Not much use to you as you try to deal with some unique customer situation that has just emerged in the middle of your major bid process; or to the customer who says to the architect – I know we are halfway through building this hotel, but can I have a swimming pool on the 10th floor? (The point being that the architect cannot throw away all the work to date and start with a new version of the process.)

I am referring to a class of processes and applications that most Workflow/BPM Suites just can’t get anywhere near – collaborative knowledge worker processes, where individuals continually interpret the case in hand and make decisions accordingly. I have already mentioned architecture and bid management, but the examples here are endless – from emergency response management, to how an advertising agency operates, through to consultancy, medical investigations (indeed all sorts of investigations and research) … right through to what you do.

Adaptability is also an issue if you are building out applications – it’s just that now you have to blend adaptability and collaboration with more rigorous procedures – all of which somehow need to live together. I think this is the sweet spot for BPM and SaaS delivery. It is not the highly rigorous, slowly changing transactional procedures that you use to record and support your sales transactions, or issue credit cards. Those sorts of processes are what traditional BPM Suites are designed for (and are good at). But what they cannot handle are scenarios where change and adaptability are at the core – where change is part of the domain.

Itensil Dynamic Process Platform (DPP)

The Itensil Dynamic Process Platform is really quite a different architecture from every other BPM Suite I have looked at. The developers have combined the ease of use and adaptability of a wiki with the mechanics of Process. Every object in the system, whether it be a Process, a Document, an Entity or even the home page, has a sort of wiki-style history feature. You can roll back at any time (making the previous version active).

Process development has some real wow factors associated with it (see Instant Wows below). Each Process design can then be instantiated into individual Process Activities (each of which can be adapted by the end user on the fly if needed). There is also an instant, ad hoc team collaboration environment, which can be endlessly adapted and added to (really a special use of the process functionality that is embedded into a tool called “Meeting”). Any number of instances of a Process (and Meetings) can be rolled up into a “Project,” which has its own Wiki home page. A Process instance might belong to several different Projects. There’s a lot more than this, but I will get to that later.

This is the core environment that delivers an end-user accessible process support environment. Itensil describe the processes that they designed the product to support as “roughly repeatable,” where each instance of a process is subtly different from all that went before. It really is a nuanced blend of what I call “Practice” and “Procedure.”

At the heart of the system is a sophisticated document management and content repository that is all served up out of a LAMP stack – actually most end-users won’t care what operating system or application server platform it sits on, it is delivered On Demand over the Internet (although an On Premise version is also available for those who must have it).

But it’s not just an end-user tool set – you could think of the Itensil DPP as Process Platform as a Service (PPaaS), where an application developer or third party can embed their own IP and application know-how into a robust application and blend that with an accessible and tightly integrated user interface that is entirely delivered through AJAX in a Browser.

Collaboration+Process≠Workflow+Document Management    

While all BPM Suite vendors are trying to work out how to graft collaboration onto their existing workflow tooling, Itensil have taken another approach. From the ground up, they have added business process support to a rich collaboration environment. Rather than design-time ease of use (i.e. for the Business Analyst and Professional Developer), they have sought to deliver run-time ease of use and adaptability for the knowledge worker (i.e. designed for people like you and me). In Itensil, all processes are developed in the environment itself using the outline editor and/or the process modeling canvas. Each one is, in its own way, pretty special (see below).

Itensil’s Organizing Principle is Adaptability @ The User Level

Instant Wows

Outline to Process – When initially envisaging a process, the user merely outlines the steps in a wiki editor (perhaps cutting and pasting from somewhere on the web, or from an office document such as Word, PDF, Excel or a Mindmap outline). They can also insert markers (associated file attributes) to create dropzones for any outputs that are produced, and specify team roles, along with review loops and due dates if desired. The Process itself is generated automatically. Each step has an attached wiki page that serves as the user interface for that step.
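The outline-to-process idea can be sketched in a few lines. This is an illustrative toy, not Itensil’s actual format: I have invented a convention where each outline line becomes a step and an optional “@Role” marker assigns a team role.

```python
def outline_to_process(outline: str):
    """Turn a pasted text outline into a list of process steps."""
    steps = []
    for line in outline.strip().splitlines():
        line = line.strip(" -*\t")   # drop bullet characters
        if not line:
            continue
        role = None
        if "@" in line:
            # Invented convention: "@Role" at the end assigns the step.
            line, _, role = line.rpartition("@")
            line, role = line.strip(), role.strip()
        steps.append({"step": line, "role": role})
    return steps

outline = """
- Draft proposal @Writer
- Review draft @Editor
- Publish
"""
process = outline_to_process(outline)
```

Each generated step would then get its own wiki page as the user interface, as described above.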

Instant Mode Switching – The user can immediately switch between “Run” mode (instantiating the process); the “Status” view (providing an overview of the steps, along with mechanisms to re-assign tasks to team members, set due dates, jump to steps, etc.); and the “Design” view, which takes you to a drag and drop modeling canvas and the outline editor.

On the Fly Process Change – When a Process is run, each instance is referred to as an Activity, and each is adaptable at run-time (as required by the end-user). Alternatively, the user might decide to save the changes to the default process model (affecting all future Activities).

Run Time Binding – At the Task level, the user could decide to bind a separate (stand-alone) Process to a Task to further elaborate on how the work should be done (perhaps involving other roles). The user interface allows the standalone sub-process to be viewed discretely, with artifacts dragged from one process level to another (thereby avoiding the need for complex mapping).

WebDav – Itensil supports a network file system that is delivered via the Internet (while HTTP gives you read access, WebDAV gives you write access). Essentially, it acts like a virtual file system, allowing users to edit files in situ (saving the new version automatically). It provides a network drive that is available at the Project, Process, or Meeting levels (indeed any level). In this way, Itensil can more easily support the virtual enterprise as users up and down the value chain collaborate in a safe and secure environment.
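Under the covers, a WebDAV file save is just an HTTP PUT to the file’s URL. This sketch builds (but does not send) such a request with Python’s standard library; the URL, credentials and content type are hypothetical, not a real Itensil endpoint.

```python
import base64
import urllib.request

# Hypothetical WebDAV location of a document inside a Project.
url = "https://example.itensil.invalid/projects/bid-2008/proposal.doc"
body = b"new version of the document"
credentials = base64.b64encode(b"user:secret").decode()

# WebDAV write access: the same URL you read with GET is written with PUT.
req = urllib.request.Request(url, data=body, method="PUT")
req.add_header("Authorization", f"Basic {credentials}")
req.add_header("Content-Type", "application/msword")

# urllib.request.urlopen(req) would upload the new version in place;
# the server then records it as the latest entry in the file's history.
```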

Object History – all objects in the domain receive the wiki treatment; each file, each page, each process, everything … has a history associated with it. Suitably authorized users can roll back to a previous version (making the older version active but still keeping the new edits). Even a process can be undone step-by-step to recover from errors or facilitate a change to the process. Of course, all objects in the system have a robust security model associated with them (when setting up the system, it is possible to control the default permissions on different classes of objects).

Look Ma, No Forms – given that all steps in a process are presented to the user via a WYSIWYG wiki page, the system provides a natural canvas for presenting information. Users can create Process Attributes at any point, and simply drag them onto the wiki canvas in edit mode. Placement is just a question of where the cursor was at the time. In fact, what is happening is that a tiny XForm is created automatically to support that attribute. So if you decide that this process needs a “Budget” field, all you need to do is drop into Design Mode, select the step where it is set, open the Attributes panel, click the new attribute link, give it a name, say what sort it is (say, Currency) and then drag it onto the canvas where you want it. The user will then be prompted to provide the Budget at the right point (and the value might then be reused at a later step).
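To give a feel for the “tiny XForm” idea, here is a sketch of a generator that turns a process attribute into an XForms-style control. The markup is simplified and illustrative; Itensil’s real generated output is not published here.

```python
def attribute_to_xform(name: str, datatype: str) -> str:
    """Generate a minimal XForms-like control bound to one attribute.

    Illustrative only: a bind declaring the type, plus one input control
    with a label, as would be dropped onto the wiki canvas.
    """
    return (
        f'<xf:bind nodeset="{name}" type="xs:{datatype}"/>'
        f'<xf:input ref="{name}"><xf:label>{name}</xf:label></xf:input>'
    )

# The "Budget" example from the text, typed as a decimal.
budget_widget = attribute_to_xform("Budget", "decimal")
```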

These points have massive implications for the adoption, design and usability of BPM – rather than having to predict all processes in advance, the user can decide how to interpret the task in hand and apply an appropriate personal template. The net result: better adoption by the end-users themselves (they are in control) and a more agile, adaptable organization. The outline-to-process functionality provides a real step up in accessibility, putting it in reach of the average knowledge worker or manager – if you can work with a PowerPoint outline, you can work with Itensil.

Itensil Power Features

While all of this might sound really interesting but not quite appropriate for your complex business problems, the latest set of features delivered by Itensil changes all of that. These facilities are primarily designed for the professional developer or business analyst; they provide the tooling to help build out Itensil into vertical industry domains and deliver discrete, knowledge-intensive applications. Further, I understand that the whole environment is configurable, including the look and feel and the brand image delivered (meaning that OEM partners can embed the entire environment into their own offering).

Entities – Itensil now incorporates a mechanism to flesh out your Line of Business (LOB) data structure, and establish the relationships between those Entities. Once these are established, the system can automatically walk the meta-data structure, finding related Entities and presenting the correct information to the user at run-time.

This is in contrast to the usual approach taken by most BPM vendors who rely on Process level variables (i.e. named value pairs) to represent LOB data. This traditional approach introduces significant complexity to the end user trying to work on developing a process (they must always worry about the mapping of attributes from one process to another and over time the sets of variables can easily become fragmented, further driving the complexity and cost of ownership). In Itensil, this is all done automatically.
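The metadata walk described above amounts to a graph traversal over the entity relationships. This toy sketch (entity names invented) shows how declaring the relationships once lets the runtime discover every related entity automatically, with no per-process mapping.

```python
# Relationship metadata, declared once at the LOB data-model level.
relationships = {
    "Order": ["Customer", "Product"],
    "Customer": ["Address"],
    "Product": [],
    "Address": [],
}

def related_entities(entity_type, seen=None):
    """Depth-first walk over the relationship metadata, returning every
    entity type reachable from the given one."""
    seen = set() if seen is None else seen
    for rel in relationships.get(entity_type, []):
        if rel not in seen:
            seen.add(rel)
            related_entities(rel, seen)
    return seen

# Starting from an Order, the runtime can find and present Customer,
# Product and (via Customer) Address - without any process declaring them.
reachable = related_entities("Order")
```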

Each Entity can have a number of XForms associated with it. Since the Entity relationships are already known, the XForms environment can also display the related Form at the same time as the primary Entity. So a Person form might also include the Address as a sub-form. When designing a Process, an Entity Relationship is easily established that can then automatically make use of this form – say you know that the user should enter a new person called a “Target”; then it is as simple as saying a New Entity Relationship of type Person is required, which is referred to as a Target. When you drag the resulting widget onto the work zone wiki page, the system will ask which Form is to be used (as there might be several different views), and hey presto, you have a sophisticated XForm delivered to the user (based on a robust data model). Of course, the professional developer can also resort to specialist tools for creating sophisticated XForms instead of using the out-of-the-box functionality.

Integrated Rule Builder – Of course, developers might want even greater sophistication to create rule-driven XForms that embed some special behavior. Say, for example, you want to display a different section of the Form based on whether the Customer is based in the US (the tax calculation might then be different on a state by state basis), or, if the Customer is in the UK, the form should include provisions for VAT (a wholly different regime). The point is that these form-related rules execute on the fly in the browser of the user, yet still point back to the Entities and Process Attributes (speeding up the user experience and removing latency in the connection). On the other hand, say you want the rule to run on the server to govern process logic; then that same rule, or a new one, could be applied to the process. All of this functionality is supported with a drag and drop rules builder that knows all about the associated entities and attributes (i.e. it is tightly integrated with the LOB data modeled earlier).
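The key design point is one rule definition reused in two execution contexts. This sketch (rule names and fields invented) shows the same rule driving a form decision “client side” and a routing decision “server side”:

```python
# One shared rule set, defined once against the LOB attributes.
RULES = {
    "needs_vat_section": lambda data: data.get("country") == "UK",
    "needs_state_tax":   lambda data: data.get("country") == "US",
}

def evaluate(rule_name, form_data):
    """Evaluate a named rule; the same call could run in the browser
    (to show/hide a form section) or on the server (to route a process)."""
    return RULES[rule_name](form_data)

# "Client side": decide which section of the form to display.
show_vat = evaluate("needs_vat_section", {"country": "UK"})

# "Server side": the same rule gating a process branch for a US customer.
route_to_vat_step = evaluate("needs_vat_section", {"country": "US"})
```

Because both contexts share one definition, changing the rule changes the form behavior and the process logic together.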

Partner/Customer Mode – this interface enables partners and customers to take part in the process, yet doesn’t provide them with access to the rest of the environment. For example, if your application is designed to support the RFP process in putting on a conference, each hotel representative can be provided with this sort of guest access to upload their response and later respond to queries.

Organizational Hierarchy – the new version introduces a more sophisticated modeler, allowing the system to reflect any number of complex organizational forms. For example, it now supports a user changing department in the organization, with the supervisor role updated automatically.

Courseware – Now this might sound a bit strange for a BPM Suite, but when you consider that this is a knowledge-centric environment (heads-up process support, as against heads-down process control), it is important that you have a way of training people and unlocking areas of the system once they are qualified. It is also a better way of ensuring compliance (in the sense that only qualified people are given relevant work to do). So Itensil have released a Courseware builder – where the course itself is a process, and the wiki page is the user interface for delivery. It will handle embedded graphics, all your advice notes and any attachments (whether they be PDFs or bits of embedded Flash).

Training can then be delivered in-situ, as required, at the coalface. The new Quiz Builder supports the user as she develops the process-driven courseware, generating multiple-choice quizzes to test a user’s understanding before moving on to the next section or opening up access to some specific process or feature capability (you have to supply the questions and answers). People taking a course don’t get access to the process that controls it, but from the perspective of a professional training developer this sort of functionality is really game-changing. Quite apart from changing the rules of the game in Distance Learning delivery (better than endlessly regenerating Flash courseware), it is also a way of ensuring that applications and features are sufficiently understood.
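The quiz-gating mechanism is simple to picture: a feature stays locked until the user passes a multiple-choice quiz. This sketch uses invented questions and an assumed 80% pass mark for illustration:

```python
# A tiny multiple-choice quiz (questions and answers are illustrative).
quiz = [
    {"question": "What is a running instance of a Process called?",
     "options": ["Activity", "Meeting", "Project"],
     "answer": "Activity"},
    {"question": "Which level has its own wiki home page?",
     "options": ["Task", "Project"],
     "answer": "Project"},
]

def passes(quiz, responses, pass_mark=0.8):
    """Return True if the responses meet the pass mark - the gate that
    would unlock the related process or feature."""
    correct = sum(q["answer"] == r for q, r in zip(quiz, responses))
    return correct / len(quiz) >= pass_mark

unlocked = passes(quiz, ["Activity", "Project"])      # full marks
still_locked = passes(quiz, ["Meeting", "Project"])   # 50% is below 80%
```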

Now I am sure it could do with improvement, and as I write this I am already thinking of areas where I would like to see changes, but once you grok the implications of this it is quite literally mind-blowing.

Excel Interface – Itensil now supports the capability to import and export form content to an Excel spreadsheet, leveraging the XML format of Excel. While you could use the cut and paste method (as in the Paste to Outline feature of the Process building interface), in certain applications it is necessary to export LOB data into Excel for offline and disconnected working (or even just to take a snapshot of a case in order to lock down the data as an evidential artifact).
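For a flavor of what “the XML format of Excel” export involves, here is a sketch that writes form attributes as minimal Excel 2003 SpreadsheetML. The field names are invented, and a real export would handle types, styles and escaping.

```python
def to_spreadsheet_ml(rows):
    """Render rows of string values as a minimal SpreadsheetML workbook."""
    def cells(row):
        return "".join(
            f'<Cell><Data ss:Type="String">{v}</Data></Cell>' for v in row)

    body = "".join(f"<Row>{cells(r)}</Row>" for r in rows)
    return ('<?xml version="1.0"?>'
            '<Workbook xmlns="urn:schemas-microsoft-com:office:spreadsheet" '
            'xmlns:ss="urn:schemas-microsoft-com:office:spreadsheet">'
            f'<Worksheet ss:Name="Case"><Table>{body}</Table></Worksheet>'
            '</Workbook>')

# Snapshot of one case's form data, suitable for opening in Excel.
xml = to_spreadsheet_ml([["Field", "Value"], ["Budget", "12000"]])
```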

So in these ways, Itensil allows the developer to deliver structured LOB information and applications alongside the knowledge worker collaborative environment that distinguishes it from the competition.

The Big Idea

It really is all about delivering the power of process to knowledge workers in a rich collaborative space in the cloud. Personally, I think Itensil have really brought something fundamentally new to the market. The reality is that this sort of end-user accessibility really does change the game. Knowledge workers can now do it for themselves, seamlessly transitioning from the freeform collaboration environment of a wiki to the structured work that some of their processes demand. At the same time, Itensil have delivered an evolving work environment that is under the control of the users who most need it – the knowledge workers themselves. As a result, the cost of ownership for an application is dramatically lower – users no longer always have to refer back to a Business Analyst or IT developer.

For the industry, I think we are going to see Itensil define a new segment in the BPMS market that is currently massively underserved. Knowledge workers are poorly served – they quickly revert to email as most BPM Suites fail to deliver the necessary adaptability. And we shouldn’t forget that all of this is delivered On Demand, in a multi-tenant platform (i.e. all you need is a web browser and an Internet connection). The Itensil combination of Web 2.0 + Collaboration + BPM is quite literally showing the way forward and I think, starting to beat up the competition.

BPM Technology Showcase and Awards – An Opportunity to Save Hundreds of Hours and Thousands Of Dollars

February 12, 2008

Well after a lot of hard work the event is now fully fleshed out. Of course there are a million and one things to get done to organize a major event – and I am still getting through them.

But we have a full program of really interesting vendors (IMNSHO). They cover a range of different themes that regular readers will recognize. This is a real opportunity for people involved in BPM projects to save hundreds of hours and many thousands of dollars by assessing all the best vendors in one place, picking up on the best practices, pitfalls and other implementation wrinkles.

In no particular order they are:

BPM and SaaS – Appian, Integrify, Itensil, Lombardi (Blueprint) … I am not sure whether I should put Cordys, and Fujitsu in that category (since apparently they can do this combination but haven’t made a big noise about it).

Case Handling – Cordys, Graham Technology, Itensil, Pallas Athena, Pega

Complex Customer Interaction – Graham Technology, Pega

Knowledge Workers – Appian, HandySoft and Itensil

Microsoft and .NET – Ascentn, Bluespring Software and BizAgi

BPM-SOA Stack – BEA, Fujitsu and TIBCO

Unified Data Model – BizAgi, Pega

They all have something special about them – they are all becoming more and more “model driven” (some are better than others), and they all feature mechanisms to monitor and track work. Here is the complete list, along with links to their web sites.

Appian, Ascentn, BEA, BizAgi, Bluespring Software, Cordys, Graham Technology, HandySoft, Integrify, Itensil, Fujitsu, Lombardi, Pallas Athena, Pega, TIBCO

That’s 15 vendors, each delivering 4 sessions – one in the morning and one in the afternoon on each of the core showcase days (Tuesday and Wednesday). The Showcase itself is capped off with a simulated product bake-off where each vendor demonstrates how they have built out one or other of the two core scenarios we will provide them with.

I intend to create short 5-minute videos of each vendor, featuring their best points, and place them on YouTube with links to their product profile – which I will endeavor to get up on the BPMF site within a few weeks of the event (but I am traveling for the month of March, so it might not be till mid-April before that happens).

Oh – and let’s not forget the Monday, when you will hear a keynote from Connie Moore (of Forrester), followed by three new case studies (the best submissions from the Awards program – Wells Fargo, Geisinger Health and Louisiana Supreme Court), and the three inclusive ½ day training courses:

  • Ensuring BPM Project Success – from me
  • Modeling in BPMN – from Stephen White of IBM (the main author of the BPMN specification)
  • BPM Overview from the WfMC

And all of this is available for the killer price of just $395 (up until close of business this Friday … after that it reverts to $595). Just to put that price in perspective, that’s less than you would pay at a traditional conference for their pre-conference workshop! We have deliberately kept the prices low so that you can bring the team – to form a shared understanding of the issues and the way ahead (and it’s impossible to get around all 15 vendors in the 12 sessions that you will have time to attend).
You can get directly to the registration page here

Download the brochure here

BPM Awards and Technology Showcase

January 16, 2008

The BPM Awards and Technology Showcase is taking shape, and it's promising to be quite an interesting affair. Located at the Sheraton in Downtown Nashville, it is easy (and cheap) to reach from anywhere in the US. It will take place in late February – the 25th through the 27th.

In my opinion, participation by any organisation with a BPM project (or projects) on its plate, current or planned, will save hundreds of hours and thousands of dollars – through the stunning case studies, through exploration of what the vendors have to offer in one concentrated educational program, and through the all-inclusive workshops focusing on implementation best practices. Although it has run successfully in Europe over several years, this format is relatively new for the BPM market in the US.

In contrast to the traditional conference+trade show model – where you will find a mix of hypothetical talks and vendor marketing – this event is focused on providing pragmatic and actionable information specifically about BPM technology and its implementation. Rather than trying to glean scraps of insight in the chaos of an exhibition showroom floor, this event is primarily based around structured sessions that focus on how products are used and deployed (and the best practices, challenges and pitfalls along the way).

Of course it is much more than that:

On the Monday (Feb 25th) we have a “BPM in Practice” day where you’ll get the big picture in a Keynote from Connie Moore of Forrester. We then segue immediately into a selection of the top North American case studies from the 2007 Global Excellence Awards in BPM and Workflow (I think the best ones). This is where we have the real 24 Carat Gold – three brand new case studies from Wells Fargo, Louisiana Supreme Court and Geisinger Health – all focused on the reality of modern BPM implementation. These case studies are delivered by the business and IT people themselves talking about their experiences – setting the scene for what is to follow over the next 2 days.

Then, over a Gala lunch, we have the Awards Ceremony itself (where the shiny stuff gets handed out to the winners). This is quickly followed by a joint presentation from Nathaniel Palmer and me – where we discuss the Technology Assessment Framework (everybody will have copies of all the product reports by this time).

We then all go to a choice of 3 workshops – I will be running a concentrated form of our "Developing A Structured Approach for BPM Project Success" course, and if Steve White gets permission from his masters at IBM to come, he will run a shortened version of the BPM Process Modeling Fundamentals course (focused on BPMN). In parallel, the leading lights at the WfMC will run their own session, taking a more general view of BPM (I expect they will also talk about the role of XPDL). It's worth noting that these workshops are usually delivered as conference add-ons – the difference with this event is that they are all included in the very cheap price of attendance ($295 if you get in quickly).

On Tuesday and Wednesday (26th and 27th) we have the Showcase itself. On each day we start with a short plenary (who's on when, showing what); then we immediately split into 5 tracks. On each track there are six sessions during the day (three different vendors giving two sessions each per track). The delegates self-select the sessions that are of interest to them. Each session is 40-45 minutes, with a 5-minute break to get to the next session (we do let you have breaks for coffee and lunch).

But the real difference here is that there is no exhibition, just concentrated truth telling from the vendors as they explain and demonstrate how their products are used for real. These are up close and personal sessions where the 25-30 people in the audience can pop any question they like.

And as you'll discover (assuming you come), the whole thing quickly becomes very interactive. Everyone realizes it is OK to ask questions, and very soon we are all learning from each other. This opportunity to interact is further bolstered by the birds-of-a-feather lunch tables and Round Table discussions on the Tuesday (where each table will explore a particular area).

Moreover, the format ensures it is a level playing field for all (rather than who can afford the biggest stand). Vendors range from the relatively small innovation leaders such as Ascent, BizAgi and Itensil, through the established pure-play BPM vendors (such as Lombardi and Appian) then into the big guys like TIBCO, and we anticipate BEA/Oracle will also have a presence.

To cap it all – we end on the Tuesday with two different vendor shoot-out scenarios, where participating vendors show how their tool was used to build out a specific example. We will have two different flavors here – one aimed at the more traditional transactional example (human & system centric), the other describing a knowledge worker scenario (human collaboration oriented).

So if you are interested – check out the Event Brochure here. The early-bird registration ends on Friday (currently at just $295) and can be accessed directly here. We still have a couple of slots left open for vendors to participate, so if you are interested, contact me directly.

BPM Focus Update

May 28, 2007

I know I promised a relatively quick update cycle to my blog – but I am having trouble with my Clone Management Interface (the interface to the three versions of myself that never sleep and are eagerly beavering away at all hours of the day).

BPMN Process Modeling

The BPM Process Modeling Fundamentals courses in London, LA and Washington DC were all pretty much sold out (we couldn’t fit any more in the room). The course for Sydney on June 7th and 8th is also virtually full already (register here). Perhaps we will run another one at the end of the following week before I head back to the UK. Not surprisingly, the Ensuring BPM Project Success course is less well attended (as it is designed for a different audience).

But one thing has struck me as I reflect on the BPMN aspect of the training. A lot of people seem to expect a lot more methodology out of BPMN. In the end, BPMN itself is method-independent – it allows companies, individuals and tool vendors to apply the notation to any number of methods. Moreover, adding simulation into the mix, while it may be useful in some situations, is not part of BPMN. It belongs in some extended method (one that is relatively poorly supported by the attributes of BPMN icons).

Indeed, one could argue that if your challenge is to understand the process (as it is in the early stages of most BPM initiatives), applying simulation to the mix is a complete waste of time, as it involves a lot of effort in gathering data about the process and validating that the distributions and estimates of time/resource usage are correctly applied in the model. Moreover, many of these (simulation) models are constructed with the perspective of proving the benefits of the approach (proving to management the benefits in terms of money saved or revenue generated). As such, the modeler is often (unconsciously) constructing a model that is already predisposed to supporting the aim … a model that buries the assumptions rather than surfacing them.

In that upstream activity (of understanding the domain to identify the 20% of functionality that will deliver 80% of the value), what is needed is the ability to compare and contrast different perspectives on the process … looking not just at the orchestration (ordered sequence of activities), but also the choreography (the sets of interactions between the roles), and the boundary conditions of the chunks. Because by understanding the process better, the end-users can really identify the areas that will make a difference.

It is in the downstream implementation of that defined scope that BPMN comes into its own as an implementation oriented graphical language. The point is that applying detailed BPMN modeling and simulation too early in the modeling endeavour is inappropriate.
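To make the orchestration/choreography distinction concrete, here is a minimal sketch (the order-handling process and role names are invented for illustration, not drawn from any real engagement or tool): from one and the same process description we can derive both the orchestration view (the ordered sequence of activities) and a crude choreography view (the hand-offs between roles).

```python
# Illustrative sketch only: a toy process described as (activity, role) pairs.
STEPS = [
    ("Take Order", "Sales"),
    ("Check Credit", "Finance"),
    ("Approve Order", "Finance"),
    ("Ship Goods", "Warehouse"),
]

def orchestration(steps):
    """The orchestration view: the ordered sequence of activities."""
    return [activity for activity, _ in steps]

def choreography(steps):
    """A crude choreography view: role-to-role hand-offs,
    ignoring consecutive steps performed by the same role."""
    handoffs = []
    for (_, role_a), (_, role_b) in zip(steps, steps[1:]):
        if role_a != role_b:
            handoffs.append((role_a, role_b))
    return handoffs

print(orchestration(STEPS))  # the sequence of activities
print(choreography(STEPS))   # the interactions between roles
```

The point of the sketch is simply that both views are projections of the same underlying process, which is why comparing perspectives is cheap and valuable long before detailed modeling or simulation is warranted.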

BPM Focus Web Site

Well suffice to say that it is about to go through a major overhaul. We have been busy beavering away implementing a commercial BPM Suite under the covers that will make the whole experience an order of magnitude easier to handle. All those people who are currently on the BPM Focus mailing list will be invited to update their profile, such that we can more effectively target the messages and communication that we send. Moreover, this will facilitate the rapid introduction of a whole range of new services that have been in development for some time. Instead of worrying about the implementation detail, a new service becomes nothing more than a set of robust (BPMN) models. But more on that later.

Getting the BPM Message Across

April 27, 2007

Many business people still struggle to see the role of business process in building better performance (i.e. business results). So I thought I would share this little hook that I developed within one of my consulting engagements. It is based around preparing bread – the components of the bread (the flour, the yeast, the water), and then baking it all together for an effective result. In your business it is the dough rising that equates to achieving its performance objectives … however those performance objectives are defined.

Whether aware of it or not, in most businesses the different ingredients are not well aligned or working together as well as they could be. Mixing the metaphors for a moment, they are not rowing together in a coordinated fashion. Business Process Management brings together a range of techniques and approaches—the BPM tool box. The components of this tool box help change agents in the business (the bakers) create their own special sort of dough. At the heart of that is an ongoing enquiry into business processes—if you like the water that binds the flour (your people), with the yeast (the technology).

There may be other subtle ingredients. But cooking is not only about mixing the right quantity of ingredients; it is also how you mix them, and how long you bake the mixture. You might think it is just a question of getting the right measure of ingredients. But first, it is necessary to decide on the sort of bread you want to make, and how it is going to be delivered, to whom. Alongside the choice of people (flour), the most critical element is the water (processes)—the ingredient that binds it all together.

Relatively speaking, adding the technology is the easy part. But it requires a considerable amount of rigor. This rigor is most apparent in the way we understand and model processes—because in the modern BPM technology, it is these models that drive how work is managed and driven through the business. If we want to change the way the business operates, all we then need do is change the models. No programming should be required (or at least only in very specialized cases). As much as is possible, everything is configured with models.
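To illustrate the "models drive the work" idea, here is a deliberately toy sketch (not any vendor's product or API – the process, step names and roles are all invented): the process definition is pure data, and the tiny engine that walks it never changes. Altering business behaviour means editing the model, not the code.

```python
# Illustrative only: a toy "model-driven" process engine.
# The model is plain data; the engine below is generic.
PROCESS_MODEL = {
    "start": "register_claim",
    "steps": {
        "register_claim":  {"performer": "clerk",    "next": "assess_claim"},
        "assess_claim":    {"performer": "assessor", "next": "notify_customer"},
        "notify_customer": {"performer": "clerk",    "next": None},
    },
}

def run(model):
    """Walk the model from its start step, returning the work items
    (step, performer) in execution order."""
    trace = []
    step = model["start"]
    while step is not None:
        spec = model["steps"][step]
        trace.append((step, spec["performer"]))
        step = spec["next"]
    return trace

print(run(PROCESS_MODEL))
```

Rerouting the work – say, inserting a fraud check – would mean changing only the `PROCESS_MODEL` dictionary, which is the (greatly simplified) essence of what a model-driven BPM Suite offers.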

But to develop these models requires a rigorous approach and methodology—one that allows us to bind together (integrate) the people, processes and technology. The problem is that process models are like a bikini—what they reveal is suggestive. But what they hide is vital. (Paraphrasing Levenstein talking about statistics).

This is the central thesis of the BPM Process Modeling Fundamentals training course we have developed within BPM Focus. It not only features the very latest developments in BPMN (developed in collaboration with Stephen White, the main author of the BPMN standard), it also includes complementary techniques that help people really see their processes from a number of different angles. The next iteration of the course is due for delivery in London next week (May 1st and 2nd), then in Washington DC on May 24th-25th and Sydney on June 7th-8th. We are also delivering the course in-house to a number of corporate clients. It should also be available on-line soon.

It is complemented by another program Ensuring BPM Project Success, which is oriented toward ensuring that BPM Programs are rooted in the organization appropriately (due to run in Washington DC on May 21st and Sydney on June 12th-13th). You could think of this second program as being designed to help you set up to guarantee success in BPM projects (or how to avoid getting egg on your face). It is designed to cure you of the legacy thinking that created the existing mess and provides an actionable methodology and framework for BPM success.