As many of my regular readers will know by now – I joined Forrester Research in February 2010, and subsequently left in 2015 to take up a role with Structure Talent
Having posted just one blog post in the best part of six months, I am wondering whether I even qualify as a blogger. I suppose not … the problem is that when I want to say something, I find I want to say it well and in considered form. Then I get distracted with some current deliverable (or crisis, or proposal, or call, or …) and the feeling passes.
Of course, I still have the current book hanging around my ears to get out too, excerpts of which might make good postings. The problem is that this book is turning into a bit of an opus … my perspectives on life, the universe and everything to do with BPM and Transformation. So having virtually completely re-written the whole thing, I am about to take the knife to it again, perhaps splitting it into two books to make them both more accessible.
I am sure that the end result will be worth it … I mean, apart from Academics looking for a single reference guide on BPM for their students, who wants (or needs) a complete analysis of all things BPM’ish? So I am working with a colleague to cut it down to a business-oriented discussion about the power of managing processes as a way of, in the end, driving organizational transformation.
It seems a contentious point of view in Business Process Management – but when we come up to the “Understand Phase” (“As Is” or “Current State” model), we recommend “time boxing” the work to ensure that the activity is kept at a suitably high level. The intention of this activity is really to create a baseline; a reference point for the BPM project.
Now those who continue with their “legacy thinking” perspective usually decide that it is important to create a detailed description of how work happens. They model everything in sight, trying to create an accurate representation of the work as it happens today. While this is good for the “billable hours” of consulting firms, it does little for the business managers engaged on a journey of change and discovery.
The point is that the amount of work expended here is usually wholly inappropriate to the benefit derived. If your intention is to change the way things happen, gathering a great deal of detail around current work practices is a waste of time. If you are going to improve things (with or without the use of automation), then you will be changing how the process is carried out … i.e. how things happen today will soon become a thing of the past.
Don’t get me wrong, it is absolutely essential to develop a baseline understanding of the ways things are done. It’s just a question of emphasis. The issue for those involved in the exercise is just what degree of detail is required. They should be asking “can we stop now?”
The real purpose of current state modeling is to establish a baseline – so that the team can establish a realistic business case (allowing them to track benefits and improvements during and after implementation), and to identify the areas that require attention.
This is more about a pragmatic assessment of reality and clarification of current performance metrics than it is about process modeling. The metrics in question are those that the customer of the process really cares about (not the detailed cycle times of some low-level sub-process). From a modeling point of view, the need is for enough structure to hang the metrics upon (and perhaps one level of detail below). Anything more than that is a waste of time and resources.
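To make that concrete, here is a minimal sketch in Python of what such a baseline might look like – a handful of customer-facing metrics hung on the high-level process chunks, recorded once as the reference point for the business case. The process names and figures are entirely hypothetical, for illustration only:

```python
# Sketch of an Understand-phase baseline: a few customer-facing
# metrics per high-level process chunk, captured once so that
# improvements can be tracked against them later.
# All names and numbers below are hypothetical placeholders.

baseline = {
    "quote":   {"volume_per_month": 4_000, "cycle_time_days": 14.0},
    "bind":    {"volume_per_month": 800,   "cycle_time_days": 5.0},
    "renewal": {"volume_per_month": 2_500, "cycle_time_days": 7.0},
}

def improvement(baseline_value, current_value):
    """Fractional improvement of a metric against the recorded baseline."""
    return (baseline_value - current_value) / baseline_value

# e.g. cutting quote cycle time from 14 days to 7 days is a 50% improvement
quote_gain = improvement(baseline["quote"]["cycle_time_days"], 7.0)
```

The point of keeping the structure this shallow is exactly the one made above: just enough to hang the metrics upon, and nothing more.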
So how much detail do you really need? Well, I normally start with a high-level outline of the process – the major chunks – and then draw a simple high-level process model. I recommend a high-level BPMN diagram, but I usually seek to contrast that model with a Role Activity Diagram (not the same as a flow diagram with swim lanes – RADs model how the Roles involved change state and synchronize their actions), and perhaps a simple Object State Transition Network (how the things moving through the model change state).
With a high-level flow diagram or outline of the process, it is really very straightforward to develop these alternative views, and they really do help people see things differently. I often say that the problem with Flow Diagrams is that “the more you look at them, the less they mean.” Flow diagrams always look correct – for example, in my recent book “BPMN Modeling and Reference Guide” (authored with Stephen White, the main author of the BPMN specification itself), I have yet to receive a note from anyone telling us that we have a major flaw in one of the models (yes, there is at least one). It just looks correct (and this is a book where we tried our very hardest to make sure every model was “right”).
Incidentally, the best reference on RADs is Martyn Ould’s “Business Process Management – A Rigorous Approach.” And for OSTN, I prefer the IDEF3 perspective as it is relatively simple and easy to understand (UML also has similar modeling capabilities).
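For readers who have never met an OSTN, the essence is simple enough to sketch in a few lines of Python: the “thing” flowing through the process, and the legal state transitions it may undergo. The states and transitions below are hypothetical (a generic policy/order example), not drawn from IDEF3 or any real process:

```python
# Minimal sketch of the idea behind an Object State Transition Network:
# the object moving through the process and the transitions it is
# allowed to make. States and transitions here are illustrative only.

ALLOWED_TRANSITIONS = {
    "received":  {"validated", "rejected"},
    "validated": {"priced"},
    "priced":    {"approved", "rejected"},
    "approved":  {"fulfilled"},
}

def next_states(state):
    """Return the states the object may move to from `state`."""
    return ALLOWED_TRANSITIONS.get(state, set())

def is_valid_path(path):
    """Check that a sequence of states respects the transition network."""
    return all(b in next_states(a) for a, b in zip(path, path[1:]))
```

Even a toy like this makes the contrast with a flow diagram obvious: the model says nothing about who does the work or in what sequence the activities run, only how the object itself changes state.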
Coming back to the Understand Phase, in the workshops with the Subject Matter Experts (SMEs), I also seek to understand the volumes of work, any major exceptions and the percentages of items that follow the major paths from decision points. The other thing to understand is the numbers of employees involved in the work (FTEs and the amount of time spent on each area of the process). From that information you can calculate the costs of undertaking the work and where the money goes. You also need to understand the roles involved, and the capabilities of the staff members who fulfill those roles. All of these become essential ingredients for the business case. Without it you are whistling in the wind (when it comes to asking for funding). Even if you already have the funding, you should do this anyway (it will certainly be needed later).
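The cost calculation described above is back-of-envelope arithmetic, but it is worth showing how little data it actually needs. Here is a sketch in Python using the sort of figures an SME workshop produces – headcount per process area, the fraction of time spent there, and a fully loaded annual cost per FTE. All of the numbers and area names are hypothetical:

```python
# Back-of-envelope process costing from Understand-phase workshop data.
# Inputs per process area: headcount and the fraction of their time
# spent on that area, plus an assumed fully loaded annual cost per FTE.
# Every figure below is a hypothetical placeholder.

FULLY_LOADED_COST = 80_000  # assumed annual cost per FTE

areas = {
    # area: (headcount, fraction of time spent on this area)
    "intake":       (12, 0.50),
    "underwriting": (8,  0.75),
    "exceptions":   (4,  0.40),
}

def area_cost(headcount, fraction, unit_cost=FULLY_LOADED_COST):
    """Annualized cost of the effort one area absorbs."""
    return headcount * fraction * unit_cost

total = sum(area_cost(h, f) for h, f in areas.values())
```

A table like `areas`, filled in during a single workshop, is usually enough to show where the money goes – which is the whole point of the baseline.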
I could go on here at length, but the point I am trying to make here with this blog post is this … if your consulting provider is asking you to fund a detailed “As Is” phase of work, then you are throwing money away. They are more interested in lining their pockets than assisting the client. The only exception that I can think of is where the process is itself highly regulated (and a rigorous work definition is mandated by law). In such cases, I think you have to draw your own conclusions on how to avoid “analysis paralysis.”
Update – you can view the Keynote itself here (requires registration).
This division of Farmers deals in the small business insurance market – a $95B market. When Mhayse joined Farmers, they had just 2% of that market. He saw it as an incredible opportunity – 47% of business in the small commercial insurance sector is with small business insurers. The question is why these small players succeed against a big player – it’s because they know their local community; they are part of it and know how to communicate with their customers.
Mhayse described his approach, which really started with a clear Vision … He set out with the objective of becoming #1 in their industry, and then laid out a strategy of how to get there.
The vision was to:
- Excel at the core
- Build deeper expertise
- Leverage predictive modeling
- Expand the appetite and sophistication of the organization
- Create a targeted set of offerings for agents and their customers.
Only then did they start to think about how to deal with the small business opportunity and the efficiency end of things. Sure, you need an efficient way of doing the business, but the primary focus of that vision was on growth and the customer experience.
He went on “You cannot expand the appetite for more … unless you can automate the way in which the business operates. I didn’t know what I was searching for – but I was looking for something that would help us … something that would give us a clear line of sight to the solution to the business problem. I had to understand what it was going to feel like. However we get there, we had to create the right sort of agent experience … we had to get them (agents) fully engaged to get the benefits of the gem we had in our hand. How do we reap the benefits … it has to be done in increments. I wanted to know where the short term goals and pointers were (pointers that would indicate we were being successful). Trying to get there all at once is probably going to end in disappointment. We had to do a set of projects, and do them quickly, while being flexible along the way.”
At this point I was really engaged … I hadn’t heard a business leader at this level talk about a BPM project in such an impassioned way. This was his project, and he had been driving it top down. Now I started recording the slides and some of the related phrases:
Create the right agent experience – we had to demystify that experience so that it really helps the agent – pre-filling information into forms and easing the user experience.
- Eliminate the useless questions and options
- Automated underwriting decisions
- Automated pricing … it used to take us far too long to price a policy.
- We had to increase the pass-through rate … and cut the time to get a bound policy.
- We were looking at (touching) 80% of the business that was passing through, and were closing just 20%. That should have been precisely the other way around.
- The question was how could we enable the local agent to be local in terms of how they work.
Focus first on Agent expansion and New Business Growth
- First support environment was delivered in 5 months.
- Restaurant product went countrywide in July 2007
- Rolled out the Auto policy facility in Oct 07
- And getting an “umbrella” policy available as an add on in June 2008
- 14 days to 14 minutes
- Close rate was up 5%
- New business was up 70% (“do you want fries with that?”).
- Renewals up 60%
- Added over 1000 new agents (later updated in the flow of conversation to 1500 new agents).
We focused secondly on efficiency … not how many people we could chop out
- Renewals …
- Focus on the desired business result
- Eliminate all the non Value Add steps, take out the noise and red tape.
Put the business change in the hands of the business
- Pulling together cross functional teams
- Finger pointing is the wrong way to go …
- Rapidly iterate
- We don’t always know exactly what we want
- We are sometimes representing other folks … like the agents that work for us
- Test, monitor and respond quickly.
Building the right team is critical
- Empowered … someone who is on my team that was also part of the IT organisation
- Dedicated cross functional teams – jammed them together, locked them in a room and told them they couldn’t come out.
- Wanted to have a partner with skin in the game. Developed a Customer Intimate relationship with Pega. Their compensation was linked to the delivery of our results. Now we really are on the same page.
- Get participation and engagement – with the agents.
Farmers had gone from low on the food chain … to the fastest growing at Farmers, the most profitable at Farmers, acquiring over 1500 new agents. They acquired a business along the way and have now grown to around $3B, representing 3% of the available business out there. Tied for first place.
Questions – How do you change the culture? At the end of the day it comes down to individuals. The traditional solutions were not going to get us to where we wanted to go. We have 1000s of people and unless you start to align the objectives, their compensation, etc. then you will have problems.
There has to be a common and shared vision – one that gets both business and IT people excited. Too often there is an assumption in the business mindset that IT folks don’t have that sort of vision – that they don’t respond to the challenge. The point was that with the BPM program (still ongoing) they had proved that wasn’t true.
The key point for me was that he focused first on the Customer Experience. They had a strong visionary leader who publicly aligned himself with the overall success of the program. They drove partnership and engagement through cross-functional teams to achieve results. The business results speak for themselves.
I just hope that Pega and Farmers agree to put the video up on the web so that we can point others to this powerful case study. It’s one that every COO and CEO should see.
So while I am at it, I thought I should mention the other thread of research work I am involved in. I have become very interested in the emerging role of the Business Analyst and the ways in which organizations have responded to the challenge of BPM. So we put together a survey, with a range of prizes as an incentive to persuade those in the industry to spend no more than 10 minutes completing it.
Major Prize – Either a Nikon D200 camera, or a Bose Home Entertainment System (value circa $1200)
Runners Up Prizes:
3 x Bose Quiet Comfort Headphones (value $349 each)
5 x Zune 8GB MP3 Players (value $180 each)
10 Copies of BPMN Modeling and Reference Guide.
You can take the survey here (one entry per person, validated against email address and employer).
The survey will run till mid-October, following which we will produce a summary report for those who took part. The prize draw will take place at the BPM Technology Showcase in Washington DC on October 16th.
Did I mention that as another reason I have been quiet? Suppose not, but all these things contribute. The Showcase event is a completely different sort of BPM Experience – an opportunity to see the products, up close and personal, in a side-by-side comparison. I will probably post another entry about that in the next few days.
(BTW, the early bird discount for delegate registrations expires tomorrow … at $395 it is a pretty amazing deal as it includes 2 workshops on the first day, any one of which would normally cost you that much at a traditional conference).
Another litany of excuses. Anyone who has looked at this blog over the last year or so will see the very poor track record I have established in terms of keeping it up to date. The reason – just not enough hours in the day. That is probably true of all bloggers, but I wonder how many of them have been pushing the development of two books, alongside a full schedule of training and consulting activities. Right now I am supposed to be outside on holiday with my family, but instead I am deep into the last push to finalize the content for the forthcoming BPMN Modeling and Reference Guide.
This has been a mammoth effort over the last few months involving detailed collaboration with Stephen White, the main author of the BPMN specification. We have done our best to make it as readable and accessible as possible, separating out the detailed reference section from a piece about modeling in general (everything from the history of BPMN, to why/how to model, etc.).
There is also an extended scenario-based introduction to BPMN functionality, bringing in new BPMN functionality in the context of an easily understood business problem. Throughout the section, the business scenario is elaborated upon and the corresponding models and BPMN functionality explained. Through our training courses, we have found that people learn far better this way.
Of course, there is a detailed explanation of all BPMN functionality. And for most, who are actively involved in modeling, this reference material is sorely needed. For the layman, the specification is somewhat hard to follow (and that is being kind). In the book, we explore each area of functionality and provide a detailed explanation for its use, and behavior.
The BPMN Modeling and Reference Guide will be launched at the Gartner BPM Summit in Washington DC. I am doing a session on BPM and Modeling and another on Developing Appropriate Process Architectures. Neither of these sessions is designed for beginners (although the modeling session should be pretty accessible).
Incidentally, I notice that I am the only non-Gartner BPM Analyst/Commentator presenting at the conference this time around – seems that the same old, same olds have finally been found out ;-).
My other book, which has been in development for the last 5 years or so, is in the later stages of finalization. Mastering BPM has been an evolving piece that will probably hit the presses in a similar time frame. I still have a chapter to write, but it is just about there.
Of course, all of this book development work takes cycles out of the day and impacts the ability to execute on other things. Anyway … I hope to get into a more regular pattern of blog postings and updates by the end of next month.
I see I am going to have to dedicate more time to getting the blog out … but when I look at the broad range of tasks I have to get completed in the next 2 weeks once I am back at the office – God knows where I will find the time.
Over the last 20 years I have found myself trying to help a broad range of people understand the various vagaries and wrinkles of business processes. But I have found a real difference in receptivity between those people who know very little (and want to learn) and those who think they know a bit (and want to be impressed). When introduced to a new modelling technique or approach, the common reaction is “why would I want to do it like that? I can always use <whatever technique I know already> to model that.” What they seldom consider is what that new technique or approach might do for them, or how it might give them another subtle perspective.
So perhaps you will understand me when I say Business Process is a little bit like Religion – once people have been brought up in one branch of the church, they tend to resist attempts by others to engage them (just think about Protestants and Catholics for a second – they have a lot more in common than they realize, yet they still seem to find each other repulsive). And the world of process is not that different. There are a lot of parallels between the differing factions of the business process movement and those that one can observe in religion.
If you have been trained in UML, then that is what you want to use to model (everything must fit into that UML metamodel); if you grew up with IDEF, then all models appear as though they should be constructed around Inputs, Outputs, Controls and Mechanisms (or some similar flavor). If Rummler-Brache was your thang, then you favor the deployment flowcharts and swimlanes associated with the technique. Whether you were brought up on LOVEM, BPMN or simple Fortran flowcharts, the world is often colored by your original christening. It is only when you have been around for a while that you can see the benefits of the different approaches.
Putting all of that a slightly different way – and maybe I am stating the obvious – a little knowledge can be dangerous. And in the world of business process, that is certainly the case.
As people search around for the meaning of life (or process) they discover different disciplines (new techniques and approaches). Sometimes they become converted to a new religion (say Pi Calculus, or the conversational interaction loops of ActionWorkflow) and feel it incumbent on themselves to act as missionaries, recruiting new sheep to the fold. Those that don’t agree with them are clearly wrong, or misguided, or even worse, seditious. I suppose the point I am making is that while every branch of religion needs its evangelists, fundamentalism tends to alienate potential parishioners. And the problem with religion is that, for most people, once they have got some of it, they tend to shun all other approaches.
So by now, I hope you understand that I see the world of business processes as a pretty broad church. Personally, I am keenly interested in all process related innovation. But I don’t see that as a restrictive covenant that stops me from looking at, or even trying to explain, approaches that do not conform to some purist’s definition.
Indeed, I believe that it is only when you contrast different perspectives on a business process that you really understand it. You need to be able to step outside the box and see it for what it is. You need to be able to examine the interactions on one hand, and then flip it all around to look at the sequence flows; to look at what is required of the process and separate that from how it is achieved. With luck, the new BPDM metamodel from the OMG will enable the analyst to step around these different perspectives, sharing information between different modeling tools and techniques, without losing the fidelity of the information.