Thursday 15 March 2012

Lessons Learned about Lessons Learned (Part 1)

My last post about "Best Practice" was something of a personal peeve rather than a major hurdle to successful business improvement, although lessons should be learnt from the story I told, namely:
  • your improvement focus shouldn't be about creating templates and completing documents to provide evidence for appraisals and audit
  • measures that you put in place need to be thought through from the perspective of those being measured if you want to avoid costly and wasteful dysfunctional behaviour
  • best implies that there is no room for improvement
Ultimately, I also suggested a relatively quick fix: simply stop using the term "best practice" and come up with something more appropriate.

"Lessons Learned" (learnt?) is a different kettle of fish however. Most organisations have some kind of lessons learned mechanism in place (very often associated with their "Best Practice Repository"!) and in my experience very few of these mechanisms add any real value to the organisation. They are put in place to meet the "requirements" of CMMI, etc. and, as is so often the case, fail even to do this adequately because people fail to understand what the model is really trying to help the organisation achieve.

There are a whole bunch of issues associated with the Lessons Learned process, many of which would appear to be associated with the origins of the idea. When I first started working in Software Development over 25 years ago we never did Lessons Learned - but then again, we didn't do much of the stuff commonly considered as integral to the development process today. My first memory of anything resembling the concept of a lessons learned review was when I read an article by Tom DeMarco on Project Postmortem Reviews * some time towards the end of the 1990s. This made a lot of sense back in those days - at least in the organisations I worked in, where we had small teams of developers who stayed together and worked on project after project. It made sense in projects that lasted for years, where phase postmortems could be used to identify and correct mistakes prior to starting the next phase. And in the days before I started getting involved in SW-CMM initiatives I introduced the concept and practice into several groups with some small success.

On a small scale the Project Postmortem Process defined in that paper can be very useful as long as two conditions remain in place:
  • The same teams of people are used in a "product development environment" so that they have a shared understanding of the issues which cause problems
  • There is a management environment that allows the problems to be resolved internally within the teams rather than imposing inappropriate solutions on the team without really understanding the underlying issues.
The Project Postmortem Process fails as soon as:
  • It is scaled up to become an organisational process - e.g. the focus changes from an internal review and change exercise to a management mechanism for fixing organisational issues
  • Project teams are constantly rearranged and staff treated as interchangeable resources
  • Emphasis changes from internal and potentially informal communication within a team, to a demand to codify the findings for wider use
  • Managers and process experts attempt to implement CMMI without understanding what the model really implies - i.e. that the process is part of the feedback mechanism to improve the business, not an exercise to generate paperwork to fill up a repository
  • The process is used to apportion blame
Over the years since first reading about the Project Postmortem Process I have seen numerous lessons learned systems in numerous organisations. Most of them suffer from exactly the same problems.
  • Lessons Learned reports are documented and filed away and the feedback element to improve process rarely occurs
  • Lessons Learned are purely qualitative and can easily be distorted by the strongest or most vocal members of a team
  • Not all project team members or stakeholders are invited to participate in the process so not all perspectives are represented
  • Most staff perceive the process as a waste of time (and in most cases they are right)
  • Reviews are usually focused on project failures and successes and the process improvement opportunities are not realised
  • Managers or SEPGs that do any analysis on the results attempt to implement heavy handed changes based on their interpretation of the results without understanding the underlying circumstances or, most significantly, the differences between the environmental characteristics of the originating team (project or programme, agile or waterfall, etc.)
In fact many Lessons Learned reviews and reports have become so worthless that I can often predict the findings without knowing anything about the project or people. Typically the section on Things That Worked Well includes:
  • Excellent teamwork and communication between project team members
  • Everyone went the extra mile and worked hard to complete the project under extremely difficult circumstances
  • The pizza brought in by senior management on the nights we had to work was really tasty and much appreciated
And in the Things That Could Have Been Better you'll see:
  • Really difficult relationship with the key stakeholder made this project a major challenge
  • Customer was never available
  • Requirements were so poorly specified we had to make them up as we went along
  • Technical procurement issues led to long delays
  • The CMMI initiative caused us a huge amount of extra work which didn't add value to the project
So very often, project lessons learned reviews tell us what we already knew, and probably have known about for some time. And the only thing that future projects can learn is that they are doomed to follow a similar pattern because nothing is being done to correct the organisational issues. The most likely outcome is that new status report and requirements specification templates will be imposed on the projects.

In part 2 I'll have a go at looking at some of the things we might be able to do to make Lessons Learned a more valuable tool for the organisation, by extracting value for both projects and processes.

* Collier, B., DeMarco, T. and Fearey, P., "A Defined Process for Project Postmortem Review", IEEE Software, Vol. 13, No. 4, July 1996, pp. 65-72

Monday 12 March 2012

Good Practices, Recommended Practices but never Best Practice


If you've read previous posts on this blog, or if you follow me on Twitter, you may be aware that one of my current bugbears is the wilful misuse of the term "Best Practice". I have no idea where this concept originated, but it can't have been anywhere that believed in Continuous Improvement (or in any improvement for that matter!).

My first hands-on experience of "Best Practice" was with the introduction of a new global Quality Management System being adopted by a multi-national IT Services company across its Applications Delivery group. All the corporate assets such as templates and guidelines were stored in a "Best Practice Repository" (BPR), and tailored versions of these could be uploaded by local delivery groups. As a relatively naïve practitioner I welcomed this approach initially. All staff had access to the local assets by default, and if no asset was locally available they were shown the corporate asset. Part of my job was to manage our local assets and their incorporation into the BPR. Various metrics were provided to show how assets were used, which I monitored and reported on to my local senior management.

As a relatively mature organisation (compared to other delivery centres around the world) with a large number of tried and tested assets already in use in hundreds of projects, my team uploaded most of our templates into the BPR unless we felt that the corporate standard was an improvement on our own.

This was when the first warning bells started ringing, as we became plagued with questions like "should we be using the corporate or local standard?". Notices of Instruction were duly posted with appropriate advice, but the real question remained unanswered by the QMS owners: how can you have multiple Best Practices? Either it's Best Practice or it isn't.

The second warning bell came when the European management team decided to compete with the rest of the world as to who could provide the most best practices, and all European organisations found that the number of assets uploaded and downloaded in the BPR became part of the balanced scorecard for process improvement teams. A frenzy of activity saw dozens and dozens of best practices appearing over the next few years, and PMs and developers had free rein to download whatever took their fancy regardless of the quality of the materials. Many organisations simply edited the corporate standard, added their delivery centre name and logo, and renamed it to reflect their own identity. As is so often the case, the focus of improvement degenerated into wasteful template production rather than adoption of process into the mindset. What started in Europe soon filtered out across the world and the BPR became an unmoderated morass of templates and guidelines.

After several global reorganisations a new version of the global applications QMS was developed and I was a member of the design team. Some lessons from the past were learnt, and although the BPR still existed we cleaned out 90% of the contents and put in place strict controls on how assets could be added, involving several levels of review (local, regional and corporate). The B still stood for Best however.

Sadly, this story isn't unique. I've seen it repeated in organisations all over the world, in big companies and in small ones. As long as CMMI is interpreted as being about the production of templates and documentation the nonsense will continue. Until that interpretation changes, at least refer to your Process Asset Library as exactly that, or if that's a bit too difficult to understand call it a Good Practice Repository.

Best implies that it can't be improved, and that there can only be one. Neither of these implications has a place in the world of continuous improvement.

Postscript - there are no prizes for seeing the other moral of this tale; that's right…the one about how a carelessly considered measurement can cause totally dysfunctional behaviour which could cost you dearly.