There’s a tool in the Scrum toolbelt that is utterly critical to success yet fundamentally misunderstood by far too many development teams, Scrum Masters, and Product Owners. I’m talking, of course, about the Sprint Retrospective. I’ve seen it time and again: teams that hit all the right notes in their standups, reviews, and planning sessions, but who wind up botching their Retrospectives so badly that they miss out on the single most important part of agile product development: continuous improvement. Certainly, it’s never fun to take time out of our day to look back and discuss what went wrong over the past two weeks — much less to come up with new things to try on a sprint-by-sprint basis. But it’s the single most important part of the culture we’re trying to build — the culture of agility, of adjusting, of improving…of change.
Looking Backward, Honestly
All too often, Retrospectives follow one of two patterns: either they are nothing more than bitch sessions where the developers bemoan all the things they couldn’t control that prevented them from hitting their sprint goals, or they’re happy-happy-joy-joy back-clapping sessions applauding how awesome the team is and how great they were to hit all their goals. But the simple fact is, every sprint has its ups and downs, its highs and lows — only rarely does everything go wrong, or everything go right. And that’s precisely why we have retrospectives — they exist so that we can critically assess the things that worked and the things that didn’t; the things we want to keep doing and the things we want to stop doing; the things we breezed through and the things that challenged us. Teams who only talk about the good or only talk about the bad will never improve, and this is often more a reflection of the culture than of the team’s practices. Retros need to be not only open and engaging for the team, but a “safe space” where people feel free to applaud or criticize anything related to the team’s work and its goals. This means that the Product Owner, the Scrum Master, and direct line managers are all fair game for critique — not to mention every member of the team itself. Scrum Masters who unnecessarily limit the scope of a retrospective are stifling the voices of their team. In my experience, the best and most useful retros are those in which truth is spoken based on the team members’ experience over the last two weeks — no matter who or what that truth is about, and no matter who may be offended by it. It’s certainly up to the Scrum Master to…distill…those thoughts into something that can be discussed outside the room. But in the room, during the retro, there can be no sacred cows.
It’s Useless if It’s Not Actionable
Another common dysfunction that I see with Agile teams is that their retros become a boring recitation of issues without any real documentation or follow-up. I’ve seen teams where the same topic gets raised in retrospective every two weeks for six months, without anything being done to make it better. What’s the point in talking about problems unless you’re planning on doing something about them? The entire point of the retrospective is to drive the team to action on the things that didn’t work well, or the things they want to improve upon. Failing at this is failing to be Agile at the most fundamental level. And it really isn’t enough to just write down things to try or things to improve — we need to plan those things into the next sprint, or by the time we get to them, we’ll have moved on to something else, then something else, then something else — all the while repeating our mistakes while applauding our efforts to identify them and pretending to work on improving them. I urge every team I work with to make sure that each sprint has at least one story, task, or work item tied to something captured during their retrospective — because that’s how they track work. Putting something on your sprint backlog is a commitment to getting it done — and that should apply equally to product development work and to team development work.
Check Back to Ensure Progress
Even if we’ve done everything above — been honest about our strengths and weaknesses, and captured work items to check off in the next sprint — that still may not be enough. Part of every team’s retrospective should include reviewing the points made in the last retro — particularly the negative ones. We want to confirm that we’re moving in the right direction, and not slipping further and further into the abyss of dysfunction. We should review each item that was identified as something to improve on, and ask whether we’re doing better this sprint, doing worse, or about the same. It shouldn’t take a lot of time — but doing this on a regular basis reminds us that there’s more to continuous improvement than just having a meeting every two weeks. There’s a need for ongoing introspection, even in the absence of a documented work item. There’s a need for each team member to feel empowered to make things better. And there’s a need to constantly check our progress and skills so that we don’t get complacent. Complacency kills teams — there are always ways to improve how our teams work, how our processes work, and how our people feel about the work they’re doing. Retrospectives are the best weapon in our arsenal for making sure that complacency stays buried six feet under.
Should there be any metrics that are used to spark discussion during the retrospective? For example, if one user story took much longer than expected based on its story points, should there be metrics to identify problematic user stories?
If you have specific metrics that your team finds valuable, then by all means use them to drive discussion in retrospectives! One useful thing some of the teams I’ve worked with have done is to use their retrospectives to identify the common situations where stories take longer than expected, and then adjust their Definition of Ready to account for them. I’m not a big fan of being prescriptive about which metrics to use, but common ones include your sprint burndown chart, walking through completed stories to see which turned out bigger than expected, or even just reviewing committed vs. completed story points across the sprint. It’s all about figuring out which metrics your team cares about and which will both drive and measure improvements.
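If it helps to make that concrete, here’s a minimal sketch of the “committed vs. completed” idea in Python. The `Story` shape, the `actual_points` field, and the 1.5x overrun threshold are all assumptions for illustration rather than something your tracking tool will hand you directly; the point is simply to surface a completion rate and a short list of stories worth talking about in the retro.

```python
from dataclasses import dataclass

# Assumed data shape for illustration: each story carries its original estimate,
# a hypothetical "actual" size (e.g., re-estimated after the fact), and whether
# it was finished inside the sprint.
@dataclass
class Story:
    title: str
    estimated_points: int
    actual_points: int
    completed: bool

def sprint_summary(stories: list[Story], overrun_factor: float = 1.5) -> dict:
    """Summarize committed vs. completed points and flag stories that overran."""
    committed = sum(s.estimated_points for s in stories)
    completed = sum(s.estimated_points for s in stories if s.completed)
    overruns = [
        s.title
        for s in stories
        if s.actual_points > s.estimated_points * overrun_factor
    ]
    return {
        "committed_points": committed,
        "completed_points": completed,
        "completion_rate": completed / committed if committed else 0.0,
        "stories_to_discuss": overruns,  # candidates for the retro conversation
    }

if __name__ == "__main__":
    sprint = [
        Story("Add login page", 3, 3, True),
        Story("Migrate reports DB", 5, 13, False),  # ballooned well past its estimate
        Story("Fix flaky CI job", 2, 2, True),
    ]
    print(sprint_summary(sprint))
```

However you compute it, the output is only a conversation starter — the retro discussion about *why* a story ballooned is where the actual improvement comes from.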
Maybe I’m missing something, but it sounds like the Scrum Master is holding the team back. Why have the Scrum Master run retros in the first place, then?
The Scrum Master’s role is that of player/coach and facilitator for the team; if they’re “holding the team back,” then they’re not doing their job. As for who runs the retro, it doesn’t really matter — as long as it gets done, it gets documented, and it stays focused on improving what the team does and how they do it.