The importance of Done in Scrum

In the latest Scrum Guide (July 2011), the definition of Done was given considerably more attention. Rightfully so, as “Done” is crucial in Scrum.

The Importance Of Done

The definition of Done is essential to fully understand the Increment that is being inspected at the Sprint Review with the stakeholders. The definition of Done implements the expectation that an Increment is ‘releasable’, so ideally it comprises all activities, tasks, qualities and work that allow releasing an Increment into production. The addition of ‘potentially’ to releasable refers to the Product Owner’s accountability for deciding on the actual release of an Increment, a decision that will likely be based upon business cohesion and functional usefulness. But the Product Owner’s shipping decision should not be constrained by ‘development’ work.
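
To make this concrete, here is a minimal, purely hypothetical sketch of such a definition of Done, modeled as a checklist that an Increment must fully satisfy before the team calls it releasable; the actual items (and their rigor) are for each Development Team to define:

```python
# Hypothetical definition of Done, modeled as a plain checklist.
# The items are illustrative only; every team defines and evolves its own.
DEFINITION_OF_DONE = [
    "code reviewed",
    "unit and integration tests pass",
    "acceptance criteria verified",
    "performance within agreed limits",
    "documentation updated",
    "deployable to production",
]

def is_releasable(completed: set[str]) -> bool:
    """An Increment only counts as Done when every item of the definition holds."""
    return all(item in completed for item in DEFINITION_OF_DONE)

# An Increment that misses even one item is not Done, and hence not releasable.
print(is_releasable({"code reviewed", "unit and integration tests pass"}))  # False
```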

The definition of Done should be clear and concise for the Scrum Team, as it will determine how much work a Development Team can reasonably take into a Sprint during the Sprint Planning meeting.

The empiricism of Scrum only functions well upon transparency, and that includes the definition of Done. Transparency means not only visible, but also understandable: besides being available, the content of the definition of Done should be clear from just reading it.

A New Scrum Artefact?

Should we make the Definition of Done an official Scrum artefact?

It would seem like adapting Scrum to reality, a mere formalization of an existing practice: because it is extremely important to put quality even more at the heart of what we do, and because we want to clear out that little grey zone in the base Scrum framework that allows some people to doubt the formal need for a definition of Done. With regard to the latter, it would provide additional guidance for people and organizations to improve and to investigate the next steps on the cobblestone path to Agility, although probably not the guarantee hoped for by making it a mandatory artefact.

All existing Scrum artefacts support the ‘inspect & adapt’ cycles of Scrum; they provide accurate, up-to-date and transparent information to be inspected and adapted at the rhythm of the Scrum events (or sooner). In that sense, Done is already an artefact; it is in the Increment, making the state of the Increment transparent.

I suggest formally including inspection of the Definition of Done at the Sprint Review, alongside the inspection of the working Increment, of which it is a characteristic. This pair-wise inspection serves to get feedback and input from the stakeholders that goes beyond mere functionality and business requirements. It will invoke a collaborative conversation over quality, and over the organization’s requirements with regard to the quality of working software.

The Sprint Review is also the time to inspect the current state of the Product Backlog, i.e. what is now Done, what was left undone in this Sprint, and what was additionally turned Done. From this current state, combined with the latest Velocity measurement, the actual product progress becomes visible to the stakeholders.
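
As a minimal illustration of that last point, assuming hypothetical numbers, stakeholders can derive a rough forecast from the remaining (not yet Done) Product Backlog and the latest Velocity:

```python
import math

# Hypothetical figures; real values come from the Product Backlog and
# the Velocity actually measured over past Sprints.
remaining_points = 120   # estimated Product Backlog work not yet Done
latest_velocity = 30     # points turned Done in the last Sprint

# Rough forecast of remaining Sprints, assuming Velocity stays in the same range.
# This is an indication for inspection at the Sprint Review, not a commitment.
forecast_sprints = math.ceil(remaining_points / latest_velocity)
print(f"Roughly {forecast_sprints} Sprints of work remain at the current pace.")  # 4
```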

I suggest laying ownership of the definition of Done with the Development Team. A definition of Done can’t be forced upon a Development Team, nor can it be cut short by forces outside of the Development Team. The Development Team will obviously include functional quality expectations from the Product Owner. The Development Team will obviously include general, organizational expectations and compliance (e.g. from the development or engineering organization). But it is up to the Development Team to process these into the definition of Done. Decisions over the definition of Done will depend on the presence of skills, authorizations and the availability of external systems, services and interfaces. A Development Team would probably include stubs and simulators for non-available systems, add these to their definition of Done and make the impact of these dependencies clear to the Product Owner for ordering the Product Backlog. The effect on the planning horizon then becomes clear to the stakeholders not only by inspecting the Product Backlog at the Sprint Review, but also via explicit inspection of the definition of Done accompanying the working Increment.
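
As a purely hypothetical sketch of the stubs-and-simulators idea (names and interfaces invented for illustration), a team might hide a not-yet-available external system behind a small interface and work against a stub, while keeping the real integration visible as work still separating the Increment from ‘releasable’:

```python
from typing import Protocol

class ExchangeRateService(Protocol):
    """Interface to an external system that is not yet available to the team."""
    def rate(self, currency: str) -> float: ...

class StubExchangeRateService:
    """Temporary stand-in; replacing it with the real integration is made
    explicit to the Product Owner as remaining work towards Done."""
    def rate(self, currency: str) -> float:
        return 1.0  # fixed, obviously fake rate

def convert(amount: float, currency: str, service: ExchangeRateService) -> float:
    return amount * service.rate(currency)

# The Increment works against the stub; the dependency stays transparent.
print(convert(100.0, "EUR", StubExchangeRateService()))  # 100.0
```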

The inspection of the working Increment and the definition of Done at the Sprint Review, and the related collaboration of the Scrum Team with the stakeholders, will provide the Development Team with important information to sustain, evolve and grow the definition of Done. They will probably have a deeper conversation about it at the Sprint Retrospective. Driven by self-organization, the Development Team will include all that is actually possible, dig deeper, take the feedback from the stakeholders into account, and evolve the definition of Done as part of their continuous improvement of quality.

This ownership is comparable to the ownership of the Product Backlog. The Product Owner has accountability over it, but that doesn’t keep the Product Owner from taking into account the technical and development input from the team, nor from taking into account dependencies, non-functionals and organizational expectations. And after all, the frequent inspection of a working Increment provides information on reality to all involved, so they can incorporate it in their work through their respective accountabilities.

A Desirable Side-effect

Although the goal is not to promote the Definition of Done to a Scrum artefact (as shown, that is not needed), I do see a desirable side-effect in explicitly inspecting it at the Sprint Review: additional transparency to the overall agile adoption.

Obviously the definition of Done will not immediately, from day 1 of the adoption of Scrum, hold every possible task, activity or requirement needed to render every Increment completely production-ready. There will be a gradual evolution in applying Scrum. This is good, as it helps all players continuously exploit what is possible to the maximum extent.

Promoting inspection of the definition of Done with the stakeholders will help identify improvement areas in capturing enterprise agility. Finding out what is and is not included provides an indication of the involvement of the broader organization in agile product development, and even of organizational impediments. Stakeholders, in consultation with the Product Owner and Scrum Master, might want to act on these from a change management perspective.

In a transformational period, including the definition of Done as an explicit artefact in the Scrum framework will help people and organizations in the software industry to… improve from better and more realistic insights.

Reflecting on Sprint Length via 1-day Sprints

The foundation of Scrum is empirical process control, a technique to build complex products in complex environments, where few activities are repeatable and the course of work is quite unpredictable, which is certainly the case for the creative work that software development is.

In empirical process control, objectives are fed into a system, and via closed-loop feedback the results are regularly inspected against those objectives in order to adapt materials, tasks and process. Skilled inspectors inspect at an appropriate frequency, so that focus and time to create valuable output are balanced against the risk of allowing too much variance in the output.

Scrum includes two cycles and a lightweight set of events and artefacts to perform inspection and adaptation upon transparently available information and commonly understood standards.

  • At the Daily Scrum the Development Team inspects its progress, estimates, planning and tasks within the container of the Sprint. All of these elements were initially laid out at the Sprint Planning. They use the Sprint Backlog, the Sprint Goal and a progress trend on remaining effort. This assures that they don’t get out of sync with each other, or with the Sprint Goal, for more than 24 hours.
  • At the Sprint Retrospective the Scrum Team inspects the complete, well, ‘process’. Rather, the way they have played the game of Scrum in the past Sprint. The objective is to define what playing strategies will be applied in the next Sprint. No topics are excluded: tools, technology, communication, relationships, quality, engineering standards, the definition of Done, … It’s basically about establishing what went well, what shows room for improvement and what experiments might be useful to conduct in order to learn and build a better product.

There seems to be a tendency to move to shorter Sprints. The latest version of the Scrum Guide advises Sprints of 1-4 weeks. With 4 weeks being an absolute maximum, I think 1 week is an acceptable minimum.

Let’s say you would do 1-day Sprints. Both of Scrum’s described inspection events would occur at the same time, or at least take place at the same frequency. The danger is very real that a Scrum Team will focus merely on its daily work and progress, but take no time to inspect & adapt the overall process and ways to improve quality.

We should also keep in mind that Sprint length is best defined by the frequency at which the Product Owner and the Development Team need to consult with the stakeholders over a working version of the product and the decision on a functional release of the product. Sprint length should take into account the risk of losing a business opportunity because Sprints are too long. Well, if your business is indeed so volatile that this window is only 1 day, please do 1-day Sprints and release daily. But be careful not to burn your inspection mechanisms.

Above all, do organize the Scrum events and enjoy the adaptive power of the two mutually reinforcing inspect & adapt cycles. If you would only release on a weekly basis, be realistic and go for the optimal approach: a Sprint that matches your product cycle and takes 1 week, within which you still adapt to reality on a daily basis.