
The Definition of Done: Ensuring Quality without Compromising Value

TL;DR: The Definition of Done (DoD) ensures every release meets clear quality standards, while acceptance criteria define specific content requirements. Mixing acceptance criteria into the DoD can undermine transparency and adaptability, but consistently required quality measures should be added to the DoD itself. Review your acceptance criteria regularly and only update the DoD if it preserves both clarity and flexibility in your team’s delivery.


The Definition of Done (DoD) is a sacrosanct measure of quality, ensuring that every piece of work meets the standards necessary for release. On the other hand, acceptance criteria focus on the work’s content. Merging the two can risk the integrity of a working, usable product. This article delves into the nuances of maintaining the sanctity of the DoD while ensuring the delivery of valuable increments.

Why having Acceptance Criteria on the Definition of Done risks the sanctity of the working, usable product

The DoD is my touchstone for quality. It’s a measurable and automatable standard that ensures every increment is of a certain quality. Introducing optional or non-executable elements into the DoD can diminish its importance, potentially jeopardising the transparency and empiricism foundational to our work.
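To make “measurable and automatable” concrete, here is a minimal sketch of a Definition of Done expressed as an executable quality gate. The specific checks (pytest, ruff, a package build) are illustrative assumptions for this example, not a prescribed standard; substitute whatever measures your team has actually agreed for every increment.

```python
# Minimal sketch: a Definition of Done expressed as an executable quality gate.
# The listed checks are illustrative assumptions; replace them with the
# quality measures your team has agreed for every increment.
import subprocess
import sys

DEFINITION_OF_DONE = [
    ("unit tests pass", ["pytest", "--quiet"]),
    ("static analysis passes", ["ruff", "check", "."]),
    ("package builds", ["python", "-m", "build"]),
]

def increment_is_done() -> bool:
    """Return True only when every agreed check passes for the increment."""
    all_passed = True
    for description, command in DEFINITION_OF_DONE:
        result = subprocess.run(command, capture_output=True)
        passed = result.returncode == 0
        print(f"{description}: {'ok' if passed else 'FAILED'}")
        all_passed = all_passed and passed
    return all_passed

if __name__ == "__main__":
    # A non-zero exit code blocks the release: the increment is not "Done".
    sys.exit(0 if increment_is_done() else 1)
```

Because every check is executable, there is no ambiguity about whether an increment meets the standard: the gate either passes or it doesn’t.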

The DoD isn’t about guaranteeing that we built the right functionality; it’s about ensuring that what we’ve built works. It’s the bedrock of transparency. If stakeholders can’t tell whether the work meets the standards necessary for release, they have no clarity on the additional effort still required. Similarly, if Developers are unsure of these standards, they can’t accurately gauge the work needed.

Incorporating elements that might undermine this transparency or minimise these standards is a risk we shouldn’t take. The DoD’s primary purpose is to maintain this clarity.

Why having Acceptance Criteria on the Definition of Done may not risk the sanctity of the working, usable product

There are instances where quality measures form part of the acceptance criteria. In such cases, one might be tempted to add “must meet acceptance criteria” to the DoD. However, this can jeopardise the team’s ability to decide the Sprint’s contents flexibly while still upholding the DoD’s sanctity.

Instead, we should scrutinise the acceptance criteria. If specific criteria always need to be true, they should be incorporated into the standards and, by extension, the DoD. For instance, criteria like “all content is spelled correctly and has good grammar” or “all copy meets visual spacing guidelines” can be integrated into the DoD.
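As a hedged illustration, once a criterion such as correct spelling applies to every backlog item, it can be promoted to an automated check that runs as part of the DoD. The sketch below assumes the third-party pyspellchecker package and a content/ directory of markdown copy; both are assumptions for the example, not details from the article.

```python
# Sketch: promoting an "always true" acceptance criterion (correct spelling)
# into an automated Definition of Done check.
# Assumes the pyspellchecker package and a content/ directory of markdown copy.
from pathlib import Path
from spellchecker import SpellChecker

def misspelled_words(content_dir: str = "content") -> set[str]:
    """Collect words the spell checker does not recognise across all markdown copy."""
    checker = SpellChecker()
    unknown: set[str] = set()
    for page in Path(content_dir).rglob("*.md"):
        tokens = (word.strip(".,;:!?\"'()") for word in page.read_text().split())
        unknown |= checker.unknown(t.lower() for t in tokens if t.isalpha())
    return unknown

if __name__ == "__main__":
    errors = misspelled_words()
    if errors:
        raise SystemExit(f"Spelling criterion not met: {sorted(errors)}")
    print("Spelling criterion met for all content.")
```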

If you’re keen on including acceptance criteria in the DoD and wish to maintain the flexibility of value delivery, consider phrasing it as “for all work delivered in the Sprint, quality-based acceptance criteria on each backlog item are fulfilled”. This retains the original intent while allowing the team to adapt based on their learnings.

Conclusion

The DoD aims to ensure transparency, confirming that all showcased work meets our product’s standards. Anything that jeopardises this transparency or the adaptability of the Sprint’s contents should be approached with caution. Only adopt such changes if you can guarantee the preservation of adaptability and transparency.


NKDAgility can help!

These are the intricacies that lean-agile aficionados thrive on, but most find daunting. If you find it hard to distinguish between the Definition of Done and Acceptance Criteria, my team at NKDAgility is here to assist. Don’t let these issues undermine your value delivery. Seek help sooner rather than later.

Right now, you can request a free consultation with my team or enrol in one of our upcoming professional Scrum classes. Because you don’t just need agility, you need Naked Agility.


