
The Definition of Done: Ensuring Quality without Compromising Value

Explains how to maintain clear, measurable quality standards with the Definition of Done, while avoiding confusion with acceptance criteria and preserving product value.


The Definition of Done (DoD) is a sacrosanct measure of quality, ensuring that every piece of work meets the standards necessary for release. On the other hand, acceptance criteria focus on the work’s content. Merging the two can risk the integrity of a working, usable product. This article delves into the nuances of maintaining the sanctity of the DoD while ensuring the delivery of valuable increments.

Why having Acceptance Criteria on the Definition of Done risks the sanctity of the working, usable product

The DoD is my touchstone for quality. It’s a measurable and automatable standard that ensures every increment is of a certain quality. Introducing optional or non-executable elements into the DoD can diminish its importance, potentially jeopardising the transparency and empiricism foundational to our work.

The DoD isn’t about guaranteeing functionality; it’s about ensuring that what we’ve built works. It’s the bedrock of transparency. If stakeholders can’t tell whether the work meets the standards necessary for release, they have no clear view of the additional effort still required. Similarly, if Developers are unsure of these standards, they can’t accurately gauge the work needed.

Incorporating elements that might undermine this transparency or minimise these standards is a risk we shouldn’t take. The DoD’s primary purpose is to maintain this clarity.

Why having Acceptance Criteria on the Definition of Done may not risk the sanctity of the working, usable product

There are instances where quality measures form part of the acceptance criteria. In such cases, one might be tempted to add “must meet acceptance criteria” to the DoD. However, this can jeopardise the team’s flexibility to adapt the Sprint’s contents while still upholding the DoD’s sanctity.

Instead, we should scrutinise the acceptance criteria. If specific criteria always need to be true, they should be incorporated into the standards and, by extension, the DoD. For instance, criteria like “all content is spelled correctly and has good grammar” or “all copy meets visual spacing guidelines” can be integrated into the DoD.
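
To make that concrete, here is a minimal, hypothetical sketch of how such always-true criteria could become executable Definition of Done checks in the build. The script names, checks, and thresholds are illustrative assumptions, not something prescribed in this article; your team’s standards and tooling will differ.

```python
# Hypothetical sketch: "always true" quality criteria expressed as automated,
# executable Definition of Done checks. The helper scripts are assumptions.

import subprocess
import sys


def spelling_and_grammar_ok() -> bool:
    # e.g. run an assumed spell/grammar checker over the product's copy
    result = subprocess.run(["./scripts/check-copy.sh"], capture_output=True)
    return result.returncode == 0


def visual_spacing_ok() -> bool:
    # e.g. run an assumed UI lint against the visual spacing guidelines
    result = subprocess.run(["./scripts/check-spacing.sh"], capture_output=True)
    return result.returncode == 0


DEFINITION_OF_DONE = {
    "All content is spelled correctly and has good grammar": spelling_and_grammar_ok,
    "All copy meets visual spacing guidelines": visual_spacing_ok,
}

if __name__ == "__main__":
    failures = [name for name, check in DEFINITION_OF_DONE.items() if not check()]
    for name in failures:
        print(f"DoD not met: {name}")
    # A non-zero exit fails the build, so no increment ships below the standard.
    sys.exit(1 if failures else 0)
```

Because every check is executable, “done” stays a binary, transparent fact about the increment rather than a judgement call made item by item.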

If you’re keen on including acceptance criteria in the DoD and wish to maintain the flexibility of value delivery, consider phrasing it as “for all work delivered in the Sprint, quality-based acceptance criteria on each backlog item are fulfilled”. This retains the original intent while allowing the team to adapt based on their learnings.

Conclusion

The DoD aims to ensure transparency, confirming that all showcased work meets our product’s standards. Anything that jeopardises this transparency or the adaptability of the Sprint’s contents should be approached with caution. Only adopt such changes if you can guarantee the preservation of adaptability and transparency.


NKDAgility can help!

These are the intricacies that lean-agile aficionados thrive on, but most find daunting. If you find it hard to distinguish between the Definition of Done and Acceptance Criteria, my team at NKDAgility is here to assist. Don’t let these issues undermine your value delivery. Seek help sooner rather than later.

Right now, you can request a free consultation with my team or enrol in one of our upcoming Professional Scrum classes. Because you don’t just need agility, you need Naked Agility.
