tech·nic·al·ly agile

Your Evolving Definition of Done

Evolve your Definition of Done (DoD) to align with organisational goals, ensuring quality and strategic value in every product increment.

The Definition of Done (DoD) is not a static artefact; it evolves over time as a Scrum Team gains experience and capability. While the Scrum Guide acknowledges that teams may refine their DoD to improve product quality, there’s an often overlooked piece: Organisations should also provide an organisational Definition of Done that reflects their needs. This organisational perspective ensures that Scrum Teams build on a solid foundation, aligning technical execution with strategic goals.

The Definition of Done (DoD) is an objective, measurable standard of quality, not a negotiable target. Keep it clear, enforceable, and automated to ensure every Increment meets professional expectations.

Definition of Done - The Organisational Quotient

For a product to deliver real value, its quality criteria must align with organisational and market expectations. It should meet a minimum quality standard that ensures usability while safeguarding the organisation, its employees, and its users. Any failure to do so could damage the organisation’s reputation and trust in the product.

This means organisations should define a business-level DoD that captures these minimum standards.

Without this business-level perspective, teams risk optimising for technical completeness while missing the broader value delivery picture. The result of many iterations of the organisational definition of done for a product might look like:

Live and in production, gathering telemetry, supporting or diminishing the starting hypothesis.

This short sentence packs a lot in, and it is the commercial product definition of "done" for a team I have collaborated closely with for over 17 years.

  1. “Live and in production” - Done here means the Increment is in the hands of real users.

  2. “gathering telemetry” - Done here means the Developers must add code that collects relevant information from usage, performance, and similar signals.

  3. “supporting or diminishing the starting hypothesis” - Done here means the team must define success metrics before building a feature or capability, ensuring that the collected data provides clear evidence of whether the intended outcomes are being achieved.

None of these elements define the “why” or “what” of what we’re building—those are captured in the backlogs. Instead, they establish the minimum quality standard required for work to be considered done.
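The "gathering telemetry" and "starting hypothesis" criteria can be made concrete in code. The sketch below is purely illustrative, with hypothetical names and thresholds; it shows a feature shipping with a success metric declared up front and a telemetry hook, so the data collected in production can support or diminish the starting hypothesis.

```python
from dataclasses import dataclass, field


@dataclass
class FeatureHypothesis:
    """A success metric declared *before* the feature is built (criterion 3)."""
    name: str
    metric: str          # the telemetry field the hypothesis is judged on
    target: float        # threshold that would support the hypothesis
    observations: list = field(default_factory=list)

    def record(self, value: float) -> None:
        """The telemetry hook the Developers add to product code (criterion 2)."""
        self.observations.append(value)

    def verdict(self) -> str:
        """Is live usage supporting or diminishing the starting hypothesis?"""
        if not self.observations:
            return "no data yet"
        average = sum(self.observations) / len(self.observations)
        return "supporting" if average >= self.target else "diminishing"


# Hypothetical example: we predicted at least 80% of searches succeed first time.
search = FeatureHypothesis("improved search", metric="first_try_success", target=0.8)
for outcome in (1.0, 1.0, 0.0, 1.0, 1.0):   # telemetry from real users in production
    search.record(outcome)
print(search.verdict())   # average of 0.8 meets the target: "supporting"
```

The point is not the specific class but the ordering: the target exists before the code ships, and the telemetry is part of the Increment rather than an afterthought.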

Definition of Done - Translating Organisational Standards into Team Practice

While Scrum Teams are self-managing, that doesn’t mean they can do whatever they want. They operate within a structured environment, within a balance of leadership and control that upholds both autonomy and accountability. Scrum isn’t anarchy; it’s a social technology that enables self-management within clear constraints—Scrum events, commitments, and organisational expectations.

Each Scrum Team must interpret the organisational Definition of Done within their context, shaping an engineering-level DoD that aligns with it. While examples can guide them, it’s the team’s responsibility to determine what Done means within organisational constraints.

In addition to supporting the organisational definition of done, a robust DoD ensures that work meets a consistent level of quality before it is considered complete. This includes engineering practices, preferably applied as part of a shift-left strategy so that defects are caught as early as possible.

Each aspect contributes to quality, reducing the likelihood of defects and technical debt. However, quality isn’t just a technical concern—it is an economic and strategic one.
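One way to keep a DoD objective, measurable, and automated is to express each criterion as a check that either passes or fails. A minimal sketch follows; the criteria and field names are hypothetical stand-ins for real CI tooling such as test runners, linters, and security scanners.

```python
from typing import Callable, Dict

# Each criterion is an objective, automatable predicate over an increment.
# In a real pipeline each lambda would invoke the actual tool rather than
# inspect a dictionary of pre-collected results.
definition_of_done: Dict[str, Callable[[dict], bool]] = {
    "all tests pass":       lambda inc: inc["failed_tests"] == 0,
    "peer reviewed":        lambda inc: inc["approvals"] >= 1,
    "no critical findings": lambda inc: inc["critical_findings"] == 0,
}


def is_done(increment: dict) -> bool:
    """Done is all-or-nothing: every criterion must hold, with no partial credit."""
    return all(check(increment) for check in definition_of_done.values())


def audit(increment: dict) -> Dict[str, bool]:
    """Transparency: report exactly which criteria an increment fails."""
    return {name: check(increment) for name, check in definition_of_done.items()}


increment = {"failed_tests": 0, "approvals": 2, "critical_findings": 1}
print(is_done(increment))   # False: one criterion fails, so the work is not Done
```

Because the criteria are named and mechanical, `audit` makes the gap visible instead of leaving "done" to negotiation.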

The Evolution of Done Over Time

New teams often start with a weak DoD that doesn’t yet guarantee releasability. A brownfield product with legacy constraints may have a DoD that initially excludes automation, testing, or continuous deployment due to existing technical debt. Over time, through Sprint Retrospectives and deliberate improvements, the DoD should:

  1. Start at a minimal viable level (e.g., basic testing, peer reviews).
  2. Expand to include automated testing, security checks, and CI/CD.
  3. Reach a state where every increment is truly releasable.

An experienced Scrum Team should aim for a DoD that ensures shippability at the end of every Sprint. Anything less introduces unnecessary risk and delays value realisation.
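This ratchet-like evolution can be pictured as a series of criterion sets in which each stage strictly contains the previous one and nothing is ever dropped. A toy illustration, where the stage contents are examples rather than prescriptions:

```python
# Each maturing stage of the DoD only ever *adds* criteria, so quality
# ratchets upward and earlier guarantees are never lost.
stage_1 = {"basic tests", "peer review"}
stage_2 = stage_1 | {"automated regression tests", "security checks", "ci/cd pipeline"}
stage_3 = stage_2 | {"deployable to production on demand"}

# Every earlier criterion survives into every later stage.
assert stage_1 <= stage_2 <= stage_3

# Weakening the DoD is immediately visible: removal breaks the subset invariant.
weakened = stage_3 - {"security checks"}
assert not (stage_2 <= weakened)
```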

Common Misconceptions

  1. Can the DoD Change Per Sprint?
    Yes, but only to increase quality. The Sprint Retrospective is the right place to discuss DoD improvements, not reductions. However, if an issue arises, address it immediately—don’t wait for the Retrospective.

  2. Can the DoD Be Lowered to Deliver More Features?

    No. Quality is a long-term investment, not a short-term lever to pull for speed. A Scrum Team has no authority to cut quality—that’s a financial and risk decision made at the highest level. This authority rarely sits with project managers or middle management. If someone asks you to lower quality, tell them to get it in writing from the financial director.

  3. Can We Have Different DoDs Per Backlog Item?

    No. The DoD is a universal standard applied to all work, ensuring consistency in quality. Acceptance Criteria define specific conditions for a backlog item, but these conditions do not belong in the DoD.

  4. Should the DoD Be Fluid and Change Every Sprint?

    No. A fluctuating DoD signals dysfunction unless it’s always improving. Constant changes undermine transparency and disrupt planning. Evolution should be deliberate, incremental, and focused on raising quality—not shifting goalposts.

DoD as a Strategic Lever

A strong DoD isn’t just about engineering—it’s about protecting revenue, managing risk, and ensuring predictable delivery. Weak DoD practices lead to costly rework, delayed releases, and customer dissatisfaction. By embedding security, compliance, and quality checks into the development cycle, organisations reduce their exposure to financial and reputational risks. Teams that consistently meet a well-defined DoD can deliver with greater confidence, improving forecasting and market responsiveness.

A strong DoD reduces rework, increases predictability, and aligns technical work with business value. As organisations evolve, so should their quality expectations. This continuous refinement is not just a technical necessity—it’s a competitive advantage.

Connect with Martin Hinshelwood

If you've made it this far, it's worth connecting with our principal consultant and coach, Martin Hinshelwood, for a 30-minute 'ask me anything' call.
