A “good brief” isn’t a longer brief. It’s a clearer one.
It’s the kind of brief that lets capable people do their best work without guessing your meaning, tracking down missing stakeholders, or reworking deliverables that aren’t quite right.
If you’ve sat through enough project reviews, you know the difference between a brief that enables work and one that creates it. Good briefs act as a shared starting point: they make the problem, priorities, roles, timelines, constraints, success measures, and what counts as ‘done’ clear to everyone, so teams can work confidently and manage change openly.
This guide explains what to include in a good brief, gives you a completed example, offers a template you can use, and shares tips to help you avoid rework from the start.
What is a good brief (and what isn’t one)
A brief is a strategic specification of what needs to happen, who it’s for, the boundaries, and how you’ll know if it worked. That definition comes from APM, which calls it a “high-level outline of stakeholders’ needs and requirements for a project.”
The word 'strategic' is important here. A good brief is not just a list of tasks. It serves as a guide for every decision, trade-off, and discussion about scope that comes later.
Once a brief is signed, it becomes the baseline. Changes after that point are not just ‘feedback’ or ‘small tweaks’; they are changes to what was agreed. This is why version control and change requests matter. We’ll cover both in more detail later.
What a brief is not:
- Phrases like “Make it premium” and “Make it pop” are not brief inputs. They’re ambiguity dressed up as direction.
- A brief is not a replacement for team alignment. It should show decisions that have already been made, not try to get everyone to agree by being vague.
- It’s not a project plan, but it should include enough timeline details to help with planning.
- It’s not a project charter, but it should be just as clear about who sponsors the work, who approves it, and what authority the team has.
The 8 things every brief needs
ISBA’s briefing guide splits the process into three steps: clarifying, writing, and meeting together. The brief itself needs to cover certain key points. Here’s the minimum you need to avoid costly mistakes.
1. Context and problem
What’s happening and why now. What’s been tried before. What you learned.
If a brief lacks context, the team has to guess the background. Research on ambiguity (Berry, Kamsties, and Krieger) shows that missing context is a main reason documents fail to communicate clearly.
2. One clear objective
One primary objective, time-bound. If there are secondary objectives, put them in explicit priority order.
A key line for any project is: 'If we have to choose, we focus on X instead of Y.' If you can’t write this, the brief isn’t ready.
3. Audience and insight
Identify who you’re speaking to and the main insight that explains their behavior. ISBA says to treat your audience as people, not just numbers. Demographics tell you who they are, but an insight explains why they act the way they do.
Include barriers, objections, and misconceptions. These shape the work more than any persona slide.
4. Deliverables, channels, and scope
What you’re making, where it runs, and what’s out of scope.
Be specific about formats, quantities, lengths, specs, and platforms. List all required elements, like legal, brand, accessibility, and localization needs. Also, clearly state what this project does not include. Listing out-of-scope items helps prevent scope creep better than any contract.
5. Constraints, dependencies, and risks
Budget (even a ballpark). Timeline with hard dates and why they’re set in stone. Who supplies what inputs by when: legal sign-off, product data, assets, translations, approvals.
ISBA’s advice is clear: be honest about timelines, because fake deadlines weaken the work. And a missing budget, even a ballpark, signals the brief hasn’t been thought through.
6. Success measures and KPIs
One primary KPI that answers “did it work?” Two to four supporting KPIs as leading indicators. Baseline or benchmark values where possible. And a measurement owner, because if nobody owns tracking, nobody tracks.
The GCS evaluation framework splits measurement into outputs (what you delivered), outtakes (what people thought or felt), outcomes (what people did), and impact (what changed for the business). This order helps teams avoid confusing impressions with real results.
7. Roles and approvals (RACI)
Who owns the brief? Who delivers? Who must be consulted? Who gets informed? One accountable person per deliverable and approval gate. More on RACI below.
8. Acceptance criteria and Definition of Done
Ask two questions: 'Is this the right thing?' (acceptance criteria) and 'Is it ready to ship?' (Definition of Done). Most briefs leave this out, which is why projects often get stuck in 'almost done' cycles. More details are below.
Example: a filled-in campaign brief
Theory is helpful, but seeing a real example is even better. Here’s a sample campaign brief for a fictional SaaS product launch, following the structure above.
Brief: Q2 product launch campaign for Acme Analytics
Owner (A): Sarah Chen, Head of Marketing
Delivery Lead (R): James Okafor, Creative Lead
Version: v1.0 (signed baseline)
Context: Acme Analytics is launching a new reporting dashboard aimed at mid-market finance teams. Competitors shipped similar features in Q1. Our current customers have been requesting this for 8 months. Previous campaigns for feature launches underperformed because we led with the features instead of the problem they solve.
Objective: By 30 June, generate 400 qualified demo requests from mid-market finance directors. If trade-offs are needed, optimise for demo quality over volume.
Audience: Finance directors at companies with 200-2,000 employees who currently build reports manually in spreadsheets. Insight: they don’t want a “better dashboard.” They want to stop spending Friday afternoons rebuilding the same report for Monday’s leadership meeting.
Deliverables:
- 1x landing page (desktop + mobile)
- 3x email sequence (existing customers)
- 4x LinkedIn ads (static, 1200x628)
- 1x product walkthrough video (90 seconds max)
- Out of scope: blog content, sales collateral, PR
Constraints:
- Budget: $35,000 (media + production)
- Product data and screenshots available from 15 April
- Legal review required for all customer-facing claims (allow 5 business days)
- Launch date: 1 June (non-negotiable, tied to Q2 board presentation)
KPIs:
- Primary: qualified demo requests (outcome)
- Supporting: landing page conversion rate (output), email click-through rate (output), LinkedIn CTR (output), demo-to-trial conversion (outcome)
- Baseline: previous feature launch generated 180 demo requests
RACI:
- A: Sarah Chen (brief owner, final approver)
- R: James Okafor (creative), Priya Patel (email/ads), Tom Liu (video)
- C: Legal (claims review), Product (data/screenshots), Sales (messaging input)
- I: CEO (monthly update), Customer Success (launch timing)
Acceptance criteria:
- Landing page communicates the problem (not just the feature) in the first screen
- All claims substantiated and approved by Legal
- Email sequence includes unsubscribe and complies with GDPR
- Video loads under 3 seconds on mobile
Definition of Done:
- Proofread, including disclaimers and names
- Accessibility checked: captions on video, alt text on images, contrast
- Tracking links tested and live in analytics
- Final exports in correct formats, source files archived
- Version number applied
This is a brief you can actually use to deliver results. Each section answers a question the team would otherwise need to track down.
How to set up a RACI matrix for your brief
Most brief problems aren’t writing problems - they’re governance problems. The brief might be fine, but if no one agrees on who approves, who needs to be consulted, or who just needs to be informed, problems will surface mid-project.
APM recommends a RACI matrix (Responsible, Accountable, Consulted, Informed) for exactly this reason. Here’s the leanest version that works:
Accountable (A) is the brief owner. One person. Owns the objective, KPI targets, and trade-offs. Owns the signed baseline. Gets the final say when stakeholders disagree, unless compliance has a veto.
Responsible (R) are the delivery leads. Creative, content, design, production, channel leads. They own estimates, execution plans, and delivery quality.
Consulted (C) are stakeholders with real constraints: legal, compliance, brand, data, regional teams, product. They must be involved early, because their constraints shape what’s feasible. Bringing them in at the end is how you get last-minute rewrites.
Informed (I) are stakeholders who need visibility, not votes. Leadership updates, adjacent teams, and partners are affected by timing.
The rule is simple: have only one accountable person for each deliverable and approval step. If two people are accountable, then no one really is. If your approvals list has more than three names, you have a committee, not a brief.
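The single-accountable rule is easy to check mechanically. Here is a minimal sketch, assuming you record RACI assignments per deliverable in a simple mapping (the data shape and function name are illustrative, not a standard format):

```python
# Minimal sketch: validate the "one Accountable per deliverable" rule.
# The data shape (deliverable -> role letter -> names) is an assumption;
# adapt it to however your team records its RACI.

def check_raci(raci: dict) -> list[str]:
    """Return a list of problems; an empty list means the RACI passes."""
    problems = []
    for deliverable, roles in raci.items():
        accountable = roles.get("A", [])
        if len(accountable) != 1:
            problems.append(
                f"{deliverable}: expected exactly 1 Accountable, found {len(accountable)}"
            )
        if not roles.get("R"):
            problems.append(f"{deliverable}: no Responsible delivery lead")
    return problems

raci = {
    "Landing page": {"A": ["Sarah"], "R": ["James"], "C": ["Legal"], "I": ["CEO"]},
    "Video": {"A": ["Sarah", "Tom"], "R": ["Tom"]},  # two Accountables -> flagged
}
for problem in check_raci(raci):
    print(problem)
```

Running a check like this at sign-off catches the “two people are accountable, so no one is” failure before work starts rather than in the final review.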
Acceptance criteria and Definition of Done for briefs
Most briefs skip this section, yet it’s the one that prevents the most rework.
Acceptance criteria answer: “Is this the right thing?” Does the deliverable meet the brief’s intent, required messages, and mandatories? These are specific to each deliverable.
Example for a landing page:
- Communicates the problem (not just the feature) in the first screen.
- Includes compliance disclaimer above the fold.
- All claims approved by Legal.
- Supports analytics events for CTA clicks and form submissions.
Definition of Done answers: “Is this ready to ship?” This is a quality bar that applies across all work, regardless of the deliverable type.
Example DoD checklist:
- Proofread complete (disclaimers, footnotes, names)
- Accessibility checked (captions, alt text, contrast ratios)
- Legal or compliance approval recorded
- Links and forms tested, tracking verified
- Exports in correct formats, source files archived
- Version number applied, change log updated
APM says acceptance criteria are the conditions needed before a deliverable is accepted. Scrum’s Definition of Done goes further: if work doesn’t meet the DoD, it isn’t finished. This is a good standard to use. 'Nearly done' is not done.
How to measure brief success: from outputs to impact
If a brief doesn’t define how success will be measured, it leaves stakeholders to decide later if they 'like it.' This leads to endless feedback based on opinions.
The GCS evaluation framework and AMEC’s Integrated Evaluation Framework both use the same progression, and it’s useful for briefs:
Outputs are what you put into the world. Impressions, reach, email sends, page views. These measure delivery, not effectiveness.
Outtakes are what people think or feel. Awareness, recall, sentiment, intent. These measure whether the message landed.
Outcomes are what people do. Sign-ups, purchases, applications, referrals. These measure behaviour change.
Impact is what changed for the business. Revenue, cost reduction, retention, churn reduction. This is where you connect the work to organisational goals.
Most briefs stop at outputs and call them results. For example, 'We got 2 million impressions' is not a true success measure - it’s just a delivery metric. The brief should say which measurement matters most and who is responsible for tracking it.
Creative brief vs content brief vs design brief
Different work needs different brief structures. Here’s how the main types compare.
| Type | Best for | What makes it work | Where it usually goes wrong |
| --- | --- | --- | --- |
| Creative / campaign brief | Multi-channel campaigns, launches, creative platform work | One clear objective. Output and outcome metrics separated. Single-minded message. | Becomes a deliverables shopping list with no clear success measure. Fix: define the outcome, not just the outputs. |
| Content brief | Blog posts, whitepapers, video scripts, email sequences | Specific audience need. Clear structure. Defined tone with examples, not adjectives. | Endless edits because "tone" was never pinned down. Fix: include tone examples and separate musts from preferences. |
| Design brief | Brand, UX, packaging, presentations, service design | Constraints are explicit. Technical specs included. Ambiguity reduced by defining terms. | "Make it modern" causes three rounds of misalignment. Fix: translate adjectives into observable criteria. |
| Project charter | Cross-functional projects needing formal authority and resource allocation | Clear sponsor authority. Resource permission. Baseline for governance. | Exists separately from the working brief, so teams have authority but no execution clarity. Fix: pair it with a delivery brief. |
Choose the brief type that fits your work, and adjust the template below as needed.
Brief template you can copy
Use this template for any campaign, content, design, or project brief. Complete every section. If you can’t fill in a section, the brief needs more work.
Brief title:
Type: Campaign / Content / Design / Project
Owner (A):
Delivery lead (R):
Version: v0.x draft / v1.0 signed baseline
Date:
Context: What’s happening? Why now? What’s been tried before? What did we learn?
Problem / opportunity: One to two sentences. The real issue. Not the solution request.
Objective: By [date], achieve [specific change] for [audience].
Priority if trade-offs appear:
Audience:
- Primary:
- Secondary:
- Insight:
- Barriers / objections:
Key message: Single-minded message (one line). Supporting points (three max). Mandatories.
Deliverables and channels: List with specs (format, length, quantity). Where it runs. Mandatories (legal, brand, accessibility). Out-of-scope items.
Constraints and dependencies: Budget. Timeline with hard dates and reasons. Dependencies (who provides what, by when). Risks and assumptions.
KPIs:
- Primary (outcome):
- Supporting (outputs / outtakes):
- Baseline / benchmark:
- Measurement owner:
RACI:
- Accountable:
- Responsible:
- Consulted:
- Informed:
Acceptance criteria: (Checklist. What must be true for approval?)
Definition of Done: (Quality checklist. What must be true before it ships?)
Sign-off and change control:
- Approver(s):
- Sign-off date:
- Baseline version: v1.0
- Changes after sign-off require a change request.
Version log:
- v0.1 (draft) - date - notes
- v1.0 (signed) - date
- v1.1 (minor change) - date - impact summary
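The template’s completeness rule (“if you can’t fill in a section, the brief needs more work”) can also be enforced mechanically. Here is a minimal sketch, assuming the brief is captured as a simple dictionary whose keys mirror the template sections; the field names are illustrative, not a standard schema:

```python
# Minimal sketch: flag empty sections before a brief is circulated.
# Field names loosely mirror the template above; the dict format is
# an assumption, not a required structure.

REQUIRED = [
    "context", "problem", "objective", "audience", "key_message",
    "deliverables", "constraints", "kpis", "raci",
    "acceptance_criteria", "definition_of_done",
]

def missing_sections(brief: dict) -> list[str]:
    """Return the names of required sections that are empty or absent."""
    return [f for f in REQUIRED if not str(brief.get(f, "")).strip()]

draft = {"context": "Q2 launch", "objective": "400 demos by 30 June"}
print(missing_sections(draft))  # everything except context and objective
```

This is the same check a structured briefing tool runs automatically: an incomplete brief gets flagged before it reaches the team, not after work has started.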
Common briefing pitfalls and fixes
These are the patterns that break projects most often.
| Pitfall | What happens | Fix |
| --- | --- | --- |
| Vague objectives | Team delivers something that technically meets the brief but misses the point. | Write the objective as a time-bound result statement tied to an outcome KPI. |
| Outputs confused with outcomes | "Success = 2M impressions" but nobody signed up. | Use the outputs, outtakes, outcomes, impact sequence. Define which level matters. |
| Hidden stakeholders | Legal or brand appears in the final review with new requirements. | Name all Consulted stakeholders in the RACI. Involve them before the first draft. |
| "Nearly done" loops | Three rounds of "almost there" revisions with no clear finish line. | Add acceptance criteria and a Definition of Done checklist. If it meets both, it's done. |
| "Quick change" scope creep | A small request after sign-off turns into a new objective. | Baseline the signed brief. Post-sign-off changes go through a change request. |
| Approval by committee | Everyone has an opinion, nobody has authority. | One accountable owner per deliverable. Maximum three approvers. |
Brief lifecycle: from intake to close
Version control for signed briefs:
- v0.1 / v0.2 = working drafts
- v1.0 = signed baseline
- v1.1 / v1.2 = minor approved changes (same objective)
- v2.0 = material change to objective, scope, budget, or timeline. Requires re-approval.
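Those numbering rules are mechanical enough to automate. Here is a minimal sketch of the bump logic, assuming versions follow the vMAJOR.MINOR pattern described above:

```python
# Minimal sketch of the brief versioning rules above: a minor approved
# change bumps the second number; a material change (objective, scope,
# budget, or timeline) bumps the first and resets the second.

def next_version(current: str, material: bool) -> str:
    major, minor = (int(part) for part in current.lstrip("v").split("."))
    if material:
        return f"v{major + 1}.0"    # material change: requires re-approval
    return f"v{major}.{minor + 1}"  # minor approved change: same objective

print(next_version("v1.0", material=False))  # v1.1
print(next_version("v1.2", material=True))   # v2.0
```

Keeping the bump rule explicit like this makes the change-control conversation concrete: either a request fits inside the current objective (minor), or it triggers re-approval (material).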
How briefin makes structured briefing the default
It’s easy to know what belongs in a brief. The hard part is making sure it happens every time, for every project, client, and team.
Most agencies have a Word document template somewhere, but people often skip fields. Version history is just what’s in someone’s inbox. Approvals might happen over Slack or not at all. The process usually breaks down in practice, not in theory.
That’s the problem briefin was built to solve. It’s a briefing platform built specifically for agencies that replaces the scattered process of email chains, Word docs, and verbal requests with structured, trackable briefs.
The template library includes drag-and-drop brief builders with field types, conditional logic, and categories covering campaign briefs, brand strategy, app launches, and more. If a required field is left blank, the brief is flagged before it reaches your team. No more “I didn’t realise that section was important.”
Approval workflows route briefs through the right reviewers before work starts, and every edit after submission auto-creates a new version with timestamps and author tracking. The signed brief is the baseline. The version history is the record. No more “I thought we agreed on…” conversations.
Instead of hoping people follow a RACI chart in a document, briefin’s workflows enforce it. Brief owners submit. Reviewers review. Approvers approve. Everyone else sees progress on the Kanban board without digging through email or Slack.
briefin’s AI brief intelligence reviews briefs for clarity and completeness, suggesting improvements to structure, messaging, and depth. It helps clients who struggle to articulate what they need. The AI supports the structure. It doesn’t invent direction.
When a brief hits “Approved,” briefin can auto-create tasks in Magnetic, Asana, Jira, Trello, Monday, or Wrike with the full brief attached. And the analytics dashboard tracks brief volume, completion rates, and where things get stuck, so the learning stage is built in rather than something you promise to do next quarter.
Agencies using briefin save 3-7 hours per brief on chasing information alone. That adds up to less rework, better margins, and fewer “can we just jump on a quick call to clarify” meetings.
Get started with briefin or book a demo.
Sources
- Association for Project Management (APM), Project Management Glossary: acceptance, acceptance criteria, baseline, client brief, change request, change control.
- APM, "What is change control?": log, evaluate, approve/reject/defer, update plan.
- APM, Stakeholder engagement principle: RACI matrix recommendation.
- Schwaber & Sutherland, The Scrum Guide (2020): Definition of Done, Sprint Review, transparency.
- AMEC, Integrated Evaluation Framework: outputs, outtakes, outcomes, impact.
- ISBA, How to brief your agency: a best practice guide: three-stage briefing process, practical components.
- UK Government Communication Service (GCS), Evaluation Cycle: six-stage measurement framework.
- IAB Europe, Digital Advertising Effectiveness Measurement Framework: harmonised definitions across media, brand, and sales.
- Berry, Kamsties & Krieger, From Contract Drafting to Software Specification: Linguistic Sources of Ambiguity.
- Kamsties, Berry & Paech, Detecting Ambiguities in Requirements Documents Using Inspections.
- Project Management Institute (PMI), PMBOK Guide: project charter definition.
Related blogs
Scope Creep Is a Briefing Failure: Why Modern Work Breaks Down Before It Starts