A QA checklist before you publish findings helps you catch four common problems: factual errors, missing or wrong attribution, lost context, and unbalanced evidence. Use the steps below to verify quotes, confirm sources, and clearly state limitations so readers can trust what you publish. A shared checklist also speeds team review, because everyone checks the same items.
Key takeaways:
- Check facts against primary sources and keep an audit trail for every number, quote, and claim.
- Confirm attribution (who said what, when, and where) and label anonymous sources clearly.
- Add context so readers understand conditions, definitions, and what the data can and cannot show.
- Balance evidence by presenting credible alternatives, counterpoints, and uncertainty.
- Use a “red flags” list to spot claims that need stronger support or different wording.
What “QA before publishing” means (and what it is not)
Quality assurance (QA) for published findings is a final review process that checks accuracy, attribution, context, and balance before an audience sees your work. It should happen after you finish analysis and writing, but before final approval and distribution.
QA is not a rewrite pass, and it is not only a grammar check. It is a structured verification step that asks: “Can we prove this, and have we explained it fairly?”
Who should use this checklist
- Researchers sharing results (academic, UX, market research).
- Journalists, editors, and content teams publishing reports or investigations.
- Product teams publishing benchmarks, case studies, or white papers.
- Nonprofits and public agencies publishing evaluations and impact reports.
When to run QA
- Draft complete: conclusions and recommendations are written.
- Assets ready: charts, tables, and quotations are inserted.
- Before final sign-off: legal/compliance and comms review (if applicable).
- Before translation/captions: avoid multiplying errors across versions.
Step-by-step QA workflow (fast, repeatable, and team-friendly)
This workflow keeps QA efficient by separating “truth checks” from “presentation checks.” If you try to do everything at once, you miss things.
Step 1: Create a claims log
Make a simple table with one row per claim. Include the exact sentence, the type of claim, and the supporting source.
- Claim: Copy the sentence as written.
- Type: quote, statistic, interpretation, causal claim, comparison, or recommendation.
- Support: link, citation, transcript timestamp, dataset name, or experiment ID.
- Status: verified / needs work / remove.
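A claims log can live in any spreadsheet, but the row structure above can also be sketched in code. The field and status names below mirror the list above; everything else (the example claims, the class name) is illustrative, not a prescribed schema.

```python
from dataclasses import dataclass

# Illustrative sketch of one claims-log row; field names follow the
# checklist above, example values are invented.
@dataclass
class Claim:
    text: str        # the sentence exactly as written in the draft
    kind: str        # quote, statistic, interpretation, causal claim, comparison, or recommendation
    support: str     # link, citation, transcript timestamp, dataset name, or experiment ID
    status: str = "needs work"   # verified / needs work / remove

log = [
    Claim("Churn fell 12% in Q2.", "statistic", "dataset: churn_q2.csv"),
    Claim('"We never saw that error."', "quote", "transcript 04:12"),
]

# Nothing ships while any row is still unresolved.
unresolved = [c.text for c in log if c.status != "verified"]
```

One row per claim keeps the review mechanical: a reviewer works down the "needs work" rows instead of rereading the whole draft.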
Step 2: Verify in this order
- Numbers first: readers repeat numbers, and small errors travel far.
- Quotes next: misquotes harm trust and relationships.
- Attribution: wrong attribution can create legal and ethical issues.
- Interpretation and context: ensure your “so what” matches the evidence.
Step 3: Do a cold read for balance
After verification, do one uninterrupted read as if you disagree with your own conclusion. Mark where a critical reader would say, “But what about…?”
Step 4: Lock sources and version control
Freeze the source set you used (documents, datasets, transcripts) and record where they live. If something changes later, you need to know which version supported the published claim.
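One lightweight way to freeze a source set (an assumption, not the only approach) is to record a checksum for every file at sign-off. If a dataset or transcript changes later, the digest no longer matches and you know the published claim rested on an earlier version.

```python
import hashlib
from pathlib import Path

def source_manifest(paths):
    """Map each source file to its SHA-256 digest so later edits are detectable.

    Store the returned mapping alongside the published piece; re-running it
    later and comparing digests shows exactly which sources changed.
    """
    manifest = {}
    for p in paths:
        manifest[str(p)] = hashlib.sha256(Path(p).read_bytes()).hexdigest()
    return manifest
```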
Accuracy checklist: make every fact traceable
Accuracy means more than “no typos.” It means each factual statement matches a reliable source, and the reader can understand what you did.
Facts and figures
- Check every number against the original source (not a secondary summary).
- Confirm units, time ranges, and denominators (per month vs per year, per user vs per session).
- Recalculate any derived values (percent changes, averages, rates) from the raw numbers.
- Make rounding consistent and disclose it if it changes meaning.
- Ensure charts match tables and narrative text (no mismatched totals or labels).
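Recomputing derived values is usually a one-liner, and it catches both transcription slips and silently missing denominators. A minimal sketch (the figures are invented for illustration):

```python
def percent_change(old, new):
    """Recompute a derived percent change from the raw numbers it came from."""
    if old == 0:
        raise ValueError("undefined baseline: denominator is zero")
    return (new - old) / old * 100

# Check the draft's claim ("a 25% increase") against the raw figures
# pulled from the primary source, allowing only for rounding.
claimed = 25.0
recomputed = percent_change(400, 500)
assert abs(recomputed - claimed) < 0.5, "draft figure does not match raw data"
```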
Methods and reproducibility (at the reader’s level)
- Define your key terms the same way throughout the piece.
- State what data you used, from where, and the inclusion/exclusion rules.
- Explain major cleaning or filtering steps that could change results.
- Confirm that examples and screenshots represent typical cases (or label them as edge cases).
Quote verification
- Match each quote to the original recording, transcript, or document.
- Verify wording, punctuation, and meaning (especially around “not,” “never,” and numbers).
- Confirm the quote’s timestamp or page number in your notes.
- Check you did not remove surrounding context that changes intent.
- Mark any edits with brackets or ellipses, and ensure the edit does not change meaning.
If you work from audio or video, use a transcript you can search and time-stamp. If you outsource this step, consider adding a second pass such as transcription proofreading so quotes and names stay consistent.
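With a searchable transcript, the "exact words" check can be partly mechanized. The sketch below normalizes only whitespace, so wording and punctuation must still match exactly; quotes you edited with brackets or ellipses will (correctly) fail and need a manual check against the source. The example transcript is invented.

```python
import re

def quote_in_transcript(quote, transcript):
    """Return True only if the quote's exact words appear in the transcript.

    Whitespace is normalized; everything else must match, which is the
    point: a near-miss around "not" or a number is exactly what this
    check should catch.
    """
    norm = lambda s: re.sub(r"\s+", " ", s).strip()
    return norm(quote) in norm(transcript)

transcript = "Well, we did not ship that build to customers in March."
quote_in_transcript("we did not ship that build", transcript)   # True
quote_in_transcript("we did ship that build", transcript)       # False
```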
Attribution checklist: make it clear who said what (and why readers should believe it)
Attribution is about credit and accountability. Readers need to know which statements come from your data, which come from experts, and which are your interpretation.
Source identification
- Confirm speaker names, titles, and organizations (and the correct spelling).
- Verify dates, locations, and settings (interview, survey, public talk, internal memo).
- Use consistent labels for participants (e.g., “Participant 12,” “Engineer, mid-size SaaS”).
- Distinguish primary sources (original data, direct interview) from secondary sources (articles summarizing others).
Quote and paraphrase boundaries
- Use quotation marks only for exact words that you can point to in the source.
- When paraphrasing, ensure you did not add certainty the source did not have.
- Don’t merge two separate quotes into one sentence without clear labeling.
Anonymity and permissions
- If a source is anonymous, say why (safety, job risk, privacy) and how you verified them.
- Confirm any consent requirements that apply to your work and location.
- Remove identifying details if they are not essential to the finding.
If you collect data from people, make sure you follow your organization’s research ethics and privacy rules. If you need background on consent in the US, the HHS Common Rule (45 CFR 46) is a useful reference for human-subjects research frameworks.
Context and limitations checklist: prevent true facts from becoming misleading
A statement can be accurate and still mislead if you strip away the conditions. Context shows what the evidence means and where it stops.
Context readers need
- What question you tried to answer, in plain language.
- Who or what the data represents (and what it does not represent).
- When the data was collected and whether conditions changed afterward.
- Definitions for key terms (and any changes from common usage).
- Any strong assumptions behind the analysis.
Limitations you should state explicitly
- Sample limitations: size, selection bias, missing groups, nonresponse.
- Measurement limitations: self-reporting, proxy metrics, instrument error.
- Design limitations: observational vs experimental, confounders, lack of randomization.
- Data limitations: missing values, changes in tracking, merges, duplicates.
- Generalization limits: what contexts your findings likely won’t apply to.
Language that helps you stay honest
- Use “suggests,” “is consistent with,” or “may indicate” when evidence is limited.
- Use “in our sample,” “during the study period,” and “for the measured metric” to narrow scope.
- Avoid “proves,” “always,” and “never” unless you truly can support them.
Balance checklist: present the evidence fairly (even when you have a point of view)
Balance does not mean giving equal weight to weak ideas. It means you treat credible alternatives and uncertainty with respect, and you show your work.
Balance of evidence
- List your strongest supporting evidence and your strongest conflicting evidence.
- Address plausible alternative explanations, not just easy-to-dismiss ones.
- Separate what you observed from what you infer.
- Include negative results or null findings when they matter to the conclusion.
- Check that headlines and summaries match the nuance in the body.
Common imbalance patterns to fix
- Cherry-picking: only the best examples make it into the draft.
- Survivorship bias: you focus on what remained and ignore what dropped out.
- Overgeneralization: you treat a subgroup as “everyone.”
- False precision: exact numbers without acknowledging uncertainty.
Quick “skeptical reader” test
- What would a reasonable critic argue is missing?
- What would change your conclusion?
- Which claims depend on assumptions you have not stated?
Red flags: claims that require stronger support (or different wording)
Use this list as a stop sign. If you see a red flag, add stronger evidence, narrow the claim, or remove it.
- Causal language without causal design: “X caused Y” based only on correlation.
- Big jumps from small samples: sweeping claims based on a few interviews or a tiny dataset.
- Single-source allegations: serious claims supported by only one person or document.
- Unverifiable superlatives: “best,” “worst,” “leading,” “unprecedented,” without criteria.
- Missing denominators: “a 40% increase” without the baseline.
- Ambiguous timeframes: “recently,” “now,” “over time” without dates.
- Claims that impact reputation: accusations, safety risks, or misconduct without documentation.
- Health, legal, or financial advice: recommendations that could harm people if wrong.
- Confident claims from noisy measures: conclusions drawn from proxies with known gaps.
For accessibility-related publications that include video, consider whether your captions and transcripts communicate meaning accurately. The WCAG overview is a practical reference for accessibility expectations and terminology.
Common questions
1) What should I verify first: writing, data, or quotes?
Verify data and quotes first, because they are the easiest to misstate and the hardest to fix after publication. Then confirm attribution, and only then do a final style pass.
2) How do I check quotes if I only have notes?
Only use quotation marks for exact words you can verify in a recording or document. If you cannot verify, paraphrase carefully and label it as a paraphrase, or remove it.
3) How detailed should limitations be?
Include limitations that could change how a reader interprets the conclusion or applies it. Keep them specific, and connect each one to what it affects.
4) What does “balance” mean if the evidence clearly supports one side?
Balance means you acknowledge credible counter-evidence and explain why your conclusion still holds. You do not need to amplify weak claims, but you should show you considered them.
5) How do I avoid accidental misattribution in multi-speaker interviews?
Use speaker-labeled transcripts, confirm names, and time-stamp key quotes. If you are uncertain, don’t guess; verify or remove the quote.
6) What’s a simple way to keep an audit trail?
Use a claims log with links to sources, plus a folder structure with versioned files. Keep one “source of truth” document that lists datasets, transcripts, and final figures.
7) Should I use automated transcription in my QA process?
Automated transcription can help you search and draft faster, but you should still review quotes against the audio for accuracy. If you want a faster first pass, see automated transcription and then run a verification pass for names, numbers, and critical statements.
Final QA checklist (copy/paste)
- Accuracy: every number and factual claim matches a primary source.
- Quotes: verified against recording/transcript; timestamps recorded; edits do not change meaning.
- Attribution: correct speaker/source, role, date, and context; paraphrases not marked as quotes.
- Context: definitions, timeframes, and scope are clear; assumptions stated.
- Limitations: sample, measurement, design, and data limits stated plainly.
- Balance: credible counterpoints addressed; uncertainty not hidden; headline matches body.
- Red flags: none remain without stronger support or narrowed wording.
- Version control: source set and figures locked; final files labeled and stored.
If transcripts, interview recordings, or meeting notes play a role in your findings, clean source material makes QA much easier. GoTranscript can support your process with professional transcription services, so you can verify quotes, speakers, and details before you publish.