
Data Security Plan Template for Recorded Research (Storage, Access, Encryption)

Christopher Nguyen
Posted in Research · 3 May, 2026

A data security plan for recorded research explains where you store audio and transcripts, who can access them, how you encrypt them, and when you delete them. If you write these decisions down before you record, you reduce privacy risk and make IRB, funder, and team reviews much easier. This template gives you a fill-in table plus practical defaults for storage, access roles, MFA, backups, sharing controls, and incident response.

Key takeaways

  • Start with a data map: audio files, transcripts, consent forms, identifiers, and analysis datasets often live in different places.
  • Use “least access” roles, require MFA, and avoid personal devices and consumer file sharing for sensitive research data.
  • Encrypt in transit (TLS) and at rest (device/server encryption), and document what tools provide it.
  • Plan retention and disposal up front, including how you will delete originals, backups, and exported copies.
  • Write a short incident response checklist so your team knows what to do in the first hour.

What to include in a data security plan for recorded research

Your plan should cover the full life cycle: collect, transfer, store, use, share, retain, and dispose. Keep it short enough that the whole team will follow it.

At minimum, include these sections.

  • Data inventory: what you collect (audio, video, transcripts, notes, identifiers), and why.
  • Storage locations: approved systems for raw recordings, working files, and long-term archive.
  • Encryption: how data is protected in transit and at rest.
  • Access roles: who can view, edit, export, or share data.
  • Authentication: MFA requirements and account rules.
  • Backups: what is backed up, how often, and where backups live.
  • Sharing controls: how you share with vendors, collaborators, or advisors.
  • Retention and disposal: how long you keep each data type and how you delete it.
  • Incident response: what to do if data is lost, leaked, or accessed without approval.

Quick definitions (so the plan stays clear)

  • Encryption in transit: protection while data moves (for example, upload/download), usually via TLS.
  • Encryption at rest: protection while data sits on a device or server (disk or database encryption).
  • Identifiers: names, emails, phone numbers, faces, voices, locations, or any combination that can point to a person.
  • De-identification: removing or masking identifiers; for audio, “voice” itself can be identifying.

Fill-in template: data map table (copy/paste)

Use this table as the backbone of your plan. Fill one row per data type and update it when your workflow changes.

| Data type | Contains identifiers? (Y/N) | Location (system + folder/project) | Access (roles / named group) | Encryption (in transit / at rest) | Retention (how long + trigger) | Disposal method | Sharing allowed? (how) |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Raw audio recordings |  |  |  |  |  |  |  |
| Verbatim transcripts |  |  |  |  |  |  |  |
| De-identified transcripts |  |  |  |  |  |  |  |
| Consent forms / contact details |  |  |  |  |  |  |  |
| Codebook / qualitative notes |  |  |  |  |  |  |  |
| Analysis dataset (exported quotes) |  |  |  |  |  |  |  |

Tip: If your institution provides approved storage (research drive, secure cloud, managed laptop), list the exact name of that service and the project folder path so new team members do not guess.
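If you want the data map to be machine-checkable as well as human-readable, the same table can live as structured records that a script validates before an IRB or team review. A minimal Python sketch; every field name, path, and role below is hypothetical:

```python
# One dict per table row; keys mirror the data-map columns (hypothetical names).
DATA_MAP = [
    {
        "data_type": "Raw audio recordings",
        "identifiers": True,
        "location": "ResearchDrive://projectX/raw_audio",
        "access": ["PI", "transcription_coordinator"],
        "encryption": "TLS in transit / disk encryption at rest",
        "retention": "delete after transcript QC",
        "disposal": "managed storage deletion",
        "sharing": None,  # not shared outside the team
    },
    {
        "data_type": "De-identified transcripts",
        "identifiers": False,
        "location": "ResearchDrive://projectX/transcripts_deid",
        "access": ["PI", "coding_team"],
        "encryption": "TLS in transit / disk encryption at rest",
        "retention": "analysis period + required archive period",
        "disposal": "managed storage deletion",
        "sharing": "controlled folder, sign-in required",
    },
]

REQUIRED = {"data_type", "identifiers", "location", "access",
            "encryption", "retention", "disposal", "sharing"}

def missing_fields(row):
    """Return the required columns a row leaves blank or omits."""
    return {k for k in REQUIRED if k not in row or row[k] == ""}

def rows_with_gaps(data_map):
    """List data types whose row is incomplete, for review before submission."""
    return [r["data_type"] for r in data_map if missing_fields(r)]
```

Running `rows_with_gaps` on the table before each review catches rows that were added in a hurry and never finished.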

Step-by-step security plan (storage, encryption, access, backups, sharing)

Use the steps below as your “default” workflow, then adjust based on sensitivity, regulations, and participant expectations. Document your final choices in the table above.

1) Choose storage locations based on sensitivity

Recorded research data often becomes risky because it spreads across laptops, email threads, and consumer file-sharing links. Reduce that risk by picking one primary storage system and one backup system, then banning everything else.

  • Best default for sensitive recordings: institution-managed secure storage (research drive or approved cloud).
  • Avoid as primary storage: personal email, USB sticks, unencrypted external drives, and ad-hoc messaging apps.
  • Separate locations: store identifiers (consent forms, contact lists) separately from recordings and transcripts.

2) Encrypt in transit and at rest (write down what provides it)

Encryption only helps when it is actually enabled and applied to every copy. Your plan should specify the tools that handle encryption and what your team must do (or not do) to keep it active.

  • In transit: require uploads/downloads over HTTPS/TLS, and use an approved VPN off-campus if your institution requires it.
  • At rest: require full-disk encryption for laptops and phones used for research, plus encryption on servers/cloud storage when available.
  • Local copies: limit downloads; prefer working inside secure storage rather than syncing to personal devices.

If you need a neutral reference point for encryption language, see NIST guidance on storage encryption.
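A lightweight way to enforce the in-transit rule is to gate uploads and downloads on an allow-list of TLS endpoints. A sketch; the hostnames are hypothetical placeholders for your institution's approved systems:

```python
from urllib.parse import urlparse

# Hypothetical approved endpoints; list your institution's real systems here.
APPROVED_HOSTS = {"storage.example.edu", "portal.example-vendor.com"}

def transfer_allowed(url):
    """Allow a transfer only if it uses HTTPS (TLS) and an approved host."""
    parsed = urlparse(url)
    return parsed.scheme == "https" and parsed.hostname in APPROVED_HOSTS
```

A check like this can sit in any upload helper script so plain-HTTP links or consumer file-sharing hosts fail loudly instead of silently leaking data.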

3) Define access roles (least access, time-bounded)

A good plan limits access by role, not by convenience. Make roles explicit so a new RA does not automatically get raw audio if they only need de-identified transcripts.

  • Principal Investigator (PI): approves access, sharing, and retention changes.
  • Research manager / data steward: administers folders, groups, and audit checks.
  • Transcription coordinator: prepares uploads, tracks files, and receives transcripts.
  • Coding team: accesses de-identified transcripts and coded datasets.
  • IT/security contact: helps with incidents and secure configuration.
  • Access rules to document:
    • Who can access raw audio versus de-identified transcripts.
    • Whether access expires when someone leaves the project.
    • Whether exporting/downloading is allowed, and who can do it.
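The role matrix above can be written down as a small permission map so access checks are explicit rather than folder-by-folder guesswork. A sketch; the role and data-type names are illustrative, not a standard:

```python
# Hypothetical role → data-type permission map matching the roles above.
PERMISSIONS = {
    "pi": {"raw_audio", "verbatim_transcripts", "deid_transcripts", "consent_forms"},
    "data_steward": {"raw_audio", "verbatim_transcripts", "deid_transcripts"},
    "transcription_coordinator": {"raw_audio", "verbatim_transcripts"},
    "coding_team": {"deid_transcripts"},
}

def can_access(role, data_type):
    """Least-access check: unknown roles get nothing by default."""
    return data_type in PERMISSIONS.get(role, set())
```

Note the default: a new RA with no assigned role gets no access at all, which is exactly the "least access" posture the plan asks for.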

4) Require strong authentication (MFA + account hygiene)

Recorded interviews can contain sensitive personal details, so treat access like you would treat access to other confidential research data. MFA reduces risk from password reuse and phishing.

  • Require MFA on storage, transcription portals, and any system that can export files.
  • Use institution accounts when possible, not personal accounts.
  • Turn off shared logins; give each team member their own account.
  • Set a rule for leaving staff: disable accounts and remove folder access the same day.
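The same-day offboarding rule is easy to audit: diff the storage system's account list against the current project roster. A minimal sketch, assuming you can export both lists:

```python
def stale_accounts(active_accounts, current_roster):
    """Accounts that still have access but whose owner has left the project."""
    return sorted(set(active_accounts) - set(current_roster))
```

Run this weekly (or after every departure) and the result should be an empty list; anything else is a name to disable today.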

5) Plan backups without creating uncontrolled copies

Backups protect against accidental deletion and ransomware, but they also create extra copies you must retain and delete properly. Your plan should say what gets backed up, where, and how you will handle deletions.

  • Backup scope: raw audio, transcripts, and codebook files usually need backup; temporary exports may not.
  • Backup location: separate from primary storage (but still approved and access-controlled).
  • Access: limit backup access to the PI/data steward and IT where needed.
  • Deletion reality: note that deletion may not remove data immediately from backups; document the backup retention window.
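The "deletion reality" point is worth quantifying: given your backup retention window, you can state exactly when a file deleted from primary storage has also aged out of backups. A sketch with a hypothetical 35-day window; use the window your IT team actually configures:

```python
from datetime import date, timedelta

BACKUP_RETENTION_DAYS = 35  # hypothetical backup retention window

def fully_gone_by(deleted_on, retention_days=BACKUP_RETENTION_DAYS):
    """Earliest date a file deleted from primary storage has left backups too."""
    return deleted_on + timedelta(days=retention_days)
```

Writing this date into participant-facing deletion commitments avoids promising "deleted today" when backups say otherwise.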

6) Control sharing (vendors, collaborators, and advisors)

Sharing is where many projects lose control. Treat every share like a mini data transfer project with a checklist.

  • Prefer: role-based access to a controlled folder over sending attachments.
  • Use link controls: no public links; require sign-in; set expiration; disable re-sharing when possible.
  • Send the minimum: share only the files needed (often de-identified transcripts, not raw audio).
  • Track it: keep a simple log of what was shared, with whom, when, and why.
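The sharing log can be as simple as a CSV that every share event appends one row to. A minimal Python sketch; the column names are one possible choice, not a required schema:

```python
import csv
from datetime import date
from pathlib import Path

def log_share(log_path, files, recipient, reason, approved_by):
    """Append one row per share event; write the header on first use."""
    path = Path(log_path)
    is_new = not path.exists()
    with path.open("a", newline="") as f:
        writer = csv.writer(f)
        if is_new:
            writer.writerow(["date", "files", "recipient", "reason", "approved_by"])
        writer.writerow([date.today().isoformat(), "; ".join(files),
                         recipient, reason, approved_by])
```

Keeping the log next to the data map means an auditor (or a future you) can answer "who got what, and why" without reconstructing email threads.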

If your work falls under U.S. health data rules, align your sharing choices with HIPAA Security Rule guidance when applicable.

Pitfalls to avoid (what breaks most plans)

Most research teams do not fail because they ignore security entirely. They fail because they add “just one exception” that turns into ten.

  • Using personal devices without encryption: laptops and phones hold downloads, caches, and synced folders.
  • Emailing recordings or transcripts: attachments get forwarded and stored outside your control.
  • Keeping identifiers with the data: contact lists and consent forms should not sit beside raw audio.
  • No offboarding checklist: former team members keep access or keep local copies.
  • Unplanned “analysis exports”: quotes copied into slide decks or spreadsheets become new datasets.
  • Retention creep: “keep forever” becomes the default because nobody chose a trigger and date.

Retention and disposal: write it so it’s easy to follow

Retention depends on your protocol, consent language, contracts, and institutional rules. Your plan should name a clear trigger (for example, “study closeout” or “publication”) and a clear action.

Simple retention wording you can reuse

  • Raw audio: keep until transcripts are quality-checked and de-identified, then delete unless your protocol requires keeping audio.
  • Verbatim transcripts with identifiers: keep until a de-identified version is created and validated, then restrict or delete.
  • De-identified transcripts: keep for the analysis period and any required archive period.
  • Consent forms and contact lists: store separately with tighter access; retain based on institutional requirements.
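Retention triggers become actionable once each data type records the date its trigger actually fired. A sketch using hypothetical trigger dates; `None` means the trigger has not been reached yet:

```python
from datetime import date

# Hypothetical rows: trigger_date is when the retention trigger fired, or None.
EXAMPLE = [
    {"data_type": "Raw audio recordings", "trigger_date": date(2026, 3, 1)},
    {"data_type": "De-identified transcripts", "trigger_date": None},
]

def disposal_due(data_map, today):
    """Data types whose retention trigger has passed, so the team can run
    the disposal method recorded in the plan for each of them."""
    return [row["data_type"] for row in data_map
            if row["trigger_date"] is not None and row["trigger_date"] <= today]
```

A monthly run of `disposal_due` is one concrete defense against the "retention creep" pitfall above.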

Disposal methods to specify

  • Secure deletion: use approved institutional methods or managed storage deletion workflows.
  • Device disposal: ensure drives are wiped through IT before reuse or disposal.
  • Third parties: confirm how and when vendors delete project files and how you document that deletion.

Incident response checklist (first hour, first day, follow-up)

You do not need a long playbook, but you do need a clear set of actions. Add names and contact info so no one hunts for the right person during a crisis.

First hour

  • Stop further access: disable links, revoke folder permissions, and sign out affected sessions if possible.
  • Preserve evidence: do not delete logs or overwrite files; take screenshots of sharing settings.
  • Notify the right people: PI, data steward, and institutional IT/security contact.
  • Document what happened: date/time, system, files involved, who discovered it, and immediate actions taken.

First day

  • Identify scope: which data types were exposed (audio, transcripts, identifiers) and how many participants were affected.
  • Reset credentials: change passwords and enforce MFA where missing.
  • Decide on notifications: follow institutional policy for IRB, privacy office, funder, and participant notification if required.

Follow-up (within 1–2 weeks)

  • Fix root causes: remove unsafe workflows and retrain the team.
  • Update the plan and table: add the missing control (for example, link expiration or download restrictions).
  • Record lessons learned: what worked, what slowed you down, and what you will change.
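The "document what happened" step in the first hour benefits from a fixed record shape so nothing is forgotten under pressure. A minimal sketch; the fields mirror the checklist above and are illustrative:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class IncidentRecord:
    """First-hour documentation: who found what, where, and what was done."""
    discovered_at: datetime
    system: str
    files_involved: list
    discovered_by: str
    immediate_actions: list = field(default_factory=list)

    def add_action(self, action):
        """Timestamp each containment step as it is taken."""
        self.immediate_actions.append(f"{datetime.now().isoformat()} {action}")
```

Filling one of these in as the incident unfolds gives the IRB, privacy office, and IT a clean timeline instead of a reconstructed one.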

Common questions

Do I need to encrypt audio files if my cloud storage says it is secure?

Often yes, because “secure” can mean many things. Your plan should state whether the storage encrypts data at rest and whether downloads to laptops are encrypted too.

Should I delete raw audio after transcription?

Many teams do, but the right answer depends on your protocol and consent language. If you keep audio, restrict it more tightly than de-identified transcripts and set a retention trigger.

How do I handle voices as identifiers?

A voice can identify a person, especially in small communities. If anonymity matters, consider limiting access to audio and sharing only de-identified transcripts for analysis.

Can research assistants use their own laptops?

They can, but it raises risk if devices lack full-disk encryption, strong passwords, and secure patching. A safer default is institution-managed devices or browser-only access to secure storage with no downloads.

What’s the simplest way to control sharing with collaborators?

Use role-based access to a controlled folder, require sign-in, and set link expiration. Avoid sending attachments and avoid public links.

What should I log for data transfers to transcription or coding vendors?

Log the date, files shared, method (portal/folder), who approved it, and when you received the deliverables. Also log when you requested deletion of project files, if that’s part of your workflow.

What if we discover a transcript contains more identifiers than expected?

Treat it like a data classification change. Restrict access, create a de-identified version, and update your table so everyone knows which version to use.

Optional add-on: a simple “one-page” plan you can paste into a protocol

  • Approved storage: [name system], project folder: [path].
  • Data separation: identifiers stored in [system/path], separate from research data.
  • Encryption: TLS for transfers; full-disk encryption on devices; storage encryption at rest where supported.
  • Access: raw audio limited to [roles]; de-identified transcripts available to [roles].
  • MFA: required for all accounts with access to storage or transcription portals.
  • Backups: [frequency], stored in [location], access limited to [roles].
  • Sharing: no email attachments; share via controlled folder; links expire in [days].
  • Retention: audio kept until [trigger]; transcripts kept until [trigger]; disposal via [method].
  • Incident response: contact [name/office], revoke access, document scope, follow institutional policy.

If you work with recorded interviews and need clean, usable text outputs for analysis, GoTranscript can support your workflow with transcription proofreading services and closed caption services when recordings also need accessibility deliverables. When you’re ready to move from planning to execution, GoTranscript also offers professional transcription services that fit into a controlled, documented research process.