How to Stop Stakeholders Overloading Your Discussion Guide (Full Transcript)

Keep qualitative guides aligned to objectives and protect data quality using pre-surveys, A/B guides, or online exercises.

[00:00:00] Speaker 1: Hi friends, I'm Kathryn Korostoff from Research Rockstar. Thanks for joining me for another episode of Conversations for Research Rockstars. Today I want to talk about a qualitative research topic, and if you've been doing qualitative research for a while, you've probably run into it: managing clients, stakeholders, and colleagues who try to put too much content into the discussion guide. Whether it's a discussion guide for focus groups or for in-depth interviews, for online or in-person research, the discussion guide is our data collection instrument. We need a solid discussion guide. The discussion guide is to a qualitative research project what a questionnaire is to a survey research project: it's the means by which we collect our data. And whether we're collecting qualitative or quantitative data, we're always very intentional about what data we need to collect. So the discussion guide is a very important part of our process.

Unfortunately, what can happen is that people start suggesting additional things to put into the discussion guide. Imagine you've put all of this time and effort into designing a great discussion guide. It's tightly aligned with the stated project objectives. You've designed it carefully so that the sequence of questions will help you get to really candid, deep answers, and so that nothing in the sequencing might introduce bias. You've framed your questions and any projective exercises in a way that will reduce the risk of social desirability bias or acquiescence bias, for example. So you've put a lot of work into draft one. You send it out to your colleagues, your clients, your stakeholders. And what happens? Everybody has feedback. It can be a little overwhelming, right? You might even be getting conflicting feedback: one person likes this section, another person hates that section. And then you have the people who suggest other things to put into the discussion guide. Hey, as long as we're talking to our customers, can you also ask about this?

When people keep making suggestions for your discussion guide and you feel they're drifting away from the stated project objectives, it is our professional obligation to remind them: hey, I really appreciate your interest in collecting this additional data, but I want to point out that it's not currently in our documented objectives. That's fine, but then can we update the project objectives? That's a totally appropriate thing to ask for. Maybe something that initially was a high priority is no longer such a high priority. Okay, so maybe we're going to add this, but we're going to take something else out. It is our job as the researcher to facilitate that conversation, because the truth is, a lot of times our clients and stakeholders don't realize that what they're asking for could negatively impact data quality. They're just thinking, oh, I've got a budget; I'm going to get as much data as I can from this budget. And it's on us to remind them that, well, if we do it this way, then these are some of the risks. So what can we do?
Well, if we really are being asked to collect more information than is wise to put in the discussion guide, there are three different things we can do. One is a pre-survey. That is, before we actually hold the focus groups or the IDIs, we can invite the participants who have been recruited to also complete a pre-survey. It might be something they do online. If you are doing, say, focus groups at a focus group facility, you can ask them to complete it in the waiting room. If it's an IDI, you might say, hey, as we're getting started, we're going to take the first five minutes to ask you to fill out a brief survey for us. So you could even give it to them at the beginning of the IDI. If you think you can invite them to do it ahead of time and they will comply, that's great. But in my experience, if you have, say, 20 people recruited for IDIs and you ask all 20 of them to complete an online survey the day before, maybe half of them will do so. So we just have to understand that compliance before the research event is not always perfect. Still, a pre-survey can be a great way to collect some information, especially information that is factual and survey-like anyway. Let's not take up our valuable qualitative data collection time asking people how many cars they own. That's just not a good use of that time.

Option number two: you can suggest to the client, wow, we've got so much great content here, so many things we can accomplish with this research project, but it's too much for a single discussion guide. So let's split it into two parts. Hey, we were planning on doing 30 IDIs, so how about we use discussion guide version A for 15 and discussion guide version B for the other 15? That way there might be some information we collect from everybody, but for some of the extra information, only half the people have to answer those particular components. You're not trying to jam everything into every single interview; you're load balancing a little bit by treating it as a two-part IDI project. And, of course, you can do the same thing for focus groups. If you're planning a set of six focus groups, maybe three use discussion guide A and three use discussion guide B. It's an option.

And then there is a third option, which relates to all of the cool things we can do with online research these days. For example, there are some really neat exercises you can build in for prioritization or rank ordering. I've seen projects where people are asked to pack their backpack for the day: if you're on your way out the door, what are all the things you're going to put in your backpack? It's an online exercise where they literally put things into their backpack, and it tells a story about how people plan their days. I've also seen some really cool online exercises where people did a sort of mini discrete-choice exercise: they were shown a set of product ideas and had to say which one they would be most likely to evaluate or purchase. Online exercises can even be a great way to energize the research.
If you're doing a 60-minute IDI, bringing in a three- or four-minute online exercise that feels a little quantitative or creative can be a great way to collect a lot of information quickly and, frankly, introduce a little fun for the participant to break up what might be a long conversation. So if you really do have to cover a lot of content, a pre-survey, splitting the guide into two versions, or building in an online exercise can all work well. I'll be sure to put links in the show notes to some of my favorite websites for online exercises, in case you haven't seen those before. If you have any questions, please let me know. Otherwise, I hope you'll find this helpful the next time you have a client or a colleague asking you to put too much information into your discussion guide. Thank you.
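To make the A/B guide split concrete, here is a minimal sketch (not from the episode; the participant labels, the even 15/15 split, and the fixed random seed are illustrative assumptions) of one way to randomly assign a recruited IDI sample to two discussion guide versions rather than assigning in recruitment order:

import random

def assign_guide_versions(participants, seed=42):
    """Randomly split recruited participants into two equal-sized groups,
    one per discussion guide version, so neither version is tied to
    recruitment order."""
    rng = random.Random(seed)   # fixed seed keeps the assignment reproducible
    shuffled = participants[:]  # copy so the original recruit list is untouched
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"Guide A": shuffled[:half], "Guide B": shuffled[half:]}

recruits = [f"Participant {i:02d}" for i in range(1, 31)]  # e.g., 30 recruited IDIs
assignments = assign_guide_versions(recruits)
for guide, group in assignments.items():
    print(guide, "->", len(group), "interviews")

Randomizing the split, rather than giving guide A to the first 15 people recruited, helps avoid systematically tying one guide version to whoever happened to be recruited or scheduled first.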

AI Insights
Summary
Kathryn Korostoff explains how qualitative researchers should handle stakeholders who try to overload discussion guides with extra questions. She emphasizes the discussion guide as the key data-collection instrument, tightly aligned to project objectives and designed to protect data quality and minimize bias. When additional requests drift from objectives, researchers should facilitate a prioritization conversation, update objectives, and explain the risks of overstuffing. She offers three practical alternatives: use a pre-survey for factual items, split the study into A/B versions of the guide across interviews or groups, or incorporate short online exercises (e.g., prioritization, ranking, choice tasks) to collect more information efficiently while keeping sessions engaging.
Title
Managing Scope Creep in Qualitative Discussion Guides
Keywords
qualitative research
discussion guide
stakeholder management
scope creep
research objectives
data quality
bias reduction
pre-survey
A/B discussion guide
online exercises
focus groups
in-depth interviews
Key Takeaways
  • Treat the discussion guide as the qualitative equivalent of a survey questionnaire—core to data quality.
  • Push back on added questions by re-anchoring to documented objectives and facilitating prioritization.
  • Explain how overloading the guide can reduce depth, introduce bias, and harm overall insights.
  • Use a pre-survey to capture factual or structured information without consuming interview time.
  • Split interviews or groups into A/B versions to cover more topics without overburdening each session.
  • Add short online exercises (ranking, prioritization, choice tasks) to gather extra input efficiently and keep participants engaged.
Sentiments
Neutral: The tone is practical and advisory, focusing on process, tradeoffs, and solutions rather than expressing strong positive or negative emotion.