A Practical Guide to Trialing and Piloting Insights Tools (Full Transcript)

Learn when to run a quick software trial vs a live pilot, plus tips to document success criteria, compare tools, and delegate evaluation efficiently.

[00:00:09] Speaker 1: I have a question. Have you attended any industry conferences lately? Maybe you've gone to one of the conferences here in the United States or outside of this country. There are certainly a lot of great market research and insights conferences both in the United States and around the world. Or, if you haven't actually attended one recently, I bet you've gotten a lot of email promoting them. So whether you've attended or just been looking at the promotions, you've probably noticed that at all of these different events there have been a ton of new companies. A lot of companies that are offering various types of software, basically, right? All different types of software tools and platforms to help with the work of doing market research and insights. And some of these tools and platforms are very simple and focused on a single function, like it's specifically a survey platform or specifically a text analytics platform or specifically a social listening platform. And then there are also these tools that seem to do it all, right? They have many different functions integrated into a single platform. But the theme I'm looking for here is just the idea that there are so many new types of software out there. There are all these new companies, all these new tools and platforms. And it can be a little bit overwhelming, right? Because we're all so busy with the day-to-day work of doing market research and insights that when you see all these cool new tools, it's really tempting, but it can also feel a little bit overwhelming. You know, which ones are you actually going to take time out of your busy day to try? So let me ask you a question. When was the last time you tried out a new market research tool? Has it been a while? How long have you been using your current survey platform? Two years? Three years? Five years or more? How about social media listening tools? 
Have you been using the same tool for five or more years now? How about text analytics? Or maybe there are certain types of tools you've been thinking about using, and you just haven't had a chance to try them out yet. It really is hard. Of course, there is inspiration here, too. In the February 2018 issue of Harvard Business Review, some of the executives at Unilever, the giant brand, talked about what they do in their insights team, and they casually mentioned in this article that they had trialed over 175, that's right, 175 different market research-related tools in a two-year period. Now, for us mere mortals in the market research and insights space, that would be a bit hard to match. However, surely we can do a couple of trials a year, right? Especially for those of us who have gotten comfortable with the tools we use. If you're happy with your tool, that's fantastic, but maybe you should still try out some of the newer competitors just to make sure your existing tool is keeping up with the latest in terms of features and user interface, because there really have been a lot of advances in market research software. So if you haven't tried out a new tool in a while, let me give you some tips. First things first: you can think about trying a new tool in terms of a trial or an actual pilot. So what's the difference between doing a trial of new software versus an actual pilot? Well, a trial is when you're testing the software, often in a sort of mock situation. You're just trying it out, not necessarily using it with a live project. And usually when you trial software, it's for a pretty short period of time, very often 7 to 14 days, just enough to give you a sense of whether or not it's something that you're going to find useful. 
And a lot of times a trial can just be used to verify whether or not the way the brand is promoting their software matches what it actually does, right? So you might see a promotion, and it sounds like maybe it's too good to be true, but oh, if it really does all these things, then I'm going to be so happy. So do a trial and find out for yourself. A quick trial is a great way to verify whether or not the tool has the features that you really, really want. It's also a good opportunity to try out the user interface. The user interface on market research software products has really progressed a lot in the last 5 years. If I compare some of the tools that I was using 5 to 7 years ago to the tools that I'm using today, the user interfaces have made a huge amount of progress. So a trial is a great way just to check out the user interface as well. Now once you do a trial, you might decide to go ahead and actually do a pilot. A pilot is when you're going to actually do a project. It might be a small project, but you're doing an actual project with that particular platform. So you might say, hey, I'm going to look at new survey platforms, and I'm going to actually run a small, live survey start to finish with this tool so that I can see what it's like to program the questionnaire, and what it's like when I'm doing my data collection and monitoring that process on this platform. And I also want to see what its reporting features look like, so I actually want to collect the data and then use that actual data to create a report. And again, it might not be a huge project. Maybe you're not doing a sample size of 2,000, but maybe you do a sample size of 100 or 200, just enough to actually use the product. So a pilot is a great opportunity once you've decided that you really want to put a little bit more time and effort into it, because obviously doing a pilot test does take more effort than simply doing a trial. 
And the great thing is that a pilot gives you that opportunity to really see what it would be like in real life. And usually you're going to specify some success criteria, like, I'm going to consider this pilot a success if I experience A, B, and C. Being really precise about what you want the pilot to show you is going to be very helpful. So in fact, let me give you a few more tips for how to conduct either a trial or a pilot, because again, I know how busy you all are, and I want to make sure that if you do decide to do a trial or even a pilot, you've got some clear tips that might help make the process as efficient as possible. So first of all, always have a document. Document what your plan is for the trial or the pilot. It doesn't need to take an hour of your time to create; the document for your trial or pilot should really be very simple. Don't make it more onerous than it needs to be. It might simply be a list of what features you want to test: if it's, say, a survey platform, how easy is it to import or export data? Or if it's some sort of text analytics tool, maybe you want to check out its ability to import and export data and see what its limitations might be. Or maybe it's a survey tool that promises really good data visualizations. You might want to see, okay, what are its formatting features, and can I create my own templates? So part of the documentation can simply be your wish list of features for that specific product. Whatever product it is, what's your wish list, right? What are the specific features that would make doing a trial or pilot really worthwhile for you? 
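One lightweight way to keep that wish-list document actionable is to turn it into a tiny weighted scorecard you fill in during the trial. Here's a minimal sketch in Python; the criteria, weights, and ratings below are all hypothetical placeholders, not recommendations.

```python
# Hypothetical wish-list criteria with weights (higher = more important to you).
CRITERIA = {
    "easy data import/export": 3,
    "mobile-responsive survey rendering": 3,
    "client-ready visualizations": 2,
    "custom report templates": 1,
}

def score(ratings):
    """Weighted total from per-criterion ratings on a 1-5 scale."""
    return sum(CRITERIA[c] * ratings[c] for c in CRITERIA)

# Example ratings you might record during two parallel trials (made up here).
tool_a = {"easy data import/export": 4,
          "mobile-responsive survey rendering": 5,
          "client-ready visualizations": 3,
          "custom report templates": 4}
tool_b = {"easy data import/export": 5,
          "mobile-responsive survey rendering": 3,
          "client-ready visualizations": 4,
          "custom report templates": 2}

print("Tool A:", score(tool_a))  # prints Tool A: 37
print("Tool B:", score(tool_b))  # prints Tool B: 34
```

Filling in the same scorecard for each candidate you trial makes the comparison concrete, though of course the numbers supplement, not replace, your hands-on judgment.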
And then think about what questions you have about that software, the questions that would make it worth the pain of not only paying for it and installing it, but of learning it and actually incorporating it into your processes. Because whenever you put a new piece of software into your tool chest, it's not just the cost of the software. It's really the time it takes to learn it and become proficient at it, and whether or not it's going to add any value to your projects. So for example, I might say in my documentation, I'm going to consider this pilot successful if doing task X can be done with three fewer clicks. Or maybe I'm going to say, I'm going to look at some new survey software because I really want to improve the look of my mobile-first designs, right? We want to make sure that people who are taking our surveys on their mobile phones are having a really awesome experience. Well, my question might be, how does that product's responsive design look on my phone? I want to use it, and then I want to take that survey as a participant so I can see what it literally looks like on my phone, so that I can experience it for myself. Does this platform's responsive design really feel responsive? And it probably is. But different tools use different scales, and they format their scales differently, and you might find that one actually looks better on your phone than another. So it is an important part of the evaluation. Or if I am using a tool that's promising to output certain types of visualizations, whether it's from structured data or unstructured data, my question might be, are these visualizations client-ready? If I have a tool that's outputting data visualizations, again, whether from structured data or unstructured data, and it's really awesome, and I don't have to do any or very much work to make them client-ready, that is fantastic. 
But if it's promising that it's going to output these beautiful data visualizations, and the brutal reality is that I'm going to end up having to spend 30 minutes making each data visualization client-ready, I don't know. That tool might not really meet my criteria for success. So think about your own questions, whether you're looking at a survey platform or social listening or text analytics or anything else that you might be doing in market research. What are the questions that, if you got a yes answer, would make you say: if it does this, and if I have this experience, then it's worth taking the time to learn it? Because that is a significant investment. Also, when you are doing a trial, and even if you're doing a pilot, I always try to do two products at the same time. I find that if I only trial one product, it's easy for me to fall in love with that product because I'm looking at it in a vacuum. It keeps me a little bit more honest with myself if I'm actually trying two products at the same time. So if I'm looking at text analytics tools, I want to trial two text analytics tools at the same time so I can compare and contrast. It helps keep me honest, because I tend to be somebody who falls in love with shiny new things, and sometimes I like change for the sake of change, but if I'm looking at two products, it will really help keep me honest. And by the way, another important tip is to delegate. If you do a good job creating that document, and again, your documentation of your plan might literally be a one-page Word document, it doesn't have to be fancy, but once you've identified what features need to be tested and what your burning questions are, delegate it. The actual mechanics of it are something that a junior team member can do, or maybe even an intern who's had some training. 
So don't feel like you have to personally do every step of a pilot. It's totally okay to delegate some of it so that it actually happens, because I know you folks, and you're always busy, you're always on deadline, and it's easy to let the trial or the pilot slip away or get stuck on hold indefinitely. But put together the plan and delegate the day-to-day work. A junior team member or an intern will be happy for the opportunity. And by the way, if you want an example of a pilot test, I'll put a link in the show notes. Some of you have already seen it, but I recently did a case study of a pilot test of a mobile ethnography platform. I'll put a link to that in the show notes so that you can see my case study from doing an actual pilot. The other thing I created is for those of you who do decide that, yes, you're going to take the time. Maybe you're not going to be a Unilever and trial 175 products over the next two years, but maybe you'll try a couple. I wanted to show you that there are actually a lot of companies in the market research space that offer free trials, and a free trial certainly takes away one of the blocks to actually taking the time, right? Some of the companies are ones you've definitely heard of; some might be newer to you. Qualtrics offers free trials, QuestionPro, MarketSite, Lexalytics, a bunch of them. So also in the show notes for today, I created a little PDF. Those of you who are watching this on YouTube can see a screenshot of it. It's just a little PDF of 13 tools that do offer free trials, and I'll have it linked in the show notes. In the PDF, all of the companies are hyperlinked, so you can just click through to each company. And I tried to give you a mix of survey platforms, platforms related to qualitative data collection, text analytics, social media listening, and I even put in one that offers secondary research. 
It's actually one of my absolute favorite tools. So check that out in the show notes. I hope this conversation was useful for you. I know taking the time to trial or even pilot new software in our space can be hard. But there are so many cool new products out there, and I know it can be a little bit overwhelming, and we all start to feel like it's just a lot of hype, right? You see so many announcements for new products, or you see all these promotions from trade shows with all these sponsors, and a lot of them are companies you may not have even heard of before. And it's like, is it hype or is it real? There really is a lot of cool stuff out there these days. So if you've been using the same software platforms for a really long time, it's time to consider at least looking at your options, if not to actually make a switch, then just to get a sanity check: are the platforms you're currently using keeping up? Are they preventing you from doing some things more easily, faster, or better than you could be? So that's my call to action today. If you haven't tried any new software in a while, if nothing else, check out the free trials. You may very well find something of interest, create a simple one-page trial or pilot plan for it, and delegate it. On a slightly separate subject, I do want to share that I created a little article on the Research Rockstar website a few days ago, and I'll put a link to this as well. This is on a very different topic, to be honest, but it's about something that has come up a lot in this YouTube series and this iTunes series: what are the skills that we need in market research and insights to advance our careers, now that we live in a data-agnostic, data-fluent world, now that market research is one of several different sources of data that organizations use to inform marketing strategy and product strategy? 
We know that the C-suite of large organizations is really passionate these days about being customer-focused and data-driven. So what does all of that mean for me as a market research and insights professional? Well, I keep coming back to two recurring skill sets. So I wrote just a little one-pager, and I gave you four free links to resources from different sources that I thought had really good information. The first skill, frankly, has to do with quantitative research: those of us who do survey research are increasingly asked to append or blend survey data with data from other sources, what we might consider big data, because it might be data from customer transactions, customer behavior data, third-party data, et cetera. And so I'm seeing a lot of interest from people who do survey research in, okay, what do I need to know about blending my survey data with other data sources? And the second skill I'm getting a lot of interest in and a lot of questions about is how do I deliver results to the C-suite? Now that the C-suite is so passionate about being customer-centric and data-driven, a lot of us are really trying to make sure that every deliverable we put out there is absolutely ready. And what does that mean, and how do I connect with them? So, again, I'll put this in the show notes, too, which means I'm giving you free show note links this week. You'll get the link to my case study to show you an example of doing a pilot. I'm giving you a link to the 13 free trials. And I will also give you a link to this page that talks about, and provides free links for, two of the really important skill sets that I am seeing for today's market research and insights professionals. I think that's it for me today, folks. If you have any questions, please do add them in the comments and I will reply. 
And as always, whether you're watching this on YouTube or listening to it on iTunes, if you found any value here today, I'd really appreciate it if you would like and subscribe. It really helps me to get the word out. Thanks, everybody. Have a great day.

AI Insights
Summary
The speaker notes the explosion of new market research and insights software showcased at conferences and in promotions, which can feel overwhelming for busy practitioners. They encourage professionals to periodically evaluate new tools—citing Unilever’s claim of trialing 175 tools in two years—and explain the difference between a short software trial (often 7–14 days, usually in a non-live setting) and a pilot (running a small real project end-to-end). Key tips include documenting a simple one-page plan with desired features, key questions, and success criteria; testing usability and UI improvements (especially mobile/responsive design and client-ready visualizations); trialing two competing products in parallel to avoid bias; and delegating execution to junior staff or interns. The speaker also references resources: a case study of piloting a mobile ethnography platform, a PDF list of 13 market research tools offering free trials, and an article about career skills—blending survey data with other data sources and presenting effectively to the C-suite.
Title
How to Trial and Pilot New Market Research Tools Efficiently
Keywords
market research software
insights platforms
free trials
software evaluation
trial vs pilot
survey platforms
text analytics
social listening
user interface
mobile-first surveys
data visualization
pilot success criteria
tool comparison
delegation
Unilever 175 tools
data blending
C-suite communication
Key Takeaways
  • The market research software landscape is crowded; periodic evaluation prevents stagnation.
  • A trial is a short, often mock test (7–14 days) to validate claims, features, and UI.
  • A pilot is a small live project run end-to-end to test real workflows and outputs.
  • Define success criteria and key questions (e.g., fewer clicks, better mobile experience, client-ready charts).
  • Keep documentation lightweight—often a one-page plan with a feature checklist.
  • Run two tools in parallel to compare and avoid falling for a single ‘shiny’ product.
  • Delegate trial/pilot execution to junior team members or interns to ensure it happens.
  • Assess mobile responsiveness and the time needed to make outputs presentation-ready.
  • Use free trials to reduce barriers; many well-known vendors offer them.
  • Career skills to develop: blending survey data with other data sources and delivering insights to the C-suite.
Sentiments
Positive: The tone is encouraging and pragmatic, acknowledging overwhelm while emphasizing opportunity, actionable steps, and enthusiasm about improved tools and interfaces.