7 Key Lessons from 10,000 Hours of Media Buying: Insights for Marketers
Discover the top seven lessons learned from over 10,000 hours of media buying, including the importance of flexibility, testing, and community support.
What I've Learned From 10,000 Hours of Media Buying
Added on 09/29/2024

Speaker 1: What's up, marketers? They say that it takes 10,000 hours of doing something to become an expert at it. And I am now celebrating my approximately seventh or eighth year as a media buyer. And therefore, after doing something that could be called math, I've realized that I've definitely done over 10,000 hours of media buying and spent more than $100 million on paid social ads. So I wanted to sit down with you guys and do more of a sit-down style video where I discuss the seven things that I have learned after 10,000 hours of media buying. So let's go ahead and dive in. For those of you who don't know, part of my YouTube process is that I actually write down these little outlines to help me talk about this stuff in some sort of cohesive order. Ah, number one, yes. Such an important thing, and this was actually really hard for me to learn as a newbie media buyer: there is no one right way to do things. Now, I know that you guys are here watching my channel because you want to know how to do Facebook ads or TikTok ads or how to make good creative. But even my way isn't the only way to succeed. And I think something that I really got hung up on in the early days was doing it right. Should I stack my interests or not stack my interests? Should I do a 1%, a 3%, or a 10% lookalike? I got really hung up on these details that, at the end of the day, didn't really matter. Something that was really liberating for me after I became an expert in the space was realizing that there is more than one way to succeed with media buying. And it's really about developing your own perspective. I know at the end of the day, clients want to work with me because they want my perspective. I have their trust, and they want to see how I would do things. Yes, they want things to be done the right way. They want to get their ROI.
They want to hit the right ROAS, but they buy into the way that I'm going to do it, which could be different, and maybe another way could work better. Who knows? I feel like once a media buyer understands that my way is going to be different from another expert media buyer's, which is going to be different from yet another expert media buyer's, it lessens the pressure of always having to get it right. Number two is you have to be willing to change your mind. Now, something that I love about media buying, and it's kind of frustrating too, frankly, is that I often end up changing my mind about things, sometimes way sooner than I anticipated. Something I've recently changed my mind on was attribution windows on Facebook ads. I used to only use one-day click or seven-day click and never really gave thought to view-through attribution because, oh, the Facebook ads algorithm actually optimizes for view-throughs, and those aren't incremental, blah, blah, blah. And now it's like, well, times have changed. Post-iOS 14, the data that we're getting back from the platform is really limited, so I want the most data possible. And I even think that for brands at scale, there is something to be said for that view-through attribution that is really not quantifiable, which is sort of difficult to talk about. But I think that if you're gonna be an expert in this space, you have to have the self-awareness to know where your own limitations are and be malleable enough to have your mind changed when the data presents itself. Media buyers who are really stuck in their own ways and really stuck in their lane tend to struggle, especially when it comes to testing these things. Oh, God, I have to be on a call in seven minutes. Can I do the rest of this? Number three, on that note, is to keep a testing mindset. And frankly, you should also be encouraging your clients and your colleagues to have that same mindset.
I find that a lot of times clients or people that I work with will be like, oh, we can't do that because of X, Y, Z. And I'm like, have you tested it? Or is it just a brand ideal? So something that I really try to do in my practice is anchor myself with the data and encourage people to go towards performance, but also test things that are out of the box, test things that make them a little uncomfortable. This is especially true when it comes to creative. The truth is that a lot of times what is going to perform tends to sit outside of your comfort zone when it comes to brand-type advertising. There are lots of times, especially on my consulting calls where I'm getting on calls with brands as opposed to agencies, where a lot of what we're talking about is that dichotomy between brand marketing and performance marketing and how they can both be one. And to be honest, it's really hard to get both sides to see the value in each other, which is why I always double back on: test it, especially when it comes to performance advertising. The reality is, no one's gonna remember an ad that you posted once or that you posted a few weeks ago. Number four is you need to have a why for your test. One of the biggest things that I ask the creative strategists and the creative directors at Thesis is, why did you do this test? One of the most common things that I hear, not only from my team but from other teams, is, oh, we just need to test a whole bunch of stuff and figure out what works. And yeah, on one hand we should, but you need to have a why for every single test that you're conducting, because that's gonna tell me what you're trying to learn. You should be trying to learn something from every single test. Your whys can be rooted in data and past performance. Your whys could be rooted in industry or even platform trends.
So if you see a new type of ad that's running across several different competitors in your industry, hey, maybe it's time to try something like that. But you really have to be aware of shiny object syndrome, especially when it comes to creative testing, because just borrowing what all of your competitors are doing in Ads Manager or TikTok Top Ads, or even what other people are saying on Twitter, all that combined is not going to make a strategy. You really need to drive your own strategy, be the anchor of it, be really, really selective about what you are going to test, and have your why for your testing. Ah, number five, this is probably my favorite one. Having a community as a media buyer is your cheat code. They're gonna make you better. They're going to share their results. They're gonna talk about what they're testing. And they're also gonna challenge your bullshit ideas. Going back to being willing to change your mind on certain things: because of how rapidly these platforms have changed over the years, it's been invaluable to me to be able to swap notes with other media buyers and say, hey, I'm seeing this on my accounts, what are you seeing? That's why Twitter has been really, really valuable for me. I also, in the early days, used to spend a lot of time on Reddit and some of the communities there. No matter where you find your community, I encourage you to double down on it and find like-minded people, like-minded media buyers, so that you can swap stories, swap tests, and ultimately become better as a result. God, I have two minutes. Number six is to call out bullshit when you see it. I'm not saying you need to be policing people, but more or less, when you are in those client calls or those internal calls and you hear something that doesn't seem to be in line with actual data, or you feel like it's rooted in a place of feelings or "I think this," just go ahead and call that out.
I was on a call with a client today, actually, and we were discussing TikTok because they want to get their brands on TikTok. And they were saying that this other agency that was trying to work with them on the platform had said something like, oh, the TikTok audiences are so much more refined and so much different, and blah, blah, blah. And I literally just stopped her and was like, I don't know how they could know the audiences are better. I guess we could split test them, and that could give you some sort of barometer. Is that what they meant? And we both just sort of sat there and laughed, because it was kind of a light bulb moment where we realized, oh, I think this person was just getting sold to, and it just felt like bullshit. You can really kindly call out bullshit in our industry. Just ask why if someone says that they want to do something, because if you don't understand, saying that you don't understand and asking for more information is a completely valid response. To be honest, in our industry, people push a lot of bullshit, because data is hard, because this work is hard, and it's really hard to be creative with ad strategies all the time. I am late, so I'm gonna move quickly through this one. Number seven is to be honest. This kind of goes back to calling out bullshit when it matters. But if you don't know something, don't tell a client that you know it. If you can't get a 2X ROAS on a certain campaign or for a certain client, don't say that you can. I have never, ever promised specific results to someone, and I am very careful about the type of language that I use with clients, especially potential clients, when it comes to the type of ROI they can expect on Facebook ads. I can tell them, hey, this is the type of results that we're seeing across that specific industry or across that specific platform.
But I am very, very careful not to promise anything outlandish when it comes to results, because to be honest, there is no sure thing in our industry. So I think that's something my clients really like about working with me: I'm always gonna do the best by them and always try to put the strategy that I really think is gonna move the needle in front of them. But I'm not out here promising results to anyone, and I am never, ever going to tell a client, hey, yeah, you can expect a 4X ROAS on this, this, and this. Because a lot of the time, that stuff is out of my hands. Maybe their LP goes down, maybe their website goes down, God forbid, maybe their checkout goes down. That just recently happened. And I don't have any control over that. What I can control is the creative strategy. I can control the type of creators that we're working with. I can control the type of targeting, which is now mostly broad anyways, that we are doing on Facebook, Instagram, and TikTok. But aside from that, you know what? This whole brand is not entirely my domain. I'm really just playing a small part in it, and I can make sure that that part is buttoned up. But that does not mean I'm going to be guaranteeing you a certain amount of results. End rant. That is it. I am officially two minutes late to my call, but I think that was an awesome episode. Let me know what you think. I will see you guys next week, bye.
