Episode 438 | Content Strategy | Customer Success | Customer Experience

How to tie content to revenue, retention, and real customer outcomes with Justin Chappell

Justin Chappell, Head of Digital Strategy, CX and Operations, argues that most content programs are stuck measuring vanity metrics — opens, clicks, awareness, and engagement — when they should be tied to financial outcomes like gross revenue retention, net revenue retention, renewal rates, and time to value. He outlines the three places content programs typically break down: content gets created in isolation without cross-functional ownership, it gets delivered on rigid time-bound drip campaigns instead of predictive signals, and success is measured too early on easy numbers instead of 30/60/90-day behavior. Justin contrasts the 'peanut butter' health-score approach — where one red/yellow/green model gets spread across every customer — with predictive engagement models that use product telemetry, psychographic, and firmographic data to meet each customer individually. He introduces a long form / short form / micro-learning framework that turns content into a roadmap every department can contribute to, with program managers, tech writers, and marketing each owning a role. He closes with a case for self-service as a friction-removal strategy, arguing the next evolution is AI-enabled systems that anticipate what a customer needs before they ask — so content finds them, not the other way around.

Justin Chappell

Head of Digital Strategy, CX and Operations

20 min

Key Takeaways

  1. Stop measuring content success with consumption metrics like views, opens, and clicks — tie every piece to a financial outcome like gross revenue retention, net revenue retention, renewal rate, or time to value so you can prove the content moved the business, not just the dashboard.
  2. Content programs break down in three predictable places: content gets built in isolation with no cross-functional owner, it gets delivered on rigid time-based drip schedules instead of behavioral signals, and success is measured far too early, before any real outcome can show up.
  3. Replace the 'peanut butter' health-score approach — one red/yellow/green model spread across every customer — with a predictive engagement model that compares an individual customer's behavior to historical look-alike data so the system tells you what to do rather than the other way around.
  4. Build every piece of content in three forms: a long form version for the technical deep-divers who want to print it out and highlight it, a short form 'too long, didn't read' version that earns the click, and a micro-learning video for the TikTok-trained audience that wants two minutes and done.
  5. Use a center-of-excellence model to break silos: program or product managers own the technical accuracy, tech writers translate it into something consumable, and marketing gives it the brand polish — with the digital strategy lead owning the end-to-end roadmap and content gap analysis.

About this episode

Awareness, opens, and clicks are vanity metrics, and most marketing teams are still measuring content as if they aren't. In this episode of Content Amplified, Justin Chappell, Head of Digital Strategy, CX and Operations, breaks down how to connect content to the numbers that actually matter: gross revenue retention, net revenue retention, renewal rates, and time to value. Justin walks through the three places content programs typically break down, why a 'peanut butter' health-score approach fails customers, and how predictive engagement models beat old-school drip campaigns. He shares his long form / short form / micro-learning framework for building a content roadmap every team can contribute to, explains why you have to stop measuring success at the open and start measuring it at 30, 60, and 90 days, and makes the case that self-service content is really about removing friction, not removing humans. If your content program is stuck proving awareness instead of proving value, this conversation gives you a clear path forward.

Topics covered

  • Tying content to financial outcomes, not vanity metrics
  • Predictive engagement vs. time-based drip campaigns
  • The long form / short form / micro-learning framework
  • Building a content roadmap across silos
  • Self-service content and friction removal

Notable quotes

We need to take it one step further than that and tie that back into a financial metric. To me, it's always about gross revenue retention, net revenue retention, renewal rates, faster time to value. Those are hard numbers that we can report against.

Justin Chappell (3:40)

The right message at the right time to the right person. We measure our successes way, way too early.

Justin Chappell (7:10)

I'm asking the system to tell me what to do; I'm not telling the system what to do. To me, that's the difference between a proactive engagement model and a predictive engagement model.

Justin Chappell (10:15)

Customers don't want self-service content just because they don't want your help. They want it because they don't want any friction when they get to that information.

Justin Chappell (16:03)

Resources mentioned

  • Framework

    The Three Breakdowns in Content Programs

    Audit your program against the three places content typically fails. First, is content being created in isolation by marketing without CX, sales, or product owning the end-to-end value workflow? Second, is delivery time-bound (180 days to renewal) rather than signal-bound (a product-telemetry event says the customer is ready)? Third, are you measuring success on day-one metrics (views, clicks, opens) or on 30/60/90-day behavior (adoption, deployment lift, renewal, CSQL creation)? Fix the ownership gap, move to predictive delivery, and push your measurement window out to where real outcomes actually show up.

  • Framework

    Three-Format Content Roadmap (Long / Short / Micro)

    For every piece of content — knowledge-base article, best practice, white paper — produce three formats. Long form is the technical document for deep readers who want to print and highlight. Short form is the TL;DR teaser that drives a click to the community or knowledge platform. Micro-learning is a two-minute video sized for customers who've been trained by TikTok and YouTube to skim. Map your current library against this 3x grid, identify the gaps, and then distribute authoring: program or product managers for technical depth, tech writers to water it down, marketing to add brand polish. Now you have a plan, not a plea for collaboration.

  • Framework

    Predictive Engagement Over Peanut-Butter Health Scores

    Move beyond blanket red/yellow/green health scores that apply the same company-defined thresholds to every customer. Instead, feed product usage, product telemetry, psychographic, and firmographic data into a predictive model and compare each customer to historical look-alikes. Ask the system what action to take next rather than telling it what to look for. The payoff is threefold: you can personalize by behavior, meet customers in their preferred format (video vs. text), and time delivery to the exact moment a customer finishes a milestone and needs the next piece. Same content, different delivery — and it stops feeling like spam.
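The gap analysis at the heart of the three-format framework above ("map your current library against this 3x grid, identify the gaps") can be sketched in a few lines. This is an illustrative sketch only; the topic names and the exact format labels are invented for the example, not taken from the episode:

```python
# Sketch of the long/short/micro content gap analysis.
# FORMATS and the sample library are hypothetical examples.

FORMATS = ("long_form", "short_form", "micro_learning")

def content_gaps(library):
    """Map a content library onto the 3x grid and list missing formats.

    `library` maps each topic to the set of formats that already exist.
    Returns {topic: ordered list of formats still to produce}.
    """
    gaps = {}
    for topic, have in library.items():
        missing = [f for f in FORMATS if f not in have]
        if missing:
            gaps[topic] = missing
    return gaps

library = {
    "data-retention best practices": {"long_form"},
    "onboarding checklist": {"long_form", "short_form"},
    "consent API walkthrough": {"long_form", "short_form", "micro_learning"},
}

print(content_gaps(library))
# {'data-retention best practices': ['short_form', 'micro_learning'],
#  'onboarding checklist': ['micro_learning']}
```

The output doubles as the roadmap Justin describes: each gap row is an assignment you can hand to a program manager, tech writer, or marketer.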

Justin Chappell (00:02)
I got to put the framework in place. And now I have that framework, long form, short form, micro-learning. That now I can go out to those different departments and those different groups and say, okay, great. Here's my roadmap of content we have today. Here's the gap of content we need for the future. And here's how you can contribute to that.

Benjamin Ard (00:44)
Welcome back to another episode of Content Amplified. Today I'm joined by Justin. Justin, welcome to the show.

Justin Chappell (00:49)
Thank you so much for having me today, Benjamin.

Benjamin Ard (00:51)
Yeah, Justin, I'm excited. This is a conversation that I think every marketer is having internally with their teams, figuring out really how they can expand their content efforts and tie it to revenue and everything involved with that. But before we dive into the full conversation, Justin, let's get to know you. Let's get to know your background, all that fun stuff so the audience knows who you are.

Justin Chappell (01:14)
Yeah, sure. So first, I just want to say thank you for letting me be part of this and thank you for putting this together. I think we've talked about this in the past a little bit, me and you. I'm big into public speaking. I do a lot of stuff here in Atlanta for our customer success network. We do a lot of in-person events, but not everybody has that in their cities or within their own systems or ecosystems that they belong to. So I think it's important for people like you to put together these platforms to allow for us to get that messaging out. Not just to solve problems, but to get people to think differently about the problems that are in front of them. So I just want to say thank you for that first. As you heard, I'm from Atlanta, Georgia, originally from Buffalo, New York, Go Bills, and most recently been down here for about the last 15 years.

And my role today, I work at a company called OneTrust, and I run our digital strategy for our CX and operations departments. So think about more of the post-sales marketing, post-sales engagements, and building out our content to be able to touch the customer at the right moments within that customer life cycle. So I have a little bit of background in marketing. I used to work at a company called 6sense, really learned a lot about predictive modeling and how to really get in front of customers, drive intent, expand reach, all of the things that you would normally get from a marketing perspective. And then now I've brought that into the post-sales component of my role here at OneTrust.

Benjamin Ard (02:35)
I love it. I love it. So Justin, you have such a good background, and you have spent a career figuring out the right moments to use content and the right ways to have it actually drive solid outcomes. So when we talk about content having outcomes, what outcomes do teams think they're driving, but usually aren't? What are those outcomes that people think content is producing, but maybe isn't actually doing as much as they think?

Justin Chappell (03:03)
Yeah, I think it's because we think about the way that we're defining outcomes as more of consumption, right? So when I think about the outcomes that I'm trying to drive with the content that I have, it's usually trying to tie it back to some type of experience for that customer that I can tie back to a financial metric: faster time to onboard, adoption metrics, a higher retention rate, some type of expansion or cross-sell opportunity that was generated from it. Where I think most marketers, most digital creators are always thinking more around, am I driving awareness? That's great, we need to do that. We need to hook and bring the customer in. We gotta create that FOMO.

We wanna drive awareness, we wanna get engagement, we want them to interact with that content we've created, and then we wanna educate them. But none of that is tied to value. Those are all vanity, right? It's all vanity metrics to say, look, people are looking at what I've created and I'm driving awareness. I think what we actually need to do is take it one step further than that and tie that back into a financial metric. To me, it's always about gross revenue retention, net revenue retention, renewal rates, faster time to value. Those are hard numbers that we can report against. And we could show that a customer that's touched this piece of information, this content, had this type of outcome. That means my content is now valuable, not just to myself, but for the rest of the organization as well.

Benjamin Ard (04:26)
I love that. And I love how you're tying it to the financial metrics, the things that really matter, the things that are going to get you more budget, the things that are going to validate the content's moving the needle, things of that nature. But often there's a breakdown in between. The reason that I think marketers so often focus on "I produced this content, here's the open rate, here's the awareness," all of that fun stuff, is it's easy to track. How do you bridge this massive gap between content creation and revenue impact? You mentioned some really powerful metrics. How do you get to those with confidence and feel like you're actually driving results?

Justin Chappell (05:01)
That's a great question. I think this is probably the one where I try to keep things simple, so hopefully I don't overcomplicate this. I do try to keep things simple when I explain it, but I talk in threes a lot, and when you ask me that question, there's three things that I would be thinking about. And the first is, where does it break down?

It starts with content being created in isolation, right? For me, marketing owns the creation of the content. Then you have CX, who owns outcomes. Sales owns the revenue number associated. But when you think about it the way I just explained it, nobody really owns the end-to-end value workflow of that content that's being used at all of those different touches within that customer's journey or that customer's life cycle. So I think that's where it starts to break down, right off the bat: content getting created in isolation and not being shared more universally within the different users or business units that actually leverage the content that's being created. So that's the first place I gravitate towards.

The second one is, are we delivering the content to the customer at the moment that they need it? Too many times I've seen, and I'm gonna use some marketing terminology because that's who our audience is, we create these drip campaigns. And to me, a drip campaign can be great, right? We can still drive awareness, education, everything we just talked about. But if it's not resonating with that customer right then and there when they needed it, it just becomes shelfware. It just becomes content that gets put into a folder somewhere, "I'm going to review that later when it matters." And then what happens? I never review it, right? As a customer, I just don't go back and look at it. So there's this idea that we don't deliver our content in what I would consider a predictive, personalized fashion. We're still focused on this drip-campaign style, which to me is time-bound activity. Say, at 180 days to renewal, we're going to send them some information around value that we think they should be seeing. Versus: these indicators within the product telemetry are telling me this customer is seeing value, so I should now send them this value piece of content that talks about feature functionality that they've been leveraging throughout their contract with us. To me, it's the same piece of content. It's just delivered in a different fashion. And it's looking at where that customer is within the journey to then deliver it to them. Right: the right message at the right time to the right person.

And then the final thing that I think about when I think about the breakdown is, we measure our successes way, way too early. So like you were saying before, it's views, downloads, clicks, opens, right? Those are things that we measure almost immediately when our campaign starts, versus looking at that customer after the content. What was the 30-day, 60-day, 90-day behavior of that customer? Did they adopt? Did we see their deployment metrics go up? Did we see the retention of that customer — they renewed? Did we see an expansion signal created — they now have a CSQL that was created based off of that content? Those indicators are a little bit later down in the journey, but we always gravitate to the first ones because those are the easier ones to actually measure, and we can get instant gratification of how great my program is. So those are the three areas I always go back to: building that content in isolation, sending it to the customer at the right time with the right message, and making sure we're measuring the right outcomes of what that content's supposed to be doing and not falling into the trap of measuring the easy stuff, which is usually too early to measure anyways.

Benjamin Ard (08:41)
Okay, I love it. So I want to break down these three different areas, because each one of them is fascinating to me and I want to double-click on those. But I'm going to start right in the middle. Step number two, the right timing for the individuals.

And I love that you talk about drip campaigns. Again, it kind of comes back to the same thing when it comes to measurement. We do opens and clicks and all that kind of fun stuff because it's easy. It's right at the fingertips. Same thing with drip campaigns. Often we create these drip campaigns and we're using some weird metric like 180 days after they become a customer, or who knows what. How are you getting the right signals to know when to engage? You talk about product telemetry, things like that. Are you working really closely with your product teams to be able to get those signals? Are there any other sources for signals that you typically use?

Justin Chappell (09:30)
Yeah, so this may be a little bit more customer success, customer experience focused, but I think it's an important thing to think about. So if you're a marketer, go talk to your CS ops team about this sort of stuff. Traditionally we use health scores a lot of times, right? We've got these health scores, my customers are red, yellow, green, and these are the parameters that we've defined as a company that we think are important to make sure that that customer is red, yellow, or green, and we're going to set the benchmarks and the thresholds. That's a very peanut-butter-style approach: just spreading across your entire customer base what we as a company think is important for that specific customer. What I always try to do is take a step back, and especially in today's world with AI and predictive modeling really at the forefront of what I do, I look at predictive models to say: look at every single piece of information for Benjamin and compare that to historical information for people that looked like Benjamin, and then tell me where and what I'm missing and what I should be looking at to drive the interaction. I'm asking the system to tell me what to do.

I'm not telling the system what to do. To me, that's the difference between a proactive engagement model versus a predictive engagement model. And what I try to drive is this more predictive engagement model, because now I can actually do a couple things. I can personalize that experience, because I know all of the activities that you've actually been doing within our product. I can then meet you where you're at, because I can identify that when you do receive content from me, you like to watch videos more often than not, so I'm gonna give you video content versus text-form content. And then I'm actually gonna be able to be more predictive, to know this is the moment you need it, because you've just completed this transaction and the next step is this. And that's when we get involved, right? Whereas if it was a drip campaign and we said, "show me all customers that are yellow," do they all need the same piece of content? No, we're basically just mass-mailing out a bunch of stuff. And that's why people think we spam them, right? From a marketing perspective: "that's spam," or "I'm not going to read it," and we get put into the auto-delete folder sometimes. And I think that's because we've over-indexed on this idea that we're gonna use metrics that we've defined, versus: let's use all of the different touch data points that we have available, like product usage and product telemetry. We use a lot of psychographic and firmographic information from our customers, pulling that into our predictive models so that we can really start to get a very good view of that individual customer versus all customers.

Benjamin Ard (12:10)
I love it. That's amazing. Now, double-clicking on the first point, when you talk about creating content in silos: how do you break those down? How do you get everyone involved? How do you get the data to know what content you're creating? All of that fun stuff.

What do you recommend for content marketers who need to take a step back and get other people involved when it comes to content creation?

Justin Chappell (12:31)
Yeah, it's a great question. I think this is probably the reason why I said it was the first one I would go to, because I think it's probably the more difficult out of these. Technology can fix some of the other stuff we talked about. Discipline and rigor can really take care of the number three one that I talked about. But this one really takes the effort to go out there and create the platform or the forum to build the roadmap of: here is the content that we need to have available on the ready. So the way that I talk to my team: all the content that we have, whether it's a knowledge-base article, a best practice, a white paper, whatever the content type is, I want three different forms of that for every piece of content. And you're going to be like, "well, that's a lot, Justin." But think about it. I need a long form, which is the technical documentation, the manual that some users want to read, right? We deal with lawyers here at OneTrust a lot of times. They want to be able to print it out, take their highlighters, and go through it the same way they would with research. So we need the long form. That's traditionally what we normally create a lot of. But then we need to create some kind of short form, right? The "too long, didn't read." Get to the point. Give me the information I need so that I can go and do my job and not have to read the entire piece of content that was created. So that's a lot more of the push-pull effect. Here's a blurb, it's a teaser, click the link, go to our community platform, get the full documentation, and then be on your way.

And then the last one, which I hinted at a little bit earlier: in this day and age, I don't know many people that don't watch videos. We have been basically programmed, right? Whether it's TikTok, YouTube, Instagram, we want those quick two-minute videos so we can quickly rattle through. And I think that's the other piece of content that's missing a lot of times: we don't spend enough time on that micro-style learning or content creation that is really fast and consumable for our customers to digest, which is that video form. So to me, that starts with me having to say, I got to put the framework in place. And now that I have that framework, long form, short form, micro-learning, I can go out to those different departments and those different groups and say, okay, great. Here's my roadmap of content we have today. Here's the gap of content we need for the future. And here's how you can contribute to that. And then I go to each one of them, almost like a center of excellence, and get each person involved in helping curate and create that content, because I need my program managers or product managers to help me with very technical documentation. I need my tech writers to help water that down to make it a little bit more consumable. And then I need my marketing managers to really put that good spin on it to make sure it lives up to the brand and the promises that we make when we're actually selling. Each person plays a role in it, but it all starts with saying, okay, each of the different content types has those three different forms that we need to have it in. And now it's a plan and it's a strategy, versus me coming out and just saying, "hey guys, we've got to work better together."

Benjamin Ard (15:31)
I love that. Okay, that's a masterclass in and of itself. I love that right there. Justin, we're almost out of time. I want to talk about one other thing that we were emailing about before we got into the podcast.

And it's the role of self-service content: the ability for users to not rely on a company using signals and all those kinds of things, but to go and find the content they need in the moment they need it. What role does this play? What have you done in this space? What are your recommendations when it comes to self-serve content?

Justin Chappell (16:03)
Yeah, I think the first thing I would say is customers don't want self-service content just because they don't want your help. They want it because they don't want any friction when they get to that information. So that's where self-service to me is really a key driver. Buyers want control over the pace of how they consume it. They want to have confidence in the decisions that they're able to make based off of what's being put in front of them. And they want to make sure that they're on the right path. So when I think about those three things, I'm thinking about a self-service component that allows for that customer to do it. With the advancements in AI and co-pilots that are out there today, and making sure that we have a really robust library of content available, we can allow for that customer to self-serve, right? And then the big problem that I've seen, and I'll share one of the barriers that I'm trying to overcome this year: for the last 30 years, Google has taught us in a search to ask a question, and it'll give us whatever, and we have to continue to search for it. Whereas co-pilots and these AI search capabilities are really, "give me something very specific you're looking for, and I'm going to answer that," right? So it's: how do we teach customers to use prompting? How do we teach customers to put guardrails on their search criteria?

Because if we can get that piece done correctly, then it's really easy for our customers to get the content they need at their fingertips whenever they need it, right? And now they're controlling the pace of consumption. They're having confidence in the decisions that they're gonna make based off of the content they were able to review. And it proves to them that they are on the right path, because for the questions they ask, they're getting the answers they're looking for. So I think that's how I look at self-service really being there. The next evolution of that, though, I think is really: how do we get self-service to anticipate what those customers are really looking for? I've been doing a lot of stuff recently where I'm looking at, when a customer comes from within our community platform or from within our website, what is the domain address that they came from? That should be a clear indicator that, hey, they were on this page. The relevant information they're looking for has to be in this content type, because that's the only thing that would make sense if that's the page they were coming from. So I think we have to get better at using the data to drive the behaviors for our customers, because it's there at our fingertips. We just need to be able to consume it and then action against it. So I think that's the next evolution of this: not just letting the customers go and be able to search and get the right information, but almost anticipating what it is they're looking for and surfacing it before they even ask for it.

Benjamin Ard (18:45)
I love it. That's amazing. Justin, this has been incredible. I really love your insights and all that information today. For anyone listening who wants to reach out and connect with you online, how and where can they find you?

Justin Chappell (18:56)
Yeah, I'm big on LinkedIn. Come find me on LinkedIn.

It's Justin, last name Chappell, C-H-A-P-P-E-L-L. Or if you just type in Justin Chappell in the search, usually I pop up. But I love talking about this stuff. I really wanted to say thank you again for the opportunity to meet with your followers and share a little bit about myself and the insights that I have. And I'd love to come back and talk about this again or talk about some other topics, because I really love this stuff and I'm super passionate about it. And, you know, again, if anybody ever wants to hit me up and talk about it, just ping me on LinkedIn. I'm always available.

Benjamin Ard (19:29)
I love it. Anyone who is interested in connecting with Justin, scroll down to the show notes, regardless of what platform you're listening or watching on. We'll have Justin's LinkedIn profile right there. Go ahead and click on it. Tell Justin you came from the podcast. We'd love for you to connect with him. Justin, again, thank you for the time and insights today. I really do appreciate it.

Justin Chappell (19:48)
Thank you so much. Cheers. Have a good day.

About the guest

Justin Chappell

Head of Digital Strategy, CX and Operations

Justin Chappell is Head of Digital Strategy, CX and Operations at a large enterprise software company, where he leads post-sales marketing and content strategy across the customer lifecycle. Based in Atlanta, Justin brings a marketing background rooted in predictive modeling, intent data, and reach expansion, and has carried those disciplines into the post-sales world to shape how content drives adoption, retention, and expansion. He is an active voice in the Atlanta customer success community and a frequent in-person speaker. Justin believes the best content programs are built like systems: one roadmap, three formats, and outcomes measured against financial metrics, not vanity ones.

Connect on LinkedIn

Continue Exploring

The Content Alignment Playbook

A practical framework for keeping marketing, sales, and customer-facing teams on the same story.

Open the playbook

Get new episodes in your inbox

Join listeners who get episode summaries, key takeaways, and content strategy insights every week.

Frequently Asked Questions

Why are views, opens, and clicks considered vanity metrics?

Justin argues that views, opens, clicks, and engagement tell you a customer saw something — they don't tell you whether it changed a business outcome. They're easy to measure and offer fast gratification, but they're disconnected from value. The better metrics are financial: gross revenue retention, net revenue retention, renewal rates, faster time to value, expansion signals, and CSQL creation. If a piece of content can be tied to one of those, it has real value across the organization, not just marketing's dashboard.

How do you move from drip campaigns to predictive content delivery?

Start by replacing time-bound triggers (180 days to renewal) with signal-bound triggers drawn from product telemetry, usage data, and customer attributes. Partner with your CS ops team to understand health-score mechanics, then push past one-size-fits-all thresholds. Feed individual behavior into a predictive model that compares each customer to historical look-alikes and surfaces the next best action. The same piece of content delivered at the right moment in a customer's journey hits very differently than the same piece of content sent on a calendar cadence — that's the shift.
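The look-alike comparison described above can be sketched as a simple nearest-neighbor lookup over customer behavior vectors. This is a hypothetical illustration, not anyone's production model; the behavior features, history records, and content labels are all invented for the example:

```python
# Hypothetical look-alike sketch: compare a customer's behavior vector to
# historical customers and borrow the content that worked for the closest match.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

# Historical look-alikes: (behavior vector, content that drove the outcome).
# Vector dims (invented): [logins/week, features adopted, milestone done 0/1].
history = [
    ([9.0, 4.0, 1.0], "advanced-feature micro-learning video"),
    ([2.0, 1.0, 0.0], "getting-started short-form guide"),
]

def next_best_content(customer):
    """Return the content associated with the most similar historical record."""
    best = max(history, key=lambda rec: cosine(customer, rec[0]))
    return best[1]

print(next_best_content([8.0, 3.0, 1.0]))  # resembles the high-adoption record
```

A real predictive model would use many more signals (psychographic, firmographic, telemetry) and a trained model rather than raw cosine similarity, but the shape of the idea is the same: the system, not a company-defined threshold, tells you the next action.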

What is the long form / short form / micro-learning framework?

It's a rule that every piece of content — knowledge-base articles, best practices, white papers — should exist in three formats. Long form serves technical or detail-oriented readers who want to print and highlight. Short form is the TL;DR teaser that earns a click into a fuller resource. Micro-learning is a two-minute video for customers conditioned by TikTok, YouTube, and Instagram to skim. Mapping your current library against this 3x grid exposes the content gaps and gives you a roadmap every team — program managers, tech writers, marketers — can contribute to.

What role does self-service content actually play?

Self-service isn't about customers avoiding humans — it's about removing friction so buyers and customers can move at their own pace, gain confidence in decisions, and confirm they're on the right path. With AI and co-pilot capabilities, a well-built content library can surface the right answer on demand, provided customers are taught to prompt with specificity rather than browse like it's 2005 Google. The next evolution is anticipatory self-service: using signals like referring URLs, account attributes, and product behavior to surface the content a customer needs before they even search for it.

EP 443 | 19 min

Why more AI content is not the same as more pipeline with Amanda Landsaw

with Amanda Landsaw

Speed is not strategy. In this episode of Content Amplified, Amanda Landsaw, CMO at Endeavor B2B (a marketing, media, and intelligence organization with 90+ brands across 16 verticals), explains why pumping out more AI-generated content does not translate into relevance, differentiation, or trust. Amanda argues that 'crap input equals crap output' and walks through what it actually takes to use AI well: developing an almost intimate relationship with the model, layering prompts to peel back the onion, and treating point of view as the one thing AI cannot replicate. She also covers how buyer behavior is shifting as people use ChatGPT and Claude as their new search, why personalization is really just relevancy in disguise, and how to train AI on your own talks, papers, and podcasts so it can ghostwrite in your voice without losing the human review layer. If you are wrestling with how to scale content without drowning in noise, this conversation gives you a sharper way to think about the work.

April 30, 2026
EP 445 | 17 min

How to run social media like a real-time testing ground with Austin Price

with Austin Price

Most marketers still treat social media like a megaphone. Austin Price treats it like a nervous system. In this episode of Content Amplified, Austin, Director of Social Media at H&L Agency in Oakland, walks through how he runs creative as a hypothesis and lets data confirm or kill it before a campaign scales. He explains why engagement rate is his default metric (and how it gets gamed), the 24-hour read he uses to decide whether to pivot or lean in, and why a 100 million person reach against a 5 million person addressable market should embarrass everyone in the room. Austin also reframes the quality versus quantity debate as a consistency problem, points to Chad Powers and the Dr. Pepper jingle as proof that social is now the testing ground for every other channel, and makes the case that the comment section is the context layer that data alone can never give you. If you want a practical model for running social as a portfolio of tests, this episode is for you.

May 5, 2026

