Stop Over-designing Your Landing Pages: CRO Secrets With Ryan Doney

According to Ryan Doney, Founder of Page Deck, landing-page CRO doesn't have to be a design marathon. In this episode Ryan lays out a "template → angle → offer" testing framework and explains how media buyers can mine Reddit for real-customer language. He also talks through the value of spinning up lightweight landers built in Page Deck so you can push winners into paid campaigns faster, without waiting on design tickets.

Key Takeaways:

  • Why your very first test should be a dead-simple “5-reasons-why” listicle built around your best-performing ad headline.

  • How adding a second offer page after a listicle—making the funnel longer—actually increases conversion and AOV.

  • Why raw Facebook-comment screenshots beat polished testimonial carousels.

  • How a stack of KPIs (spend velocity, click-to-add-to-cart, click-to-purchase, AOV) tells a fuller story than ROAS alone.

  • Why strict URL naming conventions keep you from “landing-page spaghetti” and what you can learn from them.

  • The post-purchase upsell that is grabbing a 20% take rate.


To learn more about the Scalability School Podcast or listen to other episodes, head to https://scalabilityschool.com

To connect with Ryan Doney send him a DM at https://x.com/theryandoney

To learn more about Page Deck visit them here: https://www.pagedeck.com/

To connect with Andrew Foxwell send an email to Andrew@foxwelldigital.com

To connect with Brad Ploch send him a DM at https://x.com/brad_ploch

To connect with Zach Stuck send him a DM at https://x.com/zachmstuck

Learn More about the Foxwell Founders Community at http://foxwelldigital.com/membership


Full Transcript

(01:15) Welcome to episode 7 of the Scalability School podcast. You know, the one thing I want to say right away for all of you that are listening is we only get more listens by you sharing this with your friends. Okay? So, we're a bunch of losers at home on our computers alone.

(01:44) So, can you please share this with your internet friends? Guys, any other comments on that? I just... I left my house for two days, I'll have you know. Just as a sidebar about not leaving the house. Yeah. No, but yes, please, please share it. Uh, so we'll mention that again somewhere in the episode so that you can't skip it as easily.

(02:00) But anyway, big episode here for us. Honestly, super excited. Episode 7 talking with Ryan Doney, who is the CEO and co-founder of Page Deck and honestly an all-around good human being. Ryan, welcome to the show. Thanks for having me, guys. The Midwest energy is palpable on this podcast. It feels nice and comfy. We can't invite non-Midwesterners. Yeah. Happy to be here.

(02:23) Yeah. So, we're going to talk about CRO and landing pages today. Tons of different things we can get into. Tons of experience that you bring obviously. So, let's go ahead and dive right in. So, well, actually, you know, first of all, Ryan, can you tell us like what Page Deck is? Yeah, Page Deck is the fastest way to build e-commerce landing pages.

(02:42) If you are not a designer, more of a media buyer, copywriter, it's the platform for you. You can spin new landers up super quickly. I love it. I love it. So, let's talk about types of tests right out of the gate. So, somebody comes to you, they say, "We want to get some stuff going and we want it to be great.

(03:00) " What are you starting with and why? It's an interesting question. I would probably push back a little bit. Like, is this testing from just paid traffic? Is it organic? Is it elsewhere? you know, can you give me like a little bit of a scenario what you're testing from? Yeah, I mean, I think a lot of people with landing pages are an interesting one, right? And CRO especially because everybody knows they need to do it.

(03:19) People are doing it a little here and there, but they're still trying to pull so many levers on the Meta end, which is good, and on the creative end, but then they forget about this. So, if I had a budget of $50,000 a month I'm spending on Meta, haven't really done too much landing page testing, where would I start? Let's say that I have a hat brand, for example.

(03:37) Where would I start? And, well, how could I start to think about this properly? Totally. I think it's helpful to break it down into this kind of rough framework that we've settled on over the last few years. And it is: find what type of page, what type of template you're going to test.

(03:54) What is the angle that you're going to write that page around? And then what's the offer? So, when you're testing a new landing page or you're even four, five, six iterations in, I think it's helpful to identify one of those things that you're going to test and build the entire test around that. So in your example, you're starting from complete scratch. Everything's going to be new.
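A rough way to picture Ryan's framework (this sketch is not from the episode; the class and field values are hypothetical) is to describe every lander variant as a template/angle/offer triple and to check that a new variant differs from the control on exactly one of the three:

```python
from dataclasses import dataclass, fields

@dataclass(frozen=True)
class LanderTest:
    """One landing-page variant described by the three levers Ryan names."""
    template: str  # e.g. "listicle", "shoppable", "advertorial"
    angle: str     # the headline / copy angle the page is written around
    offer: str     # e.g. "evergreen", "10% off", "buy more save more"

def changed_levers(control: LanderTest, variant: LanderTest) -> list[str]:
    """Return which of template / angle / offer differ between two variants."""
    return [f.name for f in fields(LanderTest)
            if getattr(control, f.name) != getattr(variant, f.name)]

control = LanderTest("listicle", "best blanket for hot sleepers", "evergreen")
variant = LanderTest("listicle", "best blanket for moms who backpack", "evergreen")

diff = changed_levers(control, variant)
assert len(diff) == 1, f"test changes {len(diff)} levers at once: {diff}"
print(f"Clean test: only the {diff[0]} changed")
```

If the assert fires, the test is confounded: you won't know which lever drove the result.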

(04:10) Of course, I do think a really good low-hanging-fruit way of dipping into landing pages for the first time is just to write a listicle. And I'm sure throughout this episode, we'll get into the nitty-gritty of this. But I think the really important thing is to not get super bogged down into the design minutiae trying to make like a single perfect page. You just need to get something up, get traffic through it, see if it works.

(04:28) So a listicle is nice because the layout is super simple. You're not spending weeks and weeks in Figma, anything like that. You have a really good headline that matches your best performing ad angle. You have three to five product benefits, value props, whatever you want to call them, that kind of expand on that angle, and you have a CTA.

(04:47) There are other things that you can add, of course, but at the end of the day, you want a banger headline, three to five reasons, and a CTA, and you can go from there. You said something really interesting there. So, I think you said it intentionally. I'm not sure everybody picked up on it, but you said start with your best performing ad headline.

(05:04) So in Andrew's scenario of spending $50k a month, if people are kind of coming from that perspective, do you think that they need to get to that point? And Zach, I'm sure you have some thoughts on this too. It's like you probably have one or two things that are like working decently well in the ad account.

(05:17) So like you're just saying, what, take the learning that you already have and go deeper with that learning and apply it, uh, across the landing page. Is that kind of what you're thinking? Totally. It's kind of a way to sort of bootstrap the research portion of this. Like you're not trying to spend hours and hours and hours on customer research, cohort research, like all this different stuff. Like, you know, this ad angle works.

(05:34) You may not know why if you're kind of, you know, still trying to figure it out in the ad account, but you know that that ad makes people click. Great. So, presumably the reasons that people click on the ad should be the same reasons that they end up clicking through to the PDP or buying on the landing page.

(05:48) So, if you start with an ad angle that's already tested out, you can pretty reasonably assume that it's a good place to start for the landing page. The one thing I feel like listicles do a good job of, at least like what we've found when we test them, is what I like them for is in an ad, you might find an angle that works, but it might not necessarily like hit on all the value props that your product can actually solve or, you know, uh, presents. So, to me, I treat a landing page more as like the pre-sell. So, they call these pre-sale pages, they

(06:13) call these landing pages, whatever you want to call it. But the whole idea here is explaining the why someone should buy this product before they get hit with a price point.

(06:25) Because I think like one of the biggest things is that in DTC, I'm sure we've seen this, a lot of these brands from a unit economic perspective need to sell a product for a certain price, which usually is a higher price point than something that they could just go cruise down to Target and go purchase, right? So, I think that that's where helping someone understand like what makes this product unique and why it stands out and why it might differentiate itself from another product in that same category is where these landing pages really come in, right? Cuz an ad can only show you so much. An ad might be a 30-second video or might just be a static image with like a good headline, but this is where like you can get into that nuance

(06:54) of the why before you go push them over to a product page or a collection page that's like, hey, bang, here's the price. So, for us, that's what we like to utilize them for. At least try to like figure out what is the language, what is the copy we can use to help do that more on the education front.

(07:11) So, um, you know, you've done a lot, you work with a lot of clients and everything, Ryan, you see a lot of tests, everything. What are some absolutely awesome things that you've tested, I mean just to get right into it, in the last like 60 to 90 days that have been really fantastic for you in terms of messaging, layout, offer.

(07:30) I mean what, you know, whatever. Yeah. I mean two things come to mind. The first is doing multi-step funnels. So presumably you run like a listicle and people end up clicking through to the product page, to the collection page, whatever it is, more of a pre-sell.

(07:46) What you can also do, if you have a really specific offer that you're trying to test out and you don't necessarily want to roll it out to the entire site, you go listicle to an offer page with the same messaging. And that's super important because it's kind of taking the same things from your listicle in terms of like cohort messaging, specific angle, again just like keeping the same thread from your ad.

(08:05) If you go listicle to a second offer page, kind of counterintuitive because people view it as like lengthening the funnel, requiring more clicks. Like I think 10 years ago this would have been like counterintuitive, people saying like, "Oh, you want to, you know, shorten the funnel, get as close to the purchase as possible and as few clicks as possible." Not really what we see anymore.

(08:23) Giving people more reasons to buy and then putting them on an offer page that is kind of like reinforcing the same language works really, really well. So that's bucket one. Bucket two, and I think I remember Zach talking about this in a previous episode, but taking social proof directly from like Facebook ad comments, Instagram ad comments, and just like either just getting screenshots and putting them on the landing page, that hits incredibly hard.

(08:43) Um, I think the authenticity of that is really, really powerful because people are so used to seeing like a carousel full of reviews. Not to say that doesn't work, but if you can take it right from the source and put it on the page in a format that people recognize, that can be really, really powerful.

(09:01) Ryan, I think one thing that we were talking about when it came to like tests that you were running is for some of like the health and wellness brands, you were showcasing how like before and afters obviously are like huge and impactful, right? You can show the transformation. One thing that's super interesting that you told me to like dig into more is like how do we do that transformation, but for other products, right? So like for us, for Hollow even, how do we showcase like someone that has whatever banged up feet from running in like crappy cotton socks and stuff like that and then show them what it could

(09:24) look like if they actually wear like the correct gear. For us, we've seen that be successful and even for some like Homestead clients that we built landing pages for that aren't in like this typical before-after crazy transformation, but it's like trying to treat it like a transformation with your product.

(09:40) So it's just still showing the before and after, because I think it's so simple to say before, after. I think for the human, you know, the consumer, it's just like very easy for them to perceive like what's going on in that setting. So that's one I feel like that you told us about more recently that has been helpful for us, not just like across health and wellness brands at Homestead, but also just for like physical goods.

(09:58) I think there's always like an aspirational end state people are trying to get to when they buy a product. Might take some creativity if there's not like an obvious like problem-solution thing you can go after, but there's always a reason that people are buying it and a problem they're trying to solve. You just got to dig into it. So how do you set up a clean test? Honestly, like how do you... This is one I've always kind of like BS'd my way through, but like, is it just an intelligence test or can it be some vibes? Like, how do we do this? I know every media buyer that I know loves the vibe testing, just like chuck stuff in the ad account and uh and let

(10:28) it ride. I think there are, I mean, there's two sides to this. Obviously, you can like do straight-up like split testing post-click or you can throw new pages in the ad account and like let Facebook do the targeting for you. Pros and cons to both. I think, I mean, for me ideally, like on the CRO side everything's a split test.

(10:46) You want to know that you're getting the same audience, the same targeting from the same ad creative. Like eliminate variables is the name of the game. The problem is a lot of brands don't necessarily carry like, you know, a dedicated LP testing campaign in the ad account. Like maybe they don't want to cannibalize existing ads that they at least know are working pretty well.

(11:04) Like there's real money and some people can't afford it, I think. Yeah. I mean, that's where I was going to get to is like there's real money behind all these tests, and especially when you're running CRO from paid, it's pay-to-play. Like, you have a lot of money at stake that you could potentially use, especially if some of these tests take a while to reach stat sig.

(11:20) Like, you could just be burning money for weeks on end trying to reach stat sig through these tests. So, ideally, to answer your question, like yeah, everything is post-click, everything's a split test. But I think there is room to test directly from the ad account, like spin up new ads, link it to your new page.

(11:38) You have to be okay with some margin for error in terms of Facebook's tracking, because at the end of the day, like, they're basing it on the full funnel, not just the ads. So, you're going to get a slightly different customer cohort potentially.

(11:51) Like, we don't know that for sure, but at the end of the day, we're all just trying to like appease the Meta gods and you need to be okay with a little bit of margin for error on that side. Ideally, split tests always, but if you have to do it, um, just know what you're getting into if you're testing from the ad account. Brad, on your end, when you're running landing page tests for clients or for your own internal brand, are you running them with proven ads, new ads? Like, how do you think about that? Like, what ads do you tie into the landing page test? Because I think that's like an important... Yeah. If we work with Kanuka, who I know

(12:17) Ryan, you have a wonderful podcast with Kanuka. Um, and we worked with her several times. I have another point about working with Kanuka related to the short versus long comment that you made earlier, but we generally will run literally the top ad only through a post-click redirect with Intelligems.

(12:34) And that's because, you know, I think to her credit, like they're super dialed into like what's working in the ad account, because they do a lot of research up front and they kind of go through that process of like building out a landing page that reflects the messaging of the top performing ad.

(12:45) And usually in those ad accounts, like the scale of that top performing ad is 50-plus percent of the spend. So, we're pretty comfortable uh splitting it out and doing that. There are definitely times, I mean I can think of a test that we launched literally this week for a landing page where I duplicated the top performing ad and I'm just letting them compete against each other.

(13:03) Now the caveat and the thing that I'll look for with that is, um, and to your point, Ryan, I want to make sure that I'm keeping an eye on maybe new visitor traffic as a percentage of the total traffic, because if all of a sudden the new landing page is performing substantially better but 60% of the traffic is only new people and the other one is 90%, well, it's like, okay, great.

(13:20) These are completely different users with different exposure to the brand and they might be converting at different points. But in the case of this test, it's like we don't have the means yet to be splitting it out and running it um super cleanly, because if we split 50% of our traffic and 50% of it tanks from a conversion standpoint, uh we just kind of screw the brand over a little bit.

(13:39) So, we're just trying to be conscious of that. So, um, there's a couple different ways we'll do it. I echo Ryan's points on how we like to prioritize it, but it comes out in a few different ways. Yeah.

(13:51) I mean, there was like a discussion that you were part of, I feel like a month ago, in the Founders Community where essentially everybody was saying that they're taking like the top two or three ads and that's how you're doing landing page testing after you've sort of established that, not necessarily with clean stuff. You know, I think going through this and setting up the test.

(14:07) So, I think a lot of times people get confused, like they'll want to test landing pages. They'll have different landing pages running with different creatives. Like, what is a better way to track this stuff so that people can keep track of it? Because a lot of people listening to this podcast are, you know, seven-to-eight-figure brand owners that are like doing it themselves.

(14:25) How do they keep track of this stuff without going insane, I guess, is another question. I get that a lot. Yeah, I mean, it's so important to at least have some structure around what you're testing. Like I'm going to keep saying that like template, angle, offer thing probably a lot over the course of this episode, because I've seen a lot of brands at that size, like into eight figures, that have just thrown a bunch of landing pages at the wall over the past six months to a year and you don't really learn anything that way. Um, like you see something, you know, you get some alpha from another brand that might

(14:55) you know, that might be working for them. You just like take a template from over here, design from over there, and you throw all of them in. Some work, some don't, and you don't know why. So, at the very least, regardless of how you're testing from the ad account or post-click, you need to isolate as many variables as you possibly can.

(15:11) Like, if you're going to go and grab a brand new page from another brand and try and rewrite it for yourself, at the very least, like, keep the offer the same as you've already been running, because you know how that converts at baseline. Keep the headline angle and like the overall copywriting angle, I suppose, the same as one of your existing pages, because if all of those are net new, you're going to get to the end of that test and if it doesn't work, you might throw out something that could be a winner. Um, it's just like your new headline was junk or your new offer was junk. So, I think isolate as many

(15:39) variables as you can on the landing page. Only test a new page design or only test a new angle for your copy or only test a new offer. Beyond that, keep a log. Like have a running list of everything you've tested over the last 6 to 12 months. And ideally, you have naming conventions in your page URLs.

(15:58) This is huge for us. Like have like a template code, like A is my listicle, B is my shoppable page, C is my advertorial, and then version number, and then the product that's on the page, and like, you know, roughly what the angle for the page was, because you can really cleanly see, like in GA4, in Northbeam, wherever you're looking, just by the landing page URL you know roughly what was on top of it.
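As a rough illustration of the naming-convention idea (the exact slug format below is an assumption, not Page Deck's or Homestead's actual scheme), a slug like a-101-evergreen-multi can be decoded back into the fields it encodes:

```python
import re

# Hypothetical slug format: <template letter>-<version>-<offer/angle>-<product scope>
TEMPLATE_CODES = {"a": "listicle", "b": "shoppable", "c": "advertorial"}
SLUG_RE = re.compile(r"^(?P<code>[a-z])-(?P<version>\d+)-(?P<angle>\w+)-(?P<product>\w+)$")

def parse_lander_slug(slug: str) -> dict:
    """Decode a landing-page URL slug into the fields encoded in its name."""
    match = SLUG_RE.match(slug.lower())
    if not match:
        raise ValueError(f"slug {slug!r} does not follow the naming convention")
    parts = match.groupdict()
    parts["template"] = TEMPLATE_CODES.get(parts.pop("code"), "unknown")
    return parts

print(parse_lander_slug("a-101-evergreen-multi"))
# {'version': '101', 'angle': 'evergreen', 'product': 'multi', 'template': 'listicle'}
```

The point is only that the slug is machine-readable, so any report keyed on landing-page URL (GA4, Northbeam, Motion) can be grouped by template, angle, or version without extra tagging.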

(16:22) That's something real quick that Ryan got us into, these naming conventions. We started to do it at Homestead, um, for one of our first clients that we started testing landing pages on, and then for us, for Hollow, I mean right now we're on like A103. Like we've literally gone through 100 variants in the last two and a half years of these listicle pages.

(16:40) But I think that's the really important part, because without that naming convention, even like us, we use kind of Motion or Northbeam, like the combination of the two, to look at, hey, what are my top landing pages like L7 or L14? You can pull in like Northbeam one-day click ROAS. You can pull in kind of like some key metrics. We should talk about that in a second.

(16:53) Like, here's the metrics that you're looking at, Ryan, versus even a metric that maybe a media buyer is looking at, if there's something different there. But um, to us, like, I can just right now look, I have it pulled up in front of me. It's like, you know, A101 evergreen multi, meaning like, hey, this is a listicle page that is an evergreen offer and we're showcasing multiple products, not just a single product. Um, and I can see like what percentage of spend that that's taking up. So I think that the naming conventions part, like especially if

(17:18) you're trying to do a ton of these tests or if you're starting to hit more scale, like it's going to, you know, pay dividends down the road. Yeah. So like, on that, what are you looking at? Right now, I want to know, what are you looking at? What is Ryan looking at versus like a media buyer? Zach's looking at his bank account and seeing if the number's going up.

(17:35) Yeah, that's it. That's how I base it. No, Ryan, I'm curious how you think about it. Actually, Zach, I want you to start, because like going from the ad account down, like there's obviously some differences there. So when you're trying to judge in Motion or whatever whether or not a landing page is working, what are you looking at? So a couple things. Obviously spend is like our new kind of KPI. So if something is working, like usually Meta will start to force more spend behind

(17:58) it. So it'll start to like agitate more spend. Now depending on if you're like pulling out a test versus, kind of how you were mentioning before, if you're doing the URL on the back end, not even on the front end, spend is kind of out of the picture.

(18:14) Then really what it comes down to is things like click-to-add-to-cart ratio, click-to-purchase ratio. So like even just seeing if we get a higher intent off of visitors hitting the listicle and then shopping the site is like a metric that we'll look at. So period over period on that. We'll also look at like AOV, which is really interesting.

(18:30) Some pages tend to push up higher AOV because you do a better job of like selling a collection of products versus just selling an individual product. But at the end of the day it's ROAS. So for us it's like one-day click ROAS and in-platform ROAS. We're kind of comparing the two. We make a lot of decisions off of Northbeam one-day click, but those are kind of like the few things, the other metrics, that you can easily pull in in Motion to look at to kind of compare this lander versus that lander.

(18:47) So those are the main ones. Click-through, click-to-add-to-cart, click-to-purchase are kind of on the back half. So it's like the experience after the landing page, or even if you have a shoppable landing page, it's really interesting to see that.
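To make the arithmetic behind that KPI stack explicit, here is a minimal sketch, assuming you already have aggregate counts per lander from whatever reporting tool you use; the numbers are made up:

```python
def lander_kpis(clicks: int, add_to_carts: int, purchases: int,
                revenue: float, spend: float) -> dict:
    """Back-half funnel metrics for comparing one lander against another."""
    return {
        "click_to_atc": add_to_carts / clicks if clicks else 0.0,
        "click_to_purchase": purchases / clicks if clicks else 0.0,
        "aov": revenue / purchases if purchases else 0.0,
        # pair this blended figure with one-day-click ROAS from your attribution tool
        "roas": revenue / spend if spend else 0.0,
    }

# Hypothetical period-over-period comparison of two landers on equal spend
print(lander_kpis(clicks=4000, add_to_carts=520, purchases=180, revenue=14400, spend=6000))
print(lander_kpis(clicks=3800, add_to_carts=610, purchases=205, revenue=18040, spend=6000))
```

A lander can win on click-to-purchase but lose on AOV (or vice versa), which is why no single metric, ROAS included, gets read in isolation.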

(19:00) So, making sure that you're running your like Meta Pixel Helper to make sure, like if you're running a shoppable landing page and you have a drawer cart that might slide out on that page or going straight to checkout, that those are like firing properly. Um, cuz that's where sometimes, like, we will run a listicle page. We see listicle click-through goes into a collection, like okay, this is working, now let's try a listicle and put a buy box at the bottom of it with like a bundle or with an individual product and let's see if that click-to-add-to-cart or click-to-purchase ratio

(19:25) goes up, because even if we're seeing it go up and maybe they're not converting yet, we might be doing something special on that page that we weren't doing previously. So I think then the click-to-purchase ratio is really like, did I sell them properly through the landing page? Did I educate properly through the landing page to get them to go all the way through versus just like shop around? A few of these like other metrics are important to look at outside of just like your CPC and, you know, whatever ROAS, which are like the main

(19:48) two. So how are you setting them up in the ad account? Are you setting them up on the back end or are you setting them up as ad sets? Ryan, we do both. Right. So like we'll run them on the back end for the initial test. So then you will only see like technically like a URL, but then on the front end, once it's like proven that this page is working, like that's how we run our AB tests. A versus B is on the back end.

(20:11) Once we have A versus B proven, then we'll take that full URL and drop it into a new ad in the ad account. Okay. And that's where you're saying the velocity is what you're paying attention to because if velocity picks up like I know we've talked about velocity a lot recently, something we pay attention to.

(20:25) Okay, I got it. So that's where, like, you know, you might see like A101 in our ad account L14 has like 100K of spend, or A102 might only have 10K of spend, but like for the period that it's been launched. You always have to like pay attention to the period of time and how quickly then the spend can ramp up. Like A101, we'll run our AB test, A101A versus A101B. We'll see which one works better.

(20:42) Then we'll take A101B, which might be the winner, put it into the ad account, and then see how quickly spend starts to go to that page versus our like main first page that we're running up against. That's kind of how we see the velocity change after we run the initial AB test.

(20:59) And so then like the AB test stuff, and correct me if I'm wrong, Ryan, but this is where we're looking at Intelligems a lot, for like revenue per visitor, like profit per visitor in Intelligems, to see A versus B. So I think that there's a big distinction between AB testing looking at the performance and then actually looking at the performance from in-platform or in Northbeam, because Northbeam won't correlate all that way through for you properly.

(21:17) Yeah, it's not going to see that split. Yeah, we lean on Intelligems a lot for like incremental testing after a page is already proven out. If it's completely net new, we don't necessarily like immediately stick that into a split test versus a hero page, which is really just like how we've done it.

(21:35) Like it's not gospel for how necessarily it should be done. But broadly for me, in terms of like evaluating whether or not something's working, I think everything that Zach is talking about tells you effectively whether a page is working or not. Great. Like how profitable is that funnel? Can you keep running it? But then everything post-click, like I look at GA4 and I look at heat maps, whether it's Clarity, whether it's another heat map tool, which tells you more effectively why it's working or not working.

(21:58) So what's the conversion rate? What's the revenue per session? What's the bounce rate on the GA4 side? On the heat map side, like how far down is your scroll depth going? What are the elements on the page that get a lot of engagement versus others? Especially if you're looking at like a page with really low scroll depth, or rather, I guess, a high drop-off.

(22:20) People aren't scrolling further down the page, but there's like some stuff below that, say like 50% drop-off point, that gets a ton of engagement relative to its page position. You get way more insights into the why of a page by looking at all the post-click stuff as opposed to kind of like the broader picture, in-platform stuff that Zach is talking about.

(22:38) So, just to, like, real quick, Andrew, just to pause on this for a second, cuz I think that this is an important piece. People are like, I've heard this a million times. I saw someone tweet, I think it was Brock from Frost Buddy, like, I've tried a million landing pages and they don't work. This is the in-depth level that you kind of have to get to to really understand, if you're going to run landing page testing, without like the proper tool like a Page Deck that makes it really easy to build a page, and then taking the time to look at all these things. Look at the click maps, look at the heat maps, look at the post-click, look at the like

(23:00) metrics. This is where like I think a lot of people give up on landing pages too quickly. They'll try it. They'll maybe even try five or six and they're taking like big swings, but they're not actually getting into the weeds on why.

(23:13) And this is where I think Ryan has always told me and always told our Homestead clients, like when he's ever like consulting or just giving advice to them, is like start simple. Do one page, compare that versus your main variant, which might be your product page or your collection page. Okay? Look at the click-to-purchase. Look at the click-to-add-to-cart.

(23:28) Look at the click-through from landing page to that page. Look at your email collection rate. Like look at all of the elements of things that could be happening just on that first landing page before you give up on it. You have to kind of set a baseline of that versus where you're sending traffic to. And then that's what you're trying to work up against.

(23:45) That's where then I feel like, Ryan, tell me if I'm wrong, but like this is where then you go do the AB test. It's like, okay, I have one page running. Maybe it's not working as well as my PDP. How do I make a change to that landing page now to see if I can get a better version than my PDP? And to me, and what Ryan and I always talk about, is copywriting.

(24:03) Copywriting is probably the biggest thing that will actually get that like flip in performance on the landing page versus like some crazy new design. I know I'm kind of on my soapbox right now, but like the DTC community, like a year or two ago, went in this super deep dive into, like, landing pages are huge, go hire them, there's a bunch of landing page agencies now. And they were just overly designing these pages.

(24:19) They were not paying enough attention to, let's just go with one simple template and then let's really focus on big swings in the copywriting, not just like the design, the aesthetic, whatever. So I mean, like, to me this is where, if you're going to go this route, keep it really really simple. Do one, do a listicle; like, traditionally listicles work better across all categories.

(24:37) Ryan, you can tell me again if I'm wrong, but I think that that's like the case. Start there, start simple, like set the baseline metrics of that versus where you're already sending traffic to. So on the topic of big swings and copy, uh, with listicles, like what's an example of a big swing within a listicle, maybe as it relates to copy specifically? Yeah. Like if you have an existing listicle you've already built out, like you have one that's best blanket for hot

(25:03) sleepers and here's why. I mean to me a big swing is like here's why it's the best blanket for cold sleepers. Maybe it can be both. Who knows? The best blanket for like moms who like to backpack with their kids. Like trying to identify all these very different angles that you can put into the page because you think about what that involves. Like that's a new headline.

(25:23) It's new creatives throughout the entire page. It's new copy throughout the entire page. it's probably a new CTA section that speaks directly to that person. Like you're fundamentally changing the messaging of that entire page. Like it doesn't always feel that way because you're just rewriting the text fundamentally on the page, but completely different.

(25:39) And I think that type of testing is really important when you get started, because it requires you to run tests for a shorter amount of time. Like if you're stepping through these like small-swing, like incremental AB tests, you often have to run them for weeks and weeks and weeks. And like we said before, you're spending money on this traffic.

(25:56) So for you to validate that out for something that's probably not going to be a big difference in performance and not a big difference in money in your bank account, it's hard. Like most DTC brands can't afford to do that. So yes, always lean on the big swings by default until you're at a scale where the incremental stuff makes a difference.

(26:14) One thing that Ryan started to have us do is actually take copy from ads and turn that into the images. Like basically overlay copy on top of those images that sit next to the text. So, if you've got a three reasons why or five reasons why, taking one of those headlines from one of those top ads that maybe isn't the same headline that you're running on the landing page and carrying that through as well.

(26:36) Like this is where like some of that copy that you've figured out in headlines and ads or maybe even just like ad copy itself. Try to take some of that and bring it into the imagery. Um, and this is kind of where I was saying like the copy actually can carry the weight. It does then turn into design in theory because you have to take text and overlay it on top of images. But again, that's where we've seen at least a lot of like the incremental lifts is like from that part of it, too.

(26:54) If you're thinking about, you're going to go through, um, building out more of these iterations and big swings you're talking about, are you tying it mostly to the persona building that you also have been doing in creative? Like is that where the parallel comes? Because I feel like a lot of times I'll hear from someone that it's hard. I don't know what to test.

(27:17) And we've gone through and talked about creative testing, and a lot of what we do as ad buyers is you're building it to try to unlock new angles, new personas, and you're thinking about that. So is that like a good place to start to think about what big swings could look like in these? Is that kind of how you start to build it? Because, I mean, it is hard, because I think about a lot of people sitting there being like, what do I, where do I go? Like what does that mean in terms of testing? I know I need to, but I don't really know where to start in terms of what a big change is. And persona seems to be something that people can really glom on to, you

(27:47) know, be a better idea. Yeah. I mean, for me broadly, you can either like lean into cohorts and personas, which does kind of require you to do more creative testing in the ad account. Like, you can't just like, you know, yolo create a new landing page with a new customer cohort just because you think it might exist.

(28:05) Like, you got to make sure that you've tested that out through your ads and you know for sure those cohorts exist and that you can lean on them. Getting info from your post-purchase surveys, doing like research on Reddit. Like there's a tool called GigaBrain, I think it is, where you can like just dive into like the Reddit or the subreddit for whatever category you're selling into and like try and weed out some customer language from there.

(28:24) Like do that type of research and build new ads on top of that. If you're having a hard time with the cohorts, I think the other big bucket is trying to present your product benefits in a different way. So, just exactly what we talked about with like this blanket thing, like is it the temperature-regulating portion of it? Is it the fact that you don't get sweaty? Is it the fact that, you know, it's not going to smother your child? Like, I don't know.

(28:48) Like, you can rearrange the things that people care about for your product. So, you're either doing new customer personas or you're doing product benefits testing. I think broadly that kind of fits into one or two of those categories when you're thinking about what to test. Also, one question I've always wanted to ask just real quick. Do we have more on this? Because I have another question. You can go for it. I'll add after.

(29:06) Okay. My, uh, question is that people always send me a landing page and say, "Hey, I've got a new landing page. I'm working on this. What do you think?" And they send it over to me and it's been designed for desktop, and then like I'll look at it on my phone and it sucks.

(29:24) And I'm always like, "Isn't like 80, 90% of your traffic mobile? I don't understand." And so like, how do you encourage people to design on mobile? Like is that what you're doing? Like, do you, are you like, this is what it looks like on a device first, and then go to the desktop? Cuz I feel like this is a major disconnect I've seen in digital practitioners and I'm curious how you approach it. I don't know why that is still the default for everyone.

(29:47) Like I mean we all work on desktop, on laptops, at the end of the day, so I get it. But I mean, yeah, how many brands have you seen that don't get 80-plus percent of their traffic on mobile? Like it's crazy. Oh yes. I mean, when I jump into any page builder, or even if I'm like going to build something from scratch, like I make my window the mobile view, always.

(30:06) I make the window 400 pixels wide, always. So, yeah. Ryan, when you have designers design stuff, are you doing like multiple mobile? Like, are you doing like big iPhone, small iPhone, like 400 pixels, 480, 560? Like how do you differentiate that versus just like desktop, 1200 pixels wide? Like how are you thinking about that? What are like the regulations? Yeah, 480 and down is like generally where mobile starts.

(30:30) Like above that, it's kind of tablet no man's land, which to me is kind of hard to optimize for because there's not a ton of tablet traffic for most brands, I think. So, I carry a tablet in my... I wear JNCO jeans and I have a huge pocket. So, I carry a tablet all the time. He's the one person that you see on your Google Analytics, the one that viewed it on the iPad. Yeah.

(30:53) Yeah, 21-inch. Like broadly, 480 and down. 480 is a big iPhone, like a very big iPhone, like they're scrolling sideways in landscape orientation. 400-ish is a good middle ground for most devices. 390 to 380 is like some of the smaller phones. So yeah, worry about 480 and down if you're looking for a hard-and-fast rule.

(31:15) Yeah, we've been starting to design in like I think 400 and 480 um just because there's a lot of variance between that. like even like 480 to 400 that like little extra bit of pinch can like get things to wrap and text wraps and things look super weird.

(31:32) It doesn't take much time for a designer or, you know, like a page builder to like crank that out. So from my perspective that makes a lot of sense. Um, the one thing I wanted to call out is, like, to Andrew's point about testing, I think this is like a really big call-out for people that are overwhelmed with, like, how much do I test? When do I test? So one of our brands in our portfolio is growing pretty quickly right now.

(31:52) We spent like 1.2 million in ad spend last month, in the last 30 days, I was just looking. We have five active landing pages. That's it. And of those, every one is a completely different experience. One is a five-reasons-why, one is a shoppable page, one is like just super short advertorial text, one is like a mimic of the product page with just like more content on it.

(32:18) So I think like the biggest thing here is, like, when we talk about big swings, like pick a variant that you're going to go with: listicle, advertorial. And I think, Ryan, I'm curious your thoughts, my guess is like don't do advertorial. I think, like, unless you have a very strong cohort of like a female buyer, like female 20 to like 50, the advertorials don't necessarily work, but for us, for like men, for Hollow, like they just never have worked.

(32:41) We've tried a million of them, we've tried them on like whitelisting pages, we've tried them elsewhere, like we've never gotten them to work, they're hard to do. So my thought is, like, okay, start with a listicle, big swing with copy, until you get a winner, then go try a shoppable page, then go try like maybe an advertorial, then go try something else. But I mean, I think brands just need to realize, like, you don't need a million of these pages necessarily. It's just like the test is the important part, and that is where the test has to be a big swing, and it doesn't mean that you need a whole new template for it to be a big swing. Yeah, totally. On advertorials, I generally

(33:10) agree with you. Harder to nail, super dependent on your demo. I don't think it necessarily has to be women. Like if you're selling, I mean, anything that is like a super DR YouTube pre-roll ad in like the health and wellness space, you could probably get an advertorial ripping.

(33:27) Like if you're going to build a super direct response funnel, advertorials can work really well. Harder to nail for like apparel, for other like non-consumables. For testing, yeah, it just gets so unwieldy when you get to the point where you've been testing landing pages for the last year and you have 103 pages in the account like you said.

(33:45) So if instead you have five really really different pages and you're kind of stepping through the iterations of those pages, it's easier for you to figure out what's working at the end of the day. And that's I think an underrated piece of all this testing stuff is like you have to be able to wrap your mind around where you started and then how you got to where you're at and whether performance is better or worse because of it.

(34:01) Um, if you're just like yolo slapping new landing pages in the account every week, it's going to get out of hand super quickly. And I know that there are some pages in that gray area of A40 to A65 that we just don't know what was going on at that point in time, but performance was down. And when performance is down, people start making emotional decisions. You're just looking for anything that works.

(34:24) And that's the biggest difference I've seen between like the super hardcore CRO people and the super hardcore media buyers, is media buyers are really used to like quick feedback loops. You launch something, it either works or it doesn't. You kill it and move on.

(34:41) When you're testing pages or you're doing CRO tests, it requires you to have some type of statistical rigor for you to know whether or not that thing worked. So if you're going to do any testing at all, like you need to commit to actually, you know, running it for two weeks, running it to the point of stat sig. Otherwise, you're going to be spinning your wheels for months and months without knowing what's working. Yeah.
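One common way to put numbers on that statistical rigor (this is a generic two-proportion z-test sketch, not a tool from the episode; the visitor and conversion counts are hypothetical) is to compare the two landers' conversion rates and only call a winner once the p-value clears your threshold:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> tuple[float, float]:
    """Return (z, two-sided p-value) for the difference in conversion rate between A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided tail of the standard normal, via the error function
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical: control lander vs. new listicle variant after a two-week split
z, p = two_proportion_z_test(conv_a=180, n_a=4000, conv_b=228, n_b=4100)
print(f"z = {z:.2f}, p = {p:.3f}")  # declare a winner only if p is below, say, 0.05
```

Run the test to a pre-committed sample size or duration; peeking early and killing the page the moment it dips is exactly the media-buyer reflex being warned against here.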

(34:58) I have a... this might be a meaty question that bounces around a lot of different things, but it's kind of related to the progression of the testing. So, Zach, you just kind of alluded to spending a lot of money over the last 30 days for the specific brand. What did the progression of tests look like? Cuz I'm curious, like when you launched ads originally, did it go to a PDP versus a lander, or just a PDP? Okay.

(35:15) And then we figured out it was this type of lander. And then we did this other thing on the website which was a pricing test. Cuz do you consider offers part of CRO? Like was it an offer test? Was it a shipping test? Was it a post-purchase thing? Was it a landing page? Like what did the progression look like, as much as you guys remember, over the last however long? Because I'm trying to give, like, I'm just like, as a brand owner who's like, I don't know where to start.

(35:36) I think you gave a really good road map of like, you should start with a listicle because it's really easy to do and you should focus on nailing the copy. What do they do next? Like did they make another listicle? I'm just curious what the progression has looked like for you guys generally.

(35:48) I'm sure it's different by brand. So, I mean, like for us, for this one brand specifically, it was like PDP and collection page first. Like you just start with like the main two, right? And that's where like your job with your website is to make it, you know, try to answer as many questions as you possibly can.

(36:05) We can go more into like CRO, which I treat more as like website than landing pages, even though like they're related, for that brand. Even 90 days ago, we were running to one landing page, which was a listicle, and then the product page, collection page. That's it. And so what we did then is we basically duplicated out that listicle and we started to add in new sections.

(36:24) So then we started to add in like a video section where like there's customers talking about the product below the five reasons why. And then after that we started to add in like another section, which was like the FAQs that we knew people were interacting with on our website.

(36:41) So, it's like, okay, well, can we get even more educational information onto this page? I mean, even if you go look at Hollow's pages that we run now, which I talk pretty publicly about. I'm not too worried if you rip them off unless you're another sock brand. But you can see the same theory in the works, where it's like our hero kind of headline has been the same for a while. Like that huge headline has been the same.

(36:58) Then we added in like three videos that are from, like, basically our whitelisting partners that we work with. Then we shortened the five reasons to three reasons and tested that AB. And then we added in like an us-versus-them section. And then we added in Facebook comments as another section. And then we added in FAQs as another section.

(37:18) To me, it's like adding in a new section to see what that does, if it's a completely different element, and then seeing how people interact with it with, like, whatever tools Ryan was talking about, like the heat maps and click maps, and seeing then what is that lift. Did that one extra piece of like copywriting or that one extra like video then add that extra value for them?

(37:38) Now more of them go through and purchase because the one piece of reassurance that they needed was solved. So to me, like, I think they start with like a listicle. It's the easiest thing to do. It's copy-heavy. You use your like five reasons why or three reasons why. You take your top static images.

(37:56) You kind of make those part of your reasons why, or if there's like headlines that stand out. And then that's the copy that you test. That's your A versus B. Completely different copy changes and completely different like reasons-why images. Once one of those is working, add a new section into it. Like this is where like Page Deck is a great tool. Like it's so simple to literally just go boom, new section, that's FAQ. Boom, new section.

(38:12) That's like a faux buy box. Boom, new section. That's like a different headline, right? And it's already pre-built. You don't need to design it. You just like toss in your copy. Toss in your image. Like that's where I think like a lot of these people overly design these pages and then use like Figma to the max. And it's like, okay, you likely do not need that.

(38:29) You likely just need to take a bigger swing on the copy or the section you're about to go test next. Well, I mean, as a person, you know, we charge, I think starting, it's 21,000 per landing page. And I try to make... trying to... yeah. On the low, I try to make them designed appropriately, the right way.

(38:48) So, the actual serious question, just to get like a little bit even nerdier and deep on this. So I feel like a lot of times Ryan I've heard you talk about too when is you know you were doing stuff in the founders community and reviewing landing pages for folks like you would talk about as well like once they've added to cart and what the optimizations were that could happen in terms of like a cart drawer and how that looks on mobile cuz a lot of that [ __ ] sucks honestly like and it's hard and it doesn't work and it's like clunky and it's and then like and I know you know in looking at the the redesign too

(39:19) for Hollow and other brands too, but obviously you were involved in that. Like there's a lot of bundling happening to raise AOV. There's a lot of things of like, I want to buy more. Like I go on that site, I'm like, yeah, I'm going to buy 10 pairs because it's like way cheaper to do that, right? Like I know I like these anyway.

(39:37) And so how do you introduce, once somebody adds to cart, opportunities for upsell, and what are effective tests there? And then what are other things, like sticky add-to-cart or sticky, you know, cart buttons? Like what are other things there that are like really magical that you've seen, like that's good [ __ ], like that's really going to work.

(39:56) Yeah, I think, well, point one, to give Zach's team credit, they are incredibly good at getting through the really hard things that are really big technical lifts for the sake of testing things out. Um, I think a lot of brands are scared of like spinning up new bundles, spinning up new site features, spinning up new upsells, because a lot of the time it's a ton of work.

(40:14) So when you're going through Hollow's site, know that it's not an easy way of testing. But to your question, Andrew... I just invited Ryan on here, for what it's worth, just to, you know, give me, you know, props. So sorry, Ryan. Keep going. No, it's a ton of work, but it's worth it. Uh, hire a good developer. With upsells specifically, like, if you're on Shopify, there's kind of like three natural checkpoints for it.

(40:36) Like you're going to do in-cart, you're going to do in-checkout, and you're going to do post-purchase. I think if you have nothing, start with post-purchase, because it is as close to free money as you're going to get in DTC, because the purchase is already secured. There's no risk in terms of like introducing more variables into the customer journey that cause people to like get decision fatigue and then they bounce out, which is important, you know, and I'll come back to that. But start with post-purchase. If you have very

(41:01) few SKUs or one SKU, you can upsell to like a 60-day or 90-day supply, or buy more, save more, that type of thing, or even just like offer an additional product at a discount. Some brands prefer to think of it in the sense of they've already secured the purchase, like your CAC is baked in.

(41:18) Now you can offer something super aggressive that you can't afford to offer on the main site because it's too steep of a discount and you don't want to devalue the product because everyone's going to see it. Great.

(41:32) Once you've done post-purchase and you have a few things that are working, I would go to in-cart, because it is this point where people have already like kind of shown some interest, but they're not totally in the checkout ready to buy. You're not going to freak them out by introducing more variables. So again, if you have not a lot of SKUs, probably more of the same, like buy more, save more, upsell additional products.

(41:50) If you have a larger SKU list, try to have some tact for what you're upselling. You know, don't just like stick on the default upsell app settings and let it just throw anything into the cart. Try to curate products that actually kind of build a logical bundle together or go together in some way and explain that.

(42:07) Are there apps that you're using to do that? Like I know for Hollow it's a custom build, like their in-cart, their carts are custom built, but like are there any apps for that? And what are the apps you recommend for post-purchase? Like what is the easiest to use if someone does not have a post-purchase upsell? Like what app would you recommend they go use? Or are all the post-purchase apps the same? Basically, it's the same API under the hood. They all look the same.

(42:25) You get basically no UI choices. You can use AfterSell. You can use Checkout Blocks. Those are the two that I've used the most, I would say. So, if I'm going to like spin one up for a new brand, one of those two apps is what I'd go for. Both of those, if you're on Shopify Plus, offer checkout customization as well.

(42:42) So, you kind of get some more bang for your buck there. There's like pricing tiers, of course, but they kind of serve multiple purposes. Have you seen anything specifically work in like post-purchase upsells for like any brand, or even like in-checkout upsells for any brand, that's like, this is obvious? If they're buying one of this product, sell them more of that product.

(42:59) If they're buying one of this product, undersell them. Like what have you seen historically? Is there like, sell them 20 things if they're buying one? Like what is the one thing that you've done, or have seen with all these brands that you have touchpoints with, that seems to work both post-purchase and in checkout? Yeah, post-purchase.

(43:17) The biggest one I've seen recently is, um, a consumable subscription product going from a single 30-day supply to a 90-day supply. Just flip them right from a one-month to a three-month. That particular upsell has like a 20% take rate. It's crazy for this brand. Um, like what that does to your margin on those new customers is nuts. So if you sell a consumable, definitely try, in post-purchase, upgrading from single purchase to 30, 60, 90.

(43:41) Just like try and find the right time block to like get the right amount of take versus like higher AOV. You can back into that. In cart, it's a little more gray because there's so much surface area for the different types of stuff you can offer.

(44:00) I think safely you should have one upsell running that is like more of the product that's already in cart. It's super safe. It works really well, especially if it's kind of lower AOV. Like just add another one, get a 10% discount, something like that. Or if you have like two, three hero products, just like offer your best sellers. Like don't use it as a checkpoint just to like try and sell through all your [ __ ] inventory that no one wants.

(44:18) Like give a deal on things that people are already buying. Those are the two that come to mind. You don't want... Can you please just buy my extra small joggers? I literally cannot get rid of them. I know they don't fit you, but I really got a lot of them. Kids like joggers, too. Yeah, there you go. Um, Brad, I know you were asking about CRO stuff, more like site stuff.

(44:36) Did you have something there? Yeah. I mean, I think you kind of answered it, Ryan. Like, I was going to ask, okay, so with the progression thing, like how do you guys go about identifying the size of the opportunity and like thinking through, like, this makes sense, whether it's on the landing page? So, and maybe the better question to start with is, like, when you guys are talking about, you know, going wide, like is it wide with the messaging of the listicle page? Is it the diversity of the type of page that you're actually

(45:00) testing? Is it, I'm going to test a listicle and then I'm also going to have post-purchase at the same time, because they don't totally impact each other? Like I guess, like, how are you guys thinking about identifying the size of the opportunity and like where you put your effort and attention? Um, that's an interesting question.

(45:18) I think it's less about the specific tactic you're trying to shove it into and more about qualitatively figuring out how big the opportunity is. One, are you swapping a headline? Are you changing one or two little elements on the page, versus building something totally different? I think that's the first checkpoint: how different is that customer journey as a result of the thing that you've launched.

(45:40) The second thing is obviously the traffic level. I mean, everyone's running paid and has kind of their go-to traffic destination. So you're going to get a bigger result off of testing there versus testing on your about-us page, something like that. So I would say it's pretty simple.

(46:00) It's taking big swings in terms of the destination they're going to, the design, the messaging, all the things that we've talked about so far, and then making sure that it's a high-traffic destination. So I'll just share some numbers for us right now for Hollow for the post-purchase stuff. I've got AfterSell pulled up.

(46:12) That's a tool that we use. I have no affiliation with them; it's just one that we use and have liked. We're running three tests right now post-purchase and it's based on AOV. So if you spend less than $60, we're going to send you this post-purchase funnel; if you spend $60 to $100, this post-purchase funnel; more than $100, this post-purchase funnel. And just to share results from the last 30 days: on the one for people spending over a hundred, just by trying this out, and again, we're no experts at

(46:36) this, this is just us giving it a shot, we've added $3.45 revenue per visitor, so an additional $3.45 just by launching it on that one. Now collectively, if you look at all of them, we're adding $2.91 and $2.10, so across all of our orders that's, what is it, two, three, almost $9 revenue per user collectively across our website just through post-purchase upsells.
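
For readers who want to picture that setup, the routing logic is roughly the sketch below. Inside a tool like AfterSell this is just configuration, not code; the thresholds mirror the tiers described here, while the function and funnel names are hypothetical.

```python
# Sketch of AOV-tiered post-purchase funnel routing (hypothetical names).
# In practice this is configured inside a post-purchase app, not hand-coded.
def pick_post_purchase_funnel(order_total: float) -> str:
    if order_total < 60:
        return "under-60-funnel"
    elif order_total <= 100:
        return "60-to-100-funnel"
    else:
        return "over-100-funnel"

for total in (45.00, 82.50, 129.99):
    print(f"${total:.2f} order -> {pick_post_purchase_funnel(total)}")
```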

(47:03) So this is where it doesn't have to be rocket science, but making sure you have these in the first place is just a no-brainer. You can obviously go down the rabbit hole to what Ryan said and go super deep: if I sold them this collection, upsell them that collection. If I sold them one product, try to upsell them 20 products. If I sold them a one-month supply, try to sell them a two-month supply.

(47:21) There's a bunch of different ways to do it. I think the obvious point here is: just give it a shot if you're not doing it. And from a take-rate perspective, we're seeing like 8 to 9% take rates on some of these. 20% is absolutely absurd, like that's insane, whatever that is. But I think that's where just even setting the baseline,

(47:39) a 5 to 10% take rate is a good spot. So 5 to 10% of people should just take the upsell that you give them, because they've already gotten to the point where they've purchased your product. So why not give them another one? And for us, we give a slightly bigger discount than what they're getting on the site to buy more stuff. They've already bought more stuff; I'm not worried about my margin at that point.

(47:55) Might as well just try and sell them a few more things. Yeah. Something that's tactically interesting, that we talked about when you guys told me about the 20% take-rate one, is that it's not going to show up in your Meta account. Meta is not tracking that additional value added. So if you're paying attention to your MER and it all of a sudden starts to increase as a result of launching these post-purchase upsells, Meta doesn't look better, but you can just start ripping more budget. Which is a very niche thing to think about, but we're the media buyers.

(48:18) That's what we pay attention to. I think it would be fun at some point, I don't know if we have time today, but maybe we need to have a recurring guest theme here. I'd love to spend a bunch of time just going through the research stuff, so maybe that's something we can come back to. But I'm curious, how are you using AI in your workflow? Everybody's talking about AI.

(48:36) Like, are you using it, for one? And two, do you have a couple of examples of how you are, if you are? Yeah, I mean, obviously I run a SaaS, so on the code side it's incredible. That's kind of tangential to this conversation, but for me, I've found by far the biggest value in AI is in data analysis.

(48:55) Like going and getting something out of this platform, something out of that platform, and then mixing them together in a way that's helpful. Like recently, doing an analysis of conversion rate by every $10 AOV tranche. You can't get that in Shopify. Super hard to do in GA4.

(49:14) But I just exported a bunch of orders, and then exported conversion rate over time, and then asked AI: all right, I have these two CSVs. I chucked them into a Cursor project and said, give me the conversion rate for the last 60 days for every individual $10 tranche, like people who ordered up to $20, up to $30, etc., etc., and it just spit it out for me.

(49:36) That's the type of data analysis that would have taken me a week to step through myself, because I'm not a data scientist and I'm a dummy. But it just gave me the answers right away. And that's huge context, you know: all right, this is the most profitable AOV tranche for us to shove more traffic into. Great.
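
For anyone who wants to reproduce that kind of analysis without an AI assistant, a script along these lines would get a similar breakdown. The file names, column names, and the per-tranche conversion-rate definition (orders in the tranche divided by total sessions in the window) are assumptions for illustration, not Ryan's actual script.

```python
# Conversion rate by $10 AOV tranche from two exports (assumed schemas):
#   orders.csv:   created_at, order_total
#   sessions.csv: date, sessions
import pandas as pd

orders = pd.read_csv("orders.csv", parse_dates=["created_at"])
sessions = pd.read_csv("sessions.csv", parse_dates=["date"])

# Restrict both exports to the same window, e.g. the last 60 days.
cutoff = pd.Timestamp.now() - pd.Timedelta(days=60)
orders = orders[orders["created_at"] >= cutoff]
total_sessions = sessions.loc[sessions["date"] >= cutoff, "sessions"].sum()

# Bucket each order into a $10 tranche: 0-10, 10-20, 20-30, ...
orders["tranche"] = (orders["order_total"] // 10 * 10).astype(int)

summary = (
    orders.groupby("tranche").size().rename("orders").to_frame()
    .assign(conversion_rate=lambda d: d["orders"] / total_sessions)
)
print(summary)
```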

(49:54) You can change out the offers on your landers, change out your upsells, change out everything that people are seeing based on that one thing alone. Man, that's like three-dimensional chess. I think you've shown our listeners that you, Ryan, are a person that thinks in that way.

(50:11) I think we should finish out the episode by maybe introducing a new segment, too, since Brad's introducing the AI segment, called Beefs, where we just bring up beefs about CRO and landing pages that we wouldn't share anywhere else, or that maybe are a little bit of a hot take. To me, I've always felt, and Ryan, I've known you for a number of years, you're the one that taught me this, that simplicity wins.

(50:39) For a long time I have seen and heard a lot of [ __ ] about how complicated pages need to be, about overbuilding pages, on and on and on, and every time you've been like: simplify. Go to the copy, go to the images that are working in your ads, and just do that first. That's really helped me a lot. I would say that's a huge message that I would love to broadcast to the universe, because this stuff is very complicated, but often the most successful things are the simplest, in what we've seen with our clients, what we've seen with Founders

(51:11) members, etc. So, who wants to go next? Who are you beefing with the world on that? Is that just like everyone besides you? Generally, I don't like to... Yeah. Yeah. I would say generally I'm just beefing about it with anyone that wants to fight. I'm fine with that, you know. Yeah.

(51:34) I don't start fights, really. I'm not you, Zach. Yeah. Yeah. Well, someone here has to do it. My beef is when people say landing pages don't work. Like, you just wrote bad copy. You just didn't try hard enough. It sounds really harsh to say that, but shout out to Brock at Frost Buddy. You just haven't written good enough copy for your landing pages.

(51:52) I definitely feel like he could, if he really wanted to take his time, figure out what the pain points are, why people buy Frost Buddy versus a Yeti versus any other product, and say here's the five reasons why, or the three reasons why, that actually happens. Even if he just ran a post-purchase survey and asked, "Why did you buy us versus the big brands?", there are probably two or three nuggets in there that he could put in a listicle page, very simple, fire it up on Page Deck within the same day, no design needed, and I bet he would see a higher

(52:15) one-day-click ROAS and a better CPA. So to me, it's the people that just say, "I tried it, it didn't work," and then they give up on it. Because for us, go look at our landing pages. If you go spy on our stuff, 95% of Hollow spend still today drives to a landing page first before it drives to the website.

(52:33) I think a lot of people are missing out on that opportunity. We'll save the best for last here, and I will jump in. Zach picked a fight very directly with one person and Andrew picked a fight with everybody, so I'll go somewhere in the middle. My beef is with the people that take screenshots of Intelligems three days after their test goes live. That's all. I'll just leave it there.

(52:50) And it's sessions winning. They show sessions, no context. You've got to save those ones for the group chats. I send Zach the screenshot three days after mine goes live, convinced that I've just changed my brand. Doney, you're such a nice guy.

(53:08) Anything you want to bring up, beef-wise? I'm beefing with everyone who has sent an AI DM to me pitching their landing page services, because I know there are so many brands just getting hosed by like three weeks of Figma design, another week of building it in a page tool, and then another week of edits, and then you've got to test the thing for two more weeks,

(53:25) and you're a month and a half in, and you've tested one thing that probably didn't work, and you probably paid like five grand for it. Just don't do that. Just go and grab a listicle template from someone and build the thing based on what you know as a founder. I mean, "don't spend too much on your landing page builder" is also a great summary, right? Like, just fire up a great solution like Page Deck.

(53:42) There's other ones out there, but... I took this really great class about DM automation for sales lead generation and it's been insane for my business. I don't know about you guys. Yeah, everyone hates you. You hate the world and the world hates you, Andrew. You're the villain here, by the way. Yeah. Yeah, that's true.

(54:00) I also read The $100 Million Landing Page. I'm going to implement that. There you go. Nice. Nice. Well, it's time to go because we can hear Brad's kids screaming in the background. So, everyone, I appreciate it. Ryan, honestly, appreciate you being here. Thank you. And guys, always good to speak with you.

(54:17) Thanks, and as a reminder, go check us out on YouTube as well as Spotify. Please leave us a review. Please subscribe on YouTube. Hit up Andrew at Andrew@foxwelldigital.com. Brad and I don't share our emails on purpose. And if you have any questions or feedback on the pod or anything else, hit us up on Twitter. All of us are there.
