How To Use AI To Win In Marketing

This episode is basically a tactical hands-on walkthrough for using AI in creative, without turning your ad account into generic “AI slop.”

Will Sartorius, owner of Selfmade.co, an agency that focuses on AI personas and AI performance creatives, joins us to walk us through his AI creative process and the importance of AI adoption in the social creative space. He primarily lumps current AI creative usage into two buckets:

  • Type 1: “Clone this ad for me.” Fast volume, but you drift into sameness. 

  • Type 2: Use AI to do the thinking work (research, synthesis, structure), then let humans do the “mushy middle” that differentiates. He’s big on building PAM clusters (Persona–Angle–Motivation) so strategists aren’t winging it and you can see where your creative gaps are before you ship more ads. 

From here we get into the nitty-gritty of his current workflows:

  • Static → GIF/video: why GIFs often beat statics in clean A/Bs, and how to generate animated variants fast. 

  • Tool split: Will uses Claude for more “creative” thinking, but uses ChatGPT specifically because it reads on-ad copy more reliably, which is critical when your prompt needs to include every word that appears on the creative.

  • Prompt iteration loop: generate → inspect output → feed the output back in → ask for a prompt correction (e.g., “the copy disappears; fix it so it stays on screen”). A minimal sketch of this loop follows the list below.

  • Midjourney reality check: great for vibes/backdrops and motion experiments, but “sucks at product rendering,” so you generate the scene and then swap in the real product using other tools. 
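For listeners who want to wire that iteration loop up themselves, here is a minimal, hypothetical sketch. The function names (`generate_asset`, `critique_and_rewrite_prompt`) are placeholders you would connect to whatever tools you actually use (Midjourney or Veo for generation, ChatGPT or Claude for the prompt rewrite); nothing below is a real API from the episode, just the shape of the loop Will describes.

```python
# Hypothetical sketch of the prompt-iteration loop from the episode:
# generate -> inspect -> feed the output back -> ask for a corrected prompt.
# Wire these two stubs to your own tools (e.g. Midjourney + ChatGPT).

def generate_asset(prompt: str) -> str:
    """Stub: send `prompt` to your image/video tool and return a path or URL."""
    return f"render_of({prompt!r})"

def critique_and_rewrite_prompt(prompt: str, output: str, problem: str) -> str:
    """Stub: show the LLM the prompt, the output, and what went wrong,
    and ask it to return a corrected prompt (don't hand-tweak it yourself)."""
    return f"{prompt} [revised so that: {problem}]"

prompt = "Animated hunting-sock static ad, deer moving in the background"
for attempt in range(3):                      # a few passes is usually enough
    output = generate_asset(prompt)
    print(f"attempt {attempt}: {output}")
    problem = "the on-ad copy disappears; keep it on screen the whole time"
    prompt = critique_and_rewrite_prompt(prompt, output, problem)
```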

We also dive into these key takeaways:

  1. How to find the missing creative angles in your account before spending another dollar.

  2. Cut out the middleman: how you can stop using stock photos entirely with AI, without the brand even noticing.

  3. The fastest way to turn a winning static into a GIF without opening any editing software.

  4. Why including every word of on-ad copy in the prompt matters more than being a “prompt genius” (an illustrative example follows this list).

  5. Why this AI tool is a secret weapon for AI creativity (it's literally 10X better than ChatGPT).

  6. The feedback-loop method to improve prompts without manually tweaking yourself into madness (we've all been here).

  7. The one tip that will make your AI people look less “plastic” and more realistic (texture, pores, realism).

  8. How Will used Sora to close a client live mid-call, and how you can too.
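On that fourth takeaway: the episode never spells out an exact prompt schema, so the snippet below is only an illustration of the idea, with field names that are assumptions rather than a documented Veo 3.1 format. The two details it does mirror from the episode are that every word of on-ad copy goes into the prompt verbatim (the sample ad's “Hollow” / “Built for serious hunters” text), and that JSON-structured prompts, which Will prefers for Veo 3.1, are easy to tweak and re-feed through the loop sketched above.

```python
import json

# Every word that appears on the creative, copied exactly.
on_ad_copy = ["Hollow", "Built for serious hunters"]

# Illustrative JSON-style prompt; the field names are assumptions,
# not a documented Veo 3.1 schema.
veo_prompt = {
    "scene": "misty forest at dawn, a buck walking slowly in the background",
    "subject": "hunting sock product shot, camera static",
    "on_screen_text": on_ad_copy,       # keep this text legible for the whole clip
    "motion": "light fog drift, subtle leaf movement",
    "duration_seconds": 6,
}

print(json.dumps(veo_prompt, indent=2))  # paste the JSON into your video tool
```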

This episode is sponsored by Northbeam, the marketing attribution platform that we love here at Scalability School. If you’re ready to cut through the noise, stop guessing, and actually see which ads are driving your business, book a demo at www.northbeam.io/demo, and tell them Scalability School sent you. Join the club. 

To connect with Will Sartorius, you can follow him on X at https://x.com/will_sartorius
or visit www.Selfmade.co

To connect with Andrew Foxwell, send an email to Andrew@foxwelldigital.com

To connect with Brad Ploch send him a DM at https://x.com/brad_ploch  

To connect with Zach Stuck send him a DM at https://x.com/zachmstuck

Learn more about the Foxwell Founders Community at https://foxwellfounders.com/

SUBSCRIBE TO NOT MISS AN EPISODE

Full Transcript

(00:01) For the foreseeable future, in order to differentiate, you need that sort of mushy middle human ground. Use these tools to scrape the internet constantly for your content and constantly be putting that into Claude, not GPT, to analyze. And we always sort of call them PAM clusters: persona, angle, motivation.

(00:19) Like for every persona we're generating, we want to know which angles to utilize and under each of those angles, which motivations to ultimately utilize. I think people get so bogged down with their prompts and I think that's the wrong mentality to have. I'm sure your prompt is fine. That's why when you subscribe to Midjourney or, you know, Veo 3.1,

(00:33) I think it's like the Google Ultra subscription. You can use your prompt an unlimited amount of times and I promise you, you will get something that is in the ballpark of what you're looking for. Editors are stuck in their ways. You know, your strategists are sort of stuck in their ways. So, we did a sort of a competition for each model.

(00:54) So we would say like take your favorite statics,   animate it, let's present in four weeks and  we're going to judge it and whoever gets,   you know, the best output gets, you know,  monetary incentive and then we sort of   leveled that up. Let's create a video. And  then the ending result was creating a full ad. And now let's take a listen to the Scalability  School podcast.

(01:17) All right, welcome to episode   17 of the Scalability School podcast. Almost  20 almost 20 episodes get to say episodes of   listeners. More episodes than listeners. There it  is. That's a new tagline. That's really good. I   actually did see that we just surpassed a big  number of downloads. I'm not going to share   it because it's private information. Uh only  available to our to our illustrious sponsors.

(01:39) But it's but it's like oh wow. Like this is real  and it's not inflated either. We haven't paid for   anything. You know, some of these some of these  places do that which we're not doing. So that's   why we only have tens of listeners, but the people  that are listening are real people. And we have   we're very big on YouTube.

(01:57) I don't know if you ever heard of YouTube, but we're big on there as well. By the way, special guest today, AI expert, CEO of Self-Made, Will Sartorius. Will, glad to have you. Yeah, thanks for having me. Stoked to, uh... And Will is also the AI captain of the Foxwell Founders membership. So yeah, unofficial mod of the channel. That's right. That's right. So yeah, Will, basically we had presentations in Portland.

(02:22) You weren't there for our Foxwell Founders meetup, Brad, but basically everybody... we had all these different tables of, like, you know, Q4 creative, you know, you could get an audit, and Will's was about AI, and the majority of folks were around Will's table the entire time looking at what he was creating. So that's what we brought to this podcast. Yeah. Awesome. Yeah. Stoked to be on for sure.

(02:42) And I   feel like we should like put a uh put a counter  up like the number of times we say Andromeda, you   know, if you're listening, you take a shot. Hell  yeah. Yeah. I got my electrolytes right here, so   I'm ready to I'm ready to roll. Yeah.

(02:58) Um but like  you know, for us where this sort of entire impetus   behind getting really good at creative through AI  came from was well one at the front end, we sort   of perfected using social listening to collect,  you know, Facebook comments, subreddit comments,   Twitter, etc. like built our own Shopify scraper.  And so like now you can plug in a brand and we   can pull all this information like core personas,  core motivators, angles, etc.

(03:18) So that was like the   front end uh that I developed about a year ago. I  think I talked about that in Amsterdam. And now I   was like, okay, so the front end is done. Now we  need to sort of figure out the back end. Like how   can we take the content that we're creating and  sort of amplify it, right? And there's a lot of   different ways to go about this.

(03:37) There's sort of  like one sort of allstar flow that I like to talk   about, but there's also a lot of little tweaks you  can do which are probably best for specific use   cases. Whether it's you have an image and you just  like it's 10 p.m. your editor's offline and you   just like need to resize it, but you only have the  PNG. There's a solution for that.

(03:54) You have statics that are maybe fatiguing and you just want to convert them to GIFs, add a little animation. CPMs on GIFs we always see being a lot better than statics, even for the same image. Generally, in one-to-one A/B tests, GIFs will almost always outperform statics. So just adding a little sort of fuel to the fire, that works too. I'm not a big clone-ads guy.

(04:15) Uh I know folks like to do   that and like sort of take competitors and like  insert their product. Like that's never something   I've wanted to do. But we can sort of talk about  a flow of like finding inspiration on Pinterest,   converting that to midjourney, adding your product  into that image. And then that is sort of like   how I like to clone ads because like at the end of  the day where AI is really helpful is again it's a   barbell. It's like it's setting the foundation for  strategists like which persona, angle, motivation

(04:38) do you want to focus on? And then it's amplifying existing content. Like that messy mushy middle I feel like is so innately human, and unless you want to just revert to the mean and just do what everyone else is doing and contribute to, you know, the sort of slop parade, you know, by all means, but I don't think any of us on this call or, you know, our team are aiming to do that. Um, you know, to that end, I talked to my team sort of about type one versus type two

(05:03) thinking. Like Daniel Kahneman, you know, RIP. If you haven't read Thinking, Fast and Slow, you should pick it up right now; it's the most amazing book I've ever read. But obviously type one thinking is just like, my brain is shut down, I'm just on autopilot. Type two is very much more so, I'm being sort of thoughtful in my approach.

(05:23) And I feel like a lot of folks use AI   for type one thinking like I just going to [ __ ]  write this email like I'm just going to do this   or like oh you know I need a concept GPT help  me write a concept or what have you. And again   like if you want to sort of revert to the mean  and just like be average like do that. But like   if you can be thoughtful about your AI workflow  and again like amplifying existing content and   you know sort of human messy thought then you're  going to be in a much better place.

(05:48) Can you... This episode is brought to you by our friends over at Northbeam, the marketing attribution platform that a lot of us that listen to Scalability School, and the hosts of Scalability School, you know, we swear by and use. Northbeam rolled out something brand new that I think is pretty awesome. It's a real unlock, I think, called Clicks Plus Deterministic Views, and it's the first ever deterministic view-through attribution model.

(06:12) I mean, let's be honest, right? One-day click had us flying a little bit blind. You know the drill, right? Your TikTok or your CTV campaigns are building awareness, clearly driving site traffic and lifting sales across channels, but your attribution dashboard still gives all the credit to Google Search because it was last click. It's like tracking who made the final shot in a basketball game but ignoring the five assists, right, that got the ball there. So, Clicks Plus Deterministic Views flips that.

(06:34) It ties actual impressions, not guesses, but verified impressions, to real conversions. We're talking direct integrations with Meta, TikTok, Snap, Pinterest, and even CTV. And Northbeam uses hard identifiers like order IDs and hashed emails to prove that someone saw the ad, didn't click, but later bought. That means you can finally measure the real impact of upper-funnel campaigns.

(06:56) And the results are pretty nuts. One brand saw 283% more attributed conversions on TikTok. Another got 175% more attributed revenue than they knew even existed. And, you know, another one drove a 159% ROAS lift on Pinterest. So if you're tired of underbudgeting awareness because attribution can't keep up, this is your chance to see the real journey and the full journey.

(07:20) So, Northbeam Clicks Plus Deterministic Views is live now. Head over to northbeam.io/demo to book a walkthrough and tell them we at Scalability School sent you. So, can you give, like, an... So, you gave a couple examples of type one, which was like, okay, this is the email I got. I yapped into GPT with what I think I'm going to respond with.

(07:41) Can you make me sound   like less of a dick? And then I kind of hit  send. So, that's maybe an example of type one.   Maybe you can give an example of of like where  people are using that in in the creative process.   But then like type two which is maybe some of  the examples that we're going to go through   but because I know in like the original version of  your notes here is like that was a big distinction   for you to go through.

(08:00) Um, so maybe we spend just a couple minutes giving some examples before we dive into some of the workflows that you have. Yeah, no doubt. So I would say type one is clone this ad. For me it's, you know, taking someone else's content that may not be applicable to you at all and just effectively using that as your own. Like maybe creative volume is what you're after and you don't really care about anything else.

(08:21) By all means do that, whereas, again, type two sort of comes in. It's like you can outsource a lot of thinking to AI, but I do think, at least today and, you know, for the foreseeable future, in order to differentiate you need that sort of mushy middle human ground. But like use Apify, you know, use Gigabrain. Use these tools to scrape the internet constantly for your content and constantly be putting that into Claude, not GPT, to analyze. And we always sort of call them PAM clusters: persona, angle, motivation.

(08:47) Like for every persona we're generating, we want  to know which angles to utilize and under each of   those angles, which motivations to ultimately  utilize. And so that framework is created for   our strategist internally, right? And so then we  say, "Okay, strategist A, here's your persona,   angle, and motivation. Run wild." Right? Like  we've effectively now created a swim lane.

(09:07) Or,   you know, a better metaphor is we've put the  gutters down in your bowling lane. You know,   you can't sort of shoot a gutterball now. We know  that this is going to resonate with consumers.   But now you have these frameworks. Create a  bigger swing, right? Or, you know, conversely,   look at the ad account.

(09:25) Maybe with this persona  angle motivation combination, we haven't done an   ugly ad or maybe we haven't done like a three  reasons why or we haven't talked about that in   a founder interview and we need to get the founder  on the horn and jam with them about it, right? So,   it's like create these lanes beforehand using AI.

(09:41) Have the humans sort of pinpoint like where we're   missing and like that hole will be filled with  AI. I don't doubt that. Maybe give it 6 months,   12 months, like we're going to be able to identify  where the holes are in your creative output. Like   right now that's an innately human thing and then  from there like do you know sort of do the work   right use editors do post-production work and  then like once you sort of have something that   you're ready to go uh you know amplify that with  AI.

(10:08) I will say one thing that we're trying to do and we've mandated internally is we can't use stock anymore. No stock photos. You know, I got into this dispute with artlist.io. They were like, "We're going to charge you 10K to, like, you know, freaking, uh, you know, up your license." And I was like, "All right, sayonara, sucker." Like, you know, eggs on your face, not mine.

(10:30) Like, I was probably going to continue paying, you know, 120 bucks a month. But every piece of content that we're creating that would be stock is now AI, right? Because now we can create stock with actually the product in it. And, you know, using tools like Seedream, you can actually make the people look a lot less plasticky. And now that Veo 3.1 has start and end frames, like the world is your oyster.

(10:51) You can create whatever content you   want. And so like my long-term strategy, and I  think this should be really anyone's long-term   strategy, is that like you on your teams have sort  of like creative visionaries, right? Someone who's   like thinking like this is a sort of vision. This  is where we need to go.

(11:07) And then we hire people whose entire job is to generate B-roll for that brand, right? Just using Veo 3.1, using Nano Banana, using Seedream. And like now our editors have this plethora of B-roll that they can utilize that looks legit. And you know, if you get clients saying, "I'm averse to using AI," like we get this all the time. And we're like, "Okay, cool.

(11:28) Let's just do one for you and we'll prove   you wrong, right?" And like we have clients that  are doing like around 5, six million a month that   were extremely averse to using AI. We did a video  for them and they're like, "So where did you guys   shoot this?" And we're like, "Gotcha.

(11:44) " So, it was  just like one of these instances where, you know,   again, I don't want to knock on there is a massive  value ad to be, you know, going shooting content,   but like if I just need a shot of me picking up  this coffee cup, you know, or I, you know, Brad,   maybe I needed you picking up that coffee cup,  it's going to be a lot easier for me to just do it   myself. Yeah.

(12:04) You know, it's funny you mentioned  that because literally within the last 48 like   prior to our conversation, but right like over  the last 48 hours, we've had somebody just like   as a role test it out just like building a B-roll  library with with whatever tools all all of the   tools I'm sure we're about to talk about. And it  yeah, I'm sure it'll be an iterative process to   hear it out.

(12:24) But selfishly, that's why I was even  more excited to talk is like we're we're playing   with this now a little bit more than we have been.  And uh I'm going to steal some of this for myself.   Yeah, there's a lot that we can there's I think  the frameworks already that you've gone through   will are really helpful.

(12:38) I mean I think a lot of  it is so much of this centers on people talking   about Andromeda and you know there has to be not  just creative diversity but now it's like you know   we had a question in the founders community this  morning of like all right look how are you even   reworking your creative department to make because  this requires more work now right that we have to   come up with really net new ideas and not just  iterations necessarily upon things so I think we   all feel that squeeze and obviously AI can help  us get there and make that a lot easier for us   so let's talk about so you have you have a number  of different workflows to go through. Um so static

(13:09) the first one you're talking about is static to  I believe you pronounce it GIF like the peanut   butter. Um I'm a GIF guy. I'm a GIF personally. I  don't know. Of course I've always been I've always   been a GIF guy. I just I just still love calling  it GIF. Uh so it's that using MidJourney. So how   does this one work in your opinion? And by the  way if you're not listening if you if you're not   watching this on YouTube this is also is going  to maybe even do some screen sharing here.

(13:33) And   if you're on YouTube do if you're if you're on  YouTube.com you'll be able to see this live but   otherwise if not uh or you'll be able to see this  recorded if not Will can talk about it and talk   through Yeah sure. And yeah um more than happy to  sort of talk through my process as I go through.   And one thing to sort of add you know Brad you  mentioned that like you're building sort of a   team around this.

(13:54) I think that was the number  one question I got in Portland is like how are   you getting your team to do this? Because it's a  lot it's very easy to be an agency owner and be   like guys use these AI tools and everyone's going  to be you know sort of flip you the bird be like   why would I do that like that sounds like a huge  pain right editors are stuck in their ways you   know your strategists are sort of stuck in your  ways so we did a sort of a a competition for each   model so we would say like exactly what we're sort  of talking about now it's like okay we're going to

(14:19) start with Midjourney: take your favorite statics, animate them, let's present in four weeks, we're going to judge it, and whoever gets, you know, the best output gets, you know, a monetary incentive. And then we sort of leveled that up. Okay, take that GIF, now let's put it into Veo 3. Let's create a video.

(14:37) And then the ending result was creating  a full ad, right? And like that was a great way to   get buyin from everyone into the process. And it  made it a lot more fun, too, cuz like we created   teams. Like it was just like there was a lot of  team spirit. So it was that was an unlock for me   because I was just like, why are we not using this  more? And then someone on our team much smarter   than me suggested doing that.

(14:56) So, can I ask some  really sorry I know we're like distracting from   the workflow stuff which I'm very excited to get  into, but do you feel like you have to like very   directly address like your perception of what you  think AI is going to do to their jobs to help get   over that as well? Because like we have we have  a bunch of designers on our team.

(15:09) It's like they   see the tools and they're like anxious about what  that means for them. And basically I tell them is   like if you can figure out how to use this like  you're you're replacing yourself and you continue   to be here. It's like you're just you're just  getting better and you're you're you're leveling   up in what you do. The same goes for me by the  way.

(15:24) There's a bunch of tools that'll be able to do what I'm doing. You know, I used to spend, we used to spend hours building ads. Now we have tools like ad manage that take three seconds to launch 50 ads, you know. So it's like that's perpetually happening. Um, do you feel like that's the case for you? Like do you find yourself explaining this to your team? Totally.

(15:40) I mean, I think I like to sort of belabor the pain points that certain folks are feeling, like we don't have footage of this, or, like, can we get the client to send us this? It's like, well, we can, but now you have that sort of ability to do it yourself. I also will say, and this may be sort of a hot take, I actually think editors are going to become increasingly more important. And I think sort of the top of funnel, again, and when I say top of funnel I really mean like the personas, angles, motivations, like social listening, discovery, like that is going to be AI. So, like I actually think the

(16:13) work up here is going to be more AI. The work down  here will cuz like if you I don't know if you guys   have played with Sora, but like it's just like  vibes, right? You sort of hope you get something   right.

(16:28) If you try to do like an intricate prompt,  godspeed, but if you get something that's like   close and you have a good editor, game over,  right? And so that's what I always sort of say   to my team like you know you are sort of filling  in the cracks and we are solving the pain points   that you know you have brought up many times in  the past. Cool. And everyone's creating something.   Everyone's bringing something new to the table  every week.

(16:48) And it's just like it's a great way to get folks more comfortable with these tools. And then ultimately, like I said initially, our goal is to have creative sort of visionaries. This is the whole ad I want to create. I want to create a 30-second ad of this man going hunting and then he comes home to his family, presents the deer on the table, the kids eat, you know, whatever.

(17:09) And  now our team's going to be way more comfortable   creating that, right? And so like I don't want  to be confined anymore by the existing content we   have on hand. I want our team to be able to sort  of think, you know, not just outside of the box,   you know, sort of outside of any parameter that  had existed previously. A box, what is bigger   than a box? I don't know. Uh whatever, you know,  the prism. universe. Yeah, the universe.

(17:31) I don't   know. Prism. That's that's that's that's electric.  I like that. Yeah. Think outside the prism. Uh,   you know, sort of Pink Floyd style. Go to that  next level. And so, we want to have a world where   I I have this big lofty vision for an ad.

(17:48) Now, our  our our editors coming in and saying like, "Okay,   cool. Like, this is fun as hell. Like, this is up  to me to like come up with this." And like the the   visionaries and the visions are not going to be  too prescriptive. So, like that is where we're   heading. Candidly, you know, for for me and a few  others on my team, like this process is quick,   right? Like I could probably do some B-roll and  that I feel really comfortable with in like 20   minutes.

(18:12) For someone else that's new to this, it may take them like 4 hours, right? So, like that is like a painstaking process. It's exhausting, right? So, how do you sort of overcome that hurdle? And what we've done is, like, for our clients, it's like we're not going to do four AI videos for them a month. We're going to do one, just so you can get a little bit more comfortable. And it's just sort of like, you know, probably a bad analogy, but put the toad in the water, slowly start heating it up so people get more and more comfortable, you know, with getting boiled, I guess. But, you know, they're going to hop out. They're going to be fine. Yeah. I mean,

(18:39) I think it's I think it's you know, a member  asked me at the meetup that, you know, look,   I we have we have two people that are fully  dedicated to this on our creative team. This is an   agency person. this podcast is for seven and eight  figure brand owners, but I think it's I think   it's important nonetheless, you know, how they're  fully dedicated to this.

(18:57) Everyone was like, "Wow,   we're really jealous of that, right?" And and I  actually think that's okay, but I actually think   that's not necessarily the greatest idea because  I think you you want to not have this siloed and   you want to have everybody kind of talking about  be integrated as much as possible and be testing.   And I think that another piece that you didn't  mention, maybe you're doing this, is that you   want to incentivize people to do this, right? You  want to incentivize like, hey, who comes out with   the coolest concept? you know, you are going to  get a financial bonus side of this. And I think

(19:19) that if you have people in house or you yourself  as the founder are creating these ads, you know,   giving yourself a half an hour a day to screw  around with this, right? Like, and and if you're   utilizing Will's workflows, you could see all  that's happening is not necessarily dependent upon   one tool alone.

(19:35) It's just a number of different  things where you're going back and forth between   them and refining the outputs and going from  there. And, you know, I think if you dedicate some   time over a couple, you know, couple weeks, you're  going to get pretty good at this stuff. even if   it's 30 minutes a day. Um, I mean, that's what I  tell myself about weightlifting. I'm like, it's   it's just 60 minutes, you know, and over time I'm  going to get big like Brad.

(19:55) But anyway, so I think that's, I think that's important. I think that, uh... how do you recommend that people continue to stay on top of this stuff? Because it's obviously changing super rapidly; like, you know, I feel like every other day there's some new thing. How do you recommend people stay up to date with it? I mean, obviously your newsletter is a place they can go to, which we can link in the show notes and stuff, but what are other places as well? Yeah, obviously my Friday newsletter, shameless plug, is, you know, all new models. I always go through my workflows there. Um, and I,

(20:21) you know, just sort of backtrack for one second.  How do like you get people excited? I promise you,   if you give your team unlimited access to  MidJourney, they will have so much fun that   they're going to sort of like almost forget about  their other work. Like we have our team going in   there and constantly just like playing around with  it because it really is so much fun.

(20:38) But like, obviously when you're getting much more analytical with a process, like with Nano Banana or Veo 3.1, it's very prescriptive. Midjourney, again, I can't recommend enough everyone subscribing their team to it. You know, create a group email, and this is probably not kosher, but, like, you know, have everyone go onto that group email. It's $60 a month and everyone can request as much as they want, and like that is a great sort of starting point. So, uh, yeah, sorry, what was your second question? My second

(21:04) question is just like talking about where else people can go to keep up to date on stuff. Yeah, for sure. I can share a few Twitter handles that I follow that, you know, sort of talk about this. The first thing I do when a new model is released is obviously I go to Twitter and I say, like, okay, what use cases are people talking about? So, Veo 3.1

(21:26) dropped the day before we were in Portland, before I was supposed to do this, you know, do a demo. I was like, "Shit." Okay. And so I went to Twitter and I sort of figured out, like, what are the use cases here? Like if someone says they're building an n8n flow and they're pumping out a thousand creatives, block that person. n8n is a great tool for in-house workflows.

(21:45) It is a [ __ ] tool for creative output. I'm sorry, because, like, you see our process, right? Like we have to refine the image or the video at every single stage to get something that's actually workable. You are adding so much risk in saying, like, I'm gonna use the first image Midjourney generates and I'm gonna use the first Nano Banana image and I'm gonna use the first Veo 3.1

(22:05) image and, like, that's going to be your output, right? Well, I don't... I never put less than a thousand ads in when I update any campaign. Minimum. It's always a minimum thousand. Yeah, it's just the way that I do it. One ASC per, uh, you know, $1 apiece, that's where I'm at. A 0.1 ROAS target and just let it rip. Yeah, exactly.

(22:33) So, uh a final question that I have will  on a lot of this is um you you foresee that that   the right now you're you're manually kind of  plugging and playing. You foresee that a lot   of this is this like in what a year is all this is  going to be all in one. You're not going to have   to do this or like what's the time frame on this?  Yeah, of course. Like we're all hypothesizing   here.

(22:52) I would say if you don't learn these flows, like, now, you're going to be a little left behind. I would say get really good at these workflows now. In 6 months, maybe a year, the barrier to entry is going to be zero. You'll be able to do effectively all of this on, you know, what will be America's WeChat. I don't know who's going to build it, but like that is going to sort of, you know, be what exists.

(23:10) And like I would be remiss to not even talk about, you know, the OpenAI browser that was just released right now. You can sort of just say, buy me all these different things. So, like, maybe creative will have less relevance in the future in and of itself, right? So, like, that is GTM LLM optimization, which is like a completely different animal.

(23:27) But at the   end of the day like if you are generating creative  right now like I was I read an article the other   day that like Coca-Cola paid $100,000 for an ad  that was generated with Veo 3.1. So you know sadly   we're not contracted with Coca-Cola.

(23:46) But what I  can tell you is that like if you want to sign new   clients and you want to really impress big brands,  do this now and you will lock them in. If you   are on a sales call, go into sora.com, take their  product, say generate a UGC ad with this product,   show it to them on that call and they're going to  sign. Like we do this all the time.

(24:03) The idea is   that like you right now the closer the closer move  always be closing. One call is good stuff. This   is good stuff. I like Anyway, keep going. Sorry  I interrupt you. Yeah. No, all good. So, like I   will say like if you like these these enterprise  companies and these larger companies know they   need to be good at AI, right? They know they're  probably falling behind.

(24:25) And so, like, if you lock in an annual contract or six-month contract now, when that sort of paradigm shifts and the barrier to entry becomes lower, you're already going to be so much further ahead of anyone else that's just joining when that barrier to entry is lower, because you're going to know, you know, how to prescribe certain elements even within a lower barrier to entry, right? And, like, it goes back to sort of type one, type two thinking, uh, and type one, type two sort of AI.

(24:50) Like a lot of people are going to be generating a lot of slop. If you know how to use these AI tools to not generate slop and actually generate something thoughtful with actual, you know, creative prompting and strategy behind it, you're going to be eons ahead. Like when that barrier to entry drops, the slop gates are open. You know, misinformation is going to be rampant.

(25:08) We're   not going to know what's up and down. But what  we will know is that like differentiated content   that doesn't fit that mold will rise to the top  and ultimately win. Where where can I get better   prompting? Yeah. How do I get better prompting?  Again, like I again, this is probably a hot take,   but I don't think you need to really be a good  prompter.

(25:28) You're just asking GPT, hey, this is   what I'm going to do. Write a prompt for this.  Exactly. And that's why like I I think people   get so bogged down with their prompts. And I think  that's the wrong mentality to have. I'm sure your   prompt is fine. That's why when you subscribe to  midjourney or you know Veo 3.1 I think it's like   the Google ultra subscription you can use your  prompt unlimited amount of times and I promise you   you will get something that is in the ballpark of  what you're looking for or conversely you know go

(25:53) to Sora and like you know just vibe it like that's  an option as well but like I think people just get   too hung up on when it comes to creative output  people get too hung up on prompting obviously   briefing is a completely different animal where  prompting actually matters a lot more But yeah,   with creative output, like don't overthink it.

(26:14) The only thing that's important, and I can stop my rant after this: you know, Nano Banana, Seedream, Midjourney, text-based prompts are fine. If you're doing Veo 3.1, JSON prompts work a lot better. So when you're talking to GPT-5, ask for a JSON output. J-S-O-N; if you're not familiar, you probably won't recognize the style if you're not familiar with HTML or coding, but, like, use that, and then you can tweak that.

(26:43) So rather than trying to engineer your prompt yourself, put your JSON prompt into Veo, take your output, put that back into GPT, and do that cycle we did earlier, right? Adapt your prompt based on the output you got, right? Don't go into the prompt yourself and try to tweak it; that will just drive you mad. Like, this system where you're just sort of having a positive feedback loop, I promise you, you'll get a much better output and, like, you'll be, you know, home in time to pick your kid up from daycare.

(27:07) So everybody listening to this will   be a year ahead because they they listen to this.  But I'll be at least a couple weeks ahead because   I'm going to download this and send it to my team  before this goes live. So you know he's going to   be closing deals sitting there prompt to this UC.  That's good stuff.

(27:22) So now let's go into actually   getting into some of these prompts really deeply.  If we want to, you know, if you want to learn how   to do this yourself, my recommendation for anyone  is to always sort of start in MidJourney. It's the   easiest platform to utilize. So, I'll just share  my full screen here.

(27:37) Actually, probably easier to just share my window. Okay, so again, Midjourney is the safest, easiest tool to use. If you don't know anything about AI, get Midjourney. 60 bucks a month. You get unlimited prompts and it does video, it does image. Obviously, Meta did a partnership with Midjourney. Um, so I'm sure we're going to be seeing more Midjourney content in there.

(28:01) But again, like this is always   my recommended starting point because you can do a  lot with it. Um, and so in this sort of exercise,   we can talk about, you know, sort of taking  a static ad, adding some motion to it. And   you can sort of see, I was messing around with  this, you know, a little bit before our call.   One quick thing is when you start with MidJourney,  you're going to want to create a new profile um,   and answer these 200 questions.

(28:26) You know, which do  you like more, which do you like less? It doesn't   really affect your output. it's just, you know,  trying to train its LLM, of course, but this will   give you access to V7 in MidJourney, and that's  the model you ultimately want to utilize. So,   again, where I like to use MidJourney the most,  and I think the safest sort of place to start is   taking existing statics you've created and adding  some motion to them.

(28:47) So, where I like to do this... this is actually the only time I ever use ChatGPT, is with, uh, prompting the sort of content we're discussing now. Everything else I use Claude for. Claude is like a little bit more creative. Uh, but ChatGPT, the main reason I use ChatGPT, and this is really important: so, like, every prompt that has copy in it, you need to include that copy in the prompt.

(29:09) So, like, or every ad, rather, that has copy in it, you need to use that copy in the prompt, and ChatGPT does a better job reading copy on ads than Claude. That's really the main reason. Um, and I'm sure many folks have discovered that in the past.

(29:26) So, uh, you know, here was a Hollow socks ad, uh, you know, built for serious hunters, that I found, you know, in, uh, Facebook Ad Library, right? And so, the first thing I want to do is I want to create a prompt. So, you know, let's just sort of say, like, I'm exhausted. I've had a long day and I say, you know, uh, I want to take this static ad and add animation to it. I will be using Midjourney. Give me some ideas.

(29:55) So  this is generally you know sort of brainstorming   like um where do where do we begin? Um and so like  in this case if it's built for serious hunters we   probably need to like add some animation there.  And like of course we could do something simple   with with you know ruffling leaves or what have  you. Um it doesn't really matter. Um so we can   just sort of see if like anything here is sort  of interesting at all.

(30:19) uh you know, slow motion,   fog drifting, atmospheric lighting, etc. into  the hunt. This is I sent you guys this before,   but like you know, maybe having some animals  moving in the background. And so, like, if I   had a specific idea here, like I would obviously  skip this step. Uh but this is always generally   a good place to start. So, we'll do what we did  earlier. Say, uh actually, let's animate it.

(30:42) So, there are a few animals in the background. We want to create a prompt for Midjourney 7. Please include all of the copy that is on the static ad in the prompt. So now this will give me something, you know, semi-usable, theoretically. And nothing like the sycophantic GPT-5. I always miss it when I come back. Uh, okay, cool. So now we have our prompt, right? So I'm just going to copy this.

(31:12) Copy this. I'm going to go into Midjourney here. Uh, I'm honestly going to get rid of all of this and then just go into settings and do 9:16. Make sure our model is on version 7. And then I'm going to add... So Will, for those that are listening, you're saying getting rid of this, and what did you delete there? Sure.

(31:33) So, uh, GPT5  likes to add on all these sort of extra things,   right? Like aspect ratio 9 by6, version  seven, style raw, chaos 10, motion three,   camera move, slight pan, weird zero. like these  are things that you know you can customize for,   but ultimately I it's anything with sort of like  the two dashes before them, right? And so like if   you again like are listening to this, what I would  recommend doing is within um within midjourney,   there's a settings button and you can choose  all of these settings yourself within the   settings button. So here I'm able to choose my  image size. I can just like square, landscape,

(32:05) portrait. Uh, we'll just go 9:16 here. Stylization, I just generally use standard here. The most important thing is, uh, version seven. Um, and I always like to do four generations. What I have historically said is that, like, one out of five will be good for a GIF. One out of 10 videos will be good.

(32:31) So, uh, you know, we'll prompt this and you can see within my prompt it says Hollow at the top and it says built for serious hunters. So in my prompt, I'm copying the copy over, um, which is, you know, definitely an essential part. And I just realized I didn't add the image first, so let me just do that. So now we have, uh, the starting frame, uh, and the end frame. So now this is just going to make an assumption. So I'm just going to cancel this one.

(32:51) Cool. Any questions so far while this generates?  No, honestly super straightforward. Yeah, it's   really not too complicated. So again, like this is  just a a way to like take your existing statics,   multiply them by two, 5, 10. Like we can go back  to our GPT5 and like maybe there are some other   cool ideas in here, right? Like um light fog  drifts, tiny particles, like you could do more   of like a cinemagraph style, right? Uh so it's  again the ability to take your existing content,   amplify it uh for uh for different audiences, if  you will. And in this one, it just put like a kind

(33:26) of put a deer walking behind it, which is pretty  cool. Um, and then what the other ideas that I   came up with are looks like some sort of British  hunter. Yeah. And and so yeah, this was there's an   there's like an alpaca idea that I came up with.  So yeah, it's kind of cool. Totally.

(33:44) And you know,   I actually did this sort of similar prompt  earlier. And so we can sort of see here, you know,   the outcome of that, right? We have these sort  of four versions with uh you know the buck in   the background. A few of them has have animation.  Obviously the copy was eliminated from this one.   So I would scrap that. But like something like  this you know I would certainly run.

(34:03) And you can   see the copy rendering is very strong right? Like  there's no sort of fuzziness. Uh everything sort   of comes across super clean and like now I have  a GIF that I just created in what uh you know 12   seconds that I can run as an ad. Should uh send it  to Zach and let it rip. See if it works. Yeah.

(34:22) So, here we go. Like, here's another option, right? Those are wild. Yeah. So, now we have a little bit more animation in the background. And you can see here we lost our copy, right? So, like I said, one out of five will turn out well. The rest will sort of be mid. Yeah. But honestly, that's not even that bad, right? Like the built for serious hunters is up long enough that you might see it, and then you get to see the deer walking more clearly behind.

(34:41) I like how it's rotating. Yeah, the sock is rotating. That's kind of cool. That's kind of interesting. Yeah, it's clean. I like that. Yeah. I have a question about that quick before we move on though. Do you find that you need to, like... I mean, the script that you have is not overly simplistic, right? There's a decent amount of detail to that, but is there a balance between how much you should be giving it, or does what you get back from GPT... like, does GPT do a pretty good job at keeping it to the essentials? Yeah, it's funny you say that. I think people overengineer their

(35:10) prompts and they think, like, this little tweak will change that, this little tweak will change this. I'm very much of the mindset that your prompt is fine. Like, what GPT spits out to you is good. The reality is that, you know, sort of like human beings, AI is exceedingly inconsistent. So if you prompt that same thing again... a great part of my Midjourney unlimited, 60 bucks a month.

(35:32) So   I could just reprompt this thing you know 20  times eventually I'm going to find something   I like. that's going to be a lot faster and  a lot less of a headache than me trying to,   you know, prompt it and then I'll be like, "Oh,  well, this prompt worked perfectly." But that   was probably just the reality of you got a good  output, you know, luckily with that prompt.

(35:50) So,   I would say your prompt is fine. Unless you're  like obviously, you know, there's something   really explicit in here that doesn't make any  sense. Um, like in this case, if I wanted the   copy to sort of stay on longer, what I would do  is I would go back to my GPT. I would download   this as a GIF and then I would say I mean we  can do it. Um I would download this as a GIF.

(36:13) I would go back to my, uh, my GPT chat and I would say, this was the output. The copy disappears. Um, can you correct the prompt to have the copy remain on? And so, like, that's how I would prompt engineer, right? I would take your output, I would put it back into GPT and say, like, this is what I got. This is what I want to fix.

(36:40) And like  this will make more sense when we sort of like do   product placement because like you sometimes you  just like really need to isolate one component.   Um and so in this case like what I did for those  listening is like I you know downloaded the GIF I   put in the GPT. I said this is the output the copy  disappears. Can you correct the prompt to have   the copy remain on screen for the entirety of the  animation? So now I'm going to go back.

(37:00) I'm going to do sort of the same process. I'm going to delete our little double dashes that have the AR 9:16, uh, you know, the aspect ratio. I'm just going to make sure everything looks good here. I'm going to add our starting frame with the built for serious hunters. And then I'm just going to reprompt it. Um, so yeah, that's sort of the GIF flow.

(37:21) And like there's so much you can do here, right? Like this is your playground. Uh, you know, if you were to sort of scroll down in here... I would say my team uses this constantly. Like, this was an example of an ad we created in Nano Banana and then we wanted to add some motion to it, right? Like, made the fire, made the guy move around, like, you know, cracking a can, you know, what have you. Like our team is in Midjourney all the time.

(37:41) It's just the easiest. It's the lowest barrier to entry. And, you know, we'll get here in a second, but also if you need to replace a product, it's a great way to create a good product backdrop. So, like... I will say Midjourney sucks at product rendering; like, that's why, you know, we don't use Midjourney for anything product specific.

(38:03) It's really for vibes or for creating GIFs, but Midjourney is best as sort of step one. So let's sort of, you know, move into sort of workflow two. So next, you know, next sort of step in the process, let's talk about sort of product insertion. So, uh, I was doing this before. So again, like, what I like to do is go to Pinterest, and it seems like, you know, for Zach and his brand, like, hunters are very much a big part of, uh, you know, who he's targeting. So if I

(38:26) wanted, you know... I did this before, but, like, hunter, you know, wearing socks, right? And this was my sort of reference image. Uh, maybe there's something a little bit better. Um, let's just say maybe we want something a little bit more vintage.

(38:43) Uh, I would say Zach's probably looking for something more like 'Murica. Got it. Okay. You know what I'm saying? Like a bearded, beefy man, you know? Yeah. Somebody like... but with a full... Yeah, somebody like Brad but with full facial hair, that loves guns and loves America. Got it. Loves his wife. Maybe a little more of a gut. Drives a truck. Loves a fresh beerski. Okay. Loves a burger.

(39:12) Do you   need more? Definitely one right there. The tobacco  field. This one. Left one more. That was actually   both. Both of those are vibes out. This is This  could be our guy. That seems like that's my guy,   honestly. All right.

(39:28) So So for those of you that  are listening, it's a it's an older gentleman that   uh is wearing socks. He's got jeans on and he's  not smiling um at all. And this is actually just   so we don't get sued. You know, this is actually  not the image we're going to use. So, what we're   going to actually do is we're going to, you know,  we're going to save this image into the void.

(39:46) Uh, and then we're going to go into this little nifty tool, a free image-to-prompt generator. Love this guy. Uh, and now I'm going to upload that image that we just found. Oops, not that one. Here we go. Here's our friend. And I'm going to generate a Midjourney prompt for him. Right.

(40:08) So, obviously you can do this in GPT as well, but I find that, like, with this tool you get, you know, five free a day, and it just gets it really, really close. So, I'm going to take this Midjourney prompt that we just generated with imageprompt.org. Going to go back to our friends in Midjourney. We can also look at these. Oh, look. Our copy is now staying the entire time. Like, great. We just tweaked that little thing. And now we have an ad.

(40:25) Maybe that one's a little wonky, but you sort of get the idea, right? Like, all we did was we took our prompt, we put the output in, and we sort of, uh, cleaned it up a bit. Okay. So now I'm just going to take that prompt that we just generated on imageprompt.org. I'm going to do the same thing. I'm going to do 9:16,

(40:41) uh, you know, version 7, whatever. And then I'm going to generate that. So this image that we generate is going to be our baseline image that we utilize. Um, and what we're going to do is we're going to take, uh, our image, and we'll probably have to reprompt this because, uh, we need to be able to see his socks a little bit more.

(41:06) Um, but we're going to be taking this image and this is going to be our baseline. And another thing that Midjourney doesn't do so well is people look too, like, Play-Doh-y, like they're too plastic. It's like very clearly AI. And so we're also going to be using, um, you know, Seedream to clean that up. So you can sort of see, based on that image, this is what we got. So I'm going to actually take one of these images.

(41:28) Uh, I'm going to do exactly what I did before. I'm going to go into our GPT chat. I'm going to paste that in here. I'm going to paste our initial prompt in here. And this is an image of an older gentleman sitting in front of a bunch of guns, then standing next to somebody with a flannel. And he's downloading the image. And now you're taking it into a different place, into ChatGPT.

(41:53) So you   went you went Pinterest to the free the image to  prompt generator and then you went image to prompt   generator into uh midjourney. Midjourney kind of  missed the socks. It didn't generate exactly what   you wanted with the socks showing obviously and  that's kind of what the product is.

(42:07) Uh so you're   taking that back to GPT and you're asking it to  show the socks better. Right. Bingo. Exactly. Oh   yeah. Nailed it. Okay. So, let's redo this. Let's  see what we get. No, socks are the main event   here. Okay, nice. You can tell we're getting  some We're getting some movement. It's a real   guy's guy. This guy loves to crush Budweiser after  work. Yeah. Yes. I like what I'm saying.

(42:32) He would   he between the choice of buying a new truck  or a older F-150, he would buy the older one   cuz that's when they that's when they they made  them better. The 1995 whatever version. Andrew,   that might be you in like 50 years. Yeah, it  looks pretty looks I mean what we're looking   at here for those of you that are listening is a  photo of an incredibly handsome man.

(42:54) He could be   a model actually like a handsome older man. And  uh we're going to put some socks on him. Uh some   some hollow socks of course because those are the  best. And because this model/handsome man loves   hunting and he likes to have the ability to have  his feet be dry, that's where we're giving him   holo socks. Totally. And it looks like we have a  little some horse saddles in the background.

(43:16) So   maybe we also maybe want to prompt this one more  time to add some, you know, rifles or, you know,   not a big gun guy, but maybe that's rifles. I  mean, what do people shoot with? Bows, guns. Yeah,   I'm not sure. Okay. I'm not sure either. I'm not  I'm I'm the opposite person to ask about hunting.   Okay. Okay.

(43:41) Um, yeah, me too. I live in New York City. Okay. So let's redo this one more time and say, uh... and again, every time, what I'm doing when I'm refining my prompt is I'm just taking the image that we just generated in Midjourney, putting it back into GPT-5 and saying, uh, in this case I'm going to say, in the background I would like more, you know, hunting paraphernalia. Nice. Um, so I'm just going to redo this, you know, one final time, and hopefully... I mean, what's interesting about it to me is, like, it goes back and forth between all these different tools and you're using them to just refine one another. Um, which, like you said, it's obviously going to be in one place,

(44:15) but I think what's cool about it is it actually is putting together really good outputs, you know. So, let's say that I run one of these with this old guy. Will, just to be clear, you have to say this is created with AI in your ads. If it's utilizing some, like, person, you have to disclose that from an FTC standpoint, right? Yeah, I believe so.

(44:37) I   think that is uh you know that you have to say  this is AI generated. Same thing you'd have have   to do like if you're using UGC actor, right? Like  you'd have to say paid actor sort of same idea,   right? You have to just just FYI to everybody out  there that you do have to do that. Yeah. Yeah.   Everybody totally does that too, especially with  UGC creators. Oh, totally. Oh, everyone does this.

(44:58) Of course, it's 100% compliance across the board.  But I do know that the FTC, you know, we all   follow that that lawyer guy on on on Meta or X. Uh  I mean, it's case after case of like, hey, they,   you know, they utilize this, they didn't disclose  this, like this stuff's coming out more and more.

(45:15) And if you're a bigger company that's scaling,  you just really got to see why to make sure you're   okay. For sure. Okay. So, I actually feel great  about this image. I feel like there's, you know,   we're sort of hitting the nail on the head.

(45:28) We can  see the socks really well, so being able to swap   the product out will be great. He's got some camo  gear, you know, he's got some buck horns. Uh he's   got, you know, some rifles. Uh so yeah, I feel  pretty good about this. And again, you can sort of   see like if we zoom in on on our friend here, like  it still feels like a little plasticky.

(45:45) So that's another thing we're going to have to adjust for. Like, especially these ones. These are much more offensive; like, you can very clearly tell these are AI. This guy is actually not horrible. Um, but ultimately, you know, we will want to sort of clean that up. So, okay, next step in the process. I'm going to download this image. And here, let me actually share my full screen here.

(46:05) So you can see me do this. Okay, I've now downloaded this image. What I'm going to do now is... is this our friend? I'm going to open it in Preview, then go to Tools, Annotate, Rectangle, and highlight just his socks. I'm going to do it for both socks. Okay. This is some mastery [ __ ] right here.

(46:28) I've done a lot of what you're talking about, thanks to you teaching me, but I haven't done this, which is literally just drawing a box around the socks to say, "Hey, this is what I'm looking for." Exactly. And the reason we're doing this is that we actually won't use the boxes when we're generating the image; we'll use them for GPT to write a succinct prompt that ensures nothing changes but the socks. And so we'll sort of see what that looks like.

(46:56) So, we're going to go back to the GPT we've been working with. We're going to grab our little friend, the hunter. Big manly friend, sorry. We're going to put that in here. And then I'm also going to grab a photo of Hollow socks, so I'm just going to grab something without copy on it, I guess.

(47:18) Like, is this... This is probably the one that's advertised for hunters, so maybe I'll just save this one. It's a nice one. Yep. Orange and black high-top crew. Yep. Perfect. So what I'm going to do now: we have our hunter friend, I'm going to take this Hollow socks image and say, "Okay, next step. Please write a Nano Banana prompt to put the Hollow socks on him. Please include all copy that appears on the sock."

(47:54) "Please note we will not be uploading the red boxes, so our prompt needs to be succinct enough to change just that element and nothing else." And then... I don't think I'm missing anything there. So let's do that.
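For anyone who wants to script this step instead of doing it in the ChatGPT UI, here is a minimal sketch of the same idea using the OpenAI Python SDK: send the annotated render plus a clean product shot and ask the model to write the succinct edit prompt. The model name and file names below are placeholders, not what Will used on the call.

```python
# Minimal sketch: ask GPT to write a succinct image-edit prompt from two reference
# images (the annotated hunter render and a clean product shot of the socks).
# Assumptions: the model name "gpt-4o" and the local file names are placeholders.
import base64
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def as_data_url(path: str) -> str:
    """Base64-encode a local PNG so it can be sent inline as an image."""
    with open(path, "rb") as f:
        return "data:image/png;base64," + base64.b64encode(f.read()).decode()


instruction = (
    "Please write a Nano Banana prompt that swaps the socks in the first image "
    "for the Hollow socks shown in the second image. Include every word of copy "
    "that appears on the sock. Note: the red boxes are only annotations and will "
    "not be uploaded, so the prompt must be succinct enough to change just the "
    "socks and nothing else."
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": instruction},
            {"type": "image_url", "image_url": {"url": as_data_url("hunter_annotated.png")}},
            {"type": "image_url", "image_url": {"url": as_data_url("hollow_sock_product.png")}},
        ],
    }],
)

edit_prompt = response.choices[0].message.content
print(edit_prompt)  # paste this into Nano Banana (for example via fal.ai)
```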

(48:23) Oh, and I will be uploading both the image of the elderly gentleman and the Hollow socks. Okay, cool. So now, let's go back to our image. Maybe I'll save this one as... or, now that we have this in GPT, we can just delete this. So I'm going to save this. Save this. Okay, cool. This one is for Nano Banana rather than Midjourney; sometimes they'll think you're doing one thing and then another.

(48:47) Again, we need to pinpoint and change just the socks. Okay, this is a little short. I'm not super bullish on this, but we'll see what happens. Okay. So now I'm in a tool called fal.ai. If you've ever spoken to me about AI creative, I've definitely mentioned this to you. It's a Higgsfield competitor; effectively, it just combines all of the AI models into one place.

(49:13) So it's very easy to use anything. The only tools I don't use in fal, actually there are two: Midjourney, because Midjourney doesn't have an API, so fal can't offer it, and Veo 3 and Veo 3.1, which I use directly through Google DeepMind in what they call Flow, because there, if you pay like 120 bucks a month, you get unlimited Veo 3 Fast, whereas in fal it's sort of pay as you go.

(49:43) Okay, so now we're in fal.ai, in Nano Banana. I'm going to put our prompt in here and add our images. Oh, I don't know why this is showing up like that. I'm going to generate four images, because again, I would say about one out of five is okay. Aspect ratio 9:16, and I'm going to run it. And so again, for those of you listening, I did not include the little red boxes around the guy's feet.
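If you would rather hit Nano Banana through fal.ai's Python client than the web UI, the call looks roughly like the sketch below. The endpoint id, argument names, and result shape are assumptions based on fal's published model pages, so check the current schema before relying on them.

```python
# Rough sketch: run a Nano Banana style edit through fal.ai's Python client.
# Assumptions: the endpoint id "fal-ai/nano-banana/edit", the argument names, and
# the shape of the result are illustrative; verify against the model page on fal.ai.
import fal_client  # pip install fal-client; expects FAL_KEY in the environment

# Upload the local reference images and get hosted URLs back.
hunter_url = fal_client.upload_file("hunter_midjourney.png")
sock_url = fal_client.upload_file("hollow_sock_product.png")

edit_prompt = (
    "Replace only the socks on the man with the orange and black Hollow crew socks "
    "from the reference image, keeping every other element of the scene unchanged."
)

result = fal_client.subscribe(
    "fal-ai/nano-banana/edit",      # assumed endpoint id
    arguments={
        "prompt": edit_prompt,       # the succinct prompt GPT wrote
        "image_urls": [hunter_url, sock_url],
        "num_images": 4,             # roughly one in five comes out usable, so batch
        "aspect_ratio": "9:16",      # assumed parameter name
    },
)

for image in result["images"]:       # assumed result shape
    print(image["url"])              # download and review each candidate
```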

(50:09) The only thing the red boxes are used for is to make sure your GPT prompt is right. So... oh yeah, looks like he's wearing some Hollow socks there. Yeah, but again, this is not a great first output, because it just says "Hollow," and as you can see, basically nothing else changed. So now I would go back to our GPT, copy this image, paste it in here, and say, "We need to include more detail in our prompt, as the output did not match the sock exactly. Please analyze the SKU image of the

(50:51) sock and include that detail in the Nano Banana prompt." You know, what's interesting about this is that it's utilizing all these tools in combination with one another, and it's also giving some really good outputs very quickly. We've only been doing this for essentially 30 minutes and we already have a number of things we can use. Okay, so again, not perfect.
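That generate, inspect, feed-it-back loop is easy to wrap in a helper if you are driving the whole thing through the API instead of the ChatGPT UI. A small, hypothetical sketch follows; the helper name, model id, and file names are made up for illustration.

```python
# Hypothetical sketch of the prompt-refinement loop: show GPT its own output,
# say what is wrong, and get a tightened image prompt back.
import base64
from openai import OpenAI

client = OpenAI()


def critique_and_rewrite(prompt: str, rendered_png: str, problem: str) -> str:
    """Send the last render back with a note about what missed; return a new prompt."""
    with open(rendered_png, "rb") as f:
        data_url = "data:image/png;base64," + base64.b64encode(f.read()).decode()
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": (
                    f"Here is the image generated from this prompt:\n{prompt}\n\n"
                    f"Problem: {problem}\n"
                    "Add more detail to the prompt so the problem goes away, "
                    "changing nothing else about the scene."
                )},
                {"type": "image_url", "image_url": {"url": data_url}},
            ],
        }],
    )
    return response.choices[0].message.content


# One pass of the loop: the generated socks did not match the SKU photo.
new_prompt = critique_and_rewrite(
    prompt="Replace only the socks with orange and black Hollow crew socks...",
    rendered_png="nano_banana_output.png",
    problem="The output did not match the sock exactly; analyze the SKU image and include that detail.",
)
print(new_prompt)
```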

(51:15) I would probably want to prompt this a few more times to get the socks exactly right. But let's just say we were happy with this, so we can switch topics. The most important thing... well, not the most important thing, but an important thing, just a little bit of a hack, is making him look a little less plasticky. And that's what I use the Seedream model for.

(51:37) So, and we can include, Will, by the way, for podcast listeners, notes from you so that people can walk through this a bit themselves, the things you've covered and a couple of other ideas. For sure. Just so they can guide themselves through it. Definitely.

(51:57) And again, this is one of those things where you just want to do it yourself; that's the best way to learn. Watching me do this is probably helpful, but it's only a starting point. So, there's one very simple prompt to make people look a little less plasticky. I put it in our notes over here, but it's just this: image_3984.CR2. As I understand it, the Seedream model was trained on Canon RAW files.

(52:23) So when you include the .CR2 within the prompt, it makes it HD. And then I say, "Please upscale this person, showcase pores, etc., and make them feel less plastic." And so again, let's do four images, and we'll do portrait 9:16. Seedream takes a second, so I can pause there and we'll show you the output.
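For anyone who wants to try that de-plasticking pass outside the web UI, here is a rough sketch against fal.ai. The Seedream endpoint id and parameter names are assumptions, and the fake RAW filename is simply the trick described above; check the model page before running it.

```python
# Rough sketch: realism pass, leaning on the fake Canon RAW filename (.CR2) trick
# so the model pushes toward raw-photo texture (pores, natural skin detail).
# Assumptions: the endpoint id "fal-ai/bytedance/seedream/v4/edit", the argument
# names, and the image_size value are illustrative; verify against fal.ai's docs.
import fal_client

hunter_url = fal_client.upload_file("hunter_with_socks.png")

realism_prompt = (
    "image_3984.CR2 "  # the RAW-filename hack from the episode
    "Please upscale this person, showcase pores and natural skin texture, "
    "and make them feel less plastic. Keep the clothing, socks, and background unchanged."
)

result = fal_client.subscribe(
    "fal-ai/bytedance/seedream/v4/edit",  # assumed endpoint id
    arguments={
        "prompt": realism_prompt,
        "image_urls": [hunter_url],
        "num_images": 4,
        "image_size": "portrait_16_9",    # assumed value for a 9:16 frame
    },
)

for image in result["images"]:            # assumed result shape
    print(image["url"])
```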

(52:50) It'll probably take a minute or two, and we can switch topics in the meantime. Yeah, I'd be curious, before we switch to the operational stuff and come back to that, why don't we list out some of the workflows you have with the specific tools? So you started with the static-to-GIF, the peanut butter static to GIF, with Midjourney.

(53:10) The next thing you did... I mean, you kind of combined some of those things, but you also used Nano Banana for product insertion, and fixing the plastic nature of them is using Seedream. Is there anything else with those three tools that's worthwhile to chat on, or do you want to quickly mention the other couple that you've got? Yeah, so Seedream, again, I would say has two really good use cases.

(53:30) One is making people look like people, and two is image resizing. It's a very simple prompt: if you have a 9:16 image, upload it to GPT and say, "Please read all of the components of my 9:16 image," take that prompt, put it into Seedream, and change the aspect ratio directly in Seedream. I actually did this ahead of our call for these guys, and I can show you quickly. So this was a 9:16 image, and I said, "Please resize

(54:00) this exact ad to one-by-one square format while preserving every visual element, composition, and text hierarchy, keeping the two black and orange Hollow socks, you know, one labeled 'over the calf,' etc." What I did is I uploaded my 9:16 image into GPT, had GPT read every piece of copy on it, and then included every piece of copy in the prompt, as well as the different components.

(54:21) So, of course, some of these elements are going to change a little bit, but there we go, right? That's a perfect one-by-one image of that exact same thing. So if you just don't have the time, energy, or effort to quickly resize something, sometimes we'll just do this, and it works very well.
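Scripted end to end, that resize workflow is two calls: one to have GPT read every piece of copy and describe the layout, and one to regenerate the ad at the new aspect ratio. The sketch below is an illustration only; the model names, endpoint id, and parameter values are assumptions.

```python
# Rough sketch of the resize workflow: (1) GPT reads all on-ad copy and layout from
# the 9:16 ad and writes a prompt for a 1:1 version, (2) an image model regenerates
# it square. Model names, the fal endpoint id, and parameter values are assumptions.
import base64
import fal_client
from openai import OpenAI

client = OpenAI()

with open("hollow_ad_9x16.png", "rb") as f:
    data_url = "data:image/png;base64," + base64.b64encode(f.read()).decode()

read_back = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": (
                "Read every piece of copy on this 9:16 ad and describe its layout, "
                "text hierarchy, colors, and products. Then write a prompt that "
                "recreates this exact ad in 1:1 square format, preserving every "
                "visual element and every word of copy."
            )},
            {"type": "image_url", "image_url": {"url": data_url}},
        ],
    }],
)
square_prompt = read_back.choices[0].message.content

ad_url = fal_client.upload_file("hollow_ad_9x16.png")
result = fal_client.subscribe(
    "fal-ai/bytedance/seedream/v4/edit",  # assumed endpoint id
    arguments={
        "prompt": square_prompt,
        "image_urls": [ad_url],
        "image_size": "square_hd",        # assumed name for the 1:1 HD output
    },
)
print(result["images"][0]["url"])         # assumed result shape
```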

(54:43) The one caveat is to make sure you're using the HD feature directly in Seedream. Cool. So now we go back to our guy here again. He looks a lot better, right? It actually feels a bit more human. I think this one's probably the best. It's just a great way to get more consistent coloring across the board.

(55:05) And so if I were to go one step further, my final step in this process would be taking this image, going to Veo 3.1, and using it as a start frame. Then I would take this image as well, upload it back into Nano Banana, and have this man, you know, hunting, and use those as the start and end frames. And now I have a great piece of B-roll of the man sitting in his shop.

(55:27) You know, maybe he's moving a little bit, and then it transitions to him walking around in the woods with the socks on, actually hunting doe or whatever it is. Yeah. Or he's pulling up the sock, putting his boot on, something like that. Yeah, exactly. So all you would really do is take this image we just generated in Seedream, put it back into Nano Banana, and say, "Using this man, create XYZ scene." And now you have an end frame.
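As a very rough illustration of that start-frame/end-frame step, here is what an image-to-video call can look like with Google's google-genai SDK. The Veo model id and the last_frame config field are assumptions based on current documentation, not something shown on the call, so verify them before use.

```python
# Very rough sketch: Veo image-to-video B-roll from a start frame (and an assumed
# end frame). The model id and the `last_frame` field are assumptions; check
# Google's Veo documentation for the current names before running this.
import time
from google import genai
from google.genai import types

client = genai.Client()  # reads GEMINI_API_KEY from the environment


def load_image(path: str) -> types.Image:
    with open(path, "rb") as f:
        return types.Image(image_bytes=f.read(), mime_type="image/png")


operation = client.models.generate_videos(
    model="veo-3.1-generate-preview",          # assumed model id
    prompt=(
        "The hunter stands up from his workbench and walks out into the woods, "
        "camera staying low so the socks remain in frame."
    ),
    image=load_image("hunter_in_shop.png"),    # start frame (the cleaned-up still)
    config=types.GenerateVideosConfig(
        aspect_ratio="9:16",
        last_frame=load_image("hunter_in_woods.png"),  # assumed field; end frame from Nano Banana
    ),
)

# Video generation is a long-running operation; poll until it finishes.
while not operation.done:
    time.sleep(15)
    operation = client.operations.get(operation)

video = operation.response.generated_videos[0].video
client.files.download(file=video)
video.save("hunter_broll.mp4")
```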

(55:53) So you can create a whole piece of B-roll with the person wearing the proper socks.

Yeah, I mean, I think there's a ton to say here. Will, appreciate your time, appreciate you being here. We'll share your document in the show notes on scalabilityschool.com. It'll also be in the show notes on youtube.

(56:20) com/scalabilityschool, where you can find us. Or you can email brad at sexybrad.net. No, just kidding, that's not a real email. You can email me at andrew@foxwelldigital.com and I'm happy to connect you with any of the resources you need. Will, a banger episode. Yeah, thanks for having me. Really appreciate it.

This episode is brought to you by Homestead's email and SMS service.

(56:48) Honestly, we've done an episode with Jacob from Homestead. He's the man, and he's got to be one of the smartest email and retention operators on the agency side, hands down. I look forward to the tweets he puts out basically every single month with a recap of the designs, and as amazing as the designs themselves are, it's the actual tactical behind-the-scenes work where he just knows what he's doing.

(57:11) And the team at Homestead really know what they're doing on email and SMS. If not the number one retention agency in the DTC space, they're very close to competing for it. Just killing it. Yeah, absolutely agree. You know, if you're looking at your emails thinking, "These are terrible," and you know you could optimize the flows and do SMS better, you've got to look at the Homestead team.

(57:36) So check them out, reach out, and they can definitely help make your entire email, retention, and SMS program so much better. And I'm not just saying that; we've all seen the work. I actually had a member who sat down with Jacob at the founder meetup we had, and he came over to me and said, "I just talked to him for 10 minutes and he's literally revolutionized my entire email department."

(57:59) So, agencies are modeling themselves off of what Jacob is giving them for advice. Anyway, onwards to hopefully working with them.

The only way that we grow this podcast is by you sharing it with your friends, honestly. Reviews are really meaningful, but they kind of don't do a lot for the growth of the podcast anymore.

(58:24) So sharing YouTube links, sharing Spotify links, sharing Apple links, whatever we call it in the podcast app now: anything you can share, the better off we're going to be. Guys, anything else you want to say on this? Yeah, please go check us out on YouTube. Rack up those views for us; we'd love to see it. And then subscribe. Make sure to subscribe on YouTube as well.

(58:42) And I relentlessly refresh the YouTube comments because it dictates my mental health for the day. So please say something nice about all of us. Thank you, everyone. Thank you for listening.
