Hey everybody, Nick here. In this video I'm going to build a completely automated, end-to-end AI newsletter system that sources posts autonomously on Reddit, curates them using AI filters, summarizes them, writes new content based on them, and finally puts everything into a newsletter format and pushes it out through MailChimp (although you could use whatever you want) so it gets seen by real people. This was a viewer request and I had a lot of fun putting it together, so if you have any other systems you want to watch me build, by all means drop a comment down below. If it's a good one I haven't done yet, I absolutely will record a video just like this. Appreciate the time; let's get into the system.

So I haven't actually built the system out yet. What I want to do is build this alongside you. The reason is that a lot of the content I've found on YouTube about automation unfortunately only shows you the finished product; it never really shows you the steps you need to take to get there. I always think of it like an engineer trying to improve his engineering skills by looking at a picture of the Empire State Building and going, "awesome, I think I can build that." There's a lot you don't know during the discovery process, and my purpose in these videos, and on my channel more generally, is to show you what that actually looks like. So we're going to do some build-alongs, and I'm going to figure these things out as they come. I may hit some stumbling blocks.
I may have some issues, and that's okay: I want you to see what a real building process looks like. One of the reasons I'm doing this is that I've heard from a lot of my viewers that they tend to like this sort of content, so if you don't, let me know. Anyway, I've spent around five minutes thinking this through, and here's how I believe the system is going to work. We're going to start by getting posts from some popular subreddits. I've done some thinking and I want to do my newsletter on AI, so this will be a general-purpose artificial intelligence newsletter called Loop. Once we have the posts sourced from Reddit, we obviously need to scrape them using a service; I have a couple of services in mind that I'll run through in a moment. Then we're going to filter posts based on the subject and on some other metrics to ensure relevance. By other metrics I mean the data you tend to get from Reddit, like upvotes, downvotes, and comment counts, which we can use as a second-order filtering mechanism to tell us whether a post makes sense to include. From there we're going to store posts in a database, which is really just a Google Sheet, with a unique ID. The idea is that the second you have some object permanence, the second you start storing data somewhere you can retrieve it later, you open your application up to dozens if not hundreds of other cool things you can do, so we're future-proofing as well. Then, every week or however often you run your newsletter (in my case every seven days), we're going to grab a certain number of posts, use AI to summarize them, and use that to write a standalone newsletter. We'll then add the copy to a newsletter platform like MailChimp. I picked MailChimp because it's the most accessible one from my previous projects: I know they have an API endpoint we can call to populate a newsletter template, and not all newsletter platforms do. And yeah, this is how I think things are going to go. Reality is often deceptive, and as you may remember from my other videos, some of these steps might have to be swapped around, and we might have to do a couple of other things I'm not anticipating.
But this is more or less how I think it'll work, so let's dive in with the very first step: getting posts from a popular subreddit. I'm going to pull up my screen and just type in "Reddit Singularity," "Reddit artificial intelligence," "Reddit OpenAI," and really quickly go top to bottom to see what these threads realistically look like. This is important because any time you're building any sort of system, you've got to know the ground truth data. You have to understand what the quality is and massage it a little before you go through the rigamarole of building out a system. If you make shitty assumptions about the ground truth data, a lot of the time you'll carry those shitty assumptions all the way through the rest of your program or flow. So I'm just going to quickly read through this top to bottom and see what kind of data we're getting.

Okay, I've got these three subreddits open, and this looks pretty reasonable to me. I like Singularity because it's a subreddit I personally visit reasonably often. Scrolling through: "It's confirmed, OpenAI is releasing Sora to the public today," sick. "New footage of Optimus walking outdoors," hell yeah. "ChatGPT Pro users get infinite Sora usage of 20-second videos at 1080p," hell yeah. "Staff claims AGI is achieved with o1," cool. So there are tons of posts here. I'm noticing the vast majority of these are videos and/or photos, and they also have links, which is kind of annoying, because from my perspective the simplest way to do this would be to feed the text of the Reddit post into my newsletter generator. So I'm thinking there are a couple of things we could do. Okay, this subreddit looks a little better, most of it's text. We could develop a system that goes through the images, reads them, and summarizes them, that sort of thing, or we could just look for posts that only have text and summarize those; that'll probably be the easiest. Reading through the OpenAI one: cool, mostly more of the same. Oh, we've got an MKBHD video, interesting. Yeah, more of the same. So I'm thinking I'll see how many of these communities I can scrape; if I can funnel them into one data source, that would be ideal, because I don't think I'll get enough data for an actual cool newsletter if I source only from Singularity or only from the artificial intelligence subreddit. My mindset now is: great, I'm going to see how many of these I can feed into a scraper of some kind. Then I'm probably going to try eliminating all of the images and all of the videos first, because I want to see if I can get by on text posts alone. If I can, then I know the scraping is going to be a total walk in the park, not a problem at all.
So in terms of how to actually go about the scraping: what I basically always do any time I want to build an application that requires data from some social media source is go to this platform called Apify first. Apify basically lets you make APIs out of anything. They have this marketplace store component where other people create scrapers and make them live for you. They'll charge a small fee for usage, but this abstracts away all the complexity and all the crap of having to go out and build your own scraper, which is great for me because I just want to whip this thing up in an hour or so, and if I had to build my own Reddit scraper, God damn would that take forever. Google Maps extractor, Instagram scraper, TikTok data extractor: you can scrape whatever the hell you want. I'm just going to type in Reddit, and from here I see a couple: Reddit Scraper Lite, which I've personally used before, Reddit Scraper, Reddit API Scraper. I'm opening these in new tabs, and the idea is we want a scraper that does a couple of things. One, we want it to be reasonably affordable; we don't want to spend a lot of money on this. If your newsletter operation costs you $10 a month or something, that's great: you have a completely automated AI newsletter for ten bucks a month. Jesus, people five years ago would probably have spent hundreds of thousands of dollars on something equivalent. But we also want it to get us the data we need, and what we need here is the Reddit post text. I see this one says "scrape Reddit posts with title and text, username, number of comments, votes, media elements." That looks good. This one here looks like it's made by the exact same guy, Gustavo Rudiger; one is just Lite, so I guess you probably get less data, and the other is the full scraper at 45 bucks a month. Screw that one. This one seems reasonable as well, and so does this one.
Why don't we just use Reddit Scraper Lite? I just have to see the interface on this thing, so I'm going to go over to my actors, go to the store, and open Reddit Scraper Lite. Let me save these changes. Okay, great, and now we have what looks like a bunch of fields we can edit in order to scrape the specific resource. "Scrape information about posts, communities, and users without logging in": that's great, that's what we want. "Start URLs: if you already have the URL pages you want to scrape, you can set them here; if you want to use the search field below, remove all start URLs here." I imagine this is just the place where I pump in the community, and then it goes and scrapes it. And what's cool is it looks like I can do multiple, so that's nice. Let's go back, add this one, go back, and add one more. Okay. I don't know if these settings are on automatically or if it's just because I've used this before and set them, not really sure. Search for posts, not comments, so posts only; we don't want to include NSFW content; and are we going to do 100? Yeah, all right, let's do 100. So I'm going to save this, hit start, and just give this thing a run. The idea is I want to see the data and the format before I get ahead of myself. Once I verify the format is okay, I'll jump into Make and see how I can take the data from the scraper and stick it into make.com. So yeah, I'm seeing it's getting my data, which is cool; it'll probably take a minute or two. Hmm, I'm not seeing any data inside here, which is weird. Why not? Maybe I need to set it to "hot"? I remember there being some issue with this sort of thing, not entirely sure what it is. Or maybe it's because it says "skip user posts"; that might have something to do with it. Okay, no, it's adding stuff to the queue now. Yeah, I think this scraper just has a quirk where you need to select the sort type: Reddit can sort posts by hot, new, top, or rising, and if you click top or hot you get these URL endings. I just remember there was some issue where when I didn't set that, I didn't get the posts, and when I do, I get them.
Obviously there's no real way to know these quirks unless you actually go out and handle it yourself, but I'm getting a lot of cool posts here. It looks like most of these are recent, from the last week or so, which is nice; some don't have thumbnails and whatnot. Okay, great, so I'm getting enough data, more than enough to run a test. Now, if you think about it logically, what I have to do next is set up a system that filters these posts based on the subject matter. Look at this post: "if you have a product to promote, this is where you can do it; outside of this post it will be removed." I don't really want that in my newsletter, so I need a way to filter it out. I'm thinking I'll use AI for this, because it's flexible and smart.
First, though, I need to get this data inside of make.com. The way you do that is you go over to Storage in Apify, where it says dataset ID, and copy that. Then inside of Make, type Apify, pick "Get Dataset Items," and paste the dataset ID you just copied into the dataset ID field. Select the right account and leave the rest blank. The purpose of this is that it connects to Apify and brings the data into a no-code automation platform like Make, which lets you do way more cool stuff with it.
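If you'd rather see what that Make module is doing under the hood, here's a minimal sketch of the same call against Apify's REST API. The token and dataset ID are placeholders, and the field names in the output depend on the actor's schema, so treat those as assumptions:

```python
import requests

APIFY_TOKEN = "your-apify-token"   # placeholder: your personal API token
DATASET_ID = "your-dataset-id"     # placeholder: copied from Apify's Storage tab

# Same endpoint Make's "Get Dataset Items" module calls behind the scenes.
resp = requests.get(
    f"https://api.apify.com/v2/datasets/{DATASET_ID}/items",
    params={"token": APIFY_TOKEN, "format": "json", "clean": "true"},
)
resp.raise_for_status()

posts = resp.json()  # list of dicts, one per scraped Reddit post
for post in posts[:5]:
    # "title" is what Reddit Scraper Lite returned in my run; other
    # scrapers may name their fields differently.
    print(post.get("title"))
```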
So yeah, this looks pretty solid; we're getting all the data here. That's the post I was referencing with the product promotions, and this is another one. Looks like we have a body column, which is really nice; I'm very happy about the body, because it means we're getting all the information we need. We just want to make sure we have some way to filter it. So what I'm thinking is: how do we filter this information? Probably based on the title of the post and the body, using AI, so that we only extract things that are relevant. Like this: "a drop-in replacement for Llama 3.1 70B approaches..." whatever this is, this is the sort of stuff I want. "Meta releases this and that": that's a cool piece of news, I want it. Stuff like "autism and artificial intelligence, new discovery says blah blah blah," or "ChatGPT-4o better than Gemini Advanced." You know what I mean. So we need a way to filter the crap out. What I'm going to do is jump into the OpenAI module here, add a Create a Completion (Prompt) module, and just write a prompt that does all this filtering for me. Because this is a filter, I'm going to use GPT-4o, which is a much more inexpensive model. Let me run you through what my prompt would actually look like if I were designing this for myself or for a business I'm working with.
The first thing I'd do is add a system prompt that says "you are a helpful, intelligent assistant." This is basically just whatever the model identifies as: I'm saying, hey GPT, you're a helpful, intelligent writing assistant, so be helpful, be intelligent, and assist me with writing. Then we go down to the user message, which is where we give it instructions and tell it what we want it to do. So what do we want? "We are creating a newsletter that lists new and exciting developments in AI. Your task is to filter a Reddit post based on relevance and return either true or false." Then some rules: "A post is relevant if it's about news, developments, exciting progress, or if it would be considered newsworthy. A post is irrelevant if it's about something personal, if it's a question, if it's a community-moderation post like this one, or if it just contains an image or link with no context." Then: "Return your output using this JSON format," and we'll go "relevance," which will just be true or false. We're going to put this in quotes; you don't have to, I'm just doing it for the purposes of communicating with the model kind of like an API. Okay, great. Now in terms of the text content, let's go "post" and feed in the body. Actually, maybe we should do the title too: we'll go "title" and feed in the title, then "body" and feed in the body. Perfect, that looks pretty good to me. How many did we scrape here? 91. So why don't we change the limit to 10 and really quickly run a test on those 10 to see what it outputs. Oh, my bad: we're getting the output as JSON, but I actually want this JSON to be parsed as well. To do that in the OpenAI module, go down to show advanced settings, scroll to the bottom where it says response format, select JSON, then click parse JSON response, then click OK. We'll save this and run again, and now we should get a result variable with the parsed relevance. Perfect.
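As a sanity check, here's the same filter written as a direct OpenAI API call, which is roughly what the Make module does with "Parse JSON response" switched on. This is a sketch, not the exact module config; the prompt wording is paraphrased from above:

```python
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def is_relevant(title: str, body: str) -> bool:
    completion = client.chat.completions.create(
        model="gpt-4o",
        response_format={"type": "json_object"},  # forces valid JSON back
        messages=[
            {"role": "system", "content": "You are a helpful, intelligent assistant."},
            {"role": "user", "content": (
                "We are creating a newsletter that lists new and exciting "
                "developments in AI. Your task is to filter a Reddit post based "
                "on relevance and return either true or false.\n"
                "A post is relevant if it's about news, developments, exciting "
                "progress, or would be considered newsworthy. A post is "
                "irrelevant if it's personal, a question, a community-moderation "
                "post, or just an image/link with no context.\n"
                'Return your output using this JSON format: {"relevance": true}\n\n'
                f"Title: {title}\nBody: {body}"
            )},
        ],
    )
    return json.loads(completion.choices[0].message.content)["relevance"] is True
```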
Now I'm just going to jump through here and see if any of these are true; out of 10, how many can I realistically expect? We'll check the result. Okay, this one's true: let's see, "the CEO who says cheaper AI could actually mean more jobs." That looks pretty solid, I like it. This one's false, this one's false, this one's false. Oh nice, so that's three out of the five so far. Kind of losing my place here, but yeah, three out of five, sweet. Honestly, that's more than sufficient for me. Think about the cost: if this thing is $4 per 1,000 results, we just scraped 10 and got three usable ones, so let's conservatively say 20% of our records are okay. How many records do we really want in the newsletter? I'm thinking we'll feed in five or six, the automation will summarize them, and maybe provide a link back to the resource, though I don't even think that's necessary; we're just going to compile this as if we were the news source. So realistically, let's say six per issue. To get six at a 20% hit rate, multiply six by five, which is 30 posts, and at $4 per 1,000 that's about 12 cents per week. A walk in the park; no problems there. I'm not going to worry about the images or anything like that. I'm only going to select posts that don't have images, pure text posts we can summarize directly. That just seems the simpler way to do it.
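Just to make that back-of-the-envelope math explicit (these are my assumed numbers from the test run, not measured billing):

```python
cost_per_result = 4.00 / 1000   # scraper priced at $4 per 1,000 results
hit_rate = 0.20                 # assume ~20% of posts survive the AI filter
posts_needed = 6                # stories we want per weekly issue

posts_to_scrape = posts_needed / hit_rate        # 6 / 0.2 = 30 posts
weekly_cost = posts_to_scrape * cost_per_result  # 30 * $0.004 = $0.12
print(f"scrape {posts_to_scrape:.0f} posts ≈ ${weekly_cost:.2f}/week")
```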
Let's call this "Automated AI Newsletter System 1." I'm doing "1" because I have a feeling we're going to need two scenarios here. What I'm going to do now is dump all of this into a Google Sheet. I have this Reddit post database set up, but I don't have any fields yet, so I want to download all the output bundles (you see all these), copy them, and paste them into ChatGPT with: "your task is to create a CSV of key names; I'm going to paste these into a Google Sheet as headings." This takes the data and gives me a bunch of headings I can very quickly and easily use. Maybe "quickly and easily" was an overstatement: I don't know why it has to give me a file to download, come on man, just put it as text. Yeah, there we go. Cool. I'm going to copy all of this, go to my Google Sheet, paste it in, and then go Data, split text to columns. Perfect, and now I have basically all the data columns I'd want in here. I'm just going to change the styling a little; I'm a huge fan of the Inter font, so I'm going to use Inter. Let's give this a quick little click. Beautiful, looks clean. And let's call this sheet "scraped posts." Now let's head back over to make.com and add a Google Sheets "Add a Row" module.
I'll go down here to my Nick at Leftclick account and choose my spreadsheet ID, which is going to be called "Reddit post database," then select the sheet name, which is "scraped posts." It does contain headers, and check out how simple it is for me to map all of these now. I still need to add the filter, which I'll do in a second, but just look at that: this is substantially faster and easier than making all the fields manually. Also, there's one thing I'm realizing: I'm going to want to use this as a database, but I'll need to keep track of which posts I've already used for my newsletter, because logically I'm going to need a way to list all of these and then select from them. So I'm going to add one more column here and call it "post status," and back over here I'm just going to type in "new." So "new" is where every post goes right after I scrape it. Awesome, that looks pretty good. Let's add a filter. Actually, there's one more thing we have to do first: if you think about it, before we add a new record we have to check whether the record already exists in the database, and if it does, we shouldn't add it. One thing I noticed while reading through this data is that there's an ID field, which means we can search the sheet for an ID, and if there's a match, we skip adding it. So I'm going to drag a Search Rows module in and add it in front of this. I'll type Reddit: oh, I'm using the wrong email, one sec. I'll grab my Reddit post database, sheet name "scraped posts," and then where it says filter, I'm going to check whether the ID column is equal to this post's ID. If it is equal, the total number of bundles returned will be greater than zero, logically. So if I want to check whether something is new, the total number of bundles should be equal to zero: I'm searching for rows with the same ID, so if a record with that ID exists, the search returns at least one row, and if I check that the total is equal to zero, the post must be new.
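Outside of Make you'd express that dedup check in a few lines. Here's a sketch using gspread as a stand-in for the Search Rows module (my own substitution, not what the scenario runs; the sheet name and column position are assumptions):

```python
import gspread

gc = gspread.service_account()  # assumes a service-account credentials file
sheet = gc.open("Reddit post database").worksheet("scraped posts")

def post_is_new(post_id: str) -> bool:
    # Equivalent of "total number of bundles equals zero": if no existing
    # row carries this ID, the post hasn't been stored yet.
    existing_ids = sheet.col_values(1)  # assumes the ID lives in column A
    return post_id not in existing_ids
```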
That check will basically always work, which is nice. So that looks good. Over here I'm just going to add a filter for relevance, with the condition "relevance equal to true." Now we have two filters, one for relevant and one for new. Now that that's good to go, I'm basically just going to run this puppy on all the records and see what happens: let's set the limit to 100 and run. Looks like most of these aren't relevant, and the ones that are have now been passed into the Google Sheet, and they're all new. Oh, it's kind of ugly, and I'm also adding data to the wrong column, which is annoying. Let's fix this before it gets unbelievable: reset, reset. I don't like how tall these rows are, so I'm going to shrink them back down, hold on one sec. I think the reason it's doing this is something to do with text wrapping, so I just fixed that. Okay, cool, and now we're actually scraping these properly, which is pretty sweet. This looks good to me.
Let's double-check that this data is everything we want. One thing you'll notice when I'm building these systems is that I'm testing iteratively at every step: I'm asking, okay, is the output of this module or this step what I'm expecting? So this is me actually double-checking that it is. This looks good. This one actually looks kind of weird: "here's what's making news in AI." I don't know if I like that; this is basically The Verge doing the same thing I'm trying to do with my newsletter, so we shouldn't allow these in. I'm just going to bold it for now and keep track of it later. This looks good, that looks good, that looks good. What is this? It's an advertisement: "this is my startup," yeah. I don't want people advertising their own products here; that seems kind of dumb, and I don't know if I want anything that dumb either. Anyway, this actually looks pretty solid; there are only two records here I really don't like, like the "here's what's making news in AI" one. So I'll go back to the prompt and update it. What's the best way to think about this? That post is basically itself a newsletter, so I'm now going to mark things as irrelevant if they're sort of newsletter-y, or if it's a product advertisement, especially if it's the founder talking about their own product, or if it contains an image link with no context. Good, prompt updated. Now I'm just going to delete those two records, because if you think about it, I don't want them poisoning the well. The rest of this looks pretty good to me, so I think we can move on. Now that I think about it, we could do some additional filtering on the number of comments and such, but I don't actually think that's necessary: odds are, if something has made it to the hot page... yeah, actually, this "one minute AI news" thing is kind of spammy, it's this RAG thing; let's remove that too.
If something has made it to the hot page, it's probably already kind of pre-vetted, so I'm not going to bother with that extra level of filtering. What I am going to do: it looks like there's one more field here, a data type, and I kind of screwed up my mappings, since post status is not in the right column. Let's scroll down: image URLs. Oh yeah, there are actually some image URLs here, and column W is video URLs, so let's map video URLs, then image URLs, and then post status at the end. Cool, so that's that. We now have basically everything we need, I believe, to take this to the next level. Let me just check whether there's anything else I'm missing. No, I'm pretty sure that's the whole first half: we now have everything we need to scrape data and publish it to this spreadsheet.
What we need now is a mechanism that takes the posts in the spreadsheet and converts them into newsletter text. We have the data source, so it's really just the other half of the plan: every week, grab X posts, use AI to summarize them, write a standalone newsletter, and add the copy to a newsletter platform like MailChimp. I'm surprised I haven't actually had to change any of these steps yet. This first flow is going to be scenario one, and the newsletter side is going to be scenario two. We have to separate them because they logically do different things: scenario one is getting the data and adding it to the DB; scenario two is curating content, passing it through AI, and posting to the newsletter. That's the two-step flow. So this one is going to be called "Automated AI Newsletter System: Source Posts." It's not entirely done, there's still one more thing we have to do, but I'll talk about that later. Now I'm going to create a new scenario called "Automated AI Newsletter System: Summarize with AI and Create Newsletter." That looks good to me. Awesome. This dialog is popping up because I'm attempting to command-S save while also selecting the text up here, so Make doesn't know if I'm trying to save the text or the whole flow; I just want the title to appear, so I'm saving to have both available. So this is number one, this is number two; let's move into number two. We need some way to grab all this data, and I'm also going to want to store the outgoing newsletters somewhere else, so why don't I make a sheet called "generated posts" and worry about what it actually looks like later. Okay, let's go to Google Sheets and add a Search Rows module.
What I want is to grab all of the rows in my database with a post status equal to "new," because if the post status is new, it's basically up for grabs: I can use it in my flow. So I'll go to spreadsheet ID and select the Reddit post database. I find in practice the lengthiest part of this is just dealing with this Google Sheets Search Rows module, because it takes forever to remap everything. Then I want to filter on post status equal to "new," and for now I'm not going to set a limit, although I probably should eventually. I'm going to right-click and run this module alone to see if we got all the data. We got this, this is good, nice. Cool, we got basically all the posts, which is what we want. Now, logically we can't just have AI do 20 news posts every week, so realistically we probably want six or seven. I used to run a newsletter way back in the day called The Cusp; it was followed by Dharmesh Shah, the founder of HubSpot, and it used to look like this (this is way back, in the DALL·E days). I actually think the ideal move is to replicate some of this format: "Welcome to The Cusp, cutting-edge AI and its implications explained in simple English. In this week's issue:" then some quick little teaser examples, then an H2, then the actual description, and then the same thing over and over again. Yeah, that looks pretty good to me, so I'm going to use this as the basis of my GPT flow. We're going to use AI again, so I'll go OpenAI, Create a Completion. Actually, I'm just going to copy the module from the previous flow, because it has most of what I want; I just need to change the prompt.
Why don't I rename everything while I'm at it: this is "get Reddit posts," this one's "filter posts," this is "check if exists," this one's "add new post." Cool. Down here in scenario two, this is "get Reddit posts," and this one's going to be, I guess, "generate summary"... well, it's not just a summary; let's call it "generate headline and snippet." You'll see why in a sec. Jumping back inside this module, it has most of the text we want. I'll say "writing assistant" in the system prompt; I think I forgot that on the previous module. We're still creating a newsletter that lists new and exciting developments, and the task is now: take as input a Reddit post title and body, and rewrite them for our newsletter. The rules: use a casual, Spartan tone of voice; use third-person POV. Also, most newsletters write really gratuitously with the new lines; they keep the paragraphs really short, like one to two sentences, so we should add some instruction about that: "use new lines gratuitously and keep paragraphs to one to two sentences max." Anything else? I think that's good. "Return your output using this JSON format," and we'll go "headline" and "snippet." So, reading it back: rewrite it for our newsletter; use a casual Spartan tone of voice; third-person POV; use new lines gratuitously; keep paragraphs to one to two sentences max; try for four to six sentences total. Maybe five to seven, I don't know, we'll see how it goes. Okay, next up is the user prompt. I'm producing a headline and snippet, so let's make this simple: the output is "new headline" and "new snippet," and the input is "old headline" and "old snippet." For old headline I'll feed in the title, and for old snippet I'll feed in the body. There we go. I'll leave everything else the same; I just want to test this out.
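For reference, the rewrite step as a standalone API call might look like the sketch below, reusing the client from the filter example. The rule text is paraphrased from the prompt above (including tweaks I add later in the video), so don't read it as the exact module config:

```python
def rewrite_post(title: str, body: str) -> dict:
    completion = client.chat.completions.create(
        model="gpt-4o",
        response_format={"type": "json_object"},
        messages=[
            {"role": "system", "content": "You are a helpful, intelligent writing assistant."},
            {"role": "user", "content": (
                "We are creating a newsletter that lists new and exciting "
                "developments in AI. Take a Reddit post title and body as input "
                "and rewrite them for our newsletter.\n"
                "Rules: use a casual, Spartan tone of voice; use third-person "
                "POV; use new lines gratuitously and keep paragraphs to 1-2 "
                "sentences max; aim for 3-5 sentences total; don't direct "
                "people to offsite resources.\n"
                'Return your output using this JSON format: '
                '{"new_headline": "...", "new_snippet": "..."}\n\n'
                f"Old headline: {title}\nOld snippet: {body}"
            )},
        ],
    )
    return json.loads(completion.choices[0].message.content)
```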
A lot of the time you have to add a bunch of before-and-after examples here, but what I'll usually do is have AI generate me a draft of those, then go through and do some editing, and only add an example if I really have to. I don't want to do it for all 20 or 30 posts, so let's just do one for now. I'm going to run once, generate our headline and snippet, and see what we're looking at. "Meta drops Llama 3.3 70B. Meta's launched their latest Llama 3.3 70B, which is a step up from Llama 3.1 70B performance-wise, almost catching up to the 405B. It's a smooth swap for anyone using the older model. You can check it out on Hugging Face; it's worth peeking at if you're into AI advancements." I don't like "you can check it out on Hugging Face," so I need to add another rule: don't direct people to offsite resources. Okay, let's try this one more time.
That actually looked basically perfect, so I don't think I'm going to need to do any editing. Cool, that looks pretty reasonable to me. Now that we have this, think about it from my perspective: we need to format this so that we're going from post to post and delivering summaries of each. There's a variety of ways we could do this in Make, and probably the simplest is a text aggregator. I'm going to set the source module to "get Reddit posts," and what the aggregator does is let us map everything into a format I'm very comfortable with, called markdown, where you can write headings and subheadings. Let's try H3 first: the heading will be the headline, then the snippet below it; between the two I want a new line, and after the snippet I want another new line. Basically, for every bundle in this sequence it'll render that pattern and then concatenate them all together, which seems pretty good to me. Let's do six. Then let's go over here and click OK. We'll get this little yellow bubble because technically a transformer should never be the last module in the route, but I'm going to give it a go and see what this looks like. It's processing all of this right now. Wonderful. Realistically I shouldn't have done all six, I should have done two, since then I'd have used fewer operations.
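In plain code, the aggregator step is just this (a sketch; the field names follow the JSON keys from the rewrite example above):

```python
def aggregate(stories: list[dict]) -> str:
    # One H3 heading per story, snippet underneath, blank line between
    # bundles: the same pattern mapped into Make's text aggregator.
    blocks = [f"### {s['new_headline']}\n\n{s['new_snippet']}" for s in stories]
    return "\n\n".join(blocks)
```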
Anyway, this looks pretty good to me. I'm going to copy all of it and go to Markdown Live Preview, which is just a tool that lets me paste markdown in and see it rendered. Okay, so this is basically what it's going to look like. Wow, this is sweet. Stuff like this is kind of annoying, though; it's probably doing that because I tried to force it to do seven... sorry, seven sentences, and I don't actually need seven sentences. Let's just aim for like three to five sentences total; that'll be a little better. But anyway, this is a good format, this is what I like; everything here is nice. All we need to do now is put this into, basically, a newsletter. I'm just going to use a Google Doc for the test, since it's the simplest thing for me. Oh, you know, there are a couple more things we have to do: we obviously have to generate an introduction and a conclusion. But I just want to test this out in a Google Doc first. The thing is, the Google Docs module requires HTML, and I'm outputting markdown. I could just ask GPT to output HTML directly, but I find that when you ask AI to output HTML, sometimes the format is a little off, not quite right for the Google Docs module. With markdown there's a lot less room for ambiguity, and Make actually has a built-in markdown-to-HTML converter that always outputs the result in a very accessible, Google-Doc-friendly format. So I'm going to convert this to HTML: let's aggregate headlines and snippets. Sorry, my rice is burning, one sec. Okay, the rice is definitely a little charred, but hey, we like it crispy. So: aggregate the headlines and snippets, then markdown to HTML. I'm going to feed this in as just an example, with the newsletter content being this HTML, and let's just see what happens.
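Make's converter is built in, but if you were scripting this yourself, the equivalent is one call with the third-party `markdown` package (my substitution, not what the scenario uses):

```python
import markdown  # pip install markdown

# Render the aggregated "### headline\n\nsnippet" blocks to HTML that a
# Google Doc (or an email template) will accept.
newsletter_html = markdown.markdown(aggregate(stories))
```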
We know this is just about how it's going to look in the Google Doc, so I'm going to run this once. Ah, I should have only done two. Yeah, whatever, that's fine; we'll aggregate all of them. If you think about it logically, we're going to use one AI call to do the bodies, and then another call to do the introduction and the conclusion. I think that makes sense, and we should generate a title as well. So this looks good to me. I don't like that there are no new lines; you see how it's kind of stuck together, with no space between sections? It's just a formatting thing, so I'll hit enter here. Yeah, I like how casual and Spartan the language is. I don't like stuff like "a bold claim that adds fuel to the AGI fire; imagine the possibilities if he's right." I can probably reduce the incidence of that by changing the temperature, going to like 0.8, while keeping the Spartan tone of voice and third-person POV rules; I'm also going to remove the "try for three to five sentences total" line, which will probably be better. Anyway, this looks pretty solid to me. Now we just want some sort of introduction and some sort of conclusion. The conclusion can probably be templated, so let me look at how I did this on my own newsletter. I'm going to say "welcome to the Loop"... well, "welcome to Loop," I guess. "Cutting-edge AI news explained in simple English." Looks good.
And "this week's issue: one, two, three, let's dive in." Now what's the conclusion? You see how there are all these images in the old issues, and also a bunch of links? If you really wanted this to crush, what you would do is take the headlines, feed them into a search-engine model like Perplexity, and Perplexity would return you a list of citations. You'd take those citations and pass the snippet along with them through another GPT call that says, hey, your job is to add links: add these links to whatever resource is relevant. It would go through and insert the links. Then for images, you could use something like Midjourney to generate an image every three snippets or so, get it formatted, and it would actually do pretty well. I don't think I'm going to do that here, just because this video would end up being like three hours long, but I will give you all of the tools you need to go and do it, the little stubs and pieces, to keep it pretty simple. You could probably do the same thing with YouTube videos and whatnot. Anyway, the thing that's important for us is the conclusion. This one is good, and it's just going to be the same conclusion every time, so I'm going to copy it over: "That's a wrap. Enjoyed this? Consider sharing it with somebody you know. You can also follow me on Twitter if you prefer more straight-to-the-point AI news like this. See you next week, Nick." Cool. And we need some intro, right? So that's what I'm going to do now: take the newsletter body and use it to generate an introduction and... no, not a conclusion; what was that other thing? A title. So we're going to generate an introduction and an AI-generated title.
So: "we're creating a newsletter that lists new and exciting developments, and your task is to take the newsletter as input and write an introduction and a title. Rules: casual Spartan tone of voice, third-person POV." You know what, I'm just going to remove most of these rules and say: use a casual, Spartan tone of voice, and follow this template. Then I'll go back, scroll all the way to the top of my old issue, and paste the template in: "Welcome to Loop. Cutting-edge AI news explained in simple English. This week's issue:" and let's do one, two, three. "Return your output using this JSON format": introduction, and then title. We should probably also say "use these examples for titles," so why don't I go back through my blog and grab a couple of example titles. That's a good one; I like this one. Kind of clickbaity, and as you can see, clickbait did not start on YouTube; it's been happening for a long time. Let's use that. Okay, cool. Right, so what I need is a "newsletter" field where I put in this text. Then over here, where it says result, you can actually access parameters before they're even generated by going result.title, and what I want next is result.introduction, I believe. Yeah, introduction. Okay, this should be good; let's see how it goes. Paste the markdown in here and let's run this bad boy. I'm pretty excited.
I hope this works; it's going to be really cool. Man, I love templating content like this, because even if, hypothetically, you wanted to use this for your own newsletter and it wasn't perfect, even if it only got 80% of the way there, which these models are totally fine doing, you've just 5x'd the leverage on whoever's time it is. Look at this. Oh, that's clean, man. That is clean. Okay: "Meta launches Llama 3.3 70B to compete with the big players." "Welcome to Loop. Cutting-edge AI news explained in simple English. In this week's issue:" one, two, three, four, five, six. To be honest, I don't really need all six of these teasers in the intro; you'd already know all the news from the intro alone. So let's just tell it to pick the top three: "pick the top three news headlines, don't use more than the top three." Otherwise there's no point; you're just reading the whole newsletter right there. Some of these are a little longer, like the Box CEO one, and some are a little too short: "Google's latest quantum chip: meet Willow." How do I make this longer? What is the body of this post? Oh, it's an image. You know what, we have to remove these: a post that's just an image is not going to work for us. The reason this one is so short is that the body just says "Images" and then ends. So we need to adjust our filter back in scenario one. Let me see what the exact text was: "or if it just contains an image link"; let's make it "or if the body merely contains an image link with no context." There we go. This will force out all the ones whose body is just images, so when we do our newsletter, we won't have to worry about these one-or-two-line entries; ideally, every post that makes it in will be substantially longer, which is cool. Okay, so we've got six here: generate introduction and title, markdown to HTML, Google Docs.
Let's run this through. Keep in mind that I'm using this Google Doc just as an example; after this, we'll take the same flow and connect it to MailChimp or Klaviyo or whatever newsletter provider I deem worthy (I am the one who deems newsletter providers worthy). Okay, we just ran it and we're creating the Google Doc. I'm going to go down to the web view link, paste it, open it, and make the spacing a little sexier, because I want to see what this would actually look like. We could even do some formatting here: maybe have this be bolded instead, since I'm already using a colon down below. Anyway: "Meta launches..." "AI might be boosting jobs, according to..." "NATO uses AI to persuade soldiers." Wow, this is pretty clean, not going to lie. And the "that's a wrap" looks good too.
So I'm going to change the introduction template to include a dash instead of the colon, because I don't think it looks good in English to have two sentences one immediately after the other, both ending in colons. I'll use an em dash there. Then, if you think about it, what we need to do next is update the Google Sheet so the post status becomes "published" instead of "new"; we need a way to mark what we've used. The way you do that is with a Google Sheets Update a Row module, which I need to put after this aggregator. I'll choose the Reddit post database, and the sheet name is going to be... generated, no, wait. Oh, you know what, we've got to dump the finished newsletter to a Google Sheet too; I forgot about that. For the update, we pass in the row number from the Reddit post we got: the search module actually returns the row number, so we know that when I update a post, I want to update, say, row two. Then we set post status to "published." Maybe we should just name this module "update post status"; that's probably simpler. Oops, do not consume any more of my operations, thank you very much. Looks pretty good. And then we need to do one more thing: I want to add another row for the generated newsletter. Instead of going here, clicking Add a Row, and having to do the whole reconnection again, I know I already have one module that adds a row, so I'm just going to copy that and paste it in.
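For completeness, here's the "mark as published" update in code form, reusing the gspread handle from the dedup sketch (the column letter and the gspread substitution are both my assumptions):

```python
def mark_published(row_number: int) -> None:
    # Mirrors Make's "Update a Row": flip the post status cell for the row
    # number returned by the Search Rows step.
    sheet.update_acell(f"S{row_number}", "published")  # assumes status in column S
```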
So I'm just going to remove this filter, and duplicate the module by hitting command-C and then command-V. What I want is to add a row not to "scraped posts" but to "generated posts." Now, I don't actually think I have the headings for that sheet yet, do I? No, so I have to add headings. What headings am I going to add? Actually, I guess I don't know yet, because what I want is a store in Google Sheets, like a database, that includes all of the campaign information. So technically I can't do this until I have campaign information, right? Let me just delete all of these variables, and why don't we actually do the MailChimp stuff first; that'll be smart. Let's head over to mailchimp.com. First of all, I know there's some editable-text-area thing here, so I just have to quickly read through the API and remind myself; I've done something similar quite a while ago. It looks like in order to populate a content area in MailChimp, you need to include mc:edit in the template with the content you provide. Interesting. I've tried doing this before, but I'm not entirely sure how anymore, so we're just going to jump into MailChimp and see if I have an account that works, which I do. We'll go down to email templates, create template, and I think what we want next is "code your own," paste in code. That's fine; call it "Loop template." Okay, great, we have all our template code here. Nice. What we want to do is, you see where it says body? You just want to replace everything inside of that with an mc:edit region, I believe, then save it. It's going to delete all of this, which is fine; it should just say mc:edit. Nice. Then save and exit, which looks good. Now we head back into Make, and then we go to MailChimp, Create a Campaign. Let me add this module before, and, yeah, I've already set up the connection with MailChimp.
What would the title be? I guess the title would just be the generated title. The list (audience) ID is going to be Loop; the subject line is also the title; the from name is just "Nick at Loop"; the from email address is my email, and I'm just going to use this one; to name, nothing; folder ID, skip. Okay, so then: fill the body content by template ID, with HTML-format text, and feed in this HTML output. Then what you want is to go to MailChimp again, and you see where it says Perform a Campaign Action? That's what you want: select the campaign ID, and for the action, click Send a Campaign. So this is going to create the campaign in MailChimp, populate the text with the HTML we just output from the markdown-to-HTML module, and then send the campaign to everybody. I'll get into the configuration of exactly how to schedule this in a minute, so I don't think we actually have to send it as part of the test.
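Those three Make modules map onto three calls in MailChimp's Marketing API v3. Here's a hedged sketch with raw requests; the datacenter prefix, the IDs, and the section name "body" (which has to match the mc:edit attribute in the saved template) are all placeholders, and `generated_title` / `newsletter_html` stand in for the values produced earlier in the flow:

```python
import requests

DC = "us21"  # placeholder: the datacenter suffix from your API key, e.g. "...-us21"
AUTH = ("anystring", "your-mailchimp-api-key")
BASE = f"https://{DC}.api.mailchimp.com/3.0"

# 1) Create the campaign (the "Create a Campaign" module).
campaign = requests.post(f"{BASE}/campaigns", auth=AUTH, json={
    "type": "regular",
    "recipients": {"list_id": "your-audience-id"},
    "settings": {"subject_line": generated_title, "title": generated_title,
                 "from_name": "Nick at Loop", "reply_to": "nick@example.com"},
}).json()

# 2) Fill the template's mc:edit="body" region with our converted HTML.
requests.put(f"{BASE}/campaigns/{campaign['id']}/content", auth=AUTH, json={
    "template": {"id": 12345, "sections": {"body": newsletter_html}},
})

# 3) The "Perform a Campaign Action" module: send it.
requests.post(f"{BASE}/campaigns/{campaign['id']}/actions/send", auth=AUTH)
```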
But that is more or less the logic. Through MailChimp you can definitely do this; you could probably do the same through Klaviyo as well, though I don't entirely know. The thing about a lot of these newsletter platforms is that many of them don't give you a way to create a campaign through an API, or to update the content of a campaign; they want to lock you into dragging and dropping stuff. I know MailChimp allows this definitively, and realistically you could probably try it with the other main ones like Klaviyo, ActiveCampaign, and so on, but this is what we're going with. With that, I'll use this "smiling yellow monkey" connection, and then we're going to go to "add new post," mapping in fields like the campaign web type. Okay, I just need to run this once and see how it goes, so I'm going to hit Run Once and actually connect all the pieces here, and this is really our end-to-end test.
If you think about it, I'm assuming we're sending this once a week; let's do Monday, that probably makes the most sense, at like seven or eight. I'll show you how to hook up the scheduling on the Apify side in a bit. Okay, so there's an issue here. Why? Oh, because of the "at": MailChimp told me what the error was, which is that there was an at sign in the from field, and apparently you're not allowed to do that. So I just got rid of it, went with "Nick at Loop" spelled out, and ran this puppy again. We just finished: it's generating the intro and the title, converting the markdown, creating the campaign in MailChimp, and then performing the campaign action. We actually sent it. Beautiful. Let me check the archive URL, or the long archive URL; I believe this contains the actual body. Yeah, very cool: "AI adapts to cybersecurity, retail and sports evolve, and AI benchmarks fall behind. Welcome to Loop, cutting-edge AI news explained in simple English. In this week's issue:" three hits, let's dive in. Nice. You'll notice this is different from the previous run, and the reason is that we're using different content: we marked the used content in our sheet as published, so the first run consumed the first six posts and the second run took the next six. Now we only have a few posts left, and in this way you basically always have fresh content for your newsletter, which is pretty sick. Okay, so what do we want now? I want all this data here: campaign ID, recipients, HTML, all this stuff.
I'm just going to do the same thing I did a moment ago, where I copy this output and paste it into ChatGPT to turn it into headings; let's also say "use camelCase for all of these," because the other sheet used camelCase, and I believe there's something to be said for the aesthetics. We'll go Data, split text to columns, bold the header row, paste it in, and again set the font to Inter. Then I'll go back here, and we just want to map all these fields in our Google Sheets module. Let's refresh the headers and go top to bottom: campaign web type, create time, archive URL, long archive URL, email send type, archive, recipients collection. Yeah, you know, this recipients one is actually just going to be a collection, isn't it? That's kind of annoying, because if we dump a collection in, we're not going to get any actual data from it; it'll probably just say "[object Object]." For now I'm going to dump it in anyway. For the links, let's just join them into one plain-text string; then HTML and archived HTML, and we'll delete that one. Looks pretty good to me. Then let's go to the schedule: we only want to send this once a week, so I'll set it to Monday at, I don't know, 6 a.m. Every Monday at 6 a.m. we send this puppy.
If you think about the sheet, we only have three new posts left, so why don't I go back and scrape a bunch more. Let's go to limits, and instead of 100 let's do 200 and see how that works. Then what I want to do is hook this up and make it production-ready. There's a difference between testing a flow and actually pushing it to production; production just means reality. When we tested this flow, we used the Get Dataset Items module, but what we really want is a way to trigger after the Apify actor completes, and then automatically pull the right dataset, so the dataset is dynamic and gets updated every run. To do this, you click on Get Dataset Items, add the Watch Actor Runs module before it, and then replace the hardcoded dataset ID with the default dataset ID value the trigger passes along. I'll set the limit to, let's say, 200. Then over here we have to add our own webhook, so I'll name it "finished Reddit scraper." I see there's another one already called that, so I'm going to rename this one just to be safe. The actor I want is this Reddit scraper, this one here. What this is going to do is fire every time the scraper finishes.
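If you were wiring this trigger outside of Make, Apify exposes the same idea through its API: either a webhook that hands you the run's defaultDatasetId, or a convenience endpoint for the most recent successful run. A sketch of the latter, reusing the token from the earlier Apify example (the actor ID is a placeholder):

```python
# Pull items from the scraper's most recent successful run instead of a
# hardcoded dataset ID, so the data stays dynamic from run to run.
resp = requests.get(
    "https://api.apify.com/v2/acts/YOUR_ACTOR_ID/runs/last/dataset/items",
    params={"token": APIFY_TOKEN, "status": "SUCCEEDED", "limit": 200},
)
resp.raise_for_status()
posts = resp.json()
```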
we're going to watch this get populated with a bunch of new posts and then after this is popul with a bunch of new posts we're then going to go in and we're going to we're going to run the second scenario as if it were the beginning of a new week and we're ready to pump out a newsletter so I'm pretty excited in order to get this hooked up let's go back to appify we got these three communities which looks pretty good I'm just going to click save and start and now this is going to run
and and we are just going to watch it run from start to finish and just do one final end to end test of our whole flow I consider endend tests like basically a a requirement um it's one thing to test iteratively and to do things one module at a time this enables me to build quickly with a lot of modules in my flow without necessarily breaking anything and having to wonder where the hell is the error like for instance if I just dropped 10 modules and just thought yeah this seems like it'll work and then
there was an error in my flow like it'd be hard to identify exactly where the problem was was it that module 2 was mis formatting the data that module 6 had the wrong function call there's a lot that that goes on but if you test iteratively with like the first module and you make sure that works inputs and outputs oras you expect second module that works inputs and up put are as you expect third fourth fifth sixth so on and so forth and basically the second that ER occurs you know that every other module before
that module worked so obviously the error is with this one and this allows you to do debugging substantially faster that said after you're done with that in order to like really make sure your flow works you still need to test it end to end and my rule is I always test it as close to a production environment ass humanely possible if I'm delivering a project to a client I will ask myself hey what is a client going to provide his input what sort of fuckups are they going to make when they filled the form wrong
or something? I basically try to put myself in their shoes to determine what my flow would actually look like start to finish. So this is currently running on a ton of these; it looks like we have 70 results so far. A lot of these look like images, so most of them are going to be filtered out of my system, which is fine. I think we're going to end up with 200, so we've still got quite a ways to go. Let me think:
what are some other things we might be able to add to the system? Yeah, there are a bunch of these Midjourney APIs, which are unofficial. I don't know if Midjourney has an official API yet; yeah, no, they only have the unofficial ones. But basically, you can hook up to something like this, and what it's doing behind the scenes is calling Midjourney through their interface and then returning the results to you like an API call. So what you could do is, over here where we generate the introduction and title, you could also generate three image prompts or something, then generate three images with this, and then distribute them every third snippet or so, and voilà, you have a bunch of images. I think I've mentioned the link stuff, but if you go to Perplexity here, you could create a chat completion with Perplexity. When you look something up with Perplexity, like, let's take one of these Reddit thread posts for example, I'm going to feed this in, and
look, this is going to return some citations: one, two, three. What you do then is feed in your title, sorry, feed in your snippet, and say: hey, with these three links, this link, that link, and that link, I want you to go into my snippet, rewrite it, and insert each link as markdown, or as an a href="link-goes-here" tag, and now you have a snippet that's dynamic, with a bunch of links inside of it.
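Here's a minimal sketch of that two-step Perplexity idea, assuming their OpenAI-compatible chat endpoint; the "sonar" model name and the citations field on the response are assumptions to verify against their current docs:

```python
# Sketch of the Perplexity link-enrichment idea. The model name "sonar" and
# the "citations" field on the response are assumptions; check current docs.
import requests

PPLX_KEY = "YOUR_PERPLEXITY_KEY"  # placeholder

def enrich_snippet(snippet: str) -> str:
    # First call: look the topic up so Perplexity returns source citations.
    search = requests.post(
        "https://api.perplexity.ai/chat/completions",
        headers={"Authorization": f"Bearer {PPLX_KEY}"},
        json={"model": "sonar",
              "messages": [{"role": "user", "content": snippet}]},
    ).json()
    citations = search.get("citations", [])[:3]  # keep the top three links

    # Second call: rewrite the snippet with those links inserted as markdown.
    rewrite = requests.post(
        "https://api.perplexity.ai/chat/completions",
        headers={"Authorization": f"Bearer {PPLX_KEY}"},
        json={"model": "sonar",
              "messages": [{
                  "role": "user",
                  "content": "Rewrite this snippet and insert these links "
                             f"as markdown where relevant: {citations}\n\n"
                             f"{snippet}",
              }]},
    ).json()
    return rewrite["choices"][0]["message"]["content"]
```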
There are many things you can do to make this better and a little cleaner, but I think you guys can see that once you have the core idea of a newsletter automation, you can take it any which way you want. Now, keep in mind, I've scraped almost 200 entries here and it hasn't even cost me a dollar. Those 200 entries are probably equivalent, just based off the math, to at least 40 or maybe 50 pieces of news, and while it's not great to post news from last month, those 40 or 50 pieces effectively allow you to keep going for a whole month. So this whole scraping cost could be less than a dollar a month, which to me is pretty crazy: a fully automated newsletter for a dollar for the data, plus probably another 50 cents or so for the token usage and the operations usage. If you can get this whole thing done for about $1.50 a week, you are off to the races, you are laughing all the way to the bank. Obviously you still have to
pay money to MailChimp, or to whatever provider you're going to be sending the emails with, but yeah. Okay, great, so we just finished this; we are now going into ChatGPT, and this flow is now filtering through and adding the things that are relevant. As you can see, the first few were not new, which is why they've been filtered out: we had four that existed, but only one that actually made it to the end. This is worth saying: if you have a low rate limit, if you're on a low tier, you might want to add a sleep module before or after this step. The reason is that this consumes a lot of tokens: if we go here to usage, you can see I'm consuming 222 tokens per call and doing it about two or three times a second, so that's roughly 600 tokens a second, and 600 times 60 seconds is about 36,000 tokens a minute, which I believe puts you a little over the default limit for tier one. So just keep that in mind as you're doing these token-heavy operations.
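If you were running this outside of Make, the sleep module amounts to a simple per-call delay sized to your tokens-per-minute budget. A minimal sketch, assuming a 30,000 TPM tier-one limit (check your account's actual number):

```python
# A crude token-per-minute throttle, which is all the sleep module is.
# The 30,000 TPM figure is an assumed tier-one limit; substitute your own.
import time

TPM_LIMIT = 30_000     # assumed tokens-per-minute budget
TOKENS_PER_CALL = 222  # rough per-call usage observed above

# Space calls so the per-minute budget is never exceeded:
# about 135 calls/minute here, i.e. roughly 0.44 s between calls.
min_interval = 60 / (TPM_LIMIT / TOKENS_PER_CALL)

def rate_limited(posts, classify):
    for post in posts:
        yield classify(post)      # the AI relevance call from the flow
        time.sleep(min_interval)  # the "sleep module"
```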
Okay, so obviously this is working. Let's head back over to our Google Sheet and see what's going on. We see some posts: "Reddit is joining the AI market," that's new. Scroll all the way to the right here and we've got a bunch more new ones, pretty sweet: LLM hallucinations, world's second-fastest supercomputer, ooh, very clean. I am still noticing that a lot of these have the "Images:" text in the body, though, and I'm not liking this now that
I'm running this again; I'm noticing that tons of them just say images. So there are a couple of other things I could do. I could set a procedural filter so that when a post has images like this, I just don't allow it through: "relevant" would not only mean passing the AI relevance check, it would also require that the body does not contain the text "Images" with a capital I and a colon immediately afterwards. That seems pretty reasonable; yeah, that's probably what I'm going to do. You could also update the AI prompt to do that for you; the way I see it, if you're going to be consuming this many tokens anyway, that's a pretty quick and easy hack. In my case the procedural images filter makes more sense, though, so that's what I'm going to do after this run is done. Okay, we just finished with all of these, so what I'm going to do is add a filter to make sure that the body does not contain "Images:", colon included.
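That filter is just a "does not contain" condition in Make, but written out as code it would look something like this (the exact "Images:" marker is assumed to be what the scraper puts in image-only post bodies):

```python
# Sketch of the procedural "Images:" filter. In Make this is a single
# "does not contain" filter condition; here it's the same check in Python.
def passes_image_filter(body: str) -> bool:
    # Reject image-only posts, which carry no text worth summarizing.
    return "Images:" not in body

posts = [
    {"body": "Images: https://i.redd.it/example.png"},  # hypothetical rows
    {"body": "OpenAI announces a new model today..."},
]
kept = [p for p in posts if passes_image_filter(p["body"])]
print(len(kept))  # -> 1
```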
So this will now only proceed if the body does not contain that string. Awesome. What we're going to do now is write a whole newsletter. Let me just check: am I sending this to myself? Am I going to get this? Yes, I've already gotten one. Cool, I'm going to run this now; it's going to go through and select six, and I want you guys to pay attention to what's happening here: we're updating each of these from
new to published on the right-hand side, right? So you're seeing this sheet basically get incorporated into the flow now, which is pretty sexy. We're now going to aggregate the headlines and the snippets, write the intro and the title, convert it to markdown, create the campaign, and then perform the campaign action. Hm, it does look like there's an issue with the Google Sheet, unfortunately, and it's what I thought: the collections are raising an issue here, which blows, but essentially anything that has a nested collection in it is not going to work. So let's see: the recipients collection here, hm, maybe we just want the list ID instead. I'm just going to replace these nested collections with flat variables, and I'm putting them in my URL bar so I can remember them later. Actually, we don't even need the delivery status collection; we can just use enabled. So: the settings collection becomes the subject line, the tracking collection becomes HTML clicks, the recipients collection becomes the list ID, and delivery status just becomes enabled.
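To see why the nesting breaks the sheet, here's roughly the shape of a MailChimp create-campaign payload; recipients, settings, and tracking are real Marketing API collections, while the specific values are illustrative:

```python
# Roughly the shape of MailChimp's create-campaign payload (Marketing API).
# The nesting is the point: a Google Sheet cell can hold "AI Weekly #4",
# but not a whole collection, so we flatten each one to a single scalar.
campaign = {
    "type": "regular",
    "recipients": {"list_id": "abc123"},  # flatten -> list ID
    "settings": {
        "subject_line": "AI Weekly #4",   # flatten -> subject line
        "from_name": "Loop",              # illustrative values
        "reply_to": "hello@example.com",
    },
    "tracking": {"html_clicks": True},    # flatten -> HTML clicks
}

# What actually goes into the sheet: one scalar per column.
row = {
    "list_id": campaign["recipients"]["list_id"],
    "subject_line": campaign["settings"]["subject_line"],
    "html_clicks": campaign["tracking"]["html_clicks"],
}
```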
As for the links array, I think that should be okay; I'm not entirely sure, maybe we just feed in the first one. Okay, cool, that should be sufficient. Now I'm just going to go back to my Google Sheet and update it, because I don't want this to break again, and obviously we can only run this a limited number of times, right? Why don't we go through and run this one more time? What we want is to rename this column to
list ID, this one to subject line, this one to HTML clicks, and then this one to... ooh, I don't remember what we called that one; I probably should have written that down. Anyway, we did get all of the texts, which is nice. There's this giant blue wall in front of us because any time you add a row to a Google Sheet, it automatically inherits the style of the row above it. That's all that is, and we can fix it just by selecting all of the cells and
restyling them here. Okay, great, and now we have the plain-text long string as well, with the actual whole newsletter just listed here, which is cool, and here as well, and this is the HTML. The plain-text long string is going to include some stuff like telephone and unsubscribe links and so on and so forth, just because that's what MailChimp adds, even though none of that actually exists yet. And voilà, we now have our finished system with our newsletter. Had a lot of fun putting that one together. If you wanted
to change the style or whatever, you could do so by changing the HTML template, but man, is that sexy, and is that completely autonomous. Last but not least, before we finish, let me show you guys how to schedule this so that it runs whenever the hell you want. There were two scenarios, right: scenario one, which sourced the data, and scenario two, which actually did something with the data. For scenario two, just head over to the schedule and make sure it's set to days of the week if you want to send
this weekly, and then pick Monday, Thursday, whatever, with a time down here, and just make sure to clarify whether it's a.m. or p.m. Once this is done, just turn it on. Then head back to the first scenario, where it says Watch Actor Runs; you'll notice that this scheduler shows a little lightning symbol, which stands for "immediately." Basically, it's awaiting a webhook, so if you want this to work autonomously, you have to go to Schedules and create a new schedule in Apify, set it to weekly, and have it run whenever you want; in my case, Sunday at 12:00 a.m. UTC is perfect. Then go down here, click Add Actor, and just pick the Reddit scraper (or the lite one); the input is just going to be whatever those three or four, however many, subreddits you wanted. We click Save, then Save and Enable, and now we have a schedule where this is triggered completely autonomously. I'll name it "scrape Reddit weekly."
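Under the hood, an Apify schedule is driven by a cron expression, so if you'd rather create it through the API than the console, a sketch like this should work; the /v2/schedules endpoint is documented, but treat the exact body fields (cronExpression, actions, runInput) as things to verify against current docs:

```python
# Sketch: creating the weekly scrape schedule via Apify's REST API instead
# of the UI. "0 0 * * 0" is standard cron for Sunday at 00:00 (UTC here).
# The body fields below are assumptions to verify against Apify's docs.
import requests

APIFY_TOKEN = "YOUR_APIFY_TOKEN"  # placeholder
schedule = {
    "name": "scrape-reddit-weekly",
    "isEnabled": True,
    "cronExpression": "0 0 * * 0",  # minute hour day month weekday
    "timezone": "UTC",
    "actions": [{
        "type": "RUN_ACTOR",
        "actorId": "trudax~reddit-scraper",  # placeholder actor ID
        # Hypothetical input shape: the subreddits you want scraped.
        "runInput": {"body": '{"communities": ["r/artificial"]}'},
    }],
}
resp = requests.post(
    "https://api.apify.com/v2/schedules",
    params={"token": APIFY_TOKEN},
    json=schedule,
)
print(resp.status_code)
```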
Basically, what's going to happen is that once a week, that schedule is going to trigger, and every time it does, it's going to kick off the dump-to-Google-Sheet automation, which was automation number one. After that's done, it automatically updates this big sheet of ours. And you don't have to do this once a week; you could do it two or three times a week if you wanted to source tons of posts. But anyway, after that's done, the second scenario, which will also run once per week, in our case Monday at 6 a.m., and again you can
run it as many times as you want, will go through that sheet autonomously and update all of the new posts to published as they are sourced and used in actual content. And so, in this way, we have effectively created a closed-loop automation which does all of this completely without any sort of human intervention, which is why an approach like this can be so goddamn powerful. I really hope you guys liked that video; I had a lot of fun putting the system together. If you have any questions about how I did so, feel free to
leave them down below as a comment, and as I mentioned at the beginning of this clip, I take a lot of my requests from viewers now, so I'm more than happy to build out a system as long as I haven't done it before and I'm not just doing the same thing over and over again. Aside from that, do the fun YouTube stuff: like, subscribe, get me to the top of the algorithm. If you haven't already subbed: my watch time from non-subscribers relative to subscribers is going way up just because I'm getting more and more popular, but if you find yourself in the unsubscribed camp, please do me a solid and subscribe. And yeah, I'll catch y'all in the next video. Really appreciate the time.