Hey guys, in this video I'll show you how to set up a no-code AI agent team that scrapes any website or social media in seconds, all by just sending it a quick message on Slack. This agent team can, for example, be used to find and scrape new leads, do competitor analysis, and research potential prospects, but the possibilities with this setup are really endless. I'll show you how to let your agents scrape any website, do vision-based scraping, and scrape social media profiles and posts, which can be a bit trickier to do. I'll show you all of this by giving you a demo and a detailed breakdown of a competitor analysis agent that I recently delivered to a client. But this is just one use case; this setup could be used for many other purposes, and the template will be free, as always, in my free community. If you don't know me yet: I'm Ben, I've been implementing AI automation and AI agents into businesses since 2023, and I also run a community with over 500 AI agent builders. If you're a company looking to be one of the first to adopt AI into your business and want me and my team to help you out, you can also book a free call with me in the description below.

So I'll first give you a quick demo of the competitor analysis agent in action, then an overview of the setup for this agent team, and then a detailed breakdown of how you can set it up yourself, including how you can let your agent scrape all of these different platforms. Now, of course, this is not the actual agent I delivered to the client; I recreated it quickly for myself, but it has almost the same functionality. The way this agent is set up: we have it here inside of our Slack, the competitor analysis agent, and we can instruct it to research different companies, tell it which platforms to do the competitor analysis on, and also for which time frame. I want to give you a full breakdown of everything you can do, so I'll instruct our agent to do a full competitor analysis, which means it will scrape everything. We can say something like: "Hi, please do a full competitor analysis on..."; I'll take the same examples I always take, since I come from the CRM space: HubSpot and Pipedrive, "...for the last month." Now, this is going to take a while, because it's scraping lots of data in the background and building reports on each of them. Since it will take a few minutes, I'm just going to sneak in a quick call to action.
I know there are some people who have watched some of my earlier videos but haven't subscribed yet. I understand, I do the same, but it really does help me if you subscribe and maybe like this video; I do appreciate it. Anyway, this is going to take a few minutes, so I'm going to speed up the video. And now we've got it back. It was sent to another channel, I should have had it sent here, but we got it back: here are the comprehensive research reports for HubSpot and Pipedrive. We get multiple documents, one for each platform it did an analysis on: a news report, a review analysis (review platforms), and reports on their blogs, branding, LinkedIn, X, and YouTube. You can add more if you want; this company only wanted LinkedIn and X, but I'll also show you how to do this for other social media platforms.

So you can see the research reports here; I'll open them up quickly so you get an idea. First the news: it simply checks whether there are any interesting mentions of HubSpot and Pipedrive in the news over the last month. "HubSpot has launched Breeze, a comprehensive AI-powered platform, at the INBOUND 2024 conference": just the kind of information a company might need to know about how HubSpot or Pipedrive appeared in the news. "Pipedrive announced the beta launch of Pipedrive Pulse", etc. So it finds interesting things that might be useful to know about competitors. Then we have the reviews: it looks at review platforms and identifies opportunities for our company, meaning it reads the bad and good reviews to see how you could position your own brand a little better. You can see the overall weighted rating, 2.9 out of five; common issues: customers frequently complain about poor customer support; significant concerns about the cost of the platform; users have issues with account management, including unauthorized reactivation; positive highlights: its comprehensive CRM and marketing tools. And the same for Pipedrive. It even extracts one negative and one positive review if you want a better idea: "Absolutely terrible company to deal with, this system is complicated and purely built to take your money with underhand tactics", versus the positive "We use HubSpot every day"; the same again for Pipedrive. Then it even generates opportunities for your own brand: given the negative feedback on customer support, we should emphasize our commitment to providing responsive, knowledgeable, and personalized customer service; pricing transparency, which, as noted, shows up in some of the bad reviews; account management; user experience. So you get a nice little overview, very quickly, of the review landscape around these competitors.

Then we have the blog report: a summary of the current themes in their content. HubSpot's recent blog posts focus on digital marketing strategies, AI integration in marketing, customer journey mapping, and SEO optimization; it gives you some examples, and the same for Pipedrive. Then again it generates opportunities based on their content, what's working well and what's not: content gaps (things they are not covering), thematic opportunities, etc. Then we have branding, which is actually vision-based scraping. I forgot one thing here: normally you also get a screenshot of their homepage. With this branding research report we do visual scraping to get an idea of their overall branding and visual identity: "the homepage utilizes a soft color palette featuring peach and blue elements, enhancing a friendly and approachable tone", etc. It analyzes the copy and the visual elements. I put this in just to show you how visual scraping works, because it can be quite powerful, especially for image or brand research that you can't really do with text-based scraping. So we can see we also get opportunities here. Then we
have LinkedIn: the LinkedIn research report, again with summaries for each company. We get common themes and topics: they frequently focus on seasonal themes, company events, and motivational content, and they often use a casual, engaging tone, making their posts relatable and shareable. Then we see engagement metrics: around 1,100 average likes and 44 comments; post frequency and timing, every two days; common hashtags used; and the same for Pipedrive. Then we even get the top-performing post in that time frame (all about pumpkin spice lattes, Gilmore Girls returning, and counting down to Q4, at 1,100 likes), and we can check out the link if we want; we get the top posts, and the same for Pipedrive, and then again opportunities, same idea as before. Then we have X, which is very similar: common themes and topics (they use a lot of CTAs: "get your tickets here", "watch the live stream"), average likes of 12, post frequency and timing, the same for Pipedrive, and again the top-performing tweets, which we can also check out, plus opportunities; you get the idea. And lastly we have the same for YouTube: a summary of each channel, general channel engagement metrics, etc. It's just a quick look, but you can see the power of this, and also why it took so long: our agent was working for 15 minutes, because you can imagine the amount of data it's scraping in the background, analyzing, and then turning into these competitor analysis reports. If a human did this, and there are many marketing people who do, they would save a lot of time, and that's why the company was very impressed and happy with this solution. But first I want to give you a quick breakdown of the agent setup, because I think there are many other use cases for this, and then I'll give you a detailed breakdown of the agent inside of
Relevance AI and the scrapers. I've set this system up inside of Relevance AI and make.com. If you're new to Relevance AI: it's a no-code builder for AI agents and AI agent teams. This setup is a little more complex, and if you're completely new to Relevance AI, I have many other, simpler tutorials on it on my YouTube channel. I tried to keep this one straightforward too, but if it goes over your head, I also have a full beginners tutorial on Relevance AI, which I'll link up here. I use make.com to scrape the platforms we can't scrape inside of Relevance AI; we then give our agent inside of Relevance AI access to those make.com automations so it can get the scraped data from those other platforms. You can start out on both platforms completely for free, and both are completely no-code. Some people have doubts about the Relevance AI pricing, because the first paid plan is $200, but don't be fooled by that: you can actually start on a free plan, and you only pay $2 for every thousand credits you use, so you can play around with it quite cheaply. I can tell you, you can do quite a lot with a thousand credits. So both platforms are very cheap and easy to start out with; I'll make sure to link them in the description below too. The way this agent team is set up is the following: here we have the trigger for our agent, here we have our competitor analysis manager agent and his tool, and here we have his sub-agents
and their tools. The way this works in practice: I trigger it through a Slack message, as I just showed in the demo. That message is sent to my manager agent, who has two responsibilities, as always: first, delegating the tasks to the sub-agents, who actually do the work; and second, communicating back to me. That's why we've equipped him with one tool, the send-Slack-message tool (which I set up wrongly, as you saw; I sent the result back to the wrong Slack channel). But why don't we let our manager agent do all of these tasks himself? In general, we want to limit the number of responsibilities of our manager agent as much as possible, because LLMs are generally not good at juggling multiple tasks, and this manager agent already has to break down my query and communicate back to me. So we limit its responsibilities by offloading all the other work to sub-agents and tools, and that's really how we want to think about these systems to make them as reliable as possible. So when the query comes in, our competitor analysis manager agent instructs these two sub-agents what to do. The first one is the social media scraper, who has all the tools needed to scrape the social media platforms: the LinkedIn scraper tool, the X scraper tool, and the YouTube scraper tool. I didn't set these up in this specific build, but you could also give him a scrape-Instagram or scrape-Facebook tool. He does the scraping, writes the reports, and sends them back to our manager agent. Then we have the second sub-agent, the general web scraper agent, who has four tools: the public review website scraper, the blog scraper, the news scraper, and a vision-based scraper, which I showed in the example of the branding competitor analysis report. Again, he sends his reports back to the manager agent, who then sends me all the reports through Slack. That's the way
this system is set up. Now, this is just one use case of this setup, but I think there are many other very interesting ones; I wrote down a few that came to mind quickly. The first, which I think is very powerful, is using this system for lead scraping. You can imagine a setup like this where you scrape leads from websites and directories, but maybe even more interesting is scraping leads from social media, because (and I'll show you how to do this later too) you can scrape leads from people who engaged or interacted with certain types of posts on different social media channels. For example, someone who liked or commented on a post about sales, or on a post from a competitor: we can scrape those leads, and you can even imagine a scenario where we personalize outreach emails or DMs based on the engagement we've seen those leads have with a specific post. That's just one example; there are many lead-scraping use cases for this setup. You could also imagine finding influencers: companies that look for influencers who talk about specific topics in a specific space. We could, for example, let this agent run every day, identify the top posts and top influencers on certain topics, and then the agent could immediately start personalizing outreach to those influencers to try to get them on board. We can also think of content idea generation through this system, of course: reports on competitors, what works for them, and people in the space with well-performing posts. We can think of many, many scenarios: outreach personalization, as I said before, and researching potential prospects, but again, many more use cases beyond that. I think it's a very interesting setup with lots of possibilities. Anyway, now let me show you in detail how I set this up inside of Relevance AI and make.com. So here we are in my Relevance AI dashboard. Remember, I put the full template of this agent team in my free community too, so if
you want to check it out in more detail, you have it there. But even if you clone it, you do have to change some things for yourself, so make sure to stick with me. I'll go over it agent by agent: first the competitor analysis manager agent, then the sub-agents and their tools, and through that process I'll show you how you can scrape basically anything. So let's start with the manager agent. You can see here the request I put in on Slack, and we can see what happened in the background. Here we got the trigger, "please do a full competitor analysis on HubSpot and Pipedrive", and you can see our agent did the following things. First it got the current date; it does that to work out when "the last month" actually is. If you want to know what this tool does: it's just a very quick Python script that gets the current date. That's the only piece of code I used, so don't worry if you don't know how to code. So that's the first thing it did, and then it delegated to the general scraper agent, and here we can actually see what it told its sub-agent to do: "please perform a full research analysis on HubSpot and Pipedrive for the last month; include blog posts, review websites, news, and branding".
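That get-current-date tool, the only piece of code in this build, can be sketched in a few lines of Python. To be clear, this is my own sketch: the function name and the optional `days_back` parameter are illustrative, not taken from the template.

```python
from datetime import date, timedelta

def get_current_date(days_back: int = 0) -> str:
    """Return today's date (or the date `days_back` days ago) as YYYY-MM-DD.

    The agent can use the returned value to turn a relative request like
    "the last month" into a concrete date range.
    """
    return (date.today() - timedelta(days=days_back)).isoformat()
```

With `days_back=30`, the agent gets the start of the "last month" window alongside today's date.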
see this as a prompt: our manager agent basically prompts its sub-agents on what to do. Then you can see in the background what our general scraper agent did. He also used the get-current-date tool; so actually I didn't need it in the manager agent, because the sub-agent has it too. That's a mistake on my part, but you can see they both have it. Then he used the news research scraper, then the review analysis, then the blog analysis, and then the branding visual analysis, which is the visual scraper. He did all of that, wrote the competitor analysis reports, and then you can see the message he sent back to our manager agent: "here are the links to the comprehensive research reports for HubSpot and Pipedrive", with all the links. Then you can see the next step our manager agent took: delegating to the social media scraper agent. Same thing, he prompts him what to do: "please perform a full research analysis on HubSpot and Pipedrive; include LinkedIn, X, and YouTube". In the background, our social media scraper agent does the LinkedIn analysis, the X analysis, and the YouTube analysis, and sends the reports back to the manager agent. Lastly, you can see our manager agent used the send-Slack-message tool to send all the research reports back to me through Slack. That's how it works in the background. Now let me go over the manager agent setup very quickly, and then I'll go over the sub-agents and all of their tools step by step. For the Slack integration: Relevance AI doesn't actually have a way to trigger your agent through Slack, so I set that up through make.com as well. I'll show you that Slack trigger, if you're interested, at the very end of this video, because first I want to show you these
agent setups and the scraping setups; if you're interested, you can check that out at the end. So here we have the manager agent setup. Here we just have the name and the agent description, not that important for the manager agent, and here we have the integrations. Again, we don't have the Slack integration here, which is why I did it through make.com. Then we have the core instructions, which are basically the system prompt, or agent prompt. I'm not going to go over it in detail, because you can check it out in my free template, and I also have a full tutorial on my YouTube channel about agent prompting and prompting in general for these AI agent and AI automation systems. If you want to learn more about agent prompting, which is a bit different from normal prompting, check out that video; I'll make sure to link it up here too. Basically, we give it a role, the objective, some context, and the SOP, which is very important in these agent prompts: what does it have to do, and in which case? That's usually the most important part of these agent prompts. Also very important is the tools and sub-agents section, where we give the manager agent more context on what tools it has and when to use them, plus which sub-agents it has, what they can do, and when to use them. I also always add in how to communicate: what it actually has to send to, or instruct, its sub-agents when it gets a query, because our sub-agents can only do their work properly if they get prompted properly. So that's an important part to include too. Then we have the flow builder. The flow builder is there to double down on that SOP: sometimes it's difficult to write out in prose, inside the prompt, the flow our manager agent (or any agent) has to follow, and the flow builder lets us express that in an easier way. Now, in this case it's not an extremely complex
SOP, but I did put it in, just for example purposes, so we can double down on that SOP in this flow builder. You can see we can put in either instructions or conditions: instructions are simply "this is what you have to do", and conditions are "if this happens, do this; if that happens, do that". So you can see: make sure you have all the necessary info, (a) companies to research, (b) platforms to research, and (c) the time frame to research, because our manager agent always needs those three data points to be able to do his work. That's the first instruction we give him: make sure you have all of that information before you actually get to work. Then we have a condition: if the user asks for a full competitor analysis, follow this SOP, which in that case means it has to use all the sub-agents to produce a full competitor analysis report, so we have "use general scraper" and "use social media scraper". You can add those sub-agents by typing a slash and then choosing your sub-agent. So that's condition one, and condition two is: if the user asks for analysis on a specific platform, say I only want research on LinkedIn, then it doesn't have to use the general scraper, so it follows a different flow: use the specific research agent for the specific platform. In that case it would only use the social media scraper and instruct it to do the research on LinkedIn only. It's probably not that important in this setup, because it's quite an easy SOP, but when you have a more difficult one, it works very well to double down on the SOP in this flow builder. Then we have these other settings, which are not that important, and then we have our tools. We have the get-current-date tool, which is actually not necessary, as I showed you before, because the sub-agents also have it, and then we have the send-Slack-message tool, which is very easy.
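As a mental model, the routing those two conditions encode could be sketched like this in Python. This is purely an illustration: the agent labels and platform buckets are my own, and Relevance AI's flow builder is not actually exposed as code.

```python
# Which platforms each sub-agent is responsible for (my own labels).
SOCIAL_PLATFORMS = {"linkedin", "x", "youtube"}
GENERAL_SOURCES = {"blogs", "reviews", "news", "branding"}

def pick_sub_agents(platforms: list[str]) -> set[str]:
    """Map the platforms the user asked for onto the sub-agents to delegate to.

    A "full competitor analysis" passes every platform, so both sub-agents run;
    a LinkedIn-only request routes to the social media scraper alone.
    """
    requested = {p.lower() for p in platforms}
    agents = set()
    if requested & SOCIAL_PLATFORMS:
        agents.add("social media scraper")
    if requested & GENERAL_SOURCES:
        agents.add("general scraper")
    return agents
```

For example, `pick_sub_agents(["LinkedIn"])` selects only the social media scraper, which is exactly the second condition in the flow builder.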
I can show you that one very quickly, because Relevance AI has the send-Slack-message action natively integrated. Here we have the message, which our agent fills out, and this description is basically a prompt to our manager agent on how to fill out this input. You can really tell it what to do here: "add all the links of the research reports for me; make sure to specify which specific research report each one is, for example 'X report: link'", etc. Our manager agent fills this message out, we store it in a variable, and then we use the send-Slack-message step, which is natively integrated into Relevance AI: you can just search for "Slack", pick "send Slack message", and connect it here. Now, I sent it to the wrong channel, so actually I should have selected this one. Then we add in the message, in this case the variable our manager agent fills out, and we send it. That's it.
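To make that message step concrete, here is a rough sketch of what the filled-out message could look like. The function name and the report/link values are hypothetical; in Relevance AI the LLM composes this text from the description prompt and the native step posts it (outside Relevance AI you could post the same string yourself via Slack's `chat.postMessage` Web API method).

```python
def format_report_message(reports: dict[str, str]) -> str:
    """Build a Slack message body: one line per research report link.

    `reports` maps a report name (e.g. "LinkedIn report") to its document URL.
    """
    lines = ["Here are the competitor research reports:"]
    for name, link in reports.items():
        lines.append(f"- {name}: {link}")
    return "\n".join(lines)
```

The agent effectively produces a string like this and the send-Slack-message step delivers it to the chosen channel.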
very easy. And then we have the sub-agent section of our manager agent, where you can see he has access to his two sub-agents: the general scraper and the social media scraper agent. Here we have some extra configuration: we can decide whether to let these sub-agents auto-run, or whether to require human approval. That can sometimes be useful for tools too, if we want to double-check before an action is taken; you can imagine that when sending an email, for example, you want to review it before it goes out. In that case you use these approval-required steps, and the agent has to ask for approval before it actually runs that tool or sub-agent. Here I've put them on auto-run. Then we have two more settings. The first is the "prompt for how to use": again a prompt for our manager agent describing what this sub-agent, the general scraper agent, does and what it will report back. In this case I prompted it like this: "this agent can scrape review websites of competitors, blog posts of competitors, and news on competitors, and do branding research; he will report the research reports back to you". So I make very clear what this general scraper agent can do, which platforms it can scrape, and what results to expect back from it. And then we can do one more thing, which is the template for communication: we can put in a template that our manager agent will use when
he communicates with the sub-agent, because, remember, these sub-agents are basically prompted by the manager agent. Here we can make sure that our manager agent always prompts the sub-agent in the right way. In this case I left it empty, because it did it correctly on its own, but you could write something like "please do research on" and then add a variable, "companies". That variable description is then another prompt for your manager agent, telling it what to fill in: the companies I asked to be analyzed. So: "please do research on {companies}", then another variable for the platforms to research, and another for the time frame. Like this we have more control, and we make sure our manager agent always passes all the necessary data to the sub-agent, because if our manager agent only sent "please do research on HubSpot", our sub-agent wouldn't have enough context to do its work properly. So we can make our system more reliable through this. It's just an example, but it can be useful, and the same principle applies to the social media scraper agent. The rest of the settings are not that important, so let me now go through, first, the general web scraper agent and
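(A quick aside: such a communication template amounts to simple string interpolation. The sketch below is illustrative only; the wording and variable names are mine, not the exact template syntax Relevance AI uses.)

```python
# Hypothetical delegation template; the {placeholders} are the variables
# the manager agent is prompted to fill in before handing off the task.
TEMPLATE = ("Please do research on {companies} on the following platforms: "
            "{platforms}, for this time frame: {time_frame}.")

def build_delegation_prompt(companies: list[str],
                            platforms: list[str],
                            time_frame: str) -> str:
    """Render the template so the sub-agent always gets all three data points."""
    return TEMPLATE.format(
        companies=", ".join(companies),
        platforms=", ".join(platforms),
        time_frame=time_frame,
    )
```

The point of the template is exactly this guarantee: the sub-agent always receives companies, platforms, and time frame, never a bare "please do research on HubSpot".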
then the social media scraper agent, and all their tools, because I want to show you how we can actually scrape all of these platforms. So, first the general scraper. Before starting, I want to give you a quick overview; I don't want to make this a five-hour tutorial, and I'm already struggling to keep my videos concise. There are basically five types of scraping involved here. The first is very easy web scraping, which we can do directly inside of Relevance AI. The second is web scraping that we cannot do directly inside Relevance AI, because some websites have limitations or Relevance AI has no native way to scrape them; Google News, for example, we can't scrape directly from Relevance AI. The third is visual scraping, which is also not straightforward in Relevance AI, because for vision scraping we need to take screenshots and then give them to a vision model to read and interpret. Then we have social media scraping: LinkedIn we can do inside Relevance AI, but basically all the other platforms we cannot, and for those we have a few other options. Social media platforms in general make scraping a bit hard, because, as you can imagine, they don't want people scraping loads of their posts, but there are a few no-code ways around this, basically through make.com, since we can't do it in Relevance AI. So: the review websites we can scrape in Relevance AI, because that's just web scraping; blog scraping we can also do inside Relevance AI; the visual scraper we run in make.com, because we can't take the screenshots inside Relevance AI; and for social media, LinkedIn we can do in Relevance AI, while X and YouTube we do in make.com. If you want to scrape other social media, maybe Instagram or Facebook, there are ways to scrape those too, but I'll get into them later. First, I'll show you the general scraper tools. Let's start with the review analysis, because we can do that one here inside of Relevance AI, and then I'll go over the other ones, which are a little more complex to set up. For the review analysis, the way these tools are set up is, first
of all, if you don't know this yet: here we can give a description to our tool, in this case "analysis of public reviews of companies" (I misspoke there at first). This description is important, by the way, because it acts like a prompt to your agent on how to use this tool, or what this tool does, so it's always good to describe what the tool does. Now, the user inputs are the same for all of these tools, so it's important to understand them, and again, if this goes over your head, check out that beginners tutorial, look at this template in detail if you want, and I'm always available in my free community to answer questions, so let me know. Here we have the company names. We want our agent to be able to do research on multiple companies, and that's why we don't use a normal text input but a list input. A list input allows our agent to fill in multiple different companies, and that's what we instructed it to do; you can see it did it right, putting HubSpot in one entry and Pipedrive in another. Why do we do that? Because then we can run this tool on each of these competitors separately. Then we have the second input, the number of days to retrieve info for. As you saw, I wanted research on the last month, so this is the number of days in the past to retrieve reviews from, and my agent did it correctly: he put in 30, because we only want the analysis for the last 30 days. Again, this description is a prompt for your agent on how to fill it out. And lastly we have today's date. All of these, of course, we store in variables, which we'll then use in the next steps of our chain. In this case, the first step of this tool is actually finding the
public review websites for these competitors right so in this case we're only trying to find the trust pilot uh review page but you can do this for more if you want to so how do we do that first we just do a Google search to actually find the trust pilot page for these websites so you can see I have a Google search mod module set up here and um I added in here a four each now if you don't know what four is for each basically make sure we Loop through each of the competitors and
make sure that it does the Google search result on each of these variables in the list right so not only on one but because we want multiple we use the four each because it will make sure it Loops over each of the competitors now how does it work basically you have an advanced setting here sorry here and here you have you can add right you can add uh enable for each Lo right then you go go here to the variables and the for each Loop expect an array right and this these multiple inputs basically stored
as an array so we can add in that variable there and then once we do that in this search query where we normally just put in the text input right away right this is the the Google search query basically we put in another variable which is the for each item which you will be able to find as soon as you use this for each step you can find that here right the for each item it basically puts in each of the items one once when it Loops over this specific Google search step so I can
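To make the for-each idea concrete, here is a minimal Python sketch of what the loop is doing: run the same step once per competitor in the list input. The function and query text are illustrative stand-ins, not the real module.

```python
# Sketch of the "for each" behavior: one run of the step per list entry.
def google_search(query):
    # stand-in for the real Google-search module
    return f"results for: {query}"

competitors = ["HubSpot", "Pipedrive"]   # the list input (the array)
results = []
for item in competitors:                 # `item` plays the "for-each item" role
    results.append(google_search(f"{item} Trustpilot reviews"))
# results[0] corresponds to HubSpot, results[1] to Pipedrive
```

The key point is that the output is itself an array: one result per competitor, in the same order as the input list.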
To give you a quick example, I'll run the tool. You can see we actually get back two results: result 0 for HubSpot and result 1 for Pipedrive. That's what the for-each does — it runs the step twice, once per entry. In the next step we need to find the Trustpilot page in these Google search results — one for HubSpot and one for Pipedrive. All we do is an AI step, an LLM step, where we add in the variable from the previous step (steps → Google results → organic; you can find it right here), and it automatically loops over both of the Google search results. The prompt says: "find the URL of the Trustpilot page of the company in the Google search results below; only output the URL, nothing else — no summary, no explanation", and we put in the for-each item, which is where the Google search results for each company go.
And then we get back two results again: the Trustpilot URL for HubSpot and the one for Pipedrive. What we do next is use Relevance AI's built-in web scraper — "extract website content" — to actually scrape the data from these review pages. Same principle with the for-each: we have to use it every time we want to run a step on each company, and since we don't know whether the user will ask for one company or three, that's exactly why we use it. So we scrape the data here, and you can see we get the scraped data for each company — a huge blob of text. Of course we want to filter that down: only the last month, as we instructed, and only the relevant parts, so we can spot the opportunities and so on. That's why we use another LLM step here, basically to clean the data a bit and only extract the last 10 reviews — you could also do 20, whatever you want; I just wanted to trim it down. So you can see we get two results: for HubSpot the last 10 reviews plus the overall rating, and the same for Pipedrive. Now, because we want everything in a single report, we have to combine these two into one. What we have is an array of strings — an array is basically a list, of strings in this case — and to make one report we need a single string. That's where we use a step called "combine an array of strings". Again, if this goes over your head I understand, but you can look at it in detail in the template below, and most of the scrapers are set up in a similar way — if you understand one, you understand most of them.
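The "combine an array of strings" step maps directly onto a string join. A tiny sketch, with made-up review text standing in for the real scraped data:

```python
# One string per competitor (contents are illustrative placeholders).
reviews = [
    "HubSpot: last 10 reviews, overall rating ...",
    "Pipedrive: last 10 reviews, overall rating ...",
]

separator = "\n\n---\n\n"           # how the chunks are divided in the result
combined = separator.join(reviews)  # one single string for the report prompt
```

The separator is whatever delimiter you choose in the step; here it's a horizontal rule so the report-writing prompt can still tell the two companies apart.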
Here we put in the array to join — we want to combine these two entries into one, so we just pass in the variable from the previous step — and then the separator: how we want to divide the two inside our single string. So we get back one string with the reviews of each company, and then we want to turn that into a nice report. That happens in this one long prompt — you can read it in detail, but it starts "you're a professional marketing analyst with expertise in writing competitive analysis reports based on review data from public review websites" — where I instructed it how to write, what to include in the report, gave some examples, and so on. Then we get back the research report. In this case we want it as a nice Google Doc, and Google Docs accepts HTML, so when the research report comes back we transform it into HTML format for Google Docs — we simply ask an AI to "HTML format this for Google Docs". So we get back HTML, and then we have to actually put it inside a Google Doc. Unfortunately we still can't do that inside Relevance AI, so I use make.com to create the Google Doc. If you don't know how to integrate make.com with Relevance AI and give your agent access to these make.com automations, I have a full tutorial on exactly that — if you're completely new to Make, check it out; I'll make sure to link it at the top too.
Basically, here we have the automation that creates our Google Doc for these reviews. It's very simple: a custom webhook — our agent, or rather our tool, sends data to this custom webhook, we can then wire up different tools in make.com with that data, and with the webhook response we send data back. So all we're doing is setting up a custom webhook, which you do right here: create a custom webhook, and you get a webhook URL you can copy. Then in Relevance AI you add an API step, set the method to POST, paste in the URL of the webhook you just copied, and in the URL params you decide which information to send over: on one side the names of the variables you want to send, and on the other the data itself. In this case I want the title of my Google Doc to be "reviews report" followed by the date variable — so it will read "reviews report of 24th of September" — and then the report itself, which is the HTML-formatted doc, which I put in here. Now let's run it quickly so you can see what happens.
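As a rough sketch of what that API step assembles, here is the POST request in Python, using only the standard library. The webhook URL is a placeholder, and sending the report as URL params (rather than a JSON body) mirrors how the step is described above — treat the exact transport as an assumption.

```python
from urllib.parse import urlencode

# Hypothetical sketch of the API step: POST title + report to the
# make.com custom webhook. The URL below is a placeholder.
def build_webhook_request(webhook_url, title, report_html):
    params = urlencode({"title": title, "report": report_html})
    return webhook_url, "POST", params

url, method, body = build_webhook_request(
    "https://hook.example.make.com/abc123",      # placeholder webhook URL
    "Reviews report 24th of September",
    "<h1>Competitor analysis</h1>",
)
```

The actual HTTP call is omitted; the point is just what gets named (`title`, `report`) and what gets sent (the date-stamped title and the HTML report).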
You can see it got triggered, and here's the information that was sent to the webhook: the report and the title. Then we use a "create a Google Docs" module, built into make.com, to create the Google Doc. Inside that module you can see I put in the title variable we received from the webhook, and the report — the HTML competitor analysis report — in the content. The Google Doc gets generated, and then we send it back to our tool and our agent through a webhook response: all we put in there is the link of the Google Doc. As soon as it's generated, that Google Doc link is sent back — you can see we receive it here, and then we have our Google Doc. So that's how it works. Again, if this goes too fast, make sure to check out my other tutorial.
So that's the way this one is set up — this one is a bit easier because we can scrape these websites inside Relevance AI. The blog scraper is a very similar setup, so I'll go through it quickly: for blogs, too, we can just use the normal Relevance AI web scraper. You can see the same user inputs, then the Google search step to find each competitor's blog — again with the for-each, same system — then the AI step to find the blog URL for each company, then we scrape the blog, then we extract the last 15 blog posts to clean it up a bit, then we combine the results from HubSpot and the other competitors into one, then we write the research report — making sure it only covers the time frame we're looking at — and then we HTML-format the doc and send it to make.com to generate the report, just as I showed you. That's it for the blog: a very similar process.
Next up is Google News. This one is a little different, because we can't scrape Google News directly from inside Relevance AI. The setup is similar, but instead of scraping inside Relevance AI, we do the scraping inside make.com, because make.com gives us far more scraping possibilities. The user inputs are again the same, and we send the competitor names straight over to a make.com automation that scrapes Google News — you can see I put in the list with all the different competitors, and that information gets sent to the automation. Let me show it to you now: here's the automation that scrapes Google News, and I'll quickly walk you through how it works. Let's run it.
This one is a bit more complex, as you can see, but basically we send over the two names — HubSpot and Pipedrive — and this module is the one that actually scrapes the data from Google News. Because we get back a single variable containing both competitors, we first have to separate them into two, so that this module runs twice — once per competitor — to do the Google News search on both. That's why we use a text parser here: a text parser separates these different values out of one variable. If you're completely new to this, I'm happy to do a full make.com tutorial as well, but that's what the text parser does, and you do it by writing this pattern — what's called a regular expression.
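The text parser's job can be sketched in a few lines of Python: one blob in, a list of company names out, via a regex pattern. The pattern below is an illustrative one for newline-separated names — the real pattern depends on how your variable is formatted.

```python
import re

# Sketch of the text-parser module: split one variable into separate entries.
blob = "HubSpot\nPipedrive"      # single variable holding both names
pattern = r"[^\n]+"              # match each non-empty line
companies = re.findall(pattern, blob)
```

This is exactly the kind of pattern you can ask ChatGPT to write for you: show it the input blob, describe the pieces you want out, and it will usually produce a working regex.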
Regex might seem a little complex — I don't really understand it either — but ChatGPT is amazing at it. So that's what I always do: take this variable and say "hey, I want to separate this into two variables in make.com with the text parser", and ChatGPT is very good at producing the pattern. That pattern separates the two names, and that's what this module does. Then I use Dumpling AI, which is a great little tool for scraping right inside make.com. I really like Dumpling AI because it lets us do lots of different types of scraping in a very easy, straightforward way — there are many ways to do scraping, but a lot of them are quite complex, and Dumpling AI has taken most of the main use cases and made them really easy to use. Highly recommended; I also have a link in the description below. You can get YouTube transcripts, scrape URLs, screenshot URLs, search Google (which we can do in Relevance too), search Google Maps — for example to scrape leads from Google Maps — get Google News, and lots more. In this case we use "Search Google News". You can see how this module is set up: we put in the variable we got out of the text parser — the two company names — so the search query for Google News is basically just "HubSpot". And you can see the data we get back: a huge amount of information from Google News.
These are HubSpot news articles — you can see the title, the link, the snippet. If we wanted, we could even go on to scrape those links in later steps; in this case I didn't, because the snippet and title already tell us enough to identify the new things they're doing. You can see we get all of that information back, and it ran twice — once for HubSpot and once for Pipedrive. Then we have two text aggregators. What do these modules do? Without going into too much detail — I don't want this to become a six-hour tutorial, and it would very quickly — they clean up the data and give us back one result, since we currently have two. The first step cleans up the data: the way I set it up, I only want the title, the link, the snippet, and the date. I don't want all the other information we got back that isn't relevant to us — source, image URL, and so on — so this step simply keeps the variables I actually want. That's also the nice thing about Dumpling: it already organizes the data into these separate fields for you. Again, you can look at it in detail in the template. Then in the second step — you can see the data is cleaner now, just the title, the link, the snippet, and the date — it puts all that information together into one big text, so we can send it back to our tool inside Relevance AI. That's what we do with the webhook response: we send back the output of this last text aggregator, giving us a clean overview of all the scraped Google News items.
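Those two aggregator steps boil down to "keep only a few fields, then fold everything into one text". A minimal sketch, where the field names and sample items are assumptions based on what the Dumpling AI output is described as containing:

```python
# Illustrative raw items as the scraper might return them.
raw_items = [
    {"title": "HubSpot launches feature", "link": "https://news.example/1",
     "snippet": "short summary", "date": "2024-09-20",
     "imageUrl": "https://img.example/1.png"},
    {"title": "Pipedrive update", "link": "https://news.example/2",
     "snippet": "short summary", "date": "2024-09-18",
     "imageUrl": "https://img.example/2.png"},
]

keep = ("title", "link", "snippet", "date")
# Step 1: drop everything except the fields we care about.
cleaned = [{k: item[k] for k in keep} for item in raw_items]

# Step 2: fold all items into one big text block.
one_text = "\n\n".join(
    "\n".join(f"{k}: {v}" for k, v in item.items()) for item in cleaned
)
```

The single `one_text` string is what gets handed back through the webhook response.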
That information is then sent back to Relevance AI, as you can see here — all the scraped news items. Then of course we want to filter by date, because in this case we only want the last month, and we want a nice report. Again, we do that with this long prompt: "only extract relevant and interesting business news from competitor companies — identify relevant news, filter out irrelevant mentions, and only extract news from the last 30 days" — and this is where the today's-date variable goes in.
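Here the prompt asks the LLM to do the date filtering, but the same logic can be written deterministically. A small sketch, assuming each item carries an ISO-format date string:

```python
from datetime import date, timedelta

# Keep only items from the last `days` days, given today's date.
def filter_recent(items, today, days):
    cutoff = today - timedelta(days=days)
    return [it for it in items if date.fromisoformat(it["date"]) >= cutoff]

items = [
    {"title": "recent news", "date": "2024-09-20"},
    {"title": "old news", "date": "2024-06-01"},
]
recent = filter_recent(items, today=date(2024, 9, 24), days=30)
```

Doing this in code before the LLM step can make the "last 30 days" constraint reliable instead of leaving it entirely to the prompt.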
So that's what it does: it pulls out only the most important and relevant items, writes a little report, and then we do the same as before — we HTML-format it, create a Google Doc, and get the Google Doc back with the report. That's it for the news. Now for the visual scraper. It's a very similar setup, because we can again use Dumpling AI. In this example we want to do a visual branding analysis of each competitor's homepage, but visual scraping in general is, I think, very interesting for all kinds of image-based scraping: you take a screenshot and then feed it to a vision-capable LLM to interpret the screenshot and do whatever analysis you want. Since we can't take a screenshot inside Relevance AI, we again use make.com, and specifically Dumpling AI, which also has a module for taking screenshots. First, of course, we need the company's homepage: we use the Google search with "homepage" — again with a for-each — we extract the homepage URL, and we combine the array of strings (we get two URLs back) so we can send it through the API to our make.com automation. So here we send over the websites. Let me show you the visual scraper and run it quickly so you can see it in action.
You can see it was triggered. We send over the two websites; again we have to separate them, so we use the text parser, and then we run this "Screenshot URL" module twice, once per website — this Dumpling AI module takes the screenshot. Then, in the next step, we have an OpenAI step — we could also do this inside Relevance if we wanted, but in this case I did it in Make — where we feed in the screenshots this module took, to analyze them and write the report. Let me show you: this is the Whisper… no, sorry, not the Whisper model — this is just the GPT-4o module, which can also read images. We add in the image from the previous step — the screenshot URL — and then a prompt defining how to write the research report and what to analyze in these screenshots. And we get the report back.
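For anyone doing the vision step in code instead of a Make module, here is a sketch of the message shape, assuming the standard OpenAI Chat Completions image-input format (a `text` part plus an `image_url` part). The screenshot URL is a placeholder, and the actual API call is omitted:

```python
# Build the messages payload for a vision request: one text instruction
# plus one image URL, following the Chat Completions content-parts format.
def build_vision_messages(prompt, screenshot_url):
    return [{
        "role": "user",
        "content": [
            {"type": "text", "text": prompt},
            {"type": "image_url", "image_url": {"url": screenshot_url}},
        ],
    }]

messages = build_vision_messages(
    "Write a visual branding analysis of this homepage screenshot.",
    "https://example.com/screenshot.png",   # placeholder
)
```

You would then pass `messages` to a vision-capable model such as GPT-4o; the per-screenshot loop works exactly like the for-each pattern from earlier.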
Again, we get two reports back, so we clean up the data and combine it into one, send that back via the webhook, and receive it inside Relevance AI — you can see we get the research report on the homepages back. Then, as before, we want a Google Doc, so we HTML-format it and create the Google Doc. Now, you might be wondering why I'm doing this part inside Relevance AI rather than directly in make.com. You could also do it directly in make.com — just add another Google Docs module there and send the result back right away. I only did it this way because, for the client, I actually sent the information to another platform instead of Google Docs. So that's it for visual scraping — I think there are some interesting use cases there. Now let me get to social media scraping. For social media, let me first show you LinkedIn, because LinkedIn we can actually do inside Relevance AI.
Then I'll show you X and YouTube, and through that process you'll also understand how to do it for Instagram or Facebook. LinkedIn is again a very similar setup: the same user inputs — company names, number of days, today's date — then a for-each with a Google search to find each company's LinkedIn page, then the AI step to identify the company's LinkedIn URL, and then we use the actual built-in LinkedIn scraper from Relevance AI. That's why we don't need make.com in this case — they have it right in here; if you didn't know, just go here: LinkedIn → "get a LinkedIn profile and posts". So we retrieve the posts, again in a for-each, because we want them for both companies. Now I have to correct myself: in this second part I actually used a little bit of code to clean that data up, because we got a lot of scraped data back from LinkedIn. That's what it does: it cleans the data up a little and, instead of an array of strings, joins everything into one — you don't have to change this, it's already set up for you. Then of course we make the research report based on the LinkedIn results — another long prompt on how to write it and what to extract or leave out.
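That small cleanup-code step can be sketched like this — trim each post down to the fields you care about and join the array into one text. The field names are assumptions about what the LinkedIn scraper returns, not its actual schema:

```python
# Illustrative scraped posts (field names are assumptions).
posts = [
    {"text": "HubSpot post about a new feature", "date": "2024-09-10"},
    {"text": "Pipedrive post about a webinar", "date": "2024-09-12"},
]

# Flatten the array of posts into a single string for the report prompt.
flattened = "\n".join(f"[{p['date']}] {p['text']}" for p in posts)
```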
Then we do the HTML step to make the Google Doc — same process again. That's it for LinkedIn; relatively easy. Now, the other platforms are a bit harder, because in general these social media platforms make it tricky to scrape them. There are a few options, and they're mostly available in make.com — basically three main tools that make it easy for us to scrape social media platforms, and I'll go over all three very quickly. The first is Phantom Buster, which is what I specifically used in this build. It's a little more expensive, but it works quite well. Phantom Buster lets you scrape LinkedIn, Google, Instagram, Facebook, Twitter, etc., and you can scrape some quite interesting things: leads from LinkedIn company posts, and even leads from people who engaged with certain posts, which I think can be very interesting for lead prospecting. So there are quite a few possibilities with Phantom Buster, which I'm going to show you in detail, because mine is actually set up with Phantom Buster — the company already used it. There are two other ways to scrape social media easily. One is Apify: Apify is basically a marketplace where anyone can publish their own APIs and make them available to other people. It's quite easy to set up these Apify APIs inside Make, and you can see there are quite a few interesting ones available.
They're not only for social media, but I think they're especially useful for it: you can see an Instagram profile scraper, an Instagram post scraper — you can really scrape almost anything. The downside is the pricing: it's quite unclear, because you pay depending on which API you use — sometimes a subscription plus usage-based fees — and it can get murky how much it actually costs. As you can see, there's a monthly price, but you also pay usage costs for each API separately. That's the downside; still, if you really want to scrape social media, this is one of the easiest ways to do it, and there are lots of very interesting scrapers available. RapidAPI is very similar — the same concept: if you type in "Instagram", you'll find developers publishing their own APIs to give you easy access, such as an Instagram scraper API and so on. Lots of different ones available there too — definitely worth checking out. Now, Apify has a very easy way to integrate with Make; RapidAPI is not so straightforward, so Apify might be a little easier. In this case, though, I used Phantom Buster — I think it's a lot clearer in terms of pricing, and the company I built this for already used Phantom Buster, which is why I went with it. If you'd like to see me do a tutorial on Apify or RapidAPI, let me know in the comments.
I'll do that afterwards if so. But let me show you how I set this up with Phantom Buster, specifically for the X scraping. Same setup again, and then we have "find Twitter profiles": we do a Google search for each company's X account. I'll run the tool so you can see how this works too — we go to "X: get X posts". I know this is a slightly more complex setup, but I'll walk you through it. So I've run it, and you can see we found the X accounts of both platforms; then our make.com scenario got triggered, and we sent over those links — the HubSpot X link and the Pipedrive one. Again, we separate the two different links, and then we actually have to add them to a Google Sheet.
That's because of the way Phantom Buster works: you first build a tool inside Phantom Buster itself. I can show you — here we have the tweet extractor. So first you set up a tool, a "Phantom": you pick one from their solutions, whatever kind of scraper you want, and once you've selected one you end up on a screen like this where you do the configuration for your scraper. First you decide what to scrape from — a URL, a list, and so on. In this case I let it scrape from a Google Sheet: you can basically put the links in there (right now it's in there twice, of course), and the Phantom Buster tool will scrape the accounts listed in that sheet. I put it on a Google Sheet because what I do here is add those data points — the two X accounts — to the sheet, and then launch that Phantom automatically. You can also run it manually from here, but we want it to be automatic: I add the accounts to the sheet, then launch the tool to start scraping whatever is in it. If that doesn't make sense, let me know. Then we have to connect it to Twitter: you just click there, and it basically grabs your session cookie.
One important thing: the scraping actually happens from your own account. Twitter/X and some of these other platforms have limitations — they don't want people mass-scraping them — so you have to be a little careful and limit the scraping to a certain volume and a certain number of runs per day. I do a maximum of three runs a day, and that seems to be fine. Also, don't try to extract hundreds of posts at once, because that raises red flags inside these platforms. Here you can also set some configuration, like how many posts to extract per company, and that's it. Then, inside make.com, they have a Phantom Buster integration: once you've created a Phantom — that's what they call it — you can use "Launch a Phantom" and just select yours, as you can see here: "Untitled tweet extractor". And that's basically all we do here.
Because we've added the links to the Google Sheet, this Phantom pulls them from the sheet and starts running on those two accounts. Then we add a sleep module, because the Phantom takes a little while to scrape: if we immediately fetch the output — which is where we download the result — it won't have a result back yet. So we add a short sleep; 40 seconds is probably way too long, you likely only need 10, 15, or 20. Then we use the Phantom Buster modules to fetch the result. The raw output isn't very nice, and — ah, one correction — we actually need two modules here: first "get an output", which gives us the container ID we need, and then "download result", where we put in that container ID and actually get back JSON. As you can see, that's not data we can work with directly, so we use another module, "parse JSON": we feed the data in there and finally get back usable data. You can look at it in detail in the template.
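The sleep-then-download pattern generalizes to "poll until the result is ready". A sketch in Python, where `fetch_output` is a stand-in for the Phantom Buster "get an output"/"download result" calls and the fake fetcher simulates the result not being ready for the first two attempts:

```python
import json
import time

# Poll for the Phantom's output, then parse the JSON result —
# instead of hoping one fixed 40-second sleep is long enough.
def wait_for_result(fetch_output, attempts=5, delay=0.01):
    for _ in range(attempts):
        raw = fetch_output()
        if raw is not None:
            return json.loads(raw)       # the "parse JSON" step
        time.sleep(delay)                # the "sleep module"
    raise TimeoutError("no result from Phantom")

# Fake fetcher: empty twice, then the JSON payload arrives.
responses = iter([None, None, '{"tweets": [{"text": "hello"}]}'])
result = wait_for_result(lambda: next(responses))
```

In Make you approximate this with the sleep module; in code you can retry in a loop, which is more robust when scrape times vary.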
Here we clean it up and get all the tweets back, then put them together into one text with the array aggregator, and send everything back to the webhook. So here you can see we get all the tweets — if we go back to our tool, we receive them in a sort of table format: each tweet with the account, the tweet date, and the tweet link. And then, as always, we create the research report and make the Google Doc, and that's it. So yes, a bit of setup, but very powerful use cases — I think it's worth exploring. Then we have one more scraper — the last one, and afterwards I'll quickly show you the Slack integration — which is YouTube. For YouTube we can scrape two things: a YouTube channel and YouTube transcripts. Dumpling AI can also scrape YouTube transcripts.
I know some people had issues with my previous YouTube transcript scraper, which ran inside Relevance AI — that one stopped working — so if you want to do YouTube transcript scraping, you can do it with Dumpling AI too. In this case I only scraped the channels; you could also do both if you want — scrape the channels, get all the links, and then scrape the YouTube transcripts — but here I only did the channels. Same system again: find the YouTube channel link, combine the array of strings (both YouTube URLs for both companies into one), and send it over to Make. Why? Because scraping a YouTube channel page is not that straightforward — we can't use a normal web scraper. So in this case we use "get YouTube videos" via what's called an RSS feed ("retrieve RSS feed items"). What is RSS? It stands for Really Simple Syndication: some websites allow you to automatically pull standardized data from them, and it's usually offered by sites with constant updates — news websites, YouTube in this case, podcasts, and so on. So it's not only interesting for YouTube but for other channels too. The way this works, again, I'll show you through an example: we run this tool — that went quite quick — and we get the YouTube links we gathered inside Relevance AI, we separate the two links, and we feed them into the RSS module. Very easy.
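For context, YouTube exposes a standard Atom feed per channel (at `https://www.youtube.com/feeds/videos.xml?channel_id=...`), which is what an RSS module reads. Here is a sketch that parses a tiny hand-written sample feed instead of fetching over the network:

```python
import xml.etree.ElementTree as ET

# A minimal Atom feed sample, shaped like a YouTube channel feed.
SAMPLE_FEED = """<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <entry>
    <title>Latest competitor video</title>
    <link href="https://www.youtube.com/watch?v=abc123"/>
    <published>2024-09-20T10:00:00+00:00</published>
  </entry>
</feed>"""

ns = {"a": "http://www.w3.org/2005/Atom"}
root = ET.fromstring(SAMPLE_FEED)
videos = [
    {
        "title": e.find("a:title", ns).text,
        "url": e.find("a:link", ns).get("href"),
        "published": e.find("a:published", ns).text,
    }
    for e in root.findall("a:entry", ns)
]
```

The RSS module in Make does exactly this parsing for you and hands back the structured fields.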
You can just add the module here — "watch RSS feed"… sorry, it's "retrieve RSS feed items" — and it automatically brings the data back in a structured way: you can see the title of their latest YouTube video, the URL, the date created, images, category, source, and so on. If we wanted, we could even go on to scrape the YouTube transcript of each of these videos, to know in depth what happened in them — we'd just use the Dumpling AI "get YouTube transcript" module there and feed the result into an AI step too. But you get the idea. So here we get back all the latest video uploads from both of these companies, and then we clean the data up again — we don't need everything. In this case I only want the channel author, the title, the date uploaded, the URL, and the view count, and the data comes back cleaner — you can see one example here.
Then we put it all together into one text, as you can see, send it back to our webhook, and receive it inside Relevance AI; again we write the research report, HTML-format it, and make the Google Doc — that's how it works, and we get the Google Doc back. So that's it — sorry, a very long one. Now let me very quickly show you how to set up the Slack trigger, which is a bit of a hassle that some people don't know about. We go into Make and look up Slack — here's the Slack integration. It looks quite simple, but it's actually a bit of a hassle to set up, so I'll create a new one and show you step by step. The problem with Slack is that Make does have a Slack integration, but it won't trigger automatically when we send a new message, so we have to start working with the Slack API directly. It's a bit of setup, but I'll take you through it step by step. First, all we have to do is set up a webhook here: a custom webhook — call it whatever you want — click save, and we get a webhook URL. Then we go to api.slack.com and create a new app: click "From scratch", call it whatever you want — for example "app relevance" — pick your workspace, and create the app.
permissions here and there we go to redirect URLs and you'll have to add in two redirect URLs I'll make sure to add them in my uh templates too so the first one is this one we click on ADD then we add another one which is the same but just with three we click on ADD we click save URLs right then we go to event subscriptions here we toggle on and here we have to put in the uh make.com web hook URL right so we C copy this one we put it in here now it will
give us an error which is normal right because it basically needs a web hook response uh so what we do here is we add in a web hook response right in the body we can add the challenge variable and then in advanced settings we add in one custom header which is content type text SL playay right now we run it once we can go back to slack and retry and you can see now it's verified because it received the web hook response now we go to subscribe to bot events here basically we can decide when
to trigger inside a slack so uh we can do it either through a bot like when we tag with an ad sign or specific application but in this case we're just going to do it by sending a message to a channel so you can just type in Channel and here channels message Channel a message was posted to a channel right so we select that one we go to save changes and then all we have to do is go here on install app we can click on install click allow and now it's installed now all we
have to do is go back to Slack and select the channel you want your bot, your Relevance AI agent, to be triggered by. You go to Edit Settings, then to Integrations, and here you can add the apps; this is where you add in the app you just created. Now, what I've noticed is that it actually takes a little bit of time before the app starts appearing here, so it won't appear for me right now, but you can see I have a demo app that appears here, and then you can just add it. As soon as you send a message to that channel, the scenario will be triggered. Of course, once you've done that, we still have to integrate it with the Relevance AI agent. How do you do that? You can basically just delete this webhook response, and instead we add in an HTTP module where we're going to make an API call to Relevance AI. Here we select "Make a request". Again, I have another video that explains this
in a little more detail if this goes too fast for you. But basically, what we do here is go to the agent profile, and in the agent profile you'll find one setting called API. Here we can copy the endpoint, go back to Make, and put that in the URL field. For the method we use POST, and then we have to add in two headers: the first one's name is Content-Type with the value application/json, and the second one is Authorization, which is our authorization key, or API key. Here you have to paste in your API key, which you can find right here; you can generate it, copy that one, and paste it in. Then all you do is select Raw, select application/json, and put in the request content, which you can also find here under the request body. It's basically the message you're going to send to your agent, so you paste that in here, and you'll see "hello"; that's the message that gets sent to your agent. So you can take that "hello" away, because that's where you want to put in the variable holding the text message you send on Slack. Now, as I have not run this webhook yet, I won't have access to that variable yet. So as soon as you have your app installed in your Slack, do a test: make sure you run the scenario here, send a test message, and then you'll find your message variable available. You add that one there, and then you're all good; you're basically sending that message to your agent whenever you send it inside of Slack. Now, if you're still watching, sorry for the long one again, I really have to learn how to make these more concise, but thank you so much for watching. I'd highly appreciate it if you could like and subscribe, and maybe leave a comment if you haven't already. Thank you so much, and I'll see you in the next one.