In this video I'm going to show you how to build an incredibly useful real estate AI agent. It takes a Zillow URL for a particular city or town, and we're going to scrape it using Jina AI, an amazing tool that lets you scrape a website just by putting in the URL. We'll grab that information, scrape the site, and feed it into our real estate AI agent, which will provide really valuable insights like market trends, investment insights, timing of the market, and market conditions. We'll send all of this to a nice little Google Sheet that will have the address of each property, the price, the bedrooms, the size, the URL, the realtor, and a bunch of other useful information. On top of that, we're going to use more AI nodes to send us an investment summary, key highlights, and average prices, so that you're aware of the market trends, whether you're an investor or just somebody looking to buy a property; all of this information will be extremely valuable to you. It will also send really useful key takeaways and actionable recommendations on a weekly or daily basis, because we're going to use a schedule trigger so it's always scraping the site, whether that's daily or weekly. This is going to be a very, very useful AI agent, so stick around until the end, because you'll learn a lot from this. The template will be available in my school community, in the classroom section right here, under AI Agents with n8n, so all you have to do is import this template into your own workflow and you'll be able to get started, or make changes and update the prompt to meet your particular needs. If you're new to the channel: my YouTube channel and my school community are all about building incredibly useful AI agents for personal use or for
business use. I have a consulting agency, so you'll always be learning from the real-life examples and real-life experience that I bring, and you'll get a lot of value from our classroom section. I have exclusive topics there, like the deep-dive topics that go deep into vector databases and all things AI agents, plus all the tutorials and templates, so whether you're looking to learn more, update your skills, or you're completely new to n8n, you'll find a lot of value. And the biggest asset in the community is the community members: we have an extremely passionate group of people who are always looking to collaborate on different projects and learn from each other. I'll put the link in the description; hopefully I'll see you there.

All right, so what I'm doing first is removing this schedule trigger, just to show you the demo. Also, I used the OpenAI GPT-4o mini chat model because it's a lot cheaper, but you can use Anthropic or whatever chat model you prefer; I used Anthropic earlier and it worked great, but since I'm going to run this demo a few times, GPT-4o mini is a lot cheaper.

Our second node is going to be Jina AI, a very useful tool. I'm using just the HTTP Request node: the method is going to be a GET request, and the URL is the important part. Jina.ai is a really powerful service that provides different things like embeddings and reranking, but we're going to use the Reader. It's very, very simple to use; you don't even have to create an account, actually. You just go over there, click on "API key and billing," and you'll be given a unique API key. Copy that, then come back to the Reader section. The way this scrapes a website is that all you have to do is put https://r.jina.ai/ in front of the URL, and you'll be able to go ahead and scrape that website and grab the information for that particular page. Obviously we'll then define the JSON schema for the fields we want to grab. By the way, just to give credit: this section right here was inspired by ER, who's a great n8n workflow architect; I'll put the link in the description so you can check it out. Anyway, the way this gets scraped, as you can see right here, is that inside the URL field I'm putting this r.jina.ai/ prefix,
and then I'm literally copying and pasting the URL from Zillow right here. And again, whatever site you're using, whether it's Realtor, Zillow, or anything else, all you have to do is go to that city, search the name of the town, copy that URL, and bring it back here and paste it. So let's do that so I have the freshest set of data: I'm going to copy this URL, come back here, and paste it right here. Now we're going to set up our authentication. The authentication is going to be Generic Credential Type, and the only requirement is a header that carries our API key. If you look at Jina AI's documentation, it shows an example: you use the GET method, and the header is Authorization: Bearer followed by your API key, which you grab from the API key page. Once you have that, go back to your HTTP Request node, choose Header Auth, and create a new credential: put "Authorization" as the name, and paste "Bearer" followed by your API key as the value. That's pretty much it; the rest we'll leave as is. Now if I click "Test step," it should scrape that page and return all this HTML. Obviously this is raw HTML, so there's going to be a lot of data here that we can't even read, which is why we need to extract the relevant information we're looking for. And if we go back to our URL, you'll see these are all the real estate properties that have been sold recently: this one sold on 11/27, and today is November 27, and these other ones sold, for example, yesterday.
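If you want to sanity-check the reader call outside n8n, here's a minimal Python sketch of the same GET request. The Zillow URL and the API key are placeholders, and the Bearer-header pattern mirrors the Header Auth credential described above:

```python
import urllib.request

# The Jina AI reader is used by prefixing the target page URL.
JINA_READER_PREFIX = "https://r.jina.ai/"

def reader_url(target_url: str) -> str:
    """Build the reader URL by prepending the r.jina.ai prefix."""
    return JINA_READER_PREFIX + target_url

def scrape_page(target_url: str, api_key: str) -> str:
    """Fetch a page through the Jina reader, authenticating with a
    Bearer token in the Authorization header (the same header the
    n8n Header Auth credential sends)."""
    req = urllib.request.Request(
        reader_url(target_url),
        headers={"Authorization": "Bearer " + api_key},
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        return resp.read().decode("utf-8")
```

The response body is the page content in an LLM-friendly form, which is what we hand to the extraction step next.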
So you'll have the most recent market data. Once we grab that, the next step is to use the Information Extractor chain, which is native to n8n. You add it here, and the text is going to be json.data; I'm literally just grabbing that field and putting it here. Below, we define the input schema. The way I generated this was: first, I came up with a prompt. It says: you're an expert data extractor; analyze the given data and extract detailed information about the recently sold properties; for each property you need to provide the following. We need to grab the address, the price, the bedrooms, the size, the realtor, when it was sold, and what type of property it is (is it a condo, a single-family home?), so I identified those, and I say: always output the data in JSON as an array called "results." To generate the system prompt and the input schema, I basically provided a screenshot of that page from the real estate website and told it to create the JSON schema for me, using my n8n flow architect custom GPT. If you're part of the school community, make sure you use this custom GPT, because it will give you better results; I created this just from the prompt and the back-and-forth I had with GPT. So now it's giving me this custom input schema, this JSON schema, so that I'm getting the proper information extracted from the blob of HTML we receive from Jina AI. Now all I have to do is click "Test step" here, and it will use GPT-4o mini to grab the relevant information out of the HTML data that came from Jina AI. All right, once that's complete, as you can see, we're getting this nice JSON output with the address for every property, the price each one sold for, the bedrooms, and all of the information we requested.
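To make that concrete, here's a sketch of what the extractor's input schema might look like. The field names here are my assumption for illustration; yours will come from your own prompt or custom GPT, and they should line up with the Google Sheet column titles used later:

```json
{
  "type": "object",
  "properties": {
    "results": {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "address":   { "type": "string" },
          "price":     { "type": "number" },
          "bedrooms":  { "type": "number" },
          "size":      { "type": "string" },
          "realtor":   { "type": "string" },
          "sale_date": { "type": "string" },
          "type":      { "type": "string" },
          "url":       { "type": "string" }
        }
      }
    }
  }
}
```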
So we're getting all of this data, and we can take a look at it; by the way, we're also getting the URL for each property. The first one is 2694 43rd Avenue, San Francisco, sold for $1.3 million. Let's take a look at it: right here, as you can see, 2694 43rd Avenue, San Francisco, sold for $1.3 million on November 27. We want to make sure the sale date is correct, and yep, that's exactly right; it's a single-family home. The next one was sold for $895,000 on November 25th, and it's 318 Harris Street; if we take a look at that, 318 Harris Street, $895,000. So you can see this is very, very accurate, and we're getting the most recent data here, because again, we're just feeding that URL into Jina AI.

Once we have all of this great information that we grabbed from the HTML, we split it and send it down two different routes. First, we're going to split all of the information that's coming out. Right now there's one item coming out, meaning this is just one output result: if you take a look at the table or JSON view, you'll see that all of it is being sent out as one output. We want to split this so we can use our real estate agent data Google Sheet that I created; again, you can use Airtable or something else if you prefer, but this is a clean setup right here. This data was from the previous run, so let me go ahead and delete all of the existing data. Now we have a clean Google Sheet with no data, only the column titles. You want to make sure the column titles match the field names coming out of the extractor, so we can automatically map them to our sheet. So let's split this out, because I want to grab this data and separate it: in the Split Out node, the field to split out is output.results, it's right here, and I'm just clicking "Test step" to split this out into individual items. As you can see, I now have nine items: there was one item coming out of the Information Extractor chain, and now we've split it into nine. Next we're going to map this out, and if you use the "map columns automatically" mode, you want to make sure, as it even says right here, that the incoming data fields are named the same as the columns in the Google Sheet. So address, price, bedrooms, size, sale date, URL: make sure each one exactly matches the column title, so it can be mapped automatically. Let's test that step: if I click "Test step" now, it should populate the data right here in our Google Sheet. All right, perfect, let's go back; there you go, perfect. Now we have this really nice, clean set of data, with the URL for each one; if I hover over it I can see the URL, and I can click it to get more information, like the name of the realtor, the type, the price per square foot. Again, these are just the few pieces of data I chose; you can extract whatever you want based on what you're looking for, and all you have to do is go back to the Information Extractor and change the prompt.

All right, now that that step is done, we're going to send the data from the Information Extractor to our real estate AI agent, which will use some conditional logic based on the prompt we provide, and it has access to tools like the Calculator and SerpAPI, which means it has access to Google's search engine data. The reason I added the Calculator is that there are numbers involved here, like the price and the square footage.
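The split step can be pictured like this; the two sample properties are made-up data, just to show the shape of what's flowing through:

```python
# Hypothetical shape of the single item leaving the Information
# Extractor: one object whose "results" array holds every property.
extractor_item = {
    "output": {
        "results": [
            {"address": "2694 43rd Ave, San Francisco", "price": 1300000},
            {"address": "318 Harris St", "price": 895000},
        ]
    }
}

# Splitting out "output.results" turns each array element into its
# own item -- one row per property for the Google Sheets mapping.
items = extractor_item["output"]["results"]
for item in items:
    print(item["address"], item["price"])
```

Because each item's keys (address, price, and so on) match the sheet's column titles, the automatic column mapping can place every field in the right column.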
I want to give the agent that mathematical capability, so if it's doing any kind of number-related work, it can use that tool. This is going to be a Tools Agent, and we define the prompt below. If I maximize this: again, the prompt is just an example. I got it from the back-and-forth conversation I had with my ChatGPT n8n architect to figure out exactly what I need, and it generated this really great prompt for me. Your prompt might look different, so make sure you use that custom GPT to go back and forth until you get a nice prompt. Mine says: hey, you're a real estate data analyst specializing in market trends and investment opportunities; based on the following weekly real estate market data (you could say daily, or whatever interval you're looking for; I just put weekly, and I could even do daily, since that URL was for today, or the most recent data), analyze the provided property data and provide a detailed report covering the following aspects. And I define the aspects there: the market trends, investment insights, timing of the market, condition, specific suggestions it makes for buyers and for investors, and then a summary of the recommendations. And I'm providing it this data below as a JSON object; you want to make sure you're stringifying this, because it's coming out of the Information Extractor as an array, so you want to make sure you're doing JSON.stringify.
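The stringify step can be sketched like this (the sample data is again made up); inside an n8n expression, the equivalent would be something like JSON.stringify($json.output.results):

```python
import json

# Hypothetical array coming out of the Information Extractor.
results = [
    {"address": "2694 43rd Ave, San Francisco", "price": 1300000},
    {"address": "318 Harris St", "price": 895000},
]

# Serialize the array to one JSON string before embedding it in the
# agent prompt, so the model receives well-formed JSON text rather
# than a raw array.
data_for_prompt = json.dumps(results)

prompt = (
    "You're a real estate data analyst specializing in market trends "
    "and investment opportunities. Analyze the provided property "
    "data:\n" + data_for_prompt
)
```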