Hello everyone, welcome to the AI Anytime channel. In this video we are going to look at Dify AI. Dify is a low-code / no-code platform where you can build AI applications faster; they call it an innovation engine for GenAI applications. Think of it as a drag-and-drop builder: you create a workflow on the dashboard, add components, for example for a chatbot, a summarization task, or an AI agent, and you end up with an AI application. I already have a couple of videos on Dify where I have shown how you can access their dashboard and build there, but in this video we are going to set it up through Docker. We are going to do this locally, so you do not have to go to their cloud dashboard at all. This is a local setup video for Dify, so let's jump in and see how we can do that.
If you look at the screen, I am currently on their website, Dify AI. It says it is an innovation engine, and the team behind it is fantastic. Dify is not the only place where you can build AI applications without writing code. It arguably started with Flowise, which is backed by Y Combinator; you can check that out too, and I have a video on it as well. Then there is make.com, an automation platform very similar to Dify where you can build AI applications if you don't have coding expertise or the technical knowledge of how to build AI apps. You can try it out, build some hobby projects quickly, and of course use it for different use cases and validations, but if you want to take something to production you should have at least some understanding of AI. We also have Gumloop, which is again backed by YC if I'm not wrong; I have a couple of videos on Gumloop as well, where I have shown tasks like OCR (optical character recognition) and blog post automation on LinkedIn.
Now, Dify is available through their cloud and also through Docker, and as I said we are going to set it up locally. You can find the GitHub repository; it's called langgenius/dify. As I said, I have a video on Dify AI agents where you can see how to create AI agents without writing any code. I'm going to put this link in the description. If you don't know Docker, it's an amazing technology, one of my favourite technologies ever; I have been using Docker since it first came out, back when I was in college. The site calls it the number one most used developer tool, and it should be number one. Docker makes our lives easier: you build once and ship it to any other system that supports Docker, which is fantastic because it basically removes version conflicts and dependency issues from the equation. So we're going to use Docker to set this up, and specifically we're going to use Docker Compose. Let's jump in now.
So what I'm going to do here is open my terminal; you can see this is the terminal. If you are on Windows you can use Docker Desktop, and if you are on Linux you can install Docker with a one-line command, add your user to the docker group, and that's it. If you're using it on Windows you need Hyper-V: virtualization has to be enabled, so you may have to go into the BIOS and turn on virtualization to use Docker.
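As a quick reference for the Linux route I just mentioned, here is a minimal sketch using Docker's official convenience script and the usual docker group step; check the official docs for your distribution before running it.

```bash
# Install Docker on Linux with the official convenience script
curl -fsSL https://get.docker.com | sh

# Add your user to the docker group so you can run docker without sudo
sudo usermod -aG docker $USER

# Log out and back in (or run: newgrp docker), then verify the install
docker --version
```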
Now, I have a lot of images on this machine. This is not a Docker tutorial, but I'm going to explain a few things. If you want to see how many images you already have, just run docker images, and you can see I have a bunch of them. If you want to find out whether there is already a running container (when you use Docker, your program, software, or code runs inside a container), you can use docker ps to list the running containers. You can see I'm running Open WebUI, which earlier used to be known as Ollama WebUI, so that is already up on my machine. You can also use the -it flags, for example with docker exec, if you want to work with a container more interactively.
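To recap those inspection commands in one place, here is a short sketch; the container name in the exec line is just a placeholder for whatever docker ps shows on your machine.

```bash
# List the images already downloaded on this machine
docker images

# List the containers that are currently running
docker ps

# Open an interactive shell inside a running container
# (replace open-webui with a name or ID from the docker ps output)
docker exec -it open-webui /bin/sh
```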
Now, the first thing we're going to do is clone the repository. So I'll run git clone, paste the GitHub repository URL, and hit enter. You can see it says cloning into dify. It will take a bit of time because the repository is huge: they have a lot of examples, it's a complete open-source no-code / low-code platform, and a lot of configurations and presets are already done for you, so you do not have to worry about them.
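For reference, the clone command is simply this, pointing at the langgenius/dify repository shown on screen:

```bash
# Clone the Dify repository from GitHub
git clone https://github.com/langgenius/dify.git
```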
That's what I'm doing here, I'm cloning it, and you can see it takes a bit of time. The next thing is to go inside the docker folder of this repository, so we're going to cd into it. If I open the repository you can see there is a folder called docker, and inside it I'm going to use a file called docker-compose.yaml. It's a config file, and it is what lets us run this whole Dify stack inside containers; it's a complete Docker Compose file. Let me come back to the terminal, and you can see the clone is done. The next thing I'm going to do is cd dify/docker, because that's the folder, so let's do that, and you can see I am now inside the docker directory. If you run ls you can see all the files in this particular directory. Now the next thing I'm going to do is run docker compose up -d; I'm using the -d flag to run it in detached mode.
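Put together, the commands for this step look like the sketch below. One caveat: on newer versions of the repository you may also need to copy the example environment file before starting the stack; that cp step is an assumption on my side and is not shown on screen in this video.

```bash
# Move into the docker folder of the cloned repository
cd dify/docker

# (May be needed on newer versions of the repo) create a local .env
# from the provided template before starting the stack
cp .env.example .env

# Start all Dify services in detached mode
docker compose up -d
```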
For the first time it will take a lot of time, guys, because it has to pull different images, whether from the official Docker Hub, the GitHub container registry, the Azure container registry, or the AWS container registry. A container registry is where organizations publish images they have already built, so you can just pull an image and run it inside a container. You do not have to build it yourself: they provide the image, you pull it and use it in your application.
That's what it is doing, so let me pause the video and come back once this is done. All right guys, as you can see our Docker stack has started and it is serving on localhost. All the images have been pulled successfully and everything is running; it says 11 out of 11, so the proxy, the default Docker network, and multiple containers are up: a vector database, web servers, caching databases, and whatnot. You will not need to touch every one of them, so do not worry about it.
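If you prefer to confirm this from the terminal instead of the log output, a quick check is:

```bash
# Show the status of all services defined in the compose file
docker compose ps
```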
Now you have to open localhost/install in the browser, and it asks you to set up an account with an email address. So I'll give one of my email addresses, a username, and a password; I'm just going to type any password here. Oops, my browser just auto-filled a saved one, let me fix that. And now you are logged in to the Dify dashboard; you can see it is now at localhost/apps.
This is fantastic, because we do not have to worry about a lot of things here, guys. Now if you look at it, it says create from blank, create from template, or import a DSL file. A DSL file works like this: if I create a chatbot on Dify, I end up with an entire flow, from the start through all the components. I can save that as a file and give it to you, and you can just import that file and use it as it is; that's what the import DSL file option is. Here I'm going to click on create from blank; that's what you should do, just click on create from blank and start working with it. Once you click on it, it offers different types of apps: chatbot, text generator, agent, and workflow. I'm going to keep it as a basic chatbot just for the demo here, guys. I'll call it "trial bot" or something and click on create.
Now once you click on create, you can see a warning that the LLM provider key has not been set, and you can see all the models. Right now, if I pick Groq, I do not have a key configured; you have to set up that model provider first. For example, if it defaults to GPT-4o and the LLM provider key has not been set, I have to add the key in the settings; we'll go there in a moment. Before that, if you look at the left-hand side, we have orchestrate, API access, and logs, so you can also inspect the logs, and they have observability as well, so you can monitor whatever is happening. In the orchestrate view you have instructions, variables, context, citations, and you can add tools; I'm going to show that in a bit. Let's go to settings, and in the settings you can see all the LLM providers, guys: OpenAI, Anthropic, Azure OpenAI, Cohere, Replicate, Groq Cloud, and so on. I'm going to click on Groq Cloud.
Let's set it up. I'll go over to Groq Cloud, because Groq can give us an API key that we can use just for the demo here. Yeah, it's verifying that I am a human; I'd like to see how they handle AI agents on these checks in the near future. On console.groq.com I create an API key, call it "dify demo" or something, and submit, and it gives me a key. I'm going to delete this key after the video of course, so that's not a problem. I paste it into the Groq Cloud provider in Dify and save. You can see it has been saved, so my Groq provider is set; I'm just going to say "no thanks" to the prompt that pops up. You can also remove or cancel the provider from here if you want, and in the model settings you can see all the models it gives you.
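As an optional aside that is not shown in the video, you can sanity-check the key you just created from the terminal against Groq's OpenAI-compatible API; the base URL below is the one Groq documents, but confirm the exact path in their docs.

```bash
# List the models your Groq API key can access (uses the key from the env var)
curl -s https://api.groq.com/openai/v1/models \
  -H "Authorization: Bearer $GROQ_API_KEY"
```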
There is also a members section if you want to add more people, you can connect data sources like Notion or a website, though I think that's a premium thing, and there are API extensions and so on. That's the model provider done, so let's go back to the app and pick a model. You have Llama 3 and Mixtral 8x7B; I'm going to click on Llama 3 8B with the 8192-token context window. The defaults are fine; you can set the context window a bit higher if you want, and you can increase max new tokens a bit as well, that's fine. Now, you can also build a RAG chatbot here where you bring your own knowledge base; there is a section called knowledge. I'm not going into it, guys, because I'd recommend you watch my previous videos; the main purpose of this video is to set Dify up through Docker and get it running, and the rest of the flow is the same as what I have shown in the previous video on Dify Cloud. Here you can do a few things: you can set up your instructions, give the bot a role or whatever instructions you want, and do different types of prompt engineering.
Next, you can also add variables. Variables are where you pass your context; if you have a knowledge base, you can plug it in as context here. You can create a knowledge base and attach it here as well, but I'm not going to do that now. There are also citations and attributions, and you can add features: there is a built-in follow-up feature where the bot suggests follow-up questions you can ask, and you can toggle these features on or off. You can also define a welcome message and some opening questions, but I'm just going to get rid of those. So I have added one feature, and let's publish. When you run this app it says trial bot, start chat, and powered by Dify. Now I'm going to start the chat; we don't have any default questions or presets (you can define them), and I'm going to ask a question, for example "what is AI".
It complains that the GPT-4 credentials have not been set, because I forgot to update the model in the app after configuring Groq; once you set up the provider you also have to select the model here. Let me update that, guys, and run the app again, and now I'll ask: "write a FastAPI code snippet to call an API using the requests module". You can see I have asked the question and the answer comes back very fast, because Groq gives you that faster inference thanks to their language processing units, or LPUs, as they call them. You can see it says "here is an example of how you can use the requests library with FastAPI to call an external API", it shows the code, and it explains it a bit. It also suggests some follow-up questions you can ask, so let's try one: "can I use a POST request instead of GET?", and you can see it answers, and it's pretty fast, guys. So it's running locally on your system, without writing any code. What else do you need, right?
You can get it up and running quickly with Dify, or with any other platform like Gumloop if that's what you prefer, but Dify is powerful and I recommend it. The next thing you can do is embed this bot. You can embed it into a site, for example as a chat widget on the right-hand side or at the top of a web page; Dify gives you a JavaScript snippet that you can drop in, and you can embed it anywhere you want. It also has logs and annotations: for each question you ask, the logs get recorded here, and annotations are helpful if you want to use this data later for fine-tuning. In the monitoring section you can see everything, and you can also use the app as an API endpoint; there is something called the backend service API. Imagine you want to use this in an existing application or on a mobile device: you can just call this endpoint and it will work.
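As an illustration of what such a call looks like, here is a rough sketch based on my understanding of Dify's chat API; the exact endpoint, headers, and payload for your app are shown on its API access page, so copy the request from there rather than treating this as the definitive call.

```bash
# Sketch of a request to a Dify chat app's backend service API
# (the /v1/chat-messages path and JSON fields are assumptions from Dify's docs;
# verify them on the API access page of your own app)
curl -s -X POST http://localhost/v1/chat-messages \
  -H "Authorization: Bearer $DIFY_APP_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "inputs": {},
        "query": "What is AI?",
        "response_mode": "blocking",
        "user": "demo-user"
      }'
```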
You can also see a public URL here that you can share with your friends, that's not a problem. Now let's go back to orchestrate. When you add tools, you can see there are so many tools available: DuckDuckGo, which you can use to search the internet, arXiv, which fetches papers, PubMed, which you can use to pull up biomedical literature from PubMed Central, and plenty more. And variables, as I said, are where you add context. I don't have a knowledge base, as I said, but you can create one, guys, it's pretty easy: click on knowledge, create knowledge, and it says import from file, sync from website, or sync from Notion (Notion is a workspace tool you can use as a data source). Here you can upload your files, whatever kind of files you want; it depends on your use case.
It seems I don't have a suitable file handy, so I'm not going to use that now, but you can use it and just sync a knowledge base up with your bot. And that's it, guys; this is how easy it is to set Dify up on your local machine through Docker, so you don't have to worry about depending on the internet and so on. Well, when you use the Groq Cloud API you do need an internet connection, but you can also set up models locally if you want; I'll leave that up to you.
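If you do want to go fully local, one common route, which I'm adding here as an assumption rather than something shown in this video, is to serve a model with Ollama and then add it in Dify's model provider settings as an Ollama provider pointing at your local endpoint:

```bash
# Install Ollama with its official script and pull a local model
curl -fsSL https://ollama.com/install.sh | sh
ollama pull llama3

# Ollama serves an API on port 11434; this lists the locally available models
curl -s http://localhost:11434/api/tags

# In Dify's model provider settings you would then add an Ollama provider with
# this base URL (when Dify runs in Docker, you may need host.docker.internal
# instead of localhost so the containers can reach the host machine)
```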
Okay, now if you have any questions, thoughts, or feedback about this video, let me know in the comment box. You can also join our Discord community, guys; it's free to join, and we have a lot of opportunities there: job updates, internships, hackathons, coding sessions, jamming sessions, town halls, help channels, and freelancing opportunities as well. So please join our Discord, I will put the link in the description. If you like the content I'm creating, please hit the like icon, and if you haven't subscribed to the channel yet, please do subscribe, guys; that motivates me to create more such videos in the future. That's all for this video, thank you so much for watching, see you in the next one.