Let's have a look at prompt chaining. Prompt chaining allows us to combine several chains and models to produce an output for our application. This is one of the core benefits of using LangChain and Flowise, and it makes it possible to build advanced AI-driven applications. Let's have a look at a simple example where we combine three chains in one application. First, go back to the Flowise dashboard and create a new chat flow. Go ahead and save this chat flow, and let's call it "prompt chain".

Before we continue, let me explain what we'll be building. This might be a simple application, but it does demonstrate the power and flexibility you have when combining different chains and AI models. The application will contain three chains. In the first chain, we will ask the AI to give us a main ingredient for a recipe that matches the name of a public holiday, which we as the user will provide to the application. In the second chain, we will ask the model to generate a unique recipe based on the public holiday and the main ingredient provided by the first chain; this chain needs to produce a recipe with step-by-step instructions and an ingredient list. For the third chain, we will prompt the AI to behave like a food critic that analyzes the public holiday and the recipe generated by the previous chain, and then produces a review. Let's go through this step by step.

Let's start with the first chain. This chain will be responsible for coming up with an ingredient that matches the public holiday. First, let's add our chain, and for this I'll use a simple LLM Chain. Of course, you are more than welcome to make use
of any of these other chains in your application, but let's keep things simple for this demo. Our chain also needs an LLM; I'll simply use the OpenAI node and connect it to our chain. Let's also add our prompt template: from Add Nodes I'll go down to Prompts, grab the Prompt Template node, and connect it to the chain. Let's also add in our API key. For the prompt template, let's enter something like: "You are an AI assistant which will respond with a suitable main ingredient for a recipe based on a public holiday provided by the user", and we'll set the public holiday equal to a prompt variable called {holiday}. Then, in the prompt values list, we can give holiday a value: click on the edit button and, for the value, select the value provided by the user in the chat box, then close the pop-up. Let's test this by saving the chat flow and entering a public holiday like Halloween, and we get a main ingredient coming back: pumpkins. Great! We now want to add the second step to this chain.
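Under the hood, this first chain is essentially a prompt template plus a model call. Here is a rough, runnable sketch in plain Python; the stub function and its canned answer stand in for the real OpenAI call and are purely illustrative, not Flowise's actual code:

```python
# Sketch of the first chain: fill the template with the user's chat
# input, then send the formatted prompt to the model.

TEMPLATE = (
    "You are an AI assistant which will respond with a suitable main "
    "ingredient for a recipe based on a public holiday provided by the "
    "user.\nPublic holiday: {holiday}"
)

def stub_llm(prompt: str) -> str:
    """Stand-in for the OpenAI node; returns a canned answer."""
    return "Main ingredient: Pumpkins"

def ingredient_chain(holiday: str) -> str:
    prompt = TEMPLATE.format(holiday=holiday)  # {holiday} = chat input
    return stub_llm(prompt)

print(ingredient_chain("Halloween"))  # prints "Main ingredient: Pumpkins"
```

The important part is the `{holiday}` placeholder: Flowise fills it from the prompt values you configure, exactly like `str.format` fills the template here.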
For the second chain, we want the model to generate a recipe that is related to the holiday theme and uses the main ingredient generated by the first chain. Let's close the chat window and give the first chain a name, something like "ingredient chain". Now let's add our second chain: under Add Nodes I'll go to Chains, grab another LLM Chain, and add it to the canvas. Let's also add an LLM: under Add Nodes I'll go to LLMs and grab the OpenAI node again; feel free to use any chain and LLM of your choice. Let's connect the LLM to the chain and add our API key. For the prompt, let's also add a Prompt Template: under Add Nodes I'll go to Prompts, grab Prompt Template, add it to the canvas, and connect it to the chain. Let's enter the template for this prompt, something like: "You are an experienced chef that creates unique food recipes based on a public holiday and a main ingredient that matches that holiday". We'll also add the public holiday to this prompt as the prompt variable {holiday}, and just below that we'll set the main ingredient to a prompt variable called {ingredient}. It's very important to note that when doing prompt chaining, the first variable in the prompt template will always be assumed to be the value passed in by the user, like the public holiday. Let's go ahead and set these prompt values. For holiday, I'll click on edit and select the value coming in from the user. For the ingredient we also need to select a value, but we do not want this to be the value provided by the user; instead, this
value needs to be passed in from the previous chain. So let's close this pop-up, and let me show you how to create this prompt chain. You will notice that the Prompt Template node has a little input anchor on its left, and this will be the input from the previous chain. This means we need to connect the output of the first LLM Chain to this prompt template. We can do that by clicking on the drop-down on the first chain and changing its output from "LLM Chain" to "Output Prediction". So instead of the result being written to the chat box, the result will instead be passed along to another node. Now we can simply drag a connection from that LLM Chain to the Prompt Template. Next, let's click on Format Prompt Values again and edit the value of ingredient. When we click on this box, we can now see another value showing up in the list, and this is the output prediction coming from the ingredient chain. Select it and close the box. Let's also give this second chain a name, something like "chef chain". Let's save this and go ahead and test it as well.
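Conceptually, the Output Prediction connection just means "feed chain 1's text output into one of chain 2's template variables". A minimal sketch of the two-step pipeline, again with stubbed model calls (the function names and canned strings are mine, for illustration only):

```python
# Chain 1 produces the ingredient; chain 2 receives both the user's
# holiday AND chain 1's output prediction as template variables.

def ingredient_llm(prompt: str) -> str:
    return "Main ingredient: Lamb"      # stand-in for the first OpenAI node

def chef_llm(prompt: str) -> str:
    return "Recipe: Easter roast lamb"  # stand-in for the second OpenAI node

CHEF_TEMPLATE = (
    "You are an experienced chef that creates unique food recipes based "
    "on a public holiday and a main ingredient that matches that holiday.\n"
    "Public holiday: {holiday}\nMain ingredient: {ingredient}"
)

def run_pipeline(holiday: str) -> str:
    # The Output Prediction of chain 1...
    ingredient = ingredient_llm(f"Public holiday: {holiday}")
    # ...becomes the {ingredient} prompt value of chain 2.
    prompt = CHEF_TEMPLATE.format(holiday=holiday, ingredient=ingredient)
    return chef_llm(prompt)

print(run_pipeline("Easter"))  # prints "Recipe: Easter roast lamb"
```

Notice that the user's input flows into both chains, while the ingredient only exists between them; that is exactly what the canvas wiring expresses visually.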
I'll clear the chat history and type in a holiday like Easter, and we can see a recipe being generated; this recipe is for an Easter roasted lamb with garlic and rosemary. The result that we see in the chat will always be the result coming from the last chain in our application, which at the moment is the chef chain. But how do we know if this is correct? In other words, how do we know what main ingredient was actually generated by the ingredient chain? For that, we can have a look at debugging. When starting up Flowise you typically run "yarn start", but in order to add debugging you can simply append "--debug=true", and that will start Flowise in debugging mode. So let's have a look at what we get in the debugger when running this exact same chat. Let's save our Flowise project and run the chat. I'm going to clear the chat history as well so that we have a clean start, and to make things even more visible, I'm going to put the debugger to the side of our application.
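As a quick reference, the startup command with debugging enabled looks like this (assuming you run Flowise from a source checkout with yarn, as shown in the video; check the Flowise docs for the exact flag spelling in your version):

```shell
# Start Flowise with debug output written to the console
yarn start --debug=true
```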
Now let's provide a public holiday name; let's do Halloween again. Have a look at the output: in the chat we can see the recipe being generated, and on the right-hand side we see a lot of information being written to the console. At the top we can see that the first chain is being called with a value of "Halloween". This is because, in our prompt values, we set the user's input to a variable called holiday. We can see this by going to the first prompt template and clicking on Format Prompt Values, and this is where it's getting that name from: we set holiday equal to the user's input. Then we can see what the prompt template was formatted into: "You are an AI assistant which will respond with a suitable main ingredient for a recipe based on a public holiday provided by the user", followed by the public holiday and the value of that variable. This is a great way to ensure that our variables are indeed pulling through into the prompt. When we scroll down to Output Prediction, we can see the output: "Main ingredient: Pumpkins". This output prediction is the output produced by this chain. Next we can see the start of the second chain, the chef chain, and we can also see the values of the two variables being passed to this chain. Just as a reminder, we can see those variables by clicking on Format Prompt Values: we have a variable called holiday, which is equal to the user's input, and a variable called ingredient, which is equal to the output prediction from the previous chain. In the debugger we can see that value as "Main ingredient: Pumpkins". We can also see the final result of the prompt: for the public holiday the value is Halloween, and the main ingredient is indeed set to pumpkins, so that confirms that our prompt template is correct.

But let me show you a little gotcha. Let's say that we didn't specify the value of holiday and only specified the value of the main ingredient; this is just to show you an issue that you will most likely run into while building your own applications. Let's go ahead
and delete the public holiday from the template for now; we will add it back in a minute. Also, in the prompt values box, let's go ahead and delete holiday. So at the moment we are not assigning the user's input to any variable in this second chain. If we go ahead and run this, you will notice an issue. Let's provide a public holiday again, something like Easter. The chat will go ahead and generate some recipe, but if we look at the debugger we can see something funny. First, we can see the first chain running with the input holiday, and this is coming from the user. We can then see the output prediction from this chain, which says that a suitable main ingredient is lamb. Then we call the second chain in our application, but here we can see an issue: we only have one variable, called ingredient, and the value of that ingredient is set to "Easter", which is not correct. If we have a look at the prompt values, we can see that ingredient is set to the output prediction of the previous chain, so we would expect this value to be equal to that output prediction. This is a small gotcha that can definitely trip you up if you are not aware of it. The basic rule here is that the value passed in from the chat by the user, like "Easter", must always be passed into the prompt template: the first variable in the prompt template will always be assumed to be the input from the user. Because we only have one variable in this prompt template, the value that we assigned in the prompt values is ignored, and it is assumed that this value is the input from the user. So my advice to
you is to always include the input from the user in each of your chains, and simply add it somewhere in the prompt template. Let's add that value back in: I'll set the public holiday to the variable {holiday} in the template, add holiday to the prompt values as well, and set it equal to the input from the user. Let's save this and test it out again. For the holiday I'll enter Christmas, and let's check if this is working. Scrolling down in the terminal, we can see that the first chain is being called with holiday equal to "Christmas", and in the output prediction we get a main ingredient of turkey. We can now see the second chain being called with the ingredient set to turkey and holiday set to Christmas. So hopefully you can see that as long as we assign the user's input to some variable, everything else works. If you scroll down in the console, we can see the output being generated by the second chain, and we get our final result; this is the result that is finally written to the chat box.
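The fallback behavior we just observed can be modeled in a few lines of Python. To be clear, this is my toy reconstruction of the rule described above, not Flowise's actual implementation: the first variable found in the template is bound to the user's chat input, overriding whatever the prompt values say.

```python
import re

def format_prompt(template: str, values: dict, user_input: str) -> str:
    """Toy model of the behavior seen in the debugger: the FIRST
    variable in the template always receives the user's chat input,
    regardless of the configured prompt values."""
    variables = re.findall(r"\{(\w+)\}", template)
    bound = dict(values)
    if variables:
        bound[variables[0]] = user_input  # first variable = chat input
    return template.format(**bound)

# Failure case from the video: {ingredient} is the only variable,
# so it receives "Easter" instead of the previous chain's output.
broken = format_prompt(
    "Main ingredient: {ingredient}",
    {"ingredient": "Lamb"},   # this assignment gets overridden
    user_input="Easter",
)
assert broken == "Main ingredient: Easter"

# The fix: put {holiday} first so it absorbs the user's input,
# leaving {ingredient} free to hold the previous chain's output.
fixed = format_prompt(
    "Public holiday: {holiday}\nMain ingredient: {ingredient}",
    {"holiday": "Easter", "ingredient": "Lamb"},
    user_input="Easter",
)
assert fixed == "Public holiday: Easter\nMain ingredient: Lamb"
```

This is exactly why adding {holiday} back as the first variable fixes the chain: it soaks up the user's input, and {ingredient} keeps the value piped in from the previous chain.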
Hopefully you can see that the debugger is really helpful for getting a view of the data being passed between chains. Let's get back to our application. I'll go ahead and clear this chat history, and let's now add a third chain to our application: I'll click on Add Nodes, go to Chains, and select another LLM Chain. Let's assign our model by going to LLMs; I'll select OpenAI, connect it to the chain, and add our OpenAI API key. Let's also add our prompt template by clicking on Add Nodes, then Prompts, then Prompt Template, and connect this template to the chain. Now let's enter our prompt template; we can do something like: "You are a food critic that will review a food recipe based on a public holiday". Let's also provide the variables to the prompt template: the public holiday equal to a variable called {holiday}, and the recipe equal to a variable called {recipe}. We can now assign values to these variables by clicking on Format Prompt Values. For holiday I will assign the input from the user. For recipe we want to pass in the value from the previous chain, so let's close this pop-up box, change the output of the previous chain to Output Prediction, and connect that chain to this prompt template. Then click on Format Prompt Values again and, for recipe, select the chef chain's output prediction, and close the pop-up. I will also go ahead and give this chain a name, like "critic chain". Let's save this, and let's also have a look at the debugger while we run this. In the chat window, let's provide our public holiday as Easter again and press Enter. In the chat we can see
the review coming through as well, something along the lines of: "This Easter roasted lamb looks and smells divine; the lamb was cooked to perfection with a juicy interior and crispy exterior." Let's have a look at the debugger. We can see the first chain being called with the holiday Easter, and if we look at its output prediction, the main ingredient is roast lamb and asparagus. Then the second chain was called with the ingredient as well as the holiday, and for its output prediction we can see the recipe with the ingredients and cooking instructions. Then, for the third chain, we can see that holiday was indeed passed in as Easter, and the full recipe was passed in from the previous chain as well; we can also see the entire recipe being added to the prompt, and this then gives us the final result. Note that sometimes the formatting might look a bit strange, and that is very much related to the model that you are using.
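To recap, the whole three-chain flow condenses into a short runnable sketch. This is plain Python with stub functions in place of the OpenAI calls; all names and canned outputs here are illustrative, and each hand-off between functions corresponds to an "Output Prediction" connection on the Flowise canvas:

```python
# End-to-end sketch: ingredient chain -> chef chain -> critic chain.

def ingredient_llm(prompt: str) -> str:
    return "Main ingredient: Pumpkins"

def chef_llm(prompt: str) -> str:
    return "Recipe: Halloween pumpkin stew"

def critic_llm(prompt: str) -> str:
    return "Review: a fitting Halloween dish"

def run_flow(holiday: str) -> str:
    # Chain 1: holiday -> main ingredient.
    ingredient = ingredient_llm(f"Public holiday: {holiday}")
    # Chain 2: holiday + ingredient -> recipe.
    recipe = chef_llm(f"Public holiday: {holiday}\n{ingredient}")
    # Chain 3: holiday + recipe -> review. Only this last chain's
    # output is written back to the chat box.
    return critic_llm(f"Public holiday: {holiday}\n{recipe}")

print(run_flow("Halloween"))  # prints "Review: a fitting Halloween dish"
```

Note how the user's holiday is threaded through every chain, following the advice from the gotcha earlier, while the intermediate outputs only travel one hop forward.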
I do want to show you that it is possible to swap out these OpenAI models. So, instead of using this model, let's delete it and replace it with one of the chat models: I'll just go to Chat Models, select ChatOpenAI, drop it onto the canvas, and hook it up to the chain like so. Let's also paste in our API key and save this. Now let's run the chat and give it a different holiday name, something like Halloween again. Press Enter, and you can see that the quality of the output has drastically increased, thanks to the more advanced nature of the chat model. So, depending on the application that you are building, different models might be more ideal for specific steps in the chain. Go ahead and experiment! If you liked this video, please consider giving it a like and subscribing to my channel. I'll see you in the next one. Bye-bye!