In this video we extend Flowise to use CloudTables to store values online. CloudTables provides an API, but the API is not listed on the RapidAPI Hub. To connect them together we can use yarn to add the CloudTables API as a new Flowise dependency; then we can use the API in custom tools with an OpenAI agent and use function calling to store values in the cloud. Here, for example, we use one custom tool to store stock alarms in one data set and another custom tool to store stock orders in another data set. After saving our data in the cloud we can embed the tables in other applications and extend the functionality.

So let's first explore CloudTables. For developers who have worked with PHP and WordPress, DataTables is a well-known library, and Editor is the online editing tool for the tables. CloudTables combines these two in a cloud infrastructure and provides seamless integration of the tables into other applications and frameworks. You can access and manipulate CloudTables with the API in many languages, including Node.js. You can check the functionality on cloudtables.com: here you can see and test a responsive online table that can hide extra fields, do sorting and pagination, and provide search functionality out of the box. You can search, for example, for a city like London, select a row from the result set and edit the row; the changes are reflected immediately and the table is updated.

To test the service you can start a free trial and create your first data set. A data set is like a table; every data set has its own unique ID, which we will use later in our API calls to reference the data set. After creating the data set it's time to create the data points, which are like the columns of the table. Each data point has a unique ID like dp-1, dp-2 and so on; we will use these data point IDs later in the API to assign our properties to the columns. For the alarm data set we create three data points: two text data points for symbol and company, and one number data point with two decimal places for the alarm price. Next we create another data set for our order data. The order data set has its own unique ID, which we will use in our second custom tool. This time we add four data points: three text data points and one number data point. It is the same procedure as for the alarm data set, but notice that the data point IDs are sequential and do not start from one again; instead we have dp-4, 5, 6 and 7.
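To keep track of what we just created, here is the structure summarized as a small JavaScript map. The data set IDs are placeholders (each data set gets its own unique ID in the CloudTables UI); the dp-* values are the sequential data point IDs noted above:

```js
// Summary of the two data sets created above (data set IDs are placeholders).
const datasets = {
  alarms: {
    id: 'your-alarm-dataset-id',
    points: { symbol: 'dp-1', company: 'dp-2', price: 'dp-3' }
  },
  orders: {
    id: 'your-order-dataset-id',
    points: { symbol: 'dp-4', company: 'dp-5', type: 'dp-6', price: 'dp-7' }
  }
};
```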
So our two data sets are ready. Now, in the security section, we have two pre-configured API keys: one with read/write permission and one with read-only permission. We do not have any billing plan yet, and in the settings section we see our web address and the assigned subdomain, which acts like our application ID. You can complete the registration and give your login information to access the service later and on other devices. After the registration is done we see in the dashboard the onboarding progress and some statistics about the usage of the service.

Now we can go to the next step and start extending Flowise. To add dependencies to Flowise we use yarn. In the more recent versions of Node.js you can use yarn by enabling Corepack; keep in mind that for enabling Corepack you need to run the command prompt as administrator. Here we first check the installed version of Node.js; in our case it is version 20.4.0.
Next we type corepack enable to enable Corepack and to be able to use yarn. The process of adding dependencies to Flowise is described in the documentation: navigate to docs.flowiseai.com, then to Tools and then to Custom Tools. There, navigate to the section Additional Dependencies, and you find the instructions on how to add additional dependencies to Flowise. We also have to do some configuration, which we will do in a few moments, but first let's navigate to CloudTables, to the Node.js part, to check the name of the library to be added. The documentation mentions that we can add the library with yarn add cloudtables-api.
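Gathered in one place, the command sequence we are about to walk through looks like this (exactly as described in this video; run the first command in an administrator prompt):

```
corepack enable                                      # enable Corepack to get yarn
git clone https://github.com/FlowiseAI/Flowise.git   # clone the Flowise repo
cd Flowise/packages/components
yarn add cloudtables-api                             # add the CloudTables API dependency
cd ../..                                             # back to the project root
yarn install
yarn build
```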
So let's start by cloning the Flowise repo. We navigate to github.com, to FlowiseAI/Flowise, and copy the HTTPS link. Back in our project folder we clone the Flowise repo. As mentioned in the Flowise documentation, we change to the Flowise directory, then to packages and then to the components directory. We clear the screen to have more room and type yarn add cloudtables-api. This adds the CloudTables API to our project and, depending on the speed of your machine, this will take some time. After the prompt comes back we go two levels up to get to the project root and type yarn install again, confirming with y if asked, and when the prompt comes back we can type yarn build and wait till the build process is done.

Once the installation is done we need to do some configuration, so we start Visual Studio Code. Here we see there is one change: cloudtables-api has been added to the dependencies. We can now close this file. We still need to do some configuration, so we go to the server folder, copy and paste the .env.sample file and rename the copy to .env, keeping it inside the server folder. In the .env file we can uncomment the DEBUG setting to see what's going on under the hood, and the next change is to uncomment TOOL_FUNCTION_EXTERNAL_DEP and assign our library to it. Now the installation and configuration of Flowise are done and we can start Flowise by typing npx flowise start. Flowise starts listening on port 3000. We navigate to localhost, port 3000, notice the new Credentials menu and click Add to set our OpenAI key. We give it a name like Flowise CloudTables key, then navigate to OpenAI and create a new secret key; the name is optional, but we can give it the same name. We create and copy the key, go back to Flowise, paste the key as the OpenAI API key and add the credential to use it later in the flow.

After the key is set we go to the marketplace, scroll down, select the OpenAI Agent and click on Use Template to start creating our flow. We remove unwanted tools to have a clean start. In ChatOpenAI we select our just-created OpenAI key, select gpt-3.5-turbo-0613 and reduce the temperature to be more deterministic.

Now comes the main part. We create our first tool and give it a name like store stock price alarm in cloudtables, and a description like: this tool stores stock alarms in the cloud table; to set an alarm it needs the symbol of the stock, the company name and the price. Then we add the properties. First we add symbol as string with the description ticker symbol of the stock and make it required. Next we add company as string with the description company name of the stock and make it required too. And finally we add price, but this time we choose number as the type, give it the description set alarm for this stock price and make it required too. After the three properties are set we can go and implement our function in JavaScript. I paste the code into the box, but to have syntax highlighting while going through the code I paste the function into Visual Studio Code too.
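The function looks roughly like the sketch below. This is a reconstruction from the walkthrough that follows, not the video's exact code: the constructor and insert call mirror the shape shown in the CloudTables Node.js documentation (see the dataset insert example mentioned below), and the subdomain, API key and data set ID are placeholders you have to replace with your own values:

```js
// Sketch of the first custom tool's function. The CloudTables calls follow
// the shape of the library's Node.js docs; verify against the "dataset
// insert" example there. Subdomain, key and data set ID are placeholders.
const CloudTables = require('cloudtables-api');

const api = new CloudTables('your-subdomain', 'your-read-write-api-key', {
    clientId: 'flowise',   // optional
    clientName: 'Flowise'  // optional
});

try {
    // Map the tool properties (injected by Flowise as $symbol, $company,
    // $price) to the data points dp-1..dp-3 of the alarm data set.
    const result = await api.dataset('your-alarm-dataset-id').insert({
        'dp-1': $symbol,
        'dp-2': $company,
        'dp-3': $price
    });

    console.log(result);
    return 'Stock alarm stored successfully.';
} catch (error) {
    console.error(error);
    return 'Error: could not store the stock alarm.';
}
```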
On the first line we require the newly added dependency cloudtables-api; all the steps we have done till now were to be able to add this line to our script. Next we create an API instance using the subdomain and our API key. The client ID and client name are optional, but the data set ID is very important: as we want to store our properties in the alarm data set, we need to use the ID associated with the alarm data set. Furthermore, we must use the data point IDs related to the columns of the alarm data set: we assign the data point dp-1 to our property symbol, which is prefixed with a dollar sign, and so on. When storing the data in the cloud is successful we get a result from CloudTables; we console.log the result to see it in our terminal and return a success message back to ChatGPT. If we have an error, we again log to the console and inform ChatGPT about the error.

To see the corresponding values on CloudTables we navigate to the alarm data set. Here you see the ID of the alarm data set, or table; next you see the corresponding data point IDs of each of the data points, or columns. In the embed section for Node.js you see the corresponding embed code: it includes the subdomain, or application ID, and the API key, which can be seen in the security section and has read and write permission. In the CloudTables Node.js API documentation you can find more information on how to integrate the code, and if you scroll down you see a link to dataset insert; there you find a related example showing how to assign values to the data points. Back in Flowise we need to click on Add to save our first custom tool.

Now comes the second custom tool, to save our order data. We create our second tool and give it a name like place a buy or sell order for a stock, and a description like: place a buy or sell order for the stock of a company given by a ticker symbol. This time we need to add four properties. First we add symbol as string with the description ticker symbol of the stock and make it required. Next we add company as string with the description company name of the stock and make it required too. Next we add type as string and give it a description like: order type, which can be buy or sell, informing ChatGPT to use one of these two values. And finally we add price, but this time we choose number as the type and give it the description price of the stock to execute the order.

After the four properties are set we add our function in the JavaScript function box. The code is similar to the first tool but with some changes: the subdomain and our API key are the same, but the data set ID is different; it is the data set ID of the order data set. This time we assign our four properties, prefixed with the dollar sign, to the data point IDs dp-4 to dp-7; see the sketch below. Notice that the data point IDs carry on counting and do not start from 1 for each data set. The rest of the code is just logging and returning information based on the success of the operation. To check again, we go to CloudTables, but this time to the order data set, and check the ID of the data set and the corresponding four data point IDs, which run from dp-4 to dp-7. The embed section is the same, so we go back to Flowise and click Add to save the second tool too.
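Under the same assumptions as the first sketch, the changed part of the second tool's function is just the data set ID and the data point mapping:

```js
// Second tool: same require and API setup as before, but inserting into
// the order data set with its own data point IDs (placeholder data set ID).
const result = await api.dataset('your-order-dataset-id').insert({
    'dp-4': $symbol,
    'dp-5': $company,
    'dp-6': $type,    // 'buy' or 'sell'
    'dp-7': $price
});
```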
Now it's time to connect the dots, save the flow and give it a name like Flowise CloudTables. We can now test our flow: we open the chat box and make it bigger. Here comes our first question: what is a ticker symbol? ChatGPT doesn't need any function calling to answer this question; it answers the question and gives back some examples.

Next we write: set an alarm for Apple at two hundred dollars. This time ChatGPT uses our first custom tool to store this information in CloudTables, in the alarm data set. The part "you will be notified when the stock price reaches that level" is a ChatGPT hallucination, as we have not implemented this functionality yet. Next we write: place a buy order for Apple at 210 dollars. This time ChatGPT uses our second tool to store this information in the order data set. As you can see, we did not provide any symbol, and ChatGPT uses its knowledge up to September 2021 to find out the ticker symbol. Next we place a sell order to see if the type field is filled correctly. To test whether ChatGPT just reuses the symbols of the earlier prompts from buffer memory or actually uses its trained data, we ask it to set an alarm for Netflix at 400.
It successfully finds the symbol and adds the alarm. Next comes the danger of using ChatGPT for mission-critical data like trading: we just say add an order at 500 and do not mention buy or sell, or which stock. GPT wants to help, assumes we want to buy Netflix, and stores this information in our table.

So let's go to CloudTables and check the saved data. Our first custom tool has successfully added the entries to the alarm data set, and our second custom tool has successfully added the entries to the order data set. You may want to change some data or even delete some entries online. If you are happy with the data, you can use one of the stylings like Bootstrap 5, embed the result in an intranet site and extend the example.

As you may remember, we had set the debug mode to true and added some console.log statements, so we can see what's going on under the hood. As you can see in the logs, our first prompt about the ticker symbol did not use any function call and the answer is in the content part. Our next prompt used our first custom tool with the arguments AAPL, Apple and 200. When we scroll down we find our next function call, to the second function, with the arguments AAPL, Apple, buy and 210, and the answer from CloudTables with the ID 11.
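For reference, the function calls that show up in these logs follow OpenAI's function-calling format, so the logged assistant message for the buy-order prompt looks roughly like this (field layout per the OpenAI API; the exact Flowise log output may differ, and the tool name shown here is a placeholder for whatever you entered in Flowise):

```js
// Approximate shape of the logged function call for the order prompt.
{
    "role": "assistant",
    "content": null,
    "function_call": {
        "name": "place_a_buy_or_sell_order_for_a_stock",
        "arguments": "{\"symbol\":\"AAPL\",\"company\":\"Apple\",\"type\":\"buy\",\"price\":210}"
    }
}
```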