What is LangChain?

IBM Technology
Video Transcript:
Now, stop me if you've heard this one before, but there are a lot of large language models available today, each with its own capabilities and specialties. What if I prefer to use one LLM to interpret user queries in my business application, but a whole other LLM to author the responses to those queries? Well, that scenario is exactly what LangChain caters to. LangChain is an open-source orchestration framework for developing applications that use large language models, and it comes in both Python and JavaScript libraries. It's essentially a generic interface for nearly any LLM, giving you a centralized development environment to build your large language model applications and then integrate them with things like data sources and software workflows.
When it was launched by Harrison Chase in October 2022, LangChain enjoyed a meteoric rise, and by June of the following year it was the single fastest-growing open-source project on GitHub. And while the LangChain hype train has cooled a little, there's plenty of utility here, so let's take a look at its components.
So what makes up LangChain? Well, LangChain streamlines the programming of LLM applications through something called abstractions. What do I mean by that? Think of your thermostat: it lets you control the temperature in your home without needing to understand all the complex circuitry that entails. We just set the temperature. That's an abstraction. LangChain's abstractions represent common steps and concepts necessary to work with language models, and they can be chained together to create applications, minimizing the amount of code required to execute complex NLP tasks.
Let's start with the LLM module. Nearly any LLM can be used in LangChain; you just need an API key. The LLM class is designed to provide a standard interface for all models, so pick an LLM of your choice, be that a closed-source one like GPT-4 or an open-source one like Llama 2. Or, this being LangChain, pick both.
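To make that concrete, here's a minimal sketch of that standard interface. The package and class names follow recent LangChain releases and are assumptions (they have shifted across versions), and it presumes an OpenAI API key and a local Ollama server are available:

```python
# A minimal sketch of LangChain's standard model interface.
# Assumes the langchain-openai and langchain-community packages are installed.
from langchain_openai import ChatOpenAI          # closed-source model, e.g. GPT-4
from langchain_community.llms import Ollama      # open-source model, e.g. Llama 2

closed_llm = ChatOpenAI(model="gpt-4", temperature=0)
open_llm = Ollama(model="llama2")

# Both expose the same Runnable interface, so the calling code doesn't change.
print(closed_llm.invoke("Explain LangChain in one sentence.").content)
print(open_llm.invoke("Explain LangChain in one sentence."))
```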
Okay, what else have we got? We have prompts. Prompts are the instructions given to a large language model, and the prompt template class in LangChain formalizes the composition of prompts without the need to manually hard-code context and queries. A prompt template can contain an instruction like "Do not use technical terms in your response" (that would be a good one), or it could hold a set of examples to guide the model's responses (that's called few-shot prompting), or it could specify an output format.
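Here's a rough sketch of a prompt template with that kind of instruction baked in; the imports follow recent langchain-core releases and may differ in older versions:

```python
# A minimal prompt template sketch: the instruction is written once,
# and only the user's question is filled in at run time.
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant. Do not use technical terms in your response."),
    ("human", "{question}"),
])

# Formatting the template produces the messages that get sent to the model.
messages = prompt.format_messages(question="How does a vector database work?")

# LangChain also ships a FewShotPromptTemplate class for example-based
# (few-shot) prompting.
```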
Now, chains, as the name implies, are the core of LangChain workflows. They combine LLMs with other components, creating applications by executing a sequence of functions. Let's say our application needs to first retrieve data from a website, then summarize the text it gets back, and finally use that summary to answer user-submitted questions. That's a sequential chain, where the output of one function acts as the input to the next, and each function in the chain can use different prompts, different parameters, and even different models.
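As an illustration, here's a hedged sketch of that summarize-then-answer sequence using LangChain's expression language. It assumes `llm` is any model object like the ones above and that `page_text` has already been fetched from the website; both names are placeholders:

```python
# Sketch of a two-step sequential chain: summarize a page, then answer a
# question using that summary. The output of step one feeds step two.
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import PromptTemplate

summarize_prompt = PromptTemplate.from_template("Summarize this web page:\n\n{page_text}")
answer_prompt = PromptTemplate.from_template(
    "Using only this summary:\n{summary}\n\nAnswer the question: {question}"
)

summarize_chain = summarize_prompt | llm | StrOutputParser()

full_chain = (
    {
        "summary": (lambda x: {"page_text": x["page_text"]}) | summarize_chain,
        "question": lambda x: x["question"],   # pass the question straight through
    }
    | answer_prompt
    | llm
    | StrOutputParser()
)

answer = full_chain.invoke({"page_text": page_text, "question": "What is the page about?"})
```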
Now, to achieve certain tasks, LLMs might need access to specific external data sources that are not included in the LLM's own training data set: things like internal documents or emails. LangChain collectively refers to this sort of documentation as indexes, and there are a number of them, so let's take a look at a few. One of them is the document loader. Document loaders work with third-party applications to import data from sources like file storage services (think Dropbox or Google Drive), web content (like YouTube transcripts), collaboration tools (like Airtable), or databases (like MongoDB, and even Pandas DataFrames). There's also support for vector databases. Unlike traditional structured databases, vector databases represent data points by converting them into vector embeddings: numerical representations in the form of vectors with a fixed number of dimensions. You can store a lot of information in this format, and it's a very efficient means of retrieval.
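As a rough sketch, here's how a web page might be loaded and pushed into a vector store. The loader, embedding model, FAISS store, and the URL are example choices and assumptions, not the only options:

```python
# Sketch: load web content with a document loader, embed it, and store the
# embeddings in a vector database for similarity search.
from langchain_community.document_loaders import WebBaseLoader
from langchain_community.vectorstores import FAISS
from langchain_openai import OpenAIEmbeddings

docs = WebBaseLoader("https://example.com/handbook").load()   # hypothetical URL

vectorstore = FAISS.from_documents(docs, OpenAIEmbeddings())

# Similarity search returns the chunks whose embeddings are closest to the query.
results = vectorstore.similarity_search("What is the vacation policy?", k=3)
```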
There are also text splitters, which can be very useful because they split text up into small, semantically meaningful chunks that can then be combined using the methods and parameters of your choosing.
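A quick sketch of that, continuing the loader example above (`docs` is assumed to be the list of loaded documents, and the chunk sizes here are arbitrary):

```python
# Sketch: split loaded documents into overlapping, semantically coherent chunks
# before embedding them, so retrieval works on small pieces rather than whole pages.
from langchain_text_splitters import RecursiveCharacterTextSplitter
from langchain_community.vectorstores import FAISS
from langchain_openai import OpenAIEmbeddings

splitter = RecursiveCharacterTextSplitter(
    chunk_size=500,     # max characters per chunk
    chunk_overlap=50,   # overlap keeps context across chunk boundaries
)
chunks = splitter.split_documents(docs)

# The chunks, not the raw pages, are what you'd embed and store.
vectorstore = FAISS.from_documents(chunks, OpenAIEmbeddings())
```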
Now, LLMs by default don't have any long-term memory of prior conversations unless you happen to pass the chat history in as an input to your query, but LangChain solves this problem with simple utilities for adding memory to your application. And you have options, from retaining the entirety of every conversation to retaining only a summarization of the conversation so far. Then, finally, the last component we'll look at is agents. Agents can use a given language model as a reasoning engine to determine which actions to take and when. When building a chain for an agent, you'll want to include inputs like a list of the available tools it can use, the user input (the prompts and queries), and any other relevant previously executed steps.
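Here's a hedged sketch of both ideas using LangChain's classic memory and agent helpers; these utilities have been reworked in newer releases, `llm` is assumed to be a model object like the ones above, and the exact classes should be treated as assumptions:

```python
# Sketch 1: conversational memory. ConversationBufferMemory keeps the full
# history; swap in ConversationSummaryMemory to keep only a rolling summary.
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory

chat = ConversationChain(llm=llm, memory=ConversationBufferMemory())
chat.predict(input="My name is Sarah.")
chat.predict(input="What is my name?")   # the stored history supplies the answer

# Sketch 2: a simple agent. The LLM reasons about which tool to call and when.
from langchain.agents import AgentType, initialize_agent, load_tools

tools = load_tools(["llm-math"], llm=llm)
agent = initialize_agent(tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True)
agent.run("What is 15% of 2,048?")
```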
So how can we put all of this to work in our applications? Well, let's talk about a few LangChain use cases. Obviously, we have chatbots: LangChain can be used to provide proper context for the specific use of a chatbot and to integrate chatbots into existing communication channels and workflows through their own APIs. We also have summarization: a language model can be tasked with summarizing many types of text, from breaking down complex academic papers and transcripts to providing a digest of incoming emails.
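For instance, a sketch of summarization with LangChain's classic load_summarize_chain helper; `llm` and the `chunks` list from the splitter sketch above are assumptions, and this helper may be deprecated in newer releases:

```python
# Sketch: summarize a long document that doesn't fit in a single prompt.
# "map_reduce" summarizes each chunk, then summarizes the partial summaries.
from langchain.chains.summarize import load_summarize_chain

summarize = load_summarize_chain(llm, chain_type="map_reduce")
digest = summarize.run(chunks)   # chunks from the text splitter sketch above
```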
We've also seen lots of examples where this is used for question answering. Using specific documents or specialized knowledge bases, LLMs can retrieve the relevant information from storage and then articulate helpful answers using information that would otherwise not have been in their training data set.
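A hedged sketch of that retrieval-augmented question answering, reusing the vector store built earlier (RetrievalQA is the classic helper; newer releases favour composing the same steps with the expression language, and `llm` and `vectorstore` are assumed from the sketches above):

```python
# Sketch: answer questions against your own documents. The retriever pulls the
# most relevant chunks from the vector store and the LLM writes the answer.
from langchain.chains import RetrievalQA

qa = RetrievalQA.from_chain_type(
    llm=llm,
    retriever=vectorstore.as_retriever(search_kwargs={"k": 3}),
)
print(qa.run("What does the handbook say about remote work?"))
```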
And this is a good one: data augmentation. LLMs can be used to generate synthetic data for use in machine learning; for example, an LLM can be trained to generate additional samples that closely resemble the real data points in a training data set. And there are, of course, virtual agents. As we already started to discuss, integrated with the right workflows, LangChain's agent modules can use an LLM to autonomously determine the next steps and then take the actions needed to complete them using robotic process automation (RPA). LangChain is open source and free to use, and there are also related frameworks like LangServe, for serving chains as REST APIs, and LangSmith, which provides tools to monitor, evaluate, and debug applications.
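As an example, here's a minimal LangServe sketch that exposes a chain over HTTP; it assumes the langserve and fastapi packages, and the chain name and path are made up:

```python
# Sketch: serve any LangChain runnable as a REST API with LangServe.
from fastapi import FastAPI
from langserve import add_routes

app = FastAPI(title="LangChain demo server")

# 'full_chain' is the summarize-then-answer chain sketched earlier.
add_routes(app, full_chain, path="/summarize-and-answer")

# Run with: uvicorn server:app --reload
# LangServe adds /summarize-and-answer/invoke, /batch, and /stream endpoints.
```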
Essentially, LangChain's tools and APIs simplify the process of building applications that make use of large language models. If you have any questions, please drop us a line below, and if you want to see more videos like this in the future, please like and subscribe. Thanks for watching.