LangChain 101: The Beginner’s Guide to Chains, Agents, Tools & More

Imagine a world where the apps you build don’t just read language – they speak it fluently and contextually, making every single user interaction smarter and more intuitive. This is where LangChain comes in, and we’ve got a helpful guide to tell you all about how to elevate your projects!

Say you want to create the next big social media app — the next TikTok, Facebook, or Instagram. LangChain is the tool that you and your team might use to develop automated systems that review and moderate user-generated content, identifying and filtering inappropriate or harmful material. This is just one of the many uses of LangChain, which offers a whole arsenal of tools to take your generative AI projects to the next level.

Keep reading to learn more about LangChain, how it works, what it can do, and how you can use it to bring your web applications to life.



What is LangChain?

LangChain is an open-source framework for developing apps that make use of large language models (LLMs), like chatbots. LangChain’s tools and APIs make it easier to set up some impressive uses of natural language processing (NLP) and LLMs (more on that later!).

LangChain has a lot to offer as one of the top frameworks for working with LLMs: it can supply your app with various data sources and give it the ability to make informed decisions about the best way to generate output. Another advantage is that its interface lets you work with virtually any LLM, as well as multiple LLMs or data sources simultaneously.

If your app needs access to current information, integrating it with a framework like LangChain is practically a necessity. Remember: LLMs are trained on data up to a cutoff date, so if you need current information, you probably won’t be able to get it from an LLM on its own.

For example, Google’s Gemini pulls information directly from Google, so it can respond with real-time updates. ChatGPT, on the other hand, is limited by its training data’s knowledge cutoff. At the time of writing, that’s October 2023, so if you want information on the results of the 2024 Summer Olympics, it will tell you to look elsewhere (or hallucinate some results, but that’s a different article). If you’re set on using OpenAI’s powerful and popular GPT LLM, LangChain is the framework that allows you to integrate it with an up-to-date search tool.

Because LangChain gives you one interface for many models and data sources, it’s easy to swap between different LLMs like GPT (OpenAI), Gemini (Google), Cohere, Mistral, and others. If you want to build complex web applications, LangChain is an extremely powerful tool to have in your arsenal.

Related: How to Use ChatGPT to Supercharge Your Tech Career Change

How does LangChain work?

We now know that LangChain is a framework, and a framework is a collection of libraries, tools, and features that makes it easier for developers to build apps that use LLMs. Frameworks like LangChain work by using abstraction.

So then what’s an abstraction? When you’re coding and building an app, complex processes require a lot of code. LangChain uses abstraction as a shortcut to replace complex code with simplified code and syntax.

We’ve used π (pi) as an example before, but that’s because it’s a good one. In general, we know that pi represents the ratio of a circle’s circumference to its diameter. And instead of writing out an approximation like “22 ÷ 7” or the even longer “3.14159265…,” you only need the pi symbol to represent it. That’s abstraction.

The idea of abstraction is that the processes — no matter how complicated they are — are contained by a single component. And these components can be LangCHAINED (get it?) together to create an app.
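To make that concrete, here’s a rough sketch of the same one-off request made two ways: once against the OpenAI API directly, and once through LangChain’s chat model wrapper. It assumes the openai and langchain-openai packages plus an OpenAI API key, and the model name is just an example.

```python
# Without LangChain: calling the OpenAI API directly.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
response = client.chat.completions.create(
    model="gpt-4o-mini",  # example model name
    messages=[{"role": "user", "content": "Explain abstraction in one sentence."}],
)
print(response.choices[0].message.content)

# With LangChain: the same request behind one model-agnostic component.
# Swapping providers later means swapping this one object, not rewriting the call.
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini")
print(llm.invoke("Explain abstraction in one sentence.").content)
```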

What are the core features of LangChain?

If it helps, think of LangChain as a shortcut that makes building web apps that use LLMs much, much easier. The options for using LangChain are endless, so the way you use it really just depends on what exactly you’re using it for.

The power of LangChain lies in what it can do, and some of its fundamental features include:

Prompt Templates

Quick crash course on prompts! Prompts are instructions you give to a generative AI model to guide it in creating whatever it is you want it to create. You don’t need to be familiar with coding languages, like Python, to write prompts. You can use natural language, whether that’s English, Spanish, or any other language you prefer. A lot of people talk to ChatGPT the same way they’d talk to a friend, for example.

Prompt template:
A prompt template is a predefined format for a prompt. It is then sent to an LLM — or chat model — to generate an output.

Generally, templates are a standard layout that lets you reuse certain sections while filling in the changeable parts that make each entry something new. They’re like incomplete sentences where the missing information gets filled in.

A prompt template for generating social media posts might look like this, where you swap out the bracketed info each time you use it:

Write a social media post for [Product/Service] that targets [Audience]. The post should:

  • Highlight [Key Benefit or Feature]
  • Include a call-to-action like “Shop Now” or “Learn More”
  • Use hashtags such as [Relevant Hashtags]
  • Maintain a tone of [Desired Tone, e.g., engaging, professional]

The TL;DR? Prompt templates are like a calculated game of Mad Libs.
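If you’re curious what that looks like in code, here’s a minimal sketch of the template above using LangChain’s PromptTemplate. The variable names and example values are just illustrative, and exact import paths can differ slightly between LangChain versions.

```python
from langchain_core.prompts import PromptTemplate

# The bracketed blanks from the template above become {placeholders}.
template = PromptTemplate.from_template(
    "Write a social media post for {product} that targets {audience}. "
    "The post should highlight {key_benefit}, include a call-to-action "
    "like 'Shop Now' or 'Learn More', use hashtags such as {hashtags}, "
    "and maintain a {tone} tone."
)

# Fill in the blanks, Mad Libs style, to get a concrete prompt string.
prompt = template.format(
    product="a reusable water bottle",
    audience="college students",
    key_benefit="keeping drinks cold for 24 hours",
    hashtags="#HydrationGoals #EcoFriendly",
    tone="playful",
)
print(prompt)
```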

Prompt template:
You can use them for more complex tasks, like building a tool that repeatedly generates new content.

Prompt templates aren’t only for when a user types an input into a chatbot, like in our example above. When developing an app, a prompt template can be as simple or as complex as you need. It all depends on what your application does and what kinds of requests your user may make. But there is a difference between what the user of your app sees and does and what the prompt looks like behind the scenes. Depending on your app, the user might not actually have to do anything.

Imagine you create a travel app that gives users weather updates and news based on the location they’re traveling to. When your user opens the app, they don’t really need to do anything specific (maybe select a location or something super simple), but behind the scenes, the app could be using a prompt template that looks like this to generate the necessary results:

  • “What is the current weather in {location}?”
  • “What is the typical weather forecast in {location} between {trip_start_date} and {trip_end_date}?”
  • “What local events are scheduled in {location} between {trip_start_date} and {trip_end_date}?”

This is how LangChain’s prompt templates can make your life as a developer much easier! Their reusability, consistency, and improved accuracy result in more relevant and useful responses from the model, and therefore a much better experience for your end user.
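Here’s roughly what those behind-the-scenes travel prompts could look like as reusable templates. The placeholder names match the list above; the trip values are made up for the example.

```python
from langchain_core.prompts import PromptTemplate

weather_prompt = PromptTemplate.from_template(
    "What is the typical weather forecast in {location} "
    "between {trip_start_date} and {trip_end_date}?"
)
events_prompt = PromptTemplate.from_template(
    "What local events are scheduled in {location} "
    "between {trip_start_date} and {trip_end_date}?"
)

# The app fills these in behind the scenes from whatever the user selected.
trip = {"location": "Lisbon", "trip_start_date": "June 3", "trip_end_date": "June 10"}
print(weather_prompt.format(**trip))
print(events_prompt.format(**trip))
```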

Chains

It’s finally time…to put the “chain” in LangChain! Chains are the fundamental building blocks of LangChain. As the name suggests, they’re a sequence of calls (or requests) you can make to an LLM, letting you go beyond a single API call. Instead, you chain calls together to reach different LLMs, external APIs, and databases.

Chains:
A sequence of calls that you can make to an LLM, and they let you go beyond a single API call. You can chain them together to make multiple calls to be executed in a specific order.

Chains work by breaking down a complex action into smaller steps that have to be done in a specific order. If they’re not in order, the entire action doesn’t make sense.

It’s like doing laundry.

Step 1: You input dirty clothes into the washer.

Step 2: The washing machine outputs clean, wet clothes.

This output becomes your new input.

Step 3: You input clean, wet clothes into the dryer.

Step 4: The dryer outputs dry clothes.

Long story short — the output from one step becomes the input for another, and you won’t get clean clothes by doing the steps out of order.

Chains let you take the output of one API call and use it as the input for the next, giving you more control over the overall logic.

Let’s go back to our travel app. If you develop the app with external APIs like Google Maps and OpenWeatherMap, the chain is what connects location to weather. This lets your app generate content, like the typical weather forecast, for a specific location during a specific time.
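Here’s a simplified sketch of what that could look like, where the output of each step becomes the input to the next. The geocode and fetch_forecast functions are stand-ins for real Google Maps and OpenWeatherMap calls, not actual API code, and the model name is just an example.

```python
from langchain_core.prompts import PromptTemplate
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini")

def geocode(city: str) -> dict:
    """Stub standing in for a Google Maps lookup: city name -> coordinates."""
    return {"city": city, "lat": 38.72, "lon": -9.14}

def fetch_forecast(coords: dict) -> str:
    """Stub standing in for an OpenWeatherMap call: coordinates -> forecast text."""
    return f"Sunny, highs around 24°C near ({coords['lat']}, {coords['lon']})."

# Step 1: location -> coordinates. Step 2: coordinates -> forecast.
coords = geocode("Lisbon")
forecast = fetch_forecast(coords)

# Step 3: the forecast becomes the input to the LLM call that writes the
# traveler-facing summary.
prompt = PromptTemplate.from_template(
    "Summarize this forecast for a traveler visiting {city}: {forecast}"
)
summary = llm.invoke(prompt.format(city=coords["city"], forecast=forecast))
print(summary.content)
```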

Stringing multiple calls together gives you the opportunity to clean or filter outputs or data between API calls. In our laundry scenario, it’s like removing any delicate fabrics from the washer load before throwing the rest of your clothes in the dryer.

One downside of LangChain chains is that they’re hard-coded. Remember that since chains break an action down into separate steps, those steps have to run in a specific order. And because chains are hard-coded, you can’t change the way a chain works, which means you can’t change the flow of the steps while the code is being executed. This can be a headache, but another LangChain feature, agents, gives you more power and flexibility.

LCEL

LangChain Expression Language (LCEL) is a special syntax that simplifies how chains are constructed. Although it’s specific to LangChain, its inspiration comes from the concept of piping in Unix or Linux.

So what’s piping?

A pipeline is a set of processes chained together. Piping lets users connect commands — usually two or more — to create a seamless flow of data from one process to another. In this process, the output of one command becomes the input for another command. Sounds a lot like the laundry example, right?

At this point, you don’t have to worry about knowing the ins and outs of LCEL syntax. What you should know is that while coding with LCEL isn’t perfect, it requires less code, which sometimes (but not always) means less time and complexity.
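Just to give you a taste, here’s a minimal LCEL sketch where the | operator pipes a prompt into a model and the model’s output into a parser. It assumes recent langchain-core and langchain-openai packages and an OpenAI API key.

```python
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

prompt = ChatPromptTemplate.from_template(
    "What should a traveler pack for {location} in {month}?"
)
llm = ChatOpenAI(model="gpt-4o-mini")

# The pipe works like a Unix pipeline: prompt -> model -> plain string.
chain = prompt | llm | StrOutputParser()
print(chain.invoke({"location": "Reykjavik", "month": "January"}))
```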

Agents

Remember what I said about chains being hard-coded? Chains are great for workflows that have a set of steps that don’t change. With LangChain agents, instead of being forced to stick to a sequence, you can pass off the decision-making to your LLM.

Agents:
With agents you’re passing off the decision-making to a language model. And the model reasons and decides what actions to take and their sequence based on the input.

It’s time for a new example – you’re the owner of a restaurant. When you’re working in a kitchen, different meals require different steps, ingredients, and equipment. Even the same meals can be made differently. If you were using only chains in your kitchen, you’d have to follow the predetermined set of steps for every single meal, and you could only make one dish. That’d be pretty boring for your clients (and for anyone cooking).

So instead, you hire a chef to revamp your menu. To output a meal, the chef uses their reasoning to decide what actions to take and what order to do them in, and they can choose between the grilled chicken dish or the salmon before executing.

[Image: a LangChain agent as the chef in your restaurant kitchen]

This is exactly what a LangChain agent does for a web app.

What if your app needs to handle more complex inputs where the steps aren’t easily defined?

What if you need your app to have access to external tools and resources without defining exactly how or when your app should use them to generate outputs?

LangChain lets you pass these decisions to a language model. Like the chef, the model uses the input to reason and decide what actions to take and in what order.

Don’t worry about losing total control over your creation. You — the developer — aren’t removed from this process. You’re still the restaurant owner who supplies the kitchen with food, appliances, and equipment. For your app, you specify what model, tools, resources, and external APIs the agent can use, but the agent, like a chef, decides which to use and when. All of these steps are chains, and they’re chained together by the agent.

But if you want your LangChain agent to make the best decisions, you need to give it access to tools — quality ingredients, appliances, etc., if you will — that it can use.
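For the curious, here’s a rough sketch of what wiring up an agent can look like. The get_weather tool is a stub rather than a real weather API, and agent constructors have changed across LangChain versions, so treat this as illustrative rather than copy-paste ready.

```python
from langchain.agents import AgentExecutor, create_tool_calling_agent
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI

@tool
def get_weather(location: str) -> str:
    """Look up the current weather for a location."""
    return f"It is sunny and 24°C in {location}."  # stub instead of a real weather API

llm = ChatOpenAI(model="gpt-4o-mini")
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful travel assistant."),
    ("human", "{input}"),
    MessagesPlaceholder("agent_scratchpad"),  # where the agent's intermediate steps go
])

# You supply the model and the tools; the agent decides which tool to call and when.
agent = create_tool_calling_agent(llm, [get_weather], prompt)
executor = AgentExecutor(agent=agent, tools=[get_weather])
print(executor.invoke({"input": "Do I need an umbrella in Lisbon today?"})["output"])
```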

Tools

Built-in tools are interfaces you can use so your LangChain agents can communicate and interact with external services. “External” means the services go beyond what an LLM can do on its own. Remember: LLMs are designed to recognize text and generate responses based ONLY on data they’ve been trained on. If you want your app to do something like search the internet for real-time information, you’ll have to integrate it with a search tool.

LangChain has more than 60 built-in tools for common actions. And while you can pick whichever ones fit your app, here’s a quick look at some of them:

  • Google Drive: connecting to the Google Drive API and retrieving files like documents, spreadsheets, presentations, and more
  • Google Search: using Google Search to retrieve search results
  • OpenWeatherMap: fetching weather information and real-time data
  • Wikipedia: searching Wikipedia articles and retrieving content

There are tons of other tools available, some of which connect LLMs to APIs you’ve definitely already heard of, like YouTube, Yahoo Finance News, Reddit Search, and Bing Search.
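Loading one of these built-in tools only takes a couple of lines. Here’s a quick sketch using the Wikipedia tool, assuming the langchain-community and wikipedia packages are installed.

```python
from langchain_community.tools import WikipediaQueryRun
from langchain_community.utilities import WikipediaAPIWrapper

wikipedia = WikipediaQueryRun(api_wrapper=WikipediaAPIWrapper())

# The tool's name and description are what an agent reads when deciding whether to use it.
print(wikipedia.name)
print(wikipedia.description)

# You can also call the tool directly, outside of any agent.
print(wikipedia.run("LangChain"))
```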

While it might seem like you should give your agent access to every available tool, that would actually be a mistake. The agent uses each tool’s description, along with the context of the prompt, to decide which one is best for gathering relevant information, whether that’s Google Search, Wikipedia, or something else. So the more tools you include, the harder it’ll be for the LLM to decide which one to use.


What is LangChain Used For?

LangChain is designed to unlock the full potential of LLMs by making them more versatile and interactive. And as a developer, you probably want that. Like we’ve discussed previously, LangChain can help you create a program that not only understands and generates text but integrates seamlessly with other data sources and systems.

Some of LangChain’s most noteworthy use cases include:

  • Chatbots: Chatbots may be the ultimate showcase of how powerful LLMs can be by turning complex AI into an everyday tool. With LangChain, you can give chatbots the right context for their unique tasks so you can seamlessly integrate them into your existing communication channels and workflows.
  • Content Summarization: Language models can summarize mountains of text — books, essays, transcripts, and more — into bite-sized insights.
  • Question Answering: Language models can pull from specialized documents and knowledge bases to find relevant information and deliver appropriate answers. And with the right training data, they can tackle a wide range of questions even without extra sources.
  • Data Augmentation: Data augmentation allows you to create new, synthetic data from your existing examples. This helps boost the performance of your language model by giving it more diverse examples so it’s better trained and more adaptable.
  • Data Extraction: Using LangChain with LLMs lets you pull key insights and vital information from complex data and sources.
  • Query Analysis: Query analysis transforms vague questions into precise ones, so your data searches are smarter and more effective.

LangChain — In the Real World

So now you know what LangChain is. We’ve covered how it works, its core features, and what it’s used for, but how does all this translate in the real world?

You can use LangChain:

  • For any platform: Use LangChain to create a virtual assistant that can handle complex customer questions and provide personalized support.
  • For a content marketing platform: Use LangChain as a content creation tool to generate blog posts, social media content, and marketing copy based on user input, SEO requirements, and trending topics.
  • For an online education platform: Use LangChain to develop a virtual tutor that answers questions, provides real-time feedback, and adapts lesson plans based on individual learning styles and progress.
  • For a medical platform: Use LangChain to create a medical chatbot that can interact with patients and direct them to appropriate resources or specialists based on their symptoms.
  • For a human resources platform: Use LangChain for a recruitment tool to analyze job applications and resumes and highlight the most qualified candidates based on candidate profiles and the job requirements.
  • For a financial platform: Use LangChain to create a virtual financial advisor that interacts with clients, analyzes their financial goals, and offers personalized investment recommendations.

The Future of Generative AI (with LangChain)

Say “hello” to the new kid on the block. Compared to other popular machine learning frameworks like TensorFlow and PyTorch (launched in 2015 and 2016, respectively), LangChain came on the scene a few short years ago in October 2022. And since then, it’s become a game changer. LangChain helps developers build intelligent systems and apps that understand context, adapt to user needs, and create unique experiences.

LangChain’s framework is key to unlocking tons of new AI-driven possibilities. It’s constantly evolving and redefining what we used to think was possible with LLMs, making the apps you create smarter, more responsive, and more capable.

This is your opportunity to start now! Sign up for our free coding camp designed to jumpstart your journey with tech and eventually generative AI tools and LangChain. In our class, you’ll master the skills you need to harness the full potential of language models and build impressive apps.


Jouviane Alexandre

After spending her formative years in the height of the Internet Age, Jouviane has had her fair share of experience in adapting to the inner workings of the fast-paced technology industry. (She wasn’t the only 11-year-old who learned how to code while building and customizing her MySpace profile page.) Jouviane is a professional freelance writer who has spent her career covering technology, business, entrepreneurship, and more. She combines nearly a decade’s worth of experience, hours of research, and her own web-building projects to help guide women toward a career in web development. When she’s not working, you’ll find Jouviane binge-watching a series on Netflix, planning her next travel adventure, or creating digital art on Procreate.