LangChain Graph Checkpoint Example

Table of Contents

Introduction
Setup
Define the Graph
Run the Graph with Checkpoints
Resume the Graph from a Checkpoint
Conclusion

Introduction

This document demonstrates how to checkpoint LangChain graphs using LangGraph, the graph runtime in the LangChain ecosystem. A checkpoint saves the state of a graph at a specific point in time so that execution can be inspected or resumed later. This is useful for debugging, human-in-the-loop interactions, and resuming long-running graphs.
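Before turning to the library API, the core idea can be shown with a plain-Python sketch: a checkpoint is just a serialized snapshot of the graph state plus a record of which nodes have already run. The step names, state keys, and file path below are illustrative, not part of any library.

```python
import json
import os
import tempfile

# Hypothetical two-step pipeline; each step reads and extends a state dict.
def generate_text(state):
    state["text"] = "Paris is the capital of France."
    return state

def summarize_text(state):
    state["summary"] = state["text"].split(".")[0]
    return state

STEPS = [("generate_text", generate_text), ("summarize_text", summarize_text)]

def run(state, checkpoint_path, resume_from=0):
    """Run the pipeline, writing a checkpoint after every step."""
    for i, (name, step) in enumerate(STEPS):
        if i < resume_from:
            continue  # this step already completed in a previous run
        state = step(state)
        with open(checkpoint_path, "w") as f:
            json.dump({"next_step": i + 1, "state": state}, f)
    return state

path = os.path.join(tempfile.mkdtemp(), "checkpoint.json")
final = run({}, path)
print(final["summary"])  # -> Paris is the capital of France

# Resuming: load the checkpoint and skip the steps already done.
with open(path) as f:
    ckpt = json.load(f)
resumed = run(ckpt["state"], path, resume_from=ckpt["next_step"])
```

LangGraph's checkpointers follow the same pattern, but snapshot the state automatically after each node and key the snapshots by a thread identifier.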

Setup

Install necessary packages

pip install langgraph langchain-openai

Import necessary modules

from typing import TypedDict

from langchain_openai import ChatOpenAI
from langgraph.checkpoint.memory import MemorySaver
from langgraph.graph import StateGraph, START, END

Define the Graph

Create a simple graph with two nodes

# State shared between the nodes
class State(TypedDict):
    text: str
    summary: str

llm = ChatOpenAI(temperature=0)

# Node that generates text
def generate_text(state: State) -> dict:
    response = llm.invoke("What is the capital of France?")
    return {"text": response.content}

# Node that summarizes the text
def summarize_text(state: State) -> dict:
    response = llm.invoke(f"Summarize the following text: {state['text']}")
    return {"summary": response.content}

builder = StateGraph(State)
builder.add_node("generate_text", generate_text)
builder.add_node("summarize_text", summarize_text)

# Connect the nodes
builder.add_edge(START, "generate_text")
builder.add_edge("generate_text", "summarize_text")
builder.add_edge("summarize_text", END)

Run the Graph with Checkpoints

Compile the graph with a checkpointer and execute it

# Compile with an in-memory checkpointer; state is saved after each node
checkpointer = MemorySaver()
graph = builder.compile(checkpointer=checkpointer)

# Checkpoints are grouped by thread_id
config = {"configurable": {"thread_id": "example-thread"}}
result = graph.invoke({}, config)

print(result)

Resume the Graph from a Checkpoint

Inspect the saved state and resume execution

# The checkpointer keeps the latest state for each thread_id
snapshot = graph.get_state(config)
print(snapshot.values)

# Invoking again with the same config picks up from the saved checkpoint;
# pass None as the input to continue pending work (e.g. after an interrupt)
result = graph.invoke(None, config)

print(result)
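MemorySaver keeps checkpoints only for the life of the process; for resuming across restarts, LangGraph provides database-backed checkpointers (such as a SQLite saver). The sketch below shows the underlying idea with the standard library: store the latest serialized state per thread_id. The table and column names are illustrative, not LangGraph's actual schema.

```python
import json
import sqlite3

# Illustrative schema: one latest checkpoint per thread, keyed by thread_id.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE checkpoints (thread_id TEXT PRIMARY KEY, state TEXT)")

def save_checkpoint(thread_id, state):
    # Upsert: a newer checkpoint for the same thread replaces the old one.
    conn.execute(
        "INSERT INTO checkpoints (thread_id, state) VALUES (?, ?) "
        "ON CONFLICT(thread_id) DO UPDATE SET state = excluded.state",
        (thread_id, json.dumps(state)),
    )
    conn.commit()

def load_checkpoint(thread_id):
    row = conn.execute(
        "SELECT state FROM checkpoints WHERE thread_id = ?", (thread_id,)
    ).fetchone()
    return json.loads(row[0]) if row else None

save_checkpoint("example-thread", {"text": "Paris is the capital of France."})
save_checkpoint("example-thread", {"text": "Paris.", "summary": "Paris"})

print(load_checkpoint("example-thread"))  # the latest state for the thread
```

Keying checkpoints by thread_id is what lets many independent conversations or runs share one graph while each resumes from its own history.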

Conclusion

This example demonstrates how to use checkpoints with LangChain graphs. Checkpointing lets you persist, inspect, and resume graph state between runs, making long-running graphs more robust and easier to debug.

Author: Gemini Advanced


Last Updated: 2025-07-30 13:45:27

build: 2025-12-23 09:11 | sha: a10ddd7