Fixing TypeError Msgpack Serializable Send With Langgraph Create_react_agent And BaseStore


Introduction

Hey guys! Ever run into a weird error that just makes you scratch your head? I recently encountered a TypeError: Type is not msgpack serializable: Send while working with langgraph and thought I'd share my experience and the solution. This error popped up when I was using create_react_agent with a tool that involved BaseStore. Let's dive into the details, so you can avoid this headache yourself!

Background on Langgraph and the Issue

For those new to the party, langgraph is a library for building stateful, graph-based agents. The create_react_agent function is a convenient way to set up a ReAct-style agent that can reason and call tools. One of the features I was trying to leverage is the ability to inject a BaseStore into my tools via InjectedStore. This lets the agent persist and retrieve information across interactions, making it stateful and aware of past conversations.
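To make the idea concrete, here is a rough sketch of what store injection enables, using a plain dict in place of langgraph's BaseStore. The dict and the remember/recall helpers are illustrative stand-ins of my own, not langgraph API; BaseStore has its own put/get methods and namespace semantics.

```python
# Conceptual stand-in for what an injected store enables: tools can
# persist facts under a (namespace, key) pair and read them back on a
# later turn. This mimics BaseStore's put/get shape with a plain dict;
# it is NOT the langgraph API, just an illustration.
store: dict = {}

def remember(namespace: tuple, key: str, value: dict) -> None:
    """Persist a value so later tool calls can retrieve it."""
    store[(namespace, key)] = value

def recall(namespace: tuple, key: str):
    """Retrieve a previously stored value, or None if absent."""
    return store.get((namespace, key))

# First interaction: a tool records a user preference.
remember(("users", "abc"), "favorite_color", {"value": "blue"})

# A later interaction can read it back.
print(recall(("users", "abc"), "favorite_color"))  # {'value': 'blue'}
```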

However, things went south when I combined this with Langgraph's checkpointing mechanism. Checkpointing saves the agent's state so you can resume conversations later or recover from failures. It turns out there was an incompatibility between Langgraph's checkpoint serializer and certain internal objects, namely Send, which Langgraph uses to dispatch work to graph nodes. The error message TypeError: Type is not msgpack serializable: Send points straight at the problem: msgpack, the binary serialization format used for checkpoints, had no encoder registered for the Send object.

Minimal Reproducible Example

To illustrate the issue, I've put together a minimal, reproducible example. This is crucial because it helps pinpoint the exact cause of the error without any distractions from other parts of the codebase. Here’s the code snippet that triggers the error:

from typing import Annotated

from langchain_core.tools import tool
from langchain_openai import ChatOpenAI
from langgraph.checkpoint.memory import InMemorySaver
from langgraph.prebuilt import InjectedStore, create_react_agent
from langgraph.store.base import BaseStore
from langgraph.store.memory import InMemoryStore


@tool
def add(
    a: int,
    b: int,
    # InjectedStore marks this argument as runtime-injected, so it is
    # hidden from the model's tool-call schema.
    store: Annotated[BaseStore, InjectedStore],
):
    """Add two numbers."""
    return str(a + b)


agent = create_react_agent(
    model=ChatOpenAI(model="gpt-4.1-nano"),
    tools=[add],
    checkpointer=InMemorySaver(),
    store=InMemoryStore(),
    # version="v1",
)


for event in agent.stream(
    input={"messages": ["What is 1 + 1?"]},
    config={"configurable": {"thread_id": "abc"}},
):
    print(event)

When you run this code, you’ll encounter the dreaded TypeError. Let's break down what's happening here:

  1. We define a simple tool called add that takes two numbers and a BaseStore as input. The BaseStore is injected using InjectedStore.
  2. We create a ReAct agent using create_react_agent, providing the model, tools, a checkpoint saver (InMemorySaver), and an in-memory store (InMemoryStore).
  3. We then try to stream events from the agent using agent.stream. This is where the error occurs.

The critical part is the combination of the BaseStore injection and the checkpointing mechanism. The agent attempts to serialize the state, which includes the Send object, for checkpointing, but msgpack can't handle it.

Digging into the Error Message and Stack Trace

To really understand what's going on, let's dissect the error message and stack trace. Here’s the traceback I encountered:

Traceback (most recent call last):
  File "/Users/vincent.min/Projects/langgraph-fullstack/backend/../dev/serde_send.py", line 30, in <module>
    for event in agent.stream(
                 ~~~~~~~~~~~~^
        input={"messages": ["What is 1 + 1?"]},
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        config={"configurable": {"thread_id": "abc"}},
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^    ):
    ^
  File "/Users/vincent.min/Projects/langgraph-fullstack/backend/.venv/lib/python3.13/site-packages/langgraph/pregel/main.py", line 2582, in stream
    with SyncPregelLoop(
         ~~~~~~~~~~~~~~^
        input,
        ^^^^^^
    ...<17 lines>...
        cache_policy=self.cache_policy,
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    ) as loop:
    ^
  File "/Users/vincent.min/Projects/langgraph-fullstack/backend/.venv/lib/python3.13/site-packages/langgraph/pregel/_loop.py", line 1060, in __exit__
    return self.stack.__exit__(exc_type, exc_value, traceback)
           ~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/vincent.min/.local/share/uv/python/cpython-3.13.2-macos-aarch64-none/lib/python3.13/contextlib.py", line 619, in __exit__
    raise exc
  File "/Users/vincent.min/.local/share/uv/python/cpython-3.13.2-macos-aarch64-none/lib/python3.13/contextlib.py", line 604, in __exit__
    if cb(*exc_details):
       ~~^^^^^^^^^^^^^^
  File "/Users/vincent.min/Projects/langgraph-fullstack/backend/.venv/lib/python3.13/site-packages/langgraph/pregel/_executor.py", line 118, in __exit__
    task.result()
    ~~~~~~~~~~~^^
  File "/Users/vincent.min/.local/share/uv/python/cpython-3.13.2-macos-aarch64-none/lib/python3.13/concurrent/futures/_base.py", line 449, in result
    return self.__get_result()
           ~~~~~~~~~~~~~~~~~^^
  File "/Users/vincent.min/.local/share/uv/python/cpython-3.13.2-macos-aarch64-none/lib/python3.13/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
  File "/Users/vincent.min/Projects/langgraph-fullstack/backend/.venv/lib/python3.13/site-packages/langgraph/pregel/_executor.py", line 81, in done
    task.result()
    ~~~~~~~~~~~^^
  File "/Users/vincent.min/.local/share/uv/python/cpython-3.13.2-macos-aarch64-none/lib/python3.13/concurrent/futures/_base.py", line 449, in result
    return self.__get_result()
           ~~~~~~~~~~~~~~~~~^^
  File "/Users/vincent.min/.local/share/uv/python/cpython-3.13.2-macos-aarch64-none/lib/python3.13/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
  File "/Users/vincent.min/.local/share/uv/python/cpython-3.13.2-macos-aarch64-none/lib/python3.13/concurrent/futures/thread.py", line 59, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/Users/vincent.min/Projects/langgraph-fullstack/backend/.venv/lib/python3.13/site-packages/langgraph/checkpoint/memory/__init__.py", line 403, in put_writes
    self.serde.dumps_typed(v),
    ~~~~~~~~~~~~~~~~~~~~~~^^^
  File "/Users/vincent.min/Projects/langgraph-fullstack/backend/.venv/lib/python3.13/site-packages/langgraph/checkpoint/serde/jsonplus.py", line 222, in dumps_typed
    raise exc
  File "/Users/vincent.min/Projects/langgraph-fullstack/backend/.venv/lib/python3.13/site-packages/langgraph/checkpoint/serde/jsonplus.py", line 216, in dumps_typed
    return "msgpack", _msgpack_enc(obj)
                      ~~~~~~~~~~~~^^^^^
  File "/Users/vincent.min/Projects/langgraph-fullstack/backend/.venv/lib/python3.13/site-packages/langgraph/checkpoint/serde/jsonplus.py", line 676, in _msgpack_enc
    return ormsgpack.packb(data, default=_msgpack_default, option=_option)
           ~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: Type is not msgpack serializable: Send

The stack trace clearly shows the error originating from the msgpack serialization process within the checkpointing mechanism. It's trying to serialize an object of type Send, which isn't directly serializable by msgpack.
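The root cause is generic: binary serializers like msgpack understand plain data types (dicts, lists, strings, numbers) plus whatever extension hooks the encoder registers, and Send is an arbitrary Python object with no such hook. The same failure mode is easy to reproduce with the stdlib json module as an analogy; the Send class below is a hypothetical stand-in, not langgraph's actual class.

```python
import json

class Send:
    """Hypothetical stand-in for langgraph's internal Send object."""
    def __init__(self, node: str, arg: dict):
        self.node = node
        self.arg = arg

# Serializing checkpoint writes that contain an arbitrary object fails
# for json exactly the way it failed for msgpack in the traceback above.
try:
    json.dumps({"writes": [Send("tools", {"a": 1, "b": 1})]})
except TypeError as exc:
    print(exc)  # Object of type Send is not JSON serializable
```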

The Curious Case of the Commented Lines

Here’s where it gets even more interesting. I noticed that commenting out the line store: Annotated[BaseStore, InjectedStore], or uncommenting # version="v1", makes the issue disappear. Why is that?

  • Commenting out store: Annotated[BaseStore, InjectedStore]: Without the store injection, the agent takes a code path that never writes Send objects into the checkpoint, so the serializer has nothing to choke on.
  • Uncommenting # version="v1",: This one is more subtle. create_react_agent supports two versions of its tool-calling loop. As I understand it, the newer loop dispatches each tool call as a separate task using Send objects, which then end up among the checkpointed writes, while the "v1" loop executes all tool calls inside a single node. With version="v1", no Send objects reach the serializer, so the error never triggers.

The Solution: Upgrading Langgraph

After some digging and discussions, the solution turned out to be surprisingly straightforward: upgrade to the latest version of langgraph. The issue was identified and fixed in a more recent release. By upgrading, you get the fix that allows msgpack to correctly serialize the state, including the Send object.

To upgrade, you can use pip:

pip install --upgrade langgraph

After upgrading, the original code snippet should run without any issues. This highlights the importance of keeping your libraries up to date, especially when dealing with complex systems like Langgraph.

Additional System Information

For those who like to have all the details, here’s the system information I was running on when I encountered the issue:

System Information
------------------
> OS:  Darwin
> OS Version:  Darwin Kernel Version 24.2.0: Fri Dec  6 18:51:28 PST 2024; root:xnu-11215.61.5~2/RELEASE_ARM64_T8112
> Python Version:  3.13.2 (main, Mar 11 2025, 17:30:09) [Clang 20.1.0 ]

Package Information
-------------------
> langchain_core: 0.3.74
> langsmith: 0.4.14
> langchain_mcp_adapters: 0.1.9
> langchain_openai: 0.3.30
> langgraph_sdk: 0.2.0

Optional packages not installed
-------------------------------
> langserve

Other Dependencies
------------------
> httpx<1,>=0.23.0: Installed. No version info available.
> httpx>=0.25.2: Installed. No version info available.
> jsonpatch<2.0,>=1.33: Installed. No version info available.
> langchain-core<0.4,>=0.3.36: Installed. No version info available.
> langchain-core<1.0.0,>=0.3.74: Installed. No version info available.
> langsmith-pyo3>=0.1.0rc2;: Installed. No version info available.
> langsmith>=0.3.45: Installed. No version info available.
> mcp>=1.9.2: Installed. No version info available.
> openai-agents>=0.0.3;: Installed. No version info available.
> openai<2.0.0,>=1.99.9: Installed. No version info available.
> opentelemetry-api>=1.30.0;: Installed. No version info available.
> opentelemetry-exporter-otlp-proto-http>=1.30.0;: Installed. No version info available.
> opentelemetry-sdk>=1.30.0;: Installed. No version info available.
> orjson>=3.10.1: Installed. No version info available.
> orjson>=3.9.14;: Installed. No version info available.
> packaging>=23.2: Installed. No version info available.
> pydantic<3,>=1: Installed. No version info available.
> pydantic>=2.7.4: Installed. No version info available.
> pytest>=7.0.0;: Installed. No version info available.
> PyYAML>=5.3: Installed. No version info available.
> requests-toolbelt>=1.0.0: Installed. No version info available.
> requests>=2.0.0: Installed. No version info available.
> rich>=13.9.4;: Installed. No version info available.
> tenacity!=8.4.0,<10.0.0,>=8.1.0: Installed. No version info available.
> tiktoken<1,>=0.7: Installed. No version info available.
> typing-extensions>=4.14.0: Installed. No version info available.
> typing-extensions>=4.7: Installed. No version info available.
> vcrpy>=7.0.0;: Installed. No version info available.
> zstandard>=0.23.0: Installed. No version info available.

Key Takeaways

Let's quickly recap the key takeaways from this debugging journey:

  • The TypeError: Type is not msgpack serializable: Send can occur when using create_react_agent with a tool that injects BaseStore and Langgraph's checkpointing is enabled.
  • The issue is related to the serialization of internal objects like Send by msgpack.
  • Upgrading to the latest version of langgraph resolves the issue.
  • Keeping your libraries updated is crucial for avoiding such errors and benefiting from the latest fixes and features.

Conclusion

Debugging can be a frustrating but also rewarding process. This particular issue taught me a lot about how Langgraph handles state serialization and the importance of staying up-to-date with library releases. I hope sharing my experience helps you guys avoid this error and makes your Langgraph journey a bit smoother. Happy coding!