Bug when binding tools in _create_message_from_message_type() #30074

Open
MILK-BIOS opened this issue Mar 3, 2025 · 1 comment

Labels
🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature

@MILK-BIOS
Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

Hi LangChain team! The following code is from langchain_core/messages/utils.py:

def _create_message_from_message_type(
    message_type: str,
    content: str,
    name: Optional[str] = None,
    tool_call_id: Optional[str] = None,
    tool_calls: Optional[list[dict[str, Any]]] = None,
    id: Optional[str] = None,
    **additional_kwargs: Any,
) -> BaseMessage:
    """Create a message from a message type and content string.

    Args:
        message_type: (str) the type of the message (e.g., "human", "ai", etc.).
        content: (str) the content string.
        name: (str) the name of the message. Default is None.
        tool_call_id: (str) the tool call id. Default is None.
        tool_calls: (list[dict[str, Any]]) the tool calls. Default is None.
        id: (str) the id of the message. Default is None.
        additional_kwargs: (dict[str, Any]) additional keyword arguments.

    Returns:
        a message of the appropriate type.

    Raises:
        ValueError: if the message type is not one of "human", "user", "ai",
            "assistant", "function", "tool", "system", or "developer".
    """
    kwargs: dict[str, Any] = {}
    if name is not None:
        kwargs["name"] = name
    if tool_call_id is not None:
        kwargs["tool_call_id"] = tool_call_id
    if additional_kwargs:
        if response_metadata := additional_kwargs.pop("response_metadata", None):
            kwargs["response_metadata"] = response_metadata
        kwargs["additional_kwargs"] = additional_kwargs  # type: ignore[assignment]
        additional_kwargs.update(additional_kwargs.pop("additional_kwargs", {}))
    if id is not None:
        kwargs["id"] = id
    if tool_calls is not None:
        kwargs["tool_calls"] = []
        for tool_call in tool_calls:
            # Convert OpenAI-format tool call to LangChain format.
            if "function" in tool_call:
                args = tool_call["function"]["arguments"]
                if isinstance(args, str):
                    args = json.loads(args, strict=False)
                kwargs["tool_calls"].append(
                    {
                        "name": tool_call["function"]["name"],
                        "args": args,
                        "id": tool_call["id"],
                        "type": "tool_call",
                    }
                )
            else:
                kwargs["tool_calls"].append(tool_call)
    if message_type in ("human", "user"):
        if example := kwargs.get("additional_kwargs", {}).pop("example", False):
            kwargs["example"] = example
        message: BaseMessage = HumanMessage(content=content, **kwargs)
    elif message_type in ("ai", "assistant"):
        if example := kwargs.get("additional_kwargs", {}).pop("example", False):
            kwargs["example"] = example
        message = AIMessage(content=content, **kwargs)
    elif message_type in ("system", "developer"):
        if message_type == "developer":
            kwargs["additional_kwargs"] = kwargs.get("additional_kwargs") or {}
            kwargs["additional_kwargs"]["__openai_role__"] = "developer"
        message = SystemMessage(content=content, **kwargs)
    elif message_type == "function":
        message = FunctionMessage(content=content, **kwargs)
    elif message_type == "tool":
        artifact = kwargs.get("additional_kwargs", {}).pop("artifact", None)
        message = ToolMessage(content=content, artifact=artifact, **kwargs)
    elif message_type == "remove":
        message = RemoveMessage(**kwargs)
    else:
        msg = (
            f"Unexpected message type: '{message_type}'. Use one of 'human',"
            f" 'user', 'ai', 'assistant', 'function', 'tool', 'system', or 'developer'."
        )
        msg = create_message(message=msg, error_code=ErrorCode.MESSAGE_COERCION_FAILURE)
        raise ValueError(msg)
    return message
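For context, here is a minimal sketch that appears to reproduce the failure path. The tool name and arguments are made up for illustration; the assumption, consistent with the traceback below, is that the model returned plain text rather than JSON in the OpenAI-style "arguments" field.

# Minimal reproduction sketch (hypothetical tool name and arguments; only the
# shape of the dict matters). Converting an assistant message whose tool call
# carries non-JSON "arguments" reaches the json.loads call in
# _create_message_from_message_type and raises JSONDecodeError.
from langchain_core.messages import convert_to_messages

messages = [
    {
        "role": "assistant",
        "content": "",
        "tool_calls": [
            {
                "id": "call_0",
                "type": "function",
                "function": {
                    "name": "search",
                    # Plain text instead of a JSON object string.
                    "arguments": "weather in Beijing",
                },
            }
        ],
    }
]

convert_to_messages(messages)
# json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)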

Error Message and Stack Trace (if applicable)

Exception occurred: JSONDecodeError (note: full exception trace is shown but execution is paused at: _run_module_as_main)
Expecting value: line 1 column 1 (char 0)
File "/opt/conda/envs/yolox/lib/python3.10/json/decoder.py", line 353, in raw_decode
obj, end = self.scan_once(s, idx)
StopIteration: 0

During handling of the above exception, another exception occurred:

File "/opt/conda/envs/yolox/lib/python3.10/json/decoder.py", line 355, in raw_decode
raise JSONDecodeError("Expecting value", s, err.value) from None
File "/opt/conda/envs/yolox/lib/python3.10/json/decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/opt/conda/envs/yolox/lib/python3.10/json/init.py", line 359, in loads
return cls(**kw).decode(s)
File "/opt/conda/envs/yolox/lib/python3.10/site-packages/langchain_core/messages/utils.py", line 252, in _create_message_from_message_type
args = json.loads(args, strict=False)
File "/opt/conda/envs/yolox/lib/python3.10/site-packages/langchain_core/messages/utils.py", line 337, in _convert_to_message
_message = _create_message_from_message_type(
File "/opt/conda/envs/yolox/lib/python3.10/site-packages/langchain_core/messages/utils.py", line 364, in
return [_convert_to_message(m) for m in messages]
File "/opt/conda/envs/yolox/lib/python3.10/site-packages/langchain_core/messages/utils.py", line 364, in convert_to_messages
return [_convert_to_message(m) for m in messages]
File "/opt/conda/envs/yolox/lib/python3.10/site-packages/langgraph/graph/message.py", line 173, in add_messages
for m in convert_to_messages(right)
File "/opt/conda/envs/yolox/lib/python3.10/site-packages/langgraph/graph/message.py", line 36, in _add_messages
return func(left, right, **kwargs)
File "/opt/conda/envs/yolox/lib/python3.10/site-packages/langgraph/channels/binop.py", line 88, in update
self.value = self.operator(self.value, value)
File "/opt/conda/envs/yolox/lib/python3.10/site-packages/langgraph/pregel/algo.py", line 305, in apply_writes
if channels[chan].update(vals) and get_next_version is not None:
File "/opt/conda/envs/yolox/lib/python3.10/site-packages/langgraph/pregel/algo.py", line 201, in local_read
apply_writes(copy_checkpoint(checkpoint), local_channels, [task], None)
File "/opt/conda/envs/yolox/lib/python3.10/site-packages/langgraph/pregel/read.py", line 109, in do_read
return read(select, fresh)
File "/opt/conda/envs/yolox/lib/python3.10/site-packages/langgraph/graph/graph.py", line 87, in _route
value = reader(config)
File "/opt/conda/envs/yolox/lib/python3.10/site-packages/langgraph/utils/runnable.py", line 310, in invoke
ret = context.run(self.func, *args, **kwargs)
File "/opt/conda/envs/yolox/lib/python3.10/site-packages/langgraph/utils/runnable.py", line 548, in invoke
input = step.invoke(input, config)
File "/opt/conda/envs/yolox/lib/python3.10/site-packages/langgraph/pregel/retry.py", line 40, in run_with_retry
return task.proc.invoke(task.input, config)
File "/opt/conda/envs/yolox/lib/python3.10/site-packages/langgraph/pregel/runner.py", line 230, in tick
run_with_retry(
File "/opt/conda/envs/yolox/lib/python3.10/site-packages/langgraph/pregel/init.py", line 1993, in stream
for _ in runner.tick(
File "/private/workspace/fhs/AN/agents/router.py", line 150, in stream_graph_updates
for event in graph.stream({"messages": [{"role": "user", "content": user_input}]}):
File "/private/workspace/fhs/AN/agents/router.py", line 174, in
stream_graph_updates(user_input)
File "/opt/conda/envs/yolox/lib/python3.10/runpy.py", line 86, in _run_code
exec(code, run_globals)
File "/opt/conda/envs/yolox/lib/python3.10/runpy.py", line 196, in _run_module_as_main (Current frame)
return _run_code(code, main_globals, None,
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

Description

I'm trying to bind tools with DeepSeek-R1 and have followed the example code to build my prompt. In the function above, whenever args is a string it is unconditionally passed to json.loads, even if the model did not return valid JSON:

if isinstance(args, str):
    args = json.loads(args, strict=False)

Is this behavior correct? Thanks for your time!
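For what it's worth, a guarded variant along these lines would avoid the crash; the helper name and the fallback key are my own for illustration, not part of langchain_core.

import json
from typing import Any

def _safe_parse_tool_args(raw_args: Any) -> Any:
    """Parse OpenAI-style tool-call arguments, tolerating non-JSON strings."""
    if not isinstance(raw_args, str):
        return raw_args
    try:
        return json.loads(raw_args, strict=False)
    except json.JSONDecodeError:
        # Fall back to wrapping the raw text so message conversion does not
        # crash when a model (e.g. DeepSeek-R1) emits non-JSON arguments.
        return {"raw_arguments": raw_args}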

System Info

System Information

OS: Linux
OS Version: #113-Ubuntu SMP Thu Feb 3 18:43:29 UTC 2022
Python Version: 3.10.16 | packaged by conda-forge | (main, Dec 5 2024, 14:16:10) [GCC 13.3.0]

Package Information

langchain_core: 0.3.40
langchain: 0.3.19
langchain_community: 0.3.18
langsmith: 0.3.11
langchain_google_community: 2.0.7
langchain_ollama: 0.2.3
langchain_text_splitters: 0.3.6
langgraph_sdk: 0.1.53

Optional packages not installed

langserve

Other Dependencies

aiohttp<4.0.0,>=3.8.3: Installed. No version info available.
async-timeout<5.0.0,>=4.0.0;: Installed. No version info available.
beautifulsoup4: Installed. No version info available.
dataclasses-json<0.7,>=0.5.7: Installed. No version info available.
db-dtypes: Installed. No version info available.
gapic-google-longrunning: Installed. No version info available.
google-api-core: 2.24.1
google-api-python-client: 2.162.0
google-auth: 2.38.0
google-auth-httplib2: 0.2.0
google-auth-oauthlib: Installed. No version info available.
google-cloud-aiplatform: Installed. No version info available.
google-cloud-bigquery: Installed. No version info available.
google-cloud-bigquery-storage: Installed. No version info available.
google-cloud-contentwarehouse: Installed. No version info available.
google-cloud-core: 2.4.2
google-cloud-discoveryengine: Installed. No version info available.
google-cloud-documentai: Installed. No version info available.
google-cloud-documentai-toolbox: Installed. No version info available.
google-cloud-speech: Installed. No version info available.
google-cloud-storage: Installed. No version info available.
google-cloud-texttospeech: Installed. No version info available.
google-cloud-translate: Installed. No version info available.
google-cloud-vision: Installed. No version info available.
googlemaps: Installed. No version info available.
grpcio: 1.70.0
httpx: 0.28.1
httpx-sse<1.0.0,>=0.4.0: Installed. No version info available.
jsonpatch<2.0,>=1.33: Installed. No version info available.
langchain-anthropic;: Installed. No version info available.
langchain-aws;: Installed. No version info available.
langchain-cohere;: Installed. No version info available.
langchain-community;: Installed. No version info available.
langchain-core<1.0.0,>=0.3.34: Installed. No version info available.
langchain-core<1.0.0,>=0.3.35: Installed. No version info available.
langchain-core<1.0.0,>=0.3.37: Installed. No version info available.
langchain-deepseek;: Installed. No version info available.
langchain-fireworks;: Installed. No version info available.
langchain-google-genai;: Installed. No version info available.
langchain-google-vertexai;: Installed. No version info available.
langchain-groq;: Installed. No version info available.
langchain-huggingface;: Installed. No version info available.
langchain-mistralai;: Installed. No version info available.
langchain-ollama;: Installed. No version info available.
langchain-openai;: Installed. No version info available.
langchain-text-splitters<1.0.0,>=0.3.6: Installed. No version info available.
langchain-together;: Installed. No version info available.
langchain-xai;: Installed. No version info available.
langchain<1.0.0,>=0.3.19: Installed. No version info available.
langsmith-pyo3: Installed. No version info available.
langsmith<0.4,>=0.1.125: Installed. No version info available.
langsmith<0.4,>=0.1.17: Installed. No version info available.
numpy<2,>=1.26.4;: Installed. No version info available.
numpy<3,>=1.26.2;: Installed. No version info available.
ollama: 0.4.7
orjson: 3.10.7
packaging: 24.2
packaging<25,>=23.2: Installed. No version info available.
pandas: 2.2.3
pyarrow: 19.0.1
pydantic: 2.10.6
pydantic-settings<3.0.0,>=2.4.0: Installed. No version info available.
pydantic<3.0.0,>=2.5.2;: Installed. No version info available.
pydantic<3.0.0,>=2.7.4: Installed. No version info available.
pydantic<3.0.0,>=2.7.4;: Installed. No version info available.
pytest: Installed. No version info available.
PyYAML>=5.3: Installed. No version info available.
requests: 2.32.3
requests-toolbelt: 1.0.0
requests<3,>=2: Installed. No version info available.
rich: 13.9.4
SQLAlchemy<3,>=1.4: Installed. No version info available.
tenacity!=8.4.0,<10,>=8.1.0: Installed. No version info available.
tenacity!=8.4.0,<10.0.0,>=8.1.0: Installed. No version info available.
typing-extensions>=4.7: Installed. No version info available.
zstandard: 0.23.0

@dosubot dosubot bot added the 🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature label Mar 3, 2025
@andrasfe

andrasfe commented Mar 5, 2025

Can you please provide a code snippet to reproduce the error?
