core: basemessage.text() #29078
Conversation
Looks good. Nicer than having to use StrOutputParser. For streaming this works with Anthropic, but I haven't checked other providers.
For reference, here is how StrOutputParser extracts text:
langchain/libs/core/langchain_core/outputs/chat_generation.py
Lines 50 to 63 in c5bee0a
```python
if isinstance(self.message.content, str):
    text = self.message.content
# HACK: Assumes text in content blocks in OpenAI format.
# Uses first text block.
elif isinstance(self.message.content, list):
    for block in self.message.content:
        if isinstance(block, str):
            text = block
            break
        elif isinstance(block, dict) and "text" in block:
            text = block["text"]
            break
else:
    pass
```
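To make the quoted behavior easy to poke at, here is the same extraction logic pulled out as a standalone function; `content` stands in for `self.message.content`, and the function name is made up for this sketch:

```python
def extract_text(content):
    """Sketch of the StrOutputParser extraction quoted above."""
    text = ""
    if isinstance(content, str):
        text = content
    # HACK (as in the original): assumes OpenAI-format content blocks
    # and uses only the first text block found.
    elif isinstance(content, list):
        for block in content:
            if isinstance(block, str):
                text = block
                break
            if isinstance(block, dict) and "text" in block:
                text = block["text"]
                break
    return text
```

Note that a content list with no text blocks (e.g. only image blocks) falls through and yields an empty string.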
Would be good to document a streaming example, e.g.,
```python
for chunk in llm.stream(...):
    print(chunk.text(), end="|")
```
Should target being able to replace this page: https://python.langchain.com/docs/how_to/output_parser_string/
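For illustration, here is roughly what that streaming loop does, using a stub in place of a real model; `FakeChunk` and its `.text()` implementation are assumptions for this sketch, modeled on the extraction logic quoted earlier:

```python
class FakeChunk:
    """Stand-in for a streamed message chunk with a .text() accessor."""

    def __init__(self, content):
        # content may be a plain string or a list of OpenAI-style blocks
        self.content = content

    def text(self):
        if isinstance(self.content, str):
            return self.content
        for block in self.content:
            if isinstance(block, str):
                return block
            if isinstance(block, dict) and "text" in block:
                return block["text"]
        return ""


def join_stream(chunks):
    # Mirrors `print(chunk.text(), end="|")` by joining chunk texts with "|"
    return "|".join(chunk.text() for chunk in chunks)
```

This is only meant to show the intended ergonomics: the caller gets a string per chunk without wiring in StrOutputParser.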