community: avoid double templating in langchain_prompty #25777
Conversation
Not relevant to the core logic, but I hesitated between sharing a plain dict for the mapping and a dedicated class. I went with the latter, but if anybody has pointers on what would be cleanest here, I'd be happy to learn.
Prompty templates the given .prompty file and parses it for messages. We convert those messages to LangChain's Message objects, which avoids the second templating in the call to ChatPromptTemplate.
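A rough sketch of that conversion step (function and variable names here are illustrative, not the exact PR code): each message Prompty has already rendered is wrapped in a concrete LangChain message object, so nothing downstream tries to render it again.

```python
from langchain_core.messages import AIMessage, BaseMessage, HumanMessage, SystemMessage

# Minimal sketch: wrap already-rendered Prompty messages in concrete
# Message objects so ChatPromptTemplate treats their content as final.
# `parsed` is assumed to be a list of {"role": ..., "content": ...} dicts.
def to_lc_messages(parsed: list[dict]) -> list[BaseMessage]:
    role_map = {
        "system": SystemMessage,
        "user": HumanMessage,
        "assistant": AIMessage,
    }
    return [role_map[m["role"]](content=m["content"]) for m in parsed]
```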
```python
from pydantic import BaseModel

from .core import Invoker, Prompty, SimpleModel


class RoleMap:
```
Does this cover the full scope of roles? What about "tool"?
Yes, I simply adapted the roles that were defined when integrating Prompty into LangChain. The latest versions of Prompty removed ai and human, so we could even delete those (but that would break some users' prompts).
Tools are defined in the YAML header of the .prompty files, here's an example.
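For the roles discussed here, a minimal sketch of what such a mapping could look like, with the legacy aliases included (names assumed from this thread, not copied from the PR):

```python
from langchain_core.messages import AIMessage, HumanMessage, SystemMessage

# Hypothetical role map covering Prompty's role spellings; "ai" and
# "human" are legacy aliases kept only for backwards compatibility.
ROLE_TO_MESSAGE = {
    "system": SystemMessage,
    "user": HumanMessage,
    "human": HumanMessage,
    "assistant": AIMessage,
    "ai": AIMessage,
}
```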
```python
lc_messages.append(
    MessagesPlaceholder(
        variable_name=input_name_agent_scratchpad, optional=True
    )  # type: ignore[arg-type]
)
lc_p = ChatPromptTemplate.from_messages(
    lc_messages, template_format=template_format
)
```
Can you clarify why specifying template_format with a default of "f-string" is breaking? It looks like that is the default in ChatPromptTemplate.from_messages:
langchain/libs/core/langchain_core/prompts/chat.py, lines 1147 to 1151 in d6c4803:

```python
def from_messages(
    cls,
    messages: Sequence[MessageLikeRepresentation],
    template_format: Literal["f-string", "mustache", "jinja2"] = "f-string",
) -> ChatPromptTemplate:
```
Or is it just unrelated?
You're right, it's unrelated.
Here the issue is that before initializing the ChatPromptTemplate, Prompty has already parsed and templated the prompt using the user input. When invoking ChatPromptTemplate, it tries to template the messages again and interprets text inside { } (or {{ }}) as missing variables. The fix proposed in this PR avoids that second templating by converting to Message objects, which are already templated.
Since we skip this second templating entirely, it does not make sense to expose one of its parameters to the user, as it would have no effect. You can select the template engine used by Prompty directly in the .prompty file's YAML header.
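A minimal sketch of the failure mode, using only standard langchain-core APIs (the prompt text is made up): once Prompty has rendered the text, any literal braces left in it, such as JSON, are picked up as variables by the second f-string pass.

```python
from langchain_core.messages import HumanMessage
from langchain_core.prompts import ChatPromptTemplate

# Text already rendered by Prompty, containing literal braces.
rendered = 'Respond with JSON like {"answer": "..."}'

# Passing the raw string back re-templates it: the braces are read as
# an f-string variable, so invoking fails with a missing-variable error.
broken = ChatPromptTemplate.from_messages([("user", rendered)])
# broken.invoke({})  # raises: missing input variable

# Wrapping the rendered text in a Message object marks it as final;
# ChatPromptTemplate passes message instances through untouched.
fixed = ChatPromptTemplate.from_messages([HumanMessage(content=rendered)])
fixed.invoke({})  # works: the message is forwarded as-is
```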
Thanks for merging and for the time you spent on this!
Thank you for the fix! Released in langchain-prompty 0.0.3, feel free to take a look.
Description
In langchain_prompty, messages are templated by Prompty. However, a call to ChatPromptTemplate was initiating a second templating. We now convert parsed messages to Message objects before calling ChatPromptTemplate, making it clear that they are already templated. We also revert #25739, which applied to this second templating (now avoided entirely) and did not fix the original issue.
Issue
Closes #25703