
Issue: pull_prompt include_model=True for ChatAnthropic model throws error when invoked #1365

ncnlinh opened this issue Dec 31, 2024 · 0 comments


Issue you'd like to raise.

Per title, the following code:

from langsmith import Client
from langchain_core.runnables import RunnableSequence

# prompt_id and prompt_input are defined elsewhere
langsmith_client = Client()
prompt_with_model: RunnableSequence = langsmith_client.pull_prompt(prompt_id, include_model=True)
prompt_with_model.invoke(prompt_input)

throws this error:

anthropic.BadRequestError: Error code: 400 - {'type': 'error', 'error': {'type': 'invalid_request_error', 'message': 'tools.0.type: Extra inputs are not permitted'}}

I believe the tools are still in OpenAI format when pulled, and it appears to be related to this changeset: #1210.
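
For context, the two APIs expect tools in different shapes, which is why Anthropic rejects the extra tools.0.type field when it receives an OpenAI-style definition. A minimal sketch of the difference (the get_weather tool is a hypothetical example, not from my prompt):

# OpenAI-style tool, roughly how it comes back with the pulled prompt:
openai_style_tool = {
  "type": "function",  # this wrapper key is the "extra input" Anthropic rejects
  "function": {
    "name": "get_weather",
    "description": "Get the current weather for a city.",
    "parameters": {
      "type": "object",
      "properties": {"city": {"type": "string"}},
      "required": ["city"],
    },
  },
}

# Anthropic-style tool, what the Messages API expects instead:
anthropic_style_tool = {
  "name": "get_weather",
  "description": "Get the current weather for a city.",
  "input_schema": {
    "type": "object",
    "properties": {"city": {"type": "string"}},
    "required": ["city"],
  },
}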

Suggestion:

Here is the workaround that made it work for me:

from langsmith import Client
from langchain_anthropic import ChatAnthropic
from langchain_core.runnables import RunnableSequence

langsmith_client = Client()
prompt_with_model: RunnableSequence = langsmith_client.pull_prompt(prompt_id, include_model=True)
if isinstance(prompt_with_model.last.bound, ChatAnthropic):
  # Rebind tools. For remote prompts with Anthropic, after pulling from the hub,
  # the tools are still OpenAI-formatted; bind_tools converts them to the
  # Anthropic schema before the model is invoked.
  rebound_llm = prompt_with_model.steps[1]
  prompt_with_model = RunnableSequence(
    prompt_with_model.first,
    rebound_llm.bind_tools(rebound_llm.kwargs["tools"])
  )
prompt_with_model.invoke(prompt_input)
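
As a sanity check (a sketch run on a freshly pulled sequence, before applying the rebind, and assuming the prompt → model-binding layout above), printing the bound tool kwargs before and after bind_tools shows the conversion:

rebound_llm = prompt_with_model.steps[1]

# Before: OpenAI wrapper, e.g. {"type": "function", "function": {...}}
print(rebound_llm.kwargs["tools"])

# bind_tools on ChatAnthropic should convert the definitions to Anthropic's
# {"name": ..., "description": ..., "input_schema": ...} shape.
fixed_llm = rebound_llm.bind_tools(rebound_llm.kwargs["tools"])
print(fixed_llm.kwargs["tools"])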

Could someone with more context (@hinthornw, @madams0013, or @agola11) take a look?
