
fix(models): forward sampling params and populate finish_reason in AnthropicLlm#5441

Open
SuperMarioYL wants to merge 1 commit into google:main from SuperMarioYL:fix/anthropic-llm-sampling-params-finish-reason

Conversation

@SuperMarioYL

Please ensure you have read the contribution guide before creating a pull request.

Link to Issue or Description of Change

1. Link to existing issues: fixes #5393 (AnthropicLlm silently ignores generation config parameters) and #5394 (AnthropicLlm never populates finish_reason on LlmResponse).

Testing Plan

Unit Tests:

  • I have added or updated unit tests for my change.
  • All unit tests pass locally.
tests/unittests/models/test_anthropic_llm.py::test_to_google_genai_finish_reason_mappings PASSED
tests/unittests/models/test_anthropic_llm.py::test_message_to_generate_content_response_sets_finish_reason PASSED
tests/unittests/models/test_anthropic_llm.py::test_message_to_generate_content_response_finish_reason_max_tokens PASSED
tests/unittests/models/test_anthropic_llm.py::test_streaming_final_response_has_finish_reason PASSED
tests/unittests/models/test_anthropic_llm.py::test_non_streaming_forwards_sampling_params PASSED
tests/unittests/models/test_anthropic_llm.py::test_streaming_forwards_sampling_params PASSED
tests/unittests/models/test_anthropic_llm.py::test_sampling_params_use_not_given_when_absent PASSED
48 passed in 2.98s (all existing tests continue to pass)

Manual End-to-End (E2E) Tests:

These fixes address configuration parameters that are forwarded to the Anthropic SDK. The behavior is fully covered by unit tests using mocked messages.create calls. To manually verify:

from google.adk.models.anthropic_llm import AnthropicLlm
from google.adk.models.llm_request import LlmRequest
from google.genai import types

agent = AnthropicLlm(model="claude-3-5-sonnet-20241022")
# temperature=0.7 is now forwarded to the API instead of silently dropped
request = LlmRequest(
    config=types.GenerateContentConfig(temperature=0.7, top_p=0.9),
    ...
)
async for response in agent.generate_content_async(request):
    print(response.finish_reason)  # now populated, e.g. "STOP" instead of None

Summary of Changes

Issue #5393 — sampling params silently dropped:

  • Extract temperature, top_p, top_k, stop_sequences from llm_request.config before both streaming and non-streaming paths
  • Pass extracted values (or NOT_GIVEN when absent/None) to both messages.create calls
  • LiteLlm already handles these correctly; AnthropicLlm now matches that behavior
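The extraction described above can be sketched roughly as follows. This is illustrative only, not the actual ADK implementation: the function name, the attribute access, and the NOT_GIVEN stand-in are assumptions. The real code would use the Anthropic SDK's NOT_GIVEN sentinel so that omitted parameters are left off the wire entirely rather than sent as explicit nulls.

```python
# Illustrative sketch only -- the real logic lives in
# google.adk.models.anthropic_llm and uses anthropic._types.NOT_GIVEN.
NOT_GIVEN = object()  # stand-in sentinel for "parameter not provided"


def extract_sampling_params(config):
    """Map GenerateContentConfig sampling fields to messages.create kwargs.

    Fields that are absent or None become NOT_GIVEN so the SDK omits
    them from the API request instead of sending explicit nulls.
    """
    def given(value):
        return value if value is not None else NOT_GIVEN

    return {
        "temperature": given(getattr(config, "temperature", None)),
        "top_p": given(getattr(config, "top_p", None)),
        "top_k": given(getattr(config, "top_k", None)),
        "stop_sequences": given(getattr(config, "stop_sequences", None)),
    }
```

Both the streaming and non-streaming paths can then splat the same dict into their respective messages.create calls, which keeps the two code paths from drifting apart.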

Issue #5394 — finish_reason never populated:

  • Non-streaming: remove the # TODO: Deal with these later. comment block and call to_google_genai_finish_reason(message.stop_reason)
  • Streaming: initialize stop_reason: Optional[str] = None, capture it from event.delta.stop_reason in the message_delta handler, and pass to the final LlmResponse
  • Helper function: add the two missing stop_reason mappings: pause_turn -> STOP (extended-thinking turn pause) and refusal -> SAFETY (model declined the request)
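A rough sketch of what the mapping helper looks like after this change. The dictionary shape and the pre-existing entries are illustrative assumptions; only the two new rows (pause_turn and refusal) are what this PR adds, and plain strings stand in for the google.genai.types.FinishReason enum members.

```python
# Illustrative sketch of the stop_reason -> finish_reason mapping; string
# values stand in for google.genai.types.FinishReason enum members.
_STOP_REASON_MAPPINGS = {
    "end_turn": "STOP",
    "stop_sequence": "STOP",
    "max_tokens": "MAX_TOKENS",
    "tool_use": "STOP",
    "pause_turn": "STOP",  # added by this PR: extended-thinking turn pause
    "refusal": "SAFETY",   # added by this PR: model declined the request
}


def to_google_genai_finish_reason(stop_reason):
    """Translate an Anthropic stop_reason into a google.genai finish reason."""
    if stop_reason is None:
        return "FINISH_REASON_UNSPECIFIED"
    return _STOP_REASON_MAPPINGS.get(stop_reason, "OTHER")
```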

Checklist

  • I have read the CONTRIBUTING.md document.
  • I have performed a self-review of my own code.
  • I have added tests that prove my fix is effective or that my feature works.
  • New and existing unit tests pass locally with my changes.

fix(models): forward sampling params and populate finish_reason in AnthropicLlm

- Forward temperature, top_p, top_k, stop_sequences from LlmRequest.config
  to both the non-streaming and streaming Anthropic messages.create calls.
  Previously these parameters were silently ignored, making it impossible
  to control generation from the ADK config interface.

- Populate finish_reason on LlmResponse for both non-streaming
  (uncomment the existing helper call) and streaming paths (capture
  stop_reason from the message_delta event).

- Extend to_google_genai_finish_reason() with the two missing mappings:
  pause_turn -> STOP (extended thinking turn pause) and
  refusal -> SAFETY (model declined the request).

Fixes google#5393, google#5394

google-cla Bot commented Apr 21, 2026

Thanks for your pull request! It looks like this may be your first contribution to a Google open source project. Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA).

View this failed invocation of the CLA check for more information.

For the most up to date status, view the checks section at the bottom of the pull request.

@adk-bot adk-bot added the models [Component] Issues related to model support label Apr 21, 2026

adk-bot commented Apr 21, 2026

Response from ADK Triaging Agent

Hello @SuperMarioYL, thank you for your contribution!

It looks like the Contributor License Agreement (CLA) check is failing. Before we can merge this PR, you'll need to sign the CLA. Please check the details of the cla/google status check at the bottom of the PR for instructions.

Thanks!


Development

Successfully merging this pull request may close these issues.

  • AnthropicLlm silently ignores generation config parameters (#5393)
  • AnthropicLlm never populates finish_reason on LlmResponse (#5394)
