Open
Labels: bug (Something isn't working)
Description
Checks
- I have updated to the latest minor and patch version of Strands
- I have checked the documentation and this is not expected behavior
- I have searched the existing issues and there are no duplicates of my issue
Strands Version
1.0.1
Python Version
3.12.11
Operating System
Ubuntu 22.04.5 LTS
Installation Method
pip
Steps to Reproduce
strands-agents==1.0.1
strands-agents-tools==0.2.1
litellm==1.72.9
litellm-enterprise==0.1.7
litellm-proxy-extras==0.2.5
- Define an agent tool that takes two arguments: the agent instance and an enum value (a minimal sketch follows this list)
- Initialize the agent with that tool
- Run the agent on a prompt
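A minimal repro sketch of that setup, mirroring the tool spec dumped under "Possible Solution" below. The tool body, prompt, and model/proxy wiring are placeholders; only the signature shape matters:

# Hedged repro sketch: `Agent` and `tool` come from strands-agents; per the
# report, the `agent` parameter is injected at call time and is excluded
# from the generated input schema (only attribute_name appears in it).
from enum import Enum

from strands import Agent, tool


class UserAttributeName(str, Enum):
    """Attribute"""

    EMAIL = "EMAIL"
    PHONE = "PHONE"


@tool
def get_user_attribute_history(
    attribute_name: UserAttributeName, agent: Agent
) -> list[dict]:
    """Retrieve the history of changes of a specific attribute for a user.

    Args:
        attribute_name (UserAttributeName): The name of the attribute to retrieve history for.
        agent (Agent): The agent instance containing the user ID and analysis end timestamp in its state.

    Returns:
        list[dict]: A list of dictionaries representing the history of changes of the specified attribute.
    """
    return []  # stub; the real body is irrelevant to the schema error


ato_dp_agent = Agent(tools=[get_user_attribute_history])  # model/proxy config omitted
ato_dp_agent("Provide initial analysis of the user profile.")  # fails with the 400 below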
Expected Behavior
Successful agent run. This setup worked up until today, and I did not update any of the strands or litellm packages between yesterday and today, so I'm having trouble pinning down what changed in the environment.
Actual Behavior
---------------------------------------------------------------------------
HTTPStatusError Traceback (most recent call last)
File ~/.airconda-environments/production--ml_infra--redspot--default--v0.4.21/lib/python3.12/site-packages/litellm/llms/bedrock/chat/invoke_handler.py:200, in make_call(client, api_base, headers, data, model, messages, logging_obj, fake_stream, json_mode, bedrock_invoke_provider)
196 client = get_async_httpx_client(
197 llm_provider=litellm.LlmProviders.BEDROCK
198 ) # Create a new client if none provided
--> 200 response = await client.post(
201 api_base,
202 headers=headers,
203 data=data,
204 stream=not fake_stream,
205 logging_obj=logging_obj,
206 )
208 if response.status_code != 200:
File ~/.airconda-environments/production--ml_infra--redspot--default--v0.4.21/lib/python3.12/site-packages/litellm/litellm_core_utils/logging_utils.py:135, in track_llm_api_timing.<locals>.decorator.<locals>.async_wrapper(*args, **kwargs)
134 try:
--> 135 result = await func(*args, **kwargs)
136 return result
File ~/.airconda-environments/production--ml_infra--redspot--default--v0.4.21/lib/python3.12/site-packages/litellm/llms/custom_httpx/http_handler.py:278, in AsyncHTTPHandler.post(self, url, data, json, params, headers, timeout, stream, logging_obj, files, content)
276 setattr(e, "status_code", e.response.status_code)
--> 278 raise e
279 except Exception as e:
File ~/.airconda-environments/production--ml_infra--redspot--default--v0.4.21/lib/python3.12/site-packages/litellm/llms/custom_httpx/http_handler.py:234, in AsyncHTTPHandler.post(self, url, data, json, params, headers, timeout, stream, logging_obj, files, content)
233 response = await self.client.send(req, stream=stream)
--> 234 response.raise_for_status()
235 return response
File ~/.airconda-environments/production--ml_infra--redspot--default--v0.4.21/lib/python3.12/site-packages/httpx/_models.py:829, in Response.raise_for_status(self)
828 message = message.format(self, error_type=error_type)
--> 829 raise HTTPStatusError(message, request=request, response=self)
HTTPStatusError: Client error '400 Bad Request' for url 'https://llm-fusion-hub.a.musta.ch/api/v2/proxy/aws/bedrock/model/us.anthropic.claude-sonnet-4-20250514-v1%3A0/converse-stream'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/400
During handling of the above exception, another exception occurred:
BedrockError Traceback (most recent call last)
File ~/.airconda-environments/production--ml_infra--redspot--default--v0.4.21/lib/python3.12/site-packages/litellm/main.py:525, in acompletion(model, messages, functions, function_call, timeout, temperature, top_p, n, stream, stream_options, stop, max_tokens, max_completion_tokens, modalities, prediction, audio, presence_penalty, frequency_penalty, logit_bias, user, response_format, seed, tools, tool_choice, parallel_tool_calls, logprobs, top_logprobs, deployment_id, reasoning_effort, base_url, api_version, api_key, model_list, extra_headers, thinking, web_search_options, **kwargs)
524 elif asyncio.iscoroutine(init_response):
--> 525 response = await init_response
526 else:
File ~/.airconda-environments/production--ml_infra--redspot--default--v0.4.21/lib/python3.12/site-packages/litellm/llms/bedrock/chat/converse_handler.py:144, in BedrockConverseLLM.async_streaming(self, model, messages, api_base, model_response, timeout, encoding, logging_obj, stream, optional_params, litellm_params, credentials, logger_fn, headers, client, fake_stream, json_mode)
134 logging_obj.pre_call(
135 input=messages,
136 api_key="",
(...)
141 },
142 )
--> 144 completion_stream = await make_call(
145 client=client,
146 api_base=api_base,
147 headers=dict(prepped.headers),
148 data=data,
149 model=model,
150 messages=messages,
151 logging_obj=logging_obj,
152 fake_stream=fake_stream,
153 json_mode=json_mode,
154 )
155 streaming_response = CustomStreamWrapper(
156 completion_stream=completion_stream,
157 model=model,
158 custom_llm_provider="bedrock",
159 logging_obj=logging_obj,
160 )
File ~/.airconda-environments/production--ml_infra--redspot--default--v0.4.21/lib/python3.12/site-packages/litellm/llms/bedrock/chat/invoke_handler.py:263, in make_call(client, api_base, headers, data, model, messages, logging_obj, fake_stream, json_mode, bedrock_invoke_provider)
262 error_code = err.response.status_code
--> 263 raise BedrockError(status_code=error_code, message=err.response.text)
264 except httpx.TimeoutException:
BedrockError: {"message":"The model returned the following errors: tools.0.custom.input_schema: JSON schema is invalid. It must match JSON Schema draft 2020-12 (https://json-schema.org/draft/2020-12). Learn more about tool use at https://docs.anthropic.com/en/docs/tool-use."}
During handling of the above exception, another exception occurred:
BadRequestError Traceback (most recent call last)
Cell In[8], line 9
4 session_prompt = (
5 f"Provide initial analysis of the user profile and activities, "
6 f"focusing on identifying any signs of ATO DP. "
7 )
8 start_time = time.time()
----> 9 res = run_ato_dp_agent(ato_dp_agent, BAD_ACTOR_HOST_USER_ID, session_prompt)
10 end_time = time.time()
11 logging.info(f"Agent run time: {end_time - start_time:.2f} seconds")
Cell In[7], line 24
21 logging.info(f"Complete session prompt for agent:\n{complete_prompt}")
23 # 3. Run agent
---> 24 res = agent(complete_prompt)
25 return res
...
976 )
977 elif original_exception.status_code == 404:
978 exception_mapping_worked = True
BadRequestError: litellm.BadRequestError: BedrockException - {"message":"The model returned the following errors: tools.0.custom.input_schema: JSON schema is invalid. It must match JSON Schema draft 2020-12 (https://json-schema.org/draft/2020-12). Learn more about tool use at https://docs.anthropic.com/en/docs/tool-use."}
Additional Context
No response
Possible Solution
One potential issue is that the type of attribute_name seems to be declared twice, and the combination looks wrong: the allOf $ref already resolves to the string-typed UserAttributeName enum, yet a sibling type: string is emitted next to it:
allOf:
- $ref: '#/$defs/UserAttributeName'
description: The name of the attribute to retrieve history for.
type: string
For reference, the full generated tool spec:
tools:
- function:
description: |-
Retrieve the history of changes of a specific attribute for a user.
Args:
attribute_name (UserAttributeName): The name of the attribute to retrieve history for.
agent (Agent): The agent instance containing the user ID and analysis end timestamp in its state.
Returns:
list[dict]: A list of dictionaries representing the history of changes of the specified attribute.
name: get_user_attribute_history
parameters:
$defs:
UserAttributeName:
description: |-
Attribute
enum:
- EMAIL
- PHONE
title: UserAttributeName
type: string
properties:
attribute_name:
allOf:
- $ref: '#/$defs/UserAttributeName'
description: The name of the attribute to retrieve history for.
type: string
required:
- attribute_name
type: object
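If that duplicated type declaration is what fails Anthropic's draft 2020-12 check, one possible client-side workaround is to flatten the single-element allOf/$ref wrapper before the spec is sent. The helper below is a hedged sketch, not an existing Strands or litellm API:

# Illustrative helper (hypothetical): inline {'allOf': [{'$ref': '#/$defs/X'}]}
# nodes so each property declares its type exactly once.
import copy


def inline_allof_refs(schema: dict) -> dict:
    """Return a copy of `schema` with single-$ref allOf wrappers replaced by
    the referenced $defs entry; sibling keys (e.g. the field-level
    description) override the definition's own keys."""
    defs = schema.get("$defs", {})

    def resolve(node):
        if isinstance(node, dict):
            all_of = node.get("allOf")
            if isinstance(all_of, list) and len(all_of) == 1 and "$ref" in all_of[0]:
                ref_name = all_of[0]["$ref"].rsplit("/", 1)[-1]
                merged = copy.deepcopy(defs.get(ref_name, {}))
                merged.update({k: v for k, v in node.items() if k != "allOf"})
                return {key: resolve(value) for key, value in merged.items()}
            return {key: resolve(value) for key, value in node.items()}
        if isinstance(node, list):
            return [resolve(item) for item in node]
        return node

    flattened = resolve(copy.deepcopy(schema))
    flattened.pop("$defs", None)  # all single-$ref wrappers were inlined above
    return flattened

Applied to the parameters block above, attribute_name comes out as a plain inline enum (type: string, enum: [EMAIL, PHONE]) with a single type declaration. Where to hook such a transform in (Strands' tool-spec generation or a pre-call hook) depends on the stack.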
Related Issues
No response