Try an interactive version of this dialog: Sign up at solve.it.com, click Upload, and pass this URL.
Whoops! An error (litellm.ServiceUnavailableError: litellm.MidStreamFallbackError: litellm.BadRequestError: Vertex_ai_betaException BadRequestError - b'{\n "error": {\n "code": 400,\n "message": "* GenerateContentRequest.tools[0].function_declarations[28].parameters.properties[types].any_of[1].items: missing field.\n* GenerateContentRequest.tools[0].function_declarations[29].parameters.properties[types].any_of[1].items: missing field.\n",\n "status": "INVALID_ARGUMENT"\n }\n}\n') occurred while processing your request. If this problem persists, please contact us on Discord. Please include your dialog url and error info in your message.
Traceback (most recent call last):
  File "/usr/local/lib/python3.12/site-packages/litellm/llms/vertex_ai/gemini/vertex_and_google_ai_studio_gemini.py", line 2115, in make_call
    response = await client.post(
               ^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/litellm/litellm_core_utils/logging_utils.py", line 190, in async_wrapper
    result = await func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/litellm/llms/custom_httpx/http_handler.py", line 464, in post
    raise e
  File "/usr/local/lib/python3.12/site-packages/litellm/llms/custom_httpx/http_handler.py", line 420, in post
    response.raise_for_status()
  File "/usr/local/lib/python3.12/site-packages/httpx/_models.py", line 829, in raise_for_status
    raise HTTPStatusError(message, request=request, response=self)
httpx.HTTPStatusError: Client error '400 Bad Request' for url 'https://generativelanguage.googleapis.com/v1alpha/models/gemini-3-flash-preview:streamGenerateContent?key=REDACTED&alt=sse'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/400
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/usr/local/lib/python3.12/site-packages/litellm/litellm_core_utils/streaming_handler.py", line 1812, in __anext__
    await self.fetch_stream()
  File "/usr/local/lib/python3.12/site-packages/litellm/litellm_core_utils/streaming_handler.py", line 1796, in fetch_stream
    self.completion_stream = await self.make_call(
                             ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/litellm/llms/vertex_ai/gemini/vertex_and_google_ai_studio_gemini.py", line 2121, in make_call
    raise VertexAIError(
litellm.llms.vertex_ai.common_utils.VertexAIError: b'{\n "error": {\n "code": 400,\n "message": "* GenerateContentRequest.tools[0].function_declarations[28].parameters.properties[types].any_of[1].items: missing field.\\n* GenerateContentRequest.tools[0].function_declarations[29].parameters.properties[types].any_of[1].items: missing field.\\n",\n "status": "INVALID_ARGUMENT"\n }\n}\n'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/usr/local/lib/python3.12/site-packages/litellm/litellm_core_utils/streaming_handler.py", line 2003, in __anext__
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2340, in exception_type
    raise e
  File "/usr/local/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 1359, in exception_type
    raise BadRequestError(
litellm.exceptions.BadRequestError: litellm.BadRequestError: Vertex_ai_betaException BadRequestError - b'{\n "error": {\n "code": 400,\n "message": "* GenerateContentRequest.tools[0].function_declarations[28].parameters.properties[types].any_of[1].items: missing field.\\n* GenerateContentRequest.tools[0].function_declarations[29].parameters.properties[types].any_of[1].items: missing field.\\n",\n "status": "INVALID_ARGUMENT"\n }\n}\n'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/root/solveit/solveit/aimsg.py", line 444, in run_ai
    return await _astream_to_msg(rs,msg,inc_use)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/solveit/solveit/aimsg.py", line 157, in _astream_to_msg
    async for o in AsyncStreamFormatter(inc_use, mx=600, debug=False).format_stream(rs):
  File "/usr/local/lib/python3.12/site-packages/lisette/core.py", line 592, in format_stream
    async for o in rs: yield self.format_item(o)
  File "/usr/local/lib/python3.12/site-packages/lisette/core.py", line 487, in _call
    async for chunk in res: yield chunk
  File "/usr/local/lib/python3.12/site-packages/lisette/core.py", line 465, in astream_with_complete
    async for chunk in agen:
  File "/usr/local/lib/python3.12/site-packages/litellm/litellm_core_utils/streaming_handler.py", line 2013, in __anext__
    raise MidStreamFallbackError(
litellm.exceptions.MidStreamFallbackError: litellm.ServiceUnavailableError: litellm.MidStreamFallbackError: litellm.BadRequestError: Vertex_ai_betaException BadRequestError - b'{\n "error": {\n "code": 400,\n "message": "* GenerateContentRequest.tools[0].function_declarations[28].parameters.properties[types].any_of[1].items: missing field.\\n* GenerateContentRequest.tools[0].function_declarations[29].parameters.properties[types].any_of[1].items: missing field.\\n",\n "status": "INVALID_ARGUMENT"\n }\n}\n' Original exception: BadRequestError: litellm.BadRequestError: Vertex_ai_betaException BadRequestError - b'{\n "error": {\n "code": 400,\n "message": "* GenerateContentRequest.tools[0].function_declarations[28].parameters.properties[types].any_of[1].items: missing field.\\n* GenerateContentRequest.tools[0].function_declarations[29].parameters.properties[types].any_of[1].items: missing field.\\n",\n "status": "INVALID_ARGUMENT"\n }\n}\n'
try: print(Chat('gemini/gemini-3-flash-preview', tools=[symfiles_package])('hi'))
except Exception as e: print(e)
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm._turn_on_debug()'.
litellm.BadRequestError: GeminiException BadRequestError - {
  "error": {
    "code": 400,
    "message": "* GenerateContentRequest.tools[0].function_declarations[0].parameters.properties[types].any_of[1].items: missing field.\n",
    "status": "INVALID_ARGUMENT"
  }
}
try: print(Chat('gemini/gemini-3-flash-preview', tools=[test_func])('hi'))
except Exception as e: print(e)
litellm.BadRequestError: GeminiException BadRequestError - {
  "error": {
    "code": 400,
    "message": "* GenerateContentRequest.tools[0].function_declarations[0].parameters.properties[tags].any_of[1].items: missing field.\n",
    "status": "INVALID_ARGUMENT"
  }
}
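Both repros fail the same way: a parameter annotated as an optional list is rendered as an `any_of` whose array arm has `"type": "array"` but no `items`, and Gemini's request validation rejects any array schema without an `items` field. A minimal stdlib sketch of where that `any_of` comes from (the schema dicts here are illustrative, not toolslm's actual output):

```python
from typing import Optional, get_args, get_origin

# A parameter annotated `list[str] | None` decomposes into two union arms,
# which become the two entries of the any_of in the generated schema.
ann = Optional[list[str]]
arms = get_args(ann)                  # (list[str], NoneType)
assert get_origin(arms[0]) is list    # the array arm that needs 'items'
assert arms[1] is type(None)          # the 'null' arm

# The array arm as emitted before the fix -- rejected with INVALID_ARGUMENT:
bad_arm = {"type": "array"}
# Gemini requires 'items' on every array schema; an empty object means
# "items of any type", which is the least restrictive valid choice:
fixed_arm = {"type": "array", "items": {}}

param_schema = {"any_of": [fixed_arm, {"type": "null"}]}
print(param_schema)
```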
import typing
from typing import get_origin
get_origin(list[str]), get_origin(list), get_origin(list[typing.Any]), get_origin(dict[str, int])
from fastcore.basics import ifnone
from toolslm.funccall import NoneType, custom_types, inspect, _types, _get_nested_schema
def _handle_type(t, defs):
    "Handle a single type, creating nested schemas if necessary"
    ot = ifnone(get_origin(t), t)
    if t is NoneType: return {'type': 'null'}
    if t in custom_types: return {'type':'string', 'format':t.__name__}
    if ot is dict: return {'type': _types(t)[0]}
    if ot in (list, tuple, set): return {'type': _types(t)[0], 'items':{}} # <- this fixes the issue: Gemini requires 'items' on every array schema
    if (isinstance(t, type) and not issubclass(t, (int, float, str, bool))) or inspect.isfunction(t):
        defs[t.__name__] = _get_nested_schema(t)
        return {'$ref': f'#/$defs/{t.__name__}'}
    return {'type': _types(t)[0]}
import toolslm.funccall
toolslm.funccall._handle_type=_handle_type
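The patched branch can be checked without hitting the API: feed a list annotation through it and confirm the array schema now carries `items`. Since toolslm may not be installed where you read this, here is a self-contained sketch of just the fixed branch's behavior (`_JSON` below is a stand-in for toolslm's `_types` helper, not the real thing):

```python
from typing import get_origin

# Stand-in mapping from Python container type to JSON-Schema type name:
_JSON = {list: 'array', tuple: 'array', set: 'array'}

def handle_container(t):
    "Sketch of the patched branch: array-like types always get an 'items' field."
    ot = get_origin(t) or t       # unwrap list[str] -> list, leave bare list as-is
    if ot in _JSON: return {'type': _JSON[ot], 'items': {}}  # <- the fix
    raise TypeError(f'not a container: {t!r}')

print(handle_container(list[str]))  # {'type': 'array', 'items': {}}
print(handle_container(set))        # {'type': 'array', 'items': {}}
```

Because the patched function is assigned onto `toolslm.funccall`, it takes effect for every schema toolslm generates afterwards, which is why the retry below succeeds.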
hello! how can I help you today?
- id: ETRxaZTQGf7P_uMP2e7HmQQ
- model: gemini-3-flash-preview
- finish_reason: stop
- usage: Usage(completion_tokens=9, prompt_tokens=61, total_tokens=70, completion_tokens_details=None, prompt_tokens_details=PromptTokensDetailsWrapper(audio_tokens=None, cached_tokens=None, text_tokens=61, image_tokens=None))
c=Chat('anthropic/claude-haiku-4-5', tools=[symfiles_package])
c('hi do u see symfiles_package')
Yes, I can see the symfiles_package function! It's available to me as a tool.
This function returns XML context of all files in a package specified by the sym parameter. It has lots of useful options like:
- sym (required): The dotted symbol path or "_last" for previous result
- types: Filter by file types (py, js, java, c, cpp, rb, r, ex, sh, web, doc, cfg)
- file_glob/file_re: Filter files by glob pattern or regex
- folder_re: Only enter folders matching regex
- recursive: Search subfolders (default: true)
- max_size: Skip files larger than this
- sigs_only: Return just signatures instead of full text
- And many more options for customizing the output
Would you like me to use it to explore a specific package or symbol? Just let me know the sym path you'd like to examine!
- id: chatcmpl-526d1995-5467-4ce7-b519-136cbab56d10
- model: claude-haiku-4-5-20251001
- finish_reason: stop
- usage: Usage(completion_tokens=238, prompt_tokens=1284, total_tokens=1522, completion_tokens_details=None, prompt_tokens_details=PromptTokensDetailsWrapper(audio_tokens=None, cached_tokens=0, text_tokens=None, image_tokens=None, cache_creation_tokens=0, cache_creation_token_details=CacheCreationTokenDetails(ephemeral_5m_input_tokens=0, ephemeral_1h_input_tokens=0)), cache_creation_input_tokens=0, cache_read_input_tokens=0)