## Overview

The Models endpoint provides information about available models on the server. It follows the OpenAI API specification.

## Endpoint

`GET /v1/models`
## Request

No parameters required.

## Response

### Response Body
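A representative response body, following the OpenAI-compatible shape described under Response Fields. The `created` value here is illustrative; the server generates it at request time.

```json
{
  "object": "list",
  "data": [
    {
      "id": "gpt-3.5-turbo",
      "object": "model",
      "created": 1700000000,
      "owned_by": "owner",
      "root": null,
      "parent": null,
      "permission": null
    }
  ]
}
```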
### Response Fields

| Field | Description |
|---|---|
| `object` | Always `"list"` for this endpoint |
| `data` | Array of model objects |
| `data[].id` | Model identifier (currently always `"gpt-3.5-turbo"`) |
| `data[].object` | Object type: `"model"` |
| `data[].created` | Unix timestamp of when the model information was created |
| `data[].owned_by` | Entity that owns the model (default: `"owner"`) |
| `data[].root` | Root model identifier (currently `null`) |
| `data[].parent` | Parent model identifier (currently `null`) |
| `data[].permission` | Model permissions (currently `null`) |

## Python Example
### Using requests
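A minimal sketch using the `requests` library. The base URL `http://localhost:8000` and the helper names `extract_model_ids` / `list_models` are assumptions for illustration, not part of the server's API; adjust the address to match your deployment.

```python
import requests

BASE_URL = "http://localhost:8000"  # assumption: adjust to your server's address

def extract_model_ids(payload):
    """Pull the model IDs out of an OpenAI-style model-list payload."""
    return [model["id"] for model in payload.get("data", [])]

def list_models(base_url=BASE_URL):
    """GET /v1/models and return the parsed JSON body."""
    resp = requests.get(f"{base_url}/v1/models", timeout=10)
    resp.raise_for_status()
    return resp.json()

# Example usage (requires a running server):
# print(extract_model_ids(list_models()))
```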
### Using OpenAI SDK
## JavaScript Example
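A sketch using the global `fetch` available in Node 18+ and browsers. The base URL and function names are assumptions for illustration.

```javascript
const BASE_URL = "http://localhost:8000"; // assumption: adjust to your server

// Pull the model IDs out of an OpenAI-style model-list payload.
function extractModelIds(payload) {
  return (payload.data || []).map((model) => model.id);
}

// GET /v1/models and return the parsed JSON body.
async function listModels(baseUrl = BASE_URL) {
  const resp = await fetch(`${baseUrl}/v1/models`);
  if (!resp.ok) throw new Error(`HTTP ${resp.status}`);
  return resp.json();
}

// Example usage (requires a running server):
// listModels().then((payload) => console.log(extractModelIds(payload)));
```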
## cURL Example
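A one-line request with `curl`. The host and port are assumptions; the optional `jq` filter is only for readability and requires `jq` to be installed.

```shell
# Assumption: the server is listening on localhost:8000; change the
# host/port to match your deployment.
curl -s http://localhost:8000/v1/models

# Optionally print just the model IDs (requires jq):
# curl -s http://localhost:8000/v1/models | jq -r '.data[].id'
```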
## Notes
- The endpoint currently returns a single model entry with ID `"gpt-3.5-turbo"`, regardless of which Qwen model is loaded
- This is for OpenAI API compatibility; the actual model serving requests is the Qwen model specified when starting the server
- The `created` timestamp is generated at request time
- The model list does not reflect the actual checkpoint path used to start the server
## Model Information

To determine which Qwen model is actually running, check the server startup logs. The server loads the checkpoint passed via `--checkpoint-path`, but the API will report the model as `"gpt-3.5-turbo"` for compatibility.