Custom Models Integration
KoreShield can proxy any OpenAI-compatible API endpoint. This is useful for self-hosted models, gateways, and third-party providers that expose OpenAI-style endpoints.
Setup
Identify Your API
Ensure your custom model API is OpenAI-compatible (i.e., it supports /v1/chat/completions).
Basic Request
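As a minimal sketch of the request shape, the payload below is a standard OpenAI-style chat completion body. The base URL, the provider name, and the model name "custom-model" are placeholders for illustration, not values defined by KoreShield:

```python
import json

# Assumption: KoreShield listens locally and forwards /v1/chat/completions
# to your custom provider; adjust the URL and model to your deployment.
KORESHIELD_BASE_URL = "http://localhost:8080/v1"  # hypothetical

def build_chat_request(model: str, user_message: str) -> dict:
    """Build an OpenAI-style chat completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }

payload = build_chat_request("custom-model", "Hello!")
print(json.dumps(payload, indent=2))
# Send with any HTTP client, e.g.:
#   POST {KORESHIELD_BASE_URL}/chat/completions
#   Authorization: Bearer <your key>
```

The same payload works unchanged against any endpoint that implements the OpenAI chat completions contract, which is what makes the proxy transparent to clients.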
Streaming
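OpenAI-compatible streaming sets "stream": true in the request and returns Server-Sent Events, where each data: line carries a JSON chunk and the stream ends with a data: [DONE] sentinel. A minimal parser sketch (the chunk structure shown follows the standard OpenAI delta format; KoreShield is assumed to pass it through unchanged):

```python
import json

def parse_sse_chunk(line: str):
    """Parse one Server-Sent Events line from an OpenAI-style stream.

    Returns the decoded JSON chunk, or None for the final [DONE]
    sentinel and for non-data lines (blank keep-alives, comments).
    """
    line = line.strip()
    if not line.startswith("data:"):
        return None
    data = line[len("data:"):].strip()
    if data == "[DONE]":
        return None
    return json.loads(data)

# Example: extract the incremental text from a delta chunk.
chunk = parse_sse_chunk('data: {"choices": [{"delta": {"content": "Hel"}}]}')
print(chunk["choices"][0]["delta"]["content"])  # → Hel
print(parse_sse_chunk("data: [DONE]"))          # → None
```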
Compatibility Notes
- The environment variable must match the provider name (uppercased + _API_KEY)
- If the upstream API expects additional headers, configure them in your gateway or extend the provider adapter
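The naming rule above can be sketched as a one-line mapping. The provider name "acme" is a placeholder, and how non-alphanumeric characters in provider names are normalized is an assumption to verify against your KoreShield version:

```python
def api_key_env_var(provider: str) -> str:
    """Derive the expected env var: provider name uppercased + _API_KEY."""
    return provider.upper() + "_API_KEY"

print(api_key_env_var("acme"))  # → ACME_API_KEY
# Export it before starting the proxy:
#   export ACME_API_KEY=sk-...
```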
Error Handling
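Assuming the proxy passes through OpenAI-style error bodies (a JSON object with an "error" field containing message/type/code), a client can extract the message and decide whether to retry. The retryable status set below is a common convention, not a KoreShield-defined list:

```python
import json

RETRYABLE_STATUS = {429, 500, 502, 503, 504}  # rate limits and server errors

def classify_error(status: int, body: str):
    """Return (message, retryable) for a failed response."""
    try:
        message = json.loads(body)["error"]["message"]
    except (ValueError, KeyError, TypeError):
        message = body  # upstream returned a non-standard error body
    return message, status in RETRYABLE_STATUS

msg, retry = classify_error(
    429, '{"error": {"message": "Rate limit exceeded", "type": "rate_limit_error"}}'
)
print(msg, retry)  # → Rate limit exceeded True
```

Non-JSON error bodies from the upstream fall back to the raw text, so the caller always gets something loggable.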
Next Steps
Configuration
Configure providers and security settings
OpenAI Integration
Review OpenAI-compatible routing