# Interface: SelfHostedInferenceConfig

Defined in: inference/SelfHostedInferenceProvider.ts:27

Self-hosted provider configuration.
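Putting the properties documented below together, the interface has roughly the following shape. This is a sketch reconstructed from this page, not the source file; `InferenceProviderConfig` is reproduced only with the members this page mentions.

```typescript
// Parent interface, reconstructed from the "Inherited from" and
// "Overrides" entries on this page (not the full definition).
interface InferenceProviderConfig {
  apiKey?: string;
  baseUrl?: string;
  model?: string;
  timeout?: number;
  options?: Record<string, unknown>;
}

// Self-hosted provider configuration: baseUrl becomes required,
// and endpoint-specific fields are added.
interface SelfHostedInferenceConfig extends InferenceProviderConfig {
  baseUrl: string;                        // required here
  model?: string;
  customHeaders?: Record<string, string>;
  openaiCompatible?: boolean;
}

// A minimal valid value: only baseUrl is mandatory.
const minimal: SelfHostedInferenceConfig = {
  baseUrl: "http://localhost:8080/v1", // placeholder URL
};
```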
## Extends

- `InferenceProviderConfig`
## Properties
### apiKey?

> `optional` **apiKey**: `string`

Defined in: types/inference.ts:45

API key for authentication.

#### Inherited from

`InferenceProviderConfig.apiKey`
### baseUrl

> **baseUrl**: `string`

Defined in: inference/SelfHostedInferenceProvider.ts:29

Base URL for the API endpoint (required).

#### Overrides

`InferenceProviderConfig.baseUrl`
### customHeaders?

> `optional` **customHeaders**: `Record<string, string>`

Defined in: inference/SelfHostedInferenceProvider.ts:35

Custom headers to include with requests.
### model?

> `optional` **model**: `string`

Defined in: inference/SelfHostedInferenceProvider.ts:32

Model name.

#### Overrides

`InferenceProviderConfig.model`
### openaiCompatible?

> `optional` **openaiCompatible**: `boolean`

Defined in: inference/SelfHostedInferenceProvider.ts:38

Whether the endpoint is OpenAI-compatible.
### options?

> `optional` **options**: `Record<string, unknown>`

Defined in: types/inference.ts:57

Additional provider-specific options.

#### Inherited from

`InferenceProviderConfig.options`
### timeout?

> `optional` **timeout**: `number`

Defined in: types/inference.ts:54

Request timeout in milliseconds.

#### Inherited from

`InferenceProviderConfig.timeout`
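A fully populated configuration might look like the sketch below. All values (URL, model name, header, key) are placeholders, and the interface is redeclared locally so the example is self-contained; only the shape follows this page.

```typescript
// Minimal local declaration matching the properties documented above.
interface SelfHostedInferenceConfig {
  baseUrl: string;                        // required
  apiKey?: string;
  model?: string;
  customHeaders?: Record<string, string>;
  openaiCompatible?: boolean;
  timeout?: number;
  options?: Record<string, unknown>;
}

// Hypothetical example values.
const config: SelfHostedInferenceConfig = {
  baseUrl: "http://localhost:11434/v1",   // placeholder endpoint
  model: "my-local-model",                // placeholder model name
  apiKey: "sk-local-placeholder",         // inherited, optional
  customHeaders: { "X-Request-Source": "docs-example" },
  openaiCompatible: true,                 // endpoint speaks the OpenAI API shape
  timeout: 30_000,                        // 30 s request timeout, in ms
};
```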