Agent Runtimes / inference/SelfHostedInferenceProvider / SelfHostedInferenceConfig

Interface: SelfHostedInferenceConfig

Defined in: inference/SelfHostedInferenceProvider.ts:27

Self-hosted provider configuration
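The interface can be sketched from the members documented below. The parent `InferenceProviderConfig` shape here is inferred from the inherited properties on this page and is an approximation, not the package's actual declaration:

```typescript
// Approximate parent shape, inferred from the inherited members on this page.
interface InferenceProviderConfig {
  /** API key for authentication */
  apiKey?: string;
  /** Request timeout in milliseconds */
  timeout?: number;
  /** Additional provider-specific options */
  options?: Record<string, unknown>;
  /** Base URL (optional in the parent; the subtype makes it required) */
  baseUrl?: string;
  /** Model name */
  model?: string;
}

// The interface documented on this page.
interface SelfHostedInferenceConfig extends InferenceProviderConfig {
  /** Base URL for the API endpoint (required) */
  baseUrl: string;
  /** Model name */
  model?: string;
  /** Custom headers */
  customHeaders?: Record<string, string>;
  /** Whether the endpoint is OpenAI-compatible */
  openaiCompatible?: boolean;
}
```

Narrowing the inherited optional `baseUrl?: string` to a required `baseUrl: string` is the standard TypeScript way to make a field mandatory in a subtype.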

Extends

- InferenceProviderConfig
Properties

apiKey?

optional apiKey?: string

Defined in: types/inference.ts:45

API key for authentication

Inherited from

InferenceProviderConfig.apiKey


baseUrl

baseUrl: string

Defined in: inference/SelfHostedInferenceProvider.ts:29

Base URL for the API endpoint (required)

Overrides

InferenceProviderConfig.baseUrl


customHeaders?

optional customHeaders?: Record<string, string>

Defined in: inference/SelfHostedInferenceProvider.ts:35

Custom headers to include in requests to the endpoint


model?

optional model?: string

Defined in: inference/SelfHostedInferenceProvider.ts:32

Model name

Overrides

InferenceProviderConfig.model


openaiCompatible?

optional openaiCompatible?: boolean

Defined in: inference/SelfHostedInferenceProvider.ts:38

Whether the endpoint is OpenAI-compatible


options?

optional options?: Record<string, unknown>

Defined in: types/inference.ts:57

Additional provider-specific options

Inherited from

InferenceProviderConfig.options


timeout?

optional timeout?: number

Defined in: types/inference.ts:54

Request timeout in milliseconds

Inherited from

InferenceProviderConfig.timeout
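A hypothetical configuration for a locally hosted OpenAI-compatible server. The endpoint URL, model name, key, and header values are illustrative only, and the inline type stands in for the real import (whose path is not shown on this page):

```typescript
// Inline stand-in for the documented interface (normally imported from the package).
type SelfHostedInferenceConfig = {
  baseUrl: string;                        // required
  model?: string;
  apiKey?: string;                        // inherited
  timeout?: number;                       // inherited, milliseconds
  options?: Record<string, unknown>;      // inherited
  customHeaders?: Record<string, string>;
  openaiCompatible?: boolean;
};

// Example: point the provider at a self-hosted OpenAI-compatible server.
const config: SelfHostedInferenceConfig = {
  baseUrl: "http://localhost:8000/v1",    // where the local server listens
  model: "llama-3.1-8b-instruct",         // model served by that endpoint
  apiKey: "sk-local-example",             // only needed if the server enforces auth
  timeout: 30_000,                        // 30-second request timeout
  openaiCompatible: true,                 // endpoint speaks the OpenAI wire format
  customHeaders: { "X-Request-Source": "agent-runtime" },
};
```

Only `baseUrl` is required; every other field falls back to the provider's defaults when omitted.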