Interface BaseModelParameters
- All Superinterfaces:
Serializable
- All Known Implementing Classes:
OpenAIModelParameters
Base interface for defining model-specific parameters used to configure remote inference clients.
Implementations of this interface encapsulate the configuration required to initialize and communicate with a remote model inference service. This typically includes:
- Authentication credentials (API keys, tokens)
- Model identifiers or names
- Endpoint URLs or connection settings
- Inference configuration (temperature, max tokens, timeout values, etc.)
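The contract described above might be sketched as follows; the accessor names here are illustrative assumptions, since the actual method set of the interface is not shown on this page:

```java
import java.io.Serializable;

/**
 * Sketch of a base contract for remote-inference client configuration.
 * Method names are assumptions for illustration, not the documented API.
 */
interface BaseModelParameters extends Serializable {
    String getModelName();     // model identifier or name
    String getEndpointUrl();   // endpoint URL for the inference service
    int getTimeoutMillis();    // request timeout (inference configuration)
}
```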
Implementations must be serializable so that parameter objects can be persisted or transmitted between processes. For parameter objects with many optional fields, consider using the builder pattern.
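One way the builder suggestion could look in practice is sketched below. This is a hypothetical implementing class, not the actual OpenAIModelParameters; its field names and defaults are assumptions chosen to mirror the parameter categories listed above:

```java
import java.io.Serializable;

/** Hypothetical parameter object illustrating the builder pattern (field names are assumptions). */
public final class ExampleModelParameters implements Serializable {
    private static final long serialVersionUID = 1L;

    private final String apiKey;       // authentication credential
    private final String modelName;    // model identifier
    private final double temperature;  // inference configuration
    private final int maxTokens;

    private ExampleModelParameters(Builder b) {
        this.apiKey = b.apiKey;
        this.modelName = b.modelName;
        this.temperature = b.temperature;
        this.maxTokens = b.maxTokens;
    }

    public String getModelName() { return modelName; }
    public double getTemperature() { return temperature; }
    public int getMaxTokens() { return maxTokens; }

    public static Builder builder() { return new Builder(); }

    public static final class Builder {
        private String apiKey;
        private String modelName;
        private double temperature = 1.0;  // assumed defaults for optional fields
        private int maxTokens = 1024;

        public Builder apiKey(String v) { this.apiKey = v; return this; }
        public Builder modelName(String v) { this.modelName = v; return this; }
        public Builder temperature(double v) { this.temperature = v; return this; }
        public Builder maxTokens(int v) { this.maxTokens = v; return this; }

        public ExampleModelParameters build() {
            // Validate required fields before constructing the immutable object
            if (modelName == null) throw new IllegalStateException("modelName is required");
            return new ExampleModelParameters(this);
        }
    }
}
```

With this shape, callers set only the fields they need and receive an immutable, serializable object:

```java
ExampleModelParameters params = ExampleModelParameters.builder()
        .apiKey("...")
        .modelName("example-model")
        .temperature(0.2)
        .build();
```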