Class OpenAIModelHandler

java.lang.Object
org.apache.beam.sdk.ml.inference.openai.OpenAIModelHandler
All Implemented Interfaces:
BaseModelHandler<OpenAIModelParameters,OpenAIModelInput,OpenAIModelResponse>

public class OpenAIModelHandler extends Object implements BaseModelHandler<OpenAIModelParameters,OpenAIModelInput,OpenAIModelResponse>
Model handler for OpenAI API inference requests.

This handler manages communication with OpenAI's API, including client initialization, request formatting, and response parsing. It uses OpenAI's structured outputs feature to guarantee that each response is reliably paired with the input that produced it.

Usage


 OpenAIModelParameters params = OpenAIModelParameters.builder()
     .apiKey("sk-...")
     .modelName("gpt-4")
     .instructionPrompt("Classify the following text into one of the categories: {CATEGORIES}")
     .build();

 PCollection<OpenAIModelInput> inputs = ...;
 PCollection<Iterable<PredictionResult<OpenAIModelInput, OpenAIModelResponse>>> results =
     inputs.apply(
         RemoteInference.<OpenAIModelInput, OpenAIModelResponse>invoke()
             .handler(OpenAIModelHandler.class)
             .withParameters(params)
     );
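Each element of the resulting PCollection is an Iterable of PredictionResult pairs, so downstream code can recover which input produced which response. A minimal sketch of unpacking one such batch, using a simplified stand-in for the PredictionResult type (the accessor names here are assumptions for illustration, not the class's confirmed API):

```java
import java.util.ArrayList;
import java.util.List;

public class UnpackResults {
    // Simplified stand-in for PredictionResult<InputT, OutputT>;
    // the input()/output() accessor names are assumptions.
    record PredictionResult<I, O>(I input, O output) {}

    // Flatten a batch of prediction results into "input -> output" strings,
    // preserving the input-response pairing the handler guarantees.
    static List<String> formatPairs(List<PredictionResult<String, String>> batch) {
        List<String> lines = new ArrayList<>();
        for (PredictionResult<String, String> r : batch) {
            lines.add(r.input() + " -> " + r.output());
        }
        return lines;
    }

    public static void main(String[] args) {
        List<PredictionResult<String, String>> batch = List.of(
            new PredictionResult<>("The movie was great", "positive"),
            new PredictionResult<>("Terrible service", "negative"));
        formatPairs(batch).forEach(System.out::println);
    }
}
```

In a real pipeline the equivalent flattening would typically be done inside a DoFn or MapElements transform applied to the results PCollection.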