Conversation

@geoand (Collaborator) commented Nov 23, 2025

- Add supportedCapabilities() method to both AzureOpenAiChatModel and AzureOpenAiStreamingChatModel
- Return RESPONSE_FORMAT_JSON_SCHEMA capability when responseFormat type is JSON
- Support request-level ResponseFormat override from ChatRequest
- Enables structured JSON output with schema validation for Azure OpenAI models (e.g., gpt-4o-2024-08-06)

Fixes #1953
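The capability declaration described in the bullet points can be sketched as follows. Note that the enum and method names below are simplified stand-ins for the langchain4j `Capability` / `ResponseFormatType` API, not the real classes; this only illustrates the intended logic.

```java
import java.util.HashSet;
import java.util.Set;

public class CapabilityCheckSketch {

    // Hypothetical stand-ins for the langchain4j types (illustration only)
    enum ResponseFormatType { TEXT, JSON_SCHEMA }
    enum Capability { RESPONSE_FORMAT_JSON_SCHEMA }

    // Mirrors the PR's idea: advertise JSON-schema support only when the
    // model was configured with a JSON-schema response format
    static Set<Capability> supportedCapabilities(ResponseFormatType configured) {
        Set<Capability> caps = new HashSet<>();
        if (configured == ResponseFormatType.JSON_SCHEMA) {
            caps.add(Capability.RESPONSE_FORMAT_JSON_SCHEMA);
        }
        return caps;
    }

    public static void main(String[] args) {
        System.out.println(supportedCapabilities(ResponseFormatType.JSON_SCHEMA));
        System.out.println(supportedCapabilities(ResponseFormatType.TEXT));
    }
}
```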
@henkka14 commented

Build errors seem to be caused by a clash between these two types:
https://github.com/langchain4j/langchain4j/blob/main/langchain4j-open-ai/src/main/java/dev/langchain4j/model/openai/internal/chat/ResponseFormatType.java

https://github.com/langchain4j/langchain4j/blob/main/langchain4j-core/src/main/java/dev/langchain4j/model/chat/request/ResponseFormatType.java

In theory, changing ResponseFormatType.JSON to ResponseFormatType.JSON_SCHEMA could work:

        if (this.responseFormat != null && ResponseFormatType.JSON_SCHEMA.equals(this.responseFormat.type())) {
            this.supportedCapabilities.add(RESPONSE_FORMAT_JSON_SCHEMA);
        }

After testing this a bit by setting the Quarkus property
quarkus.langchain4j.azure-openai.chat-model.response-format=json_schema

I received the following:

2025-11-23 11:40:03,763 WARN  [dev.lan.int.RetryUtils] (executor-thread-3) A retriable exception occurred. Remaining retries: 1 of 1: dev.langchain4j.exception.HttpException: {"error":{"message":"Missing required parameter: 'response_format.json_schema'.","type":"invalid_request_error","param":"response_format.json_schema","code":"missing_required_parameter"}}
        at io.quarkiverse.langchain4j.openai.common.OpenAiRestApi.toException(OpenAiRestApi.java:175)
        at io.quarkiverse.langchain4j.openai.common.OpenAiRestApi_toException_ResponseExceptionMapper_f35c1c86580504f69920f9de921a22bd696c020f.toThrowable(Unknown Source)

So there still seems to be a difference in how dev.langchain4j configures the schema option.
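The "missing_required_parameter" error above is consistent with the OpenAI chat-completions wire format: when response_format.type is "json_schema", the request body must also carry a response_format.json_schema object holding the actual schema. A minimal example of the expected shape (the schema name and contents here are placeholders):

```json
{
  "response_format": {
    "type": "json_schema",
    "json_schema": {
      "name": "answer",
      "strict": true,
      "schema": {
        "type": "object",
        "properties": { "answer": { "type": "string" } },
        "required": ["answer"],
        "additionalProperties": false
      }
    }
  }
}
```

So sending type "json_schema" alone, without the nested json_schema object, produces exactly this invalid_request_error.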

@henkka14 commented

Another issue with the following

    public ChatResponse doChat(ChatRequest chatRequest) {
        ResponseFormat requestResponseFormat = this.responseFormat;

        // Handle ChatRequest-level ResponseFormat if provided
        if (chatRequest.responseFormat() != null) {
            requestResponseFormat = chatRequest.responseFormat();
        }

is that the two ResponseFormat types clash:

java/io/quarkiverse/langchain4j/azure/openai/AzureOpenAiChatModel.java:[149,63] incompatible types: dev.langchain4j.model.chat.request.ResponseFormat cannot be converted to dev.langchain4j.model.openai.internal.chat.ResponseFormat

Unfortunately, the toOpenAiResponseFormat method is private and cannot be used directly to convert between the ResponseFormat types:
https://github.com/langchain4j/langchain4j/blob/main/langchain4j-open-ai/src/main/java/dev/langchain4j/model/openai/internal/OpenAiUtils.java#L408

However, the method toOpenAiChatRequest is public and could potentially be used in the doChat method. Still, I was not able to get this to produce structured JSON output (while I can get dev.langchain4j.model.azure.AzureOpenAiChatModel to work with structured JSON output using the same model).

    public ChatResponse doChat(ChatRequest chatRequest) {

        List<ToolSpecification> toolSpecifications = chatRequest.toolSpecifications();

        // TODO: consider using toOpenAiChatRequest?
        // strictTools, strictJsonSchema should be configurable? not default to true?
        ChatCompletionRequest.Builder requestBuilder = toOpenAiChatRequest(
                chatRequest,
                OpenAiChatRequestParameters.builder()
                        .temperature(temperature)
                        .seed(seed)
                        .topP(topP)
                        .maxCompletionTokens(maxTokens)
                        .presencePenalty(presencePenalty)
                        .frequencyPenalty(frequencyPenalty)
                        .responseFormat(chatRequest.responseFormat())
                        .build(),
                true,
                true);

@geoand (Collaborator, Author) commented Nov 24, 2025

The AI experiment I tried obviously failed miserably :)

@geoand (Collaborator, Author) commented Nov 25, 2025

Fixed the compilation errors:

  1. Changed ResponseFormatType.JSON to ResponseFormatType.JSON_SCHEMA: The enum value JSON doesn't exist in dev.langchain4j.model.openai.internal.chat.ResponseFormatType. The correct value for JSON schema support is JSON_SCHEMA.

  2. Removed request-level ResponseFormat override logic: The original code tried to assign dev.langchain4j.model.chat.request.ResponseFormat (from ChatRequest) to dev.langchain4j.model.openai.internal.chat.ResponseFormat (internal OpenAI type), which are incompatible types. The conversion method toOpenAiResponseFormat is private in OpenAiUtils and not accessible.

For now, this PR focuses on the core requirement from #1953: declaring RESPONSE_FORMAT_JSON_SCHEMA capability when the response format is configured with JSON_SCHEMA type. This allows the model to properly advertise JSON schema support.

The request-level ResponseFormat override can be added in a future PR once there's a public API in langchain4j for converting between the two ResponseFormat types.
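Such a future public conversion API could look roughly like the sketch below. The record types are simplified, hypothetical stand-ins for the two clashing ResponseFormat classes (they are not the actual langchain4j types, whose fields are richer); the point is only to show where a public adapter would slot in.

```java
public class ResponseFormatAdapterSketch {

    // Stand-in for dev.langchain4j.model.chat.request.ResponseFormat (hypothetical, simplified)
    record PublicResponseFormat(String type, String schemaJson) {}

    // Stand-in for dev.langchain4j.model.openai.internal.chat.ResponseFormat (hypothetical, simplified)
    record InternalResponseFormat(String type, String jsonSchema) {}

    // A public method like this would remove the need for the private
    // OpenAiUtils.toOpenAiResponseFormat when handling request-level overrides
    static InternalResponseFormat toInternal(PublicResponseFormat rf) {
        if ("JSON".equals(rf.type()) && rf.schemaJson() != null) {
            // JSON with a schema maps to the wire-level "json_schema" format
            return new InternalResponseFormat("json_schema", rf.schemaJson());
        }
        return new InternalResponseFormat("text", null);
    }

    public static void main(String[] args) {
        var in = new PublicResponseFormat("JSON", "{\"type\":\"object\"}");
        System.out.println(toInternal(in).type());
    }
}
```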

@geoand (Collaborator, Author) commented Nov 25, 2025

Let's see how well AI can perform this time 😆

@geoand (Collaborator, Author) commented Nov 25, 2025

@henkka14 can you give it another try?

Merging this pull request may close the linked issue: AzureOpenAiChatModel and issues with Structured JSON schema output