Description
Expected Behavior
I'm using Azure OpenAI and would like to pretty print the JSON exchanged with the LLM to make it easier to read. Note that I want pretty printing of the over-the-wire request, not the higher-level JSON that we can get with the LoggingAdvisor.
Current Behavior
I'm seeing log messages like this:
May 12 09:53 INFO getChatCompletionsSync * {"az.sdk.message":"HTTP request","method":"POST","url":"https://mycompany.openai.azure.com//openai/deployments/gpt-4o-mini/chat/completions?api-version=2025-01-01-preview","tryCount":1,"Date":"Mon, 12 May 2025 04:23:28 GMT","x-ms-client-request-id":"8c945b84-11e5-4234-adea-2261b7f6ab6b","Accept":"application/json","User-Agent":"azsdk-java-azure-ai-openai/1.0.0-beta.16 (21.0.4; Mac OS X; 15.4.1)","redactedHeaders":"api-key,Content-Type","content-length":4039,"body":"{\"messages\":[{\"content\":\"You are a support chatbot for a travel agency.\\n\\nYour job is to answer user's questions related to their travel. Users may or may not be registered. Please determine that. If users are registered, you can use their email address and phone number to validate them and then provide details regarding their travel bookings.",\"role\":\"system\"},{\"role\":\"user\",\"content\":[{\"text\":\"can you help me regarding my booking?\\n\",\"type\":\"text\"}]}],\"temperature\":0.1,\"stream\":false,\"model\":\"gpt-4o-mini\",\"logprobs\":false,\"response_format\":{\"type\":\"json_object\"}}"}
Context
This is with Spring AI 1.0.0-M8.
Spring AI's Azure support uses Azure's Java SDK underneath, which provides the HttpLogOptions class (https://learn.microsoft.com/en-us/java/api/com.azure.core.http.policy.httplogoptions?view=azure-java-stable). That class offers setRequestLogger and setResponseLogger to control logging behaviour, but Spring AI's Azure module does not appear to expose them.
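For reference, here is a minimal sketch of what I'd like to be able to do, assuming the OpenAIClientBuilder is created by hand rather than by Spring AI's Azure auto-configuration. HttpLogOptions, HttpRequestLogger and OpenAIClientBuilder are real Azure SDK types; the pretty-printing logic, the endpoint/apiKey parameters and the class name are illustrative assumptions on my part, not Spring AI API.

```java
// Sketch only: hand-built OpenAIClientBuilder with a custom request logger that
// pretty-prints the outgoing JSON body instead of the single-line "az.sdk.message" dump.
import com.azure.ai.openai.OpenAIClientBuilder;
import com.azure.core.credential.AzureKeyCredential;
import com.azure.core.http.policy.HttpLogDetailLevel;
import com.azure.core.http.policy.HttpLogOptions;
import com.azure.core.http.policy.HttpRequestLogger;
import com.azure.core.util.BinaryData;
import com.fasterxml.jackson.databind.ObjectMapper;
import reactor.core.publisher.Mono;

public class PrettyPrintingOpenAiClient {

    public static OpenAIClientBuilder builder(String endpoint, String apiKey) {
        ObjectMapper mapper = new ObjectMapper();

        // Custom request logger: re-parse the outgoing body and log it pretty-printed.
        // Note: this assumes the request body is replayable BinaryData; a streaming
        // body would need buffering before it can be read here.
        HttpRequestLogger prettyRequestLogger = (logger, context) -> {
            var request = context.getHttpRequest();
            BinaryData body = request.getBodyAsBinaryData();
            try {
                Object json = mapper.readValue(body.toString(), Object.class);
                logger.info("HTTP request {} {}{}{}",
                        request.getHttpMethod(), request.getUrl(),
                        System.lineSeparator(),
                        mapper.writerWithDefaultPrettyPrinter().writeValueAsString(json));
            }
            catch (Exception e) {
                // Body missing or not JSON: fall back to a short summary line.
                logger.info("HTTP request {} {} (body not pretty-printed)",
                        request.getHttpMethod(), request.getUrl());
            }
            return Mono.empty();
        };

        return new OpenAIClientBuilder()
                .endpoint(endpoint)
                .credential(new AzureKeyCredential(apiKey))
                .httpLogOptions(new HttpLogOptions()
                        .setLogLevel(HttpLogDetailLevel.BODY_AND_HEADERS)
                        .setRequestLogger(prettyRequestLogger));
    }
}
```

If the chat model is constructed manually, a builder customized like this could be handed to it, but with the Spring Boot auto-configuration there does not seem to be a hook to customize the HttpLogOptions of the builder it creates, which is essentially what this issue is asking for.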