Conversation
private static OpenAIClient buildClient(@Nonnull final HttpDestination destination) {
  final DefaultHttpDestination defaultDest = (DefaultHttpDestination) destination;
  final String baseUrl = defaultDest.getUri().toString() + "v1/";
  log.debug("Building OpenAI client with base URL: {}", baseUrl);

  return OpenAIOkHttpClient.builder()
      .baseUrl(baseUrl)
      .credential(BearerTokenCredential.create(() -> extractBearerToken(defaultDest)))
      .putHeader("AI-Resource-Group", getResourceGroupHeader(defaultDest))
      .build();
}
(Major)
Afaik, the OpenAI SDK does not handle token refresh. So effectively, the obtained instance of OpenAIOkHttpClient also "expires", which was the concern I had when I started out here.
That is why I added an adapter (a custom com.openai.core.http.HttpClient) in my PR, since we have a familiar Apache client that handles destination logic (incl. token refresh) (#794).
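The expiry concern above can also be sketched as a supplier-backed token cache: the token is re-fetched shortly before it expires, so a long-lived client instance keeps seeing valid tokens. This is a minimal illustration, not the actual adapter from #794; `CachingTokenSupplier` and the constructor parameters are hypothetical names, and whether the SDK's `BearerTokenCredential` invokes its supplier lazily per request is an assumption worth verifying.

```java
import java.time.Duration;
import java.time.Instant;
import java.util.function.Supplier;

// Hypothetical sketch: caches a short-lived token and re-fetches it shortly
// before expiry, so every request sees a valid bearer token even though the
// client instance itself is long-lived.
final class CachingTokenSupplier implements Supplier<String> {
    private final Supplier<String> fetchToken; // e.g. a destination token lookup
    private final Duration ttl;                // assumed token lifetime
    private final Duration margin;             // refresh this long before expiry
    private String cached;
    private Instant fetchedAt = Instant.MIN;

    CachingTokenSupplier(Supplier<String> fetchToken, Duration ttl, Duration margin) {
        this.fetchToken = fetchToken;
        this.ttl = ttl;
        this.margin = margin;
    }

    @Override
    public synchronized String get() {
        // Re-fetch when no token is cached yet, or when we are inside the
        // refresh margin before the assumed expiry time.
        if (cached == null || Instant.now().isAfter(fetchedAt.plus(ttl).minus(margin))) {
            cached = fetchToken.get();
            fetchedAt = Instant.now();
        }
        return cached;
    }
}
```

Passing such a supplier into `BearerTokenCredential.create(...)` would mean the token is re-evaluated per request rather than frozen at build time, assuming the credential calls the supplier on each request.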
We are also good with the wrapper strategy on the Java side, even better since model and api-version will soon be unnecessary in v1.
That said, we are currently still considering releasing a Responses API specific wrapper (check out #819).
With a regular client instance, users can do the following:

// supported
client.chat().completions()
client.responses()
// unsupported
client.completions()
client.realtime()
client.audio()

Ideally, we want to limit the exposed API to what is supported.
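A minimal sketch of that limiting wrapper: delegate only the supported services, so unsupported calls fail at compile time rather than at runtime. The interface names below are hypothetical stand-ins for the SDK's service types, not its real API.

```java
// Hypothetical stand-ins for the SDK's service types.
interface ChatService { String complete(String prompt); }
interface ResponsesService { String create(String input); }

// Hypothetical full client surface: it exposes more than we support.
interface FullClient {
    ChatService chat();
    ResponsesService responses();
    Object audio();      // unsupported in our wrapper
    Object realtime();   // unsupported in our wrapper
}

// Wrapper exposing only the supported surface; everything else is simply absent.
final class SupportedClient {
    private final FullClient delegate;

    SupportedClient(FullClient delegate) {
        this.delegate = delegate;
    }

    ChatService chat() { return delegate.chat(); }
    ResponsesService responses() { return delegate.responses(); }
    // no audio(), no realtime(): those calls do not compile against this type
}
```

The design choice here is plain delegation rather than subclassing, so the wrapper's public surface is exactly the supported subset and nothing leaks through inheritance.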
Context
AI/ai-sdk-java-backlog#ISSUENUMBER.
Try the sample app and call:
The model call in the OpenAI official client is not nice, I agree; perhaps we can find a way in Java to hide it.
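One Java-side way to hide it is to pre-bind the model once in a thin wrapper, so call sites never repeat it. This is only a sketch of the idea; `ChatApi` is a hypothetical stand-in, not the SDK's actual chat interface.

```java
// Hypothetical chat interface where the caller must pass the model every time.
interface ChatApi {
    String complete(String model, String prompt);
}

// Wrapper that fixes the model once at construction time.
final class ModelBoundChat {
    private final ChatApi delegate;
    private final String model;

    ModelBoundChat(ChatApi delegate, String model) {
        this.delegate = delegate;
        this.model = model;
    }

    // Call sites only supply the prompt; the bound model is filled in here.
    String complete(String prompt) {
        return delegate.complete(model, prompt);
    }
}
```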
Ignore the pom.xml if the changes contain stupid stuff, as they were AI generated, but they work :)

Feature scope:
Definition of Done
Aligned changes with the JavaScript SDK