feat: [OpenAI] PoC - An AiCore wrapper for OpenAi implementation #806
Open

rpanackal wants to merge 32 commits into `feat/poc-openai-responses-apache` from `feat/poc-aicore-openai-wrapper`
Commits (32)
- 27b7e3f Minimal new module setup including spec (rpanackal)
- a2c9456 Generation partial-success (rpanackal)
- aed34ae Remove examples (rpanackal)
- 9e7c90d Successfully filter by path (rpanackal)
- a9a299a Attach spec filter command (rpanackal)
- eaf74ba Initial setup (rpanackal)
- 372769e Successful PoC with OpenAI Models (rpanackal)
- 3726089 Version 1 (rpanackal)
- 01c7f61 Stable api (rpanackal)
- c921a86 Change class name (rpanackal)
- 2491d07 Add tests (rpanackal)
- c004481 fix dependency analyse issues (rpanackal)
- fbff9b0 Initial draft untested (rpanackal)
- 68524b2 Second draft (rpanackal)
- 37ea275 Successful E2E (rpanackal)
- 3bda22a Streaming initial draft (rpanackal)
- 229db60 Streaming E2E with chat completion (rpanackal)
- 7ffe122 isStreaming check simplified (rpanackal)
- 5553e1b Cleanup PoC and rename module (rpanackal)
- 2a55080 Reduce Javadoc verbosity (rpanackal)
- 4ca8af1 Restrict to `/responses` api (rpanackal)
- 7061cbc Cleanup comments (rpanackal)
- 6eb57fc Charles review suggestions (rpanackal)
- 38ced86 Charles review - round 2 suggestions (rpanackal)
- d8e1902 Add dependency (rpanackal)
- dae1906 Mark openai dependency optional and new client `@Beta` (rpanackal)
- 88ae43c Cleanup and no throw on missing model (rpanackal)
- c995bee pmd (rpanackal)
- 145f162 Responses API complete (rpanackal)
- bf29f13 ChatCompletionCreateParams throws without model. Needs rethink client… (rpanackal)
- 96feb25 Cleanup and close with test documenting limitation (rpanackal)
- 2bf90df Merge branch 'feat/poc-openai-responses-apache' into feat/poc-aicore-… (rpanackal)
`...nai/src/main/java/com/sap/ai/sdk/foundationmodels/openai/AiCoreChatCompletionService.java` (174 additions, 0 deletions)
```java
package com.sap.ai.sdk.foundationmodels.openai;

import com.openai.core.ClientOptions;
import com.openai.core.RequestOptions;
import com.openai.core.http.StreamResponse;
import com.openai.models.chat.completions.ChatCompletion;
import com.openai.models.chat.completions.ChatCompletionChunk;
import com.openai.models.chat.completions.ChatCompletionCreateParams;
import com.openai.models.chat.completions.ChatCompletionDeleteParams;
import com.openai.models.chat.completions.ChatCompletionDeleted;
import com.openai.models.chat.completions.ChatCompletionListPage;
import com.openai.models.chat.completions.ChatCompletionListParams;
import com.openai.models.chat.completions.ChatCompletionRetrieveParams;
import com.openai.models.chat.completions.ChatCompletionUpdateParams;
import com.openai.models.chat.completions.StructuredChatCompletion;
import com.openai.models.chat.completions.StructuredChatCompletionCreateParams;
import com.openai.services.blocking.chat.ChatCompletionService;
import com.openai.services.blocking.chat.completions.MessageService;
import java.util.function.Consumer;
import javax.annotation.Nonnull;
import lombok.AccessLevel;
import lombok.RequiredArgsConstructor;
import lombok.experimental.Delegate;

@RequiredArgsConstructor(access = AccessLevel.PACKAGE)
class AiCoreChatCompletionService implements ChatCompletionService {

  @Delegate(types = PassThroughMethods.class)
  private final ChatCompletionService delegate;

  private final String deploymentModel;

  @Override
  @Nonnull
  public ChatCompletionService withOptions(
      @Nonnull final Consumer<ClientOptions.Builder> consumer) {
    return new AiCoreChatCompletionService(delegate.withOptions(consumer), deploymentModel);
  }

  @Override
  @Nonnull
  public ChatCompletionService.WithRawResponse withRawResponse() {
    throw new UnsupportedOperationException(
        "withRawResponse() is not supported by AiCoreChatCompletionService.");
  }

  @Override
  @Nonnull
  public ChatCompletion create(@Nonnull final ChatCompletionCreateParams params) {
    return create(params, RequestOptions.none());
  }

  @Override
  @Nonnull
  public ChatCompletion create(
      @Nonnull final ChatCompletionCreateParams params,
      @Nonnull final RequestOptions requestOptions) {
    throwOnModelMismatch(params.model().asString());
    return delegate.create(params, requestOptions);
  }

  @Override
  @Nonnull
  public <T> StructuredChatCompletion<T> create(
      @Nonnull final StructuredChatCompletionCreateParams<T> params) {
    return create(params, RequestOptions.none());
  }

  @Override
  @Nonnull
  public <T> StructuredChatCompletion<T> create(
      @Nonnull final StructuredChatCompletionCreateParams<T> params,
      @Nonnull final RequestOptions requestOptions) {
    throwOnModelMismatch(params.rawParams().model().asString());
    return delegate.create(params, requestOptions);
  }

  @Override
  @Nonnull
  public StreamResponse<ChatCompletionChunk> createStreaming(
      @Nonnull final ChatCompletionCreateParams params) {
    return createStreaming(params, RequestOptions.none());
  }

  @Override
  @Nonnull
  public StreamResponse<ChatCompletionChunk> createStreaming(
      @Nonnull final ChatCompletionCreateParams params,
      @Nonnull final RequestOptions requestOptions) {
    throwOnModelMismatch(params.model().asString());
    return delegate.createStreaming(params, requestOptions);
  }

  @Override
  @Nonnull
  public StreamResponse<ChatCompletionChunk> createStreaming(
      @Nonnull final StructuredChatCompletionCreateParams<?> params) {
    return createStreaming(params, RequestOptions.none());
  }

  @Override
  @Nonnull
  public StreamResponse<ChatCompletionChunk> createStreaming(
      @Nonnull final StructuredChatCompletionCreateParams<?> params,
      @Nonnull final RequestOptions requestOptions) {
    throwOnModelMismatch(params.rawParams().model().asString());
    return delegate.createStreaming(params, requestOptions);
  }

  private void throwOnModelMismatch(@Nonnull final String givenModel) {
    if (!deploymentModel.equals(givenModel)) {
      throw new IllegalArgumentException(
          """
          Model mismatch:
          Expected : '%s' (configured via forModel())
          Actual : '%s' (set in request parameters)
          Fix: Either remove the model from the request parameters, \
          or use forModel("%s") when creating the client.\
          """
              .formatted(deploymentModel, givenModel, givenModel));
    }
  }

  private interface PassThroughMethods {
    ChatCompletionDeleted delete(
        ChatCompletionDeleteParams chatCompletionDeleteParams, RequestOptions requestOptions);

    ChatCompletionDeleted delete(String completionId);

    ChatCompletionDeleted delete(String completionId, ChatCompletionDeleteParams params);

    ChatCompletionDeleted delete(
        String completionId, ChatCompletionDeleteParams params, RequestOptions requestOptions);

    ChatCompletionDeleted delete(String completionId, RequestOptions requestOptions);

    ChatCompletionDeleted delete(ChatCompletionDeleteParams params);

    ChatCompletionListPage list();

    ChatCompletionListPage list(
        ChatCompletionListParams chatCompletionListParams, RequestOptions requestOptions);

    ChatCompletionListPage list(ChatCompletionListParams params);

    ChatCompletionListPage list(RequestOptions requestOptions);

    MessageService messages();

    ChatCompletion retrieve(
        ChatCompletionRetrieveParams chatCompletionRetrieveParams, RequestOptions requestOptions);

    ChatCompletion retrieve(String completionId);

    ChatCompletion retrieve(String completionId, ChatCompletionRetrieveParams params);

    ChatCompletion retrieve(
        String completionId, ChatCompletionRetrieveParams params, RequestOptions requestOptions);

    ChatCompletion retrieve(String completionId, RequestOptions requestOptions);

    ChatCompletion retrieve(ChatCompletionRetrieveParams params);

    ChatCompletion update(
        ChatCompletionUpdateParams chatCompletionUpdateParams, RequestOptions requestOptions);

    ChatCompletion update(String completionId, ChatCompletionUpdateParams params);

    ChatCompletion update(
        String completionId, ChatCompletionUpdateParams params, RequestOptions requestOptions);

    ChatCompletion update(ChatCompletionUpdateParams params);
  }
}
```
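The guard in this class rejects any request whose declared model disagrees with the model the client was configured for. A minimal standalone sketch of that check, with illustrative class and method names that are not part of the SDK:

```java
// Illustrative sketch of the model-mismatch guard; class and method names are
// made up for this example and are not part of the SDK.
public class ModelGuardSketch {

  /** Returns true when the model in the request matches the configured deployment model. */
  public static boolean matches(String deploymentModel, String givenModel) {
    return deploymentModel.equals(givenModel);
  }

  /** Mirrors throwOnModelMismatch: reject requests that disagree with the deployment model. */
  public static void throwOnModelMismatch(String deploymentModel, String givenModel) {
    if (!matches(deploymentModel, givenModel)) {
      throw new IllegalArgumentException(
          "Model mismatch: expected '%s' but request declared '%s'."
              .formatted(deploymentModel, givenModel));
    }
  }

  public static void main(String[] args) {
    // Matching model passes silently.
    throwOnModelMismatch("gpt-4o", "gpt-4o");
    // A mismatching model is rejected before any network call would happen.
    try {
      throwOnModelMismatch("gpt-4o", "gpt-4o-mini");
    } catch (IllegalArgumentException e) {
      System.out.println(e.getMessage());
    }
  }
}
```

The point of failing fast here is that in AI Core the deployment URL is resolved from the configured model, so a different model in the request body could never be honored by the resolved endpoint anyway.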
`...models/openai/src/main/java/com/sap/ai/sdk/foundationmodels/openai/AiCoreChatService.java` (40 additions, 0 deletions)
```java
package com.sap.ai.sdk.foundationmodels.openai;

import com.openai.core.ClientOptions;
import com.openai.core.http.QueryParams;
import com.openai.services.blocking.ChatService;
import com.openai.services.blocking.chat.ChatCompletionService;
import java.util.function.Consumer;
import javax.annotation.Nonnull;
import lombok.AccessLevel;
import lombok.RequiredArgsConstructor;
import lombok.experimental.Delegate;

@RequiredArgsConstructor(access = AccessLevel.PACKAGE)
class AiCoreChatService implements ChatService {

  @Delegate private final ChatService delegate;
  private final String deploymentModel;

  @Override
  @Nonnull
  public ChatService withOptions(@Nonnull final Consumer<ClientOptions.Builder> consumer) {
    return new AiCoreChatService(delegate.withOptions(consumer), deploymentModel);
  }

  @Override
  @Nonnull
  public WithRawResponse withRawResponse() {
    throw new UnsupportedOperationException(
        "withRawResponse() is not supported for AiCoreChatService");
  }

  @Override
  @Nonnull
  public ChatCompletionService completions() {
    final var apiVersionQuery = QueryParams.builder().put("api-version", "2024-02-01").build();
    final var completions =
        delegate.completions().withOptions(builder -> builder.queryParams(apiVersionQuery));
    return new AiCoreChatCompletionService(completions, deploymentModel);
  }
}
```
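`completions()` above pins the Azure-style `api-version` query parameter on the delegate before wrapping it. The net effect on request URLs can be illustrated with a plain-Java sketch (the host and helper below are made up for illustration; the real wiring goes through `ClientOptions` query params):

```java
import java.net.URI;

// Illustrative sketch: append an api-version query parameter to a request URI,
// mimicking what AiCoreChatService pins via ClientOptions before delegating.
// The class, method, and example host are hypothetical.
public class ApiVersionSketch {

  static final String API_VERSION = "2024-02-01"; // the value pinned in completions()

  /** Returns the URI with api-version appended, preserving any existing query string. */
  public static URI withApiVersion(URI base) {
    String query =
        base.getQuery() == null
            ? "api-version=" + API_VERSION
            : base.getQuery() + "&api-version=" + API_VERSION;
    return URI.create(
        base.getScheme() + "://" + base.getAuthority() + base.getPath() + "?" + query);
  }

  public static void main(String[] args) {
    // Every chat-completion request ends up carrying the pinned api-version.
    System.out.println(withApiVersion(URI.create("https://ai-core.example.com/v2/chat/completions")));
  }
}
```

Pinning the version in the wrapper keeps callers on the OpenAI-native builder API: they never have to know that the AI Core proxy requires the extra parameter.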
Limitation 2:

In `AiCoreResponseService`, we are able to plug in the `model` ourselves, removing the burden on the user of having to mention it twice: once for deployment resolution and once in `ChatCompletionCreateParams` (otherwise the server returns 400). The same is not true for `AiCoreChatCompletionService`: `ChatCompletionCreateParams` throws right away when `model` is not declared, so we cannot provide any convenience to the user by plugging in the model downstream.

Why not switch to dynamic deployment resolution from the model in the params?

AI Core supports multiple operations on the following endpoints:

"/chat/completions": POST
"/responses": GET, POST
"/responses/{response_id}": GET, DELETE
"/responses/compact": POST

Except for the POST operations, there is no `model` available in the payload, so we cannot avoid asking the user for a model up front to resolve the deployment URL.
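The limitation described above comes down to when the params builder validates: an eager builder rejects a missing model at `build()` time, before a wrapper gets a chance to inject it, while a deferred one lets the wrapper fill it in downstream. A hypothetical pair of builders makes the contrast concrete (all names here are invented for illustration; the real `ChatCompletionCreateParams` behaves like the eager variant):

```java
import java.util.Optional;

// Hypothetical builders illustrating Limitation 2; these are NOT the real SDK classes.
public class ValidationSketch {

  /** Eager variant: like ChatCompletionCreateParams, build() throws when model is absent. */
  static class EagerParams {
    private String model;

    EagerParams model(String m) {
      this.model = m;
      return this;
    }

    EagerParams build() {
      if (model == null) {
        throw new IllegalStateException("`model` is required but was not set");
      }
      return this;
    }
  }

  /** Deferred variant: the model stays optional, so a wrapper can inject it later. */
  static class DeferredParams {
    private String model;

    DeferredParams model(String m) {
      this.model = m;
      return this;
    }

    Optional<String> model() {
      return Optional.ofNullable(model);
    }
  }

  /** Returns true if building eager params without a model fails, as it does in the SDK. */
  public static boolean eagerThrowsWithoutModel() {
    try {
      new EagerParams().build();
      return false;
    } catch (IllegalStateException e) {
      return true;
    }
  }

  public static void main(String[] args) {
    System.out.println("eager throws without model: " + eagerThrowsWithoutModel());
    DeferredParams p = new DeferredParams(); // user omits the model...
    p.model("gpt-4o");                       // ...and the wrapper injects it downstream
    System.out.println("deferred model: " + p.model().orElse("<none>"));
  }
}
```

With the eager variant the wrapper never sees a buildable params object lacking a model, which is exactly why `AiCoreChatCompletionService` cannot offer the same convenience as `AiCoreResponseService`.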