Fix compilation errors from async-openai 0.32.2 and base64 0.22.1 upgrades #256
Dependency upgrades introduced breaking API changes: async-openai 0.32.2 moved chat completion types to a gated submodule, and base64 0.22.1 deprecated the `decode()` function.

Changes

async-openai API migration:
- Enabled the `chat-completion` feature flag to access the Chat Completion API
- Updated imports from `async_openai::types::*` to `async_openai::types::chat::*`

base64 API migration:
- Replaced `base64::decode()` with `general_purpose::STANDARD.decode()`

Dependency resolution:
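A minimal sketch of what the two migrations look like at a call site, assuming a Cargo.toml along the lines of `async-openai = { version = "0.32.2", features = ["chat-completion"] }` and `base64 = "0.22.1"`; the `decode_blob` helper is illustrative, not the repository's actual code:

```rust
// Imports move from `async_openai::types::*` to the gated `async_openai::types::chat::*`.
#[allow(unused_imports)] // shown only to illustrate the new module path
use async_openai::types::chat::{
    ChatCompletionRequestAssistantMessageArgs, ChatCompletionRequestMessage,
    ChatCompletionRequestUserMessageArgs, CreateChatCompletionRequestArgs,
};
// base64 0.22 deprecates the free `base64::decode()` in favour of an explicit engine.
use base64::{engine::general_purpose, Engine as _};

/// Illustrative stand-in for the call sites in src/dynamo/create_item.rs:
/// `base64::decode(s).unwrap_or_default()` becomes the engine-based call below.
fn decode_blob(s: &str) -> Vec<u8> {
    general_purpose::STANDARD.decode(s).unwrap_or_default()
}

fn main() {
    // "aGVsbG8=" is base64 for "hello"; decode failures fall back to an empty Vec.
    assert_eq!(decode_blob("aGVsbG8="), b"hello".to_vec());
}
```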
Original prompt
fix issues due to dependencies upgrade:

   Compiling aws-config v1.8.12
error[E0432]: unresolved imports `async_openai::types::ChatCompletionRequestAssistantMessageArgs`, `async_openai::types::ChatCompletionRequestMessage`, `async_openai::types::ChatCompletionRequestUserMessageArgs`, `async_openai::types::CreateChatCompletionRequestArgs`
 --> src/openai_client.rs:2:5
  |
2 |     ChatCompletionRequestAssistantMessageArgs, ChatCompletionRequestMessage,
  |     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ no `ChatCompletionRequestMessage` in `types`
  |     |
  |     no `ChatCompletionRequestAssistantMessageArgs` in `types`
3 |     ChatCompletionRequestUserMessageArgs, CreateChatCompletionRequestArgs,
  |     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ no `CreateChatCompletionRequestArgs` in `types`
  |     |
  |     no `ChatCompletionRequestUserMessageArgs` in `types`

error[E0432]: unresolved imports `async_openai::config`, `async_openai::Client`
 --> src/openai_client.rs:5:20
|
5 | use async_openai::{config::OpenAIConfig, Client};
  |                    ^^^^^^                ^^^^^^ no `Client` in the root
  |                    |
  |                    could not find `config` in `async_openai`
  |
note: found an item that was configured out
--> /Users/blank/.cargo/registry/src/index.crates.io-1949cf8c6b5b557f/async-openai-0.32.2/src/lib.rs:258:9
|
257 | #[cfg(feature = "_api")]
    |       ----------------- the item is gated behind the `_api` feature
258 | pub mod config;
    |         ^^^^^^
= help: consider importing one of these structs instead:
aws_config::imds::Client
aws_sdk_dynamodb::Client
reqwest::Client
note: found an item that was configured out
--> /Users/blank/.cargo/registry/src/index.crates.io-1949cf8c6b5b557f/async-openai-0.32.2/src/lib.rs:326:17
|
325 | #[cfg(feature = "_api")]
    |       ----------------- the item is gated behind the `_api` feature
326 | pub use client::Client;
    |                     ^^^^^^

warning: use of deprecated function `base64::decode`: Use Engine::decode
  --> src/dynamo/create_item.rs:30:37
|
30 | ... base64::decode(s).unwrap_or_default(),
| ^^^^^^
|
  = note: `#[warn(deprecated)]` on by default

warning: use of deprecated function `base64::decode`: Use Engine::decode
  --> src/dynamo/create_item.rs:55:53
|
55 | ... base64::decode(s).unwrap_or_default(),
| ^^^^^^
error[E0282]: type annotations needed
--> src/openai_client.rs:122:25
|
122 | Some(client) => client.clone(),
| ^^^^^^ cannot infer type
error[E0282]: type annotations needed
--> src/openai_client.rs:139:27
|
139 | .map_err(|e| e.to_string())?
| ^ - type must be known at this point
|
help: consider giving this closure parameter an explicit type
|
139 | .map_err(|e: /* Type */| e.to_string())?
| ++++++++++++
error[E0282]: type annotations needed
--> src/openai_client.rs:144:27
|
144 | .map_err(|e| e.to_string())?
| ^ - type must be known at this point
|
help: consider giving this closure parameter an explicit type
|
144 | .map_err(|e: /* Type */| e.to_string())?
| ++++++++++++
error[E0282]: type annotations needed
--> src/openai_client.rs:155:23
|
155 | .map_err(|e| e.to_string())?
| ^ - type must be known at this point
|
help: consider giving this closure parameter an explicit type
|
155 | .map_err(|e: /* Type */| e.to_string())?
| ++++++++++++
error[E0282]: type annotations needed
--> src/openai_client.rs:165:19
|
165 | .map_err(|e| e.to_string())?;
| ^ - type must be known at this point
|
help: consider giving this closure parameter an explicit type
|
165 | .map_err(|e: /* Type */| e.to_string())?;
| ++++++++++++
error[E0282]: type annotations needed
--> src/openai_client.rs:168:9
|
168 | let mut stream = match openai_client.chat().create_stream(request).await {
| ^^^^^^^^^^
...
184 | while let Some(result) = stream.next().await {
| ------ type must be known at this point
|
help: consider giving `stream` an explicit type
    |
168 |     let mut stream: /* Type */ = match openai_client.chat().create_stream(request).await {
    |                   ++++++++++++
error[E0282]: type annotations needed
--> src/openai_client.rs:212:37
|
212 | let error_message = e.to_string();
| ^ cannot infer type
Some errors have detailed ...
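For the `Client`/`config` resolution failures and the E0282 inference errors quoted above, here is a hedged sketch of what the streaming path in src/openai_client.rs can look like once the gated API feature is enabled; the function name, model string, message construction, and delta handling are assumptions modelled on async-openai's published examples, not this repository's code:

```rust
use async_openai::types::chat::{
    ChatCompletionRequestUserMessageArgs, CreateChatCompletionRequestArgs,
};
use async_openai::{config::OpenAIConfig, Client};
use futures::StreamExt;

async fn stream_completion(prompt: &str) -> Result<String, String> {
    // With the API feature enabled, `Client` and `config` resolve again, so the
    // `|e| e.to_string()` closures and the `stream` binding infer their types.
    let openai_client: Client<OpenAIConfig> = Client::new(); // reads OPENAI_API_KEY

    let request = CreateChatCompletionRequestArgs::default()
        .model("gpt-4o-mini") // placeholder model name
        .messages([ChatCompletionRequestUserMessageArgs::default()
            .content(prompt)
            .build()
            .map_err(|e| e.to_string())?
            .into()])
        .build()
        .map_err(|e| e.to_string())?;

    let mut stream = openai_client
        .chat()
        .create_stream(request)
        .await
        .map_err(|e| e.to_string())?;

    let mut output = String::new();
    while let Some(result) = stream.next().await {
        let response = result.map_err(|e| e.to_string())?;
        // Append streamed content deltas as they arrive.
        if let Some(delta) = response
            .choices
            .first()
            .and_then(|choice| choice.delta.content.as_deref())
        {
            output.push_str(delta);
        }
    }
    Ok(output)
}
```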