Add docs and examples

Dongri Jin
2023-04-11 06:34:12 +09:00
parent 9d3db96165
commit f0ae2812a7


@@ -7,12 +7,40 @@ Cargo.toml
openai-api-rs = "0.1.5"
```
## Example:
## Usage
The library needs to be configured with your account's secret API key, which is available on the OpenAI website. We recommend setting it as an environment variable. Here's an example of initializing the library with the API key loaded from an environment variable and creating a chat completion:
### Set the OPENAI_API_KEY environment variable
```bash
export OPENAI_API_KEY={YOUR_API_KEY}
$ export OPENAI_API_KEY=sk-xxxxxxx
```
### Chat
### Create client
```rust
use openai_api_rs::v1::api::Client;
use std::env;
let client = Client::new(env::var("OPENAI_API_KEY").unwrap().to_string());
```
### Create request
```rust
use openai_api_rs::v1::chat_completion::{self, ChatCompletionRequest};
let req = ChatCompletionRequest {
    model: chat_completion::GPT4.to_string(),
    messages: vec![chat_completion::ChatCompletionMessage {
        role: chat_completion::MessageRole::user,
        content: String::from("Hello OpenAI!"),
    }],
};
```
### Send request
```rust
let result = client.chat_completion(req).await?;
println!("{:?}", result.choices[0].message.content);
```
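Note that `?` and `.await` in the snippet above assume an async context whose return type is a `Result`, e.g. a `#[tokio::main] async fn main() -> Result<(), Box<dyn std::error::Error>>` as shown in the full example below.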
## Example of chat completion
```rust
use openai_api_rs::v1::api::Client;
use openai_api_rs::v1::chat_completion::{self, ChatCompletionRequest};
@@ -25,46 +53,23 @@ async fn main() -> Result<(), Box<dyn std::error::Error>> {
        model: chat_completion::GPT4.to_string(),
        messages: vec![chat_completion::ChatCompletionMessage {
            role: chat_completion::MessageRole::user,
            content: String::from("NFTとは"),
            content: String::from("Hello OpenAI!"),
        }],
    };
    let result = client.chat_completion(req).await?;
    println!("{:?}", result.choices[0].message.content);
    Ok(())
}
```
Check out the [full API documentation](https://platform.openai.com/docs/api-reference/completions) for examples of all the available functions.
### Completion
```rust
use openai_api_rs::v1::completion::{self, CompletionRequest};
use openai_api_rs::v1::api::Client;
use std::env;
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = Client::new(env::var("OPENAI_API_KEY").unwrap().to_string());
    let req = CompletionRequest {
        model: completion::GPT3_TEXT_DAVINCI_003.to_string(),
        prompt: Some(String::from("NFTとは")),
        suffix: None,
        max_tokens: Some(3000),
        temperature: Some(0.9),
        top_p: Some(1.0),
        n: None,
        stream: None,
        logprobs: None,
        echo: None,
        stop: None,
        presence_penalty: Some(0.6),
        frequency_penalty: Some(0.0),
        best_of: None,
        logit_bias: None,
        user: None,
    };
    let result = client.completion(req).await?;
    println!("{:?}", result.choices[0].text);
    Ok(())
}
```
## Supported APIs
- [x] [Completions](https://platform.openai.com/docs/api-reference/completions)
- [x] [Chat](https://platform.openai.com/docs/api-reference/chat)
- [x] [Edits](https://platform.openai.com/docs/api-reference/edits)
- [x] [Images](https://platform.openai.com/docs/api-reference/images)
- [x] [Embeddings](https://platform.openai.com/docs/api-reference/embeddings) (see the sketch below)
- [ ] [Audio](https://platform.openai.com/docs/api-reference/audio)
- [x] [Files](https://platform.openai.com/docs/api-reference/files)
- [ ] [Fine-tunes](https://platform.openai.com/docs/api-reference/fine-tunes)
- [ ] [Moderations](https://platform.openai.com/docs/api-reference/moderations)
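For the other supported endpoints, the calling pattern is the same as above: build a request struct from the matching `v1` module and pass it to the client. Below is a minimal sketch for the Embeddings endpoint; the module path `openai_api_rs::v1::embedding`, the `EmbeddingRequest` field set, and the `client.embedding` method are assumptions inferred from the pattern above and may differ in this version of the crate, so verify against the crate source before relying on it.
```rust
use openai_api_rs::v1::api::Client;
// Assumed module and struct names; verify against the crate's v1::embedding module.
use openai_api_rs::v1::embedding::EmbeddingRequest;
use std::env;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = Client::new(env::var("OPENAI_API_KEY").unwrap().to_string());
    // Field names mirror the OpenAI embeddings API; the exact struct shape is an assumption.
    let req = EmbeddingRequest {
        model: String::from("text-embedding-ada-002"),
        input: String::from("The quick brown fox jumps over the lazy dog"),
        user: None,
    };
    let result = client.embedding(req).await?;
    // The response is assumed to carry the embedding vectors in a `data` field.
    println!("{:?}", result.data);
    Ok(())
}
```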