. update to version v0.1.14

This commit is contained in:
Jeremy Chone
2024-12-08 19:42:01 -08:00
parent a64126b9b5
commit b318d95526
4 changed files with 18 additions and 6 deletions

View File

@@ -1,10 +1,18 @@
 `.` minor | `-` Fix | `+` Addition | `^` improvement | `!` Change | `*` Refactor
 
 > **IMPORTANT:** `0.1.x` will still have some breaking changes in patches.
-> - Make sure to **lock** your version, e.g., `genai = "=0.1.13"`.
+> - Make sure to **lock** your version, e.g., `genai = "=0.1.14"`.
 > - Version `0.2.x` will follow semver more strictly.
 > - API changes will be denoted as "`!` - **API CHANGE** ...."
 
+## 2024-12-08 - `0.1.14`
+
+- `+` adapter - xAI adapter
+- `+` **ServiceTargetResolver** added (support for **custom endpoint**) (check out [examples/c06-target-resolver.rs](examples/c06-target-resolver.rs))
+- `.` ollama - now uses the OpenAI v1 API to list models
+- `.` test - add test for `Client::all_model_names`
+- `*` major internal refactor
+
 ## 2024-12-07 - `0.1.13`
 - `.` ollama - removed workaround for multi-system lack of support (for old ollama)
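The **ServiceTargetResolver** entry above adds support for routing requests to a custom endpoint. The resolver pattern it names can be sketched in plain Rust; note that the `ServiceTarget` struct, builder method, and closure signature below are illustrative assumptions for this sketch, not genai's actual API (see the linked `examples/c06-target-resolver.rs` for the real usage):

```rust
// Sketch of a service-target resolver: a user-supplied closure that can
// rewrite the endpoint a request would be dispatched to.
// All types here are illustrative, not genai's actual API.
#[derive(Debug, Clone, PartialEq)]
struct ServiceTarget {
    endpoint: String,
    model: String,
}

// A resolver is just a boxed function from target to target.
type TargetResolver = Box<dyn Fn(ServiceTarget) -> ServiceTarget>;

struct ClientBuilder {
    resolver: Option<TargetResolver>,
}

impl ClientBuilder {
    fn new() -> Self {
        Self { resolver: None }
    }

    // Register a custom resolver on the builder.
    fn with_target_resolver(mut self, f: impl Fn(ServiceTarget) -> ServiceTarget + 'static) -> Self {
        self.resolver = Some(Box::new(f));
        self
    }

    // Apply the resolver (if any) before dispatching a request.
    fn resolve(&self, target: ServiceTarget) -> ServiceTarget {
        match &self.resolver {
            Some(f) => f(target),
            None => target,
        }
    }
}

fn main() {
    // Redirect every request to a local OpenAI-compatible endpoint.
    let builder = ClientBuilder::new().with_target_resolver(|mut t: ServiceTarget| {
        t.endpoint = "http://localhost:8080/v1/".to_string();
        t
    });

    let resolved = builder.resolve(ServiceTarget {
        endpoint: "https://api.openai.com/v1/".to_string(),
        model: "grok-beta".to_string(),
    });
    println!("resolved endpoint: {}", resolved.endpoint);
}
```

The design point is that the client stays provider-agnostic: callers inject endpoint policy (proxies, local gateways, self-hosted servers) without the library hard-coding any of it.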

View File

@@ -1,10 +1,10 @@
 [package]
 name = "genai"
-version = "0.1.14-WIP"
+version = "0.1.14"
 edition = "2021"
 rust-version = "1.79"
 license = "MIT OR Apache-2.0"
-description = "Multi-AI Providers Library for Rust. (Ollama, OpenAI, Anthropic, Groq, Gemini, ...)"
+description = "Multi-AI Providers Library for Rust. (Anthropic, OpenAI, Gemini, xAI, Ollama, Groq, ...)"
 keywords = ["generative-ai","openai","chatgpt","gemini","ollama"]
 homepage = "https://github.com/jeremychone/rust-genai"
 repository = "https://github.com/jeremychone/rust-genai"

View File

@@ -12,14 +12,14 @@ Currently supports natively: **Ollama**, **OpenAI**, **Anthropic**, **groq**, **
 ```toml
 # cargo.toml
-genai = "=0.1.13" # Version lock for `0.1.x`
+genai = "=0.1.14" # Version lock for `0.1.x`
 ```
 
 <br />
 
-The goal of this library is to provide a common and ergonomic single API to many generative AI Providers, such as OpenAI, Anthropic, Cohere, Ollama.
+Provides a common and ergonomic single API to many generative AI Providers, such as Anthropic, OpenAI, Gemini, xAI, Ollama, Groq, ....
 
-- **IMPORTANT 1** `0.1.x` will still have some breaking changes in patches, so make sure to **lock** your version, e.g., `genai = "=0.1.13"`. In short, `0.1.x` can be considered "beta releases." Version `0.2.x` will follow semver more strictly.
+- **IMPORTANT 1** `0.1.x` will still have some breaking changes in patches, so make sure to **lock** your version, e.g., `genai = "=0.1.14"`. In short, `0.1.x` can be considered "beta releases." Version `0.2.x` will follow semver more strictly.
 
 - **IMPORTANT 2** `genai` is focused on normalizing chat completion APIs across AI providers and is not intended to be a full representation of a given AI provider. For this, there are excellent libraries such as [async-openai](https://crates.io/search?q=async-openai) for OpenAI and [ollama-rs](https://crates.io/crates/ollama-rs) for Ollama.
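The `genai = "=0.1.14"` requirement above relies on Cargo's exact-version operator: a bare `"0.1.14"` is treated as a caret requirement, so `cargo update` could move to a later (possibly breaking) `0.1.x` patch, while `=` pins the dependency exactly:

```toml
# Cargo.toml
[dependencies]
genai = "=0.1.14"    # exact pin: cargo update stays on 0.1.14
# genai = "0.1.14"   # caret-style: would allow 0.1.15, which may break
```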
@@ -42,6 +42,7 @@ const MODEL_COHERE: &str = "command-light";
 const MODEL_GEMINI: &str = "gemini-1.5-flash-latest";
 const MODEL_GROQ: &str = "gemma-7b-it";
 const MODEL_OLLAMA: &str = "gemma:2b"; // sh: `ollama pull gemma:2b`
+const MODEL_XAI: &str = "grok-beta";
 
 // NOTE: Those are the default environment keys for each AI Adapter Type.
 // Can be customized, see `examples/c02-auth.rs`
@@ -52,6 +53,7 @@ const MODEL_AND_KEY_ENV_NAME_LIST: &[(&str, &str)] = &[
 	(MODEL_COHERE, "COHERE_API_KEY"),
 	(MODEL_GEMINI, "GEMINI_API_KEY"),
 	(MODEL_GROQ, "GROQ_API_KEY"),
+	(MODEL_XAI, "XAI_API_KEY"),
 	(MODEL_OLLAMA, ""),
 ];
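The `(model, env-key)` pairs above drive a simple lookup: for each model, read its provider's API key from the named environment variable, with an empty name (as for Ollama, which runs locally) meaning no key is required. A minimal self-contained sketch of that lookup, using only `std::env` (the `key_for_model` helper is illustrative, not part of genai):

```rust
use std::env;

// (model, env var holding its API key); "" means no key required.
// Subset of the list from the example above, for illustration.
const MODEL_AND_KEY_ENV_NAME_LIST: &[(&str, &str)] = &[
    ("gemini-1.5-flash-latest", "GEMINI_API_KEY"),
    ("grok-beta", "XAI_API_KEY"),
    ("gemma:2b", ""), // Ollama: local server, no key
];

/// Returns Some(key) if the model's key is set, Some("") if no key
/// is required, and None if the model is unknown or the key is unset.
fn key_for_model(model: &str) -> Option<String> {
    let (_, env_name) = MODEL_AND_KEY_ENV_NAME_LIST
        .iter()
        .find(|(m, _)| *m == model)?;
    if env_name.is_empty() {
        return Some(String::new()); // no key needed
    }
    env::var(env_name).ok()
}

fn main() {
    for (model, _) in MODEL_AND_KEY_ENV_NAME_LIST {
        match key_for_model(model) {
            Some(k) if k.is_empty() => println!("{model}: no key required"),
            Some(_) => println!("{model}: key found"),
            None => println!("{model}: key missing"),
        }
    }
}
```

This table-driven shape is why adding the xAI adapter in this commit is a one-line change in each example: a new `(MODEL_XAI, "XAI_API_KEY")` entry, with no control-flow changes.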

View File

@@ -10,6 +10,7 @@ const MODEL_COHERE: &str = "command-light";
 const MODEL_GEMINI: &str = "gemini-1.5-flash-latest";
 const MODEL_GROQ: &str = "gemma-7b-it";
 const MODEL_OLLAMA: &str = "gemma:2b"; // sh: `ollama pull gemma:2b`
+const MODEL_XAI: &str = "grok-beta";
 
 // NOTE: These are the default environment keys for each AI Adapter Type.
 // They can be customized; see `examples/c02-auth.rs`
@@ -20,6 +21,7 @@ const MODEL_AND_KEY_ENV_NAME_LIST: &[(&str, &str)] = &[
 	(MODEL_COHERE, "COHERE_API_KEY"),
 	(MODEL_GEMINI, "GEMINI_API_KEY"),
 	(MODEL_GROQ, "GROQ_API_KEY"),
+	(MODEL_XAI, "XAI_API_KEY"),
 	(MODEL_OLLAMA, ""),
 ];