51 changes: 50 additions & 1 deletion Cargo.lock

Some generated files (Cargo.lock) are not rendered by default.

1 change: 1 addition & 0 deletions Cargo.toml
@@ -61,6 +61,7 @@ qdrant-client = { version = "1.14.0", default-features = false, features = [
quick-xml = "0.38.0"
quote = "1.0.40"
rayon = "1.10.0"
redis = { version = "0.27", default-features = false }
reqwest = { version = "0.13", default-features = false }
url = "2.5"
rusqlite = "0.32"
30 changes: 30 additions & 0 deletions rig-integrations/rig-redis/Cargo.toml
@@ -0,0 +1,30 @@
[package]
name = "rig-redis"
version = "0.1.0"
edition = { workspace = true }
license = "MIT"
readme = "README.md"
description = "Redis vector store implementation for the rig framework"
repository = "https://github.com/0xPlaygrounds/rig"

[lints]
workspace = true

[dependencies]
rig-core = { path = "../../rig/rig-core", version = "0.31.0", default-features = false }
redis = { workspace = true, features = ["tokio-comp", "connection-manager"] }
serde = { workspace = true }
serde_json = { workspace = true }
uuid = { workspace = true, features = ["v4"] }

[dev-dependencies]
rig-core = { path = "../../rig/rig-core", version = "0.31.0", features = ["derive"] }
tokio = { workspace = true, features = ["macros", "rt-multi-thread"] }
anyhow = { workspace = true }
tracing-subscriber = { workspace = true, features = ["env-filter"] }
testcontainers = { workspace = true }
httpmock = { workspace = true }

[[example]]
name = "vector_search_redis"
required-features = ["rig-core/derive"]
136 changes: 136 additions & 0 deletions rig-integrations/rig-redis/README.md
@@ -0,0 +1,136 @@
# Rig-Redis

Vector store index integration for [Redis](https://redis.io/) using RediSearch vector similarity search. This integration supports dense vector retrieval using Rig's embedding providers and leverages Redis's FT.SEARCH command with KNN queries for efficient similarity search.

## Features

- Vector similarity search using Redis's RediSearch module
- Support for KNN (k-nearest neighbors) queries
- Metadata filtering with Redis query syntax
- Document insertion with automatic embedding storage
- Compatible with Redis 7.2+ or Redis Stack

## Prerequisites

You need a Redis instance with RediSearch module enabled. This can be:
- [Redis Stack](https://redis.io/docs/stack/)
- Redis 7.2+ with RediSearch module loaded
- Redis Cloud with RediSearch enabled

## Creating a Vector Index

Before using the vector store, you need to create a RediSearch index with a vector field. Here's an example using redis-cli:

```bash
FT.CREATE word_idx
  ON HASH
  PREFIX 1 doc:
  SCHEMA
    document TEXT
    embedded_text TEXT
    embedding VECTOR FLAT 6
      TYPE FLOAT32
      DIM 1536
      DISTANCE_METRIC COSINE
```

Replace `1536` with your embedding model's dimensionality.

## Usage Example

```rust
use rig::providers::openai;
use rig::vector_store::request::VectorSearchRequest;
use rig::vector_store::{InsertDocuments, VectorStoreIndex};
use rig_redis::RedisVectorStore;

// Create embedding model
let openai_client = openai::Client::from_env();
let model = openai_client.embedding_model(openai::TEXT_EMBEDDING_3_SMALL);

// Create Redis client
let redis_client = redis::Client::open("redis://127.0.0.1:6379")?;

// Create vector store
let vector_store = RedisVectorStore::new(
    model,
    redis_client,
    "word_idx".to_string(),  // index name
    "embedding".to_string(), // vector field name
);

// Insert documents
vector_store.insert_documents(documents).await?;

// Search
let results = vector_store
    .top_n::<MyDocument>(
        VectorSearchRequest::builder()
            .query("your search query")
            .samples(5)
            .build()?,
    )
    .await?;
```

You can find complete examples [here](https://github.com/0xPlaygrounds/rig/tree/main/rig-integrations/rig-redis/examples).

## Distance Metrics

Redis supports three distance metrics:
- **COSINE** - Cosine similarity (default, recommended)
- **L2** - Euclidean distance
- **IP** - Inner product

Choose the metric that matches your embedding model when creating the index.
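As a point of reference for what the COSINE metric computes, here is a minimal, self-contained sketch of cosine similarity in plain Rust. The `cosine_similarity` function is purely illustrative and is not part of this crate or of RediSearch; note that RediSearch reports a *distance*, which for the COSINE metric is `1 - similarity`.

```rust
// Illustrative only: cosine similarity between two embedding vectors.
// Identical directions give 1.0, orthogonal vectors give 0.0.
fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm_a: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let norm_b: f32 = b.iter().map(|y| y * y).sum::<f32>().sqrt();
    dot / (norm_a * norm_b)
}

fn main() {
    let a = [1.0, 0.0, 0.0];
    let b = [0.0, 1.0, 0.0];
    // Orthogonal vectors: similarity is 0.0, so the COSINE distance is 1.0.
    println!("similarity = {}", cosine_similarity(&a, &b));
}
```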

## Limitations

- Requires pre-created RediSearch index
- Vector dimensionality must match the index definition
- Embeddings are stored as FLOAT32 (converted from FLOAT64)
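To make the FLOAT32 limitation concrete: RediSearch expects a HASH vector field to hold the raw little-endian FLOAT32 bytes of the vector. The sketch below (a hypothetical helper, not part of rig-redis's public API) shows the kind of conversion involved; the `f64` to `f32` cast is where precision is lost.

```rust
// Illustrative only: pack f64 embeddings into the little-endian FLOAT32
// byte layout that a RediSearch HASH vector field expects.
fn pack_embedding_f32_le(embedding: &[f64]) -> Vec<u8> {
    embedding
        .iter()
        .flat_map(|&v| (v as f32).to_le_bytes())
        .collect()
}

fn main() {
    let bytes = pack_embedding_f32_le(&[1.0, -0.5]);
    // 2 components * 4 bytes per FLOAT32 = 8 bytes.
    assert_eq!(bytes.len(), 8);
    println!("{bytes:?}");
}
```

A DIM 1536 index therefore expects exactly 1536 × 4 = 6144 bytes in the vector field; a mismatched length makes the document fail to index.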

## Testing

### Prerequisites

Integration tests require Docker to be running, as they use testcontainers to spin up a Redis Stack instance.

### Running Tests

```bash
# Run all tests (unit + integration)
cargo test

# Run only unit tests
cargo test --lib

# Run only integration tests
cargo test --test integration_tests

# Or use the Makefile
make test # All tests
make test-unit # Unit tests only
make test-integration # Integration tests only
```

### Manual Testing with Local Redis

You can start a local Redis Stack instance for manual testing:

```bash
# Start Redis Stack
make redis-local
# or
docker run -d --name redis-stack -p 6379:6379 redis/redis-stack:latest

# Create a test index
redis-cli FT.CREATE word_idx ON HASH SCHEMA document TEXT embedded_text TEXT embedding VECTOR FLAT 6 TYPE FLOAT32 DIM 1536 DISTANCE_METRIC COSINE

# Run the example
make run-example
# or
cargo run --example vector_search_redis

# Stop Redis Stack
make redis-stop
```
82 changes: 82 additions & 0 deletions rig-integrations/rig-redis/examples/vector_search_redis.rs
@@ -0,0 +1,82 @@
use rig::client::ProviderClient;
use rig::vector_store::InsertDocuments;
use rig::vector_store::request::VectorSearchRequest;
use rig::{
    Embed, client::EmbeddingsClient, embeddings::EmbeddingsBuilder, vector_store::VectorStoreIndex,
};
use serde::{Deserialize, Serialize};

#[derive(Embed, Serialize, Deserialize, Clone, Debug, Eq, PartialEq, Default)]
struct WordDefinition {
    word: String,
    #[serde(skip)]
    #[embed]
    definition: String,
}

impl std::fmt::Display for WordDefinition {
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        write!(f, "{}", self.word)
    }
}

#[tokio::main]
async fn main() -> Result<(), anyhow::Error> {
    // Create OpenAI client
    let openai_client = rig::providers::openai::Client::from_env();
    let model = openai_client.embedding_model(rig::providers::openai::TEXT_EMBEDDING_3_SMALL);

    let redis_url =
        std::env::var("REDIS_URL").unwrap_or_else(|_| "redis://127.0.0.1:6379".to_string());
    let redis_client = redis::Client::open(redis_url)?;

    let vector_store = rig_redis::RedisVectorStore::new(
        model.clone(),
        redis_client,
        "word_idx".to_string(),
        "embedding".to_string(),
    );

    // Create test documents with embeddings
    let words = vec![
        WordDefinition {
            word: "flurbo".to_string(),
            definition: "1. *flurbo* (name): A fictional digital currency that originated in the animated series Rick and Morty.".to_string(),
        },
        WordDefinition {
            word: "glarb-glarb".to_string(),
            definition: "1. *glarb-glarb* (noun): A fictional creature found in the distant, swampy marshlands of the planet Glibbo in the Andromeda galaxy.".to_string(),
        },
        WordDefinition {
            word: "linglingdong".to_string(),
            definition: "1. *linglingdong* (noun): A term used by inhabitants of the far side of the moon to describe humans.".to_string(),
        },
    ];

    let documents = EmbeddingsBuilder::new(model.clone())
        .documents(words)
        .unwrap()
        .build()
        .await
        .expect("Failed to create embeddings");

    vector_store.insert_documents(documents).await?;

    // Query vector store
    let query = "What does \"glarb-glarb\" mean?";

    let req = VectorSearchRequest::builder()
        .query(query)
        .samples(2)
        .build()
        .expect("VectorSearchRequest should not fail to build here");

    let results = vector_store.top_n::<WordDefinition>(req).await?;

    println!("#{} results for query: {}", results.len(), query);
    for (score, _id, doc) in results.iter() {
        println!("Result score {score} for word: {doc}");
    }

    Ok(())
}