126 changes: 76 additions & 50 deletions README.md
# deja

Persistent memory for AI agents. Agents learn from runs — deja remembers across them.

[Docs](https://deja.coey.dev/docs) · [Quickstart](https://deja.coey.dev/guides/quickstart)
## Three ways to use deja

| | **deja-local** | **deja-edge** | **deja (hosted)** |
|---|---|---|---|
| **What it is** | In-process SQLite memory | Cloudflare Durable Object memory | Hosted Cloudflare Worker service |
| **Search** | Vector embeddings (all-MiniLM-L6-v2) | FTS5 full-text search | Vectorize + Workers AI embeddings |
| **Runtime** | Bun | Cloudflare Workers | Cloudflare Workers |
| **Dependencies** | None (runs locally) | None (runs in your DO) | Vectorize index + Workers AI |
| **Latency** | Zero (in-process) | Zero (in-DO) | Network round-trip |
| **Install** | `npm install deja-local` | `npm install deja-edge` | Clone + `wrangler deploy` |
| **Best for** | Local agents, scripts, CLI tools | Edge agents running in Workers | Multi-tenant, team-shared memory |

### deja-local

Memory lives in a local SQLite file. Vector search via ONNX embeddings on CPU. No network, no API keys. Requires Bun.

```ts
import { createMemory } from 'deja-local'

const mem = createMemory({ path: './agent.db' })
await mem.remember('always run migrations before deploying')
const results = await mem.recall('deploying to staging')
```
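Recall ranks stored memories by cosine similarity between embedding vectors. A minimal sketch of that ranking math (illustrative only; deja-local's actual index and ONNX pipeline are internal details):

```typescript
// Sketch of cosine-similarity ranking over an in-memory vector index
// (illustrative; not deja-local's real implementation).
function cosine(a: number[], b: number[]): number {
  let dot = 0
  let na = 0
  let nb = 0
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i]
    na += a[i] * a[i]
    nb += b[i] * b[i]
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb))
}
```

Identical vectors score 1 and orthogonal vectors 0; recall returns the highest-scoring memories for the query embedding.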

[Full docs →](./packages/deja-local/)

### deja-edge

Memory lives in a Cloudflare Durable Object's SQLite. FTS5 full-text search with BM25 ranking. No Vectorize, no Workers AI — just text matching. Zero external dependencies.

```ts
import { DurableObject } from 'cloudflare:workers'
import { createEdgeMemory } from 'deja-edge'

export class MyDO extends DurableObject {
  private memory = createEdgeMemory(this.ctx)

  async onRemember(text: string) {
    return this.memory.remember(text)
  }

  async onRecall(context: string) {
    return this.memory.recall(context)
  }
}
```

Or use the drop-in DO class with HTTP routes:

```ts
export { DejaEdgeDO } from 'deja-edge/do'
```

[Full docs →](./packages/deja-edge/)

### deja (hosted)

Full-featured hosted service with Vectorize semantic search, scoped memory, working state, secrets, and MCP support. Deploy your own instance or use as part of [filepath](https://github.com/acoyfellow/filepath).

```bash
git clone https://github.com/acoyfellow/deja && cd deja
bun install && wrangler login
wrangler vectorize create deja-embeddings --dimensions 384 --metric cosine
wrangler secret put API_KEY
bun run deploy
```

Connect via REST, the [client package](./packages/deja-client/), or MCP:

```json
{
  "mcpServers": {
    "deja": {
      "type": "http",
      "url": "https://your-deja-instance.workers.dev/mcp",
      "headers": { "Authorization": "Bearer ${DEJA_API_KEY}" }
    }
  }
}
```

Integration guides: [Cursor](https://deja.coey.dev/integrations/cursor) · [Claude Code](https://deja.coey.dev/integrations/claude-code) · [GitHub Actions](https://deja.coey.dev/integrations/github-actions)
## Core concepts

All three systems share the same mental model:

- **Remember** — store a memory (text, optionally with trigger/context/confidence)
- **Recall** — search for relevant memories given a context
- **Confirm / Reject** — feedback loop. Confirmed memories rise, rejected ones fade
- **Forget** — permanently delete a memory
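The confirm/reject loop can be pictured as a simple confidence update. This is an illustrative sketch, not deja's actual scoring function:

```typescript
// Illustrative confidence feedback (not deja's actual scoring):
// confirm nudges confidence toward 1, reject decays it toward 0.
function confirm(confidence: number, rate = 0.2): number {
  return Math.min(1, confidence + rate * (1 - confidence))
}

function reject(confidence: number, rate = 0.2): number {
  return Math.max(0, confidence * (1 - rate))
}
```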

Memories are deduplicated at write time. Conflicting memories (same topic, different content) are automatically resolved — the newer one supersedes the older one.
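A rough sketch of that write-time behavior (the `MemoryStore` class and topic key here are hypothetical, not deja's schema):

```typescript
// Hypothetical write-time dedup/supersede logic; illustrative only.
type Memory = { id: string; topic: string; text: string; createdAt: number }

class MemoryStore {
  private byTopic = new Map<string, Memory>()
  private nextId = 0

  remember(topic: string, text: string): Memory {
    const existing = this.byTopic.get(topic)
    // Exact duplicate: deduplicate by returning the existing memory.
    if (existing && existing.text === text) return existing
    // Same topic, different content: the newer memory supersedes the older.
    const memory: Memory = { id: String(++this.nextId), topic, text, createdAt: Date.now() }
    this.byTopic.set(topic, memory)
    return memory
  }

  get(topic: string): Memory | undefined {
    return this.byTopic.get(topic)
  }
}
```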

## Hosted service API

The hosted service adds features beyond basic memory:

- **Scoped memory** — `shared`, `agent:<id>`, `session:<id>`, or custom scopes
- **Working state** — live snapshots + event streams for in-progress work
- **Secrets** — scoped key-value storage
- **Loop runs** — track optimization loops with auto-learning from outcomes

REST endpoints: `/learn`, `/inject`, `/query`, `/learnings`, `/stats`, `/state/:runId`, `/secret`, `/run`
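As a sketch, a raw call to `/learn` with Bearer auth might be assembled like this (the JSON body fields mirror the client's `learn()` arguments and are an assumption about the wire format, not a documented contract):

```typescript
// Hypothetical request builder for the hosted /learn endpoint.
function buildLearnRequest(
  baseUrl: string,
  apiKey: string,
  trigger: string,
  learning: string,
): Request {
  return new Request(`${baseUrl}/learn`, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({ trigger, learning }),
  })
}
```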

Full reference: https://deja.coey.dev/docs

## Architecture

- **deja-local**: Bun SQLite + ONNX embeddings (all-MiniLM-L6-v2). In-memory vector index loaded at startup.
- **deja-edge**: Cloudflare DO SQLite + FTS5. Porter stemming tokenizer. BM25 ranking blended with confidence.
- **deja (hosted)**: Cloudflare Worker + Durable Object + Vectorize + Workers AI. Per-user isolation via API key.
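The BM25-plus-confidence blend in deja-edge can be sketched as a weighted sum. The `weight` value here is hypothetical, not deja-edge's actual tuning; note that SQLite's `bm25()` returns lower-is-better negative scores, hence the negation:

```typescript
// Hypothetical blend of FTS5 relevance with stored confidence.
function blendScore(bm25: number, confidence: number, weight = 0.7): number {
  const relevance = -bm25 // more positive = stronger text match
  return weight * relevance + (1 - weight) * confidence
}
```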

## License

MIT
147 changes: 17 additions & 130 deletions packages/deja-client/README.md
# deja-client

HTTP client for a hosted [deja](https://github.com/acoyfellow/deja) instance.

```ts
import deja from 'deja-client'

const mem = deja('https://your-deja-instance.workers.dev', { apiKey: '...' })

// Store a learning
await mem.learn('deploy failed', 'check wrangler.toml first')

// Get relevant memories before a task
const { prompt, learnings } = await mem.inject('deploying to production')

// Search memories
const results = await mem.query('wrangler config')

// List all memories
const all = await mem.list()

// Delete a memory
await mem.forget('1234567890-abc123def')

// Get stats
const stats = await mem.stats()
```

## Install

```bash
npm install deja-client
```

## API

```ts
const mem = deja(url, { apiKey?, fetch? })

await mem.learn(trigger, learning, { confidence?, scope?, reason?, source? })
await mem.inject(context, { scopes?, limit?, format? })
await mem.query(text, { scopes?, limit? })
await mem.list({ scope?, limit? })
await mem.forget(id)
await mem.stats()
await mem.recordRun(outcome, attempts, { scope?, code?, error? })
await mem.getRuns({ scope?, limit? })
```

All types (`Learning`, `InjectResult`, `QueryResult`, `Stats`) are exported for TypeScript users.

## License
