Conversation

@madeinoz67

This pull request introduces the new pai-knowledge-system pack, which provides persistent AI memory and knowledge management using a Graphiti knowledge graph with a FalkorDB backend. It complements the existing history pack.

It adds documentation, configuration files, hook integration, and updates to the history system to support seamless knowledge graph syncing and prevent feedback loops. The changes are grouped below by theme.

New Pack Introduction and Documentation

  • Added the pai-knowledge-system pack to the repository, including its package.json and documentation entries in both PACKS.md and Packs/README.md, describing its features and installation order.

Configuration and Environment Setup

  • Added example environment configuration (config/.env.example) for the pack, detailing required variables for LLM providers, database settings, and concurrency, with clear namespacing to avoid conflicts (an illustrative sketch follows after this list).
  • Added MCP server configuration (config/.mcp.json) for connecting to the pack's SSE endpoint.
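
For illustration only (neither file's exact contents are quoted in this summary), the environment file might look roughly like this. The PAI_KNOWLEDGE_ prefix and every key name below are placeholders chosen to show the namespacing idea, not the pack's actual variables:

```
# Hypothetical sketch of config/.env.example; key names are illustrative only.

# LLM provider (Graphiti uses it for inference, embeddings, and reranking)
PAI_KNOWLEDGE_OPENAI_API_KEY=sk-...
PAI_KNOWLEDGE_LLM_MODEL=gpt-4o-mini

# FalkorDB backend
PAI_KNOWLEDGE_FALKORDB_HOST=localhost
PAI_KNOWLEDGE_FALKORDB_PORT=6379

# Ingestion concurrency
PAI_KNOWLEDGE_MAX_CONCURRENCY=4
```

Likewise, an SSE-based MCP entry in config/.mcp.json would follow the usual mcpServers shape; the server name, host, and port here are assumptions rather than the pack's real endpoint:

```json
{
  "mcpServers": {
    "pai-knowledge": {
      "type": "sse",
      "url": "http://localhost:8000/sse"
    }
  }
}
```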

Hook Integration for Knowledge Sync

  • Added hook configuration (config/settings-hooks.json) to enable syncing history and learning events to the knowledge graph at key lifecycle points (SessionStart, Stop, SubagentStop).
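
A rough sketch of what that wiring could look like, assuming the standard Claude Code hooks schema; the hook script path and its --event flag are hypothetical stand-ins for whatever the pack actually invokes:

```json
{
  "hooks": {
    "SessionStart": [
      { "hooks": [{ "type": "command", "command": "bun run ~/.claude/hooks/knowledge-sync.ts --event session-start" }] }
    ],
    "Stop": [
      { "hooks": [{ "type": "command", "command": "bun run ~/.claude/hooks/knowledge-sync.ts --event stop" }] }
    ],
    "SubagentStop": [
      { "hooks": [{ "type": "command", "command": "bun run ~/.claude/hooks/knowledge-sync.ts --event subagent-stop" }] }
    ]
  }
}
```

The real settings-hooks.json may use different commands or matchers; the point is simply that each of the three lifecycle events triggers a command-type hook that pushes the relevant history and learning data into the graph.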

History System Safeguards

  • Updated the history capture system (capture-all-events.ts) to exclude knowledge system tool operations from capture, preventing feedback loops and redundant knowledge entries.
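
The exclusion is essentially a tool-name guard early in the capture path. A minimal TypeScript sketch, assuming the knowledge system's MCP tools carry a recognizable prefix (the prefixes and function names below are illustrative, not the file's actual code):

```typescript
// Hypothetical sketch of the feedback-loop guard in capture-all-events.ts.
// The prefix list is an assumption; the real pack may match different names.
const KNOWLEDGE_TOOL_PREFIXES = ["mcp__graphiti", "mcp__pai-knowledge"];

function isKnowledgeSystemTool(toolName: string | undefined): boolean {
  if (!toolName) return false;
  return KNOWLEDGE_TOOL_PREFIXES.some((prefix) => toolName.startsWith(prefix));
}

// Skip capture for knowledge-graph tool calls so graph writes are never
// recorded as history and then synced back into the graph again.
function shouldCapture(event: { tool_name?: string }): boolean {
  return !isKnowledgeSystemTool(event.tool_name);
}
```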

Utility Library for Markdown Metadata Extraction

  • Added a frontmatter parser library (src/hooks/lib/frontmatter-parser.ts) to extract structured metadata and clean markdown files for reliable ingestion into the knowledge graph.
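
As a simplified sketch of what such a parser does (the exported names and shape below are illustrative; the real file may expose a different API):

```typescript
// Simplified frontmatter parser: split a leading YAML block from the body.
// Only flat "key: value" pairs are handled here; nested YAML would need a
// real YAML parser.
export interface ParsedMarkdown {
  metadata: Record<string, string>;
  content: string; // markdown body with the frontmatter block stripped
}

export function parseFrontmatter(markdown: string): ParsedMarkdown {
  const match = markdown.match(/^---\r?\n([\s\S]*?)\r?\n---\r?\n?/);
  if (!match) return { metadata: {}, content: markdown };

  const metadata: Record<string, string> = {};
  for (const line of match[1].split(/\r?\n/)) {
    const idx = line.indexOf(":");
    if (idx === -1) continue;
    metadata[line.slice(0, idx).trim()] = line.slice(idx + 1).trim();
  }

  return { metadata, content: markdown.slice(match[0].length) };
}
```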

@badosanjos

Wow, I love this concept. I was thinking about exactly this problem: how to manage memory across the Global and Project/Local concepts. But how does it connect with the core History bundle? Does it just sync and provide a second bucket where the LLM may find more context, or does it change the way the core History manages the learnings/facts? If someone decides to turn this pack off in the future, how are the core history learnings from the period the pack was active affected?

@pcockwell

I'm hoping to resolve exactly the same issue, but I'm not entirely sure what the OpenAI or secondary LLM is required for. Any chance you could explain that in a bit more depth? I'm definitely curious.

@danielmiessler
Owner

Thank you @madeinoz67 for this knowledge management system concept! 🙏

PAI v2.1 includes a new MEMORY/ system structure in pai-core-install that provides organized knowledge persistence. Your ideas around knowledge management helped inform this direction.

With the kai-* → pai-* rename, the paths have changed. If you'd like to build on the new MEMORY/ foundation, we'd love to see it!

See the release: https://github.com/danielmiessler/PAI/releases/tag/v2.1.0

@madeinoz67
Author

> Wow, I love this concept. I was thinking about exactly this problem: how to manage memory across the Global and Project/Local concepts. But how does it connect with the core History bundle? Does it just sync and provide a second bucket where the LLM may find more context, or does it change the way the core History manages the learnings/facts? If someone decides to turn this pack off in the future, how are the core history learnings from the period the pack was active affected?

Thanks. The learnings etc. that are stored in the history pack remain; this pack adds a hook into the history system and stores any of the core learnings, research, decisions, etc. as knowledge, totally separate from the history and without impacting it. The difference is that this pack uses vector embeddings for the relationships it learns. Please see the README in the dev branch. I'll work on integrating this pack into the new 2.1 and push a new PR.

@madeinoz67
Author

> I'm hoping to resolve exactly the same issue, but I'm not entirely sure what the OpenAI or secondary LLM is required for. Any chance you could explain that in a bit more depth? I'm definitely curious.

I've been running it for about a week now and it works very well. Please read the pack's README as it explains the concepts. It uses Graphiti (https://github.com/getzep/graphiti), so the LLM requirements stem from there; the README in my branch explains it a bit better, but in short the LLM(s) are used for inference, embeddings, and cross-encoding/reranking. Currently this pack implements just a single LLM provider, OpenAI, but others are easily configured.

@madeinoz67
Author

> Thank you @madeinoz67 for this knowledge management system concept! 🙏
>
> PAI v2.1 includes a new MEMORY/ system structure in pai-core-install that provides organized knowledge persistence. Your ideas around knowledge management helped inform this direction.
>
> With the kai-* → pai-* rename, the paths have changed. If you'd like to build on the new MEMORY/ foundation, we'd love to see it!
>
> See the release: https://github.com/danielmiessler/PAI/releases/tag/v2.1.0

Roger that, will do.
