AI Engine
Current active engine: Jobe
Recursive Intelligence Loop
Every cross-document connection discovered by the 3-tier search is analyzed by the LLM and persisted back into the knowledge base as a first-class document. Future queries find these insights, compound them, and generate deeper connections, so the system gets smarter with every conversation.
Query ──► 3-Tier Search ──► Graph Cross-Refs
                                  │
                                  ▼
          LLM Insight Generation (Who / What / Why / When)
                                  │
                          ┌───────┴───────┐
                          ▼               ▼
                    Qdrant FAST      Neo4j Graph
                    (embedding)      (entities)
                          │               │
                          └───────┬───────┘
                                  ▼
                    Intelligence Folder (DIG)
                                  │
      Available for next query ◄──┘
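The loop above can be sketched in a few lines. This is a minimal in-memory illustration, not the product's actual API: `KnowledgeBase`, `generate_insight`, and the substring search are hypothetical stand-ins for the 3-tier search and the LLM call.

```python
# In-memory sketch of the recursive intelligence loop: an insight generated
# from one query is persisted as a first-class document and surfaces in the
# next query. Search and LLM generation are stubbed stand-ins.

class KnowledgeBase:
    def __init__(self):
        self.documents = []          # imported docs plus generated insights

    def search(self, query):
        # Stand-in for the 3-tier search: naive substring match.
        return [d for d in self.documents if query.lower() in d.lower()]

def generate_insight(connected_docs):
    # Stand-in for LLM insight generation (Who/What/Why/When).
    return "INSIGHT: shared context across " + " | ".join(connected_docs)

def answer(kb, query):
    hits = kb.search(query)
    if len(hits) >= 2:                       # cross-document connection found
        insight = generate_insight(hits)
        kb.documents.append(insight)         # persist as a first-class document
    return hits

kb = KnowledgeBase()
kb.documents += ["Acme contract terms", "Acme meeting notes"]

first = answer(kb, "acme")    # two source docs -> an insight is persisted
second = answer(kb, "acme")   # the next query also surfaces that insight
print(len(first), len(second))  # 2 3
```

The key design point is that the insight is stored in the same collection as imported documents, so no special retrieval path is needed for it.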
Storage Pipeline
Insights follow the same pipeline as imported PDFs: SQLite chunk record → Titan v2 1024-d embedding → Qdrant FAST vector store → Neo4j entity graph. Insights are managed in the Intelligence/ folder in DIG with a full lifecycle (view, re-extract, delete).
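The four pipeline stages can be sketched as follows. Only the SQLite step uses a real library (the in-memory stdlib driver); `embed_titan_v2` and the two store arguments are illustrative stubs, since the actual Titan, Qdrant, and Neo4j wiring is not shown in this document.

```python
# Hedged sketch of the insight storage pipeline: SQLite chunk record ->
# Titan v2 1024-d embedding -> Qdrant vector upsert -> Neo4j entity graph.
import sqlite3

def embed_titan_v2(text):
    # Stub for the Titan v2 embedding call; the real model returns 1024 floats.
    return [0.0] * 1024

def store_insight(text, qdrant_points, neo4j_edges, entities):
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE chunks (id INTEGER PRIMARY KEY, body TEXT)")
    cur = db.execute("INSERT INTO chunks (body) VALUES (?)", (text,))
    chunk_id = cur.lastrowid                      # 1. SQLite chunk record

    vector = embed_titan_v2(text)                 # 2. 1024-d embedding
    qdrant_points[chunk_id] = vector              # 3. upsert into FAST store

    for a, b in zip(entities, entities[1:]):      # 4. entity graph edges
        neo4j_edges.append((a, "CO_OCCURS_WITH", b))
    return chunk_id

points, edges = {}, []
cid = store_insight("Acme renewed its contract in Q3",
                    points, edges, ["Acme", "contract", "Q3"])
print(cid, len(points[cid]), len(edges))  # 1 1024 2
```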
Trigger Conditions
Insight generation activates when Neo4j returns cross_reference-type connections, i.e. shared entities between different documents. Generation uses the currently selected AI Engine (Balanced = Haiku, Advanced = Sonnet). The prompt follows the Who/What/Why/When + Hidden Insight framework, capped at 600 tokens.
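The trigger logic reduces to a small check plus a model lookup. This is a sketch under assumptions: the connection-record shape and the model identifiers are hypothetical placeholders, not the system's real values.

```python
# Sketch of the trigger conditions: generation fires only when the graph
# step returns cross_reference connections, and the model follows the
# selected engine tier. All names and shapes here are illustrative.

MODEL_BY_ENGINE = {"Balanced": "claude-haiku", "Advanced": "claude-sonnet"}

def should_generate(connections):
    # Fire only on cross-document connections (shared entities).
    return any(c["type"] == "cross_reference" for c in connections)

def build_request(connections, engine="Balanced"):
    if not should_generate(connections):
        return None              # no cross-refs -> no insight generation
    return {
        "model": MODEL_BY_ENGINE[engine],
        "max_tokens": 600,       # token cap stated above
        "framework": "Who/What/Why/When + Hidden Insight",
    }

conns = [{"type": "cross_reference", "entity": "Acme",
          "docs": ["contract.pdf", "notes.pdf"]}]
req = build_request(conns, engine="Advanced")
print(req["model"], req["max_tokens"])  # claude-sonnet 600
```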
Lifecycle
Deleting a conversation offers the option to also remove its generated insight from the knowledge base (Qdrant + Neo4j + SQLite). Keeping insights after deletion preserves the compounding intelligence effect.
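The deletion choice can be sketched as an optional cascade across the three stores. The store objects below are in-memory stand-ins for Qdrant, Neo4j, and SQLite, and all names are illustrative.

```python
# Sketch of the insight lifecycle on conversation deletion: the conversation
# always goes; the generated insight is removed from all three stores only
# when the user opts in.

def delete_conversation(conv_id, conversations, stores, remove_insight):
    insight_id = conversations.pop(conv_id)      # conversation always deleted
    if remove_insight and insight_id is not None:
        for store in stores.values():            # Qdrant + Neo4j + SQLite
            store.discard(insight_id)
    return insight_id

stores = {"qdrant": {"ins-1"}, "neo4j": {"ins-1"}, "sqlite": {"ins-1"}}
convs = {"conv-42": "ins-1"}

# Keeping the insight preserves the compounding intelligence effect:
delete_conversation("conv-42", convs, stores, remove_insight=False)
print(all("ins-1" in s for s in stores.values()))  # True
```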