Data Flow 05: Entity Profile System
Engineering documentation series - Data flows in the AI Assistant system
Overview
The entity profile system provides per-entity memory for players, NPCs, and objects. Based on O-MEM research, it separates raw observations (Pf) from synthesized attributes (Pa), and tracks relationships using Dunbar-inspired favorability thresholds.
Related Documents
| Document | Description |
|---|---|
| Data-Flow-03-Memory-Consolidation | How observations become attributes |
1. Three-Tier Architecture
┌─────────────────────────────────────────────────────────────────────────────┐
│ ENTITY MEMORY ARCHITECTURE │
│ ─────────────────────────────────────────────────────────────────────────── │
│ │
│ Tier 1: ENTITY PROFILES (long-term) │
│ Location: character.db.entity_profiles │
│ Contains: Attributes, observations, relationships │
│ Lifespan: Persistent, consolidated during sleep │
│ │
│ Tier 2: WORKING MEMORY (active session) │
│ Location: character.db.working_memory │
│ Contains: Active conversation context, pending actions │
│ Lifespan: Cleared when conversation stales (1 hour) │
│ │
│ Tier 3: EPISODIC INDEX (searchable experiences) │
│ Location: character.db.journal["entries"] │
│ Contains: Raw experiences with hybrid scoring │
│ Lifespan: Pruned based on importance + age │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
2. Entity Profile Structure
Profile Schema
character.db.entity_profiles = {
    "#123": {                          # Entity ID (dbref or key)
        # Identity
        "entity_type": "player",       # player | npc | object
        "name": "Alice",
        "created": "2025-12-06T10:00:00Z",
        "last_interaction": "2025-12-06T15:30:00Z",

        # Synthesized knowledge (Pa - stable facts)
        "attributes": [
            {
                "type": "preference",
                "value": "Prefers tea over coffee",
                "source": "consolidation",
                "created": "2025-12-05T..."
            },
            {
                "type": "trait",
                "value": "Speaks formally, dislikes slang",
                "source": "consolidation",
                "created": "2025-12-04T..."
            }
        ],

        # Raw observations (Pf - recent experiences)
        "observations": [
            {
                "content": "Alice mentioned her cat is named Whiskers",
                "source": "direct",    # direct | inferred | told
                "timestamp": "2025-12-06T15:30:00Z"
            }
        ],

        # Relationship tracking
        "relationship": {
            "state": "acquaintance",   # stranger | acquaintance | friend | ally
            "favorability": 0.45,      # 0.0 to 1.0
            "interaction_count": 12,
            "last_delta": 0.05,        # Last favorability change
            "history": [               # Recent relationship events
                {"delta": 0.05, "reason": "helpful_response", "timestamp": "..."}
            ]
        }
    }
}
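As a concrete companion to the schema, a default profile can be constructed along these lines. This is a sketch only: the exact signature of the real helpers.py:create_entity_profile() is not shown in this document, and the real helper presumably also persists the result to character.db.entity_profiles.

```python
from datetime import datetime, timezone

def create_entity_profile(entity_type, name):
    """Build a default profile dict matching the schema above.

    Hypothetical signature; the real create_entity_profile() in
    helpers.py may take different arguments and persists the profile.
    """
    now = datetime.now(timezone.utc).isoformat()
    return {
        "entity_type": entity_type,   # player | npc | object
        "name": name,
        "created": now,
        "last_interaction": now,
        "attributes": [],             # Pa - filled by sleep consolidation
        "observations": [],           # Pf - raw, recent experiences
        "relationship": {
            "state": "stranger",      # new entities start with no history
            "favorability": 0.0,
            "interaction_count": 0,
            "last_delta": 0.0,
            "history": [],
        },
    }
```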
3. Relationship State Machine
Dunbar-Inspired Thresholds
┌─────────────────────────────────────────────────────────────────────────────┐
│ RELATIONSHIP STATES │
│ ─────────────────────────────────────────────────────────────────────────── │
│ │
│ STRANGER ACQUAINTANCE FRIEND ALLY │
│ [0.0 ─────────── 0.25 ─────────── 0.50 ─────────── 0.75 ─────────── 1.0] │
│ │
│ Stranger (< 0.25): │
│ - No relationship history │
│ - Formal/cautious interactions │
│ - Limited trust │
│ │
│ Acquaintance (0.25 - 0.50): │
│ - Some interaction history │
│ - Friendly but professional │
│ - Moderate trust │
│ │
│ Friend (0.50 - 0.75): │
│ - Significant positive interactions │
│ - Casual, warm communication │
│ - High trust, willing to help │
│ │
│ Ally (>= 0.75): │
│ - Strong positive relationship │
│ - Priority attention │
│ - Highest trust level │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
Favorability Updates
# helpers.py:update_relationship()
def update_relationship(character, entity_id, delta, interaction_type=None):
    """
    Adjust entity favorability.

    Args:
        character: AssistantCharacter
        entity_id: Entity identifier
        delta: Change amount (-0.1 to +0.1 typical)
        interaction_type: Optional descriptor

    Returns:
        dict: {old_state, new_state, favorability, state_changed}
    """
    profile = get_entity_profile(character, entity_id)
    relationship = profile["relationship"]

    # Apply delta with bounds
    old_fav = relationship["favorability"]
    new_fav = max(0.0, min(1.0, old_fav + delta))
    relationship["favorability"] = new_fav
    relationship["last_delta"] = delta  # tracked in the profile schema

    # Update state based on thresholds
    old_state = relationship["state"]
    new_state = _calculate_state(new_fav)
    relationship["state"] = new_state

    # Record history
    relationship["history"].append({
        "delta": delta,
        "reason": interaction_type,
        "timestamp": now()
    })

    return {
        "old_state": old_state,
        "new_state": new_state,
        "favorability": new_fav,
        "state_changed": old_state != new_state
    }
4. Working Memory
Active Conversation Context
character.db.working_memory = {
    "#123": {                      # Entity ID
        "topic": "quest_help",     # Current conversation topic
        "started": "2025-12-06T15:00:00Z",
        "last_message": "2025-12-06T15:30:00Z",

        # Recent conversation messages (limited window)
        "messages": [
            {"role": "user", "content": "Can you help me find the sword?"},
            {"role": "assistant", "content": "I'd be happy to help..."}
        ],

        # Pending actions for this entity
        "pending_actions": [
            {
                "action": "Explain quest mechanics",
                "priority": 1,
                "created": "2025-12-06T15:30:00Z"
            }
        ],

        # Conversation state
        "context": {
            "mentioned_items": ["sword", "cave"],
            "unresolved_questions": ["sword location"]
        }
    }
}
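The "limited window" noted in the schema can be maintained with a helper along these lines. Simplified sketch: it operates directly on the working-memory dict rather than the character object, and max_messages=20 is an assumed window size (the real add_conversation_message() in helpers.py presumably also updates last_message).

```python
def add_conversation_message(working_memory, entity_id, content, role,
                             max_messages=20):
    """Append a message to an entity's working-memory window,
    keeping only the most recent max_messages entries (sketch)."""
    ctx = working_memory.setdefault(entity_id, {"messages": []})
    ctx["messages"].append({"role": role, "content": content})
    ctx["messages"] = ctx["messages"][-max_messages:]  # trim to last N
    return ctx["messages"]
```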
Working Memory Operations
┌─────────────────────────────────────────────────────────────────────────────┐
│ WORKING MEMORY FLOW │
│ ─────────────────────────────────────────────────────────────────────────── │
│ │
│ start_conversation(character, entity_id, topic) │
│ └─> Initialize or resume working memory context │
│ │
│ add_conversation_message(character, entity_id, content, role) │
│ └─> Append to messages list (limited to last N) │
│ │
│ add_pending_action(character, entity_id, action, priority) │
│ └─> Queue action for future processing │
│ │
│ get_conversation_context(character, entity_id) │
│ └─> Retrieve full working memory for entity │
│ │
│ clear_stale_conversations(character, max_age_hours=1) │
│ └─> Remove inactive conversations (called during sleep) │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
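The staleness sweep in the last step can be sketched as follows. Assumptions: timestamps are the ISO-8601 UTC strings shown in the schema, and this simplified version takes the working-memory dict directly, whereas the real clear_stale_conversations() takes the character object.

```python
from datetime import datetime, timedelta, timezone

def clear_stale_conversations(working_memory, max_age_hours=1):
    """Remove conversations whose last_message is older than
    max_age_hours; returns the cleared entity IDs (sketch)."""
    cutoff = datetime.now(timezone.utc) - timedelta(hours=max_age_hours)
    stale = [
        entity_id
        for entity_id, ctx in working_memory.items()
        # Normalize a trailing "Z" so fromisoformat() accepts it
        if datetime.fromisoformat(
            ctx["last_message"].replace("Z", "+00:00")) < cutoff
    ]
    for entity_id in stale:
        del working_memory[entity_id]
    return stale
```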
5. Episodic Index (Hybrid Search)
Search Algorithm
# helpers.py:search_episodic_memory()
def search_episodic_memory(
    entries,
    query,
    days_back=7,
    min_importance=5,
    alpha_recency=1.0,      # Weight for time decay
    alpha_importance=1.0,   # Weight for significance
    alpha_relevance=1.0,    # Weight for keyword match
):
    """
    Hybrid-scored episodic memory search.

    Score = α_r * recency + α_i * importance + α_q * relevance

    Recency:    Exponential decay based on age
    Importance: Normalized journal importance (1-10)
    Relevance:  Keyword/semantic match score
    """
Scoring Formula
┌─────────────────────────────────────────────────────────────────────────────┐
│ HYBRID SCORING │
│ ─────────────────────────────────────────────────────────────────────────── │
│ │
│ Final Score = (α_r × Recency) + (α_i × Importance) + (α_q × Relevance) │
│ │
│ Recency Score: │
│ - exp(-age_days / decay_constant) │
│ - Recent entries score higher │
│ - Configurable decay rate │
│ │
│ Importance Score: │
│ - journal_importance / 10 │
│ - Normalized to 0.0-1.0 │
│ │
│ Relevance Score: │
│ - Keyword overlap with query │
│ - TF-IDF or simple matching │
│ - Semantic similarity (if embeddings available) │
│ │
│ Default weights: α_r = 1.0, α_i = 1.0, α_q = 1.0 (equal weighting) │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
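Putting the three components together, the per-entry score can be sketched as below. Assumptions: decay_constant=7.0 is an illustrative default (the document only says the decay rate is configurable), and relevance is a pre-computed 0.0-1.0 match score.

```python
import math

def hybrid_score(age_days, importance, relevance,
                 alpha_recency=1.0, alpha_importance=1.0,
                 alpha_relevance=1.0, decay_constant=7.0):
    """Combine recency, importance, and relevance per the formula
    above (sketch; decay_constant is an assumed default)."""
    recency = math.exp(-age_days / decay_constant)  # exponential time decay
    norm_importance = importance / 10.0             # journal importance 1-10
    return (alpha_recency * recency
            + alpha_importance * norm_importance
            + alpha_relevance * relevance)
```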
6. Entity Memory Tools
| Tool | Purpose |
|---|---|
| recall_entity_profile | Retrieve profile + recent observations |
| update_entity_observation | Add new observation |
| update_relationship | Adjust favorability |
| start_entity_conversation | Initialize working memory |
| recall_conversation_context | Get active conversation state |
| add_pending_action | Queue action for entity |
| search_episodic_memory | Hybrid-scored journal search |
7. Consolidation Flow (O-MEM)
┌─────────────────────────────────────────────────────────────────────────────┐
│ OBSERVATION → ATTRIBUTE CONSOLIDATION │
│ (During sleep phase) │
│ ─────────────────────────────────────────────────────────────────────────── │
│ │
│ 1. Find entities with >= 5 observations │
│ │
│ 2. For each entity: │
│ ┌─────────────────────────────────────────────────────────────────────┐ │
│ │ Observations (Pf): │ │
│ │ - "Alice mentioned she prefers tea" │ │
│ │ - "Alice ordered tea at tavern" │ │
│ │ - "Alice refused the coffee" │ │
│ └─────────────────────────────────────────────────────────────────────┘ │
│ │ │
│ ▼ LLM Synthesis │
│ ┌─────────────────────────────────────────────────────────────────────┐ │
│ │ Attribute (Pa): │ │
│ │ { │ │
│ │ "type": "preference", │ │
│ │ "value": "Strongly prefers tea, actively dislikes coffee" │ │
│ │ } │ │
│ └─────────────────────────────────────────────────────────────────────┘ │
│ │
│ 3. Add attribute to profile │
│ 4. Optionally prune consumed observations │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
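The four-step flow above can be sketched as a batch routine. The `synthesize` callable stands in for the LLM synthesis step; the real logic lives in helpers.py:consolidate_entity_observations() and run_entity_consolidation_batch(), and this sketch always prunes consumed observations, whereas pruning is optional in the actual flow.

```python
def run_entity_consolidation_batch(profiles, synthesize, min_observations=5):
    """Consolidate Pf -> Pa for every entity with enough observations
    (sketch of the sleep-phase flow above)."""
    consolidated = []
    for entity_id, profile in profiles.items():
        observations = profile.get("observations", [])
        if len(observations) < min_observations:
            continue                              # 1. threshold check
        attribute = synthesize(observations)      # 2. LLM synthesis (stubbed)
        profile.setdefault("attributes", []).append(attribute)  # 3. add Pa
        profile["observations"] = []              # 4. prune consumed Pf
        consolidated.append(entity_id)
    return consolidated
```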
8. Key Files
| File | Relevant Functions |
|---|---|
| helpers.py | create_entity_profile(), get_entity_profile() |
| helpers.py | add_entity_observation(), update_relationship() |
| helpers.py | start_conversation(), add_conversation_message() |
| helpers.py | search_episodic_memory() |
| helpers.py | consolidate_entity_observations() |
| helpers.py | run_entity_consolidation_batch() |
Document created: 2025-12-06