Feature: Importance-weighted truncation #24

Open
opened 2025-12-11 00:06:47 +00:00 by blightbow · 0 comments

Overview

Implement truncation that preserves narrative coherence by scoring event importance, rather than using simple FIFO eviction.

Truncation Strategy

When the perception stream exceeds the token budget:

  1. Never truncate: Most recent N events (configurable, default 10)
  2. Summarize first: Events with narrative_text longer than threshold
  3. Drop by importance: Remove lowest importance events first
  4. Preserve causality: Keep action→result pairs together
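
The thresholds in the strategy above could be collected in a small config object. This is a sketch only; the names (`TruncationConfig`, `summarize_over_chars`) are hypothetical and not from the codebase:

```python
from dataclasses import dataclass

@dataclass
class TruncationConfig:
    max_tokens: int = 4000           # overall token budget for the stream
    keep_recent: int = 10            # most recent N events are never truncated
    summarize_over_chars: int = 500  # narrative_text longer than this is summarized first

cfg = TruncationConfig()
```

Making these dataclass fields (rather than module constants) keeps the defaults overridable per agent or per test.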

Algorithm

def truncate_perception_stream(stream, max_tokens, model):
    """
    Truncate stream while preserving coherence.

    Priority order:
    1. Recent events (always keep)
    2. High importance events
    3. Self-action chains (keep together)
    4. Older/lower importance events (drop first)
    """

Importance Preservation Rules

  • Self-action + result pairs stay together
  • @mention events preserved longer than ambient
  • Causal chains (trigger → response) kept intact
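
One way to express these rules is a scoring heuristic that the drop pass can sort by. The weights and field names below are illustrative assumptions, not values from the codebase:

```python
def score_importance(event):
    """Hypothetical importance heuristic (0-100) for a perception event.

    Ambient events get a low baseline; @mentions and the agent's own
    actions get boosts so they survive truncation longer.
    """
    score = 20  # ambient baseline
    if event.get("mentions_self"):
        score += 50  # @mention events are preserved longer than ambient
    if event.get("is_self_action"):
        score += 30  # keeps self-action chains near the top
    return min(score, 100)
```

Causal-chain integrity is better handled structurally (dropping linked events as a unit) than by scores alone, since a high-scoring response can otherwise outlive its low-scoring trigger.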

Files to Modify

  • llm_interaction.py - Truncation logic in prompt assembly
  • api/serializers/workbench.py - Update for new model
  • Tests for truncation edge cases

Phase

Phase 4 of Milestone #4


Reference
blightbow/evennia_ai#24