Conversations shouldn't be a single thread
We introduce the Conversation Tree Architecture - a hierarchical framework that organizes LLM conversations into isolated, context-aware nodes with principled mechanisms for branching, merging, and flow.
Definitions
Conversation Unit: The atomic element of a conversation - a human prompt paired with a model response. The minimal unit of meaning from which all higher-order structures are assembled.
Branch: A deliberate split from an existing conversation, initiated by the user to explore an alternative direction without contaminating the parent context.
Downstream Context Flow: The selective passage of information from a parent node to a newly created child node at the moment of branching.
Upstream Context Merge: When a child node is deleted, relevant content from its context window is incorporated back into the parent.
Context Flow: The broader class of operations by which context migrates across the tree - encompassing upstream, downstream, and any lateral transfer between sibling or cousin nodes.
Volatile Node: A transient branch that exists only for the duration of a session. At session end, its context window is either merged upstream or purged entirely.
The Problem
The fundamental limitation of current LLM conversation interfaces lies in their treatment of context as a single, shared, append-only resource. When a user wishes to explore a tangential question, debug an edge case, or pursue an alternative framing of a problem, they are forced to do so within the same context window that governs the main thread. Over time, this produces conversations that are topically diffuse, difficult to navigate, and increasingly prone to model responses that fail to reflect the user's true intent.
We characterize this failure mode as logical context poisoning: the progressive corruption of conversational coherence caused not by erroneous model outputs per se, but by the structural mismanagement of context.
The Architecture
The Conversation Tree Architecture organizes LLM conversations as hierarchical tree structures in which each node is a self-contained conversational unit with its own isolated context window, and edges represent structured context-passing relationships between nodes.
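The node-and-edge structure described above can be sketched as a minimal data model. This is an illustrative sketch, not the proposal's specified implementation; the class and field names (`ConversationUnit`, `ConversationNode`, `branch`, `volatile`) are assumptions chosen to mirror the definitions.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ConversationUnit:
    """Atomic element: one human prompt paired with one model response."""
    prompt: str
    response: str

@dataclass
class ConversationNode:
    """A self-contained conversational unit with its own isolated context window."""
    context: list = field(default_factory=list)   # list of ConversationUnit
    parent: Optional["ConversationNode"] = None
    children: list = field(default_factory=list)  # list of ConversationNode
    volatile: bool = False

    def branch(self, seed_context, volatile=False):
        """Create a child node, passing only the selected units downstream.

        The child's context is a copy, so later turns in either node
        never leak into the other - this is the isolation property.
        """
        child = ConversationNode(context=list(seed_context),
                                 parent=self, volatile=volatile)
        self.children.append(child)
        return child
```

Because `branch` copies the seed units, appending new units to the child leaves the parent's context window untouched, which is exactly the isolation the architecture requires.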
Context Flow
Intelligent downstream flow is essential to avoid pre-loading a branch with irrelevant parent context that would itself become a source of poisoning. Upstream context merging is the inverse operation: rather than discarding a deleted child node, a distilled version of its context window is folded back into the parent.
Downstream Context Flow
Selective passage of information from a parent to a child at branch time. The open design question is what counts as relevant: should the child inherit the full parent context, a condensed summary, or only a selected subset of units?
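The three candidate policies - full context, summary, or selected units - can be expressed as a single selection function. A minimal sketch, assuming a hypothetical `downstream_context` helper; in practice the `summarize` callback would be an LLM call and `is_relevant` a learned relevance scorer, both stubbed here as plain callables.

```python
def downstream_context(parent_context, strategy="selected",
                       is_relevant=None, summarize=None):
    """Choose what a new child node inherits at branch time.

    strategy: 'full'     -> copy the entire parent context
              'summary'  -> one condensed unit (summarize is an LLM call in practice)
              'selected' -> only units the is_relevant predicate accepts
    """
    if strategy == "full":
        return list(parent_context)
    if strategy == "summary":
        return [summarize(parent_context)]
    if strategy == "selected":
        return [unit for unit in parent_context if is_relevant(unit)]
    raise ValueError(f"unknown strategy: {strategy}")
```

The tradeoff is visible in the interface: `full` maximizes continuity but re-imports every potential source of poisoning, while `selected` and `summary` shrink the child's window at the cost of a relevance-estimation step.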
Upstream Context Merge
Distillation of a child node's contents back into the parent when the child is deleted. The open questions are where the distilled content inserts into the parent's context window and at what granularity.
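The two insertion positionings named later in the proposal (chronological versus append) can be sketched as follows. This is an illustrative sketch: `merge_upstream`, `distill`, and `branch_index` are assumed names, and `distill` stands in for whatever condensation step (summarization, relevance filtering) the merge strategy uses.

```python
def merge_upstream(parent_context, child_context, distill, branch_index=None):
    """Fold a distilled version of a deleted child's context into the parent.

    branch_index=None  -> append positioning: the digest lands at the end
                          of the parent's context window.
    branch_index=k     -> chronological positioning: the digest is inserted
                          at the point where the branch was originally created.
    """
    digest = distill(child_context)
    if branch_index is None:
        parent_context.append(digest)
    else:
        parent_context.insert(branch_index, digest)
    return parent_context
```

Chronological insertion preserves the temporal order in which ideas arose; append insertion treats the branch's outcome as the most recent information, which may suit models that weight recent context more heavily.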
Volatile Nodes
Volatile nodes are a distinct class of branch nodes defined by their transient lifecycle. A volatile node exists only for the duration of a session and carries no persistent state.
When a volatile node is deleted or the session ends, its local context window must either be merged upstream into the parent according to the upstream context merging rules or be purged entirely, in which case its content is lost. Volatile nodes are particularly useful for exploratory or experimental conversational threads.
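The merge-or-purge lifecycle maps naturally onto a session-scoped resource. A minimal sketch using a context manager, assuming a hypothetical `volatile_branch` helper: passing a `merge` callback selects the upstream-merge path on exit, while omitting it purges the branch.

```python
from contextlib import contextmanager

@contextmanager
def volatile_branch(parent_context, seed, merge=None):
    """Session-scoped branch: merged upstream on exit if `merge` is given,
    otherwise purged entirely (its content is lost)."""
    local = list(seed)  # isolated copy of the downstream-passed context
    try:
        yield local
    finally:
        if merge is not None:
            parent_context.append(merge(local))
        # else: `local` simply goes out of scope - purge path
```

Tying the branch's lifetime to a `with` block guarantees the merge-or-purge decision is made exactly once, even if the session ends abnormally.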
Significance
The problems addressed by this proposal are not merely theoretical inconveniences. Context poisoning is a real and recurring phenomenon that degrades the utility of LLM tools precisely in the high-stakes, high-complexity scenarios where they are most needed.
A researcher maintains a primary analytical thread while spinning off volatile branches to explore citations, definitions, or counterarguments.
A software engineer maintains a high-level design conversation while branching into debugging sessions that are later summarized and merged back.
A writer preserves a narrative planning thread while experimenting with alternative scenes or tones in isolated branches.
Beyond individual utility, the architecture provides a conceptual framework for future work on multi-agent LLM systems, where context management across multiple cooperating agents is an open and pressing research problem. The node-and-flow abstraction generalizes naturally.
Research Goals
Formalize the Architecture
Define the computational model — data structures, operations, and invariants governing node creation, context flow, and deletion.
Intelligent Downstream Passing
Develop algorithms for selecting and compressing parent context prior to branching, maximizing child performance while minimizing irrelevant carryover.
Upstream Merge Strategies
Investigate tradeoffs among condensation granularity, relevance filtering, and insertion positioning (chronological versus appended).
Empirical Evaluation
Compare the proposed architecture against baseline linear interfaces, measuring output quality, task completion, and user-reported coherence.
Timeline
Phase 1
- Formalize theoretical model
- Implement prototype interface
- Evaluate downstream context passing on benchmarks
Phase 2
- Upstream merge algorithm development
- Comprehensive empirical evaluation
- Prepare thesis document
- Research presentation