How We Built Temporal Decay: Making AI Memory Feel Human
Most AI memory systems remember everything forever. Here's why that's wrong, and how we modeled Haiven's decay system on actual human memory research.
When we started building Haiven’s memory system, the obvious approach was to store everything. Modern storage is cheap. Vector databases can handle billions of embeddings. Why let anything fade?
Then we realized: human memory doesn’t work that way. And there’s a reason.
The Problem with Perfect Memory
Imagine if you remembered every meal you’ve ever eaten with equal clarity. Every conversation with equal importance. Every passing thought preserved forever.
It would be overwhelming. Worse, it would be useless.
The value of memory isn’t in raw storage - it’s in relevance. What matters is having the right information surface at the right time. And that requires forgetting.
Ebbinghaus and the Forgetting Curve
In 1885, German psychologist Hermann Ebbinghaus discovered something fundamental about memory: we forget in a predictable pattern.
His “forgetting curve” shows that memory retention drops exponentially over time - unless reinforced through recall. Access a memory, and it strengthens. Let it sit, and it fades.
This isn’t a bug in human cognition. It’s a feature. The brain constantly optimizes for what’s likely to be useful. Recent, frequently accessed, and emotionally significant information stays accessible. The rest gracefully fades to make room.
Modeling Decay in Haiven
We built Haiven’s decay system on these principles. Here’s how it works:
Base Decay
Every memory has a “decay score” that decreases over time. The rate depends on the type of memory:
- Transient memories (preferences, states) decay faster
- Historical memories (events, decisions) decay slower
- Core identity (values, personality) barely decays at all
A memory about your lunch preference might decay in weeks. A memory about your core values might persist for years.
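As a rough sketch of how type-dependent decay might look (the rate values here are illustrative assumptions, not Haiven's actual numbers), each memory type maps to a daily retention rate, and the half-life follows directly:

```python
import math

# Illustrative per-type daily retention rates (assumed values):
# the closer to 1.0, the slower the memory decays.
BASE_RATES = {
    "transient": 0.90,   # preferences, states
    "historical": 0.99,  # events, decisions
    "core": 0.9999,      # values, personality
}

def half_life_days(base_rate: float) -> float:
    """Days until a score halves under score *= base_rate per day."""
    return math.log(0.5) / math.log(base_rate)
```

With these assumed rates, a transient memory halves in about a week, a historical one in a couple of months, and a core memory only over many years.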
Reinforcement Through Use
When a memory is accessed - retrieved in a search, referenced in a conversation, explicitly recalled - its decay score resets. Frequently used memories stay strong.
This creates a natural feedback loop: memories that are useful get reinforced. Memories that aren’t fade away. The system learns what matters through actual usage patterns.
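A minimal sketch of that feedback loop (the class name and the reset-to-full-score behavior are assumptions for illustration):

```python
from dataclasses import dataclass

@dataclass
class Memory:
    score: float = 1.0

    def tick(self, base_rate: float = 0.95) -> None:
        # One day passes without access: the score decays exponentially.
        self.score *= base_rate

    def access(self) -> float:
        # Retrieval reinforces the memory: the decay score resets.
        self.score = 1.0
        return self.score
```

Ten untouched days drop the score to roughly 0.6; a single retrieval restores it to full strength.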
Importance Weighting
Some memories matter more than others, regardless of access frequency. We detect this through:
- Emotional intensity: Memories marked with strong emotions decay slower
- Explicit importance: If you star or pin a memory, it resists decay
- Contextual significance: Memories tied to active projects stay relevant
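One plausible way to fold these signals into a single multiplier (the weights and the `pinned`/`active_project` parameters are illustrative, not Haiven's actual API):

```python
def importance_multiplier(emotional_intensity: float,
                          pinned: bool,
                          active_project: bool) -> float:
    """Return a factor >= 1.0 that slows decay for important memories.

    emotional_intensity is assumed to be in [0.0, 1.0].
    """
    m = 1.0 + 0.5 * emotional_intensity  # strong emotion slows decay
    if pinned:
        m *= 2.0                         # explicit importance resists decay
    if active_project:
        m *= 1.5                         # contextual significance
    return m
```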
The Tier System
Based on decay scores, memories fall into tiers:
- Hot (0.8-1.0): Core memories, recently used, highly relevant
- Warm (0.5-0.8): Accessible with context, may need refresh
- Cold (0.2-0.5): Archive territory, retrieval requires effort
- Faded (<0.2): Candidates for cleanup or compression
This mirrors how human memory works. Some things are instantly accessible. Others require effort to recall. And some are genuinely lost.
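In code, the tiering reduces to a simple threshold lookup (assigning boundary scores to the higher tier is an assumption, since the ranges above share endpoints):

```python
def tier(score: float) -> str:
    # Thresholds match the tier table above; a score exactly on a
    # boundary is assigned to the higher tier.
    if score >= 0.8:
        return "hot"
    if score >= 0.5:
        return "warm"
    if score >= 0.2:
        return "cold"
    return "faded"
```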
The Technical Implementation
Under the hood, decay runs as a background process:
decayed_score = current_score * base_rate ^ days_since_last_access
adjusted_score = decayed_score * importance_multiplier
final_score = max(adjusted_score, minimum_threshold)
Users can tune these parameters. The base rate is a per-day retention factor between 0 and 1: smaller values mean faster decay. Want memories to persist longer? Raise it toward 1.0. Want aggressive cleanup? Lower it toward 0.9.
We also let users set per-category rates. Your work memories might decay faster than your learning notes. Your health information might persist longer than social context.
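Putting the pieces together, here is a sketch of the whole scoring pipeline with per-category overrides (the default values and category names are assumptions, not Haiven's shipped configuration):

```python
# Assumed defaults; users could override the rate per category.
DEFAULT_BASE_RATE = 0.95
CATEGORY_BASE_RATES = {
    "work": 0.93,      # decays faster
    "learning": 0.97,  # decays slower
    "health": 0.99,    # persists longest
}
MINIMUM_THRESHOLD = 0.05

def decay_score(current_score: float,
                days_since_last_access: int,
                category: str = "default",
                importance_multiplier: float = 1.0) -> float:
    """Apply exponential decay, importance weighting, and a score floor."""
    base_rate = CATEGORY_BASE_RATES.get(category, DEFAULT_BASE_RATE)
    decayed = current_score * base_rate ** days_since_last_access
    adjusted = decayed * importance_multiplier
    return max(adjusted, MINIMUM_THRESHOLD)
```

After a month untouched, a "work" memory under these assumed rates scores lower than a "health" memory, and no score ever falls below the floor that marks it for cleanup.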
Why This Matters
Most AI memory systems are all-or-nothing. Either everything is remembered with equal weight, or memories are manually deleted. Neither approach is natural.
Temporal decay creates a dynamic, living memory system that:
- Automatically prioritizes recent, relevant information
- Reduces noise from outdated context
- Responds to your actual usage patterns
- Feels more human in its behavior
When you ask an AI with Haiven memory about “that thing from last month,” it knows which things from last month are likely to matter. The system has been quietly learning what’s important to you - not through explicit programming, but through the natural patterns of decay and reinforcement.
Want AI that remembers you intelligently? Try Haiven and experience memory that works like yours.