The Problem: 100-250ms Middleware Tax
Our Node.js Express backend had a dirty secret: before any route handler ran, the middleware stack consumed 100-250ms. Four separate Redis round-trips, covering rate limiting, session loading, and XSS sanitization, all executed serially.
For a platform targeting billions of users, this was unacceptable.
The Solution: CacheeEngine in Rust
We replaced the entire JS middleware stack with an in-process Rust cache engine called CacheeEngine:
| Component | What It Does | Memory |
|---|---|---|
| CacheeLFU eviction | Frequency counters with periodic halving decay | Inline |
| Count-Min Sketch | Admission doorkeeper that rejects one-hit wonders | 512 KiB, constant |
| DashMap storage | Lock-free concurrent reads | O(entries) |
| SWR support | Stale-while-revalidate built in | Built-in |
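The SWR row deserves a closer look. Below is a hedged sketch of a stale-while-revalidate read path; the type and function names are illustrative, not CacheeEngine's actual API. A hit inside the fresh TTL returns immediately; a hit inside the stale window returns the old value while signaling the caller to refresh in the background; anything else is a miss.

```rust
use std::time::{Duration, Instant};

// Illustrative entry shape: a value plus two deadlines.
struct Entry<V> {
    value: V,
    fresh_until: Instant, // serve as-is before this point
    stale_until: Instant, // after fresh_until, serve stale and revalidate
}

enum Lookup<V> {
    Fresh(V),
    Stale(V), // serve now, refresh in the background
    Miss,
}

fn lookup<V: Clone>(entry: Option<&Entry<V>>, now: Instant) -> Lookup<V> {
    match entry {
        Some(e) if now < e.fresh_until => Lookup::Fresh(e.value.clone()),
        Some(e) if now < e.stale_until => Lookup::Stale(e.value.clone()),
        _ => Lookup::Miss,
    }
}

fn main() {
    let now = Instant::now();
    let e = Entry {
        value: "swap-quote",
        fresh_until: now + Duration::from_millis(50),
        stale_until: now + Duration::from_millis(500),
    };
    assert!(matches!(lookup(Some(&e), now), Lookup::Fresh(_)));
    assert!(matches!(
        lookup(Some(&e), now + Duration::from_millis(100)),
        Lookup::Stale(_)
    ));
    assert!(matches!(lookup::<&str>(None, now), Lookup::Miss));
    println!("SWR sketch ok");
}
```

Serving stale under a bounded window is what lets a path like the swap-quote lookup answer in microseconds instead of waiting on an upstream refresh.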
CacheeLFU vs W-TinyLFU
We chose CacheeLFU over W-TinyLFU (the policy behind Caffeine and moka) for three reasons:
- No window cache: candidates go straight to the admission check rather than through a probationary LRU window.
- Admission by direct frequency comparison: the candidate's Count-Min Sketch estimate is compared against the eviction victim's.
- Periodic halving decay instead of reset-on-epoch aging, so popularity fades gradually rather than being cut at sample boundaries.
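The halving decay from the last bullet can be sketched in a few lines. This is a minimal illustration, not CacheeEngine's code: the interval constant and the `HashMap` backing (stand-in for per-entry counters) are assumptions.

```rust
use std::collections::HashMap;

// Assumption: decay every N operations; a real engine would tune this.
const DECAY_INTERVAL: u64 = 10_000;

struct FrequencyTable {
    counts: HashMap<String, u32>,
    ops: u64,
}

impl FrequencyTable {
    fn new() -> Self {
        Self { counts: HashMap::new(), ops: 0 }
    }

    fn touch(&mut self, key: &str) {
        *self.counts.entry(key.to_string()).or_insert(0) += 1;
        self.ops += 1;
        if self.ops % DECAY_INTERVAL == 0 {
            self.decay();
        }
    }

    // Halving preserves the relative ordering of hot vs. cold keys
    // while letting stale popularity age toward zero.
    fn decay(&mut self) {
        for c in self.counts.values_mut() {
            *c >>= 1;
        }
        // Keys that decayed to zero no longer occupy space.
        self.counts.retain(|_, c| *c > 0);
    }
}

fn main() {
    let mut table = FrequencyTable::new();
    for _ in 0..8 {
        table.touch("hot");
    }
    table.touch("cold");
    table.decay(); // 8 -> 4, 1 -> 0 (evicted from the table)
    assert_eq!(table.counts.get("hot"), Some(&4));
    assert_eq!(table.counts.get("cold"), None);
    println!("after decay: {:?}", table.counts);
}
```

Compared with an epoch reset that rewrites every counter at once, halving keeps frequency information continuous: a key that was twice as hot before decay is still twice as hot after it.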
Count-Min Sketch: 512 KiB That Saves Everything
The CMS acts as the doorkeeper: every candidate's estimated frequency is consulted before admission. Layout: 4 rows x 131,072 one-byte counters = 512 KiB, constant regardless of how many entries the cache holds.
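A sketch with exactly that layout follows. The hashing scheme (std `DefaultHasher` seeded per row) and the `admit` rule are illustrative assumptions; the dimensions match the 4 x 131,072 x 1-byte description above.

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

const ROWS: usize = 4;
const COLS: usize = 131_072; // power of two, so a bitmask works as modulo
const MASK: usize = COLS - 1;

struct CountMinSketch {
    // ROWS * COLS one-byte counters = 524,288 bytes = 512 KiB,
    // independent of how many keys pass through.
    counters: Vec<u8>,
}

impl CountMinSketch {
    fn new() -> Self {
        Self { counters: vec![0u8; ROWS * COLS] }
    }

    // One slot per row; mixing the row index into the hash stands in
    // for having ROWS independent hash functions.
    fn index(&self, key: &str, row: usize) -> usize {
        let mut h = DefaultHasher::new();
        key.hash(&mut h);
        row.hash(&mut h);
        row * COLS + (h.finish() as usize & MASK)
    }

    fn increment(&mut self, key: &str) {
        for row in 0..ROWS {
            let i = self.index(key, row);
            // One-byte counters: saturate instead of wrapping.
            self.counters[i] = self.counters[i].saturating_add(1);
        }
    }

    // Estimate = minimum across rows; a CMS only ever over-counts,
    // so the minimum is the tightest available bound.
    fn estimate(&self, key: &str) -> u8 {
        (0..ROWS)
            .map(|row| self.counters[self.index(key, row)])
            .min()
            .unwrap()
    }

    // Doorkeeper rule (assumed): admit a candidate only if it is at
    // least as frequent as the eviction victim, rejecting one-hit wonders.
    fn admit(&self, candidate: &str, victim: &str) -> bool {
        self.estimate(candidate) >= self.estimate(victim)
    }
}

fn main() {
    let mut cms = CountMinSketch::new();
    for _ in 0..5 {
        cms.increment("hot-key");
    }
    cms.increment("one-hit-wonder");
    assert!(cms.estimate("hot-key") >= 5);
    assert!(cms.admit("hot-key", "one-hit-wonder"));
    println!("hot-key estimate: {}", cms.estimate("hot-key"));
}
```

Because the sketch never stores keys, only counters, its 512 KiB footprint stays flat whether the cache holds a thousand entries or a hundred million.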
Results
| Path | JS (Express) | Rust (Axum + Cachee) |
|---|---|---|
| Middleware | 100-250ms | <5ms |
| Trust score hit | 5-10ms | <1us |
| Mining sync | 50ms-2s | <10ms |
| Swap quote (stale serve) | 40-150ms | <1us |
| Rate limit | 30-50ms | <100ns |
The stripped binary weighs in at 5.2 MB, and all 15 tests pass.
RevMine is live. Check GitHub for open-source components.
Built on Cachee, the H33 post-quantum cache engine with CacheeLFU eviction.