Auto-compact doesn't work. Anthropic flagged this as fixed on January 15, but the issue persists. When the context window fills up, instead of auto-compacting and continuing the conversation, one of two things happens: the message silently returns to the input box with no response, or a "limit reached" error appears.
This happens even when the context being fed to the chat shouldn't be anywhere near the 200k token limit.
Auto-compact should trigger automatically when approaching context limits, allowing the conversation to continue without manual intervention.
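To illustrate the expected behavior described above, here is a minimal sketch in Python. The names (`maybe_compact`, `CONTEXT_LIMIT`, `COMPACT_THRESHOLD`) and the 90% trigger point are assumptions for illustration only, not Anthropic's actual implementation or documented values.

```python
# Hypothetical sketch of expected auto-compact behavior: compaction
# should fire automatically once the running token count nears the
# context limit, instead of silently failing.
CONTEXT_LIMIT = 200_000   # tokens; the window size cited in the report
COMPACT_THRESHOLD = 0.9   # assumed trigger point, not documented

def maybe_compact(history_tokens: int, compact) -> int:
    """Return the (possibly reduced) token count, compacting near the limit."""
    if history_tokens >= CONTEXT_LIMIT * COMPACT_THRESHOLD:
        # Summarize older turns so the conversation can continue.
        history_tokens = compact(history_tokens)
    return history_tokens

# Example: a compactor that keeps roughly a quarter of the history.
shrunk = maybe_compact(190_000, lambda t: t // 4)   # compacts: 47_500
kept = maybe_compact(10_000, lambda t: t // 4)      # untouched: 10_000
```

The point of the sketch is only that the trigger should be automatic and transparent to the user; no manual intervention, and no silently dropped messages.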
Usually none; the message just silently returns to the input box. Occasionally a "limit reached" error appears.
Not sure / Multiple models
Yes, this worked in a previous version
Claude.ai (web/desktop) worked prior to the January 14 major outage and the January 15 compaction incident. Broken since.
N/A
Other
Windows
Other
The issue was reported previously and marked as resolved on January 15. Still broken as of January 17.
Happens within Projects; unsure whether it affects non-Project chats.
Note: Submitting here because there's no dedicated bug tracker for Claude.ai. This affects the web and desktop interfaces, not Claude Code specifically. Auto-compact was functional before the January 14 outage and has remained broken despite being flagged as fixed on January 15.