There should be a simple button that allows you to refine the context. A fresh LLM could generate a new context from the inputs and outputs of the chat history, and then another fresh LLM could start over with that context.
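Roughly what I mean, as a minimal sketch (the model name and prompt wording are just placeholders, and it assumes the OpenAI chat completions client):

```python
from openai import OpenAI

client = OpenAI()

def refine_context(history: list[dict]) -> str:
    """Ask a fresh model to distill the chat history into a new context."""
    transcript = "\n".join(f"{m['role']}: {m['content']}" for m in history)
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative; any capable model would do
        messages=[
            {"role": "system", "content": "Summarize this conversation into a "
             "concise working context: goals, decisions made, open questions."},
            {"role": "user", "content": transcript},
        ],
    )
    return resp.choices[0].message.content

def start_fresh(context: str, question: str) -> str:
    """Start over with only the refined context, not the raw history."""
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": context},
            {"role": "user", "content": question},
        ],
    )
    return resp.choices[0].message.content
```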
You are saying “fresh LLM”, but really I think you’re referring to a curated context. The existing coding agents already have mechanisms for this: saving the context to a file, editing the file, and clearing all context except for that file. It’s sort of clunky now, but it will get better and slicker.
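The save/edit/clear workflow, stripped of any particular agent, is basically this (file name and structure are just illustrative):

```python
from pathlib import Path

CONTEXT_FILE = Path("context.md")  # hypothetical name; each agent uses its own

def save_context(summary: str) -> None:
    """Persist the curated context so it survives clearing the session."""
    CONTEXT_FILE.write_text(summary)

def load_context() -> str:
    """After clearing everything else, this file is all the model sees."""
    return CONTEXT_FILE.read_text()

# Between save and load you edit context.md by hand -- that's the curation step.
```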