Context
This is about the auto-save functionality that keeps the SpecStory-saved .md file up to date at all times.
Expected / wished-for behavior
The file is auto-saved upon prompt submission and then updated repeatedly with new response tokens as they stream in live: writing to disk on every token, or possibly every 0.25-1 seconds, until the response is finished (rough sketch below).
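Purely for illustration, here's a minimal sketch of the kind of throttled writer I have in mind; the `onToken`/`finish` names and the 500 ms interval are my own assumptions, not anything from SpecStory:

```ts
import { writeFile } from "node:fs/promises";

// Hypothetical throttled writer: buffers streamed tokens and rewrites the
// whole transcript to the .md file at most once per interval.
function createThrottledSaver(targetPath: string, intervalMs = 500) {
  let transcript = "";
  let dirty = false;

  const timer = setInterval(async () => {
    if (!dirty) return;
    dirty = false;
    await writeFile(targetPath, transcript, "utf8");
  }, intervalMs);

  return {
    onToken(token: string) {
      transcript += token;
      dirty = true;
    },
    async finish() {
      clearInterval(timer);
      await writeFile(targetPath, transcript, "utf8"); // final flush
    },
  };
}
```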
Current behavior
Instead, auto-save only ever fires immediately upon prompt submission, saving no reply data because none is available yet at that moment.
No matter how long it's left afterwards, the response is never added to the .md file.
Thanks so much
This would be hugely helpful for keeping all my systems, including git and supporting tools, much more tightly in sync with the current moment during my flow of work.
Super-useful tool, btw! Much appreciation and enthusiasm for the idea of it, and I think this change would take it to the next level of rubber-meets-the-road utility for that most-fresh, in-the-moment "tying-things-together" developer experience! :D
Your description isn't totally accurate @Korolen in terms of how SpecStory works, but the essence of your concern still holds. We don't actually know anything "upon prompting itself"; we get no invocation or signal of that happening at all.
Instead, what we're doing is watching the SQLite DB file that Cursor uses to store the chat history. We watch that for changes at the OS level, and whenever it changes we look for updated chats that have a new date since the last time we auto-saved that chat.
This does explain the delay that's often seen between the time a human/AI interaction takes place and when the content makes it into an auto-save file: we are dependent on the chat interaction making it out to the DB on disk, which doesn't happen in real time (or "streaming", as you suggest). Cursor writes to in-memory DB pages, and those get flushed to disk periodically. We only see the changes when they get flushed to disk.
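Roughly, the flow looks like the sketch below. To be clear, this is a simplified illustration rather than our actual code: the DB path, the `chats` table, and its columns are hypothetical stand-ins for Cursor's real schema, and `better-sqlite3` is just one way to read the file.

```ts
import { watch } from "node:fs";
import Database from "better-sqlite3";

type ChatRow = { id: string; updated_at: number; content: string };

// NOTE: the DB path, the "chats" table, and its columns are hypothetical
// placeholders; Cursor's real schema is not documented here.
const dbPath = "/path/to/cursor/state.sqlite";
let lastSavedAt = 0; // timestamp of the newest chat already auto-saved

function autoSaveChat(row: ChatRow): void {
  // write or refresh the corresponding .md file (omitted)
}

watch(dbPath, () => {
  // Only pages Cursor has already flushed to disk are visible here, which is
  // why auto-save lags behind the live streaming response in the UI.
  const db = new Database(dbPath, { readonly: true });
  try {
    const rows = db
      .prepare("SELECT id, updated_at, content FROM chats WHERE updated_at > ?")
      .all(lastSavedAt) as ChatRow[];
    for (const row of rows) {
      autoSaveChat(row);
      lastSavedAt = Math.max(lastSavedAt, row.updated_at);
    }
  } finally {
    db.close();
  }
});
```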
What can happen, though, is that if the AI response takes a while (maybe it thinks for a while on something, or has tool use that's slow, or what have you), a partial and incomplete portion of the AI response may get flushed to disk. Later, when the rest of the AI response makes it to disk, we check the time on the AI interaction, see that it's one we've already seen, and don't update the auto-save unless/until there is another prompt from the user (or you can force a regeneration by deleting the auto-saved markdown file itself). This is problematic, and we'll use this issue #57 to track improving this.
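One possible direction (hypothetical, not a committed design) is to dedupe on a hash of the chat content rather than on the interaction timestamp alone, so a later, fuller flush of the same response still triggers a re-save:

```ts
import { createHash } from "node:crypto";

// Track what was last written per chat, keyed by content hash rather than
// by the interaction timestamp.
const lastSavedHash = new Map<string, string>();

function shouldResave(chatId: string, content: string): boolean {
  const hash = createHash("sha256").update(content).digest("hex");
  if (lastSavedHash.get(chatId) === hash) {
    return false; // nothing new has been flushed to disk for this chat
  }
  lastSavedHash.set(chatId, hash);
  return true; // first sighting, or a fuller version of a partial response
}
```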
belucid changed the title from "Doesn't auto-save response to latest prompt" to "Race condition on auto-saving long AI responses" on Apr 30, 2025
belucid changed the title from "Race condition on auto-saving long AI responses" to "Race condition on auto-saving long (in time) AI responses" on Apr 30, 2025