
Fix inline-edit prompts chat building #6003

Merged: vovakulikov merged 1 commit into main from vk/fix-inline-edit-prompts on Nov 18, 2024

Conversation

vovakulikov (Contributor)

I've noticed that on main, when you run an inline prompt, the chat doesn't contain any diff messages from the actual inline edit task. It looks like there is a race condition between task execution and the logic that populates the chat with diffs: at the moment we populate the chat messages, task.diff still has an undefined value.

This PR fixes the problem by awaiting the task's finished state.
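For context, a rough TypeScript sketch of the race and of what the await changes; EditTask, runEditBeforeFix, runEditAfterFix, and applyResponse are illustrative names, not the actual Cody code:

```ts
interface EditTask {
    state: 'inProgress' | 'finished'
    diff?: string
}

// Before: the response handling is fired and forgotten, so the command can
// resolve while the task is still in progress and task.diff is undefined.
async function runEditBeforeFix(
    task: EditTask,
    applyResponse: () => Promise<void>
): Promise<EditTask> {
    void applyResponse()
    return task
}

// After (what this PR does conceptually): the command resolves only once the
// response has been applied, so task.diff is populated for the chat builder.
async function runEditAfterFix(
    task: EditTask,
    applyResponse: () => Promise<void>
): Promise<EditTask> {
    await applyResponse()
    return task
}
```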

Test plan

  • Create an inline prompt (s2 already has the prompt-creation-v2 flag enabled)
  • Run it
  • You should see that the chat has non-empty messages with the actual task diffs

@vovakulikov vovakulikov requested review from thenamankumar and a team October 25, 2024 17:19
@vovakulikov vovakulikov self-assigned this Oct 25, 2024
@@ -97,8 +97,7 @@ export class EditProvider {
             onTurnComplete: async () => {
                 typewriter.close()
                 typewriter.stop()
-                void this.handleResponse(text, false)
-                return Promise.resolve()
+                return this.handleResponse(text, false)
vovakulikov (Contributor, Author)

@abeatrix any concerns about awaiting handleResponse here?

@@ -157,7 +156,7 @@ export class EditProvider {
                    break
                }
                case 'complete': {
-                    void multiplexer.notifyTurnComplete()
+                    await multiplexer.notifyTurnComplete()
vovakulikov (Contributor, Author)

Same question here. I haven't seen anything suspicious in the multiplexer implementation, but maybe I'm missing something important.

@thenamankumar (Member) left a comment

I have no idea what side effects this will have; I'll have to dig into the code for it later. Just approving to unblock. Please get an approval from @abeatrix. I'll try to check the code myself soon if it doesn't get approved.

@abeatrix (Contributor) left a comment

The code looks good to me, but I might be missing some context on why it was implemented this way. I'm AFK right now, so please check and confirm that the PR which introduced this code didn't implement it this way on purpose.

The only thing I can think of that could be affected by this change is the code lenses state, so if you can confirm they are working as expected for both JetBrains and VS Code, I think this should be safe.

@abeatrix (Contributor)

I approved to unblock, but I think it'd be safer to move this into the Fixup controller, like I mentioned in the other PR, where we could update them accordingly when we change the state of the task. Wdyt?

@dominiccooney (Contributor)

Is this fix effective? What are the messages in the stream?

notifyTurnComplete sends onTurnComplete to all of the subscribers and waits for resolution. Assuming none of them throw, what you're effectively doing here is making the complete message handling wait for handleResponse--and any other subscribers' turn complete handling--before continuing to consume messages.
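Roughly, assuming the multiplexer behaves like the minimal sketch below (MiniMultiplexer and TurnSubscriber are illustrative, not the real implementation), the await pauses message consumption until every subscriber's turn-complete handler has resolved:

```ts
type TurnSubscriber = { onTurnComplete: () => Promise<void> }

class MiniMultiplexer {
    private subscribers: TurnSubscriber[] = []

    subscribe(sub: TurnSubscriber): void {
        this.subscribers.push(sub)
    }

    // Resolves only after every subscriber's onTurnComplete has resolved, so
    // `await multiplexer.notifyTurnComplete()` in the 'complete' case blocks
    // further message consumption until handleResponse (and any other
    // subscribers) have finished.
    async notifyTurnComplete(): Promise<void> {
        await Promise.all(this.subscribers.map(s => s.onTurnComplete()))
    }
}
```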

@vovakulikov (Contributor, Author)

@dominiccooney

Is this fix effective?

This change ensures that all vital fields on the task object are updated by the time executeCodyCommand resolves; I checked this manually.

what you're effectively doing here is making the complete message handling wait for handleResponse--and any other subscribers' turn complete handling

This is exactly right; the original problem was that the task object doesn't get updated in time when we resolve the original edit flow. I think this change should be okay: if we look at handleResponse, it has no nested awaits, so we only hit synchronous code that updates the original task object (the last important entry in the call stack is the applyTask function). So by the time the original executeCodyCommand resolves, we should have the most relevant field values.
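In other words, under that assumption, the relevant part of the call chain would look roughly like the sketch below (names and bodies are illustrative, not the actual implementation):

```ts
interface EditTaskState {
    state: 'inProgress' | 'applied'
    diff?: string
}

// applyTask is synchronous: the task fields are set before control returns.
function applyTaskSketch(task: EditTaskState, responseText: string): void {
    task.diff = responseText
    task.state = 'applied'
}

// handleResponse has no nested awaits before the task update, so once the
// awaited onTurnComplete -> notifyTurnComplete chain resolves, the task
// already carries its final diff and state.
async function handleResponseSketch(task: EditTaskState, responseText: string): Promise<void> {
    applyTaskSketch(task, responseText)
}
```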

@dominiccooney (Contributor)

@vovakulikov

How does waiting for handleResponse, through onTurnComplete, through notifyTurnComplete, through consuming the stream messages, affect the consumer? My question is: is it effective because it really ties the property updates to the downstream consumer, or does it just happen to work by yielding more?

@vovakulikov (Contributor, Author) commented Oct 28, 2024

@dominiccooney

how does waiting for handleResponse through onTurnComplete through notifyTurnComplete through consuming the stream messages affect the consumer?

Consumers send a command and expect a task object in a completed state. Previously, the edit command returned a task object that was still in the in-progress state (with task fields not properly updated); this await ensures that the consumer gets a proper "finished" task object. You can check the executeEdit method in the EditManager class.
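As a rough consumer-side sketch (hypothetical names; the real entry point is the executeEdit method mentioned above):

```ts
interface FinishedTask {
    state: 'inProgress' | 'finished'
    diff?: string
}

// With the awaits in this PR, the promise returned by the edit command only
// resolves once the task is finished, so the chat builder sees a real diff.
async function buildChatForInlinePrompt(
    executeEditCommand: () => Promise<FinishedTask>
): Promise<string[]> {
    const task = await executeEditCommand()
    return task.diff !== undefined ? [`Edit applied:\n${task.diff}`] : []
}
```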

is it effective because it really ties the property updates to downstream the consumer, or does it just happen to work by yielding more.

If I understood this correctly, the first option is the case here.

@vovakulikov vovakulikov merged commit f4a921f into main Nov 18, 2024
19 of 20 checks passed
@vovakulikov vovakulikov deleted the vk/fix-inline-edit-prompts branch November 18, 2024 16:38