
fix: improve rust-analyzer auto-completion #1782

Merged: 3 commits merged into leptos-rs:main on Sep 29, 2023
Conversation

@pikaju (Contributor) commented Sep 24, 2023

This change fixes #1781. Please do try it out. You might have to restart rust-analyzer for it to "rebuild" the macros.

HOWEVER

There is a caveat.

The situation is a bit trickier than I initially thought, because the macros (at least the client macro) actually use non-standard Rust syntax: function parameter documentation comments (and potentially more, I'm not sure). Here's what happens:

[screenshot: rust-analyzer errors reported on the function parameter doc comments]

  1. The macro is fed the token stream containing my extraneous `let`.
  2. The macro realizes that the syntax is broken and returns the token stream as-is.
  3. The Rust compiler and rust-analyzer receive a token stream with function parameter doc comments, and then complain about them.

This is not a problem with valid syntax, as the behavior there is unchanged. However, it might confuse users, who might remove the doc comments (thinking they are not allowed) before fixing the actual issue.
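For reference, this is the kind of parameter doc comment in question; the component below is a made-up example, not code from this repository. If the macro bails out and returns these tokens unchanged, rustc and rust-analyzer then reject the `///` lines on the parameters, which is the confusing error described above.

```rust
use leptos::*;

// Hypothetical component, just to illustrate the syntax in question:
// doc comments on function parameters, which #[component] understands
// but plain Rust does not.
#[component]
fn ProgressBar(
    /// Maximum value of the progress bar.
    max: u16,
    /// How much progress has been made so far.
    progress: ReadSignal<i32>,
) -> impl IntoView {
    view! { <progress max=max value=progress/> }
}
```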

Can we fix this?

I believe so. For this particular case, a simple fix would be to parse and re-serialize the function signature and body separately. This way, the macro can remove the doc comments before rust-analyzer sees them, even if there is a syntax error in the function body. Before I attempt this: are you fine with bigger changes?
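For what it's worth, here is a minimal sketch of that idea (not the actual code in this PR; the helper names are made up): split off the brace-delimited body as a raw token tree, and strip `#[doc]` attributes (i.e. `///` comments) from the remaining signature tokens before re-emitting them, so the body can stay untouched even when it doesn't parse.

```rust
use proc_macro2::{Delimiter, Group, TokenStream, TokenTree};

/// Remove `#[doc = "..."]` attributes (i.e. `///` comments) from a token
/// stream, recursing into groups so that parameter doc comments inside the
/// signature's parentheses are stripped as well. Spans are not preserved
/// here; a real implementation would keep them.
fn strip_doc_attrs(tokens: TokenStream) -> TokenStream {
    let mut out: Vec<TokenTree> = Vec::new();
    let mut iter = tokens.into_iter().peekable();
    while let Some(tt) = iter.next() {
        match tt {
            // A `/// comment` arrives as `#` followed by `[doc = " comment"]`.
            TokenTree::Punct(p) if p.as_char() == '#' => {
                let is_doc = matches!(
                    iter.peek(),
                    Some(TokenTree::Group(g))
                        if g.delimiter() == Delimiter::Bracket
                            && g.stream().to_string().starts_with("doc")
                );
                if is_doc {
                    iter.next(); // drop the `[doc = ...]` group along with the `#`
                } else {
                    out.push(TokenTree::Punct(p));
                }
            }
            TokenTree::Group(g) => {
                let cleaned = strip_doc_attrs(g.stream());
                out.push(TokenTree::Group(Group::new(g.delimiter(), cleaned)));
            }
            other => out.push(other),
        }
    }
    out.into_iter().collect()
}

/// Split the item into "everything before the function body" and the body
/// itself (the trailing brace-delimited group), so the body can be passed
/// through untouched even if it contains a syntax error.
fn split_signature_and_body(input: TokenStream) -> (TokenStream, Option<TokenTree>) {
    let mut tokens: Vec<TokenTree> = input.into_iter().collect();
    let body = match tokens.last() {
        Some(TokenTree::Group(g)) if g.delimiter() == Delimiter::Brace => tokens.pop(),
        _ => None,
    };
    (tokens.into_iter().collect(), body)
}
```

The error fallback could then emit `strip_doc_attrs(signature)` followed by the untouched body, so rustc and rust-analyzer see tokens they understand around the user's actual mistake.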

I leave it up to you whether the PR in its current state is an improvement or a regression (though I'm guessing the latter).

@pikaju (Contributor, Author) commented Sep 24, 2023

Okay, I didn't expect this to fail any tests, to be honest

Edit: The test seems unintended

@pikaju (Contributor, Author) commented Sep 25, 2023

Update: I implemented a dummy model for #[component] that is forgiving of errors in the function body, so auto-completion should be improved without the weird documentation comment errors I mentioned originally.

@gbj (Collaborator) commented Sep 25, 2023

Does this work consistently when you try it out? I tried it in a local example but found that it generally did not have any effect. However, my rust-analyzer is often busted because I'm switching back and forth between so many different PRs/examples/etc., so I wouldn't be surprised if it works for you but not for me, and I'm happy to believe it's working if it does.

@pikaju (Contributor, Author) commented Sep 26, 2023

I also have a very bad experience with r-a (at least in this project); it often doesn't work at all, regardless of the macro. Maybe there is another issue, unrelated to what I'm trying to achieve here. My r-a logs are full of errors; perhaps those break it?

Just to clarify, did you:

  1. Restart r-a (or better yet, your whole editor) after checking out the commit? This is strictly necessary; otherwise it uses the old, cached version of the macro internally, at least that's what I found.
  2. Wait a long time for it to actually start up? In the counter example, I need to wait about 30 seconds before it even tries. SimpleQueryCounter took even longer...

If possible, don't just take my word for it, because I'm new to Leptos, proc-macros, and even Rust itself. I played around with my first proc-macro only about three days ago.

I can't speak about consistency, but here are a couple of screenshots where I get auto-completions despite a syntax error:

[screenshots omitted: auto-completion suggestions appearing despite a syntax error in the function body]

@blorbb (Contributor) commented Sep 28, 2023

Not sure if this would work, but I've been working on my own proc macro too and managed to improve r-a by parsing a TokenStream instead of some syn structure when I don't need to care about what's inside. Instead of parsing a syn::ItemFn, could you make a custom AST that parses everything up to the body of the function, then just takes a proc_macro2::TokenTree (which should be the Group variant containing the entire function body)? That TokenTree can then be inserted directly when expanding, so syntax errors in the body don't matter. This might be easier than cloning everything into a dummy.
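Something along these lines, I think; just a sketch of the shape (type and field names made up, where-clauses skipped), relying on syn's Parse implementations for the proc-macro2 token types:

```rust
use proc_macro2::TokenTree;
use syn::{
    parse::{Parse, ParseStream},
    Attribute, Generics, Ident, ReturnType, Token, Visibility,
};

/// A component function whose body is kept as an opaque brace-delimited
/// `TokenTree` instead of a parsed `syn::Block`, so a syntax error inside
/// the body doesn't stop the rest of the item from parsing.
struct ComponentFn {
    attrs: Vec<Attribute>,
    vis: Visibility,
    fn_token: Token![fn],
    name: Ident,
    generics: Generics,
    /// The `( ... )` group; this could also be parsed properly so that
    /// parameter doc comments can be handled.
    params: TokenTree,
    output: ReturnType,
    /// The `{ ... }` group, deliberately left unparsed.
    body: TokenTree,
}

impl Parse for ComponentFn {
    fn parse(input: ParseStream) -> syn::Result<Self> {
        Ok(Self {
            attrs: input.call(Attribute::parse_outer)?,
            vis: input.parse()?,
            fn_token: input.parse()?,
            name: input.parse()?,
            generics: input.parse()?,
            params: input.parse()?,
            output: input.parse()?,
            body: input.parse()?,
        })
    }
}
```

When expanding, `body` can be interpolated back verbatim with `quote!`, so any syntax error inside it is reported against the user's own code instead of aborting the macro's parse.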

@pikaju (Contributor, Author) commented Sep 28, 2023

If I understand you correctly, that's basically what I'm doing: the dummy model only parses the function signature and leaves the body as an unparsed TokenStream. The reason I have to parse the function signature is the error messages you'd otherwise get; see the original post.

@blorbb (Contributor) commented Sep 28, 2023

ah my bad, I didn't read the code properly haha

@gbj merged commit 772bb1d into leptos-rs:main on Sep 29, 2023 (56 checks passed)
@gbj (Collaborator) commented Sep 29, 2023

I can't see any downside to this, so I'm merging it in the hopes that it does increase the recoverability a bit!

Closes: Rust Analyzer's auto-completion is weakened by the #[component] macro (#1781)