Language server not starting for custom language type #384
Seems like an interesting challenge! I can have a look over the weekend, but a quick - possibly completely wrong - idea would be to set some … Also, it would greatly help if you could (before attempting the workaround mentioned above) check the browser console to see if there are any errors in there. |
Does it work with the Editor for text files? But I'll agree, notebooks
especially will need the file extension metadata, so that they can be turned
into plain files.
|
Having a cookiecutter for a pygls-based language server and jupyter-lsp
spec would probably be a good addition to the docs/tests once we get this
figured out... For example, we can turn this issue into a gist/binder, then
go back through and add templates, and robot test a lab with the new thing
running. Until we get the new org setup, it could just be a long-running
branch....
|
Hey, you're both totally right, I'm missing the file_extension everywhere. I assumed that wasn't important as the files were .ipynb, but clearly Jupyter needs it. It works really well now, thanks for your help! |
Yeah, LSP doesn't (yet) care about ipynb (though there's some M$ EEE
brewing, I have a feeling), and we have to write the files to something on
disk that a language server will understand. Just asking: what language
do you plan to build your language server in?
|
The language server is implemented in Scala, using LSP4J as a base (similar to scala meta https://github.com/krassowski/jupyterlab-lsp/issues/17). At the moment I only have hover implemented, but it looks like it works well with jupyter-lsp. |
My custom language kernel starts and works. I see my custom LSP listed in the LSP Servers -> 'Available', but it also says 'Missing: 1' -> 'plain'. I've tried many variations of the mime_types with no change. It seems like I have the file extension issue mentioned in a few other issues. Where do I set the file_extension mentioned above for my language and LSP? |
What kernel or file type are you trying to support? It might not be the …
In other words, if you are seeing "plain", this indicates the issue is not on your side but on the kernel side, or JupyterLab may need an additional CodeMirror extension (depending on whether you encounter this issue in the notebook or in the file editor). I can look into the respective kernel if it's open source. |
Just after typing the above I realized I had only been looking at the config files. I added the file_extension into the language_info returned by the kernel and things connected. |
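For context, a rough sketch of where that field lives in an ipykernel-based wrapper kernel; the sdtm name, mime type, and extension below are illustrative assumptions, not the reporter's actual code:

```python
from ipykernel.kernelbase import Kernel


class SdtmKernel(Kernel):
    implementation = "sdtm"
    implementation_version = "0.1"
    banner = "sdtm kernel"

    # language_info is what clients like jupyter-lsp inspect; without
    # "file_extension" the notebook contents cannot be mapped to a virtual
    # file that a language server understands, hence "Missing: 1" -> "plain".
    language_info = {
        "name": "sdtm",
        "mimetype": "text/x-sdtm",
        "file_extension": ".sdtm",
    }

    def do_execute(self, code, silent, store_history=True,
                   user_expressions=None, allow_stdin=False):
        # Echo-only kernel: just enough to exercise the LSP integration.
        if not silent:
            self.send_response(self.iopub_socket, "stream",
                               {"name": "stdout", "text": code})
        return {"status": "ok", "execution_count": self.execution_count,
                "payload": [], "user_expressions": {}}
```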
My language kernel is named 'sdtm'. My language server is the minimal example from the pygls docs that does completions in response to the comma character. The completion doesn't work. Does the following status display, seen after I opened a new notebook using my kernel, point to what the problem is? I don't see any errors in the JupyterLab debug output. Status shows: Status hover shows: Full status shows: LSP servers |
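For reference, a rough sketch of the kind of minimal pygls completion server being described; the import paths and names are assumptions (they differ between pygls releases, roughly following the 0.10.x layout here):

```python
from pygls.server import LanguageServer
# In other pygls releases these live under pygls.features / pygls.types,
# or (in 1.x) lsprotocol.types -- adjust to the installed version.
from pygls.lsp.methods import COMPLETION
from pygls.lsp.types import (CompletionItem, CompletionList,
                             CompletionOptions, CompletionParams)

server = LanguageServer()


# Advertise "," as a trigger character and return a static completion list.
@server.feature(COMPLETION, CompletionOptions(trigger_characters=[","]))
def completions(params: CompletionParams) -> CompletionList:
    return CompletionList(
        is_incomplete=False,
        items=[CompletionItem(label="alpha"), CompletionItem(label="beta")],
    )


if __name__ == "__main__":
    server.start_io()  # how the server is started matters; see further below
```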
pygls in the most recent version violates the Language Server Protocol specification and we had to jump through some hoops to make the jedi-language-server work correctly; in short, it returns …
A copy of the JSON responses sent from your server would certainly help in understanding the situation better. Sometimes a GIF from the UI helps too. |
Browser Console output when opening a new notebook:
|
More complete log from starting fresh after restarting Jupyter Lab and creating a new notebook:
|
This is good (as in: useful), but I will need to see the log of actual responses returned from your server to be able to understand what is going on. |
Server log attached. Output of commands below. $ jupyter serverextension list $ jupyter labextension list |
Ouch, you are running a very old JupyterLab. We no longer support JupyterLab 1.x. The extension has progressed significantly since the JupyterLab 1.x days: we fixed countless bugs, and the extension system has changed so much that backporting is not practical. We are just a small group of volunteers doing this in our free time and cannot support versions two major releases back. You should create a new virtual environment and install JupyterLab 3.0 there, along with the latest extension version. |
Though your logs seem to be using … Also, please note that it is … |
Something crazy must have happened in my environment or where I ran those commands. I was running 3. I'll check and run again. |
Sorry, ran the commands in the wrong window and didn't look at the output. $ jupyter serverextension list $ jupyter labextension list |
Please do try upgrading to v3.2 (lab extension) and 1.1.1 (server extension): https://github.com/krassowski/jupyterlab-lsp/releases/tag/v3.2.0 Also, please provide the messages sent by your LSP server (sdtm), as these are not included in the JupyterLab debug output. You may need to use some logging mechanism to get those (e.g. print to a file or to the standard error stream), as the standard output is actually being redirected to our extension. I see that we should probably include an option to show what the LSP servers provide us with in the debug mode. |
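For a Python-based server, one way to do that is with the standard logging module (a sketch; the logger name and file path are placeholders):

```python
import logging
import sys

# Log to stderr or to a file, never to stdout: jupyter-lsp reads the server's
# stdout as the LSP message stream, so stray prints there corrupt the protocol.
logging.basicConfig(
    stream=sys.stderr,  # or: filename="/tmp/sdtm-lsp.log"
    level=logging.DEBUG,
    format="%(asctime)s %(levelname)s %(name)s: %(message)s",
)
log = logging.getLogger("sdtm-lsp")
log.debug("language server starting")
```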
Upgraded to v3.2 $ jupyter serverextension list $ jupyter labextension list jupyterlab-lsp-server-log.txt I switched to using a copy of the pygls example app at: https://github.com/openlawlibrary/pygls/blob/master/examples/json-extension/server/server.py. I cut out everything that looked non-essential and uploaded it as 'core copy.py.txt'. This is the output logged by the above LSP program: |
Forgot to say that it is still not working but is failing for new reasons. |
Let's break down the console logs first. In the beginning we see that your notebook gets opened and the kernel gets connected properly:
and the connection going to your language server gets established (connected but not initialized yet):
You can see that the connection has not been initialized yet based on this message:
And on all the …
You then attempt to fetch completions for …
The kernel reply arrives (it is empty - …)
You do not get a reply from your language server because it is still not ready.
And as this is the end of the logs, I presume that your language server has never initialized. |
I think that your Python code is OK. I wonder if you may have some buffering which prevents the initialization request from being received. I suspect that you may need to:
|
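The checklist itself is not preserved above, but judging from the follow-up it touched on stdio buffering; one way to rule that out on the Python side (a sketch, not necessarily the exact suggestion made here) is:

```python
import sys

# Either launch the server with `python -u` (unbuffered mode), or force
# write-through on stdout from inside the script, so LSP replies are not
# held back by Python's output buffering.
sys.stdout.reconfigure(write_through=True)  # Python 3.7+
```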
You were right about the messages being lost. They were completely lost because I was starting the pygls language server in TCP mode instead of STDIO, so I was in the wrong communication mode. The pygls modes are documented at https://pygls.readthedocs.io/en/latest/pages/advanced_usage.html?highlight=logging#connections. It was a mix of my lack of knowledge about the different modes and me thinking it automatically started in STDIO mode (a.k.a. user error). When I added an explicit call to server.start_io(), my language server got to the 'Fully Initialized' status and completions started working. The shell file I was running doesn't do anything but run python to execute my language server code. I tried the -u switch and it doesn't seem to change anything. |
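For anyone hitting the same thing later, the difference comes down to how the pygls server is started (a sketch; the TCP host and port are arbitrary examples):

```python
from pygls.server import LanguageServer

server = LanguageServer()

if __name__ == "__main__":
    # jupyter-lsp spawns the server process itself and talks to it over
    # stdin/stdout, so STDIO mode is what it expects:
    server.start_io()
    # TCP mode -- server.start_tcp("127.0.0.1", 8080) -- instead waits for a
    # socket client that jupyter-lsp never opens, so the initialize request
    # is never received and the status stays "not initialized".
```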
Great to hear it was resolved! I think that we need to improve the documentation on our side to record all of this knowledge. Would you like to contribute a short note to our documentation on what is required to make a new kernel/language server pair work with the LSP extension? It would be a great addition, and you may be the person who has done this most recently and has it freshest in memory. It could (but doesn't have to) include an example of the steps needed to make a minimal pygls server work. |
I'd be happy to work on that documentation note. Where should I put it? |
Probably the best place would be a new section on "What is needed to connect a kernel with a language server" and an example of creating and configuring a new pygls-based language server below the Scala example in https://github.com/krassowski/jupyterlab-lsp/blob/master/docs/Configuring.ipynb We gathered some useful notes for new contributors in the Contributing section: it explains how to install the required development dependencies, how to build the documentation locally (though you might skip this step; on a PR our RTD integration will do that for you), and how to lint the code with scripts/lint.py. |
Hi. Firstly jupyter-lsp is an awesome project.
Description
I'm struggling to get my custom language server to actually start when editing my custom language.
What's particularly strange is, it starts without an issue if I configure it to work with python. I just can't get it to start when I'm using my custom kernel.
I wonder if you could help me work around this issue? I've looked at all the existing issues and get the impression there might be ambiguity around how languages are discovered (e.g. https://github.com/krassowski/jupyterlab-lsp/issues/190, robocorp/robotframework-lsp#13).
Thanks in advance.
Steps to reproduce
$ mkdir test
test/kernel.json:
$ jupyter kernelspec install --user test
testkernel.py:
~/.jupyter/jupyter_notebook_config.py
(put any language server in the argv here if desired)
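The actual file contents are not reproduced above; for illustration, an entry of the kind jupyter-lsp expects in that config file looks roughly like this (the server name, command path, and mime type are placeholders):

```python
# ~/.jupyter/jupyter_notebook_config.py -- illustrative entry only
c.LanguageServerManager.language_servers = {
    "test-language-server": {
        "argv": ["python", "-u", "/path/to/test_server.py"],
        "languages": ["test"],
        "mime_types": ["text/x-test"],
        "display_name": "Test language server",
    }
}
```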
Findings
When a new notebook is made with the Test kernel, the kernel works as expected. However, the language server doesn't start up.
Logs
jupyter lab --log-level=DEBUG
Environment
Running in the latest scipy-notebook Docker image. Using conda for package management.
jupyter-lsp 0.9.2
jupyterlab 2.2.8