Update for sphinx v7.3.x #346
Conversation
allows the config to be cached via pickling
I think there's a regression affecting consecutive builds that use cpp.apigen. I now get warnings about documents not being included in any toctree when rebuilding the docs (without flushing old build artifacts):
I think there's also an undesired behavior, specific to Windows (or any case-insensitive file system), in which a consecutive build fails after a certain number of rebuilds. It seems that a hash is added to the file name each time a build is invoked, and at some point the path to the generated file becomes too long. This happens with both the Python and C++ apigen.
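A toy sketch of the growth pattern described above (the helper and names here are illustrative, not the actual apigen code): if the disambiguating hash is appended to the already-hashed name cached from the previous build, the file name grows on every rebuild until the filesystem's path-length limit is hit.

```python
import hashlib

def add_disambiguating_hash(page_name: str) -> str:
    # Toy model only: append a short hash derived from the current name.
    digest = hashlib.sha256(page_name.encode()).hexdigest()[:8]
    return f"{page_name}-{digest}"

# Feeding the previous build's result back in makes the name grow each time.
name = "cpp_apigen_generated/MyClass"
for rebuild in range(5):
    name = add_disambiguating_hash(name)
    print(len(name), name)
```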
fix tooling ignore comments
We may need to investigate exactly how the previous state is reloaded when Sphinx does an incremental build. cpp apigen stuffs some extra attributes into …, and I think we may similarly add attributes to …. Therefore we may need to change where that state is stored --- possibly on the ….
I think I found the problem. In sphinx-immaterial/sphinx_immaterial/apidoc/cpp/apigen.py, lines 897 to 902 (at 61c3ef0), entities["page_name"] gets another hash appended every time _builder_inited() runs.
I think the fix is to consolidate the for loop at the above link with the for loop located just afterward in the locally scoped function (sphinx-immaterial/sphinx_immaterial/apidoc/cpp/apigen.py, lines 903 to 915 in 61c3ef0), using os.path.basename(entities["pagename"]) without caching the hash-appended file name. Honestly though, I think hashing the bytes written to the file might be a bit more of a cache buster.
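A minimal sketch of the fix proposed here, assuming the original (unhashed) docname is still available on each build; the structure and names are simplified and not the real apigen data model. Deriving the hashed page name from the original docname makes the computation idempotent across rebuilds, instead of appending another hash to the value cached by the previous run of _builder_inited().

```python
import hashlib
import os

def hashed_page_name(original_docname: str) -> str:
    # Derive the disambiguated name from the *original* docname only, so
    # re-running this on every build yields the same result instead of
    # growing the file name by one hash per rebuild.
    digest = hashlib.md5(original_docname.encode()).hexdigest()[:8]
    return f"{os.path.basename(original_docname)}-{digest}"

# Stable across repeated "builds":
assert hashed_page_name("cpp_apigen_generated/MyClass") == hashed_page_name(
    "cpp_apigen_generated/MyClass"
)
```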
Well, I was able to use the file content (in bytes) as the hash seed, and it did prevent writing new files when the content hadn't changed (for both the Python and C++ apigen). See f97200f. Now, I'm noticing that the site index is generated for 99 pages on a fresh doc build, but only for 63 files on a subsequent build.
We don't want to use the file content, because the purpose is in fact not cache busting but the opposite --- to ensure a stable URL even as the content changes when the documentation is updated. The hash is there to deal with case-insensitive filesystems, where we may have two document names that differ only in casing. I think there may have been a change in how incremental rebuild state is saved. We want to make sure this information is not saved across rebuilds, so I think we should move it out of ….
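To illustrate that intent, here is a hedged sketch (the function and names are invented for illustration, not the project's code): the suffix is derived from the document name rather than its content, so URLs stay stable as content changes, and it only needs to be added when two names collide on a case-insensitive filesystem.

```python
import hashlib
from collections import defaultdict

def disambiguate_case_collisions(docnames: list[str]) -> dict[str, str]:
    # Map each docname to an output name that is unique even when compared
    # case-insensitively. The suffix depends only on the docname, never on
    # file content, so it does not change when the documentation is updated.
    by_folded: dict[str, list[str]] = defaultdict(list)
    for name in docnames:
        by_folded[name.lower()].append(name)

    result: dict[str, str] = {}
    for group in by_folded.values():
        if len(group) == 1:
            result[group[0]] = group[0]  # no collision: leave the name alone
        else:
            for name in group:
                digest = hashlib.md5(name.encode()).hexdigest()[:8]
                result[name] = f"{name}-{digest}"
    return result

print(disambiguate_case_collisions(["api/Foo", "api/foo", "api/Bar"]))
```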
    from sphinx.domains.python._annotations import _parse_annotation  # type: ignore[import-not-found]  # pylint: disable=import-error,no-name-in-module
else:
    from sphinx.domains.python import _parse_annotation
Hmm --- I realize that I steered you wrong on this --- since we monkey patch this, we can run into problems by importing it. Instead I guess we should just create an alias for the module. We will also need to verify that our monkey patches still work --- e.g. if something in the Python domain itself imports this symbol, then we have to replace the imported symbol as well as the original one.
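A hedged sketch of that concern (the wrapper body and module scan are illustrative, assuming _parse_annotation is still reachable as an attribute of sphinx.domains.python): replacing the symbol in its defining module is not enough if another module did `from sphinx.domains.python import _parse_annotation`, because that module keeps its own reference to the old function and must be patched as well.

```python
import sys
import sphinx.domains.python as sphinx_python

_original_parse_annotation = sphinx_python._parse_annotation

def _patched_parse_annotation(*args, **kwargs):
    # Illustrative wrapper only: delegate to the original implementation.
    return _original_parse_annotation(*args, **kwargs)

# Patch the defining module.
sphinx_python._parse_annotation = _patched_parse_annotation

# Also patch every already-imported module that grabbed its own reference via
# `from sphinx.domains.python import _parse_annotation`.
for module in list(sys.modules.values()):
    if module is None:
        continue
    if getattr(module, "_parse_annotation", None) is _original_parse_annotation:
        module._parse_annotation = _patched_parse_annotation
```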
Lately, I've been wondering if it would be easier to have our own "better domains" that somehow share/merge data with the vanilla domains (where applicable).
Importing under an alias seemed like an over-complication; some of our modules do need to use other APIs of the Python domain. Also, mypy doesn't like it when you import a module and then import it again under an alias (even if the alias is only imported conditionally).
I adjusted the unit test to assert that all modules in sphinx.domains.python which import _parse_annotation() are consistently monkey-patched.
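A minimal sketch of what such an assertion might look like (the test name and structure are assumptions, not the actual test; it presumes the extension has already been imported so the patch is applied): every loaded sphinx.domains.python* module that exposes a module-level _parse_annotation should point at the same patched function object.

```python
import sys

def test_parse_annotation_is_patched_consistently():
    # Collect every loaded module under sphinx.domains.python that defines a
    # module-level _parse_annotation (including re-imported copies).
    holders = [
        mod
        for name, mod in sys.modules.items()
        if name.startswith("sphinx.domains.python")
        and mod is not None
        and "_parse_annotation" in vars(mod)
    ]
    assert holders, "expected at least one module exposing _parse_annotation"
    # All of them must reference the same (patched) function object.
    assert len({id(vars(mod)["_parse_annotation"]) for mod in holders}) == 1
```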
I'm not sure why, but libclang (v18.1.1) on macOS is failing to parse the cpp.apigen demo sources. It seems related to sighingnow/libclang#71. I don't have a Mac I could use to test/debug this one.
Perhaps we can just add a libclang<17 requirement as a workaround until this gets fixed upstream?
Thanks
resolves #345