To keep cache pruning simple, I think it would suffice to add a slim cleanup function that is called last, on return of the main query. All resources fetched for a specific extraction could be saved to a subfolder of /temp named with a temporary identifier. The cleanup function could then delete the whole cache folder with an xmldb:remove during return.
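The per-extraction folder plus cleanup-on-return pattern could look roughly like this (a minimal sketch in Python rather than XQuery; `run_extraction` and the filesystem calls are stand-ins — in eXist-db the delete would be an xmldb:remove on the temp collection instead of shutil):

```python
import shutil
import tempfile
from pathlib import Path

def run_extraction(extract):
    """Hypothetical wrapper: give each extraction its own cache
    subfolder under the temp area, named with a unique identifier,
    and remove the whole folder when the main query returns."""
    cache_dir = Path(tempfile.mkdtemp(prefix="extraction_"))
    try:
        # extract() may download resources into cache_dir and
        # reuse them for all following statements
        return extract(cache_dir)
    finally:
        # cleanup runs last, even if the extraction raises
        shutil.rmtree(cache_dir, ignore_errors=True)
```

The `try/finally` mirrors the "called last on return" idea: the cache is pruned whether the query succeeds or fails.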
metacontext changed the title from "[feature request] cache remote files" to "Caching for remote files during extraction" on May 10, 2020.
Just to keep all the thoughts we had about this feature in one place, here is a related snippet from my TODO.txt:
Only load external resources once => retrieve and cache them for all following statements
maybe by first checking the configuration for all resource attributes, loading them, and temporarily putting them in the DB with a URI <=> hashname map
file: cache/extraction_1235612rauafd.xml
When resources are fetched from remote servers, download them once and use the cached version instead of re-downloading them every time.
(Code is operational at my place (TM), move it to a git branch and test it!)
Think about:
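The URI <=> hashname mapping from the TODO above can be sketched as follows (illustrative Python only; `RemoteCache` and the injected `fetch` function are hypothetical names, with `fetch` standing in for the real HTTP download):

```python
import hashlib

class RemoteCache:
    """Fetch each remote URI at most once; later lookups are served
    from the cache via a URI <=> hashname map."""
    def __init__(self, fetch):
        self.fetch = fetch    # callable(uri) -> content, hits the network
        self.store = {}       # hashname -> cached content
        self.by_uri = {}      # uri -> hashname

    def get(self, uri):
        if uri not in self.by_uri:
            # derive a stable hashname, e.g. cache/extraction_<hash>.xml
            name = hashlib.sha1(uri.encode()).hexdigest() + ".xml"
            self.store[name] = self.fetch(uri)  # downloaded exactly once
            self.by_uri[uri] = name
        return self.store[self.by_uri[uri]]
```

Repeated `get` calls for the same URI then reuse the cached copy instead of re-downloading, which is the behaviour the feature request asks for.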