diff --git a/CHANGELOG.md b/CHANGELOG.md index 0a0340dd7f..1927a54fb6 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -1,1266 +1,8 @@ -# rc: minor bump +# See elsewhere for changelog -This release has only a few code updates: +This project’s release notes are curated from the Git history of its main +branch. You can find them by looking at [the version of this file on the +`release` branch][branch] or the [GitHub release history][gh-releases]. -- Partial support for the `dvipdfmx:config` special has been added (#953, - @vlasakm). This should fix some aspects of PDF generation, including named - anchors created by `hyperref`. Other fixes might help with the `attachfile` - package, although that is awaiting further confirmation. -- A dumb crash was fixed when attempting to create HTML output with an input - that has not been set up for the Tectonic HTML compilation framework (#955, - @pkgw). Note, however, that generic documents will still fail to build in HTML - mode. The program just won't crash now. As of this release, the *only* example - of working HTML output from Tectonic is the [tt-weave] system (see below). - -More noteworthy are several non-code improvements! - -- A preliminary official build for the Apple Silicon platform - (`aarch64-apple-darwin`) is now available (#959, @pkgw). Due to lack of - support in the continuous integration system we can't test the build - thoroughly, but it appears to work. -- @KaranAhlawat contributed a [how-to guide for using Tectonic in Emacs AucTeX][auctex]. -- @mnrvwl has done a fantastic job reviewing our GitHub issues, gathering more - information when needed, and closing out ones that have been solved. -- @pkgw has published *[XeTeX: A Pseudoprogram][xap]*, a digital book that - derives from Knuth's *[TeX: The Program][ttp]*. This book is generated from - the reference XeTeX code underlying Tectonic’s typesetting using a new - processor called [tt-weave]. See [the book’s preface][xap] for more - information.
- -Thank you to all of our contributors! - -[auctex]: https://tectonic-typesetting.github.io/book/latest/howto/auctex-setup/ -[tt-weave]: https://github.com/tectonic-typesetting/tt-weave/ -[xap]: https://stacks.fullyjustified.net/xap/2022.0/ -[ttp]: https://www.worldcat.org/title/876762639 - - -# tectonic 0.11.0 (2022-10-04) - -- Many updates to the experimental, unstable `spx2html` engine for creating HTML - output (#941, @pkgw). They are not itemized here because there are many of - them and the engine remains experimental and unstable. This work is in - service of the [tt-weave] demo, which is almost ready for a preliminary - release. -- Add a tweak to the Harfbuzz build script that should hopefully fix builds on - macOS against old SDKs, as seen in conda-forge (#944, @pkgw). - -[tt-weave]: https://github.com/pkgw/tt-weave/ - - -# tectonic 0.10.0 (2022-10-03) - -This release updates Tectonic to support TeXLive 2022.0! There are not many code -changes in the engines, so the primary user-visible changes will stem from the -many package updates incorporated into the new TeXLive 2022.0 bundle. To switch -a preexisting Tectonic document to use the new bundle, update the `doc.bundle` -field in `Tectonic.toml` to -`https://data1.fullyjustified.net/tlextras-2022.0r0.tar`. Newly-created -documents will use this bundle (or subsequent updates) by default. - -This release also adds a new “drop-in” installation method, a quick way to -install Tectonic in the popular `curl`/`sh` style. On a Unix-like -operating system, run: - -```sh -curl --proto '=https' --tlsv1.2 -fsSL https://drop-sh.fullyjustified.net |sh -``` - -... to drop a system-appropriate `tectonic` binary in the current working directory.
-On Windows, run the following in a PowerShell terminal: - -```ps1 -[System.Net.ServicePointManager]::SecurityProtocol = [System.Net.ServicePointManager]::SecurityProtocol -bor 3072 -iex ((New-Object System.Net.WebClient).DownloadString('https://drop-ps1.fullyjustified.net')) -``` - -Other changes: - -- Make it so that running `tectonic -Zhelp` works (#929, @pkgw). Before, it would - error out because the argument parser wanted an input filename. -- Fix `-Z continue-on-errors` (#917, @vlasakm). This was broken in an earlier - refactoring. -- Add a `-Z shell-escape-cwd=` unstable option (#909, @0x00002a). This can - work around issues in Tectonic's handling of shell-escape processing, which is - very conservative about defaulting to launching programs in a limited - environment. In particular, if you set the directory to the document source - directory, commands like `\inputminted` can work. -- It is possible for one `.tex` file to generate multiple `.aux` files. Even - when more than one of those files should have triggered its own `bibtex` run, - Tectonic ran `bibtex` only once. This is now fixed (#906, #907, @Starrah). -- Give some more context in the error message if an external (shell-escape) tool - tries to open a file that's missing (#899, @matz-e). - -The known issue relating to OpenSSL 3 is believed to still be relevant: - -- The generic prebuilt Tectonic binaries for Linux are built for the version 1.1 - series of OpenSSL. The latest Ubuntu release, 22.04 (Jammy Jellyfish), now - targets OpenSSL 3, with no compatibility fallback, which means that the - prebuilt binaries won’t run. To run Tectonic on these systems, compile it - yourself, use the “semistatic” MUSL Linux builds, or install a package built - specifically for this OS. To be clear, there are no actual issues with OpenSSL - 3 compatibility — we just need to provide an alternative set of builds. See - #892 for updates. - -Thank you to everyone who contributed to this release!
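The multiple-`.aux` fix noted in this release can be illustrated with a small sketch. This is hypothetical code, not Tectonic's actual implementation; the function name and string-based interface are invented for illustration. The idea is simply that every generated `.aux` file becomes a `bibtex` candidate, instead of stopping at the first one found.

```rust
// Hypothetical sketch only; not Tectonic's real code. Collect every
// `.aux` file produced by a build so that each one can trigger its own
// `bibtex` run, rather than running `bibtex` only once.
fn aux_files_needing_bibtex(created: &[&str]) -> Vec<String> {
    created
        .iter()
        .filter(|name| name.ends_with(".aux"))
        .map(|name| name.to_string())
        .collect()
}
```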
- - -# tectonic 0.9.0 (2022-04-27) - -This release updates Tectonic to correspond with TeXLive 2021.3, jumping -forward from the previous sync point of TeXLive 2020.0. - -- The primary user-visible changes will stem from the many package updates - incorporated into the new TeXLive 2021.3 bundle. We don't actually have a - listing of all of the updates, but they are numerous. To switch a preexisting - Tectonic document to use the new bundle, update the `doc.bundle` field in - `Tectonic.toml` to `https://data1.fullyjustified.net/tlextras-2021.3r1.tar`. - Newly-created documents will use this bundle (or subsequent updates) by - default. -- The XeTeX engine has mostly low-level updates, but there was a significant - rework of OpenType math kerning and sub/super-scripting. There is a new - `\tracingstacklevels` integer parameter. See [the changelog for the - `tectonic_engine_xetex` Rust crate][excl] for more details. -- The xdvipdfmx engine has numerous updates including improvements for Japanese - font fallbacks. See [the changelog for the `tectonic_engine_xdvipdfmx` Rust - crate][edcl] for more details. - -[excl]: https://github.com/tectonic-typesetting/tectonic/releases/tag/tectonic_engine_xetex%400.3.0 -[edcl]: https://github.com/tectonic-typesetting/tectonic/releases/tag/tectonic_engine_xdvipdfmx%400.2.0 - -Separately, the “GitHub Discussions” feature for the Tectonic repository has -been activated. - -@pkgw has found himself unable to pay proper attention to the -`tectonic.newton.cx` Discourse service, which has been fairly moribund. The -intention is to sunset it. - -We have one known issue worth highlighting: - -- The generic prebuilt Tectonic binaries for Linux are built for the version 1.1 - series of OpenSSL. The latest Ubuntu release, 22.04 (Jammy Jellyfish), now - targets OpenSSL 3, with no compatibility fallback, which means that the - prebuilt binaries won’t run.
To run Tectonic on these systems, compile it - yourself, use the “semistatic” MUSL Linux builds, or install a package built - specifically for this OS. To be clear, there are no actual issues with OpenSSL - 3 compatibility — we just need to provide an alternative set of builds. See - #892 for updates. - -Other improvements include: - -- Some TeX packages attempt to read input from external processes using a “pipe” - syntax. This capability is not currently implemented in Tectonic. Such uses - now trigger a warning (#859, #888, @pkgw). -- The location of the Tectonic cache is now customizable by setting the - `TECTONIC_CACHE_DIR` environment variable (#884, @wischi-chr). People are - encouraged to use the default whenever possible, but flexibility here can be - useful in some circumstances. -- Improve documentation of the V2 `-X` flag in the book and CLI output (#877, - @clbarnes). -- Some memory leaks during failed builds have been plugged as part of an ongoing - (albeit slow) effort to get it so that Tectonic can be used with modern input - fuzzing tools (@pkgw). -- Allow basic `\openin` of un-closed `\openout` files to succeed (#882, @pkgw). - This should get `hyperxmp` working (#862). - - -# tectonic 0.8.2 (2022-03-02) - -No code changes here. This release uses the newly-released version 0.1.4 of the -[pinot] font parsing crate, which includes what were previously -Tectonic-specific extensions (#870, @pkgw). The "patching" build feature that we -were using turned out to break `cargo install tectonic`. Thanks to [@dfrg] for -prompt follow-up! - -[pinot]: https://crates.io/crates/pinot -[@dfrg]: https://github.com/dfrg - - -# tectonic 0.8.1 (2022-02-28) - -- The most important change in this release is a fix for issue [#844], wherein - due to an implementation oversight Tectonic could obliterate `biber` input - files whose locations were given as absolute paths ([#868], @pkgw). This - should now be solved.
-- This release also includes improved (i.e., "not broken") handling of `biber` - inputs in subdirectories ([#843], [#845], @KevoSoftworks) -- A long-standing issue where outputs could vary slightly from one platform to - the next, depending on system-dependent floating-point rounding with PNG images, - was fixed ([#847], @pkgw). - -There are also two big under-the-hood changes that won't make a noticeable difference -for now, but lay the groundwork for future work: - -- The internal parameters and definitions of the Tectonic/XeTeX engine are now - introspectable thanks to a new crate, [`tectonic_xetex_format`][xf]. This - crate is now used to emit the C/C++ headers used to compile the engine. It is - also able to introspect the "format files" that store engine state, adding the - capability to answer questions such as "What are the definitions of all of the - control strings defined by this format?" This should enable some *really* - interesting supporting tools in the future! -- *Very* preliminary support for native HTML output has been added ([#865], - @pkgw). This support isn't yet generally useful since it's undocumented and - requires a suite of support files that's still being developed, but - prototyping indicates that the generated output has promise for very - high-quality mathematical rendering. The new [`tectonic_engine_spx2html`][s2h] - crate provides the main new implementation. Hopefully there will be more to - report soon! 
- -[#843]: https://github.com/tectonic-typesetting/tectonic/issues/843 -[#844]: https://github.com/tectonic-typesetting/tectonic/issues/844 -[#845]: https://github.com/tectonic-typesetting/tectonic/pull/845 -[#847]: https://github.com/tectonic-typesetting/tectonic/pull/847 -[#865]: https://github.com/tectonic-typesetting/tectonic/pull/865 -[#868]: https://github.com/tectonic-typesetting/tectonic/pull/868 -[s2h]: https://crates.io/crates/tectonic_engine_spx2html -[xf]: https://crates.io/crates/tectonic_xetex_format - -This release also includes the usual updates to internal dependencies, build and -testing infrastructure, and so on. - - -# tectonic 0.8.0 (2021-10-11) - -This release fixes a showstopping issue introduced by recent changes to the -`archive.org` PURL ([persistent URL]) service. All users are advised to upgrade -immediately, although it is possible to continue using older releases in some -limited circumstances. - -[persistent URL]: https://purl.prod.archive.org/help - -By default, Tectonic downloads (La)TeX resource files from the internet as -needed. Before this release, Tectonic would query a PURL in order to know where -to locate the most recent “bundle” of resource files. On Wednesday, -`archive.org` updated the implementation of its service in a way that interacted -catastrophically with the way that Tectonic processes URL redirections. The -result was that Tectonic became unable to download any of its resource files, -breaking essential functionality. Thanks to [@rikhuijzer] for providing early -reporting and diagnosis of the problem. - -[@rikhuijzer]: https://github.com/rikhuijzer - -This release fixes the redirection functionality ([#832], [@pkgw]), but more -importantly it switches from querying `archive.org` to using a new dedicated -webservice hosted on the domain `fullyjustified.net` ([#833], [@pkgw]). 
The -motivation for this switch is that besides this particular incident, -`archive.org` has had low-level reliability problems in the past, and more -important, it is blocked in China, preventing a potentially large userbase from -trying Tectonic. - -[#832]: https://github.com/tectonic-typesetting/tectonic/pull/832 -[#833]: https://github.com/tectonic-typesetting/tectonic/pull/833 - -The new URL that is queried is: - -https://relay.fullyjustified.net/default_bundle.tar - -The redirection is implemented with a simple nginx server defined in the new -[tectonic-relay-service] repo and hosted on Microsoft Azure cloud infrastructure -defined in Terraform configuration in the [tectonic-cloud-infra] repo. [@pkgw] owns -the domain name and Azure subscription. - -[tectonic-relay-service]: https://github.com/tectonic-typesetting/tectonic-relay-service -[tectonic-cloud-infra]: https://github.com/tectonic-typesetting/tectonic-cloud-infra -[@pkgw]: https://github.com/pkgw - -Along with the above change, this release contains the following improvements: - -- Add the [`tectonic -X dump`] V2 CLI command, which runs a partial document - build and outputs a requested intermediate file. 
This can help integrate - external tools into a Tectonic-based document processing workflow (#810, - @pkgw) -- Add support for custom support file search directories with the `-Z - search-path=` [unstable option][sp] (#814, @ralismark) -- Fix the `watch` V2 CLI command on Windows (#828, @Sinofine) -- Fix repeated updates in the `watch` V2 CLI command (#807, @jeffa5) -- Fix an incorrect error message when running V2 CLI commands outside of a - workspace (#813, @ralismark) -- Add a more helpful warning if an input produces empty output (#817, - @ralismark) -- Prevent an incorrect warning when reading some kinds of EXIF metadata (#822, - @korrat) -- Reject `-Z shell-escape=false`, which would be parsed as *enabling* - shell-escape (#823, @ratmice) - -[`tectonic -X dump`]: https://tectonic-typesetting.github.io/book/latest/v2cli/dump.html -[sp]: https://tectonic-typesetting.github.io/book/latest/v2cli/compile.html#unstable-options - - -# tectonic 0.7.1 (2021-07-04) - -- Improve launching of `biber` by parsing the `.run.xml` file to find out which - resource files are needed. This should hopefully allow Tectonic to process - many more documents that use `biblatex` ([#796], [#804], [@pkgw]). -- Avoid misplaced newlines in warning output ([#803], [@ralismark]). -- Fix the build on Rust 1.46, which will be helpful for the conda-forge package. - We really ought to define and monitor a Minimum Supported Rust Version (MSRV) - for Tectonic, but we haven't set that up just yet ([#802], [@pkgw]). 
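The 0.8.0 note above about rejecting `-Z shell-escape=false` comes down to how boolean unstable options are parsed. Here is a hypothetical sketch (names and interface invented; not the actual argument-parser code): a bare flag enables the feature, and an explicit `=false` is refused rather than being silently read as "flag present, therefore enabled."

```rust
// Hypothetical sketch only. A bare boolean -Z flag such as `shell-escape`
// turns the feature on; `shell-escape=false` is rejected with an error
// instead of being misread as enabling the feature.
fn parse_bool_z_option(spec: &str) -> Result<(String, bool), String> {
    match spec.split_once('=') {
        None => Ok((spec.to_owned(), true)),
        Some((name, "true")) => Ok((name.to_owned(), true)),
        Some((name, value)) => Err(format!(
            "-Z {}={} is not supported; omit the flag entirely to disable it",
            name, value
        )),
    }
}
```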
- -[#796]: https://github.com/tectonic-typesetting/tectonic/issues/796 -[#802]: https://github.com/tectonic-typesetting/tectonic/pull/802 -[#803]: https://github.com/tectonic-typesetting/tectonic/pull/803 -[#804]: https://github.com/tectonic-typesetting/tectonic/pull/804 -[@pkgw]: https://github.com/pkgw -[@ralismark]: https://github.com/ralismark - - -# tectonic 0.7.0 (2021-06-19) - -This release of Tectonic, at long last, adds support for [biber] to enable full -use of [biblatex][biber]! Biber is a complex Perl program, so, unlike the other -Tectonic “engines,” we can’t practically embed it within the Tectonic program. -This means that document builds using biber will have lessened reproducibility -and portability, but it’s better to have that than to fail to build the document -at all. - -[biber]: http://biblatex-biber.sourceforge.net/ - -Here's a sample document that should now get fully processed: - -```tex -% adapted from https://tex.stackexchange.com/a/34136/135094: -\documentclass{article} -\usepackage[autostyle]{csquotes} -\usepackage[ - backend=biber, - style=authoryear-icomp, - sortlocale=de_DE, - natbib=true, - url=false, - doi=true, - eprint=false -]{biblatex} -\addbibresource{biblatex-examples.bib} - -\usepackage[]{hyperref} -\hypersetup{ - colorlinks=true, -} - -\begin{document} - -Lorem ipsum dolor sit amet~\citep{kastenholz}. At vero eos et accusam et justo -duo dolores et ea rebum~\citet{sigfridsson}. - -\printbibliography -\end{document} -``` - -Tectonic’s new support detects a need to run `biber` by checking for the -creation of a file whose name ends in `.bcf`, and executes the `biber` program -inside a temporary directory, slurping any files that it creates into Tectonic’s -virtualized I/O subsystem. We’ll probably need to add a few new “knobs” to allow -users to control how and when biber is run — please file an issue if you run -into any limitations! 
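The `.bcf`-based trigger described above can be sketched roughly as follows. This is hypothetical illustration code, not the real implementation (which executes `biber` in a temporary directory through Tectonic's virtualized I/O subsystem): any file the engine creates whose name ends in `.bcf` yields one `biber` invocation on the corresponding stem.

```rust
// Hypothetical sketch only. For each `.bcf` control file the TeX pass
// created, produce the `biber` command line that would be run on its stem.
fn biber_invocations(created_files: &[&str]) -> Vec<String> {
    created_files
        .iter()
        .filter_map(|name| name.strip_suffix(".bcf"))
        .map(|stem| format!("biber {}", stem))
        .collect()
}
```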
- -Under the hood, the implementation includes the beginnings of a more generic -subsystem for including external tool programs in document builds. This may turn -out to be more generally useful going forward. - - -# tectonic 0.6.4 (2021-06-17) - -- Yet another new release to try to fix the docs.rs build. I think this one may - get it right. - - -# tectonic 0.6.3 (2021-06-17) - -- Another attempt to fix the docs.rs build. -- Update Cargo dependencies while we're at it. - - -# tectonic 0.6.2 (2021-06-16) - -- Attempt to fix the i686 Arch Linux package specification -- Attempt to fix the docs.rs build, hopefully. We might have to try a few - different approaches here before we find one that works. - - -# tectonic 0.6.1 (2021-06-15) - -- No code changes; the attempt to publish 0.6.0 to Crates.io failed spuriously, - so we're retriggering the release automation. - - -# tectonic 0.6.0 (2021-06-15) - -This release adds some helpful new utilities and internal cleanups, which -involve breaking API changes (see below). - -- New V2 command `tectonic -X show user-cache-dir` to print out the - location of the per-user cache directory. FAQ, answered! (@pkgw, #786) -- New V2 command `tectonic -X bundle search` to print out listings of files - contained in the "bundle" of TeX support files. If run in a workspace - containing a `Tectonic.toml` file, the workspace’s bundle is queried; - otherwise, the default bundle is queried. (@pkgw, #786) -- New V2 command `tectonic -X bundle cat` to print out one of the support files, - with the same general behavior as the `search` command. You could also use - this to ensure that a particular file has been loaded into the local cache. - (@pkgw, #786). -- Improved security model regarding the "shell-escape" feature, which has the - potential to be abused by untrusted inputs. 
A new `--untrusted` argument to - the V1 CLI and `tectonic -X build` disables the use of shell-escape, and any - other known-insecure features, regardless of the presence of `-Z shell-escape` - or any other options. Therefore, if you're writing a script that processes - untrusted input, making sure to run `tectonic --untrusted ...` lets you be - confident that further command-line arguments can't undo your sandboxing. - Furthermore, if the environment variable `$TECTONIC_UNTRUSTED_MODE` is set to - a non-empty value, the effect is as if `--untrusted` had been provided. - (@pkgw, #787) -- You know what ... get rid of the "beta" message in the V1 CLI. -- Fix SyncTeX output, we hope (e.g., #720, #744; @hulloanson, @pkgw, #762). - Tectonic's SyncTeX files should now include correct, absolute filesystem paths - when appropriate. -- Fix some broken low-level XeTeX built-ins, reported by @burrbull (@pkgw, #714, - #783) - -A few more words on the security model: the terminology is a bit slippery -here since we of course never intend to deliver a product that has security -flaws. But features like shell-escape, while providing useful functionality, can -certainly be abused to do malicious things given a hostile input. The default UX -aims to be conservative about these features, but if a user wants to enable -them, we'll allow them -- in the same way that Rust/Cargo will compile and run -`build.rs` scripts that in principle could do just about anything on your -machine. Our main security distinction is therefore whether the input is trusted -by the user running Tectonic. The decision of whether to "trust" an input or not -is something that fundamentally has to be made at a level above Tectonic -itself. Therefore the goal of Tectonic in this area is to provide the user with -straightforward and effective tools to express that decision.
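The trust decision described above reduces to a simple predicate. A hypothetical sketch follows (the real CLI would read `$TECTONIC_UNTRUSTED_MODE` via `std::env::var`; here the environment value is passed in as a parameter to keep the sketch self-contained): untrusted mode is in effect if the `--untrusted` flag was given or the variable is set to a non-empty value.

```rust
// Hypothetical sketch only. Insecure features stay disabled when either
// the `--untrusted` CLI flag was passed or $TECTONIC_UNTRUSTED_MODE is
// set to a non-empty value (supplied here as `env_value` for clarity).
fn untrusted_mode(cli_untrusted_flag: bool, env_value: Option<&str>) -> bool {
    cli_untrusted_flag || env_value.map_or(false, |v| !v.is_empty())
}
```

Note that the flag and the environment variable are deliberately one-way: either can force untrusted mode, and nothing on the command line can switch it back off.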
- -For developers, this release adds two new Cargo crates to the Tectonic -ecosystem: `tectonic_docmodel`, allowing manipulation of `Tectonic.toml` files -and their related data structures; and `tectonic_bundles`, allowing manipulation -of the Tectonic support file bundles. In both cases, third-party tools might -wish to use these formats without having to pull in all of the heavyweight -dependencies of the main `tectonic` crate. And in both cases, the separation has -led to many API improvements and cleanups that greatly improve the overall code -structure. These changes break the API of the `tectonic` crate by removing some -old modules and changing the particular traits and types used to implement these -systems. (@pkgw, #785, #786) - - -# tectonic 0.5.2 (2021-06-08) - -- Update dependencies, including [`watchexec`]. We believe that this should fix - the issues with the official Windows executables that have been reported - ([#780], [#782], [@pkgw]) - -[`watchexec`]: https://github.com/watchexec/watchexec -[#780]: https://github.com/tectonic-typesetting/tectonic/issues/780 -[#782]: https://github.com/tectonic-typesetting/tectonic/pull/782 -[@pkgw]: https://github.com/pkgw - - -# tectonic 0.5.1 (2021-06-07) - -**Note:** we have reports that the official 0.5.0 Windows executables don’t -work, or don’t always work ([#780]). This is under investigation but hasn’t been -addressed yet. - -- No code changes to the main crate -- Update the Arch Linux specification files to comply better with guidelines - ([#779], [@lmartinez-mirror]) - -[#779]: https://github.com/tectonic-typesetting/tectonic/pull/779 -[@lmartinez-mirror]: https://github.com/lmartinez-mirror -[#780]: https://github.com/tectonic-typesetting/tectonic/issues/780 - - -# tectonic 0.5.0 (2021-06-06) - -This is an exciting release! 
After [literally years of requests][i38], Tectonic -now supports the TeX “shell escape” mechanism required by some packages like the -[minted] code highlighter ([#708]). This is chiefly thanks to [@ralismark] who -put in the work to deliver a solid implementation and track ongoing changes to -the Tectonic backend. Thank you, [@ralismark]! - -[i38]: https://github.com/tectonic-typesetting/tectonic/issues/38 -[minted]: https://ctan.org/pkg/minted -[#708]: https://github.com/tectonic-typesetting/tectonic/pull/708 -[@ralismark]: https://github.com/ralismark - -Shell-escape remains disabled by default because it is, frankly, a hack that -detracts from the reproducibility and portability of document builds. It also -has significant security implications — you should never process untrusted input -with shell-escape enabled. But in those cases where shell-escape is necessary, -you can activate it with an [unstable option] in the [“V1”] command-line -interface: - -[unstable option]: https://tectonic-typesetting.github.io/book/latest/v2cli/compile.html#unstable-options -[“V1”]: https://tectonic-typesetting.github.io/book/latest/ref/v1cli.html - -``` -tectonic -Z shell-escape my-shell-escape-document.tex -``` - -In the [“V2”] model, you can activate shell-escape by adding the following line -to one or more `[output]` sections in your [`Tectonic.toml`] file: - -[“V2”]: https://tectonic-typesetting.github.io/book/latest/ref/v2cli.html -[`Tectonic.toml`]: https://tectonic-typesetting.github.io/book/latest/ref/tectonic-toml.html - -```toml -[output] -name = 'default' -type = 'pdf' -shell_escape = true # <== add this -``` - -The other major change associated with this release is for developers. The -Tectonic implementation has now been split into a number of specialized [Rust -crates][crate], each focusing on a specific piece of the overall Tectonic -functionality. 
Besides helping clarify and organize the large amount of code -that goes into Tectonic, this will make it easier for developers to create -Tectonic-based tools that use part of the codebase without having to depend on -every piece of it. - -[crate]: https://doc.rust-lang.org/book/ch07-01-packages-and-crates.html - -This change was made possible by adopting a new release automation tool called -[Cranko] that project lead [@pkgw] created last summer. Cranko is based on a -novel [“just-in-time versioning”][jitv] release workflow and extensive use of -Azure Pipelines continuous integration and deployment services — together these -make it feasible to manage versioning and releases of the 20 different crates -that now live within the Tectonic [monorepo]. This may not sound like the most -exciting kind of code to write, but Cranko has made it possible to almost -entirely automate the Tectonic release processes in a way that’s been nothing -short of transformative. - -[Cranko]: https://pkgw.github.io/cranko/ -[@pkgw]: https://github.com/pkgw -[jitv]: https://pkgw.github.io/cranko/book/latest/jit-versioning/index.html -[monorepo]: https://en.wikipedia.org/wiki/Monorepo - -This change comes with a bit of a downside, in that there have been a lot of API -breaks in the `tectonic` crate, as numerous internal APIs have been improved and -rationalized. If you only use the [`tectonic::driver`] module, changes should be -minimal, but lots of support systems have changed. It is likely that there will -be additional breaks in subsequent releases as a few remaining subsystems are -split out. The good news is that the APIs in the new sub-crates should be much -better designed and better documented than many of their former incarnations in -the main crate. 
- -[`tectonic::driver`]: https://docs.rs/tectonic/*/tectonic/driver/index.html - -There’s the usual collection of smaller improvements as well: - -- If a document referenced a filename that corresponded to a directory that - lived on the filesystem, you could get a hard-to-interpret error. Now, - directories are ignored when looking for files. - ([#754], [#759], [@pkgw]) -- A floating-point precision issue was fixed that broke the reproducibility of - builds on 32-bit versus 64-bit systems - ([#749], [#758], [@pkgw]) -- Fix potential undefined behavior in the `tectonic_xdv` crate reported by - [@sslab-gatech] - ([#752], [#753], [@pkgw]) -- Add the ability to customize the preamble, postamble, and index files in - V2 documents ([#745], [#746], [@jeffa5]) -- Add a V2 `tectonic -X watch` command to auto-rebuild documents when their - source files get updated ([#719], [#734], [@jeffa5]) -- Add an `--open` option to `tectonic -X build` to open the document(s) - after the build finishes ([#109], [#733], [@jeffa5]) -- The usual updates to dependencies, build fixes, and documentation tweaks - -[#109]: https://github.com/tectonic-typesetting/tectonic/issues/109 -[#719]: https://github.com/tectonic-typesetting/tectonic/issues/719 -[#745]: https://github.com/tectonic-typesetting/tectonic/issues/745 -[#749]: https://github.com/tectonic-typesetting/tectonic/issues/749 -[#752]: https://github.com/tectonic-typesetting/tectonic/issues/752 -[#754]: https://github.com/tectonic-typesetting/tectonic/issues/754 -[#733]: https://github.com/tectonic-typesetting/tectonic/pull/733 -[#734]: https://github.com/tectonic-typesetting/tectonic/pull/734 -[#746]: https://github.com/tectonic-typesetting/tectonic/pull/746 -[#753]: https://github.com/tectonic-typesetting/tectonic/pull/753 -[#758]: https://github.com/tectonic-typesetting/tectonic/pull/758 -[#759]: 
https://github.com/tectonic-typesetting/tectonic/pull/759 -[@sslab-gatech]: https://github.com/sslab-gatech -[@jeffa5]: https://github.com/jeffa5 - - -# tectonic 0.4.1 (2021-01-03) - -- Add support for aarch64-apple-darwin when building with vcpkg -- Prototype release automation infrastructure to update the new - [tectonic-bin](https://aur.archlinux.org/packages/tectonic-bin/) AUR package. - - -# tectonic 0.4.0 (2020-12-28) - -- Introduce a prototype new “V2” command line interface, accessible by running - Tectonic with an initial `-X` argument: `tectonic -X new`. This interface is - oriented around a new document model defined by a `Tectonic.toml` definition - file. Documentation is under development in [the - book](https://tectonic-typesetting.github.io/book/latest/). Eventually, this - new interface will become the default, after a migration period. It is - currently fairly basic, but will be fleshed out in the 0.4.x release series. -- Handle USV 0xFFFF in `print()` (#678, #682, @burrbull, @pkgw) -- Update the Arch Linux `makedepends` definitions (#691, @snowinmars) -- Update various Cargo dependencies. - - -# tectonic 0.3.3 (2020-11-16) - -- When testing whether the engine needs rerunning, compare the new file to the - entire old file, not just the part that was read by the engine. Should fix - unnecessary reruns in some less-common cases. (#679, #681, @pkgw) - - -# tectonic 0.3.2 (2020-11-14) - -- Slightly alter how some filenames are looked up. Before, if the TeX code - requested a file whose name contained an extension, e.g. `foo.bar`, if no such - file was available in the bundle we gave up immediately. Now we also check for - `foo.bar.tex` and friends. This fixes the `lipsum` package in TeXLive 2020.0 - (#669, #680, @pkgw), and quite possibly some other miscellaneous packages as - well. 
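The 0.3.2 lookup change above amounts to trying a short list of candidate names before giving up. A hypothetical sketch (the real lookup may try further extensions, hence "and friends"; this shows only the `.tex` fallback, and the function name is invented):

```rust
// Hypothetical sketch only. A request for `foo.bar` is tried verbatim
// first; if the bundle has no match for that name, `foo.bar.tex` is
// tried as a fallback.
fn lookup_candidates(requested: &str) -> Vec<String> {
    let mut candidates = vec![requested.to_owned()];
    if !requested.ends_with(".tex") {
        candidates.push(format!("{}.tex", requested));
    }
    candidates
}
```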
- - -# tectonic 0.3.1 (2020-11-02) - -- Fix compilation on Windows/MSVC (`sys/time.h` is not available) -- Don't print an empty `error:` line in the CLI (#665, #670) - - -# tectonic 0.3.0 (2020-11-01) - -The 0.3 series updates the core Tectonic engines to align with the code in -[TeXLive 2020.0][tl2020.0]. The default “bundle” of support files will soon be -updated to match TeXLive 2020.0 as well. Standard usages should work if you use -an older version of Tectonic with the new bundle, and vice versa, but we -recommend that you update your installations to the 0.3 series promptly if you -can. - -[tl2020.0]: https://www.tug.org/texlive/ - -This release introduces a straightforward but **breaking change** to the API of -the `tectonic` Rust crate, documented below. - -For context, Tectonic’s core TeX implementation is forked from the [XeTeX] -engine. Accumulated changes to XeTeX are periodically reviewed and imported into -Tectonic, a process that must be done manually because Tectonic’s modernized -developer and user experiences demand a huge amount of customization. (The -scripts to support the first stage of this process may be found in the -[tectonic-staging] repository.) It has been a while since the last -synchronization, but this release incorporates the new changes introduced -between the last update and the release of TeXLive 2020.0. - -[XeTeX]: https://tug.org/xetex/ -[tectonic-staging]: https://github.com/tectonic-typesetting/tectonic-staging - -The changes for TeXLive 2020.0 include: - -- New low-level primitives including `\filemoddate`, `\filedump`, - `\uniformvariate`, `\elapsedtime`, and a few others. -- Tweaks to how font design sizes are mapped to TeX values -- New magic numbers used in PDF last x/y position accounting, - instead of `cur_[hv]_offset`. -- Don't `print_raw_char` in `show_context` with `trick_buf` -- Back up `cur_cs` in `scan_keyword` and `compare_strings`.
-- Handle `XETEX_MATH_GIVEN` in `scan_something_internal` -- If encountering an unexpandable primitive in `scan_something_internal`, - try to deal with it. Ditto for `scan_int`. -- Do something different with active characters in `str_toks_cat` -- Rework how file names are scanned. -- Defend against undefined eTeX registers in `main_control` -- Some `uc_hyph` tweak deep in the linebreaking algorithm - -There are also numerous changes in Tectonic’s included `xdvipdfmx`. - -The implementation of the `\filemoddate` API required a **breaking change** to -the API of the `tectonic` Rust crate: - -- We needed to add more data to the data structures of the `MemoryIo` provider. - Migration should be pretty easy: instead of `files` containing a bunch of - `Vec`s, it now contains a bunch of `MemoryFileInfo` structs that contain a - `Vec` field named `data`. So you just need to add some `.data` field - accessors to existing code. This API clearly needs some polish to allow - greater stability going forward. - -Other changes: - -- Issue a warning if `xdvipdfmx` needs to translate a VF font to PK format, - which is unimplemented in Tectonic (it relies on `mktexpk`) and so causes - failures on certain documents that work with XeTeX. -- The Windows [vcpkg]-based build is temporarily disabled, as vcpkg currently has - [a debilitating issue][vcpkg-issue] relating to SSL on Windows. -- There is a new `-Z continue-on-errors` unstable option that tells the engine - to emulate the classic TeX style of plunging on ahead even in the face of - severe errors. This is a nice example of the possibilities unlocked by the new - `-Z` infrastructure introduced in 0.2! - -[vcpkg]: https://github.com/microsoft/vcpkg -[vcpkg-issue]: https://github.com/tectonic-typesetting/tectonic/issues/668 - - -# tectonic 0.2.0 (2020-10-21) - -The 0.2 series finally adds "unstable" `-Z` flags! These allow you to configure -engine options that are relatively low-level. 
The hope is to eventually set -these kinds of things in a `Tectonic.toml` file, so their use is mildly -discouraged, and long-term availability is not guaranteed. But there are plenty -of times when such flags can be helpful. The currently supported options are: - -- `-Z min-crossrefs=` controls the `-min-crossrefs` option of standalone `bibtex` -- `-Z paper-size=` lets you control the output paper size, rather than - hardcoding it to be US Letter. - -Enormous thanks to @ralismark for finally implementing this! (#657) Now that the -infrastructure is in place, suggestions for additional flags are more than -welcome. - - -# tectonic 0.1.17 (2020-10-13) - -- Fix source-based installation by updating to cbindgen 0.15, after later - releases in the 0.14 series were yanked (@qtfkwk, #656) -- Fix unreachable code in the CID/Type0 code (@burrbull, @pkgw, #639, #646) -- Update the `cargo vcpkg` build process to use a newer version of `vcpkg`, fixing - Windows builds when msys2.org is unreliable (@pkgw). - - -# tectonic 0.1.16 (2020-10-02) - -- Add a "plain" backend for reporting status, used when the program is not - attached to a TTY. It will print out reports with colorization. (#636, - @ralismark) -- Start adding infrastructure to automate the creation of bindings from the - C/C++ code to the Rust code, using `cbindgen`. (#643, @ralismark) -- Update the code-coverage infrastructure to gather coverage information - from invocations of the CLI executable inside the test suite (@pkgw) -- Fully automated deployment should really actually totally work this time. - - -# tectonic 0.1.15 (2020-09-10) - -- Building on the work done in 0.1.13, we now capture and report diagnostics - nearly everywhere! Great work contributed by @ralismark (#635). -- Yet more revisions to the automated deployment system. Maybe *this* will be - the time that it all works (@pkgw, #637). - - -# tectonic 0.1.14 (2020-09-08) - -- No code changes from 0.1.13. 
Just trying to iron out some of the automated - deployment systems. Is this the time that the Arch Linux auto-deployment - finally works?? - - -# tectonic 0.1.13 (2020-09-07) - -It has once more been a long time since the last release. But this new release -features a move to a new release automation framework, [Cranko], which is -intended to promote a more aggressive release policy going forward. Cranko is -the result of a *lot* of careful thinking and design — resulting in a scheme -called [just-in-time versioning][jitv] — and it should offer a tractable and -low-friction framework for making releases even when there are many crates in -one repository. - -[Cranko]: https://github.com/pkgw/cranko -[jitv]: https://pkgw.github.io/cranko/book/latest/jit-versioning/ - -User-facing improvements: - -- Select core TeX warnings — notably, over/underfull boxes — are now surfaced as - Tectonic warnings, and not just reported in the detailed log files! The - infrastructure is now available to capture many more such warnings as needed. - (#625; @ralismark, @pkgw) -- Fix a few algorithmic mistakes introduced in manual editing of the C code. - Great catches by @burrbull! (#617, #624) -- Improve log formatting with backticks around filenames and human-friendly file - sizes (#539; @as-f) -- Fix segfaults (!) upon errors (#579, #606; @fmgoncalves) -- Default bibtex's `min_crossrefs` to 2, not 0 (#534; @jneem) -- Help debug "lost characters" with their detailed hex codes (#600; @pkgw) - -Developer-facing improvements: - -- CI system has been completely revamped to use [Cranko] and route entirely - through Azure Pipelines. Maintainability should be massively improved (@pkgw) -- Releases should now include pre-built binaries for a variety of architectures - (@pkgw). 
-- Switched to `app_dirs2`, since `app_dirs` is unmaintained (#620; @kornelski) -- Enable reproducible-ish builds through `cargo vcpkg` (#593; @mcgoo) -- Update to the 0.9 series of the `rust-crypto` packages (#596; @pkgw) -- Attempt to fix automated Arch Linux build (#587; @thomaseizinger) -- Fix a memory leak (#536; @elliott-wen) -- The usual large number of dependency updates with DependaBot. - - -# 0.1.12 (2019 Dec 6) - -It has been just more than a year since the last Tectonic release, mainly -because I (@pkgw) -[started a new job](https://newton.cx/~peter/2018/operation-innovation/) that -has massively restructured how I spend my time. But that is not to say that -things have been quiet for Tectonic! I count 81 pull requests merged since -0.1.11. (Ignoring automated ones issued by Dependabot.) - -User-facing improvements: - -- Thanks to @efx we now have the beginnings of - [a Tectonic book](https://tectonic-typesetting.github.io/book/)! It is - currently very sparse, but we hope to gradually flesh it out. The book is - updated automatically upon merges to `master` and with tagged releases as - well (if @pkgw wired up the infrastructure correctly). (#427, #444, #445, - #447, #505; @efx @PHPirates @pkgw) -- Tectonic’s caching scheme is now much smarter, saving a local copy of the - table-of-contents file associated with each online bundle. This means that - Tectonic doesn’t need to hit the network at all if a new file is referenced - that is not present in the bundle, and saves a large download if a new - needed file *is* present in the bundle. (#431; @rekka) -- Performance has been improved by avoiding the computation of SHA256 hashes - for read-only files. Since these files are known not to change, we don’t - have to monitor their contents. (#453; @rekka) -- Warnings are only flagged if they occur on the final pass of the TeX engine, - since sometimes ones that occur in the first pass get fixed by subsequent - reruns. 
(#458; @rekka) - -There have been a *ton* of developer-facing improvements: - -- Tectonic is now built using the Rust 2018 edition! (#388; @Mrmaxmeier) -- @Mrmaxmeier built an amazing system to start doing - [crater](https://github.com/rust-lang/crater)-like runs of Tectonic on the - [arxiv.org](https://arxiv.org) corpus, yielding bug fixes across the - codebase including issues with obscure PNG formats. (#401; @Mrmaxmeier) -- It is now possible to build various statically-linked versions of Tectonic. - One way to accomplish this is to turn off the new `serialization` Cargo - feature. This eliminates the use of Rust “procedural macros” in the build - process which in turn allows Tectonic to be built on statically-linked - platforms. (Note, however, that it is possible to build Tectonic for - statically-linked platforms *with* the serialization feature by - cross-compiling it from a dynamically-linked platform. This is the tactic - used by the Tectonic CI build system.) @efx also - [wrote instructions for how to accomplish a mostly-static build on macOS using vcpkg](https://tectonic-typesetting.github.io/book/latest/cookbook/vcpkg.html) - as well as - [how to do it on Linux/musl using Docker](https://github.com/tectonic-typesetting/tectonic/tree/master/dist/docker/x86_64-alpine-linux-musl) - (#260, #261, #325, #425, #451; @efx, @malbarbo, @pkgw) -- Tectonic now distributes a continuous-deployment - [AppImage](https://appimage.org/) build. (#283, #285; @pkgw, @probonopd, - @xtaniguchimasaya) -- The size of the binary has decreased somewhat by using a smaller collection - of compression libraries; avoiding the use of exceptions and RTTI in the - compiled C++ code; avoiding the use of the `aho_corasick` crate; and making - the `toml` crate optional. (#428, #439, #440, #491; @malbarbo) -- Tectonic now uses the - [reqwest](https://docs.rs/reqwest/0.10.0-alpha.2/reqwest/) crate as its HTTP - backend instead of direct use of [hyper](https://hyper.rs/). 
Reqwest offers
-  a simpler interface and adds better support for things like HTTP proxies and
-  cookie handling. These new features do increase the binary
-  size somewhat. (#330, @spl)
-- Tectonic can now be built on `x86_64-pc-windows-msvc` by using
-  [vcpkg](https://github.com/microsoft/vcpkg) to discover dependent libraries.
-  This can be activated by setting the new environment variable
-  `TECTONIC_DEP_BACKEND=vcpkg` during the build process. (#420; @mcgoo)
-- Potential issues with cross-compilation are fixed by properly respecting
-  `CARGO_TARGET_*` environment variables rather than using `cfg!()` macros,
-  which have the wrong values in the `build.rs` script. This support is
-  provided by a new `tectonic_cfg_support` crate that may be of interest to
-  other projects. (#477; @pkgw @ratmice)
-- Tectonic now comes with beta-level fuzzing support using
-  [cargo-fuzz](https://github.com/rust-fuzz/cargo-fuzz). It is hoped that
-  eventually this infrastructure will help identify and close some truly
-  obscure and gnarly bugs in the Tectonic language implementation. At present,
-  the usefulness of the fuzzer is limited by memory leaks within multiple
-  reruns of the Tectonic engine, although in the process of setting up the
-  fuzzer several egregious leaks were fixed. (#315; @cyplo @pkgw)
-- The Rust codebase is now formatted according to
-  [rustfmt](https://github.com/rust-lang/rustfmt) and generates no
-  [clippy](https://github.com/rust-lang/rust-clippy) complaints, and the CI
-  system now checks for these. (#282, #336, #337, #338, #339, #340, #341, #342,
-  #343, #344, #345, #346, #347, #348, #349, #352, #353; @pkgw @spl)
-- A new `profile` feature allows building a debug version of the program
-  suitable for profiling. (#511; @ratmice)
-- The test suite now covers the bibtex tool. (#407; @Mrmaxmeier)
-- The test suite also now covers the local cache and tar bundle code. 
(#441; - @rekka) -- The CLI client now parses arguments using the `structopt` crate. (#465, #474; - @efx @Mrmaxmeier) -- A new `DirBundle` bundle backend provides a simple way for the engine to - access a bunch of files in a directory, although it is not yet wired up - to the CLI interface in a convenient way. (#492; @malbarbo) -- The current date tracked by the TeX engine is now settable from the Rust - level. (#486; @Mrmaxmeier). -- More cleanups and symbolification of the C/C++ code (#317, #327, #350, #398; - @Mrmaxmeier @pkgw @spl) -- C++ compilation on certain versions of g++ was fixed (#265; @pkgw) -- Fix deprecation warnings from the `error_chain` crate (#351; @spl) -- Improvements to the Travis CI infrastructure, output clarity, and - reliability. (#354, #360, #362, #394, #424, #443; @efx @rekka @spl) -- Attempts were made to increase the reliability of the Circle CI build, which - uses QEMU to compile Tectonic for a big-endian architecture. Unfortunately - it still just times out sometimes. (#290, #296; @pkgw) -- The deprecated `tempdir` crate has been replaced with `tempfile`. (#387; - @ratmice) -- Usage of `app_dirs` directories is now encapsulated better. (#429, #432; - @malbarbo @rekka) -- Bugs in reading unusual PDF files were fixed. (#396; @pkgw) -- A missing space in bibtex error messages is now back. (#485; @jneem) -- A memory corruption issue in the bibtex engine was fixed. (#493; @jneem) - - -# 0.1.11 (2018 Nov 5) - -This release is mainly about the following change: - -- The URL embedded in the code that points to the default bundle has been - changed to point to the archive.org domain. Hopefully this will result in - more reliable service — there have been problems with SSL certificate - updates on purl.org in the past - ([#253](https://github.com/tectonic-typesetting/tectonic/pull/253)). 
- -Developer-facing improvements: - -- The main crate now provides an all-in-one function, - `tectonic::latex_to_pdf()`, that does what it says, using “sensible” - defaults. Run a full TeX processing session, end-to-end, in a single - function call! - ([#252](https://github.com/tectonic-typesetting/tectonic/pull/252)) -- In support of the previous change, the behavior of the Rust code was changed - to use a static global - [mutex](https://doc.rust-lang.org/std/sync/struct.Mutex.html) to serialize - invocations of the C/C++ engine implementations, which currently include - massive amounts of global state and thus cannot be run in a multithreaded - fashion. The recommended approach used to be for users of the library to - provide such a mutex themselves. [@pkgw](https://github.com/pkgw) was - initially reluctant to include such a mutex at the crate level since he - feared the possibility of weird surprising behavior … but the *real* weird - surprising behavior is when you try to run the engine in a multithreaded - program and it blows up on you! -- *Also* in support of the previous change, the framework for running the test - suite has been revamped and improved. We can now run doctests that invoke - the full engine, and the tests of the executable artifacts now activate a - special debug mode that prevents accesses of the network and/or the calling - user’s personal resource file cache. -- The usual work on tidying the C/C++ code, and also more work towards the - planned HTML output mode. Activating the experimental “semantic pagination” - mode now alters the engine behavior in two key ways: it disables the - linebreaker and custom output routines. This breaks processing of all extant - documents, but [@pkgw](https://github.com/pkgw) believes that these changes - are important steps toward reliable generation of HTML output. 
- ([#237](https://github.com/tectonic-typesetting/tectonic/pull/237), - [#239](https://github.com/tectonic-typesetting/tectonic/pull/239), - [#245](https://github.com/tectonic-typesetting/tectonic/pull/245), - [#250](https://github.com/tectonic-typesetting/tectonic/pull/250)) - - -# 0.1.10 (2018 Sep 28) - -This release is mainly about upgrading a dependency related to SSL/TLS to -increase the range of systems on which Tectonic can be compiled. - -User-facing improvements: - -- Tectonic now correctly handles Unicode filenames — even ones containing - emoji! — without crashing - ([#165](https://github.com/tectonic-typesetting/tectonic/pull/165)). - -Developer/packager-facing improvements: - -- Tectonic now depends on the 0.3.x series of - [hyper-native-tls](https://crates.io/crates/hyper-native-tls), which can - build against the 1.1.x series of [OpenSSL](https://www.openssl.org/). - - -# 0.1.9 (2018 Sep 15) - -User-facing improvements: - -- Tectonic is now available on Windows! - ([#210](https://github.com/tectonic-typesetting/tectonic/pull/210), - [#231](https://github.com/tectonic-typesetting/tectonic/pull/231)). There - are likely to be rough edges to both the developer and user experience, but - the test suite passes and Windows is now included in the CI infrastructure. - Big thanks to new contributor [@crlf0710](https://github.com/crlf0710) who - really got the ball rolling on this important milestone. -- Fully offline operation is now much improved: - - There is a new `--only-cached` (AKA `-C`) option that will avoid all - Internet connections - ([#203](https://github.com/tectonic-typesetting/tectonic/pull/203)). While - Tectonic takes pains to avoid needing an Internet connection when compiling - documents, there are still times when you can get more done by explicitly - preventing it from even trying to talk to the network. 
-  - The `--bundle` and `--web-bundle` options finally work again. The switch
-    to on-the-fly generation of format files broke them due to an internal
-    implementation problem; this has now been fixed
-    ([#181](https://github.com/tectonic-typesetting/tectonic/pull/181)).
-  - If you put a `file://` URL into your Tectonic configuration file as your
-    default bundle, Tectonic will now load it correctly
-    ([#211](https://github.com/tectonic-typesetting/tectonic/pull/211)).
-
-Internal improvements:
-
-- Tectonic now avoids panicking from Rust into C code, which is not supported
-  behavior ([#91](https://github.com/tectonic-typesetting/tectonic/pull/91)).
-  Thanks to [@rekka](https://github.com/rekka) for persistence in getting this
-  one across the finish line.
-- Tectonic now avoids crashing when trying to open empty filenames
-  ([#212](https://github.com/tectonic-typesetting/tectonic/pull/212)).
-
-Developer-facing improvements:
-
-- Tectonic is now more up-front about the fact that it requires Harfbuzz
-  version 1.4 or higher.
-- Much of the code that drives compilation for the CLI tool has been moved
-  into the Tectonic library and has been made (more) reusable
-  ([#184](https://github.com/tectonic-typesetting/tectonic/pull/184)). Thanks
-  to new contributor [@jneem](https://github.com/jneem) for doing this!
-
-
-# 0.1.8 (2018 Jun 17)
-
-This release contains a variety of bugfixes and features-in-development.
-
-User-facing improvements:
-
-- A prominent warning is now emitted when missing characters are encountered
-  in a font. The hope is that this will help un-confuse users who include
-  Unicode characters in their input files without loading a Unicode-capable
-  font. Before this change, such characters would just not appear in the
-  output document.
-- Fix the implementation of the DVI “POP” operator, which was broken due to a
-  typo. This should fix various corner-case failures to generate output. 
-- The `.toc` and `.snm` output files emitted by Beamer are now treated as - intermediate files, and therefore not saved to disk by default (contributed - by Norbert Pozar). -- Various hardcoded `bibtex` buffer sizes are increased, allowing larger - bibliographies to be handled. -- Format files are now stored uncompressed. The compression did not save a ton - of disk space, but it did slow down debug builds significantly (contributed - by @Mrmaxmeier). -- The C code has been synchronized with XeTeX as of its Subversion - revision 46289. The chief difference from before is the use of newer - [Harfbuzz](https://www.freedesktop.org/wiki/Software/HarfBuzz/) features for - rendering OpenType math fonts, which should substantially improve “Unicode - math” output. - -Work towards HTML output: - -- The first steps have been taken! In particular, the engine now has an - internal flag to enable output to a new “SPX” format instead of XDV. SPX - stands for Semantically Paginated XDV — based on my (PKGW’s) research, to - achieve the best HTML output, the engine will have to emit intermediate data - that are incompatible with XDV. At the moment, SPX is the same as XDV except - with different identifying bytes, but this will change as the work towards - excellent HTML output continues. The command-line tool does **not** provide - access to this output format yet, so this work is currently purely internal. -- In addition, there is a stub engine called `spx2html` that will translate - SPX to HTML. At the moment it is a barely-functional proof-of-concept hook, - and it is not exposed to users. -- A new internal crate, `tectonic_xdv`, is added. It can parse XDV and SPX - files, and is used by the `spx2html` engine. - -Test suite improvements: - -- The test suite now supports reliable byte-for-byte validation of PDF output - files, through the following improvements: - - It is now possible for the engine to disable PDF compression (contributed - by @Mrmaxmeier). 
- - `xdvipdfmx` gained a mode to reproducibly generate the “unique tags” - associated with fonts. -- The testing support code is now centralized in a single crate (contributed - by @Mrmaxmeier). -- Continuous integration (CI) coverage now includes Linux and a big-endian - platform. -- The CI coverage now includes code coverage monitoring. - -Internal improvements: - -- Much of the command-line rebuild code has been moved inside the `tectonic` - crate so that it can be reused in a library context (contributed by @jneem). - -Improvements to the C code. As usual, there has been a great deal of tidying -that aims to make the code more readable and hackable without altering the -semantics. Many such changes are omitted below. - -- Tectonic’s synchronization with XeTeX is now tracked in version control - formally, by referencing the - [tectonic_staging](https://github.com/tectonic-typesetting/tectonic-staging) - repository as a Git submodule. It is not actually necessary to check out - this submodule to build Tectonic, however. -- The C code now requires, and takes advantage of, features in the - [C11](https://en.wikipedia.org/wiki/C11_(C_standard_revision)) revision of - the language standard. -- All remaining pieces of C code that needed porting to the Rust I/O backend - have been ported or removed. -- Virtually all hardcoded strings in the string pool have been removed - (contributed by @Mrmaxmeier). -- The C code has been split into a few more files. Some subsystems, like the - “shipout” code, use a lot of global variables that have been made static - thanks to the splitting. -- A big effort to clarify the pervasive and unintuitive `memory_word` - structure. -- Effort to tidy the `line_break()` function and significantly increase its - readability. This is in support of the goal of producing HTML output, for - which I believe it is going to be necessary to essentially defuse this - function. 
- - -# 0.1.7 (2017 Nov 15) - -(Copy-pasted from [the relevant forum post](https://tectonic.newton.cx/t/announcing-tectonic-0-1-7/76)). - -This is a fairly modest release — things have been very quiet lately as real -life and the day job have dominated my time. - -The most visible change is that I just figured out how to fix -[issue #58](https://github.com/tectonic-typesetting/tectonic/issues/58) — -Tectonic would fail to parse certain PDF images if one tried to include them -in a document. There was a bit of a silly bug in the Rust I/O layer that was -not getting exposed except in fairly specialized circumstances. It’s squashed -now! So certain documents that used to fail to compile should work now. - -There’s also been yet more nice work behind the scenes by some of our -indefatigable contributors: - -- @Mrmaxmeier contributed a whole bunch of cleanups to the C code as well as - fixes that should allow you to generate multiple PDFs inside a single - process. -- Ronny Chevalier updated the compilation infrastructure to work in parallel - and in the end contributed some features to Rust’s upstream [gcc] crate! - -There have been intermittent problems lately with the SSL certificate to the -purl.org domain which we use to seed the default “bundle” of LaTeX files. It’s -working for me at the moment, so it’s not totally busted, but the problem -seems to have come and gone over the past few weeks. See -[this thread](https://tectonic.newton.cx/t/problems-caused-by-expired-ssl-certificate-for-purl-org/75/1) -for more information and a workaround. - - -# 0.1.6 (2017 Jul 9) - -(Copy-pasted from -[the relevant forum post](https://tectonic.newton.cx/t/announcing-tectonic-0-1-6/45)). - -The version increment may be small but the project has seen an enormous amount -of work since the previous release, thanks to an awesome group of new -contributors. 
Here are some of the highlights: - -- Tectonic is now available for installation on Arch Linux as - [an AUR package](https://aur.archlinux.org/packages/tectonic/) and on macOS - as [a Homebrew formula](http://brewformulas.org/Tectonic), thanks to the - hard work of Alexander Bauer, Alexander Regueiro, Jan Tojnar, Kevin Yap, and - @ilovezfs. -- The web fetching is more robust and safer, using HTTPS by default - ([#69](https://github.com/tectonic-typesetting/tectonic/pull/69), Ronny - Chevalier) and more properly handling CDN redirections - ([#114](https://github.com/tectonic-typesetting/tectonic/pull/114), - @Mrmaxmeier) -- Input and output filenames with spaces and non-local paths are now handled - much better all around - ([#44](https://github.com/tectonic-typesetting/tectonic/pull/44), Alexander - Bauer; [#89](https://github.com/tectonic-typesetting/tectonic/pull/89), - Norbert Pozar; - [#94](https://github.com/tectonic-typesetting/tectonic/pull/94), Peter - Williams) -- SyncTeX output is now fully supported, activated with the new `--synctex` - option ([#55](https://github.com/tectonic-typesetting/tectonic/pull/55), - [#73](https://github.com/tectonic-typesetting/tectonic/pull/73), Norbert - Pozar) -- The output files can be placed in a directory other than the input directory - if the new `--outdir` or `-o` option is specified - ([#104](https://github.com/tectonic-typesetting/tectonic/pull/104), Felix - Döring) -- Tectonic will cleanly process TeX code provided on standard input if the - input path argument is `-` - ([#94](https://github.com/tectonic-typesetting/tectonic/pull/94), Peter - Williams) -- Tectonic's first new primitive, - [\TectonicCodaTokens](https://tectonic.newton.cx/t/engine-extension-tectoniccodatokens/16), - has been added to allow - [boilerplate-free document processing](https://tectonic.newton.cx/t/boilerplate-free-latex-documents/29) - (Peter Williams). 
-- The API docs can be, and are, built on [docs.rs](https://docs.rs/tectonic) - as of this release. - -Furthermore, I've launched a -[Tectonic forum site](https://tectonic.newton.cx/) (running an instance of the -[Discourse.org](https://www.discourse.org/) software). This is a bit of an -experiment since the project is so young and there are of course other venues, -like [GitHub issues](https://github.com/tectonic-typesetting/tectonic/issues) -and the [TeX StackExchange](https://tex.stackexchange.com/), for having -relevant discussions. But, by launching the Discourse site, we gain a venue -for project news (like this announcement!), more open-ended technical -discussions, Tectonic-specific tips and tricks that may not fit the -StackExchange model, and a knowledge base of answers to the roadblocks that -are so common in the TeX/LaTeX ecosystem. We hope that the forums will become -a valuable complement to the other community areas that are already out there. - -Here are some more improvements since the 0.1.5 release: - -- Some early work has occurred to make it possible to build Tectonic on - Android ([#105](https://github.com/tectonic-typesetting/tectonic/pull/105), - Marco Barbosa) -- The project’s build infrastructure is now more efficient - ([#60](https://github.com/tectonic-typesetting/tectonic/pull/60), Norbert - Pozar; [#116](https://github.com/tectonic-typesetting/tectonic/pull/116), - Ronny Chevalier) -- The style of the translated C code has been improved enormously thanks to - both manual interventions and the use of the neat tool - [Coccinelle](http://coccinelle.lip6.fr/), reducing warnings and increasing - cleanliness and portability - ([#66](https://github.com/tectonic-typesetting/tectonic/pull/66), - [#76](https://github.com/tectonic-typesetting/tectonic/pull/76), - [#83](https://github.com/tectonic-typesetting/tectonic/pull/83), - 
[#92](https://github.com/tectonic-typesetting/tectonic/pull/92),
-  [#107](https://github.com/tectonic-typesetting/tectonic/pull/107),
-  [#112](https://github.com/tectonic-typesetting/tectonic/pull/112), Ronny
-  Chevalier;
-  [#105](https://github.com/tectonic-typesetting/tectonic/pull/105), Norbert
-  Pozar; [#94](https://github.com/tectonic-typesetting/tectonic/pull/94),
-  [#98](https://github.com/tectonic-typesetting/tectonic/pull/98), Peter
-  Williams)
-- The test suite now covers behaviors of the Tectonic command-line program
-  itself ([#84](https://github.com/tectonic-typesetting/tectonic/pull/84),
-  Alexander Bauer)
-- We now correctly run `bibtex` when using the `amsrefs` package
-  ([#48](https://github.com/tectonic-typesetting/tectonic/pull/48), Norbert
-  Pozar)
-- Tectonic will correctly try a wider variety of file extensions when trying
-  to open resources
-  ([#93](https://github.com/tectonic-typesetting/tectonic/pull/93), Marek
-  Šuppa; [#100](https://github.com/tectonic-typesetting/tectonic/pull/100),
-  Norbert Pozar)
-- Cached bundle files are now made read-only
-  ([#55](https://github.com/tectonic-typesetting/tectonic/pull/55), Alexander
-  Bauer)
-- We’ve fixed a subtle path handling issue that was harming generation of the
-  standard LaTeX format
-  ([#77](https://github.com/tectonic-typesetting/tectonic/pull/77), Norbert
-  Pozar)
-- Very large bibliographies are now better supported
-  ([#87](https://github.com/tectonic-typesetting/tectonic/pull/87), Marek
-  Šuppa)
-- The UI now makes it clearer that network failures are not likely Tectonic’s
-  fault ([#88](https://github.com/tectonic-typesetting/tectonic/pull/88),
-  Marek Šuppa)
-- It is now theoretically possible to load Omega font metrics files
-  ([#97](https://github.com/tectonic-typesetting/tectonic/pull/97), Peter
-  Williams)
-- Output log files are now produced if `--keep-logs` 
is specified and an error
-  occurs ([#103](https://github.com/tectonic-typesetting/tectonic/pull/103),
-  Norbert Pozar)
-
-There are a few known problems with this release:
-
-- Tectonic doesn’t support HTTP proxies, and in some parts of the world you
-  can’t access the [purl.org](https://purl.org/) website that Tectonic checks
-  for its bundle. You can work around this by
-  [creating a custom configuration file](https://tectonic.newton.cx/t/how-to-use-tectonic-if-you-can-t-access-purl-org/44).
-- Tectonic doesn’t have a mechanism to invoke the
-  [biber](http://biblatex-biber.sourceforge.net/) tool, so it cannot easily
-  work for anyone who uses
-  [biblatex](http://mirrors.rit.edu/CTAN/help/Catalogue/entries/biblatex.html).
-  This is a common complaint, so it would be great to see a workaround devised
-  ([relevant issue](https://github.com/tectonic-typesetting/tectonic/issues/35))!
-
-Enormous thanks are in order to everyone who’s started contributing to the
-project.
-
-
-# Previous releases
-
-Are not documented here. Consult the Git history. 
+[branch]: https://github.com/tectonic-typesetting/tectonic/blob/release/CHANGELOG.md +[gh-releases]: https://github.com/tectonic-typesetting/tectonic/releases diff --git a/Cargo.lock b/Cargo.lock index cd46faabbb..23096ca0e4 100644 --- a/Cargo.lock +++ b/Cargo.lock @@ -4,9 +4,9 @@ version = 3 [[package]] name = "addr2line" -version = "0.17.0" +version = "0.19.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "b9ecd88a8c8378ca913a680cd98f0f13ac67383d35993f86c90a70e3f137816b" +checksum = "a76fd60b23679b7d19bd066031410fb7e458ccc5e958eb5c325888ce4baedc97" dependencies = [ "gimli", ] @@ -19,9 +19,18 @@ checksum = "f26201604c87b1e01bd3d98f8d5d9a8fcbb815e8cedb41ffccbeb4bf593a35fe" [[package]] name = "aho-corasick" -version = "0.7.19" +version = "0.7.20" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "cc936419f96fa211c1b9166887b38e5e40b19958e5b895be7c1f93adec7071ac" +dependencies = [ + "memchr", +] + +[[package]] +name = "aho-corasick" +version = "1.0.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "b4f55bd91a0978cbfd91c457a164bab8b4001c833b7f323132c0a4e1922dd44e" +checksum = "67fc08ce920c31afb70f013dcce1bfc3a3195de6a228474e45e1f145b36f8d04" dependencies = [ "memchr", ] @@ -41,36 +50,73 @@ version = "0.12.1" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "d52a9bb7ec0cf484c551830a7ce27bd20d67eac647e1befb56b0be4ee39a55d2" dependencies = [ - "winapi 0.3.9", + "winapi", ] [[package]] name = "anyhow" -version = "1.0.65" +version = "1.0.71" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "98161a4e3e2184da77bb14f02184cdd111e83bbbcc9979dfee3c44b9a85f5602" +checksum = "9c7d0618f0e0b7e8ff11427422b64564d5fb0be1940354bfe2e0529b18a9d9b8" [[package]] name = "app_dirs2" -version = "2.5.4" +version = "2.5.5" source = "registry+https://github.com/rust-lang/crates.io-index" 
-checksum = "47a8d2d8dbda5fca0a522259fb88e4f55d2b10ad39f5f03adeebf85031eba501" +checksum = "a7e7b35733e3a8c1ccb90385088dd5b6eaa61325cb4d1ad56e683b5224ff352e" dependencies = [ "jni", "ndk-context", - "winapi 0.3.9", + "winapi", "xdg", ] +[[package]] +name = "async-priority-channel" +version = "0.1.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "c21678992e1b21bebfe2bc53ab5f5f68c106eddab31b24e0bb06e9b715a86640" +dependencies = [ + "event-listener", +] + +[[package]] +name = "async-recursion" +version = "1.0.4" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "0e97ce7de6cf12de5d7226c73f5ba9811622f4db3a5b91b55c53e987e5f91cba" +dependencies = [ + "proc-macro2", + "quote", + "syn 2.0.16", +] + +[[package]] +name = "async-trait" +version = "0.1.68" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "b9ccdd8f2a161be9bd5c023df56f1b2a0bd1d83872ae53b71a84a12c9bf6e842" +dependencies = [ + "proc-macro2", + "quote", + "syn 2.0.16", +] + +[[package]] +name = "atomic-take" +version = "1.1.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "a8ab6b55fe97976e46f91ddbed8d147d966475dc29b2032757ba47e02376fbc3" + [[package]] name = "atty" version = "0.2.14" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "d9b39be18770d11421cdb1b9947a45dd3f37e93092cbf377614828a319d5fee8" dependencies = [ - "hermit-abi", + "hermit-abi 0.1.19", "libc", - "winapi 0.3.9", + "winapi", ] [[package]] @@ -81,33 +127,30 @@ checksum = "d468802bab17cbc0cc575e9b053f41e72aa36bfa6b7f55e3529ffa43161b97fa" [[package]] name = "backtrace" -version = "0.3.66" +version = "0.3.67" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "cab84319d616cfb654d03394f38ab7e6f0919e181b1b57e1fd15e7fb4077d9a7" +checksum = "233d376d6d185f2a3093e58f283f60f880315b6c60075b01f36b3b85154564ca" dependencies = [ "addr2line", "cc", 
- "cfg-if 1.0.0", + "cfg-if", "libc", - "miniz_oxide", + "miniz_oxide 0.6.2", "object", "rustc-demangle", ] [[package]] name = "base64" -version = "0.10.1" +version = "0.13.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "0b25d992356d2eb0ed82172f5248873db5560c4721f564b13cb5193bda5e668e" -dependencies = [ - "byteorder", -] +checksum = "9e1b586273c5702936fe7b7d6896644d8be71e6314cfe09d3167c95f712589e8" [[package]] name = "base64" -version = "0.13.0" +version = "0.21.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "904dfeac50f3cdaba28fc6f57fdcddb75f49ed61346676a78c4ffe55877802fd" +checksum = "a4a4ddaa51a5bc52a6948f74c06d20aaaddb71924eab79b8c97a8c556e942d6a" [[package]] name = "bitflags" @@ -116,71 +159,54 @@ source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "bef38d45163c2f1dde094a7dfd33ccf595c92905c8f8f4fdc18d06fb1037718a" [[package]] -name = "block-buffer" -version = "0.7.3" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "c0940dc441f31689269e10ac70eb1002a3a1d3ad1390e030043662eb7fe4688b" -dependencies = [ - "block-padding", - "byte-tools", - "byteorder", - "generic-array 0.12.4", -] - -[[package]] -name = "block-buffer" -version = "0.9.0" +name = "bitflags" +version = "2.2.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "4152116fd6e9dadb291ae18fc1ec3575ed6d84c29642d97890f4b4a3417297e4" -dependencies = [ - "generic-array 0.14.6", -] +checksum = "24a6904aef64d73cf10ab17ebace7befb918b82164785cb89907993be7f83813" [[package]] name = "block-buffer" -version = "0.10.3" +version = "0.10.4" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "69cce20737498f97b993470a6e536b8523f0af7892a4f928cceb1ac5e52ebe7e" +checksum = "3078c7629b62d3f0439517fa394996acacc5cbc91c5a20d8c658e77abd503a71" dependencies = [ - "generic-array 0.14.6", + "generic-array", ] 
[[package]] -name = "block-padding" -version = "0.1.5" +name = "bstr" +version = "1.4.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "fa79dedbb091f449f1f39e53edf88d5dbe95f895dae6135a8d7b881fb5af73f5" +checksum = "c3d4260bcc2e8fc9df1eac4919a720effeb63a3f0952f5bf4944adfa18897f09" dependencies = [ - "byte-tools", + "memchr", + "once_cell", + "regex-automata", + "serde", ] [[package]] -name = "bstr" -version = "0.2.17" +name = "btoi" +version = "0.4.3" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "ba3569f383e8f1598449f1a423e72e99569137b47740b1da11ef19af3d5c3223" +checksum = "9dd6407f73a9b8b6162d8a2ef999fe6afd7cc15902ebf42c5cd296addf17e0ad" dependencies = [ - "memchr", + "num-traits", ] [[package]] name = "bumpalo" -version = "3.11.0" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "c1ad822118d20d2c234f427000d5acc36eabe1e29a348c89b63dd60b13f28e5d" - -[[package]] -name = "byte-tools" -version = "0.3.1" +version = "3.12.2" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "e3b5ca7a04898ad4bcd41c90c5285445ff5b791899bb1b0abdd2a2aa791211d7" +checksum = "3c6ed94e98ecff0c12dd1b04c15ec0d7d9458ca8fe806cea6f12954efe74c63b" [[package]] name = "byte-unit" -version = "4.0.14" +version = "4.0.19" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "95ebf10dda65f19ff0f42ea15572a359ed60d7fc74fdc984d90310937be0014b" +checksum = "da78b32057b8fdfc352504708feeba7216dcd65a2c9ab02978cbd288d1279b6c" dependencies = [ + "serde", "utf8-width", ] @@ -192,26 +218,15 @@ checksum = "14c189c53d098945499cdfa7ecc63567cf3886b3332b312a5b4585d8d3a6a610" [[package]] name = "bytes" -version = "0.4.12" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "206fdffcfa2df7cbe15601ef46c813fce0965eb3286db6b56c583b814b51c81c" -dependencies = [ - "byteorder", - "either", - "iovec", -] - -[[package]] 
-name = "bytes" -version = "1.2.1" +version = "1.4.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "ec8a7b6a70fde80372154c65702f00a0f56f3e1c36abbc6c440484be248856db" +checksum = "89b2fd2a0dcf38d7971e2194b6b6eebab45ae01067456a7fd93d5547a61b70be" [[package]] name = "cc" -version = "1.0.73" +version = "1.0.79" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "2fff2a6927b3bb87f9595d67196a70493f627687a71d87a0d692242c33f58c11" +checksum = "50d30906286121d95be3d479533b458f87493b30a4b5f79a607db8f5d11aa91f" [[package]] name = "cesu8" @@ -219,12 +234,6 @@ version = "1.1.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "6d43a04d8753f35258c91f8ec639f792891f748a1edbd759cf1dcea3382ad83c" -[[package]] -name = "cfg-if" -version = "0.1.10" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "4785bdd1c96b2a846b2bd7cc02e86b6b3dbf14e7e53446c4f54c92a361040822" - [[package]] name = "cfg-if" version = "1.0.0" @@ -233,36 +242,36 @@ checksum = "baf1de4339761588bc0619e3cbc0120ee582ebb74b53b4efbf79117bd2da40fd" [[package]] name = "chrono" -version = "0.4.22" +version = "0.4.24" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "bfd4d1b31faaa3a89d7934dbded3111da0d2ef28e3ebccdb4f0179f5929d1ef1" +checksum = "4e3c5919066adf22df73762e50cffcde3a758f2a848b113b586d1f86728b673b" dependencies = [ "iana-time-zone", "num-integer", "num-traits", - "winapi 0.3.9", + "winapi", ] [[package]] name = "chrono-tz" -version = "0.6.3" +version = "0.6.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "29c39203181991a7dd4343b8005bd804e7a9a37afb8ac070e43771e8c820bbde" +checksum = "58549f1842da3080ce63002102d5bc954c7bc843d4f47818e642abdc36253552" dependencies = [ "chrono", "chrono-tz-build", - "phf 0.11.1", + "phf 0.10.1", ] [[package]] name = "chrono-tz-build" -version = "0.0.3" +version = "0.0.2" 
source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "6f509c3a87b33437b05e2458750a0700e5bdd6956176773e6c7d6dd15a283a0c" +checksum = "db058d493fb2f65f41861bfed7e3fe6335264a9f0f92710cab5bdf01fef09069" dependencies = [ "parse-zoneinfo", - "phf 0.11.1", - "phf_codegen 0.11.1", + "phf 0.10.1", + "phf_codegen 0.10.0", ] [[package]] @@ -273,8 +282,8 @@ checksum = "a0610544180c38b88101fecf2dd634b174a62eef6946f84dfc6a7127512b381c" dependencies = [ "ansi_term", "atty", - "bitflags", - "strsim 0.8.0", + "bitflags 1.3.2", + "strsim", "textwrap", "unicode-width", "vec_map", @@ -282,24 +291,15 @@ dependencies = [ [[package]] name = "clearscreen" -version = "1.0.10" +version = "2.0.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "c969a6b6dadff9f3349b1f783f553e2411104763ca4789e1c6ca6a41f46a57b0" +checksum = "72f3f22f1a586604e62efd23f78218f3ccdecf7a33c4500db2d37d85a24fe994" dependencies = [ - "nix 0.24.2", + "nix", "terminfo", "thiserror", "which", - "winapi 0.3.9", -] - -[[package]] -name = "cloudabi" -version = "0.0.3" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "ddfc5b9aa5d4507acaf872de71051dfd0e309860e88966e1051e462a077aac4f" -dependencies = [ - "bitflags", + "winapi", ] [[package]] @@ -308,18 +308,20 @@ version = "4.6.6" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "35ed6e9d84f0b51a7f52daf1c7d71dd136fd7a3f41a8462b8cdb8c78d920fad4" dependencies = [ - "bytes 1.2.1", + "bytes", "memchr", ] [[package]] name = "command-group" -version = "1.0.8" +version = "2.1.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "f7a8a86f409b4a59df3a3e4bee2de0b83f1755fdd2a25e3a9684c396fc4bed2c" +checksum = "5080df6b0f0ecb76cab30808f00d937ba725cebe266a3da8cd89dff92f2a9916" dependencies = [ - "nix 0.22.3", - "winapi 0.3.9", + "async-trait", + "nix", + "tokio", + "winapi", ] [[package]] @@ -334,15 +336,15 @@ 
dependencies = [ [[package]] name = "core-foundation-sys" -version = "0.8.3" +version = "0.8.4" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "5827cebf4670468b8772dd191856768aedcb1b0278a04f989f7766351917b9dc" +checksum = "e496a50fda8aacccc86d7529e2c1e0892dbd0f898a6b5645b5561b89c3210efa" [[package]] name = "cpufeatures" -version = "0.2.5" +version = "0.2.7" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "28d997bd5e24a5928dd43e46dc529867e207907fe0b239c3477d924f7f2ca320" +checksum = "3e4c1eaa2012c47becbbad2ab175484c2a84d1185b566fb2cc5b8707343dfe58" dependencies = [ "libc", ] @@ -353,64 +355,26 @@ version = "1.3.2" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "b540bd8bc810d3885c6ea91e2018302f68baba2129ab3e88f32389ee9370880d" dependencies = [ - "cfg-if 1.0.0", -] - -[[package]] -name = "crossbeam-deque" -version = "0.7.4" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "c20ff29ded3204c5106278a81a38f4b482636ed4fa1e6cfbeef193291beb29ed" -dependencies = [ - "crossbeam-epoch", - "crossbeam-utils 0.7.2", - "maybe-uninit", -] - -[[package]] -name = "crossbeam-epoch" -version = "0.8.2" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "058ed274caafc1f60c4997b5fc07bf7dc7cca454af7c6e81edffe5f33f70dace" -dependencies = [ - "autocfg", - "cfg-if 0.1.10", - "crossbeam-utils 0.7.2", - "lazy_static", - "maybe-uninit", - "memoffset 0.5.6", - "scopeguard", -] - -[[package]] -name = "crossbeam-queue" -version = "0.2.3" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "774ba60a54c213d409d5353bda12d49cd68d14e45036a285234c8d6f91f92570" -dependencies = [ - "cfg-if 0.1.10", - "crossbeam-utils 0.7.2", - "maybe-uninit", + "cfg-if", ] [[package]] -name = "crossbeam-utils" -version = "0.7.2" +name = "crossbeam-channel" +version = "0.5.8" source = 
"registry+https://github.com/rust-lang/crates.io-index" -checksum = "c3c7c73a2d1e9fc0886a08b93e98eb643461230d5f1925e4036204d5f2e261a8" +checksum = "a33c2bf77f2df06183c3aa30d1e96c0695a313d4f9c453cc3762a6db39f99200" dependencies = [ - "autocfg", - "cfg-if 0.1.10", - "lazy_static", + "cfg-if", + "crossbeam-utils", ] [[package]] name = "crossbeam-utils" -version = "0.8.12" +version = "0.8.15" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "edbafec5fa1f196ca66527c1b12c2ec4745ca14b50f1ad8f9f6f720b55d11fac" +checksum = "3c063cd8cc95f5c377ed0d4b49a4b21f632396ff690e8470c29b3359b346984b" dependencies = [ - "cfg-if 1.0.0", + "cfg-if", ] [[package]] @@ -419,7 +383,7 @@ version = "0.1.6" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "1bfb12502f3fc46cca1bb51ac28df9d618d813cdc3d2f25b9fe775a34af26bb3" dependencies = [ - "generic-array 0.14.6", + "generic-array", "typenum", ] @@ -435,14 +399,14 @@ dependencies = [ "openssl-sys", "schannel", "socket2", - "winapi 0.3.9", + "winapi", ] [[package]] name = "curl-sys" -version = "0.4.56+curl-7.83.1" +version = "0.4.61+curl-8.0.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "6093e169dd4de29e468fa649fbae11cdcd5551c81fe5bf1b0677adad7ef3d26f" +checksum = "14d05c10f541ae6f3bc5b3d923c20001f47db7d5f0b2bc6ad16490133842db79" dependencies = [ "cc", "libc", @@ -450,152 +414,91 @@ dependencies = [ "openssl-sys", "pkg-config", "vcpkg", - "winapi 0.3.9", -] - -[[package]] -name = "darling" -version = "0.12.4" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "5f2c43f534ea4b0b049015d00269734195e6d3f0f6635cb692251aca6f9f8b3c" -dependencies = [ - "darling_core", - "darling_macro", -] - -[[package]] -name = "darling_core" -version = "0.12.4" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "8e91455b86830a1c21799d94524df0845183fa55bafd9aa137b01c7d1065fa36" 
-dependencies = [ - "fnv", - "ident_case", - "proc-macro2", - "quote", - "strsim 0.10.0", - "syn", -] - -[[package]] -name = "darling_macro" -version = "0.12.4" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "29b5acf0dea37a7f66f7b25d2c5e93fd46f8f6968b1a5d7a3e02e97768afc95a" -dependencies = [ - "darling_core", - "quote", - "syn", + "winapi", ] [[package]] -name = "derive_builder" -version = "0.10.2" +name = "deunicode" +version = "0.4.3" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "d13202debe11181040ae9063d739fa32cfcaaebe2275fe387703460ae2365b30" -dependencies = [ - "derive_builder_macro", -] +checksum = "850878694b7933ca4c9569d30a34b55031b9b139ee1fc7b94a527c4ef960d690" [[package]] -name = "derive_builder_core" -version = "0.10.2" +name = "digest" +version = "0.10.6" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "66e616858f6187ed828df7c64a6d71720d83767a7f19740b2d1b6fe6327b36e5" +checksum = "8168378f4e5023e7218c89c891c0fd8ecdb5e5e4f18cb78f38cf245dd021e76f" dependencies = [ - "darling", - "proc-macro2", - "quote", - "syn", + "block-buffer", + "crypto-common", ] [[package]] -name = "derive_builder_macro" -version = "0.10.2" +name = "dirs" +version = "4.0.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "58a94ace95092c5acb1e97a7e846b310cfbd499652f72297da7493f618a98d73" +checksum = "ca3aa72a6f96ea37bbc5aa912f6788242832f75369bdfdadcb0e38423f100059" dependencies = [ - "derive_builder_core", - "syn", + "dirs-sys", ] [[package]] -name = "deunicode" -version = "0.4.3" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "850878694b7933ca4c9569d30a34b55031b9b139ee1fc7b94a527c4ef960d690" - -[[package]] -name = "digest" -version = "0.8.1" +name = "dirs-sys" +version = "0.3.7" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = 
"f3d0c8c8752312f9713efd397ff63acb9f85585afbf179282e720e7704954dd5" +checksum = "1b1d1d91c932ef41c0f2663aa8b0ca0342d444d842c06914aa0a7e352d0bada6" dependencies = [ - "generic-array 0.12.4", + "libc", + "redox_users", + "winapi", ] [[package]] -name = "digest" -version = "0.9.0" +name = "dunce" +version = "1.0.4" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "d3dd60d1080a57a05ab032377049e0591415d2b31afd7028356dbf3cc6dcb066" -dependencies = [ - "generic-array 0.14.6", -] +checksum = "56ce8c6da7551ec6c462cbaf3bfbc75131ebbfa1c944aeaa9dab51ca1c5f0c3b" [[package]] -name = "digest" -version = "0.10.5" +name = "either" +version = "1.8.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "adfbc57365a37acbd2ebf2b64d7e69bb766e2fea813521ed536f5d0520dcf86c" -dependencies = [ - "block-buffer 0.10.3", - "crypto-common", -] +checksum = "7fcaabb2fef8c910e7f4c7ce9f67a1283a1715879a7c230ca9d6d1ae31f16d91" [[package]] -name = "dirs" -version = "2.0.2" +name = "encoding_rs" +version = "0.8.32" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "13aea89a5c93364a98e9b37b2fa237effbb694d5cfe01c5b70941f7eb087d5e3" +checksum = "071a31f4ee85403370b58aca746f01041ede6f0da2730960ad001edc2b71b394" dependencies = [ - "cfg-if 0.1.10", - "dirs-sys", + "cfg-if", ] [[package]] -name = "dirs" -version = "4.0.0" +name = "endian-type" +version = "0.1.2" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "ca3aa72a6f96ea37bbc5aa912f6788242832f75369bdfdadcb0e38423f100059" -dependencies = [ - "dirs-sys", -] +checksum = "c34f04666d835ff5d62e058c3995147c06f42fe86ff053337632bca83e42702d" [[package]] -name = "dirs-sys" -version = "0.3.7" +name = "errno" +version = "0.3.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "1b1d1d91c932ef41c0f2663aa8b0ca0342d444d842c06914aa0a7e352d0bada6" +checksum = 
"4bcfec3a70f97c962c307b2d2c56e358cf1d00b558d74262b5f929ee8cc7e73a" dependencies = [ + "errno-dragonfly", "libc", - "redox_users", - "winapi 0.3.9", + "windows-sys 0.48.0", ] [[package]] -name = "either" -version = "1.8.0" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "90e5c1c8368803113bf0c9584fc495a58b86dc8a29edbf8fe877d21d9507e797" - -[[package]] -name = "encoding_rs" -version = "0.8.31" +name = "errno-dragonfly" +version = "0.1.2" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "9852635589dc9f9ea1b6fe9f05b50ef208c85c834a562f0c6abb1c475736ec2b" +checksum = "aa68f1b12764fab894d2755d2518754e71b4fd80ecfb822714a1206c2aab39bf" dependencies = [ - "cfg-if 1.0.0", + "cc", + "libc", ] [[package]] @@ -609,41 +512,41 @@ dependencies = [ ] [[package]] -name = "fake-simd" -version = "0.1.2" +name = "event-listener" +version = "2.5.3" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "e88a8acf291dafb59c2d96e8f59828f3838bb1a70398823ade51a84de6a6deed" +checksum = "0206175f82b8d6bf6652ff7d71a1e27fd2e4efde587fd368662814d6ec1d9ce0" [[package]] name = "fastrand" -version = "1.8.0" +version = "1.9.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "a7a407cfaa3385c4ae6b23e84623d48c2798d06e3e6a1878f7f59f17b3f86499" +checksum = "e51093e27b0797c359783294ca4f0a911c270184cb10f85783b118614a1501be" dependencies = [ "instant", ] [[package]] name = "filetime" -version = "0.2.17" +version = "0.2.21" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "e94a7bbaa59354bc20dd75b67f23e2797b4490e9d6928203fb105c79e448c86c" +checksum = "5cbc844cecaee9d4443931972e1289c8ff485cb4cc2767cb03ca139ed6885153" dependencies = [ - "cfg-if 1.0.0", + "cfg-if", "libc", "redox_syscall 0.2.16", - "windows-sys", + "windows-sys 0.48.0", ] [[package]] name = "flate2" -version = "1.0.24" +version = "1.0.26" source = 
"registry+https://github.com/rust-lang/crates.io-index" -checksum = "f82b0f4c27ad9f8bfd1f3208d882da2b09c301bc1c828fd3a00d0216d2fbbff6" +checksum = "3b9429470923de8e8cbd4d2dc513535400b4b3fef0319fb5c4e1f520a7bef743" dependencies = [ "crc32fast", "libz-sys", - "miniz_oxide", + "miniz_oxide 0.7.1", ] [[package]] @@ -683,101 +586,100 @@ source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "9564fc758e15025b46aa6643b1b77d047d1a56a1aea6e01002ac0c7026876213" dependencies = [ "libc", - "winapi 0.3.9", -] - -[[package]] -name = "fsevent" -version = "0.4.0" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "5ab7d1bd1bd33cc98b0889831b72da23c0aa4df9cec7e0702f46ecea04b35db6" -dependencies = [ - "bitflags", - "fsevent-sys", + "winapi", ] [[package]] name = "fsevent-sys" -version = "2.0.1" +version = "4.1.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "f41b048a94555da0f42f1d632e2e19510084fb8e303b0daa2816e733fb3644a0" +checksum = "76ee7a02da4d231650c7cea31349b889be2f45ddb3ef3032d2ec8185f6313fd2" dependencies = [ "libc", ] [[package]] -name = "fuchsia-zircon" -version = "0.3.3" +name = "futures" +version = "0.3.28" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "2e9763c69ebaae630ba35f74888db465e49e259ba1bc0eda7d06f4a067615d82" +checksum = "23342abe12aba583913b2e62f22225ff9c950774065e4bfb61a19cd9770fec40" dependencies = [ - "bitflags", - "fuchsia-zircon-sys", + "futures-channel", + "futures-core", + "futures-executor", + "futures-io", + "futures-sink", + "futures-task", + "futures-util", ] -[[package]] -name = "fuchsia-zircon-sys" -version = "0.3.3" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "3dcaa9ae7725d12cdb85b3ad99a434db70b468c09ded17e012d86b5c1010f7a7" - -[[package]] -name = "futures" -version = "0.1.31" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum 
= "3a471a38ef8ed83cd6e40aa59c1ffe17db6855c18e3604d9c4ed8c08ebc28678" - [[package]] name = "futures-channel" -version = "0.3.24" +version = "0.3.28" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "30bdd20c28fadd505d0fd6712cdfcb0d4b5648baf45faef7f852afb2399bb050" +checksum = "955518d47e09b25bbebc7a18df10b81f0c766eaf4c4f1cccef2fca5f2a4fb5f2" dependencies = [ "futures-core", + "futures-sink", ] [[package]] name = "futures-core" -version = "0.3.24" +version = "0.3.28" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "4e5aa3de05362c3fb88de6531e6296e85cde7739cccad4b9dfeeb7f6ebce56bf" +checksum = "4bca583b7e26f571124fe5b7561d49cb2868d79116cfa0eefce955557c6fee8c" [[package]] -name = "futures-cpupool" -version = "0.1.8" +name = "futures-executor" +version = "0.3.28" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "ab90cde24b3319636588d0c35fe03b1333857621051837ed769faefb4c2162e4" +checksum = "ccecee823288125bd88b4d7f565c9e58e41858e47ab72e8ea2d64e93624386e0" dependencies = [ - "futures", - "num_cpus", + "futures-core", + "futures-task", + "futures-util", ] [[package]] name = "futures-io" -version = "0.3.24" +version = "0.3.28" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "bbf4d2a7a308fd4578637c0b17c7e1c7ba127b8f6ba00b29f717e9655d85eb68" +checksum = "4fff74096e71ed47f8e023204cfd0aa1289cd54ae5430a9523be060cdb849964" + +[[package]] +name = "futures-macro" +version = "0.3.28" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "89ca545a94061b6365f2c7355b4b32bd20df3ff95f02da9329b34ccc3bd6ee72" +dependencies = [ + "proc-macro2", + "quote", + "syn 2.0.16", +] [[package]] name = "futures-sink" -version = "0.3.24" +version = "0.3.28" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "21b20ba5a92e727ba30e72834706623d94ac93a725410b6a6b6fbc1b07f7ba56" +checksum = 
"f43be4fe21a13b9781a69afa4985b0f6ee0e1afab2c6f454a8cf30e2b2237b6e" [[package]] name = "futures-task" -version = "0.3.24" +version = "0.3.28" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "a6508c467c73851293f390476d4491cf4d227dbabcd4170f3bb6044959b294f1" +checksum = "76d3d132be6c0e6aa1534069c705a74a5997a356c0dc2f86a47765e5617c5b65" [[package]] name = "futures-util" -version = "0.3.24" +version = "0.3.28" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "44fb6cb1be61cc1d2e43b262516aafcf63b241cffdb1d3fa115f91d9c7b09c90" +checksum = "26b01e40b772d54cf6c6d721c1d1abd0647a0106a12ecaa1c186273392a69533" dependencies = [ + "futures-channel", "futures-core", "futures-io", + "futures-macro", + "futures-sink", "futures-task", "memchr", "pin-project-lite", @@ -787,18 +689,9 @@ dependencies = [ [[package]] name = "generic-array" -version = "0.12.4" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "ffdf9f34f1447443d37393cc6c2b8313aebddcd96906caf34e54c68d8e57d7bd" -dependencies = [ - "typenum", -] - -[[package]] -name = "generic-array" -version = "0.14.6" +version = "0.14.7" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "bff49e947297f3312447abdca79f45f4738097cc82b06e72054d2223f601f1b9" +checksum = "85649ca51fd72272d7821adaf274ad91c288277713d9c18820d8499a7ff69e9a" dependencies = [ "typenum", "version_check", @@ -806,45 +699,238 @@ dependencies = [ [[package]] name = "getrandom" -version = "0.1.16" +version = "0.2.9" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "8fc3cb4d91f53b50155bdcfd23f6a4c39ae1969c2ae85982b135750cccaf5fce" +checksum = "c85e1d9ab2eadba7e5040d4e09cbd6d072b76a557ad64e797c2cb9d4da21d7e4" dependencies = [ - "cfg-if 1.0.0", + "cfg-if", "libc", - "wasi 0.9.0+wasi-snapshot-preview1", + "wasi", ] [[package]] -name = "getrandom" -version = "0.2.7" +name = "gimli" +version = "0.27.2" 
+source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "ad0a93d233ebf96623465aad4046a8d3aa4da22d4f4beba5388838c8a434bbb4" + +[[package]] +name = "gix-actor" +version = "0.20.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "848efa0f1210cea8638f95691c82a46f98a74b9e3524f01d4955ebc25a8f84f3" +dependencies = [ + "bstr", + "btoi", + "gix-date", + "itoa", + "nom", + "thiserror", +] + +[[package]] +name = "gix-config" +version = "0.22.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "1d252a0eddb6df74600d3d8872dc9fe98835a7da43110411d705b682f49d4ac1" +dependencies = [ + "bstr", + "gix-config-value", + "gix-features", + "gix-glob", + "gix-path", + "gix-ref", + "gix-sec", + "log", + "memchr", + "nom", + "once_cell", + "smallvec", + "thiserror", + "unicode-bom", +] + +[[package]] +name = "gix-config-value" +version = "0.12.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "4eb1a864a501629691edf6c15a593b7a51eebaa1e8468e9ddc623de7c9b58ec6" +checksum = "786861e84a5793ad5f863d846de5eb064cd23b87e61ad708c8c402608202e7be" dependencies = [ - "cfg-if 1.0.0", + "bitflags 2.2.1", + "bstr", + "gix-path", "libc", - "wasi 0.11.0+wasi-snapshot-preview1", + "thiserror", ] [[package]] -name = "gimli" -version = "0.26.2" +name = "gix-date" +version = "0.5.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "22030e2c5a68ec659fde1e949a745124b48e6fa8b045b7ed5bd1fe4ccc5c4e5d" +checksum = "99056f37270715f5c7584fd8b46899a2296af9cae92463bf58b8bd1f5a78e553" +dependencies = [ + "bstr", + "itoa", + "thiserror", + "time", +] [[package]] -name = "glob" -version = "0.3.0" +name = "gix-features" +version = "0.29.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "cf69b0f5c701cc3ae22d3204b671907668f6437ca88862d355eaf9bc47a4f897" +dependencies = [ + "gix-hash", + "libc", + "sha1_smol", + 
"walkdir", +] + +[[package]] +name = "gix-fs" +version = "0.1.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "9b37a1832f691fdc09910bd267f9a2e413737c1f9ec68c6e31f9e802616278a9" +dependencies = [ + "gix-features", +] + +[[package]] +name = "gix-glob" +version = "0.7.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "c07c98204529ac3f24b34754540a852593d2a4c7349008df389240266627a72a" +dependencies = [ + "bitflags 2.2.1", + "bstr", + "gix-features", + "gix-path", +] + +[[package]] +name = "gix-hash" +version = "0.11.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "078eec3ac2808cc03f0bddd2704cb661da5c5dc33b41a9d7947b141d499c7c42" +dependencies = [ + "hex", + "thiserror", +] + +[[package]] +name = "gix-lock" +version = "5.0.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "2c693d7f05730fa74a7c467150adc7cea393518410c65f0672f80226b8111555" +dependencies = [ + "gix-tempfile", + "gix-utils", + "thiserror", +] + +[[package]] +name = "gix-object" +version = "0.29.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "2d96bd620fd08accdd37f70b2183cfa0b001b4f1c6ade8b7f6e15cb3d9e261ce" +dependencies = [ + "bstr", + "btoi", + "gix-actor", + "gix-features", + "gix-hash", + "gix-validate", + "hex", + "itoa", + "nom", + "smallvec", + "thiserror", +] + +[[package]] +name = "gix-path" +version = "0.8.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "4fc78f47095a0c15aea0e66103838f0748f4494bf7a9555dfe0f00425400396c" +dependencies = [ + "bstr", + "home", + "once_cell", + "thiserror", +] + +[[package]] +name = "gix-ref" +version = "0.29.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "1e03989e9d49954368e1b526578230fc7189d1634acdfbe79e9ba1de717e15d5" +dependencies = [ + "gix-actor", + "gix-features", + "gix-fs", + 
"gix-hash", + "gix-lock", + "gix-object", + "gix-path", + "gix-tempfile", + "gix-validate", + "memmap2", + "nom", + "thiserror", +] + +[[package]] +name = "gix-sec" +version = "0.8.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "794520043d5a024dfeac335c6e520cb616f6963e30dab995892382e998c12897" +dependencies = [ + "bitflags 2.2.1", + "gix-path", + "libc", + "windows", +] + +[[package]] +name = "gix-tempfile" +version = "5.0.3" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "9b919933a397b79c37e33b77bb2aa3dc8eb6e165ad809e58ff75bc7db2e34574" +checksum = "d71a0d32f34e71e86586124225caefd78dabc605d0486de580d717653addf182" +dependencies = [ + "gix-fs", + "libc", + "once_cell", + "parking_lot", + "tempfile", +] + +[[package]] +name = "gix-utils" +version = "0.1.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "c10b69beac219acb8df673187a1f07dde2d74092f974fb3f9eb385aeb667c909" +dependencies = [ + "fastrand", +] + +[[package]] +name = "gix-validate" +version = "0.7.4" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "7bd629d3680773e1785e585d76fd4295b740b559cad9141517300d99a0c8c049" +dependencies = [ + "bstr", + "thiserror", +] [[package]] name = "globset" -version = "0.4.6" +version = "0.4.10" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "c152169ef1e421390738366d2f796655fec62621dabbd0fd476f905934061e4a" +checksum = "029d74589adefde59de1a0c4f4732695c32805624aec7b68d91503d4dba79afc" dependencies = [ - "aho-corasick", + "aho-corasick 0.7.20", "bstr", "fnv", "log", @@ -857,44 +943,26 @@ version = "0.8.1" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "93e3af942408868f6934a7b85134a3230832b9977cf66125df2f9edcfce4ddcc" dependencies = [ - "bitflags", + "bitflags 1.3.2", "ignore", "walkdir", ] [[package]] name = "h2" -version = "0.1.26" +version = "0.3.19" 
source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "a5b34c246847f938a410a03c5458c7fee2274436675e76d8b903c08efc29c462" +checksum = "d357c7ae988e7d2182f7d7871d0b963962420b0678b0997ce7de72001aeab782" dependencies = [ - "byteorder", - "bytes 0.4.12", - "fnv", - "futures", - "http 0.1.21", - "indexmap", - "log", - "slab", - "string", - "tokio-io", -] - -[[package]] -name = "h2" -version = "0.3.14" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "5ca32592cf21ac7ccab1825cd87f6c9b3d9022c44d086172ed0966bec8af30be" -dependencies = [ - "bytes 1.2.1", + "bytes", "fnv", "futures-core", "futures-sink", "futures-util", - "http 0.2.8", + "http", "indexmap", "slab", - "tokio 1.21.2", + "tokio", "tokio-util", "tracing", ] @@ -907,28 +975,27 @@ checksum = "8a9ee70c43aaf417c914396645a0fa852624801b24ebb7ae78fe8272889ac888" [[package]] name = "headers" -version = "0.2.3" +version = "0.3.8" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "882ca7d8722f33ce2c2db44f95425d6267ed59ca96ce02acbe58320054ceb642" +checksum = "f3e372db8e5c0d213e0cd0b9be18be2aca3d44cf2fe30a9d46a65581cd454584" dependencies = [ - "base64 0.10.1", - "bitflags", - "bytes 0.4.12", + "base64 0.13.1", + "bitflags 1.3.2", + "bytes", "headers-core", - "http 0.1.21", + "http", + "httpdate", "mime", - "sha-1", - "time", + "sha1", ] [[package]] name = "headers-core" -version = "0.1.1" +version = "0.2.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "967131279aaa9f7c20c7205b45a391638a83ab118e6509b2d0ccbe08de044237" +checksum = "e7f66481bfee273957b1f20485a4ff3362987f85b2c236580d81b4eb7a326429" dependencies = [ - "bytes 0.4.12", - "http 0.1.21", + "http", ] [[package]] @@ -950,46 +1017,53 @@ dependencies = [ ] [[package]] -name = "html-escape" -version = "0.2.11" +name = "hermit-abi" +version = "0.2.6" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = 
"b8e7479fa1ef38eb49fb6a42c426be515df2d063f06cb8efd3e50af073dbc26c" +checksum = "ee512640fe35acbfb4bb779db6f0d80704c2cacfa2e39b601ef3e3f47d1ae4c7" dependencies = [ - "utf8-width", + "libc", ] [[package]] -name = "http" -version = "0.1.21" +name = "hermit-abi" +version = "0.3.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "fed44880c466736ef9a5c5b5facefb5ed0785676d0c02d612db14e54f0d84286" + +[[package]] +name = "hex" +version = "0.4.3" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "d6ccf5ede3a895d8856620237b2f02972c1bbc78d2965ad7fe8838d4a0ed41f0" +checksum = "7f24254aa9a54b5c858eaee2f5bccdb46aaf0e486a595ed5fd8f86ba55232a70" + +[[package]] +name = "home" +version = "0.5.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "5444c27eef6923071f7ebcc33e3444508466a76f7a2b93da00ed6e19f30c1ddb" dependencies = [ - "bytes 0.4.12", - "fnv", - "itoa 0.4.8", + "windows-sys 0.48.0", ] [[package]] -name = "http" -version = "0.2.8" +name = "html-escape" +version = "0.2.13" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "75f43d41e26995c17e71ee126451dd3941010b0514a81a9d11f3b341debc2399" +checksum = "6d1ad449764d627e22bfd7cd5e8868264fc9236e07c752972b4080cd351cb476" dependencies = [ - "bytes 1.2.1", - "fnv", - "itoa 1.0.3", + "utf8-width", ] [[package]] -name = "http-body" -version = "0.1.0" +name = "http" +version = "0.2.9" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "6741c859c1b2463a423a1dbce98d418e6c3c3fc720fb0d45528657320920292d" +checksum = "bd6effc99afb63425aff9b05836f029929e345a6148a14b7ecd5ab67af944482" dependencies = [ - "bytes 0.4.12", - "futures", - "http 0.1.21", - "tokio-buf", + "bytes", + "fnv", + "itoa", ] [[package]] @@ -998,8 +1072,8 @@ version = "0.4.5" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = 
"d5f38f16d184e36f2408a55281cd658ecbd3ca05cce6d6510a176eca393e26d1" dependencies = [ - "bytes 1.2.1", - "http 0.2.8", + "bytes", + "http", "pin-project-lite", ] @@ -1017,62 +1091,35 @@ checksum = "c4a1e36c821dbe04574f602848a19f742f4fb3c98d40449f11bcad18d6b17421" [[package]] name = "humansize" -version = "1.1.1" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "02296996cb8796d7c6e3bc2d9211b7802812d36999a51bb754123ead7d37d026" - -[[package]] -name = "hyper" -version = "0.12.36" +version = "2.1.3" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "5c843caf6296fc1f93444735205af9ed4e109a539005abb2564ae1d6fad34c52" +checksum = "6cb51c9a029ddc91b07a787f1d86b53ccfa49b0e86688c946ebe8d3555685dd7" dependencies = [ - "bytes 0.4.12", - "futures", - "futures-cpupool", - "h2 0.1.26", - "http 0.1.21", - "http-body 0.1.0", - "httparse", - "iovec", - "itoa 0.4.8", - "log", - "net2", - "rustc_version", - "time", - "tokio 0.1.22", - "tokio-buf", - "tokio-executor", - "tokio-io", - "tokio-reactor", - "tokio-tcp", - "tokio-threadpool", - "tokio-timer", - "want 0.2.0", + "libm", ] [[package]] name = "hyper" -version = "0.14.20" +version = "0.14.26" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "02c929dc5c39e335a03c405292728118860721b10190d98c2a0f0efd5baafbac" +checksum = "ab302d72a6f11a3b910431ff93aae7e773078c769f0a3ef15fb9ec692ed147d4" dependencies = [ - "bytes 1.2.1", + "bytes", "futures-channel", "futures-core", "futures-util", - "h2 0.3.14", - "http 0.2.8", - "http-body 0.4.5", + "h2", + "http", + "http-body", "httparse", "httpdate", - "itoa 1.0.3", + "itoa", "pin-project-lite", "socket2", - "tokio 1.21.2", + "tokio", "tower-service", "tracing", - "want 0.3.0", + "want", ] [[package]] @@ -1081,31 +1128,35 @@ version = "0.5.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "d6183ddfa99b85da61a140bea0efc93fdf56ceaa041b37d553518030827f9905" 
dependencies = [ - "bytes 1.2.1", - "hyper 0.14.20", + "bytes", + "hyper", "native-tls", - "tokio 1.21.2", + "tokio", "tokio-native-tls", ] [[package]] name = "iana-time-zone" -version = "0.1.50" +version = "0.1.56" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "fd911b35d940d2bd0bea0f9100068e5b97b51a1cbe13d13382f132e0365257a0" +checksum = "0722cd7114b7de04316e7ea5456a0bbb20e4adb46fd27a3697adb812cff0f37c" dependencies = [ "android_system_properties", "core-foundation-sys", + "iana-time-zone-haiku", "js-sys", "wasm-bindgen", - "winapi 0.3.9", + "windows", ] [[package]] -name = "ident_case" -version = "1.0.1" +name = "iana-time-zone-haiku" +version = "0.1.2" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "b9e0384b61958566e926dc50660321d12159025e767c18e043daf26b70104c39" +checksum = "f31827a206f56af32e590ba56d5d2d085f558508192593743f16b2306495269f" +dependencies = [ + "cc", +] [[package]] name = "idna" @@ -1119,11 +1170,10 @@ dependencies = [ [[package]] name = "ignore" -version = "0.4.17" +version = "0.4.20" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "b287fb45c60bb826a0dc68ff08742b9d88a2fea13d6e0c286b3172065aaf878c" +checksum = "dbe7873dab538a9a44ad79ede1faf5f30d49f9a5c883ddbab48bce81b64b7492" dependencies = [ - "crossbeam-utils 0.8.12", "globset", "lazy_static", "log", @@ -1135,11 +1185,29 @@ dependencies = [ "winapi-util", ] +[[package]] +name = "ignore-files" +version = "1.3.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "ad7ef2262641fef97d2e3230dc14b50d80418a2b7a22497457f23c533e8a75de" +dependencies = [ + "dunce", + "futures", + "gix-config", + "ignore", + "miette", + "project-origins", + "radix_trie", + "thiserror", + "tokio", + "tracing", +] + [[package]] name = "indexmap" -version = "1.9.1" +version = "1.9.3" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = 
"10a35a97730320ffe8e2d410b5d3b69279b98d2c14bdb8b70ea89ecf7888d41e" +checksum = "bd070e393353796e801d209ad339e89596eb4c8d430d18ede6a1cced8fafbd99" dependencies = [ "autocfg", "hashbrown", @@ -1147,11 +1215,11 @@ dependencies = [ [[package]] name = "inotify" -version = "0.7.1" +version = "0.9.6" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "4816c66d2c8ae673df83366c18341538f234a26d65a9ecea5c348b453ac1d02f" +checksum = "f8069d3ec154eb856955c1c0fbffefbf5f3c40a104ec912d4797314c1801abff" dependencies = [ - "bitflags", + "bitflags 1.3.2", "inotify-sys", "libc", ] @@ -1171,48 +1239,65 @@ version = "0.1.12" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "7a5bbe824c507c5da5956355e86a746d82e0e1464f65d862cc5e71da70e94b2c" dependencies = [ - "cfg-if 1.0.0", + "cfg-if", ] [[package]] -name = "iovec" -version = "0.1.4" +name = "io-lifetimes" +version = "1.0.10" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "b2b3ea6ff95e175473f8ffe6a7eb7c00d054240321b84c57051175fe3c1e075e" +checksum = "9c66c74d2ae7e79a5a8f7ac924adbe38ee42a859c6539ad869eb51f0b52dc220" dependencies = [ + "hermit-abi 0.3.1", "libc", + "windows-sys 0.48.0", ] [[package]] name = "ipnet" -version = "2.5.0" +version = "2.7.2" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "879d54834c8c76457ef4293a689b2a8c59b076067ad77b15efafbb05f92a592b" +checksum = "12b6ee2129af8d4fb011108c73d99a1b83a85977f23b82460c0ae2e25bb4b57f" [[package]] -name = "itoa" -version = "0.4.8" +name = "is-docker" +version = "0.2.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "b71991ff56294aa922b450139ee08b3bfc70982c6b2c7562771375cf73542dd4" +checksum = "928bae27f42bc99b60d9ac7334e3a21d10ad8f1835a4e12ec3ec0464765ed1b3" +dependencies = [ + "once_cell", +] + +[[package]] +name = "is-wsl" +version = "0.4.0" +source = 
"registry+https://github.com/rust-lang/crates.io-index" +checksum = "173609498df190136aa7dea1a91db051746d339e18476eed5ca40521f02d7aa5" +dependencies = [ + "is-docker", + "once_cell", +] [[package]] name = "itoa" -version = "1.0.3" +version = "1.0.6" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "6c8af84674fe1f223a982c933a0ee1086ac4d4052aa0fb8060c12c6ad838e754" +checksum = "453ad9f582a441959e5f0d088b02ce04cfe8d51a8eaf077f12ac6d3e94164ca6" [[package]] name = "jni" -version = "0.19.0" +version = "0.21.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "c6df18c2e3db7e453d3c6ac5b3e9d5182664d28788126d39b91f2d1e22b017ec" +checksum = "1a87aa2bb7d2af34197c04845522473242e1aa17c12f4935d5856491a7fb8c97" dependencies = [ "cesu8", + "cfg-if", "combine", "jni-sys", "log", "thiserror", "walkdir", + "windows-sys 0.45.0", ] [[package]] @@ -1223,21 +1308,31 @@ checksum = "8eaf4bc02d17cbdd7ff4c7438cafcdf7fb9a4613313ad11b4f8fefe7d3fa0130" [[package]] name = "js-sys" -version = "0.3.60" +version = "0.3.63" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "49409df3e3bf0856b916e2ceaca09ee28e6871cf7d9ce97a692cacfdb2a25a47" +checksum = "2f37a4a5928311ac501dee68b3c7613a1037d0edb30c8e5427bd832d55d1b790" dependencies = [ "wasm-bindgen", ] [[package]] -name = "kernel32-sys" -version = "0.2.2" +name = "kqueue" +version = "1.0.7" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "2c8fc60ba15bf51257aa9807a48a61013db043fcf3a78cb0d916e8e396dcad98" +dependencies = [ + "kqueue-sys", + "libc", +] + +[[package]] +name = "kqueue-sys" +version = "1.0.3" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "7507624b29483431c0ba2d82aece8ca6cdba9382bff4ddd0f7490560c056098d" +checksum = "8367585489f01bc55dd27404dcf56b95e6da061a256a666ab23be9ba96a2e587" dependencies = [ - "winapi 0.2.8", - "winapi-build", + "bitflags 1.3.2", 
+ "libc", ] [[package]] @@ -1247,22 +1342,22 @@ source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "e2abad23fbc42b3700f2f279844dc832adb2b2eb069b2df918f455c4e18cc646" [[package]] -name = "lazycell" -version = "1.3.0" +name = "libc" +version = "0.2.144" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "830d08ce1d1d941e6b30645f1a0eb5643013d835ce3779a5fc208261dbe10f55" +checksum = "2b00cc1c228a6782d0f076e7b232802e0c5689d41bb5df366f2a6b6621cfdfe1" [[package]] -name = "libc" -version = "0.2.134" +name = "libm" +version = "0.2.7" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "329c933548736bc49fd575ee68c89e8be4d260064184389a5b77517cddd99ffb" +checksum = "f7012b1bbb0719e1097c47611d3898568c546d597c2e74d66f6087edd5233ff4" [[package]] name = "libz-sys" -version = "1.1.8" +version = "1.1.9" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "9702761c3935f8cc2f101793272e202c72b99da8f4224a19ddcf1279a6450bbf" +checksum = "56ee889ecc9568871456d42f603d6a0ce59ff328d291063a45cbdf0036baf6db" dependencies = [ "cc", "libc", @@ -1270,12 +1365,19 @@ dependencies = [ "vcpkg", ] +[[package]] +name = "linux-raw-sys" +version = "0.3.7" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "ece97ea872ece730aed82664c424eb4c8291e1ff2480247ccf7409044bc6479f" + [[package]] name = "lock_api" -version = "0.3.4" +version = "0.4.9" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "c4da24a77a3d8a6d4862d95f72e6fdb9c09a643ecdb402d754004a557f2bec75" +checksum = "435011366fe56583b16cf956f9df0095b405b82d76425bc8981c0e22e60ec4df" dependencies = [ + "autocfg", "scopeguard", ] @@ -1285,24 +1387,16 @@ version = "0.4.17" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "abb12e687cfb44aa40f41fc3978ef76448f9b6038cad6aef4259d3c095a2382e" dependencies = [ - "cfg-if 1.0.0", + 
"cfg-if", ] -[[package]] -name = "maybe-uninit" -version = "2.0.0" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "60302e4db3a61da70c0cb7991976248362f30319e88850c487b9b95bbf059e00" - [[package]] name = "md-5" -version = "0.9.1" +version = "0.10.5" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "7b5a279bb9607f9f53c22d496eade00d138d1bdcccd07d74650387cf94942a15" +checksum = "6365506850d44bff6e2fbcb5176cf63650e48bd45ef2fe2665ae1570e0f4b9ca" dependencies = [ - "block-buffer 0.9.0", - "digest 0.9.0", - "opaque-debug 0.3.0", + "digest", ] [[package]] @@ -1312,28 +1406,51 @@ source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "2dffe52ecf27772e601905b7522cb4ef790d2cc203488bbd0e2fe85fcb74566d" [[package]] -name = "memoffset" -version = "0.5.6" +name = "memmap2" +version = "0.5.10" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "043175f069eda7b85febe4a74abbaeff828d9f8b448515d3151a14a3542811aa" +checksum = "83faa42c0a078c393f6b29d5db232d8be22776a891f8f56e5284faee4a20b327" dependencies = [ - "autocfg", + "libc", ] [[package]] name = "memoffset" -version = "0.6.5" +version = "0.7.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "5aa361d4faea93603064a027415f07bd8e1d5c88c9fbf68bf56a285428fd79ce" +checksum = "5de893c32cde5f383baa4c04c5d6dbdd735cfd4a794b0debdb2bb1b421da5ff4" dependencies = [ "autocfg", ] +[[package]] +name = "miette" +version = "5.8.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "92a992891d5579caa9efd8e601f82e30a1caa79a27a5db075dde30ecb9eab357" +dependencies = [ + "miette-derive", + "once_cell", + "thiserror", + "unicode-width", +] + +[[package]] +name = "miette-derive" +version = "5.8.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "4c65c625186a9bcce6699394bee511e1b1aec689aa7e3be1bf4e996e75834153" 
+dependencies = [ + "proc-macro2", + "quote", + "syn 2.0.16", +] + [[package]] name = "mime" -version = "0.3.16" +version = "0.3.17" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "2a60c7ce501c71e03a9c9c0d35b861413ae925bd979cc7a4e30d060069aaac8d" +checksum = "6877bb514081ee2a7ff5ef9de3281f14a4dd4bceac4c09388074a6b5df8a139a" [[package]] name = "minimal-lexical" @@ -1343,84 +1460,39 @@ checksum = "68354c5c6bd36d73ff3feceb05efa59b6acb7626617f4962be322a825e61f79a" [[package]] name = "miniz_oxide" -version = "0.5.4" +version = "0.6.2" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "96590ba8f175222643a85693f33d26e9c8a015f599c216509b1a6894af675d34" +checksum = "b275950c28b37e794e8c55d88aeb5e139d0ce23fdbbeda68f8d7174abdf9e8fa" dependencies = [ "adler", ] [[package]] -name = "mio" -version = "0.6.23" +name = "miniz_oxide" +version = "0.7.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "4afd66f5b91bf2a3bc13fad0e21caedac168ca4c707504e75585648ae80e4cc4" +checksum = "e7810e0be55b428ada41041c41f32c9f1a42817901b4ccf45fa3d4b6561e74c7" dependencies = [ - "cfg-if 0.1.10", - "fuchsia-zircon", - "fuchsia-zircon-sys", - "iovec", - "kernel32-sys", - "libc", - "log", - "miow", - "net2", - "slab", - "winapi 0.2.8", + "adler", ] [[package]] name = "mio" -version = "0.8.4" +version = "0.8.6" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "57ee1c23c7c63b0c9250c339ffdc69255f110b298b901b9f6c82547b7b87caaf" +checksum = "5b9d9a46eff5b4ff64b45a9e316a6d1e0bc719ef429cbec4dc630684212bfdf9" dependencies = [ "libc", "log", - "wasi 0.11.0+wasi-snapshot-preview1", - "windows-sys", -] - -[[package]] -name = "mio-extras" -version = "2.0.6" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "52403fe290012ce777c4626790c8951324a2b9e3316b3143779c72b029742f19" -dependencies = [ - "lazycell", - "log", - "mio 0.6.23", - 
"slab", -] - -[[package]] -name = "mio-uds" -version = "0.6.8" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "afcb699eb26d4332647cc848492bbc15eafb26f08d0304550d5aa1f612e066f0" -dependencies = [ - "iovec", - "libc", - "mio 0.6.23", -] - -[[package]] -name = "miow" -version = "0.2.2" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "ebd808424166322d4a38da87083bfddd3ac4c131334ed55856112eb06d46944d" -dependencies = [ - "kernel32-sys", - "net2", - "winapi 0.2.8", - "ws2_32-sys", + "wasi", + "windows-sys 0.45.0", ] [[package]] name = "native-tls" -version = "0.2.10" +version = "0.2.11" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "fd7e2f3618557f980e0b17e8856252eee3c97fa12c54dff0ca290fb6266ca4a9" +checksum = "07226173c32f2926027b63cce4bcd8076c3552846cbe7925f3aaffeac0a3b92e" dependencies = [ "lazy_static", "libc", @@ -1436,81 +1508,65 @@ dependencies = [ [[package]] name = "ndk-context" -version = "0.1.1" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "27b02d87554356db9e9a873add8782d4ea6e3e58ea071a9adb9a2e8ddb884a8b" - -[[package]] -name = "net2" -version = "0.2.37" +version = "0.1.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "391630d12b68002ae1e25e8f974306474966550ad82dac6886fb8910c19568ae" -dependencies = [ - "cfg-if 0.1.10", - "libc", - "winapi 0.3.9", -] +checksum = "27b02d87554356db9e9a873add8782d4ea6e3e58ea071a9adb9a2e8ddb884a8b" [[package]] -name = "nix" -version = "0.22.3" +name = "nibble_vec" +version = "0.1.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "e4916f159ed8e5de0082076562152a76b7a1f64a01fd9d1e0fea002c37624faf" +checksum = "77a5d83df9f36fe23f0c3648c6bbb8b0298bb5f1939c8f2704431371f4b84d43" dependencies = [ - "bitflags", - "cc", - "cfg-if 1.0.0", - "libc", - "memoffset 0.6.5", + "smallvec", ] [[package]] name = "nix" 
-version = "0.24.2" +version = "0.26.2" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "195cdbc1741b8134346d515b3a56a1c94b0912758009cfd53f99ea0f57b065fc" +checksum = "bfdda3d196821d6af13126e40375cdf7da646a96114af134d5f417a9a1dc8e1a" dependencies = [ - "bitflags", - "cfg-if 1.0.0", + "bitflags 1.3.2", + "cfg-if", "libc", + "memoffset", + "pin-utils", + "static_assertions", ] [[package]] name = "nom" -version = "5.1.2" +version = "7.1.3" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "ffb4262d26ed83a1c0a33a38fe2bb15797329c85770da05e6b828ddb782627af" +checksum = "d273983c5a657a70a3e8f2a01329822f3b8c8172b73826411a55751e404a0a4a" dependencies = [ "memchr", - "version_check", + "minimal-lexical", ] [[package]] -name = "nom" -version = "7.1.1" +name = "normalize-path" +version = "0.2.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "a8903e5a29a317527874d0402f867152a3d21c908bb0b933e416c65e301d4c36" -dependencies = [ - "memchr", - "minimal-lexical", -] +checksum = "cf22e319b2e3cb517350572e3b70c6822e0a520abfb5c78f690e829a73e8d9f2" [[package]] name = "notify" -version = "4.0.17" +version = "5.1.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "ae03c8c853dba7bfd23e571ff0cff7bc9dceb40a4cd684cd1681824183f45257" +checksum = "58ea850aa68a06e48fdb069c0ec44d0d64c8dbffa49bf3b6f7f0a901fdea1ba9" dependencies = [ - "bitflags", + "bitflags 1.3.2", + "crossbeam-channel", "filetime", - "fsevent", "fsevent-sys", "inotify", + "kqueue", "libc", - "mio 0.6.23", - "mio-extras", + "mio", "walkdir", - "winapi 0.3.9", + "windows-sys 0.42.0", ] [[package]] @@ -1534,59 +1590,56 @@ dependencies = [ [[package]] name = "num_cpus" -version = "1.13.1" +version = "1.15.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "19e64526ebdee182341572e50e9ad03965aa510cd94427a4549448f285e957a1" +checksum = 
"0fac9e2da13b5eb447a6ce3d392f23a29d8694bff781bf03a16cd9ac8697593b" dependencies = [ - "hermit-abi", + "hermit-abi 0.2.6", "libc", ] [[package]] -name = "object" -version = "0.29.0" +name = "num_threads" +version = "0.1.6" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "21158b2c33aa6d4561f1c0a6ea283ca92bc54802a93b263e910746d679a7eb53" +checksum = "2819ce041d2ee131036f4fc9d6ae7ae125a3a40e97ba64d04fe799ad9dabbb44" dependencies = [ - "memchr", + "libc", ] [[package]] -name = "once_cell" -version = "1.15.0" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "e82dad04139b71a90c080c8463fe0dc7902db5192d939bd0950f074d014339e1" - -[[package]] -name = "opaque-debug" -version = "0.2.3" +name = "object" +version = "0.30.3" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "2839e79665f131bdb5782e51f2c6c9599c133c6098982a54c794358bf432529c" +checksum = "ea86265d3d3dcb6a27fc51bd29a4bf387fae9d2986b823079d4986af253eb439" +dependencies = [ + "memchr", +] [[package]] -name = "opaque-debug" -version = "0.3.0" +name = "once_cell" +version = "1.17.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "624a8340c38c1b80fd549087862da4ba43e08858af025b236e509b6649fc13d5" +checksum = "b7e5500299e16ebb147ae15a00a942af264cf3688f47923b8fc2cd5858f23ad3" [[package]] name = "open" -version = "1.7.1" +version = "4.1.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "dcea7a30d6b81a2423cc59c43554880feff7b57d12916f231a79f8d6d9470201" +checksum = "d16814a067484415fda653868c9be0ac5f2abd2ef5d951082a5f2fe1b3662944" dependencies = [ + "is-wsl", "pathdiff", - "winapi 0.3.9", ] [[package]] name = "openssl" -version = "0.10.42" +version = "0.10.52" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "12fc0523e3bd51a692c8850d075d74dc062ccf251c0110668cbd921917118a13" +checksum = 
"01b8574602df80f7b85fdfc5392fa884a4e3b3f4f35402c070ab34c3d3f78d56" dependencies = [ - "bitflags", - "cfg-if 1.0.0", + "bitflags 1.3.2", + "cfg-if", "foreign-types", "libc", "once_cell", @@ -1596,13 +1649,13 @@ dependencies = [ [[package]] name = "openssl-macros" -version = "0.1.0" +version = "0.1.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "b501e44f11665960c7e7fcf062c7d96a14ade4aa98116c004b2e37b5be7d736c" +checksum = "a948666b637a0f465e8564c73e89d4dde00d72d4d473cc972f390fc3dcee7d9c" dependencies = [ "proc-macro2", "quote", - "syn", + "syn 2.0.16", ] [[package]] @@ -1613,20 +1666,19 @@ checksum = "ff011a302c396a5197692431fc1948019154afc178baf7d8e37367442a4601cf" [[package]] name = "openssl-src" -version = "111.22.0+1.1.1q" +version = "111.25.3+1.1.1t" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "8f31f0d509d1c1ae9cada2f9539ff8f37933831fd5098879e482aa687d659853" +checksum = "924757a6a226bf60da5f7dd0311a34d2b52283dd82ddeb103208ddc66362f80c" dependencies = [ "cc", ] [[package]] name = "openssl-sys" -version = "0.9.76" +version = "0.9.87" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "5230151e44c0f05157effb743e8d517472843121cf9243e8b81393edb5acd9ce" +checksum = "8e17f59264b2809d77ae94f0e1ebabc434773f370d6ca667bd223ea10e06cc7e" dependencies = [ - "autocfg", "cc", "libc", "openssl-src", @@ -1636,28 +1688,25 @@ dependencies = [ [[package]] name = "parking_lot" -version = "0.9.0" +version = "0.12.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "f842b1982eb6c2fe34036a4fbfb06dd185a3f5c8edfaacdf7d1ea10b07de6252" +checksum = "3742b2c103b9f06bc9fff0a37ff4912935851bee6d36f3c02bcc755bcfec228f" dependencies = [ "lock_api", "parking_lot_core", - "rustc_version", ] [[package]] name = "parking_lot_core" -version = "0.6.2" +version = "0.9.7" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = 
"b876b1b9e7ac6e1a74a6da34d25c42e17e8862aa409cbbbdcfc8d86c6f3bc62b" +checksum = "9069cbb9f99e3a5083476ccb29ceb1de18b9118cafa53e90c9551235de2b9521" dependencies = [ - "cfg-if 0.1.10", - "cloudabi", + "cfg-if", "libc", - "redox_syscall 0.1.57", - "rustc_version", + "redox_syscall 0.2.16", "smallvec", - "winapi 0.3.9", + "windows-sys 0.45.0", ] [[package]] @@ -1683,9 +1732,9 @@ checksum = "478c572c3d73181ff3c2539045f6eb99e5491218eae919370993b890cdbdd98e" [[package]] name = "pest" -version = "2.4.0" +version = "2.6.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "dbc7bc69c062e492337d74d59b120c274fd3d261b6bf6d3207d499b4b379c41a" +checksum = "e68e84bfb01f0507134eac1e9b410a12ba379d064eab48c50ba4ce329a527b70" dependencies = [ "thiserror", "ucd-trie", @@ -1693,9 +1742,9 @@ dependencies = [ [[package]] name = "pest_derive" -version = "2.4.0" +version = "2.6.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "60b75706b9642ebcb34dab3bc7750f811609a0eb1dd8b88c2d15bf628c1c65b2" +checksum = "6b79d4c71c865a25a4322296122e3924d30bc8ee0834c8bfc8b95f7f054afbfb" dependencies = [ "pest", "pest_generator", @@ -1703,35 +1752,35 @@ dependencies = [ [[package]] name = "pest_generator" -version = "2.4.0" +version = "2.6.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "f4f9272122f5979a6511a749af9db9bfc810393f63119970d7085fed1c4ea0db" +checksum = "6c435bf1076437b851ebc8edc3a18442796b30f1728ffea6262d59bbe28b077e" dependencies = [ "pest", "pest_meta", "proc-macro2", "quote", - "syn", + "syn 2.0.16", ] [[package]] name = "pest_meta" -version = "2.4.0" +version = "2.6.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "4c8717927f9b79515e565a64fe46c38b8cd0427e64c40680b14a7365ab09ac8d" +checksum = "745a452f8eb71e39ffd8ee32b3c5f51d03845f99786fa9b68db6ff509c505411" dependencies = [ "once_cell", "pest", - "sha1", + "sha2", ] [[package]] name = "phf" 
-version = "0.8.0" +version = "0.10.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "3dfb61232e34fcb633f43d12c58f83c1df82962dcdfa565a4e866ffc17dafe12" +checksum = "fabbf1ead8a5bcbc20f5f8b939ee3f5b0f6f281b6ad3468b84656b658b455259" dependencies = [ - "phf_shared 0.8.0", + "phf_shared 0.10.0", ] [[package]] @@ -1745,12 +1794,12 @@ dependencies = [ [[package]] name = "phf_codegen" -version = "0.8.0" +version = "0.10.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "cbffee61585b0411840d3ece935cce9cb6321f01c45477d30066498cd5e1a815" +checksum = "4fb1c3a8bc4dd4e5cfce29b44ffc14bedd2ee294559a294e2a4d4c9e9a6a13cd" dependencies = [ - "phf_generator 0.8.0", - "phf_shared 0.8.0", + "phf_generator 0.10.0", + "phf_shared 0.10.0", ] [[package]] @@ -1765,12 +1814,12 @@ dependencies = [ [[package]] name = "phf_generator" -version = "0.8.0" +version = "0.10.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "17367f0cc86f2d25802b2c26ee58a7b23faeccf78a396094c13dced0d0182526" +checksum = "5d5285893bb5eb82e6aaf5d59ee909a06a16737a8970984dd7746ba9283498d6" dependencies = [ - "phf_shared 0.8.0", - "rand 0.7.3", + "phf_shared 0.10.0", + "rand", ] [[package]] @@ -1780,16 +1829,17 @@ source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "b1181c94580fa345f50f19d738aaa39c0ed30a600d95cb2d3e23f94266f14fbf" dependencies = [ "phf_shared 0.11.1", - "rand 0.8.5", + "rand", ] [[package]] name = "phf_shared" -version = "0.8.0" +version = "0.10.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "c00cf8b9eafe68dde5e9eaa2cef8ee84a9336a47d566ec55ca16589633b65af7" +checksum = "b6796ad771acdc0123d2a88dc428b5e38ef24456743ddb1744ed628f9815c096" dependencies = [ "siphasher", + "uncased", ] [[package]] @@ -1799,7 +1849,6 @@ source = "registry+https://github.com/rust-lang/crates.io-index" checksum = 
"e1fb5f6f826b772a8d4c0394209441e7d37cbbb967ae9c7e0e8134365c9ee676" dependencies = [ "siphasher", - "uncased", ] [[package]] @@ -1822,15 +1871,15 @@ checksum = "6ba3013ff85036c414a4a3cf826135db204de2bd80d684728550e7130421809a" [[package]] name = "pkg-config" -version = "0.3.25" +version = "0.3.27" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "1df8c4ec4b0627e53bdf214615ad287367e482558cf84b109250b37464dc03ae" +checksum = "26072860ba924cbfa98ea39c8c19b4dd6a4a25423dbdf219c1eca91aa0cf6964" [[package]] name = "ppv-lite86" -version = "0.2.16" +version = "0.2.17" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "eb9f9e6e233e5c4a35559a617bf40a4ec447db2e84c20b55a6f83167b7e57872" +checksum = "5b40af805b3121feab8a3c29f04d8ad262fa8e0561883e7653e024ae4479e6de" [[package]] name = "proc-macro-error" @@ -1841,7 +1890,7 @@ dependencies = [ "proc-macro-error-attr", "proc-macro2", "quote", - "syn", + "syn 1.0.109", "version_check", ] @@ -1858,43 +1907,50 @@ dependencies = [ [[package]] name = "proc-macro2" -version = "1.0.46" +version = "1.0.57" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "94e2ef8dbfc347b10c094890f778ee2e36ca9bb4262e86dc99cd217e35f3470b" +checksum = "c4ec6d5fe0b140acb27c9a0444118cf55bfbb4e0b259739429abb4521dd67c16" dependencies = [ "unicode-ident", ] +[[package]] +name = "project-origins" +version = "1.2.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "629e0d57f265ca8238345cb616eea8847b8ecb86b5d97d155be2c8963a314379" +dependencies = [ + "futures", + "tokio", + "tokio-stream", +] + [[package]] name = "quick-xml" -version = "0.22.0" +version = "0.28.2" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "8533f14c8382aaad0d592c812ac3b826162128b65662331e1127b45c3d18536b" +checksum = "0ce5e73202a820a31f8a0ee32ada5e21029c81fd9e3ebf668a40832e4219d9d1" dependencies = [ "memchr", ] 
[[package]] name = "quote" -version = "1.0.21" +version = "1.0.27" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "bbe448f377a7d6961e30f5955f9b8d106c3f5e449d493ee1b125c1d43c2b5179" +checksum = "8f4f29d145265ec1c483c7c654450edde0bfe043d3938d6972630663356d9500" dependencies = [ "proc-macro2", ] [[package]] -name = "rand" -version = "0.7.3" +name = "radix_trie" +version = "0.2.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "6a6b1679d49b24bbfe0c803429aa1874472f50d9b363131f0e89fc356b544d03" +checksum = "c069c179fcdc6a2fe24d8d18305cf085fdbd4f922c041943e203685d6a1c58fd" dependencies = [ - "getrandom 0.1.16", - "libc", - "rand_chacha 0.2.2", - "rand_core 0.5.1", - "rand_hc", - "rand_pcg", + "endian-type", + "nibble_vec", ] [[package]] @@ -1904,18 +1960,8 @@ source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "34af8d1a0e25924bc5b7c43c079c942339d8f0a8b57c39049bef581b46327404" dependencies = [ "libc", - "rand_chacha 0.3.1", - "rand_core 0.6.4", -] - -[[package]] -name = "rand_chacha" -version = "0.2.2" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "f4c8ed856279c9737206bf725bf36935d8666ead7aa69b52be55af369d193402" -dependencies = [ - "ppv-lite86", - "rand_core 0.5.1", + "rand_chacha", + "rand_core", ] [[package]] @@ -1925,16 +1971,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "e6c10a63a0fa32252be49d21e7709d4d4baf8d231c2dbce1eaa8141b9b127d88" dependencies = [ "ppv-lite86", - "rand_core 0.6.4", -] - -[[package]] -name = "rand_core" -version = "0.5.1" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "90bde5296fc891b0cef12a6d03ddccc162ce7b2aff54160af9338f8d40df6d19" -dependencies = [ - "getrandom 0.1.16", + "rand_core", ] [[package]] @@ -1943,40 +1980,25 @@ version = "0.6.4" source = "registry+https://github.com/rust-lang/crates.io-index" 
checksum = "ec0be4795e2f6a28069bec0b5ff3e2ac9bafc99e6a9a7dc3547996c5c816922c" dependencies = [ - "getrandom 0.2.7", -] - -[[package]] -name = "rand_hc" -version = "0.2.0" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "ca3129af7b92a17112d59ad498c6f81eaf463253766b90396d39ea7a39d6613c" -dependencies = [ - "rand_core 0.5.1", + "getrandom", ] [[package]] -name = "rand_pcg" -version = "0.2.1" +name = "redox_syscall" +version = "0.2.16" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "16abd0c1b639e9eb4d7c50c0b8100b0d0f849be2349829c740fe8e6eb4816429" +checksum = "fb5a58c1855b4b6819d59012155603f0b22ad30cad752600aadfcb695265519a" dependencies = [ - "rand_core 0.5.1", + "bitflags 1.3.2", ] [[package]] name = "redox_syscall" -version = "0.1.57" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "41cc0f7e4d5d4544e8861606a285bb08d3e70712ccc7d2b84d7c0ccfaf4b05ce" - -[[package]] -name = "redox_syscall" -version = "0.2.16" +version = "0.3.5" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "fb5a58c1855b4b6819d59012155603f0b22ad30cad752600aadfcb695265519a" +checksum = "567664f262709473930a4bf9e51bf2ebf3348f2e748ccc50dea20646858f8f29" dependencies = [ - "bitflags", + "bitflags 1.3.2", ] [[package]] @@ -1985,52 +2007,49 @@ version = "0.4.3" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "b033d837a7cf162d7993aded9304e30a83213c648b6e389db233191f891e5c2b" dependencies = [ - "getrandom 0.2.7", + "getrandom", "redox_syscall 0.2.16", "thiserror", ] [[package]] name = "regex" -version = "1.6.0" +version = "1.8.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "4c4eb3267174b8c6c2f654116623910a0fef09c4753f8dd83db29c48a0df988b" +checksum = "af83e617f331cc6ae2da5443c602dfa5af81e517212d9d611a5b3ba1777b5370" dependencies = [ - "aho-corasick", + "aho-corasick 1.0.1", "memchr", 
"regex-syntax", ] [[package]] -name = "regex-syntax" -version = "0.6.27" +name = "regex-automata" +version = "0.1.10" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "a3f87b73ce11b1619a3c6332f45341e0047173771e8b8b73f87bfeefb7b56244" +checksum = "6c230d73fb8d8c1b9c0b3135c5142a8acee3a0558fb8db5cf1cb65f8d7862132" [[package]] -name = "remove_dir_all" -version = "0.5.3" +name = "regex-syntax" +version = "0.7.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "3acd125665422973a33ac9d3dd2df85edad0f4ae9b00dafb1a05e43a9f5ef8e7" -dependencies = [ - "winapi 0.3.9", -] +checksum = "a5996294f19bd3aae0453a862ad728f60e6600695733dd5df01da90c54363a3c" [[package]] name = "reqwest" -version = "0.11.12" +version = "0.11.17" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "431949c384f4e2ae07605ccaa56d1d9d2ecdb5cadd4f9577ccfab29f2e5149fc" +checksum = "13293b639a097af28fc8a90f22add145a9c954e49d77da06263d58cf44d5fb91" dependencies = [ - "base64 0.13.0", - "bytes 1.2.1", + "base64 0.21.0", + "bytes", "encoding_rs", "futures-core", "futures-util", - "h2 0.3.14", - "http 0.2.8", - "http-body 0.4.5", - "hyper 0.14.20", + "h2", + "http", + "http-body", + "hyper", "hyper-tls", "ipnet", "js-sys", @@ -2043,7 +2062,7 @@ dependencies = [ "serde", "serde_json", "serde_urlencoded", - "tokio 1.21.2", + "tokio", "tokio-native-tls", "tower-service", "url", @@ -2055,24 +2074,29 @@ dependencies = [ [[package]] name = "rustc-demangle" -version = "0.1.21" +version = "0.1.23" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "7ef03e0a2b150c7a90d01faf6254c9c48a41e95fb2a8c2ac1c6f0d2b9aefc342" +checksum = "d626bb9dae77e28219937af045c257c28bfd3f69333c512553507f5f9798cb76" [[package]] -name = "rustc_version" -version = "0.2.3" +name = "rustix" +version = "0.37.19" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = 
"138e3e0acb6c9fb258b19b67cb8abd63c00679d2851805ea151465464fe9030a" +checksum = "acf8729d8542766f1b2cf77eb034d52f40d375bb8b615d0b147089946e16613d" dependencies = [ - "semver", + "bitflags 1.3.2", + "errno", + "io-lifetimes", + "libc", + "linux-raw-sys", + "windows-sys 0.48.0", ] [[package]] name = "ryu" -version = "1.0.11" +version = "1.0.13" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "4501abdff3ae82a1c1b477a17252eb69cee9e66eb915c1abaa4f44d873df9f09" +checksum = "f91339c0467de62360649f8d3e185ca8de4224ff281f66000de5eb2a77a79041" [[package]] name = "same-file" @@ -2085,12 +2109,11 @@ dependencies = [ [[package]] name = "schannel" -version = "0.1.20" +version = "0.1.21" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "88d6731146462ea25d9244b2ed5fd1d716d25c52e4d54aa4fb0f3c4e9854dbe2" +checksum = "713cfb06c7059f3588fb8044c0fad1d09e3c01d225e25b9220dbfdcf16dbb1b3" dependencies = [ - "lazy_static", - "windows-sys", + "windows-sys 0.42.0", ] [[package]] @@ -2101,11 +2124,11 @@ checksum = "d29ab0c6d3fc0ee92fe66e2d99f700eab17a8d57d1c1d3b748380fb20baa78cd" [[package]] name = "security-framework" -version = "2.7.0" +version = "2.9.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "2bc1bb97804af6631813c55739f771071e0f2ed33ee20b68c86ec505d906356c" +checksum = "ca2855b3715770894e67cbfa3df957790aa0c9edc3bf06efa1a84d77fa0839d1" dependencies = [ - "bitflags", + "bitflags 1.3.2", "core-foundation", "core-foundation-sys", "libc", @@ -2114,82 +2137,64 @@ dependencies = [ [[package]] name = "security-framework-sys" -version = "2.6.1" +version = "2.9.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "0160a13a177a45bfb43ce71c01580998474f556ad854dcbca936dd2841a5c556" +checksum = "f51d0c0d83bec45f16480d0ce0058397a69e48fcdc52d1dc8855fb68acbd31a7" dependencies = [ "core-foundation-sys", "libc", ] -[[package]] -name = "semver" -version = 
"0.9.0" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "1d7eb9ef2c18661902cc47e535f9bc51b78acd254da71d375c2f6720d9a40403" -dependencies = [ - "semver-parser", -] - -[[package]] -name = "semver-parser" -version = "0.7.0" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "388a1df253eca08550bef6c72392cfe7c30914bf41df5269b68cbd6ff8f570a3" - [[package]] name = "serde" -version = "1.0.145" +version = "1.0.163" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "728eb6351430bccb993660dfffc5a72f91ccc1295abaa8ce19b27ebe4f75568b" +checksum = "2113ab51b87a539ae008b5c6c02dc020ffa39afd2d83cffcb3f4eb2722cebec2" dependencies = [ "serde_derive", ] [[package]] name = "serde_derive" -version = "1.0.145" +version = "1.0.163" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "81fa1584d3d1bcacd84c277a0dfe21f5b0f6accf4a23d04d4c6d61f1af522b4c" +checksum = "8c805777e3930c8883389c602315a24224bcc738b63905ef87cd1420353ea93e" dependencies = [ "proc-macro2", "quote", - "syn", + "syn 2.0.16", ] [[package]] name = "serde_json" -version = "1.0.85" +version = "1.0.96" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "e55a28e3aaef9d5ce0506d0a14dbba8054ddc7e499ef522dd8b26859ec9d4a44" +checksum = "057d394a50403bcac12672b2b18fb387ab6d289d957dab67dd201875391e52f1" dependencies = [ - "itoa 1.0.3", + "itoa", "ryu", "serde", ] [[package]] -name = "serde_urlencoded" -version = "0.7.1" +name = "serde_spanned" +version = "0.6.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "d3491c14715ca2294c4d6a88f15e84739788c1d030eed8c110436aafdaa2f3fd" +checksum = "0efd8caf556a6cebd3b285caf480045fcc1ac04f6bd786b09a6f11af30c4fcf4" dependencies = [ - "form_urlencoded", - "itoa 1.0.3", - "ryu", "serde", ] [[package]] -name = "sha-1" -version = "0.8.2" +name = "serde_urlencoded" +version = "0.7.1" source = 
"registry+https://github.com/rust-lang/crates.io-index" -checksum = "f7d94d0bede923b3cea61f3f1ff57ff8cdfd77b400fb8f9998949e0cf04163df" +checksum = "d3491c14715ca2294c4d6a88f15e84739788c1d030eed8c110436aafdaa2f3fd" dependencies = [ - "block-buffer 0.7.3", - "digest 0.8.1", - "fake-simd", - "opaque-debug 0.2.3", + "form_urlencoded", + "itoa", + "ryu", + "serde", ] [[package]] @@ -2198,22 +2203,35 @@ version = "0.10.5" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "f04293dc80c3993519f2d7f6f511707ee7094fe0c6d3406feb330cdb3540eba3" dependencies = [ - "cfg-if 1.0.0", + "cfg-if", "cpufeatures", - "digest 0.10.5", + "digest", ] +[[package]] +name = "sha1_smol" +version = "1.0.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "ae1a47186c03a32177042e55dbc5fd5aee900b8e0069a8d70fba96a9375cd012" + [[package]] name = "sha2" -version = "0.9.9" +version = "0.10.6" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "4d58a1e1bf39749807d89cf2d98ac2dfa0ff1cb3faa38fbb64dd88ac8013d800" +checksum = "82e6b795fe2e3b1e845bafcb27aa35405c4d47cdfc92af5fc8d3002f76cebdc0" dependencies = [ - "block-buffer 0.9.0", - "cfg-if 1.0.0", + "cfg-if", "cpufeatures", - "digest 0.9.0", - "opaque-debug 0.3.0", + "digest", +] + +[[package]] +name = "signal-hook-registry" +version = "1.4.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "d8229b473baa5980ac72ef434c4415e70c4b5e71b423043adb4ba059f89c99a1" +dependencies = [ + "libc", ] [[package]] @@ -2224,9 +2242,9 @@ checksum = "7bd3e3206899af3f8b12af284fafc038cc1dc2b41d1b89dd17297221c5d225de" [[package]] name = "slab" -version = "0.4.7" +version = "0.4.8" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "4614a76b2a8be0058caa9dbbaf66d988527d86d003c11a94fbd335d7661edcef" +checksum = "6528351c9bc8ab22353f9d776db39a20288e8d6c37ef8cfe3317cf875eecfc2d" dependencies = [ 
"autocfg", ] @@ -2242,31 +2260,25 @@ dependencies = [ [[package]] name = "smallvec" -version = "0.6.14" +version = "1.10.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "b97fcaeba89edba30f044a10c6a3cc39df9c3f17d7cd829dd1446cab35f890e0" -dependencies = [ - "maybe-uninit", -] +checksum = "a507befe795404456341dfab10cef66ead4c041f62b8b11bbb92bffe5d0953e0" [[package]] name = "socket2" -version = "0.4.7" +version = "0.4.9" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "02e2d2db9033d13a1567121ddd7a095ee144db4e1ca1b1bda3419bc0da294ebd" +checksum = "64a4a911eed85daf18834cfaa86a79b7d266ff93ff5ba14005426219480ed662" dependencies = [ "libc", - "winapi 0.3.9", + "winapi", ] [[package]] -name = "string" -version = "0.2.1" +name = "static_assertions" +version = "1.1.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "d24114bfcceb867ca7f71a0d3fe45d45619ec47a6fbfa98cb14e14250bfa5d6d" -dependencies = [ - "bytes 0.4.12", -] +checksum = "a2eb9349b6444b326872e140eb1cf5e7c522154d69e7a0ffb0fb81c06b37543f" [[package]] name = "strsim" @@ -2274,12 +2286,6 @@ version = "0.8.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "8ea5119cdb4c55b55d432abb513a0429384878c15dde60cc77b1c99de1a95a6a" -[[package]] -name = "strsim" -version = "0.10.0" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "73473c0e59e6d5812c5dfe2a064a6444949f089e20eec9a2e5506596494e4623" - [[package]] name = "structopt" version = "0.3.26" @@ -2301,14 +2307,25 @@ dependencies = [ "proc-macro-error", "proc-macro2", "quote", - "syn", + "syn 1.0.109", +] + +[[package]] +name = "syn" +version = "1.0.109" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "72b64191b275b66ffe2469e8af2c1cfe3bafa67b529ead792a6d0160888b4237" +dependencies = [ + "proc-macro2", + "quote", + "unicode-ident", ] [[package]] name = "syn" 
-version = "1.0.101" +version = "2.0.16" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "e90cde112c4b9690b8cbe810cba9ddd8bc1d7472e2cae317b69e9438c1cba7d2" +checksum = "a6f671d4b5ffdb8eadec19c0ae67fe2639df8684bd7bc4b83d986b8db549cf01" dependencies = [ "proc-macro2", "quote", @@ -2321,14 +2338,14 @@ version = "0.0.0-dev.0" dependencies = [ "atty", "byte-unit", - "cfg-if 1.0.0", + "cfg-if", "error-chain", "filetime", "flate2", "fs2", "futures", "headers", - "hyper 0.12.36", + "hyper", "lazy_static", "libc", "md-5", @@ -2352,10 +2369,12 @@ dependencies = [ "tectonic_xetex_layout", "tempfile", "termcolor", - "tokio 0.1.22", + "tokio", "toml", "url", "watchexec", + "watchexec-filterer-globset", + "watchexec-signals", "zip", ] @@ -2467,6 +2486,8 @@ dependencies = [ "html-escape", "percent-encoding", "pinot", + "serde", + "serde_json", "tectonic_bridge_core", "tectonic_errors", "tectonic_io_base", @@ -2514,7 +2535,7 @@ dependencies = [ name = "tectonic_geturl" version = "0.0.0-dev.0" dependencies = [ - "cfg-if 1.0.0", + "cfg-if", "curl", "reqwest", "tectonic_errors", @@ -2565,7 +2586,7 @@ name = "tectonic_xetex_format" version = "0.0.0-dev.0" dependencies = [ "byteorder", - "nom 7.1.1", + "nom", "structopt", "tectonic_errors", ] @@ -2586,23 +2607,22 @@ dependencies = [ [[package]] name = "tempfile" -version = "3.3.0" +version = "3.5.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "5cdb1ef4eaeeaddc8fbd371e5017057064af0911902ef36b39801f67cc6d79e4" +checksum = "b9fbec84f381d5795b08656e4912bec604d162bff9291d6189a78f4c8ab87998" dependencies = [ - "cfg-if 1.0.0", + "cfg-if", "fastrand", - "libc", - "redox_syscall 0.2.16", - "remove_dir_all", - "winapi 0.3.9", + "redox_syscall 0.3.5", + "rustix", + "windows-sys 0.45.0", ] [[package]] name = "tera" -version = "1.17.1" +version = "1.18.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = 
"3df578c295f9ec044ff1c829daf31bb7581d5b3c2a7a3d87419afe1f2531438c" +checksum = "95a665751302f22a03c56721e23094e4dc22b04a80f381e6737a07bf7a7c70c0" dependencies = [ "chrono", "chrono-tz", @@ -2612,341 +2632,215 @@ dependencies = [ "percent-encoding", "pest", "pest_derive", - "rand 0.8.5", + "rand", "regex", "serde", "serde_json", "slug", + "thread_local", "unic-segment", ] [[package]] name = "termcolor" -version = "1.1.3" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "bab24d30b911b2376f3a13cc2cd443142f0c81dda04c118693e35b3835757755" -dependencies = [ - "winapi-util", -] - -[[package]] -name = "terminfo" -version = "0.7.3" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "76971977e6121664ec1b960d1313aacfa75642adc93b9d4d53b247bd4cb1747e" -dependencies = [ - "dirs 2.0.2", - "fnv", - "nom 5.1.2", - "phf 0.8.0", - "phf_codegen 0.8.0", -] - -[[package]] -name = "textwrap" -version = "0.11.0" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "d326610f408c7a4eb6f51c37c330e496b08506c9457c9d34287ecc38809fb060" -dependencies = [ - "unicode-width", -] - -[[package]] -name = "thiserror" -version = "1.0.37" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "10deb33631e3c9018b9baf9dcbbc4f737320d2b576bac10f6aefa048fa407e3e" -dependencies = [ - "thiserror-impl", -] - -[[package]] -name = "thiserror-impl" -version = "1.0.37" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "982d17546b47146b28f7c22e3d08465f6b8903d0ea13c1660d9d84a6e7adcdbb" -dependencies = [ - "proc-macro2", - "quote", - "syn", -] - -[[package]] -name = "thread_local" -version = "1.1.4" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "5516c27b78311c50bf42c071425c560ac799b11c30b31f87e3081965fe5e0180" -dependencies = [ - "once_cell", -] - -[[package]] -name = "time" -version = "0.1.44" -source = 
"registry+https://github.com/rust-lang/crates.io-index" -checksum = "6db9e6914ab8b1ae1c260a4ae7a49b6c5611b40328a735b21862567685e73255" -dependencies = [ - "libc", - "wasi 0.10.0+wasi-snapshot-preview1", - "winapi 0.3.9", -] - -[[package]] -name = "tinyvec" -version = "1.6.0" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "87cc5ceb3875bb20c2890005a4e226a4651264a5c75edb2421b52861a0a0cb50" -dependencies = [ - "tinyvec_macros", -] - -[[package]] -name = "tinyvec_macros" -version = "0.1.0" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "cda74da7e1a664f795bb1f8a87ec406fb89a02522cf6e50620d016add6dbbf5c" - -[[package]] -name = "tokio" -version = "0.1.22" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "5a09c0b5bb588872ab2f09afa13ee6e9dac11e10a0ec9e8e3ba39a5a5d530af6" -dependencies = [ - "bytes 0.4.12", - "futures", - "mio 0.6.23", - "num_cpus", - "tokio-codec", - "tokio-current-thread", - "tokio-executor", - "tokio-fs", - "tokio-io", - "tokio-reactor", - "tokio-sync", - "tokio-tcp", - "tokio-threadpool", - "tokio-timer", - "tokio-udp", - "tokio-uds", -] - -[[package]] -name = "tokio" -version = "1.21.2" +version = "1.2.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "a9e03c497dc955702ba729190dc4aac6f2a0ce97f913e5b1b5912fc5039d9099" +checksum = "be55cf8942feac5c765c2c993422806843c9a9a45d4d5c407ad6dd2ea95eb9b6" dependencies = [ - "autocfg", - "bytes 1.2.1", - "libc", - "memchr", - "mio 0.8.4", - "num_cpus", - "pin-project-lite", - "socket2", - "winapi 0.3.9", + "winapi-util", ] [[package]] -name = "tokio-buf" -version = "0.1.1" +name = "terminfo" +version = "0.8.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "8fb220f46c53859a4b7ec083e41dec9778ff0b1851c0942b211edb89e0ccdc46" +checksum = "666cd3a6681775d22b200409aad3b089c5b99fb11ecdd8a204d9d62f8148498f" dependencies = [ - 
"bytes 0.4.12", - "either", - "futures", + "dirs", + "fnv", + "nom", + "phf 0.11.1", + "phf_codegen 0.11.1", ] [[package]] -name = "tokio-codec" -version = "0.1.2" +name = "textwrap" +version = "0.11.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "25b2998660ba0e70d18684de5d06b70b70a3a747469af9dea7618cc59e75976b" +checksum = "d326610f408c7a4eb6f51c37c330e496b08506c9457c9d34287ecc38809fb060" dependencies = [ - "bytes 0.4.12", - "futures", - "tokio-io", + "unicode-width", ] [[package]] -name = "tokio-current-thread" -version = "0.1.7" +name = "thiserror" +version = "1.0.40" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "b1de0e32a83f131e002238d7ccde18211c0a5397f60cbfffcb112868c2e0e20e" +checksum = "978c9a314bd8dc99be594bc3c175faaa9794be04a5a5e153caba6915336cebac" dependencies = [ - "futures", - "tokio-executor", + "thiserror-impl", ] [[package]] -name = "tokio-executor" -version = "0.1.10" +name = "thiserror-impl" +version = "1.0.40" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "fb2d1b8f4548dbf5e1f7818512e9c406860678f29c300cdf0ebac72d1a3a1671" +checksum = "f9456a42c5b0d803c8cd86e73dd7cc9edd429499f37a3550d286d5e86720569f" dependencies = [ - "crossbeam-utils 0.7.2", - "futures", + "proc-macro2", + "quote", + "syn 2.0.16", ] [[package]] -name = "tokio-fs" -version = "0.1.7" +name = "thread_local" +version = "1.1.4" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "297a1206e0ca6302a0eed35b700d292b275256f596e2f3fea7729d5e629b6ff4" +checksum = "5516c27b78311c50bf42c071425c560ac799b11c30b31f87e3081965fe5e0180" dependencies = [ - "futures", - "tokio-io", - "tokio-threadpool", + "once_cell", ] [[package]] -name = "tokio-io" -version = "0.1.13" +name = "time" +version = "0.3.21" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = 
"57fc868aae093479e3131e3d165c93b1c7474109d13c90ec0dda2a1bbfff0674" +checksum = "8f3403384eaacbca9923fa06940178ac13e4edb725486d70e8e15881d0c836cc" dependencies = [ - "bytes 0.4.12", - "futures", - "log", + "itoa", + "libc", + "num_threads", + "serde", + "time-core", + "time-macros", ] [[package]] -name = "tokio-native-tls" -version = "0.3.0" +name = "time-core" +version = "0.1.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "f7d995660bd2b7f8c1568414c1126076c13fbb725c40112dc0120b78eb9b717b" -dependencies = [ - "native-tls", - "tokio 1.21.2", -] +checksum = "7300fbefb4dadc1af235a9cef3737cea692a9d97e1b9cbcd4ebdae6f8868e6fb" [[package]] -name = "tokio-reactor" -version = "0.1.12" +name = "time-macros" +version = "0.2.9" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "09bc590ec4ba8ba87652da2068d150dcada2cfa2e07faae270a5e0409aa51351" +checksum = "372950940a5f07bf38dbe211d7283c9e6d7327df53794992d293e534c733d09b" dependencies = [ - "crossbeam-utils 0.7.2", - "futures", - "lazy_static", - "log", - "mio 0.6.23", - "num_cpus", - "parking_lot", - "slab", - "tokio-executor", - "tokio-io", - "tokio-sync", + "time-core", ] [[package]] -name = "tokio-sync" -version = "0.1.8" +name = "tinyvec" +version = "1.6.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "edfe50152bc8164fcc456dab7891fa9bf8beaf01c5ee7e1dd43a397c3cf87dee" +checksum = "87cc5ceb3875bb20c2890005a4e226a4651264a5c75edb2421b52861a0a0cb50" dependencies = [ - "fnv", - "futures", + "tinyvec_macros", ] [[package]] -name = "tokio-tcp" -version = "0.1.4" +name = "tinyvec_macros" +version = "0.1.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "98df18ed66e3b72e742f185882a9e201892407957e45fbff8da17ae7a7c51f72" -dependencies = [ - "bytes 0.4.12", - "futures", - "iovec", - "mio 0.6.23", - "tokio-io", - "tokio-reactor", -] +checksum = 
"1f3ccbac311fea05f86f61904b462b55fb3df8837a366dfc601a0161d0532f20" [[package]] -name = "tokio-threadpool" -version = "0.1.18" +name = "tokio" +version = "1.28.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "df720b6581784c118f0eb4310796b12b1d242a7eb95f716a8367855325c25f89" +checksum = "0aa32867d44e6f2ce3385e89dceb990188b8bb0fb25b0cf576647a6f98ac5105" dependencies = [ - "crossbeam-deque", - "crossbeam-queue", - "crossbeam-utils 0.7.2", - "futures", - "lazy_static", - "log", + "autocfg", + "bytes", + "libc", + "mio", "num_cpus", - "slab", - "tokio-executor", + "pin-project-lite", + "signal-hook-registry", + "socket2", + "tokio-macros", + "windows-sys 0.48.0", ] [[package]] -name = "tokio-timer" -version = "0.2.13" +name = "tokio-macros" +version = "2.1.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "93044f2d313c95ff1cb7809ce9a7a05735b012288a888b62d4434fd58c94f296" +checksum = "630bdcf245f78637c13ec01ffae6187cca34625e8c63150d424b59e55af2675e" dependencies = [ - "crossbeam-utils 0.7.2", - "futures", - "slab", - "tokio-executor", + "proc-macro2", + "quote", + "syn 2.0.16", ] [[package]] -name = "tokio-udp" -version = "0.1.6" +name = "tokio-native-tls" +version = "0.3.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "e2a0b10e610b39c38b031a2fcab08e4b82f16ece36504988dcbd81dbba650d82" +checksum = "bbae76ab933c85776efabc971569dd6119c580d8f5d448769dec1764bf796ef2" dependencies = [ - "bytes 0.4.12", - "futures", - "log", - "mio 0.6.23", - "tokio-codec", - "tokio-io", - "tokio-reactor", + "native-tls", + "tokio", ] [[package]] -name = "tokio-uds" -version = "0.2.7" +name = "tokio-stream" +version = "0.1.14" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "ab57a4ac4111c8c9dbcf70779f6fc8bc35ae4b2454809febac840ad19bd7e4e0" +checksum = "397c988d37662c7dda6d2208364a706264bf3d6138b11d436cbac0ad38832842" dependencies = [ - "bytes 
0.4.12", - "futures", - "iovec", - "libc", - "log", - "mio 0.6.23", - "mio-uds", - "tokio-codec", - "tokio-io", - "tokio-reactor", + "futures-core", + "pin-project-lite", + "tokio", ] [[package]] name = "tokio-util" -version = "0.7.4" +version = "0.7.8" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "0bb2e075f03b3d66d8d8785356224ba688d2906a371015e225beeb65ca92c740" +checksum = "806fe8c2c87eccc8b3267cbae29ed3ab2d0bd37fca70ab622e46aaa9375ddb7d" dependencies = [ - "bytes 1.2.1", + "bytes", "futures-core", "futures-sink", "pin-project-lite", - "tokio 1.21.2", + "tokio", "tracing", ] [[package]] name = "toml" -version = "0.5.9" +version = "0.7.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "b403acf6f2bb0859c93c7f0d967cb4a75a7ac552100f9322faf64dc047669b21" +dependencies = [ + "serde", + "serde_spanned", + "toml_datetime", + "toml_edit", +] + +[[package]] +name = "toml_datetime" +version = "0.6.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "3ab8ed2edee10b50132aed5f331333428b011c99402b5a534154ed15746f9622" +dependencies = [ + "serde", +] + +[[package]] +name = "toml_edit" +version = "0.19.8" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "8d82e1a7758622a465f8cee077614c73484dac5b836c02ff6a40d5d1010324d7" +checksum = "239410c8609e8125456927e6707163a3b1fdb40561e4b803bc041f466ccfdc13" dependencies = [ + "indexmap", "serde", + "serde_spanned", + "toml_datetime", + "winnow", ] [[package]] @@ -2957,35 +2851,48 @@ checksum = "b6bc1c9ce2b5135ac7f93c72918fc37feb872bdc6a5533a8b85eb4b86bfdae52" [[package]] name = "tracing" -version = "0.1.36" +version = "0.1.37" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "2fce9567bd60a67d08a16488756721ba392f24f29006402881e43b19aac64307" +checksum = "8ce8c33a8d48bd45d624a6e523445fd21ec13d3653cd51f681abf67418f54eb8" dependencies = [ - "cfg-if 1.0.0", + 
"cfg-if", + "log", "pin-project-lite", + "tracing-attributes", "tracing-core", ] +[[package]] +name = "tracing-attributes" +version = "0.1.24" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "0f57e3ca2a01450b1a921183a9c9cbfda207fd822cef4ccb00a65402cbba7a74" +dependencies = [ + "proc-macro2", + "quote", + "syn 2.0.16", +] + [[package]] name = "tracing-core" -version = "0.1.29" +version = "0.1.31" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "5aeea4303076558a00714b823f9ad67d58a3bbda1df83d8827d21193156e22f7" +checksum = "0955b8137a1df6f1a2e9a37d8a6656291ff0297c1a97c24e0d8425fe2312f79a" dependencies = [ "once_cell", ] [[package]] name = "try-lock" -version = "0.2.3" +version = "0.2.4" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "59547bce71d9c38b83d9c0e92b6066c4253371f15005def0c30d9657f50c7642" +checksum = "3528ecfd12c466c6f163363caf2d02a71161dd5e1cc6ae7b34207ea2d42d81ed" [[package]] name = "typenum" -version = "1.15.0" +version = "1.16.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "dcf81ac59edc17cc8697ff311e8f5ef2d99fcbd9817b34cec66f90b6c3dfd987" +checksum = "497961ef93d974e23eb6f433eb5fe1b7930b659f06d12dec6fc44a8f554c0bba" [[package]] name = "ucd-trie" @@ -2995,9 +2902,9 @@ checksum = "9e79c4d996edb816c91e4308506774452e55e95c3c9de07b6729e17e15a5ef81" [[package]] name = "uncased" -version = "0.9.7" +version = "0.9.9" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "09b01702b0fd0b3fadcf98e098780badda8742d4f4a7676615cad90e8ac73622" +checksum = "9b9bc53168a4be7402ab86c3aad243a84dd7381d09be0eddc81280c1da95ca68" dependencies = [ "version_check", ] @@ -3054,15 +2961,21 @@ dependencies = [ [[package]] name = "unicode-bidi" -version = "0.3.8" +version = "0.3.13" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = 
"92888ba5573ff080736b3648696b70cafad7d250551175acbaa4e0385b3e1460" + +[[package]] +name = "unicode-bom" +version = "2.0.2" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "099b7128301d285f79ddd55b9a83d5e6b9e97c92e0ea0daebee7263e932de992" +checksum = "98e90c70c9f0d4d1ee6d0a7d04aa06cb9bbd53d8cfbdd62a0269a7c2eb640552" [[package]] name = "unicode-ident" -version = "1.0.4" +version = "1.0.8" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "dcc811dc4066ac62f84f11307873c4850cb653bfa9b1719cee2bd2204a4bc5dd" +checksum = "e5464a87b239f13a63a501f2701565754bae92d243d4bb7eb12f6d57d2269bf4" [[package]] name = "unicode-normalization" @@ -3075,9 +2988,9 @@ dependencies = [ [[package]] name = "unicode-segmentation" -version = "1.10.0" +version = "1.10.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "0fdbf052a0783de01e944a6ce7a8cb939e295b1e7be835a1112c3b9a7f047a5a" +checksum = "1dd624098567895118886609431a7c3b8f516e41d30e0643f03d94592a147e36" [[package]] name = "unicode-width" @@ -3122,26 +3035,14 @@ checksum = "49874b5167b65d7193b8aba1567f5c7d93d001cafc34600cee003eda787e483f" [[package]] name = "walkdir" -version = "2.3.2" +version = "2.3.3" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "808cf2735cd4b6866113f648b791c6adc5714537bc222d9347bb203386ffda56" +checksum = "36df944cda56c7d8d8b7496af378e6b16de9284591917d307c9b4d313c44e698" dependencies = [ "same-file", - "winapi 0.3.9", "winapi-util", ] -[[package]] -name = "want" -version = "0.2.0" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "b6395efa4784b027708f7451087e647ec73cc74f5d9bc2e418404248d679a230" -dependencies = [ - "futures", - "log", - "try-lock", -] - [[package]] name = "want" version = "0.3.0" @@ -3152,18 +3053,6 @@ dependencies = [ "try-lock", ] -[[package]] -name = "wasi" -version = "0.9.0+wasi-snapshot-preview1" -source = 
"registry+https://github.com/rust-lang/crates.io-index" -checksum = "cccddf32554fecc6acb585f82a32a72e28b48f8c4c1883ddfeeeaa96f7d8e519" - -[[package]] -name = "wasi" -version = "0.10.0+wasi-snapshot-preview1" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "1a143597ca7c7793eff794def352d41792a93c481eb1042423ff7ff72ba2c31f" - [[package]] name = "wasi" version = "0.11.0+wasi-snapshot-preview1" @@ -3172,36 +3061,36 @@ checksum = "9c8d87e72b64a3b4db28d11ce29237c246188f4f51057d65a7eab63b7987e423" [[package]] name = "wasm-bindgen" -version = "0.2.83" +version = "0.2.86" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "eaf9f5aceeec8be17c128b2e93e031fb8a4d469bb9c4ae2d7dc1888b26887268" +checksum = "5bba0e8cb82ba49ff4e229459ff22a191bbe9a1cb3a341610c9c33efc27ddf73" dependencies = [ - "cfg-if 1.0.0", + "cfg-if", "wasm-bindgen-macro", ] [[package]] name = "wasm-bindgen-backend" -version = "0.2.83" +version = "0.2.86" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "4c8ffb332579b0557b52d268b91feab8df3615f265d5270fec2a8c95b17c1142" +checksum = "19b04bc93f9d6bdee709f6bd2118f57dd6679cf1176a1af464fca3ab0d66d8fb" dependencies = [ "bumpalo", "log", "once_cell", "proc-macro2", "quote", - "syn", + "syn 2.0.16", "wasm-bindgen-shared", ] [[package]] name = "wasm-bindgen-futures" -version = "0.4.33" +version = "0.4.36" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "23639446165ca5a5de86ae1d8896b737ae80319560fbaa4c2887b7da6e7ebd7d" +checksum = "2d1985d03709c53167ce907ff394f5316aa22cb4e12761295c5dc57dacb6297e" dependencies = [ - "cfg-if 1.0.0", + "cfg-if", "js-sys", "wasm-bindgen", "web-sys", @@ -3209,9 +3098,9 @@ dependencies = [ [[package]] name = "wasm-bindgen-macro" -version = "0.2.83" +version = "0.2.86" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = 
"052be0f94026e6cbc75cdefc9bae13fd6052cdcaf532fa6c45e7ae33a1e6c810" +checksum = "14d6b024f1a526bb0234f52840389927257beb670610081360e5a03c5df9c258" dependencies = [ "quote", "wasm-bindgen-macro-support", @@ -3219,47 +3108,103 @@ dependencies = [ [[package]] name = "wasm-bindgen-macro-support" -version = "0.2.83" +version = "0.2.86" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "07bc0c051dc5f23e307b13285f9d75df86bfdf816c5721e573dec1f9b8aa193c" +checksum = "e128beba882dd1eb6200e1dc92ae6c5dbaa4311aa7bb211ca035779e5efc39f8" dependencies = [ "proc-macro2", "quote", - "syn", + "syn 2.0.16", "wasm-bindgen-backend", "wasm-bindgen-shared", ] [[package]] name = "wasm-bindgen-shared" -version = "0.2.83" +version = "0.2.86" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "1c38c045535d93ec4f0b4defec448e4291638ee608530863b1e2ba115d4fff7f" +checksum = "ed9d5b4305409d1fc9482fee2d7f9bcbf24b3972bf59817ef757e23982242a93" [[package]] name = "watchexec" -version = "1.17.1" +version = "2.3.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "c52e0868bc57765fa91593a173323f464855e53b27f779e1110ba0fb4cb6b406" +checksum = "f8b97d05a9305a9aa6a7bedef64cd012ebc9b6f1f5ed0368fb48f0fe58f96988" dependencies = [ + "async-priority-channel", + "async-recursion", + "atomic-take", "clearscreen", "command-group", - "derive_builder", - "glob", - "globset", - "lazy_static", - "log", - "nix 0.22.3", + "futures", + "ignore-files", + "miette", + "nix", + "normalize-path", "notify", - "walkdir", - "winapi 0.3.9", + "once_cell", + "project-origins", + "thiserror", + "tokio", + "tracing", + "watchexec-events", + "watchexec-signals", +] + +[[package]] +name = "watchexec-events" +version = "1.0.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "01603bbe02fd75918f010dadad456d47eda14fb8fdcab276b0b4b8362f142ae3" +dependencies = [ + "nix", + "notify", + 
"watchexec-signals", +] + +[[package]] +name = "watchexec-filterer-globset" +version = "1.2.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "a4b0b0ccceaf3189f58cff6dcb1ef5df3d274b4b216202ac6686fa812e03b210" +dependencies = [ + "ignore", + "ignore-files", + "tracing", + "watchexec", + "watchexec-filterer-ignore", +] + +[[package]] +name = "watchexec-filterer-ignore" +version = "1.2.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "f345f020367ccdb7a19fcd59cafe5b8e99acdaf937b19d5847a6a1b81fd44ea3" +dependencies = [ + "dunce", + "ignore", + "ignore-files", + "tracing", + "watchexec", + "watchexec-signals", +] + +[[package]] +name = "watchexec-signals" +version = "1.0.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "cc2a5df96c388901c94ca04055fcd51d4196ca3e971c5e805bd4a4b61dd6a7e5" +dependencies = [ + "miette", + "nix", + "thiserror", ] [[package]] name = "web-sys" -version = "0.3.60" +version = "0.3.63" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "bcda906d8be16e728fd5adc5b729afad4e444e106ab28cd1c7256e54fa61510f" +checksum = "3bdd9ef4e984da1187bf8110c5cf5b845fbc87a23602cdf912386a76fcd3a7c2" dependencies = [ "js-sys", "wasm-bindgen", @@ -3267,21 +3212,15 @@ dependencies = [ [[package]] name = "which" -version = "4.3.0" +version = "4.4.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "1c831fbbee9e129a8cf93e7747a82da9d95ba8e16621cae60ec2cdc849bacb7b" +checksum = "2441c784c52b289a054b7201fc93253e288f094e2f4be9058343127c4226a269" dependencies = [ "either", "libc", "once_cell", ] -[[package]] -name = "winapi" -version = "0.2.8" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "167dc9d6949a9b857f3451275e911c3f44255842c1f7a76f33c55103a909087a" - [[package]] name = "winapi" version = "0.3.9" @@ -3292,12 +3231,6 @@ dependencies = [ 
"winapi-x86_64-pc-windows-gnu", ] -[[package]] -name = "winapi-build" -version = "0.1.1" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "2d315eee3b34aca4797b2da6b13ed88266e6d612562a0c46390af8299fc699bc" - [[package]] name = "winapi-i686-pc-windows-gnu" version = "0.4.0" @@ -3310,7 +3243,7 @@ version = "0.1.5" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "70ec6ce85bb158151cae5e5c87f95a8e97d2c0c4b001223f33a334e3ce5de178" dependencies = [ - "winapi 0.3.9", + "winapi", ] [[package]] @@ -3319,85 +3252,197 @@ version = "0.4.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "712e227841d057c1ee1cd2fb22fa7e5a5461ae8e48fa2ca79ec42cfc1931183f" +[[package]] +name = "windows" +version = "0.48.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "e686886bc078bc1b0b600cac0147aadb815089b6e4da64016cbd754b6342700f" +dependencies = [ + "windows-targets 0.48.0", +] + +[[package]] +name = "windows-sys" +version = "0.42.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "5a3e1820f08b8513f676f7ab6c1f99ff312fb97b553d30ff4dd86f9f15728aa7" +dependencies = [ + "windows_aarch64_gnullvm 0.42.2", + "windows_aarch64_msvc 0.42.2", + "windows_i686_gnu 0.42.2", + "windows_i686_msvc 0.42.2", + "windows_x86_64_gnu 0.42.2", + "windows_x86_64_gnullvm 0.42.2", + "windows_x86_64_msvc 0.42.2", +] + +[[package]] +name = "windows-sys" +version = "0.45.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "75283be5efb2831d37ea142365f009c02ec203cd29a3ebecbc093d52315b66d0" +dependencies = [ + "windows-targets 0.42.2", +] + [[package]] name = "windows-sys" -version = "0.36.1" +version = "0.48.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "677d2418bec65e3338edb076e806bc1ec15693c5d0104683f2efe857f61056a9" +dependencies = [ + "windows-targets 0.48.0", 
+] + +[[package]] +name = "windows-targets" +version = "0.42.2" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "ea04155a16a59f9eab786fe12a4a450e75cdb175f9e0d80da1e17db09f55b8d2" +checksum = "8e5180c00cd44c9b1c88adb3693291f1cd93605ded80c250a75d472756b4d071" dependencies = [ - "windows_aarch64_msvc", - "windows_i686_gnu", - "windows_i686_msvc", - "windows_x86_64_gnu", - "windows_x86_64_msvc", + "windows_aarch64_gnullvm 0.42.2", + "windows_aarch64_msvc 0.42.2", + "windows_i686_gnu 0.42.2", + "windows_i686_msvc 0.42.2", + "windows_x86_64_gnu 0.42.2", + "windows_x86_64_gnullvm 0.42.2", + "windows_x86_64_msvc 0.42.2", ] +[[package]] +name = "windows-targets" +version = "0.48.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "7b1eb6f0cd7c80c79759c929114ef071b87354ce476d9d94271031c0497adfd5" +dependencies = [ + "windows_aarch64_gnullvm 0.48.0", + "windows_aarch64_msvc 0.48.0", + "windows_i686_gnu 0.48.0", + "windows_i686_msvc 0.48.0", + "windows_x86_64_gnu 0.48.0", + "windows_x86_64_gnullvm 0.48.0", + "windows_x86_64_msvc 0.48.0", +] + +[[package]] +name = "windows_aarch64_gnullvm" +version = "0.42.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "597a5118570b68bc08d8d59125332c54f1ba9d9adeedeef5b99b02ba2b0698f8" + +[[package]] +name = "windows_aarch64_gnullvm" +version = "0.48.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "91ae572e1b79dba883e0d315474df7305d12f569b400fcf90581b06062f7e1bc" + +[[package]] +name = "windows_aarch64_msvc" +version = "0.42.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "e08e8864a60f06ef0d0ff4ba04124db8b0fb3be5776a5cd47641e942e58c4d43" + [[package]] name = "windows_aarch64_msvc" -version = "0.36.1" +version = "0.48.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = 
"b2ef27e0d7bdfcfc7b868b317c1d32c641a6fe4629c171b8928c7b08d98d7cf3" + +[[package]] +name = "windows_i686_gnu" +version = "0.42.2" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "9bb8c3fd39ade2d67e9874ac4f3db21f0d710bee00fe7cab16949ec184eeaa47" +checksum = "c61d927d8da41da96a81f029489353e68739737d3beca43145c8afec9a31a84f" [[package]] name = "windows_i686_gnu" -version = "0.36.1" +version = "0.48.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "622a1962a7db830d6fd0a69683c80a18fda201879f0f447f065a3b7467daa241" + +[[package]] +name = "windows_i686_msvc" +version = "0.42.2" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "180e6ccf01daf4c426b846dfc66db1fc518f074baa793aa7d9b9aaeffad6a3b6" +checksum = "44d840b6ec649f480a41c8d80f9c65108b92d89345dd94027bfe06ac444d1060" [[package]] name = "windows_i686_msvc" -version = "0.36.1" +version = "0.48.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "4542c6e364ce21bf45d69fdd2a8e455fa38d316158cfd43b3ac1c5b1b19f8e00" + +[[package]] +name = "windows_x86_64_gnu" +version = "0.42.2" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "e2e7917148b2812d1eeafaeb22a97e4813dfa60a3f8f78ebe204bcc88f12f024" +checksum = "8de912b8b8feb55c064867cf047dda097f92d51efad5b491dfb98f6bbb70cb36" [[package]] name = "windows_x86_64_gnu" -version = "0.36.1" +version = "0.48.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "ca2b8a661f7628cbd23440e50b05d705db3686f894fc9580820623656af974b1" + +[[package]] +name = "windows_x86_64_gnullvm" +version = "0.42.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "26d41b46a36d453748aedef1486d5c7a85db22e56aff34643984ea85514e94a3" + +[[package]] +name = "windows_x86_64_gnullvm" +version = "0.48.0" source = 
"registry+https://github.com/rust-lang/crates.io-index" -checksum = "4dcd171b8776c41b97521e5da127a2d86ad280114807d0b2ab1e462bc764d9e1" +checksum = "7896dbc1f41e08872e9d5e8f8baa8fdd2677f29468c4e156210174edc7f7b953" [[package]] name = "windows_x86_64_msvc" -version = "0.36.1" +version = "0.42.2" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "c811ca4a8c853ef420abd8592ba53ddbbac90410fab6903b3e79972a631f7680" +checksum = "9aec5da331524158c6d1a4ac0ab1541149c0b9505fde06423b02f5ef0106b9f0" [[package]] -name = "winreg" -version = "0.10.1" +name = "windows_x86_64_msvc" +version = "0.48.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "80d0f4e272c85def139476380b12f9ac60926689dd2e01d4923222f40580869d" +checksum = "1a515f5799fe4961cb532f983ce2b23082366b898e52ffbce459c86f67c8378a" + +[[package]] +name = "winnow" +version = "0.4.6" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "61de7bac303dc551fe038e2b3cef0f571087a47571ea6e79a87692ac99b99699" dependencies = [ - "winapi 0.3.9", + "memchr", ] [[package]] -name = "ws2_32-sys" -version = "0.2.1" +name = "winreg" +version = "0.10.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "d59cefebd0c892fa2dd6de581e937301d8552cb44489cdff035c6187cb63fa5e" +checksum = "80d0f4e272c85def139476380b12f9ac60926689dd2e01d4923222f40580869d" dependencies = [ - "winapi 0.2.8", - "winapi-build", + "winapi", ] [[package]] name = "xdg" -version = "2.4.1" +version = "2.5.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "0c4583db5cbd4c4c0303df2d15af80f0539db703fa1c68802d4cbbd2dd0f88f6" +checksum = "688597db5a750e9cad4511cb94729a078e274308099a0382b5b8203bbc767fee" dependencies = [ - "dirs 4.0.0", + "home", ] [[package]] name = "zip" -version = "0.5.13" +version = "0.6.6" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = 
"93ab48844d61251bb3835145c521d88aa4031d7139e8485990f60ca911fa0815" +checksum = "760394e246e4c28189f19d488c058bf16f564016aefac5d32bb1f3b51d5e9261" dependencies = [ "byteorder", "crc32fast", + "crossbeam-utils", "flate2", - "thiserror", ] diff --git a/Cargo.toml b/Cargo.toml index eb879a1160..0de873d1a4 100644 --- a/Cargo.toml +++ b/Cargo.toml @@ -63,11 +63,11 @@ flate2 = { version = "^1.0.19", default-features = false, features = ["zlib"] } fs2 = "^0.4" lazy_static = "^1.4" libc = "^0.2" -md-5 = "^0.9" -open = "1.4.0" -quick-xml = "^0.22" +md-5 = "^0.10" +open = "^4.0" +quick-xml = "^0.28" serde = { version = "^1.0", features = ["derive"], optional = true } -sha2 = "^0.9" +sha2 = "^0.10" structopt = "0.3" tectonic_bridge_core = { path = "crates/bridge_core", version = "0.0.0-dev.0" } tectonic_bundles = { path = "crates/bundles", version = "0.0.0-dev.0", default-features = false } @@ -84,10 +84,13 @@ tectonic_xdv = { path = "crates/xdv", version = "0.0.0-dev.0" } tectonic_xetex_layout = { path = "crates/xetex_layout", version = "0.0.0-dev.0" } tempfile = "^3.1" termcolor = "^1.1" -toml = { version = "^0.5", optional = true } +tokio = "^1.0" +toml = { version = "^0.7", optional = true } url = "^2.0" -watchexec = "^1.15.3" -zip = { version = "^0.5", default-features = false, features = ["deflate"] } +watchexec = "^2.3.0" +watchexec-filterer-globset = "1.2" +watchexec-signals = "1.0" +zip = { version = "^0.6", default-features = false, features = ["deflate"] } [features] default = ["geturl-reqwest", "serialization"] @@ -112,21 +115,24 @@ profile = [] [dev-dependencies] filetime = "^0.2" -futures = "0.1" -headers = "0.2" -hyper = "0.12" +futures = "0.3" +headers = "0.3" +hyper = { version = "0.14", features = ["server"] } tempfile = "^3.1" -tokio = "0.1.22" [package.metadata.vcpkg] git = "https://github.com/microsoft/vcpkg" -rev = "1be5a98d090b1d2beeb63a48ba800fbf006e8aca" +rev = "ea222747b888b8d63df56240b262db38b095c68f" +overlay-triplets-path = 
"dist/vcpkg-triplets" +# If other targets start using custom triplets like x86_64-pc-windows-msvc, +# add them to crates/dep_support/src/lib.rs:new_from_vcpkg() to give users +# guidance if they might need to set $VCPKGRS_TRIPLET. [package.metadata.vcpkg.target] x86_64-apple-darwin = { install = ["freetype","harfbuzz[icu,graphite2]"] } aarch64-apple-darwin = { triplet = "arm64-osx", install = ["freetype","harfbuzz[icu,graphite2]"] } x86_64-unknown-linux-gnu = { install = ["fontconfig","freetype","harfbuzz[icu,graphite2]"] } -x86_64-pc-windows-msvc = { triplet = "x64-windows-static", install = ["fontconfig","freetype","harfbuzz[icu,graphite2]"] } +x86_64-pc-windows-msvc = { triplet = "x64-windows-static-release", install = ["fontconfig","freetype","harfbuzz[icu,graphite2]"] } [package.metadata.internal_dep_versions] tectonic_bridge_core = "526ff57d5dd9f80dff35a3a5dd856edc9f0f61aa" @@ -139,7 +145,7 @@ tectonic_cfg_support = "thiscommit:aeRoo7oa" tectonic_dep_support = "5faf4205bdd3d31101b749fc32857dd746f9e5bc" tectonic_docmodel = "a88a0418a9c3c559d023d9b1da9b03fce3a469e5" tectonic_engine_bibtex = "thiscommit:2021-01-17:KuhaeG1e" -tectonic_engine_spx2html = "thiscommit:2022-03-02:IQWAncv" +tectonic_engine_spx2html = "thiscommit:2022-11-22:vicemXu" tectonic_engine_xdvipdfmx = "8a003834b1f6d967d33cc07de4cc025af14560da" tectonic_engine_xetex = "c135e6a4a5a2e8c2dc4edcbcfd93f7d466ff8f88" tectonic_errors = "317ae79ceaa2593fb56090e37bf1f5cc24213dd9" diff --git a/build.rs b/build.rs index 84a8507176..11dbe81609 100644 --- a/build.rs +++ b/build.rs @@ -9,5 +9,5 @@ fn main() { // they want to spawn off executables. 
let target = env::var("TARGET").unwrap(); - println!("cargo:rustc-env=TARGET={}", target); + println!("cargo:rustc-env=TARGET={target}"); } diff --git a/crates/bridge_core/CHANGELOG.md b/crates/bridge_core/CHANGELOG.md index 8cd1ec515c..2816cd2b75 100644 --- a/crates/bridge_core/CHANGELOG.md +++ b/crates/bridge_core/CHANGELOG.md @@ -1,8 +1,56 @@ -# See elsewhere for changelog +# rc: micro bump -This project’s release notes are curated from the Git history of its main -branch. You can find them by looking at [the version of this file on the -`release` branch][branch] or the [GitHub release history][gh-releases]. +- Bump the `md-5` dep to the 0.10 series (#1038, @CraftSpider) +- Tidy up recent Clippy warnings. -[branch]: https://github.com/tectonic-typesetting/tectonic/blob/release/crates/bridge_core/CHANGELOG.md -[gh-releases]: https://github.com/tectonic-typesetting/tectonic/releases + +# tectonic_bridge_core 0.3.1 (2022-10-03) + +- Remove C's `time_t` from internal FFI APIs to avoid portability issues. This + should avoid issues with Linux Musl builds. + + +# tectonic_bridge_core 0.3.0 (2021-10-11) + +- Add `SecuritySettings::allow_extra_search_paths()` (#814, @ralismark). + + +# tectonic_bridge_core 0.2.2 (2021-06-17) + +- Switch from running [cbindgen] at build time to having the developer run it + manually. This really ought to fix the crate builds on docs.rs ([#788]), and + should speed builds too. + +[cbindgen]: https://github.com/eqrion/cbindgen +[#788]: https://github.com/tectonic-typesetting/tectonic/issues/788 + + +# tectonic_bridge_core 0.2.1 (2021-06-17) + +- Attempt to fix crate builds on docs.rs — see [#788]. This works around an + issue in Tectonic’s usage of [cbindgen] by configuring Cargo to operate in + offline mode when building on docs.rs, which builds crates with network access + turned off. 
+ +[#788]: https://github.com/tectonic-typesetting/tectonic/issues/788 +[cbindgen]: https://github.com/eqrion/cbindgen + + +# tectonic_bridge_core 0.2.0 (2021-06-15) + +- Add a security infrastructure that gives a systematic way to control whether + features that can be abused by untrusted inputs, like shell-escape, are + enabled. The default is to disable all such features. Callers can request to + allow their use, but we use a centralized approach that ensures that such + requests will always be denied if the environment variable + `$TECTONIC_UNTRUSTED_MODE` is set to a nonempty value (@pkgw, #787). +- Add a C API allowing us to expose the filesystem paths for just-opened + inputs. This is needed for correct SyncTeX support (@hullanson, @pkgw, #762). + + +# tectonic_bridge_core 0.1.0 (2021-06-03) + +This is the first release of the "core" bridge crate. It provides a baseline of +APIs for C/C++ code to interact with an underlying "driver" implemented in Rust. +Those APIs mainly revolve around basic I/O and diagnostics, although we do have +a specialized "system request" to implement the TeX shell-escape feature. 
diff --git a/crates/bridge_core/Cargo.toml b/crates/bridge_core/Cargo.toml index 94264664cc..67543b307b 100644 --- a/crates/bridge_core/Cargo.toml +++ b/crates/bridge_core/Cargo.toml @@ -22,7 +22,7 @@ links = "tectonic_bridge_core" flate2 = { version = "^1.0", default-features = false, features = ["zlib"] } lazy_static = "^1.4" libc = "^0.2" -md-5 = "^0.9" +md-5 = "^0.10" tectonic_errors = { path = "../errors", version = "0.0.0-dev.0" } tectonic_io_base = { path = "../io_base", version = "0.0.0-dev.0" } tectonic_status_base = { path = "../status_base", version = "0.0.0-dev.0" } diff --git a/crates/bridge_core/src/lib.rs b/crates/bridge_core/src/lib.rs index db0673faad..b87060baf7 100644 --- a/crates/bridge_core/src/lib.rs +++ b/crates/bridge_core/src/lib.rs @@ -1,4 +1,4 @@ -// Copyright 2016-2021 the Tectonic Project +// Copyright 2016-2022 the Tectonic Project // Licensed under the MIT License. #![deny(missing_docs)] @@ -349,7 +349,7 @@ impl<'a> CoreBridgeState<'a> { // `lipsum.ltd.tex` under the name `lipsum.ltd`. for e in format.extensions() { - let ext = format!("{}.{}", name, e); + let ext = format!("{name}.{e}"); if let FileFormat::Format = format { match io.input_open_format(&ext, self.status) { @@ -714,11 +714,12 @@ pub struct SecuritySettings { /// Different high-level security stances that can be adopted when creating /// [`SecuritySettings`]. -#[derive(Clone, Debug)] +#[derive(Clone, Debug, Default)] pub enum SecurityStance { /// Ensure that all known-insecure features are disabled. /// /// Use this stance if you are processing untrusted input. + #[default] DisableInsecures, /// Request to allow the use of known-insecure features. @@ -730,13 +731,6 @@ pub enum SecurityStance { MaybeAllowInsecures, } -impl Default for SecurityStance { - fn default() -> Self { - // Obvi, the default is secure!!! - SecurityStance::DisableInsecures - } -} - impl SecuritySettings { /// Create a new security configuration. 
/// @@ -817,11 +811,7 @@ pub unsafe extern "C" fn ttbc_get_file_md5( let rpath = CStr::from_ptr(path).to_string_lossy(); let rdest = slice::from_raw_parts_mut(digest, 16); - if es.get_file_md5(rpath.as_ref(), rdest) { - 1 - } else { - 0 - } + libc::c_int::from(es.get_file_md5(rpath.as_ref(), rdest)) } /// Calculate the MD5 digest of a block of binary data. @@ -917,11 +907,7 @@ pub extern "C" fn ttbc_output_flush( es: &mut CoreBridgeState, handle: *mut OutputHandle, ) -> libc::c_int { - if es.output_flush(handle) { - 1 - } else { - 0 - } + libc::c_int::from(es.output_flush(handle)) } /// Close a Tectonic output file. @@ -934,11 +920,7 @@ pub extern "C" fn ttbc_output_close( return 0; // This is/was the behavior of close_file() in C. } - if es.output_close(handle) { - 1 - } else { - 0 - } + libc::c_int::from(es.output_close(handle)) } /// Open a Tectonic file for input. @@ -1137,11 +1119,7 @@ pub extern "C" fn ttbc_input_close( return 0; // This is/was the behavior of close_file() in C. } - if es.input_close(handle) { - 1 - } else { - 0 - } + libc::c_int::from(es.input_close(handle)) } /// A buffer for diagnostic messages. Rust code does not need to use this type. @@ -1214,11 +1192,7 @@ pub unsafe extern "C" fn ttbc_shell_escape( } }; - if es.shell_escape(&rcmd) { - 1 - } else { - 0 - } + libc::c_int::from(es.shell_escape(&rcmd)) } /// Different types of files that can be opened by TeX engines diff --git a/crates/bridge_flate/CHANGELOG.md b/crates/bridge_flate/CHANGELOG.md index 2e679ddbf9..faf5848607 100644 --- a/crates/bridge_flate/CHANGELOG.md +++ b/crates/bridge_flate/CHANGELOG.md @@ -1,8 +1,56 @@ -# See elsewhere for changelog +# rc: micro bump -This project’s release notes are curated from the Git history of its main -branch. You can find them by looking at [the version of this file on the -`release` branch][branch] or the [GitHub release history][gh-releases]. +- Tidy up recent Clippy warnings. 
-[branch]: https://github.com/tectonic-typesetting/tectonic/blob/release/crates/bridge_flate/CHANGELOG.md -[gh-releases]: https://github.com/tectonic-typesetting/tectonic/releases + +# tectonic_bridge_flate 0.1.6 (2022-10-03) + +No code changes, but some internal documentation improvements about managing FFI +APIs. + + +# tectonic_bridge_flate 0.1.5 (2021-06-17) + +- Switch from running [cbindgen] at build time to having the developer run it + manually. This really ought to fix the crate builds on docs.rs ([#788]), and + should speed builds too. + +[cbindgen]: https://github.com/eqrion/cbindgen +[#788]: https://github.com/tectonic-typesetting/tectonic/issues/788 + + +# tectonic_bridge_flate 0.1.4 (2021-06-17) + +- Attempt to fix crate builds on docs.rs — see [#788]. This works around an + issue in Tectonic’s usage of [cbindgen] by configuring Cargo to operate in + offline mode when building on docs.rs, which builds crates with network access + turned off. + +[#788]: https://github.com/tectonic-typesetting/tectonic/issues/788 +[cbindgen]: https://github.com/eqrion/cbindgen + + +# tectonic_bridge_flate 0.1.3 (2021-06-16) + +- Try again with our docs.rs workarounds. Looks like we need + `CARGO_NET_OFFLINE=true`, not `CARGO_NET_OFFLINE=1`. + + +# tectonic_bridge_flate 0.1.2 (2021-06-16) + +- Try some workarounds to get docs building on docs.rs, both for this crate on + its own and for the toplevel `tectonic` crate. + + +# tectonic_bridge_flate 0.1.1 (2021-01-16) + +- Fix a Clippy complaint + + +# tectonic_bridge_flate 0.1.0 (2021-01-03) + +Initial release of the `tectonic_bridge_flate` crate. This crate provides a +simple C API to the flate2 crate — even though flate2 often wraps zlib, which +has its own C API. This is the first step towards segmenting Tectonic's +native-library dependencies and starting to be able to vendor them. This new +crate doesn't change anything dramatic yet, but starts that process. 
diff --git a/crates/bridge_flate/src/lib.rs b/crates/bridge_flate/src/lib.rs index 1ea5ba4884..bf83271d0d 100644 --- a/crates/bridge_flate/src/lib.rs +++ b/crates/bridge_flate/src/lib.rs @@ -72,7 +72,7 @@ pub unsafe extern "C" fn tectonic_flate_compress( (c.total_out(), FlateResult::Success) }; - *output_len = size as u64; + *output_len = size; result } @@ -101,7 +101,7 @@ pub unsafe extern "C" fn tectonic_flate_decompress( Err(_) => (0, FlateResult::OtherError), }; - *output_len = size as u64; + *output_len = size; result } diff --git a/crates/bridge_graphite2/CHANGELOG.md b/crates/bridge_graphite2/CHANGELOG.md index 144b3099a8..8c49cb5bcb 100644 --- a/crates/bridge_graphite2/CHANGELOG.md +++ b/crates/bridge_graphite2/CHANGELOG.md @@ -1,8 +1,26 @@ -# See elsewhere for changelog +# rc: micro bump -This project’s release notes are curated from the Git history of its main -branch. You can find them by looking at [the version of this file on the -`release` branch][branch] or the [GitHub release history][gh-releases]. +- Tidy up recent Clippy warnings. -[branch]: https://github.com/tectonic-typesetting/tectonic/blob/release/crates/bridge_graphite2/CHANGELOG.md -[gh-releases]: https://github.com/tectonic-typesetting/tectonic/releases + +# tectonic_bridge_graphite2 0.2.1 (2021-10-11) + +- Fix the build script for Rust >=1.55 (#820, @pkgw) + + +# tectonic_bridge_graphite2 0.2.0 (2021-06-03) + +- Fix up handling of how C/C++ header file paths are exported to dependent + crates. This is a breaking change: we've moved from a single include directory + to a list of them. +- Some improvements to the documentation + +# tectonic_bridge_graphite2 0.1.1 (2021-01-16) + +- Export information about the `GRAPHITE2_STATIC` C preprocessor define that is + sometimes needed. + +# tectonic_bridge_graphite2 0.1.0 (2021-01-04) + +A new crate to encapsulate the location and use of the `graphite2` library used +by Tectonic. 
diff --git a/crates/bridge_graphite2/build.rs b/crates/bridge_graphite2/build.rs index 7ef29b4b42..53d130bdde 100644 --- a/crates/bridge_graphite2/build.rs +++ b/crates/bridge_graphite2/build.rs @@ -55,5 +55,5 @@ fn main() { "" }; - println!("cargo:define_static={}", define_static_flag); + println!("cargo:define_static={define_static_flag}"); } diff --git a/crates/bundles/CHANGELOG.md b/crates/bundles/CHANGELOG.md index 008638b0b7..c8caace2c2 100644 --- a/crates/bundles/CHANGELOG.md +++ b/crates/bundles/CHANGELOG.md @@ -1,8 +1,62 @@ -# See elsewhere for changelog +# rc: micro bump -This project’s release notes are curated from the Git history of its main -branch. You can find them by looking at [the version of this file on the -`release` branch][branch] or the [GitHub release history][gh-releases]. +- Bump the `zip` dependency to the 0.6 series (#1038, @CraftSpider) +- Tidy up formatting and recent Clippy warnings -[branch]: https://github.com/tectonic-typesetting/tectonic/blob/release/crates/bundles/CHANGELOG.md -[gh-releases]: https://github.com/tectonic-typesetting/tectonic/releases + +# tectonic_bundles 0.3.0 (2022-04-26) + +This minor bump contains a breaking change! + +- The default bundle URL is now parametrized with the "format version", which + captures the internal capabilities of the XeTeX engine. Since the bundle and + the engine are fairly tightly coupled, this allows us to provide bundles that + track the capabilities of newer engine versions, while preserving the behavior + of older engine versions. Anyway, instead of exporting a `FALLBACK_BUNDLE_URL` + const, we now export a `get_fallback_bundle_url()` method that takes the + format version as an argument. This argument should be the value of + `tectonic_engine_xetex::FORMAT_SERIAL` if you have a module that actually + links to the XeTeX engine. +- Make the cache location customizable with the environment variable + `TECTONIC_CACHE_DIR` (#880, #884, @wischi-chr). 
+- Fix "fetching" of zero-size files to succeed without attempting any I/O (#888, + @pkgw). + + +# tectonic_bundles 0.2.0 (2021-10-11) + +This release contains a major configuration change, updating the URL of the +default bundle to refer to a new, dedicated web service rather than using +`archive.org` (#833, @pkgw). The new default URL is: + +https://relay.fullyjustified.net/default_bundle.tar + +This switch was motivated by the recent breakage caused by a change in +archive.org's internal implementation, even though that breakage has been fixed +in the most recent release of the `tectonic_geturl` crate. The `archive.org` +redirection service has always had low-level reliability issues and, more +importantly, is blocked in China, which is a fatal issue for a potentially large +number of users. + +The new webservice is a very simple nginx server set up in a Docker container +defined in the [tectonic-relay-service] repo. The associated web infrastructure +runs on Microsoft Azure and is configured using Terraform files in the +[tectonic-cloud-infra] repo. + +[tectonic-relay-service]: https://github.com/tectonic-typesetting/tectonic-relay-service +[tectonic-cloud-infra]: https://github.com/tectonic-typesetting/tectonic-cloud-infra + +@pkgw owns the `fullyjustified.net` domain name and the Azure subscription into +which the services are deployed. + + +# tectonic_bundles 0.1.0 (2021-06-15) + +Add the `tectonic_bundles` crate! This separates out the implementation of the +various Tectonic file “bundles” into a standalone crate, so that you can use +them without having to link to harfbuzz and everything else pulled in by the +main crate. + +As usual, separating out this crate led to some good API clarifications and +improvements. The API offered here includes some nontrivial breakage compared to +the old APIs in `tectonic::io::*`, but it's much more rationalized. 
diff --git a/crates/bundles/Cargo.toml b/crates/bundles/Cargo.toml index d42d6d8bea..11eb48e1d6 100644 --- a/crates/bundles/Cargo.toml +++ b/crates/bundles/Cargo.toml @@ -24,7 +24,7 @@ tectonic_errors = { path = "../errors", version = "0.0.0-dev.0" } tectonic_geturl = { path = "../geturl", version = "0.0.0-dev.0", default-features = false } tectonic_io_base = { path = "../io_base", version = "0.0.0-dev.0" } tectonic_status_base = { path = "../status_base", version = "0.0.0-dev.0" } -zip = { version = "^0.5", default-features = false, features = ["deflate"] } +zip = { version = "^0.6", default-features = false, features = ["deflate"] } [features] default = ["geturl-reqwest"] diff --git a/crates/bundles/src/cache.rs b/crates/bundles/src/cache.rs index 3e1d131158..0c481b0d83 100644 --- a/crates/bundles/src/cache.rs +++ b/crates/bundles/src/cache.rs @@ -461,7 +461,7 @@ impl CachingBundle { // line-based manifest format. Be paranoid and refuse to record such // filenames. if !name.contains(|c| c == '\n' || c == '\r') { - writeln!(man, "{} {} {}", name, length, digest_text)?; + writeln!(man, "{name} {length} {digest_text}")?; } self.contents.insert( @@ -525,7 +525,7 @@ impl CachingBundle { // The resolved URL has changed, but the digest is the same. So // let's just update the URL and keep going. 
let resolved_path = make_txt_path(&self.resolved_base, &pull_data.digest.to_string()); - file_create_write(&resolved_path, |f| { + file_create_write(resolved_path, |f| { f.write_all(pull_data.resolved_url.as_bytes()) })?; @@ -647,7 +647,7 @@ impl IoProvider for CachingBundle { OpenResult::Err(e) => return OpenResult::Err(e), }; - let f = match File::open(&path) { + let f = match File::open(path) { Ok(f) => f, Err(e) => return OpenResult::Err(e.into()), }; @@ -762,5 +762,5 @@ fn ensure_cache_dir(root: &Path, path: &str) -> Result { /// Convenience to generate a text filename fn make_txt_path(base: &Path, name: &str) -> PathBuf { - base.join(&name).with_extension("txt") + base.join(name).with_extension("txt") } diff --git a/crates/bundles/src/dir.rs b/crates/bundles/src/dir.rs index db431ddfa5..1ec980a710 100644 --- a/crates/bundles/src/dir.rs +++ b/crates/bundles/src/dir.rs @@ -60,7 +60,7 @@ impl Bundle for DirBundle { let mut files = Vec::new(); // We intentionally do not explore the directory recursively. - for entry in fs::read_dir(&self.0.root())? { + for entry in fs::read_dir(self.0.root())? 
{ let entry = entry?; // This catches both regular files and symlinks:` diff --git a/crates/bundles/src/lib.rs b/crates/bundles/src/lib.rs index b8c3f6c38b..6ff8bd0ee0 100644 --- a/crates/bundles/src/lib.rs +++ b/crates/bundles/src/lib.rs @@ -117,10 +117,7 @@ pub fn get_fallback_bundle_url(format_version: u32) -> String { if format_version < 32 { "https://relay.fullyjustified.net/default_bundle.tar".to_owned() } else { - format!( - "https://relay.fullyjustified.net/default_bundle_v{}.tar", - format_version - ) + format!("https://relay.fullyjustified.net/default_bundle_v{format_version}.tar") } } diff --git a/crates/dep_support/CHANGELOG.md b/crates/dep_support/CHANGELOG.md index 12d192fcb9..2c72f0b99e 100644 --- a/crates/dep_support/CHANGELOG.md +++ b/crates/dep_support/CHANGELOG.md @@ -1,8 +1,14 @@ -# See elsewhere for changelog +# rc: micro bump -This project’s release notes are curated from the Git history of its main -branch. You can find them by looking at [the version of this file on the -`release` branch][branch] or the [GitHub release history][gh-releases]. +- Default the Windows vcpkg build to use a custom triplet that doesn't + do debug builds (#961, @pkgw). This significantly speeds up the + Tectonic Windows CI runs. +- Tidy up recent Clippy warnings. -[branch]: https://github.com/tectonic-typesetting/tectonic/blob/release/crates/dep_support/CHANGELOG.md -[gh-releases]: https://github.com/tectonic-typesetting/tectonic/releases + +# tectonic_dep_support 0.1.0 (2021-01-04) + +A new crate to support Tectonic's searching for external libraries +("dependencies"). Notably, this crate supports finding deps using either +pkg-config or vcpkg. It does *not* (yet?) handle the question of deciding +whether to find a dependency externally or vendor it (build it locally). 
diff --git a/crates/dep_support/src/lib.rs b/crates/dep_support/src/lib.rs index 4c5956efc7..ab06149c2f 100644 --- a/crates/dep_support/src/lib.rs +++ b/crates/dep_support/src/lib.rs @@ -14,21 +14,16 @@ use std::{ }; /// Supported depedency-finding backends. -#[derive(Clone, Copy, Debug, Eq, PartialEq)] +#[derive(Clone, Copy, Debug, Default, Eq, PartialEq)] pub enum Backend { /// pkg-config + #[default] PkgConfig, /// vcpkg Vcpkg, } -impl Default for Backend { - fn default() -> Self { - Backend::PkgConfig - } -} - /// Dep-finding configuration. #[derive(Clone, Debug, Eq, PartialEq)] pub struct Configuration { @@ -91,6 +86,7 @@ struct VcPkgState { /// State for discovering and managing a dependency, which may vary /// depending on the framework that we're using to discover them. #[derive(Debug)] +#[allow(clippy::large_enum_variant)] enum DepState { /// pkg-config PkgConfig(PkgConfigState), @@ -124,10 +120,26 @@ impl DepState { let mut include_paths = vec![]; for dep in spec.get_vcpkg_spec() { - let library = vcpkg::Config::new() - .cargo_metadata(false) - .find_package(dep) - .unwrap_or_else(|e| panic!("failed to load package {} from vcpkg: {}", dep, e)); + let library = match vcpkg::Config::new().cargo_metadata(false).find_package(dep) { + Ok(lib) => lib, + Err(e) => { + if let vcpkg::Error::LibNotFound(_) = e { + // We should potentially be referencing the CARGO_CFG_TARGET_* + // variables to handle cross-compilation (cf. the + // tectonic_cfg_support crate), but vcpkg-rs doesn't use them + // either. + let target = env::var("TARGET").unwrap_or_default(); + + if target == "x86_64-pc-windows-msvc" { + println!("cargo:warning=you may need to export VCPKGRS_TRIPLET=x64-windows-static-release ..."); + println!("cargo:warning=... 
which is a custom triplet used by Tectonic's cargo-vcpkg integration"); + } + } + + panic!("failed to load package {} from vcpkg: {}", dep, e) + } + }; + include_paths.extend(library.include_paths.iter().cloned()); } @@ -206,7 +218,7 @@ impl<'a, T: Spec> Dependency<'a, T> { // graphite2 is not available on Debian. So // let's jump through the hoops of testing // whether the static archive seems findable. - let libname = format!("lib{}.a", libbase); + let libname = format!("lib{libbase}.a"); state .libs .link_paths @@ -216,11 +228,11 @@ impl<'a, T: Spec> Dependency<'a, T> { }; let mode = if do_static { "static=" } else { "" }; - println!("cargo:rustc-link-lib={}{}", mode, libbase); + println!("cargo:rustc-link-lib={mode}{libbase}"); } for fw in &state.libs.frameworks { - println!("cargo:rustc-link-lib=framework={}", fw); + println!("cargo:rustc-link-lib=framework={fw}"); } } else { // Just let pkg-config do its thing. diff --git a/crates/docmodel/CHANGELOG.md b/crates/docmodel/CHANGELOG.md index 2b2ba1a3be..9e7808fa63 100644 --- a/crates/docmodel/CHANGELOG.md +++ b/crates/docmodel/CHANGELOG.md @@ -1,8 +1,32 @@ -# See elsewhere for changelog +# rc: micro bump -This project’s release notes are curated from the Git history of its main -branch. You can find them by looking at [the version of this file on the -`release` branch][branch] or the [GitHub release history][gh-releases]. +- Update the `toml` dependency to the 0.7 series (#1038, @CraftSpider) +- Have `shell_escape_cwd` imply `shell_escape = true` (#966, @pkgw) -[branch]: https://github.com/tectonic-typesetting/tectonic/blob/release/crates/docmodel/CHANGELOG.md -[gh-releases]: https://github.com/tectonic-typesetting/tectonic/releases + +# tectonic_docmodel 0.2.0 (2022-10-03) + +- Define a new TOML item, `shell_escape_cwd`, that can be used to specify the + directory in which shell-escape state should be managed. 
The main expected use + case is to set it to the TeX source directory, to make it possible to work + around limitations in Tectonic’s encapsulated shell-escape support. + + +# tectonic_docmodel 0.1.2 (2022-02-28) + +- Define HTML options for build output (#865, @pkgw) +- Fixes for newer versions of Clippy + + +# tectonic_docmodel 0.1.1 (2021-10-11) + +- Fix the error message given when a "V2" command is run outside of a Tectonic + document workspace (#813, @ralismark) +- Fixes for Clippy >=1.53.0 (@pkgw) + + +# tectonic_docmodel 0.1.0 (2021-06-15) + +This crate isolates the file formats used by the Tectonic “document model”, +primarily `Tectonic.toml`. This makes it possible to interact with these data +formats without needing to link in with the full Tectonic dependency stack. diff --git a/crates/docmodel/Cargo.toml b/crates/docmodel/Cargo.toml index 914015cebf..73a722c4eb 100644 --- a/crates/docmodel/Cargo.toml +++ b/crates/docmodel/Cargo.toml @@ -20,7 +20,7 @@ edition = "2018" [dependencies] serde = { version = "^1.0", features = ["derive"] } tectonic_errors = { path = "../errors", version = "0.0.0-dev.0" } -toml = { version = "^0.5" } +toml = { version = "^0.7" } [package.metadata.internal_dep_versions] tectonic_errors = "5c9ba661edf5ef669f24f9904f99cca369d999e7" diff --git a/crates/docmodel/src/document.rs b/crates/docmodel/src/document.rs index 26959b7983..b1abdf23ba 100644 --- a/crates/docmodel/src/document.rs +++ b/crates/docmodel/src/document.rs @@ -390,6 +390,8 @@ mod syntax { } pub fn to_runtime(&self) -> super::OutputProfile { + let shell_escape_default = self.shell_escape_cwd.is_some(); + super::OutputProfile { name: self.name.clone(), target_type: self.target_type.to_runtime(), @@ -411,7 +413,7 @@ mod syntax { .postamble_file .clone() .unwrap_or_else(|| DEFAULT_POSTAMBLE_FILE.to_owned()), - shell_escape: self.shell_escape.unwrap_or_default(), + shell_escape: self.shell_escape.unwrap_or(shell_escape_default), shell_escape_cwd: 
self.shell_escape_cwd.clone(), } } @@ -469,3 +471,45 @@ mod syntax { } } } + +#[cfg(test)] +mod tests { + use std::io::Cursor; + + use super::*; + + #[test] + fn shell_escape_default_false() { + const TOML: &str = r#" + [doc] + name = "test" + bundle = "na" + + [[output]] + name = "o" + type = "pdf" + "#; + + let mut c = Cursor::new(TOML.as_bytes()); + let doc = Document::new_from_toml(".", ".", &mut c).unwrap(); + assert!(!doc.outputs.get("o").unwrap().shell_escape); + } + + #[test] + fn shell_escape_cwd_implies_shell_escape() { + const TOML: &str = r#" + [doc] + name = "test" + bundle = "na" + + [[output]] + name = "o" + type = "pdf" + shell_escape_cwd = "." + "#; + + let mut c = Cursor::new(TOML.as_bytes()); + let doc = Document::new_from_toml(".", ".", &mut c).unwrap(); + assert!(doc.outputs.get("o").unwrap().shell_escape); + } +} diff --git a/crates/engine_bibtex/CHANGELOG.md b/crates/engine_bibtex/CHANGELOG.md index 659dbb91e9..66d9aee571 100644 --- a/crates/engine_bibtex/CHANGELOG.md +++ b/crates/engine_bibtex/CHANGELOG.md @@ -1,8 +1,43 @@ -# See elsewhere for changelog +# rc: micro bump -This project’s release notes are curated from the Git history of its main -branch. You can find them by looking at [the version of this file on the -`release` branch][branch] or the [GitHub release history][gh-releases]. +- Treat `\r\n` sequences as a single unit (#1037, @CraftSpider). This leads to + more uniform behavior on Windows and non-Windows platforms. -[branch]: https://github.com/tectonic-typesetting/tectonic/blob/release/crates/engine_bibtex/CHANGELOG.md -[gh-releases]: https://github.com/tectonic-typesetting/tectonic/releases + +# tectonic_engine_bibtex 0.1.4 (2022-10-03) + +No code changes, but some internal documentation improvements about managing FFI +APIs. + + +# tectonic_engine_bibtex 0.1.3 (2021-06-17) + +- Switch from running [cbindgen] at build time to having the developer run it + manually. 
This really ought to fix the crate builds on docs.rs ([#788]), and + should speed builds too. + +[cbindgen]: https://github.com/eqrion/cbindgen +[#788]: https://github.com/tectonic-typesetting/tectonic/issues/788 + + +# tectonic_engine_bibtex 0.1.2 (2021-06-17) + +- Attempt to fix crate builds on docs.rs — see [#788]. This works around an + issue in Tectonic’s usage of [cbindgen] by configuring Cargo to operate in + offline mode when building on docs.rs, which builds crates with network access + turned off. + +[#788]: https://github.com/tectonic-typesetting/tectonic/issues/788 +[cbindgen]: https://github.com/eqrion/cbindgen + + +# tectonic_engine_bibtex 0.1.1 (2021-06-04) + +No code changes; the Cargo package didn't publish because I hit the crates.io +rate limit in the previous batch of updates! + + +# tectonic_engine_bibtex 0.1.0 (2021-06-03) + +This crate introduces the `bibtex` engine as a standalone crate, building on +the new "core bridge" functionality. 
diff --git a/crates/engine_bibtex/bibtex/bibtex.c b/crates/engine_bibtex/bibtex/bibtex.c index 77699a9bd0..3c183edddd 100644 --- a/crates/engine_bibtex/bibtex/bibtex.c +++ b/crates/engine_bibtex/bibtex/bibtex.c @@ -529,8 +529,17 @@ input_ln(peekable_input_t *peekable) buffer[last] = peekable_getc(peekable); last++; } + + // For side effects - consume the eoln we saw + int eoln = peekable_getc(peekable); - peekable_getc(peekable); + if (eoln == '\r') { + // Handle \r\n newlines on Windows by trying to consume a \n after a \r, unget if it's not that exact pair + int next = peekable_getc(peekable); + if (next != '\n') { + peekable_ungetc(peekable, next); + } + } while (last > 0) { if (lex_class[buffer[last - 1]] == 1 /*white_space */ ) diff --git a/crates/engine_spx2html/CHANGELOG.md b/crates/engine_spx2html/CHANGELOG.md index aa5684b614..9157ba689a 100644 --- a/crates/engine_spx2html/CHANGELOG.md +++ b/crates/engine_spx2html/CHANGELOG.md @@ -1,4 +1,13 @@ -# rc: micro bump +# rc: minor bump + +- A massive rework to support more sophisticated HTML output for the + [Tectonopedia] project (#1016, @pkgw). This crate is still highly unstable so + we're not going to document them. + +[Tectonopedia]: https://github.com/tectonic-typesetting/tectonopedia + + +# tectonic_engine_spx2html 0.2.1 (2022-10-27) - Avoid a dumb crash when attempting to compile documents that have not been set up for the Tectonic HTML compilation framework (#955, @pkgw). 
Note, diff --git a/crates/engine_spx2html/Cargo.toml b/crates/engine_spx2html/Cargo.toml index 4e830d198d..4e38b1c3b4 100644 --- a/crates/engine_spx2html/Cargo.toml +++ b/crates/engine_spx2html/Cargo.toml @@ -22,6 +22,7 @@ byteorder = "^1.4" html-escape = "^0.2" percent-encoding = "^2.1" pinot = "^0.1.4" +serde = { version = "^1.0", features = ["derive"] } tectonic_bridge_core = { path = "../bridge_core", version = "0.0.0-dev.0" } tectonic_errors = { path = "../errors", version = "0.0.0-dev.0" } tectonic_io_base = { path = "../io_base", version = "0.0.0-dev.0" } @@ -29,6 +30,7 @@ tectonic_status_base = { path = "../status_base", version = "0.0.0-dev.0" } tectonic_xdv = { path = "../xdv", version = "0.0.0-dev.0" } tempfile = "^3.1" tera = "^1.13" +serde_json = "^1.0" [package.metadata.internal_dep_versions] tectonic_bridge_core = "4e16bf963700aae59772a6fb223981ceaa9b5f57" diff --git a/crates/engine_spx2html/src/assets.rs b/crates/engine_spx2html/src/assets.rs new file mode 100644 index 0000000000..edd08ed6e7 --- /dev/null +++ b/crates/engine_spx2html/src/assets.rs @@ -0,0 +1,648 @@ +// Copyright 2022 the Tectonic Project +// Licensed under the MIT License. + +//! Assets generated by a Tectonic HTML build. + +use serde::Serialize; +use std::{ + borrow::Cow, + collections::{hash_map::Iter, HashMap}, + fs::File, + io::{Read, Write}, + path::{Path, PathBuf}, +}; +use tectonic_errors::{anyhow::Context, prelude::*}; +use tectonic_status_base::tt_warning; + +use crate::{fonts::FontEnsemble, specials::Special, Common}; + +/// Runtime state about which non-font assets have been created. +#[derive(Debug, Default)] +pub(crate) struct Assets { + paths: HashMap, +} + +/// Different kinds of non-font assets that can be defined at runtime. +#[derive(Debug)] +enum AssetOrigin { + /// Copy a file from the source stack directly to the output directory. + Copy(String), + + /// Emit a CSS file containing information about the ensemble of fonts + /// that have been used. 
+ FontCss, +} + +impl Assets { + /// Returns true if the special was successfully handled. The false case + /// doesn't distinguish between a special that wasn't relevant, and one that + /// was malformatted or otherwise unparseable. + pub fn try_handle_special(&mut self, special: Special, common: &mut Common) -> bool { + match special { + Special::ProvideFile(spec) => { + let (src_tex_path, dest_path) = match spec.split_once(' ') { + Some(t) => t, + None => { + tt_warning!(common.status, "ignoring malformatted special `{}`", special); + return false; + } + }; + + self.copy_file(src_tex_path, dest_path); + true + } + + Special::ProvideSpecial(spec) => { + let (kind, dest_path) = match spec.split_once(' ') { + Some(t) => t, + None => { + tt_warning!(common.status, "ignoring malformatted special `{}`", special); + return false; + } + }; + + match kind { + "font-css" => { + self.emit_font_css(dest_path); + true + } + _ => { + tt_warning!(common.status, "ignoring unsupported special `{}`", special); + false + } + } + } + + _ => false, + } + } + + fn copy_file(&mut self, src_path: S1, dest_path: S2) { + self.paths.insert( + dest_path.to_string(), + AssetOrigin::Copy(src_path.to_string()), + ); + } + + fn emit_font_css(&mut self, dest_path: S) { + self.paths + .insert(dest_path.to_string(), AssetOrigin::FontCss); + } + + /// This functional must only be called if `common.out_path` is not None. 
+ pub(crate) fn emit(mut self, mut fonts: FontEnsemble, common: &mut Common) -> Result<()> { + let faces = fonts.emit(common.out_base)?; + + for (dest_path, origin) in self.paths.drain() { + match origin { + AssetOrigin::Copy(ref src_path) => emit_copied_file(src_path, &dest_path, common), + AssetOrigin::FontCss => emit_font_css(&dest_path, &faces, common), + }?; + } + + Ok(()) + } + + pub(crate) fn into_serialize(mut self, fonts: FontEnsemble) -> impl Serialize { + let (mut assets, css_data) = fonts.into_serialize(); + + for (dest_path, origin) in self.paths.drain() { + let info = match origin { + AssetOrigin::Copy(src_path) => syntax::AssetOrigin::Copy(src_path), + AssetOrigin::FontCss => syntax::AssetOrigin::FontCss(css_data.clone()), + }; + assets.0.insert(dest_path, info); + } + + assets + } +} + +/// This functional must only be called if `common.out_path` is not None. +fn emit_copied_file(src_tex_path: &str, dest_path: &str, common: &mut Common) -> Result<()> { + let mut ih = atry!( + common.hooks.io().input_open_name(src_tex_path, common.status).must_exist(); + ["unable to open provideFile source `{}`", &src_tex_path] + ); + + { + let (mut out_file, out_path) = create_asset_file(dest_path, common)?; + + atry!( + std::io::copy(&mut ih, &mut out_file); + ["cannot copy to output file `{}`", out_path.display()] + ); + } + + let (name, digest_opt) = ih.into_name_digest(); + common + .hooks + .event_input_closed(name, digest_opt, common.status); + Ok(()) +} + +/// This functional must only be called if `common.out_path` is not None. +fn emit_font_css(dest_path: &str, faces: &str, common: &mut Common) -> Result<()> { + let (mut out_file, out_path) = create_asset_file(dest_path, common)?; + + atry!( + write!(&mut out_file, "{faces}"); + ["cannot write output file `{}`", out_path.display()] + ); + + Ok(()) +} + +/// This functional must only be called if `common.out_path` is not None. 
+fn create_asset_file(dest_path: &str, common: &mut Common) -> Result<(File, PathBuf)> { + let out_path = create_output_path(dest_path, common)?.0.unwrap(); + + let out_file = atry!( + File::create(&out_path); + ["cannot open output file `{}`", out_path.display()] + ); + + Ok((out_file, out_path)) +} + +/// Process a TeX output path into one for the actual filesystem. +/// +/// We have a separate argument `do_create`, rather than just looking at +/// `common.do_not_emit`, since this function is used for assets as well as +/// templated HTML outputs. +pub(crate) fn create_output_path( + dest_path: &str, + common: &mut Common, +) -> Result<(Option, usize)> { + let mut out_path = common.out_base.map(|p| p.to_owned()); + let mut n_levels = 0; + + for piece in dest_path.split('/') { + if let Some(out_path) = out_path.as_mut() { + match std::fs::create_dir(&out_path) { + Ok(_) => {} + Err(e) if e.kind() == std::io::ErrorKind::AlreadyExists => {} + Err(e) => { + return Err(e).context(format!( + "cannot create output parent directory `{}`", + out_path.display() + )); + } + } + } + + if piece.is_empty() { + continue; + } + + if piece == ".." { + bail!( + "illegal provideFile dest path `{}`: it contains a `..` component", + &dest_path + ); + } + + let as_path = Path::new(piece); + + if as_path.is_absolute() || as_path.has_root() { + bail!( + "illegal provideFile path `{}`: it contains an absolute/rooted component", + &dest_path, + ); + } + + if let Some(out_path) = out_path.as_mut() { + out_path.push(piece); + } + + n_levels += 1; + } + + Ok((out_path, n_levels)) +} + +/// Information about assets that have been defined in an SPX-to-HTML run. +#[derive(Clone, Debug, Default)] +pub struct AssetSpecification(syntax::Assets); + +impl AssetSpecification { + /// Update this specification with information from one that's been + /// serialized. 
+ /// + /// It is possible for two specifications to be incompatible, in which case + /// an error will be returned and this object will be left in an undefined + /// state. + pub fn add_from_saved(&mut self, reader: R) -> Result<&mut Self> { + let new: syntax::Assets = atry!( + serde_json::from_reader(reader); + ["failed to deserialize saved specification"] + ); + + // As things are currently structured, we can parse the new entries in + // any order. This is because we assume that the both inputs (self and + // the new one) have internally-consistent cross-referencing, in which + // case their merger must as well. (Here, "cross-referencing" means + // aspects like the font-family information referencing output filenames + // for font-files.) + + use syntax::AssetOrigin as AO; + + for (path, new_origin) in &new.0 { + if let Some(cur_origin) = self.0 .0.get_mut(path) { + match (new_origin, cur_origin) { + (AO::Copy(new_src), AO::Copy(cur_src)) => { + if cur_src != new_src { + bail!( + "disagreeing sources `{}` and `{}` for copied output asset `{}`", + cur_src, + new_src, + path + ); + } + } + + (AO::FontFile(new_ff), AO::FontFile(cur_ff)) => { + if new_ff.source != cur_ff.source { + bail!( + "disagreeing sources `{}` and `{}` for output font asset `{}`", + cur_ff.source, + new_ff.source, + path + ); + } + + if new_ff.face_index != cur_ff.face_index { + bail!( + "disagreeing face indices `{}` and `{}` for output font asset `{}`", + cur_ff.face_index, + new_ff.face_index, + path + ); + } + + // We have two font assets with the same source. We need + // to merge the vglyph information, but otherwise we're + // good! + syntax::merge_vglyphs(&mut cur_ff.vglyphs, &new_ff.vglyphs); + } + + (AO::FontCss(new_fe), AO::FontCss(cur_fe)) => { + // We have two font ensembles. Try merging. 
+ syntax::merge_font_ensembles(&mut cur_fe.0, &new_fe.0)?; + } + + (new2, cur2) => { + bail!( + "disagreeing origin types {} and {} for output asset `{}`", + cur2, + new2, + path + ); + } + } + } else { + // This path is undefined in the current object. Just add it! + self.0 .0.insert(path.clone(), new_origin.clone()); + } + } + + Ok(self) + } + + /// Save this asset specification to a stream. + /// + /// Currently, this is done in a JSON format, but this is not guaranteed to + /// always be the case. The serialization format does not make any effort to + /// provide for backwards or forwards compatibility. The serialized data + /// should be viewed as ephemera that are only guaranteed to remain useful + /// so long as the executing program remains unchanged. + pub fn save(&self, writer: W) -> Result<()> { + serde_json::to_writer_pretty(writer, &self.0).map_err(|e| e.into()) + } + + /// Produce the TeX paths of the output files associated with this + /// specification. + pub fn output_paths(&self) -> impl Iterator> { + AssetOutputsIterator { + iter: self.0 .0.iter(), + cur_vg_path: None, + next_vg_index: 0, + } + } + + /// Check that a set of fonts defined at runtime are a subset of those + /// defined in this specification. + /// + /// This function is used in the "precomputed assets" mode, to make sure + /// that the SPX file doesn't set up any font configuration that we didn't + /// expect. + pub(crate) fn check_runtime_fonts( + &self, + fonts: &mut FontEnsemble, + common: &mut Common, + ) -> Result<()> { + fonts.match_to_precomputed(&self.0, common) + } + + /// Check that the assets defined at runtime are a subset of those defined + /// in this specification, and update them to cover the specification. + /// + /// This function is used in the "precomputed assets" mode, to make sure + /// that the SPX file doesn't try to define anything that we didn't expect. + /// Fonts have already been looked at, so we just need to check output + /// filenames. 
We also need to update the collection of runtime assets so + /// that if we are asked to emit assets, we'll emit *everything*, not just + /// the ones this particular session knows about. + pub(crate) fn check_runtime_assets(&self, assets: &mut Assets) -> Result<()> { + for (path, run_origin) in &assets.paths { + if let Some(pre_origin) = self.0 .0.get(path) { + match (run_origin, pre_origin) { + (AssetOrigin::Copy(run_path), syntax::AssetOrigin::Copy(pre_path)) => { + ensure!( + run_path == pre_path, + "asset `{}` should \ + copy out path `{}`, but in this session the source is `{}`", + path, + pre_path, + run_path + ); + } + + (AssetOrigin::FontCss, syntax::AssetOrigin::FontCss(_)) => {} + + _ => { + bail!( + "this session and the precomputed assets disagree on `{}`", + path + ); + } + } + } else { + bail!( + "this session defines an asset at `{}` that is not in the precomputed bundle", + path + ); + } + } + + // Now update the runtime assets to include all precomputed ones. + + for (path, pre_origin) in &self.0 .0 { + let mapped = match pre_origin { + syntax::AssetOrigin::Copy(pre_path) => AssetOrigin::Copy(pre_path.to_owned()), + syntax::AssetOrigin::FontCss(_) => AssetOrigin::FontCss, + syntax::AssetOrigin::FontFile(_) => continue, + }; + + assets.paths.entry(path.to_owned()).or_insert(mapped); + } + + Ok(()) + } +} + +struct AssetOutputsIterator<'a> { + iter: Iter<'a, String, syntax::AssetOrigin>, + cur_vg_path: Option, + next_vg_index: usize, +} + +impl<'a> Iterator for AssetOutputsIterator<'a> { + type Item = Cow<'a, str>; + + fn next(&mut self) -> Option> { + if let Some(p) = self.cur_vg_path.as_ref() { + let rv = Cow::Owned(format!("vg{}{}", self.next_vg_index, p)); + + if self.next_vg_index == 0 { + self.cur_vg_path = None; + } else { + self.next_vg_index -= 1; + } + + return Some(rv); + } + + self.iter.next().map(|(path, origin)| { + if let syntax::AssetOrigin::FontFile(ref ffi) = origin { + if !ffi.vglyphs.is_empty() { + // If we have moved on to a 
font file with variant glyphs, + // we first (now) yield the unmodified filename, then set up + // to iterate through the `vg` versions. + let mut highest_vg_index = 0; + + for mapping in ffi.vglyphs.values() { + highest_vg_index = std::cmp::max(highest_vg_index, mapping.index); + } + + self.cur_vg_path = Some(path.to_owned()); + self.next_vg_index = highest_vg_index; + } + } + + Cow::Borrowed(path.as_ref()) + }) + } +} + +/// The concrete syntax for saving asset-output state, wired up via serde. +/// +/// The top-level type is Assets. +pub(crate) mod syntax { + use serde::{Deserialize, Serialize, Serializer}; + use std::collections::{BTreeMap, HashMap}; + use tectonic_errors::prelude::*; + + /// Annoyingly we need to wrap this hashmap in a struct because we need to + /// customize the serializer to sort the keys for reproducible outputs. + /// Likewise for all other hashmaps in this module. + #[derive(Clone, Debug, Default, Serialize, Deserialize)] + pub struct Assets(#[serde(serialize_with = "ordered_map")] pub HashMap); + + fn ordered_map( + value: &HashMap, + serializer: S, + ) -> Result + where + S: Serializer, + { + let ordered: BTreeMap<_, _> = value.iter().collect(); + ordered.serialize(serializer) + } + + #[derive(Clone, Debug, Deserialize, Serialize)] + #[serde(tag = "kind")] + pub enum AssetOrigin { + /// Copy a file from the source stack directly to the output directory. + Copy(String), + + /// Emit a CSS file containing information about the ensemble of fonts + /// that have been used. + FontCss(FontEnsembleAssetData), + + /// An OpenType/TrueType font file and variants with customized CMAP tables + /// allowing access to unusual glyphs. 
+ FontFile(FontFileAssetData), + } + + impl std::fmt::Display for AssetOrigin { + fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> Result<(), std::fmt::Error> { + match self { + AssetOrigin::Copy(src) => write!(f, "copy out `{src}`"), + + AssetOrigin::FontCss(fe) => { + let mut first = true; + + write!(f, "CSS for font faces")?; + + for facename in fe.0.keys() { + if first { + write!(f, " ")?; + first = false; + } else { + write!(f, ", ")?; + } + + write!(f, "\"{facename}\"")?; + } + + Ok(()) + } + + AssetOrigin::FontFile(ff) => { + write!(f, "font face #{} from `{}`", ff.face_index, ff.source) + } + } + } + } + + #[derive(Clone, Debug, Default, Deserialize, Serialize)] + pub struct FontFileAssetData { + /// The path to find the font file in the source stack. + pub source: String, + + /// The face index of this font in the source file. + pub face_index: u32, + + /// Variant glyphs that require us to emit variant versions of the font + /// file. + /// + /// Due to limitations of (serde's) JSON serialization, the keys of this + /// dictionary have to be strings, even though we would like them to be + /// GlyphIds. + #[serde(serialize_with = "ordered_map")] + pub vglyphs: HashMap, + } + + /// Merge one table of variant glyph USV mappings into another. + pub(crate) fn merge_vglyphs( + cur: &mut HashMap, + new: &HashMap, + ) { + // First, get the maximum seen index for each USV. + + let mut next_index = HashMap::new(); + + for mapping in cur.values() { + let idx = next_index.entry(mapping.usv).or_default(); + *idx = std::cmp::max(*idx, mapping.index + 1); + } + + // Now add mappings for any new glyphs that we need. + + for (gid, mapping) in new { + // If the glyph is already in the "cur" mapping, great. If not, add + // a new mapping, using the "new" map's suggested USV. 
+ cur.entry(gid.clone()).or_insert_with(|| { + let next_idx = next_index.entry(mapping.usv).or_default(); + let index = *next_idx; + *next_idx = index + 1; + GlyphVariantMapping { + usv: mapping.usv, + index, + } + }); + } + } + + #[derive(Clone, Copy, Debug, Deserialize, Eq, Hash, PartialEq, Serialize)] + pub struct GlyphVariantMapping { + /// The USV that the glyph should be mapped to + pub usv: char, + + /// Which alternative-mapped font to use. These indices start at zero. + pub index: usize, + } + + impl From for GlyphVariantMapping { + fn from(m: crate::fontfile::GlyphVariantMapping) -> Self { + GlyphVariantMapping { + usv: m.usv, + index: m.variant_map_index, + } + } + } + + /// Map from symbolic family name to info about the fonts defining it. + #[derive(Clone, Debug, Default, Deserialize, Serialize)] + pub struct FontEnsembleAssetData( + #[serde(serialize_with = "ordered_map")] pub HashMap, + ); + + /// Merge one font ensemble (table of font-family definitions) into another. + /// This can fail if the tables are not self-consistent. + pub fn merge_font_ensembles( + cur: &mut HashMap, + new: &HashMap, + ) -> Result<()> { + for (name, new_ff) in new { + if let Some(cur_ff) = cur.get_mut(name) { + for (facetype, new_facepath) in &new_ff.faces { + if let Some(cur_facepath) = cur_ff.faces.get(facetype) { + // This facetype is already defined in this family -- + // check that we agree on what font it is. + if cur_facepath != new_facepath { + bail!( + "disagreeing asset paths for font family {}/{:?}: `{}` and `{}`", + name, + facetype, + cur_facepath, + new_facepath + ); + } + } else { + // This facetype is new for this family. + cur_ff.faces.insert(*facetype, new_facepath.clone()); + } + } + } else { + // This family is a new definition. Just copy it. 
+ cur.insert(name.clone(), new_ff.clone()); + } + } + + Ok(()) + } + + #[derive(Clone, Debug, Default, Deserialize, Eq, PartialEq, Serialize)] + pub struct FontFamilyAssetData { + /// Map from face type to the output path of the font file providing it. + #[serde(serialize_with = "ordered_map")] + pub faces: HashMap, + } + + #[derive(Clone, Copy, Debug, Deserialize, Eq, Hash, Ord, PartialEq, PartialOrd, Serialize)] + pub enum FaceType { + /// The regular (upright) font of a font family. + Regular, + + /// The bold font of a family. + Bold, + + /// The italic font of a family. + Italic, + + /// The bold-italic font a current family. + BoldItalic, + } +} diff --git a/crates/engine_spx2html/src/emission.rs b/crates/engine_spx2html/src/emission.rs new file mode 100644 index 0000000000..19ff5075bd --- /dev/null +++ b/crates/engine_spx2html/src/emission.rs @@ -0,0 +1,1214 @@ +// Copyright 2018-2022 the Tectonic Project +// Licensed under the MIT License. + +//! The main "emission" phase of SPX to HTML processing. 
+ +use std::{ + collections::HashMap, + fmt::{Arguments, Error as FmtError, Write as FmtWrite}, + result::Result as StdResult, +}; +use tectonic_errors::prelude::*; +use tectonic_status_base::tt_warning; + +use crate::{ + assets::Assets, + finalization::FinalizingState, + fonts::{FamilyRelativeFontId, FontEnsemble, FontFamilyAnalysis, PathToNewFont}, + html::Element, + specials::Special, + templating::Templating, + Common, FixedPoint, TexFontNum, +}; + +#[derive(Debug)] +pub(crate) struct EmittingState { + fonts: FontEnsemble, + content: ContentState, + templating: Templating, + assets: Assets, + tag_associations: HashMap, + rems_per_tex: f32, + elem_stack: Vec, + current_canvas: Option, +} + +#[derive(Debug, Default)] +struct ContentState { + current_content: String, + last_content_x: i32, + last_content_space_width: Option, +} + +impl ContentState { + fn is_empty(&self) -> bool { + self.current_content.is_empty() + } + + fn push_str(&mut self, text: &str) { + self.current_content.push_str(text); + } + + fn push_char(&mut self, ch: char) { + self.current_content.push(ch); + } + + fn push_close_tag(&mut self, tag: &str) { + self.current_content.push('<'); + self.current_content.push('/'); + self.current_content.push_str(tag); + self.current_content.push('>'); + } + + fn push_with_html_escaping>(&mut self, raw_text: S) { + html_escape::encode_safe_to_string(raw_text, &mut self.current_content); + } + + fn push_with_html_double_quoted_attribute_escaping>(&mut self, raw_text: S) { + html_escape::encode_double_quoted_attribute_to_string(raw_text, &mut self.current_content); + } + + fn push_with_html_unquoted_attribute_escaping>(&mut self, raw_text: S) { + html_escape::encode_unquoted_attribute_to_string(raw_text, &mut self.current_content); + } + + fn take(&mut self) -> String { + std::mem::take(&mut self.current_content) + } + + /// Figure out if we need to push a space into the text content right now. 
+ fn is_space_needed( + &self, + x0: i32, + cur_space_width: Option, + do_auto_spaces: bool, + ) -> bool { + // We never want a leading space. + if self.current_content.is_empty() { + return false; + } + + // Auto-spaces can be disabled. + if !do_auto_spaces { + return false; + } + + // TODO: RTL ASSUMPTION!!!!! + // + // If the "next" x is smaller than the last one, assume that we've + // started a new line. We ignore Y values since those are going to + // get hairy with subscripts, etc. + + if x0 < self.last_content_x { + return true; + } + + // Check the advance against the size of the space, which can be + // determined from either the most recent content or the new content, + // since in various circumstances either one or the other might not + // be defined. If both are defined, use whatever's smaller. There's + // probably a smoother way to do this logic? + + let space_width = match (&self.last_content_space_width, &cur_space_width) { + (Some(w1), Some(w2)) => FixedPoint::min(*w1, *w2), + (Some(w), None) => *w, + (None, Some(w)) => *w, + (None, None) => 0, + }; + + // If the x difference is larger than 1/4 of the space_width, let's say that + // we need a space. I made up the 1/4. + 4 * (x0 - self.last_content_x) > space_width + } + + fn update_content_pos(&mut self, x: i32, cur_space_width: Option) { + self.last_content_x = x; + + if cur_space_width.is_some() { + self.last_content_space_width = cur_space_width; + } + } + + /// Maybe push a space into the text content right now, if we think we need one. 
+    fn push_space_if_needed(
+        &mut self,
+        x0: i32,
+        cur_space_width: Option<FixedPoint>,
+        do_auto_spaces: bool,
+    ) {
+        if self.is_space_needed(x0, cur_space_width, do_auto_spaces) {
+            self.current_content.push(' ');
+        }
+
+        // This parameter should be updated almost-instantaneously
+        // if a run of glyphs is being rendered, but this is a good start:
+        self.update_content_pos(x0, cur_space_width);
+    }
+}
+
+impl FmtWrite for ContentState {
+    fn write_str(&mut self, s: &str) -> StdResult<(), FmtError> {
+        self.current_content.write_str(s)
+    }
+
+    fn write_char(&mut self, c: char) -> StdResult<(), FmtError> {
+        self.current_content.write_char(c)
+    }
+
+    fn write_fmt(&mut self, args: Arguments<'_>) -> StdResult<(), FmtError> {
+        self.current_content.write_fmt(args)
+    }
+}
+
+#[derive(Debug)]
+pub(crate) struct ElementState {
+    /// The associated HTML element. This is None for the bottom item in the
+    /// stack, or for changes in state that are not associated with actual HTML
+    /// tags.
+    elem: Option<Element>,
+
+    /// The origin of this element/state-change.
+    origin: ElementOrigin,
+
+    /// Whether HTML tags that are automatically generated by the TeX
+    /// engine, such as <div> and </div> at the start and end of paragraphs,
+    /// should be emitted (true) or ignored (false).
+    do_auto_tags: bool,
+
+    /// Whether this library should automatically insert spaces into text
+    /// content. This is done by looking at the horizontal positions of
+    /// different runs of text and applying a threshold for the amount of space
+    /// between the end of the previous one and the start of the next one.
+    do_auto_spaces: bool,
+
+    /// The font-num of the regular font associated with the current font
+    /// family. This code is currently only exercised with a single "font
+    /// family" defined in a document, but there could be multiple.
+    font_family_id: TexFontNum,
+
+    /// The currently active font, as we understand it, relative to the
+    /// currently active font family.
+    active_font: FamilyRelativeFontId,
+}
+
+impl ElementState {
+    /// Should this element automatically be closed if a new tag starts or ends?
+    fn is_auto_close(&self) -> bool {
+        matches!(self.origin, ElementOrigin::FontAuto)
+    }
+}
+
+/// How a particular ElementState ended up on the stack.
+#[derive(Clone, Copy, Debug, Eq, PartialEq)]
+pub(crate) enum ElementOrigin {
+    /// This is the root element in our stack.
+    Root,
+
+    /// The element was manually inserted by the TeX code.
+    Manual,
+
+    /// The element was automatically inserted by the TeX engine.
+    EngineAuto,
+
+    /// The element was automatically inserted by us to
+    /// activate the desired font.
+    FontAuto,
+}
+
+#[derive(Debug)]
+struct CanvasState {
+    kind: String,
+    depth: usize,
+    x0: i32,
+    y0: i32,
+    glyphs: Vec<GlyphInfo>,
+    rules: Vec<RuleInfo>,
+}
+
+impl CanvasState {
+    fn new(kind: &str, x0: i32, y0: i32) -> Self {
+        CanvasState {
+            kind: kind.to_owned(),
+            depth: 1,
+            x0,
+            y0,
+            glyphs: Vec::new(),
+            rules: Vec::new(),
+        }
+    }
+}
+
+#[derive(Debug)]
+struct GlyphInfo {
+    dx: i32,
+    dy: i32,
+    font_num: TexFontNum,
+    glyph: u16,
+}
+
+#[derive(Debug)]
+struct RuleInfo {
+    dx: i32,
+    dy: i32,
+    width: i32,
+    height: i32,
+}
+
+impl EmittingState {
+    pub(crate) fn new_from_init(
+        fonts: FontEnsemble,
+        main_body_font_num: Option<TexFontNum>,
+        templating: Templating,
+        tag_associations: HashMap<Element, TexFontNum>,
+    ) -> Result<Self> {
+        let rems_per_tex = 1.0
+            / main_body_font_num
+                .map(|fnum| fonts.get_font_size(fnum))
+                .unwrap_or(65536) as f32;
+
+        Ok(EmittingState {
+            templating,
+            fonts,
+            tag_associations,
+            rems_per_tex,
+            content: Default::default(),
+            assets: Default::default(),
+            elem_stack: vec![ElementState {
+                elem: None,
+                origin: ElementOrigin::Root,
+                do_auto_tags: true,
+                do_auto_spaces: true,
+                font_family_id: main_body_font_num.unwrap_or_default(),
+                active_font: FamilyRelativeFontId::Regular,
+            }],
+            current_canvas: None,
+        })
+    }
+
+    /// Convenience helper that applies the right defaults here.
+    ///
+    /// We can't always use this function because sometimes we need mutable
+    /// access to the `fonts` and `content` items separately.
+    fn push_space_if_needed(&mut self, x0: i32, fnum: Option<TexFontNum>) {
+        let cur_space_width = self.fonts.maybe_get_font_space_width(fnum);
+        self.content
+            .push_space_if_needed(x0, cur_space_width, self.cur_elstate().do_auto_spaces);
+    }
+
+    fn create_elem(&self, name: &str, is_start: bool, common: &mut Common) -> Element {
+        // Parsing can never fail since we offer an `Other` element type
+        let el: Element = name.parse().unwrap();
+
+        if el.is_deprecated() {
+            tt_warning!(
+                common.status,
+                "HTML element `{}` is deprecated; templates should be updated to avoid it",
+                name
+            );
+        }
+
+        if is_start && el.is_empty() {
+            tt_warning!(
+                common.status,
+                "HTML element `{}` is an empty element; insert it with `tdux:mfe`, not as a start-tag",
+                name
+            );
+        }
+
+        if let Some(cur) = self.cur_elstate().elem.as_ref() {
+            if cur.is_autoclosed_by(&el) {
+                tt_warning!(
+                    common.status,
+                    "currently open HTML element `{}` will be implicitly closed by new \
+                     element `{}`; explicit closing tags are strongly encouraged",
+                    cur.name(),
+                    name
+                );
+            }
+        }
+
+        el
+    }
+
+    #[inline(always)]
+    fn cur_elstate(&self) -> &ElementState {
+        self.elem_stack.last().unwrap()
+    }
+
+    /// Close the topmost element in the stack.
+    fn close_one(&mut self) {
+        // Refuse to close the root element
+        if self.elem_stack.len() > 1 {
+            let cur = self.elem_stack.pop().unwrap();
+
+            if let Some(e) = cur.elem.as_ref() {
+                self.content.push_close_tag(e.name());
+            }
+        }
+    }
+
+    /// Close any auto-close elements that are currently at the top of the stack.
+    /// These elements are things like <b> tags that were automatically
+    /// generated upon detecting the use of the bold font face.
+    fn close_automatics(&mut self) {
+        while self.elem_stack.len() > 1 {
+            let close_it = self.cur_elstate().is_auto_close();
+
+            if close_it {
+                self.close_one();
+            } else {
+                break;
+            }
+        }
+    }
+
+    fn push_elem(&mut self, el: Element, origin: ElementOrigin) {
+        self.close_automatics();
+
+        let new_item = {
+            let cur = self.cur_elstate();
+
+            let font_family_id = self
+                .tag_associations
+                .get(&el)
+                .copied()
+                .unwrap_or(cur.font_family_id);
+
+            ElementState {
+                elem: Some(el),
+                origin,
+                font_family_id,
+                ..*cur
+            }
+        };
+
+        self.elem_stack.push(new_item);
+    }
+
+    /// TODO: may need to hone semantics when element nesting isn't as expected.
+    fn pop_elem(&mut self, name: &str, common: &mut Common) {
+        self.close_automatics();
+
+        let mut n_closed = 0;
+
+        while self.elem_stack.len() > 1 {
+            let cur = self.elem_stack.pop().unwrap();
+
+            if let Some(e) = cur.elem.as_ref() {
+                self.content.push_close_tag(e.name());
+                n_closed += 1;
+
+                if e.name() == name {
+                    break;
+                }
+            }
+        }
+
+        if n_closed != 1 {
+            tt_warning!(
+                common.status,
+                "imbalanced tags; had to close {} to find `{}`",
+                n_closed,
+                name
+            );
+        }
+    }
+
+    pub(crate) fn handle_special(
+        &mut self,
+        x: i32,
+        y: i32,
+        special: Special<'_>,
+        common: &mut Common,
+    ) -> Result<()> {
+        match special {
+            Special::AutoStartParagraph => {
+                if self.cur_elstate().do_auto_tags {
+                    // Why are we using <div>s instead of <p>? As the HTML spec
+                    // emphasizes, <p> tags are structural, not semantic. You cannot
+                    // put tags like <div> or <ul> inside <p> -- they automatically
+                    // close the paragraph. This does not align with TeX's idea of a
+                    // paragraph, and there's no upside to trying to use <p>'s -- as
+                    // the spec notes, the <p> tag does not activate any important
+                    // semantics itself. The HTML spec explicitly recommends that
+                    // you can use <div> elements to group logical paragraphs. So
+                    // that's what we do.
+                    let el = self.create_elem("div", true, common);
+                    self.push_space_if_needed(x, None);
+                    self.content.push_str("<div class=\"tdux-p\">");
+                    self.push_elem(el, ElementOrigin::EngineAuto);
+                }
+                Ok(())
+            }
+
+            Special::AutoEndParagraph => {
+                if self.cur_elstate().do_auto_tags {
+                    self.pop_elem("div", common);
+                }
+                Ok(())
+            }
+
+            Special::CanvasStart(kind) => {
+                if let Some(canvas) = self.current_canvas.as_mut() {
+                    canvas.depth += 1;
+                } else {
+                    self.current_canvas = Some(CanvasState::new(kind, x, y));
+                }
+                Ok(())
+            }
+
+            Special::CanvasEnd(kind) => {
+                if let Some(canvas) = self.current_canvas.as_mut() {
+                    canvas.depth -= 1;
+                    if canvas.depth == 0 {
+                        self.handle_end_canvas(common)?;
+                    }
+                } else {
+                    tt_warning!(
+                        common.status,
+                        "ignoring unpaired tdux:c[anvas]e[nd] special for `{}`",
+                        kind
+                    );
+                }
+                Ok(())
+            }
+
+            Special::ManualFlexibleStart(spec) => {
+                self.handle_flexible_start_tag(x, y, spec, common)
+            }
+
+            Special::ManualEnd(tag) => {
+                self.pop_elem(tag, common);
+                Ok(())
+            }
+
+            Special::DirectText(text) => {
+                self.content.push_with_html_escaping(text);
+                Ok(())
+            }
+
+            Special::Emit => self.finish_file(common),
+
+            Special::SetTemplate(path) => {
+                self.templating.handle_set_template(path);
+                Ok(())
+            }
+
+            Special::SetOutputPath(path) => {
+                self.templating.handle_set_output_path(path);
+                Ok(())
+            }
+
+            Special::SetTemplateVariable(spec) => {
+                self.templating.handle_set_template_variable(spec, common)
+            }
+
+            Special::ProvideFile(_) | Special::ProvideSpecial(_) => {
+                self.assets.try_handle_special(special, common);
+                Ok(())
+            }
+
+            other => {
+                tt_warning!(common.status, "ignoring unrecognized special: {}", other);
+                Ok(())
+            }
+        }
+    }
+
+    /// Handle a "flexible" start tag.
+    ///
+    /// These start tags are built with a line-oriented structure that aims to
+    /// make it so that the TeX code doesn't have to worry too much about
+    /// escaping, etc.
The general format is: + /// + /// ```notest + /// \special{tdux:mfs tagname + /// Cclass % add a CSS class + /// Sname value % Add a CSS setting in the style attr + /// Uname value % Add an unquoted attribute + /// Dname value % Add a double-quoted attribute + /// NAS % Turn off automatic space insertion while processing this tag + /// NAT % Turn off automatic tag insertion while processing this tag + /// } + /// ``` + /// + /// More ... + fn handle_flexible_start_tag( + &mut self, + x: i32, + _y: i32, + remainder: &str, + common: &mut Common, + ) -> Result<()> { + let mut lines = remainder.lines(); + + let tagname = match lines.next() { + Some(t) => t, + None => { + tt_warning!( + common.status, + "ignoring TDUX flexible start tag -- no tag name: {:?}", + remainder + ); + return Ok(()); + } + }; + + if !tagname.chars().all(char::is_alphanumeric) { + tt_warning!( + common.status, + "ignoring TDUX flexible start tag -- invalid tag name: {:?}", + remainder + ); + return Ok(()); + } + + let el = self.create_elem(tagname, true, common); + + let mut elstate = { + let cur = self.cur_elstate(); + + let font_family_id = self + .tag_associations + .get(&el) + .copied() + .unwrap_or(cur.font_family_id); + + ElementState { + elem: Some(el), + origin: ElementOrigin::Manual, + font_family_id, + ..*cur + } + }; + + let mut classes = Vec::new(); + let mut styles = Vec::new(); + let mut unquoted_attrs = Vec::new(); + let mut double_quoted_attrs = Vec::new(); + + for line in lines { + if let Some(cls) = line.strip_prefix('C') { + // For later: apply any restrictions to allowed class names? + if !cls.is_empty() { + classes.push(cls.to_owned()); + } else { + tt_warning!( + common.status, + "ignoring TDUX flexible start tag class -- invalid name: {:?}", + cls + ); + } + } else if let Some(rest) = line.strip_prefix('S') { + // For later: apply any restrictions to names/values here? 
+ let mut bits = rest.splitn(2, ' '); + let name = match bits.next() { + Some(n) => n, + None => { + tt_warning!( + common.status, + "ignoring TDUX flexible start tag style -- no name: {:?}", + rest + ); + continue; + } + }; + let value = match bits.next() { + Some(v) => v, + None => { + tt_warning!( + common.status, + "ignoring TDUX flexible start tag style -- no value: {:?}", + rest + ); + continue; + } + }; + styles.push((name.to_owned(), value.to_owned())); + } else if let Some(rest) = line.strip_prefix('U') { + // For later: apply any restrictions to names/values here? + let mut bits = rest.splitn(2, ' '); + let name = match bits.next() { + Some("class") | Some("style") => { + tt_warning!( + common.status, + "ignoring TDUX flexible start tag attr -- use C/S command: {:?}", + rest + ); + continue; + } + Some(n) => n, + None => { + tt_warning!( + common.status, + "ignoring TDUX flexible start tag attr -- no name: {:?}", + rest + ); + continue; + } + }; + unquoted_attrs.push((name.to_owned(), bits.next().map(|v| v.to_owned()))); + } else if let Some(rest) = line.strip_prefix('D') { + // For later: apply any restrictions to names/values here? 
+ let mut bits = rest.splitn(2, ' '); + let name = match bits.next() { + Some("class") | Some("style") => { + tt_warning!( + common.status, + "ignoring TDUX flexible start tag attr -- use C/S command: {:?}", + rest + ); + continue; + } + Some(n) => n, + None => { + tt_warning!( + common.status, + "ignoring TDUX flexible start tag attr -- no name: {:?}", + rest + ); + continue; + } + }; + double_quoted_attrs.push((name.to_owned(), bits.next().map(|v| v.to_owned()))); + } else if line == "NAS" { + elstate.do_auto_spaces = false; + } else if line == "NAT" { + elstate.do_auto_tags = false; + } else { + tt_warning!( + common.status, + "ignoring unrecognized TDUX flexible start tag command: {:?}", + line + ); + } + } + + self.push_space_if_needed(x, None); + self.content.push_char('<'); + self.content.push_with_html_escaping(tagname); + + if !classes.is_empty() { + self.content.push_str(" class=\""); + + let mut first = true; + for c in &classes { + if first { + first = false; + } else { + self.content.push_char(' '); + } + + self.content + .push_with_html_double_quoted_attribute_escaping(c); + } + + self.content.push_char('\"'); + } + + if !styles.is_empty() { + self.content.push_str(" style=\""); + + let mut first = true; + for (name, value) in &styles { + if first { + first = false; + } else { + self.content.push_char(';'); + } + + self.content + .push_with_html_double_quoted_attribute_escaping(name); + self.content.push_char(':'); + self.content + .push_with_html_double_quoted_attribute_escaping(value); + } + + self.content.push_char('\"'); + } + + for (name, maybe_value) in &unquoted_attrs { + self.content.push_char(' '); + self.content.push_with_html_escaping(name); + + if let Some(v) = maybe_value { + self.content.push_char('='); + self.content.push_with_html_unquoted_attribute_escaping(v); + } + } + + for (name, maybe_value) in &double_quoted_attrs { + self.content.push_char(' '); + self.content.push_with_html_escaping(name); + self.content.push_str("=\""); + + 
if let Some(v) = maybe_value { + self.content + .push_with_html_double_quoted_attribute_escaping(v); + } + + self.content.push_char('\"'); + } + + self.content.push_char('>'); + self.elem_stack.push(elstate); + Ok(()) + } + + pub(crate) fn handle_text_and_glyphs( + &mut self, + font_num: TexFontNum, + text: &str, + glyphs: &[u16], + xs: &[i32], + ys: &[i32], + common: &mut Common, + ) -> Result<()> { + if let Some(c) = self.current_canvas.as_mut() { + for i in 0..glyphs.len() { + c.glyphs.push(GlyphInfo { + dx: xs[i] - c.x0, + dy: ys[i] - c.y0, + glyph: glyphs[i], + font_num, + }); + } + } else if !glyphs.is_empty() { + self.set_up_for_font(xs[0], font_num, common); + self.push_space_if_needed(xs[0], Some(font_num)); + self.content.push_with_html_escaping(text); + + // To figure out when we need spaces, we need to care about the last + // glyph's actual width (well, its advance). + // + // TODO: RTL correctness!!!! + + let idx = glyphs.len() - 1; + let gm = atry!( + self.fonts.get_glyph_metrics(font_num, glyphs[idx]); + ["undeclared font {} in canvas", font_num] + ); + let advance = match gm { + Some(gm) => gm.advance, + None => 0, + }; + + let cur_space_width = self.fonts.maybe_get_font_space_width(Some(font_num)); + self.content + .update_content_pos(xs[idx] + advance, cur_space_width); + } + + Ok(()) + } + + pub(crate) fn handle_glyph_run( + &mut self, + font_num: TexFontNum, + glyphs: &[u16], + xs: &[i32], + ys: &[i32], + common: &mut Common, + ) -> Result<()> { + if let Some(c) = self.current_canvas.as_mut() { + for i in 0..glyphs.len() { + c.glyphs.push(GlyphInfo { + dx: xs[i] - c.x0, + dy: ys[i] - c.y0, + glyph: glyphs[i], + font_num, + }); + } + } else { + let cur_space_width = self.fonts.maybe_get_font_space_width(Some(font_num)); + let do_auto_spaces = self.cur_elstate().do_auto_spaces; + let mut ch_str_buf = [0u8; 4]; + + // Ideally, the vast majority of the time we are using + // handle_text_and_glyphs and not this function, outside of + // canvases. 
But sometimes we get spare glyphs outside of the canvas
+            // context. We can use our glyph-mapping infrastructure to try to
+            // translate them to Unicode, hoping for the best that the naive
+            // inversion suffices.
+
+            self.set_up_for_font(xs[0], font_num, common);
+
+            let fonts = &mut self.fonts;
+
+            let iter = atry!(
+                fonts.process_glyphs_as_text(font_num, glyphs, common.status);
+                ["undeclared font {} in glyph run", font_num]
+            );
+
+            for (idx, text_info, advance) in iter {
+                if let Some((ch, font_sel)) = text_info {
+                    let ch_as_str = ch.encode_utf8(&mut ch_str_buf);
+
+                    // XXX this is (part of) push_space_if_needed
+                    if self
+                        .content
+                        .is_space_needed(xs[idx], cur_space_width, do_auto_spaces)
+                    {
+                        self.content.push_char(' ');
+                    }
+
+                    write!(self.content, "<span style=\"{font_sel}\">").unwrap();
+                    self.content.push_with_html_escaping(ch_as_str);
+                    write!(self.content, "</span>").unwrap();
+                }
+
+                self.content
+                    .update_content_pos(xs[idx] + advance, cur_space_width);
+            }
+        }
+
+        Ok(())
+    }
+
+    fn set_up_for_font(&mut self, x0: i32, fnum: TexFontNum, common: &mut Common) {
+        let (cur_ffid, cur_af, cur_is_autofont) = {
+            let cur = self.cur_elstate();
+            (
+                cur.font_family_id,
+                cur.active_font,
+                cur.origin == ElementOrigin::FontAuto,
+            )
+        };
+
+        let (path, desired_af) = match self.fonts.analyze_font_for_family(fnum, cur_ffid, cur_af) {
+            FontFamilyAnalysis::AlreadyActive => return,
+            FontFamilyAnalysis::Reachable(p, d) => (p, d),
+            FontFamilyAnalysis::NoMatch(fid) => {
+                // We don't seem to be in a defined "family". So we have to
+                // select it explicitly.
+                let path = PathToNewFont {
+                    close_all: true,
+                    select_explicitly: true,
+                    ..Default::default()
+                };
+
+                let desired_af = FamilyRelativeFontId::Other(fid);
+                (path, desired_af)
+            }
+
+            FontFamilyAnalysis::Unrecognized => {
+                tt_warning!(common.status, "undeclared font number {}", fnum);
+                return;
+            }
+        };
+
+        if path.close_one_and_retry {
+            if cur_is_autofont {
+                self.close_one();
+                return self.set_up_for_font(x0, fnum, common);
+            } else {
+                // This is a logic error in our implementation -- this
+                // should never happen.
+                tt_warning!(
+                    common.status,
+                    "font selection failed (ffid={}, active={:?}, desired={})",
+                    cur_ffid,
+                    cur_af,
+                    fnum
+                );
+                return;
+            }
+        }
+
+        if path.close_all {
+            self.close_automatics();
+        }
+
+        if let Some(af) = path.open_b {
+            self.push_space_if_needed(x0, Some(fnum));
+            self.content.push_str("<b>");
+            self.elem_stack.push(ElementState {
+                elem: Some(Element::B),
+                origin: ElementOrigin::FontAuto,
+                active_font: af,
+                ..*self.cur_elstate()
+            });
+        }
+
+        if let Some(af) = path.open_i {
+            self.push_space_if_needed(x0, Some(fnum));
+            self.content.push_str("<i>");
+            self.elem_stack.push(ElementState {
+                elem: Some(Element::I),
+                origin: ElementOrigin::FontAuto,
+                active_font: af,
+                ..*self.cur_elstate()
+            });
+        }
+
+        if path.select_explicitly {
+            self.push_space_if_needed(x0, Some(fnum));
+            self.fonts
+                .write_styling_span_html(fnum, self.rems_per_tex, &mut self.content)
+                .unwrap();
+            self.elem_stack.push(ElementState {
+                elem: Some(Element::Span),
+                origin: ElementOrigin::FontAuto,
+                active_font: desired_af,
+                ..*self.cur_elstate()
+            });
+        }
+    }
+
+    pub(crate) fn handle_rule(
+        &mut self,
+        x: i32,
+        y: i32,
+        height: i32,
+        width: i32,
+        common: &mut Common,
+    ) -> Result<()> {
+        // Rules with non-positive dimensions are not drawn. Perhaps there are
+        // tricky cases where they should affect the computation of the canvas
+        // bounding box, but my first guess is that it is fine to just ignore
+        // them completely.
+ if width < 1 || height < 1 { + return Ok(()); + } + + if let Some(c) = self.current_canvas.as_mut() { + c.rules.push(RuleInfo { + dx: x - c.x0, + dy: y - c.y0, + width, + height, + }); + } else { + // Not sure what to do here. Does this even happen in + // non-pathological cases? + tt_warning!( + common.status, + "ignoring rule outside of Tectonic HTML canvas" + ); + } + + Ok(()) + } + + fn handle_end_canvas(&mut self, common: &mut Common) -> Result<()> { + let mut canvas = self.current_canvas.take().unwrap(); + + // This is the *end* of a canvas, but we haven't pushed anything into + // the content since whatever started the canvas, so we need this: + self.push_space_if_needed(canvas.x0, None); + + let inline = match canvas.kind.as_ref() { + "math" => true, + "dmath" => false, + _ => false, + }; + + // First pass: get overall bounds of all the glyphs (from their metrics) + // and rules. We need to gather this information first because as we + // emit glyphs we have to specify their positions relative to the edges + // of the containing canvas box, and the size of that box is defined by + // the extents of all of the glyphs it contains. The bounds are measured + // in TeX units. + + let mut first = true; + let mut x_min_tex = 0; + let mut x_max_tex = 0; + let mut y_min_tex = 0; + let mut y_max_tex = 0; + + for gi in &canvas.glyphs[..] 
{ + let gm = atry!( + self.fonts.get_glyph_metrics(gi.font_num, gi.glyph); + ["undeclared font {} in canvas", gi.font_num] + ); + + if let Some(gm) = gm { + // to check: RTL correctness + let xmin = gi.dx - gm.lsb; + let xmax = gi.dx + gm.advance; + let ymin = gi.dy - gm.ascent; + let ymax = gi.dy - gm.descent; // note: descent is negative + + if first { + x_min_tex = xmin; + x_max_tex = xmax; + y_min_tex = ymin; + y_max_tex = ymax; + first = false; + } else { + x_min_tex = std::cmp::min(x_min_tex, xmin); + x_max_tex = std::cmp::max(x_max_tex, xmax); + y_min_tex = std::cmp::min(y_min_tex, ymin); + y_max_tex = std::cmp::max(y_max_tex, ymax); + } + } + } + + for ri in &canvas.rules[..] { + // The canvas is constructed so that we know `width` and `height` + // are positive. + + let xmin = ri.dx; + let xmax = ri.dx + ri.width; + let ymin = ri.dy; + let ymax = ri.dy + ri.height; + + if first { + x_min_tex = xmin; + x_max_tex = xmax; + y_min_tex = ymin; + y_max_tex = ymax; + first = false; + } else { + x_min_tex = std::cmp::min(x_min_tex, xmin); + x_max_tex = std::cmp::max(x_max_tex, xmax); + y_min_tex = std::cmp::min(y_min_tex, ymin); + y_max_tex = std::cmp::max(y_max_tex, ymax); + } + } + + // Now that we have that information, we can lay out the individual + // glyphs. + // + // A resource I found very helpful: + // https://iamvdo.me/en/blog/css-font-metrics-line-height-and-vertical-align + + let mut inner_content = String::default(); + let mut ch_str_buf = [0u8; 4]; + + for gi in canvas.glyphs.drain(..) { + let (text_info, size, baseline_factor) = + self.fonts + .process_glyph_for_canvas(gi.font_num, gi.glyph, common.status); + + // The size of the font being used for this glyph, in rems; that is, + // relative to the main body font. + let rel_size = size as f32 * self.rems_per_tex; + + if let Some((ch, font_sel)) = text_info { + // dy gives the target position of this glyph's baseline + // relative to the canvas's baseline. 
For our `position:
+                // absolute` layout, we have to convert that into the distance
+                // between the top of this glyph's box and the top of the
+                // overall canvas box (or bottom/bottom).
+                //
+                // In order to do this, we need to know the size of this glyph's
+                // box according to CSS, and the position of the glyph's
+                // baseline within that box.
+                //
+                // The baseline position is straightforward: it is given by what
+                // we call the font's "baseline factor". This is true no matter
+                // the specific size of the CSS box relative to the font
+                // rendering size, due to the way in which the drawn glyph is
+                // centered vertically within its CSS box.
+                //
+                // The CSS glyph box height can be funky: it depends on the
+                // font-size setting, font metrics (not just ascender/descender
+                // but "line gap") and `line-height` setting in "exciting" ways.
+                // One convenient approach is to set `line-height: 1` in the
+                // container, in which case the box height is the `font-size`
+                // setting.
+
+                let top_rem =
+                    (-y_min_tex + gi.dy) as f32 * self.rems_per_tex - baseline_factor * rel_size;
+
+                // Stringify the character so that we can use html_escape in
+                // case it's a `<` or whatever.
+                let ch_as_str = ch.encode_utf8(&mut ch_str_buf);
+
+                write!(
+                    inner_content,
+                    "<span class=\"ci\" style=\"top: {}rem; left: {}rem; font-size: {}rem; {}\">",
+                    top_rem,
+                    gi.dx as f32 * self.rems_per_tex,
+                    rel_size,
+                    font_sel,
+                )
+                .unwrap();
+                html_escape::encode_text_to_string(ch_as_str, &mut inner_content);
+                write!(inner_content, "</span>").unwrap();
+            }
+        }
+
+        // Now do the rules. (At the moment, I can't think of any reason to
+        // favor a particular ordering of glyphs and rules within a given
+        // canvas?)
+
+        for ri in canvas.rules.drain(..) {
+            let rel_width = ri.width as f32 * self.rems_per_tex;
+            let rel_height = ri.height as f32 * self.rems_per_tex;
+
+            let left_rem = ri.dx as f32 * self.rems_per_tex;
+            let top_rem = (-y_min_tex + ri.dy) as f32 * self.rems_per_tex - rel_height;
+
+            write!(
+                inner_content,
+                "<span class=\"cr\" style=\"top: {top_rem}rem; left: {left_rem}rem; width: {rel_width}rem; height: {rel_height}rem\"></span>",
+            )
+            .unwrap();
+        }
+
+        // Wrap it up.
+
+        let (element, layout_class, valign) = if inline {
+            // A numerical vertical-align setting positions the bottom edge of
+            // this block relative to the containing line's baseline. This is
+            // the best (only?) way to make sure that this block's baseline
+            // lines up with that of its container.
+            (
+                "span",
+                "canvas-inline",
+                format!(
+                    "; vertical-align: {}rem",
+                    -y_max_tex as f32 * self.rems_per_tex
+                ),
+            )
+        } else {
+            ("div", "canvas-block", "".to_owned())
+        };
+
+        let element = self.create_elem(element, true, common);
+
+        write!(
+            self.content,
+            "<{} class=\"canvas {}\" style=\"width: {}rem; height: {}rem; padding-left: {}rem{}\">",
+            element.name(),
+            layout_class,
+            (x_max_tex - x_min_tex) as f32 * self.rems_per_tex,
+            (y_max_tex - y_min_tex) as f32 * self.rems_per_tex,
+            -x_min_tex as f32 * self.rems_per_tex,
+            valign,
+        )
+        .unwrap();
+        self.content.push_str(&inner_content);
+        write!(self.content, "</{}>", element.name()).unwrap();
+        let cur_space_width = self.fonts.maybe_get_font_space_width(None);
+        self.content
+            .update_content_pos(x_max_tex + canvas.x0, cur_space_width);
+        Ok(())
+    }
+
+    fn finish_file(&mut self, common: &mut Common) -> Result<()> {
+        self.templating
+            .set_variable("tduxContent", self.content.take());
+        self.templating.emit(common)?;
+
+        let cur_space_width = self.fonts.maybe_get_font_space_width(None);
+        self.content.update_content_pos(0, cur_space_width);
+        Ok(())
+    }
+
+    pub(crate) fn emission_finished(mut self, common: &mut Common) -> Result<FinalizingState> {
+        if !self.content.is_empty() {
+            tt_warning!(
+                common.status,
+                "non-empty content left at the end without an explicit `emit` in HTML output"
+            );
+
+            if self.templating.ready_to_output() {
+                self.finish_file(common)?;
+            }
+        }
+
+        FinalizingState::new(self.fonts, self.templating, self.assets)
+    }
+}
diff --git a/crates/engine_spx2html/src/finalization.rs b/crates/engine_spx2html/src/finalization.rs
new file mode 100644
index 0000000000..ec5cbc5227
--- /dev/null
+++ 
b/crates/engine_spx2html/src/finalization.rs
@@ -0,0 +1,96 @@
+// Copyright 2018-2022 the Tectonic Project
+// Licensed under the MIT License.
+
+//! The finalization phase of SPX to HTML processing.
+
+use tectonic_errors::prelude::*;
+use tectonic_status_base::tt_warning;
+
+use crate::{
+    assets::Assets, fonts::FontEnsemble, specials::Special, templating::Templating, Common,
+};
+
+#[derive(Debug)]
+pub(crate) struct FinalizingState {
+    fonts: FontEnsemble,
+    templating: Templating,
+    assets: Assets,
+    warning_issued: bool,
+}
+
+impl FinalizingState {
+    pub(crate) fn new(fonts: FontEnsemble, templating: Templating, assets: Assets) -> Result<Self> {
+        Ok(FinalizingState {
+            templating,
+            fonts,
+            assets,
+            warning_issued: false,
+        })
+    }
+
+    fn warn_finished_content(&mut self, detail: &str, common: &mut Common) {
+        if !self.warning_issued {
+            tt_warning!(common.status, "dropping post-finish content ({})", detail);
+            self.warning_issued = true;
+        }
+    }
+
+    pub(crate) fn handle_special(
+        &mut self,
+        special: Special<'_>,
+        common: &mut Common,
+    ) -> Result<()> {
+        match special {
+            Special::Emit => self.finish_file(common),
+
+            Special::SetTemplate(path) => {
+                self.templating.handle_set_template(path);
+                Ok(())
+            }
+
+            Special::SetOutputPath(path) => {
+                self.templating.handle_set_output_path(path);
+                Ok(())
+            }
+
+            Special::SetTemplateVariable(spec) => {
+                self.templating.handle_set_template_variable(spec, common)
+            }
+
+            Special::ProvideFile(_) | Special::ProvideSpecial(_) => {
+                self.assets.try_handle_special(special, common);
+                Ok(())
+            }
+
+            other => {
+                self.warn_finished_content(&format!("special {other}"), common);
+                Ok(())
+            }
+        }
+    }
+
+    pub(crate) fn handle_text_and_glyphs(&mut self, text: &str, common: &mut Common) -> Result<()> {
+        self.warn_finished_content(&format!("text `{text}`"), common);
+        Ok(())
+    }
+
+    pub(crate) fn handle_glyph_run(&mut self, common: &mut Common) -> Result<()> {
+        self.warn_finished_content("glyph run", common);
+        Ok(())
+    }
+
+ pub(crate) fn handle_rule(&mut self, common: &mut Common) -> Result<()> { + self.warn_finished_content("rule", common); + Ok(()) + } + + fn finish_file(&mut self, common: &mut Common) -> Result<()> { + self.templating.set_variable("tduxContent", ""); + self.templating.emit(common)?; + Ok(()) + } + + pub(crate) fn finished(self) -> (FontEnsemble, Assets) { + (self.fonts, self.assets) + } +} diff --git a/crates/engine_spx2html/src/font.rs b/crates/engine_spx2html/src/fontfile.rs similarity index 69% rename from crates/engine_spx2html/src/font.rs rename to crates/engine_spx2html/src/fontfile.rs index bcb1f43646..8455f06b4a 100644 --- a/crates/engine_spx2html/src/font.rs +++ b/crates/engine_spx2html/src/fontfile.rs @@ -1,11 +1,14 @@ // Copyright 2021-2022 the Tectonic Project // Licensed under the MIT License. -//! Reverse-map glyph IDs to the Unicode inputs that should create them. +//! Data pertaining to a specific (OpenType) font file. //! -//! Whenever possible we try to get "ActualText" info out of the engine so that -//! we don't have to do this, but for math and potentially other situations this -//! is sometimes necessary. +//! The most interesting functionality here is our "variant glyph" +//! infrastructure used to be able to show specific glyphs out of the font when +//! we don't know a Unicode character that will reliably produce it. Whenever +//! possible we try to get "ActualText" info out of the engine so that we don't +//! have to do this, but for math and potentially other situations this is +//! sometimes necessary. use byteorder::{BigEndian, ByteOrder, WriteBytesExt}; use percent_encoding::{utf8_percent_encode, CONTROLS}; @@ -33,9 +36,7 @@ const SSTY: Tag = Tag(0x73_73_74_79); /// A type for retrieving data about the glyphs used in a particular font. #[derive(Debug)] -pub struct FontData { - basename: String, - +pub struct FontFileData { /// The complete font data. /// /// Currently, this must be an OpenType font. 
@@ -61,21 +62,25 @@ pub struct FontData {
     /// typically negative.
     baseline_factor: f32,
 
-    /// Map from Unicode charactors to how many alternate character map records
+    /// Map from Unicode characters to how many variant character map records
     /// have been allocated for them. We need this to know how "deep" into the
-    /// list of alternates we need to push if a new glyph<->char pair has to be
+    /// list of variants we need to push if a new glyph<->char pair has to be
     /// handled.
-    alternate_map_counts: HashMap<char, usize>,
+    variant_map_counts: HashMap<char, usize>,
+
+    /// Map from glyph ID to variant character map setting.
+    variant_map_allocations: HashMap<u16, GlyphVariantMapping>,
 
-    /// Map from glyph ID to alternate character map setting.
-    alternate_map_allocations: HashMap<u16, GlyphAlternateMapping>,
+    /// When we've been initialized to match a precomputed set of assets,
+    /// we're not allowed to allocate any new variant glyph mappings.
+    no_new_variants: bool,
 
     /// The index of the CMAP table record in the font data structure. We need
-    /// this for the alternate cmap munging.
+    /// this for the variant cmap munging.
     fontdata_cmap_trec_idx: usize,
 
     /// The offset of the HEAD table within the font data. We need
-    /// this for the alternate cmap munging.
+    /// this for the variant cmap munging.
     fontdata_head_offset: u32,
 }
 
@@ -90,7 +95,7 @@ pub enum MapEntry {
     ///
     /// In an OpenType/TrueType font, this glyph representation is obtained with
     /// the first glyph substitution obtained using the `ssty` feature. If the
-    /// associated bool is false, the glyph was the first alternate form, used
+    /// associated bool is false, the glyph was the first variant form, used
     /// for sub/super-scripts on regular equation terms. If it is true, it is a
     /// "double" sub/super-script, e.g. the "z" in `x^{y^z}`.
     SubSuperScript(char, bool),
@@ -113,7 +118,7 @@ impl MapEntry {
     }
 }
 
-/// Information about an "alternate mapping" to be used for a glyph.
+/// Information about a "variant mapping" to be used for a glyph.
/// /// When parsing XDV output, we may encounter glyphs that do not directly map to /// an originating Unicode character (e.g., it maps with a MapEntry like /// [`MapEntry::SubSuperScript`]). We can still render such glyphs in HTML as if /// they were just standard characters in a different font, and it turns out /// that manipulating the font file to do this isn't so hard. /// -/// We need to maintain a sequence of these alternate maps because we may wish +/// We need to maintain a sequence of these variant maps because we may wish /// to map several different glyphs to the same Unicode character in this /// fashion. /// @@ -134,12 +139,12 @@ impl MapEntry { /// We might also one day wish to extend this system to emit a subsetted version /// of the original font. #[derive(Clone, Copy, Debug, Eq, Hash, PartialEq)] -pub struct GlyphAlternateMapping { +pub struct GlyphVariantMapping { /// The USV that the glyph should be mapped to pub usv: char, - /// Which alternative-mapped font to use. These indices start at zero. - pub alternate_map_index: usize, + /// Which variant-mapped font to use. These indices start at zero. + pub variant_map_index: usize, } #[derive(Clone, Copy, Debug, Eq, Hash, PartialEq)] @@ -166,11 +171,11 @@ struct HorizontalMetrics { lsb: FWord, } -impl FontData { +impl FontFileData { /// Load glyph data from OpenType font data. /// /// We take ownership of the font data that we're given. - pub fn from_opentype(basename: String, buffer: Vec<u8>, face_index: u32) -> Result<Self> { + pub fn from_opentype(buffer: Vec<u8>, face_index: u32) -> Result<Self> { let font_data = a_ok_or!( FontDataRef::new(&buffer); ["unable to parse buffer as OpenType font"] @@ -291,8 +296,7 @@ impl FontData { // All done!
- Ok(FontData { - basename, + Ok(FontFileData { buffer, gmap, space_glyph, @@ -301,8 +305,9 @@ impl FontData { ascender, descender, baseline_factor, - alternate_map_counts: HashMap::new(), - alternate_map_allocations: HashMap::new(), + variant_map_counts: HashMap::new(), + variant_map_allocations: HashMap::new(), + no_new_variants: false, fontdata_head_offset, fontdata_cmap_trec_idx, }) @@ -356,59 +361,83 @@ impl FontData { } } - /// Request that an alternative mapping be allocated for a glyph. + /// Request that a variant mapping be allocated for a glyph. + /// + /// The caller must suggest a Unicode character to use for the variant, but + /// if a different variant has already been allocated, that suggestion may + /// be ignored. /// - /// The caller must suggest a Unicode character to use for the alternative, - /// but if a different alternative has already been allocated, that - /// suggestion may be ignored. - pub fn request_alternative( + /// This function may return None if a new variant would need to be + /// allocated, but that has been prohibited. 
+ pub fn request_variant( &mut self, glyph: GlyphId, suggested: char, - ) -> GlyphAlternateMapping { + ) -> Option<GlyphVariantMapping> { + let map_entry = self.variant_map_allocations.entry(glyph); + + if self.no_new_variants { + if let std::collections::hash_map::Entry::Vacant(_) = map_entry { + return None; + } + } + let new_index = self - .alternate_map_counts + .variant_map_counts .get(&suggested) .copied() .unwrap_or(0); - let map = self - .alternate_map_allocations - .entry(glyph) - .or_insert(GlyphAlternateMapping { - usv: suggested, - alternate_map_index: new_index, - }); - if map.usv == suggested && map.alternate_map_index == new_index { + let map = map_entry.or_insert(GlyphVariantMapping { + usv: suggested, + variant_map_index: new_index, + }); + + if map.usv == suggested && map.variant_map_index == new_index { // If this is the case, we just created the mapping, // and need to bump the associated character's index for // the next glyph that wants to map to it. - self.alternate_map_counts.insert(suggested, new_index + 1); + self.variant_map_counts.insert(suggested, new_index + 1); } - *map + Some(*map) } /// Emit customized fonts to the filesystem and return information so that /// appropriate CSS can be generated. Consumes the object. /// - /// Return value is a vec of (alternate-map-index, CSS-src-field). - pub fn emit(self, out_base: &Path) -> Result<Vec<(Option<usize>, String)>> { - // Write the main font file. - - let mut out_path = out_base.to_owned(); - out_path.push(&self.basename); - atry!( - std::fs::write(&out_path, &self.buffer); - ["cannot write output file `{}`", out_path.display()] - ); + /// `rel_path` is the path, relative to the output root, where the font + /// file(s) should be emitted. Currently, this may not contain any directory + /// components, due to the way that the "variant" font file paths are + /// constructed. This wouldn't be too hard to change. + /// + /// `out_base` is the output directory, or None if we shouldn't be writing + /// anything to disk.
+ /// + /// Return value is a vec of (variant-map-index, CSS-src-field). + pub fn emit( + self, + out_base: Option<&Path>, + rel_path: &str, + ) -> Result<Vec<(Option<usize>, String)>> { + // Write the main font file ... maybe. + + let mut out_path = out_base.map(|p| p.to_owned()); + + if let Some(out_path) = out_path.as_mut() { + out_path.push(rel_path); + atry!( + std::fs::write(&out_path, &self.buffer); + ["cannot write output file `{}`", out_path.display()] + ); + } // CSS info for the main font. - let rel_url = utf8_percent_encode(&self.basename, CONTROLS).to_string(); - let mut rv = vec![(None, format!(r#"url("{}") format("opentype")"#, rel_url))]; + let rel_url = utf8_percent_encode(rel_path, CONTROLS).to_string(); + let mut rv = vec![(None, format!(r#"url("{rel_url}") format("opentype")"#))]; - // Alternates until we're done + // Variants until we're done let mut buffer = self.buffer; let orig_len = buffer.len(); @@ -416,8 +445,8 @@ impl FontData { for cur_map_index in 0.. { let mut mappings = Vec::new(); - for (glyph, altmap) in &self.alternate_map_allocations { - if altmap.alternate_map_index == cur_map_index { + for (glyph, altmap) in &self.variant_map_allocations { + if altmap.variant_map_index == cur_map_index { mappings.push((altmap.usv, *glyph)); } } @@ -426,49 +455,53 @@ impl FontData { break; } - // We have some alternates to emit! - // - // Step 1: create new CMAP, appending to buffer. - // - // Might be nice to sort mappings as we construct it, rather than - // after the fact? + // We have some variants to emit! If we're not actually writing + // files, we might not have much work to actually do though. - buffer.truncate(orig_len); - mappings.sort_unstable(); - append_simple_cmap(&mut buffer, &mappings[..]); - let cmap_size = buffer.len() - orig_len; + let varname = format!("vg{cur_map_index}{rel_path}"); - // step 2: modify CMAP table record + if let Some(out_path) = out_path.as_mut() { + // Step 1: create new CMAP, appending to buffer.
+ // + // Might be nice to sort mappings as we construct it, rather than + // after the fact? - let cs = opentype_checksum(&buffer[orig_len..]); - let ofs = 12 + self.fontdata_cmap_trec_idx * 16; - BigEndian::write_u32(&mut buffer[ofs + 4..ofs + 8], cs); // checksum - BigEndian::write_u32(&mut buffer[ofs + 8..ofs + 12], orig_len as u32); // offset - BigEndian::write_u32(&mut buffer[ofs + 12..ofs + 16], cmap_size as u32); // length + buffer.truncate(orig_len); + mappings.sort_unstable(); + append_simple_cmap(&mut buffer, &mappings[..]); + let cmap_size = buffer.len() - orig_len; - // step 3: update HEAD "checksum adjustment" field + // step 2: modify CMAP table record - let cs = opentype_checksum(&buffer[..]); - let chkadj = Wrapping(0xB1B0AFBA) - Wrapping(cs); - let ofs = self.fontdata_head_offset as usize + 8; - BigEndian::write_u32(&mut buffer[ofs..ofs + 4], chkadj.0); + let cs = opentype_checksum(&buffer[orig_len..]); + let ofs = 12 + self.fontdata_cmap_trec_idx * 16; + BigEndian::write_u32(&mut buffer[ofs + 4..ofs + 8], cs); // checksum + BigEndian::write_u32(&mut buffer[ofs + 8..ofs + 12], orig_len as u32); // offset + BigEndian::write_u32(&mut buffer[ofs + 12..ofs + 16], cmap_size as u32); // length - // step 4: write new file + // step 3: update HEAD "checksum adjustment" field - out_path.pop(); - let varname = format!("vg{}{}", cur_map_index, self.basename); - out_path.push(&varname); - atry!( - std::fs::write(&out_path, &buffer); - ["cannot write output file `{}`", out_path.display()] - ); + let cs = opentype_checksum(&buffer[..]); + let chkadj = Wrapping(0xB1B0AFBA) - Wrapping(cs); + let ofs = self.fontdata_head_offset as usize + 8; + BigEndian::write_u32(&mut buffer[ofs..ofs + 4], chkadj.0); + + // step 4: write new file + + out_path.pop(); + out_path.push(&varname); + atry!( + std::fs::write(&out_path, &buffer); + ["cannot write output file `{}`", out_path.display()] + ); + } // step 5: update CSS let rel_url = utf8_percent_encode(&varname, 
CONTROLS).to_string(); rv.push(( Some(cur_map_index), - format!(r#"url("{}") format("opentype")"#, rel_url), + format!(r#"url("{rel_url}") format("opentype")"#), )); } @@ -476,6 +509,39 @@ impl FontData { Ok(rv) } + + /// Convert this object's variant-glyph allocations into a serializable + /// table, keyed by stringified glyph ID. Consumes the object. + pub fn into_vglyphs(mut self) -> HashMap<String, crate::assets::syntax::GlyphVariantMapping> { + let mut vglyphs = HashMap::default(); + + for (glyph, altmap) in self.variant_map_allocations.drain() { + vglyphs.insert(glyph.to_string(), altmap.into()); + } + + vglyphs + } + + /// Update this "runtime" information to match the precomputed asset + /// information. At the moment the only thing we need to change is the table + /// of variant glyphs. + pub(crate) fn match_to_precomputed(&mut self, ffad: &crate::assets::syntax::FontFileAssetData) { + self.variant_map_counts.clear(); + self.variant_map_allocations.clear(); + + for (gid, mapping) in &ffad.vglyphs { + let gid: GlyphId = gid.parse().unwrap(); + + self.variant_map_allocations.insert(gid, (*mapping).into()); + + let c = self.variant_map_counts.entry(mapping.usv).or_default(); + *c = std::cmp::max(mapping.index + 1, *c); + } + + self.no_new_variants = true; + } } fn load_ssty_mappings( @@ -587,7 +653,7 @@ fn append_simple_cmap(buf: &mut Vec<u8>, map: &[(char, GlyphId)]) { buf.write_u32::<BigEndian>(map.len() as u32).unwrap(); // subtable number of groups // We could actually try to be smart here, but based on the expected usage - // of our glyph alternative scheme, I think it is unlikely that we'd realize + // of our glyph variant scheme, I think it is unlikely that we'd realize // any significant efficiencies.
for (usv, gid) in map { @@ -596,3 +662,12 @@ fn append_simple_cmap(buf: &mut Vec<u8>, map: &[(char, GlyphId)]) { buf.write_u32::<BigEndian>(*gid as u32).unwrap(); // glyph id } } + +impl From<crate::assets::syntax::GlyphVariantMapping> for GlyphVariantMapping { + fn from(m: crate::assets::syntax::GlyphVariantMapping) -> Self { + GlyphVariantMapping { + usv: m.usv, + variant_map_index: m.index, + } + } +} diff --git a/crates/engine_spx2html/src/fonts.rs b/crates/engine_spx2html/src/fonts.rs new file mode 100644 index 0000000000..a826700bd3 --- /dev/null +++ b/crates/engine_spx2html/src/fonts.rs @@ -0,0 +1,972 @@ +// Copyright 2022 the Tectonic Project +// Licensed under the MIT License. + +//! Manage families of related fonts. +//! +//! Here a "font family" is interpreted in the HTML sense, meaning a set of +//! related fonts. In typography you might call this a typeface. + +use std::{collections::HashMap, fmt::Write, io::Read, path::Path}; +use tectonic_errors::prelude::*; +use tectonic_io_base::InputHandle; +use tectonic_status_base::{tt_warning, StatusBackend}; + +use crate::{ + assets::syntax, + fontfile::{FontFileData, GlyphId, GlyphMetrics, MapEntry}, + Common, FixedPoint, TexFontNum, +}; + +/// An identifier for a "font file" (which may be one face in a collection). +type FontId = usize; + +/// Information about an ensemble of font families. +/// +/// A given document may declare multiple families of related fonts. +#[derive(Debug, Default)] +pub struct FontEnsemble { + /// Information about fonts declared in the SPX file. There may be + /// a number of "native" fonts with different size/color/etc info + /// that all reference the same underlying font file. + tex_fonts: HashMap<TexFontNum, TexFontInfo>, + + /// Information about the individual font files used in this build. Although + /// we call these "font files", there may be multiple Fonts for one file on + /// disk, if that file is a collection containing multiple faces.
There may + /// also be font files that do not have corresponding TeX fonts, if we are + /// loading a merged asset specification. + font_files: Vec<Font>, + + /// Information about font families. This is keyed by the font-id of the + /// "regular" font. + font_families: HashMap<FontId, FontFamily>, + + /// Mapping font source TeX-paths and face indices to font IDs. This tuple + /// of info uniquely identifies a "font file", in our terminology. + src_index_map: HashMap<(String, u32), FontId>, +} + +impl FontEnsemble { + /// Test whether this ensemble contains a font identified by the given SPX + /// font number. + pub fn contains(&self, f: TexFontNum) -> bool { + self.tex_fonts.contains_key(&f) + } + + #[inline(always)] + fn lookup_tex(&self, fnum: TexFontNum) -> Result<&TexFontInfo> { + Ok(a_ok_or!( + self.tex_fonts.get(&fnum); + ["undeclared font number {}", fnum] + )) + } + + /// Register a new "native" font with this data structure. Font-family + /// relations aren't recorded here. + /// + /// At this point, the calling function has checked whether this particular + /// font-num has already been registered. But there can be multiple + /// font-nums that point at the same source path and face index, which means + /// they will have the same backing "font file" in our terminology. In + /// particular, different sizes of the same font get different font-nums. + /// + /// The styling options like *color_rgba* and *slant* are currently stored + /// but unused.
+ #[allow(clippy::too_many_arguments)] + pub(crate) fn register_tex_font( + &mut self, + font_num: TexFontNum, + size: FixedPoint, + face_index: u32, + color_rgba: Option, + extend: Option, + slant: Option, + embolden: Option, + texpath: String, + ih: InputHandle, + common: &mut Common, + ) -> Result<()> { + let fid = self.ensure_font_file(&texpath, face_index, ih, common)?; + + let info = TexFontInfo { + fid, + size, + color_rgba, + extend, + slant, + embolden, + }; + + self.tex_fonts.insert(font_num, info); + Ok(()) + } + + /// Make sure that a font file is loaded. Font files are uniquely identified + /// by their source TeX paths and face indices. + fn ensure_font_file( + &mut self, + texpath: &str, + face_index: u32, + mut ih: InputHandle, + common: &mut Common, + ) -> Result<FontId> { + // Figure out if we've already loaded the appropriate font file. + + let si_key = (texpath.to_string(), face_index); + let next_id = self.font_files.len(); + let fid = *self.src_index_map.entry(si_key).or_insert(next_id); + + if fid == next_id { + // No, we haven't. Load it up. + + let mut contents = Vec::new(); + atry!( + ih.read_to_end(&mut contents); + ["unable to read input font file `{}`", texpath] + ); + + let (name, digest_opt) = ih.into_name_digest(); + common + .hooks + .event_input_closed(name, digest_opt, common.status); + + let ffd = atry!( + FontFileData::from_opentype(contents, face_index); + ["unable to load glyph data for font `{}`", texpath] + ); + + // Figure out the output path that we'll use for this font. For now, + // that's just the basename. TODO: make sure that we don't have + // basename clashes! This will happen trivially if we ever actually + // use font collections that contain more than one face. + + let out_rel_path = texpath.rsplit('/').next().unwrap(); + + // That's all we need. + + self.font_files + .push(Font::new(texpath, face_index, out_rel_path, ffd)); + } + + Ok(fid) + } + + /// Load a font that is *not* defined in the input SPX file.
+ /// + /// This should only be called for fonts that are definitely not loaded by + /// TeX, because we don't do any checks to prevent creating duplicate font + /// outputs. + fn load_external_font( + &mut self, + texpath: impl Into<String>, + face_index: u32, + common: &mut Common, + ) -> Result<FontId> { + let texpath = texpath.into(); + + // All we have to do is open up the file and pass off to the shared + // implementation. + + let io = common.hooks.io(); + + let ih = atry!( + io.input_open_name(&texpath, common.status).must_exist(); + ["failed to find a font file `{}`", texpath] + ); + + self.ensure_font_file(&texpath, face_index, ih, common) + } + + /// Register a font-family relation. + /// + /// For the time being, the full quartet of bold/italic variations must be + /// defined in order to declare a family. + pub fn register_family( + &mut self, + name: String, + regular: TexFontNum, + bold: TexFontNum, + italic: TexFontNum, + bold_italic: TexFontNum, + ) -> Result<()> { + let regular = self.lookup_tex(regular)?.fid; + let bold = self.lookup_tex(bold)?.fid; + let italic = self.lookup_tex(italic)?.fid; + let bold_italic = self.lookup_tex(bold_italic)?.fid; + + // Update the info records for the relevant fonts to capture the + // established relationship. + + self.font_files[regular].family_name = name.clone(); + self.font_files[regular].family_relation = FamilyRelativeFontId::Regular; + self.font_files[bold].family_name = name.clone(); + self.font_files[bold].family_relation = FamilyRelativeFontId::Bold; + self.font_files[italic].family_name = name.clone(); + self.font_files[italic].family_relation = FamilyRelativeFontId::Italic; + self.font_files[bold_italic].family_name = name.clone(); + self.font_files[bold_italic].family_relation = FamilyRelativeFontId::BoldItalic; + + self.font_families.insert( + regular, + FontFamily { + name, + regular, + bold, + italic, + bold_italic, + }, + ); + + Ok(()) + } + + /// Get the size at which the specified SPX font is defined.
+ /// + /// If the TeX font number is undefined, a default of 10pt (655360 in TeX + /// fixed-point units) is returned. + pub fn get_font_size(&self, fnum: TexFontNum) -> FixedPoint { + self.tex_fonts + .get(&fnum) + .map(|tfi| tfi.size) + .unwrap_or(655360) + } + + /// Get the width of the space character in an SPX font. + /// + /// This width is not always known, depending on the font file structure. + /// For convenience, this function's input font number is also optional. + pub fn maybe_get_font_space_width(&self, font_num: Option<TexFontNum>) -> Option<FixedPoint> { + font_num + .and_then(|fnum| self.tex_fonts.get(&fnum)) + .and_then(|tfi| self.font_files[tfi.fid].details.space_width(tfi.size)) + } + + /// Get the metrics for a glyph in a font. + /// + /// The return value is only `Err` if the font number is undeclared. If the + /// glyph's metrics are not defined in the font, `Ok(None)` is returned. + pub fn get_glyph_metrics( + &mut self, + fnum: TexFontNum, + glyph: GlyphId, + ) -> Result<Option<GlyphMetrics>> { + let tfi = self.lookup_tex(fnum)?; + Ok(self.font_files[tfi.fid] + .details + .lookup_metrics(glyph, tfi.size)) + } + + /// Get information needed to render a glyph in a canvas context. + /// + /// The return value is a tuple `(text_info, size, baseline_factor)`. In + /// turn, `text_info` is an optional tuple of `(ch, style)`, where `ch` is + /// the Unicode character to yield the desired glyph and `style` is a bit of + /// CSS to go into an HTML `style` attribute in order to select the font + /// that will map `ch` to the correct glyph. + /// + /// If we're unable to figure out a way to render the desired glyph, a + /// warning is logged to the status backend. + pub fn process_glyph_for_canvas( + &mut self, + fnum: TexFontNum, + glyph: GlyphId, + status: &mut dyn StatusBackend, + ) -> (Option<(char, String)>, FixedPoint, f32) { + // Can't borrow `self` in the map() closure.
+ let font_files = &mut self.font_files; + + self.tex_fonts + .get(&fnum) + .map(|tfi| { + let text_info = get_text_info(&mut font_files[tfi.fid], glyph, status); + let size = tfi.size; + let baseline_factor = font_files[tfi.fid].details.baseline_factor(); + + (text_info, size, baseline_factor) + }) + .unwrap_or((None, 655360, 1.0)) + } + + /// Create an iterator for rendering glyphs as Unicode text. + /// + /// The iterator yields tuples of `(index, text_info, advance)`, where + /// `index` is the index of the glyph in the passed-in array, `text_info` is + /// an optional tuple of information about how to get the glyph to appear in + /// HTML, and `advance` is the horizontal advance length associated with the + /// glyph in question, according to the font's metrics. If not None, + /// `text_info` is a tuple of `(ch, style)`, where `ch` is the Unicode + /// character to yield the desired glyph and `style` is a bit of CSS to go + /// into an HTML `style` attribute in order to select the font that will map + /// `ch` to the correct glyph. + /// + /// If we're unable to figure out a way to render the desired glyph, a + /// warning is logged to the status backend. + pub fn process_glyphs_as_text<'a>( + &'a mut self, + font_num: TexFontNum, + glyphs: &'a [GlyphId], + status: &'a mut dyn StatusBackend, + ) -> Result, FixedPoint)> + 'a> { + // Can't use lookup_tex() here since the borrow checker treats it as + // borrowing all of `self`, not just the `tex_fonts` member. + let fi = a_ok_or!( + self.tex_fonts.get(&font_num); + ["undeclared font number {}", font_num] + ); + let font = &mut self.font_files[fi.fid]; + + Ok(GlyphTextProcessingIterator { + fi, + font, + glyphs, + status, + next: 0, + }) + } + + /// Determine how an SPX font relates to a font family. + /// + /// The *fnum* argument is some font number. The *cur_ffid* argument is the + /// identifier of a font family, which is defined as the TexFontNum of its + /// "regular" font. 
The *cur_af* argument defines the currently active font + /// within that family, as identified with a [`FamilyRelativeFontId`]. + pub fn analyze_font_for_family( + &self, + fnum: TexFontNum, + cur_ffid: TexFontNum, + cur_af: FamilyRelativeFontId, + ) -> FontFamilyAnalysis { + if let Ok(tf) = self.lookup_tex(fnum) { + if let Ok(tc) = self.lookup_tex(cur_ffid) { + if let Some(cur_fam) = self.font_families.get(&tc.fid) { + // Already set up for the right font? If so, great! + return if cur_fam.relative_id_to_font_num(cur_af) == tf.fid { + FontFamilyAnalysis::AlreadyActive + } else { + // No. Figure out what we need to do. + let desired_af = cur_fam.font_num_to_relative_id(tf.fid); + FontFamilyAnalysis::Reachable( + cur_fam.path_to_new_font(cur_af, desired_af), + desired_af, + ) + }; + } + } + + FontFamilyAnalysis::NoMatch(tf.fid) + } else { + FontFamilyAnalysis::Unrecognized + } + } + + /// Write HTML code for an open `<span>` element that activates a font. + /// + /// The font size is specified in CSS "rem" units, which need to be + /// calculated with the *rems_per_tex* parameter. + pub fn write_styling_span_html<W: Write>( + &self, + fnum: TexFontNum, + rems_per_tex: f32, + mut dest: W, + ) -> Result<()> { + let tfi = self.lookup_tex(fnum)?; + let rel_size = tfi.size as f32 * rems_per_tex; + + write!( + dest, + "<span style=\"font-size: {}rem; {}\">", + rel_size, + self.font_files[tfi.fid].selection_style_text(None) + ) + .map_err(|e| e.into()) + } + + /// Emit the font files and return CSS code setting up the files. + /// + /// This function clears this object's internal data structures, making it + /// effectively unusable for subsequent operations. + pub fn emit(&mut self, out_base: Option<&Path>) -> Result<String> { + let mut faces = String::default(); + + for font in self.font_files.drain(..)
{ + font.emit(out_base, &mut faces)?; + } + + Ok(faces) + } + + pub(crate) fn into_serialize(mut self) -> (syntax::Assets, syntax::FontEnsembleAssetData) { + let mut assets: syntax::Assets = Default::default(); + let mut css_data: syntax::FontEnsembleAssetData = Default::default(); + let mut fid_to_filename = Vec::new(); + + for font in self.font_files.drain(..) { + let vglyphs = font.details.into_vglyphs(); + + let ffad = syntax::FontFileAssetData { + source: font.src_tex_path, + face_index: font.face_index, + vglyphs, + }; + + let filename = ffad.source.clone(); + assets + .0 + .insert(filename.clone(), syntax::AssetOrigin::FontFile(ffad)); + fid_to_filename.push(filename); + } + + for ffi in self.font_families.values() { + let mut faces = HashMap::new(); + + faces.insert( + syntax::FaceType::Regular, + fid_to_filename[ffi.regular].clone(), + ); + faces.insert(syntax::FaceType::Bold, fid_to_filename[ffi.bold].clone()); + faces.insert( + syntax::FaceType::Italic, + fid_to_filename[ffi.italic].clone(), + ); + faces.insert( + syntax::FaceType::BoldItalic, + fid_to_filename[ffi.bold_italic].clone(), + ); + css_data + .0 + .insert(ffi.name.clone(), syntax::FontFamilyAssetData { faces }); + } + + (assets, css_data) + } + + /// Check that the fonts defined at runtime match the serialized assets, and + /// set up the runtime variant glyphs to align with the precomputed ones. + pub(crate) fn match_to_precomputed( + &mut self, + precomputed: &syntax::Assets, + common: &mut Common, + ) -> Result<()> { + let mut fid_to_filename = Vec::new(); + + // For the existing font file data, we need to check that they're + // present and the basenames match. We'll replace the runtime + // variant-glyph mappings with the precomputed ones. 
+ + for font in &mut self.font_files { + match precomputed.0.get(&font.out_rel_path) { + Some(syntax::AssetOrigin::FontFile(ff)) => { + ensure!( + ff.source == font.out_rel_path, + "precomputed font asset `{}` \ + should have an origin of `{}`, but in this session it is `{}`", + font.out_rel_path, + font.out_rel_path, + ff.source + ); + + font.details.match_to_precomputed(ff); + } + + Some(other) => bail!( + "precomputed asset `{}` should be a font file, but it is {}", + font.out_rel_path, + other + ), + + None => bail!( + "precomputed assets for this session should contain a font file named `{}`", + font.out_rel_path + ), + } + + fid_to_filename.push(font.out_rel_path.clone()); + } + + // Now create new records for any fonts in the precomputed assets that + // we're missing. By definition, we shouldn't need them during our main + // processing, but we still might be responsible for creating the final + // output files at the end. + + for origin in precomputed.0.values() { + if let syntax::AssetOrigin::FontFile(ff) = origin { + let fid = atry!( + self.load_external_font(&ff.source, ff.face_index, common); + ["failed to load face #{} of font `{}` from precomputed assets", ff.face_index, ff.source] + ); + + self.font_files[fid].details.match_to_precomputed(ff); + } + } + + // This is a bit awkward, but our system currently lets there be + // multiple font-CSS outputs that could in principle declare different + // font families. To check consistency, we want to scan all of those. We + // ignore the possibility that different CSS files might define + // different families with the same name. + + let mut precomputed_families = HashMap::new(); + + for origin in precomputed.0.values() { + if let syntax::AssetOrigin::FontCss(fe) = origin { + for (fam_name, ff) in &fe.0 { + precomputed_families.insert(fam_name.to_owned(), ff); + } + } + } + + // Now we can check the runtime families. A helper closure uses + // fnum_to_filename to deal with the different faces to check. 
+ + let check_face = |fam_name: &str, + fid: FontId, + ft: syntax::FaceType, + pff: &syntax::FontFamilyAssetData| + -> Result<()> { + let runtime_file = &fid_to_filename[fid]; + + if let Some(pre_file) = pff.faces.get(&ft) { + ensure!( + pre_file == runtime_file, + "font family {} face {:?} should \ + point to file `{}`, but in this session it is `{}`", + fam_name, + ft, + pre_file, + runtime_file + ); + } else { + bail!( + "this session defines unexpected face {:?} for font family {}", + ft, + fam_name + ); + } + + Ok(()) + }; + + for ffi in self.font_families.values() { + let fam_name = &ffi.name; + + if let Some(pff) = precomputed_families.get(fam_name) { + check_face(fam_name, ffi.regular, syntax::FaceType::Regular, pff)?; + check_face(fam_name, ffi.bold, syntax::FaceType::Bold, pff)?; + check_face(fam_name, ffi.italic, syntax::FaceType::Italic, pff)?; + check_face(fam_name, ffi.bold_italic, syntax::FaceType::BoldItalic, pff)?; + } else { + bail!( + "precomputed assets for this session should define a font family named {}", + fam_name + ); + } + } + + // All OK! + + Ok(()) + } +} + +/// A helper type for the [`FontEnsemble::process_glyphs_as_text`] method. +struct GlyphTextProcessingIterator<'a> { + fi: &'a TexFontInfo, + font: &'a mut Font, + glyphs: &'a [GlyphId], + status: &'a mut dyn StatusBackend, + next: usize, +} + +impl<'a> Iterator for GlyphTextProcessingIterator<'a> { + type Item = (usize, Option<(char, String)>, FixedPoint); + + fn next(&mut self) -> Option { + if self.next >= self.glyphs.len() { + return None; + } + + let glyph = self.glyphs[self.next]; + + // Get the advance info: + + let gm = self.font.details.lookup_metrics(glyph, self.fi.size); + + let advance = match gm { + Some(gm) => gm.advance, + None => 0, + }; + + // Get the textualization info: + + let text_info = get_text_info(self.font, glyph, self.status); + + // And that's it! 
+ + let idx = self.next; + self.next += 1; + Some((idx, text_info, advance)) + } +} + +/// Get information about how to render a desired glyph from a font. +fn get_text_info( + font: &mut Font, + glyph: GlyphId, + status: &mut dyn StatusBackend, +) -> Option<(char, String)> { + let text_info = font.details.lookup_mapping(glyph).map(|mc| { + let (mut ch, need_alt) = match mc { + MapEntry::Direct(c) => (c, false), + MapEntry::SubSuperScript(c, _) => (c, true), + MapEntry::MathGrowingVariant(c, _, _) => (c, true), + }; + + let var_index = if need_alt { + if let Some(map) = font.details.request_variant(glyph, ch) { + ch = map.usv; + Some(map.variant_map_index) + } else { + tt_warning!( + status, + "prohibited from defining new variant glyph {} in font `{}` (face {})", + glyph, + font.out_rel_path, + font.face_index + ); + None + } + } else { + None + }; + + // For later: might help to allow some context about the active font so + // that we can maybe use a simpler selection string here. + let font_sel = font.selection_style_text(var_index); + + (ch, font_sel) + }); + + if text_info.is_none() { + tt_warning!( + status, + "unable to reverse-map glyph {} in font `{}` (face {})", + glyph, + font.out_rel_path, + font.face_index + ); + } + + text_info +} + +/// The return type for [`FontEnsemble::analyze_font_for_family`]. +#[derive(Debug)] +pub enum FontFamilyAnalysis { + /// The desired font is already active. + AlreadyActive, + + /// The desired font isn't active, but it can be "reached" in the context of + /// the current family by closing and/or opening tags like ``. + Reachable(PathToNewFont, FamilyRelativeFontId), + + /// The desired font can't be reached in the context of this family. We + /// can't activate it in a semantically-clean way. The associated value is + /// the mapped font-id of the input TeX font num. + NoMatch(FontId), + + /// The desired TeX font-num is unrecognized. This should only happen if the + /// SPX file is corrupt. 
+ Unrecognized, +} + +/// Information about a "native font" declared in the SPX file that's specific +/// to the TeX font, not the "font file" data structure. +#[allow(dead_code)] +#[derive(Debug)] +struct TexFontInfo { + /// The font file pointed to by this TeX font. + fid: FontId, + + /// The size at which this font is rendered, in TeX units. + size: FixedPoint, + + /// Unused TeX/SPX setting. + color_rgba: Option, + + /// Unused TeX/SPX setting. + extend: Option, + + /// Unused TeX/SPX setting. + slant: Option, + + /// Unused TeX/SPX setting. + embolden: Option, +} + +#[derive(Debug)] +struct Font { + /// The TeX path of the file from which this font was loaded. In conjunction + /// with face_index, this uniquely identifies a Font. + src_tex_path: String, + + /// The index number of the particular face in the font file that was loaded. + face_index: u32, + + /// The path that this font file will be output to. + out_rel_path: String, + + details: FontFileData, + + /// The name of the family that this font is associated with. This may be a + /// user-given name, if they explicitly define a font family; but by default + /// it is automatically generated, so that we have *some* reliable way to + /// name the font in our output. + family_name: String, + + /// This font's role in relation to its family. Here, the `Other` enum + /// variant is illegal. 
+    family_relation: FamilyRelativeFontId,
+}
+
+impl Font {
+    pub fn new(
+        src_tex_path: impl Into<String>,
+        face_index: u32,
+        out_rel_path: impl Into<String>,
+        details: FontFileData,
+    ) -> Self {
+        let src_tex_path = src_tex_path.into();
+        let out_rel_path = out_rel_path.into();
+        let family_name = out_rel_path.replace(|c: char| !c.is_alphanumeric(), "_");
+
+        Font {
+            src_tex_path,
+            face_index,
+            out_rel_path,
+            details,
+            family_name,
+            family_relation: FamilyRelativeFontId::Regular,
+        }
+    }
+
+    fn emit<W: Write>(self, out_base: Option<&Path>, mut dest: W) -> Result<()> {
+        for (var_index, css_src) in self.details.emit(out_base, &self.out_rel_path)? {
+            // This is almost identical to `selection_style_text`. A major
+            // factor is that we're consuming `self`, with `self.details`
+            // already consumed by the `emit()` call, so we can't borrow &self.
+            // Also, here we have double quotes around the font-family
+            // specifier, which we want to have in the CSS but shouldn't have
+            // (maybe???) in the HTML `style` attribute.
+            let var_text = var_index.map(|i| format!("vg{i}")).unwrap_or_default();
+
+            let extra = match self.family_relation {
+                FamilyRelativeFontId::Regular => "",
+                FamilyRelativeFontId::Bold => "\n    font-weight: bold;",
+                FamilyRelativeFontId::Italic => "\n    font-style: italic;",
+                FamilyRelativeFontId::BoldItalic => {
+                    "\n    font-weight: bold;\n    font-style: italic;"
+                }
+                FamilyRelativeFontId::Other(_) => unreachable!(),
+            };
+
+            writeln!(
+                dest,
+                r#"@font-face {{
+    font-family: "{}{}";{}
+    src: {};
+}}"#,
+                self.family_name, var_text, extra, css_src,
+            )?;
+        }
+
+        Ok(())
+    }
+
+    /// Generate a snippet of CSS for an HTML `style` attribute that will select
+    /// the appropriate font, given that we might need to select one of the
+    /// "variants" generated to make unusual glyphs available.
+    fn selection_style_text(&self, variant_map_index: Option<usize>) -> String {
+        let var_text = variant_map_index
+            .map(|i| format!("vg{i}"))
+            .unwrap_or_default();
+
+        let extra = match self.family_relation {
+            FamilyRelativeFontId::Regular => "",
+            FamilyRelativeFontId::Bold => "; font-weight: bold",
+            FamilyRelativeFontId::Italic => "; font-style: italic",
+            FamilyRelativeFontId::BoldItalic => "; font-weight: bold; font-style: italic",
+            FamilyRelativeFontId::Other(_) => unreachable!(),
+        };
+
+        format!("font-family: {}{}{}", self.family_name, var_text, extra)
+    }
+}
+
+/// The definition of a family of fonts.
+#[derive(Clone, Debug, Eq, PartialEq)]
+struct FontFamily {
+    name: String,
+    regular: FontId,
+    bold: FontId,
+    italic: FontId,
+    bold_italic: FontId,
+}
+
+impl FontFamily {
+    fn font_num_to_relative_id(&self, fnum: FontId) -> FamilyRelativeFontId {
+        if fnum == self.regular {
+            FamilyRelativeFontId::Regular
+        } else if fnum == self.bold {
+            FamilyRelativeFontId::Bold
+        } else if fnum == self.italic {
+            FamilyRelativeFontId::Italic
+        } else if fnum == self.bold_italic {
+            FamilyRelativeFontId::BoldItalic
+        } else {
+            FamilyRelativeFontId::Other(fnum)
+        }
+    }
+
+    fn relative_id_to_font_num(&self, relid: FamilyRelativeFontId) -> FontId {
+        match relid {
+            FamilyRelativeFontId::Regular => self.regular,
+            FamilyRelativeFontId::Bold => self.bold,
+            FamilyRelativeFontId::Italic => self.italic,
+            FamilyRelativeFontId::BoldItalic => self.bold_italic,
+            FamilyRelativeFontId::Other(fnum) => fnum,
+        }
+    }
+
+    /// Figure out how to get "to" a desired font based on the current one. This
+    /// function should only be called if it has been established that the
+    /// desired font is in fact different than the current font. However, there
+    /// are some noop cases below so that we can make the compiler happy about
+    /// covering all of our enum variants.
+ fn path_to_new_font( + &self, + cur: FamilyRelativeFontId, + desired: FamilyRelativeFontId, + ) -> PathToNewFont { + match desired { + FamilyRelativeFontId::Other(_) => PathToNewFont { + close_all: true, + select_explicitly: true, + ..Default::default() + }, + + FamilyRelativeFontId::Regular => PathToNewFont { + close_all: true, + ..Default::default() + }, + + FamilyRelativeFontId::Bold => match cur { + FamilyRelativeFontId::Regular => PathToNewFont { + open_b: Some(desired), + ..Default::default() + }, + + FamilyRelativeFontId::Bold => Default::default(), + + FamilyRelativeFontId::Italic | FamilyRelativeFontId::Other(_) => PathToNewFont { + close_all: true, + open_b: Some(desired), + ..Default::default() + }, + + FamilyRelativeFontId::BoldItalic => PathToNewFont { + close_one_and_retry: true, + ..Default::default() + }, + }, + + FamilyRelativeFontId::Italic => match cur { + FamilyRelativeFontId::Regular => PathToNewFont { + open_i: Some(desired), + ..Default::default() + }, + + FamilyRelativeFontId::Italic => Default::default(), + + FamilyRelativeFontId::Bold | FamilyRelativeFontId::Other(_) => PathToNewFont { + close_all: true, + open_i: Some(desired), + ..Default::default() + }, + + FamilyRelativeFontId::BoldItalic => PathToNewFont { + close_one_and_retry: true, + ..Default::default() + }, + }, + + FamilyRelativeFontId::BoldItalic => match cur { + FamilyRelativeFontId::Regular => PathToNewFont { + open_i: Some(desired), + open_b: Some(FamilyRelativeFontId::Bold), // <= the whole reason these aren't bools + ..Default::default() + }, + + FamilyRelativeFontId::Italic => PathToNewFont { + open_b: Some(desired), + ..Default::default() + }, + + FamilyRelativeFontId::Bold => PathToNewFont { + open_i: Some(desired), + ..Default::default() + }, + + FamilyRelativeFontId::BoldItalic => Default::default(), + + FamilyRelativeFontId::Other(_) => PathToNewFont { + close_one_and_retry: true, + ..Default::default() + }, + }, + } + } +} + +/// How to "get to" a desired font 
based on the current font family and recently
+/// active tags.
+#[derive(Clone, Copy, Debug, Default, Eq, PartialEq)]
+pub struct PathToNewFont {
+    /// Close all open automatically-generated font-selection tags.
+    pub close_all: bool,
+
+    /// Close one automatically-generated font-selection tag, and try again.
+    pub close_one_and_retry: bool,
+
+    /// Issue a `<span>` element to explicitly choose the font; this is
+    /// our get-out-of-jail-free card.
+    pub select_explicitly: bool,
+
+    /// If Some, open a `<b>` tag. The value is the "family-relative" font that
+    /// will be active after doing so. If both this and `open_i` are Some, this
+    /// should be evaluated first.
+    pub open_b: Option<FamilyRelativeFontId>,
+
+    /// If Some, open an `<i>` tag. The value is the "family-relative" font that
+    /// will be active after doing so. If both this and `open_b` are Some, the
+    /// `<b>` tag should be evaluated first.
+    pub open_i: Option<FamilyRelativeFontId>,
+}
+
+/// A font's role relative to some font family.
+#[derive(Clone, Copy, Debug, Eq, PartialEq)]
+pub enum FamilyRelativeFontId {
+    /// This font is the regular font of the current family.
+    Regular,
+
+    /// This font is the bold font of the current family.
+    Bold,
+
+    /// This font is the italic font of the current family.
+    Italic,
+
+    /// This font is the bold-italic font of the current family.
+    BoldItalic,
+
+    /// This font is some other font with no known relation to the current
+    /// family.
+    Other(FontId),
+}
diff --git a/crates/engine_spx2html/src/initialization.rs b/crates/engine_spx2html/src/initialization.rs
new file mode 100644
index 0000000000..1ec38ce3f2
--- /dev/null
+++ b/crates/engine_spx2html/src/initialization.rs
@@ -0,0 +1,374 @@
+// Copyright 2018-2022 the Tectonic Project
+// Licensed under the MIT License.
+
+//! The initialization stage of SPX processing.
+
+use std::{collections::HashMap, io::Read, path::PathBuf};
+use tectonic_errors::prelude::*;
+use tectonic_io_base::OpenResult;
+use tectonic_status_base::tt_warning;
+
+use crate::{
+    fonts::FontEnsemble, html::Element, specials::Special, templating::Templating, Common,
+    EmittingState, FixedPoint, TexFontNum,
+};
+
+#[derive(Debug)]
+pub(crate) struct InitializationState {
+    templates: HashMap<String, String>,
+    next_template_path: String,
+    next_output_path: String,
+
+    fonts: FontEnsemble,
+    main_body_font_num: Option<TexFontNum>,
+    tag_associations: HashMap<Element, TexFontNum>,
+
+    cur_font_family_definition: Option<FontFamilyBuilder>,
+    cur_font_family_tag_associations: Option<FontFamilyTagAssociator>,
+
+    variables: HashMap<String, String>,
+}
+
+impl Default for InitializationState {
+    fn default() -> Self {
+        InitializationState {
+            templates: Default::default(),
+            next_template_path: Default::default(),
+            next_output_path: "index.html".to_owned(),
+
+            fonts: Default::default(),
+            main_body_font_num: None,
+            tag_associations: Default::default(),
+
+            cur_font_family_definition: None,
+            cur_font_family_tag_associations: None,
+
+            variables: Default::default(),
+        }
+    }
+}
+
+impl InitializationState {
+    /// Return true if we're not in the midst of a multi-step construct like
+    /// startDefineFontFamily. In such situations, if we see an event that is
+    /// associated with the beginning of the actual content, we should end the
+    /// initialization phase.
+    pub(crate) fn in_endable_init(&self) -> bool {
+        self.cur_font_family_definition.is_none() && self.cur_font_family_tag_associations.is_none()
+    }
+
+    /// Handle a "native" font definition.
+    ///
+    /// The font *name* comes directly from the SPX file and currently
+    /// corresponds to the TeX path of a font file that can be opened as an
+    /// input, potentially without its extension. In the future, it is possible
+    /// that the font name might be something symbolic like "Times New Roman"
+    /// that might not work well as a file path.
+    #[allow(clippy::too_many_arguments)]
+    pub(crate) fn handle_define_native_font(
+        &mut self,
+        name: &str,
+        font_num: TexFontNum,
+        size: FixedPoint,
+        face_index: u32,
+        color_rgba: Option<u32>,
+        extend: Option<FixedPoint>,
+        slant: Option<FixedPoint>,
+        embolden: Option<FixedPoint>,
+        common: &mut Common,
+    ) -> Result<()> {
+        if self.fonts.contains(font_num) {
+            // Should we override the definition or something?
+            return Ok(());
+        }
+
+        // Figure out the TeX path of the font source. At the moment, this is
+        // just the name or something similar, but in principle we might do a
+        // lookup based on something like symbolic name.
+
+        let io = common.hooks.io();
+        let mut texpath = String::default();
+        let mut ih = None;
+
+        for ext in &["", ".otf"] {
+            texpath = format!("{name}{ext}");
+
+            match io.input_open_name(&texpath, common.status) {
+                OpenResult::Ok(h) => {
+                    ih = Some(h);
+                    break;
+                }
+
+                OpenResult::NotAvailable => continue,
+
+                OpenResult::Err(e) => return Err(e),
+            };
+        }
+
+        let ih = a_ok_or!(ih;
+            ["failed to find a font file associated with the name `{}`", name]
+        );
+
+        // Now that we have that, we can pass off to the font manager.
+ + self.fonts.register_tex_font( + font_num, size, face_index, color_rgba, extend, slant, embolden, texpath, ih, common, + ) + } + + pub(crate) fn handle_special( + &mut self, + special: Special<'_>, + common: &mut Common, + ) -> Result<()> { + match special { + Special::AddTemplate(t) => self.handle_add_template(t, common), + Special::SetTemplate(t) => self.handle_set_template(t, common), + Special::SetOutputPath(t) => self.handle_set_output_path(t, common), + Special::SetTemplateVariable(t) => self.handle_set_template_variable(t, common), + Special::StartDefineFontFamily => self.handle_start_define_font_family(), + Special::EndDefineFontFamily => self.handle_end_define_font_family(common), + Special::StartFontFamilyTagAssociations => { + self.handle_start_font_family_tag_associations() + } + + Special::EndFontFamilyTagAssociations => { + self.handle_end_font_family_tag_associations(common) + } + + Special::ProvideFile(_) => { + tt_warning!(common.status, "ignoring too-soon tdux:provideFile special"); + Ok(()) + } + + _ => Ok(()), + } + } + + fn handle_add_template(&mut self, texpath: &str, common: &mut Common) -> Result<()> { + let mut ih = atry!( + common.hooks.io().input_open_name(texpath, common.status).must_exist(); + ["unable to open input HTML template `{}`", texpath] + ); + + let mut contents = String::new(); + atry!( + ih.read_to_string(&mut contents); + ["unable to read input HTML template `{}`", texpath] + ); + + self.templates.insert(texpath.to_owned(), contents); + + let (name, digest_opt) = ih.into_name_digest(); + common + .hooks + .event_input_closed(name, digest_opt, common.status); + Ok(()) + } + + fn handle_set_template(&mut self, texpath: &str, _common: &mut Common) -> Result<()> { + self.next_template_path = texpath.to_owned(); + Ok(()) + } + + fn handle_set_output_path(&mut self, texpath: &str, _common: &mut Common) -> Result<()> { + self.next_output_path = texpath.to_owned(); + Ok(()) + } + + fn handle_set_template_variable(&mut self, 
remainder: &str, common: &mut Common) -> Result<()> {
+        if let Some((varname, varval)) = remainder.split_once(' ') {
+            self.variables.insert(varname.to_owned(), varval.to_owned());
+        } else {
+            tt_warning!(
+                common.status,
+                "ignoring malformatted tdux:setTemplateVariable special `{}`",
+                remainder
+            );
+        }
+
+        Ok(())
+    }
+
+    // "Font family" definitions, allowing us to synthesize bold/italic tags
+    // based on tracking font changes, and also to know what the main body font
+    // is.
+
+    fn handle_start_define_font_family(&mut self) -> Result<()> {
+        self.cur_font_family_definition = Some(FontFamilyBuilder::default());
+        Ok(())
+    }
+
+    fn handle_end_define_font_family(&mut self, common: &mut Common) -> Result<()> {
+        if let Some(b) = self.cur_font_family_definition.take() {
+            let family_name = b.family_name;
+            let regular = a_ok_or!(b.regular; ["no regular face defined"]);
+            let bold = a_ok_or!(b.bold; ["no bold face defined"]);
+            let italic = a_ok_or!(b.italic; ["no italic face defined"]);
+            let bold_italic = a_ok_or!(b.bold_italic; ["no bold-italic face defined"]);
+
+            self.fonts
+                .register_family(family_name, regular, bold, italic, bold_italic)?;
+        } else {
+            tt_warning!(
+                common.status,
+                "end of font-family definition block that didn't start"
+            );
+        }
+
+        Ok(())
+    }
+
+    // "Font family tag associations", telling us which font family is the
+    // default depending on which tag we're in. For instance, typical templates
+    // will default to the monospace font inside `<code>` tags.
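As an editorial aside: the `tdux:setTemplateVariable` handling above relies on `str::split_once`, so only the text up to the first space is the variable name and the rest (spaces included) is the value. A standalone sketch, with a function name invented for this illustration (it is not part of the crate):

```rust
// Minimal re-creation of the payload parsing in
// `handle_set_template_variable`: split on the first space; a payload
// with no space at all is malformed and yields None (the warning path).
fn parse_set_template_variable(remainder: &str) -> Option<(&str, &str)> {
    remainder.split_once(' ')
}

fn main() {
    // Everything after the first space belongs to the value.
    assert_eq!(
        parse_set_template_variable("title My Fancy Document"),
        Some(("title", "My Fancy Document"))
    );

    // No space: malformed, would trigger the tt_warning! branch.
    assert_eq!(parse_set_template_variable("title"), None);

    println!("ok");
}
```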
+ + fn handle_start_font_family_tag_associations(&mut self) -> Result<()> { + self.cur_font_family_tag_associations = Some(FontFamilyTagAssociator::default()); + Ok(()) + } + + fn handle_end_font_family_tag_associations(&mut self, common: &mut Common) -> Result<()> { + if let Some(mut a) = self.cur_font_family_tag_associations.take() { + for (k, v) in a.assoc.drain() { + self.tag_associations.insert(k, v); + } + } else { + tt_warning!( + common.status, + "end of font-family tag-association block that didn't start" + ); + } + + Ok(()) + } + + /// In the initialization state, this should only get called if we're in a + /// font-family definition (in which case we're using the contents to learn + /// the definition of a font family). Otherwise, the higher-level callback + /// will declare initialization done and move to the emitting state. + pub(crate) fn handle_text_and_glyphs( + &mut self, + font_num: TexFontNum, + text: &str, + _glyphs: &[u16], + _xs: &[i32], + _ys: &[i32], + common: &mut Common, + ) -> Result<()> { + if let Some(b) = self.cur_font_family_definition.as_mut() { + if text.starts_with("bold-italic") { + b.bold_italic = Some(font_num); + } else if text.starts_with("bold") { + b.bold = Some(font_num); + } else if text.starts_with("italic") { + b.italic = Some(font_num); + } else { + b.regular = Some(font_num); + b.family_name = if let Some(fname) = text.strip_prefix("family-name:") { + fname.to_owned() + } else { + format!("tdux{font_num}") + }; + + // Say that the "regular" font of the first font family definition + // is the main body font. 
+                if self.main_body_font_num.is_none() {
+                    self.main_body_font_num = Some(font_num);
+                }
+            }
+        } else if let Some(a) = self.cur_font_family_tag_associations.as_mut() {
+            for tagname in text.split_whitespace() {
+                let el: Element = tagname.parse().unwrap();
+                a.assoc.insert(el, font_num);
+            }
+        } else {
+            // This shouldn't happen; the top-level processor should exit init
+            // phase if it's invoked and none of the above cases hold.
+            tt_warning!(
+                common.status,
+                "internal bug; losing text `{}` in initialization phase",
+                text
+            );
+        }
+
+        Ok(())
+    }
+
+    pub(crate) fn initialization_finished(mut self, common: &mut Common) -> Result<EmittingState> {
+        // If we have precomputed assets, now is the time to confirm that the
+        // fonts defined in this run are a subset of those in the precomputed
+        // session, and copy over variant-glyph definitions to be used during
+        // the bulk processing.
+
+        if let Some(precomputed) = common.precomputed_assets.as_ref() {
+            precomputed.check_runtime_fonts(&mut self.fonts, common)?;
+        }
+
+        let mut context = tera::Context::default();
+
+        // Tera requires that we give it a filesystem path to look for
+        // templates, even if we're going to be adding all of our templates
+        // later. So I guess we have to create an empty tempdir.
+
+        let tempdir = atry!(
+            tempfile::Builder::new().prefix("tectonic_tera_workaround").tempdir();
+            ["couldn't create empty temporary directory for Tera"]
+        );
+
+        let mut p = PathBuf::from(tempdir.path());
+        p.push("*");
+
+        let p = a_ok_or!(
+            p.to_str();
+            ["couldn't convert Tera temporary directory name to UTF8 as required"]
+        );
+
+        let mut tera = atry!(
+            tera::Tera::parse(p);
+            ["couldn't initialize Tera templating engine in temporary directory `{}`", p]
+        );
+
+        atry!(
+            tera.add_raw_templates(self.templates.iter());
+            ["couldn't compile Tera templates"]
+        );
+
+        // Other context initialization, with the possibility of overriding
+        // stuff that's been set up earlier.
+
+        for (varname, varvalue) in self.variables {
+            context.insert(varname, &varvalue);
+        }
+
+        let templating = Templating::new(
+            tera,
+            context,
+            self.next_template_path,
+            self.next_output_path,
+        );
+
+        // Ready to hand off.
+
+        EmittingState::new_from_init(
+            self.fonts,
+            self.main_body_font_num,
+            templating,
+            self.tag_associations,
+        )
+    }
+}
+
+#[derive(Debug, Default)]
+struct FontFamilyBuilder {
+    family_name: String,
+    regular: Option<TexFontNum>,
+    bold: Option<TexFontNum>,
+    italic: Option<TexFontNum>,
+    bold_italic: Option<TexFontNum>,
+}
+
+#[derive(Debug, Default)]
+struct FontFamilyTagAssociator {
+    assoc: HashMap<Element, TexFontNum>,
+}
diff --git a/crates/engine_spx2html/src/lib.rs b/crates/engine_spx2html/src/lib.rs
index 0baf0da59b..eee3dc9133 100644
--- a/crates/engine_spx2html/src/lib.rs
+++ b/crates/engine_spx2html/src/lib.rs
@@ -8,52 +8,156 @@
 //! SPX is essentially the same thing as XDV, but we identify it differently to
 //! mark that the semantics of the content wil be set up for HTML output.
 
-use percent_encoding::{utf8_percent_encode, CONTROLS};
-use std::{
-    collections::HashMap,
-    fmt::Write as FmtWrite,
-    fs::File,
-    io::{Read, Write},
-    path::{Path, PathBuf},
-};
+use std::path::{Path, PathBuf};
 use tectonic_bridge_core::DriverHooks;
 use tectonic_errors::prelude::*;
-use tectonic_io_base::OpenResult;
-use tectonic_status_base::{tt_warning, StatusBackend};
+use tectonic_status_base::StatusBackend;
 use tectonic_xdv::{FileType, XdvEvents, XdvParser};
 
-use crate::font::{FontData, MapEntry};
-
-mod font;
+mod assets;
+mod emission;
+mod finalization;
+mod fontfile;
+mod fonts;
 mod html;
+mod initialization;
+mod specials;
+mod templating;
 
-use html::Element;
+use self::{
+    assets::Assets, emission::EmittingState, finalization::FinalizingState, fonts::FontEnsemble,
+    initialization::InitializationState, specials::Special,
+};
 
 /// An engine that converts SPX to HTML.
-#[derive(Default)]
-pub struct Spx2HtmlEngine {}
+#[derive(Debug, Default)]
+pub struct Spx2HtmlEngine {
+    output: OutputState,
+    precomputed_assets: Option<AssetSpecification>,
+    assets_spec_path: Option<String>,
+    do_not_emit_assets: bool,
+}
+
+#[derive(Debug, Default)]
+enum OutputState {
+    #[default]
+    Undefined,
+    NoOutput,
+    Path(PathBuf),
+}
 
 impl Spx2HtmlEngine {
-    /// Process SPX into HTML.
+    /// Emit an asset specification file and not actual assets.
+    ///
+    /// "Assets" are files like fonts and images that accompany the HTML output
+    /// generated during processing. SPX files contain commands that implicitly
+    /// and explicitly create assets. By default, these are emitted during
+    /// processing. If this method is called, the assets will *not* be created,
+    /// as if you called [`Self::do_not_emit_assets`]. Instead, an "asset
+    /// specification" file will be emitted to the given output path. This
+    /// specification file contains the information needed to generate the
+    /// assets upon a later invocation. Asset specification files can be merged,
+    /// allowing the results of multiple separate TeX compilations to be
+    /// synthesized into one HTML output tree.
+    ///
+    /// Currently, the asset specification is written in JSON format, although
+    /// it is not guaranteed that this will always be the case. It will always
+    /// be a UTF8-encoded, line-oriented textual format, though.
+    pub fn assets_spec_path<S: ToString>(&mut self, path: S) -> &mut Self {
+        self.assets_spec_path = Some(path.to_string());
+        self
+    }
+
+    /// Specify that this session should use a precomputed asset specification.
+    ///
+    /// If this function is used, subsequent runs will generate HTML outputs
+    /// assuming the information given in the asset specification. If the input
+    /// calls for new assets or different options inconsistent with the
+    /// specification, processing will abort with an error.
+    ///
+    /// The purpose of this mode is to allow for a unified set of assets to be
+    /// created from multiple independent runs of the SPX-to-HTML stage. First,
+    /// the different inputs should be processed independently, and their
+    /// individual assets should be saved. These should then be merged. Then the
+    /// inputs should be reprocessed, all using the merged asset specification.
+    /// In one — but only one — of these sessions, the assets should actually be
+    /// emitted.
+    pub fn precomputed_assets(&mut self, assets: AssetSpecification) -> &mut Self {
+        self.precomputed_assets = Some(assets);
+        self
+    }
+
+    /// Specify that templated output files should not actually be created.
+    ///
+    /// You probably want this engine to actually write its outputs to the
+    /// filesystem. If you call this function, it will not. This mode can be
+    /// useful if the main purpose of the processing run is to gather
+    /// information about the assets that will be generated.
+    pub fn do_not_emit_files(&mut self) -> &mut Self {
+        self.output = OutputState::NoOutput;
+        self
+    }
+
+    /// Specify that supporting "asset" files should not actually be created.
+    ///
+    /// You probably want this engine to actually write these assets to the
+    /// filesystem. If you call this function, it will not. This mode can be
+    /// useful if the main purpose of the processing run is to gather
+    /// information about the assets that will be generated.
+    ///
+    /// Calling [`Self::assets_spec_path`] has the same effect as this function,
+    /// but also causes an asset specification file to be written to
+    /// Tectonic's virtual I/O backend.
+    pub fn do_not_emit_assets(&mut self) -> &mut Self {
+        self.do_not_emit_assets = true;
+        self
+    }
+
+    /// Specify the root path for output files.
     ///
     /// Because this driver will, in the generic case, produce a tree of HTML
     /// output files that are not going to be used as a basis for any subsequent
-    /// engine stages, it outputs directly to disk (via `out_base`) rather than
-    /// using the I/O layer. I don't like hardcoding use of the filesystem, but
-    /// I don't want to build up some extra abstraction layer right now.
+    /// engine stages, it outputs directly to disk rather than using the I/O
+    /// layer. I don't like hardcoding use of the filesystem, but I don't want
+    /// to build up some extra abstraction layer right now.
+    pub fn output_base(&mut self, out_base: impl Into<PathBuf>) -> &mut Self {
+        self.output = OutputState::Path(out_base.into());
+        self
+    }
+
+    /// Process SPX into HTML.
+    ///
+    /// Before calling this function, you must explicitly specify the output
+    /// mode by calling either [`Self::do_not_emit_files`] or
+    /// [`Self::output_base`]. If you do not, this function will panic.
     pub fn process_to_filesystem(
         &mut self,
         hooks: &mut dyn DriverHooks,
         status: &mut dyn StatusBackend,
         spx: &str,
-        out_base: &Path,
     ) -> Result<()> {
         let mut input = hooks.io().input_open_name(spx, status).must_exist()?;
 
+        let out_base = match self.output {
+            OutputState::NoOutput => None,
+            OutputState::Path(ref p) => Some(p.as_ref()),
+            OutputState::Undefined => panic!("spx2html output mode not specified"),
+        };
+
         {
-            let state = EngineState::new(hooks, status, out_base);
+            let state = EngineState::new(hooks, status, out_base, self.precomputed_assets.as_ref());
             let state = XdvParser::process_with_seeks(&mut input, state)?;
-            state.finished()?;
+            let (fonts, assets, mut common) = state.finished()?;
+
+            if let Some(asp) = self.assets_spec_path.as_ref() {
+                let ser = assets.into_serialize(fonts);
+                let mut output = hooks.io().output_open_name(asp).must_exist()?;
+                serde_json::to_writer_pretty(&mut output, &ser)?;
+                let (name, digest) = output.into_name_digest();
+                hooks.event_output_closed(name, digest, status);
+            } 
else if !self.do_not_emit_assets { + assets.emit(fonts, &mut common)?; + } } let (name, digest_opt) = input.into_name_digest(); @@ -62,6 +166,8 @@ impl Spx2HtmlEngine { } } +pub use assets::AssetSpecification; + struct EngineState<'a> { common: Common<'a>, state: State, @@ -70,20 +176,23 @@ struct EngineState<'a> { struct Common<'a> { hooks: &'a mut dyn DriverHooks, status: &'a mut dyn StatusBackend, - out_base: &'a Path, + out_base: Option<&'a Path>, + precomputed_assets: Option<&'a AssetSpecification>, } impl<'a> EngineState<'a> { pub fn new( hooks: &'a mut dyn DriverHooks, status: &'a mut dyn StatusBackend, - out_base: &'a Path, + out_base: Option<&'a Path>, + precomputed_assets: Option<&'a AssetSpecification>, ) -> Self { Self { common: Common { hooks, status, out_base, + precomputed_assets, }, state: State::Initializing(InitializationState::default()), } @@ -97,17 +206,27 @@ enum State { Invalid, Initializing(InitializationState), Emitting(EmittingState), + Finalizing(FinalizingState), } impl<'a> EngineState<'a> { - pub fn finished(mut self) -> Result<()> { - if let State::Emitting(mut s) = self.state { - if !s.current_content.is_empty() { - s.finish_file(&mut self.common)?; + pub fn finished(mut self) -> Result<(FontEnsemble, Assets, Common<'a>)> { + self.state.ensure_finalizing(&mut self.common)?; + + if let State::Finalizing(s) = self.state { + let (fonts, mut assets) = s.finished(); + + // If we have precomputed assets, make sure that this run didn't + // define anything surprising, and sync up the runtime manifest with + // the precomputed one so that we emit everything if needed. 
+ if let Some(precomputed) = self.common.precomputed_assets { + precomputed.check_runtime_assets(&mut assets)?; } - } - Ok(()) + Ok((fonts, assets, self.common)) + } else { + panic!("invalid spx2html finalization state leaked"); + } } /// Return true if we're in the initializing phase, but not in the midst of @@ -116,12 +235,8 @@ impl<'a> EngineState<'a> { /// content, we should end the initialization phase. fn in_endable_init(&self) -> bool { match &self.state { - State::Invalid => false, - State::Initializing(s) => { - s.cur_font_family_definition.is_none() - && s.cur_font_family_tag_associations.is_none() - } - State::Emitting(_) => false, + State::Initializing(s) => s.in_endable_init(), + _ => false, } } } @@ -140,46 +255,36 @@ impl<'a> XdvEvents for EngineState<'a> { fn handle_special(&mut self, x: i32, y: i32, contents: &[u8]) -> Result<()> { let contents = atry!(std::str::from_utf8(contents); ["could not parse \\special as UTF-8"]); - // str.split_once() would be nice but it was introduced in 1.52 which is - // a bit recent for us. - - let mut pieces = contents.splitn(2, ' '); - - let (tdux_command, remainder) = if let Some(p) = pieces.next() { - if let Some(cmd) = p.strip_prefix("tdux:") { - (Some(cmd), pieces.next().unwrap_or_default()) - } else { - (None, contents) - } - } else { - (None, contents) + let special = match Special::parse(contents, self.common.status) { + Some(s) => s, + None => return Ok(()), }; // Might we need to end the initialization phase? - if self.in_endable_init() { - let end_init = matches!( - tdux_command.unwrap_or("none"), - "emit" | "provideFile" | "asp" | "aep" | "cs" | "ce" | "mfs" | "me" | "dt" - ); + if self.in_endable_init() && special.ends_initialization() { + self.state.ensure_initialized(&mut self.common)?; + } - if end_init { - self.state.ensure_initialized()?; - } + // Might we be entering the finalization phase? 
+ + if let Special::ContentFinished = special { + return self.state.ensure_finalizing(&mut self.common); } // Ready to dispatch. match &mut self.state { State::Invalid => panic!("invalid spx2html state leaked"), - State::Initializing(s) => s.handle_special(tdux_command, remainder, &mut self.common), - State::Emitting(s) => s.handle_special(x, y, tdux_command, remainder, &mut self.common), + State::Initializing(s) => s.handle_special(special, &mut self.common), + State::Emitting(s) => s.handle_special(x, y, special, &mut self.common), + State::Finalizing(s) => s.handle_special(special, &mut self.common), } } fn handle_text_and_glyphs( &mut self, - font_num: FontNum, + font_num: TexFontNum, text: &str, _width: i32, glyphs: &[u16], @@ -187,7 +292,7 @@ impl<'a> XdvEvents for EngineState<'a> { y: &[i32], ) -> Result<()> { if self.in_endable_init() { - self.state.ensure_initialized()?; + self.state.ensure_initialized(&mut self.common)?; } match &mut self.state { @@ -198,6 +303,7 @@ impl<'a> XdvEvents for EngineState<'a> { State::Emitting(s) => { s.handle_text_and_glyphs(font_num, text, glyphs, x, y, &mut self.common)? 
                }
+            State::Finalizing(s) => s.handle_text_and_glyphs(text, &mut self.common)?,
         }
 
         Ok(())
@@ -206,7 +312,7 @@ impl<'a> XdvEvents for EngineState<'a> {
     fn handle_define_native_font(
         &mut self,
         name: &str,
-        font_num: FontNum,
+        font_num: TexFontNum,
         size: i32,
         face_index: u32,
         color_rgba: Option<u32>,
@@ -233,2092 +339,59 @@ impl<'a> XdvEvents for EngineState<'a> {
     fn handle_glyph_run(
         &mut self,
-        font_num: FontNum,
+        font_num: TexFontNum,
         glyphs: &[u16],
         x: &[i32],
         y: &[i32],
     ) -> Result<(), Self::Error> {
-        self.state.ensure_initialized()?;
+        self.state.ensure_initialized(&mut self.common)?;
 
         match &mut self.state {
             State::Invalid => panic!("invalid spx2html state leaked"),
             State::Initializing(_) => unreachable!(),
             State::Emitting(s) => s.handle_glyph_run(font_num, glyphs, x, y, &mut self.common),
+            State::Finalizing(s) => s.handle_glyph_run(&mut self.common),
+        }
+    }
+
+    fn handle_rule(&mut self, x: i32, y: i32, height: i32, width: i32) -> Result<(), Self::Error> {
+        self.state.ensure_initialized(&mut self.common)?;
+
+        match &mut self.state {
+            State::Invalid => panic!("invalid spx2html state leaked"),
+            State::Initializing(_) => unreachable!(),
+            State::Emitting(s) => s.handle_rule(x, y, height, width, &mut self.common),
+            State::Finalizing(s) => s.handle_rule(&mut self.common),
+        }
     }
 }
 
 impl State {
-    fn ensure_initialized(&mut self) -> Result<()> {
+    fn ensure_initialized(&mut self, common: &mut Common) -> Result<()> {
         // Is this the least-bad way to do this??
         let mut work = std::mem::replace(self, State::Invalid);
 
         if let State::Initializing(s) = work {
-            work = State::Emitting(s.initialization_finished()?);
+            work = State::Emitting(s.initialization_finished(common)?);
         }
 
         std::mem::swap(self, &mut work);
         Ok(())
     }
-}
-
-#[derive(Debug)]
-struct InitializationState {
-    templates: HashMap<String, String>,
-    next_template_path: String,
-    next_output_path: String,
-
-    // TODO: terrible nomenclature. FontInfo is what we track here; FontData is
-    // the glyph measurements and stuff that we compute in the `font` module.
-    fonts: HashMap<FontNum, FontInfo>,
-    font_data_keys: HashMap<(String, u32), usize>,
-    font_data: HashMap<usize, FontData>,
-    main_body_font_num: Option<FontNum>,
-    /// Keyed by the "regular" font-num
-    font_families: HashMap<FontNum, FontFamily>,
-    tag_associations: HashMap<Element, FontNum>,
-
-    cur_font_family_definition: Option<FontFamilyBuilder>,
-    cur_font_family_tag_associations: Option<FontFamilyTagAssociator>,
-
-    variables: HashMap<String, String>,
-}
-
-impl Default for InitializationState {
-    fn default() -> Self {
-        InitializationState {
-            templates: Default::default(),
-            next_template_path: Default::default(),
-            next_output_path: "index.html".to_owned(),
-
-            fonts: Default::default(),
-            font_data_keys: Default::default(),
-            font_data: Default::default(),
-            main_body_font_num: None,
-            font_families: Default::default(),
-            tag_associations: Default::default(),
-
-            cur_font_family_definition: None,
-            cur_font_family_tag_associations: None,
-
-            variables: Default::default(),
-        }
-    }
-}
-
-impl InitializationState {
-    #[allow(clippy::too_many_arguments)]
-    fn handle_define_native_font(
-        &mut self,
-        name: &str,
-        font_num: FontNum,
-        size: FixedPoint,
-        face_index: u32,
-        color_rgba: Option<u32>,
-        extend: Option<FixedPoint>,
-        slant: Option<FixedPoint>,
-        embolden: Option<FixedPoint>,
-        common: &mut Common,
-    ) -> Result<()> {
-        if self.fonts.contains_key(&font_num) {
-            // Should we override the definition or something?
-            return Ok(());
-        }
-
-        // TODO: often there are multiple font_nums with the same "name". We
-        // only need to copy the file once.
- - let io = common.hooks.io(); - let mut texpath = String::default(); - let mut ih = None; - - for ext in &["", ".otf"] { - texpath = format!("{}{}", name, ext); - - match io.input_open_name(&texpath, common.status) { - OpenResult::Ok(h) => { - ih = Some(h); - break; - } - - OpenResult::NotAvailable => continue, - - OpenResult::Err(e) => return Err(e), - }; - } - - let mut ih = a_ok_or!(ih; - ["failed to find a font file associated with the name `{}`", name] - ); - - let mut contents = Vec::new(); - atry!( - ih.read_to_end(&mut contents); - ["unable to read input font file `{}`", &texpath] - ); - let (name, digest_opt) = ih.into_name_digest(); - common - .hooks - .event_input_closed(name.clone(), digest_opt, common.status); - - let mut out_path = common.out_base.to_owned(); - let basename = texpath.rsplit('/').next().unwrap(); - out_path.push(basename); - - { - let mut out_file = atry!( - File::create(&out_path); - ["cannot open output file `{}`", out_path.display()] - ); - - atry!( - out_file.write_all(&contents); - ["cannot write output file `{}`", out_path.display()] - ); - } - - let fd_key = (name, face_index); - let next_id = self.font_data_keys.len(); - let fd_key = *self.font_data_keys.entry(fd_key).or_insert(next_id); - - if fd_key == next_id { - let map = atry!( - FontData::from_opentype(basename.to_owned(), contents, face_index); - ["unable to load glyph data from font `{}`", texpath] - ); - self.font_data.insert(fd_key, map); - } - - let info = FontInfo { - rel_url: utf8_percent_encode(basename, CONTROLS).to_string(), - family_name: format!("tdux{}", font_num), - family_relation: FamilyRelativeFontId::Regular, - fd_key, - size, - face_index, - color_rgba, - extend, - slant, - embolden, - }; - - self.fonts.insert(font_num, info); - Ok(()) - } - - fn handle_special( - &mut self, - tdux_command: Option<&str>, - remainder: &str, - common: &mut Common, - ) -> Result<()> { - if let Some(cmd) = tdux_command { - match cmd { - "addTemplate" => 
self.handle_add_template(remainder, common), - "setTemplate" => self.handle_set_template(remainder, common), - "setOutputPath" => self.handle_set_output_path(remainder, common), - "setTemplateVariable" => self.handle_set_template_variable(remainder, common), - - "startDefineFontFamily" => self.handle_start_define_font_family(), - "endDefineFontFamily" => self.handle_end_define_font_family(common), - - "startFontFamilyTagAssociations" => { - self.handle_start_font_family_tag_associations() - } - - "endFontFamilyTagAssociations" => { - self.handle_end_font_family_tag_associations(common) - } - - "provideFile" => { - tt_warning!(common.status, "ignoring too-soon tdux:provideFile special"); - Ok(()) - } - - _ => Ok(()), - } - } else { - Ok(()) - } - } - - fn handle_add_template(&mut self, texpath: &str, common: &mut Common) -> Result<()> { - let mut ih = atry!( - common.hooks.io().input_open_name(texpath, common.status).must_exist(); - ["unable to open input HTML template `{}`", texpath] - ); - - let mut contents = String::new(); - atry!( - ih.read_to_string(&mut contents); - ["unable to read input HTML template `{}`", texpath] - ); - - self.templates.insert(texpath.to_owned(), contents); - - let (name, digest_opt) = ih.into_name_digest(); - common - .hooks - .event_input_closed(name, digest_opt, common.status); - Ok(()) - } - - fn handle_set_template(&mut self, texpath: &str, _common: &mut Common) -> Result<()> { - self.next_template_path = texpath.to_owned(); - Ok(()) - } - - fn handle_set_output_path(&mut self, texpath: &str, _common: &mut Common) -> Result<()> { - self.next_output_path = texpath.to_owned(); - Ok(()) - } - - fn handle_set_template_variable(&mut self, remainder: &str, common: &mut Common) -> Result<()> { - if let Some((varname, varval)) = remainder.split_once(' ') { - self.variables.insert(varname.to_owned(), varval.to_owned()); - } else { - tt_warning!( - common.status, - "ignoring malformatted tdux:setTemplateVariable special `{}`", - remainder - 
); - } - - Ok(()) - } - - // "Font family" definitions, allowing us to synthesize bold/italic tags - // based on tracking font changes, and also to know what the main body font - // is. - - fn handle_start_define_font_family(&mut self) -> Result<()> { - self.cur_font_family_definition = Some(FontFamilyBuilder::default()); - Ok(()) - } - - fn handle_end_define_font_family(&mut self, common: &mut Common) -> Result<()> { - if let Some(b) = self.cur_font_family_definition.take() { - let family_name = b.family_name; - let regular = a_ok_or!(b.regular; ["no regular face defined"]); - let bold = a_ok_or!(b.bold; ["no bold face defined"]); - let italic = a_ok_or!(b.italic; ["no italic face defined"]); - let bold_italic = a_ok_or!(b.bold_italic; ["no bold-italic face defined"]); - - self.font_families.insert( - regular, - FontFamily { - regular, - bold, - italic, - bold_italic, - }, - ); - - // Now update the info records for the relevant fonts to capture the - // established relationship. - - if let Some(info) = self.fonts.get_mut(®ular) { - info.family_name = family_name.clone(); - info.family_relation = FamilyRelativeFontId::Regular; - } - - if let Some(info) = self.fonts.get_mut(&bold) { - info.family_name = family_name.clone(); - info.family_relation = FamilyRelativeFontId::Bold; - } - - if let Some(info) = self.fonts.get_mut(&italic) { - info.family_name = family_name.clone(); - info.family_relation = FamilyRelativeFontId::Italic; - } - - if let Some(info) = self.fonts.get_mut(&bold_italic) { - info.family_name = family_name; - info.family_relation = FamilyRelativeFontId::BoldItalic; - } - } else { - tt_warning!( - common.status, - "end of font-family definition block that didn't start" - ); - } - - Ok(()) - } - - // "Font family tag associations", telling us which font family is the - // default depending on which tag we're in. For instance, typical templates - // will default to the monospace font inside `` tags. 
- - fn handle_start_font_family_tag_associations(&mut self) -> Result<()> { - self.cur_font_family_tag_associations = Some(FontFamilyTagAssociator::default()); - Ok(()) - } - - fn handle_end_font_family_tag_associations(&mut self, common: &mut Common) -> Result<()> { - if let Some(mut a) = self.cur_font_family_tag_associations.take() { - for (k, v) in a.assoc.drain() { - self.tag_associations.insert(k, v); - } - } else { - tt_warning!( - common.status, - "end of font-family tag-association block that didn't start" - ); - } - Ok(()) - } + fn ensure_finalizing(&mut self, common: &mut Common) -> Result<()> { + self.ensure_initialized(common)?; - /// In the initialization state, this should only get called if we're in a - /// font-family definition (in which case we're using the contents to learn - /// the definition of a font family). Otherwise, the higher-level callback - /// will declare initialization done and move to the emitting state. - fn handle_text_and_glyphs( - &mut self, - font_num: FontNum, - text: &str, - _glyphs: &[u16], - _xs: &[i32], - _ys: &[i32], - common: &mut Common, - ) -> Result<()> { - if let Some(b) = self.cur_font_family_definition.as_mut() { - if text.starts_with("bold-italic") { - b.bold_italic = Some(font_num); - } else if text.starts_with("bold") { - b.bold = Some(font_num); - } else if text.starts_with("italic") { - b.italic = Some(font_num); - } else { - b.regular = Some(font_num); - b.family_name = if let Some(fname) = text.strip_prefix("family-name:") { - fname.to_owned() - } else { - format!("tdux{}", font_num) - }; + let mut work = std::mem::replace(self, State::Invalid); - // Say that the "regular" font of the first font family definition - // is the main body font. 
- if self.main_body_font_num.is_none() { - self.main_body_font_num = Some(font_num); - } - } - } else if let Some(a) = self.cur_font_family_tag_associations.as_mut() { - for tagname in text.split_whitespace() { - let el: Element = tagname.parse().unwrap(); - a.assoc.insert(el, font_num); - } - } else { - // This shouldn't happen; the top-level processor should exit init - // phase if it's invoked and none of the above cases hold. - tt_warning!( - common.status, - "internal bug; losing text `{}` in initialization phase", - text - ); + if let State::Emitting(s) = work { + work = State::Finalizing(s.emission_finished(common)?); } - Ok(()) - } - - fn initialization_finished(self) -> Result { - let mut context = tera::Context::default(); - - // Set up font stuff. - - let rems_per_tex = if let Some(fnum) = self.main_body_font_num { - let info = self.fonts.get(&fnum).unwrap(); - 1.0 / (info.size as f32) - } else { - 1. / 65536. - }; - - // Tera requires that we give it a filesystem path to look for - // templates, even if we're going to be adding all of our templates - // later. So I guess we have to create an empty tempdir. - - let tempdir = atry!( - tempfile::Builder::new().prefix("tectonic_tera_workaround").tempdir(); - ["couldn't create empty temporary directory for Tera"] - ); - - let mut p = PathBuf::from(tempdir.path()); - p.push("*"); - - let p = a_ok_or!( - p.to_str(); - ["couldn't convert Tera temporary directory name to UTF8 as required"] - ); - - let mut tera = atry!( - tera::Tera::parse(p); - ["couldn't initialize Tera templating engine in temporary directory `{}`", p] - ); - - atry!( - tera.add_raw_templates(self.templates.iter()); - ["couldn't compile Tera templates"] - ); - - // Other context initialization, with the possibility of overriding - // stuff that's been set up earlier. - - for (varname, varvalue) in self.variables { - context.insert(varname, &varvalue); - } - - // All done! 
-
- Ok(EmittingState {
- tera,
- context,
- fonts: self.fonts,
- font_families: self.font_families,
- tag_associations: self.tag_associations,
- rems_per_tex,
- font_data: self.font_data,
- next_template_path: self.next_template_path,
- next_output_path: self.next_output_path,
- current_content: String::default(),
- elem_stack: vec![ElementState {
- elem: None,
- origin: ElementOrigin::Root,
- do_auto_tags: true,
- do_auto_spaces: true,
- font_family_id: self.main_body_font_num.unwrap_or_default(),
- active_font: FamilyRelativeFontId::Regular,
- }],
- current_canvas: None,
- content_finished: false,
- content_finished_warning_issued: false,
- last_content_x: 0,
- last_content_space_width: None,
- })
- }
-}
-
-#[derive(Debug)]
-struct EmittingState {
- tera: tera::Tera,
- context: tera::Context,
- fonts: HashMap<FontNum, FontInfo>,
-
- /// Keyed by the "regular" font
- font_families: HashMap<FontNum, FontFamily>,
- tag_associations: HashMap<Element, FontNum>,
-
- rems_per_tex: f32,
- font_data: HashMap<usize, FontData>,
- next_template_path: String,
- next_output_path: String,
- current_content: String,
- elem_stack: Vec<ElementState>,
- current_canvas: Option<CanvasState>,
- content_finished: bool,
- content_finished_warning_issued: bool,
- last_content_x: i32,
- last_content_space_width: Option<FixedPoint>,
-}
-
-#[derive(Debug)]
-struct ElementState {
- /// The associated HTML element. This is None for the bottom item in the
- /// stack, or for changes in state that are not associated with actual HTML
- /// tags.
- elem: Option<Element>,
-
- /// The origin of this element/state-change.
- origin: ElementOrigin,
-
- /// Whether HTML tags that are automatically generated by the TeX
- /// engine, such as <p> and </p>
    at the start and end of paragraphs, - /// should be emitted (true) or ignored (false). - do_auto_tags: bool, - - /// Whether this library should automatically insert spaces into text - /// content. This is done by looking at the horizontal positions of - /// different runs of text and applying a threshold for the amount of space - /// between the end of the previous one and the start of the next one. - do_auto_spaces: bool, - - /// The font-num of the regular font associated with the current font - /// family. This code is currently only exercised with a single "font - /// family" defined in a document, but there could be multiple. - font_family_id: FontNum, - - /// The currently active font, as we understand it, relative to the - /// currently active font family. - active_font: FamilyRelativeFontId, -} - -impl ElementState { - /// Should this element automatically be closed if a new tag starts or ends? - fn is_auto_close(&self) -> bool { - matches!(self.origin, ElementOrigin::FontAuto) - } -} - -/// How a particular ElementState ended up on the stack. -#[derive(Clone, Copy, Debug, Eq, PartialEq)] -enum ElementOrigin { - /// This is the root element in our stack. - Root, - - /// The element was manually inserted by the TeX code. - Manual, - - /// The element was automatically inserted by the TeX engine. - EngineAuto, - - /// The element was automatically inserted by us to - /// activate the desired font. 
- FontAuto, -} - -#[derive(Debug)] -struct CanvasState { - kind: String, - depth: usize, - x0: i32, - y0: i32, - glyphs: Vec, -} - -impl CanvasState { - fn new(kind: &str, x0: i32, y0: i32) -> Self { - CanvasState { - kind: kind.to_owned(), - depth: 1, - x0, - y0, - glyphs: Vec::new(), - } - } -} - -#[derive(Debug)] -struct GlyphInfo { - dx: i32, - dy: i32, - font_num: FontNum, - glyph: u16, -} - -impl EmittingState { - fn warn_finished_content(&mut self, detail: &str, common: &mut Common) { - if !self.content_finished_warning_issued { - tt_warning!(common.status, "dropping post-finish content ({})", detail); - self.content_finished_warning_issued = true; - } - } - - fn create_elem(&self, name: &str, is_start: bool, common: &mut Common) -> Element { - // Parsing can never fail since we offer an `Other` element type - let el: html::Element = name.parse().unwrap(); - - if el.is_deprecated() { - tt_warning!( - common.status, - "HTML element `{}` is deprecated; templates should be updated to avoid it", - name - ); - } - - if is_start && el.is_empty() { - tt_warning!( - common.status, - "HTML element `{}` is an empty element; insert it with `tdux:mfe`, not as a start-tag", - name - ); - } - - if let Some(cur) = self.cur_elstate().elem.as_ref() { - if cur.is_autoclosed_by(&el) { - tt_warning!( - common.status, - "currently open HTML element `{}` will be implicitly closed by new \ - element `{}`; explicit closing tags are strongly encouraged", - cur.name(), - name - ); - } - } - - el - } - - #[inline(always)] - fn cur_elstate(&self) -> &ElementState { - self.elem_stack.last().unwrap() - } - - /// Close the topmost element in the stack. 
- fn close_one(&mut self) { - // Refuse the close the root element - if self.elem_stack.len() > 1 { - let cur = self.elem_stack.pop().unwrap(); - - if let Some(e) = cur.elem.as_ref() { - self.current_content.push('<'); - self.current_content.push('/'); - self.current_content.push_str(e.name()); - self.current_content.push('>'); - } - } - } - - /// Close an auto-close elements that are currently at the top of the stack. - /// These elements are things like tags that were automatically - /// generated with the detection of the use of the bold font face. - fn close_automatics(&mut self) { - while self.elem_stack.len() > 1 { - let close_it = self.cur_elstate().is_auto_close(); - - if close_it { - self.close_one(); - } else { - break; - } - } - } - - fn push_elem(&mut self, el: Element, origin: ElementOrigin) { - self.close_automatics(); - - let new_item = { - let cur = self.cur_elstate(); - - let font_family_id = self - .tag_associations - .get(&el) - .copied() - .unwrap_or(cur.font_family_id); - - ElementState { - elem: Some(el), - origin, - font_family_id, - ..*cur - } - }; - - self.elem_stack.push(new_item); - } - - /// TODO: may need to hone semantics when element nesting isn't as expected. 
- fn pop_elem(&mut self, name: &str, common: &mut Common) { - self.close_automatics(); - - let mut n_closed = 0; - - while self.elem_stack.len() > 1 { - let cur = self.elem_stack.pop().unwrap(); - - if let Some(e) = cur.elem.as_ref() { - self.current_content.push('<'); - self.current_content.push('/'); - self.current_content.push_str(e.name()); - self.current_content.push('>'); - n_closed += 1; - - if e.name() == name { - break; - } - } - } - - if n_closed != 1 { - tt_warning!( - common.status, - "imbalanced tags; had to close {} to find `{}`", - n_closed, - name - ); - } - } - - fn maybe_get_font_space_width(&self, font_num: Option) -> Option { - font_num.and_then(|fnum| { - if let Some(fi) = self.fonts.get(&fnum) { - let fd = self.font_data.get(&fi.fd_key).unwrap(); - fd.space_width(fi.size) - } else { - None - } - }) - } - - /// Figure out if we need to push a space into the text content right now. - fn is_space_needed(&self, x0: i32, cur_font_num: Option) -> bool { - // We never want a leading space. - if self.current_content.is_empty() { - return false; - } - - // Auto-spaces can be disabled. - if !self.cur_elstate().do_auto_spaces { - return false; - } - - // TODO: RTL ASSUMPTION!!!!! - // - // If the "next" x is smaller than the last one, assume that we've - // started a new line. We ignore Y values since those are going to - // get hairy with subscripts, etc. - - if x0 < self.last_content_x { - return true; - } - - // Check the advance against the size of the space, which can be - // determined from either the most recent content or the new content, - // since in various circumstances either one or the other might not - // be defined. If both are defined, use whatever's smaller. There's - // probably a smoother way to do this logic? 
- - let cur_space_width = self.maybe_get_font_space_width(cur_font_num); - - let space_width = match (&self.last_content_space_width, &cur_space_width) { - (Some(w1), Some(w2)) => FixedPoint::min(*w1, *w2), - (Some(w), None) => *w, - (None, Some(w)) => *w, - (None, None) => 0, - }; - - // If the x difference is larger than 1/4 of the space_width, let's say that - // we need a space. I made up the 1/4. - 4 * (x0 - self.last_content_x) > space_width - } - - fn update_content_pos(&mut self, x: i32, font_num: Option) { - self.last_content_x = x; - - let cur_space_width = self.maybe_get_font_space_width(font_num); - if cur_space_width.is_some() { - self.last_content_space_width = cur_space_width; - } - } - - /// Maybe push a space into the text content right now, if we think we need one. - fn push_space_if_needed(&mut self, x0: i32, cur_font_num: Option) { - if self.is_space_needed(x0, cur_font_num) { - self.current_content.push(' '); - } - - // This parameter should be updated almost-instantaneously - // if a run of glyphs is being rendered, but this is a good start: - self.update_content_pos(x0, cur_font_num); - } - - fn handle_special( - &mut self, - x: i32, - y: i32, - tdux_command: Option<&str>, - remainder: &str, - common: &mut Common, - ) -> Result<()> { - if let Some(cmd) = tdux_command { - match cmd { - "asp" => { - if self.content_finished { - self.warn_finished_content("auto start paragraph", common); - } else if self.cur_elstate().do_auto_tags { - // Why are we using
<div>s instead of <p>? As the HTML spec
- // emphasizes, <p> tags are structural, not semantic. You cannot
- // put tags like <ul> or <div> inside <p> -- they automatically
- // close the paragraph. This does not align with TeX's idea of a
- // paragraph, and there's no upside to trying to use <p>'s -- as
- // the spec notes, the <p> tag does not activate any important
- // semantics itself. The HTML spec explicitly recommends that
- // you can use <div> elements to group logical paragraphs. So
- // that's what we do.
- let el = self.create_elem("div", true, common);
- self.push_space_if_needed(x, None);
- self.current_content.push_str("<div class=\"tdux-p\">
      "); - self.push_elem(el, ElementOrigin::EngineAuto); - } - Ok(()) - } - - "aep" => { - if self.content_finished { - self.warn_finished_content("auto end paragraph", common); - } else if self.cur_elstate().do_auto_tags { - self.pop_elem("div", common); - } - Ok(()) - } - - "cs" => { - if self.content_finished { - self.warn_finished_content("canvas start", common); - } else if let Some(canvas) = self.current_canvas.as_mut() { - canvas.depth += 1; - } else { - self.current_canvas = Some(CanvasState::new(remainder, x, y)); - } - Ok(()) - } - - "ce" => { - if self.content_finished { - self.warn_finished_content("canvas end", common); - } else if let Some(canvas) = self.current_canvas.as_mut() { - canvas.depth -= 1; - if canvas.depth == 0 { - self.handle_end_canvas(common)?; - } - } else { - tt_warning!( - common.status, - "ignoring unpaired tdux:c[anvas]e[nd] special for `{}`", - remainder - ); - } - Ok(()) - } - - "mfs" => { - if self.content_finished { - self.warn_finished_content( - &format!("manual flexible start tag {:?}", remainder), - common, - ); - Ok(()) - } else { - self.handle_flexible_start_tag(x, y, remainder, common) - } - } - - "me" => { - if self.content_finished { - self.warn_finished_content( - &format!("manual end tag ", remainder), - common, - ); - } else { - self.pop_elem(remainder, common); - } - Ok(()) - } - - "dt" => { - if self.content_finished { - self.warn_finished_content("direct text", common); - } else { - html_escape::encode_safe_to_string(remainder, &mut self.current_content); - } - Ok(()) - } - - "emit" => self.finish_file(common), - - "setTemplate" => { - self.next_template_path = remainder.to_owned(); - Ok(()) - } - - "setOutputPath" => { - self.next_output_path = remainder.to_owned(); - Ok(()) - } - - "setTemplateVariable" => self.handle_set_template_variable(remainder, common), - - "provideFile" => self.handle_provide_file(remainder, common), - - "contentFinished" => self.content_finished(common), - - other => { - tt_warning!( 
- common.status, - "ignoring unrecognized special: tdux:{} {}", - other, - remainder - ); - Ok(()) - } - } - } else { - Ok(()) - } - } - - /// Handle a "flexible" start tag. - /// - /// These start tags are built with a line-oriented structure that aims to - /// make it so that the TeX code doesn't have to worry too much about - /// escaping, etc. The general format is: - /// - /// ```notest - /// \special{tdux:mfs tagname - /// Cclass % add a CSS class - /// Sname value % Add a CSS setting in the style attr - /// Uname value % Add an unquoted attribute - /// Dname value % Add a double-quoted attribute - /// NAS % Turn off automatic space insertion while processing this tag - /// NAT % Turn off automatic tag insertion while processing this tag - /// } - /// ``` - /// - /// More ... - fn handle_flexible_start_tag( - &mut self, - x: i32, - _y: i32, - remainder: &str, - common: &mut Common, - ) -> Result<()> { - let mut lines = remainder.lines(); - - let tagname = match lines.next() { - Some(t) => t, - None => { - tt_warning!( - common.status, - "ignoring TDUX flexible start tag -- no tag name: {:?}", - remainder - ); - return Ok(()); - } - }; - - if !tagname.chars().all(char::is_alphanumeric) { - tt_warning!( - common.status, - "ignoring TDUX flexible start tag -- invalid tag name: {:?}", - remainder - ); - return Ok(()); - } - - let el = self.create_elem(tagname, true, common); - - let mut elstate = { - let cur = self.cur_elstate(); - - let font_family_id = self - .tag_associations - .get(&el) - .copied() - .unwrap_or(cur.font_family_id); - - ElementState { - elem: Some(el), - origin: ElementOrigin::Manual, - font_family_id, - ..*cur - } - }; - - let mut classes = Vec::new(); - let mut styles = Vec::new(); - let mut unquoted_attrs = Vec::new(); - let mut double_quoted_attrs = Vec::new(); - - for line in lines { - if let Some(cls) = line.strip_prefix('C') { - // For later: apply any restrictions to allowed class names? 
- if !cls.is_empty() { - classes.push(cls.to_owned()); - } else { - tt_warning!( - common.status, - "ignoring TDUX flexible start tag class -- invalid name: {:?}", - cls - ); - } - } else if let Some(rest) = line.strip_prefix('S') { - // For later: apply any restrictions to names/values here? - let mut bits = rest.splitn(2, ' '); - let name = match bits.next() { - Some(n) => n, - None => { - tt_warning!( - common.status, - "ignoring TDUX flexible start tag style -- no name: {:?}", - rest - ); - continue; - } - }; - let value = match bits.next() { - Some(v) => v, - None => { - tt_warning!( - common.status, - "ignoring TDUX flexible start tag style -- no value: {:?}", - rest - ); - continue; - } - }; - styles.push((name.to_owned(), value.to_owned())); - } else if let Some(rest) = line.strip_prefix('U') { - // For later: apply any restrictions to names/values here? - let mut bits = rest.splitn(2, ' '); - let name = match bits.next() { - Some("class") | Some("style") => { - tt_warning!( - common.status, - "ignoring TDUX flexible start tag attr -- use C/S command: {:?}", - rest - ); - continue; - } - Some(n) => n, - None => { - tt_warning!( - common.status, - "ignoring TDUX flexible start tag attr -- no name: {:?}", - rest - ); - continue; - } - }; - unquoted_attrs.push((name.to_owned(), bits.next().map(|v| v.to_owned()))); - } else if let Some(rest) = line.strip_prefix('D') { - // For later: apply any restrictions to names/values here? 
- let mut bits = rest.splitn(2, ' '); - let name = match bits.next() { - Some("class") | Some("style") => { - tt_warning!( - common.status, - "ignoring TDUX flexible start tag attr -- use C/S command: {:?}", - rest - ); - continue; - } - Some(n) => n, - None => { - tt_warning!( - common.status, - "ignoring TDUX flexible start tag attr -- no name: {:?}", - rest - ); - continue; - } - }; - double_quoted_attrs.push((name.to_owned(), bits.next().map(|v| v.to_owned()))); - } else if line == "NAS" { - elstate.do_auto_spaces = false; - } else if line == "NAT" { - elstate.do_auto_tags = false; - } else { - tt_warning!( - common.status, - "ignoring unrecognized TDUX flexible start tag command: {:?}", - line - ); - } - } - - self.push_space_if_needed(x, None); - self.current_content.push('<'); - html_escape::encode_safe_to_string(tagname, &mut self.current_content); - - if !classes.is_empty() { - self.current_content.push_str(" class=\""); - - let mut first = true; - for c in &classes { - if first { - first = false; - } else { - self.current_content.push(' '); - } - - html_escape::encode_double_quoted_attribute_to_string(c, &mut self.current_content); - } - - self.current_content.push('\"'); - } - - if !styles.is_empty() { - self.current_content.push_str(" style=\""); - - let mut first = true; - for (name, value) in &styles { - if first { - first = false; - } else { - self.current_content.push(';'); - } - - html_escape::encode_double_quoted_attribute_to_string( - name, - &mut self.current_content, - ); - self.current_content.push(':'); - html_escape::encode_double_quoted_attribute_to_string( - value, - &mut self.current_content, - ); - } - - self.current_content.push('\"'); - } - - for (name, maybe_value) in &unquoted_attrs { - self.current_content.push(' '); - html_escape::encode_safe_to_string(name, &mut self.current_content); - - if let Some(v) = maybe_value { - self.current_content.push('='); - html_escape::encode_unquoted_attribute_to_string(v, &mut 
self.current_content); - } - } - - for (name, maybe_value) in &double_quoted_attrs { - self.current_content.push(' '); - html_escape::encode_safe_to_string(name, &mut self.current_content); - self.current_content.push_str("=\""); - - if let Some(v) = maybe_value { - html_escape::encode_double_quoted_attribute_to_string(v, &mut self.current_content); - } - - self.current_content.push('\"'); - } - - self.current_content.push('>'); - self.elem_stack.push(elstate); - Ok(()) - } - - fn handle_set_template_variable(&mut self, remainder: &str, common: &mut Common) -> Result<()> { - if let Some((varname, varval)) = remainder.split_once(' ') { - self.context.insert(varname, varval); - } else { - tt_warning!( - common.status, - "ignoring malformatted tdux:setTemplateVariable special `{}`", - remainder - ); - } - - Ok(()) - } - - fn handle_provide_file(&mut self, remainder: &str, common: &mut Common) -> Result<()> { - let (src_tex_path, dest_path) = match remainder.split_once(' ') { - Some(t) => t, - None => { - tt_warning!( - common.status, - "ignoring malformatted tdux:provideFile special `{}`", - remainder - ); - return Ok(()); - } - }; - - // Set up input? - - let mut ih = atry!( - common.hooks.io().input_open_name(src_tex_path, common.status).must_exist(); - ["unable to open provideFile source `{}`", &src_tex_path] - ); - - // Set up output? TODO: create parent directories! - - let mut out_path = common.out_base.to_owned(); - - for piece in dest_path.split('/') { - if piece.is_empty() { - continue; - } - - if piece == ".." { - bail!( - "illegal provideFile dest path `{}`: it contains a `..` component", - &dest_path - ); - } - - let as_path = Path::new(piece); - - if as_path.is_absolute() || as_path.has_root() { - bail!( - "illegal provideFile path `{}`: it contains an absolute/rooted component", - &dest_path, - ); - } - - out_path.push(piece); - } - - // Copy! 
- - { - let mut out_file = atry!( - File::create(&out_path); - ["cannot open output file `{}`", out_path.display()] - ); - - atry!( - std::io::copy(&mut ih, &mut out_file); - ["cannot copy to output file `{}`", out_path.display()] - ); - } - - // All done. - - let (name, digest_opt) = ih.into_name_digest(); - common - .hooks - .event_input_closed(name, digest_opt, common.status); - - Ok(()) - } - - fn handle_text_and_glyphs( - &mut self, - font_num: FontNum, - text: &str, - glyphs: &[u16], - xs: &[i32], - ys: &[i32], - common: &mut Common, - ) -> Result<()> { - if self.content_finished { - self.warn_finished_content(&format!("text `{}`", text), common); - return Ok(()); - } - - if let Some(c) = self.current_canvas.as_mut() { - for i in 0..glyphs.len() { - c.glyphs.push(GlyphInfo { - dx: xs[i] - c.x0, - dy: ys[i] - c.y0, - glyph: glyphs[i], - font_num, - }); - } - } else if !glyphs.is_empty() { - self.set_up_for_font(xs[0], font_num, common); - - self.push_space_if_needed(xs[0], Some(font_num)); - html_escape::encode_text_to_string(text, &mut self.current_content); - - // To figure out when we need spaces, we need to care about the last - // glyph's actual width (well, its advance). - // - // TODO: RTL correctness!!!! 
- - let idx = glyphs.len() - 1; - let fi = a_ok_or!( - self.fonts.get(&font_num); - ["undeclared font {} in canvas", font_num] - ); - let fd = self.font_data.get_mut(&fi.fd_key).unwrap(); - let gm = fd.lookup_metrics(glyphs[idx], fi.size); - let advance = match gm { - Some(gm) => gm.advance, - None => 0, - }; - self.update_content_pos(xs[idx] + advance, Some(font_num)); - } - - Ok(()) - } - - fn handle_glyph_run( - &mut self, - font_num: FontNum, - glyphs: &[u16], - xs: &[i32], - ys: &[i32], - common: &mut Common, - ) -> Result<()> { - if self.content_finished { - self.warn_finished_content("glyph run", common); - return Ok(()); - } - - if let Some(c) = self.current_canvas.as_mut() { - for i in 0..glyphs.len() { - c.glyphs.push(GlyphInfo { - dx: xs[i] - c.x0, - dy: ys[i] - c.y0, - glyph: glyphs[i], - font_num, - }); - } - } else { - // Ideally, the vast majority of the time we are using - // handle_text_and_glyphs and not this function, outside of - // canvases. But sometimes we get spare glyphs outside of the canvas - // context. We can use our glyph-mapping infrastructure to try to - // translate them to Unicode, hoping for the best that the naive - // inversion suffices. - - self.set_up_for_font(xs[0], font_num, common); - - let fi = a_ok_or!( - self.fonts.get(&font_num); - ["undeclared font {} in glyph run", font_num] - ); - - // Super lame! Ideally we could just hold a long-lived mutable - // borrow of `fd`, but that gets treated as a long-lived mutable - // borrow of `self` which basically makes all other pieces of state - // inaccessible. So we need to keep on looking up into - // `self.font_data`, and even then we need to avoid functions that - // operation on `&mut self` because `fi` is a long-lived *immutable* - // borrow of `self`. I don't think we can avoid this without doing - // some significant rearranging of the data structures to allow the - // different pieces to be borrowed separately. 
- - let mut ch_str_buf = [0u8; 4]; - - for (idx, glyph) in glyphs.iter().copied().enumerate() { - let mc = { - let fd = self.font_data.get(&fi.fd_key).unwrap(); - fd.lookup_mapping(glyph) - }; - - if let Some(mc) = mc { - let (mut ch, need_alt) = match mc { - MapEntry::Direct(c) => (c, false), - MapEntry::SubSuperScript(c, _) => (c, true), - MapEntry::MathGrowingVariant(c, _, _) => (c, true), - }; - - let alt_index = if need_alt { - let fd = self.font_data.get_mut(&fi.fd_key).unwrap(); - let map = fd.request_alternative(glyph, ch); - ch = map.usv; - Some(map.alternate_map_index) - } else { - None - }; - - // For later: we could select the "default" font at an outer - // level and only emit tags as needed in here. - let font_sel = fi.selection_style_text(alt_index); - - // Stringify the character so that we can use html_escape in - // case it's a `<` or whatever. - let ch_as_str = ch.encode_utf8(&mut ch_str_buf); - - // XXX this is (part of) push_space_if_needed - if self.is_space_needed(xs[idx], Some(font_num)) { - self.current_content.push(' '); - } - - write!(self.current_content, "", font_sel).unwrap(); - html_escape::encode_text_to_string(ch_as_str, &mut self.current_content); - write!(self.current_content, "").unwrap(); - } else { - tt_warning!( - common.status, - "unable to reverse-map glyph {} in font `{}` (face {})", - glyph, - fi.rel_url, - fi.face_index - ); - } - - // Jump through the hoops needed by automatic space insertion: - - let gm = { - let fd = self.font_data.get(&fi.fd_key).unwrap(); - fd.lookup_metrics(glyphs[idx], fi.size) - }; - - let advance = match gm { - Some(gm) => gm.advance, - None => 0, - }; - - // XXX this is fn update_content_pos(): - self.last_content_x = xs[idx] + advance; - - let cur_space_width = self.maybe_get_font_space_width(Some(font_num)); - if cur_space_width.is_some() { - self.last_content_space_width = cur_space_width; - } - } - } - - Ok(()) - } - - fn set_up_for_font(&mut self, x0: i32, fnum: FontNum, common: &mut 
Common) { - let (cur_ffid, cur_af, cur_is_autofont) = { - let cur = self.cur_elstate(); - ( - cur.font_family_id, - cur.active_font, - cur.origin == ElementOrigin::FontAuto, - ) - }; - - let (path, desired_af) = if let Some(cur_fam) = self.font_families.get(&cur_ffid) { - // Already set up for the right font? If so, great! - if cur_fam.relative_id_to_font_num(cur_af) == fnum { - return; - } - - // No. Figure out what we need to do. - let desired_af = cur_fam.font_num_to_relative_id(fnum); - (cur_fam.path_to_new_font(cur_af, desired_af), desired_af) - } else { - // We don't seem to be in a defined "family". So we have to - // select it explicitly. - let path = PathToNewFont { - close_all: true, - select_explicitly: true, - ..Default::default() - }; - - let desired_af = FamilyRelativeFontId::Other(fnum); - (path, desired_af) - }; - - if path.close_one_and_retry { - if cur_is_autofont { - self.close_one(); - return self.set_up_for_font(x0, fnum, common); - } else { - // This is a logic error in our implementation -- this - // should never happen. 
- tt_warning!( - common.status, - "font selection failed (ffid={}, active={:?}, desired={})", - cur_ffid, - cur_af, - fnum - ); - return; - } - } - - if path.close_all { - self.close_automatics(); - } - - if let Some(af) = path.open_b { - self.push_space_if_needed(x0, Some(fnum)); - self.current_content.push_str(""); - self.elem_stack.push(ElementState { - elem: Some(html::Element::B), - origin: ElementOrigin::FontAuto, - active_font: af, - ..*self.cur_elstate() - }); - } - - if let Some(af) = path.open_i { - self.push_space_if_needed(x0, Some(fnum)); - self.current_content.push_str(""); - self.elem_stack.push(ElementState { - elem: Some(html::Element::I), - origin: ElementOrigin::FontAuto, - active_font: af, - ..*self.cur_elstate() - }); - } - - if path.select_explicitly { - self.push_space_if_needed(x0, Some(fnum)); - - let fi = self.fonts.get(&fnum).unwrap(); - let rel_size = fi.size as f32 * self.rems_per_tex; - - write!( - self.current_content, - "", - rel_size, - fi.selection_style_text(None) - ) - .unwrap(); - - self.elem_stack.push(ElementState { - elem: Some(html::Element::Span), - origin: ElementOrigin::FontAuto, - active_font: desired_af, - ..*self.cur_elstate() - }); - } - } - - fn handle_end_canvas(&mut self, common: &mut Common) -> Result<()> { - let mut canvas = self.current_canvas.take().unwrap(); - - // This is the *end* of a canvas, but we haven't pushed anything into - // the content since whatever started the canvas, so we need this: - self.push_space_if_needed(canvas.x0, None); - - let inline = match canvas.kind.as_ref() { - "math" => true, - "dmath" => false, - _ => false, - }; - - // First pass: get overall bounds of all the glyphs from their metrics. - // We need to gather this information first because as we emit glyphs we - // have to specify their positions relative to the edges of the - // containing canvas box, and the size of that box is defined by the - // extents of all of the glyphs it contains. 
The bounds are measured in - // TeX units. - - let mut first = true; - let mut x_min_tex = 0; - let mut x_max_tex = 0; - let mut y_min_tex = 0; - let mut y_max_tex = 0; - - for gi in &canvas.glyphs[..] { - let fi = a_ok_or!( - self.fonts.get(&gi.font_num); - ["undeclared font {} in canvas", gi.font_num] - ); - - let fd = self.font_data.get_mut(&fi.fd_key).unwrap(); - let gm = fd.lookup_metrics(gi.glyph, fi.size); - - if let Some(gm) = gm { - // to check: RTL correctness - let xmin = gi.dx - gm.lsb; - let xmax = gi.dx + gm.advance; - let ymin = gi.dy - gm.ascent; - let ymax = gi.dy - gm.descent; // note: descent is negative - - if first { - x_min_tex = xmin; - x_max_tex = xmax; - y_min_tex = ymin; - y_max_tex = ymax; - first = false; - } else { - x_min_tex = std::cmp::min(x_min_tex, xmin); - x_max_tex = std::cmp::max(x_max_tex, xmax); - y_min_tex = std::cmp::min(y_min_tex, ymin); - y_max_tex = std::cmp::max(y_max_tex, ymax); - } - } - } - - // Now that we have that information, we can lay out the individual - // glyphs. - // - // A resource I found very helpful: - // https://iamvdo.me/en/blog/css-font-metrics-line-height-and-vertical-align - - let mut inner_content = String::default(); - let mut ch_str_buf = [0u8; 4]; - - for gi in canvas.glyphs.drain(..) { - let fi = self.fonts.get(&gi.font_num).unwrap(); - - // The size of the font being used for this glyph, in rems; that is, - // relative to the main body font. - let rel_size = fi.size as f32 * self.rems_per_tex; - let fd = self.font_data.get_mut(&fi.fd_key).unwrap(); - let mc = fd.lookup_mapping(gi.glyph); - - if let Some(mc) = mc { - // Sometimes we need to render a glyph in one of our input fonts - // that isn't directly associated with a specific Unicode - // character. For instance, in math, we may need to draw a big - // integral sign, but the Unicode integral character maps to a - // small one. 
The way we handle this is by *creating new fonts* - // with custom character map tables that *do* map Unicode - // characters directly to the specific glyphs we want. - - let (mut ch, need_alt) = match mc { - MapEntry::Direct(c) => (c, false), - MapEntry::SubSuperScript(c, _) => (c, true), - MapEntry::MathGrowingVariant(c, _, _) => (c, true), - }; - - let alt_index = if need_alt { - let map = fd.request_alternative(gi.glyph, ch); - ch = map.usv; - Some(map.alternate_map_index) - } else { - None - }; - - let font_sel = fi.selection_style_text(alt_index); - - // dy gives the target position of this glyph's baseline - // relative to the canvas's baseline. For our `position: - // absolute` layout, we have to convert that into the distance - // between the top of this glyph's box and the top of the - // overall canvas box (or bottom/bottom). - // - // In order to do this, we need to know the size of this glyph's - // box according to CSS, and the position of the glyph's - // baseline within that box. - // - // The baseline position is straightforward: it is given by what - // we call the font's "baseline factor". This is true no matter - // the specific size of the CSS box relative to the font - // rendering size, due to the way in which the drawn glyph is - // centered vertically within its CSS box. - // - // The CSS glyph box height can be funky: it depends on the - // font-size setting, font metrics (not just ascender/descender - // but "line gap") and `line-height` setting in "exciting" ways. - // One convenient approach is to set `line-height: 1` in the - // container, in which case the box height is the `font-size` - // setting. - - let top_rem = (-y_min_tex + gi.dy) as f32 * self.rems_per_tex - - fd.baseline_factor() * rel_size; - - // Stringify the character so that we can use html_escape in - // case it's a `<` or whatever. 
- let ch_as_str = ch.encode_utf8(&mut ch_str_buf); - - write!( - inner_content, - "", - top_rem, - gi.dx as f32 * self.rems_per_tex, - rel_size, - font_sel, - ) - .unwrap(); - html_escape::encode_text_to_string(ch_as_str, &mut inner_content); - write!(inner_content, "").unwrap(); - } else { - tt_warning!( - common.status, - "unable to reverse-map glyph {} in font `{}` (face {})", - gi.glyph, - fi.rel_url, - fi.face_index - ); - } - } - - let (element, layout_class, valign) = if inline { - // A numerical vertical-align setting positions the bottom edge of - // this block relative to the containing line's baseline. This is - // the best (only?) way to make sure that this block's baseline - // lines up with that of its container. - ( - "span", - "canvas-inline", - format!( - "; vertical-align: {}rem", - -y_max_tex as f32 * self.rems_per_tex - ), - ) - } else { - ("div", "canvas-block", "".to_owned()) - }; - - let element = self.create_elem(element, true, common); - - write!( - self.current_content, - "<{} class=\"canvas {}\" style=\"width: {}rem; height: {}rem; padding-left: {}rem{}\">", - element.name(), - layout_class, - (x_max_tex - x_min_tex) as f32 * self.rems_per_tex, - (y_max_tex - y_min_tex) as f32 * self.rems_per_tex, - -x_min_tex as f32 * self.rems_per_tex, - valign, - ) - .unwrap(); - self.current_content.push_str(&inner_content); - write!(self.current_content, "", element.name()).unwrap(); - self.update_content_pos(x_max_tex + canvas.x0, None); - Ok(()) - } - - fn finish_file(&mut self, common: &mut Common) -> Result<()> { - // Prep the output path - - let mut out_path = common.out_base.to_owned(); - let mut n_levels = 0; - - for piece in self.next_output_path.split('/') { - if piece.is_empty() { - continue; - } - - if piece == ".." 
{ - bail!( - "illegal HTML output path `{}`: it contains a `..` component", - &self.next_output_path - ); - } - - let as_path = Path::new(piece); - - if as_path.is_absolute() || as_path.has_root() { - bail!( - "illegal HTML output path `{}`: it contains an absolute/rooted component", - &self.next_output_path - ); - } - - out_path.push(piece); - n_levels += 1; - } - - self.context.insert("tduxContent", &self.current_content); - - if n_levels < 2 { - self.context.insert("tduxRelTop", ""); - } else { - let mut rel_top = String::default(); - - for _ in 0..(n_levels - 1) { - rel_top.push_str("../"); - } - - self.context.insert("tduxRelTop", &rel_top); - } - - // Read in the template. Let's not cache it, in case someone wants to do - // something fancy with rewriting it. If that setting is empty, probably - // the user is compiling the document in HTML mode without all of the - // TeX infrastructure that Tectonic needs to make it work. - - if self.next_template_path.is_empty() { - bail!("need to emit HTML content but no template has been specified; is your document HTML-compatible?"); - } - - let mut ih = atry!( - common.hooks.io().input_open_name(&self.next_template_path, common.status).must_exist(); - ["unable to open input HTML template `{}`", &self.next_template_path] - ); - - let mut template = String::new(); - atry!( - ih.read_to_string(&mut template); - ["unable to read input HTML template `{}`", &self.next_template_path] - ); - - let (name, digest_opt) = ih.into_name_digest(); - common - .hooks - .event_input_closed(name, digest_opt, common.status); - - // Ready to render! - - let rendered = atry!( - self.tera.render_str(&template, &self.context); - ["failed to render HTML template `{}` while creating `{}`", &self.next_template_path, &self.next_output_path] - ); - - // Save it. 
- - { - let mut out_file = atry!( - File::create(&out_path); - ["cannot open output file `{}`", out_path.display()] - ); - - atry!( - out_file.write_all(rendered.as_bytes()); - ["cannot write output file `{}`", out_path.display()] - ); - } - - self.current_content = String::default(); - self.update_content_pos(0, None); - Ok(()) - } - - fn content_finished(&mut self, common: &mut Common) -> Result<()> { - if !self.current_content.is_empty() { - tt_warning!(common.status, "un-emitted content at end of HTML output"); - self.current_content = String::default(); - } - - // The reason we're doing all this: we can now emit our customized font - // files that provide access to glyphs that we can't get the browser to - // display directly. First, emit the font files via the font data. - - let mut emitted_info = HashMap::new(); - - for (fd_key, data) in self.font_data.drain() { - let emi = data.emit(common.out_base)?; - emitted_info.insert(fd_key, emi); - } - - // Now we can generate the CSS. - - let mut faces = String::default(); - - for fi in self.fonts.values() { - let emi = emitted_info.get(&fi.fd_key).unwrap(); - - for (alt_index, css_src) in emi { - let _ignored = writeln!( - faces, - r#"@font-face {{ - {} - src: {}; -}}"#, - fi.font_face_text(*alt_index), - css_src, - ); - } - } - - self.context.insert("tduxFontFaces", &faces); - - // OK. - self.content_finished = true; + std::mem::swap(self, &mut work); Ok(()) } } type FixedPoint = i32; -type FontNum = i32; - -#[allow(dead_code)] -#[derive(Debug)] -struct FontInfo { - /// Relative URL to the font data file - rel_url: String, - - /// CSS name of the font family with which this font is associated; - /// autogenerated if not specified during initialization. - family_name: String, - - /// This font's "relationship" to its family. Defaults to Regular to - /// if it's not associated with a full-fledged family. - family_relation: FamilyRelativeFontId, - - /// Integer key used to relate this TeX font to its FontData. 
Multiple - /// fonts may use the same FontData, if they refer to the same backing - file. - fd_key: usize, - - size: FixedPoint, - face_index: u32, - color_rgba: Option<u32>, - extend: Option<FixedPoint>, - slant: Option<FixedPoint>, - embolden: Option<FixedPoint>, -} - -impl FontInfo { - fn selection_style_text(&self, alternate_map_index: Option<usize>) -> String { - let alt_text = alternate_map_index - .map(|i| format!("vg{}", i)) - .unwrap_or_default(); - - let extra = match self.family_relation { - FamilyRelativeFontId::Regular => "", - FamilyRelativeFontId::Bold => "; font-weight: bold", - FamilyRelativeFontId::Italic => "; font-style: italic", - FamilyRelativeFontId::BoldItalic => "; font-weight: bold; font-style: italic", - FamilyRelativeFontId::Other(_) => unreachable!(), - }; - - format!("font-family: {}{}{}", self.family_name, alt_text, extra) - } - - fn font_face_text(&self, alternate_map_index: Option<usize>) -> String { - let alt_text = alternate_map_index - .map(|i| format!("vg{}", i)) - .unwrap_or_default(); - - let extra = match self.family_relation { - FamilyRelativeFontId::Regular => "", - FamilyRelativeFontId::Bold => "\n font-weight: bold;", - FamilyRelativeFontId::Italic => "\n font-style: italic;", - FamilyRelativeFontId::BoldItalic => "\n font-weight: bold;\n font-style: italic;", - FamilyRelativeFontId::Other(_) => unreachable!(), - }; - - format!( - r#"font-family: "{}{}";{}"#, - self.family_name, alt_text, extra - ) - } -} - -#[derive(Clone, Debug, Eq, PartialEq)] -struct FontFamily { - regular: FontNum, - bold: FontNum, - italic: FontNum, - bold_italic: FontNum, -} - -impl FontFamily { - fn font_num_to_relative_id(&self, fnum: FontNum) -> FamilyRelativeFontId { - if fnum == self.regular { - FamilyRelativeFontId::Regular - } else if fnum == self.bold { - FamilyRelativeFontId::Bold - } else if fnum == self.italic { - FamilyRelativeFontId::Italic - } else if fnum == self.bold_italic { - FamilyRelativeFontId::BoldItalic - } else { - FamilyRelativeFontId::Other(fnum) - } - } - - fn
relative_id_to_font_num(&self, relid: FamilyRelativeFontId) -> FontNum { - match relid { - FamilyRelativeFontId::Regular => self.regular, - FamilyRelativeFontId::Bold => self.bold, - FamilyRelativeFontId::Italic => self.italic, - FamilyRelativeFontId::BoldItalic => self.bold_italic, - FamilyRelativeFontId::Other(fnum) => fnum, - } - } - - /// Figure out how to get "to" a desired font based on the current one. This - /// function should only be called if it has been established that the - /// desired font is in fact different than the current font. However, there - /// are some noop cases below so that we can make the compiler happy about - /// covering all of our enum variants. - fn path_to_new_font( - &self, - cur: FamilyRelativeFontId, - desired: FamilyRelativeFontId, - ) -> PathToNewFont { - match desired { - FamilyRelativeFontId::Other(_) => PathToNewFont { - close_all: true, - select_explicitly: true, - ..Default::default() - }, - - FamilyRelativeFontId::Regular => PathToNewFont { - close_all: true, - ..Default::default() - }, - - FamilyRelativeFontId::Bold => match cur { - FamilyRelativeFontId::Regular => PathToNewFont { - open_b: Some(desired), - ..Default::default() - }, - - FamilyRelativeFontId::Bold => Default::default(), - - FamilyRelativeFontId::Italic | FamilyRelativeFontId::Other(_) => PathToNewFont { - close_all: true, - open_b: Some(desired), - ..Default::default() - }, - - FamilyRelativeFontId::BoldItalic => PathToNewFont { - close_one_and_retry: true, - ..Default::default() - }, - }, - - FamilyRelativeFontId::Italic => match cur { - FamilyRelativeFontId::Regular => PathToNewFont { - open_i: Some(desired), - ..Default::default() - }, - - FamilyRelativeFontId::Italic => Default::default(), - - FamilyRelativeFontId::Bold | FamilyRelativeFontId::Other(_) => PathToNewFont { - close_all: true, - open_i: Some(desired), - ..Default::default() - }, - - FamilyRelativeFontId::BoldItalic => PathToNewFont { - close_one_and_retry: true, - ..Default::default() - 
}, - }, - - FamilyRelativeFontId::BoldItalic => match cur { - FamilyRelativeFontId::Regular => PathToNewFont { - open_i: Some(desired), - open_b: Some(FamilyRelativeFontId::Bold), // <= the whole reason these aren't bools - ..Default::default() - }, - - FamilyRelativeFontId::Italic => PathToNewFont { - open_b: Some(desired), - ..Default::default() - }, - - FamilyRelativeFontId::Bold => PathToNewFont { - open_i: Some(desired), - ..Default::default() - }, - - FamilyRelativeFontId::BoldItalic => Default::default(), - - FamilyRelativeFontId::Other(_) => PathToNewFont { - close_one_and_retry: true, - ..Default::default() - }, - }, - } - } -} - -/// How to "get to" a desired font based on the current font family and recently -active tags. -#[derive(Clone, Copy, Debug, Default, Eq, PartialEq)] -struct PathToNewFont { - /// Close all open automatically-generated font-selection tags. - pub close_all: bool, - - /// Close one automatically-generated font-selection tag, and try again. - pub close_one_and_retry: bool, - - /// Issue a `<span>` element to explicitly choose the font; this is - our get-out-of-jail-free card. - pub select_explicitly: bool, - - /// If Some, open a `<b>` tag. The value is the "family-relative" font that - will be active after doing so. If both this and `open_i` are Some, this - should be evaluated first. - pub open_b: Option<FamilyRelativeFontId>, - - /// If Some, open an `<i>` tag. The value is the "family-relative" font that - will be active after doing so. If both this and `open_b` are Some, the - `<b>` tag should be evaluated first. - pub open_i: Option<FamilyRelativeFontId>, -} - -#[derive(Debug, Default)] -struct FontFamilyBuilder { - family_name: String, - regular: Option<FontNum>, - bold: Option<FontNum>, - italic: Option<FontNum>, - bold_italic: Option<FontNum>, -} - -#[derive(Clone, Copy, Debug, Eq, PartialEq)] -enum FamilyRelativeFontId { - /// This font is the regular font of the current family. - Regular, - - /// This font is the bold font of the current family.
- Bold, - - /// This font is the italic font of the current family. - Italic, - - /// This font is the bold-italic font of the current family. - BoldItalic, - - /// This font is some other font with no known relation to the current - /// family. - Other(FontNum), -} - -#[derive(Debug, Default)] -struct FontFamilyTagAssociator { - assoc: HashMap, -} +type TexFontNum = i32; diff --git a/crates/engine_spx2html/src/specials.rs b/crates/engine_spx2html/src/specials.rs new file mode 100644 index 0000000000..13d94b7d86 --- /dev/null +++ b/crates/engine_spx2html/src/specials.rs @@ -0,0 +1,128 @@ +// Copyright 2018-2022 the Tectonic Project +// Licensed under the MIT License. + +//! TeX `\special` items recognized by the spx2html emitter. + +use std::fmt::{Display, Error, Formatter}; +use tectonic_status_base::{tt_warning, StatusBackend}; + +#[derive(Debug, Clone, Copy, Eq, PartialEq)] +pub(crate) enum Special<'a> { + AddTemplate(&'a str), + AutoStartParagraph, + AutoEndParagraph, + CanvasEnd(&'a str), + CanvasStart(&'a str), + ContentFinished, + DirectText(&'a str), + EndDefineFontFamily, + EndFontFamilyTagAssociations, + Emit, + ManualEnd(&'a str), + ManualFlexibleStart(&'a str), + ProvideFile(&'a str), + ProvideSpecial(&'a str), + SetOutputPath(&'a str), + SetTemplate(&'a str), + SetTemplateVariable(&'a str), + StartDefineFontFamily, + StartFontFamilyTagAssociations, +} + +impl<'a> Special<'a> { + pub(crate) fn parse(text: &'a str, status: &mut dyn StatusBackend) -> Option { + // str.split_once() would be nice but it was introduced in 1.52 which is + // a bit recent for us. 
+ + let mut pieces = text.splitn(2, ' '); + + let (cmd, remainder) = if let Some(p) = pieces.next() { + if let Some(cmd) = p.strip_prefix("tdux:") { + (cmd, pieces.next().unwrap_or_default()) + } else { + return None; + } + } else { + return None; + }; + + Some(match cmd { + "asp" => Special::AutoStartParagraph, + "aep" => Special::AutoEndParagraph, + "cs" => Special::CanvasStart(remainder), + "ce" => Special::CanvasEnd(remainder), + "mfs" => Special::ManualFlexibleStart(remainder), + "me" => Special::ManualEnd(remainder), + "dt" => Special::DirectText(remainder), + "emit" => Special::Emit, + "addTemplate" => Special::AddTemplate(remainder), + "setTemplate" => Special::SetTemplate(remainder), + "setOutputPath" => Special::SetOutputPath(remainder), + "setTemplateVariable" => Special::SetTemplateVariable(remainder), + "provideFile" => Special::ProvideFile(remainder), + "provideSpecial" => Special::ProvideSpecial(remainder), + "contentFinished" => Special::ContentFinished, + "startDefineFontFamily" => Special::StartDefineFontFamily, + "endDefineFontFamily" => Special::EndDefineFontFamily, + "startFontFamilyTagAssociations" => Special::StartFontFamilyTagAssociations, + "endFontFamilyTagAssociations" => Special::EndFontFamilyTagAssociations, + _ => { + tt_warning!( + status, + "ignoring unrecognized Tectonic special: tdux:{} {}", + cmd, + remainder + ); + return None; + } + }) + } + + pub fn ends_initialization(&self) -> bool { + matches!( + self, + Special::Emit + | Special::ProvideFile(_) + | Special::ProvideSpecial(_) + | Special::AutoStartParagraph + | Special::AutoEndParagraph + | Special::CanvasStart(_) + | Special::CanvasEnd(_) + | Special::ManualFlexibleStart(_) + | Special::ManualEnd(_) + | Special::DirectText(_) + ) + } +} + +impl<'a> Display for Special<'a> { + fn fmt(&self, f: &mut Formatter<'_>) -> Result<(), Error> { + let (cmd, rest) = match self { + Special::AddTemplate(t) => ("addTemplate", Some(t)), + Special::AutoStartParagraph => ("asp", None), + 
Special::AutoEndParagraph => ("aep", None), + Special::CanvasEnd(t) => ("ce", Some(t)), + Special::CanvasStart(t) => ("cs", Some(t)), + Special::ContentFinished => ("contentFinished", None), + Special::DirectText(t) => ("dt", Some(t)), + Special::EndDefineFontFamily => ("endDefineFontFamily", None), + Special::EndFontFamilyTagAssociations => ("endFontFamilyTagAssociations", None), + Special::Emit => ("emit", None), + Special::ManualEnd(t) => ("me", Some(t)), + Special::ManualFlexibleStart(t) => ("mfs", Some(t)), + Special::ProvideFile(t) => ("provideFile", Some(t)), + Special::ProvideSpecial(t) => ("provideSpecial", Some(t)), + Special::SetOutputPath(t) => ("setOutputPath", Some(t)), + Special::SetTemplate(t) => ("setTemplate", Some(t)), + Special::SetTemplateVariable(t) => ("setTemplateVariable", Some(t)), + Special::StartDefineFontFamily => ("startDefineFontFamily", None), + Special::StartFontFamilyTagAssociations => ("startFontFamilyTagAssociations", None), + }; + + if let Some(t) = rest { + write!(f, "tdux:{cmd} {t}") + } else { + write!(f, "tdux:{cmd}") + } + } +} diff --git a/crates/engine_spx2html/src/templating.rs b/crates/engine_spx2html/src/templating.rs new file mode 100644 index 0000000000..6fd8e3bbc5 --- /dev/null +++ b/crates/engine_spx2html/src/templating.rs @@ -0,0 +1,147 @@ +// Copyright 2018-2022 the Tectonic Project +// Licensed under the MIT License. + +//! State relating to handling the Tera templating and file emission. 
+ +use std::{ + fs::File, + io::{Read, Write}, +}; +use tectonic_errors::prelude::*; +use tectonic_status_base::tt_warning; + +use crate::Common; + +#[derive(Debug)] +pub(crate) struct Templating { + tera: tera::Tera, + context: tera::Context, + next_template_path: String, + next_output_path: String, +} + +impl Templating { + pub(crate) fn new( + tera: tera::Tera, + context: tera::Context, + next_template_path: String, + next_output_path: String, + ) -> Self { + Templating { + tera, + context, + next_template_path, + next_output_path, + } + } + + pub(crate) fn handle_set_template<S: ToString>(&mut self, arg: S) { + self.next_template_path = arg.to_string(); + } + + pub(crate) fn handle_set_output_path<S: ToString>(&mut self, arg: S) { + self.next_output_path = arg.to_string(); + } + + pub(crate) fn handle_set_template_variable( + &mut self, + remainder: &str, + common: &mut Common, + ) -> Result<()> { + if let Some((varname, varval)) = remainder.split_once(' ') { + self.set_variable(varname, varval); + } else { + tt_warning!( + common.status, + "ignoring malformatted tdux:setTemplateVariable special `{}`", + remainder + ); + } + + Ok(()) + } + + pub(crate) fn set_variable<S: AsRef<str>>(&mut self, name: &str, value: S) { + // Unfortunately tera doesn't seem to give us a way to move an owned + // value directly into the context object.
+ self.context.insert(name, value.as_ref()); + } + + pub(crate) fn ready_to_output(&self) -> bool { + !self.next_template_path.is_empty() && !self.next_output_path.is_empty() + } + + pub(crate) fn emit(&mut self, common: &mut Common) -> Result<()> { + if self.next_template_path.is_empty() { + bail!("need to emit HTML content but no template has been specified; is your document HTML-compatible?"); + } + + if self.next_output_path.is_empty() { + bail!("need to emit HTML content but no output path has been specified; is your document HTML-compatible?"); + } + + let (out_path, n_levels) = + crate::assets::create_output_path(&self.next_output_path, common)?; + + if n_levels < 2 { + self.context.insert("tduxRelTop", ""); + } else { + let mut rel_top = String::default(); + + for _ in 0..(n_levels - 1) { + rel_top.push_str("../"); + } + + self.context.insert("tduxRelTop", &rel_top); + } + + // Read in the template. Let's not cache it, in case someone wants to do + // something fancy with rewriting it. If that setting is empty, probably + // the user is compiling the document in HTML mode without all of the + // TeX infrastructure that Tectonic needs to make it work. + + let mut ih = atry!( + common.hooks.io().input_open_name(&self.next_template_path, common.status).must_exist(); + ["unable to open input HTML template `{}`", &self.next_template_path] + ); + + let mut template = String::new(); + atry!( + ih.read_to_string(&mut template); + ["unable to read input HTML template `{}`", &self.next_template_path] + ); + + let (name, digest_opt) = ih.into_name_digest(); + common + .hooks + .event_input_closed(name, digest_opt, common.status); + + // Ready to render! + + let rendered = atry!( + self.tera.render_str(&template, &self.context); + ["failed to render HTML template `{}` while creating `{}`", &self.next_template_path, &self.next_output_path] + ); + + // Save it. Unless we shouldn't, actually. 
+ + if let Some(out_path) = out_path { + let mut out_file = atry!( + File::create(&out_path); + ["cannot open output file `{}`", out_path.display()] + ); + + atry!( + out_file.write_all(rendered.as_bytes()); + ["cannot write output file `{}`", out_path.display()] + ); + } + + // Clear the output path, because we don't want people to be accidentally + // overwriting the same file by failing to update it. + + self.next_output_path.clear(); + + Ok(()) + } +} diff --git a/crates/engine_xdvipdfmx/CHANGELOG.md b/crates/engine_xdvipdfmx/CHANGELOG.md index 9fb1e94fb2..0620d63a87 100644 --- a/crates/engine_xdvipdfmx/CHANGELOG.md +++ b/crates/engine_xdvipdfmx/CHANGELOG.md @@ -1,4 +1,9 @@ -# rc: minor bump +# rc: micro bump + +- Tidy up recent Clippy warnings. + + +# tectonic_engine_xdvipdfmx 0.4.0 (2022-10-27) - Use new support in the `pdf_io` backend to handle the `dvipdfmx:config` special (#904, #953, @vlasakm). This should fix some aspects of PDF generation, diff --git a/crates/engine_xdvipdfmx/src/lib.rs b/crates/engine_xdvipdfmx/src/lib.rs index e22c7dfbc2..ccd478c6d2 100644 --- a/crates/engine_xdvipdfmx/src/lib.rs +++ b/crates/engine_xdvipdfmx/src/lib.rs @@ -118,8 +118,8 @@ impl XdvipdfmxEngine { let config = c_api::XdvipdfmxConfig { paperspec: paperspec_str.as_c_str().as_ptr(), - enable_compression: if self.enable_compression { 1 } else { 0 }, - deterministic_tags: if self.deterministic_tags { 1 } else { 0 }, + enable_compression: u8::from(self.enable_compression), + deterministic_tags: u8::from(self.deterministic_tags), build_date: self .build_date .duration_since(SystemTime::UNIX_EPOCH) diff --git a/crates/engine_xetex/CHANGELOG.md b/crates/engine_xetex/CHANGELOG.md index 024a6e876f..5da4dabc1a 100644 --- a/crates/engine_xetex/CHANGELOG.md +++ b/crates/engine_xetex/CHANGELOG.md @@ -1,8 +1,141 @@ -# See elsewhere for changelog +# rc: micro bump -This project’s release notes are curated from the Git history of its main -branch. 
You can find them by looking at [the version of this file on the -`release` branch][branch] or the [GitHub release history][gh-releases]. +- Remove the automatic insertion of paragraph tags in HTML mode (#1016, @pkgw). + It turns out that in TeX's internals, the starts and ends of "paragraphs" + occur much more frequently than is apparent in the document source. And + TeXLive 2022 introduces new LaTeX-level hooks for paragraph starts and ends + that align much better with linguistic paragraphs. (This is not a coincidence, + since the LaTeX core team is being funded to add support for creating properly + semantically tagged PDFs.) So, for HTML output going forward, we'll use those + hooks, and then there's no need for paragraph tagging support to be built into + the engine here. -[branch]: https://github.com/tectonic-typesetting/tectonic/blob/release/crates/engine_xetex/CHANGELOG.md -[gh-releases]: https://github.com/tectonic-typesetting/tectonic/releases + +# tectonic_engine_xetex 0.4.1 (2022-10-04) + +- When emitting in HTML mode, express paragraphs with `
<div class="tdux-p">` instead of `<p>` (#941, @pkgw). This might seem wrong, but it better matches TeX's + semantics to the HTML specification, which is quite explicit that the + `<p>` element does not have any special semantic meaning, and in fact + recommends grouping semantic paragraphs with `<div>`s. You can't nest an + `<ol>` inside a `<p>
        `, for instance, which does not align with TeX's view of + things. + + +# tectonic_engine_xetex 0.4.0 (2022-10-03) + +- Synchronize with TeXLive 2022.0 (#936, @pkgw)! Not many changes: + - Update the internal TECKit to 2.5.11, corresponding to + Unicode 14.0.0. + - Update the engine format version to 33, which removes unused + MLTeX `char_sub` parameters and expands the primitives table + because we've passed 500 of them. + - Update the XeTeX revision code to `.999994`. + - Remove some vestigial MLTeX code related to the above. + - Fix cleanup of TECKit in a few places + - Other upstream changes are not relevant to Tectonic. +- Remove C's `time_t` from internal FFI APIs to avoid portability issues. This + should avoid issues with Linux Musl builds. + + +# tectonic_engine_xetex 0.3.0 (2022-04-26) + +Update the XeTeX engine for TeXLive 2021 (#882, @pkgw). + +- Present as XeTeX revision 0.999993 +- Update the XeTeX format specification to the new version 32 +- Import [\Ucharcat update from 2018][ucc] that I seem to have missed before +- Fixes for [TeX bugs][tex82] 430-440 + - 430: not relevant to Tectonic (interactive features) + - 431: not relevant to Tectonic (interactive features) + - 432: skipped (date/time in system variables; no discernable impact on Tectonic) + - 433: "After nine parameters, delete both # and the token that follows" — breaking change! + - 434: Don't accept an implicit left brace after # in macro head + - 435: Keep garbage out of the buffer if a |\read| end unexpectedly + - 436: Zero out nonexistent chars, to prevent rogue TFM files + - 437: Don't classify fraction noads as inner noads + - 438: Properly identify tabskip glue when tracing repeated templates + - 439: not relevant to Tectonic + - 440: Normalize newlinechar when printing the final stats +- Significant rework/improvement of OpenType math kerning and super/sub-scripting +- Honor `PRIM_SIZE` correctly now that we have to change it! 
+- Implement `\tracingstacklevels` +- Guard against expansion depth overflow +- When reporting "lost characters", provide hex/UCS codes +- TECkit updated to TL21: version 2.5.10, upgrading from 2.5.9 + - This updates Unicode character names and normalization data to 13.0.0 + +[ucc]: https://github.com/TeX-Live/xetex/commit/0b12b29abb4748a9a85cc3e195ad388eba0d674e +[tex82]: https://ctan.math.utah.edu/ctan/tex-archive/systems/knuth/dist/errata/tex82.bug + +Also: + +- Allow `\openin` of `\openout` files to succeed (addresses #862, @pkgw). + + +# tectonic_engine_xetex 0.2.0 (2022-02-28) + +- Use the new `tectonic_xetex_format` crate as part of the build process (#851, + #848, @pkgw). This crate defines all of the metadata about the XeTeX engine + internals, with versioning, and generates the necessary header files and + macros. It also contains code for decoding XeTeX/Tectonic format files, so + that we'll be able to introspect engine data structures such as macro + definitions. +- Plumb in some specials that will be used by the prototype HTML output + mode (#865, @pkgw) +- Tidy up some of the auto-generated C code +- Fix an internal transcription error: `pre_display_direction`, not + `pre_display_correction` +- Fix a long-standing test issue with PNG image dimensions occasionally leading + to not-quite-reproducible output (#847, @pkgw) + + +# tectonic_engine_xetex 0.1.4 (2021-07-04) + +- Avoid misplaced newlines in warning output ([#803], [@ralismark]) +- Fix new warnings reported by Clippy 1.53.0 + +[#803]: https://github.com/tectonic-typesetting/tectonic/pull/803 +[@ralismark]: https://github.com/ralismark + + +# tectonic_engine_xetex 0.1.3 (2021-06-17) + +- Switch from running [cbindgen] at build time to having the developer run it + manually. This really ought to fix the crate builds on docs.rs ([#788]), and + should speed builds too. 
+ +[cbindgen]: https://github.com/eqrion/cbindgen +[#788]: https://github.com/tectonic-typesetting/tectonic/issues/788 + + +# tectonic_engine_xetex 0.1.2 (2021-06-17) + +- Attempt to fix crate builds on docs.rs — see [#788]. This works around an + issue in Tectonic’s usage of [cbindgen] by configuring Cargo to operate in + offline mode when building on docs.rs, which builds crates with network access + turned off. + +[#788]: https://github.com/tectonic-typesetting/tectonic/issues/788 +[cbindgen]: https://github.com/eqrion/cbindgen + + +# tectonic_engine_xetex 0.1.1 (2021-06-15) + +- Fix SyncTeX output (@hulloanson, @pkgw, #720, #744). We needed to include + absolute paths and properly deal with file renames, etc. The only way to + really do this right is to have the I/O backend provide filesystem paths when + it has them, so we've extended the lower-level crates to make this possible. +- Fix the implementation of some special XeTeX commands, reported by @burrbull + (@pkgw, #714, #783). This requires a bump in the format file serial number. We + believe that this fix includes a fix to an upstream XeTeX bug, which has been + reported. + + +# tectonic_engine_xetex 0.1.0 (2021-06-03) + +This crate introduces the XeTeX engine as a standalone crate, building on the +new "core bridge" functionality. + +Compared to the implementation previously provided in the main `tectonic` crate, +it also adds shell-escape functionality and iterates the Rust API somewhat. diff --git a/crates/engine_xetex/xetex/xetex-xetex0.c b/crates/engine_xetex/xetex/xetex-xetex0.c index 6eda20bf7d..6c3639ba20 100644 --- a/crates/engine_xetex/xetex/xetex-xetex0.c +++ b/crates/engine_xetex/xetex/xetex-xetex0.c @@ -14789,10 +14789,6 @@ new_graf(bool indented) insert_src_special(); } - /* Tectonic customization: insert

        flagged as automatic */ - if (semantic_pagination_enabled) - tt_insert_special("tdux:asp"); - if (LOCAL(every_par) != TEX_NULL) begin_token_list(LOCAL(every_par), EVERY_PAR_TEXT); @@ -14855,10 +14851,6 @@ void end_graf(void) { if (cur_list.mode == HMODE) { - /* Tectonic customization: insert

        flagged as automatic */ - if (semantic_pagination_enabled) - tt_insert_special("tdux:aep"); - if (cur_list.head == cur_list.tail) pop_nest(); else diff --git a/crates/errors/CHANGELOG.md b/crates/errors/CHANGELOG.md index 25620a275a..45a6ede8ff 100644 --- a/crates/errors/CHANGELOG.md +++ b/crates/errors/CHANGELOG.md @@ -1,8 +1,20 @@ -# See elsewhere for changelog +# rc: micro bump -This project’s release notes are curated from the Git history of its main -branch. You can find them by looking at [the version of this file on the -`release` branch][branch] or the [GitHub release history][gh-releases]. +- Tidy up recent Clippy warnings. -[branch]: https://github.com/tectonic-typesetting/tectonic/blob/release/crates/errors/CHANGELOG.md -[gh-releases]: https://github.com/tectonic-typesetting/tectonic/releases + +# tectonic_errors 0.2.0 (2021-06-03) + +The only change in this release is to add a helpful `tectonic_errors::prelude` +module, which makes it easy to get all of the names you need without getting +compiler warnings about the ones that you don't end up using. + + +# tectonic_errors 0.1.0 (2021-01-15) + +Initial release. A new crate providing a generic boxed error type for Tectonic. + +We need a boxed error type because we have a bunch of optional dependencies, and +we can't abstract around their errors without boxing them. + +Strongly derived from [Cranko](https://github.com/pkgw/cranko). diff --git a/crates/errors/src/lib.rs b/crates/errors/src/lib.rs index b0f0bf7e57..22b6cafb3e 100644 --- a/crates/errors/src/lib.rs +++ b/crates/errors/src/lib.rs @@ -52,12 +52,12 @@ pub struct AnnotatedMessage { impl AnnotatedMessage { /// Set the primary message associated with this annotated report. pub fn set_message(&mut self, m: T) { - self.message = format!("{}", m); + self.message = format!("{m}"); } /// Add an additional note to be associated with this annotated report. 
pub fn add_note(&mut self, n: T) { - self.notes.push(format!("{}", n)); + self.notes.push(format!("{n}")); } /// Obtain the set of notes associated with this report. diff --git a/crates/geturl/CHANGELOG.md b/crates/geturl/CHANGELOG.md index bc521bde48..067a965aef 100644 --- a/crates/geturl/CHANGELOG.md +++ b/crates/geturl/CHANGELOG.md @@ -1,8 +1,60 @@ -# See elsewhere for changelog +# rc: micro bump -This project’s release notes are curated from the Git history of its main -branch. You can find them by looking at [the version of this file on the -`release` branch][branch] or the [GitHub release history][gh-releases]. +- Tidy up recent Clippy warnings. -[branch]: https://github.com/tectonic-typesetting/tectonic/blob/release/crates/geturl/CHANGELOG.md -[gh-releases]: https://github.com/tectonic-typesetting/tectonic/releases + + +# tectonic_geturl 0.3.1 (2022-02-28) + +- No meaningful code changes; we're just fixing some new Clippy complaints. + + +# tectonic_geturl 0.3.0 (2021-10-11) + +This release contains an essential fix for what has been the default Tectonic +configuration, which accesses `archive.org` to look up the default bundle. + +- Update redirection logic to unbreak archive.org resolution (#832, @pkgw). The + Internet Archive PURL service added a new layer of indirection through the URL + `https://purl.prod.archive.org/net/pkgwpub/tectonic-default`, which had an + unfortunate interaction with logic in Tectonic intended to avoid pursuing + redirections into S3-type hashed storage services. That logic stopped + resolution when the final element of the URL path (i.e. the filename) did not + contain a period character. This used to be fine when the base archive.org URL + redirected directly to the configured destination URL, but stopped too soon + with the new indirection layer. 
The logic has been updated to also continue + pursuing the redirection if the filename of the new URL matches the filename + of the original URL, which avoids the issue in this case and seems generally + reasonable. +- Related to the above, the new archive.org redirection used an HTTP status code + of 307, which is a slightly more fully-specified version of the 302 status + code. While the redirection code accepted a final status code of 302 + (indicating that it decided to stop resolving URLs, i.e., it thinks that it + has reached the edge of an S3-type hashed storage service), it did not accept + a 307 result. Now it does (#832, @pkgw). Note that if this behavior had been + in place before, Tectonic would not have broken with the new archive.org + update, but the behavior would have been somewhat incorrect: the URL + resolution would have stopped too soon. But given the semantic similarity of + 302 and 307, if we allow the former, we should allow the latter. + +These fixes are, however, effectively superseded because the release of Tectonic +that contains them will also contain an update of the default URL to a new +dedicated service (`relay.fullyjustified.net`), since `archive.org` is sometimes +unreliable and is blocked in China. + + +# tectonic_geturl 0.2.1 (2021-06-15) + +- Fix a deprecation warning in the latest version of `reqwest`. + + +# tectonic_geturl 0.2.0 (2021-06-03) + +- Expose a new `native-tls-vendored` Cargo feature, to allow people to control + vendoring in the `native-tls` dependency crate. +- Work on the docs a bit. + + +# tectonic_geturl 0.1.0 (2021-01-16) + +Initial release of "get-URL" support crate, with pluggable backends: either curl +or reqwest. Or nothing, if you know that you're not going to need the network. 
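
The redirect-stopping heuristic described in the `tectonic_geturl` 0.3.0 notes above can be sketched as a small standalone function. This is an illustrative sketch only; the names below are hypothetical and are not the crate's real API.

```rust
/// Return the final path segment ("filename") of a URL-like string,
/// ignoring any query string. Good enough for a sketch.
fn filename(url: &str) -> &str {
    let path = url.splitn(2, '?').next().unwrap_or(url);
    path.rsplit('/').next().unwrap_or(path)
}

/// Decide whether to keep pursuing a redirect from `original` to `target`.
/// Resolution stops when the target filename contains no period character
/// (it looks like an S3-style hashed key) -- unless the target filename
/// matches the original filename, which indicates an indirection layer
/// rather than hashed storage.
fn keep_following(original: &str, target: &str) -> bool {
    let orig_name = filename(original);
    let new_name = filename(target);
    new_name.contains('.') || new_name == orig_name
}

fn main() {
    // Redirect into hashed storage: the filename has no period and does
    // not match, so resolution stops here.
    assert!(!keep_following(
        "https://example.org/bundle.tar",
        "https://bucket.example.net/ab12cd34ef"
    ));
    // Indirection layer that preserves the filename: keep going, even
    // though the filename has no period.
    assert!(keep_following(
        "https://example.org/pkg-default",
        "https://purl.example.org/net/pkg-default"
    ));
    println!("heuristic ok");
}
```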
diff --git a/crates/geturl/src/curl.rs b/crates/geturl/src/curl.rs index 29520b4af7..178bd2e446 100644 --- a/crates/geturl/src/curl.rs +++ b/crates/geturl/src/curl.rs @@ -23,7 +23,7 @@ fn get_url_generic( if let Some((start, length)) = range { let end = start + length as u64 - 1; - handle.range(&format!("{}-{}", start, end))?; + handle.range(&format!("{start}-{end}"))?; } let mut buf = Vec::new(); diff --git a/crates/geturl/src/reqwest.rs b/crates/geturl/src/reqwest.rs index cb52310a7a..661df08eef 100644 --- a/crates/geturl/src/reqwest.rs +++ b/crates/geturl/src/reqwest.rs @@ -131,7 +131,7 @@ impl RangeReader for ReqwestRangeReader { fn read_range(&mut self, offset: u64, length: usize) -> Result { let end_inclusive = offset + length as u64 - 1; - let header_val = format!("bytes={}-{}", offset, end_inclusive).parse()?; + let header_val = format!("bytes={offset}-{end_inclusive}").parse()?; let mut headers = HeaderMap::new(); headers.insert(RANGE, header_val); diff --git a/crates/io_base/CHANGELOG.md b/crates/io_base/CHANGELOG.md index 06508cf2f2..79a25106d6 100644 --- a/crates/io_base/CHANGELOG.md +++ b/crates/io_base/CHANGELOG.md @@ -1,8 +1,58 @@ -# See elsewhere for changelog +# rc: micro bump -This project’s release notes are curated from the Git history of its main -branch. You can find them by looking at [the version of this file on the -`release` branch][branch] or the [GitHub release history][gh-releases]. +- Tidy up recent Clippy warnings. +- Update the `sha2` dependency to the 0.10 series (#1038, @CraftSpider) -[branch]: https://github.com/tectonic-typesetting/tectonic/blob/release/crates/io_base/CHANGELOG.md -[gh-releases]: https://github.com/tectonic-typesetting/tectonic/releases + +# tectonic_io_base 0.4.1 (2022-10-03) + +- Print a warning when absolute paths are accessed (#806, #911, @ralismark, + @pkgw). Any such access represents an aspect of the build that won't + necessarily be reproducible on other machines. 
+ + +# tectonic_io_base 0.4.0 (2022-02-28) + +- Implement `Seek` for `InputHandle` (#865, @pkgw) +- Fixes for the latest versions of Clippy + + +# tectonic_io_base 0.3.1 (2021-10-11) + +- No code changes; fixing a couple of docstring typos. + + +# tectonic_io_base 0.3.0 (2021-06-15) + +- Add new "abspath" methods to the IoProvider trait. We need a new API to + generate proper SyncTeX output in the XeTeX engine, and this is the best + approach that we could devise that does a good job of maintaining backwards + compatibility. However, implementors of the IoProvider trait that delegate to + inner implementations will need to make sure to explicitly implement the new + methods in order to provide correct behavior (#762). +- Add a new `app_dirs` module for system-wide knowledge of per-user directories + (@pkgw, #768). It's valuable to put this low in the dependency stack so that + higher-level crates can just "know" where to go for per-user files such as the + bundle cache. +- Correct some broken internal links in the docs. + + +# tectonic_io_base 0.2.0 (2021-06-03) + +- BREAKING: use `&str` for TeX paths rather than `OsStr`. In principle this + prevents users from asking the TeX engine to load up files whose names aren't + expressible in Unicode, but that whole use case really meshes poorly with + Tectonic's goal to provide a portable, uniform user experience. And using + `str` just makes many parts of life much easier. +- Expose a new interface for TeX path normalization. +- If an engine requests to open a file from a filesystem provider, and that name + exists but is a directory, pretend that it's not found. This is sensible behavior + and prevents some hard-to-understand failures (#754) +- Add `FilesystemIo::root()` for users that want to query the root directory of + a filesystem I/O provider. +- Work on the docs a bit + + +# tectonic_io_base 0.1.0 (2021-01-15) + +Initial release: a new crate for basic Tectonic I/O types and traits. 
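
The `tectonic_io_base` 0.2.0 note above about treating an existing directory as "not found" can be illustrated with a minimal sketch. The function name is hypothetical, not the crate's real API; only the behavior it demonstrates comes from the notes.

```rust
use std::fs::File;
use std::io;
use std::path::Path;

/// Open `path` for reading, but report a directory as "not found",
/// mirroring the behavior described in the changelog: an engine asking
/// for a "file" that is actually a directory gets a clean not-found
/// result instead of a confusing EISDIR-style failure later.
fn open_non_dir(path: &Path) -> io::Result<File> {
    if path.is_dir() {
        return Err(io::Error::new(io::ErrorKind::NotFound, "is a directory"));
    }
    File::open(path)
}

fn main() {
    // The current directory certainly exists, yet is reported NotFound.
    let err = open_non_dir(Path::new(".")).unwrap_err();
    assert_eq!(err.kind(), io::ErrorKind::NotFound);
    println!("directory masked as not-found");
}
```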
diff --git a/crates/io_base/Cargo.toml b/crates/io_base/Cargo.toml index 7bba659a7f..1de70e220b 100644 --- a/crates/io_base/Cargo.toml +++ b/crates/io_base/Cargo.toml @@ -19,7 +19,7 @@ edition = "2018" app_dirs2 = "^2.3" flate2 = { version = "^1.0.19", default-features = false, features = ["zlib"] } libc = "^0.2" # for EISDIR :-( -sha2 = "^0.9" # for digest computations +sha2 = "^0.10" # for digest computations thiserror = "1.0" tectonic_errors = { path = "../errors", version = "0.0.0-dev.0" } tectonic_status_base = { path = "../status_base", version = "0.0.0-dev.0" } diff --git a/crates/io_base/src/digest.rs b/crates/io_base/src/digest.rs index b76c4b9846..c0928dcaf9 100644 --- a/crates/io_base/src/digest.rs +++ b/crates/io_base/src/digest.rs @@ -31,7 +31,7 @@ pub struct BadLengthError { pub fn bytes_to_hex(bytes: &[u8]) -> String { bytes .iter() - .map(|b| format!("{:02x}", b)) + .map(|b| format!("{b:02x}")) .collect::>() .concat() } diff --git a/crates/pdf_io/CHANGELOG.md b/crates/pdf_io/CHANGELOG.md index da8254b1b0..273d57c19c 100644 --- a/crates/pdf_io/CHANGELOG.md +++ b/crates/pdf_io/CHANGELOG.md @@ -1,4 +1,9 @@ -# rc: minor bump +# rc: micro bump + +- Tidy up recent Clippy warnings. + + +# tectonic_pdf_io 0.4.0 (2022-10-27) - Make it possible to semi-properly handle the `dvipdfmx:config` special (#904, #953, @vlasakm). 
This should fix some aspects of PDF generation, including diff --git a/crates/pdf_io/build.rs b/crates/pdf_io/build.rs index 9f81e09058..c19c3edc0a 100644 --- a/crates/pdf_io/build.rs +++ b/crates/pdf_io/build.rs @@ -81,7 +81,7 @@ fn main() { fn compile(cfg: &mut cc::Build, s: &str) { cfg.file(s); - println!("cargo:rerun-if-changed={}", s); + println!("cargo:rerun-if-changed={s}"); } ccfg.include("pdf_io") diff --git a/crates/status_base/CHANGELOG.md b/crates/status_base/CHANGELOG.md index 3e2983c96d..6690f2e45c 100644 --- a/crates/status_base/CHANGELOG.md +++ b/crates/status_base/CHANGELOG.md @@ -1,8 +1,21 @@ -# See elsewhere for changelog +# rc: micro bump -This project’s release notes are curated from the Git history of its main -branch. You can find them by looking at [the version of this file on the -`release` branch][branch] or the [GitHub release history][gh-releases]. +- Tidy up recent Clippy warnings. -[branch]: https://github.com/tectonic-typesetting/tectonic/blob/release/crates/status_base/CHANGELOG.md -[gh-releases]: https://github.com/tectonic-typesetting/tectonic/releases + +# tectonic_status_base 0.2.0 (2021-06-15) + +- Add `PlainStatusBackend.always_stderr()`, allowing users to specify that + status-reporting output in this backend should always go to standard error + rather than standard output. This is useful in cases where a program's output + to stdout needs to be machine-parseable, since the status-reporting could + potentially interfere with that if not directed elsewhere (@pkgw, #768). + + +# tectonic_status_base 0.1.0 (2021-01-15) + +Initial release: a new crate with basic Tectonic status-reporting traits. + +A lot of this is admittedly close to generic logging infrastructure, but we do +have some custom methods to help support a nice polished Tectonic UX. And that +will likely continue to be the case going forward. 
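
The `always_stderr` behavior described in the `tectonic_status_base` 0.2.0 notes above amounts to a routing decision per message. A minimal sketch, assuming a simplified stand-in type (the real `PlainStatusBackend` has a richer interface):

```rust
/// Simplified stand-in for PlainStatusBackend: notes normally go to
/// stdout, but can be forced to stderr so that stdout stays
/// machine-parseable.
struct PlainBackend {
    always_stderr: bool,
}

impl PlainBackend {
    /// Where a note-level message is routed.
    fn note_destination(&self) -> &'static str {
        if self.always_stderr {
            "stderr"
        } else {
            "stdout"
        }
    }

    fn note(&self, msg: &str) {
        match self.note_destination() {
            "stderr" => eprintln!("note: {msg}"),
            _ => println!("note: {msg}"),
        }
    }
}

fn main() {
    assert_eq!(PlainBackend { always_stderr: true }.note_destination(), "stderr");
    PlainBackend { always_stderr: false }.note("status output on stdout");
}
```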
diff --git a/crates/status_base/src/lib.rs b/crates/status_base/src/lib.rs index b2da01e731..cdf9598c0b 100644 --- a/crates/status_base/src/lib.rs +++ b/crates/status_base/src/lib.rs @@ -32,12 +32,13 @@ pub enum MessageKind { /// A setting regarding which messages to display. #[repr(usize)] #[non_exhaustive] -#[derive(Clone, Copy, Eq, Debug)] +#[derive(Clone, Copy, Debug, Default, Eq)] pub enum ChatterLevel { /// Suppress all informational output. Minimal = 0, /// Normal output levels. + #[default] Normal, } @@ -52,12 +53,6 @@ impl ChatterLevel { } } -impl Default for ChatterLevel { - fn default() -> Self { - ChatterLevel::Normal - } -} - impl FromStr for ChatterLevel { type Err = &'static str; @@ -121,7 +116,7 @@ pub trait StatusBackend { fn note_highlighted(&mut self, before: &str, highlighted: &str, after: &str) { self.report( MessageKind::Note, - format_args!("{}{}{}", before, highlighted, after), + format_args!("{before}{highlighted}{after}"), None, ) } diff --git a/crates/status_base/src/plain.rs b/crates/status_base/src/plain.rs index 79b7c6f30e..7c40c91c96 100644 --- a/crates/status_base/src/plain.rs +++ b/crates/status_base/src/plain.rs @@ -54,14 +54,14 @@ impl StatusBackend for PlainStatusBackend { }; if kind == MessageKind::Note && !self.always_stderr { - println!("{} {}", prefix, args); + println!("{prefix} {args}"); } else { - eprintln!("{} {}", prefix, args); + eprintln!("{prefix} {args}"); } if let Some(e) = err { for item in e.chain() { - eprintln!("caused by: {}", item); + eprintln!("caused by: {item}"); } } } @@ -70,7 +70,7 @@ impl StatusBackend for PlainStatusBackend { let mut prefix = "error"; for item in err.chain() { - eprintln!("{}: {}", prefix, item); + eprintln!("{prefix}: {item}"); prefix = "caused by"; } } @@ -78,7 +78,7 @@ impl StatusBackend for PlainStatusBackend { fn note_highlighted(&mut self, before: &str, highlighted: &str, after: &str) { self.report( MessageKind::Note, - format_args!("{}{}{}", before, highlighted, after), + 
format_args!("{before}{highlighted}{after}"), None, ); } diff --git a/crates/xdv/CHANGELOG.md b/crates/xdv/CHANGELOG.md index ce7092b08d..9e3d1c447c 100644 --- a/crates/xdv/CHANGELOG.md +++ b/crates/xdv/CHANGELOG.md @@ -1,8 +1,36 @@ -# See elsewhere for changelog +# rc: micro bump -This project’s release notes are curated from the Git history of its main -branch. You can find them by looking at [the version of this file on the -`release` branch][branch] or the [GitHub release history][gh-releases]. +- Tidy up formatting and recent Clippy warnings. -[branch]: https://github.com/tectonic-typesetting/tectonic/blob/release/crates/xdv/CHANGELOG.md -[gh-releases]: https://github.com/tectonic-typesetting/tectonic/releases + +# tectonic_xdv 0.2.1 (2022-10-04) + +- Remove a quasi-debug println (#941, @pkgw). + + +# tectonic_xdv 0.2.0 (2022-02-28) + +- Significant API reworks and extensions to support the forthcoming prototype + HTML output mode (#865, @pkgw) +- Fix some new Clippy complaints + + +# tectonic_xdv 0.1.12 (2021-06-03) + +- Fix a potential source of undefined behavior (#752) + + +# tectonic_xdv 0.1.11 (2021-01-16) + +- Bump `byteorder` dep from 1.3.x series to 1.4.x series + + +# tectonic_xdv 0.1.10 (2020-10-21) + +- No code changes; just issuing a new release to update deps and silence + Cranko's change detection. + + +# tectonic_xdv 0.1.9 (2020-09-07) + +- No code changes; testing new Cranko-powered release workflow. 
diff --git a/crates/xdv/examples/xdvdump.rs b/crates/xdv/examples/xdvdump.rs index 67c6163b29..e6695182c0 100644 --- a/crates/xdv/examples/xdvdump.rs +++ b/crates/xdv/examples/xdvdump.rs @@ -30,13 +30,13 @@ impl Display for Error { impl From for Error { fn from(e: io::Error) -> Self { - Error(format!("{}", e)) // note: weirdly, can't use `Self` on this line + Error(format!("{e}")) // note: weirdly, can't use `Self` on this line } } impl From for Error { fn from(e: XdvError) -> Self { - Error(format!("{}", e)) + Error(format!("{e}")) } } @@ -52,14 +52,14 @@ impl tectonic_xdv::XdvEvents for Stats { type Error = Error; fn handle_header(&mut self, filetype: FileType, comment: &[u8]) -> Result<(), Self::Error> { - println!("file type: {}", filetype); + println!("file type: {filetype}"); match str::from_utf8(comment) { Ok(s) => { - println!("comment: {}", s); + println!("comment: {s}"); } Err(e) => { - println!("cannot parse comment: {}", e); + println!("cannot parse comment: {e}"); } }; @@ -78,8 +78,7 @@ impl tectonic_xdv::XdvEvents for Stats { embolden: Option, ) -> Result<(), Self::Error> { println!( - "define native font: `{}` num={} size={} faceIndex={} color={:?} extend={:?} slant={:?} embolden={:?}", - name, font_num, size, face_index, color_rgba, extend, slant, embolden + "define native font: `{name}` num={font_num} size={size} faceIndex={face_index} color={color_rgba:?} extend={extend:?} slant={slant:?} embolden={embolden:?}" ); Ok(()) } @@ -109,10 +108,10 @@ impl tectonic_xdv::XdvEvents for Stats { fn handle_special(&mut self, x: i32, y: i32, contents: &[u8]) -> Result<(), Self::Error> { match str::from_utf8(contents) { Ok(s) => { - println!("special: {} (@ {},{})", s, x, y); + println!("special: {s} (@ {x},{y})"); } Err(e) => { - println!("cannot UTF8-parse special: {}", e); + println!("cannot UTF8-parse special: {e}"); } }; @@ -121,10 +120,7 @@ impl tectonic_xdv::XdvEvents for Stats { fn handle_char_run(&mut self, font_num: i32, chars: &[i32]) -> Result<(), 
Self::Error> { let all_ascii_printable = chars.iter().all(|c| *c > 0x20 && *c < 0x7F); - println!( - "chars font={}: {:?} all_ascii_printable={:?}", - font_num, chars, all_ascii_printable - ); + println!("chars font={font_num}: {chars:?} all_ascii_printable={all_ascii_printable:?}"); Ok(()) } @@ -135,12 +131,12 @@ impl tectonic_xdv::XdvEvents for Stats { x: &[i32], y: &[i32], ) -> Result<(), Self::Error> { - println!("glyphs font={}: {:?} (@ {:?}, {:?}", font_num, glyphs, x, y); + println!("glyphs font={font_num}: {glyphs:?} (@ {x:?}, {y:?}"); Ok(()) } fn handle_rule(&mut self, x: i32, y: i32, height: i32, width: i32) -> Result<(), Self::Error> { - println!("rule W={} H={} @ {:?}, {:?}", width, height, x, y); + println!("rule W={width} H={height} @ {x:?}, {y:?}"); Ok(()) } } @@ -164,7 +160,7 @@ fn main() { let path = matches.value_of_os("PATH").unwrap(); - let file = match File::open(&path) { + let file = match File::open(path) { Ok(f) => f, Err(e) => { eprintln!( @@ -203,6 +199,6 @@ fn main() { } }; - println!("{} bytes parsed.", n_bytes); + println!("{n_bytes} bytes parsed."); } } diff --git a/crates/xdv/src/lib.rs b/crates/xdv/src/lib.rs index 00b93a154a..4acfb526f1 100644 --- a/crates/xdv/src/lib.rs +++ b/crates/xdv/src/lib.rs @@ -47,22 +47,18 @@ impl Display for XdvError { fn fmt(&self, f: &mut Formatter) -> Result<(), FmtError> { match *self { XdvError::Malformed(offset) => { - write!(f, "unexpected XDV data at byte offset {}", offset) + write!(f, "unexpected XDV data at byte offset {offset}") } XdvError::IllegalOpcode(opcode, offset) => { - write!(f, "illegal XDV opcode {} at byte offset {}", opcode, offset) + write!(f, "illegal XDV opcode {opcode} at byte offset {offset}") } XdvError::UnexpectedEndOfStream => write!(f, "stream ended unexpectedly soon"), - XdvError::FromUTF8(offset) => write!( - f, - "illegal UTF8 sequence starting at byte offset {}", - offset - ), - XdvError::FromUTF16(offset) => write!( - f, - "illegal UTF16 sequence starting at byte 
offset {}", - offset - ), + XdvError::FromUTF8(offset) => { + write!(f, "illegal UTF8 sequence starting at byte offset {offset}") + } + XdvError::FromUTF16(offset) => { + write!(f, "illegal UTF16 sequence starting at byte offset {offset}") + } } } } @@ -86,7 +82,7 @@ impl error::Error for XdvError { /// In case you want to use String as your error type. impl From for String { fn from(e: XdvError) -> Self { - format!("{}", e) + format!("{e}") } } @@ -361,7 +357,7 @@ impl XdvParser { // Now we can do the main content. - stream.seek(SeekFrom::Start(0))?; + stream.rewind()?; parser.mode = ParserMode::UntilPostamble; parser.state = ParserState::Preamble; parser.process_part(&mut stream)?; @@ -954,7 +950,7 @@ impl XdvParser { } let char_num = cursor.get_compact_i32_smpos(opcode - Opcode::SetChar1 as u8)?; - self.cur_char_run.push(char_num as i32); + self.cur_char_run.push(char_num); Ok(()) } diff --git a/crates/xetex_format/CHANGELOG.md b/crates/xetex_format/CHANGELOG.md index 50660d57fa..e4b0fc991e 100644 --- a/crates/xetex_format/CHANGELOG.md +++ b/crates/xetex_format/CHANGELOG.md @@ -1,8 +1,35 @@ -# See elsewhere for changelog +# rc: micro bump -This project’s release notes are curated from the Git history of its main -branch. You can find them by looking at [the version of this file on the -`release` branch][branch] or the [GitHub release history][gh-releases]. +- Tidy up recent Clippy warnings. -[branch]: https://github.com/tectonic-typesetting/tectonic/blob/release/crates/xetex_format/CHANGELOG.md -[gh-releases]: https://github.com/tectonic-typesetting/tectonic/releases + +# tectonic_xetex_format 0.3.0 (2022-10-03) + +- Define version 33 of the format in support of TeXLive 2022.0 (#936, @pkgw) + - Remove CHAR_SUB_CODE_BASE from here on out. It is only needed for MLTeX, + which is disabled in XeTeX. This section of the equivalents table had one + entry for every USV, which is a lot of space. 
+ - Synchronize PRIM_SIZE with TeXLive 2022, and provide PRIM_PRIME + +# tectonic_xetex_format 0.2.0 (2022-04-26) + +Update for TeXLive 2021 (#882, @pkgw): + +- There is one new integer parameter: `\tracingstacklevels` +- Bump `PRIM_SIZE` to 510, since we have passed 500 primitives! + + +# tectonic_xetex_format 0.1.0 (2022-02-28) + +The new `tectonic_xetex_format` crate defines metadata about the Tectonic/XeTeX +engine implementation. It has two major use cases: + +- Generate the C headers used by `tectonic_engine_xetex` for its implementation +- Allow introspection of Tectonic/XeTeX "format files" + +This latter functionality will allow use to answer questions such as "what +control strings are defined in this LaTeX format?" or "what is the built-in +definition of this macro?" + +The elements of the format definition are all versioned, so that as the engine +evolves we should retain the ability to introspect older formats. diff --git a/crates/xetex_format/examples/decode.rs b/crates/xetex_format/examples/decode.rs index 84e9c7d26d..11c9369719 100644 --- a/crates/xetex_format/examples/decode.rs +++ b/crates/xetex_format/examples/decode.rs @@ -117,7 +117,7 @@ fn main() { let options = Options::from_args(); if let Err(e) = options.execute() { - eprintln!("error: {}", e); + eprintln!("error: {e}"); process::exit(1); } } diff --git a/crates/xetex_format/examples/emit.rs b/crates/xetex_format/examples/emit.rs index 1c9300bf0c..1605c88591 100644 --- a/crates/xetex_format/examples/emit.rs +++ b/crates/xetex_format/examples/emit.rs @@ -17,7 +17,7 @@ fn inner() -> Result<()> { fn main() { if let Err(e) = inner() { - eprintln!("error: {}", e); + eprintln!("error: {e}"); process::exit(1); } } diff --git a/crates/xetex_format/src/commands.rs b/crates/xetex_format/src/commands.rs index 1c09abbb4b..0046c17094 100644 --- a/crates/xetex_format/src/commands.rs +++ b/crates/xetex_format/src/commands.rs @@ -368,7 +368,7 @@ impl Commands { if let Some(cmd) = self.codes.get(&code) { 
cmd.describe(arg) } else { - format!("[??? {} {}]", code, arg) + format!("[??? {code} {arg}]") } } @@ -381,7 +381,7 @@ impl Commands { if let Some(cmd) = self.codes.get(&code) { (cmd.describe(arg), cmd.extended_info(arg, format)) } else { - (format!("[??? {} {}]", code, arg), None) + (format!("[??? {code} {arg}]"), None) } } @@ -412,7 +412,7 @@ typedef struct xetex_format_primitive_def_t {{ for cmd in self.codes.values() { for prim in cmd.primitives() { let arg = match prim.arg { - ArgKind::Unnamed(v) => format!("{}", v), + ArgKind::Unnamed(v) => format!("{v}"), ArgKind::Symbol(s) => s.to_string(), }; diff --git a/crates/xetex_format/src/commands/simpleenum.rs b/crates/xetex_format/src/commands/simpleenum.rs index d9780819e0..257eea3ae3 100644 --- a/crates/xetex_format/src/commands/simpleenum.rs +++ b/crates/xetex_format/src/commands/simpleenum.rs @@ -276,9 +276,9 @@ impl Command for ShorthandDef { TOKS => "toksdef", XETEX_MATH_CHAR_NUM => "XeTeXmathcharnumdef", XETEX_MATH_CHAR => "XeTeXmathchardef", - _ => return format!("[{:?}?? {}]", self, arg), + _ => return format!("[{self:?}?? 
{arg}]"), }; - format!("[{}]", s) + format!("[{s}]") } fn primitives(&self) -> Vec { diff --git a/crates/xetex_format/src/cshash.rs b/crates/xetex_format/src/cshash.rs index d0b031f1f7..7c2e2adccf 100644 --- a/crates/xetex_format/src/cshash.rs +++ b/crates/xetex_format/src/cshash.rs @@ -190,7 +190,7 @@ impl ControlSeqHash { // The 1 here is formally ACTIVE_BASE return Some(format!( "[active character {}]", - crate::format::fmt_usv(p as i32 - 1) + crate::format::fmt_usv(p - 1) )); } } diff --git a/crates/xetex_format/src/eqtb.rs b/crates/xetex_format/src/eqtb.rs index 4f0d7baedf..f3a80ea6b0 100644 --- a/crates/xetex_format/src/eqtb.rs +++ b/crates/xetex_format/src/eqtb.rs @@ -108,7 +108,7 @@ impl EquivalenciesTable { engine.symbols.lookup("UNDEFINED_CONTROL_SEQUENCE") as EqtbPointer; let undefined_cs_cmd = engine.symbols.lookup("UNDEFINED_CS") as CommandCode; - let mut eqtb = vec![0; (eqtb_top as usize + 1) * SIZEOF_MEMORY_WORD]; + let mut eqtb = vec![0; (eqtb_top + 1) * SIZEOF_MEMORY_WORD]; write_eqtb_type(&mut eqtb[..], undefined_control_sequence, undefined_cs_cmd); write_eqtb_value(&mut eqtb[..], undefined_control_sequence, TEX_NULL); diff --git a/crates/xetex_format/src/format.rs b/crates/xetex_format/src/format.rs index bf04392682..00f6228ef9 100644 --- a/crates/xetex_format/src/format.rs +++ b/crates/xetex_format/src/format.rs @@ -73,7 +73,7 @@ impl Format { pub fn dump_string_table(&self, stream: &mut W) -> Result<()> { for sp in self.strings.all_sps() { let value = self.strings.lookup(sp); - writeln!(stream, "{} = \"{}\"", sp, value)?; + writeln!(stream, "{sp} = \"{value}\"")?; } Ok(()) @@ -165,10 +165,10 @@ impl Format { (self.engine.commands.describe(entry.ty, entry.value), None) }; - writeln!(stream, "{} => {}", cs_desc, cmd_desc)?; + writeln!(stream, "{cs_desc} => {cmd_desc}")?; if let Some(e) = extended { - writeln!(stream, "--------\n{}\n--------", e)?; + writeln!(stream, "--------\n{e}\n--------")?; } } @@ -240,7 +240,7 @@ impl Format { } (true, 5 /* 
OUT_PARAM */) => { - writeln!(result, "#{}", chr).unwrap(); + writeln!(result, "#{chr}").unwrap(); } _ => { @@ -271,9 +271,9 @@ impl Format { fn fmt_cs_pointer(&self, ptr: EqtbPointer) -> String { if let Some(text) = self.cshash.stringify(ptr, &self.strings) { - fmt_csname(&text) + fmt_csname(text) } else { - format!("[undecodable cseq pointer {}]", ptr) + format!("[undecodable cseq pointer {ptr}]") } } @@ -314,7 +314,7 @@ fn parse_body(engine: Engine, input: &[u8]) -> IResult<&[u8], Format> { let (input, hash_high) = be_i32(input)?; let (input, _mem_top) = parseutils::satisfy_be_i32(mem_top)(input)?; let (input, _eqtb_size) = parseutils::satisfy_be_i32(eqtb_size)(input)?; - let (input, _hash_prime) = parseutils::satisfy_be_i32(hash_prime as i32)(input)?; + let (input, _hash_prime) = parseutils::satisfy_be_i32(hash_prime)(input)?; let (input, _hyph_prime) = be_i32(input)?; // string table @@ -330,9 +330,9 @@ fn parse_body(engine: Engine, input: &[u8]) -> IResult<&[u8], Format> { let (input, eqtb) = eqtb::EquivalenciesTable::parse(input, &engine, hash_high)?; // nominally hash_top, but hash_top = eqtb_top since hash_extra is nonzero - let (input, _par_loc) = parseutils::ranged_be_i32(hash_base as i32, eqtb_top as i32)(input)?; + let (input, _par_loc) = parseutils::ranged_be_i32(hash_base, eqtb_top)(input)?; - let (input, _write_loc) = parseutils::ranged_be_i32(hash_base as i32, eqtb_top as i32)(input)?; + let (input, _write_loc) = parseutils::ranged_be_i32(hash_base, eqtb_top)(input)?; // Primitives. TODO: figure out best type for `prims`. 
@@ -349,7 +349,7 @@ fn parse_body(engine: Engine, input: &[u8]) -> IResult<&[u8], Format> { let (input, _font_info) = count(be_i64, fmem_ptr as usize)(input)?; // NB: FONT_BASE = 0 - let (input, font_ptr) = parseutils::ranged_be_i32(0, max_fonts as i32)(input)?; + let (input, font_ptr) = parseutils::ranged_be_i32(0, max_fonts)(input)?; let n_fonts = font_ptr as usize + 1; let (input, _font_check) = count(be_i64, n_fonts)(input)?; @@ -481,18 +481,18 @@ pub fn fmt_usv(c: i32) -> String { if let Some(chr) = maybe_chr { if chr == ' ' { - format!("' ' (0x{:06x})", c) + format!("' ' (0x{c:06x})") } else if chr == '\'' { - format!("\\' (0x{:06x})", c) + format!("\\' (0x{c:06x})") } else if chr == '\"' { - format!("\\\" (0x{:06x})", c) + format!("\\\" (0x{c:06x})") } else if chr.is_control() || chr.is_whitespace() { format!("{} (0x{:06x})", chr.escape_default(), c) } else { - format!("{} (0x{:06x})", chr, c) + format!("{chr} (0x{c:06x})") } } else { - format!("*invalid* (0x{:06x})", c) + format!("*invalid* (0x{c:06x})") } } @@ -503,7 +503,7 @@ pub fn fmt_csname>(name: S) -> String { match (name.len(), has_ws) { (0, _) => "[null CS]".to_owned(), (1, _) => format!("'\\{}'", fmt_usv(name.chars().next().unwrap() as i32)), - (_, false) => format!("\\{}", name), - (_, true) => format!("\"\\{}\"", name), + (_, false) => format!("\\{name}"), + (_, true) => format!("\"\\{name}\""), } } diff --git a/crates/xetex_format/src/mem.rs b/crates/xetex_format/src/mem.rs index d1e4f096b7..ebb99936dd 100644 --- a/crates/xetex_format/src/mem.rs +++ b/crates/xetex_format/src/mem.rs @@ -46,7 +46,7 @@ impl Memory { // Compressed memory loading; - let mut mem = vec![0; (mem_top as usize + 1) * SIZEOF_MEMORY_WORD]; + let mut mem = vec![0; (mem_top + 1) * SIZEOF_MEMORY_WORD]; let mut input = input; let mut p = 0; let mut q = rover; @@ -72,7 +72,7 @@ impl Memory { } // Loading the rest of low memory. TODO: straight into `mem`? 
- let nb = (lo_mem_max + 1 - p as i32) as usize * SIZEOF_MEMORY_WORD; + let nb = (lo_mem_max + 1 - p) as usize * SIZEOF_MEMORY_WORD; let (input, block) = count(be_u8, nb)(input)?; let idx = p as usize * SIZEOF_MEMORY_WORD; mem[idx..idx + nb].copy_from_slice(&block[..]); diff --git a/crates/xetex_format/src/symbols.rs b/crates/xetex_format/src/symbols.rs index d74d2ef5a6..7655c9e949 100644 --- a/crates/xetex_format/src/symbols.rs +++ b/crates/xetex_format/src/symbols.rs @@ -168,7 +168,7 @@ impl SymbolTable { if let Some(prev) = self.by_name.insert(name.clone(), value) { // We let identical values get re-inserted, mainly for NORMAL. - ensure!(prev == value, format!("changed symbol name `{}`", name)); + ensure!(prev == value, format!("changed symbol name `{name}`")); } else { let group = self.grouped.entry(cat).or_insert_with(Vec::new); group.push(name); @@ -195,7 +195,7 @@ impl SymbolTable { for name in names { let value = self.by_name.get(name).unwrap(); - writeln!(stream, "#define {} {} /* = 0x{:x} */", name, value, value)?; + writeln!(stream, "#define {name} {value} /* = 0x{value:x} */")?; } } diff --git a/crates/xetex_layout/CHANGELOG.md b/crates/xetex_layout/CHANGELOG.md index 35f43fc965..329ed1d139 100644 --- a/crates/xetex_layout/CHANGELOG.md +++ b/crates/xetex_layout/CHANGELOG.md @@ -1,8 +1,33 @@ -# See elsewhere for changelog +# rc: micro bump -This project’s release notes are curated from the Git history of its main -branch. You can find them by looking at [the version of this file on the -`release` branch][branch] or the [GitHub release history][gh-releases]. +- Tidy up recent Clippy warnings. -[branch]: https://github.com/tectonic-typesetting/tectonic/blob/release/crates/xetex_layout/CHANGELOG.md -[gh-releases]: https://github.com/tectonic-typesetting/tectonic/releases + +# tectonic_xetex_layout 0.2.1 (2022-10-03) + +- Work around ICU limitations in Alpine 3.16. 
The latest version of Alpine Linux + seems to provide a static ICU that no longer has the "macintosh" converter + built in. So don't error out if it fails to load; just hope that everything + will be OK. + + +# tectonic_xetex_layout 0.2.0 (2022-04-26) + +Update for TeXLive 2021 (#882, @pkgw). + +- Add new C API needed for TeXLive 2021: `ttxl_font_get_point_size`. + + +# tectonic_xetex_layout 0.1.1 (2021-10-11) + +- Require the latest version of `tectonic_bridge_graphite2`, which contains a + Windows build fix. +- Fixes for Clippy 1.53.0 + + +# tectonic_xetex_layout 0.1.0 (2021-06-03) + +This new crate encapsulates the font selection and layout code used by the +`tectonic_engine_xetex` crate. While it mostly consists of C/C++ code at the +moment and does not expose a Rust API, there is a hope that it can be made more +flexible and that its implementation can be migrated to be more Rust-based. diff --git a/crates/xetex_layout/build.rs b/crates/xetex_layout/build.rs index 7f0accf93a..551364332a 100644 --- a/crates/xetex_layout/build.rs +++ b/crates/xetex_layout/build.rs @@ -123,7 +123,7 @@ fn main() { fn compile(cfg: &mut cc::Build, s: &str) { cfg.file(s); - println!("cargo:rerun-if-changed={}", s); + println!("cargo:rerun-if-changed={s}"); } cppcfg @@ -214,22 +214,22 @@ fn main() { // that allows us to have a network of crates containing both C/C++ and Rust // code that all interlink. 
- print!("cargo:include-path={}", out_dir); + print!("cargo:include-path={out_dir}"); for item in harfbuzz_include_path.split(';') { - print!(";{}", item); + print!(";{item}"); } for item in freetype2_include_path.split(';') { - print!(";{}", item); + print!(";{item}"); } for item in graphite2_include_path.split(';') { - print!(";{}", item); + print!(";{item}"); } for item in icu_include_path.split(';') { - print!(";{}", item); + print!(";{item}"); } println!(); diff --git a/dist/azure-build-and-test-vcpkg.yml b/dist/azure-build-and-test-vcpkg.yml index edf8cedd99..96ccd583e0 100644 --- a/dist/azure-build-and-test-vcpkg.yml +++ b/dist/azure-build-and-test-vcpkg.yml @@ -13,9 +13,6 @@ parameters: - name: canaryBuild type: boolean default: false -- name: windowsVcpkgWorkaround - type: boolean - default: false - name: testIt type: boolean default: true @@ -40,52 +37,37 @@ steps: displayName: "Install vcpkg dependencies (Ubuntu)" condition: and(succeeded(), eq(variables['Agent.OS'], 'Linux')) +- bash: | + set -xeuo pipefail + echo CUSTOM VCPKG + ###cargo install cargo-vcpkg + cargo install --git https://github.com/mcgoo/cargo-vcpkg --branch master cargo-vcpkg + displayName: Install cargo-vcpkg + # Note: setvariable + `set -x` adds spurious single quotes at ends of variable values - bash: | echo "##vso[task.setvariable variable=VCPKG_ROOT;]$(pwd)/target/vcpkg" echo "##vso[task.setvariable variable=TECTONIC_DEP_BACKEND;]vcpkg" displayName: Setup build variables -- ${{ if parameters.windowsVcpkgWorkaround }}: - - download: current - - - bash: | - set -xeuo pipefail - BASH_WORKSPACE="$(Pipeline.Workspace)" - - # work around https://github.com/microsoft/azure-pipelines-tasks/issues/10653 - # (in an `if` statement for future-proofiness) - if [[ $AGENT_OS == Windows_NT ]] ; then - BASH_WORKSPACE=$(echo "$BASH_WORKSPACE" | sed -e 's|\\|\/|g' -e 's|^\([A-Za-z]\)\:/\(.*\)|/\L\1\E/\2|') - fi - - mkdir -p target - mv $BASH_WORKSPACE/vcpkg-deps-windows target/vcpkg - 
displayName: Recover vcpkg deps from artifacts - -- ${{ if not(parameters.windowsVcpkgWorkaround) }}: - - bash: | - set -xeuo pipefail - cargo install cargo-vcpkg - displayName: Install cargo-vcpkg - - - bash: | - set -xeuo pipefail - cargo vcpkg -v build --target $TARGET - ls target/vcpkg - echo target/vcpkg/installed/* - ls target/vcpkg/installed/*/lib - displayName: Build vcpkg deps - # Without RUST_TEST_THREAD=1, on Windows the doctests fail with a # PermissionDenied issue that seems to be due to creating multiple tempfiles in -# the same directory (tests/) at once. +# the same directory (tests/) at once. $VCPKG_DEFAULT_HOST_TRIPLET speeds +# up our builds a bit by further reducing the number of different +# builds we need to do. - bash: | echo "##vso[task.setvariable variable=RUSTFLAGS;]-Ctarget-feature=+crt-static" + echo "##vso[task.setvariable variable=VCPKGRS_TRIPLET;]x64-windows-static-release" + echo "##vso[task.setvariable variable=VCPKG_DEFAULT_HOST_TRIPLET;]x64-windows-static-release" echo "##vso[task.setvariable variable=RUST_TEST_THREADS;]1" displayName: Setup build variables (Windows) condition: and(succeeded(), eq(variables['Agent.OS'], 'Windows_NT')) +- bash: | + set -xeuo pipefail + cargo vcpkg -v build --target $TARGET + displayName: Build vcpkg deps + - template: azure-generic-build.yml parameters: canaryBuild: ${{ parameters.canaryBuild }} diff --git a/dist/azure-build-and-test.yml b/dist/azure-build-and-test.yml index f16661f689..4cbb1c9bca 100644 --- a/dist/azure-build-and-test.yml +++ b/dist/azure-build-and-test.yml @@ -157,8 +157,7 @@ parameters: - name: x86_64_pc_windows_msvc vmImage: windows-2019 - params: - windowsVcpkgWorkaround: true + params: {} vars: TARGET: x86_64-pc-windows-msvc TOOLCHAIN: stable-x86_64-pc-windows-msvc @@ -199,8 +198,6 @@ jobs: # vcpkg builds - ${{ each build in parameters.vcpkgBuilds }}: - job: ${{ format('build_{0}_vcpkg', build.name) }} - ${{ if eq(build.name, 'x86_64_pc_windows_msvc') }}: # work around 
timeouts with slow builds - dependsOn: windows_vcpkg_prebuild pool: vmImage: ${{ build.vmImage }} steps: @@ -302,38 +299,3 @@ jobs: artifactName: book - bash: cd docs && ../mdbook test displayName: mdbook test - -# Hack to build Windows vcpkg deps in their own job, because it takes so long -# that the Windows jobs routinely hit the 60-minute timeout. -- job: windows_vcpkg_prebuild - pool: - vmImage: windows-2019 - steps: - - bash: | - echo "##vso[task.setvariable variable=VCPKG_ROOT;]$(pwd)/target/vcpkg" - displayName: Set up build variables - - - bash: cargo install cargo-vcpkg - displayName: Install cargo-vcpkg - - - bash: | - set -xeuo pipefail - cargo vcpkg -v build - - # There is something weird about buildtrees/icu/ that prevents - # us from easily rm -rf'ing it. Too bad because that directory - # is easily the largest part of the vcpkg tree. - rm -rf target/vcpkg/downloads - cd target/vcpkg/buildtrees - for d in * ; do - if [ $d != icu ] ; then - rm -rf $d - fi - done - displayName: Build vcpkg deps - - - task: PublishPipelineArtifact@1 - displayName: Publish vcpkg deps as artifact - inputs: - targetPath: 'target/vcpkg' - artifactName: vcpkg-deps-windows diff --git a/dist/azure-coverage.yml b/dist/azure-coverage.yml index c7d01a1fc5..c7ab028edd 100644 --- a/dist/azure-coverage.yml +++ b/dist/azure-coverage.yml @@ -53,7 +53,6 @@ steps: # support in the test harness to wrap the invocations in `kcov` calls. 
- bash: | set -xeuo pipefail - p="$(pwd)" export TECTONIC_EXETEST_KCOV_RUNNER="kcov --include-path=$(pwd) --exclude-pattern=/tests/" cargo test --test executable displayName: Special-case executable tests diff --git a/dist/vcpkg-triplets/x64-windows-static-release.cmake b/dist/vcpkg-triplets/x64-windows-static-release.cmake new file mode 100644 index 0000000000..42894919c8 --- /dev/null +++ b/dist/vcpkg-triplets/x64-windows-static-release.cmake @@ -0,0 +1,4 @@ +set(VCPKG_TARGET_ARCHITECTURE x64) +set(VCPKG_CRT_LINKAGE static) +set(VCPKG_LIBRARY_LINKAGE static) +set(VCPKG_BUILD_TYPE release) diff --git a/docs/src/SUMMARY.md b/docs/src/SUMMARY.md index eb421945b1..ad67e0254e 100644 --- a/docs/src/SUMMARY.md +++ b/docs/src/SUMMARY.md @@ -16,6 +16,7 @@ - [`tectonic -X compile`](v2cli/compile.md) - [`tectonic -X dump`](v2cli/dump.md) - [`tectonic -X new`](v2cli/new.md) +- [`tectonic -X init`](v2cli/init.md) - [`tectonic -X show`](v2cli/show.md) - [`tectonic -X watch`](v2cli/watch.md) diff --git a/docs/src/howto/build-tectonic/cargo-vcpkg-dep-install.md b/docs/src/howto/build-tectonic/cargo-vcpkg-dep-install.md index aaf260bcaf..fec21c8488 100644 --- a/docs/src/howto/build-tectonic/cargo-vcpkg-dep-install.md +++ b/docs/src/howto/build-tectonic/cargo-vcpkg-dep-install.md @@ -32,13 +32,17 @@ export VCPKG_ROOT="${CARGO_TARGET_DIR:-$(pwd)/target}/vcpkg" [bash]: https://www.gnu.org/software/bash/ If you’re building on Windows, you’ll likely want to make sure that your -[`RUSTFLAGS`] variable includes a `+crt-static` [target feature] to get the -[vcpkg] build scripts to use the `x64-windows-static` [vcpkg triplet], which is -the default one used by our [cargo-vcpkg] setup, as opposed to -`x64-windows-static-md`, which is activated otherwise. And if you’ve done the -full vcpkg install, you might as well build with [an external Harfbuzz][external-harfbuzz]. 
-Therefore a full Windows build invocation — launched from bash — might look like -this: +[`RUSTFLAGS`] variable includes a `+crt-static` [target feature] and set the +`VCPKGRS_TRIPLET` variable to `x64-windows-static-release`. This is a custom +[vcpkg triplet] provided by Tectonic's build system (in the directory +`dist/vcpkg-triplets`) that is automatically activated by its [cargo-vcpkg] +integration. If you don't use [cargo-vcpkg], the default triplet is +`x64-windows-static` if the `+crt-static` feature is activated, or +`x64-windows-static-md` if it is not. + +If you’ve done the full vcpkg install, you might as well build with [an external +Harfbuzz][external-harfbuzz]. Therefore a full Windows build invocation — +launched from bash — might look like this: [`RUSTFLAGS`]: https://doc.rust-lang.org/cargo/reference/environment-variables.html [target feature]: https://rust-lang.github.io/packed_simd/perf-guide/target-feature/rustflags.html @@ -49,6 +53,7 @@ this: cargo vcpkg build export VCPKG_ROOT="${CARGO_TARGET_DIR:-$(pwd)/target}/vcpkg" export RUSTFLAGS='-Ctarget-feature=+crt-static' # Windows only +export VCPKGRS_TRIPLET='x64-windows-static-release' # Windows only export TECTONIC_DEP_BACKEND=vcpkg cargo build --features external-harfbuzz ``` diff --git a/docs/src/ref/tectonic-toml.md b/docs/src/ref/tectonic-toml.md index 276d0de357..3d4d7ad3a7 100644 --- a/docs/src/ref/tectonic-toml.md +++ b/docs/src/ref/tectonic-toml.md @@ -40,7 +40,7 @@ filesystem-friendly. ### `doc.bundle` A string identifying the location of the “bundle” of TeX support files -underyling the processing of the document. +underlying the processing of the document. In most circumstances this value should be a URL. The `tectonic -X new` command will populate this field with the current recommended default. 
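As a concrete illustration of the `doc.bundle` field documented in the hunk above, a freshly created document's `Tectonic.toml` might look roughly like the following sketch. This is a hand-written approximation, not the exact output of `tectonic -X new`: the document name `mydoc` is hypothetical, and only the bundle URL is taken from this release's notes.

```toml
# Illustrative sketch of a minimal Tectonic.toml; the exact fields and
# default values are whatever `tectonic -X new` writes at creation time.
[doc]
name = "mydoc"
bundle = "https://data1.fullyjustified.net/tlextras-2022.0r0.tar"
```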
diff --git a/docs/src/v2cli/init.md b/docs/src/v2cli/init.md new file mode 100644 index 0000000000..bf3448bdea --- /dev/null +++ b/docs/src/v2cli/init.md @@ -0,0 +1,39 @@ +# tectonic -X init + +Initializes a new Tectonic workspace in the current directory. + +**_This is a [V2 CLI][v2cli-ref] command. For information on the original (“V1” +CLI), see [its reference page][v1cli-ref]._** + +[v2cli-ref]: ../ref/v2cli.md +[v1cli-ref]: ../ref/v1cli.md + +#### Usage Synopsis + +```sh +tectonic -X init +``` + +#### Remarks + +This command will create a bare-bones [Tectonic.toml][tectonic-toml] file in the +target directory. The project’s name will be initialized to the name of the +workspace directory. + +[tectonic-toml]: ../ref/tectonic-toml.md + +It will also create placeholder source files in a `src` subdirectory: +`index.tex`, `_preamble.tex`, and `_postamble.tex`. The default build command +will process these files in the expected order: + +1. `src/_preamble.tex` +2. `src/index.tex` +3. `src/_postamble.tex` + +The intention of this framework is to allow you to isolate the main content of +your document from the usual LaTeX boilerplate. There are no restrictions on +what kind of content may be placed in each file, though. + +#### See Also + +- [`tectonic -X new`](./new.md) diff --git a/docs/src/v2cli/new.md b/docs/src/v2cli/new.md index 316ad995da..a3e13975a3 100644 --- a/docs/src/v2cli/new.md +++ b/docs/src/v2cli/new.md @@ -2,8 +2,8 @@ Create a new Tectonic workspace. -***This is a [V2 CLI][v2cli-ref] command. For information on the original (“V1” -CLI), see [its reference page][v1cli-ref].*** +**_This is a [V2 CLI][v2cli-ref] command. For information on the original (“V1” +CLI), see [its reference page][v1cli-ref]._** [v2cli-ref]: ../ref/v2cli.md [v1cli-ref]: ../ref/v1cli.md @@ -36,3 +36,7 @@ will process these files in the expected order: The intention of this framework is to allow you to isolate the main content of your document from the usual LaTeX boilerplate. 
There are no restrictions on what kind of content may be placed in each file, though. + +#### See Also + +- [`tectonic -X init`](./init.md) diff --git a/src/bin/tectonic/v2cli.rs b/src/bin/tectonic/v2cli.rs index 3aa96f4885..f682aad0ba 100644 --- a/src/bin/tectonic/v2cli.rs +++ b/src/bin/tectonic/v2cli.rs @@ -4,7 +4,10 @@ //! The "v2cli" command-line interface -- a "multitool" interface resembling //! Cargo, as compared to the classic "rustc-like" CLI. -use std::{env, ffi::OsString, io::Write, path::PathBuf, process, str::FromStr}; +use std::{ + convert::Infallible, env, ffi::OsString, io::Write, path::PathBuf, process, str::FromStr, + sync::Arc, +}; use structopt::{clap::AppSettings, StructOpt}; use tectonic::{ self, @@ -19,9 +22,16 @@ use tectonic::{ use tectonic_bridge_core::{SecuritySettings, SecurityStance}; use tectonic_bundles::Bundle; use tectonic_docmodel::workspace::{Workspace, WorkspaceCreator}; -use tectonic_errors::Error as NewError; use tectonic_status_base::plain::PlainStatusBackend; -use watchexec::run::OnBusyUpdate; +use tokio::runtime; +use watchexec::{ + action::{Action, Outcome, PreSpawn}, + command::{Command, Shell}, + config::InitConfig, + Watchexec, +}; +use watchexec_filterer_globset::GlobsetFilterer; +use watchexec_signals::Signal; /// The main options for the "V2" command-line interface. 
#[derive(Debug, StructOpt)] @@ -157,9 +167,13 @@ enum Commands { Dump(DumpCommand), #[structopt(name = "new")] - /// Create a new document + /// Create a new document project New(NewCommand), + /// Initializes a new document in the current directory + #[structopt(name = "init")] + Init(InitCommand), + #[structopt(name = "show")] /// Display various useful pieces of information Show(ShowCommand), @@ -177,6 +191,7 @@ impl Commands { Commands::Compile(_) => {} // avoid namespacing/etc issues Commands::Dump(o) => o.customize(cc), Commands::New(o) => o.customize(cc), + Commands::Init(o) => o.customize(cc), Commands::Show(o) => o.customize(cc), Commands::Watch(o) => o.customize(cc), } @@ -189,6 +204,7 @@ impl Commands { Commands::Compile(o) => o.execute(config, status), Commands::Dump(o) => o.execute(config, status), Commands::New(o) => o.execute(config, status), + Commands::Init(o) => o.execute(config, status), Commands::Show(o) => o.execute(config, status), Commands::Watch(o) => o.execute(config, status), } @@ -398,7 +414,7 @@ impl BundleSearchCommand { for filename in &files { if filter(filename) { - println!("{}", filename); + println!("{filename}"); } } @@ -515,46 +531,100 @@ pub struct WatchCommand { impl WatchCommand { fn customize(&self, _cc: &mut CommandCustomizations) {} - fn execute(self, _config: PersistentConfig, status: &mut dyn StatusBackend) -> Result<i32> { + async fn execute_inner(self, status: &mut dyn StatusBackend) -> Result<i32> { let exe_name = crate::watch::get_trimmed_exe_name() .into_os_string() .into_string() .expect("Executable path wasn't valid UTF-8"); let mut cmds = Vec::new(); for x in self.execute.iter() { - let mut cmd = format!("{} -X ", exe_name); let x = x.trim(); if !x.is_empty() { - cmd.push_str(x); + let cmd = Command::Exec { + prog: exe_name.clone(), + args: vec!["-X".to_string(), x.to_string()], + }; cmds.push(cmd) } } if cmds.is_empty() { - cmds.push(format!("{} -X build", exe_name)) + cmds.push(Command::Exec { + prog: exe_name, + args: 
vec!["-X".to_string(), "build".to_string()], + }); } - let command = cmds.join(" && "); - - let mut args = watchexec::config::ConfigBuilder::default(); - let mut final_command = command.clone(); - #[cfg(unix)] - final_command.push_str("; echo [Finished running. Exit status: $?]"); #[cfg(windows)] - { - final_command.push_str(" & echo [Finished running. Exit status: %ERRORLEVEL%]"); - args.shell(watchexec::Shell::Cmd); - } - #[cfg(not(any(unix, windows)))] - final_command.push_str(" ; echo [Finished running]"); + let (shell, command) = ( + Shell::Cmd, + "echo [Finished running. Exit status: %ERRORLEVEL%]", + ); + #[cfg(unix)] + let (shell, command) = ( + Shell::Unix("bash".to_string()), + "echo [Finished running. Exit status: $?]", + ); + + cmds.push(Command::Shell { + shell, + args: vec![], + command: command.to_string(), + }); + + let mut runtime_config = watchexec::config::RuntimeConfig::default(); + runtime_config.commands(cmds); + + let current_dir = env::current_dir()?; + + let filter = GlobsetFilterer::new( + &current_dir, + [], + // Ignore build directory, and things like vim swap files + [("build/**".to_string(), None), ("*.swp".to_string(), None)], + [], + [], + ) + .await + .unwrap(); + + runtime_config + .pathset([&current_dir]) + .filterer(Arc::new(filter)) + .on_pre_spawn(|pre_spawn: PreSpawn| async move { + println!("[Running `{}`]", pre_spawn.command); + Ok::<_, Infallible>(()) + }) + .on_action(|action: Action| async move { + for event in &*action.events { + let is_kill = event.signals().any(|signal| { + matches!( + signal, + Signal::Interrupt + | Signal::Quit + | Signal::Terminate + | Signal::ForceStop + ) + }); + if is_kill { + action.outcome(Outcome::Exit); + return Ok::<_, Infallible>(()); + } + + let paths = event.paths().collect::<Vec<_>>(); + if !paths.is_empty() { + action.outcome(Outcome::IfRunning( + Box::new(Outcome::DoNothing), + Box::new(Outcome::Start), + )); + return Ok(()); + } + } + Ok(()) + }); - args.cmd(vec![final_command]) - 
.paths(vec![env::current_dir()?]) - .ignores(vec!["build".to_owned()]) - .on_busy_update(OnBusyUpdate::Queue); - let args = args.build().map_err(NewError::from)?; + let exec_handler = Watchexec::new(InitConfig::default(), runtime_config); - let exec_handler = watchexec::run::ExecHandler::new(args); match exec_handler { Err(e) => { tt_error!( @@ -565,22 +635,22 @@ impl WatchCommand { Ok(1) } Ok(exec_handler) => { - let handler = crate::watch::Watcher { - command, - inner: exec_handler, - }; - if let Err(e) = watchexec::watch(&handler) { - tt_error!(status, "failed to execute watch"; e.into()); - Ok(1) - } else { - Ok(0) - } + exec_handler.main().await.unwrap().unwrap(); + Ok(0) } } } + + fn execute(self, _config: PersistentConfig, status: &mut dyn StatusBackend) -> Result<i32> { + let rt = runtime::Builder::new_multi_thread() + .enable_all() + .build() + .unwrap(); + rt.block_on(self.execute_inner(status)) + } } -/// `new`: Create a new document +/// `new`: Create a new document project #[derive(Debug, Eq, PartialEq, StructOpt)] pub struct NewCommand { /// The name of the document directory to create. @@ -607,6 +677,30 @@ impl NewCommand { } } +/// `init`: Initialize a document project in the current directory. +#[derive(Debug, Eq, PartialEq, StructOpt)] +pub struct InitCommand {} + +impl InitCommand { + fn customize(&self, _cc: &mut CommandCustomizations) {} + + fn execute(self, config: PersistentConfig, status: &mut dyn StatusBackend) -> Result<i32> { + let path = env::current_dir()?; + tt_note!( + status, + "creating new document in this directory ({})", + path.display() + ); + + let wc = WorkspaceCreator::new(path); + ctry!( + wc.create_defaulted(&config, status); + "failed to create the new Tectonic workspace" + ); + Ok(0) + } +} + /// `show`: Show various useful pieces of information. 
#[derive(Debug, Eq, PartialEq, StructOpt)] pub struct ShowCommand { diff --git a/src/bin/tectonic/watch.rs b/src/bin/tectonic/watch.rs index 107e576b35..6c34857f78 100644 --- a/src/bin/tectonic/watch.rs +++ b/src/bin/tectonic/watch.rs @@ -1,33 +1,6 @@ use std::env; use std::path::PathBuf; -pub(crate) struct Watcher { - pub(crate) command: String, - pub(crate) inner: watchexec::run::ExecHandler, -} - -impl watchexec::Handler for Watcher { - fn args(&self) -> watchexec::config::Config { - self.inner.args() - } - - fn on_manual(&self) -> watchexec::error::Result<bool> { - self.start(); - self.inner.on_manual() - } - - fn on_update(&self, ops: &[watchexec::pathop::PathOp]) -> watchexec::error::Result<bool> { - self.start(); - self.inner.on_update(ops) - } -} - -impl Watcher { - fn start(&self) { - println!("[Running `{}`]", self.command) - } -} - /// Obtain the executable name without a prefix if the executable is available in the PATH, e.g. /// most cases. Otherwise, use the full path e.g. in development. pub(crate) fn get_trimmed_exe_name() -> PathBuf { diff --git a/src/config.rs b/src/config.rs index ecd5e04f43..c82acbbf89 100644 --- a/src/config.rs +++ b/src/config.rs @@ -80,9 +80,9 @@ impl PersistentConfig { let config = match File::open(&cfg_path) { Ok(mut f) => { - let mut buf = Vec::<u8>::new(); - f.read_to_end(&mut buf)?; - toml::from_slice(&buf)? + let mut buf = String::new(); + f.read_to_string(&mut buf)?; + toml::from_str(&buf)? 
} Err(e) => { if e.kind() == IoErrorKind::NotFound { @@ -157,7 +157,8 @@ impl PersistentConfig { use std::io; if CONFIG_TEST_MODE_ACTIVATED.load(Ordering::SeqCst) { - return Ok(Box::new(crate::test_util::TestBundle::default())); + let bundle = crate::test_util::TestBundle::default(); + return Ok(Box::new(bundle)); } if self.default_bundles.len() != 1 { diff --git a/src/docmodel.rs b/src/docmodel.rs index e3d232ee8b..eedaa483cd 100644 --- a/src/docmodel.rs +++ b/src/docmodel.rs @@ -101,7 +101,8 @@ impl DocumentExt for Document { } if config::is_config_test_mode_activated() { - Ok(Box::new(test_util::TestBundle::default())) + let bundle = test_util::TestBundle::default(); + Ok(Box::new(bundle)) } else if let Ok(url) = Url::parse(&self.bundle_loc) { if url.scheme() != "file" { let mut cache = Cache::get_user_default()?; @@ -130,8 +131,7 @@ impl DocumentExt for Document { ) -> Result { let profile = self.outputs.get(output_profile).ok_or_else(|| { ErrorKind::Msg(format!( - "unrecognized output profile name \"{}\"", - output_profile + "unrecognized output profile name \"{output_profile}\"" )) })?; diff --git a/src/driver.rs b/src/driver.rs index c4968a2c4c..cc60b331f6 100644 --- a/src/driver.rs +++ b/src/driver.rs @@ -1,4 +1,4 @@ -// Copyright 2018-2021 the Tectonic Project +// Copyright 2018-2022 the Tectonic Project // Licensed under the MIT License. #![deny(missing_docs)] @@ -16,7 +16,7 @@ //! which contains tectonic's main CLI program. 
use byte_unit::Byte; -use quick_xml::{events::Event, Reader}; +use quick_xml::{events::Event, NsReader}; use std::{ collections::{HashMap, HashSet}, fs::File, @@ -30,6 +30,7 @@ use std::{ }; use tectonic_bridge_core::{CoreBridgeLauncher, DriverHooks, SecuritySettings, SystemRequestError}; use tectonic_bundles::Bundle; +use tectonic_engine_spx2html::AssetSpecification; use tectonic_io_base::{ digest::DigestData, filesystem::{FilesystemIo, FilesystemPrimaryInputIo}, @@ -115,7 +116,7 @@ impl FileSummary { } /// The different types of output files that tectonic knows how to produce. -#[derive(Clone, Copy, Debug, Eq, PartialEq)] +#[derive(Clone, Copy, Debug, Default, Eq, PartialEq)] pub enum OutputFormat { /// A '.aux' file. Aux, @@ -124,6 +125,7 @@ pub enum OutputFormat { /// An extended DVI file. Xdv, /// A '.pdf' file. + #[default] Pdf, /// A '.fmt' file, for initializing the TeX engine. Format, @@ -144,17 +146,12 @@ impl FromStr for OutputFormat { } } -impl Default for OutputFormat { - fn default() -> OutputFormat { - OutputFormat::Pdf - } -} - /// The different types of "passes" that [`ProcessingSession`] knows how to run. See /// [`ProcessingSession::run`] for more details. -#[derive(Clone, Copy, Debug, Eq, PartialEq)] +#[derive(Clone, Copy, Debug, Default, Eq, PartialEq)] pub enum PassSetting { /// The default pass, which repeatedly runs TeX and BibTeX until it doesn't need to any more. + #[default] Default, /// Just run the TeX engine once. Tex, @@ -162,12 +159,6 @@ pub enum PassSetting { BibtexFirst, } -impl Default for PassSetting { - fn default() -> PassSetting { - PassSetting::Default - } -} - impl FromStr for PassSetting { type Err = &'static str; @@ -182,9 +173,10 @@ impl FromStr for PassSetting { } /// Different places from which the "primary input" might originate. -#[derive(Clone, Debug, Eq, PartialEq)] +#[derive(Clone, Debug, Default, Eq, PartialEq)] enum PrimaryInputMode { /// This process's standard input. 
+ #[default] Stdin, /// A path on the filesystem. @@ -194,18 +186,13 @@ enum PrimaryInputMode { Buffer(Vec<u8>), } -impl Default for PrimaryInputMode { - fn default() -> PrimaryInputMode { - PrimaryInputMode::Stdin - } -} - /// Different places where the output files might land. -#[derive(Clone, Debug, Eq, PartialEq)] +#[derive(Clone, Debug, Default, Eq, PartialEq)] enum OutputDestination { /// The "sensible" default. Files will land in the same directory as the /// input file, or the current working directory if the input is something /// without a path (such as standard input). + #[default] Default, /// Files should land in this particular directory. @@ -216,12 +203,6 @@ enum OutputDestination { Nowhere, } -impl Default for OutputDestination { - fn default() -> OutputDestination { - OutputDestination::Default - } -} - /// The subset of the driver state that is captured when running a C/C++ engine. /// /// The main purpose of this type is to implement the [`DriverHooks`] trait, @@ -280,9 +261,8 @@ impl BridgeState { /// mode”, in which the “primary input” is fixed, based on the requested /// format file name, and filesystem I/O is bypassed. fn enter_format_mode(&mut self, format_file_name: &str) { - self.format_primary = Some(BufferedPrimaryIo::from_text(&format!( - "\\input {}", - format_file_name + self.format_primary = Some(BufferedPrimaryIo::from_text(format!( + "\\input {format_file_name}" ))); } @@ -359,7 +339,7 @@ impl BridgeState { if tool_parent != tempdir.path() { ctry!( - std::fs::create_dir_all(&tool_parent); + std::fs::create_dir_all(tool_parent); "failed to create sub directory `{}`", tool_parent.display() ); } @@ -431,7 +411,7 @@ impl BridgeState { // Mark the input files as having been read, and we're done. 
for name in &read_files { - let mut summ = self.events.get_mut(name).unwrap(); + let summ = self.events.get_mut(name).unwrap(); summ.access_pattern = match summ.access_pattern { AccessPattern::Written => AccessPattern::WrittenThenRead, c => c, // identity mapping makes sense for remaining options @@ -740,7 +720,7 @@ impl DriverHooks for BridgeState { match Command::new(SHELL[0]) .args(&SHELL[1..]) - .arg(&command) + .arg(command) .current_dir(work.root()) .status() { @@ -778,11 +758,12 @@ impl DriverHooks for BridgeState { } /// Possible modes for handling shell-escape functionality -#[derive(Clone, Debug, Eq, PartialEq)] +#[derive(Clone, Debug, Default, Eq, PartialEq)] enum ShellEscapeMode { /// "Default" mode: shell-escape is disabled, unless it's been turned on in /// the unstable options, in which case it will be allowed through a /// temporary directory. + #[default] Defaulted, /// Shell-escape is disabled, overriding any unstable-option setting. @@ -798,12 +779,6 @@ enum ShellEscapeMode { ExternallyManagedDir(PathBuf), } -impl Default for ShellEscapeMode { - fn default() -> Self { - ShellEscapeMode::Defaulted - } -} - /// A custom extra pass that invokes an external tool. /// /// This is bad for reproducibility but comes in handy. @@ -842,6 +817,10 @@ pub struct ProcessingSessionBuilder { build_date: Option<SystemTime>, unstables: UnstableOptions, shell_escape_mode: ShellEscapeMode, + html_assets_spec_path: Option<String>, + html_precomputed_assets: Option<AssetSpecification>, + html_do_not_emit_files: bool, + html_do_not_emit_assets: bool, } impl ProcessingSessionBuilder { @@ -1045,6 +1024,67 @@ impl ProcessingSessionBuilder { self } + /// When using HTML mode, emit an asset specification file instead of actual + /// asset files. + /// + /// "Assets" are files like fonts and images that accompany the HTML output + /// generated during processing. By default, these are emitted during + /// processing. If this method is called, the assets will *not* be created. 
+ /// Instead, an "asset specification" file will be emitted to the given + /// output path. This specification file contains the information needed to + /// generate the assets upon a later invocation. Asset specification files + /// can be merged, allowing the results of multiple separate TeX + /// compilations to be synthesized into one HTML output tree. + /// + /// If the build does not use HTML mode, this setting has no effect. + pub fn html_assets_spec_path<S: ToString>(&mut self, path: S) -> &mut Self { + self.html_assets_spec_path = Some(path.to_string()); + self + } + + /// In HTML mode, use a precomputed asset specification. + /// + /// "Assets" are files like fonts and images that accompany the HTML output + /// generated during processing. By default, the engine gathers these during + /// processing and emits them at the end. After this method is used, + /// however, it will generate HTML outputs assuming the information given in + /// the asset specification given here. If the input calls for new assets or + /// different options inconsistent with the specification, processing will + /// abort with an error. + /// + /// The purpose of this mode is to allow for a unified set of assets to be + /// created from multiple independent runs of the SPX-to-HTML stage. First, + /// the different inputs should be processed independently, and their + /// individual assets should be saved. These should then be merged. Then the + /// inputs should be reprocessed, all using the merged asset specification. + /// In one — but only one — of these sessions, the assets should actually be + /// emitted. + pub fn html_precomputed_assets(&mut self, assets: AssetSpecification) -> &mut Self { + self.html_precomputed_assets = Some(assets); + self + } + + /// Set whether templated outputs should be created during HTML processing. + /// + /// This mode can be useful if you want to analyze what *would* be created + /// during HTML processing without actually creating the files. 
+ pub fn html_emit_files(&mut self, do_emit: bool) -> &mut Self { + self.html_do_not_emit_files = !do_emit; + self + } + + /// Set whether supporting asset files should be created during HTML + /// processing. + /// + /// This mode can be useful if you want to analyze what *would* be created + /// during HTML processing without actually creating the files. If you call + /// [`Self::html_assets_spec_path`], this setting will be ignored, and no + /// assets will be emitted to disk. + pub fn html_emit_assets(&mut self, do_emit: bool) -> &mut Self { + self.html_do_not_emit_assets = !do_emit; + self + } + /// Creates a `ProcessingSession`. pub fn create(self, status: &mut dyn StatusBackend) -> Result<ProcessingSession> { // First, work on the "bridge state", which gathers the subset of our @@ -1199,6 +1239,10 @@ impl ProcessingSessionBuilder { build_date: self.build_date.unwrap_or(SystemTime::UNIX_EPOCH), unstables: self.unstables, shell_escape_mode, + html_assets_spec_path: self.html_assets_spec_path, + html_precomputed_assets: self.html_precomputed_assets, + html_emit_files: !self.html_do_not_emit_files, + html_emit_assets: !self.html_do_not_emit_assets, }) } } @@ -1265,6 +1309,11 @@ pub struct ProcessingSession { /// How to handle shell-escape. The `Defaulted` option will never /// be used here. shell_escape_mode: ShellEscapeMode, + + html_assets_spec_path: Option<String>, + html_precomputed_assets: Option<AssetSpecification>, + html_emit_files: bool, + html_emit_assets: bool, } const DEFAULT_MAX_TEX_PASSES: usize = 6; @@ -1284,7 +1333,7 @@ impl ProcessingSession { for (name, info) in &self.bs.events { if info.access_pattern == AccessPattern::ReadThenWritten { let file_changed = match (&info.read_digest, &info.write_digest) { - (&Some(ref d1), &Some(ref d2)) => d1 != d2, + (Some(d1), Some(d2)) => d1 != d2, (&None, &Some(_)) => true, (_, _) => { // Other cases shouldn't happen.
@@ -1455,7 +1504,7 @@ impl ProcessingSession { if n_skipped_intermediates > 0 { status.note_highlighted( "Skipped writing ", - &format!("{}", n_skipped_intermediates), + &format!("{n_skipped_intermediates}"), " intermediate files (use --keep-intermediates to keep them)", ); } @@ -1560,7 +1609,7 @@ impl ProcessingSession { if file.data.is_empty() { status.note_highlighted( "Not writing ", - &format!("`{}`", sname), + &format!("`{sname}`"), ": it would be empty.", ); continue; @@ -1634,7 +1683,7 @@ impl ProcessingSession { match rerun_result { Some(RerunReason::Biber) => "biber was run".to_owned(), Some(RerunReason::Bibtex) => "bibtex was run".to_owned(), - Some(RerunReason::FileChange(ref s)) => format!("\"{}\" changed", s), + Some(RerunReason::FileChange(ref s)) => format!("\"{s}\" changed"), None => break, } }; @@ -1714,7 +1763,7 @@ impl ProcessingSession { let result = { self.bs - .enter_format_mode(&format!("tectonic-format-{}.tex", stem)); + .enter_format_mode(&format!("tectonic-format-{stem}.tex")); let mut launcher = CoreBridgeLauncher::new_with_security(&mut self.bs, status, self.security.clone()); let r = TexEngine::default() @@ -1773,7 +1822,7 @@ impl ProcessingSession { ) -> Result> { let result = { if let Some(s) = rerun_explanation { - status.note_highlighted("Rerunning ", "TeX", &format!(" because {} ...", s)); + status.note_highlighted("Rerunning ", "TeX", &format!(" because {s} ...")); } else { status.note_highlighted("Running ", "TeX", " ..."); } @@ -1825,7 +1874,7 @@ impl ProcessingSession { aux_file: &String, ) -> Result { let result = { - status.note_highlighted("Running ", "BibTeX", &format!(" on {} ...", aux_file)); + status.note_highlighted("Running ", "BibTeX", &format!(" on {aux_file} ...")); let mut launcher = CoreBridgeLauncher::new_with_security(&mut self.bs, status, self.security.clone()); let mut engine = BibtexEngine::new(); @@ -1894,15 +1943,27 @@ impl ProcessingSession { } fn spx2html_pass(&mut self, status: &mut dyn StatusBackend) 
-> Result { - let op = match self.output_path { - Some(ref p) => p, - None => return Err(errmsg!("HTML output must be saved directly to disk")), - }; - { let mut engine = Spx2HtmlEngine::default(); + + match (self.html_emit_files, self.output_path.as_ref()) { + (true, Some(p)) => engine.output_base(p), + (false, _) => engine.do_not_emit_files(), + (true, None) => return Err(errmsg!("HTML output must be saved directly to disk")), + }; + + if let Some(p) = self.html_assets_spec_path.as_ref() { + engine.assets_spec_path(p); + } else if !self.html_emit_assets { + engine.do_not_emit_assets(); + } + + if let Some(a) = self.html_precomputed_assets.as_ref() { + engine.precomputed_assets(a.clone()); + } + status.note_highlighted("Running ", "spx2html", " ..."); - engine.process_to_filesystem(&mut self.bs, status, &self.tex_xdv_path, op)?; + engine.process_to_filesystem(&mut self.bs, status, &self.tex_xdv_path)?; } self.bs.mem.files.borrow_mut().remove(&self.tex_xdv_path); @@ -1992,13 +2053,13 @@ impl ProcessingSession { } let curs = Cursor::new(&run_xml_entry.data[..]); - let mut reader = Reader::from_reader(curs); + let mut reader = NsReader::from_reader(curs); let mut buf = Vec::new(); let mut state = State::Searching; loop { let event = ctry!( - reader.read_event(&mut buf); + reader.read_event_into(&mut buf); "error parsing run.xml file" ); @@ -2008,7 +2069,7 @@ impl ProcessingSession { match (state, event) { (State::Searching, Event::Start(ref e)) => { - let name = reader.decode(e.local_name())?; + let name = reader.decoder().decode(e.local_name().into_inner())?; if name == "binary" { state = State::InBinaryName; @@ -2016,7 +2077,7 @@ impl ProcessingSession { } (State::InBinaryName, Event::Text(ref e)) => { - let text = e.unescape_and_decode(&reader)?; + let text = e.unescape()?; state = if &text == "biber" { State::InBiberCmdline @@ -2030,18 +2091,18 @@ impl ProcessingSession { } (State::InBiberCmdline, Event::Start(ref e)) => { - let name = 
reader.decode(e.local_name())?; + let name = reader.decoder().decode(e.local_name().into_inner())?; // Note that the "infile" might be `foo` without the `.bcf` // extension, so we can't use it for file-finding. - state = match name { + state = match &*name { "infile" | "outfile" | "option" => State::InBiberArgument, _ => State::InBiberRemainder, } } (State::InBiberCmdline, Event::End(ref e)) => { - let name = reader.decode(e.local_name())?; + let name = reader.decoder().decode(e.local_name().into_inner())?; if name == "cmdline" { state = State::InBiberRemainder; @@ -2049,21 +2110,21 @@ impl ProcessingSession { } (State::InBiberArgument, Event::Text(ref e)) => { - argv.push(e.unescape_and_decode(&reader)?); + argv.push(e.unescape()?.to_string()); state = State::InBiberCmdline; } (State::InBiberRemainder, Event::Start(ref e)) => { - let name = reader.decode(e.local_name())?; + let name = reader.decoder().decode(e.local_name().into_inner())?; - state = match name { + state = match &*name { "input" | "requires" => State::InBiberRequirementSection, _ => State::InBiberRemainder, } } (State::InBiberRemainder, Event::End(ref e)) => { - let name = reader.decode(e.local_name())?; + let name = reader.decoder().decode(e.local_name().into_inner())?; if name == "external" { break; @@ -2071,16 +2132,16 @@ impl ProcessingSession { } (State::InBiberRequirementSection, Event::Start(ref e)) => { - let name = reader.decode(e.local_name())?; + let name = reader.decoder().decode(e.local_name().into_inner())?; - state = match name { + state = match &*name { "file" => State::InBiberFileRequirement, _ => State::InBiberRemainder, } } (State::InBiberRequirementSection, Event::End(ref e)) => { - let name = reader.decode(e.local_name())?; + let name = reader.decoder().decode(e.local_name().into_inner())?; if name == "input" || name == "requires" { state = State::InBiberRemainder; @@ -2088,7 +2149,7 @@ impl ProcessingSession { } (State::InBiberFileRequirement, Event::Text(ref e)) => { - 
extra_requires.insert(e.unescape_and_decode(&reader)?); + extra_requires.insert(e.unescape()?.to_string()); state = State::InBiberRequirementSection; } diff --git a/src/errors.rs b/src/errors.rs index cd4ee2ca50..47de7035da 100644 --- a/src/errors.rs +++ b/src/errors.rs @@ -145,7 +145,7 @@ macro_rules! ctry { impl convert::From<Error> for io::Error { fn from(err: Error) -> io::Error { - io::Error::new(io::ErrorKind::Other, format!("{}", err)) + io::Error::new(io::ErrorKind::Other, format!("{err}")) } } @@ -165,13 +165,13 @@ impl Error { let mut s = io::stderr(); for item in self.iter() { - writeln!(s, "{} {}", prefix, item).expect("write to stderr failed"); + writeln!(s, "{prefix} {item}").expect("write to stderr failed"); prefix = "caused by:"; } if let Some(backtrace) = self.backtrace() { writeln!(s, "debugging: backtrace follows:").expect("write to stderr failed"); - writeln!(s, "{:?}", backtrace).expect("write to stderr failed"); + writeln!(s, "{backtrace:?}").expect("write to stderr failed"); } } } diff --git a/src/io/format_cache.rs b/src/io/format_cache.rs index 935298dcdc..dbb4e182fa 100644 --- a/src/io/format_cache.rs +++ b/src/io/format_cache.rs @@ -74,7 +74,7 @@ impl IoProvider for FormatCache { Err(e) => return OpenResult::Err(e), }; - let f = match super::try_open_file(&path) { + let f = match super::try_open_file(path) { OpenResult::Ok(f) => f, OpenResult::NotAvailable => return OpenResult::NotAvailable, OpenResult::Err(e) => return OpenResult::Err(e), @@ -99,7 +99,7 @@ impl IoProvider for FormatCache { .rand_bytes(6) .tempfile_in(&self.formats_base)?; temp_dest.write_all(data)?; - temp_dest.persist(&final_path)?; + temp_dest.persist(final_path)?; Ok(()) } } diff --git a/src/io/memory.rs b/src/io/memory.rs index c52f4e2d49..b8a65b6fa2 100644 --- a/src/io/memory.rs +++ b/src/io/memory.rs @@ -183,7 +183,7 @@ impl IoProvider for MemoryIo { let name = normalize_tex_path(name); - let oh = OutputHandle::new(name.to_owned(), MemoryIoItem::new(&self.files, &name,
true)); + let oh = OutputHandle::new(name.clone(), MemoryIoItem::new(&self.files, &name, true)); // `hyperxmp.sty` does a thing where it tries to get today's date by // calling \filemoddate on `\jobname.log`. That essentially relies on it @@ -221,7 +221,7 @@ impl IoProvider for MemoryIo { if self.files.borrow().contains_key(&*name) { OpenResult::Ok(InputHandle::new( - name.to_owned(), + name.clone(), MemoryIoItem::new(&self.files, &name, false), InputOrigin::Other, )) diff --git a/src/status/termcolor.rs b/src/status/termcolor.rs index f2151be78d..e54173d056 100644 --- a/src/status/termcolor.rs +++ b/src/status/termcolor.rs @@ -114,10 +114,10 @@ impl TermcolorStatusBackend { }; self.styled(kind, |s| { - write!(s, "{}", text).expect("failed to write to standard stream"); + write!(s, "{text}").expect("failed to write to standard stream"); }); self.with_stream(kind, |s| { - writeln!(s, " {}", args).expect("failed to write to standard stream"); + writeln!(s, " {args}").expect("failed to write to standard stream"); }); } @@ -128,16 +128,16 @@ impl TermcolorStatusBackend { pub fn note_styled(&mut self, args: Arguments) { if self.chatter > ChatterLevel::Minimal { if self.always_stderr { - writeln!(self.stderr, "{}", args).expect("write to stderr failed"); + writeln!(self.stderr, "{args}").expect("write to stderr failed"); } else { - writeln!(self.stdout, "{}", args).expect("write to stdout failed"); + writeln!(self.stdout, "{args}").expect("write to stdout failed"); } } } pub fn error_styled(&mut self, args: Arguments) { self.styled(MessageKind::Error, |s| { - writeln!(s, "{}", args).expect("write to stderr failed"); + writeln!(s, "{args}").expect("write to stderr failed"); }); } @@ -145,7 +145,7 @@ impl TermcolorStatusBackend { let mut prefix = "error:"; for item in err.chain() { - self.generic_message(MessageKind::Error, Some(prefix), format_args!("{}", item)); + self.generic_message(MessageKind::Error, Some(prefix), format_args!("{item}")); prefix = "caused by:"; } } 
@@ -168,7 +168,7 @@ impl StatusBackend for TermcolorStatusBackend { if let Some(e) = err { for item in e.chain() { - self.generic_message(kind, Some("caused by:"), format_args!("{}", item)); + self.generic_message(kind, Some("caused by:"), format_args!("{item}")); } } } @@ -179,10 +179,10 @@ impl StatusBackend for TermcolorStatusBackend { for item in err.chain() { if first { - self.generic_message(kind, None, format_args!("{}", item)); + self.generic_message(kind, None, format_args!("{item}")); first = false; } else { - self.generic_message(kind, Some("caused by:"), format_args!("{}", item)); + self.generic_message(kind, Some("caused by:"), format_args!("{item}")); } } } @@ -195,13 +195,13 @@ impl StatusBackend for TermcolorStatusBackend { &mut self.stdout }; - write!(stream, "{}", before).expect("write failed"); + write!(stream, "{before}").expect("write failed"); stream .set_color(&self.highlight_spec) .expect("write failed"); - write!(stream, "{}", highlighted).expect("write failed"); + write!(stream, "{highlighted}").expect("write failed"); stream.reset().expect("write failed"); - writeln!(stream, "{}", after).expect("write failed"); + writeln!(stream, "{after}").expect("write failed"); } } diff --git a/src/test_util.rs b/src/test_util.rs index 1a37b116e8..c345d77279 100644 --- a/src/test_util.rs +++ b/src/test_util.rs @@ -110,7 +110,7 @@ pub struct TestBundle(DirBundle); impl Default for TestBundle { fn default() -> Self { - TestBundle(DirBundle::new(&test_path(&["assets"]))) + TestBundle(DirBundle::new(test_path(&["assets"]))) } } diff --git a/src/unstable_opts.rs b/src/unstable_opts.rs index 12d32d2406..c6cc42b341 100644 --- a/src/unstable_opts.rs +++ b/src/unstable_opts.rs @@ -68,8 +68,7 @@ impl FromStr for UnstableArg { let require_no_value = |unwanted_value: Option<&str>, builtin_value: UnstableArg| { if let Some(value) = unwanted_value { Err(format!( - "'-Z {}={}', was supplied but '-Z {}' does not take a value.", - arg, value, arg + "'-Z {arg}={value}', 
was supplied but '-Z {arg}' does not take a value." ) .into()) } else { @@ -84,7 +83,7 @@ impl FromStr for UnstableArg { "min-crossrefs" => require_value("num") .and_then(|s| { - FromStr::from_str(s).map_err(|e| format!("-Z min-crossrefs: {}", e).into()) + FromStr::from_str(s).map_err(|e| format!("-Z min-crossrefs: {e}").into()) }) .map(UnstableArg::MinCrossrefs), @@ -98,7 +97,7 @@ impl FromStr for UnstableArg { require_value("path").map(|s| UnstableArg::ShellEscapeCwd(s.to_string())) } - _ => Err(format!("Unknown unstable option '{}'", arg).into()), + _ => Err(format!("Unknown unstable option '{arg}'").into()), } } } @@ -141,6 +140,6 @@ impl UnstableOptions { } pub fn print_unstable_help_and_exit() { - print!("{}", HELPMSG); + print!("{HELPMSG}"); std::process::exit(0); } diff --git a/tests/bibtex.rs b/tests/bibtex.rs index ec865844d2..0ce4f6e6cc 100644 --- a/tests/bibtex.rs +++ b/tests/bibtex.rs @@ -3,6 +3,7 @@ use std::collections::HashSet; use std::default::Default; +use std::path::PathBuf; use tectonic::io::{FilesystemIo, IoProvider, IoStack, MemoryIo}; use tectonic::BibtexEngine; @@ -16,29 +17,43 @@ use crate::util::{test_path, ExpectedInfo}; struct TestCase { stem: String, + subdir: Option<String>, + test_bbl: bool, } impl TestCase { - fn new(stem: &str) -> Self { + fn new(stem: &str, subdir: Option<&str>) -> Self { TestCase { stem: stem.to_owned(), + subdir: subdir.map(String::from), + test_bbl: true, } } - fn go(&mut self) { - util::set_test_root(); + fn test_bbl(mut self, test: bool) -> Self { + self.test_bbl = test; + self + } + fn test_dir(&self) -> PathBuf { let mut p = test_path(&["bibtex"]); + if let Some(subdir) = &self.subdir { + p.push(subdir); + } + p + } - p.push(&self.stem); + fn go(&mut self) { + util::set_test_root(); + + let mut p = self.test_dir(); - p.set_extension("aux"); - let auxname = p.file_name().unwrap().to_str().unwrap().to_owned(); + let auxname = format!("{}.aux", self.stem); // MemoryIo layer that will accept the outputs.
let mut mem = MemoryIo::new(true); - let mut assets = FilesystemIo::new(&test_path(&["bibtex"]), false, false, HashSet::new()); + let mut assets = FilesystemIo::new(&p, false, false, HashSet::new()); let mut genio = GenuineStdoutIo::new(); @@ -55,17 +70,40 @@ impl TestCase { // Check that outputs match expectations. - let expected_bbl = ExpectedInfo::read_with_extension(&mut p, "bbl"); - let expected_blg = ExpectedInfo::read_with_extension(&mut p, "blg"); + p.push(&self.stem); let files = mem.files.borrow(); - expected_bbl.test_from_collection(&files); + if self.test_bbl { + let expected_bbl = ExpectedInfo::read_with_extension(&mut p, "bbl"); + expected_bbl.test_from_collection(&files); + } + + let expected_blg = ExpectedInfo::read_with_extension(&mut p, "blg"); expected_blg.test_from_collection(&files); } } #[test] fn single_entry() { - TestCase::new("single_entry").go() + TestCase::new("single_entry", None).go() +} + +#[test] +fn test_empty_files() { + TestCase::new("empty", Some("empty")).test_bbl(false).go() +} + +#[test] +fn test_mismatched_function() { + TestCase::new("function", Some("mismatched_braces")) + .test_bbl(false) + .go(); +} + +#[test] +fn test_mismatched_expr() { + TestCase::new("expr", Some("mismatched_braces")) + .test_bbl(false) + .go(); } diff --git a/tests/bibtex/empty/empty.aux b/tests/bibtex/empty/empty.aux new file mode 100644 index 0000000000..e69de29bb2 diff --git a/tests/bibtex/empty/empty.bbl b/tests/bibtex/empty/empty.bbl new file mode 100644 index 0000000000..e69de29bb2 diff --git a/tests/bibtex/empty/empty.bib b/tests/bibtex/empty/empty.bib new file mode 100644 index 0000000000..e69de29bb2 diff --git a/tests/bibtex/empty/empty.blg b/tests/bibtex/empty/empty.blg new file mode 100644 index 0000000000..e3e26c8848 --- /dev/null +++ b/tests/bibtex/empty/empty.blg @@ -0,0 +1,7 @@ +This is BibTeX, Version 0.99d +Capacity: max_strings=35307, hash_size=35307, hash_prime=30011 +The top-level auxiliary file: empty.aux +I found no \citation 
commands---while reading file empty.aux +I found no \bibdata command---while reading file empty.aux +I found no \bibstyle command---while reading file empty.aux +(There were 3 error messages) diff --git a/tests/bibtex/empty/empty.bst b/tests/bibtex/empty/empty.bst new file mode 100644 index 0000000000..e69de29bb2 diff --git a/tests/bibtex/mismatched_braces/expr.aux b/tests/bibtex/mismatched_braces/expr.aux new file mode 100644 index 0000000000..06fd23a800 --- /dev/null +++ b/tests/bibtex/mismatched_braces/expr.aux @@ -0,0 +1,5 @@ +\relax +\citation{Mismatched01} +\bibdata{expr} +\bibcite{Mismatched01}{1} +\bibstyle{expr} diff --git a/tests/bibtex/mismatched_braces/expr.bbl b/tests/bibtex/mismatched_braces/expr.bbl new file mode 100644 index 0000000000..e69de29bb2 diff --git a/tests/bibtex/mismatched_braces/expr.bib b/tests/bibtex/mismatched_braces/expr.bib new file mode 100644 index 0000000000..e69de29bb2 diff --git a/tests/bibtex/mismatched_braces/expr.blg b/tests/bibtex/mismatched_braces/expr.blg new file mode 100644 index 0000000000..d5dd083528 --- /dev/null +++ b/tests/bibtex/mismatched_braces/expr.blg @@ -0,0 +1,9 @@ +This is BibTeX, Version 0.99d +Capacity: max_strings=35307, hash_size=35307, hash_prime=30011 +The top-level auxiliary file: expr.aux +The style file: expr.bst +"}" can't start a style-file command---line 5 of file expr.bst + : + : } +(Error may have been on previous line) +(There was 1 error message) diff --git a/tests/bibtex/mismatched_braces/expr.bst b/tests/bibtex/mismatched_braces/expr.bst new file mode 100644 index 0000000000..394d4d8fd8 --- /dev/null +++ b/tests/bibtex/mismatched_braces/expr.bst @@ -0,0 +1,5 @@ + +FUNCTION {missing_expr_brace} +{ + pop$ "" } +} \ No newline at end of file diff --git a/tests/bibtex/mismatched_braces/function.aux b/tests/bibtex/mismatched_braces/function.aux new file mode 100644 index 0000000000..dd6c5f396c --- /dev/null +++ b/tests/bibtex/mismatched_braces/function.aux @@ -0,0 +1,5 @@ +\relax 
+\citation{Mismatched01} +\bibdata{function} +\bibcite{Mismatched01}{1} +\bibstyle{function} diff --git a/tests/bibtex/mismatched_braces/function.bbl b/tests/bibtex/mismatched_braces/function.bbl new file mode 100644 index 0000000000..e69de29bb2 diff --git a/tests/bibtex/mismatched_braces/function.bib b/tests/bibtex/mismatched_braces/function.bib new file mode 100644 index 0000000000..e69de29bb2 diff --git a/tests/bibtex/mismatched_braces/function.blg b/tests/bibtex/mismatched_braces/function.blg new file mode 100644 index 0000000000..a6e4b5c2c1 --- /dev/null +++ b/tests/bibtex/mismatched_braces/function.blg @@ -0,0 +1,8 @@ +This is BibTeX, Version 0.99d +Capacity: max_strings=35307, hash_size=35307, hash_prime=30011 +The top-level auxiliary file: function.aux +The style file: function.bst +Illegal end of style file in command: function---line 2 of file function.bst + : function { missing_end_brace + : +(There was 1 error message) diff --git a/tests/bibtex/mismatched_braces/function.bst b/tests/bibtex/mismatched_braces/function.bst new file mode 100644 index 0000000000..070b0e44c4 --- /dev/null +++ b/tests/bibtex/mismatched_braces/function.bst @@ -0,0 +1,2 @@ + +FUNCTION { missing_end_brace diff --git a/tests/cached_itarbundle.rs b/tests/cached_itarbundle.rs index 37967da15b..16591720a4 100644 --- a/tests/cached_itarbundle.rs +++ b/tests/cached_itarbundle.rs @@ -1,24 +1,25 @@ use flate2::{write::GzEncoder, GzBuilder}; -use futures::future; use headers::HeaderMapExt; use hyper::header::{self, HeaderValue}; -use hyper::rt::Future; -use hyper::service::service_fn; -use hyper::{Body, Method, Request, Response, Server, StatusCode}; +use hyper::server::Server; +use hyper::service::{make_service_fn, service_fn}; +use hyper::{Body, Method, Request, Response, StatusCode}; use std::collections::HashMap; +use std::convert::Infallible; +use std::future::Future; use std::io::{self, Write}; use std::net::SocketAddr; use std::ops::Bound; use std::path::Path; +use std::pin::Pin; 
use std::sync::{Arc, Mutex}; -use std::thread; -use std::{env, fs}; +use std::{env, fs, thread}; use tectonic::config::PersistentConfig; use tectonic::driver::ProcessingSessionBuilder; use tectonic::io::OpenResult; use tectonic::status::termcolor::TermcolorStatusBackend; use tectonic::status::ChatterLevel; -use tokio::runtime::current_thread; +use tokio::runtime; mod util; @@ -45,7 +46,7 @@ impl TarIndexBuilder { fn push(&mut self, name: &str, content: &[u8]) -> &mut Self { let offset = self.tar.len(); let len = content.len(); - let _ = writeln!(&mut self.index, "{} {} {}", name, offset, len); + let _ = writeln!(&mut self.index, "{name} {offset} {len}"); self.map .insert((offset as u64, len as u64), name.to_owned()); self.tar.extend_from_slice(content); @@ -101,7 +102,7 @@ struct TarIndexService { local_addr: Mutex<Option<SocketAddr>>, } -type ResponseFuture = Box<dyn Future<Item = Response<Body>, Error = io::Error> + Send>; +type ResponseFuture = Pin<Box<dyn Future<Output = Response<Body>> + Send + Sync + 'static>>; impl TarIndexService { fn new(tar_index: TarIndex) -> TarIndexService { @@ -128,8 +129,7 @@ impl TarIndexService { ) { (&Method::HEAD, "/tectonic-default", None) => { self.log_request(TectonicRequest::Head(req.uri().path().to_owned())); - let mut resp = Response::builder(); - resp.status(StatusCode::FOUND); + let mut resp = Response::builder().status(StatusCode::FOUND); resp.headers_mut().unwrap().insert( header::LOCATION, HeaderValue::from_str(&format!( @@ -138,11 +138,11 @@ impl TarIndexService { )) .unwrap(), ); - Box::new(future::ok(resp.body(Body::empty()).unwrap())) + Box::pin(async move { resp.body(Body::empty()).unwrap() }) } (&Method::HEAD, "/bundle.tar", None) => { self.log_request(TectonicRequest::Head(req.uri().path().to_owned())); - Box::new(future::ok(Response::new(Body::empty()))) + Box::pin(async move { Response::new(Body::empty()) }) } (&Method::GET, "/bundle.tar", Some(range)) => { if let Some((Bound::Included(l), Bound::Included(h))) = range.iter().next() { @@ -152,31 +152,27 @@ impl TarIndexService { .get(&(l, h - l + 1))
.expect("unknown file data requested"); self.log_request(TectonicRequest::File(name.to_owned())); - let mut resp = Response::builder(); - resp.status(StatusCode::PARTIAL_CONTENT); + let mut resp = Response::builder().status(StatusCode::PARTIAL_CONTENT); resp.headers_mut() .unwrap() .typed_insert(headers::ContentRange::bytes(l..=h, None).unwrap()); - Box::new(future::ok( - resp.body((tar_index.tar[l as usize..=h as usize]).to_vec().into()) - .unwrap(), - )) + let body = (tar_index.tar[l as usize..=h as usize]).to_vec().into(); + Box::pin(async move { resp.body(body).unwrap() }) } else { panic!("unexpected"); } } (&Method::GET, "/bundle.tar.index.gz", None) => { self.log_request(TectonicRequest::Index); - Box::new(future::ok(Response::new( - self.tar_index.lock().unwrap().index.to_vec().into(), - ))) + let resp = self.tar_index.lock().unwrap().index.to_vec().into(); + Box::pin(async move { Response::new(resp) }) } - _ => Box::new(future::ok( + _ => Box::pin(async move { Response::builder() .status(StatusCode::NOT_FOUND) .body(Body::empty()) - .unwrap(), - )), + .unwrap() + }), } } @@ -207,34 +203,53 @@ where .join("assets"); TarIndex::from_dir(root).unwrap() }))); + + let (url_available_tx, url_available_rx) = std::sync::mpsc::channel(); + let (server_shutdown_tx, server_shutdown_rx) = futures::channel::oneshot::channel::<()>(); let tar_service_clone = Arc::clone(&tar_service); - let server = Server::bind(&addr).serve(move || { - let tar_service = Arc::clone(&tar_service_clone); - service_fn(move |req| tar_service.response(req)) - }); + let server_thread = thread::spawn(move || { + let tar_service = tar_service_clone; - // server is listening now - tar_service.set_local_addr(server.local_addr()); - let url = tar_service.url(); + let rt = runtime::Builder::new_current_thread() + .enable_io() + .build() + .unwrap(); + + let tar_service_clone = Arc::clone(&tar_service); + rt.block_on(async move { + let server = Server::bind(&addr).serve(make_service_fn(move |_| { + let 
tar_service_clone = Arc::clone(&tar_service_clone); + async move { + Ok::<_, Infallible>(service_fn(move |req| { + let tar_service = Arc::clone(&tar_service_clone); + async move { Ok::<_, Infallible>(tar_service.response(req).await) } + })) + } + })); - let (server_shutdown_tx, server_shutdown_rx) = futures::sync::oneshot::channel::<()>(); + // server is listening now + tar_service.set_local_addr(server.local_addr()); + let url = tar_service.url(); + url_available_tx.send(url).unwrap(); - let graceful = server.with_graceful_shutdown(server_shutdown_rx); + let graceful = server.with_graceful_shutdown(async move { + server_shutdown_rx.await.unwrap(); + }); - let server_thread = thread::spawn(|| { - // Run the server on a single thread (current thread) - current_thread::run(graceful.map_err(|_| ())); + graceful.await + }) }); // Server running, run the provided test + let url = url_available_rx.recv().unwrap(); run(Arc::clone(&tar_service), &url); println!("Shutting down"); // Shut down server let _ = server_shutdown_tx.send(()); - server_thread.join().unwrap(); + server_thread.join().unwrap().unwrap(); // Check tectonic's requests. let requests = tar_service.requests.lock().unwrap(); @@ -246,8 +261,7 @@ fn check_req_count(requests: &[TectonicRequest], request: TectonicRequest, expec let number = requests.iter().filter(|r| **r == request).count(); assert_eq!( number, expected_number, - "Expected {} requests of {:?}, got {}", - expected_number, request, number + "Expected {expected_number} requests of {request:?}, got {number}" ); } #[test] diff --git a/tests/executable.rs b/tests/executable.rs index 91a17e123b..07d38e2ce1 100644 --- a/tests/executable.rs +++ b/tests/executable.rs @@ -8,7 +8,8 @@ use std::{ io::{Read, Write}, path::{Path, PathBuf}, process::{Command, Output, Stdio}, - str, + str, thread, + time::{Duration, Instant}, }; use tempfile::TempDir; @@ -32,7 +33,7 @@ lazy_static! 
{ target.make_ascii_uppercase(); // run-time environment variable check: - if let Ok(runtext) = env::var(format!("CARGO_TARGET_{}_RUNNER", target)) { + if let Ok(runtext) = env::var(format!("CARGO_TARGET_{target}_RUNNER")) { runtext.split_whitespace().map(|x| x.to_owned()).collect() } else { vec![] @@ -71,8 +72,8 @@ fn prep_tectonic(cwd: &Path, args: &[&str]) -> Command { tectonic ) } - println!("using tectonic binary at {:?}", tectonic); - println!("using cwd {:?}", cwd); + println!("using tectonic binary at {tectonic:?}"); + println!("using cwd {cwd:?}"); // We may need to wrap the Tectonic invocation. If we're cross-compiling, we // might need to use something like QEMU to actually be able to run the @@ -114,19 +115,47 @@ fn prep_tectonic(cwd: &Path, args: &[&str]) -> Command { fn run_tectonic(cwd: &Path, args: &[&str]) -> Output { let mut command = prep_tectonic(cwd, args); command.env("BROWSER", "echo"); - println!("running {:?}", command); + println!("running {command:?}"); command.output().expect("tectonic failed to start") } +fn run_tectonic_until(cwd: &Path, args: &[&str], mut kill: impl FnMut() -> bool) -> Output { + // This harness doesn't work when running with kcov because there's no good + // way to stop the Tectonic child process that is "inside" of the kcov + // runner. If we kill kcov itself, the child process keeps running and we + // hang because our pipes never get fully closed. Right now I don't see a + // way to actually terminate the Tectonic subprocess short of guessing its + // PID, which is hackier than I want to implement. We could address this by + // providing some other mechanism to tell the "watch" subprocess to stop, + // such as closing its stdin. 
+ assert_eq!(KCOV_WORDS.len(), 0, "\"until\" tests do not work with kcov"); + + let mut command = prep_tectonic(cwd, args); + command.stdout(Stdio::piped()).stderr(Stdio::piped()); + command.env("BROWSER", "echo"); + + println!("running {command:?} until test passes"); + let mut child = command.spawn().expect("tectonic failed to start"); + while !kill() { + thread::sleep(Duration::from_secs(1)); + } + + // Ignore if the child already died + let _ = child.kill(); + child + .wait_with_output() + .expect("tectonic failed to execute") +} + fn run_tectonic_with_stdin(cwd: &Path, args: &[&str], stdin: &str) -> Output { let mut command = prep_tectonic(cwd, args); command .stdin(Stdio::piped()) .stdout(Stdio::piped()) .stderr(Stdio::piped()); - println!("running {:?}", command); + println!("running {command:?}"); let mut child = command.spawn().expect("tectonic failed to start"); - write!(child.stdin.as_mut().unwrap(), "{}", stdin) + write!(child.stdin.as_mut().unwrap(), "{stdin}") .expect("failed to send data to tectonic subprocess"); child .wait_with_output() @@ -298,10 +327,10 @@ fn run_with_biber(args: &str, stdin: &str) -> Output { .stdin(Stdio::piped()) .stdout(Stdio::piped()) .stderr(Stdio::piped()); - println!("running {:?}", command); + println!("running {command:?}"); let mut child = command.spawn().expect("tectonic failed to start"); - write!(child.stdin.as_mut().unwrap(), "{}", stdin) + write!(child.stdin.as_mut().unwrap(), "{stdin}") .expect("failed to send data to tectonic subprocess"); child @@ -375,16 +404,16 @@ fn biber_no_such_tool() { command.env("TECTONIC_TEST_FAKE_BIBER", "ohnothereisnobiberprogram"); const REST: &str = r#"\bye"#; - let tex = format!("{}{}", BIBER_TRIGGER_TEX, REST); + let tex = format!("{BIBER_TRIGGER_TEX}{REST}"); command .stdin(Stdio::piped()) .stdout(Stdio::piped()) .stderr(Stdio::piped()); - println!("running {:?}", command); + println!("running {command:?}"); let mut child = command.spawn().expect("tectonic failed to start"); - 
write!(child.stdin.as_mut().unwrap(), "{}", tex) + write!(child.stdin.as_mut().unwrap(), "{tex}") .expect("failed to send data to tectonic subprocess"); let output = child @@ -411,7 +440,7 @@ a \fi \fi \bye"#; - let tex = format!("{}{}", BIBER_TRIGGER_TEX, REST); + let tex = format!("{BIBER_TRIGGER_TEX}{REST}"); let output = run_with_biber("success", &tex); success_or_panic(&output); } @@ -738,6 +767,7 @@ fn v2_dump_suffix() { { let mut file = File::create(&temppath).unwrap(); + #[allow(clippy::write_literal)] writeln!( file, "{}", // <= works around {} fussiness in Rust format strings @@ -846,9 +876,9 @@ fn shell_escape_env_override() { .stderr(Stdio::piped()) .env("TECTONIC_UNTRUSTED_MODE", "0"); - println!("running {:?}", command); + println!("running {command:?}"); let mut child = command.spawn().expect("tectonic failed to start"); - write!(child.stdin.as_mut().unwrap(), "{}", SHELL_ESCAPE_TEST_DOC) + write!(child.stdin.as_mut().unwrap(), "{SHELL_ESCAPE_TEST_DOC}") .expect("failed to send data to tectonic subprocess"); let output = child @@ -910,3 +940,65 @@ fn bad_v2_position_build() { let output = run_tectonic(&temppath, &["build", "-X"]); error_or_panic(&output); } + +/// Ensures that watch command succeeds, and when a file is changed while running it rebuilds +/// periodically +#[cfg(all(feature = "serialization", not(target_arch = "mips")))] +#[test] +fn v2_watch_succeeds() { + if KCOV_WORDS.len() > 0 { + return; // See run_tectonic_until() for an explanation of why this test must be skipped + } + + let (_tempdir, temppath) = setup_v2(); + + // Timeout the test after 5 minutes - we should definitely run twice in that range + let max_time = Duration::from_secs(60 * 5); + let path = temppath.clone(); + + // Make sure `default.pdf` already exists - just makes the test easier to implement + let output = run_tectonic(&temppath, &["-X", "build"]); + success_or_panic(&output); + + let thread = thread::spawn(move || { + // Give the process time to start up. 
Tried a channel, doesn't really work, so we just do + // a best-effort 'sleep for long enough it should have started'. + thread::sleep(Duration::from_secs(5)); + + let input = path.join("src/index.tex"); + let output = path.join("build/default/default.pdf"); + let start = Instant::now(); + let mut start_mod = None; + let mut modified = 0; + while Instant::now() - start < max_time { + if modified >= 3 { + break; + } + + { + let mut file = File::create(&input).unwrap(); + writeln!(file, "New Text {}", modified).unwrap(); + } + + let new_mod = output.metadata().and_then(|meta| meta.modified()).unwrap(); + if start_mod.map_or(true, |start_mod| new_mod > start_mod) { + start_mod = Some(new_mod); + modified += 1; + } + + thread::sleep(Duration::from_secs(5)); + } + }); + + let output = run_tectonic_until(&temppath, &["-X", "watch"], || thread.is_finished()); + // TODO: Make timeout kill child in a way that terminates it gracefully, such as ctrl-c, not SIGKILL + // success_or_panic(&output); + let stdout = String::from_utf8_lossy(&output.stdout); + let stderr = String::from_utf8_lossy(&output.stderr); + println!("stdout:\n{}", stdout); + println!("stderr:\n{}", stderr); + + thread.join().unwrap(); + + assert!(stdout.matches("Running TeX").count() >= 2); +}
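One of the smaller cleanups in the driver hunk above replaces a hand-written `impl Default for ShellEscapeMode` with `#[derive(Default)]` plus the `#[default]` variant attribute, which was stabilized in Rust 1.62. A minimal standalone sketch of the pattern (the enum here is a trimmed-down stand-in, not the full Tectonic type):

```rust
// #[default] marks which unit variant the derived Default
// implementation returns (Rust 1.62+); no manual impl is needed.
#[allow(dead_code)]
#[derive(Clone, Debug, Default, Eq, PartialEq)]
enum ShellEscapeMode {
    /// Used unless the caller overrides the setting.
    #[default]
    Defaulted,
    /// Shell-escape explicitly disabled.
    Disabled,
}

fn main() {
    // The derive returns the #[default] variant.
    assert_eq!(ShellEscapeMode::default(), ShellEscapeMode::Defaulted);
    println!("{:?}", ShellEscapeMode::default()); // prints "Defaulted"
}
```

The attribute only works on unit variants, which is why it fits `Defaulted` here; variants carrying data (like `ExternallyManagedDir(PathBuf)` in the real enum) cannot be marked `#[default]`.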
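Many hunks throughout this diff are a mechanical migration from positional `format!("{}", x)` calls to the inline ("captured") format arguments stabilized in Rust 1.58, e.g. `writeln!(&mut self.index, "{name} {offset} {len}")` in the tar-index builder. The two forms are equivalent for plain identifiers; a small sketch (variable names illustrative only):

```rust
fn main() {
    let name = "bundle.tar";
    let (offset, len) = (512u64, 128u64);

    // Positional arguments, as in the pre-migration code:
    let old_style = format!("{} {} {}", name, offset, len);

    // Inline format arguments capture identifiers from the enclosing
    // scope (Rust 1.58+). Only bare identifiers can be captured;
    // expressions like `self.stem` still need the positional form,
    // which is why a few such call sites survive in the diff.
    let new_style = format!("{name} {offset} {len}");

    assert_eq!(old_style, new_style);
    println!("{new_style}"); // prints "bundle.tar 512 128"
}
```

Format specifiers compose with capture the same way, so `format!("{:?}", backtrace)` becomes `format!("{backtrace:?}")` as seen in `src/errors.rs`.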