
Support WDK Configuration in crates.io-distributed crate #82

Closed
wmmc88 opened this issue Jan 23, 2024 · 1 comment · Fixed by #186
Labels: enhancement (New feature or request)

wmmc88 commented Jan 23, 2024

Currently, the crate distributed via crates.io has several things hardcoded in its configuration. Examples include the WdfFunctions_01033 function-table reference and the WdfMinimumVersionDesired symbol. The wdk-build::Config used in wdk-sys also defaults to KMDF 1.33.

While the repo supports different WDF versions, different driver models (WDM, KMDF, UMDF), etc., it is currently cumbersome to use anything other than KMDF v1.33. Our goal should be to allow users to consume the crate from crates.io and configure all of the common WDK configuration options.

A possible way to do this would be to have mutually-exclusive cargo features for wdk-sys and wdk that configure which bindings the crate generates.
There are some caveats to using mutually exclusive features, but it may be the best option currently available. There is extensive discussion about better ways to accomplish this in the future here and here, but we must make do with what is available right now.
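As a rough illustration of the feature-based approach, a build script can detect which features are enabled (Cargo exposes each one as a `CARGO_FEATURE_<NAME>` environment variable) and reject invalid combinations. The feature names below (`kmdf`, `umdf`, `wdm`) are hypothetical placeholders, not an existing interface of wdk-sys:

```rust
// Hypothetical build.rs sketch: enforce that at most one driver-model
// feature is enabled. Cargo sets CARGO_FEATURE_<NAME> for each enabled
// feature, so a build script can inspect them at build time.

fn enabled_driver_models(features: &[(&str, bool)]) -> Vec<String> {
    // Collect the names of all features that are turned on.
    features
        .iter()
        .filter(|(_, enabled)| *enabled)
        .map(|(name, _)| name.to_string())
        .collect()
}

fn main() {
    // In a real build.rs these values come from Cargo's feature env vars.
    let features = [
        ("kmdf", std::env::var_os("CARGO_FEATURE_KMDF").is_some()),
        ("umdf", std::env::var_os("CARGO_FEATURE_UMDF").is_some()),
        ("wdm", std::env::var_os("CARGO_FEATURE_WDM").is_some()),
    ];
    let enabled = enabled_driver_models(&features);
    if enabled.len() > 1 {
        // Failing the build is the only enforcement available today;
        // Cargo itself has no notion of mutually exclusive features.
        panic!("mutually exclusive features enabled together: {:?}", enabled);
    }
}
```

Because Cargo unions features across a workspace, this check is also what would surface the one-combination-per-workspace limitation discussed below.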

Another possible approach is to use environment variables to control the various options. An environment variable can only have one value at a time, so the settings are inherently mutually exclusive. They could be set as normal environment variables in the shell, via the [env] section of a .cargo/config.toml, or in any of the numerous ways cargo-make provides. This does harm discoverability of the options, and there are outstanding issues with some ways of configuring environment variables in Cargo right now (e.g. rust-lang/cargo#10358, which in particular seems unlikely to be resolved for a while).
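A minimal sketch of the environment-variable approach, assuming a single hypothetical variable (the name `WDK_DRIVER_CONFIG` and its value format are invented for illustration). The `cargo:rerun-if-env-changed` directive is the real mechanism a build.rs would use to pick up changes:

```rust
// Hypothetical sketch: drive WDK configuration from one env var.
// Accepted values here ("WDM", "KMDF-<major>.<minor>") are assumptions.

#[derive(Debug, PartialEq)]
enum DriverConfig {
    Wdm,
    Kmdf { major: u32, minor: u32 },
}

fn parse_driver_config(value: &str) -> Option<DriverConfig> {
    match value {
        "WDM" => Some(DriverConfig::Wdm),
        v if v.starts_with("KMDF-") => {
            // Parse "KMDF-1.33" into major/minor components.
            let mut parts = v["KMDF-".len()..].splitn(2, '.');
            let major = parts.next()?.parse().ok()?;
            let minor = parts.next()?.parse().ok()?;
            Some(DriverConfig::Kmdf { major, minor })
        }
        _ => None,
    }
}

fn main() {
    // Ask cargo to rerun the build script whenever the variable changes.
    println!("cargo:rerun-if-env-changed=WDK_DRIVER_CONFIG");
    let config = std::env::var("WDK_DRIVER_CONFIG")
        .ok()
        .and_then(|v| parse_driver_config(&v))
        // Default mirrors the currently hardcoded KMDF 1.33.
        .unwrap_or(DriverConfig::Kmdf { major: 1, minor: 33 });
    println!("configuring bindings for {:?}", config);
}
```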

Some other thoughts:

  • How will this support multiple drivers in the same workspace?
    • If we go the path of mutually-exclusive features, workspace dependency resolution means that each cargo workspace can only contain one combination of the mutually-exclusive features. It's possible for multiple drivers in the same file tree to use different combinations by placing them in separate subfolders that are each their own cargo workspace and orchestrating the entire build via cargo-make.
    • The environment variable approach would have the same caveats as features.
  • Cargo currently has a 300-feature limit, introduced after an incident where the icondata crate (v0.1.0 and earlier) used a build script to generate over 23000 mutually exclusive features. I don't think we will get anywhere near 300 features though.
@wmmc88 wmmc88 self-assigned this Jan 23, 2024
wmmc88 commented Jan 23, 2024

Another possible approach is to piggyback off of the [package.metadata.wdk] table introduced in #35 and use it to specify all of the configuration information. wdk-sys could then parse that information using cargo_metadata in its build.rs. This would still have the same workspace issues as the other approaches, and would also need to emit the appropriate rerun-if-changed directives to cargo. Workspaces could actually define workspace-level metadata so that member crates inherit all of the configuration from a top-level virtual manifest, which would make things like upgrading a set of drivers to a newer WDF version very easy.
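To make the metadata approach concrete, here is a std-only sketch of pulling keys out of a [package.metadata.wdk] table. A real build.rs would almost certainly use the cargo_metadata crate (which returns the metadata as structured JSON) rather than hand-parsing, and the key names (driver-model, kmdf-version) are hypothetical:

```rust
// Std-only illustration of reading [package.metadata.wdk] from a manifest.
// This naive line scanner stands in for cargo_metadata; it only handles
// flat `key = "value"` pairs, which is enough to show the idea.

use std::collections::HashMap;

fn parse_wdk_metadata(cargo_toml: &str) -> HashMap<String, String> {
    let mut in_section = false;
    let mut out = HashMap::new();
    for line in cargo_toml.lines() {
        let line = line.trim();
        if line.starts_with('[') {
            // Track whether we are inside the wdk metadata table.
            in_section = line == "[package.metadata.wdk]";
            continue;
        }
        if in_section {
            if let Some((key, value)) = line.split_once('=') {
                out.insert(
                    key.trim().to_string(),
                    value.trim().trim_matches('"').to_string(),
                );
            }
        }
    }
    out
}

fn main() {
    let manifest = r#"
[package]
name = "sample-driver"

[package.metadata.wdk]
driver-model = "KMDF"
kmdf-version = "1.33"
"#;
    let wdk = parse_wdk_metadata(manifest);
    // A build.rs taking this approach would also need to emit:
    println!("cargo:rerun-if-changed=Cargo.toml");
    println!("driver-model = {:?}", wdk.get("driver-model"));
}
```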
