How should edward2 be installed? #2
Comments
Good catch. It looks like a mistake that this codebase imports from edward2.experimental. We purposefully don't include that code in the library. I'll drop that dependence.
copybara-service bot pushed a commit to google/uncertainty-baselines that referenced this issue on Feb 3, 2021: "This path isn't part of the public Edward2 library. Fixes google-research/robustness_metrics#2." (PiperOrigin-RevId: 355447858)
copybara-service bot pushed a commit to google/uncertainty-baselines that referenced this issue on Feb 3, 2021, with the same message (PiperOrigin-RevId: 355456305).
copybara-service bot pushed a commit to google/uncertainty-baselines that referenced this issue on Feb 4, 2021, with the same message (PiperOrigin-RevId: 355456305).
copybara-service bot pushed a commit to google/uncertainty-baselines that referenced this issue on Feb 5, 2021, with the same message (PiperOrigin-RevId: 355456305).
copybara-service bot pushed a commit to google/uncertainty-baselines that referenced this issue on Feb 12, 2021, with the same message (PiperOrigin-RevId: 355456305).
copybara-service bot pushed a commit to google/uncertainty-baselines that referenced this issue on Feb 16, 2021, with the same message (PiperOrigin-RevId: 355456305).
znado pushed a commit to google/uncertainty-baselines that referenced this issue on Feb 16, 2021, with the same message (PiperOrigin-RevId: 355456305).
copybara-service bot pushed a commit that referenced this issue on May 14, 2021:

There are two approaches to implement ensembles.
1. Load all SavedModels into a single model.
   + Pro: Simple to compute results.
   + Con: All models must fit in memory and compute can't parallelize across models.
2. Eval each model in parallel, saving predictions. Then load predictions and compute metrics. (This is the approach in Uncertainty Baselines.)
   + Pro: Scales with compute and memory.
   + Con: Requires two stages (the first uses accelerators, the second is CPU-only). We're already doing the first stage to report non-ensemble results, so two stages is not that inconvenient.

This CL does #2. Fixes google/uncertainty-baselines#63, google/uncertainty-baselines#71. Note: I added 'ece' back to the imagenet_variants report.
PiperOrigin-RevId: 370938990
copybara-service bot pushed a commit that referenced this issue on May 14, 2021, with the same commit message plus TODOs for later PRs:
+ Loading predictions is fairly slow. Each file is at most 200 MB with 50K predictions of 1000 float32 values, and read_predictions shouldn't take this long. np.load gets read speeds of, say, 200 MB/s (https://stackoverflow.com/a/30332316). It may be because we're loading with batch_size=1?
+ Replace het_ensemble.py and sngp_ensemble.py.
PiperOrigin-RevId: 370938990
copybara-service bot pushed two further commits that referenced this issue on May 14, 2021, with the same commit message (PiperOrigin-RevId: 370938990).
copybara-service bot pushed a commit that referenced this issue on May 18, 2021, with the same commit message (PiperOrigin-RevId: 370938990).
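As a concrete illustration of approach (2) in the commit message above, here is a minimal sketch of the two-stage pattern, assuming each ensemble member's per-example probabilities have already been written to `.npy` files in stage 1. The file names, array shapes, and metrics are illustrative assumptions, not the actual Uncertainty Baselines or robustness_metrics code.

```python
import numpy as np

# Stage 1 (accelerator-bound, not shown): each ensemble member is evaluated on
# its own and its per-example class probabilities are saved to disk.
# These file names and shapes are assumptions for illustration.
prediction_files = [
    "member_0_probs.npy",  # shape: (num_examples, num_classes)
    "member_1_probs.npy",
    "member_2_probs.npy",
]

# Stage 2 (CPU-only): load the saved predictions one member at a time and
# average them. Only one member's array plus a running sum needs to fit in
# memory, unlike loading all SavedModels into a single ensemble model.
ensemble_probs = None
for path in prediction_files:
    member_probs = np.load(path)
    ensemble_probs = member_probs if ensemble_probs is None else ensemble_probs + member_probs
ensemble_probs /= len(prediction_files)

# Compute a couple of example metrics from the averaged probabilities.
labels = np.load("labels.npy")  # shape: (num_examples,), integer class ids
accuracy = float(np.mean(np.argmax(ensemble_probs, axis=-1) == labels))
nll = float(np.mean(-np.log(ensemble_probs[np.arange(len(labels)), labels] + 1e-12)))
print(f"ensemble accuracy={accuracy:.4f}  nll={nll:.4f}")
```

The point of this design, per the pros and cons listed above, is that stage 1 parallelizes trivially across models while stage 2 never needs more than one member's predictions in memory at a time.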
I noticed this import in `robustness_metrics/models/uncertainty_baselines.py` (line 28 in b4c4d4d): `from edward2.experimental import sngp`.

However, `edward2/experimental` is not covered by the `find_packages()` call in `edward2/setup.py`, meaning that the `experimental` directory will not be installed by pip. Therefore, I tried simply cloning the edward2 repository and setting the `PYTHONPATH` to its parent directory, so that `import edward2.experimental` works. Unfortunately, this caused the imports in `edward2/experimental/sngp/__init__.py` to fail: those imports require `experimental` itself to be discoverable on the `PYTHONPATH`, but the path needs to be set to the parent directory of edward2 to make the `from edward2.experimental import sngp` import above work.

So I am wondering: what is the recommended way to import `edward2.experimental`, and how do you do it yourselves?
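For anyone who lands here with the same question, below is a minimal sketch (not an official recommendation) of the clone-plus-`PYTHONPATH` workaround described above, together with a check of whether `edward2.experimental` is resolvable at all. The clone location `~/src` is a placeholder assumption.

```python
import importlib.util
import os
import sys

# Workaround tried in the issue, as an in-process sketch: put the *parent*
# directory of a local edward2 clone at the front of sys.path (the equivalent
# of prepending it to PYTHONPATH) so that `edward2.experimental` resolves from
# the clone rather than from the pip-installed package, which does not ship it.
# EDWARD2_CLONE_PARENT is a hypothetical path assumed to contain `edward2/`.
EDWARD2_CLONE_PARENT = os.path.expanduser("~/src")
sys.path.insert(0, EDWARD2_CLONE_PARENT)

# Report where `edward2` now resolves from and whether `experimental` is found.
spec = importlib.util.find_spec("edward2")
if spec is None:
    print("edward2 is not importable at all")
else:
    print("edward2 resolves to:", list(spec.submodule_search_locations))
    print("edward2.experimental found:",
          importlib.util.find_spec("edward2.experimental") is not None)
```

As the reply above notes, though, the eventual fix on the robustness_metrics side was simply to drop the dependence on `edward2.experimental` rather than to install it.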