more useful output for update_fake_backends.py #8336
Conversation
Thank you for opening a new pull request. Before your PR can be merged it will first need to pass continuous integration tests and be reviewed. Sometimes the review process can be slow, so please be patient. While you're waiting, please feel free to review other open PRs. While only a subset of people are authorized to approve pull requests for merging, everyone is encouraged to review open pull requests. Doing reviews helps reduce the burden on the core team and helps make the project's code better for everyone. One or more of the following people are requested to review this:
Pull Request Test Coverage Report for Build 3146527252
💛 - Coveralls
Force-pushed the …e_backends/no_props branch from 5c2ee92 to b0ccdf6
The additional information coming out of the script certainly looks useful to me to help catch broken fake backends as they're generated. Now that it's got rather more complicated, I think it wouldn't hurt if we could write a bit more documentation into the script about what it outputs, why those statistics are important, and a mention of when it's ok to accept a calibration that's got some broken elements in it.
tools/update_fake_backends.py
Outdated
if config:
    config_path = os.path.join(args.dir, name, "conf_%s.json" % name)
    config_dict = config.to_dict()

    with open(config_path, "w") as fd:
        fd.write(json.dumps(config_dict, cls=BackendEncoder))
    summary.conf_msg = f"conf_{name}.json okey"
These lines seem a little pointless, since the only way they can not get set is if the script throws some exception and exits without printing anything. If you do want to keep them, a minor spelling note: "ok" or "okay", but not "okey".
I changed it to "ok" and clarified the alternative path in a60833b
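For illustration, a minimal sketch of what handling the alternative path could look like (the names follow the diff above; the else-branch message text is an assumption, not necessarily what a60833b actually contains):

if config:
    config_path = os.path.join(args.dir, name, "conf_%s.json" % name)
    config_dict = config.to_dict()
    with open(config_path, "w") as fd:
        fd.write(json.dumps(config_dict, cls=BackendEncoder))
    summary.conf_msg = f"conf_{name}.json ok"
else:
    # Hypothetical alternative path: record explicitly that no configuration was written.
    summary.conf_msg = f"conf_{name}.json skipped (no configuration available)"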
tools/update_fake_backends.py
Outdated
output += [f"{line}" for line in self.readout_summary]

if self.gate_summary:
    output.append("Gate errors == 1:")
    output += [f"{line}" for line in self.gate_summary]
Largely irrelevant, but f"{line}" is a bit redundant since line is already a string! output.extend(self.gate_summary) ought to work fine, I think? (And similar for the others.)
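To illustrate the suggestion, a sketch of the simplified version (assuming self.readout_summary and self.gate_summary are lists of strings, as the comment implies):

# The entries are already strings, so extend() avoids the redundant f-string comprehension.
output.extend(self.readout_summary)

if self.gate_summary:
    output.append("Gate errors == 1:")
    output.extend(self.gate_summary)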
Oops! Good catch. It was left over from when I was doing some formatting on each line. Fixed in a2a1828.

I added some extra explanation and followed the advice in #8336 (comment).
sys.path.append(args.dir)
backend_mod = import_module(name)
fake_backend = getattr(backend_mod, "Fake" + name.capitalize())()
I got this error for ibm_oslo, which is related to what I commented in the other PR.

File "/Users/junye/Documents/GitHub/qiskit-dev/qiskit-terra/./tools/update_fake_backends.py", line 300, in _main
    summary = _Summary(props, fake_backend.properties())
AttributeError: 'FakeOslo' object has no attribute 'properties'

Here you assume that Fake{Backend} is a V1 backend and has a properties attribute, but FakeOslo is actually a V2 backend.
This script should probably be updated for V2 backends
Let's leave that for another PR, since it'll be a bit more complicated, most likely. Right now, we can just skip the backend if the one we're given is V2.
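A minimal sketch of what skipping V2 backends could look like, assuming this runs inside the loop over the requested backends and using qiskit.providers.BackendV2 for the check (the warning text is illustrative, not the script's actual output):

from qiskit.providers import BackendV2

fake_backend = getattr(backend_mod, "Fake" + name.capitalize())()
if isinstance(fake_backend, BackendV2):
    # BackendV2 fake backends do not expose .properties(), so skip them for now.
    print(f"Skipping {name}: Fake{name.capitalize()} is a BackendV2 backend and is not handled yet")
    continue
summary = _Summary(props, fake_backend.properties())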
Co-authored-by: "Lev S. Bishop" <[email protected]>
Co-authored-by: Junye Huang <[email protected]>
Co-authored-by: Luciano Bello <[email protected]>
Co-authored-by: Matthew Treinish <[email protected]>
Co-authored-by: Thomas Alexander <[email protected]>
Closing in favor of Qiskit/qiskit-ibm-runtime#1262, as IBM-specific fake backends are moving to https://github.com/Qiskit/qiskit-ibm-runtime
This PR extends tools/update_fake_backends.py to warn the user when the snapshot contains errors equal to 1.

@nonhermitian noticed that some calibration data was bad when I was creating FakeGeneva in #8322 (comment). Indeed, many property JSON files contain gates with gate_error==1. Moreover, many exploratory backends have errors==1 somewhere, but some calibrations are better than others, so it is important to show the relative amount of gates with error==1.

Additionally, the script tools/update_fake_backends.py can silently fail, and it is hard to see why it is not doing what it is supposed to do. Take the situation of trying to update a backend you don't have access to: it runs without any output, but does not generate the expected files. With this PR, the script reports when some of the args.backends did not get updated or when some of the JSON files were skipped. This is how it looks:

In addition, I added a parameter --datetime to be able to fetch specific snapshots, as suggested here.
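As a rough illustration of the statistic being reported, here is a sketch of how gates with gate_error == 1 could be counted from a V1 BackendProperties snapshot (count_broken_gates is a hypothetical helper for illustration, not the script's actual code):

def count_broken_gates(props):
    """Count gate entries whose reported gate_error equals 1 in a BackendProperties object."""
    broken, total = 0, 0
    for gate in props.gates:
        for param in gate.parameters:
            if param.name == "gate_error":
                total += 1
                if param.value == 1:
                    broken += 1
    return broken, total

broken, total = count_broken_gates(fake_backend.properties())
print(f"{broken}/{total} gates report gate_error == 1")

Presumably the --datetime flag maps onto the provider's properties(datetime=...) lookup so that a specific historical snapshot can be fetched, but that is an assumption about the implementation rather than something stated in this thread.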