
Add note on sampling overhead to cutting tutorials #249

Merged
merged 10 commits into main on Jun 19, 2023

Conversation

caleb-johnson
Collaborator

Resolves #225

@caleb-johnson caleb-johnson added the documentation Improvements or additions to documentation label Jun 11, 2023
@caleb-johnson caleb-johnson requested a review from garrison June 11, 2023 14:49
@github-actions

github-actions bot commented Jun 11, 2023

Pull Request Test Coverage Report for Build 5306859455

  • 0 of 0 changed or added relevant lines in 0 files are covered.
  • No unchanged relevant lines lost coverage.
  • Overall coverage remained the same at 89.638%

Totals Coverage Status
Change from base Build 5271438667: 0.0%
Covered Lines: 2327
Relevant Lines: 2596

💛 - Coveralls

Member

@garrison garrison left a comment


The sampling overhead is the required number of samples needed to get an expectation value of a target observable within some error, $\epsilon$.

How about:

The sampling overhead is the factor by which the number of samples must increase for the quasiprobability decomposition to result in the same amount of error as one would get by sampling the original circuit.
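[Editor's note: a sketch of the math behind this definition, using the standard quasiprobability-decomposition notation; the symbols below are not from the PR itself.]

```latex
% A non-implementable channel is decomposed into implementable ones:
\mathcal{E} \;=\; \sum_i a_i\,\mathcal{E}_i,
\qquad
\gamma \;=\; \sum_i \lvert a_i \rvert \quad \text{(the one-norm of the decomposition)}.
% Estimating an expectation value to additive error \epsilon then needs
N \;=\; O\!\left(\frac{\gamma^2}{\epsilon^2}\right)
% samples, versus O(1/\epsilon^2) when sampling the original circuit,
% so the sampling overhead -- the factor by which N must increase -- is \gamma^2.
```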

@caleb-johnson
Collaborator Author

caleb-johnson commented Jun 16, 2023

How about:

Hmm, your comment doesn't say "Outdated", but I made this change

EDIT: I see it was just a manual quote :)

garrison
garrison previously approved these changes Jun 19, 2023
Member

@garrison garrison left a comment


Thanks, this looks good. One final thought: I think the word "roughly" may be unnecessary, especially since it is followed by big-O, which is really about asymptotic behavior. I did not even notice this the first five times I read it, so it is a pretty minor point.

@caleb-johnson
Collaborator Author

caleb-johnson commented Jun 19, 2023

Thanks, this looks good. One final thought: I think the word "roughly" may be unnecessary, especially since it is followed by big-O, which is really about asymptotic behavior. I did not even notice this the first five times I read it, so it is a pretty minor point.

I took this from the paper and wondered why it was rough. Is it because different gates have different base terms, or is it because something inexact is going on with the chosen epsilon? I'm going to remove it, since I'm not certain either way.
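[Editor's note: a minimal sketch of the scaling being discussed, for readers landing here later. The function name, the unit constant factor, and the choice of a cut CNOT as the example are illustrative assumptions, not taken from the tutorial.]

```python
import math

def required_shots(gamma: float, epsilon: float) -> int:
    """Hypothetical estimate of the shots needed to reach additive
    error epsilon when sampling a quasiprobability decomposition with
    one-norm gamma. Uses the O(gamma**2 / epsilon**2) scaling with the
    constant factor taken as 1 for illustration."""
    return math.ceil(gamma**2 / epsilon**2)

# A single cut CNOT has one-norm gamma = 3, so its sampling overhead
# relative to the uncut circuit (gamma = 1) is gamma**2 = 9.
print(required_shots(gamma=3, epsilon=0.1))  # 900
print(required_shots(gamma=1, epsilon=0.1))  # 100
```

The 9x gap between the two calls is exactly the "factor by which the number of samples must increase" in the wording adopted above.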

@caleb-johnson caleb-johnson merged commit 95ed594 into main Jun 19, 2023
@caleb-johnson caleb-johnson deleted the add-sampling-note branch June 19, 2023 01:35
Labels
documentation Improvements or additions to documentation
Development

Successfully merging this pull request may close these issues.

Explain sampling overhead in tutorials
2 participants