Support for V2 primitives #843

Merged: 94 commits, Nov 11, 2024 (diff below shows changes from 84 commits)

Commits
009c399
Update README.md
FrancescaSchiav Feb 29, 2024
5096546
Merge branch 'qiskit-community:main' into main
OkuyanBoga Mar 1, 2024
32219fb
Generalize the Einstein summation signature
edoaltamura Mar 14, 2024
17d8c33
Add reno
edoaltamura Mar 15, 2024
d6f688d
Update Copyright
edoaltamura Mar 15, 2024
034785b
Rename and add test
edoaltamura Mar 20, 2024
04b886d
Update Copyright
edoaltamura Mar 20, 2024
11cde5f
Merge branch 'qiskit-community:main' into main
OkuyanBoga Apr 3, 2024
aea890d
Merge pull request #18 from OkuyanBoga/torch_issue716
OkuyanBoga Apr 3, 2024
7b2e9be
Add docstring for `test_get_einsum_signature`
edoaltamura Apr 3, 2024
6a8a136
Correct spelling
edoaltamura Apr 3, 2024
31a826e
Disable spellcheck for comments
edoaltamura Apr 3, 2024
5b4f617
Add `docstring` in pylint dict
edoaltamura Apr 3, 2024
b0d0590
Delete example in docstring
edoaltamura Apr 3, 2024
240d02f
Add Einstein in pylint dict
edoaltamura Apr 3, 2024
f8c32dd
Add full use case in einsum dict
edoaltamura Apr 8, 2024
34322b2
Spelling and type ignore
edoaltamura Apr 8, 2024
94ec48c
Spelling and type ignore
edoaltamura Apr 8, 2024
16c8454
Spelling and type ignore
edoaltamura Apr 8, 2024
00130f2
Spelling and type ignore
edoaltamura Apr 8, 2024
e045c16
Spelling and type ignore
edoaltamura Apr 8, 2024
22d94ce
Remove for loop in einsum function and remove Literal arguments (1/2)
edoaltamura Apr 24, 2024
95dd9df
Remove for loop in einsum function and remove Literal arguments (1/2)
edoaltamura Apr 24, 2024
4cbf0c3
Remove for loop in einsum function and remove Literal arguments (2/2)
edoaltamura Apr 24, 2024
c4dca19
Update RuntimeError msg
edoaltamura Apr 30, 2024
d5ed96b
Update RuntimeError msg - line too long
edoaltamura Apr 30, 2024
d6f3d47
Trigger CI
edoaltamura May 2, 2024
5ed6345
Merge branch 'main' into main
edoaltamura May 2, 2024
9ccc3a2
Merge branch 'qiskit-community:main' into main
edoaltamura Jun 7, 2024
3f669b0
Merge branch 'qiskit-community:main' into main
edoaltamura Jun 19, 2024
3846d4d
Merge algos, globals.random to fix
edoaltamura Jul 11, 2024
070aa81
Fixed `algorithms_globals`
edoaltamura Jul 11, 2024
ddc160f
Import /tests and run CI locally
edoaltamura Jul 11, 2024
c5a55ad
Fix copyrights and some spellings
edoaltamura Jul 11, 2024
f4d49eb
Ignore mypy in 8 instances
edoaltamura Jul 18, 2024
2d1209f
Merge spell dicts
edoaltamura Jul 29, 2024
2735810
Black reformatting
edoaltamura Jul 29, 2024
840c270
Black reformatting
edoaltamura Jul 31, 2024
c2f726a
Add reno
edoaltamura Jul 31, 2024
cf2d6b0
Merge remote-tracking branch 'origin/main' into migrate-algo
edoaltamura Jul 31, 2024
5976830
Lint sanitize
edoaltamura Jul 31, 2024
5e07acc
Pylint
edoaltamura Jul 31, 2024
b997bb0
Pylint
edoaltamura Jul 31, 2024
c464459
Pylint
edoaltamura Jul 31, 2024
51610a1
Pylint
edoaltamura Jul 31, 2024
c42688c
Fix relative imports in tutorials
edoaltamura Jul 31, 2024
db9b03f
Fix relative imports in tutorials
edoaltamura Jul 31, 2024
21badc4
Remove algorithms from Jupyter magic methods
edoaltamura Jul 31, 2024
e8628cc
Temporarily disable "Run stable tutorials" tests
edoaltamura Aug 1, 2024
da63c5b
Change the docstrings with imports from qiskit_algorithms
edoaltamura Aug 1, 2024
0c5825c
Styling
edoaltamura Aug 1, 2024
c0974f9
Update qiskit_machine_learning/optimizers/gradient_descent.py
edoaltamura Aug 1, 2024
7490b38
Update qiskit_machine_learning/optimizers/optimizer_utils/learning_ra…
edoaltamura Aug 1, 2024
d38154b
Add more tests for utils
edoaltamura Aug 2, 2024
fe021e9
Add more tests for optimizers: adam, bobyqa, gsls and imfil
edoaltamura Aug 2, 2024
51e3ea7
Fix random seed for volatile optimizers
edoaltamura Aug 2, 2024
fb4fc39
Fix random seed for volatile optimizers
edoaltamura Aug 2, 2024
3cb3850
Add more tests
edoaltamura Aug 2, 2024
3da9109
Pylint dict
edoaltamura Aug 2, 2024
d34c4c9
Activate scikit-quant-0.8.2
edoaltamura Aug 2, 2024
1f6ca7a
Remove scikit-quant methods
edoaltamura Aug 2, 2024
b5875a3
Remove scikit-quant methods (2)
edoaltamura Aug 2, 2024
800cca4
Edit the release notes and Qiskit version 1+
edoaltamura Aug 5, 2024
e98200a
Edit the release notes and Qiskit version 1+
edoaltamura Aug 5, 2024
f349f7c
Add Qiskit 1.0 upgrade in reno
edoaltamura Aug 5, 2024
154d6a7
Add Qiskit 1.0 upgrade in reno
edoaltamura Aug 5, 2024
c728400
Add Qiskit 1.0 upgrade in reno
edoaltamura Aug 5, 2024
3294731
Apply line breaks
edoaltamura Aug 6, 2024
9e53371
Restructure line breaks
edoaltamura Aug 6, 2024
2bbb57c
Added support for SamplerV2 primitives (#49)
OkuyanBoga Nov 7, 2024
1712ebe
Added support for EstimatorV2 primitives (#48)
OkuyanBoga Nov 7, 2024
2bf2668
Pulled changes from main
OkuyanBoga Nov 8, 2024
e52575b
Quick fix
OkuyanBoga Nov 8, 2024
805a6b1
bugfix for V1
OkuyanBoga Nov 8, 2024
9a6574b
formatting
oscar-wallis Nov 8, 2024
1d03d4f
Prep-ing for 0.8 (#53)
oscar-wallis Nov 8, 2024
79e9b2e
Merge remote-tracking branch 'upstream/main' into update-V2
edoaltamura Nov 8, 2024
5606dd6
Update test_qbayesian
OkuyanBoga Nov 8, 2024
45bc6f8
Bugfixing the test_gradient
oscar-wallis Nov 8, 2024
e69c03d
Fixing an Options error with sampler_gradient
oscar-wallis Nov 8, 2024
56dc948
Merge branch 'update-V2' of https://github.com/OkuyanBoga/hc-qiskit-m…
oscar-wallis Nov 8, 2024
bd41778
Linting and formatting
edoaltamura Nov 8, 2024
3622bc2
Add reno
edoaltamura Nov 8, 2024
4844894
Fix dict typing definition
edoaltamura Nov 8, 2024
e386aaf
Fix mypy
edoaltamura Nov 8, 2024
2527ea7
Issue deprecation warnings
edoaltamura Nov 8, 2024
6c6efc5
Update skip test message
edoaltamura Nov 8, 2024
da04d85
Update deprecation warning for qbayesian.py
edoaltamura Nov 8, 2024
9472261
Update deprecation warning for qbayesian.py
edoaltamura Nov 8, 2024
fc716fe
Add headers in deprecation.py
edoaltamura Nov 8, 2024
e0c6b7d
Add headers in deprecation.py
edoaltamura Nov 8, 2024
0f09f4f
Add headers in deprecation.py
edoaltamura Nov 8, 2024
6a44cbe
Correct spelling
edoaltamura Nov 8, 2024
5d653b0
Add spelling `msg`
edoaltamura Nov 8, 2024
Files changed
2 changes: 1 addition & 1 deletion .github/workflows/main.yml
@@ -312,4 +312,4 @@ jobs:
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: coveralls --service=github
shell: bash
shell: bash
56 changes: 45 additions & 11 deletions qiskit_machine_learning/algorithms/inference/qbayesian.py
@@ -15,11 +15,15 @@

import copy
from typing import Tuple, Dict, Set, List

from qiskit import QuantumCircuit, ClassicalRegister
from qiskit.quantum_info import Statevector
from qiskit.circuit.library import GroverOperator
from qiskit.primitives import BaseSampler, Sampler
from qiskit.circuit import Qubit
from qiskit.circuit.library import GroverOperator
from qiskit.primitives import BaseSampler, Sampler, BaseSamplerV2
from qiskit.transpiler.passmanager import BasePassManager
from qiskit.transpiler.preset_passmanagers import generate_preset_pass_manager
from qiskit.providers.fake_provider import GenericBackendV2


class QBayesian:
@@ -62,7 +66,8 @@ def __init__(
*,
limit: int = 10,
threshold: float = 0.9,
sampler: BaseSampler | None = None,
sampler: BaseSampler | BaseSamplerV2 | None = None,
pass_manager: BasePassManager | None = None,
):
"""
Args:
@@ -83,7 +88,8 @@ def __init__(
# Test valid input
for qrg in circuit.qregs:
if qrg.size > 1:
raise ValueError("Every register needs to be mapped to exactly one unique qubit")
raise ValueError("Every register needs to be mapped to exactly one unique qubit.")

# Initialize parameter
self._circ = circuit
self._limit = limit
@@ -92,6 +98,11 @@
sampler = Sampler()
self._sampler = sampler

if pass_manager is None:
_backend = GenericBackendV2(num_qubits=max(circuit.num_qubits, 2))
pass_manager = generate_preset_pass_manager(optimization_level=1, backend=_backend)
self._pass_manager = pass_manager

# Label of register mapped to its qubit
self._label2qubit = {qrg.name: qrg[0] for qrg in self._circ.qregs}
# Label of register mapped to its qubit index bottom up in significance
@@ -139,11 +150,34 @@ def _get_grover_op(self, evidence: Dict[str, int]) -> GroverOperator:

def _run_circuit(self, circuit: QuantumCircuit) -> Dict[str, float]:
"""Run the quantum circuit with the sampler."""
# Sample from circuit
job = self._sampler.run(circuit)
result = job.result()
# Get the counts of quantum state results
counts = result.quasi_dists[0].nearest_probability_distribution().binary_probabilities()
counts = {}

if isinstance(self._sampler, BaseSampler):
# Sample from circuit
job = self._sampler.run(circuit)
result = job.result()

# Get the counts of quantum state results
counts = result.quasi_dists[0].nearest_probability_distribution().binary_probabilities()

elif isinstance(self._sampler, BaseSamplerV2):

# Sample from circuit
circuit_isa = self._pass_manager.run(circuit)
job = self._sampler.run([circuit_isa])
result = job.result()

bit_array = list(result[0].data.values())[0]
bitstring_counts = bit_array.get_counts()

# Normalize the counts to probabilities
total_shots = result[0].metadata["shots"]
counts = {k: v / total_shots for k, v in bitstring_counts.items()}

# Convert to quasi-probabilities
# counts = QuasiDistribution(probabilities)
# counts = {k: v for k, v in counts.items()}

return counts

def __power_grover(
@@ -360,12 +394,12 @@ def limit(self, limit: int):
self._limit = limit

@property
def sampler(self) -> BaseSampler:
def sampler(self) -> BaseSampler | BaseSamplerV2:
"""Returns the sampler primitive used to compute the samples."""
return self._sampler

@sampler.setter
def sampler(self, sampler: BaseSampler):
def sampler(self, sampler: BaseSampler | BaseSamplerV2):
"""Set the sampler primitive used to compute the samples."""
self._sampler = sampler

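With this change `QBayesian` accepts a `BaseSamplerV2` together with a pass manager that produces ISA circuits. A minimal usage sketch, not part of the diff: `StatevectorSampler` and the one-node circuit are illustrative stand-ins, and the import path and the `inference` call are assumed from the existing `QBayesian` API, so they should be checked against the released package.

```python
from qiskit import QuantumCircuit, QuantumRegister
from qiskit.primitives import StatevectorSampler  # a BaseSamplerV2 implementation
from qiskit.providers.fake_provider import GenericBackendV2
from qiskit.transpiler.preset_passmanagers import generate_preset_pass_manager

from qiskit_machine_learning.algorithms import QBayesian

# Trivial one-node Bayesian network: one single-qubit register per node.
a = QuantumRegister(1, name="A")
circuit = QuantumCircuit(a)
circuit.ry(1.0, a[0])

# ISA transpilation target; GenericBackendV2 stands in for a real backend.
pass_manager = generate_preset_pass_manager(
    optimization_level=1, backend=GenericBackendV2(num_qubits=2)
)

qbayesian = QBayesian(circuit, sampler=StatevectorSampler(), pass_manager=pass_manager)
print(qbayesian.inference(query={"A": 1}))  # estimate of P(A = 1)
```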
qiskit_machine_learning/gradients/base/base_estimator_gradient.py
@@ -24,10 +24,12 @@

from qiskit.circuit import Parameter, ParameterExpression, QuantumCircuit
from qiskit.primitives import BaseEstimator
from qiskit.primitives.base import BaseEstimatorV2
from qiskit.primitives.utils import _circuit_key
from qiskit.providers import Options
from qiskit.quantum_info.operators.base_operator import BaseOperator
from qiskit.transpiler.passes import TranslateParameterizedGates
from qiskit.transpiler.passmanager import BasePassManager

from .estimator_gradient_result import EstimatorGradientResult
from ..utils import (
@@ -46,13 +48,15 @@ class BaseEstimatorGradient(ABC):

def __init__(
self,
estimator: BaseEstimator,
estimator: BaseEstimator | BaseEstimatorV2,
options: Options | None = None,
derivative_type: DerivativeType = DerivativeType.REAL,
pass_manager: BasePassManager | None = None,
):
r"""
Args:
estimator: The estimator used to compute the gradients.
pass_manager: Pass manager used to transpile the circuits to ISA form before running them on V2 primitives.
options: Primitive backend runtime options used for circuit execution.
The order of priority is: options in ``run`` method > gradient's
default options > primitive's default setting.
@@ -69,6 +73,7 @@ def __init__(
finite difference.
"""
self._estimator: BaseEstimator = estimator
self._pass_manager = pass_manager
self._default_options = Options()
if options is not None:
self._default_options.update_options(**options)
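Since the base class now carries a `pass_manager`, a V2-aware gradient can transpile its circuits internally. A hedged construction sketch, assuming `ParamShiftEstimatorGradient` inherits this constructor unchanged and that the `qiskit_machine_learning.gradients` import path matches this release; `StatevectorEstimator` is an illustrative `BaseEstimatorV2` implementation.

```python
import numpy as np
from qiskit.circuit import Parameter, QuantumCircuit
from qiskit.primitives import StatevectorEstimator  # a BaseEstimatorV2 implementation
from qiskit.providers.fake_provider import GenericBackendV2
from qiskit.quantum_info import SparsePauliOp
from qiskit.transpiler.preset_passmanagers import generate_preset_pass_manager

from qiskit_machine_learning.gradients import ParamShiftEstimatorGradient

theta = Parameter("theta")
qc = QuantumCircuit(1)
qc.ry(theta, 0)

pm = generate_preset_pass_manager(
    optimization_level=1, backend=GenericBackendV2(num_qubits=2)
)
gradient = ParamShiftEstimatorGradient(estimator=StatevectorEstimator(), pass_manager=pm)

# d<Z>/d(theta) at theta = pi/4 via the parameter-shift rule; expected approx. -sin(pi/4)
job = gradient.run([qc], [SparsePauliOp("Z")], [[np.pi / 4]])
print(job.result().gradients)
```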
qiskit_machine_learning/gradients/base/base_sampler_gradient.py
@@ -26,6 +26,7 @@
from qiskit.primitives.utils import _circuit_key
from qiskit.providers import Options
from qiskit.transpiler.passes import TranslateParameterizedGates
from qiskit.transpiler.passmanager import BasePassManager

from .sampler_gradient_result import SamplerGradientResult
from ..utils import (
@@ -41,7 +42,13 @@
class BaseSamplerGradient(ABC):
"""Base class for a ``SamplerGradient`` to compute the gradients of the sampling probability."""

def __init__(self, sampler: BaseSampler, options: Options | None = None):
def __init__(
self,
sampler: BaseSampler,
options: Options | None = None,
len_quasi_dist: int | None = None,
pass_manager: BasePassManager | None = None,
):
"""
Args:
sampler: The sampler used to compute the gradients.
@@ -51,6 +58,8 @@ def __init__(self, sampler: BaseSampler, options: Options | None = None):
Higher priority setting overrides lower priority setting
"""
self._sampler: BaseSampler = sampler
self._pass_manager = pass_manager
self._len_quasi_dist = len_quasi_dist
self._default_options = Options()
if options is not None:
self._default_options.update_options(**options)
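The sampler-gradient base class gains the same `pass_manager` hook plus `len_quasi_dist`, which bounds the keys kept in each returned quasi-distribution (see the V2 branch of `ParamShiftSamplerGradient` further below). A hedged sketch, assuming the subclass inherits this constructor unchanged; `StatevectorSampler` and the toy circuit are illustrative.

```python
from qiskit.circuit import Parameter, QuantumCircuit
from qiskit.primitives import StatevectorSampler  # a BaseSamplerV2 implementation
from qiskit.providers.fake_provider import GenericBackendV2
from qiskit.transpiler.preset_passmanagers import generate_preset_pass_manager

from qiskit_machine_learning.gradients import ParamShiftSamplerGradient

theta = Parameter("theta")
qc = QuantumCircuit(1)
qc.ry(theta, 0)
qc.measure_all()  # V2 samplers need explicit measurements

pm = generate_preset_pass_manager(
    optimization_level=1, backend=GenericBackendV2(num_qubits=2)
)
gradient = ParamShiftSamplerGradient(
    sampler=StatevectorSampler(),
    pass_manager=pm,
    len_quasi_dist=2**qc.num_qubits,  # keep outcome keys 0 .. 2**n - 1
)
job = gradient.run([qc], [[0.5]])
print(job.result().gradients)  # d p(0)/d(theta) and d p(1)/d(theta)
```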
qiskit_machine_learning/gradients/param_shift/param_shift_estimator_gradient.py
@@ -17,14 +17,19 @@

from collections.abc import Sequence

import numpy as np

from qiskit.circuit import Parameter, QuantumCircuit
from qiskit.quantum_info.operators.base_operator import BaseOperator
from qiskit.primitives.base import BaseEstimatorV2
from qiskit.primitives import BaseEstimatorV1
from qiskit.providers.options import Options

from ..base.base_estimator_gradient import BaseEstimatorGradient
from ..base.estimator_gradient_result import EstimatorGradientResult
from ..utils import _make_param_shift_parameter_values

from ...exceptions import AlgorithmError
from ...exceptions import QiskitMachineLearningError


class ParamShiftEstimatorGradient(BaseEstimatorGradient):
@@ -97,26 +102,59 @@ def _run_unique(
job_param_values.extend(param_shift_parameter_values)
all_n.append(n)

# Run the single job with all circuits.
job = self._estimator.run(
job_circuits,
job_observables,
job_param_values,
**options,
)
try:
# Determine how to run the estimator based on its version
if isinstance(self._estimator, BaseEstimatorV1):
# Run the single job with all circuits.
job = self._estimator.run(
job_circuits,
job_observables,
job_param_values,
**options,
)
results = job.result()
except Exception as exc:
raise AlgorithmError("Estimator job failed.") from exc

# Compute the gradients.
gradients = []
partial_sum_n = 0
for n in all_n:
result = results.values[partial_sum_n : partial_sum_n + n]
gradient_ = (result[: n // 2] - result[n // 2 :]) / 2
gradients.append(gradient_)
partial_sum_n += n

opt = self._get_local_options(options)

# Compute the gradients.
gradients = []
partial_sum_n = 0
for n in all_n:
result = results.values[partial_sum_n : partial_sum_n + n]
gradient_ = (result[: n // 2] - result[n // 2 :]) / 2
gradients.append(gradient_)
partial_sum_n += n

opt = self._get_local_options(options)

elif isinstance(self._estimator, BaseEstimatorV2):
isa_g_circs = self._pass_manager.run(job_circuits)
isa_g_observables = [
op.apply_layout(isa_g_circs[i].layout) for i, op in enumerate(job_observables)
]
# Prepare circuit-observable-parameter tuples (PUBs)
circuit_observable_params = []
for pub in zip(isa_g_circs, isa_g_observables, job_param_values):
circuit_observable_params.append(pub)

# For BaseEstimatorV2, run the estimator using PUBs and specified precision
job = self._estimator.run(circuit_observable_params)
results = job.result()
results = np.array([float(r.data.evs) for r in results])

# Compute the gradients.
gradients = []
partial_sum_n = 0
for n in all_n:
result = results[partial_sum_n : partial_sum_n + n]
gradient_ = (result[: n // 2] - result[n // 2 :]) / 2
gradients.append(gradient_)
partial_sum_n += n

opt = Options(**options)

else:
raise QiskitMachineLearningError(
"The accepted estimators are BaseEstimatorV1 and BaseEstimatorV2; got "
+ f"{type(self._estimator)} instead. Note that BaseEstimatorV1 is deprecated in"
+ "Qiskit and removed in Qiskit IBM Runtime."
)

return EstimatorGradientResult(gradients=gradients, metadata=metadata, options=opt)
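The V2 branch above boils down to the standard `BaseEstimatorV2` PUB pattern: transpile to an ISA circuit, map the observable onto the transpiled layout, and read expectation values from `data.evs`. A standalone sketch of that pattern, independent of the gradient machinery; `StatevectorEstimator` and the single-parameter circuit are illustrative.

```python
import numpy as np
from qiskit.circuit import Parameter, QuantumCircuit
from qiskit.primitives import StatevectorEstimator
from qiskit.providers.fake_provider import GenericBackendV2
from qiskit.quantum_info import SparsePauliOp
from qiskit.transpiler.preset_passmanagers import generate_preset_pass_manager

theta = Parameter("theta")
qc = QuantumCircuit(1)
qc.ry(theta, 0)

pm = generate_preset_pass_manager(
    optimization_level=1, backend=GenericBackendV2(num_qubits=2)
)
isa_circuit = pm.run(qc)
isa_observable = SparsePauliOp("Z").apply_layout(isa_circuit.layout)

# One pub per (circuit, observable, parameter values); here the two shifted evaluations.
theta0 = 0.1
pubs = [
    (isa_circuit, isa_observable, [theta0 + np.pi / 2]),
    (isa_circuit, isa_observable, [theta0 - np.pi / 2]),
]
results = StatevectorEstimator().run(pubs).result()
evs = np.array([float(r.data.evs) for r in results])
gradient = (evs[0] - evs[1]) / 2  # approx. -sin(theta0) for <Z> after ry(theta)
print(gradient)
```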
qiskit_machine_learning/gradients/param_shift/param_shift_sampler_gradient.py
@@ -20,11 +20,15 @@

from qiskit.circuit import Parameter, QuantumCircuit

from qiskit.primitives import BaseSamplerV1
from qiskit.primitives.base import BaseSamplerV2
from qiskit.result import QuasiDistribution

from ..base.base_sampler_gradient import BaseSamplerGradient
from ..base.sampler_gradient_result import SamplerGradientResult
from ..utils import _make_param_shift_parameter_values

from ...exceptions import AlgorithmError
from ...exceptions import AlgorithmError, QiskitMachineLearningError


class ParamShiftSamplerGradient(BaseSamplerGradient):
@@ -91,18 +95,52 @@ def _run_unique(
all_n.append(n)

# Run the single job with all circuits.
job = self._sampler.run(job_circuits, job_param_values, **options)
if isinstance(self._sampler, BaseSamplerV1):
job = self._sampler.run(job_circuits, job_param_values, **options)
elif isinstance(self._sampler, BaseSamplerV2):
if self._pass_manager is None:
raise QiskitMachineLearningError(
"To use ParameterShifSamplerGradient with SamplerV2 you "
+ "must pass a gradient with a pass manager"
)
isa_g_circs = self._pass_manager.run(job_circuits)
circ_params = [
(isa_g_circs[i], job_param_values[i]) for i in range(len(job_param_values))
]
job = self._sampler.run(circ_params)
else:
raise AlgorithmError(
"The accepted estimators are BaseSamplerV1 (deprecated) and BaseSamplerV2; got "
+ f"{type(self._sampler)} instead."
)

try:
results = job.result()
except Exception as exc:
raise AlgorithmError("Estimator job failed.") from exc
raise AlgorithmError("Sampler job failed.") from exc

# Compute the gradients.
gradients = []
partial_sum_n = 0
opt = None # Required by PyLint: possibly-used-before-assignment
for n in all_n:
gradient = []
result = results.quasi_dists[partial_sum_n : partial_sum_n + n]

if isinstance(self._sampler, BaseSamplerV1):
result = results.quasi_dists[partial_sum_n : partial_sum_n + n]
opt = self._get_local_options(options)
elif isinstance(self._sampler, BaseSamplerV2):
result = []
for i in range(partial_sum_n, partial_sum_n + n):
bitstring_counts = results[i].data.meas.get_counts()
# Normalize the counts to probabilities
total_shots = sum(bitstring_counts.values())
probabilities = {k: v / total_shots for k, v in bitstring_counts.items()}
# Convert to quasi-probabilities
counts = QuasiDistribution(probabilities)
result.append({k: v for k, v in counts.items() if int(k) < self._len_quasi_dist})
opt = options

for dist_plus, dist_minus in zip(result[: n // 2], result[n // 2 :]):
grad_dist: dict[int, float] = defaultdict(float)
for key, val in dist_plus.items():
@@ -113,5 +151,4 @@ def _run_unique(
gradients.append(gradient)
partial_sum_n += n

opt = self._get_local_options(options)
return SamplerGradientResult(gradients=gradients, metadata=metadata, options=opt)
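For reference, the counts-to-quasi-distribution conversion done per pub result in the V2 branch can be reproduced in isolation as below; `StatevectorSampler` is an illustrative `BaseSamplerV2` implementation and the `meas` register name comes from `measure_all()`.

```python
from qiskit import QuantumCircuit
from qiskit.primitives import StatevectorSampler  # a BaseSamplerV2 implementation
from qiskit.result import QuasiDistribution

qc = QuantumCircuit(1)
qc.h(0)
qc.measure_all()

result = StatevectorSampler().run([qc], shots=1024).result()
bitstring_counts = result[0].data.meas.get_counts()  # e.g. {"0": 531, "1": 493}

# Normalize the counts to probabilities, then wrap them as quasi-probabilities so
# downstream code sees the same interface as the V1 quasi_dists.
total_shots = sum(bitstring_counts.values())
quasi_dist = QuasiDistribution({k: v / total_shots for k, v in bitstring_counts.items()})
print(quasi_dist)  # integer keys, e.g. {0: 0.5186, 1: 0.4814}
```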