fix: Raise exception for probs with observable #280

Open · wants to merge 3 commits into main
2 changes: 1 addition & 1 deletion setup.py
@@ -53,7 +53,7 @@
     },
     extras_require={
         "test": [
-            "autoray<0.7.0",  # autoray.tensorflow_diag no longer works
+            "autoray<0.7.0",  # autoray.tensorflow_diag no longer works
             "docutils>=0.19",
             "flaky",
             "pre-commit",
2 changes: 2 additions & 0 deletions src/braket/pennylane_plugin/translation.py
@@ -576,6 +576,8 @@ def translate_result_type( # noqa: C901
     observable = measurement.obs

     if return_type is ObservableReturnTypes.Probability:
+        if observable and observable.diagonalizing_gates():
+            raise qml.DeviceError("Probability result type not supported for observables")
Contributor:
This looks to be causing some issues for the tests. Is this the expected outcome now?

Contributor:
So at the moment the pl-device-tests xfail the hadamard gradient tests because of the incorrect results. We may need to xfail at a different location instead, since we now get an error on execution rather than on result comparison:

https://github.com/PennyLaneAI/pennylane/blob/883c3ed0a66d06841e074c90f59317972af36cee/pennylane/devices/tests/test_gradients_autograd.py#L141
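A rough sketch of what an execution-time xfail could look like; the device fixture, test name, and circuit below are illustrative only and not the actual pl-device-tests code:

# Hypothetical sketch: expect the new qml.DeviceError at execution time instead of
# comparing (incorrect) results. Fixture, test name, and circuit are illustrative only.
import pytest
import pennylane as qml


@pytest.mark.xfail(raises=qml.DeviceError, reason="probs with an observable is rejected at execution")
def test_hadamard_grad_probs_with_observable(device):
    dev = device(wires=1)

    @qml.qnode(dev, diff_method="hadamard")
    def circuit(x):
        qml.RY(x, wires=0)
        return qml.probs(op=qml.PauliX(0))

    # Plain execution already hits translate_result_type, so the error
    # surfaces here rather than in the later result comparison.
    circuit(0.5)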

         return Probability(targets)

     if return_type is ObservableReturnTypes.State:
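For context, a minimal sketch of when the new guard fires, assuming a recent PennyLane where the measurement's observable is exposed as .obs:

# Minimal sketch (not part of the PR): the guard only triggers when the probs
# measurement carries an observable with diagonalizing gates.
import pennylane as qml

mp_plain = qml.probs(wires=0)               # no observable attached
mp_with_obs = qml.probs(op=qml.PauliX(0))   # observable with diagonalizing gates

print(mp_plain.obs)                           # None -> guard skipped, Probability(targets) returned
print(mp_with_obs.obs.diagonalizing_gates())  # [Hadamard(wires=[0])] -> qml.DeviceError raised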
11 changes: 11 additions & 0 deletions test/unit_tests/test_translation.py
@@ -826,6 +826,17 @@ def test_translate_result_type_unsupported_obs():
         translate_result_type(tape.measurements[0], [0], frozenset())


+def test_translate_result_type_probs_observable():
+    """Tests if a DeviceError is raised by translate_result_type for a Probability return type
+    with an observable attached"""
+    mp = qml.probs(op=qml.X(wires=0))
+
+    with pytest.raises(
+        qml.DeviceError, match="Probability result type not supported for observables"
+    ):
+        translate_result_type(mp, [0], frozenset())
+
+
 def test_translate_result():
     result_dict = _result_meta()
     result_dict["resultTypes"] = [
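A complementary check, not included in this PR, that the observable-free probs path is unaffected; the result_types import path is an assumption based on the Braket SDK:

# Complementary sketch (not part of the PR): qml.probs without an observable
# should still translate to a Braket Probability result type.
import pennylane as qml
from braket.circuits import result_types
from braket.pennylane_plugin.translation import translate_result_type

mp = qml.probs(wires=0)
braket_rt = translate_result_type(mp, [0], frozenset())
assert isinstance(braket_rt, result_types.Probability)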