diff --git a/AnoncredsProofValidation.md b/AnoncredsProofValidation.md new file mode 100644 index 0000000000..9114c3a727 --- /dev/null +++ b/AnoncredsProofValidation.md @@ -0,0 +1,84 @@ +# Anoncreds Proof Validation in Aca-Py + +Aca-Py does some pre-validation when verifying Anoncreds presentations (proofs): some scenarios are rejected (those indicative of tampering, for example) and some attributes are removed before running the anoncreds validation (for example, superfluous non-revocation timestamps). Any Aca-Py validations or presentation modifications are indicated by the "verified_msgs" attribute in the final presentation exchange object. + +The list of possible verification messages is [here](https://github.com/hyperledger/aries-cloudagent-python/blob/main/aries_cloudagent/indy/verifier.py#L24), and consists of: + +``` +class PresVerifyMsg(str, Enum): + """Credential verification codes.""" + + RMV_REFERENT_NON_REVOC_INTERVAL = "RMV_RFNT_NRI" + RMV_GLOBAL_NON_REVOC_INTERVAL = "RMV_GLB_NRI" + TSTMP_OUT_NON_REVOC_INTRVAL = "TS_OUT_NRI" + CT_UNREVEALED_ATTRIBUTES = "UNRVL_ATTR" + PRES_VALUE_ERROR = "VALUE_ERROR" + PRES_VERIFY_ERROR = "VERIFY_ERROR" +``` + +If there is additional information, it is appended, for example: `TS_OUT_NRI::19_uuid`, meaning the attribute identified by `19_uuid` contained a timestamp outside of the non-revocation interval (a warning only). + +A presentation verification may include multiple messages, for example: + +``` + ... + "verified": "true", + "verified_msgs": [ + "TS_OUT_NRI::18_uuid", + "TS_OUT_NRI::18_id_GE_uuid", + "TS_OUT_NRI::18_busid_GE_uuid" + ], + ... +``` + +... or it may include a single message, for example: + +``` + ... + "verified": "false", + "verified_msgs": [ + "VALUE_ERROR::Encoded representation mismatch for 'Preferred Name'" + ], + ... +``` + +... or the `verified_msgs` may be null or an empty array.
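The `CODE::extra` convention above can be split mechanically by client code consuming the presentation exchange record; the following is a minimal sketch (the helper name `parse_verified_msgs` is illustrative, not an ACA-Py API):

```python
# Hypothetical helper (not part of ACA-Py): split each "CODE::extra"
# entry from verified_msgs into a (code, extra) pair.
def parse_verified_msgs(verified_msgs):
    """Return (code, extra) tuples; extra is None when absent."""
    parsed = []
    for msg in verified_msgs or []:  # verified_msgs may be null or empty
        code, sep, extra = msg.partition("::")
        parsed.append((code, extra if sep else None))
    return parsed

print(parse_verified_msgs(["TS_OUT_NRI::18_uuid", "UNRVL_ATTR"]))
# [('TS_OUT_NRI', '18_uuid'), ('UNRVL_ATTR', None)]
```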
+ +## Presentation Modifications and Warnings + +The following modifications/warnings may be made by Aca-Py; these shouldn't affect the verification of the received proof: + +- "RMV_RFNT_NRI": Referent contains a non-revocation interval for a non-revocable credential (timestamp is removed) +- "RMV_GLB_NRI": Presentation contains a global interval for a non-revocable credential (timestamp is removed) +- "TS_OUT_NRI": Presentation contains a non-revocation timestamp outside of the requested non-revocation interval (warning) +- "UNRVL_ATTR": Presentation contains attributes with unrevealed values (warning) + +## Presentation Pre-validation Errors + +The following pre-verification checks are done; any failure fails the proof (before calling anoncreds) and results in a message of the form: + +``` +VALUE_ERROR::<description of the error> +``` + +These validations are all done within the [Indy verifier class](https://github.com/hyperledger/aries-cloudagent-python/blob/main/aries_cloudagent/indy/verifier.py) - to see the detailed validations, look for wherever a `raise ValueError(...)` appears in the code. + +A summary of the possible errors is: + +- information missing in presentation exchange record +- timestamp provided for irrevocable credential +- referenced revocation registry not found on ledger +- timestamp outside of reasonable range (future date or pre-dates revocation registry) +- mismatch between provided and requested timestamps for non-revocation +- mismatch between requested and provided attributes or predicates +- self-attested attribute is provided for a requested attribute with restrictions +- encoded value doesn't match raw value + +## Anoncreds Verification Exceptions + +Typically, when you call the anoncreds `verifier_verify_proof()` method, it returns `True` or `False` based on whether the presentation cryptographically verifies.
However, in the case where anoncreds throws an exception, the exception text will be included in a verification message as follows: + +``` +VERIFY_ERROR::<the exception text> +``` + diff --git a/DIDResolution.md b/DIDResolution.md index c1666adab2..cbe00edc1d 100644 --- a/DIDResolution.md +++ b/DIDResolution.md @@ -29,7 +29,7 @@ In practice, DIDs and DID Documents are used for a variety of purposes but espec ## `DIDResolver` -In ACA-Py, the `DIDResolver` provides the interface to resolve DIDs using registered method resolvers. Method resolver registration happens on startup through the `DIDResolverRegistry`. This registry enables additional resolvers to be loaded via plugin. +In ACA-Py, the `DIDResolver` provides the interface to resolve DIDs using registered method resolvers. Method resolver registration happens on startup in a `did_resolvers` list. This list enables additional resolvers to be loaded via plugin. #### Example usage: ```python= @@ -73,17 +73,17 @@ The following is an example method resolver implementation. In this example, we ```python= from aries_cloudagent.config.injection_context import InjectionContext -from aries_cloudagent.resolver.did_resolver_registry import DIDResolverRegistry +from ..resolver.did_resolver import DIDResolver from .example_resolver import ExampleResolver async def setup(context: InjectionContext): """Setup the plugin.""" - registry = context.inject(DIDResolverRegistry) + registry = context.inject(DIDResolver) resolver = ExampleResolver() await resolver.setup(context) - registry.register(resolver) + registry.append(resolver) ``` #### `example_resolver.py` diff --git a/Endorser.md b/Endorser.md index 54f545fce3..81b611f04d 100644 --- a/Endorser.md +++ b/Endorser.md @@ -1,7 +1,5 @@ # Transaction Endorser Support -Note that the ACA-Py transaction support is in the process of code refactor and cleanup. The following documents the current state, but is subject to change.
- ACA-Py supports an [Endorser Protocol](https://github.com/hyperledger/aries-rfcs/pull/586), that allows an un-privileged agent (an "Author") to request another agent (the "Endorser") to sign their transactions so they can write these transactions to the ledger. This is required on Indy ledgers, where new agents will typically be granted only "Author" privileges. Transaction Endorsement is built into the protocols for Schema, Credential Definition and Revocation, and endorsements can be explicitely requested, or ACA-Py can be configured to automate the endorsement workflow. @@ -61,3 +59,44 @@ Endorsement: For Authors, specify whether to automatically promote a DID to the wallet public DID after writing to the ledger. ``` +## How Aca-Py Handles Endorsements + +Internally, the Endorsement functionality is implemented as a protocol, consistent with the other protocols: + +- a [routes.py](https://github.com/hyperledger/aries-cloudagent-python/blob/main/aries_cloudagent/protocols/endorse_transaction/v1_0/routes.py) file exposes the admin endpoints +- [handler files](https://github.com/hyperledger/aries-cloudagent-python/tree/main/aries_cloudagent/protocols/endorse_transaction/v1_0/handlers) implement responses to any received Endorse protocol messages +- a [manager.py](https://github.com/hyperledger/aries-cloudagent-python/blob/main/aries_cloudagent/protocols/endorse_transaction/v1_0/manager.py) file implements common functionality that is called from both the routes.py and handler classes (as well as from other classes that need to interact with Endorser functionality) + +The Endorser makes use of the [Event Bus](https://github.com/hyperledger/aries-cloudagent-python/blob/main/CHANGELOG.md#july-14-2021) (links to the PR which links to a hackmd doc) to notify other protocols of any Endorser events of interest.
For example, after a Credential Definition endorsement is received, the TransactionManager writes the endorsed transaction to the ledger and uses the Event Bus to notify the Credential Definition manager that it can do any required post-processing (such as writing the cred def record to the wallet, initiating the revocation registry, etc.). + +The overall architecture can be illustrated as: + +![Class Diagram](./docs/assets/endorser-design.png) + +### Create Credential Definition and Revocation Registry + +An example of an Endorser flow is as follows, showing how a credential definition endorsement is received and processed, and optionally kicks off the revocation registry process: + +![Sequence Diagram](./docs/assets/endorse-cred-def.png) + +You can see that there is a standard endorser flow happening each time there is a ledger write (illustrated in the "Endorser" process). + +At the end of each endorse sequence, the TransactionManager sends a notification via the EventBus so that any dependent processing can continue. Each Router is responsible for listening and responding to these notifications if necessary. + +For example: + +- Once the credential definition is created, a revocation registry must be created (for revocable cred defs) +- Once the revocation registry is created, a revocation entry must be created +- Potentially, the cred def status could be updated once the revocation entry is completed + +Using the EventBus decouples the event sequence. Any functions triggered by an event notification are typically also available directly via Admin endpoints. + +### Create DID and Promote to Public + +... and an example of creating a DID and promoting it to public (and creating an ATTRIB for the endpoint): + +![Sequence Diagram](./docs/assets/endorse-public-did.png) + +You can see the same endorsement processes in this sequence.
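The notify-and-continue pattern described in this section can be sketched with a toy event bus; the topic name and payload shape below are illustrative only, not ACA-Py's actual interfaces:

```python
# Toy sketch of the EventBus decoupling described above (illustrative
# only; ACA-Py's real EventBus and topic names differ).
import asyncio


class ToyEventBus:
    def __init__(self):
        self._subscribers = {}

    def subscribe(self, topic, handler):
        # e.g. a protocol's routes listening for endorsement events
        self._subscribers.setdefault(topic, []).append(handler)

    async def notify(self, topic, payload):
        # e.g. the TransactionManager, after writing to the ledger
        for handler in self._subscribers.get(topic, []):
            await handler(payload)


async def demo():
    bus = ToyEventBus()
    processed = []

    async def on_endorsed_txn(payload):
        # dependent post-processing, e.g. storing the cred def record
        processed.append(payload["txn_type"])

    bus.subscribe("endorse_transaction::endorsed", on_endorsed_txn)
    await bus.notify("endorse_transaction::endorsed", {"txn_type": "102"})
    return processed


print(asyncio.run(demo()))  # ['102']
```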
+ +Once the DID is written, the DID can (optionally) be promoted to the public DID, which will also invoke an ATTRIB transaction to write the endpoint. diff --git a/aries_cloudagent/config/default_context.py b/aries_cloudagent/config/default_context.py index fb0867cddc..0c5f90cac2 100644 --- a/aries_cloudagent/config/default_context.py +++ b/aries_cloudagent/config/default_context.py @@ -12,7 +12,6 @@ from ..core.protocol_registry import ProtocolRegistry from ..core.goal_code_registry import GoalCodeRegistry from ..resolver.did_resolver import DIDResolver -from ..resolver.did_resolver_registry import DIDResolverRegistry from ..tails.base import BaseTailsServer from ..protocols.actionmenu.v1_0.base_service import BaseMenuService @@ -50,12 +49,8 @@ async def build_context(self) -> InjectionContext: # Global event bus context.injector.bind_instance(EventBus, EventBus()) - # Global did resolver registry - did_resolver_registry = DIDResolverRegistry() - context.injector.bind_instance(DIDResolverRegistry, did_resolver_registry) - # Global did resolver - context.injector.bind_instance(DIDResolver, DIDResolver(did_resolver_registry)) + context.injector.bind_instance(DIDResolver, DIDResolver([])) await self.bind_providers(context) await self.load_plugins(context) diff --git a/aries_cloudagent/core/tests/test_conductor.py b/aries_cloudagent/core/tests/test_conductor.py index 02fbe4ff49..203ca1b3b4 100644 --- a/aries_cloudagent/core/tests/test_conductor.py +++ b/aries_cloudagent/core/tests/test_conductor.py @@ -24,7 +24,7 @@ from ...protocols.coordinate_mediation.v1_0.models.mediation_record import ( MediationRecord, ) -from ...resolver.did_resolver import DIDResolver, DIDResolverRegistry +from ...resolver.did_resolver import DIDResolver from ...multitenant.base import BaseMultitenantManager from ...multitenant.manager import MultitenantManager from ...storage.base import BaseStorage @@ -92,7 +92,7 @@ async def build_context(self) -> InjectionContext: 
context.injector.bind_instance(ProfileManager, InMemoryProfileManager()) context.injector.bind_instance(ProtocolRegistry, ProtocolRegistry()) context.injector.bind_instance(BaseWireFormat, self.wire_format) - context.injector.bind_instance(DIDResolver, DIDResolver(DIDResolverRegistry())) + context.injector.bind_instance(DIDResolver, DIDResolver([])) context.injector.bind_instance(EventBus, MockEventBus()) return context diff --git a/aries_cloudagent/indy/credx/verifier.py b/aries_cloudagent/indy/credx/verifier.py index c6677cfa7b..e625076ecd 100644 --- a/aries_cloudagent/indy/credx/verifier.py +++ b/aries_cloudagent/indy/credx/verifier.py @@ -7,7 +7,7 @@ from ...core.profile import Profile -from ..verifier import IndyVerifier +from ..verifier import IndyVerifier, PresVerifyMsg LOGGER = logging.getLogger(__name__) @@ -33,7 +33,7 @@ async def verify_presentation( credential_definitions, rev_reg_defs, rev_reg_entries, - ) -> bool: + ) -> (bool, list): """ Verify a presentation. @@ -46,16 +46,21 @@ async def verify_presentation( rev_reg_entries: revocation registry entries """ + msgs = [] try: - self.non_revoc_intervals(pres_req, pres, credential_definitions) - await self.check_timestamps(self.profile, pres_req, pres, rev_reg_defs) - await self.pre_verify(pres_req, pres) + msgs += self.non_revoc_intervals(pres_req, pres, credential_definitions) + msgs += await self.check_timestamps( + self.profile, pres_req, pres, rev_reg_defs + ) + msgs += await self.pre_verify(pres_req, pres) except ValueError as err: + s = str(err) + msgs.append(f"{PresVerifyMsg.PRES_VALUE_ERROR.value}::{s}") LOGGER.error( f"Presentation on nonce={pres_req['nonce']} " f"cannot be validated: {str(err)}" ) - return False + return (False, msgs) try: presentation = Presentation.load(pres) @@ -68,11 +73,13 @@ async def verify_presentation( rev_reg_defs.values(), rev_reg_entries, ) - except CredxError: + except CredxError as err: + s = str(err) + 
msgs.append(f"{PresVerifyMsg.PRES_VERIFY_ERROR.value}::{s}") LOGGER.exception( f"Validation of presentation on nonce={pres_req['nonce']} " "failed with error" ) verified = False - return verified + return (verified, msgs) diff --git a/aries_cloudagent/indy/sdk/tests/test_verifier.py b/aries_cloudagent/indy/sdk/tests/test_verifier.py index 44784b1ad5..d4abc1bdd1 100644 --- a/aries_cloudagent/indy/sdk/tests/test_verifier.py +++ b/aries_cloudagent/indy/sdk/tests/test_verifier.py @@ -336,7 +336,7 @@ async def test_verify_presentation(self, mock_verify): ) as mock_get_ledger: mock_get_ledger.return_value = (None, self.ledger) INDY_PROOF_REQ_X = deepcopy(INDY_PROOF_REQ_PRED_NAMES) - verified = await self.verifier.verify_presentation( + (verified, msgs) = await self.verifier.verify_presentation( INDY_PROOF_REQ_X, INDY_PROOF_PRED_NAMES, "schemas", @@ -370,7 +370,7 @@ async def test_verify_presentation_x_indy(self, mock_verify): IndyLedgerRequestsExecutor, "get_ledger_for_identifier" ) as mock_get_ledger: mock_get_ledger.return_value = ("test", self.ledger) - verified = await self.verifier.verify_presentation( + (verified, msgs) = await self.verifier.verify_presentation( INDY_PROOF_REQ_NAME, INDY_PROOF_NAME, "schemas", @@ -397,7 +397,7 @@ async def test_check_encoding_attr(self, mock_verify): ) as mock_get_ledger: mock_get_ledger.return_value = (None, self.ledger) mock_verify.return_value = True - verified = await self.verifier.verify_presentation( + (verified, msgs) = await self.verifier.verify_presentation( INDY_PROOF_REQ_NAME, INDY_PROOF_NAME, "schemas", @@ -415,6 +415,8 @@ async def test_check_encoding_attr(self, mock_verify): json.dumps("rev_reg_entries"), ) assert verified is True + assert len(msgs) == 1 + assert "TS_OUT_NRI::19_uuid" in msgs @async_mock.patch("indy.anoncreds.verifier_verify_proof") async def test_check_encoding_attr_tamper_raw(self, mock_verify): @@ -426,7 +428,7 @@ async def test_check_encoding_attr_tamper_raw(self, mock_verify): 
IndyLedgerRequestsExecutor, "get_ledger_for_identifier" ) as mock_get_ledger: mock_get_ledger.return_value = ("test", self.ledger) - verified = await self.verifier.verify_presentation( + (verified, msgs) = await self.verifier.verify_presentation( INDY_PROOF_REQ_NAME, INDY_PROOF_X, "schemas", @@ -438,6 +440,11 @@ async def test_check_encoding_attr_tamper_raw(self, mock_verify): mock_verify.assert_not_called() assert verified is False + assert len(msgs) == 2 + assert "TS_OUT_NRI::19_uuid" in msgs + assert ( + "VALUE_ERROR::Encoded representation mismatch for 'Preferred Name'" in msgs + ) @async_mock.patch("indy.anoncreds.verifier_verify_proof") async def test_check_encoding_attr_tamper_encoded(self, mock_verify): @@ -449,7 +456,7 @@ async def test_check_encoding_attr_tamper_encoded(self, mock_verify): IndyLedgerRequestsExecutor, "get_ledger_for_identifier" ) as mock_get_ledger: mock_get_ledger.return_value = (None, self.ledger) - verified = await self.verifier.verify_presentation( + (verified, msgs) = await self.verifier.verify_presentation( INDY_PROOF_REQ_NAME, INDY_PROOF_X, "schemas", @@ -461,6 +468,11 @@ async def test_check_encoding_attr_tamper_encoded(self, mock_verify): mock_verify.assert_not_called() assert verified is False + assert len(msgs) == 2 + assert "TS_OUT_NRI::19_uuid" in msgs + assert ( + "VALUE_ERROR::Encoded representation mismatch for 'Preferred Name'" in msgs + ) @async_mock.patch("indy.anoncreds.verifier_verify_proof") async def test_check_pred_names(self, mock_verify): @@ -470,7 +482,7 @@ async def test_check_pred_names(self, mock_verify): mock_get_ledger.return_value = ("test", self.ledger) mock_verify.return_value = True INDY_PROOF_REQ_X = deepcopy(INDY_PROOF_REQ_PRED_NAMES) - verified = await self.verifier.verify_presentation( + (verified, msgs) = await self.verifier.verify_presentation( INDY_PROOF_REQ_X, INDY_PROOF_PRED_NAMES, "schemas", @@ -491,6 +503,10 @@ async def test_check_pred_names(self, mock_verify): ) assert verified is True + 
assert len(msgs) == 3 + assert "TS_OUT_NRI::18_uuid" in msgs + assert "TS_OUT_NRI::18_id_GE_uuid" in msgs + assert "TS_OUT_NRI::18_busid_GE_uuid" in msgs @async_mock.patch("indy.anoncreds.verifier_verify_proof") async def test_check_pred_names_tamper_pred_value(self, mock_verify): @@ -502,7 +518,7 @@ async def test_check_pred_names_tamper_pred_value(self, mock_verify): IndyLedgerRequestsExecutor, "get_ledger_for_identifier" ) as mock_get_ledger: mock_get_ledger.return_value = (None, self.ledger) - verified = await self.verifier.verify_presentation( + (verified, msgs) = await self.verifier.verify_presentation( deepcopy(INDY_PROOF_REQ_PRED_NAMES), INDY_PROOF_X, "schemas", @@ -514,6 +530,14 @@ async def test_check_pred_names_tamper_pred_value(self, mock_verify): mock_verify.assert_not_called() assert verified is False + assert len(msgs) == 4 + assert "RMV_RFNT_NRI::18_uuid" in msgs + assert "RMV_RFNT_NRI::18_busid_GE_uuid" in msgs + assert "RMV_RFNT_NRI::18_id_GE_uuid" in msgs + assert ( + "VALUE_ERROR::Timestamp on sub-proof #0 is superfluous vs. 
requested attribute group 18_uuid" + in msgs + ) @async_mock.patch("indy.anoncreds.verifier_verify_proof") async def test_check_pred_names_tamper_pred_req_attr(self, mock_verify): @@ -523,7 +547,7 @@ async def test_check_pred_names_tamper_pred_req_attr(self, mock_verify): IndyLedgerRequestsExecutor, "get_ledger_for_identifier" ) as mock_get_ledger: mock_get_ledger.return_value = (None, self.ledger) - verified = await self.verifier.verify_presentation( + (verified, msgs) = await self.verifier.verify_presentation( INDY_PROOF_REQ_X, INDY_PROOF_PRED_NAMES, "schemas", @@ -535,6 +559,14 @@ async def test_check_pred_names_tamper_pred_req_attr(self, mock_verify): mock_verify.assert_not_called() assert verified is False + assert len(msgs) == 4 + assert "RMV_RFNT_NRI::18_uuid" in msgs + assert "RMV_RFNT_NRI::18_busid_GE_uuid" in msgs + assert "RMV_RFNT_NRI::18_id_GE_uuid" in msgs + assert ( + "VALUE_ERROR::Timestamp on sub-proof #0 is superfluous vs. requested attribute group 18_uuid" + in msgs + ) @async_mock.patch("indy.anoncreds.verifier_verify_proof") async def test_check_pred_names_tamper_attr_groups(self, mock_verify): @@ -546,7 +578,7 @@ async def test_check_pred_names_tamper_attr_groups(self, mock_verify): IndyLedgerRequestsExecutor, "get_ledger_for_identifier" ) as mock_get_ledger: mock_get_ledger.return_value = ("test", self.ledger) - verified = await self.verifier.verify_presentation( + (verified, msgs) = await self.verifier.verify_presentation( deepcopy(INDY_PROOF_REQ_PRED_NAMES), INDY_PROOF_X, "schemas", @@ -558,3 +590,7 @@ async def test_check_pred_names_tamper_attr_groups(self, mock_verify): mock_verify.assert_not_called() assert verified is False + assert len(msgs) == 3 + assert "RMV_RFNT_NRI::18_busid_GE_uuid" in msgs + assert "RMV_RFNT_NRI::18_id_GE_uuid" in msgs + assert "VALUE_ERROR::Missing requested attribute group 18_uuid" in msgs diff --git a/aries_cloudagent/indy/sdk/verifier.py b/aries_cloudagent/indy/sdk/verifier.py index b9e087aa82..5c67463eed 
100644 --- a/aries_cloudagent/indy/sdk/verifier.py +++ b/aries_cloudagent/indy/sdk/verifier.py @@ -8,7 +8,7 @@ from ...core.profile import Profile -from ..verifier import IndyVerifier +from ..verifier import IndyVerifier, PresVerifyMsg LOGGER = logging.getLogger(__name__) @@ -34,7 +34,7 @@ async def verify_presentation( credential_definitions, rev_reg_defs, rev_reg_entries, - ) -> bool: + ) -> (bool, list): """ Verify a presentation. @@ -49,16 +49,21 @@ async def verify_presentation( LOGGER.debug(f">>> received presentation: {pres}") LOGGER.debug(f">>> for pres_req: {pres_req}") + msgs = [] try: - self.non_revoc_intervals(pres_req, pres, credential_definitions) - await self.check_timestamps(self.profile, pres_req, pres, rev_reg_defs) - await self.pre_verify(pres_req, pres) + msgs += self.non_revoc_intervals(pres_req, pres, credential_definitions) + msgs += await self.check_timestamps( + self.profile, pres_req, pres, rev_reg_defs + ) + msgs += await self.pre_verify(pres_req, pres) except ValueError as err: + s = str(err) + msgs.append(f"{PresVerifyMsg.PRES_VALUE_ERROR.value}::{s}") LOGGER.error( f"Presentation on nonce={pres_req['nonce']} " f"cannot be validated: {str(err)}" ) - return False + return (False, msgs) LOGGER.debug(f">>> verifying presentation: {pres}") LOGGER.debug(f">>> for pres_req: {pres_req}") @@ -71,11 +76,13 @@ async def verify_presentation( json.dumps(rev_reg_defs), json.dumps(rev_reg_entries), ) - except IndyError: + except IndyError as err: + s = str(err) + msgs.append(f"{PresVerifyMsg.PRES_VERIFY_ERROR.value}::{s}") LOGGER.exception( f"Validation of presentation on nonce={pres_req['nonce']} " "failed with error" ) verified = False - return verified + return (verified, msgs) diff --git a/aries_cloudagent/indy/verifier.py b/aries_cloudagent/indy/verifier.py index f14ad14b24..f61ca829f2 100644 --- a/aries_cloudagent/indy/verifier.py +++ b/aries_cloudagent/indy/verifier.py @@ -3,6 +3,7 @@ import logging from abc import ABC, ABCMeta, abstractmethod 
+from enum import Enum from time import time from typing import Mapping @@ -16,9 +17,21 @@ from .models.xform import indy_proof_req2non_revoc_intervals + LOGGER = logging.getLogger(__name__) +class PresVerifyMsg(str, Enum): + """Credential verification codes.""" + + RMV_REFERENT_NON_REVOC_INTERVAL = "RMV_RFNT_NRI" + RMV_GLOBAL_NON_REVOC_INTERVAL = "RMV_GLB_NRI" + TSTMP_OUT_NON_REVOC_INTRVAL = "TS_OUT_NRI" + CT_UNREVEALED_ATTRIBUTES = "UNRVL_ATTR" + PRES_VALUE_ERROR = "VALUE_ERROR" + PRES_VERIFY_ERROR = "VERIFY_ERROR" + + class IndyVerifier(ABC, metaclass=ABCMeta): """Base class for Indy Verifier.""" @@ -32,7 +45,7 @@ def __repr__(self) -> str: """ return "<{}>".format(self.__class__.__name__) - def non_revoc_intervals(self, pres_req: dict, pres: dict, cred_defs: dict): + def non_revoc_intervals(self, pres_req: dict, pres: dict, cred_defs: dict) -> list: """ Remove superfluous non-revocation intervals in presentation request. @@ -45,6 +58,7 @@ def non_revoc_intervals(self, pres_req: dict, pres: dict, cred_defs: dict): pres: corresponding presentation """ + msgs = [] for (req_proof_key, pres_key) in { "revealed_attrs": "requested_attributes", "revealed_attr_groups": "requested_attributes", @@ -60,6 +74,10 @@ def non_revoc_intervals(self, pres_req: dict, pres: dict, cred_defs: dict): if uuid in pres_req[pres_key] and pres_req[pres_key][uuid].pop( "non_revoked", None ): + msgs.append( + f"{PresVerifyMsg.RMV_REFERENT_NON_REVOC_INTERVAL.value}::" + f"{uuid}" + ) LOGGER.info( ( "Amended presentation request (nonce=%s): removed " @@ -79,6 +97,7 @@ def non_revoc_intervals(self, pres_req: dict, pres: dict, cred_defs: dict): for spec in pres["identifiers"] ): pres_req.pop("non_revoked", None) + msgs.append(PresVerifyMsg.RMV_GLOBAL_NON_REVOC_INTERVAL.value) LOGGER.warning( ( "Amended presentation request (nonce=%s); removed global " @@ -86,6 +105,7 @@ def non_revoc_intervals(self, pres_req: dict, pres: dict, cred_defs: dict): ), pres_req["nonce"], ) + return msgs async def 
check_timestamps( self, @@ -93,7 +113,7 @@ async def check_timestamps( pres_req: Mapping, pres: Mapping, rev_reg_defs: Mapping, - ): + ) -> list: """ Check for suspicious, missing, and superfluous timestamps. @@ -106,6 +126,7 @@ async def check_timestamps( pres: indy proof request rev_reg_defs: rev reg defs by rev reg id, augmented with transaction times """ + msgs = [] now = int(time()) non_revoc_intervals = indy_proof_req2non_revoc_intervals(pres_req) LOGGER.debug(f">>> got non-revoc intervals: {non_revoc_intervals}") @@ -159,6 +180,7 @@ async def check_timestamps( # timestamp superfluous, missing, or outside non-revocation interval revealed_attrs = pres["requested_proof"].get("revealed_attrs", {}) + unrevealed_attrs = pres["requested_proof"].get("unrevealed_attrs", {}) revealed_groups = pres["requested_proof"].get("revealed_attr_groups", {}) self_attested = pres["requested_proof"].get("self_attested_attrs", {}) preds = pres["requested_proof"].get("predicates", {}) @@ -185,11 +207,20 @@ async def check_timestamps( < timestamp < non_revoc_intervals[uuid].get("to", now) ): + msgs.append( + f"{PresVerifyMsg.TSTMP_OUT_NON_REVOC_INTRVAL.value}::" + f"{uuid}" + ) LOGGER.info( f"Timestamp {timestamp} from ledger for item" f"{uuid} falls outside non-revocation interval " f"{non_revoc_intervals[uuid]}" ) + elif uuid in unrevealed_attrs: + # nothing to do, attribute value is not revealed + msgs.append( + f"{PresVerifyMsg.CT_UNREVEALED_ATTRIBUTES.value}::" f"{uuid}" + ) elif uuid not in self_attested: raise ValueError( f"Presentation attributes mismatch requested attribute {uuid}" @@ -217,6 +248,10 @@ async def check_timestamps( < timestamp < non_revoc_intervals[uuid].get("to", now) ): + msgs.append( + f"{PresVerifyMsg.TSTMP_OUT_NON_REVOC_INTRVAL.value}::" + f"{uuid}" + ) LOGGER.warning( f"Timestamp {timestamp} from ledger for item" f"{uuid} falls outside non-revocation interval " @@ -243,13 +278,17 @@ async def check_timestamps( < timestamp < 
non_revoc_intervals[uuid].get("to", now) ): + msgs.append( + f"{PresVerifyMsg.TSTMP_OUT_NON_REVOC_INTRVAL.value}::" f"{uuid}" + ) LOGGER.warning( f"Best-effort timestamp {timestamp} " "from ledger falls outside non-revocation interval " f"{non_revoc_intervals[uuid]}" ) + return msgs - async def pre_verify(self, pres_req: dict, pres: dict): + async def pre_verify(self, pres_req: dict, pres: dict) -> list: """ Check for essential components and tampering in presentation. @@ -261,6 +300,7 @@ async def pre_verify(self, pres_req: dict, pres: dict): pres: corresponding presentation """ + msgs = [] if not ( pres_req and "requested_predicates" in pres_req @@ -297,12 +337,19 @@ async def pre_verify(self, pres_req: dict, pres: dict): raise ValueError(f"Missing requested predicate '{uuid}'") revealed_attrs = pres["requested_proof"].get("revealed_attrs", {}) + unrevealed_attrs = pres["requested_proof"].get("unrevealed_attrs", {}) revealed_groups = pres["requested_proof"].get("revealed_attr_groups", {}) self_attested = pres["requested_proof"].get("self_attested_attrs", {}) for (uuid, req_attr) in pres_req["requested_attributes"].items(): if "name" in req_attr: if uuid in revealed_attrs: pres_req_attr_spec = {req_attr["name"]: revealed_attrs[uuid]} + elif uuid in unrevealed_attrs: + # unrevealed attribute, nothing to do + pres_req_attr_spec = {} + msgs.append( + f"{PresVerifyMsg.CT_UNREVEALED_ATTRIBUTES.value}::" f"{uuid}" + ) elif uuid in self_attested: if not req_attr.get("restrictions"): continue @@ -339,6 +386,7 @@ async def pre_verify(self, pres_req: dict, pres: dict): raise ValueError(f"Encoded representation mismatch for '{attr}'") if primary_enco != encode(spec["raw"]): raise ValueError(f"Encoded representation mismatch for '{attr}'") + return msgs @abstractmethod def verify_presentation( @@ -349,7 +397,7 @@ def verify_presentation( credential_definitions, rev_reg_defs, rev_reg_entries, - ): + ) -> (bool, list): """ Verify a presentation. 
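With this change, `verify_presentation()` returns a `(verified, msgs)` tuple rather than a bare `bool`, so callers unpack both values; a minimal sketch using a stand-in verifier (illustrative only, not the real `IndyVerifier`):

```python
# Stand-in verifier (illustrative): mirrors the new return contract of
# verify_presentation() -- (verified: bool, msgs: list of "CODE::detail").
import asyncio


async def verify_presentation_stub(pres_req, pres):
    msgs = []
    try:
        if pres is None:
            raise ValueError("missing presentation")
    except ValueError as err:
        # pre-validation failure: record the message and fail the proof
        msgs.append(f"VALUE_ERROR::{err}")
        return (False, msgs)
    # (cryptographic verification would happen here)
    return (True, msgs)


async def demo():
    verified, msgs = await verify_presentation_stub({"nonce": "123"}, None)
    return verified, msgs


print(asyncio.run(demo()))  # (False, ['VALUE_ERROR::missing presentation'])
```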
diff --git a/aries_cloudagent/ledger/indy.py b/aries_cloudagent/ledger/indy.py index 6f1c06e42f..c1c530c9af 100644 --- a/aries_cloudagent/ledger/indy.py +++ b/aries_cloudagent/ledger/indy.py @@ -770,6 +770,8 @@ async def update_endpoint_for_did( else None ) + LOGGER.info(f">>> exist_endpoint_of_type = {exist_endpoint_of_type}") + LOGGER.info(f">>> endpoint = {endpoint}") if exist_endpoint_of_type != endpoint: if await self.is_ledger_read_only(): raise LedgerError( @@ -783,6 +785,9 @@ async def update_endpoint_for_did( ) with IndyErrorHandler("Exception building attribute request", LedgerError): + LOGGER.info( + f">>> calling build_attrib_request() with {nym} {attr_json}" + ) request_json = await indy.ledger.build_attrib_request( nym, nym, None, attr_json, None ) @@ -791,17 +796,26 @@ async def update_endpoint_for_did( request_json = await indy.ledger.append_request_endorser( request_json, endorser_did ) + LOGGER.info( + f">>> calling _submit() with {request_json}, True, etc ..." + ) resp = await self._submit( request_json, - True, + sign=True, sign_did=public_info, write_ledger=write_ledger, ) if not write_ledger: + LOGGER.info(f">>> Returning ... 
signed_txn {resp}") return {"signed_txn": resp} + LOGGER.info(f">>> calling _submit() with {request_json}") await self._submit(request_json, True, True) return True + + else: + LOGGER.info(">>> NOT updating ledger endpoint!") + return False async def register_nym( diff --git a/aries_cloudagent/protocols/connections/v1_0/tests/test_manager.py b/aries_cloudagent/protocols/connections/v1_0/tests/test_manager.py index d7c3836236..80efb456bf 100644 --- a/aries_cloudagent/protocols/connections/v1_0/tests/test_manager.py +++ b/aries_cloudagent/protocols/connections/v1_0/tests/test_manager.py @@ -20,7 +20,6 @@ from .....multitenant.manager import MultitenantManager from .....protocols.routing.v1_0.manager import RoutingManager from .....resolver.did_resolver import DIDResolver -from .....resolver.did_resolver_registry import DIDResolverRegistry from .....storage.error import StorageNotFoundError from .....transport.inbound.receipt import MessageReceipt from .....wallet.base import DIDInfo @@ -2024,9 +2023,7 @@ async def test_fetch_connection_targets_no_my_did(self): async def test_fetch_connection_targets_conn_invitation_did_no_resolver(self): async with self.profile.session() as session: - self.context.injector.bind_instance( - DIDResolver, DIDResolver(DIDResolverRegistry()) - ) + self.context.injector.bind_instance(DIDResolver, DIDResolver([])) await session.wallet.create_local_did( method=DIDMethod.SOV, key_type=KeyType.ED25519, @@ -2320,9 +2317,7 @@ async def test_fetch_connection_targets_conn_invitation_unsupported_key_type(sel async def test_fetch_connection_targets_oob_invitation_svc_did_no_resolver(self): async with self.profile.session() as session: - self.context.injector.bind_instance( - DIDResolver, DIDResolver(DIDResolverRegistry()) - ) + self.context.injector.bind_instance(DIDResolver, DIDResolver([])) await session.wallet.create_local_did( method=DIDMethod.SOV, key_type=KeyType.ED25519, diff --git 
a/aries_cloudagent/protocols/endorse_transaction/v1_0/manager.py b/aries_cloudagent/protocols/endorse_transaction/v1_0/manager.py index 8ce93c45a8..3669a16eeb 100644 --- a/aries_cloudagent/protocols/endorse_transaction/v1_0/manager.py +++ b/aries_cloudagent/protocols/endorse_transaction/v1_0/manager.py @@ -21,7 +21,10 @@ from ....storage.error import StorageError, StorageNotFoundError from ....transport.inbound.receipt import MessageReceipt from ....wallet.base import BaseWallet -from ....wallet.util import notify_endorse_did_event +from ....wallet.util import ( + notify_endorse_did_event, + notify_endorse_did_attrib_event, +) from .messages.cancel_transaction import CancelTransaction from .messages.endorsed_transaction_response import EndorsedTransactionResponse @@ -794,6 +797,11 @@ async def endorsed_txn_post_processing( did = ledger_response["result"]["txn"]["data"]["dest"] await notify_endorse_did_event(self._profile, did, meta_data) + elif ledger_response["result"]["txn"]["type"] == "100": + # write DID ATTRIB to ledger + did = ledger_response["result"]["txn"]["data"]["dest"] + await notify_endorse_did_attrib_event(self._profile, did, meta_data) + else: # TODO unknown ledger transaction type, just ignore for now ... 
pass diff --git a/aries_cloudagent/protocols/present_proof/dif/pres_exch.py b/aries_cloudagent/protocols/present_proof/dif/pres_exch.py index dafe0b8857..3155d28860 100644 --- a/aries_cloudagent/protocols/present_proof/dif/pres_exch.py +++ b/aries_cloudagent/protocols/present_proof/dif/pres_exch.py @@ -237,7 +237,9 @@ def extract_info(self, data, **kwargs): """deserialize.""" new_data = {} if isinstance(data, dict): - if "oneof_filter" in data: + if "uri_groups" in data: + return data + elif "oneof_filter" in data and isinstance(data["oneof_filter"], list): new_data["oneof_filter"] = True uri_group_list_of_list = [] uri_group_list = data.get("oneof_filter") diff --git a/aries_cloudagent/protocols/present_proof/dif/tests/test_pres_exch.py b/aries_cloudagent/protocols/present_proof/dif/tests/test_pres_exch.py index 34638ef764..2709024263 100644 --- a/aries_cloudagent/protocols/present_proof/dif/tests/test_pres_exch.py +++ b/aries_cloudagent/protocols/present_proof/dif/tests/test_pres_exch.py @@ -395,6 +395,10 @@ def test_schemas_input_desc_filter(self): deser_schema_filter = SchemasInputDescriptorFilter.deserialize( test_schemas_filter ) + ser_schema_filter = deser_schema_filter.serialize() + deser_schema_filter = SchemasInputDescriptorFilter.deserialize( + ser_schema_filter + ) assert deser_schema_filter.oneof_filter assert deser_schema_filter.uri_groups[0][0].uri == test_schema_list[0][0].get( "uri" @@ -418,6 +422,10 @@ def test_schemas_input_desc_filter(self): deser_schema_filter = SchemasInputDescriptorFilter.deserialize( test_schemas_filter ) + ser_schema_filter = deser_schema_filter.serialize() + deser_schema_filter = SchemasInputDescriptorFilter.deserialize( + ser_schema_filter + ) assert deser_schema_filter.oneof_filter assert deser_schema_filter.uri_groups[0][0].uri == test_schema_list[0].get( "uri" @@ -428,6 +436,10 @@ def test_schemas_input_desc_filter(self): assert isinstance(deser_schema_filter, SchemasInputDescriptorFilter) deser_schema_filter = 
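The `extract_info` change above makes deserialization tolerant of round-tripped input: a dict that already carries `uri_groups` came from `serialize()` and is passed through unchanged, and `oneof_filter` is only unpacked when it is still a raw list. A simplified sketch of that guard (the real method builds `SchemasInputDescriptor` objects; plain dicts stand in here):

```python
def extract_info(data):
    """Pre-load hook sketch: normalize raw input, pass through serialized output."""
    new_data = {}
    if isinstance(data, dict):
        if "uri_groups" in data:
            # already in internal form (produced by serialize()) - pass through
            return data
        elif "oneof_filter" in data and isinstance(data["oneof_filter"], list):
            # raw oneof_filter input: each entry becomes its own uri group
            new_data["oneof_filter"] = True
            new_data["uri_groups"] = [
                group if isinstance(group, list) else [group]
                for group in data["oneof_filter"]
            ]
    elif isinstance(data, list):
        # bare list of schemas: a single uri group, no oneof semantics
        new_data = {"oneof_filter": False, "uri_groups": [data]}
    return new_data

raw = {"oneof_filter": [[{"uri": "schema:one"}], [{"uri": "schema:two"}]]}
deserialized = extract_info(raw)
round_tripped = extract_info(deserialized)  # second pass is a no-op
```

This is the property the new serialize/deserialize round-trip assertions in `test_pres_exch.py` exercise.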
SchemasInputDescriptorFilter.deserialize(test_schema_list) + ser_schema_filter = deser_schema_filter.serialize() + deser_schema_filter = SchemasInputDescriptorFilter.deserialize( + ser_schema_filter + ) assert not deser_schema_filter.oneof_filter assert deser_schema_filter.uri_groups[0][0].uri == test_schema_list[0].get( "uri" diff --git a/aries_cloudagent/protocols/present_proof/dif/tests/test_pres_exch_handler.py b/aries_cloudagent/protocols/present_proof/dif/tests/test_pres_exch_handler.py index 509e87ba6f..87587eb674 100644 --- a/aries_cloudagent/protocols/present_proof/dif/tests/test_pres_exch_handler.py +++ b/aries_cloudagent/protocols/present_proof/dif/tests/test_pres_exch_handler.py @@ -8,7 +8,6 @@ from .....core.in_memory import InMemoryProfile from .....did.did_key import DIDKey -from .....resolver.did_resolver_registry import DIDResolverRegistry from .....resolver.did_resolver import DIDResolver from .....storage.vc_holder.vc_record import VCRecord from .....wallet.base import BaseWallet, DIDInfo @@ -69,9 +68,7 @@ def event_loop(request): def profile(): profile = InMemoryProfile.test_profile() context = profile.context - did_resolver_registry = DIDResolverRegistry() - context.injector.bind_instance(DIDResolverRegistry, did_resolver_registry) - context.injector.bind_instance(DIDResolver, DIDResolver(did_resolver_registry)) + context.injector.bind_instance(DIDResolver, DIDResolver([])) context.injector.bind_instance(DocumentLoader, custom_document_loader) context.settings["debug.auto_respond_presentation_request"] = True return profile diff --git a/aries_cloudagent/protocols/present_proof/v1_0/manager.py b/aries_cloudagent/protocols/present_proof/v1_0/manager.py index c951aa8c02..2f3af46da5 100644 --- a/aries_cloudagent/protocols/present_proof/v1_0/manager.py +++ b/aries_cloudagent/protocols/present_proof/v1_0/manager.py @@ -417,18 +417,18 @@ async def verify_presentation( ) = await indy_handler.process_pres_identifiers(indy_proof["identifiers"]) verifier 
= self._profile.inject(IndyVerifier) - presentation_exchange_record.verified = json.dumps( # tag: needs string value - await verifier.verify_presentation( - dict( - indy_proof_request - ), # copy to avoid changing the proof req in the stored pres exch - indy_proof, - schemas, - cred_defs, - rev_reg_defs, - rev_reg_entries, - ) + (verified_bool, verified_msgs) = await verifier.verify_presentation( + dict( + indy_proof_request + ), # copy to avoid changing the proof req in the stored pres exch + indy_proof, + schemas, + cred_defs, + rev_reg_defs, + rev_reg_entries, ) + presentation_exchange_record.verified = json.dumps(verified_bool) + presentation_exchange_record.verified_msgs = list(set(verified_msgs)) presentation_exchange_record.state = V10PresentationExchange.STATE_VERIFIED async with self._profile.session() as session: diff --git a/aries_cloudagent/protocols/present_proof/v1_0/models/presentation_exchange.py b/aries_cloudagent/protocols/present_proof/v1_0/models/presentation_exchange.py index 296740b5f9..80db45f86c 100644 --- a/aries_cloudagent/protocols/present_proof/v1_0/models/presentation_exchange.py +++ b/aries_cloudagent/protocols/present_proof/v1_0/models/presentation_exchange.py @@ -74,6 +74,7 @@ def __init__( ] = None, # aries message presentation: Union[IndyProof, Mapping] = None, # indy proof verified: str = None, + verified_msgs: list = None, auto_present: bool = False, auto_verify: bool = False, error_msg: str = None, @@ -96,6 +97,7 @@ def __init__( ) self._presentation = IndyProof.serde(presentation) self.verified = verified + self.verified_msgs = verified_msgs self.auto_present = auto_present self.auto_verify = auto_verify self.error_msg = error_msg @@ -208,6 +210,7 @@ def record_value(self) -> Mapping: "auto_verify", "error_msg", "verified", + "verified_msgs", "trace", ) }, @@ -295,6 +298,13 @@ class Meta: example="true", validate=validate.OneOf(["true", "false"]), ) + verified_msgs = fields.List( + fields.Str( + required=False, + 
description="Proof verification warning or error information", + ), + required=False, + ) auto_present = fields.Bool( required=False, description="Prover choice to auto-present proof as verifier requests", diff --git a/aries_cloudagent/protocols/present_proof/v1_0/models/tests/test_record.py b/aries_cloudagent/protocols/present_proof/v1_0/models/tests/test_record.py index a757dcae2b..13d9e5aac3 100644 --- a/aries_cloudagent/protocols/present_proof/v1_0/models/tests/test_record.py +++ b/aries_cloudagent/protocols/present_proof/v1_0/models/tests/test_record.py @@ -113,6 +113,7 @@ async def test_record(self): "auto_verify": False, "error_msg": None, "verified": None, + "verified_msgs": None, "trace": False, } diff --git a/aries_cloudagent/protocols/present_proof/v1_0/tests/test_manager.py b/aries_cloudagent/protocols/present_proof/v1_0/tests/test_manager.py index 310cc4421f..044c21d317 100644 --- a/aries_cloudagent/protocols/present_proof/v1_0/tests/test_manager.py +++ b/aries_cloudagent/protocols/present_proof/v1_0/tests/test_manager.py @@ -317,7 +317,7 @@ async def setUp(self): Verifier = async_mock.MagicMock(IndyVerifier, autospec=True) self.verifier = Verifier() self.verifier.verify_presentation = async_mock.CoroutineMock( - return_value="true" + return_value=("true", []) ) injector.bind_instance(IndyVerifier, self.verifier) diff --git a/aries_cloudagent/protocols/present_proof/v2_0/formats/indy/handler.py b/aries_cloudagent/protocols/present_proof/v2_0/formats/indy/handler.py index 54ef915af0..8f0a3e5057 100644 --- a/aries_cloudagent/protocols/present_proof/v2_0/formats/indy/handler.py +++ b/aries_cloudagent/protocols/present_proof/v2_0/formats/indy/handler.py @@ -329,14 +329,14 @@ async def verify_pres(self, pres_ex_record: V20PresExRecord) -> V20PresExRecord: ) = await indy_handler.process_pres_identifiers(indy_proof["identifiers"]) verifier = self._profile.inject(IndyVerifier) - pres_ex_record.verified = json.dumps( # tag: needs string value - await 
verifier.verify_presentation( - indy_proof_request, - indy_proof, - schemas, - cred_defs, - rev_reg_defs, - rev_reg_entries, - ) + (verified, verified_msgs) = await verifier.verify_presentation( + indy_proof_request, + indy_proof, + schemas, + cred_defs, + rev_reg_defs, + rev_reg_entries, ) + pres_ex_record.verified = json.dumps(verified) + pres_ex_record.verified_msgs = list(set(verified_msgs)) return pres_ex_record diff --git a/aries_cloudagent/protocols/present_proof/v2_0/models/pres_exchange.py b/aries_cloudagent/protocols/present_proof/v2_0/models/pres_exchange.py index f77a53166b..cc314aa9db 100644 --- a/aries_cloudagent/protocols/present_proof/v2_0/models/pres_exchange.py +++ b/aries_cloudagent/protocols/present_proof/v2_0/models/pres_exchange.py @@ -62,6 +62,7 @@ def __init__( pres_request: Union[V20PresRequest, Mapping] = None, # aries message pres: Union[V20Pres, Mapping] = None, # aries message verified: str = None, + verified_msgs: list = None, auto_present: bool = False, auto_verify: bool = False, error_msg: str = None, @@ -80,6 +81,7 @@ def __init__( self._pres_request = V20PresRequest.serde(pres_request) self._pres = V20Pres.serde(pres) self.verified = verified + self.verified_msgs = verified_msgs self.auto_present = auto_present self.auto_verify = auto_verify self.error_msg = error_msg @@ -191,6 +193,7 @@ def record_value(self) -> Mapping: "role", "state", "verified", + "verified_msgs", "auto_present", "auto_verify", "error_msg", @@ -307,6 +310,13 @@ class Meta: example="true", validate=validate.OneOf(["true", "false"]), ) + verified_msgs = fields.List( + fields.Str( + required=False, + description="Proof verification warning or error information", + ), + required=False, + ) auto_present = fields.Bool( required=False, description="Prover choice to auto-present proof as verifier requests", diff --git a/aries_cloudagent/protocols/present_proof/v2_0/models/tests/test_record.py b/aries_cloudagent/protocols/present_proof/v2_0/models/tests/test_record.py 
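In both the v1 and v2 hunks above, `verify_presentation()` now returns a `(verified, messages)` tuple instead of a bare flag; the exchange record stores the flag JSON-encoded (the record tag needs a string value) and the messages de-duplicated. A sketch of the consuming pattern, with a stub verifier standing in for `IndyVerifier`:

```python
import asyncio
import json

class StubVerifier:
    """Stand-in for IndyVerifier: returns (verified, verified_msgs)."""
    async def verify_presentation(self, *args):
        return (False, ["TS_OUT_NRI::18_uuid", "TS_OUT_NRI::18_uuid",
                        "VALUE_ERROR::Encoded representation mismatch"])

class PresExRecord:
    verified = None
    verified_msgs = None

async def verify(record, verifier):
    (verified_bool, verified_msgs) = await verifier.verify_presentation()
    record.verified = json.dumps(verified_bool)      # tag: needs a string value
    record.verified_msgs = list(set(verified_msgs))  # drop duplicate warnings
    return record

record = asyncio.run(verify(PresExRecord(), StubVerifier()))
```

Note the matching mock update in the tests: `CoroutineMock(return_value=("true", []))`.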
index 529b72bb6a..c22a6ff23b 100644 --- a/aries_cloudagent/protocols/present_proof/v2_0/models/tests/test_record.py +++ b/aries_cloudagent/protocols/present_proof/v2_0/models/tests/test_record.py @@ -110,6 +110,7 @@ async def test_record(self): "state": "state", "pres_proposal": pres_proposal.serialize(), "verified": "false", + "verified_msgs": None, "auto_present": True, "auto_verify": False, "error_msg": "error", diff --git a/aries_cloudagent/protocols/present_proof/v2_0/tests/test_manager.py b/aries_cloudagent/protocols/present_proof/v2_0/tests/test_manager.py index 0e352c51e2..9081c61946 100644 --- a/aries_cloudagent/protocols/present_proof/v2_0/tests/test_manager.py +++ b/aries_cloudagent/protocols/present_proof/v2_0/tests/test_manager.py @@ -476,7 +476,7 @@ async def setUp(self): Verifier = async_mock.MagicMock(IndyVerifier, autospec=True) self.verifier = Verifier() self.verifier.verify_presentation = async_mock.CoroutineMock( - return_value="true" + return_value=("true", []) ) injector.bind_instance(IndyVerifier, self.verifier) diff --git a/aries_cloudagent/resolver/__init__.py b/aries_cloudagent/resolver/__init__.py index 51bef53657..e4b82549d7 100644 --- a/aries_cloudagent/resolver/__init__.py +++ b/aries_cloudagent/resolver/__init__.py @@ -5,30 +5,30 @@ from ..config.injection_context import InjectionContext from ..config.provider import ClassProvider -from .did_resolver_registry import DIDResolverRegistry +from ..resolver.did_resolver import DIDResolver LOGGER = logging.getLogger(__name__) async def setup(context: InjectionContext): """Set up default resolvers.""" - registry = context.inject_or(DIDResolverRegistry) + registry = context.inject_or(DIDResolver) if not registry: - LOGGER.warning("No DID Resolver Registry instance found in context") + LOGGER.warning("No DID Resolver instance found in context") return key_resolver = ClassProvider( "aries_cloudagent.resolver.default.key.KeyDIDResolver" ).provide(context.settings, context.injector) await 
key_resolver.setup(context) - registry.register(key_resolver) + registry.register_resolver(key_resolver) if not context.settings.get("ledger.disabled"): indy_resolver = ClassProvider( "aries_cloudagent.resolver.default.indy.IndyDIDResolver" ).provide(context.settings, context.injector) await indy_resolver.setup(context) - registry.register(indy_resolver) + registry.register_resolver(indy_resolver) else: LOGGER.warning("Ledger is not configured, not loading IndyDIDResolver") @@ -36,11 +36,11 @@ async def setup(context: InjectionContext): "aries_cloudagent.resolver.default.web.WebDIDResolver" ).provide(context.settings, context.injector) await web_resolver.setup(context) - registry.register(web_resolver) + registry.register_resolver(web_resolver) if context.settings.get("resolver.universal"): universal_resolver = ClassProvider( "aries_cloudagent.resolver.default.universal.UniversalResolver" ).provide(context.settings, context.injector) await universal_resolver.setup(context) - registry.register(universal_resolver) + registry.register_resolver(universal_resolver) diff --git a/aries_cloudagent/resolver/did_resolver.py b/aries_cloudagent/resolver/did_resolver.py index f57ec47fd9..bf52959043 100644 --- a/aries_cloudagent/resolver/did_resolver.py +++ b/aries_cloudagent/resolver/did_resolver.py @@ -8,7 +8,7 @@ from datetime import datetime from itertools import chain import logging -from typing import Sequence, Tuple, Type, TypeVar, Union +from typing import List, Sequence, Tuple, Type, TypeVar, Union from pydid import DID, DIDError, DIDUrl, Resource, NonconformantDocument from pydid.doc.doc import IDNotFoundError @@ -22,7 +22,6 @@ ResolutionResult, ResolverError, ) -from .did_resolver_registry import DIDResolverRegistry LOGGER = logging.getLogger(__name__) @@ -33,9 +32,13 @@ class DIDResolver: """did resolver singleton.""" - def __init__(self, registry: DIDResolverRegistry): + def __init__(self, resolvers: List[BaseDIDResolver] = None): """Create DID Resolver.""" - 
self.did_resolver_registry = registry + self.resolvers = resolvers or [] + + def register_resolver(self, resolver: BaseDIDResolver): + """Register a new resolver.""" + self.resolvers.append(resolver) async def _resolve( self, profile: Profile, did: Union[str, DID] @@ -90,7 +93,7 @@ async def _match_did_to_resolver( """ valid_resolvers = [ resolver - for resolver in self.did_resolver_registry.resolvers + for resolver in self.resolvers if await resolver.supports(profile, did) ] native_resolvers = filter(lambda resolver: resolver.native, valid_resolvers) diff --git a/aries_cloudagent/resolver/did_resolver_registry.py b/aries_cloudagent/resolver/did_resolver_registry.py deleted file mode 100644 index 4154454236..0000000000 --- a/aries_cloudagent/resolver/did_resolver_registry.py +++ /dev/null @@ -1,28 +0,0 @@ -"""In memmory storage for registering did resolvers.""" - -import logging -from typing import Sequence - -from .base import BaseDIDResolver - -LOGGER = logging.getLogger(__name__) - - -class DIDResolverRegistry: - """Registry for did resolvers.""" - - def __init__(self): - """Initialize list for did resolvers.""" - self._resolvers = [] - - @property - def resolvers( - self, - ) -> Sequence[BaseDIDResolver]: - """Accessor for a list of all did resolvers.""" - return self._resolvers - - def register(self, resolver) -> None: - """Register a resolver.""" - LOGGER.debug("Registering resolver %s", resolver) - self._resolvers.append(resolver) diff --git a/aries_cloudagent/resolver/tests/test_did_resolver.py b/aries_cloudagent/resolver/tests/test_did_resolver.py index b08480e805..f2c4ba9ab5 100644 --- a/aries_cloudagent/resolver/tests/test_did_resolver.py +++ b/aries_cloudagent/resolver/tests/test_did_resolver.py @@ -18,7 +18,6 @@ ResolverType, ) from ..did_resolver import DIDResolver -from ..did_resolver_registry import DIDResolverRegistry from . 
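With `DIDResolverRegistry` removed, the hunks above have `DIDResolver` own a plain list and expose `register_resolver()`; `_match_did_to_resolver` keeps its native-resolvers-first ordering. A self-contained sketch with a minimal `supports()` check (the real `BaseDIDResolver` subclasses implement their own matching):

```python
import asyncio

class MockResolver:
    """Stand-in for BaseDIDResolver: supports a fixed set of DID methods."""
    def __init__(self, methods, native=False):
        self.methods = methods
        self.native = native

    async def supports(self, profile, did):
        return did.split(":")[1] in self.methods

class DIDResolver:
    def __init__(self, resolvers=None):
        self.resolvers = resolvers or []

    def register_resolver(self, resolver):
        self.resolvers.append(resolver)

    async def _match_did_to_resolver(self, profile, did):
        valid = [r for r in self.resolvers if await r.supports(profile, did)]
        # native resolvers first; each group keeps registration order
        native = [r for r in valid if r.native]
        non_native = [r for r in valid if not r.native]
        return native + non_native

non_native = MockResolver(["sov"], native=False)
native = MockResolver(["sov"], native=True)
resolver = DIDResolver([non_native])
resolver.register_resolver(native)
ordered = asyncio.run(resolver._match_did_to_resolver(None, "did:sov:WgWxqztrNooG92RXvxSTWv"))
```

This mirrors `test_match_did_to_resolver_native_priority`, where the native resolver wins even though it was registered last.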
import DOC @@ -88,10 +87,10 @@ async def _resolve(self, profile, did): @pytest.fixture def resolver(): - did_resolver_registry = DIDResolverRegistry() + did_resolver_registry = [] for method in TEST_DID_METHODS: resolver = MockResolver([method], DIDDocument.deserialize(DOC)) - did_resolver_registry.register(resolver) + did_resolver_registry.append(resolver) return DIDResolver(did_resolver_registry) @@ -101,7 +100,7 @@ def profile(): def test_create_resolver(resolver): - assert len(resolver.did_resolver_registry.resolvers) == len(TEST_DID_METHODS) + assert len(resolver.resolvers) == len(TEST_DID_METHODS) @pytest.mark.asyncio @@ -121,11 +120,9 @@ async def test_match_did_to_resolver_x_not_supported(resolver): @pytest.mark.asyncio async def test_match_did_to_resolver_native_priority(profile): - registry = DIDResolverRegistry() native = MockResolver(["sov"], native=True) non_native = MockResolver(["sov"], native=False) - registry.register(non_native) - registry.register(native) + registry = [non_native, native] resolver = DIDResolver(registry) assert [native, non_native] == await resolver._match_did_to_resolver( profile, TEST_DID0 @@ -134,15 +131,11 @@ async def test_match_did_to_resolver_native_priority(profile): @pytest.mark.asyncio async def test_match_did_to_resolver_registration_order(profile): - registry = DIDResolverRegistry() native1 = MockResolver(["sov"], native=True) - registry.register(native1) native2 = MockResolver(["sov"], native=True) - registry.register(native2) non_native3 = MockResolver(["sov"], native=False) - registry.register(non_native3) native4 = MockResolver(["sov"], native=True) - registry.register(native4) + registry = [native1, native2, non_native3, native4] resolver = DIDResolver(registry) assert [ native1, @@ -200,8 +193,6 @@ async def test_resolve_did_x_not_supported(resolver, profile): async def test_resolve_did_x_not_found(profile): py_did = DID("did:cowsay:EiDahaOGH-liLLdDtTxEAdc8i-cfCz-WUcQdRJheMVNn3A") cowsay_resolver_not_found = 
MockResolver(["cowsay"], resolved=DIDNotFound()) - registry = DIDResolverRegistry() - registry.register(cowsay_resolver_not_found) - resolver = DIDResolver(registry) + resolver = DIDResolver([cowsay_resolver_not_found]) with pytest.raises(DIDNotFound): await resolver.resolve(profile, py_did) diff --git a/aries_cloudagent/resolver/tests/test_did_resolver_registry.py b/aries_cloudagent/resolver/tests/test_did_resolver_registry.py deleted file mode 100644 index dba7afffbd..0000000000 --- a/aries_cloudagent/resolver/tests/test_did_resolver_registry.py +++ /dev/null @@ -1,12 +0,0 @@ -"""Test did resolver registery.""" - -import pytest -import unittest -from ..did_resolver_registry import DIDResolverRegistry - - -def test_create_registry(): - did_resolver_registry = DIDResolverRegistry() - test_resolver = unittest.mock.MagicMock() - did_resolver_registry.register(test_resolver) - assert did_resolver_registry.resolvers == [test_resolver] diff --git a/aries_cloudagent/wallet/did_method.py b/aries_cloudagent/wallet/did_method.py index 82382ea39b..797c26fd20 100644 --- a/aries_cloudagent/wallet/did_method.py +++ b/aries_cloudagent/wallet/did_method.py @@ -47,7 +47,8 @@ def supports_key_type(self, key_type: KeyType) -> bool: """Check whether the current method supports the key type.""" return key_type in self.supported_key_types - def from_metadata(metadata: Mapping) -> "DIDMethod": + @classmethod + def from_metadata(cls, metadata: Mapping) -> "DIDMethod": """Get DID method instance from metadata object. Returns SOV if no metadata was found for backwards compatability. 
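The `did_method.py` hunks above turn `from_metadata`, `from_method`, and `from_did` into proper classmethods; without `@classmethod`, calling one of them on an enum member would swallow the first positional argument as `self`. A trimmed two-member sketch of the pattern:

```python
from enum import Enum

class DIDMethod(Enum):
    SOV = "sov"
    KEY = "key"

    @property
    def method_name(self):
        return self.value

    @classmethod
    def from_method(cls, method):
        """Get DID method instance from the method name."""
        for did_method in cls:
            if method == did_method.method_name:
                return did_method
        return None

    @classmethod
    def from_did(cls, did):
        """Get DID method instance from a DID string."""
        if not did.startswith("did:"):
            # sov has no prefix
            return cls.SOV
        return cls.from_method(did.split(":")[1])

# works both on the class and on a member, thanks to @classmethod
resolved = DIDMethod.from_did("did:key:z6MkpTHR8VNs")
also_resolved = DIDMethod.SOV.from_method("key")
```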
@@ -63,7 +64,8 @@ def from_metadata(metadata: Mapping) -> "DIDMethod": # return default SOV for backward compat return DIDMethod.SOV - def from_method(method: str) -> Optional["DIDMethod"]: + @classmethod + def from_method(cls, method: str) -> Optional["DIDMethod"]: """Get DID method instance from the method name.""" for did_method in DIDMethod: if method == did_method.method_name: @@ -71,7 +73,8 @@ def from_method(method: str) -> Optional["DIDMethod"]: return None - def from_did(did: str) -> "DIDMethod": + @classmethod + def from_did(cls, did: str) -> "DIDMethod": """Get DID method instance from the method name.""" if not did.startswith("did:"): # sov has no prefix diff --git a/aries_cloudagent/wallet/indy.py b/aries_cloudagent/wallet/indy.py index bba9f6ed49..a0e1eddd94 100644 --- a/aries_cloudagent/wallet/indy.py +++ b/aries_cloudagent/wallet/indy.py @@ -1,6 +1,7 @@ """Indy implementation of BaseWallet interface.""" import json +import logging from typing import List, Sequence, Tuple, Union @@ -36,6 +37,8 @@ from .util import b58_to_bytes, bytes_to_b58, bytes_to_b64 +LOGGER = logging.getLogger(__name__) + RECORD_TYPE_CONFIG = "config" RECORD_NAME_PUBLIC_DID = "default_public_did" @@ -720,6 +723,7 @@ async def set_did_endpoint( endpoint_type: the type of the endpoint/service. 
Only endpoint_type 'endpoint' affects local wallet """ + LOGGER.info(f">>> in set_did_endpoint() with {did} and {endpoint}") did_info = await self.get_local_did(did) if did_info.method != DIDMethod.SOV: raise WalletError("Setting DID endpoint is only allowed for did:sov DIDs") @@ -731,6 +735,7 @@ async def set_did_endpoint( metadata[endpoint_type.indy] = endpoint wallet_public_didinfo = await self.get_public_did() + LOGGER.info(f">>> wallet_public_didinfo = {wallet_public_didinfo}") if ( wallet_public_didinfo and wallet_public_didinfo.did == did ) or did_info.metadata.get("posted"): @@ -741,6 +746,7 @@ async def set_did_endpoint( ) if not ledger.read_only: async with ledger: + LOGGER.info(">>> calling update_endpoint_for_did() ...") attrib_def = await ledger.update_endpoint_for_did( did, endpoint, @@ -750,6 +756,7 @@ async def set_did_endpoint( routing_keys=routing_keys, ) if not write_ledger: + LOGGER.info(f">>> returning attrib_def {attrib_def}") return attrib_def await self.replace_local_did_metadata(did, metadata) diff --git a/aries_cloudagent/wallet/routes.py b/aries_cloudagent/wallet/routes.py index 6e73f455ba..8056bbdfd2 100644 --- a/aries_cloudagent/wallet/routes.py +++ b/aries_cloudagent/wallet/routes.py @@ -17,6 +17,7 @@ from ..ledger.error import LedgerConfigError, LedgerError from ..messaging.models.base import BaseModelError from ..messaging.models.openapi import OpenAPISchema +from ..messaging.responder import BaseResponder from ..messaging.valid import ( DID_POSTURE, ENDPOINT, @@ -442,6 +443,17 @@ async def wallet_set_public_did(request: web.BaseRequest): connection_id = request.query.get("conn_id") attrib_def = None + # check if we need to endorse + if is_author_role(context.profile): + # authors cannot write to the ledger + write_ledger = False + create_transaction_for_endorser = True + if not connection_id: + # author has not provided a connection id, so determine which to use + connection_id = await get_endorser_connection_id(context.profile) + 
if not connection_id: + raise web.HTTPBadRequest(reason="No endorser connection found") + wallet = session.inject_or(BaseWallet) if not wallet: raise web.HTTPForbidden(reason="No wallet available") @@ -462,6 +474,7 @@ async def wallet_set_public_did(request: web.BaseRequest): routing_keys = mediation_record.routing_keys try: + LOGGER.info(">>> calling promote_wallet_public_did() from route ...") info, attrib_def = await promote_wallet_public_did( context.profile, context, @@ -534,6 +547,7 @@ async def promote_wallet_public_did( # check if we need to endorse if is_author_role(context.profile): + LOGGER.info(">>> IS author ...") # authors cannot write to the ledger write_ledger = False @@ -542,6 +556,8 @@ async def promote_wallet_public_did( connection_id = await get_endorser_connection_id(context.profile) if not connection_id: raise web.HTTPBadRequest(reason="No endorser connection found") + else: + LOGGER.info(">>> IS NOT author ...") if not write_ledger: try: @@ -577,14 +593,19 @@ async def promote_wallet_public_did( did_info = await wallet.get_local_did(did) info = await wallet.set_public_did(did_info) + LOGGER.info(f">>> did_info = {did_info}") + LOGGER.info(f">>> info = {info}") + if info: # Publish endpoint if necessary endpoint = did_info.metadata.get("endpoint") + LOGGER.info(f">>> endpoint = {endpoint}") if not endpoint: async with session_fn() as session: wallet = session.inject_or(BaseWallet) endpoint = context.settings.get("default_endpoint") + LOGGER.info(f">>> calling wallet.set_did_endpoint() with {endpoint}") attrib_def = await wallet.set_did_endpoint( info.did, endpoint, @@ -594,11 +615,6 @@ async def promote_wallet_public_did( routing_keys=routing_keys, ) - # Commented the below lines as the function set_did_endpoint - # was calling update_endpoint_for_did of ledger - # async with ledger: - # await ledger.update_endpoint_for_did(info.did, endpoint) - # Route the public DID route_manager = profile.inject(RouteManager) await 
route_manager.route_public_did(profile, info.verkey) @@ -808,20 +824,68 @@ async def on_register_nym_event(profile: Profile, event: Event): """Handle any events we need to support.""" # after the nym record is written, promote to wallet public DID + LOGGER.info(f">>> got a NYM event ... {event}") if is_author_role(profile) and profile.context.settings.get_value( "endorser.auto_promote_author_did" ): + LOGGER.info(">>> calling promote_wallet_public_did() ...") did = event.payload["did"] connection_id = event.payload.get("connection_id") try: - await promote_wallet_public_did( + info, attrib_def = await promote_wallet_public_did( profile, profile.context, profile.session, did, connection_id ) - except Exception: + except Exception as err: + # log the error, but continue + LOGGER.exception( + "Error promoting to public DID: %s", + err, + ) + return + + transaction_mgr = TransactionManager(profile) + try: + transaction = await transaction_mgr.create_record( + messages_attach=attrib_def["signed_txn"], connection_id=connection_id + ) + except StorageError as err: # log the error, but continue LOGGER.exception( "Error accepting endorser invitation/configuring endorser connection: %s", + err, ) + return + + # if auto-request, send the request to the endorser + if profile.settings.get_value("endorser.auto_request"): + try: + transaction, transaction_request = await transaction_mgr.create_request( + transaction=transaction, + # TODO see if we need to parameterize these params + # expires_time=expires_time, + # endorser_write_txn=endorser_write_txn, + ) + except (StorageError, TransactionManagerError) as err: + # log the error, but continue + LOGGER.exception( + "Error creating endorser transaction request: %s", + err, + ) + + # TODO not sure how to get outbound_handler in an event ... 
+ # await outbound_handler(transaction_request, connection_id=connection_id) + responder = profile.inject_or(BaseResponder) + if responder: + await responder.send( + transaction_request, + connection_id=connection_id, + ) + else: + LOGGER.warning( + "Configuration has no BaseResponder: cannot update " + "ATTRIB record on DID: %s", + did, + ) async def register(app: web.Application): diff --git a/aries_cloudagent/wallet/util.py b/aries_cloudagent/wallet/util.py index f87e6a53da..942744bca8 100644 --- a/aries_cloudagent/wallet/util.py +++ b/aries_cloudagent/wallet/util.py @@ -102,7 +102,9 @@ def abbr_verkey(full_verkey: str, did: str = None) -> str: DID_EVENT_PREFIX = "acapy::ENDORSE_DID::" +DID_ATTRIB_EVENT_PREFIX = "acapy::ENDORSE_DID_ATTRIB::" EVENT_LISTENER_PATTERN = re.compile(f"^{DID_EVENT_PREFIX}(.*)?$") +ATTRIB_EVENT_LISTENER_PATTERN = re.compile(f"^{DID_ATTRIB_EVENT_PREFIX}(.*)?$") async def notify_endorse_did_event(profile: Profile, did: str, meta_data: dict): @@ -111,3 +113,11 @@ async def notify_endorse_did_event(profile: Profile, did: str, meta_data: dict): DID_EVENT_PREFIX + did, meta_data, ) + + +async def notify_endorse_did_attrib_event(profile: Profile, did: str, meta_data: dict): + """Send notification for a DID ATTRIB post-process event.""" + await profile.notify( + DID_ATTRIB_EVENT_PREFIX + did, + meta_data, + ) diff --git a/demo/features/0453-issue-credential.feature b/demo/features/0453-issue-credential.feature index 0c74f53645..d8b0188604 100644 --- a/demo/features/0453-issue-credential.feature +++ b/demo/features/0453-issue-credential.feature @@ -1,3 +1,4 @@ +@RFC0453 Feature: RFC 0453 Aries agent issue credential @T003-RFC0453 @GHA diff --git a/demo/features/0586-sign-transaction.feature b/demo/features/0586-sign-transaction.feature index e8c85d1e0b..1a66ef69ab 100644 --- a/demo/features/0586-sign-transaction.feature +++ b/demo/features/0586-sign-transaction.feature @@ -1,3 +1,4 @@ +@RFC0586 Feature: RFC 0586 Aries sign (endorse) transactions 
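The `wallet/util.py` hunk above adds a parallel event prefix and listener pattern for ATTRIB endorsement events. The constants below are copied from that hunk; note the two patterns do not cross-match, since `acapy::ENDORSE_DID::` is not a prefix of `acapy::ENDORSE_DID_ATTRIB::` topics:

```python
import re

DID_EVENT_PREFIX = "acapy::ENDORSE_DID::"
DID_ATTRIB_EVENT_PREFIX = "acapy::ENDORSE_DID_ATTRIB::"
EVENT_LISTENER_PATTERN = re.compile(f"^{DID_EVENT_PREFIX}(.*)?$")
ATTRIB_EVENT_LISTENER_PATTERN = re.compile(f"^{DID_ATTRIB_EVENT_PREFIX}(.*)?$")

# topic as notify_endorse_did_attrib_event() would emit it
topic = DID_ATTRIB_EVENT_PREFIX + "WgWxqztrNooG92RXvxSTWv"
attrib_match = ATTRIB_EVENT_LISTENER_PATTERN.match(topic)
did_match = EVENT_LISTENER_PATTERN.match(topic)  # no cross-match: stays None
```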
functions @T001-RFC0586 diff --git a/demo/features/steps/0586-sign-transaction.py b/demo/features/steps/0586-sign-transaction.py index 04b6249598..61702e84a8 100644 --- a/demo/features/steps/0586-sign-transaction.py +++ b/demo/features/steps/0586-sign-transaction.py @@ -34,11 +34,18 @@ def step_impl(context, agent_name, did_role): ) # make the new did the wallet's public did - created_did = agent_container_POST( + published_did = agent_container_POST( agent["agent"], "/wallet/did/public", params={"did": created_did["result"]["did"]}, ) + if "result" in published_did: + # published right away! + pass + elif "txn" in published_did: + # we are an author and need to go through the endorser process + # assume everything works! + async_sleep(3.0) if not "public_dids" in context: context.public_dids = {} diff --git a/demo/runners/agent_container.py b/demo/runners/agent_container.py index 065a3cb4ea..7018ee046f 100644 --- a/demo/runners/agent_container.py +++ b/demo/runners/agent_container.py @@ -331,14 +331,17 @@ async def handle_present_proof(self, message): if referent not in credentials_by_reft: credentials_by_reft[referent] = row + # submit the proof with one unrevealed attribute (the rest revealed) + revealed_flag = False for referent in presentation_request["requested_attributes"]: if referent in credentials_by_reft: revealed[referent] = { "cred_id": credentials_by_reft[referent]["cred_info"][ "referent" ], - "revealed": True, + "revealed": revealed_flag, } + revealed_flag = True else: self_attested[referent] = "my self-attested value" @@ -419,14 +422,17 @@ async def handle_present_proof_v2_0(self, message): if referent not in creds_by_reft: creds_by_reft[referent] = row + # submit the proof with one unrevealed attribute (the rest revealed) + revealed_flag = False for referent in pres_request_indy["requested_attributes"]: if referent in creds_by_reft: revealed[referent] = { "cred_id": creds_by_reft[referent]["cred_info"][ "referent" ], - "revealed": True, + "revealed": revealed_flag, } + 
revealed_flag = True else: self_attested[referent] = "my self-attested value" diff --git a/demo/runners/support/agent.py b/demo/runners/support/agent.py index 07da797d0a..bd82314937 100644 --- a/demo/runners/support/agent.py +++ b/demo/runners/support/agent.py @@ -600,8 +600,10 @@ async def register_or_switch_wallet( if self.endorser_role and self.endorser_role == "author": if endorser_agent: await self.admin_POST("/wallet/did/public?did=" + self.did) + await asyncio.sleep(3.0) else: await self.admin_POST("/wallet/did/public?did=" + self.did) + await asyncio.sleep(3.0) elif cred_type == CRED_FORMAT_JSON_LD: # create did of appropriate type data = {"method": DID_METHOD_KEY, "options": {"key_type": KEY_TYPE_BLS}} diff --git a/docs/assets/endorse-cred-def.png b/docs/assets/endorse-cred-def.png new file mode 100644 index 0000000000..ceb3d2fbb1 Binary files /dev/null and b/docs/assets/endorse-cred-def.png differ diff --git a/docs/assets/endorse-cred-def.puml b/docs/assets/endorse-cred-def.puml new file mode 100644 index 0000000000..a1a78c7772 --- /dev/null +++ b/docs/assets/endorse-cred-def.puml @@ -0,0 +1,75 @@ +@startuml +' List of actors for our use case +actor Admin +participant CredDefRoutes +participant RevocationRoutes +participant IndyRevocation +participant Ledger +participant TransactionManager +participant EventBus +participant OutboundHandler +participant EndorsedTxnHandler +boundary OtherAgent + +' Sequence for writing a new credential definition +Admin --> CredDefRoutes: POST /credential-definitions +group Endorse transaction process +CredDefRoutes --> Ledger: create_and_send_credential_definition() +CredDefRoutes --> TransactionManager: create_record() +CredDefRoutes --> TransactionManager: create_request() +CredDefRoutes --> OutboundHandler: send_outbound_msg() +OutboundHandler --> OtherAgent: send_msg() +OtherAgent --> OtherAgent: endorse_msg() +EndorsedTxnHandler <-- OtherAgent: send_msg() +TransactionManager <-- EndorsedTxnHandler: 
receive_endorse_response() +TransactionManager <-- EndorsedTxnHandler: complete_transaction() +Ledger <-- TransactionManager: txn_submit() +TransactionManager --> TransactionManager: endorsed_txn_post_processing() +TransactionManager --> EventBus: notify_cred_def_event() +end + +' Create the revocation registry once the credential definition is written +CredDefRoutes <-- EventBus: on_cred_def_event() +CredDefRoutes --> IndyRevocation: init_issuer_registry() +IndyRevocation --> EventBus: notify_revocation_reg_init_event() +RevocationRoutes <-- EventBus: on_revocation_registry_init_event() +RevocationRoutes --> RevocationRoutes: generate_tails() +group Endorse transaction process +RevocationRoutes --> Ledger:send_revoc_reg_def() +RevocationRoutes --> TransactionManager: create_record() +RevocationRoutes --> TransactionManager: create_request() +RevocationRoutes --> OutboundHandler: send_outbound_msg() +OutboundHandler --> OtherAgent: send_msg() +OtherAgent --> OtherAgent: endorse_msg() +EndorsedTxnHandler <-- OtherAgent: send_msg() +TransactionManager <-- EndorsedTxnHandler: receive_endorse_response() +TransactionManager <-- EndorsedTxnHandler: complete_transaction() +Ledger <-- TransactionManager: txn_submit() +TransactionManager --> TransactionManager: endorsed_txn_post_processing() +TransactionManager --> EventBus: notify_revocation_reg_endorsed_event() +end + +' Now create the revocation entry (accumulator) +RevocationRoutes <-- EventBus: on_revocation_registry_endorsed_event() +RevocationRoutes --> RevocationRoutes: upload_tails() +RevocationRoutes --> EventBus: notify_revocation_entry_event() +RevocationRoutes <-- EventBus: on_revocation_entry_event() +group Endorse transaction process +RevocationRoutes --> IndyRevocation: send_entry() +IndyRevocation --> Ledger: send_entry() +RevocationRoutes --> TransactionManager: create_record() +RevocationRoutes --> TransactionManager: create_request() +RevocationRoutes --> OutboundHandler: send_outbound_msg() 
+OutboundHandler --> OtherAgent: send_msg()
+OtherAgent --> OtherAgent: endorse_msg()
+EndorsedTxnHandler <-- OtherAgent: send_msg()
+TransactionManager <-- EndorsedTxnHandler: receive_endorse_response()
+TransactionManager <-- EndorsedTxnHandler: complete_transaction()
+Ledger <-- TransactionManager: txn_submit()
+TransactionManager --> TransactionManager: endorsed_txn_post_processing()
+
+' Notify that the revocation entry is completed (no one listens to this notification yet)
+TransactionManager --> EventBus: notify_revocation_entry_endorsed_event()
+end
+
+@enduml
diff --git a/docs/assets/endorse-public-did.png b/docs/assets/endorse-public-did.png
new file mode 100644
index 0000000000..275b4ab6de
Binary files /dev/null and b/docs/assets/endorse-public-did.png differ
diff --git a/docs/assets/endorse-public-did.puml b/docs/assets/endorse-public-did.puml
new file mode 100644
index 0000000000..63de78bb50
--- /dev/null
+++ b/docs/assets/endorse-public-did.puml
@@ -0,0 +1,53 @@
+@startuml
+' List of actors for our use case
+actor Admin
+participant WalletRoutes
+participant IndyWallet
+participant LedgerRoutes
+participant Ledger
+participant TransactionManager
+participant EventBus
+participant OutboundHandler
+participant EndorsedTxnHandler
+boundary OtherAgent
+
+' Sequence for writing a new DID on the ledger (assumes the author already has a DID)
+Admin --> WalletRoutes: POST /wallet/did/create
+Admin --> LedgerRoutes: POST /ledger/register-nym
+group Endorse transaction process
+LedgerRoutes --> Ledger: register_nym()
+LedgerRoutes --> TransactionManager: create_record()
+LedgerRoutes --> TransactionManager: create_request()
+LedgerRoutes --> OutboundHandler: send_outbound_msg()
+OutboundHandler --> OtherAgent: send_msg()
+OtherAgent --> OtherAgent: endorse_msg()
+EndorsedTxnHandler <-- OtherAgent: send_msg()
+TransactionManager <-- EndorsedTxnHandler: receive_endorse_response()
+TransactionManager <-- EndorsedTxnHandler: complete_transaction()
+Ledger <-- TransactionManager: txn_submit()
+TransactionManager --> TransactionManager: endorsed_txn_post_processing()
+TransactionManager --> EventBus: notify_endorse_did_event()
+end
+
+WalletRoutes <-- EventBus: on_register_nym_event()
+WalletRoutes --> WalletRoutes:promote_wallet_public_did()
+WalletRoutes --> IndyWallet:set_public_did()
+group Endorse transaction process
+WalletRoutes --> IndyWallet:set_did_endpoint()
+IndyWallet --> Ledger:update_endpoint_for_did()
+WalletRoutes --> TransactionManager: create_record()
+WalletRoutes --> TransactionManager: create_request()
+WalletRoutes --> OutboundHandler: send_outbound_msg()
+OutboundHandler --> OtherAgent: send_msg()
+OtherAgent --> OtherAgent: endorse_msg()
+EndorsedTxnHandler <-- OtherAgent: send_msg()
+TransactionManager <-- EndorsedTxnHandler: receive_endorse_response()
+TransactionManager <-- EndorsedTxnHandler: complete_transaction()
+Ledger <-- TransactionManager: txn_submit()
+TransactionManager --> TransactionManager: endorsed_txn_post_processing()
+
+' notification that no one is listening to yet
+TransactionManager --> EventBus: notify_endorse_did_attrib_event()
+end
+
+@enduml
diff --git a/docs/assets/endorser-design.png b/docs/assets/endorser-design.png
new file mode 100644
index 0000000000..1c4b9fc555
Binary files /dev/null and b/docs/assets/endorser-design.png differ
diff --git a/docs/assets/endorser-design.puml b/docs/assets/endorser-design.puml
new file mode 100644
index 0000000000..39883ea66b
--- /dev/null
+++ b/docs/assets/endorser-design.puml
@@ -0,0 +1,31 @@
+@startuml
+interface AdminUser
+
+interface OtherAgent
+
+object TransactionRoutes
+
+object TransactionHandlers
+
+AdminUser --> TransactionRoutes: invoke_endpoint()
+
+OtherAgent --> TransactionHandlers: send_message()
+
+object TransactionManager
+
+object Wallet
+
+TransactionManager --> Wallet: manage_records()
+
+TransactionRoutes --> TransactionManager: invoke_api()
+TransactionHandlers --> TransactionManager: handle_msg()
+
+object EventBus
+
+TransactionManager --> EventBus: notify()
+
+interface OtherProtocolRoutes
+
+OtherProtocolRoutes --> EventBus: subscribe()
+EventBus --> OtherProtocolRoutes: notify()
+@enduml