diff --git a/Multiledger.md b/Multiledger.md
new file mode 100644
index 0000000000..30c692c878
--- /dev/null
+++ b/Multiledger.md
@@ -0,0 +1,141 @@
+# Multi-ledger in ACA-Py
+
+This feature enables an ACA-Py agent to use multiple Indy ledgers (both IndySdk and IndyVdr) when resolving a `DID`. For read requests, multiple ledgers are checked in parallel, dynamically, according to the logic detailed in [Read Requests](#read-requests). For write requests, dynamic allocation of the `write_ledger` is not supported. The write ledger can be assigned using `is_write` in the [configuration](#config-properties) or using any of the `--genesis-url`, `--genesis-file`, and `--genesis-transactions` startup (ACA-Py) arguments. If no write ledger is assigned, a `ConfigError` is raised.
+
+More background information, including the problem statement and design (algorithm), can be found [here](https://docs.google.com/document/d/109C_eMsuZnTnYe2OAd02jAts1vC4axwEKIq7_4dnNVA).
+
+## Table of Contents
+
+- [Usage](#usage)
+  - [Example config file:](#example-config-file)
+  - [Config properties](#config-properties)
+- [Multi-ledger Admin API](#multi-ledger-admin-api)
+- [Ledger Selection](#ledger-selection)
+  - [Read Requests](#read-requests)
+    - [For checking ledger in parallel](#for-checking-ledger-in-parallel)
+  - [Write Requests](#write-requests)
+- [Impact on other ACA-Py function](#impact-on-other-aca-py-function)
+
+## Usage
+
+Multi-ledger is disabled by default. You can enable support for multiple ledgers using the `--genesis-transactions-list` startup parameter. This parameter accepts the path to a `YAML` configuration file. For example:
+
+`--genesis-transactions-list ./aries_cloudagent/config/multi_ledger_config.yml`
+
+If `--genesis-transactions-list` is specified, then `--genesis-url`, `--genesis-file`, and `--genesis-transactions` should not be specified. 
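The constraints described above (required `id` and `is_production`, exactly one `is_write` ledger, and at least one genesis source per entry) can be sketched as a small validation routine. This is an illustrative sketch only, not ACA-Py's actual parser; the function name `validate_ledger_config_list` is hypothetical.

```python
# Hypothetical sketch (not ACA-Py's implementation) of validating a parsed
# multi-ledger YAML config list against the rules documented here.

def validate_ledger_config_list(configs: list) -> str:
    """Return the write ledger id, enforcing the documented constraints."""
    genesis_keys = {"genesis_url", "genesis_file", "genesis_transactions"}
    write_ledger_id = None
    for config in configs:
        # `id` and `is_production` are required for every ledger entry
        if "id" not in config or "is_production" not in config:
            raise ValueError("Each ledger needs an id and is_production flag")
        # one of the genesis sources must be present to connect
        if not genesis_keys & set(config):
            raise ValueError(f"No genesis source for ledger {config['id']}")
        if config.get("is_write"):
            # only one ledger may be the write ledger
            if write_ledger_id is not None:
                raise ValueError("Only a single ledger can be is_write")
            write_ledger_id = config["id"]
    if write_ledger_id is None:
        raise ValueError("No write ledger assigned")
    return write_ledger_id


# Mirrors the example config file shown in this document
example = [
    {"id": "localVON", "is_production": False,
     "genesis_url": "http://host.docker.internal:9000/genesis"},
    {"id": "bcorvinTest", "is_production": True, "is_write": True,
     "genesis_url": "http://test.bcovrin.vonx.io/genesis"},
]
print(validate_ledger_config_list(example))  # bcorvinTest
```

In the real implementation the checks live in argument parsing and `load_multiple_genesis_transactions_from_config`; the sketch only groups them in one place for readability.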
+
+### Example config file:
+```
+- id: localVON
+  is_production: false
+  genesis_url: 'http://host.docker.internal:9000/genesis'
+- id: bcorvinTest
+  is_production: true
+  is_write: true
+  genesis_url: 'http://test.bcovrin.vonx.io/genesis'
+```
+
+### Config properties
+For each ledger, the required properties are as follows:
+
+- `id`\*: The id (or name) of the ledger; also used as the pool name if no `pool_name` is provided
+- `is_production`\*: Whether the ledger is a production ledger. This is used by the pool selector algorithm to decide which ledger to use for certain interactions (i.e. prefer production ledgers over non-production ledgers)
+
+For connecting to a ledger, one of the following needs to be specified:
+
+- `genesis_file`: The path to the genesis file to use for connecting to an Indy ledger.
+- `genesis_transactions`: String of genesis transactions to use for connecting to an Indy ledger.
+- `genesis_url`: The url from which to download the genesis transactions to use for connecting to an Indy ledger.
+
+Optional properties:
+- `pool_name`: Name of the Indy pool to be opened
+- `keepalive`: Number of seconds to keep the ledger connection open
+- `socks_proxy`: SOCKS proxy host and port to use when connecting to the ledger
+- `is_write`: Whether the ledger is the write ledger. Only one ledger can be assigned; otherwise a `ConfigError` is raised.
+
+
+## Multi-ledger Admin API
+
+Multi-ledger related actions are grouped under the `ledger` topic in the SwaggerUI, under the `/ledger/multiple` path.
+
+- `/ledger/multiple/config`:
+Returns the multiple ledger configuration currently in use
+- `/ledger/multiple/get-write-ledger`:
+Returns the `ledger_id` of the currently active/set `write_ledger`
+
+## Ledger Selection
+
+### Read Requests
+
+The following process is executed for these functions in ACA-Py:
+1. `get_schema`
+2. `get_credential_definition`
+3. `get_revoc_reg_def`
+4. `get_revoc_reg_entry`
+5. `get_key_for_did`
+6. `get_all_endpoints_for_did`
+7. `get_endpoint_for_did`
+8. `get_nym_role`
+9. 
`get_revoc_reg_delta`
+
+If multiple ledgers are configured, then the `IndyLedgerRequestsExecutor` service extracts the `DID` from the record identifier and executes the [check](#for-checking-ledger-in-parallel) below; otherwise it returns the `BaseLedger` instance.
+
+#### For checking ledger in parallel
+
+- `lookup_did_in_configured_ledgers` function
+  - If the calling function (above) is one of 1-4, then check the `cache` for an applicable `ledger_id` corresponding to the `DID`. If found, return the ledger info; else continue.
+  - Otherwise, launch parallel `_get_ledger_by_did` tasks for each of the configured ledgers.
+  - As these tasks finish, construct `applicable_prod_ledgers` and `applicable_non_prod_ledgers` dictionaries, each with `self_certified` and `non_self_certified` inner dicts that are sorted by the original configuration order (index).
+  - Order/preference for selection: `self_certified` > `production` > `non_production`
+    - Check `production` ledgers where the `DID` is `self_certified`
+    - Check `non_production` ledgers where the `DID` is `self_certified`
+    - Check `production` ledgers where the `DID` is not `self_certified`
+    - Check `non_production` ledgers where the `DID` is not `self_certified`
+  - Return an applicable ledger if found, else raise an exception.
+- `_get_ledger_by_did` function
+  - Build and submit a `GET_NYM` request
+  - Wait up to 10 seconds for a response; on timeout, return `None`
+  - Parse the response
+  - Validate the state proof
+  - Check whether the `DID` is self-certified
+  - Return the ledger info to `lookup_did_in_configured_ledgers`
+
+### Write Requests
+
+On startup, the first configured applicable ledger is assigned as the `write_ledger` [`BaseLedger`]; the selection is dependent on the order (top-down) and on whether the ledger is `production` or `non_production`. For instance, considering this [example configuration](#example-config-file), ledger `bcorvinTest` will be set as the `write_ledger` as it is the topmost `production` ledger. 
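This top-down, production-first startup selection can be sketched in a few lines. The function and variable names here are illustrative, not ACA-Py's actual API:

```python
# Hypothetical sketch of the startup write-ledger selection: prefer the
# topmost production ledger, falling back to the topmost ledger overall.

def select_write_ledger(configs: list) -> dict:
    production = [c for c in configs if c.get("is_production")]
    # first (topmost) production ledger if any, else first configured ledger
    return (production or configs)[0]


configs = [
    {"id": "localVON", "is_production": False},
    {"id": "bcorvinTest", "is_production": True},
]
print(select_write_ledger(configs)["id"])  # bcorvinTest
```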
If no `production` ledgers are included in configuration then the topmost `non_production` ledger is selected. + +## Impact on other ACA-Py function + +There should be no impact/change in functionality to any ACA-Py protocols. + +`IndySdkLedger` was refactored by replacing `wallet: IndySdkWallet` instance variable with `profile: Profile` and accordingly `.aries_cloudagent/indy/credex/verifier`, `.aries_cloudagent/indy/models/pres_preview`, `.aries_cloudagent/indy/sdk/profile.py`, `.aries_cloudagent/indy/sdk/verifier`, `./aries_cloudagent/indy/verifier` were also updated. + +Added `build_and_return_get_nym_request` and `submit_get_nym_request` helper functions to `IndySdkLedger` and `IndyVdrLedger`. + +Best practice/feedback emerging from `Askar session deadlock` issue and `endorser refactoring` PR was also addressed here by not leaving sessions open unnecessarily and changing `context.session` to `context.profile.session`, etc. + +These changes are made here: +- `./aries_cloudagent/ledger/routes.py` +- `./aries_cloudagent/messaging/credential_definitions/routes.py` +- `./aries_cloudagent/messaging/schemas/routes.py` +- `./aries_cloudagent/protocols/actionmenu/v1_0/routes.py` +- `./aries_cloudagent/protocols/actionmenu/v1_0/util.py` +- `./aries_cloudagent/protocols/basicmessage/v1_0/routes.py` +- `./aries_cloudagent/protocols/coordinate_mediation/v1_0/handlers/keylist_handler.py` +- `./aries_cloudagent/protocols/coordinate_mediation/v1_0/routes.py` +- `./aries_cloudagent/protocols/endorse_transaction/v1_0/routes.py` +- `./aries_cloudagent/protocols/introduction/v0_1/handlers/invitation_handler.py` +- `./aries_cloudagent/protocols/introduction/v0_1/routes.py` +- `./aries_cloudagent/protocols/issue_credential/v1_0/handlers/credential_issue_handler.py` +- `./aries_cloudagent/protocols/issue_credential/v1_0/handlers/credential_offer_handler.py` +- `./aries_cloudagent/protocols/issue_credential/v1_0/handlers/credential_proposal_handler.py` +- 
`./aries_cloudagent/protocols/issue_credential/v1_0/handlers/credential_request_handler.py` +- `./aries_cloudagent/protocols/issue_credential/v1_0/routes.py` +- `./aries_cloudagent/protocols/issue_credential/v2_0/routes.py` +- `./aries_cloudagent/protocols/present_proof/v1_0/handlers/presentation_handler.py` +- `./aries_cloudagent/protocols/present_proof/v1_0/handlers/presentation_proposal_handler.py` +- `./aries_cloudagent/protocols/present_proof/v1_0/handlers/presentation_request_handler.py` +- `./aries_cloudagent/protocols/present_proof/v1_0/routes.py` +- `./aries_cloudagent/protocols/trustping/v1_0/routes.py` +- `./aries_cloudagent/resolver/routes.py` +- `./aries_cloudagent/revocation/routes.py` diff --git a/aries_cloudagent/askar/profile.py b/aries_cloudagent/askar/profile.py index e7ecfe20a0..4b72d20a57 100644 --- a/aries_cloudagent/askar/profile.py +++ b/aries_cloudagent/askar/profile.py @@ -64,23 +64,23 @@ def init_ledger_pool(self): if self.settings.get("ledger.disabled"): LOGGER.info("Ledger support is disabled") return - - pool_name = self.settings.get("ledger.pool_name", "default") - keepalive = int(self.settings.get("ledger.keepalive", 5)) - read_only = bool(self.settings.get("ledger.read_only", False)) - socks_proxy = self.settings.get("ledger.socks_proxy") - if read_only: - LOGGER.error("Note: setting ledger to read-only mode") - genesis_transactions = self.settings.get("ledger.genesis_transactions") - cache = self.context.injector.inject_or(BaseCache) - self.ledger_pool = IndyVdrLedgerPool( - pool_name, - keepalive=keepalive, - cache=cache, - genesis_transactions=genesis_transactions, - read_only=read_only, - socks_proxy=socks_proxy, - ) + if self.settings.get("ledger.genesis_transactions"): + pool_name = self.settings.get("ledger.pool_name", "default") + keepalive = int(self.settings.get("ledger.keepalive", 5)) + read_only = bool(self.settings.get("ledger.read_only", False)) + socks_proxy = self.settings.get("ledger.socks_proxy") + if read_only: + 
LOGGER.error("Note: setting ledger to read-only mode") + genesis_transactions = self.settings.get("ledger.genesis_transactions") + cache = self.context.injector.inject_or(BaseCache) + self.ledger_pool = IndyVdrLedgerPool( + pool_name, + keepalive=keepalive, + cache=cache, + genesis_transactions=genesis_transactions, + read_only=read_only, + socks_proxy=socks_proxy, + ) def bind_providers(self): """Initialize the profile-level instance providers.""" @@ -118,6 +118,7 @@ def bind_providers(self): injector.bind_provider( BaseLedger, ClassProvider(IndyVdrLedger, self.ledger_pool, ref(self)) ) + if self.ledger_pool or self.settings.get("ledger.ledger_config_list"): injector.bind_provider( IndyVerifier, ClassProvider( diff --git a/aries_cloudagent/askar/tests/test_profile.py b/aries_cloudagent/askar/tests/test_profile.py index b8b74f0358..f01da0d2fe 100644 --- a/aries_cloudagent/askar/tests/test_profile.py +++ b/aries_cloudagent/askar/tests/test_profile.py @@ -26,6 +26,7 @@ async def test_remove_success(self, AskarOpenStore): context.settings = { "multitenant.wallet_type": "askar-profile", "wallet.askar_profile": profile_id, + "ledger.genesis_transactions": mock.MagicMock(), } askar_profile = AskarProfile(openStore, context) remove_profile_stub = asyncio.Future() diff --git a/aries_cloudagent/commands/provision.py b/aries_cloudagent/commands/provision.py index 8d2f0b7bf6..3c002dc143 100644 --- a/aries_cloudagent/commands/provision.py +++ b/aries_cloudagent/commands/provision.py @@ -7,7 +7,11 @@ from ..config import argparse as arg from ..config.default_context import DefaultContextBuilder from ..config.base import BaseError -from ..config.ledger import get_genesis_transactions, ledger_config +from ..config.ledger import ( + get_genesis_transactions, + ledger_config, + load_multiple_genesis_transactions_from_config, +) from ..config.util import common_config from ..config.wallet import wallet_config from ..protocols.coordinate_mediation.mediation_invite_store import ( @@ 
-36,7 +40,14 @@ async def provision(settings: dict): context = await context_builder.build_context() try: - await get_genesis_transactions(context.settings) + if context.settings.get("ledger.ledger_config_list"): + await load_multiple_genesis_transactions_from_config(context.settings) + if ( + context.settings.get("ledger.genesis_transactions") + or context.settings.get("ledger.genesis_file") + or context.settings.get("ledger.genesis_url") + ): + await get_genesis_transactions(context.settings) root_profile, public_did = await wallet_config(context, provision=True) diff --git a/aries_cloudagent/config/argparse.py b/aries_cloudagent/config/argparse.py index 0a8cc3baa8..cc09002c08 100644 --- a/aries_cloudagent/config/argparse.py +++ b/aries_cloudagent/config/argparse.py @@ -748,6 +748,18 @@ def add_arguments(self, parser: ArgumentParser): "connect to the public (outside of corporate network) ledger pool" ), ) + parser.add_argument( + "--genesis-transactions-list", + type=str, + required=False, + dest="genesis_transactions_list", + metavar="", + env_var="ACAPY_GENESIS_TRANSACTIONS_LIST", + help=( + "Load YAML configuration for connecting to multiple" + " HyperLedger Indy ledgers." 
+ ), + ) def get_settings(self, args: Namespace) -> dict: """Extract ledger settings.""" @@ -755,17 +767,30 @@ def get_settings(self, args: Namespace) -> dict: if args.no_ledger: settings["ledger.disabled"] = True else: + configured = False if args.genesis_url: settings["ledger.genesis_url"] = args.genesis_url + configured = True elif args.genesis_file: settings["ledger.genesis_file"] = args.genesis_file + configured = True elif args.genesis_transactions: settings["ledger.genesis_transactions"] = args.genesis_transactions - else: + configured = True + if args.genesis_transactions_list: + with open(args.genesis_transactions_list, "r") as stream: + txn_config_list = yaml.safe_load(stream) + ledger_config_list = [] + for txn_config in txn_config_list: + ledger_config_list.append(txn_config) + settings["ledger.ledger_config_list"] = ledger_config_list + configured = True + if not configured: raise ArgsParseError( - "One of --genesis-url --genesis-file or --genesis-transactions " - "must be specified (unless --no-ledger is specified to " - "explicitly configure aca-py to run with no ledger)." + "One of --genesis-url --genesis-file, --genesis-transactions " + "or --genesis-transactions-list must be specified (unless " + "--no-ledger is specified to explicitly configure aca-py to" + " run with no ledger)." 
) if args.ledger_pool_name: settings["ledger.pool_name"] = args.ledger_pool_name diff --git a/aries_cloudagent/config/ledger.py b/aries_cloudagent/config/ledger.py index 6a0ca49fad..cdd2d9efc5 100644 --- a/aries_cloudagent/config/ledger.py +++ b/aries_cloudagent/config/ledger.py @@ -4,6 +4,7 @@ import logging import re import sys +import uuid import markdown import prompt_toolkit @@ -56,6 +57,67 @@ async def get_genesis_transactions(settings: Settings) -> str: return txns +async def load_multiple_genesis_transactions_from_config(settings: Settings): + """Fetch genesis transactions for multiple ledger configuration.""" + + ledger_config_list = settings.get("ledger.ledger_config_list") + ledger_txns_list = [] + write_ledger_set = False + for config in ledger_config_list: + txns = None + if "genesis_transactions" in config: + txns = config.get("genesis_transactions") + if not txns: + if "genesis_url" in config: + txns = await fetch_genesis_transactions(config.get("genesis_url")) + elif "genesis_file" in config: + try: + genesis_path = config.get("genesis_file") + LOGGER.info( + "Reading ledger genesis transactions from: %s", genesis_path + ) + with open(genesis_path, "r") as genesis_file: + txns = genesis_file.read() + except IOError as e: + raise ConfigError( + "Error reading ledger genesis transactions" + ) from e + is_write_ledger = ( + False if config.get("is_write") is None else config.get("is_write") + ) + ledger_id = config.get("id") or str(uuid.uuid4()) + if is_write_ledger and write_ledger_set: + raise ConfigError("Only a single ledger can be is_write") + elif is_write_ledger: + write_ledger_set = True + ledger_txns_list.append( + { + "id": ledger_id, + "is_production": ( + True + if config.get("is_production") is None + else config.get("is_production") + ), + "is_write": is_write_ledger, + "genesis_transactions": txns, + "keepalive": int(config.get("keepalive", 5)), + "read_only": bool(config.get("read_only", False)), + "socks_proxy": 
config.get("socks_proxy"), + "pool_name": config.get("pool_name", ledger_id), + } + ) + if not write_ledger_set and not ( + settings.get("ledger.genesis_transactions") + or settings.get("ledger.genesis_file") + or settings.get("ledger.genesis_url") + ): + raise ConfigError( + "No is_write ledger set and no genesis_url," + " genesis_file and genesis_transactions provided." + ) + settings["ledger.ledger_config_list"] = ledger_txns_list + + async def ledger_config( profile: Profile, public_did: str, provision: bool = False ) -> bool: diff --git a/aries_cloudagent/config/tests/test-ledger-args.yaml b/aries_cloudagent/config/tests/test-ledger-args.yaml new file mode 100644 index 0000000000..894a0e1784 --- /dev/null +++ b/aries_cloudagent/config/tests/test-ledger-args.yaml @@ -0,0 +1,32 @@ +- id: sovrinMain + is_production: true + genesis_transactions: + reqSignature: {} + txn: + data: + data: + alias: Node1 + blskey: >- + 4N8aUNHSgjQVgkpm8nhNEfDf6txHznoYREg9kirmJrkivgL4oSEimFF6nsQ6M41QvhM2Z33nves5vfSn9n1UwNFJBYtWVnHYMATn76vLuL3zU88KyeAYcHfsih3He6UHcXDxcaecHVz6jhCYz1P2UZn2bDVruL5wXpehgBfBaLKm3Ba + blskey_pop: >- + RahHYiCvoNCtPTrVtP7nMC5eTYrsUA8WjXbdhNc8debh1agE9bGiJxWBXYNFbnJXoXhWFMvyqhqhRoq737YQemH5ik9oL7R4NTTCz2LEZhkgLJzB3QRQqJyBNyv7acbdHrAT8nQ9UkLbaVL9NBpnWXBTw4LEMePaSHEw66RzPNdAX1 + client_ip: 192.168.65.3 + client_port: 9702 + node_ip: 192.168.65.3 + node_port: 9701 + services: + - VALIDATOR + dest: Gw6pDLhcBcoQesN72qfotTgFa7cbuqZpkX3Xo6pLhPhv + metadata: + from: Th7MpTaRZVRYnPiabds81Y + type: '0' + txnMetadata: + seqNo: 1 + txnId: fea82e10e894419fe2bea7d96296a6d46f50f93f9eeda954ec461b2ed2950b62 + ver: '1' +- id: sovrinStaging + is_production: true + genesis_file: /home/indy/ledger/sandbox/pool_transactions_genesis +- id: sovrinTest + is_production: false + genesis_url: 'http://localhost:9000/genesis' \ No newline at end of file diff --git a/aries_cloudagent/config/tests/test_argparse.py b/aries_cloudagent/config/tests/test_argparse.py index 729b9d859b..753a2ccd66 
100644 --- a/aries_cloudagent/config/tests/test_argparse.py +++ b/aries_cloudagent/config/tests/test_argparse.py @@ -54,6 +54,47 @@ async def test_transport_settings(self): assert settings.get("transport.outbound_configs") == ["http"] assert result.max_outbound_retry == 5 + async def test_get_genesis_transactions_list_with_ledger_selection(self): + """Test multiple ledger support related argument parsing.""" + + parser = argparse.create_argument_parser() + group = argparse.LedgerGroup() + group.add_arguments(parser) + + with async_mock.patch.object(parser, "exit") as exit_parser: + parser.parse_args(["-h"]) + exit_parser.assert_called_once() + + result = parser.parse_args( + [ + "--genesis-transactions-list", + "./aries_cloudagent/config/tests/test-ledger-args.yaml", + ] + ) + + assert ( + result.genesis_transactions_list + == "./aries_cloudagent/config/tests/test-ledger-args.yaml" + ) + + settings = group.get_settings(result) + + assert len(settings.get("ledger.ledger_config_list")) == 3 + assert ( + { + "id": "sovrinStaging", + "is_production": True, + "genesis_file": "/home/indy/ledger/sandbox/pool_transactions_genesis", + } + ) in settings.get("ledger.ledger_config_list") + assert ( + { + "id": "sovrinTest", + "is_production": False, + "genesis_url": "http://localhost:9000/genesis", + } + ) in settings.get("ledger.ledger_config_list") + async def test_outbound_is_required(self): """Test that either -ot or -oq are required""" parser = argparse.create_argument_parser() diff --git a/aries_cloudagent/config/tests/test_ledger.py b/aries_cloudagent/config/tests/test_ledger.py index 07c69aeda4..bb6d2ecaa6 100644 --- a/aries_cloudagent/config/tests/test_ledger.py +++ b/aries_cloudagent/config/tests/test_ledger.py @@ -257,6 +257,391 @@ async def _get_session(): test_module.EndpointType.PROFILE, ) + async def test_load_multiple_genesis_transactions_from_config_a(self): + TEST_GENESIS_TXNS = { + "reqSignature": {}, + "txn": { + "data": { + "data": { + "alias": "Node1", + 
"blskey": "4N8aUNHSgjQVgkpm8nhNEfDf6txHznoYREg9kirmJrkivgL4oSEimFF6nsQ6M41QvhM2Z33nves5vfSn9n1UwNFJBYtWVnHYMATn76vLuL3zU88KyeAYcHfsih3He6UHcXDxcaecHVz6jhCYz1P2UZn2bDVruL5wXpehgBfBaLKm3Ba", + "blskey_pop": "RahHYiCvoNCtPTrVtP7nMC5eTYrsUA8WjXbdhNc8debh1agE9bGiJxWBXYNFbnJXoXhWFMvyqhqhRoq737YQemH5ik9oL7R4NTTCz2LEZhkgLJzB3QRQqJyBNyv7acbdHrAT8nQ9UkLbaVL9NBpnWXBTw4LEMePaSHEw66RzPNdAX1", + "client_ip": "192.168.65.3", + "client_port": 9702, + "node_ip": "192.168.65.3", + "node_port": 9701, + "services": ["VALIDATOR"], + }, + "dest": "Gw6pDLhcBcoQesN72qfotTgFa7cbuqZpkX3Xo6pLhPhv", + }, + "metadata": {"from": "Th7MpTaRZVRYnPiabds81Y"}, + "type": "0", + }, + "txnMetadata": { + "seqNo": 1, + "txnId": "fea82e10e894419fe2bea7d96296a6d46f50f93f9eeda954ec461b2ed2950b62", + }, + "ver": "1", + } + TEST_MULTIPLE_LEDGER_CONFIG_LIST = [ + { + "id": "sovrinMain", + "is_production": True, + "genesis_transactions": TEST_GENESIS_TXNS, + "is_write": True, + "keepalive": 5, + "read_only": False, + "socks_proxy": None, + "pool_name": "sovrinMain", + }, + { + "id": "sovrinStaging", + "is_production": True, + "genesis_transactions": TEST_GENESIS_TXNS, + "is_write": False, + "keepalive": 5, + "read_only": False, + "socks_proxy": None, + "pool_name": "sovrinStaging", + }, + { + "id": "sovrinTest", + "is_production": True, + "genesis_transactions": TEST_GENESIS_TXNS, + "is_write": False, + "keepalive": 5, + "read_only": False, + "socks_proxy": None, + "pool_name": "sovrinTest", + }, + ] + settings = { + "ledger.ledger_config_list": [ + { + "id": "sovrinMain", + "is_production": True, + "is_write": True, + "genesis_transactions": TEST_GENESIS_TXNS, + }, + { + "id": "sovrinStaging", + "is_production": True, + "genesis_file": "/home/indy/ledger/sandbox/pool_transactions_genesis", + }, + { + "id": "sovrinTest", + "is_production": True, + "genesis_url": "http://localhost:9000/genesis", + }, + ], + } + with async_mock.patch.object( + test_module, + "fetch_genesis_transactions", + 
async_mock.CoroutineMock(return_value=TEST_GENESIS_TXNS), + ) as mock_fetch, async_mock.patch( + "builtins.open", async_mock.MagicMock() + ) as mock_open: + mock_open.return_value = async_mock.MagicMock( + __enter__=async_mock.MagicMock( + return_value=async_mock.MagicMock( + read=async_mock.MagicMock(return_value=TEST_GENESIS_TXNS) + ) + ) + ) + await test_module.load_multiple_genesis_transactions_from_config(settings) + self.assertEqual( + settings["ledger.ledger_config_list"], TEST_MULTIPLE_LEDGER_CONFIG_LIST + ) + + async def test_load_multiple_genesis_transactions_from_config_b(self): + TEST_GENESIS_TXNS = { + "reqSignature": {}, + "txn": { + "data": { + "data": { + "alias": "Node1", + "blskey": "4N8aUNHSgjQVgkpm8nhNEfDf6txHznoYREg9kirmJrkivgL4oSEimFF6nsQ6M41QvhM2Z33nves5vfSn9n1UwNFJBYtWVnHYMATn76vLuL3zU88KyeAYcHfsih3He6UHcXDxcaecHVz6jhCYz1P2UZn2bDVruL5wXpehgBfBaLKm3Ba", + "blskey_pop": "RahHYiCvoNCtPTrVtP7nMC5eTYrsUA8WjXbdhNc8debh1agE9bGiJxWBXYNFbnJXoXhWFMvyqhqhRoq737YQemH5ik9oL7R4NTTCz2LEZhkgLJzB3QRQqJyBNyv7acbdHrAT8nQ9UkLbaVL9NBpnWXBTw4LEMePaSHEw66RzPNdAX1", + "client_ip": "192.168.65.3", + "client_port": 9702, + "node_ip": "192.168.65.3", + "node_port": 9701, + "services": ["VALIDATOR"], + }, + "dest": "Gw6pDLhcBcoQesN72qfotTgFa7cbuqZpkX3Xo6pLhPhv", + }, + "metadata": {"from": "Th7MpTaRZVRYnPiabds81Y"}, + "type": "0", + }, + "txnMetadata": { + "seqNo": 1, + "txnId": "fea82e10e894419fe2bea7d96296a6d46f50f93f9eeda954ec461b2ed2950b62", + }, + "ver": "1", + } + TEST_MULTIPLE_LEDGER_CONFIG_LIST = [ + { + "id": "sovrinMain", + "is_production": True, + "genesis_transactions": TEST_GENESIS_TXNS, + "is_write": False, + "keepalive": 5, + "read_only": False, + "socks_proxy": None, + "pool_name": "sovrinMain", + }, + { + "id": "sovrinStaging", + "is_production": True, + "genesis_transactions": TEST_GENESIS_TXNS, + "is_write": False, + "keepalive": 5, + "read_only": False, + "socks_proxy": None, + "pool_name": "sovrinStaging", + }, + { + "id": "sovrinTest", + 
"is_production": True, + "genesis_transactions": TEST_GENESIS_TXNS, + "is_write": False, + "keepalive": 5, + "read_only": False, + "socks_proxy": None, + "pool_name": "sovrinTest", + }, + ] + settings = { + "ledger.ledger_config_list": [ + { + "id": "sovrinMain", + "is_production": True, + "genesis_transactions": TEST_GENESIS_TXNS, + }, + { + "id": "sovrinStaging", + "is_production": True, + "genesis_file": "/home/indy/ledger/sandbox/pool_transactions_genesis", + }, + { + "id": "sovrinTest", + "is_production": True, + "genesis_url": "http://localhost:9001/genesis", + }, + ], + "ledger.genesis_url": "http://localhost:9000/genesis", + } + with async_mock.patch.object( + test_module, + "fetch_genesis_transactions", + async_mock.CoroutineMock(return_value=TEST_GENESIS_TXNS), + ) as mock_fetch, async_mock.patch( + "builtins.open", async_mock.MagicMock() + ) as mock_open: + mock_open.return_value = async_mock.MagicMock( + __enter__=async_mock.MagicMock( + return_value=async_mock.MagicMock( + read=async_mock.MagicMock(return_value=TEST_GENESIS_TXNS) + ) + ) + ) + await test_module.load_multiple_genesis_transactions_from_config(settings) + self.assertEqual( + settings["ledger.ledger_config_list"], TEST_MULTIPLE_LEDGER_CONFIG_LIST + ) + + async def test_load_multiple_genesis_transactions_config_error_a(self): + TEST_GENESIS_TXNS = { + "reqSignature": {}, + "txn": { + "data": { + "data": { + "alias": "Node1", + "blskey": "4N8aUNHSgjQVgkpm8nhNEfDf6txHznoYREg9kirmJrkivgL4oSEimFF6nsQ6M41QvhM2Z33nves5vfSn9n1UwNFJBYtWVnHYMATn76vLuL3zU88KyeAYcHfsih3He6UHcXDxcaecHVz6jhCYz1P2UZn2bDVruL5wXpehgBfBaLKm3Ba", + "blskey_pop": "RahHYiCvoNCtPTrVtP7nMC5eTYrsUA8WjXbdhNc8debh1agE9bGiJxWBXYNFbnJXoXhWFMvyqhqhRoq737YQemH5ik9oL7R4NTTCz2LEZhkgLJzB3QRQqJyBNyv7acbdHrAT8nQ9UkLbaVL9NBpnWXBTw4LEMePaSHEw66RzPNdAX1", + "client_ip": "192.168.65.3", + "client_port": 9702, + "node_ip": "192.168.65.3", + "node_port": 9701, + "services": ["VALIDATOR"], + }, + "dest": 
"Gw6pDLhcBcoQesN72qfotTgFa7cbuqZpkX3Xo6pLhPhv", + }, + "metadata": {"from": "Th7MpTaRZVRYnPiabds81Y"}, + "type": "0", + }, + "txnMetadata": { + "seqNo": 1, + "txnId": "fea82e10e894419fe2bea7d96296a6d46f50f93f9eeda954ec461b2ed2950b62", + }, + "ver": "1", + } + settings = { + "ledger.ledger_config_list": [ + { + "id": "sovrinMain", + "is_production": True, + "genesis_transactions": TEST_GENESIS_TXNS, + }, + { + "id": "sovrinStaging", + "is_production": True, + "genesis_file": "/home/indy/ledger/sandbox/pool_transactions_genesis", + }, + { + "id": "sovrinTest", + "is_production": True, + "genesis_url": "http://localhost:9001/genesis", + }, + ], + } + with async_mock.patch.object( + test_module, + "fetch_genesis_transactions", + async_mock.CoroutineMock(return_value=TEST_GENESIS_TXNS), + ) as mock_fetch, async_mock.patch( + "builtins.open", async_mock.MagicMock() + ) as mock_open: + mock_open.return_value = async_mock.MagicMock( + __enter__=async_mock.MagicMock( + return_value=async_mock.MagicMock( + read=async_mock.MagicMock(return_value=TEST_GENESIS_TXNS) + ) + ) + ) + with self.assertRaises(test_module.ConfigError) as cm: + await test_module.load_multiple_genesis_transactions_from_config( + settings + ) + assert "No is_write ledger set" in str(cm.exception) + + async def test_load_multiple_genesis_transactions_config_error_b(self): + TEST_GENESIS_TXNS = { + "reqSignature": {}, + "txn": { + "data": { + "data": { + "alias": "Node1", + "blskey": "4N8aUNHSgjQVgkpm8nhNEfDf6txHznoYREg9kirmJrkivgL4oSEimFF6nsQ6M41QvhM2Z33nves5vfSn9n1UwNFJBYtWVnHYMATn76vLuL3zU88KyeAYcHfsih3He6UHcXDxcaecHVz6jhCYz1P2UZn2bDVruL5wXpehgBfBaLKm3Ba", + "blskey_pop": "RahHYiCvoNCtPTrVtP7nMC5eTYrsUA8WjXbdhNc8debh1agE9bGiJxWBXYNFbnJXoXhWFMvyqhqhRoq737YQemH5ik9oL7R4NTTCz2LEZhkgLJzB3QRQqJyBNyv7acbdHrAT8nQ9UkLbaVL9NBpnWXBTw4LEMePaSHEw66RzPNdAX1", + "client_ip": "192.168.65.3", + "client_port": 9702, + "node_ip": "192.168.65.3", + "node_port": 9701, + "services": ["VALIDATOR"], + }, + "dest": 
"Gw6pDLhcBcoQesN72qfotTgFa7cbuqZpkX3Xo6pLhPhv", + }, + "metadata": {"from": "Th7MpTaRZVRYnPiabds81Y"}, + "type": "0", + }, + "txnMetadata": { + "seqNo": 1, + "txnId": "fea82e10e894419fe2bea7d96296a6d46f50f93f9eeda954ec461b2ed2950b62", + }, + "ver": "1", + } + settings = { + "ledger.ledger_config_list": [ + { + "id": "sovrinMain", + "is_production": True, + "is_write": True, + "genesis_transactions": TEST_GENESIS_TXNS, + }, + { + "id": "sovrinStaging", + "is_production": True, + "is_write": True, + "genesis_file": "/home/indy/ledger/sandbox/pool_transactions_genesis", + }, + { + "id": "sovrinTest", + "is_production": True, + "genesis_url": "http://localhost:9001/genesis", + }, + ], + "ledger.genesis_url": "http://localhost:9000/genesis", + } + with async_mock.patch.object( + test_module, + "fetch_genesis_transactions", + async_mock.CoroutineMock(return_value=TEST_GENESIS_TXNS), + ) as mock_fetch, async_mock.patch( + "builtins.open", async_mock.MagicMock() + ) as mock_open: + mock_open.return_value = async_mock.MagicMock( + __enter__=async_mock.MagicMock( + return_value=async_mock.MagicMock( + read=async_mock.MagicMock(return_value=TEST_GENESIS_TXNS) + ) + ) + ) + with self.assertRaises(test_module.ConfigError) as cm: + await test_module.load_multiple_genesis_transactions_from_config( + settings + ) + assert "Only a single ledger can be" in str(cm.exception) + + async def test_load_multiple_genesis_transactions_from_config_io_x(self): + TEST_GENESIS_TXNS = { + "reqSignature": {}, + "txn": { + "data": { + "data": { + "alias": "Node1", + "blskey": "4N8aUNHSgjQVgkpm8nhNEfDf6txHznoYREg9kirmJrkivgL4oSEimFF6nsQ6M41QvhM2Z33nves5vfSn9n1UwNFJBYtWVnHYMATn76vLuL3zU88KyeAYcHfsih3He6UHcXDxcaecHVz6jhCYz1P2UZn2bDVruL5wXpehgBfBaLKm3Ba", + "blskey_pop": "RahHYiCvoNCtPTrVtP7nMC5eTYrsUA8WjXbdhNc8debh1agE9bGiJxWBXYNFbnJXoXhWFMvyqhqhRoq737YQemH5ik9oL7R4NTTCz2LEZhkgLJzB3QRQqJyBNyv7acbdHrAT8nQ9UkLbaVL9NBpnWXBTw4LEMePaSHEw66RzPNdAX1", + "client_ip": "192.168.65.3", + "client_port": 9702, + 
"node_ip": "192.168.65.3", + "node_port": 9701, + "services": ["VALIDATOR"], + }, + "dest": "Gw6pDLhcBcoQesN72qfotTgFa7cbuqZpkX3Xo6pLhPhv", + }, + "metadata": {"from": "Th7MpTaRZVRYnPiabds81Y"}, + "type": "0", + }, + "txnMetadata": { + "seqNo": 1, + "txnId": "fea82e10e894419fe2bea7d96296a6d46f50f93f9eeda954ec461b2ed2950b62", + }, + "ver": "1", + } + settings = { + "ledger.ledger_config_list": [ + { + "id": "sovrinMain", + "is_production": True, + "genesis_transactions": TEST_GENESIS_TXNS, + }, + { + "id": "sovrinStaging", + "is_production": True, + "genesis_file": "/home/indy/ledger/sandbox/pool_transactions_genesis", + }, + { + "id": "sovrinTest", + "is_production": True, + "genesis_url": "http://localhost:9000/genesis", + }, + ], + } + with async_mock.patch.object( + test_module, + "fetch_genesis_transactions", + async_mock.CoroutineMock(return_value=TEST_GENESIS_TXNS), + ) as mock_fetch, async_mock.patch( + "builtins.open", async_mock.MagicMock() + ) as mock_open: + mock_open.side_effect = IOError("no read permission") + with self.assertRaises(test_module.ConfigError): + await test_module.load_multiple_genesis_transactions_from_config( + settings + ) + @async_mock.patch("sys.stdout") async def test_ledger_accept_taa_not_tty(self, mock_stdout): mock_stdout.isatty = async_mock.MagicMock(return_value=False) diff --git a/aries_cloudagent/core/conductor.py b/aries_cloudagent/core/conductor.py index 539dc5d4b4..e8bd1a2280 100644 --- a/aries_cloudagent/core/conductor.py +++ b/aries_cloudagent/core/conductor.py @@ -14,13 +14,30 @@ from ..admin.base_server import BaseAdminServer from ..admin.server import AdminResponder, AdminServer +from ..askar.profile import AskarProfile from ..config.default_context import ContextBuilder from ..config.injection_context import InjectionContext -from ..config.ledger import get_genesis_transactions, ledger_config +from ..config.provider import ClassProvider +from ..config.ledger import ( + get_genesis_transactions, + ledger_config, + 
load_multiple_genesis_transactions_from_config, +) from ..config.logging import LoggingConfigurator from ..config.wallet import wallet_config from ..core.profile import Profile +from ..indy.sdk.profile import IndySdkProfile +from ..indy.verifier import IndyVerifier +from ..ledger.base import BaseLedger from ..ledger.error import LedgerConfigError, LedgerTransactionError +from ..ledger.indy import IndySdkLedger +from ..ledger.indy_vdr import IndyVdrLedger +from ..ledger.multiple_ledger.base_manager import ( + BaseMultipleLedgerManager, + MultipleLedgerManagerError, +) +from ..ledger.multiple_ledger.manager_provider import MultiIndyLedgerManagerProvider +from ..ledger.multiple_ledger.ledger_requests_executor import IndyLedgerRequestsExecutor from ..messaging.responder import BaseResponder from ..multitenant.base import BaseMultitenantManager from ..multitenant.manager_provider import MultitenantManagerProvider @@ -93,12 +110,65 @@ async def setup(self): context = await self.context_builder.build_context() # Fetch genesis transactions if necessary - await get_genesis_transactions(context.settings) + if context.settings.get("ledger.ledger_config_list"): + await load_multiple_genesis_transactions_from_config(context.settings) + if ( + context.settings.get("ledger.genesis_transactions") + or context.settings.get("ledger.genesis_file") + or context.settings.get("ledger.genesis_url") + ): + await get_genesis_transactions(context.settings) # Configure the root profile self.root_profile, self.setup_public_did = await wallet_config(context) context = self.root_profile.context + # Multiledger Setup + if ( + context.settings.get("ledger.ledger_config_list") + and len(context.settings.get("ledger.ledger_config_list")) > 0 + ): + context.injector.bind_provider( + BaseMultipleLedgerManager, + MultiIndyLedgerManagerProvider(self.root_profile), + ) + if not (context.settings.get("ledger.genesis_transactions")): + ledger = ( + await context.injector.inject( + 
BaseMultipleLedgerManager + ).get_write_ledger() + )[1] + if isinstance(self.root_profile, AskarProfile) and isinstance( + ledger, IndyVdrLedger + ): + context.injector.bind_instance(BaseLedger, ledger) + context.injector.bind_provider( + IndyVerifier, + ClassProvider( + "aries_cloudagent.indy.credx.verifier.IndyCredxVerifier", + self.root_profile, + ), + ) + elif isinstance(self.root_profile, IndySdkProfile) and isinstance( + ledger, IndySdkLedger + ): + context.injector.bind_instance(BaseLedger, ledger) + context.injector.bind_provider( + IndyVerifier, + ClassProvider( + "aries_cloudagent.indy.sdk.verifier.IndySdkVerifier", + self.root_profile, + ), + ) + else: + raise MultipleLedgerManagerError( + "Multiledger is supported only for Indy SDK or Askar " + "[Indy VDR] profile" + ) + context.injector.bind_instance( + IndyLedgerRequestsExecutor, IndyLedgerRequestsExecutor(self.root_profile) + ) + # Configure the ledger if not await ledger_config( self.root_profile, self.setup_public_did and self.setup_public_did.did @@ -306,15 +376,15 @@ async def start(self) -> None: # mediation connection establishment provided_invite: str = context.settings.get("mediation.invite") - async with self.root_profile.session() as session: - try: + try: + async with self.root_profile.session() as session: invite_store = MediationInviteStore(session.context.inject(BaseStorage)) mediation_invite_record = ( await invite_store.get_mediation_invite_record(provided_invite) ) - except Exception: - LOGGER.exception("Error retrieving mediator invitation") - mediation_invite_record = None + except Exception: + LOGGER.exception("Error retrieving mediator invitation") + mediation_invite_record = None # Accept mediation invitation if one was specified or stored if mediation_invite_record is not None: diff --git a/aries_cloudagent/core/tests/test_conductor.py b/aries_cloudagent/core/tests/test_conductor.py index d703660757..db8c567409 100644 --- a/aries_cloudagent/core/tests/test_conductor.py +++ 
b/aries_cloudagent/core/tests/test_conductor.py @@ -1061,3 +1061,31 @@ async def test_mediator_invitation_x(self, _): await conductor.stop() mock_from_url.assert_called_once_with("test-invite") mock_logger.exception.assert_called_once() + + async def test_setup_ledger_both_multiple_and_base(self): + builder: ContextBuilder = StubContextBuilder(self.test_settings) + builder.update_settings({"ledger.genesis_transactions": "..."}) + builder.update_settings({"ledger.ledger_config_list": [{"...": "..."}]}) + conductor = test_module.Conductor(builder) + + with async_mock.patch.object( + test_module, + "load_multiple_genesis_transactions_from_config", + async_mock.CoroutineMock(), + ) as mock_multiple_genesis_load, async_mock.patch.object( + test_module, "get_genesis_transactions", async_mock.CoroutineMock() + ) as mock_genesis_load: + await conductor.setup() + mock_multiple_genesis_load.assert_called_once() + mock_genesis_load.assert_called_once() + + async def test_setup_ledger_only_base(self): + builder: ContextBuilder = StubContextBuilder(self.test_settings) + builder.update_settings({"ledger.genesis_transactions": "..."}) + conductor = test_module.Conductor(builder) + + with async_mock.patch.object( + test_module, "get_genesis_transactions", async_mock.CoroutineMock() + ) as mock_genesis_load: + await conductor.setup() + mock_genesis_load.assert_called_once() diff --git a/aries_cloudagent/holder/tests/test_routes.py b/aries_cloudagent/holder/tests/test_routes.py index 322e4180bf..edef758fba 100644 --- a/aries_cloudagent/holder/tests/test_routes.py +++ b/aries_cloudagent/holder/tests/test_routes.py @@ -39,6 +39,7 @@ def setUp(self): self.profile = InMemoryProfile.test_profile() self.context = self.profile.context setattr(self.context, "profile", self.profile) + self.request_dict = {"context": self.context} self.request = async_mock.MagicMock( app={}, diff --git a/aries_cloudagent/indy/credx/tests/test_cred_issuance.py 
b/aries_cloudagent/indy/credx/tests/test_cred_issuance.py index e97c059d36..d35962ba35 100644 --- a/aries_cloudagent/indy/credx/tests/test_cred_issuance.py +++ b/aries_cloudagent/indy/credx/tests/test_cred_issuance.py @@ -7,6 +7,9 @@ from ....askar.profile import AskarProfileManager from ....config.injection_context import InjectionContext from ....ledger.base import BaseLedger +from ....ledger.multiple_ledger.ledger_requests_executor import ( + IndyLedgerRequestsExecutor, +) from .. import issuer, holder, verifier @@ -73,6 +76,14 @@ async def setUp(self): }, ) self.issuer_profile._context.injector.bind_instance(BaseLedger, mock_ledger) + self.issuer_profile._context.injector.bind_instance( + IndyLedgerRequestsExecutor, + async_mock.MagicMock( + get_ledger_for_identifier=async_mock.CoroutineMock( + return_value=mock_ledger + ) + ), + ) self.holder = holder.IndyCredxHolder(self.holder_profile) self.issuer = issuer.IndyCredxIssuer(self.issuer_profile) diff --git a/aries_cloudagent/indy/credx/verifier.py b/aries_cloudagent/indy/credx/verifier.py index bd07c18832..c6677cfa7b 100644 --- a/aries_cloudagent/indy/credx/verifier.py +++ b/aries_cloudagent/indy/credx/verifier.py @@ -6,7 +6,6 @@ from indy_credx import CredxError, Presentation from ...core.profile import Profile -from ...ledger.base import BaseLedger from ..verifier import IndyVerifier @@ -24,7 +23,7 @@ def __init__(self, profile: Profile): profile: an active profile instance """ - self.ledger = profile.inject(BaseLedger) + self.profile = profile async def verify_presentation( self, @@ -49,7 +48,7 @@ async def verify_presentation( try: self.non_revoc_intervals(pres_req, pres, credential_definitions) - await self.check_timestamps(self.ledger, pres_req, pres, rev_reg_defs) + await self.check_timestamps(self.profile, pres_req, pres, rev_reg_defs) await self.pre_verify(pres_req, pres) except ValueError as err: LOGGER.error( diff --git a/aries_cloudagent/indy/models/pres_preview.py 
b/aries_cloudagent/indy/models/pres_preview.py index 3b8c0cf70d..28af5b5c1d 100644 --- a/aries_cloudagent/indy/models/pres_preview.py +++ b/aries_cloudagent/indy/models/pres_preview.py @@ -6,7 +6,11 @@ from marshmallow import EXCLUDE, fields -from ...ledger.base import BaseLedger +from ...core.profile import Profile +from ...ledger.multiple_ledger.ledger_requests_executor import ( + GET_CRED_DEF, + IndyLedgerRequestsExecutor, +) from ...messaging.models.base import BaseModel, BaseModelSchema from ...messaging.util import canon from ...messaging.valid import INDY_CRED_DEF_ID, INDY_PREDICATE @@ -291,10 +295,10 @@ def has_attr_spec(self, cred_def_id: str, name: str, value: str) -> bool: async def indy_proof_request( self, + profile: Profile = None, name: str = None, version: str = None, nonce: str = None, - ledger: BaseLedger = None, non_revoc_intervals: Mapping[str, IndyNonRevocationInterval] = None, ) -> dict: """ @@ -326,7 +330,7 @@ def non_revoc(cred_def_id: str) -> IndyNonRevocationInterval: ) epoch_now = int(time()) - + ledger = None proof_req = { "name": name or "proof-request", "version": version or "1.0", @@ -346,6 +350,16 @@ def non_revoc(cred_def_id: str) -> IndyNonRevocationInterval: cd_id = attr_spec.cred_def_id revoc_support = False if cd_id: + if profile: + ledger_exec_inst = profile.inject(IndyLedgerRequestsExecutor) + ledger_info = await ledger_exec_inst.get_ledger_for_identifier( + cd_id, + txn_record_type=GET_CRED_DEF, + ) + if isinstance(ledger_info, tuple): + ledger = ledger_info[1] + else: + ledger = ledger_info if ledger: async with ledger: revoc_support = (await ledger.get_credential_definition(cd_id))[ @@ -397,6 +411,16 @@ def non_revoc(cred_def_id: str) -> IndyNonRevocationInterval: cd_id = pred_spec.cred_def_id revoc_support = False if cd_id: + if profile: + ledger_exec_inst = profile.inject(IndyLedgerRequestsExecutor) + ledger_info = await ledger_exec_inst.get_ledger_for_identifier( + cd_id, + txn_record_type=GET_CRED_DEF, + ) + if 
isinstance(ledger_info, tuple): + ledger = ledger_info[1] + else: + ledger = ledger_info if ledger: async with ledger: revoc_support = (await ledger.get_credential_definition(cd_id))[ diff --git a/aries_cloudagent/indy/models/tests/test_pres_preview.py b/aries_cloudagent/indy/models/tests/test_pres_preview.py index 4e3ffcadb7..9e3fc6d59c 100644 --- a/aries_cloudagent/indy/models/tests/test_pres_preview.py +++ b/aries_cloudagent/indy/models/tests/test_pres_preview.py @@ -8,6 +8,10 @@ from asynctest import TestCase as AsyncTestCase from asynctest import mock as async_mock +from ....core.in_memory import InMemoryProfile +from ....ledger.multiple_ledger.ledger_requests_executor import ( + IndyLedgerRequestsExecutor, +) from ....messaging.util import canon from ....protocols.didcomm_prefix import DIDCommPrefix @@ -391,16 +395,23 @@ async def test_to_indy_proof_request_revo_default_interval(self): copy_indy_proof_req = deepcopy(INDY_PROOF_REQ) pres_preview = deepcopy(PRES_PREVIEW) - mock_ledger = async_mock.MagicMock( - get_credential_definition=async_mock.CoroutineMock( - return_value={"value": {"revocation": {"...": "..."}}} - ) - ) - - indy_proof_req_revo = await pres_preview.indy_proof_request( - **{k: INDY_PROOF_REQ[k] for k in ("name", "version", "nonce")}, - ledger=mock_ledger, + mock_profile = InMemoryProfile.test_profile() + context = mock_profile.context + context.injector.bind_instance( + IndyLedgerRequestsExecutor, IndyLedgerRequestsExecutor(mock_profile) ) + with async_mock.patch.object( + IndyLedgerRequestsExecutor, "get_ledger_for_identifier" + ) as mock_get_ledger: + mock_get_ledger.return_value = async_mock.MagicMock( + get_credential_definition=async_mock.CoroutineMock( + return_value={"value": {"revocation": {"...": "..."}}} + ) + ) + indy_proof_req_revo = await pres_preview.indy_proof_request( + **{k: INDY_PROOF_REQ[k] for k in ("name", "version", "nonce")}, + profile=mock_profile, + ) for uuid, attr_spec in 
indy_proof_req_revo["requested_attributes"].items(): assert set(attr_spec.get("non_revoked", {}).keys()) == {"from", "to"} @@ -423,20 +434,28 @@ async def test_to_indy_proof_request_revo(self): copy_indy_proof_req = deepcopy(INDY_PROOF_REQ) pres_preview = deepcopy(PRES_PREVIEW) - mock_ledger = async_mock.MagicMock( - get_credential_definition=async_mock.CoroutineMock( - return_value={"value": {"revocation": {"...": "..."}}} - ) - ) - - indy_proof_req_revo = await pres_preview.indy_proof_request( - **{k: INDY_PROOF_REQ[k] for k in ("name", "version", "nonce")}, - ledger=mock_ledger, - non_revoc_intervals={ - CD_ID[s_id]: IndyNonRevocationInterval(1234567890, EPOCH_NOW) - for s_id in S_ID - }, + mock_profile = InMemoryProfile.test_profile() + mock_profile.settings["ledger.ledger_config_list"] = [{"id": "test"}] + context = mock_profile.context + context.injector.bind_instance( + IndyLedgerRequestsExecutor, IndyLedgerRequestsExecutor(mock_profile) ) + with async_mock.patch.object( + IndyLedgerRequestsExecutor, "get_ledger_for_identifier" + ) as mock_get_ledger: + mock_get_ledger.return_value = async_mock.MagicMock( + get_credential_definition=async_mock.CoroutineMock( + return_value={"value": {"revocation": {"...": "..."}}} + ) + ) + indy_proof_req_revo = await pres_preview.indy_proof_request( + **{k: INDY_PROOF_REQ[k] for k in ("name", "version", "nonce")}, + profile=mock_profile, + non_revoc_intervals={ + CD_ID[s_id]: IndyNonRevocationInterval(1234567890, EPOCH_NOW) + for s_id in S_ID + }, + ) for uuid, attr_spec in indy_proof_req_revo["requested_attributes"].items(): assert set(attr_spec.get("non_revoked", {}).keys()) == {"from", "to"} diff --git a/aries_cloudagent/indy/sdk/profile.py b/aries_cloudagent/indy/sdk/profile.py index feb607f6a7..24badc748a 100644 --- a/aries_cloudagent/indy/sdk/profile.py +++ b/aries_cloudagent/indy/sdk/profile.py @@ -54,7 +54,8 @@ def init_ledger_pool(self): LOGGER.info("Ledger support is disabled") return - self.ledger_pool = 
self.context.inject(IndySdkLedgerPool, self.settings) + if self.settings.get("ledger.genesis_transactions"): + self.ledger_pool = self.context.inject(IndySdkLedgerPool, self.settings) def bind_providers(self): """Initialize the profile-level instance providers.""" @@ -84,14 +85,15 @@ def bind_providers(self): ) if self.ledger_pool: - ledger = IndySdkLedger(self.ledger_pool, IndySdkWallet(self.opened)) - - injector.bind_instance(BaseLedger, ledger) + injector.bind_provider( + BaseLedger, ClassProvider(IndySdkLedger, self.ledger_pool, ref(self)) + ) + if self.ledger_pool or self.settings.get("ledger.ledger_config_list"): injector.bind_provider( IndyVerifier, ClassProvider( "aries_cloudagent.indy.sdk.verifier.IndySdkVerifier", - ledger, + ref(self), ), ) diff --git a/aries_cloudagent/indy/sdk/tests/test_profile.py b/aries_cloudagent/indy/sdk/tests/test_profile.py index a2f7924bc9..6db4425574 100644 --- a/aries_cloudagent/indy/sdk/tests/test_profile.py +++ b/aries_cloudagent/indy/sdk/tests/test_profile.py @@ -47,6 +47,34 @@ async def test_properties(self, profile): await profile.remove() assert profile.opened is None + def test_settings_genesis_transactions(self): + context = InjectionContext( + settings={"ledger.genesis_transactions": async_mock.MagicMock()} + ) + context.injector.bind_instance(IndySdkLedgerPool, IndySdkLedgerPool("name")) + profile = IndySdkProfile( + IndyOpenWallet( + config=IndyWalletConfig({"name": "test-profile"}), + created=True, + handle=1, + master_secret_id="master-secret", + ), + context, + ) + + def test_settings_ledger_config(self): + context = InjectionContext(settings={"ledger.ledger_config_list": True}) + context.injector.bind_instance(IndySdkLedgerPool, IndySdkLedgerPool("name")) + profile = IndySdkProfile( + IndyOpenWallet( + config=IndyWalletConfig({"name": "test-profile"}), + created=True, + handle=1, + master_secret_id="master-secret", + ), + context, + ) + def test_read_only(self): context = 
InjectionContext(settings={"ledger.read_only": True}) context.injector.bind_instance(IndySdkLedgerPool, IndySdkLedgerPool("name")) diff --git a/aries_cloudagent/indy/sdk/tests/test_verifier.py b/aries_cloudagent/indy/sdk/tests/test_verifier.py index 6bb3940c8e..be02bb8ead 100644 --- a/aries_cloudagent/indy/sdk/tests/test_verifier.py +++ b/aries_cloudagent/indy/sdk/tests/test_verifier.py @@ -6,6 +6,11 @@ from asynctest import mock as async_mock, TestCase as AsyncTestCase from indy.error import IndyError +from ....core.in_memory import InMemoryProfile +from ....ledger.multiple_ledger.ledger_requests_executor import ( + IndyLedgerRequestsExecutor, +) + from ..verifier import IndySdkVerifier @@ -310,7 +315,12 @@ def setUp(self): } ) ) - self.verifier = IndySdkVerifier(self.ledger) + mock_profile = InMemoryProfile.test_profile() + context = mock_profile.context + context.injector.bind_instance( + IndyLedgerRequestsExecutor, IndyLedgerRequestsExecutor(mock_profile) + ) + self.verifier = IndySdkVerifier(mock_profile) assert repr(self.verifier) == "" @async_mock.patch("indy.anoncreds.verifier_verify_proof") @@ -321,7 +331,10 @@ async def test_verify_presentation(self, mock_verify): self.verifier, "pre_verify", async_mock.CoroutineMock() ) as mock_pre_verify, async_mock.patch.object( self.verifier, "non_revoc_intervals", async_mock.MagicMock() - ) as mock_non_revox: + ) as mock_non_revox, async_mock.patch.object( + IndyLedgerRequestsExecutor, "get_ledger_for_identifier" + ) as mock_get_ledger: + mock_get_ledger.return_value = self.ledger INDY_PROOF_REQ_X = deepcopy(INDY_PROOF_REQ_PRED_NAMES) verified = await self.verifier.verify_presentation( INDY_PROOF_REQ_X, @@ -353,7 +366,10 @@ async def test_verify_presentation_x_indy(self, mock_verify): self.verifier, "pre_verify", async_mock.CoroutineMock() ) as mock_pre_verify, async_mock.patch.object( self.verifier, "non_revoc_intervals", async_mock.MagicMock() - ) as mock_non_revox: + ) as mock_non_revox, async_mock.patch.object( + 
IndyLedgerRequestsExecutor, "get_ledger_for_identifier" + ) as mock_get_ledger: + mock_get_ledger.return_value = ("test", self.ledger) verified = await self.verifier.verify_presentation( INDY_PROOF_REQ_NAME, INDY_PROOF_NAME, @@ -376,15 +392,19 @@ async def test_verify_presentation_x_indy(self, mock_verify): @async_mock.patch("indy.anoncreds.verifier_verify_proof") async def test_check_encoding_attr(self, mock_verify): - mock_verify.return_value = True - verified = await self.verifier.verify_presentation( - INDY_PROOF_REQ_NAME, - INDY_PROOF_NAME, - "schemas", - {"LjgpST2rjsoxYegQDRm7EL:3:CL:19:tag": {"value": {}}}, - REV_REG_DEFS, - "rev_reg_entries", - ) + with async_mock.patch.object( + IndyLedgerRequestsExecutor, "get_ledger_for_identifier" + ) as mock_get_ledger: + mock_get_ledger.return_value = self.ledger + mock_verify.return_value = True + verified = await self.verifier.verify_presentation( + INDY_PROOF_REQ_NAME, + INDY_PROOF_NAME, + "schemas", + {"LjgpST2rjsoxYegQDRm7EL:3:CL:19:tag": {"value": {}}}, + REV_REG_DEFS, + "rev_reg_entries", + ) mock_verify.assert_called_once_with( json.dumps(INDY_PROOF_REQ_NAME), @@ -394,7 +414,6 @@ async def test_check_encoding_attr(self, mock_verify): json.dumps(REV_REG_DEFS), json.dumps("rev_reg_entries"), ) - assert verified is True @async_mock.patch("indy.anoncreds.verifier_verify_proof") @@ -403,15 +422,18 @@ async def test_check_encoding_attr_tamper_raw(self, mock_verify): INDY_PROOF_X["requested_proof"]["revealed_attrs"]["19_uuid"][ "raw" ] = "Mock chicken" - - verified = await self.verifier.verify_presentation( - INDY_PROOF_REQ_NAME, - INDY_PROOF_X, - "schemas", - {"LjgpST2rjsoxYegQDRm7EL:3:CL:19:tag": {"value": {}}}, - REV_REG_DEFS, - "rev_reg_entries", - ) + with async_mock.patch.object( + IndyLedgerRequestsExecutor, "get_ledger_for_identifier" + ) as mock_get_ledger: + mock_get_ledger.return_value = ("test", self.ledger) + verified = await self.verifier.verify_presentation( + INDY_PROOF_REQ_NAME, + INDY_PROOF_X, + 
"schemas", + {"LjgpST2rjsoxYegQDRm7EL:3:CL:19:tag": {"value": {}}}, + REV_REG_DEFS, + "rev_reg_entries", + ) mock_verify.assert_not_called() @@ -423,15 +445,18 @@ async def test_check_encoding_attr_tamper_encoded(self, mock_verify): INDY_PROOF_X["requested_proof"]["revealed_attrs"]["19_uuid"][ "encoded" ] = "1234567890" - - verified = await self.verifier.verify_presentation( - INDY_PROOF_REQ_NAME, - INDY_PROOF_X, - "schemas", - {"LjgpST2rjsoxYegQDRm7EL:3:CL:19:tag": {"value": {}}}, - REV_REG_DEFS, - "rev_reg_entries", - ) + with async_mock.patch.object( + IndyLedgerRequestsExecutor, "get_ledger_for_identifier" + ) as mock_get_ledger: + mock_get_ledger.return_value = self.ledger + verified = await self.verifier.verify_presentation( + INDY_PROOF_REQ_NAME, + INDY_PROOF_X, + "schemas", + {"LjgpST2rjsoxYegQDRm7EL:3:CL:19:tag": {"value": {}}}, + REV_REG_DEFS, + "rev_reg_entries", + ) mock_verify.assert_not_called() @@ -439,16 +464,20 @@ async def test_check_encoding_attr_tamper_encoded(self, mock_verify): @async_mock.patch("indy.anoncreds.verifier_verify_proof") async def test_check_pred_names(self, mock_verify): - mock_verify.return_value = True - INDY_PROOF_REQ_X = deepcopy(INDY_PROOF_REQ_PRED_NAMES) - verified = await self.verifier.verify_presentation( - INDY_PROOF_REQ_X, - INDY_PROOF_PRED_NAMES, - "schemas", - {"LjgpST2rjsoxYegQDRm7EL:3:CL:18:tag": {"value": {"revocation": {}}}}, - REV_REG_DEFS, - "rev_reg_entries", - ) + with async_mock.patch.object( + IndyLedgerRequestsExecutor, "get_ledger_for_identifier" + ) as mock_get_ledger: + mock_get_ledger.return_value = ("test", self.ledger) + mock_verify.return_value = True + INDY_PROOF_REQ_X = deepcopy(INDY_PROOF_REQ_PRED_NAMES) + verified = await self.verifier.verify_presentation( + INDY_PROOF_REQ_X, + INDY_PROOF_PRED_NAMES, + "schemas", + {"LjgpST2rjsoxYegQDRm7EL:3:CL:18:tag": {"value": {"revocation": {}}}}, + REV_REG_DEFS, + "rev_reg_entries", + ) mock_verify.assert_called_once_with( json.dumps(INDY_PROOF_REQ_X), @@ 
-469,15 +498,18 @@ async def test_check_pred_names_tamper_pred_value(self, mock_verify): INDY_PROOF_X["proof"]["proofs"][0]["primary_proof"]["ge_proofs"][0][ "predicate" ]["value"] = 0 - - verified = await self.verifier.verify_presentation( - deepcopy(INDY_PROOF_REQ_PRED_NAMES), - INDY_PROOF_X, - "schemas", - {"LjgpST2rjsoxYegQDRm7EL:3:CL:18:tag": {"value": {}}}, - REV_REG_DEFS, - "rev_reg_entries", - ) + with async_mock.patch.object( + IndyLedgerRequestsExecutor, "get_ledger_for_identifier" + ) as mock_get_ledger: + mock_get_ledger.return_value = self.ledger + verified = await self.verifier.verify_presentation( + deepcopy(INDY_PROOF_REQ_PRED_NAMES), + INDY_PROOF_X, + "schemas", + {"LjgpST2rjsoxYegQDRm7EL:3:CL:18:tag": {"value": {}}}, + REV_REG_DEFS, + "rev_reg_entries", + ) mock_verify.assert_not_called() @@ -487,15 +519,18 @@ async def test_check_pred_names_tamper_pred_value(self, mock_verify): async def test_check_pred_names_tamper_pred_req_attr(self, mock_verify): INDY_PROOF_REQ_X = deepcopy(INDY_PROOF_REQ_PRED_NAMES) INDY_PROOF_REQ_X["requested_predicates"]["18_busid_GE_uuid"]["name"] = "dummy" - - verified = await self.verifier.verify_presentation( - INDY_PROOF_REQ_X, - INDY_PROOF_PRED_NAMES, - "schemas", - {"LjgpST2rjsoxYegQDRm7EL:3:CL:18:tag": {"value": {}}}, - REV_REG_DEFS, - "rev_reg_entries", - ) + with async_mock.patch.object( + IndyLedgerRequestsExecutor, "get_ledger_for_identifier" + ) as mock_get_ledger: + mock_get_ledger.return_value = self.ledger + verified = await self.verifier.verify_presentation( + INDY_PROOF_REQ_X, + INDY_PROOF_PRED_NAMES, + "schemas", + {"LjgpST2rjsoxYegQDRm7EL:3:CL:18:tag": {"value": {}}}, + REV_REG_DEFS, + "rev_reg_entries", + ) mock_verify.assert_not_called() @@ -507,15 +542,18 @@ async def test_check_pred_names_tamper_attr_groups(self, mock_verify): INDY_PROOF_X["requested_proof"]["revealed_attr_groups"][ "x_uuid" ] = INDY_PROOF_X["requested_proof"]["revealed_attr_groups"].pop("18_uuid") - - verified = await 
self.verifier.verify_presentation( - deepcopy(INDY_PROOF_REQ_PRED_NAMES), - INDY_PROOF_X, - "schemas", - {"LjgpST2rjsoxYegQDRm7EL:3:CL:18:tag": {"value": {}}}, - REV_REG_DEFS, - "rev_reg_entries", - ) + with async_mock.patch.object( + IndyLedgerRequestsExecutor, "get_ledger_for_identifier" + ) as mock_get_ledger: + mock_get_ledger.return_value = ("test", self.ledger) + verified = await self.verifier.verify_presentation( + deepcopy(INDY_PROOF_REQ_PRED_NAMES), + INDY_PROOF_X, + "schemas", + {"LjgpST2rjsoxYegQDRm7EL:3:CL:18:tag": {"value": {}}}, + REV_REG_DEFS, + "rev_reg_entries", + ) mock_verify.assert_not_called() diff --git a/aries_cloudagent/indy/sdk/verifier.py b/aries_cloudagent/indy/sdk/verifier.py index 6ed066d4b8..f7695a9e0e 100644 --- a/aries_cloudagent/indy/sdk/verifier.py +++ b/aries_cloudagent/indy/sdk/verifier.py @@ -6,7 +6,7 @@ import indy.anoncreds from indy.error import IndyError -from ...ledger.indy import IndySdkLedger +from ...core.profile import Profile from ..verifier import IndyVerifier @@ -16,15 +16,15 @@ class IndySdkVerifier(IndyVerifier): """Indy-SDK verifier implementation.""" - def __init__(self, ledger: IndySdkLedger): + def __init__(self, profile: Profile): """ Initialize an IndyVerifier instance. 
Args: - ledger: ledger instance + profile: Active Profile instance """ - self.ledger = ledger + self.profile = profile async def verify_presentation( self, @@ -49,7 +49,7 @@ async def verify_presentation( try: self.non_revoc_intervals(pres_req, pres, credential_definitions) - await self.check_timestamps(self.ledger, pres_req, pres, rev_reg_defs) + await self.check_timestamps(self.profile, pres_req, pres, rev_reg_defs) await self.pre_verify(pres_req, pres) except ValueError as err: LOGGER.error( diff --git a/aries_cloudagent/indy/tests/test_verifier.py b/aries_cloudagent/indy/tests/test_verifier.py index 05de2c855f..ef10428d8e 100644 --- a/aries_cloudagent/indy/tests/test_verifier.py +++ b/aries_cloudagent/indy/tests/test_verifier.py @@ -6,6 +6,11 @@ from asynctest import TestCase as AsyncTestCase from asynctest import mock as async_mock +from ...core.in_memory import InMemoryProfile +from ...ledger.multiple_ledger.ledger_requests_executor import ( + IndyLedgerRequestsExecutor, +) + from .. 
import verifier as test_module from ..verifier import IndyVerifier @@ -328,26 +333,37 @@ def setUp(self): async def test_check_timestamps(self): # all clear, with timestamps - await self.verifier.check_timestamps( - self.ledger, - INDY_PROOF_REQ_NAME, - INDY_PROOF_NAME, - REV_REG_DEFS, + mock_profile = InMemoryProfile.test_profile() + context = mock_profile.context + context.injector.bind_instance( + IndyLedgerRequestsExecutor, IndyLedgerRequestsExecutor(mock_profile) ) + with async_mock.patch.object( + IndyLedgerRequestsExecutor, "get_ledger_for_identifier" + ) as mock_get_ledger: + mock_get_ledger.return_value = self.ledger + await self.verifier.check_timestamps( + mock_profile, + INDY_PROOF_REQ_NAME, + INDY_PROOF_NAME, + REV_REG_DEFS, + ) # timestamp for irrevocable credential with async_mock.patch.object( - self.ledger, - "get_credential_definition", - async_mock.CoroutineMock(), - ) as mock_get_cred_def: - mock_get_cred_def.return_value = { - "...": "...", - "value": {"no": "revocation"}, - } + IndyLedgerRequestsExecutor, "get_ledger_for_identifier" + ) as mock_get_ledger: + mock_get_ledger.return_value = async_mock.MagicMock( + get_credential_definition=async_mock.CoroutineMock( + return_value={ + "...": "...", + "value": {"no": "revocation"}, + } + ) + ) with self.assertRaises(ValueError) as context: await self.verifier.check_timestamps( - self.ledger, + mock_profile, INDY_PROOF_REQ_NAME, INDY_PROOF_NAME, REV_REG_DEFS, @@ -355,41 +371,45 @@ async def test_check_timestamps(self): assert "Timestamp in presentation identifier #" in str(context.exception) # all clear, no timestamps - proof_x = deepcopy(INDY_PROOF_NAME) - proof_x["identifiers"][0]["timestamp"] = None - proof_x["identifiers"][0]["rev_reg_id"] = None - proof_req_x = deepcopy(INDY_PROOF_REQ_NAME) - proof_req_x.pop("non_revoked") - await self.verifier.check_timestamps( - self.ledger, - proof_req_x, - proof_x, - REV_REG_DEFS, - ) - - # timestamp in the future - proof_req_x = 
deepcopy(INDY_PROOF_REQ_NAME) - proof_x = deepcopy(INDY_PROOF_NAME) - proof_x["identifiers"][0]["timestamp"] = int(time()) + 3600 - with self.assertRaises(ValueError) as context: + with async_mock.patch.object( + IndyLedgerRequestsExecutor, "get_ledger_for_identifier" + ) as mock_get_ledger: + mock_get_ledger.return_value = self.ledger + proof_x = deepcopy(INDY_PROOF_NAME) + proof_x["identifiers"][0]["timestamp"] = None + proof_x["identifiers"][0]["rev_reg_id"] = None + proof_req_x = deepcopy(INDY_PROOF_REQ_NAME) + proof_req_x.pop("non_revoked") await self.verifier.check_timestamps( - self.ledger, + mock_profile, proof_req_x, proof_x, REV_REG_DEFS, ) - assert "in the future" in str(context.exception) - # timestamp in the distant past - proof_x["identifiers"][0]["timestamp"] = 1234567890 - with self.assertRaises(ValueError) as context: - await self.verifier.check_timestamps( - self.ledger, - proof_req_x, - proof_x, - REV_REG_DEFS, - ) - assert "predates rev reg" in str(context.exception) + # timestamp in the future + proof_req_x = deepcopy(INDY_PROOF_REQ_NAME) + proof_x = deepcopy(INDY_PROOF_NAME) + proof_x["identifiers"][0]["timestamp"] = int(time()) + 3600 + with self.assertRaises(ValueError) as context: + await self.verifier.check_timestamps( + mock_profile, + proof_req_x, + proof_x, + REV_REG_DEFS, + ) + assert "in the future" in str(context.exception) + + # timestamp in the distant past + proof_x["identifiers"][0]["timestamp"] = 1234567890 + with self.assertRaises(ValueError) as context: + await self.verifier.check_timestamps( + mock_profile, + proof_req_x, + proof_x, + REV_REG_DEFS, + ) + assert "predates rev reg" in str(context.exception) # timestamp otherwise outside non-revocation interval: log and continue proof_req_x = deepcopy(INDY_PROOF_REQ_NAME) @@ -397,10 +417,13 @@ async def test_check_timestamps(self): proof_x["identifiers"][0]["timestamp"] = 1579890000 with async_mock.patch.object( test_module, "LOGGER", async_mock.MagicMock() - ) as mock_logger: + 
) as mock_logger, async_mock.patch.object( + IndyLedgerRequestsExecutor, "get_ledger_for_identifier" + ) as mock_get_ledger: + mock_get_ledger.return_value = self.ledger pre_logger_calls = mock_logger.info.call_count await self.verifier.check_timestamps( - self.ledger, + mock_profile, proof_req_x, proof_x, REV_REG_DEFS, @@ -411,89 +434,94 @@ async def test_check_timestamps(self): proof_req_x = deepcopy(INDY_PROOF_REQ_NAME) proof_x = deepcopy(INDY_PROOF_NAME) proof_req_x.pop("non_revoked") - with self.assertRaises(ValueError) as context: - await self.verifier.check_timestamps( - self.ledger, - proof_req_x, - proof_x, - REV_REG_DEFS, - ) - assert "superfluous" in str(context.exception) - - # missing revealed attr - proof_req_x = deepcopy(INDY_PROOF_REQ_NAME) - proof_x = deepcopy(INDY_PROOF_NAME) - proof_x["requested_proof"]["revealed_attrs"] = {} - with self.assertRaises(ValueError) as context: - await self.verifier.check_timestamps( - self.ledger, - proof_req_x, - proof_x, - REV_REG_DEFS, + with async_mock.patch.object( + IndyLedgerRequestsExecutor, "get_ledger_for_identifier" + ) as mock_get_ledger: + mock_get_ledger.return_value = self.ledger + with self.assertRaises(ValueError) as context: + await self.verifier.check_timestamps( + mock_profile, + proof_req_x, + proof_x, + REV_REG_DEFS, + ) + assert "superfluous" in str(context.exception) + # missing revealed attr + proof_req_x = deepcopy(INDY_PROOF_REQ_NAME) + proof_x = deepcopy(INDY_PROOF_NAME) + proof_x["requested_proof"]["revealed_attrs"] = {} + with self.assertRaises(ValueError) as context: + await self.verifier.check_timestamps( + mock_profile, + proof_req_x, + proof_x, + REV_REG_DEFS, + ) + assert "Presentation attributes mismatch requested" in str( + context.exception ) - assert "Presentation attributes mismatch requested" in str(context.exception) - # all clear, attribute group ('names') - await self.verifier.check_timestamps( - self.ledger, - INDY_PROOF_REQ_PRED_NAMES, - INDY_PROOF_PRED_NAMES, - 
REV_REG_DEFS, - ) - - # missing revealed attr groups - proof_x = deepcopy(INDY_PROOF_PRED_NAMES) - proof_x["requested_proof"].pop("revealed_attr_groups") - with self.assertRaises(ValueError) as context: + # all clear, attribute group ('names') await self.verifier.check_timestamps( - self.ledger, + mock_profile, INDY_PROOF_REQ_PRED_NAMES, - proof_x, + INDY_PROOF_PRED_NAMES, REV_REG_DEFS, ) - assert "Missing requested attribute group" in str(context.exception) - # superfluous timestamp, attr group - proof_x = deepcopy(INDY_PROOF_PRED_NAMES) - proof_req_x = deepcopy(INDY_PROOF_REQ_PRED_NAMES) - proof_req_x["requested_attributes"]["18_uuid"].pop("non_revoked") - proof_req_x["requested_predicates"]["18_id_GE_uuid"].pop("non_revoked") - proof_req_x["requested_predicates"]["18_busid_GE_uuid"].pop("non_revoked") - with self.assertRaises(ValueError) as context: - await self.verifier.check_timestamps( - self.ledger, - proof_req_x, - proof_x, - REV_REG_DEFS, - ) - assert "is superfluous vs. requested" in str(context.exception) + # missing revealed attr groups + proof_x = deepcopy(INDY_PROOF_PRED_NAMES) + proof_x["requested_proof"].pop("revealed_attr_groups") + with self.assertRaises(ValueError) as context: + await self.verifier.check_timestamps( + mock_profile, + INDY_PROOF_REQ_PRED_NAMES, + proof_x, + REV_REG_DEFS, + ) + assert "Missing requested attribute group" in str(context.exception) - # superfluous timestamp, predicates - proof_x = deepcopy(INDY_PROOF_PRED_NAMES) - proof_req_x = deepcopy(INDY_PROOF_REQ_PRED_NAMES) - proof_req_x["requested_predicates"]["18_id_GE_uuid"].pop("non_revoked") - proof_req_x["requested_predicates"]["18_busid_GE_uuid"].pop("non_revoked") - with self.assertRaises(ValueError) as context: - await self.verifier.check_timestamps( - self.ledger, - proof_req_x, - proof_x, - REV_REG_DEFS, - ) - assert "is superfluous vs. 
requested predicate" in str(context.exception) + # superfluous timestamp, attr group + proof_x = deepcopy(INDY_PROOF_PRED_NAMES) + proof_req_x = deepcopy(INDY_PROOF_REQ_PRED_NAMES) + proof_req_x["requested_attributes"]["18_uuid"].pop("non_revoked") + proof_req_x["requested_predicates"]["18_id_GE_uuid"].pop("non_revoked") + proof_req_x["requested_predicates"]["18_busid_GE_uuid"].pop("non_revoked") + with self.assertRaises(ValueError) as context: + await self.verifier.check_timestamps( + mock_profile, + proof_req_x, + proof_x, + REV_REG_DEFS, + ) + assert "is superfluous vs. requested" in str(context.exception) - # mismatched predicates and requested_predicates - proof_x = deepcopy(INDY_PROOF_PRED_NAMES) - proof_req_x = deepcopy(INDY_PROOF_REQ_PRED_NAMES) - proof_x["requested_proof"]["predicates"] = {} - with self.assertRaises(ValueError) as context: - await self.verifier.check_timestamps( - self.ledger, - proof_req_x, - proof_x, - REV_REG_DEFS, - ) - assert "predicates mismatch requested predicate" in str(context.exception) + # superfluous timestamp, predicates + proof_x = deepcopy(INDY_PROOF_PRED_NAMES) + proof_req_x = deepcopy(INDY_PROOF_REQ_PRED_NAMES) + proof_req_x["requested_predicates"]["18_id_GE_uuid"].pop("non_revoked") + proof_req_x["requested_predicates"]["18_busid_GE_uuid"].pop("non_revoked") + with self.assertRaises(ValueError) as context: + await self.verifier.check_timestamps( + mock_profile, + proof_req_x, + proof_x, + REV_REG_DEFS, + ) + assert "is superfluous vs. 
requested predicate" in str(context.exception) + + # mismatched predicates and requested_predicates + proof_x = deepcopy(INDY_PROOF_PRED_NAMES) + proof_req_x = deepcopy(INDY_PROOF_REQ_PRED_NAMES) + proof_x["requested_proof"]["predicates"] = {} + with self.assertRaises(ValueError) as context: + await self.verifier.check_timestamps( + mock_profile, + proof_req_x, + proof_x, + REV_REG_DEFS, + ) + assert "predicates mismatch requested predicate" in str(context.exception) async def test_non_revoc_intervals(self): big_pres_req = { diff --git a/aries_cloudagent/indy/verifier.py b/aries_cloudagent/indy/verifier.py index 298ed58ae6..743b071c76 100644 --- a/aries_cloudagent/indy/verifier.py +++ b/aries_cloudagent/indy/verifier.py @@ -6,7 +6,11 @@ from time import time from typing import Mapping -from ..ledger.base import BaseLedger +from ..core.profile import Profile +from ..ledger.multiple_ledger.ledger_requests_executor import ( + GET_CRED_DEF, + IndyLedgerRequestsExecutor, +) from ..messaging.util import canon, encode from .models.xform import indy_proof_req2non_revoc_intervals @@ -78,7 +82,7 @@ def non_revoc_intervals(self, pres_req: dict, pres: dict, cred_defs: dict): async def check_timestamps( self, - ledger: BaseLedger, + profile: Profile, pres_req: Mapping, pres: Mapping, rev_reg_defs: Mapping, @@ -97,12 +101,20 @@ async def check_timestamps( """ now = int(time()) non_revoc_intervals = indy_proof_req2non_revoc_intervals(pres_req) - # timestamp for irrevocable credential - async with ledger: - for (index, ident) in enumerate(pres["identifiers"]): - if ident.get("timestamp"): - cred_def_id = ident["cred_def_id"] + for (index, ident) in enumerate(pres["identifiers"]): + if ident.get("timestamp"): + cred_def_id = ident["cred_def_id"] + ledger_exec_inst = profile.inject(IndyLedgerRequestsExecutor) + ledger_info = await ledger_exec_inst.get_ledger_for_identifier( + cred_def_id, + txn_record_type=GET_CRED_DEF, + ) + if isinstance(ledger_info, tuple): + ledger = 
ledger_info[1] + else: + ledger = ledger_info + async with ledger: cred_def = await ledger.get_credential_definition(cred_def_id) if not cred_def["value"].get("revocation"): raise ValueError( diff --git a/aries_cloudagent/ledger/indy.py b/aries_cloudagent/ledger/indy.py index 178035a1b5..21d334a64f 100644 --- a/aries_cloudagent/ledger/indy.py +++ b/aries_cloudagent/ledger/indy.py @@ -7,7 +7,7 @@ from datetime import date, datetime from os import path from time import time -from typing import Sequence, Tuple +from typing import Sequence, Tuple, Optional import indy.ledger import indy.pool @@ -15,15 +15,16 @@ from ..cache.base import BaseCache from ..config.base import BaseInjector, BaseProvider, BaseSettings +from ..core.profile import Profile from ..indy.issuer import DEFAULT_CRED_DEF_TAG, IndyIssuer, IndyIssuerError from ..indy.sdk.error import IndyErrorHandler from ..storage.base import StorageRecord from ..storage.indy import IndySdkStorage from ..utils import sentinel +from ..wallet.base import BaseWallet from ..wallet.did_info import DIDInfo from ..wallet.did_posture import DIDPosture from ..wallet.error import WalletNotFoundError -from ..wallet.indy import IndySdkWallet from ..wallet.util import full_verkey from .base import BaseLedger, Role from .endpoint_type import EndpointType @@ -237,7 +238,7 @@ class IndySdkLedger(BaseLedger): def __init__( self, pool: IndySdkLedgerPool, - wallet: IndySdkWallet, + profile: Profile, ): """ Initialize an IndySdkLedger instance. 
@@ -247,7 +248,7 @@ def __init__( wallet: The IndySdkWallet instance """ self.pool = pool - self.wallet = wallet + self.profile = profile @property def pool_handle(self): @@ -281,25 +282,31 @@ async def __aexit__(self, exc_type, exc, tb): await self.pool.context_close() await super().__aexit__(exc_type, exc, tb) + async def get_wallet_public_did(self) -> DIDInfo: + """Fetch the public DID from the wallet.""" + async with self.profile.session() as session: + wallet = session.inject(BaseWallet) + return await wallet.get_public_did() + async def _endorse( self, request_json: str, ) -> str: - if not self.pool.handle: raise ClosedPoolError( f"Cannot endorse request with closed pool '{self.pool.name}'" ) - public_info = await self.wallet.get_public_did() + public_info = await self.get_wallet_public_did() if not public_info: raise BadLedgerRequestError( "Cannot endorse transaction without a public DID" ) - - endorsed_request_json = await indy.ledger.multi_sign_request( - self.wallet.opened.handle, public_info.did, request_json - ) + async with self.profile.session() as session: + wallet = session.inject(BaseWallet) + endorsed_request_json = await indy.ledger.multi_sign_request( + wallet.opened.handle, public_info.did, request_json + ) return endorsed_request_json async def _submit( @@ -328,7 +335,7 @@ async def _submit( if sign is None or sign: if sign_did is sentinel: - sign_did = await self.wallet.get_public_did() + sign_did = await self.get_wallet_public_did() if sign is None: sign = bool(sign_did) @@ -351,18 +358,20 @@ async def _submit( acceptance["time"], ) ) - if write_ledger: - submit_op = indy.ledger.sign_and_submit_request( - self.pool.handle, - self.wallet.opened.handle, - sign_did.did, - request_json, - ) - else: - # multi-sign, since we expect this to get endorsed later - submit_op = indy.ledger.multi_sign_request( - self.wallet.opened.handle, sign_did.did, request_json - ) + async with self.profile.session() as session: + wallet = session.inject(BaseWallet) 
+ if write_ledger: + submit_op = indy.ledger.sign_and_submit_request( + self.pool.handle, + wallet.opened.handle, + sign_did.did, + request_json, + ) + else: + # multi-sign, since we expect this to get endorsed later + submit_op = indy.ledger.multi_sign_request( + wallet.opened.handle, sign_did.did, request_json + ) else: submit_op = indy.ledger.submit_request(self.pool.handle, request_json) @@ -430,7 +439,7 @@ async def create_and_send_schema( """ - public_info = await self.wallet.get_public_did() + public_info = await self.get_wallet_public_did() if not public_info: raise BadLedgerRequestError("Cannot publish schema without a public DID") @@ -554,7 +563,7 @@ async def fetch_schema_by_id(self, schema_id: str) -> dict: """ - public_info = await self.wallet.get_public_did() + public_info = await self.get_wallet_public_did() public_did = public_info.did if public_info else None with IndyErrorHandler("Exception building schema request", LedgerError): @@ -639,7 +648,7 @@ async def create_and_send_credential_definition( Tuple with cred def id, cred def structure, and whether it's novel """ - public_info = await self.wallet.get_public_did() + public_info = await self.get_wallet_public_did() if not public_info: raise BadLedgerRequestError( "Cannot publish credential definition without a public DID" @@ -667,30 +676,34 @@ async def create_and_send_credential_definition( ) try: - if not await issuer.credential_definition_in_wallet( - credential_definition_id - ): - raise LedgerError( - f"Credential definition {credential_definition_id} is on " - f"ledger {self.pool.name} but not in wallet " - f"{self.wallet.opened.name}" - ) + async with self.profile.session() as session: + wallet = session.inject(BaseWallet) + if not await issuer.credential_definition_in_wallet( + credential_definition_id + ): + raise LedgerError( + f"Credential definition {credential_definition_id} is on " + f"ledger {self.pool.name} but not in wallet " + f"{wallet.opened.name}" + ) except IndyIssuerError 
as err: raise LedgerError(err.message) from err credential_definition_json = json.dumps(ledger_cred_def) break - else: # no such cred def on ledger - try: - if await issuer.credential_definition_in_wallet( - credential_definition_id - ): - raise LedgerError( - f"Credential definition {credential_definition_id} is in " - f"wallet {self.wallet.opened.name} but not on ledger " - f"{self.pool.name}" - ) - except IndyIssuerError as err: - raise LedgerError(err.message) from err + else: # no such cred def on ledger + try: + async with self.profile.session() as session: + wallet = session.inject(BaseWallet) + if await issuer.credential_definition_in_wallet( + credential_definition_id + ): + raise LedgerError( + f"Credential definition {credential_definition_id} is in " + f"wallet {wallet.opened.name} but not on ledger " + f"{self.pool.name}" + ) + except IndyIssuerError as err: + raise LedgerError(err.message) from err # Cred def is neither on ledger nor in wallet: create and send it novel = True @@ -755,7 +768,7 @@ async def fetch_credential_definition(self, credential_definition_id: str) -> di """ - public_info = await self.wallet.get_public_did() + public_info = await self.get_wallet_public_did() public_did = public_info.did if public_info else None with IndyErrorHandler("Exception building cred def request", LedgerError): @@ -812,7 +825,7 @@ async def get_key_for_did(self, did: str) -> str: did: The DID to look up on the ledger or in the cache """ nym = self.did_to_nym(did) - public_info = await self.wallet.get_public_did() + public_info = await self.get_wallet_public_did() public_did = public_info.did if public_info else None with IndyErrorHandler("Exception building nym request", LedgerError): request_json = await indy.ledger.build_get_nym_request(public_did, nym) @@ -827,7 +840,7 @@ async def get_all_endpoints_for_did(self, did: str) -> dict: did: The DID to look up on the ledger or in the cache """ nym = self.did_to_nym(did) - public_info = await 
self.wallet.get_public_did() + public_info = await self.get_wallet_public_did() public_did = public_info.did if public_info else None with IndyErrorHandler("Exception building attribute request", LedgerError): request_json = await indy.ledger.build_get_attrib_request( @@ -856,7 +869,7 @@ async def get_endpoint_for_did( if not endpoint_type: endpoint_type = EndpointType.ENDPOINT nym = self.did_to_nym(did) - public_info = await self.wallet.get_public_did() + public_info = await self.get_wallet_public_did() public_did = public_info.did if public_info else None with IndyErrorHandler("Exception building attribute request", LedgerError): request_json = await indy.ledger.build_get_attrib_request( @@ -931,26 +944,30 @@ async def register_nym( "Error cannot register nym when ledger is in read only mode" ) - public_info = await self.wallet.get_public_did() - if not public_info: - raise WalletNotFoundError( - f"Cannot register NYM to ledger: wallet {self.wallet.opened.name} " - "has no public DID" - ) + public_info = await self.get_wallet_public_did() + async with self.profile.session() as session: + wallet = session.inject(BaseWallet) + if not public_info: + raise WalletNotFoundError( + f"Cannot register NYM to ledger: wallet {wallet.opened.name} " + "has no public DID" + ) - with IndyErrorHandler("Exception building nym request", LedgerError): - request_json = await indy.ledger.build_nym_request( - public_info.did, did, verkey, alias, role - ) - await self._submit(request_json) # let ledger raise on insufficient privilege + with IndyErrorHandler("Exception building nym request", LedgerError): + request_json = await indy.ledger.build_nym_request( + public_info.did, did, verkey, alias, role + ) + await self._submit( + request_json + ) # let ledger raise on insufficient privilege - try: - did_info = await self.wallet.get_local_did(did) - except WalletNotFoundError: - pass # registering another user's NYM - else: - metadata = {**did_info.metadata, **DIDPosture.POSTED.metadata} 
- await self.wallet.replace_local_did_metadata(did, metadata) + try: + did_info = await wallet.get_local_did(did) + except WalletNotFoundError: + pass # registering another user's NYM + else: + metadata = {**did_info.metadata, **DIDPosture.POSTED.metadata} + await wallet.replace_local_did_metadata(did, metadata) async def get_nym_role(self, did: str) -> Role: """ @@ -959,7 +976,7 @@ async def get_nym_role(self, did: str) -> Role: Args: did: DID to query for role on the ledger. """ - public_info = await self.wallet.get_public_did() + public_info = await self.get_wallet_public_did() public_did = public_info.did if public_info else None with IndyErrorHandler("Exception building get-nym request", LedgerError): @@ -980,6 +997,21 @@ def nym_to_did(self, nym: str) -> str: nym = self.did_to_nym(nym) return f"did:sov:{nym}" + async def build_and_return_get_nym_request( + self, submitter_did: Optional[str], target_did: str + ) -> str: + """Build GET_NYM request and return request_json.""" + with IndyErrorHandler("Exception building nym request", LedgerError): + request_json = await indy.ledger.build_get_nym_request( + submitter_did, target_did + ) + return request_json + + async def submit_get_nym_request(self, request_json: str) -> str: + """Submit GET_NYM request to ledger and return response_json.""" + response_json = await self._submit(request_json) + return response_json + async def rotate_public_did_keypair(self, next_seed: str = None) -> None: """ Rotate keypair for public DID: create new key, submit to ledger, update wallet. 
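The recurring change in this file is that `IndySdkLedger` no longer holds a wallet reference: it holds a `Profile`, opens a short-lived session wherever wallet access is needed, and injects `BaseWallet` from that session. The pattern can be sketched with toy stand-ins (`FakeWallet`, `FakeSession`, `FakeProfile`, and `LedgerLike` are illustrative, not ACA-Py classes):

```python
import asyncio

# Toy stand-ins (NOT ACA-Py classes) for BaseWallet / ProfileSession / Profile,
# illustrating the session-per-access pattern applied throughout this diff.
class FakeWallet:
    async def get_public_did(self):
        return "did:sov:WgWxqztrNooG92RXvxSTWv"

class FakeSession:
    async def __aenter__(self):
        return self

    async def __aexit__(self, *exc):
        return False

    def inject(self, base_class):
        # ACA-Py resolves the instance bound to base_class in the session's
        # injection context; here we always hand back the fake wallet.
        return FakeWallet()

class FakeProfile:
    def session(self):
        return FakeSession()

class LedgerLike:
    def __init__(self, profile):
        self.profile = profile

    async def get_wallet_public_did(self):
        # Mirrors the get_wallet_public_did helper added in this diff:
        # open a session, inject the wallet, fetch the public DID.
        async with self.profile.session() as session:
            wallet = session.inject(FakeWallet)
            return await wallet.get_public_did()

did = asyncio.run(LedgerLike(FakeProfile()).get_wallet_public_did())
```

Keeping sessions short-lived is the point of the refactor: the ledger object can outlive any one wallet session, which is what multi-tenant profiles require.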
@@ -988,40 +1020,44 @@ async def rotate_public_did_keypair(self, next_seed: str = None) -> None: next_seed: seed for incoming ed25519 keypair (default random) """ # generate new key - public_info = await self.wallet.get_public_did() + public_info = await self.get_wallet_public_did() public_did = public_info.did - verkey = await self.wallet.rotate_did_keypair_start(public_did, next_seed) - - # submit to ledger (retain role and alias) - nym = self.did_to_nym(public_did) - with IndyErrorHandler("Exception building nym request", LedgerError): - request_json = await indy.ledger.build_get_nym_request(public_did, nym) - - response_json = await self._submit(request_json) - data = json.loads((json.loads(response_json))["result"]["data"]) - if not data: - raise BadLedgerRequestError( - f"Ledger has no public DID for wallet {self.wallet.opened.name}" - ) - seq_no = data["seqNo"] + async with self.profile.session() as session: + wallet = session.inject(BaseWallet) + verkey = await wallet.rotate_did_keypair_start(public_did, next_seed) + + # submit to ledger (retain role and alias) + nym = self.did_to_nym(public_did) + with IndyErrorHandler("Exception building nym request", LedgerError): + request_json = await indy.ledger.build_get_nym_request(public_did, nym) + + response_json = await self._submit(request_json) + data = json.loads((json.loads(response_json))["result"]["data"]) + if not data: + raise BadLedgerRequestError( + f"Ledger has no public DID for wallet {wallet.opened.name}" + ) + seq_no = data["seqNo"] - with IndyErrorHandler("Exception building get-txn request", LedgerError): - txn_req_json = await indy.ledger.build_get_txn_request(None, None, seq_no) + with IndyErrorHandler("Exception building get-txn request", LedgerError): + txn_req_json = await indy.ledger.build_get_txn_request( + None, None, seq_no + ) - txn_resp_json = await self._submit(txn_req_json) - txn_resp = json.loads(txn_resp_json) - txn_resp_data = txn_resp["result"]["data"] - if not txn_resp_data: - 
raise BadLedgerRequestError( - f"Bad or missing ledger NYM transaction for DID {public_did}" - ) - txn_data_data = txn_resp_data["txn"]["data"] - role_token = Role.get(txn_data_data.get("role")).token() - alias = txn_data_data.get("alias") - await self.register_nym(public_did, verkey, alias=alias, role=role_token) + txn_resp_json = await self._submit(txn_req_json) + txn_resp = json.loads(txn_resp_json) + txn_resp_data = txn_resp["result"]["data"] + if not txn_resp_data: + raise BadLedgerRequestError( + f"Bad or missing ledger NYM transaction for DID {public_did}" + ) + txn_data_data = txn_resp_data["txn"]["data"] + role_token = Role.get(txn_data_data.get("role")).token() + alias = txn_data_data.get("alias") + await self.register_nym(public_did, verkey, alias=alias, role=role_token) - # update wallet - await self.wallet.rotate_did_keypair_apply(public_did) + # update wallet + await wallet.rotate_did_keypair_apply(public_did) async def get_txn_author_agreement(self, reload: bool = False) -> dict: """Get the current transaction author agreement, fetching it if necessary.""" @@ -1031,7 +1067,7 @@ async def get_txn_author_agreement(self, reload: bool = False) -> dict: async def fetch_txn_author_agreement(self) -> dict: """Fetch the current AML and TAA from the ledger.""" - public_info = await self.wallet.get_public_did() + public_info = await self.get_wallet_public_did() public_did = public_info.did if public_info else None get_aml_req = await indy.ledger.build_get_acceptance_mechanisms_request( @@ -1057,9 +1093,11 @@ async def fetch_txn_author_agreement(self) -> dict: "taa_required": taa_required, } - def get_indy_storage(self) -> IndySdkStorage: + async def get_indy_storage(self) -> IndySdkStorage: """Get an IndySdkStorage instance for the current wallet.""" - return IndySdkStorage(self.wallet.opened) + async with self.profile.session() as session: + wallet = session.inject(BaseWallet) + return IndySdkStorage(wallet.opened) def taa_rough_timestamp(self) -> int: """Get a 
timestamp accurate to the day. @@ -1086,32 +1124,38 @@ async def accept_txn_author_agreement( json.dumps(acceptance), {"pool_name": self.pool.name}, ) - storage = self.get_indy_storage() + storage = await self.get_indy_storage() await storage.add_record(record) - if self.pool.cache: + async with self.profile.session() as session: + wallet = session.inject(BaseWallet) + if self.pool.cache: + cache_key = ( + TAA_ACCEPTED_RECORD_TYPE + + "::" + + wallet.opened.name + + "::" + + self.pool.name + + "::" + ) + await self.pool.cache.set( + cache_key, acceptance, self.pool.cache_duration + ) + + async def get_latest_txn_author_acceptance(self) -> dict: + """Look up the latest TAA acceptance.""" + async with self.profile.session() as session: + wallet = session.inject(BaseWallet) cache_key = ( TAA_ACCEPTED_RECORD_TYPE + "::" - + self.wallet.opened.name + + wallet.opened.name + "::" + self.pool.name + "::" ) - await self.pool.cache.set(cache_key, acceptance, self.pool.cache_duration) - - async def get_latest_txn_author_acceptance(self) -> dict: - """Look up the latest TAA acceptance.""" - cache_key = ( - TAA_ACCEPTED_RECORD_TYPE - + "::" - + self.wallet.opened.name - + "::" - + self.pool.name - + "::" - ) acceptance = self.pool.cache and await self.pool.cache.get(cache_key) if not acceptance: - storage = self.get_indy_storage() + storage = await self.get_indy_storage() tag_filter = {"pool_name": self.pool.name} found = await storage.find_all_records(TAA_ACCEPTED_RECORD_TYPE, tag_filter) if found: @@ -1128,7 +1172,7 @@ async def get_latest_txn_author_acceptance(self) -> dict: async def get_revoc_reg_def(self, revoc_reg_id: str) -> dict: """Get revocation registry definition by ID; augment with ledger timestamp.""" - public_info = await self.wallet.get_public_did() + public_info = await self.get_wallet_public_did() try: fetch_req = await indy.ledger.build_get_revoc_reg_def_request( public_info and public_info.did, revoc_reg_id @@ -1153,7 +1197,7 @@ async def 
get_revoc_reg_def(self, revoc_reg_id: str) -> dict: async def get_revoc_reg_entry(self, revoc_reg_id: str, timestamp: int): """Get revocation registry entry by revocation registry ID and timestamp.""" - public_info = await self.wallet.get_public_did() + public_info = await self.get_wallet_public_did() with IndyErrorHandler("Exception fetching rev reg entry", LedgerError): try: fetch_req = await indy.ledger.build_get_revoc_reg_request( @@ -1188,7 +1232,7 @@ async def get_revoc_reg_delta( """ if to is None: to = int(time()) - public_info = await self.wallet.get_public_did() + public_info = await self.get_wallet_public_did() with IndyErrorHandler("Exception building rev reg delta request", LedgerError): fetch_req = await indy.ledger.build_get_revoc_reg_delta_request( public_info and public_info.did, @@ -1222,9 +1266,11 @@ async def send_revoc_reg_def( """Publish a revocation registry definition to the ledger.""" # NOTE - issuer DID could be extracted from the revoc_reg_def ID if issuer_did: - did_info = await self.wallet.get_local_did(issuer_did) + async with self.profile.session() as session: + wallet = session.inject(BaseWallet) + did_info = await wallet.get_local_did(issuer_did) else: - did_info = await self.wallet.get_public_did() + did_info = await self.get_wallet_public_did() if not did_info: raise LedgerTransactionError( "No issuer DID found for revocation registry definition" @@ -1255,9 +1301,11 @@ async def send_revoc_reg_entry( ): """Publish a revocation registry entry to the ledger.""" if issuer_did: - did_info = await self.wallet.get_local_did(issuer_did) + async with self.profile.session() as session: + wallet = session.inject(BaseWallet) + did_info = await wallet.get_local_did(issuer_did) else: - did_info = await self.wallet.get_public_did() + did_info = await self.get_wallet_public_did() if not did_info: raise LedgerTransactionError( "No issuer DID found for revocation registry entry" diff --git a/aries_cloudagent/ledger/indy_vdr.py 
b/aries_cloudagent/ledger/indy_vdr.py index c126253de5..d8e8910bc1 100644 --- a/aries_cloudagent/ledger/indy_vdr.py +++ b/aries_cloudagent/ledger/indy_vdr.py @@ -12,7 +12,7 @@ from io import StringIO from pathlib import Path from time import time -from typing import Sequence, Tuple, Union +from typing import Sequence, Tuple, Union, Optional from indy_vdr import ledger, open_pool, Pool, Request, VdrError @@ -1006,6 +1006,21 @@ def nym_to_did(self, nym: str) -> str: nym = self.did_to_nym(nym) return f"did:sov:{nym}" + async def build_and_return_get_nym_request( + self, submitter_did: Optional[str], target_did: str + ) -> str: + """Build GET_NYM request and return request_json.""" + try: + request_json = ledger.build_get_nym_request(submitter_did, target_did) + return request_json + except VdrError as err: + raise LedgerError("Exception when building get-nym request") from err + + async def submit_get_nym_request(self, request_json: str) -> str: + """Submit GET_NYM request to ledger and return response_json.""" + response_json = await self._submit(request_json) + return response_json + async def rotate_public_did_keypair(self, next_seed: str = None) -> None: """ Rotate keypair for public DID: create new key, submit to ledger, update wallet. 
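Both ledger backends now expose the same two-step get-nym flow — build the request JSON, then submit it — so a caller can run each step against whichever ledger was selected. A rough usage sketch with a mocked ledger (`resolve_nym`, the target DID, and the canned JSON strings are illustrative, not part of this PR; `"105"` is the GET_NYM transaction type):

```python
import asyncio
from unittest.mock import AsyncMock

async def resolve_nym(ledger, submitter_did, target_did):
    # Hypothetical caller of the two new methods: build GET_NYM, then submit.
    request_json = await ledger.build_and_return_get_nym_request(
        submitter_did, target_did
    )
    return await ledger.submit_get_nym_request(request_json)

# Stand-in for an IndySdkLedger / IndyVdrLedger instance with canned responses.
ledger = AsyncMock()
ledger.build_and_return_get_nym_request.return_value = '{"operation": {"type": "105"}}'
ledger.submit_get_nym_request.return_value = '{"result": {"data": null}}'

response = asyncio.run(resolve_nym(ledger, None, "WgWxqztrNooG92RXvxSTWv"))
```

Splitting build from submit also makes each half independently testable, which is presumably why both backends expose the pair with identical signatures.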
diff --git a/aries_cloudagent/ledger/merkel_validation/__init__.py b/aries_cloudagent/ledger/merkel_validation/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/aries_cloudagent/ledger/merkel_validation/constants.py b/aries_cloudagent/ledger/merkel_validation/constants.py new file mode 100644 index 0000000000..0b7489eb8c --- /dev/null +++ b/aries_cloudagent/ledger/merkel_validation/constants.py @@ -0,0 +1,57 @@ +"""Constants for State Proof and LeafHash Inclusion Verification.""" +ALL_ATR_KEYS = ["raw", "enc", "hash"] +LAST_SEQ_NO = "lsn" +VALUE = "value" +VAL = "val" +HASH = "hash" +LAST_UPDATE_TIME = "lut" +MARKER_CLAIM_DEF = "3" +MARKER_SCHEMA = "2" +MARKER_ATTR = "1" +MARKER_REVOC_DEF = "4" +MARKER_REVOC_REG_ENTRY_ACCUM = "6" +MARKER_REVOC_REG_ENTRY = "5" +GET_NYM = "105" +GET_ATTR = "104" +GET_SCHEMA = "107" +GET_CLAIM_DEF = "108" +GET_REVOC_REG_DEF = "115" +GET_REVOC_REG_ENTRY = "116" +GET_REVOC_REG_DELTA = "117" +NYM = "1" +ATTRIB = "100" +SCHEMA = "101" +CLAIM_DEF = "102" +REVOC_REG_DEF = "113" +REVOC_REG_ENTRY = "114" +DEST = "dest" +RESULT = "result" +DATA = "data" +SEQ_NO = "seqNo" +TXN_TIME = "txnTime" +TXN_METADATA = "txnMetadata" +TXN_TIME = "txnTime" +TXN = "txn" +NAME = "name" +VERSION = "version" +ATTR_NAMES = "attr_names" +FROM = "from" +METADATA = "metadata" +ORIGIN = "origin" +REF = "ref" +IDENTIFIER = "identifier" +CRED_DEF_ID = "credDefId" +REVOC_DEF_TYPE = "revocDefType" +REVOC_DEF_TYPE_ID = "revocRegDefId" +AUDIT_PATH = "auditPath" +ROOT_HASH = "rootHash" +STATE_PROOF = "state_proof" +STATE_PROOF_FROM = "stateProofFrom" +PROOF_NODES = "proof_nodes" +TAG = "tag" +ACCUM_TO = "accum_to" +ACCUM_FROM = "accum_from" +(NODE_TYPE_BLANK, NODE_TYPE_LEAF, NODE_TYPE_EXTENSION, NODE_TYPE_BRANCH) = tuple( + range(4) +) +BLANK_NODE = b"" diff --git a/aries_cloudagent/ledger/merkel_validation/domain_txn_handler.py b/aries_cloudagent/ledger/merkel_validation/domain_txn_handler.py new file mode 100644 index 0000000000..0640be9126 --- 
/dev/null +++ b/aries_cloudagent/ledger/merkel_validation/domain_txn_handler.py @@ -0,0 +1,484 @@ +"""Utilities for Processing Replies to Domain Read Requests.""" +import base58 +import base64 +import hashlib +import json + +from binascii import hexlify +from copy import deepcopy + +from .utils import audit_path_length +from .constants import ( + ACCUM_FROM, + ACCUM_TO, + ALL_ATR_KEYS, + LAST_SEQ_NO, + VAL, + VALUE, + HASH, + LAST_UPDATE_TIME, + MARKER_CLAIM_DEF, + MARKER_SCHEMA, + MARKER_ATTR, + MARKER_REVOC_DEF, + MARKER_REVOC_REG_ENTRY, + MARKER_REVOC_REG_ENTRY_ACCUM, + GET_NYM, + GET_ATTR, + GET_CLAIM_DEF, + GET_REVOC_REG_DEF, + GET_REVOC_REG_ENTRY, + GET_REVOC_REG_DELTA, + GET_SCHEMA, + NYM, + ATTRIB, + SCHEMA, + CLAIM_DEF, + REVOC_REG_DEF, + REVOC_REG_ENTRY, + DEST, + RESULT, + DATA, + SEQ_NO, + TXN_METADATA, + TXN_TIME, + TXN, + NAME, + VERSION, + ATTR_NAMES, + FROM, + METADATA, + REF, + CRED_DEF_ID, + REVOC_DEF_TYPE, + REVOC_DEF_TYPE_ID, + AUDIT_PATH, + ROOT_HASH, + STATE_PROOF, + STATE_PROOF_FROM, + PROOF_NODES, + TAG, +) + + +def _extract_attr_typed_value(txn_data): + """Check for 'raw', 'enc', 'hash' in ATTR & GET_ATTR, return it's name and value.""" + existing_keys = [key for key in ALL_ATR_KEYS if key in txn_data] + if len(existing_keys) == 0: + raise ValueError( + "ATTR should have one of the following fields: {}".format(ALL_ATR_KEYS) + ) + if len(existing_keys) > 1: + raise ValueError( + "ATTR should have only one of the following fields: {}".format(ALL_ATR_KEYS) + ) + existing_key = existing_keys[0] + return existing_key, txn_data[existing_key] + + +def parse_attr_txn(txn_data): + """Process txn_data and parse attr_txn based on attr_type.""" + attr_type, attr = _extract_attr_typed_value(txn_data) + if attr_type == "raw": + data = json.loads(attr) + re_raw = json.dumps(data) + key, _ = data.popitem() + return attr_type, key, re_raw + if attr_type == "enc": + return attr_type, attr, attr + if attr_type == "hash": + return attr_type, attr, None + + 
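The three ATTR encodings handled by `parse_attr_txn` behave differently: `raw` re-serializes the JSON and takes the single top-level key as the attribute name, `enc` passes the ciphertext through as both key and value, and `hash` carries no value at all. A standalone check (both functions reproduced from the diff above, with short comments in place of the docstrings, so the snippet runs without the module):

```python
import json

ALL_ATR_KEYS = ["raw", "enc", "hash"]

def _extract_attr_typed_value(txn_data):
    # Exactly one of 'raw', 'enc', 'hash' must be present.
    existing_keys = [key for key in ALL_ATR_KEYS if key in txn_data]
    if len(existing_keys) == 0:
        raise ValueError(
            "ATTR should have one of the following fields: {}".format(ALL_ATR_KEYS)
        )
    if len(existing_keys) > 1:
        raise ValueError(
            "ATTR should have only one of the following fields: {}".format(ALL_ATR_KEYS)
        )
    existing_key = existing_keys[0]
    return existing_key, txn_data[existing_key]

def parse_attr_txn(txn_data):
    # Returns (attr_type, attribute name/key, re-serialized value or None).
    attr_type, attr = _extract_attr_typed_value(txn_data)
    if attr_type == "raw":
        data = json.loads(attr)
        re_raw = json.dumps(data)
        key, _ = data.popitem()
        return attr_type, key, re_raw
    if attr_type == "enc":
        return attr_type, attr, attr
    if attr_type == "hash":
        return attr_type, attr, None

# raw: the attribute name comes from the single top-level JSON key
raw_result = parse_attr_txn({"raw": '{"endpoint": {"ha": "127.0.0.1:5555"}}'})
# enc: the encrypted blob doubles as both key and value
enc_result = parse_attr_txn({"enc": "ciphertext"})
# hash: only the digest is stored; there is no value to re-serialize
hash_result = parse_attr_txn({"hash": "digest"})
```

The `None` value in the `hash` case is what makes `prepare_attr_for_state` store an empty string instead of a hashed value for hash-only attributes.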
+def encode_state_value(value, seqNo, txnTime): + """Return encoded state value.""" + return json.dumps({LAST_SEQ_NO: seqNo, LAST_UPDATE_TIME: txnTime, VAL: value}) + + +def decode_state_value(encoded_value): + """Return val, lsn, lut from encoded state value.""" + decoded = json.loads(encoded_value) + value = decoded.get(VAL) + last_seq_no = decoded.get(LAST_SEQ_NO) + last_update_time = decoded.get(LAST_UPDATE_TIME) + return value, last_seq_no, last_update_time + + +def hash_of(text) -> str: + """Return 256 bit hexadecimal digest of text.""" + if not isinstance(text, (str, bytes)): + text = json.dumps(text) + if not isinstance(text, bytes): + text = text.encode() + return hashlib.sha256(text).hexdigest() + + +def make_state_path_for_attr(did, attr_name, attr_is_hash=False) -> bytes: + """Return state_path for ATTR.""" + nameHash = ( + hashlib.sha256(attr_name.encode()).hexdigest() + if not attr_is_hash + else attr_name + ) + return "{DID}:{MARKER}:{ATTR_NAME}".format( + DID=did, MARKER=MARKER_ATTR, ATTR_NAME=nameHash + ).encode() + + +def prepare_get_attr_for_state(reply): + """Return value for state from GET_ATTR.""" + result = reply.get(RESULT) + attr_type, attr_key = _extract_attr_typed_value(result) + data = result.get(DATA) + value_bytes = None + if data: + result = result.copy() + data = result.pop(DATA) + result[attr_type] = data + attr_type, _, value = parse_attr_txn(result) + hashed_value = hash_of(value) if value else "" + seq_no = result.get(SEQ_NO) + txn_time = result.get(TXN_TIME) + value_bytes = encode_state_value(hashed_value, seq_no, txn_time) + return value_bytes + + +def prepare_attr_for_state(txn, path_only=False): + """Return key, value pair for state from ATTR.""" + result = txn.get(RESULT) + txn_data = result.get(TXN).get(DATA) + nym = txn_data.get(DEST) + attr_type, attr_key, value = parse_attr_txn(txn_data) + path = make_state_path_for_attr(nym, attr_key, attr_type == HASH) + if path_only: + return path + hashed_value = hash_of(value) if 
value else "" + seq_no = result.get(TXN_METADATA).get(SEQ_NO) + txn_time = result.get(TXN_METADATA).get(TXN_TIME) + value_bytes = encode_state_value(hashed_value, seq_no, txn_time) + return path, value_bytes.encode() + + +def make_state_path_for_nym(did) -> bytes: + """Return state_path for NYM.""" + return hashlib.sha256(did.encode()).digest() + + +def prepare_nym_for_state(txn): + """Return encoded state path from NYM.""" + result = txn.get(RESULT) + txn_data = result.get(TXN).get(DATA) + nym = txn_data.get(DEST) + path = make_state_path_for_nym(nym) + return hexlify(path).decode() + + +def prepare_get_nym_for_state(reply): + """Return value for state from GET_NYM.""" + result = reply.get(RESULT) + data = result.get(DATA) + value = None + if data is not None: + if isinstance(data, str): + data = json.loads(data) + data.pop(DEST, None) + value = json.dumps(data) + return value + + +def prepare_get_schema_for_state(reply): + """Return value for state from GET_SCHEMA.""" + result = reply.get(RESULT) + value_bytes = None + attr_names = result.get(DATA).get(ATTR_NAMES) + if attr_names: + data = {ATTR_NAMES: attr_names} + seq_no = result.get(SEQ_NO) + txn_time = result.get(TXN_TIME) + value_bytes = encode_state_value(data, seq_no, txn_time) + return value_bytes + + +def make_state_path_for_schema(authors_did, schema_name, schema_version) -> bytes: + """Return state_path for SCHEMA.""" + return "{DID}:{MARKER}:{SCHEMA_NAME}:{SCHEMA_VERSION}".format( + DID=authors_did, + MARKER=MARKER_SCHEMA, + SCHEMA_NAME=schema_name, + SCHEMA_VERSION=schema_version, + ).encode() + + +def prepare_schema_for_state(txn, path_only=False): + """Return key-value pair for state from SCHEMA.""" + result = txn.get(RESULT) + origin = result.get(TXN).get(METADATA).get(FROM) + schema_name = result.get(TXN).get(DATA).get(DATA).get(NAME) + schema_version = result.get(TXN).get(DATA).get(DATA).get(VERSION) + value = {ATTR_NAMES: result.get(TXN).get(DATA).get(DATA).get(ATTR_NAMES)} + path = 
make_state_path_for_schema(origin, schema_name, schema_version) + if path_only: + return path + seq_no = result.get(TXN_METADATA).get(SEQ_NO) + txn_time = result.get(TXN_METADATA).get(TXN_TIME) + value_bytes = encode_state_value(value, seq_no, txn_time) + return path, value_bytes.encode() + + +def prepare_get_claim_def_for_state(reply): + """Return value for state from GET_CLAIM_DEF.""" + result = reply.get(RESULT) + schema_seq_no = result.get(REF) + if schema_seq_no is None: + raise ValueError("ref field is absent, but it must contain schema seq no") + value_bytes = None + data = result.get(DATA) + if data is not None: + seq_no = result.get(SEQ_NO) + txn_time = result.get(TXN_TIME) + value_bytes = encode_state_value(data, seq_no, txn_time) + return value_bytes + + +def make_state_path_for_claim_def(authors_did, schema_seq_no, signature_type, tag): + """Return state_path for CLAIM DEF.""" + return "{DID}:{MARKER}:{SIGNATURE_TYPE}:{SCHEMA_SEQ_NO}:{TAG}".format( + DID=authors_did, + MARKER=MARKER_CLAIM_DEF, + SIGNATURE_TYPE=signature_type, + SCHEMA_SEQ_NO=schema_seq_no, + TAG=tag, + ).encode() + + +def prepare_claim_def_for_state(txn, path_only=False): + """Return key-value pair for state from CLAIM_DEF.""" + result = txn.get(RESULT) + origin = result.get(TXN).get(METADATA).get(FROM) + schema_seq_no = result.get(TXN).get(DATA).get(REF) + if schema_seq_no is None: + raise ValueError("ref field is absent, but it must contain schema seq no") + data = result.get(TXN).get(DATA).get(DATA) + if data is None: + raise ValueError("data field is absent, but it must contain components of keys") + signature_type = result.get(TXN).get(DATA).get("signature_type", "CL") + tag = result.get(TXN).get(DATA).get(TAG, "tag") + path = make_state_path_for_claim_def(origin, schema_seq_no, signature_type, tag) + if path_only: + return path + seq_no = result.get(TXN_METADATA).get(SEQ_NO) + txn_time = result.get(TXN_METADATA).get(TXN_TIME) + value_bytes = encode_state_value(data, seq_no, 
txn_time) + return path, value_bytes.encode() + + +def prepare_get_revoc_def_for_state(reply): + """Return value for state from GET_REVOC_DEF.""" + result = reply.get(RESULT) + seq_no = result.get(SEQ_NO) + txn_time = result.get(TXN_TIME) + value_bytes = encode_state_value(result.get(DATA), seq_no, txn_time) + return value_bytes + + +def make_state_path_for_revoc_def( + authors_did, cred_def_id, revoc_def_type, revoc_def_tag +) -> bytes: + """Return state_path for REVOC_DEF.""" + return "{DID}:{MARKER}:{CRED_DEF_ID}:{REVOC_DEF_TYPE}:{REVOC_DEF_TAG}".format( + DID=authors_did, + MARKER=MARKER_REVOC_DEF, + CRED_DEF_ID=cred_def_id, + REVOC_DEF_TYPE=revoc_def_type, + REVOC_DEF_TAG=revoc_def_tag, + ).encode() + + +def prepare_revoc_def_for_state(txn, path_only=False): + """Return key-value pair for state from REVOC_DEF.""" + result = txn.get(RESULT) + author_did = result.get(TXN).get(METADATA).get(FROM, None) + txn_data = result.get(TXN).get(DATA) + cred_def_id = txn_data.get(CRED_DEF_ID) + revoc_def_type = txn_data.get(REVOC_DEF_TYPE) + revoc_def_tag = txn_data.get(TAG) + path = make_state_path_for_revoc_def( + author_did, cred_def_id, revoc_def_type, revoc_def_tag + ) + if path_only: + return path + seq_no = result.get(TXN_METADATA).get(SEQ_NO) + txn_time = result.get(TXN_METADATA).get(TXN_TIME) + value_bytes = encode_state_value(txn_data, seq_no, txn_time) + return path, value_bytes.encode() + + +def prepare_get_revoc_reg_entry_for_state(reply): + """Return value for state from GET_REVOC_REG_ENTRY.""" + result = reply.get(RESULT) + seq_no = result.get(SEQ_NO) + txn_time = result.get(TXN_TIME) + value_bytes = encode_state_value(result.get(DATA), seq_no, txn_time) + return value_bytes + + +def make_state_path_for_revoc_reg_entry(revoc_reg_def_id) -> bytes: + """Return state_path for REVOC_REG_ENTRY.""" + return "{MARKER}:{REVOC_REG_DEF_ID}".format( + MARKER=MARKER_REVOC_REG_ENTRY, REVOC_REG_DEF_ID=revoc_reg_def_id + ).encode() + + +def 
prepare_get_revoc_reg_delta_for_state(reply): + """Return value for state from GET_REVOC_REG_DELTA.""" + result = reply.get(RESULT) + if STATE_PROOF_FROM in result.get(DATA): + accum_to_seq_no = result.get(DATA).get(VALUE).get(ACCUM_TO).get(SEQ_NO) + accum_to_txn_time = result.get(DATA).get(VALUE).get(ACCUM_TO).get(TXN_TIME) + accum_from_seq_no = result.get(DATA).get(VALUE).get(ACCUM_FROM).get(SEQ_NO) + accum_from_txn_time = result.get(DATA).get(VALUE).get(ACCUM_FROM).get(TXN_TIME) + return ( + encode_state_value( + result.get(DATA).get(VALUE).get(ACCUM_TO), + accum_to_seq_no, + accum_to_txn_time, + ), + encode_state_value( + result.get(DATA).get(VALUE).get(ACCUM_FROM), + accum_from_seq_no, + accum_from_txn_time, + ), + ) + else: + seq_no = result.get(SEQ_NO) + txn_time = result.get(TXN_TIME) + return encode_state_value( + result.get(DATA).get(VALUE).get(ACCUM_TO), seq_no, txn_time + ) + + +def prepare_revoc_reg_entry_for_state(txn, path_only=False): + """Return key-value pair for state from REVOC_REG_ENTRY.""" + result = txn.get(RESULT) + txn_data = result.get(TXN).get(DATA) + revoc_reg_def_id = txn_data.get(REVOC_DEF_TYPE_ID) + path = make_state_path_for_revoc_reg_entry(revoc_reg_def_id=revoc_reg_def_id) + if path_only: + return path + seq_no = result.get(TXN_METADATA).get(SEQ_NO) + txn_time = result.get(TXN_METADATA).get(TXN_TIME) + txn_data = deepcopy(txn_data) + txn_data[SEQ_NO] = seq_no + txn_data[TXN_TIME] = txn_time + value_bytes = encode_state_value(txn_data, seq_no, txn_time) + return path, value_bytes.encode() + + +def prepare_get_revoc_reg_entry_accum_for_state(reply): + """Return value for state from GET_REVOC_REG_ENTRY_ACCUM.""" + result = reply.get(RESULT) + seq_no = result.get(SEQ_NO) + txn_time = result.get(TXN_TIME) + value_bytes = encode_state_value(result.get(DATA), seq_no, txn_time) + return value_bytes + + +def make_state_path_for_revoc_reg_entry_accum(revoc_reg_def_id) -> bytes: + """Return state_path for REVOC_REG_ENTRY_ACCUM.""" + return 
"{MARKER}:{REVOC_REG_DEF_ID}".format( + MARKER=MARKER_REVOC_REG_ENTRY_ACCUM, REVOC_REG_DEF_ID=revoc_reg_def_id + ).encode() + + +def prepare_revoc_reg_entry_accum_for_state(txn): + """Return key-value pair for state from REVOC_REG_ENTRY_ACCUM.""" + result = txn.get(RESULT) + txn_data = result.get(TXN).get(DATA) + revoc_reg_def_id = txn_data.get(REVOC_DEF_TYPE_ID) + seq_no = result.get(TXN_METADATA).get(SEQ_NO) + txn_time = result.get(TXN_METADATA).get(TXN_TIME) + path = make_state_path_for_revoc_reg_entry_accum(revoc_reg_def_id=revoc_reg_def_id) + txn_data = deepcopy(txn_data) + txn_data[SEQ_NO] = seq_no + txn_data[TXN_TIME] = txn_time + value_bytes = encode_state_value(txn_data, seq_no, txn_time) + return path, value_bytes.encode() + + +def extract_params_write_request(data): + """Return tree_size, leaf_index, audit_path, expected_root_hash from reply.""" + tree_size = data.get(RESULT).get(TXN_METADATA).get(SEQ_NO) + leaf_index = tree_size - 1 + audit_path = data.get(RESULT).get(AUDIT_PATH) + audit_path = audit_path[:] + decoded_audit_path = [ + base58.b58decode(hash_str.encode("utf-8")) for hash_str in audit_path + ] + expected_root_hash = base58.b58decode( + data.get(RESULT).get(ROOT_HASH).encode("utf-8") + ) + if len(decoded_audit_path) != audit_path_length(leaf_index, tree_size): + raise Exception("auditPath length does not match with given seqNo") + return tree_size, leaf_index, decoded_audit_path, expected_root_hash + + +def get_proof_nodes(reply): + """Return proof_nodes from reply.""" + if reply.get(RESULT).get( + "type" + ) == GET_REVOC_REG_DELTA and STATE_PROOF_FROM in reply.get(RESULT).get(DATA): + proof_nodes_accum_to = reply.get(RESULT).get(STATE_PROOF).get(PROOF_NODES) + proof_nodes_accum_from = ( + reply.get(RESULT).get(DATA).get(STATE_PROOF_FROM).get(PROOF_NODES) + ) + return base64.b64decode(proof_nodes_accum_to), base64.b64decode( + proof_nodes_accum_from + ) + else: + b64_encoded_nodes = reply.get(RESULT).get(STATE_PROOF).get(PROOF_NODES) + 
return base64.b64decode(b64_encoded_nodes) + + +def prepare_for_state_read(reply): + """Return state value from read requests reply.""" + request_type = reply.get(RESULT).get("type") + if request_type == GET_ATTR: + return prepare_get_attr_for_state(reply=reply) + if request_type == GET_NYM: + return prepare_get_nym_for_state(reply=reply) + if request_type == GET_SCHEMA: + return prepare_get_schema_for_state(reply=reply) + if request_type == GET_CLAIM_DEF: + return prepare_get_claim_def_for_state(reply=reply) + if request_type == GET_REVOC_REG_DEF: + return prepare_get_revoc_def_for_state(reply=reply) + if request_type == GET_REVOC_REG_ENTRY: + return prepare_get_revoc_reg_entry_accum_for_state(reply=reply) + if request_type == GET_REVOC_REG_DELTA: + if "issued" in reply.get(RESULT).get(DATA).get("value"): + return prepare_get_revoc_reg_delta_for_state(reply=reply) + else: + return prepare_get_revoc_reg_entry_accum_for_state(reply=reply) + raise ValueError( + "Cannot make state value for request of type {}".format(request_type) + ) + + +def prepare_for_state_write(reply): + """Return state key, value pair from write requests reply.""" + request_type = reply.get(RESULT).get(TXN).get("type") + if request_type == NYM: + return prepare_nym_for_state(txn=reply) + if request_type == ATTRIB: + return prepare_attr_for_state(txn=reply) + if request_type == SCHEMA: + return prepare_schema_for_state(txn=reply) + if request_type == CLAIM_DEF: + return prepare_claim_def_for_state(txn=reply) + if request_type == REVOC_REG_DEF: + return prepare_revoc_def_for_state(txn=reply) + if request_type == REVOC_REG_ENTRY: + return prepare_revoc_reg_entry_for_state(txn=reply) + raise ValueError( + "Cannot make state key-value pair for request of type {}".format(request_type) + ) diff --git a/aries_cloudagent/ledger/merkel_validation/hasher.py b/aries_cloudagent/ledger/merkel_validation/hasher.py new file mode 100644 index 0000000000..bd9abf361c --- /dev/null +++ 
b/aries_cloudagent/ledger/merkel_validation/hasher.py @@ -0,0 +1,40 @@ +"""Merkle tree hasher for leaf and children nodes.""" +import hashlib + +from binascii import hexlify, unhexlify + + +class TreeHasher(object): + """Merkle tree hasher for bytes data.""" + + def __init__(self, hashfunc=hashlib.sha256): + """Initialize TreeHasher.""" + self.hashfunc = hashfunc + + def hash_leaf(self, data): + """Return leaf node hash.""" + hasher = self.hashfunc() + hasher.update(b"\x00" + data) + return hasher.digest() + + def hash_children(self, left, right): + """Return parent node hash corresponding to two child nodes.""" + hasher = self.hashfunc() + hasher.update(b"\x01" + left + right) + return hasher.digest() + + +class HexTreeHasher(TreeHasher): + """Merkle tree hasher for hex data.""" + + def __init__(self, hashfunc=hashlib.sha256): + """Initialize HexTreeHasher.""" + self.hasher = TreeHasher(hashfunc) + + def hash_leaf(self, data): + """Return leaf node hash.""" + return hexlify(self.hasher.hash_leaf(unhexlify(data))) + + def hash_children(self, left, right): + """Return parent node hash corresponding to two child nodes.""" + return hexlify(self.hasher.hash_children(unhexlify(left), unhexlify(right))) diff --git a/aries_cloudagent/ledger/merkel_validation/merkel_verifier.py b/aries_cloudagent/ledger/merkel_validation/merkel_verifier.py new file mode 100644 index 0000000000..c6e0dbb42a --- /dev/null +++ b/aries_cloudagent/ledger/merkel_validation/merkel_verifier.py @@ -0,0 +1,50 @@ +"""Verify Leaf Inclusion.""" +from .hasher import TreeHasher + + +class MerkleVerifier: + """Utility class for verifying leaf inclusion.""" + + def __init__(self, hasher=TreeHasher()): + """Initialize MerkleVerifier.""" + self.hasher = hasher + + async def calculate_root_hash( + self, + leaf, + leaf_index, + audit_path, + tree_size, + ): + """Calculate root hash, used to verify a Merkle audit path. + + Reference: section 2.1.1 of RFC 6962. + + Args: + leaf: Leaf data.
+ leaf_index: Index of the leaf in the tree. + audit_path: A list of SHA-256 hashes representing the Merkle audit + path. + tree_size: tree size + + """ + leaf_hash = self.hasher.hash_leaf(leaf) + if leaf_index >= tree_size or leaf_index < 0: + return False + fn, sn = leaf_index, tree_size - 1 + r = leaf_hash + for p in audit_path: + if self.lsb(fn) or (fn == sn): + r = self.hasher.hash_children(p, r) + while not ((fn == 0) or self.lsb(fn)): + fn >>= 1 + sn >>= 1 + else: + r = self.hasher.hash_children(r, p) + fn >>= 1 + sn >>= 1 + return r + + def lsb(self, x): + """Return Least Significant Bits.""" + return x & 1 diff --git a/aries_cloudagent/ledger/merkel_validation/tests/__init__.py b/aries_cloudagent/ledger/merkel_validation/tests/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/aries_cloudagent/ledger/merkel_validation/tests/test_data.py b/aries_cloudagent/ledger/merkel_validation/tests/test_data.py new file mode 100644 index 0000000000..2270797303 --- /dev/null +++ b/aries_cloudagent/ledger/merkel_validation/tests/test_data.py @@ -0,0 +1,656 @@ +SHA256_AUDIT_PATH = [ + b"1a208aeebcd1b39fe2de247ee8db9454e1e93a312d206b87f6ca9cc6ec6f1ddd", + b"0a1b78b383f580856f433c01a5741e160d451c185910027f6cc9f828687a40c4", + b"3d1745789bc63f2da15850de1c12a5bf46ed81e1cc90f086148b1662e79aab3d", + b"9095b61e14d8990acf390905621e62b1714fb8e399fbb71de5510e0aef45affe", + b"0a332b91b8fab564e6afd1dd452449e04619b18accc0ff9aa8393cd4928451f2", + b"2336f0181d264aed6d8f3a6507ca14a8d3b3c3a23791ac263e845d208c1ee330", + b"b4ce56e300590500360c146c6452edbede25d4ed83919278749ee5dbe178e048", + b"933f6ddc848ea562e4f9c5cfb5f176941301dad0c6fdb9d1fbbe34fac1be6966", + b"b95a6222958a86f74c030be27c44f57dbe313e5e7c7f4ffb98bcbd3a03bb52f2", + b"daeeb3ce5923defd0faeb8e0c210b753b85b809445d7d3d3cd537a9aabaa9c45", + b"7fadd0a13e9138a2aa6c3fdec4e2275af233b94812784f66bcca9aa8e989f2bc", + b"1864e6ba3e32878610546539734fb5eeae2529991f130c575c73a7e25a2a7c56", + 
b"12842d1202b1dc6828a17ab253c02e7ce9409b5192430feba44189f39cc02d66", + b"29af64b16fa3053c13d02ac63aa75b23aa468506e44c3a2315edc85d2dc22b11", + b"b527b99934a0bd9edd154e449b0502e2c499bba783f3bc3dfe23364b6b532009", + b"4584db8ae8e351ace08e01f306378a92bfd43611714814f3d834a2842d69faa8", + b"86a9a41573b0d6e4292f01e93243d6cc65b30f06606fc6fa57390e7e90ed580f", + b"a88b98fbe84d4c6aae8db9d1605dfac059d9f03fe0fcb0d5dff1295dacba09e6", + b"06326dc617a6d1f7021dc536026dbfd5fffc6f7c5531d48ef6ccd1ed1569f2a1", + b"f41fe8fdc3a2e4e8345e30216e7ebecffee26ff266eeced208a6c2a3cf08f960", + b"40cf5bde8abb76983f3e98ba97aa36240402975674e120f234b3448911090f8d", + b"b3222dc8658538079883d980d7fdc2bef9285344ea34338968f736b04aeb387a", +] + +RAW_HEX_LEAF = ( + b"00000000013de9d2b29b000000055b308205573082043fa00302010202072b777b56df" + b"7bc5300d06092a864886f70d01010505003081ca310b30090603550406130255533110" + b"300e060355040813074172697a6f6e61311330110603550407130a53636f7474736461" + b"6c65311a3018060355040a1311476f44616464792e636f6d2c20496e632e3133303106" + b"0355040b132a687474703a2f2f6365727469666963617465732e676f64616464792e63" + b"6f6d2f7265706f7369746f72793130302e06035504031327476f204461646479205365" + b"637572652043657274696669636174696f6e20417574686f726974793111300f060355" + b"040513083037393639323837301e170d3133303131343038353035305a170d31353031" + b"31343038353035305a305331163014060355040a130d7777772e69646e65742e6e6574" + b"3121301f060355040b1318446f6d61696e20436f6e74726f6c2056616c696461746564" + b"311630140603550403130d7777772e69646e65742e6e657430820122300d06092a8648" + b"86f70d01010105000382010f003082010a0282010100d4e4a4b1bbc981c9b8166f0737" + b"c113000aa5370b21ad86a831a379de929db258f056ba0681c50211552b249a02ec00c5" + b"37e014805a5b5f4d09c84fdcdfc49310f4a9f9004245d119ce5461bc5c42fd99694b88" + b"388e035e333ac77a24762d2a97ea15622459cc4adcd37474a11c7cff6239f810120f85" + b"e014d2066a3592be604b310055e84a74c91c6f401cb7f78bdb45636fb0b1516b04c5ee" + 
b"7b3fa1507865ff885d2ace21cbb28fdaa464efaa1d5faab1c65e4c46d2139175448f54" + b"b5da5aea956719de836ac69cd3a74ca049557cee96f5e09e07ba7e7b4ebf9bf167f4c3" + b"bf8039a4cab4bec068c899e997bca58672bd7686b5c85ea24841e48c46f76830390203" + b"010001a38201b6308201b2300f0603551d130101ff04053003010100301d0603551d25" + b"0416301406082b0601050507030106082b06010505070302300e0603551d0f0101ff04" + b"04030205a030330603551d1f042c302a3028a026a0248622687474703a2f2f63726c2e" + b"676f64616464792e636f6d2f676473312d38332e63726c30530603551d20044c304a30" + b"48060b6086480186fd6d010717013039303706082b06010505070201162b687474703a" + b"2f2f6365727469666963617465732e676f64616464792e636f6d2f7265706f7369746f" + b"72792f30818006082b0601050507010104743072302406082b06010505073001861868" + b"7474703a2f2f6f6373702e676f64616464792e636f6d2f304a06082b06010505073002" + b"863e687474703a2f2f6365727469666963617465732e676f64616464792e636f6d2f72" + b"65706f7369746f72792f67645f696e7465726d6564696174652e637274301f0603551d" + b"23041830168014fdac6132936c45d6e2ee855f9abae7769968cce730230603551d1104" + b"1c301a820d7777772e69646e65742e6e6574820969646e65742e6e6574301d0603551d" + b"0e041604144d3ae8a87ddcf046764021b87e7d8d39ddd18ea0300d06092a864886f70d" + b"01010505000382010100ad651b199f340f043732a71178c0af48e22877b9e5d99a70f5" + b"d78537c31d6516e19669aa6bfdb8b2cc7a145ba7d77b35101f9519e03b58e692732314" + b"1383c3ab45dc219bd5a584a2b6333b6e1bbef5f76e89b3c187ef1d3b853b4910e895a4" + b"57dbe7627e759f56c8484c30b22a74fb00f7b1d7c41533a1fd176cd2a2b06076acd7ca" + b"ddc6ca6d0c2a815f9eb3ef0d03d27e7eebd7824c78fdb51679c03278cfbb2d85ae65a4" + b"7485cb733fc1c7407834f7471ababd68f140983817c6f388b2f2e2bfe9e26608f9924f" + b"16473462d136427d1f2801e4b870b078c20ec4ba21e22ab32a00b76522d523825bcabb" + b"8c7b6142d624be8d2af69ecc36fb5689572a0f59c00000" +) +GET_NYM_REPLY = { + "op": "REPLY", + "result": { + "data": { + "dest": "Av63wJYM7xYR4AiygYq4c3", + "identifier": "V4SGRU86Z58d6TV7PBUe6f", + "role": "101", + "seqNo": 17794, + "txnTime": 
1632262244, + "verkey": "6QSduYdf8Bi6t8PfNm5vNomGWDtXhmMmTRzaciudBXYJ", + }, + "dest": "Av63wJYM7xYR4AiygYq4c3", + "identifier": "LibindyDid111111111111", + "reqId": 1632267113185021500, + "seqNo": 17794, + "state_proof": { + "multi_signature": { + "participants": ["Node2", "Node3", "Node1"], + "signature": "Qye7WDGrhwEpr2MUmQ2hhm8yWAsUG6gKKf4TXxrw7BybGA96HWXLXhnV5gm5HBQCb4sDXiirTKuyWgMDyfDxKewya9mZhkGXf5WzaADFuaoJkTeSywqqmsrfpcHc2e49eEyncpCxFzhJn6sius4jLgJ7MAfSeVGwyydeR1YsJb3Nm5", + "value": { + "ledger_id": 1, + "pool_state_root_hash": "7siDH8Qanh82UviK4zjBSfLXcoCvLaeGkrByi1ow9Tsm", + "state_root_hash": "GJq4XL4pJYnDGg3MJ64y3QnfuezxsuBEezk5GC5yaZPM", + "timestamp": 1632266842, + "txn_root_hash": "BTnnWQ7imcHSoMykHLeYZX5q8eGEHWdbUydQNA4RG8La", + }, + }, + "proof_nodes": r"+QZT+QIRoPfBdyHC/yQ9E7ccJxuGSGyin0AZ5xy0zfA8N6Wc75nkoLvO2UFS9kc6UJf5h3pWKpCOYU1QG2/EwVBgRaYY5oVfoFiqhixTI8GzCruiD0VLXaBU/E9lXQbDpSkMZdDPzMreoCkh/z2RksZKP1fkA7igydNPzfwbLwiM9elSt/9pDeW2oGcg6JSZBN2tOAjD2MZOI2WbBG+T0xXyrYTBkX6Tyba8oJuUzMN4PgaFyU3asvvF5V654vRkjWc73wybJW176CI8oAq8q5c+HBEffJ0+akk5MRFu4JZhQaMNUaXbGCaWzSs8oLOj9AqXxE1D9jyaCU4u2BqvqHu9HYqelbFj+R5ByEMzoKX5KxOFooV0LpTl7lbGg3kGoSWHBoX2ULYJKYZcnioRoA+xIUfMPFI/zWl1GSrvPSyXWcn4BdvfoNpn2mcMn4PDoPvM6CWLx8A/lTyVEXOE+EgGCnLMnArQ5Yf5+W3QTmrIoHjfHDGdm18gDhzpPnyNm5uanCkdjeKa1JAPeUTdKY7LoHsMpvR3ZTMcsxI1CmU9/M3xvUaFYfZrBOsDgy9X8SN1oEbM6e9l7UXKAYmTfln6h4KP52jiP09iSfbQr+BtIj+CoDS3PSiVlx1zaCxKmxaxK9swtCkioqt9JL2bC17umLyvoCeQaASHvTOBDSOzskDNAX+zZ2H3c249YZLaZ1juy290gPi1nz5vw3VP4NQ5xzLTH4cSKvsJv479pox3Kz8LFlmxW+u4k/iRuI97ImlkZW50aWZpZXIiOiJWNFNHUlU4Nlo1OGQ2VFY3UEJVZTZmIiwicm9sZSI6IjEwMSIsInNlcU5vIjoxNzc5NCwidHhuVGltZSI6MTYzMjI2MjI0NCwidmVya2V5IjoiNlFTZHVZZGY4Qmk2dDhQZk5tNXZOb21HV0R0WGhtTW1UUnphY2l1ZEJYWUoiffkBcaD0zm16yBXDIOIN9412GuA1rW+kTWnQHH1trnzwSPjcrKDIaFdSIze7Moillxoah2AQ0QWzP5kf0E5/LpGsyo0qTaDS+hAZ36Vjg0XyQucpi/mLN4pkE4x/9ktPSJ7M3mP5i6DE+p/X3eoL/TRxL2fDL483eGxfomvqh6J8MZruefjTw6B1/mubqaxnLmpeUYONXNGjbyoKPG2x/rfmengXBxUFc6DUL5gvqRDBg0eL3AlmLmD0TLAScreLTHLe
ymOZUbSIL6A2dgYc0w/ZQadZY55Tnzoh8zkFdQB6K9cnknZngKCHaICAoMPfNmoNKhrO7BUEJ6n6MTgGMSApPD0gDJ6vV9oMytLCoCjrQucTPIkjSugIIq5gCOYtuBw4QEh6fXbPsBTh3gYZoOO/FlS/b8npUjdblCPY77ak3yv/ph8iWrZYrI8kAhYQgICAoItxPpHCtRFevH2BZjWUyb82gezWWgqRklh33Oj86GxvgPkCEaAWno46cfndaVwfDfz5dveqQexY5V3FhvuItpDaJA8kKaBKO2/CnAJx+8V71hdMNskKDaC+pO3KjgmV/vdk7A5GfKCd9sCdRhN1Cdi3bMZk6hdynJVYAEByoaZ1f0t3KnupA6DgZ4jFO/nRym2ZmlYUbt+vP6UdKdDAJGXbCmhdImEIOKCoQbdvJzqarFR9cW7jjJOJw7oJxwGVdQDd+yGBt2oRTqBLGQeyvRSlmv4VM+kS2XPFA/Qd0PIOhWUd1FUWp1vw2KC+IkQy0YqvnBefMK1oiUfYyn8EDUWxVXMTtH/Gp0kEx6BTER0hlHBftQi0PIVagVXy8oHtbq7onmFsLv1TK8BSYaARcR92zYImHr0hGKiFv16gpJ1Z2jh3aO7XObbK9B1QAKAGYSkdYb6RuGGCsCKdnVm1U3SehvqVDDgwlPEqQm9Uo6BjkeqmdeWRntEtTUlp/PxnFLcqlNS5woQnHMeX7Gd2m6CQiWvIfSvgZqtSenfp4Vm0YRzwkJdtPtXmLzyZMWsVtaCbfsQPS4ENbfg3dFabmRSb1p4Cx+CHlA9ADDyTAD7yeqBpSsmSmoFtApFlT/zMJksMICpEMl/C3gzjmLm35yMVZaD2aVz2Mp/WDQcWgtTjnspR5p8/XROvd1TF9D9Q9PrKmKBTd4eKZCyl2r0Tgs5TS1jbG7DM96u4WotWVNYLPXw7TIA=", + "root_hash": "GJq4XL4pJYnDGg3MJ64y3QnfuezxsuBEezk5GC5yaZPM", + }, + "txnTime": 1632262244, + "type": "105", + }, +} +GET_ATTRIB_REPLY = { + "op": "REPLY", + "result": { + "data": "!@#$%^&*()", + "dest": "Av63wJYM7xYR4AiygYq4c3", + "enc": "!@#$%^&*()", + "identifier": "Av63wJYM7xYR4AiygYq4c3", + "reqId": 1632330118219577700, + "seqNo": 17861, + "state_proof": { + "multi_signature": { + "participants": ["Node2", "Node1", "Node3"], + "signature": "RYJYMn96nrrDmNFjeeP7voVY4M7fCj4fHFb8uGiBsLJWd2fujVuoFdNbD7BnP9N5djkerANM2XGqUVAoQ5b9MQLF6NXpK9rQkoAfVCrz9NBJYsD8LW6sc4YN5gwqJgnQegsmHWSfYgyhWq9b27RB59yh6pNh1PQmxc6G2N9TC1vBqH", + "value": { + "ledger_id": 1, + "pool_state_root_hash": "7siDH8Qanh82UviK4zjBSfLXcoCvLaeGkrByi1ow9Tsm", + "state_root_hash": "CdNjqYsC3wJ9ddBSGJpFDgbPUC3LMU5WpLMKfk7Nu8y9", + "timestamp": 1632330092, + "txn_root_hash": "CLP3Zj6SLd4HETezr7Me37ZCS3uRVqc1EMsj8HUvB3Yq", + }, + }, + "proof_nodes": 
r"+QiL+QFxoCtXu0mw1ebGv2A1OlkjRukRnJs8oy7d8gNbrj6yQHcdoJZ0vJmO7PM69ytB3v6d5S4bjVzqCqVyBOGaQO94YeW4oEwDRy2u5bnRgo+C0Ei+5gM4c8Ix8Jlp1soSA77Wp+JeoAHtsE0/wuXFZIhuZAGwGjpkmE/dPrJJpMkF5o30osw5oCVIfS9J8fGGzco4zXyQosHepwSy9paGNk9FsV0VO1MnoCgsaLN77phwtLe44HNqB0cKpX6oZ9mYIMh4L2ftnzZboL95y56198Y4O52Nco9iUuTmLfzm6v9wmrL11n1s6CzkoDV/kCAK2ik/Mgln5rw4jk9rrloUTIn4q54xeJ/mQuoloMVbQ7SP66F5uWPXPGL6wB8Ql57X+jyxxdPdsMMw1YDToA1fbg6tMjTiFsdTCFwWazLdJTKrIfs8IuUeEedhPCJ6oB2FYyT/1bw24xogKSpO5Gv3l/bKLHObq2Wk/mqD41jQgICAgICA+Ma4VzYzd0pZTTd4WVI0QWl5Z1lxNGMzOjE6OTVjZTc4OWM1YzlkMTg0OTA5NzI3MDk4MzhjYTNhOTcxOTA5NGJjYTNhYzE2MzMyY2ZlYzA2NTJiMDIzNjE0Mbhr+Gm4Z3sibHNuIjoxNzg2MSwibHV0IjoxNjMyMzI5NzkxLCJ2YWwiOiI5NWNlNzg5YzVjOWQxODQ5MDk3MjcwOTgzOGNhM2E5NzE5MDk0YmNhM2FjMTYzMzJjZmVjMDY1MmIwMjM2MTQxIn35AhGgxhCAyLVujfeK3JsRsgKfOt9Bb3ska1oiAVYag9PVuOegeRV6fPSTJt/1kb7zbzRTbp04FMXI5bh+zDAIId/BwVygIzsQpf1vDPjjejQjs/CdoQGbkZuKHmZr271Ig0AN5u+grLAIkAqdmj74uhpGiWZowzq865dSdqYgbZnMqgaToTWgNIUFaaiUJZtBWots40Rq0hHrdlBq8JkMe+nj0Lgp+fmgIpG3OWbOVvJGty97UXLi+dWrRFXPaz8NO3sxoB4mSzqgMS1eS5NVzVbpzgbPapd1YpyBVOh9Go/9i7AfaKYD5cOgat2YQEMPq60wdEGqtriJhUCjw2OLL0vhL0TNi2ND26mgkNokyer/73cwd4PnSKPcYPkZ4agoHLVvlW18WnixN0ygeL+k83IfgIXPu6BMB9+cqF7bWG/YK6oJiVf1ZuJUa9SgmHQxRfRmB/RwL0KjDbJ3qJlFoQRAz1wtxvXyNi+cByagEWsn+ywZ9ZQuKDcyn9IuehCRtTqJQsN7LBtBx/R+bJ6glo2X0D1eYSlFN7b/0LIUKclTBHj5bdgy4OTXWlj2LpOgkEjZ+8tcM+RSpKuci1FhEBDDhuoVqVgecYXXUZNrakGgL6k6RJRCh72J0PO6Q3vITduaS082vQCrgEFc/He5BqCgnVULBCn2xpLt+ZIhdZ5Iy0QplY92A6NCLXhdeiLq43OA+FGAgICg9EAG0bMCoDsOIAeeVsSUfuaWGFTwGpqY6yLabYx94ViAgKCy5sizDOkq2rkIiK7FvGI8JevCCGnuhRW9HVyKsf5N4oCAgICAgICAgID5AdGgqg1jHpCwamQoUZu/uQ18daiTDZtRr/YwOCaPtRRId1+gVI/Q8qJrilettFsXNlqrsnmaM9EOCCrCvgAYDDSdf/yAoFVcY1y8UEHaBzUobZrCK8c8kRngMZd80j7ywiP8kDpsoNDNNzg2WFFXlgBjIZveTP5YZREwyedRVXBg2wjTZAM7oIv/uPTYAlOiikIRdX81jnwSjYB5e1KUFaWv5blHaAaboCW43JdUGkAT9z3i0S4ltG3/8ZJzLsPJIAUIGgdyAhS/oJbB1VqfeADhIlt23LemzWTRbwgf9hAt6sPE2cqXUomuoCJlCjTV9hRqjUz6gOqV79a5SLWCoW4iesO349lU1o+goNlT/odRQuiWFG9k1jp65+v3p8/e+jL/19DW8Ls6TJMFoNOrRuQOnS
8I1LxMJVDa6Wh7sZSb3frT5OFfvfut2UiGoM15BNI74tpvLtRxIdVRdgD0osxm6p/ZK4gIwkvFBOH2oA2qxnGbloUxmnmrFGtlFISU5DJJ05hk4taXuucoE6NZgKBuiyA7ZAD2PrOQ7Uf4k4SqZ79ocrVC/3Ig7c9KrJxBEqDlt+9qAhnCxmLm6Jemh+BEujLz6/+iYHwQdfXYJlSVXoD5AhGgjYADdAGfaHwD1ZjrFjXc6lovqisiVgLPJrq8ls0GQ3SgSjtvwpwCcfvFe9YXTDbJCg2gvqTtyo4Jlf73ZOwORnyg8ykP9RcmUJ1UBZSBwsW4qJfNOBOXR3cBVAqueQFR4GOgqzkXDPKLhggk8a3PzVdn0xrGfprjAEwGBooqB4TRcx6gVlXAnuovfI4+jt6Ajha7UTFpA374qZxNpD7PGMCYtc+gf1baoHr1sLUHBa1QivjMij1HZtvjB53M8TkTV0GqefqgM/DWNl1Q9gWZyYyazVjgCV+ONrnwhtjDHnBfH9UPDOegAEgbMzojKeMXZ4m0Sxi4CNXOyrGNPTDeHeCtr0y6uPmgN2bNLvID/ysP5NP8UrTBj9GwD64zP1xntYY4TCVrQB2gPYXHiVzJi9KBr9l3jFd5Fk3etzB2YOqTC7I+VgzzoBegOmyvt8INrm6Cto8WRCTeelO71DJHNozSFVOWKVipTMugHXniwGJA5KmvQMvz/Z3RNn8FSCQtbRXGNqCGi2thnOWgIPh3Eh/l+j+Zriqj8SsqFhF/V2aLdAWhls4dqrTqMPKgaUrJkpqBbQKRZU/8zCZLDCAqRDJfwt4M45i5t+cjFWWg9mlc9jKf1g0HFoLU457KUeafP10Tr3dUxfQ/UPT6ypigG+w51LbnJJOg7T7fWKLUfFjcfNmsDT1oJbrFKkQm1iaA", + "root_hash": "CdNjqYsC3wJ9ddBSGJpFDgbPUC3LMU5WpLMKfk7Nu8y9", + }, + "txnTime": 1632329791, + "type": "104", + }, +} +GET_CLAIM_DEF_REPLY_INVALID = { + "op": "REPLY", + "result": { + "type": "108", + "identifier": "L5AD5g65TDQr1PPHHRoiGf", + "reqId": 1514308188474704, + "seqNo": 10, + "txnTime": 1514214795, + "state_proof": { + "root_hash": "81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH", + "proof_nodes": r"+QHl+FGAgICg0he/hjc9t/tPFzmCrb2T+nHnN0cRwqPKqZEc3pw2iCaAoAsA80p3oFwfl4dDaKkNI8z8weRsSaS9Y8n3HoardRzxgICAgICAgICAgID4naAgwxDOAEoIq+wUHr5h9jjSAIPDjS7SEG1NvWJbToxVQbh6+Hi4dnsiaWRlbnRpZmllciI6Ikw1QUQ1ZzY1VERRcjFQUEhIUm9pR2YiLCJyb2xlIjpudWxsLCJzZXFObyI6MTAsInR4blRpbWUiOjE1MTQyMTQ3OTUsInZlcmtleSI6In42dWV3Um03MmRXN1pUWFdObUFkUjFtIn348YCAgKDKj6ZIi+Ob9HXBy/CULIerYmmnnK2A6hN1u4ofU2eihKBna5MOCHiaObMfghjsZ8KBSbC6EpTFruD02fuGKlF1q4CAgICgBk8Cpc14mIr78WguSeT7+/rLT8qykKxzI4IO5ZMQwSmAoLsEwI+BkQFBiPsN8F610IjAg3+MVMbBjzugJKDo4NhYoFJ0ln1wq3FTWO0iw1zoUcO3FPjSh5ytvf1jvSxxcmJxoF0Hy14HfsVll8qa9aQ8T740lPFLR431oSefGorqgM5ioK1TJOr6JuvtBNByVMRv+rjhklCp6nkleiyLIq8vZYRcgIA=", + 
"multi_signature": { + "value": { + "timestamp": 1514308168, + "ledger_id": 1, + "txn_root_hash": "4Y2DpBPSsgwd5CVE8Z2zZZKS4M6n9AbisT3jYvCYyC2y", + "pool_state_root_hash": "9fzzkqU25JbgxycNYwUqKmM3LT8KsvUFkSSowD4pHpoK", + "state_root_hash": "81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH", + }, + "signature": "REbtR8NvQy3dDRZLoTtzjHNx9ar65ttzk4jMqikwQiL1sPcHK4JAqrqVmhRLtw6Ed3iKuP4v8tgjA2BEvoyLTX6vB6vN4CqtFLqJaPJqMNZvr9tA5Lm6ZHBeEsH1QQLBYnWSAtXt658PotLUEp38sNxRh21t1zavbYcyV8AmxuVTg3", + "participants": ["Delta", "Gamma", "Alpha"], + }, + }, + "data": { + "primary": { + "n": "99189903989663138528233707329605798928847401116390236896063089765075372754629124936444122843118055706252253099463393473802518363989430955476296412713569660919619346846287050031967219079283379371982179475331351180676850316480923612843063670221492520841649469839691711425409965721272803880205396716527258248613465269182675039055965728267714834218146604131070697050035414285666112493815538051150077642431887363704371988066099418960634098618638462715333172965774724821969116693268040267277814162178224754076427686143924953663821419082938398179004138428079999099795086963773235443135512644254456755455484183769568248126517", + "r": { + "codigo": "77817306548649534964657604646528331475274954783537816321078654710433118782259165423707503504749761340340683516491478085067491481212351726168032288636412669012430462439715498773706352645189398384169438794551946449612310560107003342591907228223912795152388336064119335728480656473934502582043358475971816917205870755637508198751179472187927719874981872028671079011576534531499918130408338567566829925647438062516632369126112933518179590448203159587338345116555203325745240711205260653560198723237613260080096617921311750049773564348145887149507995200290642658140014127849403602234119374930582030243016220836988785748210", + "first_user_approval_did": 
"384901452384513081023453460679014535564385391647798526437625701270162051331706043117594824177342667741056529936232353830727447163172559133459335920295636853313371417661137116545007085547366678995840721301359884373742490543707510495480790286114302712462455589494291174548398600636747006947872365673176941628656054411947918922418869423914814997795002269883418897691993622163781887636679688285100098762090036009143337590370690393872610987382838641041957318532497439209204958966457484919235443545472237330446545746252024812060581212126262338426171637686716995589583620088274775670178908018079422522219635588234611784509", + "master_secret": "81068502063840180320741074287906342543066681492620931450052491621732172194335348147008386419344368715569934655546242506895767888038584542979256597551262789656277827574295200747121137618608474555341285956626018554958296531200733383691408878725586893553643865674542888944749888613820957252358913841108334724015407887220439435243473820099357742585381819060146664353782556070228374281217851773218373481446849227404348736589966775390145852652183646921017461814146947628161797432921543759003294668132044659050119305185855913513921197232121257678830761378709935258794232279170242473754467191437715593342674001392685033193794", + "nombre": "56564082360321401174322417318224565332137922170165451581597752645392521885870184917204920004118802994562843023560417187680525409882456236587673934846877768777199500103855762167848063320030625777430614007189667776149877989836641529396943443514272872564555015923301040044658163395537423931233642434950634299962603449098098818126403527867428584248119402622538104047631507595457098143469160334326123436992275174769315471259117233927959876492402389453565288307019478442635743703896480580402861092754498280875793229979329341447657771507422025327622092055197419860974454523205910355593867584986246363260282531337281676793491", + "organization_did": 
"74899935378718944225677848839884335012111378948985015496762544471888033974270547589113991633535973126327329814277788790948434314438231641471331884661213528602615811557141738364853570574223344350088619877830486416101458230567718116114215656684394440136049648931170659984460754247230461493096378581471192072170388858573417277123886312354262483127937403754694858588460383672140110402187709391829034522049039006592593736054436190154967830450996645446351632748263577745729816194354763565454686659832819328289881169088902243386915920828612763749218569228108689615286684563087321268625375752451676754219994921630526320203476", + "second_user_approval_did": "89119359902571208030477407091536952123967779930589362112667306028903075751373766454398071862082997438772474017794013371240677512679500869341895799567663068078521267419965895858373953860879647715955445898200932327173266279475993955169239937764491733655589713972003156629820768853933396265084804500417473440680977037907787836687381227950999759863611023819934983323762751137271632765089287524211847780886994147394428053839235628668993765050763392704234752115503722513934641902436865635425231562129248342900089210925398432689711852670912182650883673124222838105654417490386113188985500801951496785800582743829588068907047", + }, + "rctxt": "55547577127438452265999702988403173852746410863821639817686336079827589263493913120270701258855524914528308109700621574592392602314295356183113041445284938138034569935576573259819114834782964857977382395287420488028136585682257950683908415155172282405919061674426279404033026475575266286298592744120897737265495085570022313613602512862724115564775639676766370968468135181898956237168424857364482828066846065626379464086895703289263133463139052074535468896626368295979227303228929533876243931297370520779853391301535635730515106839625918686618833229258361549523072827169051454870773288344997544238354671312123446595072", + "s": 
"76390640419992943357387572917900898435678229255627168485796009046041444552924973855414926188853070419104084147556871783111646804667947793179269848356207708256916960811483515023101091196431618938254921768709340219038096287590606545670154994184316979207153302903153178050396002360076849204120423298892878360003672100994708647647917295298682518440979873684387308281835117444201027818850637617521716647611865785494687167981676381951451652957380673960004108264169883362799934962785023366929163087551374986121037560891329467151316116696669178867317314984235577675774978649604926672390555790902478998274813041084228870108873", + "z": "61045765676410000482097344755537720120654736675041635423599566923209517703349841897383346627436576448338376802479852357173131386445010272306227443795009060661051658092981601448476716067337100738642200234810775085678513549725669008108236620735588819625211643346883386904560923354128473293459231989623557344378160673531493684921954207398963165225114647217866746760601743246098299506327232318935202330671228079044639582315169397789100837576948787308107676091229832990695040809784770294223134972610136425448955990141256448736380191702953097163518870012238820870246537689784095307756867782192952273049396153702731638708090", + }, + "revocation": { + "g": "1 033139FA1575A54B8439D4BA2703F035336D28037612E50EF050647A574DA757 1 17E1CB296818C6215C4A99257AE52BA5465F62997701C9DBB8AA9C069AF7F9B0 2 095E45DDF417D05FB10933FFC63D474548B7FFFF7888802F07FFFFFF7D07A8A8", + "g_dash": "1 1EE383F16B2F27DD44CFEADEE7FAB0CD50EFD2A194E9B116B0D89ED6ACD9E342 1 19DFF4801228D80AC56B5EB6172B96AAC3B0C470C98991E81841E0D49A21CC38 1 051CEDF98F814E8213FE85F9FA28BAC1826DE1F32DD9A031FDFECE286B624E64 1 06550C3D4EF014887D36D203B99AB7CD558630DA94687F007C092438E8991398 2 095E45DDF417D05FB10933FFC63D474548B7FFFF7888802F07FFFFFF7D07A8A8 1 0000000000000000000000000000000000000000000000000000000000000000", + "h": "1 1C7B7FA206065061BDEA6B421D575985801C432BE94885958AF9E3E4C8388819 1 
03A5D07D9E32DB3CCAA55FDF36C2525684ABE4F0EFFC6C4FA1DF1246B716748A 2 095E45DDF417D05FB10933FFC63D474548B7FFFF7888802F07FFFFFF7D07A8A8", + "h0": "1 0058F2F874D7B5CBE24C8CE56471346B1900B2DCABA0AD58FC976E725A8B6310 1 0E5576E1F0B1D2B1AE5057BD30F0F47AB2FFC6A89DD7EA407C6069AC8100651C 2 095E45DDF417D05FB10933FFC63D474548B7FFFF7888802F07FFFFFF7D07A8A8", + "h1": "1 21C75A1436336A5DD2870614F9EA1516686BDEEC105EDC1EDF7666AD34FE6E6B 1 19B2C0A16F9CC44AD050844400ABB8BFCEE49F2DDF4E05E58323767D6E096EC7 2 095E45DDF417D05FB10933FFC63D474548B7FFFF7888802F07FFFFFF7D07A8A8", + "h2": "1 2309419416526C6255383D3C9FFB7C37D35975F15F5AD6304310BB8BE65C1A61 1 0D287AA25D7A118E7D9991F8022945B43AACC2F1CEB47A8CB338B31E44D1CA35 2 095E45DDF417D05FB10933FFC63D474548B7FFFF7888802F07FFFFFF7D07A8A8", + "h_cap": "1 183DD9C36D3F66ECFA6EF7E3CF4331FF1EC80BE8EAEFB45F38E77E1E4DC873C5 1 1CFFF20D54ED8E75279D5498638B197CF12E90ED1B0CB81AC631CF68399FF20C 1 2349DFB2FC8743C36CF8609D960FE092E1F8D2D502808F1FD8B7F108ECAD598A 1 0F988ECDA1A09A1B36F711D6132A749D1BFAAEDFAC161E9C088822DB89C6185E 2 095E45DDF417D05FB10933FFC63D474548B7FFFF7888802F07FFFFFF7D07A8A8 1 0000000000000000000000000000000000000000000000000000000000000000", + "htilde": "1 1B2D30224CA334957A61805D1826C185AB3120F4CD05162EF6EDC3D507BFF332 1 24CD57DFD210EF8EC056864C4970A0D980B29E7235692B80200B0C425FA6260C 2 095E45DDF417D05FB10933FFC63D474548B7FFFF7888802F07FFFFFF7D07A8A8", + "pk": "1 14B3128654FA6FA490C8E07D7189AF655DB903C356C81271CE0FE2CA1D6434DF 1 212D406D3CCE907AB00EEA9F192431564604CFD13EE9DC597ACA45353616F545 2 095E45DDF417D05FB10933FFC63D474548B7FFFF7888802F07FFFFFF7D07A8A8", + "u": "1 1397CFD01CBADE79023FD996C3C253E65D6FF70B8746D3BDFF712C69AA7A5A52 1 111BC0EF91F9BB4B8ACE2562B87A02F31F031C1A115B6E14AEE11FDB65DF1628 1 02F32E65B7F0BD4B532D203DA415B34BF2E38358F2665D27F01A61899AA887AD 1 218A71DD9CB509655AE744D159570E7371C6B28F103F3BCE1FC5B74B14BED098 2 095E45DDF417D05FB10933FFC63D474548B7FFFF7888802F07FFFFFF7D07A8A8 1 
0000000000000000000000000000000000000000000000000000000000000000", + "y": "1 00BF5D14ED0BEC12C8D0F95B8DB1DE5B22BA995C8823F10A59B957E2B03250EA 1 007FDEB4F6BF33271BDDD19ACA54EAC0C8F7FAD9E45CF29FB695530F26178665 1 2231CB929DDE775B5CA2001E5A7D9F75EB07968A5FC2796A02FE1946397D1F74 1 1840253E7AD8098184E480DBA2BF179BE2EDF8F1D94E13069D833607CD7F739F 2 095E45DDF417D05FB10933FFC63D474548B7FFFF7888802F07FFFFFF7D07A8A8 1 0000000000000000000000000000000000000000000000000000000000000000", + }, + }, + "signature_type": "CL", + "origin": "2VkbBskPNNyWrLrZq7DBhk", + "ref": 10, + "tag": "some_tag", + }, +} +GET_REVOC_REG_DEF_REPLY_A = { + "op": "REPLY", + "result": { + "type": "115", + "reqId": 1632558557551332200, + "state_proof": { + "proof_nodes": r"+Qml+QUstCA6QmJKdVU2dEhYZXgzOGJ4UmhFTE1wWjozOkNMOjM2ODp0YWcxOkNMX0FDQ1VNOnRhZzG5BPT5BPG5BO57ImxzbiI6MzcwLCJsdXQiOjE2MzI1NTg1NTEsInZhbCI6eyJjcmVkRGVmSWQiOiJCYkp1VTZ0SFhleDM4YnhSaEVMTXBaOjM6Q0w6MzY4OnRhZzEiLCJpZCI6IkJiSnVVNnRIWGV4MzhieFJoRUxNcFo6NDpCYkp1VTZ0SFhleDM4YnhSaEVMTXBaOjM6Q0w6MzY4OnRhZzE6Q0xfQUNDVU06dGFnMSIsInJldm9jRGVmVHlwZSI6IkNMX0FDQ1VNIiwidGFnIjoidGFnMSIsInZhbHVlIjp7Imlzc3VhbmNlVHlwZSI6IklTU1VBTkNFX09OX0RFTUFORCIsIm1heENyZWROdW0iOjE2LCJwdWJsaWNLZXlzIjp7ImFjY3VtS2V5Ijp7InoiOiIxIDE5RDQ5MDlBNTk3NTIzRUNDMjhEQTI1MzVDNENFRDY5ODUzRUE3QzFFODQ4M0ZBQkM1Q0E1NzIzNEZCNTFFNTAgMSAxQTU4RTZERkQxMDE1OTM5NDhCMDNEMTg5RTA4QzczRjg5RDRCNkZFNjVCOTIzQzgwM0NBMUREQUYwOTZENTIwIDEgMDM3N0NGN0MzODk3Q0Q0MzhDRTAwMUY0NjE2RkZFNTA5QTlBODU4MTcxQjhDQTlBMjQ2QjQzMjJCRkYzRDlDQSAxIDFERjU2ODkwNTQwQkJGRjA5QUE1NTMzRjY0MjJFRTlEM0JDNTZERjFDOTY2N0FERjgzRkJGNEU0NUIwRERGNDIgMSAwMzEwNTgxMzY2N0IzRURDNEZBODM2ODcwMUU4REEzMDlGODY3M0IzOTk3NzJCNjg4OUE1Q0UxRjE1RTdEQzdGIDEgMjE5Q0FEQkQ2MkI2QzRBRTNDNjhBQzNDQzMyQjY5NEI2MjhDMjJFRjhEMTUxMDE1MkUxM0U2NUQwQkZFNTNEQSAxIDAwMTVDQTlBNkVCRjA1RDRCODY5QjI2NzBENTBENjFFMkI4RTE1M0VDQkIwOENBQTgwNjk2QjMwRDdCMDA1MDggMSAxQ0FBOUVCNjM2Q0IwMjJENjMyMThBOTEyNkI0MEU1NEMwODUwOUEyMkNBOTM2MTU1NEVGNjI1ODM4QjE4NEEwIDEgMDMxNTQxNTc0OTkxNTgwNEI5OEZCODhFMTdGRTdFMDlCODlD
QjBENEFGOTIwMDVGMkIyOTY0QkVGREM2QkIwQiAxIDFBMkExNTQwREQxQkI5REQxMDE3RDM0ODVCRDVFQUZGOEVDN0M2NzMwQ0I5MDhCQTZEQzQzMEMxMzE3ODczOEMgMSAxNEE1MDM5QzNGNDYyQUFEOEM0M0M3N0MwMjc5OEE0NzE3QzJFMjQxNzNBODU3QzUwRkJFQjk5M0UyRUI0RUY4IDEgMTBGQTg4NEY1RkE3MkUxNjcxODkyRTNGNUM5NjI1MEM2QUVERkFBRkJDRjgxMDZFQzBGM0EyOTVBNEU3MjI5NiJ9fSwidGFpbHNIYXNoIjoiRlRHTlBtdmVicTRLZWdrNXV1bkRLWlNvaW9oS1pDcEZxTkRnb3NaNmlDNVoiLCJ0YWlsc0xvY2F0aW9uIjoiL2hvbWUvc2hhYW4vLmluZHlfY2xpZW50L3RhaWxzL0ZUR05QbXZlYnE0S2VnazV1dW5ES1pTb2lvaEtaQ3BGcU5EZ29zWjZpQzVaIn19ffhxgICgRJHedcbj1Hr913tO/wEGZQgrJpTv8vUT16OQTitvH26gYiNs08jukUFJxJNWpkCQp8NBP1bqYXgfwbkTC6qCVByg0PXvEgv2Qw1np5sId7N4hITbGyWQ5GjoNouXCU1BwQ+AgICAgICAgICAgID4OZcWJKdVU2dEhYZXgzOGJ4UmhFTE1wWjo6BfKBcBYXHv6mgeaHJw3COqzq/o4JJsC7YMMxvmOBdgmfkBsaD0PZ8g9hcKs6k7eUuxz+ccv6oDn25GCj9korZ5J2ppw6APFVqpq8ClC0c5xgbG/ZGaINaYcaxwM330QdpC1p1DLaCswQB7kgT814GCOt6oXtByUAcOAifAPdk99nbd7Vg+tqBSY0F3XIqBC7JLb7pvfmpmu9/yZ7LG/2VXV9/e8atPC4Cgz8yGI/CpZvlCyzwrWB9Qnac5waH73QqrqSin5baI8b+g6y77jp+gy++RktzxWqUI79+L4W3V1O5tZEQJAEoc29Sg+oMOsxdeXnspelh0qQwJ6vqATHp6+ZhoJcglHAOEl0eAoLnL5WvOgWCFtbTl9YrmzJu4Q13j2T3A2SXHqLqEwus8oNTx3koe14+eB5AbP7kuQzrkaIBp55pWvRNdyabkyDZqoOU9kBp1jsjcqbUA2buWH5Zzz/BJjCevwc/80PqSshO7oMwURtYLhgJmC3Pi2pfFuS9g/WOobsbo+UcuoOxOaeEtoF+UQ/U1EJSwC75Bw3qeBOa+4JsdZauIsokkUusTWZOhoCIoKf/6byo4OMRBgwky31598IpxxMn0sJGT1TU8UrOQgID5AhGgzsMZEH5cod6z79OSLh0d+a0fJDQZJaOwfF48IP8bptOgL0BctXWLM4KVmRgENAT73LvwGsfFCNvb82sdEqkEGvOgl/k0Y2By+Db4T4+PHWI2ty+KeNfOggJLKPTWBc278BGgDSHNRUhDuAutDwI/elqRVfsBxRCC/vQxKw85bCUc4GGgRK8BX8dvkPEO32WVHrO0D/AkoJN1RCw4Hzg+FOa8x/ygnN8ADaBFc+CwD9yYS0TFAtzXINrTeWo2kjbtJr5W/tigReZOT7Tz0lhKdnI3S2e8J/Zu9k9kQagSmXrI+C5WR7ug5vVES2frv8zk4iErkzFKDfhWcgQ1358ieAr/FkbhgKGgqkQiq9uQjJEtqd71ghAPLQZI9qwceWGSizMo93vXChygnmkblDqyrap38rlrGMxnCGzVsRkWSpaEIAdHtlVOnTqg8BT9W9DfBg3OTntyvYIeVKO6M3Cjj+0PqzI00n6ZOWmgRW9ExX1REmzaaP0w+kYCxL8hwdvm1rH9d+PQfs+jqXGgJyALW436Fb81DNcQfUUEdt7FaaoTi3ulGcvAZqY2PL+gbJCml7zD/rjAnefT2RuWXbKcqjLp2D5ytUN8S+5I7mKgrnx7wY53cXjZvSYQ/SwswO/Nadotah0sBi9/DpEVj7egK5vc
saBELBmQsL6t501dclUKkquWpqHMtn47X3/47ZaA", + "root_hash": "GxYME9tsA7DJmG6gxFQnZEbYJPJHMGPxkyC3WgLjenhr", + "multi_signature": { + "value": { + "txn_root_hash": "AEDjpftVLdQ52ndRNgTdi58WWY8qyRgJEbYKYFPoGvFX", + "timestamp": 1632558557, + "ledger_id": 1, + "pool_state_root_hash": "NCGqbfRWDWtLB2bDuL6TC5BhrRdQMc5MyKdXQqXii44", + "state_root_hash": "GxYME9tsA7DJmG6gxFQnZEbYJPJHMGPxkyC3WgLjenhr", + }, + "participants": ["Node2", "Node4", "Node1"], + "signature": "Qw9q7HGDwmDck9MhFvja62PsyZ6pyxifr4EL4P2fKPo232EGwT7xQ9dFaHf5eBxZ2mfshUFJT1HQwAaG3Ne41shxiUxp35uLqrMhpWcSpj1G8yDwqk3RA5u8TuwqEyjHqeonEKbWeGuc9E6cAsUuVPghssoUkVUjiVBSG6Kfpdozas", + }, + }, + "txnTime": 1632558551, + "identifier": "GjZWsBLgZCR18aL468JAT7w9CZRiBnpxUPPgyQxh4voa", + "seqNo": 370, + "data": { + "credDefId": "BbJuU6tHXex38bxRhELMpZ:3:CL:368:tag1", + "revocDefType": "CL_ACCUM", + "value": { + "publicKeys": { + "accumKey": { + "z": "1 19D4909A597523ECC28DA2535C4CED69853EA7C1E8483FABC5CA57234FB51E50 1 1A58E6DFD101593948B03D189E08C73F89D4B6FE65B923C803CA1DDAF096D520 1 0377CF7C3897CD438CE001F4616FFE509A9A858171B8CA9A246B4322BFF3D9CA 1 1DF56890540BBFF09AA5533F6422EE9D3BC56DF1C9667ADF83FBF4E45B0DDF42 1 03105813667B3EDC4FA8368701E8DA309F8673B399772B6889A5CE1F15E7DC7F 1 219CADBD62B6C4AE3C68AC3CC32B694B628C22EF8D1510152E13E65D0BFE53DA 1 0015CA9A6EBF05D4B869B2670D50D61E2B8E153ECBB08CAA80696B30D7B00508 1 1CAA9EB636CB022D63218A9126B40E54C08509A22CA9361554EF625838B184A0 1 0315415749915804B98FB88E17FE7E09B89CB0D4AF92005F2B2964BEFDC6BB0B 1 1A2A1540DD1BB9DD1017D3485BD5EAFF8EC7C6730CB908BA6DC430C13178738C 1 14A5039C3F462AAD8C43C77C02798A4717C2E24173A857C50FBEB993E2EB4EF8 1 10FA884F5FA72E1671892E3F5C96250C6AEDFAAFBCF8106EC0F3A295A4E72296" + } + }, + "maxCredNum": 16, + "tailsLocation": "/home/shaan/.indy_client/tails/FTGNPmvebq4Kegk5uunDKZSoiohKZCpFqNDgosZ6iC5Z", + "issuanceType": "ISSUANCE_ON_DEMAND", + "tailsHash": "FTGNPmvebq4Kegk5uunDKZSoiohKZCpFqNDgosZ6iC5Z", + }, + "tag": "tag1", + "id": 
"BbJuU6tHXex38bxRhELMpZ:4:BbJuU6tHXex38bxRhELMpZ:3:CL:368:tag1:CL_ACCUM:tag1", + }, + "id": "BbJuU6tHXex38bxRhELMpZ:4:BbJuU6tHXex38bxRhELMpZ:3:CL:368:tag1:CL_ACCUM:tag1", + }, +} +GET_REVOC_REG_DEF_REPLY_B = { + "op": "REPLY", + "result": { + "type": "115", + "reqId": 1632558692947869200, + "state_proof": { + "proof_nodes": r"+Qm4+FGAgICAgKDhnfmsRzgYrRjzudZZZEcx6Em2c7hKCkI2jpfPvCO7A6A7fIAYM71Yj4a9fr96py1o2lrLFsXwsT5FX351uQoFvoCAgICAgICAgID5AXGgv9Nc6RNva1AqWXs3IB5NouueTpiLfD1lzpkDlYojxOOgSE2gU4AqppokIWWi72raYFDx7jpH46PMy2aFcj2wiySg0U1qetNR0nxwrod6ShcOGlCzl5riSy2eLEXmJFC4FqagsLH6Z8pIbmt2p6USTFoi3+O+gabk9WOY/WkbUYJNNHmg5wyCXmHrO6XnWrFxg13PBBS/WFe23Bt+L38Z+kLRVDKg0fs56mUjnOFEVed3x4yBIqG3BXNsH6UUQhaKpsLmhquAoCzz9q9bzGOS3quwu+KZZrz4nVDwAGMbZXYd5KAqXcq5oBHtoirR/Vhl0sUPZu/5e/7yJ/QlFnieFAYhL4tZwVVioKWot7qDtOaW6HPjzkVJWLDD/7CarvUtAUnz1MZ0Dd8igICgP8pTXsfPEYwGNECljgjELiktUNoof9NjF2horaKuT3+AgKA8hrTOum5D5cHhFaHD9N1VICNdOmXSCb/HXEK4pWzDH4D5BSy0IDpVWnh0c2Q3V0ozM2JTVkRrbmNQaGJnOjM6Q0w6MzgxOnRhZzE6Q0xfQUNDVU06dGFnMbkE9PkE8bkE7nsibHNuIjozODMsImx1dCI6MTYzMjU1ODY3NSwidmFsIjp7ImNyZWREZWZJZCI6IlVaeHRzZDdXSjMzYlNWRGtuY1BoYmc6MzpDTDozODE6dGFnMSIsImlkIjoiVVp4dHNkN1dKMzNiU1ZEa25jUGhiZzo0OlVaeHRzZDdXSjMzYlNWRGtuY1BoYmc6MzpDTDozODE6dGFnMTpDTF9BQ0NVTTp0YWcxIiwicmV2b2NEZWZUeXBlIjoiQ0xfQUNDVU0iLCJ0YWciOiJ0YWcxIiwidmFsdWUiOnsiaXNzdWFuY2VUeXBlIjoiSVNTVUFOQ0VfT05fREVNQU5EIiwibWF4Q3JlZE51bSI6MTYsInB1YmxpY0tleXMiOnsiYWNjdW1LZXkiOnsieiI6IjEgMTcwQkZCOUVCQUREQTJDREI4NTFCQjMwQkU3MjhGQjdBREY3RTNDRDVCMkY5OUFCMkY5MzM1QTg1M0Q2MzI4RSAxIDE0NDdBNzc0M0U2ODNBRjdCRDc0OTg0RjIzQkM0QTA1REZDQUUyMDMyMkFGQUUzMTc5QjAwNEU3MTVDRUJBRDAgMSAxRUM4MDA2NDAzOUJBQzg3RUY5RUQ4QkVBMzlEMTBBQkNCQTVDRUZFNUIwNkM1MzBGMEYwQjgwRTUzREJGQUE0IDEgMDAyN0M1RENDRDk2RTNBRDQyN0Q2RjdBNUUwMTZEMTk0N0JBQzEzQTg4N0ZGRDRFNzc5RTk5OTlCQUFDQTcxMCAxIDA0RTBBNDc4Q0M3N0YwQ0QzMTg3MUUyNkFEM0E3MUI0MTkyNUQ3MjU3OERGMjFBOERCQUE4MENBNkQ0QUNCQTAgMSAwNTczRkVBOTZCMTYyNTc3RTgyQjE1QjQ3MzZGOEJEM0VBNTU5NzE0RTI3MUIyRkMxMDJCQzQ1RDAzRTgwRTUzIDEgMUEyRjBGRDdFOTdDRTdFMzExNTQ5OTM3NzdG
RkQ2QkY1MzQ5QkQ4NzFBRDU3NzEzREZERkRGMTUzRjMxRDA3NiAxIDEzNzFFNzUyMDA0QjNBQUFDQ0ZCNzdGMTBEN0VGMkUxRDQxODQ3ODQ5NDdDNTJGMDE0Qzc3MThDODhGM0IyQkEgMSAwMzg1RkJERDg0MkVGRTdBRTU1MTg0M0U2MkZCMjZBRjQwOTczREFFMzMzNzM1NTAxRTgyMDFEQkY3RDUyRUQ2IDEgMUZEQUYyOUU2M0UyNkIyMUFFNTVBNzlFQzY5NkEyRDNENjRGMzg4M0M3NTE0Qjc4QkE1RDdFREY1ODY4RDY1MiAxIDA5N0Y5RUEwRjAwNzBBRTM2MjI4RTk0MTA2RjlGOTlCOUEyNkMwNzdFRUVBNDlFMDc2MUY0NTA4NjFGMDlENzggMSAxNzQ3RTA4RTFDRUJDNkE4MzVBMThGRUQ1RjBEQTAyRDZDMzEwMkE0QTg1NTk1NEMwMUMwODYzREQzQjU4OEJEIn19LCJ0YWlsc0hhc2giOiJEcmRyUnF0TFhlR00zcUJxUFlMOFhudXRzejdWZlRaQWJBc3FEOHIxNnpBYSIsInRhaWxzTG9jYXRpb24iOiIvaG9tZS9zaGFhbi8uaW5keV9jbGllbnQvdGFpbHMvRHJkclJxdExYZUdNM3FCcVBZTDhYbnV0c3o3VmZUWkFiQXNxRDhyMTZ6QWEifX19+DmXAKeHRzZDdXSjMzYlNWRGtuY1BoYmc6OgVFz/eib08qasn8S1ZNZiSD/AONtILOC+9K3g+/tptHP4cYCAoGmQUewQJA2qlDXkc3dcXQlWBoAwbqM8aVru9NcZj2RZoBkTxTckUGPdX6hdNFVA8GWEMA7jxN9EBrLFiygYA+0QoM0TcJYpz6NylBTu9Ab1CIOBjhAtjsXoVpZcYCuIRpcvgICAgICAgICAgICA+QIRoGD/1VWsyAYU+EzD962o9WHmYIRQysQ+eqPbZTLZVWlLoC9AXLV1izOClZkYBDQE+9y78BrHxQjb2/NrHRKpBBrzoJf5NGNgcvg2+E+Pjx1iNrcvinjXzoICSyj01gXNu/ARoBxLS6JMa60ayjQlBiwaHDR7EUGlUz+LLxLYf6iickrSoESvAV/Hb5DxDt9llR6ztA/wJKCTdUQsOB84PhTmvMf8oGEcbKR9tVk4sAPJqpqhqdTLbWMRDIFOLVdKms9EEirgoEXmTk+089JYSnZyN0tnvCf2bvZPZEGoEpl6yPguVke7oOb1REtn67/M5OIhK5MxSg34VnIENd+fIngK/xZG4YChoKpEIqvbkIyRLane9YIQDy0GSPasHHlhkoszKPd71wocoJ5pG5Q6sq2qd/K5axjMZwhs1bEZFkqWhCAHR7ZVTp06oPAU/VvQ3wYNzk57cr2CHlSjujNwo4/tD6syNNJ+mTlpoEVvRMV9URJs2mj9MPpGAsS/IcHb5tax/Xfj0H7Po6lxoCcgC1uN+hW/NQzXEH1FBHbexWmqE4t7pRnLwGamNjy/oGyQppe8w/64wJ3n09kbll2ynKoy6dg+crVDfEvuSO5ioK58e8GOd3F42b0mEP0sLMDvzWnaLWodLAYvfw6RFY+3oMrvlf20M89qxsgBUnP2Dq69huQGRcrxOVhROpX1n9n5gA==", + "root_hash": "D6bik2u1PqTXfinfCDRtBesQufxTkDhcgHv6YkHwhKqV", + "multi_signature": { + "value": { + "txn_root_hash": "4nAzsTEHFLY37MhkroxNrQtqp3keu2xctYSsF3LW1oLq", + "timestamp": 1632558690, + "ledger_id": 1, + "pool_state_root_hash": "NCGqbfRWDWtLB2bDuL6TC5BhrRdQMc5MyKdXQqXii44", + "state_root_hash": 
"D6bik2u1PqTXfinfCDRtBesQufxTkDhcgHv6YkHwhKqV", + }, + "participants": ["Node3", "Node2", "Node4"], + "signature": "RNDa6r8TJrNkrfuSsj4DhXFscRTvV5JVZ5LtLCKKuJ2rgGQMS68HB15zCn6bscFdTMkskwd1Fu172oa8ub6dc5xGKr5s2mMDFkGHL3WiML4oeUvMn8eZ4Fz99SXySK8g8DbX7SrVRbnJyBuiLbSXY1rwtGH9XvRKfAVNbbqpV3vG6T", + }, + }, + "txnTime": 1632558675, + "identifier": "2PRyVHmkXQnQzJQKxHxnXC", + "seqNo": 383, + "data": { + "credDefId": "UZxtsd7WJ33bSVDkncPhbg:3:CL:381:tag1", + "revocDefType": "CL_ACCUM", + "value": { + "publicKeys": { + "accumKey": { + "z": "1 170BFB9EBADDA2CDB851BB30BE728FB7ADF7E3CD5B2F99AB2F9335A853D6328E 1 1447A7743E683AF7BD74984F23BC4A05DFCAE20322AFAE3179B004E715CEBAD0 1 1EC80064039BAC87EF9ED8BEA39D10ABCBA5CEFE5B06C530F0F0B80E53DBFAA4 1 0027C5DCCD96E3AD427D6F7A5E016D1947BAC13A887FFD4E779E9999BAACA710 1 04E0A478CC77F0CD31871E26AD3A71B41925D72578DF21A8DBAA80CA6D4ACBA0 1 0573FEA96B162577E82B15B4736F8BD3EA559714E271B2FC102BC45D03E80E53 1 1A2F0FD7E97CE7E31154993777FFD6BF5349BD871AD57713DFDFDF153F31D076 1 1371E752004B3AAACCFB77F10D7EF2E1D4184784947C52F014C7718C88F3B2BA 1 0385FBDD842EFE7AE551843E62FB26AF40973DAE333735501E8201DBF7D52ED6 1 1FDAF29E63E26B21AE55A79EC696A2D3D64F3883C7514B78BA5D7EDF5868D652 1 097F9EA0F0070AE36228E94106F9F99B9A26C077EEEA49E0761F450861F09D78 1 1747E08E1CEBC6A835A18FED5F0DA02D6C3102A4A855954C01C0863DD3B588BD" + } + }, + "maxCredNum": 16, + "tailsLocation": "/home/shaan/.indy_client/tails/DrdrRqtLXeGM3qBqPYL8Xnutsz7VfTZAbAsqD8r16zAa", + "issuanceType": "ISSUANCE_ON_DEMAND", + "tailsHash": "DrdrRqtLXeGM3qBqPYL8Xnutsz7VfTZAbAsqD8r16zAa", + }, + "tag": "tag1", + "id": "UZxtsd7WJ33bSVDkncPhbg:4:UZxtsd7WJ33bSVDkncPhbg:3:CL:381:tag1:CL_ACCUM:tag1", + }, + "id": "UZxtsd7WJ33bSVDkncPhbg:4:UZxtsd7WJ33bSVDkncPhbg:3:CL:381:tag1:CL_ACCUM:tag1", + }, +} +GET_REVOC_REG_REPLY_A = { + "result": { + "identifier": "2PRyVHmkXQnQzJQKxHxnXC", + "reqId": 1632557041908469400, + "data": { + "seqNo": 256, + "value": { + "accum": "21 
125798F3C7741D1000FC7C73E77590E07CD7506620B1C809A3AEEA3BFD53E617B 21 13B3555A2847CEBC534A318292717B438D6220910E92E83C99CE57FEDBECDB832 6 5A69F6ACBB535F55F36B45C965BE9901042521BB2362AB87365B1FCB5B4D234C 4 34399006361DD42863054774B9E3947D86D9DD42A1323D92501C80D0BD9F84F0 6 595D71802C6CD0069C354B7CEF18CC53DC1D14096B0C386AAC5DFDD146A72D5A 4 03FD00736B09E9DB8E0061E6185830C09881AA69A4F4026AC82A059A27FFF99B" + }, + "revocDefType": "CL_ACCUM", + "revocRegDefId": "GUTK6XARozQCWxqzPSUr4g:4:GUTK6XARozQCWxqzPSUr4g:3:CL:249:tag1:CL_ACCUM:tag1", + "txnTime": 1632557039, + }, + "seqNo": 256, + "revocRegDefId": "GUTK6XARozQCWxqzPSUr4g:4:GUTK6XARozQCWxqzPSUr4g:3:CL:249:tag1:CL_ACCUM:tag1", + "type": "116", + "timestamp": 1632557039, + "state_proof": { + "root_hash": "A5FT4MFfTxxoUsbgGDKLpYCd9CNhFb6iBuP2QCCtxosv", + "proof_nodes": r"+QfW+FGAgICgm7mXdRvlzGPXvCl5OvS2xc5IX17AR6zqMpJkd6eEUr2AgICgFkd9DZPfLnL3P/3NWm0axqubLDLipM5/0JpMLVbaLEGAgICAgICAgID4cYCAgKA86FxQZV+O5/BDzA4/pfSMUBRRuFX/U/n8R1shJ97BvKDgdb1zIBodC1qqEiJsJr5CIiCup6f/tpRdhpptDJlMNKBWFjuqHtJQoq0P9JyrQ3zpDo2pFXIynSXU0vHpZsFeUICAgICAgICAgICA4hqgU3k4J1pZQi2kD46TlZxX5VZc/FjwS2JFnro5RpH/UOH5ARGAgKAuX6XZPMXEZXsQK+T4CucnVkEo+fY2GS4haZpTw7nNNqAhNp9lgiuuYB49ov+3hrw0cmYQrl/cW9vt61TgJPecAYCgvQQxg9DCA2FcSy9zbYvZz52MJgCVZk+GbMA5vFZ1MSSgY8XAseKcY68ke8PA9DWqyP1IgdnEVhqUBueWXMZ4DfigXkn/D/Gw6GcMUN0GztAhknly2C9ByRXOeMebFPQRWQ6AoNBq1Hdzxp5GVqa1cYSIIFSQzxMPJ1xk4WPOL3h4ymNmgKC/EIAumu/LD50YY3W1ErcKpTz+Ezhqz8mAuBQGLTfwPaCzATvn41wmkLyXA6rYaNuxGQOWzK9u/iBYE42Mrpn/h4CAgID4UYCAgKCpjQAzddWol055/PRfiwVKE2gseUL9V4oPe9q2RaVQkICg76VmfWOJMCKRhiINniq1mLlw2IDO7g5Efx3nU7V7T2SAgICAgICAgICAgPixgKCj/vSgDeFFz3YG+6t5cmUVyXtZTN+tuVLNZbqteoWg+YCAgICAoPjXN8xqipYO28WwNnC9t2vnRHFuYInYc0buGeQkZmWYgICgcOpz+zCYt0KoTi/vRyXuxhMCRUXVzvOXHpgU6sJ2DI+AoHqwsByALFlKmsj9egWzi89hYXKIU/jpUa1uZ06W7tpLoONasq5SvVmENqJI6ApXJ2065o6+QvMTYPJtJcp1SsW5gICA+QK8uEo1VEs2WEFSb3pRQ1d4cXpQU1VyNGc6NDpHVVRLNlhBUm96UUNXeHF6UFNVcjRnOjM6Q0w6MjQ5OnRhZzE6Q0xfQUNDVU06dGFnMbkCbfkCarkCZ3sibHNuIjoyNTYsImx1dCI6MT
YzMjU1NzAzOSwidmFsIjp7InJldm9jRGVmVHlwZSI6IkNMX0FDQ1VNIiwicmV2b2NSZWdEZWZJZCI6IkdVVEs2WEFSb3pRQ1d4cXpQU1VyNGc6NDpHVVRLNlhBUm96UUNXeHF6UFNVcjRnOjM6Q0w6MjQ5OnRhZzE6Q0xfQUNDVU06dGFnMSIsInNlcU5vIjoyNTYsInR4blRpbWUiOjE2MzI1NTcwMzksInZhbHVlIjp7ImFjY3VtIjoiMjEgMTI1Nzk4RjNDNzc0MUQxMDAwRkM3QzczRTc3NTkwRTA3Q0Q3NTA2NjIwQjFDODA5QTNBRUVBM0JGRDUzRTYxN0IgMjEgMTNCMzU1NUEyODQ3Q0VCQzUzNEEzMTgyOTI3MTdCNDM4RDYyMjA5MTBFOTJFODNDOTlDRTU3RkVEQkVDREI4MzIgNiA1QTY5RjZBQ0JCNTM1RjU1RjM2QjQ1Qzk2NUJFOTkwMTA0MjUyMUJCMjM2MkFCODczNjVCMUZDQjVCNEQyMzRDIDQgMzQzOTkwMDYzNjFERDQyODYzMDU0Nzc0QjlFMzk0N0Q4NkQ5REQ0MkExMzIzRDkyNTAxQzgwRDBCRDlGODRGMCA2IDU5NUQ3MTgwMkM2Q0QwMDY5QzM1NEI3Q0VGMThDQzUzREMxRDE0MDk2QjBDMzg2QUFDNURGREQxNDZBNzJENUEgNCAwM0ZEMDA3MzZCMDlFOURCOEUwMDYxRTYxODU4MzBDMDk4ODFBQTY5QTRGNDAyNkFDODJBMDU5QTI3RkZGOTlCIn19ffkCEaDOwxkQflyh3rPv05IuHR35rR8kNBklo7B8Xjwg/xum06DLcDzsF2f9AxLo7D/7ncCg8Hqjjg1zK4zJz+iKCCHCOqDLCdeuKR+DdMTxWUikJdQHF5pq0KWRweppeZlL6HQfp6Brm58NtOuSySponWh69uhxMuDlzXaqddKs7ZYtbZe6MKB0gTWw52pzmv/oW3GZqkd7FOIPA6qbdKBjIzpk6gnMzqDC6fXYwIazimJdS4yosnhr0QH0pE/3rOW3EtlEGOSNRaB6i5J5LHEXY6QnMIZ4S/y3UUzZrfo4w1RKONiuB5v5FaDm9URLZ+u/zOTiISuTMUoN+FZyBDXfnyJ4Cv8WRuGAoaCqRCKr25CMkS2p3vWCEA8tBkj2rBx5YZKLMyj3e9cKHKAvRu4rSiR2J09faSaGSylJwHT14aTT1/jT1kzK8KXSgaDwFP1b0N8GDc5Oe3K9gh5Uo7ozcKOP7Q+rMjTSfpk5aaAuBT1ZgO78ywP4X6oYNc8A2Nt5seAzQ57+QjxbMjCeJqBOalu7RBpHCzxJTRMOttrUDaldR6iKDEnOUj29P4g346BOa63G23RPo2P5XYSa5jN/YKMkgNh6mg175A2LZ7eMpKCqzNO6E9rcaSgxIWXnKpQDTPhT/CASyiEVtxek7oIEb6BTjK7Ir/DtpJxBWw7O0F+U6m7u55QfEgh8OFdxFTVE74A=", + "multi_signature": { + "value": { + "txn_root_hash": "4NVqX2SZkr3iRbvV4147gWbd7KWTXW7cuDF1Cf5KRft6", + "state_root_hash": "A5FT4MFfTxxoUsbgGDKLpYCd9CNhFb6iBuP2QCCtxosv", + "timestamp": 1632557039, + "pool_state_root_hash": "NCGqbfRWDWtLB2bDuL6TC5BhrRdQMc5MyKdXQqXii44", + "ledger_id": 1, + }, + "participants": ["Node1", "Node4", "Node2"], + "signature": 
"Rb1J2dbyCbUNKTwXqMGriPQNjsCXnZ389MhUFjxpaXkWETFViYip5hcsmaqf1nBwynNrwscgrnqrvLdUuSkSFTh1T44XKWEbxRuNkqEfrwzHN8gJRfgC76ZJ6LauouFxupfTKDVAQmp5VFNhheLWMcb3kYDsradM53JFTLEpSQiy7D", + }, + }, + "txnTime": 1632557039, + }, + "op": "REPLY", +} +GET_REVOC_REG_REPLY_B = { + "op": "REPLY", + "result": { + "revocRegDefId": "72u1sXrVKn1tfa4kYr7sTV:4:72u1sXrVKn1tfa4kYr7sTV:3:CL:265:tag1:CL_ACCUM:tag1", + "reqId": 1632557306220425900, + "state_proof": { + "proof_nodes": r"+QhJ+FGAgICgVXhXibbo/CZvOCYtSZE2ING4I58Mw2HYV+RiMKEit1aAgICgFkd9DZPfLnL3P/3NWm0axqubLDLipM5/0JpMLVbaLEGAgICAgICAgID4cYCAgKBsrgWLCi7GSYl3p0wrfVfId/gJM45+ST3hC2fUmQaDWKDgdb1zIBodC1qqEiJsJr5CIiCup6f/tpRdhpptDJlMNKBWFjuqHtJQoq0P9JyrQ3zpDo2pFXIynSXU0vHpZsFeUICAgICAgICAgICA+FGAgKArYXa4Le86H9mT3z75T0f1K/UQFPDPdiRJyX0Ja8wooICg+GxxpAZXoFNWNpME6ADhkQYqweeq4pj800Qj3g3kNiyAgICAgICAgICAgID5Ary4SiB1MXNYclZLbjF0ZmE0a1lyN3NUVjo0OjcydTFzWHJWS24xdGZhNGtZcjdzVFY6MzpDTDoyNjU6dGFnMTpDTF9BQ0NVTTp0YWcxuQJt+QJquQJneyJsc24iOjI3MywibHV0IjoxNjMyNTU3MzA0LCJ2YWwiOnsicmV2b2NEZWZUeXBlIjoiQ0xfQUNDVU0iLCJyZXZvY1JlZ0RlZklkIjoiNzJ1MXNYclZLbjF0ZmE0a1lyN3NUVjo0OjcydTFzWHJWS24xdGZhNGtZcjdzVFY6MzpDTDoyNjU6dGFnMTpDTF9BQ0NVTTp0YWcxIiwic2VxTm8iOjI3MywidHhuVGltZSI6MTYzMjU1NzMwNCwidmFsdWUiOnsiYWNjdW0iOiIyMSAxMzM4MkFEQUFFMTVCMUY5MTREM0I2QjY0Mjk1NjQ0NTIxRjkwNzMxMTc5MEIwREYyMDNDMUJGQzUwNEYxQzJFRiAyMSAxMkJCNTU3NzA4N0QzREM3NjIyMjgyOTFEM0Y0QkVEMEI3QUU0QzlCQjQ4MUYyNTNEMjdBMzBFMzk5MTUwOUFFOCA2IDcxMEMyOUY0NzdGNzUxQjZFMzAzQkRGMkUzQTlERjA3NTM2QTI3NkY2RjkwQTlFNUNFMTREMzkzOTlDRDFEQjkgNCAwNzkyOEY0RjZGRUEwMkVDODg5QjlDMzE1MkNFREM4M0MyNDZDNkEyRkFBRjdERjM2NkQxNTRCNUNBNTQ0QTkzIDYgNzFBMzYzOUZGODU0MzE3MTZGNUI2REVENkU2MzU2MENENTIxNTA1NjdDN0Y3RjhCNEQxNjQzNzM4NzdCOTQwRSA0IDFCOTExRjdEN0JEQkI2MzZBNkI0MTQwRDZFMzA0RTdDOERDRkVEMzQ2NkYxRTM2QjRCMDRERDUzMTA1Q0Y4ODcifX19+HGAgICg3FYmgs8JyiC8Jizvk2ln1rEAuZunkudnRsWjl8BVl5ega4lR3E/3KgHiKHikQhvTvWzXC/1tY18HAXHtYFbQJCaAoENrCAgOrxKDaQ6m8l77L9x3ZA2V30znMRd3xlH0ltCggICAgICAgICAgPixgICgEVwsB7JVD48sXgXuqO7zicbb2KUTEDS1059LV4E7G/2gZpviTBNKphSyCWUg
dohQ8/8OYF4LT/zF2IVeQgTyg0aAgKAcbA8kAyJG1wZASRmG+vvJvq3R7meCqF8O54LgLLq12qA9n3HVx52sE+Aj/ahvbHpE5rKPS/PlQgjQsGTmbzxDv4CgWrAgBAmqyTo/ySH6lC7Q7LxFzRFUvmYmowjvKjpo0IeAgICAgICA+QERgICgLl+l2TzFxGV7ECvk+ArnJ1ZBKPn2NhkuIWmaU8O5zTagITafZYIrrmAePaL/t4a8NHJmEK5f3Fvb7etU4CT3nAGAoE/ZszyCStx74mi6p5GUvgDnDJMxYkTTX+j4FMpFydhKoA8GFzvz4w/wzjT89r9upL0Jx3Te7dGtFix8oYQAzq3XoNKLirqowwgcKtyua2TVcM4v5TCghVy3buMmf3IhRWO4gKDQatR3c8aeRlamtXGEiCBUkM8TDydcZOFjzi94eMpjZoCgvxCALprvyw+dGGN1tRK3CqU8/hM4as/JgLgUBi038D2gswE75+NcJpC8lwOq2GjbsRkDlsyvbv4gWBONjK6Z/4eAgICA4hqgOZwQMvIr9gCrnTMJiQBccL+BCs2TCk13nLk/kZ6jBID5AhGgzsMZEH5cod6z79OSLh0d+a0fJDQZJaOwfF48IP8bptOgy3A87Bdn/QMS6Ow/+53AoPB6o44NcyuMyc/oigghwjqgS3KRzUqPqj/B3IUcHDr9EbkNt5GKeeiJHkwjmEYjD3+gY9HU+6U6iMV3eWHx13rYBWdNrqvJUnZ0roXRDBFH1ZygdIE1sOdqc5r/6FtxmapHexTiDwOqm3SgYyM6ZOoJzM6gwun12MCGs4piXUuMqLJ4a9EB9KRP96zltxLZRBjkjUWgReZOT7Tz0lhKdnI3S2e8J/Zu9k9kQagSmXrI+C5WR7ug5vVES2frv8zk4iErkzFKDfhWcgQ1358ieAr/FkbhgKGgqkQiq9uQjJEtqd71ghAPLQZI9qwceWGSizMo93vXChygL0buK0okdidPX2kmhkspScB09eGk09f409ZMyvCl0oGg8BT9W9DfBg3OTntyvYIeVKO6M3Cjj+0PqzI00n6ZOWmgLgU9WYDu/MsD+F+qGDXPANjbebHgM0Oe/kI8WzIwniagTmpbu0QaRws8SU0TDrba1A2pXUeoigxJzlI9vT+IN+OgTmutxtt0T6Nj+V2EmuYzf2CjJIDYepoNe+QNi2e3jKSgqszTuhPa3GkoMSFl5yqUA0z4U/wgEsohFbcXpO6CBG+gU4yuyK/w7aScQVsOztBflOpu7ueUHxIIfDhXcRU1RO+A", + "root_hash": "BVYp58aLvRurnM8DSScnRpiKvX27RKSFqRFpYZpn3KiQ", + "multi_signature": { + "value": { + "ledger_id": 1, + "state_root_hash": "BVYp58aLvRurnM8DSScnRpiKvX27RKSFqRFpYZpn3KiQ", + "txn_root_hash": "Bxz5VGY3eipuFtNUhf6bY8kbwbxVUXDDqQBtqT15xTwM", + "timestamp": 1632557304, + "pool_state_root_hash": "NCGqbfRWDWtLB2bDuL6TC5BhrRdQMc5MyKdXQqXii44", + }, + "participants": ["Node2", "Node1", "Node3"], + "signature": "QuJsPA4W2E7yxX3mPPFTFG1nAGUn6daEbFiQy3q9rnCCn1foSCWTUp6RJiL3BCrp1EP1STdoXtsp7hySj6M3EA9SpJsCFDVyr5o7dzeumfP2zLseiaH5vK6c8pDnnvCbdEqvC3jiNKQ9rnaY3fUfRPcjha3MExsXLRRv5iWUron1vw", + }, + }, + "txnTime": 1632557304, + "timestamp": 1632557304, + "seqNo": 273, + "type": "116", + 
"identifier": "2PRyVHmkXQnQzJQKxHxnXC", + "data": { + "revocRegDefId": "72u1sXrVKn1tfa4kYr7sTV:4:72u1sXrVKn1tfa4kYr7sTV:3:CL:265:tag1:CL_ACCUM:tag1", + "value": { + "accum": "21 13382ADAAE15B1F914D3B6B64295644521F907311790B0DF203C1BFC504F1C2EF 21 12BB5577087D3DC762228291D3F4BED0B7AE4C9BB481F253D27A30E3991509AE8 6 710C29F477F751B6E303BDF2E3A9DF07536A276F6F90A9E5CE14D39399CD1DB9 4 07928F4F6FEA02EC889B9C3152CEDC83C246C6A2FAAF7DF366D154B5CA544A93 6 71A3639FF85431716F5B6DED6E63560CD52150567C7F7F8B4D164373877B940E 4 1B911F7D7BDBB636A6B4140D6E304E7C8DCFED3466F1E36B4B04DD53105CF887" + }, + "txnTime": 1632557304, + "revocDefType": "CL_ACCUM", + "seqNo": 273, + }, + }, +} +GET_REVOC_REG_DELTA_REPLY_A = { + "result": { + "to": 1632557591, + "revocRegDefId": "KkA2eXVuUnbrcAYLXgQBhj:4:KkA2eXVuUnbrcAYLXgQBhj:3:CL:282:tag1:CL_ACCUM:tag1", + "identifier": "GjZWsBLgZCR18aL468JAT7w9CZRiBnpxUPPgyQxh4voa", + "seqNo": 289, + "data": { + "revocRegDefId": "KkA2eXVuUnbrcAYLXgQBhj:4:KkA2eXVuUnbrcAYLXgQBhj:3:CL:282:tag1:CL_ACCUM:tag1", + "value": { + "accum_to": { + "value": { + "revoked": [], + "prevAccum": "21 132752A4834056B17A4C9CF7EDE475E35E6CC4805A4DE635821933A9EFFD2800B 21 142DE001C8FB399E5F1A161D394BDCB515C4AA05057E674B5261938F7D8A58F86 6 62802355EA27DA8444332B041BF03994AD3320AC7A11CD7C0F7EF64767A43D64 4 0DEB2EB3075B3657A43857E53E01BF996D032292A9D3AD5343E7C277559B6740 6 5EB11809A3586A7A4AB93E5A67FF1145972F28F2FF1CB4777A49B7AB7C525CD8 4 1EB6F3CAB8D3BA3390F9130534CF97F2345AF330FA770B82A09C2C827E48E8A6", + "accum": "21 1354C75FB770C1F6952B08F3F6EAC48E9A456D65D1F6E4DE34EEF0EDF359CCB58 21 127CC02A5BD0936DACDE07F63875350C9A6707708253C0BC047A6D105F0900C26 6 88F617D8F807EAFE94C03506FDA3D0AA4D07B7E5297E057C32A350DDEC38C23D 4 313C36AE4190E23ECA3533BAFBE71411612CF17FE0F821512E70B8D2CDD4ED90 6 90830FCE44C7945DC2ECCE1098701917DD177E574158759277560DD6BB7A7FD1 4 1B699F8633EC6EA2DB4822A18937C6902CF8906D17B3B033E270062DD8300000", + "issued": [1, 2, 3, 4], + }, + "revocRegDefId": 
"KkA2eXVuUnbrcAYLXgQBhj:4:KkA2eXVuUnbrcAYLXgQBhj:3:CL:282:tag1:CL_ACCUM:tag1", + "txnTime": 1632557589, + "seqNo": 289, + "revocDefType": "CL_ACCUM", + }, + "revoked": [], + "issued": [1, 2, 3, 4], + }, + "revocDefType": "CL_ACCUM", + }, + "type": "117", + "txnTime": 1632557589, + "state_proof": { + "multi_signature": { + "signature": "RMjCa7rQZGZ6ocaEYBGoT4k8W8mzVemxhtuLgeEm83UxhGExgEc4GF1zXG2mLnVDVLiCutRLEXNQ7b7szmhd4FWHK5MRXAgFoCCKgbsk6p7sqeW2bsPmXtVASyksgq3vVMh8jBuCN5g5ryL2vKExaYsrfFujhf3xZL7w4juYzAo9k3", + "value": { + "timestamp": 1632557589, + "ledger_id": 1, + "txn_root_hash": "F5yJ6fRmeDxcvZvT9EGjhHu2LjjFTuiPgta7uSWEm4Qi", + "pool_state_root_hash": "NCGqbfRWDWtLB2bDuL6TC5BhrRdQMc5MyKdXQqXii44", + "state_root_hash": "DRyobLjvctGyUX39Y7ZYJiyfdk6NAALpCfRWQ7eUoVTg", + }, + "participants": ["Node1", "Node3", "Node2"], + }, + "root_hash": "DRyobLjvctGyUX39Y7ZYJiyfdk6NAALpCfRWQ7eUoVTg", + "proof_nodes": r"+Qk3+HGAgICgBcoXC0RbYsMUYvqRE0owUZbmlpvXUool2oWmCJx5+RCgNR5gPeqWvU+Z5IVoM6Mik99GRz+l6bEfwrl/uQOCZDSg+TFVSsOc2AOdPh09CRISoWf8mqLyr2cMeY1uIbQpCyGAgICAgICAgICAgPjxgKACDiFRMcoOr/g7RRzybbRfZshbYN8bvGjXeYPh7W8OpYCAgICg3p2odDvtuWk90njUZO4avwWMT9bIzvQh1pqzGZ+i806giQdCQHIwQbTF8Yk6rxrBKweZ4AIyghBdf5083kZFsRCAgKA3pjRbosEHZMdkFiHE5z+geHdbtBkWxtUeM7MIH/BFCKDycO7a8GNXIEdVXIiMpIuDxKnngzZW5iDFfs8X8+jxIqCq/jaBFN9FpYCzRd3CL3JG3RqjbnxFtELTwZ7BCy6IPKBPtN7dz/is1WzNOWwmIM2748c5qUuu8o+NY4iyqj5Kd4CAgOSCADqg5mmqpVmIA7IXkyqSe5lfKYwzAy6ixSX3PyIgf79M3iL5BIG4SyBrQTJlWFZ1VW5icmNBWUxYZ1FCaGo6NDpLa0EyZVhWdVVuYnJjQVlMWGdRQmhqOjM6Q0w6MjgyOnRhZzE6Q0xfQUNDVU06dGFnMbkEMfkELrkEK3sibHNuIjoyODksImx1dCI6MTYzMjU1NzU4OSwidmFsIjp7InJldm9jRGVmVHlwZSI6IkNMX0FDQ1VNIiwicmV2b2NSZWdEZWZJZCI6IktrQTJlWFZ1VW5icmNBWUxYZ1FCaGo6NDpLa0EyZVhWdVVuYnJjQVlMWGdRQmhqOjM6Q0w6MjgyOnRhZzE6Q0xfQUNDVU06dGFnMSIsInNlcU5vIjoyODksInR4blRpbWUiOjE2MzI1NTc1ODksInZhbHVlIjp7ImFjY3VtIjoiMjEgMTM1NEM3NUZCNzcwQzFGNjk1MkIwOEYzRjZFQUM0OEU5QTQ1NkQ2NUQxRjZFNERFMzRFRUYwRURGMzU5Q0NCNTggMjEgMTI3Q0MwMkE1QkQwOTM2REFDREUwN0Y2Mzg3NTM1MEM5QTY3MDc3MDgyNTN
DMEJDMDQ3QTZEMTA1RjA5MDBDMjYgNiA4OEY2MTdEOEY4MDdFQUZFOTRDMDM1MDZGREEzRDBBQTREMDdCN0U1Mjk3RTA1N0MzMkEzNTBEREVDMzhDMjNEIDQgMzEzQzM2QUU0MTkwRTIzRUNBMzUzM0JBRkJFNzE0MTE2MTJDRjE3RkUwRjgyMTUxMkU3MEI4RDJDREQ0RUQ5MCA2IDkwODMwRkNFNDRDNzk0NURDMkVDQ0UxMDk4NzAxOTE3REQxNzdFNTc0MTU4NzU5Mjc3NTYwREQ2QkI3QTdGRDEgNCAxQjY5OUY4NjMzRUM2RUEyREI0ODIyQTE4OTM3QzY5MDJDRjg5MDZEMTdCM0IwMzNFMjcwMDYyREQ4MzAwMDAwIiwiaXNzdWVkIjpbMSwyLDMsNF0sInByZXZBY2N1bSI6IjIxIDEzMjc1MkE0ODM0MDU2QjE3QTRDOUNGN0VERTQ3NUUzNUU2Q0M0ODA1QTRERTYzNTgyMTkzM0E5RUZGRDI4MDBCIDIxIDE0MkRFMDAxQzhGQjM5OUU1RjFBMTYxRDM5NEJEQ0I1MTVDNEFBMDUwNTdFNjc0QjUyNjE5MzhGN0Q4QTU4Rjg2IDYgNjI4MDIzNTVFQTI3REE4NDQ0MzMyQjA0MUJGMDM5OTRBRDMzMjBBQzdBMTFDRDdDMEY3RUY2NDc2N0E0M0Q2NCA0IDBERUIyRUIzMDc1QjM2NTdBNDM4NTdFNTNFMDFCRjk5NkQwMzIyOTJBOUQzQUQ1MzQzRTdDMjc3NTU5QjY3NDAgNiA1RUIxMTgwOUEzNTg2QTdBNEFCOTNFNUE2N0ZGMTE0NTk3MkYyOEYyRkYxQ0I0Nzc3QTQ5QjdBQjdDNTI1Q0Q4IDQgMUVCNkYzQ0FCOEQzQkEzMzkwRjkxMzA1MzRDRjk3RjIzNDVBRjMzMEZBNzcwQjgyQTA5QzJDODI3RTQ4RThBNiIsInJldm9rZWQiOltdfX19+QERgICgLl+l2TzFxGV7ECvk+ArnJ1ZBKPn2NhkuIWmaU8O5zTagITafZYIrrmAePaL/t4a8NHJmEK5f3Fvb7etU4CT3nAGAoMaP5qubOISuG3SGAkNQFRnpII/ZknbOHGCWlKg69B5UoFuurcxmtfVDeJuPt/1vvd4WFyXsX9ih81w+CFMlMC5xoNKLirqowwgcKtyua2TVcM4v5TCghVy3buMmf3IhRWO4gKDQatR3c8aeRlamtXGEiCBUkM8TDydcZOFjzi94eMpjZoCgvxCALprvyw+dGGN1tRK3CqU8/hM4as/JgLgUBi038D2gswE75+NcJpC8lwOq2GjbsRkDlsyvbv4gWBONjK6Z/4eAgICA+QIRoM7DGRB+XKHes+/Tki4dHfmtHyQ0GSWjsHxePCD/G6bToC9AXLV1izOClZkYBDQE+9y78BrHxQjb2/NrHRKpBBrzoEtykc1Kj6o/wdyFHBw6/RG5DbeRinnoiR5MI5hGIw9/oJOGi1JP6XQWs60HvjbJYuTgpvrg4p7cUKSeCpRUmoYwoCc5l9vDTeM07Z3x/WTHKljqRGbywLuR19BHcsu6aYjdoMLp9djAhrOKYl1LjKiyeGvRAfSkT/es5bcS2UQY5I1FoEXmTk+089JYSnZyN0tnvCf2bvZPZEGoEpl6yPguVke7oOb1REtn67/M5OIhK5MxSg34VnIENd+fIngK/xZG4YChoKpEIqvbkIyRLane9YIQDy0GSPasHHlhkoszKPd71wocoDhACnV/1miuV1ek7RzHzoJa6dOVybb6CWll7rn5d43SoPAU/VvQ3wYNzk57cr2CHlSjujNwo4/tD6syNNJ+mTlpoC4FPVmA7vzLA/hfqhg1zwDY23mx4DNDnv5CPFsyMJ4moE5qW7tEGkcLPElNEw622tQNqV1HqIoMSc5SPb0/iDfjoE5rrcbbdE+jY/ldhJrmM39goySA2HqaDXvkDYtnt4ykoKrM07oT2txpKDEhZec
qlANM+FP8IBLKIRW3F6TuggRvoFOMrsiv8O2knEFbDs7QX5Tqbu7nlB8SCHw4V3EVNUTvgA==", + }, + "reqId": 1632557591169735900, + }, + "op": "REPLY", +} +GET_REVOC_REG_DELTA_REPLY_B = { + "op": "REPLY", + "result": { + "revocRegDefId": "EKimXGooGt9o5GrPxeWqqD:4:EKimXGooGt9o5GrPxeWqqD:3:CL:298:tag1:CL_ACCUM:tag1", + "from": 1632557772, + "reqId": 1632557777495066800, + "state_proof": { + "proof_nodes": r"+Qfl+QERgICgLl+l2TzFxGV7ECvk+ArnJ1ZBKPn2NhkuIWmaU8O5zTagITafZYIrrmAePaL/t4a8NHJmEK5f3Fvb7etU4CT3nAGAoBoTvYRQeo4Rk1rDtGLQto1rlFMX6GCFzVTg7QI1YPW8oB/JIxx8ukETAY68/d6b+k1z1U4mqX9cZ6bCYgVhbWxQoNKLirqowwgcKtyua2TVcM4v5TCghVy3buMmf3IhRWO4gKDQatR3c8aeRlamtXGEiCBUkM8TDydcZOFjzi94eMpjZoCgvxCALprvyw+dGGN1tRK3CqU8/hM4as/JgLgUBi038D2gswE75+NcJpC8lwOq2GjbsRkDlsyvbv4gWBONjK6Z/4eAgICA+QK9uEsgS2ltWEdvb0d0OW81R3JQeGVXcXFEOjQ6RUtpbVhHb29HdDlvNUdyUHhlV3FxRDozOkNMOjI5ODp0YWcxOkNMX0FDQ1VNOnRhZzG5Am35Amq5Amd7ImxzbiI6MzA2LCJsdXQiOjE2MzI1NTc3NzUsInZhbCI6eyJyZXZvY0RlZlR5cGUiOiJDTF9BQ0NVTSIsInJldm9jUmVnRGVmSWQiOiJFS2ltWEdvb0d0OW81R3JQeGVXcXFEOjQ6RUtpbVhHb29HdDlvNUdyUHhlV3FxRDozOkNMOjI5ODp0YWcxOkNMX0FDQ1VNOnRhZzEiLCJzZXFObyI6MzA2LCJ0eG5UaW1lIjoxNjMyNTU3Nzc1LCJ2YWx1ZSI6eyJhY2N1bSI6IjIxIDEzOEFEQzlGQzAxN0Q0RjlGOERBMzk4NDg1MTlBOUQ0NDg3NjM4NzBGOEFFMTY3N0ZENTFBMjE2MzEyRjk4NEM2IDIxIDEzMUUyOUY4NTI5NjQ5QUZCNEI3MTM4NEI3RjAwRTc3RTA0OTVDQUExQTBEQTJCRDlGMkVDREIxRkRFNzFEQjUzIDYgODJCQjAwNTE2M0U2NzE5MUFERUNGMzZGRjBFNEI3REJBMUEyNzFGRkEwMUVFMENFN0MxRUExQTIyQTFEQjg0QSA0IDA3M0ExNTA5RTkzQkI0NjQzMTA3RUVGRUFFQUJDODhFNUM1RjFCMzVEMUUzOUQ5RjMxRkNEQTY2MUNDRkFGN0MgNiA4MTIzRkQ0NkI0MkE4RTk0OUQ1RDc0ODgzRDMyNDE3MzIyNzI4RTNBNjFDOTYxRDczMDFBRDJEMjk5MTAwNzNEIDQgMzdERTA3QTUxQjQyNkY0NjU1QUNCN0I5MTY1ODA3NDk5MDlBN0Y5RDY1NzYxMTQ2QzY4RUQyREI1RDI5Njk3NyJ9fX34UYCAgKAibpgjb8Q1c/SWJkHJQ9rBv+3Afk+p54M2ZquLQDiA/YCAgKAWR30Nk98ucvc//c1abRrGq5ssMuKkzn/QmkwtVtosQYCAgICAgICAgOIaoHGnXka8/qW6XAvTQMez42wzNqfsdkRAvEh5104KAqvx+HGAgICgbK4FiwouxkmJd6dMK31XyHf4CTOOfkk94Qtn1JkGg1igMP/Tm1vSAt2kuNr2QAiD+fXGZWKbcc/W2OvLJL31wNugfAjfB5ZNqXHs37yvH4AmSEJquXxZr2Mbwn9Zu4kL4EuA
gICAgICAgICAgPkBEYCgo/70oA3hRc92BvureXJlFcl7WUzfrblSzWW6rXqFoPmAgICgtyckR5mToLceY+Rv4k9cwj0iQabCgYvu19es3oB9njegKhN+l9aC92daZBub5wpY7U3dreMDTCL1QocbkUlYUo+g+Nc3zGqKlg7bxbA2cL23a+dEcW5gidhzRu4Z5CRmZZiAgKBw6nP7MJi3QqhOL+9HJe7GEwJFRdXO85cemBTqwnYMj6CLgTqGOCelg5wTak+54se+NgZnvQo1WBdc6nM4K1PVAqB6sLAcgCxZSprI/XoFs4vPYWFyiFP46VGtbmdOlu7aS6DjWrKuUr1ZhDaiSOgKVydtOuaOvkLzE2DybSXKdUrFuYCAgPkCEaDOwxkQflyh3rPv05IuHR35rR8kNBklo7B8Xjwg/xum06AvQFy1dYszgpWZGAQ0BPvcu/Aax8UI29vzax0SqQQa86BLcpHNSo+qP8HchRwcOv0RuQ23kYp56IkeTCOYRiMPf6BPfbg7TreuZ6QmLdvq9BOLgliTo0F2+J2YpZ9tRdNAVKBvZtIkKBKP3u9KDUjZ/G5onPumxkgGOO2Ml2lS+NJLvqAXx6ou2vz0A6sOvGi9YUNMeorlErNIMKA7wr7P6IuKvqBF5k5PtPPSWEp2cjdLZ7wn9m72T2RBqBKZesj4LlZHu6Dm9URLZ+u/zOTiISuTMUoN+FZyBDXfnyJ4Cv8WRuGAoaCqRCKr25CMkS2p3vWCEA8tBkj2rBx5YZKLMyj3e9cKHKDdmTvNMj5pz18CcVDyWY5emr+lUGvq5tu8YKeLM5n4FKDwFP1b0N8GDc5Oe3K9gh5Uo7ozcKOP7Q+rMjTSfpk5aaBFb0TFfVESbNpo/TD6RgLEvyHB2+bWsf1349B+z6OpcaBOalu7RBpHCzxJTRMOttrUDaldR6iKDEnOUj29P4g346BOa63G23RPo2P5XYSa5jN/YKMkgNh6mg175A2LZ7eMpKCqzNO6E9rcaSgxIWXnKpQDTPhT/CASyiEVtxek7oIEb6BTjK7Ir/DtpJxBWw7O0F+U6m7u55QfEgh8OFdxFTVE74A=", + "root_hash": "Hc1ASwAU28iKXoTwgPLySzZsxcrB6RXm8b6km7LsMab", + "multi_signature": { + "value": { + "ledger_id": 1, + "state_root_hash": "Hc1ASwAU28iKXoTwgPLySzZsxcrB6RXm8b6km7LsMab", + "txn_root_hash": "HcKCAfN7ZmhoXgJwDokJcKTNtriQybUCuvr4u9TeBdhL", + "timestamp": 1632557775, + "pool_state_root_hash": "NCGqbfRWDWtLB2bDuL6TC5BhrRdQMc5MyKdXQqXii44", + }, + "participants": ["Node4", "Node2", "Node3"], + "signature": "R1ftxrdsmR5ZkuT9p3v44HMt7ET2SzkuxwY632EU3B9CEUKdamKkCuyFqbNF99s9Cit6x5TVufHnNPHiVXJD3YShBaFTrECpmXZjBkc2aEEPNZiBk1SSMnFJuWPmsL3hZy8EMVxGbBkyFyjFR3hhg8QbPfsTGKt7YiWf51WTdjvnw9", + }, + }, + "txnTime": 1632557775, + "to": 1632557777, + "seqNo": 306, + "type": "117", + "identifier": "GjZWsBLgZCR18aL468JAT7w9CZRiBnpxUPPgyQxh4voa", + "data": { + "revocRegDefId": "EKimXGooGt9o5GrPxeWqqD:4:EKimXGooGt9o5GrPxeWqqD:3:CL:298:tag1:CL_ACCUM:tag1", + "stateProofFrom": { + "proof_nodes": 
r"+Qfl+QERgKCj/vSgDeFFz3YG+6t5cmUVyXtZTN+tuVLNZbqteoWg+YCAgKBeKlyiN0qV5YjjZ0tuNk14xdmVOTBt0LjCqS0hlxtFtaAqE36X1oL3Z1pkG5vnCljtTd2t4wNMIvVChxuRSVhSj6D41zfMaoqWDtvFsDZwvbdr50RxbmCJ2HNG7hnkJGZlmICAoHDqc/swmLdCqE4v70cl7sYTAkVF1c7zlx6YFOrCdgyPoIuBOoY4J6WDnBNqT7nix742Bme9CjVYF1zqczgrU9UCoHqwsByALFlKmsj9egWzi89hYXKIU/jpUa1uZ06W7tpLoONasq5SvVmENqJI6ApXJ2065o6+QvMTYPJtJcp1SsW5gICA+HGAgICgbK4FiwouxkmJd6dMK31XyHf4CTOOfkk94Qtn1JkGg1igAMwb56rOQQzAvqI/dHd8o/XBeC1izsMNBHyqACbsSlygfAjfB5ZNqXHs37yvH4AmSEJquXxZr2Mbwn9Zu4kL4EuAgICAgICAgICAgPkBEYCAoC5fpdk8xcRlexAr5PgK5ydWQSj59jYZLiFpmlPDuc02oCE2n2WCK65gHj2i/7eGvDRyZhCuX9xb2+3rVOAk95wBgKAZqIvZnsIegY+A0la9LCEpdWFZcroQgyxrWx3J21Ae6KCRIyvON/LhTWhkbxT4zvnEiXRfga3jtob+supFYukCuqDSi4q6qMMIHCrcrmtk1XDOL+UwoIVct27jJn9yIUVjuICg0GrUd3PGnkZWprVxhIggVJDPEw8nXGThY84veHjKY2aAoL8QgC6a78sPnRhjdbUStwqlPP4TOGrPyYC4FAYtN/A9oLMBO+fjXCaQvJcDqtho27EZA5bMr27+IFgTjYyumf+HgICAgOIaoPP3f+RKePlgmPgsSLaKacGIVaHaqaATZJEQAFV6MBL++FGAgICgFj+lR2t0QPj7wyeR0Mpx+bfc0c6nUqBb2kJ6tTpLd1uAgICgFkd9DZPfLnL3P/3NWm0axqubLDLipM5/0JpMLVbaLEGAgICAgICAgID5Ar24SyBLaW1YR29vR3Q5bzVHclB4ZVdxcUQ6NDpFS2ltWEdvb0d0OW81R3JQeGVXcXFEOjM6Q0w6Mjk4OnRhZzE6Q0xfQUNDVU06dGFnMbkCbfkCarkCZ3sibHNuIjozMDUsImx1dCI6MTYzMjU1Nzc3MiwidmFsIjp7InJldm9jRGVmVHlwZSI6IkNMX0FDQ1VNIiwicmV2b2NSZWdEZWZJZCI6IkVLaW1YR29vR3Q5bzVHclB4ZVdxcUQ6NDpFS2ltWEdvb0d0OW81R3JQeGVXcXFEOjM6Q0w6Mjk4OnRhZzE6Q0xfQUNDVU06dGFnMSIsInNlcU5vIjozMDUsInR4blRpbWUiOjE2MzI1NTc3NzIsInZhbHVlIjp7ImFjY3VtIjoiMjEgMTJDMERGMzM4ODA1MzBGM0M2Mjk3NzYzOUU2Q0VCMDE0QTAzMjlCMzY0NEFBMDEwOEE4OEJEREQ5REFGMjBDRkYgMjEgMTI2MjZCQzM5NjM0MkFEMTM1RTVDOTY1MEQyQTdBNDNBOUJERjI4NDMwREIxMTg5MTU2REE5MEE0QzZDMDM3REIgNiA1OTZBQjMxMzBDRkI0ODRDRTFEMEM5MzY2ODgzM0Y3OUQ2MTBBRkJDMkYyQTYzQzE2RkZERjRCMDREN0NDMTFDIDQgMDlGOUFBMDAzOTMxMkI5MTkzREZCNUFFQUYwN0Y4RDlGQUE3NUUyNUNCNUMyRDQ3MjVFODYzRjdCRTRENzlFNCA2IDdCMzBGRkFDNEIxRkI4MkNDOUM0OTIyNkFGOUMwMDMyMDEwOTE0NjE4NUE4RUY2OEQ2OTVGQTNBQzEwNUYxNzIgNCAzMzBGQjRDQjE2ODYzMTkwMThENEYwMDNDMUIzMjk2N0U1NTMwMjFCNTZBRUFERTBBNzU4NTBFMTk2NEE3Q0EzIn19ffkCEaDOwx
kQflyh3rPv05IuHR35rR8kNBklo7B8Xjwg/xum06AvQFy1dYszgpWZGAQ0BPvcu/Aax8UI29vzax0SqQQa86BLcpHNSo+qP8HchRwcOv0RuQ23kYp56IkeTCOYRiMPf6C7pwiu/E4GH2tyBg8SrPzfo02awRF9LVq1zbPKTfgVsaBvZtIkKBKP3u9KDUjZ/G5onPumxkgGOO2Ml2lS+NJLvqAXx6ou2vz0A6sOvGi9YUNMeorlErNIMKA7wr7P6IuKvqBF5k5PtPPSWEp2cjdLZ7wn9m72T2RBqBKZesj4LlZHu6Dm9URLZ+u/zOTiISuTMUoN+FZyBDXfnyJ4Cv8WRuGAoaCqRCKr25CMkS2p3vWCEA8tBkj2rBx5YZKLMyj3e9cKHKDdmTvNMj5pz18CcVDyWY5emr+lUGvq5tu8YKeLM5n4FKDwFP1b0N8GDc5Oe3K9gh5Uo7ozcKOP7Q+rMjTSfpk5aaBFb0TFfVESbNpo/TD6RgLEvyHB2+bWsf1349B+z6OpcaBOalu7RBpHCzxJTRMOttrUDaldR6iKDEnOUj29P4g346BOa63G23RPo2P5XYSa5jN/YKMkgNh6mg175A2LZ7eMpKCqzNO6E9rcaSgxIWXnKpQDTPhT/CASyiEVtxek7oIEb6BTjK7Ir/DtpJxBWw7O0F+U6m7u55QfEgh8OFdxFTVE74A=", + "root_hash": "Di5KHhwYEvJPysprH4RkyaHFBvfkSuFTd5fQBGrvcKfE", + "multi_signature": { + "value": { + "ledger_id": 1, + "state_root_hash": "Di5KHhwYEvJPysprH4RkyaHFBvfkSuFTd5fQBGrvcKfE", + "txn_root_hash": "CPsx5Mv6DG7Keg7AjTzf5GNvWhMHgm1V1PY1ATMft6Hj", + "timestamp": 1632557772, + "pool_state_root_hash": "NCGqbfRWDWtLB2bDuL6TC5BhrRdQMc5MyKdXQqXii44", + }, + "participants": ["Node1", "Node4", "Node2"], + "signature": "QpgqdTZ7zLRxaZfBsb4j3hGgtSYUyuTP8cknmcjRKSbWaqzohRCsDoHggLmvJ1XBv3aetqt8koJAFw4zeVnVyEKn8qY1FWjJSmqMkUEwT93UXxY3HFNRZzimKTJ9jXW7dkNNuodcrNgJV7aL4GmSHuhvu2Qeujp8GKybpEdr9dugNb", + }, + }, + "value": { + "revoked": [1], + "issued": [], + "accum_to": { + "revocRegDefId": "EKimXGooGt9o5GrPxeWqqD:4:EKimXGooGt9o5GrPxeWqqD:3:CL:298:tag1:CL_ACCUM:tag1", + "value": { + "accum": "21 138ADC9FC017D4F9F8DA39848519A9D448763870F8AE1677FD51A216312F984C6 21 131E29F8529649AFB4B71384B7F00E77E0495CAA1A0DA2BD9F2ECDB1FDE71DB53 6 82BB005163E67191ADECF36FF0E4B7DBA1A271FFA01EE0CE7C1EA1A22A1DB84A 4 073A1509E93BB4643107EEFEAEABC88E5C5F1B35D1E39D9F31FCDA661CCFAF7C 6 8123FD46B42A8E949D5D74883D32417322728E3A61C961D7301AD2D29910073D 4 37DE07A51B426F4655ACB7B916580749909A7F9D65761146C68ED2DB5D296977" + }, + "txnTime": 1632557775, + "revocDefType": "CL_ACCUM", + "seqNo": 306, + }, + 
"accum_from": { + "revocRegDefId": "EKimXGooGt9o5GrPxeWqqD:4:EKimXGooGt9o5GrPxeWqqD:3:CL:298:tag1:CL_ACCUM:tag1", + "value": { + "accum": "21 12C0DF33880530F3C62977639E6CEB014A0329B3644AA0108A88BDDD9DAF20CFF 21 12626BC396342AD135E5C9650D2A7A43A9BDF28430DB1189156DA90A4C6C037DB 6 596AB3130CFB484CE1D0C93668833F79D610AFBC2F2A63C16FFDF4B04D7CC11C 4 09F9AA0039312B9193DFB5AEAF07F8D9FAA75E25CB5C2D4725E863F7BE4D79E4 6 7B30FFAC4B1FB82CC9C49226AF9C00320109146185A8EF68D695FA3AC105F172 4 330FB4CB1686319018D4F003C1B32967E553021B56AEADE0A75850E1964A7CA3" + }, + "txnTime": 1632557772, + "revocDefType": "CL_ACCUM", + "seqNo": 305, + }, + }, + "revocDefType": "CL_ACCUM", + }, + }, +} +GET_REVOC_REG_DELTA_REPLY_C = { + "op": "REPLY", + "result": { + "revocRegDefId": "EBibQo1zAo6mt6rPEtuoSR:4:EBibQo1zAo6mt6rPEtuoSR:3:CL:315:tag1:CL_ACCUM:tag1", + "reqId": 1632557963805207500, + "state_proof": { + "proof_nodes": r"+Qnr5IIAOqCtjsPMySEJttRTU87zU4n1voi7PzeDjRDrYLVCpFEPSfhxgICAoDAC37MI4H7MSYACR1PBKNx+iB5viTFOWXdrdLN8XFqAoB4nKOAOsZtFUxMl39pGuqoDwjF+uWxAHH+Jj//La4axoE5bdLL7zlKemfVesJ1fMHKILUuxizQ67TxoKCoqU5C2gICAgICAgICAgID5ATGAgKAuX6XZPMXEZXsQK+T4CucnVkEo+fY2GS4haZpTw7nNNqAhNp9lgiuuYB49ov+3hrw0cmYQrl/cW9vt61TgJPecAaBPYX6YWHSkvPD41OtGgDtIft2eNc5RhRQTwMleM2PdyKCxuca01rsSzKiuD+L2murKydNQ0cI0W6xxPxjsyNW9b6AuDi1pQYRs2rxv0rW4V/BEp4iFLgeQFvxpxhfTy40oBKDSi4q6qMMIHCrcrmtk1XDOL+UwoIVct27jJn9yIUVjuICg0GrUd3PGnkZWprVxhIggVJDPEw8nXGThY84veHjKY2aAoL8QgC6a78sPnRhjdbUStwqlPP4TOGrPyYC4FAYtN/A9oLMBO+fjXCaQvJcDqtho27EZA5bMr27+IFgTjYyumf+HgICAgPkEfrhKIGliUW8xekFvNm10NnJQRXR1b1NSOjQ6RUJpYlFvMXpBbzZtdDZyUEV0dW9TUjozOkNMOjMxNTp0YWcxOkNMX0FDQ1VNOnRhZzG5BC/5BCy5BCl7ImxzbiI6MzIzLCJsdXQiOjE2MzI1NTc5NjEsInZhbCI6eyJyZXZvY0RlZlR5cGUiOiJDTF9BQ0NVTSIsInJldm9jUmVnRGVmSWQiOiJFQmliUW8xekFvNm10NnJQRXR1b1NSOjQ6RUJpYlFvMXpBbzZtdDZyUEV0dW9TUjozOkNMOjMxNTp0YWcxOkNMX0FDQ1VNOnRhZzEiLCJzZXFObyI6MzIzLCJ0eG5UaW1lIjoxNjMyNTU3OTYxLCJ2YWx1ZSI6eyJhY2N1bSI6IjIxIDEzODA2RTEwNDU5NEQ1RUIzMUI1M0QzN0M1Mzc3QTRBQkFDMEVEOTZFQkQzQjM1N0IzREEwMjU1QTM4N
jU0MTk0IDIxIDExNzlDODNERDA0RkFFRDAzQjZFMEVCMEI4OTdEOUQ3NjRGN0U5ODQ1NzgwRTZENDFBQUZGOTRFMjFCRkFFRUU0IDYgNkJEOTUwNzgyMEUwNUI1MEFEOTIwOTg0NEMzNTEwRDc4QjVDNURGNzMzOUI1OUE1M0ZDMUFDRUEyN0VBMDE1OCA0IDEwMzI0QjNBRjE4MTlCREFGNEUxQzg1NTZGMEZGRUJFODZDMzkzOEQ5MkEzNjMyMTBDMjdDOEM4NjM3QTNBODUgNiA3NTQyOTMxNTg4Nzk2OTdBMUMzOEJCNDM1MTExNjU2NjVGNDE0NEI1MjlDNDczMjE5NkEzNjYyOTFEQjU0QzUxIDQgMzUyMTcwNDc5QzBDMjYxNzYzQjIwOUJFMkNCRjE0RjlEOTc3OEFBREIxRTVCQkNDQTNGQTNBNTQxOTZGRDExOCIsImlzc3VlZCI6WzIsMyw0XSwicHJldkFjY3VtIjoiMjEgMTM0RjMwRjhFRkIwRUM5QjZFRkI2QzM0QzcxMjA1MTNFQjYwN0NGNzBFRDZBQUVDRjM5QTQ1RTM4N0M3N0Y0RUYgMjEgMTFDNzU2OUJFNTQxODg3NTJEQTdCOTFFQzJENjIxMkJCNEYzNTRFQjlFRTNFQzBENDA5MzM2NkI5N0RBRjQzRTQgNiA2QjYyRjMyMzJFNzNGOTA2NzNCM0JENEUxNEI1Q0MzQUZEMUYyRjRCRjJCNzAwMjQ4NjcxQ0FFNjJBRUIwQ0REIDQgMkM2OTNFNzM3NkI1ODRCRkJDNjJCQUFBNThGN0YwMEEwMzc1MzQ0NzlDNEIxOUNCMUVEMzA3NEM1OTkzRTdGQiA2IDg1MUM1NDBGMkE0OTAzQTE2NDJCRDhGRjM3QkY3NENDQTFCRjVFMkI1REEzM0MxQUYyNjQyRkU1M0REMDNFOTMgNCAzNDk1RkQyRThFOEYwMEI3QjMxQUQ3MTE4NEQ3MDQzMzlERUZDOUFDQUJBOUNGMDFFNkRGRTY1NEVBMjcwNzI3IiwicmV2b2tlZCI6W119fX3iFKAjIzRaPwUA4TYW8tcRqoEedjwhxfUwwAioK3yDbhzP+vkBEYCgAg4hUTHKDq/4O0Uc8m20X2bIW2DfG7xo13mD4e1vDqWAgICgFkxIfQI7yWJAYFNmb2xU9TQf4HpeI5VJ5ymqp+SqXPGg3p2odDvtuWk90njUZO4avwWMT9bIzvQh1pqzGZ+i806giQdCQHIwQbTF8Yk6rxrBKweZ4AIyghBdf5083kZFsRCAgKA3pjRbosEHZMdkFiHE5z+geHdbtBkWxtUeM7MIH/BFCKDycO7a8GNXIEdVXIiMpIuDxKnngzZW5iDFfs8X8+jxIqCq/jaBFN9FpYCzRd3CL3JG3RqjbnxFtELTwZ7BCy6IPKBPtN7dz/is1WzNOWwmIM2748c5qUuu8o+NY4iyqj5Kd4CAgPhRgICgzZAMudGQKIApWHp/zZ7ps+qWdQMHD1GSxcvCKHJtCdOAgICAgICAgKDtPsxSg1C/1AIJN7NOFtPLMwVe4HkMBhWQB4ynN1BvloCAgICA+QIRoM7DGRB+XKHes+/Tki4dHfmtHyQ0GSWjsHxePCD/G6bToC9AXLV1izOClZkYBDQE+9y78BrHxQjb2/NrHRKpBBrzoJf5NGNgcvg2+E+Pjx1iNrcvinjXzoICSyj01gXNu/ARoFjdN6vkwNro0BC2qHDTCEGcb30ZG/liK+gfLFlzg+SioKFKEZN66xwttI/dOZ4H2PJMiHrklvCYP7eHmwRzCS5MoBfHqi7a/PQDqw68aL1hQ0x6iuUSs0gwoDvCvs/oi4q+oEXmTk+089JYSnZyN0tnvCf2bvZPZEGoEpl6yPguVke7oOb1REtn67/M5OIhK5MxSg34VnIENd+fIngK/xZG4YChoKpEIqvbkIyRLane9YIQDy0GSPasHHlhkoszKPd71wocoN2ZO80yPmnPX
wJxUPJZjl6av6VQa+rm27xgp4szmfgUoPAU/VvQ3wYNzk57cr2CHlSjujNwo4/tD6syNNJ+mTlpoEVvRMV9URJs2mj9MPpGAsS/IcHb5tax/Xfj0H7Po6lxoE5qW7tEGkcLPElNEw622tQNqV1HqIoMSc5SPb0/iDfjoE5rrcbbdE+jY/ldhJrmM39goySA2HqaDXvkDYtnt4ykoKrM07oT2txpKDEhZecqlANM+FP8IBLKIRW3F6TuggRvoCub3LGgRCwZkLC+redNXXJVCpKrlqahzLZ+O19/+O2WgA==", + "root_hash": "7PLYsQCFkkNaeP8ybJvxaVArt1ygXMVMfeVGqqNJFF91", + "multi_signature": { + "value": { + "ledger_id": 1, + "state_root_hash": "7PLYsQCFkkNaeP8ybJvxaVArt1ygXMVMfeVGqqNJFF91", + "txn_root_hash": "GXGyAdyC5BBxQSjFZa5QJtyJD4N2b6FaQHX5xGgntYyg", + "timestamp": 1632557961, + "pool_state_root_hash": "NCGqbfRWDWtLB2bDuL6TC5BhrRdQMc5MyKdXQqXii44", + }, + "participants": ["Node4", "Node1", "Node3"], + "signature": "R5pstaiNyfkERd92bmSEBqFc17pK1NNhQDt43HSnEW8vwijqFuQEKHQEYy6ktoPJfyisKMtgMoeX1GhpfGM55zt9S8wPLvvHhmhorerJpDffStMSg2UjkJZymADtgqVWP1SkAdUkwKfAMv8EarKztTvQc31qQqX2tc5LqvephJs6d5", + }, + }, + "txnTime": 1632557961, + "to": 1632557963, + "seqNo": 323, + "type": "117", + "identifier": "GjZWsBLgZCR18aL468JAT7w9CZRiBnpxUPPgyQxh4voa", + "data": { + "revocRegDefId": "EBibQo1zAo6mt6rPEtuoSR:4:EBibQo1zAo6mt6rPEtuoSR:3:CL:315:tag1:CL_ACCUM:tag1", + "value": { + "revoked": [], + "issued": [2, 3, 4], + "accum_to": { + "revocRegDefId": "EBibQo1zAo6mt6rPEtuoSR:4:EBibQo1zAo6mt6rPEtuoSR:3:CL:315:tag1:CL_ACCUM:tag1", + "value": { + "revoked": [], + "prevAccum": "21 134F30F8EFB0EC9B6EFB6C34C7120513EB607CF70ED6AAECF39A45E387C77F4EF 21 11C7569BE54188752DA7B91EC2D6212BB4F354EB9EE3EC0D4093366B97DAF43E4 6 6B62F3232E73F90673B3BD4E14B5CC3AFD1F2F4BF2B700248671CAE62AEB0CDD 4 2C693E7376B584BFBC62BAAA58F7F00A037534479C4B19CB1ED3074C5993E7FB 6 851C540F2A4903A1642BD8FF37BF74CCA1BF5E2B5DA33C1AF2642FE53DD03E93 4 3495FD2E8E8F00B7B31AD71184D704339DEFC9ACABA9CF01E6DFE654EA270727", + "issued": [2, 3, 4], + "accum": "21 13806E104594D5EB31B53D37C5377A4ABAC0ED96EBD3B357B3DA0255A38654194 21 1179C83DD04FAED03B6E0EB0B897D9D764F7E9845780E6D41AAFF94E21BFAEEE4 6 
6BD9507820E05B50AD9209844C3510D78B5C5DF7339B59A53FC1ACEA27EA0158 4 10324B3AF1819BDAF4E1C8556F0FFEBE86C3938D92A363210C27C8C8637A3A85 6 754293158879697A1C38BB43511165665F4144B529C4732196A366291DB54C51 4 352170479C0C261763B209BE2CBF14F9D9778AADB1E5BBCCA3FA3A54196FD118", + }, + "txnTime": 1632557961, + "revocDefType": "CL_ACCUM", + "seqNo": 323, + }, + }, + "revocDefType": "CL_ACCUM", + }, + }, +} +GET_SCHEMA_REPLY_A = { + "result": { + "data": { + "attr_names": ["sex", "height", "name", "age"], + "name": "test", + "version": "1.0", + }, + "dest": "Av63wJYM7xYR4AiygYq4c3", + "identifier": "LibindyDid111111111111", + "reqId": 1632343306375185900, + "seqNo": 17983, + "state_proof": { + "multi_signature": { + "participants": ["Node1", "Node3", "Node2"], + "signature": "RBpjv3DjHyvvuAWH6CbQQcvEXPHrMZiDn4vmuT2h6q7bdL8o6wjgfBzwtkkoNHkcYfU9TWios8SinviphEvJXFhZvXjiNNP9eqmPxqJ8wc7pyCmd4GdPreA5DHbAiG56v4MDA2bQcsQhqiWZsjHQ1qFUbXkQpJevgTgg49efwaEhGb", + "value": { + "ledger_id": 1, + "pool_state_root_hash": "7siDH8Qanh82UviK4zjBSfLXcoCvLaeGkrByi1ow9Tsm", + "state_root_hash": "7ASP7xJ2AqZJRGWKNAE8NtD5tspEYV91AokHujMpBsX1", + "timestamp": 1632343306, + "txn_root_hash": "9Ff1EwxJiU9ssJZ1QoPwdJxKhNREcKjRjyvA4jCp9gPg", + }, + }, + "proof_nodes": 
r"+Qi0+GKKIDp0ZXN0OjEuMLhV+FO4UXsibHNuIjoxNzk4MywibHV0IjoxNjMyMzQxNTAwLCJ2YWwiOnsiYXR0cl9uYW1lcyI6WyJzZXgiLCJoZWlnaHQiLCJuYW1lIiwiYWdlIl19ffhRgICAoJ63TzFeZ384dQ267i+Fpw09Chq/WGB96UBNDl3p3whHgICgsubIswzpKtq5CIiuxbxiPCXrwghp7oUVvR1cirH+TeKAgICAgICAgICA+DiWAGM3dKWU03eFlSNEFpeWdZcTRjMzo6DfBbkS1aPiECG/ZBA4G+TgjK7lqGYNJa0vKG76MbGHtvkB0aCqDWMekLBqZChRm7+5DXx1qJMNm1Gv9jA4Jo+1FEh3X6BUj9DyomuKV620Wxc2WquyeZoz0Q4IKsK+ABgMNJ1//ICgDC3FLqS1EhqZ9dO/EPsJiZ2IUBkZVJwLRZhoqUr1Gpig0M03ODZYUVeWAGMhm95M/lhlETDJ51FVcGDbCNNkAzugi/+49NgCU6KKQhF1fzWOfBKNgHl7UpQVpa/luUdoBpugJbjcl1QaQBP3PeLRLiW0bf/xknMuw8kgBQgaB3ICFL+gHkFKlAnEh90L4G/MgxJ9wyhWnCKIC/FPE0LXWkQ3RTigImUKNNX2FGqNTPqA6pXv1rlItYKhbiJ6w7fj2VTWj6Cg2VP+h1FC6JYUb2TWOnrn6/enz976Mv/X0NbwuzpMkwWg06tG5A6dLwjUvEwlUNrpaHuxlJvd+tPk4V+9+63ZSIagzXkE0jvi2m8u1HEh1VF2APSizGbqn9kriAjCS8UE4fagDarGcZuWhTGaeasUa2UUhJTkMknTmGTi1pe65ygTo1mAoG6LIDtkAPY+s5DtR/iThKpnv2hytUL/ciDtz0qsnEESoOW372oCGcLGYubol6aH4ES6MvPr/6JgfBB19dgmVJVegPkCEaDGEIDItW6N94rcmxGyAp8630FveyRrWiIBVhqD09W456BiR/gG+z/beEYuTsB4fLjTIFlT2di3w+uyx2T/13h5uKDggH2ZRJNCsNAgsxs0CY240GcOqpBMTSO2ZfBJmBj3rqBbQRjPNcvDLm3XHFZvfQsoCWabMkZTTq1EG0fNUIE+4aA0hQVpqJQlm0Fai2zjRGrSEet2UGrwmQx76ePQuCn5+aAGsrfX++LgJbika+ntXT7V+EUtATk3qw5fNm7GZ4GA+qAcDYr6PKYvH3+362aENAr7x2Ly83KYbTLz1dgW6wh79aA+bkmvazhHGEqJWAJ01ZycwzmJgpxGFX0FPTDcsQVmGqAkupbK6rltT+0j+BUfcrx3Bj16emuIKTsRjbxhlRI6D6B4v6Tzch+Ahc+7oEwH35yoXttYb9grqgmJV/Vm4lRr1KCYdDFF9GYH9HAvQqMNsneomUWhBEDPXC3G9fI2L5wHJqDK/XmIKfXTO2RBLPurLR/hjyM6S7D3aAnkNePjqqEtC6CWjZfQPV5hKUU3tv/QshQpyVMEePlt2DLg5NdaWPYuk6CQSNn7y1wz5FKkq5yLUWEQEMOG6hWpWB5xhddRk2tqQaACK8tNeE1Cx8OmUdFanaz1smY2ar+jZJO+BUB4/v/Ic6CdVQsEKfbGku35kiF1nkjLRCmVj3YDo0IteF16Iurjc4D4UYCgKMzUiL1w9z2bkunw7//hdnA2OPN/PE5LeCFEvEA315SgNLSrdNN9PM7mTpnkeMIJ0QtdkIbERzDD8bo7ZA5ZVbmAgICAgICAgICAgICAgPkBcaArV7tJsNXmxr9gNTpZI0bpEZybPKMu3fIDW64+skB3HaCWdLyZjuzzOvcrQd7+neUuG41c6gqlcgThmkDveGHluKBMA0ctruW50YKPgtBIvuYDOHPCMfCZadbKEgO+1qfiXqAB7bBNP8LlxWSIbmQBsBo6ZJhP3T6ySaTJBeaN9KLMOaAlSH0vSfHxhs3KOM18kKLB3qcEsvaWhjZPRbFdFTtTJ6AoLG
ize+6YcLS3uOBzagdHCqV+qGfZmCDIeC9n7Z82W6BB9jhtrOH9KC5E/JO9aPIy2PUkDEWCfjXmKvHXcgFw9KA1f5AgCtopPzIJZ+a8OI5Pa65aFEyJ+KueMXif5kLqJaDFW0O0j+uheblj1zxi+sAfEJee1/o8scXT3bDDMNWA06ANX24OrTI04hbHUwhcFmsy3SUyqyH7PCLlHhHnYTwieqAdhWMk/9W8NuMaICkqTuRr95f2yixzm6tlpP5qg+NY0ICAgICAgPkCEaClJeAmqo7jUwHjnSApXyyxHkGH7m6SREmG7wK1LNHgEqAeEtKzL0njh2NkARbxgn6AT7DigEG0oWMk2WONJWcO3KDu7qRj9FPw0tmh7VEhlOPE1SC+ZtRi6FoOSze44d92naC+1Pd1I05AdZJryqctzQqgc1HXaqFWnlYZ7CYhVhgiNaCaydw0TSbogejEi0od3sCnnsO34UgG6nfMaoaUFdAsD6A5jNaBjKhNUwu6B9PVw11dF8VKaQmTNhTVVLwbnfhZQKA7W+pPjpyfgwIznVj3k4noo962gyj1eKF5aksqvkUsn6A7PRzUJXK8e1ZO8u5n0swL2pdY0FKkvA0gkSVNz3eg2KCjbD9/sOsb8V5PuciIvZ6AQ/3Ya9qtOzXR/PsRPbqv3KA9hceJXMmL0oGv2XeMV3kWTd63MHZg6pMLsj5WDPOgF6BbKp2y9dkYdeCcgBN4YC3hFfCZCa696GWj787D11TPEaDOMu5fx4rqL8yrU2PXUz4uf2OHfaBqojLWqCt3x+1UnaC6jjMi2vorCKUjZaDcNjnmaZ+iB0n7oiT4COcPt/pcc6C54NSiJ31IlbE4ghDk5W20u+l2HRfeE8rJTdLHZQfPkqDJyrR7n7VyIL5nFNaLjzy8dv0IQ1NUW6KMBZlmPNlU96CBpnJC77t1+zClFu5IT2y9CK9xJRRzm90p9uQyuayRKoA=", + "root_hash": "7ASP7xJ2AqZJRGWKNAE8NtD5tspEYV91AokHujMpBsX1", + }, + "txnTime": 1632341500, + "type": "107", + }, + "op": "REPLY", +} +GET_SCHEMA_REPLY_B = { + "result": { + "state_proof": { + "multi_signature": { + "value": { + "state_root_hash": "81s3UgVN47doEdWzg897EssC5aCMZ5g2bHAFHXgJyUGe", + "pool_state_root_hash": "DuhjUiR6QDsT4X3KFTGHgPnaCCTTVMhmmA8uRwkkhDwA", + "txn_root_hash": "49dVmci81excka4Ff17uUPY6NyK5Fkyb7mxKQxHXGUXT", + "timestamp": 1526055830, + "ledger_id": 1, + }, + "signature": "R5DPg3mctNrPwbzb7qCbo5LkTC2nnLj1jvuRSZDypD4FdFoH5eHp3v5vYvQaLkGfPnG9bmPwXgdhwrybsrTm9RvXj15MYxcfnBDACCjFCeAxjEKaWU6ebyJKi8UsGPiQzJVgVNaD6yvLNtvwzn6r9UhZ3wFVjWRu3M9sBfLZPCTE1m", + "participants": ["Node1", "Node4", "Node3"], + }, + "root_hash": "81s3UgVN47doEdWzg897EssC5aCMZ5g2bHAFHXgJyUGe", + "proof_nodes": 
r"+QiL+FyFIC40LjS4VPhSuFB7ImxzbiI6MjQ3MSwibHV0IjoxNTI2MDU1ODMwLCJ2YWwiOnsiYXR0cl9uYW1lcyI6WyJhZ2UiLCJoZWlnaHQiLCJuYW1lIiwic2V4Il19feIWoKS+yLqQrrlh0AHdXeEgErYxfmhpVzaJ+U4XV9IhNgFz+DmXAIb3F2Y3d1cFJUVU5rWG42QXJZenM6Cg0ey22y1Sw5R0Wr517KDVTtxr1h+dgvtZsXyDnQ5IEjL5AbGgT71lC6jsMTHl3Pa1Rj1nmP8643p6YpwARnmr8pcYPVSg0mYAZNq3qZ1kBfzZ4IkJ9JPoSY9sNt5pSiPy1Sesoy6giTaZI9nDaWnk1Z5+KCJl+Go7TXmmgTHqURkcvCaGJvmAoNyumIZbkdqKaRnLJHMvQHk66iXOKc0eO9fAg0S6IweyoKOlTCQeHHpiiRIqOWf+ZiyoE1RzGLYwYv\/ZT92bj5TUoDYsQFMS2a6ItdDw7pD7FMbhAFy50WgvpLtc5N3kUoPnoEFpsFh4\/nnPQ9a7uJuO1aOF\/0eLZZPbk61GqZrhuH1goPfPWQgWLjM3nhcoyL5\/2KsHiwGTaXLKfB7UvL5Yk3u9oAoTTzqKqJFLJXTlBzO3RoFrsMHHRV7XiX\/ZNGqkeZLdoP2ODKKueuDUgH+081Su3N5ByHvgMGaU9dg+JxJCH351oL8QgC6a78sPnRhjdbUStwqlPP4TOGrPyYC4FAYtN\/A9gKBH\/5JTxilhi+ZrkUrfkkO2KYas7GIQW+HUecA6Ng0sH4Cggh7qzc1C\/0n1Os53lZb9INU33xFXaUSGn9E8NKkFUoyA+FGAgKA8XWevrjMeCJVdfpqrFMz3kn4gCHY9zstqAaYG7QRex6Cug+aEkajVK5kcv41wZi\/8t5dMKsZLmiDeQEkYUiForYCAgICAgICAgICAgID5AXGgdHHcBnO5LERbEZl44uTuxdCRy3YCHcyvM\/iroJzNdACgz7b3Y6RqprH64dbOJiWxR13LjcOakD6+ZdicWWEXM8yg\/6deXgumdcqhse0H2bS5U411JnaRZBYIEMPp1xkFGtKgaNnmuK+WwSL9pcmIZFJI5osNECREaxf7K1eOEXc0RqKgTnhZ9UgjsuSSGru0N3GIAy8BMivi0gX\/AWH8qQDpstCgbNiTG4lUqf96EyWojSovUtrz+XUpH+8N2+QFCrKpIOygv2R1F2JXyEqh2AXalY2VldRciIPsr7EWFAabOG4RKzugyDzJ72wkDeDMP19UfN\/lfpDFo9XnSIV4iHHjgz3QU4igGQ0k1eP\/uLZvThw6V0SemmquhaNp77pRQoQwMEtkdVqgqy7KI1QOQoBZgJjBqJ\/i56QfzUF1gKUu6voQvyQf3fegLUg5S0QHBY4ldJxdLdWpVfksMNnDVoeIOk4YaoY9TsiAgICAgIDkggA6oEwv9bhu5R8nrnUt4Yx3+oB0SXXXuZBO3b28qAICCOSd7owXN0LWxpY2VuY2U6OgOFNinrJ5T+l1MMwSVWJf5WOoYjFNf\/XZ3zVKmxgUnHP4sYCAgKC0dldFCLyH+sCHRrFr5phR9NjdPw6AKv0zlZKTn3x2OaDTgNSzkxwEcmAnqqgFicXYnk+U1UR9Zr5GW\/kNcr\/QpaBSnbcCh57QsvePUsmWURFq3G2WYfLinvZvbN7djK1eJaAZ1CRb5ndo+3SSaqOyg0hBd1UVFWwvDv9WxWwXaoEz4qBt6QgRVzUf5A9QoRzs\/BqQN3HRYS9mbvGzr8i8gQBsvoCAgICAgICAgPhRgICAgICAoMmjsuJicY9FzZqlzDRPdp3\/d6e2ete+uTUuGlLZrfEkgICAgICAgKA+q5fnel\/\/ZFjKfWOn3PScO8k6Qdki79w+y2ZACYUTZoCA+FGAgICAgKC4hIGO4DPcaKYmRpzNYUAXu57xExF9yyxyGD03dg3nCoCAgKBrY\/1SacMf6LN
Q\/VBpr9p\/hAkk1AY6XrpoAxZW4aaOrYCAgICAgID4kYCgj6RlWm2C5oUBN0vbyaI+Hxh7AzbmI1SYINZ8bGj7FHygSwYrTkPDB\/SU3Yi8+UJW60leF09Xf4NDHIxnkH+XR8egHPodFdkcnKxOiLXtlpOuzM\/THPYfTQ9xWanloSw8uU6g1iRB91Rq8zT7dTIl1FIPXg0ovtW\/gktKvMVp\/XNxoOOAgICAgICAgICAgID5AhGgZBqeju\/DK9w6gsSRyeuChMfPiz3O5CJOt05Cwx6+Zq6g\/qAj00WnUVvBv214PQY\/n6Hz3ge+kdXc7bRq2zC31cCgK9Hxi\/K6rqzGHb2hV1UhJwZV9RZh9kW\/ROGpoNzP26qgWiSIFqr4VJmXkDDFgkHrnTBOnefFNXuDTl9ug1V3K86gsnM3Tnk5\/vzBW5+41I7cwvW1\/77F2bdmGlnc0xmx4TegMZvW8Kl\/KJS0Z3NItqp8zhKrxs+VwSHasFQG6ySr\/ROgPZToxrJxpSH9egAT4t\/cIXgXmeM9Lgqnk7ZWXWPcbN+g0kXS+Xw39zJ5DUOqKiMIKl3hcaVTIgnAEmke90PmIo+gMKg2q8Gla7GKLXYh5ZBAvGaZM0zm8zZEsu1WRLYePSSgqvwhtqLvuNKkBdPMSDAO28IarXiYXDhae80T+EiDK32gmO6lIx+XA5ydY7qyl\/c41qlWvyLmJIyhd71wOGS2xhmgwDrBtPzkPBdieH2gS50vDXWH7zUgksxbCX4lBbnlJV6gS9n9ZPT3Q7ICigKtf7\/6y383dDp\/junzRl9OfxsgYEKgtc5LYJgXPhA6bX7ma\/iOGn3C6DOpahza9mEQxCb5APCgUKah9cJa5x9dtgVS6rymjOoNTbrSuw+AxvrqhZgYgI6gE+UVSRl8WVc4BZyM+zE9W5OEO58lpO97\/pTn2PDkiB+A", + }, + "type": "107", + "reqId": 1526055830436897605, + "seqNo": 2471, + "data": { + "version": "4.4.4", + "attr_names": ["age", "height", "name", "sex"], + "name": "test-licence", + }, + "txnTime": 1526055830, + "dest": "2hoqvcwupRTUNkXn6ArYzs", + "identifier": "2hoqvcwupRTUNkXn6ArYzs", + }, + "op": "REPLY", +} +GET_CLAIM_DEF_REPLY_A = { + "result": { + "origin": "2hoqvcwupRTUNkXn6ArYzs", + "ref": 2471, + "txnTime": 1526055837, + "seqNo": 2472, + "state_proof": { + "proof_nodes": 
r"+TCX+FGAgICAgICgAvYLuXsAAY0sC0l3WEb54w9nGIGSozv0aj4OO9XrnluAgICAgICAoD6rl+d6X\/9kWMp9Y6fc9Jw7yTpB2SLv3D7LZkAJhRNmgID5AbGgT71lC6jsMTHl3Pa1Rj1nmP8643p6YpwARnmr8pcYPVSg0mYAZNq3qZ1kBfzZ4IkJ9JPoSY9sNt5pSiPy1Sesoy6gfBNPUfJLjh8xUMxmj8siLnL1yAzp6QFwiMDeHgiOLW2AoNyumIZbkdqKaRnLJHMvQHk66iXOKc0eO9fAg0S6IweyoKOlTCQeHHpiiRIqOWf+ZiyoE1RzGLYwYv\/ZT92bj5TUoDYsQFMS2a6ItdDw7pD7FMbhAFy50WgvpLtc5N3kUoPnoEFpsFh4\/nnPQ9a7uJuO1aOF\/0eLZZPbk61GqZrhuH1goPfPWQgWLjM3nhcoyL5\/2KsHiwGTaXLKfB7UvL5Yk3u9oAoTTzqKqJFLJXTlBzO3RoFrsMHHRV7XiX\/ZNGqkeZLdoP2ODKKueuDUgH+081Su3N5ByHvgMGaU9dg+JxJCH351oL8QgC6a78sPnRhjdbUStwqlPP4TOGrPyYC4FAYtN\/A9gKBH\/5JTxilhi+ZrkUrfkkO2KYas7GIQW+HUecA6Ng0sH4Cggh7qzc1C\/0n1Os53lZb9INU33xFXaUSGn9E8NKkFUoyA+RGbgICAoNWOkxUMviioP\/z3azO54taoWzYeWAgusSf\/SqhwzRR+gICAgICAgICAgICAuRFo+RFluRFieyJsc24iOjI0OCwibHV0IjoxNTE3MjA5MTg1LCJ2YWwiOnsicHJpbWFyeSI6eyJuIjoiMTAwODg4ODkyMTQ3NDYzNzI3NjUwNDg1NDA3NTUxNjkwNDM5ODgzNDM0NTMwMzgxMzA3MTk5MDA5OTYzMjg5OTY0OTA3MzMyODI4NzAyNTY5ODUzNjcyNDQyNTg0MDYzNTczODY1MzgyNjg0NzIyODY2MDYyMDAxMDAyODQyOTI4MzA2NjAxNjIyOTkxMzc5NzA2ODE1NjM4NzU5OTk0ODE4NzYwNjEzMTI1NDM5NTkxMjI3OTQxNDg4MjYxNTg2MzI4OTI0MTU1Mzg3MTk2ODczOTA3NTQxNzYxODQ2ODYzNDY5NTYxNDc0NDE3MTUxNjA4MjQzOTQ1MDYxMzM1OTk3MDQzMjAyNDg4Mjg0Nzc5MDU4NzA0NzY3NjUyNzYwNjMyNjE5MDQ3MTcwNDMyNTkwODYzMjAyMDE1MjQ2MzQ2NTc1NTY5MDUyMTQ4MTc0ODk2NjE3NjAxNzc1ODc4Nzg2MjY2MDc0MjI4NDU4MDYwMDg1MTEwMzAzODEzNDUxMzg2NzU2MDUyMTUxNTM5NjM1MjgyMzE0NTYzMTUwMTE3MzM4MDU5MzkyMDc1NDM1MDU2ODE5NTQ1MzM2Mzk3MDQ0ODAwOTc1NDE5MTE3MzU5NDU4NzExNDAxNjAwNDY5OTIzMjM0Njg1NTkwMTYxMzE1ODI2MzQ2ODkwMzMxMTMxNzkzMDkyNTcyNjY5MTkxOTAxNzIyMDA3NDM1ODUyMDMzNjY3MTQyOTgyMzE0NTQ3MjM5MjgxMzAxMTczNzkwNDAwODE1MTkwNDc1NjI1MjU3MDg5ODg4MDA5MzEwNTE4MTIwOTMxMTY5IiwiciI6eyJCYW5kIjoiOTA5MDk4MjI4OTY2ODc1Njc1NTQ4NjI3MjgxODE4MDM2MzIwMjQ4NDA5NzM0NjA5MzEwMDg5NzQzMTUyNzYyOTAxMjc1MjgyNTgyODY1MDE2NTU0NjgxNDAxMDg5Mjc2NTM4Mjg3ODQ4MDM2MDQ2MzY1MzExOTE4OTczNTQxMzI4MjYyOTQxNzAzMTk4NDA2NTUyMTQ5MTM4MzIwNjU2MDE3ODk0MDYxOTEyMjU3MzgxNjYzNTY3OTk2ODEzMDk4ODcxMDExOTQ5OTk4MDIzNDAzOTQ5MjgxMTc2MjA4MzY
2NjgwNjg0NzAwMzA4Mzg3ODU4NDgzMTU1NTU2ODMwMjU3MDA3MjI0ODE2ODM3MTI1NDI2Njg4ODY5MDM4NDE1OTY4MzQwMzgxNjM1MzAxODEyMzA5NjU2NTc0Njg5NDc5MDI1ODEzMzEzOTg3OTI3ODYxMTg4MTgxMjY0NTg3ODk5NjAyNDQ2OTE2MjM3MzA1NDk0MTk1NTQxMzg0MTI0MzE2MzIwMjMzMDY4MTQ1NjA4NTIxMDAyMTcwMjQ0NTAwMzI4Nzk4NjE4OTYxNzEyODQyMjU0NjEzNDQ0MDEzMDk5NjkyMTMzMjcxMjU2NzgxMjE2NDU0MjAwNTc0NTgyMTQ3NTYzNDE3NjA2MjQyMDkwMzU4MjAwOTg4MDY1MjUwNDUzMjcwMjQxMDk2NjMwMzUwNjE0MDAxODI0ODI1MTk3MTE1MDUwNTM3ODU5OTQ3MDA3MDI4NTAzNTI1MDk2MTYzMjM2NjMzMTgxNTk5ODk5Njk0ODUxNDM3OTYzMzAwMjA0MjA3MjQiLCJTb25nIjoiOTA0MzQxMTcwMTE4NDQ3OTcxMzI4ODA3NDU2MDI4Mjk1MDQzMTI3ODc0MDgzMDQ5NzY3OTAxOTk3Nzk4NjI1NTAwMzczMjc3NzU2Mzc4NjQ5MjQ1NzM0MzcwNzg1NDc1MjUxMDY1NDI5Nzk2ODUxMzcxODU0NDg5Mjk5MDg5NDExOTAwNjM5MTE3ODExNzgxMDc2MzA2OTkxMzIzNzY2NDM3MDg4MDI4MjQ5ODYxNjY4MTU0NzAyMzc5MzEwNDU1NDA3NzA4NDQ2NjQ4ODY0NzIyNzE3NTUwMjQzMDI3NTM3MjM2NzM2OTYyNDM0Njc2NTg3NjY5NDk4NTY4ODEzODE5Mzk0NTE0MDEwMjcyODQ0NzI3ODg3NzU5MTM0NDA3NjYzODU3NTM5NzQwNTMwMjgzNTUyODAyMzMzNzY1NTY3OTI1MDQ3MzIyNzM1MzE5MDM1MTk1NjU1MTUzMDIxMDE3MzYxNjY5NTY3NzIzMzIyMjgwMjMzODQ0MDAxODQ0NDk1NTQ2Mzk1MjU4NzQxMzMzODkzNDU3Mjk0Njk5NzA1NjU5Mzc1OTUyOTQwMzMxMDEzNjgxODE2NzQxMTc2Mzg3NTk1NDEyNTkxMDI5NDQ3MDYwNjc0MTY5MDY5MTg5NjQ3MjQ0MzI5NTg5ODQ0NTM5NjA3MjAxNTU0NzMxNDY0OTMwNzQ4MjEyMzM4OTg2ODg3ODM1NjMzOTgzMzIyNDYzODgzNjk5Njc3MDE5MTQ2ODk4NDE4MDQ3OTcwNzE1MTAwNDA0Njc1OTk3MzczOTE0ODU2MzM3NTM0MTE1MDUwNjE4MTYzMCJ9LCJyY3R4dCI6IjM3NzE1MjQ5NjQwNTgxMDA5MDUyNjYxNDkzMzE2NDgwMTU1MDYxMzQ3NzcyMzkyMzk0NTQ2NjEyMjE3NzIyNjc0MzY4NjgwOTYxMjQ0MTI3NjM5NDIxNDA2OTc5MjQxMzgwODU0NDYwMzgwODQ5ODAxNjU1MTk5NjgzOTU1NzgyMDkzOTYwMzI5OTUyMzM5ODg0OTY4MzE4NTA5NTY4MjU2Mzc5OTM0NDk2MjU5MDEzMzI2MDA1MjI3NDcxNDYyOTk0NTg4ODA4NzA1MjU3NDk4NDI5ODA2MzUxOTk2NTI5MjQyOTkyOTMzMjU4MDI2MDUzMjkzMDEzMTE4NzExNTAxMzM5MjY1Njg1NDMwNTAyNDIwNjMwNTMwNDQ0MDA1ODc3NzYxODU3NTQ0OTMwNDU2OTc3MjUxMTg5MDgxNDY1NTk0OTY3NDE5NzA5NTcyMzAzODQzMzQzOTc5MzY2MTM3OTU1Mzk0NTQ1MDgzNjM5NDUwOTM0NTg4NjA5MTE1NTgxNTg3ODUxNzUwMzIzMjE0NzU3OTU3ODk2NTU0ODc4MTc1MjcyNDY4NTc2NzU5NDI2NzAxNTU3NDAxODcxNDg3ODMzNTk1NTYxNDIwNzUxODkwMTg
xMTU5MTc2NTgyNzU4NTk2NzA3ODY3NDY1MzAwNTQzMDQxMjY5OTc1NjAzODY0MjE4MDE1ODgxOTMxMDQ5NjE0ODgxNzI1NzYzNjA5OTg1MTA0MjE0NjQ3MjUwNTE3MTAyMTI2MTAwOTQyNTAwNjY1ODE0MTgzMTAxMzYwMDY4OTI5NDI5NjAwNTM0NDAwMDc5NjY3Iiwicm1zIjoiNzU1OTY4MzE1NTE4NjM3OTE5OTAyMDk2NDIyOTc3NjA5NDc3Nzk0Mzk3MzE0MjYwNDEwOTQ2NTg0Nzg0ODAyNzc4OTQzNTAyMzgxODQ2MDYxODI4MzIwNTU1MDgzMzg5Mjg5MjEzNTM4MTU0NjMxNTYzMDg2MTY5Nzk0NzUxNDcwODIzMDg1MDkxNDMyNzI2NDczOTk3MTk5MDc4MTI0MTA5MzEyNTcyNDgyMzA5NTU2NzQ2NjEzMDg3MjEzMTIwMDQzODE5MDQ3NTgzNTE1NTc5NDYxODYwMDgyOTQxOTkxNDU4MzgyMjk0NDIwNTc3NjE0MDEwODY2MTE4ODgwMTkwOTc1NzUxMzE4MDk5NDg5OTUyNDE3MDU5MzIxOTk4ODQ4MTIwNjk1ODkzNTY5NzA5ODY3MjkxNjQwOTYyNTUyMzMxMzcxOTUwODk4MzQyNzIxNDM3OTc1NDMwNzA2MjUzNjIxMTY2NjAwOTk4OTc0MDczOTk0Njk5MDAzNTYzMzI1MzU1NDI1ODIxMDU3OTk0NDkzNzI1ODAxNzMxMzk3MTA1MzU1ODc3NTMwMDEwODM4MjQ3NzcxMjQ5ODA5NzcyMTcxMzM4NzkzODMyNTc2NTAzOTMwMDA4NDcyNzc3NDQ0ODc4OTAzMDk0NzM1NDYyNTQwNzAxMzAwMjU2MTIyNjk3NTAyNDM5MDA5NzE4NzY4OTA0MDc4NTc0NzYyMjU1NDk0MTAwOTE5NjM1MjgzMjg3NTcwNzMyNzgyODExOTk3NTUxNTQ3MTUzMDA3OTUxNjg5MjE3NjA4NTkxNjYzMDc4NjgzMzIiLCJzIjoiNDY2MzM1ODU3MjM3MzM3NzY5NjMwNzUxNzU4ODc0NTQyNDY1NzIyNTY4NjgzNDgzMDI2Mzk5MjUxODYwOTI0ODk1NTMxMDIxNzI3MDE1MzczOTk1MjU3OTIyOTg2NjYzMTIyMzMyMjkyMjk1Nzg1NDEzMjQ1MjkwMTMwNTg0MDY2NzY1NTk4ODkyNjkzNDk3Nzg4OTk5MTI5OTc3NjA2NzMwNjY0MzUzMTcyODU3NDE0NTU0MzM2NDM3NTgwNzI3NzExNjQ1MTEyNDkxMDEzMTE5NTMzODAzMzU0Njk2NDkwNjU1MDMwMzk0NzI0NDg3NzYzMTExNzM1MTE0NjQ5NDcyMTc1NzkwODE0NDU2NzkzNTAxMDQyMTA3MDkzNzk4NjUwODY1ODIwMTE3MTA3MjQ3MzM3MDMwNjE5MjkyMDgzOTcyNzU4NTA5Njc2NDAzNDY5MzU3ODA5NDc2MjY1NTg4NTUzMDQ1Njk1NjEzMjMxMTAwMDA1MDkwMDA2MjU2NTA2NTg4MDgzMDgyNDk0MTU0NjE5MTk3NjQ2OTExOTAzMjUwMjYzNjU1NDIzMDAyMjk5MzA2MjgzNDMwNzc0NTE2MDMzOTU3MjY0OTMwNTE0MzUyMDAyNTk5MzAyNzA3MTQ4NDMxMDM5MTk5Mzc2OTc5OTQ5MDUyMTkzOTMzMTAyMzc3MjYxMDg0MTE3ODE4NTAwNTA4MjM5MzM5MzIzNTAzNTM3ODE4NjQwNjcwMzAwNTE3MzU5Mzg5Mjk5OTkzNTM5OTQxMDY4NjAwMzYzOTM2NTQyMTQ0MTA0ODk1Mzc5OTM0MTQ4NzUiLCJ6IjoiNTM3MTU3Mzg0OTA2NDkwMjMxNjA4Mjg3MTAyNDgwNDk4NzAwNDU1NDk2ODM4Mjc1MDIzNTUyOTMzOTE2MDM0NDIyMjA5NTUxMTIxODg4Njc0NDc5MDY5MTc2NDA3NDg
wMDI4MTAwMjAyMzMwMDQ3NzU5OTI3ODAxNzY5MzU0MDg3MjA2ODk2MTM5NDg4OTE1NTQ0MjY1NjcxOTUzOTQxNjQ2NjE3MTIwMTAzNzA5MjY1NDExOTMwODc2Njc0MzM1MDUyODM5NTgzOTk5OTA1MzE1NTg5MjYyMDEyMDE3MjA5MzAwNDcxNzAyMTk5MTYzNjMxODk2NTA5MDg5OTkzMTUxMzE1MDYzODQ0MTE0NjkzOTUzNTM1MzUzNTc5MTkzOTYwODIwMDk5MjAxMzU5MTkyODgxOTQwNTAyMjkwNTExMjQ0OTA3MTY0MDU0NDg4NDU4MzUxNDI5MTcwNjQ3NDMyNjQwNDkwMzMxMjIwMjU2NzE4NjM2MTgxMDE4MTU5MDIwNjk1MzEwMTkyMjI3NjI1MjUzNzUwMTUzNDk5OTY2NzM2OTg3NzQzOTE3MjAzMDIyODAzMjc1MTQ0NjgyOTEyOTQ5OTI3MzQ4NTIzNzc5MzUwMzA3ODY1MDk5NzYxMzAyNTYzOTYxMDYyMDYxNzE4MTYyNTg2OTg4OTcyMjAzNjM0ODU1MTE4MDE0MjcwNjc1Nzg2Mjg2OTE0NDk0NjM4NjU5Njg2NDg5NDUyMjIwODI5NDQ5NzUzODQ1OTA0NTczNTM0MjIxODY0NDgzNzg0NzI5MDk3MTA1MzA1MDMzODIwNjg5MDQifSwicmV2b2NhdGlvbiI6e319feeFE6Q0w6OgEO3Lpzagw55qrC6p9ueDgp2di7LlswI6EwUd8J6Ue9f5ATGgeU3NGal5f5L1Qcp1763QvmidBWyziMqPFb7dBLMcec+gLbHvRp7a+1JB9sJM4LLYDcLvtMlNuLznkoiS3QU88Vigp0Py4qeOqdooL8DFKuiQx7vcWNx66P0LTd9\/ADu+CTugr57haEFjYQNY14GHtJ\/Lux\/8xmQD6EdRSPNScMPU3dKgVEoAt0j1776qH5J8KWLhiW\/VbyjBSUXFmpiQApAOKVyg9KvI9puI4FLojuT63klqHZRmpmj9ycjw70zMbqfECLqgVzQRGINdS9lwpP6zuze0henl\/OGw83YsPGf8HmgSQZeg9HETGJGHzr3orwkyGFW5Q76mrULAQxo\/QOeK2NUdbDiAoGG\/USK0fJ6cubNyGXQ0wXFpngJLrB5sfCKFrfege+M5gICAgICAgOIToCgqMiJYAu9FGM+EkDtDw9AIV9UqvkJn5RObBRGsZz8D+RZDMbkWP\/kWPLkWOXsibHNuIjoyNDcyLCJsdXQiOjE1MjYwNTU4MzcsInZhbCI6eyJwcmltYXJ5Ijp7Im4iOiI4NjcyNDI4NzM1MDQ3Nzc1MTIwNjY1NjU3MDk3OTAzMjk2NjcwMzMyOTUwNTg4OTEwOTk3MDEwMDY3MjU5MzYwMDIxMjE1OTcxMDEyMjUyNDU1ODg0MDk4Nzc4NDk0NjU4NjA3NDY2MjEzMjQ4ODIzNjQwMDczODE4MTk5MTgxNjQxMjAyNjU5MTE0MDk4NDkyMTM3MDE3OTc2NzA3OTYwNjA2Mjg3NDg0OTM1MDY5OTA2ODU0NDAwNzk1MzM5NDQyNTYyMTE3MDQxMjYxNDQzNDQ3MTQzNzg3MDQxMzk4MTk2MTcyODIxMzM1Njk3MDg5NjMxMTM5MDMyNDQwMTUyODA2Mzk5MzcxMjA3OTE3NTkyOTQyMTE1MTAwMjA2Njk1ODc5MjYyMTk2ODUyNjEyMzA1MjU0MzM2MjcyNjcwOTM1MTA2NDk4MjAzNTQ4MzMwMDkwOTc1MzMxMTE4ODkyNzgzNDQzMTIxMDE0MzYxNTkxODQ4OTE4MTczNDgyOTc0MjY3ODY3MTA4NTE5NTQ4Njc3OTM0NDI3OTA4NzU3MDI3NTMzMzc2NDgwMTA2NzgxOTAzMjYzODc2MTc5NTczNTQyNzY5NzA4OTc1NjU2MDQ5MDY4MzczMjUyNDI4NjI2ODI0MjU5MjU0MTQ4MzU2MTQyMTAwMzA
wODg0ODkwNTMxMjk2MDI2MzIyOTIzOTMyNzA5NjQ0NDg1NjM3Mzg4MTY1NTM2NjE2OTc4NDAwMDc0MDQ0OTgyNzU4MzQyNzMzNjI0NzI2ODc5NjAwODk1NzIzOTkzMDkyOTk0MDEwNjEyNDM4MzQ0MSIsInIiOnsiYWdlIjoiNjgzMjQ0MzEwMTczODYzNzMxNDE3MjMzMTMzNzk1ODgxMjMxMTQxMjIxNzI3MzM3ODU0OTgwMjgxMjkwOTg3NzE5MTIzMzI4NzAxNjE2MjI1ODY4MzAyMjY3MjU4MzI2ODIwODkyODI3OTc2ODI0Nzg5OTY2Mjg1Mjc0NTg1MjEyMDM5ODEwNjY1MzIxNDg1NDI4Njc5MTA3Mjc4NzY5NzM1NDA1MDE2MDQyOTgzODY5NjQ5NDEzMzEzODU5NDU1MDk0NDc5MjE1NzMzMzQ2NDAyNTcxMzE0ODA2ODEwMTcyOTg5Njc0MTE2MTU4NTE5OTcyNzg0Njg5ODIxMTQzMTgwNTM4NTI3NDUwNjIxMDc2ODU1OTY2NDU4NjA1NzQ3NDQ4MjA0Nzk0NDMwMDg1Njg1NTA3MTM3MzM2NDc5Mzg3ODQ2Mjk1MTczNTE1MjEyMjAwNDAxMTc0ODI4MTIwNTc4NDU1OTE3MjQ2OTExMjk5MDYzNTkwMTUyNTAxMTM3MDY4MzcxMTI5MzU5NzE0MTU3MzU2NjM3ODAzMjIyMDQwNTQ0NDk0NjU5NDcwMjgzMjgwMjI1MzQ5NzI1NTExODY0MDAxNjMwMTIyMjA3MTk3MTgwMzkxNzYzNTc1ODY3MzA0NTczODY3NDQ1NTcwODk0NzM0NjUzNDE1NjM0NzI3NjA4NzI1MzY1OTQ2MTE2MTgzNjUxMTA5MTk4Mzc5NjIzMDI1NTQ4MDgyMjUxMDI0MTU4ODMxODc1ODc5MDgzMTExNDEwMDQxNTAxMTMyNTQzNTAzMDA4NjE2ODQ5NzQzODcyNDU1MjQiLCJoZWlnaHQiOiIxMzkzOTM1NjY0Nzc1OTYwOTE3NDQ4NTcxOTgwMjUwMTc3MzU2NzI3NjA0OTU0ODI0NTQyNDkzNTUwNDEyNjUwMDI2NTU4NjYwODY2Mjk1MTk4MDI2MDcxODA0MDc5MTEzNDc1NTg2NDQ4NjcwMTg1MzA1MDIyNTY1MTg1NTk1NjczNzkwMTU5NTczMDk3NjI2MzUwNTA4ODU5NDM3ODM4MDA3NDI5MzQwNTU1OTE5NDc0ODIyMTc1OTYwMzU4MTg5MDU5ODA3OTExNzc4MjgyMDUyNzkyNTg2ODQ4OTkxOTYwMDY5NTEwNDMxMjI2MTgwNTk3MDY1NzMyNTI0Njk1NjA4MTIwODg3ODg2MjAwNzQzOTA1NzcxMjQ2NTAyMzMxODY3NDU0Mzg1OTM2NzIwMTM5OTA4MDE5NDU3Mjk2NjgzMzAwMzg4MTUxMjUyMDcyNTA3NDYzMzAzMzQyMDkzMDA5NTIxMTMwNTYzNTE3MzcxNzQ2NzI0MTk2ODYxMzQ4MTUxNjA2MzE1MjQ4Nzk4MDgyMDU1NTUzMzc1NTIzOTEzMzY1MTc2OTU2NTEzMzQyMDE1NjExNDc3NjU3ODE2NTQxMDIxMjIyMTg4MTk5NDkyNTg4MDYyOTc3MDAyODg5ODcwNTg1OTI2ODY2MzgzNjY2MzY0NzAyNzY3Nzc2ODQ3ODIyODA0OTA4MjE4OTQ2NTk3NzUxMTkwMjcwNzM3MzA0MTc5Mzg3OTI4OTc2NjgzMDQ2NjM2Mjc3OTE5ODMxODI1NTE5NjY1NzgzOTcwMzA5NzcwMTk1NzQzNjIxMTQzMjg2Njk0MCIsIm5hbWUiOiIyMTQ3MDY5NDEwMDcyOTg2OTc0NDI2MTI5MjAwNjc2MzgxMzM0NTE1MDA0NDE4MzQ0NjAyOTU0MDE3Nzg3MDEyNDUyNDAzOTEwNjMxNjA5NDE3NDQ2MDM4OTIzMjgzMjU3MTUzMjI4NDgxMDE3MDAwNzI5MzM
4ODI5OTQ3MzczMzE0MTMyMjU2NTI4OTM0Mzg5MzI4MTMxMTQzMjIyMzEwNzI4OTg2NzQ4NjczNzU5NzUwNzM4MTkyOTMyNDQxMjU5NzExMTc5MTg5MDgyMjk4Mjk3ODYwNzM3NzY0NjQ5MzI2ODY2NTQxMDE1NDE1ODM1MjMzMjU1MjQzMDQ1NTQxMDY3NjQyNDY1NTUzODEzMTgwNjk2Mjc0MzgzNzE0MDk3NTAzNDUwMTk0NTM0OTYxMDExNDEyODA0MDUxNjkyMDQyNjQxOTc1MjE4MDQ0OTg3NjExMDQyOTYzMTM5MDA3NDAxODMzOTYyNDM2NjYyNTcwNjA1NzU3Nzk4ODk5NDE1Nzc4NDk3OTI5NTU3OTk1ODI2NjE2NjcyMjY2MzAyMjkxODA3NjY2MzQxMjEwNzI0NzEyMDkyMjU0MzQyMTk3MzEwOTc3ODIzMTU4MzY0MTQ5NDUyODk3NzQ2MzU3MTYxMDUyMjI4NjYyMjAzNzczNzMxNTAyNDMxMDAyNjU4OTUxMDEzNDA0MjEwODEzODU1OTY2MjY1NDU3NTMzMDgyNjc0MzQ0NTY5MjE5ODgwNjQ2MTk3MDE1MzIyNjM1NzkxMDcwNTI2MTE4MTIwNTQ1NzI2MDU0OTMyNzU5Nzk4MTk1MTU3Mzk4NSIsInNleCI6IjIwMDIyMTAxNzQxNDQ2NTcwMjY0NTU3NjMwMzk5NDg5Njk5OTkzMDExMzgzMzU3NDk1MzMzNzgwOTYyNTE1OTE1NDYzNzAxNzYxMzg1ODY1MzI3OTY2NTI0ODcyNjQ2OTA3ODg3ODE2OTE5MjQxNzk1MTY2ODEyMjQ1NTIxMzgxNTI2NTk5NTk0OTM5NzA4MjcxNzMzODk0OTc0NjAxODg1NDQ0Njk3MzY0MjY0MzIwNDY3MjMwNTg3NzkyMjYyMDkzNzIyNDE0NDg5NTUyNjQ5NzUwODg3NzQ2MTczNjk5NDA4NzczNTgxNzA5Njk0OTI3NzE3MzI0MzM2NTYzMjU1MzA5ODg1MzAzOTAzNDUyNzkxMzQwNDQ3Mjk2NjQ0MjMwMjk4NzAyODUyNDk1OTE5NjEwMDQ4NTcwOTk4MzE0MTQzMjExNjc4MjA5ODQzNjMxMzI5ODk5NTI4NzI3MTg5NTQxNTA2MzgwMTM1MzIzMzg0Njk3Njg1MTI5Mjg4MDM0MTM5OTAxMzE2OTUxODgzMDU2MDA5MzA4ODE2NDc0Nzk0OTA0ODcxNTMzNzg0ODc2MzMzNzA3MDM0MzkwNjU4MDI0ODk5NzU3MjIzNzE4MjMzODIyNDE2MDMzNDEwOTMyNzU1Njk1OTE4ODU1MDEzMjI3OTEyNjQ5MTI1NjI1NjEzNzA4MzAyODI2OTQwMzk1MDUwOTU5NDI1NTA2NTU5Njg5OTMwNzI5NTg5NzAyNzE1NTY1NjEzMzUwODcyMjU3NDA0Mjk0ODUyOTgxNDg1NzQyNjkzNDE0MTc1NTYxNTYwMzc2ODc4In0sInJjdHh0IjoiMjg5Njg5Njg3ODkzMTY5MjE5NTYxOTUwMjAxNTkwNDM3MDE0ODUxMjgwODQ2NjY3ODgzMTk2ODg5MTk3MDM3NjgzMDUyNTc4NTcwMzA4NTM4MjY2NTE0OTQ0MzUyODA2MzQ3MDMxMzA5NjU3NDk3OTkyNjM4NDcyODEwODkyNjE4OTA1ODY0MDQ1MTQ1NTQxMjg4OTc1NDU1Njc1NTk0MTEzOTIyNzQ5MjQ2NzI4NTA2OTU5MjM0Mjg1NTY2NjY4Mzk1MjQxOTE2Mzk1NTc1NDA5MjU4OTQ0NDEyNDc1MjM5MTU5MTE1OTQ5MTU4MTAwNjIxMTU1NzMwMjI0OTExNzU1MDUxNTMyNzc1MTI0NzYzMDUzNzc4MjYzODMxMzQzNDM2ODE2NzAzNjg1MzA5NzQ4ODYzNjIyNjY3MDYxODc2NTUwNjQyMDc5Mzg2MjE4MDg1ODg3MjU1MzE4ODc5MzEwNjIzMTQwNDQ
wOTQwNjQ5NDM3ODMwMDgyMTg2NDI3MjE4OTg2ODg1NDg3MTY2MDA5NTkyOTE3ODk4OTk4MDg1Mjg5ODI1NzI3ODY0NzIxODMzMzgwNDMyMTI1MDU4ODUxNDY5OTc5MTMzMTMxNDEyODI0NzgwMjg1MDI2NjYwMjEzNjQwMjc4NTE0ODgzNjMzNTU4Njk0OTExMTg2MDM4NDE0NDc4NDM4NTc4OTMzNzQ4MjM4NDM5NTE5MjM5MzUyNzU5NDkzMzI0OTUxMDkzNzA2NDQ5NDY0MDM3MDE2NzA5MTkxMzM2MjEwMDgyNDU1ODAzODg3MTIwMTY4MTQyMTYyNjgxMTQiLCJybXMiOiI1NzI3OTY5NjExMDcyNzAyMDIzMDYyODMwMjU3NjU4OTcwMDYxODE4Mzc2MjM2Mzg4Mjk2MDE5MTc2Nzc5MTEwMzg1ODQwOTQyMjIzNzExNjE1MDgyMjc0NjM4NjcwMjY3ODUzNzA2MDYxNzAzNTE2MjE1ODc3MTI2MTAyMTIwNTMyNjQxNDgxMTA4OTk5NTE1NTQ2ODQ1MjczMjM0NzM2OTk1NzUzMTA0MjYxNDc0MzY4Nzc4MDY0MTMwMzAxOTA1NjYwMTA0OTkwODI4MTU5NzMwNzEwNTQ4NDUwOTM4NTExOTY5NzczNTY3ODgyMzE3MzY2ODYyNDA4MTkzODcyODc3MDgwMDA2ODkzNTMzMjQyNjE5OTAzMjQzNDA5NzY5NDUyMTAyOTk0MzU4MTI5MzUxNTYwODI3NzA1NTUxODg2NDI5OTU2NDkwOTk1MjcwNTIwNzMzMTE1NzQyNjU3MjY4MTg5ODE2MjcxNDgyMDA1NjYyMjQyNDQ5NDg0NTE5NTI2Mzk0Njk4NjU5MTAxMjUzNzQyMjgyNTA4NjExNzc3NTQ2NzUxMDIwODk4MDM5NDY1MDA3MDg5OTcyMzkzMzQ4OTk0NTIzNDUwMzQ5MjY0NTU4MDkzMTY2MzI0MTU3MDIxMDAyNTkwMzQ5Mzk1OTA5MDYyMDg2NjQwNTc3MDQxODkzNDQ3ODM1ODg3NzI0MTc1NjY5ODExNDI2MTI1NDU2Mjg3NTU5NDI2NzI2NjE0ODAxMTA5NDczOTQ2MjgyODA2OTE3OTU1OTM5NDA3MDQ2NjcyNzU2Mzg0NTkwMTM2MjU1NzMzOSIsInMiOiI2ODU2NTgyMzAzMjcwODQ3NDMwNjUxNDQwOTcxMDI1NDc0NjE4NTA5NzAyNDkwNTg2NjQ1NjQ5MjQwMDYwOTU0ODUxMzg4MjA4MDA2MDY1MTU0NzI1NDg4NjcxNzIwOTMxNzYzNTUzNjY0Nzk1NTM0NTE2MjY4NzA2MDg5ODQ0MzcwOTEzODE0NzI3ODY2NzA4NDM4NDkwMzU4OTQ4NjQ5NjA0MTAzOTA5Mzc3NjE0NzA2MTk5MDYyOTUwOTc5NjU1NjkwMzQ1NzYxODMxODk0Njk0ODQ3NzAyMDMxNjczOTYxNjY5ODI4NTA5ODA2NDUxNzA2NTIxMDMwOTQzMjIzMTM0ODQyNTc5NDAzODQ5MTEzNjIzOTc2ODI0MzY1MzAwNDI4MjQxNTYyMjE4Mzc3MTAyMDMzMTgxODY2MDY2MTk0MjgzMzQ1MzI2MTUzNzcwNTQyMTc5Njg4NjA1MDgxOTQyNTk0NjQxNjUzNDA2NDg4NzE3MjQ3NDYzMTg3ODk0NjY0ODY4MDk2OTY3ODQ1MDI1MDYzMjcxMzM5Mjc0OTg1MjA5NTQzOTg1ODQxMjA4NTYzMjYyNDQwMTQzNDE4MzY5OTI4OTc2MzE5MDk2NjcyOTc3MTU0OTg5NjYyNzk4NjUyODAwMjAwODQ1NTE0NDA4MTIxNjAxMzkxNTM3NjkwNjk2MDYxNjIyNDEwMDc4NTI4ODQyNTIyOTAwNDU1MjEyNzE1NzgyMDEyOTg1MDI0NzYyMzg2MDY2MzcxNjAwMTMxNTMwNTgyNDI0NjUyMTUyNDMxNDQwNjc4NzU3MjIwMTk
zNjE1OSIsInoiOiI4NjU3NzIyOTI2NTc0NzAwOTU2MDA1MjYxNTY3OTYxNTg5NTQxOTM0OTQ4MTA1ODI4NDAyMTA4NTY3MzkyOTI2MDYzNjk2OTE3MDUzNTU2MDM5MDY2MDAyMjY4MzQ4MTMyNjgwMzIyMDU4MTgxNTg4NDQ2NDc0OTAzODUwNjEwNDQ1NzE4MjkwNjA1MDkyOTU5Njk0MDUyOTE1MTczMTUxOTA5NjAxMjQ2MDk5NTU4MDIxMzU3NDkyMzExNTM5NDQ1MzY2NDE5MTkwMjA1Nzc5MTk1MzA0MzM0NjYyOTE0ODM2OTc3NzU0MjExMzU5OTA0MDUxMTkyMDU5Nzg2NjQ1MTY0OTE1NDQyMzkxNTI4Mjk3Nzc1MDA2MDI4MDU5Njc5OTQ0MjI2Mjc1Nzc1MzkwOTI5NTExNDg1ODEyMTQzODgyMjYwODExMDY2NDA5OTY4NDkyNTM3MTY3MTA4OTY0NDA4NjE5NjM3OTczNDA0NDI1Njk2NjQ5OTgxNDY4MDk0NjA1MTE1NDE4NzM5MTU5NzY3OTkwMDI3MzkwNjY1NDczMDc0ODUyNTY4NDg3MDk0NTkyODAwMDI2MDAwMjk5MzI3NDIxODkyNzU2MTgzOTcwNzM2MTgxNjg2MzkxODA5MjE1OTgzOTE0MzMwMDQzOTQ5NzU3ODA3NTk4MTM4MTg0MjE4OTMwNzgyNjk5MTU1OTQ2ODExNTY5OTQxMDU0MDg3OTY2NTQ2NTEyNzg5MzIwODI0MzkxNjQ3MTczMjMzMjMwODU2MjEwODk3MzYzNDM1MzIwMDI2MTU5MTI1Njc0NTgxMjIzNCJ9fX34UYCAoDxdZ6+uMx4IlV1+mqsUzPeSfiAIdj3Oy2oBpgbtBF7HoKw09NaWj7yyWG5GW490k\/xZ8t5kOWRfeI4lBGw8LMqKgICAgICAgICAgICAgPkBMYCgQ4tl8stBcyEXP\/\/R0wl852iUuKRBQ552sIRMOosVBnWgEiwdB4Gywkv0syxXBFLrnTQLJv0lLldDj616k+590uOgyUE2EnHObpvdMF5yAc4hTpTja\/Rlz0UBogZ3H6gPy86g1atgI9ujTdp8EE+WA1T6H3PsXNUD6Zd1xeKP3basI66gcyZwdwTAYc2s2qoDXEUbE3BLx40b9WQVcw7MA+LQCvyg+EG3xUp4hwn7XcrwiwW2YXsAMQgDEx+ZDlEs7XqtT82g34HRBsZxx5ks54lZ7gV\/QavSjWRdCIQLgkOG+FMgoq+g0KnlornY8DUykqyWTW2mvpO320zemmjBjexgG1qm3A+g+BjWK\/TTK\/yuTee9oc54AKBj6TQbbFRdnOcBRiv1leeAgICAgICA+QExoC3Qdck9q8QvTuqMeKyk91V2xCnXtFMlLijcPIBlZCsjoHx\/nahibluSdpnotILv50UxjjJE2VXvF3U7WRPA4uS7oFzgqFzBcJPpbBlOWvagDUDqhZq3UIH3MEMeG98qmZijoFaB1buTUt448scg0cRFQfzhLFTdS6g7hz7yFian7OuHoNbiRotVdTRl2\/MhB\/gwyCc5RHf\/dpIgFNTMgC6WcwJ6oODdN6SFU+KU5B22AQwDYW8lMJGy6fIz8UTc1HZTQ8TgoBh0UDS9bmpQYqQRt1WNQFwyCoSI+7aaaCMEzyNr2OLKoCy8rk5NyMmAXHopKk7XO2zWHuMpj68njICp4TQqvB6xoDinzBKW7z3rKnB\/NtKncYlLXCUIbdKZ\/KjvHDqXFoTCgICAgICAgID4OZcAhvcXZjd3VwUlRVTmtYbjZBcll6czoKCewNyuTkwlOElF+LJoPeacrokxvrhxwpsNzU+x3bTDBuIToB35btz4tK1sByHu\/5Id+25BoX2DCEOAV+KvuetqR0bL+QIRoGQano7vwyvcOoLEkcnrgoTHz4s9zuQiTrdOQsMevmauoP6gI9NFp1Fbwb9teD0GP5+h894HvpHV3O20a
tswt9XAoCvR8Yvyuq6sxh29oVdVIScGVfUWYfZFv0ThqaDcz9uqoKvKsbN4\/zaJMpzOdEa2y2n8+idE2OexbsmwZSoNnCzdoLJzN055Of78wVufuNSO3ML1tf++xdm3ZhpZ3NMZseE3oDGb1vCpfyiUtGdzSLaqfM4Sq8bPlcEh2rBUBuskq\/0ToD2U6MaycaUh\/XoAE+Lf3CF4F5njPS4Kp5O2Vl1j3GzfoNJF0vl8N\/cyeQ1DqiojCCpd4XGlUyIJwBJpHvdD5iKPoDCoNqvBpWuxii12IeWQQLxmmTNM5vM2RLLtVkS2Hj0koKr8Ibai77jSpAXTzEgwDtvCGq14mFw4WnvNE\/hIgyt9oJjupSMflwOcnWO6spf3ONapVr8i5iSMoXe9cDhktsYZoMA6wbT85DwXYnh9oEudLw11h+81IJLMWwl+JQW55SVeoEvZ\/WT090OyAooCrX+\/+st\/N3Q6f47p80ZfTn8bIGBCoLXOS2CYFz4QOm1+5mv4jhp9wugzqWoc2vZhEMQm+QDwoFCmofXCWucfXbYFUuq8pozqDU260rsPgMb66oWYGICOoBPlFUkZfFlXOAWcjPsxPVuThDufJaTve\/6U59jw5IgfgA==", + "multi_signature": { + "value": { + "ledger_id": 1, + "pool_state_root_hash": "DuhjUiR6QDsT4X3KFTGHgPnaCCTTVMhmmA8uRwkkhDwA", + "txn_root_hash": "6jDkor7uh7kBVsmZesTXzD8T2nZf3t3PyaZkhHbhLLzj", + "state_root_hash": "CDTFffp31pxPUyxotNeT2JxQ4j8K9r2HYMTzmNrrZ5TL", + "timestamp": 1526055837, + }, + "participants": ["Node3", "Node2", "Node4"], + "signature": "R3Loufnbanmb8n8ARzQFzJHERbfi8qTxVH9G4MFGnro67CH6PQyNpgSaoqufhviCB3t6ixzFsnkWtRsUzThWxXAaz3Y5YhgB3nvvdF9NPDJjMxGWG2e4VbqmGnH1kTuZbqqUgBnHkQcxhU1DwzBtwdtYCoVgKCYmAYLxrhHt8yb8qu", + }, + "root_hash": "CDTFffp31pxPUyxotNeT2JxQ4j8K9r2HYMTzmNrrZ5TL", + }, + "data": { + "primary": { + "r": { + "age": "68324431017386373141723313379588123114122172733785498028129098771912332870161622586830226725832682089282797682478996628527458521203981066532148542867910727876973540501604298386964941331385945509447921573334640257131480681017298967411615851997278468982114318053852745062107685596645860574744820479443008568550713733647938784629517351521220040117482812057845591724691129906359015250113706837112935971415735663780322204054449465947028328022534972551186400163012220719718039176357586730457386744557089473465341563472760872536594611618365110919837962302554808225102415883187587908311141004150113254350300861684974387245524", + "height": 
"13939356647759609174485719802501773567276049548245424935504126500265586608662951980260718040791134755864486701853050225651855956737901595730976263505088594378380074293405559194748221759603581890598079117782820527925868489919600695104312261805970657325246956081208878862007439057712465023318674543859367201399080194572966833003881512520725074633033420930095211305635173717467241968613481516063152487980820555533755239133651769565133420156114776578165410212221881994925880629770028898705859268663836663647027677768478228049082189465977511902707373041793879289766830466362779198318255196657839703097701957436211432866940", + "sex": "20022101741446570264557630399489699993011383357495333780962515915463701761385865327966524872646907887816919241795166812245521381526599594939708271733894974601885444697364264320467230587792262093722414489552649750887746173699408773581709694927717324336563255309885303903452791340447296644230298702852495919610048570998314143211678209843631329899528727189541506380135323384697685129288034139901316951883056009308816474794904871533784876333707034390658024899757223718233822416033410932755695918855013227912649125625613708302826940395050959425506559689930729589702715565613350872257404294852981485742693414175561560376878", + "name": "21470694100729869744261292006763813345150044183446029540177870124524039106316094174460389232832571532284810170007293388299473733141322565289343893281311432223107289867486737597507381929324412597111791890822982978607377646493268665410154158352332552430455410676424655538131806962743837140975034501945349610114128040516920426419752180449876110429631390074018339624366625706057577988994157784979295579958266166722663022918076663412107247120922543421973109778231583641494528977463571610522286622037737315024310026589510134042108138559662654575330826743445692198806461970153226357910705261181205457260549327597981951573985", + }, + "z": 
"86577229265747009560052615679615895419349481058284021085673929260636969170535560390660022683481326803220581815884464749038506104457182906050929596940529151731519096012460995580213574923115394453664191902057791953043346629148369777542113599040511920597866451649154423915282977750060280596799442262757753909295114858121438822608110664099684925371671089644086196379734044256966499814680946051154187391597679900273906654730748525684870945928000260002993274218927561839707361816863918092159839143300439497578075981381842189307826991559468115699410540879665465127893208243916471732332308562108973634353200261591256745812234", + "n": "86724287350477751206656570979032966703329505889109970100672593600212159710122524558840987784946586074662132488236400738181991816412026591140984921370179767079606062874849350699068544007953394425621170412614434471437870413981961728213356970896311390324401528063993712079175929421151002066958792621968526123052543362726709351064982035483300909753311188927834431210143615918489181734829742678671085195486779344279087570275333764801067819032638761795735427697089756560490683732524286268242592541483561421003008848905312960263229239327096444856373881655366169784000740449827583427336247268796008957239930929940106124383441", + "s": "68565823032708474306514409710254746185097024905866456492400609548513882080060651547254886717209317635536647955345162687060898443709138147278667084384903589486496041039093776147061990629509796556903457618318946948477020316739616698285098064517065210309432231348425794038491136239768243653004282415622183771020331818660661942833453261537705421796886050819425946416534064887172474631878946648680969678450250632713392749852095439858412085632624401434183699289763190966729771549896627986528002008455144081216013915376906960616224100785288425229004552127157820129850247623860663716001315305824246521524314406787572201936159", + "rms": 
"57279696110727020230628302576589700618183762363882960191767791103858409422237116150822746386702678537060617035162158771261021205326414811089995155468452732347369957531042614743687780641303019056601049908281597307105484509385119697735678823173668624081938728770800068935332426199032434097694521029943581293515608277055518864299564909952705207331157426572681898162714820056622424494845195263946986591012537422825086117775467510208980394650070899723933489945234503492645580931663241570210025903493959090620866405770418934478358877241756698114261254562875594267266148011094739462828069179559394070466727563845901362557339", + "rctxt": "28968968789316921956195020159043701485128084666788319688919703768305257857030853826651494435280634703130965749799263847281089261890586404514554128897545567559411392274924672850695923428556666839524191639557540925894441247523915911594915810062115573022491175505153277512476305377826383134343681670368530974886362266706187655064207938621808588725531887931062314044094064943783008218642721898688548716600959291789899808528982572786472183338043212505885146997913313141282478028502666021364027851488363355869491118603841447843857893374823843951923935275949332495109370644946403701670919133621008245580388712016814216268114", + } + }, + "reqId": 1526056339560518351, + "type": "108", + "signature_type": "CL", + "identifier": "2hoqvcwupRTUNkXn6ArYzs", + }, + "op": "REPLY", +} +GET_CLAIM_DEF_REPLY_B = { + "result": { + "data": { + "primary": { + "n": 
"114746515059260256085234678788587166148355243180280315102090645942751161606786511238733348638566416417303497105088451106899855630983593307364239632151590963537585250064542598188468391931410646982219473876772131150980262917568589457733589819957223264524201617141236009287666907817621809981593970342911797033651200317354944243448934984118385522763230083200500151663687733598612590779617976091488475790945488832306909284594586246370188320528257434368782767295927647652836090950436948787834961967536745377246052043584698163549299065527670856685769132775303570669824694852645445261928275720246006236672142506689977275868361", + "r": { + "address1": "99691666140371070101358855135773568925647233196811391526286824382905688163990792578749168604103123248884382828153399602940509345627800317904075610052247641753179080938711038002760487730651355496233514312301108772924047804779106854832742682362631853415382718650290457593916293375742714388123906183692192298839372942898471052447331045137271144004625111415456710024852530838725807061493074186193830563122576703301370004391867534587752756127556778961151976176561132968144358067501677783620960920599769114567213278491640769805211738029349174624671317568270236523750186824708848670440189718797166160769005983338957645374651", + "address2": "50694627350272135995850227472773682653526785886273680025707432662274319132892635359629398510776487145897598764958307032635008747216007028774721753289104990954080718757271373082072271206903673750118720632612520934845984571658325100469119390059220958605738125231926799396876392119909204431936776959112879901832499818726172148126455962771797164231412142732169157096077449075112596659842193123517211788791802444854497416778006528187110765212794812816350644158221720598510512869974792744115485570047548700807040200469385145744569019135119452695958980311814554081191423436762488471083593240293107044795731173106225269075879", + "city": 
"102559182306527236449083287004741425738959787531929072403797989548015397214062434117297561478541945525533027326788960769556765871071638166299517271476445491827022651938273628586072010918899170935494269888904933582635309954710186662573013491930971322773769929465525735190214323632698521462642758610601865824221106289034715039425726673460562081648867092963498261193872377056119189184137195233286115023395930817931291731614731826876607790483194892570452915675015342381665340157231556766697416007335116526753816159747328974818962282332973001195014744573290346271712396063166983772336582770882568845075757379081928828333484", + "state": "15713842569245737467211725139597063575206193864299839064963322883000040008821879926466964994443780733540685507582503521942067619465901516066678686301847079899815506652234811980107432963385421543779350493494079718474286652279834034399265138653939510696807388435204386612790208603957327635965005102296774157640016748464771397682922017927919437902023107754480395084067810355347611382320882015302277874585199781243184210455762324907919521828011758577554657145366604111620312856210314976491491503679972802443973537563280871769127378645286351511023979727441703919415153280572332149194429598144248477123785366270918320355405", + "zip": "82439749096689797099365560351239753105378075571995009495583420256509858774982681509800195848127624487713779483877286273210752251434593846321791407852670597416682439130048751917529301094961639109787459476514215677667528190858195940523300129805780128872794677939891404232119474525809874425239287980834285388300426560924836810318681834730312715238871863866236582938180704668392242864562995155085487776333282873729950673055006682711045087024364613715008969916100231057009286368917288450047513042924053264179070247044364634863653978644955951623206916995954379611610172587909771777058114259553518287236931107941424476615146", + }, + "rctxt": 
"12882106218364489852198756063472503925546736423511030980775724102309731782619944795270841688773712762811510510057735673384647149369683830553010311398919415875035419497039071344961855906260704429386009769391235310734274528765933895442598586760877605228504009278242556373802663874078663963134235128930919205138500224113611155809593610721019344088231155625768052107598044580568892989818589777033707706414333277298933289666110152325572085417003440591173816185535447148436461626970208749404809055575072540846992803414294365454191025320389124986744387913853078792368035519762273705341214769545763267017647118784802102609501", + "rms": "106028334352956120254190539677164874113260919335177645538985439215552009817414268001747934794792622586283368183992307658729682605636119929047669814138904443196076047435578798659633647074839223564300440710991823806423760442858751109813721943066300113338814805016319371724589997933204691059213904593223879782913786796897287377655313546120850344752880386937803109231596086401736775016957962252911603462988048614441232196851204796503935873881697942972984657550803208348039692816399383898170960697113742325598099504349794803545598259703412134354513305415733108643762987048364882676189469605110499746876278508590562888471309", + "s": "15678755827306057743760600028386502810123351546016461235476681362228825598831102199528795536128972467030169856793286910636446189899528748221680209430012552769741678665585748260276066556135386421782992251507127914167374251212176339062137692918334797451372750158620810700982064996262697961072684582507781695193053082112842190405411069106829867359524406960151388270743010774947305777397414657077050738998972524245537892741523091928637751005795657714227975900027201931853690576205714337633635803339006630861198097918548079506836227430559089772233951013727478631851053766855771442468532212938213839384294416442896733429506", + "z": 
"107844580696935267224685953710248721588061270382223646096349113942754030931632137466395766282891847634657182239048215576555350103258761763383903208677615631334632643004349863812873041076611716205401052812505603586763826866994143145234436298946833325296153637822126459684843587236075925037465050361093439139718472639710725193340606561954142895780104125189011472434268099373557377935176988976044145686558487862982478674750049266524295725097728797460210399377848983033250414174145298927648861452166644982315308937830863192556869355971023436985865987383144838977793063894497859875560718743368644008878035558886398529867651", + } + }, + "identifier": "GGBDg1j8bsKmr4h5T9XqYf", + "origin": "2hoqvcwupRTUNkXn6ArYzs", + "ref": 1487, + "reqId": 1522866729726860308, + "seqNo": 1488, + "signature_type": "CL", + "state_proof": { + "multi_signature": { + "participants": ["Node4", "Node3", "Node1"], + "signature": "R5cNGakn84PoGo4p9U6Y8CyYKGbRUE5mM78yA88KkvSDvWqN8jptp2hSTdzLgYgfWibbFcEv3iAxWH3LGxk9okwr38K6RR35ZbAttQHCafmWGwiUAMoKHoFrH8nMLWzb7A9ZXhb7uUYrSXp3zgZb2xFX83Q8kjmFTkvqcvZVy9uMt3", + "value": { + "ledger_id": 1, + "pool_state_root_hash": "DuhjUiR6QDsT4X3KFTGHgPnaCCTTVMhmmA8uRwkkhDwA", + "state_root_hash": "3tses33E3t9z7W2gvHk8LizYcjWLTvGZwGZd3Q46SUUm", + "timestamp": 1522824724, + "txn_root_hash": "F3iggcw2svzk5uSynSAUQcsE6mKXBWRz5bXJoNvvBaRe", + }, + }, + "proof_nodes": 
r"+SGN4hOgjDfvaVyBfwtNlXiJD4lMZQeLMHzsAkP/pSKsfAZ6rkn40YCg3c/1kISB4pJHqTOxHXBSdzwl5iC5MX0C9G0ceD3xxNyAoPXB+8LMotY/0qas0C5UYHiGhufecleFu3QLM371MAxvgKDZwH646BRcMMlkhVUODumTpgpZ69XwNheU+IFumV4NJoCgcMNOI89GH1cmupClasMY/JnoUAGWxFZ5AFIi5wXGrB+g6RA7lLdOue0ZvjIyENBeU+pU/98GMRKNK9/IIinUAZSgQ3GwUvuVKlj6cA7ecDmtGYGi2y6dSLzBOR+TOH/qBYWAgICAgICA4hOgwT5AhKtLJWDWb9/tVNZGPbbMkON9RdyJMABtPUL8KSb4kYCg6dtQ/Fhndta7VHxiI1loRbC4OYsWrIs0SAFQTxrEJriAoEv/yTKO+/mf5czjRntRw3ZMlJJfssMABysXO4lDtnSogKDLv5krDrA+AU28QqWtoK+sB/U8B8HRtA7h3NbAxY1CHYCgo295NuH+4DKQWGMOxqVqoHLY9Q/BhHpst26feN8ZdMGAgICAgICAgID4UYCAoBcErMgDu3yu4RVRhrWVC16/M4hCS2yPWVcsP9QhFMe9oKqfH15wamU9ANR82jt6lMG4KzlFYyMVZ66FMFhiBdzwgICAgICAgICAgICAgPkBMaA6dGrtfugJlXVAH2h+78VOJ+Kp0GDot733GYQ5SaMif6BAYqh+Z/HaT0YciVe31CoEER1w7ug46oSTWmg8ideWdqCdmnil+INbwFtef9rJ23KWVOxLEGpekmdpD2szHdTxAqAPFpoPAHqIWS+0rn1cc+XpMyUikJ63oBfATZ+fGisu3KDRMff7h6mUjArlXOiLu9XvRVAAeqYkCxHXZB2Hku1Pa6AAtKzLK6OHX7LNHmRmN8hQCqL35cKZ5Mc57Tl4a9R0ZKA4e8VMr3+D4Tsv3cqRcyYS9lOQIx92KgSYg7dSVxHXlqC3iPf41ux9xUa3JN8qq8a4mjosz+Z4oD7Kc/RPhNkSCoCgY4NGjrlmlgUwgVKcuU2SoHbmGa1TIj39CX5j8Qm3MiCAgICAgICA54UTpDTDo6C4JADeo3ZbFk0Z1XN1mxwW0WQqgVRAbnEx5W7NUSithuIToP+QG77b7atogQGq9In+38W4M2XfioeXYmZ9BJ4TY4Nc+FGAgICAgICgo3awcJ8LKWBy2rfzyzJsuBTDLA7p9BPM7/L7xBdYZ8iAgICAgICAoD6rl+d6X/9kWMp9Y6fc9Jw7yTpB2SLv3D7LZkAJhRNmgID5GMIguRi++Ri7uRi4eyJsc24iOjE0ODgsImx1dCI6MTUyMjc2OTgxMiwidmFsIjp7InByaW1hcnkiOnsibiI6IjExNDc0NjUxNTA1OTI2MDI1NjA4NTIzNDY3ODc4ODU4NzE2NjE0ODM1NTI0MzE4MDI4MDMxNTEwMjA5MDY0NTk0Mjc1MTE2MTYwNjc4NjUxMTIzODczMzM0ODYzODU2NjQxNjQxNzMwMzQ5NzEwNTA4ODQ1MTEwNjg5OTg1NTYzMDk4MzU5MzMwNzM2NDIzOTYzMjE1MTU5MDk2MzUzNzU4NTI1MDA2NDU0MjU5ODE4ODQ2ODM5MTkzMTQxMDY0Njk4MjIxOTQ3Mzg3Njc3MjEzMTE1MDk4MDI2MjkxNzU2ODU4OTQ1NzczMzU4OTgxOTk1NzIyMzI2NDUyNDIwMTYxNzE0MTIzNjAwOTI4NzY2NjkwNzgxNzYyMTgwOTk4MTU5Mzk3MDM0MjkxMTc5NzAzMzY1MTIwMDMxNzM1NDk0NDI0MzQ0ODkzNDk4NDExODM4NTUyMjc2MzIzMDA4MzIwMDUwMDE1MTY2MzY4NzczMzU5ODYxMjU5MDc3OTYxNzk3NjA5MTQ4ODQ3NTc5MDk0NTQ4ODgzMjMwNjkwOTI4NDU5NDU4NjI0NjM3MDE4ODMyMDUyODI1NzQzNDM2ODc4Mjc2Nz
I5NTkyNzY0NzY1MjgzNjA5MDk1MDQzNjk0ODc4NzgzNDk2MTk2NzUzNjc0NTM3NzI0NjA1MjA0MzU4NDY5ODE2MzU0OTI5OTA2NTUyNzY3MDg1NjY4NTc2OTEzMjc3NTMwMzU3MDY2OTgyNDY5NDg1MjY0NTQ0NTI2MTkyODI3NTcyMDI0NjAwNjIzNjY3MjE0MjUwNjY4OTk3NzI3NTg2ODM2MSIsInIiOnsiYWRkcmVzczEiOiI5OTY5MTY2NjE0MDM3MTA3MDEwMTM1ODg1NTEzNTc3MzU2ODkyNTY0NzIzMzE5NjgxMTM5MTUyNjI4NjgyNDM4MjkwNTY4ODE2Mzk5MDc5MjU3ODc0OTE2ODYwNDEwMzEyMzI0ODg4NDM4MjgyODE1MzM5OTYwMjk0MDUwOTM0NTYyNzgwMDMxNzkwNDA3NTYxMDA1MjI0NzY0MTc1MzE3OTA4MDkzODcxMTAzODAwMjc2MDQ4NzczMDY1MTM1NTQ5NjIzMzUxNDMxMjMwMTEwODc3MjkyNDA0NzgwNDc3OTEwNjg1NDgzMjc0MjY4MjM2MjYzMTg1MzQxNTM4MjcxODY1MDI5MDQ1NzU5MzkxNjI5MzM3NTc0MjcxNDM4ODEyMzkwNjE4MzY5MjE5MjI5ODgzOTM3Mjk0Mjg5ODQ3MTA1MjQ0NzMzMTA0NTEzNzI3MTE0NDAwNDYyNTExMTQxNTQ1NjcxMDAyNDg1MjUzMDgzODcyNTgwNzA2MTQ5MzA3NDE4NjE5MzgzMDU2MzEyMjU3NjcwMzMwMTM3MDAwNDM5MTg2NzUzNDU4Nzc1Mjc1NjEyNzU1Njc3ODk2MTE1MTk3NjE3NjU2MTEzMjk2ODE0NDM1ODA2NzUwMTY3Nzc4MzYyMDk2MDkyMDU5OTc2OTExNDU2NzIxMzI3ODQ5MTY0MDc2OTgwNTIxMTczODAyOTM0OTE3NDYyNDY3MTMxNzU2ODI3MDIzNjUyMzc1MDE4NjgyNDcwODg0ODY3MDQ0MDE4OTcxODc5NzE2NjE2MDc2OTAwNTk4MzMzODk1NzY0NTM3NDY1MSIsImFkZHJlc3MyIjoiNTA2OTQ2MjczNTAyNzIxMzU5OTU4NTAyMjc0NzI3NzM2ODI2NTM1MjY3ODU4ODYyNzM2ODAwMjU3MDc0MzI2NjIyNzQzMTkxMzI4OTI2MzUzNTk2MjkzOTg1MTA3NzY0ODcxNDU4OTc1OTg3NjQ5NTgzMDcwMzI2MzUwMDg3NDcyMTYwMDcwMjg3NzQ3MjE3NTMyODkxMDQ5OTA5NTQwODA3MTg3NTcyNzEzNzMwODIwNzIyNzEyMDY5MDM2NzM3NTAxMTg3MjA2MzI2MTI1MjA5MzQ4NDU5ODQ1NzE2NTgzMjUxMDA0NjkxMTkzOTAwNTkyMjA5NTg2MDU3MzgxMjUyMzE5MjY3OTkzOTY4NzYzOTIxMTk5MDkyMDQ0MzE5MzY3NzY5NTkxMTI4Nzk5MDE4MzI0OTk4MTg3MjYxNzIxNDgxMjY0NTU5NjI3NzE3OTcxNjQyMzE0MTIxNDI3MzIxNjkxNTcwOTYwNzc0NDkwNzUxMTI1OTY2NTk4NDIxOTMxMjM1MTcyMTE3ODg3OTE4MDI0NDQ4NTQ0OTc0MTY3NzgwMDY1MjgxODcxMTA3NjUyMTI3OTQ4MTI4MTYzNTA2NDQxNTgyMjE3MjA1OTg1MTA1MTI4Njk5NzQ3OTI3NDQxMTU0ODU1NzAwNDc1NDg3MDA4MDcwNDAyMDA0NjkzODUxNDU3NDQ1NjkwMTkxMzUxMTk0NTI2OTU5NTg5ODAzMTE4MTQ1NTQwODExOTE0MjM0MzY3NjI0ODg0NzEwODM1OTMyNDAyOTMxMDcwNDQ3OTU3MzExNzMxMDYyMjUyNjkwNzU4NzkiLCJjaXR5IjoiMTAyNTU5MTgyMzA2NTI3MjM2NDQ5MDgzMjg3MDA0NzQxNDI1NzM4OTU5Nzg3NTMxOTI5MDcyNDAzNz
k3OTg5NTQ4MDE1Mzk3MjE0MDYyNDM0MTE3Mjk3NTYxNDc4NTQxOTQ1NTI1NTMzMDI3MzI2Nzg4OTYwNzY5NTU2NzY1ODcxMDcxNjM4MTY2Mjk5NTE3MjcxNDc2NDQ1NDkxODI3MDIyNjUxOTM4MjczNjI4NTg2MDcyMDEwOTE4ODk5MTcwOTM1NDk0MjY5ODg4OTA0OTMzNTgyNjM1MzA5OTU0NzEwMTg2NjYyNTczMDEzNDkxOTMwOTcxMzIyNzczNzY5OTI5NDY1NTI1NzM1MTkwMjE0MzIzNjMyNjk4NTIxNDYyNjQyNzU4NjEwNjAxODY1ODI0MjIxMTA2Mjg5MDM0NzE1MDM5NDI1NzI2NjczNDYwNTYyMDgxNjQ4ODY3MDkyOTYzNDk4MjYxMTkzODcyMzc3MDU2MTE5MTg5MTg0MTM3MTk1MjMzMjg2MTE1MDIzMzk1OTMwODE3OTMxMjkxNzMxNjE0NzMxODI2ODc2NjA3NzkwNDgzMTk0ODkyNTcwNDUyOTE1Njc1MDE1MzQyMzgxNjY1MzQwMTU3MjMxNTU2NzY2Njk3NDE2MDA3MzM1MTE2NTI2NzUzODE2MTU5NzQ3MzI4OTc0ODE4OTYyMjgyMzMyOTczMDAxMTk1MDE0NzQ0NTczMjkwMzQ2MjcxNzEyMzk2MDYzMTY2OTgzNzcyMzM2NTgyNzcwODgyNTY4ODQ1MDc1NzU3Mzc5MDgxOTI4ODI4MzMzNDg0Iiwic3RhdGUiOiIxNTcxMzg0MjU2OTI0NTczNzQ2NzIxMTcyNTEzOTU5NzA2MzU3NTIwNjE5Mzg2NDI5OTgzOTA2NDk2MzMyMjg4MzAwMDA0MDAwODgyMTg3OTkyNjQ2Njk2NDk5NDQ0Mzc4MDczMzU0MDY4NTUwNzU4MjUwMzUyMTk0MjA2NzYxOTQ2NTkwMTUxNjA2NjY3ODY4NjMwMTg0NzA3OTg5OTgxNTUwNjY1MjIzNDgxMTk4MDEwNzQzMjk2MzM4NTQyMTU0Mzc3OTM1MDQ5MzQ5NDA3OTcxODQ3NDI4NjY1MjI3OTgzNDAzNDM5OTI2NTEzODY1MzkzOTUxMDY5NjgwNzM4ODQzNTIwNDM4NjYxMjc5MDIwODYwMzk1NzMyNzYzNTk2NTAwNTEwMjI5Njc3NDE1NzY0MDAxNjc0ODQ2NDc3MTM5NzY4MjkyMjAxNzkyNzkxOTQzNzkwMjAyMzEwNzc1NDQ4MDM5NTA4NDA2NzgxMDM1NTM0NzYxMTM4MjMyMDg4MjAxNTMwMjI3Nzg3NDU4NTE5OTc4MTI0MzE4NDIxMDQ1NTc2MjMyNDkwNzkxOTUyMTgyODAxMTc1ODU3NzU1NDY1NzE0NTM2NjYwNDExMTYyMDMxMjg1NjIxMDMxNDk3NjQ5MTQ5MTUwMzY3OTk3MjgwMjQ0Mzk3MzUzNzU2MzI4MDg3MTc2OTEyNzM3ODY0NTI4NjM1MTUxMTAyMzk3OTcyNzQ0MTcwMzkxOTQxNTE1MzI4MDU3MjMzMjE0OTE5NDQyOTU5ODE0NDI0ODQ3NzEyMzc4NTM2NjI3MDkxODMyMDM1NTQwNSIsInppcCI6IjgyNDM5NzQ5MDk2Njg5Nzk3MDk5MzY1NTYwMzUxMjM5NzUzMTA1Mzc4MDc1NTcxOTk1MDA5NDk1NTgzNDIwMjU2NTA5ODU4Nzc0OTgyNjgxNTA5ODAwMTk1ODQ4MTI3NjI0NDg3NzEzNzc5NDgzODc3Mjg2MjczMjEwNzUyMjUxNDM0NTkzODQ2MzIxNzkxNDA3ODUyNjcwNTk3NDE2NjgyNDM5MTMwMDQ4NzUxOTE3NTI5MzAxMDk0OTYxNjM5MTA5Nzg3NDU5NDc2NTE0MjE1Njc3NjY3NTI4MTkwODU4MTk1OTQwNTIzMzAwMTI5ODA1NzgwMTI4ODcyNzk0Njc3OTM5ODkxNDA0MjMyMTE5NDc0NTI1ODA5ODc0NDI1MjM5Mjg3OTgwODM0Mjg1Mz
g4MzAwNDI2NTYwOTI0ODM2ODEwMzE4NjgxODM0NzMwMzEyNzE1MjM4ODcxODYzODY2MjM2NTgyOTM4MTgwNzA0NjY4MzkyMjQyODY0NTYyOTk1MTU1MDg1NDg3Nzc2MzMzMjgyODczNzI5OTUwNjczMDU1MDA2NjgyNzExMDQ1MDg3MDI0MzY0NjEzNzE1MDA4OTY5OTE2MTAwMjMxMDU3MDA5Mjg2MzY4OTE3Mjg4NDUwMDQ3NTEzMDQyOTI0MDUzMjY0MTc5MDcwMjQ3MDQ0MzY0NjM0ODYzNjUzOTc4NjQ0OTU1OTUxNjIzMjA2OTE2OTk1OTU0Mzc5NjExNjEwMTcyNTg3OTA5NzcxNzc3MDU4MTE0MjU5NTUzNTE4Mjg3MjM2OTMxMTA3OTQxNDI0NDc2NjE1MTQ2In0sInJjdHh0IjoiMTI4ODIxMDYyMTgzNjQ0ODk4NTIxOTg3NTYwNjM0NzI1MDM5MjU1NDY3MzY0MjM1MTEwMzA5ODA3NzU3MjQxMDIzMDk3MzE3ODI2MTk5NDQ3OTUyNzA4NDE2ODg3NzM3MTI3NjI4MTE1MTA1MTAwNTc3MzU2NzMzODQ2NDcxNDkzNjk2ODM4MzA1NTMwMTAzMTEzOTg5MTk0MTU4NzUwMzU0MTk0OTcwMzkwNzEzNDQ5NjE4NTU5MDYyNjA3MDQ0MjkzODYwMDk3NjkzOTEyMzUzMTA3MzQyNzQ1Mjg3NjU5MzM4OTU0NDI1OTg1ODY3NjA4Nzc2MDUyMjg1MDQwMDkyNzgyNDI1NTYzNzM4MDI2NjM4NzQwNzg2NjM5NjMxMzQyMzUxMjg5MzA5MTkyMDUxMzg1MDAyMjQxMTM2MTExNTU4MDk1OTM2MTA3MjEwMTkzNDQwODgyMzExNTU2MjU3NjgwNTIxMDc1OTgwNDQ1ODA1Njg4OTI5ODk4MTg1ODk3NzcwMzM3MDc3MDY0MTQzMzMyNzcyOTg5MzMyODk2NjYxMTAxNTIzMjU1NzIwODU0MTcwMDM0NDA1OTExNzM4MTYxODU1MzU0NDcxNDg0MzY0NjE2MjY5NzAyMDg3NDk0MDQ4MDkwNTU1NzUwNzI1NDA4NDY5OTI4MDM0MTQyOTQzNjU0NTQxOTEwMjUzMjAzODkxMjQ5ODY3NDQzODc5MTM4NTMwNzg3OTIzNjgwMzU1MTk3NjIyNzM3MDUzNDEyMTQ3Njk1NDU3NjMyNjcwMTc2NDcxMTg3ODQ4MDIxMDI2MDk1MDEiLCJybXMiOiIxMDYwMjgzMzQzNTI5NTYxMjAyNTQxOTA1Mzk2NzcxNjQ4NzQxMTMyNjA5MTkzMzUxNzc2NDU1Mzg5ODU0MzkyMTU1NTIwMDk4MTc0MTQyNjgwMDE3NDc5MzQ3OTQ3OTI2MjI1ODYyODMzNjgxODM5OTIzMDc2NTg3Mjk2ODI2MDU2MzYxMTk5MjkwNDc2Njk4MTQxMzg5MDQ0NDMxOTYwNzYwNDc0MzU1Nzg3OTg2NTk2MzM2NDcwNzQ4MzkyMjM1NjQzMDA0NDA3MTA5OTE4MjM4MDY0MjM3NjA0NDI4NTg3NTExMDk4MTM3MjE5NDMwNjYzMDAxMTMzMzg4MTQ4MDUwMTYzMTkzNzE3MjQ1ODk5OTc5MzMyMDQ2OTEwNTkyMTM5MDQ1OTMyMjM4Nzk3ODI5MTM3ODY3OTY4OTcyODczNzc2NTUzMTM1NDYxMjA4NTAzNDQ3NTI4ODAzODY5Mzc4MDMxMDkyMzE1OTYwODY0MDE3MzY3NzUwMTY5NTc5NjIyNTI5MTE2MDM0NjI5ODgwNDg2MTQ0NDEyMzIxOTY4NTEyMDQ3OTY1MDM5MzU4NzM4ODE2OTc5NDI5NzI5ODQ2NTc1NTA4MDMyMDgzNDgwMzk2OTI4MTYzOTkzODM4OTgxNzA5NjA2OTcxMTM3NDIzMjU1OTgwOTk1MDQzNDk3OTQ4MDM1NDU1OTgyNTk3MDM0MTIxMzQzNTQ1MTMzMDU0MT
U3MzMxMDg2NDM3NjI5ODcwNDgzNjQ4ODI2NzYxODk0Njk2MDUxMTA0OTk3NDY4NzYyNzg1MDg1OTA1NjI4ODg0NzEzMDkiLCJzIjoiMTU2Nzg3NTU4MjczMDYwNTc3NDM3NjA2MDAwMjgzODY1MDI4MTAxMjMzNTE1NDYwMTY0NjEyMzU0NzY2ODEzNjIyMjg4MjU1OTg4MzExMDIxOTk1Mjg3OTU1MzYxMjg5NzI0NjcwMzAxNjk4NTY3OTMyODY5MTA2MzY0NDYxODk4OTk1Mjg3NDgyMjE2ODAyMDk0MzAwMTI1NTI3Njk3NDE2Nzg2NjU1ODU3NDgyNjAyNzYwNjY1NTYxMzUzODY0MjE3ODI5OTIyNTE1MDcxMjc5MTQxNjczNzQyNTEyMTIxNzYzMzkwNjIxMzc2OTI5MTgzMzQ3OTc0NTEzNzI3NTAxNTg2MjA4MTA3MDA5ODIwNjQ5OTYyNjI2OTc5NjEwNzI2ODQ1ODI1MDc3ODE2OTUxOTMwNTMwODIxMTI4NDIxOTA0MDU0MTEwNjkxMDY4Mjk4NjczNTk1MjQ0MDY5NjAxNTEzODgyNzA3NDMwMTA3NzQ5NDczMDU3NzczOTc0MTQ2NTcwNzcwNTA3Mzg5OTg5NzI1MjQyNDU1Mzc4OTI3NDE1MjMwOTE5Mjg2Mzc3NTEwMDU3OTU2NTc3MTQyMjc5NzU5MDAwMjcyMDE5MzE4NTM2OTA1NzYyMDU3MTQzMzc2MzM2MzU4MDMzMzkwMDY2MzA4NjExOTgwOTc5MTg1NDgwNzk1MDY4MzYyMjc0MzA1NTkwODk3NzIyMzM5NTEwMTM3Mjc0Nzg2MzE4NTEwNTM3NjY4NTU3NzE0NDI0Njg1MzIyMTI5MzgyMTM4MzkzODQyOTQ0MTY0NDI4OTY3MzM0Mjk1MDYiLCJ6IjoiMTA3ODQ0NTgwNjk2OTM1MjY3MjI0Njg1OTUzNzEwMjQ4NzIxNTg4MDYxMjcwMzgyMjIzNjQ2MDk2MzQ5MTEzOTQyNzU0MDMwOTMxNjMyMTM3NDY2Mzk1NzY2MjgyODkxODQ3NjM0NjU3MTgyMjM5MDQ4MjE1NTc2NTU1MzUwMTAzMjU4NzYxNzYzMzgzOTAzMjA4Njc3NjE1NjMxMzM0NjMyNjQzMDA0MzQ5ODYzODEyODczMDQxMDc2NjExNzE2MjA1NDAxMDUyODEyNTA1NjAzNTg2NzYzODI2ODY2OTk0MTQzMTQ1MjM0NDM2Mjk4OTQ2ODMzMzI1Mjk2MTUzNjM3ODIyMTI2NDU5Njg0ODQzNTg3MjM2MDc1OTI1MDM3NDY1MDUwMzYxMDkzNDM5MTM5NzE4NDcyNjM5NzEwNzI1MTkzMzQwNjA2NTYxOTU0MTQyODk1NzgwMTA0MTI1MTg5MDExNDcyNDM0MjY4MDk5MzczNTU3Mzc3OTM1MTc2OTg4OTc2MDQ0MTQ1Njg2NTU4NDg3ODYyOTgyNDc4Njc0NzUwMDQ5MjY2NTI0Mjk1NzI1MDk3NzI4Nzk3NDYwMjEwMzk5Mzc3ODQ4OTgzMDMzMjUwNDE0MTc0MTQ1Mjk4OTI3NjQ4ODYxNDUyMTY2NjQ0OTgyMzE1MzA4OTM3ODMwODYzMTkyNTU2ODY5MzU1OTcxMDIzNDM2OTg1ODY1OTg3MzgzMTQ0ODM4OTc3NzkzMDYzODk0NDk3ODU5ODc1NTYwNzE4NzQzMzY4NjQ0MDA4ODc4MDM1NTU4ODg2Mzk4NTI5ODY3NjUxIn19ffkBcaBPvWULqOwxMeXc9rVGPWeY/zrjenpinABGeavylxg9VKDSZgBk2repnWQF/NngiQn0k+hJj2w23mlKI/LVJ6yjLqAfaLK0d9PblIbRzlok6u21QjS4Dx7muCERAkmnKxIY0YCgRXB16H9S0x4fH5yFIemub7inDuOG67829/Y739jLMGWgo6VMJB4cemKJEio5Z/5mLKgTVHMYtjBi/9lP3ZuPlNSgNixAUxLZro
i10PDukPsUxuEAXLnRaC+ku1zk3eRSg+egQWmwWHj+ec9D1ru4m47Vo4X/R4tlk9uTrUapmuG4fWCAoAoTTzqKqJFLJXTlBzO3RoFrsMHHRV7XiX/ZNGqkeZLdgKC/EIAumu/LD50YY3W1ErcKpTz+Ezhqz8mAuBQGLTfwPYCgR/+SU8YpYYvma5FK35JDtimGrOxiEFvh1HnAOjYNLB+AoIIe6s3NQv9J9TrOd5WW/SDVN98RV2lEhp/RPDSpBVKMgPg5lwCG9xdmN3dXBSVFVOa1huNkFyWXpzOgoL8NincrZ6kIy7QoYzPJhIp6iRUiUHmOAAj0d8Bxc3lI+QExgKAfJEVEiRmtfCUzf3UP9AYqzpaoudKZn5eIVoAQc+gyWKB3xACdspYL0Y6Jn2RPj5VFkbqYGwMEj+RHYS4pb+oSLqDJQTYScc5um90wXnIBziFOlONr9GXPRQGiBncfqA/LzqDVq2Aj26NN2nwQT5YDVPofc+xc1QPpl3XF4o/dtqwjrqBzJnB3BMBhzazaqgNcRRsTcEvHjRv1ZBVzDswD4tAK/KD4QbfFSniHCftdyvCLBbZhewAxCAMTH5kOUSzteq1PzaDfgdEGxnHHmSzniVnuBX9Bq9KNZF0IhAuCQ4b4UyCir6DQqeWiudjwNTKSrJZNbaa+k7fbTN6aaMGN7GAbWqbcD6D4GNYr9NMr/K5N572hzngAoGPpNBtsVF2c5wFGK/WV54CAgICAgID5AhGgZBqeju/DK9w6gsSRyeuChMfPiz3O5CJOt05Cwx6+Zq6g/qAj00WnUVvBv214PQY/n6Hz3ge+kdXc7bRq2zC31cCgK9Hxi/K6rqzGHb2hV1UhJwZV9RZh9kW/ROGpoNzP26qgWBd12S383lcW+7Kyb9gp/OyP5U+/eB3gkMTH/QzEhLag8yqxFoLeuvR5GBbAv96n6C5AFillVcGHJO1MCpH/nJugongKEEAA+Um2j9Oj6rBtKYKjeEb3zIv6vDwMB4ltGGWg1+BLf5OUANOczDp0kjz6BRqNExkadkzr5uKEhjae0i6g0kXS+Xw39zJ5DUOqKiMIKl3hcaVTIgnAEmke90PmIo+gjN41X4+PgYvGipSLeCRvP8LeGMm6Ot4iJbpNofTlRXqgqvwhtqLvuNKkBdPMSDAO28IarXiYXDhae80T+EiDK32gYJHzb9ZyTTcqeX/PSDKzNWm9SWT2MtEgxez0oIfijkmgMO//SplF4wKP6kvUTbLsUTi8LGdIzMJUL0q7t9AMom6gGtQvALeksBhOSUpKNXku8JP9WmQOhVSD9lk04NrphKCgdeaCcxJPJmGzB06WXeTH7Nr69ZLF5S5ahB/coqXdupygUKah9cJa5x9dtgVS6rymjOoNTbrSuw+AxvrqhZgYgI6gE+UVSRl8WVc4BZyM+zE9W5OEO58lpO97/pTn2PDkiB+A", + "root_hash": "3tses33E3t9z7W2gvHk8LizYcjWLTvGZwGZd3Q46SUUm", + }, + "txnTime": 1522769812, + "type": "108", + } +} diff --git a/aries_cloudagent/ledger/merkel_validation/tests/test_domain_txn_handler.py b/aries_cloudagent/ledger/merkel_validation/tests/test_domain_txn_handler.py new file mode 100644 index 0000000000..c8205e5a63 --- /dev/null +++ b/aries_cloudagent/ledger/merkel_validation/tests/test_domain_txn_handler.py @@ -0,0 +1,402 @@ +"""Tests for Domain Txn Handling Utils.""" +import base58 +import json + +from copy import 
deepcopy + +from unittest import TestCase + +from ..domain_txn_handler import ( + _extract_attr_typed_value, + parse_attr_txn, + decode_state_value, + extract_params_write_request, + hash_of, + make_state_path_for_attr, + prepare_attr_for_state, + prepare_nym_for_state, + prepare_revoc_reg_entry_for_state, + prepare_schema_for_state, + prepare_get_claim_def_for_state, + prepare_claim_def_for_state, + prepare_revoc_def_for_state, + prepare_get_revoc_reg_entry_for_state, + prepare_revoc_reg_entry_accum_for_state, +) + +CLAIM_DEF_TXN = { + "result": { + "txn": { + "data": { + "ver": 1, + "signature_type": "CL", + "ref": 10, + "tag": "some_tag", + "data": {"primary": "....", "revocation": "...."}, + }, + "metadata": { + "reqId": 1514280215504647, + "from": "L5AD5g65TDQr1PPHHRoiGf", + "endorser": "D6HG5g65TDQr1PPHHRoiGf", + "digest": "6cee82226c6e276c983f46d03e3b3d10436d90b67bf33dc67ce9901b44dbc97c", + "payloadDigest": "21f0f5c158ed6ad49ff855baf09a2ef9b4ed1a8015ac24bccc2e0106cd905685", + }, + }, + "txnMetadata": { + "txnTime": 1513945121, + "seqNo": 10, + "txnId": "HHAD5g65TDQr1PPHHRoiGf2L5AD5g65TDQr1PPHHRoiGf1|Degree1|CL|key1", + }, + } +} + +REVOC_REG_ENTRY_TXN = { + "result": { + "txn": { + "data": { + "ver": 1, + "revocRegDefId": "L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1", + "revocDefType": "CL_ACCUM", + "value": { + "accum": "accum_value", + "prevAccum": "prev_acuum_value", + "issued": [], + "revoked": [10, 36, 3478], + }, + }, + }, + "txnMetadata": { + "txnTime": 1513945121, + "seqNo": 10, + "txnId": "5:L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1", + }, + } +} + + +class TestDomainTxnHandler(TestCase): + """Domain Txn Handler Tests""" + + def test_extract_attr_typed_value(self): + test_txn_data = {"test": {...}} + with self.assertRaises(ValueError) as cm: + _extract_attr_typed_value(test_txn_data) + assert "ATTR should have 
one" in str(cm.exception) + test_txn_data = { + "raw": {...}, + "enc": {...}, + "hash": {...}, + } + with self.assertRaises(ValueError) as cm: + _extract_attr_typed_value(test_txn_data) + assert "ATTR should have only one" in str(cm.exception) + + def test_parse_attr_txn(self): + test_txn_data = {"raw": '{"name": "Alice"}'} + assert parse_attr_txn(test_txn_data) == ("raw", "name", '{"name": "Alice"}') + test_txn_data = {"enc": "test"} + assert parse_attr_txn(test_txn_data) == ("enc", "test", "test") + test_txn_data = {"hash": "test"} + assert parse_attr_txn(test_txn_data) == ("hash", "test", None) + + def test_decode_state_value(self): + test_value = "test_value" + test_lsn = "100" + test_lut = "test_lut" + test_encoded = {"val": test_value, "lsn": test_lsn, "lut": test_lut} + assert decode_state_value(json.dumps(test_encoded)) == ( + test_value, + test_lsn, + test_lut, + ) + + def test_hash_of(self): + test = {"test": "test"} + assert hash_of(test) + test = "123" + assert hash_of(test) + test = b"234" + assert hash_of(test) + + def test_make_state_path_for_attr(self): + assert b"did1:1:attrName1" == make_state_path_for_attr( + "did1", "attrName1", attr_is_hash=True + ) + assert ( + b"did1:1:677a81e8649df8f1a1e8af7709a5ece1d965cb684b2c185272114c5cc3b7ec49" + == make_state_path_for_attr("did1", "attrName1", attr_is_hash=False) + ) + assert ( + b"did1:1:677a81e8649df8f1a1e8af7709a5ece1d965cb684b2c185272114c5cc3b7ec49" + == make_state_path_for_attr("did1", "attrName1") + ) + + def test_prepare_attr_for_state(self): + txn = { + "result": { + "txn": { + "data": { + "ver": 1, + "dest": "N22KY2Dyvmuu2PyyqSFKue", + "raw": '{"name":"Alice"}', + }, + }, + "txnMetadata": { + "txnTime": 1513945121, + "seqNo": 10, + "txnId": "N22KY2Dyvmuu2PyyqSFKue|02", + }, + } + } + path, value_bytes = prepare_attr_for_state(txn) + assert ( + path + == b"N22KY2Dyvmuu2PyyqSFKue:1:82a3537ff0dbce7eec35d69edc3a189ee6f17d82f353a553f9aa96cb0be3ce89" + ) + assert ( + value_bytes + == b'{"lsn": 10, "lut": 1513945121, "val": 
"6d4a333838d0ef96756cccc680af2531075c512502fb68c5503c63d93de859b3"}' + ) + path = prepare_attr_for_state(txn, path_only=True) + assert ( + path + == b"N22KY2Dyvmuu2PyyqSFKue:1:82a3537ff0dbce7eec35d69edc3a189ee6f17d82f353a553f9aa96cb0be3ce89" + ) + + def test_prepare_nym_for_state(self): + txn = { + "result": { + "txn": { + "data": { + "dest": "N22KY2Dyvmuu2PyyqSFKue", + }, + }, + } + } + assert prepare_nym_for_state(txn) + + def test_prepare_schema_for_state(self): + txn = { + "result": { + "txn": { + "type": "101", + "protocolVersion": 2, + "data": { + "ver": 1, + "data": { + "name": "Degree", + "version": "1.0", + "attr_names": [ + "undergrad", + "last_name", + "first_name", + "birth_date", + "postgrad", + "expiry_date", + ], + }, + }, + "metadata": { + "reqId": 1514280215504647, + "from": "L5AD5g65TDQr1PPHHRoiGf", + "endorser": "D6HG5g65TDQr1PPHHRoiGf", + "digest": "6cee82226c6e276c983f46d03e3b3d10436d90b67bf33dc67ce9901b44dbc97c", + "payloadDigest": "21f0f5c158ed6ad49ff855baf09a2ef9b4ed1a8015ac24bccc2e0106cd905685", + }, + }, + "txnMetadata": { + "txnTime": 1513945121, + "seqNo": 10, + "txnId": "L5AD5g65TDQr1PPHHRoiGf1|Degree|1.0", + }, + } + } + path, value_bytes = prepare_schema_for_state(txn) + assert path == b"L5AD5g65TDQr1PPHHRoiGf:2:Degree:1.0" + assert ( + prepare_schema_for_state(txn, path_only=True) + == b"L5AD5g65TDQr1PPHHRoiGf:2:Degree:1.0" + ) + + def test_prepare_get_claim_def_for_state(self): + txn = deepcopy(CLAIM_DEF_TXN) + txn.get("result").get("txn").get("data").pop("ref") + with self.assertRaises(ValueError) as cm: + prepare_get_claim_def_for_state(txn) + assert "ref field is absent, but it must contain schema seq no" in cm + + def test_prepare_claim_def_for_state(self): + txn = deepcopy(CLAIM_DEF_TXN) + txn.get("result").get("txn").get("data").pop("ref") + with self.assertRaises(ValueError) as cm: + prepare_claim_def_for_state(txn) + assert "ref field is absent, but it must contain schema seq no" in cm + + txn = deepcopy(CLAIM_DEF_TXN) + 
txn.get("result").get("txn").get("data").pop("data") + with self.assertRaises(ValueError) as cm: + prepare_claim_def_for_state(txn) + assert "data field is absent, but it must contain components of keys" in str(cm.exception) + + txn = deepcopy(CLAIM_DEF_TXN) + path, value_bytes = prepare_claim_def_for_state(txn) + assert path == b"L5AD5g65TDQr1PPHHRoiGf:3:CL:10:some_tag" + assert ( + prepare_claim_def_for_state(txn, path_only=True) + == b"L5AD5g65TDQr1PPHHRoiGf:3:CL:10:some_tag" + ) + + def test_prepare_revoc_def_for_state(self): + txn = { + "result": { + "txn": { + "type": "113", + "protocolVersion": 2, + "data": { + "ver": 1, + "id": "L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1", + "credDefId": "FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag", + "revocDefType": "CL_ACCUM", + "tag": "tag1", + "value": { + "maxCredNum": 1000000, + "tailsHash": "6619ad3cf7e02fc29931a5cdc7bb70ba4b9283bda3badae297", + "tailsLocation": "http://tails.location.com", + "issuanceType": "ISSUANCE_BY_DEFAULT", + "publicKeys": {}, + }, + }, + "metadata": { + "reqId": 1514280215504647, + "from": "L5AD5g65TDQr1PPHHRoiGf", + "endorser": "D6HG5g65TDQr1PPHHRoiGf", + "digest": "6cee82226c6e276c983f46d03e3b3d10436d90b67bf33dc67ce9901b44dbc97c", + "payloadDigest": "21f0f5c158ed6ad49ff855baf09a2ef9b4ed1a8015ac24bccc2e0106cd905685", + }, + }, + "txnMetadata": { + "txnTime": 1513945121, + "seqNo": 10, + "txnId": "L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1", + }, + }, + } + path, value_bytes = prepare_revoc_def_for_state(txn) + assert ( + path + == b"L5AD5g65TDQr1PPHHRoiGf:4:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1" + ) + assert ( + prepare_revoc_def_for_state(txn, path_only=True) + == b"L5AD5g65TDQr1PPHHRoiGf:4:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1" + ) + + def test_prepare_get_revoc_reg_entry_for_state(self): + assert prepare_get_revoc_reg_entry_for_state(REVOC_REG_ENTRY_TXN) + + def test_prepare_revoc_reg_entry_for_state(self): + 
path, value_bytes = prepare_revoc_reg_entry_for_state(REVOC_REG_ENTRY_TXN) + assert ( + path + == b"5:L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1" + ) + assert ( + prepare_revoc_reg_entry_for_state(REVOC_REG_ENTRY_TXN, path_only=True) + == b"5:L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1" + ) + + def test_prepare_revoc_reg_entry_accum_for_state(self): + path, value_bytes = prepare_revoc_reg_entry_accum_for_state(REVOC_REG_ENTRY_TXN) + assert ( + path + == b"6:L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1" + ) + + def test_extract_params_write_request(self): + write_request = { + "result": { + "txn": { + "data": { + "data": { + "attr_names": ["axuall_proof_id"], + "name": "manual_issuance", + "version": "4.1.0", + } + }, + "metadata": { + "digest": "f440da62ab1c38601e41b5e148370b64af3183a2428f327cfb8ac83ca4cbc698", + "from": "F72i3Y3Q4i466efjYJYCHM", + "payloadDigest": "0ecfd475ed34635b6623bcbcda422ff90dee53f66de20a7441bca04ba0632670", + "reqId": 1634854863739554300, + "taaAcceptance": { + "mechanism": "for_session", + "taaDigest": "8cee5d7a573e4893b08ff53a0761a22a1607df3b3fcd7e75b98696c92879641f", + "time": 1600300800, + }, + }, + "protocolVersion": 2, + "type": "101", + }, + "txnMetadata": { + "seqNo": 75335, + "txnId": "F72i3Y3Q4i466efjYJYCHM:2:manual_issuance:4.1.0", + "txnTime": 1634854863, + }, + "ver": "1", + "auditPath": [ + "DhzuS5ZkyxUrQbRzbCfq3NDcTnTzFcJHYJYyTAqgMymT", + "AtYz5H13m1ZyFiGVVLwRL7au8muhCMwikeQrMwJBJ5d9", + "3GpqpZw4jymXDFm8Mf5gSMHSJ8r1wXutiZ1x26kxwWBE", + "EQyUpv9z64prPmDqkb8n2SzBtor3tGaNw4qZP2psG2wD", + "9fbpJjWbDuNbCrC7GQxjVxDGJ48k7rdBDf5ATu7PXVx3", + "CRbATGbi9DbN1pphXMcREVZTUUqsQbD9mN5ikKm1FkSc", + "E19R3Cty6iLtWBoxSqHLrxTSJSKpBtj3wJEdD98pS9g8", + ], + "ledgerSize": 75335, + "reqSignature": { + "type": "ED25519", + "values": [ + { + "from": "F72i3Y3Q4i466efjYJYCHM", + "value": 
"29sKMwQjm6r2BjpbJdFi1TGvowaxbi9nnAPP9LwPHhuBzwtfWwqJFg4Ur2xAWBLYfPLnMNNSgwwNesRSb1C1L72d",
+                    }
+                ],
+            },
+            "rootHash": "4aM6yCpamk82Uqb414mNRpEdYkwdSMhi3HkXgN7YvRaX",
+        }
+    }
+        txn = deepcopy(write_request)
+        (
+            tree_size,
+            leaf_index,
+            decoded_audit_path,
+            expected_root_hash,
+        ) = extract_params_write_request(txn)
+
+        given_audit_path = [
+            "DhzuS5ZkyxUrQbRzbCfq3NDcTnTzFcJHYJYyTAqgMymT",
+            "AtYz5H13m1ZyFiGVVLwRL7au8muhCMwikeQrMwJBJ5d9",
+            "3GpqpZw4jymXDFm8Mf5gSMHSJ8r1wXutiZ1x26kxwWBE",
+            "EQyUpv9z64prPmDqkb8n2SzBtor3tGaNw4qZP2psG2wD",
+            "9fbpJjWbDuNbCrC7GQxjVxDGJ48k7rdBDf5ATu7PXVx3",
+            "CRbATGbi9DbN1pphXMcREVZTUUqsQbD9mN5ikKm1FkSc",
+            "E19R3Cty6iLtWBoxSqHLrxTSJSKpBtj3wJEdD98pS9g8",
+        ]
+        expected_audit_path = [
+            base58.b58decode(hash_str.encode("utf-8")) for hash_str in given_audit_path
+        ]
+        expected_hash = base58.b58decode(
+            "4aM6yCpamk82Uqb414mNRpEdYkwdSMhi3HkXgN7YvRaX".encode("utf-8")
+        )
+        assert tree_size == 75335
+        assert leaf_index == 75334
+        assert expected_root_hash == expected_hash
+        assert decoded_audit_path == expected_audit_path
+
+        txn = deepcopy(write_request)
+        txn["result"]["txnMetadata"]["seqNo"] = 1000
+        with self.assertRaises(Exception) as cm:
+            extract_params_write_request(txn)
+        assert "auditPath length does not match with given seqNo" in str(cm.exception)
diff --git a/aries_cloudagent/ledger/merkel_validation/tests/test_trie.py b/aries_cloudagent/ledger/merkel_validation/tests/test_trie.py
new file mode 100644
index 0000000000..c522b915bc
--- /dev/null
+++ b/aries_cloudagent/ledger/merkel_validation/tests/test_trie.py
@@ -0,0 +1,156 @@
+import json
+
+from asynctest import TestCase
+
+from ..domain_txn_handler import (
+    prepare_for_state_read,
+    get_proof_nodes,
+)
+from ..hasher import TreeHasher, HexTreeHasher
+from ..trie import SubTrie
+from ..merkel_verifier import MerkleVerifier
+
+from .test_data import (
+    GET_REVOC_REG_REPLY_A,
+    GET_REVOC_REG_REPLY_B,
+    GET_ATTRIB_REPLY,
+    GET_CLAIM_DEF_REPLY_INVALID,
+    GET_CLAIM_DEF_REPLY_A,
GET_CLAIM_DEF_REPLY_B, + GET_REVOC_REG_DEF_REPLY_A, + GET_REVOC_REG_DEF_REPLY_B, + GET_REVOC_REG_DELTA_REPLY_A, + GET_REVOC_REG_DELTA_REPLY_B, + GET_REVOC_REG_DELTA_REPLY_C, + GET_NYM_REPLY, + GET_SCHEMA_REPLY_A, + GET_SCHEMA_REPLY_B, + RAW_HEX_LEAF, + SHA256_AUDIT_PATH, +) + + +class TestSubTrie(TestCase): + def test_get_setter_root_hash(self): + test_trie = SubTrie() + test_trie.root_hash = 530343892119126197 + assert test_trie.root_hash == 530343892119126197 + + def test_get_blank_node(self): + assert SubTrie._get_node_type(b"") == 0 + + async def test_verify_spv_proof_catch_exception(self): + assert not await SubTrie.verify_spv_proof( + expected_value="test", proof_nodes="test" + ) + + +class TestMPTStateProofValidation(TestCase): + async def test_validate_get_nym(self): + reply = GET_NYM_REPLY + assert await SubTrie.verify_spv_proof( + proof_nodes=get_proof_nodes(reply), + expected_value=prepare_for_state_read(reply), + ) + + async def test_validate_get_attrib(self): + reply = GET_ATTRIB_REPLY + assert await SubTrie.verify_spv_proof( + proof_nodes=get_proof_nodes(reply), + expected_value=prepare_for_state_read(reply), + ) + + async def test_validate_get_schema(self): + reply_a = GET_SCHEMA_REPLY_A + assert await SubTrie.verify_spv_proof( + proof_nodes=get_proof_nodes(reply_a), + expected_value=prepare_for_state_read(reply_a), + ) + reply_b = GET_SCHEMA_REPLY_B + assert await SubTrie.verify_spv_proof( + proof_nodes=get_proof_nodes(reply_b), + expected_value=prepare_for_state_read(reply_b), + ) + + async def test_validate_get_claim_def(self): + reply_a = GET_CLAIM_DEF_REPLY_A + assert await SubTrie.verify_spv_proof( + proof_nodes=get_proof_nodes(reply_a), + expected_value=prepare_for_state_read(reply_a), + ) + reply_b = GET_CLAIM_DEF_REPLY_B + assert await SubTrie.verify_spv_proof( + proof_nodes=get_proof_nodes(reply_b), + expected_value=prepare_for_state_read(reply_b), + ) + reply_c = GET_CLAIM_DEF_REPLY_INVALID + assert not await SubTrie.verify_spv_proof( + 
proof_nodes=get_proof_nodes(reply_c), + expected_value=prepare_for_state_read(reply_c), + ) + + async def test_validate_get_revoc_reg(self): + reply_a = GET_REVOC_REG_REPLY_A + assert await SubTrie.verify_spv_proof( + proof_nodes=get_proof_nodes(reply_a), + expected_value=prepare_for_state_read(reply_a), + ) + reply_b = GET_REVOC_REG_REPLY_B + assert await SubTrie.verify_spv_proof( + proof_nodes=get_proof_nodes(reply_b), + expected_value=prepare_for_state_read(reply_b), + ) + + async def test_validate_get_revoc_reg_def(self): + reply_a = GET_REVOC_REG_DEF_REPLY_A + assert await SubTrie.verify_spv_proof( + proof_nodes=get_proof_nodes(reply_a), + expected_value=prepare_for_state_read(reply_a), + ) + reply_b = GET_REVOC_REG_DEF_REPLY_B + assert await SubTrie.verify_spv_proof( + proof_nodes=get_proof_nodes(reply_b), + expected_value=prepare_for_state_read(reply_b), + ) + + async def test_validate_get_revoc_reg_delta(self): + reply_a = GET_REVOC_REG_DELTA_REPLY_A + assert await SubTrie.verify_spv_proof( + proof_nodes=get_proof_nodes(reply_a), + expected_value=prepare_for_state_read(reply_a), + ) + reply_b = GET_REVOC_REG_DELTA_REPLY_C + assert await SubTrie.verify_spv_proof( + proof_nodes=get_proof_nodes(reply_b), + expected_value=prepare_for_state_read(reply_b), + ) + reply_c = GET_REVOC_REG_DELTA_REPLY_B + proof_nodes = get_proof_nodes(reply_c) + expected_values = prepare_for_state_read(reply_c) + assert await SubTrie.verify_spv_proof( + proof_nodes=proof_nodes[0], + expected_value=expected_values[0], + ) + assert await SubTrie.verify_spv_proof( + proof_nodes=proof_nodes[1], + expected_value=expected_values[1], + ) + + +class TestMerkleRootHashValidation(TestCase): + async def test_verify_leaf_inclusion_x(self): + merkle_verifier = MerkleVerifier(HexTreeHasher()) + leaf_index = 848049 + tree_size = 3630887 + expected_root_hash = ( + b"78316a05c9bcf14a3a4548f5b854a9adfcd46a4c034401b3ce7eb7ac2f1d0ecb" + ) + assert ( + await merkle_verifier.calculate_root_hash( + 
                RAW_HEX_LEAF,
+                leaf_index,
+                SHA256_AUDIT_PATH[:],
+                tree_size,
+            )
+            == expected_root_hash
+        )
diff --git a/aries_cloudagent/ledger/merkel_validation/tests/test_utils.py b/aries_cloudagent/ledger/merkel_validation/tests/test_utils.py
new file mode 100644
index 0000000000..94ad55a0ec
--- /dev/null
+++ b/aries_cloudagent/ledger/merkel_validation/tests/test_utils.py
@@ -0,0 +1,18 @@
+"""Tests for Merkel Validation Utils."""
+import json
+
+from unittest import TestCase
+
+from ..utils import encode_hex, ascii_chr
+
+
+class TestUtils(TestCase):
+    """Merkel Validation Utils Tests"""
+
+    def test_encode_hex(self):
+        assert encode_hex("test")
+        with self.assertRaises(TypeError):
+            encode_hex(123)
+
+    def test_ascii_chr(self):
+        assert ascii_chr(16 * 5 + 6)
diff --git a/aries_cloudagent/ledger/merkel_validation/trie.py b/aries_cloudagent/ledger/merkel_validation/trie.py
new file mode 100644
index 0000000000..2c7acedf98
--- /dev/null
+++ b/aries_cloudagent/ledger/merkel_validation/trie.py
@@ -0,0 +1,96 @@
+"""Validates State Proof."""
+import json
+
+from collections import (
+    OrderedDict,
+)
+from rlp import (
+    encode as rlp_encode,
+    decode as rlp_decode,
+    DecodingError,
+)
+from .utils import (
+    sha3_256,
+    NIBBLE_TERMINATOR,
+    unpack_to_nibbles,
+)
+
+from .constants import (
+    NODE_TYPE_BLANK,
+    NODE_TYPE_LEAF,
+    NODE_TYPE_EXTENSION,
+    NODE_TYPE_BRANCH,
+    BLANK_NODE,
+)
+
+
+class SubTrie:
+    """Utility class for SubTrie and State Proof validation."""
+
+    def __init__(self, root_hash=None):
+        """MPT SubTrie dictionary like interface."""
+        self._subtrie = OrderedDict()
+        self.set_root_hash(root_hash)
+
+    @property
+    def root_hash(self):
+        """Return 32 bytes string."""
+        return self._root_hash
+
+    @root_hash.setter
+    def root_hash(self, value):
+        self.set_root_hash(value)
+
+    def set_root_hash(self, root_hash=None):
+        """Set the root hash."""
+        self._root_hash = root_hash
+
+    @staticmethod
+    def _get_node_type(node):
+        if node == BLANK_NODE:
+            return NODE_TYPE_BLANK
+        if len(node) == 2:
+            nibbles = unpack_to_nibbles(node[0])
+            has_terminator = nibbles and nibbles[-1] == NIBBLE_TERMINATOR
+            return NODE_TYPE_LEAF if has_terminator else NODE_TYPE_EXTENSION
+        if len(node) == 17:
+            return NODE_TYPE_BRANCH
+
+    @staticmethod
+    async def verify_spv_proof(expected_value, proof_nodes, serialized=True):
+        """Verify State Proof."""
+        try:
+            if serialized:
+                proof_nodes = rlp_decode(proof_nodes)
+            new_trie = await SubTrie.get_new_trie_with_proof_nodes(proof_nodes)
+            expected_value = json.loads(expected_value)
+            for encoded_node in list(new_trie._subtrie.values()):
+                try:
+                    decoded_node = rlp_decode(encoded_node)
+                    # branch node
+                    if SubTrie._get_node_type(decoded_node) == NODE_TYPE_BRANCH:
+                        if (
+                            json.loads(rlp_decode(decoded_node[-1])[0].decode("utf-8"))
+                        ) == expected_value:
+                            return True
+                    # leaf or extension node
+                    if SubTrie._get_node_type(decoded_node) == NODE_TYPE_LEAF:
+                        if (
+                            json.loads(rlp_decode(decoded_node[1])[0].decode("utf-8"))
+                        ) == expected_value:
+                            return True
+                except DecodingError:
+                    continue
+            return False
+        except Exception:
+            return False
+
+    @staticmethod
+    async def get_new_trie_with_proof_nodes(proof_nodes):
+        """Return SubTrie created from proof_nodes."""
+        new_trie = SubTrie()
+        for node in proof_nodes:
+            R = rlp_encode(node)
+            H = sha3_256(R)
+            new_trie._subtrie[H] = R
+        return new_trie
diff --git a/aries_cloudagent/ledger/merkel_validation/utils.py b/aries_cloudagent/ledger/merkel_validation/utils.py
new file mode 100644
index 0000000000..c3240b36a5
--- /dev/null
+++ b/aries_cloudagent/ledger/merkel_validation/utils.py
@@ -0,0 +1,75 @@
+"""Merkel Validation Utils."""
+from binascii import hexlify
+import hashlib
+
+hash_function = hashlib.sha256()
+
+NIBBLE_TERMINATOR = 16
+hti = {c: i for i, c in enumerate("0123456789abcdef")}
+
+
+def sha3_256(x):
+    """Return 256 bit digest."""
+    return hashlib.sha3_256(x).digest()
+
+
+def encode_hex(b):
+    """Return the hexadecimal string representation of a str or bytes input.
+ + Args: + b: string or bytes + + """ + if isinstance(b, str): + b = bytes(b, "utf-8") + if isinstance(b, bytes): + return str(hexlify(b), "utf-8") + raise TypeError("Value must be an instance of str or bytes") + + +def bin_to_nibbles(s): + """Convert string s to nibbles (half-bytes).""" + return [hti[c] for c in encode_hex(s)] + + +def unpack_to_nibbles(bindata): + """Unpack packed binary data to nibbles. + + Args: + bindata: binary packed from nibbles + + """ + + o = bin_to_nibbles(bindata) + flags = o[0] + if flags & 2: + o.append(NIBBLE_TERMINATOR) + if flags & 1 == 1: + o = o[1:] + else: + o = o[2:] + return o + + +def audit_path_length(index: int, tree_size: int): + """Return AuditPath length. + + Args: + index: Leaf index + tree_size: Tree size + + """ + length = 0 + last_node = tree_size - 1 + while last_node > 0: + if index % 2 or index < last_node: + length += 1 + index //= 2 + last_node //= 2 + + return length + + +def ascii_chr(value): + """Return bytes object.""" + return bytes([value]) diff --git a/aries_cloudagent/ledger/multiple_ledger/__init__.py b/aries_cloudagent/ledger/multiple_ledger/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/aries_cloudagent/ledger/multiple_ledger/base_manager.py b/aries_cloudagent/ledger/multiple_ledger/base_manager.py new file mode 100644 index 0000000000..0e34da32c1 --- /dev/null +++ b/aries_cloudagent/ledger/multiple_ledger/base_manager.py @@ -0,0 +1,48 @@ +"""Manager for multiple ledger.""" + +from abc import ABC, abstractmethod +from typing import TypeVar, Optional, Tuple, Mapping + +from ...core.error import BaseError +from ...core.profile import Profile + +T = TypeVar("T") + + +class MultipleLedgerManagerError(BaseError): + """Generic multiledger error.""" + + +class BaseMultipleLedgerManager(ABC): + """Base class for handling multiple ledger support.""" + + def __init__(self, profile: Profile): + """Initialize Multiple Ledger Manager.""" + + @abstractmethod + async def 
get_write_ledger(self) -> Tuple[str, T]: + """Return write ledger.""" + + @abstractmethod + async def get_prod_ledgers(self) -> Mapping: + """Return configured production ledgers.""" + + @abstractmethod + async def get_nonprod_ledgers(self) -> Mapping: + """Return configured non production ledgers.""" + + @abstractmethod + async def _get_ledger_by_did( + self, ledger_id: str, did: str + ) -> Optional[Tuple[str, T, bool]]: + """Build and submit GET_NYM request and process response.""" + + @abstractmethod + async def lookup_did_in_configured_ledgers( + self, did: str, cache_did: bool + ) -> Tuple[str, T]: + """Lookup given DID in configured ledgers in parallel.""" + + def extract_did_from_identifier(self, identifier: str) -> str: + """Return did from record identifier (REV_REG_ID, CRED_DEF_ID, SCHEMA_ID).""" + return identifier.split(":")[0] diff --git a/aries_cloudagent/ledger/multiple_ledger/indy_manager.py b/aries_cloudagent/ledger/multiple_ledger/indy_manager.py new file mode 100644 index 0000000000..51ab398170 --- /dev/null +++ b/aries_cloudagent/ledger/multiple_ledger/indy_manager.py @@ -0,0 +1,244 @@ +"""Multiple IndySdkLedger Manager.""" +import asyncio +import concurrent.futures +import logging +import json + +from collections import OrderedDict +from typing import Optional, Tuple, Mapping + +from ...cache.base import BaseCache +from ...core.profile import Profile +from ...ledger.error import LedgerError +from ...wallet.crypto import did_is_self_certified + +from ..indy import IndySdkLedger +from ..merkel_validation.domain_txn_handler import ( + prepare_for_state_read, + get_proof_nodes, +) +from ..merkel_validation.trie import SubTrie + +from .base_manager import BaseMultipleLedgerManager, MultipleLedgerManagerError + +LOGGER = logging.getLogger(__name__) + + +class MultiIndyLedgerManager(BaseMultipleLedgerManager): + """Multiple Indy SDK Ledger Manager.""" + + def __init__( + self, + profile: Profile, + production_ledgers: OrderedDict = OrderedDict(), + 
+        non_production_ledgers: OrderedDict = OrderedDict(),
+        write_ledger_info: Tuple[str, IndySdkLedger] = None,
+        cache_ttl: int = None,
+    ):
+        """Initialize MultiIndyLedgerManager.
+
+        Args:
+            profile: The base profile for this manager
+            production_ledgers: production IndySdkLedger mapping
+            non_production_ledgers: non_production IndySdkLedger mapping
+            cache_ttl: Time in sec to persist did_ledger_id_resolver cache keys
+
+        """
+        self.profile = profile
+        self.production_ledgers = production_ledgers
+        self.non_production_ledgers = non_production_ledgers
+        self.write_ledger_info = write_ledger_info
+        self.executor = concurrent.futures.ThreadPoolExecutor(max_workers=5)
+        self.cache_ttl = cache_ttl
+
+    async def get_write_ledger(self) -> Optional[Tuple[str, IndySdkLedger]]:
+        """Return the write IndySdkLedger instance."""
+        return self.write_ledger_info
+
+    async def get_prod_ledgers(self) -> Mapping:
+        """Return production ledgers mapping."""
+        return self.production_ledgers
+
+    async def get_nonprod_ledgers(self) -> Mapping:
+        """Return non_production ledgers mapping."""
+        return self.non_production_ledgers
+
+    async def _get_ledger_by_did(
+        self,
+        ledger_id: str,
+        did: str,
+    ) -> Optional[Tuple[str, IndySdkLedger, bool]]:
+        """Build and submit GET_NYM request and process response.
+
+        A successful response returns a tuple of the ledger_id, the IndySdkLedger
+        instance, and an is_self_certified flag; an unsuccessful response returns None.
+ + Args: + ledger_id: provided ledger_id to retrieve IndySdkLedger instance + from production_ledgers or non_production_ledgers + did: provided DID + + Return: + (str, IndySdkLedger, bool) or None + """ + try: + indy_sdk_ledger = None + if ledger_id in self.production_ledgers: + indy_sdk_ledger = self.production_ledgers.get(ledger_id) + else: + indy_sdk_ledger = self.non_production_ledgers.get(ledger_id) + async with indy_sdk_ledger: + request = await indy_sdk_ledger.build_and_return_get_nym_request( + None, did + ) + response_json = await asyncio.wait_for( + indy_sdk_ledger.submit_get_nym_request(request), 10 + ) + response = json.loads(response_json) + data = response.get("result").get("data") + if not data: + LOGGER.warning(f"Did {did} not posted to ledger {ledger_id}") + return None + if isinstance(data, str): + data = json.loads(data) + if not await SubTrie.verify_spv_proof( + expected_value=prepare_for_state_read(response), + proof_nodes=get_proof_nodes(response), + ): + LOGGER.warning( + f"State Proof validation failed for Did {did} " + f"and ledger {ledger_id}" + ) + return None + if did_is_self_certified(did, data.get("verkey")): + return (ledger_id, indy_sdk_ledger, True) + return (ledger_id, indy_sdk_ledger, False) + except asyncio.TimeoutError: + LOGGER.exception( + f"get-nym request timedout for Did {did} and " + f"ledger {ledger_id}, reply not received within 10 sec" + ) + return None + except LedgerError as err: + LOGGER.error( + "Exception when building and submitting get-nym request, " + f"for Did {did} and ledger {ledger_id}, {err}" + ) + return None + + async def lookup_did_in_configured_ledgers( + self, did: str, cache_did: bool = True + ) -> Tuple[str, IndySdkLedger]: + """Lookup given DID in configured ledgers in parallel.""" + self.cache = self.profile.inject_or(BaseCache) + cache_key = f"did_ledger_id_resolver::{did}" + if bool(cache_did and self.cache and await self.cache.get(cache_key)): + cached_ledger_id = await 
self.cache.get(cache_key) + if cached_ledger_id in self.production_ledgers: + return (cached_ledger_id, self.production_ledgers.get(cached_ledger_id)) + elif cached_ledger_id in self.non_production_ledgers: + return ( + cached_ledger_id, + self.non_production_ledgers.get(cached_ledger_id), + ) + else: + raise MultipleLedgerManagerError( + f"cached ledger_id {cached_ledger_id} not found in either " + "production_ledgers or non_production_ledgers" + ) + applicable_prod_ledgers = {"self_certified": {}, "non_self_certified": {}} + applicable_non_prod_ledgers = {"self_certified": {}, "non_self_certified": {}} + ledger_ids = list(self.production_ledgers.keys()) + list( + self.non_production_ledgers.keys() + ) + coro_futures = { + self.executor.submit(self._get_ledger_by_did, ledger_id, did): ledger_id + for ledger_id in ledger_ids + } + for coro_future in concurrent.futures.as_completed(coro_futures): + result = await coro_future.result() + if result: + applicable_ledger_id = result[0] + applicable_ledger_inst = result[1] + is_self_certified = result[2] + if applicable_ledger_id in self.production_ledgers: + insert_key = list(self.production_ledgers).index( + applicable_ledger_id + ) + if is_self_certified: + applicable_prod_ledgers["self_certified"][insert_key] = ( + applicable_ledger_id, + applicable_ledger_inst, + ) + else: + applicable_prod_ledgers["non_self_certified"][insert_key] = ( + applicable_ledger_id, + applicable_ledger_inst, + ) + else: + insert_key = list(self.non_production_ledgers).index( + applicable_ledger_id + ) + if is_self_certified: + applicable_non_prod_ledgers["self_certified"][insert_key] = ( + applicable_ledger_id, + applicable_ledger_inst, + ) + else: + applicable_non_prod_ledgers["non_self_certified"][ + insert_key + ] = (applicable_ledger_id, applicable_ledger_inst) + applicable_prod_ledgers["self_certified"] = OrderedDict( + sorted(applicable_prod_ledgers.get("self_certified").items()) + ) + applicable_prod_ledgers["non_self_certified"] = 
OrderedDict( + sorted(applicable_prod_ledgers.get("non_self_certified").items()) + ) + applicable_non_prod_ledgers["self_certified"] = OrderedDict( + sorted(applicable_non_prod_ledgers.get("self_certified").items()) + ) + applicable_non_prod_ledgers["non_self_certified"] = OrderedDict( + sorted(applicable_non_prod_ledgers.get("non_self_certified").items()) + ) + if len(applicable_prod_ledgers.get("self_certified")) > 0: + successful_ledger_inst = list( + applicable_prod_ledgers.get("self_certified").values() + )[0] + if cache_did and self.cache: + await self.cache.set( + cache_key, successful_ledger_inst[0], self.cache_ttl + ) + return successful_ledger_inst + elif len(applicable_non_prod_ledgers.get("self_certified")) > 0: + successful_ledger_inst = list( + applicable_non_prod_ledgers.get("self_certified").values() + )[0] + if cache_did and self.cache: + await self.cache.set( + cache_key, successful_ledger_inst[0], self.cache_ttl + ) + return successful_ledger_inst + elif len(applicable_prod_ledgers.get("non_self_certified")) > 0: + successful_ledger_inst = list( + applicable_prod_ledgers.get("non_self_certified").values() + )[0] + if cache_did and self.cache: + await self.cache.set( + cache_key, successful_ledger_inst[0], self.cache_ttl + ) + return successful_ledger_inst + elif len(applicable_non_prod_ledgers.get("non_self_certified")) > 0: + successful_ledger_inst = list( + applicable_non_prod_ledgers.get("non_self_certified").values() + )[0] + if cache_did and self.cache: + await self.cache.set( + cache_key, successful_ledger_inst[0], self.cache_ttl + ) + return successful_ledger_inst + else: + raise MultipleLedgerManagerError( + f"DID {did} not found in any of the ledgers total: " + f"(production: {len(self.production_ledgers)}, " + f"non_production: {len(self.non_production_ledgers)})" + ) diff --git a/aries_cloudagent/ledger/multiple_ledger/indy_vdr_manager.py b/aries_cloudagent/ledger/multiple_ledger/indy_vdr_manager.py new file mode 100644 index 
0000000000..fdc5b7beb0
--- /dev/null
+++ b/aries_cloudagent/ledger/multiple_ledger/indy_vdr_manager.py
@@ -0,0 +1,244 @@
+"""Multiple IndyVdrLedger Manager."""
+import asyncio
+import concurrent.futures
+import logging
+import json
+
+from collections import OrderedDict
+from typing import Optional, Tuple, Mapping
+
+from ...cache.base import BaseCache
+from ...core.profile import Profile
+from ...ledger.error import LedgerError
+from ...wallet.crypto import did_is_self_certified
+
+from ..indy_vdr import IndyVdrLedger
+from ..merkel_validation.domain_txn_handler import (
+    prepare_for_state_read,
+    get_proof_nodes,
+)
+from ..merkel_validation.trie import SubTrie
+
+from .base_manager import BaseMultipleLedgerManager, MultipleLedgerManagerError
+
+LOGGER = logging.getLogger(__name__)
+
+
+class MultiIndyVDRLedgerManager(BaseMultipleLedgerManager):
+    """Multiple Indy VDR Ledger Manager."""
+
+    def __init__(
+        self,
+        profile: Profile,
+        production_ledgers: OrderedDict = OrderedDict(),
+        non_production_ledgers: OrderedDict = OrderedDict(),
+        write_ledger_info: Tuple[str, IndyVdrLedger] = None,
+        cache_ttl: int = None,
+    ):
+        """Initialize MultiIndyVDRLedgerManager.
+
+        Args:
+            profile: The base profile for this manager
+            production_ledgers: production IndyVdrLedger mapping
+            non_production_ledgers: non_production IndyVdrLedger mapping
+            cache_ttl: Time in sec to persist did_ledger_id_resolver cache keys
+
+        """
+        self.profile = profile
+        self.production_ledgers = production_ledgers
+        self.non_production_ledgers = non_production_ledgers
+        self.write_ledger_info = write_ledger_info
+        self.executor = concurrent.futures.ThreadPoolExecutor(max_workers=5)
+        self.cache_ttl = cache_ttl
+
+    async def get_write_ledger(self) -> Optional[Tuple[str, IndyVdrLedger]]:
+        """Return the write IndyVdrLedger instance."""
+        return self.write_ledger_info
+
+    async def get_prod_ledgers(self) -> Mapping:
+        """Return production ledgers mapping."""
+        return self.production_ledgers
+
+    async def get_nonprod_ledgers(self) -> Mapping:
+        """Return non_production ledgers mapping."""
+        return self.non_production_ledgers
+
+    async def _get_ledger_by_did(
+        self,
+        ledger_id: str,
+        did: str,
+    ) -> Optional[Tuple[str, IndyVdrLedger, bool]]:
+        """Build and submit GET_NYM request and process response.
+
+        A successful response returns a tuple of the ledger_id, the IndyVdrLedger
+        instance, and an is_self_certified flag; an unsuccessful response returns None.
+ + Args: + ledger_id: provided ledger_id to retrieve IndyVdrLedger instance + from production_ledgers or non_production_ledgers + did: provided DID + + Return: + (str, IndyVdrLedger, bool) or None + """ + try: + indy_vdr_ledger = None + if ledger_id in self.production_ledgers: + indy_vdr_ledger = self.production_ledgers.get(ledger_id) + else: + indy_vdr_ledger = self.non_production_ledgers.get(ledger_id) + async with indy_vdr_ledger: + request = await indy_vdr_ledger.build_and_return_get_nym_request( + None, did + ) + response_json = await asyncio.wait_for( + indy_vdr_ledger.submit_get_nym_request(request), 10 + ) + response = json.loads(response_json) + data = response.get("result").get("data") + if not data: + LOGGER.warning(f"Did {did} not posted to ledger {ledger_id}") + return None + if isinstance(data, str): + data = json.loads(data) + if not await SubTrie.verify_spv_proof( + expected_value=prepare_for_state_read(response), + proof_nodes=get_proof_nodes(response), + ): + LOGGER.warning( + f"State Proof validation failed for Did {did} " + f"and ledger {ledger_id}" + ) + return None + if did_is_self_certified(did, data.get("verkey")): + return (ledger_id, indy_vdr_ledger, True) + return (ledger_id, indy_vdr_ledger, False) + except asyncio.TimeoutError: + LOGGER.exception( + f"get-nym request timedout for Did {did} and " + f"ledger {ledger_id}, reply not received within 10 sec" + ) + return None + except LedgerError as err: + LOGGER.error( + "Exception when building and submitting get-nym request, " + f"for Did {did} and ledger {ledger_id}, {err}" + ) + return None + + async def lookup_did_in_configured_ledgers( + self, did: str, cache_did: bool = True + ) -> Tuple[str, IndyVdrLedger]: + """Lookup given DID in configured ledgers in parallel.""" + self.cache = self.profile.inject_or(BaseCache) + cache_key = f"did_ledger_id_resolver::{did}" + if bool(cache_did and self.cache and await self.cache.get(cache_key)): + cached_ledger_id = await 
self.cache.get(cache_key) + if cached_ledger_id in self.production_ledgers: + return (cached_ledger_id, self.production_ledgers.get(cached_ledger_id)) + elif cached_ledger_id in self.non_production_ledgers: + return ( + cached_ledger_id, + self.non_production_ledgers.get(cached_ledger_id), + ) + else: + raise MultipleLedgerManagerError( + f"cached ledger_id {cached_ledger_id} not found in either " + "production_ledgers or non_production_ledgers" + ) + applicable_prod_ledgers = {"self_certified": {}, "non_self_certified": {}} + applicable_non_prod_ledgers = {"self_certified": {}, "non_self_certified": {}} + ledger_ids = list(self.production_ledgers.keys()) + list( + self.non_production_ledgers.keys() + ) + coro_futures = { + self.executor.submit(self._get_ledger_by_did, ledger_id, did): ledger_id + for ledger_id in ledger_ids + } + for coro_future in concurrent.futures.as_completed(coro_futures): + result = await coro_future.result() + if result: + applicable_ledger_id = result[0] + applicable_ledger_inst = result[1] + is_self_certified = result[2] + if applicable_ledger_id in self.production_ledgers: + insert_key = list(self.production_ledgers).index( + applicable_ledger_id + ) + if is_self_certified: + applicable_prod_ledgers["self_certified"][insert_key] = ( + applicable_ledger_id, + applicable_ledger_inst, + ) + else: + applicable_prod_ledgers["non_self_certified"][insert_key] = ( + applicable_ledger_id, + applicable_ledger_inst, + ) + else: + insert_key = list(self.non_production_ledgers).index( + applicable_ledger_id + ) + if is_self_certified: + applicable_non_prod_ledgers["self_certified"][insert_key] = ( + applicable_ledger_id, + applicable_ledger_inst, + ) + else: + applicable_non_prod_ledgers["non_self_certified"][ + insert_key + ] = (applicable_ledger_id, applicable_ledger_inst) + applicable_prod_ledgers["self_certified"] = OrderedDict( + sorted(applicable_prod_ledgers.get("self_certified").items()) + ) + applicable_prod_ledgers["non_self_certified"] = 
OrderedDict( + sorted(applicable_prod_ledgers.get("non_self_certified").items()) + ) + applicable_non_prod_ledgers["self_certified"] = OrderedDict( + sorted(applicable_non_prod_ledgers.get("self_certified").items()) + ) + applicable_non_prod_ledgers["non_self_certified"] = OrderedDict( + sorted(applicable_non_prod_ledgers.get("non_self_certified").items()) + ) + if len(applicable_prod_ledgers.get("self_certified")) > 0: + successful_ledger_inst = list( + applicable_prod_ledgers.get("self_certified").values() + )[0] + if cache_did and self.cache: + await self.cache.set( + cache_key, successful_ledger_inst[0], self.cache_ttl + ) + return successful_ledger_inst + elif len(applicable_non_prod_ledgers.get("self_certified")) > 0: + successful_ledger_inst = list( + applicable_non_prod_ledgers.get("self_certified").values() + )[0] + if cache_did and self.cache: + await self.cache.set( + cache_key, successful_ledger_inst[0], self.cache_ttl + ) + return successful_ledger_inst + elif len(applicable_prod_ledgers.get("non_self_certified")) > 0: + successful_ledger_inst = list( + applicable_prod_ledgers.get("non_self_certified").values() + )[0] + if cache_did and self.cache: + await self.cache.set( + cache_key, successful_ledger_inst[0], self.cache_ttl + ) + return successful_ledger_inst + elif len(applicable_non_prod_ledgers.get("non_self_certified")) > 0: + successful_ledger_inst = list( + applicable_non_prod_ledgers.get("non_self_certified").values() + )[0] + if cache_did and self.cache: + await self.cache.set( + cache_key, successful_ledger_inst[0], self.cache_ttl + ) + return successful_ledger_inst + else: + raise MultipleLedgerManagerError( + f"DID {did} not found in any of the ledgers total: " + f"(production: {len(self.production_ledgers)}, " + f"non_production: {len(self.non_production_ledgers)})" + ) diff --git a/aries_cloudagent/ledger/multiple_ledger/ledger_config_schema.py b/aries_cloudagent/ledger/multiple_ledger/ledger_config_schema.py new file mode 100644 index 
0000000000..d4edb9d120
--- /dev/null
+++ b/aries_cloudagent/ledger/multiple_ledger/ledger_config_schema.py
@@ -0,0 +1,86 @@
+"""Schema for configuring multiple ledgers."""
+import uuid
+
+from marshmallow import (
+    fields,
+    pre_load,
+    EXCLUDE,
+)
+
+from ...messaging.models.base import BaseModelSchema, BaseModel
+from ...messaging.models.openapi import OpenAPISchema
+
+
+class LedgerConfigInstance(BaseModel):
+    """Describes each LedgerConfigInstance for multiple ledger support."""
+
+    class Meta:
+        """LedgerConfigInstance metadata."""
+
+        schema_class = "LedgerConfigInstanceSchema"
+
+    def __init__(
+        self,
+        *,
+        id: str = None,
+        is_production: bool = True,
+        genesis_transactions: str = None,
+        genesis_file: str = None,
+        genesis_url: str = None,
+    ):
+        """Initialize LedgerConfigInstance."""
+        self.id = id
+        self.is_production = is_production
+        self.genesis_transactions = genesis_transactions
+        self.genesis_file = genesis_file
+        self.genesis_url = genesis_url
+
+
+class LedgerConfigInstanceSchema(BaseModelSchema):
+    """Single LedgerConfigInstance Schema."""
+
+    class Meta:
+        """LedgerConfigInstanceSchema metadata."""
+
+        model_class = LedgerConfigInstance
+        unknown = EXCLUDE
+
+    id = fields.Str(
+        description="ledger_id",
+        required=False,
+    )
+    is_production = fields.Bool(description="is_production", required=False)
+    genesis_transactions = fields.Str(
+        description="genesis_transactions", required=False
+    )
+    genesis_file = fields.Str(description="genesis_file", required=False)
+    genesis_url = fields.Str(description="genesis_url", required=False)
+
+    @pre_load
+    def validate_id(self, data, **kwargs):
+        """Check if id is present; if not, set it to a new UUID4."""
+        if "id" not in data:
+            data["id"] = str(uuid.uuid4())
+        return data
+
+
+class LedgerConfigListSchema(OpenAPISchema):
+    """Schema for Ledger Config List."""
+
+    ledger_config_list = fields.List(
+        fields.Nested(
+            LedgerConfigInstanceSchema(),
+            required=True,
+        ),
+        required=True,
+    )
+
+
+class
WriteLedgerRequestSchema(OpenAPISchema): + """Schema for setting/getting ledger_id for the write ledger.""" + + ledger_id = fields.Str() + + +class MultipleLedgerModuleResultSchema(OpenAPISchema): + """Schema for the multiple ledger modules endpoint.""" diff --git a/aries_cloudagent/ledger/multiple_ledger/ledger_requests_executor.py b/aries_cloudagent/ledger/multiple_ledger/ledger_requests_executor.py new file mode 100644 index 0000000000..701b058a14 --- /dev/null +++ b/aries_cloudagent/ledger/multiple_ledger/ledger_requests_executor.py @@ -0,0 +1,71 @@ +"""Ledger Request Executor.""" +from typing import Tuple, Union, Optional + +from ...config.base import InjectionError +from ...core.profile import Profile +from ...ledger.base import BaseLedger +from ...ledger.indy import IndySdkLedger +from ...ledger.indy_vdr import IndyVdrLedger +from ...ledger.multiple_ledger.base_manager import ( + BaseMultipleLedgerManager, + MultipleLedgerManagerError, +) + +( + GET_SCHEMA, + GET_CRED_DEF, + GET_REVOC_REG_DEF, + GET_REVOC_REG_ENTRY, + GET_KEY_FOR_DID, + GET_ALL_ENDPOINTS_FOR_DID, + GET_ENDPOINT_FOR_DID, + GET_NYM_ROLE, + GET_REVOC_REG_DELTA, +) = tuple(range(9)) + + +class IndyLedgerRequestsExecutor: + """Executes Ledger Requests based on multiple ledger config, if set.""" + + def __init__( + self, + profile: Profile, + ): + """Initialize IndyLedgerRequestsExecutor. 
+ + Args: + profile: The active profile instance + + """ + self.profile = profile + + async def get_ledger_for_identifier( + self, identifier: str, txn_record_type: int + ) -> Union[ + Optional[IndyVdrLedger], + Optional[IndySdkLedger], + Tuple[str, IndyVdrLedger], + Tuple[str, IndySdkLedger], + ]: + """Return ledger info given the record identifier.""" + # For seqNo + if identifier.isdigit(): + return self.profile.inject(BaseLedger) + elif ( + self.profile.settings.get("ledger.ledger_config_list") + and len(self.profile.settings.get("ledger.ledger_config_list")) > 0 + ): + try: + multiledger_mgr = self.profile.inject(BaseMultipleLedgerManager) + extracted_did = multiledger_mgr.extract_did_from_identifier(identifier) + if txn_record_type in tuple(range(4)): + cache_did = True + else: + cache_did = False + return await multiledger_mgr.lookup_did_in_configured_ledgers( + extracted_did, cache_did=cache_did + ) + except (MultipleLedgerManagerError, InjectionError): + return self.profile.inject_or(BaseLedger) + else: + return self.profile.inject_or(BaseLedger) diff --git a/aries_cloudagent/ledger/multiple_ledger/manager_provider.py b/aries_cloudagent/ledger/multiple_ledger/manager_provider.py new file mode 100644 index 0000000000..26a7496804 --- /dev/null +++ b/aries_cloudagent/ledger/multiple_ledger/manager_provider.py @@ -0,0 +1,163 @@ +"""Profile manager for multiple Indy ledger support.""" + +import logging + +from collections import OrderedDict + +from ...askar.profile import AskarProfile +from ...config.provider import BaseProvider +from ...config.settings import BaseSettings +from ...config.injector import BaseInjector, InjectionError +from ...cache.base import BaseCache +from ...indy.sdk.profile import IndySdkProfile +from ...ledger.base import BaseLedger +from ...utils.classloader import ClassLoader, ClassNotFoundError + +from ..indy import IndySdkLedgerPool, IndySdkLedger +from ..indy_vdr import IndyVdrLedgerPool, IndyVdrLedger + +from .base_manager import 
MultipleLedgerManagerError + +LOGGER = logging.getLogger(__name__) + + +class MultiIndyLedgerManagerProvider(BaseProvider): + """Multiple Indy ledger support manager provider.""" + + askar_manager_path = ( + "aries_cloudagent.ledger.multiple_ledger." + "indy_vdr_manager.MultiIndyVDRLedgerManager" + ) + basic_manager_path = ( + "aries_cloudagent.ledger.multiple_ledger." "indy_manager.MultiIndyLedgerManager" + ) + MANAGER_TYPES = { + "basic": basic_manager_path, + "askar-profile": askar_manager_path, + } + + def __init__(self, root_profile): + """Initialize the multiple Indy ledger profile manager provider.""" + self._inst = {} + self.root_profile = root_profile + + def provide(self, settings: BaseSettings, injector: BaseInjector): + """Create the multiple Indy ledger manager instance.""" + + if isinstance(self.root_profile, IndySdkProfile): + manager_type = "basic" + elif isinstance(self.root_profile, AskarProfile): + manager_type = "askar-profile" + else: + raise MultipleLedgerManagerError( + "MultiIndyLedgerManagerProvider expects an IndySdkProfile [indy] " + "or AskarProfile [indy_vdr] as root_profile" + ) + + manager_class = self.MANAGER_TYPES.get(manager_type) + + if manager_class not in self._inst: + LOGGER.info("Create multiple Indy ledger manager: %s", manager_type) + try: + if manager_type == "basic": + indy_sdk_production_ledgers = OrderedDict() + indy_sdk_non_production_ledgers = OrderedDict() + ledger_config_list = settings.get_value("ledger.ledger_config_list") + write_ledger_info = None + for config in ledger_config_list: + keepalive = config.get("keepalive") + read_only = config.get("read_only") + socks_proxy = config.get("socks_proxy") + genesis_transactions = config.get("genesis_transactions") + cache = injector.inject_or(BaseCache) + ledger_id = config.get("id") + pool_name = config.get("pool_name") + ledger_is_production = config.get("is_production") + ledger_is_write = config.get("is_write") + ledger_pool = IndySdkLedgerPool( + pool_name, + 
keepalive=keepalive, + cache=cache, + genesis_transactions=genesis_transactions, + read_only=read_only, + socks_proxy=socks_proxy, + ) + ledger_instance = IndySdkLedger( + pool=ledger_pool, + profile=self.root_profile, + ) + if ledger_is_write: + write_ledger_info = (ledger_id, ledger_instance) + if ledger_is_production: + indy_sdk_production_ledgers[ledger_id] = ledger_instance + else: + indy_sdk_non_production_ledgers[ledger_id] = ledger_instance + if settings.get_value("ledger.genesis_transactions"): + ledger_instance = self.root_profile.inject_or(BaseLedger) + ledger_id = "startup::" + ledger_instance.pool.name + indy_sdk_production_ledgers[ledger_id] = ledger_instance + if not write_ledger_info: + write_ledger_info = (ledger_id, ledger_instance) + indy_sdk_production_ledgers.move_to_end( + ledger_id, last=False + ) + self._inst[manager_class] = ClassLoader.load_class(manager_class)( + self.root_profile, + production_ledgers=indy_sdk_production_ledgers, + non_production_ledgers=indy_sdk_non_production_ledgers, + write_ledger_info=write_ledger_info, + ) + else: + indy_vdr_production_ledgers = OrderedDict() + indy_vdr_non_production_ledgers = OrderedDict() + ledger_config_list = settings.get_value("ledger.ledger_config_list") + write_ledger_info = None + for config in ledger_config_list: + keepalive = config.get("keepalive") + read_only = config.get("read_only") + socks_proxy = config.get("socks_proxy") + genesis_transactions = config.get("genesis_transactions") + cache = injector.inject_or(BaseCache) + ledger_id = config.get("id") + pool_name = config.get("pool_name") + ledger_is_production = config.get("is_production") + ledger_is_write = config.get("is_write") + ledger_pool = IndyVdrLedgerPool( + pool_name, + keepalive=keepalive, + cache=cache, + genesis_transactions=genesis_transactions, + read_only=read_only, + socks_proxy=socks_proxy, + ) + ledger_instance = IndyVdrLedger( + pool=ledger_pool, + profile=self.root_profile, + ) + if ledger_is_write: + 
write_ledger_info = (ledger_id, ledger_instance) + if ledger_is_production: + indy_vdr_production_ledgers[ledger_id] = ledger_instance + else: + indy_vdr_non_production_ledgers[ledger_id] = ledger_instance + if settings.get_value("ledger.genesis_transactions"): + ledger_instance = self.root_profile.inject_or(BaseLedger) + ledger_id = "startup::" + ledger_instance.pool.name + indy_vdr_production_ledgers[ledger_id] = ledger_instance + if not write_ledger_info: + write_ledger_info = (ledger_id, ledger_instance) + indy_vdr_production_ledgers.move_to_end( + ledger_id, last=False + ) + self._inst[manager_class] = ClassLoader.load_class(manager_class)( + self.root_profile, + production_ledgers=indy_vdr_production_ledgers, + non_production_ledgers=indy_vdr_non_production_ledgers, + write_ledger_info=write_ledger_info, + ) + except ClassNotFoundError as err: + raise InjectionError( + f"Unknown multiple Indy ledger manager type: {manager_type}" + ) from err + + return self._inst[manager_class] diff --git a/aries_cloudagent/ledger/multiple_ledger/tests/__init__.py b/aries_cloudagent/ledger/multiple_ledger/tests/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/aries_cloudagent/ledger/multiple_ledger/tests/test_indy_ledger_requests.py b/aries_cloudagent/ledger/multiple_ledger/tests/test_indy_ledger_requests.py new file mode 100644 index 0000000000..f2e8a60f61 --- /dev/null +++ b/aries_cloudagent/ledger/multiple_ledger/tests/test_indy_ledger_requests.py @@ -0,0 +1,94 @@ +from asynctest import TestCase as AsyncTestCase +from asynctest import mock as async_mock + +from ....core.in_memory import InMemoryProfile + +from ...base import BaseLedger + +from ...multiple_ledger.base_manager import ( + BaseMultipleLedgerManager, + MultipleLedgerManagerError, +) +from ...indy import IndySdkLedger, IndySdkLedgerPool + +from ..ledger_requests_executor import IndyLedgerRequestsExecutor + + +class TestIndyLedgerRequestsExecutor(AsyncTestCase): + async def 
setUp(self): + self.profile = InMemoryProfile.test_profile() + self.context = self.profile.context + setattr(self.context, "profile", self.profile) + self.profile.settings["ledger.ledger_config_list"] = [ + { + "id": "test_prod_1", + "pool_name": "test_prod_1", + "is_production": True, + "genesis_transactions": "genesis_transactions", + } + ] + self.ledger = IndySdkLedger( + IndySdkLedgerPool("test_prod_1", checked=True), self.profile + ) + self.profile.context.injector.bind_instance( + BaseMultipleLedgerManager, + async_mock.MagicMock( + extract_did_from_identifier=async_mock.CoroutineMock( + return_value="WgWxqztrNooG92RXvxSTWv" + ), + lookup_did_in_configured_ledgers=async_mock.CoroutineMock( + return_value=("test_prod_1", self.ledger) + ), + ), + ) + self.profile.context.injector.bind_instance(BaseLedger, self.ledger) + self.indy_ledger_requestor = IndyLedgerRequestsExecutor(self.profile) + + async def test_get_ledger_for_identifier(self): + ( + ledger_id, + ledger_inst, + ) = await self.indy_ledger_requestor.get_ledger_for_identifier( + "WgWxqztrNooG92RXvxSTWv:2:schema_name:1.0", 0 + ) + assert ledger_id == "test_prod_1" + assert ledger_inst.pool.name == "test_prod_1" + + async def test_get_ledger_for_identifier_is_digit(self): + ledger = await self.indy_ledger_requestor.get_ledger_for_identifier("123", 0) + assert ledger == self.ledger + + async def test_get_ledger_for_identifier_x(self): + self.profile.context.injector.bind_instance( + BaseMultipleLedgerManager, + async_mock.MagicMock( + extract_did_from_identifier=async_mock.CoroutineMock( + return_value="WgWxqztrNooG92RXvxSTWv" + ), + lookup_did_in_configured_ledgers=async_mock.CoroutineMock( + side_effect=MultipleLedgerManagerError + ), + ), + ) + self.indy_ledger_requestor = IndyLedgerRequestsExecutor(self.profile) + ledger = await self.indy_ledger_requestor.get_ledger_for_identifier( + "WgWxqztrNooG92RXvxSTWv:2:schema_name:1.0", 0 + ) + assert ledger == self.ledger + + async def 
test_get_ledger_for_identifier_mult_ledger_not_set(self): + self.profile.settings["ledger.ledger_config_list"] = None + self.indy_ledger_requestor = IndyLedgerRequestsExecutor(self.profile) + ledger = await self.indy_ledger_requestor.get_ledger_for_identifier( + "WgWxqztrNooG92RXvxSTWv:2:schema_name:1.0", 0 + ) + assert ledger == self.ledger + + async def test_get_ledger_for_identifier_mult_ledger_not_cached(self): + ( + ledger_id, + ledger_inst, + ) = await self.indy_ledger_requestor.get_ledger_for_identifier( + "GUTK6XARozQCWxqzPSUr4g", 4 + ) + assert ledger_id == "test_prod_1" + assert ledger_inst.pool.name == "test_prod_1" diff --git a/aries_cloudagent/ledger/multiple_ledger/tests/test_indy_manager.py b/aries_cloudagent/ledger/multiple_ledger/tests/test_indy_manager.py new file mode 100644 index 0000000000..b59408c028 --- /dev/null +++ b/aries_cloudagent/ledger/multiple_ledger/tests/test_indy_manager.py @@ -0,0 +1,434 @@ +import asyncio +from copy import deepcopy +import json + +from asynctest import TestCase as AsyncTestCase +from asynctest import mock as async_mock + +from collections import OrderedDict + +from ....cache.base import BaseCache +from ....cache.in_memory import InMemoryCache +from ....core.in_memory import InMemoryProfile +from ....messaging.responder import BaseResponder + +from ...error import LedgerError +from ...indy import IndySdkLedger, IndySdkLedgerPool +from ...merkel_validation.tests.test_data import GET_NYM_REPLY + +from .. 
import indy_manager as test_module +from ..base_manager import MultipleLedgerManagerError +from ..indy_manager import MultiIndyLedgerManager + + +class TestMultiIndyLedgerManager(AsyncTestCase): + async def setUp(self): + self.profile = InMemoryProfile.test_profile(bind={BaseCache: InMemoryCache()}) + self.context = self.profile.context + setattr(self.context, "profile", self.profile) + self.responder = async_mock.CoroutineMock(send=async_mock.CoroutineMock()) + self.context.injector.bind_instance(BaseResponder, self.responder) + self.production_ledger = OrderedDict() + self.non_production_ledger = OrderedDict() + test_prod_ledger = IndySdkLedger( + IndySdkLedgerPool("test_prod_1", checked=True), self.profile + ) + test_write_ledger = ("test_prod_1", test_prod_ledger) + self.production_ledger["test_prod_1"] = test_prod_ledger + self.production_ledger["test_prod_2"] = IndySdkLedger( + IndySdkLedgerPool("test_prod_2", checked=True), self.profile + ) + self.non_production_ledger["test_non_prod_1"] = IndySdkLedger( + IndySdkLedgerPool("test_non_prod_1", checked=True), self.profile + ) + self.non_production_ledger["test_non_prod_2"] = IndySdkLedger( + IndySdkLedgerPool("test_non_prod_2", checked=True), self.profile + ) + self.manager = MultiIndyLedgerManager( + self.profile, + production_ledgers=self.production_ledger, + non_production_ledgers=self.non_production_ledger, + write_ledger_info=test_write_ledger, + ) + + async def test_get_write_ledger(self): + ledger_id, ledger_inst = await self.manager.get_write_ledger() + assert ledger_id == "test_prod_1" + assert ledger_inst.pool.name == "test_prod_1" + + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") + @async_mock.patch("indy.ledger.build_get_nym_request") + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedger._submit") + async def test_get_ledger_by_did_self_cert_a( + self, mock_submit, 
mock_build_get_nym_req, mock_close, mock_open + ): + with async_mock.patch.object( + test_module.asyncio, "wait", async_mock.CoroutineMock() + ) as mock_wait: + mock_build_get_nym_req.return_value = async_mock.MagicMock() + mock_submit.return_value = json.dumps(GET_NYM_REPLY) + mock_wait.return_value = mock_submit.return_value + ( + ledger_id, + ledger_inst, + is_self_certified, + ) = await self.manager._get_ledger_by_did( + "test_prod_1", "Av63wJYM7xYR4AiygYq4c3" + ) + assert ledger_id == "test_prod_1" + assert ledger_inst.pool.name == "test_prod_1" + assert is_self_certified + + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") + @async_mock.patch("indy.ledger.build_get_nym_request") + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedger._submit") + async def test_get_ledger_by_did_self_cert_b( + self, mock_submit, mock_build_get_nym_req, mock_close, mock_open + ): + self.non_production_ledger = OrderedDict() + self.non_production_ledger["test_non_prod_1"] = IndySdkLedger( + IndySdkLedgerPool("test_non_prod_1", checked=True), self.profile + ) + self.non_production_ledger["test_non_prod_2"] = IndySdkLedger( + IndySdkLedgerPool("test_non_prod_2", checked=True), self.profile + ) + self.manager = MultiIndyLedgerManager( + self.profile, + non_production_ledgers=self.non_production_ledger, + ) + with async_mock.patch.object( + test_module.asyncio, "wait", async_mock.CoroutineMock() + ) as mock_wait: + mock_build_get_nym_req.return_value = async_mock.MagicMock() + mock_submit.return_value = json.dumps(GET_NYM_REPLY) + mock_wait.return_value = mock_submit.return_value + ( + ledger_id, + ledger_inst, + is_self_certified, + ) = await self.manager._get_ledger_by_did( + "test_non_prod_1", "Av63wJYM7xYR4AiygYq4c3" + ) + assert ledger_id == "test_non_prod_1" + assert ledger_inst.pool.name == "test_non_prod_1" + assert is_self_certified + + 
@async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") + @async_mock.patch("indy.ledger.build_get_nym_request") + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedger._submit") + async def test_get_ledger_by_did_not_self_cert( + self, mock_submit, mock_build_get_nym_req, mock_close, mock_open + ): + get_nym_reply = deepcopy(GET_NYM_REPLY) + get_nym_reply["result"]["data"] = json.dumps( + { + "dest": "Av63wJYM7xYR4AiygYq4c3", + "identifier": "V4SGRU86Z58d6TV7PBUe6f", + "role": "101", + "seqNo": 17794, + "txnTime": 1632262244, + "verkey": "ABUF7uxYTxZ6qYdZ4G9e1Gi", + } + ) + with async_mock.patch.object( + test_module.asyncio, "wait", async_mock.CoroutineMock() + ) as mock_wait, async_mock.patch.object( + test_module.SubTrie, "verify_spv_proof", async_mock.CoroutineMock() + ) as mock_verify_spv_proof: + mock_build_get_nym_req.return_value = async_mock.MagicMock() + mock_submit.return_value = json.dumps(get_nym_reply) + mock_wait.return_value = mock_submit.return_value + mock_verify_spv_proof.return_value = True + ( + ledger_id, + ledger_inst, + is_self_certified, + ) = await self.manager._get_ledger_by_did( + "test_prod_1", "Av63wJYM7xYR4AiygYq4c3" + ) + assert ledger_id == "test_prod_1" + assert ledger_inst.pool.name == "test_prod_1" + assert not is_self_certified + + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") + @async_mock.patch("indy.ledger.build_get_nym_request") + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedger._submit") + async def test_get_ledger_by_did_state_proof_not_valid( + self, mock_submit, mock_build_get_nym_req, mock_close, mock_open + ): + get_nym_reply = deepcopy(GET_NYM_REPLY) + get_nym_reply["result"]["data"]["verkey"] = "ABUF7uxYTxZ6qYdZ4G9e1Gi" + with async_mock.patch.object( + 
test_module.asyncio, "wait", async_mock.CoroutineMock() + ) as mock_wait: + mock_build_get_nym_req.return_value = async_mock.MagicMock() + mock_submit.return_value = json.dumps(get_nym_reply) + mock_wait.return_value = mock_submit.return_value + assert not await self.manager._get_ledger_by_did( + "test_prod_1", "Av63wJYM7xYR4AiygYq4c3" + ) + + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") + @async_mock.patch("indy.ledger.build_get_nym_request") + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedger._submit") + async def test_get_ledger_by_did_no_data( + self, mock_submit, mock_build_get_nym_req, mock_close, mock_open + ): + get_nym_reply = deepcopy(GET_NYM_REPLY) + get_nym_reply.get("result").pop("data") + with async_mock.patch.object( + test_module.asyncio, "wait", async_mock.CoroutineMock() + ) as mock_wait: + mock_build_get_nym_req.return_value = async_mock.MagicMock() + mock_submit.return_value = json.dumps(get_nym_reply) + mock_wait.return_value = mock_submit.return_value + assert not await self.manager._get_ledger_by_did( + "test_prod_1", "Av63wJYM7xYR4AiygYq4c3" + ) + + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") + @async_mock.patch("indy.ledger.build_get_nym_request") + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedger._submit") + async def test_get_ledger_by_did_timeout( + self, mock_submit, mock_build_get_nym_req, mock_close, mock_open + ): + mock_build_get_nym_req.return_value = async_mock.MagicMock() + mock_submit.side_effect = asyncio.TimeoutError + assert not await self.manager._get_ledger_by_did( + "test_prod_1", "Av63wJYM7xYR4AiygYq4c3" + ) + + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") + 
@async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") + @async_mock.patch("indy.ledger.build_get_nym_request") + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedger._submit") + async def test_get_ledger_by_did_ledger_error( + self, mock_submit, mock_build_get_nym_req, mock_close, mock_open + ): + mock_build_get_nym_req.return_value = async_mock.MagicMock() + mock_submit.side_effect = LedgerError + assert not await self.manager._get_ledger_by_did( + "test_prod_1", "Av63wJYM7xYR4AiygYq4c3" + ) + + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") + @async_mock.patch("indy.ledger.build_get_nym_request") + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedger._submit") + async def test_lookup_did_in_configured_ledgers_self_cert_prod( + self, mock_submit, mock_build_get_nym_req, mock_close, mock_open + ): + with async_mock.patch.object( + test_module.asyncio, "wait", async_mock.CoroutineMock() + ) as mock_wait: + mock_build_get_nym_req.return_value = async_mock.MagicMock() + mock_submit.return_value = json.dumps(GET_NYM_REPLY) + mock_wait.return_value = mock_submit.return_value + ( + ledger_id, + ledger_inst, + ) = await self.manager.lookup_did_in_configured_ledgers( + "Av63wJYM7xYR4AiygYq4c3", cache_did=True + ) + assert ledger_id == "test_prod_1" + assert ledger_inst.pool.name == "test_prod_1" + + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") + @async_mock.patch("indy.ledger.build_get_nym_request") + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedger._submit") + async def test_get_ledger_by_did_not_self_cert_not_self_cert_prod( + self, mock_submit, mock_build_get_nym_req, mock_close, mock_open + ): + get_nym_reply = deepcopy(GET_NYM_REPLY) + get_nym_reply["result"]["data"]["verkey"] 
= "ABUF7uxYTxZ6qYdZ4G9e1Gi" + with async_mock.patch.object( + test_module.asyncio, "wait", async_mock.CoroutineMock() + ) as mock_wait, async_mock.patch.object( + test_module.SubTrie, "verify_spv_proof", async_mock.CoroutineMock() + ) as mock_verify_spv_proof: + mock_build_get_nym_req.return_value = async_mock.MagicMock() + mock_submit.return_value = json.dumps(get_nym_reply) + mock_wait.return_value = mock_submit.return_value + mock_verify_spv_proof.return_value = True + ( + ledger_id, + ledger_inst, + ) = await self.manager.lookup_did_in_configured_ledgers( + "Av63wJYM7xYR4AiygYq4c3", cache_did=True + ) + assert ledger_id == "test_prod_1" + assert ledger_inst.pool.name == "test_prod_1" + + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") + @async_mock.patch("indy.ledger.build_get_nym_request") + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedger._submit") + async def test_lookup_did_in_configured_ledgers_self_cert_non_prod( + self, mock_submit, mock_build_get_nym_req, mock_close, mock_open + ): + self.non_production_ledger = OrderedDict() + self.non_production_ledger["test_non_prod_1"] = IndySdkLedger( + IndySdkLedgerPool("test_non_prod_1", checked=True), self.profile + ) + self.non_production_ledger["test_non_prod_2"] = IndySdkLedger( + IndySdkLedgerPool("test_non_prod_2", checked=True), self.profile + ) + self.manager = MultiIndyLedgerManager( + self.profile, + non_production_ledgers=self.non_production_ledger, + ) + with async_mock.patch.object( + test_module.asyncio, "wait", async_mock.CoroutineMock() + ) as mock_wait: + mock_build_get_nym_req.return_value = async_mock.MagicMock() + mock_submit.return_value = json.dumps(GET_NYM_REPLY) + mock_wait.return_value = mock_submit.return_value + ( + ledger_id, + ledger_inst, + ) = await self.manager.lookup_did_in_configured_ledgers( + "Av63wJYM7xYR4AiygYq4c3", cache_did=True + ) + assert 
ledger_id == "test_non_prod_1" + assert ledger_inst.pool.name == "test_non_prod_1" + + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") + @async_mock.patch("indy.ledger.build_get_nym_request") + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedger._submit") + async def test_get_ledger_by_did_not_self_cert_not_self_cert_non_prod( + self, mock_submit, mock_build_get_nym_req, mock_close, mock_open + ): + self.non_production_ledger = OrderedDict() + self.non_production_ledger["test_non_prod_1"] = IndySdkLedger( + IndySdkLedgerPool("test_non_prod_1", checked=True), self.profile + ) + self.non_production_ledger["test_non_prod_2"] = IndySdkLedger( + IndySdkLedgerPool("test_non_prod_2", checked=True), self.profile + ) + self.manager = MultiIndyLedgerManager( + self.profile, + non_production_ledgers=self.non_production_ledger, + ) + get_nym_reply = deepcopy(GET_NYM_REPLY) + get_nym_reply["result"]["data"]["verkey"] = "ABUF7uxYTxZ6qYdZ4G9e1Gi" + with async_mock.patch.object( + test_module.asyncio, "wait", async_mock.CoroutineMock() + ) as mock_wait, async_mock.patch.object( + test_module.SubTrie, "verify_spv_proof", async_mock.CoroutineMock() + ) as mock_verify_spv_proof: + mock_build_get_nym_req.return_value = async_mock.MagicMock() + mock_submit.return_value = json.dumps(get_nym_reply) + mock_wait.return_value = mock_submit.return_value + mock_verify_spv_proof.return_value = True + ( + ledger_id, + ledger_inst, + ) = await self.manager.lookup_did_in_configured_ledgers( + "Av63wJYM7xYR4AiygYq4c3", cache_did=True + ) + assert ledger_id == "test_non_prod_1" + assert ledger_inst.pool.name == "test_non_prod_1" + + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") + @async_mock.patch("indy.ledger.build_get_nym_request") + 
@async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedger._submit") + async def test_lookup_did_in_configured_ledgers_x( + self, mock_submit, mock_build_get_nym_req, mock_close, mock_open + ): + with async_mock.patch.object( + test_module.asyncio, "wait", async_mock.CoroutineMock() + ) as mock_wait, async_mock.patch.object( + test_module.SubTrie, "verify_spv_proof", async_mock.CoroutineMock() + ) as mock_verify_spv_proof: + mock_build_get_nym_req.return_value = async_mock.MagicMock() + mock_submit.return_value = json.dumps(GET_NYM_REPLY) + mock_wait.return_value = mock_submit.return_value + mock_verify_spv_proof.return_value = False + with self.assertRaises(MultipleLedgerManagerError) as cm: + await self.manager.lookup_did_in_configured_ledgers( + "Av63wJYM7xYR4AiygYq4c3", cache_did=True + ) + assert "not found in any of the ledgers total: (production: " in str(cm.exception) + + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") + @async_mock.patch("indy.ledger.build_get_nym_request") + @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedger._submit") + async def test_lookup_did_in_configured_ledgers_prod_not_cached( + self, mock_submit, mock_build_get_nym_req, mock_close, mock_open + ): + with async_mock.patch.object( + test_module.asyncio, "wait", async_mock.CoroutineMock() + ) as mock_wait: + mock_build_get_nym_req.return_value = async_mock.MagicMock() + mock_submit.return_value = json.dumps(GET_NYM_REPLY) + mock_wait.return_value = mock_submit.return_value + ( + ledger_id, + ledger_inst, + ) = await self.manager.lookup_did_in_configured_ledgers( + "Av63wJYM7xYR4AiygYq4c3", cache_did=False + ) + assert ledger_id == "test_prod_1" + assert ledger_inst.pool.name == "test_prod_1" + + async def test_lookup_did_in_configured_ledgers_cached_prod_ledger(self): + cache = InMemoryCache() + await cache.set("did_ledger_id_resolver::Av63wJYM7xYR4AiygYq4c3", 
"test_prod_1") + self.profile.context.injector.bind_instance(BaseCache, cache) + (ledger_id, ledger_inst,) = await self.manager.lookup_did_in_configured_ledgers( + "Av63wJYM7xYR4AiygYq4c3", cache_did=True + ) + assert ledger_id == "test_prod_1" + assert ledger_inst.pool.name == "test_prod_1" + + async def test_lookup_did_in_configured_ledgers_cached_non_prod_ledger(self): + cache = InMemoryCache() + await cache.set( + "did_ledger_id_resolver::Av63wJYM7xYR4AiygYq4c3", "test_non_prod_2", None + ) + self.profile.context.injector.bind_instance(BaseCache, cache) + (ledger_id, ledger_inst,) = await self.manager.lookup_did_in_configured_ledgers( + "Av63wJYM7xYR4AiygYq4c3", cache_did=True + ) + assert ledger_id == "test_non_prod_2" + assert ledger_inst.pool.name == "test_non_prod_2" + + async def test_lookup_did_in_configured_ledgers_cached_x(self): + cache = InMemoryCache() + await cache.set("did_ledger_id_resolver::Av63wJYM7xYR4AiygYq4c3", "invalid_id") + self.profile.context.injector.bind_instance(BaseCache, cache) + with self.assertRaises(MultipleLedgerManagerError) as cm: + await self.manager.lookup_did_in_configured_ledgers( + "Av63wJYM7xYR4AiygYq4c3", cache_did=True + ) + assert "cached ledger_id invalid_id not found in either" in str(cm.exception) + + def test_extract_did_from_identifier(self): + assert ( + self.manager.extract_did_from_identifier( + "WgWxqztrNooG92RXvxSTWv:2:schema_name:1.0" + ) + == "WgWxqztrNooG92RXvxSTWv" + ) + assert ( + self.manager.extract_did_from_identifier( + "WgWxqztrNooG92RXvxSTWv:3:CL:20:tag" + ) + == "WgWxqztrNooG92RXvxSTWv" + ) + + async def test_get_production_ledgers(self): + assert len(await self.manager.get_prod_ledgers()) == 2 + + async def test_get_non_production_ledgers(self): + assert len(await self.manager.get_nonprod_ledgers()) == 2 diff --git a/aries_cloudagent/ledger/multiple_ledger/tests/test_indy_vdr_manager.py b/aries_cloudagent/ledger/multiple_ledger/tests/test_indy_vdr_manager.py new file mode 100644 index 0000000000..19f440ca38 
--- /dev/null +++ b/aries_cloudagent/ledger/multiple_ledger/tests/test_indy_vdr_manager.py @@ -0,0 +1,444 @@ +import asyncio +import json + +from asynctest import TestCase as AsyncTestCase +from asynctest import mock as async_mock +from copy import deepcopy + +from collections import OrderedDict + +from ....cache.base import BaseCache +from ....cache.in_memory import InMemoryCache +from ....core.in_memory import InMemoryProfile +from ....messaging.responder import BaseResponder + +from ...error import LedgerError +from ...indy_vdr import IndyVdrLedger, IndyVdrLedgerPool +from ...merkel_validation.tests.test_data import GET_NYM_REPLY + +from .. import indy_vdr_manager as test_module +from ..base_manager import MultipleLedgerManagerError +from ..indy_vdr_manager import MultiIndyVDRLedgerManager + + +class TestMultiIndyVDRLedgerManager(AsyncTestCase): + async def setUp(self): + self.profile = InMemoryProfile.test_profile(bind={BaseCache: InMemoryCache()}) + self.context = self.profile.context + setattr(self.context, "profile", self.profile) + self.responder = async_mock.CoroutineMock(send=async_mock.CoroutineMock()) + self.context.injector.bind_instance(BaseResponder, self.responder) + self.production_ledger = OrderedDict() + self.non_production_ledger = OrderedDict() + test_prod_ledger = IndyVdrLedger(IndyVdrLedgerPool("test_prod_1"), self.profile) + test_write_ledger = ("test_prod_1", test_prod_ledger) + self.production_ledger["test_prod_1"] = test_prod_ledger + self.production_ledger["test_prod_2"] = IndyVdrLedger( + IndyVdrLedgerPool("test_prod_2"), self.profile + ) + self.non_production_ledger["test_non_prod_1"] = IndyVdrLedger( + IndyVdrLedgerPool("test_non_prod_1"), self.profile + ) + self.non_production_ledger["test_non_prod_2"] = IndyVdrLedger( + IndyVdrLedgerPool("test_non_prod_2"), self.profile + ) + self.manager = MultiIndyVDRLedgerManager( + self.profile, + production_ledgers=self.production_ledger, + non_production_ledgers=self.non_production_ledger, + 
write_ledger_info=test_write_ledger, + ) + + async def test_get_write_ledger(self): + ledger_id, ledger_inst = await self.manager.get_write_ledger() + assert ledger_id == "test_prod_1" + assert ledger_inst.pool.name == "test_prod_1" + + @async_mock.patch("aries_cloudagent.ledger.indy_vdr.IndyVdrLedgerPool.context_open") + @async_mock.patch( + "aries_cloudagent.ledger.indy_vdr.IndyVdrLedgerPool.context_close" + ) + @async_mock.patch("indy_vdr.ledger.build_get_nym_request") + @async_mock.patch("aries_cloudagent.ledger.indy_vdr.IndyVdrLedger._submit") + async def test_get_ledger_by_did_self_cert_a( + self, mock_submit, mock_build_get_nym_req, mock_close, mock_open + ): + with async_mock.patch.object( + test_module.asyncio, "wait", async_mock.CoroutineMock() + ) as mock_wait: + mock_build_get_nym_req.return_value = async_mock.MagicMock() + mock_submit.return_value = json.dumps(GET_NYM_REPLY) + mock_wait.return_value = mock_submit.return_value + ( + ledger_id, + ledger_inst, + is_self_certified, + ) = await self.manager._get_ledger_by_did( + "test_prod_1", "Av63wJYM7xYR4AiygYq4c3" + ) + assert ledger_id == "test_prod_1" + assert ledger_inst.pool.name == "test_prod_1" + assert is_self_certified + + @async_mock.patch("aries_cloudagent.ledger.indy_vdr.IndyVdrLedgerPool.context_open") + @async_mock.patch( + "aries_cloudagent.ledger.indy_vdr.IndyVdrLedgerPool.context_close" + ) + @async_mock.patch("indy_vdr.ledger.build_get_nym_request") + @async_mock.patch("aries_cloudagent.ledger.indy_vdr.IndyVdrLedger._submit") + async def test_get_ledger_by_did_self_cert_b( + self, mock_submit, mock_build_get_nym_req, mock_close, mock_open + ): + self.non_production_ledger = OrderedDict() + self.non_production_ledger["test_non_prod_1"] = IndyVdrLedger( + IndyVdrLedgerPool("test_non_prod_1"), self.profile + ) + self.non_production_ledger["test_non_prod_2"] = IndyVdrLedger( + IndyVdrLedgerPool("test_non_prod_2"), self.profile + ) + self.manager = MultiIndyVDRLedgerManager( + self.profile, 
+ non_production_ledgers=self.non_production_ledger, + ) + with async_mock.patch.object( + test_module.asyncio, "wait", async_mock.CoroutineMock() + ) as mock_wait: + mock_build_get_nym_req.return_value = async_mock.MagicMock() + mock_submit.return_value = json.dumps(GET_NYM_REPLY) + mock_wait.return_value = mock_submit.return_value + ( + ledger_id, + ledger_inst, + is_self_certified, + ) = await self.manager._get_ledger_by_did( + "test_non_prod_1", "Av63wJYM7xYR4AiygYq4c3" + ) + assert ledger_id == "test_non_prod_1" + assert ledger_inst.pool.name == "test_non_prod_1" + assert is_self_certified + + @async_mock.patch("aries_cloudagent.ledger.indy_vdr.IndyVdrLedgerPool.context_open") + @async_mock.patch( + "aries_cloudagent.ledger.indy_vdr.IndyVdrLedgerPool.context_close" + ) + @async_mock.patch("indy_vdr.ledger.build_get_nym_request") + @async_mock.patch("aries_cloudagent.ledger.indy_vdr.IndyVdrLedger._submit") + async def test_get_ledger_by_did_not_self_cert( + self, mock_submit, mock_build_get_nym_req, mock_close, mock_open + ): + get_nym_reply = deepcopy(GET_NYM_REPLY) + get_nym_reply["result"]["data"] = json.dumps( + { + "dest": "Av63wJYM7xYR4AiygYq4c3", + "identifier": "V4SGRU86Z58d6TV7PBUe6f", + "role": "101", + "seqNo": 17794, + "txnTime": 1632262244, + "verkey": "ABUF7uxYTxZ6qYdZ4G9e1Gi", + } + ) + with async_mock.patch.object( + test_module.asyncio, "wait", async_mock.CoroutineMock() + ) as mock_wait, async_mock.patch.object( + test_module.SubTrie, "verify_spv_proof", async_mock.CoroutineMock() + ) as mock_verify_spv_proof: + mock_build_get_nym_req.return_value = async_mock.MagicMock() + mock_submit.return_value = json.dumps(get_nym_reply) + mock_wait.return_value = mock_submit.return_value + mock_verify_spv_proof.return_value = True + ( + ledger_id, + ledger_inst, + is_self_certified, + ) = await self.manager._get_ledger_by_did( + "test_prod_1", "Av63wJYM7xYR4AiygYq4c3" + ) + assert ledger_id == "test_prod_1" + assert ledger_inst.pool.name == "test_prod_1" 
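The `self_cert` / `not_self_cert` cases in these tests hinge on the Indy convention that a DID is self-certified when it equals the base58 encoding of the first 16 bytes of its 32-byte verkey (an abbreviated verkey, prefixed `~`, implies the same). A dependency-free sketch of that check — illustrative only, not the helper these tests exercise:

```python
# Base58 (Bitcoin alphabet) round-trip plus the Indy self-certification rule.
# Illustrative sketch, not ACA-Py's implementation.
ALPHABET = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"


def b58encode(data: bytes) -> str:
    n = int.from_bytes(data, "big")
    out = ""
    while n:
        n, rem = divmod(n, 58)
        out = ALPHABET[rem] + out
    pad = len(data) - len(data.lstrip(b"\x00"))  # leading zero bytes -> '1's
    return "1" * pad + out


def b58decode(text: str) -> bytes:
    n = 0
    for ch in text:
        n = n * 58 + ALPHABET.index(ch)
    raw = n.to_bytes((n.bit_length() + 7) // 8, "big")
    pad = len(text) - len(text.lstrip("1"))
    return b"\x00" * pad + raw


def is_self_certified(did: str, verkey: str) -> bool:
    """DID is self-certified if it encodes the first 16 bytes of the verkey."""
    if verkey.startswith("~"):  # abbreviated verkey completes the DID itself
        return True
    return b58encode(b58decode(verkey)[:16]) == did
```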
+ assert not is_self_certified + + @async_mock.patch("aries_cloudagent.ledger.indy_vdr.IndyVdrLedgerPool.context_open") + @async_mock.patch( + "aries_cloudagent.ledger.indy_vdr.IndyVdrLedgerPool.context_close" + ) + @async_mock.patch("indy_vdr.ledger.build_get_nym_request") + @async_mock.patch("aries_cloudagent.ledger.indy_vdr.IndyVdrLedger._submit") + async def test_get_ledger_by_did_state_proof_not_valid( + self, mock_submit, mock_build_get_nym_req, mock_close, mock_open + ): + get_nym_reply = deepcopy(GET_NYM_REPLY) + get_nym_reply["result"]["data"]["verkey"] = "ABUF7uxYTxZ6qYdZ4G9e1Gi" + with async_mock.patch.object( + test_module.asyncio, "wait", async_mock.CoroutineMock() + ) as mock_wait: + mock_build_get_nym_req.return_value = async_mock.MagicMock() + mock_submit.return_value = json.dumps(get_nym_reply) + mock_wait.return_value = mock_submit.return_value + assert not await self.manager._get_ledger_by_did( + "test_prod_1", "Av63wJYM7xYR4AiygYq4c3" + ) + + @async_mock.patch("aries_cloudagent.ledger.indy_vdr.IndyVdrLedgerPool.context_open") + @async_mock.patch( + "aries_cloudagent.ledger.indy_vdr.IndyVdrLedgerPool.context_close" + ) + @async_mock.patch("indy_vdr.ledger.build_get_nym_request") + @async_mock.patch("aries_cloudagent.ledger.indy_vdr.IndyVdrLedger._submit") + async def test_get_ledger_by_did_no_data( + self, mock_submit, mock_build_get_nym_req, mock_close, mock_open + ): + get_nym_reply = deepcopy(GET_NYM_REPLY) + get_nym_reply.get("result").pop("data") + with async_mock.patch.object( + test_module.asyncio, "wait", async_mock.CoroutineMock() + ) as mock_wait: + mock_build_get_nym_req.return_value = async_mock.MagicMock() + mock_submit.return_value = json.dumps(get_nym_reply) + mock_wait.return_value = mock_submit.return_value + assert not await self.manager._get_ledger_by_did( + "test_prod_1", "Av63wJYM7xYR4AiygYq4c3" + ) + + @async_mock.patch("aries_cloudagent.ledger.indy_vdr.IndyVdrLedgerPool.context_open") + @async_mock.patch( + 
"aries_cloudagent.ledger.indy_vdr.IndyVdrLedgerPool.context_close" + ) + @async_mock.patch("indy_vdr.ledger.build_get_nym_request") + @async_mock.patch("aries_cloudagent.ledger.indy_vdr.IndyVdrLedger._submit") + async def test_get_ledger_by_did_timeout( + self, mock_submit, mock_build_get_nym_req, mock_close, mock_open + ): + mock_build_get_nym_req.return_value = async_mock.MagicMock() + mock_submit.side_effect = asyncio.TimeoutError + assert not await self.manager._get_ledger_by_did( + "test_prod_1", "Av63wJYM7xYR4AiygYq4c3" + ) + + @async_mock.patch("aries_cloudagent.ledger.indy_vdr.IndyVdrLedgerPool.context_open") + @async_mock.patch( + "aries_cloudagent.ledger.indy_vdr.IndyVdrLedgerPool.context_close" + ) + @async_mock.patch("indy_vdr.ledger.build_get_nym_request") + @async_mock.patch("aries_cloudagent.ledger.indy_vdr.IndyVdrLedger._submit") + async def test_get_ledger_by_did_ledger_error( + self, mock_submit, mock_build_get_nym_req, mock_close, mock_open + ): + mock_build_get_nym_req.return_value = async_mock.MagicMock() + mock_submit.side_effect = LedgerError + assert not await self.manager._get_ledger_by_did( + "test_prod_1", "Av63wJYM7xYR4AiygYq4c3" + ) + + @async_mock.patch("aries_cloudagent.ledger.indy_vdr.IndyVdrLedgerPool.context_open") + @async_mock.patch( + "aries_cloudagent.ledger.indy_vdr.IndyVdrLedgerPool.context_close" + ) + @async_mock.patch("indy_vdr.ledger.build_get_nym_request") + @async_mock.patch("aries_cloudagent.ledger.indy_vdr.IndyVdrLedger._submit") + async def test_lookup_did_in_configured_ledgers_self_cert_prod( + self, mock_submit, mock_build_get_nym_req, mock_close, mock_open + ): + with async_mock.patch.object( + test_module.asyncio, "wait", async_mock.CoroutineMock() + ) as mock_wait: + mock_build_get_nym_req.return_value = async_mock.MagicMock() + mock_submit.return_value = json.dumps(GET_NYM_REPLY) + mock_wait.return_value = mock_submit.return_value + ( + ledger_id, + ledger_inst, + ) = await 
self.manager.lookup_did_in_configured_ledgers( + "Av63wJYM7xYR4AiygYq4c3", cache_did=True + ) + assert ledger_id == "test_prod_1" + assert ledger_inst.pool.name == "test_prod_1" + + @async_mock.patch("aries_cloudagent.ledger.indy_vdr.IndyVdrLedgerPool.context_open") + @async_mock.patch( + "aries_cloudagent.ledger.indy_vdr.IndyVdrLedgerPool.context_close" + ) + @async_mock.patch("indy_vdr.ledger.build_get_nym_request") + @async_mock.patch("aries_cloudagent.ledger.indy_vdr.IndyVdrLedger._submit") + async def test_get_ledger_by_did_not_self_cert_not_self_cert_prod( + self, mock_submit, mock_build_get_nym_req, mock_close, mock_open + ): + get_nym_reply = deepcopy(GET_NYM_REPLY) + get_nym_reply["result"]["data"]["verkey"] = "ABUF7uxYTxZ6qYdZ4G9e1Gi" + with async_mock.patch.object( + test_module.asyncio, "wait", async_mock.CoroutineMock() + ) as mock_wait, async_mock.patch.object( + test_module.SubTrie, "verify_spv_proof", async_mock.CoroutineMock() + ) as mock_verify_spv_proof: + mock_build_get_nym_req.return_value = async_mock.MagicMock() + mock_submit.return_value = json.dumps(get_nym_reply) + mock_wait.return_value = mock_submit.return_value + mock_verify_spv_proof.return_value = True + ( + ledger_id, + ledger_inst, + ) = await self.manager.lookup_did_in_configured_ledgers( + "Av63wJYM7xYR4AiygYq4c3", cache_did=True + ) + assert ledger_id == "test_prod_1" + assert ledger_inst.pool.name == "test_prod_1" + + @async_mock.patch("aries_cloudagent.ledger.indy_vdr.IndyVdrLedgerPool.context_open") + @async_mock.patch( + "aries_cloudagent.ledger.indy_vdr.IndyVdrLedgerPool.context_close" + ) + @async_mock.patch("indy_vdr.ledger.build_get_nym_request") + @async_mock.patch("aries_cloudagent.ledger.indy_vdr.IndyVdrLedger._submit") + async def test_lookup_did_in_configured_ledgers_self_cert_non_prod( + self, mock_submit, mock_build_get_nym_req, mock_close, mock_open + ): + self.non_production_ledger = OrderedDict() + self.non_production_ledger["test_non_prod_1"] = IndyVdrLedger( + 
IndyVdrLedgerPool("test_non_prod_1"), self.profile + ) + self.non_production_ledger["test_non_prod_2"] = IndyVdrLedger( + IndyVdrLedgerPool("test_non_prod_2"), self.profile + ) + self.manager = MultiIndyVDRLedgerManager( + self.profile, + non_production_ledgers=self.non_production_ledger, + ) + with async_mock.patch.object( + test_module.asyncio, "wait", async_mock.CoroutineMock() + ) as mock_wait: + mock_build_get_nym_req.return_value = async_mock.MagicMock() + mock_submit.return_value = json.dumps(GET_NYM_REPLY) + mock_wait.return_value = mock_submit.return_value + ( + ledger_id, + ledger_inst, + ) = await self.manager.lookup_did_in_configured_ledgers( + "Av63wJYM7xYR4AiygYq4c3", cache_did=True + ) + assert ledger_id == "test_non_prod_1" + assert ledger_inst.pool.name == "test_non_prod_1" + + @async_mock.patch("aries_cloudagent.ledger.indy_vdr.IndyVdrLedgerPool.context_open") + @async_mock.patch( + "aries_cloudagent.ledger.indy_vdr.IndyVdrLedgerPool.context_close" + ) + @async_mock.patch("indy_vdr.ledger.build_get_nym_request") + @async_mock.patch("aries_cloudagent.ledger.indy_vdr.IndyVdrLedger._submit") + async def test_get_ledger_by_did_not_self_cert_not_self_cert_non_prod( + self, mock_submit, mock_build_get_nym_req, mock_close, mock_open + ): + self.non_production_ledger = OrderedDict() + self.non_production_ledger["test_non_prod_1"] = IndyVdrLedger( + IndyVdrLedgerPool("test_non_prod_1"), self.profile + ) + self.non_production_ledger["test_non_prod_2"] = IndyVdrLedger( + IndyVdrLedgerPool("test_non_prod_2"), self.profile + ) + self.manager = MultiIndyVDRLedgerManager( + self.profile, + non_production_ledgers=self.non_production_ledger, + ) + get_nym_reply = deepcopy(GET_NYM_REPLY) + get_nym_reply["result"]["data"]["verkey"] = "ABUF7uxYTxZ6qYdZ4G9e1Gi" + with async_mock.patch.object( + test_module.asyncio, "wait", async_mock.CoroutineMock() + ) as mock_wait, async_mock.patch.object( + test_module.SubTrie, "verify_spv_proof", async_mock.CoroutineMock() + ) as 
mock_verify_spv_proof: + mock_build_get_nym_req.return_value = async_mock.MagicMock() + mock_submit.return_value = json.dumps(get_nym_reply) + mock_wait.return_value = mock_submit.return_value + mock_verify_spv_proof.return_value = True + ( + ledger_id, + ledger_inst, + ) = await self.manager.lookup_did_in_configured_ledgers( + "Av63wJYM7xYR4AiygYq4c3", cache_did=True + ) + assert ledger_id == "test_non_prod_1" + assert ledger_inst.pool.name == "test_non_prod_1" + + @async_mock.patch("aries_cloudagent.ledger.indy_vdr.IndyVdrLedgerPool.context_open") + @async_mock.patch( + "aries_cloudagent.ledger.indy_vdr.IndyVdrLedgerPool.context_close" + ) + @async_mock.patch("indy_vdr.ledger.build_get_nym_request") + @async_mock.patch("aries_cloudagent.ledger.indy_vdr.IndyVdrLedger._submit") + async def test_lookup_did_in_configured_ledgers_x( + self, mock_submit, mock_build_get_nym_req, mock_close, mock_open + ): + with async_mock.patch.object( + test_module.asyncio, "wait", async_mock.CoroutineMock() + ) as mock_wait, async_mock.patch.object( + test_module.SubTrie, "verify_spv_proof", async_mock.CoroutineMock() + ) as mock_verify_spv_proof: + mock_build_get_nym_req.return_value = async_mock.MagicMock() + mock_submit.return_value = json.dumps(GET_NYM_REPLY) + mock_wait.return_value = mock_submit.return_value + mock_verify_spv_proof.return_value = False + with self.assertRaises(MultipleLedgerManagerError) as cm: + await self.manager.lookup_did_in_configured_ledgers( + "Av63wJYM7xYR4AiygYq4c3", cache_did=True + ) + assert "not found in any of the ledgers total: (production: " in str(cm.exception) + + @async_mock.patch("aries_cloudagent.ledger.indy_vdr.IndyVdrLedgerPool.context_open") + @async_mock.patch( + "aries_cloudagent.ledger.indy_vdr.IndyVdrLedgerPool.context_close" + ) + @async_mock.patch("indy_vdr.ledger.build_get_nym_request") + @async_mock.patch("aries_cloudagent.ledger.indy_vdr.IndyVdrLedger._submit") + async def test_lookup_did_in_configured_ledgers_prod_not_cached( + self, 
mock_submit, mock_build_get_nym_req, mock_close, mock_open + ): + with async_mock.patch.object( + test_module.asyncio, "wait", async_mock.CoroutineMock() + ) as mock_wait: + mock_build_get_nym_req.return_value = async_mock.MagicMock() + mock_submit.return_value = json.dumps(GET_NYM_REPLY) + mock_wait.return_value = mock_submit.return_value + ( + ledger_id, + ledger_inst, + ) = await self.manager.lookup_did_in_configured_ledgers( + "Av63wJYM7xYR4AiygYq4c3", cache_did=False + ) + assert ledger_id == "test_prod_1" + assert ledger_inst.pool.name == "test_prod_1" + + async def test_lookup_did_in_configured_ledgers_cached_prod_ledger(self): + cache = InMemoryCache() + await cache.set("did_ledger_id_resolver::Av63wJYM7xYR4AiygYq4c3", "test_prod_1") + self.profile.context.injector.bind_instance(BaseCache, cache) + (ledger_id, ledger_inst,) = await self.manager.lookup_did_in_configured_ledgers( + "Av63wJYM7xYR4AiygYq4c3", cache_did=True + ) + assert ledger_id == "test_prod_1" + assert ledger_inst.pool.name == "test_prod_1" + + async def test_lookup_did_in_configured_ledgers_cached_non_prod_ledger(self): + cache = InMemoryCache() + await cache.set( + "did_ledger_id_resolver::Av63wJYM7xYR4AiygYq4c3", "test_non_prod_2", None + ) + self.profile.context.injector.bind_instance(BaseCache, cache) + (ledger_id, ledger_inst,) = await self.manager.lookup_did_in_configured_ledgers( + "Av63wJYM7xYR4AiygYq4c3", cache_did=True + ) + assert ledger_id == "test_non_prod_2" + assert ledger_inst.pool.name == "test_non_prod_2" + + async def test_lookup_did_in_configured_ledgers_cached_x(self): + cache = InMemoryCache() + await cache.set("did_ledger_id_resolver::Av63wJYM7xYR4AiygYq4c3", "invalid_id") + self.profile.context.injector.bind_instance(BaseCache, cache) + with self.assertRaises(MultipleLedgerManagerError) as cm: + await self.manager.lookup_did_in_configured_ledgers( + "Av63wJYM7xYR4AiygYq4c3", cache_did=True + ) + assert "cached ledger_id invalid_id not found in either" in str(cm.exception) + + async 
def test_get_production_ledgers(self): + assert len(await self.manager.get_prod_ledgers()) == 2 + + async def test_get_non_production_ledgers(self): + assert len(await self.manager.get_nonprod_ledgers()) == 2 diff --git a/aries_cloudagent/ledger/multiple_ledger/tests/test_manager_provider.py b/aries_cloudagent/ledger/multiple_ledger/tests/test_manager_provider.py new file mode 100644 index 0000000000..c050bd15d0 --- /dev/null +++ b/aries_cloudagent/ledger/multiple_ledger/tests/test_manager_provider.py @@ -0,0 +1,108 @@ +from asynctest import TestCase as AsyncTestCase, mock as async_mock + +from aries_cloudagent.indy.sdk.profile import IndySdkProfile + +from ....askar.profile import AskarProfileManager +from ....config.injection_context import InjectionContext +from ....core.in_memory import InMemoryProfile +from ....indy.sdk.wallet_setup import IndyOpenWallet, IndyWalletConfig +from ....ledger.base import BaseLedger +from ....ledger.indy import IndySdkLedgerPool, IndySdkLedger + +from ..base_manager import MultipleLedgerManagerError +from ..manager_provider import MultiIndyLedgerManagerProvider + +TEST_GENESIS_TXN = { + "reqSignature": {}, + "txn": { + "data": { + "data": { + "alias": "Node1", + "blskey": "4N8aUNHSgjQVgkpm8nhNEfDf6txHznoYREg9kirmJrkivgL4oSEimFF6nsQ6M41QvhM2Z33nves5vfSn9n1UwNFJBYtWVnHYMATn76vLuL3zU88KyeAYcHfsih3He6UHcXDxcaecHVz6jhCYz1P2UZn2bDVruL5wXpehgBfBaLKm3Ba", + "blskey_pop": "RahHYiCvoNCtPTrVtP7nMC5eTYrsUA8WjXbdhNc8debh1agE9bGiJxWBXYNFbnJXoXhWFMvyqhqhRoq737YQemH5ik9oL7R4NTTCz2LEZhkgLJzB3QRQqJyBNyv7acbdHrAT8nQ9UkLbaVL9NBpnWXBTw4LEMePaSHEw66RzPNdAX1", + "client_ip": "192.168.65.3", + "client_port": 9702, + "node_ip": "192.168.65.3", + "node_port": 9701, + "services": ["VALIDATOR"], + }, + "dest": "Gw6pDLhcBcoQesN72qfotTgFa7cbuqZpkX3Xo6pLhPhv", + }, + "metadata": {"from": "Th7MpTaRZVRYnPiabds81Y"}, + "type": "0", + }, + "txnMetadata": { + "seqNo": 1, + "txnId": "fea82e10e894419fe2bea7d96296a6d46f50f93f9eeda954ec461b2ed2950b62", + }, + "ver": "1", 
+} + +LEDGER_CONFIG = [ + { + "id": "sovrinStaging", + "is_production": True, + "is_write": True, + "genesis_transactions": TEST_GENESIS_TXN, + }, + { + "id": "sovrinTest", + "is_production": False, + "genesis_transactions": TEST_GENESIS_TXN, + }, +] + + +class TestMultiIndyLedgerManagerProvider(AsyncTestCase): + async def test_provide_invalid_manager(self): + profile = InMemoryProfile.test_profile() + provider = MultiIndyLedgerManagerProvider(profile) + context = InjectionContext() + context.settings["ledger.ledger_config_list"] = LEDGER_CONFIG + with self.assertRaises(MultipleLedgerManagerError): + provider.provide(context.settings, context.injector) + + async def test_provide_indy_manager(self): + context = InjectionContext() + profile = IndySdkProfile( + IndyOpenWallet( + config=IndyWalletConfig({"name": "test-profile"}), + created=True, + handle=1, + master_secret_id="master-secret", + ), + context, + ) + context.injector.bind_instance( + BaseLedger, IndySdkLedger(IndySdkLedgerPool("name"), profile) + ) + provider = MultiIndyLedgerManagerProvider(profile) + context.settings["ledger.ledger_config_list"] = LEDGER_CONFIG + context.settings["ledger.genesis_transactions"] = TEST_GENESIS_TXN + self.assertEqual( + provider.provide(context.settings, context.injector).__class__.__name__, + "MultiIndyLedgerManager", + ) + + async def test_provide_askar_manager(self): + context = InjectionContext() + profile = await AskarProfileManager().provision( + context, + { + # "auto_recreate": True, + # "auto_remove": True, + "name": ":memory:", + "key": await AskarProfileManager.generate_store_key(), + "key_derivation_method": "RAW", # much faster than using argon-hashed keys + }, + ) + context.injector.bind_instance( + BaseLedger, IndySdkLedger(IndySdkLedgerPool("name"), profile) + ) + provider = MultiIndyLedgerManagerProvider(profile) + context.settings["ledger.ledger_config_list"] = LEDGER_CONFIG + context.settings["ledger.genesis_transactions"] = TEST_GENESIS_TXN + 
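`LEDGER_CONFIG` above encodes the documented multi-ledger constraints: every entry needs `id` and `is_production`, one of the `genesis_*` sources, and at most one entry may set `is_write` (a missing write ledger surfaces separately as a startup `ConfigError`). A hypothetical validator for those rules — not the provider's actual code:

```python
# Validate a multi-ledger config list against the documented constraints.
# Hypothetical helper for illustration; ACA-Py performs these checks
# internally during startup.
def validate_ledger_config(configs):
    write_ids = [c["id"] for c in configs if c.get("is_write")]
    if len(write_ids) > 1:
        raise ValueError(f"only one write ledger allowed, got {write_ids}")
    for c in configs:
        if "id" not in c or "is_production" not in c:
            raise ValueError("each ledger needs 'id' and 'is_production'")
        if not any(
            k in c for k in ("genesis_file", "genesis_transactions", "genesis_url")
        ):
            raise ValueError(f"ledger {c['id']} has no genesis source")
    # None here means the write ledger must come from another startup argument
    return write_ids[0] if write_ids else None
```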
self.assertEqual( + provider.provide(context.settings, context.injector).__class__.__name__, + "MultiIndyVDRLedgerManager", + ) diff --git a/aries_cloudagent/ledger/routes.py b/aries_cloudagent/ledger/routes.py index 90db11960d..ad51fb3d0e 100644 --- a/aries_cloudagent/ledger/routes.py +++ b/aries_cloudagent/ledger/routes.py @@ -2,7 +2,6 @@ from aiohttp import web from aiohttp_apispec import docs, querystring_schema, request_schema, response_schema - from marshmallow import fields, validate from ..admin.request_context import AdminRequestContext @@ -18,6 +17,19 @@ from ..wallet.error import WalletError, WalletNotFoundError from .base import BaseLedger, Role as LedgerRole +from .multiple_ledger.base_manager import ( + BaseMultipleLedgerManager, +) +from .multiple_ledger.ledger_requests_executor import ( + GET_NYM_ROLE, + GET_KEY_FOR_DID, + GET_ENDPOINT_FOR_DID, + IndyLedgerRequestsExecutor, +) +from .multiple_ledger.ledger_config_schema import ( + LedgerConfigListSchema, + WriteLedgerRequestSchema, +) from .endpoint_type import EndpointType from .error import BadLedgerRequestError, LedgerError, LedgerTransactionError @@ -169,13 +181,13 @@ async def register_ledger_nym(request: web.BaseRequest): request: aiohttp request object """ context: AdminRequestContext = request["context"] - session = await context.session() - ledger = session.inject_or(BaseLedger) - if not ledger: - reason = "No Indy ledger available" - if not session.settings.get_value("wallet.type"): - reason += ": missing wallet-type?" - raise web.HTTPForbidden(reason=reason) + async with context.profile.session() as session: + ledger = session.inject_or(BaseLedger) + if not ledger: + reason = "No Indy ledger available" + if not session.settings.get_value("wallet.type"): + reason += ": missing wallet-type?" 
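In the routes refactor below, `get_ledger_for_identifier` returns either a `(ledger_id, ledger)` tuple (multi-ledger enabled) or a bare ledger, which is why each handler repeats the `isinstance` unpacking and only includes `ledger_id` in its JSON response when one was resolved. That shaping, factored into hypothetical helpers:

```python
# Illustrative helpers for the tuple-or-ledger return convention used by the
# refactored handlers; these are not part of ACA-Py itself.
def unpack_ledger_info(ledger_info):
    """Split get_ledger_for_identifier()'s tuple-or-ledger return value."""
    if isinstance(ledger_info, tuple):
        return ledger_info  # (ledger_id, ledger) when multi-ledger is enabled
    return None, ledger_info


def shape_response(ledger_id, **fields):
    """Include ledger_id in the response body only when one was resolved."""
    return {"ledger_id": ledger_id, **fields} if ledger_id else dict(fields)
```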
+ raise web.HTTPForbidden(reason=reason) did = request.query.get("did") verkey = request.query.get("verkey") @@ -225,18 +237,29 @@ async def get_nym_role(request: web.BaseRequest): request: aiohttp request object """ context: AdminRequestContext = request["context"] - session = await context.session() - ledger = session.inject_or(BaseLedger) - if not ledger: - reason = "No Indy ledger available" - if not session.settings.get_value("wallet.type"): - reason += ": missing wallet-type?" - raise web.HTTPForbidden(reason=reason) did = request.query.get("did") if not did: raise web.HTTPBadRequest(reason="Request query must include DID") + ledger_id = None + async with context.profile.session() as session: + ledger_exec_inst = session.inject(IndyLedgerRequestsExecutor) + ledger_info = await ledger_exec_inst.get_ledger_for_identifier( + did, + txn_record_type=GET_NYM_ROLE, + ) + if isinstance(ledger_info, tuple): + ledger_id = ledger_info[0] + ledger = ledger_info[1] + else: + ledger = ledger_info + if not ledger: + reason = "No Indy ledger available" + if not session.settings.get_value("wallet.type"): + reason += ": missing wallet-type?" 
+ raise web.HTTPForbidden(reason=reason) + async with ledger: try: role = await ledger.get_nym_role(did) @@ -246,7 +269,10 @@ async def get_nym_role(request: web.BaseRequest): raise web.HTTPNotFound(reason=err.roll_up) except LedgerError as err: raise web.HTTPBadRequest(reason=err.roll_up) - return web.json_response({"role": role.name}) + if ledger_id: + return web.json_response({"ledger_id": ledger_id, "role": role.name}) + else: + return web.json_response({"role": role.name}) @docs(tags=["ledger"], summary="Rotate key pair for public DID.") @@ -259,13 +285,13 @@ async def rotate_public_did_keypair(request: web.BaseRequest): request: aiohttp request object """ context: AdminRequestContext = request["context"] - session = await context.session() - ledger = session.inject_or(BaseLedger) - if not ledger: - reason = "No Indy ledger available" - if not session.settings.get_value("wallet.type"): - reason += ": missing wallet-type?" - raise web.HTTPForbidden(reason=reason) + async with context.profile.session() as session: + ledger = session.inject_or(BaseLedger) + if not ledger: + reason = "No Indy ledger available" + if not session.settings.get_value("wallet.type"): + reason += ": missing wallet-type?" + raise web.HTTPForbidden(reason=reason) async with ledger: try: await ledger.rotate_public_did_keypair() # do not take seed over the wire @@ -289,18 +315,29 @@ async def get_did_verkey(request: web.BaseRequest): request: aiohttp request object """ context: AdminRequestContext = request["context"] - session = await context.session() - ledger = session.inject_or(BaseLedger) - if not ledger: - reason = "No ledger available" - if not session.settings.get_value("wallet.type"): - reason += ": missing wallet-type?" 
- raise web.HTTPForbidden(reason=reason) did = request.query.get("did") if not did: raise web.HTTPBadRequest(reason="Request query must include DID") + ledger_id = None + async with context.profile.session() as session: + ledger_exec_inst = session.inject(IndyLedgerRequestsExecutor) + ledger_info = await ledger_exec_inst.get_ledger_for_identifier( + did, + txn_record_type=GET_KEY_FOR_DID, + ) + if isinstance(ledger_info, tuple): + ledger_id = ledger_info[0] + ledger = ledger_info[1] + else: + ledger = ledger_info + if not ledger: + reason = "No ledger available" + if not session.settings.get_value("wallet.type"): + reason += ": missing wallet-type?" + raise web.HTTPForbidden(reason=reason) + async with ledger: try: result = await ledger.get_key_for_did(did) @@ -308,8 +345,10 @@ async def get_did_verkey(request: web.BaseRequest): raise web.HTTPNotFound(reason=f"DID {did} is not on the ledger") except LedgerError as err: raise web.HTTPBadRequest(reason=err.roll_up) from err - - return web.json_response({"verkey": result}) + if ledger_id: + return web.json_response({"ledger_id": ledger_id, "verkey": result}) + else: + return web.json_response({"verkey": result}) @docs( @@ -326,29 +365,41 @@ async def get_did_endpoint(request: web.BaseRequest): request: aiohttp request object """ context: AdminRequestContext = request["context"] - session = await context.session() - ledger = session.inject_or(BaseLedger) - if not ledger: - reason = "No Indy ledger available" - if not session.settings.get_value("wallet.type"): - reason += ": missing wallet-type?" 
- raise web.HTTPForbidden(reason=reason) did = request.query.get("did") + if not did: + raise web.HTTPBadRequest(reason="Request query must include DID") + + ledger_id = None + async with context.profile.session() as session: + ledger_exec_inst = session.inject(IndyLedgerRequestsExecutor) + ledger_info = await ledger_exec_inst.get_ledger_for_identifier( + did, + txn_record_type=GET_ENDPOINT_FOR_DID, + ) + if isinstance(ledger_info, tuple): + ledger_id = ledger_info[0] + ledger = ledger_info[1] + else: + ledger = ledger_info + if not ledger: + reason = "No Indy ledger available" + if not session.settings.get_value("wallet.type"): + reason += ": missing wallet-type?" + raise web.HTTPForbidden(reason=reason) endpoint_type = EndpointType.get( request.query.get("endpoint_type", EndpointType.ENDPOINT.w3c) ) - if not did: - raise web.HTTPBadRequest(reason="Request query must include DID") - async with ledger: try: r = await ledger.get_endpoint_for_did(did, endpoint_type) except LedgerError as err: raise web.HTTPBadRequest(reason=err.roll_up) from err - - return web.json_response({"endpoint": r}) + if ledger_id: + return web.json_response({"ledger_id": ledger_id, "endpoint": r}) + else: + return web.json_response({"endpoint": r}) @docs(tags=["ledger"], summary="Fetch the current transaction author agreement, if any") @@ -365,13 +416,13 @@ async def ledger_get_taa(request: web.BaseRequest): """ context: AdminRequestContext = request["context"] - session = await context.session() - ledger = session.inject_or(BaseLedger) - if not ledger: - reason = "No Indy ledger available" - if not session.settings.get_value("wallet.type"): - reason += ": missing wallet-type?" - raise web.HTTPForbidden(reason=reason) + async with context.profile.session() as session: + ledger = session.inject_or(BaseLedger) + if not ledger: + reason = "No Indy ledger available" + if not session.settings.get_value("wallet.type"): + reason += ": missing wallet-type?" 
+            raise web.HTTPForbidden(reason=reason)

     async with ledger:
         try:
@@ -406,13 +457,13 @@ async def ledger_accept_taa(request: web.BaseRequest):

     """
     context: AdminRequestContext = request["context"]
-    session = await context.session()
-    ledger = session.inject_or(BaseLedger)
-    if not ledger:
-        reason = "No Indy ledger available"
-        if not session.settings.get_value("wallet.type"):
-            reason += ": missing wallet-type?"
-        raise web.HTTPForbidden(reason=reason)
+    async with context.profile.session() as session:
+        ledger = session.inject_or(BaseLedger)
+        if not ledger:
+            reason = "No Indy ledger available"
+            if not session.settings.get_value("wallet.type"):
+                reason += ": missing wallet-type?"
+            raise web.HTTPForbidden(reason=reason)

     accept_input = await request.json()
     async with ledger:
@@ -438,6 +489,79 @@ async def ledger_accept_taa(request: web.BaseRequest):
     return web.json_response({})


+@docs(tags=["ledger"], summary="Fetch the current write ledger")
+@response_schema(WriteLedgerRequestSchema, 200, description="")
+async def get_write_ledger(request: web.BaseRequest):
+    """
+    Request handler for fetching the currently set write ledger.
+
+    Args:
+        request: aiohttp request object
+
+    Returns:
+        The write ledger identifier
+
+    """
+    context: AdminRequestContext = request["context"]
+    async with context.profile.session() as session:
+        multiledger_mgr = session.inject_or(BaseMultipleLedgerManager)
+        if not multiledger_mgr:
+            reason = "Multiple ledger support not enabled"
+            raise web.HTTPForbidden(reason=reason)
+        ledger_id = (await multiledger_mgr.get_write_ledger())[0]
+    return web.json_response({"ledger_id": ledger_id})
+
+
+@docs(
+    tags=["ledger"], summary="Fetch the multiple ledger configuration currently in use"
+)
+@response_schema(LedgerConfigListSchema, 200, description="")
+async def get_ledger_config(request: web.BaseRequest):
+    """
+    Request handler for fetching the ledger configuration list in use.
+
+    Args:
+        request: aiohttp request object
+
+    Returns:
+        Ledger configuration list
+
+    """
+    context: AdminRequestContext = request["context"]
+    async with context.profile.session() as session:
+        multiledger_mgr = session.inject_or(BaseMultipleLedgerManager)
+        if not multiledger_mgr:
+            reason = "Multiple ledger support not enabled"
+            raise web.HTTPForbidden(reason=reason)
+        ledger_config_list = session.settings.get_value("ledger.ledger_config_list")
+        config_ledger_dict = {"production_ledgers": [], "non_production_ledgers": []}
+        production_ledger_keys = (await multiledger_mgr.get_prod_ledgers()).keys()
+        non_production_ledger_keys = (
+            await multiledger_mgr.get_nonprod_ledgers()
+        ).keys()
+        config_ledger_ids_set = set()
+        for config in ledger_config_list:
+            ledger_id = config.get("id")
+            config_ledger_ids_set.add(ledger_id)
+            # removing genesis_transactions
+            config = {
+                key: val for key, val in config.items() if key != "genesis_transactions"
+            }
+            if ledger_id in production_ledger_keys:
+                config_ledger_dict.get("production_ledgers").append(config)
+            if ledger_id in non_production_ledger_keys:
+                config_ledger_dict.get("non_production_ledgers").append(config)
+        diff_prod_ledger_ids_set = set(production_ledger_keys) - config_ledger_ids_set
+        for diff_prod_ledger_id in diff_prod_ledger_ids_set:
+            config_ledger_dict.get("production_ledgers").append(
+                {
+                    "id": diff_prod_ledger_id,
+                    "desc": "ledger configured outside --genesis-transactions-list",
+                }
+            )
+    return web.json_response(config_ledger_dict)
+
+
 async def register(app: web.Application):
     """Register routes."""

@@ -450,6 +574,10 @@ async def register(app: web.Application):
             web.get("/ledger/did-endpoint", get_did_endpoint, allow_head=False),
             web.get("/ledger/taa", ledger_get_taa, allow_head=False),
             web.post("/ledger/taa/accept", ledger_accept_taa),
+            web.get(
+                "/ledger/multiple/get-write-ledger", get_write_ledger, allow_head=False
+            ),
+            web.get("/ledger/multiple/config", get_ledger_config, allow_head=False),
         ]
     )
diff --git a/aries_cloudagent/ledger/tests/test_indy.py b/aries_cloudagent/ledger/tests/test_indy.py
index 1afba916f7..c558e1c332 100644
--- a/aries_cloudagent/ledger/tests/test_indy.py
+++ b/aries_cloudagent/ledger/tests/test_indy.py
@@ -7,12 +7,18 @@

 from asynctest import mock as async_mock, TestCase as AsyncTestCase

+from ...config.injection_context import InjectionContext
+from ...core.in_memory import InMemoryProfile
 from ...cache.in_memory import InMemoryCache
 from ...indy.issuer import IndyIssuer, IndyIssuerError
+from ...indy.sdk.profile import IndySdkProfile
+from ...indy.sdk.wallet_setup import IndyWalletConfig
 from ...storage.record import StorageRecord
+from ...wallet.base import BaseWallet
 from ...wallet.did_info import DIDInfo
 from ...wallet.did_posture import DIDPosture
 from ...wallet.error import WalletNotFoundError
+from ...wallet.indy import IndyOpenWallet, IndySdkWallet
 from ...wallet.key_type import KeyType
 from ...wallet.did_method import DIDMethod

@@ -58,15 +64,23 @@ async def test_provide(self):

 @pytest.mark.indy
 class TestIndySdkLedger(AsyncTestCase):
-    test_did = "55GkHamhTU1ZbTbV2ab9DE"
-    test_did_info = DIDInfo(
-        did=test_did,
-        verkey="3Dn1SJNPaCXcvvJvSbsFWP2xaCjMom3can8CQNhWrTRx",
-        metadata=None,
-        method=DIDMethod.SOV,
-        key_type=KeyType.ED25519,
-    )
-    test_verkey = "3Dn1SJNPaCXcvvJvSbsFWP2xaCjMom3can8CQNhWrTRx"
+    async def setUp(self):
+        self.test_did = "55GkHamhTU1ZbTbV2ab9DE"
+        self.test_did_info = DIDInfo(
+            did=self.test_did,
+            verkey="3Dn1SJNPaCXcvvJvSbsFWP2xaCjMom3can8CQNhWrTRx",
+            metadata={"test": "test"},
+            method=DIDMethod.SOV,
+            key_type=KeyType.ED25519,
+        )
+        self.test_verkey = "3Dn1SJNPaCXcvvJvSbsFWP2xaCjMom3can8CQNhWrTRx"
+        context = InjectionContext()
+        context.injector.bind_instance(IndySdkLedgerPool, IndySdkLedgerPool("name"))
+        self.profile = IndySdkProfile(
+            async_mock.CoroutineMock(),
+            context,
+        )
+        self.session = await self.profile.session()

     @async_mock.patch("indy.pool.create_pool_ledger_config")
     @async_mock.patch("indy.pool.list_pools")
@@ -79,15 +93,16 @@ async def test_init(
         mock_list_pools.return_value = []

         mock_wallet = async_mock.MagicMock()
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
         ledger = IndySdkLedger(
             IndySdkLedgerPool("name", genesis_transactions="genesis_transactions"),
-            mock_wallet,
+            self.profile,
         )

         assert ledger.pool_name == "name"
         assert not ledger.read_only
         assert ledger.backend
-        assert ledger.wallet is mock_wallet
+        assert ledger.profile is self.profile

         await ledger.__aenter__()

@@ -98,10 +113,7 @@ async def test_init(
         mock_create_config.assert_called_once_with(
             "name", json.dumps({"genesis_txn": GENESIS_TRANSACTION_PATH})
         )
-        assert (
-            ledger.did_to_nym(ledger.nym_to_did(TestIndySdkLedger.test_did))
-            == TestIndySdkLedger.test_did
-        )
+        assert ledger.did_to_nym(ledger.nym_to_did(self.test_did)) == self.test_did

     @async_mock.patch("indy.pool.create_pool_ledger_config")
     @async_mock.patch("indy.pool.list_pools")
@@ -114,11 +126,12 @@ async def test_init_not_checked(
         mock_list_pools.return_value = []

         mock_wallet = async_mock.MagicMock()
-        ledger = IndySdkLedger(IndySdkLedgerPool("name"), mock_wallet)
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
+        ledger = IndySdkLedger(IndySdkLedgerPool("name"), self.profile)

         assert ledger.pool_name == "name"
         assert ledger.backend
-        assert ledger.wallet is mock_wallet
+        assert ledger.profile is self.profile

         with self.assertRaises(LedgerError):
             await ledger.__aenter__()

@@ -169,14 +182,18 @@ async def test_aenter_aexit(
         self, mock_close_pool, mock_open_ledger, mock_set_proto
     ):
         mock_wallet = async_mock.MagicMock()
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
-        async with ledger as led:
-            mock_set_proto.assert_called_once_with(2)
-            mock_open_ledger.assert_called_once_with("name", "{}")
-            assert led == ledger
-            mock_close_pool.assert_not_called()
-            assert led.pool_handle == mock_open_ledger.return_value
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            async with ledger as led:
+                mock_set_proto.assert_called_once_with(2)
+                mock_open_ledger.assert_called_once_with("name", "{}")
+                assert led == ledger
+                mock_close_pool.assert_not_called()
+                assert led.pool_handle == mock_open_ledger.return_value

         mock_close_pool.assert_called_once()
         assert ledger.pool_handle is None

@@ -188,19 +205,23 @@ async def test_aenter_aexit_nested_keepalive(
         self, mock_close_pool, mock_open_ledger, mock_set_proto
     ):
         mock_wallet = async_mock.MagicMock()
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
         ledger = IndySdkLedger(
-            IndySdkLedgerPool("name", checked=True, keepalive=1), mock_wallet
+            IndySdkLedgerPool("name", checked=True, keepalive=1), self.profile
         )
-
-        async with ledger as led0:
-            mock_set_proto.assert_called_once_with(2)
-            mock_open_ledger.assert_called_once_with("name", "{}")
-            assert led0 == ledger
-            mock_close_pool.assert_not_called()
-            assert led0.pool_handle == mock_open_ledger.return_value
-
-        async with ledger as led1:
-            assert ledger.pool.ref_count == 1
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            async with ledger as led0:
+                mock_set_proto.assert_called_once_with(2)
+                mock_open_ledger.assert_called_once_with("name", "{}")
+                assert led0 == ledger
+                mock_close_pool.assert_not_called()
+                assert led0.pool_handle == mock_open_ledger.return_value
+
+            async with ledger as led1:
+                assert ledger.pool.ref_count == 1

         mock_close_pool.assert_not_called()  # it's a future
         assert ledger.pool_handle

@@ -216,12 +237,16 @@ async def test_aenter_aexit_close_x(
         self, mock_close_pool, mock_open_ledger, mock_set_proto
     ):
         mock_wallet = async_mock.MagicMock()
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
         mock_close_pool.side_effect = IndyError(ErrorCode.PoolLedgerTimeout)
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
-        with self.assertRaises(LedgerError):
-            async with ledger as led:
-                assert led.pool_handle == mock_open_ledger.return_value
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            with self.assertRaises(LedgerError):
+                async with ledger as led:
+                    assert led.pool_handle == mock_open_ledger.return_value

         assert ledger.pool_handle == mock_open_ledger.return_value
         assert ledger.pool.ref_count == 1

@@ -234,10 +259,14 @@ async def test_submit_pool_closed(
         self, mock_close_pool, mock_open_ledger, mock_create_config, mock_set_proto
     ):
         mock_wallet = async_mock.MagicMock()
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
-        with self.assertRaises(ClosedPoolError) as context:
-            await ledger._submit("{}")
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            with self.assertRaises(ClosedPoolError) as context:
+                await ledger._submit("{}")

         assert "sign and submit request to closed pool" in str(context.exception)

     @async_mock.patch("indy.pool.set_protocol_version")
@@ -259,44 +288,40 @@ async def test_submit_signed(

         mock_sign_submit.return_value = '{"op": "REPLY"}'

         mock_wallet = async_mock.MagicMock()
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            async with ledger:
+                mock_wallet_get_public_did.return_value = None
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
-        async with ledger:
-            mock_wallet.get_public_did = async_mock.CoroutineMock()
-            mock_wallet.get_public_did.return_value = None
-
-            with self.assertRaises(BadLedgerRequestError):
-                await ledger._submit("{}", True)
-
-            mock_wallet.get_public_did = async_mock.CoroutineMock()
-            mock_did = mock_wallet.get_public_did.return_value
-            mock_did.did = TestIndySdkLedger.test_did
+                with self.assertRaises(BadLedgerRequestError):
+                    await ledger._submit("{}", True)

-            await ledger._submit(
-                request_json="{}",
-                sign=True,
-                taa_accept=False,
-            )
+                mock_wallet_get_public_did.return_value = async_mock.CoroutineMock()
+                mock_did = mock_wallet_get_public_did.return_value
+                mock_did.did = self.test_did

-            mock_wallet.get_public_did.assert_called_once_with()
-            mock_sign_submit.assert_called_once_with(
-                ledger.pool_handle, mock_wallet.opened.handle, mock_did.did, "{}"
-            )
+                await ledger._submit(
+                    request_json="{}",
+                    sign=True,
+                    taa_accept=False,
+                )

-            result_json = await ledger._submit(  # multi-sign for later endorsement
-                request_json="{}",
-                sign=True,
-                taa_accept=False,
-                write_ledger=False,
-            )
-            assert json.loads(result_json) == {"endorsed": "content"}
+                result_json = await ledger._submit(  # multi-sign for later endorsement
+                    request_json="{}",
+                    sign=True,
+                    taa_accept=False,
+                    write_ledger=False,
+                )
+                assert json.loads(result_json) == {"endorsed": "content"}

-            await ledger.txn_submit(  # cover txn_submit()
-                request_json="{}",
-                sign=True,
-                taa_accept=False,
-            )
+                await ledger.txn_submit(  # cover txn_submit()
+                    request_json="{}",
+                    sign=True,
+                    taa_accept=False,
+                )

     @async_mock.patch("indy.pool.set_protocol_version")
     @async_mock.patch("indy.pool.create_pool_ledger_config")
@@ -318,40 +343,37 @@ async def test_submit_signed_taa_accept(

         mock_sign_submit.return_value = '{"op": "REPLY"}'

         mock_wallet = async_mock.MagicMock()
-
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-        ledger.get_latest_txn_author_acceptance = async_mock.CoroutineMock(
-            return_value={
-                "text": "sample",
-                "version": "0.0",
-                "digest": "digest",
-                "mechanism": "dummy",
-                "time": "now",
-            }
-        )
-
-        async with ledger:
-            mock_wallet.get_public_did = async_mock.CoroutineMock()
-            mock_did = mock_wallet.get_public_did.return_value
-            mock_did.did = TestIndySdkLedger.test_did
-
-            await ledger._submit(
-                request_json="{}",
-                sign=None,
-                taa_accept=True,
-                sign_did=TestIndySdkLedger.test_did_info,
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = async_mock.CoroutineMock()
+            ledger = IndySdkLedger(
+                IndySdkLedgerPool("name", checked=True), self.profile
+            )
+            ledger.get_latest_txn_author_acceptance = async_mock.CoroutineMock(
+                return_value={
+                    "text": "sample",
+                    "version": "0.0",
+                    "digest": "digest",
+                    "mechanism": "dummy",
+                    "time": "now",
+                }
             )
-            mock_wallet.get_public_did.assert_not_called()
-            mock_append_taa.assert_called_once_with(
-                "{}", "sample", "0.0", "digest", "dummy", "now"
-            )
-            mock_sign_submit.assert_called_once_with(
-                ledger.pool_handle,
-                ledger.wallet.opened.handle,
-                TestIndySdkLedger.test_did,
-                "{}",
-            )
+            async with ledger:
+                mock_did = mock_wallet_get_public_did.return_value
+                mock_did.did = self.test_did
+
+                await ledger._submit(
+                    request_json="{}",
+                    sign=None,
+                    taa_accept=True,
+                    sign_did=self.test_did_info,
+                )
+                mock_append_taa.assert_called_once_with(
+                    "{}", "sample", "0.0", "digest", "dummy", "now"
+                )

     @async_mock.patch("indy.pool.set_protocol_version")
     @async_mock.patch("indy.pool.create_pool_ledger_config")
@@ -375,16 +397,15 @@ async def test_submit_unsigned(

         mock_submit.return_value = '{"op": "REPLY"}'

         mock_wallet = async_mock.MagicMock()
-        mock_wallet.get_public_did.return_value = future
-
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
-        async with ledger:
-            await ledger._submit("{}", False)
-
-            mock_wallet.get_public_did.assert_not_called()
-
-            mock_submit.assert_called_once_with(ledger.pool_handle, "{}")
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = future
+            async with ledger:
+                await ledger._submit("{}", False)
+                mock_submit.assert_called_once_with(ledger.pool_handle, "{}")

     @async_mock.patch("indy.pool.set_protocol_version")
     @async_mock.patch("indy.pool.create_pool_ledger_config")
@@ -408,17 +429,18 @@ async def test_submit_unsigned_ledger_transaction_error(

         mock_submit.return_value = '{"op": "NO-SUCH-OP"}'

         mock_wallet = async_mock.MagicMock()
-        mock_wallet.get_public_did.return_value = future
-
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
-        async with ledger:
-            with self.assertRaises(LedgerTransactionError):
-                await ledger._submit("{}", False)
-
-            mock_wallet.get_public_did.assert_not_called()
-
-            mock_submit.assert_called_once_with(ledger.pool_handle, "{}")
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = future
+            ledger = IndySdkLedger(
+                IndySdkLedgerPool("name", checked=True), self.profile
+            )
+            async with ledger:
+                with self.assertRaises(LedgerTransactionError):
+                    await ledger._submit("{}", False)
+
+                mock_submit.assert_called_once_with(ledger.pool_handle, "{}")

     @async_mock.patch("indy.pool.set_protocol_version")
     @async_mock.patch("indy.pool.create_pool_ledger_config")
@@ -442,26 +464,30 @@ async def test_submit_rejected(

         mock_submit.return_value = '{"op": "REQNACK", "reason": "a reason"}'

         mock_wallet = async_mock.MagicMock()
-        mock_wallet.get_public_did.return_value = future
-
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
-        async with ledger:
-            with self.assertRaises(LedgerTransactionError) as context:
-                await ledger._submit("{}", False)
-            assert "Ledger rejected transaction request" in str(context.exception)
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = future
+            async with ledger:
+                with self.assertRaises(LedgerTransactionError) as context:
+                    await ledger._submit("{}", False)
+                assert "Ledger rejected transaction request" in str(context.exception)

         mock_submit.return_value = '{"op": "REJECT", "reason": "another reason"}'

         mock_wallet = async_mock.MagicMock()
-        mock_wallet.get_public_did.return_value = future
-
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
-        async with ledger:
-            with self.assertRaises(LedgerTransactionError) as context:
-                await ledger._submit("{}", False)
-            assert "Ledger rejected transaction request" in str(context.exception)
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = future
+            async with ledger:
+                with self.assertRaises(LedgerTransactionError) as context:
+                    await ledger._submit("{}", False)
+                assert "Ledger rejected transaction request" in str(context.exception)

     @async_mock.patch("indy.pool.set_protocol_version")
     @async_mock.patch("indy.pool.create_pool_ledger_config")
@@ -479,24 +505,26 @@ async def test_txn_endorse(

         mock_indy_multi_sign.return_value = json.dumps({"endorsed": "content"})
         mock_indy_open.return_value = 1

-        mock_wallet = async_mock.MagicMock(
-            get_public_did=async_mock.CoroutineMock(return_value=None)
-        )
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
-        with self.assertRaises(ClosedPoolError):
-            await ledger.txn_endorse(request_json=json.dumps({"...": "..."}))
-
-        async with ledger:
-            with self.assertRaises(BadLedgerRequestError):
+        mock_wallet = async_mock.MagicMock()
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = None
+            with self.assertRaises(ClosedPoolError):
                 await ledger.txn_endorse(request_json=json.dumps({"...": "..."}))

-            mock_wallet.get_public_did.return_value = TestIndySdkLedger.test_did_info
+            async with ledger:
+                with self.assertRaises(BadLedgerRequestError):
+                    await ledger.txn_endorse(request_json=json.dumps({"...": "..."}))

-            endorsed_json = await ledger.txn_endorse(
-                request_json=json.dumps({"...": "..."})
-            )
-            assert json.loads(endorsed_json) == {"endorsed": "content"}
+                mock_wallet_get_public_did.return_value = self.test_did_info
+
+                endorsed_json = await ledger.txn_endorse(
+                    request_json=json.dumps({"...": "..."})
+                )
+                assert json.loads(endorsed_json) == {"endorsed": "content"}

     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open")
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close")
@@ -520,10 +548,10 @@ async def test_send_schema(
         mock_open,
     ):
         mock_wallet = async_mock.MagicMock()
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)

         issuer = async_mock.MagicMock(IndyIssuer)
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
         issuer.create_schema.return_value = ("schema_issuer_did:name:1.0", "{}")
         mock_fetch_schema_by_id.return_value = None
         mock_fetch_schema_by_seq_no.return_value = None
@@ -531,52 +559,56 @@ async def test_send_schema(
         mock_submit.return_value = (
             r'{"op":"REPLY","result":{"txnMetadata":{"seqNo": 1}}}'
         )
+        future = asyncio.Future()
+        future.set_result(async_mock.MagicMock(add_record=async_mock.CoroutineMock()))
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did, async_mock.patch.object(
+            ledger, "get_indy_storage", async_mock.MagicMock()
+        ) as mock_get_storage:
+            mock_get_storage.return_value = future
+            async with ledger:
+                mock_wallet_get_public_did.return_value = None
+
+                with self.assertRaises(BadLedgerRequestError):
+                    schema_id, schema_def = await ledger.create_and_send_schema(
+                        issuer, "schema_name", "schema_version", [1, 2, 3]
+                    )

-        async with ledger:
-            mock_wallet.get_public_did = async_mock.CoroutineMock()
-            mock_wallet.get_public_did.return_value = None
+                mock_wallet_get_public_did.return_value = async_mock.CoroutineMock()
+                mock_did = mock_wallet_get_public_did.return_value
+                mock_did.did = self.test_did

-            with self.assertRaises(BadLedgerRequestError):
                 schema_id, schema_def = await ledger.create_and_send_schema(
                     issuer, "schema_name", "schema_version", [1, 2, 3]
                 )
+                issuer.create_schema.assert_called_once_with(
+                    mock_did.did, "schema_name", "schema_version", [1, 2, 3]
+                )

-            mock_wallet.get_public_did = async_mock.CoroutineMock()
-            mock_did = mock_wallet.get_public_did.return_value
-            mock_did.did = TestIndySdkLedger.test_did
-
-            schema_id, schema_def = await ledger.create_and_send_schema(
-                issuer, "schema_name", "schema_version", [1, 2, 3]
-            )
-
-            mock_wallet.get_public_did.assert_called_once_with()
-            issuer.create_schema.assert_called_once_with(
-                mock_did.did, "schema_name", "schema_version", [1, 2, 3]
-            )
-
-            mock_build_schema_req.assert_called_once_with(
-                mock_did.did, issuer.create_schema.return_value[1]
-            )
+                mock_build_schema_req.assert_called_once_with(
+                    mock_did.did, issuer.create_schema.return_value[1]
+                )

-            mock_submit.assert_called_once_with(
-                mock_build_schema_req.return_value,
-                True,
-                sign_did=mock_wallet.get_public_did.return_value,
-                write_ledger=True,
-            )
+                mock_submit.assert_called_once_with(
+                    mock_build_schema_req.return_value,
+                    True,
+                    sign_did=mock_wallet_get_public_did.return_value,
+                    write_ledger=True,
+                )

-            assert schema_id == issuer.create_schema.return_value[0]
+                assert schema_id == issuer.create_schema.return_value[0]

-            schema_id, signed_txn = await ledger.create_and_send_schema(
-                issuer=issuer,
-                schema_name="schema_name",
-                schema_version="schema_version",
-                attribute_names=[1, 2, 3],
-                write_ledger=False,
-                endorser_did=TestIndySdkLedger.test_did,
-            )
-            assert schema_id == issuer.create_schema.return_value[0]
-            assert "signed_txn" in signed_txn
+                schema_id, signed_txn = await ledger.create_and_send_schema(
+                    issuer=issuer,
+                    schema_name="schema_name",
+                    schema_version="schema_version",
+                    attribute_names=[1, 2, 3],
+                    write_ledger=False,
+                    endorser_did=self.test_did,
+                )
+                assert schema_id == issuer.create_schema.return_value[0]
+                assert "signed_txn" in signed_txn

     @async_mock.patch("indy.pool.set_protocol_version")
     @async_mock.patch("indy.pool.create_pool_ledger_config")
@@ -600,26 +632,30 @@ async def test_send_schema_already_exists(

         # mock_did = async_mock.CoroutineMock()

         mock_wallet = async_mock.MagicMock()
-        mock_wallet.get_public_did = async_mock.CoroutineMock()
-        mock_wallet.get_public_did.return_value.did = "abc"
-
-        fetch_schema_id = (
-            f"{mock_wallet.get_public_did.return_value.did}:2:"
-            "schema_name:schema_version"
-        )
-
-        mock_check_existing.return_value = (fetch_schema_id, {})
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)

         issuer = async_mock.MagicMock(IndyIssuer)
         issuer.create_schema.return_value = ("1", "{}")
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
+        mock_add_record = async_mock.CoroutineMock()
+        future = asyncio.Future()
+        future.set_result(
+            async_mock.MagicMock(
+                return_value=async_mock.MagicMock(add_record=mock_add_record)
+            )
+        )
         with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did, async_mock.patch.object(
             ledger, "get_indy_storage", async_mock.MagicMock()
         ) as mock_get_storage:
-            mock_add_record = async_mock.CoroutineMock()
-            mock_get_storage.return_value = async_mock.MagicMock(
-                add_record=mock_add_record
+            mock_get_storage.return_value = future
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            fetch_schema_id = (
+                f"{mock_wallet_get_public_did.return_value.did}:2:"
+                "schema_name:schema_version"
             )
+            mock_check_existing.return_value = (fetch_schema_id, {})

             async with ledger:
                 schema_id, schema_def = await ledger.create_and_send_schema(
@@ -651,27 +687,33 @@ async def test_send_schema_ledger_transaction_error_already_exists(
     ):

         mock_wallet = async_mock.MagicMock()
-        mock_wallet.get_public_did = async_mock.CoroutineMock()
-        mock_wallet.get_public_did.return_value.did = "abc"
-
-        fetch_schema_id = (
-            f"{mock_wallet.get_public_did.return_value.did}:2:"
-            "schema_name:schema_version"
-        )
-        mock_check_existing.side_effect = [None, (fetch_schema_id, "{}")]
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)

         issuer = async_mock.MagicMock(IndyIssuer)
         issuer.create_schema.return_value = ("1", "{}")
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
         ledger._submit = async_mock.CoroutineMock(
             side_effect=LedgerTransactionError("UnauthorizedClientRequest")
         )
-
-        async with ledger:
-            schema_id, schema_def = await ledger.create_and_send_schema(
-                issuer, "schema_name", "schema_version", [1, 2, 3]
+        future = asyncio.Future()
+        future.set_result(async_mock.MagicMock(add_record=async_mock.CoroutineMock()))
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did, async_mock.patch.object(
+            ledger, "get_indy_storage", async_mock.MagicMock()
+        ) as mock_get_storage:
+            mock_get_storage.return_value = future
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            fetch_schema_id = (
+                f"{mock_wallet_get_public_did.return_value.did}:2:"
+                "schema_name:schema_version"
             )
-            assert schema_id == fetch_schema_id
+            mock_check_existing.side_effect = [None, (fetch_schema_id, "{}")]
+            async with ledger:
+                schema_id, schema_def = await ledger.create_and_send_schema(
+                    issuer, "schema_name", "schema_version", [1, 2, 3]
+                )
+                assert schema_id == fetch_schema_id

     @async_mock.patch("indy.pool.set_protocol_version")
     @async_mock.patch("indy.pool.create_pool_ledger_config")
@@ -690,27 +732,28 @@ async def test_send_schema_ledger_read_only(
     ):

         mock_wallet = async_mock.MagicMock()
-        mock_wallet.get_public_did = async_mock.CoroutineMock()
-        mock_wallet.get_public_did.return_value.did = "abc"
-
-        fetch_schema_id = (
-            f"{mock_wallet.get_public_did.return_value.did}:2:"
-            "schema_name:schema_version"
-        )
-        mock_check_existing.side_effect = [None, fetch_schema_id]
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)

         issuer = async_mock.MagicMock(IndyIssuer)
         issuer.create_schema.return_value = ("1", "{}")
         ledger = IndySdkLedger(
-            IndySdkLedgerPool("name", checked=True, read_only=True), mock_wallet
+            IndySdkLedgerPool("name", checked=True, read_only=True), self.profile
         )
-
-        async with ledger:
-            with self.assertRaises(LedgerError) as context:
-                await ledger.create_and_send_schema(
-                    issuer, "schema_name", "schema_version", [1, 2, 3]
-                )
-            assert "read only" in str(context.exception)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            fetch_schema_id = (
+                f"{mock_wallet_get_public_did.return_value.did}:2:"
+                "schema_name:schema_version"
+            )
+            mock_check_existing.side_effect = [None, fetch_schema_id]
+            async with ledger:
+                with self.assertRaises(LedgerError) as context:
+                    await ledger.create_and_send_schema(
+                        issuer, "schema_name", "schema_version", [1, 2, 3]
+                    )
+                assert "read only" in str(context.exception)

     @async_mock.patch("indy.pool.set_protocol_version")
     @async_mock.patch("indy.pool.create_pool_ledger_config")
@@ -729,27 +772,28 @@ async def test_send_schema_issuer_error(
     ):

         mock_wallet = async_mock.MagicMock()
-        mock_wallet.get_public_did = async_mock.CoroutineMock()
-        mock_wallet.get_public_did.return_value.did = "abc"
-
-        fetch_schema_id = (
-            f"{mock_wallet.get_public_did.return_value.did}:2:"
-            "schema_name:schema_version"
-        )
-        mock_check_existing.side_effect = [None, fetch_schema_id]
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)

         issuer = async_mock.MagicMock(IndyIssuer)
         issuer.create_schema = async_mock.CoroutineMock(
             side_effect=IndyIssuerError("dummy error")
         )
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
-        async with ledger:
-            with self.assertRaises(LedgerError) as context:
-                await ledger.create_and_send_schema(
-                    issuer, "schema_name", "schema_version", [1, 2, 3]
-                )
-            assert "dummy error" in str(context.exception)
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            fetch_schema_id = (
+                f"{mock_wallet_get_public_did.return_value.did}:2:"
+                "schema_name:schema_version"
+            )
+            mock_check_existing.side_effect = [None, fetch_schema_id]
+            async with ledger:
+                with self.assertRaises(LedgerError) as context:
+                    await ledger.create_and_send_schema(
+                        issuer, "schema_name", "schema_version", [1, 2, 3]
+                    )
+                assert "dummy error" in str(context.exception)

     @async_mock.patch("indy.pool.set_protocol_version")
     @async_mock.patch("indy.pool.create_pool_ledger_config")
@@ -772,27 +816,28 @@ async def test_send_schema_ledger_transaction_error(
     ):

         mock_wallet = async_mock.MagicMock()
-        mock_wallet.get_public_did = async_mock.CoroutineMock()
-        mock_wallet.get_public_did.return_value.did = "abc"
-
-        fetch_schema_id = (
-            f"{mock_wallet.get_public_did.return_value.did}:2:"
-            "schema_name:schema_version"
-        )
-        mock_check_existing.side_effect = [None, fetch_schema_id]
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)

         issuer = async_mock.MagicMock(IndyIssuer)
         issuer.create_schema.return_value = ("1", "{}")
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
         ledger._submit = async_mock.CoroutineMock(
             side_effect=LedgerTransactionError("Some other error message")
         )
-
-        async with ledger:
-            with self.assertRaises(LedgerTransactionError):
-                await ledger.create_and_send_schema(
-                    issuer, "schema_name", "schema_version", [1, 2, 3]
-                )
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            fetch_schema_id = (
+                f"{mock_wallet_get_public_did.return_value.did}:2:"
+                "schema_name:schema_version"
+            )
+            mock_check_existing.side_effect = [None, fetch_schema_id]
+            async with ledger:
+                with self.assertRaises(LedgerTransactionError):
+                    await ledger.create_and_send_schema(
+                        issuer, "schema_name", "schema_version", [1, 2, 3]
+                    )

     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open")
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close")
@@ -814,10 +859,9 @@ async def test_send_schema_no_seq_no(
         mock_open,
    ):
         mock_wallet = async_mock.MagicMock()
-
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
         issuer = async_mock.MagicMock(IndyIssuer)
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
         issuer.create_schema.return_value = ("schema_issuer_did:name:1.0", "{}")
         mock_fetch_schema_by_id.return_value = None
         mock_fetch_schema_by_seq_no.return_value = None
@@ -825,17 +869,20 @@ async def test_send_schema_no_seq_no(
         mock_submit.return_value = (
             r'{"op":"REPLY","result":{"txnMetadata":{"no": "seqNo"}}}'
         )
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = async_mock.CoroutineMock()
+            async with ledger:
+                mock_wallet.get_public_did = async_mock.CoroutineMock()
+                mock_did = mock_wallet_get_public_did.return_value
+                mock_did.did = self.test_did

-        async with ledger:
-            mock_wallet.get_public_did = async_mock.CoroutineMock()
-            mock_did = mock_wallet.get_public_did.return_value
-            mock_did.did = TestIndySdkLedger.test_did
-
-            with self.assertRaises(LedgerError) as context:
-                await ledger.create_and_send_schema(
-                    issuer, "schema_name", "schema_version", [1, 2, 3]
-                )
-            assert "schema sequence number" in str(context.exception)
+                with self.assertRaises(LedgerError) as context:
+                    await ledger.create_and_send_schema(
+                        issuer, "schema_name", "schema_version", [1, 2, 3]
+                    )
+                assert "schema sequence number" in str(context.exception)

     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open")
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close")
@@ -847,29 +894,32 @@ async def test_check_existing_schema(
         mock_open,
     ):
         mock_wallet = async_mock.MagicMock()
-        mock_wallet.get_public_did = async_mock.CoroutineMock()
-        mock_did = mock_wallet.get_public_did.return_value
-        mock_did.did = TestIndySdkLedger.test_did
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
         mock_fetch_schema_by_id.return_value = {"attrNames": ["a", "b", "c"]}
-
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-        async with ledger:
-            schema_id, schema_def = await ledger.check_existing_schema(
-                public_did=TestIndySdkLedger.test_did,
-                schema_name="test",
-                schema_version="1.0",
-                attribute_names=["c", "b", "a"],
-            )
-            assert schema_id == f"{TestIndySdkLedger.test_did}:2:test:1.0"
-
-            with self.assertRaises(LedgerTransactionError):
-                await ledger.check_existing_schema(
-                    public_did=TestIndySdkLedger.test_did,
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = async_mock.CoroutineMock()
+            mock_did = mock_wallet_get_public_did.return_value
+            mock_did.did = self.test_did
+            async with ledger:
+                schema_id, schema_def = await ledger.check_existing_schema(
+                    public_did=self.test_did,
                     schema_name="test",
                     schema_version="1.0",
-                    attribute_names=["a", "b", "c", "d"],
+                    attribute_names=["c", "b", "a"],
                 )
+                assert schema_id == f"{self.test_did}:2:test:1.0"
+
+                with self.assertRaises(LedgerTransactionError):
+                    await ledger.check_existing_schema(
+                        public_did=self.test_did,
+                        schema_name="test",
+                        schema_version="1.0",
+                        attribute_names=["a", "b", "c", "d"],
+                    )

     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open")
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close")
@@ -885,32 +935,43 @@ async def test_get_schema(
         mock_open,
     ):
         mock_wallet = async_mock.MagicMock()
-        mock_wallet.get_public_did = async_mock.CoroutineMock()
-        mock_did = mock_wallet.get_public_did.return_value
-        mock_did.did = TestIndySdkLedger.test_did
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
         mock_parse_get_schema_resp.return_value = (None, '{"attrNames": ["a", "b"]}')

         mock_submit.return_value = '{"result":{"seqNo":1}}'

-        ledger = IndySdkLedger(
-            IndySdkLedgerPool("name", checked=True, cache=InMemoryCache()), mock_wallet
-        )
-
-        async with ledger:
-            response = await ledger.get_schema("schema_id")
-
-            mock_wallet.get_public_did.assert_called_once_with()
-            mock_build_get_schema_req.assert_called_once_with(mock_did.did, "schema_id")
-            mock_submit.assert_called_once_with(
-                mock_build_get_schema_req.return_value, sign_did=mock_did
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = async_mock.CoroutineMock()
+            mock_did = mock_wallet_get_public_did.return_value
+            mock_did.did = self.test_did
+            ledger = IndySdkLedger(
+                IndySdkLedgerPool("name", checked=True, cache=InMemoryCache()),
+                self.profile,
             )
-            mock_parse_get_schema_resp.assert_called_once_with(mock_submit.return_value)
+            async with ledger:
+                response = await ledger.get_schema("schema_id")
+                mock_wallet_get_public_did.assert_called_once_with()
+                mock_build_get_schema_req.assert_called_once_with(
+                    mock_did.did, "schema_id"
+                )
+                mock_submit.assert_called_once_with(
+                    mock_build_get_schema_req.return_value, sign_did=mock_did
+                )
+                mock_parse_get_schema_resp.assert_called_once_with(
+                    mock_submit.return_value
+                )

-            assert response == json.loads(mock_parse_get_schema_resp.return_value[1])
+                assert response == json.loads(
+                    mock_parse_get_schema_resp.return_value[1]
+                )

-            response == await ledger.get_schema("schema_id")  # cover get-from-cache
-            assert response == json.loads(mock_parse_get_schema_resp.return_value[1])
+                response == await ledger.get_schema("schema_id")  # cover get-from-cache
+                assert response == json.loads(
+ mock_parse_get_schema_resp.return_value[1] + ) @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") @@ -924,26 +985,32 @@ async def test_get_schema_not_found( mock_open, ): mock_wallet = async_mock.MagicMock() - mock_wallet.get_public_did = async_mock.CoroutineMock() - mock_did = mock_wallet.get_public_did.return_value - mock_did.did = TestIndySdkLedger.test_did + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) mock_submit.return_value = json.dumps({"result": {"seqNo": None}}) - ledger = IndySdkLedger( - IndySdkLedgerPool("name", checked=True, cache=InMemoryCache()), mock_wallet - ) - - async with ledger: - response = await ledger.get_schema("schema_id") - - mock_wallet.get_public_did.assert_called_once_with() - mock_build_get_schema_req.assert_called_once_with(mock_did.did, "schema_id") - mock_submit.assert_called_once_with( - mock_build_get_schema_req.return_value, sign_did=mock_did + with async_mock.patch.object( + IndySdkWallet, "get_public_did" + ) as mock_wallet_get_public_did: + mock_wallet_get_public_did.return_value = async_mock.CoroutineMock() + mock_did = mock_wallet_get_public_did.return_value + mock_did.did = self.test_did + ledger = IndySdkLedger( + IndySdkLedgerPool("name", checked=True, cache=InMemoryCache()), + self.profile, ) - assert response is None + async with ledger: + response = await ledger.get_schema("schema_id") + mock_wallet_get_public_did.assert_called_once_with() + mock_build_get_schema_req.assert_called_once_with( + mock_did.did, "schema_id" + ) + mock_submit.assert_called_once_with( + mock_build_get_schema_req.return_value, sign_did=mock_did + ) + + assert response is None @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") @@ -961,9 +1028,7 @@ async def test_get_schema_by_seq_no( mock_open, ): 
mock_wallet = async_mock.MagicMock() - mock_wallet.get_public_did = async_mock.CoroutineMock() - mock_did = mock_wallet.get_public_did.return_value - mock_did.did = TestIndySdkLedger.test_did + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) mock_parse_get_schema_resp.return_value = (None, '{"attrNames": ["a", "b"]}') @@ -974,7 +1039,7 @@ async def test_get_schema_by_seq_no( "data": { "txn": { "type": "101", - "metadata": {"from": TestIndySdkLedger.test_did}, + "metadata": {"from": self.test_did}, "data": { "data": {"name": "preferences", "version": "1.0"} }, @@ -988,28 +1053,35 @@ async def test_get_schema_by_seq_no( mock_submit.side_effect = [ sub for sub in submissions ] # becomes list iterator, unsubscriptable, in mock object - - ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet) - - async with ledger: - response = await ledger.get_schema("999") - - mock_wallet.get_public_did.assert_called_once_with() - mock_build_get_txn_req.assert_called_once_with(None, None, seq_no=999) - mock_build_get_schema_req.assert_called_once_with( - mock_did.did, f"{TestIndySdkLedger.test_did}:2:preferences:1.0" - ) - mock_submit.assert_has_calls( - [ - async_mock.call(mock_build_get_txn_req.return_value), - async_mock.call( - mock_build_get_schema_req.return_value, sign_did=mock_did - ), - ] + with async_mock.patch.object( + IndySdkWallet, "get_public_did" + ) as mock_wallet_get_public_did: + mock_wallet_get_public_did.return_value = async_mock.CoroutineMock() + mock_did = mock_wallet_get_public_did.return_value + mock_did.did = self.test_did + ledger = IndySdkLedger( + IndySdkLedgerPool("name", checked=True), self.profile ) - mock_parse_get_schema_resp.assert_called_once_with(submissions[1]) + async with ledger: + response = await ledger.get_schema("999") + mock_wallet_get_public_did.assert_called_once_with() + mock_build_get_txn_req.assert_called_once_with(None, None, seq_no=999) + mock_build_get_schema_req.assert_called_once_with( + 
mock_did.did, f"{self.test_did}:2:preferences:1.0" + ) + mock_submit.assert_has_calls( + [ + async_mock.call(mock_build_get_txn_req.return_value), + async_mock.call( + mock_build_get_schema_req.return_value, sign_did=mock_did + ), + ] + ) + mock_parse_get_schema_resp.assert_called_once_with(submissions[1]) - assert response == json.loads(mock_parse_get_schema_resp.return_value[1]) + assert response == json.loads( + mock_parse_get_schema_resp.return_value[1] + ) @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") @@ -1027,9 +1099,7 @@ async def test_get_schema_by_wrong_seq_no( mock_open, ): mock_wallet = async_mock.MagicMock() - mock_wallet.get_public_did = async_mock.CoroutineMock() - mock_did = mock_wallet.get_public_did.return_value - mock_did.did = TestIndySdkLedger.test_did + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) mock_parse_get_schema_resp.return_value = (None, '{"attrNames": ["a", "b"]}') @@ -1050,12 +1120,16 @@ async def test_get_schema_by_wrong_seq_no( mock_submit.side_effect = [ sub for sub in submissions ] # becomes list iterator, unsubscriptable, in mock object - - ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet) - - async with ledger: - with self.assertRaises(LedgerTransactionError): - await ledger.get_schema("999") + ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile) + with async_mock.patch.object( + IndySdkWallet, "get_public_did" + ) as mock_wallet_get_public_did: + mock_wallet_get_public_did.return_value = async_mock.CoroutineMock() + mock_did = mock_wallet_get_public_did.return_value + mock_did.did = self.test_did + async with ledger: + with self.assertRaises(LedgerTransactionError): + await ledger.get_schema("999") @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedger.get_schema") 
@async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") @@ -1079,11 +1153,11 @@ async def test_send_credential_definition( mock_get_schema, ): mock_wallet = async_mock.MagicMock() - + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) mock_find_all_records.return_value = [] mock_get_schema.return_value = {"seqNo": 999} - cred_def_id = f"{TestIndySdkLedger.test_did}:3:CL:999:default" + cred_def_id = f"{self.test_did}:3:CL:999:default" cred_def_value = { "primary": {"n": "...", "s": "...", "r": "...", "revocation": None} } @@ -1106,44 +1180,43 @@ async def test_send_credential_definition( cred_def_json, ) issuer.credential_definition_in_wallet.return_value = False - ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet) schema_id = "schema_issuer_did:name:1.0" tag = "default" - - async with ledger: - mock_wallet.get_public_did = async_mock.CoroutineMock() - mock_wallet.get_public_did.return_value = None - - with self.assertRaises(BadLedgerRequestError): - await ledger.create_and_send_credential_definition( + ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile) + future = asyncio.Future() + future.set_result(async_mock.MagicMock(add_record=async_mock.CoroutineMock())) + with async_mock.patch.object( + IndySdkWallet, "get_public_did" + ) as mock_wallet_get_public_did, async_mock.patch.object( + ledger, "get_indy_storage", async_mock.MagicMock() + ) as mock_get_storage: + mock_get_storage.return_value = future + async with ledger: + mock_wallet_get_public_did.return_value = None + with self.assertRaises(BadLedgerRequestError): + await ledger.create_and_send_credential_definition( + issuer, schema_id, None, tag + ) + mock_wallet_get_public_did.return_value = DIDInfo( + did=self.test_did, + verkey=self.test_verkey, + metadata=None, + method=DIDMethod.SOV, + key_type=KeyType.ED25519, + ) + mock_did = mock_wallet_get_public_did.return_value + ( + result_id, + result_def, + novel, + ) = 
await ledger.create_and_send_credential_definition( issuer, schema_id, None, tag ) - - mock_wallet.get_public_did = async_mock.CoroutineMock() - mock_wallet.get_public_did.return_value = DIDInfo( - did=TestIndySdkLedger.test_did, - verkey=TestIndySdkLedger.test_verkey, - metadata=None, - method=DIDMethod.SOV, - key_type=KeyType.ED25519, - ) - mock_did = mock_wallet.get_public_did.return_value - - ( - result_id, - result_def, - novel, - ) = await ledger.create_and_send_credential_definition( - issuer, schema_id, None, tag - ) - assert result_id == cred_def_id - assert novel - - mock_wallet.get_public_did.assert_called_once_with() - mock_get_schema.assert_called_once_with(schema_id) - - mock_build_cred_def.assert_called_once_with(mock_did.did, cred_def_json) + assert result_id == cred_def_id + assert novel + mock_get_schema.assert_called_once_with(schema_id) + mock_build_cred_def.assert_called_once_with(mock_did.did, cred_def_json) @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedger.get_schema") @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") @@ -1169,11 +1242,11 @@ async def test_send_credential_definition_endorse_only( mock_get_schema, ): mock_wallet = async_mock.MagicMock() - + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) mock_find_all_records.return_value = [] mock_get_schema.return_value = {"seqNo": 999} - cred_def_id = f"{TestIndySdkLedger.test_did}:3:CL:999:default" + cred_def_id = f"{self.test_did}:3:CL:999:default" cred_def_value = { "primary": {"n": "...", "s": "...", "r": "...", "revocation": None} } @@ -1196,35 +1269,34 @@ async def test_send_credential_definition_endorse_only( cred_def_json, ) issuer.credential_definition_in_wallet.return_value = False - ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet) - + ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile) schema_id = "schema_issuer_did:name:1.0" tag = "default" - - async with ledger: 
- mock_wallet.get_public_did = async_mock.CoroutineMock() - mock_wallet.get_public_did.return_value = DIDInfo( - TestIndySdkLedger.test_did, - TestIndySdkLedger.test_verkey, + with async_mock.patch.object( + IndySdkWallet, "get_public_did" + ) as mock_wallet_get_public_did: + mock_wallet_get_public_did.return_value = DIDInfo( + self.test_did, + self.test_verkey, None, DIDMethod.SOV, KeyType.ED25519, ) - - ( - result_id, - signed_txn, - novel, - ) = await ledger.create_and_send_credential_definition( - issuer=issuer, - schema_id=schema_id, - signature_type=None, - tag=tag, - support_revocation=False, - write_ledger=False, - endorser_did=TestIndySdkLedger.test_did, - ) - assert "signed_txn" in signed_txn + async with ledger: + ( + result_id, + signed_txn, + novel, + ) = await ledger.create_and_send_credential_definition( + issuer=issuer, + schema_id=schema_id, + signature_type=None, + tag=tag, + support_revocation=False, + write_ledger=False, + endorser_did=self.test_did, + ) + assert "signed_txn" in signed_txn @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedger.get_schema") @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") @@ -1248,11 +1320,11 @@ async def test_send_credential_definition_exists_in_ledger_and_wallet( mock_get_schema, ): mock_wallet = async_mock.MagicMock() - + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) mock_find_all_records.return_value = [] mock_get_schema.return_value = {"seqNo": 999} - cred_def_id = f"{TestIndySdkLedger.test_did}:3:CL:999:default" + cred_def_id = f"{self.test_did}:3:CL:999:default" cred_def_value = { "primary": {"n": "...", "s": "...", "r": "...", "revocation": None} } @@ -1275,28 +1347,27 @@ async def test_send_credential_definition_exists_in_ledger_and_wallet( cred_def_json, ) issuer.credential_definition_in_wallet.return_value = True - ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet) - + ledger = 
IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile) schema_id = "schema_issuer_did:name:1.0" tag = "default" - + future = asyncio.Future() + future.set_result(async_mock.MagicMock(add_record=async_mock.CoroutineMock())) with async_mock.patch.object( + IndySdkWallet, "get_public_did" + ) as mock_wallet_get_public_did, async_mock.patch.object( ledger, "get_indy_storage", async_mock.MagicMock() ) as mock_get_storage: - mock_get_storage.return_value = async_mock.MagicMock( - add_record=async_mock.CoroutineMock() + mock_get_storage.return_value = future + mock_wallet_get_public_did.return_value = DIDInfo( + did=self.test_did, + verkey=self.test_verkey, + metadata=None, + method=DIDMethod.SOV, + key_type=KeyType.ED25519, ) async with ledger: - mock_wallet.get_public_did = async_mock.CoroutineMock() - mock_wallet.get_public_did.return_value = DIDInfo( - did=TestIndySdkLedger.test_did, - verkey=TestIndySdkLedger.test_verkey, - metadata=None, - method=DIDMethod.SOV, - key_type=KeyType.ED25519, - ) - mock_did = mock_wallet.get_public_did.return_value + mock_did = mock_wallet_get_public_did.return_value ( result_id, @@ -1308,7 +1379,7 @@ async def test_send_credential_definition_exists_in_ledger_and_wallet( assert result_id == cred_def_id assert not novel - mock_wallet.get_public_did.assert_called_once_with() + mock_wallet_get_public_did.assert_called_once_with() mock_get_schema.assert_called_once_with(schema_id) mock_build_cred_def.assert_not_called() @@ -1324,22 +1395,22 @@ async def test_send_credential_definition_no_such_schema( mock_get_schema, ): mock_wallet = async_mock.MagicMock() - + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) mock_get_schema.return_value = {} issuer = async_mock.MagicMock(IndyIssuer) - ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet) - + ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile) schema_id = "schema_issuer_did:name:1.0" tag = "default" - - async 
with ledger: - mock_wallet.get_public_did = async_mock.CoroutineMock() - - with self.assertRaises(LedgerError): - await ledger.create_and_send_credential_definition( - issuer, schema_id, None, tag - ) + with async_mock.patch.object( + IndySdkWallet, "get_public_did" + ) as mock_wallet_get_public_did: + mock_wallet_get_public_did.return_value = async_mock.CoroutineMock() + async with ledger: + with self.assertRaises(LedgerError): + await ledger.create_and_send_credential_definition( + issuer, schema_id, None, tag + ) @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedger.get_schema") @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") @@ -1363,7 +1434,7 @@ async def test_send_credential_definition_offer_exception( mock_get_schema, ): mock_wallet = async_mock.MagicMock() - + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) mock_find_all_records.return_value = [] mock_get_schema.return_value = {"seqNo": 999} @@ -1372,18 +1443,18 @@ async def test_send_credential_definition_offer_exception( issuer.credential_definition_in_wallet.side_effect = IndyIssuerError( "common IO error" ) - ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet) - + ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile) schema_id = "schema_issuer_did:name:1.0" tag = "default" - - async with ledger: - mock_wallet.get_public_did = async_mock.CoroutineMock() - - with self.assertRaises(LedgerError): - await ledger.create_and_send_credential_definition( - issuer, schema_id, None, tag - ) + with async_mock.patch.object( + IndySdkWallet, "get_public_did" + ) as mock_wallet_get_public_did: + mock_wallet_get_public_did.return_value = async_mock.CoroutineMock() + async with ledger: + with self.assertRaises(LedgerError): + await ledger.create_and_send_credential_definition( + issuer, schema_id, None, tag + ) @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedger.get_schema") 
@async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") @@ -1399,9 +1470,9 @@ async def test_send_credential_definition_cred_def_in_wallet_not_ledger( mock_get_schema, ): mock_wallet = async_mock.MagicMock() - + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) mock_get_schema.return_value = {"seqNo": 999} - cred_def_id = f"{TestIndySdkLedger.test_did}:3:CL:999:default" + cred_def_id = f"{self.test_did}:3:CL:999:default" cred_def_value = { "primary": {"n": "...", "s": "...", "r": "...", "revocation": None} } @@ -1418,18 +1489,18 @@ async def test_send_credential_definition_cred_def_in_wallet_not_ledger( mock_fetch_cred_def.return_value = {} issuer = async_mock.MagicMock(IndyIssuer) - ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet) - + ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile) schema_id = "schema_issuer_did:name:1.0" tag = "default" - - async with ledger: - mock_wallet.get_public_did = async_mock.CoroutineMock() - - with self.assertRaises(LedgerError): - await ledger.create_and_send_credential_definition( - issuer, schema_id, None, tag - ) + with async_mock.patch.object( + IndySdkWallet, "get_public_did" + ) as mock_wallet_get_public_did: + mock_wallet_get_public_did.return_value = async_mock.CoroutineMock() + async with ledger: + with self.assertRaises(LedgerError): + await ledger.create_and_send_credential_definition( + issuer, schema_id, None, tag + ) @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedger.get_schema") @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") @@ -1445,9 +1516,9 @@ async def test_send_credential_definition_cred_def_not_on_ledger_wallet_check_x( mock_get_schema, ): mock_wallet = async_mock.MagicMock() - + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) mock_get_schema.return_value = {"seqNo": 999} - cred_def_id = f"{TestIndySdkLedger.test_did}:3:CL:999:default" + cred_def_id 
= f"{self.test_did}:3:CL:999:default" cred_def_value = { "primary": {"n": "...", "s": "...", "r": "...", "revocation": None} } @@ -1467,19 +1538,19 @@ async def test_send_credential_definition_cred_def_not_on_ledger_wallet_check_x( issuer.credential_definition_in_wallet = async_mock.CoroutineMock( side_effect=IndyIssuerError("dummy error") ) - ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet) - + ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile) schema_id = "schema_issuer_did:name:1.0" tag = "default" - - async with ledger: - mock_wallet.get_public_did = async_mock.CoroutineMock() - - with self.assertRaises(LedgerError) as context: - await ledger.create_and_send_credential_definition( - issuer, schema_id, None, tag - ) - assert "dummy error" in str(context.exception) + with async_mock.patch.object( + IndySdkWallet, "get_public_did" + ) as mock_wallet_get_public_did: + mock_wallet_get_public_did.return_value = async_mock.CoroutineMock() + async with ledger: + with self.assertRaises(LedgerError) as context: + await ledger.create_and_send_credential_definition( + issuer, schema_id, None, tag + ) + assert "dummy error" in str(context.exception) @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedger.get_schema") @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") @@ -1495,9 +1566,9 @@ async def test_send_credential_definition_cred_def_not_on_ledger_nor_wallet_send mock_get_schema, ): mock_wallet = async_mock.MagicMock() - + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) mock_get_schema.return_value = {"seqNo": 999} - cred_def_id = f"{TestIndySdkLedger.test_did}:3:CL:999:default" + cred_def_id = f"{self.test_did}:3:CL:999:default" cred_def_value = { "primary": {"n": "...", "s": "...", "r": "...", "revocation": None} } @@ -1520,19 +1591,19 @@ async def test_send_credential_definition_cred_def_not_on_ledger_nor_wallet_send 
issuer.create_and_store_credential_definition = async_mock.CoroutineMock( side_effect=IndyIssuerError("dummy error") ) - ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet) - + ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile) schema_id = "schema_issuer_did:name:1.0" tag = "default" - - async with ledger: - mock_wallet.get_public_did = async_mock.CoroutineMock() - - with self.assertRaises(LedgerError) as context: - await ledger.create_and_send_credential_definition( - issuer, schema_id, None, tag - ) - assert "dummy error" in str(context.exception) + with async_mock.patch.object( + IndySdkWallet, "get_public_did" + ) as mock_wallet_get_public_did: + mock_wallet_get_public_did.return_value = async_mock.CoroutineMock() + async with ledger: + with self.assertRaises(LedgerError) as context: + await ledger.create_and_send_credential_definition( + issuer, schema_id, None, tag + ) + assert "dummy error" in str(context.exception) @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedger.get_schema") @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") @@ -1548,9 +1619,9 @@ async def test_send_credential_definition_read_only( mock_get_schema, ): mock_wallet = async_mock.MagicMock() - + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) mock_get_schema.return_value = {"seqNo": 999} - cred_def_id = f"{TestIndySdkLedger.test_did}:3:CL:999:default" + cred_def_id = f"{self.test_did}:3:CL:999:default" cred_def_value = { "primary": {"n": "...", "s": "...", "r": "...", "revocation": None} } @@ -1574,20 +1645,20 @@ async def test_send_credential_definition_read_only( return_value=("cred-def-id", "cred-def-json") ) ledger = IndySdkLedger( - IndySdkLedgerPool("name", checked=True, read_only=True), mock_wallet + IndySdkLedgerPool("name", checked=True, read_only=True), self.profile ) - schema_id = "schema_issuer_did:name:1.0" tag = "default" - - async with ledger: - 
mock_wallet.get_public_did = async_mock.CoroutineMock() - - with self.assertRaises(LedgerError) as context: - await ledger.create_and_send_credential_definition( - issuer, schema_id, None, tag - ) - assert "read only" in str(context.exception) + with async_mock.patch.object( + IndySdkWallet, "get_public_did" + ) as mock_wallet_get_public_did: + mock_wallet_get_public_did.return_value = async_mock.CoroutineMock() + async with ledger: + with self.assertRaises(LedgerError) as context: + await ledger.create_and_send_credential_definition( + issuer, schema_id, None, tag + ) + assert "read only" in str(context.exception) @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedger.get_schema") @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") @@ -1603,9 +1674,9 @@ async def test_send_credential_definition_cred_def_on_ledger_not_in_wallet( mock_get_schema, ): mock_wallet = async_mock.MagicMock() - + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) mock_get_schema.return_value = {"seqNo": 999} - cred_def_id = f"{TestIndySdkLedger.test_did}:3:CL:999:default" + cred_def_id = f"{self.test_did}:3:CL:999:default" cred_def_value = { "primary": {"n": "...", "s": "...", "r": "...", "revocation": None} } @@ -1625,18 +1696,18 @@ async def test_send_credential_definition_cred_def_on_ledger_not_in_wallet( issuer.credential_definition_in_wallet = async_mock.CoroutineMock( return_value=False ) - ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet) - + ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile) schema_id = "schema_issuer_did:name:1.0" tag = "default" - - async with ledger: - mock_wallet.get_public_did = async_mock.CoroutineMock() - - with self.assertRaises(LedgerError): - await ledger.create_and_send_credential_definition( - issuer, schema_id, None, tag - ) + with async_mock.patch.object( + IndySdkWallet, "get_public_did" + ) as mock_wallet_get_public_did: + 
mock_wallet_get_public_did.return_value = async_mock.CoroutineMock() + async with ledger: + with self.assertRaises(LedgerError): + await ledger.create_and_send_credential_definition( + issuer, schema_id, None, tag + ) @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedger.get_schema") @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") @@ -1660,11 +1731,11 @@ async def test_send_credential_definition_on_ledger_in_wallet( mock_get_schema, ): mock_wallet = async_mock.MagicMock() - + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) mock_find_all_records.return_value = [] mock_get_schema.return_value = {"seqNo": 999} - cred_def_id = f"{TestIndySdkLedger.test_did}:3:CL:999:default" + cred_def_id = f"{self.test_did}:3:CL:999:default" cred_def_value = { "primary": {"n": "...", "s": "...", "r": "...", "revocation": None} } @@ -1686,40 +1757,37 @@ async def test_send_credential_definition_on_ledger_in_wallet( cred_def_id, cred_def_json, ) - ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet) - + ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile) schema_id = "schema_issuer_did:name:1.0" tag = "default" + with async_mock.patch.object( + IndySdkWallet, "get_public_did" + ) as mock_wallet_get_public_did: + async with ledger: + mock_wallet_get_public_did.return_value = None + with self.assertRaises(BadLedgerRequestError): + await ledger.create_and_send_credential_definition( + issuer, schema_id, None, tag + ) - async with ledger: - mock_wallet.get_public_did = async_mock.CoroutineMock() - mock_wallet.get_public_did.return_value = None + mock_wallet_get_public_did.return_value = DIDInfo( + did=self.test_did, + verkey=self.test_verkey, + metadata=None, + method=DIDMethod.SOV, + key_type=KeyType.ED25519, + ) + mock_did = mock_wallet_get_public_did.return_value - with self.assertRaises(BadLedgerRequestError): - await ledger.create_and_send_credential_definition( + ( + 
result_id, + result_def, + novel, + ) = await ledger.create_and_send_credential_definition( issuer, schema_id, None, tag ) - - mock_wallet.get_public_did = async_mock.CoroutineMock() - mock_wallet.get_public_did.return_value = DIDInfo( - did=TestIndySdkLedger.test_did, - verkey=TestIndySdkLedger.test_verkey, - metadata=None, - method=DIDMethod.SOV, - key_type=KeyType.ED25519, - ) - mock_did = mock_wallet.get_public_did.return_value - - ( - result_id, - result_def, - novel, - ) = await ledger.create_and_send_credential_definition( - issuer, schema_id, None, tag - ) assert result_id == cred_def_id - mock_wallet.get_public_did.assert_called_once_with() mock_get_schema.assert_called_once_with(schema_id) mock_build_cred_def.assert_not_called() @@ -1746,11 +1814,11 @@ async def test_send_credential_definition_create_cred_def_exception( mock_get_schema, ): mock_wallet = async_mock.MagicMock() - + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) mock_find_all_records.return_value = [] mock_get_schema.return_value = {"seqNo": 999} - cred_def_id = f"{TestIndySdkLedger.test_did}:3:CL:999:default" + cred_def_id = f"{self.test_did}:3:CL:999:default" cred_def_value = { "primary": {"n": "...", "s": "...", "r": "...", "revocation": None} } @@ -1770,25 +1838,24 @@ async def test_send_credential_definition_create_cred_def_exception( issuer.create_and_store_credential_definition.side_effect = IndyIssuerError( "invalid structure" ) - ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet) - + ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile) schema_id = "schema_issuer_did:name:1.0" tag = "default" - - async with ledger: - mock_wallet.get_public_did = async_mock.CoroutineMock() - mock_wallet.get_public_did.return_value = DIDInfo( - did=TestIndySdkLedger.test_did, - verkey=TestIndySdkLedger.test_verkey, + with async_mock.patch.object( + IndySdkWallet, "get_public_did" + ) as mock_wallet_get_public_did: + 
+            mock_wallet_get_public_did.return_value = DIDInfo(
+                did=self.test_did,
+                verkey=self.test_verkey,
+                metadata=None,
+                method=DIDMethod.SOV,
+                key_type=KeyType.ED25519,
+            )
-
-        with self.assertRaises(LedgerError):
-            await ledger.create_and_send_credential_definition(
-                issuer, schema_id, None, tag
-            )
+            async with ledger:
+                with self.assertRaises(LedgerError):
+                    await ledger.create_and_send_credential_definition(
+                        issuer, schema_id, None, tag
+                    )
 
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open")
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close")
@@ -1804,38 +1871,42 @@ async def test_get_credential_definition(
         mock_open,
     ):
         mock_wallet = async_mock.MagicMock()
-        mock_wallet.get_public_did = async_mock.CoroutineMock()
-        mock_did = mock_wallet.get_public_did.return_value
-
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
         mock_parse_get_cred_def_resp.return_value = (
             None,
             json.dumps({"result": {"seqNo": 1}}),
         )
-
-        ledger = IndySdkLedger(
-            IndySdkLedgerPool("name", checked=True, cache=InMemoryCache()), mock_wallet
-        )
-
-        async with ledger:
-            response = await ledger.get_credential_definition("cred_def_id")
-
-            mock_wallet.get_public_did.assert_called_once_with()
-            mock_build_get_cred_def_req.assert_called_once_with(
-                mock_did.did, "cred_def_id"
-            )
-            mock_submit.assert_called_once_with(
-                mock_build_get_cred_def_req.return_value, sign_did=mock_did
-            )
-            mock_parse_get_cred_def_resp.assert_called_once_with(
-                mock_submit.return_value
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = async_mock.CoroutineMock()
+            mock_did = mock_wallet_get_public_did.return_value
+            ledger = IndySdkLedger(
+                IndySdkLedgerPool("name", checked=True, cache=InMemoryCache()),
+                self.profile,
             )
-        assert response == json.loads(mock_parse_get_cred_def_resp.return_value[1])
-
-        response == await ledger.get_credential_definition(  # cover get-from-cache
-            "cred_def_id"
-        )
-        assert response == json.loads(mock_parse_get_cred_def_resp.return_value[1])
+            async with ledger:
+                response = await ledger.get_credential_definition("cred_def_id")
+                mock_wallet_get_public_did.assert_called_once_with()
+                mock_build_get_cred_def_req.assert_called_once_with(
+                    mock_did.did, "cred_def_id"
+                )
+                mock_submit.assert_called_once_with(
+                    mock_build_get_cred_def_req.return_value, sign_did=mock_did
+                )
+                mock_parse_get_cred_def_resp.assert_called_once_with(
+                    mock_submit.return_value
+                )
+                assert response == json.loads(
+                    mock_parse_get_cred_def_resp.return_value[1]
+                )
+                response == await ledger.get_credential_definition(  # cover get-from-cache
+                    "cred_def_id"
+                )
+                assert response == json.loads(
+                    mock_parse_get_cred_def_resp.return_value[1]
+                )
 
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open")
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close")
@@ -1851,30 +1922,33 @@ async def test_get_credential_definition_ledger_not_found(
         mock_open,
    ):
         mock_wallet = async_mock.MagicMock()
-        mock_wallet.get_public_did = async_mock.CoroutineMock()
-        mock_did = mock_wallet.get_public_did.return_value
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
         mock_parse_get_cred_def_resp.side_effect = IndyError(
             error_code=ErrorCode.LedgerNotFound, error_details={"message": "not today"}
         )
-
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
-        async with ledger:
-            response = await ledger.get_credential_definition("cred_def_id")
-
-            mock_wallet.get_public_did.assert_called_once_with()
-            mock_build_get_cred_def_req.assert_called_once_with(
-                mock_did.did, "cred_def_id"
-            )
-            mock_submit.assert_called_once_with(
-                mock_build_get_cred_def_req.return_value, sign_did=mock_did
-            )
-            mock_parse_get_cred_def_resp.assert_called_once_with(
-                mock_submit.return_value
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            ledger = IndySdkLedger(
+                IndySdkLedgerPool("name", checked=True), self.profile
             )
+            async with ledger:
+                response = await ledger.get_credential_definition("cred_def_id")
+                mock_did = mock_wallet_get_public_did.return_value
+                mock_wallet_get_public_did.assert_called_once_with()
+                mock_build_get_cred_def_req.assert_called_once_with(
+                    mock_did.did, "cred_def_id"
+                )
+                mock_submit.assert_called_once_with(
+                    mock_build_get_cred_def_req.return_value, sign_did=mock_did
+                )
+                mock_parse_get_cred_def_resp.assert_called_once_with(
+                    mock_submit.return_value
+                )
 
-        assert response is None
+                assert response is None
 
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open")
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close")
@@ -1890,20 +1964,22 @@ async def test_fetch_credential_definition_ledger_x(
         mock_open,
    ):
         mock_wallet = async_mock.MagicMock()
-        mock_wallet.get_public_did = async_mock.CoroutineMock()
-        mock_did = mock_wallet.get_public_did.return_value
         mock_parse_get_cred_def_resp.side_effect = IndyError(
             error_code=ErrorCode.CommonInvalidParam1,
             error_details={"message": "not today"},
         )
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
-        async with ledger:
-            with self.assertRaises(LedgerError) as context:
-                await ledger.fetch_credential_definition("cred_def_id")
-            assert "not today" in str(context.exception)
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            async with ledger:
+                with self.assertRaises(LedgerError) as context:
+                    await ledger.fetch_credential_definition("cred_def_id")
+                assert "not today" in str(context.exception)
 
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open")
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close")
@@ -1913,27 +1989,27 @@ async def test_get_key_for_did(
         self, mock_submit, mock_build_get_nym_req, mock_close, mock_open
    ):
         mock_wallet = async_mock.MagicMock()
-
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
         mock_submit.return_value = json.dumps(
-            {"result": {"data": json.dumps({"verkey": TestIndySdkLedger.test_verkey})}}
+            {"result": {"data": json.dumps({"verkey": self.test_verkey})}}
         )
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
-        async with ledger:
-            mock_wallet.get_public_did = async_mock.CoroutineMock(
-                return_value=TestIndySdkLedger.test_did_info
-            )
-            response = await ledger.get_key_for_did(TestIndySdkLedger.test_did)
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            async with ledger:
+                response = await ledger.get_key_for_did(self.test_did)
 
-            assert mock_build_get_nym_req.called_once_with(
-                TestIndySdkLedger.test_did,
-                ledger.did_to_nym(TestIndySdkLedger.test_did),
-            )
-            assert mock_submit.called_once_with(
-                mock_build_get_nym_req.return_value,
-                sign_did=mock_wallet.get_public_did.return_value,
-            )
-            assert response == TestIndySdkLedger.test_verkey
+                assert mock_build_get_nym_req.called_once_with(
+                    self.test_did,
+                    ledger.did_to_nym(self.test_did),
+                )
+                assert mock_submit.called_once_with(
+                    mock_build_get_nym_req.return_value,
+                    sign_did=mock_wallet_get_public_did.return_value,
+                )
+                assert response == self.test_verkey
 
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open")
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close")
@@ -1943,31 +2019,31 @@ async def test_get_endpoint_for_did(
         self, mock_submit, mock_build_get_attrib_req, mock_close, mock_open
    ):
         mock_wallet = async_mock.MagicMock()
-
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
         endpoint = "http://aries.ca"
         mock_submit.return_value = json.dumps(
             {"result": {"data": json.dumps({"endpoint": {"endpoint": endpoint}})}}
         )
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
-        async with ledger:
-            mock_wallet.get_public_did = async_mock.CoroutineMock(
-                return_value=TestIndySdkLedger.test_did_info
-            )
-            response = await ledger.get_endpoint_for_did(TestIndySdkLedger.test_did)
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            async with ledger:
+                response = await ledger.get_endpoint_for_did(self.test_did)
 
-            assert mock_build_get_attrib_req.called_once_with(
-                TestIndySdkLedger.test_did,
-                ledger.did_to_nym(TestIndySdkLedger.test_did),
-                "endpoint",
-                None,
-                None,
-            )
-            assert mock_submit.called_once_with(
-                mock_build_get_attrib_req.return_value,
-                sign_did=mock_wallet.get_public_did.return_value,
-            )
-            assert response == endpoint
+                assert mock_build_get_attrib_req.called_once_with(
+                    self.test_did,
+                    ledger.did_to_nym(self.test_did),
+                    "endpoint",
+                    None,
+                    None,
+                )
+                assert mock_submit.called_once_with(
+                    mock_build_get_attrib_req.return_value,
+                    sign_did=mock_wallet_get_public_did.return_value,
+                )
+                assert response == endpoint
 
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open")
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close")
@@ -1977,7 +2053,7 @@ async def test_get_endpoint_of_type_profile_for_did(
         self, mock_submit, mock_build_get_attrib_req, mock_close, mock_open
    ):
         mock_wallet = async_mock.MagicMock()
-
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
         endpoint = "http://company.com/masterdata"
         endpoint_type = EndpointType.PROFILE
         mock_submit.return_value = json.dumps(
@@ -1989,29 +2065,29 @@ async def test_get_endpoint_of_type_profile_for_did(
                 }
             }
         )
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
-        async with ledger:
-            mock_wallet.get_public_did = async_mock.CoroutineMock(
-                return_value=TestIndySdkLedger.test_did_info
-            )
-            response = await ledger.get_endpoint_for_did(
-                TestIndySdkLedger.test_did,
-                endpoint_type,
-            )
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            async with ledger:
+                response = await ledger.get_endpoint_for_did(
+                    self.test_did,
+                    endpoint_type,
+                )
 
-            assert mock_build_get_attrib_req.called_once_with(
-                TestIndySdkLedger.test_did,
-                ledger.did_to_nym(TestIndySdkLedger.test_did),
-                "endpoint",
-                None,
-                None,
-            )
-            assert mock_submit.called_once_with(
-                mock_build_get_attrib_req.return_value,
-                sign_did=mock_wallet.get_public_did.return_value,
-            )
-            assert response == endpoint
+                assert mock_build_get_attrib_req.called_once_with(
+                    self.test_did,
+                    ledger.did_to_nym(self.test_did),
+                    "endpoint",
+                    None,
+                    None,
+                )
+                assert mock_submit.called_once_with(
+                    mock_build_get_attrib_req.return_value,
+                    sign_did=mock_wallet_get_public_did.return_value,
+                )
+                assert response == endpoint
 
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open")
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close")
@@ -2021,35 +2097,33 @@ async def test_get_all_endpoints_for_did(
         self, mock_submit, mock_build_get_attrib_req, mock_close, mock_open
    ):
         mock_wallet = async_mock.MagicMock()
-
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
         profile_endpoint = "http://company.com/masterdata"
         default_endpoint = "http://agent.company.com"
         data_json = json.dumps(
             {"endpoint": {"endpoint": default_endpoint, "profile": profile_endpoint}}
         )
         mock_submit.return_value = json.dumps({"result": {"data": data_json}})
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
-        async with ledger:
-            mock_wallet.get_public_did = async_mock.CoroutineMock(
-                return_value=TestIndySdkLedger.test_did_info
-            )
-            response = await ledger.get_all_endpoints_for_did(
-                TestIndySdkLedger.test_did
-            )
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            async with ledger:
+                response = await ledger.get_all_endpoints_for_did(self.test_did)
 
-            assert mock_build_get_attrib_req.called_once_with(
-                TestIndySdkLedger.test_did,
-                ledger.did_to_nym(TestIndySdkLedger.test_did),
-                "endpoint",
-                None,
-                None,
-            )
-            assert mock_submit.called_once_with(
-                mock_build_get_attrib_req.return_value,
-                sign_did=mock_wallet.get_public_did.return_value,
-            )
-            assert response == json.loads(data_json).get("endpoint")
+                assert mock_build_get_attrib_req.called_once_with(
+                    self.test_did,
+                    ledger.did_to_nym(self.test_did),
+                    "endpoint",
+                    None,
+                    None,
+                )
+                assert mock_submit.called_once_with(
+                    mock_build_get_attrib_req.return_value,
+                    sign_did=mock_wallet_get_public_did.return_value,
+                )
+                assert response == json.loads(data_json).get("endpoint")
 
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open")
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close")
@@ -2059,32 +2133,30 @@ async def test_get_all_endpoints_for_did_none(
         self, mock_submit, mock_build_get_attrib_req, mock_close, mock_open
    ):
         mock_wallet = async_mock.MagicMock()
-
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
         profile_endpoint = "http://company.com/masterdata"
         default_endpoint = "http://agent.company.com"
         mock_submit.return_value = json.dumps({"result": {"data": None}})
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
-        async with ledger:
-            mock_wallet.get_public_did = async_mock.CoroutineMock(
-                return_value=TestIndySdkLedger.test_did_info
-            )
-            response = await ledger.get_all_endpoints_for_did(
-                TestIndySdkLedger.test_did
-            )
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            async with ledger:
+                response = await ledger.get_all_endpoints_for_did(self.test_did)
 
-            assert mock_build_get_attrib_req.called_once_with(
-                TestIndySdkLedger.test_did,
-                ledger.did_to_nym(TestIndySdkLedger.test_did),
-                "endpoint",
-                None,
-                None,
-            )
-            assert mock_submit.called_once_with(
-                mock_build_get_attrib_req.return_value,
-                sign_did=mock_wallet.get_public_did.return_value,
-            )
-            assert response is None
+                assert mock_build_get_attrib_req.called_once_with(
+                    self.test_did,
+                    ledger.did_to_nym(self.test_did),
+                    "endpoint",
+                    None,
+                    None,
+                )
+                assert mock_submit.called_once_with(
+                    mock_build_get_attrib_req.return_value,
+                    sign_did=mock_wallet_get_public_did.return_value,
+                )
+                assert response is None
 
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open")
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close")
@@ -2094,30 +2166,30 @@ async def test_get_endpoint_for_did_address_none(
         self, mock_submit, mock_build_get_attrib_req, mock_close, mock_open
    ):
         mock_wallet = async_mock.MagicMock()
-
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
         mock_submit.return_value = json.dumps(
             {"result": {"data": json.dumps({"endpoint": None})}}
         )
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
-        async with ledger:
-            mock_wallet.get_public_did = async_mock.CoroutineMock(
-                return_value=TestIndySdkLedger.test_did_info
-            )
-            response = await ledger.get_endpoint_for_did(TestIndySdkLedger.test_did)
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            async with ledger:
+                response = await ledger.get_endpoint_for_did(self.test_did)
 
-            assert mock_build_get_attrib_req.called_once_with(
-                TestIndySdkLedger.test_did,
-                ledger.did_to_nym(TestIndySdkLedger.test_did),
-                "endpoint",
-                None,
-                None,
-            )
-            assert mock_submit.called_once_with(
-                mock_build_get_attrib_req.return_value,
-                sign_did=mock_wallet.get_public_did.return_value,
-            )
-            assert response is None
+                assert mock_build_get_attrib_req.called_once_with(
+                    self.test_did,
+                    ledger.did_to_nym(self.test_did),
+                    "endpoint",
+                    None,
+                    None,
+                )
+                assert mock_submit.called_once_with(
+                    mock_build_get_attrib_req.return_value,
+                    sign_did=mock_wallet_get_public_did.return_value,
+                )
+                assert response is None
 
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open")
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close")
@@ -2127,28 +2199,28 @@ async def test_get_endpoint_for_did_no_endpoint(
         self, mock_submit, mock_build_get_attrib_req, mock_close, mock_open
    ):
         mock_wallet = async_mock.MagicMock()
-
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
         mock_submit.return_value = json.dumps({"result": {"data": None}})
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
-        async with ledger:
-            mock_wallet.get_public_did = async_mock.CoroutineMock(
-                return_value=TestIndySdkLedger.test_did_info
-            )
-            response = await ledger.get_endpoint_for_did(TestIndySdkLedger.test_did)
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            async with ledger:
+                response = await ledger.get_endpoint_for_did(self.test_did)
 
-            assert mock_build_get_attrib_req.called_once_with(
-                TestIndySdkLedger.test_did,
-                ledger.did_to_nym(TestIndySdkLedger.test_did),
-                "endpoint",
-                None,
-                None,
-            )
-            assert mock_submit.called_once_with(
-                mock_build_get_attrib_req.return_value,
-                sign_did=mock_wallet.get_public_did.return_value,
-            )
-            assert response is None
+                assert mock_build_get_attrib_req.called_once_with(
+                    self.test_did,
+                    ledger.did_to_nym(self.test_did),
+                    "endpoint",
+                    None,
+                    None,
+                )
+                assert mock_submit.called_once_with(
+                    mock_build_get_attrib_req.return_value,
+                    sign_did=mock_wallet_get_public_did.return_value,
+                )
+                assert response is None
 
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open")
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close")
@@ -2164,7 +2236,7 @@ async def test_update_endpoint_for_did(
         mock_open,
    ):
         mock_wallet = async_mock.MagicMock()
-
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
         endpoint = ["http://old.aries.ca", "http://new.aries.ca"]
         mock_submit.side_effect = [
             json.dumps(
@@ -2176,73 +2248,29 @@ async def test_update_endpoint_for_did(
             )
             for i in range(len(endpoint))
         ]
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
-        async with ledger:
-            mock_wallet.get_public_did = async_mock.CoroutineMock(
-                return_value=TestIndySdkLedger.test_did_info
-            )
-            response = await ledger.update_endpoint_for_did(
-                TestIndySdkLedger.test_did, endpoint[1]
-            )
-
-        assert mock_build_get_attrib_req.called_once_with(
-            TestIndySdkLedger.test_did,
-            ledger.did_to_nym(TestIndySdkLedger.test_did),
-            "endpoint",
-            None,
-            None,
-        )
-        mock_submit.assert_has_calls(
-            [
-                async_mock.call(
-                    mock_build_get_attrib_req.return_value,
-                    sign_did=mock_wallet.get_public_did.return_value,
-                ),
-                async_mock.call(mock_build_attrib_req.return_value, True, True),
-            ]
-        )
-        assert response
-
-    @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open")
-    @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close")
-    @async_mock.patch("indy.ledger.build_get_attrib_request")
-    @async_mock.patch("indy.ledger.build_attrib_request")
-    @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedger._submit")
-    async def test_update_endpoint_for_did_no_prior_endpoints(
-        self,
-        mock_submit,
-        mock_build_attrib_req,
-        mock_build_get_attrib_req,
-        mock_close,
-        mock_open,
-    ):
-        mock_wallet = async_mock.MagicMock()
-
-        endpoint = "http://new.aries.ca"
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
-        async with ledger:
-            with async_mock.patch.object(
-                ledger, "get_all_endpoints_for_did", async_mock.CoroutineMock()
-            ) as mock_get_all:
-                mock_get_all.return_value = None
-                mock_wallet.get_public_did = async_mock.CoroutineMock(
-                    return_value=TestIndySdkLedger.test_did_info
-                )
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            async with ledger:
                 response = await ledger.update_endpoint_for_did(
-                    TestIndySdkLedger.test_did, endpoint
+                    self.test_did, endpoint[1]
                 )
 
                 assert mock_build_get_attrib_req.called_once_with(
-                    TestIndySdkLedger.test_did,
-                    ledger.did_to_nym(TestIndySdkLedger.test_did),
+                    self.test_did,
+                    ledger.did_to_nym(self.test_did),
                     "endpoint",
                     None,
                     None,
                 )
                 mock_submit.assert_has_calls(
                     [
+                        async_mock.call(
+                            mock_build_get_attrib_req.return_value,
+                            sign_did=mock_wallet_get_public_did.return_value,
+                        ),
                         async_mock.call(mock_build_attrib_req.return_value, True, True),
                     ]
                 )
@@ -2253,7 +2281,7 @@ async def test_update_endpoint_for_did_no_prior_endpoints(
     @async_mock.patch("indy.ledger.build_get_attrib_request")
     @async_mock.patch("indy.ledger.build_attrib_request")
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedger._submit")
-    async def test_update_endpoint_of_type_profile_for_did(
+    async def test_update_endpoint_for_did_no_prior_endpoints(
         self,
         mock_submit,
         mock_build_attrib_req,
@@ -2262,7 +2290,53 @@ async def test_update_endpoint_of_type_profile_for_did(
         mock_open,
    ):
         mock_wallet = async_mock.MagicMock()
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
+        endpoint = "http://new.aries.ca"
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            async with ledger:
+                with async_mock.patch.object(
+                    ledger, "get_all_endpoints_for_did", async_mock.CoroutineMock()
+                ) as mock_get_all:
+                    mock_get_all.return_value = None
+                    response = await ledger.update_endpoint_for_did(
+                        self.test_did, endpoint
+                    )
+                    assert mock_build_get_attrib_req.called_once_with(
+                        self.test_did,
+                        ledger.did_to_nym(self.test_did),
+                        "endpoint",
+                        None,
+                        None,
+                    )
+                    mock_submit.assert_has_calls(
+                        [
+                            async_mock.call(
+                                mock_build_attrib_req.return_value, True, True
+                            ),
+                        ]
+                    )
+                    assert response
+
+    @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open")
+    @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close")
+    @async_mock.patch("indy.ledger.build_get_attrib_request")
+    @async_mock.patch("indy.ledger.build_attrib_request")
+    @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedger._submit")
+    async def test_update_endpoint_of_type_profile_for_did(
+        self,
+        mock_submit,
+        mock_build_attrib_req,
+        mock_build_get_attrib_req,
+        mock_close,
+        mock_open,
+    ):
+        mock_wallet = async_mock.MagicMock()
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
         endpoint = ["http://company.com/oldProfile", "http://company.com/newProfile"]
         endpoint_type = EndpointType.PROFILE
         mock_submit.side_effect = [
@@ -2277,33 +2351,33 @@ async def test_update_endpoint_of_type_profile_for_did(
             )
             for i in range(len(endpoint))
         ]
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
-        async with ledger:
-            mock_wallet.get_public_did = async_mock.CoroutineMock(
-                return_value=TestIndySdkLedger.test_did_info
-            )
-            response = await ledger.update_endpoint_for_did(
-                TestIndySdkLedger.test_did, endpoint[1], endpoint_type
-            )
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            async with ledger:
+                response = await ledger.update_endpoint_for_did(
+                    self.test_did, endpoint[1], endpoint_type
+                )
 
-            assert mock_build_get_attrib_req.called_once_with(
-                TestIndySdkLedger.test_did,
-                ledger.did_to_nym(TestIndySdkLedger.test_did),
-                "endpoint",
-                None,
-                None,
-            )
-            mock_submit.assert_has_calls(
-                [
-                    async_mock.call(
-                        mock_build_get_attrib_req.return_value,
-                        sign_did=mock_wallet.get_public_did.return_value,
-                    ),
-                    async_mock.call(mock_build_attrib_req.return_value, True, True),
-                ]
-            )
-            assert response
+                assert mock_build_get_attrib_req.called_once_with(
+                    self.test_did,
+                    ledger.did_to_nym(self.test_did),
+                    "endpoint",
+                    None,
+                    None,
+                )
+                mock_submit.assert_has_calls(
+                    [
+                        async_mock.call(
+                            mock_build_get_attrib_req.return_value,
+                            sign_did=mock_wallet_get_public_did.return_value,
+                        ),
+                        async_mock.call(mock_build_attrib_req.return_value, True, True),
+                    ]
+                )
+                assert response
 
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open")
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close")
@@ -2313,33 +2387,31 @@ async def test_update_endpoint_for_did_duplicate(
         self, mock_submit, mock_build_get_attrib_req, mock_close, mock_open
    ):
         mock_wallet = async_mock.MagicMock()
-
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
         endpoint = "http://aries.ca"
         mock_submit.return_value = json.dumps(
             {"result": {"data": json.dumps({"endpoint": {"endpoint": endpoint}})}}
         )
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
-        async with ledger:
-            mock_wallet.get_public_did = async_mock.CoroutineMock(
-                return_value=TestIndySdkLedger.test_did_info
-            )
-            response = await ledger.update_endpoint_for_did(
-                TestIndySdkLedger.test_did, endpoint
-            )
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            async with ledger:
+                response = await ledger.update_endpoint_for_did(self.test_did, endpoint)
 
-            assert mock_build_get_attrib_req.called_once_with(
-                TestIndySdkLedger.test_did,
-                ledger.did_to_nym(TestIndySdkLedger.test_did),
-                "endpoint",
-                None,
-                None,
-            )
-            assert mock_submit.called_once_with(
-                mock_build_get_attrib_req.return_value,
-                sign_did=mock_wallet.get_public_did.return_value,
-            )
-            assert not response
+                assert mock_build_get_attrib_req.called_once_with(
+                    self.test_did,
+                    ledger.did_to_nym(self.test_did),
+                    "endpoint",
+                    None,
+                    None,
+                )
+                assert mock_submit.called_once_with(
+                    mock_build_get_attrib_req.return_value,
+                    sign_did=mock_wallet_get_public_did.return_value,
+                )
+                assert not response
 
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open")
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close")
@@ -2349,24 +2421,24 @@ async def test_update_endpoint_for_did_read_only(
         self, mock_submit, mock_build_get_attrib_req, mock_close, mock_open
    ):
         mock_wallet = async_mock.MagicMock()
-
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
        endpoint = "http://aries.ca"
         mock_submit.return_value = json.dumps(
             {"result": {"data": json.dumps({"endpoint": {"endpoint": endpoint}})}}
         )
         ledger = IndySdkLedger(
-            IndySdkLedgerPool("name", checked=True, read_only=True), mock_wallet
+            IndySdkLedgerPool("name", checked=True, read_only=True), self.profile
         )
-
-        async with ledger:
-            mock_wallet.get_public_did = async_mock.CoroutineMock(
-                return_value=TestIndySdkLedger.test_did_info
-            )
-            with self.assertRaises(LedgerError) as context:
-                await ledger.update_endpoint_for_did(
-                    TestIndySdkLedger.test_did, "distinct endpoint"
-                )
-            assert "read only" in str(context.exception)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            async with ledger:
+                with self.assertRaises(LedgerError) as context:
+                    await ledger.update_endpoint_for_did(
+                        self.test_did, "distinct endpoint"
+                    )
+                assert "read only" in str(context.exception)
 
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open")
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close")
@@ -2375,67 +2447,72 @@ async def test_update_endpoint_for_did_read_only(
     async def test_register_nym(
         self, mock_submit, mock_build_nym_req, mock_close, mock_open
    ):
-        mock_wallet = async_mock.MagicMock(
-            type="indy",
-            get_local_did=async_mock.CoroutineMock(
-                return_value=async_mock.MagicMock(metadata={"...": "..."})
-            ),
-            replace_local_did_metadata=async_mock.CoroutineMock(),
-        )
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
-        async with ledger:
-            mock_wallet.get_public_did = async_mock.CoroutineMock(
-                return_value=TestIndySdkLedger.test_did_info
-            )
-            await ledger.register_nym(
-                TestIndySdkLedger.test_did,
-                TestIndySdkLedger.test_verkey,
-                "alias",
-                None,
-            )
-
-            assert mock_build_nym_req.called_once_with(
-                TestIndySdkLedger.test_did,
-                TestIndySdkLedger.test_did,
-                TestIndySdkLedger.test_verkey,
-                "alias",
-                None,
-            )
-            assert mock_submit.called_once_with(
-                mock_build_nym_req.return_value,
-                True,
-                True,
-                sign_did=mock_wallet.get_public_did.return_value,
-            )
-            mock_wallet.replace_local_did_metadata.assert_called_once_with(
-                TestIndySdkLedger.test_did_info.did,
-                {
-                    "...": "...",
-                    **DIDPosture.POSTED.metadata,
-                },
+        mock_wallet = async_mock.MagicMock()
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did, async_mock.patch.object(
+            IndySdkWallet, "get_local_did"
+        ) as mock_wallet_get_local_did, async_mock.patch.object(
+            IndySdkWallet, "replace_local_did_metadata"
+        ) as mock_wallet_replace_local_did_metadata:
+            ledger = IndySdkLedger(
+                IndySdkLedgerPool("name", checked=True), self.profile
+            )
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            mock_wallet_get_local_did.return_value = self.test_did_info
+            mock_wallet_replace_local_did_metadata.return_value = (
+                async_mock.CoroutineMock()
             )
+            async with ledger:
+                await ledger.register_nym(
+                    self.test_did,
+                    self.test_verkey,
+                    "alias",
+                    None,
+                )
+                assert mock_build_nym_req.called_once_with(
+                    self.test_did,
+                    self.test_did,
+                    self.test_verkey,
+                    "alias",
+                    None,
+                )
+                assert mock_submit.called_once_with(
+                    mock_build_nym_req.return_value,
+                    True,
+                    True,
+                    sign_did=mock_wallet_get_public_did.return_value,
+                )
+                mock_wallet_replace_local_did_metadata.assert_called_once_with(
+                    self.test_did_info.did,
+                    {
+                        "test": "test",
+                        **DIDPosture.POSTED.metadata,
+                    },
+                )
 
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open")
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close")
     async def test_register_nym_read_only(self, mock_close, mock_open):
         mock_wallet = async_mock.MagicMock()
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
         ledger = IndySdkLedger(
-            IndySdkLedgerPool("name", checked=True, read_only=True), mock_wallet
+            IndySdkLedgerPool("name", checked=True, read_only=True), self.profile
         )
-
-        async with ledger:
-            mock_wallet.get_public_did = async_mock.CoroutineMock(
-                return_value=TestIndySdkLedger.test_did_info
-            )
-            with self.assertRaises(LedgerError) as context:
-                await ledger.register_nym(
-                    TestIndySdkLedger.test_did,
-                    TestIndySdkLedger.test_verkey,
-                    "alias",
-                    None,
-                )
-            assert "read only" in str(context.exception)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            async with ledger:
+                with self.assertRaises(LedgerError) as context:
+                    await ledger.register_nym(
+                        self.test_did,
+                        self.test_verkey,
+                        "alias",
+                        None,
+                    )
                assert "read only" in str(context.exception)
 
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open")
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close")
@@ -2445,17 +2522,20 @@ async def test_register_nym_no_public_did(self, mock_close, mock_open):
             get_local_did=async_mock.CoroutineMock(),
             replace_local_did_metadata=async_mock.CoroutineMock(),
         )
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
-        async with ledger:
-            mock_wallet.get_public_did = async_mock.CoroutineMock(return_value=None)
-            with self.assertRaises(WalletNotFoundError):
-                await ledger.register_nym(
-                    TestIndySdkLedger.test_did,
-                    TestIndySdkLedger.test_verkey,
-                    "alias",
-                    None,
-                )
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = None
+            async with ledger:
+                with self.assertRaises(WalletNotFoundError):
+                    await ledger.register_nym(
+                        self.test_did,
+                        self.test_verkey,
+                        "alias",
+                        None,
+                    )
 
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open")
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close")
@@ -2469,19 +2549,20 @@ async def test_register_nym_ledger_x(
             error_code=ErrorCode.CommonInvalidParam1,
             error_details={"message": "not today"},
         )
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
-        async with ledger:
-            mock_wallet.get_public_did = async_mock.CoroutineMock(
-                return_value=TestIndySdkLedger.test_did_info
-            )
-            with self.assertRaises(LedgerError):
-                await ledger.register_nym(
-                    TestIndySdkLedger.test_did,
-                    TestIndySdkLedger.test_verkey,
-                    "alias",
-                    None,
-                )
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            async with ledger:
+                with self.assertRaises(LedgerError):
+                    await ledger.register_nym(
+                        self.test_did,
+                        self.test_verkey,
+                        "alias",
+                        None,
+                    )
 
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open")
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close")
@@ -2490,38 +2571,42 @@ async def test_register_nym_ledger_x(
     async def test_register_nym_steward_register_others_did(
         self, mock_submit, mock_build_nym_req, mock_close, mock_open
    ):
-        mock_wallet = async_mock.MagicMock(
-            type="indy",
-            get_local_did=async_mock.CoroutineMock(side_effect=WalletNotFoundError()),
-            replace_local_did_metadata=async_mock.CoroutineMock(),
-        )
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
-        async with ledger:
-            mock_wallet.get_public_did = async_mock.CoroutineMock(
-                return_value=TestIndySdkLedger.test_did_info
-            )
-            await ledger.register_nym(
-                TestIndySdkLedger.test_did,
-                TestIndySdkLedger.test_verkey,
-                "alias",
-                None,
-            )
-
-            assert mock_build_nym_req.called_once_with(
-                TestIndySdkLedger.test_did,
-                TestIndySdkLedger.test_did,
-                TestIndySdkLedger.test_verkey,
-                "alias",
-                None,
-            )
-            assert mock_submit.called_once_with(
-                mock_build_nym_req.return_value,
-                True,
-                True,
-                sign_did=mock_wallet.get_public_did.return_value,
+        mock_wallet = async_mock.MagicMock()
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did, async_mock.patch.object(
+            IndySdkWallet, "get_local_did"
+        ) as mock_wallet_get_local_did, async_mock.patch.object(
+            IndySdkWallet, "replace_local_did_metadata"
+        ) as mock_wallet_replace_local_did_metadata:
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            mock_wallet_get_local_did.side_effect = WalletNotFoundError()
+            mock_wallet_replace_local_did_metadata.return_value = (
+                async_mock.CoroutineMock()
             )
-        mock_wallet.replace_local_did_metadata.assert_not_called()
+            async with ledger:
+                await ledger.register_nym(
+                    self.test_did,
+                    self.test_verkey,
+                    "alias",
+                    None,
+                )
+                assert mock_build_nym_req.called_once_with(
+                    self.test_did,
+                    self.test_did,
+                    self.test_verkey,
+                    "alias",
+                    None,
+                )
+                assert mock_submit.called_once_with(
+                    mock_build_nym_req.return_value,
+                    True,
+                    True,
+                    sign_did=mock_wallet_get_public_did.return_value,
+                )
+                mock_wallet_replace_local_did_metadata.assert_not_called()
 
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open")
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close")
@@ -2531,7 +2616,7 @@ async def test_get_nym_role(
         self, mock_submit, mock_build_get_nym_req, mock_close, mock_open
    ):
         mock_wallet = async_mock.MagicMock()
-
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
         mock_submit.return_value = json.dumps(
             {
                 "result": {
@@ -2570,21 +2655,18 @@ async def test_get_nym_role(
                 "op": "REPLY",
             }
         )
-
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
-        async with ledger:
-            mock_wallet.get_public_did = async_mock.CoroutineMock(
-                return_value=TestIndySdkLedger.test_did_info
-            )
-            assert (
-                await ledger.get_nym_role(TestIndySdkLedger.test_did) == Role.ENDORSER
-            )
-            assert mock_build_get_nym_req.called_once_with(
-                TestIndySdkLedger.test_did,
-                TestIndySdkLedger.test_did,
-            )
-            assert mock_submit.called_once_with(mock_build_get_nym_req.return_value)
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            async with ledger:
+                assert await ledger.get_nym_role(self.test_did) == Role.ENDORSER
+                assert mock_build_get_nym_req.called_once_with(
+                    self.test_did,
+                    self.test_did,
+                )
+                assert mock_submit.called_once_with(mock_build_get_nym_req.return_value)
 
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open")
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close")
@@ -2593,21 +2675,20 @@ async def test_get_nym_role_indy_x(
         self, mock_build_get_nym_req, mock_close, mock_open
    ):
         mock_wallet = async_mock.MagicMock()
-
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
         mock_build_get_nym_req.side_effect = IndyError(
             error_code=ErrorCode.CommonInvalidParam1,
             error_details={"message": "not today"},
         )
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
-        async with ledger:
-            mock_wallet.get_public_did = async_mock.CoroutineMock(
-                return_value=TestIndySdkLedger.test_did_info
-            )
-
-            with self.assertRaises(LedgerError) as context:
-                await ledger.get_nym_role(TestIndySdkLedger.test_did)
-            assert "not today" in context.exception.message
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            async with ledger:
+                with self.assertRaises(LedgerError) as context:
+                    await ledger.get_nym_role(self.test_did)
+                assert "not today" in context.exception.message
 
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open")
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close")
@@ -2617,7 +2698,7 @@ async def test_get_nym_role_did_not_public_x(
         self, mock_submit, mock_build_get_nym_req, mock_close, mock_open
    ):
         mock_wallet = async_mock.MagicMock()
-
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
         mock_submit.return_value = json.dumps(
             {
                 "result": {
@@ -2647,15 +2728,14 @@ async def test_get_nym_role_did_not_public_x(
                 "op": "REPLY",
             }
        )
-
-        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet)
-
-        async with ledger:
-            mock_wallet.get_public_did = async_mock.CoroutineMock(
-                return_value=TestIndySdkLedger.test_did_info
-            )
-            with self.assertRaises(BadLedgerRequestError):
-                await ledger.get_nym_role(TestIndySdkLedger.test_did)
+        ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile)
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            async with ledger:
+                with self.assertRaises(BadLedgerRequestError):
+                    await ledger.get_nym_role(self.test_did)
 
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open")
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close")
@@ -2672,15 +2752,8 @@ async def test_rotate_public_did_keypair(
         mock_close,
         mock_open,
    ):
-        mock_wallet
= async_mock.MagicMock( - get_public_did=async_mock.CoroutineMock( - return_value=TestIndySdkLedger.test_did_info - ), - rotate_did_keypair_start=async_mock.CoroutineMock( - return_value=TestIndySdkLedger.test_verkey - ), - rotate_did_keypair_apply=async_mock.CoroutineMock(return_value=None), - ) + mock_wallet = async_mock.MagicMock() + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) mock_submit.side_effect = [ json.dumps({"result": {"data": json.dumps({"seqNo": 1234})}}), json.dumps( @@ -2691,10 +2764,19 @@ async def test_rotate_public_did_keypair( } ), ] - - ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet) - async with ledger: - await ledger.rotate_public_did_keypair() + ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile) + with async_mock.patch.object( + IndySdkWallet, "get_public_did" + ) as mock_wallet_get_public_did, async_mock.patch.object( + IndySdkWallet, "rotate_did_keypair_start", autospec=True + ) as mock_wallet_rotate_did_keypair_start, async_mock.patch.object( + IndySdkWallet, "rotate_did_keypair_apply", autospec=True + ) as mock_wallet_rotate_did_keypair_apply: + mock_wallet_get_public_did.return_value = self.test_did_info + mock_wallet_rotate_did_keypair_start.return_value = self.test_verkey + mock_wallet_rotate_did_keypair_apply.return_value = None + async with ledger: + await ledger.rotate_public_did_keypair() @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") @@ -2703,21 +2785,24 @@ async def test_rotate_public_did_keypair( async def test_rotate_public_did_keypair_no_nym( self, mock_submit, mock_build_get_nym_request, mock_close, mock_open ): - mock_wallet = async_mock.MagicMock( - get_public_did=async_mock.CoroutineMock( - return_value=TestIndySdkLedger.test_did_info - ), - rotate_did_keypair_start=async_mock.CoroutineMock( - 
return_value=TestIndySdkLedger.test_verkey - ), - rotate_did_keypair_apply=async_mock.CoroutineMock(return_value=None), - ) + mock_wallet = async_mock.MagicMock() + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) mock_submit.return_value = json.dumps({"result": {"data": json.dumps(None)}}) + ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile) - ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet) - async with ledger: - with self.assertRaises(BadLedgerRequestError): - await ledger.rotate_public_did_keypair() + with async_mock.patch.object( + IndySdkWallet, "get_public_did" + ) as mock_wallet_get_public_did, async_mock.patch.object( + IndySdkWallet, "rotate_did_keypair_start", autospec=True + ) as mock_wallet_rotate_did_keypair_start, async_mock.patch.object( + IndySdkWallet, "rotate_did_keypair_apply", autospec=True + ) as mock_wallet_rotate_did_keypair_apply: + mock_wallet_get_public_did.return_value = self.test_did_info + mock_wallet_rotate_did_keypair_start.return_value = self.test_verkey + mock_wallet_rotate_did_keypair_apply.return_value = None + async with ledger: + with self.assertRaises(BadLedgerRequestError): + await ledger.rotate_public_did_keypair() @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") @@ -2734,24 +2819,26 @@ async def test_rotate_public_did_keypair_corrupt_nym_txn( mock_close, mock_open, ): - mock_wallet = async_mock.MagicMock( - get_public_did=async_mock.CoroutineMock( - return_value=TestIndySdkLedger.test_did_info - ), - rotate_did_keypair_start=async_mock.CoroutineMock( - return_value=TestIndySdkLedger.test_verkey - ), - rotate_did_keypair_apply=async_mock.CoroutineMock(return_value=None), - ) + mock_wallet = async_mock.MagicMock() + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) mock_submit.side_effect = [ json.dumps({"result": 
{"data": json.dumps({"seqNo": 1234})}}), json.dumps({"result": {"data": None}}), ] - - ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet) - async with ledger: - with self.assertRaises(BadLedgerRequestError): - await ledger.rotate_public_did_keypair() + ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile) + with async_mock.patch.object( + IndySdkWallet, "get_public_did" + ) as mock_wallet_get_public_did, async_mock.patch.object( + IndySdkWallet, "rotate_did_keypair_start", autospec=True + ) as mock_wallet_rotate_did_keypair_start, async_mock.patch.object( + IndySdkWallet, "rotate_did_keypair_apply", autospec=True + ) as mock_wallet_rotate_did_keypair_apply: + mock_wallet_get_public_did.return_value = self.test_did_info + mock_wallet_rotate_did_keypair_start.return_value = self.test_verkey + mock_wallet_rotate_did_keypair_apply.return_value = None + async with ledger: + with self.assertRaises(BadLedgerRequestError): + await ledger.rotate_public_did_keypair() @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") @@ -2767,6 +2854,7 @@ async def test_get_revoc_reg_def( mock_open, ): mock_wallet = async_mock.MagicMock() + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) mock_indy_parse_get_rrdef_resp.return_value = ( "rr-id", json.dumps({"...": "..."}), @@ -2774,16 +2862,15 @@ async def test_get_revoc_reg_def( mock_submit.return_value = json.dumps({"result": {"txnTime": 1234567890}}) ledger = IndySdkLedger( - IndySdkLedgerPool("name", checked=True, read_only=True), mock_wallet + IndySdkLedgerPool("name", checked=True, read_only=True), self.profile ) - - async with ledger: - mock_wallet.get_public_did = async_mock.CoroutineMock( - return_value=TestIndySdkLedger.test_did_info - ) - - result = await ledger.get_revoc_reg_def("rr-id") - assert result == {"...": "...", "txnTime": 1234567890} + with 
async_mock.patch.object( + IndySdkWallet, "get_public_did" + ) as mock_wallet_get_public_did: + mock_wallet_get_public_did.return_value = self.test_did_info + async with ledger: + result = await ledger.get_revoc_reg_def("rr-id") + assert result == {"...": "...", "txnTime": 1234567890} @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") @@ -2797,19 +2884,18 @@ async def test_get_revoc_reg_def_indy_x( error_code=ErrorCode.CommonInvalidParam1, error_details={"message": "not today"}, ) - + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) ledger = IndySdkLedger( - IndySdkLedgerPool("name", checked=True, read_only=True), mock_wallet + IndySdkLedgerPool("name", checked=True, read_only=True), self.profile ) - - async with ledger: - mock_wallet.get_public_did = async_mock.CoroutineMock( - return_value=TestIndySdkLedger.test_did_info - ) - - with self.assertRaises(IndyError) as context: - await ledger.get_revoc_reg_def("rr-id") - assert "not today" in context.exception.message + with async_mock.patch.object( + IndySdkWallet, "get_public_did" + ) as mock_wallet_get_public_did: + mock_wallet_get_public_did.return_value = self.test_did_info + async with ledger: + with self.assertRaises(IndyError) as context: + await ledger.get_revoc_reg_def("rr-id") + assert "not today" in context.exception.message @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") @@ -2825,9 +2911,7 @@ async def test_get_revoc_reg_entry( mock_open, ): mock_wallet = async_mock.MagicMock() - mock_wallet.get_public_did = async_mock.CoroutineMock( - return_value=TestIndySdkLedger.test_did_info - ) + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) mock_indy_parse_get_rr_resp.return_value = ( "rr-id", '{"hello": "world"}', @@ -2835,12 +2919,15 @@ async def 
test_get_revoc_reg_entry( ) ledger = IndySdkLedger( - IndySdkLedgerPool("name", checked=True, read_only=True), mock_wallet + IndySdkLedgerPool("name", checked=True, read_only=True), self.profile ) - - async with ledger: - (result, _) = await ledger.get_revoc_reg_entry("rr-id", 1234567890) - assert result == {"hello": "world"} + with async_mock.patch.object( + IndySdkWallet, "get_public_did" + ) as mock_wallet_get_public_did: + mock_wallet_get_public_did.return_value = self.test_did_info + async with ledger: + (result, _) = await ledger.get_revoc_reg_entry("rr-id", 1234567890) + assert result == {"hello": "world"} @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") @@ -2856,20 +2943,21 @@ async def test_get_revoc_reg_entry_x( mock_open, ): mock_wallet = async_mock.MagicMock() - mock_wallet.get_public_did = async_mock.CoroutineMock( - return_value=TestIndySdkLedger.test_did_info - ) + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) mock_indy_parse_get_rr_resp.side_effect = IndyError( error_code=ErrorCode.PoolLedgerTimeout, error_details={"message": "bye"}, ) ledger = IndySdkLedger( - IndySdkLedgerPool("name", checked=True, read_only=True), mock_wallet + IndySdkLedgerPool("name", checked=True, read_only=True), self.profile ) - - with self.assertRaises(LedgerError): - async with ledger: - await ledger.get_revoc_reg_entry("rr-id", 1234567890) + with async_mock.patch.object( + IndySdkWallet, "get_public_did" + ) as mock_wallet_get_public_did: + mock_wallet_get_public_did.return_value = self.test_did_info + with self.assertRaises(LedgerError): + async with ledger: + await ledger.get_revoc_reg_entry("rr-id", 1234567890) @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") @@ -2885,6 +2973,7 @@ async def test_get_revoc_reg_delta( 
mock_open, ): mock_wallet = async_mock.MagicMock() + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) mock_indy_parse_get_rrd_resp.return_value = ( "rr-id", '{"hello": "world"}', @@ -2892,16 +2981,15 @@ async def test_get_revoc_reg_delta( ) ledger = IndySdkLedger( - IndySdkLedgerPool("name", checked=True, read_only=True), mock_wallet + IndySdkLedgerPool("name", checked=True, read_only=True), self.profile ) - - async with ledger: - mock_wallet.get_public_did = async_mock.CoroutineMock( - return_value=TestIndySdkLedger.test_did_info - ) - - (result, _) = await ledger.get_revoc_reg_delta("rr-id") - assert result == {"hello": "world"} + with async_mock.patch.object( + IndySdkWallet, "get_public_did" + ) as mock_wallet_get_public_did: + mock_wallet_get_public_did.return_value = self.test_did_info + async with ledger: + (result, _) = await ledger.get_revoc_reg_delta("rr-id") + assert result == {"hello": "world"} @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") @@ -2911,25 +2999,28 @@ async def test_send_revoc_reg_def_public_did( self, mock_indy_build_rrdef_req, mock_submit, mock_close, mock_open ): mock_wallet = async_mock.MagicMock() + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) mock_indy_build_rrdef_req.return_value = '{"hello": "world"}' ledger = IndySdkLedger( - IndySdkLedgerPool("name", checked=True, read_only=True), mock_wallet + IndySdkLedgerPool("name", checked=True, read_only=True), self.profile ) - - async with ledger: - mock_wallet.get_public_did = async_mock.CoroutineMock( - return_value=TestIndySdkLedger.test_did_info - ) - await ledger.send_revoc_reg_def({"rr": "def"}, issuer_did=None) - mock_wallet.get_public_did.assert_called_once() - assert not mock_wallet.get_local_did.called - mock_submit.assert_called_once_with( - mock_indy_build_rrdef_req.return_value, - True, - 
sign_did=TestIndySdkLedger.test_did_info, - write_ledger=True, - ) + with async_mock.patch.object( + IndySdkWallet, "get_public_did" + ) as mock_wallet_get_public_did, async_mock.patch.object( + IndySdkWallet, "get_local_did" + ) as mock_wallet_get_local_did: + mock_wallet_get_public_did.return_value = self.test_did_info + async with ledger: + await ledger.send_revoc_reg_def({"rr": "def"}, issuer_did=None) + mock_wallet_get_public_did.assert_called_once() + assert not mock_wallet_get_local_did.called + mock_submit.assert_called_once_with( + mock_indy_build_rrdef_req.return_value, + True, + sign_did=self.test_did_info, + write_ledger=True, + ) @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") @@ -2939,30 +3030,28 @@ async def test_send_revoc_reg_def_local_did( self, mock_indy_build_rrdef_req, mock_submit, mock_close, mock_open ): mock_wallet = async_mock.MagicMock() + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) mock_indy_build_rrdef_req.return_value = '{"hello": "world"}' ledger = IndySdkLedger( - IndySdkLedgerPool("name", checked=True, read_only=True), mock_wallet + IndySdkLedgerPool("name", checked=True, read_only=True), self.profile ) - - async with ledger: - mock_wallet.get_local_did = async_mock.CoroutineMock( - return_value=TestIndySdkLedger.test_did_info - ) - await ledger.send_revoc_reg_def( - {"rr": "def"}, - issuer_did=TestIndySdkLedger.test_did, - ) - mock_wallet.get_local_did.assert_called_once_with( - TestIndySdkLedger.test_did - ) - assert not mock_wallet.get_public_did.called - mock_submit.assert_called_once_with( - mock_indy_build_rrdef_req.return_value, - True, - sign_did=TestIndySdkLedger.test_did_info, - write_ledger=True, - ) + with async_mock.patch.object( + IndySdkWallet, "get_local_did" + ) as mock_wallet_get_local_did: + mock_wallet_get_local_did.return_value = self.test_did_info + async with ledger: + 
await ledger.send_revoc_reg_def( + {"rr": "def"}, + issuer_did=self.test_did, + ) + mock_wallet_get_local_did.assert_called_once_with(self.test_did) + mock_submit.assert_called_once_with( + mock_indy_build_rrdef_req.return_value, + True, + sign_did=self.test_did_info, + write_ledger=True, + ) @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") @@ -2972,22 +3061,25 @@ async def test_send_revoc_reg_def_x_no_did( self, mock_indy_build_rrdef_req, mock_submit, mock_close, mock_open ): mock_wallet = async_mock.MagicMock() + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) mock_indy_build_rrdef_req.return_value = '{"hello": "world"}' ledger = IndySdkLedger( - IndySdkLedgerPool("name", checked=True, read_only=True), mock_wallet + IndySdkLedgerPool("name", checked=True, read_only=True), self.profile ) - - async with ledger: - mock_wallet.get_local_did = async_mock.CoroutineMock(return_value=None) - with self.assertRaises(LedgerTransactionError) as context: - await ledger.send_revoc_reg_def( - {"rr": "def"}, - issuer_did=TestIndySdkLedger.test_did, + with async_mock.patch.object( + IndySdkWallet, "get_local_did" + ) as mock_wallet_get_local_did: + mock_wallet_get_local_did.return_value = None + async with ledger: + with self.assertRaises(LedgerTransactionError) as context: + await ledger.send_revoc_reg_def( + {"rr": "def"}, + issuer_did=self.test_did, + ) + assert "No issuer DID found for revocation registry definition" in str( + context.exception ) - assert "No issuer DID found for revocation registry definition" in str( - context.exception - ) @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") @@ -2997,27 +3089,30 @@ async def test_send_revoc_reg_entry_public_did( self, mock_indy_build_rre_req, mock_submit, mock_close, mock_open ): 
mock_wallet = async_mock.MagicMock() + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) mock_indy_build_rre_req.return_value = '{"hello": "world"}' ledger = IndySdkLedger( - IndySdkLedgerPool("name", checked=True, read_only=True), mock_wallet + IndySdkLedgerPool("name", checked=True, read_only=True), self.profile ) - - async with ledger: - mock_wallet.get_public_did = async_mock.CoroutineMock( - return_value=TestIndySdkLedger.test_did_info - ) - await ledger.send_revoc_reg_entry( - "rr-id", "CL_ACCUM", {"rev-reg": "entry"}, issuer_did=None - ) - mock_wallet.get_public_did.assert_called_once() - assert not mock_wallet.get_local_did.called - mock_submit.assert_called_once_with( - mock_indy_build_rre_req.return_value, - True, - sign_did=TestIndySdkLedger.test_did_info, - write_ledger=True, - ) + with async_mock.patch.object( + IndySdkWallet, "get_public_did" + ) as mock_wallet_get_public_did, async_mock.patch.object( + IndySdkWallet, "get_local_did" + ) as mock_wallet_get_local_did: + mock_wallet_get_public_did.return_value = self.test_did_info + async with ledger: + await ledger.send_revoc_reg_entry( + "rr-id", "CL_ACCUM", {"rev-reg": "entry"}, issuer_did=None + ) + mock_wallet_get_public_did.assert_called_once() + assert not mock_wallet_get_local_did.called + mock_submit.assert_called_once_with( + mock_indy_build_rre_req.return_value, + True, + sign_did=self.test_did_info, + write_ledger=True, + ) @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") @@ -3027,32 +3122,30 @@ async def test_send_revoc_reg_entry_local_did( self, mock_indy_build_rre_req, mock_submit, mock_close, mock_open ): mock_wallet = async_mock.MagicMock() + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) mock_indy_build_rre_req.return_value = '{"hello": "world"}' ledger = IndySdkLedger( - IndySdkLedgerPool("name", checked=True, read_only=True), 
mock_wallet + IndySdkLedgerPool("name", checked=True, read_only=True), self.profile ) - - async with ledger: - mock_wallet.get_local_did = async_mock.CoroutineMock( - return_value=TestIndySdkLedger.test_did_info - ) - result = await ledger.send_revoc_reg_entry( - "rr-id", - "CL_ACCUM", - {"rev-reg": "entry"}, - issuer_did=TestIndySdkLedger.test_did, - ) - mock_wallet.get_local_did.assert_called_once_with( - TestIndySdkLedger.test_did - ) - assert not mock_wallet.get_public_did.called - mock_submit.assert_called_once_with( - mock_indy_build_rre_req.return_value, - True, - sign_did=TestIndySdkLedger.test_did_info, - write_ledger=True, - ) + with async_mock.patch.object( + IndySdkWallet, "get_local_did" + ) as mock_wallet_get_local_did: + mock_wallet_get_local_did.return_value = self.test_did_info + async with ledger: + result = await ledger.send_revoc_reg_entry( + "rr-id", + "CL_ACCUM", + {"rev-reg": "entry"}, + issuer_did=self.test_did, + ) + mock_wallet_get_local_did.assert_called_once_with(self.test_did) + mock_submit.assert_called_once_with( + mock_indy_build_rre_req.return_value, + True, + sign_did=self.test_did_info, + write_ledger=True, + ) @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") @@ -3062,24 +3155,27 @@ async def test_send_revoc_reg_entry_x_no_did( self, mock_indy_build_rre_req, mock_submit, mock_close, mock_open ): mock_wallet = async_mock.MagicMock() + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) mock_indy_build_rre_req.return_value = '{"hello": "world"}' ledger = IndySdkLedger( - IndySdkLedgerPool("name", checked=True, read_only=True), mock_wallet + IndySdkLedgerPool("name", checked=True, read_only=True), self.profile ) - - async with ledger: - mock_wallet.get_local_did = async_mock.CoroutineMock(return_value=None) - with self.assertRaises(LedgerTransactionError) as context: - await ledger.send_revoc_reg_entry( - 
"rr-id", - "CL_ACCUM", - {"rev-reg": "entry"}, - issuer_did=TestIndySdkLedger.test_did, + with async_mock.patch.object( + IndySdkWallet, "get_local_did" + ) as mock_wallet_get_local_did: + mock_wallet_get_local_did.return_value = None + async with ledger: + with self.assertRaises(LedgerTransactionError) as context: + await ledger.send_revoc_reg_entry( + "rr-id", + "CL_ACCUM", + {"rev-reg": "entry"}, + issuer_did=self.test_did, + ) + assert "No issuer DID found for revocation registry entry" in str( + context.exception ) - assert "No issuer DID found for revocation registry entry" in str( - context.exception - ) @async_mock.patch("indy.pool.open_pool_ledger") @async_mock.patch("indy.pool.close_pool_ledger") @@ -3089,16 +3185,15 @@ async def test_taa_digest_bad_value( mock_open_ledger, ): mock_wallet = async_mock.MagicMock() - - ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet) - - async with ledger: - mock_wallet.get_public_did = async_mock.CoroutineMock( - return_value=TestIndySdkLedger.test_did_info - ) - - with self.assertRaises(ValueError): - await ledger.taa_digest(None, None) + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) + ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile) + with async_mock.patch.object( + IndySdkWallet, "get_public_did" + ) as mock_wallet_get_public_did: + mock_wallet_get_public_did.return_value = self.test_did_info + async with ledger: + with self.assertRaises(ValueError): + await ledger.taa_digest(None, None) @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") @@ -3114,49 +3209,48 @@ async def test_get_txn_author_agreement( mock_open, ): mock_wallet = async_mock.MagicMock() - + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) txn_result_data = {"text": "text", "version": "1.0"} mock_submit.side_effect = [ json.dumps({"result": 
{"data": txn_result_data}}) for i in range(2) ] + ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), self.profile) + with async_mock.patch.object( + IndySdkWallet, "get_public_did" + ) as mock_wallet_get_public_did: + mock_wallet_get_public_did.return_value = self.test_did_info + async with ledger: + response = await ledger.get_txn_author_agreement(reload=True) - ledger = IndySdkLedger(IndySdkLedgerPool("name", checked=True), mock_wallet) - - async with ledger: - mock_wallet.get_public_did = async_mock.CoroutineMock( - return_value=TestIndySdkLedger.test_did_info - ) - response = await ledger.get_txn_author_agreement(reload=True) - - assert mock_build_get_acc_mech_req.called_once_with( - TestIndySdkLedger.test_did, None, None - ) - assert mock_build_get_taa_req.called_once_with( - TestIndySdkLedger.test_did, - None, - ) - mock_submit.assert_has_calls( - [ - async_mock.call( - mock_build_get_acc_mech_req.return_value, - sign_did=mock_wallet.get_public_did.return_value, - ), - async_mock.call( - mock_build_get_taa_req.return_value, - sign_did=mock_wallet.get_public_did.return_value, - ), - ] - ) - assert response == { - "aml_record": txn_result_data, - "taa_record": { - **txn_result_data, - "digest": ledger.taa_digest( - txn_result_data["version"], txn_result_data["text"] - ), - }, - "taa_required": True, - } + assert mock_build_get_acc_mech_req.called_once_with( + self.test_did, None, None + ) + assert mock_build_get_taa_req.called_once_with( + self.test_did, + None, + ) + mock_submit.assert_has_calls( + [ + async_mock.call( + mock_build_get_acc_mech_req.return_value, + sign_did=mock_wallet_get_public_did.return_value, + ), + async_mock.call( + mock_build_get_taa_req.return_value, + sign_did=mock_wallet_get_public_did.return_value, + ), + ] + ) + assert response == { + "aml_record": txn_result_data, + "taa_record": { + **txn_result_data, + "digest": ledger.taa_digest( + txn_result_data["version"], txn_result_data["text"] + ), + }, + "taa_required": 
True, + } @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") @@ -3166,9 +3260,9 @@ async def test_accept_and_get_latest_txn_author_agreement( self, mock_find_all_records, mock_add_record, mock_close, mock_open ): mock_wallet = async_mock.MagicMock() - + self.session.context.injector.bind_provider(BaseWallet, mock_wallet) ledger = IndySdkLedger( - IndySdkLedgerPool("name", checked=True, cache=InMemoryCache()), mock_wallet + IndySdkLedgerPool("name", checked=True, cache=InMemoryCache()), self.profile ) accept_time = ledger.taa_rough_timestamp() @@ -3192,18 +3286,21 @@ async def test_accept_and_get_latest_txn_author_agreement( {"pool_name": ledger.pool_name}, ) ] + with async_mock.patch.object( + IndySdkWallet, "get_public_did" + ) as mock_wallet_get_public_did: + mock_wallet_get_public_did.return_value = self.test_did_info + async with ledger: + await ledger.accept_txn_author_agreement( + taa_record=taa_record, mechanism="dummy", accept_time=None + ) - async with ledger: - await ledger.accept_txn_author_agreement( - taa_record=taa_record, mechanism="dummy", accept_time=None - ) - - await ledger.pool.cache.clear( - f"{TAA_ACCEPTED_RECORD_TYPE}::{ledger.pool_name}" - ) - for i in range(2): # populate, then get from, cache - response = await ledger.get_latest_txn_author_acceptance() - assert response == acceptance + await ledger.pool.cache.clear( + f"{TAA_ACCEPTED_RECORD_TYPE}::{ledger.pool_name}" + ) + for i in range(2): # populate, then get from, cache + response = await ledger.get_latest_txn_author_acceptance() + assert response == acceptance @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open") @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close") @@ -3212,19 +3309,22 @@ async def test_get_latest_txn_author_agreement_none( self, mock_find_all_records, mock_close, mock_open ): mock_wallet = 
async_mock.MagicMock()
-
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
         ledger = IndySdkLedger(
-            IndySdkLedgerPool("name", checked=True, cache=InMemoryCache()), mock_wallet
+            IndySdkLedgerPool("name", checked=True, cache=InMemoryCache()), self.profile
         )
         mock_find_all_records.return_value = []
-
-        async with ledger:
-            await ledger.pool.cache.clear(
-                f"{TAA_ACCEPTED_RECORD_TYPE}::{ledger.pool_name}"
-            )
-            response = await ledger.get_latest_txn_author_acceptance()
-            assert response == {}
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            async with ledger:
+                await ledger.pool.cache.clear(
+                    f"{TAA_ACCEPTED_RECORD_TYPE}::{ledger.pool_name}"
+                )
+                response = await ledger.get_latest_txn_author_acceptance()
+                assert response == {}
 
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_open")
     @async_mock.patch("aries_cloudagent.ledger.indy.IndySdkLedgerPool.context_close")
@@ -3233,27 +3333,30 @@ async def test_credential_definition_id2schema_id(
         self, mock_get_schema, mock_close, mock_open
     ):
         mock_wallet = async_mock.MagicMock()
-
-        S_ID = f"{TestIndySdkLedger.test_did}:2:favourite_drink:1.0"
+        self.session.context.injector.bind_provider(BaseWallet, mock_wallet)
+        S_ID = f"{self.test_did}:2:favourite_drink:1.0"
         SEQ_NO = "9999"
         mock_get_schema.return_value = {"id": S_ID}
         ledger = IndySdkLedger(
-            IndySdkLedgerPool("name", checked=True, cache=InMemoryCache()), mock_wallet
+            IndySdkLedgerPool("name", checked=True, cache=InMemoryCache()), self.profile
         )
+        with async_mock.patch.object(
+            IndySdkWallet, "get_public_did"
+        ) as mock_wallet_get_public_did:
+            mock_wallet_get_public_did.return_value = self.test_did_info
+            async with ledger:
+                s_id_short = await ledger.credential_definition_id2schema_id(
+                    f"{self.test_did}:3:CL:{SEQ_NO}:tag"
+                )
-        async with ledger:
-            s_id_short = await ledger.credential_definition_id2schema_id(
-                f"{TestIndySdkLedger.test_did}:3:CL:{SEQ_NO}:tag"
-            )
-
-        mock_get_schema.assert_called_once_with(SEQ_NO)
+            mock_get_schema.assert_called_once_with(SEQ_NO)
-        assert s_id_short == S_ID
-        s_id_long = await ledger.credential_definition_id2schema_id(
-            f"{TestIndySdkLedger.test_did}:3:CL:{s_id_short}:tag"
-        )
-        assert s_id_long == s_id_short
+            assert s_id_short == S_ID
+            s_id_long = await ledger.credential_definition_id2schema_id(
+                f"{self.test_did}:3:CL:{s_id_short}:tag"
+            )
+            assert s_id_long == s_id_short
 
     def test_error_handler(self):
         try:  # with self.assertRaises() makes a copy of exception, loses traceback!
diff --git a/aries_cloudagent/ledger/tests/test_routes.py b/aries_cloudagent/ledger/tests/test_routes.py
index 46b3cc00c1..ea085cfff4 100644
--- a/aries_cloudagent/ledger/tests/test_routes.py
+++ b/aries_cloudagent/ledger/tests/test_routes.py
@@ -1,8 +1,16 @@
 from asynctest import mock as async_mock, TestCase as AsyncTestCase
 
 from ...admin.request_context import AdminRequestContext
+from ...core.in_memory import InMemoryProfile
 from ...ledger.base import BaseLedger
 from ...ledger.endpoint_type import EndpointType
+from ...ledger.multiple_ledger.ledger_requests_executor import (
+    IndyLedgerRequestsExecutor,
+)
+from ...ledger.multiple_ledger.base_manager import (
+    BaseMultipleLedgerManager,
+    MultipleLedgerManagerError,
+)
 from .. import routes as test_module
 from ..indy import Role
@@ -12,8 +20,10 @@ class TestLedgerRoutes(AsyncTestCase):
     def setUp(self):
         self.ledger = async_mock.create_autospec(BaseLedger)
         self.ledger.pool_name = "pool.0"
-        self.session_inject = {BaseLedger: self.ledger}
-        self.context = AdminRequestContext.test_context(self.session_inject)
+        self.profile = InMemoryProfile.test_profile()
+        self.context = self.profile.context
+        setattr(self.context, "profile", self.profile)
+        self.profile.context.injector.bind_instance(BaseLedger, self.ledger)
         self.request_dict = {
             "context": self.context,
             "outbound_message_router": async_mock.CoroutineMock(),
@@ -32,12 +42,21 @@ def setUp(self):
         self.test_endpoint_type_profile = "http://company.com/profile"
 
     async def test_missing_ledger(self):
-        self.session_inject[BaseLedger] = None
-
+        self.profile.context.injector.bind_instance(
+            IndyLedgerRequestsExecutor,
+            async_mock.MagicMock(
+                get_ledger_for_identifier=async_mock.CoroutineMock(return_value=None)
+            ),
+        )
+        self.profile.context.injector.clear_binding(BaseLedger)
         with self.assertRaises(test_module.web.HTTPForbidden):
             await test_module.register_ledger_nym(self.request)
 
+        with self.assertRaises(test_module.web.HTTPBadRequest):
+            await test_module.get_nym_role(self.request)
+
+        with self.assertRaises(test_module.web.HTTPForbidden):
+            self.request.query["did"] = "test"
             await test_module.get_nym_role(self.request)
 
         with self.assertRaises(test_module.web.HTTPForbidden):
@@ -55,7 +74,15 @@ async def test_missing_ledger(self):
         with self.assertRaises(test_module.web.HTTPForbidden):
             await test_module.ledger_get_taa(self.request)
 
-    async def test_get_verkey(self):
+    async def test_get_verkey_a(self):
+        self.profile.context.injector.bind_instance(
+            IndyLedgerRequestsExecutor,
+            async_mock.MagicMock(
+                get_ledger_for_identifier=async_mock.CoroutineMock(
+                    return_value=self.ledger
+                )
+            ),
+        )
         self.request.query = {"did": self.test_did}
         with async_mock.patch.object(
             test_module.web, "json_response", async_mock.Mock()
@@ -67,24 +94,71 @@ async def test_get_verkey(self):
             )
             assert result is json_response.return_value
 
+    async def test_get_verkey_b(self):
+        self.profile.context.injector.bind_instance(
+            IndyLedgerRequestsExecutor,
+            async_mock.MagicMock(
+                get_ledger_for_identifier=async_mock.CoroutineMock(
+                    return_value=("test_ledger_id", self.ledger)
+                )
+            ),
+        )
+        self.request.query = {"did": self.test_did}
+        with async_mock.patch.object(
+            test_module.web, "json_response", async_mock.Mock()
+        ) as json_response:
+            self.ledger.get_key_for_did.return_value = self.test_verkey
+            result = await test_module.get_did_verkey(self.request)
+            json_response.assert_called_once_with(
+                {
+                    "ledger_id": "test_ledger_id",
+                    "verkey": self.ledger.get_key_for_did.return_value,
+                }
+            )
+            assert result is json_response.return_value
+
     async def test_get_verkey_no_did(self):
         self.request.query = {"no": "did"}
         with self.assertRaises(test_module.web.HTTPBadRequest):
             await test_module.get_did_verkey(self.request)
 
     async def test_get_verkey_did_not_public(self):
+        self.profile.context.injector.bind_instance(
+            IndyLedgerRequestsExecutor,
+            async_mock.MagicMock(
+                get_ledger_for_identifier=async_mock.CoroutineMock(
+                    return_value=("test_ledger_id", self.ledger)
+                )
+            ),
+        )
         self.request.query = {"did": self.test_did}
         self.ledger.get_key_for_did.return_value = None
         with self.assertRaises(test_module.web.HTTPNotFound):
             await test_module.get_did_verkey(self.request)
 
     async def test_get_verkey_x(self):
+        self.profile.context.injector.bind_instance(
+            IndyLedgerRequestsExecutor,
+            async_mock.MagicMock(
+                get_ledger_for_identifier=async_mock.CoroutineMock(
+                    return_value=self.ledger
+                )
+            ),
+        )
         self.request.query = {"did": self.test_did}
         self.ledger.get_key_for_did.side_effect = test_module.LedgerError()
         with self.assertRaises(test_module.web.HTTPBadRequest):
             await test_module.get_did_verkey(self.request)
 
     async def test_get_endpoint(self):
+        self.profile.context.injector.bind_instance(
+            IndyLedgerRequestsExecutor,
+            async_mock.MagicMock(
+                get_ledger_for_identifier=async_mock.CoroutineMock(
+                    return_value=self.ledger
+                )
+            ),
+        )
         self.request.query = {"did": self.test_did}
         with async_mock.patch.object(
             test_module.web, "json_response", async_mock.Mock()
@@ -97,6 +171,14 @@ async def test_get_endpoint(self):
             assert result is json_response.return_value
 
     async def test_get_endpoint_of_type_profile(self):
+        self.profile.context.injector.bind_instance(
+            IndyLedgerRequestsExecutor,
+            async_mock.MagicMock(
+                get_ledger_for_identifier=async_mock.CoroutineMock(
+                    return_value=("test_ledger_id", self.ledger)
+                )
+            ),
+        )
         self.request.query = {
             "did": self.test_did,
             "endpoint_type": self.test_endpoint_type.w3c,
@@ -109,7 +191,10 @@ async def test_get_endpoint_of_type_profile(self):
             )
             result = await test_module.get_did_endpoint(self.request)
             json_response.assert_called_once_with(
-                {"endpoint": self.ledger.get_endpoint_for_did.return_value}
+                {
+                    "ledger_id": "test_ledger_id",
+                    "endpoint": self.ledger.get_endpoint_for_did.return_value,
+                }
             )
             assert result is json_response.return_value
 
@@ -119,6 +204,14 @@ async def test_get_endpoint_no_did(self):
             await test_module.get_did_endpoint(self.request)
 
     async def test_get_endpoint_x(self):
+        self.profile.context.injector.bind_instance(
+            IndyLedgerRequestsExecutor,
+            async_mock.MagicMock(
+                get_ledger_for_identifier=async_mock.CoroutineMock(
+                    return_value=("test_ledger_id", self.ledger)
+                )
+            ),
+        )
         self.request.query = {"did": self.test_did}
         self.ledger.get_endpoint_for_did.side_effect = test_module.LedgerError()
         with self.assertRaises(test_module.web.HTTPBadRequest):
@@ -171,7 +264,15 @@ async def test_register_nym_wallet_error(self):
         with self.assertRaises(test_module.web.HTTPBadRequest):
             await test_module.register_ledger_nym(self.request)
 
-    async def test_get_nym_role(self):
+    async def test_get_nym_role_a(self):
+        self.profile.context.injector.bind_instance(
+            IndyLedgerRequestsExecutor,
+            async_mock.MagicMock(
+                get_ledger_for_identifier=async_mock.CoroutineMock(
+                    return_value=self.ledger
+                )
+            ),
+        )
         self.request.query = {"did": self.test_did}
 
         with async_mock.patch.object(
@@ -182,12 +283,41 @@ async def test_get_nym_role(self):
             json_response.assert_called_once_with({"role": "USER"})
             assert result is json_response.return_value
 
+    async def test_get_nym_role_b(self):
+        self.profile.context.injector.bind_instance(
+            IndyLedgerRequestsExecutor,
+            async_mock.MagicMock(
+                get_ledger_for_identifier=async_mock.CoroutineMock(
+                    return_value=("test_ledger_id", self.ledger)
+                )
+            ),
+        )
+        self.request.query = {"did": self.test_did}
+
+        with async_mock.patch.object(
+            test_module.web, "json_response", async_mock.Mock()
+        ) as json_response:
+            self.ledger.get_nym_role.return_value = Role.USER
+            result = await test_module.get_nym_role(self.request)
+            json_response.assert_called_once_with(
+                {"ledger_id": "test_ledger_id", "role": "USER"}
+            )
+            assert result is json_response.return_value
+
     async def test_get_nym_role_bad_request(self):
         self.request.query = {"no": "did"}
         with self.assertRaises(test_module.web.HTTPBadRequest):
             await test_module.get_nym_role(self.request)
 
     async def test_get_nym_role_ledger_txn_error(self):
+        self.profile.context.injector.bind_instance(
+            IndyLedgerRequestsExecutor,
+            async_mock.MagicMock(
+                get_ledger_for_identifier=async_mock.CoroutineMock(
+                    return_value=("test_ledger_id", self.ledger)
+                )
+            ),
+        )
         self.request.query = {"did": self.test_did}
         self.ledger.get_nym_role.side_effect = test_module.LedgerTransactionError(
             "Error in building get-nym request"
@@ -196,6 +326,14 @@ async def test_get_nym_role_ledger_txn_error(self):
             await test_module.get_nym_role(self.request)
 
     async def test_get_nym_role_bad_ledger_req(self):
+        self.profile.context.injector.bind_instance(
+            IndyLedgerRequestsExecutor,
+            async_mock.MagicMock(
+                get_ledger_for_identifier=async_mock.CoroutineMock(
+                    return_value=("test_ledger_id", self.ledger)
+                )
+            ),
+        )
         self.request.query = {"did": self.test_did}
         self.ledger.get_nym_role.side_effect = test_module.BadLedgerRequestError(
             "No such public DID"
@@ -204,6 +342,14 @@ async def test_get_nym_role_bad_ledger_req(self):
             await test_module.get_nym_role(self.request)
 
     async def test_get_nym_role_ledger_error(self):
+        self.profile.context.injector.bind_instance(
+            IndyLedgerRequestsExecutor,
+            async_mock.MagicMock(
+                get_ledger_for_identifier=async_mock.CoroutineMock(
+                    return_value=self.ledger
+                )
+            ),
+        )
         self.request.query = {"did": self.test_did}
         self.ledger.get_nym_role.side_effect = test_module.LedgerError("Error")
         with self.assertRaises(test_module.web.HTTPBadRequest):
@@ -326,3 +472,77 @@ async def test_post_process_routes(self):
         mock_app = async_mock.MagicMock(_state={"swagger_dict": {}})
         test_module.post_process_routes(mock_app)
         assert "tags" in mock_app._state["swagger_dict"]
+
+    async def test_get_write_ledger(self):
+        self.profile.context.injector.bind_instance(
+            BaseMultipleLedgerManager,
+            async_mock.MagicMock(
+                get_write_ledger=async_mock.CoroutineMock(
+                    return_value=("test_ledger_id", self.ledger)
+                )
+            ),
+        )
+        with async_mock.patch.object(
+            test_module.web, "json_response", async_mock.Mock()
+        ) as json_response:
+            result = await test_module.get_write_ledger(self.request)
+            json_response.assert_called_once_with(
+                {
+                    "ledger_id": "test_ledger_id",
+                }
+            )
+            assert result is json_response.return_value
+
+    async def test_get_write_ledger_x(self):
+        with self.assertRaises(test_module.web.HTTPForbidden) as cm:
+            await test_module.get_write_ledger(self.request)
+            assert "No instance provided for BaseMultipleLedgerManager" in cm
+
+    async def test_get_ledger_config(self):
+        self.profile.context.injector.bind_instance(
+            BaseMultipleLedgerManager,
+            async_mock.MagicMock(
+                get_prod_ledgers=async_mock.CoroutineMock(
+                    return_value={
+                        "test_1": async_mock.MagicMock(),
+                        "test_2": async_mock.MagicMock(),
+                        "test_5": async_mock.MagicMock(),
+                    }
+                ),
+                get_nonprod_ledgers=async_mock.CoroutineMock(
+                    return_value={
+                        "test_3": async_mock.MagicMock(),
+                        "test_4": async_mock.MagicMock(),
+                    }
+                ),
+            ),
+        )
+        self.context.settings["ledger.ledger_config_list"] = [
+            {"id": "test_1", "genesis_transactions": "..."},
+            {"id": "test_2", "genesis_transactions": "..."},
+            {"id": "test_3", "genesis_transactions": "..."},
+            {"id": "test_4", "genesis_transactions": "..."},
+        ]
+        with async_mock.patch.object(
+            test_module.web, "json_response", async_mock.Mock()
+        ) as json_response:
+            result = await test_module.get_ledger_config(self.request)
+            json_response.assert_called_once_with(
+                {
+                    "production_ledgers": [
+                        {"id": "test_1"},
+                        {"id": "test_2"},
+                        {
+                            "id": "test_5",
+                            "desc": "ledger configured outside --genesis-transactions-list",
+                        },
+                    ],
+                    "non_production_ledgers": [{"id": "test_3"}, {"id": "test_4"}],
+                }
+            )
+            assert result is json_response.return_value
+
+    async def test_get_ledger_config_x(self):
+        with self.assertRaises(test_module.web.HTTPForbidden) as cm:
+            await test_module.get_ledger_config(self.request)
+            assert "No instance provided for BaseMultipleLedgerManager" in cm
diff --git a/aries_cloudagent/messaging/credential_definitions/routes.py b/aries_cloudagent/messaging/credential_definitions/routes.py
index 58f2247a56..61f0624e6b 100644
--- a/aries_cloudagent/messaging/credential_definitions/routes.py
+++ b/aries_cloudagent/messaging/credential_definitions/routes.py
@@ -24,6 +24,10 @@
 from ...indy.models.cred_def import CredentialDefinitionSchema
 from ...ledger.base import BaseLedger
 from ...ledger.error import LedgerError
+from ...ledger.multiple_ledger.ledger_requests_executor import (
+    GET_CRED_DEF,
+    IndyLedgerRequestsExecutor,
+)
 from ...protocols.endorse_transaction.v1_0.manager import (
     TransactionManager,
     TransactionManagerError,
@@ -164,6 +168,7 @@ async def credential_definitions_send_credential_definition(request: web.BaseReq
     """
     context: AdminRequestContext = request["context"]
+    profile = context.profile
     outbound_handler = request["outbound_message_router"]
 
     create_transaction_for_endorser = json.loads(
@@ -193,7 +198,7 @@ async def credential_definitions_send_credential_definition(request: web.BaseReq
 
     if not write_ledger:
         try:
-            async with context.session() as session:
+            async with profile.session() as session:
                 connection_record = await ConnRecord.retrieve_by_id(
                     session, connection_id
                 )
@@ -202,8 +207,10 @@ async def credential_definitions_send_credential_definition(request: web.BaseReq
         except BaseModelError as err:
             raise web.HTTPBadRequest(reason=err.roll_up) from err
 
-        session = await context.session()
-        endorser_info = await connection_record.metadata_get(session, "endorser_info")
+        async with profile.session() as session:
+            endorser_info = await connection_record.metadata_get(
+                session, "endorser_info"
+            )
         if not endorser_info:
             raise web.HTTPForbidden(
                 reason="Endorser Info is not set up in "
@@ -348,10 +355,20 @@ async def credential_definitions_get_credential_definition(request: web.BaseRequ
     """
     context: AdminRequestContext = request["context"]
-
     cred_def_id = request.match_info["cred_def_id"]
-    ledger = context.inject_or(BaseLedger)
+    ledger_id = None
+    async with context.profile.session() as session:
+        ledger_exec_inst = session.inject(IndyLedgerRequestsExecutor)
+        ledger_info = await ledger_exec_inst.get_ledger_for_identifier(
+            cred_def_id,
+            txn_record_type=GET_CRED_DEF,
+        )
+        if isinstance(ledger_info, tuple):
+            ledger_id = ledger_info[0]
+            ledger = ledger_info[1]
+        else:
+            ledger = ledger_info
     if not ledger:
         reason = "No ledger available"
         if not context.settings.get_value("wallet.type"):
@@ -361,7 +378,12 @@ async def credential_definitions_get_credential_definition(request: web.BaseRequ
 
     async with ledger:
         cred_def = await ledger.get_credential_definition(cred_def_id)
-    return web.json_response({"credential_definition": cred_def})
+    if ledger_id:
+        return web.json_response(
+            {"ledger_id": ledger_id, "credential_definition": cred_def}
+        )
+    else:
+        return web.json_response({"credential_definition": cred_def})
 
 
 @docs(
@@ -383,12 +405,21 @@ async def credential_definitions_fix_cred_def_wallet_record(request: web.BaseReq
     """
     context: AdminRequestContext = request["context"]
-    session = await context.session()
-    storage = session.inject(BaseStorage)
-
     cred_def_id = request.match_info["cred_def_id"]
-    ledger = context.inject_or(BaseLedger)
+    ledger_id = None
+    async with context.profile.session() as session:
+        storage = session.inject(BaseStorage)
+        ledger_exec_inst = session.inject(IndyLedgerRequestsExecutor)
+        ledger_info = await ledger_exec_inst.get_ledger_for_identifier(
+            cred_def_id,
+            txn_record_type=GET_CRED_DEF,
+        )
+        if isinstance(ledger_info, tuple):
+            ledger_id = ledger_info[0]
+            ledger = ledger_info[1]
+        else:
+            ledger = ledger_info
     if not ledger:
         reason = "No ledger available"
         if not context.settings.get_value("wallet.type"):
@@ -414,8 +445,12 @@ async def credential_definitions_fix_cred_def_wallet_record(request: web.BaseReq
             await add_cred_def_non_secrets_record(
                 session.profile, schema_id, iss_did, cred_def_id
             )
-
-    return web.json_response({"credential_definition": cred_def})
+    if ledger_id:
+        return web.json_response(
+            {"ledger_id": ledger_id, "credential_definition": cred_def}
+        )
+    else:
+        return web.json_response({"credential_definition": cred_def})
 
 
 def register_events(event_bus: EventBus):
diff --git a/aries_cloudagent/messaging/credential_definitions/tests/test_routes.py b/aries_cloudagent/messaging/credential_definitions/tests/test_routes.py
index c4761ba9e3..dd0ae0c716 100644
--- a/aries_cloudagent/messaging/credential_definitions/tests/test_routes.py
+++ b/aries_cloudagent/messaging/credential_definitions/tests/test_routes.py
@@ -5,6 +5,9 @@
 from ....core.in_memory import InMemoryProfile
 from ....indy.issuer import IndyIssuer
 from ....ledger.base import BaseLedger
+from ....ledger.multiple_ledger.ledger_requests_executor import (
+    IndyLedgerRequestsExecutor,
+)
 from ....storage.base import BaseStorage
 from ....tails.base import BaseTailsServer
@@ -322,20 +325,35 @@ async def test_created(self):
         )
 
     async def test_get_credential_definition(self):
+        self.profile_injector.bind_instance(
+            IndyLedgerRequestsExecutor,
+            async_mock.MagicMock(
+                get_ledger_for_identifier=async_mock.CoroutineMock(
+                    return_value=("test_ledger_id", self.ledger)
+                )
+            ),
+        )
         self.request.match_info = {"cred_def_id": CRED_DEF_ID}
-
         with async_mock.patch.object(test_module.web, "json_response") as mock_response:
             result = await test_module.credential_definitions_get_credential_definition(
                 self.request
             )
             assert result == mock_response.return_value
             mock_response.assert_called_once_with(
-                {"credential_definition": {"cred": "def", "signed_txn": "..."}}
+                {
+                    "ledger_id": "test_ledger_id",
+                    "credential_definition": {"cred": "def", "signed_txn": "..."},
+                }
             )
 
     async def test_get_credential_definition_no_ledger(self):
+        self.profile_injector.bind_instance(
+            IndyLedgerRequestsExecutor,
+            async_mock.MagicMock(
+                get_ledger_for_identifier=async_mock.CoroutineMock(return_value=None)
+            ),
+        )
         self.request.match_info = {"cred_def_id": CRED_DEF_ID}
-        self.context.injector.clear_binding(BaseLedger)
         self.profile_injector.clear_binding(BaseLedger)
         with self.assertRaises(test_module.web.HTTPForbidden):
diff --git a/aries_cloudagent/messaging/schemas/routes.py b/aries_cloudagent/messaging/schemas/routes.py
index 03e99ea4bb..5f73364ccd 100644
--- a/aries_cloudagent/messaging/schemas/routes.py
+++ b/aries_cloudagent/messaging/schemas/routes.py
@@ -24,6 +24,10 @@
 from ...indy.models.schema import SchemaSchema
 from ...ledger.base import BaseLedger
 from ...ledger.error import LedgerError
+from ...ledger.multiple_ledger.ledger_requests_executor import (
+    GET_SCHEMA,
+    IndyLedgerRequestsExecutor,
+)
 from ...protocols.endorse_transaction.v1_0.manager import (
     TransactionManager,
     TransactionManagerError,
@@ -163,6 +167,7 @@ async def schemas_send_schema(request: web.BaseRequest):
     """
     context: AdminRequestContext = request["context"]
+    profile = context.profile
     outbound_handler = request["outbound_message_router"]
 
     create_transaction_for_endorser = json.loads(
@@ -191,7 +196,7 @@ async def schemas_send_schema(request: web.BaseRequest):
 
     if not write_ledger:
         try:
-            async with context.session() as session:
+            async with profile.session() as session:
                 connection_record = await ConnRecord.retrieve_by_id(
                     session, connection_id
                 )
@@ -200,8 +205,10 @@ async def schemas_send_schema(request: web.BaseRequest):
         except BaseModelError as err:
             raise web.HTTPBadRequest(reason=err.roll_up) from err
 
-        session = await context.session()
-        endorser_info = await connection_record.metadata_get(session, "endorser_info")
+        async with profile.session() as session:
+            endorser_info = await connection_record.metadata_get(
+                session, "endorser_info"
+            )
         if not endorser_info:
             raise web.HTTPForbidden(
                 reason="Endorser Info is not set up in "
@@ -328,10 +335,20 @@ async def schemas_get_schema(request: web.BaseRequest):
     """
     context: AdminRequestContext = request["context"]
-
     schema_id = request.match_info["schema_id"]
-    ledger = context.inject_or(BaseLedger)
+    ledger_id = None
+    async with context.profile.session() as session:
+        ledger_exec_inst = session.inject(IndyLedgerRequestsExecutor)
+        ledger_info = await ledger_exec_inst.get_ledger_for_identifier(
+            schema_id,
+            txn_record_type=GET_SCHEMA,
+        )
+        if isinstance(ledger_info, tuple):
+            ledger_id = ledger_info[0]
+            ledger = ledger_info[1]
+        else:
+            ledger = ledger_info
     if not ledger:
         reason = "No ledger available"
         if not context.settings.get_value("wallet.type"):
@@ -343,8 +360,10 @@ async def schemas_get_schema(request: web.BaseRequest):
         schema = await ledger.get_schema(schema_id)
     except LedgerError as err:
         raise web.HTTPBadRequest(reason=err.roll_up) from err
-
-    return web.json_response({"schema": schema})
+    if ledger_id:
+        return web.json_response({"ledger_id": ledger_id, "schema": schema})
+    else:
+        return web.json_response({"schema": schema})
 
 
 @docs(tags=["schema"], summary="Writes a schema non-secret record to the wallet")
@@ -363,12 +382,23 @@ async def schemas_fix_schema_wallet_record(request: web.BaseRequest):
     """
     context: AdminRequestContext = request["context"]
-    session = await context.session()
-    storage = session.inject(BaseStorage)
+    profile = context.profile
 
     schema_id = request.match_info["schema_id"]
-    ledger = context.inject_or(BaseLedger)
+    ledger_id = None
+    async with profile.session() as session:
+        storage = session.inject(BaseStorage)
+        ledger_exec_inst = session.inject(IndyLedgerRequestsExecutor)
+        ledger_info = await ledger_exec_inst.get_ledger_for_identifier(
+            schema_id,
+            txn_record_type=GET_SCHEMA,
+        )
+        if isinstance(ledger_info, tuple):
+            ledger_id = ledger_info[0]
+            ledger = ledger_info[1]
+        else:
+            ledger = ledger_info
     if not ledger:
         reason = "No ledger available"
         if not context.settings.get_value("wallet.type"):
@@ -387,11 +417,13 @@ async def schemas_fix_schema_wallet_record(request: web.BaseRequest):
                 },
             )
             if 0 == len(found):
-                await add_schema_non_secrets_record(session.profile, schema_id)
+                await add_schema_non_secrets_record(profile, schema_id)
     except LedgerError as err:
         raise web.HTTPBadRequest(reason=err.roll_up) from err
-
-    return web.json_response({"schema": schema})
+    if ledger_id:
+        return web.json_response({"ledger_id": ledger_id, "schema": schema})
+    else:
+        return web.json_response({"schema": schema})
 
 
 def register_events(event_bus: EventBus):
diff --git a/aries_cloudagent/messaging/schemas/tests/test_routes.py b/aries_cloudagent/messaging/schemas/tests/test_routes.py
index d255e9af11..ca45372902 100644
--- a/aries_cloudagent/messaging/schemas/tests/test_routes.py
+++ b/aries_cloudagent/messaging/schemas/tests/test_routes.py
@@ -2,8 +2,12 @@
 from asynctest import mock as async_mock
 
 from ....admin.request_context import AdminRequestContext
+from ....core.in_memory import InMemoryProfile
 from ....indy.issuer import IndyIssuer
 from ....ledger.base import BaseLedger
+from ....ledger.multiple_ledger.ledger_requests_executor import (
+    IndyLedgerRequestsExecutor,
+)
 from ....storage.base import BaseStorage
 
 from .. import routes as test_module
@@ -16,18 +20,8 @@ class TestSchemaRoutes(AsyncTestCase):
     def setUp(self):
         self.session_inject = {}
-        self.context = AdminRequestContext.test_context(self.session_inject)
-        self.request_dict = {
-            "context": self.context,
-            "outbound_message_router": async_mock.CoroutineMock(),
-        }
-        self.request = async_mock.MagicMock(
-            app={},
-            match_info={},
-            query={},
-            __getitem__=lambda _, k: self.request_dict[k],
-        )
-
+        self.profile = InMemoryProfile.test_profile()
+        self.profile_injector = self.profile.context.injector
         self.ledger = async_mock.create_autospec(BaseLedger)
         self.ledger.__aenter__ = async_mock.CoroutineMock(return_value=self.ledger)
         self.ledger.create_and_send_schema = async_mock.CoroutineMock(
@@ -36,16 +30,29 @@ def setUp(self):
         self.ledger.get_schema = async_mock.CoroutineMock(
             return_value={"schema": "def", "signed_txn": "..."}
         )
-        self.context.injector.bind_instance(BaseLedger, self.ledger)
+        self.profile_injector.bind_instance(BaseLedger, self.ledger)
 
         self.issuer = async_mock.create_autospec(IndyIssuer)
-        self.context.injector.bind_instance(IndyIssuer, self.issuer)
+        self.profile_injector.bind_instance(IndyIssuer, self.issuer)
 
         self.storage = async_mock.create_autospec(BaseStorage)
         self.storage.find_all_records = async_mock.CoroutineMock(
             return_value=[async_mock.MagicMock(value=SCHEMA_ID)]
         )
         self.session_inject[BaseStorage] = self.storage
+        self.context = AdminRequestContext.test_context(
+            self.session_inject, profile=self.profile
+        )
+        self.request_dict = {
+            "context": self.context,
+            "outbound_message_router": async_mock.CoroutineMock(),
+        }
+        self.request = async_mock.MagicMock(
+            app={},
+            match_info={},
+            query={},
+            __getitem__=lambda _, k: self.request_dict[k],
+        )
 
     async def test_send_schema(self):
         self.request.json = async_mock.CoroutineMock(
@@ -280,18 +287,35 @@ async def test_created(self):
             mock_response.assert_called_once_with({"schema_ids": [SCHEMA_ID]})
 
     async def test_get_schema(self):
+        self.profile_injector.bind_instance(
+            IndyLedgerRequestsExecutor,
+            async_mock.MagicMock(
+                get_ledger_for_identifier=async_mock.CoroutineMock(
+                    return_value=("test_ledger_id", self.ledger)
+                )
+            ),
+        )
         self.request.match_info = {"schema_id": SCHEMA_ID}
-
         with async_mock.patch.object(test_module.web, "json_response") as mock_response:
             result = await test_module.schemas_get_schema(self.request)
             assert result == mock_response.return_value
             mock_response.assert_called_once_with(
-                {"schema": {"schema": "def", "signed_txn": "..."}}
+                {
+                    "ledger_id": "test_ledger_id",
+                    "schema": {"schema": "def", "signed_txn": "..."},
+                }
             )
 
     async def test_get_schema_on_seq_no(self):
+        self.profile_injector.bind_instance(
+            IndyLedgerRequestsExecutor,
+            async_mock.MagicMock(
+                get_ledger_for_identifier=async_mock.CoroutineMock(
+                    return_value=self.ledger
+                )
+            ),
+        )
         self.request.match_info = {"schema_id": "12345"}
-
         with async_mock.patch.object(test_module.web, "json_response") as mock_response:
             result = await test_module.schemas_get_schema(self.request)
             assert result == mock_response.return_value
@@ -300,6 +324,12 @@ async def test_get_schema_on_seq_no(self):
         )
 
     async def test_get_schema_no_ledger(self):
+        self.profile_injector.bind_instance(
+            IndyLedgerRequestsExecutor,
+            async_mock.MagicMock(
+                get_ledger_for_identifier=async_mock.CoroutineMock(return_value=None)
+            ),
+        )
         self.request.match_info = {"schema_id": SCHEMA_ID}
         self.ledger.get_schema = async_mock.CoroutineMock(
             side_effect=test_module.LedgerError("Down for routine maintenance")
@@ -310,6 +340,14 @@ async def test_get_schema_no_ledger(self):
             await test_module.schemas_get_schema(self.request)
 
     async def test_get_schema_x_ledger(self):
+        self.profile_injector.bind_instance(
+            IndyLedgerRequestsExecutor,
+            async_mock.MagicMock(
+                get_ledger_for_identifier=async_mock.CoroutineMock(
+                    return_value=self.ledger
+                )
+            ),
+        )
         self.request.match_info = {"schema_id": SCHEMA_ID}
         self.ledger.get_schema = async_mock.CoroutineMock(
             side_effect=test_module.LedgerError("Down for routine maintenance")
diff --git a/aries_cloudagent/protocols/actionmenu/v1_0/routes.py b/aries_cloudagent/protocols/actionmenu/v1_0/routes.py
index a3ddf552f5..7ba58f839c 100644
--- a/aries_cloudagent/protocols/actionmenu/v1_0/routes.py
+++ b/aries_cloudagent/protocols/actionmenu/v1_0/routes.py
@@ -155,7 +155,7 @@ async def actionmenu_perform(request: web.BaseRequest):
     params = await request.json()
 
     try:
-        async with context.session() as session:
+        async with context.profile.session() as session:
             connection = await ConnRecord.retrieve_by_id(session, connection_id)
     except StorageNotFoundError as err:
         raise web.HTTPNotFound(reason=err.roll_up) from err
@@ -184,7 +184,7 @@ async def actionmenu_request(request: web.BaseRequest):
     outbound_handler = request["outbound_message_router"]
 
     try:
-        async with context.session() as session:
+        async with context.profile.session() as session:
             connection = await ConnRecord.retrieve_by_id(session, connection_id)
     except StorageNotFoundError as err:
         LOGGER.debug("Connection not found for action menu request: %s", connection_id)
@@ -222,7 +222,7 @@ async def actionmenu_send(request: web.BaseRequest):
         raise web.HTTPBadRequest(reason=err.roll_up) from err
 
     try:
-        async with context.session() as session:
+        async with context.profile.session() as session:
             connection = await ConnRecord.retrieve_by_id(session, connection_id)
     except StorageNotFoundError as err:
         LOGGER.debug(
diff --git a/aries_cloudagent/protocols/actionmenu/v1_0/util.py b/aries_cloudagent/protocols/actionmenu/v1_0/util.py
index fdec9510ee..2ea08d1b27 100644
--- a/aries_cloudagent/protocols/actionmenu/v1_0/util.py
+++ b/aries_cloudagent/protocols/actionmenu/v1_0/util.py
@@ -16,7 +16,7 @@ async def retrieve_connection_menu(
     connection_id: str, context: AdminRequestContext
 ) -> Menu:
     """Retrieve the previously-received action menu."""
-    async with context.session() as session:
+    async with context.profile.session() as session:
         storage = session.inject(BaseStorage)
         try:
             record = await storage.find_record(
@@ -31,7 +31,7 @@ async def save_connection_menu(
     menu: Menu, connection_id: str, context: AdminRequestContext
 ):
     """Save a received action menu."""
-    async with context.session() as session:
+    async with context.profile.session() as session:
         storage = session.inject(BaseStorage)
         try:
             record = await storage.find_record(
diff --git a/aries_cloudagent/protocols/basicmessage/v1_0/routes.py b/aries_cloudagent/protocols/basicmessage/v1_0/routes.py
index f2ecb1eb9d..e9989041c9 100644
--- a/aries_cloudagent/protocols/basicmessage/v1_0/routes.py
+++ b/aries_cloudagent/protocols/basicmessage/v1_0/routes.py
@@ -51,7 +51,7 @@ async def connections_send_message(request: web.BaseRequest):
     params = await request.json()
 
     try:
-        async with context.session() as session:
+        async with context.profile.session() as session:
             connection = await ConnRecord.retrieve_by_id(session, connection_id)
     except StorageNotFoundError as err:
         raise web.HTTPNotFound(reason=err.roll_up) from err
diff --git a/aries_cloudagent/protocols/coordinate_mediation/v1_0/handlers/keylist_handler.py b/aries_cloudagent/protocols/coordinate_mediation/v1_0/handlers/keylist_handler.py
index c46f7b0f47..de00566498 100644
--- a/aries_cloudagent/protocols/coordinate_mediation/v1_0/handlers/keylist_handler.py
+++ b/aries_cloudagent/protocols/coordinate_mediation/v1_0/handlers/keylist_handler.py
@@ -26,11 +26,11 @@ async def handle(self, context: RequestContext, responder: BaseResponder):
         if not context.connection_ready:
             raise HandlerException("Received keylist message from inactive connection")
 
-        session = await context.session()
         try:
-            await MediationRecord.retrieve_by_connection_id(
-                session, context.connection_record.connection_id
-            )
+            async with context.profile.session() as session:
+                await MediationRecord.retrieve_by_connection_id(
+                    session, context.connection_record.connection_id
+                )
         except StorageNotFoundError as err:
             LOG.warning(
                 "Received keylist from connection that is not acting as mediator: %s",
diff --git a/aries_cloudagent/protocols/coordinate_mediation/v1_0/routes.py b/aries_cloudagent/protocols/coordinate_mediation/v1_0/routes.py
index 1fe98cceab..6740dce76b 100644
--- a/aries_cloudagent/protocols/coordinate_mediation/v1_0/routes.py
+++ b/aries_cloudagent/protocols/coordinate_mediation/v1_0/routes.py
@@ -201,8 +201,8 @@ async def list_mediation_requests(request: web.BaseRequest):
         tag_filter["state"] = state
 
     try:
-        session = await context.session()
-        records = await MediationRecord.query(session, tag_filter)
+        async with context.profile.session() as session:
+            records = await MediationRecord.query(session, tag_filter)
         results = [record.serialize() for record in records]
         results.sort(key=mediation_sort_key)
     except (StorageError, BaseModelError) as err:
@@ -219,8 +219,10 @@ async def retrieve_mediation_request(request: web.BaseRequest):
     mediation_id = request.match_info["mediation_id"]
 
    try:
-        session = await context.session()
-        mediation_record = await MediationRecord.retrieve_by_id(session, mediation_id)
+        async with context.profile.session() as session:
+            mediation_record = await MediationRecord.retrieve_by_id(
+                session, mediation_id
+            )
         result = mediation_record.serialize()
     except StorageNotFoundError as err:
         raise web.HTTPNotFound(reason=err.roll_up) from err
@@ -239,9 +241,10 @@ async def delete_mediation_request(request: web.BaseRequest):
     mediation_id = request.match_info["mediation_id"]
 
     try:
-        session = await context.session()
-
-        mediation_record = await MediationRecord.retrieve_by_id(session, mediation_id)
+        async with context.profile.session() as session:
+            mediation_record = await MediationRecord.retrieve_by_id(
+                session, mediation_id
+            )
         result = mediation_record.serialize()
         await mediation_record.delete_record(session)
     except StorageNotFoundError as err:
@@ -370,7 +373,7 @@ async def get_keylist(request: web.BaseRequest):
         tag_filter["role"] = role
 
     try:
-        async with context.session() as session:
+        async with context.profile.session() as session:
             keylists = await RouteRecord.query(session, tag_filter)
         results = [record.serialize() for record in keylists]
     except (StorageError, BaseModelError) as err:
diff --git a/aries_cloudagent/protocols/coordinate_mediation/v1_0/tests/test_routes.py b/aries_cloudagent/protocols/coordinate_mediation/v1_0/tests/test_routes.py
index 2744f8afdc..a19070a6e6 100644
--- a/aries_cloudagent/protocols/coordinate_mediation/v1_0/tests/test_routes.py
+++ b/aries_cloudagent/protocols/coordinate_mediation/v1_0/tests/test_routes.py
@@ -3,6 +3,7 @@
 from asynctest import mock as async_mock, TestCase as AsyncTestCase
 
 from .....admin.request_context import AdminRequestContext
+from .....core.in_memory import InMemoryProfile
 from .....config.injection_context import InjectionContext
 from .....messaging.request_context import RequestContext
 
@@ -13,8 +14,9 @@ class TestCoordinateMediationRoutes(AsyncTestCase):
     def setUp(self):
-        self.session_inject = {}
-        self.context = AdminRequestContext.test_context(self.session_inject)
+        self.profile = InMemoryProfile.test_profile()
+        self.context = self.profile.context
+        setattr(self.context, "profile", self.profile)
         self.outbound_message_router = async_mock.CoroutineMock()
         self.request_dict = {
             "context": self.context,
@@ -68,39 +70,45 @@ def test_mediation_sort_key(self):
 
     async def test_list_mediation_requests(self):
         self.request.query = {}
-        self.context.session = async_mock.CoroutineMock()
 
         with async_mock.patch.object(
             test_module.MediationRecord,
             "query",
             async_mock.CoroutineMock(return_value=[self.mock_record]),
         ) as mock_query, async_mock.patch.object(
             test_module.web, "json_response"
-        ) as json_response:
+        ) as json_response, async_mock.patch.object(
+            self.context.profile,
+            "session",
+            async_mock.MagicMock(return_value=InMemoryProfile.test_session()),
+        ) as session:
             await test_module.list_mediation_requests(self.request)
             json_response.assert_called_once_with(
                 {"results": [self.mock_record.serialize.return_value]}
             )
-            mock_query.assert_called_once_with(self.context.session.return_value, {})
+            mock_query.assert_called_once_with(session.return_value, {})
 
     async def test_list_mediation_requests_filters(self):
         self.request.query = {
             "state": MediationRecord.STATE_GRANTED,
             "conn_id": "test-conn-id",
         }
-        self.context.session = async_mock.CoroutineMock()
 
         with async_mock.patch.object(
             test_module.MediationRecord,
             "query",
             async_mock.CoroutineMock(return_value=[self.mock_record]),
         ) as mock_query, async_mock.patch.object(
             test_module.web, "json_response"
-        ) as json_response:
+        ) as json_response, async_mock.patch.object(
+            self.context.profile,
+            "session",
+            async_mock.MagicMock(return_value=InMemoryProfile.test_session()),
+        ) as session:
             await test_module.list_mediation_requests(self.request)
             json_response.assert_called_once_with(
                 {"results": [self.mock_record.serialize.return_value]}
             )
             mock_query.assert_called_once_with(
-                self.context.session.return_value,
+                session.return_value,
                 {
                     "connection_id": "test-conn-id",
                     "state": MediationRecord.STATE_GRANTED,
@@ -386,6 +394,7 @@ async def test_mediation_request_deny_x_storage_error(self):
             await test_module.mediation_request_deny(self.request)
 
     async def test_get_keylist(self):
+        session = await self.context.profile.session()
         self.request.query["role"] = MediationRecord.ROLE_SERVER
         self.request.query["conn_id"] = "test-id"
 
@@ -402,9 +411,9 @@ async def test_get_keylist(self):
             "query",
             async_mock.CoroutineMock(return_value=query_results),
         ) as mock_query, async_mock.patch.object(
-            self.context,
+            self.context.profile,
             "session",
-            async_mock.MagicMock(return_value=self.context.session()),
+            async_mock.MagicMock(return_value=session),
         ) as mock_session, async_mock.patch.object(
             test_module.web, "json_response"
         ) as mock_response:
@@ -414,18 +423,19 @@ async def test_get_keylist(self):
             )
             mock_query.assert_called_once_with(
                 mock_session.return_value,
-                {"role": MediationRecord.ROLE_SERVER, "connection_id": "test-id"},
+                {"connection_id": "test-id", "role": MediationRecord.ROLE_SERVER},
             )
 
     async def test_get_keylist_no_matching_records(self):
+        session = await self.context.profile.session()
         with async_mock.patch.object(
             test_module.RouteRecord,
             "query",
             async_mock.CoroutineMock(return_value=[]),
         ) as mock_query, async_mock.patch.object(
-            self.context,
+            self.context.profile,
             "session",
-            async_mock.MagicMock(return_value=self.context.session()),
+            async_mock.MagicMock(return_value=session),
         ) as mock_session, async_mock.patch.object(
             test_module.web, "json_response"
         ) as mock_response:
diff --git a/aries_cloudagent/protocols/endorse_transaction/v1_0/routes.py b/aries_cloudagent/protocols/endorse_transaction/v1_0/routes.py
index 64d361369f..4a31f089e2 100644
--- a/aries_cloudagent/protocols/endorse_transaction/v1_0/routes.py
+++ b/aries_cloudagent/protocols/endorse_transaction/v1_0/routes.py
@@ -265,7 +265,10 @@ async def transaction_create_request(request: web.BaseRequest):
     transaction_mgr = TransactionManager(context.profile)
     try:
-        transaction_record, transaction_request = await transaction_mgr.create_request(
+        (
+            transaction_record,
+            transaction_request,
+        ) = await transaction_mgr.create_request(
             transaction=transaction_record,
             expires_time=expires_time,
             endorser_write_txn=endorser_write_txn,
@@ -436,7 +439,6 @@ async def cancel_transaction(request: web.BaseRequest):
     context: AdminRequestContext = request["context"]
     outbound_handler = request["outbound_message_router"]
-
     transaction_id = request.match_info["tran_id"]
     try:
         async with context.profile.session() as session:
@@ -471,7 +473,8 @@ async def cancel_transaction(request: web.BaseRequest):
             transaction,
             cancelled_transaction_response,
         ) = await transaction_mgr.cancel_transaction(
-
transaction=transaction, state=TransactionRecord.STATE_TRANSACTION_CANCELLED + transaction=transaction, + state=TransactionRecord.STATE_TRANSACTION_CANCELLED, ) except (StorageError, TransactionManagerError) as err: raise web.HTTPBadRequest(reason=err.roll_up) from err @@ -501,7 +504,6 @@ async def transaction_resend(request: web.BaseRequest): context: AdminRequestContext = request["context"] outbound_handler = request["outbound_message_router"] - transaction_id = request.match_info["tran_id"] try: async with context.profile.session() as session: @@ -672,7 +674,6 @@ async def transaction_write(request: web.BaseRequest): context: AdminRequestContext = request["context"] outbound_handler = request["outbound_message_router"] - transaction_id = request.match_info["tran_id"] try: async with context.profile.session() as session: diff --git a/aries_cloudagent/protocols/endorse_transaction/v1_0/tests/test_manager.py b/aries_cloudagent/protocols/endorse_transaction/v1_0/tests/test_manager.py index 7ef3cc6ef2..edad031889 100644 --- a/aries_cloudagent/protocols/endorse_transaction/v1_0/tests/test_manager.py +++ b/aries_cloudagent/protocols/endorse_transaction/v1_0/tests/test_manager.py @@ -1,3 +1,4 @@ +import asyncio import json import uuid @@ -388,10 +389,13 @@ async def test_complete_transaction(self): messages_attach=self.test_messages_attach, connection_id=self.test_connection_id, ) - - self.ledger.get_indy_storage = async_mock.MagicMock( - return_value=async_mock.MagicMock(add_record=async_mock.CoroutineMock()) + future = asyncio.Future() + future.set_result( + async_mock.MagicMock( + return_value=async_mock.MagicMock(add_record=async_mock.CoroutineMock()) + ) ) + self.ledger.get_indy_storage = future self.ledger.txn_submit = async_mock.CoroutineMock( return_value=json.dumps( { diff --git a/aries_cloudagent/protocols/endorse_transaction/v1_0/tests/test_routes.py b/aries_cloudagent/protocols/endorse_transaction/v1_0/tests/test_routes.py index 3855255fb7..d91b1ff8b8 100644 
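The `test_manager.py` hunk swaps a `MagicMock` for a pre-resolved `asyncio.Future` assigned to `get_indy_storage`, so code that awaits the attribute itself (rather than calling it) receives the mocked storage. A simplified sketch of that mocking technique (the `"dummy"` record and the single-level mock are illustrative; the actual diff wraps an extra `return_value` layer):

```python
import asyncio
from unittest import mock


async def demo():
    ledger = mock.MagicMock()

    # Pattern from the diff: pre-resolve a Future so that
    # `await ledger.get_indy_storage` yields the mocked storage object.
    future = asyncio.Future()
    future.set_result(mock.MagicMock(add_record=mock.AsyncMock()))
    ledger.get_indy_storage = future

    storage = await ledger.get_indy_storage
    await storage.add_record("dummy")
    storage.add_record.assert_awaited_once_with("dummy")
    return storage


storage = asyncio.run(demo())
```

A done `Future` can be awaited any number of times and always returns its result, which makes it a convenient awaitable stand-in when the code under test awaits an attribute rather than a coroutine call.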
--- a/aries_cloudagent/protocols/endorse_transaction/v1_0/tests/test_routes.py +++ b/aries_cloudagent/protocols/endorse_transaction/v1_0/tests/test_routes.py @@ -1,3 +1,4 @@ +import asyncio import json from asynctest import mock as async_mock, TestCase as AsyncTestCase @@ -23,9 +24,11 @@ class TestEndorseTransactionRoutes(AsyncTestCase): - def setUp(self): - self.session_inject = {} + async def setUp(self): self.profile = InMemoryProfile.test_profile() + self.context = self.profile.context + setattr(self.context, "profile", self.profile) + self.session = await self.profile.session() self.profile_injector = self.profile.context.injector self.profile_session = InMemoryProfile.test_session() setattr( @@ -49,18 +52,18 @@ def setUp(self): } ) ) - self.ledger.get_indy_storage = async_mock.MagicMock( - return_value=async_mock.MagicMock(add_record=async_mock.CoroutineMock()) + future = asyncio.Future() + future.set_result( + async_mock.MagicMock( + return_value=async_mock.MagicMock(add_record=async_mock.CoroutineMock()) + ) ) + self.ledger.get_indy_storage = future self.ledger.get_schema = async_mock.CoroutineMock( return_value={"id": SCHEMA_ID, "...": "..."} ) self.profile_injector.bind_instance(BaseLedger, self.ledger) - self.context = AdminRequestContext.test_context( - self.session_inject, profile=self.profile - ) - self.request_dict = { "context": self.context, "outbound_message_router": async_mock.CoroutineMock(), @@ -425,17 +428,19 @@ async def test_transaction_create_request_mgr_create_request_x(self): async def test_endorse_transaction_response(self): self.request.match_info = {"tran_id": "dummy"} - - self.session_inject[BaseWallet] = async_mock.MagicMock( - get_public_did=async_mock.CoroutineMock( - return_value=DIDInfo( - "did", - "verkey", - {"meta": "data"}, - method=DIDMethod.SOV, - key_type=KeyType.ED25519, + self.session.context.injector.bind_instance( + BaseWallet, + async_mock.MagicMock( + get_public_did=async_mock.CoroutineMock( + return_value=DIDInfo( + 
"did", + "verkey", + {"meta": "data"}, + method=DIDMethod.SOV, + key_type=KeyType.ED25519, + ) ) - ) + ), ) with async_mock.patch.object( @@ -446,7 +451,11 @@ async def test_endorse_transaction_response(self): test_module, "TransactionManager", async_mock.MagicMock() ) as mock_txn_mgr, async_mock.patch.object( test_module.web, "json_response" - ) as mock_response: + ) as mock_response, async_mock.patch.object( + self.context.profile, + "session", + async_mock.MagicMock(return_value=self.session), + ) as mock_session: mock_txn_mgr.return_value = async_mock.MagicMock( create_endorse_response=async_mock.CoroutineMock( return_value=( @@ -475,37 +484,51 @@ async def test_endorse_transaction_response(self): # TODO code re-factored from routes.py to manager.py so tests must be moved async def skip_test_endorse_transaction_response_no_wallet_x(self): - self.session_inject[BaseWallet] = None + self.session.context.injector.clear_binding(BaseWallet) with self.assertRaises(test_module.web.HTTPForbidden): await test_module.endorse_transaction_response(self.request) async def skip_test_endorse_transaction_response_no_endorser_did_info_x(self): self.request.match_info = {"tran_id": "dummy"} - self.session_inject[BaseWallet] = async_mock.MagicMock( - get_public_did=async_mock.CoroutineMock(return_value=None) + self.session.context.injector.bind_instance( + BaseWallet, + async_mock.MagicMock( + get_public_did=async_mock.CoroutineMock(return_value=None) + ), ) - - with self.assertRaises(test_module.web.HTTPForbidden): - await test_module.endorse_transaction_response(self.request) + with async_mock.patch.object( + self.context.profile, + "session", + async_mock.MagicMock(return_value=self.session), + ) as mock_session: + with self.assertRaises(test_module.web.HTTPForbidden): + await test_module.endorse_transaction_response(self.request) async def test_endorse_transaction_response_not_found_x(self): self.request.match_info = {"tran_id": "dummy"} - self.session_inject[BaseWallet] = 
async_mock.MagicMock( - get_public_did=async_mock.CoroutineMock( - return_value=DIDInfo( - "did", - "verkey", - {"meta": "data"}, - method=DIDMethod.SOV, - key_type=KeyType.ED25519, + self.session.context.injector.bind_instance( + BaseWallet, + async_mock.MagicMock( + get_public_did=async_mock.CoroutineMock( + return_value=DIDInfo( + "did", + "verkey", + {"meta": "data"}, + method=DIDMethod.SOV, + key_type=KeyType.ED25519, + ) ) - ) + ), ) with async_mock.patch.object( TransactionRecord, "retrieve_by_id", async_mock.CoroutineMock() - ) as mock_txn_rec_retrieve: + ) as mock_txn_rec_retrieve, async_mock.patch.object( + self.context.profile, + "session", + async_mock.MagicMock(return_value=self.session), + ) as mock_session: mock_txn_rec_retrieve.side_effect = test_module.StorageNotFoundError() with self.assertRaises(test_module.web.HTTPNotFound): @@ -513,24 +536,30 @@ async def test_endorse_transaction_response_not_found_x(self): async def test_endorse_transaction_response_base_model_x(self): self.request.match_info = {"tran_id": "dummy"} - - self.session_inject[BaseWallet] = async_mock.MagicMock( - get_public_did=async_mock.CoroutineMock( - return_value=DIDInfo( - "did", - "verkey", - {"meta": "data"}, - method=DIDMethod.SOV, - key_type=KeyType.ED25519, + self.session.context.injector.bind_instance( + BaseWallet, + async_mock.MagicMock( + get_public_did=async_mock.CoroutineMock( + return_value=DIDInfo( + "did", + "verkey", + {"meta": "data"}, + method=DIDMethod.SOV, + key_type=KeyType.ED25519, + ) ) - ) + ), ) with async_mock.patch.object( ConnRecord, "retrieve_by_id", async_mock.CoroutineMock() ) as mock_conn_rec_retrieve, async_mock.patch.object( TransactionRecord, "retrieve_by_id", async_mock.CoroutineMock() - ) as mock_txn_rec_retrieve: + ) as mock_txn_rec_retrieve, async_mock.patch.object( + self.context.profile, + "session", + async_mock.MagicMock(return_value=self.session), + ) as mock_session: mock_conn_rec_retrieve.side_effect = test_module.BaseModelError() 
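These test hunks replace the old `self.session_inject[BaseWallet] = ...` dict seeding with `self.session.context.injector.bind_instance(BaseWallet, ...)`. A minimal sketch of that dependency-injection style, with a toy injector (`ToyInjector` and the placeholder `BaseWallet` are illustrative, not the real ACA-Py API):

```python
import asyncio
from unittest import mock


class ToyInjector:
    """Illustrative stand-in for the session context's injector."""

    def __init__(self):
        self._bindings = {}

    def bind_instance(self, base_cls, instance):
        # register a concrete instance under its base class
        self._bindings[base_cls] = instance

    def inject(self, base_cls):
        # what route handlers call to obtain the bound instance
        return self._bindings[base_cls]


class BaseWallet:  # placeholder for the real BaseWallet base class
    pass


injector = ToyInjector()
# Pattern from the diff: bind the mocked wallet on the injector rather
# than seeding a `session_inject` dict consumed by the test context.
injector.bind_instance(
    BaseWallet,
    mock.MagicMock(get_public_did=mock.AsyncMock(return_value="did-info")),
)

wallet = injector.inject(BaseWallet)
did = asyncio.run(wallet.get_public_did())
```

Binding on the session's own injector keeps the mock on the same path the production code resolves, so the tests exercise the real lookup instead of a test-only shortcut.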
mock_txn_rec_retrieve.return_value = async_mock.MagicMock( serialize=async_mock.MagicMock(return_value={"...": "..."}) @@ -542,23 +571,30 @@ async def test_endorse_transaction_response_base_model_x(self): async def test_endorse_transaction_response_no_jobs_x(self): self.request.match_info = {"tran_id": "dummy"} - self.session_inject[BaseWallet] = async_mock.MagicMock( - get_public_did=async_mock.CoroutineMock( - return_value=DIDInfo( - "did", - "verkey", - {"meta": "data"}, - method=DIDMethod.SOV, - key_type=KeyType.ED25519, + self.session.context.injector.bind_instance( + BaseWallet, + async_mock.MagicMock( + get_public_did=async_mock.CoroutineMock( + return_value=DIDInfo( + "did", + "verkey", + {"meta": "data"}, + method=DIDMethod.SOV, + key_type=KeyType.ED25519, + ) ) - ) + ), ) with async_mock.patch.object( ConnRecord, "retrieve_by_id", async_mock.CoroutineMock() ) as mock_conn_rec_retrieve, async_mock.patch.object( TransactionRecord, "retrieve_by_id", async_mock.CoroutineMock() - ) as mock_txn_rec_retrieve: + ) as mock_txn_rec_retrieve, async_mock.patch.object( + self.context.profile, + "session", + async_mock.MagicMock(return_value=self.session), + ) as mock_session: mock_conn_rec_retrieve.return_value = async_mock.MagicMock( metadata_get=async_mock.CoroutineMock(return_value=None) ) @@ -572,16 +608,19 @@ async def test_endorse_transaction_response_no_jobs_x(self): async def skip_test_endorse_transaction_response_no_ledger_x(self): self.request.match_info = {"tran_id": "dummy"} self.context.injector.clear_binding(BaseLedger) - self.session_inject[BaseWallet] = async_mock.MagicMock( - get_public_did=async_mock.CoroutineMock( - return_value=DIDInfo( - "did", - "verkey", - {"meta": "data"}, - method=DIDMethod.SOV, - key_type=KeyType.ED25519, + self.session.context.injector.bind_instance( + BaseWallet, + async_mock.MagicMock( + get_public_did=async_mock.CoroutineMock( + return_value=DIDInfo( + "did", + "verkey", + {"meta": "data"}, + method=DIDMethod.SOV, + 
key_type=KeyType.ED25519, + ) ) - ) + ), ) with async_mock.patch.object( @@ -590,7 +629,11 @@ async def skip_test_endorse_transaction_response_no_ledger_x(self): TransactionRecord, "retrieve_by_id", async_mock.CoroutineMock() ) as mock_txn_rec_retrieve, async_mock.patch.object( test_module, "TransactionManager", async_mock.MagicMock() - ) as mock_txn_mgr: + ) as mock_txn_mgr, async_mock.patch.object( + self.context.profile, + "session", + async_mock.MagicMock(return_value=self.session), + ) as mock_session: mock_txn_mgr.return_value = async_mock.MagicMock( create_endorse_response=async_mock.CoroutineMock( return_value=( @@ -620,23 +663,30 @@ async def skip_test_endorse_transaction_response_no_ledger_x(self): async def test_endorse_transaction_response_wrong_my_job_x(self): self.request.match_info = {"tran_id": "dummy"} - self.session_inject[BaseWallet] = async_mock.MagicMock( - get_public_did=async_mock.CoroutineMock( - return_value=DIDInfo( - "did", - "verkey", - {"meta": "data"}, - method=DIDMethod.SOV, - key_type=KeyType.ED25519, + self.session.context.injector.bind_instance( + BaseWallet, + async_mock.MagicMock( + get_public_did=async_mock.CoroutineMock( + return_value=DIDInfo( + "did", + "verkey", + {"meta": "data"}, + method=DIDMethod.SOV, + key_type=KeyType.ED25519, + ) ) - ) + ), ) with async_mock.patch.object( ConnRecord, "retrieve_by_id", async_mock.CoroutineMock() ) as mock_conn_rec_retrieve, async_mock.patch.object( TransactionRecord, "retrieve_by_id", async_mock.CoroutineMock() - ) as mock_txn_rec_retrieve: + ) as mock_txn_rec_retrieve, async_mock.patch.object( + self.context.profile, + "session", + async_mock.MagicMock(return_value=self.session), + ) as mock_session: mock_conn_rec_retrieve.return_value = async_mock.MagicMock( metadata_get=async_mock.CoroutineMock( return_value={ @@ -656,16 +706,19 @@ async def test_endorse_transaction_response_wrong_my_job_x(self): async def skip_test_endorse_transaction_response_ledger_x(self): 
self.request.match_info = {"tran_id": "dummy"} - self.session_inject[BaseWallet] = async_mock.MagicMock( - get_public_did=async_mock.CoroutineMock( - return_value=DIDInfo( - "did", - "verkey", - {"meta": "data"}, - method=DIDMethod.SOV, - key_type=KeyType.ED25519, + self.session.context.injector.bind_instance( + BaseWallet, + async_mock.MagicMock( + get_public_did=async_mock.CoroutineMock( + return_value=DIDInfo( + "did", + "verkey", + {"meta": "data"}, + method=DIDMethod.SOV, + key_type=KeyType.ED25519, + ) ) - ) + ), ) self.ledger.txn_endorse = async_mock.CoroutineMock( side_effect=test_module.LedgerError() @@ -677,7 +730,11 @@ async def skip_test_endorse_transaction_response_ledger_x(self): TransactionRecord, "retrieve_by_id", async_mock.CoroutineMock() ) as mock_txn_rec_retrieve, async_mock.patch.object( test_module, "TransactionManager", async_mock.MagicMock() - ) as mock_txn_mgr: + ) as mock_txn_mgr, async_mock.patch.object( + self.context.profile, + "session", + async_mock.MagicMock(return_value=self.session), + ) as mock_session: mock_txn_mgr.return_value = async_mock.MagicMock( create_endorse_response=async_mock.CoroutineMock( return_value=( @@ -707,16 +764,19 @@ async def skip_test_endorse_transaction_response_ledger_x(self): async def test_endorse_transaction_response_txn_mgr_x(self): self.request.match_info = {"tran_id": "dummy"} - self.session_inject[BaseWallet] = async_mock.MagicMock( - get_public_did=async_mock.CoroutineMock( - return_value=DIDInfo( - "did", - "verkey", - {"meta": "data"}, - method=DIDMethod.SOV, - key_type=KeyType.ED25519, + self.session.context.injector.bind_instance( + BaseWallet, + async_mock.MagicMock( + get_public_did=async_mock.CoroutineMock( + return_value=DIDInfo( + "did", + "verkey", + {"meta": "data"}, + method=DIDMethod.SOV, + key_type=KeyType.ED25519, + ) ) - ) + ), ) with async_mock.patch.object( @@ -727,7 +787,11 @@ async def test_endorse_transaction_response_txn_mgr_x(self): test_module, "TransactionManager", 
async_mock.MagicMock() ) as mock_txn_mgr, async_mock.patch.object( test_module.web, "json_response" - ) as mock_response: + ) as mock_response, async_mock.patch.object( + self.context.profile, + "session", + async_mock.MagicMock(return_value=self.session), + ) as mock_session: mock_txn_mgr.return_value = async_mock.MagicMock( create_endorse_response=async_mock.CoroutineMock( side_effect=test_module.TransactionManagerError() @@ -752,16 +816,19 @@ async def test_endorse_transaction_response_txn_mgr_x(self): async def test_refuse_transaction_response(self): self.request.match_info = {"tran_id": "dummy"} - self.session_inject[BaseWallet] = async_mock.MagicMock( - get_public_did=async_mock.CoroutineMock( - return_value=DIDInfo( - "did", - "verkey", - {"meta": "data"}, - method=DIDMethod.SOV, - key_type=KeyType.ED25519, + self.session.context.injector.bind_instance( + BaseWallet, + async_mock.MagicMock( + get_public_did=async_mock.CoroutineMock( + return_value=DIDInfo( + "did", + "verkey", + {"meta": "data"}, + method=DIDMethod.SOV, + key_type=KeyType.ED25519, + ) ) - ) + ), ) with async_mock.patch.object( @@ -772,7 +839,11 @@ async def test_refuse_transaction_response(self): test_module, "TransactionManager", async_mock.MagicMock() ) as mock_txn_mgr, async_mock.patch.object( test_module.web, "json_response" - ) as mock_response: + ) as mock_response, async_mock.patch.object( + self.context.profile, + "session", + async_mock.MagicMock(return_value=self.session), + ) as mock_session: mock_txn_mgr.return_value = async_mock.MagicMock( create_refuse_response=async_mock.CoroutineMock( return_value=( @@ -803,21 +874,28 @@ async def test_refuse_transaction_response(self): async def test_refuse_transaction_response_not_found_x(self): self.request.match_info = {"tran_id": "dummy"} - self.session_inject[BaseWallet] = async_mock.MagicMock( - get_public_did=async_mock.CoroutineMock( - return_value=DIDInfo( - "did", - "verkey", - {"meta": "data"}, - method=DIDMethod.SOV, - 
key_type=KeyType.ED25519, + self.session.context.injector.bind_instance( + BaseWallet, + async_mock.MagicMock( + get_public_did=async_mock.CoroutineMock( + return_value=DIDInfo( + "did", + "verkey", + {"meta": "data"}, + method=DIDMethod.SOV, + key_type=KeyType.ED25519, + ) ) - ) + ), ) with async_mock.patch.object( TransactionRecord, "retrieve_by_id", async_mock.CoroutineMock() - ) as mock_txn_rec_retrieve: + ) as mock_txn_rec_retrieve, async_mock.patch.object( + self.context.profile, + "session", + async_mock.MagicMock(return_value=self.session), + ) as mock_session: mock_txn_rec_retrieve.side_effect = test_module.StorageNotFoundError() with self.assertRaises(test_module.web.HTTPNotFound): @@ -826,23 +904,30 @@ async def test_refuse_transaction_response_not_found_x(self): async def test_refuse_transaction_response_conn_base_model_x(self): self.request.match_info = {"tran_id": "dummy"} - self.session_inject[BaseWallet] = async_mock.MagicMock( - get_public_did=async_mock.CoroutineMock( - return_value=DIDInfo( - "did", - "verkey", - {"meta": "data"}, - method=DIDMethod.SOV, - key_type=KeyType.ED25519, + self.session.context.injector.bind_instance( + BaseWallet, + async_mock.MagicMock( + get_public_did=async_mock.CoroutineMock( + return_value=DIDInfo( + "did", + "verkey", + {"meta": "data"}, + method=DIDMethod.SOV, + key_type=KeyType.ED25519, + ) ) - ) + ), ) with async_mock.patch.object( ConnRecord, "retrieve_by_id", async_mock.CoroutineMock() ) as mock_conn_rec_retrieve, async_mock.patch.object( TransactionRecord, "retrieve_by_id", async_mock.CoroutineMock() - ) as mock_txn_rec_retrieve: + ) as mock_txn_rec_retrieve, async_mock.patch.object( + self.context.profile, + "session", + async_mock.MagicMock(return_value=self.session), + ) as mock_session: mock_conn_rec_retrieve.side_effect = test_module.BaseModelError() mock_txn_rec_retrieve.return_value = async_mock.MagicMock( serialize=async_mock.MagicMock(return_value={"...": "..."}) @@ -854,23 +939,30 @@ async def 
test_refuse_transaction_response_conn_base_model_x(self): async def test_refuse_transaction_response_no_jobs_x(self): self.request.match_info = {"tran_id": "dummy"} - self.session_inject[BaseWallet] = async_mock.MagicMock( - get_public_did=async_mock.CoroutineMock( - return_value=DIDInfo( - "did", - "verkey", - {"meta": "data"}, - method=DIDMethod.SOV, - key_type=KeyType.ED25519, + self.session.context.injector.bind_instance( + BaseWallet, + async_mock.MagicMock( + get_public_did=async_mock.CoroutineMock( + return_value=DIDInfo( + "did", + "verkey", + {"meta": "data"}, + method=DIDMethod.SOV, + key_type=KeyType.ED25519, + ) ) - ) + ), ) with async_mock.patch.object( ConnRecord, "retrieve_by_id", async_mock.CoroutineMock() ) as mock_conn_rec_retrieve, async_mock.patch.object( TransactionRecord, "retrieve_by_id", async_mock.CoroutineMock() - ) as mock_txn_rec_retrieve: + ) as mock_txn_rec_retrieve, async_mock.patch.object( + self.context.profile, + "session", + async_mock.MagicMock(return_value=self.session), + ) as mock_session: mock_conn_rec_retrieve.return_value = async_mock.MagicMock( metadata_get=async_mock.CoroutineMock(return_value=None) ) @@ -884,23 +976,30 @@ async def test_refuse_transaction_response_no_jobs_x(self): async def test_refuse_transaction_response_wrong_my_job_x(self): self.request.match_info = {"tran_id": "dummy"} - self.session_inject[BaseWallet] = async_mock.MagicMock( - get_public_did=async_mock.CoroutineMock( - return_value=DIDInfo( - "did", - "verkey", - {"meta": "data"}, - method=DIDMethod.SOV, - key_type=KeyType.ED25519, + self.session.context.injector.bind_instance( + BaseWallet, + async_mock.MagicMock( + get_public_did=async_mock.CoroutineMock( + return_value=DIDInfo( + "did", + "verkey", + {"meta": "data"}, + method=DIDMethod.SOV, + key_type=KeyType.ED25519, + ) ) - ) + ), ) with async_mock.patch.object( ConnRecord, "retrieve_by_id", async_mock.CoroutineMock() ) as mock_conn_rec_retrieve, async_mock.patch.object( TransactionRecord, 
"retrieve_by_id", async_mock.CoroutineMock() - ) as mock_txn_rec_retrieve: + ) as mock_txn_rec_retrieve, async_mock.patch.object( + self.context.profile, + "session", + async_mock.MagicMock(return_value=self.session), + ) as mock_session: mock_conn_rec_retrieve.return_value = async_mock.MagicMock( metadata_get=async_mock.CoroutineMock( return_value={ @@ -920,7 +1019,7 @@ async def test_refuse_transaction_response_wrong_my_job_x(self): async def test_refuse_transaction_response_txn_mgr_x(self): self.request.match_info = {"tran_id": "dummy"} - self.session_inject[BaseWallet] = async_mock.MagicMock( + self.session.context.injector.bind_instance( BaseWallet, async_mock.MagicMock( get_public_did=async_mock.CoroutineMock( @@ -943,7 +1042,11 @@ async def test_refuse_transaction_response_txn_mgr_x(self): test_module, "TransactionManager", async_mock.MagicMock() ) as mock_txn_mgr, async_mock.patch.object( test_module.web, "json_response" - ) as mock_response: + ) as mock_response, async_mock.patch.object( + self.context.profile, + "session", + async_mock.MagicMock(return_value=self.session), + ) as mock_session: mock_txn_mgr.return_value = async_mock.MagicMock( create_refuse_response=async_mock.CoroutineMock( side_effect=test_module.TransactionManagerError() diff --git a/aries_cloudagent/protocols/introduction/v0_1/handlers/invitation_handler.py b/aries_cloudagent/protocols/introduction/v0_1/handlers/invitation_handler.py index dd0066db85..3244d2e338 100644 --- a/aries_cloudagent/protocols/introduction/v0_1/handlers/invitation_handler.py +++ b/aries_cloudagent/protocols/introduction/v0_1/handlers/invitation_handler.py @@ -27,12 +27,13 @@ async def handle(self, context: RequestContext, responder: BaseResponder): BaseIntroductionService ) if service: - await service.return_invitation( - context.connection_record.connection_id, - context.message, - await context.session(), - responder.send, - ) + async with context.profile.session() as session: + await 
service.return_invitation( + context.connection_record.connection_id, + context.message, + session, + responder.send, + ) else: raise HandlerException( "Cannot handle Invitation message with no introduction service" diff --git a/aries_cloudagent/protocols/introduction/v0_1/routes.py b/aries_cloudagent/protocols/introduction/v0_1/routes.py index b3ac7c0aaa..b8894a10fa 100644 --- a/aries_cloudagent/protocols/introduction/v0_1/routes.py +++ b/aries_cloudagent/protocols/introduction/v0_1/routes.py @@ -72,13 +72,14 @@ async def introduction_start(request: web.BaseRequest): raise web.HTTPForbidden(reason="Introduction service not available") try: - await service.start_introduction( - init_connection_id, - target_connection_id, - message, - await context.session(), - outbound_handler, - ) + async with context.profile.session() as session: + await service.start_introduction( + init_connection_id, + target_connection_id, + message, + session, + outbound_handler, + ) except (IntroductionError, StorageError) as err: raise web.HTTPBadRequest(reason=err.roll_up) from err diff --git a/aries_cloudagent/protocols/issue_credential/v1_0/handlers/credential_issue_handler.py b/aries_cloudagent/protocols/issue_credential/v1_0/handlers/credential_issue_handler.py index a69747ea39..1e6404e521 100644 --- a/aries_cloudagent/protocols/issue_credential/v1_0/handlers/credential_issue_handler.py +++ b/aries_cloudagent/protocols/issue_credential/v1_0/handlers/credential_issue_handler.py @@ -27,7 +27,7 @@ async def handle(self, context: RequestContext, responder: BaseResponder): """ r_time = get_timer() - + profile = context.profile self._logger.debug("CredentialHandler called with context %s", context) assert isinstance(context.message, CredentialIssue) self._logger.info( @@ -37,7 +37,7 @@ async def handle(self, context: RequestContext, responder: BaseResponder): if not context.connection_ready: raise HandlerException("No connection established for credential issue") - credential_manager = 
CredentialManager(context.profile) + credential_manager = CredentialManager(profile) cred_ex_record = await credential_manager.receive_credential( context.message, context.connection_record.connection_id ) # mgr only finds, saves record: on exception, saving state null is hopeless @@ -64,7 +64,7 @@ async def handle(self, context: RequestContext, responder: BaseResponder): # treat failure to store as mangled on receipt hence protocol error self._logger.exception(err) if cred_ex_record: - async with context.session() as session: + async with profile.session() as session: await cred_ex_record.save_error_state( session, reason=err.roll_up, # us: be specific diff --git a/aries_cloudagent/protocols/issue_credential/v1_0/handlers/credential_offer_handler.py b/aries_cloudagent/protocols/issue_credential/v1_0/handlers/credential_offer_handler.py index eb2aa6990c..cddc2e4010 100644 --- a/aries_cloudagent/protocols/issue_credential/v1_0/handlers/credential_offer_handler.py +++ b/aries_cloudagent/protocols/issue_credential/v1_0/handlers/credential_offer_handler.py @@ -28,7 +28,7 @@ async def handle(self, context: RequestContext, responder: BaseResponder): """ r_time = get_timer() - + profile = context.profile self._logger.debug("CredentialOfferHandler called with context %s", context) assert isinstance(context.message, CredentialOffer) self._logger.info( @@ -39,7 +39,7 @@ async def handle(self, context: RequestContext, responder: BaseResponder): if not context.connection_ready: raise HandlerException("No connection established for credential offer") - credential_manager = CredentialManager(context.profile) + credential_manager = CredentialManager(profile) cred_ex_record = await credential_manager.receive_offer( context.message, context.connection_record.connection_id ) # mgr only finds, saves record: on exception, saving state null is hopeless @@ -72,7 +72,7 @@ async def handle(self, context: RequestContext, responder: BaseResponder): ) as err: self._logger.exception(err) if 
cred_ex_record:
-                async with context.session() as session:
+                async with profile.session() as session:
                     await cred_ex_record.save_error_state(
                         session,
                         reason=err.roll_up,  # us: be specific
diff --git a/aries_cloudagent/protocols/issue_credential/v1_0/handlers/credential_proposal_handler.py b/aries_cloudagent/protocols/issue_credential/v1_0/handlers/credential_proposal_handler.py
index df9a2a34ef..fe2a94bf72 100644
--- a/aries_cloudagent/protocols/issue_credential/v1_0/handlers/credential_proposal_handler.py
+++ b/aries_cloudagent/protocols/issue_credential/v1_0/handlers/credential_proposal_handler.py
@@ -28,6 +28,7 @@ async def handle(self, context: RequestContext, responder: BaseResponder):
         """
         r_time = get_timer()
+        profile = context.profile

         self._logger.debug("CredentialProposalHandler called with context %s", context)
         assert isinstance(context.message, CredentialProposal)
@@ -39,7 +40,7 @@ async def handle(self, context: RequestContext, responder: BaseResponder):
         if not context.connection_ready:
             raise HandlerException("No connection established for credential proposal")

-        credential_manager = CredentialManager(context.profile)
+        credential_manager = CredentialManager(profile)
         cred_ex_record = await credential_manager.receive_proposal(
             context.message, context.connection_record.connection_id
         )  # mgr only finds, saves record: on exception, saving state null is hopeless
@@ -73,7 +74,7 @@ async def handle(self, context: RequestContext, responder: BaseResponder):
         ) as err:
             self._logger.exception(err)
             if cred_ex_record:
-                async with context.session() as session:
+                async with profile.session() as session:
                     await cred_ex_record.save_error_state(
                         session,
                         reason=err.roll_up,  # us: be specific
diff --git a/aries_cloudagent/protocols/issue_credential/v1_0/handlers/credential_request_handler.py b/aries_cloudagent/protocols/issue_credential/v1_0/handlers/credential_request_handler.py
index e5282596ea..39ff2e73b5 100644
--- a/aries_cloudagent/protocols/issue_credential/v1_0/handlers/credential_request_handler.py
+++ b/aries_cloudagent/protocols/issue_credential/v1_0/handlers/credential_request_handler.py
@@ -28,7 +28,7 @@ async def handle(self, context: RequestContext, responder: BaseResponder):
         """
         r_time = get_timer()
-
+        profile = context.profile
         self._logger.debug("CredentialRequestHandler called with context %s", context)
         assert isinstance(context.message, CredentialRequest)
         self._logger.info(
@@ -39,7 +39,7 @@ async def handle(self, context: RequestContext, responder: BaseResponder):
         if not context.connection_ready:
             raise HandlerException("No connection established for credential request")

-        credential_manager = CredentialManager(context.profile)
+        credential_manager = CredentialManager(profile)
         cred_ex_record = await credential_manager.receive_request(
             context.message, context.connection_record.connection_id
         )  # mgr only finds, saves record: on exception, saving state null is hopeless
@@ -76,7 +76,7 @@ async def handle(self, context: RequestContext, responder: BaseResponder):
         ) as err:
             self._logger.exception(err)
             if cred_ex_record:
-                async with context.session() as session:
+                async with profile.session() as session:
                     await cred_ex_record.save_error_state(
                         session,
                         reason=err.roll_up,  # us: be specific
diff --git a/aries_cloudagent/protocols/issue_credential/v1_0/manager.py b/aries_cloudagent/protocols/issue_credential/v1_0/manager.py
index 99221070d6..8dc8116888 100644
--- a/aries_cloudagent/protocols/issue_credential/v1_0/manager.py
+++ b/aries_cloudagent/protocols/issue_credential/v1_0/manager.py
@@ -11,7 +11,11 @@
 from ....core.profile import Profile
 from ....indy.holder import IndyHolder, IndyHolderError
 from ....indy.issuer import IndyIssuer, IndyIssuerRevocationRegistryFullError
-from ....ledger.base import BaseLedger
+from ....ledger.multiple_ledger.ledger_requests_executor import (
+    GET_CRED_DEF,
+    GET_SCHEMA,
+    IndyLedgerRequestsExecutor,
+)
 from ....messaging.credential_definitions.util import (
     CRED_DEF_TAGS,
     CRED_DEF_SENT_RECORD_TYPE,
@@ -260,7 +264,15 @@ async def _create(cred_def_id):
         credential_preview = credential_proposal_message.credential_proposal

         # vet attributes
-        ledger = self._profile.inject(BaseLedger)
+        ledger_exec_inst = self._profile.inject(IndyLedgerRequestsExecutor)
+        ledger_info = await ledger_exec_inst.get_ledger_for_identifier(
+            cred_def_id,
+            txn_record_type=GET_CRED_DEF,
+        )
+        if isinstance(ledger_info, tuple):
+            ledger = ledger_info[1]
+        else:
+            ledger = ledger_info
         async with ledger:
             schema_id = await ledger.credential_definition_id2schema_id(cred_def_id)
             schema = await ledger.get_schema(schema_id)
@@ -392,7 +404,15 @@ async def create_request(
         cred_offer_ser = cred_ex_record._credential_offer.ser

         async def _create():
-            ledger = self._profile.inject(BaseLedger)
+            ledger_exec_inst = self._profile.inject(IndyLedgerRequestsExecutor)
+            ledger_info = await ledger_exec_inst.get_ledger_for_identifier(
+                credential_definition_id,
+                txn_record_type=GET_CRED_DEF,
+            )
+            if isinstance(ledger_info, tuple):
+                ledger = ledger_info[1]
+            else:
+                ledger = ledger_info
             async with ledger:
                 credential_definition = await ledger.get_credential_definition(
                     credential_definition_id
@@ -532,8 +552,15 @@ async def issue_credential(
         else:
             cred_offer_ser = cred_ex_record._credential_offer.ser
             cred_req_ser = cred_ex_record._credential_request.ser
-
-        ledger = self._profile.inject(BaseLedger)
+        ledger_exec_inst = self._profile.inject(IndyLedgerRequestsExecutor)
+        ledger_info = await ledger_exec_inst.get_ledger_for_identifier(
+            schema_id,
+            txn_record_type=GET_SCHEMA,
+        )
+        if isinstance(ledger_info, tuple):
+            ledger = ledger_info[1]
+        else:
+            ledger = ledger_info
         async with ledger:
             schema = await ledger.get_schema(schema_id)
             credential_definition = await ledger.get_credential_definition(
@@ -739,7 +766,15 @@ async def store_credential(
         raw_cred_serde = cred_ex_record._raw_credential

         revoc_reg_def = None
-        ledger = self._profile.inject(BaseLedger)
+        ledger_exec_inst = self._profile.inject(IndyLedgerRequestsExecutor)
+        ledger_info = await ledger_exec_inst.get_ledger_for_identifier(
+            raw_cred_serde.de.cred_def_id,
+            txn_record_type=GET_CRED_DEF,
+        )
+        if isinstance(ledger_info, tuple):
+            ledger = ledger_info[1]
+        else:
+            ledger = ledger_info
         async with ledger:
             credential_definition = await ledger.get_credential_definition(
                 raw_cred_serde.de.cred_def_id
diff --git a/aries_cloudagent/protocols/issue_credential/v1_0/routes.py b/aries_cloudagent/protocols/issue_credential/v1_0/routes.py
index 8cb0211a80..fd8c7b3f59 100644
--- a/aries_cloudagent/protocols/issue_credential/v1_0/routes.py
+++ b/aries_cloudagent/protocols/issue_credential/v1_0/routes.py
@@ -321,7 +321,7 @@ async def credential_exchange_list(request: web.BaseRequest):
     }

     try:
-        async with context.session() as session:
+        async with context.profile.session() as session:
             records = await V10CredentialExchange.query(
                 session=session,
                 tag_filter=tag_filter,
@@ -357,7 +357,7 @@ async def credential_exchange_retrieve(request: web.BaseRequest):
     credential_exchange_id = request.match_info["cred_ex_id"]
     cred_ex_record = None
     try:
-        async with context.session() as session:
+        async with context.profile.session() as session:
             cred_ex_record = await V10CredentialExchange.retrieve_by_id(
                 session, credential_exchange_id
             )
@@ -479,6 +479,7 @@ async def credential_exchange_send(request: web.BaseRequest):
     r_time = get_timer()

     context: AdminRequestContext = request["context"]
+    profile = context.profile
     outbound_handler = request["outbound_message_router"]

     body = await request.json()
@@ -495,7 +496,7 @@ async def credential_exchange_send(request: web.BaseRequest):
     cred_ex_record = None
     try:
         preview = CredentialPreview.deserialize(preview_spec)
-        async with context.session() as session:
+        async with profile.session() as session:
             connection_record = await ConnRecord.retrieve_by_id(session, connection_id)
             if not connection_record.is_ready:
                 raise web.HTTPForbidden(reason=f"Connection {connection_id} not ready")
@@ -516,7 +517,7 @@ async def credential_exchange_send(request: web.BaseRequest):
         outcome="credential_exchange_send.START",
     )

-    credential_manager = CredentialManager(context.profile)
+    credential_manager = CredentialManager(profile)
     (
         cred_ex_record,
         credential_offer_message,
@@ -530,7 +531,7 @@ async def credential_exchange_send(request: web.BaseRequest):
     except (BaseModelError, CredentialManagerError, LedgerError, StorageError) as err:
         if cred_ex_record:
-            async with context.session() as session:
+            async with profile.session() as session:
                 await cred_ex_record.save_error_state(session, reason=err.roll_up)
         await report_problem(
             err,
@@ -574,6 +575,7 @@ async def credential_exchange_send_proposal(request: web.BaseRequest):
     r_time = get_timer()

     context: AdminRequestContext = request["context"]
+    profile = context.profile
     outbound_handler = request["outbound_message_router"]

     body = await request.json()
@@ -588,12 +590,12 @@ async def credential_exchange_send_proposal(request: web.BaseRequest):
     cred_ex_record = None
     try:
         preview = CredentialPreview.deserialize(preview_spec) if preview_spec else None
-        async with context.session() as session:
+        async with profile.session() as session:
             connection_record = await ConnRecord.retrieve_by_id(session, connection_id)
             if not connection_record.is_ready:
                 raise web.HTTPForbidden(reason=f"Connection {connection_id} not ready")

-        credential_manager = CredentialManager(context.profile)
+        credential_manager = CredentialManager(profile)
         cred_ex_record = await credential_manager.create_proposal(
             connection_id,
             comment=comment,
@@ -608,7 +610,7 @@ async def credential_exchange_send_proposal(request: web.BaseRequest):
     except (BaseModelError, StorageError) as err:
         if cred_ex_record:
-            async with context.session() as session:
+            async with profile.session() as session:
                 await cred_ex_record.save_error_state(session, reason=err.roll_up)
         await report_problem(
             err,
@@ -702,7 +704,7 @@ async def credential_exchange_create_free_offer(request: web.BaseRequest):
     r_time = get_timer()

     context: AdminRequestContext = request["context"]
-
+    profile = context.profile
     body = await request.json()

     cred_def_id = body.get("cred_def_id")
@@ -722,7 +724,7 @@ async def credential_exchange_create_free_offer(request: web.BaseRequest):
     cred_ex_record = None
     try:
         (cred_ex_record, credential_offer_message) = await _create_free_offer(
-            profile=context.profile,
+            profile=profile,
             cred_def_id=cred_def_id,
             auto_issue=auto_issue,
             auto_remove=auto_remove,
@@ -739,7 +741,7 @@ async def credential_exchange_create_free_offer(request: web.BaseRequest):
         StorageError,
     ) as err:
         if cred_ex_record:
-            async with context.session() as session:
+            async with profile.session() as session:
                 await cred_ex_record.save_error_state(session, reason=err.roll_up)
         raise web.HTTPBadRequest(reason=err.roll_up)
     trace_event(
@@ -774,6 +776,7 @@ async def credential_exchange_send_free_offer(request: web.BaseRequest):
     r_time = get_timer()

     context: AdminRequestContext = request["context"]
+    profile = context.profile
     outbound_handler = request["outbound_message_router"]

     body = await request.json()
@@ -796,13 +799,13 @@ async def credential_exchange_send_free_offer(request: web.BaseRequest):
     cred_ex_record = None
     connection_record = None
     try:
-        async with context.session() as session:
+        async with profile.session() as session:
             connection_record = await ConnRecord.retrieve_by_id(session, connection_id)
             if not connection_record.is_ready:
                 raise web.HTTPForbidden(reason=f"Connection {connection_id} not ready")

         cred_ex_record, credential_offer_message = await _create_free_offer(
-            profile=context.profile,
+            profile=profile,
             cred_def_id=cred_def_id,
             connection_id=connection_id,
             auto_issue=auto_issue,
@@ -820,7 +823,7 @@ async def credential_exchange_send_free_offer(request: web.BaseRequest):
         LedgerError,
     ) as err:
         if cred_ex_record:
-            async with context.session() as session:
+            async with profile.session() as session:
                 await cred_ex_record.save_error_state(session, reason=err.roll_up)
         await report_problem(
             err,
@@ -866,6 +869,7 @@ async def credential_exchange_send_bound_offer(request: web.BaseRequest):
     r_time = get_timer()

     context: AdminRequestContext = request["context"]
+    profile = context.profile
     outbound_handler = request["outbound_message_router"]

     body = await request.json() if request.body_exists else {}
@@ -875,7 +879,7 @@ async def credential_exchange_send_bound_offer(request: web.BaseRequest):
     cred_ex_record = None
     connection_record = None
     try:
-        async with context.session() as session:
+        async with profile.session() as session:
             try:
                 cred_ex_record = await V10CredentialExchange.retrieve_by_id(
                     session, credential_exchange_id
@@ -897,7 +901,7 @@ async def credential_exchange_send_bound_offer(request: web.BaseRequest):
             if not connection_record.is_ready:
                 raise web.HTTPForbidden(reason=f"Connection {connection_id} not ready")

-        credential_manager = CredentialManager(context.profile)
+        credential_manager = CredentialManager(profile)
         (
             cred_ex_record,
             credential_offer_message,
@@ -919,7 +923,7 @@ async def credential_exchange_send_bound_offer(request: web.BaseRequest):
         StorageError,
     ) as err:
         if cred_ex_record:
-            async with context.session() as session:
+            async with profile.session() as session:
                 await cred_ex_record.save_error_state(session, reason=err.roll_up)
         await report_problem(
             err,
@@ -961,6 +965,7 @@ async def credential_exchange_send_request(request: web.BaseRequest):
     r_time = get_timer()

     context: AdminRequestContext = request["context"]
+    profile = context.profile
     outbound_handler = request["outbound_message_router"]

     credential_exchange_id = request.match_info["cred_ex_id"]
@@ -968,7 +973,7 @@ async def credential_exchange_send_request(request: web.BaseRequest):
     cred_ex_record = None
     connection_record = None
     try:
-        async with context.session() as session:
+        async with profile.session() as session:
             try:
                 cred_ex_record = await V10CredentialExchange.retrieve_by_id(
                     session, credential_exchange_id
@@ -984,7 +989,7 @@ async def credential_exchange_send_request(request: web.BaseRequest):
             if not connection_record.is_ready:
                 raise web.HTTPForbidden(reason=f"Connection {connection_id} not ready")

-        credential_manager = CredentialManager(context.profile)
+        credential_manager = CredentialManager(profile)
         (
             cred_ex_record,
             credential_request_message,
@@ -1002,7 +1007,7 @@ async def credential_exchange_send_request(request: web.BaseRequest):
         StorageError,
     ) as err:
         if cred_ex_record:
-            async with context.session() as session:
+            async with profile.session() as session:
                 await cred_ex_record.save_error_state(session, reason=err.roll_up)
         await report_problem(
             err,
@@ -1045,6 +1050,7 @@ async def credential_exchange_issue(request: web.BaseRequest):
     r_time = get_timer()

     context: AdminRequestContext = request["context"]
+    profile = context.profile
     outbound_handler = request["outbound_message_router"]

     body = await request.json()
@@ -1055,7 +1061,7 @@ async def credential_exchange_issue(request: web.BaseRequest):
     cred_ex_record = None
     connection_record = None
     try:
-        async with context.session() as session:
+        async with profile.session() as session:
             try:
                 cred_ex_record = await V10CredentialExchange.retrieve_by_id(
                     session, credential_exchange_id
@@ -1068,7 +1074,7 @@ async def credential_exchange_issue(request: web.BaseRequest):
             if not connection_record.is_ready:
                 raise web.HTTPForbidden(reason=f"Connection {connection_id} not ready")

-        credential_manager = CredentialManager(context.profile)
+        credential_manager = CredentialManager(profile)
         (
             cred_ex_record,
             credential_issue_message,
@@ -1084,7 +1090,7 @@ async def credential_exchange_issue(request: web.BaseRequest):
         StorageError,
     ) as err:
         if cred_ex_record:
-            async with context.session() as session:
+            async with profile.session() as session:
                 await cred_ex_record.save_error_state(session, reason=err.roll_up)
         await report_problem(
             err,
@@ -1127,6 +1133,7 @@ async def credential_exchange_store(request: web.BaseRequest):
    r_time = get_timer()

    context: AdminRequestContext = request["context"]
+    profile = context.profile
    outbound_handler = request["outbound_message_router"]

    try:
@@ -1140,7 +1147,7 @@ async def credential_exchange_store(request: web.BaseRequest):
     cred_ex_record = None
     connection_record = None
     try:
-        async with context.session() as session:
+        async with profile.session() as session:
             try:
                 cred_ex_record = await V10CredentialExchange.retrieve_by_id(
                     session, credential_exchange_id
@@ -1153,7 +1160,7 @@ async def credential_exchange_store(request: web.BaseRequest):
             if not connection_record.is_ready:
                 raise web.HTTPForbidden(reason=f"Connection {connection_id} not ready")

-        credential_manager = CredentialManager(context.profile)
+        credential_manager = CredentialManager(profile)
         cred_ex_record = await credential_manager.store_credential(
             cred_ex_record,
             credential_id,
@@ -1165,7 +1172,7 @@ async def credential_exchange_store(request: web.BaseRequest):
         StorageError,
     ) as err:  # treat failure to store as mangled on receipt hence protocol error
         if cred_ex_record:
-            async with context.session() as session:
+            async with profile.session() as session:
                 await cred_ex_record.save_error_state(session, reason=err.roll_up)
         await report_problem(
             err,
@@ -1223,7 +1230,7 @@ async def credential_exchange_problem_report(request: web.BaseRequest):
     description = body["description"]

     try:
-        async with context.session() as session:
+        async with context.profile.session() as session:
             cred_ex_record = await V10CredentialExchange.retrieve_by_id(
                 session, credential_exchange_id
             )
@@ -1261,7 +1268,7 @@ async def credential_exchange_remove(request: web.BaseRequest):
     credential_exchange_id = request.match_info["cred_ex_id"]
     cred_ex_record = None
     try:
-        async with context.session() as session:
+        async with context.profile.session() as session:
             cred_ex_record = await V10CredentialExchange.retrieve_by_id(
                 session, credential_exchange_id
             )
diff --git a/aries_cloudagent/protocols/issue_credential/v1_0/tests/test_manager.py b/aries_cloudagent/protocols/issue_credential/v1_0/tests/test_manager.py
index 486f3134bc..4e8f52b4b7 100644
--- a/aries_cloudagent/protocols/issue_credential/v1_0/tests/test_manager.py
+++ b/aries_cloudagent/protocols/issue_credential/v1_0/tests/test_manager.py
@@ -14,6 +14,9 @@
 from .....messaging.credential_definitions.util import CRED_DEF_SENT_RECORD_TYPE
 from .....messaging.responder import BaseResponder, MockResponder
 from .....ledger.base import BaseLedger
+from .....ledger.multiple_ledger.ledger_requests_executor import (
+    IndyLedgerRequestsExecutor,
+)
 from .....storage.base import StorageRecord
 from .....storage.error import StorageNotFoundError
@@ -67,7 +70,14 @@ async def setUp(self):
             return_value=SCHEMA_ID
         )
         self.context.injector.bind_instance(BaseLedger, self.ledger)
-
+        self.context.injector.bind_instance(
+            IndyLedgerRequestsExecutor,
+            async_mock.MagicMock(
+                get_ledger_for_identifier=async_mock.CoroutineMock(
+                    return_value=self.ledger
+                )
+            ),
+        )
         self.manager = CredentialManager(self.profile)
         assert self.manager.profile
@@ -934,7 +944,14 @@ async def test_issue_credential_non_revocable(self):
         self.ledger.__aenter__ = async_mock.CoroutineMock(return_value=self.ledger)
         self.context.injector.clear_binding(BaseLedger)
         self.context.injector.bind_instance(BaseLedger, self.ledger)
-
+        self.context.injector.bind_instance(
+            IndyLedgerRequestsExecutor,
+            async_mock.MagicMock(
+                get_ledger_for_identifier=async_mock.CoroutineMock(
+                    return_value=("test_ledger_id", self.ledger)
+                )
+            ),
+        )
         with async_mock.patch.object(
             V10CredentialExchange, "save", autospec=True
         ) as save_ex:
@@ -1095,7 +1112,14 @@ async def test_issue_credential_no_active_rr_no_retries(self):
             return_value=(json.dumps(cred), cred_rev_id)
         )
         self.context.injector.bind_instance(IndyIssuer, issuer)
-
+        self.context.injector.bind_instance(
+            IndyLedgerRequestsExecutor,
+            async_mock.MagicMock(
+                get_ledger_for_identifier=async_mock.CoroutineMock(
+                    return_value=("test_ledger_id", self.ledger)
+                )
+            ),
+        )
         with async_mock.patch.object(
             test_module, "IssuerRevRegRecord", autospec=True
         ) as issuer_rr_rec, async_mock.patch.object(
@@ -1152,7 +1176,14 @@ async def test_issue_credential_no_active_rr_retry(self):
             return_value=(json.dumps(cred), cred_rev_id)
         )
         self.context.injector.bind_instance(IndyIssuer, issuer)
-
+        self.context.injector.bind_instance(
+            IndyLedgerRequestsExecutor,
+            async_mock.MagicMock(
+                get_ledger_for_identifier=async_mock.CoroutineMock(
+                    return_value=("test_ledger_id", self.ledger)
+                )
+            ),
+        )
         with async_mock.patch.object(
             test_module, "IssuerRevRegRecord", autospec=True
         ) as issuer_rr_rec, async_mock.patch.object(
@@ -1212,7 +1243,14 @@ async def test_issue_credential_rr_full(self):
             side_effect=test_module.IndyIssuerRevocationRegistryFullError("Nope")
         )
         self.context.injector.bind_instance(IndyIssuer, issuer)
-
+        self.context.injector.bind_instance(
+            IndyLedgerRequestsExecutor,
+            async_mock.MagicMock(
+                get_ledger_for_identifier=async_mock.CoroutineMock(
+                    return_value=("test_ledger_id", self.ledger)
+                )
+            ),
+        )
         with async_mock.patch.object(
             test_module, "IndyRevocation", autospec=True
         ) as revoc:
@@ -1308,7 +1346,14 @@ async def test_store_credential(self):
             return_value=json.dumps(INDY_CRED_INFO)
         )
         self.context.injector.bind_instance(IndyHolder, holder)
-
+        self.context.injector.bind_instance(
+            IndyLedgerRequestsExecutor,
+            async_mock.MagicMock(
+                get_ledger_for_identifier=async_mock.CoroutineMock(
+                    return_value=("test_ledger_id", self.ledger)
+                )
+            ),
+        )
         with async_mock.patch.object(
             test_module, "RevocationRegistry", autospec=True
         ) as mock_rev_reg, async_mock.patch.object(
@@ -1404,7 +1449,14 @@ async def test_store_credential_no_preview(self):
             return_value=json.dumps(cred_info_no_rev)
         )
         self.context.injector.bind_instance(IndyHolder, holder)
-
+        self.context.injector.bind_instance(
+            IndyLedgerRequestsExecutor,
+            async_mock.MagicMock(
+                get_ledger_for_identifier=async_mock.CoroutineMock(
+                    return_value=("test_ledger_id", self.ledger)
+                )
+            ),
+        )
         with async_mock.patch.object(
             V10CredentialExchange, "save", autospec=True
         ) as save_ex, async_mock.patch.object(
@@ -1464,7 +1516,14 @@ async def test_store_credential_holder_store_indy_error(self):
             side_effect=test_module.IndyHolderError("Problem", {"message": "Nope"})
         )
         self.context.injector.bind_instance(IndyHolder, holder)
-
+        self.context.injector.bind_instance(
+            IndyLedgerRequestsExecutor,
+            async_mock.MagicMock(
+                get_ledger_for_identifier=async_mock.CoroutineMock(
+                    return_value=("test_ledger_id", self.ledger)
+                )
+            ),
+        )
         with self.assertRaises(test_module.IndyHolderError):
             await self.manager.store_credential(
                 cred_ex_record=stored_exchange, credential_id=cred_id
diff --git a/aries_cloudagent/protocols/issue_credential/v2_0/formats/indy/handler.py b/aries_cloudagent/protocols/issue_credential/v2_0/formats/indy/handler.py
index 5a462c501f..9604b9e278 100644
--- a/aries_cloudagent/protocols/issue_credential/v2_0/formats/indy/handler.py
+++ b/aries_cloudagent/protocols/issue_credential/v2_0/formats/indy/handler.py
@@ -14,6 +14,11 @@
 from ......indy.models.cred_request import IndyCredRequestSchema
 from ......indy.models.cred_abstract import IndyCredAbstractSchema
 from ......ledger.base import BaseLedger
+from ......ledger.multiple_ledger.ledger_requests_executor import (
+    GET_CRED_DEF,
+    GET_SCHEMA,
+    IndyLedgerRequestsExecutor,
+)
 from ......messaging.credential_definitions.util import (
     CRED_DEF_SENT_RECORD_TYPE,
     CredDefQueryStringSchema,
@@ -196,6 +201,15 @@ async def _create():
             offer_json = await issuer.create_credential_offer(cred_def_id)
             return json.loads(offer_json)

+        ledger_exec_inst = self._profile.inject(IndyLedgerRequestsExecutor)
+        ledger_info = await ledger_exec_inst.get_ledger_for_identifier(
+            cred_def_id,
+            txn_record_type=GET_CRED_DEF,
+        )
+        if isinstance(ledger_info, tuple):
+            ledger = ledger_info[1]
+        else:
+            ledger = ledger_info
         async with ledger:
             schema_id = await ledger.credential_definition_id2schema_id(cred_def_id)
             schema = await ledger.get_schema(schema_id)
@@ -250,7 +264,15 @@ async def create_request(
         cred_def_id = cred_offer["cred_def_id"]

         async def _create():
-            ledger = self.profile.inject(BaseLedger)
+            ledger_exec_inst = self._profile.inject(IndyLedgerRequestsExecutor)
+            ledger_info = await ledger_exec_inst.get_ledger_for_identifier(
+                cred_def_id,
+                txn_record_type=GET_CRED_DEF,
+            )
+            if isinstance(ledger_info, tuple):
+                ledger = ledger_info[1]
+            else:
+                ledger = ledger_info
             async with ledger:
                 cred_def = await ledger.get_credential_definition(cred_def_id)
@@ -312,8 +334,15 @@ async def issue_credential(
             rev_reg_id = None
             rev_reg = None
-
-        ledger = self.profile.inject(BaseLedger)
+        ledger_exec_inst = self._profile.inject(IndyLedgerRequestsExecutor)
+        ledger_info = await ledger_exec_inst.get_ledger_for_identifier(
+            schema_id,
+            txn_record_type=GET_SCHEMA,
+        )
+        if isinstance(ledger_info, tuple):
+            ledger = ledger_info[1]
+        else:
+            ledger = ledger_info
         async with ledger:
             schema = await ledger.get_schema(schema_id)
             cred_def = await ledger.get_credential_definition(cred_def_id)
@@ -451,7 +480,15 @@ async def store_credential(
         cred = cred_ex_record.cred_issue.attachment(IndyCredFormatHandler.format)

         rev_reg_def = None
-        ledger = self.profile.inject(BaseLedger)
+        ledger_exec_inst = self._profile.inject(IndyLedgerRequestsExecutor)
+        ledger_info = await ledger_exec_inst.get_ledger_for_identifier(
+            cred["cred_def_id"],
+            txn_record_type=GET_CRED_DEF,
+        )
+        if isinstance(ledger_info, tuple):
+            ledger = ledger_info[1]
+        else:
+            ledger = ledger_info
         async with ledger:
             cred_def = await ledger.get_credential_definition(cred["cred_def_id"])
             if cred.get("rev_reg_id"):
diff --git a/aries_cloudagent/protocols/issue_credential/v2_0/formats/indy/tests/test_handler.py b/aries_cloudagent/protocols/issue_credential/v2_0/formats/indy/tests/test_handler.py
index 11314fdce4..55dfc0e378 100644
--- a/aries_cloudagent/protocols/issue_credential/v2_0/formats/indy/tests/test_handler.py
+++ b/aries_cloudagent/protocols/issue_credential/v2_0/formats/indy/tests/test_handler.py
@@ -10,6 +10,9 @@
 from .......core.in_memory import InMemoryProfile
 from .......ledger.base import BaseLedger
+from .......ledger.multiple_ledger.ledger_requests_executor import (
+    IndyLedgerRequestsExecutor,
+)
 from .......indy.issuer import IndyIssuer
 from .......cache.in_memory import InMemoryCache
 from .......cache.base import BaseCache
@@ -214,7 +217,14 @@ async def setUp(self):
             return_value=SCHEMA_ID
         )
         self.context.injector.bind_instance(BaseLedger, self.ledger)
-
+        self.context.injector.bind_instance(
+            IndyLedgerRequestsExecutor,
+            async_mock.MagicMock(
+                get_ledger_for_identifier=async_mock.CoroutineMock(
+                    return_value=self.ledger
+                )
+            ),
+        )
         # Context
         self.cache = InMemoryCache()
         self.context.injector.bind_instance(BaseCache, self.cache)
diff --git a/aries_cloudagent/protocols/issue_credential/v2_0/routes.py b/aries_cloudagent/protocols/issue_credential/v2_0/routes.py
index cd27db11f4..27bfe4938b 100644
--- a/aries_cloudagent/protocols/issue_credential/v2_0/routes.py
+++ b/aries_cloudagent/protocols/issue_credential/v2_0/routes.py
@@ -659,7 +659,7 @@ async def credential_exchange_send(request: web.BaseRequest):
         V20CredFormatError,
     ) as err:
         if cred_ex_record:
-            async with context.session() as session:
+            async with profile.session() as session:
                 await cred_ex_record.save_error_state(session, reason=err.roll_up)
         await report_problem(
             err,
@@ -747,7 +747,7 @@ async def credential_exchange_send_proposal(request: web.BaseRequest):
     except (BaseModelError, StorageError) as err:
         if cred_ex_record:
-            async with context.session() as session:
+            async with profile.session() as session:
                 await cred_ex_record.save_error_state(session, reason=err.roll_up)
         await report_problem(
             err,
diff --git a/aries_cloudagent/protocols/present_proof/indy/pres_exch_handler.py b/aries_cloudagent/protocols/present_proof/indy/pres_exch_handler.py
index 2e6b8fdc69..c2cce65d4b 100644
--- a/aries_cloudagent/protocols/present_proof/indy/pres_exch_handler.py
+++ b/aries_cloudagent/protocols/present_proof/indy/pres_exch_handler.py
@@ -9,7 +9,11 @@
 from ....core.profile import Profile
 from ....indy.holder import IndyHolder, IndyHolderError
 from ....indy.models.xform import indy_proof_req2non_revoc_intervals
-from ....ledger.base import BaseLedger
+from ....ledger.multiple_ledger.ledger_requests_executor import (
+    GET_SCHEMA,
+    GET_REVOC_REG_DELTA,
+    IndyLedgerRequestsExecutor,
+)
 from ....revocation.models.revocation_registry import RevocationRegistry
 from ..v1_0.models.presentation_exchange import V10PresentationExchange
@@ -83,13 +87,22 @@ async def return_presentation(
                     f"{reft} for non-revocable credential {req_item['cred_id']}"
                 )
         # Get all schemas, credential definitions, and revocation registries in use
-        ledger = self._profile.inject(BaseLedger)
         schemas = {}
         cred_defs = {}
         revocation_registries = {}
-        async with ledger:
-            for credential in credentials.values():
-                schema_id = credential["schema_id"]
+
+        for credential in credentials.values():
+            schema_id = credential["schema_id"]
+            ledger_exec_inst = self._profile.inject(IndyLedgerRequestsExecutor)
+            ledger_info = await ledger_exec_inst.get_ledger_for_identifier(
+                schema_id,
+                txn_record_type=GET_SCHEMA,
+            )
+            if isinstance(ledger_info, tuple):
+                ledger = ledger_info[1]
+            else:
+                ledger = ledger_info
+            async with ledger:
                 if schema_id not in schemas:
                     schemas[schema_id] = await ledger.get_schema(schema_id)
                 cred_def_id = credential["cred_def_id"]
@@ -109,14 +122,23 @@ async def return_presentation(
         # of the presentation request or attributes
         epoch_now = int(time.time())
         revoc_reg_deltas = {}
-        async with ledger:
-            for precis in requested_referents.values():  # cred_id, non-revoc interval
-                credential_id = precis["cred_id"]
-                if not credentials[credential_id].get("rev_reg_id"):
-                    continue
-                if "timestamp" in precis:
-                    continue
-                rev_reg_id = credentials[credential_id]["rev_reg_id"]
+        for precis in requested_referents.values():  # cred_id, non-revoc interval
+            credential_id = precis["cred_id"]
+            if not credentials[credential_id].get("rev_reg_id"):
+                continue
+            if "timestamp" in precis:
+                continue
+            rev_reg_id = credentials[credential_id]["rev_reg_id"]
+            ledger_exec_inst = self._profile.inject(IndyLedgerRequestsExecutor)
+            ledger_info = await ledger_exec_inst.get_ledger_for_identifier(
+                rev_reg_id,
+                txn_record_type=GET_REVOC_REG_DELTA,
+            )
+            if isinstance(ledger_info, tuple):
+                ledger = ledger_info[1]
+            else:
+                ledger = ledger_info
+            async with ledger:
                 reft_non_revoc_interval = precis.get("non_revoked")
                 if reft_non_revoc_interval:
                     key = (
@@ -201,12 +223,19 @@ async def process_pres_identifiers(
         rev_reg_defs = {}
         rev_reg_entries = {}

-        ledger = self._profile.inject(BaseLedger)
-        async with ledger:
-            for identifier in identifiers:
-                schema_ids.append(identifier["schema_id"])
-                cred_def_ids.append(identifier["cred_def_id"])
-
+        for identifier in identifiers:
+            schema_ids.append(identifier["schema_id"])
+            cred_def_ids.append(identifier["cred_def_id"])
+            ledger_exec_inst = self._profile.inject(IndyLedgerRequestsExecutor)
+            ledger_info = await ledger_exec_inst.get_ledger_for_identifier(
+                identifier["schema_id"],
+                txn_record_type=GET_SCHEMA,
+            )
+            if isinstance(ledger_info, tuple):
+                ledger = ledger_info[1]
+            else:
+                ledger = ledger_info
+            async with ledger:
                 # Build schemas for anoncreds
                 if identifier["schema_id"] not in schemas:
                     schemas[identifier["schema_id"]] = await ledger.get_schema(
diff --git a/aries_cloudagent/protocols/present_proof/v1_0/handlers/presentation_handler.py b/aries_cloudagent/protocols/present_proof/v1_0/handlers/presentation_handler.py
index 1fb7c191c9..40c4d73dc6 100644
--- a/aries_cloudagent/protocols/present_proof/v1_0/handlers/presentation_handler.py
+++ b/aries_cloudagent/protocols/present_proof/v1_0/handlers/presentation_handler.py
@@ -27,7 +27,7 @@ async def handle(self, context: RequestContext, responder: BaseResponder):
         """
         r_time = get_timer()
-
+        profile = context.profile
         self._logger.debug("PresentationHandler called with context %s", context)
         assert isinstance(context.message, Presentation)
         self._logger.info(
@@ -35,7 +35,7 @@ async def handle(self, context: RequestContext, responder: BaseResponder):
             context.message.serialize(as_string=True),
         )

-        presentation_manager = PresentationManager(context.profile)
+        presentation_manager = PresentationManager(profile)
         presentation_exchange_record = await presentation_manager.receive_presentation(
             context.message, context.connection_record
@@ -57,7 +57,7 @@ async def handle(self, context: RequestContext, responder: BaseResponder):
         except (BaseModelError, LedgerError, StorageError) as err:
             self._logger.exception(err)
             if presentation_exchange_record:
-                async with context.session() as session:
+                async with profile.session() as session:
                     await presentation_exchange_record.save_error_state(
                         session,
                         reason=err.roll_up,  # us: be specific
diff --git a/aries_cloudagent/protocols/present_proof/v1_0/handlers/presentation_proposal_handler.py b/aries_cloudagent/protocols/present_proof/v1_0/handlers/presentation_proposal_handler.py
index ab2abda95a..89708a4368 100644
--- a/aries_cloudagent/protocols/present_proof/v1_0/handlers/presentation_proposal_handler.py
+++ b/aries_cloudagent/protocols/present_proof/v1_0/handlers/presentation_proposal_handler.py
@@ -27,7 +27,7 @@ async def handle(self, context: RequestContext, responder: BaseResponder):
         """
         r_time = get_timer()
-
+        profile = context.profile
         self._logger.debug(
             "PresentationProposalHandler called with context %s", context
         )
@@ -42,7 +42,7 @@ async def handle(self, context: RequestContext, responder: BaseResponder):
                 "No connection established for presentation proposal"
             )

-        presentation_manager = PresentationManager(context.profile)
+        presentation_manager = PresentationManager(profile)
         presentation_exchange_record = await presentation_manager.receive_proposal(
             context.message, context.connection_record
         )  # mgr only creates, saves record: on exception, saving state null is hopeless
@@ -69,7 +69,7 @@ async def handle(self, context: RequestContext, responder: BaseResponder):
         except (BaseModelError, LedgerError, StorageError) as err:
             self._logger.exception(err)
             if presentation_exchange_record:
-                async with context.session() as session:
+                async with profile.session() as session:
                     await presentation_exchange_record.save_error_state(
                         session,
                         reason=err.roll_up,  # us: be specific
diff --git a/aries_cloudagent/protocols/present_proof/v1_0/handlers/presentation_request_handler.py b/aries_cloudagent/protocols/present_proof/v1_0/handlers/presentation_request_handler.py
index 231e2c5a31..21979940ee 100644
--- a/aries_cloudagent/protocols/present_proof/v1_0/handlers/presentation_request_handler.py
+++ b/aries_cloudagent/protocols/present_proof/v1_0/handlers/presentation_request_handler.py
@@ -31,6 +31,7 @@ async def handle(self, context: RequestContext, responder: BaseResponder):
         """
         r_time = get_timer()
+        profile = context.profile

         self._logger.debug("PresentationRequestHandler called with context %s", context)
         assert isinstance(context.message, PresentationRequest)
@@ -42,14 +43,14 @@ async def handle(self, context: RequestContext, responder: BaseResponder):
         if not context.connection_ready:
             raise HandlerException("No connection established for presentation request")

-        presentation_manager = PresentationManager(context.profile)
+        presentation_manager = PresentationManager(profile)

         indy_proof_request = context.message.indy_proof_request(0)

         # Get presentation exchange record (holder initiated via proposal)
         # or create it (verifier sent request first)
         try:
-            async with context.session() as session:
+            async with profile.session() as session:
                 (
                     presentation_exchange_record
                 ) = await V10PresentationExchange.retrieve_by_tag_filter(
@@ -124,7 +125,7 @@ async def handle(self, context: RequestContext, responder: BaseResponder):
         ) as err:
             self._logger.exception(err)
             if presentation_exchange_record:
-                async with context.session() as session:
+                async with profile.session() as session:
                     await presentation_exchange_record.save_error_state(
                         session,
                         reason=err.roll_up,  # us: be specific
diff --git a/aries_cloudagent/protocols/present_proof/v1_0/manager.py b/aries_cloudagent/protocols/present_proof/v1_0/manager.py
index 64bb5ee3c5..50cf219c41 100644
--- a/aries_cloudagent/protocols/present_proof/v1_0/manager.py
+++ b/aries_cloudagent/protocols/present_proof/v1_0/manager.py
@@ -7,7 +7,6 @@
 from ....core.error import BaseError
 from ....core.profile import Profile
 from ....indy.verifier import IndyVerifier
-from ....ledger.base import BaseLedger
 from ....messaging.decorators.attach_decorator import AttachDecorator
 from ....messaging.responder import BaseResponder
 from ....storage.error import StorageNotFoundError
@@ -137,7 +136,7 @@ async def create_bound_request(
             name=name,
             version=version,
             nonce=nonce,
-            ledger=self._profile.inject(BaseLedger),
+            profile=self._profile,
         )
         presentation_request_message = PresentationRequest(
             comment=comment,
diff --git a/aries_cloudagent/protocols/present_proof/v1_0/routes.py b/aries_cloudagent/protocols/present_proof/v1_0/routes.py
index e986aac1f0..ddc90d4ef3 100644
--- a/aries_cloudagent/protocols/present_proof/v1_0/routes.py
+++ b/aries_cloudagent/protocols/present_proof/v1_0/routes.py
@@ -202,7 +202,6 @@ async def presentation_exchange_list(request: web.BaseRequest):
     """
     context: AdminRequestContext = request["context"]
-
     tag_filter = {}
     if "thread_id" in request.query and request.query["thread_id"] != "":
         tag_filter["thread_id"] = request.query["thread_id"]
@@ -213,7 +212,7 @@ async def presentation_exchange_list(request: web.BaseRequest):
     }

     try:
-        async with context.session() as session:
+        async with context.profile.session() as session:
             records = await V10PresentationExchange.query(
                 session=session,
                 tag_filter=tag_filter,
@@ -244,12 +243,13 @@ async def presentation_exchange_retrieve(request: web.BaseRequest):
     """
     context: AdminRequestContext = request["context"]
+    profile = context.profile
     outbound_handler = request["outbound_message_router"]

     presentation_exchange_id = request.match_info["pres_ex_id"]
     pres_ex_record = None
     try:
-        async with context.session() as session:
+        async with profile.session() as session:
             pres_ex_record = await V10PresentationExchange.retrieve_by_id(
                 session, presentation_exchange_id
             )
@@ -260,7 +260,7 @@ async def presentation_exchange_retrieve(request: web.BaseRequest):
     except (BaseModelError, StorageError) as err:
         # present but broken or hopeless: protocol error
         if pres_ex_record:
-            async with context.session() as session:
+            async with profile.session() as session:
                 await pres_ex_record.save_error_state(session, reason=err.roll_up)
         await report_problem(
             err,
@@ -292,6 +292,7 @@ async def presentation_exchange_credentials_list(request: web.BaseRequest):
     """
     context: AdminRequestContext = request["context"]
+    profile = context.profile
     outbound_handler = request["outbound_message_router"]

     presentation_exchange_id = request.match_info["pres_ex_id"]
@@ -301,7 +302,7 @@ async def presentation_exchange_credentials_list(request: web.BaseRequest):
     )

     try:
-        async with context.session() as session:
+        async with profile.session() as session:
             pres_ex_record = await V10PresentationExchange.retrieve_by_id(
                 session, presentation_exchange_id
             )
@@ -319,7 +320,7 @@ async def presentation_exchange_credentials_list(request: web.BaseRequest):
     start = int(start) if isinstance(start, str) else 0
     count = int(count) if isinstance(count, str) else 10

-    holder = context.profile.inject(IndyHolder)
+    holder = profile.inject(IndyHolder)
     try:
         credentials = await holder.get_credentials_for_presentation_request_by_referent(
             pres_ex_record._presentation_request.ser,
@@ -330,7 +331,7 @@ async def presentation_exchange_credentials_list(request: web.BaseRequest):
         )
     except IndyHolderError as err:
         if pres_ex_record:
-            async with context.session() as session:
+            async with profile.session() as session:
                 await pres_ex_record.save_error_state(session, reason=err.roll_up)
         await report_problem(
             err,
@@ -370,6 +371,7 @@ async def presentation_exchange_send_proposal(request: web.BaseRequest):
     r_time = get_timer()

     context: AdminRequestContext = request["context"]
+    profile = context.profile
     outbound_handler = request["outbound_message_router"]

     body = await request.json()
@@ -380,7 +382,7 @@ async def presentation_exchange_send_proposal(request: web.BaseRequest):
     # Aries RFC 37 calls it a proposal in the proposal struct but it's of type preview
     presentation_preview = body.get("presentation_proposal")
     connection_record = None
-    async with context.session() as session:
+    async with profile.session() as session:
         try:
             connection_record = await ConnRecord.retrieve_by_id(session, connection_id)
             presentation_proposal_message = PresentationProposal(
@@ -403,7 +405,7 @@ async def presentation_exchange_send_proposal(request: web.BaseRequest):
         "auto_present", context.settings.get("debug.auto_respond_presentation_request")
     )

-    presentation_manager = PresentationManager(context.profile)
+    presentation_manager = PresentationManager(profile)
     pres_ex_record = None
     try:
         pres_ex_record = await presentation_manager.create_exchange_for_proposal(
@@ -414,7 +416,7 @@ async def presentation_exchange_send_proposal(request: web.BaseRequest):
         result = pres_ex_record.serialize()
     except (BaseModelError, StorageError) as err:
         if pres_ex_record:
-            async with context.session() as session:
+            async with profile.session() as session:
                 await pres_ex_record.save_error_state(session, reason=err.roll_up)
         # other party does not care about our false protocol start
         raise
web.HTTPBadRequest(reason=err.roll_up) @@ -454,6 +456,7 @@ async def presentation_exchange_create_request(request: web.BaseRequest): r_time = get_timer() context: AdminRequestContext = request["context"] + profile = context.profile outbound_handler = request["outbound_message_router"] body = await request.json() @@ -478,9 +481,9 @@ async def presentation_exchange_create_request(request: web.BaseRequest): trace_msg, ) - presentation_manager = PresentationManager(context.profile) pres_ex_record = None try: + presentation_manager = PresentationManager(profile) pres_ex_record = await presentation_manager.create_exchange_for_request( connection_id=None, presentation_request_message=presentation_request_message, @@ -488,7 +491,7 @@ async def presentation_exchange_create_request(request: web.BaseRequest): result = pres_ex_record.serialize() except (BaseModelError, StorageError) as err: if pres_ex_record: - async with context.session() as session: + async with profile.session() as session: await pres_ex_record.save_error_state(session, reason=err.roll_up) # other party does not care about our false protocol start raise web.HTTPBadRequest(reason=err.roll_up) @@ -525,12 +528,13 @@ async def presentation_exchange_send_free_request(request: web.BaseRequest): r_time = get_timer() context: AdminRequestContext = request["context"] + profile = context.profile outbound_handler = request["outbound_message_router"] body = await request.json() connection_id = body.get("connection_id") - async with context.session() as session: + async with profile.session() as session: try: connection_record = await ConnRecord.retrieve_by_id(session, connection_id) except StorageNotFoundError as err: @@ -559,9 +563,9 @@ async def presentation_exchange_send_free_request(request: web.BaseRequest): trace_msg, ) - presentation_manager = PresentationManager(context.profile) pres_ex_record = None try: + presentation_manager = PresentationManager(profile) pres_ex_record = await 
presentation_manager.create_exchange_for_request( connection_id=connection_id, presentation_request_message=presentation_request_message, @@ -569,7 +573,7 @@ async def presentation_exchange_send_free_request(request: web.BaseRequest): result = pres_ex_record.serialize() except (BaseModelError, StorageError) as err: if pres_ex_record: - async with context.session() as session: + async with profile.session() as session: await pres_ex_record.save_error_state(session, reason=err.roll_up) # other party does not care about our false protocol start raise web.HTTPBadRequest(reason=err.roll_up) @@ -607,13 +611,14 @@ async def presentation_exchange_send_bound_request(request: web.BaseRequest): r_time = get_timer() context: AdminRequestContext = request["context"] + profile = context.profile outbound_handler = request["outbound_message_router"] body = await request.json() presentation_exchange_id = request.match_info["pres_ex_id"] pres_ex_record = None - async with context.session() as session: + async with profile.session() as session: try: pres_ex_record = await V10PresentationExchange.retrieve_by_id( session, presentation_exchange_id @@ -639,8 +644,8 @@ async def presentation_exchange_send_bound_request(request: web.BaseRequest): if not connection_record.is_ready: raise web.HTTPForbidden(reason=f"Connection {conn_id} not ready") - presentation_manager = PresentationManager(context.profile) try: + presentation_manager = PresentationManager(profile) ( pres_ex_record, presentation_request_message, @@ -648,7 +653,7 @@ async def presentation_exchange_send_bound_request(request: web.BaseRequest): result = pres_ex_record.serialize() except (BaseModelError, LedgerError, StorageError) as err: if pres_ex_record: - async with context.session() as session: + async with profile.session() as session: await pres_ex_record.save_error_state(session, reason=err.roll_up) # other party cares that we cannot continue protocol await report_problem( @@ -694,12 +699,13 @@ async def 
presentation_exchange_send_presentation(request: web.BaseRequest): r_time = get_timer() context: AdminRequestContext = request["context"] + profile = context.profile outbound_handler = request["outbound_message_router"] presentation_exchange_id = request.match_info["pres_ex_id"] body = await request.json() pres_ex_record = None - async with context.session() as session: + async with profile.session() as session: try: pres_ex_record = await V10PresentationExchange.retrieve_by_id( session, presentation_exchange_id @@ -725,8 +731,8 @@ async def presentation_exchange_send_presentation(request: web.BaseRequest): if not connection_record.is_ready: raise web.HTTPForbidden(reason=f"Connection {connection_id} not ready") - presentation_manager = PresentationManager(context.profile) try: + presentation_manager = PresentationManager(profile) ( pres_ex_record, presentation_message, @@ -748,7 +754,7 @@ async def presentation_exchange_send_presentation(request: web.BaseRequest): WalletNotFoundError, ) as err: if pres_ex_record: - async with context.session() as session: + async with profile.session() as session: await pres_ex_record.save_error_state(session, reason=err.roll_up) # other party cares that we cannot continue protocol await report_problem( @@ -793,12 +799,13 @@ async def presentation_exchange_verify_presentation(request: web.BaseRequest): r_time = get_timer() context: AdminRequestContext = request["context"] + profile = context.profile outbound_handler = request["outbound_message_router"] presentation_exchange_id = request.match_info["pres_ex_id"] pres_ex_record = None - async with context.session() as session: + async with profile.session() as session: try: pres_ex_record = await V10PresentationExchange.retrieve_by_id( session, presentation_exchange_id @@ -817,13 +824,13 @@ async def presentation_exchange_verify_presentation(request: web.BaseRequest): ) ) - presentation_manager = PresentationManager(context.profile) try: + presentation_manager = 
PresentationManager(profile) pres_ex_record = await presentation_manager.verify_presentation(pres_ex_record) result = pres_ex_record.serialize() except (BaseModelError, LedgerError, StorageError) as err: if pres_ex_record: - async with context.session() as session: + async with profile.session() as session: await pres_ex_record.save_error_state(session, reason=err.roll_up) # other party cares that we cannot continue protocol await report_problem( @@ -867,7 +874,7 @@ async def presentation_exchange_problem_report(request: web.BaseRequest): description = body["description"] try: - async with await context.session() as session: + async with await context.profile.session() as session: pres_ex_record = await V10PresentationExchange.retrieve_by_id( session, pres_ex_id ) @@ -905,7 +912,7 @@ async def presentation_exchange_remove(request: web.BaseRequest): presentation_exchange_id = request.match_info["pres_ex_id"] pres_ex_record = None try: - async with context.session() as session: + async with context.profile.session() as session: pres_ex_record = await V10PresentationExchange.retrieve_by_id( session, presentation_exchange_id ) diff --git a/aries_cloudagent/protocols/present_proof/v1_0/tests/test_manager.py b/aries_cloudagent/protocols/present_proof/v1_0/tests/test_manager.py index dd35121f47..8fb587e37a 100644 --- a/aries_cloudagent/protocols/present_proof/v1_0/tests/test_manager.py +++ b/aries_cloudagent/protocols/present_proof/v1_0/tests/test_manager.py @@ -17,6 +17,9 @@ from .....indy.sdk.verifier import IndySdkVerifier from .....indy.verifier import IndyVerifier from .....ledger.base import BaseLedger +from .....ledger.multiple_ledger.ledger_requests_executor import ( + IndyLedgerRequestsExecutor, +) from .....messaging.decorators.attach_decorator import AttachDecorator from .....messaging.request_context import RequestContext from .....messaging.responder import BaseResponder, MockResponder @@ -260,7 +263,14 @@ async def setUp(self): ) ) 
injector.bind_instance(BaseLedger, self.ledger) - + injector.bind_instance( + IndyLedgerRequestsExecutor, + async_mock.MagicMock( + get_ledger_for_identifier=async_mock.CoroutineMock( + return_value=self.ledger + ) + ), + ) Holder = async_mock.MagicMock(IndyHolder, autospec=True) self.holder = Holder() get_creds = async_mock.CoroutineMock( @@ -396,7 +406,7 @@ async def test_create_exchange_for_request(self): name=PROOF_REQ_NAME, version=PROOF_REQ_VERSION, nonce=PROOF_REQ_NONCE, - ledger=self.ledger, + profile=self.profile, ) pres_req = PresentationRequest( request_presentations_attach=[ @@ -435,7 +445,7 @@ async def test_create_presentation(self): name=PROOF_REQ_NAME, version=PROOF_REQ_VERSION, nonce=PROOF_REQ_NONCE, - ledger=self.ledger, + profile=self.profile, ) exchange_in.presentation_request = indy_proof_req @@ -477,7 +487,7 @@ async def test_create_presentation_proof_req_non_revoc_interval_none(self): name=PROOF_REQ_NAME, version=PROOF_REQ_VERSION, nonce=PROOF_REQ_NONCE, - ledger=self.ledger, + profile=self.profile, ) indy_proof_req["non_revoked"] = None # simulate interop with indy-vcx @@ -539,7 +549,7 @@ async def test_create_presentation_self_asserted(self): name=PROOF_REQ_NAME, version=PROOF_REQ_VERSION, nonce=PROOF_REQ_NONCE, - ledger=self.ledger, + profile=self.profile, ) exchange_in.presentation_request = indy_proof_req @@ -591,7 +601,7 @@ async def test_create_presentation_no_revocation(self): name=PROOF_REQ_NAME, version=PROOF_REQ_VERSION, nonce=PROOF_REQ_NONCE, - ledger=self.ledger, + profile=self.profile, ) exchange_in.presentation_request = indy_proof_req @@ -657,7 +667,7 @@ async def test_create_presentation_bad_revoc_state(self): name=PROOF_REQ_NAME, version=PROOF_REQ_VERSION, nonce=PROOF_REQ_NONCE, - ledger=self.ledger, + profile=self.profile, ) exchange_in.presentation_request = indy_proof_req @@ -725,7 +735,7 @@ async def test_create_presentation_multi_matching_proposal_creds_names(self): name=PROOF_REQ_NAME, version=PROOF_REQ_VERSION, 
nonce=PROOF_REQ_NONCE, - ledger=self.ledger, + profile=self.profile, ) exchange_in.presentation_request = indy_proof_req @@ -818,7 +828,7 @@ async def test_no_matching_creds_for_proof_req(self): name=PROOF_REQ_NAME, version=PROOF_REQ_VERSION, nonce=PROOF_REQ_NONCE, - ledger=self.ledger, + profile=self.profile, ) get_creds = async_mock.CoroutineMock(return_value=()) self.holder.get_credentials_for_presentation_request_by_referent = get_creds @@ -1184,7 +1194,7 @@ async def test_verify_presentation(self): name=PROOF_REQ_NAME, version=PROOF_REQ_VERSION, nonce=PROOF_REQ_NONCE, - ledger=self.ledger, + profile=self.profile, ) pres_req = PresentationRequest( request_presentations_attach=[ diff --git a/aries_cloudagent/protocols/present_proof/v2_0/tests/test_manager.py b/aries_cloudagent/protocols/present_proof/v2_0/tests/test_manager.py index 3cd33d49a0..150f1c7717 100644 --- a/aries_cloudagent/protocols/present_proof/v2_0/tests/test_manager.py +++ b/aries_cloudagent/protocols/present_proof/v2_0/tests/test_manager.py @@ -15,6 +15,9 @@ ) from .....indy.verifier import IndyVerifier from .....ledger.base import BaseLedger +from .....ledger.multiple_ledger.ledger_requests_executor import ( + IndyLedgerRequestsExecutor, +) from .....messaging.decorators.attach_decorator import AttachDecorator from .....messaging.responder import BaseResponder, MockResponder from .....storage.error import StorageNotFoundError @@ -420,6 +423,14 @@ async def setUp(self): ) ) injector.bind_instance(BaseLedger, self.ledger) + injector.bind_instance( + IndyLedgerRequestsExecutor, + async_mock.MagicMock( + get_ledger_for_identifier=async_mock.CoroutineMock( + return_value=self.ledger + ) + ), + ) Holder = async_mock.MagicMock(IndyHolder, autospec=True) self.holder = Holder() diff --git a/aries_cloudagent/protocols/trustping/v1_0/routes.py b/aries_cloudagent/protocols/trustping/v1_0/routes.py index 4a7c08ac8e..0bdd71d350 100644 --- a/aries_cloudagent/protocols/trustping/v1_0/routes.py +++ 
b/aries_cloudagent/protocols/trustping/v1_0/routes.py @@ -56,7 +56,7 @@ async def connections_send_ping(request: web.BaseRequest): comment = body.get("comment") try: - async with context.session() as session: + async with context.profile.session() as session: connection = await ConnRecord.retrieve_by_id(session, connection_id) except StorageNotFoundError as err: raise web.HTTPNotFound(reason=err.roll_up) from err diff --git a/aries_cloudagent/resolver/default/indy.py b/aries_cloudagent/resolver/default/indy.py index 75f24e3c2c..566e5ce22c 100644 --- a/aries_cloudagent/resolver/default/indy.py +++ b/aries_cloudagent/resolver/default/indy.py @@ -10,9 +10,12 @@ from ...config.injection_context import InjectionContext from ...core.profile import Profile -from ...ledger.base import BaseLedger from ...ledger.endpoint_type import EndpointType from ...ledger.error import LedgerError +from ...ledger.multiple_ledger.ledger_requests_executor import ( + GET_KEY_FOR_DID, + IndyLedgerRequestsExecutor, +) from ...messaging.valid import IndyDID from ..base import BaseDIDResolver, DIDNotFound, ResolverError, ResolverType @@ -41,7 +44,15 @@ def supported_did_regex(self) -> Pattern: async def _resolve(self, profile: Profile, did: str) -> dict: """Resolve an indy DID.""" - ledger = profile.inject_or(BaseLedger) + ledger_exec_inst = profile.inject(IndyLedgerRequestsExecutor) + ledger_info = await ledger_exec_inst.get_ledger_for_identifier( + did, + txn_record_type=GET_KEY_FOR_DID, + ) + if isinstance(ledger_info, tuple): + ledger = ledger_info[1] + else: + ledger = ledger_info if not ledger: raise NoIndyLedger("No Indy ledger instance is configured.") diff --git a/aries_cloudagent/resolver/default/tests/test_indy.py b/aries_cloudagent/resolver/default/tests/test_indy.py index a82f7ba427..21c8744644 100644 --- a/aries_cloudagent/resolver/default/tests/test_indy.py +++ b/aries_cloudagent/resolver/default/tests/test_indy.py @@ -8,6 +8,9 @@ from ....core.profile import Profile from 
....ledger.base import BaseLedger from ....ledger.error import LedgerError +from ....ledger.multiple_ledger.ledger_requests_executor import ( + IndyLedgerRequestsExecutor, +) from ....messaging.valid import IndyDID from ...base import DIDNotFound, ResolverError @@ -27,7 +30,7 @@ def resolver(): @pytest.fixture def ledger(): """Ledger fixture.""" - ledger = async_mock.MagicMock(spec=test_module.BaseLedger) + ledger = async_mock.MagicMock(spec=BaseLedger) ledger.get_endpoint_for_did = async_mock.CoroutineMock( return_value="https://github.com/" ) @@ -39,7 +42,12 @@ def ledger(): def profile(ledger): """Profile fixture.""" profile = InMemoryProfile.test_profile() - profile.context.injector.bind_instance(BaseLedger, ledger) + profile.context.injector.bind_instance( + IndyLedgerRequestsExecutor, + async_mock.MagicMock( + get_ledger_for_identifier=async_mock.CoroutineMock(return_value=ledger) + ), + ) yield profile @@ -60,7 +68,12 @@ async def test_resolve_x_no_ledger( self, profile: Profile, resolver: IndyDIDResolver ): """Test resolve method with no ledger.""" - profile.context.injector.clear_binding(BaseLedger) + profile.context.injector.bind_instance( + IndyLedgerRequestsExecutor, + async_mock.MagicMock( + get_ledger_for_identifier=async_mock.CoroutineMock(return_value=None) + ), + ) with pytest.raises(ResolverError): await resolver.resolve(profile, TEST_DID0) diff --git a/aries_cloudagent/resolver/routes.py b/aries_cloudagent/resolver/routes.py index 17c91eeff1..680353023a 100644 --- a/aries_cloudagent/resolver/routes.py +++ b/aries_cloudagent/resolver/routes.py @@ -90,8 +90,8 @@ async def resolve_did(request: web.Request): did = request.match_info["did"] try: - session = await context.session() - resolver = session.inject(DIDResolver) + async with context.profile.session() as session: + resolver = session.inject(DIDResolver) result: ResolutionResult = await resolver.resolve_with_metadata( context.profile, did ) diff --git 
a/aries_cloudagent/resolver/tests/test_routes.py b/aries_cloudagent/resolver/tests/test_routes.py index 633c79c038..01c213572b 100644 --- a/aries_cloudagent/resolver/tests/test_routes.py +++ b/aries_cloudagent/resolver/tests/test_routes.py @@ -6,7 +6,8 @@ from asynctest import mock as async_mock from pydid import DIDDocument -from ...admin.request_context import AdminRequestContext +from ...core.in_memory import InMemoryProfile + from .. import routes as test_module from ..base import ( DIDMethodNotSupported, @@ -17,6 +18,7 @@ ResolverType, ) from ..did_resolver import DIDResolver + from . import DOC @@ -55,9 +57,14 @@ def mock_resolver(resolution_result): yield did_resolver -@pytest.fixture -def mock_request(mock_resolver): - context = AdminRequestContext.test_context({DIDResolver: mock_resolver}) +@pytest.mark.asyncio +async def test_resolver(mock_resolver, mock_response: async_mock.MagicMock, did_doc): + profile = InMemoryProfile.test_profile() + context = profile.context + setattr(context, "profile", profile) + session = await profile.session() + session.context.injector.bind_instance(DIDResolver, mock_resolver) + outbound_message_router = async_mock.CoroutineMock() request_dict = { "context": context, @@ -71,14 +78,14 @@ def mock_request(mock_resolver): json=async_mock.CoroutineMock(return_value={}), __getitem__=lambda _, k: request_dict[k], ) - yield request - - -@pytest.mark.asyncio -async def test_resolver(mock_request, mock_response: async_mock.MagicMock, did_doc): - await test_module.resolve_did(mock_request) - mock_response.call_args[0][0] == did_doc.serialize() - # TODO: test http response codes + with async_mock.patch.object( + context.profile, + "session", + async_mock.MagicMock(return_value=session), + ) as mock_session: + await test_module.resolve_did(request) + mock_response.call_args[0][0] == did_doc.serialize() + # TODO: test http response codes @pytest.mark.asyncio @@ -90,14 +97,37 @@ async def test_resolver(mock_request, mock_response: 
async_mock.MagicMock, did_d (ResolverError, test_module.web.HTTPInternalServerError), ], ) -async def test_resolver_not_found_error( - mock_resolver, mock_request, side_effect, error -): +async def test_resolver_not_found_error(mock_resolver, side_effect, error): mock_resolver.resolve_with_metadata = async_mock.CoroutineMock( side_effect=side_effect() ) - with pytest.raises(error): - await test_module.resolve_did(mock_request) + + profile = InMemoryProfile.test_profile() + context = profile.context + setattr(context, "profile", profile) + session = await profile.session() + session.context.injector.bind_instance(DIDResolver, mock_resolver) + + outbound_message_router = async_mock.CoroutineMock() + request_dict = { + "context": context, + "outbound_message_router": outbound_message_router, + } + request = async_mock.MagicMock( + match_info={ + "did": "did:ethr:mainnet:0xb9c5714089478a327f09197987f16f9e5d936e8a", + }, + query={}, + json=async_mock.CoroutineMock(return_value={}), + __getitem__=lambda _, k: request_dict[k], + ) + with async_mock.patch.object( + context.profile, + "session", + async_mock.MagicMock(return_value=session), + ) as mock_session: + with pytest.raises(error): + await test_module.resolve_did(request) @pytest.mark.asyncio diff --git a/aries_cloudagent/revocation/indy.py b/aries_cloudagent/revocation/indy.py index f2f59160ce..70c2877d5e 100644 --- a/aries_cloudagent/revocation/indy.py +++ b/aries_cloudagent/revocation/indy.py @@ -3,7 +3,11 @@ from typing import Sequence from ..core.profile import Profile -from ..ledger.base import BaseLedger +from ..ledger.multiple_ledger.ledger_requests_executor import ( + GET_CRED_DEF, + GET_REVOC_REG_DEF, + IndyLedgerRequestsExecutor, +) from ..storage.base import StorageNotFoundError from .error import RevocationNotSupportedError, RevocationRegistryBadSizeError @@ -28,7 +32,15 @@ async def init_issuer_registry( tag: str = None, ) -> "IssuerRevRegRecord": """Create a new revocation registry record for a 
credential definition.""" - ledger = self._profile.inject(BaseLedger) + ledger_exec_inst = self._profile.inject(IndyLedgerRequestsExecutor) + ledger_info = await ledger_exec_inst.get_ledger_for_identifier( + cred_def_id, + txn_record_type=GET_CRED_DEF, + ) + if isinstance(ledger_info, tuple): + ledger = ledger_info[1] + else: + ledger = ledger_info async with ledger: cred_def = await ledger.get_credential_definition(cred_def_id) if not cred_def: @@ -98,7 +110,15 @@ async def get_ledger_registry(self, revoc_reg_id: str) -> "RevocationRegistry": if revoc_reg_id in IndyRevocation.REV_REG_CACHE: return IndyRevocation.REV_REG_CACHE[revoc_reg_id] - ledger = self._profile.inject(BaseLedger) + ledger_exec_inst = self._profile.inject(IndyLedgerRequestsExecutor) + ledger_info = await ledger_exec_inst.get_ledger_for_identifier( + revoc_reg_id, + txn_record_type=GET_REVOC_REG_DEF, + ) + if isinstance(ledger_info, tuple): + ledger = ledger_info[1] + else: + ledger = ledger_info async with ledger: rev_reg = RevocationRegistry.from_definition( await ledger.get_revoc_reg_def(revoc_reg_id), True diff --git a/aries_cloudagent/revocation/routes.py b/aries_cloudagent/revocation/routes.py index 57b77159e3..c877f865df 100644 --- a/aries_cloudagent/revocation/routes.py +++ b/aries_cloudagent/revocation/routes.py @@ -433,14 +433,14 @@ async def create_rev_reg(request: web.BaseRequest): """ context: AdminRequestContext = request["context"] - + profile = context.profile body = await request.json() credential_definition_id = body.get("credential_definition_id") max_cred_num = body.get("max_cred_num") # check we published this cred def - async with context.session() as session: + async with profile.session() as session: storage = session.inject(BaseStorage) found = await storage.find_all_records( @@ -453,14 +453,14 @@ async def create_rev_reg(request: web.BaseRequest): ) try: - revoc = IndyRevocation(context.profile) + revoc = IndyRevocation(profile) issuer_rev_reg_rec = await 
revoc.init_issuer_registry( credential_definition_id, max_cred_num=max_cred_num, ) except RevocationNotSupportedError as e: raise web.HTTPBadRequest(reason=e.message) from e - await shield(issuer_rev_reg_rec.generate_registry(context.profile)) + await shield(issuer_rev_reg_rec.generate_registry(profile)) return web.json_response({"result": issuer_rev_reg_rec.serialize()}) @@ -490,7 +490,7 @@ async def rev_regs_created(request: web.BaseRequest): tag_filter = { tag: request.query[tag] for tag in search_tags if tag in request.query } - async with context.session() as session: + async with context.profile.session() as session: found = await IssuerRevRegRecord.query(session, tag_filter) return web.json_response({"rev_reg_ids": [record.revoc_reg_id for record in found]}) @@ -547,7 +547,7 @@ async def get_rev_reg_issued(request: web.BaseRequest): rev_reg_id = request.match_info["rev_reg_id"] - async with context.session() as session: + async with context.profile.session() as session: try: await IssuerRevRegRecord.retrieve_by_revoc_reg_id(session, rev_reg_id) except StorageNotFoundError as err: @@ -583,7 +583,7 @@ async def get_cred_rev_record(request: web.BaseRequest): cred_ex_id = request.query.get("cred_ex_id") try: - async with context.session() as session: + async with context.profile.session() as session: if rev_reg_id and cred_rev_id: rec = await IssuerCredRevRecord.retrieve_by_ids( session, rev_reg_id, cred_rev_id @@ -718,6 +718,7 @@ async def send_rev_reg_def(request: web.BaseRequest): """ context: AdminRequestContext = request["context"] + profile = context.profile outbound_handler = request["outbound_message_router"] rev_reg_id = request.match_info["rev_reg_id"] create_transaction_for_endorser = json.loads( @@ -728,19 +729,19 @@ async def send_rev_reg_def(request: web.BaseRequest): connection_id = request.query.get("conn_id") # check if we need to endorse - if is_author_role(context.profile): + if is_author_role(profile): # authors cannot write to the ledger 
write_ledger = False create_transaction_for_endorser = True if not connection_id: # author has not provided a connection id, so determine which to use - connection_id = await get_endorser_connection_id(context.profile) + connection_id = await get_endorser_connection_id(profile) if not connection_id: raise web.HTTPBadRequest(reason="No endorser connection found") if not write_ledger: try: - async with context.session() as session: + async with profile.session() as session: connection_record = await ConnRecord.retrieve_by_id( session, connection_id ) @@ -749,8 +750,10 @@ async def send_rev_reg_def(request: web.BaseRequest): except BaseModelError as err: raise web.HTTPBadRequest(reason=err.roll_up) from err - session = await context.session() - endorser_info = await connection_record.metadata_get(session, "endorser_info") + async with profile.session() as session: + endorser_info = await connection_record.metadata_get( + session, "endorser_info" + ) if not endorser_info: raise web.HTTPForbidden( reason="Endorser Info is not set up in " @@ -764,11 +767,11 @@ async def send_rev_reg_def(request: web.BaseRequest): endorser_did = endorser_info["endorser_did"] try: - revoc = IndyRevocation(context.profile) + revoc = IndyRevocation(profile) rev_reg = await revoc.get_issuer_rev_reg_record(rev_reg_id) rev_reg_resp = await rev_reg.send_def( - context.profile, + profile, write_ledger=write_ledger, endorser_did=endorser_did, ) @@ -782,7 +785,7 @@ async def send_rev_reg_def(request: web.BaseRequest): return web.json_response({"result": rev_reg.serialize()}) else: - transaction_mgr = TransactionManager(context.profile) + transaction_mgr = TransactionManager(profile) try: transaction = await transaction_mgr.create_record( messages_attach=rev_reg_resp["result"], connection_id=connection_id @@ -793,7 +796,10 @@ async def send_rev_reg_def(request: web.BaseRequest): # if auto-request, send the request to the endorser if context.settings.get_value("endorser.auto_request"): try: - 
transaction, transaction_request = await transaction_mgr.create_request(
+                (
+                    transaction,
+                    transaction_request,
+                ) = await transaction_mgr.create_request(
                     transaction=transaction,
                     # TODO see if we need to parameterize these params
                     # expires_time=expires_time,
@@ -827,6 +833,7 @@ async def send_rev_reg_entry(request: web.BaseRequest):
     """
     context: AdminRequestContext = request["context"]
+    profile = context.profile
     outbound_handler = request["outbound_message_router"]
     create_transaction_for_endorser = json.loads(
         request.query.get("create_transaction_for_endorser", "false")
@@ -837,19 +844,19 @@ async def send_rev_reg_entry(request: web.BaseRequest):
     rev_reg_id = request.match_info["rev_reg_id"]
     # check if we need to endorse
-    if is_author_role(context.profile):
+    if is_author_role(profile):
         # authors cannot write to the ledger
         write_ledger = False
         create_transaction_for_endorser = True

         if not connection_id:
             # author has not provided a connection id, so determine which to use
-            connection_id = await get_endorser_connection_id(context.profile)
+            connection_id = await get_endorser_connection_id(profile)
             if not connection_id:
                 raise web.HTTPBadRequest(reason="No endorser connection found")

     if not write_ledger:
         try:
-            async with context.session() as session:
+            async with profile.session() as session:
                 connection_record = await ConnRecord.retrieve_by_id(
                     session, connection_id
                 )
@@ -858,8 +865,10 @@ async def send_rev_reg_entry(request: web.BaseRequest):
         except BaseModelError as err:
             raise web.HTTPBadRequest(reason=err.roll_up) from err

-        session = await context.session()
-        endorser_info = await connection_record.metadata_get(session, "endorser_info")
+        async with profile.session() as session:
+            endorser_info = await connection_record.metadata_get(
+                session, "endorser_info"
+            )
         if not endorser_info:
             raise web.HTTPForbidden(
                 reason="Endorser Info is not set up in "
@@ -873,10 +882,10 @@ async def send_rev_reg_entry(request: web.BaseRequest):
         endorser_did = endorser_info["endorser_did"]

     try:
-        revoc = IndyRevocation(context.profile)
+        revoc = IndyRevocation(profile)
         rev_reg = await revoc.get_issuer_rev_reg_record(rev_reg_id)
         rev_entry_resp = await rev_reg.send_entry(
-            context.profile,
+            profile,
             write_ledger=write_ledger,
             endorser_did=endorser_did,
         )
@@ -892,10 +901,11 @@ async def send_rev_reg_entry(request: web.BaseRequest):
         return web.json_response({"result": rev_reg.serialize()})
     else:
-        transaction_mgr = TransactionManager(context.profile)
+        transaction_mgr = TransactionManager(profile)
         try:
             transaction = await transaction_mgr.create_record(
-                messages_attach=rev_entry_resp["result"], connection_id=connection_id
+                messages_attach=rev_entry_resp["result"],
+                connection_id=connection_id,
             )
         except StorageError as err:
             raise web.HTTPBadRequest(reason=err.roll_up) from err

@@ -903,7 +913,10 @@ async def send_rev_reg_entry(request: web.BaseRequest):
         # if auto-request, send the request to the endorser
         if context.settings.get_value("endorser.auto_request"):
             try:
-                transaction, transaction_request = await transaction_mgr.create_request(
+                (
+                    transaction,
+                    transaction_request,
+                ) = await transaction_mgr.create_request(
                     transaction=transaction,
                     # TODO see if we need to parameterize these params
                     # expires_time=expires_time,
@@ -936,16 +949,16 @@ async def update_rev_reg(request: web.BaseRequest):
     """
     context: AdminRequestContext = request["context"]
-
+    profile = context.profile
     body = await request.json()
     tails_public_uri = body.get("tails_public_uri")
     rev_reg_id = request.match_info["rev_reg_id"]

     try:
-        revoc = IndyRevocation(context.profile)
+        revoc = IndyRevocation(profile)
         rev_reg = await revoc.get_issuer_rev_reg_record(rev_reg_id)
-        await rev_reg.set_tails_file_public_uri(context.profile, tails_public_uri)
+        await rev_reg.set_tails_file_public_uri(profile, tails_public_uri)
     except StorageNotFoundError as err:
         raise web.HTTPNotFound(reason=err.roll_up) from err
@@ -971,13 +984,14 @@ async def set_rev_reg_state(request: web.BaseRequest):
     """
     context: AdminRequestContext = request["context"]
+    profile = context.profile
     rev_reg_id = request.match_info["rev_reg_id"]
     state = request.query.get("state")

     try:
-        revoc = IndyRevocation(context.profile)
+        revoc = IndyRevocation(profile)
         rev_reg = await revoc.get_issuer_rev_reg_record(rev_reg_id)
-        async with context.session() as session:
+        async with profile.session() as session:
             await rev_reg.set_state(session, state)

     LOGGER.debug("set registry %s state: %s", rev_reg_id, state)
diff --git a/aries_cloudagent/revocation/tests/test_indy.py b/aries_cloudagent/revocation/tests/test_indy.py
index c98f3d7404..2a4a0445cd 100644
--- a/aries_cloudagent/revocation/tests/test_indy.py
+++ b/aries_cloudagent/revocation/tests/test_indy.py
@@ -4,6 +4,9 @@
 from ...core.in_memory import InMemoryProfile
 from ...ledger.base import BaseLedger
+from ...ledger.multiple_ledger.ledger_requests_executor import (
+    IndyLedgerRequestsExecutor,
+)
 from ...storage.error import StorageNotFoundError

 from ..error import (
@@ -28,7 +31,14 @@ def setUp(self):
         )
         self.ledger.get_revoc_reg_def = async_mock.CoroutineMock()
         self.context.injector.bind_instance(BaseLedger, self.ledger)
-
+        self.context.injector.bind_instance(
+            IndyLedgerRequestsExecutor,
+            async_mock.MagicMock(
+                get_ledger_for_identifier=async_mock.CoroutineMock(
+                    return_value=self.ledger
+                )
+            ),
+        )
         self.revoc = IndyRevocation(self.profile)

         self.test_did = "sample-did"
diff --git a/aries_cloudagent/revocation/tests/test_routes.py b/aries_cloudagent/revocation/tests/test_routes.py
index 6084fbc409..ef3011edf2 100644
--- a/aries_cloudagent/revocation/tests/test_routes.py
+++ b/aries_cloudagent/revocation/tests/test_routes.py
@@ -2,6 +2,8 @@
 from asynctest import TestCase as AsyncTestCase
 from asynctest import mock as async_mock

+from aries_cloudagent.core.in_memory import InMemoryProfile
+
 from ...admin.request_context import AdminRequestContext
 from ...storage.in_memory import InMemoryStorage
 from ...tails.base import BaseTailsServer
@@ -16,8 +18,9 @@ def setUp(self):
         self.tails_server.upload_tails_file = async_mock.CoroutineMock(
             return_value=(True, None)
         )
-        self.session_inject = {}
-        self.context = AdminRequestContext.test_context(self.session_inject)
+        self.profile = InMemoryProfile.test_profile()
+        self.context = self.profile.context
+        setattr(self.context, "profile", self.profile)
         self.context.injector.bind_instance(BaseTailsServer, self.tails_server)
         self.request_dict = {
             "context": self.context,
diff --git a/aries_cloudagent/wallet/crypto.py b/aries_cloudagent/wallet/crypto.py
index d0203e0824..5e79c3c15f 100644
--- a/aries_cloudagent/wallet/crypto.py
+++ b/aries_cloudagent/wallet/crypto.py
@@ -1,5 +1,7 @@
 """Cryptography functions used by BasicWallet."""

+import re
+
 from collections import OrderedDict
 from typing import Callable, Optional, Sequence, Tuple, Union, List

@@ -80,6 +82,24 @@ def seed_to_did(seed: str) -> str:
     return did


+def did_is_self_certified(did: str, verkey: str) -> bool:
+    """
+    Check if the DID is self certified.
+
+    Args:
+        did: DID string
+        verkey: VERKEY string
+    """
+    ABBREVIATED_VERKEY_REGEX = "^~[1-9A-HJ-NP-Za-km-z]{21,22}$"
+    if re.search(ABBREVIATED_VERKEY_REGEX, verkey):
+        return True
+    verkey_bytes = b58_to_bytes(verkey)
+    did_from_verkey = bytes_to_b58(verkey_bytes[:16])
+    if did == did_from_verkey:
+        return True
+    return False
+
+
 def sign_pk_from_sk(secret: bytes) -> bytes:
     """Extract the verkey from a secret signing key."""
     seed_len = nacl.bindings.crypto_sign_SEEDBYTES
diff --git a/aries_cloudagent/wallet/routes.py b/aries_cloudagent/wallet/routes.py
index b3d84c01a9..62002f1bb6 100644
--- a/aries_cloudagent/wallet/routes.py
+++ b/aries_cloudagent/wallet/routes.py
@@ -198,7 +198,6 @@ async def wallet_did_list(request: web.BaseRequest):
     """
     context: AdminRequestContext = request["context"]
-
     filter_did = request.query.get("did")
     filter_verkey = request.query.get("verkey")
     filter_method = DIDMethod.from_method(request.query.get("method"))
diff --git a/aries_cloudagent/wallet/tests/test_crypto.py b/aries_cloudagent/wallet/tests/test_crypto.py
index d6789ca25b..66f6fc770a 100644
--- a/aries_cloudagent/wallet/tests/test_crypto.py
+++ b/aries_cloudagent/wallet/tests/test_crypto.py
@@ -57,6 +57,18 @@ def test_decode_pack_message_x(self):
             test_module.decode_pack_message(b"encrypted", lambda x: b"recip_secret")
         assert "Sender public key not provided" in str(excinfo.value)

+    def test_did_is_self_certified(self):
+        did = "Av63wJYM7xYR4AiygYq4c3"
+        verkey = "6QSduYdf8Bi6t8PfNm5vNomGWDtXhmMmTRzaciudBXYJ"
+        assert test_module.did_is_self_certified(did, verkey)
+        verkey = "~PKAYz8Ev4yoQgr2LaMAWFx"
+        assert test_module.did_is_self_certified(did, verkey)
+        verkey = "ABUF7uxYTxZ6qYdZ4G9e1Gi"
+        assert not test_module.did_is_self_certified(did, verkey)
+        did = "6YnVN5Qdb6mqimTIQcQmSXrHXKdTEdRn5YHZReezUTvta"
+        verkey = "6QSduYdf8Bi6t8PfNm5vNomGWDtXhmMmTRzaciudBXYJ"
+        assert not test_module.did_is_self_certified(did, verkey)
+
     def test_decode_pack_message_outer_x(self):
         with pytest.raises(ValueError) as excinfo:
             test_module.decode_pack_message_outer(json.dumps({"invalid": "content"}))
diff --git a/aries_cloudagent/wallet/tests/test_routes.py b/aries_cloudagent/wallet/tests/test_routes.py
index 83147be5bc..33df4b553a 100644
--- a/aries_cloudagent/wallet/tests/test_routes.py
+++ b/aries_cloudagent/wallet/tests/test_routes.py
@@ -1,8 +1,8 @@
-from ...core.in_memory import InMemoryProfile
 from asynctest import mock as async_mock, TestCase as AsyncTestCase
 from aiohttp.web import HTTPForbidden

 from ...admin.request_context import AdminRequestContext
+from ...core.in_memory import InMemoryProfile
 from ...ledger.base import BaseLedger
 from ...multitenant.base import BaseMultitenantManager
 from ...multitenant.manager import MultitenantManager
diff --git a/demo/README.md b/demo/README.md
index 43df1767f4..a3db767adf 100644
--- a/demo/README.md
+++ b/demo/README.md
@@ -22,6 +22,7 @@ There are several demos available for ACA-Py mostly (but not only) aimed at deve
   - [Revocation](#revocation)
   - [Mediation](#mediation)
   - [Multi-tenancy](#multi-tenancy)
+  - [Multi-ledger](#multi-ledger)
   - [DID Exchange](#did-exchange)
   - [Endorser](#endorser)
   - [Run Askar Backend](#run-askar-backend)
@@ -71,7 +72,7 @@ In the first terminal window, start `von-network` by following the [Running the
 In the second terminal, change directory into `demo` directory of your clone of this repository. Start the `faber` agent by issuing the following command:

 ``` bash
-  ./run_demo faber 
+  ./run_demo faber
 ```

 In the third terminal, change directory into `demo` directory of your clone of this repository. Start the `alice` agent by issuing the following command:
@@ -80,7 +81,7 @@ In the third terminal, change directory into `demo` directory of your clone of t
   ./run_demo alice
 ```

-Jump to the [Follow the Script](#follow-the-script) section below for further instructions. 
+Jump to the [Follow the Script](#follow-the-script) section below for further instructions.
 ### Running Locally
@@ -229,7 +230,7 @@ When you revoke a credential you will need to provide those values:
 Enter revocation registry ID: WGmUNAdH2ZfeGvacFoMVVP:4:WGmUNAdH2ZfeGvacFoMVVP:3:CL:38:Faber.Agent.degree_schema:CL_ACCUM:15ca49ed-1250-4608-9e8f-c0d52d7260c3
 Enter credential revocation ID: 1
-Publish now? [Y/N]: y 
+Publish now? [Y/N]: y
 ```

 Note that you need to Publish the revocation information to the ledger. Once you've revoked a credential any proof which uses this credential will fail to verify.
@@ -274,6 +275,16 @@ To enable support for multi-tenancy, run the `alice` or `faber` demo with the `-
 (This option can be used with both (or either) `alice` and/or `faber`.)

+### Multi-ledger
+
+To enable multiple ledger mode, run the `alice` or `faber` demo with the `--multi-ledger` option:
+
+```bash
+./run_demo faber --multi-ledger
+```
+
+The configuration file for setting up multiple ledgers (for the demo) can be found at `./demo/multi_ledger_config.yml`.
+
 You will see an additional menu option to create new sub-wallets (or they can be considered to be "virtual agents").
 Faber:
diff --git a/demo/multi_ledger_config.yml b/demo/multi_ledger_config.yml
new file mode 100644
index 0000000000..3290f50d5e
--- /dev/null
+++ b/demo/multi_ledger_config.yml
@@ -0,0 +1,6 @@
+- id: bcorvinTest
+  is_production: true
+  genesis_url: 'http://test.bcovrin.vonx.io/genesis'
+- id: greenlightTest
+  is_production: true
+  genesis_url: 'http://dev.greenlight.bcovrin.vonx.io/genesis'
diff --git a/demo/runners/acme.py b/demo/runners/acme.py
index 13361aafb0..898ff72846 100644
--- a/demo/runners/acme.py
+++ b/demo/runners/acme.py
@@ -126,6 +126,7 @@ async def main(args):
         acme_agent.start_port,
         acme_agent.start_port + 1,
         genesis_data=acme_agent.genesis_txns,
+        genesis_txn_list=acme_agent.genesis_txn_list,
         no_auto=acme_agent.no_auto,
         tails_server_base_url=acme_agent.tails_server_base_url,
         timing=acme_agent.show_timing,
diff --git a/demo/runners/agent_container.py b/demo/runners/agent_container.py
index eb027045e4..b15bfcde96 100644
--- a/demo/runners/agent_container.py
+++ b/demo/runners/agent_container.py
@@ -566,11 +566,12 @@ async def create_schema_and_cred_def(
 class AgentContainer:
     def __init__(
         self,
-        genesis_txns: str,
         ident: str,
         start_port: int,
         no_auto: bool = False,
         revocation: bool = False,
+        genesis_txns: str = None,
+        genesis_txn_list: str = None,
         tails_server_base_url: str = None,
         cred_type: str = CRED_FORMAT_INDY,
         show_timing: bool = False,
@@ -586,6 +587,7 @@ def __init__(
     ):
         # configuration parameters
         self.genesis_txns = genesis_txns
+        self.genesis_txn_list = genesis_txn_list
         self.ident = ident
         self.start_port = start_port
         self.no_auto = no_auto
@@ -634,6 +636,7 @@ async def initialize(
             self.start_port,
             self.start_port + 1,
             genesis_data=self.genesis_txns,
+            genesis_txn_list=self.genesis_txn_list,
             no_auto=self.no_auto,
             tails_server_base_url=self.tails_server_base_url,
             timing=self.show_timing,
@@ -661,6 +664,7 @@ async def initialize(
             self.endorser_agent = await start_endorser_agent(
                 self.start_port + 7,
                 self.genesis_txns,
+                self.genesis_txn_list,
                 use_did_exchange=self.use_did_exchange,
             )
             if not self.endorser_agent:
@@ -682,7 +686,7 @@ async def initialize(
         if self.mediation:
             self.mediator_agent = await start_mediator_agent(
-                self.start_port + 4, self.genesis_txns
+                self.start_port + 4, self.genesis_txns, self.genesis_txn_list
             )
             if not self.mediator_agent:
                 raise Exception("Mediator agent returns None :-(")
@@ -1049,6 +1053,14 @@ def arg_parser(ident: str = None, port: int = 8020):
     parser.add_argument(
         "--mediation", action="store_true", help="Enable mediation functionality"
     )
+    parser.add_argument(
+        "--multi-ledger",
+        action="store_true",
+        help=(
+            "Enable multiple ledger mode, config file can be found "
+            "here: ./demo/multi_ledger_config.yml"
+        ),
+    )
     parser.add_argument(
         "--wallet-type",
         type=str,
@@ -1119,8 +1131,11 @@ async def create_agent_with_args(args, ident: str = None):
             "If revocation is enabled, --tails-server-base-url must be provided"
         )

+    multi_ledger_config_path = None
+    if "multi_ledger" in args and args.multi_ledger:
+        multi_ledger_config_path = "./demo/multi_ledger_config.yml"
     genesis = await default_genesis_txns()
-    if not genesis:
+    if not genesis and not multi_ledger_config_path:
         print("Error retrieving ledger genesis transactions")
         sys.exit(1)

@@ -1150,9 +1165,10 @@ async def create_agent_with_args(args, ident: str = None):
     )

     agent = AgentContainer(
-        genesis,
-        agent_ident + ".agent",
-        args.port,
+        genesis_txns=genesis,
+        genesis_txn_list=multi_ledger_config_path,
+        ident=agent_ident + ".agent",
+        start_port=args.port,
         no_auto=args.no_auto,
         revocation=args.revocation if "revocation" in args else False,
         tails_server_base_url=tails_server_base_url,
@@ -1192,9 +1208,9 @@ async def test_main(
     try:
         # initialize the containers
         faber_container = AgentContainer(
-            genesis,
-            "Faber.agent",
-            start_port,
+            genesis_txns=genesis,
+            ident="Faber.agent",
+            start_port=start_port,
             no_auto=no_auto,
             revocation=revocation,
             tails_server_base_url=tails_server_base_url,
@@ -1209,9 +1225,9 @@ async def test_main(
             aip=aip,
         )
         alice_container = AgentContainer(
-            genesis,
-            "Alice.agent",
-            start_port + 10,
+            genesis_txns=genesis,
+            ident="Alice.agent",
+            start_port=start_port + 10,
             no_auto=no_auto,
             revocation=False,
             show_timing=show_timing,
diff --git a/demo/runners/alice.py b/demo/runners/alice.py
index 1206c95ef6..c98a37899a 100644
--- a/demo/runners/alice.py
+++ b/demo/runners/alice.py
@@ -120,6 +120,7 @@ async def main(args):
         alice_agent.start_port,
         alice_agent.start_port + 1,
         genesis_data=alice_agent.genesis_txns,
+        genesis_txn_list=alice_agent.genesis_txn_list,
         no_auto=alice_agent.no_auto,
         tails_server_base_url=alice_agent.tails_server_base_url,
         timing=alice_agent.show_timing,
diff --git a/demo/runners/faber.py b/demo/runners/faber.py
index 10c3b59fdf..43cf8ace0c 100644
--- a/demo/runners/faber.py
+++ b/demo/runners/faber.py
@@ -387,6 +387,7 @@ async def main(args):
         faber_agent.start_port,
         faber_agent.start_port + 1,
         genesis_data=faber_agent.genesis_txns,
+        genesis_txn_list=faber_agent.genesis_txn_list,
         no_auto=faber_agent.no_auto,
         tails_server_base_url=faber_agent.tails_server_base_url,
         timing=faber_agent.show_timing,
diff --git a/demo/runners/performance.py b/demo/runners/performance.py
index 6f70472f6e..e90c176087 100644
--- a/demo/runners/performance.py
+++ b/demo/runners/performance.py
@@ -261,6 +261,7 @@ async def main(
     show_timing: bool = False,
     multitenant: bool = False,
     mediation: bool = False,
+    multi_ledger: bool = False,
     use_did_exchange: bool = False,
     revocation: bool = False,
     tails_server_base_url: str = None,
@@ -268,10 +269,15 @@ async def main(
     wallet_type: str = None,
 ):
-    genesis = await default_genesis_txns()
-    if not genesis:
-        print("Error retrieving ledger genesis transactions")
-        sys.exit(1)
+    if multi_ledger:
+        genesis = None
+        multi_ledger_config_path = "./demo/multi_ledger_config.yml"
+    else:
+        genesis = await default_genesis_txns()
+        multi_ledger_config_path = None
+        if not genesis:
+            print("Error retrieving ledger genesis transactions")
+            sys.exit(1)

     alice = None
     faber = None
@@ -284,6 +290,7 @@ async def main(
         alice = AliceAgent(
             start_port,
             genesis_data=genesis,
+            genesis_txn_list=multi_ledger_config_path,
             timing=show_timing,
             multitenant=multitenant,
             mediation=mediation,
@@ -294,6 +301,7 @@ async def main(
         faber = FaberAgent(
             start_port + 3,
             genesis_data=genesis,
+            genesis_txn_list=multi_ledger_config_path,
             timing=show_timing,
             tails_server_base_url=tails_server_base_url,
             multitenant=multitenant,
@@ -309,12 +317,12 @@ async def main(

         if mediation:
             alice_mediator_agent = await start_mediator_agent(
-                start_port + 8, genesis
+                start_port + 8, genesis, multi_ledger_config_path
             )
             if not alice_mediator_agent:
                 raise Exception("Mediator agent returns None :-(")
             faber_mediator_agent = await start_mediator_agent(
-                start_port + 11, genesis
+                start_port + 11, genesis, multi_ledger_config_path
             )
             if not faber_mediator_agent:
                 raise Exception("Mediator agent returns None :-(")
@@ -603,6 +611,14 @@ async def check_received_pings(agent, issue_count, pb):
     parser.add_argument(
         "--mediation", action="store_true", help="Enable mediation functionality"
     )
+    parser.add_argument(
+        "--multi-ledger",
+        action="store_true",
+        help=(
+            "Enable multiple ledger mode, config file can be found "
+            "here: ./demo/multi_ledger_config.yml"
+        ),
+    )
     parser.add_argument(
         "--did-exchange",
         action="store_true",
@@ -669,6 +685,7 @@ async def check_received_pings(agent, issue_count, pb):
                 args.timing,
                 args.multitenant,
                 args.mediation,
+                args.multi_ledger,
                 args.did_exchange,
                 args.revocation,
                 tails_server_base_url,
diff --git a/demo/runners/support/agent.py b/demo/runners/support/agent.py
index fd6e401efb..2647c969b5 100644
--- a/demo/runners/support/agent.py
+++ b/demo/runners/support/agent.py
@@ -1,5 +1,6 @@
 import asyncio
 import asyncpg
+import base64
 import functools
 import json
 import logging
@@ -7,8 +8,9 @@
 import random
 import subprocess
 import sys
+import yaml
+
 from timeit import default_timer
-import base64

 from aiohttp import (
     web,
@@ -119,6 +121,7 @@ def __init__(
         internal_host: str = None,
         external_host: str = None,
         genesis_data: str = None,
+        genesis_txn_list: str = None,
         seed: str = None,
         label: str = None,
         color: str = None,
@@ -142,6 +145,7 @@ def __init__(
         self.internal_host = internal_host or DEFAULT_INTERNAL_HOST
         self.external_host = external_host or DEFAULT_EXTERNAL_HOST
         self.genesis_data = genesis_data
+        self.genesis_txn_list = genesis_txn_list
         self.label = label or ident
         self.color = color
         self.prefix = prefix
@@ -206,6 +210,21 @@ def __init__(
             self.agency_wallet_did = self.did
             self.agency_wallet_key = self.wallet_key

+        if self.genesis_txn_list:
+            updated_config_list = []
+            with open(self.genesis_txn_list, "r") as stream:
+                ledger_config_list = yaml.safe_load(stream)
+            for config in ledger_config_list:
+                if "genesis_url" in config and "/$LEDGER_HOST:" in config.get(
+                    "genesis_url"
+                ):
+                    config["genesis_url"] = config.get("genesis_url").replace(
+                        "$LEDGER_HOST", str(self.external_host)
+                    )
+                updated_config_list.append(config)
+            with open(self.genesis_txn_list, "w") as file:
+                documents = yaml.dump(updated_config_list, file)
+
     async def get_wallets(self):
         """Get registered wallets of agent (this is an agency call)."""
         wallets = await self.admin_GET("/multitenancy/wallets")
@@ -326,6 +345,8 @@ def get_agent_args(self):
             )
         if self.genesis_data:
             result.append(("--genesis-transactions", self.genesis_data))
+        if self.genesis_txn_list:
+            result.append(("--genesis-transactions-list", self.genesis_txn_list))
         if self.seed:
             result.append(("--seed", self.seed))
         if self.storage_type:
@@ -1196,12 +1217,15 @@ async def handle_basicmessages(self, message):
         self.log("Received message:", message["content"])


-async def start_mediator_agent(start_port, genesis):
+async def start_mediator_agent(
+    start_port, genesis: str = None, genesis_txn_list: str = None
+):
     # start mediator agent
     mediator_agent = MediatorAgent(
         start_port,
         start_port + 1,
         genesis_data=genesis,
+        genesis_txn_list=genesis_txn_list,
     )
     await mediator_agent.listen_webhooks(start_port + 2)
     await mediator_agent.start_process()
@@ -1312,12 +1336,18 @@ async def handle_basicmessages(self, message):
         self.log("Received message:", message["content"])


-async def start_endorser_agent(start_port, genesis, use_did_exchange: bool = True):
+async def start_endorser_agent(
+    start_port,
+    genesis: str = None,
+    genesis_txn_list: str = None,
+    use_did_exchange: bool = True,
+):
     # start mediator agent
     endorser_agent = EndorserAgent(
         start_port,
         start_port + 1,
         genesis_data=genesis,
+        genesis_txn_list=genesis_txn_list,
     )
     await endorser_agent.register_did(cred_type=CRED_FORMAT_INDY)
     await endorser_agent.listen_webhooks(start_port + 2)
diff --git a/requirements.txt b/requirements.txt
index 5d1e87235f..9f26e08d86 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -21,4 +21,5 @@ pydid~=0.3.2.post1
 jsonpath_ng==1.5.2
 pytz~=2021.1
 python-dateutil~=2.8.1
+rlp==0.5.1
 unflatten~=0.1
\ No newline at end of file
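The `did_is_self_certified` check added to `aries_cloudagent/wallet/crypto.py` above can be exercised standalone. The sketch below re-implements the base58 helpers inline (the real module imports `b58_to_bytes`/`bytes_to_b58` from its util module, so the helper names here are stand-ins) and mirrors the patch's logic: an abbreviated verkey (`~` prefix) is always treated as self-certified, otherwise the DID must equal the base58 encoding of the first 16 bytes of the decoded verkey.

```python
import re

# Stand-in base58 helpers; the real code uses b58_to_bytes / bytes_to_b58
# from the wallet util module.
B58_ALPHABET = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"


def b58_to_bytes(val: str) -> bytes:
    num = 0
    for char in val:
        num = num * 58 + B58_ALPHABET.index(char)
    raw = num.to_bytes((num.bit_length() + 7) // 8, "big")
    # each leading '1' in base58 encodes a leading zero byte
    pad = len(val) - len(val.lstrip("1"))
    return b"\x00" * pad + raw


def bytes_to_b58(val: bytes) -> str:
    num = int.from_bytes(val, "big")
    encoded = ""
    while num:
        num, rem = divmod(num, 58)
        encoded = B58_ALPHABET[rem] + encoded
    pad = len(val) - len(val.lstrip(b"\x00"))
    return "1" * pad + encoded


def did_is_self_certified(did: str, verkey: str) -> bool:
    """Mirror of the patched check: abbreviated verkeys pass outright;
    full verkeys must reproduce the DID from their first 16 bytes."""
    if re.search("^~[1-9A-HJ-NP-Za-km-z]{21,22}$", verkey):
        return True
    return did == bytes_to_b58(b58_to_bytes(verkey)[:16])
```

The test vectors in the patch's new `test_did_is_self_certified` unit test (e.g. DID `Av63wJYM7xYR4AiygYq4c3` with verkey `6QSduYdf8Bi6t8PfNm5vNomGWDtXhmMmTRzaciudBXYJ`) pass through this sketch the same way.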
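The hook added to `demo/runners/support/agent.py` rewrites the `genesis_url` entries in the multi-ledger config before it is passed to `--genesis-transactions-list`, substituting the `$LEDGER_HOST` placeholder with the agent's external host. A simplified, file-free sketch of that substitution (the demo code round-trips the YAML file with `yaml.safe_load`/`yaml.dump` and matches the stricter `/$LEDGER_HOST:` pattern; the function name here is hypothetical):

```python
def resolve_ledger_hosts(ledger_config_list, external_host):
    """Replace the $LEDGER_HOST placeholder in each ledger entry's
    genesis_url with the agent's external host, returning new dicts
    rather than mutating the input."""
    updated_config_list = []
    for config in ledger_config_list:
        url = config.get("genesis_url", "")
        if "$LEDGER_HOST" in url:
            config = dict(
                config, genesis_url=url.replace("$LEDGER_HOST", str(external_host))
            )
        updated_config_list.append(config)
    return updated_config_list
```

For example, an entry with `genesis_url: 'http://$LEDGER_HOST:9000/genesis'` resolves to `http://host.docker.internal:9000/genesis` when the external host is `host.docker.internal`, matching the `localVON` entry shown in the example config.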