Solving the Cardano Concurrency Issue #47
I believe this type of capability can be useful; a 3rd gen blockchain should have the best of both worlds ($BTC/$ETH) via opt-in. I wonder if Smart Transactions open up the possibility of having self-executing smart contracts (i.e. SC automation)?
Absolutely love this work! Question: isn't a smart transaction just a fancy name for indeterministic transactions a la Ethereum / account-style blockchains?
@matiwinnetou Yes, but I would stay away from the word indeterministic. Cardano has spent so much time and marketing on the concept of determinism I wouldn't want to throw that out. It's always deterministic when it lands on chain. It's only indeterministic while in the mempool waiting to be applied.
We already have a small amount of indeterminism today. If both you and I simultaneously submit transactions to our nodes that spend the same utxo, both will sit in the mempool and spread across the network. Only one of them will end up on chain and the other will get dropped. This is essentially what we're proposing here, except that before being unceremoniously dropped, a transaction is given a chance to find a different utxo that still allows it to fulfill its goals.
I've kicked this idea around with other folks back around the '21 summit; IMO it's a really good idea. I kicked it around with some IOG engineers a few times, and I think the main hurdle to overcome in getting this adopted is twofold:
In a sense, this is what scooper/batcher solutions are doing, just one layer removed: the user is signing a transaction that gives their approval to some funds being spent, forgoing the choice of exactly how it happens; the scoopers/batchers then serve the role of the "mempool", re-building the transaction when the conditions that led to it become invalid.
This isn't really what Cardano's determinism is about. Like you point out, Cardano has indeterminism, but the indeterminism is always cheap to validate, so your collateral isn't at risk: UTXO races, validity intervals, etc.

Once you add this, though, the question becomes: is this part of phase 1, or phase 2? If it's phase 1, you have to make sure the kinds of criteria you can specify are very, very cheap to evaluate; and since the UTXO set is indexed by transaction, not address, this is unlikely to currently be the case. If it's part of phase 2, you now have a non-deterministic thing that can forfeit your collateral. And I think this is where it gets tricky, because you might say "let the dApp developer decide what makes sense for them", but the philosophy Cardano has had so far is that this isn't the dApp developer's decision to make, it's the user's, and so you create a massive UX problem in communicating to the user that "unlike every other transaction you're used to, this one might fail and sacrifice your collateral like on Ethereum".

Finally, this makes things like front-running way easier than they currently are today, as that UTXO evaluation bit in the node becomes a really ripe target for forking and adding your own custom rules for which transactions get priority.
@Quantumplation All good thoughts. I would say it might be a good idea to make this part of phase 2 validation, but I'm open to other ideas. We don't want dapps intentionally building transactions that depend on the mempool dumping them out. We want dapps building transactions that are designed to always succeed. As such, a protocol should be designed so that even if another transaction jumps the line in front of mine, the output of that transaction can become an input to mine and keep it valid. If a dapp relies on the mempool to drop stuff, it should be penalized by taking the collateral, since evaluating these smart transactions takes additional processing power in the node.
The problem is, this isn't penalizing the dApp at all, it's penalizing the user. It's probably less of an issue now that we have collateral return so the risk can be kept lower, but still 😅 The answer might be "display this information in the transaction signing dialog so the user can decide, it's a UX problem", but I'm not sure if the average user is sophisticated enough to evaluate this; we already have tons of problems with people misinterpreting the current transaction descriptions.
That sounds similar to the concept of malleable transactions in AVOUM. They include a fee market for ordering TXs accessing the smart contract and to incentivize nodes. https://docs.google.com/document/u/0/d/12atK0oEME0y1GHo_HmqhrcZ3pQeEqB_0tFKknhsjsLY/mobilebasic
To me it's always seemed there is a fair bit of overlap between solutions to this problem and on-chain liabilities, the proposed implementation for babel fees. The transaction building for the smart contract could be designed so that users who want to spend the contentious utxo submit their tx with a liability to be paid and a reward offered. Any block producer or entity willing to take on the liability in exchange for the reward could fulfill the tx and bundle together any other similar transactions with liabilities contending for the same utxo.

I don't expect that such a scenario would work out of the box with on-chain liabilities; I just wonder if there isn't some opportunity to design them in such a way that they can also be useful for concurrency. There would likely need to be some logic that informs the bundler how to update the state on the datum of the contentious utxo so that it adheres to the validation logic of the smart contract: perhaps a monoid-like definition, existing on the contentious utxo's datum or in a reference utxo, for how to update state as one bundles additional transactions together.

It seems a solution like this would also require relaxing some of the determinism requirements in order to work. The user submitting their tx would not know the final resulting output to the smart contract, only that it was computed correctly according to the monoidal definition. I feel like in many scenarios this is sufficient from the user's point of view, but I'm not sure it's always the case.
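The "monoid-like definition" suggested above can be sketched in miniature: each pending order is reduced to a delta against the contentious UTxO's state, and the bundler folds the deltas together before applying them once. Because the fold is associative with an identity element, the bundler can combine orders in any sequence. All names below (`PoolState`, `Delta`, `bundle`) are purely illustrative, not part of any existing Cardano API.

```python
from dataclasses import dataclass
from functools import reduce

@dataclass(frozen=True)
class PoolState:
    """State held in the contentious UTxO's datum, e.g. pool reserves."""
    reserve_a: int
    reserve_b: int

@dataclass(frozen=True)
class Delta:
    """One order's contribution; deltas combine associatively (a monoid)."""
    da: int
    db: int

    def combine(self, other: "Delta") -> "Delta":
        return Delta(self.da + other.da, self.db + other.db)

MEMPTY = Delta(0, 0)  # identity element of the monoid

def bundle(state: PoolState, deltas: list[Delta]) -> PoolState:
    # Fold all order deltas, then apply once. The fold order doesn't
    # affect the result, which is what lets a bundler reorder orders.
    total = reduce(Delta.combine, deltas, MEMPTY)
    return PoolState(state.reserve_a + total.da, state.reserve_b + total.db)

print(bundle(PoolState(100, 100), [Delta(10, -5), Delta(-3, 7)]))
# PoolState(reserve_a=107, reserve_b=102)
```

The user signs off on their own delta; the final `PoolState` is whatever the fold produces, which is exactly the relaxation of determinism the comment describes.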
Really great insights here. I like what @PhilippeLeLong said; it does seem to be similar in a way to malleable transactions (Fuel V2) / Sway. I remember debating this topic very briefly with Sebastian from Dc-Spark. He is of the opinion that rather than putting this on L1, Cardano should have a sidechain with malleable transactions (effectively kind of like Fuel V2) rather than introducing indeterminism to L1. Some of you are aware, but one of the IOG teams is working on a "sidechain framework", where Mamba will be the first one and likely Midnight the second. We are all looking forward to this.
@matiwinnetou Doesn't introducing a sidechain with bridges just make the whole solution more expensive? If we can't do something like this on L1, wouldn't someone just choose an entirely different L1 because of simpler tooling and dev UX? User adoption for crypto projects already has way too many steps and user friction:

1.) buy crypto on exchange

Every one of those steps is a steep learning curve for someone new, and they can make mistakes at every step: picking the wrong wallet (Daedalus), not storing their keys properly, forgetting their password, etc. Adding another step of learning how to interact with an L2 is just one more hurdle.

Another potential solution that wouldn't require IOG is some type of universal batcher project. Currently, all protocols and dexes (that I'm aware of) are implementing their own batchers or doing centralized batching. I wonder if they could be designed in a flexible-enough way as to provide a universal batcher network. Spectrum (ergoDex) has open-sourced their batcher: https://twitter.com/SpectrumLabs_/status/1612871109610991617
@AndrewWestberg I see your point of view and yes, this is a disadvantage. Avalanche is going a similar way, though. I still believe that wallets can become better at managing subnets/subchains; in Cardano we haven't even started exploring this properly. Some wallets have support for a sidechain (Milkomeda), but usually this experience is far from ideal. I didn't even know some wallets simply support Milkomeda by.... specifying an Ethereum address instead of a Cardano one (almost like an easter egg). Summary: sidechain complications can be partially solved with better UX, and we in Cardano are far off from where we could be. In terms of an open-source universal batcher, one thing to know as well: I spoke today with the TeddySwap guys and they confirmed they will have a fully open-sourced batcher that ANYONE can run, where OPEN-SOURCE and ANYONE are the important keywords.
There's a fairly nascent project at MLabs to use an L2 for deterministic batching, without the need to transfer funds to the second layer.
@Benjmhart I read through the proposal, but it wasn't clear to me exactly how it worked. Is it a system where the end user submits an order, but then the L2 determines who is allowed to batch the transactions? Was this proposal funded, or just idea phase? |
I recently had a similar conversation with Zygomeb and this was my response: https://twitter.com/thepizzaknight_/status/1613945525975552007?s=20&t=26vDHeD10tytwn_-DxJ18Q I believe it is high time builders and interested parties recognise the Cardano L1 as a settlement layer, equipped to verify multiple forms/events of execution that occur outside the L1, i.e. off-chain computing solutions like sidechains and mini-ledgers. This aligns very well with the initial proposal of the Cardano project and better compartmentalises things in the ecosystem. People can go to Solana to do all the advanced business logic they want, but they won't benefit from the extensibility that Cosmos offers. The point being that businesses will always make compromises, whatever the case. I don't think the goal for Cardano is to win over every developer/project but to be capable enough to support them in the most necessary of ways (whatever they may be over time).
This is a problem we run toward, not away from; the best we can aim to do is streamline these processes to reduce the friction.
@AndrewWestberg On the subject of Smart Transactions, I came up with an idea a while ago called Charon:
@AndrewWestberg It was funded but is still fairly early in development. Essentially the sidechain nodes would sign batching transactions via secp256k1. Ouroboros on Ouroboros. An out-of-the-box decentralized, auditable batching mechanism.
@Benjmhart I think we can put this discussion on ice until we see what you guys come up with. Happy to help test once you get to that point. |
@NetWalker108 I'd need a flow chart of some kind to understand how Charon is proposed to work. My brain isn't "getting it" just from the text description.
This version does most of the heavy lifting off-chain (more flexible), but this work can be translated to Plutus scripts so that most of the heavy lifting is done on-chain (more strict). Do let me know if this helps @AndrewWestberg 💯
@NetWalker108 With the off-chain piece here, it seems like this is a centralized solution to the concurrency issue. If the dApp/Vendor is constructing the final signature, they are ensuring the one-at-a-time ordering to avoid utxo contention of the contract. How would you translate that dApp/Vendor box into something that could run on chain?
Yep, centralized off-chain coordination per dApp, at least in this version. Given most dApps are already very centralised at this level, I see it as no extra difference, just extra work. OTOH, every metric that is used to approve transactions is verifiable on-chain via the updates in Plutus script accounts, so dApp controllers can't make priority fees up (or rather, if they do, they can be caught publicly).
The stage where checks are made, i.e. the conditionals module, is built into the target Plutus script.

Datum: Current Priority Fee + Last TX Time (by script)

If phase-1 validation passes, the TX submits, executes, and updates the datum (i.e. the metrics) accordingly. No special signatures are required from the dApp vendor. If phase-1 validation fails, the TX never submits, so it never enters the mempool or chain. So the off-chain process moves away from the dApp vendor to the end-user.

EXTRA: Given its nature, I find it good to also serve as a priority-scheduling method for concurrency control. Naturally this will affect concurrency in favour of high-priority-fee transactions and cause starvation w.r.t. access for low-priority TXs, until the cooldown period allows for a reset. This will also result in more serial execution (one-at-a-time ordering) for high-priority transactions. Parallelism can be improved if dApps build provisions to facilitate it, e.g. DEXs create extra, smaller liquidity pools for low-priority TXs. All in all, this concept is still naive and can be improved upon (especially at the implementation level).
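The datum check described above can be sketched as a small state machine: the datum tracks the standing priority fee and the slot of the last accepted spend, and a new spend must either outbid the fee or wait out the cooldown. The field names and the cooldown value below are illustrative assumptions, not anything from the Charon design itself.

```python
from dataclasses import dataclass
from typing import Optional

COOLDOWN_SLOTS = 100  # assumed reset period (illustrative value)

@dataclass(frozen=True)
class SchedDatum:
    priority_fee: int  # lovelace offered by the last accepted spender
    last_tx_slot: int  # slot of the last accepted spend

def accepts(datum: SchedDatum, offered_fee: int, now: int) -> bool:
    # Either outbid the standing fee, or the cooldown has expired,
    # letting low-priority transactions back in.
    return offered_fee > datum.priority_fee or now - datum.last_tx_slot >= COOLDOWN_SLOTS

def step(datum: SchedDatum, offered_fee: int, now: int) -> Optional[SchedDatum]:
    # On success the datum is rewritten for the next contender; on
    # failure the TX would be rejected before ever being submitted.
    if accepts(datum, offered_fee, now):
        return SchedDatum(offered_fee, now)
    return None
```

This also makes the starvation behaviour visible: while the cooldown has not expired, only bids above `priority_fee` pass, so low-fee TXs stall until the reset.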
Summary:
I just posted a pair of a CPS on the topic of intents (cardano-foundation/CIPs#779) and a CIP proposing a limited solution to some of the problems (cardano-foundation/CIPs#780). I think these will be of interest to the people here! |
The above is acceptable because no fees are incurred by the submitter of the failed transaction, and the node will not need to execute the scripts in the submitter's transaction; instead the transaction will just be discarded immediately in phase 1 validation. In relation to the discussion on determinism, the key concern is specifically around maintaining the property which guarantees that we can determine off-chain the result of a transaction that runs phase-2 validation (whether it will succeed or fail), without reliance on the current state of the blockchain or other transactions in the mempool. This means we do not need to execute smart contracts (which is expensive) needlessly to determine whether or not they will fail; we can determine off-chain whether they will succeed or fail phase-2 validation. This way, any transaction with Plutus scripts that is submitted in good faith will either fail at phase 1 validation (and thus the scripts will not be wastefully executed and paid for) or it will execute and phase 2 validation will succeed. This is a critical property that we should maintain in any CIP.
"Determinism" is a bit of an overloaded concept here. Script determinism is indeed very important for the reasons that you mention, but Cardano has another kind of determinism that I don't think we quite have a formal description of yet. Basically: if you submit a transaction, it will do exactly what you expect... or it will do nothing. No behaviour is determined later. This is also a pretty nice property for users. But the fact that the transaction might not be applied is a kind of non-determinism from the user's perspective. Adding other kinds of intent will add more non-determinism. It is certainly true that any non-determinism that affects script execution will be extra tricky. We really don't want to have the possibility of e.g. a user submitting a partial intent that then gets filled in such that the script evaluation fails and their collateral gets claimed. |
@michaelpj This argument flares up on Twitter every once in a while, as there's someone in the community who insists on moving the goalposts such that "blockchain determinism" only refers to the fact that if you replay the blockchain you get the same result (which most of us contend is not a useful definition, as every blockchain must, almost by definition, have this property), and that the definition of determinism you describe is much more useful. @colll78 is the most recent one to get sucked into that debate 😅

Regarding script/intent indeterminacy, I would think that is fairly cleanly solved by having the person who resolves the intent put their collateral at risk, rather than the original intenter. They're the ones submitting the tx, they're the ones who should know whether they're going to be wasting node time, and combined with their choices, the tx should be fully determined and indistinguishable from a normal tx. So in the context of your proposal, this would equate to a collateral field and a signature from the submitter on the tx zone itself.
I saw this discussion just now because it was linked from cardano-foundation/CIPs#780 (something I am currently working on). FYI, on the topic of concurrency, two related papers got into conferences with proceedings (WTSC and FMBC): Structured Contracts (https://omelkonian.github.io/data/publications/eutxo-struc.pdf), which models stateful computation on an EUTxO ledger in a general way, and message-passing/double satisfaction (https://omelkonian.github.io/data/publications/eutxo-messages.pdf). Both discuss how to formally reason about the integrity of implementations of stateful contracts on the EUTxO ledger, giving examples of distributed implementations spread across multiple UTxOs. "Structured Contracts" introduces the approach, and the message-passing/double satisfaction paper uses it to present a distributed message-passing contract. I think this sort of formalizes what is being discussed here.
Dealing with concurrency on Cardano is something that most dapps and protocols have had to contend with due to the nature of utxo. There exists some local state and more than one actor wants to mutate that state at the same time.
Traditionally, each dapp has solved this in one of two ways.
1.) Fragment the utxos so that each actor has a better chance of getting their own view of the world to complete their transaction. This is complex for a developer to implement, still results in some collisions of actors, and doesn't scale well due to the added costs of maintaining a large set of utxos.
2.) Utilize a "batcher" approach. Each actor submits an "order" transaction that represents their desire to complete an action on the protocol. Later, the protocol operator or a federated group of batchers bundles up the orders and executes them as a group against the protocol.
On other blockchains that are not utxo-based, an account model has no issue with concurrent transactions because state is global. On Cardano, every protocol has to manage a huge list of utxos or find a way to pay decentralized batcher/scooper operators. This is a reason to pick other chains instead of Cardano. For example, WorldMobile picked a Cardano sidechain based on Cosmos.
One idea I had to solve this issue is to create a CIP for Smart Transactions™.
A Smart Transaction contains input and output utxos just like a standard transaction. However, some of the input and output utxos are not fixed at submission time, but are instead resolved while the transaction sits in the mempool. Instead of a hash#index for an input utxo, it is defined as a criteria object.
A criteria object contains an address bytearray; the mempool will find a utxo assigned to that address. Additional fields can be added to the criteria to match on the amount of ada (exact, less than, greater than, between). Another field could match a utxo containing specific inline datum values. There also needs to be a concept of capturing datum values as variables that can be used on the output side of the smart transaction.
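A minimal sketch of such a criteria object and its matching logic might look like the following. The class and field names (`Criteria`, `UtxoView`, `resolve`) are hypothetical illustrations of the CIP idea, not part of any cardano-node interface, and datum matching and variable capture are omitted for brevity.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class UtxoView:
    """A simplified view of a UTxO as the mempool might see it."""
    address: bytes
    lovelace: int

@dataclass(frozen=True)
class Criteria:
    address: bytes                      # must match exactly
    min_lovelace: Optional[int] = None  # "greater than" style bound
    max_lovelace: Optional[int] = None  # "less than" style bound

def matches(criteria: Criteria, utxo: UtxoView) -> bool:
    """Does this UTxO satisfy the criteria object?"""
    if utxo.address != criteria.address:
        return False
    if criteria.min_lovelace is not None and utxo.lovelace < criteria.min_lovelace:
        return False
    if criteria.max_lovelace is not None and utxo.lovelace > criteria.max_lovelace:
        return False
    return True

def resolve(criteria: Criteria, ledger: list[UtxoView]) -> Optional[UtxoView]:
    # Pick the first matching UTxO from the virtual mempool ledger
    # state; None means the smart transaction cannot yet be built.
    return next((u for u in ledger if matches(criteria, u)), None)
```

An "exact" amount match is expressible by setting both bounds to the same value; "between" by setting them to different values.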
The cardano-node is already validating transactions in the mempool. Whenever a new block arrives, transactions that exist in that block are dropped from the mempool. Also, any mempool transaction that has passed beyond the TTL value becomes invalid and is dropped.
What Smart Transactions proposes is additional validation to see whether any utxos can be gathered to make a valid transaction. If a valid transaction cannot be built from a transaction's criteria, it is dropped from the mempool. If a valid transaction can be constructed from utxos in the virtual mempool ledger state, they are added to new areas outside the body of the transaction. So a Smart Transaction contains criteria utxos, smart output utxos, and a record of which utxos were resolved by the mempool.
On the output utxo side, there needs to be some flexibility so that an output utxo can have a computed amount of ada, native assets, and datum. I'm not sure exactly how to implement this piece, but we need to support certain capabilities such as take an input datum integer value, increment it and apply it to the output datum. Maybe snippets of plutus code could be used for these capabilities.
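One way to make the "snippets of plutus code" idea concrete is a tiny expression language evaluated against the captured input datum value. Everything below (`Expr`, `Var`, `Lit`, `Add`) is a hypothetical illustration of what a computed output datum could look like, not a proposed wire format.

```python
from dataclasses import dataclass

class Expr:
    """A transform over the integer captured from the input datum."""
    def eval(self, captured: int) -> int:
        raise NotImplementedError

@dataclass(frozen=True)
class Var(Expr):
    """The datum value captured from the resolved input UTxO."""
    def eval(self, captured: int) -> int:
        return captured

@dataclass(frozen=True)
class Lit(Expr):
    value: int
    def eval(self, captured: int) -> int:
        return self.value

@dataclass(frozen=True)
class Add(Expr):
    left: Expr
    right: Expr
    def eval(self, captured: int) -> int:
        return self.left.eval(captured) + self.right.eval(captured)

# "take an input datum integer value, increment it and apply it to
# the output datum":
increment = Add(Var(), Lit(1))
print(increment.eval(41))  # 42
```

Keeping the transform language this restricted is what would keep mempool-side evaluation cheap, per the phase-1 cost concerns raised earlier in the thread.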
At the end of the day, we need the capability for two actors to each submit a Smart Transaction that interacts with a smart contract without specifying exactly which utxos on the contract it uses, instead specifying what a utxo must look like to be used. Then, if both come in at nearly the same time, the second one ends up selecting utxos from the outputs of the first. Both are placed successfully into the mempool as chained transactions.
There are likely some security implications to this to ensure it cannot be overly abused. Users might need to be warned whenever a criteria is selecting an open-ended utxo from an address in their own wallet. This also introduces the possibility that pool operators could engage in front-running. We would need to decide on whether this type of capability is useful enough to overcome these issues. I personally think this would open up Cardano to be able to build any type of protocol or dapp that is currently possible on other blockchains without resorting to batchers or cost-prohibitive architectures.