Part I in a new series discussing the challenges and the design choices that represent the backbone of the Dusk Network protocol.
As many of our readers may be aware, the success of a digital currency protocol depends not only on technical prowess or adoption of the technology, but also on an economic model built on game-theoretically sound principles, deployed with the common goal of sustaining a healthy network.
In this series we will discuss the challenges and design choices of the economic model that is the backbone of the Dusk Network Protocol.
Bitcoin, the first digital currency platform to reach mainstream acclaim, was designed as a network that incentivizes certain types of work, with the goal of drastically increasing the security of the network against attacks.
These incentives take the form of mining rewards and transaction fees. For now, we will set the mining rewards aside and concentrate on the transaction fees. It’s in the name: users that transact need to pay a fee. How the fee is structured depends on various parameters, such as the size of the transaction (in kB) and the limit, if any, on the block size.
In Bitcoin, the block size is limited to 1 MB. Since the average transaction is only a few hundred bytes, just a limited number of transactions can be added to each block. The 1 MB block limit thus effectively imposes a bottleneck on the throughput of the protocol, resulting in a queue of transactions waiting to be included in a block. Miners prioritize the transactions with the highest fees, so by creating scarcity, the protocol makes users pay increasingly higher fees as the network becomes more congested.
The transaction fee is calculated based on the transaction size, and miners are free to set lower bounds on the fees they will accept, as well as to prioritize the transactions with the largest fee per kB available.
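The prioritization described above can be sketched as a greedy selection by fee rate. This is an illustrative simplification, not Bitcoin's actual mempool logic; the 1 MB limit, the fee floor, and the sample transactions are assumptions for the example.

```python
# Illustrative sketch: a miner fills a size-limited block by greedily taking
# the highest fee-per-byte transactions first, after applying its own fee floor.
MAX_BLOCK_BYTES = 1_000_000  # Bitcoin's historical 1 MB block size limit

def select_transactions(mempool, min_fee_rate=1.0):
    """mempool: list of (size_bytes, fee_satoshi) tuples.
    min_fee_rate: miner-chosen lower bound, in satoshi per byte."""
    # Drop transactions below the miner's fee floor, then sort by fee rate.
    candidates = [tx for tx in mempool if tx[1] / tx[0] >= min_fee_rate]
    candidates.sort(key=lambda tx: tx[1] / tx[0], reverse=True)
    block, used = [], 0
    for size, fee in candidates:
        if used + size <= MAX_BLOCK_BYTES:
            block.append((size, fee))
            used += size
    return block

txs = [(250, 5000), (400, 2000), (300, 150)]  # (bytes, fee) — hypothetical
chosen = select_transactions(txs, min_fee_rate=1.0)
# The 300-byte transaction pays only 0.5 sat/byte and is filtered out.
```

When blocks are far from full, even the fee floor barely matters; the competition only bites once the queue exceeds the block capacity, which is exactly the scarcity mechanism described above.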
Caveats, caveats, caveats
Bitcoin’s success in creating competition for limited block space is also its downfall, as the constrained throughput is one of the reasons the protocol has failed to reach true mainstream levels of adoption. While one might think that raising the throughput limit could solve this problem, larger blocks lead to drastically diminishing fees overall, as seen in Bitcoin Cash. This undermines the security of the protocol once mining rewards dwindle to the point where transaction fees are the sole incentive for mining, and it also erodes the utility of the coin.
While the latter should not cause any serious issues for Bitcoin, which is considered to be a store of value, a similar scenario for protocol tokens with more utility would have a much more adverse and pronounced effect on the value of the token.
The miner is not the only entity bearing the cost of adding a transaction to the blockchain. Instead, the entire network shares the costs of the transactions. What are those costs, you might ask? The most notable ones are the cost of permanently storing the transaction, the cost of computation, and the cost of maintaining a transaction-related database (the UTXO set in a UTXO model, or a state tree in an account-based model). Bitcoin tackles the first and the third of these costs via an artificial threshold imposed on the throughput of the protocol, while the second cost is negligible due to the lightweight nature of the computation required to verify a Bitcoin transaction.
One of the most famous approaches to the issue above is the one taken by Ethereum. Instead of enforcing a static limit on block sizes, Ethereum institutes an adaptive gas limit on blocks, in other words limiting the amount of computation that can be done per block. Ethereum blocks are not space-bounded, which means that block sizes may vary depending on the computational intensity of the transactions included in the block. Ethereum also includes a mechanism for adapting the gas limit depending on the current gas consumption per block.
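As a concrete illustration of that adaptivity: in Ethereum's original (pre-EIP-1559) rules, each block's miner could move the gas limit by at most 1/1024 of the parent block's limit, nudging it toward a value the miner votes for. The sketch below captures that bound; the sample numbers are illustrative.

```python
# Sketch of Ethereum's (pre-EIP-1559) gas-limit adjustment: each new block may
# shift the gas limit by at most parent_limit // 1024 in either direction.
def next_gas_limit(parent_limit, miner_target):
    max_delta = parent_limit // 1024
    if miner_target > parent_limit:
        return min(miner_target, parent_limit + max_delta)
    return max(miner_target, parent_limit - max_delta)

# A miner voting to raise an 8M gas limit can only move it by 7,812 per block.
limit = next_gas_limit(8_000_000, 10_000_000)
# limit == 8_007_812
```

The small per-block delta means the limit adapts gradually to sustained demand, rather than letting a single miner open (or close) the throughput floodgates.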
Monero, on the other hand, imposes an adaptive block limit, which can decrease or increase depending on the throughput needs of the protocol. The protocol penalizes a miner for a drastic increase in block size compared to previous blocks, which prevents an adversary from bloating the blockchain.
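The penalty can be sketched roughly as follows: mining a block larger than the median size of recent blocks forfeits a quadratically growing share of the block reward, and blocks more than twice the median are invalid. This is a hedged simplification of the Monero scheme; consult the Monero protocol documentation for the exact rules and constants.

```python
# Hedged sketch of a Monero-style block reward penalty: exceeding the recent
# median block size costs the miner a quadratic fraction of the base reward.
def penalized_reward(base_reward, block_size, median_size):
    if block_size <= median_size:
        return base_reward  # no penalty at or below the median
    if block_size > 2 * median_size:
        raise ValueError("invalid block: more than twice the recent median")
    excess_ratio = block_size / median_size - 1.0  # in (0, 1]
    return base_reward * (1.0 - excess_ratio ** 2)

# A block 50% over the median forfeits 25% of the reward.
reward = penalized_reward(1.0, 150_000, 100_000)
# reward == 0.75
```

Because the penalty grows quadratically, small organic increases in block size cost miners almost nothing, while an adversary trying to bloat the chain quickly burns the entire reward.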
What is the ideal approach to take for Dusk Network?
As an anonymity-preserving protocol, Dusk Network imposes additional costs on the nodes participating in the network. For example, Dusk Network requires permanent storage of key images, which prevent an attacker from “double-spending” a transaction output, as well as an unprunable UTXO set, which grows linearly with the transaction outputs added to each block.
As with many complex issues, there is no perfect solution to the problem. Any solution would involve a compromise between scarcity and throughput. As we’ve outlined in this post:
- Bitcoin uses a fixed block limit
- Ethereum uses an adaptive gas limit
- Monero uses an adaptive block limit
So what is next?
In Part II of the series, we will discuss the approach the Dusk Network protocol deploys as a compromise between network throughput and meaningful reimbursement of the transaction-related costs outlined above.
Part III will outline the use of fees in terms of the Virtual Machine and how certain fee structures are necessary to avoid the “halting problem”.
Finally, in Part IV we will take a look at the ways in which the network participants are incentivized to retain the security of the protocol as well as the measures deployed to penalize malicious behaviour.
Dusk — Technology for Securities
Dusk Network is an open-source and privacy-oriented blockchain based on years of academic research. You can use Dusk Network to create smart contracts that control digital assets and securities.
Dusk Network Economic Model, Part I: Tackling the Problem was originally published in Dusk Network on Medium.