Not Just Hot Air: Liberal Radicalism and Gas Markets

Recently, I wrote about Liberal Radicalism: a paper worth studying because it combines deep economics with game theory, the mathematics behind Quadratic Voting, political science, history and philosophy. One thing I was left wondering, though, is "What are the immediate practical applications of all this stuff?"

Radically different ways of organising society and the economy are hard to grasp, because it is difficult for most of us to think outside the paradigms in which we currently live (i.e. the "reward-oriented hierarchy" Marianne Brün describes in Paradigms: The Inertia of Language). Proposed alternatives often rest on entirely different premises and concepts, which means they come across as abstract at best.

So, the question remains: how do we actually apply it? We're not going to convince the Fed to implement Harberger Taxes tomorrow, nor any developed government to switch to Quadratic Voting for elections, so where do we start, and how do we iterate from there?

One potential answer comes from a paper Vitalik Buterin published just a few weeks earlier, "Blockchain Resource Pricing", which is about "gas": the stuff used to pay for cycles of computation in the Ethereum Virtual Machine.

From The Abstract to Real Problems

The paper opens by outlining the key problem faced by shared computational resources:

"One of the most challenging issues in blockchain protocol design is how to limit and price the submission of transactions that get included into the chain. Every transaction confers some private benefit to its sender, but transactions also incur social costs to the network as a whole, as every node in the network must process every transaction. This results in a classic example of the 'tragedy of the commons' problem."

The question posed here is: how do we set the price of each resource equal to the social cost that consuming it imposes on the network? Remember, too, that there is a very direct relationship between optimal decisions about how to fund public goods and collective decision-making as a whole. Blockchains literalise this link: if we can figure out how to sustainably fund the public goods we all share, we will simultaneously have found optimal governance solutions at the protocol layer.

As with any effective form of governance, such solutions are defined by use, rather than by ambiguous, discursive debates over definitions. Vitalik says as much in the discussion around the paper on ethresear.ch:

"What I'm suggesting would turn that block-by-block price volatility into block-by-block usage volatility, reducing costs and delays for participants at fairly little cost."

That's really quite a profound claim, when you think about it for a little while.

So, while gas may still seem abstract to many people, it is precisely the first place where the ideals of Liberal Radicalism can be usefully and practically applied. Because gas carries neither the social nor the linguistic inertia of existing political and financial systems and markets, it can be designed from different first principles, and it seems to me that Buterin, Hitzig and Weyl's paper is an attempt to encapsulate some of those principles in formal logic.

Begin at the Beginning

"A transaction that is published to a blockchain confers some private benefit to its sender, but also confers an external social cost to the network's participants. In order to account for this social cost and prevent abuse of the blockchain as a common pool resource, some economic mechanism for restricting what transactions get included is required. However, there are many types of economic mechanisms that can be used to solve resource pricing problems of this type, and understanding which one is optimal requires more deeply understanding the nature and types of social costs in question."

There are basically 4 different kinds of such "social costs":

  • Bandwidth cost: the cost of all nodes downloading each submitted transaction, bundling it into a block, and then rebroadcasting the transaction as part of some block.
  • Computational cost: the cost of every node verifying each transaction.
  • History storage cost: the cost of storing the transaction for all nodes that store the blockchain's history, for the time for which the history is stored (possibly infinity).
  • State storage cost: the marginal cost of the impact of the transaction on the size of the state (eg. contract code, account balances) that every node must store to be able to process further transactions.

By splitting things up like this, we can see that the first two costs are paid by nodes that are online exactly when a transaction is included; the third is paid by those nodes plus any that come online at some point in the near future; and the fourth is paid by all nodes, forever, amen.

We can also categorise costs by their first- and second-order effects on nodes, by assigning each transaction a weight according to its complexity. Each node can then be assigned a resource cost function which, at very high weights (i.e. very complex transactions), becomes untenable, causing it to drop off the network and lowering the node count (which can therefore itself be modelled as a function of total transaction weight).

This means - stick with me here! - that there is a utility function reflecting the social value of the level of decentralization achieved by having lots of online nodes, which can be translated into a function of the total transaction load. There is also a cost function that reflects the increased ease of attacking the network as more transactions get included and more nodes drop off.

We can then summarise all of these costs in a single combined cost function, which serves as the model for the rest of the paper, irrespective of the consensus algorithm used. Pretty cool.
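
In rough symbols (my own notation, a deliberate simplification of the paper's model): let $W$ be the total transaction weight, $c$ a per-weight-unit processing cost, $n(W)$ the node count as a function of load, $U$ the utility of having many online nodes, and $A$ the attack-ease cost. The combined cost function then looks something like

$$
C(W) \;=\; \underbrace{W \cdot c}_{\text{validation work}} \;+\; \underbrace{U(n_0) - U(n(W))}_{\text{decentralisation lost as nodes drop off}} \;+\; \underbrace{A\!\left(W, n(W)\right)}_{\text{increased ease of attack}}
$$

The point is just that all four social costs, plus the second-order effects on node count, can be folded into one function of $W$.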

A Trail of Bits

Both Bitcoin and Ethereum currently use a simple "cap-and-trade" scheme to price shared resources, by defining the total quantity of resources that the transactions contained in a block may consume (albeit in different ways). The problem, in case the technical language has you lost, is how to figure out a reasonable weight limit to use in such schemes (as the never-ending debate about Bitcoin block sizes shows in practice).
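
As a minimal sketch of what "cap-and-trade" means here (the cap value is illustrative, not either chain's actual parameter):

```python
# A toy "cap-and-trade" block policy.
MAX_BLOCK_WEIGHT = 8_000_000  # hard cap on total resources per block (assumed)

def block_within_cap(tx_weights: list[int]) -> bool:
    # Senders bid for inclusion ("trade"), but however the bidding shakes
    # out, the block's total resource consumption may never exceed the cap.
    return sum(tx_weights) <= MAX_BLOCK_WEIGHT
```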

"The purpose of this paper will be to try to go beyond the one-dimensional design space of 'there must be some limit, what should it be?' and explore a much larger space of policies that attempt to address transaction resource consumption externalities, and try to develop policies that are both closer to optimal at the present time, and more robust to changes in economic circumstances in the long-term future, reducing the need for 'extra-protocol interventions' such as hard forks."

Just pause for a moment here and think about what Vitalik is really doing. He is mapping traditionally linguistic means of interpreting and ordering the world, and our ways of being in it, into mathematical space. Rather than conducting policy discussions in discursive language, which is open to both ambiguity and interpretation, he reasons about how we price public goods using mathematical objects, which provides a far more powerful, direct, and exact means of discussing optimal policy. It may at first seem more abstract, but the irony is that it is actually its very concreteness which leaves people confused.

Feeling Green?

There are interesting parallels between gas and environmental regulation: just as the pollution produced by a profitable factory must be suffered by everyone living in the surrounding area, validators are rewarded for including transactions in blocks, but publishing those blocks imposes a cost on all the full nodes that must then process them. What we are trying to do is limit the negative externalities involved.

Citing Weitzman's 1974 paper "Prices vs Quantities", Vitalik makes the point that regulation by price and regulation by quantity have the same effect if and only if there is perfect information about both the social cost function and the demand curve for consuming the resource. That is never the case in the real world, and the two approaches have very different effects in practice.

Basically, if the consumer's marginal private costs increase faster with quantity than the marginal social costs, then setting prices is better. Otherwise, setting quantities is better (some social costs around global warming and "tipping points" are good examples of situations where the marginal social cost increases much faster than private costs to individuals).

Mathematically, we are working with second derivatives here, because we're primarily interested in the rate of change in marginal costs. There's one further complication, too: "If changes in the cost and benefit curves are correlated, then an additional term must be added into the choice rule, increasing the relative attractiveness of limiting quantity."
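
Compressed into symbols (my own schematic rendering of the claim above, not Weitzman's exact expression): writing $C_{\text{priv}}$ and $C_{\text{soc}}$ for the private and social cost curves as functions of quantity $q$, the choice turns on the sign of

$$
\Delta \;\propto\; \underbrace{\frac{d^{2}C_{\text{priv}}}{dq^{2}}}_{\text{slope of marginal private cost}} \;-\; \underbrace{\frac{d^{2}C_{\text{soc}}}{dq^{2}}}_{\text{slope of marginal social cost}},
\qquad
\Delta > 0 \Rightarrow \text{set prices}, \quad \Delta < 0 \Rightarrow \text{set quantities.}
$$

Second derivatives, because what matters is how fast the marginal (first-derivative) costs change.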

The paper gets more and more complicated from here on out, so before we dive further into it, I want to highlight again what is happening here conceptually. Vitalik writes:

"One can think of policy space as the space of possible supply curves for a given resource, where a pricing policy represents a horizontal supply curve and a cap-and-trade scheme represents a vertical supply curve."

Again, the project here is to map real-world policy discussions onto mathematical objects (in this case, cost and demand curves) about which we can reason concretely, using linguistic primitives that can operate without the need for trust because others can verify them independently. By narrowing the interpretive space from natural language, with all its ambiguities, to mathematics, Vitalik excludes the many people who cannot follow the equations, but he produces results that are provably optimal. As with everything, it's a trade-off.

Whether that trade-off will work out, or result in even less humane systems by excluding the ambiguity which is such a signature of human relationship, remains to be seen.
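
Before moving on, to make the supply-curve picture concrete (my notation, not the paper's): a pricing policy with floor price $p^{*}$ and a cap-and-trade scheme with cap $\bar{W}$ are the two degenerate supply curves

$$
S_{\text{price}}(p) \;=\; \begin{cases} \infty & p \ge p^{*} \\ 0 & p < p^{*} \end{cases}
\qquad\qquad
S_{\text{cap}}(p) \;=\; \bar{W} \;\;\text{for all } p
$$

The first is a horizontal line, the second a vertical one; the larger design space the paper explores is everything in between.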

Curves All The Way Down

We then dive deep into estimating different kinds of curves, which you can take a look at for yourself. After some references to a Cornell study of Bitcoin, and some slightly questionable assumptions that allow a decidedly exponential-looking curve to be treated as linear when considering Ethereum's social cost curves, Vitalik ends up arguing that such social cost curves are most likely U-shaped for any blockchain accepting transactions.

Estimating the private benefit curve (i.e. what individuals stand to gain from their use of public goods) is significantly harder, and requires colloquial reference to the sorts of costs we have actually seen on Bitcoin when transaction processing time between blocks increases, or on Ethereum when gas costs go up. The maths that follows suggests that "a flat per-weight-unit in-protocol transaction fee, coupled with a hard limit at the point where the marginal social cost starts rapidly increasing, is superior to a pure weight limit-based regime", which has always been the economic argument for using gas in Ethereum anyway.

However, looking only at the short-term effects that longer processing times or higher gas prices have on fees does not account for demand elasticity, which is generally higher in the long run, and which takes us into a larger consideration of cryptocurrency prices as a whole.

"Specifically, (i) the price of a cryptocurrency; (ii) the social cost curve [as the number of beneficiaries of the system increases, and the number of full nodes also increases]; and (iii) the benefit curve [as there are more users sending transactions] are all highly correlated with a single variable, which we might call 'adoption'."

Using some fairly simple maths this time, we can show that under the assumptions above, "adoption leaves the private benefit and social cost curves unchanged", though there is no empirical data to prove this yet. However, "the lucky coincidence of being forced to denominate fees in a cryptocurrency whose price is itself proportional to adoption, [means] there is at least no very strong first-order reason to expect positive correlation between the nominal benefit and cost curves. Hence, the arguments for using fixed fees in addition to gas limits still stand."
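
A cartoon of the cancellation at work (my own sketch of the argument, not the paper's derivation): suppose adoption $a$ scales the real, fiat-denominated benefit curve $b(q)$ linearly, and the token price $p(a)$ is itself proportional to $a$. Denominated in the token, the benefit curve is then

$$
b_{\text{token}}(q) \;=\; \frac{a \cdot b(q)}{p(a)} \;\propto\; \frac{a \cdot b(q)}{a} \;=\; b(q)
$$

and likewise for the cost curve: to first order, adoption drops out of the token-denominated curves, which is exactly the "lucky coincidence" in the quote.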

Phew, that's good to know if you use Ethereum.

Design Your Mechanism!

Now for the fun stuff: "The field of mechanism design [MD] has made many discoveries about what types of auctions perform better under what circumstances, and much of it is relevant to transaction fee markets."

Current fee markets allow senders to specify any fee they want along with their transaction, otherwise known as "a first price auction, characterized by the key property that 'you pay what you specify, and only if you win.'" Mechanism design shows such auctions to be deeply suboptimal, though, and the typical alternative for selling many items is a kth-price auction, where "everyone pays the same as the lowest fee that gets included."
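
A toy comparison of the two formats (illustrative numbers only; this is not any client's actual fee logic):

```python
def first_price(bids, slots):
    """Include the top `slots` bids; each sender pays exactly what they bid."""
    winners = sorted(bids, reverse=True)[:slots]
    return [(bid, bid) for bid in winners]        # (bid, payment)

def kth_price(bids, slots):
    """Include the top `slots` bids; everyone pays the lowest included bid."""
    winners = sorted(bids, reverse=True)[:slots]
    clearing = winners[-1]
    return [(bid, clearing) for bid in winners]   # (bid, payment)

bids = [50, 40, 30, 20, 10]   # offered fees, in made-up units
print(first_price(bids, 3))   # [(50, 50), (40, 40), (30, 30)]
print(kth_price(bids, 3))     # [(50, 30), (40, 30), (30, 30)]
```

In the first-price format, senders have to guess and shade their bids below their true willingness to pay, which is the source of the inefficiency mechanism design identifies.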

However, these in turn "have a different kind of serious flaw: they are not incentive-compatible for the auctioneer (i.e. block proposer)", and so encourage collusion between the block proposer and some transaction senders, which is also something we want to guard against in the design. Where does that leave us, you ask? "So far the evidence suggests that hard quantity limits are overused and price floors are underused. But how do we even start trying to set the price floor?"

See, even Vitalik asks questions rather than just dropping ridiculous amounts of knowledge sometimes. Take heart, brave reader…

Basically, what he wants to do here is show that "it is possible to improve upon a hard quantity limit in a way that specifically alleviates the problem of deadweight losses from short-term transaction fee volatility, without having to set a specific price as a protocol parameter."

There follow some calculations which you have to see for yourself, because they're epic: they involve normalizing weight units so that the optimal weight limit and transaction fee level are both 1, then drawing deadweight-loss triangles to calculate the size of the economic inefficiency.

What this all proves "is a sort of marginal analogue to Weitzman's 1974 result, where in the exact same case where choosing a price is better than choosing a quantity, a quantity limit can be improved on the margin by replacing it with a 'flexible limit' that is really just a price level that adjusts over the medium term to target the quantity limit."
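
A minimal sketch of that "flexible limit" idea (the parameter names and the adjustment speed are my own assumptions, not values from the paper):

```python
TARGET_WEIGHT = 1.0   # normalized so the optimal weight limit is 1
ADJUST_RATE = 0.125   # how quickly the price floor responds (assumed)

def update_price_floor(price_floor: float, weight_used: float) -> float:
    # Fuller-than-target blocks push the minimum fee up; emptier blocks
    # push it down. Short-term demand shocks then show up as usage
    # volatility around the target rather than as fee volatility.
    excess = (weight_used - TARGET_WEIGHT) / TARGET_WEIGHT
    return price_floor * (1 + ADJUST_RATE * excess)
```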

Remember: sometimes it's optimal to regulate price, and other times it's optimal to regulate quantity, depending on the rate of change of the marginal costs to both individuals and the societies they exist within. From this observation and the interplanetary maths I have skipped, Vitalik can propose an alternative resource pricing/limit rule that he believes has superior properties to the hard limit implemented now. Which is pretty f**kin epic.

This idea is finished off with a note about how setting fee parameters always implies an element of central planning, but if the aim of such planning is to reduce complexity (specifically, Kolmogorov complexity), it can result in policies that are "as simple as possible while still being substantial and needed improvements on the status quo."

Moar Complexity, Though…

Nope, sorry, we're not done yet. Everything above works well for simple systems, but computational networks have lots of different costs associated with them, as previously mentioned: bandwidth, computational, history storage and state storage. "One question worth asking is: can we somehow measure in-protocol the social cost of computation and bandwidth, or at least a more limited statistic like the maximum level of computation and bandwidth that the median client can safely handle?"

Bitcoin does achieve this to a certain extent, but is particularly fragile in the face of hardware advances as shown by the rise of ASIC miners over GPUs, for instance. Furthermore, proof-of-work algorithms only deal with the "cost of calculating a computation once. The actual social cost also depends heavily on the number of nodes in the network, as each node will need to run the computation, and this is also extremely difficult to measure."

While bandwidth, computation and state I/O can mostly be dealt with by fixed limits, storage cannot, because state space must be stored by (and thus burdens) all full nodes, forever. In Bitcoin, there are no explicit fees for filling storage, while Ethereum adopts a more complex gas cost schedule for storage-affecting operations, encapsulated by the SSTORE and CREATE opcodes in the EVM.

In case you ever wanted to know, the exact fees are calculated by "taking as a goal a cost of 200 gas per byte in storage, estimating the number of bytes added to storage space by each particular type of storage-filling operation, multiplying the two values, and then adding an additional term to take into account other costs such as computation and contribution to history size."
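
As a rough sanity check (my own back-of-the-envelope; the ~100-byte estimate is an assumption, not a figure from the paper): filling a fresh storage slot means a 32-byte key and a 32-byte value plus trie overhead, call it roughly 100 bytes, so

$$
200\ \tfrac{\text{gas}}{\text{byte}} \;\times\; \sim\!100\ \text{bytes} \;\approx\; 20{,}000\ \text{gas},
$$

which matches what SSTORE actually charges when it writes to a previously empty slot.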

Despite the more complex approach, both Bitcoin and Ethereum suffer four big drawbacks from such schemes:

  1. Storage is far too cheap in an absolute sense.
  2. The social cost of storage is close to linear, rather than the U-shaped curve we ideally want.
  3. There is insufficient incentive to clear storage.
  4. There is no incentive to clear storage earlier rather than later.

There are various solutions to this, including a time-based storage maintenance fee (i.e. the infamous "rent"), but Vitalik ends up arguing "in favor of simply setting the maintenance fee to one specific value (eg. 10^-7 ETH per byte per year) and leaving it this way forever." This is because the social cost of storage is pretty much linear, and it also results in a far better and more predictable user experience for everyone.
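
To put that figure in human terms (simple arithmetic on the value in the quote):

$$
10^{-7}\ \tfrac{\text{ETH}}{\text{byte} \cdot \text{year}} \;\times\; 1{,}000\ \text{bytes} \;=\; 10^{-4}\ \text{ETH per year},
$$

so a contract occupying a kilobyte of state would pay a ten-thousandth of an ETH per year to remain resident.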

Conclusion

So what does it all mean, Bob? Well, more "economic analysis and econometric research can be used to help identify further mechanisms that can be used to better reduce costs while discouraging wasteful use of public blockchain resources." Typically modest, this is Vitalik's way of showing how mapping discursive policy discussions onto mathematical objects, which can be understood through formal means that do not require trust, can really help us build a more equitable world.

Sure, there's lots of rhetoric around permissionless innovation and open access to decentralised tools and networks, but that battle isn't actually fought at the level of natural language: it is fought at the level of protocols, which is where power really operates in the modern world. Liberal Radicalism presents us with a framework in which to understand more completely, and therefore challenge, such power. It's not only knowledge, though: the unique combination of economic modelling, game theory, interplanetary maths, and political science also gives us a toolset with which to mount such challenges in the world of which we are a part.

It may still seem abstract, though it is in fact the very concreteness of these theories which makes them at times difficult to follow. Nevertheless, it seems appropriate that a fairly technical, little-known, and abstract concept like "gas" fees for transactions on the first world computer would be the first place where we can actually go ahead and implement some of these exciting new ideas about how to organise ourselves more efficiently and thereby bring about the sort of accessible and valuable world we want so badly to see.