- Max Tang
- Nov 22, 2022
Musings on the Cobb-Douglas Function: Web3’s Useful Primitive
Musings on the utility of the Cobb-Douglas Function and its role in The Graph Network, by Max Tang.
The Graph Network uses the Cobb-Douglas function to incentivize Indexer behaviors. Cobb-Douglas historically had wide application in both empirical and theoretical economics. Since most Indexers come from a computer science background rather than an economics one, they typically need to pick up contextual knowledge about how all of this works at a fundamental level.
This is an introduction to the Cobb-Douglas function. Additionally, just like any other tool, this function comes with important limitations and trade-offs. We welcome The Graph community’s input to make continuous improvements.
Through this post, I aim to:
- Provide background information to existing and potential Indexers on this function
- Serve as a partial re-introduction of The Graph’s tokenomics
- Introduce and discuss Cobb-Douglas as a primitive for work tokens
- Invite you, the reader, to discuss future improvements
The Cobb-Douglas function is a term that is frequently used in web3, but is often opaque to its users. It is a staple function in economics. With its adoption by 0x, The Graph, and Goldfinch, it is emerging as a primitive in tokenomics. I will provide a little background about this function, starting with a simplified version of how it works, followed by a slightly deeper dive into its properties. It is important to note upfront that there are other functional forms that potentially serve the same purposes, and those are worth exploring in the future as well.
Table of contents
- Part 1: What are Cobb-Douglas Functions
- Part 2: The Graph’s Adoption of the Cobb-Douglas Function: Tokenomics Basics
- Part 3: The Graph’s Adoption of the Cobb-Douglas Function: The Design Mechanics
- Part 4: The Graph’s Adoption of the Cobb-Douglas Function: The α Coefficient
- Part 5: Optimizing the Cobb-Douglas Function
Part 1: What are Cobb-Douglas Functions
An Essential Explanation
At the base level, the goal of the Cobb-Douglas function is to find incentive alignments for a virtual owner-user marketplace. Imagine a world where taxi medallions are tokenized: drivers own the tokens which provide them a right to work on the platform. How do we find a mechanism that aligns usage and ownership?
The Cobb-Douglas function provides such a mechanism. In essence, it defines a mathematical relationship from inputs (staking and query fees) to outputs (query fee rebates).
A good example of Cobb-Douglas in action is The Graph’s work token model.
A Slightly More Technical Explanation
The function’s earliest form was a production function (the Cobb-Douglas Production Function). Cobb and Douglas modeled out how capital and labor ultimately contribute to final products (production). It looks like this:

P = b · L^k · C^(1−k)

- The output P is a function of Labor (L) and Capital (C),
- k and (1−k) are the exponents on Labor and Capital,
- b is the total factor productivity.
This is a mouthful, but it describes how the two factors of production, Labor and Capital, interact with each other. In other words, if Labor and Capital are the two ingredients of input, how much do each of these two factors contribute to output?
Although it is the original form of the function, its unique mathematical properties soon made it a useful tool for a variety of economic analysis situations. It morphed into a generic form:

Y = b · x1^α1 · x2^α2 · … · xn^αn

α1, α2, …, αn are positive numbers, but do not have to sum to one (depending on use cases). Compared with its original capital/labor form, this generic form can have any number of inputs that reference any ingredient. Like alchemy, you throw some inputs (e.g., copper, iron, a page from Gilgamesh) into the function and it can give an output (hopefully gold!).
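In code, the generic form is a one-liner. Below is a minimal Python sketch; the function name and every number are illustrative, not taken from any protocol:

```python
from math import prod

def cobb_douglas(inputs, exponents, tfp=1.0):
    """Evaluate tfp * x1^a1 * x2^a2 * ... * xn^an for any number of inputs."""
    assert len(inputs) == len(exponents), "one exponent per input"
    return tfp * prod(x ** a for x, a in zip(inputs, exponents))

# The original two-factor form, P = b * L^k * C^(1-k), with made-up numbers:
P = cobb_douglas([100.0, 400.0], [0.75, 0.25], tfp=1.01)
```

Because the exponents multiply in log space, doubling every input when the exponents sum to one exactly doubles the output, a property the later sections rely on.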
Since the function now has a generic form, it is used both in producer theory (as a production function) and in consumer theory (as a utility function). When it is used as a production function, it is like measuring the results of alchemy. From the Cobb-Douglas function, a rational producer would be able to determine, for example, how much copper to use.
When it is used as a utility function, it measures a consumer’s trade-off among various options. Should I buy more CryptoPunks or Bored Apes?
Because of its fit with both consumer theory and producer theory, the function naturally became a staple in applied general equilibrium analysis, which seeks to find a market-clearing point between supply (producer theory) and demand (consumer theory).
In summary, you’ll see Cobb-Douglas functions in various contexts. It could be a production function if it’s used in producer analysis, or a utility function if it’s used in consumer analysis. The forms (which dictate the mathematical properties) are similar, but the definition of the variables will be different in each context.
Part 2: The Graph’s Adoption of the Cobb-Douglas Function: Tokenomics Basics
The Graph uses a stake-to-earn model. Protocol participants are expected to stake their tokens to secure the network. One specific case of stake-to-earn is a work token model, pioneered by Augur and others.
The work token model works like this:
- A service provider in the network has to stake tokens to earn the right to perform the service for the network.
- The amount of service that they perform should be proportional to the amount of tokens they stake on the network.
It is akin to the taxi medallion market where the medallion entitles taxi drivers to operate in the market. In the taxi market, drivers purchase medallions to be able to operate in a city. These medallions are transferable, and there are even specialty financial services that provide medallion loans to drivers so that they can purchase medallions from other players.
When the local taxi market gains momentum due to reasons such as population increases, medallions traded in secondary markets appreciate in value. When the market experiences cyclical or structural issues (such as the entry of Uber), medallions fall in value. There is a self-balancing mechanism.
The Graph can be thought of as a virtualized medallion system, where GRT functions as a right to provide service on the platform.
Similar to medallions, GRT is meant to be purchased only in proportion to the level of work performed and services procured (query fees) on the protocol. If you have two drivers, you get one medallion (assuming two shifts in a day). If you have six drivers, you should get three.
The key challenge for this model is to create a reliable relationship between tokens staked and work performed. Ideally as more queries are performed in the network, the amount of tokens staked should increase. Using the taxi analogy, you don’t want people to sit on medallions and not go to work!
People buy medallions because they want to earn a living by taking passengers from point A to B, which is a right entitled by the medallion.
The Graph could have enforced this numerical relationship, but the rigidity could cause several issues:
- Limits the amount of work a smaller staker can perform, which is not conducive to the growth of the network
- Requires a mechanism to force large stakers to perform work when they don’t want to (or divest their stake), which can be quite complicated to coordinate on-chain. (Ramirez 2019)
In other words, The Graph’s design principle is that Indexers should have the freedom to serve any amount of queries regardless of their stake. Again using the medallion analogy, people should not be forced to work when they don’t feel well, even if they are a large medallion owner. The idea of Cobb-Douglas is to create an incentive mechanism to make it economically more sensible to work without forcing people to do so.
Part 3: The Graph’s Adoption of the Cobb-Douglas Function: The Design Mechanics
According to Edge & Node’s Co-Founder and CEO Brandon Ramirez, The Graph’s use of Cobb-Douglas was inspired by its adoption at 0x. (Bandeali et al 2019; Ramirez 2019)
The problem that it intends to address is: how do we design a system where users are owners, and they own the appropriate amount of GRT relative to their usage?
The protocol anticipates that GRT owners will stake their tokens in the contract, and actively participate in protocol governance. In a way, it is like designing co-ops and mutuals in a virtual marketplace. Cobb-Douglas serves as a mechanism to balance out the dual mandate of ownership and utility.
At a high level, the mechanism works like this: query fees first go into a mutual pool (the rebate pool). At the end of the period, the protocol uses the Cobb-Douglas formula to tally each Indexer’s share of the mutual pool. The share is based on both the amount of their staked GRT and the amount of work they performed (query fees).
The function is expressed as follows:

rewards = totalRewards · feeRatio^α · stakeRatio^(1−α)

- rewards is the fees collectable by a single Indexer,
- totalRewards is the total fees for all Indexers over an epoch (an epoch is currently 6,646 blocks, or about 22 hours post Ethereum Merge; it is managed by the EpochManager contract),
- feeRatio = fees attributed to the staking pool / total fees collected across all pools that earned rewards,
- stakeRatio = stake attributed to the staking pool / total stake across all pools that earned rewards,
- α is the Cobb-Douglas coefficient (originally named k in the Cobb-Douglas paper).
We can easily see the resemblance between the above function and the function’s original form:

P = b · L^k · C^(1−k)

Except that here we have two variables, feeRatio and stakeRatio. The function aims to address the split between staked GRT (Capital, which is meant to provide economic security) and query fees (Labor, which is the reward for serving queries).
In a world without Cobb-Douglas, once an Indexer has served queries, they collect the query fees they served. Let’s call it the “you eat what you kill” model.
In a world with Cobb-Douglas, once an Indexer has served queries, the query fees go into a mutual pool. The Indexer’s ultimate share of the pool is determined by both the amount they staked and the amount of queries they served.
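The pooled-rebate calculation can be sketched in a few lines of Python. All names and numbers below are illustrative; the production logic lives in the protocol’s Cobbs.sol library (Barmat et al n.d.):

```python
def rebate_share(total_rewards, fee_ratio, stake_ratio, alpha):
    # rewards = totalRewards * feeRatio^alpha * stakeRatio^(1 - alpha)
    return total_rewards * fee_ratio ** alpha * stake_ratio ** (1 - alpha)

pool = 1000.0   # total query fees in the rebate pool this epoch (made up)
alpha = 0.77    # the protocol's current Cobb-Douglas coefficient

# An Indexer that served 30% of the fees but holds only 10% of the stake
# collects less than the 300 it would get under "you eat what you kill":
under_staked = rebate_share(pool, fee_ratio=0.30, stake_ratio=0.10, alpha=alpha)
```

Under-staking relative to fees served shrinks the payout, which is exactly the incentive pressure the mutual pool is designed to create.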
An obvious question is: is there an optimal amount of stake relative to the fees served that maximizes profits for Indexers?
We can use a metric called staking intensity to describe this problem:

stakingIntensity = stakeRatio / feeRatio

It is the amount of GRT staked relative to the fees served by an Indexer. So the above question can be rephrased as: is there an optimal staking intensity for Indexers?
There is currently limited consensus on this question. One school argues that there is no optimal staking intensity. People do not have an incentive to increase the overall size of the rebate pool; they are only incentivized to increase their share, which means that they will always stake more.
Another school argues that there is an optimal staking intensity. The reason is that there is an implicit cost of capital for staking. The excess amount of GRT staked will earn less fees than its alternatives.
What are the alternatives? One choice is to delegate to other Indexers who are not staking enough (stakingIntensity < 1). In other words, the marginal productivity of the capital is higher in lending these tokens out than self-staking.
Another way to think about it is the diminishing marginal productivity of capital implied by the Cobb-Douglas function. While it is always positive (i.e. putting in more capital always gets more returns), the marginal benefit decreases as you put in more capital. It is better to employ the capital elsewhere for higher returns.
Intuitively, the optimal choice is to stake GRT in proportion to the queries served. In other words, when feeRatio = stakeRatio (i.e., stakingIntensity = 1), Indexers get back exactly what they would have gotten in the “you eat what you kill” world. There is no inefficiency in this state.
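Both claims, the exact pro-rata payout at stakingIntensity = 1 and the diminishing marginal product of stake, can be checked numerically. A minimal Python sketch with made-up pool numbers:

```python
def rebate_share(total_rewards, fee_ratio, stake_ratio, alpha):
    return total_rewards * fee_ratio ** alpha * stake_ratio ** (1 - alpha)

pool, alpha = 1000.0, 0.77

# stakingIntensity == 1: the exponents collapse (r^a * r^(1-a) == r), so the
# Indexer receives exactly its pro-rata fees, as in "you eat what you kill".
balanced = rebate_share(pool, 0.30, 0.30, alpha)

# Diminishing marginal productivity of capital: each extra unit of stakeRatio
# adds less to the payout than the previous one did.
gain_low  = rebate_share(pool, 0.30, 0.31, alpha) - rebate_share(pool, 0.30, 0.30, alpha)
gain_high = rebate_share(pool, 0.30, 0.61, alpha) - rebate_share(pool, 0.30, 0.60, alpha)
```

The marginal gain stays positive but keeps shrinking, which is why the excess stake is better employed elsewhere, for example via delegation.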
This is the ideal equilibrium state of the query fee market as intended by the Cobb-Douglas function. In other words, in the long run, Indexers should allocate a proportion of stake equivalent to the share of query fees that they generate, everything else being equal.
Empirically speaking, the first school of thought (that there is no optimal staking intensity) is correct for now, because of the reasons that we will discuss in Part 5. We will also discuss some issues we encountered in the function’s practical implementation.
Part 4: The Graph’s Adoption of the Cobb-Douglas Function: The α Coefficient
In addition to stakingIntensity, the exponents α and (1-α) are also important variables. They are called factor shares of the production function: they dictate the share of capital (staked GRT) and labor (query fees) in this query fee production market.
Note that the exponents add up to 1: α + (1−α) = 1. This is called “Constant Returns to Scale”. It means that if we increase both feeRatio and stakeRatio by a certain percentage, the Indexer’s share in the mutual pool will also increase by the same percentage.
In other words, regardless of whether it is a large Indexer or a small Indexer, if the Indexer simultaneously increases its contribution of both capital (stakeRatio) and labor (feeRatio) by 20%, its share of the reward pool will also increase by 20%; if both inputs are increased by 35%, the output will be increased by 35% as well.
Hence a large Indexer will not be rewarded disproportionately just because it is large, and vice versa. This feature also eliminates the possibility that participants game the system by aggregating or disaggregating wallets.
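A quick numerical check of the constant-returns property, again with purely illustrative numbers:

```python
def rebate_share(total_rewards, fee_ratio, stake_ratio, alpha):
    return total_rewards * fee_ratio ** alpha * stake_ratio ** (1 - alpha)

pool, alpha = 1000.0, 0.77
base   = rebate_share(pool, 0.10, 0.20, alpha)
scaled = rebate_share(pool, 0.10 * 1.20, 0.20 * 1.20, alpha)
# Constant returns to scale: scaling both inputs by 20% scales the share by
# exactly 20%, i.e. scaled == 1.20 * base (up to floating-point error).
```

This is also why splitting one wallet into two (or merging two into one) leaves the combined payout unchanged: the shares simply add.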
Just to complete the picture: when the sum of the exponents is > 1, we get increasing returns to scale. This occurs in certain industries with monopolistic tendencies (e.g., most power markets). When the sum of the exponents is < 1, we get decreasing returns to scale. In a trustless setting, both of these setups can be gamed. Therefore The Graph assumes Constant Returns to Scale (sum of the exponents = 1).
Fully understanding the mechanism requires some background in basic calculus. You can check out the math in this lecture note under the section “Return to Scale”. (Cottrell 2019)
But what does α actually mean? We can look at it as the labor (query fee)’s share of the total output. (1-α) is the capital (GRT staking)’s share. In other words, in a given epoch, labor is entitled to α of the fee earnings and capital (GRT staking) is entitled to (1-α).
Looking forward, assuming the market stays in equilibrium, there will be a stream of fee earnings accruing to capital (GRT staking). A GRT owner’s value can be derived from a discounted-present-value analysis. If the total discounted present value of protocol query fees is X, capital’s value is (1−α) · X. It is similar to what we have in corporate finance: a firm’s value is the discounted present value of its future cash flows (Discounted Cash Flow, or DCF).
Put another way, query fees are the explicit protocol “revenue”, whereas staking/signaling is the implicit protocol “revenue”. Again, this is a flawed analogy given that GRT is a utility token.
The good thing about DCF is that we can do some fair value analysis with traditional valuation metrics. We can analyze the size of the market that The Graph potentially serves (hint: much more than blockchain indexing), assume a market structure and market share for The Graph protocol, apply the protocol margin (1-α) and use a certain discount rate to get a terminal value. However, we have to be cautious as this analysis assumes the market is in an equilibrium state intended by Cobb-Douglas’s optimal stakingIntensity. It does not work in the current market where a significant number of token holders do not participate in the network.
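To make the arithmetic concrete, here is a toy version of that fair-value calculation. Every input below is a hypothetical assumption, not a forecast or an estimate of The Graph’s actual market:

```python
# A toy fair-value sketch; all numbers are hypothetical assumptions.
alpha = 0.77                     # current Cobb-Douglas coefficient
annual_fees = 10_000_000.0       # assumed steady-state yearly query fees
discount_rate = 0.10             # assumed discount rate

X = annual_fees / discount_rate  # perpetuity value of the whole fee stream
capital_value = (1 - alpha) * X  # the slice accruing to capital (GRT staking)
```

The perpetuity shortcut (fees / discount rate) stands in for a full multi-period DCF; the point is only that capital’s slice of any such valuation is scaled by (1 − α).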
We can even take a step further and think about how discounted cash flow analysis is applied in its traditional firm valuation context. The cash flow of each period, net of payouts, is the cash flow captured by the firm. The cash flow not captured by the firm goes to other factors of production (salary, suppliers, among others). The percentage that the firm retains out of the total topline revenue is the firm’s profit margin. Since the Cobb-Douglas coefficient α dictates capital’s share of output (topline revenue) in each period, from an income statement’s perspective it dictates the firm’s profit margin.
In other words, in The Graph’s setting, the coefficient of stakeRatio (1-α) is the protocol’s de facto margin, borrowing language from accounting.
Currently, the α coefficient is set at 0.77, which is calculated in the smart contract as:

α = alphaNumerator / alphaDenominator

For real-time information, see alphaNumerator and alphaDenominator on Etherscan. This means that, for an Indexer, GRT staking is expected to capture 23% (= 1 − 0.77) of the query fee value.
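For illustration, the rational-number representation looks like this in Python. The 77/100 pair is an assumption consistent with the quoted 0.77; the authoritative values are the contract’s alphaNumerator and alphaDenominator:

```python
# alpha is stored on-chain as a rational number; 77/100 is an illustrative
# pair consistent with the quoted value of 0.77 (check the contract for the
# actual numerator and denominator).
alpha_numerator, alpha_denominator = 77, 100
alpha = alpha_numerator / alpha_denominator   # 0.77
staking_share = 1 - alpha                     # capital's share of fee value
```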
Part 5: Optimizing the Cobb-Douglas Function
There is still a fair amount of work to be done to optimize the framework. For example, the function implies a complicated game theory analysis of market participants when they contribute capital (they have to contribute an optimal amount relative to other market participants). Participants are penalized for not playing the game properly. However, this is where a theoretical game meets praxeology. The complexity of the game has deterred the players from playing it the way it is intended.
Additionally, the protocol currently issues inflationary rewards to Indexers. At this stage of the protocol’s development, the rewards are much larger than the query fees. Naturally, Indexers are optimizing their behaviors toward the inflationary rewards rather than the query fee rebate pool. How do we properly adjust the incentives at this early stage of the query fee market?
Also, at the heart of the Cobb-Douglas function is a regression analysis. We have to look at empirical data to determine the value of α. This can be done when the query fee market gets sufficiently large and provides more relevant time series datasets.
Lastly, there is the participation of speculators in the query fee market. Economics professors Sockin and Xiong pointed out that the presence of speculators may contribute to a breakdown of market equilibrium in a utility-token market (Sockin and Xiong 2020). Users can be crowded out by the participation of speculators. How should we design a better marketplace considering the presence of speculators?
Part of the benefits of building in the open (the bazaar approach) is that we can potentially get inputs from a wide range of people and everybody contributes to the development of the protocol. I would argue that tokenomics is squarely in the middle of the bazaar, just like any pieces in the stack. By musing about the history of primitives and thinking through its use cases and limitations, we collectively contribute to the knowledge pool and potentially move the protocol forward. I invite everybody to challenge and discuss this primitive.
The artworks are credited to the following open source AI projects:
- InvokeAI Stable Diffusion Toolkit (InvokeAI nd),
- CompVis/stable-diffusion-v1-4 checkpoint hosted on Hugging Face (Rombach et al 2022; CompVis at Ludwig Maximilian University of Munich),
- runwayml/stable-diffusion-v1-5 checkpoint hosted on Hugging Face (Rombach et al 2022; Runway Research),
- LAION-5B Open Dataset, a free dataset of over 5 billion image-text pairs (Beaumont 2022); LAION-Aesthetics, subsets of LAION-5B (Schuhmann 2022).
Bandeali, Avi, Will Warren, Weijie Wu, and Peter Zeitz. 2019. “Protocol Fees and Liquidity Incentives in the 0x Protocol.” 0x Protocol Working Paper. Accessed October 22, 2022. https://gov.0x.org/t/research-on-protocol-fees-and-liquidity-incentives/340.
Barmat, Ariel, et al. n.d. “Graph Protocol Contracts - LibCobbDouglas.” GitHub. Accessed October 22, 2022. https://github.com/graphprotocol/contracts/blob/dev/contracts/staking/libs/Cobbs.sol.
Barmat, Ariel, and David Kajpust. 2022. “Graph Protocol Contracts - Rebates.” GitHub. Accessed October 22, 2022. https://github.com/graphprotocol/contracts/blob/dev/contracts/staking/libs/Rebates.sol.
Beaumont, Romain. 2022. “LAION-5B: A NEW ERA of OPEN LARGE-SCALE MULTI-MODAL DATASETS | LAION.” Laion.ai. Accessed November 5, 2022 https://laion.ai/blog/laion-5b/.
Biddle, Jeff. 2021. Progress through Regression: The Life Story of the Empirical Cobb-Douglas Production Function. Cambridge, United Kingdom; New York, NY: Cambridge University Press.
Biddle, Jeff. 2012. “Retrospectives: The Introduction of the Cobb–Douglas Regression.” Journal of Economic Perspectives 26, no. 2 (May): 223–36. https://doi.org/10.1257/jep.26.2.223.
Cottrell, Allin. 2019. “The Cobb-Douglas Production Function.” Accessed Oct 22, 2022. https://users.wfu.edu/cottrell/ecn207/cobb-douglas.pdf.
“Desmos | Graphing Calculator | Untitled Graph.” n.d. Desmos. Accessed October 22, 2022. https://www.desmos.com/calculator/exrkmlfmr4.
Douglas, Paul, and Charles Cobb. 1928. “A Theory of Production.” The American Economic Review 18, no. 1, Supplement (March): 139–65.
Etherscan.io. n.d. “The Graph: Proxy 2 | Address 0xF55041E37E12cD407ad00CE2910B8269B01263b9 | Etherscan.” Ethereum (ETH) Blockchain Explorer. Accessed October 22, 2022. https://etherscan.io/address/0xF55041E37E12cD407ad00CE2910B8269B01263b9#readProxyContract
Goldfinch. 2022. “GIP-13 Tokenomics Update Phase 1: Membership Vaults.” Goldfinch Governance Forum. June 7, 2022. Accessed October 22, 2022. https://gov.goldfinch.finance/t/gip-13-tokenomics-update-phase-1-membership-vaults/996.
Indexer Office Hours. 2022. “Indexer Office Hours #73.” Accessed October 22, 2022. https://www.youtube.com/watch?v=cc0o7AiFUpA&t=2099s.
InvokeAI. n.d. “InvokeAI.” GitHub. Accessed October 22, 2022. https://github.com/invoke-ai.
Malinvaud, Edmond. 2003. “The Legacy of Knut Wicksell to Capital Theory.” Scandinavian Journal of Economics 105, no. 4 (December): 507–25. https://doi.org/10.1111/j.0347-0520.2003.00001.x.
Ramirez, Brandon. 2019. “The Graph Network in Depth - Part 2.” The Graph Blog. Accessed October 22, 2022. https://thegraph.com/blog/the-graph-network-in-depth-part-2/.
Rombach, Robin, Andreas Blattmann, Dominik Lorenz, Patrick Esser, and Björn Ommer. 2022. “High-Resolution Image Synthesis with Latent Diffusion Models.” ARXIV. Accessed October 22, 2022. https://arxiv.org/abs/2112.10752v2.
Samuelson, Paul A. 1979. “Paul Douglas’s Measurement of Production Functions and Marginal Productivities.” Journal of Political Economy 87, no. 5, Part 1 (October): 923–39. https://doi.org/10.1086/260806.
Schuhmann, Christoph. 2022. “LAION-Aesthetics | LAION.” Laion.ai. Accessed November 7, 2022. https://laion.ai/blog/laion-aesthetics/.
Sockin, Michael, and Wei Xiong. 2020. “A Model of Cryptocurrencies.” NBER Working Paper No. 26816. Accessed October 22, 2022. http://www.nber.org/papers/w26816.
Zeitz, Peter. 2019. “0x Governance, Fees and Liquidity Rebates.” www.youtube.com. Accessed October 22, 2022. https://www.youtube.com/watch?v=s2wlzlQxd5E.
About Edge & Node
Edge & Node is a software development team dedicated to the advancement of web3. Founded by the initial team behind The Graph, a protocol for indexing and querying blockchain data, Edge & Node plays a vital role in supporting and scaling protocols and development teams throughout the global web3 ecosystem. From building solutions that helped decentralize one of the most critical layers of web3 to tracking every project being built in the decentralized community, Edge & Node is mission-focused on empowering individuals with the ability to contribute to moving humanity forward.