Original title: a16z: 7 Tips for Token Design
Original article by Guy Wuollet
Source: Odaily Planet Daily
Tokens are a powerful new primitive that can be defined in a variety of ways. The design space for Tokens is rich, but we are still in the early stages of exploring it.
In fact, many teams struggle to find the "right" Token design for their projects. Because the industry lacks a battle-tested design framework, each new generation of builders repeatedly faces the same challenges as its predecessors. Fortunately, there are a few examples of early, successful Token designs. Most valid Token models have unique elements targeted at their goals, whereas most flawed Token designs share a set of common bugs. This article therefore discusses why we should think in terms of Token research and design, not just the "Token economy," and lists seven tips for avoiding the pitfalls.
The biggest problem in Token design is building a complex Token model before the goal is clear. The first step should be to identify the goal and make sure the entire team fully understands it: what is it, why does it matter, and what do you actually want to accomplish? Failing to define goals rigorously often leads to redesigns and wasted time. Clarity of purpose also helps avoid "inventing a Token economy for the sake of designing a Token economy," a problem common in Token economy design.
In addition, the goals should revolve around the Token itself, though this is often overlooked. Examples of clear goals include:
Designing a Token model for a game that achieves the best possible scalability and supports modeling.
A DeFi protocol designing a Token model that distributes risk sensibly among participants.
Designing a reputation protocol that guarantees money cannot directly substitute for reputation (for example, by separating liquidity from reputation signals).
Designing a storage network that ensures files are available with low latency.
Designing a staking network that provides maximum economic security.
Designing a governance mechanism that elicits users' true preferences or maximizes participation.
The list goes on. Whatever the use case or goal, the Token should be designed to support it, not the other way around.
So how do you begin to define a clear goal? Clearly defined goals often derive from the project's mission. While the mission tends to be high-level and abstract, the goals should be concrete and reduced to their most basic form.
Take EIP-1559 as an example. One of Tim Roughgarden's stated objectives for EIP-1559 was that it "should improve the user experience through easy fee estimation, in the form of an 'obvious optimal bid,' outside of periods of rapid demand growth."
He went on to offer another clear goal: "Can we redesign Ethereum's transaction fee mechanism so that setting a transaction's fee is more like shopping on Amazon? Ideally, a posted-price mechanism, meaning a take-it-or-leave-it price for each user?"
What these two examples have in common is that they state a high-level goal, offer a relevant analogy to help others understand it, and then proceed to outline the design that best supports that goal.
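Roughgarden's "obvious optimal bid" goal is realized through a protocol-computed base fee that rises when blocks are fuller than their target and falls when they are emptier. Below is a minimal sketch of that update rule, simplified from the EIP-1559 specification (the change denominator is the spec's constant; the numeric examples are illustrative):

```python
# Minimal sketch of the EIP-1559 base fee update rule (simplified from the spec).
BASE_FEE_MAX_CHANGE_DENOMINATOR = 8   # caps the per-block change at 12.5%

def next_base_fee(base_fee: int, gas_used: int, gas_target: int) -> int:
    """Raise the base fee when a block is fuller than target, lower it when emptier."""
    if gas_used == gas_target:
        return base_fee
    delta = base_fee * abs(gas_used - gas_target) // gas_target // BASE_FEE_MAX_CHANGE_DENOMINATOR
    if gas_used > gas_target:
        return base_fee + max(delta, 1)  # the spec enforces a minimum increase of 1 wei
    return base_fee - delta

# A completely full block (2x target) raises the fee by 12.5%;
# an empty block lowers it by 12.5%.
print(next_base_fee(1000, 2000, 1000))  # 1125
print(next_base_fee(1000, 0, 1000))     # 875
```

Because the fee is set by the protocol rather than by a blind auction, a user's optimal strategy reduces to "pay the posted base fee plus a small tip," which is exactly the take-it-or-leave-it experience the goal describes.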
When creating something new, it's a good idea to start with what you already have. When you evaluate existing protocols and existing literature, you should evaluate them objectively on their technical merits.
Token models are typically evaluated based on the price of the Token or the popularity of the related project. These factors may be unrelated to whether the Token model can achieve its stated goals. Evaluating Token models by valuation, popularity, or other superficial measures can lead builders down many detours. If you assume other Token models work when they do not, you may create a Token model that is inherently flawed.
Articulate your assumptions. When you are focused on building, it is easy to take basic assumptions for granted. It is also easy to mistake which assumptions you are actually making.
Consider a new protocol that assumes that the hardware bottleneck is computing speed. Making this assumption part of the Token model (for example, by limiting the cost of hardware required to participate in the protocol) can help align the design with the desired behavior.
However, if the protocol and Token designers do not state their assumptions explicitly, or if the assumptions they state are wrong, participants who notice the mismatch may be able to extract value from the protocol. Hackers are often the people who understand a system better than those who built it.
Clarifying your assumptions makes it easier to understand your Token design and to confirm that it works. You cannot test an assumption you have not stated.
As the saying goes, "It ain't what you don't know that gets you into trouble. It's what you know for sure that just ain't so."
Token models typically rest on a number of assumptions. This approach comes in part from Byzantine fault-tolerant system design, the inspiration for blockchains: the system makes an assumption and guarantees a certain output if the assumption holds. For example, Bitcoin guarantees liveness under a synchronous network model, and guarantees consistency if at least 51% of hash power is honest. Several smaller blockchains have suffered 51% attacks, which violate the honest-majority assumption that Nakamoto consensus requires for the blockchain to function correctly.
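The honest-majority assumption can be made concrete with the random-walk analysis from the Bitcoin whitepaper: an attacker controlling fraction q of hash power eventually overtakes a chain that is z blocks ahead with probability (q/p)^z, where p = 1 - q, and with certainty once q reaches one half. A small sketch:

```python
def catch_up_probability(q: float, z: int) -> float:
    """Probability that an attacker with fraction q of hash power ever overtakes
    an honest chain that is z blocks ahead (random-walk result from the
    Bitcoin whitepaper). With q >= 0.5, the attacker eventually wins."""
    p = 1.0 - q
    if q >= p:
        return 1.0
    return (q / p) ** z

print(catch_up_probability(0.10, 6))  # tiny: deep reorgs are negligible
print(catch_up_probability(0.45, 6))  # ~0.30: minorities near 50% are still dangerous
print(catch_up_probability(0.51, 6))  # 1.0: the honest-majority assumption is violated
```

The sharp discontinuity at 50% is exactly why the assumption must be stated: the guarantee does not degrade gracefully once the assumption breaks.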
Token designers can test their assumptions in a number of ways. Rigorous statistical modeling, often in the form of agent-based models, can help stress-test them. Assumptions about user behavior can often be tested by talking to users, and better still by observing what people actually do (rather than what they say they do). Assumptions can also be validated empirically through incentivized testnets, which produce real results in a sandboxed environment. Formal verification or intensive audits also help ensure that the codebase behaves as expected.
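As an illustration of agent-based testing, here is a hypothetical sketch (the agents, thresholds, and parameters are all invented for the example) that checks the assumption "a fixed reward budget per round keeps most of the supply staked." Each agent stakes only when the current yield beats its private opportunity cost:

```python
import random

def staked_fraction(reward_budget: float, total_supply: float = 1000.0,
                    n_agents: int = 1000, rounds: int = 200, seed: int = 0) -> float:
    """Fraction of supply staked at equilibrium under a fixed per-round reward budget."""
    rng = random.Random(seed)
    holding = total_supply / n_agents
    # Each agent's private required yield per round (its opportunity cost).
    thresholds = [rng.uniform(0.01, 0.10) for _ in range(n_agents)]
    staked = total_supply  # start with everyone staked
    for _ in range(rounds):
        yield_rate = reward_budget / staked if staked > 0 else float("inf")
        willing = sum(holding for t in thresholds if yield_rate >= t)
        staked = 0.5 * staked + 0.5 * willing  # damped adjustment toward equilibrium
    return staked / total_supply

# Cutting the budget does not scale participation linearly; the model shows
# where the equilibrium actually lands before the team commits to a budget.
print(staked_fraction(reward_budget=60.0))  # roughly three quarters of supply staked
print(staked_fraction(reward_budget=10.0))  # participation drops sharply
```

Even a toy model like this surfaces the feedback loop (lower staking raises yield, which attracts stakers back) that a back-of-the-envelope calculation misses.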
An abstraction barrier is the interface between different layers of a system or protocol. It separates a system's components so that each can be designed, implemented, and modified independently. Clean abstraction barriers are useful in every field of engineering, and especially in software design; they are even more necessary in decentralized development, where large teams build complex systems that no single person can fully understand.
In Token design, the goal of clean abstraction barriers is to minimize complexity. Reducing the internal dependencies between the components of a Token model yields cleaner code, fewer bugs, and better Token design.
For example, many blockchains are built by large engineering teams. One team might make an assumption about how hardware costs change over time and use it to determine how many miners will contribute hardware to the blockchain at a given Token price. If another team relies on the Token price as a parameter but does not know the first team's assumption about hardware costs, it can easily make a conflicting assumption.
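One way to enforce such a barrier in code is to put the hardware-cost assumption behind an explicit interface, so the consuming team cannot silently depend on its internals. A hypothetical sketch (the interface, class names, and numbers are all invented for illustration):

```python
from typing import Protocol

class HardwareCostModel(Protocol):
    """Abstraction barrier: Team A owns the hardware-cost assumption behind
    this interface; Team B sees only the interface, never the internals."""
    def cost_per_miner(self, year: int) -> float: ...

class FlatCostModel:
    """Team A's current assumption: hardware costs 5,000 (hypothetical units), flat over time."""
    def cost_per_miner(self, year: int) -> float:
        return 5000.0

def expected_miners(token_price: float, annual_reward_tokens: float,
                    year: int, costs: HardwareCostModel) -> int:
    """Team B's estimate: miners join while the yearly reward value covers hardware cost."""
    reward_value = token_price * annual_reward_tokens
    return int(reward_value // costs.cost_per_miner(year))

print(expected_miners(token_price=2.0, annual_reward_tokens=100_000.0,
                      year=2025, costs=FlatCostModel()))  # 40
```

If Team A later revises its assumption (say, to declining costs), only the implementation behind the interface changes; Team B's estimate, and every other consumer, picks up the change without any hidden conflicting assumptions.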
At the application level, clean abstraction barriers are critical to composability. As more and more protocols are combined with each other, the ability to adapt, build on, extend, and remix them only becomes more important. Greater composability brings greater possibility, but also greater complexity: when applications compose, they must understand the details of the protocols they compose with.
Opaque assumptions and interfaces can lead to subtle bugs, especially in early DeFi protocols. Vague abstraction barriers also increase the amount of communication required between teams working on different components of a protocol, prolonging development time. They add to the protocol's complexity as well, making its mechanisms difficult to fully understand.
By drawing explicit abstraction barriers, Token designers can more easily predict how a given change will affect each part of the Token design. Clear abstraction barriers also make a Token or protocol easier to extend, fostering a more inclusive and extensible builder community.
External parameters are not intrinsic to the system, but they can affect its overall performance and success. Examples include the cost of computing resources, transaction volume, or latency at the time the Token model was first created.
But unexpected behavior can occur when a Token model only works while such parameters remain within a limited range. For example, a protocol that sells a service and offers a rebate in the form of a fixed Token reward may find the rebate worth more than the service costs if the Token price rises unexpectedly. In that case it becomes rational to buy unlimited service from the protocol, pocketing the full value of both the Token rewards and the service.
Or take another example: decentralized networks often rely on cryptographic puzzles that are difficult but not impossible to solve. The difficulty often depends on an exogenous variable, such as how fast computers can evaluate a hash function or a zero-knowledge proof. Consider a protocol that assumes a given hash function is computed at a certain speed and pays Token rewards accordingly. If someone invents a faster way to compute the hash function, or simply controls outsized resources disproportionate to their actual work in the system, they could earn unexpectedly large Token rewards.
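Both failure modes come down to an invariant that silently depends on an exogenous parameter. A hypothetical sketch of checking such an invariant across the parameter's range, rather than assuming the parameter stays put (all names and numbers are invented for illustration):

```python
# Invariant for the fixed-Token-rebate example: the rebate's fiat value must
# never exceed the fiat cost of the service being sold.
SERVICE_COST = 10.0    # fiat cost of one unit of service (assumed)
REBATE_TOKENS = 50.0   # fixed Token rebate per purchase (assumed)

def rebate_is_safe(token_price: float) -> bool:
    """True while buying the service is not pure profit for the buyer."""
    return REBATE_TOKENS * token_price <= SERVICE_COST

# Sweep the exogenous parameter instead of assuming it stays in range.
for price in (0.05, 0.20, 0.25):
    print(price, rebate_is_safe(price))
# Above a price of 0.20, buying unlimited service becomes pure profit, so the
# reward should be capped, or denominated in fiat terms rather than Tokens.
```

The same sweep applies to the hashing example: vary the assumed hash rate and check that rewards stay proportional to useful work over the whole plausible range.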
Designing a Token is like designing an adversarial system: user behavior will change as the Token model changes.
A common mistake is adjusting the Token model without confirming that every possible user behavior still produces acceptable results. Do not assume user behavior stays the same when the Token model changes. This error often occurs late in the design process: someone spends a great deal of time defining the Token's purpose, specifying its functionality, and validating that it works as intended. Then they encounter an edge case, change the Token design to accommodate it, and forget to re-validate the entire model. In fixing one edge case, they introduce another unintended consequence (or several).
Don't let the hard work go to waste: every time you change the Token model, re-verify that it still works as expected.