Why Web3 Needs a Separate Data Availability Layer

2022-11-22 18:00
Original title: "Why Does Web3 Need a Separate Data Availability Layer?"
Original author: Kang Shuiyue


As the data economy matures, people become broadly and deeply involved in it, and everyone inevitably takes part in some form of data storage. With the arrival of the Web3 era, most technology fields will gradually be upgraded or transformed over the coming years, and decentralized storage, as a key piece of Web3 infrastructure, will be deployed in more application scenarios. The data storage networks behind familiar services such as social platforms, short video, live streaming, and smart cars will likewise adopt decentralized storage models in the future.


Data is the core asset of the Web3 era, and user ownership of data is Web3's defining feature. Letting users securely own their data and the assets that data represents, and dispelling ordinary users' worries about asset security, will help bring the next billion users into Web3. A separate data availability layer will be an integral part of that.


From Decentralized Storage to Data Availability Layer


In the past, data was stored in the cloud in the traditional centralized way, usually residing entirely on centralized servers. Amazon Web Services (AWS) pioneered cloud storage and remains the world's largest cloud storage provider. Over time, users' demands for personal information security and data storage have kept rising, especially after data breaches at several large data operators exposed the drawbacks of centralized storage; traditional storage methods can no longer meet current market demand. Coupled with the advance of the Web3 era and the growth of blockchain applications, data has become more diverse and its scale keeps expanding. Personal network data is more comprehensive and more valuable, making data security and data privacy more important and raising the requirements for data storage.


Decentralized data storage emerged in response. It is one of the earliest and most closely watched pieces of infrastructure in the Web3 field; the earliest solution was Filecoin, launched in 2017. Compared with AWS, there is an essential difference between decentralization and centralization. AWS builds and maintains its own data centers made up of many servers, and users who need storage services pay AWS directly. Decentralized storage instead follows the sharing economy: it uses a large number of edge storage devices to provide storage services, and the data actually resides on storage contributed by Provider nodes, so the decentralized storage project itself cannot control that data. The most essential difference between decentralized storage and AWS is whether users control their own data. In a system without centralized control, the safety factor of the data is much higher.


Decentralized storage is essentially a storage business model that splits files or file sets into fragments and distributes them across storage space. Decentralized storage matters because it addresses the pain points of Web2 centralized cloud storage and better fits the needs of the big-data era: it can store unstructured edge data at lower cost and higher efficiency, enabling various emerging technologies. Decentralized storage can therefore be called the cornerstone of Web3 development.
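
As a rough illustration of the fragment-and-distribute idea described above, the sketch below splits a file into fixed-size fragments and records a content hash for each one so the pieces can later be located and verified. The chunk size and manifest format are assumptions for illustration only, not the format of any particular network.

```python
import hashlib

CHUNK_SIZE = 256 * 1024  # illustrative 256 KiB fragments


def fragment(data: bytes, chunk_size: int = CHUNK_SIZE) -> list[bytes]:
    """Split a blob into fixed-size fragments for distribution to provider nodes."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]


def build_manifest(fragments: list[bytes]) -> list[str]:
    """Record a content hash per fragment so each piece can be fetched and checked later."""
    return [hashlib.sha256(frag).hexdigest() for frag in fragments]


if __name__ == "__main__":
    blob = b"example user file" * 100_000
    frags = fragment(blob)
    manifest = build_manifest(frags)
    print(f"{len(frags)} fragments, first hash: {manifest[0][:16]}...")
```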


There are currently two common types of decentralized storage projects. The first is built around block production and uses storage for mining. The problem with this model is that on-chain storage and download slow down actual use: it can take several hours to download a single photo. The second uses one or a few nodes as centralized nodes, and data can only be stored and downloaded after being verified by those nodes. Once the centralized nodes are attacked or damaged, the stored data is lost as well.


Compared with the first type of project, MEMO's storage layering mechanism solves the storage and download speed problem, bringing download times down to the order of seconds. Compared with the second type, MEMO introduces the Keeper role to randomly select verification nodes, avoiding centralization while ensuring security. Moreover, MEMO's original RAFI technology improves repair capability several times over, greatly improving the security, reliability, and availability of storage.
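
The article does not spell out how MEMO's Keepers are randomly chosen, so the following is only a generic sketch of seed-based pseudo-random verifier selection; the seed (e.g. a recent block hash) and all names are assumptions for illustration, not MEMO's actual mechanism.

```python
import hashlib


def select_keepers(candidates: list[str], seed: bytes, count: int) -> list[str]:
    """Deterministically pick `count` verifier nodes from `candidates` using a shared seed.

    Every honest node that knows the seed derives the same selection, so no
    single party controls which nodes verify a given file.
    """
    scored = sorted(
        candidates,
        key=lambda node_id: hashlib.sha256(seed + node_id.encode()).digest(),
    )
    return scored[:count]


if __name__ == "__main__":
    providers = [f"keeper-{i}" for i in range(20)]
    block_hash = hashlib.sha256(b"recent block header").digest()  # illustrative seed
    print(select_keepers(providers, block_hash, count=3))
```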


Data availability (DA) is essentially about light nodes that do not participate in consensus, do not need to store all the data, and do not need to track the state of the entire network in real time. Such nodes need an efficient way to confirm that data is available and correct, because the core of a blockchain is the immutability of its data and the guarantee that data is consistent across the whole network. To maintain performance, consensus nodes tend to become more centralized, so other nodes must obtain consensus-confirmed, available data through the DA layer. An independent data availability layer effectively eliminates the single point of failure and maximizes data security.
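
A common way for a light node to gain confidence that the data behind a commitment is actually available is to sample random chunks and verify each one against the published Merkle root. The sketch below illustrates that idea with a plain Merkle tree; real DA layers add erasure coding on top, and the function names here are illustrative assumptions rather than any network's API.

```python
import hashlib
import random


def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()


def build_levels(chunks: list[bytes]) -> list[list[bytes]]:
    """Build every level of a Merkle tree over the chunks, leaves first, root last."""
    levels = [[h(c) for c in chunks]]
    while len(levels[-1]) > 1:
        lvl = levels[-1]
        if len(lvl) % 2:
            lvl = lvl + [lvl[-1]]  # duplicate the last node on odd-sized levels
        levels.append([h(lvl[i] + lvl[i + 1]) for i in range(0, len(lvl), 2)])
    return levels


def make_proof(levels: list[list[bytes]], index: int) -> list[bytes]:
    """Collect the sibling hashes needed to recompute the root from one chunk."""
    proof = []
    for lvl in levels[:-1]:
        padded = lvl + [lvl[-1]] if len(lvl) % 2 else lvl
        proof.append(padded[index ^ 1])
        index //= 2
    return proof


def verify_proof(chunk: bytes, index: int, proof: list[bytes], root: bytes) -> bool:
    """Fold the chunk hash with its siblings and compare against the published root."""
    node = h(chunk)
    for sibling in proof:
        node = h(node + sibling) if index % 2 == 0 else h(sibling + node)
        index //= 2
    return node == root


def sample_availability(request, root: bytes, n_chunks: int, samples: int = 8) -> bool:
    """Light-node check: ask a full node for a few random chunks plus Merkle proofs
    and verify each against the root; any missing or invalid response fails."""
    for _ in range(samples):
        i = random.randrange(n_chunks)
        response = request(i)  # expected to return (chunk, proof) or None
        if response is None:
            return False
        chunk, proof = response
        if not verify_proof(chunk, i, proof, root):
            return False
    return True


if __name__ == "__main__":
    data_chunks = [f"tx batch part {i}".encode() for i in range(10)]
    tree = build_levels(data_chunks)
    root = tree[-1][0]
    honest = lambda i: (data_chunks[i], make_proof(tree, i))
    print(sample_availability(honest, root, len(data_chunks)))  # True
```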


In addition, Layer 2 scaling solutions such as zkRollup also need a data availability layer. Layer 2 acts as the execution layer and uses Layer 1 as the consensus layer: besides posting the resulting state of each transaction batch to Layer 1, it must keep the original transaction data available, so that the state of the Layer 2 network can still be reconstructed when no prover is willing to generate proofs, avoiding the extreme case of user assets being locked in Layer 2. However, storing the raw data directly on Layer 1 conflicts with Layer 1's role as the consensus layer in a modular blockchain architecture. Storing the data in a dedicated data availability layer, and recording only the Merkle root computed over that data on the consensus layer, is the more reasonable design and the longer-term trend.
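
To make "record only the Merkle root on the consensus layer" concrete, the toy comparison below computes a root over a batch of raw transactions and contrasts what would be posted to Layer 1 (a 32-byte commitment) with what goes to the data availability layer (the full batch). The transaction encoding is an assumption purely for illustration.

```python
import hashlib


def merkle_root(items: list[bytes]) -> bytes:
    """Pairwise-hash the items up to a single 32-byte root."""
    level = [hashlib.sha256(x).digest() for x in items]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [hashlib.sha256(level[i] + level[i + 1]).digest()
                 for i in range(0, len(level), 2)]
    return level[0]


if __name__ == "__main__":
    # Hypothetical batch of raw Layer 2 transactions (encoding is illustrative).
    batch = [f"transfer(from=0x{i:02x}, to=0x{i + 1:02x}, amount={i})".encode()
             for i in range(1_000)]
    root = merkle_root(batch)
    print(f"full batch for the DA layer: {sum(len(tx) for tx in batch)} bytes")
    print(f"commitment for Layer 1:      {len(root)} bytes ({root.hex()[:16]}...)")
```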


Figure 1 shows the general Layer 2 independent data availability layer model designed by Fox Tech.


Figure 1: General Layer 2 independent data availability layer model


Analysis of Celestia as an Independent Data Availability Layer


An independent data availability layer is a public chain, which is safer than a data availability committee made up of a group of fallible humans. If enough committee members' private keys are stolen (as happened with both the Ronin Bridge and the Harmony Horizon Bridge), off-chain data availability breaks down, and users can even be blackmailed: they can only withdraw from Layer 2 if they pay a large enough ransom.


Since an off-chain data availability committee is not secure enough, what if a blockchain is introduced as the trusted party to guarantee the availability of off-chain data?


What Celestia does is make the data availability layer more decentralized: it provides an independent DA public chain with its own validator nodes, block producers, and consensus mechanism to raise the level of security.


Layer 2 publishes its transaction data to the Celestia main chain, and the Merkle root of the DA attestation is signed and sent to the DA Bridge Contract on the Ethereum main chain for verification and storage. In this way, the Merkle root of the DA attestation serves as proof of the availability of all the data. The DA Bridge Contract on Ethereum only needs to verify and store this Merkle root, which greatly reduces the overhead.
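
The article only says that the DA Bridge Contract verifies and stores the attested Merkle root. The sketch below models that bookkeeping in Python, with toy HMAC "signatures" standing in for validator signatures; this is a loud assumption for illustration, not how Celestia's bridge actually verifies its validator set on-chain.

```python
import hmac
import hashlib

# Toy stand-in for a validator set: shared HMAC keys instead of real
# public-key signatures (an assumption purely for illustration).
VALIDATOR_KEYS = {f"validator-{i}": f"secret-{i}".encode() for i in range(4)}
QUORUM = 3  # e.g. more than 2/3 of 4 validators


def sign(validator: str, root: bytes) -> bytes:
    return hmac.new(VALIDATOR_KEYS[validator], root, hashlib.sha256).digest()


class DABridgeContract:
    """Minimal model of a bridge contract that checks a quorum of attestations
    over a data root and then stores only that 32-byte root per batch."""

    def __init__(self):
        self.roots: dict[int, bytes] = {}

    def submit_attestation(self, batch_id: int, root: bytes,
                           signatures: dict[str, bytes]) -> bool:
        valid = sum(
            1 for v, sig in signatures.items()
            if v in VALIDATOR_KEYS and hmac.compare_digest(sig, sign(v, root))
        )
        if valid < QUORUM:
            return False
        self.roots[batch_id] = root  # only the root is stored on Layer 1
        return True


if __name__ == "__main__":
    root = hashlib.sha256(b"batch 42 data held on the DA layer").digest()
    sigs = {v: sign(v, root) for v in list(VALIDATOR_KEYS)[:3]}
    bridge = DABridgeContract()
    print(bridge.submit_attestation(42, root, sigs))  # True
```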


Celestia's fraud proof is an optimistic proof: as long as no one in the network misbehaves, it is very efficient. If nothing goes wrong, no fraud proof is ever produced, and light nodes need do nothing beyond receiving the data and reconstructing it according to the code. As long as the whole process is correct, the optimistic approach remains very efficient.
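
The optimistic idea is that everyone accepts the posted result by default, and a fraud proof is only constructed when re-execution disagrees with the claimed state. The sketch below uses a toy balance-transfer state machine (all names and the state model are assumptions for illustration) to show when a challenge would be raised.

```python
import hashlib
import json


def state_root(balances: dict) -> bytes:
    """Hash a canonical encoding of the state; stands in for a real state commitment."""
    return hashlib.sha256(json.dumps(balances, sort_keys=True).encode()).digest()


def apply_batch(balances: dict, txs: list) -> dict:
    """Re-execute a batch of (sender, receiver, amount) transfers."""
    new = dict(balances)
    for sender, receiver, amount in txs:
        if new.get(sender, 0) >= amount:
            new[sender] -= amount
            new[receiver] = new.get(receiver, 0) + amount
    return new


def check_claim(pre_state: dict, txs: list, claimed_root: bytes):
    """Optimistic verification: stay silent if the claim matches, otherwise
    return a fraud proof (here, simply the correct root contradicting the claim)."""
    correct = state_root(apply_batch(pre_state, txs))
    if correct == claimed_root:
        return None  # nothing to do in the happy path
    return {"claimed": claimed_root.hex(), "correct": correct.hex()}


if __name__ == "__main__":
    pre = {"alice": 100, "bob": 0}
    txs = [("alice", "bob", 40)]
    honest_root = state_root(apply_batch(pre, txs))
    print(check_claim(pre, txs, honest_root))                           # None
    print(check_claim(pre, txs, state_root({"alice": 0, "bob": 100})))  # fraud proof
```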


Analysis of MEMO as an Independent Data Availability Layer


MEMO is a new generation of high-capacity, high-availability enterprise-grade storage network built by aggregating global edge storage devices through its algorithms. The team was founded in September 2017 and focuses on research in decentralized storage. MEMO is a highly secure, highly reliable, large-scale decentralized data storage protocol based on blockchain peer-to-peer technology. Unlike one-to-many centralized storage, MEMO enables many-to-many storage without data centers. On MEMO's main chain, smart contracts constrain all nodes: key operations such as uploading stored data, matching storage nodes, keeping the system running, and enforcing the penalty mechanism are all controlled by smart contracts.


In terms of technology, among existing decentralized storage systems, Filecoin, Arweave, and Storj are representative: they let any computer user connect and rent out unused hard disk space for a fee or tokens. Although they are all decentralized storage, each has its own characteristics. MEMO differs in that it uses erasure coding and data repair technology to improve storage, make data more secure, and make storage and download more efficient, because building a purer, more practical decentralized storage system is MEMO's ultimate goal.
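
The article mentions erasure coding and data repair without detail, so the following is only the simplest possible illustration of the idea: a single XOR parity shard lets any one lost shard be reconstructed. Production systems use stronger codes (e.g. Reed-Solomon) that tolerate the loss of several shards; the shard contents here are illustrative.

```python
from functools import reduce


def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))


def encode(shards: list) -> list:
    """Append one parity shard (XOR of all data shards); shards must be equal length."""
    return shards + [reduce(xor_bytes, shards)]


def repair(shards: list) -> list:
    """Rebuild a single missing shard (marked None) from the surviving ones."""
    missing = [i for i, s in enumerate(shards) if s is None]
    assert len(missing) == 1, "this toy code repairs exactly one lost shard"
    survivors = [s for s in shards if s is not None]
    repaired = list(shards)
    repaired[missing[0]] = reduce(xor_bytes, survivors)
    return repaired


if __name__ == "__main__":
    data = [b"shard-A!", b"shard-B!", b"shard-C!"]  # equal-length data shards
    stored = encode(data)                           # 3 data shards + 1 parity shard
    stored[1] = None                                # a Provider node goes offline
    print(repair(stored)[1])                        # b'shard-B!' is recovered
```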


MEMO optimizes the incentive mechanism for Providers while enhancing the usability of storage. In addition to the User and Provider roles, the Keeper role is introduced to prevent nodes from being maliciously attacked, and the system maintains economic balance through the mutual restraint of these roles. MEMO can support high-capacity, high-availability enterprise-grade commercial storage applications, provide safe and reliable cloud storage services for NFT, GameFi, DeFi, SocialFi, and more, and is compatible with Web2: a product of the deep fusion of blockchain and cloud storage.


Original link


