Blockchain Interoperability Part I: The Current State of Bridging
With the increasing number of layer 1s, layer 2s and appchains it is more important now than ever that blockchains have secure, low-cost and efficient ways to communicate with one another.
In this article we will cover why Interoperability is important, the challenges it faces and the current approaches. This is Part I of a three-part series on Interoperability, where we set the scene of where we are today. Parts II and III will explore emerging Interoperability approaches that we are excited about.
A big thank you to our friends at LiFi (Akshay), Connext (Rahul), Omni Network (Austin), Hyperlane (Trevor), Polymer (Bo), Catalyst (Jim) and Polyhedra (Abner) for their suggestions and contributions.
The Proliferation of Blockchains
The first public blockchain, Bitcoin, was introduced in 2009. In the 14 years since, there has been a Cambrian explosion of public blockchains, with the number now totalling 201 according to DeFiLlama. While Ethereum has mostly dominated on-chain activity, accounting for ~96% of Total Value Locked (TVL) in 2021, the last two years have seen that number fall to 59% as alternative layer 1 blockchains such as Binance Smart Chain (BSC) and Solana launched, and layer 2 rollups such as Optimism, Arbitrum, zkSync Era, Starknet and Polygon zkEVM emerged, among many others, as scaling solutions for Ethereum.
According to DeFiLlama, as of writing, there are over 115 EVM-based chains and 12 Ethereum rollups / L2s, and the trend of activity on multiple chains is set to continue for various reasons:
- Major L2s like Polygon, Optimism and Arbitrum positioned themselves as scaling solutions for Ethereum early, raised a large amount of capital and established themselves as easy places to deploy applications cheaply (in the last year we’ve seen +2,779% growth in developer teams building on Arbitrum, +1,499% on Optimism and +116% on Polygon - albeit off a small base of ~200-400 devs)
- Alternative L1s continue to be launched to optimize for specific needs. Some chains optimize for higher throughput, speed and settlement times (e.g. Solana, BSC) and others for specific use cases like gaming (ImmutableX), DeFi (Sei) and traditional finance (e.g. gated Avalanche subnets)
- Applications with sufficient scale and users are launching their own rollups or app chains to capture more value and manage network fees (dYdX); and
- Several frameworks, SDKs, toolkits and “Rollup-as-a-Service” providers have hit the market to make it easy for any project to spin up their own rollup with minimal technical lift (e.g. Caldera, Eclipse, Dymension, Sovereign, Stackr, AltLayer, Rollkit)
We live in a multichain, multilayer world.
The Increasing Importance of Interoperability
This proliferation of L1s, L2s and appchains has highlighted the importance of Interoperability - i.e. the ability of blockchains to communicate with one another to transfer assets, liquidity, messages and data between them.
Blockchain Interoperability can be broken down into three parts, as suggested by Connext:
- Transport: where message data is passed from one chain to another
- Verification: where the correctness of the data is proven (which typically involves proving the source chain’s consensus / state); and
- Execution: where the destination chain does something with the data
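The three parts above can be sketched as a minimal pipeline. This is an illustrative toy, not any real bridge's API: the message shape, function names and the placeholder proof check are all assumptions made for clarity.

```python
# Toy sketch of the transport / verification / execution breakdown.
# All names here are invented for illustration.
from dataclasses import dataclass

@dataclass
class CrossChainMessage:
    source_chain: str
    dest_chain: str
    payload: bytes
    proof: bytes  # evidence the message was actually emitted on the source chain

def transport(msg: CrossChainMessage) -> CrossChainMessage:
    """Relay the message from the source chain to the destination chain."""
    # In practice a relayer observes an event on the source chain and
    # submits it (plus a proof) to a contract on the destination chain.
    return msg

def verify(msg: CrossChainMessage) -> bool:
    """Check the message really was committed on the source chain."""
    # Real bridges check a multisig attestation, validator signatures,
    # or a light-client / zk proof of the source chain's consensus.
    return len(msg.proof) > 0  # placeholder check only

def execute(msg: CrossChainMessage) -> str:
    """Act on the verified message on the destination chain."""
    return f"executed {msg.payload!r} on {msg.dest_chain}"

msg = CrossChainMessage("ChainA", "ChainB", b"mint 100 tokens", b"\x01")
if verify(transport(msg)):
    print(execute(msg))
```

As the rest of this article argues, the verification step is where the interesting security tradeoffs live; transport and execution are comparatively mechanical.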
The benefit of being able to move assets and liquidity between chains is straightforward - it allows users to explore and transact in new blockchains and ecosystems. They will be able to leverage the benefits of new blockchains (e.g. trading or transacting on layer 2s that have lower fees) and discover new and lucrative opportunities (e.g. accessing DeFi protocols with higher yields on other chains).
The benefit of transporting messages lies in unlocking a whole set of cross-chain use cases without users having to move their original assets. Messages sent from Chain A (the source) trigger execution of code on Chain B (the destination). For example, a dapp on Chain A could pass a message about a user’s assets or transaction history to Chain B, which then allows them to engage in activities on Chain B without having to move any assets, e.g.
- Borrowing on Chain B using their assets on Chain A as collateral
- Participating in community benefits on a lower-cost rollup (like minting a new NFT collection or claiming tickets to events and merchandise) without having to move their NFT from Chain A
- Leveraging the decentralized ID and on-chain history they set up on one chain to engage in DeFi and access better rates on another
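To make the first use case concrete, here is a hypothetical shape for the message a dapp might pass: a request to borrow on Chain B against collateral that stays locked on Chain A. Every field name is invented for illustration; real messaging protocols define their own encodings.

```python
# Hypothetical payload for a cross-chain borrow: collateral remains on
# Chain A, and only a message about it travels to Chain B.
import json

borrow_request = {
    "source_chain": "ChainA",
    "dest_chain": "ChainB",
    "action": "borrow",
    "collateral_asset": "ETH",
    "collateral_locked": "10",    # amount proven locked on Chain A
    "borrow_asset": "USDC",
    "borrow_amount": "12000",
}

# The lending dapp on Chain B would verify an attached proof of the lock
# on Chain A before releasing the loan; only the payload is shown here.
payload = json.dumps(borrow_request).encode()
print(json.loads(payload)["action"])
```

The key point is that no asset crosses chains: only data about the user's position does, and the destination chain acts on it once verified.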
Challenges in Interoperability
Despite the many benefits that Interoperability unlocks, it faces many technical challenges:
- Firstly, blockchains generally do not communicate well with one another: they use different consensus mechanisms, cryptography schemes and architectures. If your tokens are on Chain A, using them to buy tokens on Chain B is not a straightforward process.
- Secondly, at the Verification layer, an interoperability protocol’s reliability is only as good as the mechanism chosen to verify that the messages passed were, indeed, legitimate and valid.
- Thirdly, having multiple places for developers to build results in applications losing composability, a key building block in web3. This means developers are unable to easily combine components on another chain to design new applications and unlock greater possibilities for users.
- Lastly, the large number of chains means liquidity gets fragmented, making participants’ capital less efficient. For example, if you have provided liquidity into a pool on Chain A to access yields, it is difficult to then take the LP token from that transaction and use it as collateral in another protocol to generate more yield. Liquidity is the lifeblood of DeFi and protocol activity - the more chains there are, the more difficult it is for them all to flourish.
There are some Interoperability solutions that exist today to address some of these problems, so what is the current state of play?
The Current State of Interoperability
Today, cross-chain bridges are the main facilitators of cross-chain transactions. There are currently more than 110 bridges with varying levels of functionality and tradeoffs across security, speed and how many blockchains they can support.
As outlined by LI.FI in their comprehensive Bridging 101 piece, there are several different bridge types:
- Wrap & mint bridges - which lock tokens on Chain A in a multisig and mint corresponding tokens on Chain B. In theory the wrapped tokens should hold the same value as the original tokens, but their value is only as good as the bridge is safe - i.e. if the bridge gets hacked, there is nothing for the wrapped tokens to be swapped back into when a user tries to bridge from Chain B to A (e.g. Portal, Multichain)
- Liquidity networks - where parties provide token liquidity on either side of a chain to facilitate cross-chain swaps (e.g. Hop, Connext Amarok, Across)
- Arbitrary Messaging Bridges - which enable the transfer of any data (tokens, contract calls, the state of a chain), e.g. LayerZero, Axelar, Wormhole
- Specific use case bridges (e.g. stablecoin and NFT bridges), which burn stablecoins / NFTs on Chain A before releasing them on Chain B
These bridges are secured using different trust mechanisms underpinned by different trusted parties and incentives - and these choices matter (as pointed out by Jim from Catalyst Labs and the LI.FI team):
- Team Human relies on a set of entities to attest to the validity of transactions
- Team Economics relies on a set of validators with staked collateral at risk of slashing penalties to disincentivize bad behavior. This only works if the economic benefit of misbehaving is lower than the slashing penalty
- Team Game Theory divides the various tasks in the cross-chain process (e.g. checking transaction validity; relaying) among different parties
- Team Math performs on-chain light client verification, leveraging zero knowledge tech and succinct proofs to verify state on one chain before releasing assets on another. This minimizes human interaction but is technically complex to set up
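The Team Economics condition above reduces to a one-line inequality: an attack is only rational while its expected profit exceeds the stake that would be slashed. The sketch below uses made-up numbers to show why a bridge whose secured value outgrows its validators' bond becomes an attractive target.

```python
# The "Team Economics" security condition as a simple inequality.
# All figures are invented for illustration.
def attack_is_rational(profit_from_attack: float, slashable_stake: float) -> bool:
    """An attacker profits only if the gain outweighs the slashed stake."""
    return profit_from_attack > slashable_stake

# Healthy case: the bond exceeds anything an attacker could extract.
print(attack_is_rational(profit_from_attack=5_000_000,
                         slashable_stake=20_000_000))   # False

# Danger case: bridged value has grown past the validators' stake.
print(attack_is_rational(profit_from_attack=60_000_000,
                         slashable_stake=20_000_000))   # True
```

This is why economically secured bridges must keep staked collateral growing in line with the value they secure, or supplement it with other mechanisms.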
Ultimately, trust mechanisms range from humans, to humans with economic incentives, to math-based verification. These approaches aren’t mutually exclusive - in some cases we’ve seen them combined to enhance security, e.g. LayerZero’s game theory-based bridge incorporating Polyhedra (who rely on zk proofs for verification) as an oracle to its network.
How have bridges performed to date? So far, bridges have facilitated the transfer of a large amount of capital - in January 2022, TVL in bridges peaked at $60b. With this much capital at stake, bridges have become prime targets for exploits and hacks. In 2022 alone, $2.5b was lost through a combination of multisig key compromises and smart contract vulnerabilities. An annual capital loss ratio of roughly 4% of peak TVL is not tenable for a financial system that hopes to thrive and attract more users.
The attacks continued in 2023, with Multichain addresses being drained of $126m (representing 50% of the Fantom bridge and 80% of the Moonriver bridge holdings), accompanied by the revelation that all this time their CEO had held control of all the keys to their ‘multisig’. In the weeks after this hack, TVL on Fantom (which had a lot of assets bridged via Multichain) dropped 67%.
At the end of the day, some of the largest bridge exploits and follow-on consequences have come down to multisig vulnerabilities (Ronin $624m, Multichain $126.3m, Harmony $100m) highlighting the importance of what bridge trust mechanisms are employed.
Having a small (Harmony), grouped (Ronin) or singular (Multichain) validator set is a key reason for some of these exploits - but attacks can come from a frightening number of vectors. In April 2022, the FBI, the Cybersecurity and Infrastructure Security Agency (CISA) and the US Treasury Department issued a joint Cybersecurity Advisory highlighting some of the tactics used by the North Korean state-sponsored Lazarus Group. They ranged from social engineering to e-mail, Telegram and CEX account phishing, among others (screenshot examples in this thread by Tayvano).
Where do we go from here?
It’s clear that verification mechanisms that ultimately rely on humans are easy targets - yet the need for secure, efficient Interoperability remains. So where do we go next?
We’re now seeing the emergence of trust-minimized approaches to verification - and that’s what we’re excited about:
- In Part II we’ll cover Consensus Proofs, which are used to attest to a source chain’s latest consensus (i.e. its state / ‘truth’ in the last few blocks) to facilitate bridging; and
- In Part III we’ll cover Storage Proofs, which attest to historical transactions and data in older blocks to facilitate a wide range of cross-chain use cases.
Both approaches center on trust-minimized verification to circumvent human reliance and fallibility, and are flying the flag for the future of Interoperability. We’ll do a deep dive on them and the teams building in the space - stay tuned!