With demand to participate in ETH staking heating up post-Shapella, what's next for the sector? Watch the recap of a livestream conversation with leading staking experts covering post-Shapella traction, liquid staking solutions, decentralized validator technology (DVT), restaking, and more.
Hosted by Liquid Collective's Mr. X
Good day everyone. I am your host Mr. X, and I must say it's purrfectly delightful to welcome you all to today's panel discussion “What's Next for ETH staking?” We're here to scratch beneath the surface and to provide you, our esteemed audience, with the most valuable information about the future of Ethereum staking.
We are very fortunate to have a panel of distinguished experts here today. Let's meet our panel with a quick round of intros in alphabetical order, by first name. Panelists, please introduce yourselves and share the name of the company you represent or the Ethereum project you're actively working on. Starting with you, Collin.
Hey everyone! Thank you for the intro, Mr. X, and thank you for having us today. I'm Collin Myers and I'm co-founder of Obol Labs, currently focused on building the Obol Network, which is built around DVT.
Thank you Mr. X. I’m Darren Langley. I am the general manager of Rocket Pool. We are a decentralized liquid staking protocol on Ethereum.
Thank you Mara. Sreeram?
Thank you Mr. X. Hi everybody. I'm Sreeram, founder of a project called EigenLayer which enables Ethereum stakers to opt in to provide additional services. We call it restaking.
Thank you. And last, but not least, Viktor.
Hi everyone, my name is Viktor Bunin. I am a Protocol Specialist at Coinbase. My team is responsible for knowing all there is to know about all the very different protocols that are out there, as well as middlewares, MEV solutions, etc., and then operationalizing that research to drive product strategy, roadmap, features, pricing, content, and so on.
Thank you all. We're off to an excellent start. Let's dive into the world of Ethereum staking. Viktor, the first question is for you. What are the trends you're seeing in staking post-Shapella and how are the dynamics of staked ETH changing?
I think the nice thing is that as much as the dynamics are changing, they're playing out in the way that people expected. So right now, with the Shapella upgrade successfully completed, ETH withdrawals are possible. You're able to unstake your staked ETH and move things around or do whatever you like with it. That's a really powerful conclusion to the whole transition to proof of stake. This is kind of like the final chapter that people have been waiting for, of saying, “Okay, this work is done.”
You know, one of the things that we expected was that a lot more stake was going to come online. This is the final de-risking moment—proof of stake is completely successful on Ethereum—and so that has played out and we've seen a lot more net inflows, so the staking rate is going up.
But the other thing that we're seeing is that during bear markets, there's less to do with ETH. DeFi isn't as strong, and there are fewer farms or liquidity-providing opportunities. So staking in particular has drawn a lot of attention, because it's one of the best ways to continue to support the network while being rewarded for actually performing the work of doing so.
And then another thing that I'll just say really quickly is there are a number of upgrades that are coming to Ethereum, both in-protocol and out-of-protocol. So in-protocol meaning like proto-danksharding, for example. Out-of-protocol meaning middleware solutions, such as Obol and EigenLayer, obviously that we have on the call. Flashbots for example falls into that as well.
And so one thing that that means is that they all do different things but fundamentally, they all make Ethereum more useful and make validators more powerful in terms of the amount of services they can provide, in terms of how they can benefit the network, and what they can do.
And what that means is that by performing those additional services, validators typically earn additional rewards. They're being paid for the work that they are providing. What does all that equal?
Well, what it equals is that if staking was, you know, just staking previously and you're just securing Ethereum and that's all you're doing, and you're being paid a certain reward rate, now if you do a lot more, you can be paid more. So I do expect that as those opportunities continue to proliferate, and as Ethereum becomes more useful, being a validator will become more important and more valuable. We expect more people to stake as a result.
Fantastic. Any other trends any of the other panelists are noticing? Particularly picking up on that theme of making Ethereum more useful?
Personally, from where I sit, there are just a lot of new projects, which has been cool to see. From genesis to withdrawals, the design landscape for a protocol was fairly limited and honestly required a lot of rework to build, so a lot of people waited until post-withdrawals. Where we sit at Obol, we try to work with a lot of protocols to help them achieve their goals, which are mostly security-based, decentralization-based, or resiliency-based. Now, post-withdrawals, there's a whole new landscape.
I feel like there's new pools coming every day, there's new protocols, or there's new products. Honestly this unlocking of your rewards or this partial withdrawals time period, is almost what I would consider to be the next large microeconomic event of Ethereum. People can stream rewards in different directions now. People can start creating small economies around their rewards and that's something that we haven't seen before. So I'm quite excited to see where all the rewards go, because I feel like they're probably going to go to beneficial places of sorts based on the types of personalities that have held on this long.
Yeah we're certainly seeing a lot more interesting DeFi protocols and existing protocols using liquid staking tokens in interesting ways because they're kind of reward bearing. They can be used in some pretty cool ways. So they're starting to blossom and they're starting to come through now which is good to see.
To add to that, I think one of the interesting things is that we're really seeing a totally new momentum for adoption in the staking market. We've had about a million ETH flowing into the network as deposits since the withdrawal upgrade alone. That's almost $2B worth of ETH coming into the market to participate.
I think it's really exciting to see, not just the builders that are really trying to innovate on new solutions that can help this growing and proliferating market, but also the new types of users that we're starting to see emerging. The withdrawal enablement was actually a really important unlock from a risk consideration perspective for a lot of market participants, more risk-conscious retail participants, but also in particular the institutional market. So we're starting to see a lot of enablement solutions that are starting to tap into very specific customer segments and customer segment-based solutions.
So, you know, Liquid Collective, for example, is really focused on enabling the enterprise and institutional market that is now looking eagerly to start participating in staking and getting access to the risk-free rate equivalent that we have in web3.
I definitely agree that it's a classic adoption cycle. We're on that very very steep part of the curve, which is quite exciting.
What do we think these changes indicate for the overall decentralization of the Ethereum network?
It’s a good question. I'll give my thoughts about this. “Is more better?” I think [this] is the question a lot of people in the core Ethereum community are asking themselves. How many options of the same thing do we need for something to be sufficiently decentralized? Is more better or is more kind of a misallocation of capital and time towards the people who are building it the right way? And that's kind of how I look at where we're at today.
I think right now, seeing more innovation and more capital flowing in is good, but ultimately I think now it's kind of up to the core Ethereum community to decide how many of one thing is sufficient. Is more better?
I still think today more is better, but I'm afraid we're going to get to a point where there's just going to be 15 of everything, just because. So I'm quite curious to see how we all coordinate with each other around the inherent centralization vectors of opening up more growth, right? Because you'll go through this period of more growth. There'll be a bunch of new protocols [coming] in. There'll be a bunch of ideas and then eventually they'll kind of fall back and fall out, right as we go through the next cycle. Is more better for decentralization or is more focus in specific areas better? These are some of the trends I'm focusing on.
When you said, “Is more better?” I thought you were going to take it another way, so I'll take that other way, which is that right now if you just have ETH validators, right, one of the nice things about how Ethereum is designed is that it's designed for regular people to be able to run validators. It is not the easiest thing in the world, but it is a doable thing and there's a lot of individuals that are running them.
And so as you start adding additional things to the validator running experience, right, so now you're not just running a validator, now you're also running MEV boost from Flashbots, you're also running Obol, you're also running EigenLayer, you're also running Rocket Pool, you're running a bunch of different things in some configuration. All of a sudden, if you're, let's say just running a validator, versus running it with all these things, now you might have a totally different rate of reward, because you're doing more versus less.
So it may become too hard for individuals to run all these things, and they'll be like, “Hey, I'm getting less rewards. Why would I do that? Why don't I stake with someone else that is doing all that work?” So one of the questions is: will that actually diminish the proportion of the network that is running solo validators? Or are people going to either put up with it or accept a lower rate of reward? So that's one question that I have.
So I think that right now, you know there's certainly groups [that] are taking the individuals running validators really really seriously, like the EthStaker Community, for example. I think that work is going to continue to become more and more important as these different modules mature, just to make sure that continued experience is as easy as possible and as close as possible to what it is to run just a single individual validator.
For sure. I think one of the interesting dynamics that always plays out when a market becomes more attractive and attracts more capital is that you naturally create new competitive dynamics. We started seeing that over the last two years in the node operator market, and it has actually contributed in many ways to improved decentralization on key metrics, like client distribution, once it became very clear that a multi-client configuration is something that enhances security and protects against correlated risk.
I feel like that type of competition actually can lead us to configurations that really focus on enhancing native security within the network and for stakers that are participating inside of it. A lot of those efforts tend to trend in the direction of how the protocol has actually been designed, which is focusing on ensuring that the most risk-mitigated configurations tend to favor decentralized and distributed core components and architectures. So I definitely feel like that is one of the things that we're going to start seeing.
I think the other thing that is really interesting, if we talk about sort of the macro picture of ‘where is stake flowing?’ We are starting to see a lot of movement in what I would consider actually a more decentralized landscape of where ETH is being deposited and how ETH is being staked. In many ways we're starting to see that right now.
A lot of participants are starting to opt into multi-operator staking pools, liquid staking solutions over opting for a single operator architectures or staking infrastructure. We've started seeing those trends accelerate actually over the last couple of weeks, particularly since the withdrawals upgrade.
That's a perfect transition into a deep dive question I have. Let's take a step back, for those in our audience who are unfamiliar with liquid staking. Darren, what is liquid staking and what role does it play in the Ethereum staking landscape now that withdrawals are enabled?
So, liquid staking is essentially the easiest way to stake your ETH today. You swap ETH for a liquid staking token (LST): you deposit ETH into a staking protocol, they give you back a liquid staking token, and that liquid staking token represents your ownership of that stake. Over time it accrues value, and there are some different methods of how that happens, but over time it accrues value based on the value being generated from Ethereum itself. Then you can transfer it around, you can use it, it's very flexible, you can use it in DeFi and that sort of thing. When you want to unstake, you essentially burn it, or deposit it back into the protocol, and they give you back ETH in return. So that's the definition of liquid staking.
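To make Darren's mechanics concrete, here is a minimal toy sketch of a value-accruing liquid staking token. This is not any real protocol's implementation; the class and method names are hypothetical, and real LSTs differ in the details (some rebase balances rather than letting an exchange rate drift).

```python
class LiquidStakingPool:
    """Toy model of a value-accruing LST: deposit ETH, mint LST,
    rewards raise the exchange rate, burn LST to redeem ETH."""

    def __init__(self):
        self.total_eth = 0.0   # ETH held by the pool (principal + rewards)
        self.total_lst = 0.0   # outstanding LST supply

    def exchange_rate(self) -> float:
        # ETH redeemable per LST; starts at 1:1 for an empty pool
        return self.total_eth / self.total_lst if self.total_lst else 1.0

    def deposit(self, eth: float) -> float:
        """Swap ETH for LST at the current exchange rate."""
        minted = eth / self.exchange_rate()
        self.total_eth += eth
        self.total_lst += minted
        return minted

    def accrue_rewards(self, eth: float) -> None:
        """Staking rewards raise the ETH backing without minting LST,
        so each LST becomes redeemable for more ETH."""
        self.total_eth += eth

    def redeem(self, lst: float) -> float:
        """Burn LST and receive the underlying ETH (plus accrued value)."""
        eth_out = lst * self.exchange_rate()
        self.total_lst -= lst
        self.total_eth -= eth_out
        return eth_out
```

For example, depositing 32 ETH mints 32 LST at a 1.0 rate; after 1.6 ETH of rewards accrue, burning those 32 LST returns 33.6 ETH.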
Mara, any further insights on the need for liquid staking now that the withdrawals are enabled?
For sure. I think one of the questions that has been really prevalent in the market is what the benefit of liquid staking is in a world where you know participants can with ease stake and then also withdraw their ETH and their rewards?
One of the things that's actually been really interesting to observe—and we've actually written a ton about this—is the fact that the protocol actually has an interesting configuration, which is called the churn limit. So the churn limit controls how many validators can be activated within a given epoch and how many validators can exit within a given epoch.
So what's really interesting is that although enabling withdrawals and participation is a massive milestone, this churn limit is actually creating major delays in the activation queue for validators to start earning rewards. We're at roughly 30 days now, so you have to wait a whole month before your validator can actually start participating, contributing security, and earning rewards for that.
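As a rough illustration of how the churn limit translates into the wait times Mara mentions, here is a back-of-the-envelope sketch. The constants are the consensus-spec values at the time; they can change with future upgrades, and the function names are my own.

```python
# Back-of-the-envelope activation-queue wait time from the churn limit.

MIN_PER_EPOCH_CHURN_LIMIT = 4
CHURN_LIMIT_QUOTIENT = 2**16   # 65536
EPOCHS_PER_DAY = 225           # one epoch is about 6.4 minutes

def churn_limit(active_validators: int) -> int:
    """Validators that may activate (or exit) per epoch."""
    return max(MIN_PER_EPOCH_CHURN_LIMIT,
               active_validators // CHURN_LIMIT_QUOTIENT)

def queue_wait_days(queue_length: int, active_validators: int) -> float:
    """Approximate days until the back of the activation queue clears."""
    per_day = churn_limit(active_validators) * EPOCHS_PER_DAY
    return queue_length / per_day

# e.g. with ~560k active validators the churn limit is 8 per epoch,
# so a 54,000-validator queue clears in roughly 30 days.
```

The same arithmetic applies to the exit queue, which is why both the activation and withdrawal waits stretch as deposits surge.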
Similarly, the withdrawal queue is also stretching multiple days. I think it's around 5 or 6 days at this point in time. Participants who are looking for opportunities for creating better capital efficiency are obviously looking for liquid staking solutions. We're actually starting to see a lot of the ETH flows, as we discussed earlier, moving into liquid staking solutions for different reasons. Right? Not just the liquidity aspect, but also the benefits that can be consumed as a result of a distributed validator set, having correlated failure protections, embedding native capabilities such as slashing coverage.
So I think liquid staking is really the evolution of what we've seen in the staking market, building more sophisticated, more secure products that also have great benefits for users and participants that want to leverage these types of products for other solutions and capabilities as well.
Yeah, liquid staking is not one-size-fits-all, is my observation. We've looked at a lot of the different pools, and their architectures, and their economics, and their users. Honestly, for the first two years I think there was a lot of a similar personality type staking at the macro level, and now there are new personalities coming in. They have different needs, they have different wants, and products have to change to cater to those things. Because of that, it's decentralized more, but it's also become quite competitive.
What's quite cool is that we're kind of already seeing decentralized options competing with more centralized options in terms of both scale and performance, which is really exciting. It's definitely a glimmer of hope for Ethereum in terms of making sure that it stays decentralized.
Decentralization is extremely important to Ethereum's well-being. Large institutions that are starting to come in now really do need to consider that; it could almost be classed as a kind of ESG policy issue. In the end it's good old-fashioned long-term commercial sustainability. So that's definitely going to play out a lot once we get this new wave of adoption, those early majority people.
Viktor, are there other items on Ethereum's roadmap this year that may affect staking participation, like proto-danksharding? What exactly is proto-danksharding?
Yeah, thank you. It's more often referred to as EIP-4844. But essentially, right now Ethereum is all in on the L2 roadmap. L2s are the future of Ethereum. One of the big problems for L2s is essentially where to put their data, and this has repercussions for their security model. Right now, if you were to just put it onto Ethereum, there's a pretty finite amount you can put onto it, and it's not really built to handle that kind of data in those quantities, especially for where we think the L2 space is moving.
So one of the changes Ethereum is making is it's changing a little bit about how the blockchain works. It's changing what validators are going to be responsible for around the storage of data, how that data is stored, and how long it's stored for, in order to better enable L2s to put their data onchain in a way that gives them security and the features that they would like. So that's one of the important changes, because not only will validators obviously be paid for it, but it is changing the dynamic of how validators actually accrue rewards in terms of what they make.
Right now, any data that L2s put onchain is essentially competing with regular transactions; they're part of the same process as, say, Uniswap transactions. If you really think about it, that doesn't make a ton of sense, because the data they put onchain doesn't interact with other smart contracts, and it doesn't need to be ordered in the same way that Uniswap transactions need to be ordered, where ordering really matters. So there's a whole bunch of different things there.
What we're seeing right now is a transition in the role of the Ethereum blockchain: how do you better optimize it for where we think the use cases are going to be moving in the future? It's a pretty big one. There's not a clear answer to ‘will the amount of fees go up or go down?’ Nobody really knows, and we can have some assumptions. But I think fundamentally it goes back to what we were saying earlier, which is that it makes the Ethereum blockchain more useful; it makes it better able to support the use cases that it would like to support, which is to be a settlement layer and a place where L2s can thrive, and all these other things. So because of that, we think being a validator is going to become more important.
One of the other things that is proposed for Cancun is EIP-4788, which puts the beacon state root in the EVM. That's massive for decentralized staking protocols. At the moment, the consensus chain (the Beacon Chain) and the execution chain basically run on two tracks. They don't really talk to each other at all, so you have to oracle information between the two. The beacon state root in 4788 actually makes that connection.
So you can do things, cool things, with proofs, definitely some fantastic stuff with zero knowledge proofs, built on that. So that would be amazing to get that into Cancun because it opens up a big design surface for new decentralized staking protocols.
I know Danny is also spec-ing out another one around 0x01-triggered exits, which I don't think has an EIP yet, but if that was to get in somewhere within the roadmap, that's also a massive thing for decentralized protocols, because again it provides this design space in which to help us build better and better protocols.
Yeah and I guess if we're looking a little bit further out as well, there's another thing that people are talking about, especially Justin Drake and certain members of the Ethereum Foundation community, all around MEV burn and MEV smoothing.
Essentially there's a whole bunch of MEV that is being generated right now. It's being captured by searchers and by validators in different proportions. And now the design space is opening up with things like MEV share, which will potentially give some MEV back to users, or to the transacting wallets, or whatever it is.
But also, they're looking at what properties MEV exerts on Ethereum consensus. If you have a particularly large block with a particularly large reward, for example, what can you do about the MEV to limit its impact on Ethereum's security and stability? There are some proposals going around, nothing's really been finalized, but that's a really important one to look at as well.
Just to comment on what Darren mentioned, and to give listeners concrete examples: these are two very important initiatives for Obol as well. We've been lobbying for these things too. For us the beacon state root, for example, is very important as we work on our V2.
There are many things that take Obol, or DVT, from V1 to V2, but one of them is being able to objectively prove that someone in a shared cluster is not performing their duties. Accomplishing this requires a series of fraud proofs and verifications, and without the beacon state root it's completely financially infeasible to even build our V2. So that's a big one for us.
Then on 0x01, or however we end up referring to these forced withdrawals: in a shared validator environment, it's very hard to force a withdrawal in a collaborative manner based on how withdrawals work today. So that's another big one. A distributed validator cluster run by a group of people can run into a ‘how do we get out of this situation?’ moment, so 0x01, or whatever you want to call it now, call it just ‘forced withdrawals,’ or some new form of exiting, is also important for future DVs.
Yeah, just adding on something on both of these points: the smart contract triggered withdrawals is particularly crucial for what we're building on EigenLayer. The idea being that in the way EigenLayer works is when staking you can set your withdrawal credentials. Normally you would set it to yourself. Instead you add one additional step in your withdrawal flow. You set it to a smart contract, to the EigenLayer contracts, and in the EigenLayer contract you set the withdrawal address to yourself. So you're kind of taking one step up.
One of the coordination problems between EigenLayer and Ethereum is: when you opt into EigenLayer and something happens, say you misbehave on some other protocol that you staked to, how do we let Ethereum know about that? One way to do it is to actually trigger the withdrawal, and that is a situation where we need smart contracts to talk to the Beacon Chain.
So you can think of both of these upgrades: the Beacon state root as information passing from the Beacon Chain to Ethereum, and the smart contract triggered withdrawals as passing information from the execution layer to the Beacon Chain. So I think both of these are quite important for unlocking more primitives, more trustless primitives, in the staking ecosystem.
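As a concrete picture of the indirection Sreeram describes, here is a toy sketch of pointing a validator's withdrawal credentials at a contract that records the staker's own address and sits in the withdrawal path. All class and method names are hypothetical, not EigenLayer's actual contracts.

```python
class RestakingContract:
    """Stands in for a smart contract set as a validator's withdrawal
    address. It records the staker's own address and can apply penalties
    before passing withdrawn funds through."""

    def __init__(self, staker_address: str):
        self.staker_address = staker_address
        self.penalty = 0.0

    def slash(self, amount: float) -> None:
        # Misbehavior on an opted-in service is penalized here, which
        # only works because the contract sits in the withdrawal path.
        self.penalty += amount

    def process_withdrawal(self, amount: float):
        """Forward the withdrawn balance, minus penalties, to the staker."""
        payout = max(0.0, amount - self.penalty)
        return self.staker_address, payout


# Validator -> restaking contract -> staker: one extra hop in the flow.
contract = RestakingContract(staker_address="0xStaker")
withdrawal_credentials = contract  # instead of pointing directly at yourself
```

With no penalties, the full balance passes straight through to the staker; a slashing event on an opted-in service reduces the payout.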
In the meanwhile, all of the different protocols have their own makeshift solutions. The one that we are using for bringing the beacon state to Ethereum relies on zero knowledge proofs over what is called the sync committee. So that's what we use to bring over the Beacon Chain state root. But I think having that enshrined in protocol is actually a far better solution.
Darren, I would like to dive into Rocket Pool's Atlas upgrade, where you're introducing 8-ETH minipools for node operators, or “LEB8s.” What are the benefits of lowering the barrier to entry for node operators, and do minipools make it easier for small validators to join the big cats?
So from Rocket Pool’s perspective obviously lowering the barrier to entry is great for us. We get to scale a lot faster, our node operators earn more, and liquid stakers actually end up paying less, so that's great.
More broadly though, lowering the barrier to entry increases participation. Increased participation raises the overall stake providing more economic security for Ethereum. So that's pretty important as well. It also brings in more independent, small scale node operators and home stakers.
There's always going to be this kind of power law of stake, where you have a lot of stake being staked by a few people. But what's really important for Ethereum is this kind of long tail. The long tail is extremely important for decentralization. With decentralization you then get all these follow-on effects: the higher resiliency, the credible neutrality, you know all the things that we want from a truly global settlement layer. So that's essentially the benefits of lowering the barrier to entry. Ethereum needs more node operators essentially.
I totally agree, and love discussion of long tails.
Mara, how does a liquid staking protocol like Liquid Collective that has a more limited node operator set increase Ethereum's decentralization?
Yeah, so we've been very very thoughtful about the key requirements from a performance, security, and risk perspective for node operators within Liquid Collective’s active set.
It's really important to note that when servicing the enterprise-grade market, working with publicly registered companies servicing institutional businesses and more traditional players in the market, it's really important to strike the balance in finding ways to create decentralization by means of core parameters. So for example, opting for configurations in a node operator set that allow multi-region configurations, multi-client configurations, and creating a certain set of parameters or standards that allow us to do that in an effective, transparent, and onchain way.
So that's what we've actually spent the last couple of months on, really innovating and thinking through the different parameters and standards that we want to see within an expanding set of node operators that can really set the bar for performance but in many ways also for the decentralization parameters that we want to see within our node operators and the validator set that we actually have within the protocol.
For us, it's actually really exciting because we're trying to figure out if there is a security optimization point within the context of the protocol itself. If you think about a node operator set that is distributed, that adheres to certain requirements for diversity parameters within core architectures, and then you combine that with really interesting technologies like threshold signing capabilities in the context of DVT, then you can hit a really interesting optimization point from a security standpoint and in many ways also contribute to the core decentralization of the network itself, with a solution that is really catered to meeting the needs of a very specific customer segment. We're really excited to work with the community to release some of these standards over the next couple of weeks and months.
Fantastic. Collin, let's switch gears and talk about DVT. What is DVT for those who are unfamiliar and how far along are we in DVT’s development?
Yeah, great question. So most simply, I guess before I explain DVT let's talk about a validator today. So how do we look at a validator?
A validator consists of three things. The first thing is an individual public-private key pair. Boom: cryptographic primitive number one. One set of keys. The next one is one machine. People just add multiple validators to one machine over time. Then the third one is an agent, call it a human or an entity.
So today, it's very monolithic. The validator stack of today is super monolithic: it's an individual public-private key pair, it's one machine, and it's either one person or one entity running that public-private key pair hosted on a machine.
So what DVT enables is running a validator across multiple machines and multiple people or entities at the same time. From a web2 perspective, this is simply a ‘highly available validator’ with ‘active-active redundancy.’
A basic example of that is the five of us on this call today could all create a distributed validator together. All of us have a key share. All of us have our own machine and then let's say Viktor's house burns down. It's the second time I've used this analogy. Let's hope it doesn't happen. Let's say Viktor's house burns down, our validator is going to keep going. Let's say while Viktor's house is burning down, Darren's house actually goes down at the same time as well due to bad electricity in Australia, which sometimes during the heat can actually happen.
So now all we have left is Collin, Mara, and one other person, and the three of us can power this together. So it has fault tolerance, it has redundancy, and it's able to withstand the fault of an individual or an entity. So it's very important for this multi-operator staking movement; it's kind of the end game for that. Today there are a lot of pools building multi-operator staking, however it's a bunch of individual validators running their own machines with their own keys.
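The fault-tolerance property Collin describes can be sketched in a few lines. Real DVT uses threshold BLS signatures and consensus among the operators; this toy only models the liveness condition, and the function name and threshold choice are illustrative.

```python
def cluster_is_live(online: set, threshold: int) -> bool:
    """A t-of-n distributed validator can keep signing as long as at
    least `threshold` of its key-share holders are reachable."""
    return len(online) >= threshold

# A hypothetical 3-of-5 cluster made up of the panelists.
operators = {"Collin", "Darren", "Mara", "Sreeram", "Viktor"}
threshold = 3

# Viktor's and Darren's machines both go down...
online = operators - {"Viktor", "Darren"}
# ...and the validator keeps attesting with the remaining three.
```

Lose one more operator, though, and the cluster drops below threshold and stops signing until someone comes back online.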
The next paradigm we'll move on to is collaborative staking, which is where each operator in a liquid staking pool just has a key share. They're not running a full validator; they're running a fractional validator, which introduces a variety of Byzantine fault considerations that I can get into later. But I will spend some time on, “How far along are we in the development of it?”
It's been a while. We've been working on DVT for 3.5 years and now we're finally at the main point of it. In December of last year, we launched the world's first distributed validator, run fully at home across four people in three countries, and since then we've scaled mainnet deployments to around 25, all of which are being run not by a single individual but by collections of people. Some are a 4-individual cluster, some are 5, some are 7, we've even seen 9 or 10. We're now beginning to push these clusters to their largest sizes to test the latency.
But right now we're going through our alpha release phase, so we've decided to cap the number of validators that we will help, or recommend, putting on mainnet. People can go use it today. There's a business license on the codebase, but there's one free validator for everyone, so you can just go run one today if you want. We have completed one audit; the audit results from Sigma Prime are actually coming out in the coming weeks, which is super exciting. Right now what we're really building towards is Obol v1.
Obol V1 is to be released in the fourth quarter of this year. We’re in alpha now. We'll build into beta in Q3, and then in Q4 we'll launch v1, which is very exciting. That roadmap is very well laid out. The features are very well laid out. It's just a matter of getting audits and continuing to do testing and rolling it out in a controlled manner with people before we kind of rip the Band-Aid off.
In addition to that though, we are making a lot of progress on v2. We’re fortunate enough to come to this kind of ETH staking party a little bit later than some of the other protocol teams and we've been able to watch them grow into their v2s and v3s. We've really started working on our v2 ahead of time.
Obol v2 will actually be a multi-implementation version of Obol, which is very exciting. So in the coming weeks, we'll actually announce a very large partnership with a development team to help decentralize DVT and make a unified DVT protocol with multiple implementations, very similar to how Ethereum has been built. So today, we're making great progress. We're ahead of schedule. We've had a ton of great supporters helping us get to this point, which is awesome. It's been a very organic and decentralized effort. But really, what we're most focused on as a core team today is that v2: getting to multi-implementation, because that is the best version of how we think DVT should be built.
Fantastic. I loved everything you said there except Viktor's house burning down. Let's hope that doesn't happen.
He also said it multiple times—it was like, Jesus! The funny thing is Collin has a standing invitation to come stay in our guest bedroom whenever he's in New York, so it's especially weird that he's like, “I hope it burns down.” Like, please don't make me do this.
No comment. Another exciting technology is restaking. Sreeram, would love to hear a simple overview of what restaking is, how does it work, and what are the benefits for users.
Yeah, thank you so much Mr. X. The idea of restaking is that as more stake comes online, that stake is looking for additional opportunities to participate in validation. The core of staking right now is that you put down a bunch of ETH and say, ‘I'm going to run the Ethereum protocol correctly,’ and by doing that you're able to earn a certain reward, because people are willing to pay for decentralized trust. What EigenLayer is doing is taking this and making Ethereum the decentralized trust layer of crypto.
So essentially by staking ETH using EigenLayer, using restaking, what you can do is not only validate Ethereum blocks, but also validate blocks for any new chain or new service. An example of a service would be something like you want to run a bridge, you want to run an oracle, you want to run a multi-party computation, you want to run a network of nodes for serving AI queries. Any of these kinds of things you can actually use the Ethereum staking layer as the underlying economic base.
The way restaking works is when you stake ETH, you opt into EigenLayer and you say that I'm also validating these other protocols, A, B, C, D. Because I'm validating these other protocols I'm entitled to earn revenue or fees from the validation of those protocols, so you're enhancing your revenue stream.
You're doing that by promising that you're not going to violate the governance of these other protocols. These other protocols may have what are called slashing conditions. Slashing conditions are basically boundary conditions that make sure that people who opt-in and run validators don't misbehave.
For example, if, say, you're storing data but you're unable to produce a chunk of the data on time on Ethereum, you may get slashed. Things like that. So the slashing conditions enforce a certain covenant and behavior from the validators. So that's the core idea of the EigenLayer protocol.
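As a rough illustration of the opt-in and slashing mechanics Sreeram describes, here is a minimal Python sketch. The service names, evidence fields, and penalty are hypothetical assumptions for illustration only, not EigenLayer's actual contracts or parameters:

```python
from dataclasses import dataclass, field

@dataclass
class Operator:
    stake: float                      # restaked ETH
    services: set = field(default_factory=set)

    def opt_in(self, service):
        # Opting in means accepting that service's slashing conditions
        self.services.add(service)

# Hypothetical slashing conditions: service name -> predicate on evidence
SLASHING_CONDITIONS = {
    "data_availability": lambda ev: not ev["chunk_served_on_time"],
    "bridge": lambda ev: ev["signed_invalid_transfer"],
}

def process_evidence(op, service, evidence, penalty=0.5):
    """Slash a fraction of stake if the operator opted in and the condition fires."""
    if service in op.services and SLASHING_CONDITIONS[service](evidence):
        op.stake *= (1 - penalty)
    return op.stake

op = Operator(stake=32.0)
op.opt_in("data_availability")
# Operator fails to serve a data chunk on time -> gets slashed
remaining = process_evidence(op, "data_availability", {"chunk_served_on_time": False})
assert remaining == 16.0
```

The covenant Sreeram mentions shows up as the predicate: the slashing condition only binds operators who opted in, which is what makes restaking an opt-in marketplace rather than a protocol-wide rule.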
It's a multi-sided marketplace. On the one side, you have stakers who can opt in and provide validation for a bunch of services. On the other side you have builders. People who build complex distributed systems. Every time you have a new distributed system, right now what you have to do is to go and start a whole new trust ecosystem around that particular technology.
Each technology needs a distributed trust network, which is very difficult and has posed a massive entry barrier for innovators in the space to build new and interesting crypto infrastructure. So what we hope to unlock, by using the existing staking to support these new innovations, is that we can grow the staking market while also growing the productive utility of what we think of as the core raw material of the crypto space, which is decentralized trust.
Fantastic. If EigenLayer does create a competitive market for pooled security across protocols, what effect may that have on governance across the ecosystem?
Yeah, I think the first interesting thing it does is this: right now any upgrade to Ethereum requires some kind of consensus that we are all going to run that upgrade. This is actually a very hard process. It feels like an ocean liner: you don't want to make abrupt movements, so you go slow and careful.
Whereas what we can do with a layer like EigenLayer is people can come and test new ideas on smaller scales and in an opt-in market before they're battle-tested. Then once something is battle-tested in the wild, we can actually incorporate some of these changes back into Ethereum.
So you can kind of think of EigenLayer as an opt-in layer where many potential versions of Ethereum upgrades are tested before one of them gets enshrined. And we ourselves are trying to play with this idea. For example, we are building a version of Ethereum data availability scaling called danksharding, which is an upgraded version of EIP-4844 and the proto-danksharding that Viktor talked about earlier.
So our version of this, we call it EigenDA: Eigen Data Availability. The idea is basically to take some of the best ideas from the Ethereum roadmap, which in this case is danksharding, and test it out in an opt-in framework, where people can opt in and see how we can optimize these distributed systems.
Maybe there'll be other competing versions of the data availability other people build using ETH staking and then we'll find the best one and then some of those ideas can then be translated back to Ethereum, so that's our kind of vision for how we think EigenLayer fits with Ethereum governance.
Fantastic, thank you Sreeram. Viktor, zooming back out, what impact do you expect new technologies like DVT and restaking will have on the staking landscape? Will they cause a cat-astrophic shift? Or just small im-purr-ceptible change?
Well, right meow I think that a lot of these changes are kind of easy to predict in terms of the direction they'll go in, but harder to predict in terms of the scale of the impact. And maybe that's one of the things that I'll focus on.
When we talk about EigenLayer, for example — disclosure: I'm a small investor in EigenLayer and I'm a huge fan of Sreeram and team — it's an incredibly powerful capability. But I tend to think of it as performing useful work, and so when we think about how much impact EigenLayer will have, well, it depends on how much work it performs, right? It depends on how many services it provides, and how long it takes to build those out, right? And fundamentally, are these services net new services? Are they competing with existing things? Are they special additions to the core Ethereum protocol? On some of these questions we have some semblance of an answer, and on a lot of them we don't.
So when we think about ‘will EigenLayer have an impact?’ Yes, absolutely. Will EigenLayer have a single-billion-dollar impact? A $10 billion impact? A $100 billion impact? That part is very, very difficult to fathom because there are so many factors at play.
The same thing goes for Obol and a lot of different protocols at the middleware layer. So we know the direction that things will head in. We don't know, for each individual thing, the scale of the impact that is possible, although we can make some assumptions around bull cases: if things go right, what will it look like?
But I think that one of the important things that comes out of it is that even if we don't know the impact of each individual item, there is one underlying trend that we do know, which is that the Ethereum core protocol is going to become harder and harder to change over time. It's going to crystallize, and because of that, innovation gets pushed to the middleware layer, right? So the protocol kind of stays as it is, and it's, like, what can you do on top? You know, “Protocol and.” “Protocol plus.” So I do expect there to be more middleware solutions that come to market over time and more capabilities that get built out.
So I expect all of these things to create a really powerful base of infrastructure on which not just Ethereum can function as a blockchain, but also all the different capabilities and services that are built on top of crypto and that would benefit from decentralized infrastructure. That's what the Ethereum validator set is. It's the most powerful piece of decentralized infrastructure in the world.
Let's talk about Schelling points, which I think is an interesting discussion. Everyone building in this space has a Schelling point which is a result of proof of stake. And the Schelling point is 33%.
What happens then? Are projects limited in size? How big can you actually get while still adding benefit and not entering into that harm category? Most of the reason why we're moving towards multiple implementations is to get in front of this Schelling point topic, because there are some pieces of a validator stack that, if designed properly, can maybe run more than 33%. There are other pieces that, in some areas, I think most of us have agreed should not.
So everyone building here has the same Schelling point, actually, which is the 33% barrier. Really what you're trying to do is continue to build helpful technology that enables you to scale past that while still benefiting the network and not hurting it. That's really an art form. To be honest with you, it's not like an individual project goes off and explores and figures out what to do on its own. It's actually all of us learning from each other, working together, incentivizing our protocols together across different teams to figure out how we continue to grow products larger and networks bigger while still adding value and not bringing harm.
The other side of that equation — maybe the point of all of this — is to not have a $10 billion company. What's wrong with ten $1 billion companies that all have their own cultures and all have their own employee bases? You know, maybe that's the end game too. But I've seen this happen quicker in the staking industry than in others because of the Schelling point and the mathematical barrier of 33%. It's the first time protocols with massive demand have been capped. So now what do you do? It's a very interesting kind of exploratory phase.
I get comfortable with this because everyone that’s working on that problem is here for the right reason, and they're doing so to help protect Ethereum while also building great software.
One quick thing I want to add here, which touches on some things from before: the Ethereum staking protocol fundamentally has two different goals. One is economic security, which means we want as much stake as possible. The second one is we want decentralization. Unfortunately the protocol can measure the first one but not the second one.
So all of this discussion is downstream of this fundamental observation which is staking supplies decentralized trust which has two dimensions. One dimension is economic value and the other dimension is decentralization. And we need both of these for the protocol to function, but the protocol itself can only measure and incentivize the first one, but not the second one.
So we think about this a lot, because as a marketplace for decentralized trust, we want to understand both dimensions and figure out how to incentivize both of them. One way we think about it: at the core protocol layer it is not possible to measure and reward decentralization, but at a middleware layer — for example, the restaking layer — you could start incentivizing decentralization. As new services come and build on top, they can say, “I want a more decentralized validator set,” whether that is measured by stake decentralization, geographic decentralization, or other parameters.
These kinds of parameters, which concretely direct positive economic incentives towards certain objective goals — for example, growing decentralization — actually start valuing the things that we care about and paying for them, rather than it being purely ideological and religious, which is where we are right now. Decentralization is an ideological and religious objective but is not valued in economic terms. This is one of the things I hope for from EigenLayer, and hopefully at some point it becomes part of the Ethereum protocol itself, something like that.
But the idea is that decentralization needs to be measured and valued. It cannot be measured objectively by the Ethereum core protocol, but it can be subjectively measured by people building on top, who can demand a more decentralized validator set. This kind of brings in all the different themes that we're talking about here.
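One illustrative way a service on top could turn stake decentralization into a measurable, payable requirement is a concentration score over operator stakes. This sketch is an assumption for illustration — the metric (an inverted Herfindahl index) and any threshold are hypothetical, not EigenLayer's design:

```python
def decentralization_score(stakes):
    """1 minus the Herfindahl index of stake shares:
    0.0 when one operator holds everything, approaching 1.0 as stake spreads out."""
    total = sum(stakes)
    hhi = sum((s / total) ** 2 for s in stakes)
    return 1 - hhi

# A service could require, say, a score above some threshold before paying rewards
concentrated = decentralization_score([90, 5, 5])       # ~0.185
spread_out = decentralization_score([25, 25, 25, 25])   # 0.75
assert spread_out > concentrated
```

The point is the one Sreeram makes: even if the base protocol cannot reward this, a middleware marketplace can define such a metric subjectively and attach economic incentives to it.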
Totally. To piggyback off of that, Sreeram — we think about this nexus a lot, and I think there's a corollary we try to tether to: decentralization is a challenging thing to measure, but it is a direct corollary of risk.
Risk is something most participants in the market understand, whether you're a protocol evaluating different configurations for a restaking infrastructure, or a staker evaluating options in the market, contemplating different kinds of setups and what the risk-adjusted reward rate or even the risk basis looks like.
Risk, unlike performance, is actually more subjective, because if you look at different risk parameters, there's an optimization point that participants are trying to find. One of the things that I think we as an industry have to do together is define some of these core parameters in a way that's digestible and understandable to participants, because it's only when the demand side of the market demands risk-adjusted configurations that it inherently ends up demanding more decentralized solutions. As stewards of the Ethereum protocol and the ecosystem overall, it is that education we should invest in, because it can help us create a market where an educated user base and an educated builder base come together around efforts that favor more decentralized architectures overall.
Fantastic, thank you Mara. I want to take one question from the chat that has generated a lot of discussion. The question is as follows: “Is anyone working on integrating Obol / Charon with Rocket Pool nodes? Last week's ETH outage caused heartburn for Rocket Pool node operators running Prysm and Teku. Fault-tolerant Rocket Pool clusters soon?” I guess Darren, you're best qualified to field that one first?
So, Joe, who's actually our smart contract developer, has been working with the Obol guys and a few of our kind of node operators to test that out and build a prototype. It's more from a collaborative staking sort of setup rather than a full kind of permissionless staking thing. So it's more for if you know the people that you're staking [with]. But it's certainly looking promising. It's working. It's running on our kind of Goerli testnet. So that's a fantastic start to that kind of journey.
Yeah, and on our side too it's been pretty interesting. In the early launching of our first mainnet clusters, we did a Southeast Asia cluster which has nodes in India, South Korea, Australia, and New Zealand. We've been running at-home validators in those areas, checking latency and seeing what's going on.
Our initial intention for that was to just do it on Rocket Pool. Then we got really close and we're like, “Let's not do validator number 3 on Rocket Pool mainnet. Let's wait a little bit.” So then we actually deployed those on testnet at that point in time and we deployed that same configuration just on mainnet as a regular validator, and now we're getting closer to making it a reality.
Both teams are working on their own things to fix. It's been pretty cool and collaborative. But yeah, I hope to see more on testnet and I hope to see a couple on mainnet soon. I think finally, now that we've got about 26 on mainnet, it's a good time to start pushing more into the protocol area of our adoption — whereas actually, all of our mainnet adoption so far hasn't been with protocol teams. Which, to be honest with you, has its own set of barriers to entry, you know.
Getting DVT inside of a distributed community that has governance is not easy. It's not like one person's like, “Oh I really want this.” You need quorum. You need consensus. You need an integration plan. You need a roadmap. So honestly in our go-to-market strategy with protocols, the ones that exist with stake are more difficult to adopt, even though they're more convicted, because they have social coordination topics.
Whereas if you look at someone like EigenLayer who's approaching its launch and getting ready to launch, that's a much better area where we're focused at today because we kind of have two mutual things that we both want to achieve, right? EigenLayer wants to launch, they want to attract stake, they want to make sure that ETH is getting in there, and we want to make sure that that ETH is protected by DVT essentially.
So yeah each kind of go-to-market partner for us has had their own complexities, but the governance element is actually pretty tough. Well it's fascinating, I wouldn't call it tough. It's just time consuming at the end of the day.
Yeah, you know, doing business development with protocols is like selling to the government or to the town hall. You need to know the town hall members, talk to them, get consensus. It's a whole process. I think it's funny — some of us here interact with the Cosmos ecosystem, where this is maybe the most prevalent: you have very different kinds of people who are all involved in governance.
That's pretty much my job 24/7. We're working with our community at the moment on our roadmap prioritization, so that's going to be a lot of fun. It is actually quite good fun.
For all those who think that these protocols are centralized: they're not, to be honest with you. There is no one or two people that you go to to get anything done and accomplished — there is a level of decentralization there. It's difficult to interact with sometimes, there's no doubt about it. It's good fun, but it's hard to coordinate at the end of the day because they aren't centralized. “Who's in charge?” is a real question, and that feels good. But as a person trying to get a protocol integrated, sometimes it's like, “Dang, why can't this thing just be centralized,” you know?
Yeah so this would be, I think, the agility versus democratization trade-off that I think we all have to grapple with here in the crypto space.
Yeah a dictatorship is very efficient. Authoritarian rule is very efficient. Democracy is very inefficient, but participatory governance is definitely worthwhile.
We're almost at time, so I want to ask the panel: in one or two sentences, what is the thing you're most looking forward to regarding ETH staking in the coming year? It could be something you already mentioned. I'll start with you, Viktor.
Ah okay, good thing I thought about it really quick. So we talked a lot about all the stuff, so I'm just going to mention something that we haven't talked about, which is that I'm thinking a lot about intents. Intents are fundamentally different from transactions: an intent expresses the desire for a particular outcome without specifying the path to get there.
I have a hundred dollars, I'd like to buy 100 of X memecoin but I'm not saying I want it through Uniswap or Curve or whatever, I just want it. Who can deliver that for me?
So the systems and networks that can potentially service those intents and match the requester with the entity that's fulfilling them — I'm really curious to see how validators and staking will play into that. It's an area of very open research; nobody knows the answer. That fundamentally changes how transactions occur, the paths that they take, how they enter the mempool, and all that stuff. So that's a change that I'm looking forward to seeing play out. There's no firm roadmap, but it's something we haven't discussed before.
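A rough Python sketch of the model Viktor describes — the user states an outcome, not a path, and solvers compete to fulfill it. The solver names, rates, and selection rule are hypothetical assumptions for illustration, not any particular intent protocol:

```python
from dataclasses import dataclass

@dataclass
class Intent:
    # "I have $100, I want as much of token X as possible" -- no route specified
    budget_usd: float
    token: str

# Hypothetical solvers quoting how many tokens they can deliver for the budget
def solver_a(intent):
    # e.g. routes through a single venue
    return intent.budget_usd * 95.0

def solver_b(intent):
    # e.g. splits across venues for a better effective rate
    return intent.budget_usd * 98.5

def fulfill(intent, solvers):
    """Pick whichever solver delivers the best outcome for the user."""
    quotes = {s.__name__: s(intent) for s in solvers}
    best = max(quotes, key=quotes.get)
    return best, quotes[best]

winner, amount = fulfill(Intent(budget_usd=100.0, token="X"), [solver_a, solver_b])
assert winner == "solver_b" and amount == 9850.0
```

The open question Viktor raises is exactly where validators and staking sit in this loop: who runs the solvers, who orders the fulfillments, and how those paths reach the chain.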
Fantastic, who wants to go next?
I feel like we're just getting started. I think there's been such a fundamental focus on core protocol development and some of the really important upgrades that have happened, or are going to happen, that are going to unlock even more support for modularity within the ecosystem.
But I think when you start thinking about it, even within this very short time span — probably not even the last 12 months — we're really starting to see how staking, and staking products, and staking rewards are becoming productized.
So we're starting to see the emergence of solutions building way up at the top of the stack: aggregation, indexing, hedging. We're really starting to see a lot of that develop, and I think that's super interesting. It's happening on both sides of the market — in the DeFi space, where development moves really fast, we're really starting to be able to articulate a lot of value propositions that are going to be really interesting.
Viktor, you're kind of thinking about this intent and aggregation market, and thinking a lot about ‘how do you productize rewards and the yields associated with them?’ How can you create different types of products that ingest and incorporate them in different capabilities?
We're starting to see that on the CeFi side too. We see a lot of players coming into the market really starting to think about, “If I can hold ETH on my balance sheet how do I get access to ETH rewards? So I can do market neutral trades now because there's products that I can leverage on liquid staking receipts that allow me to do that.”
Suddenly you start seeing how the foundation that we're all building together unleashes such a tremendous amount of innovation on top of it that this really feels like the beginning, to be honest. So I think we've all got to do a lot of work in the next couple of years to make sure that the fundamental layers of this new liquidity layer in the market are really robust, really secure, and really able to meet all the different customer segments and their needs in the market.
Fantastic. If we could keep these to one or two sentences — I know we're a little bit over time, so I just want to wrap up and be respectful of everyone's time. Thank you so much.
Personally for me I think we're at a ZK renaissance time period. Oh did I freeze? Oh I'm back. Sorry I froze for a second.
For me, I think it's more of a ZK renaissance happening, since we've been working a lot to actually create real trustless validators. There's a lot of improvement that needs to happen in the ZK plumbing system. However, right now there's a lot of good investment going into that, to help these protocols expand into their future versions.
The last thing I'll leave everyone with, the way I look at it: everyone's been asking, “What is the killer use case for blockchain?” for years, just like email was for the internet. What everyone is forgetting is that blockchain is the killer use case of applied cryptography. We can't do any of this unless cryptography advances and improves over time. So I look at the cryptography world, and that's what I'm most excited about — helping all of us do the next thing that we need to do.
I'm going to reiterate lots of points. Watching the water rise on the industry — that's definitely going to be very exciting. We've put a lot of effort into Ethereum staking, so it's good to see things come to fruition, and new technologies, and the Ethereum roadmap with EIP-4788, and all of that good stuff.
From our end, I think the thing I'm most excited about is seeing the transition of Ethereum from this monolithic blockchain to basically being just the base trust layer for everything. That's the thing I'm most excited about and I'm looking forward to.
Fantastic. Thank you all! Thank you all for joining us in this engaging exploration of the future of Ethereum staking. To our esteemed panelists, thank you so much for sharing your purrspectives. This is an exciting time for Ethereum! Until next time stay curious, stay informed, and whatever you do stay pawsitive. Thank you and good day!
All opinions expressed by the Participants are solely their current opinions and do not reflect the opinions of their respective parent companies or affiliates or the companies with which the Participants are affiliated, and may have been previously disseminated by them. The Participants' opinions are based upon information they consider reliable, but neither Liquid Collective nor its affiliates, nor the companies with which such participants are affiliated, warrant its completeness or accuracy, and it should not be relied upon as such.
Liquid staking via the Liquid Collective protocol and using LsETH involves significant risks, and you should not enter into any transactions or otherwise engage with the protocol or LsETH unless you have fully understood all such risks and have independently determined that such transactions are and/or engagement is appropriate for you. Any discussion of the risks contained herein should not be considered to be a disclosure of all risks or a complete discussion of the risks that are mentioned. The material contained herein is not and should not be construed as financial, legal, regulatory, tax, or accounting advice.