Is Solana Decentralized Enough? w/ Kyle Samani, Su Zhu, and Hasu

Uncommon Core

In this episode, Su Zhu and Hasu invited Kyle Samani from Multicoin Capital, one of the most successful venture funds this cycle. Kyle is a day-one supporter of Solana, a smart-contract platform optimized for high throughput. They talked about:

* Multicoin's approach to investing
* How will the winning blockchain scale?
* What is enough decentralization?
* Win conditions for Solana
* What is Serum?

Enjoy!

Listen to conversations between Su Zhu, the CEO and CIO of Three Arrows Capital, and Hasu, an experienced crypto researcher and writer. Together with occasional guests, we explore the transformative nature of trust-minimized currency and financial services.

SUBSCRIBE to the podcast:
Apple Podcasts: https://podcasts.apple.com/us/podcast/uncommon-core/id1517659188?uo=4
Spotify: https://open.spotify.com/show/3vuV292Him90EjQ5YL4XIw
Google Podcasts: https://podcasts.google.com/feed/aHR0cHM6Ly9hbmNob3IuZm0vcy8yNTc4ZDVhMC9wb2RjYXN0L3Jzcw==
Other: https://anchor.fm/uncommoncore

FOLLOW on Twitter:
Su Zhu: https://twitter.com/zhusu
Hasu: https://twitter.com/hasufl

Our homepage and mailing list: https://uncommoncore.co/podcast/
Transcripts, if available: https://uncommoncore.co/blog/


Video Transcript:

Welcome to Uncommon Core, where we explore the big ideas in crypto from first principles. This show is hosted by Su Zhu, the CEO and Chief Investment Officer of Three Arrows Capital, and me, Hasu, a crypto researcher and writer.

Hasu: Hey, welcome to the show, Su.

Su: Hey, Hasu.

Hasu: Our guest today is Kyle Samani, general partner at Multicoin Capital, and our topic is the two major ways to scale a layer-one blockchain, and how much decentralization the winning blockchain is going to have. Before we dive in, Kyle, can you give us a quick intro of yourself and of Multicoin, your approach to investing, and your time horizon, so we can get some context on your portfolio?

Kyle: Sure. Hi everyone, it's a pleasure to be on the show. I'm a long-time listener of Uncommon Core, one of my favorite podcasts; you do a great job unpacking the fun debates. I launched Multicoin in October of 2017 along with my co-founder Tushar. In 2017 we launched our hedge fund, and we added our first venture fund in July of 2018. That fund is fully deployed, and we're now deploying out of our second venture fund. Today we manage a few billion dollars across those vehicles, and we invest in all things crypto. Our strategy is pretty straightforward: we are a fundamentals-focused fund. The time horizon in our hedge fund is typically measured in six to 24 months; that's the time frame we underwrite positions on. For many things we own, like Solana, which we'll talk about today, our time horizon is longer than that, but at a minimum we have to underwrite them to six to 24 months. Our venture funds are obviously buy-and-hold for five to ten years. In terms of thesis formation, we've invested at every layer of the stack, from core technical primitives through core financial primitives, through middleware, all the way to applications. We're generally comfortable with all forms of risk: technical risk, product risk, timing risk, team risk. I can think of deals where we've had serious risk in at least one of those categories. We get more uncomfortable when two or three of those are compounding. In fact, some of the best returns are the ones where you do compound those risks, so on rare occasions we will, but we prefer to say we're really underwriting one specific form of risk, the one we think is the core question at hand, and then try not to compound too many other forms of risk. It's impossible to do that perfectly, the world is not that neatly cut up, but that's usually how we like to think about risk.

Hasu: Interesting. Do you have an example of something you would define as a risk and would try to avoid compounding?

Kyle: Yeah. For example, we invested in zero-knowledge stuff: we invested in Mina and in StarkWare, and we've been investors in both since 2018. Out of looking at zero knowledge, we came to the conclusion that you can prove to someone that you've done a computation correctly, demonstrate the integrity of the computation, without them having to redo it. If you look at blockchains, the way they handle this problem is simply through redundancy and replication: just have as many people replicate the same thing over and over.
So zero knowledge, in a very abstract sense, represents one of the most disruptive fundamental changes to the nature of trust minimization that is out there. In 2018 we looked at StarkWare and Coda (which later became Mina), which were really the only two credible zero-knowledge plays, and StarkWare and Mina are very different things by all accounts. We said: if anything is ever going to kill crypto, it's probably this. Our ability to reason about layer one versus layer two at that time was almost non-existent, and thinking about zero-knowledge programming environments and those things, we had no idea how to reason about any of it. But we said there's a real chance zero knowledge can just break all the assumptions we have right now. We invested in both of those at the time, and the biggest risk in our mind was timing. I had a pretty strong suspicion, which I think has mostly been borne out, that it was too early. But we underwrote it saying we don't care that we're too early, because in the event that we're wrong and it's not too early, this could have crazy impacts through the rest of our portfolio. So it's both a hedge on the rest of the portfolio and, in itself, an asymmetric opportunity. The biggest risk there was timing. We weren't worried about team or math or anything else; those guys are the world's experts in this stuff, and we're not going to underwrite correctness there. Our biggest question was: is zero knowledge three to seven years too early? I personally experienced the pain of being too early with my last startup, Pristine. We built software for Google Glass, for surgeons. In hindsight, and it's been eight years since Google Glass launched in 2013, it's obvious now that it was too early; the hardware just wasn't there. If you look at the Snapchat summit they had a few days ago, it's clear it's still too early: the stuff still doesn't really work in a consumer-friendly package. So I spent two and a half years of my life doing something that was at least eight years too early, probably twelve. I remember looking at zero knowledge in 2018 and thinking the same thing: is this too early? I thought the probability was maybe 85 or 90 percent that it was, but we went ahead and pulled the trigger anyway.

Hasu: Makes sense. And this brings us to the question of today, which is: will the winning blockchain scale in layers and logical sharding, or rather horizontally within a single shard? The first of those breaks the nice synchronous composability we're used to, where all applications can interact with each other atomically, but it has the major benefit that users only have to verify the small part of the state that they care about. The second approach keeps the entire state in one huge blob, which retains the composability we've gotten used to, but at the expense of ballooning verification costs for users. I used to have a pretty strong stance on this, and I would say I still do.
But reading some of your work, and seeing the early success of Solana, has made me wonder whether I'm personally diversified enough on this question. As you said, maybe my risk in this area is too compounded, and maybe there is more than one viable approach. Where do you stand?

Kyle: One quick clarification on your comment about users in the layered approach being able to verify the part of the state they care about: I'm not sure that's strictly true, even in the maximalist sense. Say your assets are on one shard, but you end up having to interface with three, five, ten other shards. Right now, as a user, it's actually not clear how you would verify yourself that things executed correctly on those other shards. In a theoretical world where statelessness works, you can get there, but that is still undefined, unsolved problem space.

Hasu: To interject for a second: I didn't mean the sharding that's in the Ethereum 2 roadmap. The rollup-centric roadmap of Ethereum is also a form of logical sharding, because if you don't use a rollup, you don't have to verify it, but it still scales Ethereum as a whole.

Kyle: Ah, right, a slightly different definition; I just want to make sure we're very clear. I would actually argue that sharding maintains logical centralization. The difference between sharding and rollups is that in a sharded system, like NEAR or Polkadot or Cosmos, if you interact with a transaction on another shard, the inter-shard protocol will theoretically figure it out and do it for you. The issuing transaction does not need to know or care which shards the pieces of state are on; the sharding protocol handles that magically. Rollups, by definition, break that, because the layer-one system does not know the layer two even exists, so you cannot automatically route through the logic of the shard itself. So rollups break logical centralization; sharding theoretically maintains it. Sorry, I know that's a very nuanced technical point. So the original question was the layered approach versus the horizontally scaled, single-shard approach. I think the answer is a question of what trade-offs you are making and what you are prioritizing. I think the right thing to prioritize is sufficient decentralization, and then optimal user experience. You want some minimum level of decentralization, which primarily gives you censorship resistance; that's the property you're really getting. Beyond that threshold, do not optimize for decentralization any further; instead, optimize for developer experience and user experience. That is how I think about it. The problem is that as you decentralize to the maximum degree, as you go further and further down the decentralization curve, you create engineering problems, developer-experience problems, and user-experience problems. The theoretical solutions to those problems are things like sharding and rollups.
And I'm not convinced you need to go that far down the decentralization spectrum to make these things sufficiently censorship-resistant, to achieve the kinds of properties you want out of these systems.

Hasu: So maybe a good question to ask is: what is enough decentralization? Lots of people probably have lots of different opinions on this. Is there a first-principles way to approach it?

Kyle: Probably the best thing I've seen written on this is the blog post Balaji wrote back in, I want to say, 2017, on quantifying decentralization. You can quantify it across lots of metrics. The most obvious ones are stake distribution or hash-power distribution; the number of clients, or major implementations, and the stake or hash distribution across those; the number of validators in the consensus group; and the number of validators or miners who can impact liveness, which is one-third in proof of stake or 51 percent in proof of work, and how that is distributed. Those are probably the metrics that matter, plus maybe general wealth concentration, for general egalitarian purposes. I don't think there are any others that seriously matter; those are the five or six, and you can probably stack-rank them to some degree. If you ask Anatoly from Solana, he'll tell you the one that matters is the number of consensus validators that can get you to one-third of the stake, because that is how you impact censorship resistance, how you can theoretically roll back the chain, create liveness problems, and make the system fundamentally less usable for its intended purpose, which is DeFi. To me that feels like a very clear and reasonably objective way to think about it. It may not be the correct way, but it is at least a cogent view of the problem.

Su: I totally agree, and I also think decentralization is often a very emotional word for people, because when a lot of people come into crypto, they think there needs to be a certain amount of decentralization for it to have any value or meaning at all. I mentioned in one of our earliest Uncommon Core podcasts the idea of a spectrum. At that time I was talking about centralized exchanges, comparing CME futures trading against FTX, against Deribit, against BitMEX. People think too much in terms of absolutes: this will kill that. The reality is that it's all intersubjective. If the market demands a very high standard of decentralization for a specific task, and it ends up doing so, then that may make sense for that task. But for a lot of what people currently do in DeFi, and a lot of what people might use blockchains and block space for, there is definitely a concept of overkill. I think you guys have been very smart about this thesis, the idea that doing things on chain is fundamentally useful, and that if you go a little bit further along the spectrum, you can get a lot more done.

Kyle: It's obviously a spectrum. A year and a half ago this discourse was non-existent, and today it's reasonably existent. A lot of people have real open questions about how much decentralization is enough and which vectors really matter.
One that definitely matters is the number of validators that get you to a third of the stake weight. If you look at eth2, because there's no native delegation, you have to suss that out from the stated number of validators and the number of validators controlled by Coinbase, or by Lido, or by Kraken or Binance or whatever. That's an interesting sub-point. That metric is specifically important as you think about liveness thresholds: how many people can collude to impact the liveness of the system. The other fundamental property that matters is censorship resistance, and there, basically all you need is more validators validating the system. As you go from a thousand to ten thousand to a hundred thousand validators, you get more censorship resistance, because if anyone tries to block a transaction, or insert an invalid transaction, you have more and more people watching them, and more and more people in the consensus group, such that transactions will get included and verified. Those are probably the two most important ones. Let's take the second one first, censorship resistance. I think that's probably the more important one from a grand human-history perspective: just make sure you can't be censored. How many nodes need to be in the consensus group such that collusion is sufficiently difficult and you can get your transactions included? My intuition is that the number is probably ten thousand. Look, this is very subjective, but if there are more than 10,000 nodes around the world, and you know they are physically distributed, what's the probability that half of them, or two-thirds of them, are colluding so that you won't get your transaction included in a block? It just seems very hard to foresee that kind of large-scale collusion.

Hasu: Is it really realistic that any system, whether proof of work or, even more so, proof of stake, will ever have ten thousand distinct participants in the validator set, given all the economies of scale involved in staking and mining?

Kyle: In proof of work, if you're talking about individual people mining versus the hash pools, I'm fairly certain there are a lot more than 10,000 people who mine today.

Hasu: Right, but they don't make their own blocks; they outsource that to the mining pools, and that is unlikely to ever change.

Kyle: Agreed, there's basically zero percent probability that changes in proof-of-work systems. In proof-of-stake systems today: Polkadot has something like 800 validators or thereabouts on mainnet; Kusama is around 1,200, somewhere in that range; Solana has around 600 validators on mainnet and around 1,200 on testnet; Cosmos, Tezos, and Algorand are all in the 1,000-ish range right now, maybe 1,500. But none of them are at 10,000, to my knowledge.
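A minimal sketch of the metric Kyle attributes to Anatoly: the smallest number of validators whose combined stake crosses the one-third threshold that can impact liveness in a BFT proof-of-stake network (and, analogously, the 51 percent figure for proof of work). The stake numbers below are hypothetical, not real network data:

```python
def superminority_count(stakes, threshold=1 / 3):
    """Minimum number of validators controlling >= `threshold` of total stake."""
    total = sum(stakes)
    running = 0
    for count, stake in enumerate(sorted(stakes, reverse=True), start=1):
        running += stake
        if running >= total * threshold:
            return count
    return len(stakes)

# Hypothetical stake distribution (in tokens): a few large validators plus a long tail.
example_stakes = [5_000_000, 3_000_000, 2_500_000, 2_000_000] + [100_000] * 600

print(superminority_count(example_stakes))        # validators needed to reach 1/3 of stake
print(superminority_count(example_stakes, 0.51))  # analogous 51% figure
```

The same calculation run over delegated stake per operator, rather than per node, is what separates "number of validators" from "number of entities," which is the delegation point Kyle raises about eth2 and the exchanges above.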
Hasu: But validators in those systems don't say anything about who owns the stake, right? A validator just represents one fixed amount of stake, and they're all the same size. Why is this different?

Kyle: I'm not sure about the others, but I know that in Solana there are 600 nodes participating in consensus today. They have stake and they've staked it, and they could be, and probably are, controlled by a smaller number of people.

Hasu: Yeah, it's possible there are individuals running multiple nodes.

Kyle: Yeah. What's nice in basically all the systems other than eth2 is that there is native delegation. Eth2 does not have native delegation, but the other ones do, and so the motivation for having many nodes represent a single piece of stake is reduced substantially when you have native delegation. So you're in the thousand range today, plus or minus, for most of these proof-of-stake systems, and getting to ten thousand doesn't seem very hard to me; a 10x is pretty reasonable on a three-to-five-year time horizon. I'd say the probability in my mind is 85 to 90 percent that these systems have more than 10,000 individual consensus validators in three to five years' time.

Hasu: And how does that work? They may be in the consensus set, but do they really participate in, say, the making of the next hundred blocks? I remember that in BFT-based proof of stake you have this hard upper cap of around a hundred validators who can participate, because the communication overhead between them is so large.

Kyle: A few things to unpack here. One: assuming you have 10,000 consensus validators and a perfect distribution of stake, then obviously you're only participating, on average, in one out of every 10,000 blocks, so if your threshold is out of 100 blocks, that doesn't really work. Second, on your BFT comment: it's not quite correct. In all these BFT systems you have to trade off liveness against safety; you have to choose to prioritize one or the other. If the network splits, do you stall, or do you keep making blocks and eventually re-merge somehow? That is the fundamental question at hand. Tendermint, which is probably considered the gold standard of proof-of-stake systems, prioritizes safety over liveness, so it does in fact halt. What that means is that on a block-by-block basis, every single node has to communicate with every other node so they can finalize that block before moving on to the next one, which is the messaging overhead you just alluded to. Systems like Solana, as well as eth2, prefer liveness over safety, so that messaging overhead does not have to happen block by block: you can keep making blocks into the future even if a block isn't finalized. Solana does this, and I think most of the other liveness-focused proof-of-stake systems do as well. Basically, that communication overhead can afford to fall behind, and the impact is just that time to finality may increase, maybe from one second to three or five seconds, depending on network conditions. It doesn't slow the rate of block production, and it also means you can increase the size of your validator set and keep block production going at the same pace. What it will increase is latency to finalization.
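A rough, illustrative calculation of why per-block, all-to-all finality voting limits validator-set size in a Tendermint-style design, while liveness-favoring designs can let finality lag behind block production. The message size and the naive "everyone sends a vote to everyone" model are assumptions for illustration; real networks gossip votes rather than broadcasting them directly, but the quadratic growth is the point:

```python
def per_block_messages(n_validators, rounds=2):
    """Votes sent if every validator messages every other validator in each
    voting round (e.g. prevote + precommit) before the block is final."""
    return rounds * n_validators * (n_validators - 1)

for n in (100, 1_000, 10_000):
    msgs = per_block_messages(n)
    # Assuming ~200-byte vote messages: total network-wide vote traffic per block.
    print(f"{n:>6} validators: ~{msgs:,} messages, ~{msgs * 200 / 1e6:,.0f} MB of votes per block")
```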
Hasu: I wasn't sure whether Solana favors safety or liveness, so that answers it for me, thanks. You were talking about decentralization of the validator set.

Kyle: Right. Most of the non-eth2 proof-of-stake systems today have a thousand-ish validators, plus or minus a few hundred. So the question is whether that can grow, and there's no theoretical reason why it can't; there are just questions of whether more people want to run nodes. My intuition is that as these systems grow, they decentralize. Look at Bitcoin and Ethereum, the two oldest ones: in basically every dimension they have continued to decentralize over time, and that has been a relatively monotonic process, in terms of political control of the governance of these systems, the number of applications built on them, the number of nodes, even who makes the ASICs. In every way, these things have decentralized over time, because as the aggregate dollar value of the system grows, there is more and more incentive for random people to get involved in some way, shape, or form. So a growing market cap, I would argue, generally increases decentralization, and I think that trend will continue; I don't really see why it wouldn't. Even things like stake distribution: a lot of people who invested early in Ethereum owned a huge percentage of it. Joe Lubin obviously owned a massive percentage of Ethereum. I don't know if he still does, but he certainly did at one point, because ConsenSys was burning an enormous amount of money every month, and he had that huge amount of ether to underwrite it. Or look at the criticism of Solana: "Multicoin and Alameda own too much." Okay, but we are forced sellers at some point; our fund literally has a finite life, and we have to return the money. So even things like stake distribution have to decentralize over time. As long as market cap is growing, basically all metrics of decentralization have to move in the right direction.

Hasu: I generally agree that decentralization increases over time, and that it's a function of how many people care about the protocol; that is the biggest driver, ahead of any technical properties. There are two more factors that you touched on in your longer explanation but didn't mention explicitly. The political governance of these systems is definitely very important: who decides the roadmap, and how difficult it is to change the consensus rules. And the second, which I think is at the heart of the debate between these two approaches, is the culture of validation among users. It's true that you need to pass a certain threshold of malicious block producers in the block-producer set in order to corrupt liveness and safety in these systems. But even if that threshold is reached, if many users validate the state transition of these networks, then the evil those block producers can do is much more strictly limited. So my question to you is: is this something you're willing to sacrifice? How much do we have to give up this property of non-block-producers also validating the chain and keeping the block producers in check?
Kyle: This varies a little between proof of work and proof of stake. Bitcoin is particularly weird because you have something like four mining pools that control more than half the hash power, and if I recall, there was an episode, I want to say in 2015 or 2016, where miners actually stopped validating and started producing invalid blocks as some sort of shortcut to increase their hash rate, and the full nodes ended up catching them. So that fundamental need for more validating nodes is important. What's interesting is that in proof of stake, that dynamic exists to a much lesser degree than in proof of work, because you don't have this massive capex spend where your goal is just to juice your hardware as much as possible at the expense of other people. In proof of stake you have probabilistic rotation based on stake weight, so the dynamic of "I increase my hash power and gain at everyone else's expense" exists to some degree, but substantially less. The other comment is: even if you assume no one is verifying the consensus validators other than the consensus validators themselves, and it's not clear to me that assumption holds, if you've got 20,000 nodes in consensus, or 50,000, or even 10,000 on the low end, can you assume enough of them are honest to keep the system in check? The good thing is that if anyone produces an invalid block, that's easily slashable. Censorship is just a function of node count, and liveness is just a function of stake weight up to one-third. So if your focus as a user is "do I know my transaction will be included," you just need more nodes in the system to maximize that probability. If your concern is someone screwing with the system, again, you just need more nodes; whether they're in consensus or not isn't actually super relevant, as long as slashing is built in and some nodes can identify an invalid block and submit the proof to the rest of the nodes. And the third concern is whether there will be some sort of liveness attack where a large amount of history gets rewritten. That is actually the hardest to solve, the highest bar of all of these, because it's hard to force stake distribution, especially among the top validators.
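A toy model of Kyle's point that censorship resistance is mostly a function of how many independent block producers there are. The assumptions are mine for illustration: each slot's leader censors your transaction independently with probability equal to the colluding fraction, and a single honest leader is enough to include it. Under those assumptions, the chance of being kept out for a whole window of slots falls off exponentially:

```python
def exclusion_probability(colluding_fraction, slots):
    """Probability a transaction stays excluded for `slots` consecutive slots,
    assuming each slot's leader censors independently with the given probability."""
    return colluding_fraction ** slots

for fraction in (0.5, 0.9, 0.99):
    p = exclusion_probability(fraction, 100)
    print(f"colluding fraction {fraction}: P(excluded for 100 straight slots) = {p:.3g}")
```

Even with a very large censoring fraction, exclusion over a long enough window becomes unlikely; the harder problems, as Kyle notes, are the one-third liveness threshold and history rewrites.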
Hasu: Okay. So it seems we started from this point where, on the one hand, all these systems started completely centralized and decentralized over time. But at the same time, we're arguing there's a counter-force at the meta level, not inside the individual projects but between them, where new projects come online that erode the ideals of decentralization that have emerged in the community, in order to get something out of it: better user experience, better developer experience, a better platform for DeFi. So where does it stop? Two years from now, is there an even more user-friendly, more scalable Solana that, instead of supporting a thousand validators, just says, "we decided that 12 geographically distributed validators, sort of like Libra, is enough"? At what point do users say "this far and no further" in terms of eroding decentralization?

Kyle: A few comments on this. The first is: should a protocol prescribe a level of decentralization? Eth2 does prescribe a level of decentralization: there are 64 shards, and they prescribe the hardware requirements per shard. There is ideological dogma built into the protocol, represented in the shard count as well as the hardware requirements per shard. BSC is the same thing, obviously pushed in the other direction, but the same idea. Interestingly, Solana does not prescribe anything at the protocol layer at all. Solana does not prescribe hardware requirements, node counts, or anything; the protocol lets all of that fall to the market itself. A lot of the protocol does happen to be optimized for GPUs, which happen to run around 4,000 concurrent cores, and I'd say the one thing Solana assumes is that you have a reasonably high-bandwidth computer, just so the proof of history and all the messages related to it can go in and out. Beyond that it really assumes nothing about node count or hardware. All of the decisions about what degree of hardware you need to keep up with parallel transaction execution, and what degree of hardware you need to keep up with the proof of history, with running the hashing cycles, which are the two most important questions in answering how decentralized it is, are not prescribed in the protocol whatsoever. They are decided exclusively by the users, by the market: some combination of non-staking users, the people who delegate stake to validators, and the validators themselves. And then there's the soft social power of what the Solana core team recommends, what Sam or what Kyle has to say about those things; there's all that soft social discourse, but the protocol itself says nothing. Now, that's a technocratic answer, but I think it's worth noting. The more realistic answer is that, in practice, the Solana Foundation has a recommended computer spec on its website, and most validators today do in fact adhere to those specs; if you try to join with a lesser computer, you won't keep up. So there's obviously some practical reality here, but it's worth noting that all of these dynamics around whether it decentralizes or centralizes over time are not in any way dictated by the protocol, only by the market.
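A back-of-envelope sketch of what "letting the market decide hardware requirements" means for anyone who wants to keep up with the chain. All parameters here are assumptions for illustration (average transaction size, target throughput levels), not protocol constants or measured figures; the point is only that bandwidth and storage requirements for a verifying node scale directly with throughput:

```python
def daily_ledger_growth_gb(tps, avg_tx_bytes):
    """Raw transaction data appended per day, in gigabytes."""
    return tps * avg_tx_bytes * 86_400 / 1e9

def sustained_bandwidth_mbps(tps, avg_tx_bytes):
    """Sustained download rate just to receive the transactions, in megabits per second."""
    return tps * avg_tx_bytes * 8 / 1e6

AVG_TX_BYTES = 250  # assumed average transaction size
for tps in (15, 1_000, 50_000):  # roughly "Ethereum-like" through "Solana-like" targets
    gb = daily_ledger_growth_gb(tps, AVG_TX_BYTES)
    mbps = sustained_bandwidth_mbps(tps, AVG_TX_BYTES)
    print(f"{tps:>6} TPS: ~{gb:,.1f} GB/day of transactions, ~{mbps:,.2f} Mbps sustained")
```

This is the trade-off raised next in the conversation: capping throughput protects the cheapest verifiers, while leaving it uncapped pushes the requirements toward whatever commodity hardware and bandwidth can deliver.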
Hasu: But there are reasons why other protocols cap things like throughput, state growth, and bandwidth requirements. One is to protect the non-mining, non-staking users, because their private benefit from validating the state transition is quite low; it's just "I want to make sure I'm on the right chain," and that's basically it. The incentive for them to do this is quite low as long as enough other people do it, which is why you have this verifier's dilemma on all layer-one and layer-two blockchains. The second reason is to protect the weakest of the miners and stakers. I think this is a bigger deal in proof of work than in proof of stake, but I might be wrong. In proof of work there's this theoretical attack vector where larger miners, whose blocks have longer propagation times, want to mine blocks that are as large as possible, so if you leave the block size to the free market, the steady state is that blocks just keep growing, propagation times keep growing, and you basically get an automatic selfish-mining attack vector. Do you see any of those risks in Solana?

Kyle: I'm not worried about the selfish-mining kind of thing. Proof of stake naturally solves that with guaranteed timing of moving between nodes.

Hasu: The validators can still miss that, though. They have slots, right? In liveness-favoring proof-of-stake systems there are these slots where your node has, say, a second, or half a second in Solana's case, to produce a block, and if it misses that slot it doesn't get the reward, and maybe gets some sort of micro-slashing?

Kyle: They don't slash you for liveness failures, at least not on an individual basis like that. But yes, conceptually, if you're not online and ready to go, you are going to miss your slot.

Hasu: Can blocks get so large that the smaller nodes fail to produce a block in time, or is that totally outlandish?

Kyle: No, it's absolutely a real thing. If you go to Solana Beach, which is the main Solana block explorer and network overview, and click on the validators tab, one of the key metrics you'll see is, I think it's called slot uptime or something, I forget the exact name. Basically it means what percentage of the time those validators respond in time to the rest of the network with transactions from their slots. From what I recall, the median today is something like 85 percent, and that number has been growing.

Hasu: That's quite low. Why do you think it's that low?

Kyle: It's because blocks come roughly every 400 to 500 milliseconds, and you rotate validators every four slots, so you're rotating on average every two seconds or so. With communication overhead around the world, some people are literally just missing their slots. But that's just okay. Well, it reduces throughput, because you obviously have empty slots, so it does impact performance, but beyond that it doesn't really matter.
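A small sketch of the "slot uptime" style metric Kyle describes: the share of a validator's assigned leader slots in which it actually got a block out. The data here is hypothetical; a real version would pull the leader schedule and the confirmed-block list from an RPC node rather than hard-coding them:

```python
def slot_uptime(assigned_slots, produced_slots):
    """Fraction of assigned leader slots for which a block was actually produced."""
    assigned = set(assigned_slots)
    produced = set(produced_slots) & assigned
    return len(produced) / len(assigned) if assigned else 0.0

# Hypothetical example: a validator assigned two rotations of four slots misses one slot.
assigned = [1000, 1001, 1002, 1003, 1600, 1601, 1602, 1603]
produced = [1000, 1001, 1002, 1003, 1600, 1601, 1603]
print(f"slot uptime: {slot_uptime(assigned, produced):.1%}")  # 87.5%
```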
Hasu: I mean, it is a strong centralizing force in the validator set, because those who miss their slots will just lose money over time as validators and then stop validating.

Kyle: Do they all collude, and that kind of thing? I'm generally pretty skeptical of large-scale collusion among lots of independent parties.

Hasu: Oh, I didn't mean collusion, just that larger block producers have a strong incentive to make larger blocks. But actually, I might just be wrong, and this might not apply here at all, because the slot is the slot, right? You don't need to wait for someone else's block to build on it, unlike in proof of work.

Kyle: Correct. Specifically, the point of the whole proof-of-history system is that everyone is maintaining an independent clock, which is the repeated hash. So if someone misses their slot and you're the next guy, you just don't care that the last guy missed; you can make sure you're ready to go.

Hasu: Okay, but nonetheless, if 15 percent of validators miss their slots on a consistent basis, there is a centralizing force there in the block-producer set.

Kyle: Yes, potentially; directionally that's obviously true. But the countervailing force is that system optimization still has a long way to go. There are a lot of known things the Solana team wants to do to improve redundancy in the system and make that better. I would suspect that general consistency of performance will improve; over the next 12 to 24 months you'll probably see a two or three x reduction in missed slots, because all new systems like this just take a long time to optimize. The Solana team is very open about that. In fact, they still call the system a beta for this reason: they know there are so many optimizations they haven't done yet that they're unwilling to take the beta tag off of it.
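A minimal sketch of the "independent clock from repeated hashing" idea Kyle refers to: each tick is the hash of the previous tick, so the sequence can only be produced one step at a time, but anyone can re-check it afterwards. This is a toy illustration of the concept, not Solana's actual proof-of-history implementation:

```python
import hashlib

def poh_ticks(seed: bytes, n: int):
    """Produce n sequential SHA-256 ticks starting from seed."""
    ticks, h = [], seed
    for _ in range(n):
        h = hashlib.sha256(h).digest()
        ticks.append(h)
    return ticks

def verify_poh(seed: bytes, ticks):
    """Re-run the hash chain and confirm every tick matches."""
    h = seed
    for tick in ticks:
        h = hashlib.sha256(h).digest()
        if h != tick:
            return False
    return True

ticks = poh_ticks(b"genesis", 10_000)
print(verify_poh(b"genesis", ticks))  # True
```

Because every node keeps counting ticks on its own, the next leader does not have to wait on a missed slot; it just continues from its own clock, which is the point Kyle makes above.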
Su: Backing up, and to your point about supply decentralization and distribution over time: I think people underestimate how quickly supply can distribute if the protocol is actually being used and is useful for people. Think about Ethereum in 2016, when supply was still quite concentrated. It took just one year of ICOs and a lot of activity, with everyone in the world learning what Ethereum is, and today relatively few people talk about Ethereum's supply being controlled by only a few people. So I think utility solves all, actually, when it comes to supply decentralization: if people want to get their hands on it and it's useful for them, distribution will just happen, even if it starts out less decentralized. Also, if you look at the way Solana, and also Polkadot and Kusama, did their listings, and the price history, and how they allowed normal individuals to access those assets from relatively early on, there is clearly a relatively broad holder set, broader than what I think people would have assumed when these projects were in their seed phase. I remember during the seed phase of a lot of these, Kyle came to us and asked us to join one of the rounds, and we ended up passing because we didn't look closely enough. Later on we realized we had made a mistake, so we bought a lot of it OTC, and we also went and did a lot more research on the thesis. Back then, the main criticism of Solana was that only a few people would own a lot of it, and I truly think this is one of the biggest red herrings in investing, because at the end of the day it's about technology and community. If they have a way to create a community, and if they have legitimate technology, then distribution is not a problem. People forget that all of these things started relatively centralized. When Satoshi mined the first block, he had all of it; everything starts from that point. And from that point of view, these newer proof-of-stake chains are ultimately a bit better engineered, in a sense, because they think very critically about how they want to give out supply, how they want to bring people in, how they want to reward early adopters. With Mina and the CoinList sale, I think tens of thousands of people were able to buy it on CoinList. There's a sort of advantage of modernity with some of the newer chains, because they've been able to see the history of a lot of other chains and ask: despite starting relatively centralized, because we need actual cash to fund this technology, how do we decentralize supply over time while making sure there's still a lot of activity going on? So I just think that's a complete red herring in investing in crypto.

Kyle: A couple of points building on that. One: if you look at the pace of decentralization of Bitcoin versus Ethereum, Ethereum obviously decentralized a lot faster, because no one was paying attention to Bitcoin in 2009. No one knew what any of these things were, and a lot of education had to happen. Now look at where Solana is today versus where Ethereum was when it was about a year old; Solana is about 13 or 14 months old. If you look at Ethereum 13 or 14 months after it launched in July 2015, they had just gone through the DAO hard fork, and there was nothing on the chain other than the DAO and the hard fork of the DAO. So it's obvious that the pace at which the ecosystem is growing is just a lot faster now than it was then. I don't mean that as a criticism of Ethereum; no one was paying attention to crypto back then, and a whole bunch of things have changed. But in the nature of comparing eras, time is compressing: the pace at which these things can decentralize is a lot faster now than it was then. And if you look at the Solana network today, this is crazy to think about: Hinman from the SEC gave a speech in June of 2018 saying ETH is not a security. If you look at the state of the Ethereum network at that time, Uniswap did not exist; I think Compound had not yet launched, they had raised money but hadn't launched the product; Maker did exist; 0x existed; I think Kyber had maybe just launched v1, or was about to.
EtherDelta was around, and that was about it. There were a few hundred million dollars in stablecoins on the system, not that much even in stablecoins. And then you look at Solana today: there's about a billion dollars in stablecoins, 1.6 billion of TVL, and Serum is doing nine figures in trading volume a day. It's kind of crazy to think about the non-linearity of how fast these things grow. So that's a backwards-facing comment on the red herring Su alluded to. And if you project that forwards, the non-linearity gets even more interesting. What I think is going to happen is this: with these permissionless DeFi crypto thingies, obviously a lot of people around the world are paying attention to this stuff right now, and they're all trying to figure out what it means for their business. That's true for finance companies as well as banks, but I have to imagine every tech company and every social media company in the world right now is thinking about this stuff, asking what's here. They're looking at the cloud, they're looking at social tokens; there are obviously a lot of cool ideas here, a very interesting design space. But none of them have done anything yet on a public chain. Facebook tried to go the wrong way with Libra slash Diem, and it doesn't appear to be working; I'm not sure why, but for whatever reason they have problems. My point is that no one has actually done anything on a public chain. Most interestingly, the one company that got close was Reddit. They got really excited about doing a points-system thingy, they did this big public bake-off last summer, and their conclusion was that none of these things are ready, so they would do something permissioned and private themselves. That's the biggest, most empirical demonstration of how not ready these systems are for scaling to large numbers of users, and that was about nine months ago. When I think about what can happen over the next nine to 24 months: I know all these companies are looking at doing crypto things, and the number one thing they're all worried about is scale. They don't want things to break, they don't want a bad user experience, they want to make sure it's going to work, that the fees are low, and all that. And I'd actually argue that if a real company with, say, 50 million-plus daily users says, "we're going to move our users onto a blockchain for some core operation that's native to the application," something you expect 50 million people to use multiple times per day, then the first time that happens, that blockchain becomes the most likely to become the largest blockchain in the world. Because once that happens, most of the other companies in the world are going to say, okay, let's watch and see what happens to these guys; there's a lot of technical risk, product risk, and operational execution risk in pulling this off. Everyone is going to sit back and see whether these guys fall on their face.
And assuming it works, the convergence of perspective you're going to get among global engineering leadership around the world, saying, okay, this is the least risky way to scale these things to 500 million or more users, means that perception can change very fast. So I don't hold the dogma that these debates are settled. Perceptions can change, the time it takes for them to change can be extraordinarily small, and the momentum can shift. I don't think any of these core debates are over, because these types of announcements are going to come. I don't think they're imminent in the next six months, that's probably a little premature, but I'm optimistic that within 18 months you're going to see at least one major tech company do something that's fundamental to its business and that incorporates a public chain.

Hasu: I don't have a good track record of predicting these things, but I would still say I'd be very surprised if that's true. Maybe I just lack the creativity to see what using a public blockchain can do for the products these businesses offer, and why they wouldn't rather use an experience that they control, with just a regular database. Do you have any example in mind? Even when Reddit made its announcement, I thought it was stupid and didn't make any sense.

Kyle: The right frame here, I think, is not one of control but the opposite: one of liability. Vitalik wrote a really good blog post about this, I don't know, six, nine, twelve months ago, I forget, and it was phenomenal. He basically said that the conversations inside companies, because of GDPR and because of all these hacks, are increasingly about data. Look, if you're Google or Facebook, you have a big machine-learning business, so you have some data requirements. But for most other businesses that are not doing large-scale ML: is data an asset or a liability? You keep seeing these hacks, Cambridge Analytica, Target, all these things just keep happening. With a bit more regulatory push from various jurisdictions, it's not hard to see a world in which a lot of companies start to view data as a liability instead of an asset.

Hasu: And how do blockchains solve that?

Kyle: What blockchains provide is the substrate for designing applications where users own the data, where they own the state, whether that state represents money in the form of a social token or something else, or whether it represents your Telegram messages or whatever. Now, is Telegram going to move over to some decentralized system soon? No, because the scale of messaging is too large; that's the highest-order engineering problem, trillions of messages per day. But there are a lot of intermediate things between here and there that are much lower volume. If it's 100 million messages per day, can you handle that in a decentralized way? I think Vitalik is probably right that on a longer time horizon, data is a liability, not an asset. So what do companies do? I think the obvious design spaces are financial-inclusion types of things, where the companies can say, look, we're not a money transmitter, we're not liable.
And then anything related to social tokens and the creator economy; all of that feels very ripe for this kind of thing. I do think the social media and social-adjacent companies are probably the most interesting for this design space. Look at Reddit. I'm not a Reddit user; I've always thought it was messy and hard to filter through. But there's some interesting design space there for karma points or credibility points, or whatever you want to call them, on a per-subreddit, per-forum basis, and people want to imbue value into that. For Reddit to imbue value into that itself would make Reddit a money transmitter, with a lot of things they don't want to deal with, so this is just a clever regulatory arbitrage for Reddit to imbue value into its systems without becoming a money transmitter. All of these vectors are very interesting ways for big companies to start engaging with this stuff. The other comment I'd make, generally, is that most new technologies that are orthogonal to a lot of existing things tend to seep into the world in ways that are not very predictable, the internet being the best example. There's a Marc Andreessen quote about this: he says he now assumes that every entrepreneur who comes into the investment committee to pitch is right, that whatever their core thesis is, it's correct, and the only question is timing. That's a somewhat extremist view, and he's obviously being a little hyperbolic, but directionally there's a nugget of truth in it. Doing this for a few years has made me a lot more open to that general line of reasoning: I'm just going to assume all these weird things people pitch me are right, even the ones I think are dumb, and the question is just a function of timing.

Su: An interesting thing that happened yesterday, to that point, is GameStop announcing that they're going to do this NFT thing on Ethereum. Remember when GameStop first came into the mainstream, there was a lot of talk that GameStop should put Bitcoin on its balance sheet and that that would be a great way to play crypto. Instead, they've gone and done their research, and they've decided that what they actually want to do is put NFTs on Ethereum. I think there are a few conclusions we can draw from that. One is that for businesses that are Gen Z-native, internet-native, millennial-native, there is a huge and growing interest in this idea of social, collectible, internet-of-value stuff. When you explain it to these types of owners of these businesses, it truly excites them. And the idea that they would want to control a centralized database of this stuff: if you're working with that company, you can't tell them with a straight face that they should control that database themselves. What advantage would they get? If GameStop said, "we're going to make a database of collectibles, and you can now click these things in my database," you'd be laughed out of the boardroom; it just sounds insane. So I think we're getting to the point where, in those discussions, the person advocating a centralized database will soon seem like the craziest guy in the room.
Then the question will only be: which chain do we deploy to? Do we use a newer chain? What features do we want, what user experience do we want? So I think that whole concept of the centralized database assumes several things about what companies want that aren't really true. It assumes they aren't able to do the research and figure out what could make their business work. They've now seen the growth of DeFi, they've seen Top Shot, they've seen NFTs, and once that imagination is sparked, with the reflexivity of this entire adoption phase, I think we're entering the stage where, assuming crypto is able to serve as this credibly neutral settlement layer, internet-native companies will find it incredibly interesting and want to deploy their biggest ideas on it as soon as possible.

Kyle: I agree. It's hard to remember now, but a lot of companies ran a lot of experiments with the internet back in the day, and it took a lot of people a long time to figure out what to do. But a lot of the core ideas were there from day one: forums, chat, even things like CRMs and databases. All of those core ideas have been present since the early 90s. I look today and the obvious ones are DeFi and digital collectibles; those are the two really big, obvious ones, and I think there will be more that iterate from there. And the number of places you can insert those core primitives: everything is measured in billions of daily active users. So yes, I'm optimistic this stuff will happen, and the companies that already have distribution will in fact be the primary distribution for it. I don't think this is going to be like the internet, where a whole bunch of new companies came in and disrupted the old guys because the old guys didn't figure it out. I'm a lot more optimistic that internet-native companies will see the patterns and the trends. Not all of them will adapt correctly; a number of them will fail. But for the leading social media companies, really any company with deep social roots, I think it's very probable they figure out how to embed these new primitives into their products and services in a compelling way. They're not asleep at the wheel; in fact, most of them are founder-led, and those companies are likely to rejigger their products.

Hasu: Personally, I think the vision of decentralized social media is very compelling, in the sense that the state is public and uncensorable, and users can choose between different interfaces that may have different levels of moderation, and those interfaces can then be regulated. I agree it's a question of timing; of the web3 vision, this part seems the most compelling to me, but I don't know how many years out it is. And to what you said earlier: I think one of the most interesting aspects of DeFi is removing liability from financial service providers. There are two reasons why regulation exists.
One is for the incumbents to keep competitors out, but the other is really to protect consumers, and to protect the economy itself from moral hazard and contagion effects. And I feel like that second part you can really get around by using smart contracts. Just in general, this concept of a company tying its own hands: if you can do that, then all of a sudden a lot of the need for cumbersome regulation disappears.

Well, it's "can't be evil" rather than "don't be evil."

Yes, exactly. Okay, so we talked about what it takes for developers to adopt this, and the user experience, transaction cost, and some minimum level of decentralization are definitely big parts of that. But another thing we have seen to be a very potent network effect is the execution environment and programming language for developers, and all the tooling around them. I would say the EVM and Solidity are huge leaders in that area right now, and Solana uses a Rust-based execution environment. Can you talk a bit about that, and about why you think the EVM's advantage is not already insurmountable?

Yeah. Solana has a custom runtime called Sealevel. It is an LLVM-compile-style VM built around kind of a new instruction set called the extended Berkeley Packet Filter, or BPF. Some of that is well below my understanding of how circuits switch, how memory is stored, and how processors handle things, so there's a level of technical depth there I'm not qualified to speak to. What I do know is that Solana compiles down to something close to native code: the Rust actually gets compiled down through LLVM, so you're getting essentially native execution as opposed to some intermediate layer. The EVM, by contrast, acts as a virtual machine, which is in the name; you'll note that Sealevel is not called a VM, it's called a runtime. So with the EVM you have that virtual-machine abstraction, and again, the technical depth of virtual machines is beyond me and probably beyond the scope of this podcast, but it's generally understood that they can become a bottleneck in terms of processing efficiency.
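To make the Rust-based execution environment concrete, here is a minimal sketch of what an on-chain Solana program looks like in Rust using the public solana_program crate; programs like this are compiled to BPF bytecode and deployed. The logic is a placeholder rather than a real protocol, and real programs add instruction decoding and error handling.

```rust
// Minimal, illustrative Solana program: Rust compiled to BPF via LLVM.
use solana_program::{
    account_info::AccountInfo, entrypoint, entrypoint::ProgramResult, msg, pubkey::Pubkey,
};

// Registers `process_instruction` as the program's entrypoint.
entrypoint!(process_instruction);

// Every account this instruction is allowed to touch is passed in explicitly
// via `accounts`; the program never reaches into global chain state.
fn process_instruction(
    program_id: &Pubkey,
    accounts: &[AccountInfo],
    instruction_data: &[u8],
) -> ProgramResult {
    msg!(
        "program {} invoked with {} accounts and {} bytes of data",
        program_id,
        accounts.len(),
        instruction_data.len()
    );
    Ok(())
}
```

The `accounts` slice is worth noticing: the fact that a transaction has to name every account it touches is exactly what makes the parallelism discussed next possible.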
One of the core insights the Solana team had starts from the observation that you have a fixed amount of computational space in these networks. There's the physical bandwidth, but there are also just the physical processors and graphics cards, and you have, theoretically, billions of people trying to share that fixed amount of resources. Given that, you need to squeeze every ounce of performance out of the system so you can run at the limits of what the hardware can actually do. And one of the things the team recognized early on was that the EVM was just not in any way optimized to take advantage of hardware, both in terms of per-instruction efficiency and, even more importantly, in terms of parallelism.

That is probably the most important difference between the EVM and Sealevel: Solana natively supports parallel transaction execution. If you think about DeFi, or even just payment flows: if I pay Su, and then Hasu pays someone else, there are no dependencies between those two things, so they should transact in parallel. The problem on a blockchain is that you have this completely open state, anyone can submit any transaction at any time, and any transaction can theoretically modify any part of the state. You don't know in advance what it's going to modify, so if you enable concurrent transaction execution, you have to make sure you never have two transactions reading and writing the same piece of memory at the same time.

Like two trades in the same market, or whatever.

Exactly, and there are plenty of examples you can come up with. On a technical basis you're specifically focused on address space, in memory itself; that's really the core constraint. The EVM solves this problem by not solving it: it just forces everything to run serially. That is a solution, but obviously you forfeit parallelism. Interestingly, of all the other major chains, the only one that even attempts to solve this within the context of a single shard is Solana. The way they solve it is that every transaction has a transaction header, and the header specifies all the parts of the state that the transaction could modify (not that it will modify, because there may be branching if-logic in the transaction, but everything it could touch across all potential permutations of those if-statements). You basically lock all of those pieces of state and say: I have monopolistic rights over these parts of the state for the course of this block. By doing that, the system can parallelize. It knows what every transaction is going to touch, so it can run in parallel all transactions that don't touch overlapping state. And the benefit is that you can actually use the hardware: modern graphics cards have about 4,000 cores, so you get roughly 4,000 lanes of parallelism, and in a year or so NVIDIA will release cards with 8,000 cores and you just double the throughput.
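The scheduling idea Kyle describes, declare up front which accounts a transaction may write and then run non-overlapping transactions in parallel, can be sketched in a few lines. This is a toy illustration rather than Sealevel's actual implementation: it assumes each transaction simply lists its writable account keys and it ignores read-only accounts, which the real runtime also tracks so that reads can be shared.

```rust
use std::collections::HashSet;

// Toy model: each transaction declares, up front, every account it may write to.
struct Tx {
    id: u64,
    writable: HashSet<&'static str>,
}

// Greedily pack transactions into batches so that no two transactions in a
// batch declare an overlapping writable account. Each batch can then execute
// its transactions in parallel, one lane per transaction, batches in order.
fn schedule(txs: &[Tx]) -> Vec<Vec<&Tx>> {
    let mut locked: Vec<HashSet<&'static str>> = Vec::new(); // accounts locked per batch
    let mut batches: Vec<Vec<&Tx>> = Vec::new();
    for tx in txs {
        // First batch whose locked accounts do not conflict with this transaction.
        match locked.iter().position(|set| set.is_disjoint(&tx.writable)) {
            Some(i) => {
                locked[i].extend(tx.writable.iter().copied());
                batches[i].push(tx);
            }
            None => {
                locked.push(tx.writable.clone());
                batches.push(vec![tx]);
            }
        }
    }
    batches
}

fn main() {
    let txs = vec![
        Tx { id: 1, writable: ["alice", "bob"].into_iter().collect() },
        Tx { id: 2, writable: ["carol", "dave"].into_iter().collect() },
        Tx { id: 3, writable: ["bob", "erin"].into_iter().collect() }, // conflicts with tx 1
    ];
    for (i, batch) in schedule(&txs).iter().enumerate() {
        let ids: Vec<u64> = batch.iter().map(|t| t.id).collect();
        println!("batch {i}: run txs {ids:?} in parallel");
    }
}
```

Running the example puts transactions 1 and 2 in one parallel batch and pushes transaction 3 into a second batch because it writes an account transaction 1 already locked; the runtime applies the same reasoning, just at far larger scale and across hardware lanes rather than a toy loop.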
When I think about the nature of these blockchain systems, if you assume there are going to be social media applications from Snapchat and from BitClout, there's going to be DeFi stuff, and people will be trading tokenized securities, these are all largely different things. By definition those categories of applications don't overlap in what they do, so it's only natural that you should be able to parallelize the transactions. And it seems relatively clear to me that if you assume you've got 100 million or a billion users doing all kinds of social media things, DeFi things, whatever, then on a block-by-block basis, where a block for simplicity is one second, the percentage of transactions actually contending for overlapping state on a per-second time scale is, my intuition says, probably under one percent, maybe under 0.1 percent. Maybe it's two or three percent, but I'm pretty sure it's not over ten. In Ethereum today it's probably a lot higher.

That's what I'm saying: as you increase the array of applications...

Yeah, exactly, that percentage has to drop. My intuition is that it's on the order of one percent, maybe two or three; it's hard for me to believe it's higher than that. So if you don't parallelize, you're just forfeiting massive amounts of throughput per unit of time, and I think that's super important. That's one of the fundamental differences between Solana and Ethereum. The downside, of course, is that you lose backwards compatibility with the EVM, which obviously has a fair bit of infrastructure built up around it. I've always felt the EVM hadn't achieved escape velocity, for all the reasons we just talked about with big companies and everything else, but it is obviously something you have to overcome, and it was not to be taken for granted that it would be overcome. At this point, though, if you look at the state of the Solana ecosystem, it's hard not to imagine Solana getting basically all the things Ethereum has that Solana doesn't have today: things like Dune, things like The Graph, which has been announced, a few more DeFi primitives, those kinds of things. It's pretty hard not to imagine all of that getting built out in the next three or maybe six months, at which point you're basically at feature parity for what I'll call the middleware and DeFi-primitive stuff. And if you assume that's the case in three to six months, then the question is what advantage the EVM is actually providing, and that starts to look very insignificant pretty quickly.

Right, okay. I agree about those tools, but in general, is it even possible to port something over? Not that you would want Uniswap on Solana, because Solana would support more efficient order-book-based exchanges, but if there's a DeFi application that users like, what are the steps to bringing it over? I assume it requires a full rewrite.

Yeah, a full rewrite of the smart contracts, for sure. One thing I observed that I was wrong about: back in August or September of last year, the Solana team reached out to all the major DeFi protocols on Ethereum. Serum had just been announced and there was a little bit of volume starting to happen, and they said, hey, are you interested in rebuilding on Solana? All of them said, oh, this is interesting and cool, and then none of them did anything. It's been nine months now and you can see that none of them have launched on Solana. So the question is why. Obviously they didn't think it was a priority, that's implicit. But if you dig beyond that, one thing I observed, having interfaced with a fair number of Solidity-based EVM engineering organizations, is that the Ethereum developer teams have very little, if any, expertise writing and deploying Rust. Not that Rust is some weird niche language; it's one of the most popular languages in the world now. But these teams just don't have that experience in-house.
As an engineering leader, if you don't understand that other code base, that other technology stack, and you're tasked to go build a first-class application on it, it's just a very hard thing to do, both as an engineering leader and in terms of finding and recruiting the team to do it. So there's a lot of organizational momentum behind the existing Ethereum-based DeFi teams that's not easy to overcome. The Solana team, I think, had to realize late last year that those efforts to get people to port over were failing; I think the failure rate was 100 percent. They realized, okay, then we have to build everything new from the ground up with new teams, which feels like a risky strategy, and it obviously is riskier, but I don't actually think the amount of risk is that high on a relative basis. If you look today, you've got multiple teams building money-market protocols, things like Compound: Jet and Oxygen are the two I'm aware of, and there may be others. You've got a team building margin trading, Mango. You've got multiple teams working on perpetual contracts and quarterly futures, and multiple teams working on options. Those are all the most important core primitives, and you've got teams already working on every one of them. I think most of those market segments will be reasonably competitive; there will probably be two or three major players in each, which is healthy. You don't want a single protocol for each core primitive; it's healthy for the market to have two or three. And most of the teams I just named are venture-backable. I'm not saying we are necessarily investors in them, but they are all venture-backable teams.

Can you say something about Serum?

Yeah. Sam was running FTX, and it seems like DeFi clicked for them in May or so of last year. They said, aha, this is important, and they started to do stuff on Ethereum, and they just hit all the throughput constraints. They were like, we can't do this, we're not going to be able to build the product we want to build, so they started looking around. I remember I had a call with Sam and Anatoly, I want to say on July 7th or thereabouts last year. The call started at ten o'clock for me in Texas, eight o'clock for Anatoly, and 11 a.m. for Sam in Hong Kong, and a call that was set up for 30 minutes went for two and a half hours. We dove really deep into what Sam wanted to build. I remember we had a really existential debate about the nature of financial markets: information theory, the speed at which things propagate, literally the speed of light, but also, more importantly, what time scale matters for prices to update. Is it okay if price updates are measured not in nanoseconds but in milliseconds, and what inefficiencies does that create? Very existential questions about the nature of these things. We reasoned through all of it, and you could tell the wheels were turning in Sam's head, and he was like: yes, a 400-millisecond to one-second time scale is sufficiently low to make this thing work, and 15 seconds is too slow. It's unclear exactly where the threshold is between one second and 15 seconds, but somewhere in there is the threshold that makes this stuff work.
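A quick back-of-the-envelope calculation makes the time-scale argument concrete. The numbers below are illustrative assumptions, not figures from the call: an assumed volatility of 0.2% per square-root-minute, and the simplification that a resting quote can be stale for up to one block time.

```rust
// Back-of-the-envelope staleness comparison for on-chain order books.
// All numbers are assumptions chosen for illustration only.
fn expected_move_pct(block_time_secs: f64, vol_pct_per_sqrt_min: f64) -> f64 {
    // Diffusive scaling: typical price move over t seconds ~ vol * sqrt(t / 60).
    vol_pct_per_sqrt_min * (block_time_secs / 60.0).sqrt()
}

fn main() {
    let vol = 0.2; // assumed volatility, % per sqrt(minute)
    for &(label, block_time) in &[("~0.4 s blocks", 0.4_f64), ("~15 s blocks", 15.0)] {
        println!(
            "{label}: at most {:.2} quote updates/sec, price drifts ~{:.3}% within a block",
            1.0 / block_time,
            expected_move_pct(block_time, vol),
        );
    }
}
```

Under those assumed numbers, a resting quote on a 15-second chain can drift by roughly ten basis points before it can even be updated, wider than the spread a competitive market maker wants to quote, while at sub-second blocks the drift is an order of magnitude smaller. Where exactly that becomes tolerable is the threshold question Sam was working through.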
And he understood quickly that you need parallelism for this to work, because he was like, I'm going to have a bunch of Serum markets, and obviously you need those to transact in parallel, not serially. So we worked through all of that pretty quickly over the phone. I remember I went to bed, and the next day I woke up and Anatoly texted me: dude, someone's spamming the Solana network. And I said, I bet you it's the FTX engineers. And yeah, they had started spamming the network that night to test it, and Sam got underway building Serum from there.

Is Serum an application or a suite of applications?

Serum is a protocol. Serum actually does not have a front end today, at least not one officially endorsed by Sam and the Serum team. If you go to projectserum.com there's a list of third-party front ends, but Serum itself is just the protocol. I think the most interesting thing about Serum is that it is the opposite of FTX in so many ways. FTX is obviously a full-stack experience: the UI is glorious, the customer support, the fiat on-ramps, all of those things. FTX has been widely recognized for its product execution over the last two years, where they control the full stack, and it's been very interesting to watch the Serum team do the opposite. With Serum they said: here's a protocol that enables you to have order books and markets on chain, and the ability to cross the spread and complete a transaction. Today it only does that for spot assets, although you could theoretically use the order-book infrastructure for any asset, whether it's a leveraged product, a derivative, or something else. But the message is: here's the infrastructure, please go build other stuff around it, so build front ends, margin trading, perpetual contracts, quarterly futures, all these other things.

At first I was kind of confused watching them do this, because FTX is a very full-stack, highly controlled thing and Serum is not, and I had just assumed they were going to build Serum the same way; that was my default assumption. But if you look at the communications from the Serum Telegram, the Serum Medium, the Serum Twitter, you'll see this continuous, repeated focus on Serum as a development platform for third parties, where there is no official front end, and they're really focused on enabling other developers to build DeFi primitives. That's not particularly novel thinking in crypto DeFi land, obviously, but I think it's very interesting that you've got a single entrepreneur who is known for controlling the full-stack experience, on the one hand, also having the wherewithal to say: we're going to engage DeFi in a DeFi-native way and not try to control the whole thing. It's been super interesting to see that play out.
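For readers who have not worked with order books, here is a deliberately generic sketch of what crossing the spread means on a central limit order book. It is not Serum's on-chain data layout or instruction set, just a toy in-memory model of the matching step such a protocol performs.

```rust
// Toy central limit order book: price-sorted resting asks, and a taker buy
// that crosses the spread by matching against the best-priced asks first.
#[derive(Debug)]
struct Order {
    price: u64, // quote lots per base lot
    size: u64,  // base lots
}

#[derive(Debug)]
struct Fill {
    price: u64,
    size: u64,
}

// Match a taker buy of `size` lots with a limit of `limit_price` against
// resting asks sorted from best (lowest) to worst (highest) price.
fn cross_spread(asks: &mut Vec<Order>, mut size: u64, limit_price: u64) -> Vec<Fill> {
    let mut fills = Vec::new();
    while size > 0 && !asks.is_empty() && asks[0].price <= limit_price {
        let traded = size.min(asks[0].size);
        fills.push(Fill { price: asks[0].price, size: traded });
        asks[0].size -= traded;
        size -= traded;
        if asks[0].size == 0 {
            asks.remove(0); // this price level is exhausted
        }
    }
    fills
}

fn main() {
    let mut asks = vec![
        Order { price: 100, size: 3 },
        Order { price: 101, size: 5 },
        Order { price: 105, size: 2 },
    ];
    // Taker wants 6 lots and is willing to pay up to 102.
    let fills = cross_spread(&mut asks, 6, 102);
    println!("fills: {fills:?}");          // 3 @ 100, then 3 @ 101
    println!("remaining asks: {asks:?}");  // 2 left @ 101, 2 @ 105
}
```

A real on-chain implementation also has to handle lot sizes, fees, settlement, and storing the book inside program accounts, but the core step of walking the best-priced resting orders until the taker is filled or the limit price is reached is the same idea.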
Yeah, thank you, that is indeed very interesting. We kind of moved away from the original question; we barely talked about the synchronous versus asynchronous experience of DeFi, and instead talked about how much decentralization is enough, how to measure decentralization, and so on, and what you get in return. But that was also very interesting, so I would say: thanks for the discussion!

Hey, Hasu, Su, thank you for having me on. It's been a pleasure; I've been listening for a long time, and it's cool to dive into this stuff. I love the really deep first-principles approach, peeling back the onion one layer at a time. Thank you.

Thanks!
