Morning, everybody, David Shapiro here with a video. My video about AGI was super popular; I suppose I should have anticipated that, making a bold declaration like "AGI within 18 months." With that said, the ramp-up to AGI, then ASI (superintelligence), and then the Singularity seems to be accelerating, especially if you go by the comments on r/singularity and a few other places. People are asking: is this actually happening? Are we approaching the wall of exponential takeoff? So, assuming that we are ramping up to the Singularity within the coming months or years, let's explore how the Singularity might actually unfold.

First we need to define "Singularity." There are obviously a lot of ways to define it, and the simplest one I could come up with is this: the Singularity is when AI becomes orders of magnitude more intelligent than all humans combined. If current trends in AI research continue, that's where we're headed: GPT-4 is as intelligent as many people and more intelligent than some, GPT-5 is being trained, open-source versions keep appearing, and so on. Unless you've been living under a rock, you're aware of this rapid ramp-up, and it shows no signs of slowing down; if anything it's accelerating, because we're no longer measuring AI advancements monthly, we're measuring them weekly. Someone at a meetup recently pointed this out: "I think we're already in the Singularity, because we're measuring advancements week to week, and soon we might be measuring them day to day."

Okay, let's break down the macroeconomic changes we can expect from the Singularity. First we have to talk about what remains scarce, because it's easy to slip into magical thinking about AI changing everything, and some things don't change no matter how smart AI gets. First, desirable and arable land: some places will remain deserts (hence this AI-generated image of a woman in a desert; we'll get to that when we talk about fusion). Second, fresh, potable water: most of the water on the planet is salt water, though we might change that if we solve nuclear fusion and other energy sources. Then there are physical resources, minerals and other mined natural resources, which will probably remain scarce no matter how intelligent AI becomes. This will be a critical constraint, and it could really drive up the value of some of these resources. As I mentioned, if we solve nuclear fusion we could desalinate water, irrigate deserts, and so forth, but that could have unintended consequences: if you suddenly irrigate every desert on the planet, you may destroy something that forms a critical component of our ecosystem. Lastly, if we solve space flight, we could start harvesting asteroids and even other planets for rare minerals; there are trillions upon trillions of dollars' worth of them out in the solar system. So it's entirely possible that all of these scarcities get solved at some point post-Singularity, but at least in the short term, these resources remain scarce.

Now the flip side: from a macroeconomic perspective, what becomes abundant with the Singularity? Primarily knowledge, information, and cognitive labor, by which I mean thinking, knowledge work, and service-industry jobs. So what happens if AI becomes orders of magnitude smarter than humans and we remain in control of it? (This all assumes the good ending, where we don't get wiped out.) Basically, human cognitive effort becomes economically irrelevant. That sounds awful, but one thing I realized while working on this is that, on an individual basis, most people's cognitive effort is already redundant. On a planet of eight billion people, chances are someone has already solved the problem you're working on, whether you realize it or not. Doing science and solving problems is genuinely difficult, and it's mostly a matter of right place, right time. That said, there are still unsolved problems out there, and hyper-advanced AI will probably just amplify behavior we already see: the collective wisdom of humanity solves problems pretty quickly, and the velocity of that problem-solving will go up, maybe a few multiples, maybe an order of magnitude; I'm not sure yet. A hyperabundance of cognitive labor is probably not going to be as immediately, dramatically impactful as you might think. Look at Reddit, Twitter, and other platforms that already let you pose a problem, get answers, and move on quickly; instead of asking people collectively, you'll be asking the machines.

In fact, with some of the people I'm working with on autonomous AI projects, one of the key things we're figuring out is how to get AIs to talk to each other autonomously in a way that is safe and transparent. This is where natural language comes in: you don't want AIs using their own coded language to talk to each other, you want them using human-readable natural language. But that's a topic for another video.

Let's move from macroeconomic changes to technological breakthroughs. If we suddenly have a hyperabundance of cognitive effort, what kinds of technological problems can we imagine being solved? First, high-energy physics: the stuff they work on at CERN with the LHC. This includes nuclear fusion; it could include antimatter research, maybe even time travel or faster-than-light travel, who knows. But the first problem likely to be solved in high-energy physics is nuclear fusion. It's hard to anticipate what solving fusion will do, because fusion fuel is vastly more energy-dense than anything we use today, on the order of a million times more energy per kilogram than chemical combustion. With a hyperabundance of energy, a lot of other things suddenly become affordable: you can desalinate as much water as you need (modern seawater reverse osmosis takes only around 3 to 4 kWh per cubic meter, so the binding constraint is energy cost), you can run underground farms completely unbounded from arable land, and so on. The knock-on effects of solving fusion are impossible to enumerate; we can come up with a few short-term ideas, but in the long term it solves so many other problems. It solves recycling, for instance, because you can then afford to melt down any material no matter how energy-intensive the process is; you can reclaim all the lithium, cobalt, nickel, platinum, and gold from everything. Pretty much every mineral becomes accessible no matter how difficult it is to isolate, because if you have gigajoules of energy available at all times, practically for free, it doesn't matter how much energy it costs to recycle a material.

Another set of problems you can expect a hyperabundance of cognitive labor to attack: disease, genetics, and aging. The human body, with its genetics and metabolism, is one of the most complex systems in existence. There are more than a hundred thousand known metabolic pathways in the human body alone, and they all interact, not only with each other but with your genes, your epigenetics, your microflora: a supremely complex system.
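To get a feel for why that complexity swamps human-scale effort, here's a rough, hypothetical back-of-the-envelope sketch in Python. The only input from the discussion above is the 100,000-pathway figure; the pairwise-interaction framing and the "one pair per expert per day" rate are my own simplifying assumptions, not claims about real biology.

```python
# Back-of-envelope: how many interactions hide in a network of
# ~100,000 metabolic pathways? Even restricting attention to
# unordered *pairs*, the count is n*(n-1)/2, which grows
# quadratically with n.

def pairwise_interactions(n: int) -> int:
    """Number of unordered pairs among n pathways."""
    return n * (n - 1) // 2

n_pathways = 100_000
pairs = pairwise_interactions(n_pathways)
print(f"{pairs:,} possible pairwise interactions")  # ~5 billion

# At a (generous, assumed) rate of one pathway pair studied per
# expert per day, exhaustively checking them all takes millions
# of person-years:
person_years = pairs / 365
print(f"~{person_years:,.0f} person-years at one pair per day")
```

And this toy count still ignores the higher-order interactions with genes, epigenetics, and microflora mentioned above, which blow the search space up far further; that is exactly the brute-force territory where a hyperabundance of machine cognition changes what's tractable.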
But with a hyperabundance of intellect, you can create new tools, create new processes, and manage vast amounts of information, so we might end up curing all disease, halting aging, and untangling all of genetics within a relatively short period after achieving the Singularity, or AGI, or whatever you want to call it.

And finally, materials science. We're already seeing the beginning of this with AlphaFold. If you're not familiar with it, AlphaFold uses deep neural networks (Transformers) to model protein folding, which used to be an unsolved problem; now we can model the folding of essentially any protein. Take that to the next level: what if you could model not just protein folding but all protein interactions and all genetic interactions? One step further: model nanoparticles and carbon structures, and predict how to build very advanced materials, which could revolutionize, for instance, batteries and computing. I predict that the materials-science breakthroughs that come out of AI mean that in five to ten years your phone could be more powerful than all computers on Earth today. I'm not really exaggerating: the potential computational power latent in matter, say a membrane or a three-dimensional wafer the size of a phone, is basically inconceivable. So it would not surprise me if we move up a Kardashev scale or two post-Singularity. (For reference, on Sagan's interpolated version of the scale, K = (log10 P - 6) / 10 with P in watts, humanity's roughly 2x10^13 W of consumption today puts us around K = 0.7.)

That said, there are still some unsolved problems that probably no amount of intellectual labor on Earth can solve. Some people in the comments have asked about the hard problem of consciousness; that may or may not be solvable by machines at all. It might be something we humans have to figure out for ourselves, and the same goes for fundamental questions of existence and cosmology. Some of these things aren't a matter of mathematical proof or laboratory measurement; they're a matter of interpretation, of subjective values, like the meaning of life.

One thing people imagine when we talk about transhumanism or post-humanism is some sort of transcendence event. I personally don't think the Singularity will result in us all becoming like Q from Star Trek, beings of pure energy. I also don't think mind uploading is a good idea. I know a lot of people think it's great, but we don't understand why we are conscious, and I predict that if you try to upload your mind, you'll just upload a copy of yourself while your body dies; subjectively you will have died, even though a copy of you continues on forever. If that's the case, then we will forever be locked in our organic bodies. Even if there are digital copies of us frolicking out in cyberspace, they won't be us, and they'll have an entirely different set of constraints, because a digital copy no longer shares our biological limits. So we get a grand divergence: digital post-humans on one side, us organic meat bags on the other. That, to me, is an unsolved problem that I don't think AI is going to fix for us.
Moving on to social changes: jobs and occupations. As machines get more intelligent, the TL;DR is that most jobs become irrelevant. I've talked with people about this; there are a lot of BS jobs out there that nobody really wants to do, but you do them because you need to eat and pay for your housing. So what we're going to have to do is recalibrate how we think about meaning, purpose, and success, maybe with a greater emphasis on creativity, exploration, and self-improvement.

One idea that came from discussing this with ChatGPT: as a society, instead of focusing on conformance to one standard of education, we might focus on what makes each person unique, which struck me as a really interesting new model of education. Imagine going to school and, instead of everyone taking the same classes, working through a broad variety of projects and experiments designed to figure out, one, what you really care about, and two, what really makes you stand out, so that everyone ends up with a very different educational focus. My first year of school was at a Montessori school, and I can imagine taking that model to the next level.

I know there are a lot of people who say that without a job we have no meaning. That is your neoliberal programming talking. I, and other people who have transitioned to a different kind of occupation (mine is now YouTube and Patreon, which I find far more interesting and rewarding), are living much closer to lifestyles that have existed in the past. In ancient Greece, particularly Sparta, citizens were not allowed to hold a trade: their job was to be soldiers, hunters, and politicians, to participate in culture and society, not to be leatherworkers or anything else. Obviously Sparta ultimately didn't do so well, but ancient Athens did much better with a very similar model of a citizen leisure class, and ditto for ancient Rome. So humans have adapted to effectively post-scarcity worlds before, but instead of building them on the backs of subjugated classes of people, we would all enter a post-scarcity leisure class on the backs of AI. That's roughly what I predict will happen, because honestly most people want that anyway, and if we have the collective willpower to want it, who's to stop us? I can hear some of you already complaining that corporations are never going to allow that to happen; I'm going to get to that in just a second.
Glad you asked. Okay, so if nobody's job really matters, what do we do? One conversation I had at a meetup went: what if everyone just plays video games? There's actually a reason video games are so popular. They foster social connection (a lot of games are deeply social today), they're challenging (which gives you a sense of competence and mastery), and they give you far more autonomy: you can be anyone you want in a game world. Those three things satisfy the three pillars of self-determination theory: autonomy, human connection (relatedness), and competence. That's why so many people play video games. Now take self-determination theory, remove the need for a job, and suddenly AI gives us all far more autonomy and more opportunity for human connection; the only remaining pillar is challenge. And what happens to a lot of people who retire or step away from conventional work is that they realize they can challenge themselves in new ways. All of you who watch my channel know I don't actually need to do the coding experiments I do, but I find it deeply satisfying to challenge myself on open problems. I'm not saying everyone will pick problem-solving; some people will do martial arts or climb mountains or whatever. But we humans love challenges; we need to feel competent and to have a sense of mastery. In the Sam Altman interview, he pointed out that, yes, AI has conquered Go and chess and other games, but we still play chess; we just don't play against computers, because there's no point, no sense of mastery, against an opponent you can never beat.

The long-term effect is that we're probably going to see new social structures emerge, or maybe older social structures re-emerge. I particularly predict more multi-generational homes and more tribal or village-style lifestyles, because suddenly it's: here are a dozen people I really like, none of us has a job, so let's form an eco-village out in the countryside, or maybe an urban co-living situation in the city. Who knows; that's just some speculation.

Okay, I promised we would address the elephants in the room, so let's unpack the risks and factors that stand between us and this rosy post-Singularity picture I've outlined. The first is the development and control of AI. Many of you are probably aware of the open letter circulating, signed by a whole bunch of people including Elon Musk and Max Tegmark, calling for a moratorium on frontier AI development for at least six months while we take a breath and reassess. It is possible that if we continue at a breakneck pace and people do it wrong, we end up in some kind of dystopian or cataclysmic outcome. There are basically two primary failure modes here: one, we lose control of the AI and it decides to kill us all; two, we don't lose control, but the wrong people get the powerful AI and use it to kill or subjugate everyone else. Those two failure modes have been explored in a lot of fiction, and frankly I'm tired of it, so I won't dwell on them. The point is that 99% of people don't actually want an AI apocalypse. Some people seem to really wish for it, which I read as nihilism leaking through; some think it's inevitable, a kind of fatalism; and I empathize with people like that. I echo Sam Altman's sentiment: a lot of people are afraid, and I'm not going to tell them they're wrong or stupid or engaged in magical thinking. We are playing with fire. I just happen to be very sanguine about it, because I feel that, one, all the problems that exist are solvable, and two, they're solvable in the very near term.

Okay, another big risk: distribution of benefits. This is one of the most common pushbacks: do you honestly think corporations, or the rich and powerful, are going to allow everyone else to live a luxurious lifestyle like they do? Well, first, I don't know that they'll have much choice in it. Second, the masses, you and I, the proletariat, don't want to live in a cyberpunk hell, and we have seen repeatedly through history what happens as people get hungrier and more desperate. The most recent incident was the Arab Spring, in which much of the Middle East and the Arab world rose up, driven primarily by economic conditions; go back further and you have the French Revolution. This kind of thing has happened time and time again, so I'm not too worried: if push comes to shove, people will stand up and redistribute forcefully. Now, I'm not advocating civil war, and I don't even think it will come to that, because I follow Davos, the World Economic Forum, the UN, the IMF, the World Bank, and the halls of power really are paying attention to this; honestly, I think they're preparing for it. For instance, I suspect the stimulus checks America issued during the pandemic were a pilot program to demonstrate that direct redistribution works: that it is fast, efficient, and fair. They ran the stimulus checks alongside the Paycheck Protection Program (PPP) loans, effectively a side-by-side test showing that the PPP loans were expensive and rife with corruption, while the stimulus checks went directly to the people who needed them and got spent by those people. So I tend to think the stimulus checks were a prototype for UBI. And look at the landscape right now: there have been over 300,000 tech layoffs, and other kinds of workers are already being laid off or notified of layoffs due to technologies like ChatGPT. My fiancée, who's a writer and in a lot of writing Discords, tells me there are copywriters out there already losing work to AI.
So the AI layoffs are coming, and I think a lot of stimulus checks are coming too; the only question is whether those checks become permanent, and I think they will.

Next, the regulatory environment. This is where the letter that just came out is asking for regulation; Sam Altman has asked for regulation; Elon Musk has asked for regulation; all kinds of people are asking for more regulation. The big problems are, one, there's no agreement on how to regulate these things, and two, in the conversations I've had at meetups the question quickly becomes: how do you even enforce it? If the models keep getting faster and more efficient, and you can already run them on laptops, you can't put that genie back in the bottle. So does regulation even matter, and if it does, how? The other big concern at the federal and international level is existing power structures and the status quo: the wealthy and powerful are going to want to remain the wealthiest and most powerful on the planet. That's how it is and how it has always been. There have been reset events in history, the French Revolution, the American Revolution, and so on, but they're generally violent, and we want to avoid that; so, for that matter, do the powers that be. The biggest problem I keep hearing in these conversations, though, is that things are advancing so fast while the gerontocracy, rule by the elderly, generally doesn't get AI. They don't understand how much is changing, why, and what its impact is going to be, and honestly that could be one of the biggest risks. We younger people get it; we see it coming. Even some of the meetup attendees I talk to have children who are already acclimating to an AI world and who are going to trust the AI more than people, on the logic that politicians lie, whereas ChatGPT might get things wrong sometimes but it isn't going to lie to you the way a politician will. So we're in for some very interesting developments on the regulatory front.

Then there's public perception and adaptation. There's a lot of FUD (fear, uncertainty, and doubt), denialism, and doomerism, plus plenty of people saying this is all still decades away. It's not; it's months and years away, not decades. The denialism takes various forms: some people think AI is never going to be as smart as us, or never smarter than us, and frankly I think it's already smarter than those particular people; it just lacks autonomy. That's my opinion, and I know some of you disagree with it. But this is another big risk, because a lot of people are sticking their heads in the sand. Someone commented that in France, for example, people aren't even talking about it; all of this is happening so quickly that most people aren't even aware of it. ChatGPT made the news, of course, but by and large the world collectively shrugged without understanding how fast this is ramping up. So public perception, and acclimating to this, could also be a big barrier.

Next, global cooperation and collaboration. The big factor here is what I call trauma politics. Look at people like Putin and Xi Jinping, both of whom suffered a tremendous amount of trauma at the hands of their dystopian governments and who, in my reading, seek power as a form of self-soothing. When people carrying that much trauma come into power, they tend toward a nihilistic worldview, which results in things like genocide, mass incarceration, and surveillance states, because they want as much control and power as they can get, and it's never enough. That nihilism also creates a self-fulfilling prophecy: they project their pain onto the world, which causes more trauma (look at the war in Ukraine, or China's treatment of the Uyghurs), which creates a self-perpetuating loop of intergenerational trauma, and so on. In my opinion, this unaddressed intergenerational trauma and nihilism is the greatest threat to humanity, because these are the kinds of people who look at AI and say, that's the perfect weapon for control, the perfect weapon for subjugation, whereas healthy individuals look at AI and say, maybe we don't do that.

Now for some Singularity FAQs; a lot of gotcha questions come up, and I've tried to capture some of the best ones.

What will happen to money post-Singularity? Some people think cryptocurrency is the future, or that we do away with money altogether. Well, I've got good news and bad news. The good news: it is entirely possible that monetary systems and financial policies will change. However, the concept of currency is too useful and too helpful to disappear, because it is an abstract store of value and a really good medium of exchange. Whether Bitcoin or other cryptocurrencies replace fiat currency, I'm not going to say one way or the other, but currency is here to stay in some form. Personally, I think there are too many problems with cryptocurrency, chiefly that it is subject to manipulation: the wild swings in Bitcoin's value basically prove it is not a stable store of value. People have lost fortunes on it; people have made fortunes on it too, sometimes people with not the best intentions; organized crime loves cryptocurrency.

What will happen to the human population? This one is interesting, because there's a lot of debate over the actual carrying capacity of the planet. Some people say it's easily 50 billion. No: the carrying capacity of the planet is nowhere near 50 billion. There may technically be enough physical room for 50 billion humans, but when you look at the constraints of thermodynamics, hydrological cycles, and the amount of arable land, no. It is possible that the Singularity, with the nuclear fusion and other breakthroughs it might deliver, tips that number higher, especially if you can synthesize more arable land, grow food underground, or desalinate water at scale; that could boost the carrying capacity quite a bit, though 50 billion still seems way out there to me. But the biggest factor isn't going to be those energetic constraints even if we overcome them; it will still come down mostly to management, sustainable management of the population. If logistics broke down today, we would all starve pretty quickly, because our food isn't locally sourced; our food and water depend on very stable infrastructure, and that only gets harder with 50 billion people on the planet.
you know sustainable and responsible management of necessary resources primarily food and water are going to be the key to what happens with the human population now in some of the discussions that I've had there's a few confounding factors here one thing that isn't mentioned on this slide is what happens if we solve aging because what happens with populations is as they become more gender equal women choose to have fewer children and so what if people are living longer but having fewer children I kind of predict that the population is going to stabilize there's always going
to be some people who want children but at the same time right like if you if you don't if you don't actually really deeply want children you're probably not going to have them and then in a post-scarcity life like maybe you choose never to have children and again some people will choose to have children and even if you solve aging people will still die they're still going to be accidents right they're still going to be um maybe a few a handful of Unsolved medical issues but primarily you're going to see accidents and also one of
the conversations that came up was okay well if you can if you can hypothetically live forever do you want to and the I the many people suspect that you won't actually want to live forever you might choose to live for a few hundred years but then you might get tired of life and then you know quit taking the life extending medicine and allow yourself to die naturally who knows um but personally I kind of predict a a a a population stabilization food so food has been a big thing um so on top of you know
vertical farming or underground farming powered by nuclear fusion okay great we can eat whatever we want wherever we want I also suspect that biotechnology is going to really change our diet and what I mean by that is synthetic Foods engineered foods and even hyper personalized diets so for instance by and large you might believe that Dairy is bad for you because it's you know got you know anal you know saturated fat in it but when I started uh when I added more Dairy to my diet all my numbers got better because it's just in my
jeans it's in whatever and so but I had to figure that out through trial and error Dairy raises some people's cholesterol in my case it lowered it so the combination of engineered Foods better bioinformatics and biotech um and and things like mobile Farms oh there's actually I actually saw an ad for it the the first like portable Farms are actually the the container shipping container Farms are are coming so that only that only ramps up and gets better over time so that means you go to the grocery store and everything that you could possibly want
is there, and it's fresh and local. Some people are worried: they're going to take our steaks, they're going to take our burgers. I don't think so. I think you're actually going to have many more options, and healthier options, in a post-singularity world.

War. I did mention trauma, politics, and geopolitics earlier. Obviously the absolute biggest risk here is an AI arms race. Even nations, liberal democracies, that are not run by deeply traumatized tyrants are still going to be engaged in some kind of AI arms race, which is an unfortunate reality. I'm not saying that's a good thing, and I'm not passing moral judgment on it; it's just an observation: every time there's new technology, it gets integrated into the military apparatus. I also don't think we're going to end up with a one-world government, at least not anytime soon, and there are numerous reasons for this, not the least of which are language barriers, cultural differences, and past grievances between cultures. It could take many generations to heal those intercultural wounds before people even want to collaborate. Look at the animosity between China and Japan, between Israel and Palestine, between Iran and a bunch of other nations, and so on. It takes a lot of work to heal those wounds, there's a lot of resistance to healing them, and they could continue to fester. What I'm hoping is that AI actually helps us break the cycle of intergenerational trauma, so that within maybe two or three generations we're
ready for a more peaceful global community. And again, I still don't think a global government is going to happen, because geographically speaking it makes sense to keep the nation-states plus the union model, which is what makes the most sense right now: France is still France and Germany is still Germany, but they're part of the European Union. Over time I do suspect those continental-sized unions will get stronger, but not that they'll replace local governments, just like we have municipal city governments, county governments, state governments, and federal governments. I think we're just going to add a few tiers on top of that, and eventually we will end up with global governance, but again, I think that's probably at least two or three generations away, minimum.

And then finally, corporations. I did promise that I would address this. Some people, and this includes myself, hope that corporations as we know them go away, because corporations are intrinsically amoral, and I
don't mean immoral, amoral. A corporation's morality is beholden only to the investors, to the shareholders, and the shareholders just want more value, whatever it costs. Corporations will always explore every little nook and cranny of what they can legally get away with, and that often results in bad things such as mistreatment of people, environmental abuse, and so on. So because corporations are intrinsically amoral, I hope that they go away, but I don't think that they will. I tried to figure out how the singularity could result in that, but the more I explored it, the more I realized: no, basically what's going to happen is that AI is going to allow corporations to produce more with less, so productivity will continue to go up while headcount goes down. I talked about this in my AI job apocalypse video a couple of months ago. Basically, you're going to see corporations replace as many of their workers as they can, and then the ownership class, whether that's shareholders, CEOs, whoever, is going to see basically unmitigated stock price growth, because suddenly the greatest constraint and the most expensive aspect of running a corporation, human labor, is no longer a factor. So I think we're at risk of seeing the mega-corps of dystopian sci-fi: multi-trillion-dollar, even quadrillion-dollar companies that have almost no employees and are run entirely by shareholders and AI. So that is an interesting thing. Now, as to whether
or not they will allow the rest of us to live in certain ways, I kind of think they won't care, because as obscenely wealthy as corporations are going to be, it just doesn't make sense for them to expend any energy depriving everyone else. Let's imagine that Elon Musk takes SpaceX and uses it to start harvesting asteroids, and SpaceX becomes a twenty-trillion-dollar company by harvesting iridium, cobalt, and platinum from asteroids. Great. Is Elon Musk going to personally say, actually, I think everyone else should live in slums and favelas around the world? No, he's not going to care; he doesn't give a crap how everyone else lives as long as he's a trillionaire. So when I think it through that way, it would take a lot of deliberate effort on behalf of corporations to deprive the rest of us of a better life, so I don't think that's going to happen. Certainly it's something to be aware of, because again, corporations are intrinsically amoral, which is one of the biggest risks to our standard of living in the future.

Okay, that's that. Thanks for watching. I hope you found this video enlightening and thought-provoking. I know there will probably be some disagreements in the comments; keep it civil or you get banned. Thanks, bye.