Satya Nadella | BG2 w/ Bill Gurley & Brad Gerstner

Bg2 Pod
Open Source bi-weekly convo w/ Bill Gurley and Brad Gerstner on all things tech, markets, investing ...
Video Transcript:
I think the company of this generation has already been created, which is OpenAI; in some sense it's kind of like the Google or the Microsoft or The Meta of this era. Well, it's great to be with you. You know, when Bill and I were talking, Satya, and looking back at your tenure as CEO, it was really quite astonishing. You know, you started at Microsoft in 1992, for those who may not know; you took over Online in 2007, you launched Bing search in 2009, you took over servers and
launched Azure in 2011, and you became CEO in 2014, and it was just before that that a pretty now well-known essay entitled "The Irrelevance of Microsoft" had just been published. Now since then you've taken Azure from 1 billion to 66 billion run rate, the total revenues of the business are up two and a half x, the total earnings are up over 3x, and the share price is up almost 10x; you've added almost $3 trillion of value to Microsoft shareholders. And as you reflect back on that over the course of the last decade, what's the single
greatest change that you thought you could do then to unlock the value, to change the course of Microsoft, which has been just an extraordinary success? Yeah, so the way I've always thought, Brad, about sort of that entire period of time is, in some sense, from '92 to now it's just one continuous period for me, although obviously 2014 was a big event with the accountability that goes with it. But what I felt was essentially: pattern match when we were successful and when we were not, and do more of the former and
less of the latter. I mean, in some sense it's as simple as that, because I've sort of lived through it. When I joined in '92, that was just after Windows 3.1 was launched; I think Windows 3.1 was, you know, May of '92 and I joined in November of '92. In fact, I was working at Sun and I was thinking of going to business school, and I got an offer at Microsoft and I said, ah, maybe I'll go to business school, and then somehow or the other the boss who was hiring me
convinced me to just come to Microsoft, and it was like the best decision, because the thing that convinced me was the PDC of '91 at Moscone Center, when I went and saw basically Windows NT (it was not called Windows NT at that time) and x86, and I said, God, what's happening in the client will happen on the server, and this is a platform company and a partner company and they're going to ride the wave. And so that was sort of the calculus then. Then of course the
web happened; we managed that transition, we got a lot of things right. Like, for example, we recognized the browser, we competed, and got that browser thing eventually right. We missed search, right: we sort of felt like, wow, the big thing there was the browser, because it felt more like an operating system, but we didn't understand the new category, which is the organizing layer of the internet happened to be search. Then we kind of were there in mobile, but we really didn't get it right; obviously the iPhone
happened. And we got the cloud right. So if I look at it, and now here we are on the fourth one, on AI. In all of those cases, I think, doing things not because somebody else got it and we just need to do the same; sometimes it's okay to fast follow and it worked out, but you shouldn't do things out of envy. That was one of the hardest lessons I think we've learned. Do it because you have permission to do this and you can do it
better; like, both of those matter to me, the brand permission. You know, Geoffrey Moore once said this to me, which is, hey, why don't you go do things which your customers expect you to do. I love that, right, and cloud was one such thing. You know, in fact, I remember when I first showed up in Azure, people would tell me, oh, it's a winner-take-all, it's all over, and Amazon's won it all. I never believed it, because after all I'd competed against Oracle and IBM in
the servers, and I always felt like, look, it's just never going to be winner-take-all when it comes to infrastructure, and all you need to do is just get into the game with a value proposition. So in some sense a lot of these transitions for me have been about making sure you kind of recognize your structural position, you really get a good understanding of where you have permission from those partners and customers who want you to win, and go do those obvious things first. And I think
you know, you could call it, hey, that's the basics of strategy, but that's sort of what I feel, I think, at least has been key. And you know, there are things to be cultivated, to your point, Brad, which is, you know, there's the sense of purpose and mission, the culture that you need to have; all those are, I would say, the necessary conditions to even have a real chance for shots on goal. But I would just say getting that strategy right by recognizing your structural position and
permission is probably what I have, you know, hopefully done a reasonable job of. So before we move on to AI, I have a couple questions about the transition, and just echoing what Brad said, I think that's definitive that you may be the best CEO hire of all time; I mean, three trillion is unmatched. So one, I read an article that suggested, and maybe this isn't true so you tell us, that you wrote a 10-page memo to the committee that was choosing the CEO. Is that true, and what
was in the memo? Yeah, it is true, because I think our CEO process was pretty open out there, and at that time, quite frankly, it was definitely not obvious to me in the beginning. Remember, I never thought that, first, Bill would leave and then, second, Steve would leave, right? It's not like you join Microsoft and think, oh yeah, you know, founders are going to retire and there's going to be a job opening and you can apply for it. I mean, that was not the
mental model growing up at Microsoft. So when Steve decided to retire, I forget now, I think in August of 2013, it was a pretty big shock, and at that time I was running our server and tools business, as it was called, in which Azure was housed and so on, and I was having a lot of fun, and I didn't even put up my hand first saying, oh, I want to be CEO, because it was not even like a thing that I was thinking would happen. And then eventually the board came
around and asked, and there were a lot of other candidates at that time, even internally at Microsoft, and so at some point in that process they asked us to write. And quite frankly it's fascinating, that memo, everything I said in it, right; in fact, you know, one of the terms I used in that memo, which I subsequently used even in the first piece of email I sent out to the company, was ambient intelligence and ubiquitous computing, and I dumbed it down to mobile-first, cloud-first later, because you
know, my PR folks came and said, what the heck is this, nobody will understand what ambient computing is, you know, ambient intelligence and ubiquitous computing. But that was the mobile-first, cloud-first: how do you really go where the secular shift is, then understanding our structural position, thinking about Microsoft Cloud, what are the assets we have, why is M365 there. In fact, one of the things I've always resisted is thinking of our cloud the way the market segments it, right; the market segments it as, oh, here is IaaS, even Brad, the way
he described it to me. I don't allocate my capital thinking here is the Azure capital, here is the M365 capital, here is, you know, gaming. I kind of think of, hey, there's a cloud infra, that's the core theory of the firm for me, and on top of it I have a set of workloads; one of those workloads happens to be Azure, the other one is M365, Dynamics, gaming, what have you. And so in some sense that was all in that memo, and pretty much has played out. And one of
the assumptions at that time was that, you know, we had a 98%, 99% gross margin business in our servers and clients, and people said, oh, you know, good news, you now can move to the cloud and maybe you'll have some margin, and so that was the transition. Yeah, and my gut was it is going to be less GM but the TAM is bigger; you know, we'll sell more to small businesses, we will sell more in aggregate in terms of even upsell, like the consumption would increase, right, because, you know
we had sold a bit of Exchange, but if you think about it, Exchange, SharePoint, Teams, now everything expanded. So that was the basic arc that I had in that memo. Was there any element of cultural shift? I mean, the number of CEO hires made in the world all the time, and many of them fail; I mean, Intel's going through a second reboot here as we speak, and as Brad pointed out, there were people arguing, oh, Microsoft's the next IBM or DEC, that its better days
are over. So what did you do, and what would you advise new CEOs that come on to kind of reboot the culture and get it moving in a different direction? Yeah, one of the advantages I think I had was I was a consummate insider, right, having grown up pretty much all my professional career at Microsoft, and so in some sense if I would even criticize our culture it was criticizing myself. So in a sense the break I got was it never felt like somebody from the outside coming and
criticizing, you know, the folks who are here; it's about mostly pointing the finger right back at me, because I was pretty much part of the culture, right; I couldn't say anything that I was not part of. And so I felt like, to your point, Bill, I distinctly remember, I think the first time Microsoft became the largest market cap company, I remember walking around the campus, all of us including me, we were all strutting around as if we were like, you know, the best thing to humankind, right, and it
is all our brilliance that's finally reflected in the market cap. And somehow it stuck with me that, God, that is the culture that you want to avoid, right, because, as I always say, from ancient Greece to modern Silicon Valley, there's only one thing that brings civilizations, countries, and companies down, which is hubris. And so one of the greatest breaks is my wife had introduced me to a book by Carol Dweck, you know, a few years before I became CEO, which I read on growth mindset, more in the context of
my children's education and parenting and what have you, and I said, God, this thing is like the best; you know, all of us are always talking about learning and learning cultures and so on, and this was the best cultural meme we could have ever picked. So I attribute a lot of our success culturally to that meme, because the nice thing about that, Bill, was it is not trademarked Microsoft, it's not some new dogma from a CEO; it's a thing that speaks to work and life:
you can be a better parent, a better partner, a better friend, a neighbor, and a manager and a leader. So we picked that, and the pithy way I've always characterized it is, hey, go from being the know-it-alls to the learn-it-alls, and it's a destination you never reach, because the day you say I have a growth mindset means you don't have a growth mindset, by definition. And so it has been very, very helpful for us, and, you know, it's like all cultural change: you got to give it time, oxygen, breathing
space, and it's both top down and bottom up, and it middles out, right, which is there's not a single meeting that I do with the company, or even my executive staff or whatever, where I don't start with mission, culture; those are the two bookends. And the other thing is I've been super disciplined on my framework, to your point about that memo: pretty much for the last now close to 11 years the arc is the same, mission, culture, the worldview, right, that ambient intelligence, ubiquitous computing, and then
the specific set of products and strategies. That frame, I pick and choose every word, I'm very, very deliberate about it, I repeat it until I'm bored stiff, but I just stay on it. Well, speaking of that, you mentioned the phase shifts that we've been through, and I've heard you say that as a large platform company most of the value capture is determined in that first three or four years of the phase shift, when the market position is established. Satya, you know, I've heard you say, basically, Microsoft was
coming off of having missed search, having largely missed mobile, and I've heard you say caught the last train out of town on cloud, right. So as you started thinking about the next big phase shift, it appears that you and others in the team, Kevin Scott, sniffed out pretty early that Google was likely ahead in AI with DeepMind. You make the decision to invest in OpenAI. What convinced you of this direction versus the internal AI research efforts that you had underway? Yeah, it's a great point, because there are a couple of
things there, right. One is we were at it on AI for a long, long time; obviously, you know, when Bill started MSR in 1995 I think, you know, the first group, I mean he was always into this natural user interface, I think the first group was speech. You know, Rick Rashid came, and in fact Kai-Fu worked here, and we had a lot of, I'd say, focus on trying to crack natural user interface; language was always something that we cared about, right. In fact even Hinton worked,
like some of the early work in DNNs happened when he was in residency at MSR, and then Google hired him. So we missed, I would say, even in the early 2010s some of what could have been doubling down at around the same time that Google doubled down and bought even DeepMind, right. And so that actually bothered me quite a bit, but I always wanted to focus; like, for example, Skype Translate was one of the first things I focused on, because that was pretty cool, like that was the first time
you could see transfer learning work, right, which is you could train it on one language pair and it got better on another language, right. That was the first place where we could say, wow, machine translation with DNNs is different. And so ever since I've been obsessed with language, along with Kevin. In fact, the first time, yeah, actually Elon and Sam, they were looking for obviously Azure credits and what have you, and we gave them some credits, and that time they were more into RL and Dota 2 and what have
you, and that was interesting. And then we stopped, I forget even exactly what happened, and then they, I think, went to GCP, and then they came back to talk about sort of what they wanted to do with language. That was the moment, right, where they talked about Transformers and natural language, because I always felt like, look, that's to me our core business, and it goes back a little bit to how I think, which is what's our structural position. I knew always that if there was a way to have
a nonlinear breakthrough in terms of some model architecture that sort of exhibited, you know, like one of the things that Bill, you know, you'd always say throughout our career here was there's only one category in digital, it's called information management. The way he thought about it was you schematize the world, right: take people, places, things, you know, just build a schema. We went down many ways; you know, there was this very infamous project called WinFS at Microsoft which was all about schematize everything and then, you know, you'll make sense of
all information, and it was just impossible to do. And so therefore you needed some breakthrough, and we said maybe the way to do that is how we schematize: after all, the human brain does it through language and inner monologue and reasoning. So anyway, that's what led me to OpenAI, and quite frankly the ambition that Sam and Greg and team had, and that was the other thing, right, scaling laws. In fact, I think the first memo, weirdly enough, I read on scaling was written by Dario
when he was at OpenAI, and Ilya, and that's sort of what, like I said, let's take a bet on this, right, which is, hey, wow, if this is going to have exponential performance, why not go all in and give it a real shot. And then of course once we started seeing it work on GitHub Copilot and so on, then it was pretty easy to double down. But that was the intuition. One of the things that has happened, I think, in previous phase shifts is some of the incumbents don't
get on board fast enough; you even talked about Microsoft perhaps missing mobile or search or that kind of thing. I could argue, especially since I'm old and I've seen these shifts, that everyone's awake on this one, or it's the most awake, like it's heavily choreographed; everyone's maybe at the starting line at almost the same time. I'm curious if you agree with that and how you think about the key players in the race: you know, Google, Amazon, Meta with Llama, Elon has entered the game. Yeah, it's an interesting one.
To your point, I always think about it, right: if you sort of take the late 90s, there was Microsoft and there was daylight, and then there was the rest. Interestingly enough, now, you know, people talk about the Mag 7; there is probably more than that, even to your point about everybody's awake to it, they all have amazing balance sheets. There, even, I think, you know, if you think about OpenAI, in some sense you could say it's Mag 8, because I think the company of this generation
has already been created, which is OpenAI; in some sense it's kind of like the Google or the Microsoft or The Meta of this era. And so there are a couple of things. So therefore I think it's going to be very competitive. I also think that I don't think it's going to be winner-take-all, right; well, there may be some categories that may be winner-take-all, but, for example, on the hyperscaler side, absolutely not, right. I mean, the world will
demand, you know, even ex-China, multiple providers of frontier models distributed all over the world. In fact, one of the best structural positions that I think Microsoft has is, you know, because if you remember, the Azure structure is slightly different, right: we built out Azure for enterprise workloads with lots of data residency; we have 60-plus regions, more regions than others. So it was not like we built our cloud for one big app; we built our cloud for a lot of heterogeneous enterprise workloads, which I think in
the long run is where all the inference demand will be, with nexus to data and the app server and what have you. So I think there are going to be multiple winners at the infrastructure layer. There is going to be, in the models, even there, the models and the app servers: each hyperscaler will have a bunch of models, and there will be an app server around them; like every app today, even including Copilot, is just a multi-model app. And so there's in fact a complete new app server; like there
was a mobile app server, there was a web app server, and guess what, there's an AI app server now, and for us that's Foundry, and we're building one and others will build; there'll be multiple of those. Then in apps, I think, I would say network effects are always going to be at the software layer, right; so at the app layer there'll be different network effects in consumer, in the enterprise, and what have you. And so to your fundamental point, I think you have to analyze
it structurally by layer, and there is going to be fierce competition between the seven, eight, nine, ten of us at different layers of the stack. And as I always say to our team, watch for the one who comes and, you know, adds to it, right; that's the game you're all in, where you're always looking at who's the new entrepreneur who will come out of the blue, and at least I would say OpenAI is one such company which at this point has escape velocity. Yeah, which, you know, if we think about
you know, the app layer for a second, start with consumer AI a little bit here, Satya. You know, Bing's a very large business; you and I've discussed 10 blue links was maybe the best business model in the history of capitalism, but it's massively threatened by a new modality where consumers just want answers, right. For example, my kids, they're like, why would I go to a search engine when I can just get answers. So do you think, first, can Google and Bing continue to grow the legacy search businesses in the age of answers, and
then what does Bing need to do, or your consumer efforts under Mustafa need to do, in order to, you know, compete with ChatGPT, which really looks like, you know, it's broken out from a consumer perspective? Yeah, I mean, I think the first thing is what you said last, which is chat meets answers, and that's ChatGPT, both the brand and the product, and it's becoming stateful, right. I mean, like, GPT now is not just, you know; in fact search was stateless, for, you know, there
was search history, but I think more so these agents will be a lot more stateful. So in fact that's why I was so thrilled; like, I've been trying to get an Apple search deal for like 10 years, and so when Tim finally did a deal with Sam I was like the most thrilled person, which is, it's better to have ChatGPT get that deal than anybody else, because we, you know, have that commercial and investor relationship with OpenAI. So to that point, the way I look at it
and say is, at the same time, distribution matters, right. I mean, this is where Google has an enormous advantage, right: they have the distribution on Apple, they're the default, they are obviously the default on Android. So therefore, and the habits don't go away, right; I mean, the number of times you just go to the browser URL and just type in your query. I mean, even now, even though I want to go to Copilot, I mean my usage is mostly Copilot, and like if I have to think
about Bing versus Copilot it's kind of interesting, right: some of the navigational stuff I go to Bing, pretty much everything else I go to Copilot, right. That shift I think is what's happening universally, and we are maybe one or two of these agents for shopping or travel away from even some of the commercial query; that's the time when the dam breaks, I think, on traditional search, when some of the commercial intent also migrates into the chat. Right now mostly the business has withstood because the commercial intent has not migrated, but once commercial
intent migrates, that's when it suddenly moves. And so I think, yes, this is a secular shift. The way we are managing it is we have three properties in Mustafa's world, right: there is Bing, MSN, and Copilot. In fact he's got a crisp vision of what these three things are: they're all sort of one ecosystem, one is a feed, one is search in the traditional way, and then the other is this new agent interface, and they all have a social contract with content providers; we need to drive
traffic, we need to have paywalls maybe, we need to have ads-supported models, all of those, and so that's what we're trying to manage. We have our own distribution; the one advantage we do still have is Windows. We get to relitigate: we lost the browser, right, even Chrome became the dominant browser, which is a real travesty because we had won against Netscape only to lose to Google, and we are getting it back now in an interesting way, both with Edge and with Copilot. Guess what, now even Gemini has to earn it. Like, the good
news about Windows, at least, is it's the open system, right: ChatGPT has a shot, Gemini has a shot, you don't have to call Microsoft, you can go do your best work and go over the top. But that also means we also get to, having lost it, it is great sometimes because you can win it all back. And so to me, even Windows distribution, I mean, I always say Google makes more money on Windows than all of Microsoft, literally, and I say, wow, this is the best news for
Microsoft shareholders that we lost so badly that we can now go contest it and win back some share. Hey, Satya, one thing: everybody's talking about these agents, and if you just kind of think forward in your mind a bit, you can imagine all kinds of players wanting to take action on other apps and other data that may be on a system, and, you know, Microsoft's in an interesting position because you control the Windows ecosystem but you have apps on, like, the iPhone ecosystem or the Android ecosystem. And how do you think about, and this is,
you know, partially a terms-of-service question, partially a partnership question: will Apple allow Microsoft to control other apps on iOS, will Microsoft let ChatGPT instantiate apps and take data from apps on Windows OS? I mean, you get the question; it goes all the way to, when you start thinking about search and commerce, like, you know, will Booking.com let, you know, Gemini run transactions on it without their permission or knowledge? Yeah, I think that this is the most interesting question, right. I mean, to some degree it's unclear exactly how this
will happen. There is a slight, very old-school way of thinking about some of this, which is, if you remember, you know, how business applications of various kinds managed to do interop, right: they managed that interop using connectors, and people had connector licenses, so there was a business model that emerged, right. I mean, SAP was one of the most classic ones, where, you know, you could say, hey, you can access SAP data as long as you had connectors. So there's a part of me which says something like that will
emerge as agent-to-agent interfaces do. It's unclear exactly what happens in consumer, because in consumer the value exchange was a lot of, you know, advertising and traffic and what have you; some of those things go away in an agentic world, so I think the business model is slightly unclear to me on the consumer side. But on the enterprise side, I think what will happen is everybody will say, hey, in order for you to either action into my action space or to get data out of my sort of schema, so to
speak, there is some kind of an interface to my agent that is licensed, so to speak. And I think that's a reason, like today, for example, when I go to Copilot at Microsoft I have connectors into Adobe, into my instance of SAP, obviously our instance of CRM, which is Dynamics. So it's fascinating; in fact, you know, when was the last time any of us really went to a business application, right? We license all these SaaS applications, we hardly use them, and somebody in the org is sort
of inputting data into it. But in the AI age the intensity goes up, because all that data now is easy, right, it's a query away: I can literally say, hey, I'm meeting with Bill, tell me about all the companies that Benchmark invested in, and it's both taking the web and anything that's in my CRM database, collating it all together, giving me a note, what have you. So to some degree all that I think can be monetized by us and by even these connectors. But more explicitly, like the thing that could happen really quickly, because there's been talk
about it, like, would you allow ChatGPT on the Windows OS to just start opening random apps and taking actions? Now, that's an interesting one, right. So that over-the-top computer use, who is going to permit that, right? Which is, is it the user or is it the operating system? Like, on Windows there is, quite frankly, not anything I can do to prevent that other than some security guardrails, right, because I think one of my big fears is the security
risk, right: if some malware got downloaded and that malware started sort of actioning stuff, that's when it's really dangerous. So I think those are the ones that we will build into the OS itself, right, which is some elevated access and privilege under which this computer-use stuff happens, but at the end of the day the user will be in control on an open platform like Windows. And I'm sure Apple and, you know, Google will have a lot more control, so they won't allow it, and so that's, in some sense, you could
say, an advantage they have, or, you know, depending on how it all rules out on all of those; ultimately it'll be an interesting thing to watch. You flip that around, and then we can move on, but, like, would you allow the Android OS, or let's just call it the Android AI or the iOS AI, to read email, you know, through a Microsoft client on that smartphone? Yeah, I mean, you know, for example, today, one of the things I always think about is I don't
know whether that was value leaking or did it actually help us, right, which is we licensed the sync for Outlook to Apple, for Apple Mail. It was kind of an interesting case, and I think that there was a lot of value leaked, perhaps, but at the same time I think that was one of the reasons why we were able to hold on to Exchange, right; it would have been doubly problematic, understood, if we had not done that. And so one of the things I think is
going, to your point, Bill, is we're building out, the reason we're going to do this is we have to have a trust system around Microsoft 365. We just cannot sort of say, hey, any agent comes in and does anything, because after all, first, it's not our data, it's our customers' data, right, and so therefore the customer will have to permit it, the IT folks at the customer will have to permit it; it's not like some blanket flag I can set. And then the second thing is it has to have a trust
boundary. So I think what we will do is, it's kind of an interesting way, it's kind of like what Apple Intelligence is doing; think of it as we will do that around M365. You go through a lot today; I'd highly recommend people download it, it's super interesting. Yeah, so Satya, you know, clicking on this, Mustafa has said that 2025 will be the year of infinite memory, and Bill and I have talked a lot, dating back to the start of this year, that we think the next 10x function, you know,
it sounds like you agree, on GPT is really, you know, this persistent memory combined with being able to take some actions on our behalf. So we're already seeing the starts of memory, and I'm pretty convinced as well that in 2025, you know, it seems like that one's pretty well solved. But this question of actions: when am I going to be able to say to ChatGPT, book me the Four Seasons in Seattle next Tuesday at the lowest price, right? And Bill and I, you know, have gone back and forth on this one,
and it would seem that computer use is the early, you know, test case for that, but do you have any sense, does that seem like a hard one from here to you? Yeah, I mean, the most open-ended action space is still hard, but to your point there are two things or maybe three things that are really exciting beyond, I'll just say, I'm sure we'll talk about it, the scaling laws itself and capabilities of the raw models. One is memory, the other is tool use or actions, and the
other one I would say is even entitlements, right, which is, you know, what can you access; like, you know, one of the most interesting products we have even is Purview inside of Microsoft, because increasingly, what do you have permissions to, what can you get, you know, you have to be able to access things in a safe way, somebody needs to have governance on it, and what have you. So if you put all those three things together, this agent is going to then be more governable, and when it comes to actions it is verifiable,
and then it has memory; then I think you're off to a very different place for doing more autonomous work, so to speak. I still think, one of the things I always think is, Bill, I like this Copilot as the UI for AI, because even in a fully autonomous world, from time to time you'll raise exceptions, you'll ask for permission, you'll ask for invocation, what have you, and so therefore this UI layer will be the organizing layer. In fact that's kind of why we think of Copilot as the organizing layer for
work, work artifacts, and workflow. But to your fundamental point, I don't think the models, like, take even 4o, right, not even going to o1: 4o is pretty good with function calling, so you can do significantly more in the enterprise setting, more so than consumer, because consumer web function calling is just hard. At least in an open-ended web you can do it for a couple of websites, but once you say, hey, let's go book me a ticket on anything, and if there are schema changes on the back
end and so on to trip over, you can teach it that; that's where I think o1 can get better, if it's a verifiable, auto-gradable sort of process on rails. But I think we're maybe a year to two years away from doing more and more. But I think at least from an enterprise perspective, going and doing here's my sales agent, here's my marketing agent, here's my supply chain agent, which can do more of these autonomous tasks: we built 10 or 15 of them into Dynamics.
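As a side note for readers, here is a minimal sketch of the function-calling pattern he is describing, written against the OpenAI Python SDK's chat-completions tools interface; the supplier-lookup tool, its schema, and the backend stub are invented for illustration and are not Microsoft's or OpenAI's actual enterprise wiring.

```python
# Illustrative sketch of enterprise-style function calling (assumes the OpenAI Python SDK >= 1.x).
# The tool name, schema, and backend lookup are hypothetical; they are not a real Microsoft API.
import json
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def lookup_supplier_orders(supplier_id: str) -> dict:
    """Hypothetical backend call, e.g. into an ERP system."""
    return {"supplier_id": supplier_id, "open_orders": 3, "late_shipments": 1}

tools = [{
    "type": "function",
    "function": {
        "name": "lookup_supplier_orders",
        "description": "Fetch open orders and late shipments for a supplier.",
        "parameters": {
            "type": "object",
            "properties": {"supplier_id": {"type": "string"}},
            "required": ["supplier_id"],
        },
    },
}]

messages = [{"role": "user", "content": "How is supplier ACME-42 doing on deliveries?"}]
resp = client.chat.completions.create(model="gpt-4o", messages=messages, tools=tools)

# If the model chose to call the tool, execute it and feed the result back for a final answer.
msg = resp.choices[0].message
if msg.tool_calls:
    call = msg.tool_calls[0]
    result = lookup_supplier_orders(**json.loads(call.function.arguments))
    messages += [msg, {"role": "tool", "tool_call_id": call.id, "content": json.dumps(result)}]
    final = client.chat.completions.create(model="gpt-4o", messages=messages, tools=tools)
    print(final.choices[0].message.content)
```

The point of the pattern, as he notes, is that the action space is constrained to declared, verifiable tools, which is why it works better in enterprise settings than on the open-ended web.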
Looking into my supplier communications, for example, and automatically handling my supplier communications, updating my databases, changing my inventories, my supply: those are the kinds of things that you can do today, I would say. Mustafa made this comment about near-infinite memory, and I'm sure you heard it or hear it internally; is there any clarification you can offer about that, or is that more to come? I think, I mean, at some level the idea that you have essentially a type system for your memory, right, that's the thing, right, which is it's not
like every time I start... I get the idea, I get the idea; he made it sound like you guys had an internal technical breakthrough on this front. Yeah, I mean, we have, like, there's an open source project even, I think it's, I forget, like it's the same set of folks who did all the TypeScript stuff who are working on this. So what we're trying to do is essentially take memory and schematize it and sort of make it available such that you can go, like, each time I start,
let's just imagine I'm on some new prod, I know how to cluster based on everything else I have done, and then that type matching and so on I think is a good way for us to build up a memory system. So shifting maybe to enterprise AI, Satya: you know, the Microsoft AI business has already been reported to be about $10 billion; you've said that it's all inference and that you're not actually renting raw GPUs to others to train on because your inference demand is so high. So as we think about this, there's a lot
of, I think, skepticism out there in the world as to whether or not major workloads are moving, you know, and so if you think about the key revenue products that people are using today, and how it's driving that inference revenue for you today, and how that may be similar or different from Amazon or Google, I'd be interested in that. Yeah, I think that's a good one. The way for us this thing has played out is, you've got to remember most of our training stuff with OpenAI is sort of more investment
logic, right, so it's sort of not in our quarterly results, it's more in the other income, right, based on our investment. So that means the only thing that shows up is maybe other income or loss. That is right, that is right. Right now that's how it shows up. And so most of the revenue, or all of the revenue, is pretty much our API business, or in fact, to your point, ChatGPT's inference costs are there, right, so that's a different piece. And so the fact
is the big hit apps of this era are, what, ChatGPT, Copilot, GitHub Copilot, and the APIs of OpenAI and Azure OpenAI, right. So in some sense, if you had to list out the 10 most hit apps, you know, these would probably be four or five of them, and so therefore that's the biggest driver. The advantage we have had, and OpenAI has had, is we've had two years of runway, right, pretty much uncontested. To your point, Bill made the point about, hey, everybody's awake,
but, and it might be, I don't think there will ever again be maybe a two-year lead like this; who knows, you know, you say that and somebody else, you know, drops something that suddenly blows the world away, but that said, I think it's unlikely that that type of, you know, lead could be established with some foundation model. But we had that advantage; that was the great advantage we've had with OpenAI. OpenAI was able to really build out this escape velocity with ChatGPT, but on the API side
the biggest thing that we were able to gain was, you know, take Shopify or Stripe or Spotify: these were not customers of Azure, they were all customers of GCP or they were customers of AWS, so suddenly we got access to many, many more logos who are all quote-unquote digital natives who are using Azure in some shape or fashion and so on. So that's sort of one, and when it comes to the traditional enterprise, I think it's scaling; like, I mean, literally, people are playing with Copilot on
one end and then are building agents on the other end using Foundry. But, like, these things are design wins and project wins, and they're slow, but they're starting to scale, and again, the fact that we've had two years of runway on it, I think I like that business a lot more. And that's one of the reasons why: the adverse selection problem here would have been lots of tech startups all looking for their H100 allocation in small batches, right; having watched what happened to Sun Microsystems in the sort of
dot-com era, I always worry about that, which is, whoa, you know, you just can't chase everybody building models. In fact, even on the investor side I think the sentiment is changing, which is now people are wanting to be more capital-light and build on top of other people's models and so on and so forth, and if that's the case, you know, everybody who was looking for H100s will not want to look for them anymore. So that's kind of what we've been selective on. And so you're saying
that, for the others, training of those models and those model clusters was a much bigger part of their AI revenue versus yours? I don't know; I mean, this is where I'm speaking for other people's results. I mean, I just go back and say, what are the other big hit apps, right? I don't know what they are; like, I mean, where do they, like, what models do they run, where do they run them? I mean,
obviously Google's Gemini, I don't know; when I look at the DAU numbers of any of these AI products, there is ChatGPT, right, and then there is, you know, like, even Gemini, I'm very surprised at the Gemini numbers. I mean, obviously I think it'll grow, you know, because of all the inherent distribution, but it's kind of interesting to say that there are not that many. In fact, we talk a lot more about AI scale, but there are not that many hit apps, right: there is ChatGPT, GitHub Copilot, there's Copilot,
and there's Gemini. I think those are the four, I would say, in a DAU sense. Like, is there anything else that comes to your mind? Well, I think there are, you know, a lot of these startup use cases that I think are starting to get some traction kind of bottoms-up, a lot of them built on top of Llama, but, you know, if you said, oh, and there's Meta, if you said, OK, 10 more, what are the apps that have more than five million DAU, right? I think, yeah, I think
Zuckerberg would argue Meta AI certainly, you know, has more, etc., but I think you're right in terms of the non-affiliated apps, you named them. And Zuck's stuff all runs on his own cloud; I mean, he's not running on public cloud, right. Yeah. So on the enterprise side, obviously the coding space is off to the races and you guys are doing well and there's a lot of venture-backed players there. On some of the productivity apps, I have a question about the Copilot approach, and I guess Marc Benioff's
been kind of obnoxiously critical on this front and called it Clippy 2.0 or whatever. Do you worry that someone might think kind of first-principles AI from the ground up, and that some of the infrastructure, say in an Excel spreadsheet, isn't necessary to know if you did an AI-first product? And the same thing, by the way, could be said about the CRM, right; there's a bunch of fields and tasks that may be able to be obviated for the user. Yeah, I mean, it's
a very, very important question. The SaaS applications, or biz apps; so let me just speak of our own Dynamics thing. The approach at least we're taking is: I think the notion that business applications exist, that's probably where they'll all collapse, right, in the agent era, because if you think about it, they are essentially CRUD databases with a bunch of business logic. The business logic is all going to these agents, and these agents are going to be multi-repo CRUD, right, so they're not going to discriminate between what the back end is; they're
going to update multiple databases, and all the logic will be in the AI tier, so to speak. And once the AI tier becomes the place where all the logic is, then people will start replacing the back ends, right. In fact, it's interesting, as we speak I think we are seeing pretty high rates of wins on Dynamics back ends and the agent use, and we are going to go pretty aggressively and try and collapse it all, whether it's in customer service, whether it is in, you
know, by the way, the other fascinating thing that's increasing is not just CRM but even what we call finance and operations, because people want more AI-native biz apps, right. That means the biz app logic tier can be orchestrated by AI and AI agents; so in other words, Copilot to agent to my business application should be very seamless. Now, in the same way, you could even say, hey, why do I need Excel? Like, interestingly enough, one of the most exciting things for me is Excel with Python; it's like GitHub with Copilot,
right, that's essentially it. So what we've done is, when you have Excel like this, by the way, it would be fun for you guys: you should just bring up Excel, bring up Copilot, and start playing with it, because it's no longer like, oh, you know, it is like having a data analyst. And so it's no longer just making sense of the numbers that you have; it will do the plan for you, right, it will literally, like how GitHub Copilot Workspace creates the plan and then executes the plan, this is like a
data analyst who is using Excel as a sort of row-column visualization, an analysis scratch pad. So the Copilot is using Excel as a tool with all of its action space, because it can generate, and it has a Python interpreter; that is in fact a great way to reconceptualize Excel. And at some point you could say, hey, I'll generate all of Excel, and that is also true; after all, there's a code interpreter, right, so therefore you can generate anything. And so yes, I think there will be disruption.
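For readers who want a concrete picture of the "plan, then execute on a scratch pad" idea described above, here is a hedged illustration in plain pandas; the worksheet columns and the hand-written plan are assumptions, and in a Copilot-style system the plan would come from a model rather than being hard-coded.

```python
# Minimal plan-then-execute sketch of the "AI as data analyst" pattern described above.
# The spreadsheet columns and the hard-coded plan are assumptions for illustration;
# in a real Copilot-style system the plan would be generated by a model, not written by hand.
import pandas as pd

# Pretend this came from a worksheet the user is looking at.
sheet = pd.DataFrame({
    "region": ["NA", "NA", "EU", "EU", "APAC"],
    "quarter": ["Q1", "Q2", "Q1", "Q2", "Q1"],
    "revenue": [120.0, 135.0, 90.0, 88.0, 60.0],
})

# Step 1: a "plan" the assistant would propose before touching the numbers.
plan = [
    "group revenue by region",
    "compute quarter-over-quarter change per region",
    "flag regions with declining revenue",
]
print("Proposed plan:", *plan, sep="\n  - ")

# Step 2: execute the plan with ordinary dataframe operations (the scratch pad).
by_region = sheet.pivot_table(index="region", columns="quarter", values="revenue")
by_region["qoq_change"] = by_region.get("Q2") - by_region.get("Q1")
declining = by_region[by_region["qoq_change"] < 0]

print(by_region)
print("Regions declining quarter over quarter:", list(declining.index))
```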
But the way we are approaching at least our M365 stuff is: one is build Copilot as that organizing layer, the UI for AI, and get all agents, including our own agents; you can say Excel is an agent to my Copilot, Word is an agent. It's kind of specialized canvases, which is, I'm doing a legal document, let me take it into Pages and then to Word and then have the Copilot go with it, go into Excel and have the Copilot go with it. And so that's sort of a new way to think about the
work and workflow. You know, one of the questions I hear people wringing their hands about a lot today, Satya, is the ROI people are making on these investments. You know, you have over 225,000 employees; are you leveraging AI to increase productivity, reduce costs, drive revenues in your own business, and if so, kind of what are the biggest examples there? And maybe to a finer point on that, you know, when we had Jensen on, I asked him, when he 2 or 3x's his top line, what did he expect his
headcount to increase by, and he said 25%, and when I asked why, he said, well, I have 100,000 agents helping us do the work. So when you 2 or 3x your revenue for Azure, do you expect to see that similar type of leverage on headcount? Yeah, I mean, it's top of mind, and top of mind for both us at Microsoft as well as customers. Here's the way I come at it: I love this thing of, I've been going to school on learning
a lot about what happened in industrial companies with lean. Right, I mean, it's fascinating, right, they're all GDP-plus growers; it's unbelievable, like, I mean, the discipline they have in how the good industrials can literally say, hey, I'll add two to three hundred basis points of tailwind just by lean, which is increase value, reduce waste, right, that's the practice. So I think of AI as the lean for knowledge work. You know, we are really going to school on it, which is how do we really go look at it. That's
why I think, you know, the good old, you know, we remember in the 1990s we had all this business process re-engineering; I think it's back in a new way, where people who can think end-to-end process flows can say, hey, what's the way to think about the process efficiency, what can be automated, what can be made more efficient. So that's a little bit of it, I think. So customer service is the obvious one: like, we are on course, we spent around $4 billion or so, this is everything from Xbox support to Azure support,
this is really, I mean, this is serious: one, because of the deflection rate on the front end; then the biggest benefit is the agent efficiency, right, where the agent is happier, the customer is happier, and our costs are going down. And so that's, I think, the most obvious place, and that we have in our contact center application, which is also doing super well. The other one is obviously GitHub Copilot, and with GitHub Copilot Workspace, right, that's the first place where even this, what is the agentic sort of
side, comes in, right: you go from an issue to a spec to a plan and then multi-file edit, right, so it just completely changes the workflow for the eng team, as I said. And then the M365 is, you know, the catch-all, right; so the M365 Copilot is where, I mean, just to give you a feel, like, even my own, right, every time I'm meeting a customer, I would say the workflow of the prep of the CEO office has not changed since 1990, right. Basically, I
mean, in fact, one of the ways I look at it is just imagine how forecasting happened pre-PCs and post-PCs, right: there were faxes, then interoffice memos, and then PCs became a thing and people said, hey, I'm just going to put an Excel spreadsheet in email and send it around and people will enter numbers and we will have a forecast. The same thing is happening in the AI era right now, all over the place, right. I prep for a customer meeting where I literally go into Copilot and I say, tell
me everything I need to know about the customer; it tells me everything from my CRM, my emails, my Teams meetings, and the web, right, it grounds it. I put it into Pages, share it with my account team, you know, in real time. So just imagine the hierarchy: this entire thing of, oh, let me prepare a brief for the CEO, goes away; it's just a query away. I generate a query, share a page, if they want to annotate it; so I am reasoning with AI and collaborating with my colleagues, right, that's the new workflow, and
that's happening all over the place. Somebody gave me this example from supply chain: somebody said supply chain is like a trading desk except it doesn't have real-time information, right, that's kind of what it is. So it's like you wait for the quarter to end and then the CFO comes and bangs you on the head, saying all the mistakes you made. What if that financial analyst essentially can in real time be available to you and giving you, like, oh, you're doing this contract, and for this data center in this region you
should think about these terms; all that intelligence in real time is changing the workflow and work artifact. So lots and lots of use cases all around, and I think, to your fundamental point, our goal is to kind of create operating leverage through AI, right. So I think headcount will, in fact, one of the ways I look at it is our total people costs will go down, our cost per head will go up, and my GPUs per researcher will go up. Yes, that's kind of the way I look at it. That makes sense.
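As a rough sketch of the meeting-prep workflow described above (collating CRM, email, and meeting context into one grounded brief), here is an illustrative aggregation pattern; every connector and helper below is a hypothetical placeholder, not the actual M365 Copilot plumbing.

```python
# Illustrative sketch of the "prep for a customer meeting" workflow described above.
# All data sources and functions are hypothetical placeholders, not real M365 APIs.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Snippet:
    source: str   # e.g. "crm", "email", "meetings"
    text: str

def fetch_crm(customer: str) -> list[Snippet]:
    return [Snippet("crm", f"{customer}: open opportunity, renewal due next quarter")]

def fetch_email(customer: str) -> list[Snippet]:
    return [Snippet("email", f"Last thread with {customer} discussed pricing questions")]

def fetch_meetings(customer: str) -> list[Snippet]:
    return [Snippet("meetings", f"Two Teams calls with {customer} in the past month")]

def prepare_brief(customer: str, sources: list[Callable[[str], list[Snippet]]]) -> str:
    """Collate snippets from each connector into a single grounded brief."""
    snippets = [s for fetch in sources for s in fetch(customer)]
    # In a real system a model would summarize; here we just join the grounded snippets.
    lines = [f"[{s.source}] {s.text}" for s in snippets]
    return "Brief for " + customer + ":\n" + "\n".join(lines)

print(prepare_brief("Contoso", [fetch_crm, fetch_email, fetch_meetings]))
```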
Hey, let's shift ahead to something that you referenced earlier, just around what we're seeing out of model scaling and capex generally. You know, I've heard you talk about Microsoft's capex; I imagine in 2014 when you took over you had no idea that the capex would look like it does today. In fact, you've said increasingly these companies look more like industrial-company capex than traditional software companies. Your capex has gone from about 20 billion in 2020 to maybe as high as 70 billion in 2025;
you know, you've earned a pretty consistent return on that capex, right, so there's actually a very high correlation when you look at your capex to revenue. Some people are worried that that correlation will break, and even you have said, you know, maybe at some point capex is going to have to be spent ahead of the revenue, you know, there may be an air pocket, we have to build for this resiliency. So how do you feel about the level of capex, does it cause you any sleepless nights,
and when does it begin to taper off, you know, in terms of this rate of growth? Yeah, I mean, a couple of different things, right. One is, this is where being a hyperscaler I think is structurally super helpful, because in some sense we've been practicing this for a long time, right, which is, you know, hey, data centers have 20-year life cycles, power you pay for only when you use it, the kit is six years, you know how to sort of drive utilization up. And the good news here is it's
kind of like capital intensive, but it's also software intensive, and you use software to bring the ROIC of your capital higher, right. That's kind of like when people even in the early days said, hey, how can a hyperscaler ever make money, because what's the difference between old hosters and the new hyperscalers; it is software, right. And that I think is what's going to apply even to this GPU physics, right, which is, hey, you buy the leading node, you build it out. In fact, one of the things that's happening right now is
what I'll call catch-up, right, which is, we built, after all, over the last 15 years, the cloud; suddenly a new meter showed up in the cloud, it's called the AI accelerator, because every app now needs a database, a Kubernetes cluster, and a model that runs on an AI accelerator, right. So if you sort of say, oh, I need all three, you suddenly had to build up these AI accelerators in order to be able to provision for all of these applications. So that will normalize; so the first thing is the buildout will
happen, the workloads will normalize, and then you will just keep growing like the cloud has grown. So that's sort of the one side of it, and that's where avoiding some of these adverse selection issues, making sure it's not just all supply side, you know, everybody's sort of building only hoping demand will come, just making sure that there is real diverse demand all over the world, all over the segments; I watch for all of that. So I think that's the way to manage the ROIC, and by the way, the
margins will be different, right; this goes back to the very early dialogue we had: when I think about the Microsoft cloud, the margin profile of a raw GPU versus the margin profile of Fabric plus GPU, or Foundry plus GPU, or an app, you know, or the GitHub Copilot add-on to M365, they're all going to be different. And so having a portfolio matters here, right, because if I look at even the, why does Microsoft have a premium today in the cloud: we are bigger than Amazon, growing
faster than Amazon, with better margins than Amazon, because we have, you know, all these layers, and that's kind of what we want to do even in the AI era. Satya, there's been a lot of talk about model scaling, and obviously there was talk historically about kind of 10x-ing the cluster size that you might do over and over again, not, you know, once and then twice, and xAI is still making noise about going in that direction. There was a podcast recently where they kind of flipped everything on their head
and they said, well, if we're not doing that anymore it's way better, because we can just move on to inference, which is getting cheaper, and you won't have to spend all this capex. I'm curious, those are two kind of views of the same coin, but what's your view on large LLM model scaling and training cost, and where we're headed in the future? Yeah, I mean, you know, I'm a big believer in scaling laws, I'll sort of first say, and in fact, if anything, the bet we placed in 2019 was on scaling
laws, and I stay on that, right, which is, in other words, don't bet against scaling laws. But at the same time, let's also be grounded on a couple of different things. One is, these exponentials on scaling laws will become harder, just because as the clusters become bigger, everything, I mean, the distributed computing problem of doing large-scale training becomes harder. And so that's kind of one side of it. But I would just still say, and I'll let the OpenAI folks speak for what they're
doing, but they are, you know, continuing; pre-training, I think, is not over, it sort of continues. But the exciting thing, which again OpenAI has talked about and Sam has talked about, is what they've done with o1, right. So this chain of thought with auto-grading is just fantastic; in fact, you know, basically it is test-time compute, or inference-time compute, as another scaling law, right. So you have pre-training, and then you have effectively this test-time sampling that then creates the tokens
that can go back into pre-training, creating even more powerful models that then are running on your inference, right, so therefore that's, I think, a fantastic way to increase model capability. So the good news of test-time or inference-time compute is, you know, running those o1 models means, there are two separate things: sampling is kind of like training when you're using it to generate tokens for your pre-training, but also, customers, when they are using o1, they're using more of your meters, and
so you are getting paid for it, and so therefore there is more of an economic model, right, so therefore I like it. In fact, that's where I said I have a good structural position with 60-plus data centers all over the world. It's a different hardware architecture for one of those scalings versus the other, for the pre-training versus... Exactly. And I think the best way to think about it is it's a ratio, right. So going back to sort of Brad's thing about ROIC, this is where I think you have
to sort of really establish a stable state. In fact, you know, whenever I've talked to Jensen, I think he's got it right, which is, look, you kind of want to buy some every year, not, like, think about it, right: when you depreciate something over six years, the best way is what we have always done, which is you buy a little every year and you age it, you age it, you age it, right: you use the leading node for training, and then the next year it goes into inference.
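To make the "buy a little every year and age it from training into inference" point concrete, here is a toy calculation with invented numbers; the annual spend, the six-year straight-line depreciation, and the training-then-inference split are assumptions for illustration only, not Microsoft figures.

```python
# Toy model of a rolling accelerator fleet: each year's purchase serves training in year 1,
# then moves to inference, and is depreciated straight-line over six years.
# All dollar figures are invented for illustration.
YEARS = 8
ANNUAL_SPEND = 10.0          # $B of new accelerators bought each year (assumed)
DEPRECIATION_YEARS = 6

fleet = []  # list of (purchase_year, spend)
for year in range(YEARS):
    fleet.append((year, ANNUAL_SPEND))

    # Cohorts still on the books this year.
    active = [(y, s) for (y, s) in fleet if year - y < DEPRECIATION_YEARS]
    depreciation = sum(s / DEPRECIATION_YEARS for _, s in active)

    # Newest cohort does training; older cohorts serve inference.
    training_capacity = sum(s for y, s in active if y == year)
    inference_capacity = sum(s for y, s in active if y < year)

    print(f"year {year}: depreciation ${depreciation:.1f}B, "
          f"training capacity ${training_capacity:.1f}B, "
          f"inference capacity ${inference_capacity:.1f}B of kit")
```

The steady state the toy model reaches, where a constant slice of the fleet trains and the rest serves inference while depreciation flattens out, is the "stable state across the fleet" he describes next.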
And that's sort of the stable state I think we will get into across the fleet, for both utilization and the ROIC, and then demand meets supply. And basically, to your point about everybody saying, oh wow, have the exponentials stopped: one of the other things is the economic realities will also sort of set in, right; at some point everybody will look and say, what's the economically rational thing to do, agreed, which is, hey, even if I double every year's capability but I'm not able to sell that inventory... And the other problem is the winner's curse,
And the other problem is the winner's curse, which is, you don't even have to publish a paper; the other folks just have to look at your capability and do a distillation. It's kind of like piracy: you can put in all kinds of terms of use, but it's impossible to control distillation. That's one. The second thing is, you don't even have to do that; you just reverse engineer that capability and do it in a more compute-efficient way.
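(For readers unfamiliar with the term: distillation in the generic sense is training a smaller "student" model to imitate a stronger "teacher", often needing nothing more than access to the teacher's outputs. A standard-technique sketch follows; the models and batches are placeholders, not anyone's actual recipe.)

```python
# Generic knowledge-distillation loss (a standard technique, not anyone's actual
# recipe): a "student" model is trained to match the softened output distribution
# of a stronger "teacher".

import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      temperature: float = 2.0) -> torch.Tensor:
    # Soften both distributions, then minimize KL(teacher || student).
    student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    teacher_probs = F.softmax(teacher_logits / temperature, dim=-1)
    return F.kl_div(student_log_probs, teacher_probs, reduction="batchmean") * temperature**2

# Usage sketch (placeholder models and batch):
#   loss = distillation_loss(student(batch), teacher(batch).detach())
#   loss.backward(); optimizer.step()
```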
And so given all this, I think there will be a governor on how much people will chase. Right now everybody wants to be first, which is great, but at some point the economic reality will set in on everyone. And the network effects are at the app layer, so why would I want to spend a lot on some model capability when the network effects are all on the app?

What I heard you say... so Elon has said that he's going to build a million-GPU cluster, and I think Meta has said the same thing; I think for the pre-training he said 200,000, and then he kind of joked about a million, or maybe it was a billion. But the fact of the matter is, versus the start of the year, Satya, based on what you've seen around pre-training and scaling, have you changed your infrastructure plans around that? And then I have a separate question with regard to o1.

I am, Bill, building to what I would say is a little bit of that 10x point, which is, hey, we can argue the duration, is it every two years, every three years, every four years, but there is an economic model, and this is where I think you need a disciplined way of thinking about how you clear your inventory such that it makes sense, or, put the other way, the depreciation cycle of your kit. There is no way you can keep pre-buying unless the physics of the GPU works out such that it flows through my P&L at the same or better margin than the hyperscaler business. That's simple. So that's what I'm going to do: I'm going to keep going and building, basically to, hey, how do I drive inference demand, and then keep increasing my capability and be efficient at it. And Sam may have a different objective, and he's been open about it; he may say, hey, I want to build because I have deep conviction on what AGI looks like or whatever.
And so be it; therefore that's where I think a little bit of our tension even comes from.

To clarify something: I heard Mustafa say on a podcast that Microsoft is not going to engage in the biggest model-training competition that's going on. Is that fair?

Well, what we won't do is do it twice, right? Because after all we have the IP; it would be silly for Microsoft today, given the partnership with OpenAI, to do two unnecessary training runs.

Yes.

Correct. So we are very concentrated, and by the way, that's the strategic discipline we've had. That's why I always stress to Sam that we bet the farm on OpenAI and said, hey, we will concentrate our compute, and we did it because we had all the rights to the IP, and so that's the give-gets on it, and we feel fantastic about it. And so what Mustafa is basically saying is, hey, a lot of the focus on our end will be post-training, and even on the verification or what have you; that's a big thing. So we'll focus a lot of our compute resources on adding more model adaptations and capabilities that make sense, while also having a principled pre-training effort that gives us capability internally to do things. We anyway have different model weights and model classes for different use cases that we will continue to develop as well.
Does your answer to Brad's question about balancing GPU ROI also answer the question as to why you've outsourced some of the infrastructure to CoreWeave, in that partnership that you have?

That we did because we all got caught by the hit called ChatGPT. We were completely... I mean, there's no supply chain planning I could have done; none of us knew what was going to happen in November of '22. That was just a bolt from the blue. So therefore we had to catch up. We said, hey, we're not going to worry too much about inefficiency, and that's why, whether it's CoreWeave or many others, we bought all over the place. That was a one-time thing, and now it's all catching up; it was just more about trying to get caught up with demand.

Are you still supply constrained, Satya?

I am power constrained, yes. I am not chip supply constrained. We were definitely constrained in '24.
What we have told the Street is that's why we are optimistic about the first half of '25, which is the rest of our fiscal year, and then after that I think we'll be in better shape going into '26 and so on. We have good line of sight.

So I'm hearing, with respect to this level-two thinking: the o1 test-time compute and post-training work being done is leading to really positive outcomes. And when you think about that, it's also pretty compute intensive, because you're generating a lot of tokens, you're recycling those tokens back into the context window, and you're doing that time and time again, so it compounds very quickly. Jensen said that, looking at o1, he thought inference was going up a million or a billion x, that the demand for inference is going to go up dramatically. In that regard, do you feel like you have the right long-term plan to scale inference to keep up with these new models?

Yeah, I think there are two things there, Brad.
In some sense it's very helpful to think about the full workload. In the agentic world you have to have the accelerator, but one of the fastest-growing things at OpenAI itself is the container service, because after all these agents need a scratch pad for doing some of that auto-grading, even to generate the samples. That is where they run a code interpreter, and that, by the way, is a regular Azure Kubernetes cluster. So in an interesting way there's a ratio of even what is regular Azure compute and its nexus to the GPU, and then some data service. So to your point, when we say inference it's not just that. That's why I look at it and say people think about AI as separate from the cloud; AI is now a core part of the cloud. And I think in a world where every AI application is a stateful application, an agentic application where the agent performs actions, then the classic app server plus the AI app server plus the database are all required.
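(A schematic of why one agentic request pulls on more than the GPU: the model call hits the accelerator, the generated code runs in an ordinary container sandbox, and the agent's state lands in a data service. The classes below are placeholders, not Azure or OpenAI SDK objects.)

```python
# Schematic of a single agentic request: the GPU serves the model calls, the
# generated code runs in an ordinary container sandbox (the "scratch pad"), and
# the agent's state lands in a data store, so AI demand pulls classic compute and
# data services along with it. All three clients below are placeholder objects.

class AgentRuntime:
    def __init__(self, llm_client, container_sandbox, state_store):
        self.llm = llm_client              # GPU-backed inference endpoint
        self.sandbox = container_sandbox   # e.g. a Kubernetes-scheduled code interpreter
        self.state = state_store           # e.g. a document or vector database

    def handle(self, task: str) -> str:
        plan = self.llm.complete(f"Write code to accomplish: {task}")   # accelerator
        result = self.sandbox.run(plan)                                 # regular compute
        self.state.save(task=task, plan=plan, result=result)            # data service
        critique = self.llm.complete(f"Given this output, is the task done?\n{result}")
        return critique
```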
So I go back to my fundamental thing, which is, hey, we built these 60-plus Azure regions; they will all be ready for full-on AI applications, and that's what I think will be needed.

That makes a lot of sense. So let's talk a little bit... we've talked around OpenAI a lot during this conversation, but you're managing this balance between a huge investment there and your own efforts. At Ignite you showed a slide highlighting the differences between Azure OpenAI and OpenAI Enterprise, and a lot of those were about the enterprise-grade things that you bring to the table. So when you look at that tension, the competition that you have with OpenAI, do you think about it as: ChatGPT is likely to be the winner on the consumer side, you'll have your own consumer apps as well, and then you'll divide and conquer when it comes to the enterprise? How do you think about competing with them?
The way I think about it at this point: OpenAI is a very at-scale company now, a really successful company with multiple lines of business and segments, if you will. So I come at it very principally, like I would with any other big partner. I think of them as, hey, as an investor, what are their interests and our interests, and how do we align them. I think of them as an IP partner, because we give them systems IP and they give us model IP, so that's another side of it where we are very deeply interested in each other's success. The third is I think of them as a big customer, and so I want to serve them like I would serve any other big customer. And then the last one is the coopetition, whether it's Copilot in the consumer space, or Copilot with M365, or whatever else; we say, hey, where is the coopetition? That's where I look at it and say, ultimately these things will have some overlap. But in that context, the fact that they have the Apple deal is, for the MSFT shareholder, in some sense accretive. Even the fact of their APIs, to your point about the API differences: hey, the customers can choose which API front end. There are differences, right? Azure has a particular style, and if you're an Azure customer and you want to use other Azure services, then it's easiest to have an Azure-to-Azure match. But if you are on AWS and you want to use just the API in a stateless way, great, just use OpenAI directly. So in an interesting way, sometimes having these two types of distribution is also helpful to the MSFT cause.
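(A minimal sketch of the two front doors being contrasted, written against the openai Python SDK v1.x; the endpoint, API version, and deployment name below are placeholders.)

```python
# Minimal sketch of the two distribution paths being described, using the openai
# Python SDK (v1.x). Resource, deployment, and API-version values are placeholders.

from openai import OpenAI, AzureOpenAI

# Stateless use of the model straight from OpenAI's own API:
openai_client = OpenAI()  # reads OPENAI_API_KEY from the environment
resp = openai_client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello"}],
)

# The same style of call through an Azure OpenAI deployment, the natural choice if
# the rest of the workload (data, identity, networking) already lives on Azure:
azure_client = AzureOpenAI(
    azure_endpoint="https://my-resource.openai.azure.com",  # placeholder
    api_version="2024-06-01",                               # placeholder
)  # reads the Azure OpenAI API key from the environment
resp = azure_client.chat.completions.create(
    model="my-gpt4o-deployment",  # an Azure *deployment* name, not a raw model name
    messages=[{"role": "user", "content": "Hello"}],
)
```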
Satya, the curious part of the Silicon Valley community, and I would say the entire business community at large, is I think infatuated with the relationship between Microsoft and OpenAI. I was at DealBook last week and Andrew Ross Sorkin pushed Sam really hard on this. I imagine there's a lot you can't say, but is there anything you can say? There's supposedly a restructuring, a conversion to for-profit; I guess Elon's launched a missive in there as well. What can you tell us?

Yeah, I think those, Bill, are obviously all for the OpenAI board and Sam and Sarah and Brad and that team to decide, and we want to be supportive. This is where we're an investor. The one thing that we care deeply about is that OpenAI continues to succeed. It's in our interest, and I also think it's an iconic company of this platform shift, and the world is better with OpenAI doing well. So that's the fundamental position. Then after that, the tension, to your point, comes from the fact that in all of these partnerships some of it is that coopetition tension, and some of it is that Sam is an unbelievable entrepreneur with a great amount of vision and ambition, and the pace with which he wants to move. So we have to balance that all out: what he wants to do, I have to accommodate for, so that he can do what he needs to do, and he needs to accommodate for the discipline that we need on our end, given the overall constraints we may have. So I think we'll work it out. The good news here is that in this construct we have come a long way. These five years have been great for them and great for us, and at least for my part I'm going to want to keep going back to that; I want to prolong it as long as I can. It would only behoove us to have a long-term stable partnership.
When you think about the separate funding and untangling the two businesses, Satya, are you guys motivated to do that relatively quickly? I've talked about thinking that the next step for them, it'd be great to have them as a public company; it's such an iconic business, an early leader in AI. Is that the path that you see for these guys going forward, or do you think it stays kind of in the relationship that it is today?

And that's the place, Brad, where I want to be careful not to overstep, because in some sense we're not on the board; we're investors like you, and at the end of the day it's their board's and their management's decision. So at some level I'm going to take whatever their cues are. In other words, I'm very clear that I want to support them with whatever decision they make. And to me, perhaps even as an investor, it's that commercial and IP partnership that matters the most. We want to make sure we protect our interests in all of this, and if anything that bodes well for them going forward. But I think at this point people like Sarah and Brad and Sam are very, very smart folks on this, and whatever makes the most sense for them to achieve their objectives on the mission is what we would be supportive of.
Well, maybe we should wrap, and thank you for so much time today, but I want to wrap on this topic of open versus closed and how we should cooperate to usher in safe AI. Maybe I'll just leave it open-ended to you: talk to us a little bit about how you think about some of these differences and debates, and the importance of doing this. One anecdote I would just throw out there: Reuters recently reported that Chinese researchers developed an AI model for potential military use on the back of Meta's Llama. There are a lot of supporters of open source, like Bill and I, but we've also heard critics, and you said everybody can distill a model out there, so we are going to see some of these put to uses that we're not going to be happy about. So how do you think about us coming together, really, as a nation and as a collection of companies, to usher in safe AI?

Yeah, two things. I have always thought of open source versus closed source as two different tactics in order to create network effects. I've never thought of them as just religious battles; I've thought of them more as two different approaches.
That's why I think what Meta and Mark are doing is very smart, which is, in some sense, he's trying to commoditize his complement. It makes a ton of sense to me; if I were in his shoes I would do the same, which is get the entire world converged on it. He talks openly and very eloquently about how he wants to be the Linux of LLMs, and I think it's a beautiful model. In fact, going back to your economics question, I think game-theoretically a consortium could quite frankly be a superior model to any one player trying to do it, though unlike the Linux Foundation, where the contributions were mostly opex contributions. I always say Linux wouldn't have happened but for those contributions; in fact, Microsoft is one of the largest committers to Linux, and so was IBM, so was Oracle, and what have you. So I think there may be a real place for open, and open source is a beautiful mechanism for that, when you have multiple entities coming together, and it's a smart business strategy. Then closed source may make sense too; after all, we have had lots of closed-source products. Then safety is an important but orthogonal issue, because regulations will apply and safety will apply to both sides, and one could make arguments that, hey, if everybody is inspecting it there will be more safety on one side or the other.
So I think these are perhaps best dealt with in capitalism: at least it's better to have multiple models, let there be competition, and different companies will choose different paths. And then we should be pretty hardcore, and the governments will demand that. I think in tech now there's no chance of saying, hey, we'll see what happens with the unintended consequences later. No government, no community, no society is going to tolerate that. So these AI safety institutions all over the world will hold the same bar, and also national security, to your point: if there are national security leakage challenges, people will worry about that too. So I think states and state policy will have a lot to say about which of these models, and what the regulatory regime will look like.
Well, it's hard to believe that we're only 22 months into the post-ChatGPT era, but it's interesting: when I reflect back on your framework around phase shifts, you have put Microsoft in a really good position as we emerge into the age of AI. So congrats on the run over the last 10 years; it's been a sight to behold. I think both Bill and I get excited when we see the leadership, you, Elon, Mark, Sundar, etc., really forging ahead for Team America around AI. I think we both feel incredibly optimistic about how we're going to be positioned vis-a-vis the rest of the world. So thanks for spending some time with us.

I can't thank you enough for the time, Satya, really appreciate it. Thank you so much.

Thank you, Brad, Bill. Take care. Cheers.

As a reminder to everybody, just our opinions, not investment advice.