I'll just briefly introduce Susan, and then we're going to have a discussion. Susan is The Economics of Technology Professor at the GSB; she's an economist by training. I met Susan for the first time when I was an admitted graduate student in the PhD program at MIT. Susan had been a PhD student here and got her PhD very quickly — in three years. Four years. Four years? You were three; I was three. It's not that we're competing or anything. But to Susan's credit, she started college when she was about twelve. Anyway, I was an admit into the PhD program at MIT, and I went into Susan's office, and wow, was she smart and curious. In fact, I think we started our first research project more or less on my admitted students' day. So I've had the great pleasure of being on the faculty with Susan for a long time, first in economics and now here at the GSB, where she got here first.
We're going to have a discussion, and — just to foreshadow a little — you're going to hear about Susan's career. You'll see she's had a very interesting trajectory: starting in academics, going out into the private sector in a variety of ways, and for the last two years serving as the Chief Economist of the Antitrust Division at the Department of Justice, which she's done while still spending some of her time here working on various things at the school. She just rewrote the US Merger Guidelines; she's been in the middle of all the interesting issues around market power and technology. So we'll get to that as we go on. Okay, Susan — great to get to do this with you. Maybe to start out, you could just say a little about how you got into academics and became a professor.

So I went to college in the late '80s, and I had always been a bit of a computer nerd.
I was programming in BASIC on our little TRS-80 Radio Shack computers and so on. I needed a work-study job, so I ended up getting a job administering the Unix workstations for an economics professor, and he said, well, I kind of want to hire someone who wants to get an economics PhD. I was like, yeah, right — I didn't really understand what that was about, or what being a professor was really about. But at the same time I had a summer job working for a startup, and we were selling computers to the government in procurement auctions. We were writing bids and putting them into these procurement auctions, and at my startup everybody was complaining about the rules — people were manipulating the rules, and it was unfair, and it was inefficient, and it was frustrating, and we'd sit around, drink a beer, and complain about it. I told my professor about it, and he showed me how to turn that problem into an economic analysis of the policy and the incentives, and he ended up testifying in front of Congress to try to change the rules. And that was like, wow — here's this tool: I can take this job and I can change things for a lot of people. So that was what got me excited.

And you came to the GSB; you were a student here in the PhD program. Talk a little about being a PhD student at the GSB.

Well, first of all, the GSB was a great place to be, and we were right there before the beginning of the internet, so that was a wonderful time.
I spent a lot of time with MBAs, because — I wasn't quite twelve when I started college, but I was twenty when I started my PhD — so I was definitely up for the LPFs rather than the board-game nights. So that was good. It was an exciting time to be here, and there was a lot of interest in things like auctions; that was one of the reasons I came, because Milgrom and Wilson — faculty members here, who later got the Nobel Prize — were innovators in this area of auctions. So I started working on somewhat more theoretical work. And yet at the LPFs, in the corridors, and in the Rains housing and so on, people were also starting to build things like Yahoo and eBay, so I also felt a little bit of the path not taken: I was doing very well on my chosen path, and yet this whole startup-computer thing was getting left behind, even though that had been my initial passion.
But nonetheless I put my head down, and I got the job at MIT, which allowed me to recruit some young students like you.

Okay, so you go to MIT. When you started, you did very mathematical research in economic theory, and then at some point you broadened out and started exploring all different things. How did you pick the things you were going to work on as an economist?

Actually, looking back on it, the mathematical part was almost an accident. I've always wanted to solve problems, and it just so happened that I proved these theorems that helped people solve some problems in economics — but it was very upstream, and frankly it was a little less satisfying to me. So I was very happy to start getting back to solving more real-world problems. Some of the things we worked on combined economic models with data to answer marketplace-design questions. We were looking at whether having small bidders around helped make the auction more competitive, or deterred collusion, and then we combined data with models to answer questions like: how would you design an auction to make sure the small bidders win enough to keep participating, but without raising procurement costs or causing inefficiency? In these kinds of problems, during that time period, it was about how we could use these really rigorous tools to solve policy problems directly.
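The design tension she describes — keeping small bidders participating without inflating procurement costs — can be sketched with a toy simulation. Everything here is an illustrative assumption (the bidder counts, the cost distributions, the 5% bid preference, and especially the fixed 10% markup in place of equilibrium bidding), not a model from her research:

```python
import random

def run_auctions(preference=0.0, n_rounds=10_000, seed=7):
    """Simulate sealed-bid procurement auctions with a small-bidder preference.

    Two large bidders and one small bidder each bid cost * 1.10 (a fixed
    markup -- a simplification, not equilibrium bidding).  The buyer scores
    the small bidder's bid at bid * (1 - preference) when picking a winner,
    but pays the winner's actual bid.
    Returns (small-bidder win rate, average amount paid).
    """
    rng = random.Random(seed)
    small_wins = 0
    total_paid = 0.0
    for _ in range(n_rounds):
        large_bids = [rng.uniform(0.8, 1.2) * 1.10 for _ in range(2)]
        small_bid = rng.uniform(0.8, 1.2) * 1.10
        # Scored bids: the small bidder's discount applies to scoring only.
        scored = large_bids + [small_bid * (1 - preference)]
        winner = min(range(3), key=lambda i: scored[i])
        if winner == 2:
            small_wins += 1
            total_paid += small_bid   # pay the actual bid, not the scored one
        else:
            total_paid += large_bids[winner]
    return small_wins / n_rounds, total_paid / n_rounds

base_rate, base_cost = run_auctions(preference=0.0)
pref_rate, pref_cost = run_auctions(preference=0.05)
print(f"no preference : small wins {base_rate:.1%}, avg paid {base_cost:.3f}")
print(f"5% preference : small wins {pref_rate:.1%}, avg paid {pref_cost:.3f}")
```

Even this crude sketch exhibits the trade-off: the preference raises the small bidder's win rate, but every round in which it flips the winner also raises the price paid, so participation is bought at a cost — exactly the tension the design has to balance.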
So after you'd been at MIT, you came back here as a faculty member, and then — that was one of my big losses in economics, when Susan left and went to Harvard; that was terrible, though we did bring her back eventually. You went to Harvard, and you had won the John Bates Clark Medal as the leading economist under the age of forty in the United States. And then you did something a little unusual: you decided to go to Microsoft to be their Chief Economist. Why did you do that?

Well, I was sitting in my office at MIT — I had mostly been working on timber auctions and other types of auctions, kind of old-fashioned stuff, but I had started working on search-advertising auctions — and I got this big prize, and Steve Ballmer had read an article about me in the Harvard alumni magazine. I really didn't know much about any of this; I just got an invitation to come to Seattle, and I was like, okay, I'll go. So I went — this was in 2007. They had just tried to buy DoubleClick, which Google bought and turned into Google's ad exchange, and then they bought this other company, aQuantive, for $6 billion, which they later wrote off entirely. And it was like the next week that he read the article about me, and it's like, you know, I think I should get some different advice. So they brought me in, and this was pretty scary.
I was coming in cold — I hadn't really been in a business meeting since I was in college working for the startup, and certainly never at something like Microsoft. But they had these really interesting questions: can there be more than two search engines, and what does that look like? Should we be in the advertising business, and how would that work — and also, how do we operate it? They had built a search engine, and they had engineers making the user side run, but they didn't have any analytics on their ad platform. They were at a very early stage of understanding big data — and building packaged software is incredibly different from running a search engine — so they generally realized they were a little in over their heads. I got thrown in there cold: I was just consulting, and then suddenly, it felt like the next day, I was Chief Economist, and it was, "Is Yahoo worth $35 billion?" It was kind of the big time, and that was pretty overwhelming. I think if you'd asked me whether I wanted to just go do that big job, I probably would have said no. But as soon as I saw what was involved, I realized it was as intellectually challenging as anything I'd ever done, because I was trying to predict industry structures for industries that were new or didn't exist yet. I needed to understand the value of data —
how many more advertisers would you get if you got more users, and so on? These were incredibly interesting problems, so it was pretty compelling, and since they were early, I had a very large influence very quickly. On the other hand, there was still this connection with research, because I was seeing research problems every day — in fact, I still haven't finished all the research ideas I got in 2008 — because I basically had a front-row seat to what we now see all around us; you could sort of see the future. What is it like to try to manage a large artificial-intelligence, machine-learning system at scale, one with 2,000 algorithms and a decentralized structure? How do you know if it's safe, how do you make decisions, how do you monitor it and make sure it's giving users what they want and expect? You have this really decentralized advertising business, where some of your customers are really small businesses and some are companies like Amazon and eBay, who are advertising customers as well. How do you manage that system — going back to our work on the small bidders and the big bidders, but now much more complex? How can you make a healthy marketplace where the small bidders stick around, but you don't lose the business of the big businesses either? Those were the more economics-y questions, but then there were all these other things too, like: wow, this affects what information people get.
We re-rank the results, and people click on different things and go to different websites — and wow, what if people start getting their news this way? What will that do? And people were like, oh, that's not how people read the news; people like their newspaper, Susan, what are you worried about? Well, I was worried, and I was right to be worried — just a little ahead of my time. And then the A/B testing platform was another thing. At the time, economists occasionally ran randomized experiments, but it was group one and group two, and you would spend a couple of years planning it and analyzing it — you know, how does class size affect student outcomes? The idea that you're going to run thousands of experiments in parallel and then implement hundreds of things at a time — mind blown. And the science of how to do those experiments, and how they're different when you have large data sets and might use the data for personalization — that science hadn't been done yet.
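The statistics behind any single one of those experiments are classical; what was new was the scale. As a baseline, a two-sided two-proportion z-test comparing conversion rates between arms A and B looks like this (the function name and the example counts are illustrative, not from any production platform):

```python
import math

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test.

    conv_*: number of conversions; n_*: number of users in each arm.
    Returns (z statistic, two-sided p-value) under the pooled-variance
    normal approximation.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Standard normal CDF via the error function; two-tailed area.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

z, p = ab_test(conv_a=520, n_a=10_000, conv_b=440, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

The hard problems she's pointing at — thousands of tests running in parallel, continuously monitored data, and using the same data for personalization — are exactly where this one-test textbook recipe stops being enough.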
The statistical science was missing. So there was suddenly both this incredible opportunity to do it and play with it, and a bunch of open scientific questions. While I was there, I was seeing the questions and kind of hacking the answers — guessing at what an answer would be and implementing something that would be 80% good. But ultimately what led me to leave and come back was realizing I had just been given this enormous gift. I would go back to my office at Harvard and say, hey, there's A/B testing, there's machine learning — and they're like, machine learning? We don't do that. A/B testing? What are you talking about? Search engines? That's niche; who really cares — that's economics of the internet, a tiny little thing. Very negative feedback, which at first was discouraging, but then I realized: wow, I can really make a difference. I can take that knowledge and be a part of integrating what's coming into the fabric of our teaching and our research.
All right, I want to dig a little more into that feedback loop of going into industry as an academic and then coming back. You went first to Microsoft, and then over time you were on lots of technology boards and worked with other companies as well. What's the most valuable thing you have brought to tech companies as an academic?

It's a great question, and sometimes I wonder: why do they want me here? People seem to want me here, but I don't always know why. So I've puzzled over what economists can do — and by the way, one other thing that came after that was really the beginning of this career path of the tech economist. We started these little groups with three or four or five tech economists, and now we have a conference that 800 economists come to, and there's a job market and everything for economists to be part of the tech industry. So I've listened to other people talk about this question as well, and I think there's more than one answer.
At the highest level, one of the things I'm really proud of at Microsoft is that I worked on convincing the company to build Azure. Now it seems like a no-brainer, but it was highly controversial, because there was this existing business — Microsoft SQL Server and Tools — making billions of dollars a year, and the people who ran that business believed that now and forever their customers would keep doing what they were doing, which was everything on-premise. And if you asked the CTOs and CIOs, they would all say, oh yeah, we want to be on-premise, we have to be on-premise — because they were running their organizations that way. So we did a lot of economic analysis to make two points. One was to think about the cost curves and say: those CIOs are going to get fired, because their costs are going to be 10x what can be delivered. That was one very important fact. A second really important fact came from thinking about the industry structure and realizing it was likely to be concentrated. So instead of wait-and-see — is this cloud thing really going to be real, and then we'll build it later — no, you needed to build it now, even if it meant sacrificing a couple of billion dollars of certain revenue for an uncertain future. That was a really gutsy decision. We did a lot of analysis, we argued about it for a year and a half or so, and then Steve put Satya Nadella, who had been in Search, in charge of Server and Tools, which he then turned into the Azure division — again, very controversial. One other piece of it was that Satya and I had been working together on the search engine, and we both had the lived experience of what it meant to be a customer on the search engine. So when I think about what I brought to that conversation — obviously there were very important other participants — I would say it was the rigorous economic analysis, and understanding the technology well enough to be able to
predict the future, and also to understand how the features of the technology and the features of the business model were going to fit together to lead to a future market structure — which made it imperative to build immediately. So I think having that rigorous thinking is important. More broadly — and I think this is common across business leaders as well — when I say it's so important that we train future MBAs to be in the room: at Microsoft there were lots of rooms with no MBAs. I was kind of doing the MBA job, but the MBA wasn't in the room, partly because they didn't have the technical background. So I wanted to make sure our future leaders were being trained so that they would be in the room, because I feel it's so important to have that business perspective. It seems obvious, but when you're building technology, you can have people operating a search engine and building tools for advertisers who have never met an advertiser. To the engineers, a customer was just number 12567 in the database — their customers were just lines in a database — and they didn't have insight into the business problems. And it's not that hard to become multilingual: you have your domain expertise, but you become multilingual and grow up with it. So as decisions get more data-driven, I wanted to make sure we were teaching our students — and part of it is just exposure, getting exposure to technology in the classroom so that it wouldn't be frightening, and so people would have the confidence, including the confidence to raise your hand and kind of call BS. People have technical visions that are kind of crazy; one of my big jobs at Microsoft was to say that ideas were bad. It's kind of depressing, but technical people can come up with lots of ideas without really thinking through whether the business is viable. So that's, I think, a value — very similar to the value a business person brings: it's really combining the technical with the business insight.

I love that. I think that's a core learning in an academic seminar too — being able to tell people their ideas are bad — so we're able to apply that. Now, one of the things you brought back from that experience in technology was an appreciation for machine learning, which was nowhere in economics or the social sciences when you started working on it.
And now, at graduation, when the PhD students at the GSB walk across the stage, machine learning is in literally every person's dissertation title. Say a little about how you saw that coming and what you've done to make it such a big thing.

So one thing you may or may not know is that economics research and marketing research have been empirical for a long time — we've used data together with models — but a lot of our research was on small data sets. That's one thing. The second thing is that economics and business research is often about policy questions, or cause and effect: we want to know what would happen if a firm raised prices, or if we increased the minimum wage. These are what-if kinds of questions. The first wave of machine learning in AI was all about prediction and classification. You have a data set of images and you have labels — this is a cat, this is a dog, this is a horse — and you hold out some test data and use the model to tell whether a new picture is a cat or a dog, and you know you've done a good job if, on this hidden test data, you predict cat and dog and get it right. That's a very easy problem, in a way — and a good thing to delegate to a black box, because it's like hiding your answer key in the teacher's drawer. You have an assessment,
and some intern can come in and build some black-box thing. You don't need to know how it works; at the end you just assess whether it got the cats and dogs right, and if it did, great, let's go. The engineering side of this has of course been miraculous, but the conceptual side — what problem are you solving, the statement of the problem, and the ability to assess whether you solved it — is very simple. Contrast that with: what would have happened if we hadn't cut the price last month, if we had kept it high? What would happen if we introduced a new product that's similar to our old product? What if we announced a promotion? These are all what-if decisions, and you can't easily see the world where you didn't do the thing, so you don't necessarily have your answer key in the drawer. If I'm treating patients with a drug, some people got the drug and other people didn't, but for anybody who got the drug, I don't know what would have happened to them if they hadn't. That's a causal problem. So one of the gaps I saw was that these amazing engineering innovations had not been connected with the people doing regular old business analytics and economics research. I laid out the conceptual framework and built the tools for connecting those things, which help you, for example, build a personalized policy: you give the drug to the people for whom the drug works best.
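That "give the drug to the people for whom it works best" idea can be illustrated with a minimal T-learner on simulated data. This is a deliberately stripped-down sketch — stratified group means stand in for the forest-based estimators her actual tools use, and all the data is synthetic:

```python
import random
from statistics import mean

rng = random.Random(0)

# Synthetic randomized trial: a binary covariate x; the true effect of
# treatment is +2.0 for x = 1 and -1.0 for x = 0 (heterogeneous by design).
data = []
for _ in range(4000):
    x = rng.randint(0, 1)
    t = rng.randint(0, 1)                        # randomized treatment
    effect = 2.0 if x == 1 else -1.0
    y = 1.0 * x + effect * t + rng.gauss(0, 1)   # outcome with noise
    data.append((x, t, y))

def cate(x_value):
    """T-learner with stratified means: estimate E[Y | T=1, x] and
    E[Y | T=0, x] separately, then difference them to get the
    conditional average treatment effect for that subgroup."""
    treated = [y for (x, t, y) in data if x == x_value and t == 1]
    control = [y for (x, t, y) in data if x == x_value and t == 0]
    return mean(treated) - mean(control)

# Personalized policy: treat exactly those with a positive estimated effect.
policy = {x: cate(x) > 0 for x in (0, 1)}
print(f"estimated CATE: x=0 -> {cate(0):+.2f}, x=1 -> {cate(1):+.2f}")
print(f"treat? {policy}")
```

The average effect across everyone is positive here, so a naive A/B comparison would say "treat everybody" — the personalized policy instead withholds treatment from the subgroup it actually hurts, which is the whole point of estimating heterogeneous effects.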
The tools and software I built have now been used by thousands of social scientists — people go back and reanalyze old experiments, or look at administrative data. Some things I've looked at are the effects of laying people off: who's most resilient to layoffs, and who's most impacted. But tech firms have adopted this as well and built it into their A/B testing platforms; many of the large tech firms, and small ones too, have used it. I meet people on the street — oh, I'm using your software package. And of course, how did I know it would be adopted? Because I had been working on it in industry, and I saw that the products didn't exist and that people were confused about what to do. So there was a problem I could solve, and I went out and solved it. But beyond that, I also proselytized to academics about the impact of this stuff — that they needed to learn about it, and measure its impact as well.
Now, about seven years ago, you were part of the group that set up the Human-Centered AI Institute at Stanford. Tell us a little about that and what you did there to help. This is an institute at Stanford, started around 2017 — about seven years ago, I guess — with a vision of advancing research in artificial intelligence: foundational research, but also applications and ethics, thinking about the responsible use of AI. So maybe you want to say a bit about the role you played there as the social-science person on the team.

I think one of my learnings from the past was that it's really hard to just tell people to be interdisciplinary. It's hard to say, hey, engineer, go talk to the ethicist — people are like, yeah, I'm busy. But if you are actually building something together, or if you are more deeply engaged in the problem, that's when those collaborations happen. I've seen it at multiple tech companies: it's when there's a problem — oh no, what do we do, people are putting objectionable search terms into the search engine, or something is causing a differential impact on different groups in your products — that people make a lot of progress, because the engineer says, oh my gosh, I don't know how to solve this, and the ethicist or the business person says, well, I don't really know what the options are for solving it. Content moderation is another great example — it's just so hard; people don't agree on what you're trying to do, and then how would you even measure whether you were doing it? These are big challenges. So based on that experience, I felt that at Stanford, if we wanted to send people out into the world, we needed to really get them engaged in real problems.
A bunch of institutes have been started at other universities, but ours is more inherently multidisciplinary. Here's one of the problems I wanted to solve: our engineers in the computer science and engineering school would take all these classes where they learned machine learning — but what did they learn? They'd learn an algorithm like random forests, then download 200 data sets and run it on all 200 to see how it worked. But they didn't know what the variables were; they had no idea whether a given data set was a health data set or an advertising data set or anything else. So they were getting really good at cranking the machine, but then they would go into industry — and at the time I was on tech boards, and people would say, oh, those annoying Stanford grads, they come in wanting to build neural nets for everything, but wait: we're a lending organization, we have regulation, and what if the economy changes — will your model break? So these students were hot on the job market, they could get jobs, but it was actually hard for companies to ingest them. And on the student side, they also realized something was missing. They knew neural nets would get them a very high-paying job, but they also care about the problems they're working on, and they realized they weren't getting trained for that. So this is the gap we were trying to fill: let the engineers solve a real problem,
and actually get practice thinking about what it means — and also bring in social scientists and business students, and let them learn from the technologists about what solutions are possible.

Yeah, and it's been amazing at doing exactly that, including with classes and other programs. I want to open it up to the audience, but before I do, I want to ask you a little about the DOJ experience. Two years at the DOJ; you rewrote the Merger Guidelines; I know you can't talk about prosecuting all the tech firms, but that's been a big thing, of course; and you've been helping to bring the DOJ into a data-aware era. Say a little about the things you're most excited about from your work at the DOJ.

Sure. The Merger Guidelines were a big deal, and that was a really complicated project. Put very simply, there were many cases the previous guidelines didn't cover, and that can be a problem: if you're doing a merger and your situation isn't addressed by the guidelines, you have to guess about how the agencies will think about it. The guidelines also help staff do investigations, and without coverage they're kind of unmoored — what's the language, how do I approach this? One of the things that was missing was a merger involving a company that's already a monopolist buying an adjacent product — not Coke and Pepsi merging, but a firm buying something that might be a complement to its own product but might also be a complement to its competitor's product, so that it might be able to withhold that product from the competitor, or reduce interoperability. We did already have analytic frameworks for analyzing that in a pretty competitive industry: if things are very competitive, then one firm buying a complement can be beneficial, because if you sell more of your product, then
people will buy more of your complement. But if you are already a monopolist, the main thing you're worried about — and should be worried about, from your shareholders' perspective — is not losing your monopoly. So you're not going to be thinking about buying something because of some short-term effect; you're really thinking about building your moat, or about whether this complement might help somebody enter your industry. Those are the kinds of considerations we laid out in the revised Merger Guidelines. In the end, it's still all about how a merger is going to affect customers — but not just today's customers, under an assumption that there's going to be lots of entry. It's also about tomorrow's customers, who may lose out: if you're a monopoly software company and you buy something adjacent, maybe you just end up bloated, with bad customer service and buggy software, and no reason to fix it. So we're thinking about tomorrow's customers, who are cursing at the screen with no other option, rather than just asking whether you're going to get a 2% price change today. It's really about introducing those dynamics. What's interesting is that here at the GSB, back in the '70s and '80s, research was being done about all those kinds of scenarios — the research is actually quite old. But antitrust law is very conservative and moves very slowly, so it had mostly still been considering a world of perfect competition and atomistic companies, and worlds where a company's only choices are to lower price or improve quality. If those are your only choices and a competitor comes in, of course you lower price or improve quality to compete. But if you have other choices — like closing down your API — that's not in the Econ 1 textbook, and that might be good for you and bad for your customers. So those are the types of considerations we brought in.
As for building up the data side, it's really just a continuation of the theme. Here at the GSB I built the Golub Capital Social Impact Lab, where we have interdisciplinary teams — business students and engineers — building things for social impact, or helping social-impact organizations adopt technology. The government has that problem in spades. At the DOJ, my group used to have economists and lawyers, but now you're investigating tech firms, you're getting huge data sets you can't even ingest, so it all has to be outsourced to consultants, which makes for more expense and lower-quality decision-making. So one of the things I did was build a data-science team — and I'm building a technology team, and getting the software tools in place — so the agency has more expertise, and so that when a company comes in, the staff understand what they're talking about, understand the vocabulary, can reach decisions faster, and can ask more intelligent questions. That's a work in progress, but interestingly, it's happening all over the world. And I think with the AI revolution, all the regulators around the world are really scared that we're going to have this super-important technology dominated by a couple of firms, and that that's going to slow down the benefits — it might hurt the application ecosystem, it might hurt the ability of startups to invest. So having in-house people who actually understand the technology, can build it themselves, and are really fluent in it is crucial for solving that. And I'll give a little plug:
here just the website just went up today um May 30th a com conference doj and Stanford jointly organized at the GSB um for um studying bottlenecks in the AI industry and what we need to do to promote competition so we'll have um you know startup people and um people from small and big companies to to hear about um competition and and issues in AI I so got to ask you one more question based on that which is you know what I think one thing you brought in to the doj and to to write the merger
guidelines, is a new frame. Antitrust and regulatory policy historically envisioned traditional economic widget manufacturers, where the real question was whether there were five firms or four firms or three firms or two firms, and the market was largely contained. (By the way, that's a term of art at the DOJ; internally we're like, oh, that's a widget merger. And it's a term in economics too.) And you brought in a frame recognizing that the modern economy is much more about two-sided marketplaces and network effects, where there are opportunities for monopolization, different kinds of barriers to entry, and different ways in which you can exercise market power. If you take something like artificial intelligence, which absolutely fits this modern way of thinking about markets and competition, what do you think? Are we going to end up with one or two dominant firms as our technology overlords, or is it going to usher in an era of more competition?

Well, I am trying to learn everything I can about that topic, and I think all the competition people in the world are trying to learn everything they can about it. What I am hearing are multiple perspectives. Some people are very worried that the firms that already have a dominant market position are going to buy the things next door, are going to make exclusives; it's so capital intensive that it seems like, gosh,
we're just going to see some of the current problems we have replicate themselves, only worse, because it's even more important this time; it touches even more of society. On the other side, right at this very moment a lot of people are using open-source large language models. They are fine-tuning them, and they're doing something called retrieval-augmented generation. If you aren't into the technical part: these large language models, the ones with seven billion or seventy billion parameters or something like that, take any sentence or paragraph or paper and boil it down to a list of numbers. It's like a decoder ring: any words get a code. So what retrieval-augmented generation does is say, you have documents in your company; you run all the documents in your company through this decoder ring, and each document gets a code. Now instead of having documents of text, you just have a list of codes (long codes, but codes). Then when somebody in your company has a question, like "is there a presentation about this?" or "what's our analysis of that?", the question also gets turned into a code, and you find the documents whose codes are most similar to it. That's step one. Step two is you ask a large language model to summarize those documents, and you can also tell people where the original documents are. That's actually something that's very easy to do; you can do it for a thousand dollars or two of compute, more if you want bigger models. I've done it myself here, and a lot of professors have done it using Stanford equipment. Running it just takes one or two GPUs, so once you've trained it, which is a little more expensive, it can be done on a couple of computers. So that's something where you go, wow, this is going to be super competitive, right? If there's an open-source model and anybody can do it and it only costs a couple thousand dollars, I can do it with a master's student. In my whole career I've built stuff, and most of the time you think something is going to work in six months and it works in two years. With this, I said, hey guys, I have a presentation in ten days; master's students, can you try to give me a demo? And it worked in ten days. Of course we had the data organized and so on, but nothing has ever worked that fast in my whole life. So this is something that seems super competitive. But that's just because today, fine-tuning the open-source model is about as good as the thing you pay for. What happens if the thing you pay for gets a lot better than that? And what happens if those open-source models go away, for whatever reason, either because companies stop building them or because they're deemed to be unsafe? That's just one element of it: open-source large language models are one thing that could really affect the competitiveness of the industry, but there are other bottlenecks, the chip bottleneck and so on. Those are the kinds of risks that we're worried about.

And just to make sure I understood: that type of architecture, where you're doing retrieval, that's a way to do generative AI, which we've all seen in ChatGPT, but indexed back to the underlying documents, like Google search. So it's a hybrid type of technology?

Yeah, it's similar to what you're seeing on Bing: if you try Bing right now, they'll give you an answer and they'll also give you the citations. And you can build that for your own company relatively easily. I've basically never said anything was easy before, in my entire life; usually people come ask me, "should I adopt this?" and I'm like, eh, not really, we had a team of ten engineers and it took them two years to get it to work, so probably not. But this is a first: yeah, you can do it.
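The retrieval step described here (encode every document as a "code", encode the question the same way, return the closest matches, then hand them to a model to summarize) can be sketched in a few lines. This is a toy illustration only: a bag-of-words counter stands in for a real LLM embedding model, the example documents are invented, and the summarization step is omitted.

```python
# Minimal sketch of retrieval-augmented generation's retrieval step.
# Toy stand-in: word counts play the role of an LLM embedding ("the code").
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy embedding: the 'decoder ring' that turns words into a code."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Similarity between two codes (cosine of the angle between them)."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question: str, documents: list[str], k: int = 1) -> list[str]:
    """Step one: find the documents whose codes are most similar."""
    q = embed(question)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]  # step two would ask an LLM to summarize these

docs = [
    "Q3 presentation on merger review staffing",
    "analysis of hospital consolidation and nurse workloads",
    "memo on cloud software procurement",
]
print(retrieve("is there a presentation about staffing", docs))
```

A production system would swap in a real embedding model and an LLM for the summary, but the shape of the pipeline is the same.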
And then the only question is whether it somehow becomes obsolete: whether the latest generations of the things you pay for get so much better that you decide it's not worth it to build it yourself. So it's a really interesting horse race between these two models, and which is going to be the most economical. We're having a moment where the industry could go one way or the other. At the beginning we all thought there were going to be ten search engines, like back in the day, and now we think this could be a really competitive industry, but it could go either way.

OK, next question for Susan, in the back. Not quite in the back, but almost in the back.

Hi, I'm very excited and impressed that you managed to get a team of engineers and data scientists working at the DOJ. How could we do that fifty times bigger, across the whole government?

That is an excellent question. So
I have learned that so many things are hard. Believe it or not, the very hardest thing is just getting software; cloud computing has made things easier, but it's just really hard to get stuff in. One thing that's really surprised me about government (and by the way, in my previous life I was on boards of companies that got frustrated with regulators, so I do understand the frustrating side as well, but here I'm going to focus on the positive) is that people are so mission driven. I thought: I have seen these tech companies, I was on the board, we were trying to hire people, you couldn't hire anybody to save your life, and they'd stay for a year. How is the government ever going to hire anyone, for less than $100,000 a year? Never going to happen. But it did. The first time, we had three data science positions and we got our top three choices, and I couldn't believe it. Part of it is that people, especially young people today, really value the mission. And there are a lot of people who are concerned because they've had personal experiences. This is actually something else from the merger guidelines: we got 30,000 comments from individuals. Part of it was that the Hollywood writers were on strike at the time, so we got a lot of them writing in, but we also got thousands of comments from nurses and doctors. Again, these are folks who didn't used to think of the government as their friend, but their employers have been bought up and consolidated, and consolidated, and consolidated again, and now their patient loads have doubled and they can't treat their patients right. So there are just a lot of people who have been personally touched by competition issues, and they're very passionate. They're really excited to go work somewhere where they think: I'm not going to use my time and effort to hurt people, I'm going to help people. I worked on targeting ads at some point in my life, but I don't want to target ads; I want to help citizens. I was at a retirement party a couple of days ago where I was speaking, and hearing the people talk about why they had come and why they had stayed, you really saw the values. Everybody said: I'm working for the American people, that is what I do. So that's part of it. And now I understand what it takes to recruit people; we are
getting sort of younger people. The other thing is that we're actually a pretty good job, because we're a flat organization: there's me (I'm a political appointee), then there's a staff member under me, then we have a manager layer, and then we have the staff. So even a right-out-of-school data scientist is playing a huge role in an important investigation, as the domain expert, instead of being the cog in the wheel at the bottom. Those things have really helped. One other thing: we're hiring in technology policy, and some of you might know people who would be interested in this. There are also some career changers: people who made a bunch of money, maybe they got in early and had equity, or their spouse is in tech or other places, and now they say, hey, I see that there were some negative societal impacts and I want to do something. So we also see VPs changing jobs and coming to a lower salary to work in government or on government policy.

Great answer; that was fabulous to hear. OK, there are so many. Cara, you just decide.

Hi, my name is Fatia, and I've been fascinated with the talk so far. I have a question about your work with the DOJ: what are some of the future crimes related to AI that you are most concerned about?

So I was actually mostly talking so far about competition in the AI industry, but there's also the question of how AI will affect competition and the
practice of competition. I think we have a real challenge there. One of the things the merger guidelines did is revert the concentration thresholds for blocking mergers back to what they had been for the previous forty years; they had been temporarily relaxed in 2010. One thing that made me feel the older thresholds were more appropriate is that in the old world, collusion required you to sit down in a room, to travel, to try to talk about things. Now we're going to see more tacit collusion, which, depending on how it takes place, might be illegal or might not. You just have algorithms setting prices, looking at the other prices and responding to them, and research has shown that it's pretty easy for those to learn to keep prices high. So industries where humans were occasionally changing prices might have been less vulnerable to collusion than industries where two robots are learning to collude. I'm just worried in general that, with moderate concentration, these algorithms kind of just lead to higher prices and higher markups. The second thing is that there can be algorithms or technology that are essentially helping people collude, helping them share information that wouldn't otherwise be public, and that can in principle also lead to much higher prices; there are a couple of sectors where that's becoming more prevalent. So those are some of the things that I'm worried about.
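The two-robots dynamic can be made concrete with an extremely stylized simulation: two sellers whose algorithms each set this period's price from the rival's last price. A "match the rival" rule sustains a high starting price indefinitely, while a myopic undercutting rule competes prices down to cost. This is an invented toy model for intuition, not a description of any real pricing system or of the research mentioned in the talk.

```python
# Two sellers; each period, each pricing algorithm reacts to the rival's
# last observed price. All numbers and rules are invented illustrations.

COST = 1.0  # marginal cost, the competitive floor

def simulate(rule, p1=10.0, p2=10.0, periods=50):
    """Run the two sellers' identical pricing rules against each other."""
    for _ in range(periods):
        p1, p2 = rule(p2), rule(p1)  # simultaneous reaction to rival's price
    return p1, p2

def match_rival(rival_price):
    # A "follow the rival" rule: never undercuts, so a high price persists.
    return rival_price

def undercut(rival_price):
    # A myopic rule: shave the rival's price, but never go below cost.
    return max(COST, rival_price - 0.5)

print(simulate(match_rival))  # (10.0, 10.0): the high price is sustained
print(simulate(undercut))     # (1.0, 1.0): the price war ends at cost
```

Neither robot ever communicates with the other; the high-price outcome in the first case comes purely from each algorithm's reaction to observed prices, which is what makes this kind of conduct hard to reach with existing collusion law.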
The one where the robots are just, I build my robot, you build your robot, we do it completely independently, but they learn to set prices high by reacting to each other, is a harder one to get at. We may have to update regulation to think about what responsible pricing looks like, but frankly I don't know how to write that regulation; that's pretty hard.

That's a fascinating problem. OK.

Hi, thanks. I guess sort of two quick questions:
one you sort of hit on, but should the US government be competing with OpenAI and Grok and Meta? Should they be writing their own open-source LLM utility? And second, you've been on boards of lots of marketplace companies: Turo, Expedia, LendingClub. What are the biggest and most common mistakes you've seen from tech companies as they build and grow marketplaces?

Let's take the second one, because we haven't talked much about that. One other thing I should have mentioned is that my time at the GSB has also been an incredible learning experience on platforms, and I've been teaching about platforms and marketplaces here for a long time. Those were popular courses and they've only gotten more popular, and it's another place where economics is really helpful. Most of the time you think, oh my god, I slept through my economics class, why would anybody sign up voluntarily to take an economics course? But if you're going to build a marketplace, you really do need to understand some economics. It's not so easy
to manage suppliers and buyers and keep them healthy. I would say one challenge is that it's actually just hard to know, and hard to measure, whether your marketplace is healthy. Should I spend more on advertising to get more Uber drivers, or should I be trying to get more Uber customers? Sometimes it's way out of whack, so you should lean into one side, but a lot of times these markets are very, very local, so how do you understand at scale where you need more supply? One of the biggest mistakes I see marketplaces make, and this is also related to A/B testing and mistakes in A/B testing, is that it's very easy to test the user side. You can change the interface and they click a little more; move it this way and they click a little less. You can run an experiment in a day and figure out whether it should be pink or blue. But on the supplier side, people are often running businesses. On Turo they're running a little small business renting out cars, or you might be a property manager on Airbnb, or you may be an advertiser. If anybody here sells to small businesses, you'll know that working with small businesses is really hard. They're not irrational, they're not completely shortsighted, but they don't have a lot of time to adopt your new tool, they're a little bit unpredictable, they're hard to attract, they're hard to sign up; there are a lot of frictions. So the biggest mistake I think marketplaces make is that, because they can't measure it very well, their A/B tests show the supply side as unresponsive to anything, and they don't really appreciate how important it is to nurture the supply side: to make sure they understand the business of their supply side, to build tools that make the supply side's life easier. You can't measure the ROI quickly, but if you have a million suppliers on your platform and one intern can save each of them an hour a week, that's massively efficient. If you think about it, you're creating this huge amount of surplus; you just can't quite measure it. But if you're capturing even a tiny fraction of it, of course the intern is worth it to fix the bug in your supply-side software. Yet across the board, at every marketplace I go to, you hear people say, "we can't get the resources to fix the bugs in the seller tools," while there are all these resources making the user interface fancy and flashy. So that's a pretty big mistake.
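The intern arithmetic holds up even under very conservative assumptions. The numbers below (the dollar value of a supplier's hour, the fraction of surplus the platform captures) are hypothetical, chosen only to show the order of magnitude:

```python
# Back-of-the-envelope: one intern's bug fix saves each supplier an hour
# a week. All parameter values are hypothetical illustrations.
suppliers = 1_000_000
hours_saved_per_week = 1
value_per_hour = 25.0   # assumed dollar value of a supplier's hour
capture_rate = 0.01     # assume the platform captures 1% of surplus created

weekly_surplus = suppliers * hours_saved_per_week * value_per_hour
platform_weekly_gain = weekly_surplus * capture_rate

print(f"surplus created per week: ${weekly_surplus:,.0f}")
print(f"captured by the platform: ${platform_weekly_gain:,.0f}")
```

Even at a one percent capture rate, the weekly value dwarfs the cost of an intern, which is the point about resourcing seller tools.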
AI is now going to come into that: we're going to see these marketplaces using AI to help make their suppliers' lives easier, to help the suppliers manage their businesses. At these conferences I'm seeing companies like Salesforce and Stripe building AI at scale on behalf of their customers, so it's going to be really interesting to watch those spaces and see this intersection of marketplaces and AI.

Susan, thank you. That was amazing. [Applause] [Music]