what does a beautiful future look like to you you can flip a negative mood on its head this was a talk that was all over the place but then new york you are all over the place there are experiences that you just don't get anywhere else when you go to google and type in climate change is you're going to see different results depending on where you live and the particular things that google knows about your interests that's not by accident that's a design technique what i want people to know is that everything they're doing
online is being watched is being tracked every single action you take is carefully monitored and recorded a lot of people think google's just a search box and facebook's just a place to see what my friends are doing what they don't realize is there's entire teams of engineers whose job is to use your psychology against you i was the co-inventor of the facebook like button i was the president of pinterest google twitter instagram there were meaningful changes happening around the world because of these platforms i think we were naive about the flip side of that coin
we get rewarded by hearts likes thumbs up and we conflate that with value and we conflate it with truth a whole generation is more anxious more depressed i always felt like fundamentally it was a force for good i don't know if i feel that way anymore facebook discovered that they were able to affect real world behavior and emotions without ever triggering the user's awareness they are completely clueless fake news spreads six times faster than true news we're being bombarded with rumors if everyone's entitled to their own facts there's really no need for people to come together
in fact there's really no need for people to interact we have less control over who we are and what we really believe if you want to control the population of your country there has never been a tool as effective as facebook we built these things and we have a responsibility to change it the intention could be how do we make the world better if technology creates mass chaos loneliness more polarization more election hacking more inability to focus on the real issues we're toast this is checkmate on humanity hi everyone and welcome i'm very excited to
have this incredibly important conversation with the people behind the new documentary called the social dilemma before we dive in and there's an awful lot to talk about a lot of issues that are raised by this documentary i wanted to first introduce our panelists and in the era of zoom if you all can wave as i introduce you just to help people know who you are uh which one is you jeff orlowski is the director he's also the producer and cinematographer of the award-winning films chasing coral and chasing ice tristan harris is a good
friend of mine i'd like to say i hope you think so too tristan he is the president and co-founder of the center for humane technology and a former google design ethicist he's been called the closest thing silicon valley has to a conscience uh tim kendall is ceo of moment hi tim he's also the former president of pinterest and the former director of monetization at facebook anyway cathy o'neil is a professor and entrepreneur and i hope you all notice that her hair matches is that your blanket cathy or your chair yes i needed my blanket yes all right it matches your blanket cathy is kind of a brainiac she earned a phd in math from harvard was a postdoc at the mit math department and is a professor at barnard college where she published a number of research papers in arithmetic algebraic geometry which makes me break into a cold sweat cathy i hope you know that anyway she um is in the film and has a lot to say meanwhile rashida richardson is a visiting scholar at rutgers law school and rutgers institute for information
policy and law where she specializes in race emerging technologies and the law so i'm thrilled as i said to be able to have this conversation this is an area that i've been interested in for some time so thank you so much for doing this and jeff let me start with you uh as i mentioned your past work includes chasing ice and chasing coral which really focused on climate change so i'm curious what made you want to turn your lens to social media and technology a very different subject in so many ways different in some ways
and very similar in other ways um i think our team has always been interested in big problems and big challenges and uh the existential threat of climate change was where we spent a lot of time and then when learning from tristan and others in the film about what's at stake here the seriousness of the way our technology is reprogramming civilization um we realized this was a huge huge issue facing society um ironically it's my own filter bubble that got me into this it was seeing posts from tristan and other friends and i was
in a small group of people that was hearing these conversations a couple years ago and it was that insight that said wait a second there's a much much bigger story here and we set out to explore that you know i've produced documentaries as well on big thorny social issues and i'm curious what some of the challenges were for you in actually putting this documentary together because i know it took three years jeff yeah it was so i imagine there were parts of it that weren't so easy right um it was a big project and a big undertaking when i first started speaking with tristan and getting tristan's perspective we started reaching out to other former employees from the companies as well and it was difficult to get people to just be willing to speak on the record i feel like i had to twist tim's arm a bit to get him to speak on camera and um it was challenging because i was very very curious about the perspective from the people who were inside the companies and that was sort of a foundational backbone and then from
that thinking we then reached out to rashida and cathy and others to kind of surround that insight and that knowledge um and to give us a perspective here we also really wanted the film to be very accessible to the general public um and so as a creative team we were really thinking through how do we bring it to life how do we make it interesting how do we make it accessible how do we get the public to think about this in a different way um as much as i love all of the brilliance in all of the talking heads right i say that with dear love and affection um not everybody wants to watch a documentary and we were really trying to figure out how do we make a film that a lot more people want to come and see and bring into the conversation and then have conversations like this to follow up and you know i think one of the great things about the film jeff is it really takes a deeper dive into these concepts that people may kind of sense but they're somewhat oblivious to because they are so
keyed in to the technology and what it allows you to do and the world it opens up to you they don't think of some of the repercussions and tristan you've been thinking about this for a long time as i mentioned you've been called the conscience of silicon valley and i think it's probably instructive for people to hear your backstory just a little bit and what made you flip the switch and say wait a second this is not a good thing that i'm doing um yeah thanks
katie it's good to see you here and thank you for doing this um you know i was at stanford studying for a computer science degree and my friends in college started instagram and i saw a lot of my friends were the same age i'm the same age as zuckerberg uh a lot of my friends when we were in college would talk about all these positive social impact driven things we wanted to do in the world with technology with computer science and i saw more and more of my friends get
sucked down the kind of rabbit hole of building these big technology companies that would get lots of engagement and growth and less and less of our choices had to do with hey how can we make the world better and more and more of our choices had to do with how could we keep people engaged and suck people in and i noticed that that just became this race to go deeper and deeper into human psychology to figure out a deeper way to manipulate our really lizard brain instincts um and the founders of instagram and i studied
at a lab at stanford called the persuasive technology lab where we learned many of these things and as the film talks about you know i had a background in magic and behavioral economics and kind of how is the mind fooled and i saw that more and more of it had to do with this trickery and that it would create this huge confusion um if we didn't as a tech industry come together and say we have a moral responsibility to get this right and we also have a moral responsibility to have regulation that can create the
incentives for technology companies to do the right thing uh which is not the case currently yeah well let's talk about it they play on this pavlovian response in many ways what you describe as the lizard brain and i want to use a couple of terms here by the way i encourage everybody watching this panel to really go back and watch the movie because jeff and everyone involved you all did a great job of really outlining the host of issues uh from addiction to polarization that really is part and parcel of the digital world but real quickly tristan can you just give us kind of an appetizer one of the concepts in the film is if the platform is free you are the product what do you mean by that exactly uh well you know if you just ask people how much have you paid for your facebook account recently and people think for a second and they realize they haven't paid at all well then how is it worth more than 500 billion dollars as a company and the answer is that we are
the product and so long as we are the product meaning advertisers pay so that we are influenced that means we're worth more if we're addicted distracted outraged polarized and disinformed because that means the attention mining model was successful we're worth more when we're kind of domesticated into this kind of hyper attention switching distracted addicted kind of human and i think that the big confusion is when we look in the mirror of technology we've been told these are just neutral platforms they're just showing us a reflection
in the mirror this is who you are you have a lizard brain but it's really a fun house mirror where it's amplified the lizard brain parts of ourselves we like to say since the film deals in conspiracy theories you know lizard people do actually walk among us and run the world it's just that we're the lizard people because it's actually taken our lizard brain and made that the pilot of our choices which is dangerous when you zoom out and say what it's doing to democracy and to society around the world let's talk about addiction though
before we talk about polarization which of course so many people are going to be interested in hearing about especially because we're so close to an incredibly important election tim i'm curious from you about addiction you yourself admitted that you have trouble putting your phone away you say when you know you should be spending quality time with your kids you're in a closet checking emails um and i know tristan has really examined the mechanisms that are used to addict people whether it's bright colors or you know and we can talk to cathy in a minute about ai and how that is contributing but tim from a monetization point of view how does facebook get us so we feel like we're going cold turkey when we're not close to our devices well i think you know it all started um i mean they prey on human weakness and i think the light bulb really went off for them when they invented photo tagging 15 years ago and they realized that oh if we let katie know that a picture of her just got posted to facebook we sent her an email she comes to the site 100 percent of the time and she stays for this amount of time and so the insight there was wow if we can prey on you know katie's social self-consciousness or kind of all of our phobias about a bad picture being posted online we can get incredible engagement and the dimensions on which they prey on our prefrontal cortex have just been added to right all the things that tristan just explained um and i think what's especially scary is that they've preyed on all these dimensions right our need for belonging our need to express anger our need to watch a car crash they preyed on all of these and there are only so many people in the world and there's only so much time but these services and their value is predicated on consistent and high rates of growth which basically means they have to get better and better at it which happens at our expense
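a minimal sketch of the engagement loop tim is describing, in python with invented event names and click-back rates (nothing here is facebook's actual code): the notifier just sends whichever pending event is predicted to pull the user back hardest

```python
# hypothetical engagement-maximizing notifier; event names and rates are invented
# assumed historical "comes back when notified" rates per notification type
CLICKBACK_RATE = {
    "photo_tag": 0.95,      # "katie, a picture of you was just posted"
    "friend_post": 0.40,
    "generic_digest": 0.10,
}

def pick_notification(pending_events):
    """send whichever pending event is predicted to pull the user back hardest"""
    return max(pending_events, key=lambda e: CLICKBACK_RATE.get(e, 0.0))

best = pick_notification(["generic_digest", "photo_tag"])
print(best, CLICKBACK_RATE[best])  # photo_tag 0.95
# note the objective: return visits and time on site, not the user's wellbeing
```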
well so what was it for you that said i'm sure you were making a boatload of money at facebook what employee were you by the way uh i think i was around 95 or something so you probably had you know a nice chunk of change working there but then you decided like i can't do this was there an aha moment yeah i mean i'd love to give myself credit for that that was 10 years ago i'd love to give myself credit for seeing the future 10 years ago but i didn't i just wanted to go to another
company uh pinterest in this case that did similar things i think we can argue it's doing less harm in the world by a pretty wide margin compared to facebook at the moment but the mechanisms are the same in terms of the algorithms being used to bring people onto the platform and have them spend increasing amounts of time by convincing them in the case of pinterest that they need things um and so you know my reckoning if you will came probably right when i had my first daughter so six years ago when i started to realize that my phone and the services on my phone were more interesting than my first child and notwithstanding sort of my core values which is that i wanted more than anything to be the best dad i could be i couldn't get my behavior in congruence with that and so that misalignment that sort of psychological misalignment was a red alert for me and i started to talk more about it in my job at pinterest talk more about it publicly and then ultimately decided to leave pinterest
to really try to help you know bring technology to bear to help individuals and families with this problem and that's what really motivated you to start moment yeah well there was a different founder of moment i joined an existing company and then became the ceo cathy um you explored the biases of algorithms a lot in your work um can you explain for people who may not be super well versed in tech sort of the whole concept of algorithms and the role they play in our lives
yeah and if you don't mind i'm going to answer the same question you asked tristan and tim which is how i saw the light because it'll also explain that i was a mathematician until i became a quant at a hedge fund in 2006 and i kind of just got this front row view of the credit crisis and i saw in particular if you remember the aaa ratings on mortgage-backed securities yeah that were essentially lies um those were algorithms those were risk algorithms trying to convince us to trust them and i was like that is a big old mathematical lie that really made a problem that was already a problem the housing bubble much worse especially when it exploded in front of us dramatically all of a sudden so i quit after a while you know i'll skip some details but i became a data scientist once again drinking the kool-aid of like oh now i'm a data scientist instead of a hedge fund quant i can do good with data but what i saw was these algorithms
that i was now using to predict humans instead of the market were also flawed and moreover they were very directly related to this concept that both tristan and tim have described which is making people feel like their self-worth is on the line i was deciding whether somebody was worthy of an opportunity online and moreover i realized that i'm good at my job i'm a good mathematician i'm a good data scientist and what i was doing was super dumb katie i was like how much are you worth where do you live are you a man or a woman i was deciding who was worthy but what i essentially was doing was saying you have a profile of a lucky person and i'm going to give you an option i'm gonna make you luckier you on the other hand you have a profile of an unlucky person based on your you know browser history and if you're unlucky i'm gonna make you unluckier i'm going to segregate the world into the lucky who i'm going to make luckier and the unlucky who i'm going to make unluckier and what i saw was that every data scientist was doing this we were all segregating the world and streamlining people into these paths that were basically propagating the past and it was going to create a feedback loop and lots of harm
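a minimal sketch of the feedback loop cathy is describing, with fabricated data: a score fit to who historically got opportunities keeps routing new opportunities to the same profile, so the lucky get luckier on every pass

```python
# toy illustration of "propagating the past", all data fabricated
from collections import defaultdict

# invented history: profile "A" got past opportunities more often than profile "B"
history = [("A", 1), ("A", 1), ("A", 0), ("B", 1), ("B", 0), ("B", 0)]

def train_score(history):
    """score each profile by its historical rate of getting the opportunity"""
    wins, total = defaultdict(int), defaultdict(int)
    for profile, got_it in history:
        wins[profile] += got_it
        total[profile] += 1
    return {p: wins[p] / total[p] for p in total}

for round_num in range(3):
    score = train_score(history)
    cutoff = sum(score.values()) / len(score)   # offer only goes to "lucky" profiles
    for profile in score:
        history.append((profile, 1 if score[profile] > cutoff else 0))
    print(round_num, {p: round(s, 2) for p, s in score.items()})
# A's score climbs toward 1.0 while B's sinks toward 0.0: the model is not
# measuring merit, it is replaying and amplifying the past
```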
can you give me real world examples cathy i'm sort of following you but not exactly you know can you give me a real world example that might help our viewers sure yeah so i was working in ad tech which is basically what facebook is built on so i was deciding who gets sort of an offer in the world of travel expedia cheap tickets that was the world i was working in but the same exact methodologies were being used in insurance like based on your profile you might get diabetes i'm not sure how much this health insurance should cost based on your profile people like you didn't always pay back their loan so we're gonna charge you extra and moreover people like you were willing to pay more for car insurance even if you wouldn't actually represent a higher risk but if you're willing to pay more we're going to charge you more so there were all sorts of ways that old school scamming was becoming digitized and moreover it was getting legitimized because we all trusted big data we all trusted algorithms it was basically taking advantage of people and actually preying on them depending on their socioeconomic status and past behavior right 100 percent in fact a vc came to our firm and again it was just a travel industry ad tech firm and he said to the entire
company you know i'm dreaming of the day and he was an architect of the internet i should say because he was a vc deciding who to give money to in the world of ad tech he was like i can't wait for the day when all i see are ads for trips to aruba and jet skis and i never again have to look at another university of phoenix ad because those aren't for people like me and i was like oh wait who are those for actually and i looked into it and i was like single black mothers who are poor enough to qualify for financial aid and who don't know the difference between a private college and a for-profit college so they're literally preying on people who you know are trying to make their lives better for their children it was absolutely predatory but that was actually his vision his intention and so yes katie it was really demographic it was very related to race and gender and money so i was also concerned with lots
of different algorithms and predictive algorithms that were sort of representing really important decisions in people's lives like what i said college admissions getting a job getting insurance getting a loan and i was also concerned with political information and that's where i came to facebook and social media because i was studying the extent to which political campaigns know so much about us that there's this asymmetry of information they know more about me than i know about them they get to decide which of the 20 things i agree with most that they want to tell me about and even if i go to their website the cookies follow me they still have my profile they can still show me the one thing their candidate agrees with me on and ignore everything else and what's worse and this is something that i only partly predicted when my book came out in 2016 before the election is they don't even have to give us information they can literally give us propaganda to prevent us from wanting to vote in the first place they can literally commit voter suppression and we did see that and they could predict who it's going to work on so what's worse katie is that people like you who are journalists and would know better and would recognize this as propaganda they're not going to show that to you they have a score for every user about how likely that user is to object to what they're saying and they're not going to show that to people who would know better so it was really pernicious and i needed to say something about it so i quit my job in data science and wrote that book i mean i'm just
saying like the algorithms that we built on a daily basis were definitely making inequality worse and they were definitely destroying democracy rashida i was going to ask you about that since you really are a student of this intersection of race technology and law how do you see it what kind of societal effects do you see when it comes to these big internet companies and this you know information age really polluting society and exacerbating some of the biggest social ills that we face today yeah so we're seeing a lot of compounding
of probably the ugliest parts of society happening in any setting where data is being used to generate outcomes and that's definitely part of the process within um social media platforms so if you accept as the ground truth that we live in a socially inequitable society if you're using data that's reflecting our society it's going to amplify that but then when you compound that by the fact that many of these algorithms and systems are being designed by a very homogeneous and small minority of people mostly white men then their world view is also being imported into
these systems and a lot of the imbalances that cathy just outlined as in who is lucky who's not lucky who's deserving of opportunity or who's deserving of certain benefits that's all playing out at a global scale and not just on one platform but multiple so it really skews our view of reality in a way where you may see and consume information in a way that's confirming your world view and making you think oh we live in an equitable society but in reality the complete opposite is happening and it also serves to worsen what the status quo already is in many ways in that if we stick with the advertising examples if you're advertising jobs and you're using an algorithm that has a racial and gender bias then that's going to further compound who's getting the higher paying jobs or who's getting better opportunities in society and that's happening across many different domains and in general it's an interesting area i came into this because i was working on civil rights issues and saw that big data tech and ai were all sort of infiltrating all
of these issues and compounding them making the issues that i was trying to help fix harder to fix um and i think we're seeing that across the world whether we're talking about voting rights issues school segregation or even equitable employment these technologies are in many ways just worsening what's already not great to start with i remember exploring this when i did a national geographic hour on this and talked to tristan about it and they were just starting to realize how biased ai was for that very reason rashida because of the people who were creating
this intelligence it was from their real world view would the problem be mitigated in any way shape or form cathy and rashida and anybody else can chime in if in fact there was more diversity among the people coming up with these algorithms or the artificial intelligence because you know i remember typing doctor into google and looking at the images and at the time they were all pretty much white guys in white coats and did not show diversity and that was reflecting the people who were programming the ai so how much of the problem would be solved cathy you first if we had a more diverse group of people who were doing this and it seems to me that it would help but the problem is even much bigger than that right yeah um it's a great question you know i actually run an algorithmic auditing company now and so this is exactly the question we ask our clients how do we get a lot more diverse viewpoints into the design and the implementation of this algorithm um and what i've noticed katie is
that you don't actually need to have them all in the data science team the data science team honestly has been way overloaded they have been in charge of not only coding but basically making ethical decisions deciding on all sorts of things that they really have no business or expertise to decide so the way i have framed it in my company is you start with the list of stakeholders who cares about this algorithm if you have customers say it's a mortgage lending company the customers care what does it mean for them for this to be successful or a failure if you deny people mortgages unfairly those are a stakeholder that you have to bring into this conversation and it's in fact quite obvious that people of color black people especially have been redlined from mortgages so you should specifically talk to people representing that stakeholder group about what it would mean for this mortgage company to fail and so you get their opinion you bring them into the conversation and this is before the algorithm is built or at least before it's deployed and the idea is that the data scientists instead of making these decisions for the sake of everyone without having any historical information or knowledge about how mortgages have historically been denied to people should be told after the fact after the values have been laid down here's what the values are you have to make these tests to make sure they're fitting the criteria that we want and your job is to translate those values into code not to decide on those values if you will
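a minimal sketch of what translating stakeholder values into code could look like, assuming the stakeholders chose a demographic-parity criterion with a hypothetical 5 percent tolerance; a real audit would test whatever failure criteria the stakeholders actually set

```python
# hypothetical pre-deployment test: stakeholders set the value, the data
# scientist only translates it into a check the model must pass

def approval_rates(decisions):
    """decisions is a list of (group, approved) pairs; returns rate per group"""
    totals, approved = {}, {}
    for group, ok in decisions:
        totals[group] = totals.get(group, 0) + 1
        approved[group] = approved.get(group, 0) + int(ok)
    return {g: approved[g] / totals[g] for g in totals}

def passes_parity(decisions, tolerance=0.05):
    """block deployment if approval rates across groups differ beyond tolerance"""
    rates = approval_rates(decisions)
    return max(rates.values()) - min(rates.values()) <= tolerance

# fabricated mortgage decisions: group Y approved far less often than group X
sample = ([("X", True)] * 80 + [("X", False)] * 20 +
          [("Y", True)] * 55 + [("Y", False)] * 45)
print(approval_rates(sample))   # {'X': 0.8, 'Y': 0.55}
print(passes_parity(sample))    # False -> do not deploy, revisit the model
```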
so the answer is we absolutely must realize that algorithms are everywhere we don't have humans making decisions anymore college admissions offices in the next few months are going to be decimated by budget cuts they're going to be turned into essentially algorithms and the question we have to ask is well for whom are they going to fail who gets to decide what it means for this to work or fail and we have to make that a larger and broader conversation and tristan i mean you were kind of the ethics guy
until you weren't right why don't these companies have any interest in having someone present or a team of people present talking about the larger societal impacts of some of these technologies is it because they don't care yeah i think that you know i agree first of all with everything that cathy just shared it's heartbreaking when you see the scale of harms that are being generated in places where there's no representation i think a good example of this is i think it's the case that seventy percent of facebook users are outside the us which means that the assumptions of the design team that are sitting there in menlo park are going to be incomplete or incorrect 70 percent of the time um then you have the issue of these technology companies really aggressively going into markets in places where there isn't infrastructure they're going into myanmar different countries in the african continent and they're actually establishing the infrastructure so they're saying hey when you get a cell phone we're going to do a deal with the telecom so you get a cell phone and you get a phone number and you
get your first phone in myanmar facebook comes built into the phone you get a facebook account and this is important it actually crowds out other players who could be there so for example you go into a country that didn't even have internet before and suddenly all the content that's generated in those local languages is actually all on facebook there aren't you know 20 other online publications who can actually compete with what's on facebook all that material is there and then you have a situation like in myanmar where the government spreads fake news that starts to as is described in the film go after the rohingya minority group and there's no counterbalancing force in fact in that case in 2013 or 14 there were only four burmese speakers who could even do the content moderation or have any idea what's going on in that country now zooming out you say okay facebook is managing about 80 elections per year so we all care about the upcoming u.s election we care about how well facebook deals with some of the issues that
are present now in this country but then you zoom out and say do they have teams that speak the languages of all the countries that they have colonized and it's a form of digital colonialism when they create infrastructure where they don't even have the staff in those countries and the people who are representative and so across the board i would also just add in every case if you don't have expertise or the stakeholders as cathy said where the people closest to the pain are not closest to the decision making or closest to the power that's a huge problem and i wanted to add katie if i could just add there you just said something about the people inside the companies and kind of their intentions and one of the things from my experience in having these conversations over the last few years i think there are a lot of people at the companies that do mean well that are well intended yet they're stuck with this problem they're stuck with this business model there's a phrase in coding that i've heard inherent vice so it's like even acknowledging the
problems of the code they still built around those problems and now they've grown to a scale that is affecting all of society so even if they wanted to make changes you have to overturn the entire business model you have to overturn the entire valuation by the stock market so when you talk about getting diversity of opinion into these companies diversity in a lot of ways is needed but i think we also need to really question the fundamental business model of what's driving these companies i often make a comparison to the fossil fuel industry and the
fossil fuel industry seemed really awesome to humanity when we started it and it gave us great power and opportunity but years later we're seeing the consequences of that and the same thing with our social media technology and this business model of attention and this targeted advertising business model like we need to figure out how we can flip this model upside down and create financial models that work in alignment with people in alignment with humanity and society and can take all of these factors into consideration when really aligned with the public good and tim
why can't they change their business model i mean how much money does mark zuckerberg need well i think it's a little more complicated than just his wealth but i think the trick is it's what jeff said you know the business model is advertising and the value of the company is multiples of the amount of revenue and it's just predicated on that revenue continuing to grow and grow and grow and so if they were to map a path where they were to segue off of that business to something else let's just say asking users to pay so much value would be destroyed the company probably wouldn't be recognizable or at least i can't come up with a creative way for how they would do that of their own accord in a way that wouldn't just destroy hundreds of billions of dollars of shareholder value um i just don't see a tractable path which is why i think that governments are going to have to play a role and i think we as individuals are going to have to play a role
in terms of having our own reckoning and applying our own pressure to these companies speaking of a reckoning you know rashida we have seen just this phenomenal social justice movement unfold in the last six months or so following the murder of george floyd and of course ahmaud arbery and so many other instances that we almost hear about on a weekly if not daily basis and i was curious you said that these technologies make it harder for you to do your work and i'm curious if you can explain why yes so i guess part
of explaining why i made that comment is because a lot of my work has been looking at not only commercial uses like social media platforms but also uses of algorithms and data-driven technologies within the government sector so i've written about the use of predictive policing and risk assessments in the criminal justice system which are just accelerating the racial disparities within that sector um but they're often and can you explain what that means rashida for people yeah so predictive policing is a technology that relies on police and other data to predict who may be a victim or a perpetrator of a crime or where a crime may occur and risk assessments are actuarial tools that attempt to use historical data and perform statistical analysis to then predict decisions for similarly situated individuals and they're usually used by judges so in both of these examples they're technologies that are relying on historical data and a really racially biased and flawed system to make predictions about the future and they're often adopted under the guise that they're more objective or fairer or impartial when the reality is they're just further concretizing the inequities that already exist in our society and then giving them a gloss of fairness on top
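a minimal sketch, with entirely fabricated numbers, of the dynamic rashida is describing: a risk score fit to historical arrest records learns past enforcement patterns and hands them back as an objective-looking prediction

```python
# toy risk score on fabricated records: neighborhood "north" was patrolled twice
# as heavily, so it has more arrest records regardless of underlying behavior
records = ([("north", 1)] * 40 + [("north", 0)] * 60 +
           [("south", 1)] * 20 + [("south", 0)] * 80)

def fit_risk_score(records):
    """historical arrest rate per place, dressed up as predicted future risk"""
    counts, arrests = {}, {}
    for place, arrested in records:
        counts[place] = counts.get(place, 0) + 1
        arrests[place] = arrests.get(place, 0) + arrested
    return {place: arrests[place] / counts[place] for place in counts}

print(fit_risk_score(records))  # {'north': 0.4, 'south': 0.2}
# acting on the score sends more patrols north, which produces more arrest
# records there, which raises north's score again: a feedback loop wearing
# a gloss of statistical objectivity
```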
um so when i give those examples the challenge is you're not only dealing with the mythology that comes along with anything related to math tech science that we have in society but you're also grappling with these deeper structural issues that we really haven't learned how to deal with as a society and trying to tackle them all at once um and then if you take that out and look at the private sector you're dealing with similar issues like we've talked about misinformation hate speech is another
issue these aren't new issues in society but the design of these platforms is made so that they amplify these issues and we're also dealing with a society that hasn't dealt with those issues at least legally in an equitable or manageable way to date so we're asking technology companies to solve problems we can't really solve and then we're integrating similarly flawed technologies into our social systems and public systems that already have deep cracks in them and then asking them to solve those along with the tech issues on top what do you make of it rashida when facebook says that we have ai that can um you know track and stop things like hate speech and misinformation is that a legitimate stance for that company to take i mean do they really have that because it seems and tristan you and i have talked about this you know when there's a piece of misinformation out there by the time it's corrected it's traveled through thousands if not millions of people my daughter actually her job involves correcting misinformation on facebook she's very very busy but i always say
gosh by the time you can evaluate it um it's been seen and shared over and over and over again so can their ai really stop things or is that just a pr thing it's pr tech isn't going to solve the tech problem i think that's just a fundamental thing to understand but also like i said these are very complicated issues in that if there isn't a clear roadmap toolkit or rubric for individuals to follow it means they're making these subjective judgments at scale and with no clear guidance on how to make what are really consequential
decisions um so i do think it is the responsibility of facebook twitter and these other platforms to monitor and understand how their platforms are contributing to these problems but i do think it's misguided to suggest that ai is somehow going to solve or be the silver bullet solution to problems that are amplified by their technologies but tim how do they do that when there's so much information going through these pipes you know how do you keep up with it i guess isn't that sort of the nub of the issue and you know i'm
particularly interested as the election is just around the corner so much false information so many just complete blatant lies that are being consumed and processed and then shared and they're so powerfully influential you know and even friends of mine i can tell when they've been fed stuff over and over and over again because we sometimes can't even have a conversation because i know the algorithms they might have read something and then they just become this repository for like-minded content and then i can't really have a conversation with them and it's very frustrating you know yeah i mean i'm not sure exactly what the question is but i'm on the receiving end of that too and i think it does go back to jeff's point and the other points that folks have made about the incentives look misinformation is really good for their business the spread and engagement of incendiary content is terrific for their business why because it plays to the attention economy idea and it keeps us hooked tristan should talk to this because he'll give the great wrapper around it but yeah i mean it just plays on our weaknesses you know it plays on our animal brain at the lowest level and that's really good for business because when that part of my brain is being tapped i go into an unconscious mode where i am sucked in and i spend an inordinate amount of time and the more time i spend on that platform the more money they make from me and if i really become a convert i
help spread it to other people and then we make even more money from them tristan you want to add to that um yeah i mean just to sort of sum up where that leads to and to double down on what tim is saying i think jeff uses this line that you know this is a polarization for profit business model because let's say facebook has two possible versions of the news feed they could show let's imagine one news feed called the challenge feed and so everything you see when you swipe it challenges and expands and adds more nuance to your view of reality every single swipe you make and the other feed is called the affirmation feed it just shows you why you're right you're right you're right you're right you're even more right than you thought here's even more evidence about why the other side is even more crazy more whatever it is that you hate which of those two feeds is going to keep you coming back right the affirmation feed right
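a toy version of tristan's thought experiment, with invented numbers: nobody has to write an affirm-the-user rule anywhere, because a ranker that simply maximizes predicted engagement converges on the affirmation feed by itself

```python
# toy feed ranker; titles and engagement scores are invented for illustration
posts = [
    {"title": "you were right all along",          "predicted_engagement": 0.9},
    {"title": "the other side is even crazier",    "predicted_engagement": 0.8},
    {"title": "nuance that complicates your view", "predicted_engagement": 0.3},
    {"title": "evidence against your position",    "predicted_engagement": 0.2},
]

def rank_feed(posts):
    """an engagement-maximizing ranker: sort purely by predicted engagement"""
    return sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

for post in rank_feed(posts):
    print(post["predicted_engagement"], post["title"])
# affirming content engages more, so it fills the top of the feed by default:
# the "challenge feed" loses without anyone ever deciding against it
```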
and so i think the most nuanced point in the film that is so critical for people to get is that these technology systems have taken this shared reality and put it through this paper shredder where we each get our own three billion truman shows each one of us has our own reality and if those realities are not compatible with each other which they're not because the way i get you your information is to show you facts that are of a completely different kind and frame than the other facts it makes it impossible to have a conversation and if we cannot have consensus and agreement when you
disagree with someone you can't just say i don't want you on the planet i don't want you voting anymore so for anything we want to do whether it's climate change racism inequality any issue we want to tackle it depends on us having a shared semi-reliable consensus reality and we need that to be able to do anything which is why i think this is the existential threat that undergirds all the other issues that makes our already difficult problems unsolvable now the thing that gives me hope is that the film will create a shared truth
about the breakdown of that shared truth so instead of being caught in the mess of all of it we actually have a shared conversation about the breakdown of our shared conversation and that's an empowering place to stand in fact my favorite thing that i think jeff recommends people do after watching the film is you watch it with other people who you politically disagree with and at the end of the film you both open up facebook on your phones and then you reality swap you trade phones and you see how if i was living in their feed you know if you saw my feed you'd see like climate apocalyptic news and u.s china escalating relations and you'd say oh my god it makes sense why tristan looks terrified all the time you have a deeper empathy for where each of us is coming from and i think that's probably the biggest thing that we can do because honestly the companies are not going to be able to magically change all of these things as tim just said not only is it against their shareholder pressure and their business model but even if
they put 100 percent of their resources to solving the issue the growth rate of the harms the conspiracy theories the mental health harms the degradation of our public square has far exceeded any of the product changes and developments that they can make so we're left with culture and the thing that gives me hope is that this film being simultaneously released in 190 countries and in 30 languages has shown i'm getting about 100 messages per hour on instagram from people all around the world saying oh my god this is why we got the five
star movement in italy this is why we got bolsonaro in brazil and i think that shared context that collective global will is what we're going to be able to use to hopefully create a government response and also wake up the public that we've lost our shared conversation before we talk about uh i'm gonna go around the horn and get solutions but um you know it seems to me that this is gonna be a real problem for the election uh we're hearing about russian interference again we're hearing obviously about these echo chambers and how people are getting affirmation not information you know in my business cable news is sort of following this model right with appealing to one side or the other engagement through enragement as my friend kara swisher says so you know is anything being done i think it's sort of really lip service what's being done before the election can you explain who wants to explain what facebook is doing before november 3rd and what impact it will actually have anybody um i can't speak to the specifics of what facebook is doing but in
my mind it's political ads the week before the election right but i feel like anything that they do is a band-aid solution like they're plugging holes in a sinking ship and this is not actually addressing the fundamental problem just like with climate change we have broken down our environmental ecosystem and these platforms have broken down our information ecosystem and we're literally just re-warping and reshaping our sense of truth and understanding and shared conversation um and just like with climate change even if we solved it right now we would have a decade or more of inertia from the carbon buildup like the same problem exists here in my mind with our information breakdown where these are not easy things to solve or undo this is like we've been living through this mind warp for a period of time now so the idea of solving stuff between now and the election is kind of a moot thing in my mind we can take little band-aid steps but already we've been fractured into different ways of seeing the world that are not going to undo quickly right in fact
you know a lot of these issues really haven't changed where people stand i mean they're pretty deeply entrenched i interviewed nate silver and he said there's been very little movement from one camp you know to another people are kind of sticking with their tribalistic instincts so you can tell that they are being you know fed but what's frustrating to me is that people don't trust the media and people don't even understand what truth actually is and what facts actually are um everything is being called into question and for somebody who's been
a journalist for a long time that's exceedingly frustrating cathy what are we gonna do about potential russian interference just kind of like say oh well i guess we're gonna have to live with it you know it is really difficult and i agree with you that it is trust itself that has been injured um and not just trust about any particular topic but trust in the concept of authority and expertise i'm writing a book now about how shame plays into all this because i think that the sort of fuel that divides us and that makes us more and more tribal is literally shame and wanting to be outraged and shame the other side um so it is really hard and i agree with tim that it's not going to happen overnight even if we turn off facebook tomorrow which i really wish we could do i do want to say though that there are you know smaller easier problems to solve that would go a long way in some of the economic aspects of this so i mean to be clear there are as i said before
algorithms deciding almost any bureaucratic choice that we go through anytime we touch a bureaucracy it's an algorithm now and they are not even being asked to comply with existing laws that's how outside the system that's how blindly trusted algorithms have been in the last 10 15 years as they've popped up everywhere so if you don't mind i'm going to say i don't know how to solve this big problem of facebook democracy and trust but i do want to say that we can solve smaller problems that are embedded inside this larger problem like mortgage loans like college admissions like who gets a job um you know to rashida's point earlier it's all about the historical propagation of bias so let's ask linkedin to show us exactly how they matchmake between people looking for jobs and people who are looking to hire because i doubt that's ever been vetted and that's an algorithm that i think we can all agree will be extremely important in the coming months as we recover from this great depression when all these people who have lost jobs need new jobs they're going to go online and they're going to be told which jobs exist by an algorithm and you know are they going to be shown all the jobs no they're going to be shown the jobs that the platform they're on decides they deserve do you see what i mean so this stuff is both economic and civic we've been talking about the civic side of it like what does it mean to be an informed citizen in the age of misinformation that's a tough problem but we can talk about who
deserves a job and let's make sure that works and i don't think that's irrelevant to this question of a civic minded nation we have to actually have an economic existence in order to get back to trust so what do you think the solution is tristan you said watch the film trade facebook feeds um that's a really nice idea i hope it happens but i think you would concede that we need more well katie there's a short term and there's a long term um long term we need to move to a totally humane and just and equal and fair digital infrastructure and as cathy and rashida have pointed out there's a quote from marc andreessen that software is eating the world which is that technology is replacing every aspect of our physical infrastructure and our physical decision making we're going to hand it over to the machines so we have to have and i think cathy has said this something like an fda for algorithms or some notion of responsibility for the ways that these things are steering society fundamentally and that's a bigger conversation when
it comes to the shorter term and we're talking about the election you asked katie about what are we going to do about russia you know i have friends right now at the stanford cyber policy center who are tracking multiple networks of hundreds of thousands of accounts that can be activated overnight in the days or weeks leading up to the election and drop fake news it's important for people to realize first of all it's not just russia it's china iran saudi arabia everyone's in on the game now the kgb officers used to spend 25 percent of their time manufacturing disinformation stories that's how you were successful at your job 25 percent of your time was inventing plausible things that could happen so now you imagine while we've been obsessed with protecting the physical borders of this country and we have a department of defense and we spend billions of dollars a year on that while we're protecting our physical borders you know if russia tries to fly a plane into the united states it's going to be shot down by the department of defense but if they try to fly an information bomb into the united states into facebook they're met with a white glove and an advertising algorithm that says yeah exactly which zip code or conspiracy theory group would you like to target and i think we have to realize that our digital borders are completely unprotected which means that people have to exercise a totally new degree of cynicism and skepticism about what they read on facebook except perhaps if it's from their closest friends that they know um and really it's not just the
news by the way it's the comment threads russia they work in threes so they'll actually control the way that a conversation goes and they'll generate conflict to try to steer it in one way or the other i think that we fundamentally cannot trust the information that we're seeing and we need a cultural movement a global cultural awakening for this understanding these are not authoritative platforms which doesn't mean don't trust anything it means that we have to recognize that these technology platforms are not trustworthy places to get information and i really hope that the film does
deliver that message it's perhaps the one thing we can agree on that these things have actually torn us apart um and especially on a bipartisan level how it's eroded the mental health of our children which everyone agrees is an enormous problem and it's not a partisan issue that's the one thing that gives me hope i thought that was really well done with the young girl who um you can see that she's just crushed when somebody makes fun of her ears and you know even adults get crushed when people say mean things
about them and you can only imagine for a young developing person who's trying to establish his or her sense of self how horrible that is and i think you know my daughter who's 29 says gosh i wish i grew up in the 70s when there was no social media because that's when i grew up and i'm like i hear you it was hard enough as a teenager in the 70s to you know get through it relatively unscathed i don't know how you do it today could
i share one thing on that katie before i know you're probably wrapping up but we're evolved to care when people don't approve of us or don't like us right because that's an existential issue for survival in a tribe if the people in your tribe don't like you or they're saying negative things about you our attention is going to be sent to that instantly and it also sticks with us so we'll end up revisiting that over and over and over again in our minds that negative thing that that one person said i've noticed that even with the film coming out about 99.9 percent of all the feedback is incredibly positive but of course what does my mind hold on to it's that 0.1 percent of people who hate it who said the worst possible things about me about anybody else hey tristan welcome to my world yeah well this is the thing i think katie that celebrities and public intellectuals and public figures in society have dealt with this for a long time but now we've subjected every person to the masses right and it's this big gladiator tournament where if you say one thing that's wrong and there's sort of this context collapse um it's never been easier to see infinite evidence clicking clicking clicking deeper down the rabbit hole of people who want to hate on you and i'm especially concerned about how that's affecting teenagers because they're so vulnerable as you said it affects all of us but it affects young people the most rashida clearly it sounds like you know we need the government to act and then i look at what's going on with the government right
now and it seems to me that more often than not they do nothing well not only do they do nothing but they actually seem to be exploiting the divide and polarization that is resulting from all the things we've been discussing so i guess my question to you is should the government get involved and what do you think the appetite is for some kind of oversight some kind of regulation yes the government should get involved but i think where you go from answering that becomes more complicated because we're not talking about just one issue you could have civil rights enforcement antitrust enforcement privacy enforcement and even when i name those three categories we don't necessarily have regulatory or legal frameworks for addressing the exact concerns we're talking about so we are grappling with very complex and nuanced issues so i do want to give people in government some credit that they're trying to learn in real time and figure out what to do but there's also a range of actions that need to be taken because we're talking about companies that are not only functioning as communication technologies
but also advertising companies so if you take a sectoral approach which do you do or can you do both um and then i think we're also dealing with as i said earlier a compounded effect of societal problems that have always existed that we never have really wanted to grapple with um so in some ways we also need more robust and structural reform to address some of these issues and i do think there is an appetite or at least an understanding that something needs to be done but i think what's happening not only here in the us but internationally is that every government is trying to grapple with where to start because we're dealing with so many different issues and it's hard to prioritize one certain fix over another or figure out how many different forms of regulation can work together to address the myriad of issues we've discussed today and in the film but rashida when it's operating to the benefit of some of these authoritarian governments that are popping up all over the world um it seems to me that they're not going to be really jonesing to address these problems
because they've come to power as a result of the the problems ostensibly they should be fixing right you guys yeah you are dealing with some governments that have a self-interest in maintaining the status quo and that's not simply an authoritarian regimes but also in democracies but you're also dealing with the problem that the majority of these companies are based in the united states and are u.s companies and therefore regulated by our lack of laws here um so you're dealing with the need for national reforms for how issues are um affected within certain jurisdictional boundaries but
also just a global problem with global-scale companies and the inability for us as a globe to really understand how to regulate and deal with issues that are not isolated within the borders of one country right tim and jeff what do you think are you optimistic that something's going to be done and if you could wave a magic wand tim good luck how would you fix this huge problem that as rashida said is so multifaceted yeah i think philosophically i'm more focused on bottom-up
probably because i'm just more comfortable with that which is starting with the individual when i think about global systemic problems like climate change or even cigarettes the evolution of cigarettes from something that was supposedly good for you and that doctors endorsed to now where it's absolutely socially rejected i think two things happened people had a reckoning with their own contribution in the climate change case do i have an suv in my driveway what is my carbon footprint am i contributing to this problem
that i now feel stronger and stronger about and what changes can i make so that is all to say that at an individual level one of the things i spend my time on is trying to show people a really tight feedback loop about the impact the phone has on them the thing that happened with cigarettes is that it became really clear they were going to kill me and the thing that's happened with sugar is that it's become really clear i'm going to get diabetes and have a lot of problems beyond that but with the compulsive and addictive usage of phones right now we don't have that feedback loop we can't see the size of our brain changing even though there's very clear data that it does we can't see the onset of depression and anxiety even though there's very clear data that it does so if i had a magic wand i would try to create transparency around that feedback loop so we knew incrementally when i spend four hours on my phone tonight what the cost is individually
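to make tim's feedback-loop idea concrete, here is a minimal sketch in python, assuming an invented nightly report that compares the day's screen time to a self-chosen goal; the function name and the 60-minute goal are hypothetical placeholders, and this is not moment's actual product

```python
# a toy sketch of the feedback loop tim describes, not moment's actual
# product: turn today's raw screen-time number into an immediate,
# legible signal, the way a warning label does for cigarettes.
# the 60-minute goal is an invented placeholder.

def nightly_report(minutes_today: int, goal_minutes: int = 60) -> str:
    """Summarize today's screen time against a self-chosen goal."""
    hours = minutes_today / 60
    excess = max(minutes_today - goal_minutes, 0)
    return (
        f"screen time today: {hours:.1f}h, "
        f"{excess} min over your {goal_minutes}-min goal"
    )

# e.g. the 18 hours 13 minutes mentioned later on this call:
print(nightly_report(18 * 60 + 13))
# screen time today: 18.2h, 1033 min over your 60-min goal
```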
i think katie my optimism comes from my pessimism the same as with climate change i think things are going to get worse and at some point we're going to push back we as a society are going to realize no this is not acceptable this is not how we want to move forward and the stakes are so high and it's so apparent to everybody that we say no we need to about-face we need to change the system and whether that comes from the goodwill
of the fossil fuel industry whether that comes from the goodwill of the tech industry or whether that comes from regulation those are the only paths forward that i see the status quo is unsustainable it will just continue to break down and rip apart our society in a way that the public will not go for and my hope here is that the film and countless people like everyone here on this conversation and many more working in the industry can continue to raise the alarm and talk about the problems and
define the problems so that we can have a shared definition of this is what we need to address this is what we need to fix can we agree on these basic facts so that we can look towards the solutions and what those solutions look like and then i'm super eager to hear from rashida and kathy and many many more people about what meaningful policy looks like that's so beyond my expertise and my knowledge but i'm hungry and eager to hear it i want somebody to just put together
a list of let's do these things and if we do these things it's a step in the right direction while you're completing that thought i just looked at my screen time and it says 18 hours and 13 minutes tristan is that really bad more than any of us you guys i'm writing a book and i use this to write so my social networking is two hours and nine minutes that's bad too huh i think we have to be careful i especially want the parents out there to not be too hard on themselves because
what makes the situation inhumane is when we have a systemic problem for which we only have individual solutions you see this massive problem and then the burden is only on what i can do or look at my screen time just like i wouldn't want you to feel bad about how often you use your arm if you had a timer counting how many minutes or hours a day you use your arm the slab of glass that we have here is not by itself evil i'm more worried about the broader climate change
of culture the systemic issues that are really really bad and i think we need a full-court-press solution we need individual actions that we can take and a cultural movement around that which i hope the film starts but never without the broader link to how do we change the system so for everybody whether they download moment or turn off notifications or take their whole school and say you know what as a school we're going to move
all the kids off of tiktok or instagram together because one thing parents should know is that it's not about an individual taking their teenager off of instagram when all my friends are still on instagram still talking and that's where all the homework and dating conversations are it's not a viable solution we need a group migration so as we do all these individual things each of those people should stay part of this conversation they can do it through the center for humane technology or any one of these other groups but
they need to become part of the movement to ask for and demand much greater regulation and change we don't want a world where it's just the individual we want everyone to do the individual things and to participate in demanding the more systemic change that we need on an earlier call i was on today someone asked can we put the genie back in the bottle well there's an interesting story about tylenol in the 1980s where they literally did just that they found there was poison in the tablets
and they could have lied about it and said you know what actually there's no poison we just have to make sure our stock price keeps going up we're going to pretend it's not happening meanwhile people keep dying and they keep putting the tylenol out in this case they quite literally put the genie back in the bottle they took the product off the shelf until they invented the tamper-proof container and with that invention they put it back on the market and because they were transparent and honest about the problem at first their stock
price dropped dramatically but then it actually went back even higher than it was before so i think there are several things short term to stop the bleeding we need them to do everything they can including turning off algorithmic amplification not recommending facebook groups twitter can untrend october which is one proposal to turn off trending topics which are easily gameable by foreign actors there's a long litany of these kinds of things and i think the diversity of the solutions needs to be reflected in the people not just on
this call but the broader community that said this is going to take a long time it's one of those multi-stage things just like climate change and i think kathy should probably chime in yeah i just wanted to mention that i wrote a piece for bloomberg that just came out a couple of days ago about tiktok's algorithm there's a whole debate about who gets to own the algorithm even if tiktok is sold and i made the obvious data science point that recommendation engines can be manipulated i outlined how if i controlled tiktok's algorithm and i knew there was an anti-vaxxing viral video or a cluster of viral videos around anti-vaxxing i could amplify those in the recommendation engine or i could de-emphasize them and i think of that as a good thing personally because i know that meetup.com ad hoc changed their recommendation engine to make it less sexist and that's something i want to see
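to make kathy's data science point concrete, here is a minimal sketch of the lever she describes: whoever controls a recommender's final ranking step can amplify or de-emphasize a whole cluster of videos with a single multiplier; every name here (Video, CLUSTER_WEIGHTS, rank_feed) is hypothetical, and this is not tiktok's actual system

```python
# a minimal sketch of kathy's point, not tiktok's actual system: whoever
# controls the last ranking step can boost or bury any cluster of videos
# with a single multiplier. every name here is hypothetical.

from dataclasses import dataclass

@dataclass
class Video:
    id: str
    base_score: float   # relevance score from the upstream model
    cluster: str        # e.g. "anti-vax", "dance", "cooking"

# the operator's thumb on the scale: >1 amplifies, <1 de-emphasizes
CLUSTER_WEIGHTS = {"anti-vax": 0.1}

def rank_feed(candidates: list[Video]) -> list[Video]:
    """Order candidates by base score times the operator-set weight."""
    return sorted(
        candidates,
        key=lambda v: v.base_score * CLUSTER_WEIGHTS.get(v.cluster, 1.0),
        reverse=True,
    )

feed = rank_feed([
    Video("a", 0.9, "anti-vax"),   # high organic score, weighted down to 0.09
    Video("b", 0.5, "dance"),      # now outranks it
])
print([v.id for v in feed])  # ['b', 'a']
```

the same multiplier turned up instead of down would amplify the cluster, which is exactly why it matters who owns and audits the algorithm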
i would also say that the thing that's most disingenuous about zuckerberg's trips to congress is this idea that by doing nothing he's somehow being objective that by not putting his thumb on the scale that's somehow the default and the most reasonable thing to do no he acts as if it's value-free but of course it's actually value-laden what he's saying is that the results of the choices they've made in maximizing engagement as tim mentioned which of course encourages divisiveness and tribalism and disagreement that was a very value-laden choice and they have the choice now to do something else you could say hey we should de-emphasize misinformation propaganda and anti-science rhetoric they could do that are they going to do that i doubt it but the long-term point is that it is actually quite possible to do katie it's also quite possible to regulate algorithms and force them to follow rules and as i've said a couple of times now my goal in the short term is to convince regulators and lawmakers and policymakers that once you've written a law in plain english you
could force an algorithm to follow that law we don't know exactly what the laws need to be for facebook and social media but they need to be something like this has to work for the public good not just for your profit and once we translate that we can force them to follow that rule
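as one hypothetical illustration of kathy's translate-the-law idea, here is a sketch assuming the plain-english rule has already been turned into a measurable test, say "known misinformation must not be boosted beyond its neutral chronological reach"; the rule, the threshold, and all function names are invented, not any real regulation

```python
# a sketch of kathy's idea under one hypothetical translation of a
# plain-english rule into a measurable test. names and numbers are
# invented for illustration, not any real regulation.

def amplification_factor(ranked_impressions: int, chrono_impressions: int) -> float:
    """How many times more often the ranker showed an item than a
    neutral chronological feed would have."""
    return ranked_impressions / max(chrono_impressions, 1)

def audit(flagged_items: list[dict], max_factor: float = 1.0) -> list[str]:
    """Return ids of flagged items the platform amplified past the limit,
    i.e. violations of the hypothetical rule above."""
    return [
        item["id"]
        for item in flagged_items
        if amplification_factor(item["ranked"], item["chrono"]) > max_factor
    ]

# a regulator could run a check like this over platform-reported logs:
print(audit([
    {"id": "v1", "ranked": 9000, "chrono": 1000},  # boosted 9x, a violation
    {"id": "v2", "ranked": 800, "chrono": 1000},   # not boosted, fine
]))  # ['v1']
```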
and jeff you wanted to say something so why don't you close things out well i was going to add one comment to what kathy was saying i just heard this recently from a reliable source though not somebody inside facebook apparently as comments get closer and closer to the line of their terms of service the more extreme they get they spread exponentially and those are being algorithmically tamped down along that terms-of-service line it's interesting because the more extreme something gets the more rapidly it spreads but that's something i've heard it's very speculative and to your point kathy i don't think it addresses all the aspects of misinformation we're discussing here
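jeff flags this as secondhand and speculative; still, the mechanism he describes matches what facebook has publicly said about reducing the distribution of "borderline" content, and a small sketch makes it concrete: the closer an item sits to the policy line, the more its distribution is penalized; the penalty curve here is invented purely for illustration

```python
# a sketch of the mechanism jeff describes. the scoring curve is
# invented for illustration, not facebook's actual formula.

def distribution_score(engagement: float, policy_proximity: float) -> float:
    """policy_proximity in [0, 1]: 0 = clearly within the rules,
    1 = right at the removal line. left alone, engagement tends to rise
    with proximity; the penalty is meant to invert that incentive."""
    penalty = 1.0 - policy_proximity ** 2   # hits hardest near the line
    return engagement * penalty

print(distribution_score(100.0, 0.1))  # ~99.0, mild content barely touched
print(distribution_score(100.0, 0.9))  # ~19.0, near-the-line content tamped down
```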
but katie just to close from my perspective i'm so grateful for this team and everybody we have here for sharing their stories for sharing their voices and for helping to elevate this issue i knew nothing about this going into it a couple of years ago and it was from all of these conversations that i got to see how there is this invisible change happening in our society through this code that we interact with every day and that nobody realizes what's hiding on the other side of their screen and my hope through all of
these voices is that people can wake up and recognize that there is something we need to pay attention to as a society something we need to demand a change to something we need to demand transparency into just as kathy was saying there are these opaque algorithms that drive our lives and we have no insight into how they operate and that's really scary it's something that touches each and every one of us and we don't know what's driving it so my biggest hope is that this can
be a wake-up call and that our society can rally together and say we want to do something about this thank you all so much for this conversation it was so critically important and so informative rashida tristan kathy tim jeff i really enjoyed speaking with all of you and i cannot encourage everyone enough to watch the social dilemma you did a fantastic job and took a lot of complicated issues and made them extremely accessible i think for the average person it's on netflix it's called the social dilemma please check it out
take care everybody thank you so much thanks katie thanks everybody