Artificial intelligence: we talk about it a lot on this show, how it's changing, its potential. Tonight, how it's being used in politics. Some videos have depicted made-up futures; other deepfakes have made it sound like a candidate is saying something that the candidate never said.

"America, you blame me for interfering with your democracy, but I don't have to. You are doing it to yourselves."

"We're entering an era in which our enemies can make it look like anyone is saying anything at any point in time, even if they would never say those things."

"Our great national anthem: Baby Shark."

These are not the best current examples, but to use current examples, we'd be taking the bait of their creators. Instead, we'd like to talk about this minefield with Danielle Citron, professor of law at the University of Virginia School of Law. Professor, thank you. We should also note that Danielle is the author of the book "The Fight for Privacy: Protecting Dignity, Identity, and Love in the Digital Age." So, Professor, we've been wrestling with misinformation and disinformation for a while. What's the additional harm from artificial intelligence and the use of deepfakes with respect to the matter of just truth?

They come at a time when they're getting hyper-realistic, and the technology that enables us to make them is spreading far and wide. And it's a time, of course, when trust in our institutions, trust in each other, is at an all-time low. So in many respects we're primed to believe information that confirms our biases, and if it's information that's negative and novel about important institutions, including politicians, then we're going to be even more likely to believe what our eyes and ears are telling us, even though the video may be completely synthetic: a made-up video or audio of a politician doing and saying something awful, right, the night before an election. The risk is that that well-timed deepfake can turn the tide of who shows up to vote. I think we're in this precarious moment of disbelief and distrust, coupled with the diffusion of the technology.

Let me ask you about the specifics in a second, but before we get there, back to this general idea of truth. Is what you're saying that you're not talking just about a bad ad here or there, as damaging as that could be, but about an erosion of our baseline relationship with the truth? If it's a world in which whatever you're seeing may be true or not true, it allows you to be credulous if that is what you want to be, or doubtful if that's what you want to be, and that's easier in this new world.

That's right. It's almost like a choose-your-own-adventure in terms of politics. And here's what really troubles me: Bobby Chesney, the dean of the University of Texas law school, and I wrote a paper in 2019, and we described the phenomenon of someone who wants to disassociate themselves from truthful, real videos or audio showing them doing or saying something unappealing; they can easily just say, "Ah, that's not true. You can't believe anything your eyes and ears are telling you these days." We call that the liar's dividend. We're in this environment with such epistemic distrust of institutions, of each other: if you're not on my team, I don't trust you, right, the sort of tribalism. I think we worry that not only will we immediately choose our adventure of what we want to believe, right, confirmation bias, but that when the truth is out there, showing someone faithfully doing something, they can easily just point to it and say, "You believe anything these days." We saw President Trump try to do that with the Access Hollywood tape, and at the time, I think it just kind of rolled off our shoulders; it was long enough ago. But I think we're in a moment of truth crisis.

So then, as the last question: when we address this, give us a sense of whether you think there needs to be some kind of regulation, and as you explain your answer, what rights do we have to keep in mind that would have to be a part of any kind of solution to this moment we're in?

Right. I'm a law professor, so you might say, "Okay, Danielle, I know you're going to say the law is an answer." But law is a really blunt tool, especially when it comes to politics and public discourse around elections. I worry that when we start crafting laws designed to address lies about politics, it's going to be really hard to craft those laws, and the trade-off might be pretty significant. So you might be able to address the kind of deep fakery related to defamation, right, we can do that in the law, and maybe lies about when you show up at an election; there are certain kinds of lies that of course we can proscribe. But I worry that when we're talking about lies related to candidates and politics, we're in an area where, if we step too far, we could be making, I think, bad choices.

Danielle Citron, we're going to be back to you on this question, but we really appreciate you being with us tonight: UVA law professor and author of "The Fight for Privacy." Thank you so much. Thank you.