What Gossip Teaches Us About Data Protection | Nina da Hora | TEDxBeloHorizonte

TEDx Talks
Data privacy is a fundamental right that extends beyond the digital world. But when we talk about ...
Video Transcript:
Translator: David DeRuwe
Well folks, I promise I won’t make anyone here read the privacy terms of websites and applications, OK? I’d like to begin by asking you a question: Has anyone here ever told a friend’s secret? You can raise your hands; the camera isn’t on you.
Oh really? That’s how it goes. Since we were born, we’ve experienced and worked with these concepts of secrecy and information privacy.
We have friends who tell us secrets, whether we keep them or not, as was the case with some people here who raised their hands. If you’re the oldest child in the family, maybe your mother will tell you a surprise she’s preparing for the family, and it’s your responsibility to safeguard that information until the birthday or other surprise occasion. When we move this into the digital world, we get confused about how privacy and information secrecy should work.
In the offline world, we have control - you choose whether or not to tell that secret, you choose whether or not to put an end to that surprise - but in the digital world, we don’t know who is on the other side of the screen when we’re communicating and trading information. We also don’t know who is storing this information. Data privacy isn’t a concept linked only to the digital age, much less to the digital environment.
Data privacy is what I’ve just been illustrating for you. We already know this; we already deal with this. When we bring this into a digital environment, we necessarily need legislation that helps us achieve transparency about how data is being stored and shared.
Who is getting access to our data? I remember, in 2018, when I was at the peak of using many social networks, people used to ask me, “Nina, how does this social network know who my friend is? I post a photo, and automatically, somebody’s name is already there.”
“How can this telephone operator manage to send me messages right when I need them - a sale at a particular restaurant or a particular pharmacy?” To which I responded, “Folks, this idea has been in development all these years with our permission, and we didn’t even know we were permitting it: it’s the concept of data surveillance.” When I throw out this concept and talk about surveillance, the crowd goes: “Oh!”
You didn’t do that, did you? Usually, people are like, “Oh, my God, surveillance!” You didn’t. I’d like an “Oh, surveillance” ...
Audience: Oh!
Nina da Hora: Thank you! So it’s because the crowd starts to get scared, right? The idea is for us not to be scared.
Clearly, folks, there has to be fear up to a certain point, a limit, but the idea isn’t to provoke fear in people when I talk about data surveillance and data protection. It’s to provoke the development of critical thinking about the technological tools we’re using in our daily lives. When we develop this critical thought, we can create, think, and argue with the companies, the government bodies, and the projects that are soliciting our data every day.
I know it’s tempting to give your data to have access to some discount. I almost fell for this when I was coming here. At the airport entrance, there was that speaker from a very famous brand that I won’t mention here - the sound that goes, you know?
I don’t even need to speak very loudly because the sound is already super. The music would have been awesome, but I had to give up my data. I asked, “What data do you need?” - and it was almost my entire life. I said, “I love this speaker, but it’s not worth this.”
I know it’s tempting, when you enter a pharmacy, to give your data to get the discount on a very expensive medication.
We live in Brazil, where social and economic inequality is in place. It’s structural, so I’m not going to get into this issue here; it’s there, and we know it.
So all this development, these concepts I’ve cited here, comes from companies and organizations that pay attention to the rise and fall of a country, because then they’ll be able to choose which software and which tools will make people unafraid of giving up their data and will discourage them from asking too many questions. Data privacy ...
Ten years ago, right? Since 2000 - 2000, that’s 20 years already; 2002, that’s 18 years, more or less - it’s been considered a fundamental right.
So we have the right to privacy and to secrecy of our information. We have the right to request the removal of our data from a digital environment, from a website, or from an app we don’t wish to be associated with. We already have this right.
The big issue is that the way they explain data privacy to people ends up pushing people away from this debate. It’s not a debate linked only to the field of law, much less to the computing field. As a computer scientist, I admit how hard it was for me to make the transition from a passionate technology developer who loved robots to a computer scientist who wants to almost eliminate robots from the world.
Almost, because I have a robot in my home. Obviously, those robots aren’t going to replace people. I don’t believe in that.
I won’t get into that argument; it’s not the discussion here. We can talk about this on a social network later. I don’t believe that robots will replace human beings because the machines with this presumed artificial intelligence aren’t able to recognize errors.
When I asked here who had already told a friend’s secret, it wasn’t to expose you; it was for you to show you know how to recognize your error - you exposed a friend’s secret, or you exposed the secret of someone important in your family. We human beings manage to have and to develop sufficient thoughts and reflections to arrive at this conclusion. Machines with these intelligences need us to nudge them and say something like this: “You’re wrong. Look here, this data you’re showing me doesn’t make any sense.” They need this. It’s because of this that I don’t believe in the replacement of humans by machines with artificial intelligence.
Returning to privacy, it is increasingly difficult for me and other researchers, activists, and experts in the field of information security in Brazil and around the world to be able to explain to people how key civil society participation is for the reconstruction of what we consider to be information security. Speaking globally, it’s difficult to say this because we have robots as competitors. We have robot competitors giving lectures for children.
Have you seen this already - little robots reading for children? And then, there are people who think it’s great for a child to sit with a tablet of any brand and to interact with it, instead of interacting with the family. Only that ... what am I trying to provoke here with you?
If you have a child or a nephew, if you have friends with a nephew ... She has a child, right? I already noticed. I saw you pointed it out ...
Anyone with a child at home needs to ask themselves: where is all this data going from these interactions of children with these tablets and cellphones?
“Oh, don’t worry, Nina. I don’t have money in the bank.” I hear this argument a lot. “I don’t have any money in the bank.” “I’m not a famous person.” “No, my child is just one more child in the world.” And I become desperate: “People!”
Now is the time for you to look in the mirror and say, “Did I really say that?” I know. I say, “People, what do you mean you’re not worried about the future, about the journey of your children’s data in this digital environment where we have no transparency regarding who is using, processing, and sharing this data?”
My concern can’t be driven by the visibility people have in society. It has to be driven by the fundamental rights that every citizen has. Right?
We have rights and duties. On my side, I need to argue by considering citizens’ rights, and children are already a part of society. Even though you say, “Ah, school is special,” or “My child isn’t in daycare; my child isn’t in elementary school,” it doesn’t matter; they’re a part of society.
We need to be worried about what we are sharing on social networks. Photos of children remain stored on servers that aren’t even in Brazil. And why am I highlighting this?
To use the laws available in our country today - to claim all of this, as I’m doing here - there is a territorial problem. Each country has its own laws, its own culture, and its own ways of treating data. So, for the social networks that are part of our daily lives today, I can’t rely only on the Brazilian LGPD, the General Personal Data Protection Law.
That’s why the participation of civil society is so important. I didn’t come here to try to conceptualize data protection and information security. I know that would be super boring.
I won’t stay here conceptualizing algorithms, even though I love this theme and love talking about it. What I came here for was to show you that you need to participate in this debate with us. Don’t leave this debate in the hands of scientists and folks in the law field, because we have a very problematic bias, a behavioral bias.
I have limitations on how I see society, just as everyone has. That’s why we need the participation of different people - diversity. I’m talking about regional diversity, taking race and gender identity into account, and participating in this debate to raise issues I’m not seeing.
I have limitations too. Just because I’m a Black LGBT woman doesn’t mean I’m automatically going to be inclusive of everyone. We need to be aware of our mistakes. As I asked before, right?
The first thing I did here was ask again who had ever exposed a friend’s secrets. And when we have the courage to raise our hand in an audience to say, “I exposed a friend’s secret,” it’s because we have to fight so we can have, in fact, a reconstruction of what we understand privacy to be in the world and, principally, in Brazil. Thank you.