Will technology shape our future or will we | Deborah Nas | TEDxAlkmaar

TEDx Talks
As a TEDx speaker, Deborah has an unparalleled ability to captivate any audience with her inspiring...
Video Transcript:
Transcriber: Sina Wang Reviewer: marshall8 mathers88
Will technology shape our future? Or will we? I think Marshall McLuhan's insight from the 1960s is still relevant today.
We shape our tools and then our tools shape us. Radio and TV broadened our world view. Mobile devices and the internet changed when, where and how we work.
Cars expanded our habitats, and planes made us global travelers. Technology creates many new possibilities, but unfortunately it is also destructive. Transportation pollutes our air.
Electronic devices create huge amounts of electronic waste, and digital technologies gave rise to big tech companies and sophisticated surveillance systems. And it won't be long before we have to deal with the consequences of transformational technologies like AI and quantum. Now, we often feel that technology just happens to us, that we have no control over it.
But technology doesn't create itself. It's humans that create technology. More specifically, it's a small group of people that take the lead.
They are the technology optimists, and I'm one of them. Or at least I used to be one of them. I think I'm a technology optimist having an identity crisis.
I first started thinking differently about technology a few years after having kids. Meet my sons Ramsey and Jordan. Jordan, especially, is a boy with an exceptional talent for anything digital. As a baby, he became physically wild when I showed him my iPhone. At the age of two, he knew how to switch WiFi networks if he had a slow connection. And at the age of three, he knew what he wanted to become in life: a YouTube star.
And if you have kids, you might recognize the situation: "First your father is going to take a picture with Mommy, then you'll get your ice cream." But in any other situation, it looked like this, and I was like a mobile electronics store, carrying iPads, game consoles, batteries, extra headsets, you name it. And I had it with me. And although I facilitated this as a mom, I was deeply worried at the same time. What does it do to his eyes?
Will he need glasses soon? How about his brain? Are some of the games maybe too violent?
And does he have enough friends? How are his social skills developing? As a mother, I want to limit his screen time and force him to play outside as I did as a little kid.
But I also have another role in life. In my professional life, I'm a part-time professor at the Delft University of Technology, and I work with companies to speed up technological innovation.
Looking at Jordan as an innovation professional, I have a very different perspective. I see his affinity and his talent for digital technologies, and I feel I should nurture this talent and facilitate him to become whoever he wants to be. And this put me in a very interesting position because as a professional, I observed my feelings as a mother with interest and I wondered, what am I afraid of?
And it triggered research. I found that there's a historical consistency to how people have worried about technological innovation and how they express those worries. Basically, humans have feared technology since writing was the latest thing. Greek philosophers argued that writing would be bad for our brains: if we write everything down, we don't have to remember things. It would instill forgetfulness in our minds and make our knowledge superficial. Fast forward to the printing press, when the upper class feared that books would spread rebellious and irreligious thoughts amongst the common people. They feared they would lose control over what was being printed. Newspapers would hurt the social fabric of society, because until then, people physically came together to learn about the news and talk about it. But with the newspaper, people would be at home by themselves, reading about the news.
Cinemas would demoralize society. The TV generation would never learn how to read or write well. And when the Internet arrived, we saw the most bizarre headlines.
The Internet was bad for everything, one of the claims being that it would make our knowledge more superficial, this time because you would jump from article to article without an interest in reading longer pieces. And these are just a few examples, but we see similar arguments recurring over and over again.
Loss of cognitive and physical abilities, loss of social skills, loss of morality, and loss of control are the most prominent ones. When we envision societal change through technology, we tend to frame it in terms of loss. Let's look at an example.
If you enjoyed playing with Lego as a kid, chances are you'd rather see your children or grandchildren play with Lego over a digital game like Minecraft. Most people immediately point out what you lose when you move to the digital world. Fine motor skills and 3D insights.
Kids gain another type of fine motor skill and 3D insight in the digital world, but if you're not familiar with those yourself, you tend to judge them as less valuable. Maybe you never even tried or mastered Minecraft. Essentially, it's digital Lego: you can mine all sorts of materials and then craft any object you like. And because it's a digital world, the possibilities are endless. If you make a rational comparison between the two, Minecraft comes out really well. Nevertheless, we stick to our preference for Lego, because we tend to choose what we understand, value, and are familiar with. And if you didn't grow up with digital games, you're often prejudiced.
It's bad for your eyes, it's bad for your brain. And worst of all, you can get addicted, and that will put you in social isolation. Steven Johnson came up with a brilliant thought experiment: what if books were invented after video games?
So try to imagine for a moment there are no books, no newspapers, no magazines. You do not read, but you do play video games all the time. And everybody around you does it, it’s a normal thing to do.
And all of a sudden, books are invented, and kids love them and start reading like crazy. But what would we say? Chances are we would say it's bad for your brain, because, again, a game is a rich multimedia experience triggering all parts of your brain, and a book is just words on a page. It's bad for creativity and leadership, because in a game you learn how to explore and you learn about leadership, but in a book there's no other option than following the plot. Last but not least, it will put you in social isolation, because a game you play with others, whether they're next to you or on the other side of the world, it doesn't matter. But with a book, a kid would be in their room by themselves for hours and hours, reading. And this perfectly illustrates how your frame of reference shapes your attitude towards new technologies.
And Douglas Adams, a science fiction writer, once brilliantly summarized what's happening here. He said: everything you grew up with is a normal part of the world, nothing new. Anything invented between age 15 and 35 is new and exciting, and you can probably get a career out of it. Anything invented after age 35 is against the natural order of things. Now, obviously, it's not this black and white, and it doesn't apply to everyone.
The technology optimists like me tend to focus on the benefits of new technologies, even when they're older than 35. But lately, I can't help feeling increasingly worried. We apply algorithms to decide if somebody can get a loan, only to find out later that they wrongfully discriminate against certain groups.
We let algorithms define what we see on social media, influencing our political attitudes. And soon we will have AI-driven virtual friends that know us so well that we fully trust them and follow their advice, unaware of the political or commercial motives behind the algorithms that bring these virtual friends to life. Looking ahead, and taking into account the fast developments in AI, I am afraid that we will lose agency and control. I tell myself my fears are grounded, that things are different from the past. Digital technologies are all around us, and their inner workings are invisible.
Big tech companies know everything about us and control what information we're being exposed to. Startups with a mission to change the world for the better quickly change character after venture capitalists pump hundreds of millions of dollars into them, forcing them to quickly scale and monetize everything they know about their customers. Generative AI is interacting with us as if it's human, forcing us to rethink what is real and what is not.
I feel change is happening too fast and we can't foresee the longer-term consequences. At the same time, I know that already in 1970, Alvin Toffler wrote about future shock: the anxiety brought on by too much change in too short a period of time.
I know that my worries match historical arguments against technological innovations. I know that I didn't grow up with AI, so my frame of reference probably informs my opinion about it, making me focus on what we lose instead of what we gain. This makes me wonder.
Am I falling prey to the psychological processes I've been researching? Am I simply getting old? Is my frame of reference no longer fitting today’s reality?
The truth is, I really don’t know. And in the past, my life was simple. I was a technology optimist, helping others to see the benefits of new technology.
And now I'm in a process of recalibration, trying to figure out if I should start warning people about the dangers that lie ahead. But one thing I do know: digital technologies are developing at lightning speed, and we do not have the luxury to lie low. The technology optimists who work at the tech companies that shape our world focus on the benefits of new technology and vastly underestimate their dangers and societal impact.
If we do nothing, tech companies are free to develop and apply technology as they like, without any ethical or societal guidelines they must follow. And while we're still struggling to deal with the implications of AI, figuring out if and how we can control it, we already need to start thinking about the next transformational technology: quantum. With quantum technologies, like quantum computers, we have the opportunity to think about the ethical, legal and societal aspects before the technology leaves the lab and is widely applied. This is incredibly difficult, though. We can compare it to the invention of the laser.
Imagine for a moment that we are in 1960 and Ted Maiman has just created the first working laser in his lab: a big, bulky machine with a weak signal, susceptible to noise. And at that point in time, we're already trying to predict that decades later, we will use lasers to cut through metal, print information on paper, operate on people's eyes, scan barcodes in supermarkets, or even use them as entertainment at dance parties. It is almost impossible, bordering on science fiction.
But nevertheless, we must try because quantum is a transformational technology that will shape the future. We're just not exactly sure yet how, when, where and through which applications. We need to envision these future applications to understand its societal impact.
We need to understand the positive and the negative impact to shape a future that we want to live in. We need to start a broad societal debate today. And I'm lucky to have a role in this in the Netherlands, thinking about the impact of Quantum.
But all of you should have a role in it, too, because we need people from all cultural backgrounds, religions, ages, and socioeconomic statuses. We need people with different values and beliefs, and we need the ones who fear technology as much as the ones who trust it. The future doesn't just happen to us.
We shape technology and then technology shapes our future. And I'm looking for people who want to think about this future with Quantum. So I invite you to join me and help shape our future for the better.
Thank you very much.