The Present And Future Of Artificial Intelligence On Sleepwalkers


Sleepwalkers, a podcast hosted by Oz Woloshyn, takes a hard look at the technologies and artificial intelligence that might have seemed like nothing more than science fiction only a decade or two ago, but are now a daily part of our lives. With face recognition, targeted ads, “deep fake” video manipulation, constant surveillance, and advances in robotics, it can feel a little like we’re all just a bunch of flesh bags trying to keep up with Skynet. Mostly, this technology is welcome, offering solutions in national security, medicine, criminal justice, even matchmaking - but just as often, the same technology can be used against us in extremely scary ways. Unfortunately, we often don’t see the problems until it’s too late. Oz wants to know what exactly we’re “sleepwalking into” without understanding the full consequences or applications of the tech and AI in development today. This episode digs into dating apps and targeted ads to find out how developers are using AI to discover what you want so they can nudge you toward - or sometimes away from - your goal, and what that means for society as a whole.

Targeted ads are familiar to most people - how many of us have casually mentioned needing a new coffee pot, for example, only to discover shortly after that our entire social media feed is full of ads for Keurigs? That can be rather disconcerting, especially when the algorithms aren’t able to account for huge changes in your life, like a death in the family. But targeting can be useful, too - not only can it direct you to ads for things you actually need and want instead of things that are irrelevant to your life, in some cases it can save lives. That’s what Jigsaw, a unit within Google that tackles threats like violent extremism, is working on. Jigsaw realized that extremists in the Middle East were recruiting new members online, radicalizing people from Canada, Europe, and the United States. How was ISIS able to persuade them to leave their comfortable lives for war-torn Syria? And if technology could be used to recruit, could it also be used to dissuade?

Google wasn’t afraid to try an extreme method to find out: they asked. “We brought together 84 former extremists and survivors of terrorism...Islamists...a former violent Israeli settler, former Christian militia,” Yasmin Green, the Director of Research and Development at Jigsaw, tells Oz. “Can you imagine the security concern for Google in helping us convene everyone in one place? We had snipers on the roof.” But it was worth it: their conversation helped Jigsaw learn what the recruitment narratives were, “and they were largely...that this was a devout, correct, religious thing to do. That this...was going to lead to a healthier, happier life,” Yasmin says. Using that information, they were able to create a targeting strategy focused on people likely to be radicalized based on what they were searching for online, and push them to alternative content. “If you were interested in Fatwas about Jihad...we would give you those edicts, just not the ones that ISIS was proposing,” Yasmin says. 
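To make the mechanics a little more concrete, here is a minimal, hypothetical sketch of that kind of keyword-based redirect targeting: flag searches that look like recruitment queries and surface counter-narrative results instead. The keywords, titles, URLs, and function names below are illustrative assumptions, not Jigsaw’s actual system or data.

```python
# Hypothetical illustration of keyword-based "redirect" targeting.
# All trigger terms, titles, and URLs are made-up placeholders.

RISK_KEYWORDS = {"jihad", "caliphate", "hijrah"}  # illustrative trigger terms

# Counter-narrative results to surface instead of recruitment content (placeholders).
COUNTER_NARRATIVE_RESULTS = [
    {"title": "Mainstream scholars' edicts on jihad", "url": "https://example.org/edicts"},
    {"title": "Testimony from people who left ISIS", "url": "https://example.org/testimony"},
]

def looks_like_recruitment_query(query: str) -> bool:
    """Return True if the search query contains any trigger term."""
    q = query.lower()
    return any(keyword in q for keyword in RISK_KEYWORDS)

def select_results(query: str) -> list[dict]:
    """Serve counter-narrative results for flagged queries; otherwise serve nothing special."""
    return COUNTER_NARRATIVE_RESULTS if looks_like_recruitment_query(query) else []

if __name__ == "__main__":
    for q in ["fatwas about jihad", "best coffee pot 2019"]:
        print(q, "->", [r["title"] for r in select_results(q)])
```

A real system would of course weigh many more signals than a keyword list, but the basic shape - detect intent from search behavior, then choose what content to put in front of the searcher - is the part that raises the questions Oz asks next.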

Of course it’s great to use targeting to deter terrorism, but “it does also mean living in a world where our main Internet search providers also edit the results,” Oz points out. “Do we want Google controlling what we know about the world?” That’s a lot of control to surrender to a few tech companies. “In the end, it all comes down to who’s steering the ship and what they’re steering you towards. Who watches the watchmen?” Oz asks.  


But can we really blame tech companies for radicalization, or for convincing us to buy a new coffee pot? Each person is responsible for their own actions, after all. But tech has a secret weapon there, too, as Tristan Harris, a former design ethicist at Google, knows intimately: he studied persuasion in technology. “What is the perfect, most seductive...red color for that notification,” Tristan says, “or the most seductive video that you can't help but want to watch next...What we have to do is flip the table around. We're not giving people what they want, we're giving people what they can't help but watch.”

“Our apps and smart devices are hijacking some of the deepest and most powerful systems in our brain,” Oz concludes, which makes it less about conscious decision-making and more a question of addiction. “The truth is that Tinder, apps, social media validation, they all generate the same feelings as romantic love. So, of course, we're prone to be addicted.”

Just like with anything else, though, knowledge is power, and educating ourselves about how we’re susceptible to technological manipulation is crucial to how we shape the world going forward. “Companies know they can manipulate us for good and evil with technology that touches us at our evolutionary roots,” Oz says. “But the future of those technologies isn't inevitable. And we still have the power in our hands to decide what to allow in our lives.”

Listen to the episode to find out more about how dating apps tap into the addiction centers of your brain, and binge the whole season of Sleepwalkers for episodes about robots writing recipes, machines helping blind people see, and AI on the battlefield - if you don’t want to be caught sleepwalking.



