Photo: Eugene Hoshiko/AP Photo

Interface designer and serial founder Aza Raskin has been grappling with the persuasive power of VR—and our total inability to defend against it

Emerge Music + Impact Conference is created and produced by ABP Media’s parent company, A Beautiful Perspective. Leading up to the conference, we’re featuring some of the musicians and speakers who’ll be performing in Las Vegas this November.

Aza Raskin has a lot on his mind.

The interface designer and tech wunderkind created Songza, served as creative lead of Firefox at Mozilla and founded Massive Health, which used data and design to help people make healthier decisions (and was acquired by Jawbone). Last year he launched a new product: Elsewhere, a $50 headset that turns any image or video on your phone into a 3D experience.

It’s a neat trick, but lately Raskin, who’s speaking at Emerge Music + Impact Conference in Las Vegas this November, has been thinking more about the persuasive power of virtual reality than its potential for a moment of light-hearted wonder. VR, he says, “is the most empathetic technology we’ve ever created,” and with that emotional connection comes the capability to influence viewers in ways good, bad and very dangerous. 

“In Silicon Valley we trade what is right for what works,” he adds. “We are A-B testing our way to amorality.”

Before Raskin set off on a month-long adventure in the woods of Alaska, ABP sat down with him in San Francisco to talk about the future of VR, the risks of targeting and how he moderates his own device addictions.

What made you start thinking about persuasion and virtual reality?

I helped build the web at Mozilla—that was my first thing. There was a tech idealism, this utopian vision of what we were building: if you could make something open and widely available and decentralized, then all of a sudden the power flows out of the major corporations and back to the people. And at the very beginning of the web, it felt that way.

But then you realize, no, the entire economy of the web is set up to steal our attention and direct it someplace. That is the attention economy. And there is incredible value there, which means that all of the smartest minds of our generation are here trying to figure out what you’re doing and then divert you a little bit toward something else.

I did a company called Massive Health that was trying to do consumer health care with data. What we were really doing was behavior change. It got me thinking: as we got better and better at changing people’s behavior, yes, health is a great thing to apply that to, but these techniques are also going to be weaponized.

And then the election hit.

How does Aza Raskin disconnect? By taking a 30-day sojourn off the grid in Alaska. Photo: Sergej Zabijako

Is there a crisis of conscience happening in Silicon Valley right now?

Yes, and. Technology is neither good nor bad, nor is it neutral. The problem is we’re playing a game of whack-a-mole. We realize that we’re starting to become media providers, but our inclination is still to back away and say, “It’s not really our fault. It’s just what the users want to share,” ignoring the fact that we’re setting up the architecture of our systems.

I think of fake news as a parasite or virus that sits on top of the attention economy. When we have these tools to try to do better fact-checking, it’s kind of like taking a Tylenol to reduce the fever without getting rid of the underlying infection.

You can see it looking at VR. VR is the most empathetic technology we’ve ever created.

Why is that?

You put these goggles on your face, and all of a sudden you’re no longer here. It turns out the brain is really easy to trick. After you spend five minutes in this thing, you just feel like you’re sort of there. What we see is what we believe.

But this can be used for good or for bad. Look at the business models we already have: They already go after your eyes, and now you’re going to strap a screen to your face. The next war in Silicon Valley is for how we see. Not just our eyeballs, but our physical eyes.

Marketers know the best way to predict someone’s future actions is to know their past actions. Can you flip that on its head? Maybe the best way to change somebody’s future actions is to change their memories of their past. That’s what [psychologist and memory expert] Elizabeth Loftus does. She implants a memory of you as a child getting sick from eating a bad egg, and now you won’t eat eggs for a month. It changes your behavior.

That’s terrifying.

We are outsourcing more and more of our memories to our computers. Our ability to read maps has gone down because we outsourced it to GPS. Same thing with memory, actually. We remember less when we know we can use Google to find it later.

We give immense power to the companies by letting them hold these things. VR creates memories. The first time I was flying a drone, I had a little VR headset on. I was standing on a golf course down in La Quinta, and a bird flew by and my friend was like, “Chase that bird.”

My memory is not of me standing on a golf course, it’s of me chasing behind that bird. That’s why VR is going to be such a dangerous medium. It’s because it creates memories, and memories are how we make our future actions.

We don’t have any sort of regulation. There are no barriers to any of this, right?

Here’s going to be the line of argument: Who are we to decide what people choose and what experiences people choose to have? If they want to have free internet and VR, why don’t we let them do that?

Our ethical and moral obligations change as we understand how to persuade people better.

In terms of advertising, we have regulations around using cartoons to advertise cigarettes to kids. Joe Camel has disappeared.

Which is indicative of how powerfully persuasive even dumb print ads are, and they’re not even targeted.

Imagine now a cigarette ad that knows when you are emotionally weakest. It’s 2 a.m., you’ve been drinking, and you’re with friends. And that’s when it sends you the little push: For 25 cents you can get one delivered by drone to exactly where you are.

That’s the world that we’re moving towards. 

New memories, they're coming for your brain. Photo: Samuel Zeller

Does that scare you?

It completely terrifies me. We as humans don’t have the antibodies to deal with targeted persuasion.

An algorithm reading through the likes you’ve made on Facebook needs only 10 likes to know your personality, as measured by the Big Five personality test, better than a coworker does. To be better than a family member, 150 likes. To know you better than a spouse of four years, including being able to predict substance abuse and depression, do you know how many likes it takes? 300.

But do they know when they’re likes of obligation versus genuine likes?

Well, turns out everyone has similar types of feelings, so on net, yeah.

It used to be, with large-scale broadcast mediums, if you wanted to make an attack ad on a politician and you wanted to make it really garish, with baseball bats and beating people, well, that’s going to turn off a big portion of your audience even while it energizes someone else. So you have this averaging effect. You have to make something that can sit in the public discourse. But when you start getting these ads that only have to go to you and you and you individually, I can amp it up. I can make the message more sugary, I can make the message with hotter hot sauce, if you will, because it’s not going to turn off someone else.

So you can see how targeting promotes silos and bubbles as well as polarization: you can just turn up the juice on everything, because you no longer have the check of having it out in public.

To me, that breaks democracy.

As a consumer, how do you protect yourself from this?

I think this is one of the fundamental problems of our time, and I don’t have good solutions. In some ways, you could say, these companies are just making technologies that are catering to our existing wants and behaviors. The problem is human interest and human behavior aren’t aligned.

At every decision in the making of a product, there is a bias to make something that tests better. People like it more, and therefore it’s the right thing to do. We have to have the fortitude at every stage to say, “When we scale this thing up, what are its effects?” I think it will take legislation. We have to create ground rules.

You’re about to spend 30 days in the wilderness away from all of this. Is this reflective of a need to disconnect?

Yup. Absolutely. Create space so when you come back you can see things more clearly.

Among all the things I just said, two more words that scare me are “post-nature.” I feel like it behooves us to spend time in nature, one, while we still can, and two, so we can become better ambassadors for it.

Are there behaviors that you do to be less influenced, less connected?

I don’t know how popular this is with people, but it works really well for me: I don’t charge my phone at night. I charge it during the day, but normally I try not to have 100 percent battery. It just makes me super conscious about when I’m using my phone, because I don’t want to get stuck without battery. Just telling myself, “Don’t get distracted. Don’t look,” is not going to work. So you have to find something else that fights it. That’s what works for me.