Services like Facebook, Google and Amazon are free to use, but have you ever considered that we, or rather our data, are the product they sell?
In an insightful and rather disturbing TED Talk (see below), techno-sociologist Zeynep Tufekci lays out how the same algorithms that companies like Facebook, Google and Amazon use to get us to click on ads can also be used to organize our access to political and social information.
Many people are nervous about artificial intelligence (AI), and prominent technology experts have warned about the dangers of unregulated AI. Elon Musk, notably, has said that until people see robots going down the street killing people, they won't know how to react, because the threat seems so ethereal.
But Tufekci says that, at this point, our problem isn't the machines or what AI can do to us on its own, but how people in power will use AI to control and manipulate us in novel, subtle and unexpected ways.
“Much of the technology that threatens our freedom and our dignity in the near-term future is being developed by companies in the business of capturing and selling our data and our attention to advertisers and others: Facebook, Google, Amazon, Alibaba, Tencent,” says Tufekci.
Online ads seem simple and innocuous, easy to dismiss. Surely you have the personal power to ignore an ad for a pair of boots.
But it’s not as simple as that.
In the physical world we have a thing called persuasion architecture. An example is the candy display at the checkout counter right at toddler eye-level. Now, in the physical world, such persuasion architectures are kind of limited, because you can only put so many things by the cashier.
In the digital world such limitations don’t apply.
Persuasion architectures can be built at the scale of billions. They can target, infer and understand individuals one by one, figuring out each person's weaknesses, and they can be deployed to everyone's private phone screen, invisible to the rest of us, says Tufekci.
In the digital world, machine-learning algorithms churn through big data (gathered from platforms like Facebook) to understand the characteristics of people who, for instance, have previously purchased tickets to Las Vegas. Once they learn this from existing data, they can apply it to new people.
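To make the pattern concrete, here is a deliberately tiny sketch of that learn-from-past-buyers idea. Everything in it is invented for illustration: the features, the user data, and the nearest-centroid method (real platforms use vastly more complex models over far more data), but the shape is the same: summarize people who bought before, then score new people by similarity.

```python
# Toy sketch of the pattern Tufekci describes: learn from people who
# bought before, then classify new people. All features and numbers
# below are invented for illustration.

def train_centroids(examples):
    """Average the feature vectors of buyers and non-buyers."""
    sums = {True: None, False: None}
    counts = {True: 0, False: 0}
    for features, bought in examples:
        if sums[bought] is None:
            sums[bought] = [0.0] * len(features)
        for i, x in enumerate(features):
            sums[bought][i] += x
        counts[bought] += 1
    return {label: [s / counts[label] for s in sums[label]]
            for label in sums}

def predict(centroids, features):
    """Label a new person by whichever centroid is nearer."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(features, c))
    return dist(centroids[True]) < dist(centroids[False])

# Invented features: [searched_flights, clicked_casino_ad, age / 100]
past_users = [
    ([1.0, 1.0, 0.35], True),   # bought a Vegas ticket
    ([0.9, 0.8, 0.40], True),
    ([0.0, 0.1, 0.55], False),  # did not
    ([0.1, 0.0, 0.30], False),
]
model = train_centroids(past_users)
print(predict(model, [0.95, 0.9, 0.33]))  # a new, similar-looking user
```

The unsettling part the article goes on to describe is that in a real system the "features" are thousands of opaque signals, and no one can point to which ones drove the prediction.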
Now, you might think, so what? The algorithm can predict who will buy tickets to Las Vegas. What’s the big deal?
The big deal is, no one understands how the algorithms learned that.
“The problem is, we no longer really understand how these complex algorithms work. We don’t understand how they’re doing this categorization. It’s giant matrices, thousands of rows and columns, maybe millions of rows and columns, and not the programmers and not anybody who looks at it, even if you have all the data, understands anymore how exactly it’s operating any more than you’d know what I was thinking right now if you were shown a cross section of my brain.
“It’s like we’re not programming anymore, we’re growing intelligence that we don’t truly understand,” says Tufekci.
These algorithms can only work their magic in the presence of huge amounts of data and that’s why platforms are so keen to get as much data as they can about you.
Here’s a terrifying point:
Imagine what the algorithms could do with seemingly disconnected data: the predictions they could come up with, the ideas they might spawn in the minds of those who first set eyes on them.
Tufekci mentions a computer scientist whose algorithms could detect the onset of mania from the social media posts of someone with bipolar disorder, even before clinical symptoms set in.
What algorithms decide to prioritize can affect you
Facebook also algorithmically arranges the posts from your friends and from the pages you follow. It doesn't show you everything chronologically; it orders the feed in whatever way the algorithm thinks will entice you to stay on the site longer. So you may think someone is snubbing you on Facebook when, in fact, the algorithm may never be showing them your post. It prioritizes some posts and buries others.
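The feed-ranking idea above can be sketched in a few lines. The scoring weights here are entirely invented (Facebook's actual ranking signals are proprietary and far more numerous), but the sketch shows the core move: sorting by a predicted engagement score rather than by time.

```python
# Toy sketch of engagement-based feed ranking, as the article describes:
# posts are ordered by a predicted "keeps you on the site" score,
# not chronologically. The weights below are invented for illustration.

def engagement_score(post):
    # Invented heuristic: comments and reactions outweigh recency.
    return 3 * post["comments"] + 2 * post["reactions"] - post["hours_old"]

def rank_feed(posts):
    """Return posts ordered by predicted engagement, not by time."""
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"id": "quiet_update", "comments": 0,  "reactions": 1,  "hours_old": 1},
    {"id": "hot_debate",   "comments": 40, "reactions": 90, "hours_old": 20},
    {"id": "baby_photo",   "comments": 5,  "reactions": 30, "hours_old": 6},
]
for post in rank_feed(posts):
    print(post["id"])
```

Note that the newest post (`quiet_update`) lands at the bottom: this is exactly the "snubbing" effect the article describes, where a post nobody engaged with early on may simply never be surfaced.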
Experiments show that what the algorithm picks to show you can affect your emotions. But that’s not all. It also affects political behavior.
Now consider that the 2016 US presidential election was decided by about 100,000 votes, and that Facebook can very easily infer your politics even if you've never disclosed them on the site.
“These algorithms can do that quite easily. What if a platform with that kind of power decides to turn out supporters of one candidate over the other? How would we even know about it?” asks Tufekci.
The point is that we no longer know whether we're seeing the same information as anybody else, and without a common basis of information, little by little, public debate is becoming impossible, warns Tufekci.
For more on what algorithms can infer from our Facebook likes and dating site profiles and how this infrastructure built for clicking on ads has become an infrastructure of surveillance authoritarianism, you really must watch this TED Talk.