Monday, October 09, 2023
Book excerpt: The problem is not the algorithm
(This is an excerpt from the draft of my book. Please let me know if you like it and want more of these.)
“The Algorithm,” in scare quotes, is an oft-attacked target. But this obscures more than it informs.
It conjures an image of some mysterious algorithm, of intelligent computers controlling our lives. It makes us feel out of control. After all, if the problem is “the algorithm”, who is to blame?
When news articles talk about lies, scams, and disinformation, they often blame some all-powerful, mysterious algorithm as the source of the troubles. Scary artificial intelligence controls what we see, they say. That grants independence, agency, and power where none exists. It shifts responsibility away from the companies and teams that create and tune these algorithms and feed them the data that causes them to do what they do.
It's wrong to blame algorithms. People are responsible for them. The teams that build these algorithms, and the companies that use them in their products, have complete control over what the algorithms do.
Every day, teams make choices about how to tune the algorithms and what data goes into them, choices that change what is emphasized and what is amplified. The algorithm is nothing but a tool, a tool people can control and use any way they like.
It is important to demystify algorithms. Anyone can understand what these algorithms do and why they do it. “While the phrase ‘the algorithm’ has taken on sinister, even mythical overtones, it is, at its most basic level, a system that decides a post’s position on the news feed based on predictions about each user’s preferences and tendencies,” wrote the Washington Post in an article titled “How Facebook Shapes Your Feed.” How people tune and optimize the algorithms determines “what sorts of content thrive on the world’s largest social network and what types languish.”
We are in control. We are in control because “different approaches to the algorithm can dramatically alter the categories of content that tend to flourish.” Choices that teams and companies make about how to tune wisdom of the crowd algorithms make an enormous difference for what billions of people see every day.
You can think of all the choices for tuning the algorithms as a bunch of knobs you can turn. Turn one knob and the algorithm shows some things more and other things less.
When I was working at Amazon many years ago, an important knob we thought hard about turning was how often new items were recommended. When recommending books, one choice would tend to show people more older books they might like. Another choice would show people more new releases, such as books that came out in the last year or two. On the one hand, people are particularly unlikely to know about a new release, and new books, especially by an author or in a genre you tend to read, can be particularly interesting to hear about. On the other hand, if you go by how likely people are to buy a book, maybe the algorithm should recommend older books. Help people discover something new, or maximize sales today: our team had a choice in how to tune the algorithm.
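To make the knob metaphor concrete, here is a minimal sketch of what such a choice might look like in code. This is not Amazon's actual system; the function, the weights, and the recency formula are all hypothetical, just an illustration of one tunable number:

```python
from datetime import date

# Hypothetical "knob": how much weight new releases get in the final score.
# Turn it up to surface new books; turn it down to favor the sure bets.
RECENCY_WEIGHT = 0.3

def score(predicted_interest: float, publication_date: date) -> float:
    """Blend how much we predict a reader will like a book (assumed to be
    in [0, 1]) with a boost that decays as the book gets older."""
    age_in_years = (date.today() - publication_date).days / 365.0
    recency_boost = 1.0 / (1.0 + age_in_years)  # near 1.0 for new releases
    return (1 - RECENCY_WEIGHT) * predicted_interest + RECENCY_WEIGHT * recency_boost
```

Nothing mysterious is happening here: a person picked the number 0.3, and a person can change it. Set it higher and readers discover more new releases; set it lower and the algorithm plays it safe with older books.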
Wisdom of the crowds works by summarizing people’s opinions. That points to another way people control the algorithms: through the data about what people like, buy, and find interesting and useful.
For example, if many people post positive reviews of a new movie, the average review of that movie might be very high. Algorithms use those positive reviews. This movie looks popular! People who haven’t seen it yet might want to hear about it. And people who watched similar movies, such as ones in the same genre or with the same actors, might be particularly interested in hearing about this new one.
The algorithms summarize what people are doing. They calculate and collate what people like and don’t like. What people like determines what the algorithms recommend. The data about what people like controls the algorithms.
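As a toy illustration (the movies and ratings below are made up, and real recommenders are far more elaborate), here is the whole idea in a few lines: the algorithm just summarizes the ratings data, so the data decides what gets recommended.

```python
from collections import defaultdict
from statistics import mean

# Made-up ratings: (movie, stars). In a real system this would be
# millions of reviews, purchases, and viewing histories.
ratings = [
    ("new_movie", 5), ("new_movie", 4), ("new_movie", 5),
    ("old_movie", 2), ("old_movie", 3),
]

stars_by_movie = defaultdict(list)
for movie, stars in ratings:
    stars_by_movie[movie].append(stars)

# Recommend in order of average rating. Change the ratings and the
# recommendations change with them; the data is in control.
recommended = sorted(stars_by_movie, key=lambda m: mean(stars_by_movie[m]), reverse=True)
print(recommended)  # ['new_movie', 'old_movie']
```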
But that means people can change what the algorithms do by changing the data about what people appear to like. For example, let’s say someone wants to sell more of their cheap flashlights, and they don’t really care about the ethics of how they get more sales. So they pay hundreds of people to rate their flashlight with a 5-star review on Amazon.
If Amazon uses those shilled 5-star reviews in their recommendation engines and search rankers, those algorithms will mistakenly believe that hundreds of people think the flashlights are great. Everyone will see and buy the terrible flashlights. The bad guys win.
If Amazon instead chooses to treat that data as inauthentic, as faked and bought-and-paid-for, and ignores those hundreds of paid reviews, that poorly-made flashlight is far less likely to be shown to and bought by Amazon customers. After all, most real people don’t like that cheap flashlight. The bad guys tried hard to fake being popular, but they lost in the end.
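The difference between those two outcomes is a single choice about which data to trust. Here is a deliberately simplified sketch; in reality, detecting paid reviews is a hard problem, and the authenticity flag below is assumed to be given:

```python
from statistics import mean

# Hypothetical review data: (stars, looks_authentic). 300 paid 5-star
# reviews against 40 genuine reviews from unhappy buyers.
reviews = [(5, False)] * 300 + [(2, True)] * 40

def average_rating(reviews, drop_inauthentic: bool) -> float:
    """The same data produces very different rankings depending on
    whether suspected fake reviews are kept or discarded."""
    kept = [stars for stars, authentic in reviews
            if authentic or not drop_inauthentic]
    return mean(kept)

print(average_rating(reviews, drop_inauthentic=False))  # ~4.65: the shilling works
print(average_rating(reviews, drop_inauthentic=True))   # 2.0: the shilling is discarded
```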
The choice of what data is used and what is discarded makes an enormous difference in what is amplified and what people are likely to see. And since wisdom of the crowd algorithms assume that each vote for what is interesting and popular is independent, the choice of what votes are considered, and whether ballot-box stuffing is allowed, makes a huge difference in what people see.
Humans make these choices. The algorithms have no agency. It is people, working in teams inside companies, that make choices on how to tune algorithms and what data is used by wisdom of the crowd algorithms. Those people can choose to do things one way, or they can choose to do them another way.
“Facebook employees decide what data sources the software can draw on in making its predictions,” reported the Washington Post. “And they decide what its goals should be — that is, what measurable outcomes to maximize for, and the relative importance of each.”
Small choices by teams inside these companies can make a big difference in what the algorithms do. “Depending on the lever, the effects of even a tiny tweak can ripple across the network,” wrote the Washington Post in another article, titled “Five Points for Anger, One Point for Like.” People control the algorithms. By tuning the algorithms, teams inside Facebook are “shaping whether the news sources in your feed are reputable or sketchy, political or not, whether you saw more of your real friends or more posts from groups Facebook wanted you to join, or if what you saw would be likely to anger, bore or inspire you.”
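That article’s title describes one such lever: for a time, Facebook’s ranking counted an angry reaction as worth five likes. A rough sketch of how a lever like that might work (the scoring function and post data here are invented for illustration):

```python
# Weights echo the article's title; a team chose these numbers,
# and a team can change them.
REACTION_WEIGHTS = {"like": 1, "anger": 5}

def engagement_score(reactions: dict) -> int:
    """Posts with higher scores rise in the feed."""
    return sum(REACTION_WEIGHTS.get(name, 0) * count
               for name, count in reactions.items())

calm_post = {"like": 100}
angry_post = {"anger": 30, "like": 10}
print(engagement_score(calm_post))   # 100
print(engagement_score(angry_post))  # 160: the angrier post wins
```

Dial the anger weight back down to 1 and the calm post wins instead. That single number is one of the levers the reporting describes.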
It's hard to find the right solutions if you don't first correctly identify the problem. The problem is not the algorithm. The problem is how people optimize the algorithm. People control what the algorithms do. What wisdom of the crowd algorithms choose to show depends on the incentives people have.
(This was an excerpt from the draft of my book. Please let me know if you like it and want more.)