It’s Not You, It’s The Algorithm

    There’s Something Seriously Wrong With Dating Apps

    The algorithm in dating apps has serious flaws. And dating app makers don’t want you to know.

    Can dating apps help you find love? Yes! Algorithmic matchmaking is now the dominant way singles meet each other in the United States. But there’s one thing most dating app makers don’t want you to know: the particular algorithm they use can do the exact opposite of finding your true love. It can permanently separate you from your perfect someone, and everyone like them, due entirely to factors you don’t control.


    This algorithm is called collaborative filtering. It’s pervasive: It powers your Facebook and Twitter feeds, your Google searches, and your Netflix and Amazon recommendations. It’s not that complicated. You’ve seen this a million times: “You might also like…” How does Amazon know what you might also like, and why does it use the word “also”? Because you’re not the only person on Earth buying tortilla chips.


    Amazon looks up what else tortilla chip buyers have bought: salsa. So it knows “you might also like” salsa without really understanding anything about the innate relationship between tortilla chips and salsa. The same exact thing is going on with dating, except the thing that’s on offer is people.
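
    The mechanics behind “you might also like” can be sketched in a few lines. Below is a toy item-to-item recommender with an invented purchase table; Amazon’s real pipeline is far more elaborate, but the core idea is just co-occurrence counting:

```python
from collections import Counter

# Invented purchase histories -- a stand-in for real transaction data.
purchases = {
    "ana":   {"tortilla chips", "salsa"},
    "ben":   {"tortilla chips", "salsa", "guacamole"},
    "carol": {"tortilla chips", "lime"},
    "dan":   {"bread", "butter"},
}

def also_like(item):
    """Rank what else buyers of `item` bought, most common first."""
    counts = Counter()
    for basket in purchases.values():
        if item in basket:
            counts.update(basket - {item})
    return [thing for thing, _ in counts.most_common()]

print(also_like("tortilla chips"))  # salsa ranks first
```

    Note that nothing in the code understands food: salsa ranks first purely because two chip buyers also bought it.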


    Collaborative filtering in dating means that the earliest and most numerous users of the app have outsize influence on the profiles later users see. Some early user says she likes (by swiping right on) some other active dating app user. Then that same early user says she doesn’t like (by swiping left on) a Jewish user’s profile, for whatever reason.


    As soon as some new person also swipes right on that active dating app user, the algorithm assumes the new person “also” dislikes the Jewish user’s profile. Similar users have similar tastes, according to collaborative filtering. So the new person never sees the Jewish profile. A recent study of this phenomenon may change the way you think about online dating.
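
    That inference can be sketched directly. Below is a toy user-based collaborative filter with invented names, scoring swipes as +1 (right) and -1 (left); real apps use fancier math, but the shape is the same:

```python
# Invented swipe history: one early user and one new user.
swipes = {
    "early_user": {"popular_profile": +1, "jewish_profile": -1},
    "new_user":   {"popular_profile": +1},
}

def similarity(a, b):
    """Agreement between two users on profiles both have swiped."""
    shared = set(swipes[a]) & set(swipes[b])
    return sum(swipes[a][p] * swipes[b][p] for p in shared)

def predicted_score(user, profile):
    """Weight other users' swipes on `profile` by similarity to `user`."""
    return sum(similarity(user, other) * others_swipes[profile]
               for other, others_swipes in swipes.items()
               if other != user and profile in others_swipes)

# The new user has never seen jewish_profile, yet it is already
# scored negatively -- so a ranked feed may never surface it.
print(predicted_score("new_user", "jewish_profile"))  # -1
```

    One shared right-swipe is enough to import the early user’s dislikes into the new user’s feed.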


    A brand new simulation has quantified our gut feeling about dating apps: a feedback loop in collaborative filtering gives majority users better matches at the expense of minority users. Something innate to collaborative filtering disfavors people who are underrepresented in the data, in terms of their preferences, how early they joined the app, and how many of them there are. Without intending to, collaborative filtering reproduces the underlying causes of unequal opportunity in offline life.
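
    The feedback loop is easy to reproduce in miniature. The toy model below is our own illustration, not the simulation’s actual code: three invented profiles, where each round the app recommends whichever profile already has the most likes:

```python
# Invented starting counts: the majority group begins with more likes
# simply because it has more members.
likes = {"majority_a": 3, "majority_b": 3, "minority_a": 1}

for _ in range(10):
    # Recommend the currently most-liked profile; it collects the like.
    shown = max(likes, key=likes.get)
    likes[shown] += 1

print(likes)  # {'majority_a': 13, 'majority_b': 3, 'minority_a': 1}
```

    The head start compounds: the leading majority profile absorbs every new like, and the minority profile is never surfaced at all.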


    In a way, this is all beside the point, because there is no perfect dating algorithm, only compromises. There is an imbalance between what people want and what people give in dating. Not everyone’s preferences can be satisfied.


    A simple fix: Dating apps can give you a "Reset Button" to clear your history of likes and reset how the algorithm sees you. Or today, you can delete and recreate your dating app account. Both fixes take control of the algorithm in one important way: by not helping it. But we’re not in the business of proposing better alternatives to collaborative filtering. And people who suggest we try are missing the point.


    Society ought to be able to inspect how algorithms work, in the sense of looking at the code. Facebook CEO Mark Zuckerberg withstood hours of congressional grandstanding to answer questions about Facebook’s newsfeed algorithms. He never explained how the algorithm worked at a fundamental level, which was the question many congresspeople asked but lacked the technical fluency to press.


    The answer is “collaborative filtering,” but we only know that because of a zeitgeist in the software industry, not because anyone outside of Facebook looked at the code. So let’s just look at the code!


    This game I developed shows how the typical dating app algorithm works. You won’t actually have to go on any dates. The game is a simulation. You’ll still build a profile though: a monster profile. It’s called MonsterMatch, and it uses collaborative filtering to decide which monsters you’ll get to swipe left and right on—and which monsters you’ll never get a chance to see.


    We’re also sharing all the code, because layperson explanations are often co-opted to tell you the story the algorithm’s owner wants you to hear. If you want to see exactly how collaborative filtering works in a dating app, read the algorithm here. Tech companies that deploy collaborative filtering, dating apps least of all, never do this. But they ought to. Sharing the code is the only defense against people telling you one thing and doing another with software.


    In our opinion, if an algorithm’s code penalizes some people somehow, it doesn’t have to be illegal: people just ought to know how. This should assuage big tech companies that resist regulation of core intellectual property like recommendation algorithms. An informed consumer will improve digital inclusion—the treatment of minority people online. But sharing code is sometimes not enough.


    Some code, like collaborative filtering, lacks “interpretability.” It’s hard to know why the code does what it does, even when we know what it does. The interpretability problem crops up whenever an algorithm counts over lots of data, like counting swipes. So if you look for a piece of code that says, “Score Jewish users worse,” you’ll never find it. That’s not how it works.
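
    A toy scoring function makes this concrete. The invented example below contains no rule about any group, only counting; whatever bias exists lives in the swipe data it counts:

```python
# Invented swipe log: (profile, swipe) pairs, +1 right and -1 left.
swipe_log = [
    ("profile_a", +1), ("profile_a", +1), ("profile_a", -1),
    ("profile_b", -1), ("profile_b", -1),
]

def score(profile):
    """Sum of all swipes a profile has received -- nothing but counting."""
    return sum(s for p, s in swipe_log if p == profile)

# No line says "penalize profile_b," yet if biased users drove the
# left-swipes, the bias flows straight through the arithmetic.
print(score("profile_a"), score("profile_b"))  # 1 -2
```

    You can audit every line of a function like this and still find no smoking gun, because the gun is in the data.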


    When an algorithm can’t be interpreted, we ought to demand data on its consequences. If the consequence of an algorithm is discrimination, then even if no piece of code says “discriminate,” the algorithm is discriminatory. For example, if first-time dating app users who quit shortly after joining have something in common, we ought to know what that thing is.


    We regulate and inspect medicine, energy, finance, agriculture, transportation safety and education based on outcomes. The key feature of those regulations is sharing information for consumer choice and protection. Those industries still innovate and make money. We ought to apply the same standards to algorithms.


    Unlike those other industries, dating apps already collect comprehensive data about users and their behavior. There’s little cost burden to answering outcomes questions. While it’s difficult to know exactly which questions to ask, it would take an afternoon and a database connection to answer them.


    Today, we already know one thing: dating apps are effectively segregated. JDate and JSwipe for Jewish users; Amo Latina for Latino users; Tinder for coastal users. Dozens exist, each app its own community. Users report more satisfaction with these segregated apps, and dating app creators will write letters to the editor saying as much. But surely there’s a cost to segregation.


    Being funneled into a smaller, segregated experience often means fewer opportunities. There isn’t any evidence for this in segregated dating apps specifically; the data are not open to independent research. But history has shown that segregation disfavors the segregated. Anti-miscegenation laws imposed inequalities on future generations. Because it affects who has kids with whom, a segregated dating app could be a high-tech version of that shameful past.


    In light of this uncertainty, we ought to equip the public with enough information to make an informed choice about what dating apps to use. So share the code to the algorithm, and let the user decide if she’ll get a fair shake in the dating app game.


    Our prediction: If people really knew how much these apps screwed them, they’d stop using those apps. For a giant Internet company, that’s the scariest thing of all.