Grindr, a popular gay dating app, tweeted that it will remove its ethnicity filters in a show of solidarity with the Black Lives Matter movement. The filters, which let paying users avoid seeing people of certain ethnicities, will disappear in the next app update. The decision comes amid protests across the US over police brutality against Black women and men.
The ethnicity filters have been controversial for years, yet they remained in place even as Grindr ran an anti-racism campaign on the platform in 2018.
That same year, the company discussed removing the ethnicity filters, according to Landen Zumwalt, Grindr's former head of communications, but it wasn't ready to get rid of them.
He said the team wanted to talk to its users first, since the filters gave people in minority groups an easier way to find and match with one another. He acknowledged, however, that he believed the ethnicity filter also promotes racist behavior in the app.
Other apps, such as The League and Hinge, also allow users to filter out people of certain ethnicities. In 2019, The League CEO Amanda Bradford said these filters don't encourage racism; instead, they help people of color find others like them.
She cited the example of an Indian woman looking for an Indian man: the filters make that search far more efficient than scrolling through thousands of profiles.
At the same time, researchers have found that users on dating apps reject people of color more often than white people. The filters could reinforce racist attitudes and allow users to keep discriminating against people of color.
For example, in 2014, OkCupid found that white women were much less likely to show interest in Black or Asian men.
What Removing Grindr's Ethnicity Filters Can Do
Removing the filters might stop users from explicitly telling the app their racial preferences, but it doesn't necessarily mean they'll start matching with people who look different from them.
Users' matching behaviors could still inform Grindr's algorithms, because the app can learn from how people behave. If users don't match with people of color, Grindr could learn to deprioritize those profiles.
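To make the feedback loop concrete, here is a toy sketch in Python. This is not Grindr's actual system; the class, the groups, and the match rates are all hypothetical, chosen only to show how a ranker that learns from match behavior can end up deprioritizing a group that users skip.

```python
# Toy illustration of a recommender feedback loop (hypothetical, not Grindr's code).
# If users systematically skip one group, its learned score drops, so the ranker
# shows that group less often, and the bias compounds over time.

from collections import defaultdict

class ToyRanker:
    def __init__(self):
        # Running impression/match counts per profile group.
        self.shown = defaultdict(int)
        self.matched = defaultdict(int)

    def score(self, group):
        # Smoothed match rate; a never-shown group starts at 0.5.
        return (self.matched[group] + 1) / (self.shown[group] + 2)

    def record(self, group, did_match):
        self.shown[group] += 1
        if did_match:
            self.matched[group] += 1

ranker = ToyRanker()
# Simulate biased users who match with group "A" 60% of the time
# but with group "B" only 20% of the time.
for i in range(100):
    ranker.record("A", did_match=(i % 5 < 3))   # 60% match rate
    ranker.record("B", did_match=(i % 5 == 0))  # 20% match rate

# The ranker now scores group "B" lower and would surface it less often,
# which in turn gives group "B" fewer chances to be matched at all.
print(ranker.score("A") > ranker.score("B"))  # prints True
```

Nothing in this sketch looks at race directly; the disparity emerges purely from biased user behavior, which is why removing an explicit filter doesn't by itself remove the bias.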
Across industries, algorithms can absorb racial and gender bias from the data and decisions they learn from. With that information, a computer can make decisions faster, potentially determining which things you see. These systems can carry bias from who builds them, how they're developed, and how they're ultimately used. This is known as algorithmic bias.
It's tough to figure out exactly how a given system is susceptible to algorithmic bias, especially since this technology often operates in a corporate black box. People frequently don't know how a particular artificial intelligence or algorithm was designed, what data helped build it, or how it works.
Typically, you can only see the end result: how it has affected you. That is, if you're even aware that AI or an algorithm was in use in the first place.
That makes the biases of artificial intelligence tricky to address, but all the more important to understand. Removing filters, then, is just one step toward making the experience of people of color more pleasant and closer to that of white users.