Many new “sharing economy” companies, like Uber and Airbnb, use consumer-sourced ratings to evaluate their workers – but these systems can be fraught with difficulties, including bias based on race or gender.
In their paper, “Discriminating Tastes: Customer Ratings as Vehicles for Bias,” researchers examine how bias creeps into customers’ evaluations of Uber drivers. Karen Levy and Solon Barocas, assistant professors in the Department of Information Science, Alex Rosenblat of the Data and Society Research Institute, and Tim Hwang of Google show that Uber’s rating system lets consumers assert their preferences and biases directly, in ways that companies themselves are prohibited from doing under federal law.
“My collaborators and I are interested in the ways bias can enter algorithmic systems without being prohibited by anti-discrimination law,” Levy said.
Ratings are especially important to Uber drivers because they risk getting kicked off the platform if their average score falls below a certain level.
“The algorithms affect whether these drivers get terminated or advanced,” Levy said. “We know that people tend to have implicit biases that affect how they evaluate people from different groups. It would be illegal for an employer to discriminate directly, but this creates the possibility of backdoor bias creeping in from customers. These new technologies challenge the traditional ways the law prevents discrimination in the workplace.”
The paper concludes that consumer-sourced ratings are highly likely to be influenced by bias on the basis of factors like race or ethnicity. To help address this, the authors propose 10 interventions, such as data-quality measures, improved design elements in the rating system, and the use of human evaluators to validate ratings.
“Ratings are really subjective and mean different things to different people. Because they’re so general, there’s a lot of potential for bias to enter into them,” Levy said. “There have already been situations in which sharing economy companies have had to deal with discrimination between their users. The law has not caught up with how to protect people in these new environments.”
“Discriminating Tastes” was awarded a best paper prize at the 2016 Internet, Policy and Politics conference held this fall at the Oxford Internet Institute.
Leslie Morris is director of communications for CIS. This article originally appeared in the Cornell Chronicle.