
Working together to defeat bias and discrimination in the sharing economy

In these Trump times, it seems more important than ever to discuss discrimination and how we solve it. As the sharing economy grows, it brings to light societal issues of bias and discrimination, but new approaches to data and business models offer hope that bias can one day be a thing of the past.

By Pernille Spang Lyndegaard

Uncovering racial bias

One of the major debates surrounding the sharing economy in 2016 was sparked by an academic research paper out of Harvard Business School documenting racial discrimination on platforms like Airbnb. The researchers studied 6,400 listings in five major US cities using fake “stereotypical race” profiles and found that guests with African-American-sounding names were 16% less likely to be accepted for a booking. In addition, people of colour earned less money as hosts.

This sparked the #airbnbwhileblack hashtag, under which users highlighted examples of potential racial bias on the Airbnb platform. Users detailed how hosts would decline their booking requests for no apparent reason, echoing the study’s findings.

“The study found that guests with African-American-sounding names were 16% less likely to be accepted for a booking”

Airbnb’s response was a non-discrimination policy with detailed, specific guidance for hosts on how to avoid discriminating based on race, gender, religion and more; but it was widely criticised for not being ambitious enough.

Many types of bias

As more and more of these platforms are built, the same issues we face in broader society are showing up in the communities being built around these sharing platforms. And it is not only racial bias: there are issues related to gender, sexual orientation, religion and more.

“A growing partnership between reputation venture Deemly and home-sharing disruptor Innclusive gives insight into possible solutions”

Naturally, this is not just a problem for Airbnb. Both Uber and Lyft have been criticised for discriminating against minorities and women: the study behind those criticisms revealed that drivers would take longer to pick up certain passengers, take longer to drop them off, or never pick them up at all. Similar issues have been raised by the LGBT community. So how do we solve these issues in the sharing economy?

Data can prevent discrimination

It seems 2017 is set to continue this discussion, as new research from the University of Michigan’s Ross School of Business suggests that more information about guests is important for eliminating bias. Specifically, the researchers tested adding a single host review to each profile, which evened out the bias.

It wasn’t only positive reviews that swayed hosts to accept bookings: fictitious guests with negative reviews received the same acceptance rate, and results were statistically even across both name groups. The researchers concluded that Airbnb should incentivise hosts to write reviews of new guests, and give guests a more structured way to signal their credibility.

Deemly CEO Sara Green Brodersen believes that using technology and data tools to track and inform reputations in the sharing economy offers a significantly more accurate, standardised and robust way to reduce discrimination. Such reputation scores let guests and hosts quickly and reliably gauge a counterpart’s track record across the sharing landscape, and they strip out much of the subjectivity and interpretation built into many review systems today.
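To make the idea concrete, here is a minimal sketch of what a portable, cross-platform reputation score could look like. It is an illustration only, not Deemly’s actual algorithm: the platform names, the weighting by review count and the 0–100 scale are all assumptions made for the example.

```python
# Hypothetical sketch of a portable, cross-platform reputation score.
# Platform names, weights and the 0-100 scale are illustrative only;
# this is not Deemly's actual algorithm.

from dataclasses import dataclass

@dataclass
class PlatformRating:
    platform: str      # e.g. a home-sharing or ride-sharing service
    score: float       # the user's average rating on that platform's own scale
    max_score: float   # top of that platform's scale (5, 10, ...)
    review_count: int  # how many reviews the average is based on

def portable_reputation(ratings: list[PlatformRating]) -> float:
    """Combine ratings from several platforms into one 0-100 score,
    weighting each platform by how many reviews it contributes."""
    total_reviews = sum(r.review_count for r in ratings)
    if total_reviews == 0:
        return 0.0  # no history yet: return no score rather than a fake one
    weighted = sum((r.score / r.max_score) * r.review_count for r in ratings)
    return round(100 * weighted / total_reviews, 1)

# Example: a guest with history on two (invented) platforms.
print(portable_reputation([
    PlatformRating("home-sharing", 4.8, 5, 12),
    PlatformRating("ride-sharing", 9.1, 10, 40),
]))  # -> 92.2
```

Weighting by review count is just one possible design choice; the point is that the score is computed the same way for every user, regardless of name or photo.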

Tackling discrimination from the start

When Airbnb competitor Innclusive launched in late 2016, it took on the mammoth task of designing discrimination-free technology and processes. It opted to remove the temptation for bias by reordering when hosts see certain biographic information. In addition, it built an organisation and business practices committed to using available data and monitoring to predict and act on observed biases on its platform.
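As a rough sketch of that reordering idea (again an assumption, not Innclusive’s actual implementation), a booking flow can withhold a guest’s name and photo until after a host has accepted the request, so the accept-or-decline decision rests on dates, guest count and reputation alone. The field and function names below are hypothetical.

```python
# Illustrative sketch of deferring identity details until after acceptance.
# Field and function names are hypothetical, not Innclusive's implementation.

from dataclasses import dataclass

@dataclass
class BookingRequest:
    guest_id: str
    dates: tuple             # (check-in, check-out) as ISO date strings
    guest_count: int
    reputation_score: float  # e.g. a portable score like the one sketched above
    guest_name: str = ""     # stored, but hidden from hosts before acceptance
    guest_photo_url: str = ""

def host_view(request: BookingRequest, accepted: bool) -> dict:
    """Return only the fields a host is allowed to see at this stage."""
    view = {
        "dates": request.dates,
        "guest_count": request.guest_count,
        "reputation_score": request.reputation_score,
    }
    if accepted:
        # Name and photo are revealed only after the host has said yes.
        view["guest_name"] = request.guest_name
        view["guest_photo_url"] = request.guest_photo_url
    return view

req = BookingRequest("g-42", ("2017-05-01", "2017-05-05"), 2, 92.2,
                     guest_name="A. Guest",
                     guest_photo_url="https://example.org/photo.jpg")
print(host_view(req, accepted=False))  # no name or photo in the payload
print(host_view(req, accepted=True))   # identity shown only post-acceptance
```

The design choice here is purely about sequencing: the identity details still exist and are shown once a booking is confirmed, but they cannot influence the initial decision.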

“While both of these sharing companies are still young, they are tackling the issues of bias and discrimination head-on”

Innclusive’s Head of Strategy, Kevin Simmons, believes that companies have a responsibility to lead by example; in the sharing economy, that means insisting that anyone and everyone be able to benefit fairly. While he supports the idea that increased data availability could raise comfort levels between participants, he warns that it is not a cure-all, and that companies should ensure their technology and practices do not enable biased actions even in a data-rich environment.

While both of these sharing companies are still young, they are tackling the issues of bias and discrimination head-on and proving that data and bias-free business design offer the best solutions for combating prejudice and discrimination.

Follow for more on this, including an upcoming video podcast from Open 2017.

Kevin Simmons is the Director of Strategy & Business Development at Innclusive, the leading home-sharing platform for minority groups. Founded in 2016, it already has hosts in more than 130 countries.

Sara Green Brodersen is the Founder & CEO of Deemly, which works to build trust for users and platforms in the sharing economy through its rating and review system, allowing users to take their reputation across platforms.
