Is Uber’s facial recognition software firing Black and Asian drivers?

Uber is facing legal proceedings over alleged indirect racial discrimination, brought by a driver who claims he was fired after the facial recognition software the company uses failed to recognise him.

In a claim filed with the employment tribunal this week, a Black driver, who asked not to be named, says he lost his ability to work after Uber’s UK subsidiary deactivated his account because the software failed to recognise him in two separate photos.

The Independent Workers’ Union of Great Britain (IWGB), which filed the claim on behalf of the driver, has told Euronews Next that it has confirmed at least 35 similar dismissals among its members since the start of the COVID-19 pandemic. It warned that “hundreds, if not thousands,” could be affected.

The IWGB is calling on Uber to stop using “racist algorithms” and to reinstate drivers it believes were unfairly dismissed because of suspected software errors.

In a statement, Uber said that its facial recognition software was “designed to protect the safety and security of everyone who uses the Uber app by helping ensure the correct driver is behind the wheel”.

The company said the system includes “robust human review to make sure that this algorithm is not making decisions about someone’s livelihood in a vacuum, without oversight”.

The tribunal claim, seen by Euronews Next, alleges that the driver, who worked for Uber from 2016 until his dismissal in April last year, was not offered manual photo verification.

Uber has been using real-time identity verification in the UK since April 2020, after London’s transport regulator Transport for London (TfL) raised concerns about passenger safety.
