
Unionised drivers call on Microsoft to suspend Uber’s Face API licences

Unionised private hire drivers in the UK are calling on Microsoft to suspend Uber’s licences to use its Face API technology, after claims that the ride-hailing firm’s ID-checking system has led to drivers losing their jobs and having their licences revoked

A union for drivers and couriers working in the gig economy has called on Microsoft to suspend the sale of its facial-verification systems to Uber after allegations drivers have had their accounts wrongly deactivated following misidentification.

The App Drivers and Couriers Union (ADCU) and its associated data trust, Worker Info Exchange (WIE), said they had identified seven cases of Uber drivers losing their jobs and subsequently having their licences revoked by Transport for London (TfL) as a result of the company’s Real-Time ID Check facial-verification system failing to recognise their faces.

In an investigation published by Wired in early March 2021, a further 14 Uber Eats couriers shared evidence with journalist Andrew Kersley showing how the technology had failed to recognise their faces, leading to threats of termination and account closure.

The Real-Time ID Check system uses Face API, face-matching software developed by Microsoft that can be used for either facial verification or facial recognition. It essentially acts as a comparison tool, checking selfies taken by couriers and drivers as they log in against photographs held in Uber’s database to confirm their identities.

Facial recognition is a one-to-many process that people typically cannot opt out of, because they do not know when it is taking place. Facial verification, by contrast, is a one-to-one process in which the individual knows the biometric processing is happening and actively takes part in it by submitting information via a personal device.
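Although neither Uber nor Microsoft has published the details of the integration, the one-to-one check described above maps onto Face API’s publicly documented detect and verify operations: detect a face in the login selfie, detect a face in the stored reference photo, then ask the service whether the two belong to the same person. The sketch below illustrates that flow against the public REST endpoints; the endpoint address, subscription key and image paths are placeholders, and it is not a representation of Uber’s actual implementation.

```python
# Minimal sketch of a one-to-one face verification check using the
# Azure Face API REST endpoints. Endpoint, key and file paths are
# placeholders; this illustrates the documented detect/verify flow,
# not Uber's production integration.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "<subscription-key>"                                        # placeholder


def detect_face_id(image_path: str) -> str:
    """Upload an image and return the faceId of the first detected face."""
    with open(image_path, "rb") as f:
        resp = requests.post(
            f"{ENDPOINT}/face/v1.0/detect",
            headers={
                "Ocp-Apim-Subscription-Key": KEY,
                "Content-Type": "application/octet-stream",
            },
            data=f.read(),
        )
    resp.raise_for_status()
    faces = resp.json()
    if not faces:
        raise ValueError(f"No face detected in {image_path}")
    return faces[0]["faceId"]


def verify(selfie_path: str, reference_path: str) -> dict:
    """Compare a login selfie against a stored reference photo (one-to-one)."""
    body = {
        "faceId1": detect_face_id(selfie_path),
        "faceId2": detect_face_id(reference_path),
    }
    resp = requests.post(
        f"{ENDPOINT}/face/v1.0/verify",
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json=body,
    )
    resp.raise_for_status()
    # Response contains "isIdentical" (bool) and "confidence" (0.0-1.0).
    return resp.json()


if __name__ == "__main__":
    result = verify("login_selfie.jpg", "account_photo.jpg")
    print(result)  # e.g. {'isIdentical': True, 'confidence': 0.82}
```

In practice, a deployment would apply a confidence threshold to the raw result and, as Uber says it does, add human review before any account action is taken.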

“Some 94% of licensed private hire drivers in London are BAME [Black, Asian and Minority Ethnic] and Microsoft openly acknowledges that its facial recognition systems have a failure rate of as much as 20% when used with women and people of colour,” said ADCU president Yaseen Aslam.

“Based on these statistics, and going by our current caseload, thousands of innocent people will lose their jobs and face licensing action if the use of this racist software by Uber is not stopped. Private hire drivers in the UK have been exploited enough by Uber without Microsoft adding to the problem.”

Ironing out fairness flaws

Commenting on the union’s allegations, a Microsoft spokesperson said the company was “committed to testing and improving Face API, paying special attention to fairness and its accuracy across demographic groups”.

“We also provide our customers with detailed guidance for getting the best results and tools that help them to assess fairness in their system.”

An Uber spokesperson added the system was designed to protect the safety and security of passengers by ensuring the correct driver or courier is using the account.

“While no tech or process is perfect and there is always room for improvement, we believe the technology, combined with the thorough process in place to ensure a minimum of two manual human reviews prior to any decision to remove a driver, is fair and important for the safety of our platform,” they said.

In March 2021, during legal proceedings where drivers were seeking greater access to their personal data following account deactivations by Uber, a Dutch court ruled that while there was enough human intervention to find the firm’s decisions were not completely automated, it would still have to give two drivers accused of “fraudulent activity” access to the data it used to make the decisions.

The ADCU has written to the Mayor of London to demand that all TfL private hire driver licence revocations based on evidence produced by Uber’s identification systems are immediately reviewed.

The union further claimed Uber rushed to implement the identification system in a bid to win back its London operating licence after TfL decided in late 2019 that it would not be renewed over problems with unauthorised drivers using the platform.

Evidence given to the regulator by Uber during its licensing appeal in September 2020 showed that, because of failures in the company’s manual identification process, it had started to roll out the Real-Time ID Check system from April 2020 onwards.

“TfL has taken an active interest in [Uber’s] proposals with regard to this product. There are clear benefits to the product and TfL supports any technology which increases passenger safety by ensuring the driver is licensed by TfL and permitted to use the Uber app,” wrote TfL’s director of licensing, regulation and charging, Helen Chapman, in her witness statement.

She added that TfL had received a Data Protection Impact Assessment (DPIA) for the system from Uber in March 2020.

“I consider the use of this product a step in the right direction, although clearly its implementation is still at a very early stage. I therefore cannot meaningfully comment on the effectiveness of it at this stage,” said Chapman.

James Farrar, director of WIE and co-founder of the ADCU, said there was a “surveillance tech arms race in the gig economy”, driven in part by platforms’ attempts to move to “more hidden forms of algorithmic control” and in part by pressure from regulators like TfL.

“Uber has rushed to implement a flawed facial recognition system provided by Microsoft and uses it against a 70,000-strong workforce it is already well known for exploiting, but without the appropriate safeguards in place,” he said. “Microsoft must have no truck with Uber’s exploitation of vulnerable workers [and] must suspend Uber’s licence to use this solution immediately.”

Facial recognition bias

In 2018, research from MIT indicated that Microsoft’s facial-recognition and detection systems – specifically the Face API being used by Uber – had gender and racial biases, finding much higher error rates when classifying women and people with darker skin.

“The substantial disparities in the accuracy of classifying darker females, lighter females, darker males and lighter males in gender classification systems require urgent attention if commercial companies are to build genuinely fair, transparent and accountable facial analysis algorithms,” said authors Joy Buolamwini and Timnit Gebru.
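The disparities the researchers describe come from evaluating error rates separately for each demographic subgroup rather than reporting a single aggregate accuracy figure. The short sketch below, using entirely hypothetical records, illustrates that kind of disaggregated evaluation; the subgroup labels and values are placeholders, not figures from the study.

```python
# Sketch of a disaggregated error-rate evaluation, in the spirit of the
# methodology described above: compute error rates per demographic
# subgroup instead of one aggregate accuracy. All data is hypothetical.
from collections import defaultdict


def error_rates_by_group(records):
    """records: iterable of (group, predicted_label, true_label) tuples."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        if predicted != actual:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}


# Hypothetical evaluation records: (subgroup, predicted label, ground truth).
sample = [
    ("lighter_male", "male", "male"),
    ("lighter_male", "male", "male"),
    ("darker_female", "male", "female"),   # misclassification
    ("darker_female", "female", "female"),
]

for group, rate in error_rates_by_group(sample).items():
    print(f"{group}: {rate:.0%} error rate")
```

An aggregate figure can look acceptable while a single subgroup’s error rate is far higher, which is the pattern of disparity the study highlighted.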

In June 2020, Microsoft – alongside Amazon and IBM – suspended the sale of its facial-recognition technologies to US law enforcement agencies in response to several weeks of mass protests against the police murder of George Floyd on 25 May.

Microsoft president Brad Smith said at the time: “We will not sell facial-recognition technology to police departments in the United States until we have a national law in place, grounded in human rights, that will govern this technology.”

Smith previously told ITV in January 2019 that one of the challenges with the technology in its current form was that “it doesn’t work as well for women as it does for men, it doesn’t work as well for people of colour”, adding it was more likely to find errors, mismatch and generally “fail to identify” people from these groups.
