Uber faces legal action over ‘racially discriminatory’ facial recognition ID checks

Ride-hailing giant Uber is facing a legal challenge over its use of real-time facial recognition technology in a driver and courier identity check system that it uses in the UK.

The App Drivers & Couriers Union (ADCU) announced the legal action Tuesday, alleging that Uber’s biometric identity checks discriminate against people of color.

The union said it’s taking the action after the unfair dismissal of a former Uber driver, Imran Javaid Raja, and a former Uber Eats courier, Pa Edrissa Manjang, following failed checks using the facial recognition technology.

Commenting in a statement, Yaseen Aslam, president of ADCU, said: “Last year Uber made a big claim that it was an anti-racist company and challenged all who tolerate racism to delete the app. But rather than root out racism Uber has bedded it into its systems and workers face discrimination daily as a result.”

The ADCU is launching a CrowdJustice campaign to help fund the legal action — which it said is also being supported by the Equality & Human Rights Commission and the not-for-profit Worker Info Exchange (WIE).

The latter was set up by former Uber driver James Farrar, who is now general secretary of the ADCU and director of WIE. His name should be familiar: he successfully sued Uber over its employment classification of UK drivers, forcing the company into a U-turn earlier this year when it finally announced it would treat drivers as workers, after years spent trying to overturn successive employment tribunal rulings.

Farrar’s next trick could be to bring a legal reckoning around the issue of algorithmic accountability in the so-called ‘gig economy’.

The action also looks timely, as the UK government is eyeing changes to the legal framework around data protection, which could extend to removing current protections that apply to certain types of AI-driven decisions.

“Workers are prompted to provide a real-time selfie and face dismissal if the system fails to match the selfie with a stored reference photo,” the ADCU writes in a press release explaining how drivers experience Uber’s system. “In turn, private hire drivers who have been dismissed also faced automatic revocation of their private hire driver and vehicle licenses by Transport for London.”

The union says Uber’s real-time facial recognition checks, which incorporate Microsoft’s Face API technology, have been in use on the ride-hailing platform in the UK since March 2020.

Uber introduced the selfie identity checks ahead of another hearing over its licence renewal in London. That followed an earlier suspension by the city’s transport regulator, TfL, which has raised safety concerns over its operations for years — branding Uber “not fit and proper to hold a private hire operator licence” in a shock denial of its licence four years ago.

Despite losing its licence to operate in the UK capital all the way back in 2017, Uber has been able to keep operating in the city continuously while appealing the regulatory action.

It gained a provisional 15-month licence in 2018 — though not the full five-year term. Later it got a two-month licence in 2019, with a laundry list of operational conditions from TfL — before once again being denied a full licence renewal in November 2019.

Then in September 2020 Uber was granted a licence renewal — but, again, only for 18 months. So to say Uber’s UK business has been under pressure over safety for years is putting it mildly.

The ADCU notes that in September 2020, when the Westminster Magistrates Court (most recently) renewed Uber’s licence for London, it set a condition that the company must “maintain appropriate systems, processes and procedures to confirm that a driver using the app is an individual licensed by TfL and permitted by ULL to use the app”.

“This condition facilitated the introduction of harmful facial recognition systems,” the ADCU argues.

Earlier this year the ADCU and the WIE called for Microsoft to suspend Uber’s use of its B2B facial recognition technology — after finding multiple cases where drivers were mis-identified and went on to have their licence to operate revoked by TfL.

Now the union says its lawyers will argue that facial recognition systems, including those operated by Uber, are “inherently faulty and generate particularly poor accuracy results when used with people of color”.

Under the terms of Uber’s licence to operate in London the company reports failed driver identity checks to TfL — which can then revoke a driver’s licence, meaning he or she is unable to work as a private hire vehicle driver in the city.

The ride-hailing giant also appears to use the same real-time facial verification identity check technology for both Uber drivers and Uber Eats couriers — even though the latter are delivering food, not ferrying passengers around. And in one letter seen by TechCrunch, in which TfL writes to an Uber driver to inform him that it is revoking his private hire licence, the regulator makes reference to information provided by Uber regarding the driver’s dismissal as an Uber Eats courier on account of a failed ID check carried out by Uber’s sister company.

That failed ID check as a food delivery courier then appears to have been used by TfL as justification for revoking the same person’s private hire vehicle licence — on “public safety” grounds.

“It is recognized that the failed checks did not occur on a private hire operator’s booking platform or while undertaking any bookings. It is also the case that there does not appear to have been any evidence to suggest that this type of behavior has taken place on the booking platform of a licenced private hire vehicle operator. However, the information that has been provided indicates that you have been seen to fail identification checks that have been conducted,” writes TfL with some particularly tortuous logic.

“This type of activity being identified on any platform does suggest a propensity to behave in the manner that has been alleged,” it goes on, before adding: “When that is then considered in terms of a private hire driver, it does then have the potential to put the travelling public at risk.”

The letter concludes by informing the Uber driver that their licence is being revoked and providing details of how they can appeal the decision.

Farrar told us that “several” of the Uber drivers the union is representing had their licences revoked by TfL after being dismissed by Uber for failing ID checks on Uber Eats, which Uber then reported to TfL — something he called “disturbing”.

Commenting on the lawsuit in a statement, he added: “To secure renewal of their license in London, Uber introduced a flawed facial recognition technology which they knew would generate unacceptable failure rates when used against a workforce mainly composed of people of colour. Uber then doubled down on the problem by not implementing appropriate safeguards to ensure appropriate human review of algorithmic decision making.”

The ADCU’s legal representative, Paul Jennings, a partner at Bates Wells, described the cases as “enormously important” — and with AI “rapidly becoming prevalent in all aspects of employment” he suggested the challenge would establish “important principles”.

Reached for comment on the legal action, an Uber spokesperson claimed that the selfie ID check it uses features “robust human review” — telling us in a statement:

“Our Real-Time ID Check is designed to protect the safety and security of everyone who uses the Uber app by helping ensure the correct driver is behind the wheel. The system includes robust human review to make sure that this algorithm is not making decisions about someone’s livelihood in a vacuum, without oversight.”

The company prefers to refer to the technology it uses for these real-time ID checks as ‘facial verification’ (rather than facial recognition), while its claim of “robust” human review implies that no Uber or Uber Eats account is deactivated solely as a result of AI.

That’s important because under UK and EU law, individuals have a right not to be subject to solely automated decisions that produce a legal or similarly significant effect on them. And algorithmic denial of employment would very likely meet that bar — hence Uber’s urging that its algorithmic identity checks do involve a human in the loop.

However, the question of what constitutes ‘meaningful’ human review in this context is key — and something the courts will have to wrestle with at some point.

Asked what steps Uber has taken to assess the accuracy of its facial verification technology, Uber would not provide a public comment. But we understand that an internal Fairness Research team has carried out an assessment to see whether the Real-Time ID Check system performs differently based on skin color.

However we have not seen this internal research so we are unable to confirm its quality. Nor can we verify an associated claim that an “initial assessment” did not reveal “meaningful differences”.

Additionally, we understand Uber is working with Microsoft on ongoing fairness testing of the facial verification system — with the aim of improving general performance and accuracy.

Farrar told TechCrunch that the union has won at least 10 appeals in the Magistrates’ Court against driver dismissals by TfL that cite Uber’s real-time ID checks. “With Imran, Uber and TfL have already admitted they got it wrong. But he was out of work for three months. No apology. No compensation,” he also said.

In other cases, Farrar said, appeals have focused on whether the driver in question was ‘fit and proper’, which is the test TfL applies. For these, he said, the union made subject access requests to Uber ahead of each hearing — asking for the driver’s real-time ID data and an explanation for the failed check. But Uber never provided the requested data.

“In many of the cases we got our costs,” Farrar also told us, adding: “This is unusual because public bodies have protection to do their job.” He went on to suggest that the judges had taken a dim view on hearing that Uber had not given the ADCU the requested data, and that TfL either did not get the data from Uber or asked for it too late.

“At one Crown Court hearing the judge actually adjourned and asked for TfL’s Counsel to phone TfL and ask why Uber had not given them the data and if they ever expected to get it,” he added. “As you can see we eventually did get pictures for Pa and they are displayed in the CrowdJustice page — but we still cannot tell which of these pictures failed [Uber’s real-time ID check].”

TechCrunch asked Uber for a copy of its Data Protection Impact Assessment (DPIA) for the Real-Time ID Check system — which should have considered the technology’s risks to individuals’ rights — but the company did not respond to our question. (We have asked to see a copy of this before — and have never been sent one.)

We have also asked TfL for a copy of the DPIA. Farrar told us that the regulator refused to release the document, despite the ADCU making a Freedom of Information request for it.

At the time of writing TfL was not available for comment.

Asked for his view on why the regulator is so keen on the facial recognition checks, Farrar suggested that by getting Uber to carry out this sort of “self-enforcement” it sets a de facto regulatory standard without TfL having to define an actual standard — which would require it to carry out proper due diligence on key details such as an equality impact assessment.
