Can the algorithms that ride-hailing and delivery startups use be fair?

In June 2020, taking advantage of a Chicago law requiring ride-hailing apps to disclose their prices, researchers from George Washington University published an analysis of the algorithms that ride-sharing startups like Uber and Lyft use to set fares. It spotlighted evidence that the algorithms charged riders living in neighborhoods with older, lower-income and less-educated populations more than riders hailing from affluent areas, an effect the researchers attributed to the high popularity of — and thus the high demand for — ride-sharing in richer neighborhoods.

Uber and Lyft rejected the study’s findings, claiming that there were flaws in the methodology. But it was hardly the first study to identify troubling inconsistencies in the apps’ algorithmic decision-making.

Riders aren’t the only ones victimized by routing and pricing algorithms. Uber recently faced criticism for its “upfront fares” for drivers, which use an algorithm to calculate fares in advance based on factors that aren’t always in drivers’ favor.

In the delivery space, Amazon’s routing system reportedly encourages drivers to make dangerous on-the-road decisions in pursuit of shorter delivery windows. Meanwhile, apps like DoorDash and Instacart employ algorithms to calculate pay for couriers — algorithms that some delivery people say have made their earnings harder to predict.

As experts like Amos Toh, a senior researcher at Human Rights Watch who studies the effects of AI and algorithms on gig work, note, the more opaque these algorithms are, the harder it is for regulators and the public to hold the companies behind them accountable.

“When you put a fare calculation behind a black box algorithm, it’s possible to have the capacity to learn from driver behavior … and actually learn what is the lowest rate a driver will take for a ride,” Toh told The Markup in a recent interview. “We don’t have any evidence that Uber is doing this. But the real problem is the secrecy, because it makes it impossible to verify.”

Jamie Woodcock, a researcher at the University of Oxford specializing in the gig economy, refers to the practice as “algorithmic management.” First coined in 2015 to describe the managerial role played by algorithms on Uber and Lyft, algorithmic management can be broadly understood as the delegation of managerial functions to automated systems. The hallmarks include worker surveillance, automated performance reviews and the use of “nudges” or penalties to incentivize workers.
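To make the concept concrete, here is a deliberately simplified sketch of how such a managerial rule could be expressed in software. Everything in it (the metrics, thresholds, messages and function names) is invented for illustration and does not describe any real platform’s system; it only shows the general shape of the automated surveillance, performance review and nudging that Woodcock describes.

```python
# Hypothetical illustration of "algorithmic management": an automated rule that
# evaluates a gig worker's tracked metrics and decides on a nudge or penalty.
# All metrics, thresholds and actions here are invented for this example.

from dataclasses import dataclass


@dataclass
class WorkerStats:
    jobs_offered: int
    jobs_accepted: int
    on_time_deliveries: int
    completed_deliveries: int


def manage(stats: WorkerStats) -> list[str]:
    """Return the automated 'managerial' actions triggered by a worker's stats."""
    actions = []
    acceptance_rate = stats.jobs_accepted / max(stats.jobs_offered, 1)
    on_time_rate = stats.on_time_deliveries / max(stats.completed_deliveries, 1)

    # Surveillance plus automated performance review: metrics stand in for a manager.
    if acceptance_rate < 0.8:
        # A "nudge": an automated prompt pushing the worker to accept more jobs.
        actions.append("send notification: 'Accept more orders to keep your priority status'")
    if on_time_rate < 0.9:
        # A penalty: quietly rank the worker lower when assigning future jobs.
        actions.append("lower dispatch priority for the next 7 days")
    if not actions:
        actions.append("no action")
    return actions


if __name__ == "__main__":
    print(manage(WorkerStats(jobs_offered=50, jobs_accepted=35,
                             on_time_deliveries=30, completed_deliveries=33)))
```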

As Woodcock points out, the algorithms used by private hire and delivery companies will be “unfair” so long as they’re designed to minimize the cost of trips while maximizing efficiency. Unfair to whom depends on your perspective; shareholders might feel differently, for example. But from a humanitarian standpoint, Woodcock emphasizes that the routing and pricing algorithms used in today’s popular platforms weren’t designed with the worker — or all customers — in mind.

“Given the workers are not consulted on changes to the algorithm or able to shape it in their interests, the end result is an algorithm shaped in the interest of the platform owner that is designed to encourage customers to use it,” Woodcock told TechCrunch in an email interview. “The current model behind these decisions is about cost minimization: engaging a large number of drivers as self-employed contractors, paying them only when they are delivering food [or riders], and competing for customers as they try to monopolize the market.”

Christo Wilson, an associate professor of computer sciences at Northeastern University, makes the case that gig delivery and transportation apps are hostile to workers in a number of ways — not just algorithmically.

“The platforms in question certainly engage in algorithmic management, of which routing is a part, but not the central issue,” he told TechCrunch. “It’s more things like surge pricing, bonus and incentive structures, dark patterns in the design of the apps, and more that strongly incentivize workers to act in certain ways that are stressful.”

So what would it take to change this? There have been several attempts. One of the latest is Alto, which leases its own fleet of ride-hail vehicles, employs drivers and is regulated as a transportation company. Others include Earth, Co-Op Ride and Revel, which has a fleet of Teslas driven by employees in New York City.

They’ve seen some success. Earth, for one, claims a 39% increase in ridership from January 2021 to January 2022 (the company launched in 2020). But these companies have also had to contend with tradeoffs, like limited capacity and the higher wages that come with employing drivers, which together have pushed fares higher.

“The problem is that while many consumers might want fairer work for riders, the reality is that — and particularly during a cost-of-living crisis — consumers will stick with the services that are cheaper,” Woodcock said. “The challenge for trying to implement ‘fairer’ forms of algorithmic management is that both private hire and food delivery are very competitive markets. There is an ongoing fight between companies to try and monopolize the markets, often involving the investment of huge amounts of venture capital to try and do this. A fairer version of this work would either result in lower profits for the company [or] higher prices for consumers.”

Woodcock sees regulatory change, spurred by workers organizing and raising awareness of these issues, as the only real path to fairer algorithms. Such efforts have had mixed results so far: California voters approved Proposition 22, which shields gig companies from having to reclassify contractors as employees. But there’s reason for hope. In Massachusetts, the Supreme Judicial Court this summer blocked an industry-backed ballot measure that borrowed elements of California’s law from reaching voters. And overseas, a Dutch court ruled last year that Uber drivers are employees and thus entitled to employee benefits.

Perhaps most notably, the Platform Work Directive, a proposed European Union law aimed at improving conditions for gig workers, would require companies including Uber and DoorDash to inform workers how algorithms are used to make decisions on the platform and to ensure human oversight of those systems. It would also give workers the opportunity to challenge automated decisions that affect their jobs and require that platforms consult with worker representatives about changes to the algorithms.

Wilson agrees with Woodcock that this type of reform is the likeliest path to change.

“[Solutions need to address] the real underlying causes of stress for these workers, such as low wages that force them to work at a breakneck pace to secure a living wage or the precariousness and opacity of algorithmic management as a concept,” he said.