Courts crack down on gig economy algorithms
Sifted, Jonathan Keane, 9 AUGUST 2021
Antonio Aloisi, an assistant professor at IE Law School in Madrid specialising in tech and labour regulation, tells Sifted that these cases are the next evolution in creating new rules and regulations for the platform economy.
“My intuition is that we are entering a second age of litigation in the field of platform work. In the last five years many workers have been bringing claims before courts all over Europe to challenge the legal consideration [of workers’ status].”
Recent cases, Aloisi says, are part of rising scepticism of algorithmic control in the workplace. In the gig economy, those questions can be particularly fraught, as an algorithm's decision can be the difference between getting work and not getting it.
“The algorithm is a way to exert control and organisation,” he says. “There is a direct impact on not only the possibility of being hired but also on the remuneration that the worker is able to get.”
IE Law School’s Aloisi says this debate isn’t going away any time soon and expects more questions will be asked of companies about the secret sauces they use. This does not necessarily mean that companies will need to lift the bonnet and expose their intellectual property, but there will be increasing demands for greater transparency, accountability and negotiation.
“Platform workers are asking for more transparency and accountability because they are extremely dependent on the operation of the algorithm and that’s why they want to understand the metrics behind the algorithm,” he says.
“The platform economy is going to be not only a battleground but also a place for experimentation — a testing ground for these issues.”
POLITICO AI: Decoded: Algorithmic transparency
Politico Europe, Leonie Cater, Melissa Heikkilä and Clothilde Goujard, 9 AUGUST 2021
But speaking to Decoded, Madrid-based labor law and AI expert Antonio Aloisi said algorithmic transparency doesn’t have to be complicated. Rather than getting bogged down in source code and jargon, he stressed the importance of explainability, pointing to a “percentage-based” approach. Online delivery couriers, for example, could ascertain the weighting of the metrics used to rank them internally: they could know whether customer reviews constitute 20 percent or 5 percent of algorithmic decisions like order allocations.
“That’s how predictability and reliability of the algorithm is increased, so the workers are aware of the consequences of their behavior,” he added. “This is not about learning the code, but understanding the consequences.”
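The "percentage-based" approach Aloisi describes can be pictured as an explicitly weighted score, where each metric's share of the outcome is visible to the worker. The sketch below illustrates the idea only; the metric names and weights are hypothetical and not drawn from any real platform's system.

```python
# Hypothetical example of percentage-based explainability: a courier's
# ranking score as an explicit weighted sum, so each metric's weighting
# (e.g. customer reviews at 20 percent) is transparent.

WEIGHTS = {
    "customer_reviews": 0.20,    # reviews count for 20% of the score
    "acceptance_rate": 0.50,     # accepted orders count for 50%
    "on_time_deliveries": 0.30,  # punctuality counts for 30%
}

def ranking_score(metrics: dict) -> float:
    """Combine normalised metrics (each in [0, 1]) into a single score."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9, "weights must total 100%"
    return sum(WEIGHTS[name] * metrics[name] for name in WEIGHTS)

def explain(metrics: dict) -> dict:
    """Break the score into per-metric contributions — the explainability part."""
    return {name: WEIGHTS[name] * metrics[name] for name in WEIGHTS}

# Example courier with normalised metric values:
courier = {"customer_reviews": 0.9, "acceptance_rate": 0.8, "on_time_deliveries": 0.95}
score = ranking_score(courier)
breakdown = explain(courier)
```

A courier inspecting `breakdown` can see exactly how much each behaviour moved the score — the "understanding the consequences" Aloisi refers to, without any need to read the platform's source code.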
Aloisi stressed that the new law and increased awareness around algorithmic management could have implications far beyond the traditional gig economy. Online creators and professional streamers across TikTok, Instagram and Twitch have been rallying against what they see as unjustified, inexplicable decisions to delete content and suspend or delete accounts with limited opportunities for appeals — resulting in a loss of revenue. A “streaming strike” in Italy is one example of online creators’ ire.
In Aloisi’s view, the EU’s data protection laws — which restrict automated decision-making processes that legally affect a data subject (such as through a loss of profitability) — combined with the rider law could increase not only the transparency but also the accuracy and contestability of decisions taken by gig work and social media companies.
“I consider these as a natural evolution in this process of dependency on algorithms used by a platform,” he added.