Gig economy algorithmic management tools ‘unfair and opaque’
Report published by Worker Info Exchange warns of algorithmically enabled rights abuses in the gig economy, pointing to insufficient transparency from employers and the weakness of the legal redress available to workers
Gig economy workers are being surveilled, profiled and managed by their employers using “unfair and opaque” algorithmic systems, and existing employment and data protection laws largely fail to protect them from digitally enabled abuses of their rights, a report has warned.
Published by campaign group Worker Info Exchange (WIE), which was set up to help workers access and gain insight from data collected from them at work, the Managed by bots report said there are “woefully inadequate levels of transparency” about the extent of the algorithmic surveillance and automated decision-making that workers are subject to throughout the gig economy.
“Workers are denied access to their personal data outright, are frustrated in their request or are simply given an incomplete return,” it said, adding that existing employment and data protection laws are weakly enforced and do not offer sufficient protection.
“Article 22 protections from unfair automated decision-making [in the General Data Protection Regulation] provide escape options for employers who can claim superficial human review to rubber-stamp unfair machine-made decisions,” said the report.
“The proliferation of profiling, generated by machine learning, can make it exceedingly difficult for workers to ever uncover, understand or test the fairness of automated decision-making relating to workplace fundamentals such as work allocation, performance management and disciplinary action.”
In November 2021, research published by the Trades Union Congress (TUC) found that more than four million people in England and Wales now work for gig economy platforms at least once a week – a nearly threefold increase since 2016.
In response to poor working conditions in the gig economy, the TUC also called for a right of access to workplaces for unions, which would include a digital right of access to reflect the growing use of algorithmic decision-making by platform employers.
The WIE report added that although there have been some legal successes in 2021 – including the UK Supreme Court’s decision in February that Uber must classify its drivers as workers rather than self-employed individuals – it takes significant resources to seek remedy through the courts, so precarious workers in the gig economy need more rapid solutions.
“That is why workers must improve their bargaining power through organising and collective action,” it said. “The ability of workers to access and pool their data is a powerful force in organising that is yet to be properly tapped. When workers can better control their data, they will be better able to control their destiny at work.”
In March 2021, following legal action brought by the App Drivers and Couriers Union (ADCU) on behalf of six Uber drivers, Amsterdam’s District Court ruled that both Uber and Ola must disclose – to different extents – more of the data used to make decisions about drivers’ work and employment.
Read more about workplace data and algorithms
- MPs and peers call for new legislation to regulate the growing use of artificial intelligence in the workplace, which is being used to surveil workers’ performance and behaviour.
- The ICO has launched a public consultation on employers’ use of personal data to help it provide practical guidance for both businesses and workers.
- A UK-based trade union for technology workers and others employed by tech companies is organising around the issue of workers’ privacy and workplace monitoring.
The court also rejected Uber’s and Ola’s claims that drivers collectively taking action to access their data amounts to an abuse of their individual data access rights, laying the ground for drivers to form their own union-controlled data trust.
The WIE report’s lead author, Cansu Safak, said: “Gig platforms are collecting an unprecedented amount of data from workers through invasive surveillance technologies. Every day, companies make allegations of ‘algorithmic wrongdoing’, for which they do not offer any evidence.
“They block and frustrate workers’ efforts to obtain their personal data when they try to defend themselves. This is how gig platforms maintain exploitative power.”
WIE director James Farrar said the intensive surveillance and opaque management techniques of employers mean they are exercising increasingly hidden forms of control over their workers.
“The report shows how the latest wave of employment misclassification tactics involves employers telling workers they are truly independent in their jobs, while at the same time management control is wielded as forcefully as ever but from behind the digital curtain,” he said.
To coincide with the launch of the report, WIE, alongside ADCU and campaign group Privacy International, is launching a campaign to challenge gig economy employers’ exploitative data practices.
This will include publishing video interviews with Uber drivers about their experiences of suspensions and dismissals arising from facial recognition technology, fraud detection systems and unaccountable intelligence-sharing with law enforcement. The campaign will also launch a public petition demanding an end to the gig economy’s surveillance practices.
ADCU president Yaseen Aslam said: “Our union has been overwhelmed with casework from workers who were summarily dismissed by gig economy bosses after unsubstantiated allegations were made based on questionable surveillance data and opaque automated decision-making systems. It is crystal clear that workers need greater algorithmic transparency and far better protection from unfair dismissal than they currently have.”