Tuesday, November 24, 2020

Firing Algorithms Test the General Data Protection Regulation

Originally published by Peggy Keene.


Uber Faces High-Profile Lawsuit: GDPR Article 22 and Algorithms for Terminations

The use of algorithms to make decisions has become increasingly prevalent in today’s fast-paced, technology-driven world.  From ranking a user’s visibility on social media platforms to calculating grades on tests, the use of these automated processes and formulas has become commonplace, although they often remain behind the scenes.  This might change, however, with a recent lawsuit brought against Uber under Article 22 of the European Union’s General Data Protection Regulation (“GDPR”).

GDPR Article 22 and Algorithms

Article 22 of the GDPR states that data subjects have the right not to be subject to a “decision based solely on automated processing, including profiling, which produces legal effects” that significantly affect them.  In this case, over a thousand Uber drivers claim that they were unfairly terminated by Uber through an automated algorithm, and that when they requested the reasoning behind their terminations, Uber refused to provide it.  As a result, the terminated drivers are suing, claiming that, under the GDPR, companies must provide legal grounds when they use such methods and must give drivers the ability to object to the automated decision, neither of which Uber has done.

The lawsuit has been brought by drivers in England who were terminated by Uber, but privacy experts note that, if the British drivers succeed, the case would set legal precedent for all European Union drivers terminated under similar circumstances.  The British drivers are represented by the App Drivers & Couriers Union (ADCU), which is bringing the suit, claiming that over 1,000 British drivers have been wrongly fired and denied their right to an appeal.  Moreover, the British drivers note that Uber’s decision to fire them significantly impacts their lives, as required under Article 22, because termination of their Uber jobs results in an automatic notification to Transport for London (“TfL”), which has the right to revoke their operating licenses.  When a case comes up before TfL for review, drivers typically have fourteen days to defend themselves.  Here, the drivers say that Uber’s unwillingness to provide the details behind their terminations has made it impossible for them to adequately defend themselves before TfL.  As a result, the plaintiffs are bringing suit against Uber in the Netherlands, because that is where Uber’s data resides.

Uber Lawsuit Brings Biggest Challenge Under GDPR Article 22 for Algorithm Use in Firings

Many drivers report that they simply received an electronic notification that they had been terminated and were told to contact customer service if they had issues.  Upon contacting customer service, many were told by Uber that they had engaged in “fraudulent activities,” but beyond that, Uber declined to provide any further information.  Many drivers report that, even after they obtained legal representation, Uber still declined to provide further details, citing its in-house security protocols and stating that giving any further information would compromise the company’s own security.  Many of the terminated drivers claim that they had driven for Uber for years and had near-perfect ratings before they were terminated.

In response, Uber has stated that it has always been transparent about its practices and that the drivers are wrong to assume that their terminations resulted from automated-data algorithms.  Instead, Uber claims that each terminated driver’s record was reviewed manually before termination.  And while many of the drivers were told by Uber that it had a specialized team dedicated to handling exactly this issue, the affected drivers say they have never been contacted by that team or by Uber despite their repeated requests.  Without going into further detail, some counsel for the fired Uber drivers have stated that they know “for sure” that Uber is using automated algorithms to decide which drivers to terminate for fraud, and that they are seeing similar actions everywhere, not just in England.  Overall, privacy experts agree that this is the biggest and most high-profile legal challenge brought under Article 22 of the GDPR to date.

Key Takeaways About GDPR’s Article 22 and Algorithms

Terminated drivers are bringing a high-profile lawsuit against Uber, claiming that Uber violated Article 22 of the GDPR by using automated algorithms to fire drivers and then, when the drivers requested an explanation for their terminations, declining to provide further detail, in further violation of the GDPR.  Privacy counsel should follow this lawsuit because:

  • it is the first major test of Article 22 of the GDPR, which states that subjects cannot be subject to a decision based solely on automated processing, including profiling, which produces legal effects that significantly affect said subjects;

  • it could open Uber up to a massive class-action lawsuit; and

  • it has major ramifications on what kind of personal data companies may still have to provide under the GDPR, even if it concerns employment and not simply terms of use.

For more insights on data privacy, see our IP Litigation and Industry Focused Legal Solutions pages.

Curated by Texas Bar Today. Follow us on Twitter @texasbartoday.


