US federal attempts to protect workers’ rights and status in the face of Big Tech

A collection of regulatory proposals is attempting to shape the relationship between platform companies and the workers who make money on them – reflecting a continued struggle by lawmakers and regulators to catch up with the ways the digital economy has shifted the nature of work in the US.

The platform economy in particular has caused headaches for those who want to rein in Big Tech and strengthen worker protections, or who are worried about the increasing levels of inequality in the country.

Because platform work allows labourers to choose when they work – often cited as one of its greatest attractions – the status of workers who use these platforms is unclear. They don’t fit the classic definition of an employee, which involves working under a supervisor, but neither do they fit the standard definition of an independent contractor, since they are central to the business model of platform companies like Uber, Lyft, and Postmates.

According to an article by Charlotte Garden in the Harvard Law and Policy Review, the usual tests for differentiating employees from contractors have proven insufficient, and some legal experts have recommended either a new classification between contractor and employee or including these workers as employees under existing legal tests.

 

Improved collective bargaining rights

 

To the question of how to strengthen protections for these workers, several bills making their way through the US Congress offer provisional answers.

One of these answers is to strengthen bargaining powers for platform workers, for whom it can be extremely difficult to organise. A bill moving through the US Congress, for instance, would, if passed, increase protections for collective bargaining and organising.

The Protecting the Right to Organize Act, HR 842, would amend a number of laws, including the National Labor Relations Act, the federal law that allows employees to form unions. It would also shift the classification of many digital workers from “contractor” to “employee”, using a test similar to that of California’s 2019 Assembly Bill 5, popularly known as the “gig worker bill”, and would make several other changes to collective bargaining law meant to increase the bargaining power of digital platform workers.

In effect, AB5 built on and codified the California Supreme Court’s 2018 decision in Dynamex Operations West, Inc vs Superior Court of Los Angeles County, one of the first lawsuits over how to classify platform workers to make legal progress. The decision limited independent contractor classifications, establishing a three-part test, known as the ABC test, that companies must meet to classify a worker as a contractor.

Under AB5, an employer must show that the worker is free from the company’s control, that the work performed falls outside the company’s usual course of business, and that the worker is customarily engaged in an independently established trade of the same kind. Failing any of those three conditions results in the worker being classified as an employee.
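The conjunctive logic of the ABC test can be sketched in a few lines of Python – an illustrative helper only, with prong names of my own choosing, not anything taken from the bill text itself:

```python
# Illustrative sketch of the ABC test's structure (not legal advice):
# a worker is presumed an employee unless ALL three prongs are met.

def abc_test(free_from_control: bool,
             outside_usual_business: bool,
             independent_trade: bool) -> str:
    """Return the presumed classification under the ABC test."""
    if free_from_control and outside_usual_business and independent_trade:
        return "independent contractor"
    return "employee"
```

A rideshare driver, for example, would typically fail the second prong, since driving passengers is the platform’s core business – so even a driver who is free from direct control and runs an independent trade would come out as an employee under this test.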

 

Ballot initiative

 

Executives from platform companies argued that the proposal would fail to protect workers because it would not adequately provide access to the social safety net. They called instead for national regulations offering existing safety net benefits to gig workers, and for universal programs – such as Medicare for All and a national sick leave program – that would disconnect protections from employment status.

Although the California bill passed in 2019 and took effect at the start of 2020, rideshare companies refused to obey it and instead backed a ballot initiative to overturn it. California courts held that Uber and Lyft were illegally misclassifying employees under the law, and the companies responded by threatening to stop offering services in the state. In the end, however, the ballot initiative was successful.

That initiative, Proposition 22, approved by California voters in November 2020, reclassified app-based rideshare and delivery drivers as contractors. Notably, it only affects how those drivers, used by companies like Uber and Lyft, are classified. Other companies, such as TaskRabbit, Rover.com and Lime scooters, have not dodged the new classification rules, according to recent reporting.

President Joe Biden has voiced support for both California Assembly Bill 5 and for the proposed Protecting the Right to Organize Act.

The Biden administration has also published a plan to bolster workers’ collective bargaining and organising capabilities, framing them as a way to reduce the high and rising levels of inequality in the US by increasing workers’ political power. It also promises stronger enforcement of labour laws already on the books, and it explicitly calls out platforms as a sector.

Emergency measures

 

Meanwhile, regulators also believe platform workers’ interests need special protection during the Covid-19 crisis.

A bill currently in its early stages in the US Senate, S 338, the Helping Gig Economy Workers Act of 2021, aims to improve the health and overall well-being of digital workers. The bill is partly a response to the Covid-19 emergency. It would allow platform companies to offer benefits to workers, like health checks and financial assistance, without that assistance later being used under local, state, or federal law to argue that the workers are employees. The protection would last as long as the coronavirus emergency is in effect. Essentially, it is a way to reassure platform companies that any assistance they give to labourers won’t be used against them later. However, its chances of passing are not considered high.

Other regulatory proposals have tackled issues like tax reporting and finding ways to offer platform workers equity.

The American Rescue Plan Act of 2021, the Biden administration’s $1.9tn relief package, included a provision changing the tax code to increase platforms’ tax reporting, lowering the threshold at which platforms must file Form 1099-K from $20,000 and 200 transactions to $600. Experts say this will also address the weighty problem of unpaid back taxes from gig workers who haven’t paid into the social security program, and will move towards increased oversight of platform companies.

This passed, which means the new tax reporting requirements will come into effect in the 2022 tax year.

 

Algorithmic management

 

And in November 2020, the US Securities and Exchange Commission (SEC) proposed a temporary rule that would allow platforms to offer “equity compensation” to workers, limited to 15% of annual pay. The agency itself commented that the temporary proposal was a reflection of “the significant [workforce] evolution” technology has caused in recent years.

Another important aspect of the platforms-based regulations is the use of algorithms to manage workers.

Algorithmic management, as the practice of relying on computers to manage workers is called, has evolved as a way of handling the large numbers of workers entering the platform economy. Arguably, the innovation of many platform companies relies heavily upon the use of algorithms to sort workers.

However, the prevalence of algorithms has led to complaints from workers. One study of Uber drivers reported that complaints about algorithmic management stemmed mostly from concerns that workers suffer from endless surveillance, that there is not much transparency in the way the algorithms assess them, and that the process is dehumanising. 

These areas correspond with what researchers have labelled as the main challenges algorithms present for workers’ rights:

  • surveillance and control, since the algorithms keep workers from participating in decisions about their work lives;
  • transparency, since workers are not privy to the details of how these algorithms work, even though some have been fired based on algorithmic recommendations;
  • bias and discrimination, since studies have suggested that algorithms can automate racial and other forms of bias when used to make decisions through consumer ratings;
  • and, finally, accountability, since algorithms can conceal how decisions are being made.

 

Attempting to regulate algorithms

 

An article in Harvard Business Review has argued that companies can mitigate workers’ complaints by sharing information about the algorithm with them, giving them a way to provide feedback, creating opportunities for human contact in the company, and improving benefits as a way to build trust.

In the meantime, however, regulators have begun to look at possible regulations for algorithms in general, although most proposals at present are directed at consumer privacy and use by the government.

According to the University of California, Berkeley’s Labor Center, there are three broad approaches to possible regulation: tech-specific proposals look at banning or placing moratoriums on particular technologies; issue-specific proposals address particular categories of use; and harm-specific proposals try to alleviate specific harms created by algorithms.

One area of particular concern to have arisen is whether algorithms are automating bias.

Another bill in the Senate, the Algorithmic Justice and Online Platform Transparency Act of 2021, would prohibit discriminatory uses of algorithms on some popular websites. The bill’s Democratic sponsors, Senator Edward Markey of Massachusetts and Representative Doris Matsui of California, say it would reduce discrimination.

They say algorithmic processes have automated racism and sexism by slanting housing and job advertisements in a way that excludes minorities. In particular, the pair cited a Cornell University study that found Facebook advertising for housing was “skewed” in a way that harms home-seekers along race and gender lines.

 

‘Unfair’ use of power

 

The bill’s sponsors also cited reporting from The Markup, a New York-based investigative outfit focused on Big Tech, that claimed Google’s advertising has prevented non-binary people from viewing job advertisements.

“As we work to eliminate injustice in our society, we cannot ignore the online ecosystem. It is time to open up Big Tech’s hood, enact strict prohibitions on harmful algorithms, and prioritise justice for communities who have long been discriminated against as we work toward platform accountability,” Senator Markey said. “Biased artificial intelligence systems have become embedded in the fabric of our digital society and they must be rooted out.”

The issue of antitrust has also become relevant to how platforms are regulated. In July, President Biden signed an executive order alleging that the major platforms had used their power to unfairly shut down competition, harming small businesses.

“It is also the policy of my administration to enforce the antitrust laws to meet the challenges posed by new industries and technologies, including the rise of the dominant internet platforms,” the order said.

That order came not long after a majority of US states had filed lawsuits against major platforms for antitrust violations, and after Biden had made several high-profile appointments of strong antitrust enforcers, including Lina Khan, an associate professor at Columbia Law School, who now chairs the Federal Trade Commission (FTC).

– Daniel Mollenkamp PlatformsIntelligence US correspondent

Photo: Cambridge University Press
