Ice Lounge Media

Edtech company Chegg has sued Google claiming that the tech giant’s AI summaries of search results have hurt Chegg’s traffic and revenue. In the suit, filed in the U.S. District Court for the District of Columbia, Chegg accuses Google of unfair competition — specifically reciprocal dealing, monopoly maintenance, and unjust enrichment. Google, Chegg claims, forces […]

© 2024 TechCrunch. All rights reserved. For personal use only.

Read more

This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.

Your boss is watching

Working today—whether in an office, a warehouse, or your car—can mean constant electronic surveillance with little transparency, and potentially with livelihood-ending consequences if your productivity flags.

But what matters even more than the effects of this ubiquitous monitoring on privacy may be how all that data is shifting the relationships between workers and managers, companies and their workforce. 

We are in the midst of a shift in work and workplace relationships as significant as the Second Industrial Revolution of the late 19th and early 20th centuries. And new policies and protections may be necessary to correct the balance of power. Read the full story.

—Rebecca Ackermann

One option for electric vehicle fires? Let them burn.

Although there isn’t solid data on the frequency of EV battery fires, it’s no secret that these fires are happening.

Despite that, manufacturers offer no standardized steps on how to fight them or avoid them in the first place. What’s more, with EVs, it’s never entirely clear whether the fire is truly out. Cars may ignite, or reignite, weeks or even months after the battery is damaged or a battery fire is initially suppressed. 

Patrick Durham, the owner of one of a growing number of private companies helping first responders learn how to deal with lithium-ion battery safety, has a solution. He believes that the best way to manage EV fires right now is to let them burn. But such an approach not only goes against firefighters’ instincts—it’d require a significant cultural shift. Read the full story.

—Maya L. Kapoor

These stories are from the next edition of our print magazine, which is all about relationships. Subscribe now to read it and get a copy of the magazine when it lands on February 26!

The must-reads

I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.

1 Elon Musk is forcing US government workers to justify their jobs
Employees have to respond to an email by 11:59pm ET tonight or resign. (Wired $)
+ The new administration is targeting government foreign interference experts. (NYT $)
+ It’s also waging war on what it deems “woke DEI research.” (Undark)
+ A US government shutdown could be on the cards as soon as this month. (NY Mag $)

2 Grok was temporarily blocked from telling the truth about Trump and Musk
An xAI employee got it to ignore sources that say the pair spread misinformation. (The Verge)
+ An xAI engineering lead said the move wasn’t in line with the company’s values. (TechCrunch)

3 The race to dominate satellite internet is heating up
Starlink has some major competition. (Reuters)
+ Chinese rocket firm Deep Blue Aerospace is eyeing an IPO. (WSJ $)
+ The world’s next big environmental problem could come from space. (MIT Technology Review)

4 Apple has pulled its data security tool from the UK
After the UK government demanded backdoor access. (BBC)
+ Other encrypted Apple services are still available, though. (WP $)

5 How AI is changing coding
The outlook for software developers is more likely to be evolution than extinction. (NYT $)
+ AI coding assistants aren’t always all they’re cracked up to be. (TechCrunch)
+ The second wave of AI coding is here. (MIT Technology Review)

6 Inside Facebook’s plans to become cool again
Unfortunately for the social network, you can’t buy cultural cachet. (The Information $)
+ How Facebook got addicted to spreading misinformation. (MIT Technology Review)

7 The internet is disappearing
Digital decay is setting in. What will survive of us? (Vox)
+ The race to save our online lives from a digital dark age. (MIT Technology Review)

8 Where are all the Apple Vision Pro apps?
The number of apps made for the headset has declined every month since it went on sale. (CNBC)

9 How the internet warped the meaning of ‘lore’
From ancient myths to oversharing on TikTok. (Fast Company $)

10 Not everything needs to be tracked 
Knowledge isn’t always power when it comes to your home appliances. (The Guardian)

Quote of the day

“We’re trying to do creative work, and AI is just pushing perfection.”

—Lo Kalani, a Brooklyn-based hair stylist, explains to the Washington Post why she has banned clients from presenting her with AI-generated inspirational images.

The big story

How one mine could unlock billions in EV subsidies

January 2024

On a pine farm north of the tiny town of Tamarack, Minnesota, Talon Metals has uncovered one of America’s densest nickel deposits—and now it wants to begin extracting it.

If regulators approve the mine, it could mark the starting point in what the company claims would become the country’s first complete domestic nickel supply chain, running from the bedrock beneath the Minnesota earth to the batteries in electric vehicles across the nation.

MIT Technology Review wanted to provide a clearer sense of the law’s on-the-ground impact by zeroing in on a single project and examining how these rich subsidies could be unlocked at each point along the supply chain. Take a look at what we found out.

—James Temple

We can still have nice things

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or skeet ’em at me.)

+ The best way to learn absolutely anything more quickly? That’ll be the Feynman technique.
+ Here’s how to use lemongrass like a pro.
+ I didn’t know it was possible to make a recorder sing like this, but there you go.
+ Vampire couples forever! 🧛🏻❤🧛🏻‍♂️

Read more

In the fall of 2024, a trucking company in Falls Township, Pennsylvania, temporarily stored a storm-damaged Tesla at its yard. A few weeks later, the car burst into flames that grew out of control within seconds, some shooting out 30 feet.

A local fire company tried in vain to squelch the blaze, spraying more than 2,000 gallons of water on the vehicle. Eventually, the firefighters requested help from a fire company in neighboring Bristol Township, led by volunteer fire chief Howard McGoldrick. He’d been fighting fires since 1989, but this conflagration was unusual: It was a chemical fire in a lithium-ion battery, meaning it provided its own heat, fuel, and oxygen. And it was incredibly challenging to extinguish.  

McGoldrick was encountering fires like this more and more often. The previous year, he says, several rowhouses were badly burned after overcharged lithium-ion batteries in racing drones ignited inside. In another nearby incident, old lithium-ion biomedical devices at a scrapyard got soaked in a rainstorm and combusted.

The Tesla fire felt like a breaking point. “We were like, ‘Okay, this is just too many incidents in a short amount of time,’” McGoldrick recalls. He went in search of someone who could help his company get better at responding to fires in lithium-ion batteries. He found Patrick Durham.

Durham is the owner of (and mustache behind) StacheD Training, one of a growing number of private companies helping first responders learn how to deal with lithium-ion battery safety, including electric-vehicle fires.

Although there isn’t solid data on the frequency of EV battery fires, it’s no secret to EV makers that these fires are happening. Yet the manufacturers offer no standardized steps on how to fight them or avoid them in the first place, leaving first responders scrambling to search through each car’s emergency response guide—something that’s hard to do when you’re standing in front of an immolating vehicle.

In this void, Durham offers a wealth of resources to first responders, from easy-to-follow video tutorials to hours-long in-person workshops. In 2024 alone, Durham says he trained approximately 2,000 first responders around the country. As more people buy EVs, in part to help address climate change, the need for this training has only grown; in less than two years, Durham’s YouTube channel has attracted almost 30,000 subscribers. (The US doesn’t currently collect data on the frequency or causes of EV fires, but this year the US Fire Administration and the Fire Safety Research Institute are rolling out a new data collection system for fire departments.)

A circumspect man with a shaved head, brown eyes, and a thick horseshoe mustache framing his mouth, Durham previously worked as a mechanical engineer developing battery boxes for EVs. He is also a volunteer firefighter, and in 2020 he offered his first training on fires in lithium-ion batteries to his local department. From there, his reputation spread by word of mouth. Today, StacheD Training is Durham’s full-time work. He’s also the captain of his local volunteer fire department in Troy, Michigan.  

As more EVs hit the road, what worries Durham most isn’t just the growing likelihood of battery fires—it’s their intensity. “The severity of the fire is significant compared to a regular vehicle fire,” he says.

“The traditional car fires that you and I grew up with—the majority of those always start in the engine compartment,” says Jim Stevenson, a fire chief from rural Michigan who has taken Durham’s training. “So we basically get there, we pop the car hood, and then we put out the fire from there, and if it gets into the inner compartment of the car? Not a big deal. You spray it down with the hose, and it’s out in no time.” With EV fires, Stevenson says, “it’s just a completely different monster.” 

Matchbox on wheels

An EV battery is essentially a tightly packed array of thousands of cells, each of which ranges from approximately the size and shape of an AA battery to the size of a legal envelope, depending on the battery model. If a single cell gets damaged (by being crushed, overcharged, or waterlogged, for example), that cell can heat uncontrollably in a process called thermal runaway. It will release so much heat and flammable gas that it generates its own fire, which spreads to the other cells.

Older lithium-ion battery packs exploded “like a pipe bomb” when that happened, Durham says; today’s battery packs have release valves so that during thermal runaway they avoid an explosion by instead spewing flames in what Durham describes as “essentially a blowtorch.” The location of an EV’s battery—underneath the car, between its axles, within a protective case—complicates things further. The batteries are much safer from collision damage than they would be under the hood, but they are also much harder to reach and douse if they ignite.

The result? Fires such as one at an Illinois Rivian plant in 2024, where one EV caught fire and approximately 50 cars parked nearby ended up burning. Or one in Hollywood, Florida, in 2023, where a Tesla was accidentally driven off a dock and burst into flames even though it was underwater.

Durham worries that if an EV battery catches fire in a high-speed crash, it will burn so intensely that first responders won’t be able to save anyone inside the vehicle. Putting out a fire in an internal-combustion car might take as little as 30 minutes and a few hundred gallons of water, he notes, while an electric car battery fire could take upwards of 4,000 gallons of water and many hours to extinguish—and much more for commercial trucks. Indeed, when a Tesla Semi drove off Interstate 80 in Northern California in 2024 and burst into flames, first responders had to douse it with 50,000 gallons of water and close the highway for 15 hours.

What’s more, with EVs, it’s never entirely clear whether the fire is truly out. Cars may ignite, or reignite, weeks or even months after the battery is damaged or a battery fire is initially suppressed. Durham points to one salvaged Tesla in California that burst into flames 308 days after it had flooded in a Florida hurricane. The vehicle hadn’t initially ignited, but saltwater intrusion into the battery pack eventually corroded it enough to produce a chemical fire leading to thermal runaway.

According to Durham, the simple truth is that the best way to manage EV fires right now is to let them burn—while making sure to protect the surrounding area, including other vehicles and people’s homes. Allowing the fire to run its course will ideally also destroy any cells that might otherwise ignite later.

This goes against firefighters’ instincts. When they respond to EV fires, they will spray water “because they want to do something to fix the problem,” he says. “[But] … it’s not really doing anything.”

Stevenson worries about how bystanders will perceive first responders waiting out a blaze. “It’s going to be ugly,” he says, “because the public’s going to see us standing on the side [of the] road just watching it burn, which looks bad for us.” But at the same time, he adds, “we don’t have [an] actual way of getting to the battery to knock it out.”

For now, Durham’s training focuses on the options that first responders do have with EV fires. An important if simple one is using a fire blanket to cover a vehicle and prevent the blaze from spreading as it burns out. Although they hadn’t yet received Durham’s training, that’s exactly what McGoldrick and his crew did when they responded to the burning Tesla last fall: After the facility used a forklift to move the burning car to an isolated part of the yard, first responders covered it with a fire blanket. The car reignited several times over the next few days, McGoldrick says, “but it was contained. We just put it in the middle of an open lot and basically let it go.”

It’s a significant cultural shift that first responders need to make, Durham says, and there’s another one, too: being extra-vigilant about the personal protective equipment they wear from the first moment they arrive at a burning EV. There isn’t yet enough information to compare the toxicity of EV fires and those in gas-powered cars, but Durham warns that first responders could inhale high levels of carbon dioxide, carbon monoxide, and heavy metals from burning EVs.

Overall, Durham says, he is not against EVs, but he thinks there needs to be a change in attitude to handle them safely. When an EV battery catches fire, he says, “until that battery has been removed from the vehicle and shredded and fully recycled, it’s always going to be a hazard.”

Maya L. Kapoor is an award-winning freelance journalist who writes about climate change, biodiversity, and environmental justice.

Read more

A full day’s work for Dora Manriquez, who drives for Uber and Lyft in the San Francisco Bay Area, includes waiting in her car for a two-digit number to appear. The apps keep sending her rides that are too cheap to pay for her time—$4 or $7 for a trip across San Francisco, $16 for a trip from the airport for which the customer is charged $100. But Manriquez can’t wait too long to accept a ride, because her acceptance rate contributes to her driving score for both companies, which can then affect the benefits and discounts she has access to. 

The systems are black boxes, and Manriquez can’t know for sure which data points affect the offers she receives or how. But what she does know is that she’s driven for ride-share companies for the last nine years, and this year, having found herself unable to score enough better-paying rides, she has to file for bankruptcy.

Every action Manriquez takes—or doesn’t take—is logged by the apps she must use to work for these companies. (An Uber spokesperson told MIT Technology Review that acceptance rates don’t affect drivers’ fares. Lyft did not return a request for comment on the record.) But app-based employers aren’t the only ones keeping a very close eye on workers today.

A study conducted in 2021, when the covid-19 pandemic had greatly increased the number of people working from home, revealed that almost 80% of companies surveyed were monitoring their remote or hybrid workers. A New York Times investigation in 2022 found that eight of the 10 largest private companies in the US track individual worker productivity metrics, many in real time. Specialized software can now measure and log workers’ online activities, physical location, and even behaviors like which keys they tap and what tone they use in their written communications—and many workers aren’t even aware that this is happening.

What’s more, required work apps on personal devices may have access to more than just work—and as we may know from our private lives, most technology can become surveillance technology if the wrong people have access to the data. While there are some laws in this area, those that protect privacy for workers are fewer and patchier than those applying to consumers. Meanwhile, it’s predicted that the global market for employee monitoring software will reach $4.5 billion by 2026, with North America claiming the dominant share.

Working today—whether in an office, a warehouse, or your car—can mean constant electronic surveillance with little transparency, and potentially with livelihood-ending consequences if your productivity flags. What matters even more than the effects of this ubiquitous monitoring on privacy may be how all that data is shifting the relationships between workers and managers, companies and their workforce. Managers and management consultants are using worker data, individually and in the aggregate, to create black-box algorithms that determine hiring and firing, promotion and “deactivation.” And this is laying the groundwork for the automation of tasks and even whole categories of labor on an endless escalator to optimized productivity. Some human workers are already struggling to keep up with robotic ideals.

We are in the midst of a shift in work and workplace relationships as significant as the Second Industrial Revolution of the late 19th and early 20th centuries. And new policies and protections may be necessary to correct the balance of power.

Data as power

Data has been part of the story of paid work and power since the late 19th century, when manufacturing was booming in the US and a rise in immigration meant cheap and plentiful labor. The mechanical engineer Frederick Winslow Taylor, who would become one of the first management consultants, created a strategy called “scientific management” to optimize production by tracking and setting standards for worker performance.

Soon after, Henry Ford broke down the auto manufacturing process into mechanized steps to minimize the role of individual skill and maximize the number of cars that could be produced each day. But the transformation of workers into numbers has a longer history. Some researchers see a direct line between Taylor’s and Ford’s unrelenting focus on efficiency and the dehumanizing labor optimization practices carried out on slave-owning plantations. 

As manufacturers adopted Taylorism and its successors, time was replaced by productivity as the measure of work, and the power divide between owners and workers in the United States widened. But other developments soon helped rebalance the scales. In 1914, Section 6 of the Clayton Act established the federal legal right for workers to unionize and stated that “the labor of a human being is not a commodity.” In the years that followed, union membership grew, and the 40-hour work week and the minimum wage were written into US law. Though the nature of work had changed with revolutions in technology and management strategy, new frameworks and guardrails stood up to meet that change.

More than a hundred years after Taylor published his seminal book, The Principles of Scientific Management, “efficiency” is still a business buzzword, and technological developments, including new uses of data, have brought work to another turning point. But the federal minimum wage and other worker protections haven’t kept up, leaving the power divide even starker. In 2023, CEO pay was 290 times average worker pay, a disparity that’s increased more than 1,000% since 1978. Data may play the same kind of intermediary role in the boss-worker relationship that it has since the turn of the 20th century, but the scale has exploded. And the stakes can be a matter of physical health.

A humanoid robot with folded arms looms over human workers at an Amazon warehouse

In 2024, a report from a Senate committee led by Bernie Sanders, based on an 18-month investigation of Amazon’s warehouse practices, found that the company had been setting the pace of work in those facilities with black-box algorithms, presumably calibrated with data collected by monitoring employees. (In California, because of a 2021 bill, Amazon is required to at least reveal the quotas and standards workers are expected to comply with; elsewhere the bar can remain a mystery to the very people struggling to meet it.) The report also found that in each of the previous seven years, Amazon workers had been almost twice as likely to be injured as other warehouse workers, with injuries ranging from concussions to torn rotator cuffs to long-term back pain.

An internal team tasked with evaluating Amazon warehouse safety found that letting robots set the pace for human labor was correlated with subsequent injuries.

The Sanders report found that between 2020 and 2022, two internal Amazon teams tasked with evaluating warehouse safety recommended reducing the required pace of work and giving workers more time off. Another found that letting robots set the pace for human labor was correlated with subsequent injuries. The company rejected all the recommendations for technical or productivity reasons. But the report goes on to reveal that in 2022, another team at Amazon, called Core AI, also evaluated warehouse safety and concluded that unrealistic pacing wasn’t the reason all those workers were getting hurt on the job. Core AI said that the cause, instead, was workers’ “frailty” and “intrinsic likelihood of injury.” The issue was the limitations of the human bodies the company was measuring, not the pressures it was subjecting those bodies to. Amazon stood by this reasoning during the congressional investigation.

Amazon spokesperson Maureen Lynch Vogel told MIT Technology Review that the Sanders report is “wrong on the facts” and that the company continues to reduce incident rates for accidents. “The facts are,” she said, “our expectations for our employees are safe and reasonable—and that was validated both by a judge in Washington after a thorough hearing and by the state’s Board of Industrial Insurance Appeals.”

Yet this line of thinking is hardly unique to Amazon, although the company could be seen as a pioneer in the datafication of work. (An investigation found that over one year between 2017 and 2018, the company fired hundreds of workers at a single facility—by means of automatically generated letters—for not meeting productivity quotas.) An AI startup recently placed a series of billboards and bus signs in the Bay Area touting the benefits of its automated sales agents, which it calls “Artisans,” over human workers. “Artisans won’t complain about work-life balance,” one said. “Artisans won’t come into work ­hungover,” claimed another. “Stop hiring humans,” one hammered home.

The startup’s leadership took to the company blog to say that the marketing campaign was intentionally provocative and that Artisan believes in the potential of human labor. But the company also asserted that using one of its AI agents costs 96% less than hiring a human to do the same job. The campaign hit a nerve: When data is king, humans—whether warehouse laborers or knowledge workers—may not be able to outperform machines.

AI management and managing AI

Companies that use electronic employee monitoring report that they are most often looking to the technologies not only to increase productivity but also to manage risk. And software like Teramind offers tools and analysis to help with both priorities. While Teramind, a globally distributed company, keeps its list of over 10,000 client companies private, it provides resources for the financial, health-care, and customer service industries, among others—some of which have strict compliance requirements that can be tricky to keep on top of. The platform allows clients to set data-driven standards for productivity, establish thresholds for alerts about toxic communication tone or language, create tracking systems for sensitive file sharing, and more. 

a person lying on the sidewalk next to a bus sign reading, "Artisans won't complain about Work-Life balance. The era of AI employees is here."
An AI startup recently placed a series of billboards and bus signs in the Bay Area touting the benefits of its automated sales agents, which it calls “Artisans,” over human workers.
JUSTIN SULLIVAN/GETTY IMAGES

With the increase in remote and hybrid work, says Teramind’s chief marketing officer, Maria Osipova, the company’s product strategy has shifted from tracking time spent on tasks to monitoring productivity and security more broadly, because that’s what clients want. “It’s a different set of challenges that the tools have had to evolve to address as we’re moving into fully hybrid work,” says Osipova. “It’s this transition from ‘Do people work?’ or ‘How long do they work?’ to ‘How do they work best?’ How do we as an organization understand where and how and under what conditions they work best? And also, how do I de-risk my company when I give that amount of trust?” 

The clients’ myriad use cases and risks demand a very robust platform that can monitor multiple types of input. “So think about what applications are being used. Think about being able to turn on the conversations that are happening on video or audio as needed, but also with a great amount of flexibility,” says Osipova. “It’s not that it’s a camera that’s always watching over you.” 

Selecting and tuning the appropriate combination of data is up to Teramind’s clients and depends on the size, goals, and capabilities of the particular company. The companies are also the ones to decide, based on their legal and compliance requirements, what measures to take if thresholds for negative behavior or low performance are hit. 

But however carefully it’s implemented, the very existence of electronic monitoring may make it difficult for employees to feel safe and perform well. Multiple studies have shown that monitoring greatly increases worker stress and can break down trust between an employer and its workforce. One 2022 poll of tech workers found that roughly half would rather quit than be monitored. And when algorithmic management comes into the picture, employees may have a harder time being successful—and understanding what success even means. 

Ra Criscitiello, deputy director of research at SEIU–United Healthcare Workers West, a labor union with more than 100,000 members in California, says that one of the most troubling aspects of these technological advances is how they affect performance reviews. According to Criscitiello, union members have complained that they have gotten messages from HR about data they didn’t even know was being collected, and that they are being evaluated by algorithmic models they don’t understand. Dora Manriquez says that when she first started driving for ride-share companies, there was an office to go to or call if she had any issues. Now, she must generally lodge any complaints by text through the app, and any response appears to come from an automated system. “Sometimes they’ll even get stuck,” she says of the chatbots. “They’re like, ‘I don’t understand what you’re saying. Can you repeat that again?’”

Many app-based workers live in fear of being booted off the platform at any moment by the ruling algorithm—sometimes with no way to appeal to a human for recourse.

Veronica Avila, director of worker campaigns for the Action Center for Race and Economy (ACRE), has also seen algorithmic management take over for human supervisors at companies like Uber. “More than the traditional ‘I’m watching you work,’ it’s become this really sophisticated mechanism that exerts control over workers,” she says. 

ACRE and other advocacy groups call what’s happening among app-based companies a “deactivation crisis” because so many workers live in fear that the ruling algorithm will boot them off the platform at any moment in response to triggers like low driver ratings or minor traffic infractions—often with no explicit explanation and no way to appeal to a human for recourse. 

Ryan Gerety, director of the Athena Coalition, which—among other activities—organizes to support Amazon workers, says that workers in those warehouses face continuous monitoring, assessment, and discipline based on their speed and their performance with respect to quotas that they may or may not know about. (In 2024, Amazon was fined in California for failing to disclose quotas to workers who were required to meet them.) “It’s not just like you’re monitored,” Gerety says. “It’s like every second counts, and every second you might get fired.” 

a blonde passenger fires her Uber driver on her phone app as he receives the message in real time from the driver's seat
MICHAEL BYERS

Electronic monitoring and management are also changing existing job functions in real time. Teramind’s clients must figure out who at their company will handle and make decisions around employee data. Depending on the type of company and its needs, Osipova says, that could be HR, IT, the executive team, or another group entirely—and the definitions of those roles will change with these new responsibilities. 

Workers’ tasks, too, can shift with updated technology, sometimes without warning. In 2020, when a major hospital network piloted using robots to clean rooms and deliver food to patients, Criscitiello heard from SEIU-UHW members that they were confused about how to work alongside them. Workers certainly hadn’t received any training for that. “It’s not ‘We’re being replaced by robots,’” says Criscitiello. “It’s ‘Am I going to be responsible if somebody has a medical event because the wrong tray was delivered? I’m supervising the robot—it’s on my floor.’” 

Nurses are also seeing their jobs expand to include technology management. Carmen Comsti of National Nurses United, the largest nurses’ union in the country, says that while management isn’t explicitly saying nurses will be disciplined for errors that occur as algorithmic tools like AI transcription systems or patient triaging mechanisms are integrated into their workflows, that’s functionally how it works. “If a monitor goes off and the nurse follows the algorithm and it’s incorrect, the nurse is going to get blamed for it,” Comsti says. Nurses and their unions don’t have access to the inner workings of the algorithms, so it’s impossible to say what data these or other tools have been trained on, or whether the data on how nurses work today will be used to train future algorithmic tools. What it means to be a worker, manager, or even colleague is on shifting ground, and frontline workers don’t have insight into which way it’ll move next.

The state of the law and the path to protection

Today, there isn’t much regulation on how companies can gather and use workers’ data. While the General Data Protection Regulation (GDPR) offers some worker protections in Europe, no US federal laws consistently shield workers’ privacy from electronic monitoring or establish firm guardrails for the implementation of algorithm-driven management strategies that draw on the resulting data. (The Electronic Communications Privacy Act allows employers to monitor employees if there are legitimate business reasons and if the employee has already given consent through a contract; tracking productivity can qualify as a legitimate business reason.)

But in late 2024, the Consumer Financial Protection Bureau did issue guidance warning companies using algorithmic scores or surveillance-based reports that they must follow the Fair Credit Reporting Act—a law best known for governing consumer credit reporting—by getting workers’ consent and offering transparency into what data was being collected and how it would be used. And the Biden administration’s Blueprint for an AI Bill of Rights had suggested that the enumerated rights should apply in employment contexts. But none of these are laws.

So far, binding regulation is being introduced state by state. In 2023, the California Consumer Privacy Act (CCPA) was officially extended to include workers and not just consumers in its protections, even though workers had been specifically excluded when the act was first passed. That means California workers now have the right to know what data is being collected about them and for what purpose, and they can ask to correct or delete that data. Other states are working on their own measures. But with any law or guidance, whether at the federal or state level, the reality comes down to enforcement. Criscitiello says SEIU is testing out the new CCPA protections. 

“It’s too early to tell, but my conclusion so far is that the onus is on the workers,” she says. “Unions are trying to fill this function, but there’s no organic way for a frontline worker to know how to opt out [of data collection], or how to request data about what’s being collected by their employer. There’s an education gap about that.” And while CCPA covers the privacy aspect of electronic monitoring, it says nothing about how employers can use any collected data for management purposes.

The push for new protections and guardrails is coming in large part from organized labor. Unions like National Nurses United and SEIU are working with legislators to create policies on workers’ rights in the face of algorithmic management. And app-based advocacy groups have been pushing for new minimum pay rates and against wage theft—and winning. There are other successes to be counted already, too. One has to do with electronic visit verification (EVV), a system that records information about in-home visits by health-care providers. The 21st Century Cures Act, signed into law in 2016, required all states to set up such systems for Medicaid-funded home health care. The intent was to create accountability and transparency to better serve patients, but some health-care workers in California were concerned that the monitoring would be invasive and disruptive for them and the people in their care.

Brandi Wolf, the statewide policy and research director for SEIU’s long-term-care workers, says that in collaboration with disability rights and patient advocacy groups, the union was able to get language into legislation passed in the 2017–2018 term that would take effect the next fiscal year. It indicated to the federal government that California would be complying with the requirement, but that EVV would serve mainly a timekeeping function, not a management or disciplinary one.

Today advocates say that individual efforts to push back against or evade electronic monitoring are not enough; the technology is too widespread and the stakes too high. The power imbalances and lack of transparency affect workers across industries and sectors—from contract drivers to unionized hospital staff to well-compensated knowledge workers. What’s at issue, says Minsu Longiaru, a senior staff attorney at PowerSwitch Action, a network of grassroots labor organizations, is our country’s “moral economy of work”—that is, an economy based on human values and not just capital. Longiaru believes there’s an urgent need for a wave of socially protective policies on the scale of those that emerged out of the labor movement in the early 20th century. “We’re at a crucial moment right now where as a society, we need to draw red lines in the sand where we can clearly say just because we can do something technological doesn’t mean that we should do it,” she says. 

Like so many technological advances that have come before, electronic monitoring and the algorithmic uses of the resulting data are not changing the way we work on their own. The people in power are flipping those switches. And shifting the balance back toward workers may be the key to protecting their dignity and agency as the technology speeds ahead. “When we talk about these data issues, we’re not just talking about technology,” says Longiaru. “We spend most of our lives in the workplace. This is about our human rights.” 

Rebecca Ackermann is a writer, designer, and artist based in San Francisco.
