Ghost Work, Artificial Intelligence, and Janelle Shane’s “The Skeleton Crew”

An expert on how data and algorithms are changing work responds to Janelle Shane’s “The Skeleton Crew.”

“The Skeleton Crew” asks us two questions. The first is an interesting twist on an ancient thought experiment. But the second is more complicated because the story invites us to become aware of a very real phenomenon and to think about what, if anything, should be done about the way the world works for some people.

The first question examines what it would mean if our machines, robots, and now artificial intelligences had feelings like ours. (Remember Haley Joel Osment’s kid AI, created to feel endless love for his human mother as the society around them dies.) “The Skeleton Crew” offers an interesting twist because its AI really does have feelings, just like us, because it is, actually, us: The AI is a group of remote workers who fake a haunted house to make it appear automated and intelligent.

It’s a fun version of the trope. That the AI is actually real people with real feelings underscores the villainy, heroism, or oblivious indifference of the other characters around them. The villains interact with the AI in cruel ways, and their fear of it is their ultimate undoing. The tough damsel in distress graciously thanks the AI for saving her life before she knows it is humans. Unaware of the actual workings of the world he created, whether it runs on shoddy AI or real people, the billionaire only haunts the scene when his moneymaking is in question. Interestingly, the crowds walking through the haunted house seem most interested in seeing whether they can crack the AI and prove it isn’t really smart (remember Microsoft’s Tay). Perhaps this represents our human bravado, our need to prove that we are a little harder to replace than AI tech companies believe we are.

The second question, which is less familiar and less pleasant, is cued when Bud Crack, the senior Filipino remote team manager, tells his team: “I’m trying to explain things to them. What we are. They’re confused.”

Before “they,” the people in the expected, visible roles in society, can offer any kind of support, they must grapple with the existence of the remote workers who are faking the AI’s operations: they must move from the belief that the haunted house is powered by an advanced artificial intelligence to a new understanding that a desperate remote worker in New Zealand has been controlling the plastic closet skeleton in the haunted house and is now the only person in the world with (distant) eyes on a dangerous situation.

This fictional moment reflects an actual reality that is detailed in the award-winning book Ghost Work by Mary Gray, an anthropologist at Microsoft Research and a 2020 MacArthur Fellow, and Siddharth Suri, a computer scientist at Microsoft Research. Ghost work refers to actual, real people sitting in their homes doing actual paid work to get AI systems up and running. Most machine learning models today use supervised learning, in which the model learns to make correct decisions from a data set that has been labeled by people. Ghost work is the paid, largely invisible data labeling that people do so that the models can learn those correct decisions: labeling images, flagging X-rated content, tagging text or audio, proofreading, and much more. You may have done some of this data labeling work for free by completing a reCAPTCHA, identifying all of the bikes or traffic lights in a photo, in order to log in to various websites.
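The dependence on those human labels is easy to see in miniature. Below is a minimal sketch, not drawn from the book or the story, using Python and the scikit-learn library: human workers supply the labels (here, whether a snippet of text should be flagged), and the model can only learn whatever those labels teach it. The texts, labels, and flagging task are invented for illustration.

# A minimal sketch of supervised learning from human-labeled data.
# The snippets and the "flagging" task below are invented for illustration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Each snippet was labeled by a human worker: 1 = flag it, 0 = leave it alone.
texts = [
    "buy cheap pills now",
    "meeting notes for tuesday",
    "click here to win money",
    "lunch at noon?",
]
labels = [1, 0, 1, 0]

# The model learns only what the human labels teach it.
model = make_pipeline(CountVectorizer(), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["win cheap pills"]))  # likely [1], echoing the human judgments

Every labeled example in a list like that stands in for a unit of paid, mostly invisible human work.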

The ten or so years of academic research on this topic give us a way to understand these working conditions and the experiences of the people who join and leave these platforms. Three themes connect to “The Skeleton Crew” and offer some insight into this work experience.

First, many, but not all, of these work environments are subject to “algorithmic management,” which includes features such as automated hiring and termination and gamified performance evaluations tied to pay. Here in Silicon Valley, such automated management functions are celebrated as “scaling” because no human supervisors or experts are required. In “The Skeleton Crew,” the automated management functions that, among other things, monitored how well the workers scared visitors combined, in the story’s words, “hostility” with “deep sloppiness.” In the story, as in many real environments, the people who work on these platforms experience automatic termination without recourse as particularly cruel. I suspect many of us have encountered some form of “algorithmic cruelty,” such as being locked out of an online account or cheated by a fake flower website with no recourse, no phone number to call, and no person to talk to. Now imagine if your income and livelihood were exposed to such automated systems and dehumanizing responses. Or, following Hatim Rahman’s research, imagine losing income and professional standing on an automated platform for reasons that are never explained to you and in fact seem intentionally obscure.

“The Skeleton Crew” suggests that the shoddy systems and dehumanizing treatment are completely unnecessary and almost puzzling, perhaps existing only because the billionaire’s company had to pretend there were no people operating the system. Real-life examples of people or companies faking AI operations are strange but not uncommon. A New Zealand company appears to have faked a digital AI assistant for doctors, complete with nonsensical interfaces such as customers having to email the AI system; the founders reprimanded a questioning reporter for choosing “not to believe.” But companies don’t have to make explicitly false claims about AI to rely on ghost work. Some academics and activists, including Lilly Irani, have argued that many automated human-in-the-loop systems, like Amazon Mechanical Turk, depend on the invisibility of the people involved because it makes the technology appear more advanced and autonomous than it actually is, and that the rhetoric and system design actively work to keep that human labor invisible.

Second, despite some of these cultural conditions and system designs, these work environments, like almost all work environments, are collaborative, social, and meaningful. Take, for example, Uber and Lyft drivers who team up to game the algorithmically managed pricing. Such collaboration is common even on crowd platforms designed to be individualistic. Other research by Gray and Suri mapped the collaboration networks created by people who work on Amazon Mechanical Turk, where “crowdworkers” work together to find the best wages and to build social connections (see also the Turkopticon system). Similarly, the skeleton crew actively worked together to create workarounds within their dysfunctional system, including splitting the closet-skeleton shifts, because the gamified “scare-o-meter” scored that role so poorly that it usually went unpaid (until, hilariously, one of them realized that the scare-o-meter registered a mop as a frightened human face and the whole team could finally be paid for taking shifts as the closet skeleton). Because of their collaboration and the way they scraped by together within this bizarre system, losing a colleague in these seemingly individualistic jobs would have been devastating.

Third, “The Skeleton Crew” offers insight into these work environments through vivid examples of how amazingly good people are at improvising and developing situated know-how, skills that are still difficult for automated systems. Lucy Suchman and colleagues have published several books and articles analyzing people’s improvisation and situated expertise, and “The Skeleton Crew” illustrates these ideas with fun details: The team figures out how far Dragonsulla has made it through the haunted house by spotting a bit of her star eyeshadow in the glitchy, bizarre AI profiles on the wall; Cheesella knows she can throw one of her cheap plastic skeleton hands to distract the bad guys, and she also thinks to set off the fire alarm when she realizes that her remote-controlled skeleton self has no other way of communicating with the people in the room. The skeleton crew’s understanding of its peculiar context, and the collaborative improvisation its members needed to skillfully use that environment to thwart the attack, offer a fun and realistic view of how groups of people work together, even when they are entirely remote and even when their work is mediated through virtual communication.

These themes in the story give us a glimpse into these working conditions and push us to ponder the more complicated second question the story prompts: as society begins to better see and understand the potential cruelties of ghost work conditions, can anything be done? Gray sometimes compares the present moment to the moment when society began to truly understand the realities of child labor and the urgent need for more protective laws. She argues that regulation is needed, particularly regulation that recognizes a new form of employment, one that fits neither full-time nor part-time work, nor even clear self-employment. Such regulation would use this new employment classification to secure fairness and to guarantee the necessary protections and benefits for all kinds of relevant work, even as technologies, jobs, and employment relationships change. The push for this new employment classification and its associated rules and regulations requires us to see working conditions that have not been easily visible, requires companies to acknowledge that these are not just temporary arrangements “on the way to automation,” and requires action. Hopefully “The Skeleton Crew” will help start, or continue, that awareness and conversation.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.
