Scale AI's Outlier Platform Employs Gig Workers to Scrape Personal Data for AI Training
Scale AI, a company in which Meta holds a 49% stake, has recruited experts in fields such as medicine, physics, and economics to refine artificial intelligence systems through its Outlier platform. Gig workers on the platform, however, describe scraping personal data from social media, harvesting copyrighted artwork, and handling explicit content, work they call morally uncomfortable and far removed from high-level AI refinement.
Workers Describe Desperation and Ethical Dilemmas
Tens of thousands of people worldwide have been paid by Scale AI to train AI by combing Instagram accounts, harvesting copyrighted work, and transcribing pornographic soundtracks. Many workers, including journalists, graduate students, teachers, and librarians, took on this extra work due to economic pressures in an AI-threatened economy. One worker stated, "A lot of us were really desperate. Many people really needed this job, myself included, and really tried to make the best of a bad situation." Workers expressed guilt over training their own replacements, with one artist describing "internalised shame and guilt" for contributing to automation.
Scraping Social Media and Copyrighted Content
Outlier workers, also known as "taskers," reported scouring Facebook and Instagram accounts and tagging individuals by name, location, and friends, including accounts belonging to people under 18. Because tasks required fresh data, workers were pushed ever deeper into social media profiles. One assignment involved selecting photos from Facebook accounts and ordering them by the user's age. Workers found these tasks unsettling, and some used only celebrity photos to sidestep the ethical issues. Taskers also harvested images of copyrighted artwork to train AI to produce artistic images, often turning to artists' social media accounts when other sources ran out.
Explicit Content and Monitoring Practices
Taskers described being asked to transcribe pornographic soundtracks, label photos of dead animals or dog faeces, and handle police calls describing violent scenarios. One doctoral student recalled labeling a diagram of baby genitalia, despite assurances that the work would involve no nudity or gore. Workers were continuously monitored through Hubstaff, which could screenshot websites visited while working. Scale AI stated that Hubstaff ensures accurate payment but that the company does not actively monitor taskers. It says it shuts down inappropriate tasks and does not require workers to continue assignments that make them uncomfortable, and it denies any involvement with child sexual abuse material or pornography.
Legal and Corporate Context
Scale AI has contracts with major clients including Google, Meta, OpenAI, and the US Department of Defense. Its former CEO, Alexandr Wang, is Meta's chief AI officer, and former managing director Michael Kratsios served as science adviser to President Donald Trump. Glenn Danas, a lawyer representing AI gig workers in lawsuits against Scale AI, estimates that hundreds of thousands of people work on platforms like Outlier. Workers accused Scale AI of "bait-and-switch" tactics, advertising high pay during recruitment but paying less once work began. Scale AI declined to comment on the litigation but said pay changes only when workers opt into different projects.
Uncertainty and Future Implications
Taskers expressed uncertainty about what they were training AI to do and how their submissions would be used. Some interacted with ChatGPT and Claude or worked with Meta data, possibly training Meta's new model, Avocado. OpenAI stated it stopped working with Scale AI in June 2025, emphasizing the ethical treatment of workers. Despite unsteady pay and mass layoffs, many taskers remain on Outlier, seeing few alternatives in a fast-arriving AI future. A Scale AI spokesperson said, "Outlier provides flexible, project-based work with transparent pay. Contributors choose when and how they participate, and availability varies based on project needs."