Rating Working Conditions on Digital Labor Platforms

Abstract

The relations between technology, work organization, worker power, workers’ rights, and workers’ experience of work have long been central concerns of CSCW. European CSCW research, especially, has a tradition of close collaboration with workers and trade unionists in which researchers aim to develop technologies and work processes that increase workplace democracy. This paper contributes a practitioner perspective on this theme in a new context: the (sometimes global) labor markets enabled by digital labor platforms. Specifically, the paper describes a method for rating working conditions on digital labor platforms (e.g., Amazon Mechanical Turk, Uber) developed within a trade union setting. Preliminary results have been made public on a website that is referred to by workers, platform operators, journalists, researchers, and policy makers. This paper describes this technical project in the context of broader cross-sectoral efforts to safeguard worker rights and build worker power in digital labor platforms. Not a traditional research paper, this article instead takes the form of a case study documenting the process of incorporating a human-centered computing perspective into contemporary trade union activities and communicating a practitioner’s perspective on how CSCW research and computational artifacts can come to matter outside of the academy. The paper shows how practical applications can benefit from the work of CSCW researchers, while illustrating some practical constraints of the trade union context. The paper also offers some practical contributions for researchers studying digital platform workers’ experiences and rights: the artifacts and processes developed in the course of the work.


Notes

  1. While the average requester in Hara et al.’s data set paid more than USD 11 per hour, low-paying requesters posted much more work than other requesters, and unpaid search time, rejected work, and time spent working on tasks that were ultimately not submitted drove wages down.

  2. This was supported by the Swedish digital strategy consultancy DUMA, whose involvement was funded by Unionen, the Swedish white-collar workers union.

  3. The entire text of the survey is available in an appendix.

  4. See, for example, this reddit thread, “Question about attention checks in surveys” https://www.reddit.com/r/mturk/comments/6iief2/question_about_attention_checks_in_surveys/ in particular, the reply from VeganMinecraft https://www.reddit.com/r/mturk/comments/6iief2/question_about_attention_checks_in_surveys/dj6ifrb/

  5. On US-based platforms, we paid $10 per survey. At the time of the work, the US dollar and the euro, while not equivalent, were very close in value, and this seemed appropriate to the varied regional contexts.

  6. For platforms where collecting identifiable information from workers is not allowed (e.g., Mechanical Turk), we gave workers our email addresses so that they could reach out to us if desired.

  7. http://docs.aws.amazon.com/AWSMechTurk/latest/RequesterUI/BlockingaWorker.html

  8. See, e.g., https://github.com/cloudyr/MTurkR/wiki/Qualifications-as-Blocks which says “Technically this should have no impact on them, but the message shown is a bit threatening and typically creates bad reactions on the worker forums, in TurkOpticon ratings, and via email.”

  9. We found the documentation on the “Tips For Requesters on Mechanical Turk” blog helpful, with additional credit to Kristy “Spamgirl” Milland. http://turkrequesters.blogspot.com/2014/08/how-to-block-past-workers-from-doing.html

  10. See the Amazon Requester documentation for more details on using CSV files to manage worker details http://docs.aws.amazon.com/AWSMechTurk/latest/RequesterUI/ManagingWorkerDetailsOffline.html. (An illustrative code sketch of this qualification-based exclusion approach appears at the end of these notes.)

  11. In IG Metall's 2016 survey of 600 German crowd workers (unpublished), respondents said that pay was more important than all other aspects, so pay is weighted more heavily than the other dimensions. (A toy example of such a weighting appears at the end of these notes.)

  12. http://turkopticon.info

  13. Specifically, the terms of service ratings associated with the platform reviews prompted several platform operators to review and change their terms of service. And the detailed information collected on the site signals to platform operators that the trade union is, on one hand, serious about its activities in the space of digital labor platforms, and, on the other, familiar enough with the technical and business workings of platform-based work that it is not likely to make “impossible” demands.
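
Notes 7-10 describe excluding workers who have already completed the survey by assigning them a qualification instead of using Mechanical Turk's block feature (which, per notes 7-8, workers experience as threatening). The sketch below is a rough illustration of that approach, assuming Python with boto3; the qualification name, CSV file name, and region are invented, and this is not necessarily the tooling used in the project.

```python
# Illustrative sketch only (not necessarily the project's tooling): exclude workers
# who already completed the survey by tagging them with a marker qualification and
# requiring that the qualification NOT exist on the new HIT (cf. notes 7-10).
# The qualification name and "previous_respondents.csv" are invented examples.
import boto3

mturk = boto3.client("mturk", region_name="us-east-1")

# 1. Create a qualification type that serves purely as a marker.
qual = mturk.create_qualification_type(
    Name="Completed working-conditions survey",
    Description="Assigned to workers who already responded. No action needed.",
    QualificationTypeStatus="Active",
)
qual_id = qual["QualificationType"]["QualificationTypeId"]

# 2. Assign the marker to every worker who already responded
#    (worker IDs exported from the requester interface as a CSV, cf. note 10).
with open("previous_respondents.csv") as f:
    previous_workers = [line.strip() for line in f if line.strip()]

for worker_id in previous_workers:
    mturk.associate_qualification_with_worker(
        QualificationTypeId=qual_id,
        WorkerId=worker_id,
        IntegerValue=1,
        SendNotification=False,  # do not email workers about the marker qualification
    )

# 3. When posting the new survey HIT, require that the marker does NOT exist,
#    so previous respondents simply never see the task.
exclude_previous = {
    "QualificationTypeId": qual_id,
    "Comparator": "DoesNotExist",
    "ActionsGuarded": "DiscoverPreviewAndAccept",
}
# mturk.create_hit(..., QualificationRequirements=[exclude_previous])
```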
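
Note 11 states that pay is weighted more heavily than the other rating dimensions. The toy calculation below shows what such a weighting can look like; the dimension names, example scores, and weights are invented for illustration and are not the formula used for the published ratings.

```python
# Toy example of a weighted overall platform score in which "pay" counts more
# heavily than the other dimensions (cf. note 11). Dimension names, example
# scores, and weights are illustrative assumptions, not the published formula.

dimension_scores = {   # each dimension scored from 1 (worst) to 5 (best)
    "pay": 2.5,
    "communication": 4.0,
    "evaluation of work": 3.5,
    "tasks": 4.0,
    "technology": 4.5,
}

weights = {dimension: 1.0 for dimension in dimension_scores}
weights["pay"] = 2.0  # pay counted twice as heavily (assumed weight)

overall = (sum(score * weights[dim] for dim, score in dimension_scores.items())
           / sum(weights.values()))
print(f"Overall score: {overall:.2f} / 5")  # -> Overall score: 3.50 / 5
```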

References

  • Adam, D., M. Bremermann, J. D. López, et al. (2016). Digitalisation and working life: lessons from the Uber cases around Europe. EurWORK | European Observatory of Working Life, 25 January 2016. https://www.eurofound.europa.eu/observatories/eurwork/articles/working-conditions-law-and-regulation-business/digitalisation-and-working-life-lessons-from-the-uber-cases-around-europe.

  • Amazon Mechanical Turk (2014). Participation Agreement. 2 December 2014. http://mturk.com/mturk/conditionsofuse. Accessed 13 May 2017.

  • Austrian Chamber of Labour, Austrian Trade Union Confederation, Danish Union of Commercial and Clerical Workers, et al. (2016). Frankfurt Paper on Platform Based Work (Report). Frankfurt: IG Metall (German Metalworkers’ Union). http://tinyurl.com/hdrljk2. Accessed 1 January 2018.

  • Bederson, B. B., and A. J. Quinn (2011). Web Workers Unite! Addressing Challenges of Online Laborers. In CHI EA ‘11. CHI ‘11 Extended Abstracts on Human Factors in Computing Systems, Vancouver, BC, Canada, 07–12 May 2011. New York: ACM, pp. 97–106. https://doi.org/10.1145/1979742.1979606.

  • Benner, C. (ed.) (2015). Crowdwork — zurück in die Zukunft? Perspektiven digitaler Arbeit. Frankfurt: Bund Verlag.

  • Benson, A., A. J. Sojourner, and A. Umyarov (2015). The value of employer reputation in the absence of contract enforcement: A randomized experiment. Presented at the Allied Social Science Associations Annual Meeting, Boston, MA, USA, 3–5 January 2015.

  • Berg, J. (2016). Income security in the on-demand economy: Findings and policy lessons from a survey of crowdworkers. Comparative Labor Law and Policy Journal, vol. 37, no. 3, pp. 543–576.

  • Bjerknes, G., P. Ehn, M. Kyng, et al. (eds) (1987). Computers and Democracy: A Scandinavian Challenge. England: Avebury.

  • Bjerknes, G., and T. Bratteteig (1995). User participation and democracy: A discussion of Scandinavian research on system development. Scandinavian Journal of Information Systems, vol. 7, no. 1, pp. 73–98.

  • Bohannon, J. (2016). Psychologists grow increasingly dependent on online research subjects. Science Magazine, 7 June 2016. http://www.sciencemag.org/news/2016/06/psychologists-grow-increasingly-dependent-online-research-subjects. Accessed 1 January 2018.

  • Bradshaw, T. (2017). Self-driving cars prove to be labour-intensive for humans. Financial Times, 9 July 2017.

  • Brian (@xyderias), Kristy Milland (@TurkerNational), and Rochelle (@Rochelle) (2017). MTurk requester tip: if your attention check begins with "This is an attention check," it isn't. [Tweet], 7 April 2017. https://twitter.com/xyderias/status/857239009633812481. Accessed 3 January 2018.

  • Callison-Burch, C. (2014). Crowd-Workers: Aggregating Information Across Turkers to Help Them Find Higher Paying Work. In Second AAAI Conference on Human Computation and Crowdsourcing, Pittsburgh, PA, USA, 02–04 November 2014. Palo Alto: AAAI Publications. https://www.aaai.org/ocs/index.php/HCOMP/HCOMP14/paper/view/9067. Accessed 1 January 2018.

  • Chandler, J., P. Mueller, and G. Paolacci (2014). Nonnaïveté among Amazon Mechanical Turk workers: Consequences and solutions for behavioral researchers. Behavior Research Methods, vol. 46, no. 1, pp. 112–130. https://doi.org/10.3758/s13428-013-0365-7.

  • Chrisafis, A. (2016). France hit by day of protest as security forces fire teargas at taxi strike. The Guardian, 26 January 2016. http://www.theguardian.com/world/2016/jan/26/french-taxi-drivers-block-paris-roads-in-uber-protest. Accessed 1 January 2018.

  • Court of Justice of the European Union (2017). According to Advocate General Szpunar, the Uber electronic platform, whilst innovative, falls within the field of transport: Uber can thus be required to obtain the necessary licences and authorisations under national law (Press Release, no. 50/17). Luxembourg. https://curia.europa.eu/jcms/upload/docs/application/pdf/2017-05/cp170050en.pdf.

  • De Stefano, V. (2016a). Introduction: Crowdsourcing, the Gig-Economy and the Law. Comparative Labor Law and Policy Journal, vol. 37, no. 3.

  • De Stefano, V. (2016b). The rise of the “just-in-time workforce”: on-demand work, crowd work, and labour protection in the “gig economy”. Comparative Labor Law and Policy Journal, vol. 37, no. 3.

  • Dølvik, J. E., and K. Jesnes (2017). Nordic Labour Markets and the Sharing Economy — Report from a Pilot Project (Report). Copenhagen: Nordic Council of Ministers.

  • Dow, S., A. Kulkarni, S. Klemmer, et al. (2012). Shepherding the Crowd Yields Better Work. In CSCW ‘12. Proceedings of the ACM 2012 Conference on Computer Supported Cooperative Work, Seattle, Washington, USA, 11–15 February 2012. New York: ACM, pp. 1013–1022. https://doi.org/10.1145/2145204.2145355.

  • Downs, J. S., M. B. Holbrook, S. Sheng, et al. (2010). Are Your Participants Gaming the System?: Screening Mechanical Turk Workers. In CHI ‘10. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Atlanta, Georgia, USA, 10–15 April 2010. New York: ACM, pp. 2399–2402. https://doi.org/10.1145/1753326.1753688.

  • Drahokoupil, J., and M. Jepsen (2017). The digital economy and its implications for labour. 1. The platform economy. Transfer: European Review of Labour and Research, vol. 23, no. 2, pp. 103–107. https://doi.org/10.1177/1024258917701380.

  • Ehn, P. (1990). Work-Oriented Design of Computer Artifacts. Hillsdale: L. Erlbaum Associates.

  • European Commission (2016). A European agenda for the collaborative economy (Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions, no. COM (2016) 356 final). Brussels: European Commission.

  • European Parliament (2017). European agenda for the collaborative economy (INI - Own-initiative procedure, no. 2017/2003(INI)). http://www.europarl.europa.eu/oeil/popups/ficheprocedure.do?lang=en&reference=2017/2003(INI).

  • Forde, C., M. Stuart, S. Joyce, et al. (2017). The Social Protection of Workers in the Platform Economy (Study for the EMPL Committee, no. IP/A/EMPL/2016-11 PE 614.184). Brussels: European Parliament | Directorate-General for Internal Policies | Policy Department A: Economic and Scientific Policy. http://www.europarl.europa.eu/RegData/etudes/STUD/2017/614184/IPOL_STU%282017%29614184_EN.pdf. Accessed 1 January 2018.

  • Gadiraju, U., B. Fetahu, and C. Hube (2016). Crystal clear or very vague? Effects of task clarity in the microtask crowdsourcing ecosystem. In 1st International Workshop on Weaving Relations of Trust in Crowd Work: Transparency and Reputation Across Platforms, Co-located with the 8th International ACM Web Science Conference, Hannover, Germany, 22–25 May 2016.

  • Gaikwad, S. N., D. Morina, R. Nistala, et al. (2015). Daemo: A Self-Governed Crowdsourcing Marketplace. In UIST ‘15 Adjunct. Adjunct Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology, Daegu, Kyungpook, Republic of Korea, 08–11 November 2015. New York: ACM, pp. 101–102. https://doi.org/10.1145/2815585.2815739.

  • Gaikwad, S. N. S., D. Morina, A. Ginzberg, et al. (2016). Boomerang: Rebounding the Consequences of Reputation Feedback on Crowdsourcing Platforms. In UIST ‘16. Proceedings of the 29th Annual Symposium on User Interface Software and Technology, Tokyo, Japan, 16–19 October 2016. New York, NY, USA: ACM, pp. 625–637. https://doi.org/10.1145/2984511.2984542.

  • Gaikwad, S. N., M. E. Whiting, D. Gamage, et al. (2017). The Daemo Crowdsourcing Marketplace. In CSCW ‘17 Companion. Companion of the 2017 ACM Conference on Computer Supported Cooperative Work and Social Computing, Portland, Oregon, USA, 25 February - 01 March 2017. New York: ACM, pp. 1–4. https://doi.org/10.1145/3022198.3023270.

  • Garden City Group (n.d.). Douglas O’Connor v. Uber Technologies, Inc. [Lawsuit Website]. http://www.uberlitigation.com/. Accessed 2 January 2018.

  • Gleibs, I. H. (2017). Are all “research fields” equal? Rethinking practice for the use of data from crowdsourcing market places. Behavior Research Methods, vol. 49, no. 4, pp. 1333–1342. https://doi.org/10.3758/s13428-016-0789-y.

  • Gordon, C., and R. Eisenbrey (2012). As unions decline, inequality rises. Economic Policy Institute Publications, 6 June 2012. https://www.epi.org/publication/unions-decline-inequality-rises/. Accessed 8 April 2018.

  • Graham, M., I. Hjorth, and V. Lehdonvirta (2017). Digital labour and development: impacts of global digital labour platforms and the gig economy on worker livelihoods. Transfer: European Review of Labour and Research, vol. 23, no. 2, pp. 135–162. https://doi.org/10.1177/1024258916687250.

  • Greenbaum, J. (1988). In Search of Cooperation: An Historical Analysis of Work Organization and Management Strategies. In CSCW ‘88. Proceedings of the 1988 ACM Conference on Computer-supported Cooperative Work, Portland, Oregon, USA, 26–28 September 1988. New York: ACM, pp. 102–114. https://doi.org/10.1145/62266.62275.

  • Greenbaum, J. (1996). Back to Labor: Returning to Labor Process Discussions in the Study of Work. In CSCW ‘96. Proceedings of the 1996 ACM Conference on Computer Supported Cooperative Work, Boston, Massachusetts, USA, 16–20 November 1996. New York: ACM, pp. 229–237. https://doi.org/10.1145/240080.240259.

  • Greenbaum, J., and M. Kyng (Eds.) (1991). Design at Work: Cooperative Design of Computer Systems. Hillsdale: Lawrence Erlbaum Associates, Inc., Publishers.

  • Gupta, N., D. Martin, B. V. Hanrahan, et al. (2014). Turk-Life in India. In GROUP '14. Proceedings of the 18th International Conference on Supporting Group Work, Sanibel Island, Florida, USA, 09–12 November 2014. New York: ACM, pp. 1–11. https://doi.org/10.1145/2660398.2660403.

  • Hanrahan, B. V., J. K. Willamowski, S. Swaminathan, et al. (2015). TurkBench: Rendering the Market for Turkers. In CHI '15. Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, Seoul, Republic of Korea, 18–23 April 2015. New York: ACM, pp. 1613–1616. https://doi.org/10.1145/2702123.2702279.

  • Hara, K., A. Adams, K. Milland, et al. (2018). A Data-Driven Analysis of Workers’ Earnings on Amazon Mechanical Turk. In CHI ‘18. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montréal, QC, Canada, 21–26 April 2018. New York: ACM.

  • Haug, M. C. (2017). Fast, Cheap, and Unethical? The Interplay of Morality and Methodology in Crowdsourced Survey Research. Review of Philosophy and Psychology, pp. 1–17. https://doi.org/10.1007/s13164-017-0374-z.

  • Hauser, D. J., and N. Schwarz (2016). Attentive Turkers: MTurk participants perform better on online attention checks than do subject pool participants. Behavior Research Methods, vol. 48, no. 1, pp. 400–407. https://doi.org/10.3758/s13428-015-0578-z.

  • Herr, B. (2017). Bericht Plattformunternehmen Foodora Wien (Internal Report). Austrian Chamber of Labour.

  • Hickey, S. (2016). Uber tribunal judges criticise “fictions” and “twisted language.” The Guardian, 28 October 2016. http://www.theguardian.com/technology/2016/oct/28/uber-tribunal-judges-fictions-twisted-language-appeal. Accessed 2 January 2018.

  • Houle, C. (2009). Inequality and Democracy: Why Inequality Harms Consolidation but Does Not Affect Democratization. World Politics, vol. 61, no. 4, pp. 589–622. https://doi.org/10.1017/S0043887109990074.

  • Huws, U., N. H. Spencer, and S. Joyce (2016). Crowd Work in Europe: Preliminary Results from a Survey in the UK, Sweden, Germany, Austria and the Netherlands. (Research Report). Brussels: Foundation for European Progressive Studies. http://www.feps-europe.eu/assets/39aad271-85ff-457c-8b23-b30d82bb808f/crowd-work-in-europe-draft-report-last-versionpdf.pdf.

  • International Labour Organization (2017). Inception Report for the Global Commission on the Future of Work (Inception Report, no. ISBN 978-92-2-131372-4). Geneva: International Labour Office.

  • Ipeirotis, P. G. (2010). Demographics of Mechanical Turk (Working Paper, no. CeDER-10-01). New York University | Center for Digital Economy Research. http://hdl.handle.net/2451/29585. Accessed 2 January 2018.

  • Ipeirotis, P. (2012). Why I love crowdsourcing (the concept) and hate crowdsourcing (the term). A Computer Scientist in a Business School, 13 November 2012. http://www.behind-the-enemy-lines.com/2012/11/why-i-love-crowdsourcing-concept-and.html. Accessed 3 January 2018.

  • Ipeirotis, P. G., F. Provost, and J. Wang (2010). Quality Management on Amazon Mechanical Turk. In HCOMP ‘10. Proceedings of the ACM SIGKDD Workshop on Human Computation, Washington DC, USA, 25 July 2010. New York: ACM, pp. 64–67. https://doi.org/10.1145/1837885.1837906.

  • Irani, L. (2015). The cultural work of microwork. New Media & Society, vol. 17, no. 5, pp. 720–739. https://doi.org/10.1177/1461444813511926.

  • Irani, L. C., and M. S. Silberman (2013). Turkopticon: Interrupting Worker Invisibility in Amazon Mechanical Turk. In CHI ‘13. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Paris, France, 27 April - 02 May 2013. New York: ACM, pp. 611–620. https://doi.org/10.1145/2470654.2470742.

  • Irani, L. C., and M. S. Silberman (2014). From critical design to critical infrastructure: lessons from turkopticon. interactions, vol. 21, no. 4, pp. 32–35.

  • Irani, L. C., and M. S. Silberman (2016). Stories We Tell About Labor: Turkopticon and the Trouble with “Design”. In CHI ‘16. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, San Jose, California, USA, 07–12 May 2016. New York: ACM, pp. 4573–4586. https://doi.org/10.1145/2858036.2858592.

  • Jaumotte, F., and C. O. Buitron (2015). Inequality and labor market institutions (Staff Discussion Note, no. SDN/15/14). International Monetary Fund. https://www.imf.org/external/pubs/ft/sdn/2015/sdn1514.pdf.

  • Jung, H. J., and M. Lease (2012). Improving quality of crowdsourced labels via probabilistic matrix factorization. In Proceedings of the 4th Human Computation Workshop (HCOMP) at AAAI, Toronto ON, Canada, 22–23 July 2012. Palo Alto: AAAI Publications, pp. 101–106. https://www.aaai.org/ocs/index.php/WS/AAAIW12/paper/viewPaper/5258.

  • Kittur, A., J. V. Nickerson, M. Bernstein, et al. (2013). The Future of Crowd Work. In CSCW '13. Proceedings of the 2013 Conference on Computer Supported Cooperative Work, San Antonio, Texas, USA, 23–27 February 2013. New York: ACM, pp. 1301–1318. https://doi.org/10.1145/2441776.2441923.

  • Kochhar, S., S. Mazzocchi, and P. Paritosh (2010). The Anatomy of a Large-scale Human Computation Engine. In HCOMP ‘10. Proceedings of the ACM SIGKDD Workshop on Human Computation, Washington DC, USA, 25 July 2010. New York: ACM, pp. 10–17. https://doi.org/10.1145/1837885.1837890.

  • Kuek, S. C., C. Paradi-Guilford, T. Fayomi, et al. (2015). The Global Opportunity in Online Outsourcing (World Bank Other Operational Studies, no. 22284). The World Bank. https://econpapers.repec.org/paper/wbkwboper/22284.htm. Accessed 2 January 2018.

  • LaPlante, R., and S. Silberman (2015). Design notes for a future crowd work market. 13 February 2015. https://medium.com/@silberman/design-notes-for-a-future-crowd-work-market-2d7557105805. Accessed 3 January 2018.

  • LaPlante, R., and M. S. Silberman (2016). Building trust in crowd worker forums: Worker ownership, governance, and work outcomes. In Weaving Relations of Trust in Crowd Work: Transparency and Reputation Across Platforms, workshop at WebSci ‘16, Hannover, Germany, 22–25 May 2016.

  • Lease, M., J. Hullman, J. Bigham, et al. (2013). Mechanical Turk is Not Anonymous (SSRN Scholarly Paper, no. ID 2228728). Rochester: Social Science Research Network. https://papers.ssrn.com/abstract=2228728. Accessed 3 January 2018.

  • Lee, M. K., D. Kusbit, E. Metsky, et al. (2015). Working with Machines: The Impact of Algorithmic and Data-Driven Management on Human Workers. In CHI '15. Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, Seoul, Republic of Korea, 18–23 April 2015. New York: ACM, pp. 1603–1612. https://doi.org/10.1145/2702123.2702548.

  • Liebman, W. (2016). Es geht um Menschen, für die der Schutz des amerikanischen Arbeitsrechts nicht gilt (Six Silberman, Interviewer). youtube.com/watch?v=Dy8R3jAu8Fo. Accessed 13 May 2017.

  • Mandl, I., M. Curtarelli, S. Riso, et al. (2015). New forms of employment (Research Report, no. EF1461). Eurofound. https://doi.org/10.2806/989252.

  • Mao, A., A. D. Procaccia, and Y. Chen (2013). Better Human Computation Through Principled Voting. In AAAI’13. Proceedings of the Twenty-Seventh AAAI Conference on Artificial Intelligence, Bellevue, Washington, 14–18 July 2013. Palo Alto: AAAI Publications, pp. 1142–1148. http://dl.acm.org/citation.cfm?id=2891460.2891619. Accessed 2 January 2018.

  • Marder, J., and M. Fritz (2015). The Internet’s hidden science factory. PBS NewsHour, 11 February 2015. https://www.pbs.org/newshour/science/inside-amazons-hidden-science-factory. Accessed 2 January 2018.

  • Martin, D., B. V. Hanrahan, J. O'Neill, et al. (2014). Being a Turker. In CSCW '14. Proceedings of the 17th ACM Conference on Computer Supported Cooperative Work & Social Computing, Baltimore, Maryland, USA, 15–19 February 2014. New York: ACM, pp. 224–235. https://doi.org/10.1145/2531602.2531663.

  • Martin, D., J. O'Neill, N. Gupta, et al. (2016). Turking in a Global Labour Market. Computer Supported Cooperative Work, vol. 25, no. 1, pp. 39–77. https://doi.org/10.1007/s10606-015-9241-6.

  • Mr Y Aslam, Mr J Farrar and Others -V- Uber. No. 2202551/2015 & Others (Employment Tribunal October 12, 2016). https://www.judiciary.gov.uk/wp-content/uploads/2016/10/aslam-and-farrar-v-uber-employment-judgment-20161028-2.pdf.

  • O’Neill, J., and D. Martin (2013). Relationship-Based Business Process Crowdsourcing? In P. Kotzé et al. (eds): INTERACT 2013. Human-Computer Interaction – INTERACT 2013, Cape Town, South Africa, 2–6 September 2013. Berlin: Springer, pp. 429–446. https://doi.org/10.1007/978-3-642-40498-6_33.

  • Osborne, H. (2016). Uber loses right to classify UK drivers as self-employed. The Guardian, 28 October 2016. http://www.theguardian.com/technology/2016/oct/28/uber-uk-tribunal-self-employed-status. Accessed 1 January 2018.

  • Ostrom, E. (2000). The future of democracy. Scandinavian Political Studies, vol. 23, no. 3, pp. 280–283.

  • Pine, K. H., and M. Liboiron (2015). The Politics of Measurement and Action. In CHI '15. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Seoul, Korea, 18–23 April 2015. New York: ACM, pp. 3147–3156. https://doi.org/10.1145/2702123.2702298.

  • Rani, U. (2017). Adapting social protection systems for workers in crowdwork platforms. Presented at the 5th Conference of the Regulating for Decent Work Network, Geneva, 3–5 July 2017. Geneva: International Labour Office.

  • Rao, H., S.-W. Huang, and W.-T. Fu (2013). What Will Others Choose? How a Majority Vote Reward Scheme Can Improve Human Computation in a Spatial Location Identification Task. In First AAAI Conference on Human Computation and Crowdsourcing, Palm Springs, CA, USA, 7–9 November 2013. Palo Alto: AAAI Publications. https://www.aaai.org/ocs/index.php/HCOMP/HCOMP13/paper/view/7525. Accessed 2 January 2018.

  • Raval, N., and P. Dourish (2016). Standing Out from the Crowd: Emotional Labor, Body Labor, and Temporal Labor in Ridesharing. In CSCW ‘16. Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing, San Francisco, California, USA, 27 February - 02 March 2016. New York: ACM, pp. 97–107. https://doi.org/10.1145/2818048.2820026.

  • Retelny, D., S. Robaszkiewicz, A. To, et al. (2014). Expert Crowdsourcing with Flash Teams. In UIST '14. Proceedings of the 27th Annual ACM Symposium on User Interface Software and Technology, Honolulu, Hawaii, USA, 05–08 October 2014. New York: ACM, pp. 75–85. https://doi.org/10.1145/2642918.2647409.

  • Sarasua, C., and M. Thimm (2014). Crowd Work CV: Recognition for Micro Work. In L. M. Aiello and D. McFarland (eds): SocInfo 2014. Social Informatics, Barcelona, Spain, 11 November 2014. Cham: Springer, pp. 429–437. https://doi.org/10.1007/978-3-319-15168-7_52.

  • Schmidt, F. A. (2016). Arbeitsmärkte in der Plattformökonomie — Zur Funktionsweise und den Herausforderungen von Crowdwork und Gigwork (Project Report). Bonn: Friedrich-Ebert-Stiftung.

  • Siddiqui, F. (2017). Uber triggers protest for collecting fares during taxi strike against refugee ban. Washington Post, 29 January 2017. https://www.washingtonpost.com/news/dr-gridlock/wp/2017/01/29/uber-triggers-protest-for-not-supporting-taxi-strike-against-refugee-ban/. Accessed 2 January 2018.

  • Silberman, M. S., and L. C. Irani (2016). Operating an Employer Reputation System: Lessons from Turkopticon, 2008-2015. Comparative Labor Law and Policy Journal, vol. 37, no. 3. https://cllpj.law.illinois.edu/archive/vol_37/.

  • Silberman, M. S., L. Irani, and J. Ross (2010a). Ethics and Tactics of Professional Crowdwork. XRDS, vol. 17, no. 2, pp. 39–43. https://doi.org/10.1145/1869086.1869100.

  • Silberman, M. S., J. Ross, L. Irani, et al. (2010b). Sellers’ Problems in Human Computation Markets. In HCOMP ‘10. Proceedings of the ACM SIGKDD Workshop on Human Computation, Washington DC, USA, 25 July 2010. New York: ACM, pp. 18–21. https://doi.org/10.1145/1837885.1837891.

  • Silberman, M. S., E. Harmon, L. Irani, et al. (2017). Crowd work and the “on-demand” economy. Hesamag, no. 16, pp. 36–39.

  • Silberman, M. S., B. Tomlinson, R. LaPlante, et al. (2018). Responsible research with crowds: pay crowdworkers at least minimum wage. Communications of the ACM, vol. 61, no. 3, pp. 38–41. https://doi.org/10.1145/3180492.

  • Smith, A. (2016). Gig Work, Online Selling and Home Sharing (Research Report). Pew Research Center. http://www.pewinternet.org/2016/11/17/labor-platforms-technology-enabled-gig-work/. Accessed 3 January 2018.

  • Snow, R., B. O’Connor, D. Jurafsky, et al. (2008). Cheap and Fast—but is It Good?: Evaluating Non-expert Annotations for Natural Language Tasks. In EMNLP ‘08. Proceedings of the Conference on Empirical Methods in Natural Language Processing. Stroudsburg: Association for Computational Linguistics, pp. 254–263. http://dl.acm.org/citation.cfm?id=1613715.1613751. Accessed 2 January 2018.

  • Söderqvist, F. (2016). Plattformsekonomin och den svenska partsmodellen (Report). Stockholm: Unionen.

  • Sriraman, A. (2016). CrowdCamp Report: Protecting Humans – Worker-Owned Cooperative Models for Training AI Systems. Follow the Crowd, 29 November 2016. https://humancomputation.com/blog/?p=9502. Accessed 3 January 2018.

  • Stewart, J. (2017). The Human Army Using Phones to Teach AI to Drive. WIRED, 9 July 2017. https://www.wired.com/story/mighty-ai-training-self-driving-cars/. Accessed 2 January 2018.

  • Stone, K. V. W. (2006). Flexibilization, Globalization, and Privatization: Three Challenges to Labour Rights in Our Time. Osgoode Hall Law Journal, vol. 44, pp. 77.

  • Taylor, A. (2015). After Interaction. Interactions, vol. 22, no. 5, pp. 48–53. https://doi.org/10.1145/2809888.

  • Upwork (2016). User Agreement. 12 September 2016. https://www.upwork.com/legal/. Accessed 15 June 2017.

  • Valentine, M. A., D. Retelny, A. To, et al. (2017). Flash Organizations: Crowdsourcing Complex Work by Structuring Crowds As Organizations. In CHI '17. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, Denver, Colorado, USA, 06–11 May 2017. New York: ACM, pp. 3523–3537. https://doi.org/10.1145/3025453.3025811.

  • Vannette, D. (2017). Using Attention Checks in Your Surveys May Harm Data Quality. Qualtrics. 28 June 2017. https://www.qualtrics.com/blog/using-attention-checks-in-your-surveys-may-harm-data-quality/. Accessed 3 January 2018.

  • Whiting, M. E., D. Gamage, S. N. S. Gaikwad, et al. (2017a). Crowd Guilds: Worker-led Reputation and Feedback on Crowdsourcing Platforms. In CSCW ‘17. Proceedings of the 2017 ACM Conference on Computer Supported Cooperative Work and Social Computing, Portland, Oregon, USA, 25 February - 01 March 2017. New York: ACM, pp. 1902–1913. https://doi.org/10.1145/2998181.2998234.

  • Whiting, M. E., D. Gamage, S. Goyal, et al. (2017b). Designing A Constitution for a Self-Governing Crowdsourcing Marketplace. Presented at the Collective Intelligence Conference, Brooklyn, NY, USA, 15–16 June 2017.

  • Wood, A., M. Graham, and V. Lehdonvirta (2016). The new frontier of outsourcing: online labour markets and the consequences for poverty in the Global South. Presented at the Work, Employment and Society Conference, Leeds, UK, 6–8 September 2016.

Acknowledgements

The work described in this paper (but not the preparation of the paper) was funded by IG Metall (the German Metalworkers’ Union), the Austrian Chamber of Labor (Arbeiterkammer), and Unionen, but the paper was not approved or reviewed by any of these organizations, nor does it reflect any official organizational position. The authors gratefully acknowledge ongoing discussions with Christiane Benner, Vanessa Barth, and Robert Fuss at IG Metall, Sylvia Kuba at the Austrian Chamber of Labor, Karin Zimmermann at the Austrian Confederation of Trade Unions, Fredrik Söderqvist and Carin Hallerström at Unionen, Mattias Beijmo at DUMA, Janine Berg at the International Labour Office, Valerio De Stefano, Mark Graham, Vili Lehdonvirta, Jamie Woodcock, Lilly Irani, Rochelle LaPlante, Benjamin Herr, and Dawn Gearhart. The paper has benefited from suggestions from Erin Goodling, two anonymous reviewers, the editors of this special issue of JCSCW, and the JCSCW editors. The term “automated management systems” is owed to Janine Berg.

Author information

Corresponding author

Correspondence to Ellie Harmon.

Additional information

Ellie Harmon and M. Six Silberman contributed equally; authors are listed alphabetically.

Appendix: Digital labor platform working conditions survey

Notes: Demographic questions (such as gender, age, country of residence, native language, and citizenship) and "meta" questions asking respondents for comments about the survey questions are omitted here. "Platform" is replaced automatically with the name of the platform through which the respondent was recruited. (An illustrative calculation combining the survey's hours and earnings questions appears after the survey text.)

1.1 General experience on Platform

How long have you worked on/for Platform?

  • Less than 1 month

  • 1–5 months

  • 6–11 months

  • 1–3 years

  • More than 3 years

How much longer do you want to work on/for Platform?

  • Indefinitely

  • Until I find another job

  • Until I finish school

  • Other ____________________

Why do you work on/for Platform?

________________________________________

________________________________________

How often is the work you do while working on/for Platform...? (One answer per row: Never / Less than half the time / About half the time / More than half the time / Always / I don't know.)

  • meaningful

  • physically dangerous or harmful

  • interesting

  • demeaning or psychologically harmful

  • satisfying

  • ethically questionable

  • fun

Last week, how many hours did you spend working on/for Platform in total? Please include time spent actively looking for work, waiting for work to appear, and communicating with other workers (e.g., reading and replying to forum posts, chatting about work on/for Platform, etc.).

_____

Was last week a typical week for you?

  • No

  • Yes

[If answered “No” to previous]

In a typical week, how many hours did you spend working on/for Platform in total? Please include time spent actively looking for work, waiting for work to appear, and communicating with other workers (e.g., reading and replying to forum posts, chatting about work on/for Platform, etc.).

_____

You said that you {spend/spent} {hours} working {in a typical week/last week}. Of those hours, how many are spent:

_____ actively working on tasks or jobs

_____ looking for work or waiting for work to appear

_____ communicating with workers (including forums, chat, etc.)

Are there any penalties for declining jobs or tasks on Platform?

  • No

  • Yes

  • It’s complicated _____________________

Does Platform assign you a schedule or are you required to work certain hours or times on/for Platform?

  • No

  • Yes

  • It’s complicated _____________________

Do Platform operators or clients/customers tell you how to do your work (for example, specify a particular route for driving/biking, specify which tools or software you must use to complete a task, etc.)?

  • No

  • Yes

  • It’s complicated _____________________

Thinking generally about your experiences working on/for Platform, do you feel in control of your work?

  • No

  • Yes

  • It’s complicated _____________________

While working on/for Platform, do you earn qualifications that give you access to more highly-paid (or otherwise better) work?

  • No

  • Yes

  • I don’t know

[If answered “Yes” to previous]

On Platform, do you understand what is required for you to earn certain qualifications?

  • No

  • Yes

1.2 Pay on Platform

Note: The worker is asked to specify a currency with which to answer the following questions. In this exposition we assume the user has selected EUR (€).

Last week, how much money did you make working on/for Platform (in €)? Please include any tips/bonuses and delivery fees if applicable.

_____

Was last week a typical week for you?

  • No

  • Yes

[If answered “No” to previous]

In a typical week, how much money do you make working on/for Platform (in €)? Please include any tips/bonuses and delivery fees if applicable.

_____

Have you ever received tips or bonuses for your work on/for Platform?

  • No

  • Yes

[If answered “Yes” to previous]

{Last week/In a typical week}, how much money {did/do} you earn from tips or bonuses on Platform?

_____

[For platforms with base hourly wages]

According to your contract with Platform, what is your base hourly wage (in €)?

_____ [may be 0]

[For delivery platforms]

According to your contract with Platform, how much additional money do you earn per delivery, not including tips (in €)?

_____ [may be 0]

Which of the following statements best describes the income you earn from working on/for Platform?

  • It is essential for meeting my basic needs.

  • It is an important component of my budget, but not essential.

  • It is nice to have, but I could live comfortably without it.

When do you usually get paid for work completed on/for Platform?

  • Within 48 hours

  • Within 5 working days

  • Within 10 working days

  • Within 1 month

  • It usually takes more than 1 month to get paid

  • I have never been paid

  • I don’t know

Have you ever done work on/for Platform for which you did not get paid?

  • No

  • Yes

[If answered “Yes” to previous]

About how often are you not paid for your work on Platform?

  • It has only happened once or twice. I am almost always paid for my work.

  • It happens for less than half of the work I do.

  • It happens for about half of the work I do.

  • It happens for more than half of the work I do.

  • It happens for all of the work I do. I have never been paid for my work.

[If answered “Yes” to “Have you ever done work for which you did not get paid?”]

Please describe what happened/happens:

________________________________________

________________________________________

Do you have any other jobs or do you work on/for any other apps or platforms?

  • No

  • Yes

[If answered “Yes” to previous]

Please list all of your other jobs, including other apps or platforms you work on/for:

________________________________________

________________________________________

[If answered “Yes” to “Do you have other jobs?”]

Last week, how much did you earn from all of this other work combined (in €)?

_____

[If answered “Yes” to “Do you have other jobs?”]

Was last week a typical week for you in terms of how much money you made from work outside of Platform?

  • No

  • Yes

[If answered “Yes” to “Do you have other jobs?” and “No” to previous]

In a typical week, how much do you earn from all of this other work combined (in €)?

_____

Do you receive any kind of government assistance (for example, BAföG/Studienbeihilfe, Arbeitslosengeld, social security, WIC, SNAP/food stamps, etc.)?

  • No

  • Yes

  • Prefer not to answer

1.3 Communication on Platform

I. Communication with Platform management

Does Platform management communicate platform/app changes, new policies, and other relevant information to you?

  • No

  • Yes, but only some of the time

  • Yes, always

Have you ever asked Platform management a question?

  • No, but I know how to

  • Yes

  • I do not know how, but I think it is possible

  • It is not possible

[If answered “Yes” to previous]

How often does Platform management answer your questions?

  • Never

  • Less than half of the time

  • About half of the time

  • More than half of the time

  • Always

[If answered “Yes” to “Have you asked management a question?” and anything other than “Never” to previous]

When Platform management responds to your questions, how often are their answers...? (One answer per row: Never / Less than half the time / About half the time / More than half the time / Always / I don't know.)

  • prompt

  • respectful

  • useful

II. Communicating with Platform customers/clients

Have you ever communicated with Platform customers/clients (including asking questions)?

  • No, but I know how to

  • Yes

  • I do not know how, but I think it is possible

  • It is not possible

[If answered "Yes" to previous]

How often do Platform customers/clients respond to your work-related questions or other communications?

  • Never

  • Less than half of the time

  • About half of the time

  • More than half of the time

  • Always

[If answered “Yes” to “Have you communicated with customers?” and anything other than “Never” to previous]

When Platform customers/clients respond to your questions, how often are their answers...? (One answer per row: Never / Less than half the time / About half the time / More than half the time / Always / I don't know.)

  • prompt

  • respectful

  • useful

III. Communicating with other Platform workers

Have you ever communicated with other Platform workers through the official Platform site/app (e.g., official forums, chat)?

  • No, but I know how to

  • Yes

  • I do not know how, but I think it is possible

  • It is not possible

[If answered “Yes” to previous]

How often are worker communications in Platform forums, chat, or other official venues...? (One answer per row: Never / Less than half the time / About half the time / More than half the time / Always / I don't know.)

  • respectful

  • useful

  • enjoyable

Have you ever communicated with other Platform workers outside of official Platform venues (e.g., on Facebook, on a private forum, etc.)?

  • No, but I know how to

  • Yes

  • I do not know how, but I think it is possible

  • It is not possible

[If answered “Yes” to previous]

How or where can you communicate with other Platform workers outside of the official Platform site/app?

________________________________________

________________________________________

[If answered “Yes” to “Have you communicated with other workers outside the official site/app?”]

How often are worker communications in unofficial venues...? (One answer per row: Never / Less than half the time / About half the time / More than half the time / Always / I don't know.)

  • respectful

  • useful

  • enjoyable

1.4 Reviews, ratings, and feedback

Can you give feedback about Platform customers/clients to Platform operators/management?

  • No

  • Yes, through the app/website

  • Yes, through some other means

  • I don’t know

[If answered “Yes, through some other means” to previous]

How can you give feedback about Platform customers/clients to Platform operators/management?

________________________________________

________________________________________

Can you give feedback about Platform customers/clients to other workers?

  • No

  • Yes, through the app/website

  • Yes, through some other means

  • I don’t know

[If answered “Yes, through some other means” to previous]

How can you give feedback about Platform customers/clients to other workers?

________________________________________

________________________________________

Do you use other workers' feedback about Platform customers/clients to make choices about whether to accept a particular task or job?

  • No

  • Yes

[If answered "Yes" to previous]

How do you find out about other workers’ feedback about Platform customers/clients?

________________________________________

________________________________________

Do you have access to information about Platform customer/client history on the platform (e.g., payment history, evaluations) through the official Platform interface?

  • No

  • Yes

  • I don’t know

[If answered “Yes” to previous]

Do you use this official customer/client history information to make choices about whether to accept a particular task or job?

  • No

  • Yes

Can customers/clients review, rate, or evaluate your work?

  • No

  • Yes

  • I don’t know

[If answered “Yes” to previous]

How do customers/clients review, rate, or evaluate your work?

  • Through the app or website

  • In person

  • Other ____________________

[If answered “Yes” to “Can customers/clients evaluate your work?”]

How often would you say that customer/client reviews, ratings, or evaluations are...? (One answer per row: Never / Less than half the time / About half the time / More than half the time / Always / I don't know.)

  • prompt

  • respectful

  • useful

  • fair

[If answered “Yes” to “Can customers/clients evaluate your work?”]

Do customers/clients have to give good reasons for leaving negative ratings or evaluations?

  • No

  • Yes

  • I don’t know

[If answered “Yes” to “Can customers/clients evaluate your work?”]

On Platform, can you contest ratings or evaluations of your work that you think are wrong or unfair through official Platform channels?

  • No

  • Yes

  • I don’t know

[If answered “Yes” to previous]

If you contest a wrong or unfair evaluation through official channels on Platform, how often do platform operators take you seriously?

  • Never

  • Less than half of the time

  • About half of the time

  • More than half of the time

  • Always

[If answered “Yes” to “Can customers/clients evaluate your work?”]

On Platform, can you contest ratings or evaluations of your work that you think are wrong or unfair through non-official channels (for example, by attempting to contact customers/clients directly)?

  • No

  • Yes

  • I don’t know

[If answered “Yes” to previous]

How can you contest unfair or wrong ratings or evaluations outside of official platform channels? Have these methods been successful for you?

________________________________________

________________________________________

1.5 Technology

Would you describe the technology (e.g., website, app) on Platform as reliable?

  • No

  • Yes

  • It’s complicated ____________________

Would you describe the technology (e.g., website, app) on Platform as user-friendly?

  • No

  • Yes

  • It’s complicated ____________________

Would you describe the technology (e.g., website, app) on Platform as fast?

  • No

  • Yes

  • It’s complicated ____________________

1.6 Likes and dislikes

In general, what do you like about working on/for Platform?

________________________________________

________________________________________

In general, what do you not like about working on/for Platform? What problems do you have? What would you change?

________________________________________

________________________________________
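
The survey above separately records total weekly working time, including search, waiting, and communication time (section 1.1), and total weekly earnings, including tips and bonuses (section 1.2). As note 1 observes for Mechanical Turk, counting unpaid time in the denominator is what pushes effective hourly pay below the nominal per-task rate. The sketch below illustrates that arithmetic with invented field names and values; it is not part of the survey instrument or the project's analysis code.

```python
# Minimal illustration with invented field names and values: estimating an
# effective hourly rate from the survey's hours questions (section 1.1) and
# earnings questions (section 1.2). Unpaid search, waiting, and communication
# time is included in the denominator, which is what pushes effective pay
# below the nominal per-task rate (cf. note 1).

response = {
    "hours_on_tasks": 18.0,          # "actively working on tasks or jobs"
    "hours_searching_waiting": 6.0,  # "looking for work or waiting for work to appear"
    "hours_communicating": 1.0,      # "communicating with workers (including forums, chat, etc.)"
    "weekly_earnings_eur": 150.0,    # weekly earnings including tips and bonuses (EUR)
}

total_hours = (response["hours_on_tasks"]
               + response["hours_searching_waiting"]
               + response["hours_communicating"])

nominal_rate = response["weekly_earnings_eur"] / response["hours_on_tasks"]
effective_rate = response["weekly_earnings_eur"] / total_hours

print(f"Nominal rate (paid task time only): {nominal_rate:.2f} EUR/hour")   # 8.33
print(f"Effective rate (all working time):  {effective_rate:.2f} EUR/hour")  # 6.00
```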

About this article

Cite this article

Harmon, E., Silberman, M.S. Rating Working Conditions on Digital Labor Platforms. Comput Supported Coop Work 28, 911–960 (2019). https://doi.org/10.1007/s10606-018-9313-5
