Winter Milestone 4 vinyoshy Subscription

From crowdresearch

The current generation of paid crowdsourcing platforms holds the potential to empower workers, allowing them to be judged purely on the quality of their work rather than on external factors such as appearance, connections, or other workplace politics. Crowd labor can also let workers set their own schedules, since completing a single task requires only a small time commitment. These platforms benefit employers as well by giving them access to very short-term workers: rather than maintaining a consistent workforce at all times, employers can seek out qualified individuals when the need arises, without retaining them for longer than is absolutely necessary.

Yet despite all this potential, no paid crowdsourcing platform has become a mainstream phenomenon (Amazon Mechanical Turk comes closest to achieving this status). A meta-analysis of existing studies, combined with conversations with experienced crowd workers and requesters, has shown us that crowd labor in its current incarnation is rife with trust issues between workers and their employers. Many requesters are uncertain whether the information they receive will be reliable, which makes it hard to use the results of surveys or other studies posted to a crowdsourcing platform with confidence: there is no way to distinguish a worker filling out a HIT honestly from one who is simply answering as quickly as possible, and the repetitive, fast-paced nature of crowd labor makes the two especially hard to tell apart. Workers, for their part, are uncertain whether they will actually be paid for the work they provide and whether their work will be unfairly rejected. Despite these suspicions, the meta-analysis showed that both sides believe good workers and good requesters are active on crowdsourcing platforms, but being matched with one is far from certain.

Amazon Mechanical Turk tried to alleviate the requesters' side of this issue with a rating system that requesters could use to judge a worker's performance. However, there was too much pressure to give workers high reviews, so the ratings provided no useful information about a worker. This solution also ignored workers' distrust of requesters.

Workers are more than capable of identifying when they are being exploited. If we allow workers to develop long-term relationships with requesters, and vice versa, all parties involved will become more considerate in their actions. I propose a subscription system, modeled after the one used on YouTube, through which workers can subscribe to requesters they like and receive updates when new work is posted. Requesters, in turn, will be able to promote workers they like to "early access" status, meaning those workers will have access to HITs before anyone else. Requesters will pay more and treat workers fairly if they know doing so earns them a core group of "early access" workers whom they can trust to reliably complete HITs. Similarly, workers are more likely to fill out HITs honestly if they believe doing so could give them access to HITs before others. By harnessing the competitive nature of crowdsourcing labor platforms, we can push people toward more ethical practices rather than toward taking shortcuts.
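The mechanics of the proposed system could be sketched as follows. This is a minimal illustration only: the class, method names, and the boolean early-access flag are all assumptions made for this sketch, not part of any existing platform API.

```python
class Requester:
    """Illustrative sketch of the proposed subscription mechanism.
    All names are hypothetical; no existing platform API is assumed."""

    def __init__(self, name):
        self.name = name
        self.subscribers = set()   # workers who follow this requester
        self.early_access = set()  # subscribers promoted to early-access status

    def subscribe(self, worker):
        """A worker opts in to updates whenever this requester posts work."""
        self.subscribers.add(worker)

    def promote(self, worker):
        """The requester grants a trusted worker early-access status."""
        self.subscribers.add(worker)
        self.early_access.add(worker)

    def post_hit(self, title):
        """Notify every subscriber of a new HIT, flagging whether that
        worker may claim it during the early-access window."""
        return {worker: (title, worker in self.early_access)
                for worker in self.subscribers}


# Hypothetical usage: one promoted worker, one ordinary subscriber.
r = Requester("acme-surveys")
r.subscribe("bob")
r.promote("alice")
notices = r.post_hit("Image labeling batch 7")
# alice's notice is flagged for early access; bob's is not.
```

The one-sided `promote` call reflects the proposal's incentive structure: early access is something requesters grant to workers they trust, while subscription is something workers choose for requesters they trust.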