Milestone 4 Singularity Task clarity: Platform incorporating standardization, micro-moderators and community feedback


Task clarity: How might workers and requesters work together to help produce higher-quality task descriptions?

Introduction

We propose the modification of existing platforms or creation of a new one that can incorporate standardization measures, a system for electing micro-moderators and a way to use community feedback in improving task clarity.

The biggest challenge a worker faces when starting a job is slogging through the task description and working out what it actually asks them to do. Once the worker has overcome that hurdle, the rest is manageable and depends on the worker's efficiency.

We realize that this initial hurdle is not a small one; it is a major bottleneck when it comes to completing tasks quickly and efficiently. Problems that arise from workers not interpreting tasks correctly can be significantly disadvantageous to both the worker and the requester:

i. Incorrect interpretation of tasks causes errors, which hurts the requester, who receives poor work, and the worker, who loses credibility. Poor work is almost always a requester's worst problem, and workers constantly search for ways to be more accurate, so misunderstanding a task is highly detrimental to both sides.

Idea

We propose the modification of existing platforms or the creation of a new platform incorporating the following features, which would drastically lower the initial hurdle of task clarity:

1. Standardization: We need standard design templates and job categorization. This is vital if workers are to spend minimal effort familiarizing themselves with tasks. Standardization could be achieved by:

i. Strong categorization of tasks: One could take inspiration from e-commerce websites like Amazon, which make a huge variety of products easy to find through strong categorization. Analogously, we need strong categorization of services and jobs so that workers can find work conveniently based on criteria they set themselves. Workers could narrow down or filter jobs based on categories such as (a filtering sketch follows this list):
1. Skills required
2. Job difficulty (level of expertise required)
3. Job type (surveys, essays, clicking, captchas, etc.)
4. Job field (magazines, images, sports, academia (science, maths, computers), general knowledge, politics, internet, etc.)
5. Estimated time to complete (whether the task requires 0.5 hours, 1 hour, 2 hours and so on)
6. Estimated payment
7. Filter jobs based on "tags" that the worker defines
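As a rough illustration of how such filtering might work (the field names and criteria below are assumptions for illustration, not an existing platform API), a worker-side filter could be a simple set of predicates over structured job metadata:

    from dataclasses import dataclass, field

    @dataclass
    class Job:
        # Hypothetical structured metadata a platform could attach to each task
        title: str
        skills: set         # e.g. {"writing", "transcription"}
        difficulty: str     # e.g. "beginner", "intermediate", "expert"
        job_type: str       # e.g. "survey", "essay", "captcha"
        est_hours: float    # estimated time to complete
        est_payment: float  # estimated payment
        tags: set = field(default_factory=set)  # worker- or requester-defined tags

    def filter_jobs(jobs, skills=None, max_hours=None, min_payment=None, tags=None):
        """Return jobs matching the worker-chosen criteria; None means 'any'."""
        results = []
        for job in jobs:
            if skills and not skills <= job.skills:   # job must cover the required skills
                continue
            if max_hours is not None and job.est_hours > max_hours:
                continue
            if min_payment is not None and job.est_payment < min_payment:
                continue
            if tags and not tags & job.tags:          # at least one tag must match
                continue
            results.append(job)
        return results

    # Example: short, reasonably paid transcription jobs
    jobs = [Job("Transcribe a 5-minute clip", {"transcription"}, "beginner",
                "transcription", 0.5, 2.0, {"audio"})]
    print(filter_jobs(jobs, skills={"transcription"}, max_hours=1.0, min_payment=1.0))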


ii. Community-designed task-design templates: Popular videogames such as TF2 and Counter-Strike have thriving communities that selflessly devote time and effort to designing new content for the benefit of other players. Taking inspiration from that, one could imagine a crowdsourcing community exhibiting similar behavior. Spamgirl mentioned that the community has already designed templates for writing emails to requesters; community-designed templates for jobs could be extremely helpful too.
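A minimal sketch of what such a community-maintained template might look like (the fields are assumptions for illustration, not an existing standard), together with a check the platform could run before a task goes live:

    # Assumed fields of a community-designed task template (illustrative only)
    TASK_TEMPLATE = {
        "goal": "",           # one-sentence statement of what the requester wants
        "input": "",          # what the worker is shown (e.g. an image, a paragraph)
        "output": "",         # exactly what the worker must submit
        "steps": [],          # short, numbered instructions
        "examples": [],       # at least one worked example of a correct answer
        "edge_cases": [],     # what to do when the input is ambiguous
        "time_estimate": "",  # e.g. "about 2 minutes per item"
    }

    def missing_fields(task_description, template=TASK_TEMPLATE):
        """Return template fields the requester left empty, so the platform
        can prompt for them before the task is posted."""
        return [name for name in template if not task_description.get(name)]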

2. Use of a micro-moderator who oversees task clarity while the task is in progress (a workflow sketch follows this list):

i. The requester can choose a worker based on reputation.
ii. That worker acts as a micro-mod, overseeing task-clarity issues.
iii. The micro-mod can be contacted by any worker who wants to work on the task. This communication could take place via chat or video-conferencing.
iv. The micro-mod suggests improvements to the task description to the requester. He/she can go through the description and collaborate with the requester to make it clearer, more precise and, if possible, shorter. Brevity is important for the worker.
v. The micro-mod reviews the task before it is deployed on the crowdsourcing website or platform. This review allows him/her to assess task clarity and improve it before the task is assigned to workers.
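A minimal sketch of how this review loop might be wired into a platform, assuming hypothetical status values, a per-worker reputation score and one micro-mod per task:

    from enum import Enum, auto

    class TaskStatus(Enum):
        DRAFT = auto()           # requester is still writing the description
        UNDER_REVIEW = auto()    # micro-mod is checking clarity
        NEEDS_REVISION = auto()  # suggestions sent back to the requester
        DEPLOYED = auto()        # task is live for workers

    def pick_micro_mod(candidates):
        """Requester picks the candidate with the highest reputation
        ('reputation' is an assumed per-worker attribute)."""
        return max(candidates, key=lambda worker: worker["reputation"])

    def review_pass(suggestions):
        """One review pass: any outstanding clarity suggestions send the
        task back to the requester; otherwise it can be deployed."""
        return TaskStatus.NEEDS_REVISION if suggestions else TaskStatus.DEPLOYED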


3. Community feedback in the form of ratings, votes and subjective feedback from workers to requesters on how to improve task design (an aggregation sketch follows this list).

i. Workers who have completed the task vote on aspects of the task and rate it on dimensions such as "task clarity" or "convenience", which would help the requester design clearer tasks in the future.
ii. Alternatively, this feedback could be provided BEFORE the task is deployed: a small set of "test-workers" could be chosen to assess task clarity in a "quarantine" zone before the task is released on any platform.
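As an illustration (the rating dimensions and the 1-5 scale are assumptions), post-completion feedback could be aggregated into a simple per-dimension report for the requester:

    from statistics import mean

    def clarity_report(ratings):
        """Average worker ratings (1-5) per dimension, e.g. 'task_clarity'
        or 'convenience', into a summary shown to the requester."""
        by_dimension = {}
        for rating in ratings:
            for dimension, score in rating.items():
                by_dimension.setdefault(dimension, []).append(score)
        return {d: round(mean(scores), 2) for d, scores in by_dimension.items()}

    # Example feedback from three workers who completed the task
    feedback = [
        {"task_clarity": 4, "convenience": 5},
        {"task_clarity": 2, "convenience": 4},
        {"task_clarity": 3, "convenience": 4},
    ]
    print(clarity_report(feedback))  # {'task_clarity': 3.0, 'convenience': 4.33}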

4. Quarantine zone for tasks

i. Before tasks are deployed, they are sent to a quarantine zone in which a dedicated test-worker force improves the task design in exchange for a nominal salary. This way, the requester can ensure that the task will be easily understood by workers, and the test-worker force earns money.
ii. The test-workers who oversee task design before it exits quarantine for deployment could be selected based on reputation or elected by the community. This way, a set of people who represent the community and understand what it finds easy to understand can oversee tasks (a selection sketch follows below).
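A rough sketch of how the test-worker force might be selected, assuming each worker record carries a reputation score and a count of community votes (both hypothetical fields):

    def select_test_workers(workers, n=5, min_reputation=4.0):
        """Pick the quarantine test-worker force: keep workers above a
        reputation threshold, then prefer those the community voted for."""
        eligible = [w for w in workers if w["reputation"] >= min_reputation]
        eligible.sort(key=lambda w: (w["community_votes"], w["reputation"]),
                      reverse=True)
        return eligible[:n]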


Although we came up with these ideas ourselves, we also took inspiration from the paper "The Need for Standardization in Crowdsourcing" by Panagiotis G. Ipeirotis and John J. Horton, who are strong proponents of standardization in crowdsourcing. We realized the need for categorization similar to e-commerce websites and for some way for the community to provide feedback. We found it reassuring that several other teams came up with similar ideas, which validated our approach.